| column         | type   | lengths (min-max) |
|----------------|--------|-------------------|
| `in_source_id` | string | 13 - 58           |
| `issue`        | string | 3 - 241k          |
| `before_files` | list   | 0 - 3             |
| `after_files`  | list   | 0 - 3             |
| `pr_diff`      | string | 109 - 107M        |
pex-tool__pex-804
Release 2.0.2

On the docket:
+ [x] Add a test of pypi index rendering. (#799)
+ [x] Fix `iter_compatible_interpreters` path biasing. (#798)
+ [x] Fix current platform handling. (#801)
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.1'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.2'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 52bfe2f01..80e4b80d7 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,22 @@ Release Notes ============= +2.0.2 +----- + +This is a hotfix release that fixes a bug exposed when Pex was asked to use an +interpreter with a non-canonical path as well as fixes for 'current' platform +handling in the resolver API. + +* Fix current platform handling. (#801) + `PR #801 <https://github.com/pantsbuild/pex/pull/801>`_ + +* Add a test of pypi index rendering. (#799) + `PR #799 <https://github.com/pantsbuild/pex/pull/799>`_ + +* Fix `iter_compatible_interpreters` path biasing. (#798) + `PR #798 <https://github.com/pantsbuild/pex/pull/798>`_ + 2.0.1 ----- diff --git a/pex/version.py b/pex/version.py index 7d8716be3..a698f9d1f 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.0.1' +__version__ = '2.0.2'
pex-tool__pex-743
Release 1.6.8

On the docket:
+ [x] Fixup pex re-exec during bootstrap. #741
+ [x] Pex should not re-exec when the current interpreter satisfies constraints #709
+ [x] Pex should not lose PEX_PYTHON or PEX_PYTHON_PATH when re-exec-ing #710
+ [x] Fix resolution of `setup.py` project extras. #739

Deferred:
+ [ ] Remove PEX_HTTP_RETRIES and push into a flag for the pex tool #94
+ [ ] Sdist resolution is not always reproducible #735
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.7'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.8'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index d753c6bb2..a32083e64 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,21 @@ Release Notes ============= +1.6.8 +----- + +* Fixup pex re-exec during bootstrap. (#741) + `PR #741 <https://github.com/pantsbuild/pex/pull/741>`_ + +* Fix resolution of `setup.py` project extras. (#739) + `PR #739 <https://github.com/pantsbuild/pex/pull/739>`_ + +* Tighten up namespace declaration logic. (#732) + `PR #732 <https://github.com/pantsbuild/pex/pull/732>`_ + +* Fixup import sorting. (#731) + `PR #731 <https://github.com/pantsbuild/pex/pull/731>`_ + 1.6.7 ----- diff --git a/pex/version.py b/pex/version.py index 789a4befa..fade44a0f 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '1.6.7' +__version__ = '1.6.8'
pex-tool__pex-691
Release 1.6.4

On the docket:
+ [x] Restore pex.pex_bootstrapper.is_compressed API #684
+ [ ] Release more flexible pex binaries. #654
+ [x] If an `--interpreter-constraint` is set, it should always be honored. #656
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.3'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.4'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index c6e494f18..5979452d2 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,9 +1,31 @@ Release Notes ============= +1.6.4 +----- + +This release un-breaks `lambdex <https://github.com/wickman/lambdex>`_. + +* Restore ``pex.pex_bootstrapper.is_compressed`` API. (#685) + `PR #685 <https://github.com/pantsbuild/pex/pull/685>`_ + +* Add the version of pex used to build a pex to build_properties. (#687) + `PR #687 <https://github.com/pantsbuild/pex/pull/687>`_ + +* Honor interpreter constraints even when PEX_PYTHON and PEX_PYTHON_PATH not set (#668) + `PR #668 <https://github.com/pantsbuild/pex/pull/668>`_ + 1.6.3 ----- +This release changes the behavior of the ``--interpreter-constraint`` option. +Previously, interpreter constraints were ANDed, which made it impossible to +express constraints like '>=2.7,<3' OR '>=3.6,<4'; ie: either python 2.7 or +else any python 3 release at or above 3.6. Now interpreter constraints are +ORed, which is likely a breaking change if you have scripts that pass multiple +interpreter constraints. To transition, use the native ``,`` AND operator in +your constraint expression, as used in the example above. + * Provide control over pex warning behavior. (#680) `PR #680 <https://github.com/pantsbuild/pex/pull/680>`_ diff --git a/pex/version.py b/pex/version.py index 24b3a7da7..0fc585ba9 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '1.6.3' +__version__ = '1.6.4'
pex-tool__pex-702
Release 1.6.6

On the docket:
+ [x] Release more flexible pex binaries. #654
+ [x] If sys.executable is not on PATH a pex will re-exec itself forever. #700
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.5'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.6'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 39ebf2bce..808b715ed 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,18 @@ Release Notes ============= +1.6.6 +----- + +This is the first release including only a single PEX pex, which +supports execution under all interpreters pex supports. + +* Fix pex bootstrap interpreter selection. (#701) + `PR #701 <https://github.com/pantsbuild/pex/pull/701>`_ + +* Switch releases to a single multi-pex. (#698) + `PR #698 <https://github.com/pantsbuild/pex/pull/698>`_ + 1.6.5 ----- diff --git a/pex/version.py b/pex/version.py index b15f6cce4..3ecfde800 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '1.6.5' +__version__ = '1.6.6'
ethereum__consensus-specs-1130
BLS and testing

I wanted to get this out to explain the current state of testing and **collect feedback** (implementers, please comment) on what you need from testing, and how you feel about BLS usage in tests.

# BLS and testing

The two pain points blocking a pretty (and large) set of test vectors for clients are:

- BLS signature creation
- BLS signature verification

A side issue, but easily resolved: *efficient creation of a genesis state*. Once BLS functionality is implemented in test code (creation of signed deposits, and verification), genesis creation becomes slow. The solution is to either cache it, or create it directly without going through the spec functions (the current temporary solution on the experiment branch).

## Status

This describes the status of [`spectest-deco` PR 1052](https://github.com/ethereum/eth2.0-specs/pull/1052), based on the `v06x` branch where we are developing the 0.6 improvements (to be merged back into dev later).

### The testing pipeline currently looks like:

- py-spec calls a BLS stub
- test helpers don't create self-signed objects with valid signatures
- py-test code is unified with test-vector creation (see [PR 1052](https://github.com/ethereum/eth2.0-specs/pull/1052))
- a py-test runner runs the spec tests, purely for assertions
- a test generator runs the spec tests, passing `generator_mode=true` to each of them, making them output a test vector

### Pytests status:

- moved from `tests/` to `eth2spec/test`, i.e. part of the package
- removed use of `pytest`
- annotated with `@spec_test` or similar (see PR 1052)
- as part of the test-generation effort (yay for shared effort):
  - expanded block-operation testing: [coverage checklist here](https://github.com/ethereum/eth2.0-specs/issues/927)
  - slightly faster, fewer deep copies
- stuck on the BLS stub (no signature creation/verification)

### Test-generation status:

- BLS, SSZ-generic, SSZ-static, and shuffling test generators are all still in place and up to date (`v06x` branch)
- the `operations` test-gen uses the test package's ability to output test vectors for each test case
  - but no valid signatures
  - no definition yet of how a test consumer should handle this signature problem
  - there are no signature-related test cases
  - turning BLS off would effectively let you check conformance, but it's hacky, and not remotely a good practice to have even an option for...
  - it's approximately ~140MB worth (iirc) of YAML-encoded state transitions, covering many edge cases; worth getting into the hands of implementers quickly
- `sanity` tests are updated and can be cleanly used for test generation, but require more work to define the format of the test vectors, as there is more variety
- `epoch` processing tests are also updated and usable, not as complete as block processing, lower priority

## Possible ways forward:

- Simple but hacky: "turn BLS off for testing"
- No "BLS off": BLS ON on the client side, but only partially on the spec side. Rely on signature verification not being hit before anything else during testing:
  - valid test cases are generated with valid signatures
  - invalid test cases are marked: does it error because of BLS? Runners should check the reason for aborting processing; if it doesn't match, the test should fail. These pytests don't need the full BLS update work and can be released somewhat more quickly.
- "BLS on", more work (~1 week):
  - slower test generation, but we get the best kind of test vectors: correct, with BLS verification ON
  - blocker: what if a test case fails because of a signature error (the test setup not creating the signature correctly) instead of a real assertion case? The spec will look correct and pass the tests, but things are not right. We need to mark signature-verification errors distinctly, so we can catch these problems when we turn BLS on in the pyspec. How: instead of `assert verify_...`, just call `verify_...`, and make it raise a special `BLSVerificationError` (or something like that).
- We likely still want to mark tests as "signature related" or not, so implementers can easily catch it if their code is not aborting properly before signature verification, to ensure invalid inputs are not costly.

A work-in-progress introduction of actual full BLS usage in the pytests is started here: [`tests-with-sigs` branch](https://github.com/ethereum/eth2.0-specs/tree/tests-with-sigs)

Suggestions welcome.
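As a rough illustration of the `BLSVerificationError` idea above, here is a minimal sketch. It assumes `bls_verify` is the pyspec's boolean verification primitive (exported from `eth2spec.utils.bls`); the wrapper name and exception class are hypothetical, not the actual pyspec implementation:

```python
# Sketch of the proposed error-marking approach; only bls_verify is taken
# from the spec, everything else here is a hypothetical illustration.
from eth2spec.utils.bls import bls_verify  # assumed export of the boolean primitive


class BLSVerificationError(Exception):
    """Raised when a signature check fails, so test runners can tell
    signature failures apart from ordinary assertion failures."""


def bls_verify_or_raise(pubkey, message_hash, signature, domain):
    # Replaces the spec pattern `assert bls_verify(...)`: a failed check now
    # raises a dedicated error instead of a generic AssertionError.
    if not bls_verify(pubkey, message_hash, signature, domain):
        raise BLSVerificationError("BLS signature verification failed")
```

A test runner could then treat an aborting `BLSVerificationError` as "signature related" and compare it against the case's `bls_setting`, while any other `AssertionError` keeps its current meaning.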
[ { "content": "import sys\nimport function_puller\n\n\ndef build_phase0_spec(sourcefile, outfile):\n code_lines = []\n code_lines.append(\"\"\"\nfrom typing import (\n Any,\n Dict,\n List,\n NewType,\n Tuple,\n)\nfrom eth2spec.utils.minimal_ssz import *\nfrom eth2spec.utils.bls_stub import *\n\n\"\"\")\n for i in (1, 2, 3, 4, 8, 32, 48, 96):\n code_lines.append(\"def int_to_bytes%d(x): return x.to_bytes(%d, 'little')\" % (i, i))\n\n code_lines.append(\"\"\"\n\n# stub, will get overwritten by real var\nSLOTS_PER_EPOCH = 64\n\n\nSlot = NewType('Slot', int) # uint64\nEpoch = NewType('Epoch', int) # uint64\nShard = NewType('Shard', int) # uint64\nValidatorIndex = NewType('ValidatorIndex', int) # uint64\nGwei = NewType('Gwei', int) # uint64\nBytes32 = NewType('Bytes32', bytes) # bytes32\nBLSPubkey = NewType('BLSPubkey', bytes) # bytes48\nBLSSignature = NewType('BLSSignature', bytes) # bytes96\nStore = None\n\"\"\")\n\n code_lines += function_puller.get_spec(sourcefile)\n\n code_lines.append(\"\"\"\n# Monkey patch validator compute committee code\n_compute_committee = compute_committee\ncommittee_cache = {}\n\n\ndef compute_committee(indices: List[ValidatorIndex], seed: Bytes32, index: int, count: int) -> List[ValidatorIndex]:\n param_hash = (hash_tree_root(indices), seed, index, count)\n\n if param_hash in committee_cache:\n return committee_cache[param_hash]\n else:\n ret = _compute_committee(indices, seed, index, count)\n committee_cache[param_hash] = ret\n return ret\n\n\n# Monkey patch hash cache\n_hash = hash\nhash_cache = {}\n\n\ndef hash(x):\n if x in hash_cache:\n return hash_cache[x]\n else:\n ret = _hash(x)\n hash_cache[x] = ret\n return ret\n\n# Access to overwrite spec constants based on configuration\ndef apply_constants_preset(preset: Dict[str, Any]):\n global_vars = globals()\n for k, v in preset.items():\n global_vars[k] = v\n\n # Deal with derived constants\n global_vars['GENESIS_EPOCH'] = slot_to_epoch(GENESIS_SLOT)\n\n # Initialize SSZ types again, to account for changed lengths\n init_SSZ_types()\n\"\"\")\n\n with open(outfile, 'w') as out:\n out.write(\"\\n\".join(code_lines))\n\n\nif __name__ == '__main__':\n if len(sys.argv) < 3:\n print(\"Usage: <source phase0> <output phase0 pyspec>\")\n build_phase0_spec(sys.argv[1], sys.argv[2])\n\n", "path": "scripts/phase0/build_spec.py" } ]
[ { "content": "import sys\nimport function_puller\n\n\ndef build_phase0_spec(sourcefile, outfile):\n code_lines = []\n code_lines.append(\"\"\"\nfrom typing import (\n Any,\n Dict,\n List,\n NewType,\n Tuple,\n)\nfrom eth2spec.utils.minimal_ssz import *\nfrom eth2spec.utils.bls import *\n\n\"\"\")\n for i in (1, 2, 3, 4, 8, 32, 48, 96):\n code_lines.append(\"def int_to_bytes%d(x): return x.to_bytes(%d, 'little')\" % (i, i))\n\n code_lines.append(\"\"\"\n\n# stub, will get overwritten by real var\nSLOTS_PER_EPOCH = 64\n\n\nSlot = NewType('Slot', int) # uint64\nEpoch = NewType('Epoch', int) # uint64\nShard = NewType('Shard', int) # uint64\nValidatorIndex = NewType('ValidatorIndex', int) # uint64\nGwei = NewType('Gwei', int) # uint64\nBytes32 = NewType('Bytes32', bytes) # bytes32\nBLSPubkey = NewType('BLSPubkey', bytes) # bytes48\nBLSSignature = NewType('BLSSignature', bytes) # bytes96\nStore = None\n\"\"\")\n\n code_lines += function_puller.get_spec(sourcefile)\n\n code_lines.append(\"\"\"\n# Monkey patch validator compute committee code\n_compute_committee = compute_committee\ncommittee_cache = {}\n\n\ndef compute_committee(indices: List[ValidatorIndex], seed: Bytes32, index: int, count: int) -> List[ValidatorIndex]:\n param_hash = (hash_tree_root(indices), seed, index, count)\n\n if param_hash in committee_cache:\n return committee_cache[param_hash]\n else:\n ret = _compute_committee(indices, seed, index, count)\n committee_cache[param_hash] = ret\n return ret\n\n\n# Monkey patch hash cache\n_hash = hash\nhash_cache = {}\n\n\ndef hash(x):\n if x in hash_cache:\n return hash_cache[x]\n else:\n ret = _hash(x)\n hash_cache[x] = ret\n return ret\n\n# Access to overwrite spec constants based on configuration\ndef apply_constants_preset(preset: Dict[str, Any]):\n global_vars = globals()\n for k, v in preset.items():\n global_vars[k] = v\n\n # Deal with derived constants\n global_vars['GENESIS_EPOCH'] = slot_to_epoch(GENESIS_SLOT)\n\n # Initialize SSZ types again, to account for changed lengths\n init_SSZ_types()\n\"\"\")\n\n with open(outfile, 'w') as out:\n out.write(\"\\n\".join(code_lines))\n\n\nif __name__ == '__main__':\n if len(sys.argv) < 3:\n print(\"Usage: <source phase0> <output phase0 pyspec>\")\n build_phase0_spec(sys.argv[1], sys.argv[2])\n\n", "path": "scripts/phase0/build_spec.py" } ]
diff --git a/Makefile b/Makefile index 73d8adea89..86303680d3 100644 --- a/Makefile +++ b/Makefile @@ -34,7 +34,7 @@ install_test: cd $(PY_SPEC_DIR); python3 -m venv venv; . venv/bin/activate; pip3 install -r requirements-testing.txt; test: $(PY_SPEC_ALL_TARGETS) - cd $(PY_SPEC_DIR); . venv/bin/activate; python -m pytest . + cd $(PY_SPEC_DIR); . venv/bin/activate; python -m pytest eth2spec citest: $(PY_SPEC_ALL_TARGETS) cd $(PY_SPEC_DIR); mkdir -p test-reports/eth2spec; . venv/bin/activate; python -m pytest --junitxml=test-reports/eth2spec/test_results.xml . diff --git a/scripts/phase0/build_spec.py b/scripts/phase0/build_spec.py index da5845951d..26b0e5a8a6 100644 --- a/scripts/phase0/build_spec.py +++ b/scripts/phase0/build_spec.py @@ -13,7 +13,7 @@ def build_phase0_spec(sourcefile, outfile): Tuple, ) from eth2spec.utils.minimal_ssz import * -from eth2spec.utils.bls_stub import * +from eth2spec.utils.bls import * """) for i in (1, 2, 3, 4, 8, 32, 48, 96): diff --git a/specs/core/0_beacon-chain.md b/specs/core/0_beacon-chain.md index e56fd976cc..46c811fedb 100644 --- a/specs/core/0_beacon-chain.md +++ b/specs/core/0_beacon-chain.md @@ -1756,7 +1756,8 @@ def process_deposit(state: BeaconState, deposit: Deposit) -> None: amount = deposit.data.amount validator_pubkeys = [v.pubkey for v in state.validator_registry] if pubkey not in validator_pubkeys: - # Verify the deposit signature (proof of possession) + # Verify the deposit signature (proof of possession). + # Invalid signatures are allowed by the deposit contract, and hence included on-chain, but must not be processed. if not bls_verify(pubkey, signing_root(deposit.data), deposit.data.signature, get_domain(state, DOMAIN_DEPOSIT)): return diff --git a/specs/test_formats/README.md b/specs/test_formats/README.md index 273659ce93..d245fcfa46 100644 --- a/specs/test_formats/README.md +++ b/specs/test_formats/README.md @@ -176,6 +176,18 @@ To prevent parsing of hundreds of different YAML files to test a specific test t ... <--- more test types ``` +## Common test-case properties + +Some test-case formats share some common key-value pair patterns, and these are documented here: + +``` +bls_setting: int -- optional, can have 3 different values: + 0: (default, applies if key-value pair is absent). Free to choose either BLS ON or OFF. + Tests are generated with valid BLS data in this case, + but there is no change of outcome when running the test if BLS is ON or OFF. + 1: known as "BLS required" - if the test validity is strictly dependent on BLS being ON + 2: known as "BLS ignored" - if the test validity is strictly dependent on BLS being OFF +``` ## Note for implementers diff --git a/specs/test_formats/epoch_processing/README.md b/specs/test_formats/epoch_processing/README.md new file mode 100644 index 0000000000..6384a0eda9 --- /dev/null +++ b/specs/test_formats/epoch_processing/README.md @@ -0,0 +1,29 @@ +# Epoch processing tests + +The different epoch sub-transitions are tested individually with test handlers. +The format is similar to block-processing state-transition tests. +There is no "change" factor however, the transitions are pure functions with just the pre-state as input. +Hence, the format is shared between each test-handler. (See test condition documentation on how to run the tests.) + +## Test case format + +```yaml +description: string -- description of test case, purely for debugging purposes +bls_setting: int -- see general test-format spec. 
+pre: BeaconState -- state before running the sub-transition +post: BeaconState -- state after applying the epoch sub-transition. +``` + +## Condition + +A handler of the `epoch_processing` test-runner should process these cases, + calling the corresponding processing implementation. + +Sub-transitions: + +| *`sub-transition-name`* | *`processing call`* | +|-------------------------|-----------------------------------| +| `crosslinks` | `process_crosslinks(state)` | +| `registry_updates` | `process_registry_updates(state)` | + +The resulting state should match the expected `post` state. diff --git a/specs/test_formats/operations/README.md b/specs/test_formats/operations/README.md index 842dc3615f..32cf880b36 100644 --- a/specs/test_formats/operations/README.md +++ b/specs/test_formats/operations/README.md @@ -2,9 +2,34 @@ The different kinds of operations ("transactions") are tested individually with test handlers. -The tested operation kinds are: -- [`deposits`](./deposits.md) -- More tests are work-in-progress. +## Test case format +```yaml +description: string -- description of test case, purely for debugging purposes +bls_setting: int -- see general test-format spec. +pre: BeaconState -- state before applying the operation +<operation-name>: <operation-object> -- the YAML encoded operation, e.g. a "ProposerSlashing", or "Deposit". +post: BeaconState -- state after applying the operation. No value if operation processing is aborted. +``` +## Condition +A handler of the `operations` test-runner should process these cases, + calling the corresponding processing implementation. + +Operations: + +| *`operation-name`* | *`operation-object`* | *`input name`* | *`processing call`* | +|-------------------------|----------------------|----------------------|--------------------------------------------------------| +| `attestation` | `Attestation` | `attestation` | `process_attestation(state, attestation)` | +| `attester_slashing` | `AttesterSlashing` | `attester_slashing` | `process_attester_slashing(state, attester_slashing)` | +| `block_header` | `Block` | `block` | `process_block_header(state, block)` | +| `deposit` | `Deposit` | `deposit` | `process_deposit(state, deposit)` | +| `proposer_slashing` | `ProposerSlashing` | `proposer_slashing` | `process_proposer_slashing(state, proposer_slashing)` | +| `transfer` | `Transfer` | `transfer` | `process_transfer(state, transfer)` | +| `voluntary_exit` | `VoluntaryExit` | `voluntary_exit` | `process_voluntary_exit(state, voluntary_exit)` | + +Note that `block_header` is not strictly an operation (and is a full `Block`), but processed in the same manner, and hence included here. + +The resulting state should match the expected `post` state, or if the `post` state is left blank, + the handler should reject the input operation as invalid. diff --git a/specs/test_formats/operations/deposits.md b/specs/test_formats/operations/deposits.md deleted file mode 100644 index 8f44ebb228..0000000000 --- a/specs/test_formats/operations/deposits.md +++ /dev/null @@ -1,18 +0,0 @@ -# Test format: Deposit operations - -A deposit is a form of an operation (or "transaction"), modifying the state. - -## Test case format - -```yaml -description: string -- description of test case, purely for debugging purposes -pre: BeaconState -- state before applying the deposit -deposit: Deposit -- the deposit -post: BeaconState -- state after applying the deposit. No value if deposit processing is aborted. 
-``` - -## Condition - -A `deposits` handler of the `operations` should process these cases, - calling the implementation of the `process_deposit(state, deposit)` functionality described in the spec. -The resulting state should match the expected `post` state, or if the `post` state is left blank, the handler should reject the inputs as invalid. diff --git a/specs/test_formats/sanity/README.md b/specs/test_formats/sanity/README.md new file mode 100644 index 0000000000..20b36208a4 --- /dev/null +++ b/specs/test_formats/sanity/README.md @@ -0,0 +1,7 @@ +# Sanity tests + +The aim of the sanity tests is to set a base-line on what really needs to pass, i.e. the essentials. + +There are two handlers, documented individually: +- [`slots`](./slots.md): transitions of one or more slots (and epoch transitions within) +- [`blocks`](./blocks.md): transitions triggered by one or more blocks diff --git a/specs/test_formats/sanity/blocks.md b/specs/test_formats/sanity/blocks.md new file mode 100644 index 0000000000..3004a6de70 --- /dev/null +++ b/specs/test_formats/sanity/blocks.md @@ -0,0 +1,18 @@ +# Sanity blocks testing + +Sanity tests to cover a series of one or more blocks being processed, aiming to cover common changes. + +## Test case format + +```yaml +description: string -- description of test case, purely for debugging purposes +bls_setting: int -- see general test-format spec. +pre: BeaconState -- state before running through the transitions triggered by the blocks. +blocks: [BeaconBlock] -- blocks to process, in given order, following the main transition function (i.e. process slot and epoch transitions in between blocks as normal) +post: BeaconState -- state after applying all the transitions triggered by the blocks. +``` + +## Condition + +The resulting state should match the expected `post` state, or if the `post` state is left blank, + the handler should reject the series of blocks as invalid. diff --git a/specs/test_formats/sanity/slots.md b/specs/test_formats/sanity/slots.md new file mode 100644 index 0000000000..81866d47b9 --- /dev/null +++ b/specs/test_formats/sanity/slots.md @@ -0,0 +1,23 @@ +# Sanity slots testing + +Sanity tests to cover a series of one or more empty-slot transitions being processed, aiming to cover common changes. + +## Test case format + +```yaml +description: string -- description of test case, purely for debugging purposes +bls_setting: int -- see general test-format spec. +pre: BeaconState -- state before running through the transitions. +slots: N -- amount of slots to process, N being a positive numer. +post: BeaconState -- state after applying all the transitions. +``` + +The transition with pure time, no blocks, is known as `state_transition_to(state, slot)` in the spec. +This runs state-caching (pure slot transition) and epoch processing (every E slots). + +To process the data, call `state_transition_to(pre, pre.slot + N)`. And see if `pre` mutated into the equivalent of `post`. + + +## Condition + +The resulting state should match the expected `post` state. diff --git a/specs/test_formats/ssz_static/core.md b/specs/test_formats/ssz_static/core.md index 1d470c3381..0f26e0f9c8 100644 --- a/specs/test_formats/ssz_static/core.md +++ b/specs/test_formats/ssz_static/core.md @@ -9,11 +9,11 @@ This test-format ensures these direct serializations are covered. ## Test case format ```yaml -type_name: string -- string, object name, formatted as in spec. E.g. "BeaconBlock" -value: dynamic -- the YAML-encoded value, of the type specified by type_name. 
-serialized: bytes -- string, SSZ-serialized data, hex encoded, with prefix 0x -root: bytes32 -- string, hash-tree-root of the value, hex encoded, with prefix 0x -signing_root: bytes32 -- string, signing-root of the value, hex encoded, with prefix 0x. Optional, present if type contains ``signature`` field +SomeObjectName: -- key, object name, formatted as in spec. E.g. "BeaconBlock". + value: dynamic -- the YAML-encoded value, of the type specified by type_name. + serialized: bytes -- string, SSZ-serialized data, hex encoded, with prefix 0x + root: bytes32 -- string, hash-tree-root of the value, hex encoded, with prefix 0x + signing_root: bytes32 -- string, signing-root of the value, hex encoded, with prefix 0x. Optional, present if type contains ``signature`` field ``` ## Condition diff --git a/test_generators/README.md b/test_generators/README.md index 43bf7af031..309a64bd92 100644 --- a/test_generators/README.md +++ b/test_generators/README.md @@ -58,7 +58,7 @@ It's recommended to extend the base-generator. Create a `requirements.txt` in the root of your generator directory: ``` -eth-utils==1.4.1 +eth-utils==1.6.0 ../../test_libs/gen_helpers ../../test_libs/config_helpers ../../test_libs/pyspec diff --git a/test_generators/bls/requirements.txt b/test_generators/bls/requirements.txt index 8a933d41ca..6d83bdfb59 100644 --- a/test_generators/bls/requirements.txt +++ b/test_generators/bls/requirements.txt @@ -1,3 +1,3 @@ -py-ecc==1.6.0 -eth-utils==1.4.1 +py-ecc==1.7.0 +eth-utils==1.6.0 ../../test_libs/gen_helpers diff --git a/test_generators/epoch_processing/README.md b/test_generators/epoch_processing/README.md new file mode 100644 index 0000000000..9b57875e2a --- /dev/null +++ b/test_generators/epoch_processing/README.md @@ -0,0 +1,11 @@ +# Epoch processing + +Epoch processing covers the sub-transitions during an epoch change. + +An epoch-processing test-runner can consume these sub-transition test-suites, + and handle different kinds of epoch sub-transitions by processing the cases using the specified test handler. + +Information on the format of the tests can be found in the [epoch-processing test formats documentation](../../specs/test_formats/epoch_processing/README.md). 
+ + + diff --git a/test_generators/epoch_processing/main.py b/test_generators/epoch_processing/main.py new file mode 100644 index 0000000000..8f067e4a35 --- /dev/null +++ b/test_generators/epoch_processing/main.py @@ -0,0 +1,38 @@ +from typing import Callable, Iterable + +from eth2spec.phase0 import spec +from eth2spec.test.epoch_processing import ( + test_process_crosslinks, + test_process_registry_updates +) +from gen_base import gen_runner, gen_suite, gen_typing +from gen_from_tests.gen import generate_from_tests +from preset_loader import loader + + +def create_suite(transition_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \ + -> Callable[[str], gen_typing.TestSuiteOutput]: + def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput: + presets = loader.load_presets(configs_path, config_name) + spec.apply_constants_preset(presets) + + return ("%s_%s" % (transition_name, config_name), transition_name, gen_suite.render_suite( + title="%s epoch processing" % transition_name, + summary="Test suite for %s type epoch processing" % transition_name, + forks_timeline="testing", + forks=["phase0"], + config=config_name, + runner="epoch_processing", + handler=transition_name, + test_cases=get_cases())) + + return suite_definition + + +if __name__ == "__main__": + gen_runner.run_generator("epoch_processing", [ + create_suite('crosslinks', 'minimal', lambda: generate_from_tests(test_process_crosslinks)), + create_suite('crosslinks', 'mainnet', lambda: generate_from_tests(test_process_crosslinks)), + create_suite('registry_updates', 'minimal', lambda: generate_from_tests(test_process_registry_updates)), + create_suite('registry_updates', 'mainnet', lambda: generate_from_tests(test_process_registry_updates)), + ]) diff --git a/test_generators/epoch_processing/requirements.txt b/test_generators/epoch_processing/requirements.txt new file mode 100644 index 0000000000..595cee69cd --- /dev/null +++ b/test_generators/epoch_processing/requirements.txt @@ -0,0 +1,4 @@ +eth-utils==1.6.0 +../../test_libs/gen_helpers +../../test_libs/config_helpers +../../test_libs/pyspec \ No newline at end of file diff --git a/test_generators/operations/README.md b/test_generators/operations/README.md index e0b9d0e187..5cb3afc989 100644 --- a/test_generators/operations/README.md +++ b/test_generators/operations/README.md @@ -3,7 +3,6 @@ Operations (or "transactions" in previous spec iterations), are atomic changes to the state, introduced by embedding in blocks. -This generator provides a series of test suites, divided into handler, for each operation type. An operation test-runner can consume these operation test-suites, and handle different kinds of operations by processing the cases using the specified test handler. 
diff --git a/test_generators/operations/deposits.py b/test_generators/operations/deposits.py deleted file mode 100644 index 075ccbd5ba..0000000000 --- a/test_generators/operations/deposits.py +++ /dev/null @@ -1,180 +0,0 @@ -from eth2spec.phase0 import spec -from eth_utils import ( - to_dict, to_tuple -) -from gen_base import gen_suite, gen_typing -from preset_loader import loader -from eth2spec.debug.encode import encode -from eth2spec.utils.minimal_ssz import signing_root -from eth2spec.utils.merkle_minimal import get_merkle_root, calc_merkle_tree_from_leaves, get_merkle_proof - -from typing import List, Tuple - -import genesis -import keys -from py_ecc import bls - - -def build_deposit_data(state, - pubkey: spec.BLSPubkey, - withdrawal_cred: spec.Bytes32, - privkey: int, - amount: int): - deposit_data = spec.DepositData( - pubkey=pubkey, - withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + withdrawal_cred[1:], - amount=amount, - ) - deposit_data.proof_of_possession = bls.sign( - message_hash=signing_root(deposit_data), - privkey=privkey, - domain=spec.get_domain( - state, - spec.get_current_epoch(state), - spec.DOMAIN_DEPOSIT, - ) - ) - return deposit_data - - -def build_deposit(state, - deposit_data_leaves: List[spec.Bytes32], - pubkey: spec.BLSPubkey, - withdrawal_cred: spec.Bytes32, - privkey: int, - amount: int) -> spec.Deposit: - - deposit_data = build_deposit_data(state, pubkey, withdrawal_cred, privkey, amount) - - item = deposit_data.hash_tree_root() - index = len(deposit_data_leaves) - deposit_data_leaves.append(item) - tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves)) - proof = list(get_merkle_proof(tree, item_index=index)) - - deposit = spec.Deposit( - proof=list(proof), - index=index, - data=deposit_data, - ) - assert spec.verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, get_merkle_root(tuple(deposit_data_leaves))) - - return deposit - - -def build_deposit_for_index(initial_validator_count: int, index: int) -> Tuple[spec.Deposit, spec.BeaconState]: - genesis_deposits = genesis.create_deposits( - keys.pubkeys[:initial_validator_count], - keys.withdrawal_creds[:initial_validator_count] - ) - state = genesis.create_genesis_state(genesis_deposits) - - deposit_data_leaves = [dep.data.hash_tree_root() for dep in genesis_deposits] - - deposit = build_deposit( - state, - deposit_data_leaves, - keys.pubkeys[index], - keys.withdrawal_creds[index], - keys.privkeys[index], - spec.MAX_EFFECTIVE_BALANCE, - ) - - state.latest_eth1_data.deposit_root = get_merkle_root(tuple(deposit_data_leaves)) - state.latest_eth1_data.deposit_count = len(deposit_data_leaves) - - return deposit, state - - -@to_dict -def valid_deposit(): - new_dep, state = build_deposit_for_index(10, 10) - yield 'description', 'valid deposit to add new validator' - yield 'pre', encode(state, spec.BeaconState) - yield 'deposit', encode(new_dep, spec.Deposit) - spec.process_deposit(state, new_dep) - yield 'post', encode(state, spec.BeaconState) - - -@to_dict -def valid_topup(): - new_dep, state = build_deposit_for_index(10, 3) - yield 'description', 'valid deposit to top-up existing validator' - yield 'pre', encode(state, spec.BeaconState) - yield 'deposit', encode(new_dep, spec.Deposit) - spec.process_deposit(state, new_dep) - yield 'post', encode(state, spec.BeaconState) - - -@to_dict -def invalid_deposit_index(): - new_dep, state = build_deposit_for_index(10, 10) - # Mess up deposit index, 1 too small - state.deposit_index = 9 - - yield 'description', 'invalid deposit index' - 
yield 'pre', encode(state, spec.BeaconState) - yield 'deposit', encode(new_dep, spec.Deposit) - try: - spec.process_deposit(state, new_dep) - except AssertionError: - # expected - yield 'post', None - return - raise Exception('invalid_deposit_index has unexpectedly allowed deposit') - - -@to_dict -def invalid_deposit_proof(): - new_dep, state = build_deposit_for_index(10, 10) - # Make deposit proof invalid (at bottom of proof) - new_dep.proof[-1] = spec.ZERO_HASH - - yield 'description', 'invalid deposit proof' - yield 'pre', encode(state, spec.BeaconState) - yield 'deposit', encode(new_dep, spec.Deposit) - try: - spec.process_deposit(state, new_dep) - except AssertionError: - # expected - yield 'post', None - return - raise Exception('invalid_deposit_index has unexpectedly allowed deposit') - - -@to_tuple -def deposit_cases(): - yield valid_deposit() - yield valid_topup() - yield invalid_deposit_index() - yield invalid_deposit_proof() - - -def mini_deposits_suite(configs_path: str) -> gen_typing.TestSuiteOutput: - presets = loader.load_presets(configs_path, 'minimal') - spec.apply_constants_preset(presets) - - return ("deposit_minimal", "deposits", gen_suite.render_suite( - title="deposit operation", - summary="Test suite for deposit type operation processing", - forks_timeline="testing", - forks=["phase0"], - config="minimal", - runner="operations", - handler="deposits", - test_cases=deposit_cases())) - - -def full_deposits_suite(configs_path: str) -> gen_typing.TestSuiteOutput: - presets = loader.load_presets(configs_path, 'mainnet') - spec.apply_constants_preset(presets) - - return ("deposit_full", "deposits", gen_suite.render_suite( - title="deposit operation", - summary="Test suite for deposit type operation processing", - forks_timeline="mainnet", - forks=["phase0"], - config="mainnet", - runner="operations", - handler="deposits", - test_cases=deposit_cases())) diff --git a/test_generators/operations/genesis.py b/test_generators/operations/genesis.py deleted file mode 100644 index f4d63c10ec..0000000000 --- a/test_generators/operations/genesis.py +++ /dev/null @@ -1,44 +0,0 @@ -from eth2spec.phase0 import spec -from eth2spec.utils.merkle_minimal import get_merkle_root, calc_merkle_tree_from_leaves, get_merkle_proof -from typing import List - - -def create_genesis_state(deposits: List[spec.Deposit]) -> spec.BeaconState: - deposit_root = get_merkle_root((tuple([(dep.data.hash_tree_root()) for dep in deposits]))) - - return spec.get_genesis_beacon_state( - deposits, - genesis_time=0, - genesis_eth1_data=spec.Eth1Data( - deposit_root=deposit_root, - deposit_count=len(deposits), - block_hash=spec.ZERO_HASH, - ), - ) - - -def create_deposits(pubkeys: List[spec.BLSPubkey], withdrawal_cred: List[spec.Bytes32]) -> List[spec.Deposit]: - - # Mock proof of possession - proof_of_possession = b'\x33' * 96 - - deposit_data = [ - spec.DepositData( - pubkey=pubkeys[i], - withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + withdrawal_cred[i][1:], - amount=spec.MAX_EFFECTIVE_BALANCE, - proof_of_possession=proof_of_possession, - ) for i in range(len(pubkeys)) - ] - - # Fill tree with existing deposits - deposit_data_leaves = [data.hash_tree_root() for data in deposit_data] - tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves)) - - return [ - spec.Deposit( - proof=list(get_merkle_proof(tree, item_index=i)), - index=i, - data=deposit_data[i] - ) for i in range(len(deposit_data)) - ] diff --git a/test_generators/operations/keys.py b/test_generators/operations/keys.py deleted file mode 
100644 index db4f59e0e6..0000000000 --- a/test_generators/operations/keys.py +++ /dev/null @@ -1,7 +0,0 @@ -from py_ecc import bls -from eth2spec.phase0.spec import hash - -privkeys = list(range(1, 101)) -pubkeys = [bls.privtopub(k) for k in privkeys] -# Insecure, but easier to follow -withdrawal_creds = [hash(bls.privtopub(k)) for k in privkeys] diff --git a/test_generators/operations/main.py b/test_generators/operations/main.py index 8b0a2a6d83..96c639d12d 100644 --- a/test_generators/operations/main.py +++ b/test_generators/operations/main.py @@ -1,9 +1,53 @@ -from gen_base import gen_runner +from typing import Callable, Iterable + +from eth2spec.test.block_processing import ( + test_process_attestation, + test_process_attester_slashing, + test_process_block_header, + test_process_deposit, + test_process_proposer_slashing, + test_process_transfer, + test_process_voluntary_exit +) + +from gen_base import gen_runner, gen_suite, gen_typing +from gen_from_tests.gen import generate_from_tests +from preset_loader import loader +from eth2spec.phase0 import spec + + +def create_suite(operation_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \ + -> Callable[[str], gen_typing.TestSuiteOutput]: + def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput: + presets = loader.load_presets(configs_path, config_name) + spec.apply_constants_preset(presets) + + return ("%s_%s" % (operation_name, config_name), operation_name, gen_suite.render_suite( + title="%s operation" % operation_name, + summary="Test suite for %s type operation processing" % operation_name, + forks_timeline="testing", + forks=["phase0"], + config=config_name, + runner="operations", + handler=operation_name, + test_cases=get_cases())) + return suite_definition -from deposits import mini_deposits_suite, full_deposits_suite if __name__ == "__main__": gen_runner.run_generator("operations", [ - mini_deposits_suite, - full_deposits_suite + create_suite('attestation', 'minimal', lambda: generate_from_tests(test_process_attestation)), + create_suite('attestation', 'mainnet', lambda: generate_from_tests(test_process_attestation)), + create_suite('attester_slashing', 'minimal', lambda: generate_from_tests(test_process_attester_slashing)), + create_suite('attester_slashing', 'mainnet', lambda: generate_from_tests(test_process_attester_slashing)), + create_suite('block_header', 'minimal', lambda: generate_from_tests(test_process_block_header)), + create_suite('block_header', 'mainnet', lambda: generate_from_tests(test_process_block_header)), + create_suite('deposit', 'minimal', lambda: generate_from_tests(test_process_deposit)), + create_suite('deposit', 'mainnet', lambda: generate_from_tests(test_process_deposit)), + create_suite('proposer_slashing', 'minimal', lambda: generate_from_tests(test_process_proposer_slashing)), + create_suite('proposer_slashing', 'mainnet', lambda: generate_from_tests(test_process_proposer_slashing)), + create_suite('transfer', 'minimal', lambda: generate_from_tests(test_process_transfer)), + create_suite('transfer', 'mainnet', lambda: generate_from_tests(test_process_transfer)), + create_suite('voluntary_exit', 'minimal', lambda: generate_from_tests(test_process_voluntary_exit)), + create_suite('voluntary_exit', 'mainnet', lambda: generate_from_tests(test_process_voluntary_exit)), ]) diff --git a/test_generators/operations/requirements.txt b/test_generators/operations/requirements.txt index dfe8535365..595cee69cd 100644 --- 
a/test_generators/operations/requirements.txt +++ b/test_generators/operations/requirements.txt @@ -1,5 +1,4 @@ -eth-utils==1.4.1 +eth-utils==1.6.0 ../../test_libs/gen_helpers ../../test_libs/config_helpers -../../test_libs/pyspec -py_ecc \ No newline at end of file +../../test_libs/pyspec \ No newline at end of file diff --git a/test_generators/sanity/README.md b/test_generators/sanity/README.md new file mode 100644 index 0000000000..6d2e2f30dd --- /dev/null +++ b/test_generators/sanity/README.md @@ -0,0 +1,8 @@ +# Sanity tests + +Sanity tests cover regular state-transitions in a common block-list format, to ensure the basics work. + +Information on the format of the tests can be found in the [sanity test formats documentation](../../specs/test_formats/sanity/README.md). + + + diff --git a/test_generators/sanity/main.py b/test_generators/sanity/main.py new file mode 100644 index 0000000000..bba6ed03df --- /dev/null +++ b/test_generators/sanity/main.py @@ -0,0 +1,35 @@ +from typing import Callable, Iterable + +from eth2spec.test.sanity import test_blocks, test_slots + +from gen_base import gen_runner, gen_suite, gen_typing +from gen_from_tests.gen import generate_from_tests +from preset_loader import loader +from eth2spec.phase0 import spec + + +def create_suite(handler_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \ + -> Callable[[str], gen_typing.TestSuiteOutput]: + def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput: + presets = loader.load_presets(configs_path, config_name) + spec.apply_constants_preset(presets) + + return ("%sanity_s_%s" % (handler_name, config_name), handler_name, gen_suite.render_suite( + title="sanity testing", + summary="Sanity test suite, %s type, generated from pytests" % handler_name, + forks_timeline="testing", + forks=["phase0"], + config=config_name, + runner="sanity", + handler=handler_name, + test_cases=get_cases())) + return suite_definition + + +if __name__ == "__main__": + gen_runner.run_generator("sanity", [ + create_suite('blocks', 'minimal', lambda: generate_from_tests(test_blocks)), + create_suite('blocks', 'mainnet', lambda: generate_from_tests(test_blocks)), + create_suite('slots', 'minimal', lambda: generate_from_tests(test_slots)), + create_suite('slots', 'mainnet', lambda: generate_from_tests(test_slots)), + ]) diff --git a/test_generators/sanity/requirements.txt b/test_generators/sanity/requirements.txt new file mode 100644 index 0000000000..595cee69cd --- /dev/null +++ b/test_generators/sanity/requirements.txt @@ -0,0 +1,4 @@ +eth-utils==1.6.0 +../../test_libs/gen_helpers +../../test_libs/config_helpers +../../test_libs/pyspec \ No newline at end of file diff --git a/test_generators/shuffling/main.py b/test_generators/shuffling/main.py index 2c4faeb8fb..bb14520e12 100644 --- a/test_generators/shuffling/main.py +++ b/test_generators/shuffling/main.py @@ -10,7 +10,7 @@ def shuffling_case(seed: spec.Bytes32, count: int): yield 'seed', '0x' + seed.hex() yield 'count', count - yield 'shuffled', [spec.get_permuted_index(i, count, seed) for i in range(count)] + yield 'shuffled', [spec.get_shuffled_index(i, count, seed) for i in range(count)] @to_tuple diff --git a/test_generators/shuffling/requirements.txt b/test_generators/shuffling/requirements.txt index 8f9bede8f3..595cee69cd 100644 --- a/test_generators/shuffling/requirements.txt +++ b/test_generators/shuffling/requirements.txt @@ -1,4 +1,4 @@ -eth-utils==1.4.1 +eth-utils==1.6.0 ../../test_libs/gen_helpers ../../test_libs/config_helpers 
../../test_libs/pyspec \ No newline at end of file diff --git a/test_generators/ssz_generic/requirements.txt b/test_generators/ssz_generic/requirements.txt index 94afc9d91b..dcdb0824ff 100644 --- a/test_generators/ssz_generic/requirements.txt +++ b/test_generators/ssz_generic/requirements.txt @@ -1,4 +1,4 @@ -eth-utils==1.4.1 +eth-utils==1.6.0 ../../test_libs/gen_helpers ../../test_libs/config_helpers ssz==0.1.0a2 diff --git a/test_generators/ssz_static/main.py b/test_generators/ssz_static/main.py index 1234294db9..e8995b9185 100644 --- a/test_generators/ssz_static/main.py +++ b/test_generators/ssz_static/main.py @@ -18,10 +18,7 @@ @to_dict -def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMode, chaos: bool): - typ = spec.get_ssz_type_by_name(name) - value = random_value.get_random_ssz_object(rng, typ, MAX_BYTES_LENGTH, MAX_LIST_LENGTH, mode, chaos) - yield "type_name", name +def create_test_case_contents(value, typ): yield "value", encode.encode(value, typ) yield "serialized", '0x' + serialize(value).hex() yield "root", '0x' + hash_tree_root(value).hex() @@ -29,6 +26,13 @@ def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMod yield "signing_root", '0x' + signing_root(value).hex() +@to_dict +def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMode, chaos: bool): + typ = spec.get_ssz_type_by_name(name) + value = random_value.get_random_ssz_object(rng, typ, MAX_BYTES_LENGTH, MAX_LIST_LENGTH, mode, chaos) + yield name, create_test_case_contents(value, typ) + + @to_tuple def ssz_static_cases(rng: Random, mode: random_value.RandomizationMode, chaos: bool, count: int): for type_name in spec.ssz_types: diff --git a/test_generators/ssz_static/requirements.txt b/test_generators/ssz_static/requirements.txt index 8f9bede8f3..595cee69cd 100644 --- a/test_generators/ssz_static/requirements.txt +++ b/test_generators/ssz_static/requirements.txt @@ -1,4 +1,4 @@ -eth-utils==1.4.1 +eth-utils==1.6.0 ../../test_libs/gen_helpers ../../test_libs/config_helpers ../../test_libs/pyspec \ No newline at end of file diff --git a/test_libs/config_helpers/requirements.txt b/test_libs/config_helpers/requirements.txt index e441a474b8..f2f208c3fb 100644 --- a/test_libs/config_helpers/requirements.txt +++ b/test_libs/config_helpers/requirements.txt @@ -1 +1 @@ -ruamel.yaml==0.15.87 +ruamel.yaml==0.15.96 diff --git a/test_libs/config_helpers/setup.py b/test_libs/config_helpers/setup.py index 90ad94ee44..9f0ea06419 100644 --- a/test_libs/config_helpers/setup.py +++ b/test_libs/config_helpers/setup.py @@ -4,6 +4,6 @@ name='config_helpers', packages=['preset_loader'], install_requires=[ - "ruamel.yaml==0.15.87" + "ruamel.yaml==0.15.96" ] ) diff --git a/test_libs/pyspec/tests/__init__.py b/test_libs/gen_helpers/gen_from_tests/__init__.py similarity index 100% rename from test_libs/pyspec/tests/__init__.py rename to test_libs/gen_helpers/gen_from_tests/__init__.py diff --git a/test_libs/gen_helpers/gen_from_tests/gen.py b/test_libs/gen_helpers/gen_from_tests/gen.py new file mode 100644 index 0000000000..e7d8011310 --- /dev/null +++ b/test_libs/gen_helpers/gen_from_tests/gen.py @@ -0,0 +1,25 @@ +from inspect import getmembers, isfunction + +def generate_from_tests(src, bls_active=True): + """ + Generate a list of test cases by running tests from the given src in generator-mode. + :param src: to retrieve tests from (discovered using inspect.getmembers) + :param bls_active: optional, to override BLS switch preference. Defaults to True. 
+ :return: the list of test cases. + """ + fn_names = [ + name for (name, _) in getmembers(src, isfunction) + if name.startswith('test_') + ] + out = [] + print("generating test vectors from tests source: %s" % src.__name__) + for name in fn_names: + tfn = getattr(src, name) + try: + test_case = tfn(generator_mode=True, bls_active=bls_active) + # If no test case data is returned, the test is ignored. + if test_case is not None: + out.append(test_case) + except AssertionError: + print("ERROR: failed to generate vector from test: %s (src: %s)" % (name, src.__name__)) + return out diff --git a/test_libs/gen_helpers/requirements.txt b/test_libs/gen_helpers/requirements.txt index 3d6a39458e..557cae6317 100644 --- a/test_libs/gen_helpers/requirements.txt +++ b/test_libs/gen_helpers/requirements.txt @@ -1,2 +1,2 @@ -ruamel.yaml==0.15.87 -eth-utils==1.4.1 +ruamel.yaml==0.15.96 +eth-utils==1.6.0 diff --git a/test_libs/gen_helpers/setup.py b/test_libs/gen_helpers/setup.py index 5de27a6dbe..ee2c815c76 100644 --- a/test_libs/gen_helpers/setup.py +++ b/test_libs/gen_helpers/setup.py @@ -2,9 +2,9 @@ setup( name='gen_helpers', - packages=['gen_base'], + packages=['gen_base', 'gen_from_tests'], install_requires=[ - "ruamel.yaml==0.15.87", - "eth-utils==1.4.1" + "ruamel.yaml==0.15.96", + "eth-utils==1.6.0" ] ) diff --git a/test_libs/pyspec/README.md b/test_libs/pyspec/README.md index df18342100..330972e772 100644 --- a/test_libs/pyspec/README.md +++ b/test_libs/pyspec/README.md @@ -46,8 +46,9 @@ The `-B` flag may be helpful to force-overwrite the `pyspec` output after you ma Run the tests: ``` -pytest --config=minimal +pytest --config=minimal eth2spec ``` +Note the package-name, this is to locate the tests. ## Contributing diff --git a/test_libs/pyspec/eth2spec/test/__init__.py b/test_libs/pyspec/eth2spec/test/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/test_libs/pyspec/eth2spec/test/block_processing/__init__.py b/test_libs/pyspec/eth2spec/test/block_processing/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py new file mode 100644 index 0000000000..af6b39ef6e --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py @@ -0,0 +1,255 @@ +from copy import deepcopy + +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + get_current_epoch, + process_attestation +) +from eth2spec.phase0.state_transition import ( + state_transition_to, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.attestations import ( + get_valid_attestation, + sign_attestation, +) +from eth2spec.test.helpers.state import ( + next_epoch, + next_slot, +) +from eth2spec.test.helpers.block import apply_empty_block + + +def run_attestation_processing(state, attestation, valid=True): + """ + Run ``process_attestation``, yielding: + - pre-state ('pre') + - attestation ('attestation') + - post-state ('post'). + If ``valid == False``, run expecting ``AssertionError`` + """ + # yield pre-state + yield 'pre', state + + yield 'attestation', attestation + + # If the attestation is invalid, processing is aborted, and there is no post-state. 
+ if not valid: + expect_assertion_error(lambda: process_attestation(state, attestation)) + yield 'post', None + return + + current_epoch_count = len(state.current_epoch_attestations) + previous_epoch_count = len(state.previous_epoch_attestations) + + # process attestation + process_attestation(state, attestation) + + # Make sure the attestation has been processed + if attestation.data.target_epoch == get_current_epoch(state): + assert len(state.current_epoch_attestations) == current_epoch_count + 1 + else: + assert len(state.previous_epoch_attestations) == previous_epoch_count + 1 + + # yield post-state + yield 'post', state + + +@spec_state_test +def test_success(state): + attestation = get_valid_attestation(state, signed=True) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + yield from run_attestation_processing(state, attestation) + + +@spec_state_test +def test_success_previous_epoch(state): + attestation = get_valid_attestation(state, signed=True) + next_epoch(state) + apply_empty_block(state) + + yield from run_attestation_processing(state, attestation) + + +@always_bls +@spec_state_test +def test_invalid_attestation_signature(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_before_inclusion_delay(state): + attestation = get_valid_attestation(state, signed=True) + # do not increment slot to allow for inclusion delay + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_after_epoch_slots(state): + attestation = get_valid_attestation(state, signed=True) + # increment past latest inclusion slot + state_transition_to(state, state.slot + spec.SLOTS_PER_EPOCH + 1) + apply_empty_block(state) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_old_source_epoch(state): + state.slot = spec.SLOTS_PER_EPOCH * 5 + state.finalized_epoch = 2 + state.previous_justified_epoch = 3 + state.current_justified_epoch = 4 + attestation = get_valid_attestation(state, slot=(spec.SLOTS_PER_EPOCH * 3) + 1) + + # test logic sanity check: make sure the attestation is pointing to oldest known source epoch + assert attestation.data.source_epoch == state.previous_justified_epoch + + # Now go beyond that, it will be invalid + attestation.data.source_epoch -= 1 + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_wrong_shard(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.data.shard += 1 + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_new_source_epoch(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.data.source_epoch += 1 + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_source_root_is_target_root(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.data.source_root = attestation.data.target_root + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_invalid_current_source_root(state): + 
state.slot = spec.SLOTS_PER_EPOCH * 5 + state.finalized_epoch = 2 + + state.previous_justified_epoch = 3 + state.previous_justified_root = b'\x01' * 32 + + state.current_justified_epoch = 4 + state.current_justified_root = b'\xff' * 32 + + attestation = get_valid_attestation(state, slot=(spec.SLOTS_PER_EPOCH * 3) + 1) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + # Test logic sanity checks: + assert state.current_justified_root != state.previous_justified_root + assert attestation.data.source_root == state.previous_justified_root + + # Make attestation source root invalid: should be previous justified, not current one + attestation.data.source_root = state.current_justified_root + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_bad_source_root(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.data.source_root = b'\x42' * 32 + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_non_zero_crosslink_data_root(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.data.crosslink_data_root = b'\x42' * 32 + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_bad_previous_crosslink(state): + next_epoch(state) + apply_empty_block(state) + + attestation = get_valid_attestation(state, signed=True) + for _ in range(spec.MIN_ATTESTATION_INCLUSION_DELAY): + next_slot(state) + apply_empty_block(state) + + state.current_crosslinks[attestation.data.shard].epoch += 10 + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_inconsistent_bitfields(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield) + b'\x00' + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_non_empty_custody_bitfield(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield) + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) + + +@spec_state_test +def test_empty_aggregation_bitfield(state): + attestation = get_valid_attestation(state) + state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + + attestation.aggregation_bitfield = b'\x00' * len(attestation.aggregation_bitfield) + + sign_attestation(state, attestation) + + yield from run_attestation_processing(state, attestation, False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py new file mode 100644 index 0000000000..28e2322772 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py @@ -0,0 +1,149 @@ +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + get_beacon_proposer_index, + process_attester_slashing, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.attestations import 
sign_indexed_attestation
+from eth2spec.test.helpers.attester_slashings import get_valid_attester_slashing
+from eth2spec.test.helpers.block import apply_empty_block
+from eth2spec.test.helpers.state import (
+    get_balance,
+    next_epoch,
+)
+
+
+def run_attester_slashing_processing(state, attester_slashing, valid=True):
+    """
+    Run ``process_attester_slashing``, yielding:
+      - pre-state ('pre')
+      - attester_slashing ('attester_slashing')
+      - post-state ('post').
+    If ``valid == False``, run expecting ``AssertionError``
+    """
+
+    yield 'pre', state
+    yield 'attester_slashing', attester_slashing
+
+    if not valid:
+        expect_assertion_error(lambda: process_attester_slashing(state, attester_slashing))
+        yield 'post', None
+        return
+
+    slashed_index = attester_slashing.attestation_1.custody_bit_0_indices[0]
+    pre_slashed_balance = get_balance(state, slashed_index)
+
+    proposer_index = get_beacon_proposer_index(state)
+    pre_proposer_balance = get_balance(state, proposer_index)
+
+    # Process slashing
+    process_attester_slashing(state, attester_slashing)
+
+    slashed_validator = state.validator_registry[slashed_index]
+
+    # Check slashing
+    assert slashed_validator.slashed
+    assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
+    assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
+
+    if slashed_index != proposer_index:
+        # lost whistleblower reward
+        assert get_balance(state, slashed_index) < pre_slashed_balance
+        # gained whistleblower reward
+        assert get_balance(state, proposer_index) > pre_proposer_balance
+    else:
+        # gained rewards for all slashings, which may include others, and only lost their own slashing penalty.
+        # Net result is at least 0; if more validators were slashed, a balance increase.
+        assert get_balance(state, slashed_index) >= pre_slashed_balance
+
+    yield 'post', state
+
+
+@spec_state_test
+def test_success_double(state):
+    attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True)
+
+    yield from run_attester_slashing_processing(state, attester_slashing)
+
+
+@spec_state_test
+def test_success_surround(state):
+    next_epoch(state)
+    apply_empty_block(state)
+
+    state.current_justified_epoch += 1
+    attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+    # set attestation 1 to surround attestation 2
+    attester_slashing.attestation_1.data.source_epoch = attester_slashing.attestation_2.data.source_epoch - 1
+    attester_slashing.attestation_1.data.target_epoch = attester_slashing.attestation_2.data.target_epoch + 1
+
+    sign_indexed_attestation(state, attester_slashing.attestation_1)
+
+    yield from run_attester_slashing_processing(state, attester_slashing)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1(state):
+    attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+    yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_2(state):
+    attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=False)
+    yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1_and_2(state):
+    attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=False)
+    yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@spec_state_test
+def test_same_data(state):
+    attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+    attester_slashing.attestation_1.data
= attester_slashing.attestation_2.data + sign_indexed_attestation(state, attester_slashing.attestation_1) + + yield from run_attester_slashing_processing(state, attester_slashing, False) + + +@spec_state_test +def test_no_double_or_surround(state): + attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True) + + attester_slashing.attestation_1.data.target_epoch += 1 + sign_indexed_attestation(state, attester_slashing.attestation_1) + + yield from run_attester_slashing_processing(state, attester_slashing, False) + + +@spec_state_test +def test_participants_already_slashed(state): + attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True) + + # set all indices to slashed + attestation_1 = attester_slashing.attestation_1 + validator_indices = attestation_1.custody_bit_0_indices + attestation_1.custody_bit_1_indices + for index in validator_indices: + state.validator_registry[index].slashed = True + + yield from run_attester_slashing_processing(state, attester_slashing, False) + + +@spec_state_test +def test_custody_bit_0_and_1(state): + attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True) + + attester_slashing.attestation_1.custody_bit_1_indices = ( + attester_slashing.attestation_1.custody_bit_0_indices + ) + sign_indexed_attestation(state, attester_slashing.attestation_1) + + yield from run_attester_slashing_processing(state, attester_slashing, False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py new file mode 100644 index 0000000000..454f557c5c --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py @@ -0,0 +1,87 @@ +from copy import deepcopy + +from eth2spec.phase0.spec import ( + get_beacon_proposer_index, + cache_state, + advance_slot, + process_block_header, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.block import ( + build_empty_block_for_next_slot, + sign_block +) +from eth2spec.test.helpers.state import next_slot + + +def prepare_state_for_header_processing(state): + cache_state(state) + advance_slot(state) + + +def run_block_header_processing(state, block, valid=True): + """ + Run ``process_block_header``, yielding: + - pre-state ('pre') + - block ('block') + - post-state ('post'). 
+ If ``valid == False``, run expecting ``AssertionError`` + """ + prepare_state_for_header_processing(state) + + yield 'pre', state + yield 'block', block + + if not valid: + expect_assertion_error(lambda: process_block_header(state, block)) + yield 'post', None + return + + process_block_header(state, block) + yield 'post', state + + +@spec_state_test +def test_success_block_header(state): + block = build_empty_block_for_next_slot(state, signed=True) + yield from run_block_header_processing(state, block) + + +@always_bls +@spec_state_test +def test_invalid_sig_block_header(state): + block = build_empty_block_for_next_slot(state) + yield from run_block_header_processing(state, block, valid=False) + + +@spec_state_test +def test_invalid_slot_block_header(state): + block = build_empty_block_for_next_slot(state) + block.slot = state.slot + 2 # invalid slot + sign_block(state, block) + + yield from run_block_header_processing(state, block, valid=False) + + +@spec_state_test +def test_invalid_previous_block_root(state): + block = build_empty_block_for_next_slot(state) + block.previous_block_root = b'\12' * 32 # invalid prev root + sign_block(state, block) + + yield from run_block_header_processing(state, block, valid=False) + + +@spec_state_test +def test_proposer_slashed(state): + # use stub state to get proposer index of next slot + stub_state = deepcopy(state) + next_slot(stub_state) + proposer_index = get_beacon_proposer_index(stub_state) + + # set proposer to slashed + state.validator_registry[proposer_index].slashed = True + + block = build_empty_block_for_next_slot(state, signed=True) + + yield from run_block_header_processing(state, block, valid=False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py new file mode 100644 index 0000000000..336af3bf73 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py @@ -0,0 +1,124 @@ +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import process_deposit +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.deposits import prepare_state_and_deposit, sign_deposit_data +from eth2spec.test.helpers.state import get_balance +from eth2spec.test.helpers.keys import privkeys + + +def run_deposit_processing(state, deposit, validator_index, valid=True, effective=True): + """ + Run ``process_deposit``, yielding: + - pre-state ('pre') + - deposit ('deposit') + - post-state ('post'). + If ``valid == False``, run expecting ``AssertionError`` + """ + pre_validator_count = len(state.validator_registry) + pre_balance = 0 + if validator_index < pre_validator_count: + pre_balance = get_balance(state, validator_index) + else: + # if it is a new validator, it should be right at the end of the current registry. 
+ assert validator_index == pre_validator_count + + yield 'pre', state + yield 'deposit', deposit + + if not valid: + expect_assertion_error(lambda: process_deposit(state, deposit)) + yield 'post', None + return + + process_deposit(state, deposit) + + yield 'post', state + + if not effective: + assert len(state.validator_registry) == pre_validator_count + assert len(state.balances) == pre_validator_count + if validator_index < pre_validator_count: + assert get_balance(state, validator_index) == pre_balance + else: + if validator_index < pre_validator_count: + # top-up + assert len(state.validator_registry) == pre_validator_count + assert len(state.balances) == pre_validator_count + else: + # new validator + assert len(state.validator_registry) == pre_validator_count + 1 + assert len(state.balances) == pre_validator_count + 1 + assert get_balance(state, validator_index) == pre_balance + deposit.data.amount + + assert state.deposit_index == state.latest_eth1_data.deposit_count + + +@spec_state_test +def test_new_deposit(state): + # fresh deposit = next validator index = validator appended to registry + validator_index = len(state.validator_registry) + amount = spec.MAX_EFFECTIVE_BALANCE + deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True) + + yield from run_deposit_processing(state, deposit, validator_index) + + +@always_bls +@spec_state_test +def test_invalid_sig_new_deposit(state): + # fresh deposit = next validator index = validator appended to registry + validator_index = len(state.validator_registry) + amount = spec.MAX_EFFECTIVE_BALANCE + deposit = prepare_state_and_deposit(state, validator_index, amount) + yield from run_deposit_processing(state, deposit, validator_index, valid=True, effective=False) + + +@spec_state_test +def test_success_top_up(state): + validator_index = 0 + amount = spec.MAX_EFFECTIVE_BALANCE // 4 + deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True) + + yield from run_deposit_processing(state, deposit, validator_index) + + +@always_bls +@spec_state_test +def test_invalid_sig_top_up(state): + validator_index = 0 + amount = spec.MAX_EFFECTIVE_BALANCE // 4 + deposit = prepare_state_and_deposit(state, validator_index, amount) + + # invalid signatures, in top-ups, are allowed! 
+ yield from run_deposit_processing(state, deposit, validator_index, valid=True, effective=True) + + +@spec_state_test +def test_wrong_index(state): + validator_index = len(state.validator_registry) + amount = spec.MAX_EFFECTIVE_BALANCE + deposit = prepare_state_and_deposit(state, validator_index, amount) + + # mess up deposit_index + deposit.index = state.deposit_index + 1 + + sign_deposit_data(state, deposit.data, privkeys[validator_index]) + + yield from run_deposit_processing(state, deposit, validator_index, valid=False) + + +# TODO: test invalid signature + + +@spec_state_test +def test_bad_merkle_proof(state): + validator_index = len(state.validator_registry) + amount = spec.MAX_EFFECTIVE_BALANCE + deposit = prepare_state_and_deposit(state, validator_index, amount) + + # mess up merkle branch + deposit.proof[-1] = spec.ZERO_HASH + + sign_deposit_data(state, deposit.data, privkeys[validator_index]) + + yield from run_deposit_processing(state, deposit, validator_index, valid=False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py new file mode 100644 index 0000000000..07ccc25f1c --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py @@ -0,0 +1,137 @@ +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + get_current_epoch, + process_proposer_slashing, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.block_header import sign_block_header +from eth2spec.test.helpers.keys import privkeys +from eth2spec.test.helpers.proposer_slashings import get_valid_proposer_slashing +from eth2spec.test.helpers.state import get_balance + + +def run_proposer_slashing_processing(state, proposer_slashing, valid=True): + """ + Run ``process_proposer_slashing``, yielding: + - pre-state ('pre') + - proposer_slashing ('proposer_slashing') + - post-state ('post'). 
+ If ``valid == False``, run expecting ``AssertionError`` + """ + + yield 'pre', state + yield 'proposer_slashing', proposer_slashing + + if not valid: + expect_assertion_error(lambda: process_proposer_slashing(state, proposer_slashing)) + yield 'post', None + return + + pre_proposer_balance = get_balance(state, proposer_slashing.proposer_index) + + process_proposer_slashing(state, proposer_slashing) + yield 'post', state + + # check if slashed + slashed_validator = state.validator_registry[proposer_slashing.proposer_index] + assert slashed_validator.slashed + assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH + assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH + + # lost whistleblower reward + assert ( + get_balance(state, proposer_slashing.proposer_index) < + pre_proposer_balance + ) + + +@spec_state_test +def test_success(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + + yield from run_proposer_slashing_processing(state, proposer_slashing) + + +@always_bls +@spec_state_test +def test_invalid_sig_1(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=False, signed_2=True) + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@always_bls +@spec_state_test +def test_invalid_sig_2(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False) + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@always_bls +@spec_state_test +def test_invalid_sig_1_and_2(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=False, signed_2=False) + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@spec_state_test +def test_invalid_proposer_index(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + # Index just too high (by 1) + proposer_slashing.proposer_index = len(state.validator_registry) + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@spec_state_test +def test_epochs_are_different(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False) + + # set slots to be in different epochs + proposer_slashing.header_2.slot += spec.SLOTS_PER_EPOCH + sign_block_header(state, proposer_slashing.header_2, privkeys[proposer_slashing.proposer_index]) + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@spec_state_test +def test_headers_are_same(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False) + + # set headers to be the same + proposer_slashing.header_2 = proposer_slashing.header_1 + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@spec_state_test +def test_proposer_is_not_activated(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + + # set proposer to be not active yet + state.validator_registry[proposer_slashing.proposer_index].activation_epoch = get_current_epoch(state) + 1 + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + +@spec_state_test +def test_proposer_is_slashed(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + + # set proposer to slashed + state.validator_registry[proposer_slashing.proposer_index].slashed = True + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) + + 
+@spec_state_test +def test_proposer_is_withdrawn(state): + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + + # move 1 epoch into future, to allow for past withdrawable epoch + state.slot += spec.SLOTS_PER_EPOCH + # set proposer withdrawable_epoch in past + current_epoch = get_current_epoch(state) + proposer_index = proposer_slashing.proposer_index + state.validator_registry[proposer_index].withdrawable_epoch = current_epoch - 1 + + yield from run_proposer_slashing_processing(state, proposer_slashing, False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py new file mode 100644 index 0000000000..83af755743 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py @@ -0,0 +1,172 @@ +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + get_active_validator_indices, + get_beacon_proposer_index, + get_current_epoch, + process_transfer, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.state import next_epoch +from eth2spec.test.helpers.block import apply_empty_block +from eth2spec.test.helpers.transfers import get_valid_transfer + + +def run_transfer_processing(state, transfer, valid=True): + """ + Run ``process_transfer``, yielding: + - pre-state ('pre') + - transfer ('transfer') + - post-state ('post'). + If ``valid == False``, run expecting ``AssertionError`` + """ + + proposer_index = get_beacon_proposer_index(state) + pre_transfer_sender_balance = state.balances[transfer.sender] + pre_transfer_recipient_balance = state.balances[transfer.recipient] + pre_transfer_proposer_balance = state.balances[proposer_index] + + yield 'pre', state + yield 'transfer', transfer + + if not valid: + expect_assertion_error(lambda: process_transfer(state, transfer)) + yield 'post', None + return + + process_transfer(state, transfer) + yield 'post', state + + sender_balance = state.balances[transfer.sender] + recipient_balance = state.balances[transfer.recipient] + assert sender_balance == pre_transfer_sender_balance - transfer.amount - transfer.fee + assert recipient_balance == pre_transfer_recipient_balance + transfer.amount + assert state.balances[proposer_index] == pre_transfer_proposer_balance + transfer.fee + + +@spec_state_test +def test_success_non_activated(state): + transfer = get_valid_transfer(state, signed=True) + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer) + + +@spec_state_test +def test_success_withdrawable(state): + next_epoch(state) + apply_empty_block(state) + + transfer = get_valid_transfer(state, signed=True) + + # withdrawable_epoch in past so can transfer + state.validator_registry[transfer.sender].withdrawable_epoch = get_current_epoch(state) - 1 + + yield from run_transfer_processing(state, transfer) + + +@spec_state_test +def test_success_active_above_max_effective(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1 + transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True) + + yield from run_transfer_processing(state, transfer) + + +@spec_state_test +def test_success_active_above_max_effective_fee(state): + sender_index = 
get_active_validator_indices(state, get_current_epoch(state))[-1] + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1 + transfer = get_valid_transfer(state, sender_index=sender_index, amount=0, fee=1, signed=True) + + yield from run_transfer_processing(state, transfer) + + +@always_bls +@spec_state_test +def test_invalid_signature(state): + transfer = get_valid_transfer(state) + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_active_but_transfer_past_effective_balance(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + amount = spec.MAX_EFFECTIVE_BALANCE // 32 + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0, signed=True) + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_incorrect_slot(state): + transfer = get_valid_transfer(state, slot=state.slot + 1, signed=True) + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_insufficient_balance_for_fee(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + transfer = get_valid_transfer(state, sender_index=sender_index, amount=0, fee=1, signed=True) + + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_insufficient_balance(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True) + + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_no_dust_sender(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + balance = state.balances[sender_index] + transfer = get_valid_transfer(state, sender_index=sender_index, amount=balance - spec.MIN_DEPOSIT_AMOUNT + 1, fee=0, signed=True) + + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_no_dust_recipient(state): + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1 + transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True) + state.balances[transfer.recipient] = 0 + + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) + + +@spec_state_test +def test_invalid_pubkey(state): + transfer = get_valid_transfer(state, signed=True) + state.validator_registry[transfer.sender].withdrawal_credentials = 
spec.ZERO_HASH + + # un-activate so validator can transfer + state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH + + yield from run_transfer_processing(state, transfer, False) diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py new file mode 100644 index 0000000000..53fb4e3f7c --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py @@ -0,0 +1,225 @@ +import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + get_active_validator_indices, + get_churn_limit, + get_current_epoch, + process_voluntary_exit, +) +from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls +from eth2spec.test.helpers.keys import pubkey_to_privkey +from eth2spec.test.helpers.voluntary_exits import build_voluntary_exit, sign_voluntary_exit + + +def run_voluntary_exit_processing(state, voluntary_exit, valid=True): + """ + Run ``process_voluntary_exit``, yielding: + - pre-state ('pre') + - voluntary_exit ('voluntary_exit') + - post-state ('post'). + If ``valid == False``, run expecting ``AssertionError`` + """ + validator_index = voluntary_exit.validator_index + + yield 'pre', state + yield 'voluntary_exit', voluntary_exit + + if not valid: + expect_assertion_error(lambda: process_voluntary_exit(state, voluntary_exit)) + yield 'post', None + return + + pre_exit_epoch = state.validator_registry[validator_index].exit_epoch + + process_voluntary_exit(state, voluntary_exit) + + yield 'post', state + + assert pre_exit_epoch == spec.FAR_FUTURE_EPOCH + assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH + + +@spec_state_test +def test_success(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + voluntary_exit = build_voluntary_exit(state, current_epoch, validator_index, privkey, signed=True) + + yield from run_voluntary_exit_processing(state, voluntary_exit) + + +@always_bls +@spec_state_test +def test_invalid_signature(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + voluntary_exit = build_voluntary_exit(state, current_epoch, validator_index, privkey) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) + + +@spec_state_test +def test_success_exit_queue(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + + # exit `MAX_EXITS_PER_EPOCH` + initial_indices = get_active_validator_indices(state, current_epoch)[:get_churn_limit(state)] + + # Prepare a bunch of exits, based on the current state + exit_queue = [] + for index in initial_indices: + privkey = pubkey_to_privkey[state.validator_registry[index].pubkey] + exit_queue.append(build_voluntary_exit( + state, + current_epoch, + index, + 
privkey, + signed=True, + )) + + # Now run all the exits + for voluntary_exit in exit_queue: + # the function yields data, but we are just interested in running it here, ignore yields. + for _ in run_voluntary_exit_processing(state, voluntary_exit): + continue + + # exit an additional validator + validator_index = get_active_validator_indices(state, current_epoch)[-1] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + validator_index, + privkey, + signed=True, + ) + + # This is the interesting part of the test: on a pre-state with a full exit queue, + # when processing an additional exit, it results in an exit in a later epoch + yield from run_voluntary_exit_processing(state, voluntary_exit) + + assert ( + state.validator_registry[validator_index].exit_epoch == + state.validator_registry[initial_indices[0]].exit_epoch + 1 + ) + + +@spec_state_test +def test_validator_exit_in_future(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + validator_index, + privkey, + signed=False, + ) + voluntary_exit.epoch += 1 + sign_voluntary_exit(state, voluntary_exit, privkey) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) + + +@spec_state_test +def test_validator_invalid_validator_index(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + validator_index, + privkey, + signed=False, + ) + voluntary_exit.validator_index = len(state.validator_registry) + sign_voluntary_exit(state, voluntary_exit, privkey) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) + + +@spec_state_test +def test_validator_not_active(state): + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + state.validator_registry[validator_index].activation_epoch = spec.FAR_FUTURE_EPOCH + + # build and test voluntary exit + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + validator_index, + privkey, + signed=True, + ) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) + + +@spec_state_test +def test_validator_already_exited(state): + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow validator able to exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + # but validator already has exited + state.validator_registry[validator_index].exit_epoch = current_epoch + 2 + + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + 
validator_index, + privkey, + signed=True, + ) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) + + +@spec_state_test +def test_validator_not_active_long_enough(state): + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[0] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + + voluntary_exit = build_voluntary_exit( + state, + current_epoch, + validator_index, + privkey, + signed=True, + ) + + assert ( + current_epoch - state.validator_registry[validator_index].activation_epoch < + spec.PERSISTENT_COMMITTEE_PERIOD + ) + + yield from run_voluntary_exit_processing(state, voluntary_exit, False) diff --git a/test_libs/pyspec/eth2spec/test/conftest.py b/test_libs/pyspec/eth2spec/test/conftest.py new file mode 100644 index 0000000000..dadb0d5d06 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/conftest.py @@ -0,0 +1,36 @@ +from eth2spec.phase0 import spec + +# We import pytest only when it's present, i.e. when we are running tests. +# The test-cases themselves can be generated without installing pytest. + +def module_exists(module_name): + try: + __import__(module_name) + except ImportError: + return False + else: + return True + + +def fixture(*args, **kwargs): + if module_exists("pytest"): + import pytest + return pytest.fixture(*args, **kwargs) + else: + def ignore(): + pass + return ignore + + +def pytest_addoption(parser): + parser.addoption( + "--config", action="store", default="minimal", help="config: make the pyspec use the specified configuration" + ) + + +@fixture(autouse=True) +def config(request): + config_name = request.config.getoption("--config") + from preset_loader import loader + presets = loader.load_presets('../../configs/', config_name) + spec.apply_constants_preset(presets) diff --git a/test_libs/pyspec/eth2spec/test/context.py b/test_libs/pyspec/eth2spec/test/context.py new file mode 100644 index 0000000000..2be9322de2 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/context.py @@ -0,0 +1,82 @@ +from eth2spec.phase0 import spec +from eth2spec.utils import bls + +from .helpers.genesis import create_genesis_state + +from .utils import spectest, with_args, with_tags + +# Provides a genesis state as first argument to the function decorated with this +with_state = with_args(lambda: [create_genesis_state(spec.SLOTS_PER_EPOCH * 8)]) + + +# BLS is turned off by default *for performance purposes during TESTING*. +# The runner of the test can indicate the preferred setting (test generators prefer BLS to be ON). +# - Some tests are marked as BLS-requiring, and ignore this setting. +# (tests that express differences caused by BLS, e.g. invalid signatures being rejected) +# - Some other tests are marked as BLS-ignoring, and ignore this setting. +# (tests that are heavily performance impacted / require unsigned state transitions) +# - Most tests respect the BLS setting. +DEFAULT_BLS_ACTIVE = False + + +# shorthand for decorating @with_state @spectest() +def spec_state_test(fn): + return with_state(bls_switch(spectest()(fn))) + + +def expect_assertion_error(fn): + bad = False + try: + fn() + bad = True + except AssertionError: + pass + except IndexError: + # Index errors are special; the spec is not explicit on bound checking, an IndexError is like a failed assert. + pass + if bad: + raise AssertionError('expected an assertion error, but got none.') + + +# Tags a test to be ignoring BLS for it to pass. 
+bls_ignored = with_tags({'bls_setting': 2}) + + +def never_bls(fn): + """ + Decorator to apply on ``bls_switch`` decorator to force BLS de-activation. Useful to mark tests as BLS-ignorant. + """ + def entry(*args, **kw): + # override bls setting + kw['bls_active'] = False + return fn(*args, **kw) + return bls_ignored(entry) + + +# Tags a test to be requiring BLS for it to pass. +bls_required = with_tags({'bls_setting': 1}) + + +def always_bls(fn): + """ + Decorator to apply on ``bls_switch`` decorator to force BLS activation. Useful to mark tests as BLS-dependent. + """ + def entry(*args, **kw): + # override bls setting + kw['bls_active'] = True + return fn(*args, **kw) + return bls_required(entry) + + +def bls_switch(fn): + """ + Decorator to make a function execute with BLS ON, or BLS off. + Based on an optional bool argument ``bls_active``, passed to the function at runtime. + """ + def entry(*args, **kw): + old_state = bls.bls_active + bls.bls_active = kw.pop('bls_active', DEFAULT_BLS_ACTIVE) + out = fn(*args, **kw) + bls.bls_active = old_state + return out + return entry diff --git a/test_libs/pyspec/eth2spec/test/epoch_processing/__init__.py b/test_libs/pyspec/eth2spec/test/epoch_processing/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py similarity index 62% rename from test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py rename to test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py index d6765e3a72..cfbcd18834 100644 --- a/test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py +++ b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py @@ -1,116 +1,131 @@ from copy import deepcopy -import pytest import eth2spec.phase0.spec as spec - -from eth2spec.phase0.state_transition import ( - state_transition, -) from eth2spec.phase0.spec import ( cache_state, get_crosslink_deltas, process_crosslinks, ) -from tests.helpers import ( +from eth2spec.phase0.state_transition import ( + state_transition, +) +from eth2spec.test.context import spec_state_test +from eth2spec.test.helpers.state import ( + next_epoch, + next_slot +) +from eth2spec.test.helpers.block import apply_empty_block, sign_block +from eth2spec.test.helpers.attestations import ( add_attestation_to_state, build_empty_block_for_next_slot, fill_aggregate_attestation, get_crosslink_committee, get_valid_attestation, - next_epoch, - next_slot, - set_bitfield_bit, + sign_attestation, ) -# mark entire file as 'crosslinks' -pytestmark = pytest.mark.crosslinks - - def run_process_crosslinks(state, valid=True): + """ + Run ``process_crosslinks``, yielding: + - pre-state ('pre') + - post-state ('post'). 
+ If ``valid == False``, run expecting ``AssertionError`` + """ # transition state to slot before state transition slot = state.slot + (spec.SLOTS_PER_EPOCH - state.slot % spec.SLOTS_PER_EPOCH) - 1 block = build_empty_block_for_next_slot(state) block.slot = slot + sign_block(state, block) state_transition(state, block) # cache state before epoch transition cache_state(state) - post_state = deepcopy(state) - process_crosslinks(post_state) - - return state, post_state + yield 'pre', state + process_crosslinks(state) + yield 'post', state +@spec_state_test def test_no_attestations(state): - pre_state, post_state = run_process_crosslinks(state) + yield from run_process_crosslinks(state) for shard in range(spec.SHARD_COUNT): - assert post_state.previous_crosslinks[shard] == post_state.current_crosslinks[shard] - - return pre_state, post_state + assert state.previous_crosslinks[shard] == state.current_crosslinks[shard] +@spec_state_test def test_single_crosslink_update_from_current_epoch(state): next_epoch(state) - attestation = get_valid_attestation(state) + attestation = get_valid_attestation(state, signed=True) fill_aggregate_attestation(state, attestation) add_attestation_to_state(state, attestation, state.slot + spec.MIN_ATTESTATION_INCLUSION_DELAY) assert len(state.current_epoch_attestations) == 1 - pre_state, post_state = run_process_crosslinks(state) - shard = attestation.data.shard - assert post_state.previous_crosslinks[shard] != post_state.current_crosslinks[shard] - assert pre_state.current_crosslinks[shard] != post_state.current_crosslinks[shard] + pre_crosslink = deepcopy(state.current_crosslinks[shard]) - return pre_state, post_state + yield from run_process_crosslinks(state) + assert state.previous_crosslinks[shard] != state.current_crosslinks[shard] + assert pre_crosslink != state.current_crosslinks[shard] + +@spec_state_test def test_single_crosslink_update_from_previous_epoch(state): next_epoch(state) - attestation = get_valid_attestation(state) + attestation = get_valid_attestation(state, signed=True) fill_aggregate_attestation(state, attestation) add_attestation_to_state(state, attestation, state.slot + spec.SLOTS_PER_EPOCH) assert len(state.previous_epoch_attestations) == 1 - pre_state, post_state = run_process_crosslinks(state) + shard = attestation.data.shard + pre_crosslink = deepcopy(state.current_crosslinks[shard]) + crosslink_deltas = get_crosslink_deltas(state) - shard = attestation.data.shard - assert post_state.previous_crosslinks[shard] != post_state.current_crosslinks[shard] - assert pre_state.current_crosslinks[shard] != post_state.current_crosslinks[shard] + yield from run_process_crosslinks(state) + + assert state.previous_crosslinks[shard] != state.current_crosslinks[shard] + assert pre_crosslink != state.current_crosslinks[shard] + # ensure rewarded for index in get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard): assert crosslink_deltas[0][index] > 0 assert crosslink_deltas[1][index] == 0 - return pre_state, post_state - +@spec_state_test def test_double_late_crosslink(state): + if spec.get_epoch_committee_count(state, spec.get_current_epoch(state)) < spec.SHARD_COUNT: + print("warning: ignoring test, test-assumptions are incompatible with configuration") + return + next_epoch(state) state.slot += 4 - attestation_1 = get_valid_attestation(state) + attestation_1 = get_valid_attestation(state, signed=True) fill_aggregate_attestation(state, attestation_1) - # add attestation_1 in the next epoch + # add attestation_1 to next 
epoch next_epoch(state) add_attestation_to_state(state, attestation_1, state.slot + 1) for slot in range(spec.SLOTS_PER_EPOCH): attestation_2 = get_valid_attestation(state) if attestation_2.data.shard == attestation_1.data.shard: + sign_attestation(state, attestation_2) break next_slot(state) + apply_empty_block(state) + fill_aggregate_attestation(state, attestation_2) # add attestation_2 in the next epoch after attestation_1 has @@ -121,16 +136,15 @@ def test_double_late_crosslink(state): assert len(state.previous_epoch_attestations) == 1 assert len(state.current_epoch_attestations) == 0 - pre_state, post_state = run_process_crosslinks(state) crosslink_deltas = get_crosslink_deltas(state) + yield from run_process_crosslinks(state) + shard = attestation_2.data.shard # ensure that the current crosslinks were not updated by the second attestation - assert post_state.previous_crosslinks[shard] == post_state.current_crosslinks[shard] + assert state.previous_crosslinks[shard] == state.current_crosslinks[shard] # ensure no reward, only penalties for the failed crosslink for index in get_crosslink_committee(state, attestation_2.data.target_epoch, attestation_2.data.shard): assert crosslink_deltas[0][index] == 0 assert crosslink_deltas[1][index] > 0 - - return pre_state, post_state diff --git a/test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py similarity index 53% rename from test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py rename to test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py index 11f5de2ad4..71bf89c702 100644 --- a/test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py +++ b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py @@ -1,21 +1,44 @@ -from copy import deepcopy - -import pytest - import eth2spec.phase0.spec as spec from eth2spec.phase0.spec import ( get_current_epoch, is_active_validator, + process_registry_updates ) -from tests.helpers import ( - next_epoch, -) - -# mark entire file as 'state' -pytestmark = pytest.mark.state - - +from eth2spec.phase0.state_transition import state_transition +from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block +from eth2spec.test.helpers.state import next_epoch +from eth2spec.test.context import spec_state_test + + +def run_process_registry_updates(state, valid=True): + """ + Run ``process_crosslinks``, yielding: + - pre-state ('pre') + - post-state ('post'). 
+ If ``valid == False``, run expecting ``AssertionError`` + """ + # transition state to slot before state transition + slot = state.slot + (spec.SLOTS_PER_EPOCH - state.slot % spec.SLOTS_PER_EPOCH) - 1 + block = build_empty_block_for_next_slot(state) + block.slot = slot + sign_block(state, block) + state_transition(state, block) + + # cache state before epoch transition + spec.cache_state(state) + + # process components of epoch transition before registry update + spec.process_justification_and_finalization(state) + spec.process_crosslinks(state) + spec.process_rewards_and_penalties(state) + + yield 'pre', state + process_registry_updates(state) + yield 'post', state + + +@spec_state_test def test_activation(state): index = 0 assert is_active_validator(state.validator_registry[index], get_current_epoch(state)) @@ -26,12 +49,10 @@ def test_activation(state): state.validator_registry[index].effective_balance = spec.MAX_EFFECTIVE_BALANCE assert not is_active_validator(state.validator_registry[index], get_current_epoch(state)) - pre_state = deepcopy(state) - - blocks = [] for _ in range(spec.ACTIVATION_EXIT_DELAY + 1): - block = next_epoch(state) - blocks.append(block) + next_epoch(state) + + yield from run_process_registry_updates(state) assert state.validator_registry[index].activation_eligibility_epoch != spec.FAR_FUTURE_EPOCH assert state.validator_registry[index].activation_epoch != spec.FAR_FUTURE_EPOCH @@ -40,9 +61,8 @@ def test_activation(state): get_current_epoch(state), ) - return pre_state, blocks, state - +@spec_state_test def test_ejection(state): index = 0 assert is_active_validator(state.validator_registry[index], get_current_epoch(state)) @@ -51,17 +71,13 @@ def test_ejection(state): # Mock an ejection state.validator_registry[index].effective_balance = spec.EJECTION_BALANCE - pre_state = deepcopy(state) - - blocks = [] for _ in range(spec.ACTIVATION_EXIT_DELAY + 1): - block = next_epoch(state) - blocks.append(block) + next_epoch(state) + + yield from run_process_registry_updates(state) assert state.validator_registry[index].exit_epoch != spec.FAR_FUTURE_EPOCH assert not is_active_validator( state.validator_registry[index], get_current_epoch(state), ) - - return pre_state, blocks, state diff --git a/test_libs/pyspec/eth2spec/test/helpers/__init__.py b/test_libs/pyspec/eth2spec/test/helpers/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/test_libs/pyspec/eth2spec/test/helpers/attestations.py b/test_libs/pyspec/eth2spec/test/helpers/attestations.py new file mode 100644 index 0000000000..b541e610f4 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/attestations.py @@ -0,0 +1,146 @@ +from typing import List + +# Access constants from spec pkg reference. 
+import eth2spec.phase0.spec as spec +from eth2spec.phase0.spec import ( + Attestation, + AttestationData, + AttestationDataAndCustodyBit, + get_epoch_start_slot, get_block_root, get_current_epoch, get_previous_epoch, slot_to_epoch, + get_crosslink_committee, get_domain, IndexedAttestation, get_attesting_indices, BeaconState, get_block_root_at_slot, + get_epoch_start_shard, get_epoch_committee_count) +from eth2spec.phase0.state_transition import ( + state_transition, state_transition_to +) +from eth2spec.test.helpers.bitfields import set_bitfield_bit +from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block +from eth2spec.test.helpers.keys import privkeys +from eth2spec.utils.bls import bls_sign, bls_aggregate_signatures +from eth2spec.utils.minimal_ssz import hash_tree_root + + +def build_attestation_data(state, slot, shard): + assert state.slot >= slot + + if slot == state.slot: + block_root = build_empty_block_for_next_slot(state).previous_block_root + else: + block_root = get_block_root_at_slot(state, slot) + + current_epoch_start_slot = get_epoch_start_slot(get_current_epoch(state)) + if slot < current_epoch_start_slot: + epoch_boundary_root = get_block_root(state, get_previous_epoch(state)) + elif slot == current_epoch_start_slot: + epoch_boundary_root = block_root + else: + epoch_boundary_root = get_block_root(state, get_current_epoch(state)) + + if slot < current_epoch_start_slot: + justified_epoch = state.previous_justified_epoch + justified_block_root = state.previous_justified_root + else: + justified_epoch = state.current_justified_epoch + justified_block_root = state.current_justified_root + + crosslinks = state.current_crosslinks if slot_to_epoch(slot) == get_current_epoch( + state) else state.previous_crosslinks + return AttestationData( + shard=shard, + beacon_block_root=block_root, + source_epoch=justified_epoch, + source_root=justified_block_root, + target_epoch=slot_to_epoch(slot), + target_root=epoch_boundary_root, + crosslink_data_root=spec.ZERO_HASH, + previous_crosslink_root=hash_tree_root(crosslinks[shard]), + ) + + +def get_valid_attestation(state, slot=None, signed=False): + if slot is None: + slot = state.slot + + epoch = slot_to_epoch(slot) + epoch_start_shard = get_epoch_start_shard(state, epoch) + committees_per_slot = get_epoch_committee_count(state, epoch) // spec.SLOTS_PER_EPOCH + shard = (epoch_start_shard + committees_per_slot * (slot % spec.SLOTS_PER_EPOCH)) % spec.SHARD_COUNT + + attestation_data = build_attestation_data(state, slot, shard) + + crosslink_committee = get_crosslink_committee(state, attestation_data.target_epoch, attestation_data.shard) + + committee_size = len(crosslink_committee) + bitfield_length = (committee_size + 7) // 8 + aggregation_bitfield = b'\x00' * bitfield_length + custody_bitfield = b'\x00' * bitfield_length + attestation = Attestation( + aggregation_bitfield=aggregation_bitfield, + data=attestation_data, + custody_bitfield=custody_bitfield, + ) + fill_aggregate_attestation(state, attestation) + if signed: + sign_attestation(state, attestation) + return attestation + + +def sign_aggregate_attestation(state: BeaconState, data: AttestationData, participants: List[int]): + signatures = [] + for validator_index in participants: + privkey = privkeys[validator_index] + signatures.append( + get_attestation_signature( + state, + data, + privkey + ) + ) + + return bls_aggregate_signatures(signatures) + + +def sign_indexed_attestation(state, indexed_attestation: IndexedAttestation): + participants = 
indexed_attestation.custody_bit_0_indices + indexed_attestation.custody_bit_1_indices + indexed_attestation.signature = sign_aggregate_attestation(state, indexed_attestation.data, participants) + + +def sign_attestation(state, attestation: Attestation): + participants = get_attesting_indices( + state, + attestation.data, + attestation.aggregation_bitfield, + ) + + attestation.signature = sign_aggregate_attestation(state, attestation.data, participants) + + +def get_attestation_signature(state, attestation_data, privkey, custody_bit=0b0): + message_hash = AttestationDataAndCustodyBit( + data=attestation_data, + custody_bit=custody_bit, + ).hash_tree_root() + + return bls_sign( + message_hash=message_hash, + privkey=privkey, + domain=get_domain( + state=state, + domain_type=spec.DOMAIN_ATTESTATION, + message_epoch=attestation_data.target_epoch, + ) + ) + + +def fill_aggregate_attestation(state, attestation): + crosslink_committee = get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard) + for i in range(len(crosslink_committee)): + attestation.aggregation_bitfield = set_bitfield_bit(attestation.aggregation_bitfield, i) + + +def add_attestation_to_state(state, attestation, slot): + block = build_empty_block_for_next_slot(state) + block.slot = slot + block.body.attestations.append(attestation) + state_transition_to(state, block.slot) + sign_block(state, block) + state_transition(state, block) diff --git a/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py b/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py new file mode 100644 index 0000000000..d19b41dfec --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py @@ -0,0 +1,19 @@ +from copy import deepcopy + +from eth2spec.phase0.spec import AttesterSlashing, convert_to_indexed +from eth2spec.test.helpers.attestations import get_valid_attestation, sign_attestation + + +def get_valid_attester_slashing(state, signed_1=False, signed_2=False): + attestation_1 = get_valid_attestation(state, signed=signed_1) + + attestation_2 = deepcopy(attestation_1) + attestation_2.data.target_root = b'\x01' * 32 + + if signed_2: + sign_attestation(state, attestation_2) + + return AttesterSlashing( + attestation_1=convert_to_indexed(state, attestation_1), + attestation_2=convert_to_indexed(state, attestation_2), + ) diff --git a/test_libs/pyspec/eth2spec/test/helpers/bitfields.py b/test_libs/pyspec/eth2spec/test/helpers/bitfields.py new file mode 100644 index 0000000000..7c25d073ab --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/bitfields.py @@ -0,0 +1,11 @@ +def set_bitfield_bit(bitfield, i): + """ + Set the bit in ``bitfield`` at position ``i`` to ``1``. 
+ """ + byte_index = i // 8 + bit_index = i % 8 + return ( + bitfield[:byte_index] + + bytes([bitfield[byte_index] | (1 << bit_index)]) + + bitfield[byte_index + 1:] + ) diff --git a/test_libs/pyspec/eth2spec/test/helpers/block.py b/test_libs/pyspec/eth2spec/test/helpers/block.py new file mode 100644 index 0000000000..81c5e9ef5b --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/block.py @@ -0,0 +1,77 @@ +from copy import deepcopy + +from eth2spec.phase0 import spec +from eth2spec.phase0.spec import get_beacon_proposer_index, slot_to_epoch, get_domain, BeaconBlock +from eth2spec.phase0.state_transition import state_transition, state_transition_to +from eth2spec.test.helpers.keys import privkeys +from eth2spec.utils.bls import bls_sign, only_with_bls +from eth2spec.utils.minimal_ssz import signing_root, hash_tree_root + + +# Fully ignore the function if BLS is off, beacon-proposer index calculation is slow. +@only_with_bls() +def sign_block(state, block, proposer_index=None): + assert state.slot <= block.slot + + if proposer_index is None: + if block.slot == state.slot: + proposer_index = get_beacon_proposer_index(state) + else: + if slot_to_epoch(state.slot) + 1 > slot_to_epoch(block.slot): + print("warning: block slot far away, and no proposer index manually given." + " Signing block is slow due to transition for proposer index calculation.") + # use stub state to get proposer index of future slot + stub_state = deepcopy(state) + state_transition_to(stub_state, block.slot) + proposer_index = get_beacon_proposer_index(stub_state) + + privkey = privkeys[proposer_index] + + block.body.randao_reveal = bls_sign( + privkey=privkey, + message_hash=hash_tree_root(slot_to_epoch(block.slot)), + domain=get_domain( + state, + message_epoch=slot_to_epoch(block.slot), + domain_type=spec.DOMAIN_RANDAO, + ) + ) + block.signature = bls_sign( + message_hash=signing_root(block), + privkey=privkey, + domain=get_domain( + state, + spec.DOMAIN_BEACON_PROPOSER, + slot_to_epoch(block.slot))) + + +def apply_empty_block(state): + """ + Transition via an empty block (on current slot, assuming no block has been applied yet). + :return: the empty block that triggered the transition. + """ + block = build_empty_block(state, signed=True) + state_transition(state, block) + return block + + +def build_empty_block(state, slot=None, signed=False): + if slot is None: + slot = state.slot + empty_block = BeaconBlock() + empty_block.slot = slot + empty_block.body.eth1_data.deposit_count = state.deposit_index + previous_block_header = deepcopy(state.latest_block_header) + if previous_block_header.state_root == spec.ZERO_HASH: + previous_block_header.state_root = state.hash_tree_root() + empty_block.previous_block_root = signing_root(previous_block_header) + + if signed: + sign_block(state, empty_block) + + return empty_block + + +def build_empty_block_for_next_slot(state, signed=False): + return build_empty_block(state, state.slot + 1, signed=signed) + diff --git a/test_libs/pyspec/eth2spec/test/helpers/block_header.py b/test_libs/pyspec/eth2spec/test/helpers/block_header.py new file mode 100644 index 0000000000..9aba62d37d --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/block_header.py @@ -0,0 +1,18 @@ +# Access constants from spec pkg reference. 
+import eth2spec.phase0.spec as spec + +from eth2spec.phase0.spec import get_domain +from eth2spec.utils.bls import bls_sign +from eth2spec.utils.minimal_ssz import signing_root + + +def sign_block_header(state, header, privkey): + domain = get_domain( + state=state, + domain_type=spec.DOMAIN_BEACON_PROPOSER, + ) + header.signature = bls_sign( + message_hash=signing_root(header), + privkey=privkey, + domain=domain, + ) diff --git a/test_libs/pyspec/eth2spec/test/helpers/deposits.py b/test_libs/pyspec/eth2spec/test/helpers/deposits.py new file mode 100644 index 0000000000..c5deb124e6 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/deposits.py @@ -0,0 +1,81 @@ +# Access constants from spec pkg reference. +import eth2spec.phase0.spec as spec + +from eth2spec.phase0.spec import get_domain, DepositData, verify_merkle_branch, Deposit, ZERO_HASH +from eth2spec.test.helpers.keys import pubkeys, privkeys +from eth2spec.utils.bls import bls_sign +from eth2spec.utils.merkle_minimal import calc_merkle_tree_from_leaves, get_merkle_root, get_merkle_proof +from eth2spec.utils.minimal_ssz import signing_root + + +def build_deposit_data(state, pubkey, privkey, amount, signed=False): + deposit_data = DepositData( + pubkey=pubkey, + # insecurely use pubkey as withdrawal key as well + withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(pubkey)[1:], + amount=amount, + ) + if signed: + sign_deposit_data(state, deposit_data, privkey) + return deposit_data + + +def sign_deposit_data(state, deposit_data, privkey): + signature = bls_sign( + message_hash=signing_root(deposit_data), + privkey=privkey, + domain=get_domain( + state, + spec.DOMAIN_DEPOSIT, + ) + ) + deposit_data.signature = signature + + +def build_deposit(state, + deposit_data_leaves, + pubkey, + privkey, + amount, + signed): + deposit_data = build_deposit_data(state, pubkey, privkey, amount, signed) + + item = deposit_data.hash_tree_root() + index = len(deposit_data_leaves) + deposit_data_leaves.append(item) + tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves)) + root = get_merkle_root((tuple(deposit_data_leaves))) + proof = list(get_merkle_proof(tree, item_index=index)) + assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root) + + deposit = Deposit( + proof=list(proof), + index=index, + data=deposit_data, + ) + + return deposit, root, deposit_data_leaves + + +def prepare_state_and_deposit(state, validator_index, amount, signed=False): + """ + Prepare the state for the deposit, and create a deposit for the given validator, depositing the given amount. + """ + pre_validator_count = len(state.validator_registry) + # fill previous deposits with zero-hash + deposit_data_leaves = [ZERO_HASH] * pre_validator_count + + pubkey = pubkeys[validator_index] + privkey = privkeys[validator_index] + deposit, root, deposit_data_leaves = build_deposit( + state, + deposit_data_leaves, + pubkey, + privkey, + amount, + signed + ) + + state.latest_eth1_data.deposit_root = root + state.latest_eth1_data.deposit_count = len(deposit_data_leaves) + return deposit diff --git a/test_libs/pyspec/eth2spec/test/helpers/genesis.py b/test_libs/pyspec/eth2spec/test/helpers/genesis.py new file mode 100644 index 0000000000..01011cacd0 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/genesis.py @@ -0,0 +1,51 @@ +# Access constants from spec pkg reference. 
+import eth2spec.phase0.spec as spec + +from eth2spec.phase0.spec import Eth1Data, ZERO_HASH, get_active_validator_indices +from eth2spec.test.helpers.keys import pubkeys +from eth2spec.utils.minimal_ssz import hash_tree_root + + +def build_mock_validator(i: int, balance: int): + pubkey = pubkeys[i] + # insecurely use pubkey as withdrawal key as well + withdrawal_credentials = spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(pubkey)[1:] + return spec.Validator( + pubkey=pubkeys[i], + withdrawal_credentials=withdrawal_credentials, + activation_eligibility_epoch=spec.FAR_FUTURE_EPOCH, + activation_epoch=spec.FAR_FUTURE_EPOCH, + exit_epoch=spec.FAR_FUTURE_EPOCH, + withdrawable_epoch=spec.FAR_FUTURE_EPOCH, + effective_balance=min(balance - balance % spec.EFFECTIVE_BALANCE_INCREMENT, spec.MAX_EFFECTIVE_BALANCE) + ) + + +def create_genesis_state(num_validators): + deposit_root = b'\x42' * 32 + + state = spec.BeaconState( + genesis_time=0, + deposit_index=num_validators, + latest_eth1_data=Eth1Data( + deposit_root=deposit_root, + deposit_count=num_validators, + block_hash=ZERO_HASH, + )) + + # We "hack" in the initial validators, + # as it is much faster than creating and processing genesis deposits for every single test case. + state.balances = [spec.MAX_EFFECTIVE_BALANCE] * num_validators + state.validator_registry = [build_mock_validator(i, state.balances[i]) for i in range(num_validators)] + + # Process genesis activations + for validator in state.validator_registry: + if validator.effective_balance >= spec.MAX_EFFECTIVE_BALANCE: + validator.activation_eligibility_epoch = spec.GENESIS_EPOCH + validator.activation_epoch = spec.GENESIS_EPOCH + + genesis_active_index_root = hash_tree_root(get_active_validator_indices(state, spec.GENESIS_EPOCH)) + for index in range(spec.LATEST_ACTIVE_INDEX_ROOTS_LENGTH): + state.latest_active_index_roots[index] = genesis_active_index_root + + return state diff --git a/test_libs/pyspec/eth2spec/test/helpers/keys.py b/test_libs/pyspec/eth2spec/test/helpers/keys.py new file mode 100644 index 0000000000..f47cd7c10b --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/keys.py @@ -0,0 +1,6 @@ +from py_ecc import bls +from eth2spec.phase0 import spec + +privkeys = [i + 1 for i in range(spec.SLOTS_PER_EPOCH * 16)] +pubkeys = [bls.privtopub(privkey) for privkey in privkeys] +pubkey_to_privkey = {pubkey: privkey for privkey, pubkey in zip(privkeys, pubkeys)} diff --git a/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py b/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py new file mode 100644 index 0000000000..dfb8895dc2 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py @@ -0,0 +1,35 @@ +from copy import deepcopy + +from eth2spec.phase0.spec import ( + get_current_epoch, get_active_validator_indices, BeaconBlockHeader, ProposerSlashing +) +from eth2spec.test.helpers.block_header import sign_block_header +from eth2spec.test.helpers.keys import pubkey_to_privkey + + +def get_valid_proposer_slashing(state, signed_1=False, signed_2=False): + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[-1] + privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] + slot = state.slot + + header_1 = BeaconBlockHeader( + slot=slot, + previous_block_root=b'\x33' * 32, + state_root=b'\x44' * 32, + block_body_root=b'\x55' * 32, + ) + header_2 = deepcopy(header_1) + header_2.previous_block_root = b'\x99' * 32 + header_2.slot = slot + 1 + + if signed_1: + 
sign_block_header(state, header_1, privkey) + if signed_2: + sign_block_header(state, header_2, privkey) + + return ProposerSlashing( + proposer_index=validator_index, + header_1=header_1, + header_2=header_2, + ) diff --git a/test_libs/pyspec/eth2spec/test/helpers/state.py b/test_libs/pyspec/eth2spec/test/helpers/state.py new file mode 100644 index 0000000000..e720a9709f --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/state.py @@ -0,0 +1,31 @@ +# Access constants from spec pkg reference. +import eth2spec.phase0.spec as spec + +from eth2spec.phase0.state_transition import state_transition_to + + +def get_balance(state, index): + return state.balances[index] + + +def next_slot(state): + """ + Transition to the next slot. + """ + state_transition_to(state, state.slot + 1) + + +def next_epoch(state): + """ + Transition to the start slot of the next epoch + """ + slot = state.slot + spec.SLOTS_PER_EPOCH - (state.slot % spec.SLOTS_PER_EPOCH) + state_transition_to(state, slot) + + +def get_state_root(state, slot) -> bytes: + """ + Return the state root at a recent ``slot``. + """ + assert slot < state.slot <= slot + spec.SLOTS_PER_HISTORICAL_ROOT + return state.latest_state_roots[slot % spec.SLOTS_PER_HISTORICAL_ROOT] diff --git a/test_libs/pyspec/eth2spec/test/helpers/transfers.py b/test_libs/pyspec/eth2spec/test/helpers/transfers.py new file mode 100644 index 0000000000..2045f48ad6 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/transfers.py @@ -0,0 +1,55 @@ +# Access constants from spec pkg reference. +import eth2spec.phase0.spec as spec + +from eth2spec.phase0.spec import get_current_epoch, get_active_validator_indices, Transfer, get_domain +from eth2spec.test.helpers.keys import pubkeys, privkeys +from eth2spec.test.helpers.state import get_balance +from eth2spec.utils.bls import bls_sign +from eth2spec.utils.minimal_ssz import signing_root + + +def get_valid_transfer(state, slot=None, sender_index=None, amount=None, fee=None, signed=False): + if slot is None: + slot = state.slot + current_epoch = get_current_epoch(state) + if sender_index is None: + sender_index = get_active_validator_indices(state, current_epoch)[-1] + recipient_index = get_active_validator_indices(state, current_epoch)[0] + transfer_pubkey = pubkeys[-1] + transfer_privkey = privkeys[-1] + + if fee is None: + fee = get_balance(state, sender_index) // 32 + if amount is None: + amount = get_balance(state, sender_index) - fee + + transfer = Transfer( + sender=sender_index, + recipient=recipient_index, + amount=amount, + fee=fee, + slot=slot, + pubkey=transfer_pubkey, + ) + if signed: + sign_transfer(state, transfer, transfer_privkey) + + # ensure withdrawal_credentials reproducible + state.validator_registry[transfer.sender].withdrawal_credentials = ( + spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(transfer.pubkey)[1:] + ) + + return transfer + + +def sign_transfer(state, transfer, privkey): + transfer.signature = bls_sign( + message_hash=signing_root(transfer), + privkey=privkey, + domain=get_domain( + state=state, + domain_type=spec.DOMAIN_TRANSFER, + message_epoch=get_current_epoch(state), + ) + ) + return transfer diff --git a/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py b/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py new file mode 100644 index 0000000000..54376d694b --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py @@ -0,0 +1,28 @@ +# Access constants from spec pkg reference. 
+import eth2spec.phase0.spec as spec + +from eth2spec.phase0.spec import VoluntaryExit, get_domain +from eth2spec.utils.bls import bls_sign +from eth2spec.utils.minimal_ssz import signing_root + + +def build_voluntary_exit(state, epoch, validator_index, privkey, signed=False): + voluntary_exit = VoluntaryExit( + epoch=epoch, + validator_index=validator_index, + ) + if signed: + sign_voluntary_exit(state, voluntary_exit, privkey) + return voluntary_exit + + +def sign_voluntary_exit(state, voluntary_exit, privkey): + voluntary_exit.signature = bls_sign( + message_hash=signing_root(voluntary_exit), + privkey=privkey, + domain=get_domain( + state=state, + domain_type=spec.DOMAIN_VOLUNTARY_EXIT, + message_epoch=voluntary_exit.epoch, + ) + ) diff --git a/test_libs/pyspec/eth2spec/test/sanity/__init__.py b/test_libs/pyspec/eth2spec/test/sanity/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py b/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py new file mode 100644 index 0000000000..c9aadbf2ac --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py @@ -0,0 +1,408 @@ +from copy import deepcopy + +import eth2spec.phase0.spec as spec +from eth2spec.utils.bls import bls_sign + +from eth2spec.utils.minimal_ssz import signing_root +from eth2spec.phase0.spec import ( + # SSZ + VoluntaryExit, + # functions + get_active_validator_indices, + get_beacon_proposer_index, + get_block_root_at_slot, + get_current_epoch, + get_domain, +) +from eth2spec.phase0.state_transition import ( + state_transition, +) +from eth2spec.test.helpers.state import get_balance +from eth2spec.test.helpers.transfers import get_valid_transfer +from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block +from eth2spec.test.helpers.keys import privkeys, pubkeys +from eth2spec.test.helpers.attester_slashings import get_valid_attester_slashing +from eth2spec.test.helpers.proposer_slashings import get_valid_proposer_slashing +from eth2spec.test.helpers.attestations import get_valid_attestation +from eth2spec.test.helpers.deposits import prepare_state_and_deposit + +from eth2spec.test.context import spec_state_test, never_bls + + +@never_bls +@spec_state_test +def test_empty_block_transition(state): + pre_slot = state.slot + pre_eth1_votes = len(state.eth1_data_votes) + + yield 'pre', state + + block = build_empty_block_for_next_slot(state, signed=True) + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + assert len(state.eth1_data_votes) == pre_eth1_votes + 1 + assert get_block_root_at_slot(state, pre_slot) == block.previous_block_root + + +@never_bls +@spec_state_test +def test_skipped_slots(state): + pre_slot = state.slot + yield 'pre', state + + block = build_empty_block_for_next_slot(state) + block.slot += 3 + sign_block(state, block) + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + assert state.slot == block.slot + for slot in range(pre_slot, state.slot): + assert get_block_root_at_slot(state, slot) == block.previous_block_root + + +@spec_state_test +def test_empty_epoch_transition(state): + pre_slot = state.slot + yield 'pre', state + + block = build_empty_block_for_next_slot(state) + block.slot += spec.SLOTS_PER_EPOCH + sign_block(state, block) + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + assert state.slot == block.slot + for slot in 
range(pre_slot, state.slot): + assert get_block_root_at_slot(state, slot) == block.previous_block_root + + +# @spec_state_test +# def test_empty_epoch_transition_not_finalizing(state): +# # copy for later balance lookups. +# pre_state = deepcopy(state) +# yield 'pre', state +# +# block = build_empty_block_for_next_slot(state) +# block.slot += spec.SLOTS_PER_EPOCH * 5 +# sign_block(state, block, proposer_index=0) +# yield 'blocks', [block], [spec.BeaconBlock] +# +# state_transition(state, block) +# yield 'post', state +# +# assert state.slot == block.slot +# assert state.finalized_epoch < get_current_epoch(state) - 4 +# for index in range(len(state.validator_registry)): +# assert get_balance(state, index) < get_balance(pre_state, index) + + +@spec_state_test +def test_proposer_slashing(state): + # copy for later balance lookups. + pre_state = deepcopy(state) + proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True) + validator_index = proposer_slashing.proposer_index + + assert not state.validator_registry[validator_index].slashed + + yield 'pre', state + + # + # Add to state via block transition + # + block = build_empty_block_for_next_slot(state) + block.body.proposer_slashings.append(proposer_slashing) + sign_block(state, block) + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + # check if slashed + slashed_validator = state.validator_registry[validator_index] + assert slashed_validator.slashed + assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH + assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH + # lost whistleblower reward + assert get_balance(state, validator_index) < get_balance(pre_state, validator_index) + + +@spec_state_test +def test_attester_slashing(state): + # copy for later balance lookups. + pre_state = deepcopy(state) + + attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True) + validator_index = (attester_slashing.attestation_1.custody_bit_0_indices + + attester_slashing.attestation_1.custody_bit_1_indices)[0] + + assert not state.validator_registry[validator_index].slashed + + yield 'pre', state + + # + # Add to state via block transition + # + block = build_empty_block_for_next_slot(state) + block.body.attester_slashings.append(attester_slashing) + sign_block(state, block) + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + slashed_validator = state.validator_registry[validator_index] + assert slashed_validator.slashed + assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH + assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH + # lost whistleblower reward + assert get_balance(state, validator_index) < get_balance(pre_state, validator_index) + + proposer_index = get_beacon_proposer_index(state) + # gained whistleblower reward + assert ( + get_balance(state, proposer_index) > + get_balance(pre_state, proposer_index) + ) + + +# TODO update functions below to be like above, i.e. 
with @spec_state_test and yielding data to put into the test vector + +@spec_state_test +def test_deposit_in_block(state): + initial_registry_len = len(state.validator_registry) + initial_balances_len = len(state.balances) + + validator_index = len(state.validator_registry) + amount = spec.MAX_EFFECTIVE_BALANCE + deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True) + + yield 'pre', state + + block = build_empty_block_for_next_slot(state) + block.body.deposits.append(deposit) + sign_block(state, block) + + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + assert len(state.validator_registry) == initial_registry_len + 1 + assert len(state.balances) == initial_balances_len + 1 + assert get_balance(state, validator_index) == spec.MAX_EFFECTIVE_BALANCE + assert state.validator_registry[validator_index].pubkey == pubkeys[validator_index] + + +@spec_state_test +def test_deposit_top_up(state): + validator_index = 0 + amount = spec.MAX_EFFECTIVE_BALANCE // 4 + deposit = prepare_state_and_deposit(state, validator_index, amount) + + initial_registry_len = len(state.validator_registry) + initial_balances_len = len(state.balances) + validator_pre_balance = get_balance(state, validator_index) + + yield 'pre', state + + block = build_empty_block_for_next_slot(state) + block.body.deposits.append(deposit) + sign_block(state, block) + + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + assert len(state.validator_registry) == initial_registry_len + assert len(state.balances) == initial_balances_len + assert get_balance(state, validator_index) == validator_pre_balance + amount + + +@spec_state_test +def test_attestation(state): + state.slot = spec.SLOTS_PER_EPOCH + + yield 'pre', state + + attestation = get_valid_attestation(state, signed=True) + + # Add to state via block transition + pre_current_attestations_len = len(state.current_epoch_attestations) + attestation_block = build_empty_block_for_next_slot(state) + attestation_block.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY + attestation_block.body.attestations.append(attestation) + sign_block(state, attestation_block) + state_transition(state, attestation_block) + + assert len(state.current_epoch_attestations) == pre_current_attestations_len + 1 + + # Epoch transition should move to previous_epoch_attestations + pre_current_attestations_root = spec.hash_tree_root(state.current_epoch_attestations) + + epoch_block = build_empty_block_for_next_slot(state) + epoch_block.slot += spec.SLOTS_PER_EPOCH + sign_block(state, epoch_block) + state_transition(state, epoch_block) + + yield 'blocks', [attestation_block, epoch_block], [spec.BeaconBlock] + yield 'post', state + + assert len(state.current_epoch_attestations) == 0 + assert spec.hash_tree_root(state.previous_epoch_attestations) == pre_current_attestations_root + + +@spec_state_test +def test_voluntary_exit(state): + validator_index = get_active_validator_indices( + state, + get_current_epoch(state) + )[-1] + + # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit + state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH + + yield 'pre', state + + voluntary_exit = VoluntaryExit( + epoch=get_current_epoch(state), + validator_index=validator_index, + ) + voluntary_exit.signature = bls_sign( + message_hash=signing_root(voluntary_exit), + privkey=privkeys[validator_index], + domain=get_domain( + state=state, + 
domain_type=spec.DOMAIN_VOLUNTARY_EXIT, + ) + ) + + # Add to state via block transition + initiate_exit_block = build_empty_block_for_next_slot(state) + initiate_exit_block.body.voluntary_exits.append(voluntary_exit) + sign_block(state, initiate_exit_block) + state_transition(state, initiate_exit_block) + + assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH + + # Process within epoch transition + exit_block = build_empty_block_for_next_slot(state) + exit_block.slot += spec.SLOTS_PER_EPOCH + sign_block(state, exit_block) + state_transition(state, exit_block) + + yield 'blocks', [initiate_exit_block, exit_block], [spec.BeaconBlock] + yield 'post', state + + assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH + + +@spec_state_test +def test_transfer(state): + # overwrite default 0 to test + spec.MAX_TRANSFERS = 1 + + sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] + amount = get_balance(state, sender_index) + + transfer = get_valid_transfer(state, state.slot + 1, sender_index, amount, signed=True) + recipient_index = transfer.recipient + pre_transfer_recipient_balance = get_balance(state, recipient_index) + + # un-activate so validator can transfer + state.validator_registry[sender_index].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH + + yield 'pre', state + + # Add to state via block transition + block = build_empty_block_for_next_slot(state) + block.body.transfers.append(transfer) + sign_block(state, block) + + yield 'blocks', [block], [spec.BeaconBlock] + + state_transition(state, block) + yield 'post', state + + sender_balance = get_balance(state, sender_index) + recipient_balance = get_balance(state, recipient_index) + assert sender_balance == 0 + assert recipient_balance == pre_transfer_recipient_balance + amount + + +@spec_state_test +def test_balance_driven_status_transitions(state): + current_epoch = get_current_epoch(state) + validator_index = get_active_validator_indices(state, current_epoch)[-1] + + assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH + + # set validator balance to below ejection threshold + state.validator_registry[validator_index].effective_balance = spec.EJECTION_BALANCE + + yield 'pre', state + + # trigger epoch transition + block = build_empty_block_for_next_slot(state) + block.slot += spec.SLOTS_PER_EPOCH + sign_block(state, block) + state_transition(state, block) + + yield 'blocks', [block], [spec.BeaconBlock] + yield 'post', state + + assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH + + +@spec_state_test +def test_historical_batch(state): + state.slot += spec.SLOTS_PER_HISTORICAL_ROOT - (state.slot % spec.SLOTS_PER_HISTORICAL_ROOT) - 1 + pre_historical_roots_len = len(state.historical_roots) + + yield 'pre', state + + block = build_empty_block_for_next_slot(state, signed=True) + state_transition(state, block) + + yield 'blocks', [block], [spec.BeaconBlock] + yield 'post', state + + assert state.slot == block.slot + assert get_current_epoch(state) % (spec.SLOTS_PER_HISTORICAL_ROOT // spec.SLOTS_PER_EPOCH) == 0 + assert len(state.historical_roots) == pre_historical_roots_len + 1 + + +# @spec_state_test +# def test_eth1_data_votes(state): +# yield 'pre', state +# +# expected_votes = 0 +# assert len(state.eth1_data_votes) == expected_votes +# +# blocks = [] +# for _ in range(spec.SLOTS_PER_ETH1_VOTING_PERIOD - 1): +# block = build_empty_block_for_next_slot(state) +# state_transition(state, 
block) +# expected_votes += 1 +# assert len(state.eth1_data_votes) == expected_votes +# blocks.append(block) +# +# block = build_empty_block_for_next_slot(state) +# blocks.append(block) +# +# state_transition(state, block) +# +# yield 'blocks', [block], [spec.BeaconBlock] +# yield 'post', state +# +# assert state.slot % spec.SLOTS_PER_ETH1_VOTING_PERIOD == 0 +# assert len(state.eth1_data_votes) == 1 diff --git a/test_libs/pyspec/eth2spec/test/sanity/test_slots.py b/test_libs/pyspec/eth2spec/test/sanity/test_slots.py new file mode 100644 index 0000000000..2e5f3a5df6 --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/sanity/test_slots.py @@ -0,0 +1,58 @@ +import eth2spec.phase0.spec as spec + +from eth2spec.phase0.state_transition import state_transition_to +from eth2spec.test.helpers.state import get_state_root +from eth2spec.test.context import spec_state_test + + +@spec_state_test +def test_slots_1(state): + pre_slot = state.slot + pre_root = state.hash_tree_root() + yield 'pre', state + + slots = 1 + yield 'slots', slots + state_transition_to(state, state.slot + slots) + + yield 'post', state + assert state.slot == pre_slot + 1 + assert get_state_root(state, pre_slot) == pre_root + + +@spec_state_test +def test_slots_2(state): + yield 'pre', state + slots = 2 + yield 'slots', slots + state_transition_to(state, state.slot + slots) + yield 'post', state + + +@spec_state_test +def test_empty_epoch(state): + yield 'pre', state + slots = spec.SLOTS_PER_EPOCH + yield 'slots', slots + state_transition_to(state, state.slot + slots) + yield 'post', state + + +@spec_state_test +def test_double_empty_epoch(state): + yield 'pre', state + slots = spec.SLOTS_PER_EPOCH * 2 + yield 'slots', slots + state_transition_to(state, state.slot + slots) + yield 'post', state + + +@spec_state_test +def test_over_epoch_boundary(state): + state_transition_to(state, state.slot + (spec.SLOTS_PER_EPOCH // 2)) + yield 'pre', state + slots = spec.SLOTS_PER_EPOCH + yield 'slots', slots + state_transition_to(state, state.slot + slots) + yield 'post', state + diff --git a/test_libs/pyspec/tests/test_finality.py b/test_libs/pyspec/eth2spec/test/test_finality.py similarity index 56% rename from test_libs/pyspec/tests/test_finality.py rename to test_libs/pyspec/eth2spec/test/test_finality.py index ca048c2b2a..56f65eca9a 100644 --- a/test_libs/pyspec/tests/test_finality.py +++ b/test_libs/pyspec/eth2spec/test/test_finality.py @@ -1,24 +1,18 @@ from copy import deepcopy -import pytest - import eth2spec.phase0.spec as spec - from eth2spec.phase0.state_transition import ( state_transition, ) -from .helpers import ( - build_empty_block_for_next_slot, - fill_aggregate_attestation, +from .context import spec_state_test, never_bls +from .helpers.state import next_epoch +from .helpers.block import build_empty_block_for_next_slot, apply_empty_block +from .helpers.attestations import ( get_current_epoch, get_epoch_start_slot, get_valid_attestation, - next_epoch, ) -# mark entire file as 'state' -pytestmark = pytest.mark.state - def check_finality(state, prev_state, @@ -58,13 +52,11 @@ def next_epoch_with_attestations(state, slot_to_attest = post_state.slot - spec.MIN_ATTESTATION_INCLUSION_DELAY + 1 if slot_to_attest >= get_epoch_start_slot(get_current_epoch(post_state)): cur_attestation = get_valid_attestation(post_state, slot_to_attest) - fill_aggregate_attestation(post_state, cur_attestation) block.body.attestations.append(cur_attestation) if fill_prev_epoch: slot_to_attest = post_state.slot - spec.SLOTS_PER_EPOCH + 1 prev_attestation 
= get_valid_attestation(post_state, slot_to_attest) - fill_aggregate_attestation(post_state, prev_attestation) block.body.attestations.append(prev_attestation) state_transition(post_state, block) @@ -73,126 +65,140 @@ def next_epoch_with_attestations(state, return state, blocks, post_state +@never_bls +@spec_state_test def test_finality_rule_4(state): - test_state = deepcopy(state) + yield 'pre', state blocks = [] for epoch in range(4): - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False) + prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False) blocks += new_blocks # justification/finalization skipped at GENESIS_EPOCH if epoch == 0: - check_finality(test_state, prev_state, False, False, False) + check_finality(state, prev_state, False, False, False) # justification/finalization skipped at GENESIS_EPOCH + 1 elif epoch == 1: - check_finality(test_state, prev_state, False, False, False) + check_finality(state, prev_state, False, False, False) elif epoch == 2: - check_finality(test_state, prev_state, True, False, False) + check_finality(state, prev_state, True, False, False) elif epoch >= 3: # rule 4 of finality - check_finality(test_state, prev_state, True, True, True) - assert test_state.finalized_epoch == prev_state.current_justified_epoch - assert test_state.finalized_root == prev_state.current_justified_root + check_finality(state, prev_state, True, True, True) + assert state.finalized_epoch == prev_state.current_justified_epoch + assert state.finalized_root == prev_state.current_justified_root - return state, blocks, test_state + yield 'blocks', blocks, [spec.BeaconBlock] + yield 'post', state +@never_bls +@spec_state_test def test_finality_rule_1(state): # get past first two epochs that finality does not run on next_epoch(state) + apply_empty_block(state) next_epoch(state) + apply_empty_block(state) - pre_state = deepcopy(state) - test_state = deepcopy(state) + yield 'pre', state blocks = [] for epoch in range(3): - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True) + prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True) blocks += new_blocks if epoch == 0: - check_finality(test_state, prev_state, True, False, False) + check_finality(state, prev_state, True, False, False) elif epoch == 1: - check_finality(test_state, prev_state, True, True, False) + check_finality(state, prev_state, True, True, False) elif epoch == 2: # finalized by rule 1 - check_finality(test_state, prev_state, True, True, True) - assert test_state.finalized_epoch == prev_state.previous_justified_epoch - assert test_state.finalized_root == prev_state.previous_justified_root + check_finality(state, prev_state, True, True, True) + assert state.finalized_epoch == prev_state.previous_justified_epoch + assert state.finalized_root == prev_state.previous_justified_root - return pre_state, blocks, test_state + yield 'blocks', blocks, [spec.BeaconBlock] + yield 'post', state +@never_bls +@spec_state_test def test_finality_rule_2(state): # get past first two epochs that finality does not run on next_epoch(state) + apply_empty_block(state) next_epoch(state) + apply_empty_block(state) - pre_state = deepcopy(state) - test_state = deepcopy(state) + yield 'pre', state blocks = [] for epoch in range(3): if epoch == 0: - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False) - check_finality(test_state, prev_state, True, False, False) + prev_state, new_blocks, state = 
next_epoch_with_attestations(state, True, False) + check_finality(state, prev_state, True, False, False) elif epoch == 1: - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, False) - check_finality(test_state, prev_state, False, True, False) + prev_state, new_blocks, state = next_epoch_with_attestations(state, False, False) + check_finality(state, prev_state, False, True, False) elif epoch == 2: - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True) + prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True) # finalized by rule 2 - check_finality(test_state, prev_state, True, False, True) - assert test_state.finalized_epoch == prev_state.previous_justified_epoch - assert test_state.finalized_root == prev_state.previous_justified_root + check_finality(state, prev_state, True, False, True) + assert state.finalized_epoch == prev_state.previous_justified_epoch + assert state.finalized_root == prev_state.previous_justified_root blocks += new_blocks - return pre_state, blocks, test_state + yield 'blocks', blocks, [spec.BeaconBlock] + yield 'post', state +@never_bls +@spec_state_test def test_finality_rule_3(state): """ Test scenario described here https://github.com/ethereum/eth2.0-specs/issues/611#issuecomment-463612892 """ - # get past first two epochs that finality does not run on next_epoch(state) + apply_empty_block(state) next_epoch(state) + apply_empty_block(state) - pre_state = deepcopy(state) - test_state = deepcopy(state) + yield 'pre', state blocks = [] - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False) + prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False) blocks += new_blocks - check_finality(test_state, prev_state, True, False, False) + check_finality(state, prev_state, True, False, False) # In epoch N, JE is set to N, prev JE is set to N-1 - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False) + prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False) blocks += new_blocks - check_finality(test_state, prev_state, True, True, True) + check_finality(state, prev_state, True, True, True) # In epoch N+1, JE is N, prev JE is N-1, and not enough messages get in to do anything - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, False) + prev_state, new_blocks, state = next_epoch_with_attestations(state, False, False) blocks += new_blocks - check_finality(test_state, prev_state, False, True, False) + check_finality(state, prev_state, False, True, False) # In epoch N+2, JE is N, prev JE is N, and enough messages from the previous epoch get in to justify N+1. # N+1 now becomes the JE. Not enough messages from epoch N+2 itself get in to justify N+2 - prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True) + prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True) blocks += new_blocks # rule 2 - check_finality(test_state, prev_state, True, False, True) + check_finality(state, prev_state, True, False, True) # In epoch N+3, LJE is N+1, prev LJE is N, and enough messages get in to justify epochs N+2 and N+3. 
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, True) + prev_state, new_blocks, state = next_epoch_with_attestations(state, True, True) blocks += new_blocks # rule 3 - check_finality(test_state, prev_state, True, True, True) - assert test_state.finalized_epoch == prev_state.current_justified_epoch - assert test_state.finalized_root == prev_state.current_justified_root + check_finality(state, prev_state, True, True, True) + assert state.finalized_epoch == prev_state.current_justified_epoch + assert state.finalized_root == prev_state.current_justified_root - return pre_state, blocks, test_state + yield 'blocks', blocks, [spec.BeaconBlock] + yield 'post', state diff --git a/test_libs/pyspec/eth2spec/test/utils.py b/test_libs/pyspec/eth2spec/test/utils.py new file mode 100644 index 0000000000..b61801c3dd --- /dev/null +++ b/test_libs/pyspec/eth2spec/test/utils.py @@ -0,0 +1,80 @@ +from typing import Dict, Any, Callable, Iterable +from eth2spec.debug.encode import encode + + +def spectest(description: str = None): + def runner(fn): + # this wraps the function, to hide that the function actually is yielding data, instead of returning once. + def entry(*args, **kw): + # check generator mode, may be None/else. + # "pop" removes it, so it is not passed to the inner function. + if kw.pop('generator_mode', False) is True: + out = {} + if description is None: + # fall back on function name for test description + name = fn.__name__ + if name.startswith('test_'): + name = name[5:] + out['description'] = name + else: + # description can be explicit + out['description'] = description + has_contents = False + # put all generated data into a dict. + for data in fn(*args, **kw): + has_contents = True + # If there is a type argument, encode it as that type. + if len(data) == 3: + (key, value, typ) = data + out[key] = encode(value, typ) + else: + # Otherwise, try to infer the type, but keep it as-is if it's not a SSZ container. + (key, value) = data + if hasattr(value.__class__, 'fields'): + out[key] = encode(value, value.__class__) + else: + out[key] = value + if has_contents: + return out + else: + return None + else: + # just complete the function, ignore all yielded data, we are not using it + for _ in fn(*args, **kw): + continue + return None + return entry + return runner + + +def with_tags(tags: Dict[str, Any]): + """ + Decorator factory, adds tags (key, value) pairs to the output of the function. + Useful to build test-vector annotations with. + This decorator is applied after the ``spectest`` decorator is applied. + :param tags: dict of tags + :return: Decorator. + """ + def runner(fn): + def entry(*args, **kw): + fn_out = fn(*args, **kw) + # do not add tags if the function is not returning a dict at all (i.e. not in generator mode) + if fn_out is None: + return None + return {**tags, **fn_out} + return entry + return runner + + +def with_args(create_args: Callable[[], Iterable[Any]]): + """ + Decorator factory, adds given extra arguments to the decorated function. + :param create_args: function to create arguments with. + :return: Decorator. + """ + def runner(fn): + # this wraps the function, to hide that the function actually yielding data. 
+ def entry(*args, **kw): + return fn(*(list(create_args()) + list(args)), **kw) + return entry + return runner diff --git a/test_libs/pyspec/eth2spec/utils/bls.py b/test_libs/pyspec/eth2spec/utils/bls.py new file mode 100644 index 0000000000..52f1fed632 --- /dev/null +++ b/test_libs/pyspec/eth2spec/utils/bls.py @@ -0,0 +1,46 @@ +from py_ecc import bls + +# Flag to make BLS active or not. Used for testing, do not ignore BLS in production unless you know what you are doing. +bls_active = True + +STUB_SIGNATURE = b'\x11' * 96 +STUB_PUBKEY = b'\x22' * 48 + + +def only_with_bls(alt_return=None): + """ + Decorator factory to make a function only run when BLS is active. Otherwise return the default. + """ + def runner(fn): + def entry(*args, **kw): + if bls_active: + return fn(*args, **kw) + else: + return alt_return + return entry + return runner + + +@only_with_bls(alt_return=True) +def bls_verify(pubkey, message_hash, signature, domain): + return bls.verify(message_hash=message_hash, pubkey=pubkey, signature=signature, domain=domain) + + +@only_with_bls(alt_return=True) +def bls_verify_multiple(pubkeys, message_hashes, signature, domain): + return bls.verify_multiple(pubkeys, message_hashes, signature, domain) + + +@only_with_bls(alt_return=STUB_PUBKEY) +def bls_aggregate_pubkeys(pubkeys): + return bls.aggregate_pubkeys(pubkeys) + + +@only_with_bls(alt_return=STUB_SIGNATURE) +def bls_aggregate_signatures(signatures): + return bls.aggregate_signatures(signatures) + + +@only_with_bls(alt_return=STUB_SIGNATURE) +def bls_sign(message_hash, privkey, domain): + return bls.sign(message_hash=message_hash, privkey=privkey, domain=domain) diff --git a/test_libs/pyspec/eth2spec/utils/bls_stub.py b/test_libs/pyspec/eth2spec/utils/bls_stub.py deleted file mode 100644 index 108c4ef710..0000000000 --- a/test_libs/pyspec/eth2spec/utils/bls_stub.py +++ /dev/null @@ -1,12 +0,0 @@ - - -def bls_verify(pubkey, message_hash, signature, domain): - return True - - -def bls_verify_multiple(pubkeys, message_hashes, signature, domain): - return True - - -def bls_aggregate_pubkeys(pubkeys): - return b'\x42' * 96 diff --git a/test_libs/pyspec/tests/block_processing/test_process_attestation.py b/test_libs/pyspec/tests/block_processing/test_process_attestation.py deleted file mode 100644 index bcf71376ce..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_attestation.py +++ /dev/null @@ -1,155 +0,0 @@ -from copy import deepcopy -import pytest - -import eth2spec.phase0.spec as spec - -from eth2spec.phase0.state_transition import ( - state_transition, -) -from eth2spec.phase0.spec import ( - get_current_epoch, - process_attestation, - slot_to_epoch, -) -from tests.helpers import ( - build_empty_block_for_next_slot, - get_valid_attestation, - next_epoch, - next_slot, -) - - -# mark entire file as 'attestations' -pytestmark = pytest.mark.attestations - - -def run_attestation_processing(state, attestation, valid=True): - """ - Run ``process_attestation`` returning the pre and post state. 
- If ``valid == False``, run expecting ``AssertionError`` - """ - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_attestation(post_state, attestation) - return state, None - - process_attestation(post_state, attestation) - - current_epoch = get_current_epoch(state) - if attestation.data.target_epoch == current_epoch: - assert len(post_state.current_epoch_attestations) == len(state.current_epoch_attestations) + 1 - else: - assert len(post_state.previous_epoch_attestations) == len(state.previous_epoch_attestations) + 1 - - return state, post_state - - -def test_success(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - pre_state, post_state = run_attestation_processing(state, attestation) - - return pre_state, attestation, post_state - - -def test_success_prevous_epoch(state): - attestation = get_valid_attestation(state) - block = build_empty_block_for_next_slot(state) - block.slot = state.slot + spec.SLOTS_PER_EPOCH - state_transition(state, block) - - pre_state, post_state = run_attestation_processing(state, attestation) - - return pre_state, attestation, post_state - - -def test_before_inclusion_delay(state): - attestation = get_valid_attestation(state) - # do not increment slot to allow for inclusion delay - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_after_epoch_slots(state): - attestation = get_valid_attestation(state) - block = build_empty_block_for_next_slot(state) - # increment past latest inclusion slot - block.slot = state.slot + spec.SLOTS_PER_EPOCH + 1 - state_transition(state, block) - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_bad_source_epoch(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - attestation.data.source_epoch += 10 - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_bad_source_root(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - attestation.data.source_root = b'\x42' * 32 - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_non_zero_crosslink_data_root(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - attestation.data.crosslink_data_root = b'\x42' * 32 - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_bad_previous_crosslink(state): - next_epoch(state) - attestation = get_valid_attestation(state) - for _ in range(spec.MIN_ATTESTATION_INCLUSION_DELAY): - next_slot(state) - - state.current_crosslinks[attestation.data.shard].epoch += 10 - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def test_non_empty_custody_bitfield(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield) - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state - - -def 
test_empty_aggregation_bitfield(state): - attestation = get_valid_attestation(state) - state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - - attestation.aggregation_bitfield = b'\x00' * len(attestation.aggregation_bitfield) - - pre_state, post_state = run_attestation_processing(state, attestation, False) - - return pre_state, attestation, post_state diff --git a/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py b/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py deleted file mode 100644 index 2ea16f13d9..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py +++ /dev/null @@ -1,117 +0,0 @@ -from copy import deepcopy -import pytest - -import eth2spec.phase0.spec as spec -from eth2spec.phase0.spec import ( - get_beacon_proposer_index, - process_attester_slashing, -) -from tests.helpers import ( - get_balance, - get_valid_attester_slashing, - next_epoch, -) - -# mark entire file as 'attester_slashing' -pytestmark = pytest.mark.attester_slashings - - -def run_attester_slashing_processing(state, attester_slashing, valid=True): - """ - Run ``process_attester_slashing`` returning the pre and post state. - If ``valid == False``, run expecting ``AssertionError`` - """ - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_attester_slashing(post_state, attester_slashing) - return state, None - - process_attester_slashing(post_state, attester_slashing) - - slashed_index = attester_slashing.attestation_1.custody_bit_0_indices[0] - slashed_validator = post_state.validator_registry[slashed_index] - assert slashed_validator.slashed - assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH - assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH - # lost whistleblower reward - assert ( - get_balance(post_state, slashed_index) < - get_balance(state, slashed_index) - ) - proposer_index = get_beacon_proposer_index(state) - # gained whistleblower reward - assert ( - get_balance(post_state, proposer_index) > - get_balance(state, proposer_index) - ) - - return state, post_state - - -def test_success_double(state): - attester_slashing = get_valid_attester_slashing(state) - - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing) - - return pre_state, attester_slashing, post_state - - -def test_success_surround(state): - next_epoch(state) - state.current_justified_epoch += 1 - attester_slashing = get_valid_attester_slashing(state) - - # set attestion1 to surround attestation 2 - attester_slashing.attestation_1.data.source_epoch = attester_slashing.attestation_2.data.source_epoch - 1 - attester_slashing.attestation_1.data.target_epoch = attester_slashing.attestation_2.data.target_epoch + 1 - - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing) - - return pre_state, attester_slashing, post_state - - -def test_same_data(state): - attester_slashing = get_valid_attester_slashing(state) - - attester_slashing.attestation_1.data = attester_slashing.attestation_2.data - - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False) - - return pre_state, attester_slashing, post_state - - -def test_no_double_or_surround(state): - attester_slashing = get_valid_attester_slashing(state) - - attester_slashing.attestation_1.data.target_epoch += 1 - - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False) - - return pre_state, attester_slashing, post_state - - -def 
test_participants_already_slashed(state): - attester_slashing = get_valid_attester_slashing(state) - - # set all indices to slashed - attestation_1 = attester_slashing.attestation_1 - validator_indices = attestation_1.custody_bit_0_indices + attestation_1.custody_bit_1_indices - for index in validator_indices: - state.validator_registry[index].slashed = True - - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False) - - return pre_state, attester_slashing, post_state - - -def test_custody_bit_0_and_1(state): - attester_slashing = get_valid_attester_slashing(state) - - attester_slashing.attestation_1.custody_bit_1_indices = ( - attester_slashing.attestation_1.custody_bit_0_indices - ) - pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False) - - return pre_state, attester_slashing, post_state diff --git a/test_libs/pyspec/tests/block_processing/test_process_block_header.py b/test_libs/pyspec/tests/block_processing/test_process_block_header.py deleted file mode 100644 index b35b0a9c11..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_block_header.py +++ /dev/null @@ -1,76 +0,0 @@ -from copy import deepcopy -import pytest - - -from eth2spec.phase0.spec import ( - get_beacon_proposer_index, - cache_state, - advance_slot, - process_block_header, -) -from tests.helpers import ( - build_empty_block_for_next_slot, - next_slot, -) - -# mark entire file as 'header' -pytestmark = pytest.mark.header - - -def prepare_state_for_header_processing(state): - cache_state(state) - advance_slot(state) - - -def run_block_header_processing(state, block, valid=True): - """ - Run ``process_block_header`` returning the pre and post state. - If ``valid == False``, run expecting ``AssertionError`` - """ - prepare_state_for_header_processing(state) - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_block_header(post_state, block) - return state, None - - process_block_header(post_state, block) - return state, post_state - - -def test_success(state): - block = build_empty_block_for_next_slot(state) - pre_state, post_state = run_block_header_processing(state, block) - return state, block, post_state - - -def test_invalid_slot(state): - block = build_empty_block_for_next_slot(state) - block.slot = state.slot + 2 # invalid slot - - pre_state, post_state = run_block_header_processing(state, block, valid=False) - return pre_state, block, None - - -def test_invalid_previous_block_root(state): - block = build_empty_block_for_next_slot(state) - block.previous_block_root = b'\12' * 32 # invalid prev root - - pre_state, post_state = run_block_header_processing(state, block, valid=False) - return pre_state, block, None - - -def test_proposer_slashed(state): - # use stub state to get proposer index of next slot - stub_state = deepcopy(state) - next_slot(stub_state) - proposer_index = get_beacon_proposer_index(stub_state) - - # set proposer to slashed - state.validator_registry[proposer_index].slashed = True - - block = build_empty_block_for_next_slot(state) - - pre_state, post_state = run_block_header_processing(state, block, valid=False) - return pre_state, block, None diff --git a/test_libs/pyspec/tests/block_processing/test_process_deposit.py b/test_libs/pyspec/tests/block_processing/test_process_deposit.py deleted file mode 100644 index bbfb390efb..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_deposit.py +++ /dev/null @@ -1,141 +0,0 @@ -from copy import deepcopy 
-import pytest - -import eth2spec.phase0.spec as spec - -from eth2spec.phase0.spec import ( - ZERO_HASH, - process_deposit, -) -from tests.helpers import ( - get_balance, - build_deposit, - privkeys, - pubkeys, -) - - -# mark entire file as 'deposits' -pytestmark = pytest.mark.deposits - - -def test_success(state): - pre_state = deepcopy(state) - # fill previous deposits with zero-hash - deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - index = len(deposit_data_leaves) - pubkey = pubkeys[index] - privkey = privkeys[index] - deposit, root, deposit_data_leaves = build_deposit( - pre_state, - deposit_data_leaves, - pubkey, - privkey, - spec.MAX_EFFECTIVE_BALANCE, - ) - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves) - - post_state = deepcopy(pre_state) - - process_deposit(post_state, deposit) - - assert len(post_state.validator_registry) == len(state.validator_registry) + 1 - assert len(post_state.balances) == len(state.balances) + 1 - assert post_state.validator_registry[index].pubkey == pubkeys[index] - assert get_balance(post_state, index) == spec.MAX_EFFECTIVE_BALANCE - assert post_state.deposit_index == post_state.latest_eth1_data.deposit_count - - return pre_state, deposit, post_state - - -def test_success_top_up(state): - pre_state = deepcopy(state) - deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - validator_index = 0 - amount = spec.MAX_EFFECTIVE_BALANCE // 4 - pubkey = pubkeys[validator_index] - privkey = privkeys[validator_index] - deposit, root, deposit_data_leaves = build_deposit( - pre_state, - deposit_data_leaves, - pubkey, - privkey, - amount, - ) - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves) - pre_balance = get_balance(pre_state, validator_index) - - post_state = deepcopy(pre_state) - - process_deposit(post_state, deposit) - - assert len(post_state.validator_registry) == len(state.validator_registry) - assert len(post_state.balances) == len(state.balances) - assert post_state.deposit_index == post_state.latest_eth1_data.deposit_count - assert get_balance(post_state, validator_index) == pre_balance + amount - - return pre_state, deposit, post_state - - -def test_wrong_index(state): - pre_state = deepcopy(state) - deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - index = len(deposit_data_leaves) - pubkey = pubkeys[index] - privkey = privkeys[index] - deposit, root, deposit_data_leaves = build_deposit( - pre_state, - deposit_data_leaves, - pubkey, - privkey, - spec.MAX_EFFECTIVE_BALANCE, - ) - - # mess up deposit_index - deposit.index = pre_state.deposit_index + 1 - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves) - - post_state = deepcopy(pre_state) - - with pytest.raises(AssertionError): - process_deposit(post_state, deposit) - - return pre_state, deposit, None - - -def test_bad_merkle_proof(state): - pre_state = deepcopy(state) - deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - index = len(deposit_data_leaves) - pubkey = pubkeys[index] - privkey = privkeys[index] - deposit, root, deposit_data_leaves = build_deposit( - pre_state, - deposit_data_leaves, - pubkey, - privkey, - spec.MAX_EFFECTIVE_BALANCE, - ) - - # mess up merkle branch - deposit.proof[-1] = spec.ZERO_HASH - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = 
len(deposit_data_leaves) - - post_state = deepcopy(pre_state) - - with pytest.raises(AssertionError): - process_deposit(post_state, deposit) - - return pre_state, deposit, None diff --git a/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py b/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py deleted file mode 100644 index 4752210366..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py +++ /dev/null @@ -1,96 +0,0 @@ -from copy import deepcopy -import pytest - -import eth2spec.phase0.spec as spec -from eth2spec.phase0.spec import ( - get_current_epoch, - process_proposer_slashing, -) -from tests.helpers import ( - get_balance, - get_valid_proposer_slashing, -) - -# mark entire file as 'proposer_slashings' -pytestmark = pytest.mark.proposer_slashings - - -def run_proposer_slashing_processing(state, proposer_slashing, valid=True): - """ - Run ``process_proposer_slashing`` returning the pre and post state. - If ``valid == False``, run expecting ``AssertionError`` - """ - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_proposer_slashing(post_state, proposer_slashing) - return state, None - - process_proposer_slashing(post_state, proposer_slashing) - - slashed_validator = post_state.validator_registry[proposer_slashing.proposer_index] - assert slashed_validator.slashed - assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH - assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH - # lost whistleblower reward - assert ( - get_balance(post_state, proposer_slashing.proposer_index) < - get_balance(state, proposer_slashing.proposer_index) - ) - - return state, post_state - - -def test_success(state): - proposer_slashing = get_valid_proposer_slashing(state) - - pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing) - - return pre_state, proposer_slashing, post_state - - -def test_epochs_are_different(state): - proposer_slashing = get_valid_proposer_slashing(state) - - # set slots to be in different epochs - proposer_slashing.header_2.slot += spec.SLOTS_PER_EPOCH - - pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False) - - return pre_state, proposer_slashing, post_state - - -def test_headers_are_same(state): - proposer_slashing = get_valid_proposer_slashing(state) - - # set headers to be the same - proposer_slashing.header_2 = proposer_slashing.header_1 - - pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False) - - return pre_state, proposer_slashing, post_state - - -def test_proposer_is_slashed(state): - proposer_slashing = get_valid_proposer_slashing(state) - - # set proposer to slashed - state.validator_registry[proposer_slashing.proposer_index].slashed = True - - pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False) - - return pre_state, proposer_slashing, post_state - - -def test_proposer_is_withdrawn(state): - proposer_slashing = get_valid_proposer_slashing(state) - - # set proposer withdrawable_epoch in past - current_epoch = get_current_epoch(state) - proposer_index = proposer_slashing.proposer_index - state.validator_registry[proposer_index].withdrawable_epoch = current_epoch - 1 - - pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False) - - return pre_state, proposer_slashing, post_state diff --git a/test_libs/pyspec/tests/block_processing/test_process_transfer.py 
b/test_libs/pyspec/tests/block_processing/test_process_transfer.py deleted file mode 100644 index 0eeaa77929..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_process_transfer.py +++ /dev/null @@ -1,141 +0,0 @@ -from copy import deepcopy -import pytest - -import eth2spec.phase0.spec as spec - -from eth2spec.phase0.spec import ( - get_active_validator_indices, - get_beacon_proposer_index, - get_current_epoch, - process_transfer, -) -from tests.helpers import ( - get_valid_transfer, - next_epoch, -) - - -# mark entire file as 'transfers' -pytestmark = pytest.mark.transfers - - -def run_transfer_processing(state, transfer, valid=True): - """ - Run ``process_transfer`` returning the pre and post state. - If ``valid == False``, run expecting ``AssertionError`` - """ - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_transfer(post_state, transfer) - return state, None - - - process_transfer(post_state, transfer) - - proposer_index = get_beacon_proposer_index(state) - pre_transfer_sender_balance = state.balances[transfer.sender] - pre_transfer_recipient_balance = state.balances[transfer.recipient] - pre_transfer_proposer_balance = state.balances[proposer_index] - sender_balance = post_state.balances[transfer.sender] - recipient_balance = post_state.balances[transfer.recipient] - assert sender_balance == pre_transfer_sender_balance - transfer.amount - transfer.fee - assert recipient_balance == pre_transfer_recipient_balance + transfer.amount - assert post_state.balances[proposer_index] == pre_transfer_proposer_balance + transfer.fee - - return state, post_state - - -def test_success_non_activated(state): - transfer = get_valid_transfer(state) - # un-activate so validator can transfer - state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH - - pre_state, post_state = run_transfer_processing(state, transfer) - - return pre_state, transfer, post_state - - -def test_success_withdrawable(state): - next_epoch(state) - - transfer = get_valid_transfer(state) - - # withdrawable_epoch in past so can transfer - state.validator_registry[transfer.sender].withdrawable_epoch = get_current_epoch(state) - 1 - - pre_state, post_state = run_transfer_processing(state, transfer) - - return pre_state, transfer, post_state - - -def test_success_active_above_max_effective(state): - sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] - amount = spec.MAX_EFFECTIVE_BALANCE // 32 - state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + amount - transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0) - - pre_state, post_state = run_transfer_processing(state, transfer) - - return pre_state, transfer, post_state - - -def test_active_but_transfer_past_effective_balance(state): - sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] - amount = spec.MAX_EFFECTIVE_BALANCE // 32 - state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE - transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0) - - pre_state, post_state = run_transfer_processing(state, transfer, False) - - return pre_state, transfer, post_state - - -def test_incorrect_slot(state): - transfer = get_valid_transfer(state, slot=state.slot+1) - # un-activate so validator can transfer - state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH - - pre_state, post_state = run_transfer_processing(state, transfer, False) - - return 
pre_state, transfer, post_state - - -def test_insufficient_balance(state): - sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] - amount = spec.MAX_EFFECTIVE_BALANCE - state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE - transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount + 1, fee=0) - - # un-activate so validator can transfer - state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH - - pre_state, post_state = run_transfer_processing(state, transfer, False) - - return pre_state, transfer, post_state - - -def test_no_dust(state): - sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1] - balance = state.balances[sender_index] - transfer = get_valid_transfer(state, sender_index=sender_index, amount=balance - spec.MIN_DEPOSIT_AMOUNT + 1, fee=0) - - # un-activate so validator can transfer - state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH - - pre_state, post_state = run_transfer_processing(state, transfer, False) - - return pre_state, transfer, post_state - - -def test_invalid_pubkey(state): - transfer = get_valid_transfer(state) - state.validator_registry[transfer.sender].withdrawal_credentials = spec.ZERO_HASH - - # un-activate so validator can transfer - state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH - - pre_state, post_state = run_transfer_processing(state, transfer, False) - - return pre_state, transfer, post_state diff --git a/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py b/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py deleted file mode 100644 index c58c5238a9..0000000000 --- a/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py +++ /dev/null @@ -1,163 +0,0 @@ -from copy import deepcopy -import pytest - -import eth2spec.phase0.spec as spec - -from eth2spec.phase0.spec import ( - get_active_validator_indices, - get_churn_limit, - get_current_epoch, - process_voluntary_exit, -) -from tests.helpers import ( - build_voluntary_exit, - pubkey_to_privkey, -) - - -# mark entire file as 'voluntary_exits' -pytestmark = pytest.mark.voluntary_exits - - -def run_voluntary_exit_processing(state, voluntary_exit, valid=True): - """ - Run ``process_voluntary_exit`` returning the pre and post state. 
- If ``valid == False``, run expecting ``AssertionError`` - """ - post_state = deepcopy(state) - - if not valid: - with pytest.raises(AssertionError): - process_voluntary_exit(post_state, voluntary_exit) - return state, None - - process_voluntary_exit(post_state, voluntary_exit) - - validator_index = voluntary_exit.validator_index - assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH - assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH - - return state, post_state - - -def test_success(state): - # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit - state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH - - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[0] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - - voluntary_exit = build_voluntary_exit( - state, - current_epoch, - validator_index, - privkey, - ) - - pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit) - return pre_state, voluntary_exit, post_state - - -def test_success_exit_queue(state): - # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit - state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH - - current_epoch = get_current_epoch(state) - - # exit `MAX_EXITS_PER_EPOCH` - initial_indices = get_active_validator_indices(state, current_epoch)[:get_churn_limit(state)] - post_state = state - for index in initial_indices: - privkey = pubkey_to_privkey[state.validator_registry[index].pubkey] - voluntary_exit = build_voluntary_exit( - state, - current_epoch, - index, - privkey, - ) - - pre_state, post_state = run_voluntary_exit_processing(post_state, voluntary_exit) - - # exit an additional validator - validator_index = get_active_validator_indices(state, current_epoch)[-1] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - voluntary_exit = build_voluntary_exit( - state, - current_epoch, - validator_index, - privkey, - ) - - pre_state, post_state = run_voluntary_exit_processing(post_state, voluntary_exit) - - assert ( - post_state.validator_registry[validator_index].exit_epoch == - post_state.validator_registry[initial_indices[0]].exit_epoch + 1 - ) - - return pre_state, voluntary_exit, post_state - - -def test_validator_not_active(state): - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[0] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - - state.validator_registry[validator_index].activation_epoch = spec.FAR_FUTURE_EPOCH - - # - # build and test voluntary exit - # - voluntary_exit = build_voluntary_exit( - state, - current_epoch, - validator_index, - privkey, - ) - - pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False) - return pre_state, voluntary_exit, post_state - - -def test_validator_already_exited(state): - # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow validator able to exit - state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH - - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[0] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - - # but validator already has exited - state.validator_registry[validator_index].exit_epoch = current_epoch + 2 - - voluntary_exit = 
build_voluntary_exit( - state, - current_epoch, - validator_index, - privkey, - ) - - pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False) - return pre_state, voluntary_exit, post_state - - -def test_validator_not_active_long_enough(state): - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[0] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - - voluntary_exit = build_voluntary_exit( - state, - current_epoch, - validator_index, - privkey, - ) - - assert ( - current_epoch - state.validator_registry[validator_index].activation_epoch < - spec.PERSISTENT_COMMITTEE_PERIOD - ) - - pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False) - return pre_state, voluntary_exit, post_state diff --git a/test_libs/pyspec/tests/conftest.py b/test_libs/pyspec/tests/conftest.py deleted file mode 100644 index 9840dc7b20..0000000000 --- a/test_libs/pyspec/tests/conftest.py +++ /dev/null @@ -1,36 +0,0 @@ -import pytest - -from eth2spec.phase0 import spec -from preset_loader import loader - -from .helpers import ( - create_genesis_state, -) - - -def pytest_addoption(parser): - parser.addoption( - "--config", action="store", default="minimal", help="config: make the pyspec use the specified configuration" - ) - - [email protected](autouse=True) -def config(request): - config_name = request.config.getoption("--config") - presets = loader.load_presets('../../configs/', config_name) - spec.apply_constants_preset(presets) - - [email protected] -def num_validators(config): - return spec.SLOTS_PER_EPOCH * 8 - - [email protected] -def deposit_data_leaves(): - return list() - - [email protected] -def state(num_validators, deposit_data_leaves): - return create_genesis_state(num_validators, deposit_data_leaves) diff --git a/test_libs/pyspec/tests/helpers.py b/test_libs/pyspec/tests/helpers.py deleted file mode 100644 index 3b9b6904d5..0000000000 --- a/test_libs/pyspec/tests/helpers.py +++ /dev/null @@ -1,422 +0,0 @@ -from copy import deepcopy - -from py_ecc import bls - -from eth2spec.phase0.state_transition import ( - state_transition, -) -import eth2spec.phase0.spec as spec -from eth2spec.utils.minimal_ssz import signing_root -from eth2spec.phase0.spec import ( - # constants - ZERO_HASH, - # SSZ - Attestation, - AttestationData, - AttestationDataAndCustodyBit, - AttesterSlashing, - BeaconBlock, - BeaconBlockHeader, - Deposit, - DepositData, - Eth1Data, - ProposerSlashing, - Transfer, - VoluntaryExit, - # functions - convert_to_indexed, - get_active_validator_indices, - get_attesting_indices, - get_block_root, - get_block_root_at_slot, - get_crosslink_committee, - get_current_epoch, - get_domain, - get_epoch_start_slot, - get_genesis_beacon_state, - get_previous_epoch, - get_shard_delta, - hash_tree_root, - slot_to_epoch, - verify_merkle_branch, - hash, -) -from eth2spec.utils.merkle_minimal import ( - calc_merkle_tree_from_leaves, - get_merkle_proof, - get_merkle_root, -) - - -privkeys = [i + 1 for i in range(1024)] -pubkeys = [bls.privtopub(privkey) for privkey in privkeys] -pubkey_to_privkey = {pubkey: privkey for privkey, pubkey in zip(privkeys, pubkeys)} - - -def get_balance(state, index): - return state.balances[index] - - -def set_bitfield_bit(bitfield, i): - """ - Set the bit in ``bitfield`` at position ``i`` to ``1``. 
- """ - byte_index = i // 8 - bit_index = i % 8 - return ( - bitfield[:byte_index] + - bytes([bitfield[byte_index] | (1 << bit_index)]) + - bitfield[byte_index+1:] - ) - - -def create_mock_genesis_validator_deposits(num_validators, deposit_data_leaves=None): - if not deposit_data_leaves: - deposit_data_leaves = [] - signature = b'\x33' * 96 - - deposit_data_list = [] - for i in range(num_validators): - pubkey = pubkeys[i] - deposit_data = DepositData( - pubkey=pubkey, - # insecurely use pubkey as withdrawal key as well - withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(pubkey)[1:], - amount=spec.MAX_EFFECTIVE_BALANCE, - signature=signature, - ) - item = deposit_data.hash_tree_root() - deposit_data_leaves.append(item) - tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves)) - root = get_merkle_root((tuple(deposit_data_leaves))) - proof = list(get_merkle_proof(tree, item_index=i)) - assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, i, root) - deposit_data_list.append(deposit_data) - - genesis_validator_deposits = [] - for i in range(num_validators): - genesis_validator_deposits.append(Deposit( - proof=list(get_merkle_proof(tree, item_index=i)), - index=i, - data=deposit_data_list[i] - )) - return genesis_validator_deposits, root - - -def create_genesis_state(num_validators, deposit_data_leaves=None): - initial_deposits, deposit_root = create_mock_genesis_validator_deposits( - num_validators, - deposit_data_leaves, - ) - return get_genesis_beacon_state( - initial_deposits, - genesis_time=0, - genesis_eth1_data=Eth1Data( - deposit_root=deposit_root, - deposit_count=len(initial_deposits), - block_hash=spec.ZERO_HASH, - ), - ) - - -def build_empty_block_for_next_slot(state): - empty_block = BeaconBlock() - empty_block.slot = state.slot + 1 - empty_block.body.eth1_data.deposit_count = state.deposit_index - previous_block_header = deepcopy(state.latest_block_header) - if previous_block_header.state_root == spec.ZERO_HASH: - previous_block_header.state_root = state.hash_tree_root() - empty_block.previous_block_root = signing_root(previous_block_header) - return empty_block - - -def build_deposit_data(state, pubkey, privkey, amount): - deposit_data = DepositData( - pubkey=pubkey, - # insecurely use pubkey as withdrawal key as well - withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(pubkey)[1:], - amount=amount, - ) - signature = bls.sign( - message_hash=signing_root(deposit_data), - privkey=privkey, - domain=get_domain( - state, - spec.DOMAIN_DEPOSIT, - ) - ) - deposit_data.signature = signature - return deposit_data - - -def build_attestation_data(state, slot, shard): - assert state.slot >= slot - - if slot == state.slot: - block_root = build_empty_block_for_next_slot(state).previous_block_root - else: - block_root = get_block_root_at_slot(state, slot) - - current_epoch_start_slot = get_epoch_start_slot(get_current_epoch(state)) - if slot < current_epoch_start_slot: - epoch_boundary_root = get_block_root(state, get_previous_epoch(state)) - elif slot == current_epoch_start_slot: - epoch_boundary_root = block_root - else: - epoch_boundary_root = get_block_root(state, get_current_epoch(state)) - - if slot < current_epoch_start_slot: - justified_epoch = state.previous_justified_epoch - justified_block_root = state.previous_justified_root - else: - justified_epoch = state.current_justified_epoch - justified_block_root = state.current_justified_root - - crosslinks = state.current_crosslinks if slot_to_epoch(slot) == get_current_epoch(state) else 
state.previous_crosslinks - return AttestationData( - shard=shard, - beacon_block_root=block_root, - source_epoch=justified_epoch, - source_root=justified_block_root, - target_epoch=slot_to_epoch(slot), - target_root=epoch_boundary_root, - crosslink_data_root=spec.ZERO_HASH, - previous_crosslink_root=hash_tree_root(crosslinks[shard]), - ) - - -def build_voluntary_exit(state, epoch, validator_index, privkey): - voluntary_exit = VoluntaryExit( - epoch=epoch, - validator_index=validator_index, - ) - voluntary_exit.signature = bls.sign( - message_hash=signing_root(voluntary_exit), - privkey=privkey, - domain=get_domain( - state=state, - domain_type=spec.DOMAIN_VOLUNTARY_EXIT, - message_epoch=epoch, - ) - ) - - return voluntary_exit - - -def build_deposit(state, - deposit_data_leaves, - pubkey, - privkey, - amount): - deposit_data = build_deposit_data(state, pubkey, privkey, amount) - - item = deposit_data.hash_tree_root() - index = len(deposit_data_leaves) - deposit_data_leaves.append(item) - tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves)) - root = get_merkle_root((tuple(deposit_data_leaves))) - proof = list(get_merkle_proof(tree, item_index=index)) - assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root) - - deposit = Deposit( - proof=list(proof), - index=index, - data=deposit_data, - ) - - return deposit, root, deposit_data_leaves - - -def get_valid_proposer_slashing(state): - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[-1] - privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey] - slot = state.slot - - header_1 = BeaconBlockHeader( - slot=slot, - previous_block_root=ZERO_HASH, - state_root=ZERO_HASH, - block_body_root=ZERO_HASH, - ) - header_2 = deepcopy(header_1) - header_2.previous_block_root = b'\x02' * 32 - header_2.slot = slot + 1 - - domain = get_domain( - state=state, - domain_type=spec.DOMAIN_BEACON_PROPOSER, - ) - header_1.signature = bls.sign( - message_hash=signing_root(header_1), - privkey=privkey, - domain=domain, - ) - header_2.signature = bls.sign( - message_hash=signing_root(header_2), - privkey=privkey, - domain=domain, - ) - - return ProposerSlashing( - proposer_index=validator_index, - header_1=header_1, - header_2=header_2, - ) - - -def get_valid_attester_slashing(state): - attestation_1 = get_valid_attestation(state) - attestation_2 = deepcopy(attestation_1) - attestation_2.data.target_root = b'\x01' * 32 - - return AttesterSlashing( - attestation_1=convert_to_indexed(state, attestation_1), - attestation_2=convert_to_indexed(state, attestation_2), - ) - - -def get_valid_attestation(state, slot=None): - if slot is None: - slot = state.slot - - if slot_to_epoch(slot) == get_current_epoch(state): - shard = (state.latest_start_shard + slot) % spec.SLOTS_PER_EPOCH - else: - previous_shard_delta = get_shard_delta(state, get_previous_epoch(state)) - shard = (state.latest_start_shard - previous_shard_delta + slot) % spec.SHARD_COUNT - - attestation_data = build_attestation_data(state, slot, shard) - - crosslink_committee = get_crosslink_committee(state, attestation_data.target_epoch, attestation_data.shard) - - committee_size = len(crosslink_committee) - bitfield_length = (committee_size + 7) // 8 - aggregation_bitfield = b'\xC0' + b'\x00' * (bitfield_length - 1) - custody_bitfield = b'\x00' * bitfield_length - attestation = Attestation( - aggregation_bitfield=aggregation_bitfield, - data=attestation_data, - 
custody_bitfield=custody_bitfield, - ) - participants = get_attesting_indices( - state, - attestation.data, - attestation.aggregation_bitfield, - ) - assert len(participants) == 2 - - signatures = [] - for validator_index in participants: - privkey = privkeys[validator_index] - signatures.append( - get_attestation_signature( - state, - attestation.data, - privkey - ) - ) - - attestation.aggregation_signature = bls.aggregate_signatures(signatures) - return attestation - - -def get_valid_transfer(state, slot=None, sender_index=None, amount=None, fee=None): - if slot is None: - slot = state.slot - current_epoch = get_current_epoch(state) - if sender_index is None: - sender_index = get_active_validator_indices(state, current_epoch)[-1] - recipient_index = get_active_validator_indices(state, current_epoch)[0] - transfer_pubkey = pubkeys[-1] - transfer_privkey = privkeys[-1] - - if fee is None: - fee = get_balance(state, sender_index) // 32 - if amount is None: - amount = get_balance(state, sender_index) - fee - - transfer = Transfer( - sender=sender_index, - recipient=recipient_index, - amount=amount, - fee=fee, - slot=slot, - pubkey=transfer_pubkey, - signature=ZERO_HASH, - ) - transfer.signature = bls.sign( - message_hash=signing_root(transfer), - privkey=transfer_privkey, - domain=get_domain( - state=state, - domain_type=spec.DOMAIN_TRANSFER, - message_epoch=get_current_epoch(state), - ) - ) - - # ensure withdrawal_credentials reproducable - state.validator_registry[transfer.sender].withdrawal_credentials = ( - spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(transfer.pubkey)[1:] - ) - - return transfer - - -def get_attestation_signature(state, attestation_data, privkey, custody_bit=0b0): - message_hash = AttestationDataAndCustodyBit( - data=attestation_data, - custody_bit=custody_bit, - ).hash_tree_root() - - return bls.sign( - message_hash=message_hash, - privkey=privkey, - domain=get_domain( - state=state, - domain_type=spec.DOMAIN_ATTESTATION, - message_epoch=attestation_data.target_epoch, - ) - ) - - -def fill_aggregate_attestation(state, attestation): - crosslink_committee = get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard) - for i in range(len(crosslink_committee)): - attestation.aggregation_bitfield = set_bitfield_bit(attestation.aggregation_bitfield, i) - - -def add_attestation_to_state(state, attestation, slot): - block = build_empty_block_for_next_slot(state) - block.slot = slot - block.body.attestations.append(attestation) - state_transition(state, block) - - -def next_slot(state): - """ - Transition to the next slot via an empty block. - Return the empty block that triggered the transition. - """ - block = build_empty_block_for_next_slot(state) - state_transition(state, block) - return block - - -def next_epoch(state): - """ - Transition to the start slot of the next epoch via an empty block. - Return the empty block that triggered the transition. - """ - block = build_empty_block_for_next_slot(state) - block.slot += spec.SLOTS_PER_EPOCH - (state.slot % spec.SLOTS_PER_EPOCH) - state_transition(state, block) - return block - - -def get_state_root(state, slot) -> bytes: - """ - Return the state root at a recent ``slot``. 
- """ - assert slot < state.slot <= slot + spec.SLOTS_PER_HISTORICAL_ROOT - return state.latest_state_roots[slot % spec.SLOTS_PER_HISTORICAL_ROOT] diff --git a/test_libs/pyspec/tests/test_sanity.py b/test_libs/pyspec/tests/test_sanity.py deleted file mode 100644 index 1b4d20f4c5..0000000000 --- a/test_libs/pyspec/tests/test_sanity.py +++ /dev/null @@ -1,438 +0,0 @@ -from copy import deepcopy - -import pytest - -from py_ecc import bls -import eth2spec.phase0.spec as spec - -from eth2spec.utils.minimal_ssz import signing_root -from eth2spec.phase0.spec import ( - # constants - ZERO_HASH, - SLOTS_PER_HISTORICAL_ROOT, - # SSZ - Deposit, - Transfer, - VoluntaryExit, - # functions - get_active_validator_indices, - get_beacon_proposer_index, - get_block_root_at_slot, - get_current_epoch, - get_domain, - advance_slot, - cache_state, - verify_merkle_branch, - hash, -) -from eth2spec.phase0.state_transition import ( - state_transition, -) -from eth2spec.utils.merkle_minimal import ( - calc_merkle_tree_from_leaves, - get_merkle_proof, - get_merkle_root, -) -from .helpers import ( - get_balance, - build_deposit_data, - build_empty_block_for_next_slot, - fill_aggregate_attestation, - get_state_root, - get_valid_attestation, - get_valid_attester_slashing, - get_valid_proposer_slashing, - next_slot, - privkeys, - pubkeys, -) - - -# mark entire file as 'sanity' -pytestmark = pytest.mark.sanity - - -def test_slot_transition(state): - test_state = deepcopy(state) - cache_state(test_state) - advance_slot(test_state) - assert test_state.slot == state.slot + 1 - assert get_state_root(test_state, state.slot) == state.hash_tree_root() - return test_state - - -def test_empty_block_transition(state): - test_state = deepcopy(state) - - block = build_empty_block_for_next_slot(test_state) - state_transition(test_state, block) - - assert len(test_state.eth1_data_votes) == len(state.eth1_data_votes) + 1 - assert get_block_root_at_slot(test_state, state.slot) == block.previous_block_root - - return state, [block], test_state - - -def test_skipped_slots(state): - test_state = deepcopy(state) - block = build_empty_block_for_next_slot(test_state) - block.slot += 3 - - state_transition(test_state, block) - - assert test_state.slot == block.slot - for slot in range(state.slot, test_state.slot): - assert get_block_root_at_slot(test_state, slot) == block.previous_block_root - - return state, [block], test_state - - -def test_empty_epoch_transition(state): - test_state = deepcopy(state) - block = build_empty_block_for_next_slot(test_state) - block.slot += spec.SLOTS_PER_EPOCH - - state_transition(test_state, block) - - assert test_state.slot == block.slot - for slot in range(state.slot, test_state.slot): - assert get_block_root_at_slot(test_state, slot) == block.previous_block_root - - return state, [block], test_state - - -def test_empty_epoch_transition_not_finalizing(state): - test_state = deepcopy(state) - block = build_empty_block_for_next_slot(test_state) - block.slot += spec.SLOTS_PER_EPOCH * 5 - - state_transition(test_state, block) - - assert test_state.slot == block.slot - assert test_state.finalized_epoch < get_current_epoch(test_state) - 4 - for index in range(len(test_state.validator_registry)): - assert get_balance(test_state, index) < get_balance(state, index) - - return state, [block], test_state - - -def test_proposer_slashing(state): - test_state = deepcopy(state) - proposer_slashing = get_valid_proposer_slashing(state) - validator_index = proposer_slashing.proposer_index - - # - # Add to state via block 
transition - # - block = build_empty_block_for_next_slot(test_state) - block.body.proposer_slashings.append(proposer_slashing) - state_transition(test_state, block) - - assert not state.validator_registry[validator_index].slashed - - slashed_validator = test_state.validator_registry[validator_index] - assert slashed_validator.slashed - assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH - assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH - # lost whistleblower reward - assert get_balance(test_state, validator_index) < get_balance(state, validator_index) - - return state, [block], test_state - - -def test_attester_slashing(state): - test_state = deepcopy(state) - attester_slashing = get_valid_attester_slashing(state) - validator_index = attester_slashing.attestation_1.custody_bit_0_indices[0] - - # - # Add to state via block transition - # - block = build_empty_block_for_next_slot(test_state) - block.body.attester_slashings.append(attester_slashing) - state_transition(test_state, block) - - assert not state.validator_registry[validator_index].slashed - - slashed_validator = test_state.validator_registry[validator_index] - assert slashed_validator.slashed - assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH - assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH - # lost whistleblower reward - assert get_balance(test_state, validator_index) < get_balance(state, validator_index) - - proposer_index = get_beacon_proposer_index(test_state) - # gained whistleblower reward - assert ( - get_balance(test_state, proposer_index) > - get_balance(state, proposer_index) - ) - - return state, [block], test_state - - -def test_deposit_in_block(state): - pre_state = deepcopy(state) - test_deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - index = len(test_deposit_data_leaves) - pubkey = pubkeys[index] - privkey = privkeys[index] - deposit_data = build_deposit_data(pre_state, pubkey, privkey, spec.MAX_EFFECTIVE_BALANCE) - - item = deposit_data.hash_tree_root() - test_deposit_data_leaves.append(item) - tree = calc_merkle_tree_from_leaves(tuple(test_deposit_data_leaves)) - root = get_merkle_root((tuple(test_deposit_data_leaves))) - proof = list(get_merkle_proof(tree, item_index=index)) - assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root) - - deposit = Deposit( - proof=list(proof), - index=index, - data=deposit_data, - ) - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = len(test_deposit_data_leaves) - post_state = deepcopy(pre_state) - block = build_empty_block_for_next_slot(post_state) - block.body.deposits.append(deposit) - - state_transition(post_state, block) - assert len(post_state.validator_registry) == len(state.validator_registry) + 1 - assert len(post_state.balances) == len(state.balances) + 1 - assert get_balance(post_state, index) == spec.MAX_EFFECTIVE_BALANCE - assert post_state.validator_registry[index].pubkey == pubkeys[index] - - return pre_state, [block], post_state - - -def test_deposit_top_up(state): - pre_state = deepcopy(state) - test_deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry) - - validator_index = 0 - amount = spec.MAX_EFFECTIVE_BALANCE // 4 - pubkey = pubkeys[validator_index] - privkey = privkeys[validator_index] - deposit_data = build_deposit_data(pre_state, pubkey, privkey, amount) - - merkle_index = len(test_deposit_data_leaves) - item = deposit_data.hash_tree_root() - test_deposit_data_leaves.append(item) - 
tree = calc_merkle_tree_from_leaves(tuple(test_deposit_data_leaves)) - root = get_merkle_root((tuple(test_deposit_data_leaves))) - proof = list(get_merkle_proof(tree, item_index=merkle_index)) - assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, merkle_index, root) - - deposit = Deposit( - proof=list(proof), - index=merkle_index, - data=deposit_data, - ) - - pre_state.latest_eth1_data.deposit_root = root - pre_state.latest_eth1_data.deposit_count = len(test_deposit_data_leaves) - block = build_empty_block_for_next_slot(pre_state) - block.body.deposits.append(deposit) - - pre_balance = get_balance(pre_state, validator_index) - post_state = deepcopy(pre_state) - state_transition(post_state, block) - assert len(post_state.validator_registry) == len(pre_state.validator_registry) - assert len(post_state.balances) == len(pre_state.balances) - assert get_balance(post_state, validator_index) == pre_balance + amount - - return pre_state, [block], post_state - - -def test_attestation(state): - state.slot = spec.SLOTS_PER_EPOCH - test_state = deepcopy(state) - attestation = get_valid_attestation(state) - - # - # Add to state via block transition - # - attestation_block = build_empty_block_for_next_slot(test_state) - attestation_block.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY - attestation_block.body.attestations.append(attestation) - state_transition(test_state, attestation_block) - - assert len(test_state.current_epoch_attestations) == len(state.current_epoch_attestations) + 1 - - - # - # Epoch transition should move to previous_epoch_attestations - # - pre_current_epoch_attestations = deepcopy(test_state.current_epoch_attestations) - - epoch_block = build_empty_block_for_next_slot(test_state) - epoch_block.slot += spec.SLOTS_PER_EPOCH - state_transition(test_state, epoch_block) - - assert len(test_state.current_epoch_attestations) == 0 - assert test_state.previous_epoch_attestations == pre_current_epoch_attestations - - return state, [attestation_block, epoch_block], test_state - - -def test_voluntary_exit(state): - pre_state = deepcopy(state) - validator_index = get_active_validator_indices( - pre_state, - get_current_epoch(pre_state) - )[-1] - - # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit - pre_state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH - - post_state = deepcopy(pre_state) - - voluntary_exit = VoluntaryExit( - epoch=get_current_epoch(pre_state), - validator_index=validator_index, - ) - voluntary_exit.signature = bls.sign( - message_hash=signing_root(voluntary_exit), - privkey=privkeys[validator_index], - domain=get_domain( - state=pre_state, - domain_type=spec.DOMAIN_VOLUNTARY_EXIT, - ) - ) - - # - # Add to state via block transition - # - initiate_exit_block = build_empty_block_for_next_slot(post_state) - initiate_exit_block.body.voluntary_exits.append(voluntary_exit) - state_transition(post_state, initiate_exit_block) - - assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH - - # - # Process within epoch transition - # - exit_block = build_empty_block_for_next_slot(post_state) - exit_block.slot += spec.SLOTS_PER_EPOCH - state_transition(post_state, exit_block) - - assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH - - return pre_state, [initiate_exit_block, exit_block], post_state - - -def test_transfer(state): - # overwrite default 0 to test - spec.MAX_TRANSFERS = 1 - - pre_state = deepcopy(state) - current_epoch = 
get_current_epoch(pre_state) - sender_index = get_active_validator_indices(pre_state, current_epoch)[-1] - recipient_index = get_active_validator_indices(pre_state, current_epoch)[0] - transfer_pubkey = pubkeys[-1] - transfer_privkey = privkeys[-1] - amount = get_balance(pre_state, sender_index) - pre_transfer_recipient_balance = get_balance(pre_state, recipient_index) - transfer = Transfer( - sender=sender_index, - recipient=recipient_index, - amount=amount, - fee=0, - slot=pre_state.slot + 1, - pubkey=transfer_pubkey, - ) - transfer.signature = bls.sign( - message_hash=signing_root(transfer), - privkey=transfer_privkey, - domain=get_domain( - state=pre_state, - domain_type=spec.DOMAIN_TRANSFER, - ) - ) - - # ensure withdrawal_credentials reproducable - pre_state.validator_registry[sender_index].withdrawal_credentials = ( - spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(transfer_pubkey)[1:] - ) - # un-activate so validator can transfer - pre_state.validator_registry[sender_index].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH - - post_state = deepcopy(pre_state) - # - # Add to state via block transition - # - block = build_empty_block_for_next_slot(post_state) - block.body.transfers.append(transfer) - state_transition(post_state, block) - - sender_balance = get_balance(post_state, sender_index) - recipient_balance = get_balance(post_state, recipient_index) - assert sender_balance == 0 - assert recipient_balance == pre_transfer_recipient_balance + amount - - return pre_state, [block], post_state - - -def test_balance_driven_status_transitions(state): - current_epoch = get_current_epoch(state) - validator_index = get_active_validator_indices(state, current_epoch)[-1] - - assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH - - # set validator balance to below ejection threshold - state.validator_registry[validator_index].effective_balance = spec.EJECTION_BALANCE - - post_state = deepcopy(state) - # - # trigger epoch transition - # - block = build_empty_block_for_next_slot(post_state) - block.slot += spec.SLOTS_PER_EPOCH - state_transition(post_state, block) - - assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH - - return state, [block], post_state - - -def test_historical_batch(state): - state.slot += spec.SLOTS_PER_HISTORICAL_ROOT - (state.slot % spec.SLOTS_PER_HISTORICAL_ROOT) - 1 - - post_state = deepcopy(state) - - block = build_empty_block_for_next_slot(post_state) - - state_transition(post_state, block) - - assert post_state.slot == block.slot - assert get_current_epoch(post_state) % (spec.SLOTS_PER_HISTORICAL_ROOT // spec.SLOTS_PER_EPOCH) == 0 - assert len(post_state.historical_roots) == len(state.historical_roots) + 1 - - return state, [block], post_state - - -def test_eth1_data_votes(state): - post_state = deepcopy(state) - - expected_votes = 0 - assert len(state.eth1_data_votes) == expected_votes - - blocks = [] - for _ in range(spec.SLOTS_PER_ETH1_VOTING_PERIOD - 1): - block = build_empty_block_for_next_slot(post_state) - state_transition(post_state, block) - expected_votes += 1 - assert len(post_state.eth1_data_votes) == expected_votes - blocks.append(block) - - block = build_empty_block_for_next_slot(post_state) - state_transition(post_state, block) - blocks.append(block) - - assert post_state.slot % spec.SLOTS_PER_ETH1_VOTING_PERIOD == 0 - assert len(post_state.eth1_data_votes) == 1 - - return state, blocks, post_state
pex-tool__pex-822
Release 2.0.3 On the docket: + [x] Pex should trust any host passed via `--index` or `--find-links`. #812 + [x] A cache should always be used by `pex.resolver.resolve`. #809 + [x] Use the resolve cache to skip installs. #815 + [x] Parallelize resolve. #818 + [x] Cache sdist & local project builds #817 + [x] Unify resolve and runtime wheel caches. #820
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.2'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.3'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 80e4b80d7..42a5e1252 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,26 @@ Release Notes ============= +2.0.3 +----- + +This release fixes a regression in handling explicitly requested `--index` or +`--find-links` http (insecure) repos. In addition, performance of the pex 2.x +resolver is brought in line with the 1.x resolver in all cases and improved in +most cases. + +* Unify PEX buildtime and runtime wheel caches. #821 + `PR #821 <https://github.com/pantsbuild/pex/pull/821>`_ + +* Parallelize resolve. (#819) + `PR #819 <https://github.com/pantsbuild/pex/pull/819>`_ + +* Use the resolve cache to skip installs. (#815) + `PR #815 <https://github.com/pantsbuild/pex/pull/815>`_ + +* Implicitly trust explicitly requested repos. (#813) + `PR #813 <https://github.com/pantsbuild/pex/pull/813>`_ + 2.0.2 ----- diff --git a/pex/version.py b/pex/version.py index a698f9d1f..f77cc369d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.0.2' +__version__ = '2.0.3'
mabel-dev__opteryx-1641
🪲 Python 3.9 tests stalling ### Thank you for taking the time to report a problem with Opteryx. _To help us to respond to your request we ask that you try to provide the below detail about the bug._ **Describe the bug** _A clear and specific description of what the bug is. What the error, incorrect or unexpected behaviour was._ **Expected behaviour** _A clear and concise description of what you expected to happen._ **Sample Code/Statement** _If you can, please submit the SQL statement or Python code snippet, or a representative example using the sample datasets._ ~~~sql ~~~ **Additional context** _Add any other context about the problem here, for example what you have done to try to diagnose or workaround the problem._
[ { "content": "__build__ = 477\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 15\n_revision = 0\n_status = VersionStatus.BETA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py" } ]
[ { "content": "__build__ = 482\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 15\n_revision = 0\n_status = VersionStatus.BETA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py" } ]
diff --git a/.github/workflows/regression_suite.yaml b/.github/workflows/regression_suite.yaml index ef2373725..dac54768e 100644 --- a/.github/workflows/regression_suite.yaml +++ b/.github/workflows/regression_suite.yaml @@ -11,7 +11,7 @@ jobs: max-parallel: 4 fail-fast: false matrix: - python-version: ['3.9', '3.10', '3.11', '3.12'] + python-version: ['3.10', '3.11', '3.12'] os: [ubuntu-latest] runs-on: ${{ matrix.os }} steps: diff --git a/.github/workflows/regression_suite_mac_x86.yaml b/.github/workflows/regression_suite_mac_x86.yaml index ecde26602..d50d52d62 100644 --- a/.github/workflows/regression_suite_mac_x86.yaml +++ b/.github/workflows/regression_suite_mac_x86.yaml @@ -8,7 +8,7 @@ jobs: strategy: max-parallel: 4 matrix: - python-version: ['3.9', '3.10', '3.11'] + python-version: ['3.10', '3.11'] os: ['macos-latest'] runs-on: ${{ matrix.os }} steps: diff --git a/.github/workflows/regression_suite_windows.yaml b/.github/workflows/regression_suite_windows.yaml index 488f85140..f5e13c483 100644 --- a/.github/workflows/regression_suite_windows.yaml +++ b/.github/workflows/regression_suite_windows.yaml @@ -8,7 +8,7 @@ jobs: strategy: max-parallel: 4 matrix: - python-version: ['3.9', '3.10', '3.11'] + python-version: ['3.10', '3.11'] os: ['windows-latest'] runs-on: ${{ matrix.os }} steps: diff --git a/opteryx/__version__.py b/opteryx/__version__.py index ff5d4c87b..25008bb66 100644 --- a/opteryx/__version__.py +++ b/opteryx/__version__.py @@ -1,4 +1,4 @@ -__build__ = 477 +__build__ = 482 # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. diff --git a/tests/misc/test_documentation.py b/tests/misc/test_documentation.py index fbc138621..e7ae5d67a 100644 --- a/tests/misc/test_documentation.py +++ b/tests/misc/test_documentation.py @@ -8,8 +8,10 @@ sys.path.insert(1, os.path.join(sys.path[0], "../..")) from tests.tools import download_file +from tests.tools import is_version, skip_if +@skip_if(is_version("3.9")) def test_documentation_connect_example(): import opteryx @@ -24,6 +26,7 @@ def test_documentation_connect_example(): conn.close() +@skip_if(is_version("3.9")) def test_readme_1(): import opteryx @@ -31,6 +34,7 @@ def test_readme_1(): result.head() +@skip_if(is_version("3.9")) def test_readme_2(): import pandas @@ -44,6 +48,7 @@ def test_readme_2(): aggregated_df.head() +@skip_if(is_version("3.9")) def test_readme_3(): import opteryx @@ -57,6 +62,7 @@ def test_readme_3(): result.head() +@skip_if(is_version("3.9")) def test_readme_4(): import opteryx from opteryx.connectors import GcpCloudStorageConnector @@ -68,6 +74,7 @@ def test_readme_4(): result.head() +@skip_if(is_version("3.9")) def test_readme_5(): import opteryx from opteryx.connectors import SqlConnector @@ -90,12 +97,14 @@ def test_readme_5(): result.head() +@skip_if(is_version("3.9")) def test_get_started(): import opteryx result = opteryx.query("SELECT * FROM $planets;").arrow() +@skip_if(is_version("3.9")) def test_python_client(): import opteryx @@ -136,6 +145,7 @@ def test_python_client(): ).fetchall() +@skip_if(is_version("3.9")) def test_pandas_integration_input(): import pandas @@ -155,12 +165,14 @@ def test_pandas_integration_input(): results = opteryx.query("SELECT * FROM nephews").arrow() +@skip_if(is_version("3.9")) def test_pandas_integration_output(): import opteryx dataframe = opteryx.query("SELECT * FROM $planets").pandas() +@skip_if(is_version("3.9")) def test_polars_integration_input(): import polars @@ -180,12 +192,14 @@ def 
test_polars_integration_input(): results = opteryx.query("SELECT * FROM nephews").arrow() +@skip_if(is_version("3.9")) def test_polars_integration_output(): import opteryx dataframe = opteryx.query("SELECT * FROM $planets").polars() +@skip_if(is_version("3.9")) def test_permissions_example(): import opteryx @@ -200,6 +214,7 @@ def test_permissions_example(): print("User does not have permission to execute this query") +@skip_if(is_version("3.9")) def test_role_based_permissions(): import opteryx @@ -226,6 +241,7 @@ def get_user_permissions(user_roles): assert perms == {"Query", "Execute", "Analyze"} +@skip_if(is_version("3.9")) def test_membership_permissions(): import opteryx diff --git a/tests/plan_optimization/test_fold_all_constants.py b/tests/plan_optimization/test_fold_all_constants.py index 67591ab63..61124b475 100644 --- a/tests/plan_optimization/test_fold_all_constants.py +++ b/tests/plan_optimization/test_fold_all_constants.py @@ -5,8 +5,10 @@ import numpy import opteryx +from tests.tools import is_version, skip_if +@skip_if(is_version("3.9")) def test_we_dont_fold_random(): SQL = "SELECT random() AS r FROM GENERATE_SERIES(5000) AS g"
pex-tool__pex-910
Release 2.1.5 On the docket: + [x] Kill `Pip.spawn_install_wheel` `overwrite` arg. #907 + [x] Silence pip warnings about Python 2.7. #908
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.4'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.5'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 8d19aed0e..80c591fe1 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,18 @@ Release Notes ============= +2.1.5 +----- + +* Silence pip warnings about Python 2.7. (#908) + `PR #908 <https://github.com/pantsbuild/pexpull/908>`_ + +* Kill `Pip.spawn_install_wheel` `overwrite` arg. (#907) + `PR #907 <https://github.com/pantsbuild/pexpull/907>`_ + +* Show pex-root from env as default in help output (#901) + `PR #901 <https://github.com/pantsbuild/pexpull/901>`_ + 2.1.4 ----- diff --git a/pex/version.py b/pex/version.py index 3860dbc2e..3e0c53016 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.1.4' +__version__ = '2.1.5'
mabel-dev__opteryx-1695
✨ Memory Pool Optimizations ### Thanks for stopping by to let us know something could be better! **Is your feature request related to a problem? Please describe.** _A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]_ **Describe the solution you'd like** _A clear and concise description of what you want to happen._ **Describe alternatives you've considered** _A clear and concise description of any alternative solutions or features you've considered._ **Additional context** _Add any other context or screenshots about the feature request here._
[ { "content": "__build__ = 527\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 16\n_revision = 0\n_status = VersionStatus.ALPHA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py" } ]
[ { "content": "__build__ = 532\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 16\n_revision = 0\n_status = VersionStatus.ALPHA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py" } ]
diff --git a/opteryx/__version__.py b/opteryx/__version__.py index a2313e19e..f9db5e2cc 100644 --- a/opteryx/__version__.py +++ b/opteryx/__version__.py @@ -1,4 +1,4 @@ -__build__ = 527 +__build__ = 532 # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. diff --git a/opteryx/compiled/structures/memory_pool.pyx b/opteryx/compiled/structures/memory_pool.pyx index 538b79869..449ff55d7 100644 --- a/opteryx/compiled/structures/memory_pool.pyx +++ b/opteryx/compiled/structures/memory_pool.pyx @@ -1,6 +1,10 @@ # cython: language_level=3 # cython: nonecheck=False # cython: cdivision=True +# cython: initializedcheck=False +# cython: infer_types=True +# cython: wraparound=True +# cython: boundscheck=False from libc.stdlib cimport malloc, free from libc.string cimport memcpy @@ -81,24 +85,26 @@ cdef class MemoryPool: cdef vector[MemorySegment] sorted_segments self.l1_compaction += 1 - i = 1 n = len(self.free_segments) + if n <= 1: + return - sorted_segments = sorted(self.free_segments, key=lambda x: x["start"]) - new_free_segments = [sorted_segments[0]] + # Sort the free segments by start attribute + self.free_segments = sorted(self.free_segments, key=lambda x: x["start"]) + new_free_segments = [self.free_segments[0]] - for segment in sorted_segments[1:]: + for segment in self.free_segments[1:]: last_segment = new_free_segments[-1] if last_segment.start + last_segment.length == segment.start: # If adjacent, merge by extending the last segment - last_segment.length += segment.length - new_free_segments[-1] = last_segment + new_free_segments[-1] = MemorySegment(last_segment.start, last_segment.length + segment.length) else: # If not adjacent, just add the segment to the new list new_free_segments.append(segment) self.free_segments = new_free_segments + def _level2_compaction(self): """ Aggressively compacts by pushing all free memory to the end (Level 2 compaction). 
@@ -134,7 +140,6 @@ cdef class MemoryPool: # special case for 0 byte segments if len_data == 0: new_segment = MemorySegment(0, 0) - ref_id = random_int() self.used_segments[ref_id] = new_segment self.commits += 1 return ref_id @@ -179,7 +184,7 @@ cdef class MemoryPool: segment = self.used_segments[ref_id] if zero_copy != 0: - raw_data = <char[:segment.length]> char_ptr + raw_data = <char[:segment.length]> (char_ptr + segment.start) data = memoryview(raw_data) # Create a memoryview from the raw data else: data = PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length) @@ -188,7 +193,6 @@ cdef class MemoryPool: raise ValueError("Invalid reference ID.") post_read_segment = self.used_segments[ref_id] if post_read_segment.start != segment.start or post_read_segment.length != segment.length: - with self.lock: self.read_locks += 1 @@ -197,11 +201,10 @@ cdef class MemoryPool: segment = self.used_segments[ref_id] if zero_copy != 0: - raw_data = <char[:segment.length]> char_ptr + raw_data = <char[:segment.length]> (char_ptr + segment.start) data = memoryview(raw_data) # Create a memoryview from the raw data else: return PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length) - return data def read_and_release(self, long ref_id, int zero_copy = 1): @@ -219,7 +222,7 @@ cdef class MemoryPool: self.free_segments.push_back(segment) if zero_copy != 0: - raw_data = <char[:segment.length]> char_ptr + raw_data = <char[:segment.length]> (char_ptr + segment.start) return memoryview(raw_data) # Create a memoryview from the raw data else: return PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length) diff --git a/tests/misc/test_memory_pool.py b/tests/misc/test_memory_pool.py index a8f32a88d..b6dcfcb67 100644 --- a/tests/misc/test_memory_pool.py +++ b/tests/misc/test_memory_pool.py @@ -73,7 +73,8 @@ def test_compaction(): mp.release(ref1) ref3 = mp.commit(b"Third") # Ensure that the third commit succeeds after compaction, despite the first segment being released - assert mp.read(ref3, False) == b"Third" + data = mp.read(ref3, False) + assert data == b"Third" def test_multiple_commits_and_reads(): @@ -94,6 +95,119 @@ def test_overlapping_writes(): assert mp.read(ref2, False) == b"abcde" assert mp.read(ref3, False) == b"XYZ" +def test_overlapping_writes_memcopy(): + mp = MemoryPool(size=20) + ref1 = mp.commit(b"12345") + ref2 = mp.commit(b"abcde") + mp.release(ref1) + ref3 = mp.commit(b"XYZ") + # Test if the new write overlaps correctly and does not corrupt other data + r2_memcopy = bytes(mp.read(ref2, True)) + r2_no_memcopy = mp.read(ref2, False) + r3_memcopy = bytes(mp.read(ref3, True)) + r3_no_memcopy = mp.read(ref3, False) + + assert r2_memcopy == r2_no_memcopy == b"abcde", f"{r2_memcopy} / {r2_no_memcopy} / abcde" + assert r3_memcopy == r3_no_memcopy == b"XYZ", f"{r3_memcopy} / {r3_no_memcopy} / XYZ" + +def test_zero_copy_vs_copy_reads(): + mp = MemoryPool(size=30) + + # Initial commits + ref1 = mp.commit(b"12345") + ref2 = mp.commit(b"abcde") + ref3 = mp.commit(b"ABCDE") + + # Release one segment to create free space + mp.release(ref1) + + # Commit more data to fill the pool + ref4 = mp.commit(b"XYZ") + ref5 = mp.commit(b"7890") + + # Additional activity + ref6 = mp.commit(b"LMNOP") + mp.release(ref3) + ref7 = mp.commit(b"qrst") + mp.release(ref2) + ref8 = mp.commit(b"uvwxyz") + + # Reading segments with and without zero-copy + r4_memcopy = bytes(mp.read(ref4, True)) + r4_no_memcopy = mp.read(ref4, False) + r5_memcopy = bytes(mp.read(ref5, True)) + r5_no_memcopy = 
mp.read(ref5, False) + r6_memcopy = bytes(mp.read(ref6, True)) + r6_no_memcopy = mp.read(ref6, False) + r7_memcopy = bytes(mp.read(ref7, True)) + r7_no_memcopy = mp.read(ref7, False) + r8_memcopy = bytes(mp.read(ref8, True)) + r8_no_memcopy = mp.read(ref8, False) + + assert r4_memcopy == r4_no_memcopy == b"XYZ", f"{r4_memcopy} / {r4_no_memcopy} / XYZ" + assert r5_memcopy == r5_no_memcopy == b"7890", f"{r5_memcopy} / {r5_no_memcopy} / 7890" + assert r6_memcopy == r6_no_memcopy == b"LMNOP", f"{r6_memcopy} / {r6_no_memcopy} / LMNOP" + assert r7_memcopy == r7_no_memcopy == b"qrst", f"{r7_memcopy} / {r7_no_memcopy} / qrst" + assert r8_memcopy == r8_no_memcopy == b"uvwxyz", f"{r8_memcopy} / {r8_no_memcopy} / uvwxyz" + + +def test_zero_copy_vs_copy_reads_and_release(): + mp = MemoryPool(size=30) + + # Initial commits + ref1 = mp.commit(b"12345") + ref2 = mp.commit(b"abcde") + ref3 = mp.commit(b"ABCDE") + + # Release one segment to create free space + mp.release(ref1) + + # Commit more data to fill the pool + ref4 = mp.commit(b"XYZ") + ref5 = mp.commit(b"7890") + + # Additional activity + ref6 = mp.commit(b"LMNOP") + mp.release(ref3) + ref7 = mp.commit(b"qrst") + mp.release(ref2) + ref8 = mp.commit(b"uvwxyz") + + # Reading segments with and without zero-copy, alternating read and read_and_release + # read no zero copy, release zero copy + r4_read_no_memcopy = bytes(mp.read(ref4, False)) + r4_release_memcopy = bytes(mp.read_and_release(ref4, True)) + + # read zero copy, release no zero copy + r5_read_memcopy = bytes(mp.read(ref5, True)) + r5_release_no_memcopy = bytes(mp.read_and_release(ref5, False)) + + # read zero copy, release zero copy + r6_read_memcopy = bytes(mp.read(ref6, True)) + r6_release_memcopy = bytes(mp.read_and_release(ref6, True)) + + # read no zero copy, release no zero copy + r7_read_no_memcopy = bytes(mp.read(ref7, False)) + r7_release_no_memcopy = bytes(mp.read_and_release(ref7, False)) + + # read zero copy, release zero copy + r8_read_memcopy = bytes(mp.read(ref8, True)) + r8_release_memcopy = bytes(mp.read_and_release(ref8, True)) + + assert r4_read_no_memcopy == r4_release_memcopy == b"XYZ", f"{r4_read_no_memcopy} / {r4_release_memcopy} / XYZ" + assert r5_read_memcopy == r5_release_no_memcopy == b"7890", f"{r5_read_memcopy} / {r5_release_no_memcopy} / 7890" + assert r6_read_memcopy == r6_release_memcopy == b"LMNOP", f"{r6_read_memcopy} / {r6_release_memcopy} / LMNOP" + assert r7_read_no_memcopy == r7_release_no_memcopy == b"qrst", f"{r7_read_no_memcopy} / {r7_release_no_memcopy} / qrst" + assert r8_read_memcopy == r8_release_memcopy == b"uvwxyz", f"{r8_read_memcopy} / {r8_release_memcopy} / uvwxyz" + + # Ensure that the segments are released and available for new commits + ref9 = mp.commit(b"newdata") + r9_memcopy = bytes(mp.read(ref9, True)) + r9_no_memcopy = mp.read(ref9, False) + + assert r9_memcopy == r9_no_memcopy == b"newdata", f"{r9_memcopy} / {r9_no_memcopy} / newdata" + + def test_pool_exhaustion_and_compaction(): mp = MemoryPool(size=20) @@ -375,5 +489,5 @@ def test_return_types(): if __name__ == "__main__": # pragma: no cover from tests.tools import run_tests - + test_compaction_effectiveness() run_tests() diff --git a/tests/tools.py b/tests/tools.py index 19afe1ce4..8788fa848 100644 --- a/tests/tools.py +++ b/tests/tools.py @@ -137,7 +137,7 @@ def run_tests(): for index, method in enumerate(test_methods): start_time = time.monotonic_ns() test_name = f"\033[38;2;255;184;108m{(index + 1):04}\033[0m \033[38;2;189;147;249m{str(method.__name__)}\033[0m" - 
print(test_name.ljust(display_width - 20), end="") + print(test_name.ljust(display_width - 20), end="", flush=True) error = None output = "" try:
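The level-1 compaction change in the memory-pool diff above boils down to merging free segments that sit next to each other in the pool so that larger commits can succeed. Below is a minimal stand-alone sketch of that idea in plain Python rather than the project's Cython; the `Segment` dataclass and `merge_adjacent` function are hypothetical names introduced only for this illustration, not part of the opteryx codebase.

```python
# Illustrative sketch of "level 1" compaction: merge free segments that are
# adjacent in the pool. Hypothetical stand-alone Python, not opteryx code.
from dataclasses import dataclass


@dataclass
class Segment:
    start: int
    length: int


def merge_adjacent(free_segments):
    """Return a new free list where touching segments are merged."""
    if len(free_segments) <= 1:
        return list(free_segments)
    # Sorting by start is what makes a single forward pass sufficient.
    ordered = sorted(free_segments, key=lambda s: s.start)
    merged = [ordered[0]]
    for seg in ordered[1:]:
        last = merged[-1]
        if last.start + last.length == seg.start:
            # Adjacent: replace the last entry with one longer segment.
            merged[-1] = Segment(last.start, last.length + seg.length)
        else:
            merged.append(seg)
    return merged


# Example: three fragments, two of which touch.
print(merge_adjacent([Segment(10, 5), Segment(0, 10), Segment(20, 4)]))
# -> [Segment(start=0, length=15), Segment(start=20, length=4)]
```

Note that the merged entry is built as a fresh `Segment` rather than mutated in place, mirroring the fix in the diff where extending the last segment by mutation did not stick.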
gammapy__gammapy-5151
Defaults for `methods` in the signature of `SafeMaskMaker.__init__` are confusing
```
def __init__(
    self,
    methods=("aeff-default",),
    aeff_percent=10,
    bias_percent=10,
    position=None,
    fixed_offset=None,
    offset_max="3 deg",
    irfs="DL4",
):
```
In the signature of `SafeMaskMaker`, the `methods` argument defaults to `("aeff-default",)`, which is confusing because the trailing comma is necessary for the code to work. If one omits the comma while intending a tuple, the later instruction `set(methods)` gives back `{'-', 'a', 'd', 'e', 'f', 'l', 't', 'u'}`. To make it less confusing, it would be good to change the tuple to a list, for which `set(["aeff-default"])` gives back `{'aeff-default'}`.
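A quick standalone sketch of the pitfall described in this report (plain Python, not Gammapy code; the variable names are illustrative only): without the trailing comma the parentheses do nothing, so the "tuple" is really a string, and `set()` iterates it character by character.

```
# Illustration of the trailing-comma pitfall with a string default.
methods_tuple = ("aeff-default",)   # trailing comma -> a one-element tuple
methods_string = ("aeff-default")   # no comma -> just the str "aeff-default"
methods_list = ["aeff-default"]     # a list needs no trailing comma

print(set(methods_tuple))   # {'aeff-default'}
print(set(methods_string))  # {'-', 'a', 'd', 'e', 'f', 'l', 't', 'u'}  (characters of the string)
print(set(methods_list))    # {'aeff-default'}
```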
[ { "content": "# Licensed under a 3-clause BSD style license - see LICENSE.rst\nimport logging\nimport numpy as np\nfrom astropy import units as u\nfrom astropy.coordinates import Angle\nfrom gammapy.irf import EDispKernelMap\nfrom gammapy.maps import Map\nfrom gammapy.modeling.models import TemplateSpectralModel\nfrom .core import Maker\n\n__all__ = [\"SafeMaskMaker\"]\n\n\nlog = logging.getLogger(__name__)\n\n\nclass SafeMaskMaker(Maker):\n \"\"\"Make safe data range mask for a given observation.\n\n .. warning::\n\n Currently some methods computing a safe energy range (\"aeff-default\",\n \"aeff-max\" and \"edisp-bias\") determine a true energy range and apply\n it to reconstructed energy, effectively neglecting the energy dispersion.\n\n Parameters\n ----------\n methods : {\"aeff-default\", \"aeff-max\", \"edisp-bias\", \"offset-max\", \"bkg-peak\"}\n Method to use for the safe energy range. Can be a\n list with a combination of those. Resulting masks\n are combined with logical `and`. \"aeff-default\"\n uses the energy ranged specified in the DL3 data\n files, if available.\n aeff_percent : float\n Percentage of the maximal effective area to be used\n as lower energy threshold for method \"aeff-max\".\n bias_percent : float\n Percentage of the energy bias to be used as lower\n energy threshold for method \"edisp-bias\".\n position : `~astropy.coordinates.SkyCoord`\n Position at which the `aeff_percent` or `bias_percent` are computed.\n fixed_offset : `~astropy.coordinates.Angle`\n Offset, calculated from the pointing position, at which\n the `aeff_percent` or `bias_percent` are computed.\n If neither the position nor fixed_offset is specified,\n it uses the position of the center of the map by default.\n offset_max : str or `~astropy.units.Quantity`\n Maximum offset cut.\n irfs : {\"DL4\", \"DL3\"}\n Whether to use reprojected (\"DL4\") or raw (\"DL3\") irfs.\n Default is \"DL4\".\n \"\"\"\n\n tag = \"SafeMaskMaker\"\n available_methods = {\n \"aeff-default\",\n \"aeff-max\",\n \"edisp-bias\",\n \"offset-max\",\n \"bkg-peak\",\n }\n\n def __init__(\n self,\n methods=(\"aeff-default\",),\n aeff_percent=10,\n bias_percent=10,\n position=None,\n fixed_offset=None,\n offset_max=\"3 deg\",\n irfs=\"DL4\",\n ):\n methods = set(methods)\n\n if not methods.issubset(self.available_methods):\n difference = methods.difference(self.available_methods)\n raise ValueError(f\"{difference} is not a valid method.\")\n\n self.methods = methods\n self.aeff_percent = aeff_percent\n self.bias_percent = bias_percent\n self.position = position\n self.fixed_offset = fixed_offset\n self.offset_max = Angle(offset_max)\n if self.position and self.fixed_offset:\n raise ValueError(\n \"`position` and `fixed_offset` attributes are mutually exclusive\"\n )\n\n if irfs not in [\"DL3\", \"DL4\"]:\n ValueError(\n \"Invalid option for irfs: expected 'DL3' or 'DL4', got {irfs} instead.\"\n )\n self.irfs = irfs\n\n def make_mask_offset_max(self, dataset, observation):\n \"\"\"Make maximum offset mask.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Maximum offset mask.\n \"\"\"\n if observation is None:\n raise ValueError(\"Method 'offset-max' requires an observation object.\")\n\n separation = dataset._geom.separation(\n observation.get_pointing_icrs(observation.tmid)\n )\n return separation < 
self.offset_max\n\n @staticmethod\n def make_mask_energy_aeff_default(dataset, observation):\n \"\"\"Make safe energy mask from aeff default.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n if observation is None:\n raise ValueError(\"Method 'aeff-default' requires an observation object.\")\n\n energy_max = observation.aeff.meta.get(\"HI_THRES\", None)\n\n if energy_max:\n energy_max = energy_max * u.TeV\n else:\n log.warning(\n f\"No default upper safe energy threshold defined for obs {observation.obs_id}\"\n )\n\n energy_min = observation.aeff.meta.get(\"LO_THRES\", None)\n\n if energy_min:\n energy_min = energy_min * u.TeV\n else:\n log.warning(\n f\"No default lower safe energy threshold defined for obs {observation.obs_id}\"\n )\n\n return dataset._geom.energy_mask(energy_min=energy_min, energy_max=energy_max)\n\n def _get_offset(self, observation):\n offset = self.fixed_offset\n if offset is None:\n if self.position:\n offset = observation.get_pointing_icrs(observation.tmid).separation(\n self.position\n )\n else:\n offset = 0.0 * u.deg\n return offset\n\n def _get_position(self, observation, geom):\n if self.fixed_offset is not None and observation is not None:\n pointing = observation.get_pointing_icrs(observation.tmid)\n return pointing.directional_offset_by(\n position_angle=0 * u.deg, separation=self.fixed_offset\n )\n elif self.position is not None:\n return self.position\n else:\n return geom.center_skydir\n\n def make_mask_energy_aeff_max(self, dataset, observation=None):\n \"\"\"Make safe energy mask from effective area maximum value.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for. It is a mandatory argument when fixed_offset is set.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n\n if self.fixed_offset is not None and observation is None:\n raise ValueError(\n f\"{observation} argument is mandatory with {self.fixed_offset}\"\n )\n\n geom, exposure = dataset._geom, dataset.exposure\n\n if self.irfs == \"DL3\":\n offset = self._get_offset(observation)\n\n values = observation.aeff.evaluate(\n offset=offset, energy_true=observation.aeff.axes[\"energy_true\"].edges\n )\n valid = observation.aeff.axes[\"energy_true\"].edges[\n values > self.aeff_percent * np.max(values) / 100\n ]\n energy_min = np.min(valid)\n\n else:\n position = self._get_position(observation, geom)\n\n aeff = exposure.get_spectrum(position) / exposure.meta[\"livetime\"]\n if not np.any(aeff.data > 0.0):\n log.warning(\n f\"Effective area is all zero at [{position.to_string('dms')}]. 
\"\n f\"No safe energy band can be defined for the dataset '{dataset.name}': \"\n \"setting `mask_safe` to all False.\"\n )\n return Map.from_geom(geom, data=False, dtype=\"bool\")\n\n model = TemplateSpectralModel.from_region_map(aeff)\n\n energy_true = model.energy\n energy_min = energy_true[np.where(model.values > 0)[0][0]]\n energy_max = energy_true[-1]\n\n aeff_thres = (self.aeff_percent / 100) * aeff.quantity.max()\n inversion = model.inverse(\n aeff_thres, energy_min=energy_min, energy_max=energy_max\n )\n\n if not np.isnan(inversion[0]):\n energy_min = inversion[0]\n\n return geom.energy_mask(energy_min=energy_min)\n\n def make_mask_energy_edisp_bias(self, dataset, observation=None):\n \"\"\"Make safe energy mask from energy dispersion bias.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for. It is a mandatory argument when fixed_offset is set.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n\n if self.fixed_offset is not None and observation is None:\n raise ValueError(\n f\"{observation} argument is mandatory with {self.fixed_offset}\"\n )\n\n edisp, geom = dataset.edisp, dataset._geom\n\n if self.irfs == \"DL3\":\n offset = self._get_offset(observation)\n edisp = observation.edisp.to_edisp_kernel(offset)\n else:\n kwargs = dict()\n kwargs[\"position\"] = self._get_position(observation, geom)\n if not isinstance(edisp, EDispKernelMap):\n kwargs[\"energy_axis\"] = dataset._geom.axes[\"energy\"]\n edisp = edisp.get_edisp_kernel(**kwargs)\n energy_min = edisp.get_bias_energy(self.bias_percent / 100)[0]\n return geom.energy_mask(energy_min=energy_min)\n\n def make_mask_energy_bkg_peak(self, dataset, observation=None):\n \"\"\"Make safe energy mask based on the binned background.\n\n The energy threshold is defined as the lower edge of the energy\n bin with the highest predicted background rate. This is to ensure analysis in\n a region where a Powerlaw approximation to the background spectrum is valid.\n The is motivated by its use in the HESS DL3\n validation paper: https://arxiv.org/pdf/1910.08088.pdf\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation: `~gammapy.data.Observation`\n Observation to compute mask for. 
It is a mandatory argument when DL3 irfs are used.\n\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n geom = dataset._geom\n if self.irfs == \"DL3\":\n bkg = observation.bkg.to_2d()\n background_spectrum = np.ravel(\n bkg.integral(axis_name=\"offset\", offset=bkg.axes[\"offset\"].bounds[1])\n )\n energy_axis = bkg.axes[\"energy\"]\n else:\n background_spectrum = dataset.npred_background().get_spectrum()\n energy_axis = geom.axes[\"energy\"]\n\n idx = np.argmax(background_spectrum.data, axis=0)\n return geom.energy_mask(energy_min=energy_axis.edges[idx])\n\n @staticmethod\n def make_mask_bkg_invalid(dataset):\n \"\"\"Mask non-finite values and zeros values in background maps.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n bkg = dataset.background.data\n mask = np.isfinite(bkg)\n\n if not dataset.stat_type == \"wstat\":\n mask &= bkg > 0.0\n\n return mask\n\n def run(self, dataset, observation=None):\n \"\"\"Make safe data range mask.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n dataset : `Dataset`\n Dataset with defined safe range mask.\n \"\"\"\n\n if self.irfs == \"DL3\":\n if observation is None:\n raise ValueError(\"observation argument is mandatory with DL3 irfs\")\n\n if dataset.mask_safe:\n mask_safe = dataset.mask_safe.data\n else:\n mask_safe = np.ones(dataset._geom.data_shape, dtype=bool)\n\n if dataset.background is not None:\n # apply it first so only clipped values are removed for \"bkg-peak\"\n mask_safe &= self.make_mask_bkg_invalid(dataset)\n\n if \"offset-max\" in self.methods:\n mask_safe &= self.make_mask_offset_max(dataset, observation)\n\n if \"aeff-default\" in self.methods:\n mask_safe &= self.make_mask_energy_aeff_default(dataset, observation)\n\n if \"aeff-max\" in self.methods:\n mask_safe &= self.make_mask_energy_aeff_max(dataset, observation)\n\n if \"edisp-bias\" in self.methods:\n mask_safe &= self.make_mask_energy_edisp_bias(dataset, observation)\n\n if \"bkg-peak\" in self.methods:\n mask_safe &= self.make_mask_energy_bkg_peak(dataset, observation)\n\n dataset.mask_safe = Map.from_geom(dataset._geom, data=mask_safe, dtype=bool)\n return dataset\n", "path": "gammapy/makers/safe.py" } ]
[ { "content": "# Licensed under a 3-clause BSD style license - see LICENSE.rst\nimport logging\nimport numpy as np\nfrom astropy import units as u\nfrom astropy.coordinates import Angle\nfrom gammapy.irf import EDispKernelMap\nfrom gammapy.maps import Map\nfrom gammapy.modeling.models import TemplateSpectralModel\nfrom .core import Maker\n\n__all__ = [\"SafeMaskMaker\"]\n\n\nlog = logging.getLogger(__name__)\n\n\nclass SafeMaskMaker(Maker):\n \"\"\"Make safe data range mask for a given observation.\n\n .. warning::\n\n Currently some methods computing a safe energy range (\"aeff-default\",\n \"aeff-max\" and \"edisp-bias\") determine a true energy range and apply\n it to reconstructed energy, effectively neglecting the energy dispersion.\n\n Parameters\n ----------\n methods : {\"aeff-default\", \"aeff-max\", \"edisp-bias\", \"offset-max\", \"bkg-peak\"}\n Method to use for the safe energy range. Can be a\n list with a combination of those. Resulting masks\n are combined with logical `and`. \"aeff-default\"\n uses the energy ranged specified in the DL3 data\n files, if available.\n aeff_percent : float\n Percentage of the maximal effective area to be used\n as lower energy threshold for method \"aeff-max\".\n bias_percent : float\n Percentage of the energy bias to be used as lower\n energy threshold for method \"edisp-bias\".\n position : `~astropy.coordinates.SkyCoord`\n Position at which the `aeff_percent` or `bias_percent` are computed.\n fixed_offset : `~astropy.coordinates.Angle`\n Offset, calculated from the pointing position, at which\n the `aeff_percent` or `bias_percent` are computed.\n If neither the position nor fixed_offset is specified,\n it uses the position of the center of the map by default.\n offset_max : str or `~astropy.units.Quantity`\n Maximum offset cut.\n irfs : {\"DL4\", \"DL3\"}\n Whether to use reprojected (\"DL4\") or raw (\"DL3\") irfs.\n Default is \"DL4\".\n \"\"\"\n\n tag = \"SafeMaskMaker\"\n available_methods = {\n \"aeff-default\",\n \"aeff-max\",\n \"edisp-bias\",\n \"offset-max\",\n \"bkg-peak\",\n }\n\n def __init__(\n self,\n methods=[\"aeff-default\"],\n aeff_percent=10,\n bias_percent=10,\n position=None,\n fixed_offset=None,\n offset_max=\"3 deg\",\n irfs=\"DL4\",\n ):\n methods = set(methods)\n\n if not methods.issubset(self.available_methods):\n difference = methods.difference(self.available_methods)\n raise ValueError(f\"{difference} is not a valid method.\")\n\n self.methods = methods\n self.aeff_percent = aeff_percent\n self.bias_percent = bias_percent\n self.position = position\n self.fixed_offset = fixed_offset\n self.offset_max = Angle(offset_max)\n if self.position and self.fixed_offset:\n raise ValueError(\n \"`position` and `fixed_offset` attributes are mutually exclusive\"\n )\n\n if irfs not in [\"DL3\", \"DL4\"]:\n ValueError(\n \"Invalid option for irfs: expected 'DL3' or 'DL4', got {irfs} instead.\"\n )\n self.irfs = irfs\n\n def make_mask_offset_max(self, dataset, observation):\n \"\"\"Make maximum offset mask.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Maximum offset mask.\n \"\"\"\n if observation is None:\n raise ValueError(\"Method 'offset-max' requires an observation object.\")\n\n separation = dataset._geom.separation(\n observation.get_pointing_icrs(observation.tmid)\n )\n return separation < 
self.offset_max\n\n @staticmethod\n def make_mask_energy_aeff_default(dataset, observation):\n \"\"\"Make safe energy mask from aeff default.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n if observation is None:\n raise ValueError(\"Method 'aeff-default' requires an observation object.\")\n\n energy_max = observation.aeff.meta.get(\"HI_THRES\", None)\n\n if energy_max:\n energy_max = energy_max * u.TeV\n else:\n log.warning(\n f\"No default upper safe energy threshold defined for obs {observation.obs_id}\"\n )\n\n energy_min = observation.aeff.meta.get(\"LO_THRES\", None)\n\n if energy_min:\n energy_min = energy_min * u.TeV\n else:\n log.warning(\n f\"No default lower safe energy threshold defined for obs {observation.obs_id}\"\n )\n\n return dataset._geom.energy_mask(energy_min=energy_min, energy_max=energy_max)\n\n def _get_offset(self, observation):\n offset = self.fixed_offset\n if offset is None:\n if self.position:\n offset = observation.get_pointing_icrs(observation.tmid).separation(\n self.position\n )\n else:\n offset = 0.0 * u.deg\n return offset\n\n def _get_position(self, observation, geom):\n if self.fixed_offset is not None and observation is not None:\n pointing = observation.get_pointing_icrs(observation.tmid)\n return pointing.directional_offset_by(\n position_angle=0 * u.deg, separation=self.fixed_offset\n )\n elif self.position is not None:\n return self.position\n else:\n return geom.center_skydir\n\n def make_mask_energy_aeff_max(self, dataset, observation=None):\n \"\"\"Make safe energy mask from effective area maximum value.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for. It is a mandatory argument when fixed_offset is set.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n\n if self.fixed_offset is not None and observation is None:\n raise ValueError(\n f\"{observation} argument is mandatory with {self.fixed_offset}\"\n )\n\n geom, exposure = dataset._geom, dataset.exposure\n\n if self.irfs == \"DL3\":\n offset = self._get_offset(observation)\n\n values = observation.aeff.evaluate(\n offset=offset, energy_true=observation.aeff.axes[\"energy_true\"].edges\n )\n valid = observation.aeff.axes[\"energy_true\"].edges[\n values > self.aeff_percent * np.max(values) / 100\n ]\n energy_min = np.min(valid)\n\n else:\n position = self._get_position(observation, geom)\n\n aeff = exposure.get_spectrum(position) / exposure.meta[\"livetime\"]\n if not np.any(aeff.data > 0.0):\n log.warning(\n f\"Effective area is all zero at [{position.to_string('dms')}]. 
\"\n f\"No safe energy band can be defined for the dataset '{dataset.name}': \"\n \"setting `mask_safe` to all False.\"\n )\n return Map.from_geom(geom, data=False, dtype=\"bool\")\n\n model = TemplateSpectralModel.from_region_map(aeff)\n\n energy_true = model.energy\n energy_min = energy_true[np.where(model.values > 0)[0][0]]\n energy_max = energy_true[-1]\n\n aeff_thres = (self.aeff_percent / 100) * aeff.quantity.max()\n inversion = model.inverse(\n aeff_thres, energy_min=energy_min, energy_max=energy_max\n )\n\n if not np.isnan(inversion[0]):\n energy_min = inversion[0]\n\n return geom.energy_mask(energy_min=energy_min)\n\n def make_mask_energy_edisp_bias(self, dataset, observation=None):\n \"\"\"Make safe energy mask from energy dispersion bias.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for. It is a mandatory argument when fixed_offset is set.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n\n if self.fixed_offset is not None and observation is None:\n raise ValueError(\n f\"{observation} argument is mandatory with {self.fixed_offset}\"\n )\n\n edisp, geom = dataset.edisp, dataset._geom\n\n if self.irfs == \"DL3\":\n offset = self._get_offset(observation)\n edisp = observation.edisp.to_edisp_kernel(offset)\n else:\n kwargs = dict()\n kwargs[\"position\"] = self._get_position(observation, geom)\n if not isinstance(edisp, EDispKernelMap):\n kwargs[\"energy_axis\"] = dataset._geom.axes[\"energy\"]\n edisp = edisp.get_edisp_kernel(**kwargs)\n energy_min = edisp.get_bias_energy(self.bias_percent / 100)[0]\n return geom.energy_mask(energy_min=energy_min)\n\n def make_mask_energy_bkg_peak(self, dataset, observation=None):\n \"\"\"Make safe energy mask based on the binned background.\n\n The energy threshold is defined as the lower edge of the energy\n bin with the highest predicted background rate. This is to ensure analysis in\n a region where a Powerlaw approximation to the background spectrum is valid.\n The is motivated by its use in the HESS DL3\n validation paper: https://arxiv.org/pdf/1910.08088.pdf\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation: `~gammapy.data.Observation`\n Observation to compute mask for. 
It is a mandatory argument when DL3 irfs are used.\n\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n geom = dataset._geom\n if self.irfs == \"DL3\":\n bkg = observation.bkg.to_2d()\n background_spectrum = np.ravel(\n bkg.integral(axis_name=\"offset\", offset=bkg.axes[\"offset\"].bounds[1])\n )\n energy_axis = bkg.axes[\"energy\"]\n else:\n background_spectrum = dataset.npred_background().get_spectrum()\n energy_axis = geom.axes[\"energy\"]\n\n idx = np.argmax(background_spectrum.data, axis=0)\n return geom.energy_mask(energy_min=energy_axis.edges[idx])\n\n @staticmethod\n def make_mask_bkg_invalid(dataset):\n \"\"\"Mask non-finite values and zeros values in background maps.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n\n Returns\n -------\n mask_safe : `~numpy.ndarray`\n Safe data range mask.\n \"\"\"\n bkg = dataset.background.data\n mask = np.isfinite(bkg)\n\n if not dataset.stat_type == \"wstat\":\n mask &= bkg > 0.0\n\n return mask\n\n def run(self, dataset, observation=None):\n \"\"\"Make safe data range mask.\n\n Parameters\n ----------\n dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset`\n Dataset to compute mask for.\n observation : `~gammapy.data.Observation`\n Observation to compute mask for.\n\n Returns\n -------\n dataset : `Dataset`\n Dataset with defined safe range mask.\n \"\"\"\n\n if self.irfs == \"DL3\":\n if observation is None:\n raise ValueError(\"observation argument is mandatory with DL3 irfs\")\n\n if dataset.mask_safe:\n mask_safe = dataset.mask_safe.data\n else:\n mask_safe = np.ones(dataset._geom.data_shape, dtype=bool)\n\n if dataset.background is not None:\n # apply it first so only clipped values are removed for \"bkg-peak\"\n mask_safe &= self.make_mask_bkg_invalid(dataset)\n\n if \"offset-max\" in self.methods:\n mask_safe &= self.make_mask_offset_max(dataset, observation)\n\n if \"aeff-default\" in self.methods:\n mask_safe &= self.make_mask_energy_aeff_default(dataset, observation)\n\n if \"aeff-max\" in self.methods:\n mask_safe &= self.make_mask_energy_aeff_max(dataset, observation)\n\n if \"edisp-bias\" in self.methods:\n mask_safe &= self.make_mask_energy_edisp_bias(dataset, observation)\n\n if \"bkg-peak\" in self.methods:\n mask_safe &= self.make_mask_energy_bkg_peak(dataset, observation)\n\n dataset.mask_safe = Map.from_geom(dataset._geom, data=mask_safe, dtype=bool)\n return dataset\n", "path": "gammapy/makers/safe.py" } ]
diff --git a/gammapy/makers/safe.py b/gammapy/makers/safe.py index 09c381c9f1..73d9b60b8c 100644 --- a/gammapy/makers/safe.py +++ b/gammapy/makers/safe.py @@ -62,7 +62,7 @@ class SafeMaskMaker(Maker): def __init__( self, - methods=("aeff-default",), + methods=["aeff-default"], aeff_percent=10, bias_percent=10, position=None,
pex-tool__pex-797
Release 2.0.1 On the docket: + [x] pex --index-url=... fails in 2.0.0 #794
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.0'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.1'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 2b97c531c..52bfe2f01 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.0.1 +----- + +This is a htofix release that fixes a bug when specifying a custom index +(`-i`/`--index`/`--index-url`) via the CLI. + +* Fix #794 issue by add missing return statement in __str__ (#795) + `PR #795 <https://github.com/pantsbuild/pex/pull/795>`_ + 2.0.0 ----- diff --git a/pex/version.py b/pex/version.py index 772804dc4..7d8716be3 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.0.0' +__version__ = '2.0.1'
pex-tool__pex-891
Release 2.1.3 On the docket: + [x] Error eagerly if an interpreter binary doesn't exist #886 + [x] The pip-powered resolve in pex 2 will re-tokenize --find-links pages on each transitive requirement #887
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.2'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.3'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index ea14e891f..a8a961167 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,23 @@ Release Notes ============= +2.1.3 +----- + +This release fixes a performance regression in which pip +would re-tokenize --find-links pages unnecessarily. +The parsed pages are now cached in a pip patch that has +also been submitted upstream. + +* Revendor pip (#890) + `PR #890 <https://github.com/pantsbuild/pex/pull/890>`_ + +* Add a clear_cache() method to PythonInterpreter. (#885) + `PR #885 <https://github.com/pantsbuild/pex/pull/885>`_ + +* Error eagerly if an interpreter binary doesn't exist. (#886) + `PR #886 <https://github.com/pantsbuild/pex/pull/886>`_ + 2.1.2 ----- diff --git a/pex/version.py b/pex/version.py index b75078c50..393c14b99 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.1.2' +__version__ = '2.1.3'
pex-tool__pex-836
Release 2.1.0 On the docket: The prime motivator: + [x] Pex does not download foreign abi3 wheels correctly #823 Changes to support the above as well as others: + [x] Fix pex resolving for foreign platforms. #835 + [x] Use pypa/packaging. #831 + [x] Upgrade vendored setuptools to 42.0.2. #832 + [x] De-vendor pex just once per version. #833 + [x] Support VCS urls for vendoring. #834 + [x] Support python 3.8 in CI. #829 + [x] Fix pex resolution to respect --ignore-errors. #828 + [x] Kill `pkg_resources` finders monkey-patching. #827 + [x] Use flit to distribute pex. #826 + [x] Cleanup extras_require. #825
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.3'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.0'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 42a5e1252..330d89a06 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,49 @@ Release Notes ============= +2.1.0 +----- + +This release restores and improves support for building and running +multiplatform pexes. Foreign `linux*` platform builds now include +`manylinux2014` compatible wheels by default and foreign CPython pexes now +resolve `abi3` wheels correctly. In addition, error messages at both buildtime +and runtime related to resolution of dependencies are more informative. + +Pex 2.1.0 should be considered the first Pex 2-series release that fully +replaces and improves upon Pex 1-series functionality. + +* Fix pex resolving for foreign platforms. (#835) + `PR #835 <https://github.com/pantsbuild/pex/pull/835>`_ + +* Use pypa/packaging. (#831) + `PR #831 <https://github.com/pantsbuild/pex/pull/831>`_ + +* Upgrade vendored setuptools to 42.0.2. (#832) + `PR #832 <https://github.com/pantsbuild/pex/pull/832>`_ + `PR #1830 <https://github.com/pypa/setuptools/pull/1830>`_ + +* De-vendor pex just once per version. (#833) + `PR #833 <https://github.com/pantsbuild/pex/pull/833>`_ + +* Support VCS urls for vendoring. (#834) + `PR #834 <https://github.com/pantsbuild/pex/pull/834>`_ + +* Support python 3.8 in CI. (#829) + `PR #829 <https://github.com/pantsbuild/pex/pull/829>`_ + +* Fix pex resolution to respect --ignore-errors. (#828) + `PR #828 <https://github.com/pantsbuild/pex/pull/828>`_ + +* Kill `pkg_resources` finders monkey-patching. (#827) + `PR #827 <https://github.com/pantsbuild/pex/pull/827>`_ + +* Use flit to distribute pex. (#826) + `PR #826 <https://github.com/pantsbuild/pex/pull/826>`_ + +* Cleanup extras_require. (#825) + `PR #825 <https://github.com/pantsbuild/pex/pull/825>`_ + 2.0.3 ----- diff --git a/pex/version.py b/pex/version.py index f77cc369d..befdccbff 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.0.3' +__version__ = '2.1.0'
pex-tool__pex-945
Release 2.1.8 On the docket: + [x] Cache pip.pex. #937 + [x] Ensure the interpreter path is a file #938 + [x] Support an unzip toggle for PEXes. #939 + [x] Better support unzip mode PEXes. #941
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.7'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.8'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 199e0d5db..b306c05b0 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,27 @@ Release Notes ============= +2.1.8 +----- + +This release brings enhanced performance when using the Pex CLI or API to resolve requirements and +improved performance for many PEXed applications when specifying the `--unzip` option. PEXes built +with `--unzip` will first unzip themselves into the Pex cache if not unzipped there already and +then re-execute themselves from there. This can improve startup latency. Pex itself now uses this +mode in our [PEX release](https://github.com/pantsbuild/pex/releases/download/v2.1.8/pex). + +* Better support unzip mode PEXes. (#941) + `PR #941 <https://github.com/pantsbuild/pex/pull/941>`_ + +* Support an unzip toggle for PEXes. (#939) + `PR #939 <https://github.com/pantsbuild/pex/pull/939>`_ + +* Ensure the interpreter path is a file (#938) + `PR #938 <https://github.com/pantsbuild/pex/pull/938>`_ + +* Cache pip.pex. (#937) + `PR #937 <https://github.com/pantsbuild/pex/pull/937>`_ + 2.1.7 ----- diff --git a/pex/version.py b/pex/version.py index 930579c1a..b24b6e806 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.1.7' +__version__ = '2.1.8'
fossasia__open-event-server-2429
Add filter "Checked In" and "Not checked in" to attendees In order to easily sort "checked in" and "not checked-in" it would be good to have the relevant options in the filters next to "completed, pending, expired". ![screenshot from 2016-08-23 08 33 47](https://cloud.githubusercontent.com/assets/1583873/17882481/fa7ee8d6-690c-11e6-8a08-5177c6e070f1.png)
[ { "content": "# -*- coding: utf-8 -*-\n\n##\n# Module for helper static variables\n##\n\n# Event Licences\n\nEVENT_LICENCES = {\n # Licence Name : ( Long Name, Description, Licence URL, Licence Logo, Licence Compact Logo )\n 'All rights reserved': (\n 'All rights reserved',\n u'The copyright holder reserves, or holds for their own use, all the rights provided by copyright law under one specific copyright treaty.',\n 'https://en.wikipedia.org/wiki/All_rights_reserved',\n '',\n ''),\n 'Attribution': (\n 'Creative Commons Attribution 4.0 International License',\n u'This license lets others distribute, remix, tweak, and build upon the work, even commercially, as long as they credit the copyright holder for the original creation.',\n 'https://creativecommons.org/licenses/by/4.0',\n 'https://licensebuttons.net/l/by/3.0/88x31.png',\n 'https://licensebuttons.net/l/by/3.0/80x15.png'),\n 'Attribution-ShareAlike': (\n 'Creative Commons Attribution-ShareAlike 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work even for commercial purposes, as long as they credit the copyright holder and license their new creations under the identical terms.',\n 'https://creativecommons.org/licenses/by-sa/4.0',\n 'https://licensebuttons.net/l/by-sa/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-sa/3.0/80x15.png'),\n 'Attribution-NoDerivs': (\n 'Creative Commons Attribution-NoDerivs 4.0 International License',\n u'This license allows for redistribution, commercial and non-commercial, as long as it is passed along unchanged and in whole, with credit to the copyright holder.',\n 'https://creativecommons.org/licenses/by-nd/4.0',\n 'https://licensebuttons.net/l/by-nd/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nd/3.0/80x15.png'),\n 'Attribution-NonCommercial': (\n 'Creative Commons Attribution-NonCommercial 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work non-commercially, and although their new works must also acknowledge the copyright holder and be non-commercial, they don’t have to license their derivative works on the same terms.',\n 'https://creativecommons.org/licenses/by-nc/4.0',\n 'https://licensebuttons.net/l/by-nc/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc/3.0/80x15.png'),\n 'Attribution-NonCommercial-NoDerivs': (\n 'Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License',\n u'This license only allows others to download the work and share them with others as long as they credit the copyright holder, but they can’t change them in any way or use them commercially.',\n 'https://creativecommons.org/licenses/by-nc-nd/4.0',\n 'https://licensebuttons.net/l/by-nc-nd/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc-nd/3.0/80x15.png'),\n 'Attribution-NonCommercial-ShareAlike': (\n 'Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work non-commercially, as long as they credit the copyright holder and license their new creations under the identical terms.',\n 'https://creativecommons.org/licenses/by-nc-sa/4.0',\n 'https://licensebuttons.net/l/by-nc-sa/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc-sa/3.0/80x15.png'),\n 'Public Domain Dedication (CC0)': (\n 'Creative Commons Public Domain Dedication (CC0)',\n u'The copyright holder waives his interest in his work and places the work as completely as possible in the public domain so others may freely exploit and use the work without 
restriction under copyright or database law.',\n 'https://creativecommons.org/publicdomain/zero/1.0/',\n 'http://i.creativecommons.org/p/zero/1.0/88x31.png',\n 'http://i.creativecommons.org/p/zero/1.0/80x15.png'),\n 'Public Domain Work': (\n 'Creative Commons Public Domain Work',\n u'This license enables works that are no longer restricted by copyright to be marked as such in a standard and simple way, making them easily discoverable and available to others.',\n 'https://creativecommons.org/publicdomain/mark/1.0/',\n 'https://licensebuttons.net/p/mark/1.0/88x31.png',\n 'https://licensebuttons.net/p/mark/1.0/80x15.png')\n}\n\n# Event Topics with sub topics\n\nEVENT_TOPICS = {\n 'Auto, Boat & Air': ['Air', 'Auto', 'Boat', 'Motorcycle/ATV', 'Other'],\n 'Business & Professional': [\n 'Career', 'Design', 'Educators', 'Environment &amp; Sustainability',\n 'Finance', 'Media', 'Non Profit &amp; NGOs', 'Other', 'Real Estate',\n 'Sales &amp; Marketing', 'Startups &amp; Small Business'\n ],\n 'Charity & Causes': [\n 'Animal Welfare', 'Disaster Relief', 'Education',\n 'Environment', 'Healthcare', 'Human Rights',\n 'International Aid', 'Other', 'Poverty'\n ],\n 'Community & Culture': [\n 'City/Town', 'County', 'Heritage', 'LGBT', 'Language',\n 'Medieval', 'Nationality', 'Other', 'Renaissance', 'State'\n ],\n 'Family & Education': [\n 'Alumni', 'Baby', 'Children &amp; Youth', 'Education', 'Other',\n 'Parenting', 'Parents Association', 'Reunion'\n ],\n 'Fashion & Beauty': [\n 'Accessories', 'Beauty', 'Bridal', 'Fashion', 'Other'\n ],\n 'Film, Media & Entertainment': [\n 'Adult', 'Anime', 'Comedy', 'Comics', 'Film', 'Gaming', 'Other', 'TV'\n ],\n 'Food & Drink': [\"Beer\", \"Food\", \"Other\", \"Spirits\", \"Wine\"],\n 'Government & Politics': [\n \"County/Municipal Government\", \"Democratic Party\", \"Federal Government\",\n \"Non-partisan\", \"Other\", \"Other Party\", \"Republican Party\",\n \"State Government\"\n ],\n 'Health & Wellness': [\n \"Medical\", \"Mental health\", \"Other\", \"Personal health\", \"Spa\", \"Yoga\"\n ],\n 'Hobbies & Special Interest': [\n \"Adult\", \"Anime/Comics\", \"Books\", \"DIY\", \"Drawing & Painting\", \"Gaming\",\n \"Knitting\", \"Other\", \"Photography\"\n ],\n 'Home & Lifestyle': [\"Dating\", \"Home & Garden\", \"Other\", \"Pets & Animals\"],\n 'Music': [\n \"Alternative\", \"Blues & Jazz\", \"Classical\", \"Country\", \"Cultural\",\n \"EDM / Electronic\", \"Folk\", \"Hip Hop / Rap\", \"Indie\", \"Latin\", \"Metal\",\n \"Opera\", \"Other\", \"Pop\", \"R&B\", \"Reggae\", \"Religious/Spiritual\", \"Rock\",\n \"Top 40\"\n ],\n 'Other': [],\n 'Performing & Visual Arts': [\n \"Ballet\", \"Comedy\", \"Craft\", \"Dance\", \"Fine Art\", \"Literary Arts\",\n \"Musical\", \"Opera\", \"Orchestra\", \"Other\", \"Theatre\"\n ],\n 'Religion & Spirituality': [\n \"Buddhism\", \"Christianity\", \"Eastern Religion\", \"Islam\", \"Judaism\",\n \"Mormonism\", \"Mysticism and Occult\", \"New Age\", \"Other\", \"Sikhism\"\n ],\n 'Science & Technology': [\n \"Biotech\", \"High Tech\", \"Medicine\", \"Mobile\", \"Other\", \"Robotics\",\n \"Science\", \"Social Media\"\n ],\n 'Seasonal & Holiday': [\n \"Channukah\", \"Christmas\", \"Easter\", \"Fall events\", \"Halloween/Haunt\",\n \"Independence Day\", \"New Years Eve\", \"Other\", \"St Patricks Day\",\n \"Thanksgiving\"\n ],\n 'Sports & Fitness': [\n \"Baseball\", \"Basketball\", \"Cycling\", \"Exercise\", \"Fighting & Martial Arts\",\n \"Football\", \"Golf\", \"Hockey\", \"Motorsports\", \"Mountain Biking\",\n \"Obstacles\", \"Other\", 
\"Rugby\", \"Running\", \"Snow Sports\", \"Soccer\",\n \"Swimming & Water Sports\", \"Tennis\", \"Volleyball\", \"Walking\", \"Yoga\"\n ],\n 'Travel & Outdoor': [\n \"Canoeing\", \"Climbing\", \"Hiking\", \"Kayaking\", \"Other\", \"Rafting\", \"Travel\"\n ]\n}\nPAYMENT_COUNTRIES = {\n 'United States',\n 'Argentina',\n 'Australia',\n 'Austria',\n 'Belgium',\n 'Brazil',\n 'Canada',\n 'Cyprus',\n 'Czech Republic',\n 'Denmark',\n 'Estonia',\n 'Finland',\n 'France',\n 'Germany',\n 'Greece',\n 'Hong Kong',\n 'Hungary',\n 'Ireland',\n 'Israel',\n 'Italy',\n 'Japan',\n 'Latvia',\n 'Lithuania',\n 'Luxemborg',\n 'Malaysia',\n 'Malta',\n 'Mexico',\n 'Netherlands',\n 'New Zealand',\n 'Norway',\n 'Philippines',\n 'Poland',\n 'Portugal',\n 'Singapore',\n 'Slovakia',\n 'Slovenia',\n 'Spain',\n 'Sweden',\n 'Switzerland',\n 'Taiwan',\n 'United Kingdom',\n}\n\n# (currency_code,available_on_paypal,available_on_stripe)\nPAYMENT_CURRENCIES = {\n ('AUD', True, True),\n ('BRL', True, True),\n ('CAD', True, True),\n ('CHF', True, True),\n ('CZK', True, True),\n ('DKK', True, True),\n ('EUR', True, True),\n ('GBP', True, True),\n ('HKD', True, True),\n ('HUF', True, True),\n ('ILS', True, True),\n ('INR', False, True),\n ('JPY', True, True),\n ('MXN', True, True),\n ('MYR', True, True),\n ('NOK', True, True),\n ('NZD', True, True),\n ('PHP', True, True),\n ('PLN', True, True),\n ('RUB', True, True),\n ('SEK', True, True),\n ('SGD', True, True),\n ('THB', True, True),\n ('TWD', True, True),\n\n}\n\n# Event Images with Event Topics and Subtopics\n\nDEFAULT_EVENT_IMAGES = {\n 'Auto, Boat & Air': 'auto.jpg',\n 'Air': 'auto.jpg',\n 'Auto': 'auto.jpg',\n 'Boat': 'auto.jpg',\n 'Motorcycle/ATV': 'auto.jpg',\n 'Business & Professional': 'business.jpg',\n 'Career': 'business.jpg',\n 'Design': 'business.jpg',\n 'Educators': 'business.jpg',\n 'Environment &amp; Sustainability': 'business.jpg',\n 'Finance': 'business.jpg',\n 'Media': 'business.jpg',\n 'Non Profit &amp; NGOs': 'business.jpg',\n 'Real Estate': 'business.jpg',\n 'Sales &amp; Marketing': 'business.jpg',\n 'Startups &amp; Small Business': 'business.jpg',\n 'Charity & Causes': 'charity.jpg',\n 'Animal Welfare': 'charity.jpg',\n 'Disaster Relief': 'charity.jpg',\n 'Education': 'charity.jpg',\n 'Environment': 'charity.jpg',\n 'Healthcare': 'charity.jpg',\n 'Human Rights': 'charity.jpg',\n 'International Aid': 'charity.jpg',\n 'Poverty': 'charity.jpg',\n 'Community & Culture': 'culture.jpg',\n 'City/Town': 'culture.jpg',\n 'County': 'culture.jpg',\n 'Heritage': 'culture.jpg',\n 'LGBT': 'culture.jpg',\n 'Language': 'culture.jpg',\n 'Medieval': 'culture.jpg',\n 'Nationality': 'culture.jpg',\n 'Renaissance': 'culture.jpg',\n 'State': 'culture.jpg',\n 'Family & Education': 'education.jpg',\n 'Alumni': 'education.jpg',\n 'Baby': 'education.jpg',\n 'Children &amp; Youth': 'education.jpg',\n 'Education': 'education.jpg',\n 'Parenting': 'education.jpg',\n 'Parents Association': 'education.jpg',\n 'Reunion': 'education.jpg',\n 'Fashion & Beauty': 'fashion.jpg',\n 'Accessories': 'fashion.jpg',\n 'Beauty': 'fashion.jpg',\n 'Bridal': 'fashion.jpg',\n 'Fashion': 'fashion.jpg',\n 'Film, Media & Entertainment': 'film.jpg',\n 'Adult': 'film.jpg',\n 'Anime': 'film.jpg',\n 'Comedy': 'film.jpg',\n 'Comics': 'film.jpg',\n 'Film': 'film.jpg',\n 'Gaming': 'film.jpg',\n 'TV': 'film.jpg',\n 'Food & Drink': 'food.jpg',\n \"Beer\": 'food.jpg',\n \"Food\": 'food.jpg',\n \"Spirits\": 'food.jpg',\n \"Wine\": 'food.jpg',\n 'Government & Politics': 'government.jpg',\n \"County/Municipal 
Government\": 'government.jpg',\n \"Democratic Party\": 'government.jpg',\n \"Federal Government\": 'government.jpg',\n \"Non-partisan\": 'government.jpg',\n \"Other Party\": 'government.jpg',\n \"Republican Party\": 'government.jpg',\n \"State Government\": 'government.jpg',\n 'Health & Wellness': 'health.jpg',\n 'Hobbies & Special Interest': 'hobbies.jpg',\n \"Adult\": 'hobbies.jpg',\n \"Anime/Comics\": 'hobbies.jpg',\n \"Books\": 'hobbies.jpg',\n \"DIY\": 'hobbies.jpg',\n \"Drawing & Painting\": 'hobbies.jpg',\n \"Gaming\": 'hobbies.jpg',\n \"Knitting\": 'hobbies.jpg',\n \"Photography\": 'hobbies.jpg',\n 'Home & Lifestyle': 'home.jpg',\n \"Dating\": 'home.jpg',\n \"Home & Garden\": 'home.jpg',\n \"Pets & Animals\": 'home.jpg',\n 'Music': 'music.jpg',\n \"Alternative\": 'music.jpg',\n \"Blues & Jazz\": 'music.jpg',\n \"Classical\": 'music.jpg',\n \"Country\": 'music.jpg',\n \"Cultural\": 'music.jpg',\n \"EDM / Electronic\": 'music.jpg',\n \"Folk\": 'music.jpg',\n \"Hip Hop / Rap\": 'music.jpg',\n \"Indie\": 'music.jpg',\n \"Latin\": 'music.jpg',\n \"Metal\": 'music.jpg',\n \"Opera\": 'music.jpg',\n \"Pop\": 'music.jpg',\n \"R&B\": 'music.jpg',\n \"Reggae\": 'music.jpg',\n \"Religious/Spiritual\": 'music.jpg',\n \"Rock\": 'music.jpg',\n \"Top 40\": 'music.jpg',\n 'Other': 'generic.jpg',\n 'Performing & Visual Arts': 'perform.jpg',\n \"Ballet\": 'perform.jpg',\n \"Comedy\": 'perform.jpg',\n \"Craft\": 'perform.jpg',\n \"Dance\": 'perform.jpg',\n \"Fine Art\": 'perform.jpg',\n \"Literary Arts\": 'perform.jpg',\n \"Musical\": 'perform.jpg',\n \"Opera\": 'perform.jpg',\n \"Orchestra\": 'perform.jpg',\n \"Theatre\": 'perform.jpg',\n 'Religion & Spirituality': 'spiritual.jpg',\n \"Buddhism\": 'spiritual.jpg',\n \"Christianity\": 'spiritual.jpg',\n \"Eastern Religion\": 'spiritual.jpg',\n \"Islam\": 'spiritual.jpg',\n \"Judaism\": 'spiritual.jpg',\n \"Mormonism\": 'spiritual.jpg',\n \"Mysticism and Occult\": 'spiritual.jpg',\n \"New Age\": 'spiritual.jpg',\n \"Sikhism\": 'spiritual.jpg',\n 'Science & Technology': 'science.jpg',\n \"Biotech\": 'science.jpg',\n \"High Tech\": 'science.jpg',\n \"Medicine\": 'science.jpg',\n \"Mobile\": 'science.jpg',\n \"Robotics\": 'science.jpg',\n \"Science\": 'science.jpg',\n \"Social Media\": 'science.jpg',\n 'Seasonal & Holiday': 'holiday.jpg',\n \"Channukah\": 'holiday.jpg',\n \"Christmas\": 'holiday.jpg',\n \"Easter\": 'holiday.jpg',\n \"Fall events\": 'holiday.jpg',\n \"Halloween/Haunt\": 'holiday.jpg',\n \"Independence Day\": 'holiday.jpg',\n \"New Years Eve\": 'holiday.jpg',\n \"St Patricks Day\": 'holiday.jpg',\n \"Thanksgiving\": 'holiday.jpg',\n 'Sports & Fitness': 'sport.jpg',\n \"Baseball\": 'sport.jpg',\n \"Basketball\": 'sport.jpg',\n \"Cycling\": 'sport.jpg',\n \"Exercise\": 'sport.jpg',\n \"Fighting & Martial Arts\": 'sport.jpg',\n \"Football\": 'sport.jpg',\n \"Golf\": 'sport.jpg',\n \"Hockey\": 'sport.jpg',\n \"Motorsports\": 'sport.jpg',\n \"Mountain Biking\": 'sport.jpg',\n \"Obstacles\": 'sport.jpg',\n \"Rugby\": 'sport.jpg',\n \"Running\": 'sport.jpg',\n \"Snow Sports\": 'sport.jpg',\n \"Soccer\": 'sport.jpg',\n \"Swimming & Water Sports\": 'sport.jpg',\n \"Tennis\": 'sport.jpg',\n \"Volleyball\": 'sport.jpg',\n \"Walking\": 'sport.jpg',\n \"Yoga\": 'sport.jpg',\n 'Travel & Outdoor': 'travel.jpg',\n \"Canoeing\": 'travel.jpg',\n \"Climbing\": 'travel.jpg',\n \"Hiking\": 'travel.jpg',\n \"Kayaking\": 'travel.jpg',\n \"Rafting\": 'travel.jpg',\n \"Travel\": 'travel.jpg',\n}\n", "path": "app/helpers/static.py" } ]
[ { "content": "# -*- coding: utf-8 -*-\n\n##\n# Module for helper static variables\n##\n\n# Event Licences\n\nEVENT_LICENCES = {\n # Licence Name : ( Long Name, Description, Licence URL, Licence Logo, Licence Compact Logo )\n 'All rights reserved': (\n 'All rights reserved',\n u'The copyright holder reserves, or holds for their own use, all the rights provided by copyright law under one specific copyright treaty.',\n 'https://en.wikipedia.org/wiki/All_rights_reserved',\n '',\n ''),\n 'Attribution': (\n 'Creative Commons Attribution 4.0 International License',\n u'This license lets others distribute, remix, tweak, and build upon the work, even commercially, as long as they credit the copyright holder for the original creation.',\n 'https://creativecommons.org/licenses/by/4.0',\n 'https://licensebuttons.net/l/by/3.0/88x31.png',\n 'https://licensebuttons.net/l/by/3.0/80x15.png'),\n 'Attribution-ShareAlike': (\n 'Creative Commons Attribution-ShareAlike 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work even for commercial purposes, as long as they credit the copyright holder and license their new creations under the identical terms.',\n 'https://creativecommons.org/licenses/by-sa/4.0',\n 'https://licensebuttons.net/l/by-sa/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-sa/3.0/80x15.png'),\n 'Attribution-NoDerivs': (\n 'Creative Commons Attribution-NoDerivs 4.0 International License',\n u'This license allows for redistribution, commercial and non-commercial, as long as it is passed along unchanged and in whole, with credit to the copyright holder.',\n 'https://creativecommons.org/licenses/by-nd/4.0',\n 'https://licensebuttons.net/l/by-nd/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nd/3.0/80x15.png'),\n 'Attribution-NonCommercial': (\n 'Creative Commons Attribution-NonCommercial 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work non-commercially, and although their new works must also acknowledge the copyright holder and be non-commercial, they don’t have to license their derivative works on the same terms.',\n 'https://creativecommons.org/licenses/by-nc/4.0',\n 'https://licensebuttons.net/l/by-nc/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc/3.0/80x15.png'),\n 'Attribution-NonCommercial-NoDerivs': (\n 'Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License',\n u'This license only allows others to download the work and share them with others as long as they credit the copyright holder, but they can’t change them in any way or use them commercially.',\n 'https://creativecommons.org/licenses/by-nc-nd/4.0',\n 'https://licensebuttons.net/l/by-nc-nd/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc-nd/3.0/80x15.png'),\n 'Attribution-NonCommercial-ShareAlike': (\n 'Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License',\n u'This license lets others remix, tweak, and build upon the work non-commercially, as long as they credit the copyright holder and license their new creations under the identical terms.',\n 'https://creativecommons.org/licenses/by-nc-sa/4.0',\n 'https://licensebuttons.net/l/by-nc-sa/3.0/88x31.png',\n 'https://licensebuttons.net/l/by-nc-sa/3.0/80x15.png'),\n 'Public Domain Dedication (CC0)': (\n 'Creative Commons Public Domain Dedication (CC0)',\n u'The copyright holder waives his interest in his work and places the work as completely as possible in the public domain so others may freely exploit and use the work without 
restriction under copyright or database law.',\n 'https://creativecommons.org/publicdomain/zero/1.0/',\n 'http://i.creativecommons.org/p/zero/1.0/88x31.png',\n 'http://i.creativecommons.org/p/zero/1.0/80x15.png'),\n 'Public Domain Work': (\n 'Creative Commons Public Domain Work',\n u'This license enables works that are no longer restricted by copyright to be marked as such in a standard and simple way, making them easily discoverable and available to others.',\n 'https://creativecommons.org/publicdomain/mark/1.0/',\n 'https://licensebuttons.net/p/mark/1.0/88x31.png',\n 'https://licensebuttons.net/p/mark/1.0/80x15.png')\n}\n\n# Event Topics with sub topics\n\nEVENT_TOPICS = {\n 'Auto, Boat & Air': ['Air', 'Auto', 'Boat', 'Motorcycle/ATV', 'Other'],\n 'Business & Professional': [\n 'Career', 'Design', 'Educators', 'Environment &amp; Sustainability',\n 'Finance', 'Media', 'Non Profit &amp; NGOs', 'Other', 'Real Estate',\n 'Sales &amp; Marketing', 'Startups &amp; Small Business'\n ],\n 'Charity & Causes': [\n 'Animal Welfare', 'Disaster Relief', 'Education',\n 'Environment', 'Healthcare', 'Human Rights',\n 'International Aid', 'Other', 'Poverty'\n ],\n 'Community & Culture': [\n 'City/Town', 'County', 'Heritage', 'LGBT', 'Language',\n 'Medieval', 'Nationality', 'Other', 'Renaissance', 'State'\n ],\n 'Family & Education': [\n 'Alumni', 'Baby', 'Children &amp; Youth', 'Education', 'Other',\n 'Parenting', 'Parents Association', 'Reunion'\n ],\n 'Fashion & Beauty': [\n 'Accessories', 'Beauty', 'Bridal', 'Fashion', 'Other'\n ],\n 'Film, Media & Entertainment': [\n 'Adult', 'Anime', 'Comedy', 'Comics', 'Film', 'Gaming', 'Other', 'TV'\n ],\n 'Food & Drink': [\"Beer\", \"Food\", \"Other\", \"Spirits\", \"Wine\"],\n 'Government & Politics': [\n \"County/Municipal Government\", \"Democratic Party\", \"Federal Government\",\n \"Non-partisan\", \"Other\", \"Other Party\", \"Republican Party\",\n \"State Government\"\n ],\n 'Health & Wellness': [\n \"Medical\", \"Mental health\", \"Other\", \"Personal health\", \"Spa\", \"Yoga\"\n ],\n 'Hobbies & Special Interest': [\n \"Adult\", \"Anime/Comics\", \"Books\", \"DIY\", \"Drawing & Painting\", \"Gaming\",\n \"Knitting\", \"Other\", \"Photography\"\n ],\n 'Home & Lifestyle': [\"Dating\", \"Home & Garden\", \"Other\", \"Pets & Animals\"],\n 'Music': [\n \"Alternative\", \"Blues & Jazz\", \"Classical\", \"Country\", \"Cultural\",\n \"EDM / Electronic\", \"Folk\", \"Hip Hop / Rap\", \"Indie\", \"Latin\", \"Metal\",\n \"Opera\", \"Other\", \"Pop\", \"R&B\", \"Reggae\", \"Religious/Spiritual\", \"Rock\",\n \"Top 40\"\n ],\n 'Other': [],\n 'Performing & Visual Arts': [\n \"Ballet\", \"Comedy\", \"Craft\", \"Dance\", \"Fine Art\", \"Literary Arts\",\n \"Musical\", \"Opera\", \"Orchestra\", \"Other\", \"Theatre\"\n ],\n 'Religion & Spirituality': [\n \"Buddhism\", \"Christianity\", \"Eastern Religion\", \"Islam\", \"Judaism\",\n \"Mormonism\", \"Mysticism and Occult\", \"New Age\", \"Other\", \"Sikhism\"\n ],\n 'Science & Technology': [\n \"Biotech\", \"High Tech\", \"Medicine\", \"Mobile\", \"Other\", \"Robotics\",\n \"Science\", \"Social Media\"\n ],\n 'Seasonal & Holiday': [\n \"Channukah\", \"Christmas\", \"Easter\", \"Fall events\", \"Halloween/Haunt\",\n \"Independence Day\", \"New Years Eve\", \"Other\", \"St Patricks Day\",\n \"Thanksgiving\"\n ],\n 'Sports & Fitness': [\n \"Baseball\", \"Basketball\", \"Cycling\", \"Exercise\", \"Fighting & Martial Arts\",\n \"Football\", \"Golf\", \"Hockey\", \"Motorsports\", \"Mountain Biking\",\n \"Obstacles\", \"Other\", 
\"Rugby\", \"Running\", \"Snow Sports\", \"Soccer\",\n \"Swimming & Water Sports\", \"Tennis\", \"Volleyball\", \"Walking\", \"Yoga\"\n ],\n 'Travel & Outdoor': [\n \"Canoeing\", \"Climbing\", \"Hiking\", \"Kayaking\", \"Other\", \"Rafting\", \"Travel\"\n ]\n}\nPAYMENT_COUNTRIES = {\n 'United States',\n 'Argentina',\n 'Australia',\n 'Austria',\n 'Belgium',\n 'Brazil',\n 'Canada',\n 'Cyprus',\n 'Czech Republic',\n 'Denmark',\n 'Estonia',\n 'Finland',\n 'France',\n 'Germany',\n 'Greece',\n 'Hong Kong',\n 'Hungary',\n 'Ireland',\n 'Israel',\n 'Italy',\n 'Japan',\n 'Latvia',\n 'Lithuania',\n 'Luxemborg',\n 'Malaysia',\n 'Malta',\n 'Mexico',\n 'Netherlands',\n 'New Zealand',\n 'Norway',\n 'Philippines',\n 'Poland',\n 'Portugal',\n 'Singapore',\n 'Slovakia',\n 'Slovenia',\n 'Spain',\n 'Sweden',\n 'Switzerland',\n 'Taiwan',\n 'United Kingdom',\n}\n\n# (currency_code,available_on_paypal,available_on_stripe)\nPAYMENT_CURRENCIES = {\n ('AUD', True, True),\n ('BRL', True, True),\n ('CAD', True, True),\n ('CHF', True, True),\n ('CZK', True, True),\n ('DKK', True, True),\n ('EUR', True, True),\n ('GBP', True, True),\n ('HKD', True, True),\n ('HUF', True, True),\n ('ILS', True, True),\n ('INR', False, True),\n ('JPY', True, True),\n ('MXN', True, True),\n ('MYR', True, True),\n ('NOK', True, True),\n ('NZD', True, True),\n ('PHP', True, True),\n ('PLN', True, True),\n ('RUB', True, True),\n ('SEK', True, True),\n ('SGD', True, True),\n ('THB', True, True),\n ('TWD', True, True),\n ('USD', True, True),\n}\n\n# Event Images with Event Topics and Subtopics\n\nDEFAULT_EVENT_IMAGES = {\n 'Auto, Boat & Air': 'auto.jpg',\n 'Air': 'auto.jpg',\n 'Auto': 'auto.jpg',\n 'Boat': 'auto.jpg',\n 'Motorcycle/ATV': 'auto.jpg',\n 'Business & Professional': 'business.jpg',\n 'Career': 'business.jpg',\n 'Design': 'business.jpg',\n 'Educators': 'business.jpg',\n 'Environment &amp; Sustainability': 'business.jpg',\n 'Finance': 'business.jpg',\n 'Media': 'business.jpg',\n 'Non Profit &amp; NGOs': 'business.jpg',\n 'Real Estate': 'business.jpg',\n 'Sales &amp; Marketing': 'business.jpg',\n 'Startups &amp; Small Business': 'business.jpg',\n 'Charity & Causes': 'charity.jpg',\n 'Animal Welfare': 'charity.jpg',\n 'Disaster Relief': 'charity.jpg',\n 'Education': 'charity.jpg',\n 'Environment': 'charity.jpg',\n 'Healthcare': 'charity.jpg',\n 'Human Rights': 'charity.jpg',\n 'International Aid': 'charity.jpg',\n 'Poverty': 'charity.jpg',\n 'Community & Culture': 'culture.jpg',\n 'City/Town': 'culture.jpg',\n 'County': 'culture.jpg',\n 'Heritage': 'culture.jpg',\n 'LGBT': 'culture.jpg',\n 'Language': 'culture.jpg',\n 'Medieval': 'culture.jpg',\n 'Nationality': 'culture.jpg',\n 'Renaissance': 'culture.jpg',\n 'State': 'culture.jpg',\n 'Family & Education': 'education.jpg',\n 'Alumni': 'education.jpg',\n 'Baby': 'education.jpg',\n 'Children &amp; Youth': 'education.jpg',\n 'Education': 'education.jpg',\n 'Parenting': 'education.jpg',\n 'Parents Association': 'education.jpg',\n 'Reunion': 'education.jpg',\n 'Fashion & Beauty': 'fashion.jpg',\n 'Accessories': 'fashion.jpg',\n 'Beauty': 'fashion.jpg',\n 'Bridal': 'fashion.jpg',\n 'Fashion': 'fashion.jpg',\n 'Film, Media & Entertainment': 'film.jpg',\n 'Adult': 'film.jpg',\n 'Anime': 'film.jpg',\n 'Comedy': 'film.jpg',\n 'Comics': 'film.jpg',\n 'Film': 'film.jpg',\n 'Gaming': 'film.jpg',\n 'TV': 'film.jpg',\n 'Food & Drink': 'food.jpg',\n \"Beer\": 'food.jpg',\n \"Food\": 'food.jpg',\n \"Spirits\": 'food.jpg',\n \"Wine\": 'food.jpg',\n 'Government & Politics': 'government.jpg',\n 
\"County/Municipal Government\": 'government.jpg',\n \"Democratic Party\": 'government.jpg',\n \"Federal Government\": 'government.jpg',\n \"Non-partisan\": 'government.jpg',\n \"Other Party\": 'government.jpg',\n \"Republican Party\": 'government.jpg',\n \"State Government\": 'government.jpg',\n 'Health & Wellness': 'health.jpg',\n 'Hobbies & Special Interest': 'hobbies.jpg',\n \"Adult\": 'hobbies.jpg',\n \"Anime/Comics\": 'hobbies.jpg',\n \"Books\": 'hobbies.jpg',\n \"DIY\": 'hobbies.jpg',\n \"Drawing & Painting\": 'hobbies.jpg',\n \"Gaming\": 'hobbies.jpg',\n \"Knitting\": 'hobbies.jpg',\n \"Photography\": 'hobbies.jpg',\n 'Home & Lifestyle': 'home.jpg',\n \"Dating\": 'home.jpg',\n \"Home & Garden\": 'home.jpg',\n \"Pets & Animals\": 'home.jpg',\n 'Music': 'music.jpg',\n \"Alternative\": 'music.jpg',\n \"Blues & Jazz\": 'music.jpg',\n \"Classical\": 'music.jpg',\n \"Country\": 'music.jpg',\n \"Cultural\": 'music.jpg',\n \"EDM / Electronic\": 'music.jpg',\n \"Folk\": 'music.jpg',\n \"Hip Hop / Rap\": 'music.jpg',\n \"Indie\": 'music.jpg',\n \"Latin\": 'music.jpg',\n \"Metal\": 'music.jpg',\n \"Opera\": 'music.jpg',\n \"Pop\": 'music.jpg',\n \"R&B\": 'music.jpg',\n \"Reggae\": 'music.jpg',\n \"Religious/Spiritual\": 'music.jpg',\n \"Rock\": 'music.jpg',\n \"Top 40\": 'music.jpg',\n 'Other': 'generic.jpg',\n 'Performing & Visual Arts': 'perform.jpg',\n \"Ballet\": 'perform.jpg',\n \"Comedy\": 'perform.jpg',\n \"Craft\": 'perform.jpg',\n \"Dance\": 'perform.jpg',\n \"Fine Art\": 'perform.jpg',\n \"Literary Arts\": 'perform.jpg',\n \"Musical\": 'perform.jpg',\n \"Opera\": 'perform.jpg',\n \"Orchestra\": 'perform.jpg',\n \"Theatre\": 'perform.jpg',\n 'Religion & Spirituality': 'spiritual.jpg',\n \"Buddhism\": 'spiritual.jpg',\n \"Christianity\": 'spiritual.jpg',\n \"Eastern Religion\": 'spiritual.jpg',\n \"Islam\": 'spiritual.jpg',\n \"Judaism\": 'spiritual.jpg',\n \"Mormonism\": 'spiritual.jpg',\n \"Mysticism and Occult\": 'spiritual.jpg',\n \"New Age\": 'spiritual.jpg',\n \"Sikhism\": 'spiritual.jpg',\n 'Science & Technology': 'science.jpg',\n \"Biotech\": 'science.jpg',\n \"High Tech\": 'science.jpg',\n \"Medicine\": 'science.jpg',\n \"Mobile\": 'science.jpg',\n \"Robotics\": 'science.jpg',\n \"Science\": 'science.jpg',\n \"Social Media\": 'science.jpg',\n 'Seasonal & Holiday': 'holiday.jpg',\n \"Channukah\": 'holiday.jpg',\n \"Christmas\": 'holiday.jpg',\n \"Easter\": 'holiday.jpg',\n \"Fall events\": 'holiday.jpg',\n \"Halloween/Haunt\": 'holiday.jpg',\n \"Independence Day\": 'holiday.jpg',\n \"New Years Eve\": 'holiday.jpg',\n \"St Patricks Day\": 'holiday.jpg',\n \"Thanksgiving\": 'holiday.jpg',\n 'Sports & Fitness': 'sport.jpg',\n \"Baseball\": 'sport.jpg',\n \"Basketball\": 'sport.jpg',\n \"Cycling\": 'sport.jpg',\n \"Exercise\": 'sport.jpg',\n \"Fighting & Martial Arts\": 'sport.jpg',\n \"Football\": 'sport.jpg',\n \"Golf\": 'sport.jpg',\n \"Hockey\": 'sport.jpg',\n \"Motorsports\": 'sport.jpg',\n \"Mountain Biking\": 'sport.jpg',\n \"Obstacles\": 'sport.jpg',\n \"Rugby\": 'sport.jpg',\n \"Running\": 'sport.jpg',\n \"Snow Sports\": 'sport.jpg',\n \"Soccer\": 'sport.jpg',\n \"Swimming & Water Sports\": 'sport.jpg',\n \"Tennis\": 'sport.jpg',\n \"Volleyball\": 'sport.jpg',\n \"Walking\": 'sport.jpg',\n \"Yoga\": 'sport.jpg',\n 'Travel & Outdoor': 'travel.jpg',\n \"Canoeing\": 'travel.jpg',\n \"Climbing\": 'travel.jpg',\n \"Hiking\": 'travel.jpg',\n \"Kayaking\": 'travel.jpg',\n \"Rafting\": 'travel.jpg',\n \"Travel\": 'travel.jpg',\n}\n", "path": "app/helpers/static.py" } ]
diff --git a/app/helpers/static.py b/app/helpers/static.py index 2de510161d..67a0b9635c 100644 --- a/app/helpers/static.py +++ b/app/helpers/static.py @@ -210,7 +210,7 @@ ('SGD', True, True), ('THB', True, True), ('TWD', True, True), - + ('USD', True, True), } # Event Images with Event Topics and Subtopics diff --git a/app/templates/gentelella/admin/event/tickets/attendees.html b/app/templates/gentelella/admin/event/tickets/attendees.html index 566cbfe9d7..df2bbf6422 100644 --- a/app/templates/gentelella/admin/event/tickets/attendees.html +++ b/app/templates/gentelella/admin/event/tickets/attendees.html @@ -57,6 +57,13 @@ <label class="btn btn-default btn-responsive"> <input type="radio" name="show_state" autocomplete="off" value="Expired"> Expired </label> + <label class="btn btn-default btn-responsive"> + <input type="radio" name="show_state" autocomplete="off" value="checked_in"> Checked In + </label> + <label class="btn btn-default btn-responsive"> + <input type="radio" name="show_state" autocomplete="off" value="not_checked_in"> Not Checked In + </label> + </div> </div> @@ -131,7 +138,7 @@ <h3>View Attendees</h3> {% if order.status == 'completed' %} {% if holder.checked_in %} <button class="btn btn-warning holder-check-in-toggle" data-holder-id="{{ holder.id }}"> - Undo + Undo </button> {% else %} <button class="btn btn-success holder-check-in-toggle" data-holder-id="{{ holder.id }}"> @@ -155,10 +162,13 @@ <h3>View Attendees</h3> $.fn.dataTable.ext.search.push( function (settings, data, dataIndex) { var user_option = $("input[name=show_state]:checked").val(); - console.log(data); var state = data[2].trim() || 'pending'; if (user_option === "all") { return true; + } else if (user_option === 'checked_in') { + return data[8].trim().indexOf('Undo') !== -1 + } else if (user_option === 'not_checked_in') { + return data[8].trim().indexOf('Check In') !== -1 } else if (user_option === state) { return true; } @@ -214,11 +224,16 @@ <h3>View Attendees</h3> success: function (result) { $btn.prop("disabled", false); if (result.status === "ok") { - if(result.checked_in) { + if (result.checked_in) { $btn.html("Undo").removeClass("btn-success").addClass("btn-warning"); } else { $btn.html('<i class="fa fa-check fa-fw"></i> Check In').removeClass("btn-warning").addClass("btn-success"); } + var row = table.row($btn.closest('tr')); + var data = row.data(); + data[8] = $btn.text(); + row.invalidate(); + table.draw(); } else { $btn.html(oldText); createSnackbar("There was an error while processing.", "Try Again", function () { diff --git a/app/templates/gentelella/admin/event/wizard/step-1.html b/app/templates/gentelella/admin/event/wizard/step-1.html index ff1c92ac7d..36049c8342 100644 --- a/app/templates/gentelella/admin/event/wizard/step-1.html +++ b/app/templates/gentelella/admin/event/wizard/step-1.html @@ -1503,12 +1503,11 @@ <h3 class="modal-title">Bank Instructions</h3> } }; } - var $stripeConnectedMessage = $("#stripe-connected-message"); $(".stripe-connect").click(function (e) { e.preventDefault(); $.oauthpopup({ - path: "https://connect.stripe.com/oauth/authorize?response_type=code&client_id={{ settings.stripe_client_id }}&scope=read_write&redirect_uri={{ url_for('ticketing.stripe_callback', _external=true) }}", + path: "https://connect.stripe.com/oauth/authorize?response_type=code&client_id={{ key_settings.stripe_client_id }}&scope=read_write&redirect_uri={{ url_for('ticketing.stripe_callback', _external=true) }}", callback: function () { // TODO Disallow test accounts. 
Only accept live accounts. if (1 || window.oauth_response.live_mode) {
Uberspace__lab-28
Change project name to lab in config
[ { "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# Uberspace 7 lab documentation build configuration file, created by\n# sphinx-quickstart on Tue Feb 13 12:19:29 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\n# import os\n# import sys\n# sys.path.insert(0, os.path.abspath('.'))\n\nimport sphinx_rtd_theme\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = []\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'Uberspace 7 Lab'\ncopyright = '2018, uberspace.de'\nauthor = 'uberspace.de'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nrelease = version = '7'\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = None\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = []\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = False\n\n\n# -- Options for HTML output ----------------------------------------------\n\nhtml_theme = 'sphinx_rtd_theme'\nhtml_theme_options = {\n 'display_version': False,\n 'navigation_depth': 2,\n 'collapse_navigation': True\n}\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\nhtml_last_updated_fmt = '%b %d, %Y'\nhtml_context = {\n 'css_files': ['_static/css/custom.css'],\n 'display_github': True,\n 'github_user': 'Uberspace', \n 'github_repo': 'lab', \n 'github_version': 'master',\n 'conf_py_path': '/source/'\n}\nhtml_show_copyright = False\nhtml_favicon = '_static/favicon.ico'\n\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n#\n# html_theme_options = {}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. 
They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# Custom sidebar templates, must be a dictionary that maps document names\n# to template names.\n#\n# This is required for the alabaster theme\n# refs: http://alabaster.readthedocs.io/en/latest/installation.html#sidebars\nhtml_sidebars = {\n '**': [\n 'relations.html', # needs 'show_related': True theme option to display\n 'searchbox.html',\n ]\n}\n\n\n# -- Options for HTMLHelp output ------------------------------------------\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'Uberspace7labdoc'\n", "path": "source/conf.py" } ]
[ { "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# Uberspace 7 lab documentation build configuration file, created by\n# sphinx-quickstart on Tue Feb 13 12:19:29 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\n# import os\n# import sys\n# sys.path.insert(0, os.path.abspath('.'))\n\nimport sphinx_rtd_theme\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = []\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'UberLab'\ncopyright = '2018, uberspace.de'\nauthor = 'uberspace.de'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nrelease = version = '7'\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = None\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = []\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = False\n\n\n# -- Options for HTML output ----------------------------------------------\n\nhtml_theme = 'sphinx_rtd_theme'\nhtml_theme_options = {\n 'display_version': False,\n 'navigation_depth': 2,\n 'collapse_navigation': True\n}\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\nhtml_last_updated_fmt = '%b %d, %Y'\nhtml_context = {\n 'css_files': ['_static/css/custom.css'],\n 'display_github': True,\n 'github_user': 'Uberspace', \n 'github_repo': 'lab', \n 'github_version': 'master',\n 'conf_py_path': '/source/'\n}\nhtml_show_copyright = False\nhtml_favicon = '_static/favicon.ico'\n\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n#\n# html_theme_options = {}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. 
They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# Custom sidebar templates, must be a dictionary that maps document names\n# to template names.\n#\n# This is required for the alabaster theme\n# refs: http://alabaster.readthedocs.io/en/latest/installation.html#sidebars\nhtml_sidebars = {\n '**': [\n 'relations.html', # needs 'show_related': True theme option to display\n 'searchbox.html',\n ]\n}\n\n\n# -- Options for HTMLHelp output ------------------------------------------\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'Uberspace7labdoc'\n", "path": "source/conf.py" } ]
diff --git a/source/_templates/breadcrumbs.html b/source/_templates/breadcrumbs.html index 39d3706f..81a84c88 100755 --- a/source/_templates/breadcrumbs.html +++ b/source/_templates/breadcrumbs.html @@ -32,7 +32,7 @@ <ul class="wy-breadcrumbs"> {% block breadcrumbs %} - <li><a href="{{ pathto(master_doc) }}">{{ _('Manual') }}</a> &raquo;</li> + <li><a href="{{ pathto(master_doc) }}">{{ _('Lab') }}</a> &raquo;</li> {% for doc in parents %} <li><a href="{{ doc.link|e }}">{{ doc.title }}</a> &raquo;</li> {% endfor %} diff --git a/source/conf.py b/source/conf.py index eea6ac81..e04ec227 100644 --- a/source/conf.py +++ b/source/conf.py @@ -47,7 +47,7 @@ master_doc = 'index' # General information about the project. -project = 'Uberspace 7 Lab' +project = 'UberLab' copyright = '2018, uberspace.de' author = 'uberspace.de'
pennersr__django-allauth-2388
Error when 500 template contains a django-allauth template tag
**Error message**:
- `AttributeError: 'NoneType' object has no attribute 'POST'`

**How to reproduce**:
1) Create a 500 template (`500.html` in your template directory) that includes a template tag from django-allauth. For me, it was `Google Sign In` button:

```html
<!--500.html-->
{% load socialaccount %}
<a class="nav-link" href="{% provider_login_url 'google' %}">Log In</a>
```

2. Add an endpoint that is handled by Django's default 500 handler (`handler500` in `django.conf.urls`, which by default points to `django.views.defaults.server_error`)

```python
# urls.py
from django.conf.urls import handler500

urlpatterns = [
    # ...
    path('500/', handler500, name='500'),
    # ...
]
```

- The handler (`server_error`) renders the template: `return HttpResponseServerError(template.render())`
- `render` in `allauth/socialaccount/templatetags/socialaccount.py` is called
- `get_request_param` in `allauth/utils.py` is called in the line `next = get_request_param(request, 'next')`
- `return request.POST.get(param) or request.GET.get(param, default)` in `get_request_param` causes an error because `request` is `None` in this case

**Solution**:
- Add a guard statement to the function `get_request_param` in `allauth/utils.py`
- before a patch: Use a custom 500 handler instead of Django's default 500 handler
[ { "content": "import base64\nimport importlib\nimport json\nimport random\nimport re\nimport string\nimport unicodedata\nfrom collections import OrderedDict\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.sites.models import Site\nfrom django.core.exceptions import ImproperlyConfigured\nfrom django.core.serializers.json import DjangoJSONEncoder\nfrom django.core.validators import ValidationError, validate_email\nfrom django.db.models import FieldDoesNotExist, FileField\nfrom django.db.models.fields import (\n BinaryField,\n DateField,\n DateTimeField,\n EmailField,\n TimeField,\n)\nfrom django.utils import dateparse\nfrom django.utils.encoding import force_bytes\n\nfrom allauth.compat import force_str, six, urlsplit\n\n\n# Magic number 7: if you run into collisions with this number, then you are\n# of big enough scale to start investing in a decent user model...\nMAX_USERNAME_SUFFIX_LENGTH = 7\nUSERNAME_SUFFIX_CHARS = (\n [string.digits] * 4 +\n [string.ascii_letters] * (MAX_USERNAME_SUFFIX_LENGTH - 4))\n\n\ndef _generate_unique_username_base(txts, regex=None):\n from .account.adapter import get_adapter\n adapter = get_adapter()\n username = None\n regex = regex or r'[^\\w\\s@+.-]'\n for txt in txts:\n if not txt:\n continue\n username = unicodedata.normalize('NFKD', force_str(txt))\n username = username.encode('ascii', 'ignore').decode('ascii')\n username = force_str(re.sub(regex, '', username).lower())\n # Django allows for '@' in usernames in order to accomodate for\n # project wanting to use e-mail for username. In allauth we don't\n # use this, we already have a proper place for putting e-mail\n # addresses (EmailAddress), so let's not use the full e-mail\n # address and only take the part leading up to the '@'.\n username = username.split('@')[0]\n username = username.strip()\n username = re.sub(r'\\s+', '_', username)\n # Finally, validating base username without database lookups etc.\n try:\n username = adapter.clean_username(username, shallow=True)\n break\n except ValidationError:\n pass\n return username or 'user'\n\n\ndef get_username_max_length():\n from .account.app_settings import USER_MODEL_USERNAME_FIELD\n if USER_MODEL_USERNAME_FIELD is not None:\n User = get_user_model()\n max_length = User._meta.get_field(USER_MODEL_USERNAME_FIELD).max_length\n else:\n max_length = 0\n return max_length\n\n\ndef generate_username_candidate(basename, suffix_length):\n max_length = get_username_max_length()\n suffix = ''.join(\n random.choice(USERNAME_SUFFIX_CHARS[i])\n for i in range(suffix_length))\n return basename[0:max_length - len(suffix)] + suffix\n\n\ndef generate_username_candidates(basename):\n from .account.app_settings import USERNAME_MIN_LENGTH\n if len(basename) >= USERNAME_MIN_LENGTH:\n ret = [basename]\n else:\n ret = []\n min_suffix_length = max(1, USERNAME_MIN_LENGTH - len(basename))\n max_suffix_length = min(\n get_username_max_length(),\n MAX_USERNAME_SUFFIX_LENGTH)\n for suffix_length in range(min_suffix_length, max_suffix_length):\n ret.append(generate_username_candidate(basename, suffix_length))\n return ret\n\n\ndef generate_unique_username(txts, regex=None):\n from .account.app_settings import USER_MODEL_USERNAME_FIELD\n from .account.adapter import get_adapter\n from allauth.account.utils import filter_users_by_username\n\n adapter = get_adapter()\n basename = _generate_unique_username_base(txts, regex)\n candidates = generate_username_candidates(basename)\n existing_usernames = filter_users_by_username(*candidates).values_list(\n 
USER_MODEL_USERNAME_FIELD, flat=True)\n existing_usernames = set([n.lower() for n in existing_usernames])\n for candidate in candidates:\n if candidate.lower() not in existing_usernames:\n try:\n return adapter.clean_username(candidate, shallow=True)\n except ValidationError:\n pass\n # This really should not happen\n raise NotImplementedError('Unable to find a unique username')\n\n\ndef valid_email_or_none(email):\n ret = None\n try:\n if email:\n validate_email(email)\n if len(email) <= EmailField().max_length:\n ret = email\n except ValidationError:\n pass\n return ret\n\n\ndef email_address_exists(email, exclude_user=None):\n from .account import app_settings as account_settings\n from .account.models import EmailAddress\n\n emailaddresses = EmailAddress.objects\n if exclude_user:\n emailaddresses = emailaddresses.exclude(user=exclude_user)\n ret = emailaddresses.filter(email__iexact=email).exists()\n if not ret:\n email_field = account_settings.USER_MODEL_EMAIL_FIELD\n if email_field:\n users = get_user_model().objects\n if exclude_user:\n users = users.exclude(pk=exclude_user.pk)\n ret = users.filter(**{email_field + '__iexact': email}).exists()\n return ret\n\n\ndef import_attribute(path):\n assert isinstance(path, six.string_types)\n pkg, attr = path.rsplit('.', 1)\n ret = getattr(importlib.import_module(pkg), attr)\n return ret\n\n\ndef import_callable(path_or_callable):\n if not hasattr(path_or_callable, '__call__'):\n ret = import_attribute(path_or_callable)\n else:\n ret = path_or_callable\n return ret\n\n\nSERIALIZED_DB_FIELD_PREFIX = '_db_'\n\n\ndef serialize_instance(instance):\n \"\"\"\n Since Django 1.6 items added to the session are no longer pickled,\n but JSON encoded by default. We are storing partially complete models\n in the session (user, account, token, ...). We cannot use standard\n Django serialization, as these are models are not \"complete\" yet.\n Serialization will start complaining about missing relations et al.\n \"\"\"\n data = {}\n for k, v in instance.__dict__.items():\n if k.startswith('_') or callable(v):\n continue\n try:\n field = instance._meta.get_field(k)\n if isinstance(field, BinaryField):\n v = force_str(base64.b64encode(v))\n elif isinstance(field, FileField):\n if v and not isinstance(v, six.string_types):\n v = v.name\n # Check if the field is serializable. 
If not, we'll fall back\n # to serializing the DB values which should cover most use cases.\n try:\n json.dumps(v, cls=DjangoJSONEncoder)\n except TypeError:\n v = field.get_prep_value(v)\n k = SERIALIZED_DB_FIELD_PREFIX + k\n except FieldDoesNotExist:\n pass\n data[k] = v\n return json.loads(json.dumps(data, cls=DjangoJSONEncoder))\n\n\ndef deserialize_instance(model, data):\n ret = model()\n for k, v in data.items():\n is_db_value = False\n if k.startswith(SERIALIZED_DB_FIELD_PREFIX):\n k = k[len(SERIALIZED_DB_FIELD_PREFIX):]\n is_db_value = True\n if v is not None:\n try:\n f = model._meta.get_field(k)\n if isinstance(f, DateTimeField):\n v = dateparse.parse_datetime(v)\n elif isinstance(f, TimeField):\n v = dateparse.parse_time(v)\n elif isinstance(f, DateField):\n v = dateparse.parse_date(v)\n elif isinstance(f, BinaryField):\n v = force_bytes(\n base64.b64decode(\n force_bytes(v)))\n elif is_db_value:\n try:\n # This is quite an ugly hack, but will cover most\n # use cases...\n v = f.from_db_value(v, None, None, None)\n except Exception:\n raise ImproperlyConfigured(\n \"Unable to auto serialize field '{}', custom\"\n \" serialization override required\".format(k)\n )\n except FieldDoesNotExist:\n pass\n setattr(ret, k, v)\n return ret\n\n\ndef set_form_field_order(form, field_order):\n \"\"\"\n This function is a verbatim copy of django.forms.Form.order_fields() to\n support field ordering below Django 1.9.\n\n field_order is a list of field names specifying the order. Append fields\n not included in the list in the default order for backward compatibility\n with subclasses not overriding field_order. If field_order is None, keep\n all fields in the order defined in the class. Ignore unknown fields in\n field_order to allow disabling fields in form subclasses without\n redefining ordering.\n \"\"\"\n if field_order is None:\n return\n fields = OrderedDict()\n for key in field_order:\n try:\n fields[key] = form.fields.pop(key)\n except KeyError: # ignore unknown fields\n pass\n fields.update(form.fields) # add remaining fields in original order\n form.fields = fields\n\n\ndef build_absolute_uri(request, location, protocol=None):\n \"\"\"request.build_absolute_uri() helper\n\n Like request.build_absolute_uri, but gracefully handling\n the case where request is None.\n \"\"\"\n from .account import app_settings as account_settings\n\n if request is None:\n site = Site.objects.get_current()\n bits = urlsplit(location)\n if not (bits.scheme and bits.netloc):\n uri = '{proto}://{domain}{url}'.format(\n proto=account_settings.DEFAULT_HTTP_PROTOCOL,\n domain=site.domain,\n url=location)\n else:\n uri = location\n else:\n uri = request.build_absolute_uri(location)\n # NOTE: We only force a protocol if we are instructed to do so\n # (via the `protocol` parameter, or, if the default is set to\n # HTTPS. 
The latter keeps compatibility with the debatable use\n # case of running your site under both HTTP and HTTPS, where one\n # would want to make sure HTTPS links end up in password reset\n # mails even while they were initiated on an HTTP password reset\n # form.\n if not protocol and account_settings.DEFAULT_HTTP_PROTOCOL == 'https':\n protocol = account_settings.DEFAULT_HTTP_PROTOCOL\n # (end NOTE)\n if protocol:\n uri = protocol + ':' + uri.partition(':')[2]\n return uri\n\n\ndef get_form_class(forms, form_id, default_form):\n form_class = forms.get(form_id, default_form)\n if isinstance(form_class, six.string_types):\n form_class = import_attribute(form_class)\n return form_class\n\n\ndef get_request_param(request, param, default=None):\n return request.POST.get(param) or request.GET.get(param, default)\n", "path": "allauth/utils.py" } ]
[ { "content": "import base64\nimport importlib\nimport json\nimport random\nimport re\nimport string\nimport unicodedata\nfrom collections import OrderedDict\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.sites.models import Site\nfrom django.core.exceptions import ImproperlyConfigured\nfrom django.core.serializers.json import DjangoJSONEncoder\nfrom django.core.validators import ValidationError, validate_email\nfrom django.db.models import FieldDoesNotExist, FileField\nfrom django.db.models.fields import (\n BinaryField,\n DateField,\n DateTimeField,\n EmailField,\n TimeField,\n)\nfrom django.utils import dateparse\nfrom django.utils.encoding import force_bytes\n\nfrom allauth.compat import force_str, six, urlsplit\n\n\n# Magic number 7: if you run into collisions with this number, then you are\n# of big enough scale to start investing in a decent user model...\nMAX_USERNAME_SUFFIX_LENGTH = 7\nUSERNAME_SUFFIX_CHARS = (\n [string.digits] * 4 +\n [string.ascii_letters] * (MAX_USERNAME_SUFFIX_LENGTH - 4))\n\n\ndef _generate_unique_username_base(txts, regex=None):\n from .account.adapter import get_adapter\n adapter = get_adapter()\n username = None\n regex = regex or r'[^\\w\\s@+.-]'\n for txt in txts:\n if not txt:\n continue\n username = unicodedata.normalize('NFKD', force_str(txt))\n username = username.encode('ascii', 'ignore').decode('ascii')\n username = force_str(re.sub(regex, '', username).lower())\n # Django allows for '@' in usernames in order to accomodate for\n # project wanting to use e-mail for username. In allauth we don't\n # use this, we already have a proper place for putting e-mail\n # addresses (EmailAddress), so let's not use the full e-mail\n # address and only take the part leading up to the '@'.\n username = username.split('@')[0]\n username = username.strip()\n username = re.sub(r'\\s+', '_', username)\n # Finally, validating base username without database lookups etc.\n try:\n username = adapter.clean_username(username, shallow=True)\n break\n except ValidationError:\n pass\n return username or 'user'\n\n\ndef get_username_max_length():\n from .account.app_settings import USER_MODEL_USERNAME_FIELD\n if USER_MODEL_USERNAME_FIELD is not None:\n User = get_user_model()\n max_length = User._meta.get_field(USER_MODEL_USERNAME_FIELD).max_length\n else:\n max_length = 0\n return max_length\n\n\ndef generate_username_candidate(basename, suffix_length):\n max_length = get_username_max_length()\n suffix = ''.join(\n random.choice(USERNAME_SUFFIX_CHARS[i])\n for i in range(suffix_length))\n return basename[0:max_length - len(suffix)] + suffix\n\n\ndef generate_username_candidates(basename):\n from .account.app_settings import USERNAME_MIN_LENGTH\n if len(basename) >= USERNAME_MIN_LENGTH:\n ret = [basename]\n else:\n ret = []\n min_suffix_length = max(1, USERNAME_MIN_LENGTH - len(basename))\n max_suffix_length = min(\n get_username_max_length(),\n MAX_USERNAME_SUFFIX_LENGTH)\n for suffix_length in range(min_suffix_length, max_suffix_length):\n ret.append(generate_username_candidate(basename, suffix_length))\n return ret\n\n\ndef generate_unique_username(txts, regex=None):\n from .account.app_settings import USER_MODEL_USERNAME_FIELD\n from .account.adapter import get_adapter\n from allauth.account.utils import filter_users_by_username\n\n adapter = get_adapter()\n basename = _generate_unique_username_base(txts, regex)\n candidates = generate_username_candidates(basename)\n existing_usernames = filter_users_by_username(*candidates).values_list(\n 
USER_MODEL_USERNAME_FIELD, flat=True)\n existing_usernames = set([n.lower() for n in existing_usernames])\n for candidate in candidates:\n if candidate.lower() not in existing_usernames:\n try:\n return adapter.clean_username(candidate, shallow=True)\n except ValidationError:\n pass\n # This really should not happen\n raise NotImplementedError('Unable to find a unique username')\n\n\ndef valid_email_or_none(email):\n ret = None\n try:\n if email:\n validate_email(email)\n if len(email) <= EmailField().max_length:\n ret = email\n except ValidationError:\n pass\n return ret\n\n\ndef email_address_exists(email, exclude_user=None):\n from .account import app_settings as account_settings\n from .account.models import EmailAddress\n\n emailaddresses = EmailAddress.objects\n if exclude_user:\n emailaddresses = emailaddresses.exclude(user=exclude_user)\n ret = emailaddresses.filter(email__iexact=email).exists()\n if not ret:\n email_field = account_settings.USER_MODEL_EMAIL_FIELD\n if email_field:\n users = get_user_model().objects\n if exclude_user:\n users = users.exclude(pk=exclude_user.pk)\n ret = users.filter(**{email_field + '__iexact': email}).exists()\n return ret\n\n\ndef import_attribute(path):\n assert isinstance(path, six.string_types)\n pkg, attr = path.rsplit('.', 1)\n ret = getattr(importlib.import_module(pkg), attr)\n return ret\n\n\ndef import_callable(path_or_callable):\n if not hasattr(path_or_callable, '__call__'):\n ret = import_attribute(path_or_callable)\n else:\n ret = path_or_callable\n return ret\n\n\nSERIALIZED_DB_FIELD_PREFIX = '_db_'\n\n\ndef serialize_instance(instance):\n \"\"\"\n Since Django 1.6 items added to the session are no longer pickled,\n but JSON encoded by default. We are storing partially complete models\n in the session (user, account, token, ...). We cannot use standard\n Django serialization, as these are models are not \"complete\" yet.\n Serialization will start complaining about missing relations et al.\n \"\"\"\n data = {}\n for k, v in instance.__dict__.items():\n if k.startswith('_') or callable(v):\n continue\n try:\n field = instance._meta.get_field(k)\n if isinstance(field, BinaryField):\n v = force_str(base64.b64encode(v))\n elif isinstance(field, FileField):\n if v and not isinstance(v, six.string_types):\n v = v.name\n # Check if the field is serializable. 
If not, we'll fall back\n # to serializing the DB values which should cover most use cases.\n try:\n json.dumps(v, cls=DjangoJSONEncoder)\n except TypeError:\n v = field.get_prep_value(v)\n k = SERIALIZED_DB_FIELD_PREFIX + k\n except FieldDoesNotExist:\n pass\n data[k] = v\n return json.loads(json.dumps(data, cls=DjangoJSONEncoder))\n\n\ndef deserialize_instance(model, data):\n ret = model()\n for k, v in data.items():\n is_db_value = False\n if k.startswith(SERIALIZED_DB_FIELD_PREFIX):\n k = k[len(SERIALIZED_DB_FIELD_PREFIX):]\n is_db_value = True\n if v is not None:\n try:\n f = model._meta.get_field(k)\n if isinstance(f, DateTimeField):\n v = dateparse.parse_datetime(v)\n elif isinstance(f, TimeField):\n v = dateparse.parse_time(v)\n elif isinstance(f, DateField):\n v = dateparse.parse_date(v)\n elif isinstance(f, BinaryField):\n v = force_bytes(\n base64.b64decode(\n force_bytes(v)))\n elif is_db_value:\n try:\n # This is quite an ugly hack, but will cover most\n # use cases...\n v = f.from_db_value(v, None, None, None)\n except Exception:\n raise ImproperlyConfigured(\n \"Unable to auto serialize field '{}', custom\"\n \" serialization override required\".format(k)\n )\n except FieldDoesNotExist:\n pass\n setattr(ret, k, v)\n return ret\n\n\ndef set_form_field_order(form, field_order):\n \"\"\"\n This function is a verbatim copy of django.forms.Form.order_fields() to\n support field ordering below Django 1.9.\n\n field_order is a list of field names specifying the order. Append fields\n not included in the list in the default order for backward compatibility\n with subclasses not overriding field_order. If field_order is None, keep\n all fields in the order defined in the class. Ignore unknown fields in\n field_order to allow disabling fields in form subclasses without\n redefining ordering.\n \"\"\"\n if field_order is None:\n return\n fields = OrderedDict()\n for key in field_order:\n try:\n fields[key] = form.fields.pop(key)\n except KeyError: # ignore unknown fields\n pass\n fields.update(form.fields) # add remaining fields in original order\n form.fields = fields\n\n\ndef build_absolute_uri(request, location, protocol=None):\n \"\"\"request.build_absolute_uri() helper\n\n Like request.build_absolute_uri, but gracefully handling\n the case where request is None.\n \"\"\"\n from .account import app_settings as account_settings\n\n if request is None:\n site = Site.objects.get_current()\n bits = urlsplit(location)\n if not (bits.scheme and bits.netloc):\n uri = '{proto}://{domain}{url}'.format(\n proto=account_settings.DEFAULT_HTTP_PROTOCOL,\n domain=site.domain,\n url=location)\n else:\n uri = location\n else:\n uri = request.build_absolute_uri(location)\n # NOTE: We only force a protocol if we are instructed to do so\n # (via the `protocol` parameter, or, if the default is set to\n # HTTPS. 
The latter keeps compatibility with the debatable use\n # case of running your site under both HTTP and HTTPS, where one\n # would want to make sure HTTPS links end up in password reset\n # mails even while they were initiated on an HTTP password reset\n # form.\n if not protocol and account_settings.DEFAULT_HTTP_PROTOCOL == 'https':\n protocol = account_settings.DEFAULT_HTTP_PROTOCOL\n # (end NOTE)\n if protocol:\n uri = protocol + ':' + uri.partition(':')[2]\n return uri\n\n\ndef get_form_class(forms, form_id, default_form):\n form_class = forms.get(form_id, default_form)\n if isinstance(form_class, six.string_types):\n form_class = import_attribute(form_class)\n return form_class\n\n\ndef get_request_param(request, param, default=None):\n if request is None:\n return default\n return request.POST.get(param) or request.GET.get(param, default)\n", "path": "allauth/utils.py" } ]
diff --git a/AUTHORS b/AUTHORS index 588e9b7a77..642ae957c4 100644 --- a/AUTHORS +++ b/AUTHORS @@ -66,6 +66,7 @@ Jeff Triplett Jeremy Satterfield Jerome Leclanche Jesse Gerard Brands +Jihoon Park Jiyoon Ha Joe Vanderstelt John Bazik diff --git a/allauth/utils.py b/allauth/utils.py index fe366eae8e..d2c58cda68 100644 --- a/allauth/utils.py +++ b/allauth/utils.py @@ -299,4 +299,6 @@ def get_form_class(forms, form_id, default_form): def get_request_param(request, param, default=None): + if request is None: + return default return request.POST.get(param) or request.GET.get(param, default)
hylang__hy-1343
REPL history is lost on (quit)
REPL history is not flushed to disk if the REPL is exited using `(quit)`. A workaround is to remember to use `CTRL-D` to exit the REPL.

Would be nice if `(quit)` also worked.
[ { "content": "# Copyright 2017 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\n\nimport hy.macros\nimport hy.compiler\nfrom hy._compat import builtins, string_types\n\n\ndocomplete = True\n\ntry:\n import readline\nexcept ImportError:\n try:\n import pyreadline.rlmain\n import pyreadline.unicode_helper # NOQA\n import readline\n except ImportError:\n docomplete = False\n\nif sys.platform == 'darwin' and 'libedit' in readline.__doc__:\n readline_bind = \"bind ^I rl_complete\"\nelse:\n readline_bind = \"tab: complete\"\n\n\nclass Completer(object):\n\n def __init__(self, namespace={}):\n if not isinstance(namespace, dict):\n raise TypeError('namespace must be a dictionary')\n self.namespace = namespace\n self.path = [hy.compiler._compile_table,\n builtins.__dict__,\n hy.macros._hy_macros[None],\n namespace]\n self.tag_path = [hy.macros._hy_tag[None]]\n if '__name__' in namespace:\n module_name = namespace['__name__']\n self.path.append(hy.macros._hy_macros[module_name])\n self.tag_path.append(hy.macros._hy_tag[module_name])\n\n def attr_matches(self, text):\n # Borrowed from IPython's completer\n m = re.match(r\"(\\S+(\\.[\\w-]+)*)\\.([\\w-]*)$\", text)\n\n if m:\n expr, attr = m.group(1, 3)\n attr = attr.replace(\"-\", \"_\")\n expr = expr.replace(\"-\", \"_\")\n else:\n return []\n\n try:\n obj = eval(expr, self.namespace)\n words = dir(obj)\n except Exception:\n return []\n\n n = len(attr)\n matches = []\n for w in words:\n if w[:n] == attr:\n matches.append(\"{}.{}\".format(\n expr.replace(\"_\", \"-\"), w.replace(\"_\", \"-\")))\n return matches\n\n def global_matches(self, text):\n matches = []\n for p in self.path:\n for k in p.keys():\n if isinstance(k, string_types):\n k = k.replace(\"_\", \"-\")\n if k.startswith(text):\n matches.append(k)\n return matches\n\n def tag_matches(self, text):\n text = text[1:]\n matches = []\n for p in self.tag_path:\n for k in p.keys():\n if isinstance(k, string_types):\n if k.startswith(text):\n matches.append(\"#{}\".format(k))\n return matches\n\n def complete(self, text, state):\n if text.startswith(\"#\"):\n matches = self.tag_matches(text)\n elif \".\" in text:\n matches = self.attr_matches(text)\n else:\n matches = self.global_matches(text)\n try:\n return matches[state]\n except IndexError:\n return None\n\n\[email protected]\ndef completion(completer=None):\n delims = \"()[]{} \"\n if not completer:\n completer = Completer()\n\n if docomplete:\n readline.set_completer(completer.complete)\n readline.set_completer_delims(delims)\n\n history = os.path.expanduser(\"~/.hy-history\")\n readline.parse_and_bind(\"set blink-matching-paren on\")\n\n try:\n readline.read_history_file(history)\n except IOError:\n open(history, 'a').close()\n\n readline.parse_and_bind(readline_bind)\n\n yield\n\n if docomplete:\n readline.write_history_file(history)\n", "path": "hy/completer.py" } ]
[ { "content": "# Copyright 2017 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\n\nimport hy.macros\nimport hy.compiler\nfrom hy._compat import builtins, string_types\n\n\ndocomplete = True\n\ntry:\n import readline\nexcept ImportError:\n try:\n import pyreadline.rlmain\n import pyreadline.unicode_helper # NOQA\n import readline\n except ImportError:\n docomplete = False\n\nif sys.platform == 'darwin' and 'libedit' in readline.__doc__:\n readline_bind = \"bind ^I rl_complete\"\nelse:\n readline_bind = \"tab: complete\"\n\n\nclass Completer(object):\n\n def __init__(self, namespace={}):\n if not isinstance(namespace, dict):\n raise TypeError('namespace must be a dictionary')\n self.namespace = namespace\n self.path = [hy.compiler._compile_table,\n builtins.__dict__,\n hy.macros._hy_macros[None],\n namespace]\n self.tag_path = [hy.macros._hy_tag[None]]\n if '__name__' in namespace:\n module_name = namespace['__name__']\n self.path.append(hy.macros._hy_macros[module_name])\n self.tag_path.append(hy.macros._hy_tag[module_name])\n\n def attr_matches(self, text):\n # Borrowed from IPython's completer\n m = re.match(r\"(\\S+(\\.[\\w-]+)*)\\.([\\w-]*)$\", text)\n\n if m:\n expr, attr = m.group(1, 3)\n attr = attr.replace(\"-\", \"_\")\n expr = expr.replace(\"-\", \"_\")\n else:\n return []\n\n try:\n obj = eval(expr, self.namespace)\n words = dir(obj)\n except Exception:\n return []\n\n n = len(attr)\n matches = []\n for w in words:\n if w[:n] == attr:\n matches.append(\"{}.{}\".format(\n expr.replace(\"_\", \"-\"), w.replace(\"_\", \"-\")))\n return matches\n\n def global_matches(self, text):\n matches = []\n for p in self.path:\n for k in p.keys():\n if isinstance(k, string_types):\n k = k.replace(\"_\", \"-\")\n if k.startswith(text):\n matches.append(k)\n return matches\n\n def tag_matches(self, text):\n text = text[1:]\n matches = []\n for p in self.tag_path:\n for k in p.keys():\n if isinstance(k, string_types):\n if k.startswith(text):\n matches.append(\"#{}\".format(k))\n return matches\n\n def complete(self, text, state):\n if text.startswith(\"#\"):\n matches = self.tag_matches(text)\n elif \".\" in text:\n matches = self.attr_matches(text)\n else:\n matches = self.global_matches(text)\n try:\n return matches[state]\n except IndexError:\n return None\n\n\[email protected]\ndef completion(completer=None):\n delims = \"()[]{} \"\n if not completer:\n completer = Completer()\n\n if docomplete:\n readline.set_completer(completer.complete)\n readline.set_completer_delims(delims)\n\n history = os.path.expanduser(\"~/.hy-history\")\n readline.parse_and_bind(\"set blink-matching-paren on\")\n\n try:\n readline.read_history_file(history)\n except IOError:\n open(history, 'a').close()\n\n readline.parse_and_bind(readline_bind)\n\n try:\n yield\n finally:\n if docomplete:\n readline.write_history_file(history)\n", "path": "hy/completer.py" } ]
diff --git a/NEWS b/NEWS index 3785792ec..300db30e5 100644 --- a/NEWS +++ b/NEWS @@ -20,6 +20,8 @@ Changes from 0.13.0 * String literals should no longer be interpreted as special forms or macros * Tag macros (née sharp macros) whose names begin with `!` are no longer mistaken for shebang lines + * Fixed a bug where REPL history wasn't saved if you quit the REPL with + `(quit)` or `(exit)` Changes from 0.12.1 diff --git a/hy/completer.py b/hy/completer.py index 0d9c906e9..c4de45cca 100644 --- a/hy/completer.py +++ b/hy/completer.py @@ -124,7 +124,8 @@ def completion(completer=None): readline.parse_and_bind(readline_bind) - yield - - if docomplete: - readline.write_history_file(history) + try: + yield + finally: + if docomplete: + readline.write_history_file(history)
google-research__text-to-text-transfer-transformer-351
Unable to import tensorflow_gcs_config in t5_trivia colab notebook
Upon running line `import tensorflow_gcs_config` (in t5_trivia colab notebook, setup section) I get this error,

```
---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
<ipython-input-2-3bb7f36f8553> in <module>()
----> 1 import tensorflow_gcs_config

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/__init__.py in _load_library(filename, lib)
     55     raise NotImplementedError(
     56         "unable to open file: " +
---> 57         "{}, from paths: {}\ncaused by: {}".format(filename, filenames, errs))
     58 
     59 _gcs_config_so = _load_library("_gcs_config_ops.so")

NotImplementedError: unable to open file: _gcs_config_ops.so, from paths: ['/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/_gcs_config_ops.so']
caused by: ['/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/_gcs_config_ops.so: undefined symbol: _ZN10tensorflow15OpKernelContext5inputEN4absl11string_viewEPPKNS_6TensorE']
```

`tf.__version__` is '2.3.0'
[ { "content": "# Copyright 2020 The T5 Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# Lint as: python3\nr\"\"\"Separate file for storing the current version of T5.\n\nStored in a separate file so that setup.py can reference the version without\npulling in all the dependencies in __init__.py.\n\"\"\"\n__version__ = '0.6.3'\n", "path": "t5/version.py" } ]
[ { "content": "# Copyright 2020 The T5 Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# Lint as: python3\nr\"\"\"Separate file for storing the current version of T5.\n\nStored in a separate file so that setup.py can reference the version without\npulling in all the dependencies in __init__.py.\n\"\"\"\n__version__ = '0.6.4'\n", "path": "t5/version.py" } ]
diff --git a/t5/version.py b/t5/version.py index f3b47b22..e50b2461 100644 --- a/t5/version.py +++ b/t5/version.py @@ -18,4 +18,4 @@ Stored in a separate file so that setup.py can reference the version without pulling in all the dependencies in __init__.py. """ -__version__ = '0.6.3' +__version__ = '0.6.4'
espnet__espnet-913
matplotlib.use('Agg') fail
It results in a plot failure:

```
_tkinter.TclError: no display name and no $DISPLAY environment variable
```

I fixed this by applying a patch to `espnet/nets/pytorch_backend/transformer/plot.py`:

```
@@ -1,5 +1,6 @@
 import logging
-
+import matplotlib
+matplotlib.use('Agg')
 import matplotlib.pyplot as plt
 
 from espnet.asr import asr_utils
```
[ { "content": "import logging\n\nimport matplotlib.pyplot as plt\n\nfrom espnet.asr import asr_utils\n\n\ndef _plot_and_save_attention(att_w, filename):\n # dynamically import matplotlib due to not found error\n from matplotlib.ticker import MaxNLocator\n import os\n d = os.path.dirname(filename)\n if not os.path.exists(d):\n os.makedirs(d)\n w, h = plt.figaspect(1.0 / len(att_w))\n fig = plt.Figure(figsize=(w * 2, h * 2))\n axes = fig.subplots(1, len(att_w))\n if len(att_w) == 1:\n axes = [axes]\n for ax, aw in zip(axes, att_w):\n # plt.subplot(1, len(att_w), h)\n ax.imshow(aw, aspect=\"auto\")\n ax.set_xlabel(\"Input\")\n ax.set_ylabel(\"Output\")\n ax.xaxis.set_major_locator(MaxNLocator(integer=True))\n ax.yaxis.set_major_locator(MaxNLocator(integer=True))\n fig.tight_layout()\n return fig\n\n\ndef savefig(plot, filename):\n plot.savefig(filename)\n plt.clf()\n\n\ndef plot_multi_head_attention(data, attn_dict, outdir, suffix=\"png\", savefn=savefig):\n \"\"\"Plot multi head attentions\n\n :param dict data: utts info from json file\n :param dict[str, torch.Tensor] attn_dict: multi head attention dict.\n values should be torch.Tensor (head, input_length, output_length)\n :param str outdir: dir to save fig\n :param str suffix: filename suffix including image type (e.g., png)\n :param savefn: function to save\n \"\"\"\n for name, att_ws in attn_dict.items():\n for idx, att_w in enumerate(att_ws):\n filename = \"%s/%s.%s.%s\" % (\n outdir, data[idx][0], name, suffix)\n dec_len = int(data[idx][1]['output'][0]['shape'][0])\n enc_len = int(data[idx][1]['input'][0]['shape'][0])\n if \"encoder\" in name:\n att_w = att_w[:, :enc_len, :enc_len]\n elif \"decoder\" in name:\n if \"self\" in name:\n att_w = att_w[:, :dec_len, :dec_len]\n else:\n att_w = att_w[:, :dec_len, :enc_len]\n else:\n logging.warning(\"unknown name for shaping attention\")\n fig = _plot_and_save_attention(att_w, filename)\n savefn(fig, filename)\n\n\nclass PlotAttentionReport(asr_utils.PlotAttentionReport):\n def plotfn(self, *args, **kwargs):\n plot_multi_head_attention(*args, **kwargs)\n\n def __call__(self, trainer):\n attn_dict = self.get_attention_weights()\n suffix = \"ep.{.updater.epoch}.png\".format(trainer)\n self.plotfn(self.data, attn_dict, self.outdir, suffix, savefig)\n\n def get_attention_weights(self):\n batch = self.converter([self.transform(self.data)], self.device)\n if isinstance(batch, tuple):\n att_ws = self.att_vis_fn(*batch)\n elif isinstance(batch, dict):\n att_ws = self.att_vis_fn(**batch)\n return att_ws\n\n def log_attentions(self, logger, step):\n def log_fig(plot, filename):\n from os.path import basename\n logger.add_figure(basename(filename), plot, step)\n plt.clf()\n\n attn_dict = self.get_attention_weights()\n self.plotfn(self.data, attn_dict, self.outdir, \"\", log_fig)\n", "path": "espnet/nets/pytorch_backend/transformer/plot.py" } ]
[ { "content": "import logging\n\nfrom espnet.asr import asr_utils\nimport matplotlib.pyplot as plt\n\n\ndef _plot_and_save_attention(att_w, filename):\n # dynamically import matplotlib due to not found error\n from matplotlib.ticker import MaxNLocator\n import os\n d = os.path.dirname(filename)\n if not os.path.exists(d):\n os.makedirs(d)\n w, h = plt.figaspect(1.0 / len(att_w))\n fig = plt.Figure(figsize=(w * 2, h * 2))\n axes = fig.subplots(1, len(att_w))\n if len(att_w) == 1:\n axes = [axes]\n for ax, aw in zip(axes, att_w):\n # plt.subplot(1, len(att_w), h)\n ax.imshow(aw, aspect=\"auto\")\n ax.set_xlabel(\"Input\")\n ax.set_ylabel(\"Output\")\n ax.xaxis.set_major_locator(MaxNLocator(integer=True))\n ax.yaxis.set_major_locator(MaxNLocator(integer=True))\n fig.tight_layout()\n return fig\n\n\ndef savefig(plot, filename):\n plot.savefig(filename)\n plt.clf()\n\n\ndef plot_multi_head_attention(data, attn_dict, outdir, suffix=\"png\", savefn=savefig):\n \"\"\"Plot multi head attentions\n\n :param dict data: utts info from json file\n :param dict[str, torch.Tensor] attn_dict: multi head attention dict.\n values should be torch.Tensor (head, input_length, output_length)\n :param str outdir: dir to save fig\n :param str suffix: filename suffix including image type (e.g., png)\n :param savefn: function to save\n \"\"\"\n for name, att_ws in attn_dict.items():\n for idx, att_w in enumerate(att_ws):\n filename = \"%s/%s.%s.%s\" % (\n outdir, data[idx][0], name, suffix)\n dec_len = int(data[idx][1]['output'][0]['shape'][0])\n enc_len = int(data[idx][1]['input'][0]['shape'][0])\n if \"encoder\" in name:\n att_w = att_w[:, :enc_len, :enc_len]\n elif \"decoder\" in name:\n if \"self\" in name:\n att_w = att_w[:, :dec_len, :dec_len]\n else:\n att_w = att_w[:, :dec_len, :enc_len]\n else:\n logging.warning(\"unknown name for shaping attention\")\n fig = _plot_and_save_attention(att_w, filename)\n savefn(fig, filename)\n\n\nclass PlotAttentionReport(asr_utils.PlotAttentionReport):\n def plotfn(self, *args, **kwargs):\n plot_multi_head_attention(*args, **kwargs)\n\n def __call__(self, trainer):\n attn_dict = self.get_attention_weights()\n suffix = \"ep.{.updater.epoch}.png\".format(trainer)\n self.plotfn(self.data, attn_dict, self.outdir, suffix, savefig)\n\n def get_attention_weights(self):\n batch = self.converter([self.transform(self.data)], self.device)\n if isinstance(batch, tuple):\n att_ws = self.att_vis_fn(*batch)\n elif isinstance(batch, dict):\n att_ws = self.att_vis_fn(**batch)\n return att_ws\n\n def log_attentions(self, logger, step):\n def log_fig(plot, filename):\n from os.path import basename\n logger.add_figure(basename(filename), plot, step)\n plt.clf()\n\n attn_dict = self.get_attention_weights()\n self.plotfn(self.data, attn_dict, self.outdir, \"\", log_fig)\n", "path": "espnet/nets/pytorch_backend/transformer/plot.py" } ]
diff --git a/espnet/nets/pytorch_backend/transformer/plot.py b/espnet/nets/pytorch_backend/transformer/plot.py index 66985df55e0..af5f0ef6e68 100644 --- a/espnet/nets/pytorch_backend/transformer/plot.py +++ b/espnet/nets/pytorch_backend/transformer/plot.py @@ -1,8 +1,7 @@ import logging -import matplotlib.pyplot as plt - from espnet.asr import asr_utils +import matplotlib.pyplot as plt def _plot_and_save_attention(att_w, filename):
mkdocs__mkdocs-1921
Unexpected behaviour with page.is_homepage
Starting a new site and rolling my own theme. Came across some slightly odd behaviour.

Mkdocs version 1.0.4
Python version 3.7.1

**Expected:** `page.is_homepage` evaluates to True on the home (index.md) of the site, and false on all other pages.

**Actual:** `page.is_homepage` evaluates to True on the home (index.md), and on any other index.md that is included in the nav object without nesting.

**Examples:**

The unexpected result:

```
nav:
  - Home: index.md <--- page.is_homepage evaluates to True
  - About: about.md <--- page.is_homepage evaluates to False
  - Projects: projects/index.md <--- page.is_homepage evaluates to True
```

Changing the filename causes it to evaluate to false:

```
nav:
  - Home: index.md <--- page.is_homepage evaluates to True
  - About: about.md <--- page.is_homepage evaluates to False
  - Projects: projects/test.md <--- page.is_homepage evaluates to False
```

If I tweak it a bit, so that the sections are nested, then it evaluates to false as I'd expect:

```
nav:
  - About:
    - About: about.md <--- page.is_homepage evaluates to False
  - Projects:
    - Project home: projects/index.md <--- page.is_homepage evaluates to False
```

This feels like a bug - especially as simply changing the markdown file name causes the behaviour to change.
[ { "content": "# coding: utf-8\n\nfrom __future__ import unicode_literals\n\nimport os\nimport io\nimport datetime\nimport logging\n\nimport markdown\nfrom markdown.extensions import Extension\nfrom markdown.treeprocessors import Treeprocessor\nfrom markdown.util import AMP_SUBSTITUTE\n\nfrom mkdocs.structure.toc import get_toc\nfrom mkdocs.utils import meta, urlparse, urlunparse, urljoin, urlunquote, get_markdown_title, warning_filter\n\nlog = logging.getLogger(__name__)\nlog.addFilter(warning_filter)\n\n\nclass Page(object):\n def __init__(self, title, file, config):\n file.page = self\n self.file = file\n self.title = title\n\n # Navigation attributes\n self.parent = None\n self.children = None\n self.previous_page = None\n self.next_page = None\n self.active = False\n\n self.is_section = False\n self.is_page = True\n self.is_link = False\n\n # Support SOURCE_DATE_EPOCH environment variable for \"reproducible\" builds.\n # See https://reproducible-builds.org/specs/source-date-epoch/\n if 'SOURCE_DATE_EPOCH' in os.environ:\n self.update_date = datetime.datetime.utcfromtimestamp(\n int(os.environ['SOURCE_DATE_EPOCH'])\n ).strftime(\"%Y-%m-%d\")\n else:\n self.update_date = datetime.datetime.now().strftime(\"%Y-%m-%d\")\n\n self._set_canonical_url(config.get('site_url', None))\n self._set_edit_url(config.get('repo_url', None), config.get('edit_uri', None))\n\n # Placeholders to be filled in later in the build process.\n self.markdown = None\n self.content = None\n self.toc = []\n self.meta = {}\n\n def __eq__(self, other):\n\n def sub_dict(d):\n return dict((key, value) for key, value in d.items() if key in ['title', 'file'])\n\n return (isinstance(other, self.__class__) and sub_dict(self.__dict__) == sub_dict(other.__dict__))\n\n def __ne__(self, other):\n return not self.__eq__(other)\n\n def __repr__(self):\n title = \"'{}'\".format(self.title) if (self.title is not None) else '[blank]'\n return \"Page(title={}, url='{}')\".format(title, self.abs_url or self.file.url)\n\n def _indent_print(self, depth=0):\n return '{}{}'.format(' ' * depth, repr(self))\n\n def _get_active(self):\n \"\"\" Return active status of page. \"\"\"\n return self.__active\n\n def _set_active(self, value):\n \"\"\" Set active status of page and ancestors. \"\"\"\n self.__active = bool(value)\n if self.parent is not None:\n self.parent.active = bool(value)\n\n active = property(_get_active, _set_active)\n\n @property\n def is_index(self):\n return self.file.name == 'index'\n\n @property\n def is_top_level(self):\n return self.parent is None\n\n @property\n def is_homepage(self):\n return self.is_top_level and self.is_index\n\n @property\n def url(self):\n return '' if self.file.url == '.' 
else self.file.url\n\n @property\n def ancestors(self):\n if self.parent is None:\n return []\n return [self.parent] + self.parent.ancestors\n\n def _set_canonical_url(self, base):\n if base:\n if not base.endswith('/'):\n base += '/'\n self.canonical_url = urljoin(base, self.url)\n self.abs_url = urlparse(self.canonical_url).path\n else:\n self.canonical_url = None\n self.abs_url = None\n\n def _set_edit_url(self, repo_url, edit_uri):\n if repo_url and edit_uri:\n src_path = self.file.src_path.replace('\\\\', '/')\n self.edit_url = urljoin(repo_url, edit_uri + src_path)\n else:\n self.edit_url = None\n\n def read_source(self, config):\n source = config['plugins'].run_event(\n 'page_read_source', page=self, config=config\n )\n if source is None:\n try:\n with io.open(self.file.abs_src_path, 'r', encoding='utf-8-sig', errors='strict') as f:\n source = f.read()\n except IOError:\n log.error('File not found: {}'.format(self.file.src_path))\n raise\n except ValueError:\n log.error('Encoding error reading file: {}'.format(self.file.src_path))\n raise\n\n self.markdown, self.meta = meta.get_data(source)\n self._set_title()\n\n def _set_title(self):\n \"\"\"\n Set the title for a Markdown document.\n\n Check these in order and use the first that returns a valid title:\n - value provided on init (passed in from config)\n - value of metadata 'title'\n - content of the first H1 in Markdown content\n - convert filename to title\n \"\"\"\n if self.title is not None:\n return\n\n if 'title' in self.meta:\n self.title = self.meta['title']\n return\n\n title = get_markdown_title(self.markdown)\n\n if title is None:\n if self.is_homepage:\n title = 'Home'\n else:\n title = self.file.name.replace('-', ' ').replace('_', ' ')\n # Capitalize if the filename was all lowercase, otherwise leave it as-is.\n if title.lower() == title:\n title = title.capitalize()\n\n self.title = title\n\n def render(self, config, files):\n \"\"\"\n Convert the Markdown source file to HTML as per the config.\n \"\"\"\n\n extensions = [\n _RelativePathExtension(self.file, files)\n ] + config['markdown_extensions']\n\n md = markdown.Markdown(\n extensions=extensions,\n extension_configs=config['mdx_configs'] or {}\n )\n self.content = md.convert(self.markdown)\n self.toc = get_toc(getattr(md, 'toc', ''))\n\n\nclass _RelativePathTreeprocessor(Treeprocessor):\n def __init__(self, file, files):\n self.file = file\n self.files = files\n\n def run(self, root):\n \"\"\"\n Update urls on anchors and images to make them relative\n\n Iterates through the full document tree looking for specific\n tags and then makes them relative based on the site navigation\n \"\"\"\n for element in root.iter():\n if element.tag == 'a':\n key = 'href'\n elif element.tag == 'img':\n key = 'src'\n else:\n continue\n\n url = element.get(key)\n new_url = self.path_to_url(url)\n element.set(key, new_url)\n\n return root\n\n def path_to_url(self, url):\n scheme, netloc, path, params, query, fragment = urlparse(url)\n\n if (scheme or netloc or not path or url.startswith('/')\n or AMP_SUBSTITUTE in url or '.' not in os.path.split(path)[-1]):\n # Ignore URLs unless they are a relative link to a source file.\n # AMP_SUBSTITUTE is used internally by Markdown only for email.\n # No '.' 
in the last part of a path indicates path does not point to a file.\n return url\n\n # Determine the filepath of the target.\n target_path = os.path.join(os.path.dirname(self.file.src_path), urlunquote(path))\n target_path = os.path.normpath(target_path).lstrip(os.sep)\n\n # Validate that the target exists in files collection.\n if target_path not in self.files:\n log.warning(\n \"Documentation file '{}' contains a link to '{}' which is not found \"\n \"in the documentation files.\".format(self.file.src_path, target_path)\n )\n return url\n target_file = self.files.get_file_from_path(target_path)\n path = target_file.url_relative_to(self.file)\n components = (scheme, netloc, path, params, query, fragment)\n return urlunparse(components)\n\n\nclass _RelativePathExtension(Extension):\n \"\"\"\n The Extension class is what we pass to markdown, it then\n registers the Treeprocessor.\n \"\"\"\n\n def __init__(self, file, files):\n self.file = file\n self.files = files\n\n def extendMarkdown(self, md, md_globals):\n relpath = _RelativePathTreeprocessor(self.file, self.files)\n md.treeprocessors.add(\"relpath\", relpath, \"_end\")\n", "path": "mkdocs/structure/pages.py" } ]
[ { "content": "# coding: utf-8\n\nfrom __future__ import unicode_literals\n\nimport os\nimport io\nimport datetime\nimport logging\n\nimport markdown\nfrom markdown.extensions import Extension\nfrom markdown.treeprocessors import Treeprocessor\nfrom markdown.util import AMP_SUBSTITUTE\n\nfrom mkdocs.structure.toc import get_toc\nfrom mkdocs.utils import meta, urlparse, urlunparse, urljoin, urlunquote, get_markdown_title, warning_filter\n\nlog = logging.getLogger(__name__)\nlog.addFilter(warning_filter)\n\n\nclass Page(object):\n def __init__(self, title, file, config):\n file.page = self\n self.file = file\n self.title = title\n\n # Navigation attributes\n self.parent = None\n self.children = None\n self.previous_page = None\n self.next_page = None\n self.active = False\n\n self.is_section = False\n self.is_page = True\n self.is_link = False\n\n # Support SOURCE_DATE_EPOCH environment variable for \"reproducible\" builds.\n # See https://reproducible-builds.org/specs/source-date-epoch/\n if 'SOURCE_DATE_EPOCH' in os.environ:\n self.update_date = datetime.datetime.utcfromtimestamp(\n int(os.environ['SOURCE_DATE_EPOCH'])\n ).strftime(\"%Y-%m-%d\")\n else:\n self.update_date = datetime.datetime.now().strftime(\"%Y-%m-%d\")\n\n self._set_canonical_url(config.get('site_url', None))\n self._set_edit_url(config.get('repo_url', None), config.get('edit_uri', None))\n\n # Placeholders to be filled in later in the build process.\n self.markdown = None\n self.content = None\n self.toc = []\n self.meta = {}\n\n def __eq__(self, other):\n\n def sub_dict(d):\n return dict((key, value) for key, value in d.items() if key in ['title', 'file'])\n\n return (isinstance(other, self.__class__) and sub_dict(self.__dict__) == sub_dict(other.__dict__))\n\n def __ne__(self, other):\n return not self.__eq__(other)\n\n def __repr__(self):\n title = \"'{}'\".format(self.title) if (self.title is not None) else '[blank]'\n return \"Page(title={}, url='{}')\".format(title, self.abs_url or self.file.url)\n\n def _indent_print(self, depth=0):\n return '{}{}'.format(' ' * depth, repr(self))\n\n def _get_active(self):\n \"\"\" Return active status of page. \"\"\"\n return self.__active\n\n def _set_active(self, value):\n \"\"\" Set active status of page and ancestors. \"\"\"\n self.__active = bool(value)\n if self.parent is not None:\n self.parent.active = bool(value)\n\n active = property(_get_active, _set_active)\n\n @property\n def is_index(self):\n return self.file.name == 'index'\n\n @property\n def is_top_level(self):\n return self.parent is None\n\n @property\n def is_homepage(self):\n return self.is_top_level and self.is_index and self.file.url == '.'\n\n @property\n def url(self):\n return '' if self.file.url == '.' 
else self.file.url\n\n @property\n def ancestors(self):\n if self.parent is None:\n return []\n return [self.parent] + self.parent.ancestors\n\n def _set_canonical_url(self, base):\n if base:\n if not base.endswith('/'):\n base += '/'\n self.canonical_url = urljoin(base, self.url)\n self.abs_url = urlparse(self.canonical_url).path\n else:\n self.canonical_url = None\n self.abs_url = None\n\n def _set_edit_url(self, repo_url, edit_uri):\n if repo_url and edit_uri:\n src_path = self.file.src_path.replace('\\\\', '/')\n self.edit_url = urljoin(repo_url, edit_uri + src_path)\n else:\n self.edit_url = None\n\n def read_source(self, config):\n source = config['plugins'].run_event(\n 'page_read_source', page=self, config=config\n )\n if source is None:\n try:\n with io.open(self.file.abs_src_path, 'r', encoding='utf-8-sig', errors='strict') as f:\n source = f.read()\n except IOError:\n log.error('File not found: {}'.format(self.file.src_path))\n raise\n except ValueError:\n log.error('Encoding error reading file: {}'.format(self.file.src_path))\n raise\n\n self.markdown, self.meta = meta.get_data(source)\n self._set_title()\n\n def _set_title(self):\n \"\"\"\n Set the title for a Markdown document.\n\n Check these in order and use the first that returns a valid title:\n - value provided on init (passed in from config)\n - value of metadata 'title'\n - content of the first H1 in Markdown content\n - convert filename to title\n \"\"\"\n if self.title is not None:\n return\n\n if 'title' in self.meta:\n self.title = self.meta['title']\n return\n\n title = get_markdown_title(self.markdown)\n\n if title is None:\n if self.is_homepage:\n title = 'Home'\n else:\n title = self.file.name.replace('-', ' ').replace('_', ' ')\n # Capitalize if the filename was all lowercase, otherwise leave it as-is.\n if title.lower() == title:\n title = title.capitalize()\n\n self.title = title\n\n def render(self, config, files):\n \"\"\"\n Convert the Markdown source file to HTML as per the config.\n \"\"\"\n\n extensions = [\n _RelativePathExtension(self.file, files)\n ] + config['markdown_extensions']\n\n md = markdown.Markdown(\n extensions=extensions,\n extension_configs=config['mdx_configs'] or {}\n )\n self.content = md.convert(self.markdown)\n self.toc = get_toc(getattr(md, 'toc', ''))\n\n\nclass _RelativePathTreeprocessor(Treeprocessor):\n def __init__(self, file, files):\n self.file = file\n self.files = files\n\n def run(self, root):\n \"\"\"\n Update urls on anchors and images to make them relative\n\n Iterates through the full document tree looking for specific\n tags and then makes them relative based on the site navigation\n \"\"\"\n for element in root.iter():\n if element.tag == 'a':\n key = 'href'\n elif element.tag == 'img':\n key = 'src'\n else:\n continue\n\n url = element.get(key)\n new_url = self.path_to_url(url)\n element.set(key, new_url)\n\n return root\n\n def path_to_url(self, url):\n scheme, netloc, path, params, query, fragment = urlparse(url)\n\n if (scheme or netloc or not path or url.startswith('/')\n or AMP_SUBSTITUTE in url or '.' not in os.path.split(path)[-1]):\n # Ignore URLs unless they are a relative link to a source file.\n # AMP_SUBSTITUTE is used internally by Markdown only for email.\n # No '.' 
in the last part of a path indicates path does not point to a file.\n return url\n\n # Determine the filepath of the target.\n target_path = os.path.join(os.path.dirname(self.file.src_path), urlunquote(path))\n target_path = os.path.normpath(target_path).lstrip(os.sep)\n\n # Validate that the target exists in files collection.\n if target_path not in self.files:\n log.warning(\n \"Documentation file '{}' contains a link to '{}' which is not found \"\n \"in the documentation files.\".format(self.file.src_path, target_path)\n )\n return url\n target_file = self.files.get_file_from_path(target_path)\n path = target_file.url_relative_to(self.file)\n components = (scheme, netloc, path, params, query, fragment)\n return urlunparse(components)\n\n\nclass _RelativePathExtension(Extension):\n \"\"\"\n The Extension class is what we pass to markdown, it then\n registers the Treeprocessor.\n \"\"\"\n\n def __init__(self, file, files):\n self.file = file\n self.files = files\n\n def extendMarkdown(self, md, md_globals):\n relpath = _RelativePathTreeprocessor(self.file, self.files)\n md.treeprocessors.add(\"relpath\", relpath, \"_end\")\n", "path": "mkdocs/structure/pages.py" } ]
diff --git a/docs/about/release-notes.md b/docs/about/release-notes.md index bb33917c3d..35564ffab2 100644 --- a/docs/about/release-notes.md +++ b/docs/about/release-notes.md @@ -56,6 +56,7 @@ your global navigation uses more than one level, things will likely be broken. ### Other Changes and Additions to Version 1.1 +* Bugfix: Ensure nested index pages do not get identified as the homepage (#1919). * Bugfix: Properly identify deployment version (#1879). * Bugfix: Properly build `ValidationError` message for `custom_dir` (#1849). * Bugfix: Exclude Markdown files and READMEs from theme (#1766). diff --git a/mkdocs/structure/pages.py b/mkdocs/structure/pages.py index b032d7799b..a15a28a83e 100644 --- a/mkdocs/structure/pages.py +++ b/mkdocs/structure/pages.py @@ -93,7 +93,7 @@ def is_top_level(self): @property def is_homepage(self): - return self.is_top_level and self.is_index + return self.is_top_level and self.is_index and self.file.url == '.' @property def url(self): diff --git a/mkdocs/tests/structure/page_tests.py b/mkdocs/tests/structure/page_tests.py index f229c8cbe5..fd015eadf2 100644 --- a/mkdocs/tests/structure/page_tests.py +++ b/mkdocs/tests/structure/page_tests.py @@ -70,6 +70,54 @@ def test_nested_index_page(self): self.assertEqual(pg.title, 'Foo') self.assertEqual(pg.toc, []) + def test_nested_index_page_no_parent(self): + cfg = load_config(docs_dir=self.DOCS_DIR) + fl = File('sub1/index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls']) + pg = Page('Foo', fl, cfg) + pg.parent = None # non-homepage at nav root level; see #1919. + self.assertEqual(pg.url, 'sub1/') + self.assertEqual(pg.abs_url, None) + self.assertEqual(pg.canonical_url, None) + self.assertEqual(pg.edit_url, None) + self.assertEqual(pg.file, fl) + self.assertEqual(pg.content, None) + self.assertFalse(pg.is_homepage) + self.assertTrue(pg.is_index) + self.assertTrue(pg.is_page) + self.assertFalse(pg.is_section) + self.assertTrue(pg.is_top_level) + self.assertEqual(pg.markdown, None) + self.assertEqual(pg.meta, {}) + self.assertEqual(pg.next_page, None) + self.assertEqual(pg.parent, None) + self.assertEqual(pg.previous_page, None) + self.assertEqual(pg.title, 'Foo') + self.assertEqual(pg.toc, []) + + def test_nested_index_page_no_parent_no_directory_urls(self): + cfg = load_config(docs_dir=self.DOCS_DIR, use_directory_urls=False) + fl = File('sub1/index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls']) + pg = Page('Foo', fl, cfg) + pg.parent = None # non-homepage at nav root level; see #1919. + self.assertEqual(pg.url, 'sub1/index.html') + self.assertEqual(pg.abs_url, None) + self.assertEqual(pg.canonical_url, None) + self.assertEqual(pg.edit_url, None) + self.assertEqual(pg.file, fl) + self.assertEqual(pg.content, None) + self.assertFalse(pg.is_homepage) + self.assertTrue(pg.is_index) + self.assertTrue(pg.is_page) + self.assertFalse(pg.is_section) + self.assertTrue(pg.is_top_level) + self.assertEqual(pg.markdown, None) + self.assertEqual(pg.meta, {}) + self.assertEqual(pg.next_page, None) + self.assertEqual(pg.parent, None) + self.assertEqual(pg.previous_page, None) + self.assertEqual(pg.title, 'Foo') + self.assertEqual(pg.toc, []) + def test_nested_nonindex_page(self): cfg = load_config(docs_dir=self.DOCS_DIR) fl = File('sub1/non-index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls'])
pex-tool__pex-975
Release 2.1.10 On the docket: + [x] Improve Pex packaging. (#961) + [x] Make the interpreter cache deterministic. (#960) + [x] Fix deprecation warning for `rU` mode (#956) + [x] Fix runtime resolve error message generation. (#955) + [x] Kill dead code. (#954) + [x] Many Pex tests fail under Python 2.7 in CI #967 + [x] Add a `--local` mode for packaging the Pex PEX. #971 + [x] Split Pex resolve API. (#970) + [x] Can't run PEX file when a dependency's wheel includes a build tag #964 + [x] Expose network configuration in pex options. #803
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.9'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.10'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 12c003ef3..bc1013393 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,53 @@ Release Notes ============= +2.1.10 +------ + +This release focuses on the resolver API and resolution performance. Pex 2 resolving using Pip is +now at least at performance parity with Pex 1 in all studied cases and most often is 5% to 10% +faster. + +As part of the resolution performance work, Pip networking configuration is now exposed via Pex CLI +options and the ``NetworkConfiguration`` API type / new ``resolver.resolve`` API parameter. + +With network configuration now wired up, the ``PEX_HTTP_RETRIES`` and ``PEX_HTTP_TIMEOUT`` env var +support in Pex 1 that was never wired into Pex 2 is now dropped in favor of passing ``--retries`` +and ``--timeout`` via the CLI (See: `Issue #94 <https://github.com/pantsbuild/pex/issues/94>`_) + +* Expose Pip network configuration. (#974) + `PR #974 <https://github.com/pantsbuild/pex/pull/974>`_ + +* Restore handling for bad wheel filenames to ``.can_add()`` (#973) + `PR #973 <https://github.com/pantsbuild/pex/pull/973>`_ + +* Fix wheel filename parsing in PEXEnvironment.can_add (#965) + `PR #965 <https://github.com/pantsbuild/pex/pull/965>`_ + +* Split Pex resolve API. (#970) + `PR #970 <https://github.com/pantsbuild/pex/pull/970>`_ + +* Add a ``--local`` mode for packaging the Pex PEX. (#971) + `PR #971 <https://github.com/pantsbuild/pex/pull/971>`_ + +* Constrain the virtualenv version used by tox. (#968) + `PR #968 <https://github.com/pantsbuild/pex/pull/968>`_ + +* Improve Pex packaging. (#961) + `PR #961 <https://github.com/pantsbuild/pex/pull/961>`_ + +* Make the interpreter cache deterministic. (#960) + `PR #960 <https://github.com/pantsbuild/pex/pull/960>`_ + +* Fix deprecation warning for ``rU`` mode (#956) + `PR #956 <https://github.com/pantsbuild/pex/pull/956>`_ + +* Fix runtime resolve error message generation. (#955) + `PR #955 <https://github.com/pantsbuild/pex/pull/955>`_ + +* Kill dead code. (#954) + `PR #954 <https://github.com/pantsbuild/pex/pull/954>`_ + 2.1.9 ----- diff --git a/pex/version.py b/pex/version.py index a6ec8e0ae..ef5420cfc 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '2.1.9' +__version__ = '2.1.10'
numba__numba-1356
Use CPython allocator in NRT NRT should optionally use the CPython memory allocation functions (when imported from CPython). This would allow Numba-allocated memory to be seen by other utilities such as `sys.getallocatedblocks()`, `sys.debugmallocstats()`, and `tracemalloc`.
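A rough sketch of how the change would become observable from Python, assuming Python 3.4+ for `tracemalloc` and that NRT has been rewired onto `PyMem_RawMalloc`/`PyMem_RawFree` as in the diff below (the snapshot-diff pattern is the same one the new test in `numba/tests/test_nrt.py` uses):

```python
# Once NRT allocates through the CPython raw allocator, tracemalloc can
# attribute an njit-allocated array's memory to the Python call site.
import tracemalloc
import numpy as np
from numba import njit

N = 1000000

@njit
def alloc_nrt_memory():
    return np.empty(N, np.int8)   # buffer owned by NRT

alloc_nrt_memory()                # warm up so JIT compilation is excluded
tracemalloc.start()
before = tracemalloc.take_snapshot()
arr = alloc_nrt_memory()          # keep a reference so the block stays live
after = tracemalloc.take_snapshot()
for stat in after.compare_to(before, 'lineno')[:3]:
    print(stat)                   # the ~1 MB NRT block shows up near the top
tracemalloc.stop()
```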
[ { "content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"alloc\", \"free\", \"mi_alloc\", \"mi_free\"])\n\n\nclass _Runtime(object):\n def __init__(self):\n self._init = False\n\n def initialize(self, ctx):\n \"\"\"Initializes the NRT\n\n Must be called before any actual call to the NRT API.\n Safe to be called multiple times.\n \"\"\"\n if self._init:\n # Already initialized\n return\n\n # Register globals into the system\n for py_name in _nrt.c_helpers:\n c_name = \"NRT_\" + py_name\n c_address = _nrt.c_helpers[py_name]\n ll.add_symbol(c_name, c_address)\n\n # Compile atomic operations\n self._library = atomicops.compile_nrt_functions(ctx)\n\n self._ptr_inc = self._library.get_pointer_to_function(\"nrt_atomic_add\")\n self._ptr_dec = self._library.get_pointer_to_function(\"nrt_atomic_sub\")\n self._ptr_cas = self._library.get_pointer_to_function(\"nrt_atomic_cas\")\n\n # Install atomic ops to NRT\n _nrt.memsys_set_atomic_inc_dec(self._ptr_inc, self._ptr_dec)\n _nrt.memsys_set_atomic_cas(self._ptr_cas)\n\n self._init = True\n\n @staticmethod\n def shutdown():\n \"\"\"\n Shutdown the NRT\n Safe to be called without calling Runtime.initialize first\n \"\"\"\n _nrt.memsys_shutdown()\n\n @property\n def library(self):\n \"\"\"\n Return the Library object containing the various NRT functions.\n \"\"\"\n return self._library\n\n def meminfo_new(self, data, pyobj):\n \"\"\"\n Returns a MemInfo object that tracks memory at `data` owned by `pyobj`.\n MemInfo will acquire a reference on `pyobj`.\n The release of MemInfo will release a reference on `pyobj`.\n \"\"\"\n mi = _nrt.meminfo_new(data, pyobj)\n return MemInfo(mi)\n\n def meminfo_alloc(self, size, safe=False):\n \"\"\"\n Allocate a new memory of `size` bytes and returns a MemInfo object\n that tracks the allocation. When there is no more reference to the\n MemInfo object, the underlying memory will be deallocated.\n\n If `safe` flag is True, the memory is allocated using the `safe` scheme.\n This is used for debugging and testing purposes.\n See `NRT_MemInfo_alloc_safe()` in \"nrt.h\" for details.\n \"\"\"\n if safe:\n mi = _nrt.meminfo_alloc_safe(size)\n else:\n mi = _nrt.meminfo_alloc(size)\n return MemInfo(mi)\n\n def get_allocation_stats(self):\n \"\"\"\n Returns a namedtuple of (alloc, free, mi_alloc, mi_free) for count of\n each memory operations.\n \"\"\"\n return _nrt_mstats(alloc=_nrt.memsys_get_stats_alloc(),\n free=_nrt.memsys_get_stats_free(),\n mi_alloc=_nrt.memsys_get_stats_mi_alloc(),\n mi_free=_nrt.memsys_get_stats_mi_free())\n\n\n# Alias to _nrt_python._MemInfo\nMemInfo = _nrt._MemInfo\n\n# Create uninitialized runtime\nrtsys = _Runtime()\n\n# Install finalizer\n_finalize(rtsys, _Runtime.shutdown)\n\n# Avoid future use of the class\ndel _Runtime\n", "path": "numba/runtime/nrt.py" } ]
[ { "content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"alloc\", \"free\", \"mi_alloc\", \"mi_free\"])\n\n\nclass _Runtime(object):\n def __init__(self):\n self._init = False\n\n def initialize(self, ctx):\n \"\"\"Initializes the NRT\n\n Must be called before any actual call to the NRT API.\n Safe to be called multiple times.\n \"\"\"\n if self._init:\n # Already initialized\n return\n\n # Register globals into the system\n for py_name in _nrt.c_helpers:\n c_name = \"NRT_\" + py_name\n c_address = _nrt.c_helpers[py_name]\n ll.add_symbol(c_name, c_address)\n\n # Compile atomic operations\n self._library = atomicops.compile_nrt_functions(ctx)\n\n self._ptr_inc = self._library.get_pointer_to_function(\"nrt_atomic_add\")\n self._ptr_dec = self._library.get_pointer_to_function(\"nrt_atomic_sub\")\n self._ptr_cas = self._library.get_pointer_to_function(\"nrt_atomic_cas\")\n\n # Install atomic ops to NRT\n _nrt.memsys_set_atomic_inc_dec(self._ptr_inc, self._ptr_dec)\n _nrt.memsys_set_atomic_cas(self._ptr_cas)\n\n self._init = True\n\n @staticmethod\n def shutdown():\n \"\"\"\n Shutdown the NRT\n Safe to be called without calling Runtime.initialize first\n \"\"\"\n _nrt.memsys_shutdown()\n\n @property\n def library(self):\n \"\"\"\n Return the Library object containing the various NRT functions.\n \"\"\"\n return self._library\n\n def meminfo_new(self, data, pyobj):\n \"\"\"\n Returns a MemInfo object that tracks memory at `data` owned by `pyobj`.\n MemInfo will acquire a reference on `pyobj`.\n The release of MemInfo will release a reference on `pyobj`.\n \"\"\"\n mi = _nrt.meminfo_new(data, pyobj)\n return MemInfo(mi)\n\n def meminfo_alloc(self, size, safe=False):\n \"\"\"\n Allocate a new memory of `size` bytes and returns a MemInfo object\n that tracks the allocation. When there is no more reference to the\n MemInfo object, the underlying memory will be deallocated.\n\n If `safe` flag is True, the memory is allocated using the `safe` scheme.\n This is used for debugging and testing purposes.\n See `NRT_MemInfo_alloc_safe()` in \"nrt.h\" for details.\n \"\"\"\n if safe:\n mi = _nrt.meminfo_alloc_safe(size)\n else:\n mi = _nrt.meminfo_alloc(size)\n return MemInfo(mi)\n\n def get_allocation_stats(self):\n \"\"\"\n Returns a namedtuple of (alloc, free, mi_alloc, mi_free) for count of\n each memory operations.\n \"\"\"\n return _nrt_mstats(alloc=_nrt.memsys_get_stats_alloc(),\n free=_nrt.memsys_get_stats_free(),\n mi_alloc=_nrt.memsys_get_stats_mi_alloc(),\n mi_free=_nrt.memsys_get_stats_mi_free())\n\n\n# Alias to _nrt_python._MemInfo\nMemInfo = _nrt._MemInfo\n\n# Create runtime\n_nrt.memsys_use_cpython_allocator()\nrtsys = _Runtime()\n\n# Install finalizer\n_finalize(rtsys, _Runtime.shutdown)\n\n# Avoid future use of the class\ndel _Runtime\n", "path": "numba/runtime/nrt.py" } ]
diff --git a/numba/_pymodule.h b/numba/_pymodule.h index 89787ce4d6d..182817a0aa1 100644 --- a/numba/_pymodule.h +++ b/numba/_pymodule.h @@ -38,8 +38,9 @@ #define Py_uhash_t unsigned long #endif -#if PY_MAJOR_VERSION < 3 || (PY_MAJOR_VERSION == 3 && Py_MINOR_VERSION < 4) +#if PY_MAJOR_VERSION < 3 || (PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION < 4) #define PyMem_RawMalloc malloc + #define PyMem_RawRealloc realloc #define PyMem_RawFree free #endif diff --git a/numba/runtime/_nrt_python.c b/numba/runtime/_nrt_python.c index 218ca957f30..f1b00e73f50 100644 --- a/numba/runtime/_nrt_python.c +++ b/numba/runtime/_nrt_python.c @@ -20,6 +20,14 @@ memsys_shutdown(PyObject *self, PyObject *args) { Py_RETURN_NONE; } +static PyObject * +memsys_use_cpython_allocator(PyObject *self, PyObject *args) { + NRT_MemSys_set_allocator(PyMem_RawMalloc, + PyMem_RawRealloc, + PyMem_RawFree); + Py_RETURN_NONE; +} + static PyObject* memsys_set_atomic_inc_dec(PyObject *self, PyObject *args) { @@ -518,6 +526,7 @@ NRT_decref(MemInfo* mi) { static PyMethodDef ext_methods[] = { #define declmethod(func) { #func , ( PyCFunction )func , METH_VARARGS , NULL } #define declmethod_noargs(func) { #func , ( PyCFunction )func , METH_NOARGS, NULL } + declmethod_noargs(memsys_use_cpython_allocator), declmethod_noargs(memsys_shutdown), declmethod(memsys_set_atomic_inc_dec), declmethod(memsys_set_atomic_cas), diff --git a/numba/runtime/nrt.c b/numba/runtime/nrt.c index f68f80e805a..220b2ec22dd 100644 --- a/numba/runtime/nrt.c +++ b/numba/runtime/nrt.c @@ -22,6 +22,21 @@ struct MemInfo { }; +/* + * Misc helpers. + */ + +static void nrt_fatal_error(const char *msg) +{ + fprintf(stderr, "Fatal Numba error: %s\n", msg); + fflush(stderr); /* it helps in Windows debug build */ + +#if defined(MS_WINDOWS) && defined(_DEBUG) + DebugBreak(); +#endif + abort(); +} + /* * Global resources. 
*/ @@ -35,6 +50,12 @@ struct MemSys{ int shutting; /* Stats */ size_t stats_alloc, stats_free, stats_mi_alloc, stats_mi_free; + /* System allocation functions */ + struct { + NRT_malloc_func malloc; + NRT_realloc_func realloc; + NRT_free_func free; + } allocator; }; /* The Memory System object */ @@ -42,6 +63,10 @@ static MemSys TheMSys; void NRT_MemSys_init(void) { memset(&TheMSys, 0, sizeof(MemSys)); + /* Bind to libc allocator */ + TheMSys.allocator.malloc = malloc; + TheMSys.allocator.realloc = realloc; + TheMSys.allocator.free = free; } void NRT_MemSys_shutdown(void) { @@ -54,6 +79,22 @@ void NRT_MemSys_shutdown(void) { NRT_MemSys_set_atomic_cas_stub(); } +void NRT_MemSys_set_allocator(NRT_malloc_func malloc_func, + NRT_realloc_func realloc_func, + NRT_free_func free_func) +{ + if ((malloc_func != TheMSys.allocator.malloc || + realloc_func != TheMSys.allocator.realloc || + free_func != TheMSys.allocator.free) && + (TheMSys.stats_alloc != TheMSys.stats_free || + TheMSys.stats_mi_alloc != TheMSys.stats_mi_free)) { + nrt_fatal_error("cannot change allocator while blocks are allocated"); + } + TheMSys.allocator.malloc = malloc_func; + TheMSys.allocator.realloc = realloc_func; + TheMSys.allocator.free = free_func; +} + void NRT_MemSys_set_atomic_inc_dec(atomic_inc_dec_func inc, atomic_inc_dec_func dec) { @@ -122,17 +163,6 @@ void NRT_MemSys_set_atomic_cas_stub(void) { NRT_MemSys_set_atomic_cas(nrt_testing_atomic_cas); } -static void nrt_fatal_error(const char *msg) -{ - fprintf(stderr, "Fatal Numba error: %s\n", msg); - fflush(stderr); /* it helps in Windows debug build */ - -#if defined(MS_WINDOWS) && defined(_DEBUG) - DebugBreak(); -#endif - abort(); -} - /* * The MemInfo structure. @@ -328,14 +358,14 @@ void *NRT_MemInfo_varsize_realloc(MemInfo *mi, size_t size) */ void* NRT_Allocate(size_t size) { - void *ptr = malloc(size); + void *ptr = TheMSys.allocator.malloc(size); NRT_Debug(nrt_debug_print("NRT_Allocate bytes=%zu ptr=%p\n", size, ptr)); TheMSys.atomic_inc(&TheMSys.stats_alloc); return ptr; } void *NRT_Reallocate(void *ptr, size_t size) { - void *new_ptr = realloc(ptr, size); + void *new_ptr = TheMSys.allocator.realloc(ptr, size); NRT_Debug(nrt_debug_print("NRT_Reallocate bytes=%zu ptr=%p -> %p\n", size, ptr, new_ptr)); return new_ptr; @@ -343,6 +373,6 @@ void *NRT_Reallocate(void *ptr, size_t size) { void NRT_Free(void *ptr) { NRT_Debug(nrt_debug_print("NRT_Free %p\n", ptr)); - free(ptr); + TheMSys.allocator.free(ptr); TheMSys.atomic_inc(&TheMSys.stats_free); } diff --git a/numba/runtime/nrt.h b/numba/runtime/nrt.h index 5c2cfe5432d..10510ec1acb 100644 --- a/numba/runtime/nrt.h +++ b/numba/runtime/nrt.h @@ -36,6 +36,11 @@ typedef int (*atomic_cas_func)(void * volatile *ptr, void *cmp, void *repl, typedef struct MemInfo MemInfo; typedef struct MemSys MemSys; +typedef void *(*NRT_malloc_func)(size_t size); +typedef void *(*NRT_realloc_func)(void *ptr, size_t new_size); +typedef void (*NRT_free_func)(void *ptr); + + /* Memory System API */ /* Initialize the memory system */ @@ -44,6 +49,11 @@ void NRT_MemSys_init(void); /* Shutdown the memory system */ void NRT_MemSys_shutdown(void); +/* + * Register the system allocation functions + */ +void NRT_MemSys_set_allocator(NRT_malloc_func, NRT_realloc_func, NRT_free_func); + /* * Register the atomic increment and decrement functions */ diff --git a/numba/runtime/nrt.py b/numba/runtime/nrt.py index b337a18f30e..721cf769db4 100644 --- a/numba/runtime/nrt.py +++ b/numba/runtime/nrt.py @@ -98,7 +98,8 @@ def get_allocation_stats(self): # Alias 
to _nrt_python._MemInfo MemInfo = _nrt._MemInfo -# Create uninitialized runtime +# Create runtime +_nrt.memsys_use_cpython_allocator() rtsys = _Runtime() # Install finalizer diff --git a/numba/tests/test_nrt.py b/numba/tests/test_nrt.py index 94f957f793d..6e1b2add502 100644 --- a/numba/tests/test_nrt.py +++ b/numba/tests/test_nrt.py @@ -1,7 +1,13 @@ from __future__ import absolute_import, division, print_function +import math +import os +import sys + import numpy as np + from numba import unittest_support as unittest +from numba import njit from numba.runtime import rtsys from numba.config import PYVERSION from .support import MemoryLeakMixin @@ -164,15 +170,71 @@ def test_buffer(self): # consumed by another thread. [email protected](sys.version_info >= (3, 4), + "need Python 3.4+ for the tracemalloc module") +class TestTracemalloc(unittest.TestCase): + """ + Test NRT-allocated memory can be tracked by tracemalloc. + """ + + def measure_memory_diff(self, func): + import tracemalloc + tracemalloc.start() + try: + before = tracemalloc.take_snapshot() + # Keep the result and only delete it after taking a snapshot + res = func() + after = tracemalloc.take_snapshot() + del res + return after.compare_to(before, 'lineno') + finally: + tracemalloc.stop() + + def test_snapshot(self): + N = 1000000 + dtype = np.int8 + + @njit + def alloc_nrt_memory(): + """ + Allocate and return a large array. + """ + return np.empty(N, dtype) + + def keep_memory(): + return alloc_nrt_memory() + + def release_memory(): + alloc_nrt_memory() + + alloc_lineno = keep_memory.__code__.co_firstlineno + 1 + + # Warmup JIT + alloc_nrt_memory() + + # The large NRT-allocated array should appear topmost in the diff + diff = self.measure_memory_diff(keep_memory) + stat = diff[0] + # There is a slight overhead, so the allocated size won't exactly be N + self.assertGreaterEqual(stat.size, N) + self.assertLess(stat.size, N * 1.01) + frame = stat.traceback[0] + self.assertEqual(os.path.basename(frame.filename), "test_nrt.py") + self.assertEqual(frame.lineno, alloc_lineno) + + # If NRT memory is released before taking a snapshot, it shouldn't + # appear. + diff = self.measure_memory_diff(release_memory) + stat = diff[0] + # Something else appears, but nothing the magnitude of N + self.assertLess(stat.size, N * 0.01) + + class TestNRTIssue(MemoryLeakMixin, unittest.TestCase): def test_issue_with_refct_op_pruning(self): """ GitHub Issue #1244 https://github.com/numba/numba/issues/1244 """ - from numba import njit - import numpy as np - import math - @njit def calculate_2D_vector_mag(vector): x, y = vector
DjangoGirls__djangogirls-785
return paginator.Paginator(self._items(), self.limit) Sentry Issue: [DJANGO-GIRLS-WEBSITE-3V](https://sentry.io/organizations/django-girls/issues/3236790374/?referrer=github_integration) ``` return paginator.Paginator(self._items(), self.limit) ```
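For context on the change this issue led to: `django.contrib.sitemaps.Sitemap` builds its paginator from `self._items()` (the line quoted above), and `django.core.paginator.Paginator` emits `UnorderedObjectListWarning` and can return unstable pages when given an unordered `QuerySet`. The sketch below simply restates the one-line fix from the diff at the end of this record — giving the sitemap's queryset an explicit ordering:

```python
# story/sitemap.py (excerpt) -- ordering the queryset gives the Paginator a
# deterministic object list, so page boundaries do not shift between queries.
from django.contrib.sitemaps import Sitemap
from .models import Story

class BlogSiteMap(Sitemap):
    priority = 0.5

    def items(self):
        return Story.objects.all().order_by('-created')
```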
[ { "content": "from django.contrib.sitemaps import Sitemap\n\nfrom .models import Story\n\n\nclass BlogSiteMap(Sitemap):\n priority = 0.5\n\n def items(self):\n return Story.objects.all()\n\n def location(self, item):\n url = item.post_url\n if url is not None and 'http://' in url:\n return url.replace('http://', '')\n else:\n return url.replace('https://', '')\n\n def lastmod(self, obj):\n return obj.created\n\n def _urls(self, page, protocol, domain):\n return super(BlogSiteMap, self)._urls(\n page=page, protocol='https', domain='')\n", "path": "story/sitemap.py" } ]
[ { "content": "from django.contrib.sitemaps import Sitemap\n\nfrom .models import Story\n\n\nclass BlogSiteMap(Sitemap):\n priority = 0.5\n\n def items(self):\n return Story.objects.all().order_by('-created')\n\n def location(self, item):\n url = item.post_url\n if url is not None and 'http://' in url:\n return url.replace('http://', '')\n else:\n return url.replace('https://', '')\n\n def lastmod(self, obj):\n return obj.created\n\n def _urls(self, page, protocol, domain):\n return super(BlogSiteMap, self)._urls(\n page=page, protocol='https', domain='')\n", "path": "story/sitemap.py" } ]
diff --git a/story/sitemap.py b/story/sitemap.py index 95f2d36b2..44a8eb71f 100644 --- a/story/sitemap.py +++ b/story/sitemap.py @@ -7,7 +7,7 @@ class BlogSiteMap(Sitemap): priority = 0.5 def items(self): - return Story.objects.all() + return Story.objects.all().order_by('-created') def location(self, item): url = item.post_url
mlcommons__GaNDLF-537
Radiology DataLoader takes up a *lot* of memory under certain conditions **Describe the bug** During sanity checking of subjects, the queue construction seems to take up a lot of memory. **To Reproduce** Steps to reproduce the behavior: 1. Have a ridiculous number of subjects on a small machine (e.g., 10k on a machine with 16G RAM) 2. Start training in rad mode 3. See error sometime during/after queue construction: ```bash ## last message Constructing queue for train data: 100%|██████████| 8681/8681 [07:57<00:00, 18.19it/s] ## failure with message related to exceeded RAM usage ``` **Expected behavior** There should not be any failure at this stage. **Screenshots** N.A. **GaNDLF Version** <!-- Put the output of the following command: python -c 'import GANDLF as g;print(g.__version__)' --> 0.0.16-dev **Desktop (please complete the following information):** CentOS 7 **Additional context** N.A.
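As an aside, one way to confirm where the growth happens would be to log resident memory around the per-subject sanity check; `psutil` is already a declared dependency in the `setup.py` shown below. This is a hypothetical diagnostic sketch, not GaNDLF code — `check_subject` and the subject list are placeholders:

```python
# Hypothetical memory probe: print resident set size every so often while
# iterating subjects, to see whether RAM climbs during queue construction.
import os
import psutil

proc = psutil.Process(os.getpid())

def rss_mb():
    return proc.memory_info().rss / (1024.0 * 1024.0)

def check_subject(subject):
    # placeholder for the real per-subject sanity check
    return subject is not None

subjects = list(range(10000))     # stand-in for the parsed subject list
for i, subject in enumerate(subjects):
    check_subject(subject)
    if i % 1000 == 0:
        print("subject %d: %.1f MiB resident" % (i, rss_mb()))
```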
[ { "content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme_file:\n readme = readme_file.read()\n\n\ndef git_submodule_update():\n ## submodule update\n os.system(\"git submodule update --init --recursive\")\n\n\nclass CustomInstallCommand(install):\n def run(self):\n install.run(self)\n git_submodule_update()\n\n\nclass CustomDevelopCommand(develop):\n def run(self):\n develop.run(self)\n git_submodule_update()\n\n\nclass CustomEggInfoCommand(egg_info):\n def run(self):\n egg_info.run(self)\n git_submodule_update()\n\n\n# read version.py\nimport sys, re\n\ntry:\n filepath = \"GANDLF/version.py\"\n version_file = open(filepath)\n (__version__,) = re.findall('__version__ = \"(.*)\"', version_file.read())\n\nexcept Exception as error:\n __version__ = \"0.0.1\"\n sys.stderr.write(\"Warning: Could not open '%s' due %s\\n\" % (filepath, error))\n\nrequirements = [\n \"black\",\n \"numpy==1.22.0\",\n \"scipy\",\n \"SimpleITK!=2.0.*\",\n \"torchvision\",\n \"tqdm\",\n \"torchio==0.18.75\",\n \"pandas\",\n \"scikit-learn>=0.23.2\",\n \"scikit-image>=0.19.1\",\n 'pickle5>=0.0.11; python_version < \"3.8.0\"',\n \"setuptools\",\n \"seaborn\",\n \"pyyaml\",\n \"tiffslide\",\n \"matplotlib\",\n \"requests>=2.25.0\",\n \"pytest\",\n \"coverage\",\n \"pytest-cov\",\n \"psutil\",\n \"medcam\",\n \"opencv-python\",\n \"torchmetrics==0.5.1\", # newer versions have changed api for f1 invocation\n \"OpenPatchMiner==0.1.8\",\n \"zarr==2.10.3\",\n \"pydicom\",\n \"onnx\",\n \"torchinfo==1.7.0\",\n \"segmentation-models-pytorch==0.3.0\",\n \"ACSConv==0.1.1\",\n]\n\n# pytorch doesn't have LTS support on OSX - https://github.com/mlcommons/GaNDLF/issues/389\nif sys.platform == \"darwin\":\n requirements.append(\"torch==1.11.0\")\nelse:\n requirements.append(\"torch==1.11.0\")\n\nsetup(\n name=\"GANDLF\",\n version=__version__,\n author=\"MLCommons\",\n author_email=\"[email protected]\",\n python_requires=\">=3.7\",\n packages=find_packages(),\n cmdclass={ # this ensures git_submodule_update is called during install\n \"install\": CustomInstallCommand,\n \"develop\": CustomDevelopCommand,\n \"egg_info\": CustomEggInfoCommand,\n },\n scripts=[\n \"gandlf_run\",\n \"gandlf_constructCSV\",\n \"gandlf_collectStats\",\n \"gandlf_patchMiner\",\n \"gandlf_preprocess\",\n \"gandlf_anonymizer\",\n \"gandlf_verifyInstall\",\n \"gandlf_configGenerator\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Scientific/Engineering :: Medical Science Apps\",\n ],\n description=(\n \"PyTorch-based framework that handles segmentation/regression/classification using various DL architectures for medical imaging.\"\n ),\n install_requires=requirements,\n license=\"Apache-2.0\",\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n include_package_data=True,\n keywords=\"semantic, segmentation, regression, classification, data-augmentation, medical-imaging, clinical-workflows, 
deep-learning, pytorch\",\n zip_safe=False,\n)\n", "path": "setup.py" } ]
[ { "content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme_file:\n readme = readme_file.read()\n\n\ndef git_submodule_update():\n ## submodule update\n os.system(\"git submodule update --init --recursive\")\n\n\nclass CustomInstallCommand(install):\n def run(self):\n install.run(self)\n git_submodule_update()\n\n\nclass CustomDevelopCommand(develop):\n def run(self):\n develop.run(self)\n git_submodule_update()\n\n\nclass CustomEggInfoCommand(egg_info):\n def run(self):\n egg_info.run(self)\n git_submodule_update()\n\n\n# read version.py\nimport sys, re\n\ntry:\n filepath = \"GANDLF/version.py\"\n version_file = open(filepath)\n (__version__,) = re.findall('__version__ = \"(.*)\"', version_file.read())\n\nexcept Exception as error:\n __version__ = \"0.0.1\"\n sys.stderr.write(\"Warning: Could not open '%s' due %s\\n\" % (filepath, error))\n\nrequirements = [\n \"black\",\n \"numpy==1.22.0\",\n \"scipy\",\n \"SimpleITK!=2.0.*\",\n \"SimpleITK!=2.2.1\", # https://github.com/mlcommons/GaNDLF/issues/536\n \"torchvision\",\n \"tqdm\",\n \"torchio==0.18.75\",\n \"pandas\",\n \"scikit-learn>=0.23.2\",\n \"scikit-image>=0.19.1\",\n 'pickle5>=0.0.11; python_version < \"3.8.0\"',\n \"setuptools\",\n \"seaborn\",\n \"pyyaml\",\n \"tiffslide\",\n \"matplotlib\",\n \"requests>=2.25.0\",\n \"pytest\",\n \"coverage\",\n \"pytest-cov\",\n \"psutil\",\n \"medcam\",\n \"opencv-python\",\n \"torchmetrics==0.5.1\", # newer versions have changed api for f1 invocation\n \"OpenPatchMiner==0.1.8\",\n \"zarr==2.10.3\",\n \"pydicom\",\n \"onnx\",\n \"torchinfo==1.7.0\",\n \"segmentation-models-pytorch==0.3.0\",\n \"ACSConv==0.1.1\",\n]\n\n# pytorch doesn't have LTS support on OSX - https://github.com/mlcommons/GaNDLF/issues/389\nif sys.platform == \"darwin\":\n requirements.append(\"torch==1.11.0\")\nelse:\n requirements.append(\"torch==1.11.0\")\n\nsetup(\n name=\"GANDLF\",\n version=__version__,\n author=\"MLCommons\",\n author_email=\"[email protected]\",\n python_requires=\">=3.7\",\n packages=find_packages(),\n cmdclass={ # this ensures git_submodule_update is called during install\n \"install\": CustomInstallCommand,\n \"develop\": CustomDevelopCommand,\n \"egg_info\": CustomEggInfoCommand,\n },\n scripts=[\n \"gandlf_run\",\n \"gandlf_constructCSV\",\n \"gandlf_collectStats\",\n \"gandlf_patchMiner\",\n \"gandlf_preprocess\",\n \"gandlf_anonymizer\",\n \"gandlf_verifyInstall\",\n \"gandlf_configGenerator\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Scientific/Engineering :: Medical Science Apps\",\n ],\n description=(\n \"PyTorch-based framework that handles segmentation/regression/classification using various DL architectures for medical imaging.\"\n ),\n install_requires=requirements,\n license=\"Apache-2.0\",\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n include_package_data=True,\n keywords=\"semantic, segmentation, regression, 
classification, data-augmentation, medical-imaging, clinical-workflows, deep-learning, pytorch\",\n zip_safe=False,\n)\n", "path": "setup.py" } ]
diff --git a/setup.py b/setup.py index 74d1422d8..46241f78e 100644 --- a/setup.py +++ b/setup.py @@ -53,6 +53,7 @@ def run(self): "numpy==1.22.0", "scipy", "SimpleITK!=2.0.*", + "SimpleITK!=2.2.1", # https://github.com/mlcommons/GaNDLF/issues/536 "torchvision", "tqdm", "torchio==0.18.75",
pex-tool__pex-792
Release 2.0.0 On the docket: + [x] Use pip for resolving and building distributions. #788 + [x] Pex defaults to reproduceable builds. #791 That one issue closes out a slew of others, partially documented in #686.
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.12'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.0'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index ae7de56f8..2b97c531c 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,58 @@ Release Notes ============= +2.0.0 +----- + +Pex 2.0.0 is cut on the advent of a large, mostly internal change for typical +use cases: it now uses vendored pip to perform resolves and wheel builds. This +fixes a large number of compatibility and correctness bugs as well as gaining +feature support from pip including handling manylinux2010 and manylinux2014 as +well as VCS requirements and support for PEP-517 & PEP-518 builds. + +API changes to be wary of: + +* The egg distribution format is no longer supported. +* The deprecated ``--interpreter-cache-dir`` CLI option was removed. +* The ``--cache-ttl`` CLI option and ``cache_ttl`` resolver API argument were + removed. +* The resolver API replaced ``fetchers`` with a list of ``indexes`` and a list + of ``find_links`` repos. +* The resolver API removed (http) ``context`` which is now automatically + handled. +* The resolver API removed ``precedence`` which is now pip default precedence: + wheels when available and not ruled out via the ``--no-wheel`` CLI option or + ``use_wheel=False`` API argument. +* The ``--platform`` CLI option and ``platform`` resolver API argument now must + be full platform strings that include platform, implementation, version and + abi; e.g.: ``--platform=macosx-10.13-x86_64-cp-36-m``. +* The ``--manylinux`` CLI option and ``use_manylinux`` resolver API argument + were removed. Instead, to resolve manylinux wheels for a foreign platform, + specify the manylinux platform to target with an explicit ``--platform`` CLI + flag or ``platform`` resolver API argument; e.g.: + ``--platform=manylinux2010-x86_64-cp-36-m``. + +In addition, Pex 2.0.0 now builds reproduceable pexes by default; ie: + +* Python modules embedded in the pex are not pre-compiled (pass --compile if + you want this). +* The timestamps for Pex file zip entries default to midnight on + January 1, 1980 (pass --use-system-time to change this). + +This finishes off the effort tracked by +`Issue #716 <https://github.com/pantsbuild/pex/pull/718>`_ + +Changes in this release: + +* Pex defaults to reproduceable builds. (#791) + `PR #791 <https://github.com/pantsbuild/pex/pull/791>`_ + +* Use pip for resolving and building distributions. (#788) + `PR #788 <https://github.com/pantsbuild/pex/pull/788>`_ + +* Bias selecting the current interpreter. (#783) + `PR #783 <https://github.com/pantsbuild/pex/pull/783>`_ + 1.6.12 ------ diff --git a/pex/version.py b/pex/version.py index 23734e89b..772804dc4 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '1.6.12' +__version__ = '2.0.0'
pex-tool__pex-1191
Release 2.1.26 On the docket: + [x] Pex requirement parsing is tripped up by files in the CWD with the same name as requirements' project names. #1188
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.25\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.26\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index f63fa42fa..03ecea3f8 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.26 +------ + +This is a hotfix release that fixes requirement parsing when there is a local file in the CWD with +the same name as the project name of a remote requirement to be resolved. + +* Requirement parsing handles local non-dist files. (#1190) + `PR #1190 <https://github.com/pantsbuild/pex/pull/1190>`_ + 2.1.25 ------ diff --git a/pex/version.py b/pex/version.py index d89e91f46..d49dc67b8 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.25" +__version__ = "2.1.26"
pex-tool__pex-1139
Release 2.1.22 On the docket: + [x] Fix `--help-variables` docs. #1113 + [x] pex binary hangs on startup at atomic_directory #1119 + [x] pex vendoring does not (always) isolate itself from a colliding setuptools install in site-packages #1031 + [x] Remove long deprecated support for _pex module. #1135
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.21\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.22\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index c7101cdd1..9a14e88b8 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,76 @@ Release Notes ============= +2.1.22 +------ + +This release fixes a deadlock that could be experienced when building +PEX files in highly concurrent environments in addition to fixing +`pex --help-variables` output. + +A new suite of PEX tools is now available in Pex itself and any PEXes +built with the new `--include-tools` option. Use +`PEX_TOOLS=1 pex --help` to find out more about the available tools and +their usage. + +Finally, the long deprecated exposure of the Pex APIs through `_pex` has +been removed. To use the Pex APIs you must include pex as a dependency +in your PEX file. + +* Add a dependency graph tool. (#1132) + `PR #1132 <https://github.com/pantsbuild/pex/pull/1132>`_ + +* Add a venv tool. (#1128) + `PR #1128 <https://github.com/pantsbuild/pex/pull/1128>`_ + +* Remove long deprecated support for _pex module. (#1135) + `PR #1135 <https://github.com/pantsbuild/pex/pull/1135>`_ + +* Add an interpreter tool. (#1131) + `PR #1131 <https://github.com/pantsbuild/pex/pull/1131>`_ + +* Escape venvs unless PEX_INHERIT_PATH is requested. (#1130) + `PR #1130 <https://github.com/pantsbuild/pex/pull/1130>`_ + +* Improve `PythonInterpreter` venv support. (#1129) + `PR #1129 <https://github.com/pantsbuild/pex/pull/1129>`_ + +* Add support for PEX runtime tools & an info tool. (#1127) + `PR #1127 <https://github.com/pantsbuild/pex/pull/1127>`_ + +* Exclusive atomic_directory always unlocks. (#1126) + `PR #1126 <https://github.com/pantsbuild/pex/pull/1126>`_ + +* Fix `PythonInterpreter` binary normalization. (#1125) + `PR #1125 <https://github.com/pantsbuild/pex/pull/1125>`_ + +* Add a `requires_dists` function. (#1122) + `PR #1122 <https://github.com/pantsbuild/pex/pull/1122>`_ + +* Add an `is_exe` helper. (#1123) + `PR #1123 <https://github.com/pantsbuild/pex/pull/1123>`_ + +* Fix req parsing for local archives & projects. (#1121) + `PR #1121 <https://github.com/pantsbuild/pex/pull/1121>`_ + +* Improve PEXEnvironment constructor ergonomics. (#1120) + `PR #1120 <https://github.com/pantsbuild/pex/pull/1120>`_ + +* Fix `safe_open` for single element relative paths. (#1118) + `PR #1118 <https://github.com/pantsbuild/pex/pull/1118>`_ + +* Add URLFetcher IT. (#1116) + `PR #1116 <https://github.com/pantsbuild/pex/pull/1116>`_ + +* Implement full featured requirment parsing. (#1114) + `PR #1114 <https://github.com/pantsbuild/pex/pull/1114>`_ + +* Fix `--help-variables` docs. (#1113) + `PR #1113 <https://github.com/pantsbuild/pex/pull/1113>`_ + +* Switch from optparse to argparse. (#1083) + `PR #1083 <https://github.com/pantsbuild/pex/pull/1083>`_ + 2.1.21 ------ diff --git a/pex/version.py b/pex/version.py index 8b44b5e28..16163f6f4 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.21" +__version__ = "2.1.22"
pex-tool__pex-777
Release 1.6.12 On the docket: + [x] PythonInterpreter: support python binary names with single letter suffixes #769 + [x] Pex should support some form of verifiably reproducible resolve. #772
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.11'\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.12'\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 6c2163b2d..ae7de56f8 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,18 @@ Release Notes ============= +1.6.12 +------ + +This release adds the `--intransitive` option to support pre-resolved requirements +lists and allows for python binaries built under Gentoo naming conventions. + +* Add an --intransitive option. (#775) + `PR #775 <https://github.com/pantsbuild/pex/pull/775>`_ + +* PythonInterpreter: support python binary names with single letter suffixes (#769) + `PR #769 <https://github.com/pantsbuild/pex/pull/769>`_ + 1.6.11 ------ diff --git a/pex/version.py b/pex/version.py index 255088057..23734e89b 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = '1.6.11' +__version__ = '1.6.12'
pex-tool__pex-1750
Release 2.1.85 On the docket: + [x] PEX interpreters should support all underlying Python interpreter options. #1745
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.84\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.85\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 0edc4617a..3ecf6a075 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,21 @@ Release Notes ============= +2.1.85 +------ + +This PyCon US 2022 release brings full support for Python interpreter +emulation when a PEX is run in interpreter mode (without an entry point +or else when forced via ``PEX_INTERPRETER=1``). + +A special thank you to Loren Arthur for contributing the fix in the +Pantsbuild sprint at PyCon. + +* PEX interpreters should support all underlying Python interpreter options. (#1745) + `Issue #1745 <https://github.com/pantsbuild/pex/issues/1745>`_ + `PR #1746 <https://github.com/pantsbuild/pex/pull/1746>`_ + `PR #1748 <https://github.com/pantsbuild/pex/pull/1748>`_ + 2.1.84 ------ diff --git a/pex/version.py b/pex/version.py index be294c41c..62a251651 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.84" +__version__ = "2.1.85"
pex-tool__pex-1709
Release 2.1.77 On the docket: + [x] Fix pathologic lock creation slowness. #1707 + [x] Support uncompressed PEXes. (#1705)
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.76\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.77\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 1e0fb931f..2f87648a4 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,10 +1,23 @@ Release Notes ============= +2.1.77 +------ + +This release fixes pathologically slow cases of lock creation as well as +introducing support for ``--no-compression`` to allow picking the the +time-space tradeoff you want for your PEX zips. + +* Fix pathologic lock creation slowness. (#1707) + `PR #1707 <https://github.com/pantsbuild/pex/pull/1707>`_ + +* Support uncompressed PEXes. (#1705) + `PR #1705 <https://github.com/pantsbuild/pex/pull/1705>`_ + 2.1.76 ------ -This release finalizes spurious deadlock handling in `--lock` resolves +This release finalizes spurious deadlock handling in ``--lock`` resolves worked around in #1694 in Pex 2.1.75. * Fix lock_resolver to use BSD file locks. (#1702) diff --git a/pex/version.py b/pex/version.py index 75d3da9cc..f8a647e8e 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.76" +__version__ = "2.1.77"
pex-tool__pex-1679
Release 2.1.73 On the docket: + [x] Unexpected distribution hash #1683 + [x] Pex fails to parse wheel tags correctly when resolving from a lock. #1676 + [x] `pex3 lock create --style universal` does not fully patch ambient interpreter properties. #1681
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.72\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.73\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index da8f81916..b2b079c4a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,27 @@ Release Notes ============= +2.1.73 +------ + +This is a hotfix for various PEX issues: + +#. ``--requirements-pex`` handling was broken by #1661 in the 2.1.71 + release and is now fixed. +#. Creating ``universal`` locks now works using any interpreter when the + resolver version is the ``pip-2020-resolver``. +#. Building PEXes with ``--lock`` resolves that contain wheels with + build tags in their names now works. + +* Fix ``--requirements-pex``. (#1684) + `PR #1684 <https://github.com/pantsbuild/pex/pull/1684>`_ + +* Fix universal locks for the ``pip-2020-resolver``. (#1682) + `PR #1682 <https://github.com/pantsbuild/pex/pull/1682>`_ + +* Fix ``--lock`` resolve wheel tag parsing. (#1678) + `PR #1678 <https://github.com/pantsbuild/pex/pull/1678>`_ + 2.1.72 ------ diff --git a/pex/version.py b/pex/version.py index 1cf91c0a8..a1e0ffe02 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.72" +__version__ = "2.1.73"
pex-tool__pex-1275
Release 2.1.34 On the docket: + [x] Allow command-line arguments to be read from a file #1271 + [x] Issue when running a module inside pex file #1018 + [x] Guard against concurrent re-imports. #1270 + [x] Ensure Pip logs to stderr. #1268
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.33\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.34\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 0e5749f8f..b2a99154a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,26 @@ Release Notes ============= +2.1.34 +------ + +Beyond bugfixes for a few important edge cases, this release includes +new support for @argfiles on the command line from @jjhelmus. These +can be useful to overcome command line length limitations. See: +https://docs.python.org/3/library/argparse.html#fromfile-prefix-chars. + +* Allow cli arguments to be specified in a file (#1273) + `PR #1273 <https://github.com/pantsbuild/pex/pull/1273>`_ + +* Fix module entrypoints. (#1274) + `PR #1274 <https://github.com/pantsbuild/pants/pull/1274>`_ + +* Guard against concurrent re-imports. (#1270) + `PR #1270 <https://github.com/pantsbuild/pants/pull/1270>`_ + +* Ensure Pip logs to stderr. (#1268) + `PR #1268 <https://github.com/pantsbuild/pants/pull/1268>`_ + 2.1.33 ------ diff --git a/pex/version.py b/pex/version.py index c20716d59..ee3ef65c4 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.33" +__version__ = "2.1.34"
pex-tool__pex-1450
Release 2.1.50 On the docket: + [x] Fix zipapp layout identification. #1448
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.49\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.50\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 4cf9ec11a..f05e62f6f 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,17 @@ Release Notes ============= +2.1.50 +------ + +This is another hotfix of the 2.1.48 release's ``--layout`` feature that +fixes identification of ``--layout zipapp`` PEXes that have had their +execute mode bit turned off. A notable example is the Pex PEX when +downloaded from https://github.com/pantsbuild/pex/releases. + +* Fix zipapp layout identification. (#1448) + `PR #1448 <https://github.com/pantsbuild/pex/pull/1448>`_ + 2.1.49 ------ diff --git a/pex/version.py b/pex/version.py index 3977ea4eb..fed40855d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.49" +__version__ = "2.1.50"
pex-tool__pex-1148
PexInfo.copy does not copy its collection attributes. The copy method was oversimplified in #1127 and now only copies the dict backing the non-collection attributes of PexInfo.
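A minimal sketch of the bug described in this record, using a simplified `Info` class in place of `PexInfo` (same shape as the real class: a backing dict plus separate collection attributes); the names here are illustrative, not the actual pex API. The pre-fix `copy` passes only the backing dict to the constructor, so collections added after construction are lost, while the post-fix version round-trips through the serialized dict first, as the PR diff below does with `as_json_dict()`.

```python
# Minimal sketch, assuming a simplified Info class standing in for PexInfo:
# the real class keeps requirements/distributions/interpreter_constraints outside
# the backing dict, so copying only that dict silently drops them.
class Info(object):
    def __init__(self, info=None):
        self._info = dict(info) if info else {}
        # collection state lives in its own attribute, not in the backing dict
        self._requirements = set(self._info.get("requirements", []))

    def add_requirement(self, requirement):
        self._requirements.add(requirement)  # only the set is updated

    def as_json_dict(self):
        data = dict(self._info)
        data["requirements"] = sorted(self._requirements)
        return data

    def broken_copy(self):
        return Info(self._info)           # pre-fix behaviour: collections are lost

    def fixed_copy(self):
        return Info(self.as_json_dict())  # post-fix behaviour: serialize first


info = Info()
info.add_requirement("foo==1")
assert info.broken_copy().as_json_dict()["requirements"] == []          # the bug
assert info.fixed_copy().as_json_dict()["requirements"] == ["foo==1"]   # the fix
```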
[ { "content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import absolute_import\n\nimport json\nimport os\n\nfrom pex import pex_warnings\nfrom pex.common import can_write_dir, open_zip, safe_mkdtemp\nfrom pex.compatibility import PY2\nfrom pex.compatibility import string as compatibility_string\nfrom pex.inherit_path import InheritPath\nfrom pex.orderedset import OrderedSet\nfrom pex.typing import TYPE_CHECKING, cast\nfrom pex.variables import ENV, Variables\nfrom pex.version import __version__ as pex_version\n\nif TYPE_CHECKING:\n from pex.interpreter import PythonInterpreter\n\n from typing import Any, Dict, Mapping, Optional, Text, Union\n\n\n# TODO(wickman) Split this into a PexInfoBuilder/PexInfo to ensure immutability.\n# Issue #92.\nclass PexInfo(object):\n \"\"\"PEX metadata.\n\n # Build metadata:\n build_properties: BuildProperties # (key-value information about the build system)\n code_hash: str # sha1 hash of all names/code in the archive\n distributions: {dist_name: str} # map from distribution name (i.e. path in\n # the internal cache) to its cache key (sha1)\n requirements: list # list of requirements for this environment\n\n # Environment options\n pex_root: string # root of all pex-related files eg: ~/.pex\n entry_point: string # entry point into this pex\n script: string # script to execute in this pex environment\n # at most one of script/entry_point can be specified\n zip_safe: bool, default True # is this pex zip safe?\n unzip: bool, default False # should this pex be unzipped and re-executed from there?\n inherit_path: false/fallback/prefer # should this pex inherit site-packages + user site-packages\n # + PYTHONPATH?\n ignore_errors: True, default False # should we ignore inability to resolve dependencies?\n always_write_cache: False # should we always write the internal cache to disk first?\n # this is useful if you have very large dependencies that\n # do not fit in RAM constrained environments\n\n .. versionchanged:: 0.8\n Removed the ``repositories`` and ``indices`` information, as they were never\n implemented.\n \"\"\"\n\n PATH = \"PEX-INFO\"\n INSTALL_CACHE = \"installed_wheels\"\n\n @classmethod\n def make_build_properties(cls, interpreter=None):\n # This lazy import is currently needed for performance reasons. At PEX runtime PexInfo is\n # read in the bootstrap to see if the PEX should run in `--unzip` mode. If so, it must\n # re-exec itself to run against its unzipped contents. 
Since `make_build_properties` is only\n # used at PEX buildtime and the transitive imports of PythonInterpreter are large and slow,\n # we avoid this import cost for runtime-only use.\n #\n # See: https://github.com/pantsbuild/pex/issues/1054\n from pex.interpreter import PythonInterpreter\n\n pi = interpreter or PythonInterpreter.get()\n plat = pi.platform\n platform_name = plat.platform\n return {\n \"pex_version\": pex_version,\n \"class\": pi.identity.interpreter,\n \"version\": pi.identity.version,\n \"platform\": platform_name,\n }\n\n @classmethod\n def default(cls, interpreter=None):\n # type: (Optional[PythonInterpreter]) -> PexInfo\n pex_info = {\n \"requirements\": [],\n \"distributions\": {},\n \"build_properties\": cls.make_build_properties(interpreter),\n }\n return cls(info=pex_info)\n\n @classmethod\n def from_pex(cls, pex):\n # type: (str) -> PexInfo\n if os.path.isfile(pex):\n with open_zip(pex) as zf:\n pex_info = zf.read(cls.PATH)\n else:\n with open(os.path.join(pex, cls.PATH)) as fp:\n pex_info = fp.read()\n return cls.from_json(pex_info)\n\n @classmethod\n def from_json(cls, content):\n # type: (Union[bytes, Text]) -> PexInfo\n if isinstance(content, bytes):\n content = content.decode(\"utf-8\")\n return cls(info=json.loads(content))\n\n @classmethod\n def from_env(cls, env=ENV):\n # type: (Variables) -> PexInfo\n pex_force_local = Variables.PEX_FORCE_LOCAL.strip_default(env)\n zip_safe = None if pex_force_local is None else not pex_force_local\n\n pex_inherit_path = Variables.PEX_INHERIT_PATH.strip_default(env)\n inherit_path = None if pex_inherit_path is None else pex_inherit_path.value\n\n pex_info = {\n \"pex_root\": Variables.PEX_ROOT.strip_default(env),\n \"entry_point\": env.PEX_MODULE,\n \"script\": env.PEX_SCRIPT,\n \"zip_safe\": zip_safe,\n \"unzip\": Variables.PEX_UNZIP.strip_default(env),\n \"inherit_path\": inherit_path,\n \"ignore_errors\": Variables.PEX_IGNORE_ERRORS.strip_default(env),\n \"always_write_cache\": Variables.PEX_ALWAYS_CACHE.strip_default(env),\n }\n # Filter out empty entries not explicitly set in the environment.\n return cls(info=dict((k, v) for (k, v) in pex_info.items() if v is not None))\n\n @classmethod\n def _parse_requirement_tuple(cls, requirement_tuple):\n if isinstance(requirement_tuple, (tuple, list)):\n if len(requirement_tuple) != 3:\n raise ValueError(\"Malformed PEX requirement: %r\" % (requirement_tuple,))\n # pre 0.8.x requirement type:\n pex_warnings.warn(\n \"Attempting to use deprecated PEX feature. 
Please upgrade past PEX 0.8.x.\"\n )\n return requirement_tuple[0]\n elif isinstance(requirement_tuple, compatibility_string):\n return requirement_tuple\n raise ValueError(\"Malformed PEX requirement: %r\" % (requirement_tuple,))\n\n def __init__(self, info=None):\n # type: (Optional[Mapping[str, Any]]) -> None\n \"\"\"Construct a new PexInfo.\n\n This should not be used directly.\n \"\"\"\n\n if info is not None and not isinstance(info, dict):\n raise ValueError(\n \"PexInfo can only be seeded with a dict, got: \" \"%s of type %s\" % (info, type(info))\n )\n self._pex_info = dict(info) if info else {} # type Dict[str, Any]\n self._distributions = self._pex_info.get(\"distributions\", {})\n # cast as set because pex info from json must store interpreter_constraints as a list\n self._interpreter_constraints = set(self._pex_info.get(\"interpreter_constraints\", set()))\n requirements = self._pex_info.get(\"requirements\", [])\n if not isinstance(requirements, (list, tuple)):\n raise ValueError(\"Expected requirements to be a list, got %s\" % type(requirements))\n self._requirements = OrderedSet(self._parse_requirement_tuple(req) for req in requirements)\n\n def _get_safe(self, key):\n if key not in self._pex_info:\n return None\n value = self._pex_info[key]\n return value.encode(\"utf-8\") if PY2 else value\n\n @property\n def build_properties(self):\n \"\"\"Information about the system on which this PEX was generated.\n\n :returns: A dictionary containing metadata about the environment used to build this PEX.\n \"\"\"\n return self._pex_info.get(\"build_properties\", {})\n\n @build_properties.setter\n def build_properties(self, value):\n if not isinstance(value, dict):\n raise TypeError(\"build_properties must be a dictionary!\")\n self._pex_info[\"build_properties\"] = self.make_build_properties()\n self._pex_info[\"build_properties\"].update(value)\n\n @property\n def zip_safe(self):\n \"\"\"Whether or not this PEX should be treated as zip-safe.\n\n If set to false and the PEX is zipped, the contents of the PEX will be unpacked into a\n directory within the PEX_ROOT prior to execution. This allows code and frameworks depending\n upon __file__ existing on disk to operate normally.\n\n By default zip_safe is True. 
May be overridden at runtime by the $PEX_FORCE_LOCAL environment\n variable.\n \"\"\"\n return self._pex_info.get(\"zip_safe\", True)\n\n @zip_safe.setter\n def zip_safe(self, value):\n self._pex_info[\"zip_safe\"] = bool(value)\n\n @property\n def unzip(self):\n \"\"\"Whether or not PEX should be unzipped before it's executed.\n\n Unzipping a PEX is a operation that can be cached on the 1st run of a given PEX file which\n can result in lower startup latency in subsequent runs.\n \"\"\"\n return self._pex_info.get(\"unzip\", False)\n\n @unzip.setter\n def unzip(self, value):\n self._pex_info[\"unzip\"] = bool(value)\n\n @property\n def strip_pex_env(self):\n \"\"\"Whether or not this PEX should strip `PEX_*` env vars before executing its entrypoint.\n\n You might want to set this to `False` if this PEX executes other PEXes or the Pex CLI itself\n and you want the executed PEX to be controlled via PEX environment variables.\n \"\"\"\n return self._pex_info.get(\"strip_pex_env\", True)\n\n @strip_pex_env.setter\n def strip_pex_env(self, value):\n self._pex_info[\"strip_pex_env\"] = bool(value)\n\n @property\n def pex_path(self):\n # type: () -> Optional[str]\n \"\"\"A colon separated list of other pex files to merge into the runtime environment.\n\n This pex info property is used to persist the PEX_PATH environment variable into the pex\n info metadata for reuse within a built pex.\n \"\"\"\n return cast(\"Optional[str]\", self._pex_info.get(\"pex_path\"))\n\n @pex_path.setter\n def pex_path(self, value):\n # type: (str) -> None\n self._pex_info[\"pex_path\"] = value\n\n @property\n def inherit_path(self):\n # type: () -> InheritPath.Value\n \"\"\"Whether or not this PEX should be allowed to inherit system dependencies.\n\n By default, PEX environments are scrubbed of all system distributions prior to execution.\n This means that PEX files cannot rely upon preexisting system libraries.\n\n By default inherit_path is false. This may be overridden at runtime by the $PEX_INHERIT_PATH\n environment variable.\n \"\"\"\n inherit_path = self._pex_info.get(\"inherit_path\")\n return InheritPath.for_value(inherit_path) if inherit_path else InheritPath.FALSE\n\n @inherit_path.setter\n def inherit_path(self, value):\n # type: (InheritPath.Value) -> None\n self._pex_info[\"inherit_path\"] = value.value\n\n @property\n def interpreter_constraints(self):\n \"\"\"A list of constraints that determine the interpreter compatibility for this pex, using\n the Requirement-style format, e.g. 
``'CPython>=3', or just '>=2.7,<3'`` for requirements\n agnostic to interpreter class.\n\n This property will be used at exec time when bootstrapping a pex to search PEX_PYTHON_PATH\n for a list of compatible interpreters.\n \"\"\"\n return list(self._interpreter_constraints)\n\n def add_interpreter_constraint(self, value):\n self._interpreter_constraints.add(str(value))\n\n @property\n def ignore_errors(self):\n return self._pex_info.get(\"ignore_errors\", False)\n\n @ignore_errors.setter\n def ignore_errors(self, value):\n self._pex_info[\"ignore_errors\"] = bool(value)\n\n @property\n def emit_warnings(self):\n return self._pex_info.get(\"emit_warnings\", True)\n\n @emit_warnings.setter\n def emit_warnings(self, value):\n self._pex_info[\"emit_warnings\"] = bool(value)\n\n @property\n def code_hash(self):\n return self._pex_info.get(\"code_hash\")\n\n @code_hash.setter\n def code_hash(self, value):\n self._pex_info[\"code_hash\"] = value\n\n @property\n def entry_point(self):\n return self._get_safe(\"entry_point\")\n\n @entry_point.setter\n def entry_point(self, value):\n self._pex_info[\"entry_point\"] = value\n\n @property\n def script(self):\n return self._get_safe(\"script\")\n\n @script.setter\n def script(self, value):\n self._pex_info[\"script\"] = value\n\n def add_requirement(self, requirement):\n self._requirements.add(str(requirement))\n\n @property\n def requirements(self):\n return self._requirements\n\n def add_distribution(self, location, sha):\n self._distributions[location] = sha\n\n @property\n def distributions(self):\n return self._distributions\n\n @property\n def always_write_cache(self):\n return self._pex_info.get(\"always_write_cache\", False)\n\n @always_write_cache.setter\n def always_write_cache(self, value):\n self._pex_info[\"always_write_cache\"] = bool(value)\n\n @property\n def pex_root(self):\n pex_root = os.path.expanduser(self._pex_info.get(\"pex_root\", os.path.join(\"~\", \".pex\")))\n if not can_write_dir(pex_root):\n tmp_root = safe_mkdtemp()\n pex_warnings.warn(\n \"PEX_ROOT is configured as {pex_root} but that path is un-writeable, \"\n \"falling back to a temporary PEX_ROOT of {tmp_root} which will hurt \"\n \"performance.\".format(pex_root=pex_root, tmp_root=tmp_root)\n )\n pex_root = self._pex_info[\"pex_root\"] = tmp_root\n return pex_root\n\n @pex_root.setter\n def pex_root(self, value):\n if value is None:\n self._pex_info.pop(\"pex_root\", None)\n else:\n self._pex_info[\"pex_root\"] = value\n\n @property\n def internal_cache(self):\n return \".deps\"\n\n @property\n def install_cache(self):\n return os.path.join(self.pex_root, self.INSTALL_CACHE)\n\n @property\n def zip_unsafe_cache(self):\n return os.path.join(self.pex_root, \"code\")\n\n def update(self, other):\n # type: (PexInfo) -> None\n if not isinstance(other, PexInfo):\n raise TypeError(\"Cannot merge a %r with PexInfo\" % type(other))\n self._pex_info.update(other._pex_info)\n self._distributions.update(other.distributions)\n self._interpreter_constraints.update(other.interpreter_constraints)\n self._requirements.update(other.requirements)\n\n def as_json_dict(self):\n # type: () -> Dict[str, Any]\n data = self._pex_info.copy()\n data[\"inherit_path\"] = self.inherit_path.value\n data[\"requirements\"] = list(self._requirements)\n data[\"interpreter_constraints\"] = list(self._interpreter_constraints)\n data[\"distributions\"] = self._distributions.copy()\n return data\n\n def dump(self):\n # type: (...) 
-> str\n data = self.as_json_dict()\n data[\"requirements\"].sort()\n data[\"interpreter_constraints\"].sort()\n return json.dumps(data, sort_keys=True)\n\n def copy(self):\n # type: () -> PexInfo\n return PexInfo(self._pex_info)\n\n @staticmethod\n def _merge_split(*paths):\n filtered_paths = filter(None, paths)\n return [p for p in \":\".join(filtered_paths).split(\":\") if p]\n\n def merge_pex_path(self, pex_path):\n \"\"\"Merges a new PEX_PATH definition into the existing one (if any).\n\n :param str pex_path: The PEX_PATH to merge.\n \"\"\"\n if not pex_path:\n return\n self.pex_path = \":\".join(self._merge_split(self.pex_path, pex_path))\n\n def __repr__(self):\n return \"{}({!r})\".format(type(self).__name__, self._pex_info)\n", "path": "pex/pex_info.py" } ]
[ { "content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import absolute_import\n\nimport json\nimport os\n\nfrom pex import pex_warnings\nfrom pex.common import can_write_dir, open_zip, safe_mkdtemp\nfrom pex.compatibility import PY2\nfrom pex.compatibility import string as compatibility_string\nfrom pex.inherit_path import InheritPath\nfrom pex.orderedset import OrderedSet\nfrom pex.typing import TYPE_CHECKING, cast\nfrom pex.variables import ENV, Variables\nfrom pex.version import __version__ as pex_version\n\nif TYPE_CHECKING:\n from pex.interpreter import PythonInterpreter\n\n from typing import Any, Dict, Mapping, Optional, Text, Union\n\n\n# TODO(wickman) Split this into a PexInfoBuilder/PexInfo to ensure immutability.\n# Issue #92.\nclass PexInfo(object):\n \"\"\"PEX metadata.\n\n # Build metadata:\n build_properties: BuildProperties # (key-value information about the build system)\n code_hash: str # sha1 hash of all names/code in the archive\n distributions: {dist_name: str} # map from distribution name (i.e. path in\n # the internal cache) to its cache key (sha1)\n requirements: list # list of requirements for this environment\n\n # Environment options\n pex_root: string # root of all pex-related files eg: ~/.pex\n entry_point: string # entry point into this pex\n script: string # script to execute in this pex environment\n # at most one of script/entry_point can be specified\n zip_safe: bool, default True # is this pex zip safe?\n unzip: bool, default False # should this pex be unzipped and re-executed from there?\n inherit_path: false/fallback/prefer # should this pex inherit site-packages + user site-packages\n # + PYTHONPATH?\n ignore_errors: True, default False # should we ignore inability to resolve dependencies?\n always_write_cache: False # should we always write the internal cache to disk first?\n # this is useful if you have very large dependencies that\n # do not fit in RAM constrained environments\n\n .. versionchanged:: 0.8\n Removed the ``repositories`` and ``indices`` information, as they were never\n implemented.\n \"\"\"\n\n PATH = \"PEX-INFO\"\n INSTALL_CACHE = \"installed_wheels\"\n\n @classmethod\n def make_build_properties(cls, interpreter=None):\n # This lazy import is currently needed for performance reasons. At PEX runtime PexInfo is\n # read in the bootstrap to see if the PEX should run in `--unzip` mode. If so, it must\n # re-exec itself to run against its unzipped contents. 
Since `make_build_properties` is only\n # used at PEX buildtime and the transitive imports of PythonInterpreter are large and slow,\n # we avoid this import cost for runtime-only use.\n #\n # See: https://github.com/pantsbuild/pex/issues/1054\n from pex.interpreter import PythonInterpreter\n\n pi = interpreter or PythonInterpreter.get()\n plat = pi.platform\n platform_name = plat.platform\n return {\n \"pex_version\": pex_version,\n \"class\": pi.identity.interpreter,\n \"version\": pi.identity.version,\n \"platform\": platform_name,\n }\n\n @classmethod\n def default(cls, interpreter=None):\n # type: (Optional[PythonInterpreter]) -> PexInfo\n pex_info = {\n \"requirements\": [],\n \"distributions\": {},\n \"build_properties\": cls.make_build_properties(interpreter),\n }\n return cls(info=pex_info)\n\n @classmethod\n def from_pex(cls, pex):\n # type: (str) -> PexInfo\n if os.path.isfile(pex):\n with open_zip(pex) as zf:\n pex_info = zf.read(cls.PATH)\n else:\n with open(os.path.join(pex, cls.PATH)) as fp:\n pex_info = fp.read()\n return cls.from_json(pex_info)\n\n @classmethod\n def from_json(cls, content):\n # type: (Union[bytes, Text]) -> PexInfo\n if isinstance(content, bytes):\n content = content.decode(\"utf-8\")\n return cls(info=json.loads(content))\n\n @classmethod\n def from_env(cls, env=ENV):\n # type: (Variables) -> PexInfo\n pex_force_local = Variables.PEX_FORCE_LOCAL.strip_default(env)\n zip_safe = None if pex_force_local is None else not pex_force_local\n\n pex_inherit_path = Variables.PEX_INHERIT_PATH.strip_default(env)\n inherit_path = None if pex_inherit_path is None else pex_inherit_path.value\n\n pex_info = {\n \"pex_root\": Variables.PEX_ROOT.strip_default(env),\n \"entry_point\": env.PEX_MODULE,\n \"script\": env.PEX_SCRIPT,\n \"zip_safe\": zip_safe,\n \"unzip\": Variables.PEX_UNZIP.strip_default(env),\n \"inherit_path\": inherit_path,\n \"ignore_errors\": Variables.PEX_IGNORE_ERRORS.strip_default(env),\n \"always_write_cache\": Variables.PEX_ALWAYS_CACHE.strip_default(env),\n }\n # Filter out empty entries not explicitly set in the environment.\n return cls(info=dict((k, v) for (k, v) in pex_info.items() if v is not None))\n\n @classmethod\n def _parse_requirement_tuple(cls, requirement_tuple):\n if isinstance(requirement_tuple, (tuple, list)):\n if len(requirement_tuple) != 3:\n raise ValueError(\"Malformed PEX requirement: %r\" % (requirement_tuple,))\n # pre 0.8.x requirement type:\n pex_warnings.warn(\n \"Attempting to use deprecated PEX feature. 
Please upgrade past PEX 0.8.x.\"\n )\n return requirement_tuple[0]\n elif isinstance(requirement_tuple, compatibility_string):\n return requirement_tuple\n raise ValueError(\"Malformed PEX requirement: %r\" % (requirement_tuple,))\n\n def __init__(self, info=None):\n # type: (Optional[Mapping[str, Any]]) -> None\n \"\"\"Construct a new PexInfo.\n\n This should not be used directly.\n \"\"\"\n\n if info is not None and not isinstance(info, dict):\n raise ValueError(\n \"PexInfo can only be seeded with a dict, got: \" \"%s of type %s\" % (info, type(info))\n )\n self._pex_info = dict(info) if info else {} # type Dict[str, Any]\n self._distributions = self._pex_info.get(\"distributions\", {})\n # cast as set because pex info from json must store interpreter_constraints as a list\n self._interpreter_constraints = set(self._pex_info.get(\"interpreter_constraints\", set()))\n requirements = self._pex_info.get(\"requirements\", [])\n if not isinstance(requirements, (list, tuple)):\n raise ValueError(\"Expected requirements to be a list, got %s\" % type(requirements))\n self._requirements = OrderedSet(self._parse_requirement_tuple(req) for req in requirements)\n\n def _get_safe(self, key):\n if key not in self._pex_info:\n return None\n value = self._pex_info[key]\n return value.encode(\"utf-8\") if PY2 else value\n\n @property\n def build_properties(self):\n \"\"\"Information about the system on which this PEX was generated.\n\n :returns: A dictionary containing metadata about the environment used to build this PEX.\n \"\"\"\n return self._pex_info.get(\"build_properties\", {})\n\n @build_properties.setter\n def build_properties(self, value):\n if not isinstance(value, dict):\n raise TypeError(\"build_properties must be a dictionary!\")\n self._pex_info[\"build_properties\"] = self.make_build_properties()\n self._pex_info[\"build_properties\"].update(value)\n\n @property\n def zip_safe(self):\n \"\"\"Whether or not this PEX should be treated as zip-safe.\n\n If set to false and the PEX is zipped, the contents of the PEX will be unpacked into a\n directory within the PEX_ROOT prior to execution. This allows code and frameworks depending\n upon __file__ existing on disk to operate normally.\n\n By default zip_safe is True. 
May be overridden at runtime by the $PEX_FORCE_LOCAL environment\n variable.\n \"\"\"\n return self._pex_info.get(\"zip_safe\", True)\n\n @zip_safe.setter\n def zip_safe(self, value):\n self._pex_info[\"zip_safe\"] = bool(value)\n\n @property\n def unzip(self):\n \"\"\"Whether or not PEX should be unzipped before it's executed.\n\n Unzipping a PEX is a operation that can be cached on the 1st run of a given PEX file which\n can result in lower startup latency in subsequent runs.\n \"\"\"\n return self._pex_info.get(\"unzip\", False)\n\n @unzip.setter\n def unzip(self, value):\n self._pex_info[\"unzip\"] = bool(value)\n\n @property\n def strip_pex_env(self):\n \"\"\"Whether or not this PEX should strip `PEX_*` env vars before executing its entrypoint.\n\n You might want to set this to `False` if this PEX executes other PEXes or the Pex CLI itself\n and you want the executed PEX to be controlled via PEX environment variables.\n \"\"\"\n return self._pex_info.get(\"strip_pex_env\", True)\n\n @strip_pex_env.setter\n def strip_pex_env(self, value):\n self._pex_info[\"strip_pex_env\"] = bool(value)\n\n @property\n def pex_path(self):\n # type: () -> Optional[str]\n \"\"\"A colon separated list of other pex files to merge into the runtime environment.\n\n This pex info property is used to persist the PEX_PATH environment variable into the pex\n info metadata for reuse within a built pex.\n \"\"\"\n return cast(\"Optional[str]\", self._pex_info.get(\"pex_path\"))\n\n @pex_path.setter\n def pex_path(self, value):\n # type: (str) -> None\n self._pex_info[\"pex_path\"] = value\n\n @property\n def inherit_path(self):\n # type: () -> InheritPath.Value\n \"\"\"Whether or not this PEX should be allowed to inherit system dependencies.\n\n By default, PEX environments are scrubbed of all system distributions prior to execution.\n This means that PEX files cannot rely upon preexisting system libraries.\n\n By default inherit_path is false. This may be overridden at runtime by the $PEX_INHERIT_PATH\n environment variable.\n \"\"\"\n inherit_path = self._pex_info.get(\"inherit_path\")\n return InheritPath.for_value(inherit_path) if inherit_path else InheritPath.FALSE\n\n @inherit_path.setter\n def inherit_path(self, value):\n # type: (InheritPath.Value) -> None\n self._pex_info[\"inherit_path\"] = value.value\n\n @property\n def interpreter_constraints(self):\n \"\"\"A list of constraints that determine the interpreter compatibility for this pex, using\n the Requirement-style format, e.g. 
``'CPython>=3', or just '>=2.7,<3'`` for requirements\n agnostic to interpreter class.\n\n This property will be used at exec time when bootstrapping a pex to search PEX_PYTHON_PATH\n for a list of compatible interpreters.\n \"\"\"\n return list(self._interpreter_constraints)\n\n def add_interpreter_constraint(self, value):\n self._interpreter_constraints.add(str(value))\n\n @property\n def ignore_errors(self):\n return self._pex_info.get(\"ignore_errors\", False)\n\n @ignore_errors.setter\n def ignore_errors(self, value):\n self._pex_info[\"ignore_errors\"] = bool(value)\n\n @property\n def emit_warnings(self):\n return self._pex_info.get(\"emit_warnings\", True)\n\n @emit_warnings.setter\n def emit_warnings(self, value):\n self._pex_info[\"emit_warnings\"] = bool(value)\n\n @property\n def code_hash(self):\n return self._pex_info.get(\"code_hash\")\n\n @code_hash.setter\n def code_hash(self, value):\n self._pex_info[\"code_hash\"] = value\n\n @property\n def entry_point(self):\n return self._get_safe(\"entry_point\")\n\n @entry_point.setter\n def entry_point(self, value):\n self._pex_info[\"entry_point\"] = value\n\n @property\n def script(self):\n return self._get_safe(\"script\")\n\n @script.setter\n def script(self, value):\n self._pex_info[\"script\"] = value\n\n def add_requirement(self, requirement):\n self._requirements.add(str(requirement))\n\n @property\n def requirements(self):\n return self._requirements\n\n def add_distribution(self, location, sha):\n self._distributions[location] = sha\n\n @property\n def distributions(self):\n return self._distributions\n\n @property\n def always_write_cache(self):\n return self._pex_info.get(\"always_write_cache\", False)\n\n @always_write_cache.setter\n def always_write_cache(self, value):\n self._pex_info[\"always_write_cache\"] = bool(value)\n\n @property\n def pex_root(self):\n pex_root = os.path.expanduser(self._pex_info.get(\"pex_root\", os.path.join(\"~\", \".pex\")))\n if not can_write_dir(pex_root):\n tmp_root = safe_mkdtemp()\n pex_warnings.warn(\n \"PEX_ROOT is configured as {pex_root} but that path is un-writeable, \"\n \"falling back to a temporary PEX_ROOT of {tmp_root} which will hurt \"\n \"performance.\".format(pex_root=pex_root, tmp_root=tmp_root)\n )\n pex_root = self._pex_info[\"pex_root\"] = tmp_root\n return pex_root\n\n @pex_root.setter\n def pex_root(self, value):\n if value is None:\n self._pex_info.pop(\"pex_root\", None)\n else:\n self._pex_info[\"pex_root\"] = value\n\n @property\n def internal_cache(self):\n return \".deps\"\n\n @property\n def install_cache(self):\n return os.path.join(self.pex_root, self.INSTALL_CACHE)\n\n @property\n def zip_unsafe_cache(self):\n return os.path.join(self.pex_root, \"code\")\n\n def update(self, other):\n # type: (PexInfo) -> None\n if not isinstance(other, PexInfo):\n raise TypeError(\"Cannot merge a %r with PexInfo\" % type(other))\n self._pex_info.update(other._pex_info)\n self._distributions.update(other.distributions)\n self._interpreter_constraints.update(other.interpreter_constraints)\n self._requirements.update(other.requirements)\n\n def as_json_dict(self):\n # type: () -> Dict[str, Any]\n data = self._pex_info.copy()\n data[\"inherit_path\"] = self.inherit_path.value\n data[\"requirements\"] = list(self._requirements)\n data[\"interpreter_constraints\"] = list(self._interpreter_constraints)\n data[\"distributions\"] = self._distributions.copy()\n return data\n\n def dump(self):\n # type: (...) 
-> str\n data = self.as_json_dict()\n data[\"requirements\"].sort()\n data[\"interpreter_constraints\"].sort()\n return json.dumps(data, sort_keys=True)\n\n def copy(self):\n # type: () -> PexInfo\n return PexInfo(self.as_json_dict())\n\n @staticmethod\n def _merge_split(*paths):\n filtered_paths = filter(None, paths)\n return [p for p in \":\".join(filtered_paths).split(\":\") if p]\n\n def merge_pex_path(self, pex_path):\n \"\"\"Merges a new PEX_PATH definition into the existing one (if any).\n\n :param str pex_path: The PEX_PATH to merge.\n \"\"\"\n if not pex_path:\n return\n self.pex_path = \":\".join(self._merge_split(self.pex_path, pex_path))\n\n def __repr__(self):\n return \"{}({!r})\".format(type(self).__name__, self._pex_info)\n", "path": "pex/pex_info.py" } ]
diff --git a/pex/pex_info.py b/pex/pex_info.py index 24c2bf524..5a9dcb43c 100644 --- a/pex/pex_info.py +++ b/pex/pex_info.py @@ -394,7 +394,7 @@ def dump(self): def copy(self): # type: () -> PexInfo - return PexInfo(self._pex_info) + return PexInfo(self.as_json_dict()) @staticmethod def _merge_split(*paths): diff --git a/tests/test_pex_info.py b/tests/test_pex_info.py index ee613697f..aea4e23b2 100644 --- a/tests/test_pex_info.py +++ b/tests/test_pex_info.py @@ -7,6 +7,7 @@ import pytest from pex.common import temporary_dir +from pex.inherit_path import InheritPath from pex.orderedset import OrderedSet from pex.pex_info import PexInfo from pex.pex_warnings import PEXWarning @@ -142,3 +143,31 @@ def test_pex_root_set_unwriteable(): assert isinstance(message, PEXWarning) assert pex_root in str(message) assert pex_info.pex_root in str(message) + + +def test_copy(): + # type: () -> None + default_info = PexInfo.default() + default_info_copy = default_info.copy() + assert default_info is not default_info_copy + assert default_info.dump() == default_info_copy.dump() + + info = PexInfo.default() + info.unzip = True + info.code_hash = "foo" + info.inherit_path = InheritPath.FALLBACK + info.add_requirement("bar==1") + info.add_requirement("baz==2") + info.add_distribution("bar.whl", "bar-sha") + info.add_distribution("baz.whl", "baz-sha") + info.add_interpreter_constraint(">=2.7.18") + info.add_interpreter_constraint("CPython==2.7.9") + info_copy = info.copy() + + assert info_copy.unzip is True + assert "foo" == info_copy.code_hash + assert InheritPath.FALLBACK == info_copy.inherit_path + assert OrderedSet(["bar==1", "baz==2"]) == info_copy.requirements + assert {"bar.whl": "bar-sha", "baz.whl": "baz-sha"} == info_copy.distributions + assert {">=2.7.18", "CPython==2.7.9"} == set(info_copy.interpreter_constraints) + assert info.dump() == info_copy.dump()
pex-tool__pex-1377
Release 2.1.43 On the docket: + [x] Support more verbose output for interpreter info. (#1347) + [x] Fix Pex emitting warnings about its Pip PEX venv. (#1351) + [x] Fix execution modes. (#1353) + [x] Warn for PEX env vars unsupported by venv. (#1354) + [x] Do not suppress pex output in bidst_pex (#1358) + [x] Using --platform manylinux2010 includes pyarrow wheel for manylinux2014 #1355 + [x] Fix --no-manylinux. #1365 + [x] Environment markers are incorrectly evaluated for --platform resolves. #1366 + [x] Pex probes wheel metadata incorrectly. #1375
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.42\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.43\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index fd894df7f..713eaf4ec 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,39 @@ Release Notes ============= +2.1.43 +------ + +* Fix dist-info metadata discovery. (#1376) + `PR #1376 <https://github.com/pantsbuild/pex/pull/1376>`_ + +* Fix ``--platform`` resolve handling of env markers. (#1367) + `PR #1367 <https://github.com/pantsbuild/pex/pull/1367>`_ + +* Fix ``--no-manylinux``. (#1365) + `PR #1365 <https://github.com/pantsbuild/pex/pull/1365>`_ + +* Allow ``--platform`` resolves for current interpreter. (#1364) + `PR #1364 <https://github.com/pantsbuild/pex/pull/1364>`_ + +* Do not suppress pex output in bidst_pex (#1358) + `PR #1358 <https://github.com/pantsbuild/pex/pull/1358>`_ + +* Warn for PEX env vars unsupported by venv. (#1354) + `PR #1354 <https://github.com/pantsbuild/pex/pull/1354>`_ + +* Fix execution modes. (#1353) + `PR #1353 <https://github.com/pantsbuild/pex/pull/1353>`_ + +* Fix Pex emitting warnings about its Pip PEX venv. (#1351) + `PR #1351 <https://github.com/pantsbuild/pex/pull/1351>`_ + +* Support more verbose output for interpreter info. (#1347) + `PR #1347 <https://github.com/pantsbuild/pex/pull/1347>`_ + +* Fix typo in recipes.rst (#1342) + `PR #1342 <https://github.com/pantsbuild/pex/pull/1342>`_ + 2.1.42 ------ diff --git a/pex/version.py b/pex/version.py index 5d2cd4dfc..4c1b8de5b 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.42" +__version__ = "2.1.43"
pex-tool__pex-1838
Release 2.1.96 On the docket: + [x] PEX_EXTRA_SYS_PATH propagation can break subprocesses run against other venvs. #1836
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.95\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.96\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index a66e2c6c4..bd4f2b4fe 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,16 @@ Release Notes ============= +2.1.96 +------ + +This is a hotfix release that fixes ``--venv`` mode +``PEX_EXTRA_SYS_PATH`` propogation introduced in Pex 2.1.95 to only +apply to ``sys.executable`` and not other Pythons. + +* Fix ``--venv`` ``PEX PEX_EXTRA_SYS_PATH`` propagation. (#1837) + `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_ + 2.1.95 ------ diff --git a/pex/version.py b/pex/version.py index 40153e391..26a429cb9 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.95" +__version__ = "2.1.96"
pex-tool__pex-1140
Release 2.1.23 On the docket: + [x] Upgrade Pex to Pip 20.3.1. #1133
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.22\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.23\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 9a14e88b8..3e999fa51 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,21 @@ Release Notes ============= +2.1.23 +------ + +This release upgrades Pex to the latest Pip which includes support for +the new 2020-resolver (see: +https://pip.pypa.io/en/stable/user_guide/#resolver-changes-2020) as well +as support for macOS BigSur. Although this release defaults to the +legacy resolver behavior, the next release will deprecate the legacy +resolver and support for the legacy resolver will later be removed to +allow continuing Pip upgrades going forward. To switch to the new +resolver, use: `--resolver-version pip-2020-resolver`. + +* Upgrade Pex to Pip 20.3.1. (#1133) + `PR #1133 <https://github.com/pantsbuild/pex/pull/1133>`_ + 2.1.22 ------ diff --git a/pex/version.py b/pex/version.py index 16163f6f4..10a632ad9 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.22" +__version__ = "2.1.23"
rotki__rotki-2262
Wrong Binance asset mapping for STX ## Problem Definition This was reported by a user via email. They own some STX in Binance. For Rotki, STX is Stox (https://www.coingecko.com/en/coins/stox), but in Binance it's another token, now called Stacks (https://www.coingecko.com/en/coins/stack). We support it in Rotki as Blockstack (STX-2), so we just need to change the Binance mapping. ## Task Fix the Binance mapping.
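A minimal sketch of the mapping fix, trimmed to the relevant entry of rotki's `WORLD_TO_BINANCE` table; the standalone `to_binance` helper is illustrative (in rotki the lookup is the `Asset.to_binance` method shown in the files below, which falls back to the asset's own identifier).

```python
# Minimal sketch of the fix: map rotki's Blockstack identifier (STX-2) to
# Binance's STX symbol, so Stox (rotki's plain STX) is no longer returned
# for Binance balances.
WORLD_TO_BINANCE = {
    # Binance's STX is Blockstack/Stacks, which rotki tracks as STX-2,
    # while rotki's plain STX identifier is Stox.
    'STX-2': 'STX',
}


def to_binance(identifier):
    # Fall back to the identifier itself when no special mapping exists,
    # mirroring WORLD_TO_BINANCE.get(self.identifier, self.identifier).
    return WORLD_TO_BINANCE.get(identifier, identifier)


assert to_binance('STX-2') == 'STX'  # Blockstack now maps to Binance's STX symbol
assert to_binance('BTC') == 'BTC'    # unmapped assets pass through unchanged
```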
[ { "content": "from dataclasses import dataclass, field\nfrom functools import total_ordering\nfrom typing import Any, Optional, Type, TypeVar\n\nfrom rotkehlchen.assets.resolver import AssetResolver\nfrom rotkehlchen.errors import DeserializationError, UnknownAsset, UnsupportedAsset\nfrom rotkehlchen.typing import AssetType, ChecksumEthAddress, EthTokenInfo, Timestamp\n\nWORLD_TO_BITTREX = {\n # In Rotkehlchen Bitswift is BITS-2 but in Bittrex it's BITS\n 'BITS-2': 'BITS',\n # In Rotkehlchen NuBits is USNBT but in Bittrex it's NBT\n 'USNBT': 'NBT',\n # In Rotkehlchen BTM-2 is Bytom but in Bittrex it's BTM\n 'BTM-2': 'BTM',\n # In Rotkehlchen PAI-2 is PCHAIN token but in Bittrex it's PI\n 'PAI-2': 'PI',\n # In Rotkehlchen PLA-2 is Playchip but in Bittrex is PLA\n 'PLA-2': 'PLA',\n # In Rotkehlchen sUSD is Synt USD but in Bittrex it's SUSD\n 'sUSD': 'SUSD',\n # In Rotkehlchen LUNA-2 is Terra Luna but in Bittrex it's LUNA\n 'LUNA-2': 'LUNA',\n # In Rotkehlchen WorldWideAssetExchange is WAX but in Bittrex it's WASP\n 'WAX': 'WAXP',\n # In Rotkehlchen Validity is RADS, the old name but in Bittrex it's VAL\n 'RADS': 'VAL',\n}\n\nWORLD_TO_POLONIEX = {\n # AIR-2 is aircoin for us and AIR is airtoken. Poloniex has only aircoin\n 'AIR-2': 'AIR',\n # Decentr is DEC-2 for us but DEC in Poloniex\n 'DEC-2': 'DEC',\n # Poloniex delisted BCH and listed it as BCHABC after the Bitcoin Cash\n # ABC / SV fork. In Rotkehlchen we consider BCH to be the same as BCHABC\n 'BCH': 'BCHABC',\n # Poloniex has the BCH Fork, Bitcoin Satoshi's vision listed as BCHSV.\n # We know it as BSV\n 'BSV': 'BCHSV',\n # Caishen is known as CAI in Poloniex. This is before the swap to CAIX\n 'CAIX': 'CAI',\n # CCN is Cannacoin in Poloniex but in Rotkehlchen we know it as CCN-2\n 'CCN-2': 'CCN',\n # CCN is CustomContractNetwork in Rotkehlchen but does not exist in Cryptocompare\n # Putting it as conversion to make sure we don't accidentally ask for wrong price\n 'CCN': '',\n 'cUSDT': 'CUSDT',\n # Faircoin is known as FAIR outside of Poloniex. Seems to be the same as the\n # now delisted Poloniex's FAC if you look at the bitcointalk announcement\n # https://bitcointalk.org/index.php?topic=702675.0\n 'FAIR': 'FAC',\n # KeyCoin in Poloniex is KEY but in Rotkehlchen it's KEY-3\n 'KEY-3': 'KEY',\n # Mazacoin in Poloniex is MZC but in Rotkehlchen it's MAZA\n 'MAZA': 'MZC',\n # Myriadcoin in Poloniex is MYR but in Rotkehlchen it's XMY\n 'XMY': 'MYR',\n # NuBits in Poloniex is NBT but in Rotkehlchen it's USNBT\n 'USNBT': 'NBT',\n # Stellar is XLM everywhere, apart from Poloniex\n 'XLM': 'STR',\n # Poloniex still has the old name WC for WhiteCoin\n 'XWC': 'WC',\n # Poloniex uses a different name for 1inch. 
Maybe due to starting with number?\n '1INCH': 'ONEINCH',\n}\n\nWORLD_TO_KRAKEN = {\n 'ATOM': 'ATOM',\n 'ALGO': 'ALGO',\n 'AUD': 'ZAUD',\n 'BAT': 'BAT',\n 'COMP': 'COMP',\n 'DOT': 'DOT',\n 'KAVA': 'KAVA',\n 'KNC': 'KNC',\n 'LINK': 'LINK',\n 'BSV': 'BSV',\n 'ETC': 'XETC',\n 'ETH': 'XETH',\n 'LTC': 'XLTC',\n 'REP': 'XREP',\n 'BTC': 'XXBT',\n 'XMR': 'XXMR',\n 'XRP': 'XXRP',\n 'ZEC': 'XZEC',\n 'EUR': 'ZEUR',\n 'USD': 'ZUSD',\n 'GBP': 'ZGBP',\n 'CAD': 'ZCAD',\n 'JPY': 'ZJPY',\n 'CHF': 'CHF',\n 'KRW': 'ZKRW',\n 'REPV2': 'REPV2',\n 'DAO': 'XDAO',\n 'MLN': 'XMLN',\n 'ICN': 'XICN',\n 'GNO': 'GNO',\n 'BCH': 'BCH',\n 'XLM': 'XXLM',\n 'DASH': 'DASH',\n 'EOS': 'EOS',\n 'USDC': 'USDC',\n 'USDT': 'USDT',\n 'KFEE': 'KFEE',\n 'ADA': 'ADA',\n 'QTUM': 'QTUM',\n 'NMC': 'XNMC',\n 'VEN': 'XXVN',\n 'DOGE': 'XXDG',\n 'DAI': 'DAI',\n 'XTZ': 'XTZ',\n 'WAVES': 'WAVES',\n 'ICX': 'ICX',\n 'NANO': 'NANO',\n 'OMG': 'OMG',\n 'SC': 'SC',\n 'PAXG': 'PAXG',\n 'LSK': 'LSK',\n 'TRX': 'TRX',\n 'OXT': 'OXT',\n 'STORJ': 'STORJ',\n 'BAL': 'BAL',\n 'KSM': 'KSM',\n 'CRV': 'CRV',\n 'SNX': 'SNX',\n 'FIL': 'FIL',\n 'UNI': 'UNI',\n 'YFI': 'YFI',\n 'ANT': 'ANT',\n 'KEEP': 'KEEP',\n 'TBTC': 'TBTC',\n 'ETH2': 'ETH2',\n 'AAVE': 'AAVE',\n 'MANA': 'MANA',\n 'GRT': 'GRT',\n 'FLOW': 'FLOW',\n}\n\nWORLD_TO_BINANCE = {\n # When BCH forked to BCHABC and BCHSV, binance renamed the original to ABC\n 'BCH': 'BCHABC',\n 'BSV': 'BCHSV',\n # ETHOS is known as BQX in Binance\n 'ETHOS': 'BQX',\n # GXChain is GXS in Binance but GXC in Rotkehlchen\n 'GXC': 'GXS',\n # Luna Terra is LUNA-2 in rotki\n 'LUNA-2': 'LUNA',\n # YOYOW is known as YOYO in Binance\n 'YOYOW': 'YOYO',\n # Solana is SOL-2 in rotki\n 'SOL-2': 'SOL',\n # BETH is the eth staked in beacon chain\n 'ETH2': 'BETH',\n}\n\nWORLD_TO_BITFINEX = {\n 'BCH': 'BCHABC',\n 'CNY': 'CNH',\n 'DOGE': 'DOG',\n 'REPV2': 'REP',\n 'TRIO': 'TRI',\n 'ZB': 'ZBT',\n}\n\nWORLD_TO_ICONOMI = {\n # In Rotkehlchen LUNA-2 is Terra Luna but in Bittrex it's LUNA\n 'LUNA-2': 'LUNA',\n}\n\n\n@total_ordering\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass Asset():\n identifier: str\n name: str = field(init=False)\n symbol: str = field(init=False)\n active: bool = field(init=False)\n asset_type: AssetType = field(init=False)\n started: Timestamp = field(init=False)\n ended: Optional[Timestamp] = field(init=False)\n forked: Optional[str] = field(init=False)\n swapped_for: Optional[str] = field(init=False)\n # None means no special mapping. 
'' means not supported\n cryptocompare: Optional[str] = field(init=False)\n coingecko: Optional[str] = field(init=False)\n\n def __post_init__(self) -> None:\n \"\"\"\n Asset post initialization\n\n The only thing that is given to initialize an asset is a string.\n\n If a non string is given then it's probably a deserialization error or\n invalid data were given to us by the server if an API was queried.\n \"\"\"\n if not isinstance(self.identifier, str):\n raise DeserializationError(\n 'Tried to initialize an asset out of a non-string identifier',\n )\n\n canonical_id = AssetResolver().is_identifier_canonical(self.identifier)\n if canonical_id is None:\n raise UnknownAsset(self.identifier)\n # else let's make sure we got the canonical id in our data struct\n object.__setattr__(self, 'identifier', canonical_id)\n\n data = AssetResolver().get_asset_data(self.identifier)\n # Ugly hack to set attributes of a frozen data class as post init\n # https://docs.python.org/3/library/dataclasses.html#frozen-instances\n object.__setattr__(self, 'name', data.name)\n object.__setattr__(self, 'symbol', data.symbol)\n object.__setattr__(self, 'active', data.active)\n object.__setattr__(self, 'asset_type', data.asset_type)\n object.__setattr__(self, 'started', data.started)\n object.__setattr__(self, 'ended', data.ended)\n object.__setattr__(self, 'forked', data.forked)\n object.__setattr__(self, 'swapped_for', data.swapped_for)\n object.__setattr__(self, 'cryptocompare', data.cryptocompare)\n object.__setattr__(self, 'coingecko', data.coingecko)\n\n def serialize(self) -> str:\n return self.identifier\n\n def is_fiat(self) -> bool:\n return self.asset_type == AssetType.FIAT\n\n def is_eth_token(self) -> bool:\n return self.asset_type in (AssetType.ETH_TOKEN, AssetType.ETH_TOKEN_AND_MORE)\n\n def __str__(self) -> str:\n return self.name\n\n def __repr__(self) -> str:\n return f'<Asset identifier:{self.identifier} name:{self.name} symbol:{self.symbol}>'\n\n def to_kraken(self) -> str:\n return WORLD_TO_KRAKEN[self.identifier]\n\n def to_bitfinex(self) -> str:\n return WORLD_TO_BITFINEX.get(self.identifier, self.identifier)\n\n def to_bittrex(self) -> str:\n return WORLD_TO_BITTREX.get(self.identifier, self.identifier)\n\n def to_binance(self) -> str:\n return WORLD_TO_BINANCE.get(self.identifier, self.identifier)\n\n def to_cryptocompare(self) -> str:\n \"\"\"Returns the symbol with which to query cryptocompare for the asset\n\n May raise:\n - UnsupportedAsset() if the asset is not supported by cryptocompare\n \"\"\"\n cryptocompare_str = self.identifier if self.cryptocompare is None else self.cryptocompare\n # There is an asset which should not be queried in cryptocompare\n if cryptocompare_str == '':\n raise UnsupportedAsset(f'{self.identifier} is not supported by cryptocompare')\n\n # Seems cryptocompare capitalizes everything. 
So cDAI -> CDAI\n return cryptocompare_str.upper()\n\n def to_coingecko(self) -> str:\n \"\"\"Returns the symbol with which to query coingecko for the asset\n\n May raise:\n - UnsupportedAsset() if the asset is not supported by coingecko\n \"\"\"\n coingecko_str = self.identifier if self.coingecko is None else self.coingecko\n # There is an asset which should not be queried in cryptocompare\n if coingecko_str == '':\n raise UnsupportedAsset(f'{self.identifier} is not supported by coingecko')\n return coingecko_str\n\n def has_coingecko(self) -> bool:\n return self.coingecko is not None and self.coingecko != ''\n\n def __hash__(self) -> int:\n return hash(self.identifier)\n\n def __eq__(self, other: Any) -> bool:\n if other is None:\n return False\n\n if isinstance(other, Asset):\n return self.identifier == other.identifier\n if isinstance(other, str):\n return self.identifier == other\n # else\n raise ValueError(f'Invalid comparison of asset with {type(other)}')\n\n def __ne__(self, other: Any) -> bool:\n return not self.__eq__(other)\n\n def __lt__(self, other: Any) -> bool:\n if isinstance(other, Asset):\n return self.identifier < other.identifier\n if isinstance(other, str):\n return self.identifier < other\n # else\n raise ValueError(f'Invalid comparison of asset with {type(other)}')\n\n\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass HasEthereumToken(Asset):\n \"\"\" Marker to denote assets having an Ethereum token address \"\"\"\n ethereum_address: ChecksumEthAddress = field(init=False)\n decimals: int = field(init=False)\n\n def __post_init__(self) -> None:\n super().__post_init__()\n data = AssetResolver().get_asset_data(self.identifier) # pylint: disable=no-member\n\n if not data.ethereum_address:\n raise DeserializationError(\n 'Tried to initialize a non Ethereum asset as Ethereum Token',\n )\n\n object.__setattr__(self, 'ethereum_address', data.ethereum_address)\n object.__setattr__(self, 'decimals', data.decimals)\n\n\n# Create a generic variable that can be 'EthereumToken', or any subclass.\nT = TypeVar('T', bound='EthereumToken')\n\n\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass EthereumToken(HasEthereumToken):\n\n def token_info(self) -> EthTokenInfo:\n return EthTokenInfo(\n identifier=self.identifier,\n address=self.ethereum_address,\n symbol=self.symbol,\n name=self.name,\n decimals=self.decimals,\n )\n\n @classmethod\n def from_asset(cls: Type[T], asset: Asset) -> Optional[T]:\n \"\"\"Attempts to turn an asset into an EthereumToken. If it fails returns None\"\"\"\n try:\n return cls(asset.identifier)\n except DeserializationError:\n return None\n", "path": "rotkehlchen/assets/asset.py" } ]
[ { "content": "from dataclasses import dataclass, field\nfrom functools import total_ordering\nfrom typing import Any, Optional, Type, TypeVar\n\nfrom rotkehlchen.assets.resolver import AssetResolver\nfrom rotkehlchen.errors import DeserializationError, UnknownAsset, UnsupportedAsset\nfrom rotkehlchen.typing import AssetType, ChecksumEthAddress, EthTokenInfo, Timestamp\n\nWORLD_TO_BITTREX = {\n # In Rotkehlchen Bitswift is BITS-2 but in Bittrex it's BITS\n 'BITS-2': 'BITS',\n # In Rotkehlchen NuBits is USNBT but in Bittrex it's NBT\n 'USNBT': 'NBT',\n # In Rotkehlchen BTM-2 is Bytom but in Bittrex it's BTM\n 'BTM-2': 'BTM',\n # In Rotkehlchen PAI-2 is PCHAIN token but in Bittrex it's PI\n 'PAI-2': 'PI',\n # In Rotkehlchen PLA-2 is Playchip but in Bittrex is PLA\n 'PLA-2': 'PLA',\n # In Rotkehlchen sUSD is Synt USD but in Bittrex it's SUSD\n 'sUSD': 'SUSD',\n # In Rotkehlchen LUNA-2 is Terra Luna but in Bittrex it's LUNA\n 'LUNA-2': 'LUNA',\n # In Rotkehlchen WorldWideAssetExchange is WAX but in Bittrex it's WASP\n 'WAX': 'WAXP',\n # In Rotkehlchen Validity is RADS, the old name but in Bittrex it's VAL\n 'RADS': 'VAL',\n}\n\nWORLD_TO_POLONIEX = {\n # AIR-2 is aircoin for us and AIR is airtoken. Poloniex has only aircoin\n 'AIR-2': 'AIR',\n # Decentr is DEC-2 for us but DEC in Poloniex\n 'DEC-2': 'DEC',\n # Poloniex delisted BCH and listed it as BCHABC after the Bitcoin Cash\n # ABC / SV fork. In Rotkehlchen we consider BCH to be the same as BCHABC\n 'BCH': 'BCHABC',\n # Poloniex has the BCH Fork, Bitcoin Satoshi's vision listed as BCHSV.\n # We know it as BSV\n 'BSV': 'BCHSV',\n # Caishen is known as CAI in Poloniex. This is before the swap to CAIX\n 'CAIX': 'CAI',\n # CCN is Cannacoin in Poloniex but in Rotkehlchen we know it as CCN-2\n 'CCN-2': 'CCN',\n # CCN is CustomContractNetwork in Rotkehlchen but does not exist in Cryptocompare\n # Putting it as conversion to make sure we don't accidentally ask for wrong price\n 'CCN': '',\n 'cUSDT': 'CUSDT',\n # Faircoin is known as FAIR outside of Poloniex. Seems to be the same as the\n # now delisted Poloniex's FAC if you look at the bitcointalk announcement\n # https://bitcointalk.org/index.php?topic=702675.0\n 'FAIR': 'FAC',\n # KeyCoin in Poloniex is KEY but in Rotkehlchen it's KEY-3\n 'KEY-3': 'KEY',\n # Mazacoin in Poloniex is MZC but in Rotkehlchen it's MAZA\n 'MAZA': 'MZC',\n # Myriadcoin in Poloniex is MYR but in Rotkehlchen it's XMY\n 'XMY': 'MYR',\n # NuBits in Poloniex is NBT but in Rotkehlchen it's USNBT\n 'USNBT': 'NBT',\n # Stellar is XLM everywhere, apart from Poloniex\n 'XLM': 'STR',\n # Poloniex still has the old name WC for WhiteCoin\n 'XWC': 'WC',\n # Poloniex uses a different name for 1inch. 
Maybe due to starting with number?\n '1INCH': 'ONEINCH',\n}\n\nWORLD_TO_KRAKEN = {\n 'ATOM': 'ATOM',\n 'ALGO': 'ALGO',\n 'AUD': 'ZAUD',\n 'BAT': 'BAT',\n 'COMP': 'COMP',\n 'DOT': 'DOT',\n 'KAVA': 'KAVA',\n 'KNC': 'KNC',\n 'LINK': 'LINK',\n 'BSV': 'BSV',\n 'ETC': 'XETC',\n 'ETH': 'XETH',\n 'LTC': 'XLTC',\n 'REP': 'XREP',\n 'BTC': 'XXBT',\n 'XMR': 'XXMR',\n 'XRP': 'XXRP',\n 'ZEC': 'XZEC',\n 'EUR': 'ZEUR',\n 'USD': 'ZUSD',\n 'GBP': 'ZGBP',\n 'CAD': 'ZCAD',\n 'JPY': 'ZJPY',\n 'CHF': 'CHF',\n 'KRW': 'ZKRW',\n 'REPV2': 'REPV2',\n 'DAO': 'XDAO',\n 'MLN': 'XMLN',\n 'ICN': 'XICN',\n 'GNO': 'GNO',\n 'BCH': 'BCH',\n 'XLM': 'XXLM',\n 'DASH': 'DASH',\n 'EOS': 'EOS',\n 'USDC': 'USDC',\n 'USDT': 'USDT',\n 'KFEE': 'KFEE',\n 'ADA': 'ADA',\n 'QTUM': 'QTUM',\n 'NMC': 'XNMC',\n 'VEN': 'XXVN',\n 'DOGE': 'XXDG',\n 'DAI': 'DAI',\n 'XTZ': 'XTZ',\n 'WAVES': 'WAVES',\n 'ICX': 'ICX',\n 'NANO': 'NANO',\n 'OMG': 'OMG',\n 'SC': 'SC',\n 'PAXG': 'PAXG',\n 'LSK': 'LSK',\n 'TRX': 'TRX',\n 'OXT': 'OXT',\n 'STORJ': 'STORJ',\n 'BAL': 'BAL',\n 'KSM': 'KSM',\n 'CRV': 'CRV',\n 'SNX': 'SNX',\n 'FIL': 'FIL',\n 'UNI': 'UNI',\n 'YFI': 'YFI',\n 'ANT': 'ANT',\n 'KEEP': 'KEEP',\n 'TBTC': 'TBTC',\n 'ETH2': 'ETH2',\n 'AAVE': 'AAVE',\n 'MANA': 'MANA',\n 'GRT': 'GRT',\n 'FLOW': 'FLOW',\n}\n\nWORLD_TO_BINANCE = {\n # When BCH forked to BCHABC and BCHSV, binance renamed the original to ABC\n 'BCH': 'BCHABC',\n 'BSV': 'BCHSV',\n # ETHOS is known as BQX in Binance\n 'ETHOS': 'BQX',\n # GXChain is GXS in Binance but GXC in Rotkehlchen\n 'GXC': 'GXS',\n # Luna Terra is LUNA-2 in rotki\n 'LUNA-2': 'LUNA',\n # YOYOW is known as YOYO in Binance\n 'YOYOW': 'YOYO',\n # Solana is SOL-2 in rotki\n 'SOL-2': 'SOL',\n # BETH is the eth staked in beacon chain\n 'ETH2': 'BETH',\n # STX is Blockstack in Binance\n 'STX-2': 'STX',\n}\n\nWORLD_TO_BITFINEX = {\n 'BCH': 'BCHABC',\n 'CNY': 'CNH',\n 'DOGE': 'DOG',\n 'REPV2': 'REP',\n 'TRIO': 'TRI',\n 'ZB': 'ZBT',\n}\n\nWORLD_TO_ICONOMI = {\n # In Rotkehlchen LUNA-2 is Terra Luna but in Bittrex it's LUNA\n 'LUNA-2': 'LUNA',\n}\n\n\n@total_ordering\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass Asset():\n identifier: str\n name: str = field(init=False)\n symbol: str = field(init=False)\n active: bool = field(init=False)\n asset_type: AssetType = field(init=False)\n started: Timestamp = field(init=False)\n ended: Optional[Timestamp] = field(init=False)\n forked: Optional[str] = field(init=False)\n swapped_for: Optional[str] = field(init=False)\n # None means no special mapping. 
'' means not supported\n cryptocompare: Optional[str] = field(init=False)\n coingecko: Optional[str] = field(init=False)\n\n def __post_init__(self) -> None:\n \"\"\"\n Asset post initialization\n\n The only thing that is given to initialize an asset is a string.\n\n If a non string is given then it's probably a deserialization error or\n invalid data were given to us by the server if an API was queried.\n \"\"\"\n if not isinstance(self.identifier, str):\n raise DeserializationError(\n 'Tried to initialize an asset out of a non-string identifier',\n )\n\n canonical_id = AssetResolver().is_identifier_canonical(self.identifier)\n if canonical_id is None:\n raise UnknownAsset(self.identifier)\n # else let's make sure we got the canonical id in our data struct\n object.__setattr__(self, 'identifier', canonical_id)\n\n data = AssetResolver().get_asset_data(self.identifier)\n # Ugly hack to set attributes of a frozen data class as post init\n # https://docs.python.org/3/library/dataclasses.html#frozen-instances\n object.__setattr__(self, 'name', data.name)\n object.__setattr__(self, 'symbol', data.symbol)\n object.__setattr__(self, 'active', data.active)\n object.__setattr__(self, 'asset_type', data.asset_type)\n object.__setattr__(self, 'started', data.started)\n object.__setattr__(self, 'ended', data.ended)\n object.__setattr__(self, 'forked', data.forked)\n object.__setattr__(self, 'swapped_for', data.swapped_for)\n object.__setattr__(self, 'cryptocompare', data.cryptocompare)\n object.__setattr__(self, 'coingecko', data.coingecko)\n\n def serialize(self) -> str:\n return self.identifier\n\n def is_fiat(self) -> bool:\n return self.asset_type == AssetType.FIAT\n\n def is_eth_token(self) -> bool:\n return self.asset_type in (AssetType.ETH_TOKEN, AssetType.ETH_TOKEN_AND_MORE)\n\n def __str__(self) -> str:\n return self.name\n\n def __repr__(self) -> str:\n return f'<Asset identifier:{self.identifier} name:{self.name} symbol:{self.symbol}>'\n\n def to_kraken(self) -> str:\n return WORLD_TO_KRAKEN[self.identifier]\n\n def to_bitfinex(self) -> str:\n return WORLD_TO_BITFINEX.get(self.identifier, self.identifier)\n\n def to_bittrex(self) -> str:\n return WORLD_TO_BITTREX.get(self.identifier, self.identifier)\n\n def to_binance(self) -> str:\n return WORLD_TO_BINANCE.get(self.identifier, self.identifier)\n\n def to_cryptocompare(self) -> str:\n \"\"\"Returns the symbol with which to query cryptocompare for the asset\n\n May raise:\n - UnsupportedAsset() if the asset is not supported by cryptocompare\n \"\"\"\n cryptocompare_str = self.identifier if self.cryptocompare is None else self.cryptocompare\n # There is an asset which should not be queried in cryptocompare\n if cryptocompare_str == '':\n raise UnsupportedAsset(f'{self.identifier} is not supported by cryptocompare')\n\n # Seems cryptocompare capitalizes everything. 
So cDAI -> CDAI\n return cryptocompare_str.upper()\n\n def to_coingecko(self) -> str:\n \"\"\"Returns the symbol with which to query coingecko for the asset\n\n May raise:\n - UnsupportedAsset() if the asset is not supported by coingecko\n \"\"\"\n coingecko_str = self.identifier if self.coingecko is None else self.coingecko\n # There is an asset which should not be queried in cryptocompare\n if coingecko_str == '':\n raise UnsupportedAsset(f'{self.identifier} is not supported by coingecko')\n return coingecko_str\n\n def has_coingecko(self) -> bool:\n return self.coingecko is not None and self.coingecko != ''\n\n def __hash__(self) -> int:\n return hash(self.identifier)\n\n def __eq__(self, other: Any) -> bool:\n if other is None:\n return False\n\n if isinstance(other, Asset):\n return self.identifier == other.identifier\n if isinstance(other, str):\n return self.identifier == other\n # else\n raise ValueError(f'Invalid comparison of asset with {type(other)}')\n\n def __ne__(self, other: Any) -> bool:\n return not self.__eq__(other)\n\n def __lt__(self, other: Any) -> bool:\n if isinstance(other, Asset):\n return self.identifier < other.identifier\n if isinstance(other, str):\n return self.identifier < other\n # else\n raise ValueError(f'Invalid comparison of asset with {type(other)}')\n\n\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass HasEthereumToken(Asset):\n \"\"\" Marker to denote assets having an Ethereum token address \"\"\"\n ethereum_address: ChecksumEthAddress = field(init=False)\n decimals: int = field(init=False)\n\n def __post_init__(self) -> None:\n super().__post_init__()\n data = AssetResolver().get_asset_data(self.identifier) # pylint: disable=no-member\n\n if not data.ethereum_address:\n raise DeserializationError(\n 'Tried to initialize a non Ethereum asset as Ethereum Token',\n )\n\n object.__setattr__(self, 'ethereum_address', data.ethereum_address)\n object.__setattr__(self, 'decimals', data.decimals)\n\n\n# Create a generic variable that can be 'EthereumToken', or any subclass.\nT = TypeVar('T', bound='EthereumToken')\n\n\n@dataclass(init=True, repr=True, eq=False, order=False, unsafe_hash=False, frozen=True)\nclass EthereumToken(HasEthereumToken):\n\n def token_info(self) -> EthTokenInfo:\n return EthTokenInfo(\n identifier=self.identifier,\n address=self.ethereum_address,\n symbol=self.symbol,\n name=self.name,\n decimals=self.decimals,\n )\n\n @classmethod\n def from_asset(cls: Type[T], asset: Asset) -> Optional[T]:\n \"\"\"Attempts to turn an asset into an EthereumToken. If it fails returns None\"\"\"\n try:\n return cls(asset.identifier)\n except DeserializationError:\n return None\n", "path": "rotkehlchen/assets/asset.py" } ]
diff --git a/docs/changelog.rst b/docs/changelog.rst index a5f1e33579..cd069973b1 100755 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -2,6 +2,7 @@ Changelog ========= +* :bug:`2261` Users who had STX in Binance should now see it mapped properly to blockstack and not stox. * :bug:`-` Users will now see the total worth contained in the card for bigger amounts. * :bug:`2239` Amounts in the dashboard should now appear in single line for users. * :bug:`2244` Fix edge case where using a cryptocompare api key could result in the all coins endpoint to error if no cache already existed. diff --git a/rotkehlchen/assets/asset.py b/rotkehlchen/assets/asset.py index 9253a63d18..d76c8f1fda 100644 --- a/rotkehlchen/assets/asset.py +++ b/rotkehlchen/assets/asset.py @@ -154,6 +154,8 @@ 'SOL-2': 'SOL', # BETH is the eth staked in beacon chain 'ETH2': 'BETH', + # STX is Blockstack in Binance + 'STX-2': 'STX', } WORLD_TO_BITFINEX = { diff --git a/rotkehlchen/data/all_assets.json b/rotkehlchen/data/all_assets.json index 886fd48332..5aa997e462 100644 --- a/rotkehlchen/data/all_assets.json +++ b/rotkehlchen/data/all_assets.json @@ -11140,6 +11140,7 @@ }, "STX": { "coingecko": "stox", + "cryptocompare": "STOX", "ethereum_address": "0x006BeA43Baa3f7A6f765F14f10A1a1b08334EF45", "ethereum_token_decimals": 18, "name": "Stox", @@ -11149,7 +11150,7 @@ }, "STX-2": { "coingecko": "blockstack", - "cryptocompare": "BSTX", + "cryptocompare": "STX", "name": "Blockstack", "started": 1572048000, "symbol": "STX", diff --git a/rotkehlchen/data/all_assets.meta b/rotkehlchen/data/all_assets.meta index 79efc18417..79e8195bbd 100644 --- a/rotkehlchen/data/all_assets.meta +++ b/rotkehlchen/data/all_assets.meta @@ -1 +1 @@ -{"md5": "6499628db3b6996da07eb0e19741d8c0", "version": 48} +{"md5": "9d34eb6018b8d6013b4cb3fd8af22edf", "version": 48} diff --git a/rotkehlchen/tests/unit/test_assets.py b/rotkehlchen/tests/unit/test_assets.py index 9594df7ab8..cf76d74938 100644 --- a/rotkehlchen/tests/unit/test_assets.py +++ b/rotkehlchen/tests/unit/test_assets.py @@ -186,7 +186,7 @@ def test_coingecko_identifiers_are_reachable(data_dir): def test_assets_json_meta(): """Test that all_assets.json md5 matches and that if md5 changes since last time then version is also bumped""" - last_meta = {'md5': '6499628db3b6996da07eb0e19741d8c0', 'version': 48} + last_meta = {'md5': '9d34eb6018b8d6013b4cb3fd8af22edf', 'version': 48} data_dir = Path(__file__).resolve().parent.parent.parent / 'data' data_md5 = file_md5(data_dir / 'all_assets.json')
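The STX remapping in the diff above is an instance of the per-exchange symbol-translation pattern used throughout `asset.py`: a dict keyed by rotki identifiers, consulted with `.get()` so that unmapped assets fall through unchanged. A minimal standalone sketch of that pattern (trimmed entries, hypothetical free function — not the actual rotki module):

```python
# Hypothetical, trimmed excerpt of the exchange-mapping pattern shown above.
WORLD_TO_BINANCE = {
    'BCH': 'BCHABC',   # Binance renamed the original chain after the BCH fork
    'STX-2': 'STX',    # Blockstack is STX-2 in rotki but STX on Binance
}

def to_binance(identifier: str) -> str:
    # Assets with no special mapping keep their rotki identifier.
    return WORLD_TO_BINANCE.get(identifier, identifier)

assert to_binance('STX-2') == 'STX'   # remapped
assert to_binance('ETH') == 'ETH'     # passes through unchanged
```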
projectmesa__mesa-1437
v1.1.0 Safford
Release Milestone: https://github.com/projectmesa/mesa/milestone/31

Highlighted changes:
- #1376 > 6x perf speedup for add/remove agent in `ContinuousSpace`
- #1391 correctness fix for `SimultaneousActivation` and `StagedActivation`
- #1399 make `self.running = True` optional. We need to tell existing users that initializing this is no longer necessary, which reduces boilerplate code (see the sketch below)
- #1435 Allow user-specified local dir to be served by Tornado. Needed by Mesa-Geo
- #1413 Allow batch_run to take arbitrary parameters
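For the #1399 item, a minimal sketch of what the change means for model authors (hypothetical model and agent names; assumes Mesa 1.1.0 with the visualization server driving the model):

```python
import mesa

class WalkerAgent(mesa.Agent):      # hypothetical agent
    def step(self):
        pass                        # per-step agent behaviour goes here

class WalkerModel(mesa.Model):      # hypothetical model
    def __init__(self, n_agents=10):
        super().__init__()
        # Before 1.1.0, models driven by the ModularServer also needed:
        #     self.running = True
        # With #1399 the server sets this on every reset, so the line is optional.
        self.schedule = mesa.time.RandomActivation(self)
        for i in range(n_agents):
            self.schedule.add(WalkerAgent(i, self))

    def step(self):
        self.schedule.step()

model = WalkerModel()
model.step()
```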
[ { "content": "\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom mesa.model import Model\nfrom mesa.agent import Agent\n\nimport mesa.time as time\nimport mesa.space as space\nimport mesa.flat.visualization as visualization\nfrom mesa.datacollection import DataCollector\nfrom mesa.batchrunner import batch_run # noqa\n\n__all__ = [\n \"Model\",\n \"Agent\",\n \"time\",\n \"space\",\n \"visualization\",\n \"DataCollector\",\n \"batch_run\",\n]\n\n__title__ = \"mesa\"\n__version__ = \"1.0.0\"\n__license__ = \"Apache 2.0\"\n__copyright__ = f\"Copyright {datetime.date.today().year} Project Mesa Team\"\n", "path": "mesa/__init__.py" } ]
[ { "content": "\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom mesa.model import Model\nfrom mesa.agent import Agent\n\nimport mesa.time as time\nimport mesa.space as space\nimport mesa.flat.visualization as visualization\nfrom mesa.datacollection import DataCollector\nfrom mesa.batchrunner import batch_run # noqa\n\n__all__ = [\n \"Model\",\n \"Agent\",\n \"time\",\n \"space\",\n \"visualization\",\n \"DataCollector\",\n \"batch_run\",\n]\n\n__title__ = \"mesa\"\n__version__ = \"1.1.0\"\n__license__ = \"Apache 2.0\"\n__copyright__ = f\"Copyright {datetime.date.today().year} Project Mesa Team\"\n", "path": "mesa/__init__.py" } ]
diff --git a/HISTORY.rst b/HISTORY.rst index 640c93a0c59..55afb0b1840 100644 --- a/HISTORY.rst +++ b/HISTORY.rst @@ -3,6 +3,59 @@ Release History --------------- +1.1.0 (2022-10-10) Safford +++++++++++++++++++++++++++ + +**Special notes** + +* Perf: ContinuousSpace: speed-up add/remove agents #1376. This is a ~6x performance improvement for add/remove. +* fix: time: Recompute agent_keys between stages #1391. This is a correctness fix for ``SimultaneousActivation`` and ``StagedActivation`` when agents are being removed during simulation. +* ModularServer: Always set model.running = True on reset #1399. With this change, specifying ``self.running = True`` in your model ``__init__`` is now optional. Mesa's visualization server will automatically sets it to ``True`` in the beginning of a simulation. +* feat: Allow user-specified local dir to be served by Tornado #1435. This simplifies the usage of ``ModularServer`` in Mesa-Geo. +* Allow batch_run to take arbitrary parameters #1413. With this change, you can finally use any arbitrary Python objects as ``batch_run`` parameters, where previously they are restricted to hashable objects only. +* Prevent seed and random from being shared between instances #1439. With this fix, a model instance has their own isolated RNG. + +**Improvements** + +* CI Updates + * ci: Cancel previous obsolete runs #1378 + * ci: update black to prevent click error #1382 + * Add "falsy" to .codespellignore #1412 + * Upgrade pre-commit CI (with pyupgrade and syntax checks) #1422 +* Tests + * test: RandomActivationByType: Test adding agents with duplicate ID #1392 +* Dependency updates + * Update Pipfile.lock (dependencies) #1398 + * Update Pipfile.lock (dependencies) #1408 + * Update Pipfile.lock (dependencies) #1434 +* Docs + * docs: Add Tim Pope's guideline for proper Git commit msg #1379 + * readme: Improve the pip install for Git repo instruction #1416 + * Docs: Remove trailing whitespaces #1421 + * Fixes #1423 - fixes build badge in docs #1424 +* Refactors + * refactor: Apply pyupgrade --py37-plus #1429 + * refactor ModularServer (moving code into __init__) #1403 +* Perf: ContinuousSpace: speed-up add/remove agents #1376 +* Remove monospace formatting for hyperlinks #1388 +* ModularServer: Always set model.running = True on reset #1399 +* Allow batch_run to take arbitrary parameters #1413 +* ModularServer: Put new optional arg port last #1432 +* feat: Allow user-specified local dir to be served by Tornado #1435 +* Improve and measure speed of clamp function #1440 + +**Fixes** + +* Fix stray " in modular_template.html #1380 +* Fix zoom on network visualisation #1381 +* Fix broken monospace links #1387 +* fix: Ensure agent id is unique in RandomActivationByType.add #1386 +* fix: time: Recompute agent_keys between stages #1391 +* Fix batchrunner progress bar #1395 +* Fix stray " in visualisation dropdown labels #1409 +* space: Fix type error for Python < 3.9 #1430 +* Prevent seed and random from being shared between instances #1439 + 1.0.0 (2022-07-06) Quartzsite +++++++++++++++++++++++++++++++++++++++++++ diff --git a/mesa/__init__.py b/mesa/__init__.py index 64b3ea6fc0d..b5d5c7a7d03 100644 --- a/mesa/__init__.py +++ b/mesa/__init__.py @@ -26,6 +26,6 @@ ] __title__ = "mesa" -__version__ = "1.0.0" +__version__ = "1.1.0" __license__ = "Apache 2.0" __copyright__ = f"Copyright {datetime.date.today().year} Project Mesa Team"
pex-tool__pex-1057
Release 2.1.17 On the docket: + [x] TypeError when resolving local platforms. #1043 + [x] No such file for interpreter's binary name #1009 + [x] Pex resources leak while bootstrapping pants #1050 + [x] Pex PEX perf regression #1054
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.16\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.17\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index f1e69a75b..118247d4d 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,46 @@ Release Notes ============= +2.1.17 +------ + +This release fixes a bug in ``--resolve-local-platforms`` handling that made it unusable in 2.1.16 +(#1043) as well as fixing a long standing file handle leak (#1050) and a bug when running under +macOS framework builds of Python (#1009). + +* Fix `--unzip` performance regression. (#1056) + `PR #1056 <https://github.com/pantsbuild/pex/pull/1056>`_ + +* Fix resource leak in Pex self-isolation. (#1052) + `PR #1052 <https://github.com/pantsbuild/pex/pull/1052>`_ + +* Fix use of `iter_compatible_interpreters`. (#1048) + `PR #1048 <https://github.com/pantsbuild/pex/pull/1048>`_ + +* Do not rely on `sys.executable` being accurate. (#1049) + `PR #1049 <https://github.com/pantsbuild/pex/pull/1049>`_ + +* slightly demystify the relationship between platforms and interpreters in the library API and CLI (#1047) + `PR #1047 <https://github.com/pantsbuild/pex/pull/1047>`_ + +* Path filter for PythonInterpreter.iter_candidates. (#1046) + `PR #1046 <https://github.com/pantsbuild/pex/pull/1046>`_ + +* Add type hints to `util.py` and `tracer.py` (#1044) + `PR #1044 <https://github.com/pantsbuild/pex/pull/1044>`_ + +* Add type hints to variables.py and platforms.py (#1042) + `PR #1042 <https://github.com/pantsbuild/pex/pull/1042>`_ + +* Add type hints to the remaining tests (#1040) + `PR #1040 <https://github.com/pantsbuild/pex/pull/1040>`_ + +* Add type hints to most tests (#1036) + `PR #1036 <https://github.com/pantsbuild/pex/pull/1036>`_ + +* Use MyPy via type comments (#1032) + `PR #1032 <https://github.com/pantsbuild/pex/pull/1032>`_ + 2.1.16 ------ diff --git a/pex/version.py b/pex/version.py index 5d7a390f0..b8849c9a0 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.16" +__version__ = "2.1.17"
pytorch__ignite-2907
Issue with Enum on Python3.11 ## 🐛 Bug description Importing `ignite.distributed` fails on Python3.11. To reproduce: ```bash python3.11 -m pip install pytorch-ignite python3.11 -c 'import ignite.distributed' ``` I get the following `AttributeError`: ```python Traceback (most recent call last): File "<string>", line 1, in <module> File "/opt/homebrew/lib/python3.11/site-packages/ignite/__init__.py", line 3, in <module> import ignite.engine File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/__init__.py", line 7, in <module> from ignite.engine.deterministic import DeterministicEngine File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/deterministic.py", line 11, in <module> from ignite.engine.engine import Engine File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/engine.py", line 13, in <module> from ignite.engine.events import CallableEventWithFilter, EventEnum, Events, EventsList, RemovableEventHandle, State File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/events.py", line 254, in <module> class Events(EventEnum): File "/opt/homebrew/Cellar/[email protected]/3.11.2_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/enum.py", line 560, in __new__ raise exc File "/opt/homebrew/Cellar/[email protected]/3.11.2_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/enum.py", line 280, in __set_name__ enum_member = enum_class._value2member_map_[value] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^ File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/events.py", line 200, in __hash__ return hash(self._name_) ^^^^^^^^^^^ AttributeError: 'CallableEventWithFilter' object has no attribute '_name_'. Did you mean: 'name'? ``` <!-- A clear and concise description of what the bug is. --> <!-- Please, add steps on how to reproduce it. --> <!-- If you have a code sample, error messages, stack traces, please provide it here as well --> <!-- A clear and concise description of what you expected to happen. --> ## Environment - PyTorch Version: 2.0.0 - Ignite Version: 0.4.11 - OS (e.g., Linux): macOS Ventura 13.2.1 - How you installed Ignite: `pip` - Python version: 3.11
[ { "content": "import numbers\nimport warnings\nimport weakref\nfrom collections.abc import Sequence\nfrom enum import Enum\nfrom types import DynamicClassAttribute\nfrom typing import Any, Callable, Dict, Iterable, Iterator, List, Optional, TYPE_CHECKING, Union\n\nfrom torch.utils.data import DataLoader\n\nfrom ignite.engine.utils import _check_signature\n\nif TYPE_CHECKING:\n from ignite.engine.engine import Engine\n\n__all__ = [\"CallableEventWithFilter\", \"EventEnum\", \"Events\", \"State\", \"EventsList\", \"RemovableEventHandle\"]\n\n\nclass CallableEventWithFilter:\n \"\"\"Single Event containing a filter, specifying whether the event should\n be run at the current event (if the event type is correct)\n\n Args:\n value: The actual enum value. Only needed for internal use. Do not touch!\n event_filter: A function taking the engine and the current event value as input and returning a\n boolean to indicate whether this event should be executed. Defaults to None, which will result to a\n function that always returns `True`\n name: The enum-name of the current object. Only needed for internal use. Do not touch!\n \"\"\"\n\n def __init__(self, value: str, event_filter: Optional[Callable] = None, name: Optional[str] = None) -> None:\n self.filter = event_filter\n\n if not hasattr(self, \"_value_\"):\n self._value_ = value\n\n if not hasattr(self, \"_name_\") and name is not None:\n self._name_ = name\n\n # copied to be compatible to enum\n @DynamicClassAttribute\n def name(self) -> str:\n \"\"\"The name of the Enum member.\"\"\"\n return self._name_\n\n @DynamicClassAttribute\n def value(self) -> str:\n \"\"\"The value of the Enum member.\"\"\"\n return self._value_\n\n def __call__(\n self,\n event_filter: Optional[Callable] = None,\n every: Optional[int] = None,\n once: Optional[Union[int, List]] = None,\n before: Optional[int] = None,\n after: Optional[int] = None,\n ) -> \"CallableEventWithFilter\":\n \"\"\"\n Makes the event class callable and accepts either an arbitrary callable as filter\n (which must take in the engine and current event value and return a boolean) or an every or once value\n\n Args:\n event_filter: a filter function to check if the event should be executed when\n the event type was fired\n every: a value specifying how often the event should be fired\n once: a value or list of values specifying when the event should be fired (if only once)\n before: a value specifying the number of occurrence that event should be fired before\n after: a value specifying the number of occurrence that event should be fired after\n\n Returns:\n CallableEventWithFilter: A new event having the same value but a different filter function\n \"\"\"\n\n if (\n sum(\n (\n event_filter is not None,\n once is not None,\n (every is not None or before is not None or after is not None),\n )\n )\n != 1\n ):\n raise ValueError(\"Only one of the input arguments should be specified, except before, after and every\")\n\n if (event_filter is not None) and not callable(event_filter):\n raise TypeError(\"Argument event_filter should be a callable\")\n\n if (every is not None) and not (isinstance(every, numbers.Integral) and every > 0):\n raise ValueError(\"Argument every should be integer and greater than zero\")\n\n if once is not None:\n c1 = isinstance(once, numbers.Integral) and once > 0\n c2 = isinstance(once, Sequence) and len(once) > 0 and all(isinstance(e, int) and e > 0 for e in once)\n if not (c1 or c2):\n raise ValueError(\n f\"Argument once should either be a positive integer or a list of 
positive integers, got {once}\"\n )\n\n if (before is not None) and not (isinstance(before, numbers.Integral) and before >= 0):\n raise ValueError(\"Argument before should be integer and greater or equal to zero\")\n\n if (after is not None) and not (isinstance(after, numbers.Integral) and after >= 0):\n raise ValueError(\"Argument after should be integer and greater or equal to zero\")\n\n if every is not None:\n if every == 1:\n # Just return the event itself\n event_filter = None\n else:\n event_filter = self.every_event_filter(every)\n\n if once is not None:\n event_filter = self.once_event_filter([once] if isinstance(once, int) else once)\n\n if before is not None or after is not None:\n if every is not None:\n event_filter = self.every_before_and_after_event_filter(every, before, after)\n else:\n event_filter = self.before_and_after_event_filter(before, after)\n\n # check signature:\n if event_filter is not None:\n _check_signature(event_filter, \"event_filter\", \"engine\", \"event\")\n\n return CallableEventWithFilter(self.value, event_filter, self.name)\n\n @staticmethod\n def every_event_filter(every: int) -> Callable:\n \"\"\"A wrapper for every event filter.\"\"\"\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event % every == 0:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def once_event_filter(once: List) -> Callable:\n \"\"\"A wrapper for once event filter.\"\"\"\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event in once:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def before_and_after_event_filter(before: Optional[int] = None, after: Optional[int] = None) -> Callable:\n \"\"\"A wrapper for before and after event filter.\"\"\"\n before_: Union[int, float] = float(\"inf\") if before is None else before\n after_: int = 0 if after is None else after\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event > after_ and event < before_:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def every_before_and_after_event_filter(\n every: int, before: Optional[int] = None, after: Optional[int] = None\n ) -> Callable:\n \"\"\"A wrapper which triggers for every `every` iterations after `after` and before `before`.\"\"\"\n before_: Union[int, float] = float(\"inf\") if before is None else before\n after_: int = 0 if after is None else after\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if after_ < event < before_ and (event - after_ - 1) % every == 0:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def default_event_filter(engine: \"Engine\", event: int) -> bool:\n \"\"\"Default event filter. This method is is deprecated and will be removed. Please, use None instead\"\"\"\n warnings.warn(\"Events.default_event_filter is deprecated and will be removed. Please, use None instead\")\n return True\n\n def __repr__(self) -> str:\n out = f\"Events.{self.name}\"\n if self.filter is not None:\n out += f\"(filter={self.filter})\"\n return out\n\n def __eq__(self, other: Any) -> bool:\n if isinstance(other, CallableEventWithFilter):\n return self.name == other.name\n elif isinstance(other, str):\n return self.name == other\n else:\n return NotImplemented\n\n def __hash__(self) -> int:\n return hash(self._name_)\n\n def __or__(self, other: Any) -> \"EventsList\":\n return EventsList() | self | other\n\n\nclass EventEnum(CallableEventWithFilter, Enum):\n \"\"\"Base class for all :class:`~ignite.engine.events.Events`. 
User defined custom events should also inherit\n this class.\n\n Examples:\n Custom events based on the loss calculation and backward pass can be created as follows:\n\n .. code-block:: python\n\n from ignite.engine import EventEnum\n\n class BackpropEvents(EventEnum):\n BACKWARD_STARTED = 'backward_started'\n BACKWARD_COMPLETED = 'backward_completed'\n OPTIM_STEP_COMPLETED = 'optim_step_completed'\n\n def update(engine, batch):\n # ...\n loss = criterion(y_pred, y)\n engine.fire_event(BackpropEvents.BACKWARD_STARTED)\n loss.backward()\n engine.fire_event(BackpropEvents.BACKWARD_COMPLETED)\n optimizer.step()\n engine.fire_event(BackpropEvents.OPTIM_STEP_COMPLETED)\n # ...\n\n trainer = Engine(update)\n trainer.register_events(*BackpropEvents)\n\n @trainer.on(BackpropEvents.BACKWARD_STARTED)\n def function_before_backprop(engine):\n # ...\n \"\"\"\n\n pass\n\n\nclass Events(EventEnum):\n \"\"\"Events that are fired by the :class:`~ignite.engine.engine.Engine` during execution. Built-in events:\n\n - STARTED : triggered when engine's run is started\n - EPOCH_STARTED : triggered when the epoch is started\n - GET_BATCH_STARTED : triggered before next batch is fetched\n - GET_BATCH_COMPLETED : triggered after the batch is fetched\n - ITERATION_STARTED : triggered when an iteration is started\n - ITERATION_COMPLETED : triggered when the iteration is ended\n\n - DATALOADER_STOP_ITERATION : engine's specific event triggered when dataloader has no more data to provide\n\n - EXCEPTION_RAISED : triggered when an exception is encountered\n - TERMINATE_SINGLE_EPOCH : triggered when the run is about to end the current epoch,\n after receiving a :meth:`~ignite.engine.engine.Engine.terminate_epoch()` or\n :meth:`~ignite.engine.engine.Engine.terminate()` call.\n\n - TERMINATE : triggered when the run is about to end completely,\n after receiving :meth:`~ignite.engine.engine.Engine.terminate()` call.\n\n - EPOCH_COMPLETED : triggered when the epoch is ended. Note that this is triggered even\n when :meth:`~ignite.engine.engine.Engine.terminate_epoch()` is called.\n - COMPLETED : triggered when engine's run is completed\n\n The table below illustrates which events are triggered when various termination methods are called.\n\n .. list-table::\n :widths: 24 25 33 18\n :header-rows: 1\n\n * - Method\n - EVENT_COMPLETED\n - TERMINATE_SINGLE_EPOCH\n - TERMINATE\n * - no termination\n - ✔\n - ✗\n - ✗\n * - :meth:`~ignite.engine.engine.Engine.terminate_epoch()`\n - ✔\n - ✔\n - ✗\n * - :meth:`~ignite.engine.engine.Engine.terminate()`\n - ✗\n - ✔\n - ✔\n\n Since v0.3.0, Events become more flexible and allow to pass an event filter to the Engine:\n\n .. 
code-block:: python\n\n engine = Engine()\n\n # a) custom event filter\n def custom_event_filter(engine, event):\n if event in [1, 2, 5, 10, 50, 100]:\n return True\n return False\n\n @engine.on(Events.ITERATION_STARTED(event_filter=custom_event_filter))\n def call_on_special_event(engine):\n # do something on 1, 2, 5, 10, 50, 100 iterations\n\n # b) \"every\" event filter\n @engine.on(Events.ITERATION_STARTED(every=10))\n def call_every(engine):\n # do something every 10th iteration\n\n # c) \"once\" event filter\n @engine.on(Events.ITERATION_STARTED(once=50))\n def call_once(engine):\n # do something on 50th iteration\n\n # d) \"before\" and \"after\" event filter\n @engine.on(Events.EPOCH_STARTED(before=30, after=10))\n def call_before(engine):\n # do something in 11 to 29 epoch\n\n # e) Mixing \"every\" and \"before\" / \"after\" event filters\n @engine.on(Events.EPOCH_STARTED(every=5, before=25, after=8))\n def call_every_itr_before_after(engine):\n # do something on 9, 14, 19, 24 epochs\n\n Event filter function `event_filter` accepts as input `engine` and `event` and should return True/False.\n Argument `event` is the value of iteration or epoch, depending on which type of Events the function is passed.\n\n Since v0.4.0, user can also combine events with `|`-operator:\n\n .. code-block:: python\n\n events = Events.STARTED | Events.COMPLETED | Events.ITERATION_STARTED(every=3)\n engine = ...\n\n @engine.on(events)\n def call_on_events(engine):\n # do something\n\n Since v0.4.0, custom events defined by user should inherit from :class:`~ignite.engine.events.EventEnum` :\n\n .. code-block:: python\n\n class CustomEvents(EventEnum):\n FOO_EVENT = \"foo_event\"\n BAR_EVENT = \"bar_event\"\n \"\"\"\n\n EPOCH_STARTED = \"epoch_started\"\n \"\"\"triggered when the epoch is started.\"\"\"\n EPOCH_COMPLETED = \"epoch_completed\"\n \"\"\"Event attribute indicating epoch is ended.\"\"\"\n\n STARTED = \"started\"\n \"\"\"triggered when engine's run is started.\"\"\"\n COMPLETED = \"completed\"\n \"\"\"triggered when engine's run is completed\"\"\"\n\n ITERATION_STARTED = \"iteration_started\"\n \"\"\"triggered when an iteration is started.\"\"\"\n ITERATION_COMPLETED = \"iteration_completed\"\n \"\"\"triggered when the iteration is ended.\"\"\"\n EXCEPTION_RAISED = \"exception_raised\"\n \"\"\"triggered when an exception is encountered.\"\"\"\n\n GET_BATCH_STARTED = \"get_batch_started\"\n \"\"\"triggered before next batch is fetched.\"\"\"\n GET_BATCH_COMPLETED = \"get_batch_completed\"\n \"\"\"triggered after the batch is fetched.\"\"\"\n\n DATALOADER_STOP_ITERATION = \"dataloader_stop_iteration\"\n \"\"\"engine's specific event triggered when dataloader has no more data to provide\"\"\"\n TERMINATE = \"terminate\"\n \"\"\"triggered when the run is about to end completely, after receiving terminate() call.\"\"\"\n TERMINATE_SINGLE_EPOCH = \"terminate_single_epoch\"\n \"\"\"triggered when the run is about to end the current epoch,\n after receiving a terminate_epoch() call.\"\"\"\n INTERRUPT = \"interrupt\"\n \"\"\"triggered when the run is interrupted, after receiving interrupt() call.\"\"\"\n\n def __or__(self, other: Any) -> \"EventsList\":\n return EventsList() | self | other\n\n\nclass EventsList:\n \"\"\"Collection of events stacked by operator `__or__`.\n\n .. code-block:: python\n\n events = Events.STARTED | Events.COMPLETED\n events |= Events.ITERATION_STARTED(every=3)\n\n engine = ...\n\n @engine.on(events)\n def call_on_events(engine):\n # do something\n\n or\n\n .. 
code-block:: python\n\n @engine.on(Events.STARTED | Events.COMPLETED | Events.ITERATION_STARTED(every=3))\n def call_on_events(engine):\n # do something\n\n \"\"\"\n\n def __init__(self) -> None:\n self._events: List[Union[Events, CallableEventWithFilter]] = []\n\n def _append(self, event: Union[Events, CallableEventWithFilter]) -> None:\n if not isinstance(event, (Events, CallableEventWithFilter)):\n raise TypeError(f\"Argument event should be Events or CallableEventWithFilter, got: {type(event)}\")\n self._events.append(event)\n\n def __getitem__(self, item: int) -> Union[Events, CallableEventWithFilter]:\n return self._events[item]\n\n def __iter__(self) -> Iterator[Union[Events, CallableEventWithFilter]]:\n return iter(self._events)\n\n def __len__(self) -> int:\n return len(self._events)\n\n def __or__(self, other: Union[Events, CallableEventWithFilter]) -> \"EventsList\":\n self._append(event=other)\n return self\n\n\nclass State:\n \"\"\"An object that is used to pass internal and user-defined state between event handlers. By default, state\n contains the following attributes:\n\n .. code-block:: python\n\n state.iteration # 1-based, the first iteration is 1\n state.epoch # 1-based, the first epoch is 1\n state.seed # seed to set at each epoch\n state.dataloader # data passed to engine\n state.epoch_length # optional length of an epoch\n state.max_epochs # number of epochs to run\n state.max_iters # number of iterations to run\n state.batch # batch passed to `process_function`\n state.output # output of `process_function` after a single iteration\n state.metrics # dictionary with defined metrics if any\n state.times # dictionary with total and per-epoch times fetched on\n # keys: Events.EPOCH_COMPLETED.name and Events.COMPLETED.name\n\n Args:\n kwargs: keyword arguments to be defined as State attributes.\n \"\"\"\n\n event_to_attr: Dict[Union[str, \"Events\", \"CallableEventWithFilter\"], str] = {\n Events.GET_BATCH_STARTED: \"iteration\",\n Events.GET_BATCH_COMPLETED: \"iteration\",\n Events.ITERATION_STARTED: \"iteration\",\n Events.ITERATION_COMPLETED: \"iteration\",\n Events.EPOCH_STARTED: \"epoch\",\n Events.EPOCH_COMPLETED: \"epoch\",\n Events.STARTED: \"epoch\",\n Events.COMPLETED: \"epoch\",\n }\n\n def __init__(self, **kwargs: Any) -> None:\n self.iteration = 0\n self.epoch = 0\n self.epoch_length: Optional[int] = None\n self.max_epochs: Optional[int] = None\n self.max_iters: Optional[int] = None\n self.output: Optional[int] = None\n self.batch: Optional[int] = None\n self.metrics: Dict[str, Any] = {}\n self.dataloader: Optional[Union[DataLoader, Iterable[Any]]] = None\n self.seed: Optional[int] = None\n self.times: Dict[str, Optional[float]] = {\n Events.EPOCH_COMPLETED.name: None,\n Events.COMPLETED.name: None,\n }\n\n for k, v in kwargs.items():\n setattr(self, k, v)\n\n self._update_attrs()\n\n def _update_attrs(self) -> None:\n for value in self.event_to_attr.values():\n if not hasattr(self, value):\n setattr(self, value, 0)\n\n def get_event_attrib_value(self, event_name: Union[str, Events, CallableEventWithFilter]) -> int:\n \"\"\"Get the value of Event attribute with given `event_name`.\"\"\"\n if event_name not in State.event_to_attr:\n raise RuntimeError(f\"Unknown event name '{event_name}'\")\n return getattr(self, State.event_to_attr[event_name])\n\n def __repr__(self) -> str:\n s = \"State:\\n\"\n for attr, value in self.__dict__.items():\n if not isinstance(value, (numbers.Number, str)):\n value = type(value)\n s += f\"\\t{attr}: {value}\\n\"\n return 
s\n\n\nclass RemovableEventHandle:\n \"\"\"A weakref handle to remove a registered event.\n\n A handle that may be used to remove a registered event handler via the\n remove method, with-statement, or context manager protocol. Returned from\n :meth:`~ignite.engine.engine.Engine.add_event_handler`.\n\n\n Args:\n event_name: Registered event name.\n handler: Registered event handler, stored as weakref.\n engine: Target engine, stored as weakref.\n\n Examples:\n .. code-block:: python\n\n engine = Engine()\n\n def print_epoch(engine):\n print(f\"Epoch: {engine.state.epoch}\")\n\n with engine.add_event_handler(Events.EPOCH_COMPLETED, print_epoch):\n # print_epoch handler registered for a single run\n engine.run(data)\n\n # print_epoch handler is now unregistered\n \"\"\"\n\n def __init__(\n self, event_name: Union[CallableEventWithFilter, Enum, EventsList, Events], handler: Callable, engine: \"Engine\"\n ) -> None:\n self.event_name = event_name\n self.handler = weakref.ref(handler)\n self.engine = weakref.ref(engine)\n\n def remove(self) -> None:\n \"\"\"Remove handler from engine.\"\"\"\n handler = self.handler()\n engine = self.engine()\n\n if handler is None or engine is None:\n return\n\n if hasattr(handler, \"_parent\"):\n handler = handler._parent()\n if handler is None:\n raise RuntimeError(\n \"Internal error! Please fill an issue on https://github.com/pytorch/ignite/issues \"\n \"if encounter this error. Thank you!\"\n )\n\n if isinstance(self.event_name, EventsList):\n for e in self.event_name:\n if engine.has_event_handler(handler, e):\n engine.remove_event_handler(handler, e)\n else:\n if engine.has_event_handler(handler, self.event_name):\n engine.remove_event_handler(handler, self.event_name)\n\n def __enter__(self) -> \"RemovableEventHandle\":\n return self\n\n def __exit__(self, *args: Any, **kwargs: Any) -> None:\n self.remove()\n", "path": "ignite/engine/events.py" } ]
[ { "content": "import numbers\nimport warnings\nimport weakref\nfrom collections.abc import Sequence\nfrom enum import Enum\nfrom types import DynamicClassAttribute\nfrom typing import Any, Callable, Dict, Iterable, Iterator, List, Optional, TYPE_CHECKING, Union\n\nfrom torch.utils.data import DataLoader\n\nfrom ignite.engine.utils import _check_signature\n\nif TYPE_CHECKING:\n from ignite.engine.engine import Engine\n\n__all__ = [\"CallableEventWithFilter\", \"EventEnum\", \"Events\", \"State\", \"EventsList\", \"RemovableEventHandle\"]\n\n\nclass CallableEventWithFilter:\n \"\"\"Single Event containing a filter, specifying whether the event should\n be run at the current event (if the event type is correct)\n\n Args:\n value: The actual enum value. Only needed for internal use. Do not touch!\n event_filter: A function taking the engine and the current event value as input and returning a\n boolean to indicate whether this event should be executed. Defaults to None, which will result to a\n function that always returns `True`\n name: The enum-name of the current object. Only needed for internal use. Do not touch!\n \"\"\"\n\n def __init__(self, value: str, event_filter: Optional[Callable] = None, name: Optional[str] = None) -> None:\n self.filter = event_filter\n\n if not hasattr(self, \"_value_\"):\n self._value_ = value\n\n if not hasattr(self, \"_name_\") and name is not None:\n self._name_ = name\n\n # copied to be compatible to enum\n @DynamicClassAttribute\n def name(self) -> str:\n \"\"\"The name of the Enum member.\"\"\"\n return self._name_\n\n @DynamicClassAttribute\n def value(self) -> str:\n \"\"\"The value of the Enum member.\"\"\"\n return self._value_\n\n def __call__(\n self,\n event_filter: Optional[Callable] = None,\n every: Optional[int] = None,\n once: Optional[Union[int, List]] = None,\n before: Optional[int] = None,\n after: Optional[int] = None,\n ) -> \"CallableEventWithFilter\":\n \"\"\"\n Makes the event class callable and accepts either an arbitrary callable as filter\n (which must take in the engine and current event value and return a boolean) or an every or once value\n\n Args:\n event_filter: a filter function to check if the event should be executed when\n the event type was fired\n every: a value specifying how often the event should be fired\n once: a value or list of values specifying when the event should be fired (if only once)\n before: a value specifying the number of occurrence that event should be fired before\n after: a value specifying the number of occurrence that event should be fired after\n\n Returns:\n CallableEventWithFilter: A new event having the same value but a different filter function\n \"\"\"\n\n if (\n sum(\n (\n event_filter is not None,\n once is not None,\n (every is not None or before is not None or after is not None),\n )\n )\n != 1\n ):\n raise ValueError(\"Only one of the input arguments should be specified, except before, after and every\")\n\n if (event_filter is not None) and not callable(event_filter):\n raise TypeError(\"Argument event_filter should be a callable\")\n\n if (every is not None) and not (isinstance(every, numbers.Integral) and every > 0):\n raise ValueError(\"Argument every should be integer and greater than zero\")\n\n if once is not None:\n c1 = isinstance(once, numbers.Integral) and once > 0\n c2 = isinstance(once, Sequence) and len(once) > 0 and all(isinstance(e, int) and e > 0 for e in once)\n if not (c1 or c2):\n raise ValueError(\n f\"Argument once should either be a positive integer or a list of 
positive integers, got {once}\"\n )\n\n if (before is not None) and not (isinstance(before, numbers.Integral) and before >= 0):\n raise ValueError(\"Argument before should be integer and greater or equal to zero\")\n\n if (after is not None) and not (isinstance(after, numbers.Integral) and after >= 0):\n raise ValueError(\"Argument after should be integer and greater or equal to zero\")\n\n if every is not None:\n if every == 1:\n # Just return the event itself\n event_filter = None\n else:\n event_filter = self.every_event_filter(every)\n\n if once is not None:\n event_filter = self.once_event_filter([once] if isinstance(once, int) else once)\n\n if before is not None or after is not None:\n if every is not None:\n event_filter = self.every_before_and_after_event_filter(every, before, after)\n else:\n event_filter = self.before_and_after_event_filter(before, after)\n\n # check signature:\n if event_filter is not None:\n _check_signature(event_filter, \"event_filter\", \"engine\", \"event\")\n\n return CallableEventWithFilter(self.value, event_filter, self.name)\n\n @staticmethod\n def every_event_filter(every: int) -> Callable:\n \"\"\"A wrapper for every event filter.\"\"\"\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event % every == 0:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def once_event_filter(once: List) -> Callable:\n \"\"\"A wrapper for once event filter.\"\"\"\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event in once:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def before_and_after_event_filter(before: Optional[int] = None, after: Optional[int] = None) -> Callable:\n \"\"\"A wrapper for before and after event filter.\"\"\"\n before_: Union[int, float] = float(\"inf\") if before is None else before\n after_: int = 0 if after is None else after\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if event > after_ and event < before_:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def every_before_and_after_event_filter(\n every: int, before: Optional[int] = None, after: Optional[int] = None\n ) -> Callable:\n \"\"\"A wrapper which triggers for every `every` iterations after `after` and before `before`.\"\"\"\n before_: Union[int, float] = float(\"inf\") if before is None else before\n after_: int = 0 if after is None else after\n\n def wrapper(engine: \"Engine\", event: int) -> bool:\n if after_ < event < before_ and (event - after_ - 1) % every == 0:\n return True\n return False\n\n return wrapper\n\n @staticmethod\n def default_event_filter(engine: \"Engine\", event: int) -> bool:\n \"\"\"Default event filter. This method is is deprecated and will be removed. Please, use None instead\"\"\"\n warnings.warn(\"Events.default_event_filter is deprecated and will be removed. Please, use None instead\")\n return True\n\n def __repr__(self) -> str:\n out = f\"Events.{self.name}\"\n if self.filter is not None:\n out += f\"(filter={self.filter})\"\n return out\n\n def __eq__(self, other: Any) -> bool:\n if isinstance(other, CallableEventWithFilter):\n return self.name == other.name\n elif isinstance(other, str):\n return self.name == other\n else:\n return NotImplemented\n\n def __hash__(self) -> int:\n return hash(self._name_)\n\n def __or__(self, other: Any) -> \"EventsList\":\n return EventsList() | self | other\n\n\nclass EventEnum(CallableEventWithFilter, Enum):\n \"\"\"Base class for all :class:`~ignite.engine.events.Events`. 
User defined custom events should also inherit\n this class.\n\n Examples:\n Custom events based on the loss calculation and backward pass can be created as follows:\n\n .. code-block:: python\n\n from ignite.engine import EventEnum\n\n class BackpropEvents(EventEnum):\n BACKWARD_STARTED = 'backward_started'\n BACKWARD_COMPLETED = 'backward_completed'\n OPTIM_STEP_COMPLETED = 'optim_step_completed'\n\n def update(engine, batch):\n # ...\n loss = criterion(y_pred, y)\n engine.fire_event(BackpropEvents.BACKWARD_STARTED)\n loss.backward()\n engine.fire_event(BackpropEvents.BACKWARD_COMPLETED)\n optimizer.step()\n engine.fire_event(BackpropEvents.OPTIM_STEP_COMPLETED)\n # ...\n\n trainer = Engine(update)\n trainer.register_events(*BackpropEvents)\n\n @trainer.on(BackpropEvents.BACKWARD_STARTED)\n def function_before_backprop(engine):\n # ...\n \"\"\"\n\n def __new__(cls, value: str) -> \"EventEnum\":\n obj = CallableEventWithFilter.__new__(cls)\n obj._value_ = value\n return obj\n\n\nclass Events(EventEnum):\n \"\"\"Events that are fired by the :class:`~ignite.engine.engine.Engine` during execution. Built-in events:\n\n - STARTED : triggered when engine's run is started\n - EPOCH_STARTED : triggered when the epoch is started\n - GET_BATCH_STARTED : triggered before next batch is fetched\n - GET_BATCH_COMPLETED : triggered after the batch is fetched\n - ITERATION_STARTED : triggered when an iteration is started\n - ITERATION_COMPLETED : triggered when the iteration is ended\n\n - DATALOADER_STOP_ITERATION : engine's specific event triggered when dataloader has no more data to provide\n\n - EXCEPTION_RAISED : triggered when an exception is encountered\n - TERMINATE_SINGLE_EPOCH : triggered when the run is about to end the current epoch,\n after receiving a :meth:`~ignite.engine.engine.Engine.terminate_epoch()` or\n :meth:`~ignite.engine.engine.Engine.terminate()` call.\n\n - TERMINATE : triggered when the run is about to end completely,\n after receiving :meth:`~ignite.engine.engine.Engine.terminate()` call.\n\n - EPOCH_COMPLETED : triggered when the epoch is ended. Note that this is triggered even\n when :meth:`~ignite.engine.engine.Engine.terminate_epoch()` is called.\n - COMPLETED : triggered when engine's run is completed\n\n The table below illustrates which events are triggered when various termination methods are called.\n\n .. list-table::\n :widths: 24 25 33 18\n :header-rows: 1\n\n * - Method\n - EVENT_COMPLETED\n - TERMINATE_SINGLE_EPOCH\n - TERMINATE\n * - no termination\n - ✔\n - ✗\n - ✗\n * - :meth:`~ignite.engine.engine.Engine.terminate_epoch()`\n - ✔\n - ✔\n - ✗\n * - :meth:`~ignite.engine.engine.Engine.terminate()`\n - ✗\n - ✔\n - ✔\n\n Since v0.3.0, Events become more flexible and allow to pass an event filter to the Engine:\n\n .. 
code-block:: python\n\n engine = Engine()\n\n # a) custom event filter\n def custom_event_filter(engine, event):\n if event in [1, 2, 5, 10, 50, 100]:\n return True\n return False\n\n @engine.on(Events.ITERATION_STARTED(event_filter=custom_event_filter))\n def call_on_special_event(engine):\n # do something on 1, 2, 5, 10, 50, 100 iterations\n\n # b) \"every\" event filter\n @engine.on(Events.ITERATION_STARTED(every=10))\n def call_every(engine):\n # do something every 10th iteration\n\n # c) \"once\" event filter\n @engine.on(Events.ITERATION_STARTED(once=50))\n def call_once(engine):\n # do something on 50th iteration\n\n # d) \"before\" and \"after\" event filter\n @engine.on(Events.EPOCH_STARTED(before=30, after=10))\n def call_before(engine):\n # do something in 11 to 29 epoch\n\n # e) Mixing \"every\" and \"before\" / \"after\" event filters\n @engine.on(Events.EPOCH_STARTED(every=5, before=25, after=8))\n def call_every_itr_before_after(engine):\n # do something on 9, 14, 19, 24 epochs\n\n Event filter function `event_filter` accepts as input `engine` and `event` and should return True/False.\n Argument `event` is the value of iteration or epoch, depending on which type of Events the function is passed.\n\n Since v0.4.0, user can also combine events with `|`-operator:\n\n .. code-block:: python\n\n events = Events.STARTED | Events.COMPLETED | Events.ITERATION_STARTED(every=3)\n engine = ...\n\n @engine.on(events)\n def call_on_events(engine):\n # do something\n\n Since v0.4.0, custom events defined by user should inherit from :class:`~ignite.engine.events.EventEnum` :\n\n .. code-block:: python\n\n class CustomEvents(EventEnum):\n FOO_EVENT = \"foo_event\"\n BAR_EVENT = \"bar_event\"\n \"\"\"\n\n EPOCH_STARTED = \"epoch_started\"\n \"\"\"triggered when the epoch is started.\"\"\"\n EPOCH_COMPLETED = \"epoch_completed\"\n \"\"\"Event attribute indicating epoch is ended.\"\"\"\n\n STARTED = \"started\"\n \"\"\"triggered when engine's run is started.\"\"\"\n COMPLETED = \"completed\"\n \"\"\"triggered when engine's run is completed\"\"\"\n\n ITERATION_STARTED = \"iteration_started\"\n \"\"\"triggered when an iteration is started.\"\"\"\n ITERATION_COMPLETED = \"iteration_completed\"\n \"\"\"triggered when the iteration is ended.\"\"\"\n EXCEPTION_RAISED = \"exception_raised\"\n \"\"\"triggered when an exception is encountered.\"\"\"\n\n GET_BATCH_STARTED = \"get_batch_started\"\n \"\"\"triggered before next batch is fetched.\"\"\"\n GET_BATCH_COMPLETED = \"get_batch_completed\"\n \"\"\"triggered after the batch is fetched.\"\"\"\n\n DATALOADER_STOP_ITERATION = \"dataloader_stop_iteration\"\n \"\"\"engine's specific event triggered when dataloader has no more data to provide\"\"\"\n TERMINATE = \"terminate\"\n \"\"\"triggered when the run is about to end completely, after receiving terminate() call.\"\"\"\n TERMINATE_SINGLE_EPOCH = \"terminate_single_epoch\"\n \"\"\"triggered when the run is about to end the current epoch,\n after receiving a terminate_epoch() call.\"\"\"\n INTERRUPT = \"interrupt\"\n \"\"\"triggered when the run is interrupted, after receiving interrupt() call.\"\"\"\n\n def __or__(self, other: Any) -> \"EventsList\":\n return EventsList() | self | other\n\n\nclass EventsList:\n \"\"\"Collection of events stacked by operator `__or__`.\n\n .. code-block:: python\n\n events = Events.STARTED | Events.COMPLETED\n events |= Events.ITERATION_STARTED(every=3)\n\n engine = ...\n\n @engine.on(events)\n def call_on_events(engine):\n # do something\n\n or\n\n .. 
code-block:: python\n\n @engine.on(Events.STARTED | Events.COMPLETED | Events.ITERATION_STARTED(every=3))\n def call_on_events(engine):\n # do something\n\n \"\"\"\n\n def __init__(self) -> None:\n self._events: List[Union[Events, CallableEventWithFilter]] = []\n\n def _append(self, event: Union[Events, CallableEventWithFilter]) -> None:\n if not isinstance(event, (Events, CallableEventWithFilter)):\n raise TypeError(f\"Argument event should be Events or CallableEventWithFilter, got: {type(event)}\")\n self._events.append(event)\n\n def __getitem__(self, item: int) -> Union[Events, CallableEventWithFilter]:\n return self._events[item]\n\n def __iter__(self) -> Iterator[Union[Events, CallableEventWithFilter]]:\n return iter(self._events)\n\n def __len__(self) -> int:\n return len(self._events)\n\n def __or__(self, other: Union[Events, CallableEventWithFilter]) -> \"EventsList\":\n self._append(event=other)\n return self\n\n\nclass State:\n \"\"\"An object that is used to pass internal and user-defined state between event handlers. By default, state\n contains the following attributes:\n\n .. code-block:: python\n\n state.iteration # 1-based, the first iteration is 1\n state.epoch # 1-based, the first epoch is 1\n state.seed # seed to set at each epoch\n state.dataloader # data passed to engine\n state.epoch_length # optional length of an epoch\n state.max_epochs # number of epochs to run\n state.max_iters # number of iterations to run\n state.batch # batch passed to `process_function`\n state.output # output of `process_function` after a single iteration\n state.metrics # dictionary with defined metrics if any\n state.times # dictionary with total and per-epoch times fetched on\n # keys: Events.EPOCH_COMPLETED.name and Events.COMPLETED.name\n\n Args:\n kwargs: keyword arguments to be defined as State attributes.\n \"\"\"\n\n event_to_attr: Dict[Union[str, \"Events\", \"CallableEventWithFilter\"], str] = {\n Events.GET_BATCH_STARTED: \"iteration\",\n Events.GET_BATCH_COMPLETED: \"iteration\",\n Events.ITERATION_STARTED: \"iteration\",\n Events.ITERATION_COMPLETED: \"iteration\",\n Events.EPOCH_STARTED: \"epoch\",\n Events.EPOCH_COMPLETED: \"epoch\",\n Events.STARTED: \"epoch\",\n Events.COMPLETED: \"epoch\",\n }\n\n def __init__(self, **kwargs: Any) -> None:\n self.iteration = 0\n self.epoch = 0\n self.epoch_length: Optional[int] = None\n self.max_epochs: Optional[int] = None\n self.max_iters: Optional[int] = None\n self.output: Optional[int] = None\n self.batch: Optional[int] = None\n self.metrics: Dict[str, Any] = {}\n self.dataloader: Optional[Union[DataLoader, Iterable[Any]]] = None\n self.seed: Optional[int] = None\n self.times: Dict[str, Optional[float]] = {\n Events.EPOCH_COMPLETED.name: None,\n Events.COMPLETED.name: None,\n }\n\n for k, v in kwargs.items():\n setattr(self, k, v)\n\n self._update_attrs()\n\n def _update_attrs(self) -> None:\n for value in self.event_to_attr.values():\n if not hasattr(self, value):\n setattr(self, value, 0)\n\n def get_event_attrib_value(self, event_name: Union[str, Events, CallableEventWithFilter]) -> int:\n \"\"\"Get the value of Event attribute with given `event_name`.\"\"\"\n if event_name not in State.event_to_attr:\n raise RuntimeError(f\"Unknown event name '{event_name}'\")\n return getattr(self, State.event_to_attr[event_name])\n\n def __repr__(self) -> str:\n s = \"State:\\n\"\n for attr, value in self.__dict__.items():\n if not isinstance(value, (numbers.Number, str)):\n value = type(value)\n s += f\"\\t{attr}: {value}\\n\"\n return 
s\n\n\nclass RemovableEventHandle:\n \"\"\"A weakref handle to remove a registered event.\n\n A handle that may be used to remove a registered event handler via the\n remove method, with-statement, or context manager protocol. Returned from\n :meth:`~ignite.engine.engine.Engine.add_event_handler`.\n\n\n Args:\n event_name: Registered event name.\n handler: Registered event handler, stored as weakref.\n engine: Target engine, stored as weakref.\n\n Examples:\n .. code-block:: python\n\n engine = Engine()\n\n def print_epoch(engine):\n print(f\"Epoch: {engine.state.epoch}\")\n\n with engine.add_event_handler(Events.EPOCH_COMPLETED, print_epoch):\n # print_epoch handler registered for a single run\n engine.run(data)\n\n # print_epoch handler is now unregistered\n \"\"\"\n\n def __init__(\n self, event_name: Union[CallableEventWithFilter, Enum, EventsList, Events], handler: Callable, engine: \"Engine\"\n ) -> None:\n self.event_name = event_name\n self.handler = weakref.ref(handler)\n self.engine = weakref.ref(engine)\n\n def remove(self) -> None:\n \"\"\"Remove handler from engine.\"\"\"\n handler = self.handler()\n engine = self.engine()\n\n if handler is None or engine is None:\n return\n\n if hasattr(handler, \"_parent\"):\n handler = handler._parent()\n if handler is None:\n raise RuntimeError(\n \"Internal error! Please fill an issue on https://github.com/pytorch/ignite/issues \"\n \"if encounter this error. Thank you!\"\n )\n\n if isinstance(self.event_name, EventsList):\n for e in self.event_name:\n if engine.has_event_handler(handler, e):\n engine.remove_event_handler(handler, e)\n else:\n if engine.has_event_handler(handler, self.event_name):\n engine.remove_event_handler(handler, self.event_name)\n\n def __enter__(self) -> \"RemovableEventHandle\":\n return self\n\n def __exit__(self, *args: Any, **kwargs: Any) -> None:\n self.remove()\n", "path": "ignite/engine/events.py" } ]
diff --git a/ignite/engine/events.py b/ignite/engine/events.py index a80277c525d3..9dd99348492b 100644 --- a/ignite/engine/events.py +++ b/ignite/engine/events.py @@ -237,7 +237,10 @@ def function_before_backprop(engine): # ... """ - pass + def __new__(cls, value: str) -> "EventEnum": + obj = CallableEventWithFilter.__new__(cls) + obj._value_ = value + return obj class Events(EventEnum):
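The small fix above can obscure why it works: roughly, on CPython 3.11, when an `Enum` mixes in a class that defines no `__new__`, the enum machinery builds each member's `_value_` by calling the mixin type on the raw value and then hashes that temporary object during the alias lookup — before any `_name_` has been assigned to it — which trips the custom `__hash__` seen in the traceback. Defining `__new__` so that `_value_` stays the plain string avoids that path. A minimal, framework-independent sketch of the pattern (hypothetical class names, not ignite's actual code):

```python
from enum import Enum

class CallableEvent:
    """Stand-in for a mixin whose __hash__ depends on _name_."""

    def __init__(self, value, name=None):
        if not hasattr(self, "_value_"):
            self._value_ = value
        if not hasattr(self, "_name_") and name is not None:
            self._name_ = name

    def __hash__(self):
        return hash(self._name_)

class MyEventEnum(CallableEvent, Enum):
    # Without this __new__, CPython 3.11 sets the member's _value_ to
    # CallableEvent("started") (whose _name_ is never assigned) and hashes that
    # object while registering the member, so class creation fails with
    # AttributeError. Keeping _value_ as the raw string sidesteps the lookup.
    def __new__(cls, value):
        obj = CallableEvent.__new__(cls)
        obj._value_ = value
        return obj

    STARTED = "started"
    COMPLETED = "completed"

assert MyEventEnum.STARTED.value == "started"
assert hash(MyEventEnum.STARTED) == hash("STARTED")
```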
pex-tool__pex-1673
Release 2.1.72 On the docket: + [x] Fix Locker to prune un-downloaded entries. (#1666) + [x] Fix venv creation to ignore ambient PEX env vars. #1669 + [x] Lockfiles: requirement might not be compatible with requested interpreter constraints #1667
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.71\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.72\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index e7c692766..da8f81916 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.72 +------ + +This release fixes an old bug with ``--venv`` PEXes initially executed +with either ``PEX_MODULE`` or ``PEX_SCRIPT`` active in the environment. + +* Fix venv creation to ignore ambient PEX env vars. (#1669) + `PR #1669 <https://github.com/pantsbuild/pex/pull/1669>`_ + 2.1.71 ------ diff --git a/pex/version.py b/pex/version.py index ff6708e70..1cf91c0a8 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.71" +__version__ = "2.1.72"
pex-tool__pex-1610
Release 2.1.66 On the docket: + [x] Support specifying foreign platforms in full detail. #1597 + [x] Respect PEX_ROOT in PEXEnvironment.mount. #1599 + [x] Be able to see what .pex file is run from the list of system processes #1604
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.65\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.66\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index c524a7ef1..31b6db792 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,36 @@ Release Notes ============= +2.1.66 +------ + +This release brings a new ``--complete-platform`` Pex CLI option that +can be used instead of ``--platform`` when more detailed foreign +platform specification is needed to satisfy a resolve (most commonly, +when ``python_full_version`` environment markers are in-play). This, +paired with the new ``pex3 interpreter inspect`` command that can be +used to generate complete platform data on the foreign platform machine +being targeted, should allow all foreign platform PEX builds to succeed +exactly as they would if run on that foreign platform as long as +pre-built wheels are available for that foreign platform. + +Additionally, PEXes now know how to set a useable process name when the +PEX contains the `psutil` distribution. See +`here <https://pex.readthedocs.io/en/v2.1.66/recipes.html#long-running-pex-applications-and-daemons>`_ +for more information. + +* Add support for ``--complete-platform``. (#1609) + `PR #1609 <https://github.com/pantsbuild/pex/pull/1609>`_ + +* Introduce ``pex3 interpreter inspect``. (#1607) + `PR #1607 <https://github.com/pantsbuild/pex/pull/1607>`_ + +* Use setproctitle to sanitize ``ps`` info. (#1605) + `PR #1605 <https://github.com/pantsbuild/pex/pull/1605>`_ + +* Respect ``PEX_ROOT`` in ``PEXEnvironment.mount``. (#1599) + `PR #1599 <https://github.com/pantsbuild/pex/pull/1599>`_ + 2.1.65 ------ diff --git a/pex/version.py b/pex/version.py index 891b24c7e..24551f628 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.65" +__version__ = "2.1.66"
pex-tool__pex-1733
Release 2.1.82 On the docket: + [x] Pex resolve checking does not allow resolved pre-releases when --no-pre. #1730
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.81\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.82\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index d79994b8a..76a815eff 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.82 +------ + +This is a hotfix release for a regression in prerelease version handling +introduced in the 2.1.81 release by #1727. + +* Fix prerelease handling when checking resolves. (#1732) + `PR #1732 <https://github.com/pantsbuild/pex/pull/1732>`_ + 2.1.81 ------ diff --git a/pex/version.py b/pex/version.py index ad0037b0d..45a6e1143 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.81" +__version__ = "2.1.82"
pex-tool__pex-1834
Release 2.1.95 On the docket: + [x] Lock creation should skip Windows-only requirements and / or allow selecting target platforms (OS classes). #1821 + [x] Feature request: "universal" lock mode can reject unsupported platforms #1595 + [x] Avoid ENOEXEC for --venv shebangs. #1828 + [x] pex3 lock export doesn't seem to respect the platform flag. #1826 + [x] Clarify pex3 lock export command. #1645 + [x] Support exporting PYTHONPATH before running user code #1825
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.94\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.95\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index b40dc81c0..a66e2c6c4 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,49 @@ Release Notes ============= +2.1.95 +------ + +This release brings two new ``pex3 lock`` features for +``--style universal`` locks. + +By default, universal locks are created to target all operating systems. +This can cause problems when you only target a subset of operating +systems and a lock transitive dependency that is conditional on an OS +you do not target is not lockable. The new +``--target-system {linux,mac,windows}`` option allows you to restrict +the set of targeted OSes to work around this sort of issue. Since PEX +files currently only support running on Linux and Mac, specifying +``--target-system linux --target-system mac`` is a safe way to +pre-emptively avoid these sorts of locking issues when creating a +universal lock. + +Previously you could not specify the ``--platform``\s or +``--complete-platform``\s you would be using later to build PEXes with +when creating a universal lock. You now can, and Pex will verify the +universal lock can support all the specified platforms. + +As is usual there are also several bug fixes including properly +propagating ``PEX_EXTRA_SYS_PATH`` additions to forked Python processes, +fixing ``pex3 lock export`` to only attempt to export for the selected +target and avoiding too long shebang errors for ``--venv`` mode PEXes in +a robust way. + +* Fix ``PEX_EXTRA_SYS_PATH`` propagation. (#1832) + `PR #1832 <https://github.com/pantsbuild/pex/pull/1832>`_ + +* Fix ``pex3 lock export``: re-use ``--lock`` resolver. (#1831) + `PR #1831 <https://github.com/pantsbuild/pex/pull/1831>`_ + +* Avoid ENOEXEC for ``--venv`` shebangs. (#1828) + `PR #1828 <https://github.com/pantsbuild/pex/pull/1828>`_ + +* Check lock can resolve platforms at creation time. (#1824) + `PR #1824 <https://github.com/pantsbuild/pex/pull/1824>`_ + +* Support restricting universal lock target os. (#1823) + `PR #1823 <https://github.com/pantsbuild/pex/pull/1823>`_ + 2.1.94 ------ diff --git a/pex/version.py b/pex/version.py index d72f3cdbf..40153e391 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.94" +__version__ = "2.1.95"
pex-tool__pex-1319
Release 2.1.39 On the docket: + [x] Running opvault 0.4.9 pex leads to infinite recursion in setup tools #1316
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.38\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.39\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 778a61032..d63bfcf3e 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.39 +------ + +A hotfix that fixes a bug present since 2.1.25 that results in infinite +recursion in PEX runtime resolves when handling dependency cycles. + +* Guard against cyclic dependency graphs. (#1317) + `PR #1317 <https://github.com/pantsbuild/pex/pull/1317>`_ + 2.1.38 ------ diff --git a/pex/version.py b/pex/version.py index 092a74bf7..82e4c47d5 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.38" +__version__ = "2.1.39"
pex-tool__pex-1288
Release 2.1.35 On the docket: + [x] Ensure venv pex does not enter a re-exec loop. #1286 + [x] Improve resolve error information. #1287 + [x] Expose Pex tools via a pex-tools console script. #1279 + [x] Fix auto-created `--venv` core scripts. (#1278)
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.34\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.35\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index b2a99154a..fd0770874 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,25 @@ Release Notes ============= +2.1.35 +------ + +This release hardens a few aspects of `--venv` mode PEXes. An infinite +re-exec loop in venv `pex` scripts is fixed and the `activate` family +of scripts in the venv is fixed. + +* Improve resolve error information. (#1287) + `PR #1287 <https://github.com/pantsbuild/pex/pull/1287>`_ + +* Ensure venv pex does not enter a re-exec loop. (#1286) + `PR #1286 <https://github.com/pantsbuild/pex/pull/1286>`_ + +* Expose Pex tools via a pex-tools console script. (#1279) + `PR #1279 <https://github.com/pantsbuild/pex/pull/1279>`_ + +* Fix auto-created `--venv` core scripts. (#1278) + `PR #1278 <https://github.com/pantsbuild/pex/pull/1278>`_ + 2.1.34 ------ diff --git a/pex/version.py b/pex/version.py index ee3ef65c4..c4e8f5075 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.34" +__version__ = "2.1.35"
pex-tool__pex-1844
Release 2.1.97 On the docket: + [x] Avoid ENOEXEC for Pex internal --venvs. #1843
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.96\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.97\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index bd4f2b4fe..cfaab5109 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,16 @@ Release Notes ============= +2.1.97 +------ + +This release patches a hole left by #1828 in the Pex 2.1.95 release +whereby, although you could run a PEX under a too-long PEX_ROOT you +could not build a PEX under a tool-long PEX_ROOT. + +* Avoid ENOEXEC for Pex internal --venvs. (#1843) + `PR #1843 <https://github.com/pantsbuild/pex/pull/1843>`_ + 2.1.96 ------ @@ -9,7 +19,7 @@ This is a hotfix release that fixes ``--venv`` mode apply to ``sys.executable`` and not other Pythons. * Fix ``--venv`` ``PEX PEX_EXTRA_SYS_PATH`` propagation. (#1837) - `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_ + `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_ 2.1.95 ------ diff --git a/pex/version.py b/pex/version.py index 26a429cb9..3cebb26c2 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.96" +__version__ = "2.1.97"
pex-tool__pex-1761
Release 2.1.87 On the docket: + [ ] A relative --tmpdir foils pex3 lock create. #1758
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.86\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.87\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 37e7ea169..8a8cac55a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,15 +1,23 @@ Release Notes ============= +2.1.87 +------ + +This release fixes ``pex3 lock create`` to handle relative ``--tmpdir``. + +* Fix lock save detection to be more robust. (#1760) + `PR #1760 <https://github.com/pantsbuild/pex/pull/1760>`_ + 2.1.86 ------ This release fixes an oversight in lock file use against secured custom indexes and find links repos. Previously credentials were passed during -the lock creation process via either `~/.netrc` or via embedded +the lock creation process via either ``~/.netrc`` or via embedded credentials in the custom indexes and find links URLs Pex was configured with. But, at lock use time, these credentials were not used. Now -`~/.netrc` entries are always used and embedded credentials passed via +``~/.netrc`` entries are always used and embedded credentials passed via custom URLS at lock creation time can be passed in the same manner at lock use time. diff --git a/pex/version.py b/pex/version.py index 10d51c427..3899ba6eb 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.86" +__version__ = "2.1.87"
pex-tool__pex-1419
Release 2.1.46 On the docket: + [x] Fix Pip proprietary URL env marker handling. #1417 + [x] Un-reify installed wheel script shebangs. #1410 + [x] Support deterministic repository extract tool. #1411 + [x] support setuptools scripts #1379
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.45\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.46\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index c829f74d5..26dc72d40 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,32 @@ Release Notes ============= +2.1.46 +------ + +This release improves PEX file build reproducibility and requirement +parsing of environment markers in Pip's proprietary URL format. + +Also, the `-c` / `--script` / `--console-script` argument now supports +non-Python distribution scripts. + +Finally, new contributor @blag improved the README. + +* Fix Pip proprietary URL env marker handling. (#1417) + `PR #1417 <https://github.com/pantsbuild/pex/pull/1417>`_ + +* Un-reify installed wheel script shebangs. (#1410) + `PR #1410 <https://github.com/pantsbuild/pex/pull/1410>`_ + +* Support deterministic repository extract tool. (#1411) + `PR #1411 <https://github.com/pantsbuild/pex/pull/1411>`_ + +* Improve examples and add example subsection titles (#1409) + `PR #1409 <https://github.com/pantsbuild/pex/pull/1409>`_ + +* support any scripts specified in `setup(scripts=...)` from setup.py. (#1381) + `PR #1381 <https://github.com/pantsbuild/pex/pull/1381>`_ + 2.1.45 ------ @@ -72,7 +98,7 @@ that improves Pip execution environment isolation. 2.1.41 ------ -This release brings a hotfix from @kaos for interpreter identification +This release brings a hotfix from @kaos for interpreter identification on macOS 11. * Update interpreter.py (#1332) diff --git a/pex/version.py b/pex/version.py index 29d0fe485..2c840cdc4 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.45" +__version__ = "2.1.46"
pex-tool__pex-1664
Release 2.1.71 On the docket: + [x] Secure Pex against sha1 collision attacks. #1662 + [x] Problems building venvs from certain distributions. #1656
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.70\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.71\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index d8bc7c6e8..e7c692766 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,19 @@ Release Notes ============= +2.1.71 +------ + +This release fixes the instability introduced in 2.1.68 by switching to +a more robust means of determining venv layouts. Along the way it +upgrades Pex internals to cache all artifacts with strong hashes ( +previously sha1 was used). It's strongly recommended to upgrade or use +the exclude ``!=2.1.68,!=2.1.69,!=2.1.70`` when depending on an open +ended Pex version range. + +* Switch Pex installed wheels to ``--prefix`` scheme. (#1661) + `PR #1661 <https://github.com/pantsbuild/pex/pull/1661>`_ + 2.1.70 ------ diff --git a/pex/version.py b/pex/version.py index 4183f280c..ff6708e70 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.70" +__version__ = "2.1.71"
pex-tool__pex-1559
Release 2.1.61 On the docket: + [x] Merge packages for --venv-site-packages-copies. #1557
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.60\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.61\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 76c9fd97b..35792bd9e 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,16 @@ Release Notes ============= +2.1.61 +------ + +This release fixes a regression in Pex ``--venv`` mode compatibility +with distributions that are members of a namespace package that was +introduced by #1532 in the 2.1.57 release. + +* Merge packages for ``--venv-site-packages-copies``. (#1557) + `PR #1557 <https://github.com/pantsbuild/pex/pull/1557>`_ + 2.1.60 ------ diff --git a/pex/version.py b/pex/version.py index e534a9d5b..225183d03 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.60" +__version__ = "2.1.61"
pex-tool__pex-1446
Release 2.1.49 On the docket: + [ ] Avoid re-using old ~/.pex/code/ caches. #1444
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.48\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.49\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 52558ddeb..4cf9ec11a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,17 @@ Release Notes ============= +2.1.49 +------ + +This is a hotfix release that fixes the new ``--layout {zipapp,packed}`` +modes for PEX files with no user code & just third party dependencies +when executed against a ``$PEX_ROOT`` where similar PEXes built with the +old ``--not-zip-safe`` option were were run in the past. + +* Avoid re-using old ~/.pex/code/ caches. (#1444) + `PR #1444 <https://github.com/pantsbuild/pex/pull/1444>`_ + 2.1.48 ------ diff --git a/pex/version.py b/pex/version.py index c923ea6ba..3977ea4eb 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.48" +__version__ = "2.1.49"
pex-tool__pex-1482
Release 2.1.51 On the docket: + [ ] UnicodeDecodeError when packaging after upgrading to v2.1.46 #1479
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.50\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.51\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index f05e62f6f..07f84eada 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,17 @@ Release Notes ============= +2.1.51 +------ + +This release fixes both PEX creation and ``--venv`` creation to handle +distributions that contain scripts with non-ascii characters in them +when running in environments with a default encoding that does not +contain those characters under PyPy3, Python 3.5 and Python 3.6. + +* Fix non-ascii script shebang re-writing. (#1480) + `PR #1480 <https://github.com/pantsbuild/pex/pull/1480>`_ + 2.1.50 ------ diff --git a/pex/version.py b/pex/version.py index fed40855d..f960a0a0b 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.50" +__version__ = "2.1.51"
pypi__warehouse-3165
Add a "shortlink" for projects **From user testing:** When viewing projects on PyPI, some users type the URL directly if they know the project name. We should create a shortlink like`pypi.org/p/myproject` which would redirect to `pypi.org/projects/myproject` cc @di for feedback / guidance. --- **Good First Issue**: This issue is good for first time contributors. If you've already contributed to Warehouse, please work on [another issue without this label](https://github.com/pypa/warehouse/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+-label%3A%22good+first+issue%22) instead. If there is not a corresponding pull request for this issue, it is up for grabs. For directions for getting set up, see our [Getting Started Guide](https://warehouse.pypa.io/development/getting-started/). If you are working on this issue and have questions, please feel free to ask them here, [`#pypa-dev` on Freenode](https://webchat.freenode.net/?channels=%23pypa-dev), or the [pypa-dev mailing list](https://groups.google.com/forum/#!forum/pypa-dev).
[ { "content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\ndef includeme(config):\n # We need to get the value of the Warehouse and Forklift domains, we'll use\n # these to segregate the Warehouse routes from the Forklift routes until\n # Forklift is properly split out into it's own project.\n warehouse = config.get_settings().get(\"warehouse.domain\")\n files_url = config.get_settings()[\"files.url\"]\n\n # Simple Route for health checks.\n config.add_route(\"health\", \"/_health/\")\n\n # Internal route to make it easier to force a particular status for\n # debugging HTTPException templates.\n config.add_route(\"force-status\", \"/_force-status/{status:[45]\\d\\d}/\")\n\n # Basic global routes\n config.add_route(\"index\", \"/\", domain=warehouse)\n config.add_route(\"robots.txt\", \"/robots.txt\", domain=warehouse)\n config.add_route(\"opensearch.xml\", \"/opensearch.xml\", domain=warehouse)\n config.add_route(\"index.sitemap.xml\", \"/sitemap.xml\", domain=warehouse)\n config.add_route(\n \"bucket.sitemap.xml\",\n \"/{bucket}.sitemap.xml\",\n domain=warehouse,\n )\n\n # Some static, template driven pages\n config.add_template_view(\"help\", \"/help/\", \"pages/help.html\")\n config.add_template_view(\"security\", \"/security/\", \"pages/security.html\")\n config.add_template_view(\n \"sponsors\",\n \"/sponsors/\",\n # Use the full resource path here to make it able to be overridden by\n # pypi-theme.\n \"warehouse:templates/pages/sponsors.html\",\n )\n\n # Our legal policies\n config.add_policy(\"terms-of-use\", \"terms.md\")\n\n # HTML Snippets for including into other pages.\n config.add_route(\n \"includes.current-user-indicator\",\n \"/_includes/current-user-indicator/\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.flash-messages\",\n \"/_includes/flash-messages/\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.current-user-profile-callout\",\n \"/_includes/current-user-profile-callout/{username}\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.edit-project-button\",\n \"/_includes/edit-project-button/{project_name}\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.edit-profile-button\",\n \"/_includes/edit-profile-button/{username}\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n\n # Search Routes\n config.add_route(\"search\", \"/search/\", domain=warehouse)\n\n # Accounts\n config.add_route(\n \"accounts.profile\",\n \"/user/{username}/\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n config.add_route(\"accounts.login\", \"/account/login/\", domain=warehouse)\n config.add_route(\"accounts.logout\", \"/account/logout/\", domain=warehouse)\n config.add_route(\n \"accounts.register\",\n \"/account/register/\",\n domain=warehouse,\n )\n 
config.add_route(\n \"accounts.request-password-reset\",\n \"/account/request-password-reset/\",\n domain=warehouse,\n )\n config.add_route(\n \"accounts.reset-password\",\n \"/account/reset-password/\",\n domain=warehouse,\n )\n config.add_route(\n \"accounts.verify-email\",\n \"/account/verify-email/\",\n domain=warehouse,\n )\n\n # Management (views for logged-in users)\n config.add_route(\"manage.account\", \"/manage/account/\", domain=warehouse)\n config.add_route(\"manage.projects\", \"/manage/projects/\", domain=warehouse)\n config.add_route(\n \"manage.project.settings\",\n \"/manage/project/{project_name}/settings/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.delete_project\",\n \"/manage/project/{project_name}/delete_project/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.releases\",\n \"/manage/project/{project_name}/releases/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.release\",\n \"/manage/project/{project_name}/release/{version}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}/{version}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.roles\",\n \"/manage/project/{project_name}/collaboration/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.change_role\",\n \"/manage/project/{project_name}/collaboration/change/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.delete_role\",\n \"/manage/project/{project_name}/collaboration/delete/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.history\",\n \"/manage/project/{project_name}/history/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n\n # Packaging\n config.add_route(\n \"packaging.project\",\n \"/project/{name}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}\",\n domain=warehouse,\n )\n config.add_route(\n \"packaging.release\",\n \"/project/{name}/{version}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}/{version}\",\n domain=warehouse,\n )\n config.add_route(\"packaging.file\", files_url)\n\n # RSS\n config.add_route(\"rss.updates\", \"/rss/updates.xml\", domain=warehouse)\n config.add_route(\"rss.packages\", \"/rss/packages.xml\", domain=warehouse)\n\n # Legacy URLs\n config.add_route(\"legacy.api.simple.index\", \"/simple/\", domain=warehouse)\n config.add_route(\n \"legacy.api.simple.detail\",\n \"/simple/{name}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}/\",\n read_only=True,\n domain=warehouse,\n )\n config.add_route(\n \"legacy.api.json.project\",\n \"/pypi/{name}/json\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}\",\n read_only=True,\n domain=warehouse,\n )\n config.add_route(\n \"legacy.api.json.release\",\n \"/pypi/{name}/{version}/json\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n 
traverse=\"/{name}/{version}\",\n read_only=True,\n domain=warehouse,\n )\n\n # Legacy Action URLs\n # TODO: We should probably add Warehouse routes for these that just error\n # and direct people to use upload.pypi.io\n config.add_pypi_action_route(\n \"legacy.api.pypi.file_upload\",\n \"file_upload\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.submit\",\n \"submit\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.submit_pkg_info\",\n \"submit_pkg_info\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.doc_upload\",\n \"doc_upload\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.doap\",\n \"doap\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.list_classifiers\",\n \"list_classifiers\",\n domain=warehouse,\n )\n\n # Legacy XMLRPC\n config.add_xmlrpc_endpoint(\n \"pypi\",\n pattern=\"/pypi\",\n header=\"Content-Type:text/xml\",\n domain=warehouse,\n )\n\n # Legacy Documentation\n config.add_route(\"legacy.docs\", config.registry.settings[\"docs.url\"])\n\n # Legacy Redirects\n config.add_redirect(\"/pypi/{name}/\", \"/project/{name}/\", domain=warehouse)\n config.add_redirect(\n \"/pypi/{name}/{version}/\",\n \"/project/{name}/{version}/\",\n domain=warehouse,\n )\n config.add_redirect(\"/packages/{path:.*}\", files_url, domain=warehouse)\n\n # Legacy Action Redirects\n config.add_pypi_action_redirect(\n \"rss\",\n \"/rss/updates.xml\",\n domain=warehouse,\n )\n config.add_pypi_action_redirect(\n \"packages_rss\",\n \"/rss/packages.xml\",\n domain=warehouse,\n )\n", "path": "warehouse/routes.py" } ]
[ { "content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\ndef includeme(config):\n # We need to get the value of the Warehouse and Forklift domains, we'll use\n # these to segregate the Warehouse routes from the Forklift routes until\n # Forklift is properly split out into it's own project.\n warehouse = config.get_settings().get(\"warehouse.domain\")\n files_url = config.get_settings()[\"files.url\"]\n\n # Simple Route for health checks.\n config.add_route(\"health\", \"/_health/\")\n\n # Internal route to make it easier to force a particular status for\n # debugging HTTPException templates.\n config.add_route(\"force-status\", \"/_force-status/{status:[45]\\d\\d}/\")\n\n # Basic global routes\n config.add_route(\"index\", \"/\", domain=warehouse)\n config.add_route(\"robots.txt\", \"/robots.txt\", domain=warehouse)\n config.add_route(\"opensearch.xml\", \"/opensearch.xml\", domain=warehouse)\n config.add_route(\"index.sitemap.xml\", \"/sitemap.xml\", domain=warehouse)\n config.add_route(\n \"bucket.sitemap.xml\",\n \"/{bucket}.sitemap.xml\",\n domain=warehouse,\n )\n\n # Some static, template driven pages\n config.add_template_view(\"help\", \"/help/\", \"pages/help.html\")\n config.add_template_view(\"security\", \"/security/\", \"pages/security.html\")\n config.add_template_view(\n \"sponsors\",\n \"/sponsors/\",\n # Use the full resource path here to make it able to be overridden by\n # pypi-theme.\n \"warehouse:templates/pages/sponsors.html\",\n )\n\n # Our legal policies\n config.add_policy(\"terms-of-use\", \"terms.md\")\n\n # HTML Snippets for including into other pages.\n config.add_route(\n \"includes.current-user-indicator\",\n \"/_includes/current-user-indicator/\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.flash-messages\",\n \"/_includes/flash-messages/\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.current-user-profile-callout\",\n \"/_includes/current-user-profile-callout/{username}\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.edit-project-button\",\n \"/_includes/edit-project-button/{project_name}\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"includes.edit-profile-button\",\n \"/_includes/edit-profile-button/{username}\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n\n # Search Routes\n config.add_route(\"search\", \"/search/\", domain=warehouse)\n\n # Accounts\n config.add_route(\n \"accounts.profile\",\n \"/user/{username}/\",\n factory=\"warehouse.accounts.models:UserFactory\",\n traverse=\"/{username}\",\n domain=warehouse,\n )\n config.add_route(\"accounts.login\", \"/account/login/\", domain=warehouse)\n config.add_route(\"accounts.logout\", \"/account/logout/\", domain=warehouse)\n config.add_route(\n \"accounts.register\",\n \"/account/register/\",\n domain=warehouse,\n )\n 
config.add_route(\n \"accounts.request-password-reset\",\n \"/account/request-password-reset/\",\n domain=warehouse,\n )\n config.add_route(\n \"accounts.reset-password\",\n \"/account/reset-password/\",\n domain=warehouse,\n )\n config.add_route(\n \"accounts.verify-email\",\n \"/account/verify-email/\",\n domain=warehouse,\n )\n\n # Management (views for logged-in users)\n config.add_route(\"manage.account\", \"/manage/account/\", domain=warehouse)\n config.add_route(\"manage.projects\", \"/manage/projects/\", domain=warehouse)\n config.add_route(\n \"manage.project.settings\",\n \"/manage/project/{project_name}/settings/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.delete_project\",\n \"/manage/project/{project_name}/delete_project/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.releases\",\n \"/manage/project/{project_name}/releases/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.release\",\n \"/manage/project/{project_name}/release/{version}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}/{version}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.roles\",\n \"/manage/project/{project_name}/collaboration/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.change_role\",\n \"/manage/project/{project_name}/collaboration/change/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.delete_role\",\n \"/manage/project/{project_name}/collaboration/delete/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n config.add_route(\n \"manage.project.history\",\n \"/manage/project/{project_name}/history/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{project_name}\",\n domain=warehouse,\n )\n\n # Packaging\n config.add_redirect('/p/{name}/', '/project/{name}/', domain=warehouse)\n config.add_route(\n \"packaging.project\",\n \"/project/{name}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}\",\n domain=warehouse,\n )\n config.add_route(\n \"packaging.release\",\n \"/project/{name}/{version}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}/{version}\",\n domain=warehouse,\n )\n config.add_route(\"packaging.file\", files_url)\n\n # RSS\n config.add_route(\"rss.updates\", \"/rss/updates.xml\", domain=warehouse)\n config.add_route(\"rss.packages\", \"/rss/packages.xml\", domain=warehouse)\n\n # Legacy URLs\n config.add_route(\"legacy.api.simple.index\", \"/simple/\", domain=warehouse)\n config.add_route(\n \"legacy.api.simple.detail\",\n \"/simple/{name}/\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}/\",\n read_only=True,\n domain=warehouse,\n )\n config.add_route(\n \"legacy.api.json.project\",\n \"/pypi/{name}/json\",\n factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}\",\n read_only=True,\n domain=warehouse,\n )\n config.add_route(\n \"legacy.api.json.release\",\n \"/pypi/{name}/{version}/json\",\n 
factory=\"warehouse.packaging.models:ProjectFactory\",\n traverse=\"/{name}/{version}\",\n read_only=True,\n domain=warehouse,\n )\n\n # Legacy Action URLs\n # TODO: We should probably add Warehouse routes for these that just error\n # and direct people to use upload.pypi.io\n config.add_pypi_action_route(\n \"legacy.api.pypi.file_upload\",\n \"file_upload\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.submit\",\n \"submit\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.submit_pkg_info\",\n \"submit_pkg_info\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.doc_upload\",\n \"doc_upload\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.doap\",\n \"doap\",\n domain=warehouse,\n )\n config.add_pypi_action_route(\n \"legacy.api.pypi.list_classifiers\",\n \"list_classifiers\",\n domain=warehouse,\n )\n\n # Legacy XMLRPC\n config.add_xmlrpc_endpoint(\n \"pypi\",\n pattern=\"/pypi\",\n header=\"Content-Type:text/xml\",\n domain=warehouse,\n )\n\n # Legacy Documentation\n config.add_route(\"legacy.docs\", config.registry.settings[\"docs.url\"])\n\n # Legacy Redirects\n config.add_redirect(\"/pypi/{name}/\", \"/project/{name}/\", domain=warehouse)\n config.add_redirect(\n \"/pypi/{name}/{version}/\",\n \"/project/{name}/{version}/\",\n domain=warehouse,\n )\n config.add_redirect(\"/packages/{path:.*}\", files_url, domain=warehouse)\n\n # Legacy Action Redirects\n config.add_pypi_action_redirect(\n \"rss\",\n \"/rss/updates.xml\",\n domain=warehouse,\n )\n config.add_pypi_action_redirect(\n \"packages_rss\",\n \"/rss/packages.xml\",\n domain=warehouse,\n )\n", "path": "warehouse/routes.py" } ]
diff --git a/tests/unit/test_routes.py b/tests/unit/test_routes.py index 362b91010524..1d9607ca39a7 100644 --- a/tests/unit/test_routes.py +++ b/tests/unit/test_routes.py @@ -269,6 +269,7 @@ def add_policy(name, filename): ] assert config.add_redirect.calls == [ + pretend.call("/p/{name}/", "/project/{name}/", domain=warehouse), pretend.call("/pypi/{name}/", "/project/{name}/", domain=warehouse), pretend.call( "/pypi/{name}/{version}/", diff --git a/warehouse/routes.py b/warehouse/routes.py index 0a49b73894fc..84902feb03f0 100644 --- a/warehouse/routes.py +++ b/warehouse/routes.py @@ -178,6 +178,7 @@ def includeme(config): ) # Packaging + config.add_redirect('/p/{name}/', '/project/{name}/', domain=warehouse) config.add_route( "packaging.project", "/project/{name}/",
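The substantive change is the single `config.add_redirect("/p/{name}/", "/project/{name}/", domain=warehouse)` line, mirroring the existing legacy `/pypi/{name}/` redirect. A rough sketch of how the shortlink could be exercised end to end with WebTest; the `warehouse_wsgi_app` fixture and the exact redirect status code are assumptions, not part of this PR:

```python
# Hedged sketch: exercising the /p/<name>/ shortlink against a built
# Warehouse WSGI app (warehouse_wsgi_app is a hypothetical fixture).
from webtest import TestApp


def test_project_shortlink_redirects(warehouse_wsgi_app):
    app = TestApp(warehouse_wsgi_app)
    resp = app.get("/p/myproject/")
    # add_redirect should answer with a redirect to the canonical page.
    assert resp.status_int in (301, 302)
    assert resp.headers["Location"].endswith("/project/myproject/")
```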
ivy-llc__ivy-23142
ifft
[ { "content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\n\n@to_ivy_arrays_and_back\ndef fft(a, n=None, axis=-1, norm=None):\n if norm is None:\n norm = \"backward\"\n return ivy.fft(a, axis, norm=norm, n=n)\n\n\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\ndef fftshift(x, axes=None, name=None):\n shape = x.shape\n\n if axes is None:\n axes = tuple(range(x.ndim))\n shifts = [(dim // 2) for dim in shape]\n elif isinstance(axes, int):\n shifts = shape[axes] // 2\n else:\n shifts = [shape[ax] // 2 for ax in axes]\n\n roll = ivy.roll(x, shifts, axis=axes)\n\n return roll\n", "path": "ivy/functional/frontends/jax/numpy/fft.py" } ]
[ { "content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\n\n@to_ivy_arrays_and_back\ndef fft(a, n=None, axis=-1, norm=None):\n if norm is None:\n norm = \"backward\"\n return ivy.fft(a, axis, norm=norm, n=n)\n\n\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\ndef fftshift(x, axes=None, name=None):\n shape = x.shape\n\n if axes is None:\n axes = tuple(range(x.ndim))\n shifts = [(dim // 2) for dim in shape]\n elif isinstance(axes, int):\n shifts = shape[axes] // 2\n else:\n shifts = [shape[ax] // 2 for ax in axes]\n\n roll = ivy.roll(x, shifts, axis=axes)\n\n return roll\n\n\n@to_ivy_arrays_and_back\ndef ifft(a, n=None, axis=-1, norm=None):\n if norm is None:\n norm = \"backward\"\n return ivy.ifft(a, axis, norm=norm, n=n)\n", "path": "ivy/functional/frontends/jax/numpy/fft.py" } ]
diff --git a/ivy/functional/frontends/jax/numpy/fft.py b/ivy/functional/frontends/jax/numpy/fft.py index 7a62b524d67e8..16d9cb97e67b2 100644 --- a/ivy/functional/frontends/jax/numpy/fft.py +++ b/ivy/functional/frontends/jax/numpy/fft.py @@ -27,3 +27,10 @@ def fftshift(x, axes=None, name=None): roll = ivy.roll(x, shifts, axis=axes) return roll + + +@to_ivy_arrays_and_back +def ifft(a, n=None, axis=-1, norm=None): + if norm is None: + norm = "backward" + return ivy.ifft(a, axis, norm=norm, n=n) diff --git a/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py b/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py index ed316ca5a544e..7fd9f00f3f135 100644 --- a/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py +++ b/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py @@ -70,3 +70,45 @@ def test_jax_numpy_fftshift( x=arr[0], axes=None, ) + + +# ifft +@handle_frontend_test( + fn_tree="jax.numpy.fft.ifft", + dtype_values_axis=helpers.dtype_values_axis( + available_dtypes=helpers.get_dtypes("complex"), + num_arrays=1, + min_value=-1e5, + max_value=1e5, + min_num_dims=1, + max_num_dims=5, + min_dim_size=1, + max_dim_size=5, + allow_inf=False, + large_abs_safety_factor=2.5, + small_abs_safety_factor=2.5, + safety_factor_scale="log", + valid_axis=True, + force_int_axis=True, + ), + n=st.integers(min_value=2, max_value=10), + norm=st.sampled_from(["backward", "ortho", "forward", None]), +) +def test_jax_numpy_ifft( + dtype_values_axis, n, norm, frontend, backend_fw, test_flags, fn_tree, on_device +): + dtype, values, axis = dtype_values_axis + helpers.test_frontend_function( + input_dtypes=dtype, + frontend=frontend, + backend_to_test=backend_fw, + test_flags=test_flags, + fn_tree=fn_tree, + on_device=on_device, + a=values[0], + n=n, + axis=axis, + norm=norm, + atol=1e-02, + rtol=1e-02, + )
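The new frontend just defaults `norm` and forwards to `ivy.ifft`. A small usage sketch; the backend choice and input values are illustrative and not taken from the PR:

```python
# Hedged sketch: calling the new jax.numpy.fft.ifft frontend through ivy.
import ivy
from ivy.functional.frontends.jax.numpy import fft as jax_fft

ivy.set_backend("numpy")  # any installed backend should work
x = ivy.array([1 + 0j, 0 + 0j, 0 + 0j, 0 + 0j])
print(jax_fft.ifft(x, n=4, axis=-1, norm="backward"))
```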
pex-tool__pex-1251
Release 2.1.31 On the docket: + [x] When Pex is run from a Pex PEX its isolation is broken. #1232 + [x] The `--venv` mode `pex` script does not have a `__name__ == '__main__'` guard breaking multiprocessing. #1236 + [x] The `--seed` mode for a `--venv` PEX is unsafe. #1239 + [x] The venv `pex` script handles entrypoint functions differently from PEX. #1241 + [x] Interpreter identification leaks an unconstrained `$PWD` entry into `sys.path`. #1231 + [x] Support control of venv creation mode `--copies` vs. `--symlinks` #1230
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.30\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.31\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index fa1eefff1..85a8bdc9b 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,38 @@ Release Notes ============= +2.1.31 +------ + +This release primarily hardens Pex venvs fixing several bugs. + +* Fix Pex isolation. (#1250) + `PR #1250 <https://github.com/pantsbuild/pex/pull/1250>`_ + +* Support pre-compiling a venv. (#1246) + `PR #1246 <https://github.com/pantsbuild/pex/pull/1246>`_ + +* Support venv relocation. (#1247) + `PR #1247 <https://github.com/pantsbuild/pex/pull/1247>`_ + +* Fix `--runtime-pex-root` leak in pex bootstrap. (#1244) + `PR #1244 <https://github.com/pantsbuild/pex/pull/1244>`_ + +* Support venvs that can outlive their base python. (#1245) + `PR #1245 <https://github.com/pantsbuild/pex/pull/1245>`_ + +* Harden Pex interpreter identification. (#1248) + `PR #1248 <https://github.com/pantsbuild/pex/pull/1248>`_ + +* The `pex` venv script handles entrypoints like PEX. (#1242) + `PR #1242 <https://github.com/pantsbuild/pex/pull/1242>`_ + +* Ensure PEX files aren't symlinked in venv. (#1240) + `PR #1240 <https://github.com/pantsbuild/pex/pull/1240>`_ + +* Fix venv pex script for use with multiprocessing. (#1238) + `PR #1238 <https://github.com/pantsbuild/pex/pull/1238>`_ + 2.1.30 ------ diff --git a/pex/version.py b/pex/version.py index ea2323052..058623131 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.30" +__version__ = "2.1.31"
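One docket item above concerns the generated `--venv` `pex` script lacking a `__name__ == '__main__'` guard, which breaks `multiprocessing`. A generic illustration of why the guard matters (plain Python, not Pex source): with a spawn start method, children re-import the entry module, and unguarded top-level code would run again in every child.

```python
# Hedged illustration of the __main__ guard that #1236 asks for: under
# "spawn", child processes re-import this module, so the Pool setup must
# be guarded or it would re-execute in every child.
import multiprocessing


def square(x):
    return x * x


if __name__ == "__main__":
    multiprocessing.set_start_method("spawn", force=True)
    with multiprocessing.Pool(processes=2) as pool:
        print(pool.map(square, [1, 2, 3]))
```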
pex-tool__pex-1502
Release 2.1.53 On the docket: + [x] pex stops interpreter search if even one interpreter fails to identify itself #1494 + [x] Add support for setting custom venv prompts. #1499 + [x] How to know whether we are running from within pex #1485
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.52\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.53\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index e3cb3fcdf..72ae8ff25 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,20 @@ Release Notes ============= +2.1.53 +------ + +This release fixes a bug identifying certain interpreters on macOS +Monterey. Additionally, Pex now exposes the ``PEX`` environment +variable inside running PEXes to allow application code to both detect +it's running from a PEX and determine where that PEX is located. + +* Guard against fake interpreters. (#1500) + `PR #1500 <https://github.com/pantsbuild/pex/pull/1500>`_ + +* Introduce the ``PEX`` env var. (#1495) + `PR #1495 <https://github.com/pantsbuild/pex/pull/1495>`_ + 2.1.52 ------ diff --git a/pex/version.py b/pex/version.py index b9833ab3e..ea63babce 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.52" +__version__ = "2.1.53"
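Per the changelog above, 2.1.53 exposes a `PEX` environment variable inside running PEXes. A minimal detection sketch; only the variable's presence is relied on here, and its exact value format is not specified in this record:

```python
# Hedged sketch: detect in-PEX execution via the PEX env var from 2.1.53.
import os

pex_location = os.environ.get("PEX")
if pex_location:
    print("running from a PEX located at:", pex_location)
else:
    print("not running from a PEX")
```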
pex-tool__pex-1314
Release 2.1.38 On the docket: + [ ] PEX direct requirement metadata for resolves via Pip is incorrect. #1311
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.37\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.38\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index f66b63894..778a61032 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,16 @@ Release Notes ============= +2.1.38 +------ + +A hotfix that finishes work started in 2.1.37 by #1304 to align Pip +based resolve results with ``--pex-repository`` based resolve results +for requirements with '.' in their names as allowed by PEP-503. + +* Fix PEX direct requirements metadata. (#1312) + `PR #1312 <https://github.com/pantsbuild/pex/pull/1312>`_ + 2.1.37 ------ diff --git a/pex/version.py b/pex/version.py index 7fc3ab4af..092a74bf7 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.37" +__version__ = "2.1.38"
pex-tool__pex-1547
Release 2.1.59 On the docket: + [x] Add knob for --venv site-packages symlinking. #1543 + [x] Fix Pex to identify Python 3.10 interpreters. #1545
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.58\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.59\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index c8533e860..6f5b53a3e 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,21 @@ Release Notes ============= +2.1.59 +------ + +This release adds the boolean option ``--venv-site-packages-copies`` to +control whether ``--venv`` execution mode PEXes create their venv with +copies (hardlinks when possible) or symlinks. It also fixes a bug that +prevented Python 3.10 interpreters from being discovered when +``--interpreter-constraint`` was used. + +* Add knob for --venv site-packages symlinking. (#1543) + `PR #1543 <https://github.com/pantsbuild/pex/pull/1543>`_ + +* Fix Pex to identify Python 3.10 interpreters. (#1545) + `PR #1545 <https://github.com/pantsbuild/pex/pull/1545>`_ + 2.1.58 ------ diff --git a/pex/version.py b/pex/version.py index 51231815e..30fbcbd9c 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.58" +__version__ = "2.1.59"
pex-tool__pex-1255
Release 2.1.32 On the docket: + [x] Venv `pex` and bin scripts can run afoul of shebang length limits. #1252
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.31\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.32\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 85a8bdc9b..d72ec401a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.32 +------ + +This is a hotfix release that fixes ``--venv`` mode shebangs being too long for some Linux +environments. + +* Guard against too long ``--venv`` mode shebangs. (#1254) + `PR #1254 <https://github.com/pantsbuild/pex/pull/1254>`_ + 2.1.31 ------ diff --git a/pex/version.py b/pex/version.py index 058623131..21d1e3b7c 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.31" +__version__ = "2.1.32"
pytest-dev__pytest-django-881
admin_client is not checking for login success `client.login` inside `admin_client` can return `False` in the case when there's an existing admin user with a password set to something other than `'password'`. Perhaps, `admin_client` should use `force_login` instead?
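A minimal sketch of the fix the issue suggests, swapping the password-based `client.login(...)` used by the `admin_client` fixture shown below for `force_login`; this follows the issue's proposal and is not necessarily the exact change the maintainers made:

```python
# Hedged sketch of the suggested fixture: force_login does not depend on
# the pre-existing admin user's password being "password".
import pytest


@pytest.fixture()
def admin_client(db, admin_user):
    """A Django test client logged in as an admin user."""
    from django.test.client import Client

    client = Client()
    client.force_login(admin_user)
    return client
```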
[ { "content": "\"\"\"All pytest-django fixtures\"\"\"\n\n\nimport os\nfrom contextlib import contextmanager\nfrom functools import partial\n\nimport pytest\n\nfrom . import live_server_helper\nfrom .django_compat import is_django_unittest\nfrom .lazy_django import skip_if_no_django\n\n__all__ = [\n \"django_db_setup\",\n \"db\",\n \"transactional_db\",\n \"django_db_reset_sequences\",\n \"admin_user\",\n \"django_user_model\",\n \"django_username_field\",\n \"client\",\n \"admin_client\",\n \"rf\",\n \"settings\",\n \"live_server\",\n \"_live_server_helper\",\n \"django_assert_num_queries\",\n \"django_assert_max_num_queries\",\n]\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_tox_suffix():\n skip_if_no_django()\n\n tox_environment = os.getenv(\"TOX_PARALLEL_ENV\")\n if tox_environment:\n # Put a suffix like _py27-django21 on tox workers\n _set_suffix_to_test_databases(suffix=tox_environment)\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_xdist_suffix(request):\n skip_if_no_django()\n\n xdist_suffix = getattr(request.config, \"workerinput\", {}).get(\"workerid\")\n if xdist_suffix:\n # Put a suffix like _gw0, _gw1 etc on xdist processes\n _set_suffix_to_test_databases(suffix=xdist_suffix)\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_parallel_suffix(\n django_db_modify_db_settings_tox_suffix,\n django_db_modify_db_settings_xdist_suffix,\n):\n skip_if_no_django()\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings(django_db_modify_db_settings_parallel_suffix):\n skip_if_no_django()\n\n\[email protected](scope=\"session\")\ndef django_db_use_migrations(request):\n return not request.config.getvalue(\"nomigrations\")\n\n\[email protected](scope=\"session\")\ndef django_db_keepdb(request):\n return request.config.getvalue(\"reuse_db\")\n\n\[email protected](scope=\"session\")\ndef django_db_createdb(request):\n return request.config.getvalue(\"create_db\")\n\n\[email protected](scope=\"session\")\ndef django_db_setup(\n request,\n django_test_environment,\n django_db_blocker,\n django_db_use_migrations,\n django_db_keepdb,\n django_db_createdb,\n django_db_modify_db_settings,\n):\n \"\"\"Top level fixture to ensure test databases are available\"\"\"\n from django.test.utils import setup_databases, teardown_databases\n\n setup_databases_args = {}\n\n if not django_db_use_migrations:\n _disable_native_migrations()\n\n if django_db_keepdb and not django_db_createdb:\n setup_databases_args[\"keepdb\"] = True\n\n with django_db_blocker.unblock():\n db_cfg = setup_databases(\n verbosity=request.config.option.verbose,\n interactive=False,\n **setup_databases_args\n )\n\n def teardown_database():\n with django_db_blocker.unblock():\n try:\n teardown_databases(db_cfg, verbosity=request.config.option.verbose)\n except Exception as exc:\n request.node.warn(\n pytest.PytestWarning(\n \"Error when trying to teardown test databases: %r\" % exc\n )\n )\n\n if not django_db_keepdb:\n request.addfinalizer(teardown_database)\n\n\ndef _django_db_fixture_helper(\n request, django_db_blocker, transactional=False, reset_sequences=False\n):\n if is_django_unittest(request):\n return\n\n if not transactional and \"live_server\" in request.fixturenames:\n # Do nothing, we get called with transactional=True, too.\n return\n\n django_db_blocker.unblock()\n request.addfinalizer(django_db_blocker.restore)\n\n if transactional:\n from django.test import TransactionTestCase as django_case\n\n if reset_sequences:\n\n 
class ResetSequenceTestCase(django_case):\n reset_sequences = True\n\n django_case = ResetSequenceTestCase\n else:\n from django.test import TestCase as django_case\n\n test_case = django_case(methodName=\"__init__\")\n test_case._pre_setup()\n request.addfinalizer(test_case._post_teardown)\n\n\ndef _disable_native_migrations():\n from django.conf import settings\n from django.core.management.commands import migrate\n\n from .migrations import DisableMigrations\n\n settings.MIGRATION_MODULES = DisableMigrations()\n\n class MigrateSilentCommand(migrate.Command):\n def handle(self, *args, **kwargs):\n kwargs[\"verbosity\"] = 0\n return super().handle(*args, **kwargs)\n\n migrate.Command = MigrateSilentCommand\n\n\ndef _set_suffix_to_test_databases(suffix):\n from django.conf import settings\n\n for db_settings in settings.DATABASES.values():\n test_name = db_settings.get(\"TEST\", {}).get(\"NAME\")\n\n if not test_name:\n if db_settings[\"ENGINE\"] == \"django.db.backends.sqlite3\":\n continue\n test_name = \"test_{}\".format(db_settings[\"NAME\"])\n\n if test_name == \":memory:\":\n continue\n\n db_settings.setdefault(\"TEST\", {})\n db_settings[\"TEST\"][\"NAME\"] = \"{}_{}\".format(test_name, suffix)\n\n\n# ############### User visible fixtures ################\n\n\[email protected](scope=\"function\")\ndef db(request, django_db_setup, django_db_blocker):\n \"\"\"Require a django test database.\n\n This database will be setup with the default fixtures and will have\n the transaction management disabled. At the end of the test the outer\n transaction that wraps the test itself will be rolled back to undo any\n changes to the database (in case the backend supports transactions).\n This is more limited than the ``transactional_db`` resource but\n faster.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n if \"django_db_reset_sequences\" in request.fixturenames:\n request.getfixturevalue(\"django_db_reset_sequences\")\n if (\n \"transactional_db\" in request.fixturenames\n or \"live_server\" in request.fixturenames\n ):\n request.getfixturevalue(\"transactional_db\")\n else:\n _django_db_fixture_helper(request, django_db_blocker, transactional=False)\n\n\[email protected](scope=\"function\")\ndef transactional_db(request, django_db_setup, django_db_blocker):\n \"\"\"Require a django test database with transaction support.\n\n This will re-initialise the django database for each test and is\n thus slower than the normal ``db`` fixture.\n\n If you want to use the database with transactions you must request\n this resource.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n if \"django_db_reset_sequences\" in request.fixturenames:\n request.getfixturevalue(\"django_db_reset_sequences\")\n _django_db_fixture_helper(request, django_db_blocker, transactional=True)\n\n\[email protected](scope=\"function\")\ndef django_db_reset_sequences(request, django_db_setup, django_db_blocker):\n \"\"\"Require a transactional test database with sequence reset support.\n\n This behaves like the ``transactional_db`` fixture, with the addition\n of enforcing a reset of all auto increment sequences. If the enquiring\n test relies on such values (e.g. 
ids as primary keys), you should\n request this resource to ensure they are consistent across tests.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n _django_db_fixture_helper(\n request, django_db_blocker, transactional=True, reset_sequences=True\n )\n\n\[email protected]()\ndef client():\n \"\"\"A Django test client instance.\"\"\"\n skip_if_no_django()\n\n from django.test.client import Client\n\n return Client()\n\n\[email protected]()\ndef django_user_model(db):\n \"\"\"The class of Django's user model.\"\"\"\n from django.contrib.auth import get_user_model\n\n return get_user_model()\n\n\[email protected]()\ndef django_username_field(django_user_model):\n \"\"\"The fieldname for the username used with Django's user model.\"\"\"\n return django_user_model.USERNAME_FIELD\n\n\[email protected]()\ndef admin_user(db, django_user_model, django_username_field):\n \"\"\"A Django admin user.\n\n This uses an existing user with username \"admin\", or creates a new one with\n password \"password\".\n \"\"\"\n UserModel = django_user_model\n username_field = django_username_field\n username = \"[email protected]\" if username_field == \"email\" else \"admin\"\n\n try:\n # The default behavior of `get_by_natural_key()` is to look up by `username_field`.\n # However the user model is free to override it with any sort of custom behavior.\n # The Django authentication backend already assumes the lookup is by username,\n # so we can assume so as well.\n user = UserModel._default_manager.get_by_natural_key(username)\n except UserModel.DoesNotExist:\n extra_fields = {}\n if username_field not in (\"username\", \"email\"):\n extra_fields[username_field] = \"admin\"\n user = UserModel._default_manager.create_superuser(\n username, \"[email protected]\", \"password\", **extra_fields\n )\n return user\n\n\[email protected]()\ndef admin_client(db, admin_user):\n \"\"\"A Django test client logged in as an admin user.\"\"\"\n from django.test.client import Client\n\n client = Client()\n client.login(username=admin_user.username, password=\"password\")\n return client\n\n\[email protected]()\ndef rf():\n \"\"\"RequestFactory instance\"\"\"\n skip_if_no_django()\n\n from django.test.client import RequestFactory\n\n return RequestFactory()\n\n\nclass SettingsWrapper:\n _to_restore = []\n\n def __delattr__(self, attr):\n from django.test import override_settings\n\n override = override_settings()\n override.enable()\n from django.conf import settings\n\n delattr(settings, attr)\n\n self._to_restore.append(override)\n\n def __setattr__(self, attr, value):\n from django.test import override_settings\n\n override = override_settings(**{attr: value})\n override.enable()\n self._to_restore.append(override)\n\n def __getattr__(self, item):\n from django.conf import settings\n\n return getattr(settings, item)\n\n def finalize(self):\n for override in reversed(self._to_restore):\n override.disable()\n\n del self._to_restore[:]\n\n\[email protected]_fixture()\ndef settings():\n \"\"\"A Django settings object which restores changes after the testrun\"\"\"\n skip_if_no_django()\n\n wrapper = SettingsWrapper()\n yield wrapper\n wrapper.finalize()\n\n\[email protected](scope=\"session\")\ndef live_server(request):\n \"\"\"Run a live Django server in the background during tests\n\n The address the server is started from is taken from the\n --liveserver command line option 
or if this is not provided from\n the DJANGO_LIVE_TEST_SERVER_ADDRESS environment variable. If\n neither is provided ``localhost`` is used. See the Django\n documentation for its full syntax.\n\n NOTE: If the live server needs database access to handle a request\n your test will have to request database access. Furthermore\n when the tests want to see data added by the live-server (or\n the other way around) transactional database access will be\n needed as data inside a transaction is not shared between\n the live server and test code.\n\n Static assets will be automatically served when\n ``django.contrib.staticfiles`` is available in INSTALLED_APPS.\n \"\"\"\n skip_if_no_django()\n\n addr = request.config.getvalue(\"liveserver\") or os.getenv(\n \"DJANGO_LIVE_TEST_SERVER_ADDRESS\"\n ) or \"localhost\"\n\n server = live_server_helper.LiveServer(addr)\n request.addfinalizer(server.stop)\n return server\n\n\[email protected](autouse=True, scope=\"function\")\ndef _live_server_helper(request):\n \"\"\"Helper to make live_server work, internal to pytest-django.\n\n This helper will dynamically request the transactional_db fixture\n for a test which uses the live_server fixture. This allows the\n server and test to access the database without having to mark\n this explicitly which is handy since it is usually required and\n matches the Django behaviour.\n\n The separate helper is required since live_server can not request\n transactional_db directly since it is session scoped instead of\n function-scoped.\n\n It will also override settings only for the duration of the test.\n \"\"\"\n if \"live_server\" not in request.fixturenames:\n return\n\n request.getfixturevalue(\"transactional_db\")\n\n live_server = request.getfixturevalue(\"live_server\")\n live_server._live_server_modified_settings.enable()\n request.addfinalizer(live_server._live_server_modified_settings.disable)\n\n\n@contextmanager\ndef _assert_num_queries(config, num, exact=True, connection=None, info=None):\n from django.test.utils import CaptureQueriesContext\n\n if connection is None:\n from django.db import connection\n\n verbose = config.getoption(\"verbose\") > 0\n with CaptureQueriesContext(connection) as context:\n yield context\n num_performed = len(context)\n if exact:\n failed = num != num_performed\n else:\n failed = num_performed > num\n if failed:\n msg = \"Expected to perform {} queries {}{}\".format(\n num,\n \"\" if exact else \"or less \",\n \"but {} done\".format(\n num_performed == 1 and \"1 was\" or \"{} were\".format(num_performed)\n ),\n )\n if info:\n msg += \"\\n{}\".format(info)\n if verbose:\n sqls = (q[\"sql\"] for q in context.captured_queries)\n msg += \"\\n\\nQueries:\\n========\\n\\n\" + \"\\n\\n\".join(sqls)\n else:\n msg += \" (add -v option to show queries)\"\n pytest.fail(msg)\n\n\[email protected](scope=\"function\")\ndef django_assert_num_queries(pytestconfig):\n return partial(_assert_num_queries, pytestconfig)\n\n\[email protected](scope=\"function\")\ndef django_assert_max_num_queries(pytestconfig):\n return partial(_assert_num_queries, pytestconfig, exact=False)\n", "path": "pytest_django/fixtures.py" } ]
[ { "content": "\"\"\"All pytest-django fixtures\"\"\"\n\n\nimport os\nfrom contextlib import contextmanager\nfrom functools import partial\n\nimport pytest\n\nfrom . import live_server_helper\nfrom .django_compat import is_django_unittest\nfrom .lazy_django import skip_if_no_django\n\n__all__ = [\n \"django_db_setup\",\n \"db\",\n \"transactional_db\",\n \"django_db_reset_sequences\",\n \"admin_user\",\n \"django_user_model\",\n \"django_username_field\",\n \"client\",\n \"admin_client\",\n \"rf\",\n \"settings\",\n \"live_server\",\n \"_live_server_helper\",\n \"django_assert_num_queries\",\n \"django_assert_max_num_queries\",\n]\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_tox_suffix():\n skip_if_no_django()\n\n tox_environment = os.getenv(\"TOX_PARALLEL_ENV\")\n if tox_environment:\n # Put a suffix like _py27-django21 on tox workers\n _set_suffix_to_test_databases(suffix=tox_environment)\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_xdist_suffix(request):\n skip_if_no_django()\n\n xdist_suffix = getattr(request.config, \"workerinput\", {}).get(\"workerid\")\n if xdist_suffix:\n # Put a suffix like _gw0, _gw1 etc on xdist processes\n _set_suffix_to_test_databases(suffix=xdist_suffix)\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings_parallel_suffix(\n django_db_modify_db_settings_tox_suffix,\n django_db_modify_db_settings_xdist_suffix,\n):\n skip_if_no_django()\n\n\[email protected](scope=\"session\")\ndef django_db_modify_db_settings(django_db_modify_db_settings_parallel_suffix):\n skip_if_no_django()\n\n\[email protected](scope=\"session\")\ndef django_db_use_migrations(request):\n return not request.config.getvalue(\"nomigrations\")\n\n\[email protected](scope=\"session\")\ndef django_db_keepdb(request):\n return request.config.getvalue(\"reuse_db\")\n\n\[email protected](scope=\"session\")\ndef django_db_createdb(request):\n return request.config.getvalue(\"create_db\")\n\n\[email protected](scope=\"session\")\ndef django_db_setup(\n request,\n django_test_environment,\n django_db_blocker,\n django_db_use_migrations,\n django_db_keepdb,\n django_db_createdb,\n django_db_modify_db_settings,\n):\n \"\"\"Top level fixture to ensure test databases are available\"\"\"\n from django.test.utils import setup_databases, teardown_databases\n\n setup_databases_args = {}\n\n if not django_db_use_migrations:\n _disable_native_migrations()\n\n if django_db_keepdb and not django_db_createdb:\n setup_databases_args[\"keepdb\"] = True\n\n with django_db_blocker.unblock():\n db_cfg = setup_databases(\n verbosity=request.config.option.verbose,\n interactive=False,\n **setup_databases_args\n )\n\n def teardown_database():\n with django_db_blocker.unblock():\n try:\n teardown_databases(db_cfg, verbosity=request.config.option.verbose)\n except Exception as exc:\n request.node.warn(\n pytest.PytestWarning(\n \"Error when trying to teardown test databases: %r\" % exc\n )\n )\n\n if not django_db_keepdb:\n request.addfinalizer(teardown_database)\n\n\ndef _django_db_fixture_helper(\n request, django_db_blocker, transactional=False, reset_sequences=False\n):\n if is_django_unittest(request):\n return\n\n if not transactional and \"live_server\" in request.fixturenames:\n # Do nothing, we get called with transactional=True, too.\n return\n\n django_db_blocker.unblock()\n request.addfinalizer(django_db_blocker.restore)\n\n if transactional:\n from django.test import TransactionTestCase as django_case\n\n if reset_sequences:\n\n 
class ResetSequenceTestCase(django_case):\n reset_sequences = True\n\n django_case = ResetSequenceTestCase\n else:\n from django.test import TestCase as django_case\n\n test_case = django_case(methodName=\"__init__\")\n test_case._pre_setup()\n request.addfinalizer(test_case._post_teardown)\n\n\ndef _disable_native_migrations():\n from django.conf import settings\n from django.core.management.commands import migrate\n\n from .migrations import DisableMigrations\n\n settings.MIGRATION_MODULES = DisableMigrations()\n\n class MigrateSilentCommand(migrate.Command):\n def handle(self, *args, **kwargs):\n kwargs[\"verbosity\"] = 0\n return super().handle(*args, **kwargs)\n\n migrate.Command = MigrateSilentCommand\n\n\ndef _set_suffix_to_test_databases(suffix):\n from django.conf import settings\n\n for db_settings in settings.DATABASES.values():\n test_name = db_settings.get(\"TEST\", {}).get(\"NAME\")\n\n if not test_name:\n if db_settings[\"ENGINE\"] == \"django.db.backends.sqlite3\":\n continue\n test_name = \"test_{}\".format(db_settings[\"NAME\"])\n\n if test_name == \":memory:\":\n continue\n\n db_settings.setdefault(\"TEST\", {})\n db_settings[\"TEST\"][\"NAME\"] = \"{}_{}\".format(test_name, suffix)\n\n\n# ############### User visible fixtures ################\n\n\[email protected](scope=\"function\")\ndef db(request, django_db_setup, django_db_blocker):\n \"\"\"Require a django test database.\n\n This database will be setup with the default fixtures and will have\n the transaction management disabled. At the end of the test the outer\n transaction that wraps the test itself will be rolled back to undo any\n changes to the database (in case the backend supports transactions).\n This is more limited than the ``transactional_db`` resource but\n faster.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n if \"django_db_reset_sequences\" in request.fixturenames:\n request.getfixturevalue(\"django_db_reset_sequences\")\n if (\n \"transactional_db\" in request.fixturenames\n or \"live_server\" in request.fixturenames\n ):\n request.getfixturevalue(\"transactional_db\")\n else:\n _django_db_fixture_helper(request, django_db_blocker, transactional=False)\n\n\[email protected](scope=\"function\")\ndef transactional_db(request, django_db_setup, django_db_blocker):\n \"\"\"Require a django test database with transaction support.\n\n This will re-initialise the django database for each test and is\n thus slower than the normal ``db`` fixture.\n\n If you want to use the database with transactions you must request\n this resource.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n if \"django_db_reset_sequences\" in request.fixturenames:\n request.getfixturevalue(\"django_db_reset_sequences\")\n _django_db_fixture_helper(request, django_db_blocker, transactional=True)\n\n\[email protected](scope=\"function\")\ndef django_db_reset_sequences(request, django_db_setup, django_db_blocker):\n \"\"\"Require a transactional test database with sequence reset support.\n\n This behaves like the ``transactional_db`` fixture, with the addition\n of enforcing a reset of all auto increment sequences. If the enquiring\n test relies on such values (e.g. 
ids as primary keys), you should\n request this resource to ensure they are consistent across tests.\n\n If multiple database fixtures are requested, they take precedence\n over each other in the following order (the last one wins): ``db``,\n ``transactional_db``, ``django_db_reset_sequences``.\n \"\"\"\n _django_db_fixture_helper(\n request, django_db_blocker, transactional=True, reset_sequences=True\n )\n\n\[email protected]()\ndef client():\n \"\"\"A Django test client instance.\"\"\"\n skip_if_no_django()\n\n from django.test.client import Client\n\n return Client()\n\n\[email protected]()\ndef django_user_model(db):\n \"\"\"The class of Django's user model.\"\"\"\n from django.contrib.auth import get_user_model\n\n return get_user_model()\n\n\[email protected]()\ndef django_username_field(django_user_model):\n \"\"\"The fieldname for the username used with Django's user model.\"\"\"\n return django_user_model.USERNAME_FIELD\n\n\[email protected]()\ndef admin_user(db, django_user_model, django_username_field):\n \"\"\"A Django admin user.\n\n This uses an existing user with username \"admin\", or creates a new one with\n password \"password\".\n \"\"\"\n UserModel = django_user_model\n username_field = django_username_field\n username = \"[email protected]\" if username_field == \"email\" else \"admin\"\n\n try:\n user = UserModel._default_manager.get(**{username_field: username})\n except UserModel.DoesNotExist:\n extra_fields = {}\n if username_field not in (\"username\", \"email\"):\n extra_fields[username_field] = \"admin\"\n user = UserModel._default_manager.create_superuser(\n username, \"[email protected]\", \"password\", **extra_fields\n )\n return user\n\n\[email protected]()\ndef admin_client(db, admin_user):\n \"\"\"A Django test client logged in as an admin user.\"\"\"\n from django.test.client import Client\n\n client = Client()\n client.force_login(admin_user)\n return client\n\n\[email protected]()\ndef rf():\n \"\"\"RequestFactory instance\"\"\"\n skip_if_no_django()\n\n from django.test.client import RequestFactory\n\n return RequestFactory()\n\n\nclass SettingsWrapper:\n _to_restore = []\n\n def __delattr__(self, attr):\n from django.test import override_settings\n\n override = override_settings()\n override.enable()\n from django.conf import settings\n\n delattr(settings, attr)\n\n self._to_restore.append(override)\n\n def __setattr__(self, attr, value):\n from django.test import override_settings\n\n override = override_settings(**{attr: value})\n override.enable()\n self._to_restore.append(override)\n\n def __getattr__(self, item):\n from django.conf import settings\n\n return getattr(settings, item)\n\n def finalize(self):\n for override in reversed(self._to_restore):\n override.disable()\n\n del self._to_restore[:]\n\n\[email protected]_fixture()\ndef settings():\n \"\"\"A Django settings object which restores changes after the testrun\"\"\"\n skip_if_no_django()\n\n wrapper = SettingsWrapper()\n yield wrapper\n wrapper.finalize()\n\n\[email protected](scope=\"session\")\ndef live_server(request):\n \"\"\"Run a live Django server in the background during tests\n\n The address the server is started from is taken from the\n --liveserver command line option or if this is not provided from\n the DJANGO_LIVE_TEST_SERVER_ADDRESS environment variable. If\n neither is provided ``localhost`` is used. See the Django\n documentation for its full syntax.\n\n NOTE: If the live server needs database access to handle a request\n your test will have to request database access. 
Furthermore\n when the tests want to see data added by the live-server (or\n the other way around) transactional database access will be\n needed as data inside a transaction is not shared between\n the live server and test code.\n\n Static assets will be automatically served when\n ``django.contrib.staticfiles`` is available in INSTALLED_APPS.\n \"\"\"\n skip_if_no_django()\n\n addr = request.config.getvalue(\"liveserver\") or os.getenv(\n \"DJANGO_LIVE_TEST_SERVER_ADDRESS\"\n ) or \"localhost\"\n\n server = live_server_helper.LiveServer(addr)\n request.addfinalizer(server.stop)\n return server\n\n\[email protected](autouse=True, scope=\"function\")\ndef _live_server_helper(request):\n \"\"\"Helper to make live_server work, internal to pytest-django.\n\n This helper will dynamically request the transactional_db fixture\n for a test which uses the live_server fixture. This allows the\n server and test to access the database without having to mark\n this explicitly which is handy since it is usually required and\n matches the Django behaviour.\n\n The separate helper is required since live_server can not request\n transactional_db directly since it is session scoped instead of\n function-scoped.\n\n It will also override settings only for the duration of the test.\n \"\"\"\n if \"live_server\" not in request.fixturenames:\n return\n\n request.getfixturevalue(\"transactional_db\")\n\n live_server = request.getfixturevalue(\"live_server\")\n live_server._live_server_modified_settings.enable()\n request.addfinalizer(live_server._live_server_modified_settings.disable)\n\n\n@contextmanager\ndef _assert_num_queries(config, num, exact=True, connection=None, info=None):\n from django.test.utils import CaptureQueriesContext\n\n if connection is None:\n from django.db import connection\n\n verbose = config.getoption(\"verbose\") > 0\n with CaptureQueriesContext(connection) as context:\n yield context\n num_performed = len(context)\n if exact:\n failed = num != num_performed\n else:\n failed = num_performed > num\n if failed:\n msg = \"Expected to perform {} queries {}{}\".format(\n num,\n \"\" if exact else \"or less \",\n \"but {} done\".format(\n num_performed == 1 and \"1 was\" or \"{} were\".format(num_performed)\n ),\n )\n if info:\n msg += \"\\n{}\".format(info)\n if verbose:\n sqls = (q[\"sql\"] for q in context.captured_queries)\n msg += \"\\n\\nQueries:\\n========\\n\\n\" + \"\\n\\n\".join(sqls)\n else:\n msg += \" (add -v option to show queries)\"\n pytest.fail(msg)\n\n\[email protected](scope=\"function\")\ndef django_assert_num_queries(pytestconfig):\n return partial(_assert_num_queries, pytestconfig)\n\n\[email protected](scope=\"function\")\ndef django_assert_max_num_queries(pytestconfig):\n return partial(_assert_num_queries, pytestconfig, exact=False)\n", "path": "pytest_django/fixtures.py" } ]
diff --git a/docs/helpers.rst b/docs/helpers.rst index 03434faf7..d70ffe2d0 100644 --- a/docs/helpers.rst +++ b/docs/helpers.rst @@ -158,14 +158,18 @@ Example response = client.get('/') assert response.content == 'Foobar' -To use `client` as an authenticated standard user, call its `login()` method before accessing a URL: +To use `client` as an authenticated standard user, call its `force_login()` or +`login()` method before accessing a URL: :: def test_with_authenticated_client(client, django_user_model): username = "user1" password = "bar" - django_user_model.objects.create_user(username=username, password=password) + user = django_user_model.objects.create_user(username=username, password=password) + # Use this: + client.force_login(user) + # Or this: client.login(username=username, password=password) response = client.get('/private') assert response.content == 'Protected Area' diff --git a/pytest_django/fixtures.py b/pytest_django/fixtures.py index 0f2dd6115..d1918d3fd 100644 --- a/pytest_django/fixtures.py +++ b/pytest_django/fixtures.py @@ -304,7 +304,7 @@ def admin_client(db, admin_user): from django.test.client import Client client = Client() - client.login(username=admin_user.username, password="password") + client.force_login(admin_user) return client diff --git a/tests/test_fixtures.py b/tests/test_fixtures.py index 6a8e4208a..26b5394aa 100644 --- a/tests/test_fixtures.py +++ b/tests/test_fixtures.py @@ -49,6 +49,17 @@ def test_admin_client_no_db_marker(admin_client): assert force_str(resp.content) == "You are an admin" +# For test below. [email protected] +def existing_admin_user(django_user_model): + return django_user_model._default_manager.create_superuser('admin', None, None) + + +def test_admin_client_existing_user(db, existing_admin_user, admin_user, admin_client): + resp = admin_client.get("/admin-required/") + assert force_str(resp.content) == "You are an admin" + + @pytest.mark.django_db def test_admin_user(admin_user, django_user_model): assert isinstance(admin_user, django_user_model)
scverse__scanpy-2566
pyproject.toml should refer to `igraph` and not `python-igraph` ### Please make sure these conditions are met - [X] I have checked that this issue has not already been reported. - [X] I have confirmed this bug exists on the latest version of scanpy. - [X] (optional) I have confirmed this bug exists on the master branch of scanpy. ### What happened? I've noticed that `pyproject.toml` refers to the `python-igraph` package in the PyPI repository. This name is deprecated; the `igraph` package is currently called [`igraph`](https://pypi.org/project/igraph). The old package name currently works as a redirect (i.e. it brings in `igraph` as its own sub-dependency), but it will not be maintained in the future. Please switch to referring to `igraph` in `pyproject.toml` and not `python-igraph`. ### Minimal code sample ```python N/A ``` ### Error output _No response_ ### Versions <details> ``` ----- anndata 0.9.1 scanpy 1.9.3 ----- PIL 10.0.0 cycler 0.10.0 cython_runtime NA dateutil 2.8.2 h5py 3.9.0 joblib 1.3.1 kiwisolver 1.4.4 llvmlite 0.40.1 matplotlib 3.7.2 mpl_toolkits NA natsort 8.4.0 numba 0.57.1 numpy 1.24.4 packaging 23.1 pandas 2.0.3 pyparsing 3.0.9 pytz 2023.3 scipy 1.11.1 session_info 1.0.0 sitecustomize NA six 1.16.0 sklearn 1.3.0 threadpoolctl 3.2.0 ----- Python 3.11.4 (main, Jun 20 2023, 17:23:00) [Clang 14.0.3 (clang-1403.0.22.14.1)] macOS-13.2.1-arm64-arm-64bit ----- Session information updated at 2023-07-19 13:34 ``` </details> BaseException: Could not construct partition: Weight vector not the same size as the number of edges. - [X] I have checked that this issue has not already been reported. - [X] I have confirmed this bug exists on the latest version of scanpy. - [ ] (optional) I have confirmed this bug exists on the master branch of scanpy. --- I have been trying to replicate [this tutorial](https://scanpy-tutorials.readthedocs.io/en/latest/paga-paul15.html#Clustering-and-PAGA) on trajectory inference. I have followed every step up until clustering, where I try to use sc.tl.leiden(adata) to cluster, but keep having the following error. This seemed to have resolved itself by installing leidenalg via pip, but with conda install it fails every time. 
### Minimal code sample (that we can copy&paste without having any data) ```python sc.tl.leiden(adata) ``` ```pytb BaseException Traceback (most recent call last) Cell In [15], line 1 ----> 1 sc.tl.leiden(adata) File ~/miniconda3/envs/py39/lib/python3.9/site-packages/scanpy/tools/_leiden.py:144, in leiden(adata, resolution, restrict_to, random_state, key_added, adjacency, directed, use_weights, n_iterations, partition_type, neighbors_key, obsp, copy, **partition_kwargs) 142 partition_kwargs[‘resolution_parameter’] = resolution 143 # clustering proper → 144 part = leidenalg.find_partition(g, partition_type, **partition_kwargs) 145 # store output into adata.obs 146 groups = np.array(part.membership) File ~/miniconda3/envs/py39/lib/python3.9/site-packages/leidenalg/functions.py:81, in find_partition(graph, partition_type, initial_membership, weights, n_iterations, max_comm_size, seed, **kwargs) 79 if not weights is None: 80 kwargs[‘weights’] = weights —> 81 partition = partition_type(graph, 82 initial_membership=initial_membership, 83 **kwargs) 84 optimiser = Optimiser() 86 optimiser.max_comm_size = max_comm_size File ~/miniconda3/envs/py39/lib/python3.9/site-packages/leidenalg/VertexPartition.py:855, in RBConfigurationVertexPartition.init(self, graph, initial_membership, weights, node_sizes, resolution_parameter) 851 else: 852 # Make sure it is a list 853 node_sizes = list(node_sizes) → 855 self._partition = _c_leiden._new_RBConfigurationVertexPartition(pygraph_t, 856 initial_membership, weights, node_sizes, resolution_parameter) 857 self._update_internal_membership() BaseException: Could not construct partition: Weight vector not the same size as the number of edges. ``` #### Versions <details> ``` anndata 0.8.0 scanpy 1.9.1 ----- PIL 9.2.0 appnope 0.1.3 asttokens NA backcall 0.2.0 beta_ufunc NA binom_ufunc NA cffi 1.15.1 colorama 0.4.5 cycler 0.10.0 cython_runtime NA dateutil 2.8.2 debugpy 1.6.3 decorator 5.1.1 defusedxml 0.7.1 entrypoints 0.4 executing 1.1.0 h5py 3.7.0 hypergeom_ufunc NA igraph 0.9.11 ipykernel 6.16.0 ipython_genutils 0.2.0 ipywidgets 8.0.2 jedi 0.18.1 joblib 1.2.0 jupyter_server 1.19.1 kiwisolver 1.4.4 leidenalg 0.8.10 llvmlite 0.39.1 louvain 0.7.1 matplotlib 3.6.0 matplotlib_inline 0.1.6 mpl_toolkits NA natsort 8.2.0 nbinom_ufunc NA ncf_ufunc NA numba 0.56.2 numpy 1.23.3 packaging 21.3 pandas 1.5.0 parso 0.8.3 pexpect 4.8.0 pickleshare 0.7.5 pkg_resources NA prompt_toolkit 3.0.31 psutil 5.9.2 ptyprocess 0.7.0 pure_eval 0.2.2 pycparser 2.21 pydev_ipython NA pydevconsole NA pydevd 2.8.0 pydevd_file_utils NA pydevd_plugins NA pydevd_tracing NA pygments 2.13.0 pynndescent 0.5.7 pyparsing 3.0.9 pytz 2022.2.1 scipy 1.9.1 session_info 1.0.0 setuptools 65.4.0 six 1.16.0 sklearn 1.1.2 sphinxcontrib NA stack_data 0.5.1 statsmodels 0.13.2 texttable 1.6.4 threadpoolctl 3.1.0 tornado 6.2 tqdm 4.64.1 traitlets 5.4.0 typing_extensions NA umap 0.5.3 wcwidth 0.2.5 zipp NA zmq 24.0.1 zoneinfo NA ----- IPython 8.5.0 jupyter_client 7.3.5 jupyter_core 4.11.1 jupyterlab 3.4.7 notebook 6.4.12 ----- Python 3.9.13 | packaged by conda-forge | (main, May 27 2022, 17:00:33) [Clang 13.0.1 ] macOS-12.6-arm64-arm-64bit ----- Session information updated at 2022-09-29 11:08 ``` </details>
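Both reports above ultimately hinge on which igraph-related distributions are installed: the deprecated `python-igraph` shim versus the real `igraph` package, and a `leidenalg` build that matches the installed igraph (the reporter notes the leiden failure went away after reinstalling leidenalg via pip). As a quick, generic sanity check (not part of the original reports or of the patch below), the installed distributions can be listed with the standard library:

```python
from importlib.metadata import PackageNotFoundError, version

# "python-igraph" is now only a redirect package on PyPI; the code ships as "igraph".
# A leidenalg build that does not match the installed igraph (e.g. from mixing conda
# and pip installs) is consistent with the "Weight vector not the same size as the
# number of edges" failure quoted above.
for dist in ("igraph", "python-igraph", "leidenalg"):
    try:
        print(dist, version(dist))
    except PackageNotFoundError:
        print(dist, "not installed")
```

The patch below correspondingly pins the `leiden` extra to `igraph>=0.10` together with `leidenalg>=0.9`.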
[ { "content": "\"\"\"Logging and Profiling\n\"\"\"\nimport logging\nimport sys\nfrom functools import update_wrapper, partial\nfrom logging import CRITICAL, ERROR, WARNING, INFO, DEBUG\nfrom datetime import datetime, timedelta, timezone\nfrom typing import Optional, IO\nimport warnings\n\nimport anndata.logging\n\n\nHINT = (INFO + DEBUG) // 2\nlogging.addLevelName(HINT, 'HINT')\n\n\nclass _RootLogger(logging.RootLogger):\n def __init__(self, level):\n super().__init__(level)\n self.propagate = False\n _RootLogger.manager = logging.Manager(self)\n\n def log(\n self,\n level: int,\n msg: str,\n *,\n extra: Optional[dict] = None,\n time: datetime = None,\n deep: Optional[str] = None,\n ) -> datetime:\n from . import settings\n\n now = datetime.now(timezone.utc)\n time_passed: timedelta = None if time is None else now - time\n extra = {\n **(extra or {}),\n 'deep': deep if settings.verbosity.level < level else None,\n 'time_passed': time_passed,\n }\n super().log(level, msg, extra=extra)\n return now\n\n def critical(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(CRITICAL, msg, time=time, deep=deep, extra=extra)\n\n def error(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(ERROR, msg, time=time, deep=deep, extra=extra)\n\n def warning(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(WARNING, msg, time=time, deep=deep, extra=extra)\n\n def info(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(INFO, msg, time=time, deep=deep, extra=extra)\n\n def hint(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(HINT, msg, time=time, deep=deep, extra=extra)\n\n def debug(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(DEBUG, msg, time=time, deep=deep, extra=extra)\n\n\ndef _set_log_file(settings):\n file = settings.logfile\n name = settings.logpath\n root = settings._root_logger\n h = logging.StreamHandler(file) if name is None else logging.FileHandler(name)\n h.setFormatter(_LogFormatter())\n h.setLevel(root.level)\n if len(root.handlers) == 1:\n root.removeHandler(root.handlers[0])\n elif len(root.handlers) > 1:\n raise RuntimeError('Scanpy’s root logger somehow got more than one handler')\n root.addHandler(h)\n\n\ndef _set_log_level(settings, level: int):\n root = settings._root_logger\n root.setLevel(level)\n (h,) = root.handlers # may only be 1\n h.setLevel(level)\n\n\nclass _LogFormatter(logging.Formatter):\n def __init__(\n self, fmt='{levelname}: {message}', datefmt='%Y-%m-%d %H:%M', style='{'\n ):\n super().__init__(fmt, datefmt, style)\n\n def format(self, record: logging.LogRecord):\n format_orig = self._style._fmt\n if record.levelno == INFO:\n self._style._fmt = '{message}'\n elif record.levelno == HINT:\n self._style._fmt = '--> {message}'\n elif record.levelno == DEBUG:\n self._style._fmt = ' {message}'\n if record.time_passed:\n # strip microseconds\n if record.time_passed.microseconds:\n record.time_passed = timedelta(\n seconds=int(record.time_passed.total_seconds())\n )\n if '{time_passed}' in record.msg:\n record.msg = record.msg.replace(\n '{time_passed}', str(record.time_passed)\n )\n else:\n self._style._fmt += ' ({time_passed})'\n if record.deep:\n record.msg = f'{record.msg}: {record.deep}'\n result = logging.Formatter.format(self, record)\n self._style._fmt = format_orig\n return result\n\n\nprint_memory_usage = anndata.logging.print_memory_usage\nget_memory_usage = 
anndata.logging.get_memory_usage\n\n\n_DEPENDENCIES_NUMERICS = [\n 'anndata', # anndata actually shouldn't, but as long as it's in development\n 'umap',\n 'numpy',\n 'scipy',\n 'pandas',\n ('sklearn', 'scikit-learn'),\n 'statsmodels',\n ('igraph', 'python-igraph'),\n 'louvain',\n 'leidenalg',\n 'pynndescent',\n]\n\n\ndef _versions_dependencies(dependencies):\n # this is not the same as the requirements!\n for mod in dependencies:\n mod_name, dist_name = mod if isinstance(mod, tuple) else (mod, mod)\n try:\n imp = __import__(mod_name)\n yield dist_name, imp.__version__\n except (ImportError, AttributeError):\n pass\n\n\ndef print_header(*, file=None):\n \"\"\"\\\n Versions that might influence the numerical results.\n Matplotlib and Seaborn are excluded from this.\n \"\"\"\n\n modules = ['scanpy'] + _DEPENDENCIES_NUMERICS\n print(\n ' '.join(f'{mod}=={ver}' for mod, ver in _versions_dependencies(modules)),\n file=file or sys.stdout,\n )\n\n\ndef print_versions(*, file: Optional[IO[str]] = None):\n \"\"\"\\\n Print versions of imported packages, OS, and jupyter environment.\n\n For more options (including rich output) use `session_info.show` directly.\n \"\"\"\n import session_info\n\n if file is not None:\n from contextlib import redirect_stdout\n\n warnings.warn(\n \"Passing argument 'file' to print_versions is deprecated, and will be \"\n \"removed in a future version.\",\n FutureWarning,\n )\n with redirect_stdout(file):\n print_versions()\n else:\n session_info.show(\n dependencies=True,\n html=False,\n excludes=[\n 'builtins',\n 'stdlib_list',\n 'importlib_metadata',\n # Special module present if test coverage being calculated\n # https://gitlab.com/joelostblom/session_info/-/issues/10\n \"$coverage\",\n ],\n )\n\n\ndef print_version_and_date(*, file=None):\n \"\"\"\\\n Useful for starting a notebook so you see when you started working.\n \"\"\"\n from . import __version__\n\n if file is None:\n file = sys.stdout\n print(\n f'Running Scanpy {__version__}, ' f'on {datetime.now():%Y-%m-%d %H:%M}.',\n file=file,\n )\n\n\ndef _copy_docs_and_signature(fn):\n return partial(update_wrapper, wrapped=fn, assigned=['__doc__', '__annotations__'])\n\n\ndef error(\n msg: str,\n *,\n time: datetime = None,\n deep: Optional[str] = None,\n extra: Optional[dict] = None,\n) -> datetime:\n \"\"\"\\\n Log message with specific level and return current time.\n\n Parameters\n ==========\n msg\n Message to display.\n time\n A time in the past. 
If this is passed, the time difference from then\n to now is appended to `msg` as ` (HH:MM:SS)`.\n If `msg` contains `{time_passed}`, the time difference is instead\n inserted at that position.\n deep\n If the current verbosity is higher than the log function’s level,\n this gets displayed as well\n extra\n Additional values you can specify in `msg` like `{time_passed}`.\n \"\"\"\n from ._settings import settings\n\n return settings._root_logger.error(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef warning(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.warning(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef info(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.info(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef hint(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.hint(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef debug(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.debug(msg, time=time, deep=deep, extra=extra)\n", "path": "scanpy/logging.py" } ]
[ { "content": "\"\"\"Logging and Profiling\n\"\"\"\nimport logging\nimport sys\nfrom functools import update_wrapper, partial\nfrom logging import CRITICAL, ERROR, WARNING, INFO, DEBUG\nfrom datetime import datetime, timedelta, timezone\nfrom typing import Optional, IO\nimport warnings\n\nimport anndata.logging\n\n\nHINT = (INFO + DEBUG) // 2\nlogging.addLevelName(HINT, 'HINT')\n\n\nclass _RootLogger(logging.RootLogger):\n def __init__(self, level):\n super().__init__(level)\n self.propagate = False\n _RootLogger.manager = logging.Manager(self)\n\n def log(\n self,\n level: int,\n msg: str,\n *,\n extra: Optional[dict] = None,\n time: datetime = None,\n deep: Optional[str] = None,\n ) -> datetime:\n from . import settings\n\n now = datetime.now(timezone.utc)\n time_passed: timedelta = None if time is None else now - time\n extra = {\n **(extra or {}),\n 'deep': deep if settings.verbosity.level < level else None,\n 'time_passed': time_passed,\n }\n super().log(level, msg, extra=extra)\n return now\n\n def critical(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(CRITICAL, msg, time=time, deep=deep, extra=extra)\n\n def error(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(ERROR, msg, time=time, deep=deep, extra=extra)\n\n def warning(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(WARNING, msg, time=time, deep=deep, extra=extra)\n\n def info(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(INFO, msg, time=time, deep=deep, extra=extra)\n\n def hint(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(HINT, msg, time=time, deep=deep, extra=extra)\n\n def debug(self, msg, *, time=None, deep=None, extra=None) -> datetime:\n return self.log(DEBUG, msg, time=time, deep=deep, extra=extra)\n\n\ndef _set_log_file(settings):\n file = settings.logfile\n name = settings.logpath\n root = settings._root_logger\n h = logging.StreamHandler(file) if name is None else logging.FileHandler(name)\n h.setFormatter(_LogFormatter())\n h.setLevel(root.level)\n if len(root.handlers) == 1:\n root.removeHandler(root.handlers[0])\n elif len(root.handlers) > 1:\n raise RuntimeError('Scanpy’s root logger somehow got more than one handler')\n root.addHandler(h)\n\n\ndef _set_log_level(settings, level: int):\n root = settings._root_logger\n root.setLevel(level)\n (h,) = root.handlers # may only be 1\n h.setLevel(level)\n\n\nclass _LogFormatter(logging.Formatter):\n def __init__(\n self, fmt='{levelname}: {message}', datefmt='%Y-%m-%d %H:%M', style='{'\n ):\n super().__init__(fmt, datefmt, style)\n\n def format(self, record: logging.LogRecord):\n format_orig = self._style._fmt\n if record.levelno == INFO:\n self._style._fmt = '{message}'\n elif record.levelno == HINT:\n self._style._fmt = '--> {message}'\n elif record.levelno == DEBUG:\n self._style._fmt = ' {message}'\n if record.time_passed:\n # strip microseconds\n if record.time_passed.microseconds:\n record.time_passed = timedelta(\n seconds=int(record.time_passed.total_seconds())\n )\n if '{time_passed}' in record.msg:\n record.msg = record.msg.replace(\n '{time_passed}', str(record.time_passed)\n )\n else:\n self._style._fmt += ' ({time_passed})'\n if record.deep:\n record.msg = f'{record.msg}: {record.deep}'\n result = logging.Formatter.format(self, record)\n self._style._fmt = format_orig\n return result\n\n\nprint_memory_usage = anndata.logging.print_memory_usage\nget_memory_usage = 
anndata.logging.get_memory_usage\n\n\n_DEPENDENCIES_NUMERICS = [\n 'anndata', # anndata actually shouldn't, but as long as it's in development\n 'umap',\n 'numpy',\n 'scipy',\n 'pandas',\n ('sklearn', 'scikit-learn'),\n 'statsmodels',\n 'igraph',\n 'louvain',\n 'leidenalg',\n 'pynndescent',\n]\n\n\ndef _versions_dependencies(dependencies):\n # this is not the same as the requirements!\n for mod in dependencies:\n mod_name, dist_name = mod if isinstance(mod, tuple) else (mod, mod)\n try:\n imp = __import__(mod_name)\n yield dist_name, imp.__version__\n except (ImportError, AttributeError):\n pass\n\n\ndef print_header(*, file=None):\n \"\"\"\\\n Versions that might influence the numerical results.\n Matplotlib and Seaborn are excluded from this.\n \"\"\"\n\n modules = ['scanpy'] + _DEPENDENCIES_NUMERICS\n print(\n ' '.join(f'{mod}=={ver}' for mod, ver in _versions_dependencies(modules)),\n file=file or sys.stdout,\n )\n\n\ndef print_versions(*, file: Optional[IO[str]] = None):\n \"\"\"\\\n Print versions of imported packages, OS, and jupyter environment.\n\n For more options (including rich output) use `session_info.show` directly.\n \"\"\"\n import session_info\n\n if file is not None:\n from contextlib import redirect_stdout\n\n warnings.warn(\n \"Passing argument 'file' to print_versions is deprecated, and will be \"\n \"removed in a future version.\",\n FutureWarning,\n )\n with redirect_stdout(file):\n print_versions()\n else:\n session_info.show(\n dependencies=True,\n html=False,\n excludes=[\n 'builtins',\n 'stdlib_list',\n 'importlib_metadata',\n # Special module present if test coverage being calculated\n # https://gitlab.com/joelostblom/session_info/-/issues/10\n \"$coverage\",\n ],\n )\n\n\ndef print_version_and_date(*, file=None):\n \"\"\"\\\n Useful for starting a notebook so you see when you started working.\n \"\"\"\n from . import __version__\n\n if file is None:\n file = sys.stdout\n print(\n f'Running Scanpy {__version__}, ' f'on {datetime.now():%Y-%m-%d %H:%M}.',\n file=file,\n )\n\n\ndef _copy_docs_and_signature(fn):\n return partial(update_wrapper, wrapped=fn, assigned=['__doc__', '__annotations__'])\n\n\ndef error(\n msg: str,\n *,\n time: datetime = None,\n deep: Optional[str] = None,\n extra: Optional[dict] = None,\n) -> datetime:\n \"\"\"\\\n Log message with specific level and return current time.\n\n Parameters\n ==========\n msg\n Message to display.\n time\n A time in the past. 
If this is passed, the time difference from then\n to now is appended to `msg` as ` (HH:MM:SS)`.\n If `msg` contains `{time_passed}`, the time difference is instead\n inserted at that position.\n deep\n If the current verbosity is higher than the log function’s level,\n this gets displayed as well\n extra\n Additional values you can specify in `msg` like `{time_passed}`.\n \"\"\"\n from ._settings import settings\n\n return settings._root_logger.error(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef warning(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.warning(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef info(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.info(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef hint(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.hint(msg, time=time, deep=deep, extra=extra)\n\n\n@_copy_docs_and_signature(error)\ndef debug(msg, *, time=None, deep=None, extra=None) -> datetime:\n from ._settings import settings\n\n return settings._root_logger.debug(msg, time=time, deep=deep, extra=extra)\n", "path": "scanpy/logging.py" } ]
diff --git a/.gitignore b/.gitignore index ed60829903..74a7fdcc33 100644 --- a/.gitignore +++ b/.gitignore @@ -21,6 +21,7 @@ /scanpy/tests/notebooks/figures/ # Environment management +/hatch.toml /Pipfile /Pipfile.lock /requirements*.lock diff --git a/docs/installation.md b/docs/installation.md index b8b75e1a9d..c94883946b 100644 --- a/docs/installation.md +++ b/docs/installation.md @@ -24,7 +24,7 @@ pip install 'scanpy[leiden]' ``` The extra `[leiden]` installs two packages that are needed for popular -parts of scanpy but aren't requirements: [python-igraph] [^cite_csardi06] and [leiden] [^cite_traag18]. +parts of scanpy but aren't requirements: [igraph] [^cite_csardi06] and [leiden] [^cite_traag18]. (dev-install-instructions)= @@ -83,7 +83,7 @@ pip install --user scanpy - `brew install igraph` -- If python-igraph still fails to install, see the question on [compiling igraph]. +- If igraph still fails to install, see the question on [compiling igraph]. Alternatively consider installing gcc via `brew install gcc --without-multilib` and exporting the required variables: @@ -125,5 +125,5 @@ The whole process takes just a couple of minutes. [leiden]: https://leidenalg.readthedocs.io [miniconda]: http://conda.pydata.org/miniconda.html [on github]: https://github.com/scverse/scanpy -[python-igraph]: http://igraph.org/python/ +[igraph]: https://python.igraph.org/en/stable/ [unofficial binaries]: https://www.lfd.uci.edu/~gohlke/pythonlibs/ diff --git a/pyproject.toml b/pyproject.toml index 1a0d3abdac..076155e867 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -131,9 +131,9 @@ dev = [ "docutils", ] # Algorithms -paga = ["python-igraph"] -louvain = ["python-igraph", "louvain>=0.6,!=0.6.2"] # Louvain community detection -leiden = ["python-igraph", "leidenalg"] # Leiden community detection +paga = ["igraph"] +louvain = ["igraph", "louvain>=0.6,!=0.6.2"] # Louvain community detection +leiden = ["igraph>=0.10", "leidenalg>=0.9"] # Leiden community detection bbknn = ["bbknn"] # Batch balanced KNN (batch correction) magic = ["magic-impute>=2.0"] # MAGIC imputation method skmisc = ["scikit-misc>=0.1.3"] # highly_variable_genes method 'seurat_v3' diff --git a/scanpy/logging.py b/scanpy/logging.py index 712a187961..086c559593 100644 --- a/scanpy/logging.py +++ b/scanpy/logging.py @@ -127,7 +127,7 @@ def format(self, record: logging.LogRecord): 'pandas', ('sklearn', 'scikit-learn'), 'statsmodels', - ('igraph', 'python-igraph'), + 'igraph', 'louvain', 'leidenalg', 'pynndescent', diff --git a/scanpy/testing/_pytest/marks.py b/scanpy/testing/_pytest/marks.py index 64ef05d24e..6ea633bf17 100644 --- a/scanpy/testing/_pytest/marks.py +++ b/scanpy/testing/_pytest/marks.py @@ -10,7 +10,7 @@ louvain="louvain", skmisc="scikit-misc", fa2="fa2", - igraph="python-igraph", + igraph="igraph", dask="dask", zarr="zarr", zappy="zappy",
pex-tool__pex-1618
Release 2.1.67 On the docket: + [x] Expand --platform syntax: support full versions. #1614
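Concretely, the PYVER component of a `--platform` value may now be a full three-component version, for example `linux_x86_64-cp-3.8.10-cp38` instead of `linux_x86_64-cp-38-cp38`, which lets `python_full_version` environment markers be evaluated for `--platform` resolves. A small sketch mirroring the updated test in the diff below; the `Platform.create(...)` and `MarkerEnvironment.from_platform(...).as_dict()` calls appear in the diff itself, while the exact import locations are assumptions:

```python
from pex.pep_508 import MarkerEnvironment  # assumed module path
from pex.platforms import Platform         # assumed module path

# A platform string carrying a full three-component Python version.
platform = Platform.create("linux-x86_64-cp-3.10.1-cp310")

# Markers such as python_full_version == '3.10.1' can now be evaluated against it.
env = MarkerEnvironment.from_platform(platform).as_dict()
print(env["python_full_version"], env["python_version"])
```

The same full-version string can be passed on the command line via `--platform`, which is the workaround the changelog below describes.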
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.66\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.67\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 7ade9666f..3d9366907 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,20 @@ Release Notes ============= +2.1.67 +------ + +This release brings support for `--platform` arguments with a +3-component PYVER portion. This supports working around +`python_full_version` environment marker evaluation failures for +`--platform` resolves by changing, for example, a platform of +`linux_x86_64-cp-38-cp38` to `linux_x86_64-cp-3.8.10-cp38`. This is +likely a simpler way to work around these issues than using the +`--complete-platform` facility introduced in 2.1.66 by #1609. + +* Expand `--platform` syntax: support full versions. (#1614) + `PR #1614 <https://github.com/pantsbuild/pex/pull/1614>`_ + 2.1.66 ------ diff --git a/pex/version.py b/pex/version.py index 24551f628..872eae86e 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.66" +__version__ = "2.1.67" diff --git a/tests/test_pep_508.py b/tests/test_pep_508.py index 8842ce185..a108b19b2 100644 --- a/tests/test_pep_508.py +++ b/tests/test_pep_508.py @@ -50,13 +50,11 @@ def test_extended_platform_marker_environment(): # type: () -> None platform = Platform.create("linux-x86_64-cp-3.10.1-cp310") marker_environment = MarkerEnvironment.from_platform(platform) - env_defaulted = marker_environment.as_dict(default_unknown=True) - env_sparse = marker_environment.as_dict(default_unknown=False) + env = marker_environment.as_dict() def assert_known_marker(expression): # type: (str) -> None - assert evaluate_marker(expression, env_defaulted) - assert evaluate_marker(expression, env_sparse) + assert evaluate_marker(expression, env) assert_known_marker("python_full_version == '3.10.1'") assert_known_marker("python_version == '3.10'") @@ -66,9 +64,8 @@ def assert_known_marker(expression): def assert_unknown_marker(expression): # type: (str) -> None - assert not evaluate_marker(expression, env_defaulted) with pytest.raises(markers.UndefinedEnvironmentName): - evaluate_marker(expression, env_sparse) + evaluate_marker(expression, env) assert_unknown_marker("platform_release == '5.12.12-arch1-1'") assert_unknown_marker("platform_version == '#1 SMP PREEMPT Fri, 18 Jun 2021 21:59:22 +0000'")
Bitmessage__PyBitmessage-1387
Better logging Using the built-in python logging module I've made various log levels possible and made the creation of a log file a matter of changing the configuration in debug.py. The python logging module is thread-safe, so we can safely replace all `print` calls with calls to `logger`. I only replaced some of them, mainly to test the configuration (and there are a lot of `print` calls). There are some commits in my merge that mention translation files; I'm working on that but didn't mean to include them in this merge. I deleted them, but the commit history is already there.
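For readers who have not used the stdlib `logging` module, here is a minimal sketch of the pattern described above. This is not PyBitmessage's actual debug.py configuration; the level, filename and format shown are illustrative assumptions:

```python
import logging

# Central configuration: changing the level or switching output to a file
# is a one-line edit here instead of touching every print() call.
logging.basicConfig(
    level=logging.INFO,                      # DEBUG, INFO, WARNING, ERROR or CRITICAL
    filename="debug.log",                    # remove this line to log to stderr instead
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

logger = logging.getLogger(__name__)
logger.info("replaces a bare print() call")  # logging handlers are thread-safe
logger.debug("only emitted when the configured level is DEBUG or lower")
```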
[ { "content": "#!/usr/bin/env python2\n\"\"\"\nCheck dependendies and give recommendations about how to satisfy them\n\nLimitations:\n\n * Does not detect whether packages are already installed. Solving this requires writing more of a configuration\n management system. Or we could switch to an existing one.\n * Not fully PEP508 compliant. Not slightly. It makes bold assumptions about the simplicity of the contents of\n EXTRAS_REQUIRE. This is fine because most developers do, too.\n\"\"\"\n\nimport os\nimport sys\nfrom distutils.errors import CompileError\ntry:\n from setuptools.dist import Distribution\n from setuptools.extension import Extension\n from setuptools.command.build_ext import build_ext\n HAVE_SETUPTOOLS = True\n # another import from setuptools is in setup.py\n from setup import EXTRAS_REQUIRE\nexcept ImportError:\n HAVE_SETUPTOOLS = False\n EXTRAS_REQUIRE = []\n\nfrom importlib import import_module\n\nfrom src.depends import detectOS, PACKAGES, PACKAGE_MANAGER\n\n\nCOMPILING = {\n \"Debian\": \"build-essential libssl-dev\",\n \"Ubuntu\": \"build-essential libssl-dev\",\n \"Fedora\": \"gcc-c++ redhat-rpm-config python-devel openssl-devel\",\n \"openSUSE\": \"gcc-c++ libopenssl-devel python-devel\",\n \"optional\": False,\n}\n\n# OS-specific dependencies for optional components listed in EXTRAS_REQUIRE\nEXTRAS_REQUIRE_DEPS = {\n # The values from setup.EXTRAS_REQUIRE\n 'python_prctl': {\n # The packages needed for this requirement, by OS\n \"OpenBSD\": [\"\"],\n \"FreeBSD\": [\"\"],\n \"Debian\": [\"libcap-dev python-prctl\"],\n \"Ubuntu\": [\"libcap-dev python-prctl\"],\n \"Ubuntu 12\": [\"libcap-dev python-prctl\"],\n \"openSUSE\": [\"\"],\n \"Fedora\": [\"prctl\"],\n \"Guix\": [\"\"],\n \"Gentoo\": [\"dev-python/python-prctl\"],\n },\n}\n\n\ndef detectPrereqs(missing=True):\n available = []\n for module in PACKAGES:\n try:\n import_module(module)\n if not missing:\n available.append(module)\n except ImportError:\n if missing:\n available.append(module)\n return available\n\n\ndef prereqToPackages():\n if not detectPrereqs():\n return\n print(\"%s %s\" % (\n PACKAGE_MANAGER[detectOS()], \" \".join(\n PACKAGES[x][detectOS()] for x in detectPrereqs())))\n\n\ndef compilerToPackages():\n if not detectOS() in COMPILING:\n return\n print(\"%s %s\" % (\n PACKAGE_MANAGER[detectOS.result], COMPILING[detectOS.result]))\n\n\ndef testCompiler():\n if not HAVE_SETUPTOOLS:\n # silent, we can't test without setuptools\n return True\n\n bitmsghash = Extension(\n 'bitmsghash',\n sources=['src/bitmsghash/bitmsghash.cpp'],\n libraries=['pthread', 'crypto'],\n )\n\n dist = Distribution()\n dist.ext_modules = [bitmsghash]\n cmd = build_ext(dist)\n cmd.initialize_options()\n cmd.finalize_options()\n cmd.force = True\n try:\n cmd.run()\n except CompileError:\n return False\n else:\n fullPath = os.path.join(cmd.build_lib, cmd.get_ext_filename(\"bitmsghash\"))\n return os.path.isfile(fullPath)\n\n\nprereqs = detectPrereqs()\ncompiler = testCompiler()\n\nif (not compiler or prereqs) and detectOS() in PACKAGE_MANAGER:\n print(\n \"It looks like you're using %s. 
\"\n \"It is highly recommended to use the package manager\\n\"\n \"to install the missing dependencies.\" % detectOS.result)\n\nif not compiler:\n print(\n \"Building the bitmsghash module failed.\\n\"\n \"You may be missing a C++ compiler and/or the OpenSSL headers.\")\n\nif prereqs:\n mandatory = [x for x in prereqs if not PACKAGES[x].get(\"optional\")]\n optional = [x for x in prereqs if PACKAGES[x].get(\"optional\")]\n if mandatory:\n print(\"Missing mandatory dependencies: %s\" % \" \".join(mandatory))\n if optional:\n print(\"Missing optional dependencies: %s\" % \" \".join(optional))\n for package in optional:\n print(PACKAGES[package].get('description'))\n\n# Install the system dependencies of optional extras_require components\nOPSYS = detectOS()\nCMD = PACKAGE_MANAGER[OPSYS] if OPSYS in PACKAGE_MANAGER else 'UNKNOWN_INSTALLER'\nfor lhs, rhs in EXTRAS_REQUIRE.items():\n if OPSYS is None:\n break\n if rhs and any([\n EXTRAS_REQUIRE_DEPS[x][OPSYS]\n for x in rhs\n if x in EXTRAS_REQUIRE_DEPS\n ]):\n rhs_cmd = ''.join([\n CMD,\n ' ',\n ' '.join([\n ''. join([\n xx for xx in EXTRAS_REQUIRE_DEPS[x][OPSYS]\n ])\n for x in rhs\n if x in EXTRAS_REQUIRE_DEPS\n ]),\n ])\n print(\n \"Optional dependency `pip install .[{}]` would require `{}`\"\n \" to be run as root\".format(lhs, rhs_cmd))\n\nif (not compiler or prereqs) and OPSYS in PACKAGE_MANAGER:\n print(\"You can install the missing dependencies by running, as root:\")\n if not compiler:\n compilerToPackages()\n prereqToPackages()\n if mandatory:\n sys.exit(1)\nelse:\n print(\"All the dependencies satisfied, you can install PyBitmessage\")\n", "path": "checkdeps.py" } ]
[ { "content": "#!/usr/bin/env python2\n\"\"\"\nCheck dependendies and give recommendations about how to satisfy them\n\nLimitations:\n\n * Does not detect whether packages are already installed. Solving this requires writing more of a configuration\n management system. Or we could switch to an existing one.\n * Not fully PEP508 compliant. Not slightly. It makes bold assumptions about the simplicity of the contents of\n EXTRAS_REQUIRE. This is fine because most developers do, too.\n\"\"\"\n\nimport os\nimport sys\nfrom distutils.errors import CompileError\ntry:\n from setuptools.dist import Distribution\n from setuptools.extension import Extension\n from setuptools.command.build_ext import build_ext\n HAVE_SETUPTOOLS = True\n # another import from setuptools is in setup.py\n from setup import EXTRAS_REQUIRE\nexcept ImportError:\n HAVE_SETUPTOOLS = False\n EXTRAS_REQUIRE = {}\n\nfrom importlib import import_module\n\nfrom src.depends import detectOS, PACKAGES, PACKAGE_MANAGER\n\n\nCOMPILING = {\n \"Debian\": \"build-essential libssl-dev\",\n \"Ubuntu\": \"build-essential libssl-dev\",\n \"Fedora\": \"gcc-c++ redhat-rpm-config python-devel openssl-devel\",\n \"openSUSE\": \"gcc-c++ libopenssl-devel python-devel\",\n \"optional\": False,\n}\n\n# OS-specific dependencies for optional components listed in EXTRAS_REQUIRE\nEXTRAS_REQUIRE_DEPS = {\n # The values from setup.EXTRAS_REQUIRE\n 'python_prctl': {\n # The packages needed for this requirement, by OS\n \"OpenBSD\": [\"\"],\n \"FreeBSD\": [\"\"],\n \"Debian\": [\"libcap-dev python-prctl\"],\n \"Ubuntu\": [\"libcap-dev python-prctl\"],\n \"Ubuntu 12\": [\"libcap-dev python-prctl\"],\n \"openSUSE\": [\"\"],\n \"Fedora\": [\"prctl\"],\n \"Guix\": [\"\"],\n \"Gentoo\": [\"dev-python/python-prctl\"],\n },\n}\n\n\ndef detectPrereqs(missing=True):\n available = []\n for module in PACKAGES:\n try:\n import_module(module)\n if not missing:\n available.append(module)\n except ImportError:\n if missing:\n available.append(module)\n return available\n\n\ndef prereqToPackages():\n if not detectPrereqs():\n return\n print(\"%s %s\" % (\n PACKAGE_MANAGER[detectOS()], \" \".join(\n PACKAGES[x][detectOS()] for x in detectPrereqs())))\n\n\ndef compilerToPackages():\n if not detectOS() in COMPILING:\n return\n print(\"%s %s\" % (\n PACKAGE_MANAGER[detectOS.result], COMPILING[detectOS.result]))\n\n\ndef testCompiler():\n if not HAVE_SETUPTOOLS:\n # silent, we can't test without setuptools\n return True\n\n bitmsghash = Extension(\n 'bitmsghash',\n sources=['src/bitmsghash/bitmsghash.cpp'],\n libraries=['pthread', 'crypto'],\n )\n\n dist = Distribution()\n dist.ext_modules = [bitmsghash]\n cmd = build_ext(dist)\n cmd.initialize_options()\n cmd.finalize_options()\n cmd.force = True\n try:\n cmd.run()\n except CompileError:\n return False\n else:\n fullPath = os.path.join(cmd.build_lib, cmd.get_ext_filename(\"bitmsghash\"))\n return os.path.isfile(fullPath)\n\n\nprereqs = detectPrereqs()\ncompiler = testCompiler()\n\nif (not compiler or prereqs) and detectOS() in PACKAGE_MANAGER:\n print(\n \"It looks like you're using %s. 
\"\n \"It is highly recommended to use the package manager\\n\"\n \"to install the missing dependencies.\" % detectOS.result)\n\nif not compiler:\n print(\n \"Building the bitmsghash module failed.\\n\"\n \"You may be missing a C++ compiler and/or the OpenSSL headers.\")\n\nif prereqs:\n mandatory = [x for x in prereqs if not PACKAGES[x].get(\"optional\")]\n optional = [x for x in prereqs if PACKAGES[x].get(\"optional\")]\n if mandatory:\n print(\"Missing mandatory dependencies: %s\" % \" \".join(mandatory))\n if optional:\n print(\"Missing optional dependencies: %s\" % \" \".join(optional))\n for package in optional:\n print(PACKAGES[package].get('description'))\n\n# Install the system dependencies of optional extras_require components\nOPSYS = detectOS()\nCMD = PACKAGE_MANAGER[OPSYS] if OPSYS in PACKAGE_MANAGER else 'UNKNOWN_INSTALLER'\nfor lhs, rhs in EXTRAS_REQUIRE.items():\n if OPSYS is None:\n break\n if rhs and any([\n EXTRAS_REQUIRE_DEPS[x][OPSYS]\n for x in rhs\n if x in EXTRAS_REQUIRE_DEPS\n ]):\n rhs_cmd = ''.join([\n CMD,\n ' ',\n ' '.join([\n ''. join([\n xx for xx in EXTRAS_REQUIRE_DEPS[x][OPSYS]\n ])\n for x in rhs\n if x in EXTRAS_REQUIRE_DEPS\n ]),\n ])\n print(\n \"Optional dependency `pip install .[{}]` would require `{}`\"\n \" to be run as root\".format(lhs, rhs_cmd))\n\nif (not compiler or prereqs) and OPSYS in PACKAGE_MANAGER:\n print(\"You can install the missing dependencies by running, as root:\")\n if not compiler:\n compilerToPackages()\n prereqToPackages()\n if mandatory:\n sys.exit(1)\nelse:\n print(\"All the dependencies satisfied, you can install PyBitmessage\")\n", "path": "checkdeps.py" } ]
diff --git a/checkdeps.py b/checkdeps.py index c0e1005199..03782037e8 100755 --- a/checkdeps.py +++ b/checkdeps.py @@ -22,7 +22,7 @@ from setup import EXTRAS_REQUIRE except ImportError: HAVE_SETUPTOOLS = False - EXTRAS_REQUIRE = [] + EXTRAS_REQUIRE = {} from importlib import import_module
liqd__a4-meinberlin-4706
#6460 Previous/Next Button Poll Request Results no background color **URL:** https://meinberlin-dev.liqd.net/projekte/test-poll-merge-running-poll-with-user-content/ **user:** any **expected behaviour:** Previous/Next button on the poll request results has a pink background. **behaviour:** Button has no background. Only the outlines turn pink when the button is clicked. **important screensize:** **device & browser:** **Comment/Question:** Screenshot? dev: <img width="286" alt="Bildschirmfoto 2022-11-09 um 05 38 05" src="https://user-images.githubusercontent.com/113356258/200740386-60d26bc2-f169-40e4-9730-79d6d8724dad.png"> <img width="220" alt="Bildschirmfoto 2022-11-09 um 05 40 30" src="https://user-images.githubusercontent.com/113356258/200740411-e40f6bf6-83ba-468f-a941-93bbfe045993.png"> stage: <img width="189" alt="Bildschirmfoto 2022-11-09 um 05 44 21" src="https://user-images.githubusercontent.com/113356258/200740726-f116d498-cb19-4074-bd57-541f7d5d8d2a.png">
[ { "content": "from django.contrib import messages\nfrom django.db import transaction\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views import generic\n\nfrom adhocracy4.categories import filters as category_filters\nfrom adhocracy4.exports.views import DashboardExportView\nfrom adhocracy4.filters import filters as a4_filters\nfrom adhocracy4.filters import views as filter_views\nfrom adhocracy4.filters import widgets as filters_widgets\nfrom adhocracy4.filters.filters import FreeTextFilter\nfrom adhocracy4.labels import filters as label_filters\nfrom adhocracy4.projects.mixins import DisplayProjectOrModuleMixin\nfrom adhocracy4.projects.mixins import ProjectMixin\nfrom adhocracy4.rules import mixins as rules_mixins\nfrom meinberlin.apps.contrib import forms as contrib_forms\nfrom meinberlin.apps.contrib.views import CanonicalURLDetailView\nfrom meinberlin.apps.moderatorfeedback.forms import ModeratorStatementForm\nfrom meinberlin.apps.moderatorfeedback.models import ModeratorStatement\nfrom meinberlin.apps.notifications.emails import \\\n NotifyContactOnModeratorFeedback\nfrom meinberlin.apps.notifications.emails import \\\n NotifyCreatorOnModeratorFeedback\n\nfrom . import forms\nfrom . import models\n\n\nclass FreeTextFilterWidget(filters_widgets.FreeTextFilterWidget):\n label = _('Search')\n\n\ndef get_ordering_choices(view):\n choices = (('-created', _('Most recent')),)\n if view.module.has_feature('rate', models.Idea):\n choices += ('-positive_rating_count', _('Most popular')),\n choices += ('-comment_count', _('Most commented')),\n return choices\n\n\nclass IdeaFilterSet(a4_filters.DefaultsFilterSet):\n defaults = {\n 'ordering': '-created'\n }\n category = category_filters.CategoryFilter()\n labels = label_filters.LabelFilter()\n ordering = a4_filters.DynamicChoicesOrderingFilter(\n choices=get_ordering_choices\n )\n search = FreeTextFilter(\n widget=FreeTextFilterWidget,\n fields=['name']\n )\n\n class Meta:\n model = models.Idea\n fields = ['search', 'labels', 'category']\n\n\nclass AbstractIdeaListView(ProjectMixin,\n filter_views.FilteredListView):\n paginate_by = 15\n\n\nclass IdeaListView(AbstractIdeaListView,\n DisplayProjectOrModuleMixin\n ):\n model = models.Idea\n filter_set = IdeaFilterSet\n\n def get_queryset(self):\n return super().get_queryset()\\\n .filter(module=self.module)\n\n\nclass AbstractIdeaDetailView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n CanonicalURLDetailView):\n get_context_from_object = True\n\n\nclass IdeaDetailView(AbstractIdeaDetailView):\n model = models.Idea\n queryset = models.Idea.objects.annotate_positive_rating_count()\\\n .annotate_negative_rating_count()\n permission_required = 'meinberlin_ideas.view_idea'\n\n\nclass AbstractIdeaCreateView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.CreateView):\n \"\"\"Create an idea in the context of a module.\"\"\"\n\n def get_permission_object(self, *args, **kwargs):\n return self.module\n\n def form_valid(self, form):\n form.instance.creator = self.request.user\n form.instance.module = self.module\n return super().form_valid(form)\n\n def get_form_kwargs(self):\n kwargs = super().get_form_kwargs()\n kwargs['module'] = self.module\n if self.module.settings_instance:\n kwargs['settings_instance'] = self.module.settings_instance\n return kwargs\n\n\nclass IdeaCreateView(AbstractIdeaCreateView):\n model = models.Idea\n form_class = forms.IdeaForm\n permission_required = 'meinberlin_ideas.add_idea'\n template_name = 
'meinberlin_ideas/idea_create_form.html'\n\n\nclass AbstractIdeaUpdateView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.UpdateView):\n get_context_from_object = True\n\n def get_form_kwargs(self):\n kwargs = super().get_form_kwargs()\n instance = kwargs.get('instance')\n kwargs['module'] = instance.module\n if instance.module.settings_instance:\n kwargs['settings_instance'] = \\\n instance.module.settings_instance\n return kwargs\n\n\nclass IdeaUpdateView(AbstractIdeaUpdateView):\n model = models.Idea\n form_class = forms.IdeaForm\n permission_required = 'meinberlin_ideas.change_idea'\n template_name = 'meinberlin_ideas/idea_update_form.html'\n\n\nclass AbstractIdeaDeleteView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.DeleteView):\n get_context_from_object = True\n\n def get_success_url(self):\n return reverse(\n 'project-detail', kwargs={'slug': self.project.slug})\n\n def delete(self, request, *args, **kwargs):\n messages.success(self.request, self.success_message)\n return super(AbstractIdeaDeleteView, self)\\\n .delete(request, *args, **kwargs)\n\n\nclass IdeaDeleteView(AbstractIdeaDeleteView):\n model = models.Idea\n success_message = _('Your Idea has been deleted')\n permission_required = 'meinberlin_ideas.change_idea'\n template_name = 'meinberlin_ideas/idea_confirm_delete.html'\n\n\nclass AbstractIdeaModerateView(\n ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.detail.SingleObjectMixin,\n generic.detail.SingleObjectTemplateResponseMixin,\n contrib_forms.BaseMultiModelFormView):\n\n get_context_from_object = True\n\n def __init__(self):\n self.forms = {\n 'moderateable': {\n 'model': self.model,\n 'form_class': self.moderateable_form_class\n },\n 'statement': {\n 'model': ModeratorStatement,\n 'form_class': ModeratorStatementForm\n }\n }\n\n def dispatch(self, *args, **kwargs):\n self.object = self.get_object()\n return super().dispatch(*args, **kwargs)\n\n def get_success_url(self):\n return self.object.get_absolute_url()\n\n def forms_save(self, forms, commit=True):\n objects = super().forms_save(forms, commit=False)\n moderateable = objects['moderateable']\n statement = objects['statement']\n\n if not statement.pk:\n statement.creator = self.request.user\n\n with transaction.atomic():\n statement.save()\n moderateable.moderator_statement = statement\n moderateable.save()\n if hasattr(self.object, 'contact_email'):\n NotifyContactOnModeratorFeedback.send(self.object)\n else:\n NotifyCreatorOnModeratorFeedback.send(self.object)\n return objects\n\n def get_instance(self, name):\n if name == 'moderateable':\n return self.object\n elif name == 'statement':\n return self.object.moderator_statement\n\n\nclass IdeaModerateView(AbstractIdeaModerateView):\n model = models.Idea\n permission_required = 'meinberlin_ideas.moderate_idea'\n template_name = 'meinberlin_ideas/idea_moderate_form.html'\n moderateable_form_class = forms.IdeaModerateForm\n\n\nclass IdeaDashboardExportView(DashboardExportView):\n template_name = 'a4exports/export_dashboard.html'\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['export'] = reverse(\n 'a4dashboard:idea-export',\n kwargs={'module_slug': self.module.slug})\n context['comment_export'] = reverse(\n 'a4dashboard:idea-comment-export',\n kwargs={'module_slug': self.module.slug})\n return context\n", "path": "meinberlin/apps/ideas/views.py" } ]
[ { "content": "from django.contrib import messages\nfrom django.db import transaction\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views import generic\n\nfrom adhocracy4.categories import filters as category_filters\nfrom adhocracy4.exports.views import DashboardExportView\nfrom adhocracy4.filters import filters as a4_filters\nfrom adhocracy4.filters import views as filter_views\nfrom adhocracy4.filters import widgets as filters_widgets\nfrom adhocracy4.filters.filters import FreeTextFilter\nfrom adhocracy4.labels import filters as label_filters\nfrom adhocracy4.projects.mixins import DisplayProjectOrModuleMixin\nfrom adhocracy4.projects.mixins import ProjectMixin\nfrom adhocracy4.rules import mixins as rules_mixins\nfrom meinberlin.apps.contrib import forms as contrib_forms\nfrom meinberlin.apps.contrib.views import CanonicalURLDetailView\nfrom meinberlin.apps.moderatorfeedback.forms import ModeratorStatementForm\nfrom meinberlin.apps.moderatorfeedback.models import ModeratorStatement\nfrom meinberlin.apps.notifications.emails import \\\n NotifyContactOnModeratorFeedback\nfrom meinberlin.apps.notifications.emails import \\\n NotifyCreatorOnModeratorFeedback\n\nfrom . import forms\nfrom . import models\n\n\nclass FreeTextFilterWidget(filters_widgets.FreeTextFilterWidget):\n label = _('Search')\n\n\ndef get_ordering_choices(view):\n choices = (('-created', _('Most recent')),)\n if view.module.has_feature('rate', models.Idea):\n choices += ('-positive_rating_count', _('Most popular')),\n choices += ('-comment_count', _('Most commented')),\n return choices\n\n\nclass IdeaFilterSet(a4_filters.DefaultsFilterSet):\n defaults = {\n 'ordering': '-created'\n }\n category = category_filters.CategoryFilter()\n labels = label_filters.LabelFilter()\n ordering = a4_filters.DynamicChoicesOrderingFilter(\n choices=get_ordering_choices\n )\n search = FreeTextFilter(\n widget=FreeTextFilterWidget,\n fields=['name']\n )\n\n class Meta:\n model = models.Idea\n fields = ['search', 'category', 'labels']\n\n\nclass AbstractIdeaListView(ProjectMixin,\n filter_views.FilteredListView):\n paginate_by = 15\n\n\nclass IdeaListView(AbstractIdeaListView,\n DisplayProjectOrModuleMixin\n ):\n model = models.Idea\n filter_set = IdeaFilterSet\n\n def get_queryset(self):\n return super().get_queryset()\\\n .filter(module=self.module)\n\n\nclass AbstractIdeaDetailView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n CanonicalURLDetailView):\n get_context_from_object = True\n\n\nclass IdeaDetailView(AbstractIdeaDetailView):\n model = models.Idea\n queryset = models.Idea.objects.annotate_positive_rating_count()\\\n .annotate_negative_rating_count()\n permission_required = 'meinberlin_ideas.view_idea'\n\n\nclass AbstractIdeaCreateView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.CreateView):\n \"\"\"Create an idea in the context of a module.\"\"\"\n\n def get_permission_object(self, *args, **kwargs):\n return self.module\n\n def form_valid(self, form):\n form.instance.creator = self.request.user\n form.instance.module = self.module\n return super().form_valid(form)\n\n def get_form_kwargs(self):\n kwargs = super().get_form_kwargs()\n kwargs['module'] = self.module\n if self.module.settings_instance:\n kwargs['settings_instance'] = self.module.settings_instance\n return kwargs\n\n\nclass IdeaCreateView(AbstractIdeaCreateView):\n model = models.Idea\n form_class = forms.IdeaForm\n permission_required = 'meinberlin_ideas.add_idea'\n template_name = 
'meinberlin_ideas/idea_create_form.html'\n\n\nclass AbstractIdeaUpdateView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.UpdateView):\n get_context_from_object = True\n\n def get_form_kwargs(self):\n kwargs = super().get_form_kwargs()\n instance = kwargs.get('instance')\n kwargs['module'] = instance.module\n if instance.module.settings_instance:\n kwargs['settings_instance'] = \\\n instance.module.settings_instance\n return kwargs\n\n\nclass IdeaUpdateView(AbstractIdeaUpdateView):\n model = models.Idea\n form_class = forms.IdeaForm\n permission_required = 'meinberlin_ideas.change_idea'\n template_name = 'meinberlin_ideas/idea_update_form.html'\n\n\nclass AbstractIdeaDeleteView(ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.DeleteView):\n get_context_from_object = True\n\n def get_success_url(self):\n return reverse(\n 'project-detail', kwargs={'slug': self.project.slug})\n\n def delete(self, request, *args, **kwargs):\n messages.success(self.request, self.success_message)\n return super(AbstractIdeaDeleteView, self)\\\n .delete(request, *args, **kwargs)\n\n\nclass IdeaDeleteView(AbstractIdeaDeleteView):\n model = models.Idea\n success_message = _('Your Idea has been deleted')\n permission_required = 'meinberlin_ideas.change_idea'\n template_name = 'meinberlin_ideas/idea_confirm_delete.html'\n\n\nclass AbstractIdeaModerateView(\n ProjectMixin,\n rules_mixins.PermissionRequiredMixin,\n generic.detail.SingleObjectMixin,\n generic.detail.SingleObjectTemplateResponseMixin,\n contrib_forms.BaseMultiModelFormView):\n\n get_context_from_object = True\n\n def __init__(self):\n self.forms = {\n 'moderateable': {\n 'model': self.model,\n 'form_class': self.moderateable_form_class\n },\n 'statement': {\n 'model': ModeratorStatement,\n 'form_class': ModeratorStatementForm\n }\n }\n\n def dispatch(self, *args, **kwargs):\n self.object = self.get_object()\n return super().dispatch(*args, **kwargs)\n\n def get_success_url(self):\n return self.object.get_absolute_url()\n\n def forms_save(self, forms, commit=True):\n objects = super().forms_save(forms, commit=False)\n moderateable = objects['moderateable']\n statement = objects['statement']\n\n if not statement.pk:\n statement.creator = self.request.user\n\n with transaction.atomic():\n statement.save()\n moderateable.moderator_statement = statement\n moderateable.save()\n if hasattr(self.object, 'contact_email'):\n NotifyContactOnModeratorFeedback.send(self.object)\n else:\n NotifyCreatorOnModeratorFeedback.send(self.object)\n return objects\n\n def get_instance(self, name):\n if name == 'moderateable':\n return self.object\n elif name == 'statement':\n return self.object.moderator_statement\n\n\nclass IdeaModerateView(AbstractIdeaModerateView):\n model = models.Idea\n permission_required = 'meinberlin_ideas.moderate_idea'\n template_name = 'meinberlin_ideas/idea_moderate_form.html'\n moderateable_form_class = forms.IdeaModerateForm\n\n\nclass IdeaDashboardExportView(DashboardExportView):\n template_name = 'a4exports/export_dashboard.html'\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['export'] = reverse(\n 'a4dashboard:idea-export',\n kwargs={'module_slug': self.module.slug})\n context['comment_export'] = reverse(\n 'a4dashboard:idea-comment-export',\n kwargs={'module_slug': self.module.slug})\n return context\n", "path": "meinberlin/apps/ideas/views.py" } ]
diff --git a/meinberlin/apps/ideas/views.py b/meinberlin/apps/ideas/views.py index fb0f02f039..2bd994b679 100644 --- a/meinberlin/apps/ideas/views.py +++ b/meinberlin/apps/ideas/views.py @@ -55,7 +55,7 @@ class IdeaFilterSet(a4_filters.DefaultsFilterSet): class Meta: model = models.Idea - fields = ['search', 'labels', 'category'] + fields = ['search', 'category', 'labels'] class AbstractIdeaListView(ProjectMixin, diff --git a/meinberlin/assets/scss/components/_dropdown.scss b/meinberlin/assets/scss/components/_dropdown.scss index f9db9a61fe..abe6aad79c 100644 --- a/meinberlin/assets/scss/components/_dropdown.scss +++ b/meinberlin/assets/scss/components/_dropdown.scss @@ -47,9 +47,7 @@ } } - button:last-child, - li:last-child > a, - li:last-child > button { + &:last-child { border-bottom: 0; } diff --git a/meinberlin/assets/scss/components/_poll.scss b/meinberlin/assets/scss/components/_poll.scss index bb11928dc8..587ec9085d 100644 --- a/meinberlin/assets/scss/components/_poll.scss +++ b/meinberlin/assets/scss/components/_poll.scss @@ -241,7 +241,7 @@ $checkbox-size: 20px; } .poll-slider__answer { - padding-bottom: 1.75 * $spacer; + padding-bottom: 2 * $spacer; i { color: $gray-lighter; @@ -253,7 +253,7 @@ $checkbox-size: 20px; font-size: $font-size-xs; color: $gray-lighter; position: absolute; - bottom: 1.25 * $spacer; // to allign with arrows + bottom: 1.75 * $spacer; // to allign with arrows left: $spacer; } @@ -266,7 +266,7 @@ $checkbox-size: 20px; // slick overwrites - nested for specificity .slick-prev { left: revert !important; - right: 4 * $spacer !important; + right: 5 * $spacer !important; } .slick-next { @@ -275,13 +275,14 @@ $checkbox-size: 20px; .slick-prev, .slick-next { + @extend .btn; @extend .btn--primary; position: absolute; top: revert; bottom: 0; text-align: center; - width: 30px; - height: 30px; + width: 40px; + height: 40px; border-radius: 100%; z-index: 1; // for when tile links overlap @@ -289,6 +290,7 @@ $checkbox-size: 20px; opacity: 0.25; cursor: not-allowed; pointer-events: none; + box-shadow: none; } &:before { @@ -296,8 +298,8 @@ $checkbox-size: 20px; opacity: 1; font-family: "Font Awesome 6 Free", sans-serif; font-weight: 900; - font-size: $font-size-xl; - line-height: 1.3rem; + font-size: $font-size-xxl; + line-height: 1.6rem; } } @@ -325,7 +327,7 @@ $checkbox-size: 20px; @media (min-width: $breakpoint) { .poll-slider__count--spaced { left: revert; - right: 6.5 * $spacer; + right: 8.5 * $spacer; } .poll-slider__count {
pwndbg__pwndbg-1104
`dX` commands truncate output longer than native word size ### Example The screenshot below shows pwndbg commands issued when debugging an x86 program. Note that some of the data printed by the `dd` command is omitted by the `dq` command: ![dX issue](https://user-images.githubusercontent.com/16000770/186494094-3c455a3d-7945-4aea-a919-b0a15105588e.png) ### Cause This happens in the first line of `enhex()`, which is called by `dX()`: https://github.com/pwndbg/pwndbg/blob/5d358585b1149aead6774f17c5721f10c4bed7be/pwndbg/commands/windbg.py#L137-L138 `value` is masked to the native word size, resulting in loss of information when `dX()` tries to print words longer than this, e.g. printing quadwords from an x86 process memory. ### Possible solution Making the mask in `enhex()` fit the requested data width could fix this. `pwndbg.arch.ptrmask` is calculated like so: https://github.com/pwndbg/pwndbg/blob/5d358585b1149aead6774f17c5721f10c4bed7be/pwndbg/arch.py#L53 So perhaps replacing the first line of `enhex()` with `value = value & (1 << 8*size) - 1` might work.
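To make the truncation and the proposed fix concrete, here is a small, self-contained Python sketch that runs outside of pwndbg and GDB. The helper names `enhex_buggy` and `enhex_fixed` are illustrative only, and the hard-coded 32-bit `ptrmask` merely stands in for `pwndbg.arch.ptrmask` on an x86 target; the only point is to contrast the pointer-sized mask with the width-derived mask suggested in the issue.

```python
# Illustrative sketch only -- not pwndbg code.
# `ptrmask` mimics pwndbg.arch.ptrmask on a 32-bit (x86) target.
ptrmask = (1 << 32) - 1


def enhex_buggy(size, value):
    # Masks with the native pointer mask, so qwords lose their high bits on x86.
    value = value & ptrmask
    return ("%x" % abs(value)).rjust(size * 2, "0")


def enhex_fixed(size, value):
    # Masks with a mask derived from the requested width, preserving wide values.
    value = value & ((1 << 8 * size) - 1)
    return ("%x" % abs(value)).rjust(size * 2, "0")


if __name__ == "__main__":
    qword = 0x1122334455667788
    print(enhex_buggy(8, qword))  # 0000000055667788  <- high dword truncated
    print(enhex_fixed(8, qword))  # 1122334455667788  <- full qword kept
```

This mirrors what the screenshot shows: with the pointer-sized mask, `dq` on an x86 inferior can only ever display the low 32 bits of each printed word, while a width-derived mask keeps the full requested data width.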
[ { "content": "\"\"\"\nCompatibility functionality for Windbg users.\n\"\"\"\n\nimport argparse\nimport codecs\nimport math\nimport sys\nfrom builtins import str\n\nimport gdb\n\nimport pwndbg.arch\nimport pwndbg.commands\nimport pwndbg.memory\nimport pwndbg.strings\nimport pwndbg.symbol\nimport pwndbg.typeinfo\n\n\ndef get_type(size):\n return {\n 1: pwndbg.typeinfo.uint8,\n 2: pwndbg.typeinfo.uint16,\n 4: pwndbg.typeinfo.uint32,\n 8: pwndbg.typeinfo.uint64,\n }[size]\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N bytes.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=64,\n nargs=\"?\",\n help=\"The number of bytes to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef db(address, count=64):\n \"\"\"\n Starting at the specified address, dump N bytes\n (default 64).\n \"\"\"\n return dX(1, address, count, repeat=db.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N words.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=32,\n nargs=\"?\",\n help=\"The number of words to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dw(address, count=32):\n \"\"\"\n Starting at the specified address, dump N words\n (default 32).\n \"\"\"\n return dX(2, address, count, repeat=dw.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N dwords.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=16,\n nargs=\"?\",\n help=\"The number of dwords to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dd(address, count=16):\n \"\"\"\n Starting at the specified address, dump N dwords\n (default 16).\n \"\"\"\n return dX(4, address, count, repeat=dd.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N qwords.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=8,\n nargs=\"?\",\n help=\"The number of qwords to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dq(address, count=8):\n \"\"\"\n Starting at the specified address, dump N qwords\n (default 8).\n \"\"\"\n return dX(8, address, count, repeat=dq.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, hexdump.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=8,\n nargs=\"?\",\n help=\"The number of bytes to hexdump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dc(address, count=8):\n return pwndbg.commands.hexdump.hexdump(address=address, count=count)\n\n\ndef dX(size, address, count, to_string=False, repeat=False):\n \"\"\"\n Traditionally, windbg will display 16 bytes of data per line.\n \"\"\"\n values = []\n\n if repeat:\n count = dX.last_count\n address = dX.last_address\n else:\n address = int(address) & pwndbg.arch.ptrmask\n 
count = int(count)\n\n type = get_type(size)\n\n for i in range(count):\n try:\n gval = pwndbg.memory.poi(type, address + i * size)\n # print(str(gval))\n values.append(int(gval))\n except gdb.MemoryError:\n break\n\n if not values:\n print(\"Could not access the provided address\")\n return\n\n n_rows = int(math.ceil(count * size / float(16)))\n row_sz = int(16 / size)\n rows = [values[i * row_sz : (i + 1) * row_sz] for i in range(n_rows)]\n lines = []\n\n # sys.stdout.write(repr(rows) + '\\n')\n\n for i, row in enumerate(rows):\n if not row:\n continue\n line = [enhex(pwndbg.arch.ptrsize, address + (i * 16)), \" \"]\n for value in row:\n line.append(enhex(size, value))\n lines.append(\" \".join(line))\n\n if not to_string:\n print(\"\\n\".join(lines))\n\n dX.last_count = count\n dX.last_address = address + len(rows) * 16\n\n return lines\n\n\ndef enhex(size, value):\n value = value & pwndbg.arch.ptrmask\n x = \"%x\" % abs(value)\n x = x.rjust(size * 2, \"0\")\n return x\n\n\nparser = argparse.ArgumentParser(description=\"Write hex bytes at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The bytes to write.\")\n\n\[email protected](parser)\[email protected]\ndef eb(address, data):\n \"\"\"\n Write hex bytes at the specified address.\n \"\"\"\n return eX(1, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex words at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The words to write.\")\n\n\[email protected](parser)\[email protected]\ndef ew(address, data):\n \"\"\"\n Write hex words at the specified address.\n \"\"\"\n return eX(2, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex dwords at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The dwords to write.\")\n\n\[email protected](parser)\[email protected]\ndef ed(address, data):\n \"\"\"\n Write hex dwords at the specified address.\n \"\"\"\n return eX(4, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex qwords at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The qwords to write.\")\n\n\[email protected](parser)\[email protected]\ndef eq(address, data):\n \"\"\"\n Write hex qwords at the specified address.\n \"\"\"\n return eX(8, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write a string at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, help=\"The string to write.\")\n\n\[email protected](parser)\[email protected]\ndef ez(address, data):\n \"\"\"\n Write a character at the specified address.\n \"\"\"\n return eX(1, address, data, hex=False)\n\n\nparser = argparse.ArgumentParser(\n description=\"Write a string at the specified address.\"\n) # TODO Is eza just ez? If so just alias. 
I had trouble finding windbg documentation defining ez\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, help=\"The string to write.\")\n\n\[email protected](parser)\[email protected]\ndef eza(address, data):\n \"\"\"\n Write a string at the specified address.\n \"\"\"\n return ez(address, data)\n\n\ndef eX(size, address, data, hex=True):\n \"\"\"\n This relies on windbg's default hex encoding being enforced\n \"\"\"\n if not data:\n print(\"Cannot write empty data into memory.\")\n return\n\n if hex:\n # Early validation if all data is hex\n for string in data:\n if string.startswith(\"0x\"):\n string = string[2:]\n\n if any(ch not in \"0123456789abcdefABCDEF\" for ch in string):\n print(\n \"Incorrect data format: it must all be a hex value (0x1234 or 1234, both interpreted as 0x1234)\"\n )\n return\n\n writes = 0\n for i, string in enumerate(data):\n if hex:\n if string.startswith(\"0x\"):\n string = string[2:]\n\n string = string.rjust(size * 2, \"0\")\n\n data = codecs.decode(string, \"hex\")\n else:\n data = string\n\n if pwndbg.arch.endian == \"little\":\n data = data[::-1]\n\n try:\n pwndbg.memory.write(address + (i * size), data)\n writes += 1\n except gdb.error:\n print(\"Cannot access memory at address %#x\" % address)\n if writes > 0:\n print(\"(Made %d writes to memory; skipping further writes)\" % writes)\n return\n\n\nparser = argparse.ArgumentParser(description=\"Dump pointers and symbols at the specified address.\")\nparser.add_argument(\"addr\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\")\n\n\[email protected](\n parser, aliases=[\"kd\", \"dps\", \"dqs\"]\n) # TODO are these really all the same? They had identical implementation...\[email protected]\ndef dds(addr):\n \"\"\"\n Dump pointers and symbols at the specified address.\n \"\"\"\n return pwndbg.commands.telescope.telescope(addr)\n\n\nda_parser = argparse.ArgumentParser()\nda_parser.description = \"Dump a string at the specified address.\"\nda_parser.add_argument(\"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"Address to dump\")\nda_parser.add_argument(\"max\", type=int, nargs=\"?\", default=256, help=\"Maximum string length\")\n\n\[email protected](da_parser)\[email protected]\ndef da(address, max):\n print(\"%x\" % address, repr(pwndbg.strings.get(address, max)))\n\n\nds_parser = argparse.ArgumentParser()\nds_parser.description = \"Dump a string at the specified address.\"\nds_parser.add_argument(\"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"Address to dump\")\nds_parser.add_argument(\"max\", type=int, nargs=\"?\", default=256, help=\"Maximum string length\")\n\n\[email protected](ds_parser)\[email protected]\ndef ds(address, max):\n # We do change the max length to the default if its too low\n # because the truncated display is not that ideal/not the same as GDB's yet\n # (ours: \"truncated ...\", GDBs: \"truncated \"...)\n if max < 256:\n print(\"Max str len of %d too low, changing to 256\" % max)\n max = 256\n\n string = pwndbg.strings.get(address, max, maxread=4096)\n if string:\n print(\"%x %r\" % (address, string))\n else:\n print(\n \"Data at address can't be dereferenced or is not a printable null-terminated string or is too short.\"\n )\n print(\"Perhaps try: db <address> <count> or hexdump <address>\")\n\n\[email protected](\"List breakpoints.\")\ndef bl():\n \"\"\"\n List breakpoints\n \"\"\"\n gdb.execute(\"info breakpoints\")\n\n\nparser = 
argparse.ArgumentParser(description=\"Disable the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to disable.\"\n)\n\n\[email protected](parser)\ndef bd(which=\"*\"):\n \"\"\"\n Disable the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"disable breakpoints\")\n else:\n gdb.execute(\"disable breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Enable the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to enable.\"\n)\n\n\[email protected](parser)\ndef be(which=\"*\"):\n \"\"\"\n Enable the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"enable breakpoints\")\n else:\n gdb.execute(\"enable breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Clear the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to clear.\"\n)\n\n\[email protected](parser)\ndef bc(which=\"*\"):\n \"\"\"\n Clear the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"delete breakpoints\")\n else:\n gdb.execute(\"delete breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Set a breakpoint at the specified address.\")\nparser.add_argument(\"where\", type=int, help=\"The address to break at.\")\n\n\[email protected](parser)\ndef bp(where):\n \"\"\"\n Set a breakpoint at the specified address.\n \"\"\"\n result = pwndbg.commands.fix(where)\n if result is not None:\n gdb.execute(\"break *%#x\" % int(result))\n\n\nparser = argparse.ArgumentParser(\n description=\"Starting at the specified address, disassemble N instructions.\"\n)\nparser.add_argument(\n \"where\", type=int, nargs=\"?\", default=None, help=\"The address to disassemble at.\"\n)\nparser.add_argument(\n \"n\", type=int, nargs=\"?\", default=5, help=\"The number of instructions to disassemble.\"\n)\n\n\[email protected](parser)\[email protected]\ndef u(where=None, n=5, to_string=False):\n \"\"\"\n Starting at the specified address, disassemble\n N instructions (default 5).\n \"\"\"\n if where is None:\n where = pwndbg.regs.pc\n return pwndbg.commands.nearpc.nearpc(where, n, to_string)\n\n\[email protected](\"Print a backtrace (alias 'bt').\")\[email protected]\ndef k():\n \"\"\"\n Print a backtrace (alias 'bt')\n \"\"\"\n gdb.execute(\"bt\")\n\n\nparser = argparse.ArgumentParser(description=\"List the symbols nearest to the provided value.\")\nparser.add_argument(\n \"value\", type=int, nargs=\"?\", default=None, help=\"The address you want the name of.\"\n)\n\n\[email protected](parser)\[email protected]\ndef ln(value=None):\n \"\"\"\n List the symbols nearest to the provided value.\n \"\"\"\n if value is None:\n value = pwndbg.regs.pc\n value = int(value)\n x = pwndbg.symbol.get(value)\n if x:\n result = \"(%#x) %s\" % (value, x)\n print(result)\n\n\n# The three commands are aliases for `vmmap` and are set so in vmmap.py\n# lm\n# address\n# vprot\n\n\[email protected](\"Not be windows.\")\[email protected]\ndef peb():\n print(\"This isn't Windows!\")\n\n\[email protected](\"Windbg compatibility alias for 'continue' command.\")\[email protected]\ndef go():\n \"\"\"\n Windbg compatibility alias for 'continue' command.\n \"\"\"\n gdb.execute(\"continue\")\n\n\[email protected](\"Windbg compatibility alias for 
'nextcall' command.\")\[email protected]\ndef pc():\n \"\"\"\n Windbg compatibility alias for 'nextcall' command.\n \"\"\"\n return pwndbg.commands.next.nextcall()\n", "path": "pwndbg/commands/windbg.py" } ]
[ { "content": "\"\"\"\nCompatibility functionality for Windbg users.\n\"\"\"\n\nimport argparse\nimport codecs\nimport math\nimport sys\nfrom builtins import str\n\nimport gdb\n\nimport pwndbg.arch\nimport pwndbg.commands\nimport pwndbg.memory\nimport pwndbg.strings\nimport pwndbg.symbol\nimport pwndbg.typeinfo\n\n\ndef get_type(size):\n return {\n 1: pwndbg.typeinfo.uint8,\n 2: pwndbg.typeinfo.uint16,\n 4: pwndbg.typeinfo.uint32,\n 8: pwndbg.typeinfo.uint64,\n }[size]\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N bytes.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=64,\n nargs=\"?\",\n help=\"The number of bytes to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef db(address, count=64):\n \"\"\"\n Starting at the specified address, dump N bytes\n (default 64).\n \"\"\"\n return dX(1, address, count, repeat=db.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N words.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=32,\n nargs=\"?\",\n help=\"The number of words to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dw(address, count=32):\n \"\"\"\n Starting at the specified address, dump N words\n (default 32).\n \"\"\"\n return dX(2, address, count, repeat=dw.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N dwords.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=16,\n nargs=\"?\",\n help=\"The number of dwords to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dd(address, count=16):\n \"\"\"\n Starting at the specified address, dump N dwords\n (default 16).\n \"\"\"\n return dX(4, address, count, repeat=dd.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, dump N qwords.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=8,\n nargs=\"?\",\n help=\"The number of qwords to dump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dq(address, count=8):\n \"\"\"\n Starting at the specified address, dump N qwords\n (default 8).\n \"\"\"\n return dX(8, address, count, repeat=dq.repeat)\n\n\nparser = argparse.ArgumentParser(description=\"Starting at the specified address, hexdump.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\"\n)\nparser.add_argument(\n \"count\",\n type=pwndbg.commands.AddressExpr,\n default=8,\n nargs=\"?\",\n help=\"The number of bytes to hexdump.\",\n)\n\n\[email protected](parser)\[email protected]\ndef dc(address, count=8):\n return pwndbg.commands.hexdump.hexdump(address=address, count=count)\n\n\ndef dX(size, address, count, to_string=False, repeat=False):\n \"\"\"\n Traditionally, windbg will display 16 bytes of data per line.\n \"\"\"\n values = []\n\n if repeat:\n count = dX.last_count\n address = dX.last_address\n else:\n address = int(address) & pwndbg.arch.ptrmask\n 
count = int(count)\n\n type = get_type(size)\n\n for i in range(count):\n try:\n gval = pwndbg.memory.poi(type, address + i * size)\n # print(str(gval))\n values.append(int(gval))\n except gdb.MemoryError:\n break\n\n if not values:\n print(\"Could not access the provided address\")\n return\n\n n_rows = int(math.ceil(count * size / float(16)))\n row_sz = int(16 / size)\n rows = [values[i * row_sz : (i + 1) * row_sz] for i in range(n_rows)]\n lines = []\n\n # sys.stdout.write(repr(rows) + '\\n')\n\n for i, row in enumerate(rows):\n if not row:\n continue\n line = [enhex(pwndbg.arch.ptrsize, address + (i * 16)), \" \"]\n for value in row:\n line.append(enhex(size, value))\n lines.append(\" \".join(line))\n\n if not to_string:\n print(\"\\n\".join(lines))\n\n dX.last_count = count\n dX.last_address = address + len(rows) * 16\n\n return lines\n\n\ndef enhex(size, value):\n value = value & ((1 << 8 * size) - 1)\n x = \"%x\" % abs(value)\n x = x.rjust(size * 2, \"0\")\n return x\n\n\nparser = argparse.ArgumentParser(description=\"Write hex bytes at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The bytes to write.\")\n\n\[email protected](parser)\[email protected]\ndef eb(address, data):\n \"\"\"\n Write hex bytes at the specified address.\n \"\"\"\n return eX(1, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex words at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The words to write.\")\n\n\[email protected](parser)\[email protected]\ndef ew(address, data):\n \"\"\"\n Write hex words at the specified address.\n \"\"\"\n return eX(2, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex dwords at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The dwords to write.\")\n\n\[email protected](parser)\[email protected]\ndef ed(address, data):\n \"\"\"\n Write hex dwords at the specified address.\n \"\"\"\n return eX(4, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write hex qwords at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, nargs=\"*\", help=\"The qwords to write.\")\n\n\[email protected](parser)\[email protected]\ndef eq(address, data):\n \"\"\"\n Write hex qwords at the specified address.\n \"\"\"\n return eX(8, address, data)\n\n\nparser = argparse.ArgumentParser(description=\"Write a string at the specified address.\")\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, help=\"The string to write.\")\n\n\[email protected](parser)\[email protected]\ndef ez(address, data):\n \"\"\"\n Write a character at the specified address.\n \"\"\"\n return eX(1, address, data, hex=False)\n\n\nparser = argparse.ArgumentParser(\n description=\"Write a string at the specified address.\"\n) # TODO Is eza just ez? If so just alias. 
I had trouble finding windbg documentation defining ez\nparser.add_argument(\n \"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to write to.\"\n)\nparser.add_argument(\"data\", type=str, help=\"The string to write.\")\n\n\[email protected](parser)\[email protected]\ndef eza(address, data):\n \"\"\"\n Write a string at the specified address.\n \"\"\"\n return ez(address, data)\n\n\ndef eX(size, address, data, hex=True):\n \"\"\"\n This relies on windbg's default hex encoding being enforced\n \"\"\"\n if not data:\n print(\"Cannot write empty data into memory.\")\n return\n\n if hex:\n # Early validation if all data is hex\n for string in data:\n if string.startswith(\"0x\"):\n string = string[2:]\n\n if any(ch not in \"0123456789abcdefABCDEF\" for ch in string):\n print(\n \"Incorrect data format: it must all be a hex value (0x1234 or 1234, both interpreted as 0x1234)\"\n )\n return\n\n writes = 0\n for i, string in enumerate(data):\n if hex:\n if string.startswith(\"0x\"):\n string = string[2:]\n\n string = string.rjust(size * 2, \"0\")\n\n data = codecs.decode(string, \"hex\")\n else:\n data = string\n\n if pwndbg.arch.endian == \"little\":\n data = data[::-1]\n\n try:\n pwndbg.memory.write(address + (i * size), data)\n writes += 1\n except gdb.error:\n print(\"Cannot access memory at address %#x\" % address)\n if writes > 0:\n print(\"(Made %d writes to memory; skipping further writes)\" % writes)\n return\n\n\nparser = argparse.ArgumentParser(description=\"Dump pointers and symbols at the specified address.\")\nparser.add_argument(\"addr\", type=pwndbg.commands.HexOrAddressExpr, help=\"The address to dump from.\")\n\n\[email protected](\n parser, aliases=[\"kd\", \"dps\", \"dqs\"]\n) # TODO are these really all the same? They had identical implementation...\[email protected]\ndef dds(addr):\n \"\"\"\n Dump pointers and symbols at the specified address.\n \"\"\"\n return pwndbg.commands.telescope.telescope(addr)\n\n\nda_parser = argparse.ArgumentParser()\nda_parser.description = \"Dump a string at the specified address.\"\nda_parser.add_argument(\"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"Address to dump\")\nda_parser.add_argument(\"max\", type=int, nargs=\"?\", default=256, help=\"Maximum string length\")\n\n\[email protected](da_parser)\[email protected]\ndef da(address, max):\n print(\"%x\" % address, repr(pwndbg.strings.get(address, max)))\n\n\nds_parser = argparse.ArgumentParser()\nds_parser.description = \"Dump a string at the specified address.\"\nds_parser.add_argument(\"address\", type=pwndbg.commands.HexOrAddressExpr, help=\"Address to dump\")\nds_parser.add_argument(\"max\", type=int, nargs=\"?\", default=256, help=\"Maximum string length\")\n\n\[email protected](ds_parser)\[email protected]\ndef ds(address, max):\n # We do change the max length to the default if its too low\n # because the truncated display is not that ideal/not the same as GDB's yet\n # (ours: \"truncated ...\", GDBs: \"truncated \"...)\n if max < 256:\n print(\"Max str len of %d too low, changing to 256\" % max)\n max = 256\n\n string = pwndbg.strings.get(address, max, maxread=4096)\n if string:\n print(\"%x %r\" % (address, string))\n else:\n print(\n \"Data at address can't be dereferenced or is not a printable null-terminated string or is too short.\"\n )\n print(\"Perhaps try: db <address> <count> or hexdump <address>\")\n\n\[email protected](\"List breakpoints.\")\ndef bl():\n \"\"\"\n List breakpoints\n \"\"\"\n gdb.execute(\"info breakpoints\")\n\n\nparser = 
argparse.ArgumentParser(description=\"Disable the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to disable.\"\n)\n\n\[email protected](parser)\ndef bd(which=\"*\"):\n \"\"\"\n Disable the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"disable breakpoints\")\n else:\n gdb.execute(\"disable breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Enable the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to enable.\"\n)\n\n\[email protected](parser)\ndef be(which=\"*\"):\n \"\"\"\n Enable the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"enable breakpoints\")\n else:\n gdb.execute(\"enable breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Clear the breakpoint with the specified index.\")\nparser.add_argument(\n \"which\", nargs=\"?\", type=str, default=\"*\", help=\"Index of the breakpoint to clear.\"\n)\n\n\[email protected](parser)\ndef bc(which=\"*\"):\n \"\"\"\n Clear the breakpoint with the specified index.\n \"\"\"\n if which == \"*\":\n gdb.execute(\"delete breakpoints\")\n else:\n gdb.execute(\"delete breakpoints %s\" % which)\n\n\nparser = argparse.ArgumentParser(description=\"Set a breakpoint at the specified address.\")\nparser.add_argument(\"where\", type=int, help=\"The address to break at.\")\n\n\[email protected](parser)\ndef bp(where):\n \"\"\"\n Set a breakpoint at the specified address.\n \"\"\"\n result = pwndbg.commands.fix(where)\n if result is not None:\n gdb.execute(\"break *%#x\" % int(result))\n\n\nparser = argparse.ArgumentParser(\n description=\"Starting at the specified address, disassemble N instructions.\"\n)\nparser.add_argument(\n \"where\", type=int, nargs=\"?\", default=None, help=\"The address to disassemble at.\"\n)\nparser.add_argument(\n \"n\", type=int, nargs=\"?\", default=5, help=\"The number of instructions to disassemble.\"\n)\n\n\[email protected](parser)\[email protected]\ndef u(where=None, n=5, to_string=False):\n \"\"\"\n Starting at the specified address, disassemble\n N instructions (default 5).\n \"\"\"\n if where is None:\n where = pwndbg.regs.pc\n return pwndbg.commands.nearpc.nearpc(where, n, to_string)\n\n\[email protected](\"Print a backtrace (alias 'bt').\")\[email protected]\ndef k():\n \"\"\"\n Print a backtrace (alias 'bt')\n \"\"\"\n gdb.execute(\"bt\")\n\n\nparser = argparse.ArgumentParser(description=\"List the symbols nearest to the provided value.\")\nparser.add_argument(\n \"value\", type=int, nargs=\"?\", default=None, help=\"The address you want the name of.\"\n)\n\n\[email protected](parser)\[email protected]\ndef ln(value=None):\n \"\"\"\n List the symbols nearest to the provided value.\n \"\"\"\n if value is None:\n value = pwndbg.regs.pc\n value = int(value)\n x = pwndbg.symbol.get(value)\n if x:\n result = \"(%#x) %s\" % (value, x)\n print(result)\n\n\n# The three commands are aliases for `vmmap` and are set so in vmmap.py\n# lm\n# address\n# vprot\n\n\[email protected](\"Not be windows.\")\[email protected]\ndef peb():\n print(\"This isn't Windows!\")\n\n\[email protected](\"Windbg compatibility alias for 'continue' command.\")\[email protected]\ndef go():\n \"\"\"\n Windbg compatibility alias for 'continue' command.\n \"\"\"\n gdb.execute(\"continue\")\n\n\[email protected](\"Windbg compatibility alias for 
'nextcall' command.\")\[email protected]\ndef pc():\n \"\"\"\n Windbg compatibility alias for 'nextcall' command.\n \"\"\"\n return pwndbg.commands.next.nextcall()\n", "path": "pwndbg/commands/windbg.py" } ]
diff --git a/pwndbg/commands/windbg.py b/pwndbg/commands/windbg.py index f02bb70d6e4..e81af32eeaf 100644 --- a/pwndbg/commands/windbg.py +++ b/pwndbg/commands/windbg.py @@ -190,7 +190,7 @@ def dX(size, address, count, to_string=False, repeat=False): def enhex(size, value): - value = value & pwndbg.arch.ptrmask + value = value & ((1 << 8 * size) - 1) x = "%x" % abs(value) x = x.rjust(size * 2, "0") return x diff --git a/tests/test_windbg.py b/tests/test_windbg.py index 1af5dbedadd..8aabb7c7816 100644 --- a/tests/test_windbg.py +++ b/tests/test_windbg.py @@ -4,6 +4,8 @@ import tests MEMORY_BINARY = tests.binaries.get("memory.out") +X86_BINARY = tests.binaries.get("gosample.x86") + data_addr = "0x400081" @@ -299,3 +301,67 @@ def test_windbg_eX_commands(start_binary): # Check if the write actually occurred assert pwndbg.memory.read(stack_last_qword_ea, 8) == b"\xef\xbe\xad\xde\xbe\xba\xfe\xca" + + +def test_windbg_commands_x86(start_binary): + """ + Tests windbg compatibility commands that dump memory + like dq, dw, db, ds etc. + """ + start_binary(X86_BINARY) + + # Prepare memory + pwndbg.memory.write(pwndbg.regs.esp, b"1234567890abcdef_") + pwndbg.memory.write(pwndbg.regs.esp + 16, b"\x00" * 16) + pwndbg.memory.write(pwndbg.regs.esp + 32, bytes(range(16))) + pwndbg.memory.write(pwndbg.regs.esp + 48, b"Z" * 16) + + ################################################# + #### dX command tests + ################################################# + db = gdb.execute("db $esp", to_string=True).splitlines() + assert db == [ + "%x 31 32 33 34 35 36 37 38 39 30 61 62 63 64 65 66" % pwndbg.regs.esp, + "%x 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00" % (pwndbg.regs.esp + 16), + "%x 00 01 02 03 04 05 06 07 08 09 0a 0b 0c 0d 0e 0f" % (pwndbg.regs.esp + 32), + "%x 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a" % (pwndbg.regs.esp + 48), + ] + + dw = gdb.execute("dw $esp", to_string=True).splitlines() + assert dw == [ + "%x 3231 3433 3635 3837 3039 6261 6463 6665" % pwndbg.regs.esp, + "%x 0000 0000 0000 0000 0000 0000 0000 0000" % (pwndbg.regs.esp + 16), + "%x 0100 0302 0504 0706 0908 0b0a 0d0c 0f0e" % (pwndbg.regs.esp + 32), + "%x 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a" % (pwndbg.regs.esp + 48), + ] + + dd = gdb.execute("dd $esp", to_string=True).splitlines() + assert dd == [ + "%x 34333231 38373635 62613039 66656463" % pwndbg.regs.esp, + "%x 00000000 00000000 00000000 00000000" % (pwndbg.regs.esp + 16), + "%x 03020100 07060504 0b0a0908 0f0e0d0c" % (pwndbg.regs.esp + 32), + "%x 5a5a5a5a 5a5a5a5a 5a5a5a5a 5a5a5a5a" % (pwndbg.regs.esp + 48), + ] + + dq = gdb.execute("dq $esp", to_string=True).splitlines() + assert dq == [ + "%x 3837363534333231 6665646362613039" % pwndbg.regs.esp, + "%x 0000000000000000 0000000000000000" % (pwndbg.regs.esp + 16), + "%x 0706050403020100 0f0e0d0c0b0a0908" % (pwndbg.regs.esp + 32), + "%x 5a5a5a5a5a5a5a5a 5a5a5a5a5a5a5a5a" % (pwndbg.regs.esp + 48), + ] + + ################################################# + #### eX command tests + ################################################# + gdb.execute("eb $esp 00") + assert pwndbg.memory.read(pwndbg.regs.esp, 1) == b"\x00" + + gdb.execute("ew $esp 4141") + assert pwndbg.memory.read(pwndbg.regs.esp, 2) == b"\x41\x41" + + gdb.execute("ed $esp 5252525252") + assert pwndbg.memory.read(pwndbg.regs.esp, 4) == b"\x52" * 4 + + gdb.execute("eq $esp 1122334455667788") + assert pwndbg.memory.read(pwndbg.regs.esp, 8) == b"\x88\x77\x66\x55\x44\x33\x22\x11"
readthedocs__readthedocs.org-10610
Change profile edit form success page Currently, when a user saves the profile edit form, the success page is not the profile form page; instead, the user gets redirected to the profile public view page. This is quite confusing UX, but it might be baked into Allauth. I would expect this to end up on the profile edit form page instead.
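This record's patch is not shown in this excerpt, so the following is only a hedged sketch of the requested behaviour, mirroring the pattern the same file already uses for `AccountAdvertisingEdit` (a `SuccessMessageMixin` plus a `get_success_url` that points back at the form). The URL name `"profiles_profile_edit"` and the success message text are assumptions for illustration, not necessarily what Read the Docs ships.

```python
# Hypothetical sketch only -- not the actual Read the Docs patch.
# Assumes a URL pattern named "profiles_profile_edit" routes to this view.
from django.contrib.messages.views import SuccessMessageMixin
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from vanilla import UpdateView


class ProfileEditSketch(SuccessMessageMixin, UpdateView):
    """Stay on the edit form after saving instead of redirecting to the
    public profile detail page."""

    success_message = _("Your profile has been updated")

    def get_object(self):
        return self.request.user.profile

    def get_success_url(self):
        # Redirect back to the edit form rather than to profiles_profile_detail.
        return reverse("profiles_profile_edit")
```

With this kind of change, saving the form would re-render the edit page with a confirmation message, which matches the behaviour the reporter expected.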
[ { "content": "\"\"\"Views for creating, editing and viewing site-specific user profiles.\"\"\"\n\nfrom allauth.account.views import LoginView as AllAuthLoginView\nfrom allauth.account.views import LogoutView as AllAuthLogoutView\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.contrib.auth import logout\nfrom django.contrib.auth.models import User\nfrom django.contrib.messages.views import SuccessMessageMixin\nfrom django.http import Http404, HttpResponseRedirect\nfrom django.urls import reverse\nfrom django.utils import timezone\nfrom django.utils.translation import gettext_lazy as _\nfrom rest_framework.authtoken.models import Token\nfrom vanilla import CreateView, DeleteView, DetailView, FormView, ListView, UpdateView\n\nfrom readthedocs.audit.filters import UserSecurityLogFilter\nfrom readthedocs.audit.models import AuditLog\nfrom readthedocs.core.forms import UserAdvertisingForm, UserDeleteForm, UserProfileForm\nfrom readthedocs.core.history import set_change_reason\nfrom readthedocs.core.mixins import PrivateViewMixin\nfrom readthedocs.core.models import UserProfile\nfrom readthedocs.core.permissions import AdminPermission\nfrom readthedocs.core.utils.extend import SettingsOverrideObject\nfrom readthedocs.organizations.models import Organization\nfrom readthedocs.projects.models import Project\nfrom readthedocs.projects.utils import get_csv_file\n\n\nclass LoginViewBase(AllAuthLoginView):\n\n pass\n\n\nclass LoginView(SettingsOverrideObject):\n\n _default_class = LoginViewBase\n\n\nclass LogoutViewBase(AllAuthLogoutView):\n\n pass\n\n\nclass LogoutView(SettingsOverrideObject):\n\n _default_class = LogoutViewBase\n\n\nclass ProfileEdit(PrivateViewMixin, UpdateView):\n\n \"\"\"Edit the current user's profile.\"\"\"\n\n model = UserProfile\n form_class = UserProfileForm\n template_name = 'profiles/private/edit_profile.html'\n context_object_name = 'profile'\n\n def get_object(self):\n return self.request.user.profile\n\n def get_success_url(self):\n return reverse(\n 'profiles_profile_detail',\n kwargs={'username': self.request.user.username},\n )\n\n\nclass AccountDelete(PrivateViewMixin, SuccessMessageMixin, FormView):\n\n form_class = UserDeleteForm\n template_name = 'profiles/private/delete_account.html'\n success_message = _('You have successfully deleted your account')\n\n def get_object(self):\n return User.objects.get(pk=self.request.user.pk)\n\n def form_valid(self, form):\n user = self.get_object()\n logout(self.request)\n set_change_reason(user, self.get_change_reason())\n user.delete()\n return super().form_valid(form)\n\n def get_form(self, data=None, files=None, **kwargs):\n kwargs['instance'] = self.get_object()\n kwargs['initial'] = {'username': ''}\n return super().get_form(data, files, **kwargs)\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n user = self.request.user\n context[\"projects_to_be_deleted\"] = Project.objects.single_owner(user)\n context[\"organizations_to_be_deleted\"] = Organization.objects.single_owner(user)\n return context\n\n def get_success_url(self):\n return reverse('homepage')\n\n def get_change_reason(self):\n klass = self.__class__.__name__\n return f'origin=form class={klass}'\n\n\nclass ProfileDetail(DetailView):\n\n model = User\n template_name = 'profiles/public/profile_detail.html'\n lookup_field = 'username'\n\n def get_object(self):\n \"\"\"\n Get the user object.\n\n If organizations are enabled, show the profile to users in the same organization only.\n 
Otherwise, all users can see the profile of others.\n \"\"\"\n user = super().get_object()\n if not settings.RTD_ALLOW_ORGANIZATIONS:\n return user\n\n request_user = self.request.user\n if not request_user.is_authenticated:\n raise Http404()\n\n # Always allow users to see their own profile.\n if request_user == user:\n return user\n\n for org in Organization.objects.for_user(request_user):\n if AdminPermission.is_member(user=user, obj=org):\n return user\n raise Http404()\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['profile'] = self.get_object().profile\n return context\n\n\nclass AccountAdvertisingEdit(PrivateViewMixin, SuccessMessageMixin, UpdateView):\n\n model = UserProfile\n form_class = UserAdvertisingForm\n context_object_name = 'profile'\n template_name = 'profiles/private/advertising_profile.html'\n success_message = _('Updated your advertising preferences')\n\n def get_object(self):\n return self.request.user.profile\n\n def get_success_url(self):\n return reverse('account_advertising')\n\n\nclass TokenMixin(PrivateViewMixin):\n\n \"\"\"User token to access APIv3.\"\"\"\n\n model = Token\n lookup_url_kwarg = 'token_pk'\n template_name = 'profiles/private/token_list.html'\n\n def get_queryset(self):\n # NOTE: we are currently showing just one token since the DRF model has\n # a OneToOneField relation with User. Although, we plan to have multiple\n # scope-based tokens.\n return Token.objects.filter(user__in=[self.request.user])\n\n def get_success_url(self):\n return reverse('profiles_tokens')\n\n\nclass TokenListView(TokenMixin, ListView):\n pass\n\n\nclass TokenCreateView(TokenMixin, CreateView):\n\n \"\"\"Simple view to generate a Token object for the logged in User.\"\"\"\n\n http_method_names = ['post']\n\n def post(self, request, *args, **kwargs):\n _, created = Token.objects.get_or_create(user=self.request.user)\n if created:\n messages.info(request, 'API Token created successfully')\n return HttpResponseRedirect(self.get_success_url())\n\n\nclass TokenDeleteView(TokenMixin, DeleteView):\n\n \"\"\"View to delete/revoke the current Token of the logged in User.\"\"\"\n\n http_method_names = ['post']\n\n def get_object(self, queryset=None): # noqa\n return self.request.user.auth_token\n\n\nclass UserSecurityLogView(PrivateViewMixin, ListView):\n model = AuditLog\n template_name = 'profiles/private/security_log.html'\n days_limit = settings.RTD_AUDITLOGS_DEFAULT_RETENTION_DAYS\n\n def get(self, request, *args, **kwargs):\n download_data = request.GET.get('download', False)\n if download_data:\n return self._get_csv_data()\n return super().get(request, *args, **kwargs)\n\n def _get_start_date(self):\n \"\"\"Get the date to show logs from.\"\"\"\n creation_date = self.request.user.date_joined.date()\n start_date = timezone.now().date() - timezone.timedelta(days=self.days_limit)\n # The max we can go back is to the creation of the user.\n return max(start_date, creation_date)\n\n def _get_csv_data(self):\n current_timezone = settings.TIME_ZONE\n values = [\n (f\"Date ({current_timezone})\", \"created\"),\n (\"User\", \"log_user_username\"),\n (\"Project\", \"log_project_slug\"),\n (\"Organization\", \"log_organization_slug\"),\n (\"Action\", \"action\"),\n (\"IP\", \"ip\"),\n (\"Browser\", \"browser\"),\n (\"Extra data\", \"data\"),\n ]\n data = self.get_queryset().values_list(*[value for _, value in values])\n\n start_date = self._get_start_date()\n end_date = timezone.now().date()\n date_filter = 
self.filter.form.cleaned_data.get('date')\n if date_filter:\n start_date = date_filter.start or start_date\n end_date = date_filter.stop or end_date\n\n filename = 'readthedocs_user_security_logs_{username}_{start}_{end}.csv'.format(\n username=self.request.user.username,\n start=timezone.datetime.strftime(start_date, '%Y-%m-%d'),\n end=timezone.datetime.strftime(end_date, '%Y-%m-%d'),\n )\n csv_data = [\n [timezone.datetime.strftime(date, '%Y-%m-%d %H:%M:%S'), *rest]\n for date, *rest in data\n ]\n csv_data.insert(0, [header for header, _ in values])\n return get_csv_file(filename=filename, csv_data=csv_data)\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['days_limit'] = self.days_limit\n context['filter'] = self.filter\n context['AuditLog'] = AuditLog\n return context\n\n def _get_queryset(self):\n \"\"\"Return the queryset without filters.\"\"\"\n user = self.request.user\n start_date = self._get_start_date()\n queryset = AuditLog.objects.filter(\n user=user,\n action__in=[action for action, _ in UserSecurityLogFilter.allowed_actions],\n created__gte=start_date,\n )\n return queryset\n\n def get_queryset(self):\n \"\"\"\n Return the queryset with filters.\n\n If you want the original queryset without filters,\n use `_get_queryset`.\n \"\"\"\n queryset = self._get_queryset()\n # Set filter on self, so we can use it in the context.\n # Without executing it twice.\n self.filter = UserSecurityLogFilter(\n self.request.GET,\n queryset=queryset,\n )\n return self.filter.qs\n", "path": "readthedocs/profiles/views.py" } ]
[ { "content": "\"\"\"Views for creating, editing and viewing site-specific user profiles.\"\"\"\n\nfrom allauth.account.views import LoginView as AllAuthLoginView\nfrom allauth.account.views import LogoutView as AllAuthLogoutView\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.contrib.auth import logout\nfrom django.contrib.auth.models import User\nfrom django.contrib.messages.views import SuccessMessageMixin\nfrom django.http import Http404, HttpResponseRedirect\nfrom django.urls import reverse\nfrom django.utils import timezone\nfrom django.utils.translation import gettext_lazy as _\nfrom rest_framework.authtoken.models import Token\nfrom vanilla import CreateView, DeleteView, DetailView, FormView, ListView, UpdateView\n\nfrom readthedocs.audit.filters import UserSecurityLogFilter\nfrom readthedocs.audit.models import AuditLog\nfrom readthedocs.core.forms import UserAdvertisingForm, UserDeleteForm, UserProfileForm\nfrom readthedocs.core.history import set_change_reason\nfrom readthedocs.core.mixins import PrivateViewMixin\nfrom readthedocs.core.models import UserProfile\nfrom readthedocs.core.permissions import AdminPermission\nfrom readthedocs.core.utils.extend import SettingsOverrideObject\nfrom readthedocs.organizations.models import Organization\nfrom readthedocs.projects.models import Project\nfrom readthedocs.projects.utils import get_csv_file\n\n\nclass LoginViewBase(AllAuthLoginView):\n\n pass\n\n\nclass LoginView(SettingsOverrideObject):\n\n _default_class = LoginViewBase\n\n\nclass LogoutViewBase(AllAuthLogoutView):\n\n pass\n\n\nclass LogoutView(SettingsOverrideObject):\n\n _default_class = LogoutViewBase\n\n\nclass ProfileEdit(PrivateViewMixin, UpdateView):\n\n \"\"\"Edit the current user's profile.\"\"\"\n\n model = UserProfile\n form_class = UserProfileForm\n template_name = 'profiles/private/edit_profile.html'\n context_object_name = 'profile'\n\n def get_object(self):\n return self.request.user.profile\n\n def get_success_url(self):\n return reverse(\n \"profiles_profile_edit\",\n )\n\n\nclass AccountDelete(PrivateViewMixin, SuccessMessageMixin, FormView):\n\n form_class = UserDeleteForm\n template_name = 'profiles/private/delete_account.html'\n success_message = _('You have successfully deleted your account')\n\n def get_object(self):\n return User.objects.get(pk=self.request.user.pk)\n\n def form_valid(self, form):\n user = self.get_object()\n logout(self.request)\n set_change_reason(user, self.get_change_reason())\n user.delete()\n return super().form_valid(form)\n\n def get_form(self, data=None, files=None, **kwargs):\n kwargs['instance'] = self.get_object()\n kwargs['initial'] = {'username': ''}\n return super().get_form(data, files, **kwargs)\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n user = self.request.user\n context[\"projects_to_be_deleted\"] = Project.objects.single_owner(user)\n context[\"organizations_to_be_deleted\"] = Organization.objects.single_owner(user)\n return context\n\n def get_success_url(self):\n return reverse('homepage')\n\n def get_change_reason(self):\n klass = self.__class__.__name__\n return f'origin=form class={klass}'\n\n\nclass ProfileDetail(DetailView):\n\n model = User\n template_name = 'profiles/public/profile_detail.html'\n lookup_field = 'username'\n\n def get_object(self):\n \"\"\"\n Get the user object.\n\n If organizations are enabled, show the profile to users in the same organization only.\n Otherwise, all users can see the profile of others.\n 
\"\"\"\n user = super().get_object()\n if not settings.RTD_ALLOW_ORGANIZATIONS:\n return user\n\n request_user = self.request.user\n if not request_user.is_authenticated:\n raise Http404()\n\n # Always allow users to see their own profile.\n if request_user == user:\n return user\n\n for org in Organization.objects.for_user(request_user):\n if AdminPermission.is_member(user=user, obj=org):\n return user\n raise Http404()\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['profile'] = self.get_object().profile\n return context\n\n\nclass AccountAdvertisingEdit(PrivateViewMixin, SuccessMessageMixin, UpdateView):\n\n model = UserProfile\n form_class = UserAdvertisingForm\n context_object_name = 'profile'\n template_name = 'profiles/private/advertising_profile.html'\n success_message = _('Updated your advertising preferences')\n\n def get_object(self):\n return self.request.user.profile\n\n def get_success_url(self):\n return reverse('account_advertising')\n\n\nclass TokenMixin(PrivateViewMixin):\n\n \"\"\"User token to access APIv3.\"\"\"\n\n model = Token\n lookup_url_kwarg = 'token_pk'\n template_name = 'profiles/private/token_list.html'\n\n def get_queryset(self):\n # NOTE: we are currently showing just one token since the DRF model has\n # a OneToOneField relation with User. Although, we plan to have multiple\n # scope-based tokens.\n return Token.objects.filter(user__in=[self.request.user])\n\n def get_success_url(self):\n return reverse('profiles_tokens')\n\n\nclass TokenListView(TokenMixin, ListView):\n pass\n\n\nclass TokenCreateView(TokenMixin, CreateView):\n\n \"\"\"Simple view to generate a Token object for the logged in User.\"\"\"\n\n http_method_names = ['post']\n\n def post(self, request, *args, **kwargs):\n _, created = Token.objects.get_or_create(user=self.request.user)\n if created:\n messages.info(request, 'API Token created successfully')\n return HttpResponseRedirect(self.get_success_url())\n\n\nclass TokenDeleteView(TokenMixin, DeleteView):\n\n \"\"\"View to delete/revoke the current Token of the logged in User.\"\"\"\n\n http_method_names = ['post']\n\n def get_object(self, queryset=None): # noqa\n return self.request.user.auth_token\n\n\nclass UserSecurityLogView(PrivateViewMixin, ListView):\n model = AuditLog\n template_name = 'profiles/private/security_log.html'\n days_limit = settings.RTD_AUDITLOGS_DEFAULT_RETENTION_DAYS\n\n def get(self, request, *args, **kwargs):\n download_data = request.GET.get('download', False)\n if download_data:\n return self._get_csv_data()\n return super().get(request, *args, **kwargs)\n\n def _get_start_date(self):\n \"\"\"Get the date to show logs from.\"\"\"\n creation_date = self.request.user.date_joined.date()\n start_date = timezone.now().date() - timezone.timedelta(days=self.days_limit)\n # The max we can go back is to the creation of the user.\n return max(start_date, creation_date)\n\n def _get_csv_data(self):\n current_timezone = settings.TIME_ZONE\n values = [\n (f\"Date ({current_timezone})\", \"created\"),\n (\"User\", \"log_user_username\"),\n (\"Project\", \"log_project_slug\"),\n (\"Organization\", \"log_organization_slug\"),\n (\"Action\", \"action\"),\n (\"IP\", \"ip\"),\n (\"Browser\", \"browser\"),\n (\"Extra data\", \"data\"),\n ]\n data = self.get_queryset().values_list(*[value for _, value in values])\n\n start_date = self._get_start_date()\n end_date = timezone.now().date()\n date_filter = self.filter.form.cleaned_data.get('date')\n if date_filter:\n start_date = 
date_filter.start or start_date\n end_date = date_filter.stop or end_date\n\n filename = 'readthedocs_user_security_logs_{username}_{start}_{end}.csv'.format(\n username=self.request.user.username,\n start=timezone.datetime.strftime(start_date, '%Y-%m-%d'),\n end=timezone.datetime.strftime(end_date, '%Y-%m-%d'),\n )\n csv_data = [\n [timezone.datetime.strftime(date, '%Y-%m-%d %H:%M:%S'), *rest]\n for date, *rest in data\n ]\n csv_data.insert(0, [header for header, _ in values])\n return get_csv_file(filename=filename, csv_data=csv_data)\n\n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['days_limit'] = self.days_limit\n context['filter'] = self.filter\n context['AuditLog'] = AuditLog\n return context\n\n def _get_queryset(self):\n \"\"\"Return the queryset without filters.\"\"\"\n user = self.request.user\n start_date = self._get_start_date()\n queryset = AuditLog.objects.filter(\n user=user,\n action__in=[action for action, _ in UserSecurityLogFilter.allowed_actions],\n created__gte=start_date,\n )\n return queryset\n\n def get_queryset(self):\n \"\"\"\n Return the queryset with filters.\n\n If you want the original queryset without filters,\n use `_get_queryset`.\n \"\"\"\n queryset = self._get_queryset()\n # Set filter on self, so we can use it in the context.\n # Without executing it twice.\n self.filter = UserSecurityLogFilter(\n self.request.GET,\n queryset=queryset,\n )\n return self.filter.qs\n", "path": "readthedocs/profiles/views.py" } ]
diff --git a/readthedocs/profiles/views.py b/readthedocs/profiles/views.py index 97607e0eda5..00f72266458 100644 --- a/readthedocs/profiles/views.py +++ b/readthedocs/profiles/views.py @@ -61,8 +61,7 @@ def get_object(self): def get_success_url(self): return reverse( - 'profiles_profile_detail', - kwargs={'username': self.request.user.username}, + "profiles_profile_edit", ) diff --git a/readthedocs/rtd_tests/tests/test_profile_views.py b/readthedocs/rtd_tests/tests/test_profile_views.py index 23f5297fc5d..a8bf31d439b 100644 --- a/readthedocs/rtd_tests/tests/test_profile_views.py +++ b/readthedocs/rtd_tests/tests/test_profile_views.py @@ -36,6 +36,7 @@ def test_edit_profile(self): }, ) self.assertTrue(resp.status_code, 200) + self.assertEqual(resp["Location"], "/accounts/edit/") self.user.refresh_from_db() self.user.profile.refresh_from_db()
pex-tool__pex-1859
Release 2.1.100
On the docket:
+ [x] Using --target-system linux --target-system mac can still lead to failed attempts to lock Windows requirements. #1856
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.99\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.100\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 134903f81..4c6cf4cb8 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.100 +------- + +This release fixes a hole in the lock creation ``--target-system`` +feature added in #1823 in Pex 2.1.95. + +* Fix lock creation ``--target-system`` handling. (#1858) + `PR #1858 <https://github.com/pantsbuild/pex/pull/1858>`_ + 2.1.99 ------ diff --git a/pex/version.py b/pex/version.py index 1262846c2..80d82318d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.99" +__version__ = "2.1.100"
microsoft__Qcodes-5046
Update Sphinx favicon config
Thanks for using [Sphinx Favicon](https://github.com/tcmetzger/sphinx-favicon) in your project! I just released version 1.0 of the extension, which brings one breaking change: to better conform with Python standards, we changed the module name to `sphinx_favicon` (instead of `sphinx-favicon`). This means you'll have to update the name in the `extensions` list of your conf.py file (https://github.com/QCoDeS/Qcodes/blob/master/docs/conf.py#L81) to use version 1.0. Otherwise, your existing configuration should continue to work!
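To make the required change concrete, here is a small sketch of the relevant part of a Sphinx `conf.py`; only the favicon entry changes, and the other entry shown is a placeholder rather than QCoDeS's full extension list:

```python
# docs/conf.py (excerpt)
extensions = [
    "sphinx.ext.autodoc",   # other extensions stay untouched
    # "sphinx-favicon",     # module name used by sphinx-favicon < 1.0
    "sphinx_favicon",       # module name required by sphinx-favicon >= 1.0
]
```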
[ { "content": "#!/usr/bin/env python3\n#\n# QCoDeS documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 2 10:41:37 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nfrom abc import ABCMeta\nfrom importlib import reload\n\n# Import matplotlib and set the backend\n# before qcodes imports pyplot and automatically\n# sets the backend\nimport matplotlib\nimport sphinx_rtd_theme # noqa F401\nfrom packaging.version import parse\n\n# setting the metaclass will cause sphinx\n# to document the signature of `__call__`\n# rather than `__init__` that is unhelpful\n# for instruments. When building the docs\n# we patch it back to ABCMeta\n# this should happen as early as possible\nimport qcodes.instrument.instrument_meta\n\nqcodes.instrument.instrument_meta.InstrumentMeta = ABCMeta\n# we need to reload any module that has been imported and\n# makes use of this metaclass. The modules below are all imported\n# by importing qcodes.instrument so we need to reload them\nreload(qcodes.instrument.instrument)\nreload(qcodes.instrument.ip)\nreload(qcodes.instrument.visa)\nreload(qcodes.instrument)\n\nimport qcodes # noqa F402\n\nmatplotlib.use('Agg')\n\nsys.path.insert(0, os.path.abspath('..'))\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"nbsphinx\",\n \"sphinx.ext.autodoc\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.napoleon\",\n \"sphinx-jsonschema\",\n \"sphinx.ext.doctest\",\n \"sphinx.ext.intersphinx\",\n \"sphinx.ext.todo\",\n \"sphinx.ext.coverage\",\n \"sphinx.ext.mathjax\",\n \"sphinx.ext.viewcode\",\n \"sphinx.ext.githubpages\",\n \"sphinx.ext.todo\",\n \"qcodes.sphinx_extensions.parse_parameter_attr\",\n \"sphinxcontrib.towncrier\",\n \"autodocsumm\",\n \"sphinx_issues\",\n \"sphinx-favicon\",\n]\n\n# include special __xxx__ that DO have a docstring\n# it probably means something important\nnapoleon_include_special_with_doc = True\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The encoding of source files.\n#\n# source_encoding = 'utf-8-sig'\n\n# Add link to Binder in Prolog of the notebooks\n# -- Get version information ----------------------------\n\nversion = qcodes.__version__\nrelease = parse(qcodes.__version__).public\n\n# Add link to Binder in Prolog (WRITE MORE DETAILS ONCE FIXED)\nnbsphinx_prolog = r\"\"\"\n{% set docname = 'docs/' + env.doc2path(env.docname, base=None) %}\n\n.. 
raw:: html\n\n <div class=\"admonition note\">\n <p>This page was generated from\n <a class=\"reference external\"\n href=\"https://github.com/qcodes/qcodes/blob/master/{{docname|e}}\">{{ docname|replace(\"\\\\\",\"/\") }}</a>.\n Interactive online version:\n <a href=\"https://mybinder.org/v2/gh/qcodes/qcodes/master?filepath={{\n docname|replace(\"\\\\\",\"/\") }}\"><img\n alt=\"Binder badge\"\n src=\"https://mybinder.org/badge_logo.svg\"\n style=\"vertical-align:text-bottom\"></a>.\n </p>\n <script>\n if (document.location.host) {\n var p = document.currentScript.previousSibling.previousSibling;\n var a = document.createElement('a');\n a.innerHTML = 'View in <em>nbviewer</em>';\n a.href = `https://nbviewer.jupyter.org/url${\n (window.location.protocol == 'https:' ? 's/' : '/') +\n window.location.host +\n window.location.pathname.slice(0, -4) }ipynb`;\n a.classList.add('reference');\n a.classList.add('external');\n p.appendChild(a);\n p.appendChild(document.createTextNode('.'));\n }\n </script>\n </div>\n\"\"\"\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'QCoDeS'\ncopyright = '2016, Giulio Ungaretti, Alex Johnson'\nauthor = 'Giulio Ungaretti, Alex Johnson'\n\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = \"en\"\n\n# There are two options for replacing |today|: either, you set today to some\n# non-false value, then it is used:\n#\n# today = ''\n#\n# Else, today_fmt is used as the format for a strftime call.\n#\n# today_fmt = '%B %d, %Y'\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_templates', '_auto',\n '**.ipynb_checkpoints']\n\n# The reST default role (used for this markup: `text`) to use for all\n# documents.\n#\n# default_role = None\n\n# If true, '()' will be appended to :func: etc. cross-reference text.\n#\n# add_function_parentheses = True\n\n# If true, the current module name will be prepended to all description\n# unit titles (such as .. function::).\n#\n# add_module_names = True\n\n# If true, sectionauthor and moduleauthor directives will be shown in the\n# output. They are ignored by default.\n#\n# show_authors = False\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# A list of ignored prefixes for module index sorting.\n# modindex_common_prefix = []\n\n# If true, keep warnings as \"system message\" paragraphs in the built documents.\n# keep_warnings = False\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = True\n\n# -- Options for HTML output ----------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = \"sphinx_rtd_theme\"\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. 
For a list of options available for each theme, see the\n# documentation.\n#\n# html_theme_options = {}\n\n# Add any paths that contain custom themes here, relative to this directory.\n# html_theme_path = []\n\n# The name for this set of Sphinx documents.\n# \"<project> v<release> documentation\" by default.\n#\n# html_title = 'QCoDeS v1'\n\n# A shorter title for the navigation bar. Default is the same as html_title.\n#\n# html_short_title = None\n\n# The name of an image file (relative to this directory) to place at the top\n# of the sidebar.\n#\n# html_logo = None\n\n# The name of an image file (relative to this directory) to use as a favicon of\n# the docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32\n# pixels large.\n#\n# html_favicon = None\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# Add custom favicon to the sphinx html documentation.\n# Can be an absolute url or a local static file.\nfavicons = {\"rel\": \"icon\", \"static-file\": \"qcodes_favicon.png\", \"type\": \"image/png\"}\n\n# Add any extra paths that contain custom files (such as robots.txt or\n# .htaccess) here, relative to this directory. These files are copied\n# directly to the root of the documentation.\n#\n# html_extra_path = []\n\n# If not None, a 'Last updated on:' timestamp is inserted at every page\n# bottom, using the given strftime format.\n# The empty string is equivalent to '%b %d, %Y'.\n#\n# html_last_updated_fmt = None\n\n# If true, SmartyPants will be used to convert quotes and dashes to\n# typographically correct entities.\n#\n# html_use_smartypants = True\n\n# Custom sidebar templates, maps document names to template names.\n#\n# html_sidebars = {}\n\n# Additional templates that should be rendered to pages, maps page names to\n# template names.\n#\n# html_additional_pages = {}\n\n# If false, no module index is generated.\n#\n# html_domain_indices = True\n\n# If false, no index is generated.\n#\nhtml_use_index = True\n\n# If true, the index is split into individual pages for each letter.\n#\n# html_split_index = False\n\n# If true, links to the reST sources are added to the pages.\n#\n# html_show_sourcelink = True\n\n# If true, \"Created using Sphinx\" is shown in the HTML footer. Default is True.\n#\n# html_show_sphinx = True\n\n# If true, \"(C) Copyright ...\" is shown in the HTML footer. Default is True.\n#\nhtml_show_copyright = False\n\n# If true, an OpenSearch description file will be output, and all pages will\n# contain a <link> tag referring to it. The value of this option must be the\n# base URL from which the finished HTML is served.\n#\n# html_use_opensearch = ''\n\n# This is the file name suffix for HTML files (e.g. \".xhtml\").\n# html_file_suffix = None\n\n# Language to be used for generating the HTML full-text search index.\n# Sphinx supports the following languages:\n# 'da', 'de', 'en', 'es', 'fi', 'fr', 'h', 'it', 'ja'\n# 'nl', 'no', 'pt', 'ro', 'r', 'sv', 'tr', 'zh'\n#\n# html_search_language = 'en'\n\n# A dictionary with options for the search language support, empty by default.\n# 'ja' uses this config value.\n# 'zh' user can custom change `jieba` dictionary path.\n#\n# html_search_options = {'type': 'default'}\n\n# The name of a javascript file (relative to the configuration directory) that\n# implements a search results scorer. 
If empty, the default will be used.\n#\n# html_search_scorer = 'scorer.js'\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'QCoDeSdoc'\n\n# -- Options for LaTeX output ---------------------------------------------\n\nlatex_elements = { # The paper size ('letterpaper' or 'a4paper').\n #\n # 'papersize': 'letterpaper',\n\n # The font size ('10pt', '11pt' or '12pt').\n #\n # 'pointsize': '10pt',\n\n # Additional stuff for the LaTeX preamble.\n #\n # 'preamble': '',\n\n # Latex figure (float) alignment\n #\n # 'figure_align': 'htbp',\n}\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title,\n# author, documentclass [howto, manual, or own class]).\nlatex_documents = [(master_doc, 'QCoDeS.tex', 'QCoDeS Documentation',\n 'Giulio Ungaretti, Alex Johnson', 'manual'), ]\n\n# The name of an image file (relative to this directory) to place at the top of\n# the title page.\n#\n# latex_logo = None\n\n# For \"manual\" documents, if this is true, then toplevel headings are parts,\n# not chapters.\n#\n# latex_use_parts = False\n\n# If true, show page references after internal links.\n#\n# latex_show_pagerefs = False\n\n# If true, show URL addresses after external links.\n#\n# latex_show_urls = False\n\n# Documents to append as an appendix to all manuals.\n#\n# latex_appendices = []\n\n# If false, no module index is generated.\n#\n# latex_domain_indices = True\n\n\n# -- Options for manual page output ---------------------------------------\n\n# One entry per manual page. List of tuples\n# (source start file, name, description, authors, manual section).\nman_pages = [(master_doc, 'qcodes', 'QCoDeS Documentation', [author], 1)]\n\n# If true, show URL addresses after external links.\n#\n# man_show_urls = False\n\n\n# -- Options for Texinfo output -------------------------------------------\n\n# Grouping the document tree into Texinfo files. 
List of tuples\n# (source start file, target name, title, author,\n# dir menu entry, description, category)\ntexinfo_documents = [(\n master_doc, 'QCoDeS', 'QCoDeS Documentation', author, 'QCoDeS',\n 'One line description of project.', 'Miscellaneous'), ]\n\n# Documents to append as an appendix to all manuals.\n#\n# texinfo_appendices = []\n\n# If false, no module index is generated.\n#\n# texinfo_domain_indices = True\n\n# How to display URL addresses: 'footnote', 'no', or 'inline'.\n#\ntexinfo_show_urls = 'footnote'\n\n# If true, do not generate a @detailmenu in the \"Top\" node's menu.\n#\n# texinfo_no_detailmenu = False\n\n\n# Example configuration for intersphinx: refer to the Python standard library.\nintersphinx_mapping = {\n \"pandas\": (\"https://pandas.pydata.org/pandas-docs/stable/\", None),\n \"matplotlib\": (\"https://matplotlib.org/stable\", None),\n \"python\": (\"https://docs.python.org/3.10/\", None),\n \"numpy\": (\"https://numpy.org/doc/stable/\", None),\n \"pyvisa\": (\"https://pyvisa.readthedocs.io/en/stable/\", None),\n \"IPython\": (\"https://ipython.readthedocs.io/en/stable/\", None),\n}\n\nautoclass_content = \"both\"\n# classes should include both the\n# class' and the __init__ method's docstring\nautosummary_generate = True\nautodoc_member_order = 'bysource'\nautodoc_default_options = {'members': True, 'undoc-members': True,\n 'inherited-members': True, 'show-inheritance': True}\n\n# we mock modules that for one reason or another is not\n# there when generating the docs\nautodoc_mock_imports = [\n \"pyspcm\",\n \"zhinst\",\n \"zhinst.utils\",\n \"keysightSD1\",\n \"cffi\",\n \"spirack\",\n \"clr\",\n \"win32com\",\n \"win32com.client\",\n \"pythoncom\",\n \"slack-sdk\",\n \"hickle\",\n \"gclib\",\n]\n\nautodoc_typehints_format = \"short\"\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# we are using non local images for badges. These will change so we dont\n# want to store them locally.\nsuppress_warnings = ['image.nonlocal_uri']\n\nnitpicky = False\n\nnumfig = True\n\n# Use this kernel instead of the one stored in the notebook metadata:\nnbsphinx_kernel_name = 'python3'\n# always execute notebooks.\nnbsphinx_execute = 'always'\n\ntowncrier_draft_autoversion_mode = \"draft\"\ntowncrier_draft_include_empty = True\ntowncrier_draft_working_directory = \"..\"\n\nissues_github_path = \"QCoDeS/Qcodes\"\n", "path": "docs/conf.py" } ]
[ { "content": "#!/usr/bin/env python3\n#\n# QCoDeS documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 2 10:41:37 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nfrom abc import ABCMeta\nfrom importlib import reload\n\n# Import matplotlib and set the backend\n# before qcodes imports pyplot and automatically\n# sets the backend\nimport matplotlib\nimport sphinx_rtd_theme # noqa F401\nfrom packaging.version import parse\n\n# setting the metaclass will cause sphinx\n# to document the signature of `__call__`\n# rather than `__init__` that is unhelpful\n# for instruments. When building the docs\n# we patch it back to ABCMeta\n# this should happen as early as possible\nimport qcodes.instrument.instrument_meta\n\nqcodes.instrument.instrument_meta.InstrumentMeta = ABCMeta\n# we need to reload any module that has been imported and\n# makes use of this metaclass. The modules below are all imported\n# by importing qcodes.instrument so we need to reload them\nreload(qcodes.instrument.instrument)\nreload(qcodes.instrument.ip)\nreload(qcodes.instrument.visa)\nreload(qcodes.instrument)\n\nimport qcodes # noqa F402\n\nmatplotlib.use('Agg')\n\nsys.path.insert(0, os.path.abspath('..'))\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"nbsphinx\",\n \"sphinx.ext.autodoc\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.napoleon\",\n \"sphinx-jsonschema\",\n \"sphinx.ext.doctest\",\n \"sphinx.ext.intersphinx\",\n \"sphinx.ext.todo\",\n \"sphinx.ext.coverage\",\n \"sphinx.ext.mathjax\",\n \"sphinx.ext.viewcode\",\n \"sphinx.ext.githubpages\",\n \"sphinx.ext.todo\",\n \"qcodes.sphinx_extensions.parse_parameter_attr\",\n \"sphinxcontrib.towncrier\",\n \"autodocsumm\",\n \"sphinx_issues\",\n \"sphinx_favicon\",\n]\n\n# include special __xxx__ that DO have a docstring\n# it probably means something important\nnapoleon_include_special_with_doc = True\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The encoding of source files.\n#\n# source_encoding = 'utf-8-sig'\n\n# Add link to Binder in Prolog of the notebooks\n# -- Get version information ----------------------------\n\nversion = qcodes.__version__\nrelease = parse(qcodes.__version__).public\n\n# Add link to Binder in Prolog (WRITE MORE DETAILS ONCE FIXED)\nnbsphinx_prolog = r\"\"\"\n{% set docname = 'docs/' + env.doc2path(env.docname, base=None) %}\n\n.. 
raw:: html\n\n <div class=\"admonition note\">\n <p>This page was generated from\n <a class=\"reference external\"\n href=\"https://github.com/qcodes/qcodes/blob/master/{{docname|e}}\">{{ docname|replace(\"\\\\\",\"/\") }}</a>.\n Interactive online version:\n <a href=\"https://mybinder.org/v2/gh/qcodes/qcodes/master?filepath={{\n docname|replace(\"\\\\\",\"/\") }}\"><img\n alt=\"Binder badge\"\n src=\"https://mybinder.org/badge_logo.svg\"\n style=\"vertical-align:text-bottom\"></a>.\n </p>\n <script>\n if (document.location.host) {\n var p = document.currentScript.previousSibling.previousSibling;\n var a = document.createElement('a');\n a.innerHTML = 'View in <em>nbviewer</em>';\n a.href = `https://nbviewer.jupyter.org/url${\n (window.location.protocol == 'https:' ? 's/' : '/') +\n window.location.host +\n window.location.pathname.slice(0, -4) }ipynb`;\n a.classList.add('reference');\n a.classList.add('external');\n p.appendChild(a);\n p.appendChild(document.createTextNode('.'));\n }\n </script>\n </div>\n\"\"\"\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'QCoDeS'\ncopyright = '2016, Giulio Ungaretti, Alex Johnson'\nauthor = 'Giulio Ungaretti, Alex Johnson'\n\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = \"en\"\n\n# There are two options for replacing |today|: either, you set today to some\n# non-false value, then it is used:\n#\n# today = ''\n#\n# Else, today_fmt is used as the format for a strftime call.\n#\n# today_fmt = '%B %d, %Y'\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_templates', '_auto',\n '**.ipynb_checkpoints']\n\n# The reST default role (used for this markup: `text`) to use for all\n# documents.\n#\n# default_role = None\n\n# If true, '()' will be appended to :func: etc. cross-reference text.\n#\n# add_function_parentheses = True\n\n# If true, the current module name will be prepended to all description\n# unit titles (such as .. function::).\n#\n# add_module_names = True\n\n# If true, sectionauthor and moduleauthor directives will be shown in the\n# output. They are ignored by default.\n#\n# show_authors = False\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# A list of ignored prefixes for module index sorting.\n# modindex_common_prefix = []\n\n# If true, keep warnings as \"system message\" paragraphs in the built documents.\n# keep_warnings = False\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = True\n\n# -- Options for HTML output ----------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = \"sphinx_rtd_theme\"\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. 
For a list of options available for each theme, see the\n# documentation.\n#\n# html_theme_options = {}\n\n# Add any paths that contain custom themes here, relative to this directory.\n# html_theme_path = []\n\n# The name for this set of Sphinx documents.\n# \"<project> v<release> documentation\" by default.\n#\n# html_title = 'QCoDeS v1'\n\n# A shorter title for the navigation bar. Default is the same as html_title.\n#\n# html_short_title = None\n\n# The name of an image file (relative to this directory) to place at the top\n# of the sidebar.\n#\n# html_logo = None\n\n# The name of an image file (relative to this directory) to use as a favicon of\n# the docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32\n# pixels large.\n#\n# html_favicon = None\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# Add custom favicon to the sphinx html documentation.\n# Can be an absolute url or a local static file.\nfavicons = {\"rel\": \"icon\", \"static-file\": \"qcodes_favicon.png\", \"type\": \"image/png\"}\n\n# Add any extra paths that contain custom files (such as robots.txt or\n# .htaccess) here, relative to this directory. These files are copied\n# directly to the root of the documentation.\n#\n# html_extra_path = []\n\n# If not None, a 'Last updated on:' timestamp is inserted at every page\n# bottom, using the given strftime format.\n# The empty string is equivalent to '%b %d, %Y'.\n#\n# html_last_updated_fmt = None\n\n# If true, SmartyPants will be used to convert quotes and dashes to\n# typographically correct entities.\n#\n# html_use_smartypants = True\n\n# Custom sidebar templates, maps document names to template names.\n#\n# html_sidebars = {}\n\n# Additional templates that should be rendered to pages, maps page names to\n# template names.\n#\n# html_additional_pages = {}\n\n# If false, no module index is generated.\n#\n# html_domain_indices = True\n\n# If false, no index is generated.\n#\nhtml_use_index = True\n\n# If true, the index is split into individual pages for each letter.\n#\n# html_split_index = False\n\n# If true, links to the reST sources are added to the pages.\n#\n# html_show_sourcelink = True\n\n# If true, \"Created using Sphinx\" is shown in the HTML footer. Default is True.\n#\n# html_show_sphinx = True\n\n# If true, \"(C) Copyright ...\" is shown in the HTML footer. Default is True.\n#\nhtml_show_copyright = False\n\n# If true, an OpenSearch description file will be output, and all pages will\n# contain a <link> tag referring to it. The value of this option must be the\n# base URL from which the finished HTML is served.\n#\n# html_use_opensearch = ''\n\n# This is the file name suffix for HTML files (e.g. \".xhtml\").\n# html_file_suffix = None\n\n# Language to be used for generating the HTML full-text search index.\n# Sphinx supports the following languages:\n# 'da', 'de', 'en', 'es', 'fi', 'fr', 'h', 'it', 'ja'\n# 'nl', 'no', 'pt', 'ro', 'r', 'sv', 'tr', 'zh'\n#\n# html_search_language = 'en'\n\n# A dictionary with options for the search language support, empty by default.\n# 'ja' uses this config value.\n# 'zh' user can custom change `jieba` dictionary path.\n#\n# html_search_options = {'type': 'default'}\n\n# The name of a javascript file (relative to the configuration directory) that\n# implements a search results scorer. 
If empty, the default will be used.\n#\n# html_search_scorer = 'scorer.js'\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'QCoDeSdoc'\n\n# -- Options for LaTeX output ---------------------------------------------\n\nlatex_elements = { # The paper size ('letterpaper' or 'a4paper').\n #\n # 'papersize': 'letterpaper',\n\n # The font size ('10pt', '11pt' or '12pt').\n #\n # 'pointsize': '10pt',\n\n # Additional stuff for the LaTeX preamble.\n #\n # 'preamble': '',\n\n # Latex figure (float) alignment\n #\n # 'figure_align': 'htbp',\n}\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title,\n# author, documentclass [howto, manual, or own class]).\nlatex_documents = [(master_doc, 'QCoDeS.tex', 'QCoDeS Documentation',\n 'Giulio Ungaretti, Alex Johnson', 'manual'), ]\n\n# The name of an image file (relative to this directory) to place at the top of\n# the title page.\n#\n# latex_logo = None\n\n# For \"manual\" documents, if this is true, then toplevel headings are parts,\n# not chapters.\n#\n# latex_use_parts = False\n\n# If true, show page references after internal links.\n#\n# latex_show_pagerefs = False\n\n# If true, show URL addresses after external links.\n#\n# latex_show_urls = False\n\n# Documents to append as an appendix to all manuals.\n#\n# latex_appendices = []\n\n# If false, no module index is generated.\n#\n# latex_domain_indices = True\n\n\n# -- Options for manual page output ---------------------------------------\n\n# One entry per manual page. List of tuples\n# (source start file, name, description, authors, manual section).\nman_pages = [(master_doc, 'qcodes', 'QCoDeS Documentation', [author], 1)]\n\n# If true, show URL addresses after external links.\n#\n# man_show_urls = False\n\n\n# -- Options for Texinfo output -------------------------------------------\n\n# Grouping the document tree into Texinfo files. 
List of tuples\n# (source start file, target name, title, author,\n# dir menu entry, description, category)\ntexinfo_documents = [(\n master_doc, 'QCoDeS', 'QCoDeS Documentation', author, 'QCoDeS',\n 'One line description of project.', 'Miscellaneous'), ]\n\n# Documents to append as an appendix to all manuals.\n#\n# texinfo_appendices = []\n\n# If false, no module index is generated.\n#\n# texinfo_domain_indices = True\n\n# How to display URL addresses: 'footnote', 'no', or 'inline'.\n#\ntexinfo_show_urls = 'footnote'\n\n# If true, do not generate a @detailmenu in the \"Top\" node's menu.\n#\n# texinfo_no_detailmenu = False\n\n\n# Example configuration for intersphinx: refer to the Python standard library.\nintersphinx_mapping = {\n \"pandas\": (\"https://pandas.pydata.org/pandas-docs/stable/\", None),\n \"matplotlib\": (\"https://matplotlib.org/stable\", None),\n \"python\": (\"https://docs.python.org/3.10/\", None),\n \"numpy\": (\"https://numpy.org/doc/stable/\", None),\n \"pyvisa\": (\"https://pyvisa.readthedocs.io/en/stable/\", None),\n \"IPython\": (\"https://ipython.readthedocs.io/en/stable/\", None),\n}\n\nautoclass_content = \"both\"\n# classes should include both the\n# class' and the __init__ method's docstring\nautosummary_generate = True\nautodoc_member_order = 'bysource'\nautodoc_default_options = {'members': True, 'undoc-members': True,\n 'inherited-members': True, 'show-inheritance': True}\n\n# we mock modules that for one reason or another is not\n# there when generating the docs\nautodoc_mock_imports = [\n \"pyspcm\",\n \"zhinst\",\n \"zhinst.utils\",\n \"keysightSD1\",\n \"cffi\",\n \"spirack\",\n \"clr\",\n \"win32com\",\n \"win32com.client\",\n \"pythoncom\",\n \"slack-sdk\",\n \"hickle\",\n \"gclib\",\n]\n\nautodoc_typehints_format = \"short\"\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# we are using non local images for badges. These will change so we dont\n# want to store them locally.\nsuppress_warnings = ['image.nonlocal_uri']\n\nnitpicky = False\n\nnumfig = True\n\n# Use this kernel instead of the one stored in the notebook metadata:\nnbsphinx_kernel_name = 'python3'\n# always execute notebooks.\nnbsphinx_execute = 'always'\n\ntowncrier_draft_autoversion_mode = \"draft\"\ntowncrier_draft_include_empty = True\ntowncrier_draft_working_directory = \"..\"\n\nissues_github_path = \"QCoDeS/Qcodes\"\n", "path": "docs/conf.py" } ]
diff --git a/docs/conf.py b/docs/conf.py index f199041d57f..de9b9b3413f 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -78,7 +78,7 @@ "sphinxcontrib.towncrier", "autodocsumm", "sphinx_issues", - "sphinx-favicon", + "sphinx_favicon", ] # include special __xxx__ that DO have a docstring diff --git a/pyproject.toml b/pyproject.toml index 1b6ece851b0..c851529a1b1 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -107,7 +107,7 @@ docs = [ "nbsphinx>=0.8.9", "PyVisa-sim>=0.4.0", "sphinx>=4.5.0", - "sphinx-favicon>=0.2", + "sphinx-favicon>=1.0", "sphinx-issues>=3.0.1", "sphinx-jsonschema>=1.19.1", "sphinx-rtd-theme>=1.0.0", diff --git a/requirements.txt b/requirements.txt index d58457f88a5..de398991089 100644 --- a/requirements.txt +++ b/requirements.txt @@ -139,7 +139,7 @@ snowballstemmer~=2.2.0 sortedcontainers~=2.4.0 soupsieve~=2.4 Sphinx @ git+https://github.com/jenshnielsen/sphinx.git@fix_9884_6_1_3 -sphinx-favicon==0.2 +sphinx-favicon==1.0 sphinx-issues~=3.0.1 sphinx-jsonschema~=1.19.1 sphinx-rtd-theme~=1.2.0
codespell-project__codespell-89
Makefile is broken
The makefile is no longer working, since there is no longer a codespell.py.
[ { "content": "#! /usr/bin/env python\n\n# adapted from mne-python\n\nimport os\nfrom os import path as op\n\ntry:\n import setuptools # noqa to allow --develop\nexcept Exception:\n pass\nfrom distutils.core import setup\n\nfrom codespell_lib import __version__\n\nDISTNAME = 'codespell'\nDESCRIPTION = \"\"\"Codespell\"\"\"\nMAINTAINER = 'Lucas De Marchi'\nMAINTAINER_EMAIL = '[email protected]'\nURL = 'https://github.com/lucasdemarchi/codespell/'\nLICENSE = 'GPL v2'\nDOWNLOAD_URL = 'https://github.com/lucasdemarchi/codespell/'\nwith open('README.rst', 'r') as f:\n LONG_DESCRIPTION = f.read()\n\nif __name__ == \"__main__\":\n if os.path.exists('MANIFEST'):\n os.remove('MANIFEST')\n\n setup(name=DISTNAME,\n maintainer=MAINTAINER,\n include_package_data=True,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n license=LICENSE,\n url=URL,\n version=__version__,\n download_url=DOWNLOAD_URL,\n long_description=LONG_DESCRIPTION,\n zip_safe=False,\n classifiers=['Intended Audience :: Developers',\n 'License :: OSI Approved',\n 'Programming Language :: Python',\n 'Topic :: Software Development',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Operating System :: Unix',\n 'Operating System :: MacOS'],\n platforms='any',\n packages=[\n 'codespell_lib', 'codespell_lib.tests',\n 'codespell_lib.data',\n ],\n package_data={'codespell_lib': [\n op.join('data', 'dictionary.txt'),\n op.join('data', 'linux-kernel.exclude'),\n ]},\n scripts=['bin/codespell.py'])\n", "path": "setup.py" } ]
[ { "content": "#! /usr/bin/env python\n\n# adapted from mne-python\n\nimport os\nfrom os import path as op\n\ntry:\n import setuptools # noqa to allow --develop\nexcept Exception:\n pass\nfrom distutils.core import setup\n\nfrom codespell_lib import __version__\n\nDISTNAME = 'codespell'\nDESCRIPTION = \"\"\"Codespell\"\"\"\nMAINTAINER = 'Lucas De Marchi'\nMAINTAINER_EMAIL = '[email protected]'\nURL = 'https://github.com/lucasdemarchi/codespell/'\nLICENSE = 'GPL v2'\nDOWNLOAD_URL = 'https://github.com/lucasdemarchi/codespell/'\nwith open('README.rst', 'r') as f:\n LONG_DESCRIPTION = f.read()\n\nif __name__ == \"__main__\":\n if os.path.exists('MANIFEST'):\n os.remove('MANIFEST')\n\n setup(name=DISTNAME,\n maintainer=MAINTAINER,\n include_package_data=True,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n license=LICENSE,\n url=URL,\n version=__version__,\n download_url=DOWNLOAD_URL,\n long_description=LONG_DESCRIPTION,\n zip_safe=False,\n classifiers=['Intended Audience :: Developers',\n 'License :: OSI Approved',\n 'Programming Language :: Python',\n 'Topic :: Software Development',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Operating System :: Unix',\n 'Operating System :: MacOS'],\n platforms='any',\n packages=[\n 'codespell_lib', 'codespell_lib.tests',\n 'codespell_lib.data',\n ],\n package_data={'codespell_lib': [\n op.join('data', 'dictionary.txt'),\n op.join('data', 'linux-kernel.exclude'),\n ]},\n scripts=['bin/codespell'])\n", "path": "setup.py" } ]
diff --git a/README.rst b/README.rst index cc19f4cf8f..44b4f0b749 100644 --- a/README.rst +++ b/README.rst @@ -31,7 +31,7 @@ You can use ``pip`` to install codespell with e.g.:: Usage ----- -Check usage with ``./codespell.py -h``. There are a few command line options. +Check usage with ``codespell -h``. There are a few command line options. Note that upon installation with "make install" we don't have the "py" suffix. We ship a dictionary that is an improved version of the one available `on Wikipedia <https://en.wikipedia.org/wiki/Wikipedia:Lists_of_common_misspellings/For_machines>`_ @@ -71,7 +71,7 @@ directly, but instead be manually inspected. E.g.: License ------- -The Python script ``codespell.py`` is available with the following terms: +The Python script ``codespell`` is available with the following terms: (*tl;dr*: `GPL v2`_) Copyright (C) 2010-2011 Lucas De Marchi <[email protected]> diff --git a/bin/codespell.py b/bin/codespell similarity index 100% rename from bin/codespell.py rename to bin/codespell diff --git a/codespell_lib/tests/test_basic.py b/codespell_lib/tests/test_basic.py index 05d2fe9693..c0285412f9 100644 --- a/codespell_lib/tests/test_basic.py +++ b/codespell_lib/tests/test_basic.py @@ -18,12 +18,12 @@ def run_codespell(args=(), cwd=None): """Helper to run codespell""" return subprocess.Popen( - ['codespell.py'] + list(args), cwd=cwd, + ['codespell'] + list(args), cwd=cwd, stdout=subprocess.PIPE, stderr=subprocess.PIPE).wait() def test_command(): - """Test running codespell.py""" + """Test running the codespell executable""" # With no arguments does "." with TemporaryDirectory() as d: assert_equal(run_codespell(cwd=d), 0) diff --git a/setup.py b/setup.py index 5619389a75..e43fbf23a9 100755 --- a/setup.py +++ b/setup.py @@ -55,4 +55,4 @@ op.join('data', 'dictionary.txt'), op.join('data', 'linux-kernel.exclude'), ]}, - scripts=['bin/codespell.py']) + scripts=['bin/codespell'])
OpenEnergyPlatform__oeplatform-495
The data versioning does not track change types
Change tables do not store the types of changes. "_type" has to be injected into queries.
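As a rough, hypothetical sketch of what "injecting" the change type could mean, the meta-info helper in the parser below could record the kind of operation alongside the existing `_user`/`_message` fields. This is an illustration under the assumption that the change tables have (or will gain) a `_type` column; it is not the actual patch:

```python
def set_meta_info(method, user, message=None):
    """Collect the meta values attached to every versioned write."""
    val_dict = {}
    val_dict["_user"] = user
    val_dict["_message"] = message
    # Record the change type ("insert", "update", "delete", ...) so the
    # change tables can distinguish what kind of modification happened.
    val_dict["_type"] = method
    return val_dict
```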
[ { "content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n column,\n func,\n literal_column,\n or_,\n select,\n util,\n cast,\n)\nimport dateutil\nfrom sqlalchemy.dialects.postgresql.base import INTERVAL\nfrom sqlalchemy.schema import Sequence\nfrom sqlalchemy.sql import functions as fun\nfrom sqlalchemy.sql.annotation import Annotated\nfrom sqlalchemy.sql.elements import Slice\nfrom sqlalchemy.sql.expression import ColumnClause, CompoundSelect\nfrom sqlalchemy.sql.sqltypes import Interval, _AbstractInterval\n\nfrom api.connection import _get_engine\nfrom api.error import APIError, APIKeyError\nfrom api.connection import _get_engine\nfrom sqlalchemy.sql.sqltypes import Interval, _AbstractInterval\nfrom sqlalchemy.dialects.postgresql.base import INTERVAL\nfrom sqlalchemy import types as sqltypes\n\nfrom . import DEFAULT_SCHEMA\n\n__KNOWN_TABLES = {}\n\npgsql_qualifier = re.compile(r\"^[\\w\\d_\\.]+$\")\n\n\ndef get_or_403(dictionary, key):\n try:\n return dictionary[key]\n except KeyError:\n raise APIKeyError(dictionary, key)\n\n\ndef parse_single(x, caster):\n try:\n return caster(x)\n except ValueError:\n raise APIError(\"Could not parse %s as %s\" % (x, caster))\n\n\ndef is_pg_qual(x):\n if not isinstance(x, str):\n return False\n return pgsql_qualifier.search(x)\n\n\ndef read_pgvalue(x):\n # TODO: Implement check for valid values\n if x is None:\n return \"null\"\n return x\n\n\nclass ValidationError(Exception):\n def __init__(self, message, value):\n self.message = message\n self.value = value\n\n\ndef read_bool(s):\n if isinstance(s, bool):\n return s\n if s.lower() in [\"true\", \"false\"]:\n return s.lower() == \"true\"\n elif s.lower() in [\"yes\", \"no\"]:\n return s.lower() == \"true\"\n else:\n raise APIError(\"Invalid value in binary field\", s)\n\n\ndef read_pgid(s):\n if is_pg_qual(s):\n return s\n raise APIError(\"Invalid identifier: '%s'\" % s)\n\n\ndef set_meta_info(method, user, message=None):\n val_dict = {}\n val_dict[\"_user\"] = user # TODO: Add user handling\n val_dict[\"_message\"] = message\n return val_dict\n\n\ndef parse_insert(d, context, message=None, mapper=None):\n table = Table(\n read_pgid(get_or_403(d, \"table\")),\n MetaData(bind=_get_engine()),\n autoload=True,\n schema=read_pgid(get_or_403(d, \"schema\")),\n )\n field_strings = []\n for field in d.get(\"fields\", []):\n if not (\n (isinstance(field, dict) and \"type\" in field and field[\"type\"] == \"column\")\n or isinstance(field, str)\n ):\n raise APIError(\"Only pure column expressions are allowed in insert\")\n field_strings.append(parse_expression(field))\n\n query = table.insert()\n\n if not \"method\" in d:\n d[\"method\"] = \"values\"\n if d[\"method\"] == \"values\":\n if field_strings:\n raw_values = get_or_403(d, \"values\")\n if not isinstance(raw_values, list):\n raise APIError(\"{} is not a list\".format(raw_values))\n values = (\n zip(\n field_strings,\n parse_expression(x, allow_untyped_dicts=True, escape_quotes=False),\n )\n for x in raw_values\n )\n else:\n values = get_or_403(d, \"values\")\n\n def clear_meta(vals):\n val_dict = dict(vals)\n # make sure meta fields are not compromised\n if context[\"user\"].is_anonymous:\n username = \"Anonymous\"\n else:\n username = context[\"user\"].name\n val_dict.update(set_meta_info(\"insert\", username, 
message))\n return val_dict\n\n values = list(map(clear_meta, values))\n\n query = query.values(values)\n elif d[\"method\"] == \"select\":\n values = parse_select(d[\"values\"])\n query = query.from_select(field_strings, values)\n else:\n raise APIError(\"Unknown insert method: \" + str(d[\"method\"]))\n\n if \"returning\" in d:\n return_clauses = [parse_expression(x, mapper) for x in d[\"returning\"]]\n query = query.returning(*return_clauses)\n\n return query, values\n\n\ndef parse_select(d):\n \"\"\"\n Defintion of a select query according to\n http://www.postgresql.org/docs/9.3/static/sql-select.html\n\n not implemented:\n [ WITH [ RECURSIVE ] with_query [, ...] ]\n [ WINDOW window_name AS ( window_definition ) [, ...] ]\n [ FOR { UPDATE | NO KEY UPDATE | SHARE | KEY SHARE } [ OF table_name [, ...] ] [ NOWAIT ] [...] ]\n \"\"\"\n distinct = d.get(\"distinct\", False)\n\n L = None\n\n keyword = d.get(\"keyword\")\n\n if keyword and keyword.lower() in [\"union\", \"except\", \"intersect\"]:\n partials = []\n for part_sel in d.get(\"selects\", []):\n t = part_sel.get(\"type\")\n if t == \"grouping\":\n grouping = get_or_403(part_sel, \"grouping\")\n if isinstance(grouping, dict):\n partials.append(parse_select(grouping))\n elif isinstance(grouping, list):\n partials = map(parse_select, grouping)\n else:\n APIError(\n \"Cannot handle grouping type. Dictionary or list expected.\"\n )\n elif t == \"select\":\n partials.append(parse_select(part_sel))\n else:\n raise APIError(\"Unknown select type: \" + t)\n query = CompoundSelect(util.symbol(keyword), *partials)\n else:\n kwargs = dict(distinct=distinct)\n if \"fields\" in d and d[\"fields\"]:\n L = []\n for field in d[\"fields\"]:\n col = parse_expression(field)\n if \"as\" in field:\n col.label(read_pgid(field[\"as\"]))\n L.append(col)\n if \"from\" in d:\n kwargs[\"from_obj\"] = parse_from_item(get_or_403(d, \"from\"))\n else:\n kwargs[\"from_obj\"] = []\n if not L:\n L = \"*\"\n kwargs[\"columns\"] = L\n query = select(**kwargs)\n\n # [ WHERE condition ]\n if d.get(\"where\", False):\n query = query.where(parse_condition(d[\"where\"]))\n\n if \"group_by\" in d:\n query = query.group_by(*[parse_expression(f) for f in d[\"group_by\"]])\n\n if \"having\" in d:\n query.having([parse_condition(f) for f in d[\"having\"]])\n\n if \"select\" in d:\n for constraint in d[\"select\"]:\n type = get_or_403(constraint, \"type\")\n subquery = parse_select(get_or_403(constraint, \"query\"))\n if type.lower() == \"union\":\n query.union(subquery)\n elif type.lower() == \"intersect\":\n query.intersect(subquery)\n elif type.lower() == \"except\":\n query.except_(subquery)\n if \"order_by\" in d:\n for ob in d[\"order_by\"]:\n expr = parse_expression(ob)\n if isinstance(ob, dict):\n desc = ob.get(\"ordering\", \"asc\").lower() == \"desc\"\n if desc:\n expr = expr.desc()\n query = query.order_by(expr)\n\n if \"limit\" in d:\n if isinstance(d[\"limit\"], int) or d[\"limit\"].isdigit():\n query = query.limit(int(d[\"limit\"]))\n else:\n raise APIError(\"Invalid LIMIT: Expected a digit\")\n\n if \"offset\" in d:\n if isinstance(d[\"offset\"], int) or d[\"offset\"].isdigit():\n query = query.offset(int(d[\"offset\"]))\n else:\n raise APIError(\"Invalid LIMIT: Expected a digit\")\n return query\n\n\ndef parse_from_item(d):\n \"\"\"\n Defintion of a from_item according to \n http://www.postgresql.org/docs/9.3/static/sql-select.html\n \n return: A from_item string with checked psql qualifiers.\n \n Not implemented:\n with_query_name [ [ AS ] alias [ ( column_alias 
[, ...] ) ] ]\n [ LATERAL ] function_name ( [ argument [, ...] ] ) [ AS ] alias [ ( column_alias [, ...] | column_definition [, ...] ) ]\n [ LATERAL ] function_name ( [ argument [, ...] ] ) AS ( column_definition [, ...] )\n \"\"\"\n # TODO: If 'type' is not set assume just a table name is present\n if isinstance(d, str):\n d = {\"type\": \"table\", \"table\": d}\n if isinstance(d, list):\n return [parse_from_item(f) for f in d]\n dtype = get_or_403(d, \"type\")\n if dtype == \"table\":\n schema_name = read_pgid(d[\"schema\"]) if \"schema\" in d else None\n only = d.get(\"only\", False)\n ext_name = table_name = read_pgid(get_or_403(d, \"table\"))\n tkwargs = dict(autoload=True)\n if schema_name:\n ext_name = schema_name + \".\" + ext_name\n tkwargs[\"schema\"] = d[\"schema\"]\n if ext_name in __PARSER_META.tables:\n item = __PARSER_META.tables[ext_name]\n else:\n try:\n item = Table(d[\"table\"], __PARSER_META, **tkwargs)\n except sa.exc.NoSuchTableError as e:\n raise APIError(\"Table {table} not found\".format(table=ext_name))\n\n engine = _get_engine()\n conn = engine.connect()\n exists = engine.dialect.has_table(conn, item.name, item.schema)\n conn.close()\n if not exists:\n raise APIError(\"Table not found: \" + str(item), status=400)\n elif dtype == \"select\":\n item = parse_select(d)\n elif dtype == \"join\":\n left = parse_from_item(get_or_403(d, \"left\"))\n right = parse_from_item(get_or_403(d, \"right\"))\n is_outer = d.get(\"is_outer\", False)\n full = d.get(\"is_full\", False)\n on_clause = None\n if \"on\" in d:\n on_clause = parse_condition(d[\"on\"])\n item = left.join(right, onclause=on_clause, isouter=is_outer, full=full)\n else:\n raise APIError(\"Unknown from-item: \" + dtype)\n\n if \"alias\" in d:\n item = item.alias(read_pgid(d[\"alias\"]))\n return item\n\n\n__PARSER_META = MetaData(bind=_get_engine())\n\n\ndef load_table_from_metadata(table_name, schema_name=None):\n ext_name = table_name\n if schema_name:\n ext_name = schema_name + \".\" + ext_name\n if ext_name and ext_name in __PARSER_META.tables:\n return __PARSER_META.tables[ext_name]\n else:\n if _get_engine().dialect.has_table(\n _get_engine().connect(), table_name, schema=schema_name\n ):\n return Table(table_name, __PARSER_META, autoload=True, schema=schema_name)\n\n\ndef parse_column(d, mapper):\n name = get_or_403(d, \"column\")\n is_literal = parse_single(d.get(\"is_literal\", False), bool)\n table_name = d.get(\"table\")\n table = None\n if table_name:\n table_name = read_pgid(table_name)\n if mapper is None:\n mapper = dict()\n do_map = lambda x: mapper.get(x, x)\n if \"schema\" in d:\n schema_name = read_pgid(do_map(d[\"schema\"]))\n else:\n schema_name = None\n table = load_table_from_metadata(table_name, schema_name=schema_name)\n if table is not None and name in table.c:\n col = table.c[name]\n if isinstance(col.type, INTERVAL):\n col.type = Interval(col.type)\n return col\n else:\n if is_literal:\n return literal_column(name)\n else:\n if table_name is not None:\n return literal_column(table_name + \".\" + name)\n else:\n return column(name)\n\n\ndef parse_type(dt_string, **kwargs):\n\n if isinstance(dt_string, dict):\n dt = parse_type(\n get_or_403(dt_string, \"datatype\"), **dt_string.get(\"kwargs\", {})\n )\n return dt\n else:\n # Are you an array?\n dtarr_expression = r\"(?P<dtname>[A-z_]+)\\s*\\[\\]\"\n arr_match = re.match(dtarr_expression, dt_string)\n if arr_match:\n is_array = True\n dt_string = arr_match.groups()[0]\n dt, autoincrement = parse_type(dt_string)\n return sa.ARRAY(dt), 
autoincrement\n\n # Is the datatypestring of form NAME(NUMBER)?\n dt_expression = r\"(?P<dtname>[A-z_]+)\\s*\\((?P<cardinality>.*(,.*)?)\\)\"\n match = re.match(dt_expression, dt_string)\n if match:\n dt_string = match.groups()[0]\n if dt_string.lower() == \"geometry\":\n return geoalchemy2.Geometry(geometry_type=match.groups()[1]), False\n else:\n dt_cardinality = map(int, match.groups()[1].replace(\" \", \"\").split(\",\"))\n dt, autoincrement = parse_type(dt_string)\n return dt(*dt_cardinality, **kwargs), autoincrement\n\n # So it's a plain type\n autoincrement = False\n\n dt_string = dt_string.lower()\n\n if dt_string in (\"int\", \"integer\"):\n dt = sa.types.INTEGER\n elif dt_string in (\"bigint\", \"biginteger\"):\n dt = sa.types.BigInteger\n elif dt_string in (\"bit\",):\n dt = sa.types.Binary\n elif dt_string in (\"boolean\", \"bool\"):\n dt = sa.types.Boolean\n elif dt_string in (\"char\",):\n dt = sqltypes.CHAR\n elif dt_string in (\"date\",):\n dt = sqltypes.Date\n elif dt_string in (\"datetime\",):\n dt = sqltypes.DateTime\n elif dt_string in (\"timestamp\", \"timestamp without time zone\"):\n dt = sqltypes.TIMESTAMP\n elif dt_string in (\"time\", \"time without time zone\"):\n dt = sqltypes.TIME\n elif dt_string in (\"float\"):\n dt = sqltypes.FLOAT\n elif dt_string in (\"decimal\"):\n dt = sqltypes.DECIMAL\n elif dt_string in (\"interval\",):\n dt = sqltypes.Interval\n elif dt_string in (\"json\",):\n dt = sqltypes.JSON\n elif dt_string in (\"nchar\",):\n dt = sqltypes.NCHAR\n elif dt_string in (\"numerical\", \"numeric\"):\n dt = sa.types.Numeric\n elif dt_string in [\"varchar\", \"character varying\"]:\n dt = sqltypes.VARCHAR\n elif dt_string in (\"real\",):\n dt = sqltypes.REAL\n elif dt_string in (\"smallint\",):\n dt = sqltypes.SMALLINT\n elif hasattr(geoalchemy2, dt_string):\n dt = getattr(geoalchemy2, dt_string)\n elif hasattr(sqltypes, dt_string.upper()):\n dt = getattr(sqltypes, dt_string.upper())\n elif dt_string == \"bigserial\":\n dt = sa.types.BigInteger\n autoincrement = True\n else:\n raise APIError(\"Unknown type (%s).\" % dt_string)\n return dt, autoincrement\n\n\ndef parse_expression(d, mapper=None, allow_untyped_dicts=False, escape_quotes=True):\n # TODO: Implement\n if isinstance(d, dict):\n if allow_untyped_dicts and \"type\" not in d:\n return d\n dtype = get_or_403(d, \"type\")\n if dtype == \"column\":\n return parse_column(d, mapper)\n if dtype == \"grouping\":\n grouping = get_or_403(d, \"grouping\")\n if isinstance(grouping, list):\n return [parse_expression(e) for e in grouping]\n else:\n return parse_expression(grouping)\n if dtype == \"operator\":\n return parse_operator(d)\n if dtype == \"modifier\":\n return parse_modifier(d)\n if dtype == \"function\":\n return parse_function(d)\n if dtype == \"slice\":\n return parse_slice(d)\n if dtype == \"star\":\n return \"*\"\n if dtype == \"value\":\n if \"value\" in d:\n if \"datatype\" in d:\n dt = d[\"datatype\"]\n if dt == \"Decimal\":\n return decimal.Decimal(get_or_403(d, \"value\"))\n elif dt == \"date\":\n return dateutil.parser.parse(get_or_403(d, \"value\")).date()\n elif dt == \"datetime\":\n return dateutil.parser.parse(get_or_403(d, \"value\"))\n elif dt == \"time\":\n return dateutil.parser.parse(get_or_403(d, \"value\")).time()\n return read_pgvalue(get_or_403(d, \"value\"))\n else:\n return None\n if dtype == \"label\":\n return parse_label(d)\n if dtype == \"sequence\":\n schema = read_pgid(d[\"schema\"]) if \"schema\" in d else DEFAULT_SCHEMA\n s = '\"%s\".\"%s\"' % (schema, 
get_or_403(d, \"sequence\"))\n return Sequence(get_or_403(d, \"sequence\"), schema=schema)\n if dtype == \"select\":\n return parse_select(d)\n if dtype == \"cast\":\n expr = parse_expression(get_or_403(d, \"source\"))\n t, _ = parse_type(get_or_403(d, \"as\"))\n return cast(expr, t)\n else:\n raise APIError(\"Unknown expression type: \" + dtype)\n if isinstance(d, list):\n return [\n parse_expression(\n x, allow_untyped_dicts=allow_untyped_dicts, escape_quotes=escape_quotes\n )\n for x in d\n ]\n if isinstance(d, str):\n if escape_quotes:\n return d.replace('\"', \"\")\n else:\n return d\n return d\n\n\ndef parse_label(d):\n element = parse_expression(get_or_403(d, \"element\"))\n if not isinstance(element, sa.sql.expression.ClauseElement):\n element = sa.literal(element)\n return element.label(get_or_403(d, \"label\"))\n\n\ndef parse_slice(d):\n kwargs = {\"step\": 1}\n if \"start\" in d:\n kwargs[\"start\"] = d[\"start\"]\n if \"stop\" in d:\n kwargs[\"stop\"] = d[\"stop\"]\n return Slice(**kwargs)\n\n\ndef _unpack_clauses(clauses):\n if isinstance(clauses, list):\n clean_clauses = []\n for clause in clauses:\n if isinstance(clause, list):\n clean_clauses += list(map(_unpack_clauses, clause))\n else:\n clean_clauses.append(clause)\n clauses = {\n \"type\": \"operator\",\n \"operator\": \"AND\",\n \"operands\": list(map(parse_expression, clean_clauses)),\n }\n return clauses\n\n\ndef parse_condition(dl):\n clean_dl = _unpack_clauses(dl)\n return parse_expression(clean_dl)\n\n\ndef parse_operator(d):\n query = parse_sqla_operator(\n get_or_403(d, \"operator\"),\n *list(map(parse_expression, get_or_403(d, \"operands\")))\n )\n return query\n\n\ndef parse_modifier(d):\n query = parse_sqla_modifier(\n get_or_403(d, \"operator\"),\n *list(map(parse_expression, get_or_403(d, \"operands\")))\n )\n return query\n\n\ndef parse_function(d):\n fname = get_or_403(d, \"function\")\n\n operand_struc = get_or_403(d, \"operands\")\n if isinstance(operand_struc, list):\n operands = list(map(parse_expression, operand_struc))\n else:\n if (\n isinstance(operand_struc, dict)\n and operand_struc.get(\"type\", None) == \"grouping\"\n ):\n operands = parse_expression(operand_struc)\n else:\n operands = [parse_expression(operand_struc)]\n\n if fname == \"+\":\n if len(operands) != 2:\n raise APIError(\n \"Wrong number of arguments for function %s. Expected 2. 
Got %d\"\n % (fname, len(operands))\n )\n x, y = operands\n return x + y\n else:\n if fname == \"nextval\":\n return func.next_value(*operands)\n else:\n function = getattr(func, fname)\n return function(*operands)\n\n\ndef parse_scolumnd_from_columnd(schema, table, name, column_description):\n # Migrate Postgres to Python Structures\n data_type = column_description.get(\"data_type\")\n size = column_description.get(\"character_maximum_length\")\n if size is not None and data_type is not None:\n data_type += \"(\" + str(size) + \")\"\n\n notnull = column_description.get(\"is_nullable\", False)\n\n return {\n \"column_name\": name,\n \"not_null\": notnull,\n \"data_type\": data_type,\n \"new_name\": column_description.get(\"new_name\"),\n \"c_schema\": schema,\n \"c_table\": table,\n }\n\n\ndef parse_sconstd_from_constd(schema, table, name_const, constraint_description):\n defi = constraint_description.get(\"definition\")\n return {\n \"action\": None, # {ADD, DROP}\n \"constraint_type\": constraint_description.get(\n \"constraint_typ\"\n ), # {FOREIGN KEY, PRIMARY KEY, UNIQUE, CHECK}\n \"constraint_name\": name_const,\n \"constraint_parameter\": constraint_description.get(\"definition\")\n .split(\"(\")[1]\n .split(\")\")[0],\n # Things in Brackets, e.g. name of column\n \"reference_table\": defi.split(\"REFERENCES \")[1].split(\"(\")[2]\n if \"REFERENCES\" in defi\n else None,\n \"reference_column\": defi.split(\"(\")[2].split(\")\")[1]\n if \"REFERENCES\" in defi\n else None,\n \"c_schema\": schema,\n \"c_table\": table,\n }\n\n\ndef replace_None_with_NULL(dictonary):\n # Replacing None with null for Database\n for key, value in dictonary.items():\n if value is None:\n dictonary[key] = \"NULL\"\n\n return dictonary\n\n\ndef split(string, seperator):\n if string is None:\n return None\n else:\n return str(string).split(seperator)\n\n\ndef replace(string, occuring_symb, replace_symb):\n if string is None:\n return None\n else:\n return str(string).replace(occuring_symb, replace_symb)\n\n\ndef alchemyencoder(obj):\n \"\"\"JSON encoder function for SQLAlchemy special classes.\"\"\"\n if isinstance(obj, datetime.date):\n return obj.isoformat()\n elif isinstance(obj, decimal.Decimal):\n return float(obj)\n\n\nsql_operators = {\n \"EQUALS\": \"=\",\n \"GREATER\": \">\",\n \"LOWER\": \"<\",\n \"NOTEQUAL\": \"!=\",\n \"NOTGREATER\": \"<=\",\n \"NOTLOWER\": \">=\",\n \"=\": \"=\",\n \">\": \">\",\n \"<\": \"<\",\n \"!=\": \"!=\",\n \"<>\": \"!=\",\n \"<=\": \"<=\",\n \">=\": \">=\",\n}\n\n\ndef parse_sql_operator(key: str) -> str:\n return sql_operators.get(key)\n\n\ndef parse_sqla_operator(raw_key, *operands):\n key = raw_key.lower().strip()\n if not operands:\n raise APIError(\"Missing arguments for '%s'.\" % (key))\n if key in [\"and\"]:\n query = and_(*operands)\n return query\n elif key in [\"or\"]:\n query = or_(*operands)\n return query\n elif key in [\"not\"]:\n x = operands[0]\n return not_(parse_condition(x))\n else:\n if len(operands) != 2:\n raise APIError(\n \"Wrong number of arguments for '%s'. 
Expected: 2 Got: %s\"\n % (key, len(operands))\n )\n x, y = operands\n if key in [\"equals\", \"=\"]:\n return x == y\n if key in [\"greater\", \">\"]:\n return x > y\n if key in [\"lower\", \"<\"]:\n return x < y\n if key in [\"notequal\", \"<>\", \"!=\"]:\n return x != y\n if key in [\"notgreater\", \"<=\"]:\n return x <= y\n if key in [\"notlower\", \">=\"]:\n return x >= y\n if key in [\"add\", \"+\"]:\n return x + y\n if key in [\"substract\", \"-\"]:\n return x - y\n if key in [\"multiply\", \"*\"]:\n return x * y\n if key in [\"divide\", \"/\"]:\n return x / y\n if key in [\"concatenate\", \"||\"]:\n return fun.concat(x, y)\n if key in [\"is not\"]:\n return x.isnot(y)\n if key in [\"<->\"]:\n return x.distance_centroid(y)\n if key in [\"getitem\"]:\n if isinstance(y, Slice):\n return x[parse_single(y.start, int) : parse_single(y.stop, int)]\n else:\n return x[read_pgid(y)]\n if key in [\"in\"]:\n return x.in_(y)\n\n raise APIError(\"Operator '%s' not supported\" % key)\n\n\ndef parse_sqla_modifier(raw_key, *operands):\n key = raw_key.lower().strip()\n if not operands:\n raise APIError(\"Missing arguments for '%s'.\" % key)\n\n if len(operands) != 1:\n raise APIError(\n \"Wrong number of arguments for '%s'. Expected: 1 Got: %s\"\n % (key, len(operands))\n )\n x = operands[0]\n if key in [\"asc\"]:\n return x.asc()\n if key in [\"desc\"]:\n return x.desc()\n raise APIError(\"Operator %s not supported\" % key)\n", "path": "api/parser.py" } ]
[ { "content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n column,\n func,\n literal_column,\n or_,\n select,\n util,\n cast,\n)\nimport dateutil\nfrom sqlalchemy.dialects.postgresql.base import INTERVAL\nfrom sqlalchemy.schema import Sequence\nfrom sqlalchemy.sql import functions as fun\nfrom sqlalchemy.sql.annotation import Annotated\nfrom sqlalchemy.sql.elements import Slice\nfrom sqlalchemy.sql.expression import ColumnClause, CompoundSelect\nfrom sqlalchemy.sql.sqltypes import Interval, _AbstractInterval\n\nfrom api.connection import _get_engine\nfrom api.error import APIError, APIKeyError\nfrom api.connection import _get_engine\nfrom sqlalchemy.sql.sqltypes import Interval, _AbstractInterval\nfrom sqlalchemy.dialects.postgresql.base import INTERVAL\nfrom sqlalchemy import types as sqltypes\n\nfrom . import DEFAULT_SCHEMA\n\n__KNOWN_TABLES = {}\n\npgsql_qualifier = re.compile(r\"^[\\w\\d_\\.]+$\")\n\n\ndef get_or_403(dictionary, key):\n try:\n return dictionary[key]\n except KeyError:\n raise APIKeyError(dictionary, key)\n\n\ndef parse_single(x, caster):\n try:\n return caster(x)\n except ValueError:\n raise APIError(\"Could not parse %s as %s\" % (x, caster))\n\n\ndef is_pg_qual(x):\n if not isinstance(x, str):\n return False\n return pgsql_qualifier.search(x)\n\n\ndef read_pgvalue(x):\n # TODO: Implement check for valid values\n if x is None:\n return \"null\"\n return x\n\n\nclass ValidationError(Exception):\n def __init__(self, message, value):\n self.message = message\n self.value = value\n\n\ndef read_bool(s):\n if isinstance(s, bool):\n return s\n if s.lower() in [\"true\", \"false\"]:\n return s.lower() == \"true\"\n elif s.lower() in [\"yes\", \"no\"]:\n return s.lower() == \"true\"\n else:\n raise APIError(\"Invalid value in binary field\", s)\n\n\ndef read_pgid(s):\n if is_pg_qual(s):\n return s\n raise APIError(\"Invalid identifier: '%s'\" % s)\n\n\ndef set_meta_info(method, user, message=None):\n val_dict = {}\n val_dict[\"_user\"] = user # TODO: Add user handling\n val_dict[\"_message\"] = message\n val_dict[\"_type\"] = method\n return val_dict\n\n\ndef parse_insert(d, context, message=None, mapper=None):\n table = Table(\n read_pgid(get_or_403(d, \"table\")),\n MetaData(bind=_get_engine()),\n autoload=True,\n schema=read_pgid(get_or_403(d, \"schema\")),\n )\n field_strings = []\n for field in d.get(\"fields\", []):\n if not (\n (isinstance(field, dict) and \"type\" in field and field[\"type\"] == \"column\")\n or isinstance(field, str)\n ):\n raise APIError(\"Only pure column expressions are allowed in insert\")\n field_strings.append(parse_expression(field))\n\n query = table.insert()\n\n if not \"method\" in d:\n d[\"method\"] = \"values\"\n if d[\"method\"] == \"values\":\n if field_strings:\n raw_values = get_or_403(d, \"values\")\n if not isinstance(raw_values, list):\n raise APIError(\"{} is not a list\".format(raw_values))\n values = (\n zip(\n field_strings,\n parse_expression(x, allow_untyped_dicts=True, escape_quotes=False),\n )\n for x in raw_values\n )\n else:\n values = get_or_403(d, \"values\")\n\n def clear_meta(vals):\n val_dict = dict(vals)\n # make sure meta fields are not compromised\n if context[\"user\"].is_anonymous:\n username = \"Anonymous\"\n else:\n username = context[\"user\"].name\n 
val_dict.update(set_meta_info(\"insert\", username, message))\n return val_dict\n\n values = list(map(clear_meta, values))\n\n query = query.values(values)\n elif d[\"method\"] == \"select\":\n values = parse_select(d[\"values\"])\n query = query.from_select(field_strings, values)\n else:\n raise APIError(\"Unknown insert method: \" + str(d[\"method\"]))\n\n if \"returning\" in d:\n return_clauses = [parse_expression(x, mapper) for x in d[\"returning\"]]\n query = query.returning(*return_clauses)\n\n return query, values\n\n\ndef parse_select(d):\n \"\"\"\n Defintion of a select query according to\n http://www.postgresql.org/docs/9.3/static/sql-select.html\n\n not implemented:\n [ WITH [ RECURSIVE ] with_query [, ...] ]\n [ WINDOW window_name AS ( window_definition ) [, ...] ]\n [ FOR { UPDATE | NO KEY UPDATE | SHARE | KEY SHARE } [ OF table_name [, ...] ] [ NOWAIT ] [...] ]\n \"\"\"\n distinct = d.get(\"distinct\", False)\n\n L = None\n\n keyword = d.get(\"keyword\")\n\n if keyword and keyword.lower() in [\"union\", \"except\", \"intersect\"]:\n partials = []\n for part_sel in d.get(\"selects\", []):\n t = part_sel.get(\"type\")\n if t == \"grouping\":\n grouping = get_or_403(part_sel, \"grouping\")\n if isinstance(grouping, dict):\n partials.append(parse_select(grouping))\n elif isinstance(grouping, list):\n partials = map(parse_select, grouping)\n else:\n APIError(\n \"Cannot handle grouping type. Dictionary or list expected.\"\n )\n elif t == \"select\":\n partials.append(parse_select(part_sel))\n else:\n raise APIError(\"Unknown select type: \" + t)\n query = CompoundSelect(util.symbol(keyword), *partials)\n else:\n kwargs = dict(distinct=distinct)\n if \"fields\" in d and d[\"fields\"]:\n L = []\n for field in d[\"fields\"]:\n col = parse_expression(field)\n if \"as\" in field:\n col.label(read_pgid(field[\"as\"]))\n L.append(col)\n if \"from\" in d:\n kwargs[\"from_obj\"] = parse_from_item(get_or_403(d, \"from\"))\n else:\n kwargs[\"from_obj\"] = []\n if not L:\n L = \"*\"\n kwargs[\"columns\"] = L\n query = select(**kwargs)\n\n # [ WHERE condition ]\n if d.get(\"where\", False):\n query = query.where(parse_condition(d[\"where\"]))\n\n if \"group_by\" in d:\n query = query.group_by(*[parse_expression(f) for f in d[\"group_by\"]])\n\n if \"having\" in d:\n query.having([parse_condition(f) for f in d[\"having\"]])\n\n if \"select\" in d:\n for constraint in d[\"select\"]:\n type = get_or_403(constraint, \"type\")\n subquery = parse_select(get_or_403(constraint, \"query\"))\n if type.lower() == \"union\":\n query.union(subquery)\n elif type.lower() == \"intersect\":\n query.intersect(subquery)\n elif type.lower() == \"except\":\n query.except_(subquery)\n if \"order_by\" in d:\n for ob in d[\"order_by\"]:\n expr = parse_expression(ob)\n if isinstance(ob, dict):\n desc = ob.get(\"ordering\", \"asc\").lower() == \"desc\"\n if desc:\n expr = expr.desc()\n query = query.order_by(expr)\n\n if \"limit\" in d:\n if isinstance(d[\"limit\"], int) or d[\"limit\"].isdigit():\n query = query.limit(int(d[\"limit\"]))\n else:\n raise APIError(\"Invalid LIMIT: Expected a digit\")\n\n if \"offset\" in d:\n if isinstance(d[\"offset\"], int) or d[\"offset\"].isdigit():\n query = query.offset(int(d[\"offset\"]))\n else:\n raise APIError(\"Invalid LIMIT: Expected a digit\")\n return query\n\n\ndef parse_from_item(d):\n \"\"\"\n Defintion of a from_item according to \n http://www.postgresql.org/docs/9.3/static/sql-select.html\n \n return: A from_item string with checked psql qualifiers.\n \n Not 
implemented:\n with_query_name [ [ AS ] alias [ ( column_alias [, ...] ) ] ]\n [ LATERAL ] function_name ( [ argument [, ...] ] ) [ AS ] alias [ ( column_alias [, ...] | column_definition [, ...] ) ]\n [ LATERAL ] function_name ( [ argument [, ...] ] ) AS ( column_definition [, ...] )\n \"\"\"\n # TODO: If 'type' is not set assume just a table name is present\n if isinstance(d, str):\n d = {\"type\": \"table\", \"table\": d}\n if isinstance(d, list):\n return [parse_from_item(f) for f in d]\n dtype = get_or_403(d, \"type\")\n if dtype == \"table\":\n schema_name = read_pgid(d[\"schema\"]) if \"schema\" in d else None\n only = d.get(\"only\", False)\n ext_name = table_name = read_pgid(get_or_403(d, \"table\"))\n tkwargs = dict(autoload=True)\n if schema_name:\n ext_name = schema_name + \".\" + ext_name\n tkwargs[\"schema\"] = d[\"schema\"]\n if ext_name in __PARSER_META.tables:\n item = __PARSER_META.tables[ext_name]\n else:\n try:\n item = Table(d[\"table\"], __PARSER_META, **tkwargs)\n except sa.exc.NoSuchTableError as e:\n raise APIError(\"Table {table} not found\".format(table=ext_name))\n\n engine = _get_engine()\n conn = engine.connect()\n exists = engine.dialect.has_table(conn, item.name, item.schema)\n conn.close()\n if not exists:\n raise APIError(\"Table not found: \" + str(item), status=400)\n elif dtype == \"select\":\n item = parse_select(d)\n elif dtype == \"join\":\n left = parse_from_item(get_or_403(d, \"left\"))\n right = parse_from_item(get_or_403(d, \"right\"))\n is_outer = d.get(\"is_outer\", False)\n full = d.get(\"is_full\", False)\n on_clause = None\n if \"on\" in d:\n on_clause = parse_condition(d[\"on\"])\n item = left.join(right, onclause=on_clause, isouter=is_outer, full=full)\n else:\n raise APIError(\"Unknown from-item: \" + dtype)\n\n if \"alias\" in d:\n item = item.alias(read_pgid(d[\"alias\"]))\n return item\n\n\n__PARSER_META = MetaData(bind=_get_engine())\n\n\ndef load_table_from_metadata(table_name, schema_name=None):\n ext_name = table_name\n if schema_name:\n ext_name = schema_name + \".\" + ext_name\n if ext_name and ext_name in __PARSER_META.tables:\n return __PARSER_META.tables[ext_name]\n else:\n if _get_engine().dialect.has_table(\n _get_engine().connect(), table_name, schema=schema_name\n ):\n return Table(table_name, __PARSER_META, autoload=True, schema=schema_name)\n\n\ndef parse_column(d, mapper):\n name = get_or_403(d, \"column\")\n is_literal = parse_single(d.get(\"is_literal\", False), bool)\n table_name = d.get(\"table\")\n table = None\n if table_name:\n table_name = read_pgid(table_name)\n if mapper is None:\n mapper = dict()\n do_map = lambda x: mapper.get(x, x)\n if \"schema\" in d:\n schema_name = read_pgid(do_map(d[\"schema\"]))\n else:\n schema_name = None\n table = load_table_from_metadata(table_name, schema_name=schema_name)\n if table is not None and name in table.c:\n col = table.c[name]\n if isinstance(col.type, INTERVAL):\n col.type = Interval(col.type)\n return col\n else:\n if is_literal:\n return literal_column(name)\n else:\n return column(name)\n\n\ndef parse_type(dt_string, **kwargs):\n\n if isinstance(dt_string, dict):\n dt = parse_type(\n get_or_403(dt_string, \"datatype\"), **dt_string.get(\"kwargs\", {})\n )\n return dt\n else:\n # Are you an array?\n dtarr_expression = r\"(?P<dtname>[A-z_]+)\\s*\\[\\]\"\n arr_match = re.match(dtarr_expression, dt_string)\n if arr_match:\n is_array = True\n dt_string = arr_match.groups()[0]\n dt, autoincrement = parse_type(dt_string)\n return sa.ARRAY(dt), autoincrement\n\n # Is the 
datatypestring of form NAME(NUMBER)?\n dt_expression = r\"(?P<dtname>[A-z_]+)\\s*\\((?P<cardinality>.*(,.*)?)\\)\"\n match = re.match(dt_expression, dt_string)\n if match:\n dt_string = match.groups()[0]\n if dt_string.lower() == \"geometry\":\n return geoalchemy2.Geometry(geometry_type=match.groups()[1]), False\n else:\n dt_cardinality = map(int, match.groups()[1].replace(\" \", \"\").split(\",\"))\n dt, autoincrement = parse_type(dt_string)\n return dt(*dt_cardinality, **kwargs), autoincrement\n\n # So it's a plain type\n autoincrement = False\n\n dt_string = dt_string.lower()\n\n if dt_string in (\"int\", \"integer\"):\n dt = sa.types.INTEGER\n elif dt_string in (\"bigint\", \"biginteger\"):\n dt = sa.types.BigInteger\n elif dt_string in (\"bit\",):\n dt = sa.types.Binary\n elif dt_string in (\"boolean\", \"bool\"):\n dt = sa.types.Boolean\n elif dt_string in (\"char\",):\n dt = sqltypes.CHAR\n elif dt_string in (\"date\",):\n dt = sqltypes.Date\n elif dt_string in (\"datetime\",):\n dt = sqltypes.DateTime\n elif dt_string in (\"timestamp\", \"timestamp without time zone\"):\n dt = sqltypes.TIMESTAMP\n elif dt_string in (\"time\", \"time without time zone\"):\n dt = sqltypes.TIME\n elif dt_string in (\"float\"):\n dt = sqltypes.FLOAT\n elif dt_string in (\"decimal\"):\n dt = sqltypes.DECIMAL\n elif dt_string in (\"interval\",):\n dt = sqltypes.Interval\n elif dt_string in (\"json\",):\n dt = sqltypes.JSON\n elif dt_string in (\"nchar\",):\n dt = sqltypes.NCHAR\n elif dt_string in (\"numerical\", \"numeric\"):\n dt = sa.types.Numeric\n elif dt_string in [\"varchar\", \"character varying\"]:\n dt = sqltypes.VARCHAR\n elif dt_string in (\"real\",):\n dt = sqltypes.REAL\n elif dt_string in (\"smallint\",):\n dt = sqltypes.SMALLINT\n elif hasattr(geoalchemy2, dt_string):\n dt = getattr(geoalchemy2, dt_string)\n elif hasattr(sqltypes, dt_string.upper()):\n dt = getattr(sqltypes, dt_string.upper())\n elif dt_string == \"bigserial\":\n dt = sa.types.BigInteger\n autoincrement = True\n else:\n raise APIError(\"Unknown type (%s).\" % dt_string)\n return dt, autoincrement\n\n\ndef parse_expression(d, mapper=None, allow_untyped_dicts=False, escape_quotes=True):\n # TODO: Implement\n if isinstance(d, dict):\n if allow_untyped_dicts and \"type\" not in d:\n return d\n dtype = get_or_403(d, \"type\")\n if dtype == \"column\":\n return parse_column(d, mapper)\n if dtype == \"grouping\":\n grouping = get_or_403(d, \"grouping\")\n if isinstance(grouping, list):\n return [parse_expression(e) for e in grouping]\n else:\n return parse_expression(grouping)\n if dtype == \"operator\":\n return parse_operator(d)\n if dtype == \"modifier\":\n return parse_modifier(d)\n if dtype == \"function\":\n return parse_function(d)\n if dtype == \"slice\":\n return parse_slice(d)\n if dtype == \"star\":\n return \"*\"\n if dtype == \"value\":\n if \"value\" in d:\n if \"datatype\" in d:\n dt = d[\"datatype\"]\n if dt == \"Decimal\":\n return decimal.Decimal(get_or_403(d, \"value\"))\n elif dt == \"date\":\n return dateutil.parser.parse(get_or_403(d, \"value\")).date()\n elif dt == \"datetime\":\n return dateutil.parser.parse(get_or_403(d, \"value\"))\n elif dt == \"time\":\n return dateutil.parser.parse(get_or_403(d, \"value\")).time()\n return read_pgvalue(get_or_403(d, \"value\"))\n else:\n return None\n if dtype == \"label\":\n return parse_label(d)\n if dtype == \"sequence\":\n schema = read_pgid(d[\"schema\"]) if \"schema\" in d else DEFAULT_SCHEMA\n s = '\"%s\".\"%s\"' % (schema, get_or_403(d, \"sequence\"))\n 
return Sequence(get_or_403(d, \"sequence\"), schema=schema)\n if dtype == \"select\":\n return parse_select(d)\n if dtype == \"cast\":\n expr = parse_expression(get_or_403(d, \"source\"))\n t, _ = parse_type(get_or_403(d, \"as\"))\n return cast(expr, t)\n else:\n raise APIError(\"Unknown expression type: \" + dtype)\n if isinstance(d, list):\n return [\n parse_expression(\n x, allow_untyped_dicts=allow_untyped_dicts, escape_quotes=escape_quotes\n )\n for x in d\n ]\n if isinstance(d, str):\n if escape_quotes:\n return d.replace('\"', \"\")\n else:\n return d\n return d\n\n\ndef parse_label(d):\n element = parse_expression(get_or_403(d, \"element\"))\n if not isinstance(element, sa.sql.expression.ClauseElement):\n element = sa.literal(element)\n return element.label(get_or_403(d, \"label\"))\n\n\ndef parse_slice(d):\n kwargs = {\"step\": 1}\n if \"start\" in d:\n kwargs[\"start\"] = d[\"start\"]\n if \"stop\" in d:\n kwargs[\"stop\"] = d[\"stop\"]\n return Slice(**kwargs)\n\n\ndef _unpack_clauses(clauses):\n if isinstance(clauses, list):\n clean_clauses = []\n for clause in clauses:\n if isinstance(clause, list):\n clean_clauses += list(map(_unpack_clauses, clause))\n else:\n clean_clauses.append(clause)\n clauses = {\n \"type\": \"operator\",\n \"operator\": \"AND\",\n \"operands\": list(map(parse_expression, clean_clauses)),\n }\n return clauses\n\n\ndef parse_condition(dl):\n clean_dl = _unpack_clauses(dl)\n return parse_expression(clean_dl)\n\n\ndef parse_operator(d):\n query = parse_sqla_operator(\n get_or_403(d, \"operator\"),\n *list(map(parse_expression, get_or_403(d, \"operands\")))\n )\n return query\n\n\ndef parse_modifier(d):\n query = parse_sqla_modifier(\n get_or_403(d, \"operator\"),\n *list(map(parse_expression, get_or_403(d, \"operands\")))\n )\n return query\n\n\ndef parse_function(d):\n fname = get_or_403(d, \"function\")\n\n operand_struc = get_or_403(d, \"operands\")\n if isinstance(operand_struc, list):\n operands = list(map(parse_expression, operand_struc))\n else:\n if (\n isinstance(operand_struc, dict)\n and operand_struc.get(\"type\", None) == \"grouping\"\n ):\n operands = parse_expression(operand_struc)\n else:\n operands = [parse_expression(operand_struc)]\n\n if fname == \"+\":\n if len(operands) != 2:\n raise APIError(\n \"Wrong number of arguments for function %s. Expected 2. 
Got %d\"\n % (fname, len(operands))\n )\n x, y = operands\n return x + y\n else:\n if fname == \"nextval\":\n return func.next_value(*operands)\n else:\n function = getattr(func, fname)\n return function(*operands)\n\n\ndef parse_scolumnd_from_columnd(schema, table, name, column_description):\n # Migrate Postgres to Python Structures\n data_type = column_description.get(\"data_type\")\n size = column_description.get(\"character_maximum_length\")\n if size is not None and data_type is not None:\n data_type += \"(\" + str(size) + \")\"\n\n notnull = column_description.get(\"is_nullable\", False)\n\n return {\n \"column_name\": name,\n \"not_null\": notnull,\n \"data_type\": data_type,\n \"new_name\": column_description.get(\"new_name\"),\n \"c_schema\": schema,\n \"c_table\": table,\n }\n\n\ndef parse_sconstd_from_constd(schema, table, name_const, constraint_description):\n defi = constraint_description.get(\"definition\")\n return {\n \"action\": None, # {ADD, DROP}\n \"constraint_type\": constraint_description.get(\n \"constraint_typ\"\n ), # {FOREIGN KEY, PRIMARY KEY, UNIQUE, CHECK}\n \"constraint_name\": name_const,\n \"constraint_parameter\": constraint_description.get(\"definition\")\n .split(\"(\")[1]\n .split(\")\")[0],\n # Things in Brackets, e.g. name of column\n \"reference_table\": defi.split(\"REFERENCES \")[1].split(\"(\")[2]\n if \"REFERENCES\" in defi\n else None,\n \"reference_column\": defi.split(\"(\")[2].split(\")\")[1]\n if \"REFERENCES\" in defi\n else None,\n \"c_schema\": schema,\n \"c_table\": table,\n }\n\n\ndef replace_None_with_NULL(dictonary):\n # Replacing None with null for Database\n for key, value in dictonary.items():\n if value is None:\n dictonary[key] = \"NULL\"\n\n return dictonary\n\n\ndef split(string, seperator):\n if string is None:\n return None\n else:\n return str(string).split(seperator)\n\n\ndef replace(string, occuring_symb, replace_symb):\n if string is None:\n return None\n else:\n return str(string).replace(occuring_symb, replace_symb)\n\n\ndef alchemyencoder(obj):\n \"\"\"JSON encoder function for SQLAlchemy special classes.\"\"\"\n if isinstance(obj, datetime.date):\n return obj.isoformat()\n elif isinstance(obj, decimal.Decimal):\n return float(obj)\n\n\nsql_operators = {\n \"EQUALS\": \"=\",\n \"GREATER\": \">\",\n \"LOWER\": \"<\",\n \"NOTEQUAL\": \"!=\",\n \"NOTGREATER\": \"<=\",\n \"NOTLOWER\": \">=\",\n \"=\": \"=\",\n \">\": \">\",\n \"<\": \"<\",\n \"!=\": \"!=\",\n \"<>\": \"!=\",\n \"<=\": \"<=\",\n \">=\": \">=\",\n}\n\n\ndef parse_sql_operator(key: str) -> str:\n return sql_operators.get(key)\n\n\ndef parse_sqla_operator(raw_key, *operands):\n key = raw_key.lower().strip()\n if not operands:\n raise APIError(\"Missing arguments for '%s'.\" % (key))\n if key in [\"and\"]:\n query = and_(*operands)\n return query\n elif key in [\"or\"]:\n query = or_(*operands)\n return query\n elif key in [\"not\"]:\n x = operands[0]\n return not_(parse_condition(x))\n else:\n if len(operands) != 2:\n raise APIError(\n \"Wrong number of arguments for '%s'. 
Expected: 2 Got: %s\"\n % (key, len(operands))\n )\n x, y = operands\n if key in [\"equals\", \"=\"]:\n return x == y\n if key in [\"greater\", \">\"]:\n return x > y\n if key in [\"lower\", \"<\"]:\n return x < y\n if key in [\"notequal\", \"<>\", \"!=\"]:\n return x != y\n if key in [\"notgreater\", \"<=\"]:\n return x <= y\n if key in [\"notlower\", \">=\"]:\n return x >= y\n if key in [\"add\", \"+\"]:\n return x + y\n if key in [\"substract\", \"-\"]:\n return x - y\n if key in [\"multiply\", \"*\"]:\n return x * y\n if key in [\"divide\", \"/\"]:\n return x / y\n if key in [\"concatenate\", \"||\"]:\n return fun.concat(x, y)\n if key in [\"is not\"]:\n return x.isnot(y)\n if key in [\"<->\"]:\n return x.distance_centroid(y)\n if key in [\"getitem\"]:\n if isinstance(y, Slice):\n return x[parse_single(y.start, int) : parse_single(y.stop, int)]\n else:\n return x[read_pgid(y)]\n if key in [\"in\"]:\n return x.in_(y)\n\n raise APIError(\"Operator '%s' not supported\" % key)\n\n\ndef parse_sqla_modifier(raw_key, *operands):\n key = raw_key.lower().strip()\n if not operands:\n raise APIError(\"Missing arguments for '%s'.\" % key)\n\n if len(operands) != 1:\n raise APIError(\n \"Wrong number of arguments for '%s'. Expected: 1 Got: %s\"\n % (key, len(operands))\n )\n x = operands[0]\n if key in [\"asc\"]:\n return x.asc()\n if key in [\"desc\"]:\n return x.desc()\n raise APIError(\"Operator %s not supported\" % key)\n", "path": "api/parser.py" } ]
diff --git a/api/parser.py b/api/parser.py index 3c76c36ef..8e0f4abba 100644 --- a/api/parser.py +++ b/api/parser.py @@ -98,6 +98,7 @@ def set_meta_info(method, user, message=None): val_dict = {} val_dict["_user"] = user # TODO: Add user handling val_dict["_message"] = message + val_dict["_type"] = method return val_dict
bridgecrewio__checkov-2810
HCL2 parser cannot parse functions with comments interleaved in the arguments.
**Describe the issue**
The HCL2 parser fails to parse a file that contains an expression with a Terraform function call that contains comments interleaved within the arguments.

**Example Value**
A file that contains the following example variable will fail to parse.
```hcl
variable "example" {
  default = function(
    # this comment is fine
    argument1,
    # this comment causes a parsing error
    argument2
    # this comment is fine
  )
}
```
This seems to be a replicated issue in the downstream as well > https://github.com/amplify-education/python-hcl2/issues/95. I have opened a PR to fix this in the bridgecrewio-specific parser > https://github.com/bridgecrewio/python-hcl2/pull/29.

**Question**
Is the bridgecrewio HCL2 Parser intended to be merged upstream? If not, I will implement the change in Amplify's codebase separately.

**An aside**
Checkov is an awesome tool, it makes the jobs of myself and the rest of the Platform/DevOps Engineers on my team so much easier!
[ { "content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.38\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions\",\n \"cachetools\",\n \"cyclonedx-python-lib>=0.11.0,<1.0.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.4\",\n \"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n \"checkov.common.util.templates\": [\n \"*.jinja2\"\n ]\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n ],\n)\n", "path": "setup.py" } ]
[ { "content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.39\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions\",\n \"cachetools\",\n \"cyclonedx-python-lib>=0.11.0,<1.0.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.4\",\n \"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n \"checkov.common.util.templates\": [\n \"*.jinja2\"\n ]\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n ],\n)\n", "path": "setup.py" } ]
diff --git a/Pipfile b/Pipfile index 1f3262d62c..c27ece993d 100644 --- a/Pipfile +++ b/Pipfile @@ -33,7 +33,7 @@ dlint = "*" # # REMINDER: Update "install_requires" deps on setup.py when changing # -bc-python-hcl2 = "==0.3.38" +bc-python-hcl2 = "==0.3.39" deep_merge = "*" tabulate = "*" colorama="*" diff --git a/Pipfile.lock b/Pipfile.lock index d72dfdbf46..150358cf43 100644 --- a/Pipfile.lock +++ b/Pipfile.lock @@ -1,7 +1,7 @@ { "_meta": { "hash": { - "sha256": "169f933b7a713d9a651f2709ecae1a574bc490f5b62bc0c5a9859096b876ca1d" + "sha256": "8ca53ee8b86605dbfd93847c70135a3b8dab3b11c3c632f87efa9678b83d29d7" }, "pipfile-spec": 6, "requires": { @@ -144,11 +144,11 @@ }, "bc-python-hcl2": { "hashes": [ - "sha256:8bccdfd4ac9ec1997f313abef7b130f32e54b5cdb028a3941213141cffd46dee", - "sha256:ac54165081831db2eb25fdb7cd9e0c3c350b677afdd68467dcf295ca7811da6f" + "sha256:24c436b0b8009cc275ff49d1f0b80a6e93c1e378152ad0adad92abfd3e29d0ef", + "sha256:baa5491d0d1497a5c2f07ef2eea2f4a2f3bc1d730b3a7f96dd9bf92ce8c1b586" ], "index": "pypi", - "version": "==0.3.38" + "version": "==0.3.39" }, "beautifulsoup4": { "hashes": [ @@ -160,19 +160,19 @@ }, "boto3": { "hashes": [ - "sha256:013ba57295f05da141e364191dd46f4086e8fe3eb83a3cd09730eeb684ffbab3", - "sha256:1e845aa92b3ad70b954329b98835135c28b3000e322ff8d3fc46a956bdb6e94b" + "sha256:56425debf5f1fd2cf5494d9cb110b2a977453888f071898a12e6ab64bdd41796", + "sha256:b709cb65ffc4e3f78c590145e2dee40758056c9edafb9ee692f67d170855dfc3" ], "index": "pypi", - "version": "==1.21.37" + "version": "==1.21.39" }, "botocore": { "hashes": [ - "sha256:21e164a213beca36033c46026bffa62f2ee2cd2600777271f9a551fb34dba006", - "sha256:70c48c4ae3c2b9ec0ca025385979d01f4c7dae4d9a61c82758d4cf7caa7082cd" + "sha256:94f50a544003918270ba726eb5652b2c31f6cb34accbf25e053ed6ea97ecf1fd", + "sha256:a0883dfe8b81689060af7bb2ca4ce3048b954b25bef4ed712c6760ce3da51485" ], "markers": "python_version >= '3.6'", - "version": "==1.24.37" + "version": "==1.24.39" }, "cached-property": { "hashes": [ @@ -926,11 +926,11 @@ }, "setuptools": { "hashes": [ - "sha256:7999cbd87f1b6e1f33bf47efa368b224bed5e27b5ef2c4d46580186cbcb1a86a", - "sha256:a65e3802053e99fc64c6b3b29c11132943d5b8c8facbcc461157511546510967" + "sha256:26ead7d1f93efc0f8c804d9fafafbe4a44b179580a7105754b245155f9af05a8", + "sha256:47c7b0c0f8fc10eec4cf1e71c6fdadf8decaa74ffa087e68cd1c20db7ad6a592" ], "markers": "python_version >= '3.7'", - "version": "==62.0.0" + "version": "==62.1.0" }, "six": { "hashes": [ @@ -1872,11 +1872,11 @@ }, "virtualenv": { "hashes": [ - "sha256:1e8588f35e8b42c6ec6841a13c5e88239de1e6e4e4cedfd3916b306dc826ec66", - "sha256:8e5b402037287126e81ccde9432b95a8be5b19d36584f64957060a3488c11ca8" + "sha256:e617f16e25b42eb4f6e74096b9c9e37713cf10bf30168fb4a739f3fa8f898a3a", + "sha256:ef589a79795589aada0c1c5b319486797c03b67ac3984c48c669c0e4f50df3a5" ], "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", - "version": "==20.14.0" + "version": "==20.14.1" }, "yarl": { "hashes": [ diff --git a/setup.py b/setup.py index 9a9f9109bd..6128201ffc 100644 --- a/setup.py +++ b/setup.py @@ -33,7 +33,7 @@ ] }, install_requires=[ - "bc-python-hcl2==0.3.38", + "bc-python-hcl2==0.3.39", "cloudsplaining>=0.4.1", "deep_merge", "tabulate",
bridgecrewio__checkov-3151
Terraform parsing error string with escaped backslash at the end **Describe the issue** Checkov crashes if it encounters an escaped backslash (`"\\"`) at the end of a string. **Examples** Minimal example to reproduce the error: ```terraform variable "slash" { default = "\\" } output "slash" { value = var.slash } ``` `terraform validate` sees this configuration as valid, but checkov fails with a parsing error. This only happens when the last character of the string is the escaped backslash, as the parser assumes the closing quotation mark is escaped. Adding any normal character at the end of the string doesn't trigger this error. ```terraform variable "slash" { default = "\\" } ``` **Exception Trace** Relevant traceback ```sh > LOG_LEVEL=DEBUG checkov -d . [...] [MainThread ] [DEBUG] failed while parsing file /workdir/main.tf Traceback (most recent call last): File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/checkov/terraform/parser.py", line 726, in _load_or_die_quietly raw_data = hcl2.load(f) File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 12, in load return loads(file.read()) File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 80, in loads raise ValueError(f"Line has unclosed quote marks: {line}") ValueError: Line has unclosed quote marks: default = "\\" [...] ``` **Desktop (please complete the following information):** - OS: MacOS 12.3.1 (Intel) - Checkov Version: 2.0.1230
[ { "content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.42\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions>=4.1.0\",\n \"cachetools\",\n \"cyclonedx-python-lib>=2.4.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.7\",\n \"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov\": [\"py.typed\"],\n \"checkov.bicep.checks.graph_checks\": [\"*.yaml\"],\n \"checkov.common.util.templates\": [\"*.jinja2\"],\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n \"Typing :: Typed\",\n ],\n)\n", "path": "setup.py" } ]
[ { "content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.44\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions>=4.1.0\",\n \"cachetools\",\n \"cyclonedx-python-lib>=2.4.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.7\",\n \"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov\": [\"py.typed\"],\n \"checkov.bicep.checks.graph_checks\": [\"*.yaml\"],\n \"checkov.common.util.templates\": [\"*.jinja2\"],\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n \"Typing :: Typed\",\n ],\n)\n", "path": "setup.py" } ]
diff --git a/Pipfile b/Pipfile index 7a57559a95..e164b06aed 100644 --- a/Pipfile +++ b/Pipfile @@ -41,7 +41,7 @@ flake8-bugbear = "*" # # REMINDER: Update "install_requires" deps on setup.py when changing # -bc-python-hcl2 = "==0.3.42" +bc-python-hcl2 = "==0.3.44" deep_merge = "*" tabulate = "*" colorama="*" diff --git a/Pipfile.lock b/Pipfile.lock index 5cfd9bed84..f2ea7e5f42 100644 --- a/Pipfile.lock +++ b/Pipfile.lock @@ -1,7 +1,7 @@ { "_meta": { "hash": { - "sha256": "a63095146044c16ab3ae85422a75874c797a0aaa6bd246fdf054d0bcc31fef6f" + "sha256": "f2b84cfe07cdad3a1d23ddb6849034ffac03c489ead842fc690a8bedd32b0a44" }, "pipfile-spec": 6, "requires": { @@ -115,6 +115,7 @@ "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a", "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2" ], + "markers": "python_version >= '3.6'", "version": "==1.2.0" }, "argcomplete": { @@ -130,44 +131,56 @@ "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15", "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c" ], + "markers": "python_version >= '3.6'", "version": "==4.0.2" }, + "asynctest": { + "hashes": [ + "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676", + "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac" + ], + "markers": "python_version < '3.8'", + "version": "==0.13.0" + }, "attrs": { "hashes": [ "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4", "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==21.4.0" }, "bc-python-hcl2": { "hashes": [ - "sha256:ab9e851013561f537015725a2cc787091611c7f31bb900a03bd4ffc5e280054b", - "sha256:b383b6c6835a7d81c5053a3190ace0279363407e0ce43ecf729b2688066a47ff" + "sha256:0bb7649c37e8378af05c8e7f621b0a5e87abb69414a6838aadb6d6e32557ae04", + "sha256:f9beb5d6b835413fb306a0f458098ca0bf74a79db5bc5af6fe97a405c6a3f8fa" ], "index": "pypi", - "version": "==0.3.42" + "version": "==0.3.44" }, "beautifulsoup4": { "hashes": [ "sha256:58d5c3d29f5a36ffeb94f02f0d786cd53014cf9b3b3951d42e0080d8a9498d30", "sha256:ad9aa55b65ef2808eb405f46cf74df7fcb7044d5cbc26487f96eb2ef2e436693" ], + "markers": "python_version >= '3.6'", "version": "==4.11.1" }, "boto3": { "hashes": [ - "sha256:0b9757575b8003928defc5fb6e816936fa1bdb1384d0edec6622bb9fb104e96c", - "sha256:f39b91a4c3614db8e44912ee82426fb4b16d5df2cd66883f3aff6f76d7f5d310" + "sha256:7033d3a351171b85647405eb70c4c9ae0d75c085dd987d7674557607acbcd459", + "sha256:e87dbc67475b0ea7564b17b6686995fd3a120312a95a625e6db61490fa0a3fed" ], "index": "pypi", - "version": "==1.24.12" + "version": "==1.24.20" }, "botocore": { "hashes": [ - "sha256:17d3ec9f684d21e06b64d9cb224934557bcd95031e2ecb551bf16271e8722fec", - "sha256:b8ac156e55267da6e728ea0b806bfcd97adf882801cffe7849c4b88ce4780326" + "sha256:bb80a2204ccd51c1611e562d3d0511dc2a156257f87edeb59e99d7cef24b75d6", + "sha256:d3445a382711b58b4ec29e42267f074aa743ac7a5ddc50a08e0aae2b8309e3a5" ], - "version": "==1.27.12" + "markers": "python_version >= '3.7'", + "version": "==1.27.20" }, "cached-property": { "hashes": [ @@ -189,6 +202,7 @@ "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d", "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412" ], + "markers": "python_version >= '3.6'", "version": "==2022.6.15" }, "cffi": { @@ -248,11 +262,11 @@ }, "charset-normalizer": { "hashes": [ - 
"sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597", - "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df" + "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5", + "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413" ], "index": "pypi", - "version": "==2.0.12" + "version": "==2.1.0" }, "click": { "hashes": [ @@ -267,6 +281,7 @@ "sha256:9653a2297357335d7325a1827e71ac1245d91c97d959346a7decabd4a52d5354", "sha256:a6e924f3c46b657feb5b72679f7e930f8e5b224b766ab35c91ae4019b4e0615e" ], + "markers": "python_version >= '3.6' and python_version < '4'", "version": "==0.5.3" }, "cloudsplaining": { @@ -298,21 +313,23 @@ "sha256:3fbdb64466afd23abaf6c977627b75b6139a5a3e8ce38405c5b413aed7a0471f", "sha256:ab1e2bfe1d01d968e1b7e8d9023bc51ef3509bba217bb730cee3827e1ee82869" ], + "markers": "python_version >= '3.6'", "version": "==21.6.0" }, "cyclonedx-python-lib": { "hashes": [ - "sha256:7a3aebcc1603e2cb0bc13ebf4274d2bd28ee46d199a7c2c05bd9d823ea7143e4", - "sha256:875c0dac4c8be1da58cef399eb09ceba8668a153d2bfed67b7af8bdbca5bad61" + "sha256:06242c2a61033c4112b41b4f55d3b5130bc2a7bc6107a7b4950eac2431351963", + "sha256:8235aad70efc0f84cdf154b8c28802b605ee8133cd5c9a247d834bdb8b6c827a" ], "index": "pypi", - "version": "==2.5.2" + "version": "==2.6.0" }, "decorator": { "hashes": [ "sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330", "sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186" ], + "markers": "python_version >= '3.5'", "version": "==5.1.1" }, "deep-merge": { @@ -416,6 +433,7 @@ "sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2", "sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951" ], + "markers": "python_version >= '3.7'", "version": "==1.3.0" }, "gitdb": { @@ -423,6 +441,7 @@ "sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd", "sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa" ], + "markers": "python_version >= '3.6'", "version": "==4.0.9" }, "gitpython": { @@ -438,15 +457,16 @@ "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff", "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d" ], + "markers": "python_version >= '3.5'", "version": "==3.3" }, "importlib-metadata": { "hashes": [ - "sha256:5d26852efe48c0a32b0509ffbc583fda1a2266545a78d104a6f4aff3db17d700", - "sha256:c58c8eb8a762858f49e18436ff552e83914778e50e9d2f1660535ffb364552ec" + "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670", + "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23" ], "index": "pypi", - "version": "==4.11.4" + "version": "==4.12.0" }, "importlib-resources": { "hashes": [ @@ -461,6 +481,7 @@ "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852", "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61" ], + "markers": "python_version >= '3.7'", "version": "==3.1.2" }, "jmespath": { @@ -507,6 +528,7 @@ "sha256:cbb516f16218e643d8e0a95b309f77eb118cb138d39a4f27851e6a63581db874", "sha256:f5da449a6e1c989a4cea2631aa8ee67caa5a2ef855d551c88f9e309f4634c621" ], + "markers": "python_version >= '3.6'", "version": "==3.3.7" }, "markupsafe": { @@ -552,6 +574,7 @@ "sha256:f121a1420d4e173a5d96e47e9a0c0dcff965afdf1626d28de1460815f7c4ee7a", "sha256:fc7b548b17d238737688817ab67deebb30e8073c95749d55538ed473130ec0c7" ], + "markers": "python_version >= '3.7'", "version": "==2.1.1" }, "multidict": { @@ -616,6 +639,7 @@ 
"sha256:feba80698173761cddd814fa22e88b0661e98cb810f9f986c54aa34d281e4937", "sha256:feea820722e69451743a3d56ad74948b68bf456984d63c1a92e8347b7b88452d" ], + "markers": "python_version >= '3.7'", "version": "==6.0.2" }, "networkx": { @@ -628,10 +652,11 @@ }, "packageurl-python": { "hashes": [ - "sha256:07aa852d1c48b0e86e625f6a32d83f96427739806b269d0f8142788ee807114b", - "sha256:872a0434b9a448b3fa97571711f69dd2a3fb72345ad66c90b17d827afea82f09" + "sha256:99df143960b7100fff3b2cf5b0beba2f64b6d8c818f6c9f125aed6fac7438763", + "sha256:c7dc928aaa9465f04c86eaa956c75247d5f140ec8d50bc111b55314f143324bb" ], - "version": "==0.9.9" + "markers": "python_version >= '3.6'", + "version": "==0.10.0" }, "packaging": { "hashes": [ @@ -653,15 +678,16 @@ "sha256:5358f388ba7ff682a337f0a80a9cb7fb1ee981b6f46ad1c446132c017ac5ede4", "sha256:75137fc7e1311bc24836855dce7caa40548f3f81a72045bb6731d55f48de644a" ], + "markers": "python_version >= '3.6'", "version": "==0.12.3" }, "policyuniverse": { "hashes": [ - "sha256:826705f0a77018b314e60d4d620c4b2a004b935c89ad68bf7695444c3698d15a", - "sha256:997db60c3c0181a3fbae09e73a56f5c28076fa5e9b13ea09f93eedf9a0978fa7" + "sha256:be5d9148bf6cc2586b02aa85242e9c9cdc94e4469f9b393114950cae299eeb5d", + "sha256:c66b1fb907750643a1987eb419b2112ae3f9c527c013429525f9fab989c9a2d7" ], "index": "pypi", - "version": "==1.5.0.20220523" + "version": "==1.5.0.20220613" }, "prettytable": { "hashes": [ @@ -673,39 +699,39 @@ }, "pycares": { "hashes": [ - "sha256:03490be0e7b51a0c8073f877bec347eff31003f64f57d9518d419d9369452837", - "sha256:056330275dea42b7199494047a745e1d9785d39fb8c4cd469dca043532240b80", - "sha256:0aa897543a786daba74ec5e19638bd38b2b432d179a0e248eac1e62de5756207", - "sha256:112e1385c451069112d6b5ea1f9c378544f3c6b89882ff964e9a64be3336d7e4", - "sha256:27a6f09dbfb69bb79609724c0f90dfaa7c215876a7cd9f12d585574d1f922112", - "sha256:2b837315ed08c7df009b67725fe1f50489e99de9089f58ec1b243dc612f172aa", - "sha256:2f5f84fe9f83eab9cd68544b165b74ba6e3412d029cc9ab20098d9c332869fc5", - "sha256:40079ed58efa91747c50aac4edf8ecc7e570132ab57dc0a4030eb0d016a6cab8", - "sha256:439799be4b7576e907139a7f9b3c8a01b90d3e38af4af9cd1fc6c1ee9a42b9e6", - "sha256:4d5da840aa0d9b15fa51107f09270c563a348cb77b14ae9653d0bbdbe326fcc2", - "sha256:4e190471a015f8225fa38069617192e06122771cce2b169ac7a60bfdbd3d4ab2", - "sha256:5632f21d92cc0225ba5ff906e4e5dec415ef0b3df322c461d138190681cd5d89", - "sha256:569eef8597b5e02b1bc4644b9f272160304d8c9985357d7ecfcd054da97c0771", - "sha256:58a41a2baabcd95266db776c510d349d417919407f03510fc87ac7488730d913", - "sha256:6831e963a910b0a8cbdd2750ffcdf5f2bb0edb3f53ca69ff18484de2cc3807c4", - "sha256:71b99b9e041ae3356b859822c511f286f84c8889ec9ed1fbf6ac30fb4da13e4c", - "sha256:8319afe4838e09df267c421ca93da408f770b945ec6217dda72f1f6a493e37e4", - "sha256:8fd1ff17a26bb004f0f6bb902ba7dddd810059096ae0cc3b45e4f5be46315d19", - "sha256:a810d01c9a426ee8b0f36969c2aef5fb966712be9d7e466920beb328cd9cefa3", - "sha256:ad7b28e1b6bc68edd3d678373fa3af84e39d287090434f25055d21b4716b2fc6", - "sha256:b0e50ddc78252f2e2b6b5f2c73e5b2449dfb6bea7a5a0e21dfd1e2bcc9e17382", - "sha256:b266cec81dcea2c3efbbd3dda00af8d7eb0693ae9e47e8706518334b21f27d4a", - "sha256:c000942f5fc64e6e046aa61aa53b629b576ba11607d108909727c3c8f211a157", - "sha256:c6680f7fdc0f1163e8f6c2a11d11b9a0b524a61000d2a71f9ccd410f154fb171", - "sha256:c7eba3c8354b730a54d23237d0b6445a2f68570fa68d0848887da23a3f3b71f3", - "sha256:cbceaa9b2c416aa931627466d3240aecfc905c292c842252e3d77b8630072505", - "sha256:dc942692fca0e27081b7bb414bb971d34609c80df5e953f6d0c62ecc8019acd9", - 
"sha256:e1489aa25d14dbf7176110ead937c01176ed5a0ebefd3b092bbd6b202241814c", - "sha256:e5a060f5fa90ae245aa99a4a8ad13ec39c2340400de037c7e8d27b081e1a3c64", - "sha256:ec00f3594ee775665167b1a1630edceefb1b1283af9ac57480dba2fb6fd6c360", - "sha256:ed71dc4290d9c3353945965604ef1f6a4de631733e9819a7ebc747220b27e641" - ], - "version": "==4.1.2" + "sha256:061dd4c80fec73feb150455b159704cd51a122f20d36790033bd6375d4198579", + "sha256:15dd5cf21bc73ad539e8aabf7afe370d1df8af7bc6944cd7298f3bfef0c1a27c", + "sha256:1a9506d496efeb809a1b63647cb2f3f33c67fcf62bf80a2359af692fef2c1755", + "sha256:1f37f762414680063b4dfec5be809a84f74cd8e203d939aaf3ba9c807a9e7013", + "sha256:2113529004df4894783eaa61e9abc3a680756b6f033d942f2800301ae8c71c29", + "sha256:2fd53eb5b441c4f6f9c78d7900e05883e9998b34a14b804be4fc4c6f9fea89f3", + "sha256:3636fccf643c5192c34ee0183c514a2d09419e3a76ca2717cef626638027cb21", + "sha256:396ee487178e9de06ca4122a35a157474db3ce0a0db6038a31c831ebb9863315", + "sha256:3b78bdee2f2f1351d5fccc2d1b667aea2d15a55d74d52cb9fd5bea8b5e74c4dc", + "sha256:4ee625d7571039038bca51ae049b047cbfcfc024b302aae6cc53d5d9aa8648a8", + "sha256:5333b51ef4ff3e8973b4a1b57cad5ada13e15552445ee3cd74bd77407dec9d44", + "sha256:66b5390a4885a578e687d3f2683689c35e1d4573f4d0ecf217431f7bb55c49a0", + "sha256:6724573e830ea2345f4bcf0f968af64cc6d491dc2133e9c617f603445dcdfa58", + "sha256:735b4f75fd0f595c4e9184da18cd87737f46bc81a64ea41f4edce2b6b68d46d2", + "sha256:7a901776163a04de5d67c42bd63a287cff9cb05fc041668ad1681fe3daa36445", + "sha256:8bd6ed3ad3a5358a635c1acf5d0f46be9afb095772b84427ff22283d2f31db1b", + "sha256:99e00e397d07a79c9f43e4303e67f4f97bcabd013bda0d8f2d430509b7aef8a0", + "sha256:9b05c2cec644a6c66b55bcf6c24d4dfdaf2f7205b16e5c4ceee31db104fac958", + "sha256:a521d7f54f3e52ded4d34c306ba05cfe9eb5aaa2e5aaf83c96564b9369495588", + "sha256:b03f69df69f0ab3bfb8dbe54444afddff6ff9389561a08aade96b4f91207a655", + "sha256:c8a46839da642b281ac5f56d3c6336528e128b3c41eab9c5330d250f22325e9d", + "sha256:d2e8ec4c8e07c986b70a3cc8f5b297c53b08ac755e5b9797512002a466e2de86", + "sha256:d83f193563b42360528167705b1c7bb91e2a09f990b98e3d6378835b72cd5c96", + "sha256:d9cd826d8e0c270059450709bff994bfeb072f79d82fd3f11c701690ff65d0e7", + "sha256:e4dc37f732f7110ca6368e0128cbbd0a54f5211515a061b2add64da2ddb8e5ca", + "sha256:e75cbd4d3b3d9b02bba6e170846e39893a825e7a5fb1b96728fc6d7b964f8945", + "sha256:e7a95763cdc20cf9ec357066e656ea30b8de6b03de6175cbb50890e22aa01868", + "sha256:e9dbfcacbde6c21380c412c13d53ea44b257dea3f7b9d80be2c873bb20e21fee", + "sha256:f05223de13467bb26f9a1594a1799ce2d08ad8ea241489fecd9d8ed3bbbfc672", + "sha256:f8e6942965465ca98e212376c4afb9aec501d8129054929744b2f4a487c8c14b", + "sha256:fbd53728d798d07811898e11991e22209229c090eab265a53d12270b95d70d1a" + ], + "version": "==4.2.1" }, "pycep-parser": { "hashes": [ @@ -727,6 +753,7 @@ "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb", "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc" ], + "markers": "python_full_version >= '3.6.8'", "version": "==3.0.9" }, "pyrsistent": { @@ -753,6 +780,7 @@ "sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5", "sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6" ], + "markers": "python_version >= '3.7'", "version": "==0.18.1" }, "python-dateutil": { @@ -760,6 +788,7 @@ "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86", "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 
3.3'", "version": "==2.8.2" }, "pyyaml": { @@ -878,20 +907,23 @@ "sha256:fdecb225d0f1d50d4b26ac423e0032e76d46a788b83b4e299a520717a47d968c", "sha256:ffef4b30785dc2d1604dfb7cf9fca5dc27cd86d65f7c2a9ec34d6d3ae4565ec2" ], + "markers": "python_version >= '3.6'", "version": "==2022.6.2" }, "requests": { "hashes": [ - "sha256:bc7861137fbce630f17b03d3ad02ad0bf978c844f3536d0edda6499dafce2b6f", - "sha256:d568723a7ebd25875d8d1eaf5dfa068cd2fc8194b2e483d7b1f7c81918dbec6b" + "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983", + "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349" ], - "version": "==2.28.0" + "markers": "python_version >= '3.7' and python_version < '4'", + "version": "==2.28.1" }, "s3transfer": { "hashes": [ "sha256:06176b74f3a15f61f1b4f25a1fc29a4429040b7647133a463da8fa5bd28d5ecd", "sha256:2ed07d3866f523cc561bf4a00fc5535827981b117dd7876f036b0c1aca42c947" ], + "markers": "python_version >= '3.7'", "version": "==0.6.0" }, "schema": { @@ -909,11 +941,20 @@ "index": "pypi", "version": "==2.10.0" }, + "setuptools": { + "hashes": [ + "sha256:990a4f7861b31532871ab72331e755b5f14efbe52d336ea7f6118144dd478741", + "sha256:c1848f654aea2e3526d17fc3ce6aeaa5e7e24e66e645b5be2171f3f6b4e5a178" + ], + "markers": "python_version >= '3.7'", + "version": "==62.6.0" + }, "six": { "hashes": [ "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926", "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==1.16.0" }, "smmap": { @@ -921,6 +962,7 @@ "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94", "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936" ], + "markers": "python_version >= '3.6'", "version": "==5.0.0" }, "sortedcontainers": { @@ -935,15 +977,17 @@ "sha256:3b2503d3c7084a42b1ebd08116e5f81aadfaea95863628c80a3b774a11b7c759", "sha256:fc53893b3da2c33de295667a0e19f078c14bf86544af307354de5fcf12a3f30d" ], + "markers": "python_version >= '3.6'", "version": "==2.3.2.post1" }, "tabulate": { "hashes": [ - "sha256:d7c013fe7abbc5e491394e10fa845f8f32fe54f8dc60c6622c6cf482d25d47e4", - "sha256:eb1d13f25760052e8931f2ef80aaf6045a6cceb47514db8beab24cded16f13a7" + "sha256:0ba055423dbaa164b9e456abe7920c5e8ed33fcc16f6d1b2f2d152c8e1e8b4fc", + "sha256:436f1c768b424654fce8597290d2764def1eea6a77cfa5c33be00b1bc0f4f63d", + "sha256:6c57f3f3dd7ac2782770155f3adb2db0b1a269637e42f27599925e64b114f519" ], "index": "pypi", - "version": "==0.8.9" + "version": "==0.8.10" }, "termcolor": { "hashes": [ @@ -957,6 +1001,7 @@ "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b", "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f" ], + "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==0.10.2" }, "tqdm": { @@ -988,6 +1033,7 @@ "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14", "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", "version": "==1.26.9" }, "wcwidth": { @@ -999,10 +1045,11 @@ }, "websocket-client": { "hashes": [ - "sha256:50b21db0058f7a953d67cc0445be4b948d7fc196ecbeb8083d68d94628e4abf6", - "sha256:722b171be00f2b90e1d4fb2f2b53146a536ca38db1da8ff49c972a4e1365d0ef" + "sha256:5d55652dc1d0b3c734f044337d929aaf83f4f9138816ec680c1aefefb4dc4877", + 
"sha256:d58c5f284d6a9bf8379dab423259fe8f85b70d5fa5d2916d5791a84594b122b1" ], - "version": "==1.3.2" + "markers": "python_version >= '3.7'", + "version": "==1.3.3" }, "yarl": { "hashes": [ @@ -1079,6 +1126,7 @@ "sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576", "sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59" ], + "markers": "python_version >= '3.6'", "version": "==1.7.2" }, "zipp": { @@ -1086,7 +1134,7 @@ "sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad", "sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099" ], - "markers": "python_version < '3.10'", + "markers": "python_version >= '3.7'", "version": "==3.8.0" } }, @@ -1182,6 +1230,7 @@ "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a", "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2" ], + "markers": "python_version >= '3.6'", "version": "==1.2.0" }, "async-timeout": { @@ -1189,8 +1238,17 @@ "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15", "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c" ], + "markers": "python_version >= '3.6'", "version": "==4.0.2" }, + "asynctest": { + "hashes": [ + "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676", + "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac" + ], + "markers": "python_version < '3.8'", + "version": "==0.13.0" + }, "atomicwrites": { "hashes": [ "sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197", @@ -1204,6 +1262,7 @@ "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4", "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==21.4.0" }, "bandit": { @@ -1219,6 +1278,7 @@ "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d", "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412" ], + "markers": "python_version >= '3.6'", "version": "==2022.6.15" }, "cfgv": { @@ -1226,20 +1286,19 @@ "sha256:c6a0883f3917a037485059700b9e75da2464e6c27051014ad85ba6aaa5884426", "sha256:f5a830efb9ce7a445376bb66ec94c638a9787422f96264c98edc6bdeed8ab736" ], + "markers": "python_full_version >= '3.6.1'", "version": "==3.3.1" }, "charset-normalizer": { "hashes": [ - "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597", - "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df" + "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5", + "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413" ], "index": "pypi", - "version": "==2.0.12" + "version": "==2.1.0" }, "coverage": { - "extras": [ - "toml" - ], + "extras": [], "hashes": [ "sha256:004d1880bed2d97151facef49f08e255a20ceb6f9432df75f4eef018fdd5a78c", "sha256:01d84219b5cdbfc8122223b39a954820929497a1cb1422824bb86b07b74594b6", @@ -1324,6 +1383,7 @@ "sha256:8f694f3ba9cc92cab508b152dcfe322153975c29bda272e2fd7f3f00f36e47c5", "sha256:a295f7cc774947aac58dde7fdc85f4aa00c42adf5d8f5468fc630c1acf30a142" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==1.9.0" }, "filelock": { @@ -1331,6 +1391,7 @@ "sha256:37def7b658813cda163b56fc564cdc75e86d338246458c4c28ae84cabefa2404", "sha256:3a0fd85166ad9dbab54c9aec96737b744106dc5f15c0b09a6744a445299fcf04" ], + "markers": "python_version >= '3.7'", "version": "==3.7.1" }, 
"flake8": { @@ -1343,11 +1404,11 @@ }, "flake8-bugbear": { "hashes": [ - "sha256:ec374101cddf65bd7a96d393847d74e58d3b98669dbf9768344c39b6290e8bd6", - "sha256:f7c080563fca75ee6b205d06b181ecba22b802babb96b0b084cc7743d6908a55" + "sha256:ac3317eba27d79dc19dcdeb7356ca1f656f0cde11d899c4551badf770f05cbef", + "sha256:ad2b33dbe33a6d4ca1f0037e1d156d0a89107ee63c0600e3b4f7b60e37998ac2" ], "index": "pypi", - "version": "==22.4.25" + "version": "==22.6.22" }, "frozenlist": { "hashes": [ @@ -1411,6 +1472,7 @@ "sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2", "sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951" ], + "markers": "python_version >= '3.7'", "version": "==1.3.0" }, "gitdb": { @@ -1418,6 +1480,7 @@ "sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd", "sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa" ], + "markers": "python_version >= '3.6'", "version": "==4.0.9" }, "gitpython": { @@ -1433,6 +1496,7 @@ "sha256:0dca2ea3e4381c435ef9c33ba100a78a9b40c0bab11189c7cf121f75815efeaa", "sha256:3d11b16f3fe19f52039fb7e39c9c884b21cb1b586988114fbe42671f03de3e82" ], + "markers": "python_version >= '3.7'", "version": "==2.5.1" }, "idna": { @@ -1440,8 +1504,17 @@ "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff", "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d" ], + "markers": "python_version >= '3.5'", "version": "==3.3" }, + "importlib-metadata": { + "hashes": [ + "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670", + "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23" + ], + "index": "pypi", + "version": "==4.12.0" + }, "importlib-resources": { "hashes": [ "sha256:568c9f16cb204f9decc8d6d24a572eeea27dacbb4cee9e6b03a8025736769751", @@ -1542,6 +1615,7 @@ "sha256:feba80698173761cddd814fa22e88b0661e98cb810f9f986c54aa34d281e4937", "sha256:feea820722e69451743a3d56ad74948b68bf456984d63c1a92e8347b7b88452d" ], + "markers": "python_version >= '3.7'", "version": "==6.0.2" }, "mypy": { @@ -1582,10 +1656,11 @@ }, "nodeenv": { "hashes": [ - "sha256:3ef13ff90291ba2a4a7a4ff9a979b63ffdd00a464dbe04acf0ea6471517a4c2b", - "sha256:621e6b7076565ddcacd2db0294c0381e01fd28945ab36bcf00f41c5daf63bef7" + "sha256:27083a7b96a25f2f5e1d8cb4b6317ee8aeda3bdd121394e5ac54e498028a042e", + "sha256:e0e7f7dfb85fc5394c6fe1e8fa98131a2473e04311a45afb6508f7cf1836fa2b" ], - "version": "==1.6.0" + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6'", + "version": "==1.7.0" }, "packaging": { "hashes": [ @@ -1600,6 +1675,7 @@ "sha256:e547125940bcc052856ded43be8e101f63828c2d94239ffbe2b327ba3d5ccf0a", "sha256:e8dca2f4b43560edef58813969f52a56cef023146cbb8931626db80e6c1c4308" ], + "markers": "python_version >= '2.6'", "version": "==5.9.0" }, "platformdirs": { @@ -1607,6 +1683,7 @@ "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788", "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19" ], + "markers": "python_version >= '3.7'", "version": "==2.5.2" }, "pluggy": { @@ -1614,6 +1691,7 @@ "sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159", "sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3" ], + "markers": "python_version >= '3.6'", "version": "==1.0.0" }, "pre-commit": { @@ -1629,6 +1707,7 @@ "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719", "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378" ], + 
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==1.11.0" }, "pycodestyle": { @@ -1636,6 +1715,7 @@ "sha256:720f8b39dde8b293825e7ff02c475f3077124006db4f440dcbc9a20b76548a20", "sha256:eddd5847ef438ea1c7870ca7eb78a9d47ce0cdb4851a5523949f2601d0cbbe7f" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==2.8.0" }, "pyflakes": { @@ -1643,6 +1723,7 @@ "sha256:05a85c2872edf37a4ed30b0cce2f6093e1d0581f8c19d7393122da7e25b2b24c", "sha256:3bb3a3f256f4b7968c9c788781e4ff07dce46bdf12339dcda61053375426ee2e" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==2.4.0" }, "pyparsing": { @@ -1650,6 +1731,7 @@ "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb", "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc" ], + "markers": "python_full_version >= '3.6.8'", "version": "==3.0.9" }, "pyrsistent": { @@ -1676,6 +1758,7 @@ "sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5", "sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6" ], + "markers": "python_version >= '3.7'", "version": "==0.18.1" }, "pytest": { @@ -1708,15 +1791,16 @@ "sha256:8b67587c8f98cbbadfdd804539ed5455b6ed03802203485dd2f53c1422d7440e", "sha256:bbbb6717efc886b9d64537b41fb1497cfaf3c9601276be8da2cccfea5a3c8ad8" ], + "markers": "python_version >= '3.6'", "version": "==1.4.0" }, "pytest-mock": { "hashes": [ - "sha256:5112bd92cc9f186ee96e1a92efc84969ea494939c3aead39c50f421c4cc69534", - "sha256:6cff27cec936bf81dc5ee87f07132b807bcda51106b5ec4b90a04331cba76231" + "sha256:2c6d756d5d3bf98e2e80797a959ca7f81f479e7d1f5f571611b0fdd6d1745240", + "sha256:d989f11ca4a84479e288b0cd1e6769d6ad0d3d7743dcc75e460d1416a5f2135a" ], "index": "pypi", - "version": "==3.7.0" + "version": "==3.8.1" }, "pytest-xdist": { "hashes": [ @@ -1767,10 +1851,11 @@ }, "requests": { "hashes": [ - "sha256:bc7861137fbce630f17b03d3ad02ad0bf978c844f3536d0edda6499dafce2b6f", - "sha256:d568723a7ebd25875d8d1eaf5dfa068cd2fc8194b2e483d7b1f7c81918dbec6b" + "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983", + "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349" ], - "version": "==2.28.0" + "markers": "python_version >= '3.7' and python_version < '4'", + "version": "==2.28.1" }, "responses": { "hashes": [ @@ -1780,11 +1865,20 @@ "index": "pypi", "version": "==0.21.0" }, + "setuptools": { + "hashes": [ + "sha256:990a4f7861b31532871ab72331e755b5f14efbe52d336ea7f6118144dd478741", + "sha256:c1848f654aea2e3526d17fc3ce6aeaa5e7e24e66e645b5be2171f3f6b4e5a178" + ], + "markers": "python_version >= '3.7'", + "version": "==62.6.0" + }, "six": { "hashes": [ "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926", "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==1.16.0" }, "smmap": { @@ -1792,6 +1886,7 @@ "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94", "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936" ], + "markers": "python_version >= '3.6'", "version": "==5.0.0" }, "stevedore": { @@ -1799,6 +1894,7 @@ "sha256:a547de73308fd7e90075bb4d301405bebf705292fa90a90fc3bcf9133f58616c", "sha256:f40253887d8712eaa2bb0ea3830374416736dc8ec0e22f5a65092c1174c44335" ], + "markers": "python_version >= '3.6'", "version": "==3.5.0" }, "toml": { @@ 
-1806,6 +1902,7 @@ "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b", "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f" ], + "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==0.10.2" }, "tomli": { @@ -1816,13 +1913,43 @@ "markers": "python_version < '3.11'", "version": "==2.0.1" }, + "typed-ast": { + "hashes": [ + "sha256:0261195c2062caf107831e92a76764c81227dae162c4f75192c0d489faf751a2", + "sha256:0fdbcf2fef0ca421a3f5912555804296f0b0960f0418c440f5d6d3abb549f3e1", + "sha256:183afdf0ec5b1b211724dfef3d2cad2d767cbefac291f24d69b00546c1837fb6", + "sha256:211260621ab1cd7324e0798d6be953d00b74e0428382991adfddb352252f1d62", + "sha256:267e3f78697a6c00c689c03db4876dd1efdfea2f251a5ad6555e82a26847b4ac", + "sha256:2efae9db7a8c05ad5547d522e7dbe62c83d838d3906a3716d1478b6c1d61388d", + "sha256:370788a63915e82fd6f212865a596a0fefcbb7d408bbbb13dea723d971ed8bdc", + "sha256:39e21ceb7388e4bb37f4c679d72707ed46c2fbf2a5609b8b8ebc4b067d977df2", + "sha256:3e123d878ba170397916557d31c8f589951e353cc95fb7f24f6bb69adc1a8a97", + "sha256:4879da6c9b73443f97e731b617184a596ac1235fe91f98d279a7af36c796da35", + "sha256:4e964b4ff86550a7a7d56345c7864b18f403f5bd7380edf44a3c1fb4ee7ac6c6", + "sha256:639c5f0b21776605dd6c9dbe592d5228f021404dafd377e2b7ac046b0349b1a1", + "sha256:669dd0c4167f6f2cd9f57041e03c3c2ebf9063d0757dc89f79ba1daa2bfca9d4", + "sha256:6778e1b2f81dfc7bc58e4b259363b83d2e509a65198e85d5700dfae4c6c8ff1c", + "sha256:683407d92dc953c8a7347119596f0b0e6c55eb98ebebd9b23437501b28dcbb8e", + "sha256:79b1e0869db7c830ba6a981d58711c88b6677506e648496b1f64ac7d15633aec", + "sha256:7d5d014b7daa8b0bf2eaef684295acae12b036d79f54178b92a2b6a56f92278f", + "sha256:98f80dee3c03455e92796b58b98ff6ca0b2a6f652120c263efdba4d6c5e58f72", + "sha256:a94d55d142c9265f4ea46fab70977a1944ecae359ae867397757d836ea5a3f47", + "sha256:a9916d2bb8865f973824fb47436fa45e1ebf2efd920f2b9f99342cb7fab93f72", + "sha256:c542eeda69212fa10a7ada75e668876fdec5f856cd3d06829e6aa64ad17c8dfe", + "sha256:cf4afcfac006ece570e32d6fa90ab74a17245b83dfd6655a6f68568098345ff6", + "sha256:ebd9d7f80ccf7a82ac5f88c521115cc55d84e35bf8b446fcd7836eb6b98929a3", + "sha256:ed855bbe3eb3715fca349c80174cfcfd699c2f9de574d40527b8429acae23a66" + ], + "markers": "python_version < '3.8'", + "version": "==1.5.4" + }, "types-cachetools": { "hashes": [ - "sha256:4291c3b6ae10e7b0d7ae3c4cb7d9daa6b21d4b7deb64d193e44bcec7ca5c4095", - "sha256:bf22b2e9f9243983914f6510e43a1873f012afb8c3fc5e09a59b0ccbe3ab0f35" + "sha256:069cfc825697cd51445c1feabbe4edc1fae2b2315870e7a9a179a7c4a5851bee", + "sha256:b496b7e364ba050c4eaadcc6582f2c9fbb04f8ee7141eb3b311a8589dbd4506a" ], "index": "pypi", - "version": "==5.0.2" + "version": "==5.2.1" }, "types-colorama": { "hashes": [ @@ -1834,11 +1961,11 @@ }, "types-jmespath": { "hashes": [ - "sha256:46ec8e126f2b132879f431c607e9ef7928d0040c10a8e9eb40bf75752431a003", - "sha256:c2f5810f4c5026ea537e352d6b06368ae456daf983bf2dafb68c8f4c6f864842" + "sha256:89c0f6894f59626dcd074664a7294c4a7740b9e2195f5ccb698ada0f6680ce1f", + "sha256:db05811bbd758c76b3209fa92c78f8b28ac9fdf5e62bac6aed95cffc55ff0195" ], "index": "pypi", - "version": "==0.10.2" + "version": "==1.0.0" }, "types-jsonschema": { "hashes": [ @@ -1850,35 +1977,35 @@ }, "types-pyyaml": { "hashes": [ - "sha256:56a7b0e8109602785f942a11ebfbd16e97d5d0e79f5fbb077ec4e6a0004837ff", - "sha256:d9495d377bb4f9c5387ac278776403eb3b4bb376851025d913eea4c22b4c6438" + "sha256:33ae75c84b8f61fddf0c63e9c7e557db9db1694ad3c2ee8628ec5efebb5a5e9b", + 
"sha256:b738e9ef120da0af8c235ba49d3b72510f56ef9bcc308fc8e7357100ff122284" ], "index": "pypi", - "version": "==6.0.8" + "version": "==6.0.9" }, "types-requests": { "hashes": [ - "sha256:b9b6cd0a6e5d500e56419b79f44ec96f316e9375ff6c8ee566c39d25e9612621", - "sha256:ca8d7cc549c3d10dbcb3c69c1b53e3ffd1270089c1001a65c1e9e1017eb5e704" + "sha256:85383b4ef0535f639c3f06c5bbb6494bbf59570c4cd88bbcf540f0b2ac1b49ab", + "sha256:9863d16dfbb3fa55dcda64fa3b989e76e8859033b26c1e1623e30465cfe294d3" ], "index": "pypi", - "version": "==2.27.30" + "version": "==2.28.0" }, "types-tabulate": { "hashes": [ - "sha256:2fc3fa4fe1853ac987cf50e8d4599e3fe446dd53064fe86a46a407a98e9fc04f", - "sha256:7971ed0cd40454eb18d82c01e2f18bcd09ca23cc9eb901c62d2b04e5d1f57f84" + "sha256:17a5fa3b5ca453815778fc9865e8ecd0118b07b2b9faff3e2b06fe448174dd5e", + "sha256:af811268241e8fb87b63c052c87d1e329898a93191309d5d42111372232b2e0e" ], "index": "pypi", - "version": "==0.8.9" + "version": "==0.8.11" }, "types-termcolor": { "hashes": [ - "sha256:4986dea39b82c9b78714154ac88033d4e225e3c06e0386491f74003c9071e541", - "sha256:becba28967a8792221f202c6ba14c2ae236ef90519dd14aaefe8af54d94639e0" + "sha256:3dc714e884a98b6a8c4c6af22ee99e1b53d2e595a22e0933b2dc9cc32b8b8c58", + "sha256:dd10b878548dbd72885f72c1c45d42a45172634f7c8d0284559238785604e068" ], "index": "pypi", - "version": "==1.1.4" + "version": "==1.1.5" }, "types-urllib3": { "hashes": [ @@ -1900,6 +2027,7 @@ "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14", "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e" ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", "version": "==1.26.9" }, "urllib3-mock": { @@ -1912,10 +2040,11 @@ }, "virtualenv": { "hashes": [ - "sha256:e617f16e25b42eb4f6e74096b9c9e37713cf10bf30168fb4a739f3fa8f898a3a", - "sha256:ef589a79795589aada0c1c5b319486797c03b67ac3984c48c669c0e4f50df3a5" + "sha256:288171134a2ff3bfb1a2f54f119e77cd1b81c29fc1265a2356f3e8d14c7d58c4", + "sha256:b30aefac647e86af6d82bfc944c556f8f1a9c90427b2fb4e3bfbf338cb82becf" ], - "version": "==20.14.1" + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", + "version": "==20.15.1" }, "yarl": { "hashes": [ @@ -1992,6 +2121,7 @@ "sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576", "sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59" ], + "markers": "python_version >= '3.6'", "version": "==1.7.2" }, "zipp": { @@ -1999,7 +2129,7 @@ "sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad", "sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099" ], - "markers": "python_version < '3.10'", + "markers": "python_version >= '3.7'", "version": "==3.8.0" } } diff --git a/setup.py b/setup.py index 7e7a2020a0..59e644cec6 100644 --- a/setup.py +++ b/setup.py @@ -33,7 +33,7 @@ ] }, install_requires=[ - "bc-python-hcl2==0.3.42", + "bc-python-hcl2==0.3.44", "cloudsplaining>=0.4.1", "deep_merge", "tabulate",
litestar-org__litestar-1633
StaticFilesConfig and virtual directories I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks whether the given directories exist on the local filesystem. That assumption does not hold in general, especially with any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems. https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32
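For context, a minimal sketch of the kind of access being described, using the ``files()`` API from importlib_resources; the package name ``my_package`` and the ``static`` subdirectory are placeholders rather than names taken from the report:

```python
from importlib_resources import files

# Traversable handle to data bundled with a package. For a zipped or
# otherwise virtual distribution there may be no real directory on disk,
# which is exactly the assumption behind pydantic's DirectoryPath check.
static_root = files("my_package") / "static"

# The contents are still reachable through the Traversable API:
index_html = static_root.joinpath("index.html").read_text()
print(index_html)
```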
[ { "content": "from __future__ import annotations\n\nimport argparse\nimport importlib.metadata\nimport json\nimport os\nimport shutil\nimport subprocess\nfrom contextlib import contextmanager\nfrom pathlib import Path\nfrom typing import TypedDict\n\nREDIRECT_TEMPLATE = \"\"\"\n<!DOCTYPE HTML>\n<html lang=\"en-US\">\n <head>\n <title>Page Redirection</title>\n <meta charset=\"UTF-8\">\n <meta http-equiv=\"refresh\" content=\"0; url={target}\">\n <script type=\"text/javascript\">window.location.href = \"{target}\"</script>\n </head>\n <body>\n You are being redirected. If this does not work, click <a href='{target}'>this link</a>\n </body>\n</html>\n\"\"\"\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--version\", required=False)\nparser.add_argument(\"--ignore-missing-examples-output\", action=\"store_true\", default=False)\nparser.add_argument(\"output\")\n\n\nclass VersionSpec(TypedDict):\n versions: list[str]\n latest: str\n\n\n@contextmanager\ndef checkout(branch: str) -> None:\n subprocess.run([\"git\", \"checkout\", branch], check=True) # noqa: S603 S607\n yield\n subprocess.run([\"git\", \"checkout\", \"-\"], check=True) # noqa: S603 S607\n\n\ndef load_version_spec() -> VersionSpec:\n versions_file = Path(\"docs/_static/versions.json\")\n if versions_file.exists():\n return json.loads(versions_file.read_text())\n return {\"versions\": [], \"latest\": \"\"}\n\n\ndef build(output_dir: str, version: str | None, ignore_missing_output: bool) -> None:\n if version is None:\n version = importlib.metadata.version(\"litestar\").rsplit(\".\")[0]\n else:\n os.environ[\"_LITESTAR_DOCS_BUILD_VERSION\"] = version\n\n if ignore_missing_output:\n os.environ[\"_LITESTAR_DOCS_IGNORE_MISSING_EXAMPLE_OUTPUT\"] = \"1\"\n\n subprocess.run([\"make\", \"docs\"], check=True) # noqa: S603 S607\n\n output_dir = Path(output_dir)\n output_dir.mkdir()\n output_dir.joinpath(\".nojekyll\").touch(exist_ok=True)\n\n version_spec = load_version_spec()\n is_latest = version == version_spec[\"latest\"]\n\n docs_src_path = Path(\"docs/_build/html\")\n\n output_dir.joinpath(\"index.html\").write_text(REDIRECT_TEMPLATE.format(target=\"latest\"))\n\n if is_latest:\n shutil.copytree(docs_src_path, output_dir / \"latest\", dirs_exist_ok=True)\n shutil.copytree(docs_src_path, output_dir / version, dirs_exist_ok=True)\n\n # copy existing versions into our output dir to preserve them when cleaning the branch\n with checkout(\"gh-pages\"):\n for other_version in [*version_spec[\"versions\"], \"latest\"]:\n other_version_path = Path(other_version)\n other_version_target_path = output_dir / other_version\n if other_version_path.exists() and not other_version_target_path.exists():\n shutil.copytree(other_version_path, other_version_target_path)\n\n\ndef main() -> None:\n args = parser.parse_args()\n build(\n output_dir=args.output,\n version=args.version,\n ignore_missing_output=args.ignore_missing_output,\n )\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "tools/build_docs.py" } ]
[ { "content": "from __future__ import annotations\n\nimport argparse\nimport importlib.metadata\nimport json\nimport os\nimport shutil\nimport subprocess\nfrom contextlib import contextmanager\nfrom pathlib import Path\nfrom typing import TypedDict\n\nREDIRECT_TEMPLATE = \"\"\"\n<!DOCTYPE HTML>\n<html lang=\"en-US\">\n <head>\n <title>Page Redirection</title>\n <meta charset=\"UTF-8\">\n <meta http-equiv=\"refresh\" content=\"0; url={target}\">\n <script type=\"text/javascript\">window.location.href = \"{target}\"</script>\n </head>\n <body>\n You are being redirected. If this does not work, click <a href='{target}'>this link</a>\n </body>\n</html>\n\"\"\"\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\"--version\", required=False)\nparser.add_argument(\"--ignore-missing-examples-output\", action=\"store_true\", default=False)\nparser.add_argument(\"output\")\n\n\nclass VersionSpec(TypedDict):\n versions: list[str]\n latest: str\n\n\n@contextmanager\ndef checkout(branch: str) -> None:\n subprocess.run([\"git\", \"checkout\", branch], check=True) # noqa: S603 S607\n yield\n subprocess.run([\"git\", \"checkout\", \"-\"], check=True) # noqa: S603 S607\n\n\ndef load_version_spec() -> VersionSpec:\n versions_file = Path(\"docs/_static/versions.json\")\n if versions_file.exists():\n return json.loads(versions_file.read_text())\n return {\"versions\": [], \"latest\": \"\"}\n\n\ndef build(output_dir: str, version: str | None, ignore_missing_output: bool) -> None:\n if version is None:\n version = importlib.metadata.version(\"litestar\").rsplit(\".\")[0]\n else:\n os.environ[\"_LITESTAR_DOCS_BUILD_VERSION\"] = version\n\n if ignore_missing_output:\n os.environ[\"_LITESTAR_DOCS_IGNORE_MISSING_EXAMPLE_OUTPUT\"] = \"1\"\n\n subprocess.run([\"make\", \"docs\"], check=True) # noqa: S603 S607\n\n output_dir = Path(output_dir)\n output_dir.mkdir()\n output_dir.joinpath(\".nojekyll\").touch(exist_ok=True)\n\n version_spec = load_version_spec()\n is_latest = version == version_spec[\"latest\"]\n\n docs_src_path = Path(\"docs/_build/html\")\n\n output_dir.joinpath(\"index.html\").write_text(REDIRECT_TEMPLATE.format(target=\"latest\"))\n\n if is_latest:\n shutil.copytree(docs_src_path, output_dir / \"latest\", dirs_exist_ok=True)\n shutil.copytree(docs_src_path, output_dir / version, dirs_exist_ok=True)\n\n # copy existing versions into our output dir to preserve them when cleaning the branch\n with checkout(\"gh-pages\"):\n for other_version in [*version_spec[\"versions\"], \"latest\"]:\n other_version_path = Path(other_version)\n other_version_target_path = output_dir / other_version\n if other_version_path.exists() and not other_version_target_path.exists():\n shutil.copytree(other_version_path, other_version_target_path)\n\n\ndef main() -> None:\n args = parser.parse_args()\n build(\n output_dir=args.output,\n version=args.version,\n ignore_missing_output=args.ignore_missing_examples_output,\n )\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "tools/build_docs.py" } ]
diff --git a/tools/build_docs.py b/tools/build_docs.py index 46e577e3e5..34b4f48b4a 100644 --- a/tools/build_docs.py +++ b/tools/build_docs.py @@ -90,7 +90,7 @@ def main() -> None: build( output_dir=args.output, version=args.version, - ignore_missing_output=args.ignore_missing_output, + ignore_missing_output=args.ignore_missing_examples_output, )
zulip__zulip-8684
lint rules: Prevent `return undefined;` We should sweep the code to replace `return undefined;` with `return;`, and then make a lint rule for it, either via eslint (if it supports that) or by making a custom rule.
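As a rough sketch of the sweep half of this task (assuming the frontend code lives under `static/js/` and `frontend_tests/node_tests/`, the paths visible in the diff below), a small helper like this could list the files that still need rewriting:

```python
import re
from pathlib import Path

# List JavaScript files that still contain `return undefined;` so they can be
# rewritten to a bare `return;` before the lint rule is switched on.
pattern = re.compile(r"\breturn\s+undefined\s*;")

for root in ("static/js", "frontend_tests/node_tests"):
    for path in Path(root).rglob("*.js"):
        if pattern.search(path.read_text()):
            print(path)
```

For the lint-rule half, the change below ends up enabling the existing `eslint-plugin-empty-returns` plugin (rule `empty-returns/main`) rather than writing a custom rule.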
[ { "content": "ZULIP_VERSION = \"1.7.1+git\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a dependency only requires a minor version bump, and\n# removing a dependency requires a major version bump.\n\nPROVISION_VERSION = '15.9'\n", "path": "version.py" } ]
[ { "content": "ZULIP_VERSION = \"1.7.1+git\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a dependency only requires a minor version bump, and\n# removing a dependency requires a major version bump.\n\nPROVISION_VERSION = '15.10'\n", "path": "version.py" } ]
diff --git a/.eslintrc.json b/.eslintrc.json index 7c680d38220ce..db7c93bd78cad 100644 --- a/.eslintrc.json +++ b/.eslintrc.json @@ -172,6 +172,9 @@ "common": false, "panels": false }, + "plugins": [ + "eslint-plugin-empty-returns" + ], "rules": { "array-callback-return": "error", "array-bracket-spacing": "error", @@ -191,6 +194,7 @@ "complexity": [ 0, 4 ], "curly": 2, "dot-notation": [ "error", { "allowKeywords": true } ], + "empty-returns/main": "error", "eol-last": [ "error", "always" ], "eqeqeq": 2, "func-style": [ "off", "expression" ], diff --git a/frontend_tests/node_tests/people.js b/frontend_tests/node_tests/people.js index f7a1552f9d644..4e62c45a3aff8 100644 --- a/frontend_tests/node_tests/people.js +++ b/frontend_tests/node_tests/people.js @@ -2,7 +2,7 @@ zrequire('util'); zrequire('people'); set_global('blueslip', { - error: function () { return undefined; }, + error: function () { return; }, }); set_global('page_params', {}); set_global('md5', function (s) { @@ -555,7 +555,7 @@ initialize(); assert.equal(email, '[email protected]'); // Test undefined slug - people.emails_strings_to_user_ids_string = function () { return undefined; }; + people.emails_strings_to_user_ids_string = function () { return; }; assert.equal(people.emails_to_slug(), undefined); }()); diff --git a/frontend_tests/node_tests/people_errors.js b/frontend_tests/node_tests/people_errors.js index 3eb3a5d0d6666..c3906c9d798a0 100644 --- a/frontend_tests/node_tests/people_errors.js +++ b/frontend_tests/node_tests/people_errors.js @@ -107,7 +107,7 @@ people.initialize_current_user(me.user_id); assert(reply_to.indexOf('?') > -1); people.pm_with_user_ids = function () { return [42]; }; - people.get_person_from_user_id = function () { return undefined; }; + people.get_person_from_user_id = function () { return; }; global.blueslip.error = function (msg) { assert.equal(msg, 'Unknown people in message'); }; diff --git a/frontend_tests/node_tests/search_suggestion.js b/frontend_tests/node_tests/search_suggestion.js index 437a86ea7105b..382e29429889e 100644 --- a/frontend_tests/node_tests/search_suggestion.js +++ b/frontend_tests/node_tests/search_suggestion.js @@ -53,7 +53,7 @@ topic_data.reset(); }; global.narrow_state.stream = function () { - return undefined; + return; }; var suggestions = search.get_suggestions(query); @@ -73,7 +73,7 @@ topic_data.reset(); }; global.narrow_state.stream = function () { - return undefined; + return; }; var ted = @@ -244,7 +244,7 @@ topic_data.reset(); }; global.narrow_state.stream = function () { - return undefined; + return; }; set_global('activity', { @@ -430,7 +430,7 @@ init(); }; global.narrow_state.stream = function () { - return undefined; + return; }; var suggestions = search.get_suggestions(query); @@ -466,7 +466,7 @@ init(); }; global.narrow_state.stream = function () { - return undefined; + return; }; var query = ''; diff --git a/frontend_tests/node_tests/topic_generator.js b/frontend_tests/node_tests/topic_generator.js index 70c6b8023f0bb..61f182d870488 100644 --- a/frontend_tests/node_tests/topic_generator.js +++ b/frontend_tests/node_tests/topic_generator.js @@ -174,7 +174,7 @@ function is_odd(i) { return i % 2 === 1; } assert.equal(gen.next(), undefined); var undef = function () { - return undefined; + return; }; global.blueslip.error = function (msg) { @@ -315,7 +315,7 @@ function is_odd(i) { return i % 2 === 1; } unread.num_unread_for_person = function (user_ids_string) { if (user_ids_string === 'unk') { - return undefined; + return; } if (user_ids_string === 'read') { 
diff --git a/package.json b/package.json index 69db08b0d523a..2c8a4f0f332e8 100644 --- a/package.json +++ b/package.json @@ -50,6 +50,7 @@ "cssstyle": "0.2.29", "difflib": "0.2.4", "eslint": "3.9.1", + "eslint-plugin-empty-returns": "1.0.2", "htmlparser2": "3.8.3", "istanbul": "0.4.5", "jsdom": "9.4.1", diff --git a/static/js/blueslip.js b/static/js/blueslip.js index 65c2c0cc6dd9e..aa597cee54b3d 100644 --- a/static/js/blueslip.js +++ b/static/js/blueslip.js @@ -71,7 +71,7 @@ Logger.prototype = (function () { if (console[name] !== undefined) { return console[name].apply(console, arguments); } - return undefined; + return; }; } diff --git a/static/js/common.js b/static/js/common.js index 12390dde98a6d..d145583bb9449 100644 --- a/static/js/common.js +++ b/static/js/common.js @@ -26,7 +26,7 @@ exports.autofocus = function (selector) { exports.password_quality = function (password, bar, password_field) { // We load zxcvbn.js asynchronously, so the variable might not be set. if (typeof zxcvbn === 'undefined') { - return undefined; + return; } var min_length = password_field.data('minLength'); @@ -58,7 +58,7 @@ exports.password_quality = function (password, bar, password_field) { exports.password_warning = function (password, password_field) { if (typeof zxcvbn === 'undefined') { - return undefined; + return; } var min_length = password_field.data('minLength'); diff --git a/static/js/compose_fade.js b/static/js/compose_fade.js index ea307c5982076..4b916097cc9a0 100644 --- a/static/js/compose_fade.js +++ b/static/js/compose_fade.js @@ -122,13 +122,13 @@ exports.would_receive_message = function (email) { if (!sub) { // If the stream isn't valid, there is no risk of a mix // yet, so don't fade. - return undefined; + return; } if (user && user.is_bot && !sub.invite_only) { // Bots may receive messages on public streams even if they are // not subscribed. 
- return undefined; + return; } return stream_data.user_is_subscribed(focused_recipient.stream, email); } diff --git a/static/js/copy_and_paste.js b/static/js/copy_and_paste.js index 69cadb26141ad..99f3eb1f41aeb 100644 --- a/static/js/copy_and_paste.js +++ b/static/js/copy_and_paste.js @@ -11,7 +11,7 @@ function find_boundary_tr(initial_tr, iterate_row) { // parent tr, we should let the browser handle the copy-paste // entirely on its own if (tr.length === 0) { - return undefined; + return; } // If the selection boundary is on a table row that does not have an @@ -24,7 +24,7 @@ function find_boundary_tr(initial_tr, iterate_row) { tr = iterate_row(tr); } if (j === 10) { - return undefined; + return; } else if (j !== 0) { // If we updated tr, then we are not dealing with a selection // that is entirely within one td, and we can skip the same td diff --git a/static/js/dict.js b/static/js/dict.js index f8b5577f02adf..b7485f853bc85 100644 --- a/static/js/dict.js +++ b/static/js/dict.js @@ -57,7 +57,7 @@ Dict.prototype = { _munge: function Dict__munge(k) { if (k === undefined) { blueslip.error("Tried to call a Dict method with an undefined key."); - return undefined; + return; } if (this._opts.fold_case) { k = k.toLowerCase(); @@ -74,7 +74,7 @@ Dict.prototype = { get: function Dict_get(key) { var mapping = this._items[this._munge(key)]; if (mapping === undefined) { - return undefined; + return; } return mapping.v; }, diff --git a/static/js/echo.js b/static/js/echo.js index d1285b89c586f..a6bde1f55b837 100644 --- a/static/js/echo.js +++ b/static/js/echo.js @@ -60,19 +60,19 @@ var get_next_local_id = (function () { // If our id is already used, it is probably an edge case like we had // to abort a very recent message. blueslip.warn("We don't reuse ids for local echo."); - return undefined; + return; } if (next_local_id % 1 > local_id_increment * 5) { blueslip.warn("Turning off local echo for this message to let host catch up"); - return undefined; + return; } if (next_local_id % 1 === 0) { // The logic to stop at 0.05 should prevent us from ever wrapping around // to the next integer. blueslip.error("Programming error"); - return undefined; + return; } already_used[next_local_id] = true; @@ -139,18 +139,18 @@ function insert_local_message(message_request, local_id) { exports.try_deliver_locally = function try_deliver_locally(message_request) { if (markdown.contains_backend_only_syntax(message_request.content)) { - return undefined; + return; } if (narrow_state.active() && !narrow_state.filter().can_apply_locally()) { - return undefined; + return; } var next_local_id = get_next_local_id(); if (!next_local_id) { // This can happen for legit reasons. 
- return undefined; + return; } return insert_local_message(message_request, next_local_id); diff --git a/static/js/hashchange.js b/static/js/hashchange.js index 9d2d29b70f374..917565455e40c 100644 --- a/static/js/hashchange.js +++ b/static/js/hashchange.js @@ -85,7 +85,7 @@ exports.parse_narrow = function (hash) { } operators.push({negated: negated, operator: operator, operand: operand}); } catch (err) { - return undefined; + return; } } return operators; diff --git a/static/js/localstorage.js b/static/js/localstorage.js index c68d7aa1ef47e..1bea82f21b6d9 100644 --- a/static/js/localstorage.js +++ b/static/js/localstorage.js @@ -6,7 +6,7 @@ var ls = { try { return JSON.parse(str); } catch (err) { - return undefined; + return; } }, diff --git a/static/js/markdown.js b/static/js/markdown.js index c436e73ba6c0a..c171e54ecabbd 100644 --- a/static/js/markdown.js +++ b/static/js/markdown.js @@ -63,7 +63,7 @@ exports.apply_markdown = function (message) { '@' + name + '</span>'; } - return undefined; + return; }, groupMentionHandler: function (name) { var group = user_groups.get_user_group_from_name(name); @@ -75,7 +75,7 @@ exports.apply_markdown = function (message) { '@' + group.name + '</span>'; } - return undefined; + return; }, }; message.content = marked(message.raw_content + '\n\n', options).trim(); @@ -165,7 +165,7 @@ function handleAvatar(email) { function handleStream(streamName) { var stream = stream_data.get_sub(streamName); if (stream === undefined) { - return undefined; + return; } var href = window.location.origin + '/#narrow/stream/' + hash_util.encode_stream_name(stream.name); return '<a class="stream" data-stream-id="' + stream.stream_id + '" ' + diff --git a/static/js/message_list.js b/static/js/message_list.js index e49e6f8828bd3..4b73888cbb725 100644 --- a/static/js/message_list.js +++ b/static/js/message_list.js @@ -100,7 +100,7 @@ exports.MessageList.prototype = { get: function MessageList_get(id) { id = parseFloat(id); if (isNaN(id)) { - return undefined; + return; } return this._hash[id]; }, diff --git a/static/js/narrow_state.js b/static/js/narrow_state.js index af47562ec5a6a..028510671bd86 100644 --- a/static/js/narrow_state.js +++ b/static/js/narrow_state.js @@ -41,7 +41,7 @@ exports.update_email = function (user_id, new_email) { /* Operators we should send to the server. */ exports.public_operators = function () { if (current_filter === undefined) { - return undefined; + return; } return current_filter.public_operators(); }; @@ -96,7 +96,7 @@ exports.set_compose_defaults = function () { exports.stream = function () { if (current_filter === undefined) { - return undefined; + return; } var stream_operands = current_filter.operands("stream"); if (stream_operands.length === 1) { @@ -106,18 +106,18 @@ exports.stream = function () { // name (considering renames and capitalization). 
return stream_data.get_name(name); } - return undefined; + return; }; exports.topic = function () { if (current_filter === undefined) { - return undefined; + return; } var operands = current_filter.operands("topic"); if (operands.length === 1) { return operands[0]; } - return undefined; + return; }; exports.pm_string = function () { diff --git a/static/js/people.js b/static/js/people.js index 9db02b7606040..c64ffd6047bd5 100644 --- a/static/js/people.js +++ b/static/js/people.js @@ -35,7 +35,7 @@ exports.init(); exports.get_person_from_user_id = function (user_id) { if (!people_by_user_id_dict.has(user_id)) { blueslip.error('Unknown user_id in get_person_from_user_id: ' + user_id); - return undefined; + return; } return people_by_user_id_dict.get(user_id); }; @@ -44,7 +44,7 @@ exports.get_by_email = function (email) { var person = people_dict.get(email); if (!person) { - return undefined; + return; } if (person.email.toLowerCase() !== email.toLowerCase()) { @@ -91,12 +91,12 @@ exports.get_user_id = function (email) { if (person === undefined) { var error_msg = 'Unknown email for get_user_id: ' + email; blueslip.error(error_msg); - return undefined; + return; } var user_id = person.user_id; if (!user_id) { blueslip.error('No user_id found for ' + email); - return undefined; + return; } return user_id; @@ -555,7 +555,7 @@ exports.is_valid_email_for_compose = function (email) { exports.get_active_user_for_email = function (email) { var person = people.get_by_email(email); if (!person) { - return undefined; + return; } return active_user_dict.get(person.user_id); }; @@ -596,7 +596,7 @@ exports.get_active_user_ids = function () { exports.is_cross_realm_email = function (email) { var person = people.get_by_email(email); if (!person) { - return undefined; + return; } return cross_realm_dict.has(person.user_id); }; diff --git a/static/js/stream_data.js b/static/js/stream_data.js index 1ac17478782c1..e5374022aa137 100644 --- a/static/js/stream_data.js +++ b/static/js/stream_data.js @@ -370,7 +370,7 @@ exports.user_is_subscribed = function (stream_name, user_email) { // subscribed, we can't keep track of the subscriber list in general, // so we return undefined (treated as falsy if not explicitly handled). 
blueslip.warn("We got a user_is_subscribed call for a non-existent or unsubscribed stream."); - return undefined; + return; } var user_id = people.get_user_id(user_email); if (!user_id) { @@ -529,7 +529,7 @@ exports.get_newbie_stream = function () { return page_params.notifications_stream; } - return undefined; + return; }; exports.remove_default_stream = function (stream_id) { diff --git a/static/js/typing.js b/static/js/typing.js index f364566d91678..86b669f3ef892 100644 --- a/static/js/typing.js +++ b/static/js/typing.js @@ -24,7 +24,7 @@ function send_typing_notification_ajax(recipients, operation) { function get_recipient() { var compose_recipient = compose_state.recipient(); if (compose_recipient === "") { - return undefined; + return; } return compose_recipient; } diff --git a/static/js/user_groups.js b/static/js/user_groups.js index cdb9cd8281646..1b63815b323da 100644 --- a/static/js/user_groups.js +++ b/static/js/user_groups.js @@ -31,7 +31,7 @@ exports.remove = function (user_group) { exports.get_user_group_from_id = function (group_id) { if (!user_group_by_id_dict.has(group_id)) { blueslip.error('Unknown group_id in get_user_group_from_id: ' + group_id); - return undefined; + return; } return user_group_by_id_dict.get(group_id); }; diff --git a/version.py b/version.py index 72fea154e844d..96d25f384546d 100644 --- a/version.py +++ b/version.py @@ -8,4 +8,4 @@ # Typically, adding a dependency only requires a minor version bump, and # removing a dependency requires a major version bump. -PROVISION_VERSION = '15.9' +PROVISION_VERSION = '15.10' diff --git a/yarn.lock b/yarn.lock index 04d346301a3a7..27c03618df4f7 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1731,6 +1731,10 @@ escope@^3.6.0: esrecurse "^4.1.0" estraverse "^4.1.1" +eslint-plugin-empty-returns@^1.0.1: + version "1.0.1" + resolved "https://registry.yarnpkg.com/eslint-plugin-empty-returns/-/eslint-plugin-empty-returns-1.0.1.tgz#ca19faa501e114812577db68ec6882ea48c40a27" + [email protected]: version "3.9.1" resolved "https://registry.yarnpkg.com/eslint/-/eslint-3.9.1.tgz#5a8597706fc6048bc6061ac754d4a211d28f4f5b" @@ -3994,13 +3998,13 @@ mapbox-gl-function@^1.2.1: version "1.3.0" resolved "https://registry.yarnpkg.com/mapbox-gl-function/-/mapbox-gl-function-1.3.0.tgz#cee3d95750c189d45e83ab41a0a57fc2a8a509bc" -"mapbox-gl-shaders@github:mapbox/mapbox-gl-shaders#de2ab007455aa2587c552694c68583f94c9f2747": +mapbox-gl-shaders@mapbox/mapbox-gl-shaders#de2ab007455aa2587c552694c68583f94c9f2747: version "1.0.0" resolved "https://codeload.github.com/mapbox/mapbox-gl-shaders/tar.gz/de2ab007455aa2587c552694c68583f94c9f2747" dependencies: brfs "^1.4.0" -"mapbox-gl-style-spec@github:mapbox/mapbox-gl-style-spec#83b1a3e5837d785af582efd5ed1a212f2df6a4ae": +mapbox-gl-style-spec@mapbox/mapbox-gl-style-spec#83b1a3e5837d785af582efd5ed1a212f2df6a4ae: version "8.8.0" resolved "https://codeload.github.com/mapbox/mapbox-gl-style-spec/tar.gz/83b1a3e5837d785af582efd5ed1a212f2df6a4ae" dependencies:
pex-tool__pex-2240
Release 2.1.146 On the docket: + [x] Fix non executable venv sys path bug #2236
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.145\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.146\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.md b/CHANGES.md index e0b86c7d2..e4578df2c 100644 --- a/CHANGES.md +++ b/CHANGES.md @@ -1,5 +1,12 @@ # Release Notes +## 2.1.146 + +This release brings a fix by new contributor @yjabri for the `__pex__` +import hook that gets it working properly for `--venv` mode PEXes. + +* Fix non executable venv sys path bug (#2236) + ## 2.1.145 This release broadens the range of the `flit-core` build system Pex uses diff --git a/pex/version.py b/pex/version.py index 1fff73a2e..79b99d81d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.145" +__version__ = "2.1.146"
pex-tool__pex-2042
Release 2.1.121 On the docket: + [x] Building Pex with requirements.txt that includes local directory + Python version specifier fails #2037 + [x] Failed to resolve compatible distributions when building Pex from .whl with local dependencies #2038
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.120\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.121\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 992b31e92..052808652 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,18 @@ Release Notes ============= +2.1.121 +------- + +This release fixes two bugs brought to light trying to interoperate with +Poetry projects. + +* Support space separated markers in URL reqs. (#2039) + `PR #2039 <https://github.com/pantsbuild/pex/pull/2039>`_ + +* Handle file:// URL deps in distributions. (#2041) + `PR #2041 <https://github.com/pantsbuild/pex/pull/2041>`_ + 2.1.120 ------- diff --git a/pex/version.py b/pex/version.py index 85c867798..2513fd6e8 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.120" +__version__ = "2.1.121"
pex-tool__pex-2245
Release 2.1.147 On the docket: + [x] pex does not use .pip/pip.conf to resolve packages #336 / #838
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.146\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.147\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.md b/CHANGES.md index e4578df2c..70201b9bb 100644 --- a/CHANGES.md +++ b/CHANGES.md @@ -1,5 +1,13 @@ # Release Notes +## 2.1.147 + +Add support for `--use-pip-config` to allow the Pip Pex calls to read +`PIP_*` env vars and Pip configuration files. This can be particularly +useful for picking up custom index configuration (including auth). + +* Add support for --use-pip-config. (#2243) + ## 2.1.146 This release brings a fix by new contributor @yjabri for the `__pex__` diff --git a/pex/version.py b/pex/version.py index 79b99d81d..34e32d6eb 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.146" +__version__ = "2.1.147"
pex-tool__pex-1947
Release 2.1.110 On the docket: + [x] PEX runtime sys.path scrubbing is imperfect. #1944
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.109\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.110\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index cc087a1ee..79be2e8ea 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,17 @@ Release Notes ============= +2.1.110 +------- + +This release fixes Pex runtime ``sys.path`` scrubbing for cases where +Pex is not the main entry point. An important example of this is in +Lambdex where the AWS Lambda Python runtime packages (``boto3`` and +``botocore``) are leaked into the PEX runtime ``sys.path``. + +* Fix ``sys.path`` scrubbing. (#1946) + `PR #1946 <https://github.com/pantsbuild/pex/pull/1946>`_ + 2.1.109 ------- diff --git a/pex/version.py b/pex/version.py index 32f577f51..6e6ad76d6 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.109" +__version__ = "2.1.110"
ivy-llc__ivy-26758
igamma
[ { "content": "# global\nfrom typing import Any\nimport itertools\nimport string\nimport builtins\n\n# local\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes, frontend_outputs_to_ivy_arrays\n\n_slice = builtins.slice\n\n\n# --- Helpers --- #\n# --------------- #\n\n\ndef _argsort_tuple(the_tuple):\n return tuple(i for i, _ in sorted(enumerate(the_tuple), key=lambda x: x[1]))\n\n\ndef _conv_transpose_padding(k, s, padding):\n if padding == \"SAME\":\n pad_len = k + s - 2\n if s > k - 1:\n pad_a = k - 1\n else:\n pad_a = int(ivy.to_scalar(ivy.ceil(pad_len / 2)))\n elif padding == \"VALID\":\n pad_len = k + s - 2 + ivy.to_scalar(ivy.maximum(k - s, 0))\n pad_a = k - 1\n else:\n raise ValueError(\"Padding mode must be `SAME` or `VALID`.\")\n pad_b = pad_len - pad_a\n return pad_a, pad_b\n\n\ndef _dimension_numbers(dimension_numbers, lhs_len, transp=False):\n if dimension_numbers is None:\n if transp:\n iota = (0, lhs_len - 1, *range(1, lhs_len - 1))\n iotb = (lhs_len - 1, lhs_len - 2, *range(0, lhs_len - 2))\n return iota, iotb, iota\n else:\n iota = tuple(range(lhs_len))\n return iota, iota, iota\n elif isinstance(dimension_numbers[0], (tuple, list)):\n return dimension_numbers\n else:\n lhs_spec, rhs_spec, out_spec = dimension_numbers\n\n def getperm(spec, charpair):\n spatial = (i for i, c in enumerate(spec) if c not in charpair)\n if spec is not rhs_spec:\n spatial = sorted(spatial, key=lambda i: rhs_spec.index(spec[i]))\n return (spec.index(charpair[0]), spec.index(charpair[1])) + tuple(spatial)\n\n charpairs = (\"N\", \"C\"), (\"O\", \"I\"), (\"N\", \"C\")\n lhs_spec, rhs_spec, out_spec = map(getperm, dimension_numbers, charpairs)\n return lhs_spec, rhs_spec, out_spec\n\n\n# --- Main --- #\n# ------------ #\n\n\n@to_ivy_arrays_and_back\ndef abs(x):\n return ivy.abs(x)\n\n\n@to_ivy_arrays_and_back\ndef acos(x):\n return ivy.acos(x)\n\n\n@to_ivy_arrays_and_back\ndef add(x, y):\n return ivy.add(x, y)\n\n\n@to_ivy_arrays_and_back\ndef argmax(operand, axis, index_dtype):\n return ivy.astype(ivy.argmax(operand, axis=axis), index_dtype)\n\n\n@to_ivy_arrays_and_back\ndef argmin(operand, axis, index_dtype):\n return ivy.astype(ivy.argmin(operand, axis=axis), index_dtype)\n\n\n@to_ivy_arrays_and_back\ndef asin(x):\n return ivy.asin(x)\n\n\n@to_ivy_arrays_and_back\ndef asinh(x):\n return ivy.asinh(x)\n\n\n@to_ivy_arrays_and_back\ndef atan(x):\n return ivy.atan(x)\n\n\n@to_ivy_arrays_and_back\ndef atan2(x, y):\n return ivy.atan2(x, y)\n\n\n@to_ivy_arrays_and_back\ndef atanh(x):\n return ivy.atanh(x)\n\n\n@to_ivy_arrays_and_back\ndef batch_matmul(lhs, rhs, precision=None):\n if lhs.ndim < 2 or rhs.ndim < 2:\n raise ValueError(\n f\"Arguments to batch_matmul must be at least 2D, got {lhs.ndim}, {rhs.ndim}\"\n )\n if lhs.ndim != rhs.ndim:\n raise ValueError(\n f\"Arguments to batch_matmul must have same ndim, got {lhs.ndim}, {rhs.ndim}\"\n )\n return ivy.matmul(lhs, rhs).astype(lhs.dtype)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_and(x, y):\n return ivy.bitwise_and(x, y)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_not(x):\n return ivy.bitwise_invert(x)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_or(x, y):\n return ivy.bitwise_or(x, y)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_xor(x, y):\n return ivy.bitwise_xor(x, y)\n\n\n@to_ivy_arrays_and_back\ndef broadcast(operand, sizes):\n ret = ivy.zeros(tuple(sizes) + tuple(ivy.shape(operand)), 
dtype=ivy.dtype(operand))\n return ret + operand\n\n\n@with_supported_dtypes(\n {\n \"0.4.17 and below\": (\n \"float16\",\n \"float32\",\n \"float64\",\n )\n },\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef cbrt(x):\n return ivy.pow(x, 1 / 3)\n\n\n@to_ivy_arrays_and_back\ndef ceil(x):\n return ivy.ceil(x)\n\n\n@to_ivy_arrays_and_back\ndef clamp(min, x, max):\n return ivy.clip(x, min, max)\n\n\n@to_ivy_arrays_and_back\ndef complex(x, y):\n return ivy.complex(x, y)\n\n\n@to_ivy_arrays_and_back\ndef concatenate(operands, dimension):\n return ivy.concat(operands, axis=dimension)\n\n\n@to_ivy_arrays_and_back\ndef conj(x):\n return ivy.conj(x)\n\n\n@to_ivy_arrays_and_back\ndef conv(\n lhs, rhs, window_strides, padding, precision=None, preferred_element_type=None\n):\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n return ivy.conv_general_dilated(\n lhs,\n rhs,\n window_strides,\n padding,\n dims=dims,\n data_format=\"channel_first\",\n filter_format=\"channel_first\",\n )\n\n\n@to_ivy_arrays_and_back\ndef conv_general_dilated(\n lhs,\n rhs,\n window_strides,\n padding,\n lhs_dilation=None,\n rhs_dilation=None,\n dimension_numbers=None,\n feature_group_count=1,\n batch_group_count=1,\n precision=None,\n preferred_element_type=None,\n):\n # TODO: add support for batch_group_count\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n dim_nums = _dimension_numbers(dimension_numbers, dims + 2)\n rhs_spec = tuple(dim_nums[1][i] for i in (*range(2, dims + 2), 1, 0))\n return ivy.permute_dims(\n ivy.conv_general_dilated(\n ivy.permute_dims(lhs, axes=dim_nums[0]),\n ivy.permute_dims(rhs, axes=rhs_spec),\n window_strides,\n padding,\n dims=dims,\n data_format=\"channel_first\",\n x_dilations=1 if lhs_dilation is None else lhs_dilation,\n dilations=1 if rhs_dilation is None else rhs_dilation,\n feature_group_count=feature_group_count,\n ),\n axes=_argsort_tuple(dim_nums[2]),\n )\n\n\n@to_ivy_arrays_and_back\ndef conv_transpose(\n lhs,\n rhs,\n strides,\n padding,\n rhs_dilation=None,\n dimension_numbers=None,\n transpose_kernel=False,\n precision=None,\n preferred_element_type=None,\n):\n # TODO: add support for transpose_kernel\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n dim_nums = _dimension_numbers(dimension_numbers, dims + 2, transp=True)\n rhs_spec = tuple(dim_nums[1][i] for i in (*range(2, dims + 2), 1, 0))\n rhs_dilation = 1 if rhs_dilation is None else rhs_dilation\n if isinstance(padding, str):\n k_sdims = [rhs.shape[i] for i in rhs_spec[:-2]]\n effective_k_size = map(lambda k, r: (k - 1) * r + 1, k_sdims, rhs_dilation)\n padding = [\n _conv_transpose_padding(k, s, padding)\n for k, s in zip(effective_k_size, strides)\n ]\n return ivy.permute_dims(\n ivy.conv_general_dilated(\n ivy.permute_dims(lhs, axes=dim_nums[0]),\n ivy.permute_dims(rhs, axes=rhs_spec),\n 1,\n padding,\n dilations=rhs_dilation,\n x_dilations=strides,\n dims=dims,\n data_format=\"channel_first\",\n ),\n axes=_argsort_tuple(dim_nums[2]),\n )\n\n\n@to_ivy_arrays_and_back\ndef convert_element_type(operand, new_dtype):\n return ivy.astype(operand, new_dtype, copy=False)\n\n\n@to_ivy_arrays_and_back\ndef cos(x):\n return ivy.cos(x)\n\n\n@to_ivy_arrays_and_back\ndef cosh(x):\n return 
ivy.cosh(x)\n\n\n@with_unsupported_dtypes(\n {\"0.4.17 and below\": (\"bfloat16\", \"float16\", \"bool\", \"complex64\", \"complex128\")},\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef cummin(operand, axis=0, reverse=False):\n return ivy.cummin(operand, axis=axis, reverse=reverse, dtype=operand.dtype)\n\n\n@to_ivy_arrays_and_back\ndef cumprod(operand, axis=None, reverse=False):\n dtype = ivy.dtype(operand)\n return ivy.cumprod(operand, axis=axis, reverse=reverse).astype(dtype)\n\n\n@to_ivy_arrays_and_back\ndef cumsum(operand, axis=None, reverse=False):\n if reverse:\n return ivy.flip(ivy.cumsum(ivy.flip(operand), axis=axis, dtype=operand.dtype))\n return ivy.cumsum(operand, axis=axis, dtype=operand.dtype)\n\n\n@to_ivy_arrays_and_back\ndef div(x, y):\n return ivy.astype(ivy.divide(x, y), x.dtype)\n\n\n@to_ivy_arrays_and_back\ndef dot(lhs, rhs, precision=None, preferred_element_type=None):\n ret = ivy.matmul(lhs, rhs)\n if preferred_element_type:\n ret = ivy.astype(ret, preferred_element_type, copy=False)\n return ret\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"bool\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef dot_general(\n lhs, rhs, dimension_numbers, precision=None, preferred_element_type=None\n):\n (lhs_contracting, rhs_contracting), (lhs_batch, rhs_batch) = dimension_numbers\n ivy.utils.assertions.check_less(\n len(lhs.shape),\n 52,\n \"number of dimensions greater than 52 is not supported\",\n as_array=False,\n )\n new_id = itertools.count()\n lhs_axis_ids = [next(new_id) for _ in lhs.shape]\n rhs_axis_ids = [next(new_id) for _ in rhs.shape]\n lhs_out_axis_ids = lhs_axis_ids[:]\n rhs_out_axis_ids = rhs_axis_ids[:]\n for lhs_axis, rhs_axis in zip(lhs_contracting, rhs_contracting):\n shared_id = next(new_id)\n lhs_axis_ids[lhs_axis] = shared_id\n rhs_axis_ids[rhs_axis] = shared_id\n lhs_out_axis_ids[lhs_axis] = None\n rhs_out_axis_ids[rhs_axis] = None\n batch_ids = []\n for lhs_axis, rhs_axis in zip(lhs_batch, rhs_batch):\n shared_id = next(new_id)\n lhs_axis_ids[lhs_axis] = shared_id\n rhs_axis_ids[rhs_axis] = shared_id\n lhs_out_axis_ids[lhs_axis] = None\n rhs_out_axis_ids[rhs_axis] = None\n batch_ids.append(shared_id)\n out_axis_ids = list(\n filter(lambda x: x is not None, batch_ids + lhs_out_axis_ids + rhs_out_axis_ids)\n )\n char_list = [*string.ascii_letters]\n lhs_axis_ids = \"\".join(str(char_list[i]) for i in lhs_axis_ids)\n rhs_axis_ids = \"\".join(str(char_list[i]) for i in rhs_axis_ids)\n out_axis_ids = \"\".join(str(char_list[i]) for i in out_axis_ids)\n equ_str = f\"{lhs_axis_ids},{rhs_axis_ids}->{out_axis_ids}\"\n ret = ivy.einsum(equ_str, lhs, rhs)\n if preferred_element_type:\n ret = ivy.astype(ret, preferred_element_type, copy=False)\n return ret\n\n\n@to_ivy_arrays_and_back\ndef eq(x, y):\n return ivy.equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef erf(x):\n return ivy.erf(x)\n\n\n@with_supported_dtypes(\n {\n \"0.4.17 and below\": (\n \"float16\",\n \"float32\",\n \"float64\",\n )\n },\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef erfc(x):\n value = ivy.erf(x)\n value = (1.0 - value) if value is not None else None\n return value\n\n\n@to_ivy_arrays_and_back\ndef exp(x):\n return ivy.exp(x)\n\n\n@to_ivy_arrays_and_back\ndef expand_dims(array, dimensions):\n return ivy.expand_dims(array, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef expm1(x):\n return ivy.expm1(x)\n\n\n@to_ivy_arrays_and_back\ndef full(shape, fill_value, dtype=None):\n return ivy.full(shape, fill_value, dtype=dtype)\n\n\n@to_ivy_arrays_and_back\ndef full_like(x, fill_value, dtype=None, 
shape=None):\n if shape is None:\n return ivy.full_like(x, fill_value, dtype=dtype)\n return ivy.full(shape, fill_value, dtype=dtype)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef ge(x, y):\n return ivy.greater_equal(x, y)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef gt(x, y):\n return ivy.greater(x, y)\n\n\n@to_ivy_arrays_and_back\ndef imag(x):\n return ivy.imag(x)\n\n\n@with_unsupported_dtypes(\n {\"0.4.17 and below\": (\"bool\", \"bfloat16\")},\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef iota(dtype, size):\n return ivy.arange(0, size, dtype=dtype)\n\n\n@to_ivy_arrays_and_back\ndef is_finite(x):\n return ivy.isfinite(x)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef le(x, y):\n return ivy.less_equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef log(x):\n return ivy.log(x)\n\n\n@to_ivy_arrays_and_back\ndef log1p(x):\n return ivy.log1p(x)\n\n\n@to_ivy_arrays_and_back\ndef lt(x, y):\n return ivy.less(x, y)\n\n\n@to_ivy_arrays_and_back\ndef max(x: Any, y: Any):\n return ivy.maximum(x, y)\n\n\n@to_ivy_arrays_and_back\ndef min(x, y):\n return ivy.minimum(x, y)\n\n\n@to_ivy_arrays_and_back\ndef mul(x, y):\n return ivy.multiply(x, y)\n\n\n@to_ivy_arrays_and_back\ndef ne(x, y):\n return ivy.not_equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef neg(x):\n return ivy.negative(x)\n\n\n@to_ivy_arrays_and_back\ndef nextafter(x1, x2):\n return ivy.nextafter(x1, x2)\n\n\n@to_ivy_arrays_and_back\ndef pad(operand, padding_value, padding_config):\n return ivy.pad(\n operand, padding_config, mode=\"dilated\", constant_values=padding_value\n )\n\n\n@to_ivy_arrays_and_back\ndef pow(x, y):\n return ivy.pow(x, y)\n\n\n@to_ivy_arrays_and_back\ndef real(x):\n return ivy.real(x)\n\n\n@to_ivy_arrays_and_back\ndef reciprocal(x):\n return ivy.reciprocal(x)\n\n\n@to_ivy_arrays_and_back\ndef reduce_window(\n operand,\n init_value,\n computation,\n window_dimensions,\n window_strides,\n padding,\n base_dilation=None,\n window_dilation=None,\n):\n computation = frontend_outputs_to_ivy_arrays(computation)\n return ivy.reduce_window(\n operand,\n init_value,\n computation,\n window_dimensions,\n window_strides=window_strides,\n padding=padding,\n base_dilation=base_dilation,\n window_dilation=window_dilation,\n )\n\n\n@to_ivy_arrays_and_back\ndef rem(x, y):\n return ivy.remainder(ivy.abs(x), ivy.abs(y)) * ivy.sign(x)\n\n\n@to_ivy_arrays_and_back\ndef reshape(operand, new_sizes, dimensions=None):\n if dimensions:\n operand = ivy.permute_dims(operand, dimensions)\n return ivy.reshape(operand, new_sizes)\n\n\n@to_ivy_arrays_and_back\ndef rev(operand, dimensions):\n return ivy.flip(operand, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef round(x, rounding_method=1):\n if rounding_method == 0:\n ret = ivy.where(\n ivy.less(x, 0),\n ivy.ceil(x) - (ivy.ceil(x) - ivy.floor(x)),\n ivy.ceil(x),\n )\n elif rounding_method == 1:\n ret = ivy.ceil(x)\n ret = ivy.where(ivy.remainder(ret, 2) == 0, ret, ret - 1)\n return ivy.where(ivy.abs(x - ivy.floor(x) - 0.5) < 1e-7, ret, ivy.round(x))\n\n\n@to_ivy_arrays_and_back\ndef rsqrt(x):\n return ivy.reciprocal(ivy.sqrt(x))\n\n\n@to_ivy_arrays_and_back\ndef select(pred, on_true, on_false):\n return ivy.where(pred, on_true, on_false)\n\n\n@to_ivy_arrays_and_back\ndef shift_left(x, y):\n return ivy.bitwise_left_shift(x, y)\n\n\n@to_ivy_arrays_and_back\ndef shift_right_logical(x, y):\n return ivy.bitwise_right_shift(x, 
y)\n\n\n@to_ivy_arrays_and_back\ndef sign(x):\n return ivy.sign(x, np_variant=False)\n\n\n@to_ivy_arrays_and_back\ndef sin(x):\n return ivy.sin(x)\n\n\n@to_ivy_arrays_and_back\ndef sinh(x):\n return ivy.sinh(x)\n\n\n@to_ivy_arrays_and_back\ndef slice(operand, start_indices, limit_indices, strides=None):\n strides = [1] * len(operand.shape) if strides is None else strides\n\n full_slice = ()\n for i, _ in enumerate(operand.shape):\n strides_i = int(strides[i])\n start_i = int(start_indices[i])\n limit_i = int(limit_indices[i])\n full_slice += (_slice(start_i, limit_i, strides_i),)\n return operand[full_slice]\n\n\n@to_ivy_arrays_and_back\ndef slice_in_dim(operand, start_index, limit_index, stride=1, axis=0):\n start_indices = [0] * operand.ndim\n limit_indices = list(operand.shape)\n strides = [1] * operand.ndim\n\n len_axis = operand.shape[axis]\n start_index_int = start_index if start_index is not None else 0\n limit_index_int = limit_index if limit_index is not None else len_axis\n\n if start_index_int < 0:\n start_index_int = start_index_int + len_axis\n if limit_index_int < 0:\n limit_index_int = limit_index_int + len_axis\n\n axis = int(axis)\n start_indices[axis] = start_index_int\n limit_indices[axis] = limit_index_int\n strides[axis] = int(stride)\n return slice(operand, start_indices, limit_indices, strides)\n\n\n@to_ivy_arrays_and_back\ndef sort(operand, dimension=-1, is_stable=True, num_keys=1):\n return ivy.sort(operand, axis=dimension, stable=is_stable)\n\n\n@to_ivy_arrays_and_back\ndef sqrt(x):\n return ivy.sqrt(x)\n\n\n@to_ivy_arrays_and_back\ndef square(x):\n return ivy.square(x)\n\n\n@to_ivy_arrays_and_back\ndef squeeze(array, dimensions):\n return ivy.squeeze(array, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef sub(x, y):\n return ivy.subtract(x, y)\n\n\n@to_ivy_arrays_and_back\ndef tan(x):\n return ivy.tan(x)\n\n\n@to_ivy_arrays_and_back\ndef tie_in(x, y):\n return y\n\n\n# top_k\n@to_ivy_arrays_and_back\ndef top_k(operand, k):\n values, indices = ivy.top_k(operand, k, axis=-1)\n indices = ivy.astype(indices, ivy.int32, copy=False)\n return [values, indices]\n\n\n@to_ivy_arrays_and_back\ndef transpose(operand, permutation):\n return ivy.permute_dims(operand, permutation)\n", "path": "ivy/functional/frontends/jax/lax/operators.py" } ]
[ { "content": "# global\nfrom typing import Any\nimport itertools\nimport string\nimport builtins\n\n# local\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes, frontend_outputs_to_ivy_arrays\n\n_slice = builtins.slice\n\n\n# --- Helpers --- #\n# --------------- #\n\n\ndef _argsort_tuple(the_tuple):\n return tuple(i for i, _ in sorted(enumerate(the_tuple), key=lambda x: x[1]))\n\n\ndef _conv_transpose_padding(k, s, padding):\n if padding == \"SAME\":\n pad_len = k + s - 2\n if s > k - 1:\n pad_a = k - 1\n else:\n pad_a = int(ivy.to_scalar(ivy.ceil(pad_len / 2)))\n elif padding == \"VALID\":\n pad_len = k + s - 2 + ivy.to_scalar(ivy.maximum(k - s, 0))\n pad_a = k - 1\n else:\n raise ValueError(\"Padding mode must be `SAME` or `VALID`.\")\n pad_b = pad_len - pad_a\n return pad_a, pad_b\n\n\ndef _dimension_numbers(dimension_numbers, lhs_len, transp=False):\n if dimension_numbers is None:\n if transp:\n iota = (0, lhs_len - 1, *range(1, lhs_len - 1))\n iotb = (lhs_len - 1, lhs_len - 2, *range(0, lhs_len - 2))\n return iota, iotb, iota\n else:\n iota = tuple(range(lhs_len))\n return iota, iota, iota\n elif isinstance(dimension_numbers[0], (tuple, list)):\n return dimension_numbers\n else:\n lhs_spec, rhs_spec, out_spec = dimension_numbers\n\n def getperm(spec, charpair):\n spatial = (i for i, c in enumerate(spec) if c not in charpair)\n if spec is not rhs_spec:\n spatial = sorted(spatial, key=lambda i: rhs_spec.index(spec[i]))\n return (spec.index(charpair[0]), spec.index(charpair[1])) + tuple(spatial)\n\n charpairs = (\"N\", \"C\"), (\"O\", \"I\"), (\"N\", \"C\")\n lhs_spec, rhs_spec, out_spec = map(getperm, dimension_numbers, charpairs)\n return lhs_spec, rhs_spec, out_spec\n\n\n# --- Main --- #\n# ------------ #\n\n\n@to_ivy_arrays_and_back\ndef abs(x):\n return ivy.abs(x)\n\n\n@to_ivy_arrays_and_back\ndef acos(x):\n return ivy.acos(x)\n\n\n@to_ivy_arrays_and_back\ndef add(x, y):\n return ivy.add(x, y)\n\n\n@to_ivy_arrays_and_back\ndef argmax(operand, axis, index_dtype):\n return ivy.astype(ivy.argmax(operand, axis=axis), index_dtype)\n\n\n@to_ivy_arrays_and_back\ndef argmin(operand, axis, index_dtype):\n return ivy.astype(ivy.argmin(operand, axis=axis), index_dtype)\n\n\n@to_ivy_arrays_and_back\ndef asin(x):\n return ivy.asin(x)\n\n\n@to_ivy_arrays_and_back\ndef asinh(x):\n return ivy.asinh(x)\n\n\n@to_ivy_arrays_and_back\ndef atan(x):\n return ivy.atan(x)\n\n\n@to_ivy_arrays_and_back\ndef atan2(x, y):\n return ivy.atan2(x, y)\n\n\n@to_ivy_arrays_and_back\ndef atanh(x):\n return ivy.atanh(x)\n\n\n@to_ivy_arrays_and_back\ndef batch_matmul(lhs, rhs, precision=None):\n if lhs.ndim < 2 or rhs.ndim < 2:\n raise ValueError(\n f\"Arguments to batch_matmul must be at least 2D, got {lhs.ndim}, {rhs.ndim}\"\n )\n if lhs.ndim != rhs.ndim:\n raise ValueError(\n f\"Arguments to batch_matmul must have same ndim, got {lhs.ndim}, {rhs.ndim}\"\n )\n return ivy.matmul(lhs, rhs).astype(lhs.dtype)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_and(x, y):\n return ivy.bitwise_and(x, y)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_not(x):\n return ivy.bitwise_invert(x)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_or(x, y):\n return ivy.bitwise_or(x, y)\n\n\n@to_ivy_arrays_and_back\ndef bitwise_xor(x, y):\n return ivy.bitwise_xor(x, y)\n\n\n@to_ivy_arrays_and_back\ndef broadcast(operand, sizes):\n ret = ivy.zeros(tuple(sizes) + tuple(ivy.shape(operand)), 
dtype=ivy.dtype(operand))\n return ret + operand\n\n\n@with_supported_dtypes(\n {\n \"0.4.17 and below\": (\n \"float16\",\n \"float32\",\n \"float64\",\n )\n },\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef cbrt(x):\n return ivy.pow(x, 1 / 3)\n\n\n@to_ivy_arrays_and_back\ndef ceil(x):\n return ivy.ceil(x)\n\n\n@to_ivy_arrays_and_back\ndef clamp(min, x, max):\n return ivy.clip(x, min, max)\n\n\n@to_ivy_arrays_and_back\ndef complex(x, y):\n return ivy.complex(x, y)\n\n\n@to_ivy_arrays_and_back\ndef concatenate(operands, dimension):\n return ivy.concat(operands, axis=dimension)\n\n\n@to_ivy_arrays_and_back\ndef conj(x):\n return ivy.conj(x)\n\n\n@to_ivy_arrays_and_back\ndef conv(\n lhs, rhs, window_strides, padding, precision=None, preferred_element_type=None\n):\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n return ivy.conv_general_dilated(\n lhs,\n rhs,\n window_strides,\n padding,\n dims=dims,\n data_format=\"channel_first\",\n filter_format=\"channel_first\",\n )\n\n\n@to_ivy_arrays_and_back\ndef conv_general_dilated(\n lhs,\n rhs,\n window_strides,\n padding,\n lhs_dilation=None,\n rhs_dilation=None,\n dimension_numbers=None,\n feature_group_count=1,\n batch_group_count=1,\n precision=None,\n preferred_element_type=None,\n):\n # TODO: add support for batch_group_count\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n dim_nums = _dimension_numbers(dimension_numbers, dims + 2)\n rhs_spec = tuple(dim_nums[1][i] for i in (*range(2, dims + 2), 1, 0))\n return ivy.permute_dims(\n ivy.conv_general_dilated(\n ivy.permute_dims(lhs, axes=dim_nums[0]),\n ivy.permute_dims(rhs, axes=rhs_spec),\n window_strides,\n padding,\n dims=dims,\n data_format=\"channel_first\",\n x_dilations=1 if lhs_dilation is None else lhs_dilation,\n dilations=1 if rhs_dilation is None else rhs_dilation,\n feature_group_count=feature_group_count,\n ),\n axes=_argsort_tuple(dim_nums[2]),\n )\n\n\n@to_ivy_arrays_and_back\ndef conv_transpose(\n lhs,\n rhs,\n strides,\n padding,\n rhs_dilation=None,\n dimension_numbers=None,\n transpose_kernel=False,\n precision=None,\n preferred_element_type=None,\n):\n # TODO: add support for transpose_kernel\n if preferred_element_type:\n lhs = ivy.astype(lhs, preferred_element_type)\n rhs = ivy.astype(rhs, preferred_element_type)\n dims = len(lhs.shape) - 2\n dim_nums = _dimension_numbers(dimension_numbers, dims + 2, transp=True)\n rhs_spec = tuple(dim_nums[1][i] for i in (*range(2, dims + 2), 1, 0))\n rhs_dilation = 1 if rhs_dilation is None else rhs_dilation\n if isinstance(padding, str):\n k_sdims = [rhs.shape[i] for i in rhs_spec[:-2]]\n effective_k_size = map(lambda k, r: (k - 1) * r + 1, k_sdims, rhs_dilation)\n padding = [\n _conv_transpose_padding(k, s, padding)\n for k, s in zip(effective_k_size, strides)\n ]\n return ivy.permute_dims(\n ivy.conv_general_dilated(\n ivy.permute_dims(lhs, axes=dim_nums[0]),\n ivy.permute_dims(rhs, axes=rhs_spec),\n 1,\n padding,\n dilations=rhs_dilation,\n x_dilations=strides,\n dims=dims,\n data_format=\"channel_first\",\n ),\n axes=_argsort_tuple(dim_nums[2]),\n )\n\n\n@to_ivy_arrays_and_back\ndef convert_element_type(operand, new_dtype):\n return ivy.astype(operand, new_dtype, copy=False)\n\n\n@to_ivy_arrays_and_back\ndef cos(x):\n return ivy.cos(x)\n\n\n@to_ivy_arrays_and_back\ndef cosh(x):\n return 
ivy.cosh(x)\n\n\n@with_unsupported_dtypes(\n {\"0.4.17 and below\": (\"bfloat16\", \"float16\", \"bool\", \"complex64\", \"complex128\")},\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef cummin(operand, axis=0, reverse=False):\n return ivy.cummin(operand, axis=axis, reverse=reverse, dtype=operand.dtype)\n\n\n@to_ivy_arrays_and_back\ndef cumprod(operand, axis=None, reverse=False):\n dtype = ivy.dtype(operand)\n return ivy.cumprod(operand, axis=axis, reverse=reverse).astype(dtype)\n\n\n@to_ivy_arrays_and_back\ndef cumsum(operand, axis=None, reverse=False):\n if reverse:\n return ivy.flip(ivy.cumsum(ivy.flip(operand), axis=axis, dtype=operand.dtype))\n return ivy.cumsum(operand, axis=axis, dtype=operand.dtype)\n\n\n@to_ivy_arrays_and_back\ndef div(x, y):\n return ivy.astype(ivy.divide(x, y), x.dtype)\n\n\n@to_ivy_arrays_and_back\ndef dot(lhs, rhs, precision=None, preferred_element_type=None):\n ret = ivy.matmul(lhs, rhs)\n if preferred_element_type:\n ret = ivy.astype(ret, preferred_element_type, copy=False)\n return ret\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"bool\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef dot_general(\n lhs, rhs, dimension_numbers, precision=None, preferred_element_type=None\n):\n (lhs_contracting, rhs_contracting), (lhs_batch, rhs_batch) = dimension_numbers\n ivy.utils.assertions.check_less(\n len(lhs.shape),\n 52,\n \"number of dimensions greater than 52 is not supported\",\n as_array=False,\n )\n new_id = itertools.count()\n lhs_axis_ids = [next(new_id) for _ in lhs.shape]\n rhs_axis_ids = [next(new_id) for _ in rhs.shape]\n lhs_out_axis_ids = lhs_axis_ids[:]\n rhs_out_axis_ids = rhs_axis_ids[:]\n for lhs_axis, rhs_axis in zip(lhs_contracting, rhs_contracting):\n shared_id = next(new_id)\n lhs_axis_ids[lhs_axis] = shared_id\n rhs_axis_ids[rhs_axis] = shared_id\n lhs_out_axis_ids[lhs_axis] = None\n rhs_out_axis_ids[rhs_axis] = None\n batch_ids = []\n for lhs_axis, rhs_axis in zip(lhs_batch, rhs_batch):\n shared_id = next(new_id)\n lhs_axis_ids[lhs_axis] = shared_id\n rhs_axis_ids[rhs_axis] = shared_id\n lhs_out_axis_ids[lhs_axis] = None\n rhs_out_axis_ids[rhs_axis] = None\n batch_ids.append(shared_id)\n out_axis_ids = list(\n filter(lambda x: x is not None, batch_ids + lhs_out_axis_ids + rhs_out_axis_ids)\n )\n char_list = [*string.ascii_letters]\n lhs_axis_ids = \"\".join(str(char_list[i]) for i in lhs_axis_ids)\n rhs_axis_ids = \"\".join(str(char_list[i]) for i in rhs_axis_ids)\n out_axis_ids = \"\".join(str(char_list[i]) for i in out_axis_ids)\n equ_str = f\"{lhs_axis_ids},{rhs_axis_ids}->{out_axis_ids}\"\n ret = ivy.einsum(equ_str, lhs, rhs)\n if preferred_element_type:\n ret = ivy.astype(ret, preferred_element_type, copy=False)\n return ret\n\n\n@to_ivy_arrays_and_back\ndef eq(x, y):\n return ivy.equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef erf(x):\n return ivy.erf(x)\n\n\n@with_supported_dtypes(\n {\n \"0.4.17 and below\": (\n \"float16\",\n \"float32\",\n \"float64\",\n )\n },\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef erfc(x):\n value = ivy.erf(x)\n value = (1.0 - value) if value is not None else None\n return value\n\n\n@to_ivy_arrays_and_back\ndef exp(x):\n return ivy.exp(x)\n\n\n@to_ivy_arrays_and_back\ndef expand_dims(array, dimensions):\n return ivy.expand_dims(array, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef expm1(x):\n return ivy.expm1(x)\n\n\n@to_ivy_arrays_and_back\ndef full(shape, fill_value, dtype=None):\n return ivy.full(shape, fill_value, dtype=dtype)\n\n\n@to_ivy_arrays_and_back\ndef full_like(x, fill_value, dtype=None, 
shape=None):\n if shape is None:\n return ivy.full_like(x, fill_value, dtype=dtype)\n return ivy.full(shape, fill_value, dtype=dtype)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef ge(x, y):\n return ivy.greater_equal(x, y)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef gt(x, y):\n return ivy.greater(x, y)\n\n\n@to_ivy_arrays_and_back\ndef igamma(a, x):\n return ivy.igamma(a, x=x)\n\n\n@to_ivy_arrays_and_back\ndef imag(x):\n return ivy.imag(x)\n\n\n@with_unsupported_dtypes(\n {\"0.4.17 and below\": (\"bool\", \"bfloat16\")},\n \"jax\",\n)\n@to_ivy_arrays_and_back\ndef iota(dtype, size):\n return ivy.arange(0, size, dtype=dtype)\n\n\n@to_ivy_arrays_and_back\ndef is_finite(x):\n return ivy.isfinite(x)\n\n\n@with_unsupported_dtypes({\"0.4.5 and below\": (\"complex\",)}, \"jax\")\n@to_ivy_arrays_and_back\ndef le(x, y):\n return ivy.less_equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef log(x):\n return ivy.log(x)\n\n\n@to_ivy_arrays_and_back\ndef log1p(x):\n return ivy.log1p(x)\n\n\n@to_ivy_arrays_and_back\ndef lt(x, y):\n return ivy.less(x, y)\n\n\n@to_ivy_arrays_and_back\ndef max(x: Any, y: Any):\n return ivy.maximum(x, y)\n\n\n@to_ivy_arrays_and_back\ndef min(x, y):\n return ivy.minimum(x, y)\n\n\n@to_ivy_arrays_and_back\ndef mul(x, y):\n return ivy.multiply(x, y)\n\n\n@to_ivy_arrays_and_back\ndef ne(x, y):\n return ivy.not_equal(x, y)\n\n\n@to_ivy_arrays_and_back\ndef neg(x):\n return ivy.negative(x)\n\n\n@to_ivy_arrays_and_back\ndef nextafter(x1, x2):\n return ivy.nextafter(x1, x2)\n\n\n@to_ivy_arrays_and_back\ndef pad(operand, padding_value, padding_config):\n return ivy.pad(\n operand, padding_config, mode=\"dilated\", constant_values=padding_value\n )\n\n\n@to_ivy_arrays_and_back\ndef pow(x, y):\n return ivy.pow(x, y)\n\n\n@to_ivy_arrays_and_back\ndef real(x):\n return ivy.real(x)\n\n\n@to_ivy_arrays_and_back\ndef reciprocal(x):\n return ivy.reciprocal(x)\n\n\n@to_ivy_arrays_and_back\ndef reduce_window(\n operand,\n init_value,\n computation,\n window_dimensions,\n window_strides,\n padding,\n base_dilation=None,\n window_dilation=None,\n):\n computation = frontend_outputs_to_ivy_arrays(computation)\n return ivy.reduce_window(\n operand,\n init_value,\n computation,\n window_dimensions,\n window_strides=window_strides,\n padding=padding,\n base_dilation=base_dilation,\n window_dilation=window_dilation,\n )\n\n\n@to_ivy_arrays_and_back\ndef rem(x, y):\n return ivy.remainder(ivy.abs(x), ivy.abs(y)) * ivy.sign(x)\n\n\n@to_ivy_arrays_and_back\ndef reshape(operand, new_sizes, dimensions=None):\n if dimensions:\n operand = ivy.permute_dims(operand, dimensions)\n return ivy.reshape(operand, new_sizes)\n\n\n@to_ivy_arrays_and_back\ndef rev(operand, dimensions):\n return ivy.flip(operand, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef round(x, rounding_method=1):\n if rounding_method == 0:\n ret = ivy.where(\n ivy.less(x, 0),\n ivy.ceil(x) - (ivy.ceil(x) - ivy.floor(x)),\n ivy.ceil(x),\n )\n elif rounding_method == 1:\n ret = ivy.ceil(x)\n ret = ivy.where(ivy.remainder(ret, 2) == 0, ret, ret - 1)\n return ivy.where(ivy.abs(x - ivy.floor(x) - 0.5) < 1e-7, ret, ivy.round(x))\n\n\n@to_ivy_arrays_and_back\ndef rsqrt(x):\n return ivy.reciprocal(ivy.sqrt(x))\n\n\n@to_ivy_arrays_and_back\ndef select(pred, on_true, on_false):\n return ivy.where(pred, on_true, on_false)\n\n\n@to_ivy_arrays_and_back\ndef shift_left(x, y):\n return ivy.bitwise_left_shift(x, 
y)\n\n\n@to_ivy_arrays_and_back\ndef shift_right_logical(x, y):\n return ivy.bitwise_right_shift(x, y)\n\n\n@to_ivy_arrays_and_back\ndef sign(x):\n return ivy.sign(x, np_variant=False)\n\n\n@to_ivy_arrays_and_back\ndef sin(x):\n return ivy.sin(x)\n\n\n@to_ivy_arrays_and_back\ndef sinh(x):\n return ivy.sinh(x)\n\n\n@to_ivy_arrays_and_back\ndef slice(operand, start_indices, limit_indices, strides=None):\n strides = [1] * len(operand.shape) if strides is None else strides\n\n full_slice = ()\n for i, _ in enumerate(operand.shape):\n strides_i = int(strides[i])\n start_i = int(start_indices[i])\n limit_i = int(limit_indices[i])\n full_slice += (_slice(start_i, limit_i, strides_i),)\n return operand[full_slice]\n\n\n@to_ivy_arrays_and_back\ndef slice_in_dim(operand, start_index, limit_index, stride=1, axis=0):\n start_indices = [0] * operand.ndim\n limit_indices = list(operand.shape)\n strides = [1] * operand.ndim\n\n len_axis = operand.shape[axis]\n start_index_int = start_index if start_index is not None else 0\n limit_index_int = limit_index if limit_index is not None else len_axis\n\n if start_index_int < 0:\n start_index_int = start_index_int + len_axis\n if limit_index_int < 0:\n limit_index_int = limit_index_int + len_axis\n\n axis = int(axis)\n start_indices[axis] = start_index_int\n limit_indices[axis] = limit_index_int\n strides[axis] = int(stride)\n return slice(operand, start_indices, limit_indices, strides)\n\n\n@to_ivy_arrays_and_back\ndef sort(operand, dimension=-1, is_stable=True, num_keys=1):\n return ivy.sort(operand, axis=dimension, stable=is_stable)\n\n\n@to_ivy_arrays_and_back\ndef sqrt(x):\n return ivy.sqrt(x)\n\n\n@to_ivy_arrays_and_back\ndef square(x):\n return ivy.square(x)\n\n\n@to_ivy_arrays_and_back\ndef squeeze(array, dimensions):\n return ivy.squeeze(array, axis=dimensions)\n\n\n@to_ivy_arrays_and_back\ndef sub(x, y):\n return ivy.subtract(x, y)\n\n\n@to_ivy_arrays_and_back\ndef tan(x):\n return ivy.tan(x)\n\n\n@to_ivy_arrays_and_back\ndef tie_in(x, y):\n return y\n\n\n# top_k\n@to_ivy_arrays_and_back\ndef top_k(operand, k):\n values, indices = ivy.top_k(operand, k, axis=-1)\n indices = ivy.astype(indices, ivy.int32, copy=False)\n return [values, indices]\n\n\n@to_ivy_arrays_and_back\ndef transpose(operand, permutation):\n return ivy.permute_dims(operand, permutation)\n", "path": "ivy/functional/frontends/jax/lax/operators.py" } ]
diff --git a/ivy/functional/frontends/jax/lax/operators.py b/ivy/functional/frontends/jax/lax/operators.py index e456a89ff4e01..488e60b4335e2 100644 --- a/ivy/functional/frontends/jax/lax/operators.py +++ b/ivy/functional/frontends/jax/lax/operators.py @@ -454,6 +454,11 @@ def gt(x, y): return ivy.greater(x, y) +@to_ivy_arrays_and_back +def igamma(a, x): + return ivy.igamma(a, x=x) + + @to_ivy_arrays_and_back def imag(x): return ivy.imag(x) diff --git a/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py b/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py index 7ad483e8edfc1..b04d2ec3604f1 100644 --- a/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py +++ b/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py @@ -1921,6 +1921,40 @@ def test_jax_gt( ) +# igamma +@handle_frontend_test( + fn_tree="jax.lax.igamma", + dtypes_and_xs=helpers.dtype_and_values( + available_dtypes=helpers.get_dtypes("numeric"), + num_arrays=2, + shared_dtype=True, + ), + test_with_out=st.just(False), +) +def test_jax_igamma( + *, + dtypes_and_xs, + on_device, + fn_tree, + frontend, + test_flags, + backend_fw, +): + input_dtypes, (x, y) = dtypes_and_xs + + helpers.test_frontend_function( + input_dtypes=input_dtypes, + backend_to_test=backend_fw, + frontend=frontend, + test_flags=test_flags, + fn_tree=fn_tree, + on_device=on_device, + test_values=True, + x=x, + y=y, + ) + + # imag @handle_frontend_test( fn_tree="jax.lax.imag",
docker__docker-py-635
Can't use multiple binds with the same host path. Reference for this issue is docker/compose#983. The `convert_volume_binds` function uses the dict key as the host path to bind; because of this, it is impossible to do the equivalent of `docker run -v /foo:/bar -v /foo:/baz`.
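For illustration, a minimal sketch of the limitation the issue describes, driven directly through the `convert_volume_binds` helper shown in this record's `before_files`/`after_files`. The host and container paths are the ones from the issue; the list-accepting behaviour exercised in the second half exists only in the patched version of the function.

```python
# Minimal sketch of the limitation, using convert_volume_binds exactly as it
# appears in this record (docker/utils/utils.py). The /foo, /bar, /baz paths
# come from the issue text; nothing else is assumed.
from docker.utils.utils import convert_volume_binds

# Dict form: the host path is the dict key, so "/foo" can only appear once;
# a second "/foo" entry would simply overwrite the first.
binds_as_dict = {
    "/foo": {"bind": "/bar", "ro": False},
    # "/foo": {"bind": "/baz", "ro": False},  # impossible: duplicate dict key
}
print(convert_volume_binds(binds_as_dict))  # ['/foo:/bar:rw']

# List form: each bind is an independent "host:container:mode" string, so the
# same host path can be mounted twice. The patched convert_volume_binds in
# after_files returns such a list unchanged; the original version only
# accepted a dict.
binds_as_list = ["/foo:/bar:rw", "/foo:/baz:rw"]
print(convert_volume_binds(binds_as_list))  # works only with the patched version
```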
[ { "content": "# Copyright 2013 dotCloud inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\nimport os.path\nimport json\nimport shlex\nimport tarfile\nimport tempfile\nfrom distutils.version import StrictVersion\nfrom fnmatch import fnmatch\nfrom datetime import datetime\n\nimport requests\nimport six\n\nfrom .. import errors\nfrom .. import tls\nfrom .types import Ulimit, LogConfig\n\n\nDEFAULT_HTTP_HOST = \"127.0.0.1\"\nDEFAULT_UNIX_SOCKET = \"http+unix://var/run/docker.sock\"\nBYTE_UNITS = {\n 'b': 1,\n 'k': 1024,\n 'm': 1024 * 1024,\n 'g': 1024 * 1024 * 1024\n}\n\n\ndef mkbuildcontext(dockerfile):\n f = tempfile.NamedTemporaryFile()\n t = tarfile.open(mode='w', fileobj=f)\n if isinstance(dockerfile, io.StringIO):\n dfinfo = tarfile.TarInfo('Dockerfile')\n if six.PY3:\n raise TypeError('Please use io.BytesIO to create in-memory '\n 'Dockerfiles with Python 3')\n else:\n dfinfo.size = len(dockerfile.getvalue())\n dockerfile.seek(0)\n elif isinstance(dockerfile, io.BytesIO):\n dfinfo = tarfile.TarInfo('Dockerfile')\n dfinfo.size = len(dockerfile.getvalue())\n dockerfile.seek(0)\n else:\n dfinfo = t.gettarinfo(fileobj=dockerfile, arcname='Dockerfile')\n t.addfile(dfinfo, dockerfile)\n t.close()\n f.seek(0)\n return f\n\n\ndef fnmatch_any(relpath, patterns):\n return any([fnmatch(relpath, pattern) for pattern in patterns])\n\n\ndef tar(path, exclude=None):\n f = tempfile.NamedTemporaryFile()\n t = tarfile.open(mode='w', fileobj=f)\n for dirpath, dirnames, filenames in os.walk(path):\n relpath = os.path.relpath(dirpath, path)\n if relpath == '.':\n relpath = ''\n if exclude is None:\n fnames = filenames\n else:\n dirnames[:] = [d for d in dirnames\n if not fnmatch_any(os.path.join(relpath, d),\n exclude)]\n fnames = [name for name in filenames\n if not fnmatch_any(os.path.join(relpath, name),\n exclude)]\n dirnames.sort()\n for name in sorted(fnames):\n arcname = os.path.join(relpath, name)\n t.add(os.path.join(path, arcname), arcname=arcname)\n for name in dirnames:\n arcname = os.path.join(relpath, name)\n t.add(os.path.join(path, arcname),\n arcname=arcname, recursive=False)\n t.close()\n f.seek(0)\n return f\n\n\ndef compare_version(v1, v2):\n \"\"\"Compare docker versions\n\n >>> v1 = '1.9'\n >>> v2 = '1.10'\n >>> compare_version(v1, v2)\n 1\n >>> compare_version(v2, v1)\n -1\n >>> compare_version(v2, v2)\n 0\n \"\"\"\n s1 = StrictVersion(v1)\n s2 = StrictVersion(v2)\n if s1 == s2:\n return 0\n elif s1 > s2:\n return -1\n else:\n return 1\n\n\ndef ping_registry(url):\n return ping(url + '/v2/') or ping(url + '/v1/_ping')\n\n\ndef ping(url):\n try:\n res = requests.get(url, timeout=3)\n except Exception:\n return False\n else:\n return res.status_code < 400\n\n\ndef _convert_port_binding(binding):\n result = {'HostIp': '', 'HostPort': ''}\n if isinstance(binding, tuple):\n if len(binding) == 2:\n result['HostPort'] = binding[1]\n result['HostIp'] = binding[0]\n elif isinstance(binding[0], six.string_types):\n result['HostIp'] = binding[0]\n else:\n 
result['HostPort'] = binding[0]\n elif isinstance(binding, dict):\n if 'HostPort' in binding:\n result['HostPort'] = binding['HostPort']\n if 'HostIp' in binding:\n result['HostIp'] = binding['HostIp']\n else:\n raise ValueError(binding)\n else:\n result['HostPort'] = binding\n\n if result['HostPort'] is None:\n result['HostPort'] = ''\n else:\n result['HostPort'] = str(result['HostPort'])\n\n return result\n\n\ndef convert_port_bindings(port_bindings):\n result = {}\n for k, v in six.iteritems(port_bindings):\n key = str(k)\n if '/' not in key:\n key = key + '/tcp'\n if isinstance(v, list):\n result[key] = [_convert_port_binding(binding) for binding in v]\n else:\n result[key] = [_convert_port_binding(v)]\n return result\n\n\ndef convert_volume_binds(binds):\n result = []\n for k, v in binds.items():\n if isinstance(v, dict):\n result.append('{0}:{1}:{2}'.format(\n k, v['bind'], 'ro' if v.get('ro', False) else 'rw'\n ))\n else:\n result.append('{0}:{1}:rw'.format(k, v))\n return result\n\n\ndef parse_repository_tag(repo):\n column_index = repo.rfind(':')\n if column_index < 0:\n return repo, None\n tag = repo[column_index + 1:]\n slash_index = tag.find('/')\n if slash_index < 0:\n return repo[:column_index], tag\n\n return repo, None\n\n\n# Based on utils.go:ParseHost http://tinyurl.com/nkahcfh\n# fd:// protocol unsupported (for obvious reasons)\n# Added support for http and https\n# Protocol translation: tcp -> http, unix -> http+unix\ndef parse_host(addr):\n proto = \"http+unix\"\n host = DEFAULT_HTTP_HOST\n port = None\n if not addr or addr.strip() == 'unix://':\n return DEFAULT_UNIX_SOCKET\n\n addr = addr.strip()\n if addr.startswith('http://'):\n addr = addr.replace('http://', 'tcp://')\n if addr.startswith('http+unix://'):\n addr = addr.replace('http+unix://', 'unix://')\n\n if addr == 'tcp://':\n raise errors.DockerException(\n \"Invalid bind address format: {0}\".format(addr))\n elif addr.startswith('unix://'):\n addr = addr[7:]\n elif addr.startswith('tcp://'):\n proto = \"http\"\n addr = addr[6:]\n elif addr.startswith('https://'):\n proto = \"https\"\n addr = addr[8:]\n elif addr.startswith('fd://'):\n raise errors.DockerException(\"fd protocol is not implemented\")\n else:\n if \"://\" in addr:\n raise errors.DockerException(\n \"Invalid bind address protocol: {0}\".format(addr)\n )\n proto = \"http\"\n\n if proto != \"http+unix\" and \":\" in addr:\n host_parts = addr.split(':')\n if len(host_parts) != 2:\n raise errors.DockerException(\n \"Invalid bind address format: {0}\".format(addr)\n )\n if host_parts[0]:\n host = host_parts[0]\n\n try:\n port = int(host_parts[1])\n except Exception:\n raise errors.DockerException(\n \"Invalid port: %s\", addr\n )\n\n elif proto in (\"http\", \"https\") and ':' not in addr:\n raise errors.DockerException(\n \"Bind address needs a port: {0}\".format(addr))\n else:\n host = addr\n\n if proto == \"http+unix\":\n return \"{0}://{1}\".format(proto, host)\n return \"{0}://{1}:{2}\".format(proto, host, port)\n\n\ndef parse_devices(devices):\n device_list = []\n for device in devices:\n device_mapping = device.split(\":\")\n if device_mapping:\n path_on_host = device_mapping[0]\n if len(device_mapping) > 1:\n path_in_container = device_mapping[1]\n else:\n path_in_container = path_on_host\n if len(device_mapping) > 2:\n permissions = device_mapping[2]\n else:\n permissions = 'rwm'\n device_list.append({\"PathOnHost\": path_on_host,\n \"PathInContainer\": path_in_container,\n \"CgroupPermissions\": permissions})\n return device_list\n\n\ndef 
kwargs_from_env(ssl_version=None, assert_hostname=None):\n host = os.environ.get('DOCKER_HOST')\n cert_path = os.environ.get('DOCKER_CERT_PATH')\n tls_verify = os.environ.get('DOCKER_TLS_VERIFY')\n\n params = {}\n if host:\n params['base_url'] = (host.replace('tcp://', 'https://')\n if tls_verify else host)\n if tls_verify and cert_path:\n params['tls'] = tls.TLSConfig(\n client_cert=(os.path.join(cert_path, 'cert.pem'),\n os.path.join(cert_path, 'key.pem')),\n ca_cert=os.path.join(cert_path, 'ca.pem'),\n verify=True,\n ssl_version=ssl_version,\n assert_hostname=assert_hostname)\n return params\n\n\ndef convert_filters(filters):\n result = {}\n for k, v in six.iteritems(filters):\n if isinstance(v, bool):\n v = 'true' if v else 'false'\n if not isinstance(v, list):\n v = [v, ]\n result[k] = v\n return json.dumps(result)\n\n\ndef datetime_to_timestamp(dt=datetime.now()):\n \"\"\"Convert a datetime in local timezone to a unix timestamp\"\"\"\n delta = dt - datetime.fromtimestamp(0)\n return delta.seconds + delta.days * 24 * 3600\n\n\ndef parse_bytes(s):\n if len(s) == 0:\n s = 0\n else:\n if s[-2:-1].isalpha() and s[-1].isalpha():\n if (s[-1] == \"b\" or s[-1] == \"B\"):\n s = s[:-1]\n units = BYTE_UNITS\n suffix = s[-1].lower()\n\n # Check if the variable is a string representation of an int\n # without a units part. Assuming that the units are bytes.\n if suffix.isdigit():\n digits_part = s\n suffix = 'b'\n else:\n digits_part = s[:-1]\n\n if suffix in units.keys() or suffix.isdigit():\n try:\n digits = int(digits_part)\n except ValueError:\n message = ('Failed converting the string value for'\n 'memory ({0}) to a number.')\n formatted_message = message.format(digits_part)\n raise errors.DockerException(formatted_message)\n\n s = digits * units[suffix]\n else:\n message = ('The specified value for memory'\n ' ({0}) should specify the units. 
The postfix'\n ' should be one of the `b` `k` `m` `g`'\n ' characters')\n raise errors.DockerException(message.format(s))\n\n return s\n\n\ndef create_host_config(\n binds=None, port_bindings=None, lxc_conf=None,\n publish_all_ports=False, links=None, privileged=False,\n dns=None, dns_search=None, volumes_from=None, network_mode=None,\n restart_policy=None, cap_add=None, cap_drop=None, devices=None,\n extra_hosts=None, read_only=None, pid_mode=None, ipc_mode=None,\n security_opt=None, ulimits=None, log_config=None\n):\n host_config = {}\n\n if pid_mode not in (None, 'host'):\n raise errors.DockerException(\n 'Invalid value for pid param: {0}'.format(pid_mode)\n )\n elif pid_mode:\n host_config['PidMode'] = pid_mode\n\n if ipc_mode:\n host_config['IpcMode'] = ipc_mode\n\n if privileged:\n host_config['Privileged'] = privileged\n\n if publish_all_ports:\n host_config['PublishAllPorts'] = publish_all_ports\n\n if read_only is not None:\n host_config['ReadonlyRootfs'] = read_only\n\n if dns_search:\n host_config['DnsSearch'] = dns_search\n\n if network_mode:\n host_config['NetworkMode'] = network_mode\n\n if restart_policy:\n host_config['RestartPolicy'] = restart_policy\n\n if cap_add:\n host_config['CapAdd'] = cap_add\n\n if cap_drop:\n host_config['CapDrop'] = cap_drop\n\n if devices:\n host_config['Devices'] = parse_devices(devices)\n\n if dns is not None:\n host_config['Dns'] = dns\n\n if security_opt is not None:\n if not isinstance(security_opt, list):\n raise errors.DockerException(\n 'Invalid type for security_opt param: expected list but found'\n ' {0}'.format(type(security_opt))\n )\n host_config['SecurityOpt'] = security_opt\n\n if volumes_from is not None:\n if isinstance(volumes_from, six.string_types):\n volumes_from = volumes_from.split(',')\n host_config['VolumesFrom'] = volumes_from\n\n if binds is not None:\n host_config['Binds'] = convert_volume_binds(binds)\n\n if port_bindings is not None:\n host_config['PortBindings'] = convert_port_bindings(\n port_bindings\n )\n\n if extra_hosts is not None:\n if isinstance(extra_hosts, dict):\n extra_hosts = [\n '{0}:{1}'.format(k, v)\n for k, v in sorted(six.iteritems(extra_hosts))\n ]\n\n host_config['ExtraHosts'] = extra_hosts\n\n if links is not None:\n if isinstance(links, dict):\n links = six.iteritems(links)\n\n formatted_links = [\n '{0}:{1}'.format(k, v) for k, v in sorted(links)\n ]\n\n host_config['Links'] = formatted_links\n\n if isinstance(lxc_conf, dict):\n formatted = []\n for k, v in six.iteritems(lxc_conf):\n formatted.append({'Key': k, 'Value': str(v)})\n lxc_conf = formatted\n\n if lxc_conf is not None:\n host_config['LxcConf'] = lxc_conf\n\n if ulimits is not None:\n if not isinstance(ulimits, list):\n raise errors.DockerException(\n 'Invalid type for ulimits param: expected list but found'\n ' {0}'.format(type(ulimits))\n )\n host_config['Ulimits'] = []\n for l in ulimits:\n if not isinstance(l, Ulimit):\n l = Ulimit(**l)\n host_config['Ulimits'].append(l)\n\n if log_config is not None:\n if not isinstance(log_config, LogConfig):\n if not isinstance(log_config, dict):\n raise errors.DockerException(\n 'Invalid type for log_config param: expected LogConfig but'\n ' found {0}'.format(type(log_config))\n )\n log_config = LogConfig(**log_config)\n host_config['LogConfig'] = log_config\n\n return host_config\n\n\ndef create_container_config(\n version, image, command, hostname=None, user=None, detach=False,\n stdin_open=False, tty=False, mem_limit=0, ports=None, environment=None,\n dns=None, volumes=None, 
volumes_from=None, network_disabled=False,\n entrypoint=None, cpu_shares=None, working_dir=None, domainname=None,\n memswap_limit=0, cpuset=None, host_config=None, mac_address=None,\n labels=None\n):\n if isinstance(command, six.string_types):\n command = shlex.split(str(command))\n if isinstance(environment, dict):\n environment = [\n six.text_type('{0}={1}').format(k, v)\n for k, v in six.iteritems(environment)\n ]\n\n if labels is not None and compare_version('1.18', version) < 0:\n raise errors.DockerException(\n 'labels were only introduced in API version 1.18'\n )\n\n if isinstance(labels, list):\n labels = dict((lbl, six.text_type('')) for lbl in labels)\n\n if isinstance(mem_limit, six.string_types):\n mem_limit = parse_bytes(mem_limit)\n if isinstance(memswap_limit, six.string_types):\n memswap_limit = parse_bytes(memswap_limit)\n\n if isinstance(ports, list):\n exposed_ports = {}\n for port_definition in ports:\n port = port_definition\n proto = 'tcp'\n if isinstance(port_definition, tuple):\n if len(port_definition) == 2:\n proto = port_definition[1]\n port = port_definition[0]\n exposed_ports['{0}/{1}'.format(port, proto)] = {}\n ports = exposed_ports\n\n if isinstance(volumes, six.string_types):\n volumes = [volumes, ]\n\n if isinstance(volumes, list):\n volumes_dict = {}\n for vol in volumes:\n volumes_dict[vol] = {}\n volumes = volumes_dict\n\n if volumes_from:\n if not isinstance(volumes_from, six.string_types):\n volumes_from = ','.join(volumes_from)\n else:\n # Force None, an empty list or dict causes client.start to fail\n volumes_from = None\n\n attach_stdin = False\n attach_stdout = False\n attach_stderr = False\n stdin_once = False\n\n if not detach:\n attach_stdout = True\n attach_stderr = True\n\n if stdin_open:\n attach_stdin = True\n stdin_once = True\n\n if compare_version('1.10', version) >= 0:\n message = ('{0!r} parameter has no effect on create_container().'\n ' It has been moved to start()')\n if dns is not None:\n raise errors.DockerException(message.format('dns'))\n if volumes_from is not None:\n raise errors.DockerException(message.format('volumes_from'))\n\n return {\n 'Hostname': hostname,\n 'Domainname': domainname,\n 'ExposedPorts': ports,\n 'User': user,\n 'Tty': tty,\n 'OpenStdin': stdin_open,\n 'StdinOnce': stdin_once,\n 'Memory': mem_limit,\n 'AttachStdin': attach_stdin,\n 'AttachStdout': attach_stdout,\n 'AttachStderr': attach_stderr,\n 'Env': environment,\n 'Cmd': command,\n 'Dns': dns,\n 'Image': image,\n 'Volumes': volumes,\n 'VolumesFrom': volumes_from,\n 'NetworkDisabled': network_disabled,\n 'Entrypoint': entrypoint,\n 'CpuShares': cpu_shares,\n 'Cpuset': cpuset,\n 'CpusetCpus': cpuset,\n 'WorkingDir': working_dir,\n 'MemorySwap': memswap_limit,\n 'HostConfig': host_config,\n 'MacAddress': mac_address,\n 'Labels': labels,\n }\n", "path": "docker/utils/utils.py" } ]
[ { "content": "# Copyright 2013 dotCloud inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\nimport os.path\nimport json\nimport shlex\nimport tarfile\nimport tempfile\nfrom distutils.version import StrictVersion\nfrom fnmatch import fnmatch\nfrom datetime import datetime\n\nimport requests\nimport six\n\nfrom .. import errors\nfrom .. import tls\nfrom .types import Ulimit, LogConfig\n\n\nDEFAULT_HTTP_HOST = \"127.0.0.1\"\nDEFAULT_UNIX_SOCKET = \"http+unix://var/run/docker.sock\"\nBYTE_UNITS = {\n 'b': 1,\n 'k': 1024,\n 'm': 1024 * 1024,\n 'g': 1024 * 1024 * 1024\n}\n\n\ndef mkbuildcontext(dockerfile):\n f = tempfile.NamedTemporaryFile()\n t = tarfile.open(mode='w', fileobj=f)\n if isinstance(dockerfile, io.StringIO):\n dfinfo = tarfile.TarInfo('Dockerfile')\n if six.PY3:\n raise TypeError('Please use io.BytesIO to create in-memory '\n 'Dockerfiles with Python 3')\n else:\n dfinfo.size = len(dockerfile.getvalue())\n dockerfile.seek(0)\n elif isinstance(dockerfile, io.BytesIO):\n dfinfo = tarfile.TarInfo('Dockerfile')\n dfinfo.size = len(dockerfile.getvalue())\n dockerfile.seek(0)\n else:\n dfinfo = t.gettarinfo(fileobj=dockerfile, arcname='Dockerfile')\n t.addfile(dfinfo, dockerfile)\n t.close()\n f.seek(0)\n return f\n\n\ndef fnmatch_any(relpath, patterns):\n return any([fnmatch(relpath, pattern) for pattern in patterns])\n\n\ndef tar(path, exclude=None):\n f = tempfile.NamedTemporaryFile()\n t = tarfile.open(mode='w', fileobj=f)\n for dirpath, dirnames, filenames in os.walk(path):\n relpath = os.path.relpath(dirpath, path)\n if relpath == '.':\n relpath = ''\n if exclude is None:\n fnames = filenames\n else:\n dirnames[:] = [d for d in dirnames\n if not fnmatch_any(os.path.join(relpath, d),\n exclude)]\n fnames = [name for name in filenames\n if not fnmatch_any(os.path.join(relpath, name),\n exclude)]\n dirnames.sort()\n for name in sorted(fnames):\n arcname = os.path.join(relpath, name)\n t.add(os.path.join(path, arcname), arcname=arcname)\n for name in dirnames:\n arcname = os.path.join(relpath, name)\n t.add(os.path.join(path, arcname),\n arcname=arcname, recursive=False)\n t.close()\n f.seek(0)\n return f\n\n\ndef compare_version(v1, v2):\n \"\"\"Compare docker versions\n\n >>> v1 = '1.9'\n >>> v2 = '1.10'\n >>> compare_version(v1, v2)\n 1\n >>> compare_version(v2, v1)\n -1\n >>> compare_version(v2, v2)\n 0\n \"\"\"\n s1 = StrictVersion(v1)\n s2 = StrictVersion(v2)\n if s1 == s2:\n return 0\n elif s1 > s2:\n return -1\n else:\n return 1\n\n\ndef ping_registry(url):\n return ping(url + '/v2/') or ping(url + '/v1/_ping')\n\n\ndef ping(url):\n try:\n res = requests.get(url, timeout=3)\n except Exception:\n return False\n else:\n return res.status_code < 400\n\n\ndef _convert_port_binding(binding):\n result = {'HostIp': '', 'HostPort': ''}\n if isinstance(binding, tuple):\n if len(binding) == 2:\n result['HostPort'] = binding[1]\n result['HostIp'] = binding[0]\n elif isinstance(binding[0], six.string_types):\n result['HostIp'] = binding[0]\n else:\n 
result['HostPort'] = binding[0]\n elif isinstance(binding, dict):\n if 'HostPort' in binding:\n result['HostPort'] = binding['HostPort']\n if 'HostIp' in binding:\n result['HostIp'] = binding['HostIp']\n else:\n raise ValueError(binding)\n else:\n result['HostPort'] = binding\n\n if result['HostPort'] is None:\n result['HostPort'] = ''\n else:\n result['HostPort'] = str(result['HostPort'])\n\n return result\n\n\ndef convert_port_bindings(port_bindings):\n result = {}\n for k, v in six.iteritems(port_bindings):\n key = str(k)\n if '/' not in key:\n key = key + '/tcp'\n if isinstance(v, list):\n result[key] = [_convert_port_binding(binding) for binding in v]\n else:\n result[key] = [_convert_port_binding(v)]\n return result\n\n\ndef convert_volume_binds(binds):\n if isinstance(binds, list):\n return binds\n\n result = []\n for k, v in binds.items():\n if isinstance(v, dict):\n result.append('{0}:{1}:{2}'.format(\n k, v['bind'], 'ro' if v.get('ro', False) else 'rw'\n ))\n else:\n result.append('{0}:{1}:rw'.format(k, v))\n return result\n\n\ndef parse_repository_tag(repo):\n column_index = repo.rfind(':')\n if column_index < 0:\n return repo, None\n tag = repo[column_index + 1:]\n slash_index = tag.find('/')\n if slash_index < 0:\n return repo[:column_index], tag\n\n return repo, None\n\n\n# Based on utils.go:ParseHost http://tinyurl.com/nkahcfh\n# fd:// protocol unsupported (for obvious reasons)\n# Added support for http and https\n# Protocol translation: tcp -> http, unix -> http+unix\ndef parse_host(addr):\n proto = \"http+unix\"\n host = DEFAULT_HTTP_HOST\n port = None\n if not addr or addr.strip() == 'unix://':\n return DEFAULT_UNIX_SOCKET\n\n addr = addr.strip()\n if addr.startswith('http://'):\n addr = addr.replace('http://', 'tcp://')\n if addr.startswith('http+unix://'):\n addr = addr.replace('http+unix://', 'unix://')\n\n if addr == 'tcp://':\n raise errors.DockerException(\n \"Invalid bind address format: {0}\".format(addr))\n elif addr.startswith('unix://'):\n addr = addr[7:]\n elif addr.startswith('tcp://'):\n proto = \"http\"\n addr = addr[6:]\n elif addr.startswith('https://'):\n proto = \"https\"\n addr = addr[8:]\n elif addr.startswith('fd://'):\n raise errors.DockerException(\"fd protocol is not implemented\")\n else:\n if \"://\" in addr:\n raise errors.DockerException(\n \"Invalid bind address protocol: {0}\".format(addr)\n )\n proto = \"http\"\n\n if proto != \"http+unix\" and \":\" in addr:\n host_parts = addr.split(':')\n if len(host_parts) != 2:\n raise errors.DockerException(\n \"Invalid bind address format: {0}\".format(addr)\n )\n if host_parts[0]:\n host = host_parts[0]\n\n try:\n port = int(host_parts[1])\n except Exception:\n raise errors.DockerException(\n \"Invalid port: %s\", addr\n )\n\n elif proto in (\"http\", \"https\") and ':' not in addr:\n raise errors.DockerException(\n \"Bind address needs a port: {0}\".format(addr))\n else:\n host = addr\n\n if proto == \"http+unix\":\n return \"{0}://{1}\".format(proto, host)\n return \"{0}://{1}:{2}\".format(proto, host, port)\n\n\ndef parse_devices(devices):\n device_list = []\n for device in devices:\n device_mapping = device.split(\":\")\n if device_mapping:\n path_on_host = device_mapping[0]\n if len(device_mapping) > 1:\n path_in_container = device_mapping[1]\n else:\n path_in_container = path_on_host\n if len(device_mapping) > 2:\n permissions = device_mapping[2]\n else:\n permissions = 'rwm'\n device_list.append({\"PathOnHost\": path_on_host,\n \"PathInContainer\": path_in_container,\n \"CgroupPermissions\": 
permissions})\n return device_list\n\n\ndef kwargs_from_env(ssl_version=None, assert_hostname=None):\n host = os.environ.get('DOCKER_HOST')\n cert_path = os.environ.get('DOCKER_CERT_PATH')\n tls_verify = os.environ.get('DOCKER_TLS_VERIFY')\n\n params = {}\n if host:\n params['base_url'] = (host.replace('tcp://', 'https://')\n if tls_verify else host)\n if tls_verify and cert_path:\n params['tls'] = tls.TLSConfig(\n client_cert=(os.path.join(cert_path, 'cert.pem'),\n os.path.join(cert_path, 'key.pem')),\n ca_cert=os.path.join(cert_path, 'ca.pem'),\n verify=True,\n ssl_version=ssl_version,\n assert_hostname=assert_hostname)\n return params\n\n\ndef convert_filters(filters):\n result = {}\n for k, v in six.iteritems(filters):\n if isinstance(v, bool):\n v = 'true' if v else 'false'\n if not isinstance(v, list):\n v = [v, ]\n result[k] = v\n return json.dumps(result)\n\n\ndef datetime_to_timestamp(dt=datetime.now()):\n \"\"\"Convert a datetime in local timezone to a unix timestamp\"\"\"\n delta = dt - datetime.fromtimestamp(0)\n return delta.seconds + delta.days * 24 * 3600\n\n\ndef parse_bytes(s):\n if len(s) == 0:\n s = 0\n else:\n if s[-2:-1].isalpha() and s[-1].isalpha():\n if (s[-1] == \"b\" or s[-1] == \"B\"):\n s = s[:-1]\n units = BYTE_UNITS\n suffix = s[-1].lower()\n\n # Check if the variable is a string representation of an int\n # without a units part. Assuming that the units are bytes.\n if suffix.isdigit():\n digits_part = s\n suffix = 'b'\n else:\n digits_part = s[:-1]\n\n if suffix in units.keys() or suffix.isdigit():\n try:\n digits = int(digits_part)\n except ValueError:\n message = ('Failed converting the string value for'\n 'memory ({0}) to a number.')\n formatted_message = message.format(digits_part)\n raise errors.DockerException(formatted_message)\n\n s = digits * units[suffix]\n else:\n message = ('The specified value for memory'\n ' ({0}) should specify the units. 
The postfix'\n ' should be one of the `b` `k` `m` `g`'\n ' characters')\n raise errors.DockerException(message.format(s))\n\n return s\n\n\ndef create_host_config(\n binds=None, port_bindings=None, lxc_conf=None,\n publish_all_ports=False, links=None, privileged=False,\n dns=None, dns_search=None, volumes_from=None, network_mode=None,\n restart_policy=None, cap_add=None, cap_drop=None, devices=None,\n extra_hosts=None, read_only=None, pid_mode=None, ipc_mode=None,\n security_opt=None, ulimits=None, log_config=None\n):\n host_config = {}\n\n if pid_mode not in (None, 'host'):\n raise errors.DockerException(\n 'Invalid value for pid param: {0}'.format(pid_mode)\n )\n elif pid_mode:\n host_config['PidMode'] = pid_mode\n\n if ipc_mode:\n host_config['IpcMode'] = ipc_mode\n\n if privileged:\n host_config['Privileged'] = privileged\n\n if publish_all_ports:\n host_config['PublishAllPorts'] = publish_all_ports\n\n if read_only is not None:\n host_config['ReadonlyRootfs'] = read_only\n\n if dns_search:\n host_config['DnsSearch'] = dns_search\n\n if network_mode:\n host_config['NetworkMode'] = network_mode\n\n if restart_policy:\n host_config['RestartPolicy'] = restart_policy\n\n if cap_add:\n host_config['CapAdd'] = cap_add\n\n if cap_drop:\n host_config['CapDrop'] = cap_drop\n\n if devices:\n host_config['Devices'] = parse_devices(devices)\n\n if dns is not None:\n host_config['Dns'] = dns\n\n if security_opt is not None:\n if not isinstance(security_opt, list):\n raise errors.DockerException(\n 'Invalid type for security_opt param: expected list but found'\n ' {0}'.format(type(security_opt))\n )\n host_config['SecurityOpt'] = security_opt\n\n if volumes_from is not None:\n if isinstance(volumes_from, six.string_types):\n volumes_from = volumes_from.split(',')\n host_config['VolumesFrom'] = volumes_from\n\n if binds is not None:\n host_config['Binds'] = convert_volume_binds(binds)\n\n if port_bindings is not None:\n host_config['PortBindings'] = convert_port_bindings(\n port_bindings\n )\n\n if extra_hosts is not None:\n if isinstance(extra_hosts, dict):\n extra_hosts = [\n '{0}:{1}'.format(k, v)\n for k, v in sorted(six.iteritems(extra_hosts))\n ]\n\n host_config['ExtraHosts'] = extra_hosts\n\n if links is not None:\n if isinstance(links, dict):\n links = six.iteritems(links)\n\n formatted_links = [\n '{0}:{1}'.format(k, v) for k, v in sorted(links)\n ]\n\n host_config['Links'] = formatted_links\n\n if isinstance(lxc_conf, dict):\n formatted = []\n for k, v in six.iteritems(lxc_conf):\n formatted.append({'Key': k, 'Value': str(v)})\n lxc_conf = formatted\n\n if lxc_conf is not None:\n host_config['LxcConf'] = lxc_conf\n\n if ulimits is not None:\n if not isinstance(ulimits, list):\n raise errors.DockerException(\n 'Invalid type for ulimits param: expected list but found'\n ' {0}'.format(type(ulimits))\n )\n host_config['Ulimits'] = []\n for l in ulimits:\n if not isinstance(l, Ulimit):\n l = Ulimit(**l)\n host_config['Ulimits'].append(l)\n\n if log_config is not None:\n if not isinstance(log_config, LogConfig):\n if not isinstance(log_config, dict):\n raise errors.DockerException(\n 'Invalid type for log_config param: expected LogConfig but'\n ' found {0}'.format(type(log_config))\n )\n log_config = LogConfig(**log_config)\n host_config['LogConfig'] = log_config\n\n return host_config\n\n\ndef create_container_config(\n version, image, command, hostname=None, user=None, detach=False,\n stdin_open=False, tty=False, mem_limit=0, ports=None, environment=None,\n dns=None, volumes=None, 
volumes_from=None, network_disabled=False,\n entrypoint=None, cpu_shares=None, working_dir=None, domainname=None,\n memswap_limit=0, cpuset=None, host_config=None, mac_address=None,\n labels=None\n):\n if isinstance(command, six.string_types):\n command = shlex.split(str(command))\n if isinstance(environment, dict):\n environment = [\n six.text_type('{0}={1}').format(k, v)\n for k, v in six.iteritems(environment)\n ]\n\n if labels is not None and compare_version('1.18', version) < 0:\n raise errors.DockerException(\n 'labels were only introduced in API version 1.18'\n )\n\n if isinstance(labels, list):\n labels = dict((lbl, six.text_type('')) for lbl in labels)\n\n if isinstance(mem_limit, six.string_types):\n mem_limit = parse_bytes(mem_limit)\n if isinstance(memswap_limit, six.string_types):\n memswap_limit = parse_bytes(memswap_limit)\n\n if isinstance(ports, list):\n exposed_ports = {}\n for port_definition in ports:\n port = port_definition\n proto = 'tcp'\n if isinstance(port_definition, tuple):\n if len(port_definition) == 2:\n proto = port_definition[1]\n port = port_definition[0]\n exposed_ports['{0}/{1}'.format(port, proto)] = {}\n ports = exposed_ports\n\n if isinstance(volumes, six.string_types):\n volumes = [volumes, ]\n\n if isinstance(volumes, list):\n volumes_dict = {}\n for vol in volumes:\n volumes_dict[vol] = {}\n volumes = volumes_dict\n\n if volumes_from:\n if not isinstance(volumes_from, six.string_types):\n volumes_from = ','.join(volumes_from)\n else:\n # Force None, an empty list or dict causes client.start to fail\n volumes_from = None\n\n attach_stdin = False\n attach_stdout = False\n attach_stderr = False\n stdin_once = False\n\n if not detach:\n attach_stdout = True\n attach_stderr = True\n\n if stdin_open:\n attach_stdin = True\n stdin_once = True\n\n if compare_version('1.10', version) >= 0:\n message = ('{0!r} parameter has no effect on create_container().'\n ' It has been moved to start()')\n if dns is not None:\n raise errors.DockerException(message.format('dns'))\n if volumes_from is not None:\n raise errors.DockerException(message.format('volumes_from'))\n\n return {\n 'Hostname': hostname,\n 'Domainname': domainname,\n 'ExposedPorts': ports,\n 'User': user,\n 'Tty': tty,\n 'OpenStdin': stdin_open,\n 'StdinOnce': stdin_once,\n 'Memory': mem_limit,\n 'AttachStdin': attach_stdin,\n 'AttachStdout': attach_stdout,\n 'AttachStderr': attach_stderr,\n 'Env': environment,\n 'Cmd': command,\n 'Dns': dns,\n 'Image': image,\n 'Volumes': volumes,\n 'VolumesFrom': volumes_from,\n 'NetworkDisabled': network_disabled,\n 'Entrypoint': entrypoint,\n 'CpuShares': cpu_shares,\n 'Cpuset': cpuset,\n 'CpusetCpus': cpuset,\n 'WorkingDir': working_dir,\n 'MemorySwap': memswap_limit,\n 'HostConfig': host_config,\n 'MacAddress': mac_address,\n 'Labels': labels,\n }\n", "path": "docker/utils/utils.py" } ]
diff --git a/docker/utils/utils.py b/docker/utils/utils.py index e4a3c9e64..724af4650 100644 --- a/docker/utils/utils.py +++ b/docker/utils/utils.py @@ -174,6 +174,9 @@ def convert_port_bindings(port_bindings): def convert_volume_binds(binds): + if isinstance(binds, list): + return binds + result = [] for k, v in binds.items(): if isinstance(v, dict): diff --git a/docs/volumes.md b/docs/volumes.md index de2821400..db421557a 100644 --- a/docs/volumes.md +++ b/docs/volumes.md @@ -19,3 +19,16 @@ container_id = c.create_container( }) ) ``` + +You can alternatively specify binds as a list. This code is equivalent to the +example above: + +```python +container_id = c.create_container( + 'busybox', 'ls', volumes=['/mnt/vol1', '/mnt/vol2'], + host_config=docker.utils.create_host_config(binds=[ + '/home/user1/:/mnt/vol2', + '/var/www:/mnt/vol1:ro', + ]) +) +``` diff --git a/tests/test.py b/tests/test.py index e0a9e3452..97af11eec 100644 --- a/tests/test.py +++ b/tests/test.py @@ -808,6 +808,36 @@ def test_create_container_with_binds_rw(self): DEFAULT_TIMEOUT_SECONDS ) + def test_create_container_with_binds_list(self): + try: + self.client.create_container( + 'busybox', 'true', host_config=create_host_config( + binds=[ + "/tmp:/mnt/1:ro", + "/tmp:/mnt/2", + ], + ) + ) + except Exception as e: + self.fail('Command should not raise exception: {0}'.format(e)) + + args = fake_request.call_args + self.assertEqual(args[0][0], url_prefix + + 'containers/create') + expected_payload = self.base_create_payload() + expected_payload['HostConfig'] = create_host_config() + expected_payload['HostConfig']['Binds'] = [ + "/tmp:/mnt/1:ro", + "/tmp:/mnt/2", + ] + self.assertEqual(json.loads(args[1]['data']), expected_payload) + self.assertEqual(args[1]['headers'], + {'Content-Type': 'application/json'}) + self.assertEqual( + args[1]['timeout'], + DEFAULT_TIMEOUT_SECONDS + ) + def test_create_container_with_port_binds(self): self.maxDiff = None try:
pex-tool__pex-2143
Release 2.1.135 On the docket: + [x] Add Support for Pip 23.1.1. #2133 + [x] Introduce pex3 venv inspect. #2135 + [x] Add support for Pip 23.1.2. #2142 + [x] Introduce pex3 venv create. #2140
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.134\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.135\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 46b9ed061..5730d24ef 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,42 @@ Release Notes ============= +2.1.135 +------- + +This release brings support for ``pex3 venv {inspect,create}`` for +working with venvs directly using Pex. Previously, a PEX built with +``--include-tools`` (or ``--venv``) had the capability of turning itself +into a venv but the new ``pex3 venv create`` command can do this for any +PEX file with the addition of a few new features: + +#. The venv can now be created directly from requirements producing no + intermediate PEX file. +#. The venv can be created either from a PEX file or a lock file. A + subset of either of those can be chosen by also supplying + requirements. +#. Instead of creating a full-fledged venv, just the site-packages can + be exported (without creating an intermediate venv). This "flat" + layout is used by several prominent runtimes - notably AWS Lambda - + and emulates ``pip install --target``. This style layout can also be + zipped and prefixed. Additionally it supports ``--platform`` and + ``--complete-platform`` allowing creation of, for example, an AWS + Lambda (or Lambda Layer) deployment zip on a non-Linux host. + +Additionally this release adds support for Pip 23.1.1 and 23.1.2. + +* Add Support for Pip 23.1.1. (#2133) + `PR #2133 <https://github.com/pantsbuild/pex/pull/2133>`_ + +* Introduce pex3 venv inspect. (#2135) + `PR #2135 <https://github.com/pantsbuild/pex/pull/2135>`_ + +* Introduce pex3 venv create. (#2140) + `PR #2140 <https://github.com/pantsbuild/pex/pull/2140>`_ + +* Add support for Pip 23.1.2. (#2142) + `PR #2142 <https://github.com/pantsbuild/pex/pull/2142>`_ + 2.1.134 ------- diff --git a/pex/version.py b/pex/version.py index bfbeb741e..7e49b0302 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.134" +__version__ = "2.1.135"
Kinto__kinto-1304
Cannot import name `Utc`
While trying to debug #1299 I encountered the following error:
```
$ make serve
...
~/.virtualenvs/test/bin/kinto migrate --ini config/kinto.ini
Traceback (most recent call last):
  File "~/.virtualenvs/test/bin/kinto", line 11, in <module>
    load_entry_point('kinto', 'console_scripts', 'kinto')()
  File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 560, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2648, in load_entry_point
    return ep.load()
  File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2302, in load
    return self.resolve()
  File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2308, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
  File "~/mozilla/kinto/kinto/__init__.py", line 4, in <module>
    import kinto.core
  File "~/mozilla/kinto/kinto/core/__init__.py", line 10, in <module>
    from kinto.core import errors
  File "~/mozilla/kinto/kinto/core/errors.py", line 1, in <module>
    import colander
  File "~/.virtualenvs/test/lib/python3.5/site-packages/colander/__init__.py", line 22, in <module>
    from . import iso8601
  File "~/.virtualenvs/test/lib/python3.5/site-packages/colander/iso8601.py", line 3, in <module>
    from iso8601.iso8601 import (parse_date, ParseError, Utc, FixedOffset, UTC, ZERO, ISO8601_REGEX)
ImportError: cannot import name 'Utc'
Makefile:87 : la recette pour la cible « migrate » a échouée
make: *** [migrate] Erreur 1
```
[ { "content": "import codecs\nimport os\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read_file(filename):\n \"\"\"Open a related file and return its content.\"\"\"\n with codecs.open(os.path.join(here, filename), encoding='utf-8') as f:\n content = f.read()\n return content\n\n\nREADME = read_file('README.rst')\nCHANGELOG = read_file('CHANGELOG.rst')\nCONTRIBUTORS = read_file('CONTRIBUTORS.rst')\n\nREQUIREMENTS = [\n 'bcrypt',\n 'iso8601==0.1.11', # Refs #1301\n 'colander >= 1.3.2',\n 'cornice >= 2.4',\n 'cornice_swagger >= 0.5.1',\n 'jsonschema',\n 'jsonpatch',\n 'logging-color-formatter >= 1.0.1', # Message interpolations.\n 'python-dateutil',\n 'pyramid > 1.8, < 1.9b1',\n 'pyramid_multiauth >= 0.8', # User on policy selected event.\n 'transaction',\n # pyramid_tm changed the location of their tween in 2.x and one of\n # our tests fails on 2.0.\n 'pyramid_tm >= 2.1',\n 'requests',\n 'waitress',\n 'ujson >= 1.35'\n]\n\nPOSTGRESQL_REQUIRES = [\n 'SQLAlchemy',\n 'psycopg2 > 2.5',\n 'zope.sqlalchemy',\n]\n\nREDIS_REQUIRES = [\n 'kinto_redis'\n]\n\nSETUP_REQUIRES = [\n 'pytest-runner'\n]\n\nTEST_REQUIREMENTS = [\n 'bravado_core',\n 'pytest',\n 'WebTest'\n]\n\nDEPENDENCY_LINKS = [\n]\n\nMONITORING_REQUIRES = [\n 'raven',\n 'statsd',\n 'newrelic',\n 'werkzeug',\n]\n\nENTRY_POINTS = {\n 'paste.app_factory': [\n 'main = kinto:main',\n ],\n 'console_scripts': [\n 'kinto = kinto.__main__:main'\n ],\n}\n\n\nsetup(name='kinto',\n version='7.3.2.dev0',\n description='Kinto Web Service - Store, Sync, Share, and Self-Host.',\n long_description=\"{}\\n\\n{}\\n\\n{}\".format(README, CHANGELOG, CONTRIBUTORS),\n license='Apache License (2.0)',\n classifiers=[\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Topic :: Internet :: WWW/HTTP\",\n \"Topic :: Internet :: WWW/HTTP :: WSGI :: Application\",\n \"License :: OSI Approved :: Apache Software License\"\n ],\n keywords=\"web sync json storage services\",\n author='Mozilla Services',\n author_email='[email protected]',\n url='https://github.com/Kinto/kinto',\n packages=find_packages(),\n package_data={'': ['*.rst', '*.py', '*.yaml']},\n include_package_data=True,\n zip_safe=False,\n setup_requires=SETUP_REQUIRES,\n tests_require=TEST_REQUIREMENTS,\n install_requires=REQUIREMENTS,\n extras_require={\n 'redis': REDIS_REQUIRES,\n 'postgresql': POSTGRESQL_REQUIRES,\n 'monitoring': MONITORING_REQUIRES,\n },\n test_suite=\"tests\",\n dependency_links=DEPENDENCY_LINKS,\n entry_points=ENTRY_POINTS)\n", "path": "setup.py" } ]
[ { "content": "import codecs\nimport os\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read_file(filename):\n \"\"\"Open a related file and return its content.\"\"\"\n with codecs.open(os.path.join(here, filename), encoding='utf-8') as f:\n content = f.read()\n return content\n\n\nREADME = read_file('README.rst')\nCHANGELOG = read_file('CHANGELOG.rst')\nCONTRIBUTORS = read_file('CONTRIBUTORS.rst')\n\nREQUIREMENTS = [\n 'bcrypt',\n 'colander >= 1.4.0',\n 'cornice >= 2.4',\n 'cornice_swagger >= 0.5.1',\n 'jsonschema',\n 'jsonpatch',\n 'logging-color-formatter >= 1.0.1', # Message interpolations.\n 'python-dateutil',\n 'pyramid > 1.8, < 1.9b1',\n 'pyramid_multiauth >= 0.8', # User on policy selected event.\n 'transaction',\n # pyramid_tm changed the location of their tween in 2.x and one of\n # our tests fails on 2.0.\n 'pyramid_tm >= 2.1',\n 'requests',\n 'waitress',\n 'ujson >= 1.35'\n]\n\nPOSTGRESQL_REQUIRES = [\n 'SQLAlchemy',\n 'psycopg2 > 2.5',\n 'zope.sqlalchemy',\n]\n\nREDIS_REQUIRES = [\n 'kinto_redis'\n]\n\nSETUP_REQUIRES = [\n 'pytest-runner'\n]\n\nTEST_REQUIREMENTS = [\n 'bravado_core',\n 'pytest',\n 'WebTest'\n]\n\nDEPENDENCY_LINKS = [\n]\n\nMONITORING_REQUIRES = [\n 'raven',\n 'statsd',\n 'newrelic',\n 'werkzeug',\n]\n\nENTRY_POINTS = {\n 'paste.app_factory': [\n 'main = kinto:main',\n ],\n 'console_scripts': [\n 'kinto = kinto.__main__:main'\n ],\n}\n\n\nsetup(name='kinto',\n version='7.3.2.dev0',\n description='Kinto Web Service - Store, Sync, Share, and Self-Host.',\n long_description=\"{}\\n\\n{}\\n\\n{}\".format(README, CHANGELOG, CONTRIBUTORS),\n license='Apache License (2.0)',\n classifiers=[\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Topic :: Internet :: WWW/HTTP\",\n \"Topic :: Internet :: WWW/HTTP :: WSGI :: Application\",\n \"License :: OSI Approved :: Apache Software License\"\n ],\n keywords=\"web sync json storage services\",\n author='Mozilla Services',\n author_email='[email protected]',\n url='https://github.com/Kinto/kinto',\n packages=find_packages(),\n package_data={'': ['*.rst', '*.py', '*.yaml']},\n include_package_data=True,\n zip_safe=False,\n setup_requires=SETUP_REQUIRES,\n tests_require=TEST_REQUIREMENTS,\n install_requires=REQUIREMENTS,\n extras_require={\n 'redis': REDIS_REQUIRES,\n 'postgresql': POSTGRESQL_REQUIRES,\n 'monitoring': MONITORING_REQUIRES,\n },\n test_suite=\"tests\",\n dependency_links=DEPENDENCY_LINKS,\n entry_points=ENTRY_POINTS)\n", "path": "setup.py" } ]
diff --git a/setup.py b/setup.py index 1ffb4863d..7515b388d 100644 --- a/setup.py +++ b/setup.py @@ -18,8 +18,7 @@ def read_file(filename): REQUIREMENTS = [ 'bcrypt', - 'iso8601==0.1.11', # Refs #1301 - 'colander >= 1.3.2', + 'colander >= 1.4.0', 'cornice >= 2.4', 'cornice_swagger >= 0.5.1', 'jsonschema',
pex-tool__pex-2278
Release 2.1.150 On the docket: + [x] Add support for Pip 23.3.1. #2276 + [x] Support .egg-info dist metadata. #2264
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.149\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.150\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.md b/CHANGES.md index d8ccde54f..9c6d8a052 100644 --- a/CHANGES.md +++ b/CHANGES.md @@ -1,5 +1,11 @@ # Release Notes +## 2.1.150 + +This release brings support for `--pip-version 23.3.1`. + +* Add support for Pip 23.3.1. (#2276) + ## 2.1.149 Fix `--style universal` lock handing of `none` ABI wheels with a diff --git a/pex/version.py b/pex/version.py index 70044c887..e85c65794 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.149" +__version__ = "2.1.150"
pex-tool__pex-2226
Release 2.1.144 On the docket: + [x] Traverse directories in stable order when building a PEX #2220
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.143\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.144\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.md b/CHANGES.md index 51e6cf86f..8b6c19d5f 100644 --- a/CHANGES.md +++ b/CHANGES.md @@ -1,5 +1,12 @@ # Release Notes +## 2.1.144 + +This release fixes Pex to build PEX files with deterministic file order +regardless of the operating system / file system the PEX was built on. + +* Traverse directories in stable order when building a PEX (#2220) + ## 2.1.143 This release fixes Pex to work by default under eCryptFS home dirs. diff --git a/pex/version.py b/pex/version.py index b3894a15d..80b91697b 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.143" +__version__ = "2.1.144"
pex-tool__pex-1922
Release 2.1.106 On the docket: + [x] Providing a direct reference to a wheel with a local version fails to resolve #1919
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.105\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.106\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 4d8d4f999..15da265af 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,17 +1,26 @@ Release Notes ============= +2.1.106 +------- + +This release fixes a long standing bug in handling direct reference +requirements with a local version component. + +* Unquote path component of parsed url requirements (#1920) + `PR #1920 <https://github.com/pantsbuild/pex/pull/1920>`_ + 2.1.105 ------- -This is a fix release which addresses issues related to build time work_dir creation, -virtualenv, and sh_boot support. +This is a fix release which addresses issues related to build time +work_dir creation, virtualenv, and sh_boot support. In the unlikely event of a UUID collision in atomic workdir creation, pex could overwrite an existing directory and cause a corrupt state. -When building a shell bootable ``--sh-boot`` pex the ``--runtime-pex-root`` -was not always respected based on the condition of the build environment, -and the value of the PEX_ROOT. +When building a shell bootable ``--sh-boot`` pex the +``--runtime-pex-root`` was not always respected based on the condition +of the build environment, and the value of the PEX_ROOT. * Fail on atomic_directory work_dir collision. (#1905) `PR #1905 <https://github.com/pantsbuild/pex/pull/1905>`_ @@ -19,8 +28,6 @@ and the value of the PEX_ROOT. * Use raw_pex_root when constructing sh_boot pexes. (#1906) `PR #1906 <https://github.com/pantsbuild/pex/pull/1906>`_ -Docs. - * Add support for offline downloads (#1898) `PR #1898 <https://github.com/pantsbuild/pex/pull/1898>`_ diff --git a/pex/version.py b/pex/version.py index 1daaee1de..1949bc250 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.105" +__version__ = "2.1.106"
pex-tool__pex-2219
Release 2.1.143 On the docket: + [x] pex fails to build pycryptodome due to filename too long #2087
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.142\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.143\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.md b/CHANGES.md index 5bbc1c157..51e6cf86f 100644 --- a/CHANGES.md +++ b/CHANGES.md @@ -1,5 +1,11 @@ # Release Notes +## 2.1.143 + +This release fixes Pex to work by default under eCryptFS home dirs. + +* Guard against too long filenames on eCryptFS. (#2217) + ## 2.1.142 This release fixes Pex to handle Pip backtracking due to sdist build diff --git a/pex/version.py b/pex/version.py index f385f96a2..b3894a15d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.142" +__version__ = "2.1.143"
pex-tool__pex-2062
Release 2.1.123 On the docket: + [x] Create lockfile for xmlsec fails #2063 + [x] Internal not enough values to unpack error for pex3 lock create 'pip @ https://github.com/pypa/pip/archive/22.0.2.zip' ... #2057 + [x] Pex lock creation does not handle wheels with non {cp,pp,py} pyver tag. #2059
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.122\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.123\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 869877089..978a81753 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,17 +1,51 @@ Release Notes ============= +2.1.123 +------- + +This release fixes a few ``pex3 lock create`` bugs. + +There was a regression introduced in Pex 2.1.122 where projects that +used a PEP-518 ``[build-system] requires`` but specified no +corresponding ``build-backend`` would fail to lock. + +There were also two long standing issues handling more exotic direct +reference URL requirements. Source archives with names not following the +standard Python sdist naming scheme of +``<project name>-<version>.{zip,tar.gz}`` would cause a lock error. An +important class of these is provided by GitHub's magic source archive +download URLs. Also, although local projects addressed with Pip +proprietary support for pure local path requirements would lock, the +same local projects addressed via +``<project name> @ file://<local project path>`` would also cause a lock +error. Both of these cases are now fixed and can be locked successfully. + +When locking with an ``--interpreter-constraint``, any resolve +traversing wheels using the ``pypyXY`` or ``cpythonXY`` python tags +would cause the lock to error. Wheels with this form of python tag are +now handled correctly. + +* Handle ``[build-system]`` with no build-backend. (#2064) + `PR #2064 <https://github.com/pantsbuild/pex/pull/2064>`_ + +* Handle locking all direct reference URL forms. (#2060) + `PR #2060 <https://github.com/pantsbuild/pex/pull/2060>`_ + +* Fix python tag handling in IC locks. (#2061) + `PR #2061 <https://github.com/pantsbuild/pex/pull/2061>`_ + 2.1.122 ------- This release fixes posix file locks used by Pex internally and enhances lock creation to support locking sdist-only C extension projects that do not build on the current platform. Pex is also updated to support -`--pip-version 22.3.1` and `--pip-version 23.0`, bringing it up to date -with the latest Pip's available. +``--pip-version 22.3.1`` and ``--pip-version 23.0``, bringing it up to +date with the latest Pip's available. * Support the latest Pip releases: 22.3.1 & 23.0 (#2056) - `PR #2053 <https://github.com/pantsbuild/pex/pull/2056>`_ + `PR #2056 <https://github.com/pantsbuild/pex/pull/2056>`_ * Lock sdists with ``prepare-metadata-for-build-wheel``. (#2053) `PR #2053 <https://github.com/pantsbuild/pex/pull/2053>`_ diff --git a/pex/version.py b/pex/version.py index c30e2a6bb..79f767fd5 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.122" +__version__ = "2.1.123"
pex-tool__pex-1925
Release 2.1.107 On the docket: + [x] `git` username replaced with `****` redaction in lockfile for `git+ssh` direct references #1918
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.106\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.107\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 15da265af..be93a0e4d 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.107 +------- + +This release fixes an issue handling credentials in git+ssh VCS urls +when creating locks. + +* Fix locks for git+ssh with credentials. (#1923) + `PR #1923 <https://github.com/pantsbuild/pex/pull/1923>`_ + 2.1.106 ------- diff --git a/pex/version.py b/pex/version.py index 1949bc250..648e9a986 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.106" +__version__ = "2.1.107"
pex-tool__pex-1864
Release 2.1.101 On the docket: + [x] Pex fails to find RECORD for python-certifi-win32 1.6.1 #1861
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.100\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.101\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 4c6cf4cb8..d02e8c1cc 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.101 +------- + +This release fixes a corner-case revealed by python-certifi-win32 1.6.1 +that was not previously handled when installing certain distributions. + +* Make wheel install ``site-packages`` detection robust. (#1863) + `PR #1863 <https://github.com/pantsbuild/pex/pull/1863>`_ + 2.1.100 ------- diff --git a/pex/version.py b/pex/version.py index 80d82318d..37d0e7cd6 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.100" +__version__ = "2.1.101"
zulip__zulip-12366
Scrollbar drag can result in unintended click actions Split off from #11792: > * on the settings pages, if you click on the scrollbar, drag it down, and then release your click when the mouse is outside the settings modal (e.g. below it or to the right), it closes the settings modal. I don't know if this is an existing thing or a regression, but I ran into it a bunch of times when testing even after knowing the behavior. This was not a regression from perfect-scrollbar, but I fixed it in Grsmto/simplebar#312 and Grsmto/simplebar#317. Just waiting for the fixes to be included in a new upstream release.
[ { "content": "ZULIP_VERSION = \"2.0.3+git\"\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.3\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a dependency only requires a minor version bump, and\n# removing a dependency requires a major version bump.\n\nPROVISION_VERSION = '32.0'\n", "path": "version.py" } ]
[ { "content": "ZULIP_VERSION = \"2.0.3+git\"\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.3\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a dependency only requires a minor version bump, and\n# removing a dependency requires a major version bump.\n\nPROVISION_VERSION = '32.1'\n", "path": "version.py" } ]
diff --git a/package.json b/package.json index 7072781bf8cae..58195f9d7ddb6 100644 --- a/package.json +++ b/package.json @@ -34,7 +34,7 @@ "plotly.js": "1.37.1", "sass-loader": "7.0.1", "script-loader": "0.7.2", - "simplebar": "^4.0.0-alpha.9", + "simplebar": "^4.0.0", "sortablejs": "^1.7.0", "sorttable": "1.0.2", "source-map-loader": "0.2.3", diff --git a/version.py b/version.py index f9bad91b290c6..9d3ebd232d245 100644 --- a/version.py +++ b/version.py @@ -11,4 +11,4 @@ # Typically, adding a dependency only requires a minor version bump, and # removing a dependency requires a major version bump. -PROVISION_VERSION = '32.0' +PROVISION_VERSION = '32.1' diff --git a/yarn.lock b/yarn.lock index 820cb7c560785..0847b96ce0991 100644 --- a/yarn.lock +++ b/yarn.lock @@ -10687,10 +10687,10 @@ signum@^1.0.0: resolved "https://registry.yarnpkg.com/signum/-/signum-1.0.0.tgz#74a7d2bf2a20b40eba16a92b152124f1d559fa77" integrity sha1-dKfSvyogtA66FqkrFSEk8dVZ+nc= -simplebar@^4.0.0-alpha.9: - version "4.0.0-alpha.9" - resolved "https://registry.yarnpkg.com/simplebar/-/simplebar-4.0.0-alpha.9.tgz#e6cf24a2e613abbef952e962680ed2429d421617" - integrity sha512-WGscL/Lsrfk0uTuG1Pyl/jV6ZkZh0A70atCxcVfvS81aGZdnRjQfHntQPT/nSr+8jxv6YSib5F+FnPCGVw9raw== +simplebar@^4.0.0: + version "4.0.0" + resolved "https://registry.yarnpkg.com/simplebar/-/simplebar-4.0.0.tgz#7f1b9e735ec94a58f887d4803f6b15abf401b6b5" + integrity sha512-td6vJVhqIXfa3JgNZR5OgETPLfmHNSSpt+OXIbk6WH/nOrUtX3Qcyio30+5rdxxAV/61+F5eJ4jJV4Ek7/KJYQ== dependencies: can-use-dom "^0.1.0" core-js "^3.0.1"
pex-tool__pex-2034
Release 2.1.120 On the docket: + [x] Support REPL command history #2019 + [x] Using --complete-platform with --resolve-local-platforms should build sdists when local platform provides a subset of complete-platforms #2026 + [x] A loose layout, venv-with-symlink PEX creates brittle symlinks #2023
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.119\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.120\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index a7c5d4575..992b31e92 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,29 @@ Release Notes ============= +2.1.120 +------- + +This release completes the ``--complete-platform`` fix started in +Pex 2.1.116 by #1991. That fix did not work in all cases but now does. + +PEXes run in interpreter mode now support command history when the +underlying interpreter being used to run the PEX does; use the +``PEX_INTERPRETER_HISTORY`` bool env var to turn this on. + +Additionally, PEXes built with the combination +``--layout loose --venv --no-venv-site-packages-copies`` are fixed to +be robust to moves of the source loose PEX directory. + +* Fix loose --venv PEXes to be robust to moves. (#2033) + `PR #2033 <https://github.com/pantsbuild/pex/pull/2033>`_ + +* Fix interpreter resolution when using --complete-platform with --resolve-local-platforms (#2031) + `PR #2031 <https://github.com/pantsbuild/pex/pull/2031>`_ + +* Support REPL command history. (#2018) + `PR #2018 <https://github.com/pantsbuild/pex/pull/2018>`_ + 2.1.119 ------- diff --git a/pex/version.py b/pex/version.py index a517c22a2..85c867798 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.119" +__version__ = "2.1.120"
pex-tool__pex-2095
Release 2.1.129 On the docket: + [x] Pex resolves VCS and local project requirements from locks incorrectly. #2092
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.128\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.129\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index d4a3a0197..46c800c11 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,16 @@ Release Notes ============= +2.1.129 +------- + +This release fixes a bug downloading a VCS requirement from a lock when +the ambient Python interpreter used to run Pex does not meet the +``Requires-Python`` constraint of the VCS requirement. + +* Fix VCS lock downloads to respect target. (#2094) + `PR #2094 <https://github.com/pantsbuild/pex/pull/2094>`_ + 2.1.128 ------- diff --git a/pex/version.py b/pex/version.py index f00e67d65..2553debad 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.128" +__version__ = "2.1.129"
pex-tool__pex-2000
Release 2.1.117 On the docket: + [x] Published pex on github no longer works with PyPy since 2.1.109 #1995
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.116\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.117\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 3f75d503d..20955dbe7 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,21 @@ Release Notes ============= +2.1.117 +------- + +This release fixes a bug introduced in Pex 2.1.109 where the released +Pex PEX could not be executed by PyPy interpreters. More generally, any +PEX created with interpreter constraints that did not specify the Python +implementation, e.g.: ``==3.8.*``, were interpreted as being CPython +specific, i.e.: ``CPython==3.8.*``. This is now fixed, but if the +intention of a constraint like ``==3.8.*`` was in fact to restrict to +CPython only, interpreter constraints need to say so now and use +``CPython==3.8.*`` explicitly. + +* Fix interpreter constraint parsing. (#1998) + `PR #1998 <https://github.com/pantsbuild/pex/pull/1998>`_ + 2.1.116 ------- diff --git a/pex/version.py b/pex/version.py index 12b8e8168..eadcefbeb 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.116" +__version__ = "2.1.117"
pex-tool__pex-1997
Release 2.1.116 On the docket: + [x] The --resolve-local-platforms option does not work with --complete-platforms #1899
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.115\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.116\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 94f0b27b4..3f75d503d 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.116 +------- + +This release fixes a bug in ``--resolve-local-platforms`` when +``--complete-platform`` was used. + +* Check for --complete-platforms match when --resolve-local-platforms (#1991) + `PR #1991 <https://github.com/pantsbuild/pex/pull/1991>`_ + 2.1.115 ------- diff --git a/pex/version.py b/pex/version.py index edffe5b53..12b8e8168 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.115" +__version__ = "2.1.116"
pex-tool__pex-1942
Release 2.1.109
On the docket:
+ [x] pex does not support musllinux wheels #1933
+ [x] Empty string PEX_PATH="" env var causes CWD (.) to be added to the bootstrapped pex_path #1936
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.108\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.109\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index f1a94fcc9..cc087a1ee 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,20 @@ Release Notes ============= +2.1.109 +------- + +This release brings musllinux wheel support and a fix for a regression +introduced in Pex 2.1.105 by #1902 that caused ``PEX_PATH=`` (an +exported ``PEX_PATH`` with an empty string value) to raise an error in +almost all use cases. + +* Vendor latest packaging; support musllinux wheels. (#1937) + `PR #1937 <https://github.com/pantsbuild/pex/pull/1937>`_ + +* Don't treat ``PEX_PATH=`` as ``.`` like other PATHS. (#1938) + `PR #1938 <https://github.com/pantsbuild/pex/pull/1938>`_ + 2.1.108 ------- diff --git a/pex/version.py b/pex/version.py index c0dfd4790..32f577f51 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.108" +__version__ = "2.1.109"
DataDog__integrations-extras-1031
Collect clock_time_seconds metric from cert-manager cert-manager v1.5+ exposes a `clock_time` metric which reports the current seconds since the Unix Epoch See: https://github.com/jetstack/cert-manager/pull/4105 It would be useful to collect this metric in DataDog so we can alert on seconds until a given certificate expires
[ { "content": "# (C) Datadog, Inc. 2019-present\n# All rights reserved\n# Licensed under a 3-clause BSD style license (see LICENSE)\n\nCERT_METRICS = {\n 'certmanager_certificate_ready_status': 'certificate.ready_status',\n 'certmanager_certificate_expiration_timestamp_seconds': 'certificate.expiration_timestamp',\n}\n\nCONTROLLER_METRICS = {\n 'certmanager_controller_sync_call_count': 'controller.sync_call.count',\n}\n\nACME_METRICS = {\n 'certmanager_http_acme_client_request_count': 'http_acme_client.request.count',\n 'certmanager_http_acme_client_request_duration_seconds': 'http_acme_client.request.duration',\n}\n", "path": "cert_manager/datadog_checks/cert_manager/metrics.py" } ]
[ { "content": "# (C) Datadog, Inc. 2019-present\n# All rights reserved\n# Licensed under a 3-clause BSD style license (see LICENSE)\n\nCERT_METRICS = {\n 'certmanager_certificate_ready_status': 'certificate.ready_status',\n 'certmanager_certificate_expiration_timestamp_seconds': 'certificate.expiration_timestamp',\n}\n\nCONTROLLER_METRICS = {\n 'certmanager_clock_time_seconds': 'clock_time',\n 'certmanager_controller_sync_call_count': 'controller.sync_call.count',\n}\n\nACME_METRICS = {\n 'certmanager_http_acme_client_request_count': 'http_acme_client.request.count',\n 'certmanager_http_acme_client_request_duration_seconds': 'http_acme_client.request.duration',\n}\n", "path": "cert_manager/datadog_checks/cert_manager/metrics.py" } ]
diff --git a/cert_manager/datadog_checks/cert_manager/metrics.py b/cert_manager/datadog_checks/cert_manager/metrics.py index f098835222..7df0d19522 100644 --- a/cert_manager/datadog_checks/cert_manager/metrics.py +++ b/cert_manager/datadog_checks/cert_manager/metrics.py @@ -8,6 +8,7 @@ } CONTROLLER_METRICS = { + 'certmanager_clock_time_seconds': 'clock_time', 'certmanager_controller_sync_call_count': 'controller.sync_call.count', } diff --git a/cert_manager/metadata.csv b/cert_manager/metadata.csv index e9822aaf53..95c6dea960 100644 --- a/cert_manager/metadata.csv +++ b/cert_manager/metadata.csv @@ -1,4 +1,5 @@ metric_name,metric_type,interval,unit_name,per_unit_name,description,orientation,integration,short_name +cert_manager.clock_time,count,,second,,The clock time given in seconds (from 1970/01/01 UTC),0,cert-manager,cm.clock_time cert_manager.prometheus.health,gauge,,,,Whether the check is able to connect to the metrics endpoint,0,cert-manager,cm.health cert_manager.certificate.ready_status,gauge,,,,The ready status of the certificate,0,cert-manager,cm.cert_ready_status cert_manager.certificate.expiration_timestamp,gauge,,second,,The date after which the certificate expires. Expressed as a Unix Epoch Time,0,cert-manager,cm.cert_exp_time diff --git a/cert_manager/tests/common.py b/cert_manager/tests/common.py index cf75d2464a..f9ffa63be6 100644 --- a/cert_manager/tests/common.py +++ b/cert_manager/tests/common.py @@ -16,6 +16,7 @@ } CONTROLLER_METRICS = { + 'cert_manager.clock_time': aggregator.MONOTONIC_COUNT, 'cert_manager.controller.sync_call.count': aggregator.MONOTONIC_COUNT, 'cert_manager.prometheus.health': aggregator.GAUGE, } diff --git a/cert_manager/tests/conftest.py b/cert_manager/tests/conftest.py index 7afb08c858..88c2f581f0 100644 --- a/cert_manager/tests/conftest.py +++ b/cert_manager/tests/conftest.py @@ -27,7 +27,7 @@ def setup_cert_manager(): "kubectl", "apply", "-f", - "https://github.com/jetstack/cert-manager/releases/download/v1.2.0/cert-manager.yaml", + "https://github.com/jetstack/cert-manager/releases/download/v1.5.0/cert-manager.yaml", ] ) run_command( diff --git a/cert_manager/tests/fixtures/cert_manager.txt b/cert_manager/tests/fixtures/cert_manager.txt index d1cf303737..b5e0fd94b1 100644 --- a/cert_manager/tests/fixtures/cert_manager.txt +++ b/cert_manager/tests/fixtures/cert_manager.txt @@ -18,6 +18,9 @@ certmanager_certificate_ready_status{condition="Unknown",name="acme-cert",namesp certmanager_certificate_ready_status{condition="Unknown",name="acme-cert2",namespace="default"} 0 certmanager_certificate_ready_status{condition="Unknown",name="myingress-cert",namespace="cert-manager-test"} 0 certmanager_certificate_ready_status{condition="Unknown",name="selfsigned-cert",namespace="cert-manager-test"} 0 +# HELP certmanager_clock_time_seconds The clock time given in seconds (from 1970/01/01 UTC). +# TYPE certmanager_clock_time_seconds counter +certmanager_clock_time_seconds 1.61915483e+09 # HELP certmanager_controller_sync_call_count The number of sync() calls made by a controller. # TYPE certmanager_controller_sync_call_count counter certmanager_controller_sync_call_count{controller="CertificateIssuing"} 20
pex-tool__pex-2086
Release 2.1.127 On the docket: + [x] Pex fails to subset a "foo @ file:///bar" URL lock. #2083
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.126\"\n", "path": "pex/version.py" } ]
[ { "content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.127\"\n", "path": "pex/version.py" } ]
diff --git a/CHANGES.rst b/CHANGES.rst index 112d072a0..b7c71cb3c 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,6 +1,15 @@ Release Notes ============= +2.1.127 +------- + +This release fixes `--lock` resolve sub-setting for local project +requirements. + +* Fix lock subsetting for local projects. (#2085) + `PR #2085 <https://github.com/pantsbuild/pex/pull/2085>`_ + 2.1.126 ------- diff --git a/pex/version.py b/pex/version.py index 014827637..98d74ea5d 100644 --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.126" +__version__ = "2.1.127"