rasdani/github-patches: a `git_diff` patch-generation dataset. Column schema as reported by the dataset viewer (min/max string lengths, distinct-value counts, and integer ranges):

| Column            | Type   | Range / values       |
|-------------------|--------|----------------------|
| problem_id        | string | length 18 to 22      |
| source            | string | 1 distinct value     |
| task_type         | string | 1 distinct value     |
| in_source_id      | string | length 13 to 58      |
| prompt            | string | length 1.1k to 10.2k |
| golden_diff       | string | length 151 to 4.94k  |
| verification_info | string | length 582 to 21k    |
| num_tokens        | int64  | 271 to 2.05k         |
| num_tokens_diff   | int64  | 47 to 1.02k          |
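A minimal sketch of loading these rows with the Hugging Face `datasets` library; the repo id comes from the `source` column, while the `"train"` split name is an assumption not confirmed by this page:

```python
from datasets import load_dataset

# Repo id taken from the `source` column; the split name is assumed.
ds = load_dataset("rasdani/github-patches", split="train")

row = ds[0]
print(row["problem_id"], row["in_source_id"])
print(row["num_tokens"], row["num_tokens_diff"])
print(row["prompt"][:300])  # issue text plus the file context shown to the model
```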
Record 1
problem_id: gh_patches_debug_17298
source: rasdani/github-patches
task_type: git_diff
in_source_id: pytorch__TensorRT-2080
prompt:
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- 🐛 [Bug] Encountered TRT Error in Dynamo compile path ## Bug Description When compiling GPT-2 with Dynamo compile, the following error is encountered: ```python [07/01/2023-00:07:12] [TRT] [E] 3: [executionContext.cpp::enqueueInternal::795] Error Code 3: API Usage Error (Parameter check failed at: runtime/api/executionContext.cpp::enqueueInternal::795, condition: bindings[x] || nullBindingOK ) ``` Additionally, this does not seem to cause failures in the Dynamo runtime (not caught by `pass_through_build_failures`), and it appears both with and without the experimental runtime. ## To Reproduce ```python model = GPT2Model.from_pretrained("gpt2").eval().cuda() input_ids = torch.randint(0, 2, (1, 14), dtype=torch.int32).cuda() attention_mask = torch.randint(0, 2, (1, 14), dtype=torch.int32).cuda() traced = transformers_trace(model, input_names=["input_ids", "attention_mask"]).eval().cuda() fx_trt_model = torch_tensorrt.compile(traced, ir="dynamo_compile", inputs=[input_ids, attention_mask], debug=True, pass_through_build_failures=True, min_block_size=10) ``` ## Expected behavior The model should not encounter TRT errors while compiling. ## Environment - Torch-TensorRT Version (e.g. 1.0.0): 2844630 - PyTorch Version (e.g. 1.0): `2.1.0.dev20230620+cu118` - TensorRT Version: `8.6.1` ## Additional context This error no longer appears when we apply the `@fake_tensor_unsupported` flag to the backends, as was removed in #1955. It is unclear whether this is the direct _cause_ of the bug, however. Additionally, this bug seems to appear only sometimes, and not consistently. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `py/torch_tensorrt/dynamo/backend/backends.py` Content: ``` 1 import logging 2 from typing import Sequence 3 import torch 4 from functools import partial 5 import torch._dynamo as td 6 7 from torch_tensorrt.dynamo.backend._settings import CompilationSettings 8 from torch_tensorrt.dynamo.backend.lowering._decompositions import ( 9 get_decompositions, 10 ) 11 from torch_tensorrt.dynamo.backend.lowering._pre_aot_lowering import ( 12 pre_aot_substitutions, 13 ) 14 from torch_tensorrt.dynamo.backend.lowering._partition import ( 15 partition, 16 get_submod_inputs, 17 ) 18 from torch_tensorrt.dynamo.backend.utils import parse_dynamo_kwargs 19 from torch_tensorrt.dynamo.backend.conversion import convert_module 20 21 from torch._functorch.aot_autograd import aot_module_simplified, make_boxed_compiler 22 23 24 logger = logging.getLogger(__name__) 25 26 27 @td.register_backend(name="torch_tensorrt") 28 def torch_tensorrt_backend( 29 gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs 30 ): 31 DEFAULT_BACKEND = aot_torch_tensorrt_aten_backend 32 33 return DEFAULT_BACKEND(gm, sample_inputs, **kwargs) 34 35 36 @td.register_backend(name="aot_torch_tensorrt_aten") 37 def aot_torch_tensorrt_aten_backend( 38 gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs 39 ): 40 settings = parse_dynamo_kwargs(kwargs) 41 42 custom_backend = partial( 43 _pretraced_backend, 44 settings=settings, 45 ) 46 47 # Perform Pre-AOT Lowering for Module-Level Replacement 48 gm = pre_aot_substitutions(gm) 49 50 # Invoke AOTAutograd to translate operators to aten 51 return aot_module_simplified( 52 gm, 53 sample_inputs, 54 fw_compiler=make_boxed_compiler(custom_backend), 55 decompositions=get_decompositions(), 56 ) 57 58 59 def _pretraced_backend( 60 gm: torch.fx.GraphModule, 61 sample_inputs: Sequence[torch.Tensor], 62 settings: CompilationSettings = CompilationSettings(), 63 ): 64 """Helper function to manage translation of traced FX module to TRT engines 65 66 Args: 67 module: FX GraphModule to convert 68 inputs: Inputs to the module 69 settings: Compilation settings 70 Returns: 71 Compiled FX GraphModule 72 """ 73 try: 74 logger.debug("Post-AOT Autograd graph:\n" + str(gm.graph)) 75 76 trt_compiled = _compile_module( 77 gm, 78 sample_inputs, 79 settings=settings, 80 ) 81 return trt_compiled 82 except: 83 if not settings.pass_through_build_failures: 84 logger.warning( 85 "TRT conversion failed on the subgraph. See trace above. " 86 + "Returning GraphModule forward instead.", 87 exc_info=True, 88 ) 89 return gm.forward 90 else: 91 raise AssertionError( 92 "Halting compilation on build failure since " 93 + "pass_through_build_failures was specified as True. " 94 + "To return the default Torch implementation and avoid " 95 + "halting compilation on engine build failures, " 96 + "specify pass_through_build_failures=False." 
97 ) 98 99 100 def _compile_module( 101 gm: torch.fx.GraphModule, 102 sample_inputs: Sequence[torch.Tensor], 103 settings: CompilationSettings = CompilationSettings(), 104 ) -> torch.fx.GraphModule: 105 """Compile a traced FX module 106 107 Includes: Partitioning + Conversion Phases 108 109 Args: 110 module: FX GraphModule to convert 111 inputs: Inputs to the module 112 settings: Compilation settings 113 Returns: 114 Compiled FX GraphModule 115 """ 116 # Partition module into components that can be TRT-accelerated 117 partitioned_module = partition( 118 gm, 119 verbose=settings.debug, 120 min_block_size=settings.min_block_size, 121 torch_executed_ops=settings.torch_executed_ops, 122 ) 123 124 # Iterate over all components that can be accelerated 125 # Generate the corresponding TRT Module for those 126 for name, _ in partitioned_module.named_children(): 127 submodule = getattr(partitioned_module, name) 128 129 # Get submodule inputs 130 submodule_inputs = get_submod_inputs( 131 partitioned_module, submodule, sample_inputs 132 ) 133 134 # Create TRT Module from submodule 135 trt_mod = convert_module( 136 submodule, 137 submodule_inputs, 138 settings=settings, 139 ) 140 141 # Replace FX Module with TRT Module 142 setattr(partitioned_module, name, trt_mod) 143 144 return partitioned_module 145 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/py/torch_tensorrt/dynamo/backend/backends.py b/py/torch_tensorrt/dynamo/backend/backends.py --- a/py/torch_tensorrt/dynamo/backend/backends.py +++ b/py/torch_tensorrt/dynamo/backend/backends.py @@ -121,6 +121,9 @@ torch_executed_ops=settings.torch_executed_ops, ) + # Store TRT replicas of Torch subgraphs + trt_modules = {} + # Iterate over all components that can be accelerated # Generate the corresponding TRT Module for those for name, _ in partitioned_module.named_children(): @@ -138,7 +141,10 @@ settings=settings, ) - # Replace FX Module with TRT Module + trt_modules[name] = trt_mod + + # Replace all FX Modules with TRT Modules + for name, trt_mod in trt_modules.items(): setattr(partitioned_module, name, trt_mod) return partitioned_module
{"golden_diff": "diff --git a/py/torch_tensorrt/dynamo/backend/backends.py b/py/torch_tensorrt/dynamo/backend/backends.py\n--- a/py/torch_tensorrt/dynamo/backend/backends.py\n+++ b/py/torch_tensorrt/dynamo/backend/backends.py\n@@ -121,6 +121,9 @@\n torch_executed_ops=settings.torch_executed_ops,\n )\n \n+ # Store TRT replicas of Torch subgraphs\n+ trt_modules = {}\n+\n # Iterate over all components that can be accelerated\n # Generate the corresponding TRT Module for those\n for name, _ in partitioned_module.named_children():\n@@ -138,7 +141,10 @@\n settings=settings,\n )\n \n- # Replace FX Module with TRT Module\n+ trt_modules[name] = trt_mod\n+\n+ # Replace all FX Modules with TRT Modules\n+ for name, trt_mod in trt_modules.items():\n setattr(partitioned_module, name, trt_mod)\n \n return partitioned_module\n", "issue": "\ud83d\udc1b [Bug] Encountered TRT Error in Dynamo compile path\n## Bug Description\r\nWhen compiling GPT-2 with Dynamo compile, the following error is encountered:\r\n```python\r\n[07/01/2023-00:07:12] [TRT] [E] 3: [executionContext.cpp::enqueueInternal::795] Error Code 3: API Usage Error (Parameter check failed at: runtime/api/executionContext.cpp::enqueueInternal::795, condition: bindings[x] || nullBindingOK\r\n)\r\n```\r\nAdditionally, this does not seem to cause failures in the Dynamo runtime (not caught by `pass_through_build_failures`), and it appears both with and without the experimental runtime.\r\n\r\n## To Reproduce\r\n```python\r\nmodel = GPT2Model.from_pretrained(\"gpt2\").eval().cuda()\r\ninput_ids = torch.randint(0, 2, (1, 14), dtype=torch.int32).cuda()\r\nattention_mask = torch.randint(0, 2, (1, 14), dtype=torch.int32).cuda()\r\ntraced = transformers_trace(model, input_names=[\"input_ids\", \"attention_mask\"]).eval().cuda()\r\nfx_trt_model = torch_tensorrt.compile(traced, ir=\"dynamo_compile\", inputs=[input_ids, attention_mask], debug=True, pass_through_build_failures=True, min_block_size=10)\r\n```\r\n\r\n## Expected behavior\r\n\r\nThe model should not encounter TRT errors while compiling.\r\n\r\n## Environment\r\n\r\n - Torch-TensorRT Version (e.g. 1.0.0): 2844630\r\n - PyTorch Version (e.g. 1.0): `2.1.0.dev20230620+cu118`\r\n - TensorRT Version: `8.6.1`\r\n\r\n## Additional context\r\n\r\nThis error no longer appears when we apply the `@fake_tensor_unsupported` flag to the backends, as was removed in #1955. 
It is unclear whether this is the direct _cause_ of the bug, however.\r\n\r\nAdditionally, this bug seems to appear only sometimes, and not consistently.\n", "before_files": [{"content": "import logging\nfrom typing import Sequence\nimport torch\nfrom functools import partial\nimport torch._dynamo as td\n\nfrom torch_tensorrt.dynamo.backend._settings import CompilationSettings\nfrom torch_tensorrt.dynamo.backend.lowering._decompositions import (\n get_decompositions,\n)\nfrom torch_tensorrt.dynamo.backend.lowering._pre_aot_lowering import (\n pre_aot_substitutions,\n)\nfrom torch_tensorrt.dynamo.backend.lowering._partition import (\n partition,\n get_submod_inputs,\n)\nfrom torch_tensorrt.dynamo.backend.utils import parse_dynamo_kwargs\nfrom torch_tensorrt.dynamo.backend.conversion import convert_module\n\nfrom torch._functorch.aot_autograd import aot_module_simplified, make_boxed_compiler\n\n\nlogger = logging.getLogger(__name__)\n\n\[email protected]_backend(name=\"torch_tensorrt\")\ndef torch_tensorrt_backend(\n gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs\n):\n DEFAULT_BACKEND = aot_torch_tensorrt_aten_backend\n\n return DEFAULT_BACKEND(gm, sample_inputs, **kwargs)\n\n\[email protected]_backend(name=\"aot_torch_tensorrt_aten\")\ndef aot_torch_tensorrt_aten_backend(\n gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs\n):\n settings = parse_dynamo_kwargs(kwargs)\n\n custom_backend = partial(\n _pretraced_backend,\n settings=settings,\n )\n\n # Perform Pre-AOT Lowering for Module-Level Replacement\n gm = pre_aot_substitutions(gm)\n\n # Invoke AOTAutograd to translate operators to aten\n return aot_module_simplified(\n gm,\n sample_inputs,\n fw_compiler=make_boxed_compiler(custom_backend),\n decompositions=get_decompositions(),\n )\n\n\ndef _pretraced_backend(\n gm: torch.fx.GraphModule,\n sample_inputs: Sequence[torch.Tensor],\n settings: CompilationSettings = CompilationSettings(),\n):\n \"\"\"Helper function to manage translation of traced FX module to TRT engines\n\n Args:\n module: FX GraphModule to convert\n inputs: Inputs to the module\n settings: Compilation settings\n Returns:\n Compiled FX GraphModule\n \"\"\"\n try:\n logger.debug(\"Post-AOT Autograd graph:\\n\" + str(gm.graph))\n\n trt_compiled = _compile_module(\n gm,\n sample_inputs,\n settings=settings,\n )\n return trt_compiled\n except:\n if not settings.pass_through_build_failures:\n logger.warning(\n \"TRT conversion failed on the subgraph. See trace above. \"\n + \"Returning GraphModule forward instead.\",\n exc_info=True,\n )\n return gm.forward\n else:\n raise AssertionError(\n \"Halting compilation on build failure since \"\n + \"pass_through_build_failures was specified as True. 
\"\n + \"To return the default Torch implementation and avoid \"\n + \"halting compilation on engine build failures, \"\n + \"specify pass_through_build_failures=False.\"\n )\n\n\ndef _compile_module(\n gm: torch.fx.GraphModule,\n sample_inputs: Sequence[torch.Tensor],\n settings: CompilationSettings = CompilationSettings(),\n) -> torch.fx.GraphModule:\n \"\"\"Compile a traced FX module\n\n Includes: Partitioning + Conversion Phases\n\n Args:\n module: FX GraphModule to convert\n inputs: Inputs to the module\n settings: Compilation settings\n Returns:\n Compiled FX GraphModule\n \"\"\"\n # Partition module into components that can be TRT-accelerated\n partitioned_module = partition(\n gm,\n verbose=settings.debug,\n min_block_size=settings.min_block_size,\n torch_executed_ops=settings.torch_executed_ops,\n )\n\n # Iterate over all components that can be accelerated\n # Generate the corresponding TRT Module for those\n for name, _ in partitioned_module.named_children():\n submodule = getattr(partitioned_module, name)\n\n # Get submodule inputs\n submodule_inputs = get_submod_inputs(\n partitioned_module, submodule, sample_inputs\n )\n\n # Create TRT Module from submodule\n trt_mod = convert_module(\n submodule,\n submodule_inputs,\n settings=settings,\n )\n\n # Replace FX Module with TRT Module\n setattr(partitioned_module, name, trt_mod)\n\n return partitioned_module\n", "path": "py/torch_tensorrt/dynamo/backend/backends.py"}], "after_files": [{"content": "import logging\nfrom typing import Sequence\nimport torch\nfrom functools import partial\nimport torch._dynamo as td\n\nfrom torch_tensorrt.dynamo.backend._settings import CompilationSettings\nfrom torch_tensorrt.dynamo.backend.lowering._decompositions import (\n get_decompositions,\n)\nfrom torch_tensorrt.dynamo.backend.lowering._pre_aot_lowering import (\n pre_aot_substitutions,\n)\nfrom torch_tensorrt.dynamo.backend.lowering._partition import (\n partition,\n get_submod_inputs,\n)\nfrom torch_tensorrt.dynamo.backend.utils import parse_dynamo_kwargs\nfrom torch_tensorrt.dynamo.backend.conversion import convert_module\n\nfrom torch._functorch.aot_autograd import aot_module_simplified, make_boxed_compiler\n\n\nlogger = logging.getLogger(__name__)\n\n\[email protected]_backend(name=\"torch_tensorrt\")\ndef torch_tensorrt_backend(\n gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs\n):\n DEFAULT_BACKEND = aot_torch_tensorrt_aten_backend\n\n return DEFAULT_BACKEND(gm, sample_inputs, **kwargs)\n\n\[email protected]_backend(name=\"aot_torch_tensorrt_aten\")\ndef aot_torch_tensorrt_aten_backend(\n gm: torch.fx.GraphModule, sample_inputs: Sequence[torch.Tensor], **kwargs\n):\n settings = parse_dynamo_kwargs(kwargs)\n\n custom_backend = partial(\n _pretraced_backend,\n settings=settings,\n )\n\n # Perform Pre-AOT Lowering for Module-Level Replacement\n gm = pre_aot_substitutions(gm)\n\n # Invoke AOTAutograd to translate operators to aten\n return aot_module_simplified(\n gm,\n sample_inputs,\n fw_compiler=make_boxed_compiler(custom_backend),\n decompositions=get_decompositions(),\n )\n\n\ndef _pretraced_backend(\n gm: torch.fx.GraphModule,\n sample_inputs: Sequence[torch.Tensor],\n settings: CompilationSettings = CompilationSettings(),\n):\n \"\"\"Helper function to manage translation of traced FX module to TRT engines\n\n Args:\n module: FX GraphModule to convert\n inputs: Inputs to the module\n settings: Compilation settings\n Returns:\n Compiled FX GraphModule\n \"\"\"\n try:\n logger.debug(\"Post-AOT Autograd 
graph:\\n\" + str(gm.graph))\n\n trt_compiled = _compile_module(\n gm,\n sample_inputs,\n settings=settings,\n )\n return trt_compiled\n except:\n if not settings.pass_through_build_failures:\n logger.warning(\n \"TRT conversion failed on the subgraph. See trace above. \"\n + \"Returning GraphModule forward instead.\",\n exc_info=True,\n )\n return gm.forward\n else:\n raise AssertionError(\n \"Halting compilation on build failure since \"\n + \"pass_through_build_failures was specified as True. \"\n + \"To return the default Torch implementation and avoid \"\n + \"halting compilation on engine build failures, \"\n + \"specify pass_through_build_failures=False.\"\n )\n\n\ndef _compile_module(\n gm: torch.fx.GraphModule,\n sample_inputs: Sequence[torch.Tensor],\n settings: CompilationSettings = CompilationSettings(),\n) -> torch.fx.GraphModule:\n \"\"\"Compile a traced FX module\n\n Includes: Partitioning + Conversion Phases\n\n Args:\n module: FX GraphModule to convert\n inputs: Inputs to the module\n settings: Compilation settings\n Returns:\n Compiled FX GraphModule\n \"\"\"\n # Partition module into components that can be TRT-accelerated\n partitioned_module = partition(\n gm,\n verbose=settings.debug,\n min_block_size=settings.min_block_size,\n torch_executed_ops=settings.torch_executed_ops,\n )\n\n # Store TRT replicas of Torch subgraphs\n trt_modules = {}\n\n # Iterate over all components that can be accelerated\n # Generate the corresponding TRT Module for those\n for name, _ in partitioned_module.named_children():\n submodule = getattr(partitioned_module, name)\n\n # Get submodule inputs\n submodule_inputs = get_submod_inputs(\n partitioned_module, submodule, sample_inputs\n )\n\n # Create TRT Module from submodule\n trt_mod = convert_module(\n submodule,\n submodule_inputs,\n settings=settings,\n )\n\n trt_modules[name] = trt_mod\n\n # Replace all FX Modules with TRT Modules\n for name, trt_mod in trt_modules.items():\n setattr(partitioned_module, name, trt_mod)\n\n return partitioned_module\n", "path": "py/torch_tensorrt/dynamo/backend/backends.py"}]}
num_tokens: 1,994
num_tokens_diff: 231
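The golden diff above changes `_compile_module` to collect TRT replacements in a dict and install them only after the `named_children()` loop, so `get_submod_inputs` keeps executing a parent module that is still all-Torch. A standalone sketch of that deferred-replacement pattern (the `get_inputs` and `convert` callables are illustrative stand-ins, not Torch-TensorRT APIs):

```python
import torch

def convert_children_deferred(parent: torch.nn.Module, get_inputs, convert):
    # Pass 1: build every replacement first. Replacing children inside
    # the loop would mean later get_inputs() calls run a parent that
    # already contains converted modules, as in the bug above.
    replacements = {}
    for name, child in parent.named_children():
        child_inputs = get_inputs(parent, child)  # runs the original parent
        replacements[name] = convert(child, child_inputs)

    # Pass 2: swap all children in one go.
    for name, converted in replacements.items():
        setattr(parent, name, converted)
    return parent
```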
Record 2
problem_id: gh_patches_debug_30885
source: rasdani/github-patches
task_type: git_diff
in_source_id: plotly__dash-2207
prompt:
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Render arbitrary keys of components inside dictionaries. It should be possible to render arbitrary keys as component inside an object prop. Example typescript props: ```ts type Props = { dynamic: {[k: string]: JSX.Element} } ``` prop-types: `PropTypes.objectOf(PropTypes.node)` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `dash/development/_collect_nodes.py` Content: ``` 1 def is_node(value): 2 return value in ("node", "element") 3 4 5 def is_shape(value): 6 return value in ("shape", "exact") 7 8 9 def collect_array(a_value, base, nodes): 10 a_type = a_value["name"] 11 if is_node(a_type): 12 nodes.append(base) 13 elif a_type in ("shape", "exact"): 14 nodes = collect_nodes(a_value["value"], base + "[]", nodes) 15 elif a_type == "union": 16 nodes = collect_union(a_value["value"], base + "[]", nodes) 17 return nodes 18 19 20 def collect_union(type_list, base, nodes): 21 for t in type_list: 22 if is_node(t["name"]): 23 nodes.append(base) 24 elif is_shape(t["name"]): 25 nodes = collect_nodes(t["value"], base, nodes) 26 elif t["name"] == "arrayOf": 27 nodes = collect_array(t["value"], base, nodes) 28 return nodes 29 30 31 def collect_nodes(metadata, base="", nodes=None): 32 nodes = nodes or [] 33 34 for prop_name, value in metadata.items(): 35 # Support for recursive shapes, the type is directly in the field. 36 t_value = value.get("type", value) 37 p_type = t_value.get("name") 38 39 if base: 40 key = f"{base}.{prop_name}" 41 else: 42 key = prop_name 43 if is_node(p_type): 44 nodes.append(key) 45 elif p_type == "arrayOf": 46 a_value = t_value.get("value", t_value) 47 nodes = collect_array(a_value, key, nodes) 48 elif is_shape(p_type): 49 nodes = collect_nodes(t_value["value"], key, nodes) 50 elif p_type == "union": 51 nodes = collect_union(t_value["value"], key, nodes) 52 53 return nodes 54 55 56 def filter_base_nodes(nodes): 57 return [n for n in nodes if not any(e in n for e in ("[]", "."))] 58 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/dash/development/_collect_nodes.py b/dash/development/_collect_nodes.py --- a/dash/development/_collect_nodes.py +++ b/dash/development/_collect_nodes.py @@ -14,6 +14,8 @@ nodes = collect_nodes(a_value["value"], base + "[]", nodes) elif a_type == "union": nodes = collect_union(a_value["value"], base + "[]", nodes) + elif a_type == "objectOf": + nodes = collect_object(a_value["value"], base + "[]", nodes) return nodes @@ -25,6 +27,22 @@ nodes = collect_nodes(t["value"], base, nodes) elif t["name"] == "arrayOf": nodes = collect_array(t["value"], base, nodes) + elif t["name"] == "objectOf": + nodes = collect_object(t["value"], base, nodes) + return nodes + + +def collect_object(o_value, base, nodes): + o_name = o_value.get("name") + o_key = base + "{}" + if is_node(o_name): + nodes.append(o_key) + elif is_shape(o_name): + nodes = collect_nodes(o_value.get("value", {}), o_key, nodes) + elif o_name == "union": + nodes = collect_union(o_value.get("value"), o_key, nodes) + elif o_name == "arrayOf": + nodes = collect_array(o_value, o_key, nodes) return nodes @@ -49,9 +67,12 @@ nodes = collect_nodes(t_value["value"], key, nodes) elif p_type == "union": nodes = collect_union(t_value["value"], key, nodes) + elif p_type == "objectOf": + o_value = t_value.get("value", {}) + nodes = collect_object(o_value, key, nodes) return nodes def filter_base_nodes(nodes): - return [n for n in nodes if not any(e in n for e in ("[]", "."))] + return [n for n in nodes if not any(e in n for e in ("[]", ".", "{}"))]
{"golden_diff": "diff --git a/dash/development/_collect_nodes.py b/dash/development/_collect_nodes.py\n--- a/dash/development/_collect_nodes.py\n+++ b/dash/development/_collect_nodes.py\n@@ -14,6 +14,8 @@\n nodes = collect_nodes(a_value[\"value\"], base + \"[]\", nodes)\n elif a_type == \"union\":\n nodes = collect_union(a_value[\"value\"], base + \"[]\", nodes)\n+ elif a_type == \"objectOf\":\n+ nodes = collect_object(a_value[\"value\"], base + \"[]\", nodes)\n return nodes\n \n \n@@ -25,6 +27,22 @@\n nodes = collect_nodes(t[\"value\"], base, nodes)\n elif t[\"name\"] == \"arrayOf\":\n nodes = collect_array(t[\"value\"], base, nodes)\n+ elif t[\"name\"] == \"objectOf\":\n+ nodes = collect_object(t[\"value\"], base, nodes)\n+ return nodes\n+\n+\n+def collect_object(o_value, base, nodes):\n+ o_name = o_value.get(\"name\")\n+ o_key = base + \"{}\"\n+ if is_node(o_name):\n+ nodes.append(o_key)\n+ elif is_shape(o_name):\n+ nodes = collect_nodes(o_value.get(\"value\", {}), o_key, nodes)\n+ elif o_name == \"union\":\n+ nodes = collect_union(o_value.get(\"value\"), o_key, nodes)\n+ elif o_name == \"arrayOf\":\n+ nodes = collect_array(o_value, o_key, nodes)\n return nodes\n \n \n@@ -49,9 +67,12 @@\n nodes = collect_nodes(t_value[\"value\"], key, nodes)\n elif p_type == \"union\":\n nodes = collect_union(t_value[\"value\"], key, nodes)\n+ elif p_type == \"objectOf\":\n+ o_value = t_value.get(\"value\", {})\n+ nodes = collect_object(o_value, key, nodes)\n \n return nodes\n \n \n def filter_base_nodes(nodes):\n- return [n for n in nodes if not any(e in n for e in (\"[]\", \".\"))]\n+ return [n for n in nodes if not any(e in n for e in (\"[]\", \".\", \"{}\"))]\n", "issue": "Render arbitrary keys of components inside dictionaries.\nIt should be possible to render arbitrary keys as component inside an object prop.\r\n\r\nExample typescript props:\r\n```ts\r\ntype Props = {\r\n dynamic: {[k: string]: JSX.Element}\r\n}\r\n```\r\nprop-types: `PropTypes.objectOf(PropTypes.node)` \n", "before_files": [{"content": "def is_node(value):\n return value in (\"node\", \"element\")\n\n\ndef is_shape(value):\n return value in (\"shape\", \"exact\")\n\n\ndef collect_array(a_value, base, nodes):\n a_type = a_value[\"name\"]\n if is_node(a_type):\n nodes.append(base)\n elif a_type in (\"shape\", \"exact\"):\n nodes = collect_nodes(a_value[\"value\"], base + \"[]\", nodes)\n elif a_type == \"union\":\n nodes = collect_union(a_value[\"value\"], base + \"[]\", nodes)\n return nodes\n\n\ndef collect_union(type_list, base, nodes):\n for t in type_list:\n if is_node(t[\"name\"]):\n nodes.append(base)\n elif is_shape(t[\"name\"]):\n nodes = collect_nodes(t[\"value\"], base, nodes)\n elif t[\"name\"] == \"arrayOf\":\n nodes = collect_array(t[\"value\"], base, nodes)\n return nodes\n\n\ndef collect_nodes(metadata, base=\"\", nodes=None):\n nodes = nodes or []\n\n for prop_name, value in metadata.items():\n # Support for recursive shapes, the type is directly in the field.\n t_value = value.get(\"type\", value)\n p_type = t_value.get(\"name\")\n\n if base:\n key = f\"{base}.{prop_name}\"\n else:\n key = prop_name\n if is_node(p_type):\n nodes.append(key)\n elif p_type == \"arrayOf\":\n a_value = t_value.get(\"value\", t_value)\n nodes = collect_array(a_value, key, nodes)\n elif is_shape(p_type):\n nodes = collect_nodes(t_value[\"value\"], key, nodes)\n elif p_type == \"union\":\n nodes = collect_union(t_value[\"value\"], key, nodes)\n\n return nodes\n\n\ndef filter_base_nodes(nodes):\n return [n for n in nodes if not any(e in n 
for e in (\"[]\", \".\"))]\n", "path": "dash/development/_collect_nodes.py"}], "after_files": [{"content": "def is_node(value):\n return value in (\"node\", \"element\")\n\n\ndef is_shape(value):\n return value in (\"shape\", \"exact\")\n\n\ndef collect_array(a_value, base, nodes):\n a_type = a_value[\"name\"]\n if is_node(a_type):\n nodes.append(base)\n elif a_type in (\"shape\", \"exact\"):\n nodes = collect_nodes(a_value[\"value\"], base + \"[]\", nodes)\n elif a_type == \"union\":\n nodes = collect_union(a_value[\"value\"], base + \"[]\", nodes)\n elif a_type == \"objectOf\":\n nodes = collect_object(a_value[\"value\"], base + \"[]\", nodes)\n return nodes\n\n\ndef collect_union(type_list, base, nodes):\n for t in type_list:\n if is_node(t[\"name\"]):\n nodes.append(base)\n elif is_shape(t[\"name\"]):\n nodes = collect_nodes(t[\"value\"], base, nodes)\n elif t[\"name\"] == \"arrayOf\":\n nodes = collect_array(t[\"value\"], base, nodes)\n elif t[\"name\"] == \"objectOf\":\n nodes = collect_object(t[\"value\"], base, nodes)\n return nodes\n\n\ndef collect_object(o_value, base, nodes):\n o_name = o_value.get(\"name\")\n o_key = base + \"{}\"\n if is_node(o_name):\n nodes.append(o_key)\n elif is_shape(o_name):\n nodes = collect_nodes(o_value.get(\"value\", {}), o_key, nodes)\n elif o_name == \"union\":\n nodes = collect_union(o_value.get(\"value\"), o_key, nodes)\n elif o_name == \"arrayOf\":\n nodes = collect_array(o_value, o_key, nodes)\n return nodes\n\n\ndef collect_nodes(metadata, base=\"\", nodes=None):\n nodes = nodes or []\n\n for prop_name, value in metadata.items():\n # Support for recursive shapes, the type is directly in the field.\n t_value = value.get(\"type\", value)\n p_type = t_value.get(\"name\")\n\n if base:\n key = f\"{base}.{prop_name}\"\n else:\n key = prop_name\n if is_node(p_type):\n nodes.append(key)\n elif p_type == \"arrayOf\":\n a_value = t_value.get(\"value\", t_value)\n nodes = collect_array(a_value, key, nodes)\n elif is_shape(p_type):\n nodes = collect_nodes(t_value[\"value\"], key, nodes)\n elif p_type == \"union\":\n nodes = collect_union(t_value[\"value\"], key, nodes)\n elif p_type == \"objectOf\":\n o_value = t_value.get(\"value\", {})\n nodes = collect_object(o_value, key, nodes)\n\n return nodes\n\n\ndef filter_base_nodes(nodes):\n return [n for n in nodes if not any(e in n for e in (\"[]\", \".\", \"{}\"))]\n", "path": "dash/development/_collect_nodes.py"}]}
num_tokens: 847
num_tokens_diff: 486
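With the patch applied, `objectOf` props are traversed and their node paths carry a `{}` suffix, which `filter_base_nodes` now also excludes. A quick check against the patched module, mirroring the `PropTypes.objectOf(PropTypes.node)` case from the issue:

```python
from dash.development._collect_nodes import collect_nodes, filter_base_nodes

# `dynamic` mirrors PropTypes.objectOf(PropTypes.node) from the issue;
# `label` is a plain top-level node for comparison.
metadata = {
    "dynamic": {"type": {"name": "objectOf", "value": {"name": "node"}}},
    "label": {"type": {"name": "node"}},
}

nodes = collect_nodes(metadata)
print(nodes)                     # ['dynamic{}', 'label']
print(filter_base_nodes(nodes))  # ['label'] -- '{}' paths are not base nodes
```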
Record 3
problem_id: gh_patches_debug_16417
source: rasdani/github-patches
task_type: git_diff
in_source_id: celery__kombu-101
prompt:
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Kombu compression "encodes" the body ? At https://github.com/ask/kombu/blob/master/kombu/compression.py#L61, we see: ``` return encoder(body.encode("utf-8")), content_type ``` Same for decoding. What gives? It crashes with msgpack (which can't be encoded). Is there any reason for that ? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `kombu/compression.py` Content: ``` 1 """ 2 kombu.compression 3 ================= 4 5 Compression utilities. 6 7 :copyright: (c) 2009 - 2012 by Ask Solem. 8 :license: BSD, see LICENSE for more details. 9 10 """ 11 from __future__ import absolute_import 12 13 import zlib 14 15 _aliases = {} 16 _encoders = {} 17 _decoders = {} 18 19 __all__ = ["register", "encoders", "get_encoder", 20 "get_decoder", "compress", "decompress"] 21 22 23 def register(encoder, decoder, content_type, aliases=[]): 24 """Register new compression method. 25 26 :param encoder: Function used to compress text. 27 :param decoder: Function used to decompress previously compressed text. 28 :param content_type: The mime type this compression method identifies as. 29 :param aliases: A list of names to associate with this compression method. 30 31 """ 32 _encoders[content_type] = encoder 33 _decoders[content_type] = decoder 34 _aliases.update((alias, content_type) for alias in aliases) 35 36 37 def encoders(): 38 """Returns a list of available compression methods.""" 39 return _encoders.keys() 40 41 42 def get_encoder(t): 43 """Get encoder by alias name.""" 44 t = _aliases.get(t, t) 45 return _encoders[t], t 46 47 48 def get_decoder(t): 49 """Get decoder by alias name.""" 50 return _decoders[_aliases.get(t, t)] 51 52 53 def compress(body, content_type): 54 """Compress text. 55 56 :param body: The text to compress. 57 :param content_type: mime-type of compression method to use. 58 59 """ 60 encoder, content_type = get_encoder(content_type) 61 return encoder(body.encode("utf-8")), content_type 62 63 64 def decompress(body, content_type): 65 """Decompress compressed text. 66 67 :param body: Previously compressed text to uncompress. 68 :param content_type: mime-type of compression method used. 69 70 """ 71 return get_decoder(content_type)(body).decode("utf-8") 72 73 74 register(zlib.compress, 75 zlib.decompress, 76 "application/x-gzip", aliases=["gzip", "zlib"]) 77 try: 78 import bz2 79 except ImportError: 80 pass # Jython? 81 else: 82 register(bz2.compress, 83 bz2.decompress, 84 "application/x-bz2", aliases=["bzip2", "bzip"]) 85 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/kombu/compression.py b/kombu/compression.py --- a/kombu/compression.py +++ b/kombu/compression.py @@ -10,6 +10,8 @@ """ from __future__ import absolute_import +from kombu.utils.encoding import ensure_bytes, bytes_to_str + import zlib _aliases = {} @@ -58,7 +60,7 @@ """ encoder, content_type = get_encoder(content_type) - return encoder(body.encode("utf-8")), content_type + return encoder(ensure_bytes(body)), content_type def decompress(body, content_type): @@ -68,7 +70,7 @@ :param content_type: mime-type of compression method used. """ - return get_decoder(content_type)(body).decode("utf-8") + return bytes_to_str(get_decoder(content_type)(body)) register(zlib.compress,
{"golden_diff": "diff --git a/kombu/compression.py b/kombu/compression.py\n--- a/kombu/compression.py\n+++ b/kombu/compression.py\n@@ -10,6 +10,8 @@\n \"\"\"\n from __future__ import absolute_import\n \n+from kombu.utils.encoding import ensure_bytes, bytes_to_str\n+\n import zlib\n \n _aliases = {}\n@@ -58,7 +60,7 @@\n \n \"\"\"\n encoder, content_type = get_encoder(content_type)\n- return encoder(body.encode(\"utf-8\")), content_type\n+ return encoder(ensure_bytes(body)), content_type\n \n \n def decompress(body, content_type):\n@@ -68,7 +70,7 @@\n :param content_type: mime-type of compression method used.\n \n \"\"\"\n- return get_decoder(content_type)(body).decode(\"utf-8\")\n+ return bytes_to_str(get_decoder(content_type)(body))\n \n \n register(zlib.compress,\n", "issue": "Kombu compression \"encodes\" the body ?\nAt https://github.com/ask/kombu/blob/master/kombu/compression.py#L61, we see:\n\n```\nreturn encoder(body.encode(\"utf-8\")), content_type\n```\n\nSame for decoding. What gives? It crashes with msgpack (which can't be encoded). Is there any reason for that ?\n\n", "before_files": [{"content": "\"\"\"\nkombu.compression\n=================\n\nCompression utilities.\n\n:copyright: (c) 2009 - 2012 by Ask Solem.\n:license: BSD, see LICENSE for more details.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nimport zlib\n\n_aliases = {}\n_encoders = {}\n_decoders = {}\n\n__all__ = [\"register\", \"encoders\", \"get_encoder\",\n \"get_decoder\", \"compress\", \"decompress\"]\n\n\ndef register(encoder, decoder, content_type, aliases=[]):\n \"\"\"Register new compression method.\n\n :param encoder: Function used to compress text.\n :param decoder: Function used to decompress previously compressed text.\n :param content_type: The mime type this compression method identifies as.\n :param aliases: A list of names to associate with this compression method.\n\n \"\"\"\n _encoders[content_type] = encoder\n _decoders[content_type] = decoder\n _aliases.update((alias, content_type) for alias in aliases)\n\n\ndef encoders():\n \"\"\"Returns a list of available compression methods.\"\"\"\n return _encoders.keys()\n\n\ndef get_encoder(t):\n \"\"\"Get encoder by alias name.\"\"\"\n t = _aliases.get(t, t)\n return _encoders[t], t\n\n\ndef get_decoder(t):\n \"\"\"Get decoder by alias name.\"\"\"\n return _decoders[_aliases.get(t, t)]\n\n\ndef compress(body, content_type):\n \"\"\"Compress text.\n\n :param body: The text to compress.\n :param content_type: mime-type of compression method to use.\n\n \"\"\"\n encoder, content_type = get_encoder(content_type)\n return encoder(body.encode(\"utf-8\")), content_type\n\n\ndef decompress(body, content_type):\n \"\"\"Decompress compressed text.\n\n :param body: Previously compressed text to uncompress.\n :param content_type: mime-type of compression method used.\n\n \"\"\"\n return get_decoder(content_type)(body).decode(\"utf-8\")\n\n\nregister(zlib.compress,\n zlib.decompress,\n \"application/x-gzip\", aliases=[\"gzip\", \"zlib\"])\ntry:\n import bz2\nexcept ImportError:\n pass # Jython?\nelse:\n register(bz2.compress,\n bz2.decompress,\n \"application/x-bz2\", aliases=[\"bzip2\", \"bzip\"])\n", "path": "kombu/compression.py"}], "after_files": [{"content": "\"\"\"\nkombu.compression\n=================\n\nCompression utilities.\n\n:copyright: (c) 2009 - 2012 by Ask Solem.\n:license: BSD, see LICENSE for more details.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nfrom kombu.utils.encoding import ensure_bytes, bytes_to_str\n\nimport zlib\n\n_aliases 
= {}\n_encoders = {}\n_decoders = {}\n\n__all__ = [\"register\", \"encoders\", \"get_encoder\",\n \"get_decoder\", \"compress\", \"decompress\"]\n\n\ndef register(encoder, decoder, content_type, aliases=[]):\n \"\"\"Register new compression method.\n\n :param encoder: Function used to compress text.\n :param decoder: Function used to decompress previously compressed text.\n :param content_type: The mime type this compression method identifies as.\n :param aliases: A list of names to associate with this compression method.\n\n \"\"\"\n _encoders[content_type] = encoder\n _decoders[content_type] = decoder\n _aliases.update((alias, content_type) for alias in aliases)\n\n\ndef encoders():\n \"\"\"Returns a list of available compression methods.\"\"\"\n return _encoders.keys()\n\n\ndef get_encoder(t):\n \"\"\"Get encoder by alias name.\"\"\"\n t = _aliases.get(t, t)\n return _encoders[t], t\n\n\ndef get_decoder(t):\n \"\"\"Get decoder by alias name.\"\"\"\n return _decoders[_aliases.get(t, t)]\n\n\ndef compress(body, content_type):\n \"\"\"Compress text.\n\n :param body: The text to compress.\n :param content_type: mime-type of compression method to use.\n\n \"\"\"\n encoder, content_type = get_encoder(content_type)\n return encoder(ensure_bytes(body)), content_type\n\n\ndef decompress(body, content_type):\n \"\"\"Decompress compressed text.\n\n :param body: Previously compressed text to uncompress.\n :param content_type: mime-type of compression method used.\n\n \"\"\"\n return bytes_to_str(get_decoder(content_type)(body))\n\n\nregister(zlib.compress,\n zlib.decompress,\n \"application/x-gzip\", aliases=[\"gzip\", \"zlib\"])\ntry:\n import bz2\nexcept ImportError:\n pass # Jython?\nelse:\n register(bz2.compress,\n bz2.decompress,\n \"application/x-bz2\", aliases=[\"bzip2\", \"bzip\"])\n", "path": "kombu/compression.py"}]}
num_tokens: 1,012
num_tokens_diff: 203
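After the patch, `compress` runs the body through `ensure_bytes` instead of calling `body.encode("utf-8")`, so pre-serialized payloads (e.g. msgpack output, which is binary and cannot be UTF-8 encoded) no longer crash. A quick round-trip sketch against the patched module:

```python
from kombu.compression import compress, decompress

# Before the patch this called body.encode("utf-8"), which fails for
# a bytes body; ensure_bytes() now passes bytes through untouched.
compressed, content_type = compress(b"hello world", "zlib")
print(content_type)  # application/x-gzip

print(decompress(compressed, content_type))  # hello world
```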
Record 4
problem_id: gh_patches_debug_31050
source: rasdani/github-patches
task_type: git_diff
in_source_id: litestar-org__litestar-2204
prompt:
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Bug: pydantic validations aren't being handled by DTOs ### Description In the documentation of Litestar we have: > When a value fails pydantic validation, the result will be a ValidationException with the extra key set to the pydantic validation errors. Thus, this data will be made available for the API consumers by default. But when combined with DTO's in a post request, if a pydantic validation fails the exception is not handled and the client receives an InternalServerError (500) instead of a BadRequest (400): ``` __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__) pydantic_core._pydantic_core.ValidationError: 1 validation error for User name String should have at most 2 characters [type=string_too_long, input_value='abc', input_type=str] For further information visit https://errors.pydantic.dev/2.2/v/string_too_long Status code: 500 ``` If the validation isn't handled by pydantic (or at least it isn't handled **first** by pydantic) it works, for example an Enum validation returns a proper 400 status code because `msgspec` catches it first: ```python from enum import Enum class NameEnum(str, Enum): a = "A" b = "B" # (replace the User class of the example with this) class User(BaseModel): name: NameEnum ``` Output for incorrect Enum: ``` File ".../.venv/lib64/python3.11/site-packages/litestar/serialization/msgspec_hooks.py", line 191, in decode_json raise SerializationException(str(msgspec_error)) from msgspec_error litestar.exceptions.base_exceptions.SerializationException: Invalid enum value 'abc' - at `$.name` ... File ".../.venv/lib64/python3.11/site-packages/litestar/routes/http.py", line 186, in _get_response_data raise ClientException(str(e)) from e litestar.exceptions.http_exceptions.ClientException: 400: Invalid enum value 'abc' - at `$.name` ``` ### URL to code causing the issue _No response_ ### MCVE ```python from litestar import Litestar, post from litestar.contrib.pydantic import PydanticDTO from litestar.testing import TestClient from pydantic import BaseModel, Field class User(BaseModel): name: str = Field(max_length=2) UserDTO = PydanticDTO[User] @post("/user", dto=UserDTO, sync_to_thread=False) def create_user(data: User) -> User: return data with TestClient(Litestar([create_user], debug=True)) as client: response = client.post("/user", json={"name": "abc"}) print(response.text) print(f"Status code: {response.status_code}") assert response.status_code == 201 ``` ### Steps to reproduce ```bash 1. Execute the MCVE ``` ### Screenshots _No response_ ### Logs _No response_ ### Litestar Version 2.0.0rc1 ### Platform - [X] Linux - [ ] Mac - [ ] Windows - [ ] Other (Please specify in the description above) <!-- POLAR PLEDGE BADGE START --> --- ## Funding * If you would like to see an issue prioritized, make a pledge towards it! 
* We receive the pledge once the issue is completed & verified <a href="https://polar.sh/litestar-org/litestar/issues/2190"> <picture> <source media="(prefers-color-scheme: dark)" srcset="https://polar.sh/api/github/litestar-org/litestar/issues/2190/pledge.svg?darkmode=1"> <img alt="Fund with Polar" src="https://polar.sh/api/github/litestar-org/litestar/issues/2190/pledge.svg"> </picture> </a> <!-- POLAR PLEDGE BADGE END --> StaticFilesConfig and virtual directories I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. This is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems. https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `litestar/contrib/pydantic/pydantic_dto_factory.py` Content: ``` 1 from __future__ import annotations 2 3 from dataclasses import replace 4 from typing import TYPE_CHECKING, Collection, Generic, TypeVar 5 6 from litestar.dto.base_dto import AbstractDTO 7 from litestar.dto.data_structures import DTOFieldDefinition 8 from litestar.dto.field import DTO_FIELD_META_KEY, DTOField 9 from litestar.exceptions import MissingDependencyException 10 from litestar.types.empty import Empty 11 12 if TYPE_CHECKING: 13 from typing import Generator 14 15 from litestar.typing import FieldDefinition 16 17 18 try: 19 import pydantic 20 21 if pydantic.VERSION.startswith("2"): 22 from pydantic_core import PydanticUndefined 23 else: # pragma: no cover 24 from pydantic.fields import Undefined as PydanticUndefined # type: ignore 25 except ImportError as e: 26 raise MissingDependencyException("pydantic") from e 27 28 __all__ = ("PydanticDTO",) 29 30 T = TypeVar("T", bound="pydantic.BaseModel | Collection[pydantic.BaseModel]") 31 32 33 class PydanticDTO(AbstractDTO[T], Generic[T]): 34 """Support for domain modelling with Pydantic.""" 35 36 @classmethod 37 def generate_field_definitions( 38 cls, model_type: type[pydantic.BaseModel] 39 ) -> Generator[DTOFieldDefinition, None, None]: 40 model_field_definitions = cls.get_model_type_hints(model_type) 41 42 if pydantic.VERSION.startswith("1"): # pragma: no cover 43 model_fields: dict[str, pydantic.fields.FieldInfo] = {k: model_field.field_info for k, model_field in model_type.__fields__.items()} # type: ignore 44 else: 45 model_fields = dict(model_type.model_fields) 46 47 for field_name, field_info in model_fields.items(): 48 field_definition = model_field_definitions[field_name] 49 dto_field = (field_definition.extra or {}).pop(DTO_FIELD_META_KEY, DTOField()) 50 51 if field_info.default is not PydanticUndefined: 52 default = field_info.default 53 elif field_definition.is_optional: 54 default = None 55 else: 56 default = Empty 57 58 yield replace( 59 DTOFieldDefinition.from_field_definition( 60 field_definition=field_definition, 61 dto_field=dto_field, 62 model_name=model_type.__name__, 63 default_factory=field_info.default_factory 64 if field_info.default_factory and field_info.default_factory is not PydanticUndefined # type: ignore[comparison-overlap] 65 else Empty, 66 ), 67 
default=default, 68 name=field_name, 69 ) 70 71 @classmethod 72 def detect_nested_field(cls, field_definition: FieldDefinition) -> bool: 73 return field_definition.is_subclass_of(pydantic.BaseModel) 74 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/litestar/contrib/pydantic/pydantic_dto_factory.py b/litestar/contrib/pydantic/pydantic_dto_factory.py --- a/litestar/contrib/pydantic/pydantic_dto_factory.py +++ b/litestar/contrib/pydantic/pydantic_dto_factory.py @@ -3,20 +3,23 @@ from dataclasses import replace from typing import TYPE_CHECKING, Collection, Generic, TypeVar +from typing_extensions import override + from litestar.dto.base_dto import AbstractDTO from litestar.dto.data_structures import DTOFieldDefinition from litestar.dto.field import DTO_FIELD_META_KEY, DTOField -from litestar.exceptions import MissingDependencyException +from litestar.exceptions import MissingDependencyException, ValidationException from litestar.types.empty import Empty if TYPE_CHECKING: - from typing import Generator + from typing import Any, Generator from litestar.typing import FieldDefinition try: import pydantic + from pydantic import ValidationError if pydantic.VERSION.startswith("2"): from pydantic_core import PydanticUndefined @@ -33,6 +36,20 @@ class PydanticDTO(AbstractDTO[T], Generic[T]): """Support for domain modelling with Pydantic.""" + @override + def decode_builtins(self, value: dict[str, Any]) -> Any: + try: + return super().decode_builtins(value) + except ValidationError as ex: + raise ValidationException(extra=ex.errors()) from ex + + @override + def decode_bytes(self, value: bytes) -> Any: + try: + return super().decode_bytes(value) + except ValidationError as ex: + raise ValidationException(extra=ex.errors()) from ex + @classmethod def generate_field_definitions( cls, model_type: type[pydantic.BaseModel]
{"golden_diff": "diff --git a/litestar/contrib/pydantic/pydantic_dto_factory.py b/litestar/contrib/pydantic/pydantic_dto_factory.py\n--- a/litestar/contrib/pydantic/pydantic_dto_factory.py\n+++ b/litestar/contrib/pydantic/pydantic_dto_factory.py\n@@ -3,20 +3,23 @@\n from dataclasses import replace\n from typing import TYPE_CHECKING, Collection, Generic, TypeVar\n \n+from typing_extensions import override\n+\n from litestar.dto.base_dto import AbstractDTO\n from litestar.dto.data_structures import DTOFieldDefinition\n from litestar.dto.field import DTO_FIELD_META_KEY, DTOField\n-from litestar.exceptions import MissingDependencyException\n+from litestar.exceptions import MissingDependencyException, ValidationException\n from litestar.types.empty import Empty\n \n if TYPE_CHECKING:\n- from typing import Generator\n+ from typing import Any, Generator\n \n from litestar.typing import FieldDefinition\n \n \n try:\n import pydantic\n+ from pydantic import ValidationError\n \n if pydantic.VERSION.startswith(\"2\"):\n from pydantic_core import PydanticUndefined\n@@ -33,6 +36,20 @@\n class PydanticDTO(AbstractDTO[T], Generic[T]):\n \"\"\"Support for domain modelling with Pydantic.\"\"\"\n \n+ @override\n+ def decode_builtins(self, value: dict[str, Any]) -> Any:\n+ try:\n+ return super().decode_builtins(value)\n+ except ValidationError as ex:\n+ raise ValidationException(extra=ex.errors()) from ex\n+\n+ @override\n+ def decode_bytes(self, value: bytes) -> Any:\n+ try:\n+ return super().decode_bytes(value)\n+ except ValidationError as ex:\n+ raise ValidationException(extra=ex.errors()) from ex\n+\n @classmethod\n def generate_field_definitions(\n cls, model_type: type[pydantic.BaseModel]\n", "issue": "Bug: pydantic validations aren't being handled by DTOs\n### Description\r\n\r\nIn the documentation of Litestar we have:\r\n\r\n> When a value fails pydantic validation, the result will be a ValidationException with the extra key set to the pydantic validation errors. 
Thus, this data will be made available for the API consumers by default.\r\n\r\nBut when combined with DTO's in a post request, if a pydantic validation fails the exception is not handled and the client receives an InternalServerError (500) instead of a BadRequest (400):\r\n\r\n\r\n```\r\n __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)\r\npydantic_core._pydantic_core.ValidationError: 1 validation error for User\r\nname\r\n String should have at most 2 characters [type=string_too_long, input_value='abc', input_type=str]\r\n For further information visit https://errors.pydantic.dev/2.2/v/string_too_long\r\n\r\nStatus code: 500\r\n```\r\n\r\nIf the validation isn't handled by pydantic (or at least it isn't handled **first** by pydantic) it works, for example an Enum validation returns a proper 400 status code because `msgspec` catches it first:\r\n\r\n```python\r\nfrom enum import Enum\r\n\r\nclass NameEnum(str, Enum):\r\n a = \"A\"\r\n b = \"B\"\r\n\r\n# (replace the User class of the example with this)\r\nclass User(BaseModel):\r\n name: NameEnum\r\n```\r\n\r\nOutput for incorrect Enum:\r\n\r\n```\r\n File \".../.venv/lib64/python3.11/site-packages/litestar/serialization/msgspec_hooks.py\", line 191, in decode_json\r\n raise SerializationException(str(msgspec_error)) from msgspec_error\r\nlitestar.exceptions.base_exceptions.SerializationException: Invalid enum value 'abc' - at `$.name`\r\n\r\n\r\n...\r\n\r\n File \".../.venv/lib64/python3.11/site-packages/litestar/routes/http.py\", line 186, in _get_response_data\r\n raise ClientException(str(e)) from e\r\nlitestar.exceptions.http_exceptions.ClientException: 400: Invalid enum value 'abc' - at `$.name`\r\n```\r\n\r\n### URL to code causing the issue\r\n\r\n_No response_\r\n\r\n### MCVE\r\n\r\n```python\r\nfrom litestar import Litestar, post\r\nfrom litestar.contrib.pydantic import PydanticDTO\r\nfrom litestar.testing import TestClient\r\nfrom pydantic import BaseModel, Field\r\n\r\n\r\nclass User(BaseModel):\r\n name: str = Field(max_length=2)\r\n\r\n\r\nUserDTO = PydanticDTO[User]\r\n\r\n\r\n@post(\"/user\", dto=UserDTO, sync_to_thread=False)\r\ndef create_user(data: User) -> User:\r\n return data\r\n\r\n\r\nwith TestClient(Litestar([create_user], debug=True)) as client:\r\n response = client.post(\"/user\", json={\"name\": \"abc\"})\r\n print(response.text)\r\n print(f\"Status code: {response.status_code}\")\r\n assert response.status_code == 201\r\n```\r\n\r\n\r\n### Steps to reproduce\r\n\r\n```bash\r\n1. 
Execute the MCVE\r\n```\r\n\r\n\r\n### Screenshots\r\n\r\n_No response_\r\n\r\n### Logs\r\n\r\n_No response_\r\n\r\n### Litestar Version\r\n\r\n2.0.0rc1\r\n\r\n### Platform\r\n\r\n- [X] Linux\r\n- [ ] Mac\r\n- [ ] Windows\r\n- [ ] Other (Please specify in the description above)\r\n\r\n<!-- POLAR PLEDGE BADGE START -->\r\n---\r\n\r\n## Funding\r\n* If you would like to see an issue prioritized, make a pledge towards it!\r\n* We receive the pledge once the issue is completed & verified\r\n\r\n<a href=\"https://polar.sh/litestar-org/litestar/issues/2190\">\r\n<picture>\r\n <source media=\"(prefers-color-scheme: dark)\" srcset=\"https://polar.sh/api/github/litestar-org/litestar/issues/2190/pledge.svg?darkmode=1\">\r\n <img alt=\"Fund with Polar\" src=\"https://polar.sh/api/github/litestar-org/litestar/issues/2190/pledge.svg\">\r\n</picture>\r\n</a>\r\n<!-- POLAR PLEDGE BADGE END -->\r\n\nStaticFilesConfig and virtual directories\nI'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. \r\n\r\nThis is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.\r\n\r\nhttps://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom dataclasses import replace\nfrom typing import TYPE_CHECKING, Collection, Generic, TypeVar\n\nfrom litestar.dto.base_dto import AbstractDTO\nfrom litestar.dto.data_structures import DTOFieldDefinition\nfrom litestar.dto.field import DTO_FIELD_META_KEY, DTOField\nfrom litestar.exceptions import MissingDependencyException\nfrom litestar.types.empty import Empty\n\nif TYPE_CHECKING:\n from typing import Generator\n\n from litestar.typing import FieldDefinition\n\n\ntry:\n import pydantic\n\n if pydantic.VERSION.startswith(\"2\"):\n from pydantic_core import PydanticUndefined\n else: # pragma: no cover\n from pydantic.fields import Undefined as PydanticUndefined # type: ignore\nexcept ImportError as e:\n raise MissingDependencyException(\"pydantic\") from e\n\n__all__ = (\"PydanticDTO\",)\n\nT = TypeVar(\"T\", bound=\"pydantic.BaseModel | Collection[pydantic.BaseModel]\")\n\n\nclass PydanticDTO(AbstractDTO[T], Generic[T]):\n \"\"\"Support for domain modelling with Pydantic.\"\"\"\n\n @classmethod\n def generate_field_definitions(\n cls, model_type: type[pydantic.BaseModel]\n ) -> Generator[DTOFieldDefinition, None, None]:\n model_field_definitions = cls.get_model_type_hints(model_type)\n\n if pydantic.VERSION.startswith(\"1\"): # pragma: no cover\n model_fields: dict[str, pydantic.fields.FieldInfo] = {k: model_field.field_info for k, model_field in model_type.__fields__.items()} # type: ignore\n else:\n model_fields = dict(model_type.model_fields)\n\n for field_name, field_info in model_fields.items():\n field_definition = model_field_definitions[field_name]\n dto_field = (field_definition.extra or {}).pop(DTO_FIELD_META_KEY, DTOField())\n\n if field_info.default is not PydanticUndefined:\n default = field_info.default\n elif field_definition.is_optional:\n default = None\n else:\n default = Empty\n\n yield replace(\n DTOFieldDefinition.from_field_definition(\n 
field_definition=field_definition,\n dto_field=dto_field,\n model_name=model_type.__name__,\n default_factory=field_info.default_factory\n if field_info.default_factory and field_info.default_factory is not PydanticUndefined # type: ignore[comparison-overlap]\n else Empty,\n ),\n default=default,\n name=field_name,\n )\n\n @classmethod\n def detect_nested_field(cls, field_definition: FieldDefinition) -> bool:\n return field_definition.is_subclass_of(pydantic.BaseModel)\n", "path": "litestar/contrib/pydantic/pydantic_dto_factory.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom dataclasses import replace\nfrom typing import TYPE_CHECKING, Collection, Generic, TypeVar\n\nfrom typing_extensions import override\n\nfrom litestar.dto.base_dto import AbstractDTO\nfrom litestar.dto.data_structures import DTOFieldDefinition\nfrom litestar.dto.field import DTO_FIELD_META_KEY, DTOField\nfrom litestar.exceptions import MissingDependencyException, ValidationException\nfrom litestar.types.empty import Empty\n\nif TYPE_CHECKING:\n from typing import Any, Generator\n\n from litestar.typing import FieldDefinition\n\n\ntry:\n import pydantic\n from pydantic import ValidationError\n\n if pydantic.VERSION.startswith(\"2\"):\n from pydantic_core import PydanticUndefined\n else: # pragma: no cover\n from pydantic.fields import Undefined as PydanticUndefined # type: ignore\nexcept ImportError as e:\n raise MissingDependencyException(\"pydantic\") from e\n\n__all__ = (\"PydanticDTO\",)\n\nT = TypeVar(\"T\", bound=\"pydantic.BaseModel | Collection[pydantic.BaseModel]\")\n\n\nclass PydanticDTO(AbstractDTO[T], Generic[T]):\n \"\"\"Support for domain modelling with Pydantic.\"\"\"\n\n @override\n def decode_builtins(self, value: dict[str, Any]) -> Any:\n try:\n return super().decode_builtins(value)\n except ValidationError as ex:\n raise ValidationException(extra=ex.errors()) from ex\n\n @override\n def decode_bytes(self, value: bytes) -> Any:\n try:\n return super().decode_bytes(value)\n except ValidationError as ex:\n raise ValidationException(extra=ex.errors()) from ex\n\n @classmethod\n def generate_field_definitions(\n cls, model_type: type[pydantic.BaseModel]\n ) -> Generator[DTOFieldDefinition, None, None]:\n model_field_definitions = cls.get_model_type_hints(model_type)\n\n if pydantic.VERSION.startswith(\"1\"): # pragma: no cover\n model_fields: dict[str, pydantic.fields.FieldInfo] = {k: model_field.field_info for k, model_field in model_type.__fields__.items()} # type: ignore\n else:\n model_fields = dict(model_type.model_fields)\n\n for field_name, field_info in model_fields.items():\n field_definition = model_field_definitions[field_name]\n dto_field = (field_definition.extra or {}).pop(DTO_FIELD_META_KEY, DTOField())\n\n if field_info.default is not PydanticUndefined:\n default = field_info.default\n elif field_definition.is_optional:\n default = None\n else:\n default = Empty\n\n yield replace(\n DTOFieldDefinition.from_field_definition(\n field_definition=field_definition,\n dto_field=dto_field,\n model_name=model_type.__name__,\n default_factory=field_info.default_factory\n if field_info.default_factory and field_info.default_factory is not PydanticUndefined # type: ignore[comparison-overlap]\n else Empty,\n ),\n default=default,\n name=field_name,\n )\n\n @classmethod\n def detect_nested_field(cls, field_definition: FieldDefinition) -> bool:\n return field_definition.is_subclass_of(pydantic.BaseModel)\n", "path": "litestar/contrib/pydantic/pydantic_dto_factory.py"}]}
2,047
424
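Editor's note: the record above patches `PydanticDTO` to catch pydantic's `ValidationError` during decoding and re-raise it as litestar's `ValidationException`, preserving the error details. Below is a minimal, dependency-free sketch of that error-translation pattern. `LibraryValidationError` and `FrameworkValidationException` are hypothetical stand-ins for the pydantic and litestar classes, not the real APIs.

```python
class LibraryValidationError(Exception):
    """Stand-in for a validation library's error (hypothetical)."""

    def __init__(self, errors):
        super().__init__("validation failed")
        self._errors = errors

    def errors(self):
        return self._errors


class FrameworkValidationException(Exception):
    """Stand-in for the web framework's 400-style exception (hypothetical)."""

    def __init__(self, extra=None):
        super().__init__("400: validation failed")
        self.extra = extra


def decode(raw, decoder):
    # Translate the library's error into the framework's exception so the
    # client gets a 400 response instead of an unhandled 500.
    try:
        return decoder(raw)
    except LibraryValidationError as ex:
        raise FrameworkValidationException(extra=ex.errors()) from ex


def failing_decoder(raw):
    raise LibraryValidationError([{"loc": ("name",), "msg": "field required"}])


try:
    decode(b"{}", failing_decoder)
except FrameworkValidationException as ex:
    print(ex.extra)  # [{'loc': ('name',), 'msg': 'field required'}]
```

The key design choice is `raise ... from ex`: the original traceback stays attached for debugging while callers only need to handle the framework exception.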
gh_patches_debug_1572
rasdani/github-patches
git_diff
hylang__hy-2070
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- The manual is missing module names It looks like the big doc reorganization ended up omitting the actual module names of the various contrib and extra modules; for example, the section named "Walk" documents the functions and macros in `hy.contrib.walk`, but doesn't mention the name `hy.contrib.walk` or otherwise indicate how to bring the enumerated names into scope. Pointed out in https://github.com/hylang/hy/issues/2065#issuecomment-842377526. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `docs/conf.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 # 3 # This file is execfile()d with the current directory set to its containing dir. 4 5 import re, os, sys, time, html 6 7 sys.path.insert(0, os.path.abspath('..')) 8 9 extensions = [ 10 'sphinx.ext.napoleon', 11 'sphinx.ext.intersphinx', 12 'sphinx.ext.autodoc', 13 'sphinxcontrib.hydomain', 14 ] 15 16 from get_version import __version__ as hy_version 17 18 # Read the Docs might dirty its checkout, so strip the dirty flag. 19 hy_version = re.sub(r'[+.]dirty\Z', '', hy_version) 20 21 templates_path = ['_templates'] 22 source_suffix = '.rst' 23 24 master_doc = 'index' 25 26 # General information about the project. 27 project = 'hy' 28 copyright = '%s the authors' % time.strftime('%Y') 29 30 # The version info for the project you're documenting, acts as replacement for 31 # |version| and |release|, also used in various other places throughout the 32 # built documents. 33 # 34 # The short X.Y version. 35 version = ".".join(hy_version.split(".")[:-1]) 36 # The full version, including alpha/beta/rc tags. 37 release = hy_version 38 hy_descriptive_version = html.escape(hy_version) 39 if "+" in hy_version: 40 hy_descriptive_version += " <strong style='color: red;'>(unstable)</strong>" 41 42 exclude_patterns = ['_build', 'coreteam.rst'] 43 add_module_names = False 44 45 pygments_style = 'sphinx' 46 47 import sphinx_rtd_theme 48 html_theme = 'sphinx_rtd_theme' 49 html_theme_path = [sphinx_rtd_theme.get_html_theme_path()] 50 51 # Add any paths that contain custom static files (such as style sheets) here, 52 # relative to this directory. They are copied after the builtin static files, 53 # so a file named "default.css" will overwrite the builtin "default.css". 
54 html_static_path = ['_static'] 55 56 html_use_smartypants = False 57 html_show_sphinx = False 58 59 html_context = dict( 60 hy_descriptive_version = hy_descriptive_version) 61 62 highlight_language = 'clojure' 63 64 intersphinx_mapping = dict( 65 py = ('https://docs.python.org/3/', None)) 66 # ** Generate Cheatsheet 67 import json 68 from pathlib import Path 69 from itertools import zip_longest 70 71 def refize(spec): 72 role = ':hy:func:' 73 if isinstance(spec, dict): 74 _name = spec['name'] 75 uri = spec['uri'] 76 if spec.get('internal'): 77 role = ':ref:' 78 else: 79 uri = spec 80 _name = str.split(uri, '.')[-1] 81 return '{}`{} <{}>`'.format(role, _name, uri) 82 83 84 def format_refs(refs, indent): 85 args = [iter(map(refize, refs))] 86 ref_groups = zip_longest(*args, fillvalue="") 87 return str.join( 88 ' \\\n' + ' ' * (indent + 3), 89 [str.join(' ', ref_group) for ref_group in ref_groups], 90 ) 91 92 93 def format_row(category, divider_loc): 94 return '{title: <{width}} | {methods}'.format( 95 width=divider_loc, 96 title=category['name'], 97 methods=format_refs(category['methods'], divider_loc) 98 ) 99 100 101 def format_table(table_spec): 102 table_name = table_spec['name'] 103 categories = table_spec['categories'] 104 longest_cat_name = max([len(category['name']) for category in categories]) 105 table = [ 106 table_name, 107 '-' * len(table_name), 108 '', 109 '=' * longest_cat_name + ' ' + '=' * 25, 110 *(format_row(category, longest_cat_name) for category in categories), 111 '=' * longest_cat_name + ' ' + '=' * 25, 112 '' 113 ] 114 return '\n'.join(table) 115 116 117 # Modifications to the cheatsheet should be added in `cheatsheet.json` 118 cheatsheet_spec = json.loads(Path('./docs/cheatsheet.json').read_text()) 119 cheatsheet = [ 120 '..', 121 ' DO NOT MODIFY THIS FILE. IT IS AUTO GENERATED BY ``conf.py``', 122 ' If you need to change or add methods, modify ``cheatsheet_spec`` in ``conf.py``', 123 '', 124 '.. _cheatsheet:', 125 '', 126 'Cheatsheet', 127 '==========', 128 '', 129 *map(format_table, cheatsheet_spec), 130 ] 131 Path('./docs/cheatsheet.rst').write_text('\n'.join(cheatsheet)) 132 133 134 # ** Sphinx App Setup 135 136 137 def setup(app): 138 app.add_css_file('overrides.css') 139 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/docs/conf.py b/docs/conf.py --- a/docs/conf.py +++ b/docs/conf.py @@ -40,7 +40,7 @@ hy_descriptive_version += " <strong style='color: red;'>(unstable)</strong>" exclude_patterns = ['_build', 'coreteam.rst'] -add_module_names = False +add_module_names = True pygments_style = 'sphinx'
{"golden_diff": "diff --git a/docs/conf.py b/docs/conf.py\n--- a/docs/conf.py\n+++ b/docs/conf.py\n@@ -40,7 +40,7 @@\n hy_descriptive_version += \" <strong style='color: red;'>(unstable)</strong>\"\n \n exclude_patterns = ['_build', 'coreteam.rst']\n-add_module_names = False\n+add_module_names = True\n \n pygments_style = 'sphinx'\n", "issue": "The manual is missing module names\nIt looks like the big doc reorganization ended up omitting the actual module names of the various contrib and extra modules; for example, the section named \"Walk\" documents the functions and macros in `hy.contrib.walk`, but doesn't mention the name `hy.contrib.walk` or otherwise indicate how to bring the enumerated names into scope.\r\n\r\nPointed out in https://github.com/hylang/hy/issues/2065#issuecomment-842377526.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# This file is execfile()d with the current directory set to its containing dir.\n\nimport re, os, sys, time, html\n\nsys.path.insert(0, os.path.abspath('..'))\n\nextensions = [\n    'sphinx.ext.napoleon',\n    'sphinx.ext.intersphinx',\n    'sphinx.ext.autodoc',\n    'sphinxcontrib.hydomain',\n]\n\nfrom get_version import __version__ as hy_version\n\n# Read the Docs might dirty its checkout, so strip the dirty flag.\nhy_version = re.sub(r'[+.]dirty\\Z', '', hy_version)\n\ntemplates_path = ['_templates']\nsource_suffix = '.rst'\n\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'hy'\ncopyright = '%s the authors' % time.strftime('%Y')\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nversion = \".\".join(hy_version.split(\".\")[:-1])\n# The full version, including alpha/beta/rc tags.\nrelease = hy_version\nhy_descriptive_version = html.escape(hy_version)\nif \"+\" in hy_version:\n    hy_descriptive_version += \" <strong style='color: red;'>(unstable)</strong>\"\n\nexclude_patterns = ['_build', 'coreteam.rst']\nadd_module_names = False\n\npygments_style = 'sphinx'\n\nimport sphinx_rtd_theme\nhtml_theme = 'sphinx_rtd_theme'\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_use_smartypants = False\nhtml_show_sphinx = False\n\nhtml_context = dict(\n    hy_descriptive_version = hy_descriptive_version)\n\nhighlight_language = 'clojure'\n\nintersphinx_mapping = dict(\n    py = ('https://docs.python.org/3/', None))\n# ** Generate Cheatsheet\nimport json\nfrom pathlib import Path\nfrom itertools import zip_longest\n\ndef refize(spec):\n    role = ':hy:func:'\n    if isinstance(spec, dict):\n        _name = spec['name']\n        uri = spec['uri']\n        if spec.get('internal'):\n            role = ':ref:'\n    else:\n        uri = spec\n        _name = str.split(uri, '.')[-1]\n    return '{}`{} <{}>`'.format(role, _name, uri)\n\n\ndef format_refs(refs, indent):\n    args = [iter(map(refize, refs))]\n    ref_groups = zip_longest(*args, fillvalue=\"\")\n    return str.join(\n        ' \\\\\\n' + ' ' * (indent + 3),\n        [str.join(' ', ref_group) for ref_group in ref_groups],\n    )\n\n\ndef format_row(category, divider_loc):\n    return '{title: <{width}} | {methods}'.format(\n        width=divider_loc,\n        title=category['name'],\n        methods=format_refs(category['methods'], divider_loc)\n    )\n\n\ndef format_table(table_spec):\n    table_name = table_spec['name']\n    categories = table_spec['categories']\n    longest_cat_name = max([len(category['name']) for category in categories])\n    table = [\n        table_name,\n        '-' * len(table_name),\n        '',\n        '=' * longest_cat_name + ' ' + '=' * 25,\n        *(format_row(category, longest_cat_name) for category in categories),\n        '=' * longest_cat_name + ' ' + '=' * 25,\n        ''\n    ]\n    return '\\n'.join(table)\n\n\n# Modifications to the cheatsheet should be added in `cheatsheet.json`\ncheatsheet_spec = json.loads(Path('./docs/cheatsheet.json').read_text())\ncheatsheet = [\n    '..',\n    '    DO NOT MODIFY THIS FILE. IT IS AUTO GENERATED BY ``conf.py``',\n    '    If you need to change or add methods, modify ``cheatsheet_spec`` in ``conf.py``',\n    '',\n    '.. _cheatsheet:',\n    '',\n    'Cheatsheet',\n    '==========',\n    '',\n    *map(format_table, cheatsheet_spec),\n]\nPath('./docs/cheatsheet.rst').write_text('\\n'.join(cheatsheet))\n\n\n# ** Sphinx App Setup\n\n\ndef setup(app):\n    app.add_css_file('overrides.css')\n", "path": "docs/conf.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# This file is execfile()d with the current directory set to its containing dir.\n\nimport re, os, sys, time, html\n\nsys.path.insert(0, os.path.abspath('..'))\n\nextensions = [\n    'sphinx.ext.napoleon',\n    'sphinx.ext.intersphinx',\n    'sphinx.ext.autodoc',\n    'sphinxcontrib.hydomain',\n]\n\nfrom get_version import __version__ as hy_version\n\n# Read the Docs might dirty its checkout, so strip the dirty flag.\nhy_version = re.sub(r'[+.]dirty\\Z', '', hy_version)\n\ntemplates_path = ['_templates']\nsource_suffix = '.rst'\n\nmaster_doc = 'index'\n\n# General information about the project.\nproject = 'hy'\ncopyright = '%s the authors' % time.strftime('%Y')\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nversion = \".\".join(hy_version.split(\".\")[:-1])\n# The full version, including alpha/beta/rc tags.\nrelease = hy_version\nhy_descriptive_version = html.escape(hy_version)\nif \"+\" in hy_version:\n    hy_descriptive_version += \" <strong style='color: red;'>(unstable)</strong>\"\n\nexclude_patterns = ['_build', 'coreteam.rst']\nadd_module_names = True\n\npygments_style = 'sphinx'\n\nimport sphinx_rtd_theme\nhtml_theme = 'sphinx_rtd_theme'\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_use_smartypants = False\nhtml_show_sphinx = False\n\nhtml_context = dict(\n    hy_descriptive_version = hy_descriptive_version)\n\nhighlight_language = 'clojure'\n\nintersphinx_mapping = dict(\n    py = ('https://docs.python.org/3/', None))\n# ** Generate Cheatsheet\nimport json\nfrom pathlib import Path\nfrom itertools import zip_longest\n\ndef refize(spec):\n    role = ':hy:func:'\n    if isinstance(spec, dict):\n        _name = spec['name']\n        uri = spec['uri']\n        if spec.get('internal'):\n            role = ':ref:'\n    else:\n        uri = spec\n        _name = str.split(uri, '.')[-1]\n    return '{}`{} <{}>`'.format(role, _name, uri)\n\n\ndef format_refs(refs, indent):\n    args = [iter(map(refize, refs))]\n    ref_groups = zip_longest(*args, fillvalue=\"\")\n    return str.join(\n        ' \\\\\\n' + ' ' * (indent + 3),\n        [str.join(' ', ref_group) for ref_group in ref_groups],\n    )\n\n\ndef format_row(category, divider_loc):\n    return '{title: <{width}} | {methods}'.format(\n        width=divider_loc,\n        title=category['name'],\n        methods=format_refs(category['methods'], divider_loc)\n    )\n\n\ndef format_table(table_spec):\n    table_name = table_spec['name']\n    categories = table_spec['categories']\n    longest_cat_name = max([len(category['name']) for category in categories])\n    table = [\n        table_name,\n        '-' * len(table_name),\n        '',\n        '=' * longest_cat_name + ' ' + '=' * 25,\n        *(format_row(category, longest_cat_name) for category in categories),\n        '=' * longest_cat_name + ' ' + '=' * 25,\n        ''\n    ]\n    return '\\n'.join(table)\n\n\n# Modifications to the cheatsheet should be added in `cheatsheet.json`\ncheatsheet_spec = json.loads(Path('./docs/cheatsheet.json').read_text())\ncheatsheet = [\n    '..',\n    '    DO NOT MODIFY THIS FILE. IT IS AUTO GENERATED BY ``conf.py``',\n    '    If you need to change or add methods, modify ``cheatsheet_spec`` in ``conf.py``',\n    '',\n    '.. _cheatsheet:',\n    '',\n    'Cheatsheet',\n    '==========',\n    '',\n    *map(format_table, cheatsheet_spec),\n]\nPath('./docs/cheatsheet.rst').write_text('\\n'.join(cheatsheet))\n\n\n# ** Sphinx App Setup\n\n\ndef setup(app):\n    app.add_css_file('overrides.css')\n", "path": "docs/conf.py"}]}
1,669
92
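Editor's note: the fix in the record above is a one-line Sphinx config flip (`add_module_names = True`), but the surrounding `conf.py` also contains a small self-contained cheatsheet generator. As an aside, a simplified version of its `refize` helper runs standalone; this sketch shows the RST role references it emits. The sample inputs are made up for illustration.

```python
def refize(spec):
    # Simplified from the conf.py above: turn a dotted URI, or a
    # {"name", "uri", "internal"} dict, into a Sphinx cross-reference.
    role = ":hy:func:"
    if isinstance(spec, dict):
        _name = spec["name"]
        uri = spec["uri"]
        if spec.get("internal"):
            role = ":ref:"
    else:
        uri = spec
        _name = uri.split(".")[-1]
    return "{}`{} <{}>`".format(role, _name, uri)


print(refize("hy.contrib.walk.postwalk"))
# :hy:func:`postwalk <hy.contrib.walk.postwalk>`
print(refize({"name": "Cheatsheet", "uri": "cheatsheet", "internal": True}))
# :ref:`Cheatsheet <cheatsheet>`
```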
gh_patches_debug_29513
rasdani/github-patches
git_diff
TheAlgorithms__Python-10140
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Concatenate/consolidate all algorithms with different implementations ### Feature description There are lots of algorithms with the same concept but different implementations/methods in different files. All these should be moved into one file --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `strings/reverse_letters.py` Content: ``` 1 def reverse_letters(input_str: str) -> str: 2 """ 3 Reverses letters in a given string without adjusting the position of the words 4 >>> reverse_letters('The cat in the hat') 5 'ehT tac ni eht tah' 6 >>> reverse_letters('The quick brown fox jumped over the lazy dog.') 7 'ehT kciuq nworb xof depmuj revo eht yzal .god' 8 >>> reverse_letters('Is this true?') 9 'sI siht ?eurt' 10 >>> reverse_letters("I love Python") 11 'I evol nohtyP' 12 """ 13 return " ".join([word[::-1] for word in input_str.split()]) 14 15 16 if __name__ == "__main__": 17 import doctest 18 19 doctest.testmod() 20 ``` Path: `strings/reverse_long_words.py` Content: ``` 1 def reverse_long_words(sentence: str) -> str: 2 """ 3 Reverse all words that are longer than 4 characters in a sentence. 4 5 >>> reverse_long_words("Hey wollef sroirraw") 6 'Hey fellow warriors' 7 >>> reverse_long_words("nohtyP is nohtyP") 8 'Python is Python' 9 >>> reverse_long_words("1 12 123 1234 54321 654321") 10 '1 12 123 1234 12345 123456' 11 """ 12 return " ".join( 13 "".join(word[::-1]) if len(word) > 4 else word for word in sentence.split() 14 ) 15 16 17 if __name__ == "__main__": 18 import doctest 19 20 doctest.testmod() 21 print(reverse_long_words("Hey wollef sroirraw")) 22 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/strings/reverse_letters.py b/strings/reverse_letters.py --- a/strings/reverse_letters.py +++ b/strings/reverse_letters.py @@ -1,19 +1,24 @@ -def reverse_letters(input_str: str) -> str: +def reverse_letters(sentence: str, length: int = 0) -> str: """ - Reverses letters in a given string without adjusting the position of the words - >>> reverse_letters('The cat in the hat') - 'ehT tac ni eht tah' - >>> reverse_letters('The quick brown fox jumped over the lazy dog.') - 'ehT kciuq nworb xof depmuj revo eht yzal .god' - >>> reverse_letters('Is this true?') - 'sI siht ?eurt' - >>> reverse_letters("I love Python") - 'I evol nohtyP' + Reverse all words that are longer than the given length of characters in a sentence. + If unspecified, length is taken as 0 + + >>> reverse_letters("Hey wollef sroirraw", 3) + 'Hey fellow warriors' + >>> reverse_letters("nohtyP is nohtyP", 2) + 'Python is Python' + >>> reverse_letters("1 12 123 1234 54321 654321", 0) + '1 21 321 4321 12345 123456' + >>> reverse_letters("racecar") + 'racecar' """ - return " ".join([word[::-1] for word in input_str.split()]) + return " ".join( + "".join(word[::-1]) if len(word) > length else word for word in sentence.split() + ) if __name__ == "__main__": import doctest doctest.testmod() + print(reverse_letters("Hey wollef sroirraw")) diff --git a/strings/reverse_long_words.py b/strings/reverse_long_words.py deleted file mode 100644 --- a/strings/reverse_long_words.py +++ /dev/null @@ -1,21 +0,0 @@ -def reverse_long_words(sentence: str) -> str: - """ - Reverse all words that are longer than 4 characters in a sentence. - - >>> reverse_long_words("Hey wollef sroirraw") - 'Hey fellow warriors' - >>> reverse_long_words("nohtyP is nohtyP") - 'Python is Python' - >>> reverse_long_words("1 12 123 1234 54321 654321") - '1 12 123 1234 12345 123456' - """ - return " ".join( - "".join(word[::-1]) if len(word) > 4 else word for word in sentence.split() - ) - - -if __name__ == "__main__": - import doctest - - doctest.testmod() - print(reverse_long_words("Hey wollef sroirraw"))
{"golden_diff": "diff --git a/strings/reverse_letters.py b/strings/reverse_letters.py\n--- a/strings/reverse_letters.py\n+++ b/strings/reverse_letters.py\n@@ -1,19 +1,24 @@\n-def reverse_letters(input_str: str) -> str:\n+def reverse_letters(sentence: str, length: int = 0) -> str:\n     \"\"\"\n-    Reverses letters in a given string without adjusting the position of the words\n-    >>> reverse_letters('The cat in the hat')\n-    'ehT tac ni eht tah'\n-    >>> reverse_letters('The quick brown fox jumped over the lazy dog.')\n-    'ehT kciuq nworb xof depmuj revo eht yzal .god'\n-    >>> reverse_letters('Is this true?')\n-    'sI siht ?eurt'\n-    >>> reverse_letters(\"I love Python\")\n-    'I evol nohtyP'\n+    Reverse all words that are longer than the given length of characters in a sentence.\n+    If unspecified, length is taken as 0\n+\n+    >>> reverse_letters(\"Hey wollef sroirraw\", 3)\n+    'Hey fellow warriors'\n+    >>> reverse_letters(\"nohtyP is nohtyP\", 2)\n+    'Python is Python'\n+    >>> reverse_letters(\"1 12 123 1234 54321 654321\", 0)\n+    '1 21 321 4321 12345 123456'\n+    >>> reverse_letters(\"racecar\")\n+    'racecar'\n     \"\"\"\n-    return \" \".join([word[::-1] for word in input_str.split()])\n+    return \" \".join(\n+        \"\".join(word[::-1]) if len(word) > length else word for word in sentence.split()\n+    )\n \n \n if __name__ == \"__main__\":\n     import doctest\n \n     doctest.testmod()\n+    print(reverse_letters(\"Hey wollef sroirraw\"))\ndiff --git a/strings/reverse_long_words.py b/strings/reverse_long_words.py\ndeleted file mode 100644\n--- a/strings/reverse_long_words.py\n+++ /dev/null\n@@ -1,21 +0,0 @@\n-def reverse_long_words(sentence: str) -> str:\n-    \"\"\"\n-    Reverse all words that are longer than 4 characters in a sentence.\n-\n-    >>> reverse_long_words(\"Hey wollef sroirraw\")\n-    'Hey fellow warriors'\n-    >>> reverse_long_words(\"nohtyP is nohtyP\")\n-    'Python is Python'\n-    >>> reverse_long_words(\"1 12 123 1234 54321 654321\")\n-    '1 12 123 1234 12345 123456'\n-    \"\"\"\n-    return \" \".join(\n-        \"\".join(word[::-1]) if len(word) > 4 else word for word in sentence.split()\n-    )\n-\n-\n-if __name__ == \"__main__\":\n-    import doctest\n-\n-    doctest.testmod()\n-    print(reverse_long_words(\"Hey wollef sroirraw\"))\n", "issue": "Concatenate/consolidate all algorithms with different implementations\n### Feature description\n\nThere are lots of algorithms with the same concept but different implementations/methods in different files. All these should be moved into one file\n", "before_files": [{"content": "def reverse_letters(input_str: str) -> str:\n    \"\"\"\n    Reverses letters in a given string without adjusting the position of the words\n    >>> reverse_letters('The cat in the hat')\n    'ehT tac ni eht tah'\n    >>> reverse_letters('The quick brown fox jumped over the lazy dog.')\n    'ehT kciuq nworb xof depmuj revo eht yzal .god'\n    >>> reverse_letters('Is this true?')\n    'sI siht ?eurt'\n    >>> reverse_letters(\"I love Python\")\n    'I evol nohtyP'\n    \"\"\"\n    return \" \".join([word[::-1] for word in input_str.split()])\n\n\nif __name__ == \"__main__\":\n    import doctest\n\n    doctest.testmod()\n", "path": "strings/reverse_letters.py"}, {"content": "def reverse_long_words(sentence: str) -> str:\n    \"\"\"\n    Reverse all words that are longer than 4 characters in a sentence.\n\n    >>> reverse_long_words(\"Hey wollef sroirraw\")\n    'Hey fellow warriors'\n    >>> reverse_long_words(\"nohtyP is nohtyP\")\n    'Python is Python'\n    >>> reverse_long_words(\"1 12 123 1234 54321 654321\")\n    '1 12 123 1234 12345 123456'\n    \"\"\"\n    return \" \".join(\n        \"\".join(word[::-1]) if len(word) > 4 else word for word in sentence.split()\n    )\n\n\nif __name__ == \"__main__\":\n    import doctest\n\n    doctest.testmod()\n    print(reverse_long_words(\"Hey wollef sroirraw\"))\n", "path": "strings/reverse_long_words.py"}], "after_files": [{"content": "def reverse_letters(sentence: str, length: int = 0) -> str:\n    \"\"\"\n    Reverse all words that are longer than the given length of characters in a sentence.\n    If unspecified, length is taken as 0\n\n    >>> reverse_letters(\"Hey wollef sroirraw\", 3)\n    'Hey fellow warriors'\n    >>> reverse_letters(\"nohtyP is nohtyP\", 2)\n    'Python is Python'\n    >>> reverse_letters(\"1 12 123 1234 54321 654321\", 0)\n    '1 21 321 4321 12345 123456'\n    >>> reverse_letters(\"racecar\")\n    'racecar'\n    \"\"\"\n    return \" \".join(\n        \"\".join(word[::-1]) if len(word) > length else word for word in sentence.split()\n    )\n\n\nif __name__ == \"__main__\":\n    import doctest\n\n    doctest.testmod()\n    print(reverse_letters(\"Hey wollef sroirraw\"))\n", "path": "strings/reverse_letters.py"}, {"content": null, "path": "strings/reverse_long_words.py"}]}
761
732
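Editor's note: the consolidation in the record above generalizes both old functions behind one `length` parameter. A quick standalone check of that idea (slightly simplified: the patch's redundant `"".join(word[::-1])` is reduced to `word[::-1]`, which is equivalent for strings):

```python
def reverse_letters(sentence: str, length: int = 0) -> str:
    # Reverse every word longer than `length`; length=0 (the default)
    # reverses all words, reproducing the old reverse_letters behavior.
    return " ".join(
        word[::-1] if len(word) > length else word for word in sentence.split()
    )


print(reverse_letters("Hey wollef sroirraw", 3))  # 'Hey fellow warriors' (old reverse_long_words)
print(reverse_letters("The cat in the hat"))      # 'ehT tac ni eht tah' (old reverse_letters)
```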
gh_patches_debug_28763
rasdani/github-patches
git_diff
DataDog__dd-trace-py-2196
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DD_TAGS separator inconsistent with heroku-buildpack-datadog
https://github.com/DataDog/heroku-buildpack-datadog sets `DD_TAGS` separated by spaces.

However, dd-trace-py splits the tags by commas:

https://github.com/DataDog/dd-trace-py/blob/master/ddtrace/utils/formats.py#L87-L116
```
def parse_tags_str(tags_str):
    """Parse a string of tags typically provided via environment variables.
    The expected string is of the form::
        "key1:value1,key2:value2"
    :param tags_str: A string of the above form to parse tags from.
    :return: A dict containing the tags that were parsed.
    """
    parsed_tags = {}
    if not tags_str:
        return parsed_tags

    for tag in tags_str.split(","):
        try:
            key, value = tag.split(":", 1)

            # Validate the tag
            if key == "" or value == "" or value.endswith(":"):
                raise ValueError
        except ValueError:
            log.error(
                "Malformed tag in tag pair '%s' from tag string '%s'.",
                tag,
                tags_str,
            )
        else:
            parsed_tags[key] = value

    return parsed_tags
```

This results in all of the tags being set as the value of the first tag.

This looks to have been previously surfaced for the java tracer here:
https://github.com/DataDog/heroku-buildpack-datadog/issues/187

And was initially changed in the buildpack but was reverted and dd-trace-java updated to accommodate commas or spaces:
https://github.com/DataDog/dd-trace-java/pull/2011

### Which version of dd-trace-py are you using?
0.47.0

### How can we reproduce your problem?
Perform a trace with `DD_TAGS` set by https://github.com/DataDog/heroku-buildpack-datadog.

### What is the result that you get?
`dyno` tag contains all other tags.

### What is the result that you expected?
Tags to be properly separated.

--- END ISSUE ---

Below are some code segments, each from a relevant file. One or more of these files may contain bugs.

--- BEGIN FILES ---
Path: `ddtrace/utils/formats.py`
Content:
```
1 import logging
2 import os
3 
4 from .deprecation import deprecation
5 
6 
7 log = logging.getLogger(__name__)
8 
9 
10 def get_env(*parts, **kwargs):
11     """Retrieves environment variables value for the given integration. It must be used
12     for consistency between integrations. The implementation is backward compatible
13     with legacy nomenclature:
14 
15     * `DATADOG_` is a legacy prefix with lower priority
16     * `DD_` environment variables have the highest priority
17     * the environment variable is built concatenating `integration` and `variable`
18       arguments
19     * return `default` otherwise
20 
21     :param parts: evironment variable parts that will be joined with ``_`` to generate the name
22     :type parts: :obj:`str`
23     :param kwargs: ``default`` is the only supported keyword argument which sets the default value
24         if no environment variable is found
25     :rtype: :obj:`str` | ``kwargs["default"]``
26     :returns: The string environment variable value or the value of ``kwargs["default"]`` if not found
27     """
28     default = kwargs.get("default")
29 
30     key = "_".join(parts)
31     key = key.upper()
32     legacy_env = "DATADOG_{}".format(key)
33     env = "DD_{}".format(key)
34 
35     value = os.getenv(env)
36     legacy = os.getenv(legacy_env)
37     if legacy:
38         # Deprecation: `DATADOG_` variables are deprecated
39         deprecation(
40             name="DATADOG_",
41             message="Use `DD_` prefix instead",
42             version="1.0.0",
43         )
44 
45     value = value or legacy
46     return value if value else default
47 
48 
49 def deep_getattr(obj, attr_string, default=None):
50     """
51     Returns the attribute of `obj` at the dotted path given by `attr_string`
52     If no such attribute is reachable, returns `default`
53 
54     >>> deep_getattr(cass, 'cluster')
55     <cassandra.cluster.Cluster object at 0xa20c350
56 
57     >>> deep_getattr(cass, 'cluster.metadata.partitioner')
58     u'org.apache.cassandra.dht.Murmur3Partitioner'
59 
60     >>> deep_getattr(cass, 'i.dont.exist', default='default')
61     'default'
62     """
63     attrs = attr_string.split(".")
64     for attr in attrs:
65         try:
66             obj = getattr(obj, attr)
67         except AttributeError:
68             return default
69 
70     return obj
71 
72 
73 def asbool(value):
74     """Convert the given String to a boolean object.
75 
76     Accepted values are `True` and `1`.
77     """
78     if value is None:
79         return False
80 
81     if isinstance(value, bool):
82         return value
83 
84     return value.lower() in ("true", "1")
85 
86 
87 def parse_tags_str(tags_str):
88     """Parse a string of tags typically provided via environment variables.
89 
90     The expected string is of the form::
91         "key1:value1,key2:value2"
92 
93     :param tags_str: A string of the above form to parse tags from.
94     :return: A dict containing the tags that were parsed.
95     """
96     parsed_tags = {}
97     if not tags_str:
98         return parsed_tags
99 
100     for tag in tags_str.split(","):
101         try:
102             key, value = tag.split(":", 1)
103 
104             # Validate the tag
105             if key == "" or value == "" or value.endswith(":"):
106                 raise ValueError
107         except ValueError:
108             log.error(
109                 "Malformed tag in tag pair '%s' from tag string '%s'.",
110                 tag,
111                 tags_str,
112             )
113         else:
114             parsed_tags[key] = value
115 
116     return parsed_tags
117 
```
--- END FILES ---

Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example:

```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
 if __name__ == "__main__":
-    asyncio.run(run_async_server("."), debug=True)
+    asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
 if __name__ == "__main__":
-    server = run_sync_server(".")
+    server = run_sync_server()
     server.shutdown()
```
diff --git a/ddtrace/utils/formats.py b/ddtrace/utils/formats.py --- a/ddtrace/utils/formats.py +++ b/ddtrace/utils/formats.py @@ -1,9 +1,13 @@ import logging import os +import re from .deprecation import deprecation +# Tags `key:value` must be separated by either comma or space +_TAGS_NOT_SEPARATED = re.compile(r":[^,\s]+:") + log = logging.getLogger(__name__) @@ -89,6 +93,7 @@ The expected string is of the form:: "key1:value1,key2:value2" + "key1:value1 key2:value2" :param tags_str: A string of the above form to parse tags from. :return: A dict containing the tags that were parsed. @@ -97,7 +102,29 @@ if not tags_str: return parsed_tags - for tag in tags_str.split(","): + if _TAGS_NOT_SEPARATED.search(tags_str): + log.error("Malformed tag string with tags not separated by comma or space '%s'.", tags_str) + return parsed_tags + + # Identify separator based on which successfully identifies the correct + # number of valid tags + numtagseps = tags_str.count(":") + for sep in [",", " "]: + if sum(":" in _ for _ in tags_str.split(sep)) == numtagseps: + break + else: + log.error( + ( + "Failed to find separator for tag string: '%s'.\n" + "Tag strings must be comma or space separated:\n" + " key1:value1,key2:value2\n" + " key1:value1 key2:value2" + ), + tags_str, + ) + return parsed_tags + + for tag in tags_str.split(sep): try: key, value = tag.split(":", 1)
{"golden_diff": "diff --git a/ddtrace/utils/formats.py b/ddtrace/utils/formats.py\n--- a/ddtrace/utils/formats.py\n+++ b/ddtrace/utils/formats.py\n@@ -1,9 +1,13 @@\n import logging\n import os\n+import re\n \n from .deprecation import deprecation\n \n \n+# Tags `key:value` must be separated by either comma or space\n+_TAGS_NOT_SEPARATED = re.compile(r\":[^,\\s]+:\")\n+\n log = logging.getLogger(__name__)  # noqa\n \n \n@@ -89,6 +93,7 @@\n \n     The expected string is of the form::\n         \"key1:value1,key2:value2\"\n+        \"key1:value1 key2:value2\"\n \n     :param tags_str: A string of the above form to parse tags from.\n     :return: A dict containing the tags that were parsed.\n@@ -97,7 +102,29 @@\n     if not tags_str:\n         return parsed_tags\n \n-    for tag in tags_str.split(\",\"):\n+    if _TAGS_NOT_SEPARATED.search(tags_str):\n+        log.error(\"Malformed tag string with tags not separated by comma or space '%s'.\", tags_str)\n+        return parsed_tags\n+\n+    # Identify separator based on which successfully identifies the correct\n+    # number of valid tags\n+    numtagseps = tags_str.count(\":\")\n+    for sep in [\",\", \" \"]:\n+        if sum(\":\" in _ for _ in tags_str.split(sep)) == numtagseps:\n+            break\n+    else:\n+        log.error(\n+            (\n+                \"Failed to find separator for tag string: '%s'.\\n\"\n+                \"Tag strings must be comma or space separated:\\n\"\n+                \"  key1:value1,key2:value2\\n\"\n+                \"  key1:value1 key2:value2\"\n+            ),\n+            tags_str,\n+        )\n+        return parsed_tags\n+\n+    for tag in tags_str.split(sep):\n         try:\n             key, value = tag.split(\":\", 1)\n", "issue": "DD_TAGS separator inconsistent with heroku-buildpack-datadog\nhttps://github.com/DataDog/heroku-buildpack-datadog sets `DD_TAGS` separated by spaces.\r\n\r\nHowever, dd-trace-py splits the tags by commas:\r\n\r\nhttps://github.com/DataDog/dd-trace-py/blob/master/ddtrace/utils/formats.py#L87-L116\r\n```\r\ndef parse_tags_str(tags_str):\r\n    \"\"\"Parse a string of tags typically provided via environment variables.\r\n    The expected string is of the form::\r\n        \"key1:value1,key2:value2\"\r\n    :param tags_str: A string of the above form to parse tags from.\r\n    :return: A dict containing the tags that were parsed.\r\n    \"\"\"\r\n    parsed_tags = {}\r\n    if not tags_str:\r\n        return parsed_tags\r\n\r\n    for tag in tags_str.split(\",\"):\r\n        try:\r\n            key, value = tag.split(\":\", 1)\r\n\r\n            # Validate the tag\r\n            if key == \"\" or value == \"\" or value.endswith(\":\"):\r\n                raise ValueError\r\n        except ValueError:\r\n            log.error(\r\n                \"Malformed tag in tag pair '%s' from tag string '%s'.\",\r\n                tag,\r\n                tags_str,\r\n            )\r\n        else:\r\n            parsed_tags[key] = value\r\n\r\n    return parsed_tags\r\n```\r\n\r\nThis results in all of the tags being set as the value of the first tag.\r\n\r\nThis looks to have been previously surfaced for the java tracer here:\r\nhttps://github.com/DataDog/heroku-buildpack-datadog/issues/187\r\n\r\nAnd was initially changed in the buildpack but was reverted and dd-trace-java updated to accommodate commas or spaces:\r\nhttps://github.com/DataDog/dd-trace-java/pull/2011\r\n\r\n### Which version of dd-trace-py are you using?\r\n0.47.0\r\n\r\n### How can we reproduce your problem?\r\nPerform a trace with `DD_TAGS` set by https://github.com/DataDog/heroku-buildpack-datadog.\r\n\r\n### What is the result that you get?\r\n`dyno` tag contains all other tags.\r\n\r\n### What is the result that you expected?\r\nTags to be properly separated.\r\n\n", "before_files": [{"content": "import logging\nimport os\n\nfrom .deprecation import deprecation\n\n\nlog = logging.getLogger(__name__)  # noqa\n\n\ndef get_env(*parts, **kwargs):\n    \"\"\"Retrieves environment variables value for the given integration. It must be used\n    for consistency between integrations. The implementation is backward compatible\n    with legacy nomenclature:\n\n    * `DATADOG_` is a legacy prefix with lower priority\n    * `DD_` environment variables have the highest priority\n    * the environment variable is built concatenating `integration` and `variable`\n      arguments\n    * return `default` otherwise\n\n    :param parts: evironment variable parts that will be joined with ``_`` to generate the name\n    :type parts: :obj:`str`\n    :param kwargs: ``default`` is the only supported keyword argument which sets the default value\n        if no environment variable is found\n    :rtype: :obj:`str` | ``kwargs[\"default\"]``\n    :returns: The string environment variable value or the value of ``kwargs[\"default\"]`` if not found\n    \"\"\"\n    default = kwargs.get(\"default\")\n\n    key = \"_\".join(parts)\n    key = key.upper()\n    legacy_env = \"DATADOG_{}\".format(key)\n    env = \"DD_{}\".format(key)\n\n    value = os.getenv(env)\n    legacy = os.getenv(legacy_env)\n    if legacy:\n        # Deprecation: `DATADOG_` variables are deprecated\n        deprecation(\n            name=\"DATADOG_\",\n            message=\"Use `DD_` prefix instead\",\n            version=\"1.0.0\",\n        )\n\n    value = value or legacy\n    return value if value else default\n\n\ndef deep_getattr(obj, attr_string, default=None):\n    \"\"\"\n    Returns the attribute of `obj` at the dotted path given by `attr_string`\n    If no such attribute is reachable, returns `default`\n\n    >>> deep_getattr(cass, 'cluster')\n    <cassandra.cluster.Cluster object at 0xa20c350\n\n    >>> deep_getattr(cass, 'cluster.metadata.partitioner')\n    u'org.apache.cassandra.dht.Murmur3Partitioner'\n\n    >>> deep_getattr(cass, 'i.dont.exist', default='default')\n    'default'\n    \"\"\"\n    attrs = attr_string.split(\".\")\n    for attr in attrs:\n        try:\n            obj = getattr(obj, attr)\n        except AttributeError:\n            return default\n\n    return obj\n\n\ndef asbool(value):\n    \"\"\"Convert the given String to a boolean object.\n\n    Accepted values are `True` and `1`.\n    \"\"\"\n    if value is None:\n        return False\n\n    if isinstance(value, bool):\n        return value\n\n    return value.lower() in (\"true\", \"1\")\n\n\ndef parse_tags_str(tags_str):\n    \"\"\"Parse a string of tags typically provided via environment variables.\n\n    The expected string is of the form::\n        \"key1:value1,key2:value2\"\n\n    :param tags_str: A string of the above form to parse tags from.\n    :return: A dict containing the tags that were parsed.\n    \"\"\"\n    parsed_tags = {}\n    if not tags_str:\n        return parsed_tags\n\n    for tag in tags_str.split(\",\"):\n        try:\n            key, value = tag.split(\":\", 1)\n\n            # Validate the tag\n            if key == \"\" or value == \"\" or value.endswith(\":\"):\n                raise ValueError\n        except ValueError:\n            log.error(\n                \"Malformed tag in tag pair '%s' from tag string '%s'.\",\n                tag,\n                tags_str,\n            )\n        else:\n            parsed_tags[key] = value\n\n    return parsed_tags\n", "path": "ddtrace/utils/formats.py"}], "after_files": [{"content": "import logging\nimport os\nimport re\n\nfrom .deprecation import deprecation\n\n\n# Tags `key:value` must be separated by either comma or space\n_TAGS_NOT_SEPARATED = re.compile(r\":[^,\\s]+:\")\n\nlog = logging.getLogger(__name__)  # noqa\n\n\ndef get_env(*parts, **kwargs):\n    \"\"\"Retrieves environment variables value for the given integration. It must be used\n    for consistency between integrations. The implementation is backward compatible\n    with legacy nomenclature:\n\n    * `DATADOG_` is a legacy prefix with lower priority\n    * `DD_` environment variables have the highest priority\n    * the environment variable is built concatenating `integration` and `variable`\n      arguments\n    * return `default` otherwise\n\n    :param parts: evironment variable parts that will be joined with ``_`` to generate the name\n    :type parts: :obj:`str`\n    :param kwargs: ``default`` is the only supported keyword argument which sets the default value\n        if no environment variable is found\n    :rtype: :obj:`str` | ``kwargs[\"default\"]``\n    :returns: The string environment variable value or the value of ``kwargs[\"default\"]`` if not found\n    \"\"\"\n    default = kwargs.get(\"default\")\n\n    key = \"_\".join(parts)\n    key = key.upper()\n    legacy_env = \"DATADOG_{}\".format(key)\n    env = \"DD_{}\".format(key)\n\n    value = os.getenv(env)\n    legacy = os.getenv(legacy_env)\n    if legacy:\n        # Deprecation: `DATADOG_` variables are deprecated\n        deprecation(\n            name=\"DATADOG_\",\n            message=\"Use `DD_` prefix instead\",\n            version=\"1.0.0\",\n        )\n\n    value = value or legacy\n    return value if value else default\n\n\ndef deep_getattr(obj, attr_string, default=None):\n    \"\"\"\n    Returns the attribute of `obj` at the dotted path given by `attr_string`\n    If no such attribute is reachable, returns `default`\n\n    >>> deep_getattr(cass, 'cluster')\n    <cassandra.cluster.Cluster object at 0xa20c350\n\n    >>> deep_getattr(cass, 'cluster.metadata.partitioner')\n    u'org.apache.cassandra.dht.Murmur3Partitioner'\n\n    >>> deep_getattr(cass, 'i.dont.exist', default='default')\n    'default'\n    \"\"\"\n    attrs = attr_string.split(\".\")\n    for attr in attrs:\n        try:\n            obj = getattr(obj, attr)\n        except AttributeError:\n            return default\n\n    return obj\n\n\ndef asbool(value):\n    \"\"\"Convert the given String to a boolean object.\n\n    Accepted values are `True` and `1`.\n    \"\"\"\n    if value is None:\n        return False\n\n    if isinstance(value, bool):\n        return value\n\n    return value.lower() in (\"true\", \"1\")\n\n\ndef parse_tags_str(tags_str):\n    \"\"\"Parse a string of tags typically provided via environment variables.\n\n    The expected string is of the form::\n        \"key1:value1,key2:value2\"\n        \"key1:value1 key2:value2\"\n\n    :param tags_str: A string of the above form to parse tags from.\n    :return: A dict containing the tags that were parsed.\n    \"\"\"\n    parsed_tags = {}\n    if not tags_str:\n        return parsed_tags\n\n    if _TAGS_NOT_SEPARATED.search(tags_str):\n        log.error(\"Malformed tag string with tags not separated by comma or space '%s'.\", tags_str)\n        return parsed_tags\n\n    # Identify separator based on which successfully identifies the correct\n    # number of valid tags\n    numtagseps = tags_str.count(\":\")\n    for sep in [\",\", \" \"]:\n        if sum(\":\" in _ for _ in tags_str.split(sep)) == numtagseps:\n            break\n    else:\n        log.error(\n            (\n                \"Failed to find separator for tag string: '%s'.\\n\"\n                \"Tag strings must be comma or space separated:\\n\"\n                \"  key1:value1,key2:value2\\n\"\n                \"  key1:value1 key2:value2\"\n            ),\n            tags_str,\n        )\n        return parsed_tags\n\n    for tag in tags_str.split(sep):\n        try:\n            key, value = tag.split(\":\", 1)\n\n            # Validate the tag\n            if key == \"\" or value == \"\" or value.endswith(\":\"):\n                raise ValueError\n        except ValueError:\n            log.error(\n                \"Malformed tag in tag pair '%s' from tag string '%s'.\",\n                tag,\n                tags_str,\n            )\n        else:\n            parsed_tags[key] = value\n\n    return parsed_tags\n", "path": "ddtrace/utils/formats.py"}]}
1,745
448
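Editor's note: the core of the patch in the record above is separator auto-detection: count the colons, then pick whichever of comma or space splits the string into that many `key:value` chunks. A standalone sketch of that idea, simplified from the patched `parse_tags_str` (no logging and no malformed-tag regex guard):

```python
def detect_separator(tags_str):
    # Pick the separator (comma or space) that yields exactly one ":" per
    # chunk, i.e. one chunk per tag. Returns None when neither works.
    n_tags = tags_str.count(":")
    for sep in (",", " "):
        if sum(":" in chunk for chunk in tags_str.split(sep)) == n_tags:
            return sep
    return None


def parse_tags(tags_str):
    sep = detect_separator(tags_str)
    if sep is None:
        return {}
    return dict(tag.split(":", 1) for tag in tags_str.split(sep) if ":" in tag)


print(parse_tags("env:prod,service:web"))   # {'env': 'prod', 'service': 'web'}
print(parse_tags("env:prod service:web"))   # {'env': 'prod', 'service': 'web'}
```

Comma is tried first, so the comma-separated form documented by dd-trace-py keeps working; the space-separated form written by the Heroku buildpack now parses instead of collapsing into one tag.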
gh_patches_debug_40630
rasdani/github-patches
git_diff
readthedocs__readthedocs.org-6617
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- docs.opnfv.org is 'Not Found.' ## Details The entire documentation site for OPNFV is missing. The most recent builds have succeeded, and as far as I know DNS hasn't changed recently. * Read the Docs project URL: opnfvdocsdemo.readthedocs.io * Build URL (if applicable): https://readthedocs.org/projects/opnfvdocsdemo/builds/ * Read the Docs username (if applicable): ## Expected Result Going to https://docs.opnfv.org/ returns the documentation site. ## Actual Result ``` curl -i -L https://opnfvdocsdemo.readthedocs.io/ HTTP/1.1 302 Found Content-Type: text/html; charset=utf-8 Location: http://docs.opnfv.org/en/stable-hunter/ Server: nginx X-Frame-Options: DENY x-content-type-options: nosniff x-xss-protection: 1; mode=block X-Served: Django-Proxito X-Deity: web04 Strict-Transport-Security: max-age=31536000; includeSubDomains Date: Wed, 29 Jan 2020 23:13:29 GMT Content-Length: 0 HTTP/1.1 301 Moved Permanently Server: CloudFront Date: Wed, 29 Jan 2020 23:13:29 GMT Content-Type: text/html Content-Length: 183 Connection: keep-alive Location: https://docs.opnfv.org/en/stable-hunter/ X-Cache: Redirect from cloudfront Via: 1.1 5ab5dc09da67e3ea794ec8a82992cc89.cloudfront.net (CloudFront) X-Amz-Cf-Pop: HIO50-C1 X-Amz-Cf-Id: 0_rJ9aN8nFAFm6M9VPcWPWHa7B8QOaSW1_Y3Llttz31ZTaK03cTaYQ== HTTP/2 404 content-type: text/html; charset=utf-8 content-length: 10 server: nginx x-frame-options: DENY x-content-type-options: nosniff x-xss-protection: 1; mode=block x-served: Proxito-404 x-deity: web03 strict-transport-security: max-age=0 date: Wed, 29 Jan 2020 23:13:30 GMT x-cache: Miss from cloudfront via: 1.1 1b0911478686968732f973d6e5e31d11.cloudfront.net (CloudFront) x-amz-cf-pop: HIO50-C1 x-amz-cf-id: sRmKIeU3LyXtKb93316GUwkxqiChktuq227k3nhDcOPqU-78E7JFTA== Not Found. ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `readthedocs/proxito/middleware.py` Content: ``` 1 """ 2 Middleware for Proxito. 3 4 This is used to take the request and map the host to the proper project slug. 5 6 Additional processing is done to get the project from the URL in the ``views.py`` as well. 7 """ 8 import logging 9 10 from django.conf import settings 11 from django.shortcuts import render 12 from django.utils.deprecation import MiddlewareMixin 13 14 from readthedocs.projects.models import Domain 15 16 log = logging.getLogger(__name__) # noqa 17 18 19 def map_host_to_project_slug(request): 20 """ 21 Take the request and map the host to the proper project slug. 
22 23 We check, in order: 24 25 * The ``HTTP_X_RTD_SLUG`` host header for explicit Project mapping 26 - This sets ``request.rtdheader`` True 27 * The ``PUBLIC_DOMAIN`` where we can use the subdomain as the project name 28 - This sets ``request.subdomain`` True 29 * The hostname without port information, which maps to ``Domain`` objects 30 - This sets ``request.cname`` True 31 """ 32 33 host = request.get_host().lower().split(':')[0] 34 public_domain = settings.PUBLIC_DOMAIN.lower().split(':')[0] 35 host_parts = host.split('.') 36 public_domain_parts = public_domain.split('.') 37 38 project_slug = None 39 40 # Explicit Project slug being passed in 41 if 'HTTP_X_RTD_SLUG' in request.META: 42 project_slug = request.META['HTTP_X_RTD_SLUG'].lower() 43 request.rtdheader = True 44 log.info('Setting project based on X_RTD_SLUG header: %s' % project_slug) 45 46 elif public_domain in host or host == 'proxito': 47 # Serve from the PUBLIC_DOMAIN, ensuring it looks like `foo.PUBLIC_DOMAIN` 48 if public_domain_parts == host_parts[1:]: 49 project_slug = host_parts[0] 50 request.subdomain = True 51 log.debug('Proxito Public Domain: host=%s', host) 52 else: 53 # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example 54 # But these feel like they might be phishing, etc. so let's block them for now. 55 log.warning('Weird variation on our hostname: host=%s', host) 56 return render( 57 request, 'core/dns-404.html', context={'host': host}, status=400 58 ) 59 60 # Serve CNAMEs 61 else: 62 domain = Domain.objects.filter(domain=host).first() 63 if domain: 64 project_slug = domain.project.slug 65 request.cname = True 66 log.debug('Proxito CNAME: host=%s', host) 67 else: 68 # Some person is CNAMEing to us without configuring a domain - 404. 69 log.debug('CNAME 404: host=%s', host) 70 return render( 71 request, 'core/dns-404.html', context={'host': host}, status=404 72 ) 73 74 log.debug('Proxito Project: slug=%s', project_slug) 75 return project_slug 76 77 78 class ProxitoMiddleware(MiddlewareMixin): 79 80 """The actual middleware we'll be using in prod.""" 81 82 def process_request(self, request): # noqa 83 if any([not settings.USE_SUBDOMAIN, 'localhost' in request.get_host(), 84 'testserver' in request.get_host()]): 85 log.debug('Not processing Proxito middleware') 86 return None 87 88 ret = map_host_to_project_slug(request) 89 90 # Handle returning a response 91 if hasattr(ret, 'status_code'): 92 return ret 93 94 # Otherwise set the slug on the request 95 request.host_project_slug = request.slug = ret 96 97 return None 98 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/readthedocs/proxito/middleware.py b/readthedocs/proxito/middleware.py --- a/readthedocs/proxito/middleware.py +++ b/readthedocs/proxito/middleware.py @@ -11,7 +11,7 @@ from django.shortcuts import render from django.utils.deprecation import MiddlewareMixin -from readthedocs.projects.models import Domain +from readthedocs.projects.models import Domain, Project log = logging.getLogger(__name__) # noqa @@ -40,39 +40,38 @@ # Explicit Project slug being passed in if 'HTTP_X_RTD_SLUG' in request.META: project_slug = request.META['HTTP_X_RTD_SLUG'].lower() - request.rtdheader = True - log.info('Setting project based on X_RTD_SLUG header: %s' % project_slug) + if Project.objects.filter(slug=project_slug).exists(): + request.rtdheader = True + log.info('Setting project based on X_RTD_SLUG header: %s', project_slug) + return project_slug - elif public_domain in host or host == 'proxito': + if public_domain in host or host == 'proxito': # Serve from the PUBLIC_DOMAIN, ensuring it looks like `foo.PUBLIC_DOMAIN` if public_domain_parts == host_parts[1:]: project_slug = host_parts[0] request.subdomain = True log.debug('Proxito Public Domain: host=%s', host) - else: - # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example - # But these feel like they might be phishing, etc. so let's block them for now. - log.warning('Weird variation on our hostname: host=%s', host) - return render( - request, 'core/dns-404.html', context={'host': host}, status=400 - ) + return project_slug + # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example + # But these feel like they might be phishing, etc. so let's block them for now. + log.warning('Weird variation on our hostname: host=%s', host) + return render( + request, 'core/dns-404.html', context={'host': host}, status=400 + ) # Serve CNAMEs - else: - domain = Domain.objects.filter(domain=host).first() - if domain: - project_slug = domain.project.slug - request.cname = True - log.debug('Proxito CNAME: host=%s', host) - else: - # Some person is CNAMEing to us without configuring a domain - 404. - log.debug('CNAME 404: host=%s', host) - return render( - request, 'core/dns-404.html', context={'host': host}, status=404 - ) - - log.debug('Proxito Project: slug=%s', project_slug) - return project_slug + domain = Domain.objects.filter(domain=host).first() + if domain: + project_slug = domain.project.slug + request.cname = True + log.debug('Proxito CNAME: host=%s', host) + return project_slug + + # Some person is CNAMEing to us without configuring a domain - 404. + log.debug('CNAME 404: host=%s', host) + return render( + request, 'core/dns-404.html', context={'host': host}, status=404 + ) class ProxitoMiddleware(MiddlewareMixin): @@ -91,6 +90,8 @@ if hasattr(ret, 'status_code'): return ret + log.debug('Proxito Project: slug=%s', ret) + # Otherwise set the slug on the request request.host_project_slug = request.slug = ret
{"golden_diff": "diff --git a/readthedocs/proxito/middleware.py b/readthedocs/proxito/middleware.py\n--- a/readthedocs/proxito/middleware.py\n+++ b/readthedocs/proxito/middleware.py\n@@ -11,7 +11,7 @@\n from django.shortcuts import render\n from django.utils.deprecation import MiddlewareMixin\n \n-from readthedocs.projects.models import Domain\n+from readthedocs.projects.models import Domain, Project\n \n log = logging.getLogger(__name__)  # noqa\n \n@@ -40,39 +40,38 @@\n     # Explicit Project slug being passed in\n     if 'HTTP_X_RTD_SLUG' in request.META:\n         project_slug = request.META['HTTP_X_RTD_SLUG'].lower()\n-        request.rtdheader = True\n-        log.info('Setting project based on X_RTD_SLUG header: %s' % project_slug)\n+        if Project.objects.filter(slug=project_slug).exists():\n+            request.rtdheader = True\n+            log.info('Setting project based on X_RTD_SLUG header: %s', project_slug)\n+            return project_slug\n \n-    elif public_domain in host or host == 'proxito':\n+    if public_domain in host or host == 'proxito':\n         # Serve from the PUBLIC_DOMAIN, ensuring it looks like `foo.PUBLIC_DOMAIN`\n         if public_domain_parts == host_parts[1:]:\n             project_slug = host_parts[0]\n             request.subdomain = True\n             log.debug('Proxito Public Domain: host=%s', host)\n-        else:\n-            # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example\n-            # But these feel like they might be phishing, etc. so let's block them for now.\n-            log.warning('Weird variation on our hostname: host=%s', host)\n-            return render(\n-                request, 'core/dns-404.html', context={'host': host}, status=400\n-            )\n+            return project_slug\n+        # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example\n+        # But these feel like they might be phishing, etc. so let's block them for now.\n+        log.warning('Weird variation on our hostname: host=%s', host)\n+        return render(\n+            request, 'core/dns-404.html', context={'host': host}, status=400\n+        )\n \n     # Serve CNAMEs\n-    else:\n-        domain = Domain.objects.filter(domain=host).first()\n-        if domain:\n-            project_slug = domain.project.slug\n-            request.cname = True\n-            log.debug('Proxito CNAME: host=%s', host)\n-        else:\n-            # Some person is CNAMEing to us without configuring a domain - 404.\n-            log.debug('CNAME 404: host=%s', host)\n-            return render(\n-                request, 'core/dns-404.html', context={'host': host}, status=404\n-            )\n-\n-    log.debug('Proxito Project: slug=%s', project_slug)\n-    return project_slug\n+    domain = Domain.objects.filter(domain=host).first()\n+    if domain:\n+        project_slug = domain.project.slug\n+        request.cname = True\n+        log.debug('Proxito CNAME: host=%s', host)\n+        return project_slug\n+\n+    # Some person is CNAMEing to us without configuring a domain - 404.\n+    log.debug('CNAME 404: host=%s', host)\n+    return render(\n+        request, 'core/dns-404.html', context={'host': host}, status=404\n+    )\n \n \n class ProxitoMiddleware(MiddlewareMixin):\n@@ -91,6 +90,8 @@\n         if hasattr(ret, 'status_code'):\n             return ret\n \n+    log.debug('Proxito Project: slug=%s', ret)\n+\n     # Otherwise set the slug on the request\n     request.host_project_slug = request.slug = ret\n", "issue": "docs.opnfv.org is 'Not Found.'\n## Details\r\n\r\nThe entire documentation site for OPNFV is missing. The most recent builds have succeeded, and as far as I know DNS hasn't changed recently.\r\n\r\n* Read the Docs project URL: opnfvdocsdemo.readthedocs.io\r\n* Build URL (if applicable): https://readthedocs.org/projects/opnfvdocsdemo/builds/\r\n* Read the Docs username (if applicable):\r\n\r\n## Expected Result\r\n\r\nGoing to https://docs.opnfv.org/ returns the documentation site.\r\n\r\n## Actual Result\r\n```\r\ncurl -i -L https://opnfvdocsdemo.readthedocs.io/\r\nHTTP/1.1 302 Found\r\nContent-Type: text/html; charset=utf-8\r\nLocation: http://docs.opnfv.org/en/stable-hunter/\r\nServer: nginx\r\nX-Frame-Options: DENY\r\nx-content-type-options: nosniff\r\nx-xss-protection: 1; mode=block\r\nX-Served: Django-Proxito\r\nX-Deity: web04\r\nStrict-Transport-Security: max-age=31536000; includeSubDomains\r\nDate: Wed, 29 Jan 2020 23:13:29 GMT\r\nContent-Length: 0\r\n\r\nHTTP/1.1 301 Moved Permanently\r\nServer: CloudFront\r\nDate: Wed, 29 Jan 2020 23:13:29 GMT\r\nContent-Type: text/html\r\nContent-Length: 183\r\nConnection: keep-alive\r\nLocation: https://docs.opnfv.org/en/stable-hunter/\r\nX-Cache: Redirect from cloudfront\r\nVia: 1.1 5ab5dc09da67e3ea794ec8a82992cc89.cloudfront.net (CloudFront)\r\nX-Amz-Cf-Pop: HIO50-C1\r\nX-Amz-Cf-Id: 0_rJ9aN8nFAFm6M9VPcWPWHa7B8QOaSW1_Y3Llttz31ZTaK03cTaYQ==\r\n\r\nHTTP/2 404 \r\ncontent-type: text/html; charset=utf-8\r\ncontent-length: 10\r\nserver: nginx\r\nx-frame-options: DENY\r\nx-content-type-options: nosniff\r\nx-xss-protection: 1; mode=block\r\nx-served: Proxito-404\r\nx-deity: web03\r\nstrict-transport-security: max-age=0\r\ndate: Wed, 29 Jan 2020 23:13:30 GMT\r\nx-cache: Miss from cloudfront\r\nvia: 1.1 1b0911478686968732f973d6e5e31d11.cloudfront.net (CloudFront)\r\nx-amz-cf-pop: HIO50-C1\r\nx-amz-cf-id: sRmKIeU3LyXtKb93316GUwkxqiChktuq227k3nhDcOPqU-78E7JFTA==\r\n\r\nNot Found.\r\n```\r\n\n", "before_files": [{"content": "\"\"\"\nMiddleware for Proxito.\n\nThis is used to take the request and map the host to the proper project slug.\n\nAdditional processing is done to get the project from the URL in the ``views.py`` as well.\n\"\"\"\nimport logging\n\nfrom django.conf import settings\nfrom django.shortcuts import render\nfrom django.utils.deprecation import MiddlewareMixin\n\nfrom readthedocs.projects.models import Domain\n\nlog = logging.getLogger(__name__)  # noqa\n\n\ndef map_host_to_project_slug(request):\n    \"\"\"\n    Take the request and map the host to the proper project slug.\n\n    We check, in order:\n\n    * The ``HTTP_X_RTD_SLUG`` host header for explicit Project mapping\n      - This sets ``request.rtdheader`` True\n    * The ``PUBLIC_DOMAIN`` where we can use the subdomain as the project name\n      - This sets ``request.subdomain`` True\n    * The hostname without port information, which maps to ``Domain`` objects\n      - This sets ``request.cname`` True\n    \"\"\"\n\n    host = request.get_host().lower().split(':')[0]\n    public_domain = settings.PUBLIC_DOMAIN.lower().split(':')[0]\n    host_parts = host.split('.')\n    public_domain_parts = public_domain.split('.')\n\n    project_slug = None\n\n    # Explicit Project slug being passed in\n    if 'HTTP_X_RTD_SLUG' in request.META:\n        project_slug = request.META['HTTP_X_RTD_SLUG'].lower()\n        request.rtdheader = True\n        log.info('Setting project based on X_RTD_SLUG header: %s' % project_slug)\n\n    elif public_domain in host or host == 'proxito':\n        # Serve from the PUBLIC_DOMAIN, ensuring it looks like `foo.PUBLIC_DOMAIN`\n        if public_domain_parts == host_parts[1:]:\n            project_slug = host_parts[0]\n            request.subdomain = True\n            log.debug('Proxito Public Domain: host=%s', host)\n        else:\n            # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example\n            # But these feel like they might be phishing, etc. so let's block them for now.\n            log.warning('Weird variation on our hostname: host=%s', host)\n            return render(\n                request, 'core/dns-404.html', context={'host': host}, status=400\n            )\n\n    # Serve CNAMEs\n    else:\n        domain = Domain.objects.filter(domain=host).first()\n        if domain:\n            project_slug = domain.project.slug\n            request.cname = True\n            log.debug('Proxito CNAME: host=%s', host)\n        else:\n            # Some person is CNAMEing to us without configuring a domain - 404.\n            log.debug('CNAME 404: host=%s', host)\n            return render(\n                request, 'core/dns-404.html', context={'host': host}, status=404\n            )\n\n    log.debug('Proxito Project: slug=%s', project_slug)\n    return project_slug\n\n\nclass ProxitoMiddleware(MiddlewareMixin):\n\n    \"\"\"The actual middleware we'll be using in prod.\"\"\"\n\n    def process_request(self, request):  # noqa\n        if any([not settings.USE_SUBDOMAIN, 'localhost' in request.get_host(),\n                'testserver' in request.get_host()]):\n            log.debug('Not processing Proxito middleware')\n            return None\n\n        ret = map_host_to_project_slug(request)\n\n        # Handle returning a response\n        if hasattr(ret, 'status_code'):\n            return ret\n\n        # Otherwise set the slug on the request\n        request.host_project_slug = request.slug = ret\n\n        return None\n", "path": "readthedocs/proxito/middleware.py"}], "after_files": [{"content": "\"\"\"\nMiddleware for Proxito.\n\nThis is used to take the request and map the host to the proper project slug.\n\nAdditional processing is done to get the project from the URL in the ``views.py`` as well.\n\"\"\"\nimport logging\n\nfrom django.conf import settings\nfrom django.shortcuts import render\nfrom django.utils.deprecation import MiddlewareMixin\n\nfrom readthedocs.projects.models import Domain, Project\n\nlog = logging.getLogger(__name__)  # noqa\n\n\ndef map_host_to_project_slug(request):\n    \"\"\"\n    Take the request and map the host to the proper project slug.\n\n    We check, in order:\n\n    * The ``HTTP_X_RTD_SLUG`` host header for explicit Project mapping\n      - This sets ``request.rtdheader`` True\n    * The ``PUBLIC_DOMAIN`` where we can use the subdomain as the project name\n      - This sets ``request.subdomain`` True\n    * The hostname without port information, which maps to ``Domain`` objects\n      - This sets ``request.cname`` True\n    \"\"\"\n\n    host = request.get_host().lower().split(':')[0]\n    public_domain = settings.PUBLIC_DOMAIN.lower().split(':')[0]\n    host_parts = host.split('.')\n    public_domain_parts = public_domain.split('.')\n\n    project_slug = None\n\n    # Explicit Project slug being passed in\n    if 'HTTP_X_RTD_SLUG' in request.META:\n        project_slug = request.META['HTTP_X_RTD_SLUG'].lower()\n        if Project.objects.filter(slug=project_slug).exists():\n            request.rtdheader = True\n            log.info('Setting project based on X_RTD_SLUG header: %s', project_slug)\n            return project_slug\n\n    if public_domain in host or host == 'proxito':\n        # Serve from the PUBLIC_DOMAIN, ensuring it looks like `foo.PUBLIC_DOMAIN`\n        if public_domain_parts == host_parts[1:]:\n            project_slug = host_parts[0]\n            request.subdomain = True\n            log.debug('Proxito Public Domain: host=%s', host)\n            return project_slug\n        # TODO: This can catch some possibly valid domains (docs.readthedocs.io.com) for example\n        # But these feel like they might be phishing, etc. so let's block them for now.\n        log.warning('Weird variation on our hostname: host=%s', host)\n        return render(\n            request, 'core/dns-404.html', context={'host': host}, status=400\n        )\n\n    # Serve CNAMEs\n    domain = Domain.objects.filter(domain=host).first()\n    if domain:\n        project_slug = domain.project.slug\n        request.cname = True\n        log.debug('Proxito CNAME: host=%s', host)\n        return project_slug\n\n    # Some person is CNAMEing to us without configuring a domain - 404.\n    log.debug('CNAME 404: host=%s', host)\n    return render(\n        request, 'core/dns-404.html', context={'host': host}, status=404\n    )\n\n\nclass ProxitoMiddleware(MiddlewareMixin):\n\n    \"\"\"The actual middleware we'll be using in prod.\"\"\"\n\n    def process_request(self, request):  # noqa\n        if any([not settings.USE_SUBDOMAIN, 'localhost' in request.get_host(),\n                'testserver' in request.get_host()]):\n            log.debug('Not processing Proxito middleware')\n            return None\n\n        ret = map_host_to_project_slug(request)\n\n        # Handle returning a response\n        if hasattr(ret, 'status_code'):\n            return ret\n\n        log.debug('Proxito Project: slug=%s', ret)\n\n        # Otherwise set the slug on the request\n        request.host_project_slug = request.slug = ret\n\n        return None\n", "path": "readthedocs/proxito/middleware.py"}]}
1955
897
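The golden diff above does two things: it only trusts the `X-RTD-SLUG` header when a matching `Project` actually exists, and it flattens the nested `if`/`else` chain into early returns so each resolution strategy exits as soon as it succeeds. A framework-free sketch of that control flow — the in-memory stand-ins for the `Project`/`Domain` tables are illustrative, not Read the Docs' real API:

```python
from typing import Dict, Optional

PUBLIC_DOMAIN = "readthedocs.io"
KNOWN_PROJECTS = {"pip", "requests"}           # stands in for Project.objects
CUSTOM_DOMAINS = {"docs.example.com": "pip"}   # stands in for Domain.objects


def map_host_to_project_slug(host: str, headers: Dict[str, str]) -> Optional[str]:
    """Resolve ``host`` to a project slug; ``None`` means a 400/404 upstream."""
    host = host.lower().split(":")[0]

    # 1. Explicit header override -- only honoured if the slug really exists.
    slug = headers.get("X-RTD-SLUG", "").lower()
    if slug and slug in KNOWN_PROJECTS:
        return slug

    # 2. Public domain: foo.readthedocs.io -> "foo".
    if host.split(".")[1:] == PUBLIC_DOMAIN.split("."):
        return host.split(".")[0]
    if PUBLIC_DOMAIN in host:
        return None  # weird variation such as docs.readthedocs.io.com

    # 3. Custom domains (CNAMEs); unknown hosts resolve to None.
    return CUSTOM_DOMAINS.get(host)


assert map_host_to_project_slug("pip.readthedocs.io", {}) == "pip"
assert map_host_to_project_slug("docs.example.com", {}) == "pip"
assert map_host_to_project_slug("pip.readthedocs.io.com", {}) is None
```

With early returns, the header branch can no longer fall through and hand an unvalidated slug to a shared `return project_slug` at the bottom — the flaw that the `Project.objects.filter(slug=project_slug).exists()` check closes.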
gh_patches_debug_685
rasdani/github-patches
git_diff
pytorch__TensorRT-1849
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Add Test Suite for `torch.compile` backend Partitioning/Lowering Phases - Add robust test suite for `torch.compile` backend, ensuring each phase functions correctly - Add general-purpose utilities for test expansion as the backend evolves --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `py/torch_tensorrt/dynamo/torch_compile/utils.py` Content: ``` 1 import torch 2 3 from typing import Any, Union, Sequence, Dict 4 from torch_tensorrt import _Input, Device 5 6 7 def prepare_inputs( 8 inputs: Union[_Input.Input, torch.Tensor, Sequence, Dict], 9 device: torch.device = torch.device("cuda"), 10 ) -> Any: 11 if isinstance(inputs, _Input.Input): 12 if isinstance(inputs.shape, dict): 13 return inputs.example_tensor(optimization_profile_field="opt_shape").to( 14 device 15 ) 16 else: 17 return inputs.example_tensor().to(device) 18 19 elif isinstance(inputs, torch.Tensor): 20 return inputs 21 22 elif isinstance(inputs, list): 23 prepared_input = list() 24 25 for input_obj in inputs: 26 prepared_input.append(prepare_inputs(input_obj)) 27 28 return prepared_input 29 30 elif isinstance(inputs, tuple): 31 prepared_input = list() 32 33 for input_obj in inputs: 34 prepared_input.append(prepare_inputs(input_obj)) 35 36 return tuple(prepared_input) 37 38 elif isinstance(inputs, dict): 39 prepared_input = dict() 40 41 for key, input_obj in inputs.items(): 42 prepared_input[key] = prepare_inputs(input_obj) 43 44 return prepared_input 45 46 else: 47 raise ValueError( 48 f"Invalid input type {type(inputs)} encountered in the torch_compile input parsing. " 49 + "Allowed input types: {torch_tensorrt.Input, torch.Tensor, list, tuple, dict}" 50 ) 51 52 53 def prepare_device(device: Union[Device, torch.device]) -> torch.device: 54 if isinstance(device, Device): 55 if device.gpu_id != -1: 56 device = torch.device(device.gpu_id) 57 else: 58 raise ValueError("Invalid GPU ID provided for the CUDA device provided") 59 60 elif isinstance(device, torch.device): 61 device = device 62 63 else: 64 raise ValueError( 65 "Invalid device provided. Supported options: torch.device | torch_tensorrt.Device" 66 ) 67 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/py/torch_tensorrt/dynamo/torch_compile/utils.py b/py/torch_tensorrt/dynamo/torch_compile/utils.py --- a/py/torch_tensorrt/dynamo/torch_compile/utils.py +++ b/py/torch_tensorrt/dynamo/torch_compile/utils.py @@ -64,3 +64,5 @@ raise ValueError( "Invalid device provided. Supported options: torch.device | torch_tensorrt.Device" ) + + return device
{"golden_diff": "diff --git a/py/torch_tensorrt/dynamo/torch_compile/utils.py b/py/torch_tensorrt/dynamo/torch_compile/utils.py\n--- a/py/torch_tensorrt/dynamo/torch_compile/utils.py\n+++ b/py/torch_tensorrt/dynamo/torch_compile/utils.py\n@@ -64,3 +64,5 @@\n raise ValueError(\n \"Invalid device provided. Supported options: torch.device | torch_tensorrt.Device\"\n )\n+\n+ return device\n", "issue": "Add Test Suite for `torch.compile` backend Partitioning/Lowering Phases\n- Add robust test suite for `torch.compile` backend, ensuring each phase functions correctly\r\n- Add general-purpose utilities for test expansion as the backend evolves\n", "before_files": [{"content": "import torch\n\nfrom typing import Any, Union, Sequence, Dict\nfrom torch_tensorrt import _Input, Device\n\n\ndef prepare_inputs(\n inputs: Union[_Input.Input, torch.Tensor, Sequence, Dict],\n device: torch.device = torch.device(\"cuda\"),\n) -> Any:\n if isinstance(inputs, _Input.Input):\n if isinstance(inputs.shape, dict):\n return inputs.example_tensor(optimization_profile_field=\"opt_shape\").to(\n device\n )\n else:\n return inputs.example_tensor().to(device)\n\n elif isinstance(inputs, torch.Tensor):\n return inputs\n\n elif isinstance(inputs, list):\n prepared_input = list()\n\n for input_obj in inputs:\n prepared_input.append(prepare_inputs(input_obj))\n\n return prepared_input\n\n elif isinstance(inputs, tuple):\n prepared_input = list()\n\n for input_obj in inputs:\n prepared_input.append(prepare_inputs(input_obj))\n\n return tuple(prepared_input)\n\n elif isinstance(inputs, dict):\n prepared_input = dict()\n\n for key, input_obj in inputs.items():\n prepared_input[key] = prepare_inputs(input_obj)\n\n return prepared_input\n\n else:\n raise ValueError(\n f\"Invalid input type {type(inputs)} encountered in the torch_compile input parsing. \"\n + \"Allowed input types: {torch_tensorrt.Input, torch.Tensor, list, tuple, dict}\"\n )\n\n\ndef prepare_device(device: Union[Device, torch.device]) -> torch.device:\n if isinstance(device, Device):\n if device.gpu_id != -1:\n device = torch.device(device.gpu_id)\n else:\n raise ValueError(\"Invalid GPU ID provided for the CUDA device provided\")\n\n elif isinstance(device, torch.device):\n device = device\n\n else:\n raise ValueError(\n \"Invalid device provided. 
Supported options: torch.device | torch_tensorrt.Device\"\n )\n", "path": "py/torch_tensorrt/dynamo/torch_compile/utils.py"}], "after_files": [{"content": "import torch\n\nfrom typing import Any, Union, Sequence, Dict\nfrom torch_tensorrt import _Input, Device\n\n\ndef prepare_inputs(\n inputs: Union[_Input.Input, torch.Tensor, Sequence, Dict],\n device: torch.device = torch.device(\"cuda\"),\n) -> Any:\n if isinstance(inputs, _Input.Input):\n if isinstance(inputs.shape, dict):\n return inputs.example_tensor(optimization_profile_field=\"opt_shape\").to(\n device\n )\n else:\n return inputs.example_tensor().to(device)\n\n elif isinstance(inputs, torch.Tensor):\n return inputs\n\n elif isinstance(inputs, list):\n prepared_input = list()\n\n for input_obj in inputs:\n prepared_input.append(prepare_inputs(input_obj))\n\n return prepared_input\n\n elif isinstance(inputs, tuple):\n prepared_input = list()\n\n for input_obj in inputs:\n prepared_input.append(prepare_inputs(input_obj))\n\n return tuple(prepared_input)\n\n elif isinstance(inputs, dict):\n prepared_input = dict()\n\n for key, input_obj in inputs.items():\n prepared_input[key] = prepare_inputs(input_obj)\n\n return prepared_input\n\n else:\n raise ValueError(\n f\"Invalid input type {type(inputs)} encountered in the torch_compile input parsing. \"\n + \"Allowed input types: {torch_tensorrt.Input, torch.Tensor, list, tuple, dict}\"\n )\n\n\ndef prepare_device(device: Union[Device, torch.device]) -> torch.device:\n if isinstance(device, Device):\n if device.gpu_id != -1:\n device = torch.device(device.gpu_id)\n else:\n raise ValueError(\"Invalid GPU ID provided for the CUDA device provided\")\n\n elif isinstance(device, torch.device):\n device = device\n\n else:\n raise ValueError(\n \"Invalid device provided. Supported options: torch.device | torch_tensorrt.Device\"\n )\n\n return device\n", "path": "py/torch_tensorrt/dynamo/torch_compile/utils.py"}]}
834
101
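The entire fix is the trailing `return device`: `prepare_device` validated and converted its argument but then fell off the end of the function, so every caller silently received `None`. A torch-free reduction of the pitfall — the function names below are illustrative:

```python
def prepare_device_buggy(device):
    if isinstance(device, str):
        device = device.strip()
    # validation/conversion only -- no return, so the caller gets None


def prepare_device_fixed(device):
    if isinstance(device, str):
        device = device.strip()
    return device


assert prepare_device_buggy("cuda:0 ") is None       # the silent failure mode
assert prepare_device_fixed("cuda:0 ") == "cuda:0"
```

A one-line unit test asserting the return value — exactly the kind of check the issue's requested test suite would add — catches this class of bug immediately.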
gh_patches_debug_38916
rasdani/github-patches
git_diff
scrapy__scrapy-1944
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Scrapy 1.1.0 RC3 - exception thrown with invalid ssl certificate Hello, I am crawling sometimes websites with an invalid ssl certificate. For example, Scrapy 1.1.0 RC3 fails to open when I do: > scrapy shell https://www.directoriosanitario.com/directorio > or > scrapy shell https://saobinv.5go.cc/top/ and throws the following exception: > twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure service_identity.exceptions.VerificationError: VerificationError(errors=[DNSMismatch(mismatched_id=DNS_ID(hostname=b'www.directoriosanitario.com'))])>] I tried it with Scrapy 1.0.5 on python 2.7 and the spider opens but warns with: > AttributeError: 'NoneType' object has no attribute 'failVerification' Is there a way to force the spider to open with Scrapy 1.1.0 RC3? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `scrapy/core/downloader/tls.py` Content: ``` 1 from OpenSSL import SSL 2 3 4 METHOD_SSLv3 = 'SSLv3' 5 METHOD_TLS = 'TLS' 6 METHOD_TLSv10 = 'TLSv1.0' 7 METHOD_TLSv11 = 'TLSv1.1' 8 METHOD_TLSv12 = 'TLSv1.2' 9 10 openssl_methods = { 11 METHOD_TLS: SSL.SSLv23_METHOD, # protocol negotiation (recommended) 12 METHOD_SSLv3: SSL.SSLv3_METHOD, # SSL 3 (NOT recommended) 13 METHOD_TLSv10: SSL.TLSv1_METHOD, # TLS 1.0 only 14 METHOD_TLSv11: getattr(SSL, 'TLSv1_1_METHOD', 5), # TLS 1.1 only 15 METHOD_TLSv12: getattr(SSL, 'TLSv1_2_METHOD', 6), # TLS 1.2 only 16 } 17 ``` Path: `scrapy/core/downloader/contextfactory.py` Content: ``` 1 from OpenSSL import SSL 2 from twisted.internet.ssl import ClientContextFactory 3 4 try: 5 6 from zope.interface.declarations import implementer 7 8 # the following should be available from Twisted 14.0.0 9 from twisted.internet.ssl import optionsForClientTLS, CertificateOptions, platformTrust 10 from twisted.internet._sslverify import ClientTLSOptions 11 from twisted.web.client import BrowserLikePolicyForHTTPS 12 from twisted.web.iweb import IPolicyForHTTPS 13 14 @implementer(IPolicyForHTTPS) 15 class ScrapyClientContextFactory(BrowserLikePolicyForHTTPS): 16 """ 17 Non-peer-certificate verifying HTTPS context factory 18 19 Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD) 20 which allows TLS protocol negotiation 21 22 'A TLS/SSL connection established with [this method] may 23 understand the SSLv3, TLSv1, TLSv1.1 and TLSv1.2 protocols.' 24 """ 25 26 def __init__(self, method=SSL.SSLv23_METHOD, *args, **kwargs): 27 super(ScrapyClientContextFactory, self).__init__(*args, **kwargs) 28 self._ssl_method = method 29 30 def getCertificateOptions(self): 31 # setting verify=True will require you to provide CAs 32 # to verify against; in other words: it's not that simple 33 34 # backward-compatible SSL/TLS method: 35 # 36 # * this will respect `method` attribute in often recommended 37 # `ScrapyClientContextFactory` subclass 38 # (https://github.com/scrapy/scrapy/issues/1429#issuecomment-131782133) 39 # 40 # * getattr() for `_ssl_method` attribute for context factories 41 # not calling super(..., self).__init__ 42 return CertificateOptions(verify=False, 43 method=getattr(self, 'method', 44 getattr(self, '_ssl_method', None))) 45 46 # kept for old-style HTTP/1.0 downloader context twisted calls, 47 # e.g. 
connectSSL() 48 def getContext(self, hostname=None, port=None): 49 return self.getCertificateOptions().getContext() 50 51 def creatorForNetloc(self, hostname, port): 52 return ClientTLSOptions(hostname.decode("ascii"), self.getContext()) 53 54 55 @implementer(IPolicyForHTTPS) 56 class BrowserLikeContextFactory(ScrapyClientContextFactory): 57 """ 58 Twisted-recommended context factory for web clients. 59 60 Quoting http://twistedmatrix.com/documents/current/api/twisted.web.client.Agent.html: 61 "The default is to use a BrowserLikePolicyForHTTPS, 62 so unless you have special requirements you can leave this as-is." 63 64 creatorForNetloc() is the same as BrowserLikePolicyForHTTPS 65 except this context factory allows setting the TLS/SSL method to use. 66 67 Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD) 68 which allows TLS protocol negotiation. 69 """ 70 def creatorForNetloc(self, hostname, port): 71 72 # trustRoot set to platformTrust() will use the platform's root CAs. 73 # 74 # This means that a website like https://www.cacert.org will be rejected 75 # by default, since CAcert.org CA certificate is seldom shipped. 76 return optionsForClientTLS(hostname.decode("ascii"), 77 trustRoot=platformTrust(), 78 extraCertificateOptions={ 79 'method': self._ssl_method, 80 }) 81 82 except ImportError: 83 84 class ScrapyClientContextFactory(ClientContextFactory): 85 "A SSL context factory which is more permissive against SSL bugs." 86 # see https://github.com/scrapy/scrapy/issues/82 87 # and https://github.com/scrapy/scrapy/issues/26 88 # and https://github.com/scrapy/scrapy/issues/981 89 90 def __init__(self, method=SSL.SSLv23_METHOD): 91 self.method = method 92 93 def getContext(self, hostname=None, port=None): 94 ctx = ClientContextFactory.getContext(self) 95 # Enable all workarounds to SSL bugs as documented by 96 # http://www.openssl.org/docs/ssl/SSL_CTX_set_options.html 97 ctx.set_options(SSL.OP_ALL) 98 return ctx 99 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/scrapy/core/downloader/contextfactory.py b/scrapy/core/downloader/contextfactory.py --- a/scrapy/core/downloader/contextfactory.py +++ b/scrapy/core/downloader/contextfactory.py @@ -6,11 +6,16 @@ from zope.interface.declarations import implementer # the following should be available from Twisted 14.0.0 - from twisted.internet.ssl import optionsForClientTLS, CertificateOptions, platformTrust - from twisted.internet._sslverify import ClientTLSOptions + from twisted.internet.ssl import (optionsForClientTLS, + CertificateOptions, + platformTrust) + from twisted.web.client import BrowserLikePolicyForHTTPS from twisted.web.iweb import IPolicyForHTTPS + from scrapy.core.downloader.tls import ScrapyClientTLSOptions + + @implementer(IPolicyForHTTPS) class ScrapyClientContextFactory(BrowserLikePolicyForHTTPS): """ @@ -49,7 +54,7 @@ return self.getCertificateOptions().getContext() def creatorForNetloc(self, hostname, port): - return ClientTLSOptions(hostname.decode("ascii"), self.getContext()) + return ScrapyClientTLSOptions(hostname.decode("ascii"), self.getContext()) @implementer(IPolicyForHTTPS) diff --git a/scrapy/core/downloader/tls.py b/scrapy/core/downloader/tls.py --- a/scrapy/core/downloader/tls.py +++ b/scrapy/core/downloader/tls.py @@ -1,6 +1,9 @@ +import logging from OpenSSL import SSL +logger = logging.getLogger(__name__) + METHOD_SSLv3 = 'SSLv3' METHOD_TLS = 'TLS' METHOD_TLSv10 = 'TLSv1.0' @@ -14,3 +17,36 @@ METHOD_TLSv11: getattr(SSL, 'TLSv1_1_METHOD', 5), # TLS 1.1 only METHOD_TLSv12: getattr(SSL, 'TLSv1_2_METHOD', 6), # TLS 1.2 only } + +# ClientTLSOptions requires a recent-enough version of Twisted +try: + + # taken from twisted/twisted/internet/_sslverify.py + try: + from OpenSSL.SSL import SSL_CB_HANDSHAKE_DONE, SSL_CB_HANDSHAKE_START + except ImportError: + SSL_CB_HANDSHAKE_START = 0x10 + SSL_CB_HANDSHAKE_DONE = 0x20 + + from twisted.internet._sslverify import (ClientTLSOptions, + _maybeSetHostNameIndication, + verifyHostname, + VerificationError) + + class ScrapyClientTLSOptions(ClientTLSOptions): + # same as Twisted's ClientTLSOptions, + # except that VerificationError is caught + # and doesn't close the connection + def _identityVerifyingInfoCallback(self, connection, where, ret): + if where & SSL_CB_HANDSHAKE_START: + _maybeSetHostNameIndication(connection, self._hostnameBytes) + elif where & SSL_CB_HANDSHAKE_DONE: + try: + verifyHostname(connection, self._hostnameASCII) + except VerificationError as e: + logger.warning(e) + +except ImportError: + # ImportError should not matter for older Twisted versions + # as the above is not used in the fallback ScrapyClientContextFactory + pass
{"golden_diff": "diff --git a/scrapy/core/downloader/contextfactory.py b/scrapy/core/downloader/contextfactory.py\n--- a/scrapy/core/downloader/contextfactory.py\n+++ b/scrapy/core/downloader/contextfactory.py\n@@ -6,11 +6,16 @@\n from zope.interface.declarations import implementer\n \n # the following should be available from Twisted 14.0.0\n- from twisted.internet.ssl import optionsForClientTLS, CertificateOptions, platformTrust\n- from twisted.internet._sslverify import ClientTLSOptions\n+ from twisted.internet.ssl import (optionsForClientTLS,\n+ CertificateOptions,\n+ platformTrust)\n+\n from twisted.web.client import BrowserLikePolicyForHTTPS\n from twisted.web.iweb import IPolicyForHTTPS\n \n+ from scrapy.core.downloader.tls import ScrapyClientTLSOptions\n+\n+\n @implementer(IPolicyForHTTPS)\n class ScrapyClientContextFactory(BrowserLikePolicyForHTTPS):\n \"\"\"\n@@ -49,7 +54,7 @@\n return self.getCertificateOptions().getContext()\n \n def creatorForNetloc(self, hostname, port):\n- return ClientTLSOptions(hostname.decode(\"ascii\"), self.getContext())\n+ return ScrapyClientTLSOptions(hostname.decode(\"ascii\"), self.getContext())\n \n \n @implementer(IPolicyForHTTPS)\ndiff --git a/scrapy/core/downloader/tls.py b/scrapy/core/downloader/tls.py\n--- a/scrapy/core/downloader/tls.py\n+++ b/scrapy/core/downloader/tls.py\n@@ -1,6 +1,9 @@\n+import logging\n from OpenSSL import SSL\n \n \n+logger = logging.getLogger(__name__)\n+\n METHOD_SSLv3 = 'SSLv3'\n METHOD_TLS = 'TLS'\n METHOD_TLSv10 = 'TLSv1.0'\n@@ -14,3 +17,36 @@\n METHOD_TLSv11: getattr(SSL, 'TLSv1_1_METHOD', 5), # TLS 1.1 only\n METHOD_TLSv12: getattr(SSL, 'TLSv1_2_METHOD', 6), # TLS 1.2 only\n }\n+\n+# ClientTLSOptions requires a recent-enough version of Twisted\n+try:\n+\n+ # taken from twisted/twisted/internet/_sslverify.py\n+ try:\n+ from OpenSSL.SSL import SSL_CB_HANDSHAKE_DONE, SSL_CB_HANDSHAKE_START\n+ except ImportError:\n+ SSL_CB_HANDSHAKE_START = 0x10\n+ SSL_CB_HANDSHAKE_DONE = 0x20\n+\n+ from twisted.internet._sslverify import (ClientTLSOptions,\n+ _maybeSetHostNameIndication,\n+ verifyHostname,\n+ VerificationError)\n+\n+ class ScrapyClientTLSOptions(ClientTLSOptions):\n+ # same as Twisted's ClientTLSOptions,\n+ # except that VerificationError is caught\n+ # and doesn't close the connection\n+ def _identityVerifyingInfoCallback(self, connection, where, ret):\n+ if where & SSL_CB_HANDSHAKE_START:\n+ _maybeSetHostNameIndication(connection, self._hostnameBytes)\n+ elif where & SSL_CB_HANDSHAKE_DONE:\n+ try:\n+ verifyHostname(connection, self._hostnameASCII)\n+ except VerificationError as e:\n+ logger.warning(e)\n+\n+except ImportError:\n+ # ImportError should not matter for older Twisted versions\n+ # as the above is not used in the fallback ScrapyClientContextFactory\n+ pass\n", "issue": "Scrapy 1.1.0 RC3 - exception thrown with invalid ssl certificate\nHello,\n\nI am crawling sometimes websites with an invalid ssl certificate. 
For example, Scrapy 1.1.0 RC3 fails to open when I do:\n\n> scrapy shell https://www.directoriosanitario.com/directorio\n> or\n> scrapy shell https://saobinv.5go.cc/top/\n\nand throws the following exception:\n\n> twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure service_identity.exceptions.VerificationError: VerificationError(errors=[DNSMismatch(mismatched_id=DNS_ID(hostname=b'www.directoriosanitario.com'))])>]\n\nI tried it with Scrapy 1.0.5 on python 2.7 and the spider opens but warns with: \n\n> AttributeError: 'NoneType' object has no attribute 'failVerification'\n\nIs there a way to force the spider to open with Scrapy 1.1.0 RC3?\n\n", "before_files": [{"content": "from OpenSSL import SSL\n\n\nMETHOD_SSLv3 = 'SSLv3'\nMETHOD_TLS = 'TLS'\nMETHOD_TLSv10 = 'TLSv1.0'\nMETHOD_TLSv11 = 'TLSv1.1'\nMETHOD_TLSv12 = 'TLSv1.2'\n\nopenssl_methods = {\n METHOD_TLS: SSL.SSLv23_METHOD, # protocol negotiation (recommended)\n METHOD_SSLv3: SSL.SSLv3_METHOD, # SSL 3 (NOT recommended)\n METHOD_TLSv10: SSL.TLSv1_METHOD, # TLS 1.0 only\n METHOD_TLSv11: getattr(SSL, 'TLSv1_1_METHOD', 5), # TLS 1.1 only\n METHOD_TLSv12: getattr(SSL, 'TLSv1_2_METHOD', 6), # TLS 1.2 only\n}\n", "path": "scrapy/core/downloader/tls.py"}, {"content": "from OpenSSL import SSL\nfrom twisted.internet.ssl import ClientContextFactory\n\ntry:\n\n from zope.interface.declarations import implementer\n\n # the following should be available from Twisted 14.0.0\n from twisted.internet.ssl import optionsForClientTLS, CertificateOptions, platformTrust\n from twisted.internet._sslverify import ClientTLSOptions\n from twisted.web.client import BrowserLikePolicyForHTTPS\n from twisted.web.iweb import IPolicyForHTTPS\n\n @implementer(IPolicyForHTTPS)\n class ScrapyClientContextFactory(BrowserLikePolicyForHTTPS):\n \"\"\"\n Non-peer-certificate verifying HTTPS context factory\n\n Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD)\n which allows TLS protocol negotiation\n\n 'A TLS/SSL connection established with [this method] may\n understand the SSLv3, TLSv1, TLSv1.1 and TLSv1.2 protocols.'\n \"\"\"\n\n def __init__(self, method=SSL.SSLv23_METHOD, *args, **kwargs):\n super(ScrapyClientContextFactory, self).__init__(*args, **kwargs)\n self._ssl_method = method\n\n def getCertificateOptions(self):\n # setting verify=True will require you to provide CAs\n # to verify against; in other words: it's not that simple\n\n # backward-compatible SSL/TLS method:\n #\n # * this will respect `method` attribute in often recommended\n # `ScrapyClientContextFactory` subclass\n # (https://github.com/scrapy/scrapy/issues/1429#issuecomment-131782133)\n #\n # * getattr() for `_ssl_method` attribute for context factories\n # not calling super(..., self).__init__\n return CertificateOptions(verify=False,\n method=getattr(self, 'method',\n getattr(self, '_ssl_method', None)))\n\n # kept for old-style HTTP/1.0 downloader context twisted calls,\n # e.g. 
connectSSL()\n def getContext(self, hostname=None, port=None):\n return self.getCertificateOptions().getContext()\n\n def creatorForNetloc(self, hostname, port):\n return ClientTLSOptions(hostname.decode(\"ascii\"), self.getContext())\n\n\n @implementer(IPolicyForHTTPS)\n class BrowserLikeContextFactory(ScrapyClientContextFactory):\n \"\"\"\n Twisted-recommended context factory for web clients.\n\n Quoting http://twistedmatrix.com/documents/current/api/twisted.web.client.Agent.html:\n \"The default is to use a BrowserLikePolicyForHTTPS,\n so unless you have special requirements you can leave this as-is.\"\n\n creatorForNetloc() is the same as BrowserLikePolicyForHTTPS\n except this context factory allows setting the TLS/SSL method to use.\n\n Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD)\n which allows TLS protocol negotiation.\n \"\"\"\n def creatorForNetloc(self, hostname, port):\n\n # trustRoot set to platformTrust() will use the platform's root CAs.\n #\n # This means that a website like https://www.cacert.org will be rejected\n # by default, since CAcert.org CA certificate is seldom shipped.\n return optionsForClientTLS(hostname.decode(\"ascii\"),\n trustRoot=platformTrust(),\n extraCertificateOptions={\n 'method': self._ssl_method,\n })\n\nexcept ImportError:\n\n class ScrapyClientContextFactory(ClientContextFactory):\n \"A SSL context factory which is more permissive against SSL bugs.\"\n # see https://github.com/scrapy/scrapy/issues/82\n # and https://github.com/scrapy/scrapy/issues/26\n # and https://github.com/scrapy/scrapy/issues/981\n\n def __init__(self, method=SSL.SSLv23_METHOD):\n self.method = method\n\n def getContext(self, hostname=None, port=None):\n ctx = ClientContextFactory.getContext(self)\n # Enable all workarounds to SSL bugs as documented by\n # http://www.openssl.org/docs/ssl/SSL_CTX_set_options.html\n ctx.set_options(SSL.OP_ALL)\n return ctx\n", "path": "scrapy/core/downloader/contextfactory.py"}], "after_files": [{"content": "import logging\nfrom OpenSSL import SSL\n\n\nlogger = logging.getLogger(__name__)\n\nMETHOD_SSLv3 = 'SSLv3'\nMETHOD_TLS = 'TLS'\nMETHOD_TLSv10 = 'TLSv1.0'\nMETHOD_TLSv11 = 'TLSv1.1'\nMETHOD_TLSv12 = 'TLSv1.2'\n\nopenssl_methods = {\n METHOD_TLS: SSL.SSLv23_METHOD, # protocol negotiation (recommended)\n METHOD_SSLv3: SSL.SSLv3_METHOD, # SSL 3 (NOT recommended)\n METHOD_TLSv10: SSL.TLSv1_METHOD, # TLS 1.0 only\n METHOD_TLSv11: getattr(SSL, 'TLSv1_1_METHOD', 5), # TLS 1.1 only\n METHOD_TLSv12: getattr(SSL, 'TLSv1_2_METHOD', 6), # TLS 1.2 only\n}\n\n# ClientTLSOptions requires a recent-enough version of Twisted\ntry:\n\n # taken from twisted/twisted/internet/_sslverify.py\n try:\n from OpenSSL.SSL import SSL_CB_HANDSHAKE_DONE, SSL_CB_HANDSHAKE_START\n except ImportError:\n SSL_CB_HANDSHAKE_START = 0x10\n SSL_CB_HANDSHAKE_DONE = 0x20\n\n from twisted.internet._sslverify import (ClientTLSOptions,\n _maybeSetHostNameIndication,\n verifyHostname,\n VerificationError)\n\n class ScrapyClientTLSOptions(ClientTLSOptions):\n # same as Twisted's ClientTLSOptions,\n # except that VerificationError is caught\n # and doesn't close the connection\n def _identityVerifyingInfoCallback(self, connection, where, ret):\n if where & SSL_CB_HANDSHAKE_START:\n _maybeSetHostNameIndication(connection, self._hostnameBytes)\n elif where & SSL_CB_HANDSHAKE_DONE:\n try:\n verifyHostname(connection, self._hostnameASCII)\n except VerificationError as e:\n logger.warning(e)\n\nexcept ImportError:\n # ImportError should not matter for older Twisted 
versions\n # as the above is not used in the fallback ScrapyClientContextFactory\n pass\n", "path": "scrapy/core/downloader/tls.py"}, {"content": "from OpenSSL import SSL\nfrom twisted.internet.ssl import ClientContextFactory\n\ntry:\n\n from zope.interface.declarations import implementer\n\n # the following should be available from Twisted 14.0.0\n from twisted.internet.ssl import (optionsForClientTLS,\n CertificateOptions,\n platformTrust)\n\n from twisted.web.client import BrowserLikePolicyForHTTPS\n from twisted.web.iweb import IPolicyForHTTPS\n\n from scrapy.core.downloader.tls import ScrapyClientTLSOptions\n\n\n @implementer(IPolicyForHTTPS)\n class ScrapyClientContextFactory(BrowserLikePolicyForHTTPS):\n \"\"\"\n Non-peer-certificate verifying HTTPS context factory\n\n Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD)\n which allows TLS protocol negotiation\n\n 'A TLS/SSL connection established with [this method] may\n understand the SSLv3, TLSv1, TLSv1.1 and TLSv1.2 protocols.'\n \"\"\"\n\n def __init__(self, method=SSL.SSLv23_METHOD, *args, **kwargs):\n super(ScrapyClientContextFactory, self).__init__(*args, **kwargs)\n self._ssl_method = method\n\n def getCertificateOptions(self):\n # setting verify=True will require you to provide CAs\n # to verify against; in other words: it's not that simple\n\n # backward-compatible SSL/TLS method:\n #\n # * this will respect `method` attribute in often recommended\n # `ScrapyClientContextFactory` subclass\n # (https://github.com/scrapy/scrapy/issues/1429#issuecomment-131782133)\n #\n # * getattr() for `_ssl_method` attribute for context factories\n # not calling super(..., self).__init__\n return CertificateOptions(verify=False,\n method=getattr(self, 'method',\n getattr(self, '_ssl_method', None)))\n\n # kept for old-style HTTP/1.0 downloader context twisted calls,\n # e.g. 
connectSSL()\n def getContext(self, hostname=None, port=None):\n return self.getCertificateOptions().getContext()\n\n def creatorForNetloc(self, hostname, port):\n return ScrapyClientTLSOptions(hostname.decode(\"ascii\"), self.getContext())\n\n\n @implementer(IPolicyForHTTPS)\n class BrowserLikeContextFactory(ScrapyClientContextFactory):\n \"\"\"\n Twisted-recommended context factory for web clients.\n\n Quoting http://twistedmatrix.com/documents/current/api/twisted.web.client.Agent.html:\n \"The default is to use a BrowserLikePolicyForHTTPS,\n so unless you have special requirements you can leave this as-is.\"\n\n creatorForNetloc() is the same as BrowserLikePolicyForHTTPS\n except this context factory allows setting the TLS/SSL method to use.\n\n Default OpenSSL method is TLS_METHOD (also called SSLv23_METHOD)\n which allows TLS protocol negotiation.\n \"\"\"\n def creatorForNetloc(self, hostname, port):\n\n # trustRoot set to platformTrust() will use the platform's root CAs.\n #\n # This means that a website like https://www.cacert.org will be rejected\n # by default, since CAcert.org CA certificate is seldom shipped.\n return optionsForClientTLS(hostname.decode(\"ascii\"),\n trustRoot=platformTrust(),\n extraCertificateOptions={\n 'method': self._ssl_method,\n })\n\nexcept ImportError:\n\n class ScrapyClientContextFactory(ClientContextFactory):\n \"A SSL context factory which is more permissive against SSL bugs.\"\n # see https://github.com/scrapy/scrapy/issues/82\n # and https://github.com/scrapy/scrapy/issues/26\n # and https://github.com/scrapy/scrapy/issues/981\n\n def __init__(self, method=SSL.SSLv23_METHOD):\n self.method = method\n\n def getContext(self, hostname=None, port=None):\n ctx = ClientContextFactory.getContext(self)\n # Enable all workarounds to SSL bugs as documented by\n # http://www.openssl.org/docs/ssl/SSL_CTX_set_options.html\n ctx.set_options(SSL.OP_ALL)\n return ctx\n", "path": "scrapy/core/downloader/contextfactory.py"}]}
1799
758
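The patch keeps hostname verification but downgrades a `VerificationError` from connection-fatal to a logged warning inside the handshake-done callback. A framework-free sketch of that pattern — the stand-in `verify_hostname` and the `strict` flag are illustrative; the real hook is Twisted's `_identityVerifyingInfoCallback`, as the diff shows:

```python
import logging

logger = logging.getLogger(__name__)


class VerificationError(Exception):
    pass


def verify_hostname(cert_hostname: str, expected: str) -> None:
    # Stand-in for service_identity's verifyHostname().
    if cert_hostname != expected:
        raise VerificationError(f"certificate does not match {expected!r}")


def on_handshake_done(cert_hostname: str, expected: str, strict: bool = False) -> None:
    try:
        verify_hostname(cert_hostname, expected)
    except VerificationError as e:
        if strict:
            raise            # old behaviour: request dies with ResponseNeverReceived
        logger.warning(e)    # new behaviour: warn and keep the connection alive


# Mismatched certificate, as on the reported sites -- warns instead of raising:
on_handshake_done("other.example", "www.directoriosanitario.com")
```

Scrapy's default context factory is deliberately permissive (it already passes `verify=False` to `CertificateOptions`), so tolerating a hostname mismatch there is consistent; users who want browser-grade checking opt into `BrowserLikeContextFactory` instead.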
gh_patches_debug_21907
rasdani/github-patches
git_diff
webkom__lego-1985
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Timezone email Format dates in emails in the same language as the email template (Norwegian), and converted to the proper timezone. ![image](https://user-images.githubusercontent.com/1467188/34150647-476ffb12-e4a9-11e7-9f17-5b83e0690dad.png) ![screenshot 19 des 1 2017 10_39_27](https://user-images.githubusercontent.com/1467188/34150870-f9c33536-e4a9-11e7-836e-a3e93ae27f0f.png) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `lego/apps/events/notifications.py` Content: ``` 1 from lego.apps.notifications.constants import ( 2 EVENT_ADMIN_REGISTRATION, 3 EVENT_ADMIN_UNREGISTRATION, 4 EVENT_BUMP, 5 EVENT_PAYMENT_OVERDUE, 6 EVENT_PAYMENT_OVERDUE_CREATOR, 7 ) 8 from lego.apps.notifications.notification import Notification 9 10 11 class EventBumpNotification(Notification): 12 13 name = EVENT_BUMP 14 15 def generate_mail(self): 16 event = self.kwargs["event"] 17 18 return self._delay_mail( 19 to_email=self.user.email, 20 context={"event": event.title, "name": self.user.full_name, "id": event.id}, 21 subject=f"Du er flyttet opp fra ventelisten på arrangementet {event.title}", 22 plain_template="events/email/bump.txt", 23 html_template="events/email/bump.html", 24 ) 25 26 def generate_push(self): 27 event = self.kwargs["event"] 28 29 return self._delay_push( 30 template="events/push/bump.txt", 31 context={"event": event.title}, 32 instance=event, 33 ) 34 35 36 class EventPaymentOverdueNotification(Notification): 37 38 name = EVENT_PAYMENT_OVERDUE 39 40 def generate_mail(self): 41 event = self.kwargs["event"] 42 43 return self._delay_mail( 44 to_email=self.user.email, 45 context={ 46 "event": event.title, 47 "name": self.user.full_name, 48 "due_date": event.payment_due_date, 49 "id": event.id, 50 }, 51 subject=f"Du har ikke betalt påmeldingen på arrangementet {event.title}", 52 plain_template="events/email/payment_overdue.txt", 53 html_template="events/email/payment_overdue.html", 54 ) 55 56 def generate_push(self): 57 event = self.kwargs["event"] 58 59 return self._delay_push( 60 template="events/push/payment_overdue.txt", 61 context={"event": event.title}, 62 instance=event, 63 ) 64 65 66 class EventPaymentOverdueCreatorNotification(Notification): 67 68 name = EVENT_PAYMENT_OVERDUE_CREATOR 69 70 def generate_mail(self): 71 event = self.kwargs["event"] 72 users = self.kwargs["users"] 73 74 return self._delay_mail( 75 to_email=self.user.email, 76 context={ 77 "event": event.title, 78 "users": users, 79 "name": self.user.full_name, 80 "id": event.id, 81 }, 82 subject=f"Følgende registrerte har ikke betalt påmeldingen til arrangementet" 83 f" {event.title}", 84 plain_template="events/email/payment_overdue_author.txt", 85 html_template="events/email/payment_overdue_author.html", 86 ) 87 88 89 class EventAdminRegistrationNotification(Notification): 90 91 name = EVENT_ADMIN_REGISTRATION 92 93 def generate_mail(self): 94 event = self.kwargs["event"] 95 reason = self.kwargs["reason"] 96 97 return self._delay_mail( 98 to_email=self.user.email, 99 context={ 100 "event": event.title, 101 "name": self.user.full_name, 102 "reason": reason, 103 "id": event.id, 104 }, 105 subject=f"Du har blitt adminpåmeldt på arrangementet {event.title}", 106 plain_template="events/email/admin_registration.txt", 107 html_template="events/email/admin_registration.html", 108 ) 109 110 def generate_push(self): 111 event = 
self.kwargs["event"] 112 113 return self._delay_push( 114 template="events/push/admin_registration.txt", 115 context={"event": event.title}, 116 instance=event, 117 ) 118 119 120 class EventAdminUnregistrationNotification(Notification): 121 122 name = EVENT_ADMIN_UNREGISTRATION 123 124 def generate_mail(self): 125 event = self.kwargs["event"] 126 creator = self.kwargs["creator"] 127 reason = self.kwargs["reason"] 128 129 return self._delay_mail( 130 to_email=self.user.email, 131 context={ 132 "event": event.title, 133 "creator_name": creator.full_name, 134 "creator_email": creator.email, 135 "name": self.user.full_name, 136 "reason": reason, 137 "id": event.id, 138 }, 139 subject=f"Du har blitt fjernet fra arrangementet {event.title}", 140 plain_template="events/email/admin_unregistration.txt", 141 html_template="events/email/admin_unregistration.html", 142 ) 143 144 def generate_push(self): 145 event = self.kwargs["event"] 146 147 return self._delay_push( 148 template="events/push/admin_unregistration.txt", 149 context={"event": event.title}, 150 instance=event, 151 ) 152 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/lego/apps/events/notifications.py b/lego/apps/events/notifications.py --- a/lego/apps/events/notifications.py +++ b/lego/apps/events/notifications.py @@ -1,3 +1,7 @@ +from django.utils import timezone + +import pytz + from lego.apps.notifications.constants import ( EVENT_ADMIN_REGISTRATION, EVENT_ADMIN_UNREGISTRATION, @@ -40,12 +44,18 @@ def generate_mail(self): event = self.kwargs["event"] + date = timezone.localtime( + value=event.payment_due_date, timezone=pytz.timezone("Europe/Oslo") + ) + + due_date = date.strftime("%d.%m.%y, kl. %H:%M") + return self._delay_mail( to_email=self.user.email, context={ "event": event.title, "name": self.user.full_name, - "due_date": event.payment_due_date, + "due_date": due_date, "id": event.id, }, subject=f"Du har ikke betalt påmeldingen på arrangementet {event.title}",
{"golden_diff": "diff --git a/lego/apps/events/notifications.py b/lego/apps/events/notifications.py\n--- a/lego/apps/events/notifications.py\n+++ b/lego/apps/events/notifications.py\n@@ -1,3 +1,7 @@\n+from django.utils import timezone\n+\n+import pytz\n+\n from lego.apps.notifications.constants import (\n EVENT_ADMIN_REGISTRATION,\n EVENT_ADMIN_UNREGISTRATION,\n@@ -40,12 +44,18 @@\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n \n+ date = timezone.localtime(\n+ value=event.payment_due_date, timezone=pytz.timezone(\"Europe/Oslo\")\n+ )\n+\n+ due_date = date.strftime(\"%d.%m.%y, kl. %H:%M\")\n+\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"name\": self.user.full_name,\n- \"due_date\": event.payment_due_date,\n+ \"due_date\": due_date,\n \"id\": event.id,\n },\n subject=f\"Du har ikke betalt p\u00e5meldingen p\u00e5 arrangementet {event.title}\",\n", "issue": "Timezone email\nFormat dates in emails in the same language as the email template (Norwegian), and converted to the proper timezone. \r\n\r\n![image](https://user-images.githubusercontent.com/1467188/34150647-476ffb12-e4a9-11e7-9f17-5b83e0690dad.png)\r\n\r\n![screenshot 19 des 1 2017 10_39_27](https://user-images.githubusercontent.com/1467188/34150870-f9c33536-e4a9-11e7-836e-a3e93ae27f0f.png)\r\n\r\n\n", "before_files": [{"content": "from lego.apps.notifications.constants import (\n EVENT_ADMIN_REGISTRATION,\n EVENT_ADMIN_UNREGISTRATION,\n EVENT_BUMP,\n EVENT_PAYMENT_OVERDUE,\n EVENT_PAYMENT_OVERDUE_CREATOR,\n)\nfrom lego.apps.notifications.notification import Notification\n\n\nclass EventBumpNotification(Notification):\n\n name = EVENT_BUMP\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\"event\": event.title, \"name\": self.user.full_name, \"id\": event.id},\n subject=f\"Du er flyttet opp fra ventelisten p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/bump.txt\",\n html_template=\"events/email/bump.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/bump.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventPaymentOverdueNotification(Notification):\n\n name = EVENT_PAYMENT_OVERDUE\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"name\": self.user.full_name,\n \"due_date\": event.payment_due_date,\n \"id\": event.id,\n },\n subject=f\"Du har ikke betalt p\u00e5meldingen p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/payment_overdue.txt\",\n html_template=\"events/email/payment_overdue.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/payment_overdue.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventPaymentOverdueCreatorNotification(Notification):\n\n name = EVENT_PAYMENT_OVERDUE_CREATOR\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n users = self.kwargs[\"users\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"users\": users,\n \"name\": self.user.full_name,\n \"id\": event.id,\n },\n subject=f\"F\u00f8lgende registrerte har ikke betalt p\u00e5meldingen til arrangementet\"\n f\" {event.title}\",\n plain_template=\"events/email/payment_overdue_author.txt\",\n 
html_template=\"events/email/payment_overdue_author.html\",\n )\n\n\nclass EventAdminRegistrationNotification(Notification):\n\n name = EVENT_ADMIN_REGISTRATION\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n reason = self.kwargs[\"reason\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"name\": self.user.full_name,\n \"reason\": reason,\n \"id\": event.id,\n },\n subject=f\"Du har blitt adminp\u00e5meldt p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/admin_registration.txt\",\n html_template=\"events/email/admin_registration.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/admin_registration.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventAdminUnregistrationNotification(Notification):\n\n name = EVENT_ADMIN_UNREGISTRATION\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n creator = self.kwargs[\"creator\"]\n reason = self.kwargs[\"reason\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"creator_name\": creator.full_name,\n \"creator_email\": creator.email,\n \"name\": self.user.full_name,\n \"reason\": reason,\n \"id\": event.id,\n },\n subject=f\"Du har blitt fjernet fra arrangementet {event.title}\",\n plain_template=\"events/email/admin_unregistration.txt\",\n html_template=\"events/email/admin_unregistration.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/admin_unregistration.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n", "path": "lego/apps/events/notifications.py"}], "after_files": [{"content": "from django.utils import timezone\n\nimport pytz\n\nfrom lego.apps.notifications.constants import (\n EVENT_ADMIN_REGISTRATION,\n EVENT_ADMIN_UNREGISTRATION,\n EVENT_BUMP,\n EVENT_PAYMENT_OVERDUE,\n EVENT_PAYMENT_OVERDUE_CREATOR,\n)\nfrom lego.apps.notifications.notification import Notification\n\n\nclass EventBumpNotification(Notification):\n\n name = EVENT_BUMP\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\"event\": event.title, \"name\": self.user.full_name, \"id\": event.id},\n subject=f\"Du er flyttet opp fra ventelisten p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/bump.txt\",\n html_template=\"events/email/bump.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/bump.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventPaymentOverdueNotification(Notification):\n\n name = EVENT_PAYMENT_OVERDUE\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n\n date = timezone.localtime(\n value=event.payment_due_date, timezone=pytz.timezone(\"Europe/Oslo\")\n )\n\n due_date = date.strftime(\"%d.%m.%y, kl. 
%H:%M\")\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"name\": self.user.full_name,\n \"due_date\": due_date,\n \"id\": event.id,\n },\n subject=f\"Du har ikke betalt p\u00e5meldingen p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/payment_overdue.txt\",\n html_template=\"events/email/payment_overdue.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/payment_overdue.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventPaymentOverdueCreatorNotification(Notification):\n\n name = EVENT_PAYMENT_OVERDUE_CREATOR\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n users = self.kwargs[\"users\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"users\": users,\n \"name\": self.user.full_name,\n \"id\": event.id,\n },\n subject=f\"F\u00f8lgende registrerte har ikke betalt p\u00e5meldingen til arrangementet\"\n f\" {event.title}\",\n plain_template=\"events/email/payment_overdue_author.txt\",\n html_template=\"events/email/payment_overdue_author.html\",\n )\n\n\nclass EventAdminRegistrationNotification(Notification):\n\n name = EVENT_ADMIN_REGISTRATION\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n reason = self.kwargs[\"reason\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"name\": self.user.full_name,\n \"reason\": reason,\n \"id\": event.id,\n },\n subject=f\"Du har blitt adminp\u00e5meldt p\u00e5 arrangementet {event.title}\",\n plain_template=\"events/email/admin_registration.txt\",\n html_template=\"events/email/admin_registration.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/admin_registration.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n\n\nclass EventAdminUnregistrationNotification(Notification):\n\n name = EVENT_ADMIN_UNREGISTRATION\n\n def generate_mail(self):\n event = self.kwargs[\"event\"]\n creator = self.kwargs[\"creator\"]\n reason = self.kwargs[\"reason\"]\n\n return self._delay_mail(\n to_email=self.user.email,\n context={\n \"event\": event.title,\n \"creator_name\": creator.full_name,\n \"creator_email\": creator.email,\n \"name\": self.user.full_name,\n \"reason\": reason,\n \"id\": event.id,\n },\n subject=f\"Du har blitt fjernet fra arrangementet {event.title}\",\n plain_template=\"events/email/admin_unregistration.txt\",\n html_template=\"events/email/admin_unregistration.html\",\n )\n\n def generate_push(self):\n event = self.kwargs[\"event\"]\n\n return self._delay_push(\n template=\"events/push/admin_unregistration.txt\",\n context={\"event\": event.title},\n instance=event,\n )\n", "path": "lego/apps/events/notifications.py"}]}
1704
251
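The fix converts the UTC-stored `payment_due_date` to `Europe/Oslo` before rendering, then formats it with a Norwegian-style pattern. The same conversion with plain `pytz`/`datetime`, outside Django (`django.utils.timezone.localtime` performs the equivalent `astimezone` call); the sample date is illustrative:

```python
from datetime import datetime

import pytz

# Dates are stored in UTC; the email template is Norwegian.
due_date_utc = datetime(2017, 12, 19, 22, 30, tzinfo=pytz.utc)

oslo = pytz.timezone("Europe/Oslo")
local = due_date_utc.astimezone(oslo)          # CET in December: UTC+1

print(local.strftime("%d.%m.%y, kl. %H:%M"))   # -> 19.12.17, kl. 23:30
```

Formatting only after the conversion matters: calling `strftime` on the raw UTC value would render a time one hour off in winter and two hours off in summer (CEST) for Oslo readers.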
gh_patches_debug_30420
rasdani/github-patches
git_diff
PrefectHQ__prefect-347
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Add back `environment.yml` file I realized why we might want to maintain an `environment.yml` file in parallel with our `requirements.txt` file: `requirements.txt` will be installed via `pip`, whereas if you create an environment via `conda`, the packages will be installed / maintained via `conda`. This can be useful for those who try to `conda install` everything (since it has different package version logic + handles non-python dependencies). --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 # Licensed under LICENSE.md; also available at https://www.prefect.io/licenses/alpha-eula 2 3 from setuptools import find_packages, setup 4 5 import sys 6 import versioneer 7 8 install_requires = [ 9 "click >= 6.7, < 7.0", 10 "cloudpickle >= 0.6.0", 11 "croniter >= 0.3.23, < 0.4", 12 "cryptography >= 2.2.2, < 3.0", 13 "dask >= 0.18, < 0.19", 14 "distributed >= 1.21.8, < 2.0", 15 "docker >= 3.4.1, < 3.5", 16 "marshmallow == 3.0.0b19", 17 "marshmallow-oneofschema >= 2.0.0b2, < 3.0", 18 "mypy >= 0.600, < 0.700", 19 "mypy_extensions >= 0.4.0, < 0.5", 20 "pendulum >= 2.0.4, < 3.0", 21 "python-dateutil >= 2.7.3, < 3.0", 22 "requests >= 2.20, < 3.0", 23 "toml >= 0.9.4, < 1.0", 24 "typing >= 3.6.4, < 4.0", 25 "typing_extensions >= 3.6.4, < 4.0", 26 "xxhash >= 1.2.0, < 2.0", 27 ] 28 29 templates = ["jinja2 >= 2.0, < 3.0"] 30 viz = ["bokeh == 0.13.0", "graphviz >= 0.8.3"] 31 dev = [ 32 "pre-commit", 33 "pytest >= 3.8, < 4.0", 34 "pytest-cov", 35 "pytest-env", 36 "pytest-xdist", 37 "Pygments == 2.2.0", 38 ] 39 40 if sys.version_info >= (3, 6): 41 dev += ["black"] 42 43 extras = { 44 "dev": dev + viz, 45 "viz": viz, 46 "templates": templates, 47 "all_extras": dev + templates + viz, 48 } 49 50 setup( 51 name="prefect", 52 version=versioneer.get_version(), 53 cmdclass=versioneer.get_cmdclass(), 54 description="", 55 long_description=open("README.md").read(), 56 url="https://www.github.com/prefecthq/prefect", 57 author="Prefect Technologies, Inc.", 58 author_email="[email protected]", 59 install_requires=install_requires, 60 extras_require=extras, 61 scripts=[], 62 packages=find_packages(where="src"), 63 package_dir={"": "src"}, 64 include_package_data=True, 65 entry_points={"console_scripts": ["prefect=prefect.cli:cli"]}, 66 ) 67 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -2,51 +2,39 @@ from setuptools import find_packages, setup +import configparser import sys import versioneer -install_requires = [ - "click >= 6.7, < 7.0", - "cloudpickle >= 0.6.0", - "croniter >= 0.3.23, < 0.4", - "cryptography >= 2.2.2, < 3.0", - "dask >= 0.18, < 0.19", - "distributed >= 1.21.8, < 2.0", - "docker >= 3.4.1, < 3.5", - "marshmallow == 3.0.0b19", - "marshmallow-oneofschema >= 2.0.0b2, < 3.0", - "mypy >= 0.600, < 0.700", - "mypy_extensions >= 0.4.0, < 0.5", - "pendulum >= 2.0.4, < 3.0", - "python-dateutil >= 2.7.3, < 3.0", - "requests >= 2.20, < 3.0", - "toml >= 0.9.4, < 1.0", - "typing >= 3.6.4, < 4.0", - "typing_extensions >= 3.6.4, < 4.0", - "xxhash >= 1.2.0, < 2.0", -] +config = configparser.ConfigParser() +config.read("requirements.ini") -templates = ["jinja2 >= 2.0, < 3.0"] -viz = ["bokeh == 0.13.0", "graphviz >= 0.8.3"] -dev = [ - "pre-commit", - "pytest >= 3.8, < 4.0", - "pytest-cov", - "pytest-env", - "pytest-xdist", - "Pygments == 2.2.0", -] +## base requirements +install_requires = ["".join(req) for req in config["base"].items()] -if sys.version_info >= (3, 6): - dev += ["black"] +## section dependencies +includes = {} +for section in config.sections(): + includes[section] = config[section].pop("include", "").split(",") extras = { - "dev": dev + viz, - "viz": viz, - "templates": templates, - "all_extras": dev + templates + viz, + "dev": ["".join(req) for req in config["dev"].items()], + "viz": ["".join(req) for req in config["viz"].items()], + "templates": ["".join(req) for req in config["templates"].items()], } +## process include keyword for related sections +for section in extras: + for other in includes[section]: + extras[section] += extras.get(other.strip(), []) + + +if sys.version_info >= (3, 6): + extras["dev"] += ["black"] + +extras["all_extras"] = extras["dev"] + extras["viz"] + extras["templates"] + + setup( name="prefect", version=versioneer.get_version(),
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -2,51 +2,39 @@\n \n from setuptools import find_packages, setup\n \n+import configparser\n import sys\n import versioneer\n \n-install_requires = [\n- \"click >= 6.7, < 7.0\",\n- \"cloudpickle >= 0.6.0\",\n- \"croniter >= 0.3.23, < 0.4\",\n- \"cryptography >= 2.2.2, < 3.0\",\n- \"dask >= 0.18, < 0.19\",\n- \"distributed >= 1.21.8, < 2.0\",\n- \"docker >= 3.4.1, < 3.5\",\n- \"marshmallow == 3.0.0b19\",\n- \"marshmallow-oneofschema >= 2.0.0b2, < 3.0\",\n- \"mypy >= 0.600, < 0.700\",\n- \"mypy_extensions >= 0.4.0, < 0.5\",\n- \"pendulum >= 2.0.4, < 3.0\",\n- \"python-dateutil >= 2.7.3, < 3.0\",\n- \"requests >= 2.20, < 3.0\",\n- \"toml >= 0.9.4, < 1.0\",\n- \"typing >= 3.6.4, < 4.0\",\n- \"typing_extensions >= 3.6.4, < 4.0\",\n- \"xxhash >= 1.2.0, < 2.0\",\n-]\n+config = configparser.ConfigParser()\n+config.read(\"requirements.ini\")\n \n-templates = [\"jinja2 >= 2.0, < 3.0\"]\n-viz = [\"bokeh == 0.13.0\", \"graphviz >= 0.8.3\"]\n-dev = [\n- \"pre-commit\",\n- \"pytest >= 3.8, < 4.0\",\n- \"pytest-cov\",\n- \"pytest-env\",\n- \"pytest-xdist\",\n- \"Pygments == 2.2.0\",\n-]\n+## base requirements\n+install_requires = [\"\".join(req) for req in config[\"base\"].items()]\n \n-if sys.version_info >= (3, 6):\n- dev += [\"black\"]\n+## section dependencies\n+includes = {}\n+for section in config.sections():\n+ includes[section] = config[section].pop(\"include\", \"\").split(\",\")\n \n extras = {\n- \"dev\": dev + viz,\n- \"viz\": viz,\n- \"templates\": templates,\n- \"all_extras\": dev + templates + viz,\n+ \"dev\": [\"\".join(req) for req in config[\"dev\"].items()],\n+ \"viz\": [\"\".join(req) for req in config[\"viz\"].items()],\n+ \"templates\": [\"\".join(req) for req in config[\"templates\"].items()],\n }\n \n+## process include keyword for related sections\n+for section in extras:\n+ for other in includes[section]:\n+ extras[section] += extras.get(other.strip(), [])\n+\n+\n+if sys.version_info >= (3, 6):\n+ extras[\"dev\"] += [\"black\"]\n+\n+extras[\"all_extras\"] = extras[\"dev\"] + extras[\"viz\"] + extras[\"templates\"]\n+\n+\n setup(\n name=\"prefect\",\n version=versioneer.get_version(),\n", "issue": "Add back `environment.yml` file\nI realized why we might want to maintain an `environment.yml` file in parallel with our `requirements.txt` file: `requirements.txt` will be installed via `pip`, whereas if you create an environment via `conda`, the packages will be installed / maintained via `conda`. 
This can be useful for those who try to `conda install` everything (since it has different package version logic + handles non-python dependencies).\n", "before_files": [{"content": "# Licensed under LICENSE.md; also available at https://www.prefect.io/licenses/alpha-eula\n\nfrom setuptools import find_packages, setup\n\nimport sys\nimport versioneer\n\ninstall_requires = [\n \"click >= 6.7, < 7.0\",\n \"cloudpickle >= 0.6.0\",\n \"croniter >= 0.3.23, < 0.4\",\n \"cryptography >= 2.2.2, < 3.0\",\n \"dask >= 0.18, < 0.19\",\n \"distributed >= 1.21.8, < 2.0\",\n \"docker >= 3.4.1, < 3.5\",\n \"marshmallow == 3.0.0b19\",\n \"marshmallow-oneofschema >= 2.0.0b2, < 3.0\",\n \"mypy >= 0.600, < 0.700\",\n \"mypy_extensions >= 0.4.0, < 0.5\",\n \"pendulum >= 2.0.4, < 3.0\",\n \"python-dateutil >= 2.7.3, < 3.0\",\n \"requests >= 2.20, < 3.0\",\n \"toml >= 0.9.4, < 1.0\",\n \"typing >= 3.6.4, < 4.0\",\n \"typing_extensions >= 3.6.4, < 4.0\",\n \"xxhash >= 1.2.0, < 2.0\",\n]\n\ntemplates = [\"jinja2 >= 2.0, < 3.0\"]\nviz = [\"bokeh == 0.13.0\", \"graphviz >= 0.8.3\"]\ndev = [\n \"pre-commit\",\n \"pytest >= 3.8, < 4.0\",\n \"pytest-cov\",\n \"pytest-env\",\n \"pytest-xdist\",\n \"Pygments == 2.2.0\",\n]\n\nif sys.version_info >= (3, 6):\n dev += [\"black\"]\n\nextras = {\n \"dev\": dev + viz,\n \"viz\": viz,\n \"templates\": templates,\n \"all_extras\": dev + templates + viz,\n}\n\nsetup(\n name=\"prefect\",\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n description=\"\",\n long_description=open(\"README.md\").read(),\n url=\"https://www.github.com/prefecthq/prefect\",\n author=\"Prefect Technologies, Inc.\",\n author_email=\"[email protected]\",\n install_requires=install_requires,\n extras_require=extras,\n scripts=[],\n packages=find_packages(where=\"src\"),\n package_dir={\"\": \"src\"},\n include_package_data=True,\n entry_points={\"console_scripts\": [\"prefect=prefect.cli:cli\"]},\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Licensed under LICENSE.md; also available at https://www.prefect.io/licenses/alpha-eula\n\nfrom setuptools import find_packages, setup\n\nimport configparser\nimport sys\nimport versioneer\n\nconfig = configparser.ConfigParser()\nconfig.read(\"requirements.ini\")\n\n## base requirements\ninstall_requires = [\"\".join(req) for req in config[\"base\"].items()]\n\n## section dependencies\nincludes = {}\nfor section in config.sections():\n includes[section] = config[section].pop(\"include\", \"\").split(\",\")\n\nextras = {\n \"dev\": [\"\".join(req) for req in config[\"dev\"].items()],\n \"viz\": [\"\".join(req) for req in config[\"viz\"].items()],\n \"templates\": [\"\".join(req) for req in config[\"templates\"].items()],\n}\n\n## process include keyword for related sections\nfor section in extras:\n for other in includes[section]:\n extras[section] += extras.get(other.strip(), [])\n\n\nif sys.version_info >= (3, 6):\n extras[\"dev\"] += [\"black\"]\n\nextras[\"all_extras\"] = extras[\"dev\"] + extras[\"viz\"] + extras[\"templates\"]\n\n\nsetup(\n name=\"prefect\",\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n description=\"\",\n long_description=open(\"README.md\").read(),\n url=\"https://www.github.com/prefecthq/prefect\",\n author=\"Prefect Technologies, Inc.\",\n author_email=\"[email protected]\",\n install_requires=install_requires,\n extras_require=extras,\n scripts=[],\n packages=find_packages(where=\"src\"),\n package_dir={\"\": \"src\"},\n include_package_data=True,\n 
entry_points={\"console_scripts\": [\"prefect=prefect.cli:cli\"]},\n)\n", "path": "setup.py"}]}
1,122
787
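A note on the prefect record above: the patched `setup.py` reads its dependency sections from a `requirements.ini` that the record never shows. The sketch below is a hypothetical minimal version of that file — the section names (`base`, `dev`, `viz`, `templates`) and the `include` key come from the patch itself, but the individual package pins are invented — run standalone to see how the `include` merging behaves.

```python
import configparser

# Hypothetical requirements.ini; only the section names and the `include`
# key are taken from the patch -- the pins themselves are illustrative.
INI = """\
[base]
click = >= 6.7, < 7.0
requests = >= 2.20, < 3.0

[viz]
graphviz = >= 0.8.3

[templates]
jinja2 = >= 2.0, < 3.0

[dev]
include = viz
pytest = >= 3.8, < 4.0
"""

config = configparser.ConfigParser()
config.read_string(INI)

# Same construction as the patch: each (key, value) pair is concatenated.
# Note configparser lowercases option names; pip specs are case-insensitive.
install_requires = ["".join(req) for req in config["base"].items()]

includes = {}
for section in config.sections():
    includes[section] = config[section].pop("include", "").split(",")

extras = {s: ["".join(req) for req in config[s].items()]
          for s in ("dev", "viz", "templates")}
for section in extras:
    for other in includes[section]:
        extras[section] += extras.get(other.strip(), [])

print(install_requires)  # ['click>= 6.7, < 7.0', 'requests>= 2.20, < 3.0']
print(extras["dev"])     # dev pins followed by the included viz pins
```

One consequence of the `include` design worth noting: resolution is a single pass, so an `include` chain longer than one hop only picks up whatever the included section holds at that point in iteration order.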
gh_patches_debug_13957
rasdani/github-patches
git_diff
opendatacube__datacube-core-680
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Errors when running against the impending sqlalchemy 1.3 release (in beta) Originally reported in #667 Datacube consistently fails when run against the current beta version of sqlalchemy. [According to](https://www.sqlalchemy.org/blog/2019/02/08/sqlalchemy-1.3.0b3-released/) the sqlalchemy devs this release "1.3b3 should hopefully be the last beta release for 1.3, as no additional major changes are planned." This isn't currently a problem, but it will break all of our builds and guides if not resolved before 1.3 is declared stable. Manually reproduce the error with: ``` pip install sqlalchemy==1.3b3 datacube system init ``` - Either the sqlalchemy 1.3 beta has a bug, which we should report to them. - Or our own code is doing something incorrect and we should fix it before 1.3 is declared stable. Errors when running against the impending sqlalchemy 1.3 release (in beta) Originally reported in #667 Datacube consistently fails when run against the current beta version of sqlalchemy. [According to](https://www.sqlalchemy.org/blog/2019/02/08/sqlalchemy-1.3.0b3-released/) the sqlalchemy devs this release "1.3b3 should hopefully be the last beta release for 1.3, as no additional major changes are planned." This isn't currently a problem, but it will break all of our builds and guides if not resolved before 1.3 is declared stable. Manually reproduce the error with: ``` pip install sqlalchemy==1.3b3 datacube system init ``` - Either the sqlalchemy 1.3 beta has a bug, which we should report to them. - Or our own code is doing something incorrect and we should fix it before 1.3 is declared stable. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `datacube/drivers/postgres/sql.py` Content: ``` 1 # coding=utf-8 2 """ 3 Custom types for postgres & sqlalchemy 4 """ 5 6 from sqlalchemy import TIMESTAMP 7 from sqlalchemy.dialects.postgresql.ranges import RangeOperators 8 from sqlalchemy.ext.compiler import compiles 9 from sqlalchemy.sql import sqltypes 10 from sqlalchemy.sql.expression import Executable, ClauseElement 11 from sqlalchemy.sql.functions import GenericFunction 12 13 SCHEMA_NAME = 'agdc' 14 15 16 class CreateView(Executable, ClauseElement): 17 def __init__(self, name, select): 18 self.name = name 19 self.select = select 20 21 22 @compiles(CreateView) 23 def visit_create_view(element, compiler, **kw): 24 return "CREATE VIEW %s AS %s" % ( 25 element.name, 26 compiler.process(element.select, literal_binds=True) 27 ) 28 29 30 TYPES_INIT_SQL = """ 31 create or replace function {schema}.common_timestamp(text) 32 returns timestamp with time zone as $$ 33 select ($1)::timestamp at time zone 'utc'; 34 $$ language sql immutable returns null on null input; 35 36 create type {schema}.float8range as range ( 37 subtype = float8, 38 subtype_diff = float8mi 39 ); 40 """.format(schema=SCHEMA_NAME) 41 42 43 # pylint: disable=abstract-method 44 class FLOAT8RANGE(RangeOperators, sqltypes.TypeEngine): 45 __visit_name__ = 'FLOAT8RANGE' 46 47 48 @compiles(FLOAT8RANGE) 49 def visit_float8range(element, compiler, **kw): 50 return "FLOAT8RANGE" 51 52 53 # Register the function with SQLAlchemhy. 
54 # pylint: disable=too-many-ancestors 55 class CommonTimestamp(GenericFunction): 56 type = TIMESTAMP(timezone=True) 57 package = 'agdc' 58 identifier = 'common_timestamp' 59 60 name = '%s.common_timestamp' % SCHEMA_NAME 61 62 63 # pylint: disable=too-many-ancestors 64 class Float8Range(GenericFunction): 65 type = FLOAT8RANGE 66 package = 'agdc' 67 identifier = 'float8range' 68 69 name = '%s.float8range' % SCHEMA_NAME 70 71 72 class PGNAME(sqltypes.Text): 73 """Postgres 'NAME' type.""" 74 __visit_name__ = 'NAME' 75 76 77 @compiles(PGNAME) 78 def visit_name(element, compiler, **kw): 79 return "NAME" 80 81 82 def pg_exists(conn, name): 83 """ 84 Does a postgres object exist? 85 :rtype bool 86 """ 87 return conn.execute("SELECT to_regclass(%s)", name).scalar() is not None 88 89 90 def pg_column_exists(conn, table, column): 91 """ 92 Does a postgres object exist? 93 :rtype bool 94 """ 95 return conn.execute(""" 96 SELECT 1 FROM pg_attribute 97 WHERE attrelid = to_regclass(%s) 98 AND attname = %s 99 AND NOT attisdropped 100 """, table, column).scalar() is not None 101 102 103 def escape_pg_identifier(engine, name): 104 """ 105 Escape identifiers (tables, fields, roles, etc) for inclusion in SQL statements. 106 107 psycopg2 can safely merge query arguments, but cannot do the same for dynamically 108 generating queries. 109 110 See http://initd.org/psycopg/docs/sql.html for more information. 111 """ 112 # New (2.7+) versions of psycopg2 have function: extensions.quote_ident() 113 # But it's too bleeding edge right now. We'll ask the server to escape instead, as 114 # these are not performance sensitive. 115 return engine.execute("select quote_ident(%s)", name).scalar() 116 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/datacube/drivers/postgres/sql.py b/datacube/drivers/postgres/sql.py --- a/datacube/drivers/postgres/sql.py +++ b/datacube/drivers/postgres/sql.py @@ -57,7 +57,11 @@ package = 'agdc' identifier = 'common_timestamp' - name = '%s.common_timestamp' % SCHEMA_NAME + name = 'common_timestamp' + + def __init__(self, *args, **kwargs): + super().__init__(*args, **kwargs) + self.packagenames = ['%s' % SCHEMA_NAME] # pylint: disable=too-many-ancestors @@ -66,7 +70,11 @@ package = 'agdc' identifier = 'float8range' - name = '%s.float8range' % SCHEMA_NAME + name = 'float8range' + + def __init__(self, *args, **kwargs): + super().__init__(*args, **kwargs) + self.packagenames = ['%s' % SCHEMA_NAME] class PGNAME(sqltypes.Text):
{"golden_diff": "diff --git a/datacube/drivers/postgres/sql.py b/datacube/drivers/postgres/sql.py\n--- a/datacube/drivers/postgres/sql.py\n+++ b/datacube/drivers/postgres/sql.py\n@@ -57,7 +57,11 @@\n package = 'agdc'\n identifier = 'common_timestamp'\n \n- name = '%s.common_timestamp' % SCHEMA_NAME\n+ name = 'common_timestamp'\n+\n+ def __init__(self, *args, **kwargs):\n+ super().__init__(*args, **kwargs)\n+ self.packagenames = ['%s' % SCHEMA_NAME]\n \n \n # pylint: disable=too-many-ancestors\n@@ -66,7 +70,11 @@\n package = 'agdc'\n identifier = 'float8range'\n \n- name = '%s.float8range' % SCHEMA_NAME\n+ name = 'float8range'\n+\n+ def __init__(self, *args, **kwargs):\n+ super().__init__(*args, **kwargs)\n+ self.packagenames = ['%s' % SCHEMA_NAME]\n \n \n class PGNAME(sqltypes.Text):\n", "issue": "Errors when running against the impending sqlalchemy 1.3 release (in beta)\nOriginally reported in #667\r\n\r\nDatacube consistently fails when run against the current beta version of sqlalchemy. [According to](https://www.sqlalchemy.org/blog/2019/02/08/sqlalchemy-1.3.0b3-released/) the sqlalchemy devs this release \"1.3b3 should hopefully be the last beta release for 1.3, as no additional major changes are planned.\"\r\n\r\nThis isn't currently a problem, but it will break all of our builds and guides if not resolved before 1.3 is declared stable.\r\n\r\nManually reproduce the error with:\r\n\r\n```\r\n pip install sqlalchemy==1.3b3\r\n datacube system init\r\n```\r\n\r\n- Either the sqlalchemy 1.3 beta has a bug, which we should report to them.\r\n- Or our own code is doing something incorrect and we should fix it before 1.3 is declared stable.\r\n\nErrors when running against the impending sqlalchemy 1.3 release (in beta)\nOriginally reported in #667\r\n\r\nDatacube consistently fails when run against the current beta version of sqlalchemy. 
[According to](https://www.sqlalchemy.org/blog/2019/02/08/sqlalchemy-1.3.0b3-released/) the sqlalchemy devs this release \"1.3b3 should hopefully be the last beta release for 1.3, as no additional major changes are planned.\"\r\n\r\nThis isn't currently a problem, but it will break all of our builds and guides if not resolved before 1.3 is declared stable.\r\n\r\nManually reproduce the error with:\r\n\r\n```\r\n pip install sqlalchemy==1.3b3\r\n datacube system init\r\n```\r\n\r\n- Either the sqlalchemy 1.3 beta has a bug, which we should report to them.\r\n- Or our own code is doing something incorrect and we should fix it before 1.3 is declared stable.\r\n\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"\nCustom types for postgres & sqlalchemy\n\"\"\"\n\nfrom sqlalchemy import TIMESTAMP\nfrom sqlalchemy.dialects.postgresql.ranges import RangeOperators\nfrom sqlalchemy.ext.compiler import compiles\nfrom sqlalchemy.sql import sqltypes\nfrom sqlalchemy.sql.expression import Executable, ClauseElement\nfrom sqlalchemy.sql.functions import GenericFunction\n\nSCHEMA_NAME = 'agdc'\n\n\nclass CreateView(Executable, ClauseElement):\n def __init__(self, name, select):\n self.name = name\n self.select = select\n\n\n@compiles(CreateView)\ndef visit_create_view(element, compiler, **kw):\n return \"CREATE VIEW %s AS %s\" % (\n element.name,\n compiler.process(element.select, literal_binds=True)\n )\n\n\nTYPES_INIT_SQL = \"\"\"\ncreate or replace function {schema}.common_timestamp(text)\nreturns timestamp with time zone as $$\nselect ($1)::timestamp at time zone 'utc';\n$$ language sql immutable returns null on null input;\n\ncreate type {schema}.float8range as range (\n subtype = float8,\n subtype_diff = float8mi\n);\n\"\"\".format(schema=SCHEMA_NAME)\n\n\n# pylint: disable=abstract-method\nclass FLOAT8RANGE(RangeOperators, sqltypes.TypeEngine):\n __visit_name__ = 'FLOAT8RANGE'\n\n\n@compiles(FLOAT8RANGE)\ndef visit_float8range(element, compiler, **kw):\n return \"FLOAT8RANGE\"\n\n\n# Register the function with SQLAlchemhy.\n# pylint: disable=too-many-ancestors\nclass CommonTimestamp(GenericFunction):\n type = TIMESTAMP(timezone=True)\n package = 'agdc'\n identifier = 'common_timestamp'\n\n name = '%s.common_timestamp' % SCHEMA_NAME\n\n\n# pylint: disable=too-many-ancestors\nclass Float8Range(GenericFunction):\n type = FLOAT8RANGE\n package = 'agdc'\n identifier = 'float8range'\n\n name = '%s.float8range' % SCHEMA_NAME\n\n\nclass PGNAME(sqltypes.Text):\n \"\"\"Postgres 'NAME' type.\"\"\"\n __visit_name__ = 'NAME'\n\n\n@compiles(PGNAME)\ndef visit_name(element, compiler, **kw):\n return \"NAME\"\n\n\ndef pg_exists(conn, name):\n \"\"\"\n Does a postgres object exist?\n :rtype bool\n \"\"\"\n return conn.execute(\"SELECT to_regclass(%s)\", name).scalar() is not None\n\n\ndef pg_column_exists(conn, table, column):\n \"\"\"\n Does a postgres object exist?\n :rtype bool\n \"\"\"\n return conn.execute(\"\"\"\n SELECT 1 FROM pg_attribute\n WHERE attrelid = to_regclass(%s)\n AND attname = %s\n AND NOT attisdropped\n \"\"\", table, column).scalar() is not None\n\n\ndef escape_pg_identifier(engine, name):\n \"\"\"\n Escape identifiers (tables, fields, roles, etc) for inclusion in SQL statements.\n\n psycopg2 can safely merge query arguments, but cannot do the same for dynamically\n generating queries.\n\n See http://initd.org/psycopg/docs/sql.html for more information.\n \"\"\"\n # New (2.7+) versions of psycopg2 have function: extensions.quote_ident()\n # But it's too bleeding edge right now. 
We'll ask the server to escape instead, as\n # these are not performance sensitive.\n return engine.execute(\"select quote_ident(%s)\", name).scalar()\n", "path": "datacube/drivers/postgres/sql.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"\nCustom types for postgres & sqlalchemy\n\"\"\"\n\nfrom sqlalchemy import TIMESTAMP\nfrom sqlalchemy.dialects.postgresql.ranges import RangeOperators\nfrom sqlalchemy.ext.compiler import compiles\nfrom sqlalchemy.sql import sqltypes\nfrom sqlalchemy.sql.expression import Executable, ClauseElement\nfrom sqlalchemy.sql.functions import GenericFunction\n\nSCHEMA_NAME = 'agdc'\n\n\nclass CreateView(Executable, ClauseElement):\n def __init__(self, name, select):\n self.name = name\n self.select = select\n\n\n@compiles(CreateView)\ndef visit_create_view(element, compiler, **kw):\n return \"CREATE VIEW %s AS %s\" % (\n element.name,\n compiler.process(element.select, literal_binds=True)\n )\n\n\nTYPES_INIT_SQL = \"\"\"\ncreate or replace function {schema}.common_timestamp(text)\nreturns timestamp with time zone as $$\nselect ($1)::timestamp at time zone 'utc';\n$$ language sql immutable returns null on null input;\n\ncreate type {schema}.float8range as range (\n subtype = float8,\n subtype_diff = float8mi\n);\n\"\"\".format(schema=SCHEMA_NAME)\n\n\n# pylint: disable=abstract-method\nclass FLOAT8RANGE(RangeOperators, sqltypes.TypeEngine):\n __visit_name__ = 'FLOAT8RANGE'\n\n\n@compiles(FLOAT8RANGE)\ndef visit_float8range(element, compiler, **kw):\n return \"FLOAT8RANGE\"\n\n\n# Register the function with SQLAlchemhy.\n# pylint: disable=too-many-ancestors\nclass CommonTimestamp(GenericFunction):\n type = TIMESTAMP(timezone=True)\n package = 'agdc'\n identifier = 'common_timestamp'\n\n name = 'common_timestamp'\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.packagenames = ['%s' % SCHEMA_NAME]\n\n\n# pylint: disable=too-many-ancestors\nclass Float8Range(GenericFunction):\n type = FLOAT8RANGE\n package = 'agdc'\n identifier = 'float8range'\n\n name = 'float8range'\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.packagenames = ['%s' % SCHEMA_NAME]\n\n\nclass PGNAME(sqltypes.Text):\n \"\"\"Postgres 'NAME' type.\"\"\"\n __visit_name__ = 'NAME'\n\n\n@compiles(PGNAME)\ndef visit_name(element, compiler, **kw):\n return \"NAME\"\n\n\ndef pg_exists(conn, name):\n \"\"\"\n Does a postgres object exist?\n :rtype bool\n \"\"\"\n return conn.execute(\"SELECT to_regclass(%s)\", name).scalar() is not None\n\n\ndef pg_column_exists(conn, table, column):\n \"\"\"\n Does a postgres object exist?\n :rtype bool\n \"\"\"\n return conn.execute(\"\"\"\n SELECT 1 FROM pg_attribute\n WHERE attrelid = to_regclass(%s)\n AND attname = %s\n AND NOT attisdropped\n \"\"\", table, column).scalar() is not None\n\n\ndef escape_pg_identifier(engine, name):\n \"\"\"\n Escape identifiers (tables, fields, roles, etc) for inclusion in SQL statements.\n\n psycopg2 can safely merge query arguments, but cannot do the same for dynamically\n generating queries.\n\n See http://initd.org/psycopg/docs/sql.html for more information.\n \"\"\"\n # New (2.7+) versions of psycopg2 have function: extensions.quote_ident()\n # But it's too bleeding edge right now. We'll ask the server to escape instead, as\n # these are not performance sensitive.\n return engine.execute(\"select quote_ident(%s)\", name).scalar()\n", "path": "datacube/drivers/postgres/sql.py"}]}
1,666
248
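The datacube fix above works because SQLAlchemy treats `packagenames` as a schema path whose elements are quoted one by one, whereas a dotted `name` started being quoted as a single identifier in the 1.3 line. A minimal standalone sketch of the fixed registration pattern — only the public `GenericFunction` API is assumed, and the printed SQL is approximate:

```python
from sqlalchemy import TIMESTAMP, literal
from sqlalchemy.sql.functions import GenericFunction

class CommonTimestamp(GenericFunction):
    """Sketch of the fixed registration, trimmed from the datacube class."""
    type = TIMESTAMP(timezone=True)
    name = "common_timestamp"         # bare name: quoted on its own
    inherit_cache = True              # silences a 1.4+ caching warning

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Schema carried as a separate path element, per the golden diff.
        self.packagenames = ["agdc"]

expr = CommonTimestamp(literal("2019-02-08"))
print(expr)  # roughly: agdc.common_timestamp(:param_1)
```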
gh_patches_debug_38519
rasdani/github-patches
git_diff
alltheplaces__alltheplaces-3343
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Spider lees_famous_recipe is broken During the global build at 2021-10-20-14-42-48, spider **lees_famous_recipe** failed with **0 features** and **130 errors**. Here's [the log](https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/logs/lees_famous_recipe.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/output/lees_famous_recipe.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/output/lees_famous_recipe.geojson)) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `locations/spiders/lees_famous_recipe.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 import scrapy 3 from locations.items import GeojsonPointItem 4 import re 5 6 daysKey = { 7 'Monday': 'Mo', 'Tuesday': 'Tu', 'Wednesday': 'We', 'Thursday': 'Th', 8 'Friday': 'Fr', 'Saturday': 'Sa', 'Sunday': 'Su' 9 } 10 11 12 class LeesFamousRecipeSpider(scrapy.Spider): 13 name = "lees_famous_recipe" 14 item_attributes = { 'brand': "Lee's Famous Recipe Chicken" } 15 allowed_domains = ["www.leesfamousrecipe.com"] 16 start_urls = ( 17 'https://www.leesfamousrecipe.com/locations/all', 18 ) 19 20 def parse_phone(self, phone): 21 phone = phone.replace('.','') 22 phone = phone.replace(')','') 23 phone = phone.replace('(','') 24 phone = phone.replace('_','') 25 phone = phone.replace('-','') 26 phone = phone.replace('+','') 27 phone = phone.replace(' ','') 28 return phone 29 30 def store_hours(self, hours): 31 try: 32 days = hours.split(': ')[0].strip() 33 if('-' in days): 34 startDay = daysKey[days.split('-')[0]] 35 endDay = daysKey[days.split('-')[1]] 36 dayOutput = startDay + "-" + endDay 37 else: 38 dayOutput = daysKey[days] 39 40 bothHours = hours.split(': ')[1].replace(' ','') 41 openHours = bothHours.split("-")[0] 42 closeHours = bothHours.split("-")[1] 43 44 if("am" in openHours): 45 openHours = openHours.replace("am","") 46 if(":" in openHours): 47 openH = openHours.split(":")[0] 48 openM = openHours.split(":")[1] 49 else: 50 openH = openHours 51 openM = "00" 52 openHours = openH + ":" + openM 53 54 if("pm" in openHours): 55 openHours = openHours.replace("pm","") 56 if(":" in openHours): 57 openH = openHours.split(":")[0] 58 openM = openHours.split(":")[1] 59 else: 60 openH = openHours 61 openM = "00" 62 openH = str(int(openH) + 12) 63 openHours = openH + ":" + openM 64 65 if("am" in closeHours): 66 closeHours = closeHours.replace("am","") 67 if(":" in closeHours): 68 closeH = closeHours.split(":")[0] 69 closeM = closeHours.split(":")[1] 70 else: 71 closeH = closeHours 72 closeM = "00" 73 closeHours = closeH + ":" + closeM 74 75 if("pm" in closeHours): 76 closeHours = closeHours.replace("pm","") 77 if(":" in closeHours): 78 closeH = closeHours.split(":")[0] 79 closeM = closeHours.split(":")[1] 80 else: 81 closeH = closeHours 82 closeM = "00" 83 closeH = str(int(closeH) + 12) 84 closeHours = closeH + ":" + closeM 85 return dayOutput +' '+ openHours.replace(' ','') + "-" + closeHours + ';' 86 except KeyError: 87 return "" 88 89 def parse(self, response): 90 if("https://www.leesfamousrecipe.com/locations/all" == response.url): 91 for match in response.xpath("//div[contains(@class,'field-content')]/a/@href"): 92 request = scrapy.Request(match.extract()) 93 yield request 94 else: 95 nameString = 
response.xpath("//h1[@class='node-title']/text()").extract_first().strip() 96 shortString = response.xpath("//h1[@class='node-title']/small/text()").extract_first() 97 if shortString is None: 98 shortString = "" 99 nameString = nameString + " " + shortString 100 nameString = nameString.strip() 101 102 scriptBody = response.xpath("//script[@type='text/javascript' and contains(.,'latitude')]/text()").extract_first() 103 latString = re.findall("latitude\":\"(.*?)\"", scriptBody)[0] 104 lonString = re.findall("longitude\":\"(.*?)\"", scriptBody)[0] 105 106 openingHoursString = "" 107 firstHourBlock = response.xpath("//div[contains(@class,'field-name-field-hours-summer')]/div/div/p/br/parent::p/text()") 108 for hourLine in firstHourBlock: 109 openingHoursString = openingHoursString +' '+self.store_hours(hourLine.extract()) 110 openingHoursString = openingHoursString.strip(';').strip() 111 112 113 if("british-columbia" in response.url): 114 countryString = "CA" 115 stateString = "BC" 116 else: 117 countryString = "US" 118 mapUrl = response.xpath("//div[contains(@class,'map-link')]/div/a/@href").extract_first() 119 stateString = re.findall(r'(?<=\+)(.*?)(?=\+)', mapUrl)[len(re.findall(r'(?<=\+)(.*?)(?=\+)', mapUrl)) - 2].strip().replace('%2C','') 120 121 yield GeojsonPointItem( 122 ref=nameString, 123 addr_full=response.xpath("//div[@class='street-address']/text()").extract_first().strip(), 124 city=response.xpath("//div[@class='city-state-zip']/span[@class='locality']/text()").extract_first().strip(), 125 opening_hours=openingHoursString, 126 state=stateString, 127 postcode=response.xpath("//div[@class='city-state-zip']/span[@class='postal-code']/text()").extract_first().strip(), 128 phone=self.parse_phone(response.xpath("//div[contains(@class,'field-name-field-phone')]/div/div/text()").extract_first().strip()), 129 country = countryString, 130 lat=float(latString), 131 lon=float(lonString), 132 ) 133 134 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/locations/spiders/lees_famous_recipe.py b/locations/spiders/lees_famous_recipe.py --- a/locations/spiders/lees_famous_recipe.py +++ b/locations/spiders/lees_famous_recipe.py @@ -83,7 +83,7 @@ closeH = str(int(closeH) + 12) closeHours = closeH + ":" + closeM return dayOutput +' '+ openHours.replace(' ','') + "-" + closeHours + ';' - except KeyError: + except (KeyError, IndexError): return "" def parse(self, response): @@ -99,9 +99,8 @@ nameString = nameString + " " + shortString nameString = nameString.strip() - scriptBody = response.xpath("//script[@type='text/javascript' and contains(.,'latitude')]/text()").extract_first() - latString = re.findall("latitude\":\"(.*?)\"", scriptBody)[0] - lonString = re.findall("longitude\":\"(.*?)\"", scriptBody)[0] + googleMapSrc = response.xpath("//*[@id='block-system-main']/div/div/iframe").extract_first() + [latString, lonString] = re.findall("center=(.*?)\"", googleMapSrc)[0].split(',') openingHoursString = "" firstHourBlock = response.xpath("//div[contains(@class,'field-name-field-hours-summer')]/div/div/p/br/parent::p/text()") @@ -116,7 +115,7 @@ else: countryString = "US" mapUrl = response.xpath("//div[contains(@class,'map-link')]/div/a/@href").extract_first() - stateString = re.findall(r'(?<=\+)(.*?)(?=\+)', mapUrl)[len(re.findall(r'(?<=\+)(.*?)(?=\+)', mapUrl)) - 2].strip().replace('%2C','') + stateString = response.xpath("//div[contains(@class,'adr')]/div[2]/span[2]/text()").extract_first() yield GeojsonPointItem( ref=nameString, @@ -125,7 +124,7 @@ opening_hours=openingHoursString, state=stateString, postcode=response.xpath("//div[@class='city-state-zip']/span[@class='postal-code']/text()").extract_first().strip(), - phone=self.parse_phone(response.xpath("//div[contains(@class,'field-name-field-phone')]/div/div/text()").extract_first().strip()), + phone=self.parse_phone(response.xpath("//div[contains(@class,'adr')]/div[3]/text()").extract_first().strip()), country = countryString, lat=float(latString), lon=float(lonString),
{"golden_diff": "diff --git a/locations/spiders/lees_famous_recipe.py b/locations/spiders/lees_famous_recipe.py\n--- a/locations/spiders/lees_famous_recipe.py\n+++ b/locations/spiders/lees_famous_recipe.py\n@@ -83,7 +83,7 @@\n closeH = str(int(closeH) + 12)\n closeHours = closeH + \":\" + closeM\n return dayOutput +' '+ openHours.replace(' ','') + \"-\" + closeHours + ';'\n- except KeyError:\n+ except (KeyError, IndexError):\n return \"\"\n \n def parse(self, response):\n@@ -99,9 +99,8 @@\n nameString = nameString + \" \" + shortString\n nameString = nameString.strip()\n \n- scriptBody = response.xpath(\"//script[@type='text/javascript' and contains(.,'latitude')]/text()\").extract_first()\n- latString = re.findall(\"latitude\\\":\\\"(.*?)\\\"\", scriptBody)[0]\n- lonString = re.findall(\"longitude\\\":\\\"(.*?)\\\"\", scriptBody)[0]\n+ googleMapSrc = response.xpath(\"//*[@id='block-system-main']/div/div/iframe\").extract_first()\n+ [latString, lonString] = re.findall(\"center=(.*?)\\\"\", googleMapSrc)[0].split(',')\n \n openingHoursString = \"\"\n firstHourBlock = response.xpath(\"//div[contains(@class,'field-name-field-hours-summer')]/div/div/p/br/parent::p/text()\")\n@@ -116,7 +115,7 @@\n else:\n countryString = \"US\"\n mapUrl = response.xpath(\"//div[contains(@class,'map-link')]/div/a/@href\").extract_first()\n- stateString = re.findall(r'(?<=\\+)(.*?)(?=\\+)', mapUrl)[len(re.findall(r'(?<=\\+)(.*?)(?=\\+)', mapUrl)) - 2].strip().replace('%2C','')\n+ stateString = response.xpath(\"//div[contains(@class,'adr')]/div[2]/span[2]/text()\").extract_first()\n \n yield GeojsonPointItem(\n ref=nameString,\n@@ -125,7 +124,7 @@\n opening_hours=openingHoursString,\n state=stateString,\n postcode=response.xpath(\"//div[@class='city-state-zip']/span[@class='postal-code']/text()\").extract_first().strip(),\n- phone=self.parse_phone(response.xpath(\"//div[contains(@class,'field-name-field-phone')]/div/div/text()\").extract_first().strip()),\n+ phone=self.parse_phone(response.xpath(\"//div[contains(@class,'adr')]/div[3]/text()\").extract_first().strip()),\n country = countryString,\n lat=float(latString),\n lon=float(lonString),\n", "issue": "Spider lees_famous_recipe is broken\nDuring the global build at 2021-10-20-14-42-48, spider **lees_famous_recipe** failed with **0 features** and **130 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/logs/lees_famous_recipe.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/output/lees_famous_recipe.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-10-20-14-42-48/output/lees_famous_recipe.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nfrom locations.items import GeojsonPointItem\nimport re\n\ndaysKey = {\n 'Monday': 'Mo', 'Tuesday': 'Tu', 'Wednesday': 'We', 'Thursday': 'Th',\n 'Friday': 'Fr', 'Saturday': 'Sa', 'Sunday': 'Su'\n}\n\n\nclass LeesFamousRecipeSpider(scrapy.Spider):\n name = \"lees_famous_recipe\"\n item_attributes = { 'brand': \"Lee's Famous Recipe Chicken\" }\n allowed_domains = [\"www.leesfamousrecipe.com\"]\n start_urls = (\n 'https://www.leesfamousrecipe.com/locations/all',\n )\n\n def parse_phone(self, phone):\n phone = phone.replace('.','')\n phone = phone.replace(')','')\n phone = phone.replace('(','')\n phone = phone.replace('_','')\n phone = phone.replace('-','')\n phone = phone.replace('+','')\n phone = phone.replace(' ','')\n return phone\n\n def store_hours(self, hours):\n try:\n days = 
hours.split(': ')[0].strip()\n if('-' in days):\n startDay = daysKey[days.split('-')[0]]\n endDay = daysKey[days.split('-')[1]]\n dayOutput = startDay + \"-\" + endDay\n else:\n dayOutput = daysKey[days]\n\n bothHours = hours.split(': ')[1].replace(' ','')\n openHours = bothHours.split(\"-\")[0]\n closeHours = bothHours.split(\"-\")[1]\n\n if(\"am\" in openHours):\n openHours = openHours.replace(\"am\",\"\")\n if(\":\" in openHours):\n openH = openHours.split(\":\")[0]\n openM = openHours.split(\":\")[1]\n else:\n openH = openHours\n openM = \"00\"\n openHours = openH + \":\" + openM\n\n if(\"pm\" in openHours):\n openHours = openHours.replace(\"pm\",\"\")\n if(\":\" in openHours):\n openH = openHours.split(\":\")[0]\n openM = openHours.split(\":\")[1]\n else:\n openH = openHours\n openM = \"00\"\n openH = str(int(openH) + 12)\n openHours = openH + \":\" + openM\n\n if(\"am\" in closeHours):\n closeHours = closeHours.replace(\"am\",\"\")\n if(\":\" in closeHours):\n closeH = closeHours.split(\":\")[0]\n closeM = closeHours.split(\":\")[1]\n else:\n closeH = closeHours\n closeM = \"00\"\n closeHours = closeH + \":\" + closeM\n\n if(\"pm\" in closeHours):\n closeHours = closeHours.replace(\"pm\",\"\")\n if(\":\" in closeHours):\n closeH = closeHours.split(\":\")[0]\n closeM = closeHours.split(\":\")[1]\n else:\n closeH = closeHours\n closeM = \"00\"\n closeH = str(int(closeH) + 12)\n closeHours = closeH + \":\" + closeM\n return dayOutput +' '+ openHours.replace(' ','') + \"-\" + closeHours + ';'\n except KeyError:\n return \"\"\n\n def parse(self, response):\n if(\"https://www.leesfamousrecipe.com/locations/all\" == response.url):\n for match in response.xpath(\"//div[contains(@class,'field-content')]/a/@href\"):\n request = scrapy.Request(match.extract())\n yield request\n else:\n nameString = response.xpath(\"//h1[@class='node-title']/text()\").extract_first().strip()\n shortString = response.xpath(\"//h1[@class='node-title']/small/text()\").extract_first()\n if shortString is None:\n shortString = \"\"\n nameString = nameString + \" \" + shortString\n nameString = nameString.strip()\n\n scriptBody = response.xpath(\"//script[@type='text/javascript' and contains(.,'latitude')]/text()\").extract_first()\n latString = re.findall(\"latitude\\\":\\\"(.*?)\\\"\", scriptBody)[0]\n lonString = re.findall(\"longitude\\\":\\\"(.*?)\\\"\", scriptBody)[0]\n\n openingHoursString = \"\"\n firstHourBlock = response.xpath(\"//div[contains(@class,'field-name-field-hours-summer')]/div/div/p/br/parent::p/text()\")\n for hourLine in firstHourBlock:\n openingHoursString = openingHoursString +' '+self.store_hours(hourLine.extract())\n openingHoursString = openingHoursString.strip(';').strip()\n\n\n if(\"british-columbia\" in response.url):\n countryString = \"CA\"\n stateString = \"BC\"\n else:\n countryString = \"US\"\n mapUrl = response.xpath(\"//div[contains(@class,'map-link')]/div/a/@href\").extract_first()\n stateString = re.findall(r'(?<=\\+)(.*?)(?=\\+)', mapUrl)[len(re.findall(r'(?<=\\+)(.*?)(?=\\+)', mapUrl)) - 2].strip().replace('%2C','')\n\n yield GeojsonPointItem(\n ref=nameString,\n addr_full=response.xpath(\"//div[@class='street-address']/text()\").extract_first().strip(),\n city=response.xpath(\"//div[@class='city-state-zip']/span[@class='locality']/text()\").extract_first().strip(),\n opening_hours=openingHoursString,\n state=stateString,\n postcode=response.xpath(\"//div[@class='city-state-zip']/span[@class='postal-code']/text()\").extract_first().strip(),\n 
phone=self.parse_phone(response.xpath(\"//div[contains(@class,'field-name-field-phone')]/div/div/text()\").extract_first().strip()),\n country = countryString,\n lat=float(latString),\n lon=float(lonString),\n )\n\n", "path": "locations/spiders/lees_famous_recipe.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nfrom locations.items import GeojsonPointItem\nimport re\n\ndaysKey = {\n 'Monday': 'Mo', 'Tuesday': 'Tu', 'Wednesday': 'We', 'Thursday': 'Th',\n 'Friday': 'Fr', 'Saturday': 'Sa', 'Sunday': 'Su'\n}\n\n\nclass LeesFamousRecipeSpider(scrapy.Spider):\n name = \"lees_famous_recipe\"\n item_attributes = { 'brand': \"Lee's Famous Recipe Chicken\" }\n allowed_domains = [\"www.leesfamousrecipe.com\"]\n start_urls = (\n 'https://www.leesfamousrecipe.com/locations/all',\n )\n\n def parse_phone(self, phone):\n phone = phone.replace('.','')\n phone = phone.replace(')','')\n phone = phone.replace('(','')\n phone = phone.replace('_','')\n phone = phone.replace('-','')\n phone = phone.replace('+','')\n phone = phone.replace(' ','')\n return phone\n\n def store_hours(self, hours):\n try:\n days = hours.split(': ')[0].strip()\n if('-' in days):\n startDay = daysKey[days.split('-')[0]]\n endDay = daysKey[days.split('-')[1]]\n dayOutput = startDay + \"-\" + endDay\n else:\n dayOutput = daysKey[days]\n\n bothHours = hours.split(': ')[1].replace(' ','')\n openHours = bothHours.split(\"-\")[0]\n closeHours = bothHours.split(\"-\")[1]\n\n if(\"am\" in openHours):\n openHours = openHours.replace(\"am\",\"\")\n if(\":\" in openHours):\n openH = openHours.split(\":\")[0]\n openM = openHours.split(\":\")[1]\n else:\n openH = openHours\n openM = \"00\"\n openHours = openH + \":\" + openM\n\n if(\"pm\" in openHours):\n openHours = openHours.replace(\"pm\",\"\")\n if(\":\" in openHours):\n openH = openHours.split(\":\")[0]\n openM = openHours.split(\":\")[1]\n else:\n openH = openHours\n openM = \"00\"\n openH = str(int(openH) + 12)\n openHours = openH + \":\" + openM\n\n if(\"am\" in closeHours):\n closeHours = closeHours.replace(\"am\",\"\")\n if(\":\" in closeHours):\n closeH = closeHours.split(\":\")[0]\n closeM = closeHours.split(\":\")[1]\n else:\n closeH = closeHours\n closeM = \"00\"\n closeHours = closeH + \":\" + closeM\n\n if(\"pm\" in closeHours):\n closeHours = closeHours.replace(\"pm\",\"\")\n if(\":\" in closeHours):\n closeH = closeHours.split(\":\")[0]\n closeM = closeHours.split(\":\")[1]\n else:\n closeH = closeHours\n closeM = \"00\"\n closeH = str(int(closeH) + 12)\n closeHours = closeH + \":\" + closeM\n return dayOutput +' '+ openHours.replace(' ','') + \"-\" + closeHours + ';'\n except (KeyError, IndexError):\n return \"\"\n\n def parse(self, response):\n if(\"https://www.leesfamousrecipe.com/locations/all\" == response.url):\n for match in response.xpath(\"//div[contains(@class,'field-content')]/a/@href\"):\n request = scrapy.Request(match.extract())\n yield request\n else:\n nameString = response.xpath(\"//h1[@class='node-title']/text()\").extract_first().strip()\n shortString = response.xpath(\"//h1[@class='node-title']/small/text()\").extract_first()\n if shortString is None:\n shortString = \"\"\n nameString = nameString + \" \" + shortString\n nameString = nameString.strip()\n\n googleMapSrc = response.xpath(\"//*[@id='block-system-main']/div/div/iframe\").extract_first()\n [latString, lonString] = re.findall(\"center=(.*?)\\\"\", googleMapSrc)[0].split(',')\n\n openingHoursString = \"\"\n firstHourBlock = 
response.xpath(\"//div[contains(@class,'field-name-field-hours-summer')]/div/div/p/br/parent::p/text()\")\n for hourLine in firstHourBlock:\n openingHoursString = openingHoursString +' '+self.store_hours(hourLine.extract())\n openingHoursString = openingHoursString.strip(';').strip()\n\n\n if(\"british-columbia\" in response.url):\n countryString = \"CA\"\n stateString = \"BC\"\n else:\n countryString = \"US\"\n mapUrl = response.xpath(\"//div[contains(@class,'map-link')]/div/a/@href\").extract_first()\n stateString = response.xpath(\"//div[contains(@class,'adr')]/div[2]/span[2]/text()\").extract_first()\n\n yield GeojsonPointItem(\n ref=nameString,\n addr_full=response.xpath(\"//div[@class='street-address']/text()\").extract_first().strip(),\n city=response.xpath(\"//div[@class='city-state-zip']/span[@class='locality']/text()\").extract_first().strip(),\n opening_hours=openingHoursString,\n state=stateString,\n postcode=response.xpath(\"//div[@class='city-state-zip']/span[@class='postal-code']/text()\").extract_first().strip(),\n phone=self.parse_phone(response.xpath(\"//div[contains(@class,'adr')]/div[3]/text()\").extract_first().strip()),\n country = countryString,\n lat=float(latString),\n lon=float(lonString),\n )\n\n", "path": "locations/spiders/lees_famous_recipe.py"}]}
2,026
606
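The core of the alltheplaces fix above is swapping a brittle JSON-blob regex for the `center=` parameter of the location page's Google Maps iframe. A standalone sketch of that extraction; the iframe markup here (URL, parameter order, coordinates) is made up for illustration:

```python
import re

# Hypothetical iframe shaped like what the patched spider scrapes; the
# regex below is the one the golden diff applies to the extracted markup.
iframe = ('<iframe src="https://maps.google.com/maps'
          '?zoom=15&center=39.7589,-84.1916"></iframe>')

lat, lon = re.findall(r'center=(.*?)"', iframe)[0].split(',')
print(float(lat), float(lon))  # 39.7589 -84.1916
```

The lazy `(.*?)` up to the next quote only stays safe while `center=` is the last query parameter before the closing quote, which is worth keeping in mind if the page markup changes.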
gh_patches_debug_1473
rasdani/github-patches
git_diff
ivy-llc__ivy-13177
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- tril_indces_from --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `ivy/functional/frontends/jax/numpy/indexing.py` Content: ``` 1 # local 2 import ivy 3 from ivy.functional.frontends.jax.func_wrapper import ( 4 to_ivy_arrays_and_back, 5 ) 6 7 8 @to_ivy_arrays_and_back 9 def diagonal(a, offset=0, axis1=0, axis2=1): 10 return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2) 11 12 13 @to_ivy_arrays_and_back 14 def diag(v, k=0): 15 return ivy.diag(v, k=k) 16 17 18 @to_ivy_arrays_and_back 19 def diag_indices(n, ndim=2): 20 idx = ivy.arange(n, dtype=int) 21 return (idx,) * ndim 22 23 24 # take_along_axis 25 @to_ivy_arrays_and_back 26 def take_along_axis(arr, indices, axis, mode="fill"): 27 return ivy.take_along_axis(arr, indices, axis, mode=mode) 28 29 30 @to_ivy_arrays_and_back 31 def tril_indices(n_rows, n_cols=None, k=0): 32 return ivy.tril_indices(n_rows, n_cols, k) 33 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/ivy/functional/frontends/jax/numpy/indexing.py b/ivy/functional/frontends/jax/numpy/indexing.py --- a/ivy/functional/frontends/jax/numpy/indexing.py +++ b/ivy/functional/frontends/jax/numpy/indexing.py @@ -30,3 +30,8 @@ @to_ivy_arrays_and_back def tril_indices(n_rows, n_cols=None, k=0): return ivy.tril_indices(n_rows, n_cols, k) + + +@to_ivy_arrays_and_back +def tril_indices_from(arr, k=0): + return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)
{"golden_diff": "diff --git a/ivy/functional/frontends/jax/numpy/indexing.py b/ivy/functional/frontends/jax/numpy/indexing.py\n--- a/ivy/functional/frontends/jax/numpy/indexing.py\n+++ b/ivy/functional/frontends/jax/numpy/indexing.py\n@@ -30,3 +30,8 @@\n @to_ivy_arrays_and_back\n def tril_indices(n_rows, n_cols=None, k=0):\n return ivy.tril_indices(n_rows, n_cols, k)\n+\n+\n+@to_ivy_arrays_and_back\n+def tril_indices_from(arr, k=0):\n+ return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)\n", "issue": "tril_indces_from\n\n", "before_files": [{"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef diagonal(a, offset=0, axis1=0, axis2=1):\n return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2)\n\n\n@to_ivy_arrays_and_back\ndef diag(v, k=0):\n return ivy.diag(v, k=k)\n\n\n@to_ivy_arrays_and_back\ndef diag_indices(n, ndim=2):\n idx = ivy.arange(n, dtype=int)\n return (idx,) * ndim\n\n\n# take_along_axis\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis, mode=\"fill\"):\n return ivy.take_along_axis(arr, indices, axis, mode=mode)\n\n\n@to_ivy_arrays_and_back\ndef tril_indices(n_rows, n_cols=None, k=0):\n return ivy.tril_indices(n_rows, n_cols, k)\n", "path": "ivy/functional/frontends/jax/numpy/indexing.py"}], "after_files": [{"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef diagonal(a, offset=0, axis1=0, axis2=1):\n return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2)\n\n\n@to_ivy_arrays_and_back\ndef diag(v, k=0):\n return ivy.diag(v, k=k)\n\n\n@to_ivy_arrays_and_back\ndef diag_indices(n, ndim=2):\n idx = ivy.arange(n, dtype=int)\n return (idx,) * ndim\n\n\n# take_along_axis\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis, mode=\"fill\"):\n return ivy.take_along_axis(arr, indices, axis, mode=mode)\n\n\n@to_ivy_arrays_and_back\ndef tril_indices(n_rows, n_cols=None, k=0):\n return ivy.tril_indices(n_rows, n_cols, k)\n\n\n@to_ivy_arrays_and_back\ndef tril_indices_from(arr, k=0):\n return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)\n", "path": "ivy/functional/frontends/jax/numpy/indexing.py"}]}
566
157
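For the ivy record above, the semantics of the new frontend function are easiest to check against NumPy, whose `tril_indices_from` the JAX frontend mirrors. A small self-contained sketch, with plain NumPy standing in for the ivy backend:

```python
import numpy as np

def tril_indices_from(arr, k=0):
    # Same shape-driven delegation as the patch: last two dims of `arr`.
    return np.tril_indices(arr.shape[-2], k, arr.shape[-1])

a = np.arange(12).reshape(3, 4)
rows, cols = tril_indices_from(a)
print(a[rows, cols])                                # lower-triangle elements
assert (rows == np.tril_indices_from(a)[0]).all()   # matches NumPy's own
```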
gh_patches_debug_336
rasdani/github-patches
git_diff
piskvorky__gensim-919
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- import gensim fails since updating to Xcode 7.3 I just updated my version of Xcode to 7.3. When I run `pip install --upgrade gensim` the process completed without any issues. However, when I try `import gensim` within the python shell the terminal barfs a bunch of C++ output with a block of execution errors that begins with: `Exception: Compilation failed (return status=1): clang: error: unsupported option '-b mi2'. clang: error: unsupported option '-b mi'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-sse4a'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-tbm'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'....` I think this has something to do with where gensim is looking for its header files, but I'm somewhat at a loss. Any help debugging would be greatly appreciated. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `gensim/corpora/__init__.py` Content: ``` 1 """ 2 This package contains implementations of various streaming corpus I/O format. 3 """ 4 5 # bring corpus classes directly into package namespace, to save some typing 6 from .indexedcorpus import IndexedCorpus # must appear before the other classes 7 8 from .mmcorpus import MmCorpus 9 from .bleicorpus import BleiCorpus 10 from .svmlightcorpus import SvmLightCorpus 11 from .lowcorpus import LowCorpus 12 from .dictionary import Dictionary 13 from .hashdictionary import HashDictionary 14 from .wikicorpus import WikiCorpus 15 from .textcorpus import TextCorpus 16 from .ucicorpus import UciCorpus 17 from .malletcorpus import MalletCorpus 18 from .sharded_corpus import ShardedCorpus 19 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/gensim/corpora/__init__.py b/gensim/corpora/__init__.py --- a/gensim/corpora/__init__.py +++ b/gensim/corpora/__init__.py @@ -15,4 +15,3 @@ from .textcorpus import TextCorpus from .ucicorpus import UciCorpus from .malletcorpus import MalletCorpus -from .sharded_corpus import ShardedCorpus
{"golden_diff": "diff --git a/gensim/corpora/__init__.py b/gensim/corpora/__init__.py\n--- a/gensim/corpora/__init__.py\n+++ b/gensim/corpora/__init__.py\n@@ -15,4 +15,3 @@\n from .textcorpus import TextCorpus\n from .ucicorpus import UciCorpus\n from .malletcorpus import MalletCorpus\n-from .sharded_corpus import ShardedCorpus\n", "issue": "import gensim fails since updating to Xcode 7.3 \nI just updated my version of Xcode to 7.3. When I run `pip install --upgrade gensim` the process completed without any issues. However, when I try `import gensim` within the python shell the terminal barfs a bunch of C++ output with a block of execution errors that begins with: \n\n`Exception: Compilation failed (return status=1): clang: error: unsupported option '-b mi2'. clang: error: unsupported option '-b mi'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-sse4a'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-tbm'. clang: error: unknown argument: '-target-feature'. clang: error: unknown argument: '-target-feature'....`\n\nI think this has something to do with where gensim is looking for its header files, but I'm somewhat at a loss. Any help debugging would be greatly appreciated. \n\n", "before_files": [{"content": "\"\"\"\nThis package contains implementations of various streaming corpus I/O format.\n\"\"\"\n\n# bring corpus classes directly into package namespace, to save some typing\nfrom .indexedcorpus import IndexedCorpus # must appear before the other classes\n\nfrom .mmcorpus import MmCorpus\nfrom .bleicorpus import BleiCorpus\nfrom .svmlightcorpus import SvmLightCorpus\nfrom .lowcorpus import LowCorpus\nfrom .dictionary import Dictionary\nfrom .hashdictionary import HashDictionary\nfrom .wikicorpus import WikiCorpus\nfrom .textcorpus import TextCorpus\nfrom .ucicorpus import UciCorpus\nfrom .malletcorpus import MalletCorpus\nfrom .sharded_corpus import ShardedCorpus\n", "path": "gensim/corpora/__init__.py"}], "after_files": [{"content": "\"\"\"\nThis package contains implementations of various streaming corpus I/O format.\n\"\"\"\n\n# bring corpus classes directly into package namespace, to save some typing\nfrom .indexedcorpus import IndexedCorpus # must appear before the other classes\n\nfrom .mmcorpus import MmCorpus\nfrom .bleicorpus import BleiCorpus\nfrom .svmlightcorpus import SvmLightCorpus\nfrom .lowcorpus import LowCorpus\nfrom .dictionary import Dictionary\nfrom .hashdictionary import HashDictionary\nfrom .wikicorpus import WikiCorpus\nfrom .textcorpus import TextCorpus\nfrom .ucicorpus import UciCorpus\nfrom .malletcorpus import MalletCorpus\n", "path": "gensim/corpora/__init__.py"}]}
695
109
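The gensim fix above is pure removal: dropping the eager `ShardedCorpus` import keeps `import gensim` from triggering theano's C compilation step, which is what choked under Xcode 7.3's clang. As an aside — this is not what gensim shipped — a deferred import would have kept the attribute reachable without the import-time cost. A sketch, with `heavy_module` as a hypothetical stand-in for the theano-backed module:

```python
import importlib

# PEP 562 module-level __getattr__ (Python 3.7+): the costly import only
# happens when someone actually asks for ShardedCorpus.
def __getattr__(name):
    if name == "ShardedCorpus":
        mod = importlib.import_module("heavy_module")  # hypothetical stand-in
        return mod.ShardedCorpus
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```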
gh_patches_debug_1358
rasdani/github-patches
git_diff
mirumee__ariadne-270
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Upgrade to GraphQL-core v3 I'm getting the following deprecation warning. Is this something that is already on your radar / that you are planning to resolve for the next release? >**DeprecationWarning**: GraphQL-core-next has been discontinued. It is now released as GraphQL-core v3 and newer. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 #! /usr/bin/env python 2 import os 3 from setuptools import setup 4 5 CLASSIFIERS = [ 6 "Development Status :: 4 - Beta", 7 "Intended Audience :: Developers", 8 "License :: OSI Approved :: BSD License", 9 "Operating System :: OS Independent", 10 "Programming Language :: Python", 11 "Programming Language :: Python :: 3.6", 12 "Programming Language :: Python :: 3.7", 13 "Programming Language :: Python :: 3.8", 14 "Topic :: Software Development :: Libraries :: Python Modules", 15 ] 16 17 README_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "README.md") 18 with open(README_PATH, "r") as f: 19 README = f.read() 20 21 setup( 22 name="ariadne", 23 author="Mirumee Software", 24 author_email="[email protected]", 25 description="Ariadne is a Python library for implementing GraphQL servers.", 26 long_description=README, 27 long_description_content_type="text/markdown", 28 license="BSD", 29 version="0.8.0", 30 url="https://github.com/mirumee/ariadne", 31 packages=["ariadne"], 32 include_package_data=True, 33 install_requires=[ 34 "graphql-core-next<3.0.0", 35 "starlette<0.14", 36 "typing_extensions>=3.6.0", 37 ], 38 extras_require={"asgi-file-uploads": ["python-multipart>=0.0.5"]}, 39 classifiers=CLASSIFIERS, 40 platforms=["any"], 41 zip_safe=False, 42 ) 43 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -31,7 +31,7 @@ packages=["ariadne"], include_package_data=True, install_requires=[ - "graphql-core-next<3.0.0", + "graphql-core>=3.0.0", "starlette<0.14", "typing_extensions>=3.6.0", ],
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -31,7 +31,7 @@\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n- \"graphql-core-next<3.0.0\",\n+ \"graphql-core>=3.0.0\",\n \"starlette<0.14\",\n \"typing_extensions>=3.6.0\",\n ],\n", "issue": "Upgrade to GraphQL-core v3\nI'm getting the following deprecation warning. Is this something that is already on your radar / that you are planning to resolve for the next release?\r\n\r\n>**DeprecationWarning**: GraphQL-core-next has been discontinued. It is now released as GraphQL-core v3 and newer.\n", "before_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.8.0\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n \"graphql-core-next<3.0.0\",\n \"starlette<0.14\",\n \"typing_extensions>=3.6.0\",\n ],\n extras_require={\"asgi-file-uploads\": [\"python-multipart>=0.0.5\"]},\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.8.0\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n \"graphql-core>=3.0.0\",\n \"starlette<0.14\",\n \"typing_extensions>=3.6.0\",\n ],\n extras_require={\"asgi-file-uploads\": [\"python-multipart>=0.0.5\"]},\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}]}
732
97
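The ariadne record above only swaps a pin (`graphql-core-next<3.0.0` → `graphql-core>=3.0.0`), since the project was renamed for v3. Below is a hedged sketch of a runtime guard for environments that upgraded in place — the distribution name and version bound mirror the patched `setup.py`; the guard itself is illustrative, not part of the project:

```python
import importlib.metadata as md  # Python 3.8+; use the importlib_metadata backport before

try:
    version = md.version("graphql-core")
except md.PackageNotFoundError:
    raise SystemExit("install graphql-core>=3.0.0 (graphql-core-next is discontinued)")

if int(version.split(".")[0]) < 3:
    raise SystemExit(f"graphql-core {version} found; this codebase expects >=3.0.0")
```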
gh_patches_debug_50420
rasdani/github-patches
git_diff
litestar-org__litestar-2330
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- StaticFilesConfig and virtual directories I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. This is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems. https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `litestar/openapi/spec/enums.py` Content: ``` 1 from enum import Enum 2 3 __all__ = ("OpenAPIFormat", "OpenAPIType") 4 5 6 class OpenAPIFormat(str, Enum): 7 """Formats extracted from: https://datatracker.ietf.org/doc/html/draft-bhutton-json-schema-validation-00#page-13""" 8 9 DATE = "date" 10 DATE_TIME = "date-time" 11 TIME = "time" 12 DURATION = "duration" 13 URL = "url" 14 EMAIL = "email" 15 IDN_EMAIL = "idn-email" 16 HOST_NAME = "hostname" 17 IDN_HOST_NAME = "idn-hostname" 18 IPV4 = "ipv4" 19 IPV6 = "ipv6" 20 URI = "uri" 21 URI_REFERENCE = "uri-reference" 22 URI_TEMPLATE = "uri-template" 23 JSON_POINTER = "json-pointer" 24 RELATIVE_JSON_POINTER = "relative-json-pointer" 25 IRI = "iri-reference" 26 IRI_REFERENCE = "iri-reference" # noqa: PIE796 27 UUID = "uuid" 28 REGEX = "regex" 29 30 31 class OpenAPIType(str, Enum): 32 """An OopenAPI type.""" 33 34 ARRAY = "array" 35 BOOLEAN = "boolean" 36 INTEGER = "integer" 37 NULL = "null" 38 NUMBER = "number" 39 OBJECT = "object" 40 STRING = "string" 41 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/litestar/openapi/spec/enums.py b/litestar/openapi/spec/enums.py --- a/litestar/openapi/spec/enums.py +++ b/litestar/openapi/spec/enums.py @@ -26,6 +26,7 @@ IRI_REFERENCE = "iri-reference" # noqa: PIE796 UUID = "uuid" REGEX = "regex" + BINARY = "binary" class OpenAPIType(str, Enum):
{"golden_diff": "diff --git a/litestar/openapi/spec/enums.py b/litestar/openapi/spec/enums.py\n--- a/litestar/openapi/spec/enums.py\n+++ b/litestar/openapi/spec/enums.py\n@@ -26,6 +26,7 @@\n IRI_REFERENCE = \"iri-reference\" # noqa: PIE796\n UUID = \"uuid\"\n REGEX = \"regex\"\n+ BINARY = \"binary\"\n \n \n class OpenAPIType(str, Enum):\n", "issue": "StaticFilesConfig and virtual directories\nI'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. \r\n\r\nThis is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.\r\n\r\nhttps://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32\n", "before_files": [{"content": "from enum import Enum\n\n__all__ = (\"OpenAPIFormat\", \"OpenAPIType\")\n\n\nclass OpenAPIFormat(str, Enum):\n \"\"\"Formats extracted from: https://datatracker.ietf.org/doc/html/draft-bhutton-json-schema-validation-00#page-13\"\"\"\n\n DATE = \"date\"\n DATE_TIME = \"date-time\"\n TIME = \"time\"\n DURATION = \"duration\"\n URL = \"url\"\n EMAIL = \"email\"\n IDN_EMAIL = \"idn-email\"\n HOST_NAME = \"hostname\"\n IDN_HOST_NAME = \"idn-hostname\"\n IPV4 = \"ipv4\"\n IPV6 = \"ipv6\"\n URI = \"uri\"\n URI_REFERENCE = \"uri-reference\"\n URI_TEMPLATE = \"uri-template\"\n JSON_POINTER = \"json-pointer\"\n RELATIVE_JSON_POINTER = \"relative-json-pointer\"\n IRI = \"iri-reference\"\n IRI_REFERENCE = \"iri-reference\" # noqa: PIE796\n UUID = \"uuid\"\n REGEX = \"regex\"\n\n\nclass OpenAPIType(str, Enum):\n \"\"\"An OopenAPI type.\"\"\"\n\n ARRAY = \"array\"\n BOOLEAN = \"boolean\"\n INTEGER = \"integer\"\n NULL = \"null\"\n NUMBER = \"number\"\n OBJECT = \"object\"\n STRING = \"string\"\n", "path": "litestar/openapi/spec/enums.py"}], "after_files": [{"content": "from enum import Enum\n\n__all__ = (\"OpenAPIFormat\", \"OpenAPIType\")\n\n\nclass OpenAPIFormat(str, Enum):\n \"\"\"Formats extracted from: https://datatracker.ietf.org/doc/html/draft-bhutton-json-schema-validation-00#page-13\"\"\"\n\n DATE = \"date\"\n DATE_TIME = \"date-time\"\n TIME = \"time\"\n DURATION = \"duration\"\n URL = \"url\"\n EMAIL = \"email\"\n IDN_EMAIL = \"idn-email\"\n HOST_NAME = \"hostname\"\n IDN_HOST_NAME = \"idn-hostname\"\n IPV4 = \"ipv4\"\n IPV6 = \"ipv6\"\n URI = \"uri\"\n URI_REFERENCE = \"uri-reference\"\n URI_TEMPLATE = \"uri-template\"\n JSON_POINTER = \"json-pointer\"\n RELATIVE_JSON_POINTER = \"relative-json-pointer\"\n IRI = \"iri-reference\"\n IRI_REFERENCE = \"iri-reference\" # noqa: PIE796\n UUID = \"uuid\"\n REGEX = \"regex\"\n BINARY = \"binary\"\n\n\nclass OpenAPIType(str, Enum):\n \"\"\"An OopenAPI type.\"\"\"\n\n ARRAY = \"array\"\n BOOLEAN = \"boolean\"\n INTEGER = \"integer\"\n NULL = \"null\"\n NUMBER = \"number\"\n OBJECT = \"object\"\n STRING = \"string\"\n", "path": "litestar/openapi/spec/enums.py"}]}
791
108
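The golden diff in this record adds a `BINARY = "binary"` member to the `OpenAPIFormat` str-enum. A trimmed, self-contained sketch of that pattern, with only a few members reproduced; the full class lives in `litestar/openapi/spec/enums.py`:

```python
from enum import Enum

class OpenAPIFormat(str, Enum):
    # Trimmed to a few members for illustration; see the record's
    # before/after files for the full list.
    DATE = "date"
    UUID = "uuid"
    REGEX = "regex"
    BINARY = "binary"  # the member introduced by the golden diff

# Inheriting from str makes members interchangeable with plain strings,
# which is what lets them serialize directly into an OpenAPI document.
assert OpenAPIFormat.BINARY == "binary"
assert OpenAPIFormat.BINARY.value == "binary"
```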
gh_patches_debug_19349
rasdani/github-patches
git_diff
fossasia__open-event-server-4248
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Microlocations : GET requests return ERROR 500 **I'm submitting a ...** (check one with "x") - [x] bug report - [ ] feature request - [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-orga-server Endpoint ``` GET v1/events/<event_id>/microlocations ``` Response ``` { "errors":[ { "detail":"Unknown error", "source":{ "pointer":"" }, "status":500, "title":"Unknown error" } ], "jsonapi":{ "version":"1.0" } } ``` Example URL ``` https://open-event-api.herokuapp.com/v1/events/173/microlocations ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `app/api/microlocations.py` Content: ``` 1 from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship 2 from marshmallow_jsonapi.flask import Schema, Relationship 3 from marshmallow_jsonapi import fields 4 5 from app.api.bootstrap import api 6 from app.api.helpers.utilities import dasherize 7 from app.models import db 8 from app.models.microlocation import Microlocation 9 from app.models.session import Session 10 from app.api.helpers.db import safe_query 11 from app.api.helpers.utilities import require_relationship 12 from app.api.helpers.permission_manager import has_access 13 from app.api.helpers.exceptions import ForbiddenException 14 from app.api.helpers.query import event_query 15 16 17 class MicrolocationSchema(Schema): 18 """ 19 Api schema for Microlocation Model 20 """ 21 22 class Meta: 23 """ 24 Meta class for Microlocation Api Schema 25 """ 26 type_ = 'microlocation' 27 self_view = 'v1.microlocation_detail' 28 self_view_kwargs = {'id': '<id>'} 29 self_view_many = 'v1.session_list' 30 inflect = dasherize 31 32 id = fields.Str(dump_only=True) 33 name = fields.Str(required=True) 34 latitude = fields.Float(validate=lambda n: -90 <= n <= 90, allow_none=True) 35 longitude = fields.Float(validate=lambda n: -180 <= n <= 180, allow_none=True) 36 floor = fields.Integer(allow_none=True) 37 room = fields.Str(allow_none=True) 38 sessions = Relationship(attribute='session', 39 self_view='v1.microlocation_session', 40 self_view_kwargs={'id': '<id>'}, 41 related_view='v1.session_list', 42 related_view_kwargs={'microlocation_id': '<id>'}, 43 schema='SessionSchema', 44 type_='session') 45 event = Relationship(attribute='event', 46 self_view='v1.microlocation_event', 47 self_view_kwargs={'id': '<id>'}, 48 related_view='v1.event_detail', 49 related_view_kwargs={'microlocation_id': '<id>'}, 50 schema='EventSchema', 51 type_='event') 52 53 54 class MicrolocationListPost(ResourceList): 55 """ 56 List and create microlocations 57 """ 58 def before_post(self, args, kwargs, data): 59 require_relationship(['event'], data) 60 if not has_access('is_coorganizer', event_id=data['event']): 61 raise ForbiddenException({'source': ''}, 'Co-organizer access is required.') 62 63 methods = ['POST', ] 64 schema = MicrolocationSchema 65 data_layer = {'session': db.session, 66 'model': Microlocation} 67 68 69 class MicrolocationList(ResourceList): 70 """ 71 List Microlocations 72 """ 73 def query(self, view_kwargs): 74 query_ = self.session.query(Microlocation) 75 query_ = event_query(self, query_, view_kwargs) 76 if view_kwargs.get('session_id'): 77 session = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id') 78 query_ = 
query_.join(Session).filter(Session.id == session.id) 79 return query_ 80 81 view_kwargs = True 82 methods = ['GET'] 83 schema = MicrolocationSchema 84 data_layer = {'session': db.session, 85 'model': Microlocation, 86 'methods': { 87 'query': query 88 }} 89 90 91 class MicrolocationDetail(ResourceDetail): 92 """ 93 Microlocation detail by id 94 """ 95 96 def before_get_object(self, view_kwargs): 97 98 if view_kwargs.get('session_id') is not None: 99 sessions = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id') 100 if sessions.event_id is not None: 101 view_kwargs['id'] = sessions.event_id 102 else: 103 view_kwargs['id'] = None 104 105 decorators = (api.has_permission('is_coorganizer', methods="PATCH,DELETE", fetch="event_id", fetch_as="event_id", 106 model=Microlocation),) 107 schema = MicrolocationSchema 108 data_layer = {'session': db.session, 109 'model': Microlocation, 110 'methods': {'before_get_object': before_get_object}} 111 112 113 class MicrolocationRelationshipRequired(ResourceRelationship): 114 """ 115 Microlocation Relationship for required entities 116 """ 117 decorators = (api.has_permission('is_coorganizer', methods="PATCH", fetch="event_id", fetch_as="event_id", 118 model=Microlocation),) 119 methods = ['GET', 'PATCH'] 120 schema = MicrolocationSchema 121 data_layer = {'session': db.session, 122 'model': Microlocation} 123 124 125 class MicrolocationRelationshipOptional(ResourceRelationship): 126 """ 127 Microlocation Relationship 128 """ 129 decorators = (api.has_permission('is_coorganizer', methods="PATCH,DELETE", fetch="event_id", fetch_as="event_id", 130 model=Microlocation),) 131 schema = MicrolocationSchema 132 data_layer = {'session': db.session, 133 'model': Microlocation} 134 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/app/api/microlocations.py b/app/api/microlocations.py --- a/app/api/microlocations.py +++ b/app/api/microlocations.py @@ -26,7 +26,7 @@ type_ = 'microlocation' self_view = 'v1.microlocation_detail' self_view_kwargs = {'id': '<id>'} - self_view_many = 'v1.session_list' + self_view_many = 'v1.microlocation_list_post' inflect = dasherize id = fields.Str(dump_only=True) @@ -36,6 +36,7 @@ floor = fields.Integer(allow_none=True) room = fields.Str(allow_none=True) sessions = Relationship(attribute='session', + many=True, self_view='v1.microlocation_session', self_view_kwargs={'id': '<id>'}, related_view='v1.session_list',
{"golden_diff": "diff --git a/app/api/microlocations.py b/app/api/microlocations.py\n--- a/app/api/microlocations.py\n+++ b/app/api/microlocations.py\n@@ -26,7 +26,7 @@\n type_ = 'microlocation'\n self_view = 'v1.microlocation_detail'\n self_view_kwargs = {'id': '<id>'}\n- self_view_many = 'v1.session_list'\n+ self_view_many = 'v1.microlocation_list_post'\n inflect = dasherize\n \n id = fields.Str(dump_only=True)\n@@ -36,6 +36,7 @@\n floor = fields.Integer(allow_none=True)\n room = fields.Str(allow_none=True)\n sessions = Relationship(attribute='session',\n+ many=True,\n self_view='v1.microlocation_session',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.session_list',\n", "issue": "Microlocations : GET requests return ERROR 500\n**I'm submitting a ...** (check one with \"x\")\r\n- [x] bug report\r\n- [ ] feature request\r\n- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-orga-server\r\n\r\nEndpoint \r\n```\r\nGET v1/events/<event_id>/microlocations \r\n```\r\n\r\nResponse\r\n```\r\n{\r\n \"errors\":[\r\n {\r\n \"detail\":\"Unknown error\",\r\n \"source\":{\r\n \"pointer\":\"\"\r\n },\r\n \"status\":500,\r\n \"title\":\"Unknown error\"\r\n }\r\n ],\r\n \"jsonapi\":{\r\n \"version\":\"1.0\"\r\n }\r\n}\r\n```\r\n\r\nExample URL\r\n```\r\nhttps://open-event-api.herokuapp.com/v1/events/173/microlocations\r\n```\r\n\n", "before_files": [{"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\nfrom marshmallow_jsonapi.flask import Schema, Relationship\nfrom marshmallow_jsonapi import fields\n\nfrom app.api.bootstrap import api\nfrom app.api.helpers.utilities import dasherize\nfrom app.models import db\nfrom app.models.microlocation import Microlocation\nfrom app.models.session import Session\nfrom app.api.helpers.db import safe_query\nfrom app.api.helpers.utilities import require_relationship\nfrom app.api.helpers.permission_manager import has_access\nfrom app.api.helpers.exceptions import ForbiddenException\nfrom app.api.helpers.query import event_query\n\n\nclass MicrolocationSchema(Schema):\n \"\"\"\n Api schema for Microlocation Model\n \"\"\"\n\n class Meta:\n \"\"\"\n Meta class for Microlocation Api Schema\n \"\"\"\n type_ = 'microlocation'\n self_view = 'v1.microlocation_detail'\n self_view_kwargs = {'id': '<id>'}\n self_view_many = 'v1.session_list'\n inflect = dasherize\n\n id = fields.Str(dump_only=True)\n name = fields.Str(required=True)\n latitude = fields.Float(validate=lambda n: -90 <= n <= 90, allow_none=True)\n longitude = fields.Float(validate=lambda n: -180 <= n <= 180, allow_none=True)\n floor = fields.Integer(allow_none=True)\n room = fields.Str(allow_none=True)\n sessions = Relationship(attribute='session',\n self_view='v1.microlocation_session',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.session_list',\n related_view_kwargs={'microlocation_id': '<id>'},\n schema='SessionSchema',\n type_='session')\n event = Relationship(attribute='event',\n self_view='v1.microlocation_event',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.event_detail',\n related_view_kwargs={'microlocation_id': '<id>'},\n schema='EventSchema',\n type_='event')\n\n\nclass MicrolocationListPost(ResourceList):\n \"\"\"\n List and create microlocations\n \"\"\"\n def before_post(self, args, kwargs, data):\n require_relationship(['event'], data)\n if not has_access('is_coorganizer', event_id=data['event']):\n raise 
ForbiddenException({'source': ''}, 'Co-organizer access is required.')\n\n methods = ['POST', ]\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n\n\nclass MicrolocationList(ResourceList):\n \"\"\"\n List Microlocations\n \"\"\"\n def query(self, view_kwargs):\n query_ = self.session.query(Microlocation)\n query_ = event_query(self, query_, view_kwargs)\n if view_kwargs.get('session_id'):\n session = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id')\n query_ = query_.join(Session).filter(Session.id == session.id)\n return query_\n\n view_kwargs = True\n methods = ['GET']\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation,\n 'methods': {\n 'query': query\n }}\n\n\nclass MicrolocationDetail(ResourceDetail):\n \"\"\"\n Microlocation detail by id\n \"\"\"\n\n def before_get_object(self, view_kwargs):\n\n if view_kwargs.get('session_id') is not None:\n sessions = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id')\n if sessions.event_id is not None:\n view_kwargs['id'] = sessions.event_id\n else:\n view_kwargs['id'] = None\n\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH,DELETE\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation,\n 'methods': {'before_get_object': before_get_object}}\n\n\nclass MicrolocationRelationshipRequired(ResourceRelationship):\n \"\"\"\n Microlocation Relationship for required entities\n \"\"\"\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n methods = ['GET', 'PATCH']\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n\n\nclass MicrolocationRelationshipOptional(ResourceRelationship):\n \"\"\"\n Microlocation Relationship\n \"\"\"\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH,DELETE\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n", "path": "app/api/microlocations.py"}], "after_files": [{"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\nfrom marshmallow_jsonapi.flask import Schema, Relationship\nfrom marshmallow_jsonapi import fields\n\nfrom app.api.bootstrap import api\nfrom app.api.helpers.utilities import dasherize\nfrom app.models import db\nfrom app.models.microlocation import Microlocation\nfrom app.models.session import Session\nfrom app.api.helpers.db import safe_query\nfrom app.api.helpers.utilities import require_relationship\nfrom app.api.helpers.permission_manager import has_access\nfrom app.api.helpers.exceptions import ForbiddenException\nfrom app.api.helpers.query import event_query\n\n\nclass MicrolocationSchema(Schema):\n \"\"\"\n Api schema for Microlocation Model\n \"\"\"\n\n class Meta:\n \"\"\"\n Meta class for Microlocation Api Schema\n \"\"\"\n type_ = 'microlocation'\n self_view = 'v1.microlocation_detail'\n self_view_kwargs = {'id': '<id>'}\n self_view_many = 'v1.microlocation_list_post'\n inflect = dasherize\n\n id = fields.Str(dump_only=True)\n name = fields.Str(required=True)\n latitude = fields.Float(validate=lambda n: -90 <= n <= 90, allow_none=True)\n longitude = fields.Float(validate=lambda n: -180 <= n <= 180, allow_none=True)\n floor = 
fields.Integer(allow_none=True)\n room = fields.Str(allow_none=True)\n sessions = Relationship(attribute='session',\n many=True,\n self_view='v1.microlocation_session',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.session_list',\n related_view_kwargs={'microlocation_id': '<id>'},\n schema='SessionSchema',\n type_='session')\n event = Relationship(attribute='event',\n self_view='v1.microlocation_event',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.event_detail',\n related_view_kwargs={'microlocation_id': '<id>'},\n schema='EventSchema',\n type_='event')\n\n\nclass MicrolocationListPost(ResourceList):\n \"\"\"\n List and create microlocations\n \"\"\"\n def before_post(self, args, kwargs, data):\n require_relationship(['event'], data)\n if not has_access('is_coorganizer', event_id=data['event']):\n raise ForbiddenException({'source': ''}, 'Co-organizer access is required.')\n\n methods = ['POST', ]\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n\n\nclass MicrolocationList(ResourceList):\n \"\"\"\n List Microlocations\n \"\"\"\n def query(self, view_kwargs):\n query_ = self.session.query(Microlocation)\n query_ = event_query(self, query_, view_kwargs)\n if view_kwargs.get('session_id'):\n session = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id')\n query_ = query_.join(Session).filter(Session.id == session.id)\n return query_\n\n view_kwargs = True\n methods = ['GET']\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation,\n 'methods': {\n 'query': query\n }}\n\n\nclass MicrolocationDetail(ResourceDetail):\n \"\"\"\n Microlocation detail by id\n \"\"\"\n\n def before_get_object(self, view_kwargs):\n\n if view_kwargs.get('session_id') is not None:\n sessions = safe_query(self, Session, 'id', view_kwargs['session_id'], 'session_id')\n if sessions.event_id is not None:\n view_kwargs['id'] = sessions.event_id\n else:\n view_kwargs['id'] = None\n\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH,DELETE\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation,\n 'methods': {'before_get_object': before_get_object}}\n\n\nclass MicrolocationRelationshipRequired(ResourceRelationship):\n \"\"\"\n Microlocation Relationship for required entities\n \"\"\"\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n methods = ['GET', 'PATCH']\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n\n\nclass MicrolocationRelationshipOptional(ResourceRelationship):\n \"\"\"\n Microlocation Relationship\n \"\"\"\n decorators = (api.has_permission('is_coorganizer', methods=\"PATCH,DELETE\", fetch=\"event_id\", fetch_as=\"event_id\",\n model=Microlocation),)\n schema = MicrolocationSchema\n data_layer = {'session': db.session,\n 'model': Microlocation}\n", "path": "app/api/microlocations.py"}]}
1768
198
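The fix in this record marks the `sessions` relationship as `many=True` (and repoints `self_view_many`). In JSON:API terms that switches the resource linkage from a single identifier object to a list, which matches the one-to-many microlocation-to-sessions relation; serializing a queryset through a to-one declaration is the kind of mismatch that can surface as the 500 in the issue. A dependency-free sketch of the two linkage shapes, with made-up payload values:

```python
# JSON:API resource-linkage shapes. A to-one relationship (many=False,
# the buggy declaration) carries one identifier; a to-many relationship
# (many=True, the patched declaration) carries a list of them.
to_one_linkage = {"data": {"type": "session", "id": "122"}}

to_many_linkage = {
    "data": [
        {"type": "session", "id": "122"},
        {"type": "session", "id": "123"},
    ]
}

assert isinstance(to_many_linkage["data"], list)  # shape expected for sessions
```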
gh_patches_debug_8814
rasdani/github-patches
git_diff
CTPUG__wafer-243
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- tickets should be decoded on python 3 As seen from the recent quicket hook posts TypeError at /tickets/quicket_hook/ the JSON object must be str, not 'bytes' --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `wafer/tickets/views.py` Content: ``` 1 import json 2 import logging 3 4 from django.conf import settings 5 from django.contrib.auth import get_user_model 6 from django.core.exceptions import PermissionDenied, ValidationError 7 from django.core.urlresolvers import reverse 8 from django.http import HttpResponse, Http404 9 from django.views.decorators.csrf import csrf_exempt 10 from django.views.decorators.http import require_POST 11 from django.views.generic.edit import FormView 12 13 from wafer.tickets.models import Ticket, TicketType 14 from wafer.tickets.forms import TicketForm 15 16 log = logging.getLogger(__name__) 17 18 19 class ClaimView(FormView): 20 template_name = 'wafer.tickets/claim.html' 21 form_class = TicketForm 22 23 def get_context_data(self, **kwargs): 24 context = super(ClaimView, self).get_context_data(**kwargs) 25 context['can_claim'] = self.can_claim() 26 return context 27 28 def can_claim(self): 29 if settings.WAFER_REGISTRATION_MODE != 'ticket': 30 raise Http404('Ticket-based registration is not in use') 31 if not settings.WAFER_REGISTRATION_OPEN: 32 return False 33 return not self.request.user.userprofile.is_registered() 34 35 def form_valid(self, form): 36 if not self.can_claim(): 37 raise ValidationError('User may not claim a ticket') 38 ticket = Ticket.objects.get(barcode=form.cleaned_data['barcode']) 39 ticket.user = self.request.user 40 ticket.save() 41 return super(ClaimView, self).form_valid(form) 42 43 def get_success_url(self): 44 return reverse( 45 'wafer_user_profile', args=(self.request.user.username,)) 46 47 48 @csrf_exempt 49 @require_POST 50 def quicket_hook(request): 51 ''' 52 Quicket.co.za can POST something like this when tickets are bought: 53 { 54 "reference": "REF00123456", 55 "event_id": 123, 56 "event_name": "My Big Event", 57 "amount": 0.00, 58 "email": "[email protected]", 59 "action": "checkout_started", 60 // Options are "checkout_started","checkout_cancelled","eft_pending", 61 // "checkout_completed" 62 "tickets": [ 63 { 64 "id": 122, 65 "attendee_name": "", 66 "attendee_email": "", 67 "ticket_type": "Free Ticket", 68 "price": 0.00, 69 "barcode": 12345, 70 }, 71 ... 
72 ], 73 } 74 ''' 75 if request.GET.get('secret') != settings.WAFER_TICKETS_SECRET: 76 raise PermissionDenied('Incorrect secret') 77 78 payload = json.load(request) 79 for ticket in payload['tickets']: 80 import_ticket(ticket['barcode'], ticket['ticket_type'], 81 ticket['attendee_email']) 82 83 return HttpResponse("Noted\n", content_type='text/plain') 84 85 86 def import_ticket(ticket_barcode, ticket_type, email): 87 if Ticket.objects.filter(barcode=ticket_barcode).exists(): 88 log.debug('Ticket already registered: %s', ticket_barcode) 89 return 90 91 # truncate long ticket type names to length allowed by database 92 ticket_type = ticket_type[:TicketType.MAX_NAME_LENGTH] 93 type_, created = TicketType.objects.get_or_create(name=ticket_type) 94 95 UserModel = get_user_model() 96 97 try: 98 user = UserModel.objects.get(email=email, ticket=None) 99 except UserModel.DoesNotExist: 100 user = None 101 except UserModel.MultipleObjectsReturned: 102 # We're can't uniquely identify the user to associate this ticket 103 # with, so leave it for them to figure out via the 'claim ticket' 104 # interface 105 user = None 106 107 ticket = Ticket.objects.create( 108 barcode=ticket_barcode, 109 email=email, 110 type=type_, 111 user=user, 112 ) 113 ticket.save() 114 115 if user: 116 log.info('Ticket registered: %s and linked to user', ticket) 117 else: 118 log.info('Ticket registered: %s. Unclaimed', ticket) 119 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/wafer/tickets/views.py b/wafer/tickets/views.py --- a/wafer/tickets/views.py +++ b/wafer/tickets/views.py @@ -75,7 +75,8 @@ if request.GET.get('secret') != settings.WAFER_TICKETS_SECRET: raise PermissionDenied('Incorrect secret') - payload = json.load(request) + # This is required for python 3, and in theory fine on python 2 + payload = json.loads(request.body.decode('utf8')) for ticket in payload['tickets']: import_ticket(ticket['barcode'], ticket['ticket_type'], ticket['attendee_email'])
{"golden_diff": "diff --git a/wafer/tickets/views.py b/wafer/tickets/views.py\n--- a/wafer/tickets/views.py\n+++ b/wafer/tickets/views.py\n@@ -75,7 +75,8 @@\n if request.GET.get('secret') != settings.WAFER_TICKETS_SECRET:\n raise PermissionDenied('Incorrect secret')\n \n- payload = json.load(request)\n+ # This is required for python 3, and in theory fine on python 2\n+ payload = json.loads(request.body.decode('utf8'))\n for ticket in payload['tickets']:\n import_ticket(ticket['barcode'], ticket['ticket_type'],\n ticket['attendee_email'])\n", "issue": "tickets should be decoded on python 3\nAs seen from the recent quicket hook posts\n\nTypeError at /tickets/quicket_hook/\nthe JSON object must be str, not 'bytes'\n\n", "before_files": [{"content": "import json\nimport logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.core.exceptions import PermissionDenied, ValidationError\nfrom django.core.urlresolvers import reverse\nfrom django.http import HttpResponse, Http404\nfrom django.views.decorators.csrf import csrf_exempt\nfrom django.views.decorators.http import require_POST\nfrom django.views.generic.edit import FormView\n\nfrom wafer.tickets.models import Ticket, TicketType\nfrom wafer.tickets.forms import TicketForm\n\nlog = logging.getLogger(__name__)\n\n\nclass ClaimView(FormView):\n template_name = 'wafer.tickets/claim.html'\n form_class = TicketForm\n\n def get_context_data(self, **kwargs):\n context = super(ClaimView, self).get_context_data(**kwargs)\n context['can_claim'] = self.can_claim()\n return context\n\n def can_claim(self):\n if settings.WAFER_REGISTRATION_MODE != 'ticket':\n raise Http404('Ticket-based registration is not in use')\n if not settings.WAFER_REGISTRATION_OPEN:\n return False\n return not self.request.user.userprofile.is_registered()\n\n def form_valid(self, form):\n if not self.can_claim():\n raise ValidationError('User may not claim a ticket')\n ticket = Ticket.objects.get(barcode=form.cleaned_data['barcode'])\n ticket.user = self.request.user\n ticket.save()\n return super(ClaimView, self).form_valid(form)\n\n def get_success_url(self):\n return reverse(\n 'wafer_user_profile', args=(self.request.user.username,))\n\n\n@csrf_exempt\n@require_POST\ndef quicket_hook(request):\n '''\n Quicket.co.za can POST something like this when tickets are bought:\n {\n \"reference\": \"REF00123456\",\n \"event_id\": 123,\n \"event_name\": \"My Big Event\",\n \"amount\": 0.00,\n \"email\": \"[email protected]\",\n \"action\": \"checkout_started\",\n // Options are \"checkout_started\",\"checkout_cancelled\",\"eft_pending\",\n // \"checkout_completed\"\n \"tickets\": [\n {\n \"id\": 122,\n \"attendee_name\": \"\",\n \"attendee_email\": \"\",\n \"ticket_type\": \"Free Ticket\",\n \"price\": 0.00,\n \"barcode\": 12345,\n },\n ...\n ],\n }\n '''\n if request.GET.get('secret') != settings.WAFER_TICKETS_SECRET:\n raise PermissionDenied('Incorrect secret')\n\n payload = json.load(request)\n for ticket in payload['tickets']:\n import_ticket(ticket['barcode'], ticket['ticket_type'],\n ticket['attendee_email'])\n\n return HttpResponse(\"Noted\\n\", content_type='text/plain')\n\n\ndef import_ticket(ticket_barcode, ticket_type, email):\n if Ticket.objects.filter(barcode=ticket_barcode).exists():\n log.debug('Ticket already registered: %s', ticket_barcode)\n return\n\n # truncate long ticket type names to length allowed by database\n ticket_type = ticket_type[:TicketType.MAX_NAME_LENGTH]\n type_, created = 
TicketType.objects.get_or_create(name=ticket_type)\n\n UserModel = get_user_model()\n\n try:\n user = UserModel.objects.get(email=email, ticket=None)\n except UserModel.DoesNotExist:\n user = None\n except UserModel.MultipleObjectsReturned:\n # We're can't uniquely identify the user to associate this ticket\n # with, so leave it for them to figure out via the 'claim ticket'\n # interface\n user = None\n\n ticket = Ticket.objects.create(\n barcode=ticket_barcode,\n email=email,\n type=type_,\n user=user,\n )\n ticket.save()\n\n if user:\n log.info('Ticket registered: %s and linked to user', ticket)\n else:\n log.info('Ticket registered: %s. Unclaimed', ticket)\n", "path": "wafer/tickets/views.py"}], "after_files": [{"content": "import json\nimport logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.core.exceptions import PermissionDenied, ValidationError\nfrom django.core.urlresolvers import reverse\nfrom django.http import HttpResponse, Http404\nfrom django.views.decorators.csrf import csrf_exempt\nfrom django.views.decorators.http import require_POST\nfrom django.views.generic.edit import FormView\n\nfrom wafer.tickets.models import Ticket, TicketType\nfrom wafer.tickets.forms import TicketForm\n\nlog = logging.getLogger(__name__)\n\n\nclass ClaimView(FormView):\n template_name = 'wafer.tickets/claim.html'\n form_class = TicketForm\n\n def get_context_data(self, **kwargs):\n context = super(ClaimView, self).get_context_data(**kwargs)\n context['can_claim'] = self.can_claim()\n return context\n\n def can_claim(self):\n if settings.WAFER_REGISTRATION_MODE != 'ticket':\n raise Http404('Ticket-based registration is not in use')\n if not settings.WAFER_REGISTRATION_OPEN:\n return False\n return not self.request.user.userprofile.is_registered()\n\n def form_valid(self, form):\n if not self.can_claim():\n raise ValidationError('User may not claim a ticket')\n ticket = Ticket.objects.get(barcode=form.cleaned_data['barcode'])\n ticket.user = self.request.user\n ticket.save()\n return super(ClaimView, self).form_valid(form)\n\n def get_success_url(self):\n return reverse(\n 'wafer_user_profile', args=(self.request.user.username,))\n\n\n@csrf_exempt\n@require_POST\ndef quicket_hook(request):\n '''\n Quicket.co.za can POST something like this when tickets are bought:\n {\n \"reference\": \"REF00123456\",\n \"event_id\": 123,\n \"event_name\": \"My Big Event\",\n \"amount\": 0.00,\n \"email\": \"[email protected]\",\n \"action\": \"checkout_started\",\n // Options are \"checkout_started\",\"checkout_cancelled\",\"eft_pending\",\n // \"checkout_completed\"\n \"tickets\": [\n {\n \"id\": 122,\n \"attendee_name\": \"\",\n \"attendee_email\": \"\",\n \"ticket_type\": \"Free Ticket\",\n \"price\": 0.00,\n \"barcode\": 12345,\n },\n ...\n ],\n }\n '''\n if request.GET.get('secret') != settings.WAFER_TICKETS_SECRET:\n raise PermissionDenied('Incorrect secret')\n\n # This is required for python 3, and in theory fine on python 2\n payload = json.loads(request.body.decode('utf8'))\n for ticket in payload['tickets']:\n import_ticket(ticket['barcode'], ticket['ticket_type'],\n ticket['attendee_email'])\n\n return HttpResponse(\"Noted\\n\", content_type='text/plain')\n\n\ndef import_ticket(ticket_barcode, ticket_type, email):\n if Ticket.objects.filter(barcode=ticket_barcode).exists():\n log.debug('Ticket already registered: %s', ticket_barcode)\n return\n\n # truncate long ticket type names to length allowed by database\n ticket_type = 
ticket_type[:TicketType.MAX_NAME_LENGTH]\n type_, created = TicketType.objects.get_or_create(name=ticket_type)\n\n UserModel = get_user_model()\n\n try:\n user = UserModel.objects.get(email=email, ticket=None)\n except UserModel.DoesNotExist:\n user = None\n except UserModel.MultipleObjectsReturned:\n # We're can't uniquely identify the user to associate this ticket\n # with, so leave it for them to figure out via the 'claim ticket'\n # interface\n user = None\n\n ticket = Ticket.objects.create(\n barcode=ticket_barcode,\n email=email,\n type=type_,\n user=user,\n )\n ticket.save()\n\n if user:\n log.info('Ticket registered: %s and linked to user', ticket)\n else:\n log.info('Ticket registered: %s. Unclaimed', ticket)\n", "path": "wafer/tickets/views.py"}]}
1390
146
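The one-line fix in this record replaces `json.load(request)` with `json.loads(request.body.decode('utf8'))`. A Django request body is bytes, and before Python 3.6 `json.loads()` refused bytes input with exactly the TypeError quoted in the issue; decoding explicitly works on both Python 2 and 3. A standalone sketch using `io.BytesIO` as a stand-in for the request, so no Django is required:

```python
import io
import json

# Stand-in for a Django HttpRequest: a file-like object whose read()
# returns bytes, like request.body does.
fake_request = io.BytesIO(
    b'{"tickets": [{"barcode": 12345, "ticket_type": "Free Ticket"}]}'
)

# The patched approach: decode the raw bytes first, then parse. This is
# version-independent (json.loads only began accepting bytes in Python 3.6).
payload = json.loads(fake_request.read().decode("utf8"))
assert payload["tickets"][0]["barcode"] == 12345
```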
gh_patches_debug_3467
rasdani/github-patches
git_diff
getmoto__moto-1739
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [SES] Does not properly verify mailbox with display name https://tools.ietf.org/html/rfc2822.html#section-3.4 defines two forms of valid mailbox: * `[email protected]` * `"Foo Bar" <[email protected]>` SES supports both of these forms. Per https://github.com/spulec/moto/blob/master/moto/ses/models.py#L55, only the first form is supported by moto. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `moto/ses/models.py` Content: ``` 1 from __future__ import unicode_literals 2 3 import email 4 from email.utils import parseaddr 5 6 from moto.core import BaseBackend, BaseModel 7 from .exceptions import MessageRejectedError 8 from .utils import get_random_message_id 9 10 11 RECIPIENT_LIMIT = 50 12 13 14 class Message(BaseModel): 15 16 def __init__(self, message_id, source, subject, body, destinations): 17 self.id = message_id 18 self.source = source 19 self.subject = subject 20 self.body = body 21 self.destinations = destinations 22 23 24 class RawMessage(BaseModel): 25 26 def __init__(self, message_id, source, destinations, raw_data): 27 self.id = message_id 28 self.source = source 29 self.destinations = destinations 30 self.raw_data = raw_data 31 32 33 class SESQuota(BaseModel): 34 35 def __init__(self, sent): 36 self.sent = sent 37 38 @property 39 def sent_past_24(self): 40 return self.sent 41 42 43 class SESBackend(BaseBackend): 44 45 def __init__(self): 46 self.addresses = [] 47 self.email_addresses = [] 48 self.domains = [] 49 self.sent_messages = [] 50 self.sent_message_count = 0 51 52 def _is_verified_address(self, address): 53 if address in self.addresses: 54 return True 55 user, host = address.split('@', 1) 56 return host in self.domains 57 58 def verify_email_identity(self, address): 59 self.addresses.append(address) 60 61 def verify_email_address(self, address): 62 self.email_addresses.append(address) 63 64 def verify_domain(self, domain): 65 self.domains.append(domain) 66 67 def list_identities(self): 68 return self.domains + self.addresses 69 70 def list_verified_email_addresses(self): 71 return self.email_addresses 72 73 def delete_identity(self, identity): 74 if '@' in identity: 75 self.addresses.remove(identity) 76 else: 77 self.domains.remove(identity) 78 79 def send_email(self, source, subject, body, destinations): 80 recipient_count = sum(map(len, destinations.values())) 81 if recipient_count > RECIPIENT_LIMIT: 82 raise MessageRejectedError('Too many recipients.') 83 if not self._is_verified_address(source): 84 raise MessageRejectedError( 85 "Email address not verified %s" % source 86 ) 87 88 message_id = get_random_message_id() 89 message = Message(message_id, source, subject, body, destinations) 90 self.sent_messages.append(message) 91 self.sent_message_count += recipient_count 92 return message 93 94 def send_raw_email(self, source, destinations, raw_data): 95 if source is not None: 96 _, source_email_address = parseaddr(source) 97 if source_email_address not in self.addresses: 98 raise MessageRejectedError( 99 "Did not have authority to send from email %s" % source_email_address 100 ) 101 102 recipient_count = len(destinations) 103 message = email.message_from_string(raw_data) 104 if source is None: 105 if message['from'] is None: 106 raise MessageRejectedError( 107 "Source not specified" 108 ) 109 110 _, source_email_address = parseaddr(message['from']) 111 if 
source_email_address not in self.addresses: 112 raise MessageRejectedError( 113 "Did not have authority to send from email %s" % source_email_address 114 ) 115 116 for header in 'TO', 'CC', 'BCC': 117 recipient_count += sum( 118 d.strip() and 1 or 0 119 for d in message.get(header, '').split(',') 120 ) 121 if recipient_count > RECIPIENT_LIMIT: 122 raise MessageRejectedError('Too many recipients.') 123 124 self.sent_message_count += recipient_count 125 message_id = get_random_message_id() 126 message = RawMessage(message_id, source, destinations, raw_data) 127 self.sent_messages.append(message) 128 return message 129 130 def get_send_quota(self): 131 return SESQuota(self.sent_message_count) 132 133 134 ses_backend = SESBackend() 135 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/moto/ses/models.py b/moto/ses/models.py --- a/moto/ses/models.py +++ b/moto/ses/models.py @@ -49,7 +49,8 @@ self.sent_messages = [] self.sent_message_count = 0 - def _is_verified_address(self, address): + def _is_verified_address(self, source): + _, address = parseaddr(source) if address in self.addresses: return True user, host = address.split('@', 1)
{"golden_diff": "diff --git a/moto/ses/models.py b/moto/ses/models.py\n--- a/moto/ses/models.py\n+++ b/moto/ses/models.py\n@@ -49,7 +49,8 @@\n self.sent_messages = []\n self.sent_message_count = 0\n \n- def _is_verified_address(self, address):\n+ def _is_verified_address(self, source):\n+ _, address = parseaddr(source)\n if address in self.addresses:\n return True\n user, host = address.split('@', 1)\n", "issue": "[SES] Does not properly verify mailbox with display name\nhttps://tools.ietf.org/html/rfc2822.html#section-3.4 defines two forms of valid mailbox:\r\n\r\n* `[email protected]`\r\n* `\"Foo Bar\" <[email protected]>`\r\n\r\nSES supports both of these forms. Per https://github.com/spulec/moto/blob/master/moto/ses/models.py#L55, only the first form is supported by moto.\n", "before_files": [{"content": "from __future__ import unicode_literals\n\nimport email\nfrom email.utils import parseaddr\n\nfrom moto.core import BaseBackend, BaseModel\nfrom .exceptions import MessageRejectedError\nfrom .utils import get_random_message_id\n\n\nRECIPIENT_LIMIT = 50\n\n\nclass Message(BaseModel):\n\n def __init__(self, message_id, source, subject, body, destinations):\n self.id = message_id\n self.source = source\n self.subject = subject\n self.body = body\n self.destinations = destinations\n\n\nclass RawMessage(BaseModel):\n\n def __init__(self, message_id, source, destinations, raw_data):\n self.id = message_id\n self.source = source\n self.destinations = destinations\n self.raw_data = raw_data\n\n\nclass SESQuota(BaseModel):\n\n def __init__(self, sent):\n self.sent = sent\n\n @property\n def sent_past_24(self):\n return self.sent\n\n\nclass SESBackend(BaseBackend):\n\n def __init__(self):\n self.addresses = []\n self.email_addresses = []\n self.domains = []\n self.sent_messages = []\n self.sent_message_count = 0\n\n def _is_verified_address(self, address):\n if address in self.addresses:\n return True\n user, host = address.split('@', 1)\n return host in self.domains\n\n def verify_email_identity(self, address):\n self.addresses.append(address)\n\n def verify_email_address(self, address):\n self.email_addresses.append(address)\n\n def verify_domain(self, domain):\n self.domains.append(domain)\n\n def list_identities(self):\n return self.domains + self.addresses\n\n def list_verified_email_addresses(self):\n return self.email_addresses\n\n def delete_identity(self, identity):\n if '@' in identity:\n self.addresses.remove(identity)\n else:\n self.domains.remove(identity)\n\n def send_email(self, source, subject, body, destinations):\n recipient_count = sum(map(len, destinations.values()))\n if recipient_count > RECIPIENT_LIMIT:\n raise MessageRejectedError('Too many recipients.')\n if not self._is_verified_address(source):\n raise MessageRejectedError(\n \"Email address not verified %s\" % source\n )\n\n message_id = get_random_message_id()\n message = Message(message_id, source, subject, body, destinations)\n self.sent_messages.append(message)\n self.sent_message_count += recipient_count\n return message\n\n def send_raw_email(self, source, destinations, raw_data):\n if source is not None:\n _, source_email_address = parseaddr(source)\n if source_email_address not in self.addresses:\n raise MessageRejectedError(\n \"Did not have authority to send from email %s\" % source_email_address\n )\n\n recipient_count = len(destinations)\n message = email.message_from_string(raw_data)\n if source is None:\n if message['from'] is None:\n raise MessageRejectedError(\n \"Source not specified\"\n )\n\n _, 
source_email_address = parseaddr(message['from'])\n if source_email_address not in self.addresses:\n raise MessageRejectedError(\n \"Did not have authority to send from email %s\" % source_email_address\n )\n\n for header in 'TO', 'CC', 'BCC':\n recipient_count += sum(\n d.strip() and 1 or 0\n for d in message.get(header, '').split(',')\n )\n if recipient_count > RECIPIENT_LIMIT:\n raise MessageRejectedError('Too many recipients.')\n\n self.sent_message_count += recipient_count\n message_id = get_random_message_id()\n message = RawMessage(message_id, source, destinations, raw_data)\n self.sent_messages.append(message)\n return message\n\n def get_send_quota(self):\n return SESQuota(self.sent_message_count)\n\n\nses_backend = SESBackend()\n", "path": "moto/ses/models.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nimport email\nfrom email.utils import parseaddr\n\nfrom moto.core import BaseBackend, BaseModel\nfrom .exceptions import MessageRejectedError\nfrom .utils import get_random_message_id\n\n\nRECIPIENT_LIMIT = 50\n\n\nclass Message(BaseModel):\n\n def __init__(self, message_id, source, subject, body, destinations):\n self.id = message_id\n self.source = source\n self.subject = subject\n self.body = body\n self.destinations = destinations\n\n\nclass RawMessage(BaseModel):\n\n def __init__(self, message_id, source, destinations, raw_data):\n self.id = message_id\n self.source = source\n self.destinations = destinations\n self.raw_data = raw_data\n\n\nclass SESQuota(BaseModel):\n\n def __init__(self, sent):\n self.sent = sent\n\n @property\n def sent_past_24(self):\n return self.sent\n\n\nclass SESBackend(BaseBackend):\n\n def __init__(self):\n self.addresses = []\n self.email_addresses = []\n self.domains = []\n self.sent_messages = []\n self.sent_message_count = 0\n\n def _is_verified_address(self, source):\n _, address = parseaddr(source)\n if address in self.addresses:\n return True\n user, host = address.split('@', 1)\n return host in self.domains\n\n def verify_email_identity(self, address):\n self.addresses.append(address)\n\n def verify_email_address(self, address):\n self.email_addresses.append(address)\n\n def verify_domain(self, domain):\n self.domains.append(domain)\n\n def list_identities(self):\n return self.domains + self.addresses\n\n def list_verified_email_addresses(self):\n return self.email_addresses\n\n def delete_identity(self, identity):\n if '@' in identity:\n self.addresses.remove(identity)\n else:\n self.domains.remove(identity)\n\n def send_email(self, source, subject, body, destinations):\n recipient_count = sum(map(len, destinations.values()))\n if recipient_count > RECIPIENT_LIMIT:\n raise MessageRejectedError('Too many recipients.')\n if not self._is_verified_address(source):\n raise MessageRejectedError(\n \"Email address not verified %s\" % source\n )\n\n message_id = get_random_message_id()\n message = Message(message_id, source, subject, body, destinations)\n self.sent_messages.append(message)\n self.sent_message_count += recipient_count\n return message\n\n def send_raw_email(self, source, destinations, raw_data):\n if source is not None:\n _, source_email_address = parseaddr(source)\n if source_email_address not in self.addresses:\n raise MessageRejectedError(\n \"Did not have authority to send from email %s\" % source_email_address\n )\n\n recipient_count = len(destinations)\n message = email.message_from_string(raw_data)\n if source is None:\n if message['from'] is None:\n raise MessageRejectedError(\n \"Source not 
specified\"\n )\n\n _, source_email_address = parseaddr(message['from'])\n if source_email_address not in self.addresses:\n raise MessageRejectedError(\n \"Did not have authority to send from email %s\" % source_email_address\n )\n\n for header in 'TO', 'CC', 'BCC':\n recipient_count += sum(\n d.strip() and 1 or 0\n for d in message.get(header, '').split(',')\n )\n if recipient_count > RECIPIENT_LIMIT:\n raise MessageRejectedError('Too many recipients.')\n\n self.sent_message_count += recipient_count\n message_id = get_random_message_id()\n message = RawMessage(message_id, source, destinations, raw_data)\n self.sent_messages.append(message)\n return message\n\n def get_send_quota(self):\n return SESQuota(self.sent_message_count)\n\n\nses_backend = SESBackend()\n", "path": "moto/ses/models.py"}]}
1499
118
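The fix in this record runs the source through `email.utils.parseaddr()` before the verified-address check, so both RFC 2822 mailbox forms cited in the issue are handled. A standalone sketch; the addresses are made-up examples:

```python
from email.utils import parseaddr

# parseaddr() splits a mailbox into (display name, addr-spec); the patched
# _is_verified_address() compares only the addr-spec, so a display name
# such as "Foo Bar" <...> no longer defeats the verification lookup.
for source in ("foo@example.com", '"Foo Bar" <foo@example.com>'):
    display_name, address = parseaddr(source)
    assert address == "foo@example.com"

print(parseaddr('"Foo Bar" <foo@example.com>'))  # ('Foo Bar', 'foo@example.com')
```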
gh_patches_debug_31263
rasdani/github-patches
git_diff
svthalia__concrexit-1749
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- FieldError: Cannot resolve keyword 'pizza_event' into field. Choices are: food_event, food_event_id, id, memb... Sentry Issue: [CONCREXIT-6G](https://sentry.io/organizations/thalia/issues/2468308255/?referrer=github_integration) ``` FieldError: Cannot resolve keyword 'pizza_event' into field. Choices are: food_event, food_event_id, id, member, member_id, name, payment, payment_id, product, product_id (9 additional frame(s) were not displayed) ... File "django/db/models/sql/query.py", line 1391, in add_q clause, _ = self._add_q(q_object, self.used_aliases) File "django/db/models/sql/query.py", line 1410, in _add_q child_clause, needed_inner = self.build_filter( File "django/db/models/sql/query.py", line 1284, in build_filter lookups, parts, reffed_expression = self.solve_lookup_type(arg) File "django/db/models/sql/query.py", line 1110, in solve_lookup_type _, field, _, lookup_parts = self.names_to_path(lookup_splitted, self.get_meta()) File "django/db/models/sql/query.py", line 1537, in names_to_path raise FieldError("Cannot resolve keyword '%s' into field. " ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `website/pizzas/services.py` Content: ``` 1 from events.services import is_organiser 2 from .models import Product, FoodOrder, FoodEvent 3 4 5 def gen_stats_pizza_orders(): 6 """Generate statistics about number of orders per product. 7 8 :return: Dict with key, value being resp. name, order count of a product. 9 """ 10 total = {} 11 12 for product in Product.objects.all(): 13 total.update( 14 {product.name: FoodOrder.objects.filter(product=product).count(),} 15 ) 16 17 return { 18 i[0]: i[1] 19 for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5] 20 if i[1] > 0 21 } 22 23 24 def gen_stats_current_pizza_orders(): 25 """Generate statistics about number of orders per product of the active pizza event. 26 27 :return: Dict with key, value being resp. name, order count of a product. 28 """ 29 total = {} 30 31 current_pizza_event = FoodEvent.current() 32 if not current_pizza_event: 33 return None 34 35 for product in Product.objects.filter(): 36 total.update( 37 { 38 product.name: FoodOrder.objects.filter( 39 product=product, pizza_event=current_pizza_event, 40 ).count(), 41 } 42 ) 43 44 return { 45 i[0]: i[1] 46 for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5] 47 if i[1] > 0 48 } 49 50 51 def can_change_order(member, food_event): 52 """Determine if a certain member can edit orders of an event. 
53 54 :param member: Member who wants to change and order 55 :param food_event: The event for which we want to change an order 56 :return: True if we can change an order else False 57 """ 58 return ( 59 food_event 60 and member.has_perm("pizzas.change_foodorder") 61 and is_organiser(member, food_event.event) 62 ) 63 ``` Path: `website/pizzas/views.py` Content: ``` 1 """Views provided by the pizzas package.""" 2 from django.contrib import messages 3 from django.contrib.auth.decorators import login_required 4 from django.http import Http404 5 from django.shortcuts import get_object_or_404, render, redirect 6 from django.utils.translation import gettext_lazy as _ 7 from django.views.decorators.http import require_http_methods 8 9 from .models import FoodOrder, FoodEvent, Product 10 11 12 @login_required 13 def index(request): 14 """Overview of user order for a pizza event.""" 15 products = Product.available_products.order_by("name") 16 if not request.user.has_perm("pizzas.order_restricted_products"): 17 products = products.exclude(restricted=True) 18 event = FoodEvent.current() 19 try: 20 obj = FoodOrder.objects.get(pizza_event=event, member=request.member) 21 except FoodOrder.DoesNotExist: 22 obj = None 23 context = {"event": event, "products": products, "order": obj} 24 return render(request, "pizzas/index.html", context) 25 26 27 @require_http_methods(["POST"]) 28 def cancel_order(request): 29 """View that cancels a user's order.""" 30 if "order" in request.POST: 31 try: 32 order = get_object_or_404(FoodOrder, pk=int(request.POST["order"])) 33 if not order.can_be_changed: 34 messages.error(request, _("You can no longer cancel.")) 35 elif order.member == request.member: 36 order.delete() 37 messages.success(request, _("Your order has been cancelled.")) 38 except Http404: 39 messages.error(request, _("Your order could not be found.")) 40 return redirect("pizzas:index") 41 42 43 @login_required 44 def place_order(request): 45 """View that shows the detail of the current order.""" 46 event = FoodEvent.current() 47 if not event: 48 return redirect("pizzas:index") 49 50 try: 51 obj = FoodOrder.objects.get(pizza_event=event, member=request.member) 52 current_order_locked = not obj.can_be_changed 53 except FoodOrder.DoesNotExist: 54 obj = None 55 current_order_locked = False 56 57 if "product" in request.POST and not current_order_locked: 58 productset = Product.available_products.all() 59 if not request.user.has_perm("pizzas.order_restricted_products"): 60 productset = productset.exclude(restricted=True) 61 try: 62 product = productset.get(pk=int(request.POST["product"])) 63 except Product.DoesNotExist as e: 64 raise Http404("Pizza does not exist") from e 65 if not obj: 66 obj = FoodOrder(pizza_event=event, member=request.member) 67 obj.product = product 68 obj.save() 69 return redirect("pizzas:index") 70 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/website/pizzas/services.py b/website/pizzas/services.py --- a/website/pizzas/services.py +++ b/website/pizzas/services.py @@ -36,7 +36,7 @@ total.update( { product.name: FoodOrder.objects.filter( - product=product, pizza_event=current_pizza_event, + product=product, food_event=current_pizza_event, ).count(), } ) diff --git a/website/pizzas/views.py b/website/pizzas/views.py --- a/website/pizzas/views.py +++ b/website/pizzas/views.py @@ -17,7 +17,7 @@ products = products.exclude(restricted=True) event = FoodEvent.current() try: - obj = FoodOrder.objects.get(pizza_event=event, member=request.member) + obj = FoodOrder.objects.get(food_event=event, member=request.member) except FoodOrder.DoesNotExist: obj = None context = {"event": event, "products": products, "order": obj} @@ -48,7 +48,7 @@ return redirect("pizzas:index") try: - obj = FoodOrder.objects.get(pizza_event=event, member=request.member) + obj = FoodOrder.objects.get(food_event=event, member=request.member) current_order_locked = not obj.can_be_changed except FoodOrder.DoesNotExist: obj = None @@ -63,7 +63,7 @@ except Product.DoesNotExist as e: raise Http404("Pizza does not exist") from e if not obj: - obj = FoodOrder(pizza_event=event, member=request.member) + obj = FoodOrder(food_event=event, member=request.member) obj.product = product obj.save() return redirect("pizzas:index")
{"golden_diff": "diff --git a/website/pizzas/services.py b/website/pizzas/services.py\n--- a/website/pizzas/services.py\n+++ b/website/pizzas/services.py\n@@ -36,7 +36,7 @@\n total.update(\n {\n product.name: FoodOrder.objects.filter(\n- product=product, pizza_event=current_pizza_event,\n+ product=product, food_event=current_pizza_event,\n ).count(),\n }\n )\ndiff --git a/website/pizzas/views.py b/website/pizzas/views.py\n--- a/website/pizzas/views.py\n+++ b/website/pizzas/views.py\n@@ -17,7 +17,7 @@\n products = products.exclude(restricted=True)\n event = FoodEvent.current()\n try:\n- obj = FoodOrder.objects.get(pizza_event=event, member=request.member)\n+ obj = FoodOrder.objects.get(food_event=event, member=request.member)\n except FoodOrder.DoesNotExist:\n obj = None\n context = {\"event\": event, \"products\": products, \"order\": obj}\n@@ -48,7 +48,7 @@\n return redirect(\"pizzas:index\")\n \n try:\n- obj = FoodOrder.objects.get(pizza_event=event, member=request.member)\n+ obj = FoodOrder.objects.get(food_event=event, member=request.member)\n current_order_locked = not obj.can_be_changed\n except FoodOrder.DoesNotExist:\n obj = None\n@@ -63,7 +63,7 @@\n except Product.DoesNotExist as e:\n raise Http404(\"Pizza does not exist\") from e\n if not obj:\n- obj = FoodOrder(pizza_event=event, member=request.member)\n+ obj = FoodOrder(food_event=event, member=request.member)\n obj.product = product\n obj.save()\n return redirect(\"pizzas:index\")\n", "issue": "FieldError: Cannot resolve keyword 'pizza_event' into field. Choices are: food_event, food_event_id, id, memb...\nSentry Issue: [CONCREXIT-6G](https://sentry.io/organizations/thalia/issues/2468308255/?referrer=github_integration)\n\n```\nFieldError: Cannot resolve keyword 'pizza_event' into field. Choices are: food_event, food_event_id, id, member, member_id, name, payment, payment_id, product, product_id\n(9 additional frame(s) were not displayed)\n...\n File \"django/db/models/sql/query.py\", line 1391, in add_q\n clause, _ = self._add_q(q_object, self.used_aliases)\n File \"django/db/models/sql/query.py\", line 1410, in _add_q\n child_clause, needed_inner = self.build_filter(\n File \"django/db/models/sql/query.py\", line 1284, in build_filter\n lookups, parts, reffed_expression = self.solve_lookup_type(arg)\n File \"django/db/models/sql/query.py\", line 1110, in solve_lookup_type\n _, field, _, lookup_parts = self.names_to_path(lookup_splitted, self.get_meta())\n File \"django/db/models/sql/query.py\", line 1537, in names_to_path\n raise FieldError(\"Cannot resolve keyword '%s' into field. \"\n```\n", "before_files": [{"content": "from events.services import is_organiser\nfrom .models import Product, FoodOrder, FoodEvent\n\n\ndef gen_stats_pizza_orders():\n \"\"\"Generate statistics about number of orders per product.\n\n :return: Dict with key, value being resp. name, order count of a product.\n \"\"\"\n total = {}\n\n for product in Product.objects.all():\n total.update(\n {product.name: FoodOrder.objects.filter(product=product).count(),}\n )\n\n return {\n i[0]: i[1]\n for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5]\n if i[1] > 0\n }\n\n\ndef gen_stats_current_pizza_orders():\n \"\"\"Generate statistics about number of orders per product of the active pizza event.\n\n :return: Dict with key, value being resp. 
name, order count of a product.\n \"\"\"\n total = {}\n\n current_pizza_event = FoodEvent.current()\n if not current_pizza_event:\n return None\n\n for product in Product.objects.filter():\n total.update(\n {\n product.name: FoodOrder.objects.filter(\n product=product, pizza_event=current_pizza_event,\n ).count(),\n }\n )\n\n return {\n i[0]: i[1]\n for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5]\n if i[1] > 0\n }\n\n\ndef can_change_order(member, food_event):\n \"\"\"Determine if a certain member can edit orders of an event.\n\n :param member: Member who wants to change and order\n :param food_event: The event for which we want to change an order\n :return: True if we can change an order else False\n \"\"\"\n return (\n food_event\n and member.has_perm(\"pizzas.change_foodorder\")\n and is_organiser(member, food_event.event)\n )\n", "path": "website/pizzas/services.py"}, {"content": "\"\"\"Views provided by the pizzas package.\"\"\"\nfrom django.contrib import messages\nfrom django.contrib.auth.decorators import login_required\nfrom django.http import Http404\nfrom django.shortcuts import get_object_or_404, render, redirect\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views.decorators.http import require_http_methods\n\nfrom .models import FoodOrder, FoodEvent, Product\n\n\n@login_required\ndef index(request):\n \"\"\"Overview of user order for a pizza event.\"\"\"\n products = Product.available_products.order_by(\"name\")\n if not request.user.has_perm(\"pizzas.order_restricted_products\"):\n products = products.exclude(restricted=True)\n event = FoodEvent.current()\n try:\n obj = FoodOrder.objects.get(pizza_event=event, member=request.member)\n except FoodOrder.DoesNotExist:\n obj = None\n context = {\"event\": event, \"products\": products, \"order\": obj}\n return render(request, \"pizzas/index.html\", context)\n\n\n@require_http_methods([\"POST\"])\ndef cancel_order(request):\n \"\"\"View that cancels a user's order.\"\"\"\n if \"order\" in request.POST:\n try:\n order = get_object_or_404(FoodOrder, pk=int(request.POST[\"order\"]))\n if not order.can_be_changed:\n messages.error(request, _(\"You can no longer cancel.\"))\n elif order.member == request.member:\n order.delete()\n messages.success(request, _(\"Your order has been cancelled.\"))\n except Http404:\n messages.error(request, _(\"Your order could not be found.\"))\n return redirect(\"pizzas:index\")\n\n\n@login_required\ndef place_order(request):\n \"\"\"View that shows the detail of the current order.\"\"\"\n event = FoodEvent.current()\n if not event:\n return redirect(\"pizzas:index\")\n\n try:\n obj = FoodOrder.objects.get(pizza_event=event, member=request.member)\n current_order_locked = not obj.can_be_changed\n except FoodOrder.DoesNotExist:\n obj = None\n current_order_locked = False\n\n if \"product\" in request.POST and not current_order_locked:\n productset = Product.available_products.all()\n if not request.user.has_perm(\"pizzas.order_restricted_products\"):\n productset = productset.exclude(restricted=True)\n try:\n product = productset.get(pk=int(request.POST[\"product\"]))\n except Product.DoesNotExist as e:\n raise Http404(\"Pizza does not exist\") from e\n if not obj:\n obj = FoodOrder(pizza_event=event, member=request.member)\n obj.product = product\n obj.save()\n return redirect(\"pizzas:index\")\n", "path": "website/pizzas/views.py"}], "after_files": [{"content": "from events.services import is_organiser\nfrom .models import Product, FoodOrder, 
FoodEvent\n\n\ndef gen_stats_pizza_orders():\n \"\"\"Generate statistics about number of orders per product.\n\n :return: Dict with key, value being resp. name, order count of a product.\n \"\"\"\n total = {}\n\n for product in Product.objects.all():\n total.update(\n {product.name: FoodOrder.objects.filter(product=product).count(),}\n )\n\n return {\n i[0]: i[1]\n for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5]\n if i[1] > 0\n }\n\n\ndef gen_stats_current_pizza_orders():\n \"\"\"Generate statistics about number of orders per product of the active pizza event.\n\n :return: Dict with key, value being resp. name, order count of a product.\n \"\"\"\n total = {}\n\n current_pizza_event = FoodEvent.current()\n if not current_pizza_event:\n return None\n\n for product in Product.objects.filter():\n total.update(\n {\n product.name: FoodOrder.objects.filter(\n product=product, food_event=current_pizza_event,\n ).count(),\n }\n )\n\n return {\n i[0]: i[1]\n for i in sorted(total.items(), key=lambda x: x[1], reverse=True)[:5]\n if i[1] > 0\n }\n\n\ndef can_change_order(member, food_event):\n \"\"\"Determine if a certain member can edit orders of an event.\n\n :param member: Member who wants to change and order\n :param food_event: The event for which we want to change an order\n :return: True if we can change an order else False\n \"\"\"\n return (\n food_event\n and member.has_perm(\"pizzas.change_foodorder\")\n and is_organiser(member, food_event.event)\n )\n", "path": "website/pizzas/services.py"}, {"content": "\"\"\"Views provided by the pizzas package.\"\"\"\nfrom django.contrib import messages\nfrom django.contrib.auth.decorators import login_required\nfrom django.http import Http404\nfrom django.shortcuts import get_object_or_404, render, redirect\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views.decorators.http import require_http_methods\n\nfrom .models import FoodOrder, FoodEvent, Product\n\n\n@login_required\ndef index(request):\n \"\"\"Overview of user order for a pizza event.\"\"\"\n products = Product.available_products.order_by(\"name\")\n if not request.user.has_perm(\"pizzas.order_restricted_products\"):\n products = products.exclude(restricted=True)\n event = FoodEvent.current()\n try:\n obj = FoodOrder.objects.get(food_event=event, member=request.member)\n except FoodOrder.DoesNotExist:\n obj = None\n context = {\"event\": event, \"products\": products, \"order\": obj}\n return render(request, \"pizzas/index.html\", context)\n\n\n@require_http_methods([\"POST\"])\ndef cancel_order(request):\n \"\"\"View that cancels a user's order.\"\"\"\n if \"order\" in request.POST:\n try:\n order = get_object_or_404(FoodOrder, pk=int(request.POST[\"order\"]))\n if not order.can_be_changed:\n messages.error(request, _(\"You can no longer cancel.\"))\n elif order.member == request.member:\n order.delete()\n messages.success(request, _(\"Your order has been cancelled.\"))\n except Http404:\n messages.error(request, _(\"Your order could not be found.\"))\n return redirect(\"pizzas:index\")\n\n\n@login_required\ndef place_order(request):\n \"\"\"View that shows the detail of the current order.\"\"\"\n event = FoodEvent.current()\n if not event:\n return redirect(\"pizzas:index\")\n\n try:\n obj = FoodOrder.objects.get(food_event=event, member=request.member)\n current_order_locked = not obj.can_be_changed\n except FoodOrder.DoesNotExist:\n obj = None\n current_order_locked = False\n\n if \"product\" in request.POST and not current_order_locked:\n 
productset = Product.available_products.all()\n if not request.user.has_perm(\"pizzas.order_restricted_products\"):\n productset = productset.exclude(restricted=True)\n try:\n product = productset.get(pk=int(request.POST[\"product\"]))\n except Product.DoesNotExist as e:\n raise Http404(\"Pizza does not exist\") from e\n if not obj:\n obj = FoodOrder(food_event=event, member=request.member)\n obj.product = product\n obj.save()\n return redirect(\"pizzas:index\")\n", "path": "website/pizzas/views.py"}]}
1807
391
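A note on the record above: this is the classic deferred failure from a Django field rename. `FoodOrder.pizza_event` became `food_event`, but three call sites kept the old keyword, and Django only raises `FieldError` when the query is built at runtime, not at import time. The sketch below is a minimal stand-in, not Django itself; `FakeManager` and its field set are invented for illustration and simply mimic how keyword lookups resolve against field names.

```python
# Minimal stand-in for Django's keyword resolution; FakeManager and its
# field set are illustrative, not the real FoodOrder manager.
class FieldError(Exception):
    """Mirrors django.core.exceptions.FieldError for this sketch."""


class FakeManager:
    fields = {"food_event", "member", "product", "name", "payment"}

    def filter(self, **lookups):
        for name in lookups:
            if name.split("__")[0] not in self.fields:
                raise FieldError(
                    f"Cannot resolve keyword {name!r} into field. "
                    f"Choices are: {', '.join(sorted(self.fields))}"
                )
        return []  # a real manager would build a QuerySet here


orders = FakeManager()
orders.filter(food_event=None)       # resolves fine after the rename
try:
    orders.filter(pizza_event=None)  # the stale keyword from the traceback
except FieldError as exc:
    print(exc)
```

Because the failure surfaces only when a queryset is evaluated, a grep for the old field name, or tests that actually execute each query path, would have caught all three stale call sites at once.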
gh_patches_debug_1786
rasdani/github-patches
git_diff
mozilla__telemetry-analysis-service-413
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- ImportError: No module named 'atmo.clusters.jobs' ``` app@a898b116953a:~$ ./manage.py update_clusters Traceback (most recent call last): File "./manage.py", line 11, in <module> execute_from_command_line(sys.argv) File "/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py", line 353, in execute_from_command_line utility.execute() File "/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py", line 345, in execute self.fetch_command(subcommand).run_from_argv(self.argv) File "/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py", line 195, in fetch_command klass = load_command_class(app_name, subcommand) File "/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py", line 39, in load_command_class module = import_module('%s.management.commands.%s' % (app_name, name)) File "/usr/local/lib/python3.5/importlib/__init__.py", line 126, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "<frozen importlib._bootstrap>", line 986, in _gcd_import File "<frozen importlib._bootstrap>", line 969, in _find_and_load File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 673, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 673, in exec_module File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed File "/app/atmo/clusters/management/commands/update_clusters.py", line 6, in <module> from ...jobs import update_clusters ImportError: No module named 'atmo.clusters.jobs' ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `atmo/clusters/management/commands/update_clusters.py` Content: ``` 1 # This Source Code Form is subject to the terms of the Mozilla Public 2 # License, v. 2.0. If a copy of the MPL was not distributed with this 3 # file, you can obtain one at http://mozilla.org/MPL/2.0/. 4 from django.core.management.base import BaseCommand 5 6 from ...jobs import update_clusters 7 8 9 class Command(BaseCommand): 10 help = 'Go through active clusters and update their status' 11 12 def handle(self, *args, **options): 13 self.stdout.write('Updating cluster info...', ending='') 14 update_clusters() 15 self.stdout.write('done.') 16 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/atmo/clusters/management/commands/update_clusters.py b/atmo/clusters/management/commands/update_clusters.py --- a/atmo/clusters/management/commands/update_clusters.py +++ b/atmo/clusters/management/commands/update_clusters.py @@ -3,7 +3,7 @@ # file, you can obtain one at http://mozilla.org/MPL/2.0/. from django.core.management.base import BaseCommand -from ...jobs import update_clusters +from ...tasks import update_clusters class Command(BaseCommand):
{"golden_diff": "diff --git a/atmo/clusters/management/commands/update_clusters.py b/atmo/clusters/management/commands/update_clusters.py\n--- a/atmo/clusters/management/commands/update_clusters.py\n+++ b/atmo/clusters/management/commands/update_clusters.py\n@@ -3,7 +3,7 @@\n # file, you can obtain one at http://mozilla.org/MPL/2.0/.\n from django.core.management.base import BaseCommand\n \n-from ...jobs import update_clusters\n+from ...tasks import update_clusters\n \n \n class Command(BaseCommand):\n", "issue": "ImportError: No module named 'atmo.clusters.jobs'\n```\r\napp@a898b116953a:~$ ./manage.py update_clusters\r\nTraceback (most recent call last):\r\n File \"./manage.py\", line 11, in <module>\r\n execute_from_command_line(sys.argv)\r\n File \"/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py\", line 353, in execute_from_command_line\r\n utility.execute()\r\n File \"/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py\", line 345, in execute\r\n self.fetch_command(subcommand).run_from_argv(self.argv)\r\n File \"/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py\", line 195, in fetch_command\r\n klass = load_command_class(app_name, subcommand)\r\n File \"/usr/local/lib/python3.5/site-packages/django/core/management/__init__.py\", line 39, in load_command_class\r\n module = import_module('%s.management.commands.%s' % (app_name, name))\r\n File \"/usr/local/lib/python3.5/importlib/__init__.py\", line 126, in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n File \"<frozen importlib._bootstrap>\", line 986, in _gcd_import\r\n File \"<frozen importlib._bootstrap>\", line 969, in _find_and_load\r\n File \"<frozen importlib._bootstrap>\", line 958, in _find_and_load_unlocked\r\n File \"<frozen importlib._bootstrap>\", line 673, in _load_unlocked\r\n File \"<frozen importlib._bootstrap_external>\", line 673, in exec_module\r\n File \"<frozen importlib._bootstrap>\", line 222, in _call_with_frames_removed\r\n File \"/app/atmo/clusters/management/commands/update_clusters.py\", line 6, in <module>\r\n from ...jobs import update_clusters\r\nImportError: No module named 'atmo.clusters.jobs'\r\n```\n", "before_files": [{"content": "# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this\n# file, you can obtain one at http://mozilla.org/MPL/2.0/.\nfrom django.core.management.base import BaseCommand\n\nfrom ...jobs import update_clusters\n\n\nclass Command(BaseCommand):\n help = 'Go through active clusters and update their status'\n\n def handle(self, *args, **options):\n self.stdout.write('Updating cluster info...', ending='')\n update_clusters()\n self.stdout.write('done.')\n", "path": "atmo/clusters/management/commands/update_clusters.py"}], "after_files": [{"content": "# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this\n# file, you can obtain one at http://mozilla.org/MPL/2.0/.\nfrom django.core.management.base import BaseCommand\n\nfrom ...tasks import update_clusters\n\n\nclass Command(BaseCommand):\n help = 'Go through active clusters and update their status'\n\n def handle(self, *args, **options):\n self.stdout.write('Updating cluster info...', ending='')\n update_clusters()\n self.stdout.write('done.')\n", "path": "atmo/clusters/management/commands/update_clusters.py"}]}
880
114
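The record above needs only a one-line fix: the management command imported `update_clusters` from `...jobs` after the module had been renamed to `...tasks`. As a hedged aside, if both paths ever had to work during a transition, a fallback loader along these lines would do it. The module names come from the record, but `load_update_clusters` itself is an invented helper, not part of the atmo codebase.

```python
import importlib


def load_update_clusters():
    """Resolve update_clusters from the new path, then the legacy one.

    Illustrative only; the actual fix imports directly from ...tasks.
    """
    for mod_name in ("atmo.clusters.tasks", "atmo.clusters.jobs"):
        try:
            return importlib.import_module(mod_name).update_clusters
        except ImportError:
            continue
    raise ImportError("update_clusters not found under tasks or jobs")
```

For an intentional, completed rename like this one, the golden diff's direct `from ...tasks import update_clusters` is the cleaner fix; a fallback would only paper over the stale path.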
gh_patches_debug_39196
rasdani/github-patches
git_diff
liqd__a4-meinberlin-3705
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- testing 4295: 402 error for poll export **URL:** https://meinberlin-dev.liqd.net/dashboard/modules/umfrage-24/poll/export/ **user:** initiator, moderator, group member **expected behaviour:** download export **behaviour:** 403 error **important screensize:** **device & browser:** big sur, firefox **Comment/Question:** Screenshot? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `meinberlin/apps/polls/exports.py` Content: ``` 1 from django.utils.translation import ugettext as _ 2 from rules.contrib.views import PermissionRequiredMixin 3 4 from adhocracy4.comments.models import Comment 5 from adhocracy4.exports import mixins 6 from adhocracy4.exports import views as export_views 7 from adhocracy4.polls import models as poll_models 8 from meinberlin.apps.users.models import User 9 10 11 class PollCommentExportView( 12 PermissionRequiredMixin, 13 mixins.ExportModelFieldsMixin, 14 mixins.UserGeneratedContentExportMixin, 15 mixins.ItemExportWithLinkMixin, 16 mixins.CommentExportWithRepliesToMixin, 17 export_views.BaseItemExportView 18 ): 19 20 model = Comment 21 22 fields = ['id', 'comment', 'created'] 23 permission_required = 'a4projects.change_project' 24 25 def get_permission_object(self): 26 return self.module.project 27 28 def get_queryset(self): 29 comments = ( 30 Comment.objects.filter(poll__module=self.module) | 31 Comment.objects.filter(parent_comment__poll__module=self.module) 32 ) 33 return comments 34 35 def get_virtual_fields(self, virtual): 36 virtual.setdefault('id', _('ID')) 37 virtual.setdefault('comment', _('Comment')) 38 virtual.setdefault('created', _('Created')) 39 return super().get_virtual_fields(virtual) 40 41 @property 42 def raise_exception(self): 43 return self.request.user.is_authenticated 44 45 46 class PollExportView( 47 PermissionRequiredMixin, 48 export_views.BaseItemExportView 49 ): 50 51 permission_required = 'a4projects.change_project' 52 53 def get_queryset(self): 54 creators_vote = poll_models.Vote.objects.filter( 55 choice__question__poll=self.poll).values_list('creator', flat=True) 56 creators_answer = poll_models.Answer.objects.filter( 57 question__poll=self.poll).values_list('creator', flat=True) 58 creator_ids = list(set(creators_vote).union(set(creators_answer))) 59 return User.objects.filter(pk__in=creator_ids) 60 61 @property 62 def poll(self): 63 return poll_models.Poll.objects.get(module=self.module) 64 65 @property 66 def single_choice_questions(self): 67 return self.poll.questions.filter( 68 multiple_choice=False, 69 is_open=False).order_by('id') 70 71 @property 72 def multiple_choice_questions(self): 73 return self.poll.questions.filter(multiple_choice=True).order_by('id') 74 75 @property 76 def open_questions(self): 77 return self.poll.questions.filter(is_open=True).order_by('id') 78 79 def get_virtual_fields(self, virtual): 80 virtual = super().get_virtual_fields(virtual) 81 virtual = self.get_virtual_fields_choice_questions( 82 virtual, self.single_choice_questions) 83 virtual = self.get_virtual_fields_choice_questions( 84 virtual, self.multiple_choice_questions) 85 virtual = self.get_virtual_fields_open_questions( 86 virtual, self.open_questions) 87 88 return virtual 89 90 def get_virtual_fields_choice_questions(self, virtual, choice_questions): 91 for question in choice_questions.all(): 92 for choice in question.choices.all(): 93 
identifier = 'Q' + str(question.pk) + '_A' + str(choice.pk) 94 virtual[(choice, False)] = identifier 95 if choice.is_other_choice: 96 identifier_answer = identifier + '_text' 97 virtual[(choice, True)] = identifier_answer 98 99 return virtual 100 101 def get_virtual_fields_open_questions(self, virtual, open_questions): 102 for question in open_questions.all(): 103 identifier = 'Q' + str(question.pk) 104 virtual[(question, False)] = identifier 105 identifier_answer = identifier + '_text' 106 virtual[(question, True)] = identifier_answer 107 108 return virtual 109 110 def get_field_data(self, user, field): 111 field_object, is_text_field = field 112 113 if type(field_object) == poll_models.Choice: 114 votes_qs = poll_models.Vote.objects.filter( 115 choice=field_object, 116 creator=user) 117 if not is_text_field: 118 value = int(votes_qs.exists()) 119 else: 120 vote = votes_qs.first() 121 if vote: 122 value = poll_models.OtherVote.objects.get(vote=vote).answer 123 else: 124 value = '' 125 else: # field_object is question 126 answers_qs = poll_models.Answer.objects.filter( 127 question=field_object, 128 creator=user) 129 if not is_text_field: 130 value = int(answers_qs.exists()) 131 else: 132 answer = answers_qs.first() 133 if answer: 134 value = answer.answer 135 else: 136 value = '' 137 138 return value 139 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/meinberlin/apps/polls/exports.py b/meinberlin/apps/polls/exports.py --- a/meinberlin/apps/polls/exports.py +++ b/meinberlin/apps/polls/exports.py @@ -50,6 +50,9 @@ permission_required = 'a4projects.change_project' + def get_permission_object(self): + return self.module.project + def get_queryset(self): creators_vote = poll_models.Vote.objects.filter( choice__question__poll=self.poll).values_list('creator', flat=True) @@ -63,47 +66,37 @@ return poll_models.Poll.objects.get(module=self.module) @property - def single_choice_questions(self): - return self.poll.questions.filter( - multiple_choice=False, - is_open=False).order_by('id') - - @property - def multiple_choice_questions(self): - return self.poll.questions.filter(multiple_choice=True).order_by('id') - - @property - def open_questions(self): - return self.poll.questions.filter(is_open=True).order_by('id') + def questions(self): + return self.poll.questions.all() def get_virtual_fields(self, virtual): virtual = super().get_virtual_fields(virtual) - virtual = self.get_virtual_fields_choice_questions( - virtual, self.single_choice_questions) - virtual = self.get_virtual_fields_choice_questions( - virtual, self.multiple_choice_questions) - virtual = self.get_virtual_fields_open_questions( - virtual, self.open_questions) + + for question in self.questions: + if question.is_open: + virtual = \ + self.get_virtual_field_open_question(virtual, question) + else: + virtual = \ + self.get_virtual_field_choice_question(virtual, question) return virtual - def get_virtual_fields_choice_questions(self, virtual, choice_questions): - for question in choice_questions.all(): - for choice in question.choices.all(): - identifier = 'Q' + str(question.pk) + '_A' + str(choice.pk) - virtual[(choice, False)] = identifier - if choice.is_other_choice: - identifier_answer = identifier + '_text' - virtual[(choice, True)] = identifier_answer + def get_virtual_field_choice_question(self, virtual, choice_question): + for choice in choice_question.choices.all(): + identifier = 'Q' + str(choice_question.pk) + '_A' + str(choice.pk) + virtual[(choice, False)] = identifier + if choice.is_other_choice: + identifier_answer = identifier + '_text' + virtual[(choice, True)] = identifier_answer return virtual - def get_virtual_fields_open_questions(self, virtual, open_questions): - for question in open_questions.all(): - identifier = 'Q' + str(question.pk) - virtual[(question, False)] = identifier - identifier_answer = identifier + '_text' - virtual[(question, True)] = identifier_answer + def get_virtual_field_open_question(self, virtual, open_question): + identifier = 'Q' + str(open_question.pk) + virtual[(open_question, False)] = identifier + identifier_answer = identifier + '_text' + virtual[(open_question, True)] = identifier_answer return virtual
{"golden_diff": "diff --git a/meinberlin/apps/polls/exports.py b/meinberlin/apps/polls/exports.py\n--- a/meinberlin/apps/polls/exports.py\n+++ b/meinberlin/apps/polls/exports.py\n@@ -50,6 +50,9 @@\n \n permission_required = 'a4projects.change_project'\n \n+ def get_permission_object(self):\n+ return self.module.project\n+\n def get_queryset(self):\n creators_vote = poll_models.Vote.objects.filter(\n choice__question__poll=self.poll).values_list('creator', flat=True)\n@@ -63,47 +66,37 @@\n return poll_models.Poll.objects.get(module=self.module)\n \n @property\n- def single_choice_questions(self):\n- return self.poll.questions.filter(\n- multiple_choice=False,\n- is_open=False).order_by('id')\n-\n- @property\n- def multiple_choice_questions(self):\n- return self.poll.questions.filter(multiple_choice=True).order_by('id')\n-\n- @property\n- def open_questions(self):\n- return self.poll.questions.filter(is_open=True).order_by('id')\n+ def questions(self):\n+ return self.poll.questions.all()\n \n def get_virtual_fields(self, virtual):\n virtual = super().get_virtual_fields(virtual)\n- virtual = self.get_virtual_fields_choice_questions(\n- virtual, self.single_choice_questions)\n- virtual = self.get_virtual_fields_choice_questions(\n- virtual, self.multiple_choice_questions)\n- virtual = self.get_virtual_fields_open_questions(\n- virtual, self.open_questions)\n+\n+ for question in self.questions:\n+ if question.is_open:\n+ virtual = \\\n+ self.get_virtual_field_open_question(virtual, question)\n+ else:\n+ virtual = \\\n+ self.get_virtual_field_choice_question(virtual, question)\n \n return virtual\n \n- def get_virtual_fields_choice_questions(self, virtual, choice_questions):\n- for question in choice_questions.all():\n- for choice in question.choices.all():\n- identifier = 'Q' + str(question.pk) + '_A' + str(choice.pk)\n- virtual[(choice, False)] = identifier\n- if choice.is_other_choice:\n- identifier_answer = identifier + '_text'\n- virtual[(choice, True)] = identifier_answer\n+ def get_virtual_field_choice_question(self, virtual, choice_question):\n+ for choice in choice_question.choices.all():\n+ identifier = 'Q' + str(choice_question.pk) + '_A' + str(choice.pk)\n+ virtual[(choice, False)] = identifier\n+ if choice.is_other_choice:\n+ identifier_answer = identifier + '_text'\n+ virtual[(choice, True)] = identifier_answer\n \n return virtual\n \n- def get_virtual_fields_open_questions(self, virtual, open_questions):\n- for question in open_questions.all():\n- identifier = 'Q' + str(question.pk)\n- virtual[(question, False)] = identifier\n- identifier_answer = identifier + '_text'\n- virtual[(question, True)] = identifier_answer\n+ def get_virtual_field_open_question(self, virtual, open_question):\n+ identifier = 'Q' + str(open_question.pk)\n+ virtual[(open_question, False)] = identifier\n+ identifier_answer = identifier + '_text'\n+ virtual[(open_question, True)] = identifier_answer\n \n return virtual\n", "issue": "testing 4295: 402 error for poll export\n**URL:** https://meinberlin-dev.liqd.net/dashboard/modules/umfrage-24/poll/export/\r\n**user:** initiator, moderator, group member\r\n**expected behaviour:** download export\r\n**behaviour:** 403 error\r\n**important screensize:**\r\n**device & browser:** big sur, firefox\r\n**Comment/Question:** \r\n\r\nScreenshot?\r\n\n", "before_files": [{"content": "from django.utils.translation import ugettext as _\nfrom rules.contrib.views import PermissionRequiredMixin\n\nfrom adhocracy4.comments.models import Comment\nfrom adhocracy4.exports import 
mixins\nfrom adhocracy4.exports import views as export_views\nfrom adhocracy4.polls import models as poll_models\nfrom meinberlin.apps.users.models import User\n\n\nclass PollCommentExportView(\n PermissionRequiredMixin,\n mixins.ExportModelFieldsMixin,\n mixins.UserGeneratedContentExportMixin,\n mixins.ItemExportWithLinkMixin,\n mixins.CommentExportWithRepliesToMixin,\n export_views.BaseItemExportView\n):\n\n model = Comment\n\n fields = ['id', 'comment', 'created']\n permission_required = 'a4projects.change_project'\n\n def get_permission_object(self):\n return self.module.project\n\n def get_queryset(self):\n comments = (\n Comment.objects.filter(poll__module=self.module) |\n Comment.objects.filter(parent_comment__poll__module=self.module)\n )\n return comments\n\n def get_virtual_fields(self, virtual):\n virtual.setdefault('id', _('ID'))\n virtual.setdefault('comment', _('Comment'))\n virtual.setdefault('created', _('Created'))\n return super().get_virtual_fields(virtual)\n\n @property\n def raise_exception(self):\n return self.request.user.is_authenticated\n\n\nclass PollExportView(\n PermissionRequiredMixin,\n export_views.BaseItemExportView\n):\n\n permission_required = 'a4projects.change_project'\n\n def get_queryset(self):\n creators_vote = poll_models.Vote.objects.filter(\n choice__question__poll=self.poll).values_list('creator', flat=True)\n creators_answer = poll_models.Answer.objects.filter(\n question__poll=self.poll).values_list('creator', flat=True)\n creator_ids = list(set(creators_vote).union(set(creators_answer)))\n return User.objects.filter(pk__in=creator_ids)\n\n @property\n def poll(self):\n return poll_models.Poll.objects.get(module=self.module)\n\n @property\n def single_choice_questions(self):\n return self.poll.questions.filter(\n multiple_choice=False,\n is_open=False).order_by('id')\n\n @property\n def multiple_choice_questions(self):\n return self.poll.questions.filter(multiple_choice=True).order_by('id')\n\n @property\n def open_questions(self):\n return self.poll.questions.filter(is_open=True).order_by('id')\n\n def get_virtual_fields(self, virtual):\n virtual = super().get_virtual_fields(virtual)\n virtual = self.get_virtual_fields_choice_questions(\n virtual, self.single_choice_questions)\n virtual = self.get_virtual_fields_choice_questions(\n virtual, self.multiple_choice_questions)\n virtual = self.get_virtual_fields_open_questions(\n virtual, self.open_questions)\n\n return virtual\n\n def get_virtual_fields_choice_questions(self, virtual, choice_questions):\n for question in choice_questions.all():\n for choice in question.choices.all():\n identifier = 'Q' + str(question.pk) + '_A' + str(choice.pk)\n virtual[(choice, False)] = identifier\n if choice.is_other_choice:\n identifier_answer = identifier + '_text'\n virtual[(choice, True)] = identifier_answer\n\n return virtual\n\n def get_virtual_fields_open_questions(self, virtual, open_questions):\n for question in open_questions.all():\n identifier = 'Q' + str(question.pk)\n virtual[(question, False)] = identifier\n identifier_answer = identifier + '_text'\n virtual[(question, True)] = identifier_answer\n\n return virtual\n\n def get_field_data(self, user, field):\n field_object, is_text_field = field\n\n if type(field_object) == poll_models.Choice:\n votes_qs = poll_models.Vote.objects.filter(\n choice=field_object,\n creator=user)\n if not is_text_field:\n value = int(votes_qs.exists())\n else:\n vote = votes_qs.first()\n if vote:\n value = poll_models.OtherVote.objects.get(vote=vote).answer\n else:\n 
value = ''\n else: # field_object is question\n answers_qs = poll_models.Answer.objects.filter(\n question=field_object,\n creator=user)\n if not is_text_field:\n value = int(answers_qs.exists())\n else:\n answer = answers_qs.first()\n if answer:\n value = answer.answer\n else:\n value = ''\n\n return value\n", "path": "meinberlin/apps/polls/exports.py"}], "after_files": [{"content": "from django.utils.translation import ugettext as _\nfrom rules.contrib.views import PermissionRequiredMixin\n\nfrom adhocracy4.comments.models import Comment\nfrom adhocracy4.exports import mixins\nfrom adhocracy4.exports import views as export_views\nfrom adhocracy4.polls import models as poll_models\nfrom meinberlin.apps.users.models import User\n\n\nclass PollCommentExportView(\n PermissionRequiredMixin,\n mixins.ExportModelFieldsMixin,\n mixins.UserGeneratedContentExportMixin,\n mixins.ItemExportWithLinkMixin,\n mixins.CommentExportWithRepliesToMixin,\n export_views.BaseItemExportView\n):\n\n model = Comment\n\n fields = ['id', 'comment', 'created']\n permission_required = 'a4projects.change_project'\n\n def get_permission_object(self):\n return self.module.project\n\n def get_queryset(self):\n comments = (\n Comment.objects.filter(poll__module=self.module) |\n Comment.objects.filter(parent_comment__poll__module=self.module)\n )\n return comments\n\n def get_virtual_fields(self, virtual):\n virtual.setdefault('id', _('ID'))\n virtual.setdefault('comment', _('Comment'))\n virtual.setdefault('created', _('Created'))\n return super().get_virtual_fields(virtual)\n\n @property\n def raise_exception(self):\n return self.request.user.is_authenticated\n\n\nclass PollExportView(\n PermissionRequiredMixin,\n export_views.BaseItemExportView\n):\n\n permission_required = 'a4projects.change_project'\n\n def get_permission_object(self):\n return self.module.project\n\n def get_queryset(self):\n creators_vote = poll_models.Vote.objects.filter(\n choice__question__poll=self.poll).values_list('creator', flat=True)\n creators_answer = poll_models.Answer.objects.filter(\n question__poll=self.poll).values_list('creator', flat=True)\n creator_ids = list(set(creators_vote).union(set(creators_answer)))\n return User.objects.filter(pk__in=creator_ids)\n\n @property\n def poll(self):\n return poll_models.Poll.objects.get(module=self.module)\n\n @property\n def questions(self):\n return self.poll.questions.all()\n\n def get_virtual_fields(self, virtual):\n virtual = super().get_virtual_fields(virtual)\n\n for question in self.questions:\n if question.is_open:\n virtual = \\\n self.get_virtual_field_open_question(virtual, question)\n else:\n virtual = \\\n self.get_virtual_field_choice_question(virtual, question)\n\n return virtual\n\n def get_virtual_field_choice_question(self, virtual, choice_question):\n for choice in choice_question.choices.all():\n identifier = 'Q' + str(choice_question.pk) + '_A' + str(choice.pk)\n virtual[(choice, False)] = identifier\n if choice.is_other_choice:\n identifier_answer = identifier + '_text'\n virtual[(choice, True)] = identifier_answer\n\n return virtual\n\n def get_virtual_field_open_question(self, virtual, open_question):\n identifier = 'Q' + str(open_question.pk)\n virtual[(open_question, False)] = identifier\n identifier_answer = identifier + '_text'\n virtual[(open_question, True)] = identifier_answer\n\n return virtual\n\n def get_field_data(self, user, field):\n field_object, is_text_field = field\n\n if type(field_object) == poll_models.Choice:\n votes_qs = 
poll_models.Vote.objects.filter(\n choice=field_object,\n creator=user)\n if not is_text_field:\n value = int(votes_qs.exists())\n else:\n vote = votes_qs.first()\n if vote:\n value = poll_models.OtherVote.objects.get(vote=vote).answer\n else:\n value = ''\n else: # field_object is question\n answers_qs = poll_models.Answer.objects.filter(\n question=field_object,\n creator=user)\n if not is_text_field:\n value = int(answers_qs.exists())\n else:\n answer = answers_qs.first()\n if answer:\n value = answer.answer\n else:\n value = ''\n\n return value\n", "path": "meinberlin/apps/polls/exports.py"}]}
1608
730
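The 403 in the record above is an object-level permission check with nothing to check against: `PollCommentExportView` defines `get_permission_object()`, but `PollExportView` did not, so `rules` could not evaluate `a4projects.change_project` against the project for initiators, moderators, or group members. The snippet below restates just that part of the golden diff, using the imports already present in `exports.py`:

```python
from rules.contrib.views import PermissionRequiredMixin

from adhocracy4.exports import views as export_views


class PollExportView(PermissionRequiredMixin, export_views.BaseItemExportView):

    permission_required = 'a4projects.change_project'

    def get_permission_object(self):
        # Give rules an object to check change_project against, exactly
        # as PollCommentExportView already does.
        return self.module.project
```

The rest of the diff, collapsing the three question properties into a single `questions` property and splitting the virtual-field helpers into per-question methods, is a refactor that rides along with the permission fix rather than part of it.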
gh_patches_debug_11535
rasdani/github-patches
git_diff
Qiskit__qiskit-5613
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Avoid dependencies duplicity <!-- ⚠️ If you do not respect this template, your issue will be closed --> <!-- ⚠️ Make sure to browse the opened and closed issues to confirm this idea does not exist. --> ### What is the expected enhancement? Currently, when you want to add or update a dependency, you need to do that in the `requirements.txt` and `setup.py` files. That is really error-prone. It would be nice to avoid that situation and make changes only in one of the files when a dependency is added or updated. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 # This code is part of Qiskit. 2 # 3 # (C) Copyright IBM 2017. 4 # 5 # This code is licensed under the Apache License, Version 2.0. You may 6 # obtain a copy of this license in the LICENSE.txt file in the root directory 7 # of this source tree or at http://www.apache.org/licenses/LICENSE-2.0. 8 # 9 # Any modifications or derivative works of this code must retain this 10 # copyright notice, and modified files need to carry a notice indicating 11 # that they have been altered from the originals. 12 13 "The Qiskit Terra setup file." 14 15 import os 16 import sys 17 from setuptools import setup, find_packages, Extension 18 try: 19 from Cython.Build import cythonize 20 except ImportError: 21 import subprocess 22 subprocess.call([sys.executable, '-m', 'pip', 'install', 'Cython>=0.27.1']) 23 from Cython.Build import cythonize 24 25 REQUIREMENTS = [ 26 "contextvars>=2.4;python_version<'3.7'", 27 "jsonschema>=2.6", 28 "retworkx>=0.7.0", 29 "numpy>=1.17", 30 "ply>=3.10", 31 "psutil>=5", 32 "scipy>=1.4", 33 "sympy>=1.3", 34 "dill>=0.3", 35 "fastjsonschema>=2.10", 36 "python-constraint>=1.4", 37 "python-dateutil>=2.8.0", 38 ] 39 40 # Add Cython extensions here 41 CYTHON_EXTS = ['utils', 'swap_trial'] 42 CYTHON_MODULE = 'qiskit.transpiler.passes.routing.cython.stochastic_swap' 43 CYTHON_SOURCE_DIR = 'qiskit/transpiler/passes/routing/cython/stochastic_swap' 44 45 INCLUDE_DIRS = [] 46 # Extra link args 47 LINK_FLAGS = [] 48 # If on Win and not in MSYS2 (i.e. Visual studio compile) 49 if (sys.platform == 'win32' and os.environ.get('MSYSTEM') is None): 50 COMPILER_FLAGS = ['/O2'] 51 # Everything else 52 else: 53 COMPILER_FLAGS = ['-O2', '-funroll-loops', '-std=c++11'] 54 if sys.platform == 'darwin': 55 # These are needed for compiling on OSX 10.14+ 56 COMPILER_FLAGS.append('-mmacosx-version-min=10.9') 57 LINK_FLAGS.append('-mmacosx-version-min=10.9') 58 59 60 EXT_MODULES = [] 61 # Add Cython Extensions 62 for ext in CYTHON_EXTS: 63 mod = Extension(CYTHON_MODULE + '.' + ext, 64 sources=[CYTHON_SOURCE_DIR + '/' + ext + '.pyx'], 65 include_dirs=INCLUDE_DIRS, 66 extra_compile_args=COMPILER_FLAGS, 67 extra_link_args=LINK_FLAGS, 68 language='c++') 69 EXT_MODULES.append(mod) 70 71 # Read long description from README. 
72 README_PATH = os.path.join(os.path.abspath(os.path.dirname(__file__)), 73 'README.md') 74 with open(README_PATH) as readme_file: 75 README = readme_file.read() 76 77 setup( 78 name="qiskit-terra", 79 version="0.17.0", 80 description="Software for developing quantum computing programs", 81 long_description=README, 82 long_description_content_type='text/markdown', 83 url="https://github.com/Qiskit/qiskit-terra", 84 author="Qiskit Development Team", 85 author_email="[email protected]", 86 license="Apache 2.0", 87 classifiers=[ 88 "Environment :: Console", 89 "License :: OSI Approved :: Apache Software License", 90 "Intended Audience :: Developers", 91 "Intended Audience :: Science/Research", 92 "Operating System :: Microsoft :: Windows", 93 "Operating System :: MacOS", 94 "Operating System :: POSIX :: Linux", 95 "Programming Language :: Python :: 3 :: Only", 96 "Programming Language :: Python :: 3.6", 97 "Programming Language :: Python :: 3.7", 98 "Programming Language :: Python :: 3.8", 99 "Programming Language :: Python :: 3.9", 100 "Topic :: Scientific/Engineering", 101 ], 102 keywords="qiskit sdk quantum", 103 packages=find_packages(exclude=['test*']), 104 install_requires=REQUIREMENTS, 105 setup_requires=['Cython>=0.27.1'], 106 include_package_data=True, 107 python_requires=">=3.6", 108 extras_require={ 109 'visualization': ['matplotlib>=2.1', 'ipywidgets>=7.3.0', 110 'pydot', "pillow>=4.2.1", "pylatexenc>=1.4", 111 "seaborn>=0.9.0", "pygments>=2.4"], 112 'classical-function-compiler': ['tweedledum'], 113 'full-featured-simulators': ['qiskit-aer>=0.1'], 114 'crosstalk-pass': ['z3-solver>=4.7'], 115 }, 116 project_urls={ 117 "Bug Tracker": "https://github.com/Qiskit/qiskit-terra/issues", 118 "Documentation": "https://qiskit.org/documentation/", 119 "Source Code": "https://github.com/Qiskit/qiskit-terra", 120 }, 121 ext_modules=cythonize(EXT_MODULES), 122 zip_safe=False 123 ) 124 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -22,20 +22,8 @@ subprocess.call([sys.executable, '-m', 'pip', 'install', 'Cython>=0.27.1']) from Cython.Build import cythonize -REQUIREMENTS = [ - "contextvars>=2.4;python_version<'3.7'", - "jsonschema>=2.6", - "retworkx>=0.7.0", - "numpy>=1.17", - "ply>=3.10", - "psutil>=5", - "scipy>=1.4", - "sympy>=1.3", - "dill>=0.3", - "fastjsonschema>=2.10", - "python-constraint>=1.4", - "python-dateutil>=2.8.0", -] +with open('requirements.txt') as f: + REQUIREMENTS = f.read().splitlines() # Add Cython extensions here CYTHON_EXTS = ['utils', 'swap_trial']
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -22,20 +22,8 @@\n subprocess.call([sys.executable, '-m', 'pip', 'install', 'Cython>=0.27.1'])\n from Cython.Build import cythonize\n \n-REQUIREMENTS = [\n- \"contextvars>=2.4;python_version<'3.7'\",\n- \"jsonschema>=2.6\",\n- \"retworkx>=0.7.0\",\n- \"numpy>=1.17\",\n- \"ply>=3.10\",\n- \"psutil>=5\",\n- \"scipy>=1.4\",\n- \"sympy>=1.3\",\n- \"dill>=0.3\",\n- \"fastjsonschema>=2.10\",\n- \"python-constraint>=1.4\",\n- \"python-dateutil>=2.8.0\",\n-]\n+with open('requirements.txt') as f:\n+ REQUIREMENTS = f.read().splitlines()\n \n # Add Cython extensions here\n CYTHON_EXTS = ['utils', 'swap_trial']\n", "issue": "Avoid dependencies duplicity\n<!-- \u26a0\ufe0f If you do not respect this template, your issue will be closed -->\r\n<!-- \u26a0\ufe0f Make sure to browse the opened and closed issues to confirm this idea does not exist. -->\r\n\r\n### What is the expected enhancement?\r\nCurrently, when you want to add or update a dependency, you need to do that in the `requirements.txt` and `setup.py` files. That is really error-prone.\r\n\r\nIt would be nice to avoid that situation and make changes only in one of the files when a dependency is added or updated.\r\n\n", "before_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"The Qiskit Terra setup file.\"\n\nimport os\nimport sys\nfrom setuptools import setup, find_packages, Extension\ntry:\n from Cython.Build import cythonize\nexcept ImportError:\n import subprocess\n subprocess.call([sys.executable, '-m', 'pip', 'install', 'Cython>=0.27.1'])\n from Cython.Build import cythonize\n\nREQUIREMENTS = [\n \"contextvars>=2.4;python_version<'3.7'\",\n \"jsonschema>=2.6\",\n \"retworkx>=0.7.0\",\n \"numpy>=1.17\",\n \"ply>=3.10\",\n \"psutil>=5\",\n \"scipy>=1.4\",\n \"sympy>=1.3\",\n \"dill>=0.3\",\n \"fastjsonschema>=2.10\",\n \"python-constraint>=1.4\",\n \"python-dateutil>=2.8.0\",\n]\n\n# Add Cython extensions here\nCYTHON_EXTS = ['utils', 'swap_trial']\nCYTHON_MODULE = 'qiskit.transpiler.passes.routing.cython.stochastic_swap'\nCYTHON_SOURCE_DIR = 'qiskit/transpiler/passes/routing/cython/stochastic_swap'\n\nINCLUDE_DIRS = []\n# Extra link args\nLINK_FLAGS = []\n# If on Win and not in MSYS2 (i.e. Visual studio compile)\nif (sys.platform == 'win32' and os.environ.get('MSYSTEM') is None):\n COMPILER_FLAGS = ['/O2']\n# Everything else\nelse:\n COMPILER_FLAGS = ['-O2', '-funroll-loops', '-std=c++11']\n if sys.platform == 'darwin':\n # These are needed for compiling on OSX 10.14+\n COMPILER_FLAGS.append('-mmacosx-version-min=10.9')\n LINK_FLAGS.append('-mmacosx-version-min=10.9')\n\n\nEXT_MODULES = []\n# Add Cython Extensions\nfor ext in CYTHON_EXTS:\n mod = Extension(CYTHON_MODULE + '.' 
+ ext,\n sources=[CYTHON_SOURCE_DIR + '/' + ext + '.pyx'],\n include_dirs=INCLUDE_DIRS,\n extra_compile_args=COMPILER_FLAGS,\n extra_link_args=LINK_FLAGS,\n language='c++')\n EXT_MODULES.append(mod)\n\n# Read long description from README.\nREADME_PATH = os.path.join(os.path.abspath(os.path.dirname(__file__)),\n 'README.md')\nwith open(README_PATH) as readme_file:\n README = readme_file.read()\n\nsetup(\n name=\"qiskit-terra\",\n version=\"0.17.0\",\n description=\"Software for developing quantum computing programs\",\n long_description=README,\n long_description_content_type='text/markdown',\n url=\"https://github.com/Qiskit/qiskit-terra\",\n author=\"Qiskit Development Team\",\n author_email=\"[email protected]\",\n license=\"Apache 2.0\",\n classifiers=[\n \"Environment :: Console\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS\",\n \"Operating System :: POSIX :: Linux\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Topic :: Scientific/Engineering\",\n ],\n keywords=\"qiskit sdk quantum\",\n packages=find_packages(exclude=['test*']),\n install_requires=REQUIREMENTS,\n setup_requires=['Cython>=0.27.1'],\n include_package_data=True,\n python_requires=\">=3.6\",\n extras_require={\n 'visualization': ['matplotlib>=2.1', 'ipywidgets>=7.3.0',\n 'pydot', \"pillow>=4.2.1\", \"pylatexenc>=1.4\",\n \"seaborn>=0.9.0\", \"pygments>=2.4\"],\n 'classical-function-compiler': ['tweedledum'],\n 'full-featured-simulators': ['qiskit-aer>=0.1'],\n 'crosstalk-pass': ['z3-solver>=4.7'],\n },\n project_urls={\n \"Bug Tracker\": \"https://github.com/Qiskit/qiskit-terra/issues\",\n \"Documentation\": \"https://qiskit.org/documentation/\",\n \"Source Code\": \"https://github.com/Qiskit/qiskit-terra\",\n },\n ext_modules=cythonize(EXT_MODULES),\n zip_safe=False\n)\n", "path": "setup.py"}], "after_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"The Qiskit Terra setup file.\"\n\nimport os\nimport sys\nfrom setuptools import setup, find_packages, Extension\ntry:\n from Cython.Build import cythonize\nexcept ImportError:\n import subprocess\n subprocess.call([sys.executable, '-m', 'pip', 'install', 'Cython>=0.27.1'])\n from Cython.Build import cythonize\n\nwith open('requirements.txt') as f:\n REQUIREMENTS = f.read().splitlines()\n\n# Add Cython extensions here\nCYTHON_EXTS = ['utils', 'swap_trial']\nCYTHON_MODULE = 'qiskit.transpiler.passes.routing.cython.stochastic_swap'\nCYTHON_SOURCE_DIR = 'qiskit/transpiler/passes/routing/cython/stochastic_swap'\n\nINCLUDE_DIRS = []\n# Extra link args\nLINK_FLAGS = []\n# If on Win and not in MSYS2 (i.e. 
Visual studio compile)\nif (sys.platform == 'win32' and os.environ.get('MSYSTEM') is None):\n COMPILER_FLAGS = ['/O2']\n# Everything else\nelse:\n COMPILER_FLAGS = ['-O2', '-funroll-loops', '-std=c++11']\n if sys.platform == 'darwin':\n # These are needed for compiling on OSX 10.14+\n COMPILER_FLAGS.append('-mmacosx-version-min=10.9')\n LINK_FLAGS.append('-mmacosx-version-min=10.9')\n\n\nEXT_MODULES = []\n# Add Cython Extensions\nfor ext in CYTHON_EXTS:\n mod = Extension(CYTHON_MODULE + '.' + ext,\n sources=[CYTHON_SOURCE_DIR + '/' + ext + '.pyx'],\n include_dirs=INCLUDE_DIRS,\n extra_compile_args=COMPILER_FLAGS,\n extra_link_args=LINK_FLAGS,\n language='c++')\n EXT_MODULES.append(mod)\n\n# Read long description from README.\nREADME_PATH = os.path.join(os.path.abspath(os.path.dirname(__file__)),\n 'README.md')\nwith open(README_PATH) as readme_file:\n README = readme_file.read()\n\nsetup(\n name=\"qiskit-terra\",\n version=\"0.17.0\",\n description=\"Software for developing quantum computing programs\",\n long_description=README,\n long_description_content_type='text/markdown',\n url=\"https://github.com/Qiskit/qiskit-terra\",\n author=\"Qiskit Development Team\",\n author_email=\"[email protected]\",\n license=\"Apache 2.0\",\n classifiers=[\n \"Environment :: Console\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS\",\n \"Operating System :: POSIX :: Linux\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Topic :: Scientific/Engineering\",\n ],\n keywords=\"qiskit sdk quantum\",\n packages=find_packages(exclude=['test*']),\n install_requires=REQUIREMENTS,\n setup_requires=['Cython>=0.27.1'],\n include_package_data=True,\n python_requires=\">=3.6\",\n extras_require={\n 'visualization': ['matplotlib>=2.1', 'ipywidgets>=7.3.0',\n 'pydot', \"pillow>=4.2.1\", \"pylatexenc>=1.4\",\n \"seaborn>=0.9.0\", \"pygments>=2.4\"],\n 'classical-function-compiler': ['tweedledum'],\n 'full-featured-simulators': ['qiskit-aer>=0.1'],\n 'crosstalk-pass': ['z3-solver>=4.7'],\n },\n project_urls={\n \"Bug Tracker\": \"https://github.com/Qiskit/qiskit-terra/issues\",\n \"Documentation\": \"https://qiskit.org/documentation/\",\n \"Source Code\": \"https://github.com/Qiskit/qiskit-terra\",\n },\n ext_modules=cythonize(EXT_MODULES),\n zip_safe=False\n)\n", "path": "setup.py"}]}
1784
255
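The patch above removes the duplication by making `requirements.txt` the single source of truth and having `setup.py` read it at build time. Shown below is the diff's two-line version, followed by a slightly hardened variant; the hardening is an assumption on my part, not part of the patch, and matters only if the requirements file contains comments or blank lines:

```python
# The golden diff's version: read install_requires straight from the file.
with open('requirements.txt') as f:
    REQUIREMENTS = f.read().splitlines()

# A slightly hardened variant (my assumption, not part of the patch):
# skip blank lines and comments so hand edits cannot leak empty entries.
with open('requirements.txt') as f:
    REQUIREMENTS = [
        line.strip()
        for line in f
        if line.strip() and not line.lstrip().startswith('#')
    ]
```

Either way, adding or bumping a dependency now touches one file instead of two.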
gh_patches_debug_3482
rasdani/github-patches
git_diff
docker__docker-py-1528
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Cannot wait blocking generator output from events() Since upgrade to `docker 2.2.0`. API 'events()' using same API client time out. So I got `UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.` exception message. But in my use case, `containers()` or other client APIs are reasonable to set an default timeout. But `events()` should have another timeout setting for users. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `docker/api/daemon.py` Content: ``` 1 import os 2 import warnings 3 from datetime import datetime 4 5 from .. import auth, utils 6 from ..constants import INSECURE_REGISTRY_DEPRECATION_WARNING 7 8 9 class DaemonApiMixin(object): 10 @utils.minimum_version('1.25') 11 def df(self): 12 """ 13 Get data usage information. 14 15 Returns: 16 (dict): A dictionary representing different resource categories 17 and their respective data usage. 18 19 Raises: 20 :py:class:`docker.errors.APIError` 21 If the server returns an error. 22 """ 23 url = self._url('/system/df') 24 return self._result(self._get(url), True) 25 26 def events(self, since=None, until=None, filters=None, decode=None): 27 """ 28 Get real-time events from the server. Similar to the ``docker events`` 29 command. 30 31 Args: 32 since (UTC datetime or int): Get events from this point 33 until (UTC datetime or int): Get events until this point 34 filters (dict): Filter the events by event time, container or image 35 decode (bool): If set to true, stream will be decoded into dicts on 36 the fly. False by default. 37 38 Returns: 39 (generator): A blocking generator you can iterate over to retrieve 40 events as they happen. 41 42 Raises: 43 :py:class:`docker.errors.APIError` 44 If the server returns an error. 45 46 Example: 47 48 >>> for event in client.events() 49 ... print event 50 {u'from': u'image/with:tag', 51 u'id': u'container-id', 52 u'status': u'start', 53 u'time': 1423339459} 54 ... 55 """ 56 57 if isinstance(since, datetime): 58 since = utils.datetime_to_timestamp(since) 59 60 if isinstance(until, datetime): 61 until = utils.datetime_to_timestamp(until) 62 63 if filters: 64 filters = utils.convert_filters(filters) 65 66 params = { 67 'since': since, 68 'until': until, 69 'filters': filters 70 } 71 72 return self._stream_helper( 73 self._get(self._url('/events'), params=params, stream=True), 74 decode=decode 75 ) 76 77 def info(self): 78 """ 79 Display system-wide information. Identical to the ``docker info`` 80 command. 81 82 Returns: 83 (dict): The info as a dict 84 85 Raises: 86 :py:class:`docker.errors.APIError` 87 If the server returns an error. 88 """ 89 return self._result(self._get(self._url("/info")), True) 90 91 def login(self, username, password=None, email=None, registry=None, 92 reauth=False, insecure_registry=False, dockercfg_path=None): 93 """ 94 Authenticate with a registry. Similar to the ``docker login`` command. 95 96 Args: 97 username (str): The registry username 98 password (str): The plaintext password 99 email (str): The email for the registry account 100 registry (str): URL to the registry. E.g. 101 ``https://index.docker.io/v1/`` 102 reauth (bool): Whether refresh existing authentication on the 103 Docker server. 
104 dockercfg_path (str): Use a custom path for the ``.dockercfg`` file 105 (default ``$HOME/.dockercfg``) 106 107 Returns: 108 (dict): The response from the login request 109 110 Raises: 111 :py:class:`docker.errors.APIError` 112 If the server returns an error. 113 """ 114 if insecure_registry: 115 warnings.warn( 116 INSECURE_REGISTRY_DEPRECATION_WARNING.format('login()'), 117 DeprecationWarning 118 ) 119 120 # If we don't have any auth data so far, try reloading the config file 121 # one more time in case anything showed up in there. 122 # If dockercfg_path is passed check to see if the config file exists, 123 # if so load that config. 124 if dockercfg_path and os.path.exists(dockercfg_path): 125 self._auth_configs = auth.load_config(dockercfg_path) 126 elif not self._auth_configs: 127 self._auth_configs = auth.load_config() 128 129 authcfg = auth.resolve_authconfig(self._auth_configs, registry) 130 # If we found an existing auth config for this registry and username 131 # combination, we can return it immediately unless reauth is requested. 132 if authcfg and authcfg.get('username', None) == username \ 133 and not reauth: 134 return authcfg 135 136 req_data = { 137 'username': username, 138 'password': password, 139 'email': email, 140 'serveraddress': registry, 141 } 142 143 response = self._post_json(self._url('/auth'), data=req_data) 144 if response.status_code == 200: 145 self._auth_configs[registry or auth.INDEX_NAME] = req_data 146 return self._result(response, json=True) 147 148 def ping(self): 149 """ 150 Checks the server is responsive. An exception will be raised if it 151 isn't responding. 152 153 Returns: 154 (bool) The response from the server. 155 156 Raises: 157 :py:class:`docker.errors.APIError` 158 If the server returns an error. 159 """ 160 return self._result(self._get(self._url('/_ping'))) == 'OK' 161 162 def version(self, api_version=True): 163 """ 164 Returns version information from the server. Similar to the ``docker 165 version`` command. 166 167 Returns: 168 (dict): The server version information 169 170 Raises: 171 :py:class:`docker.errors.APIError` 172 If the server returns an error. 173 """ 174 url = self._url("/version", versioned_api=api_version) 175 return self._result(self._get(url), json=True) 176 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/docker/api/daemon.py b/docker/api/daemon.py --- a/docker/api/daemon.py +++ b/docker/api/daemon.py @@ -68,9 +68,10 @@ 'until': until, 'filters': filters } + url = self._url('/events') return self._stream_helper( - self._get(self._url('/events'), params=params, stream=True), + self._get(url, params=params, stream=True, timeout=None), decode=decode )
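The diff above is worth pausing on: docker-py's API client is built on `requests`, and the fix passes `timeout=None` on the one call that streams `/events`, so only the long-lived request waits indefinitely while every other API call keeps the client's default read timeout. A minimal illustration of the same per-request override follows; the daemon address and endpoints here are placeholders, not docker-py's actual transport setup.

```python
import requests

BASE = "http://localhost:2375"  # placeholder daemon address, an assumption

# Ordinary calls keep a finite timeout so a dead daemon fails fast...
info = requests.get(BASE + "/info", timeout=60)

# ...while the long-lived event stream opts out of the read timeout,
# because the daemon may legitimately stay silent for minutes.
events = requests.get(BASE + "/events", stream=True, timeout=None)
for line in events.iter_lines():
    print(line)
```

This is the narrow version of what the issue asks for: the streaming endpoint gets its own timeout behaviour without loosening the default for ordinary calls like `containers()`.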
{"golden_diff": "diff --git a/docker/api/daemon.py b/docker/api/daemon.py\n--- a/docker/api/daemon.py\n+++ b/docker/api/daemon.py\n@@ -68,9 +68,10 @@\n 'until': until,\n 'filters': filters\n }\n+ url = self._url('/events')\n \n return self._stream_helper(\n- self._get(self._url('/events'), params=params, stream=True),\n+ self._get(url, params=params, stream=True, timeout=None),\n decode=decode\n )\n", "issue": "Cannot wait blocking generator output from events()\nSince upgrade to `docker 2.2.0`. \r\n\r\nAPI 'events()' using same API client time out. \r\nSo I got `UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.` exception message.\r\n\r\nBut in my use case, `containers()` or other client APIs are reasonable to set an default timeout.\r\nBut `events()` should have another timeout setting for users.\n", "before_files": [{"content": "import os\nimport warnings\nfrom datetime import datetime\n\nfrom .. import auth, utils\nfrom ..constants import INSECURE_REGISTRY_DEPRECATION_WARNING\n\n\nclass DaemonApiMixin(object):\n @utils.minimum_version('1.25')\n def df(self):\n \"\"\"\n Get data usage information.\n\n Returns:\n (dict): A dictionary representing different resource categories\n and their respective data usage.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n url = self._url('/system/df')\n return self._result(self._get(url), True)\n\n def events(self, since=None, until=None, filters=None, decode=None):\n \"\"\"\n Get real-time events from the server. Similar to the ``docker events``\n command.\n\n Args:\n since (UTC datetime or int): Get events from this point\n until (UTC datetime or int): Get events until this point\n filters (dict): Filter the events by event time, container or image\n decode (bool): If set to true, stream will be decoded into dicts on\n the fly. False by default.\n\n Returns:\n (generator): A blocking generator you can iterate over to retrieve\n events as they happen.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n\n Example:\n\n >>> for event in client.events()\n ... print event\n {u'from': u'image/with:tag',\n u'id': u'container-id',\n u'status': u'start',\n u'time': 1423339459}\n ...\n \"\"\"\n\n if isinstance(since, datetime):\n since = utils.datetime_to_timestamp(since)\n\n if isinstance(until, datetime):\n until = utils.datetime_to_timestamp(until)\n\n if filters:\n filters = utils.convert_filters(filters)\n\n params = {\n 'since': since,\n 'until': until,\n 'filters': filters\n }\n\n return self._stream_helper(\n self._get(self._url('/events'), params=params, stream=True),\n decode=decode\n )\n\n def info(self):\n \"\"\"\n Display system-wide information. Identical to the ``docker info``\n command.\n\n Returns:\n (dict): The info as a dict\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n return self._result(self._get(self._url(\"/info\")), True)\n\n def login(self, username, password=None, email=None, registry=None,\n reauth=False, insecure_registry=False, dockercfg_path=None):\n \"\"\"\n Authenticate with a registry. Similar to the ``docker login`` command.\n\n Args:\n username (str): The registry username\n password (str): The plaintext password\n email (str): The email for the registry account\n registry (str): URL to the registry. 
E.g.\n ``https://index.docker.io/v1/``\n reauth (bool): Whether refresh existing authentication on the\n Docker server.\n dockercfg_path (str): Use a custom path for the ``.dockercfg`` file\n (default ``$HOME/.dockercfg``)\n\n Returns:\n (dict): The response from the login request\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n if insecure_registry:\n warnings.warn(\n INSECURE_REGISTRY_DEPRECATION_WARNING.format('login()'),\n DeprecationWarning\n )\n\n # If we don't have any auth data so far, try reloading the config file\n # one more time in case anything showed up in there.\n # If dockercfg_path is passed check to see if the config file exists,\n # if so load that config.\n if dockercfg_path and os.path.exists(dockercfg_path):\n self._auth_configs = auth.load_config(dockercfg_path)\n elif not self._auth_configs:\n self._auth_configs = auth.load_config()\n\n authcfg = auth.resolve_authconfig(self._auth_configs, registry)\n # If we found an existing auth config for this registry and username\n # combination, we can return it immediately unless reauth is requested.\n if authcfg and authcfg.get('username', None) == username \\\n and not reauth:\n return authcfg\n\n req_data = {\n 'username': username,\n 'password': password,\n 'email': email,\n 'serveraddress': registry,\n }\n\n response = self._post_json(self._url('/auth'), data=req_data)\n if response.status_code == 200:\n self._auth_configs[registry or auth.INDEX_NAME] = req_data\n return self._result(response, json=True)\n\n def ping(self):\n \"\"\"\n Checks the server is responsive. An exception will be raised if it\n isn't responding.\n\n Returns:\n (bool) The response from the server.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n return self._result(self._get(self._url('/_ping'))) == 'OK'\n\n def version(self, api_version=True):\n \"\"\"\n Returns version information from the server. Similar to the ``docker\n version`` command.\n\n Returns:\n (dict): The server version information\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n url = self._url(\"/version\", versioned_api=api_version)\n return self._result(self._get(url), json=True)\n", "path": "docker/api/daemon.py"}], "after_files": [{"content": "import os\nimport warnings\nfrom datetime import datetime\n\nfrom .. import auth, utils\nfrom ..constants import INSECURE_REGISTRY_DEPRECATION_WARNING\n\n\nclass DaemonApiMixin(object):\n @utils.minimum_version('1.25')\n def df(self):\n \"\"\"\n Get data usage information.\n\n Returns:\n (dict): A dictionary representing different resource categories\n and their respective data usage.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n url = self._url('/system/df')\n return self._result(self._get(url), True)\n\n def events(self, since=None, until=None, filters=None, decode=None):\n \"\"\"\n Get real-time events from the server. Similar to the ``docker events``\n command.\n\n Args:\n since (UTC datetime or int): Get events from this point\n until (UTC datetime or int): Get events until this point\n filters (dict): Filter the events by event time, container or image\n decode (bool): If set to true, stream will be decoded into dicts on\n the fly. 
False by default.\n\n Returns:\n (generator): A blocking generator you can iterate over to retrieve\n events as they happen.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n\n Example:\n\n >>> for event in client.events()\n ... print event\n {u'from': u'image/with:tag',\n u'id': u'container-id',\n u'status': u'start',\n u'time': 1423339459}\n ...\n \"\"\"\n\n if isinstance(since, datetime):\n since = utils.datetime_to_timestamp(since)\n\n if isinstance(until, datetime):\n until = utils.datetime_to_timestamp(until)\n\n if filters:\n filters = utils.convert_filters(filters)\n\n params = {\n 'since': since,\n 'until': until,\n 'filters': filters\n }\n url = self._url('/events')\n\n return self._stream_helper(\n self._get(url, params=params, stream=True, timeout=None),\n decode=decode\n )\n\n def info(self):\n \"\"\"\n Display system-wide information. Identical to the ``docker info``\n command.\n\n Returns:\n (dict): The info as a dict\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n return self._result(self._get(self._url(\"/info\")), True)\n\n def login(self, username, password=None, email=None, registry=None,\n reauth=False, insecure_registry=False, dockercfg_path=None):\n \"\"\"\n Authenticate with a registry. Similar to the ``docker login`` command.\n\n Args:\n username (str): The registry username\n password (str): The plaintext password\n email (str): The email for the registry account\n registry (str): URL to the registry. E.g.\n ``https://index.docker.io/v1/``\n reauth (bool): Whether refresh existing authentication on the\n Docker server.\n dockercfg_path (str): Use a custom path for the ``.dockercfg`` file\n (default ``$HOME/.dockercfg``)\n\n Returns:\n (dict): The response from the login request\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n if insecure_registry:\n warnings.warn(\n INSECURE_REGISTRY_DEPRECATION_WARNING.format('login()'),\n DeprecationWarning\n )\n\n # If we don't have any auth data so far, try reloading the config file\n # one more time in case anything showed up in there.\n # If dockercfg_path is passed check to see if the config file exists,\n # if so load that config.\n if dockercfg_path and os.path.exists(dockercfg_path):\n self._auth_configs = auth.load_config(dockercfg_path)\n elif not self._auth_configs:\n self._auth_configs = auth.load_config()\n\n authcfg = auth.resolve_authconfig(self._auth_configs, registry)\n # If we found an existing auth config for this registry and username\n # combination, we can return it immediately unless reauth is requested.\n if authcfg and authcfg.get('username', None) == username \\\n and not reauth:\n return authcfg\n\n req_data = {\n 'username': username,\n 'password': password,\n 'email': email,\n 'serveraddress': registry,\n }\n\n response = self._post_json(self._url('/auth'), data=req_data)\n if response.status_code == 200:\n self._auth_configs[registry or auth.INDEX_NAME] = req_data\n return self._result(response, json=True)\n\n def ping(self):\n \"\"\"\n Checks the server is responsive. An exception will be raised if it\n isn't responding.\n\n Returns:\n (bool) The response from the server.\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n return self._result(self._get(self._url('/_ping'))) == 'OK'\n\n def version(self, api_version=True):\n \"\"\"\n Returns version information from the server. 
Similar to the ``docker\n version`` command.\n\n Returns:\n (dict): The server version information\n\n Raises:\n :py:class:`docker.errors.APIError`\n If the server returns an error.\n \"\"\"\n url = self._url(\"/version\", versioned_api=api_version)\n return self._result(self._get(url), json=True)\n", "path": "docker/api/daemon.py"}]}
1,998
117
gh_patches_debug_52919
rasdani/github-patches
git_diff
great-expectations__great_expectations-3469
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Use cleaner solution for non-truncating division in python 2 Prefer `from __future__ import division` to `1.*x/y` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py` Content: ``` 1 import logging 2 from functools import reduce 3 4 from great_expectations.execution_engine import ( 5 PandasExecutionEngine, 6 SparkDFExecutionEngine, 7 SqlAlchemyExecutionEngine, 8 ) 9 from great_expectations.expectations.metrics.import_manager import F, sa 10 from great_expectations.expectations.metrics.map_metric_provider import ( 11 MulticolumnMapMetricProvider, 12 multicolumn_condition_partial, 13 ) 14 15 logger = logging.getLogger(__name__) 16 17 18 class SelectColumnValuesUniqueWithinRecord(MulticolumnMapMetricProvider): 19 condition_metric_name = "select_column_values.unique.within_record" 20 condition_domain_keys = ( 21 "batch_id", 22 "table", 23 "column_list", 24 "row_condition", 25 "condition_parser", 26 "ignore_row_if", 27 ) 28 29 @multicolumn_condition_partial(engine=PandasExecutionEngine) 30 def _pandas(cls, column_list, **kwargs): 31 num_columns = len(column_list.columns) 32 row_wise_cond = column_list.nunique(dropna=False, axis=1) >= num_columns 33 return row_wise_cond 34 35 @multicolumn_condition_partial(engine=SqlAlchemyExecutionEngine) 36 def _sqlalchemy(cls, column_list, **kwargs): 37 """ 38 The present approach relies on an inefficient query condition construction implementation, whose computational 39 cost is O(num_columns^2). However, until a more efficient implementation compatible with SQLAlchemy is 40 available, this is the only feasible mechanism under the current architecture, where map metric providers must 41 return a condition. Nevertheless, SQL query length limit is 1GB (sufficient for most practical scenarios). 42 """ 43 num_columns = len(column_list) 44 45 # An arbitrary "num_columns" value used for issuing an explanatory message as a warning. 46 if num_columns > 100: 47 logger.warning( 48 f"""Batch data with {num_columns} columns is detected. Computing the "{cls.condition_metric_name}" \ 49 metric for wide tables using SQLAlchemy leads to long WHERE clauses for the underlying database engine to process. 50 """ 51 ) 52 53 conditions = sa.or_( 54 *( 55 sa.or_( 56 column_list[idx_src] == column_list[idx_dest], 57 sa.and_( 58 column_list[idx_src] == None, column_list[idx_dest] == None 59 ), 60 ) 61 for idx_src in range(num_columns - 1) 62 for idx_dest in range(idx_src + 1, num_columns) 63 ) 64 ) 65 row_wise_cond = sa.not_(sa.or_(conditions)) 66 return row_wise_cond 67 68 @multicolumn_condition_partial(engine=SparkDFExecutionEngine) 69 def _spark(cls, column_list, **kwargs): 70 column_names = column_list.columns 71 num_columns = len(column_names) 72 73 conditions = [] 74 for idx_src in range(num_columns - 1): 75 for idx_dest in range(idx_src + 1, num_columns): 76 conditions.append( 77 F.col(column_names[idx_src]).eqNullSafe( 78 F.col(column_names[idx_dest]) 79 ) 80 ) 81 82 row_wise_cond = ~reduce(lambda a, b: a | b, conditions) 83 return row_wise_cond 84 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py b/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py --- a/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py +++ b/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py @@ -62,7 +62,7 @@ for idx_dest in range(idx_src + 1, num_columns) ) ) - row_wise_cond = sa.not_(sa.or_(conditions)) + row_wise_cond = sa.not_(conditions) return row_wise_cond @multicolumn_condition_partial(engine=SparkDFExecutionEngine)
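Although the issue text talks about Python 2 division, the golden diff itself only removes a redundant `sa.or_()` wrapper around an expression that is already an OR clause list. A small self-contained SQLAlchemy sketch of the two forms (the table and columns are made up for illustration):

```python
import sqlalchemy as sa

t = sa.table("t", sa.column("a"), sa.column("b"))

# `conditions` is already an OR of pairwise comparisons, as in the metric.
conditions = sa.or_(
    t.c.a == t.c.b,
    sa.and_(t.c.a == None, t.c.b == None),  # noqa: E711 (SQL NULL check)
)

# Pre-patch: the extra sa.or_() wrapper is redundant.
before = sa.not_(sa.or_(conditions))
# Post-patch: negate the OR expression directly.
after = sa.not_(conditions)

print(after)  # e.g. NOT (t.a = t.b OR t.a IS NULL AND t.b IS NULL)
```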
{"golden_diff": "diff --git a/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py b/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py\n--- a/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py\n+++ b/great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py\n@@ -62,7 +62,7 @@\n for idx_dest in range(idx_src + 1, num_columns)\n )\n )\n- row_wise_cond = sa.not_(sa.or_(conditions))\n+ row_wise_cond = sa.not_(conditions)\n return row_wise_cond\n \n @multicolumn_condition_partial(engine=SparkDFExecutionEngine)\n", "issue": "Use cleaner solution for non-truncating division in python 2\nPrefer `from __future__ import division` to `1.*x/y`\n", "before_files": [{"content": "import logging\nfrom functools import reduce\n\nfrom great_expectations.execution_engine import (\n PandasExecutionEngine,\n SparkDFExecutionEngine,\n SqlAlchemyExecutionEngine,\n)\nfrom great_expectations.expectations.metrics.import_manager import F, sa\nfrom great_expectations.expectations.metrics.map_metric_provider import (\n MulticolumnMapMetricProvider,\n multicolumn_condition_partial,\n)\n\nlogger = logging.getLogger(__name__)\n\n\nclass SelectColumnValuesUniqueWithinRecord(MulticolumnMapMetricProvider):\n condition_metric_name = \"select_column_values.unique.within_record\"\n condition_domain_keys = (\n \"batch_id\",\n \"table\",\n \"column_list\",\n \"row_condition\",\n \"condition_parser\",\n \"ignore_row_if\",\n )\n\n @multicolumn_condition_partial(engine=PandasExecutionEngine)\n def _pandas(cls, column_list, **kwargs):\n num_columns = len(column_list.columns)\n row_wise_cond = column_list.nunique(dropna=False, axis=1) >= num_columns\n return row_wise_cond\n\n @multicolumn_condition_partial(engine=SqlAlchemyExecutionEngine)\n def _sqlalchemy(cls, column_list, **kwargs):\n \"\"\"\n The present approach relies on an inefficient query condition construction implementation, whose computational\n cost is O(num_columns^2). However, until a more efficient implementation compatible with SQLAlchemy is\n available, this is the only feasible mechanism under the current architecture, where map metric providers must\n return a condition. Nevertheless, SQL query length limit is 1GB (sufficient for most practical scenarios).\n \"\"\"\n num_columns = len(column_list)\n\n # An arbitrary \"num_columns\" value used for issuing an explanatory message as a warning.\n if num_columns > 100:\n logger.warning(\n f\"\"\"Batch data with {num_columns} columns is detected. 
Computing the \"{cls.condition_metric_name}\" \\\nmetric for wide tables using SQLAlchemy leads to long WHERE clauses for the underlying database engine to process.\n\"\"\"\n )\n\n conditions = sa.or_(\n *(\n sa.or_(\n column_list[idx_src] == column_list[idx_dest],\n sa.and_(\n column_list[idx_src] == None, column_list[idx_dest] == None\n ),\n )\n for idx_src in range(num_columns - 1)\n for idx_dest in range(idx_src + 1, num_columns)\n )\n )\n row_wise_cond = sa.not_(sa.or_(conditions))\n return row_wise_cond\n\n @multicolumn_condition_partial(engine=SparkDFExecutionEngine)\n def _spark(cls, column_list, **kwargs):\n column_names = column_list.columns\n num_columns = len(column_names)\n\n conditions = []\n for idx_src in range(num_columns - 1):\n for idx_dest in range(idx_src + 1, num_columns):\n conditions.append(\n F.col(column_names[idx_src]).eqNullSafe(\n F.col(column_names[idx_dest])\n )\n )\n\n row_wise_cond = ~reduce(lambda a, b: a | b, conditions)\n return row_wise_cond\n", "path": "great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py"}], "after_files": [{"content": "import logging\nfrom functools import reduce\n\nfrom great_expectations.execution_engine import (\n PandasExecutionEngine,\n SparkDFExecutionEngine,\n SqlAlchemyExecutionEngine,\n)\nfrom great_expectations.expectations.metrics.import_manager import F, sa\nfrom great_expectations.expectations.metrics.map_metric_provider import (\n MulticolumnMapMetricProvider,\n multicolumn_condition_partial,\n)\n\nlogger = logging.getLogger(__name__)\n\n\nclass SelectColumnValuesUniqueWithinRecord(MulticolumnMapMetricProvider):\n condition_metric_name = \"select_column_values.unique.within_record\"\n condition_domain_keys = (\n \"batch_id\",\n \"table\",\n \"column_list\",\n \"row_condition\",\n \"condition_parser\",\n \"ignore_row_if\",\n )\n\n @multicolumn_condition_partial(engine=PandasExecutionEngine)\n def _pandas(cls, column_list, **kwargs):\n num_columns = len(column_list.columns)\n row_wise_cond = column_list.nunique(dropna=False, axis=1) >= num_columns\n return row_wise_cond\n\n @multicolumn_condition_partial(engine=SqlAlchemyExecutionEngine)\n def _sqlalchemy(cls, column_list, **kwargs):\n \"\"\"\n The present approach relies on an inefficient query condition construction implementation, whose computational\n cost is O(num_columns^2). However, until a more efficient implementation compatible with SQLAlchemy is\n available, this is the only feasible mechanism under the current architecture, where map metric providers must\n return a condition. Nevertheless, SQL query length limit is 1GB (sufficient for most practical scenarios).\n \"\"\"\n num_columns = len(column_list)\n\n # An arbitrary \"num_columns\" value used for issuing an explanatory message as a warning.\n if num_columns > 100:\n logger.warning(\n f\"\"\"Batch data with {num_columns} columns is detected. 
Computing the \"{cls.condition_metric_name}\" \\\nmetric for wide tables using SQLAlchemy leads to long WHERE clauses for the underlying database engine to process.\n\"\"\"\n )\n\n conditions = sa.or_(\n *(\n sa.or_(\n column_list[idx_src] == column_list[idx_dest],\n sa.and_(\n column_list[idx_src] == None, column_list[idx_dest] == None\n ),\n )\n for idx_src in range(num_columns - 1)\n for idx_dest in range(idx_src + 1, num_columns)\n )\n )\n row_wise_cond = sa.not_(conditions)\n return row_wise_cond\n\n @multicolumn_condition_partial(engine=SparkDFExecutionEngine)\n def _spark(cls, column_list, **kwargs):\n column_names = column_list.columns\n num_columns = len(column_names)\n\n conditions = []\n for idx_src in range(num_columns - 1):\n for idx_dest in range(idx_src + 1, num_columns):\n conditions.append(\n F.col(column_names[idx_src]).eqNullSafe(\n F.col(column_names[idx_dest])\n )\n )\n\n row_wise_cond = ~reduce(lambda a, b: a | b, conditions)\n return row_wise_cond\n", "path": "great_expectations/expectations/metrics/multicolumn_map_metrics/select_column_values_unique_within_record.py"}]}
1,122
179
gh_patches_debug_43188
rasdani/github-patches
git_diff
pyqtgraph__pyqtgraph-2357
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Axes zoom area does not resize in 0.12.4 ### Short description When a plot is first generated with a given size, say `width ` and `height` in pixels, the entire axes areas on the left and bottom of the plot are zoomable. When the plot size is resized such that the plot is larger (e.g. the window is grabbed along an edge or corner and expanded), only the top `height` portion of the y-axes or the left `width` portion of the x-axis retain the ability to zoom the plot using the scroll wheel. The axes area outside (i.e. the lower portion of the y-axis or the right portion of the x-axis) are not zoomable. If hovering over the plot, not on an axes, there is no issue with zooming. Reverting to 0.12.3 fixes issue. I was able to reproduce this in both custom plots and many of the pyqtgraph.example scripts. ### Tested environment(s) * PyQtGraph version: 0.12.4 * Qt Python binding: PySide2 5.15.2.1 Qt 5.15.2 * Python version: 3.7 * NumPy version: 1.21.6 * Operating system: Windows 10 Enterprise 21H2 * Installation method: pip --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `pyqtgraph/graphicsItems/GraphicsWidget.py` Content: ``` 1 from ..Qt import QtGui, QtWidgets 2 from .GraphicsItem import GraphicsItem 3 4 __all__ = ['GraphicsWidget'] 5 6 class GraphicsWidget(GraphicsItem, QtWidgets.QGraphicsWidget): 7 8 _qtBaseClass = QtWidgets.QGraphicsWidget 9 def __init__(self, *args, **kargs): 10 """ 11 **Bases:** :class:`GraphicsItem <pyqtgraph.GraphicsItem>`, :class:`QtWidgets.QGraphicsWidget` 12 13 Extends QGraphicsWidget with several helpful methods and workarounds for PyQt bugs. 14 Most of the extra functionality is inherited from :class:`GraphicsItem <pyqtgraph.GraphicsItem>`. 15 """ 16 QtWidgets.QGraphicsWidget.__init__(self, *args, **kargs) 17 GraphicsItem.__init__(self) 18 19 # cache bouding rect and geometry 20 self._boundingRectCache = self._previousGeometry = None 21 self._painterPathCache = None 22 23 ## done by GraphicsItem init 24 #GraphicsScene.registerObject(self) ## workaround for pyqt bug in graphicsscene.items() 25 26 # Removed due to https://bugreports.qt-project.org/browse/PYSIDE-86 27 #def itemChange(self, change, value): 28 ## BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing! 29 ##ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) ## segv occurs here 30 ## The default behavior is just to return the value argument, so we'll do that 31 ## without calling the original method. 32 #ret = value 33 #if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]: 34 #self._updateView() 35 #return ret 36 37 def setFixedHeight(self, h): 38 self.setMaximumHeight(h) 39 self.setMinimumHeight(h) 40 41 def setFixedWidth(self, h): 42 self.setMaximumWidth(h) 43 self.setMinimumWidth(h) 44 45 def height(self): 46 return self.geometry().height() 47 48 def width(self): 49 return self.geometry().width() 50 51 def boundingRect(self): 52 geometry = self.geometry() 53 if geometry != self._previousGeometry: 54 self._painterPathCache = None 55 56 br = self.mapRectFromParent(geometry).normalized() 57 self._boundingRectCache = br 58 self._previousGeometry = geometry 59 else: 60 br = self._boundingRectCache 61 62 return br 63 64 def shape(self): ## No idea why this is necessary, but rotated items do not receive clicks otherwise. 
65 p = self._painterPathCache 66 if p is None: 67 self._painterPathCache = p = QtGui.QPainterPath() 68 p.addRect(self.boundingRect()) 69 70 return p 71 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/pyqtgraph/graphicsItems/GraphicsWidget.py b/pyqtgraph/graphicsItems/GraphicsWidget.py --- a/pyqtgraph/graphicsItems/GraphicsWidget.py +++ b/pyqtgraph/graphicsItems/GraphicsWidget.py @@ -3,36 +3,43 @@ __all__ = ['GraphicsWidget'] + class GraphicsWidget(GraphicsItem, QtWidgets.QGraphicsWidget): _qtBaseClass = QtWidgets.QGraphicsWidget - def __init__(self, *args, **kargs): + + def __init__(self, *args, **kwargs): """ **Bases:** :class:`GraphicsItem <pyqtgraph.GraphicsItem>`, :class:`QtWidgets.QGraphicsWidget` Extends QGraphicsWidget with several helpful methods and workarounds for PyQt bugs. Most of the extra functionality is inherited from :class:`GraphicsItem <pyqtgraph.GraphicsItem>`. """ - QtWidgets.QGraphicsWidget.__init__(self, *args, **kargs) + QtWidgets.QGraphicsWidget.__init__(self, *args, **kwargs) GraphicsItem.__init__(self) - # cache bouding rect and geometry + # cache bounding rect and geometry self._boundingRectCache = self._previousGeometry = None self._painterPathCache = None - - ## done by GraphicsItem init - #GraphicsScene.registerObject(self) ## workaround for pyqt bug in graphicsscene.items() + self.geometryChanged.connect(self._resetCachedProperties) + + # done by GraphicsItem init + # GraphicsScene.registerObject(self) # workaround for pyqt bug in GraphicsScene.items() # Removed due to https://bugreports.qt-project.org/browse/PYSIDE-86 - #def itemChange(self, change, value): - ## BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing! - ##ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) ## segv occurs here - ## The default behavior is just to return the value argument, so we'll do that - ## without calling the original method. - #ret = value - #if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]: - #self._updateView() - #return ret + # def itemChange(self, change, value): + # # BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing! + # # ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) # segv occurs here + # # The default behavior is just to return the value argument, so we'll do that + # # without calling the original method. + # ret = value + # if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]: + # self._updateView() + # return ret + + def _resetCachedProperties(self): + self._boundingRectCache = self._previousGeometry = None + self._painterPathCache = None def setFixedHeight(self, h): self.setMaximumHeight(h) @@ -41,10 +48,10 @@ def setFixedWidth(self, h): self.setMaximumWidth(h) self.setMinimumWidth(h) - + def height(self): return self.geometry().height() - + def width(self): return self.geometry().width() @@ -52,19 +59,16 @@ geometry = self.geometry() if geometry != self._previousGeometry: self._painterPathCache = None - br = self.mapRectFromParent(geometry).normalized() self._boundingRectCache = br self._previousGeometry = geometry else: br = self._boundingRectCache - return br - def shape(self): ## No idea why this is necessary, but rotated items do not receive clicks otherwise. + def shape(self): p = self._painterPathCache if p is None: self._painterPathCache = p = QtGui.QPainterPath() p.addRect(self.boundingRect()) - return p
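The essence of the fix is cache invalidation: geometry-derived caches must be cleared whenever Qt reports a geometry change, otherwise a stale `shape()` keeps answering wheel and hit tests for the old size. A stripped-down sketch of that pattern (not pyqtgraph's actual class, just the signal wiring the patch adds):

```python
from pyqtgraph.Qt import QtWidgets

app = QtWidgets.QApplication([])  # Qt widgets need an application object

class CachedWidget(QtWidgets.QGraphicsWidget):
    def __init__(self):
        super().__init__()
        self._shape_cache = None
        # The key line from the patch: drop caches on every geometry change,
        # e.g. when the containing plot window is resized.
        self.geometryChanged.connect(self._reset_cache)

    def _reset_cache(self):
        self._shape_cache = None

w = CachedWidget()
w.resize(200, 100)     # triggers geometryChanged -> cache reset
print(w._shape_cache)  # None
```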
{"golden_diff": "diff --git a/pyqtgraph/graphicsItems/GraphicsWidget.py b/pyqtgraph/graphicsItems/GraphicsWidget.py\n--- a/pyqtgraph/graphicsItems/GraphicsWidget.py\n+++ b/pyqtgraph/graphicsItems/GraphicsWidget.py\n@@ -3,36 +3,43 @@\n \n __all__ = ['GraphicsWidget']\n \n+\n class GraphicsWidget(GraphicsItem, QtWidgets.QGraphicsWidget):\n \n _qtBaseClass = QtWidgets.QGraphicsWidget\n- def __init__(self, *args, **kargs):\n+\n+ def __init__(self, *args, **kwargs):\n \"\"\"\n **Bases:** :class:`GraphicsItem <pyqtgraph.GraphicsItem>`, :class:`QtWidgets.QGraphicsWidget`\n \n Extends QGraphicsWidget with several helpful methods and workarounds for PyQt bugs. \n Most of the extra functionality is inherited from :class:`GraphicsItem <pyqtgraph.GraphicsItem>`.\n \"\"\"\n- QtWidgets.QGraphicsWidget.__init__(self, *args, **kargs)\n+ QtWidgets.QGraphicsWidget.__init__(self, *args, **kwargs)\n GraphicsItem.__init__(self)\n \n- # cache bouding rect and geometry\n+ # cache bounding rect and geometry\n self._boundingRectCache = self._previousGeometry = None\n self._painterPathCache = None\n- \n- ## done by GraphicsItem init\n- #GraphicsScene.registerObject(self) ## workaround for pyqt bug in graphicsscene.items()\n+ self.geometryChanged.connect(self._resetCachedProperties)\n+\n+ # done by GraphicsItem init\n+ # GraphicsScene.registerObject(self) # workaround for pyqt bug in GraphicsScene.items()\n \n # Removed due to https://bugreports.qt-project.org/browse/PYSIDE-86\n- #def itemChange(self, change, value):\n- ## BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing!\n- ##ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) ## segv occurs here\n- ## The default behavior is just to return the value argument, so we'll do that\n- ## without calling the original method.\n- #ret = value\n- #if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]:\n- #self._updateView()\n- #return ret\n+ # def itemChange(self, change, value):\n+ # # BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing!\n+ # # ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) # segv occurs here\n+ # # The default behavior is just to return the value argument, so we'll do that\n+ # # without calling the original method.\n+ # ret = value\n+ # if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]:\n+ # self._updateView()\n+ # return ret\n+\n+ def _resetCachedProperties(self):\n+ self._boundingRectCache = self._previousGeometry = None\n+ self._painterPathCache = None\n \n def setFixedHeight(self, h):\n self.setMaximumHeight(h)\n@@ -41,10 +48,10 @@\n def setFixedWidth(self, h):\n self.setMaximumWidth(h)\n self.setMinimumWidth(h)\n- \n+\n def height(self):\n return self.geometry().height()\n- \n+\n def width(self):\n return self.geometry().width()\n \n@@ -52,19 +59,16 @@\n geometry = self.geometry()\n if geometry != self._previousGeometry:\n self._painterPathCache = None\n- \n br = self.mapRectFromParent(geometry).normalized()\n self._boundingRectCache = br\n self._previousGeometry = geometry\n else:\n br = self._boundingRectCache\n-\n return br\n \n- def shape(self): ## No idea why this is necessary, but rotated items do not receive clicks otherwise.\n+ def shape(self):\n p = self._painterPathCache\n if p is None:\n self._painterPathCache = p = QtGui.QPainterPath()\n p.addRect(self.boundingRect())\n-\n return p\n", "issue": "Axes zoom area does not resize in 0.12.4\n### Short description\r\nWhen a plot is first generated with a given size, say `width ` and `height` in pixels, the 
entire axes areas on the left and bottom of the plot are zoomable. When the plot size is resized such that the plot is larger (e.g. the window is grabbed along an edge or corner and expanded), only the top `height` portion of the y-axes or the left `width` portion of the x-axis retain the ability to zoom the plot using the scroll wheel. The axes area outside (i.e. the lower portion of the y-axis or the right portion of the x-axis) are not zoomable. If hovering over the plot, not on an axes, there is no issue with zooming. Reverting to 0.12.3 fixes issue.\r\n\r\nI was able to reproduce this in both custom plots and many of the pyqtgraph.example scripts.\r\n\r\n### Tested environment(s)\r\n\r\n * PyQtGraph version: 0.12.4\r\n * Qt Python binding: PySide2 5.15.2.1 Qt 5.15.2\r\n * Python version: 3.7\r\n * NumPy version: 1.21.6\r\n * Operating system: Windows 10 Enterprise 21H2\r\n * Installation method: pip\r\n\n", "before_files": [{"content": "from ..Qt import QtGui, QtWidgets\nfrom .GraphicsItem import GraphicsItem\n\n__all__ = ['GraphicsWidget']\n\nclass GraphicsWidget(GraphicsItem, QtWidgets.QGraphicsWidget):\n \n _qtBaseClass = QtWidgets.QGraphicsWidget\n def __init__(self, *args, **kargs):\n \"\"\"\n **Bases:** :class:`GraphicsItem <pyqtgraph.GraphicsItem>`, :class:`QtWidgets.QGraphicsWidget`\n \n Extends QGraphicsWidget with several helpful methods and workarounds for PyQt bugs. \n Most of the extra functionality is inherited from :class:`GraphicsItem <pyqtgraph.GraphicsItem>`.\n \"\"\"\n QtWidgets.QGraphicsWidget.__init__(self, *args, **kargs)\n GraphicsItem.__init__(self)\n\n # cache bouding rect and geometry\n self._boundingRectCache = self._previousGeometry = None\n self._painterPathCache = None\n \n ## done by GraphicsItem init\n #GraphicsScene.registerObject(self) ## workaround for pyqt bug in graphicsscene.items()\n\n # Removed due to https://bugreports.qt-project.org/browse/PYSIDE-86\n #def itemChange(self, change, value):\n ## BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing!\n ##ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) ## segv occurs here\n ## The default behavior is just to return the value argument, so we'll do that\n ## without calling the original method.\n #ret = value\n #if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]:\n #self._updateView()\n #return ret\n\n def setFixedHeight(self, h):\n self.setMaximumHeight(h)\n self.setMinimumHeight(h)\n\n def setFixedWidth(self, h):\n self.setMaximumWidth(h)\n self.setMinimumWidth(h)\n \n def height(self):\n return self.geometry().height()\n \n def width(self):\n return self.geometry().width()\n\n def boundingRect(self):\n geometry = self.geometry()\n if geometry != self._previousGeometry:\n self._painterPathCache = None\n \n br = self.mapRectFromParent(geometry).normalized()\n self._boundingRectCache = br\n self._previousGeometry = geometry\n else:\n br = self._boundingRectCache\n\n return br\n\n def shape(self): ## No idea why this is necessary, but rotated items do not receive clicks otherwise.\n p = self._painterPathCache\n if p is None:\n self._painterPathCache = p = QtGui.QPainterPath()\n p.addRect(self.boundingRect())\n\n return p\n", "path": "pyqtgraph/graphicsItems/GraphicsWidget.py"}], "after_files": [{"content": "from ..Qt import QtGui, QtWidgets\nfrom .GraphicsItem import GraphicsItem\n\n__all__ = ['GraphicsWidget']\n\n\nclass GraphicsWidget(GraphicsItem, QtWidgets.QGraphicsWidget):\n \n _qtBaseClass = QtWidgets.QGraphicsWidget\n\n def __init__(self, *args, 
**kwargs):\n \"\"\"\n **Bases:** :class:`GraphicsItem <pyqtgraph.GraphicsItem>`, :class:`QtWidgets.QGraphicsWidget`\n \n Extends QGraphicsWidget with several helpful methods and workarounds for PyQt bugs. \n Most of the extra functionality is inherited from :class:`GraphicsItem <pyqtgraph.GraphicsItem>`.\n \"\"\"\n QtWidgets.QGraphicsWidget.__init__(self, *args, **kwargs)\n GraphicsItem.__init__(self)\n\n # cache bounding rect and geometry\n self._boundingRectCache = self._previousGeometry = None\n self._painterPathCache = None\n self.geometryChanged.connect(self._resetCachedProperties)\n\n # done by GraphicsItem init\n # GraphicsScene.registerObject(self) # workaround for pyqt bug in GraphicsScene.items()\n\n # Removed due to https://bugreports.qt-project.org/browse/PYSIDE-86\n # def itemChange(self, change, value):\n # # BEWARE: Calling QGraphicsWidget.itemChange can lead to crashing!\n # # ret = QtWidgets.QGraphicsWidget.itemChange(self, change, value) # segv occurs here\n # # The default behavior is just to return the value argument, so we'll do that\n # # without calling the original method.\n # ret = value\n # if change in [self.ItemParentHasChanged, self.ItemSceneHasChanged]:\n # self._updateView()\n # return ret\n\n def _resetCachedProperties(self):\n self._boundingRectCache = self._previousGeometry = None\n self._painterPathCache = None\n\n def setFixedHeight(self, h):\n self.setMaximumHeight(h)\n self.setMinimumHeight(h)\n\n def setFixedWidth(self, h):\n self.setMaximumWidth(h)\n self.setMinimumWidth(h)\n\n def height(self):\n return self.geometry().height()\n\n def width(self):\n return self.geometry().width()\n\n def boundingRect(self):\n geometry = self.geometry()\n if geometry != self._previousGeometry:\n self._painterPathCache = None\n br = self.mapRectFromParent(geometry).normalized()\n self._boundingRectCache = br\n self._previousGeometry = geometry\n else:\n br = self._boundingRectCache\n return br\n\n def shape(self):\n p = self._painterPathCache\n if p is None:\n self._painterPathCache = p = QtGui.QPainterPath()\n p.addRect(self.boundingRect())\n return p\n", "path": "pyqtgraph/graphicsItems/GraphicsWidget.py"}]}
1,261
914
gh_patches_debug_41159
rasdani/github-patches
git_diff
azavea__raster-vision-328
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Fix raster stats bug If you run compute_raster_stats on 4-channel imagery (yielding stats for 4 channels), and use a `channel_order` of [0, 1, 2] in your raster_transformer, and then switch to using 3-channel imagery, it leads to an error because currently the `means` do not have the `channel_order` applied to them before being subtracted from the raster. In other words, 4 channel means is subtracted from a 3 channel raster. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/rastervision/builders/raster_transformer_builder.py` Content: ``` 1 from rastervision.core.raster_transformer import RasterTransformer 2 3 4 def build(config): 5 return RasterTransformer(config) 6 ``` Path: `src/rastervision/core/raster_transformer.py` Content: ``` 1 import numpy as np 2 3 from rastervision.core.raster_stats import RasterStats 4 5 6 class RasterTransformer(object): 7 """Transforms chips according to a config.""" 8 9 def __init__(self, options): 10 """Construct a new RasterTransformer. 11 12 Args: 13 options: protos.raster_transformer_pb2.RasterTransformer 14 """ 15 self.options = options 16 self.raster_stats = None 17 if options.stats_uri: 18 self.raster_stats = RasterStats() 19 self.raster_stats.load(options.stats_uri) 20 21 def transform(self, chip): 22 """Transform a chip. 23 24 Selects a subset of the channels and transforms non-uint8 to 25 uint8 values using options.stats_uri 26 27 Args: 28 chip: [height, width, channels] numpy array 29 30 Returns: 31 [height, width, channels] uint8 numpy array where channels is equal 32 to len(self.options.channel_order) 33 """ 34 if chip.dtype != np.uint8: 35 if self.raster_stats: 36 # Subtract mean and divide by std to get zscores. 37 means = np.array(self.raster_stats.means) 38 means = means[np.newaxis, np.newaxis, :].astype(np.float) 39 stds = np.array(self.raster_stats.stds) 40 stds = stds[np.newaxis, np.newaxis, :].astype(np.float) 41 42 # Don't transform NODATA zero values. 43 nodata = chip == 0 44 45 chip = chip - means 46 chip = chip / stds 47 48 # Make zscores that fall between -3 and 3 span 0 to 255. 49 chip += 3 50 chip /= 6 51 52 chip = np.clip(chip, 0, 1) 53 chip *= 255 54 chip = chip.astype(np.uint8) 55 56 chip[nodata] = 0 57 else: 58 raise ValueError( 59 'Need to provide stats_uri for non-uint8 rasters.') 60 61 if self.options.channel_order: 62 return chip[:, :, self.options.channel_order] 63 return chip 64 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/rastervision/builders/raster_transformer_builder.py b/src/rastervision/builders/raster_transformer_builder.py --- a/src/rastervision/builders/raster_transformer_builder.py +++ b/src/rastervision/builders/raster_transformer_builder.py @@ -1,5 +1,12 @@ from rastervision.core.raster_transformer import RasterTransformer +from rastervision.core.raster_stats import RasterStats def build(config): - return RasterTransformer(config) + raster_stats = None + if config.stats_uri: + raster_stats = RasterStats() + raster_stats.load(config.stats_uri) + + return RasterTransformer( + channel_order=config.channel_order, raster_stats=raster_stats) diff --git a/src/rastervision/core/raster_transformer.py b/src/rastervision/core/raster_transformer.py --- a/src/rastervision/core/raster_transformer.py +++ b/src/rastervision/core/raster_transformer.py @@ -1,43 +1,50 @@ import numpy as np -from rastervision.core.raster_stats import RasterStats - class RasterTransformer(object): - """Transforms chips according to a config.""" + """Transforms raw chips to be input to a neural network.""" - def __init__(self, options): + def __init__(self, channel_order=None, raster_stats=None): """Construct a new RasterTransformer. Args: - options: protos.raster_transformer_pb2.RasterTransformer + channel_order: numpy array of length n where n is the number of + channels to use and the values are channel indices + raster_stats: (RasterStats) used to transform chip to have + desired statistics """ - self.options = options - self.raster_stats = None - if options.stats_uri: - self.raster_stats = RasterStats() - self.raster_stats.load(options.stats_uri) + self.channel_order = channel_order + self.raster_stats = raster_stats def transform(self, chip): """Transform a chip. Selects a subset of the channels and transforms non-uint8 to - uint8 values using options.stats_uri + uint8 values using raster_stats. Args: chip: [height, width, channels] numpy array Returns: [height, width, channels] uint8 numpy array where channels is equal - to len(self.options.channel_order) + to len(channel_order) """ + if self.channel_order is None: + channel_order = np.arange(chip.shape[2]) + else: + channel_order = self.channel_order + + chip = chip[:, :, channel_order] + if chip.dtype != np.uint8: if self.raster_stats: # Subtract mean and divide by std to get zscores. means = np.array(self.raster_stats.means) - means = means[np.newaxis, np.newaxis, :].astype(np.float) + means = means[np.newaxis, np.newaxis, channel_order].astype( + np.float) stds = np.array(self.raster_stats.stds) - stds = stds[np.newaxis, np.newaxis, :].astype(np.float) + stds = stds[np.newaxis, np.newaxis, channel_order].astype( + np.float) # Don't transform NODATA zero values. nodata = chip == 0 @@ -56,8 +63,6 @@ chip[nodata] = 0 else: raise ValueError( - 'Need to provide stats_uri for non-uint8 rasters.') + 'Need to provide raster_stats for non-uint8 rasters.') - if self.options.channel_order: - return chip[:, :, self.options.channel_order] return chip
{"golden_diff": "diff --git a/src/rastervision/builders/raster_transformer_builder.py b/src/rastervision/builders/raster_transformer_builder.py\n--- a/src/rastervision/builders/raster_transformer_builder.py\n+++ b/src/rastervision/builders/raster_transformer_builder.py\n@@ -1,5 +1,12 @@\n from rastervision.core.raster_transformer import RasterTransformer\n+from rastervision.core.raster_stats import RasterStats\n \n \n def build(config):\n- return RasterTransformer(config)\n+ raster_stats = None\n+ if config.stats_uri:\n+ raster_stats = RasterStats()\n+ raster_stats.load(config.stats_uri)\n+\n+ return RasterTransformer(\n+ channel_order=config.channel_order, raster_stats=raster_stats)\ndiff --git a/src/rastervision/core/raster_transformer.py b/src/rastervision/core/raster_transformer.py\n--- a/src/rastervision/core/raster_transformer.py\n+++ b/src/rastervision/core/raster_transformer.py\n@@ -1,43 +1,50 @@\n import numpy as np\n \n-from rastervision.core.raster_stats import RasterStats\n-\n \n class RasterTransformer(object):\n- \"\"\"Transforms chips according to a config.\"\"\"\n+ \"\"\"Transforms raw chips to be input to a neural network.\"\"\"\n \n- def __init__(self, options):\n+ def __init__(self, channel_order=None, raster_stats=None):\n \"\"\"Construct a new RasterTransformer.\n \n Args:\n- options: protos.raster_transformer_pb2.RasterTransformer\n+ channel_order: numpy array of length n where n is the number of\n+ channels to use and the values are channel indices\n+ raster_stats: (RasterStats) used to transform chip to have\n+ desired statistics\n \"\"\"\n- self.options = options\n- self.raster_stats = None\n- if options.stats_uri:\n- self.raster_stats = RasterStats()\n- self.raster_stats.load(options.stats_uri)\n+ self.channel_order = channel_order\n+ self.raster_stats = raster_stats\n \n def transform(self, chip):\n \"\"\"Transform a chip.\n \n Selects a subset of the channels and transforms non-uint8 to\n- uint8 values using options.stats_uri\n+ uint8 values using raster_stats.\n \n Args:\n chip: [height, width, channels] numpy array\n \n Returns:\n [height, width, channels] uint8 numpy array where channels is equal\n- to len(self.options.channel_order)\n+ to len(channel_order)\n \"\"\"\n+ if self.channel_order is None:\n+ channel_order = np.arange(chip.shape[2])\n+ else:\n+ channel_order = self.channel_order\n+\n+ chip = chip[:, :, channel_order]\n+\n if chip.dtype != np.uint8:\n if self.raster_stats:\n # Subtract mean and divide by std to get zscores.\n means = np.array(self.raster_stats.means)\n- means = means[np.newaxis, np.newaxis, :].astype(np.float)\n+ means = means[np.newaxis, np.newaxis, channel_order].astype(\n+ np.float)\n stds = np.array(self.raster_stats.stds)\n- stds = stds[np.newaxis, np.newaxis, :].astype(np.float)\n+ stds = stds[np.newaxis, np.newaxis, channel_order].astype(\n+ np.float)\n \n # Don't transform NODATA zero values.\n nodata = chip == 0\n@@ -56,8 +63,6 @@\n chip[nodata] = 0\n else:\n raise ValueError(\n- 'Need to provide stats_uri for non-uint8 rasters.')\n+ 'Need to provide raster_stats for non-uint8 rasters.')\n \n- if self.options.channel_order:\n- return chip[:, :, self.options.channel_order]\n return chip\n", "issue": "Fix raster stats bug\nIf you run compute_raster_stats on 4-channel imagery (yielding stats for 4 channels), and use a `channel_order` of [0, 1, 2] in your raster_transformer, and then switch to using 3-channel imagery, it leads to an error because currently the `means` do not have the `channel_order` applied to them before being 
subtracted from the raster. In other words, 4 channel means is subtracted from a 3 channel raster.\n", "before_files": [{"content": "from rastervision.core.raster_transformer import RasterTransformer\n\n\ndef build(config):\n return RasterTransformer(config)\n", "path": "src/rastervision/builders/raster_transformer_builder.py"}, {"content": "import numpy as np\n\nfrom rastervision.core.raster_stats import RasterStats\n\n\nclass RasterTransformer(object):\n \"\"\"Transforms chips according to a config.\"\"\"\n\n def __init__(self, options):\n \"\"\"Construct a new RasterTransformer.\n\n Args:\n options: protos.raster_transformer_pb2.RasterTransformer\n \"\"\"\n self.options = options\n self.raster_stats = None\n if options.stats_uri:\n self.raster_stats = RasterStats()\n self.raster_stats.load(options.stats_uri)\n\n def transform(self, chip):\n \"\"\"Transform a chip.\n\n Selects a subset of the channels and transforms non-uint8 to\n uint8 values using options.stats_uri\n\n Args:\n chip: [height, width, channels] numpy array\n\n Returns:\n [height, width, channels] uint8 numpy array where channels is equal\n to len(self.options.channel_order)\n \"\"\"\n if chip.dtype != np.uint8:\n if self.raster_stats:\n # Subtract mean and divide by std to get zscores.\n means = np.array(self.raster_stats.means)\n means = means[np.newaxis, np.newaxis, :].astype(np.float)\n stds = np.array(self.raster_stats.stds)\n stds = stds[np.newaxis, np.newaxis, :].astype(np.float)\n\n # Don't transform NODATA zero values.\n nodata = chip == 0\n\n chip = chip - means\n chip = chip / stds\n\n # Make zscores that fall between -3 and 3 span 0 to 255.\n chip += 3\n chip /= 6\n\n chip = np.clip(chip, 0, 1)\n chip *= 255\n chip = chip.astype(np.uint8)\n\n chip[nodata] = 0\n else:\n raise ValueError(\n 'Need to provide stats_uri for non-uint8 rasters.')\n\n if self.options.channel_order:\n return chip[:, :, self.options.channel_order]\n return chip\n", "path": "src/rastervision/core/raster_transformer.py"}], "after_files": [{"content": "from rastervision.core.raster_transformer import RasterTransformer\nfrom rastervision.core.raster_stats import RasterStats\n\n\ndef build(config):\n raster_stats = None\n if config.stats_uri:\n raster_stats = RasterStats()\n raster_stats.load(config.stats_uri)\n\n return RasterTransformer(\n channel_order=config.channel_order, raster_stats=raster_stats)\n", "path": "src/rastervision/builders/raster_transformer_builder.py"}, {"content": "import numpy as np\n\n\nclass RasterTransformer(object):\n \"\"\"Transforms raw chips to be input to a neural network.\"\"\"\n\n def __init__(self, channel_order=None, raster_stats=None):\n \"\"\"Construct a new RasterTransformer.\n\n Args:\n channel_order: numpy array of length n where n is the number of\n channels to use and the values are channel indices\n raster_stats: (RasterStats) used to transform chip to have\n desired statistics\n \"\"\"\n self.channel_order = channel_order\n self.raster_stats = raster_stats\n\n def transform(self, chip):\n \"\"\"Transform a chip.\n\n Selects a subset of the channels and transforms non-uint8 to\n uint8 values using raster_stats.\n\n Args:\n chip: [height, width, channels] numpy array\n\n Returns:\n [height, width, channels] uint8 numpy array where channels is equal\n to len(channel_order)\n \"\"\"\n if self.channel_order is None:\n channel_order = np.arange(chip.shape[2])\n else:\n channel_order = self.channel_order\n\n chip = chip[:, :, channel_order]\n\n if chip.dtype != np.uint8:\n if self.raster_stats:\n # 
Subtract mean and divide by std to get zscores.\n means = np.array(self.raster_stats.means)\n means = means[np.newaxis, np.newaxis, channel_order].astype(\n np.float)\n stds = np.array(self.raster_stats.stds)\n stds = stds[np.newaxis, np.newaxis, channel_order].astype(\n np.float)\n\n # Don't transform NODATA zero values.\n nodata = chip == 0\n\n chip = chip - means\n chip = chip / stds\n\n # Make zscores that fall between -3 and 3 span 0 to 255.\n chip += 3\n chip /= 6\n\n chip = np.clip(chip, 0, 1)\n chip *= 255\n chip = chip.astype(np.uint8)\n\n chip[nodata] = 0\n else:\n raise ValueError(\n 'Need to provide raster_stats for non-uint8 rasters.')\n\n return chip\n", "path": "src/rastervision/core/raster_transformer.py"}]}
1,000
854
gh_patches_debug_11754
rasdani/github-patches
git_diff
svthalia__concrexit-1826
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Add search parameter to event registrations admin api ### Is your feature request related to a problem? Please describe. I'm always frustrated when I can't search for a registration in the event admin. ### Describe the solution you'd like A search parameter (by member.name or name) on `api/v2/admin/events/<id>/registrations/`. ### Motivation Then we can search for registrations. A parameter is desirable for consistency in making pagination available. ### Describe alternatives you've considered Local search. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `website/events/api/v2/admin/views.py` Content: ``` 1 from django.http import Http404 2 from oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope 3 from rest_framework import status 4 from rest_framework.exceptions import ValidationError, PermissionDenied 5 from rest_framework.generics import get_object_or_404 6 from rest_framework.response import Response 7 from rest_framework.views import APIView 8 from rest_framework import filters as framework_filters 9 10 from events import services 11 from events.api.v2.admin import filters 12 from events.api.v2.admin.permissions import IsOrganiser 13 from events.api.v2.admin.serializers.event import ( 14 EventListAdminSerializer, 15 EventAdminSerializer, 16 ) 17 from events.api.v2.admin.serializers.event_registration import ( 18 EventRegistrationAdminSerializer, 19 ) 20 from events.exceptions import RegistrationError 21 from events.models import Event, EventRegistration 22 from thaliawebsite.api.v2.admin.views import ( 23 AdminListAPIView, 24 AdminRetrieveAPIView, 25 AdminCreateAPIView, 26 AdminUpdateAPIView, 27 AdminDestroyAPIView, 28 AdminPermissionsMixin, 29 ) 30 import events.api.v2.filters as normal_filters 31 32 33 class EventAdminListCreateAPIView(AdminListAPIView, AdminCreateAPIView): 34 queryset = Event.objects.prefetch_related("organiser") 35 permission_classes = [IsAuthenticatedOrTokenHasScope] 36 required_scopes = ["events:admin"] 37 filter_backends = [ 38 framework_filters.OrderingFilter, 39 normal_filters.CategoryFilter, 40 normal_filters.OrganiserFilter, 41 normal_filters.EventDateFilter, 42 filters.PublishedFilter, 43 ] 44 ordering_fields = ( 45 "start", 46 "end", 47 "published", 48 "registration_start", 49 "registration_end", 50 ) 51 52 def get_serializer_class(self): 53 if self.request.method.lower() == "post": 54 return EventAdminSerializer 55 return EventListAdminSerializer 56 57 58 class EventAdminDetailAPIView( 59 AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView 60 ): 61 queryset = Event.objects.all() 62 serializer_class = EventAdminSerializer 63 permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope] 64 required_scopes = ["events:admin"] 65 66 67 class EventRegistrationAdminListView(AdminListAPIView, AdminCreateAPIView): 68 """Returns a list of registrations.""" 69 70 serializer_class = EventRegistrationAdminSerializer 71 permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope] 72 required_scopes = ["events:admin"] 73 filter_backends = ( 74 framework_filters.OrderingFilter, 75 filters.EventRegistrationCancelledFilter, 76 ) 77 ordering_fields = ("queue_position", "date", "date_cancelled") 78 79 def get_queryset(self): 80 event = get_object_or_404(Event, pk=self.kwargs.get("pk")) 81 if event: 82 return 
EventRegistration.objects.filter(event_id=event).prefetch_related( 83 "member", "member__profile" 84 ) 85 return EventRegistration.objects.none() 86 87 88 class EventRegistrationAdminDetailView( 89 AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView 90 ): 91 """Returns details of an event registration.""" 92 93 serializer_class = EventRegistrationAdminSerializer 94 queryset = EventRegistration.objects.all() 95 permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope] 96 required_scopes = ["events:admin"] 97 event_lookup_field = "event_id" 98 99 def get_queryset(self): 100 return super().get_queryset().filter(event=self.kwargs["event_id"]) 101 102 103 class EventRegistrationAdminFieldsView(AdminPermissionsMixin, APIView): 104 """Returns details of an event registration.""" 105 106 permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope] 107 required_scopes = ["events:admin"] 108 109 def get_queryset(self): 110 return EventRegistration.objects.filter(event=self.kwargs["event_id"]) 111 112 def get_object(self): 113 event_registration = get_object_or_404( 114 EventRegistration, 115 event=self.kwargs["event_id"], 116 pk=self.kwargs["registration_id"], 117 ) 118 119 if not event_registration.event.has_fields: 120 raise Http404 121 122 return event_registration 123 124 def get(self, request, *args, **kwargs): 125 return Response( 126 data=services.registration_fields(request, registration=self.get_object()), 127 status=status.HTTP_200_OK, 128 ) 129 130 def put(self, request, *args, **kwargs): 131 original = services.registration_fields(request, registration=self.get_object()) 132 required_keys = set(original.keys()) - set(request.data.keys()) 133 if len(required_keys) > 0: 134 raise ValidationError( 135 f"Missing keys '{', '.join(required_keys)}' in request", 136 status.HTTP_400_BAD_REQUEST, 137 ) 138 139 services.update_registration( 140 registration=self.get_object(), field_values=request.data.items() 141 ) 142 143 return Response( 144 data=services.registration_fields(request, registration=self.get_object()), 145 status=status.HTTP_200_OK, 146 ) 147 148 def patch(self, request, *args, **kwargs): 149 services.update_registration( 150 registration=self.get_object(), field_values=request.data.items() 151 ) 152 153 return Response( 154 data=services.registration_fields(request, registration=self.get_object()), 155 status=status.HTTP_200_OK, 156 ) 157 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/website/events/api/v2/admin/views.py b/website/events/api/v2/admin/views.py --- a/website/events/api/v2/admin/views.py +++ b/website/events/api/v2/admin/views.py @@ -72,9 +72,15 @@ required_scopes = ["events:admin"] filter_backends = ( framework_filters.OrderingFilter, + framework_filters.SearchFilter, filters.EventRegistrationCancelledFilter, ) ordering_fields = ("queue_position", "date", "date_cancelled") + search_fields = ( + "member__first_name", + "member__last_name", + "name", + ) def get_queryset(self): event = get_object_or_404(Event, pk=self.kwargs.get("pk"))
{"golden_diff": "diff --git a/website/events/api/v2/admin/views.py b/website/events/api/v2/admin/views.py\n--- a/website/events/api/v2/admin/views.py\n+++ b/website/events/api/v2/admin/views.py\n@@ -72,9 +72,15 @@\n required_scopes = [\"events:admin\"]\n filter_backends = (\n framework_filters.OrderingFilter,\n+ framework_filters.SearchFilter,\n filters.EventRegistrationCancelledFilter,\n )\n ordering_fields = (\"queue_position\", \"date\", \"date_cancelled\")\n+ search_fields = (\n+ \"member__first_name\",\n+ \"member__last_name\",\n+ \"name\",\n+ )\n \n def get_queryset(self):\n event = get_object_or_404(Event, pk=self.kwargs.get(\"pk\"))\n", "issue": "Add search parameter to event registrations admin api\n### Is your feature request related to a problem? Please describe.\r\nI'm always frustrated when I can't search for a registration in the event admin.\r\n\r\n### Describe the solution you'd like\r\nA search parameter (by member.name or name) on `api/v2/admin/events/<id>/registrations/`.\r\n\r\n### Motivation\r\nThen we can search for registrations. A parameter is desirable for consistency in making pagination available.\r\n\r\n### Describe alternatives you've considered\r\nLocal search.\r\n\n", "before_files": [{"content": "from django.http import Http404\nfrom oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope\nfrom rest_framework import status\nfrom rest_framework.exceptions import ValidationError, PermissionDenied\nfrom rest_framework.generics import get_object_or_404\nfrom rest_framework.response import Response\nfrom rest_framework.views import APIView\nfrom rest_framework import filters as framework_filters\n\nfrom events import services\nfrom events.api.v2.admin import filters\nfrom events.api.v2.admin.permissions import IsOrganiser\nfrom events.api.v2.admin.serializers.event import (\n EventListAdminSerializer,\n EventAdminSerializer,\n)\nfrom events.api.v2.admin.serializers.event_registration import (\n EventRegistrationAdminSerializer,\n)\nfrom events.exceptions import RegistrationError\nfrom events.models import Event, EventRegistration\nfrom thaliawebsite.api.v2.admin.views import (\n AdminListAPIView,\n AdminRetrieveAPIView,\n AdminCreateAPIView,\n AdminUpdateAPIView,\n AdminDestroyAPIView,\n AdminPermissionsMixin,\n)\nimport events.api.v2.filters as normal_filters\n\n\nclass EventAdminListCreateAPIView(AdminListAPIView, AdminCreateAPIView):\n queryset = Event.objects.prefetch_related(\"organiser\")\n permission_classes = [IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n filter_backends = [\n framework_filters.OrderingFilter,\n normal_filters.CategoryFilter,\n normal_filters.OrganiserFilter,\n normal_filters.EventDateFilter,\n filters.PublishedFilter,\n ]\n ordering_fields = (\n \"start\",\n \"end\",\n \"published\",\n \"registration_start\",\n \"registration_end\",\n )\n\n def get_serializer_class(self):\n if self.request.method.lower() == \"post\":\n return EventAdminSerializer\n return EventListAdminSerializer\n\n\nclass EventAdminDetailAPIView(\n AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView\n):\n queryset = Event.objects.all()\n serializer_class = EventAdminSerializer\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n\n\nclass EventRegistrationAdminListView(AdminListAPIView, AdminCreateAPIView):\n \"\"\"Returns a list of registrations.\"\"\"\n\n serializer_class = EventRegistrationAdminSerializer\n permission_classes = [IsOrganiser, 
IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n filter_backends = (\n framework_filters.OrderingFilter,\n filters.EventRegistrationCancelledFilter,\n )\n ordering_fields = (\"queue_position\", \"date\", \"date_cancelled\")\n\n def get_queryset(self):\n event = get_object_or_404(Event, pk=self.kwargs.get(\"pk\"))\n if event:\n return EventRegistration.objects.filter(event_id=event).prefetch_related(\n \"member\", \"member__profile\"\n )\n return EventRegistration.objects.none()\n\n\nclass EventRegistrationAdminDetailView(\n AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView\n):\n \"\"\"Returns details of an event registration.\"\"\"\n\n serializer_class = EventRegistrationAdminSerializer\n queryset = EventRegistration.objects.all()\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n event_lookup_field = \"event_id\"\n\n def get_queryset(self):\n return super().get_queryset().filter(event=self.kwargs[\"event_id\"])\n\n\nclass EventRegistrationAdminFieldsView(AdminPermissionsMixin, APIView):\n \"\"\"Returns details of an event registration.\"\"\"\n\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n\n def get_queryset(self):\n return EventRegistration.objects.filter(event=self.kwargs[\"event_id\"])\n\n def get_object(self):\n event_registration = get_object_or_404(\n EventRegistration,\n event=self.kwargs[\"event_id\"],\n pk=self.kwargs[\"registration_id\"],\n )\n\n if not event_registration.event.has_fields:\n raise Http404\n\n return event_registration\n\n def get(self, request, *args, **kwargs):\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n\n def put(self, request, *args, **kwargs):\n original = services.registration_fields(request, registration=self.get_object())\n required_keys = set(original.keys()) - set(request.data.keys())\n if len(required_keys) > 0:\n raise ValidationError(\n f\"Missing keys '{', '.join(required_keys)}' in request\",\n status.HTTP_400_BAD_REQUEST,\n )\n\n services.update_registration(\n registration=self.get_object(), field_values=request.data.items()\n )\n\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n\n def patch(self, request, *args, **kwargs):\n services.update_registration(\n registration=self.get_object(), field_values=request.data.items()\n )\n\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n", "path": "website/events/api/v2/admin/views.py"}], "after_files": [{"content": "from django.http import Http404\nfrom oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope\nfrom rest_framework import status\nfrom rest_framework.exceptions import ValidationError, PermissionDenied\nfrom rest_framework.generics import get_object_or_404\nfrom rest_framework.response import Response\nfrom rest_framework.views import APIView\nfrom rest_framework import filters as framework_filters\n\nfrom events import services\nfrom events.api.v2.admin import filters\nfrom events.api.v2.admin.permissions import IsOrganiser\nfrom events.api.v2.admin.serializers.event import (\n EventListAdminSerializer,\n EventAdminSerializer,\n)\nfrom events.api.v2.admin.serializers.event_registration import (\n EventRegistrationAdminSerializer,\n)\nfrom events.exceptions import 
RegistrationError\nfrom events.models import Event, EventRegistration\nfrom thaliawebsite.api.v2.admin.views import (\n AdminListAPIView,\n AdminRetrieveAPIView,\n AdminCreateAPIView,\n AdminUpdateAPIView,\n AdminDestroyAPIView,\n AdminPermissionsMixin,\n)\nimport events.api.v2.filters as normal_filters\n\n\nclass EventAdminListCreateAPIView(AdminListAPIView, AdminCreateAPIView):\n queryset = Event.objects.prefetch_related(\"organiser\")\n permission_classes = [IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n filter_backends = [\n framework_filters.OrderingFilter,\n normal_filters.CategoryFilter,\n normal_filters.OrganiserFilter,\n normal_filters.EventDateFilter,\n filters.PublishedFilter,\n ]\n ordering_fields = (\n \"start\",\n \"end\",\n \"published\",\n \"registration_start\",\n \"registration_end\",\n )\n\n def get_serializer_class(self):\n if self.request.method.lower() == \"post\":\n return EventAdminSerializer\n return EventListAdminSerializer\n\n\nclass EventAdminDetailAPIView(\n AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView\n):\n queryset = Event.objects.all()\n serializer_class = EventAdminSerializer\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n\n\nclass EventRegistrationAdminListView(AdminListAPIView, AdminCreateAPIView):\n \"\"\"Returns a list of registrations.\"\"\"\n\n serializer_class = EventRegistrationAdminSerializer\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n filter_backends = (\n framework_filters.OrderingFilter,\n framework_filters.SearchFilter,\n filters.EventRegistrationCancelledFilter,\n )\n ordering_fields = (\"queue_position\", \"date\", \"date_cancelled\")\n search_fields = (\n \"member__first_name\",\n \"member__last_name\",\n \"name\",\n )\n\n def get_queryset(self):\n event = get_object_or_404(Event, pk=self.kwargs.get(\"pk\"))\n if event:\n return EventRegistration.objects.filter(event_id=event).prefetch_related(\n \"member\", \"member__profile\"\n )\n return EventRegistration.objects.none()\n\n\nclass EventRegistrationAdminDetailView(\n AdminRetrieveAPIView, AdminUpdateAPIView, AdminDestroyAPIView\n):\n \"\"\"Returns details of an event registration.\"\"\"\n\n serializer_class = EventRegistrationAdminSerializer\n queryset = EventRegistration.objects.all()\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n event_lookup_field = \"event_id\"\n\n def get_queryset(self):\n return super().get_queryset().filter(event=self.kwargs[\"event_id\"])\n\n\nclass EventRegistrationAdminFieldsView(AdminPermissionsMixin, APIView):\n \"\"\"Returns details of an event registration.\"\"\"\n\n permission_classes = [IsOrganiser, IsAuthenticatedOrTokenHasScope]\n required_scopes = [\"events:admin\"]\n\n def get_queryset(self):\n return EventRegistration.objects.filter(event=self.kwargs[\"event_id\"])\n\n def get_object(self):\n event_registration = get_object_or_404(\n EventRegistration,\n event=self.kwargs[\"event_id\"],\n pk=self.kwargs[\"registration_id\"],\n )\n\n if not event_registration.event.has_fields:\n raise Http404\n\n return event_registration\n\n def get(self, request, *args, **kwargs):\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n\n def put(self, request, *args, **kwargs):\n original = services.registration_fields(request, registration=self.get_object())\n 
required_keys = set(original.keys()) - set(request.data.keys())\n if len(required_keys) > 0:\n raise ValidationError(\n f\"Missing keys '{', '.join(required_keys)}' in request\",\n status.HTTP_400_BAD_REQUEST,\n )\n\n services.update_registration(\n registration=self.get_object(), field_values=request.data.items()\n )\n\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n\n def patch(self, request, *args, **kwargs):\n services.update_registration(\n registration=self.get_object(), field_values=request.data.items()\n )\n\n return Response(\n data=services.registration_fields(request, registration=self.get_object()),\n status=status.HTTP_200_OK,\n )\n", "path": "website/events/api/v2/admin/views.py"}]}
1805
172
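For the concrexit entry above, the golden diff works by registering DRF's built-in `SearchFilter`, which turns a `?search=<term>` query parameter into case-insensitive lookups over `search_fields`. A minimal standalone sketch of that mechanism follows; the serializer is a hypothetical stand-in, and a configured Django project with concrexit's `events` app importable is assumed.

```python
from rest_framework import filters, generics, serializers

from events.models import EventRegistration


class RegistrationSerializer(serializers.ModelSerializer):
    class Meta:
        model = EventRegistration
        fields = ("pk", "name")


class RegistrationSearchView(generics.ListAPIView):
    serializer_class = RegistrationSerializer
    queryset = EventRegistration.objects.all()
    filter_backends = (filters.SearchFilter,)
    # Same fields as the golden diff: the double underscore traverses the
    # member relation, while "name" matches registrations stored with a
    # plain name field instead of a member.
    search_fields = ("member__first_name", "member__last_name", "name")


# A request such as GET .../registrations/?search=jane now returns only
# registrations whose member name or stored name contains "jane".
```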
gh_patches_debug_140
rasdani/github-patches
git_diff
d2l-ai__d2l-en-2078
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [MXNet] matplotlib >=3.5 raises TypeError with ax.plot_wireframe in MXNet ndarray With the latest version of matplotlib, multiple notebooks fail with a type error in mxnet (mxnet==1.7.0 & CUDA 10.2). Some of the affected sections include [optimization intro](https://d2l.ai/chapter_optimization/optimization-intro.html), [integral calculus](https://d2l.ai/chapter_appendix-mathematics-for-deep-learning/integral-calculus.html), [multivariable calculus](https://d2l.ai/chapter_appendix-mathematics-for-deep-learning/multivariable-calculus.html) etc. ``` TypeError: no implementation found for 'numpy.column_stack' on types that implement __array_function__: [<class 'mxnet.numpy.ndarray'>, <class 'numpy.ndarray'>] ``` Please see attached traceback and reproduction instructions below. Steps to reproduce the issue. 1. Setup the d2l environment (using `static/build.yml`) 2. While setting up the environment, it will automatically install the latest version of matplotlib (i.e. `matplotlib==3.5.1` as of today). Run one of the notebooks which is affected (mentioned above) <details> <summary>Click to expand: Error Traceback</summary> ``` --------------------------------------------------------------------------- TypeError Traceback (most recent call last) Input In [7], in <module> 9 # Plot function 10 ax = d2l.plt.figure().add_subplot(111, projection='3d') ---> 11 ax.plot_wireframe(x, y, z, **{'rstride': 10, 'cstride': 10}) 12 ax.plot_wireframe(x, y, w, **{'rstride': 10, 'cstride': 10}, color='purple') 13 d2l.plt.xlabel('x') File ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/matplotlib/_api/deprecation.py:412, in delete_parameter.<locals>.wrapper(*inner_args, **inner_kwargs) 402 deprecation_addendum = ( 403 f"If any parameter follows {name!r}, they should be passed as " 404 f"keyword, not positionally.") 405 warn_deprecated( 406 since, 407 name=repr(name), (...) 410 else deprecation_addendum, 411 **kwargs) --> 412 return func(*inner_args, **inner_kwargs) File ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/mpl_toolkits/mplot3d/axes3d.py:1908, in Axes3D.plot_wireframe(self, X, Y, Z, *args, **kwargs) 1906 linec = art3d.Line3DCollection(lines, *args, **kwargs) 1907 self.add_collection(linec) -> 1908 self.auto_scale_xyz(X, Y, Z, had_data) 1910 return linec File ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/mpl_toolkits/mplot3d/axes3d.py:658, in Axes3D.auto_scale_xyz(self, X, Y, Z, had_data) 656 self.xy_dataLim.update_from_data_y(Y, not had_data) 657 if Z is not None: --> 658 self.zz_dataLim.update_from_data_x(Z, not had_data) 659 # Let autoscale_view figure out how to use this data. 660 self.autoscale_view() File ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/matplotlib/transforms.py:922, in Bbox.update_from_data_x(self, x, ignore) 906 """ 907 Update the x-bounds of the `Bbox` based on the passed in data. After 908 updating, the bounds will have positive *width*, and *x0* will be the (...) 919 - When ``None``, use the last value passed to :meth:`ignore`. 
920 """ 921 x = np.ravel(x) --> 922 self.update_from_data_xy(np.column_stack([x, np.ones(x.size)]), 923 ignore=ignore, updatey=False) File <__array_function__ internals>:180, in column_stack(*args, **kwargs) TypeError: no implementation found for 'numpy.column_stack' on types that implement __array_function__: [<class 'mxnet.numpy.ndarray'>, <class 'numpy.ndarray'>] ``` </details> This is another issue validating the need of #2044. A simple solution for now is to pin the matplotlib version to 1.4. I'll send a PR for this. cc @astonzhang --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 from setuptools import setup, find_packages 2 import d2l 3 4 requirements = [ 5 'jupyter', 6 'numpy', 7 'matplotlib==3.4', 8 'requests', 9 'pandas', 10 'gym' 11 ] 12 13 setup( 14 name='d2l', 15 version=d2l.__version__, 16 python_requires='>=3.5', 17 author='D2L Developers', 18 author_email='[email protected]', 19 url='https://d2l.ai', 20 description='Dive into Deep Learning', 21 license='MIT-0', 22 packages=find_packages(), 23 zip_safe=True, 24 install_requires=requirements, 25 ) 26 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -4,7 +4,7 @@ requirements = [ 'jupyter', 'numpy', - 'matplotlib==3.4', + 'matplotlib', 'requests', 'pandas', 'gym'
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -4,7 +4,7 @@\n requirements = [\n 'jupyter',\n 'numpy',\n- 'matplotlib==3.4',\n+ 'matplotlib',\n 'requests',\n 'pandas',\n 'gym'\n", "issue": "[MXNet] matplotlib >=3.5 raises TypeError with ax.plot_wireframe in MXNet ndarray\nWith the latest version of matplotlib, multiple notebooks fail with a type error in mxnet (mxnet==1.7.0 & CUDA 10.2). Some of the affected sections include [optimization intro](https://d2l.ai/chapter_optimization/optimization-intro.html), [integral calculus](https://d2l.ai/chapter_appendix-mathematics-for-deep-learning/integral-calculus.html), [multivariable calculus](https://d2l.ai/chapter_appendix-mathematics-for-deep-learning/multivariable-calculus.html) etc.\r\n\r\n```\r\nTypeError: no implementation found for 'numpy.column_stack' on types that implement __array_function__: [<class 'mxnet.numpy.ndarray'>, <class 'numpy.ndarray'>]\r\n```\r\n\r\nPlease see attached traceback and reproduction instructions below.\r\n\r\nSteps to reproduce the issue.\r\n\r\n1. Setup the d2l environment (using `static/build.yml`)\r\n2. While setting up the environment, it will automatically install the latest version of matplotlib (i.e. `matplotlib==3.5.1` as of today). \r\n\r\nRun one of the notebooks which is affected (mentioned above) \r\n\r\n<details>\r\n <summary>Click to expand: Error Traceback</summary>\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\nInput In [7], in <module>\r\n 9 # Plot function\r\n 10 ax = d2l.plt.figure().add_subplot(111, projection='3d')\r\n---> 11 ax.plot_wireframe(x, y, z, **{'rstride': 10, 'cstride': 10})\r\n 12 ax.plot_wireframe(x, y, w, **{'rstride': 10, 'cstride': 10}, color='purple')\r\n 13 d2l.plt.xlabel('x')\r\n\r\nFile ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/matplotlib/_api/deprecation.py:412, in delete_parameter.<locals>.wrapper(*inner_args, **inner_kwargs)\r\n 402 deprecation_addendum = (\r\n 403 f\"If any parameter follows {name!r}, they should be passed as \"\r\n 404 f\"keyword, not positionally.\")\r\n 405 warn_deprecated(\r\n 406 since,\r\n 407 name=repr(name),\r\n (...)\r\n 410 else deprecation_addendum,\r\n 411 **kwargs)\r\n--> 412 return func(*inner_args, **inner_kwargs)\r\n\r\nFile ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/mpl_toolkits/mplot3d/axes3d.py:1908, in Axes3D.plot_wireframe(self, X, Y, Z, *args, **kwargs)\r\n 1906 linec = art3d.Line3DCollection(lines, *args, **kwargs)\r\n 1907 self.add_collection(linec)\r\n-> 1908 self.auto_scale_xyz(X, Y, Z, had_data)\r\n 1910 return linec\r\n\r\nFile ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/mpl_toolkits/mplot3d/axes3d.py:658, in Axes3D.auto_scale_xyz(self, X, Y, Z, had_data)\r\n 656 self.xy_dataLim.update_from_data_y(Y, not had_data)\r\n 657 if Z is not None:\r\n--> 658 self.zz_dataLim.update_from_data_x(Z, not had_data)\r\n 659 # Let autoscale_view figure out how to use this data.\r\n 660 self.autoscale_view()\r\n\r\nFile ~/miniconda3/envs/mpl_d2l/lib/python3.8/site-packages/matplotlib/transforms.py:922, in Bbox.update_from_data_x(self, x, ignore)\r\n 906 \"\"\"\r\n 907 Update the x-bounds of the `Bbox` based on the passed in data. 
After\r\n 908 updating, the bounds will have positive *width*, and *x0* will be the\r\n (...)\r\n 919 - When ``None``, use the last value passed to :meth:`ignore`.\r\n 920 \"\"\"\r\n 921 x = np.ravel(x)\r\n--> 922 self.update_from_data_xy(np.column_stack([x, np.ones(x.size)]),\r\n 923 ignore=ignore, updatey=False)\r\n\r\nFile <__array_function__ internals>:180, in column_stack(*args, **kwargs)\r\n\r\nTypeError: no implementation found for 'numpy.column_stack' on types that implement __array_function__: [<class 'mxnet.numpy.ndarray'>, <class 'numpy.ndarray'>]\r\n```\r\n\r\n</details>\r\n\r\nThis is another issue validating the need of #2044.\r\n\r\nA simple solution for now is to pin the matplotlib version to 1.4. I'll send a PR for this.\r\n\r\ncc @astonzhang \n", "before_files": [{"content": "from setuptools import setup, find_packages\nimport d2l\n\nrequirements = [\n 'jupyter',\n 'numpy',\n 'matplotlib==3.4',\n 'requests',\n 'pandas',\n 'gym'\n]\n\nsetup(\n name='d2l',\n version=d2l.__version__,\n python_requires='>=3.5',\n author='D2L Developers',\n author_email='[email protected]',\n url='https://d2l.ai',\n description='Dive into Deep Learning',\n license='MIT-0',\n packages=find_packages(),\n zip_safe=True,\n install_requires=requirements,\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\nimport d2l\n\nrequirements = [\n 'jupyter',\n 'numpy',\n 'matplotlib',\n 'requests',\n 'pandas',\n 'gym'\n]\n\nsetup(\n name='d2l',\n version=d2l.__version__,\n python_requires='>=3.5',\n author='D2L Developers',\n author_email='[email protected]',\n url='https://d2l.ai',\n description='Dive into Deep Learning',\n license='MIT-0',\n packages=find_packages(),\n zip_safe=True,\n install_requires=requirements,\n)\n", "path": "setup.py"}]}
1558
70
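For the d2l entry above, the merged fix simply drops the `matplotlib==3.4` pin. The traceback itself points at NumPy's `__array_function__` dispatch refusing to mix `mxnet.numpy.ndarray` with `numpy.ndarray` inside `np.column_stack`. An alternative call-site workaround (not the merged change) is to hand matplotlib plain NumPy arrays, sketched below assuming mxnet 1.7's `mxnet.np` namespace.

```python
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  registers "3d" on older matplotlib
import matplotlib.pyplot as plt
from mxnet import np as mnp

x, y = mnp.meshgrid(mnp.linspace(-1, 1, 101), mnp.linspace(-1, 1, 101))
z = x ** 2 + y ** 2

ax = plt.figure().add_subplot(111, projection="3d")
# .asnumpy() copies each mxnet array into an ordinary numpy.ndarray, so
# matplotlib's internal np.column_stack sees only one array type and the
# __array_function__ dispatch error never triggers.
ax.plot_wireframe(x.asnumpy(), y.asnumpy(), z.asnumpy(),
                  rstride=10, cstride=10)
plt.show()
```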
gh_patches_debug_34829
rasdani/github-patches
git_diff
liqd__adhocracy4-168
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- improve accessibility of image upload * add alt attribute with the filename * if there is no image uploaded the image tag should not be there * the label's `for` attribute doesn't reference the file input's id. The first part causes the HTML to be invalid, which is part of the BITV Test "4.1.1a Valides HTML". The third part is part of the BITV Test "3.3.2a Formularfelder richtig beschriftet". --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `adhocracy4/images/widgets.py` Content: ``` 1 from os.path import basename 2 3 from django.contrib.staticfiles.storage import staticfiles_storage 4 from django.forms import widgets 5 from django.template import loader 6 from django.utils.html import conditional_escape 7 from django.utils.translation import ugettext 8 9 10 class ImageInputWidget(widgets.ClearableFileInput): 11 12 """ 13 A project-specific improved version of the clearable file upload. 14 15 Allows to upload and delete uploaded files. It doesn't passing attributes 16 using the positional `attrs` argument and hard codes css files. 17 """ 18 class Media: 19 js = (staticfiles_storage.url('a4images/imageUploader.js'),) 20 21 def render(self, name, value, attrs=None): 22 23 has_image_set = self.is_initial(value) 24 is_required = self.is_required 25 26 file_placeholder = ugettext('Select a picture from your local folder.') 27 file_input = super().render(name, None, { 28 'id': name, 29 'class': 'form-control form-control-file' 30 }) 31 32 if has_image_set: 33 file_name = basename(value.name) 34 file_url = conditional_escape(value.url) 35 else: 36 file_name = "" 37 file_url = "" 38 39 text_input = widgets.TextInput().render('__noname__', file_name, { 40 'class': 'form-control form-control-file-dummy', 41 'placeholder': file_placeholder, 42 'tabindex': '-1' 43 }) 44 45 checkbox_id = self.clear_checkbox_id(name) 46 checkbox_name = self.clear_checkbox_name(name) 47 checkbox_input = widgets.CheckboxInput().render(checkbox_name, False, { 48 'id': checkbox_id, 49 'class': 'clear-image', 50 'data-upload-clear': name, 51 }) 52 53 context = { 54 'name': name, 55 'has_image_set': has_image_set, 56 'is_required': is_required, 57 'file_url': file_url, 58 'file_input': file_input, 59 'file_id': name + '-file', 60 'text_input': text_input, 61 'checkbox_input': checkbox_input, 62 'checkbox_id': checkbox_id 63 } 64 65 return loader.render_to_string( 66 'a4images/image_upload_widget.html', 67 context 68 ) 69 70 def value_from_datadict(self, data, files, name): 71 """ 72 Modify value_from_datadict, so that delete takes precedence over 73 upload. 74 """ 75 file_value = super(widgets.ClearableFileInput, self)\ 76 .value_from_datadict(data, files, name) 77 checkbox_value = widgets.CheckboxInput()\ 78 .value_from_datadict(data, files, self.clear_checkbox_name(name)) 79 if not self.is_required and checkbox_value: 80 return False 81 return file_value 82 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/adhocracy4/images/widgets.py b/adhocracy4/images/widgets.py --- a/adhocracy4/images/widgets.py +++ b/adhocracy4/images/widgets.py @@ -19,13 +19,13 @@ js = (staticfiles_storage.url('a4images/imageUploader.js'),) def render(self, name, value, attrs=None): - + html_id = attrs and attrs.get('id', name) or name has_image_set = self.is_initial(value) is_required = self.is_required file_placeholder = ugettext('Select a picture from your local folder.') file_input = super().render(name, None, { - 'id': name, + 'id': html_id, 'class': 'form-control form-control-file' }) @@ -39,7 +39,8 @@ text_input = widgets.TextInput().render('__noname__', file_name, { 'class': 'form-control form-control-file-dummy', 'placeholder': file_placeholder, - 'tabindex': '-1' + 'tabindex': '-1', + 'id': 'text-{}'.format(html_id) }) checkbox_id = self.clear_checkbox_id(name) @@ -47,16 +48,16 @@ checkbox_input = widgets.CheckboxInput().render(checkbox_name, False, { 'id': checkbox_id, 'class': 'clear-image', - 'data-upload-clear': name, + 'data-upload-clear': html_id, }) context = { - 'name': name, + 'id': html_id, 'has_image_set': has_image_set, 'is_required': is_required, 'file_url': file_url, 'file_input': file_input, - 'file_id': name + '-file', + 'file_id': html_id + '-file', 'text_input': text_input, 'checkbox_input': checkbox_input, 'checkbox_id': checkbox_id
{"golden_diff": "diff --git a/adhocracy4/images/widgets.py b/adhocracy4/images/widgets.py\n--- a/adhocracy4/images/widgets.py\n+++ b/adhocracy4/images/widgets.py\n@@ -19,13 +19,13 @@\n js = (staticfiles_storage.url('a4images/imageUploader.js'),)\n \n def render(self, name, value, attrs=None):\n-\n+ html_id = attrs and attrs.get('id', name) or name\n has_image_set = self.is_initial(value)\n is_required = self.is_required\n \n file_placeholder = ugettext('Select a picture from your local folder.')\n file_input = super().render(name, None, {\n- 'id': name,\n+ 'id': html_id,\n 'class': 'form-control form-control-file'\n })\n \n@@ -39,7 +39,8 @@\n text_input = widgets.TextInput().render('__noname__', file_name, {\n 'class': 'form-control form-control-file-dummy',\n 'placeholder': file_placeholder,\n- 'tabindex': '-1'\n+ 'tabindex': '-1',\n+ 'id': 'text-{}'.format(html_id)\n })\n \n checkbox_id = self.clear_checkbox_id(name)\n@@ -47,16 +48,16 @@\n checkbox_input = widgets.CheckboxInput().render(checkbox_name, False, {\n 'id': checkbox_id,\n 'class': 'clear-image',\n- 'data-upload-clear': name,\n+ 'data-upload-clear': html_id,\n })\n \n context = {\n- 'name': name,\n+ 'id': html_id,\n 'has_image_set': has_image_set,\n 'is_required': is_required,\n 'file_url': file_url,\n 'file_input': file_input,\n- 'file_id': name + '-file',\n+ 'file_id': html_id + '-file',\n 'text_input': text_input,\n 'checkbox_input': checkbox_input,\n 'checkbox_id': checkbox_id\n", "issue": "improve accessibility of image upload\n* add alt attribute with the filename\r\n* if there is no image uploaded the image tag should not be there\r\n* the label's `for` attribute doesn't reference the file input's id.\r\n\r\nThe first part causes the HTML to be invalid, which is part of the BITV Test \"4.1.1a Valides HTML\".\r\nThe third part is part of the BITV Test \"3.3.2a Formularfelder richtig beschriftet\".\n", "before_files": [{"content": "from os.path import basename\n\nfrom django.contrib.staticfiles.storage import staticfiles_storage\nfrom django.forms import widgets\nfrom django.template import loader\nfrom django.utils.html import conditional_escape\nfrom django.utils.translation import ugettext\n\n\nclass ImageInputWidget(widgets.ClearableFileInput):\n\n \"\"\"\n A project-specific improved version of the clearable file upload.\n\n Allows to upload and delete uploaded files. 
It doesn't passing attributes\n using the positional `attrs` argument and hard codes css files.\n \"\"\"\n class Media:\n js = (staticfiles_storage.url('a4images/imageUploader.js'),)\n\n def render(self, name, value, attrs=None):\n\n has_image_set = self.is_initial(value)\n is_required = self.is_required\n\n file_placeholder = ugettext('Select a picture from your local folder.')\n file_input = super().render(name, None, {\n 'id': name,\n 'class': 'form-control form-control-file'\n })\n\n if has_image_set:\n file_name = basename(value.name)\n file_url = conditional_escape(value.url)\n else:\n file_name = \"\"\n file_url = \"\"\n\n text_input = widgets.TextInput().render('__noname__', file_name, {\n 'class': 'form-control form-control-file-dummy',\n 'placeholder': file_placeholder,\n 'tabindex': '-1'\n })\n\n checkbox_id = self.clear_checkbox_id(name)\n checkbox_name = self.clear_checkbox_name(name)\n checkbox_input = widgets.CheckboxInput().render(checkbox_name, False, {\n 'id': checkbox_id,\n 'class': 'clear-image',\n 'data-upload-clear': name,\n })\n\n context = {\n 'name': name,\n 'has_image_set': has_image_set,\n 'is_required': is_required,\n 'file_url': file_url,\n 'file_input': file_input,\n 'file_id': name + '-file',\n 'text_input': text_input,\n 'checkbox_input': checkbox_input,\n 'checkbox_id': checkbox_id\n }\n\n return loader.render_to_string(\n 'a4images/image_upload_widget.html',\n context\n )\n\n def value_from_datadict(self, data, files, name):\n \"\"\"\n Modify value_from_datadict, so that delete takes precedence over\n upload.\n \"\"\"\n file_value = super(widgets.ClearableFileInput, self)\\\n .value_from_datadict(data, files, name)\n checkbox_value = widgets.CheckboxInput()\\\n .value_from_datadict(data, files, self.clear_checkbox_name(name))\n if not self.is_required and checkbox_value:\n return False\n return file_value\n", "path": "adhocracy4/images/widgets.py"}], "after_files": [{"content": "from os.path import basename\n\nfrom django.contrib.staticfiles.storage import staticfiles_storage\nfrom django.forms import widgets\nfrom django.template import loader\nfrom django.utils.html import conditional_escape\nfrom django.utils.translation import ugettext\n\n\nclass ImageInputWidget(widgets.ClearableFileInput):\n\n \"\"\"\n A project-specific improved version of the clearable file upload.\n\n Allows to upload and delete uploaded files. 
It doesn't passing attributes\n using the positional `attrs` argument and hard codes css files.\n \"\"\"\n class Media:\n js = (staticfiles_storage.url('a4images/imageUploader.js'),)\n\n def render(self, name, value, attrs=None):\n html_id = attrs and attrs.get('id', name) or name\n has_image_set = self.is_initial(value)\n is_required = self.is_required\n\n file_placeholder = ugettext('Select a picture from your local folder.')\n file_input = super().render(name, None, {\n 'id': html_id,\n 'class': 'form-control form-control-file'\n })\n\n if has_image_set:\n file_name = basename(value.name)\n file_url = conditional_escape(value.url)\n else:\n file_name = \"\"\n file_url = \"\"\n\n text_input = widgets.TextInput().render('__noname__', file_name, {\n 'class': 'form-control form-control-file-dummy',\n 'placeholder': file_placeholder,\n 'tabindex': '-1',\n 'id': 'text-{}'.format(html_id)\n })\n\n checkbox_id = self.clear_checkbox_id(name)\n checkbox_name = self.clear_checkbox_name(name)\n checkbox_input = widgets.CheckboxInput().render(checkbox_name, False, {\n 'id': checkbox_id,\n 'class': 'clear-image',\n 'data-upload-clear': html_id,\n })\n\n context = {\n 'id': html_id,\n 'has_image_set': has_image_set,\n 'is_required': is_required,\n 'file_url': file_url,\n 'file_input': file_input,\n 'file_id': html_id + '-file',\n 'text_input': text_input,\n 'checkbox_input': checkbox_input,\n 'checkbox_id': checkbox_id\n }\n\n return loader.render_to_string(\n 'a4images/image_upload_widget.html',\n context\n )\n\n def value_from_datadict(self, data, files, name):\n \"\"\"\n Modify value_from_datadict, so that delete takes precedence over\n upload.\n \"\"\"\n file_value = super(widgets.ClearableFileInput, self)\\\n .value_from_datadict(data, files, name)\n checkbox_value = widgets.CheckboxInput()\\\n .value_from_datadict(data, files, self.clear_checkbox_name(name))\n if not self.is_required and checkbox_value:\n return False\n return file_value\n", "path": "adhocracy4/images/widgets.py"}]}
1087
436
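For the adhocracy4 entry above, the heart of the golden diff is reusing `attrs["id"]` instead of hard-coding `name` as the file input's id. The reason this fixes the label association is that Django computes the `<label for=...>` target from the id it passes into the widget, as the sketch below shows. The form and field names are hypothetical, the settings are a bare minimum for the demo, and an installed adhocracy4 package is assumed.

```python
import django
from django.conf import settings

settings.configure(STATIC_URL="/static/")  # minimal settings for the demo
django.setup()

from django import forms
from adhocracy4.images.widgets import ImageInputWidget


class PictureForm(forms.Form):
    picture = forms.ImageField(required=False, widget=ImageInputWidget())


form = PictureForm(auto_id="id_%s")  # Django's default auto_id pattern
bound = form["picture"]

# Django derives the label target from auto_id and passes the same value
# to the widget as attrs={"id": "id_picture"}. Before the patch the file
# input ignored that and rendered id="picture", so the label's for=
# attribute pointed at nothing; after the patch the ids line up.
assert bound.id_for_label == "id_picture"
```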
gh_patches_debug_15533
rasdani/github-patches
git_diff
voxel51__fiftyone-1660
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [BUG] Support Fortran ordered masks in the App Currently fortran ordered masks are flipped. ```py import fiftyone as fo import fiftyone.zoo as foz import numpy as np dataset = foz.load_zoo_dataset("quickstart", max_samples=1).select_fields().clone() sample = dataset.first() contiguous = np.asarray([[True, False], [True, False]]) sample["contiguous"] = fo.Segmentation(mask=contiguous) sample["fortran"] = fo.Segmentation(mask=np.asfortranarray(contiguous)) sample.save() session = fo.Session(dataset) ``` <img width="1792" alt="flipped" src="https://user-images.githubusercontent.com/19821840/159953546-5eef71bc-d111-4667-a271-6c4e34e1b7da.png"> --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `fiftyone/server/json_util.py` Content: ``` 1 """ 2 FiftyOne server json utilies. 3 4 | Copyright 2017-2022, Voxel51, Inc. 5 | `voxel51.com <https://voxel51.com/>`_ 6 | 7 """ 8 from bson import ObjectId, json_util 9 from collections import OrderedDict 10 from datetime import date, datetime 11 from json import JSONEncoder 12 import math 13 14 from fiftyone.core.sample import Sample, SampleView 15 from fiftyone.core.stages import ViewStage 16 import fiftyone.core.utils as fou 17 18 19 _MASK_CLASSES = {"Detection", "Heatmap", "Segmentation"} 20 21 22 def _handle_bytes(o): 23 for k, v in o.items(): 24 if isinstance(v, bytes): 25 o[k] = str(fou.deserialize_numpy_array(v).shape) 26 elif isinstance(v, dict): 27 o[k] = _handle_bytes(v) 28 29 return o 30 31 32 def _handle_numpy_array(raw, _cls=None): 33 if _cls not in _MASK_CLASSES: 34 return str(fou.deserialize_numpy_array(raw).shape) 35 36 return fou.serialize_numpy_array( 37 fou.deserialize_numpy_array(raw), ascii=True 38 ) 39 40 41 def _handle_date(dt): 42 return { 43 "_cls": "DateTime", 44 "datetime": fou.datetime_to_timestamp(dt), 45 } 46 47 48 def _is_invalid_number(value): 49 if not isinstance(value, float): 50 return False 51 52 return math.isnan(value) or math.isinf(value) 53 54 55 def convert(d): 56 if isinstance(d, (dict, OrderedDict)): 57 for k, v in d.items(): 58 if isinstance(v, bytes): 59 d[k] = _handle_numpy_array(v, d.get("_cls", None)) 60 elif isinstance(v, (date, datetime)): 61 d[k] = _handle_date(v) 62 elif isinstance(v, ObjectId): 63 d[k] = str(v) 64 elif isinstance(v, (dict, OrderedDict, list)): 65 convert(v) 66 elif _is_invalid_number(v): 67 d[k] = str(v) 68 69 if isinstance(d, list): 70 for idx, i in enumerate(d): 71 if isinstance(i, tuple): 72 d[idx] = list(i) 73 i = d[idx] 74 75 if isinstance(i, bytes): 76 d[idx] = _handle_numpy_array(i) 77 elif isinstance(i, (date, datetime)): 78 d[idx] = _handle_date(i) 79 elif isinstance(i, ObjectId): 80 d[idx] = str(i) 81 elif isinstance(i, (dict, OrderedDict, list)): 82 convert(i) 83 elif _is_invalid_number(i): 84 d[idx] = str(i) 85 86 87 class FiftyOneJSONEncoder(JSONEncoder): 88 """JSON encoder for the FiftyOne server. 89 90 Any classes with non-standard serialization methods should 91 be accounted for in the `default()` method. 
92 """ 93 94 def default(self, o): # pylint: disable=E0202 95 """Returns the serialized representation of the objects 96 97 Args: 98 o: the object 99 100 Returns: 101 str 102 """ 103 if isinstance(o, (Sample, SampleView)): 104 return _handle_bytes(o.to_mongo_dict(include_id=True)) 105 if issubclass(type(o), ViewStage): 106 return o._serialize() 107 if isinstance(o, ObjectId): 108 return str(o) 109 if isinstance(o, float): 110 return json_util.dumps(o) 111 return super().default(o) 112 113 @staticmethod 114 def dumps(*args, **kwargs): 115 """Defined for overriding the default SocketIO `json` interface""" 116 kwargs["cls"] = FiftyOneJSONEncoder 117 return json_util.dumps( 118 json_util.loads( 119 json_util.dumps(*args, **kwargs), parse_constant=lambda c: c 120 ), 121 **kwargs 122 ) 123 124 @staticmethod 125 def loads(*args, **kwargs): 126 """Defined for overriding the default SocketIO `json` interface""" 127 return json_util.loads(*args, **kwargs) 128 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/fiftyone/server/json_util.py b/fiftyone/server/json_util.py --- a/fiftyone/server/json_util.py +++ b/fiftyone/server/json_util.py @@ -10,6 +10,7 @@ from datetime import date, datetime from json import JSONEncoder import math +import numpy as np from fiftyone.core.sample import Sample, SampleView from fiftyone.core.stages import ViewStage @@ -33,9 +34,12 @@ if _cls not in _MASK_CLASSES: return str(fou.deserialize_numpy_array(raw).shape) - return fou.serialize_numpy_array( - fou.deserialize_numpy_array(raw), ascii=True - ) + array = fou.deserialize_numpy_array(raw) + + if np.isfortran(array): + array = np.ascontiguousarray(array) + + return fou.serialize_numpy_array(array, ascii=True) def _handle_date(dt):
{"golden_diff": "diff --git a/fiftyone/server/json_util.py b/fiftyone/server/json_util.py\n--- a/fiftyone/server/json_util.py\n+++ b/fiftyone/server/json_util.py\n@@ -10,6 +10,7 @@\n from datetime import date, datetime\n from json import JSONEncoder\n import math\n+import numpy as np\n \n from fiftyone.core.sample import Sample, SampleView\n from fiftyone.core.stages import ViewStage\n@@ -33,9 +34,12 @@\n if _cls not in _MASK_CLASSES:\n return str(fou.deserialize_numpy_array(raw).shape)\n \n- return fou.serialize_numpy_array(\n- fou.deserialize_numpy_array(raw), ascii=True\n- )\n+ array = fou.deserialize_numpy_array(raw)\n+\n+ if np.isfortran(array):\n+ array = np.ascontiguousarray(array)\n+\n+ return fou.serialize_numpy_array(array, ascii=True)\n \n \n def _handle_date(dt):\n", "issue": "[BUG] Support Fortran ordered masks in the App\nCurrently fortran ordered masks are flipped.\r\n\r\n```py\r\nimport fiftyone as fo\r\nimport fiftyone.zoo as foz\r\nimport numpy as np\r\n\r\ndataset = foz.load_zoo_dataset(\"quickstart\", max_samples=1).select_fields().clone()\r\nsample = dataset.first()\r\n\r\ncontiguous = np.asarray([[True, False], [True, False]])\r\nsample[\"contiguous\"] = fo.Segmentation(mask=contiguous)\r\nsample[\"fortran\"] = fo.Segmentation(mask=np.asfortranarray(contiguous))\r\nsample.save()\r\n\r\nsession = fo.Session(dataset)\r\n```\r\n<img width=\"1792\" alt=\"flipped\" src=\"https://user-images.githubusercontent.com/19821840/159953546-5eef71bc-d111-4667-a271-6c4e34e1b7da.png\">\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nFiftyOne server json utilies.\n\n| Copyright 2017-2022, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nfrom bson import ObjectId, json_util\nfrom collections import OrderedDict\nfrom datetime import date, datetime\nfrom json import JSONEncoder\nimport math\n\nfrom fiftyone.core.sample import Sample, SampleView\nfrom fiftyone.core.stages import ViewStage\nimport fiftyone.core.utils as fou\n\n\n_MASK_CLASSES = {\"Detection\", \"Heatmap\", \"Segmentation\"}\n\n\ndef _handle_bytes(o):\n for k, v in o.items():\n if isinstance(v, bytes):\n o[k] = str(fou.deserialize_numpy_array(v).shape)\n elif isinstance(v, dict):\n o[k] = _handle_bytes(v)\n\n return o\n\n\ndef _handle_numpy_array(raw, _cls=None):\n if _cls not in _MASK_CLASSES:\n return str(fou.deserialize_numpy_array(raw).shape)\n\n return fou.serialize_numpy_array(\n fou.deserialize_numpy_array(raw), ascii=True\n )\n\n\ndef _handle_date(dt):\n return {\n \"_cls\": \"DateTime\",\n \"datetime\": fou.datetime_to_timestamp(dt),\n }\n\n\ndef _is_invalid_number(value):\n if not isinstance(value, float):\n return False\n\n return math.isnan(value) or math.isinf(value)\n\n\ndef convert(d):\n if isinstance(d, (dict, OrderedDict)):\n for k, v in d.items():\n if isinstance(v, bytes):\n d[k] = _handle_numpy_array(v, d.get(\"_cls\", None))\n elif isinstance(v, (date, datetime)):\n d[k] = _handle_date(v)\n elif isinstance(v, ObjectId):\n d[k] = str(v)\n elif isinstance(v, (dict, OrderedDict, list)):\n convert(v)\n elif _is_invalid_number(v):\n d[k] = str(v)\n\n if isinstance(d, list):\n for idx, i in enumerate(d):\n if isinstance(i, tuple):\n d[idx] = list(i)\n i = d[idx]\n\n if isinstance(i, bytes):\n d[idx] = _handle_numpy_array(i)\n elif isinstance(i, (date, datetime)):\n d[idx] = _handle_date(i)\n elif isinstance(i, ObjectId):\n d[idx] = str(i)\n elif isinstance(i, (dict, OrderedDict, list)):\n convert(i)\n elif _is_invalid_number(i):\n d[idx] = str(i)\n\n\nclass 
FiftyOneJSONEncoder(JSONEncoder):\n \"\"\"JSON encoder for the FiftyOne server.\n\n Any classes with non-standard serialization methods should\n be accounted for in the `default()` method.\n \"\"\"\n\n def default(self, o): # pylint: disable=E0202\n \"\"\"Returns the serialized representation of the objects\n\n Args:\n o: the object\n\n Returns:\n str\n \"\"\"\n if isinstance(o, (Sample, SampleView)):\n return _handle_bytes(o.to_mongo_dict(include_id=True))\n if issubclass(type(o), ViewStage):\n return o._serialize()\n if isinstance(o, ObjectId):\n return str(o)\n if isinstance(o, float):\n return json_util.dumps(o)\n return super().default(o)\n\n @staticmethod\n def dumps(*args, **kwargs):\n \"\"\"Defined for overriding the default SocketIO `json` interface\"\"\"\n kwargs[\"cls\"] = FiftyOneJSONEncoder\n return json_util.dumps(\n json_util.loads(\n json_util.dumps(*args, **kwargs), parse_constant=lambda c: c\n ),\n **kwargs\n )\n\n @staticmethod\n def loads(*args, **kwargs):\n \"\"\"Defined for overriding the default SocketIO `json` interface\"\"\"\n return json_util.loads(*args, **kwargs)\n", "path": "fiftyone/server/json_util.py"}], "after_files": [{"content": "\"\"\"\nFiftyOne server json utilies.\n\n| Copyright 2017-2022, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nfrom bson import ObjectId, json_util\nfrom collections import OrderedDict\nfrom datetime import date, datetime\nfrom json import JSONEncoder\nimport math\nimport numpy as np\n\nfrom fiftyone.core.sample import Sample, SampleView\nfrom fiftyone.core.stages import ViewStage\nimport fiftyone.core.utils as fou\n\n\n_MASK_CLASSES = {\"Detection\", \"Heatmap\", \"Segmentation\"}\n\n\ndef _handle_bytes(o):\n for k, v in o.items():\n if isinstance(v, bytes):\n o[k] = str(fou.deserialize_numpy_array(v).shape)\n elif isinstance(v, dict):\n o[k] = _handle_bytes(v)\n\n return o\n\n\ndef _handle_numpy_array(raw, _cls=None):\n if _cls not in _MASK_CLASSES:\n return str(fou.deserialize_numpy_array(raw).shape)\n\n array = fou.deserialize_numpy_array(raw)\n\n if np.isfortran(array):\n array = np.ascontiguousarray(array)\n\n return fou.serialize_numpy_array(array, ascii=True)\n\n\ndef _handle_date(dt):\n return {\n \"_cls\": \"DateTime\",\n \"datetime\": fou.datetime_to_timestamp(dt),\n }\n\n\ndef _is_invalid_number(value):\n if not isinstance(value, float):\n return False\n\n return math.isnan(value) or math.isinf(value)\n\n\ndef convert(d):\n if isinstance(d, (dict, OrderedDict)):\n for k, v in d.items():\n if isinstance(v, bytes):\n d[k] = _handle_numpy_array(v, d.get(\"_cls\", None))\n elif isinstance(v, (date, datetime)):\n d[k] = _handle_date(v)\n elif isinstance(v, ObjectId):\n d[k] = str(v)\n elif isinstance(v, (dict, OrderedDict, list)):\n convert(v)\n elif _is_invalid_number(v):\n d[k] = str(v)\n\n if isinstance(d, list):\n for idx, i in enumerate(d):\n if isinstance(i, tuple):\n d[idx] = list(i)\n i = d[idx]\n\n if isinstance(i, bytes):\n d[idx] = _handle_numpy_array(i)\n elif isinstance(i, (date, datetime)):\n d[idx] = _handle_date(i)\n elif isinstance(i, ObjectId):\n d[idx] = str(i)\n elif isinstance(i, (dict, OrderedDict, list)):\n convert(i)\n elif _is_invalid_number(i):\n d[idx] = str(i)\n\n\nclass FiftyOneJSONEncoder(JSONEncoder):\n \"\"\"JSON encoder for the FiftyOne server.\n\n Any classes with non-standard serialization methods should\n be accounted for in the `default()` method.\n \"\"\"\n\n def default(self, o): # pylint: disable=E0202\n \"\"\"Returns the serialized representation of the 
objects\n\n Args:\n o: the object\n\n Returns:\n str\n \"\"\"\n if isinstance(o, (Sample, SampleView)):\n return _handle_bytes(o.to_mongo_dict(include_id=True))\n if issubclass(type(o), ViewStage):\n return o._serialize()\n if isinstance(o, ObjectId):\n return str(o)\n if isinstance(o, float):\n return json_util.dumps(o)\n return super().default(o)\n\n @staticmethod\n def dumps(*args, **kwargs):\n \"\"\"Defined for overriding the default SocketIO `json` interface\"\"\"\n kwargs[\"cls\"] = FiftyOneJSONEncoder\n return json_util.dumps(\n json_util.loads(\n json_util.dumps(*args, **kwargs), parse_constant=lambda c: c\n ),\n **kwargs\n )\n\n @staticmethod\n def loads(*args, **kwargs):\n \"\"\"Defined for overriding the default SocketIO `json` interface\"\"\"\n return json_util.loads(*args, **kwargs)\n", "path": "fiftyone/server/json_util.py"}]}
1570
204
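For the fiftyone entry above, the golden diff normalizes mask layout before serialization; the serializer evidently assumes C-order buffers, so Fortran-order bytes decoded in the App came out flipped. The two NumPy calls the patch relies on, shown in isolation:

```python
import numpy as np

mask = np.asarray([[True, False], [True, False]])
f_mask = np.asfortranarray(mask)      # same values, column-major layout

assert np.isfortran(f_mask) and not np.isfortran(mask)

fixed = np.ascontiguousarray(f_mask)  # copy back to row-major (C) order
assert not np.isfortran(fixed)
assert np.array_equal(fixed, mask)    # pixels unchanged, only layout differs
```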
gh_patches_debug_1764
rasdani/github-patches
git_diff
apple__coremltools-298
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Why is six pinned to 1.10.0? Is there any reason for [six to be pinned to version 1.10.0](https://github.com/apple/coremltools/blob/master/setup.py#L44). This gives transitive dependency issues sometimes. /cc @mats-claassen --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 #!/usr/bin/env python 2 3 import os 4 from setuptools import setup 5 6 README = os.path.join(os.getcwd(), "README.rst") 7 8 9 with open(README) as f: 10 long_description = f.read() 11 12 setup(name='coremltools', 13 version='2.0', 14 description='Community Tools for CoreML', 15 long_description=long_description, 16 author='Apple Inc.', 17 author_email='[email protected]', 18 url='', 19 packages=[ 20 'coremltools', 21 'coremltools._deps', 22 'coremltools.converters', 23 'coremltools.converters.caffe', 24 'coremltools.converters.sklearn', 25 'coremltools.converters.xgboost', 26 'coremltools.converters.libsvm', 27 'coremltools.converters.keras', 28 'coremltools.graph_visualization', 29 'coremltools.models', 30 'coremltools.models.neural_network', 31 'coremltools.proto', 32 'coremltools._scripts' 33 ], 34 package_data={'': ['LICENSE.txt', 'README.rst', 'libcaffeconverter.so', 'libcoremlpython.so'], 35 'coremltools': ['graph_visualization/__init__.py', 36 'graph_visualization/app.js', 37 'graph_visualization/index.html', 38 'graph_visualization/style.css', 39 'graph_visualization/assets/*', 40 'graph_visualization/icons/*'] 41 }, 42 install_requires=[ 43 'numpy >= 1.10.0', 44 'protobuf >= 3.1.0', 45 'six==1.10.0' 46 ], 47 entry_points = { 48 'console_scripts': ['coremlconverter = coremltools:_main'] 49 }, 50 classifiers=[ 51 'Development Status :: 4 - Beta', 52 'Intended Audience :: End Users/Desktop', 53 'Intended Audience :: Developers', 54 'Operating System :: MacOS :: MacOS X', 55 'Programming Language :: Python :: 2.7', 56 'Programming Language :: Python :: 3.5', 57 'Programming Language :: Python :: 3.6', 58 'Topic :: Scientific/Engineering', 59 'Topic :: Software Development' 60 ], 61 license='BSD' 62 ) 63 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -42,7 +42,7 @@ install_requires=[ 'numpy >= 1.10.0', 'protobuf >= 3.1.0', - 'six==1.10.0' + 'six>=1.10.0' ], entry_points = { 'console_scripts': ['coremlconverter = coremltools:_main']
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -42,7 +42,7 @@\n install_requires=[\n 'numpy >= 1.10.0',\n 'protobuf >= 3.1.0',\n- 'six==1.10.0'\n+ 'six>=1.10.0'\n ],\n entry_points = {\n 'console_scripts': ['coremlconverter = coremltools:_main']\n", "issue": "Why is six pinned to 1.10.0?\nIs there any reason for [six to be pinned to version 1.10.0](https://github.com/apple/coremltools/blob/master/setup.py#L44). This gives transitive dependency issues sometimes.\r\n\r\n/cc @mats-claassen\n", "before_files": [{"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup\n\nREADME = os.path.join(os.getcwd(), \"README.rst\")\n\n\nwith open(README) as f:\n long_description = f.read()\n\nsetup(name='coremltools',\n version='2.0',\n description='Community Tools for CoreML',\n long_description=long_description,\n author='Apple Inc.',\n author_email='[email protected]',\n url='',\n packages=[\n 'coremltools',\n 'coremltools._deps',\n 'coremltools.converters',\n 'coremltools.converters.caffe',\n 'coremltools.converters.sklearn',\n 'coremltools.converters.xgboost',\n 'coremltools.converters.libsvm',\n 'coremltools.converters.keras',\n 'coremltools.graph_visualization',\n 'coremltools.models',\n 'coremltools.models.neural_network',\n 'coremltools.proto',\n 'coremltools._scripts'\n ],\n package_data={'': ['LICENSE.txt', 'README.rst', 'libcaffeconverter.so', 'libcoremlpython.so'],\n 'coremltools': ['graph_visualization/__init__.py',\n 'graph_visualization/app.js',\n 'graph_visualization/index.html',\n 'graph_visualization/style.css',\n 'graph_visualization/assets/*',\n 'graph_visualization/icons/*']\n },\n install_requires=[\n 'numpy >= 1.10.0',\n 'protobuf >= 3.1.0',\n 'six==1.10.0'\n ],\n entry_points = {\n 'console_scripts': ['coremlconverter = coremltools:_main']\n },\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: End Users/Desktop',\n 'Intended Audience :: Developers',\n 'Operating System :: MacOS :: MacOS X',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Scientific/Engineering',\n 'Topic :: Software Development'\n ],\n license='BSD'\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup\n\nREADME = os.path.join(os.getcwd(), \"README.rst\")\n\n\nwith open(README) as f:\n long_description = f.read()\n\nsetup(name='coremltools',\n version='2.0',\n description='Community Tools for CoreML',\n long_description=long_description,\n author='Apple Inc.',\n author_email='[email protected]',\n url='',\n packages=[\n 'coremltools',\n 'coremltools._deps',\n 'coremltools.converters',\n 'coremltools.converters.caffe',\n 'coremltools.converters.sklearn',\n 'coremltools.converters.xgboost',\n 'coremltools.converters.libsvm',\n 'coremltools.converters.keras',\n 'coremltools.graph_visualization',\n 'coremltools.models',\n 'coremltools.models.neural_network',\n 'coremltools.proto',\n 'coremltools._scripts'\n ],\n package_data={'': ['LICENSE.txt', 'README.rst', 'libcaffeconverter.so', 'libcoremlpython.so'],\n 'coremltools': ['graph_visualization/__init__.py',\n 'graph_visualization/app.js',\n 'graph_visualization/index.html',\n 'graph_visualization/style.css',\n 'graph_visualization/assets/*',\n 'graph_visualization/icons/*']\n },\n install_requires=[\n 'numpy >= 1.10.0',\n 'protobuf >= 3.1.0',\n 'six>=1.10.0'\n ],\n entry_points = {\n 'console_scripts': ['coremlconverter = 
coremltools:_main']\n },\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: End Users/Desktop',\n 'Intended Audience :: Developers',\n 'Operating System :: MacOS :: MacOS X',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Scientific/Engineering',\n 'Topic :: Software Development'\n ],\n license='BSD'\n)\n", "path": "setup.py"}]}
902
106
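The patch above relaxes a hard pin (`six==1.10.0`) to a lower bound (`six>=1.10.0`), which is what resolves the transitive-dependency conflicts described in the issue. A minimal sketch of why the pin conflicts, using the `packaging` library (the version `1.16.0` is an arbitrary illustration, not taken from the record):

```python
from packaging.specifiers import SpecifierSet

pinned = SpecifierSet("==1.10.0")   # coremltools constraint before the patch
relaxed = SpecifierSet(">=1.10.0")  # constraint after the patch

candidate = "1.16.0"  # hypothetical six release required by another package
print(candidate in pinned)   # False -> unsatisfiable alongside the hard pin
print(candidate in relaxed)  # True  -> the lower bound admits newer releases
```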
gh_patches_debug_36300
rasdani/github-patches
git_diff
fidals__shopelectro-767
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Use SiteDriver class instead of seleniumrequests.Remote It will bring ability to use `shopelectro.selenium` classes in tests. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `shopelectro/selenium/pages/order.py` Content: ``` 1 from shopelectro.models import PaymentOptions 2 from shopelectro.selenium.elements import Input, Button 3 from shopelectro.selenium.pages import Page 4 5 from selenium.webdriver.common.by import By 6 7 from pages.models import CustomPage 8 9 # @todo #682:120m Implement and reuse shopelectro.selenium.OrderPage for selenium tests. 10 11 12 class OrderPage(Page): 13 14 def __init__(self, driver): 15 super().__init__(driver) 16 self.submit_button = Button(self.driver, (By.ID, 'submit-order')) 17 18 @property 19 def path(self): 20 return CustomPage.objects.get(slug='order').url 21 22 def fill_contacts( 23 self, name='Name', city='Санкт-Петербург', phone='2222222222', email='[email protected]', 24 ): 25 contacts = { 26 'id_name': name, 27 'id_city': city, 28 'id_phone': phone, 29 'id_email': email, 30 } 31 32 for id_, value in contacts.items(): 33 Input(self.driver, (By.ID, id_)).send_keys(value) 34 35 def make_order(self): 36 self.submit_button.click() 37 38 def select_payment_type(self, payment_option: PaymentOptions): 39 if payment_option not in PaymentOptions: 40 raise ValueError( 41 'An invalid payment type provided.' 42 f'It should be one of: {PaymentOptions}' 43 ) 44 45 item = Button( 46 self.driver, 47 (By.CSS, f'input[name="payment_type"][value="{payment_option.name}"]'), 48 ) 49 item.click() 50 ``` Path: `shopelectro/selenium/pages/page.py` Content: ``` 1 from shopelectro.selenium import SiteDriver 2 3 from selenium.webdriver.common.by import By 4 from selenium.webdriver.support import expected_conditions as EC 5 6 7 class Page: 8 """ 9 Represent a typical Shopelectro's page. 10 11 Contains cross-page elements: header, footer, ... 12 """ 13 14 def __init__(self, driver: SiteDriver): 15 if not isinstance(driver, SiteDriver): 16 raise TypeError('Driver must be an instance of shopelectro.selenium.SiteDriver') 17 self.driver = driver 18 self.path: str 19 20 def load(self): 21 if not self.path: 22 raise ValueError(f'Set a page path to {self.__class__.__name__}') 23 self.driver.get(self.path) 24 self.driver.wait.until(EC.visibility_of_element_located( 25 (By.TAG_NAME, 'body') 26 )) 27 ``` Path: `shopelectro/selenium/pages/success.py` Content: ``` 1 from shopelectro.selenium.pages import Page 2 3 from pages.models import CustomPage 4 5 6 class SuccessPage(Page): 7 8 @property 9 def path(self): 10 CustomPage.objects.get(slug='order-success').url 11 12 def is_success(self): 13 return 'Заказ принят' in self.driver.find_element_by_tag_name('h1').text 14 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/shopelectro/selenium/pages/order.py b/shopelectro/selenium/pages/order.py --- a/shopelectro/selenium/pages/order.py +++ b/shopelectro/selenium/pages/order.py @@ -3,6 +3,7 @@ from shopelectro.selenium.pages import Page from selenium.webdriver.common.by import By +from selenium.webdriver.support import expected_conditions as EC from pages.models import CustomPage @@ -34,6 +35,7 @@ def make_order(self): self.submit_button.click() + self.driver.wait.until(EC.url_changes(self.path)) def select_payment_type(self, payment_option: PaymentOptions): if payment_option not in PaymentOptions: diff --git a/shopelectro/selenium/pages/page.py b/shopelectro/selenium/pages/page.py --- a/shopelectro/selenium/pages/page.py +++ b/shopelectro/selenium/pages/page.py @@ -1,3 +1,5 @@ +from functools import wraps + from shopelectro.selenium import SiteDriver from selenium.webdriver.common.by import By @@ -17,10 +19,17 @@ self.driver = driver self.path: str + def wait_loaded(self): + def loaded(driver): + is_sync = EC.url_contains(self.path) + is_rendered = EC.visibility_of_element_located( + (By.TAG_NAME, 'body') + ) + return is_sync(driver) and is_rendered(driver) + self.driver.wait.until(loaded) + def load(self): if not self.path: raise ValueError(f'Set a page path to {self.__class__.__name__}') self.driver.get(self.path) - self.driver.wait.until(EC.visibility_of_element_located( - (By.TAG_NAME, 'body') - )) + self.wait_loaded() diff --git a/shopelectro/selenium/pages/success.py b/shopelectro/selenium/pages/success.py --- a/shopelectro/selenium/pages/success.py +++ b/shopelectro/selenium/pages/success.py @@ -1,3 +1,6 @@ +from selenium.webdriver.common.by import By +from selenium.webdriver.support import expected_conditions as EC + from shopelectro.selenium.pages import Page from pages.models import CustomPage @@ -7,7 +10,10 @@ @property def path(self): - CustomPage.objects.get(slug='order-success').url + return CustomPage.objects.get(slug='order-success').url def is_success(self): - return 'Заказ принят' in self.driver.find_element_by_tag_name('h1').text + h1 = self.driver.wait.until( + EC.visibility_of_element_located((By.TAG_NAME, 'h1')) + ).text + return 'Заказ принят' in h1
{"golden_diff": "diff --git a/shopelectro/selenium/pages/order.py b/shopelectro/selenium/pages/order.py\n--- a/shopelectro/selenium/pages/order.py\n+++ b/shopelectro/selenium/pages/order.py\n@@ -3,6 +3,7 @@\n from shopelectro.selenium.pages import Page\n \n from selenium.webdriver.common.by import By\n+from selenium.webdriver.support import expected_conditions as EC\n \n from pages.models import CustomPage\n \n@@ -34,6 +35,7 @@\n \n def make_order(self):\n self.submit_button.click()\n+ self.driver.wait.until(EC.url_changes(self.path))\n \n def select_payment_type(self, payment_option: PaymentOptions):\n if payment_option not in PaymentOptions:\ndiff --git a/shopelectro/selenium/pages/page.py b/shopelectro/selenium/pages/page.py\n--- a/shopelectro/selenium/pages/page.py\n+++ b/shopelectro/selenium/pages/page.py\n@@ -1,3 +1,5 @@\n+from functools import wraps\n+\n from shopelectro.selenium import SiteDriver\n \n from selenium.webdriver.common.by import By\n@@ -17,10 +19,17 @@\n self.driver = driver\n self.path: str\n \n+ def wait_loaded(self):\n+ def loaded(driver):\n+ is_sync = EC.url_contains(self.path)\n+ is_rendered = EC.visibility_of_element_located(\n+ (By.TAG_NAME, 'body')\n+ )\n+ return is_sync(driver) and is_rendered(driver)\n+ self.driver.wait.until(loaded)\n+\n def load(self):\n if not self.path:\n raise ValueError(f'Set a page path to {self.__class__.__name__}')\n self.driver.get(self.path)\n- self.driver.wait.until(EC.visibility_of_element_located(\n- (By.TAG_NAME, 'body')\n- ))\n+ self.wait_loaded()\ndiff --git a/shopelectro/selenium/pages/success.py b/shopelectro/selenium/pages/success.py\n--- a/shopelectro/selenium/pages/success.py\n+++ b/shopelectro/selenium/pages/success.py\n@@ -1,3 +1,6 @@\n+from selenium.webdriver.common.by import By\n+from selenium.webdriver.support import expected_conditions as EC\n+\n from shopelectro.selenium.pages import Page\n \n from pages.models import CustomPage\n@@ -7,7 +10,10 @@\n \n @property\n def path(self):\n- CustomPage.objects.get(slug='order-success').url\n+ return CustomPage.objects.get(slug='order-success').url\n \n def is_success(self):\n- return '\u0417\u0430\u043a\u0430\u0437 \u043f\u0440\u0438\u043d\u044f\u0442' in self.driver.find_element_by_tag_name('h1').text\n+ h1 = self.driver.wait.until(\n+ EC.visibility_of_element_located((By.TAG_NAME, 'h1'))\n+ ).text\n+ return '\u0417\u0430\u043a\u0430\u0437 \u043f\u0440\u0438\u043d\u044f\u0442' in h1\n", "issue": "Use SiteDriver class instead of seleniumrequests.Remote\nIt will bring ability to use `shopelectro.selenium` classes in tests. 
\n", "before_files": [{"content": "from shopelectro.models import PaymentOptions\nfrom shopelectro.selenium.elements import Input, Button\nfrom shopelectro.selenium.pages import Page\n\nfrom selenium.webdriver.common.by import By\n\nfrom pages.models import CustomPage\n\n# @todo #682:120m Implement and reuse shopelectro.selenium.OrderPage for selenium tests.\n\n\nclass OrderPage(Page):\n\n def __init__(self, driver):\n super().__init__(driver)\n self.submit_button = Button(self.driver, (By.ID, 'submit-order'))\n\n @property\n def path(self):\n return CustomPage.objects.get(slug='order').url\n\n def fill_contacts(\n self, name='Name', city='\u0421\u0430\u043d\u043a\u0442-\u041f\u0435\u0442\u0435\u0440\u0431\u0443\u0440\u0433', phone='2222222222', email='[email protected]',\n ):\n contacts = {\n 'id_name': name,\n 'id_city': city,\n 'id_phone': phone,\n 'id_email': email,\n }\n\n for id_, value in contacts.items():\n Input(self.driver, (By.ID, id_)).send_keys(value)\n\n def make_order(self):\n self.submit_button.click()\n\n def select_payment_type(self, payment_option: PaymentOptions):\n if payment_option not in PaymentOptions:\n raise ValueError(\n 'An invalid payment type provided.'\n f'It should be one of: {PaymentOptions}'\n )\n\n item = Button(\n self.driver,\n (By.CSS, f'input[name=\"payment_type\"][value=\"{payment_option.name}\"]'),\n )\n item.click()\n", "path": "shopelectro/selenium/pages/order.py"}, {"content": "from shopelectro.selenium import SiteDriver\n\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support import expected_conditions as EC\n\n\nclass Page:\n \"\"\"\n Represent a typical Shopelectro's page.\n\n Contains cross-page elements: header, footer, ...\n \"\"\"\n\n def __init__(self, driver: SiteDriver):\n if not isinstance(driver, SiteDriver):\n raise TypeError('Driver must be an instance of shopelectro.selenium.SiteDriver')\n self.driver = driver\n self.path: str\n\n def load(self):\n if not self.path:\n raise ValueError(f'Set a page path to {self.__class__.__name__}')\n self.driver.get(self.path)\n self.driver.wait.until(EC.visibility_of_element_located(\n (By.TAG_NAME, 'body')\n ))\n", "path": "shopelectro/selenium/pages/page.py"}, {"content": "from shopelectro.selenium.pages import Page\n\nfrom pages.models import CustomPage\n\n\nclass SuccessPage(Page):\n\n @property\n def path(self):\n CustomPage.objects.get(slug='order-success').url\n\n def is_success(self):\n return '\u0417\u0430\u043a\u0430\u0437 \u043f\u0440\u0438\u043d\u044f\u0442' in self.driver.find_element_by_tag_name('h1').text\n", "path": "shopelectro/selenium/pages/success.py"}], "after_files": [{"content": "from shopelectro.models import PaymentOptions\nfrom shopelectro.selenium.elements import Input, Button\nfrom shopelectro.selenium.pages import Page\n\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support import expected_conditions as EC\n\nfrom pages.models import CustomPage\n\n# @todo #682:120m Implement and reuse shopelectro.selenium.OrderPage for selenium tests.\n\n\nclass OrderPage(Page):\n\n def __init__(self, driver):\n super().__init__(driver)\n self.submit_button = Button(self.driver, (By.ID, 'submit-order'))\n\n @property\n def path(self):\n return CustomPage.objects.get(slug='order').url\n\n def fill_contacts(\n self, name='Name', city='\u0421\u0430\u043d\u043a\u0442-\u041f\u0435\u0442\u0435\u0440\u0431\u0443\u0440\u0433', phone='2222222222', email='[email protected]',\n ):\n contacts = {\n 'id_name': name,\n 'id_city': city,\n 'id_phone': 
phone,\n 'id_email': email,\n }\n\n for id_, value in contacts.items():\n Input(self.driver, (By.ID, id_)).send_keys(value)\n\n def make_order(self):\n self.submit_button.click()\n self.driver.wait.until(EC.url_changes(self.path))\n\n def select_payment_type(self, payment_option: PaymentOptions):\n if payment_option not in PaymentOptions:\n raise ValueError(\n 'An invalid payment type provided.'\n f'It should be one of: {PaymentOptions}'\n )\n\n item = Button(\n self.driver,\n (By.CSS, f'input[name=\"payment_type\"][value=\"{payment_option.name}\"]'),\n )\n item.click()\n", "path": "shopelectro/selenium/pages/order.py"}, {"content": "from functools import wraps\n\nfrom shopelectro.selenium import SiteDriver\n\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support import expected_conditions as EC\n\n\nclass Page:\n \"\"\"\n Represent a typical Shopelectro's page.\n\n Contains cross-page elements: header, footer, ...\n \"\"\"\n\n def __init__(self, driver: SiteDriver):\n if not isinstance(driver, SiteDriver):\n raise TypeError('Driver must be an instance of shopelectro.selenium.SiteDriver')\n self.driver = driver\n self.path: str\n\n def wait_loaded(self):\n def loaded(driver):\n is_sync = EC.url_contains(self.path)\n is_rendered = EC.visibility_of_element_located(\n (By.TAG_NAME, 'body')\n )\n return is_sync(driver) and is_rendered(driver)\n self.driver.wait.until(loaded)\n\n def load(self):\n if not self.path:\n raise ValueError(f'Set a page path to {self.__class__.__name__}')\n self.driver.get(self.path)\n self.wait_loaded()\n", "path": "shopelectro/selenium/pages/page.py"}, {"content": "from selenium.webdriver.common.by import By\nfrom selenium.webdriver.support import expected_conditions as EC\n\nfrom shopelectro.selenium.pages import Page\n\nfrom pages.models import CustomPage\n\n\nclass SuccessPage(Page):\n\n @property\n def path(self):\n return CustomPage.objects.get(slug='order-success').url\n\n def is_success(self):\n h1 = self.driver.wait.until(\n EC.visibility_of_element_located((By.TAG_NAME, 'h1'))\n ).text\n return '\u0417\u0430\u043a\u0430\u0437 \u043f\u0440\u0438\u043d\u044f\u0442' in h1\n", "path": "shopelectro/selenium/pages/success.py"}]}
1077
629
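The golden diff above replaces fixed-element waits with Selenium explicit waits (`EC.url_changes` after submitting the order, plus a combined URL/render predicate in `wait_loaded`). The same pattern can be sketched against a plain WebDriver; `SiteDriver` in the repository wraps one with a preconfigured `WebDriverWait`, which is assumed here:

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def wait_page_loaded(driver, path, timeout=10):
    """Block until the browser URL contains `path` and <body> is visible."""
    wait = WebDriverWait(driver, timeout)

    def loaded(drv):
        is_sync = EC.url_contains(path)  # navigation has actually happened
        is_rendered = EC.visibility_of_element_located((By.TAG_NAME, 'body'))
        return is_sync(drv) and is_rendered(drv)

    wait.until(loaded)
```

Combining both conditions in one predicate avoids the race where the old URL's body is still visible while the new page is loading.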
gh_patches_debug_564
rasdani/github-patches
git_diff
mabel-dev__opteryx-1695
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- ✨ Memory Pool Optimizations ### Thanks for stopping by to let us know something could be better! **Is your feature request related to a problem? Please describe.** _A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]_ **Describe the solution you'd like** _A clear and concise description of what you want to happen._ **Describe alternatives you've considered** _A clear and concise description of any alternative solutions or features you've considered._ **Additional context** _Add any other context or screenshots about the feature request here._ --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `opteryx/__version__.py` Content: ``` 1 __build__ = 527 2 3 # Licensed under the Apache License, Version 2.0 (the "License"); 4 # you may not use this file except in compliance with the License. 5 # You may obtain a copy of the License at 6 # 7 # http://www.apache.org/licenses/LICENSE-2.0 8 # 9 # Unless required by applicable law or agreed to in writing, software 10 # distributed under the License is distributed on an "AS IS" BASIS, 11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 # See the License for the specific language governing permissions and 13 # limitations under the License. 14 15 """ 16 Store the version here so: 17 1) we don't load dependencies by storing it in __init__.py 18 2) we can import it in setup.py for the same reason 19 """ 20 from enum import Enum # isort: skip 21 22 23 class VersionStatus(Enum): 24 ALPHA = "alpha" 25 BETA = "beta" 26 RELEASE = "release" 27 28 29 _major = 0 30 _minor = 16 31 _revision = 0 32 _status = VersionStatus.ALPHA 33 34 __author__ = "@joocer" 35 __version__ = f"{_major}.{_minor}.{_revision}" + ( 36 f"-{_status.value}.{__build__}" if _status != VersionStatus.RELEASE else "" 37 ) 38 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/opteryx/__version__.py b/opteryx/__version__.py --- a/opteryx/__version__.py +++ b/opteryx/__version__.py @@ -1,4 +1,4 @@ -__build__ = 527 +__build__ = 532 # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License.
{"golden_diff": "diff --git a/opteryx/__version__.py b/opteryx/__version__.py\n--- a/opteryx/__version__.py\n+++ b/opteryx/__version__.py\n@@ -1,4 +1,4 @@\n-__build__ = 527\n+__build__ = 532\n \n # Licensed under the Apache License, Version 2.0 (the \"License\");\n # you may not use this file except in compliance with the License.\n", "issue": "\u2728 Memory Pool Optimizations\n### Thanks for stopping by to let us know something could be better!\r\n\r\n**Is your feature request related to a problem? Please describe.** _A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]_\r\n\r\n**Describe the solution you'd like** _A clear and concise description of what you want to happen._\r\n\r\n**Describe alternatives you've considered** _A clear and concise description of any alternative solutions or features you've considered._\r\n\r\n**Additional context** _Add any other context or screenshots about the feature request here._\r\n\n", "before_files": [{"content": "__build__ = 527\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 16\n_revision = 0\n_status = VersionStatus.ALPHA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py"}], "after_files": [{"content": "__build__ = 532\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nStore the version here so:\n1) we don't load dependencies by storing it in __init__.py\n2) we can import it in setup.py for the same reason\n\"\"\"\nfrom enum import Enum # isort: skip\n\n\nclass VersionStatus(Enum):\n ALPHA = \"alpha\"\n BETA = \"beta\"\n RELEASE = \"release\"\n\n\n_major = 0\n_minor = 16\n_revision = 0\n_status = VersionStatus.ALPHA\n\n__author__ = \"@joocer\"\n__version__ = f\"{_major}.{_minor}.{_revision}\" + (\n f\"-{_status.value}.{__build__}\" if _status != VersionStatus.RELEASE else \"\"\n)\n", "path": "opteryx/__version__.py"}]}
734
101
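Although the golden diff only bumps `__build__`, the record's `__version__.py` shows how the pieces compose into the public version string. A standalone reproduction (plain strings stand in for the `VersionStatus` enum):

```python
_major, _minor, _revision = 0, 16, 0
_build = 532        # value after the patch
_status = "alpha"   # VersionStatus.ALPHA.value

version = f"{_major}.{_minor}.{_revision}" + (
    f"-{_status}.{_build}" if _status != "release" else ""
)
print(version)  # 0.16.0-alpha.532
```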
gh_patches_debug_308
rasdani/github-patches
git_diff
zulip__zulip-13077
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Upgrade pip from 19.1.1 and pip-tools from 3.8.0 Followup issue from #13067. pip-tools 3.9.0 or 4.0.0 fails to resolve dependencies from Git URLs (jazzband/pip-tools#851): `pip._internal.exceptions.DistributionNotFound: No matching distribution found for zulip==0.6.1_git (from -r requirements/common.in (line 135))` while pip 19.2 breaks pip-tools 3.8.0 (jazzband/pip-tools#853): `TypeError: __init__() got an unexpected keyword argument 'find_links'` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `version.py` Content: ``` 1 import os 2 3 ZULIP_VERSION = "2.0.4+git" 4 # Add information on number of commits and commit hash to version, if available 5 zulip_git_version_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'zulip-git-version') 6 if os.path.exists(zulip_git_version_file): 7 with open(zulip_git_version_file) as f: 8 version = f.read().strip() 9 if version: 10 ZULIP_VERSION = version 11 12 LATEST_MAJOR_VERSION = "2.0" 13 LATEST_RELEASE_VERSION = "2.0.4" 14 LATEST_RELEASE_ANNOUNCEMENT = "https://blog.zulip.org/2019/03/01/zulip-2-0-released/" 15 16 # Bump the minor PROVISION_VERSION to indicate that folks should provision 17 # only when going from an old version of the code to a newer version. Bump 18 # the major version to indicate that folks should provision in both 19 # directions. 20 21 # Typically, 22 # * adding a dependency only requires a minor version bump; 23 # * removing a dependency requires a major version bump; 24 # * upgrading a dependency requires a major version bump, unless the 25 # upgraded dependency is backwards compatible with all of our 26 # historical commits sharing the same major version, in which case a 27 # minor version bump suffices. 28 29 PROVISION_VERSION = '49.2' 30 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/version.py b/version.py --- a/version.py +++ b/version.py @@ -26,4 +26,4 @@ # historical commits sharing the same major version, in which case a # minor version bump suffices. -PROVISION_VERSION = '49.2' +PROVISION_VERSION = '49.3'
{"golden_diff": "diff --git a/version.py b/version.py\n--- a/version.py\n+++ b/version.py\n@@ -26,4 +26,4 @@\n # historical commits sharing the same major version, in which case a\n # minor version bump suffices.\n \n-PROVISION_VERSION = '49.2'\n+PROVISION_VERSION = '49.3'\n", "issue": "Upgrade pip from 19.1.1 and pip-tools from 3.8.0\nFollowup issue from #13067. pip-tools 3.9.0 or 4.0.0 fails to resolve dependencies from Git URLs (jazzband/pip-tools#851):\r\n\r\n`pip._internal.exceptions.DistributionNotFound: No matching distribution found for zulip==0.6.1_git (from -r requirements/common.in (line 135))`\r\n\r\nwhile pip 19.2 breaks pip-tools 3.8.0 (jazzband/pip-tools#853):\r\n\r\n`TypeError: __init__() got an unexpected keyword argument 'find_links'`\n", "before_files": [{"content": "import os\n\nZULIP_VERSION = \"2.0.4+git\"\n# Add information on number of commits and commit hash to version, if available\nzulip_git_version_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'zulip-git-version')\nif os.path.exists(zulip_git_version_file):\n with open(zulip_git_version_file) as f:\n version = f.read().strip()\n if version:\n ZULIP_VERSION = version\n\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.4\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically,\n# * adding a dependency only requires a minor version bump;\n# * removing a dependency requires a major version bump;\n# * upgrading a dependency requires a major version bump, unless the\n# upgraded dependency is backwards compatible with all of our\n# historical commits sharing the same major version, in which case a\n# minor version bump suffices.\n\nPROVISION_VERSION = '49.2'\n", "path": "version.py"}], "after_files": [{"content": "import os\n\nZULIP_VERSION = \"2.0.4+git\"\n# Add information on number of commits and commit hash to version, if available\nzulip_git_version_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'zulip-git-version')\nif os.path.exists(zulip_git_version_file):\n with open(zulip_git_version_file) as f:\n version = f.read().strip()\n if version:\n ZULIP_VERSION = version\n\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.4\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically,\n# * adding a dependency only requires a minor version bump;\n# * removing a dependency requires a major version bump;\n# * upgrading a dependency requires a major version bump, unless the\n# upgraded dependency is backwards compatible with all of our\n# historical commits sharing the same major version, in which case a\n# minor version bump suffices.\n\nPROVISION_VERSION = '49.3'\n", "path": "version.py"}]}
760
78
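The comment block in `version.py` encodes the bump policy: dependency additions and backwards-compatible upgrades take a minor `PROVISION_VERSION` bump (this issue's case, 49.2 to 49.3), while removals and incompatible upgrades take a major bump. A sketch of that policy as a helper (the function is purely illustrative, not Zulip code):

```python
def bump_provision_version(version: str, major: bool = False) -> str:
    major_part, minor_part = version.split('.')
    if major:  # dependency removed, or upgrade breaks older checkouts
        return f"{int(major_part) + 1}.0"
    return f"{major_part}.{int(minor_part) + 1}"  # addition / compatible upgrade

print(bump_provision_version('49.2'))  # '49.3', matching the golden diff
```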
gh_patches_debug_17290
rasdani/github-patches
git_diff
joke2k__faker-919
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Brazilian RG (identity card) Add Generator to Brazilian RG (identity card) ### Steps to reproduce fake = Faker('pt_Br') fake.rg() ### Expected behavior return like this rules: https://www.ngmatematica.com/2014/02/como-determinar-o-digito-verificador-do.html 8 digits + 1 checksum digit ### Actual behavior New feature --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `faker/providers/ssn/pt_BR/__init__.py` Content: ``` 1 # coding=utf-8 2 3 from __future__ import unicode_literals 4 from .. import Provider as SsnProvider 5 6 7 def checksum(digits): 8 """ 9 Returns the checksum of CPF digits. 10 References to the algorithm: 11 https://pt.wikipedia.org/wiki/Cadastro_de_pessoas_f%C3%ADsicas#Algoritmo 12 https://metacpan.org/source/MAMAWE/Algorithm-CheckDigits-v1.3.0/lib/Algorithm/CheckDigits/M11_004.pm 13 """ 14 s = 0 15 p = len(digits) + 1 16 for i in range(0, len(digits)): 17 s += digits[i] * p 18 p -= 1 19 20 reminder = s % 11 21 if reminder == 0 or reminder == 1: 22 return 0 23 else: 24 return 11 - reminder 25 26 27 class Provider(SsnProvider): 28 """ 29 Provider for Brazilian SSN also known in Brazil as CPF. 30 There are two methods Provider.ssn and Provider.cpf 31 The snn returns a valid number with numbers only 32 The cpf return a valid number formatted with brazilian mask. eg nnn.nnn.nnn-nn 33 """ 34 35 def ssn(self): 36 digits = self.generator.random.sample(range(10), 9) 37 38 dv = checksum(digits) 39 digits.append(dv) 40 digits.append(checksum(digits)) 41 42 return ''.join(map(str, digits)) 43 44 def cpf(self): 45 c = self.ssn() 46 return c[:3] + '.' + c[3:6] + '.' + c[6:9] + '-' + c[9:] 47 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/faker/providers/ssn/pt_BR/__init__.py b/faker/providers/ssn/pt_BR/__init__.py --- a/faker/providers/ssn/pt_BR/__init__.py +++ b/faker/providers/ssn/pt_BR/__init__.py @@ -1,6 +1,7 @@ # coding=utf-8 from __future__ import unicode_literals + from .. import Provider as SsnProvider @@ -44,3 +45,22 @@ def cpf(self): c = self.ssn() return c[:3] + '.' + c[3:6] + '.' + c[6:9] + '-' + c[9:] + + def rg(self): + """ + Brazilian RG, return plain numbers. + Check: https://www.ngmatematica.com/2014/02/como-determinar-o-digito-verificador-do.html + """ + + digits = self.generator.random.sample(range(0, 9), 8) + checksum = sum(i * digits[i - 2] for i in range(2, 10)) + last_digit = 11 - (checksum % 11) + + if last_digit == 10: + digits.append('X') + elif last_digit == 11: + digits.append(0) + else: + digits.append(last_digit) + + return ''.join(map(str, digits))
{"golden_diff": "diff --git a/faker/providers/ssn/pt_BR/__init__.py b/faker/providers/ssn/pt_BR/__init__.py\n--- a/faker/providers/ssn/pt_BR/__init__.py\n+++ b/faker/providers/ssn/pt_BR/__init__.py\n@@ -1,6 +1,7 @@\n # coding=utf-8\n \n from __future__ import unicode_literals\n+\n from .. import Provider as SsnProvider\n \n \n@@ -44,3 +45,22 @@\n def cpf(self):\n c = self.ssn()\n return c[:3] + '.' + c[3:6] + '.' + c[6:9] + '-' + c[9:]\n+\n+ def rg(self):\n+ \"\"\"\n+ Brazilian RG, return plain numbers.\n+ Check: https://www.ngmatematica.com/2014/02/como-determinar-o-digito-verificador-do.html\n+ \"\"\"\n+\n+ digits = self.generator.random.sample(range(0, 9), 8)\n+ checksum = sum(i * digits[i - 2] for i in range(2, 10))\n+ last_digit = 11 - (checksum % 11)\n+\n+ if last_digit == 10:\n+ digits.append('X')\n+ elif last_digit == 11:\n+ digits.append(0)\n+ else:\n+ digits.append(last_digit)\n+\n+ return ''.join(map(str, digits))\n", "issue": "Brazilian RG (identity card)\nAdd Generator to Brazilian RG (identity card)\r\n\r\n### Steps to reproduce\r\nfake = Faker('pt_Br')\r\nfake.rg()\r\n\r\n### Expected behavior\r\nreturn like this rules:\r\nhttps://www.ngmatematica.com/2014/02/como-determinar-o-digito-verificador-do.html\r\n8 digits + 1 checksum digit\r\n### Actual behavior\r\nNew feature\r\n\n", "before_files": [{"content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\nfrom .. import Provider as SsnProvider\n\n\ndef checksum(digits):\n \"\"\"\n Returns the checksum of CPF digits.\n References to the algorithm:\n https://pt.wikipedia.org/wiki/Cadastro_de_pessoas_f%C3%ADsicas#Algoritmo\n https://metacpan.org/source/MAMAWE/Algorithm-CheckDigits-v1.3.0/lib/Algorithm/CheckDigits/M11_004.pm\n \"\"\"\n s = 0\n p = len(digits) + 1\n for i in range(0, len(digits)):\n s += digits[i] * p\n p -= 1\n\n reminder = s % 11\n if reminder == 0 or reminder == 1:\n return 0\n else:\n return 11 - reminder\n\n\nclass Provider(SsnProvider):\n \"\"\"\n Provider for Brazilian SSN also known in Brazil as CPF.\n There are two methods Provider.ssn and Provider.cpf\n The snn returns a valid number with numbers only\n The cpf return a valid number formatted with brazilian mask. eg nnn.nnn.nnn-nn\n \"\"\"\n\n def ssn(self):\n digits = self.generator.random.sample(range(10), 9)\n\n dv = checksum(digits)\n digits.append(dv)\n digits.append(checksum(digits))\n\n return ''.join(map(str, digits))\n\n def cpf(self):\n c = self.ssn()\n return c[:3] + '.' + c[3:6] + '.' + c[6:9] + '-' + c[9:]\n", "path": "faker/providers/ssn/pt_BR/__init__.py"}], "after_files": [{"content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\n\nfrom .. import Provider as SsnProvider\n\n\ndef checksum(digits):\n \"\"\"\n Returns the checksum of CPF digits.\n References to the algorithm:\n https://pt.wikipedia.org/wiki/Cadastro_de_pessoas_f%C3%ADsicas#Algoritmo\n https://metacpan.org/source/MAMAWE/Algorithm-CheckDigits-v1.3.0/lib/Algorithm/CheckDigits/M11_004.pm\n \"\"\"\n s = 0\n p = len(digits) + 1\n for i in range(0, len(digits)):\n s += digits[i] * p\n p -= 1\n\n reminder = s % 11\n if reminder == 0 or reminder == 1:\n return 0\n else:\n return 11 - reminder\n\n\nclass Provider(SsnProvider):\n \"\"\"\n Provider for Brazilian SSN also known in Brazil as CPF.\n There are two methods Provider.ssn and Provider.cpf\n The snn returns a valid number with numbers only\n The cpf return a valid number formatted with brazilian mask. 
eg nnn.nnn.nnn-nn\n \"\"\"\n\n def ssn(self):\n digits = self.generator.random.sample(range(10), 9)\n\n dv = checksum(digits)\n digits.append(dv)\n digits.append(checksum(digits))\n\n return ''.join(map(str, digits))\n\n def cpf(self):\n c = self.ssn()\n return c[:3] + '.' + c[3:6] + '.' + c[6:9] + '-' + c[9:]\n\n def rg(self):\n \"\"\"\n Brazilian RG, return plain numbers.\n Check: https://www.ngmatematica.com/2014/02/como-determinar-o-digito-verificador-do.html\n \"\"\"\n\n digits = self.generator.random.sample(range(0, 9), 8)\n checksum = sum(i * digits[i - 2] for i in range(2, 10))\n last_digit = 11 - (checksum % 11)\n\n if last_digit == 10:\n digits.append('X')\n elif last_digit == 11:\n digits.append(0)\n else:\n digits.append(last_digit)\n\n return ''.join(map(str, digits))\n", "path": "faker/providers/ssn/pt_BR/__init__.py"}]}
797
324
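The patch implements the RG rule referenced in the issue: weights 2 through 9 applied to the eight base digits, summed modulo 11, with a result of 10 rendered as 'X' and 11 as 0. The check-digit computation can be isolated for testing (the function name is an assumption; the return type mixes int and str exactly as the patch does):

```python
def rg_check_digit(digits):
    """Check digit for an 8-digit Brazilian RG base, mirroring the patch."""
    assert len(digits) == 8
    checksum = sum(i * digits[i - 2] for i in range(2, 10))  # weights 2..9
    last = 11 - (checksum % 11)
    if last == 10:
        return 'X'
    if last == 11:
        return 0
    return last

print(rg_check_digit([2, 4, 6, 8, 1, 3, 5, 7]))  # 'X' (weighted sum 210; 210 % 11 == 1)
```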
gh_patches_debug_34130
rasdani/github-patches
git_diff
azavea__raster-vision-1560
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Improve Dataset.from_uris methods When using the `from_uris` methods (such as in `SemanticSegmentationSlidingWindowGeoDataset`), it's easy to forget to pass in an important argument due to the use of kwargs. For example, size and stride are needed, and `label_vector_default_class_id` defaults to None which counterintuitively removes all the vectors. We should fix these and related problems. This issue was originally noted in https://github.com/azavea/raster-vision/pull/1476 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py` Content: ``` 1 from typing import TYPE_CHECKING, Dict, Optional 2 from copy import deepcopy 3 4 from rastervision.core.data.vector_transformer import VectorTransformer 5 from rastervision.core.data.vector_transformer.label_maker.filter import ( 6 create_filter) 7 from rastervision.core.data.utils.geojson import features_to_geojson 8 9 if TYPE_CHECKING: 10 from rastervision.core.data import ClassConfig, CRSTransformer 11 12 13 class ClassInferenceTransformer(VectorTransformer): 14 """Infers missing class_ids from GeoJSON features. 15 16 Rules: 17 1) If class_id is in feature['properties'], use it. 18 2) If class_config is set and class_name or label are in 19 feature['properties'] and in class_config, use corresponding 20 class_id. 21 3) If class_id_to_filter is set and filter is true when applied to 22 feature, use corresponding class_id. 23 4) Otherwise, return the default_class_id 24 """ 25 26 def __init__(self, 27 default_class_id: Optional[int], 28 class_config: Optional['ClassConfig'] = None, 29 class_id_to_filter: Optional[Dict[int, list]] = None): 30 self.class_config = class_config 31 self.class_id_to_filter = class_id_to_filter 32 self.default_class_id = default_class_id 33 34 if self.class_id_to_filter is not None: 35 self.class_id_to_filter = {} 36 for class_id, filter_exp in class_id_to_filter.items(): 37 self.class_id_to_filter[int(class_id)] = create_filter( 38 filter_exp) 39 40 @staticmethod 41 def infer_feature_class_id( 42 feature: dict, 43 default_class_id: Optional[int], 44 class_config: Optional['ClassConfig'] = None, 45 class_id_to_filter: Optional[Dict[int, list]] = None 46 ) -> Optional[int]: 47 """Infer the class_id for a GeoJSON feature. 48 49 Rules: 50 1) If class_id is in feature['properties'], use it. 51 2) If class_config is set and class_name or label are in 52 feature['properties'] and in class_config, use corresponding 53 class_id. 54 3) If class_id_to_filter is set and filter is true when applied to 55 feature, use corresponding class_id. 56 4) Otherwise, return the default_class_id. 57 58 Args: 59 feature (dict): GeoJSON feature. 60 61 Returns: 62 Optional[int]: Inferred class ID. 
63 """ 64 class_id = feature.get('properties', {}).get('class_id') 65 if class_id is not None: 66 return class_id 67 68 if class_config is not None: 69 class_name = feature.get('properties', {}).get('class_name') 70 if class_name in class_config.names: 71 return class_config.names.index(class_name) 72 73 label = feature.get('properties', {}).get('label') 74 if label in class_config.names: 75 return class_config.names.index(label) 76 77 if class_id_to_filter is not None: 78 for class_id, filter_fn in class_id_to_filter.items(): 79 if filter_fn(feature): 80 return class_id 81 82 return default_class_id 83 84 def transform(self, 85 geojson: dict, 86 crs_transformer: Optional['CRSTransformer'] = None) -> dict: 87 """Add class_id to feature properties and drop features with no class. 88 89 For each feature in geojson, the class_id is inferred and is set into 90 feature['properties']. If the class_id is None (because none of the 91 rules apply and the default_class_id is None), the feature is dropped. 92 """ 93 new_features = [] 94 for feature in geojson['features']: 95 class_id = self.infer_feature_class_id( 96 feature, 97 default_class_id=self.default_class_id, 98 class_config=self.class_config, 99 class_id_to_filter=self.class_id_to_filter) 100 if class_id is not None: 101 feature = deepcopy(feature) 102 properties = feature.get('properties', {}) 103 properties['class_id'] = class_id 104 feature['properties'] = properties 105 new_features.append(feature) 106 new_geojson = features_to_geojson(new_features) 107 return new_geojson 108 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py b/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py --- a/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py +++ b/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py @@ -1,5 +1,6 @@ from typing import TYPE_CHECKING, Dict, Optional from copy import deepcopy +import logging from rastervision.core.data.vector_transformer import VectorTransformer from rastervision.core.data.vector_transformer.label_maker.filter import ( @@ -9,6 +10,8 @@ if TYPE_CHECKING: from rastervision.core.data import ClassConfig, CRSTransformer +log = logging.getLogger(__name__) + class ClassInferenceTransformer(VectorTransformer): """Infers missing class_ids from GeoJSON features. @@ -91,6 +94,7 @@ rules apply and the default_class_id is None), the feature is dropped. """ new_features = [] + warned = False for feature in geojson['features']: class_id = self.infer_feature_class_id( feature, @@ -103,5 +107,13 @@ properties['class_id'] = class_id feature['properties'] = properties new_features.append(feature) + elif not warned: + log.warning( + 'ClassInferenceTransformer is dropping vector features because ' + 'class_id cannot be inferred. To avoid this behavior, ' + 'set default_class_id to a non-None value in ' + 'ClassInferenceTransformer.') + warned = True + new_geojson = features_to_geojson(new_features) return new_geojson
{"golden_diff": "diff --git a/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py b/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py\n--- a/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py\n+++ b/rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py\n@@ -1,5 +1,6 @@\n from typing import TYPE_CHECKING, Dict, Optional\n from copy import deepcopy\n+import logging\n \n from rastervision.core.data.vector_transformer import VectorTransformer\n from rastervision.core.data.vector_transformer.label_maker.filter import (\n@@ -9,6 +10,8 @@\n if TYPE_CHECKING:\n from rastervision.core.data import ClassConfig, CRSTransformer\n \n+log = logging.getLogger(__name__)\n+\n \n class ClassInferenceTransformer(VectorTransformer):\n \"\"\"Infers missing class_ids from GeoJSON features.\n@@ -91,6 +94,7 @@\n rules apply and the default_class_id is None), the feature is dropped.\n \"\"\"\n new_features = []\n+ warned = False\n for feature in geojson['features']:\n class_id = self.infer_feature_class_id(\n feature,\n@@ -103,5 +107,13 @@\n properties['class_id'] = class_id\n feature['properties'] = properties\n new_features.append(feature)\n+ elif not warned:\n+ log.warning(\n+ 'ClassInferenceTransformer is dropping vector features because '\n+ 'class_id cannot be inferred. To avoid this behavior, '\n+ 'set default_class_id to a non-None value in '\n+ 'ClassInferenceTransformer.')\n+ warned = True\n+\n new_geojson = features_to_geojson(new_features)\n return new_geojson\n", "issue": "Improve Dataset.from_uris methods\nWhen using the `from_uris` methods (such as in `SemanticSegmentationSlidingWindowGeoDataset`), it's easy to forget to pass in an important argument due to the use of kwargs. For example, size and stride are needed, and `label_vector_default_class_id` defaults to None which counterintuitively removes all the vectors. 
We should fix these and related problems.\r\n\r\nThis issue was originally noted in https://github.com/azavea/raster-vision/pull/1476\r\n\r\n\n", "before_files": [{"content": "from typing import TYPE_CHECKING, Dict, Optional\nfrom copy import deepcopy\n\nfrom rastervision.core.data.vector_transformer import VectorTransformer\nfrom rastervision.core.data.vector_transformer.label_maker.filter import (\n create_filter)\nfrom rastervision.core.data.utils.geojson import features_to_geojson\n\nif TYPE_CHECKING:\n from rastervision.core.data import ClassConfig, CRSTransformer\n\n\nclass ClassInferenceTransformer(VectorTransformer):\n \"\"\"Infers missing class_ids from GeoJSON features.\n\n Rules:\n 1) If class_id is in feature['properties'], use it.\n 2) If class_config is set and class_name or label are in\n feature['properties'] and in class_config, use corresponding\n class_id.\n 3) If class_id_to_filter is set and filter is true when applied to\n feature, use corresponding class_id.\n 4) Otherwise, return the default_class_id\n \"\"\"\n\n def __init__(self,\n default_class_id: Optional[int],\n class_config: Optional['ClassConfig'] = None,\n class_id_to_filter: Optional[Dict[int, list]] = None):\n self.class_config = class_config\n self.class_id_to_filter = class_id_to_filter\n self.default_class_id = default_class_id\n\n if self.class_id_to_filter is not None:\n self.class_id_to_filter = {}\n for class_id, filter_exp in class_id_to_filter.items():\n self.class_id_to_filter[int(class_id)] = create_filter(\n filter_exp)\n\n @staticmethod\n def infer_feature_class_id(\n feature: dict,\n default_class_id: Optional[int],\n class_config: Optional['ClassConfig'] = None,\n class_id_to_filter: Optional[Dict[int, list]] = None\n ) -> Optional[int]:\n \"\"\"Infer the class_id for a GeoJSON feature.\n\n Rules:\n 1) If class_id is in feature['properties'], use it.\n 2) If class_config is set and class_name or label are in\n feature['properties'] and in class_config, use corresponding\n class_id.\n 3) If class_id_to_filter is set and filter is true when applied to\n feature, use corresponding class_id.\n 4) Otherwise, return the default_class_id.\n\n Args:\n feature (dict): GeoJSON feature.\n\n Returns:\n Optional[int]: Inferred class ID.\n \"\"\"\n class_id = feature.get('properties', {}).get('class_id')\n if class_id is not None:\n return class_id\n\n if class_config is not None:\n class_name = feature.get('properties', {}).get('class_name')\n if class_name in class_config.names:\n return class_config.names.index(class_name)\n\n label = feature.get('properties', {}).get('label')\n if label in class_config.names:\n return class_config.names.index(label)\n\n if class_id_to_filter is not None:\n for class_id, filter_fn in class_id_to_filter.items():\n if filter_fn(feature):\n return class_id\n\n return default_class_id\n\n def transform(self,\n geojson: dict,\n crs_transformer: Optional['CRSTransformer'] = None) -> dict:\n \"\"\"Add class_id to feature properties and drop features with no class.\n\n For each feature in geojson, the class_id is inferred and is set into\n feature['properties']. 
If the class_id is None (because none of the\n rules apply and the default_class_id is None), the feature is dropped.\n \"\"\"\n new_features = []\n for feature in geojson['features']:\n class_id = self.infer_feature_class_id(\n feature,\n default_class_id=self.default_class_id,\n class_config=self.class_config,\n class_id_to_filter=self.class_id_to_filter)\n if class_id is not None:\n feature = deepcopy(feature)\n properties = feature.get('properties', {})\n properties['class_id'] = class_id\n feature['properties'] = properties\n new_features.append(feature)\n new_geojson = features_to_geojson(new_features)\n return new_geojson\n", "path": "rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py"}], "after_files": [{"content": "from typing import TYPE_CHECKING, Dict, Optional\nfrom copy import deepcopy\nimport logging\n\nfrom rastervision.core.data.vector_transformer import VectorTransformer\nfrom rastervision.core.data.vector_transformer.label_maker.filter import (\n create_filter)\nfrom rastervision.core.data.utils.geojson import features_to_geojson\n\nif TYPE_CHECKING:\n from rastervision.core.data import ClassConfig, CRSTransformer\n\nlog = logging.getLogger(__name__)\n\n\nclass ClassInferenceTransformer(VectorTransformer):\n \"\"\"Infers missing class_ids from GeoJSON features.\n\n Rules:\n 1) If class_id is in feature['properties'], use it.\n 2) If class_config is set and class_name or label are in\n feature['properties'] and in class_config, use corresponding\n class_id.\n 3) If class_id_to_filter is set and filter is true when applied to\n feature, use corresponding class_id.\n 4) Otherwise, return the default_class_id\n \"\"\"\n\n def __init__(self,\n default_class_id: Optional[int],\n class_config: Optional['ClassConfig'] = None,\n class_id_to_filter: Optional[Dict[int, list]] = None):\n self.class_config = class_config\n self.class_id_to_filter = class_id_to_filter\n self.default_class_id = default_class_id\n\n if self.class_id_to_filter is not None:\n self.class_id_to_filter = {}\n for class_id, filter_exp in class_id_to_filter.items():\n self.class_id_to_filter[int(class_id)] = create_filter(\n filter_exp)\n\n @staticmethod\n def infer_feature_class_id(\n feature: dict,\n default_class_id: Optional[int],\n class_config: Optional['ClassConfig'] = None,\n class_id_to_filter: Optional[Dict[int, list]] = None\n ) -> Optional[int]:\n \"\"\"Infer the class_id for a GeoJSON feature.\n\n Rules:\n 1) If class_id is in feature['properties'], use it.\n 2) If class_config is set and class_name or label are in\n feature['properties'] and in class_config, use corresponding\n class_id.\n 3) If class_id_to_filter is set and filter is true when applied to\n feature, use corresponding class_id.\n 4) Otherwise, return the default_class_id.\n\n Args:\n feature (dict): GeoJSON feature.\n\n Returns:\n Optional[int]: Inferred class ID.\n \"\"\"\n class_id = feature.get('properties', {}).get('class_id')\n if class_id is not None:\n return class_id\n\n if class_config is not None:\n class_name = feature.get('properties', {}).get('class_name')\n if class_name in class_config.names:\n return class_config.names.index(class_name)\n\n label = feature.get('properties', {}).get('label')\n if label in class_config.names:\n return class_config.names.index(label)\n\n if class_id_to_filter is not None:\n for class_id, filter_fn in class_id_to_filter.items():\n if filter_fn(feature):\n return class_id\n\n return default_class_id\n\n def transform(self,\n geojson: dict,\n 
crs_transformer: Optional['CRSTransformer'] = None) -> dict:\n \"\"\"Add class_id to feature properties and drop features with no class.\n\n For each feature in geojson, the class_id is inferred and is set into\n feature['properties']. If the class_id is None (because none of the\n rules apply and the default_class_id is None), the feature is dropped.\n \"\"\"\n new_features = []\n warned = False\n for feature in geojson['features']:\n class_id = self.infer_feature_class_id(\n feature,\n default_class_id=self.default_class_id,\n class_config=self.class_config,\n class_id_to_filter=self.class_id_to_filter)\n if class_id is not None:\n feature = deepcopy(feature)\n properties = feature.get('properties', {})\n properties['class_id'] = class_id\n feature['properties'] = properties\n new_features.append(feature)\n elif not warned:\n log.warning(\n 'ClassInferenceTransformer is dropping vector features because '\n 'class_id cannot be inferred. To avoid this behavior, '\n 'set default_class_id to a non-None value in '\n 'ClassInferenceTransformer.')\n warned = True\n\n new_geojson = features_to_geojson(new_features)\n return new_geojson\n", "path": "rastervision_core/rastervision/core/data/vector_transformer/class_inference_transformer.py"}]}
1494
404
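The fix above does not change the inference logic; it adds a warn-once guard so users learn why vector features disappear when `default_class_id` is None. The guard pattern in isolation (the names here are illustrative, not the library's API):

```python
import logging

log = logging.getLogger(__name__)

def keep_labeled(features, infer_class_id):
    """Drop features whose class_id cannot be inferred, warning only once."""
    kept, warned = [], False
    for feature in features:
        if infer_class_id(feature) is not None:
            kept.append(feature)
        elif not warned:
            log.warning('dropping vector features because class_id cannot be '
                        'inferred; set default_class_id to keep them')
            warned = True
    return kept
```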
gh_patches_debug_1246
rasdani/github-patches
git_diff
getsentry__sentry-15491
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Simple typo in the compact docstring for utils.functional ## Important Details How are you running Sentry? * [ ] On-Premise docker [Version xyz] * [ ] Saas (sentry.io) * [x] Other [briefly describe your environment] Observed documentation - not running sentry. ## Description Simple typo should be values rather than valules. ## Steps to Reproduce 1. Observe docstring in utils.functional.compact method ### What you expected to happen Should be values rather than valules. ### Possible Solution Replace valules with values. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/sentry/utils/functional.py` Content: ``` 1 from __future__ import absolute_import 2 3 import six 4 5 from django.utils.functional import empty 6 7 8 def extract_lazy_object(lo): 9 """ 10 Unwrap a LazyObject and return the inner object. Whatever that may be. 11 12 ProTip: This is relying on `django.utils.functional.empty`, which may 13 or may not be removed in the future. It's 100% undocumented. 14 """ 15 if not hasattr(lo, "_wrapped"): 16 return lo 17 if lo._wrapped is empty: 18 lo._setup() 19 return lo._wrapped 20 21 22 def apply_values(function, mapping): 23 """\ 24 Applies ``function`` to a sequence containing all of the values in the 25 provided mapping, returing a new mapping with the values replaced with 26 the results of the provided function. 27 28 >>> apply_values( 29 ... lambda values: map(u'{} fish'.format, values), 30 ... {1: 'red', 2: 'blue'}, 31 ... ) 32 {1: u'red fish', 2: u'blue fish'} 33 """ 34 if not mapping: 35 return {} 36 37 keys, values = zip(*mapping.items()) 38 return dict(zip(keys, function(values))) 39 40 41 def compact(seq): 42 """ 43 Removes ``None`` values from various sequence-based data structures. 44 45 dict: 46 Removes keys with a corresponding ``None`` value. 47 48 list: 49 Removes ``None`` valules. 50 51 >>> compact({'foo': 'bar', 'baz': None}) 52 {'foo': 'bar'} 53 54 >>> compact([1, None, 2]) 55 [1, 2] 56 """ 57 if isinstance(seq, dict): 58 return {k: v for k, v in six.iteritems(seq) if v is not None} 59 60 elif isinstance(seq, list): 61 return [k for k in seq if k is not None] 62 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/sentry/utils/functional.py b/src/sentry/utils/functional.py --- a/src/sentry/utils/functional.py +++ b/src/sentry/utils/functional.py @@ -46,7 +46,7 @@ Removes keys with a corresponding ``None`` value. list: - Removes ``None`` valules. + Removes ``None`` values. >>> compact({'foo': 'bar', 'baz': None}) {'foo': 'bar'}
{"golden_diff": "diff --git a/src/sentry/utils/functional.py b/src/sentry/utils/functional.py\n--- a/src/sentry/utils/functional.py\n+++ b/src/sentry/utils/functional.py\n@@ -46,7 +46,7 @@\n Removes keys with a corresponding ``None`` value.\n \n list:\n- Removes ``None`` valules.\n+ Removes ``None`` values.\n \n >>> compact({'foo': 'bar', 'baz': None})\n {'foo': 'bar'}\n", "issue": "Simple typo in the compact docstring for utils.functional\n## Important Details\r\n\r\nHow are you running Sentry?\r\n\r\n* [ ] On-Premise docker [Version xyz]\r\n* [ ] Saas (sentry.io)\r\n* [x] Other [briefly describe your environment]\r\nObserved documentation - not running sentry.\r\n\r\n## Description\r\n\r\nSimple typo should be values rather than valules.\r\n\r\n## Steps to Reproduce\r\n\r\n1. Observe docstring in utils.functional.compact method\r\n\r\n### What you expected to happen\r\n\r\nShould be values rather than valules.\r\n\r\n### Possible Solution\r\n\r\nReplace valules with values.\r\n\n", "before_files": [{"content": "from __future__ import absolute_import\n\nimport six\n\nfrom django.utils.functional import empty\n\n\ndef extract_lazy_object(lo):\n \"\"\"\n Unwrap a LazyObject and return the inner object. Whatever that may be.\n\n ProTip: This is relying on `django.utils.functional.empty`, which may\n or may not be removed in the future. It's 100% undocumented.\n \"\"\"\n if not hasattr(lo, \"_wrapped\"):\n return lo\n if lo._wrapped is empty:\n lo._setup()\n return lo._wrapped\n\n\ndef apply_values(function, mapping):\n \"\"\"\\\n Applies ``function`` to a sequence containing all of the values in the\n provided mapping, returing a new mapping with the values replaced with\n the results of the provided function.\n\n >>> apply_values(\n ... lambda values: map(u'{} fish'.format, values),\n ... {1: 'red', 2: 'blue'},\n ... )\n {1: u'red fish', 2: u'blue fish'}\n \"\"\"\n if not mapping:\n return {}\n\n keys, values = zip(*mapping.items())\n return dict(zip(keys, function(values)))\n\n\ndef compact(seq):\n \"\"\"\n Removes ``None`` values from various sequence-based data structures.\n\n dict:\n Removes keys with a corresponding ``None`` value.\n\n list:\n Removes ``None`` valules.\n\n >>> compact({'foo': 'bar', 'baz': None})\n {'foo': 'bar'}\n\n >>> compact([1, None, 2])\n [1, 2]\n \"\"\"\n if isinstance(seq, dict):\n return {k: v for k, v in six.iteritems(seq) if v is not None}\n\n elif isinstance(seq, list):\n return [k for k in seq if k is not None]\n", "path": "src/sentry/utils/functional.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nimport six\n\nfrom django.utils.functional import empty\n\n\ndef extract_lazy_object(lo):\n \"\"\"\n Unwrap a LazyObject and return the inner object. Whatever that may be.\n\n ProTip: This is relying on `django.utils.functional.empty`, which may\n or may not be removed in the future. It's 100% undocumented.\n \"\"\"\n if not hasattr(lo, \"_wrapped\"):\n return lo\n if lo._wrapped is empty:\n lo._setup()\n return lo._wrapped\n\n\ndef apply_values(function, mapping):\n \"\"\"\\\n Applies ``function`` to a sequence containing all of the values in the\n provided mapping, returing a new mapping with the values replaced with\n the results of the provided function.\n\n >>> apply_values(\n ... lambda values: map(u'{} fish'.format, values),\n ... {1: 'red', 2: 'blue'},\n ... 
)\n {1: u'red fish', 2: u'blue fish'}\n \"\"\"\n if not mapping:\n return {}\n\n keys, values = zip(*mapping.items())\n return dict(zip(keys, function(values)))\n\n\ndef compact(seq):\n \"\"\"\n Removes ``None`` values from various sequence-based data structures.\n\n dict:\n Removes keys with a corresponding ``None`` value.\n\n list:\n Removes ``None`` values.\n\n >>> compact({'foo': 'bar', 'baz': None})\n {'foo': 'bar'}\n\n >>> compact([1, None, 2])\n [1, 2]\n \"\"\"\n if isinstance(seq, dict):\n return {k: v for k, v in six.iteritems(seq) if v is not None}\n\n elif isinstance(seq, list):\n return [k for k in seq if k is not None]\n", "path": "src/sentry/utils/functional.py"}]}
897
106
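The sentry `compact` fix above is a one-word docstring change, but the `>>>` lines in that docstring are doctests, so the corrected text doubles as an executable check. A minimal standalone sketch — it re-implements `compact` with plain `dict.items()` rather than importing Sentry's `six`-based module:

```python
def compact(seq):
    """
    Removes ``None`` values from various sequence-based data structures.

    >>> compact({'foo': 'bar', 'baz': None})
    {'foo': 'bar'}
    >>> compact([1, None, 2])
    [1, 2]
    """
    if isinstance(seq, dict):
        return {k: v for k, v in seq.items() if v is not None}
    elif isinstance(seq, list):
        return [k for k in seq if k is not None]


if __name__ == "__main__":
    import doctest

    doctest.testmod()  # both docstring examples pass against the fixed text
```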
gh_patches_debug_28789
rasdani/github-patches
git_diff
pypa__virtualenv-2206
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Incorrect (broken) virtualenv layout with pypy3.8's new layout **Issue** PyPy3.8 (currently 7.3.6rc1) supports a new install layout that resembles CPython more. That is, `sys.prefix` no longer needs being isolated, and site-packages are found in `$prefix/lib/pypy3.8/site-packages`. However, virtualenv tries to symlink everything from `/usr/lib` including the `pypy3.8` directory. As a result, the user can't write to the site-packages directory in the venv. I haven't tried running it as root but I can imagine it doing major mess if it virtualenv doesn't take any precautions from writing into system directories. **Environment** Provide at least: - OS: Gentoo Linux - ``pip list`` of the host python where ``virtualenv`` is installed: [pip-list.txt](https://github.com/pypa/virtualenv/files/7167321/pip-list.txt) **Output of the virtual environment creation** Make sure to run the creation with `-vvv --with-traceback`: Full output: [output.txt](https://github.com/pypa/virtualenv/files/7167331/output.txt) tail: ``` 1048 create virtualenv import hook file /tmp/z/lib/pypy3.8/site-packages/_virtualenv.pth [DEBUG api:95] Traceback (most recent call last): File "/usr/lib/pypy3.8/runpy.py", line 198, in _run_module_as_main "__main__", mod_spec) File "/usr/lib/pypy3.8/runpy.py", line 87, in _run_code exec(code, run_globals) File "/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py", line 80, in <module> run_with_catch() # pragma: no cov File "/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py", line 65, in run_with_catch run(args, options, env) File "/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py", line 18, in run session = cli_run(args, options, env) File "/usr/lib/pypy3.8/site-packages/virtualenv/run/__init__.py", line 32, in cli_run of_session.run() File "/usr/lib/pypy3.8/site-packages/virtualenv/run/session.py", line 46, in run self._create() File "/usr/lib/pypy3.8/site-packages/virtualenv/run/session.py", line 53, in _create self.creator.run() File "/usr/lib/pypy3.8/site-packages/virtualenv/create/creator.py", line 171, in run self.create() File "/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/builtin/via_global_self_do.py", line 101, in create super(ViaGlobalRefVirtualenvBuiltin, self).create() File "/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/api.py", line 89, in create self.install_patch() File "/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/api.py", line 96, in install_patch pth.write_text("import _virtualenv") File "/usr/lib/pypy3.8/pathlib.py", line 1255, in write_text with self.open(mode='w', encoding=encoding, errors=errors) as f: File "/usr/lib/pypy3.8/pathlib.py", line 1223, in open opener=self._opener) File "/usr/lib/pypy3.8/pathlib.py", line 1078, in _opener return self._accessor.open(self, flags, mode) PermissionError: [Errno 13] Permission denied: PosixPath('/tmp/z/lib/pypy3.8/site-packages/_virtualenv.pth') ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py` Content: ``` 1 from __future__ import absolute_import, unicode_literals 2 3 import abc 4 5 from six import add_metaclass 6 7 from virtualenv.create.describe import PosixSupports, Python3Supports, WindowsSupports 8 from virtualenv.create.via_global_ref.builtin.ref import PathRefToDest 9 from virtualenv.util.path import Path 10 11 from .common import PyPy 12 13 14 @add_metaclass(abc.ABCMeta) 15 class PyPy3(PyPy, Python3Supports): 16 @classmethod 17 def exe_stem(cls): 18 return "pypy3" 19 20 @classmethod 21 def exe_names(cls, interpreter): 22 return super(PyPy3, cls).exe_names(interpreter) | {"pypy"} 23 24 25 class PyPy3Posix(PyPy3, PosixSupports): 26 """PyPy 2 on POSIX""" 27 28 @property 29 def stdlib(self): 30 """PyPy3 respects sysconfig only for the host python, virtual envs is instead lib/pythonx.y/site-packages""" 31 return self.dest / "lib" / "python{}".format(self.interpreter.version_release_str) / "site-packages" 32 33 @classmethod 34 def _shared_libs(cls): 35 return ["libpypy3-c.so", "libpypy3-c.dylib"] 36 37 def to_lib(self, src): 38 return self.dest / "lib" / src.name 39 40 @classmethod 41 def sources(cls, interpreter): 42 for src in super(PyPy3Posix, cls).sources(interpreter): 43 yield src 44 host_lib = Path(interpreter.system_prefix) / "lib" 45 if host_lib.exists() and host_lib.is_dir(): 46 for path in host_lib.iterdir(): 47 yield PathRefToDest(path, dest=cls.to_lib) 48 49 50 class Pypy3Windows(PyPy3, WindowsSupports): 51 """PyPy 2 on Windows""" 52 53 @property 54 def stdlib(self): 55 """PyPy3 respects sysconfig only for the host python, virtual envs is instead Lib/site-packages""" 56 return self.dest / "Lib" / "site-packages" 57 58 @property 59 def bin_dir(self): 60 """PyPy3 needs to fallback to pypy definition""" 61 return self.dest / "Scripts" 62 63 @classmethod 64 def _shared_libs(cls): 65 return ["libpypy3-c.dll", "libffi-7.dll"] 66 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py b/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py
--- a/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py
+++ b/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py
@@ -28,7 +28,7 @@
     @property
     def stdlib(self):
         """PyPy3 respects sysconfig only for the host python, virtual envs is instead lib/pythonx.y/site-packages"""
-        return self.dest / "lib" / "python{}".format(self.interpreter.version_release_str) / "site-packages"
+        return self.dest / "lib" / "pypy{}".format(self.interpreter.version_release_str) / "site-packages"
 
     @classmethod
     def _shared_libs(cls):
@@ -41,9 +41,19 @@
     def sources(cls, interpreter):
         for src in super(PyPy3Posix, cls).sources(interpreter):
            yield src
+        # Also copy/symlink anything under prefix/lib, which, for "portable"
+        # PyPy builds, includes the tk,tcl runtime and a number of shared
+        # objects. In distro-specific builds or on conda this should be empty
+        # (on PyPy3.8+ it will, like on CPython, hold the stdlib).
         host_lib = Path(interpreter.system_prefix) / "lib"
+        stdlib = Path(interpreter.system_stdlib)
         if host_lib.exists() and host_lib.is_dir():
             for path in host_lib.iterdir():
+                if stdlib == path:
+                    # For PyPy3.8+ the stdlib lives in lib/pypy3.8
+                    # We need to avoid creating a symlink to it since that
+                    # will defeat the purpose of a virtualenv
+                    continue
                 yield PathRefToDest(path, dest=cls.to_lib)
{"golden_diff": "diff --git a/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py b/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py\n--- a/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py\n+++ b/src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py\n@@ -28,7 +28,7 @@\n @property\n def stdlib(self):\n \"\"\"PyPy3 respects sysconfig only for the host python, virtual envs is instead lib/pythonx.y/site-packages\"\"\"\n- return self.dest / \"lib\" / \"python{}\".format(self.interpreter.version_release_str) / \"site-packages\"\n+ return self.dest / \"lib\" / \"pypy{}\".format(self.interpreter.version_release_str) / \"site-packages\"\n \n @classmethod\n def _shared_libs(cls):\n@@ -41,9 +41,19 @@\n def sources(cls, interpreter):\n for src in super(PyPy3Posix, cls).sources(interpreter):\n yield src\n+ # Also copy/symlink anything under prefix/lib, which, for \"portable\"\n+ # PyPy builds, includes the tk,tcl runtime and a number of shared\n+ # objects. In distro-specific builds or on conda this should be empty\n+ # (on PyPy3.8+ it will, like on CPython, hold the stdlib).\n host_lib = Path(interpreter.system_prefix) / \"lib\"\n+ stdlib = Path(interpreter.system_stdlib)\n if host_lib.exists() and host_lib.is_dir():\n for path in host_lib.iterdir():\n+ if stdlib == path:\n+ # For PyPy3.8+ the stdlib lives in lib/pypy3.8\n+ # We need to avoid creating a symlink to it since that\n+ # will defeat the purpose of a virtualenv\n+ continue\n yield PathRefToDest(path, dest=cls.to_lib)\n", "issue": "Incorrect (broken) virtualenv layout with pypy3.8's new layout\n**Issue**\r\n\r\nPyPy3.8 (currently 7.3.6rc1) supports a new install layout that resembles CPython more. That is, `sys.prefix` no longer needs being isolated, and site-packages are found in `$prefix/lib/pypy3.8/site-packages`. However, virtualenv tries to symlink everything from `/usr/lib` including the `pypy3.8` directory. 
As a result, the user can't write to the site-packages directory in the venv.\r\n\r\nI haven't tried running it as root but I can imagine it doing major mess if it virtualenv doesn't take any precautions from writing into system directories.\r\n\r\n**Environment**\r\n\r\nProvide at least:\r\n- OS: Gentoo Linux\r\n- ``pip list`` of the host python where ``virtualenv`` is installed: [pip-list.txt](https://github.com/pypa/virtualenv/files/7167321/pip-list.txt)\r\n\r\n\r\n**Output of the virtual environment creation**\r\n\r\nMake sure to run the creation with `-vvv --with-traceback`:\r\n\r\nFull output: [output.txt](https://github.com/pypa/virtualenv/files/7167331/output.txt)\r\n\r\ntail:\r\n```\r\n1048 create virtualenv import hook file /tmp/z/lib/pypy3.8/site-packages/_virtualenv.pth [DEBUG api:95]\r\nTraceback (most recent call last):\r\n File \"/usr/lib/pypy3.8/runpy.py\", line 198, in _run_module_as_main\r\n \"__main__\", mod_spec)\r\n File \"/usr/lib/pypy3.8/runpy.py\", line 87, in _run_code\r\n exec(code, run_globals)\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py\", line 80, in <module>\r\n run_with_catch() # pragma: no cov\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py\", line 65, in run_with_catch\r\n run(args, options, env)\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/__main__.py\", line 18, in run\r\n session = cli_run(args, options, env)\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/run/__init__.py\", line 32, in cli_run\r\n of_session.run()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/run/session.py\", line 46, in run\r\n self._create()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/run/session.py\", line 53, in _create\r\n self.creator.run()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/create/creator.py\", line 171, in run\r\n self.create()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/builtin/via_global_self_do.py\", line 101, in create\r\n super(ViaGlobalRefVirtualenvBuiltin, self).create()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/api.py\", line 89, in create\r\n self.install_patch()\r\n File \"/usr/lib/pypy3.8/site-packages/virtualenv/create/via_global_ref/api.py\", line 96, in install_patch\r\n pth.write_text(\"import _virtualenv\")\r\n File \"/usr/lib/pypy3.8/pathlib.py\", line 1255, in write_text\r\n with self.open(mode='w', encoding=encoding, errors=errors) as f:\r\n File \"/usr/lib/pypy3.8/pathlib.py\", line 1223, in open\r\n opener=self._opener)\r\n File \"/usr/lib/pypy3.8/pathlib.py\", line 1078, in _opener\r\n return self._accessor.open(self, flags, mode)\r\nPermissionError: [Errno 13] Permission denied: PosixPath('/tmp/z/lib/pypy3.8/site-packages/_virtualenv.pth')\r\n```\n", "before_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nimport abc\n\nfrom six import add_metaclass\n\nfrom virtualenv.create.describe import PosixSupports, Python3Supports, WindowsSupports\nfrom virtualenv.create.via_global_ref.builtin.ref import PathRefToDest\nfrom virtualenv.util.path import Path\n\nfrom .common import PyPy\n\n\n@add_metaclass(abc.ABCMeta)\nclass PyPy3(PyPy, Python3Supports):\n @classmethod\n def exe_stem(cls):\n return \"pypy3\"\n\n @classmethod\n def exe_names(cls, interpreter):\n return super(PyPy3, cls).exe_names(interpreter) | {\"pypy\"}\n\n\nclass PyPy3Posix(PyPy3, PosixSupports):\n \"\"\"PyPy 2 on POSIX\"\"\"\n\n @property\n def stdlib(self):\n \"\"\"PyPy3 respects sysconfig only for the host python, 
virtual envs is instead lib/pythonx.y/site-packages\"\"\"\n return self.dest / \"lib\" / \"python{}\".format(self.interpreter.version_release_str) / \"site-packages\"\n\n @classmethod\n def _shared_libs(cls):\n return [\"libpypy3-c.so\", \"libpypy3-c.dylib\"]\n\n def to_lib(self, src):\n return self.dest / \"lib\" / src.name\n\n @classmethod\n def sources(cls, interpreter):\n for src in super(PyPy3Posix, cls).sources(interpreter):\n yield src\n host_lib = Path(interpreter.system_prefix) / \"lib\"\n if host_lib.exists() and host_lib.is_dir():\n for path in host_lib.iterdir():\n yield PathRefToDest(path, dest=cls.to_lib)\n\n\nclass Pypy3Windows(PyPy3, WindowsSupports):\n \"\"\"PyPy 2 on Windows\"\"\"\n\n @property\n def stdlib(self):\n \"\"\"PyPy3 respects sysconfig only for the host python, virtual envs is instead Lib/site-packages\"\"\"\n return self.dest / \"Lib\" / \"site-packages\"\n\n @property\n def bin_dir(self):\n \"\"\"PyPy3 needs to fallback to pypy definition\"\"\"\n return self.dest / \"Scripts\"\n\n @classmethod\n def _shared_libs(cls):\n return [\"libpypy3-c.dll\", \"libffi-7.dll\"]\n", "path": "src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py"}], "after_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nimport abc\n\nfrom six import add_metaclass\n\nfrom virtualenv.create.describe import PosixSupports, Python3Supports, WindowsSupports\nfrom virtualenv.create.via_global_ref.builtin.ref import PathRefToDest\nfrom virtualenv.util.path import Path\n\nfrom .common import PyPy\n\n\n@add_metaclass(abc.ABCMeta)\nclass PyPy3(PyPy, Python3Supports):\n @classmethod\n def exe_stem(cls):\n return \"pypy3\"\n\n @classmethod\n def exe_names(cls, interpreter):\n return super(PyPy3, cls).exe_names(interpreter) | {\"pypy\"}\n\n\nclass PyPy3Posix(PyPy3, PosixSupports):\n \"\"\"PyPy 2 on POSIX\"\"\"\n\n @property\n def stdlib(self):\n \"\"\"PyPy3 respects sysconfig only for the host python, virtual envs is instead lib/pythonx.y/site-packages\"\"\"\n return self.dest / \"lib\" / \"pypy{}\".format(self.interpreter.version_release_str) / \"site-packages\"\n\n @classmethod\n def _shared_libs(cls):\n return [\"libpypy3-c.so\", \"libpypy3-c.dylib\"]\n\n def to_lib(self, src):\n return self.dest / \"lib\" / src.name\n\n @classmethod\n def sources(cls, interpreter):\n for src in super(PyPy3Posix, cls).sources(interpreter):\n yield src\n # Also copy/symlink anything under prefix/lib, which, for \"portable\"\n # PyPy builds, includes the tk,tcl runtime and a number of shared\n # objects. 
In distro-specific builds or on conda this should be empty\n # (on PyPy3.8+ it will, like on CPython, hold the stdlib).\n host_lib = Path(interpreter.system_prefix) / \"lib\"\n stdlib = Path(interpreter.system_stdlib)\n if host_lib.exists() and host_lib.is_dir():\n for path in host_lib.iterdir():\n if stdlib == path:\n # For PyPy3.8+ the stdlib lives in lib/pypy3.8\n # We need to avoid creating a symlink to it since that\n # will defeat the purpose of a virtualenv\n continue\n yield PathRefToDest(path, dest=cls.to_lib)\n\n\nclass Pypy3Windows(PyPy3, WindowsSupports):\n \"\"\"PyPy 2 on Windows\"\"\"\n\n @property\n def stdlib(self):\n \"\"\"PyPy3 respects sysconfig only for the host python, virtual envs is instead Lib/site-packages\"\"\"\n return self.dest / \"Lib\" / \"site-packages\"\n\n @property\n def bin_dir(self):\n \"\"\"PyPy3 needs to fallback to pypy definition\"\"\"\n return self.dest / \"Scripts\"\n\n @classmethod\n def _shared_libs(cls):\n return [\"libpypy3-c.dll\", \"libffi-7.dll\"]\n", "path": "src/virtualenv/create/via_global_ref/builtin/pypy/pypy3.py"}]}
1811
439
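The core of the virtualenv patch above is a guard that keeps the creator from symlinking the host stdlib directory into the venv. A standalone sketch of just that guard; the two paths are hypothetical stand-ins for what the real creator reads from `interpreter.system_prefix` and `interpreter.system_stdlib`:

```python
from pathlib import Path

HOST_LIB = Path("/usr/lib")        # stand-in for interpreter.system_prefix / "lib"
STDLIB = Path("/usr/lib/pypy3.8")  # stand-in for interpreter.system_stdlib


def lib_entries_to_mirror(host_lib: Path = HOST_LIB, stdlib: Path = STDLIB):
    """Yield prefix/lib entries to mirror, skipping the stdlib directory itself."""
    if host_lib.exists() and host_lib.is_dir():
        for path in host_lib.iterdir():
            if path == stdlib:
                # Symlinking lib/pypy3.8 would make the venv's site-packages
                # resolve back into the read-only system tree -- exactly the
                # PermissionError shown in the issue's traceback.
                continue
            yield path
```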
gh_patches_debug_65703
rasdani/github-patches
git_diff
carpentries__amy-1793
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Bug: assignment form queryset may return duplicate results Introduced in v2.16, AssignmentForm contains a queryset that may yield duplicate results due to the filtering used. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `amy/dashboard/forms.py` Content: ``` 1 from django import forms 2 from django.core.exceptions import ValidationError 3 from django.db.models import Q 4 from django_countries.fields import CountryField 5 6 from workshops.models import ( 7 Language, 8 GenderMixin, 9 Person, 10 TrainingProgress, 11 TrainingRequirement, 12 ) 13 14 from workshops.forms import BootstrapHelper 15 # this is used instead of Django Autocomplete Light widgets 16 # see issue #1330: https://github.com/swcarpentry/amy/issues/1330 17 from workshops.fields import ( 18 Select2Widget, 19 ModelSelect2MultipleWidget, 20 RadioSelectWithOther, 21 ) 22 23 24 class AssignmentForm(forms.Form): 25 assigned_to = forms.ModelChoiceField( 26 label="Assigned to:", 27 required=False, 28 queryset=Person.objects.filter( 29 Q(is_superuser=True) | Q(groups__name="administrators") 30 ), 31 widget=Select2Widget(), 32 ) 33 helper = BootstrapHelper( 34 add_submit_button=False, 35 add_cancel_button=False, 36 wider_labels=True, 37 use_get_method=True, 38 form_id="assignment-form" 39 ) 40 41 42 class AutoUpdateProfileForm(forms.ModelForm): 43 username = forms.CharField(disabled=True, required=False) 44 email = forms.CharField( 45 disabled=True, required=False, 46 label=Person._meta.get_field('email').verbose_name, 47 help_text=Person._meta.get_field('email').help_text, 48 ) 49 github = forms.CharField( 50 disabled=True, required=False, 51 help_text='If you want to change your github username, please email ' 52 'us at <a href="mailto:[email protected]">' 53 '[email protected]</a>.') 54 55 country = CountryField().formfield( 56 required=False, 57 help_text='Your country of residence.', 58 widget=Select2Widget, 59 ) 60 61 languages = forms.ModelMultipleChoiceField( 62 label='Languages', 63 required=False, 64 queryset=Language.objects.all(), 65 widget=ModelSelect2MultipleWidget(data_view='language-lookup') 66 ) 67 68 helper = BootstrapHelper(add_cancel_button=False) 69 70 class Meta: 71 model = Person 72 fields = [ 73 'personal', 74 'middle', 75 'family', 76 'email', 77 'secondary_email', 78 'gender', 79 'gender_other', 80 'may_contact', 81 'publish_profile', 82 'lesson_publication_consent', 83 'country', 84 'airport', 85 'github', 86 'twitter', 87 'url', 88 'username', 89 'affiliation', 90 'domains', 91 'lessons', 92 'languages', 93 'occupation', 94 'orcid', 95 ] 96 readonly_fields = ( 97 'username', 98 'github', 99 ) 100 widgets = { 101 'gender': RadioSelectWithOther('gender_other'), 102 'domains': forms.CheckboxSelectMultiple(), 103 'lessons': forms.CheckboxSelectMultiple(), 104 'airport': Select2Widget, 105 } 106 107 def __init__(self, *args, **kwargs): 108 super().__init__(*args, **kwargs) 109 110 # set up a layout object for the helper 111 self.helper.layout = self.helper.build_default_layout(self) 112 113 # set up `*WithOther` widgets so that they can display additional 114 # fields inline 115 self['gender'].field.widget.other_field = self['gender_other'] 116 117 # remove additional fields 118 self.helper.layout.fields.remove('gender_other') 119 120 def clean(self): 121 super().clean() 122 errors = dict() 123 124 # 1: 
require "other gender" field if "other" was selected in 125 # "gender" field 126 gender = self.cleaned_data.get('gender', '') 127 gender_other = self.cleaned_data.get('gender_other', '') 128 if gender == GenderMixin.OTHER and not gender_other: 129 errors['gender'] = ValidationError("This field is required.") 130 elif gender != GenderMixin.OTHER and gender_other: 131 errors['gender'] = ValidationError( 132 'If you entered data in "Other" field, please select that ' 133 "option.") 134 135 # raise errors if any present 136 if errors: 137 raise ValidationError(errors) 138 139 140 class SendHomeworkForm(forms.ModelForm): 141 url = forms.URLField(label='URL') 142 requirement = forms.ModelChoiceField( 143 queryset=TrainingRequirement.objects.filter(name__endswith="Homework"), 144 label="Type", required=True, 145 ) 146 147 helper = BootstrapHelper(add_cancel_button=False) 148 149 class Meta: 150 model = TrainingProgress 151 fields = [ 152 'requirement', 153 'url', 154 ] 155 156 157 class SearchForm(forms.Form): 158 """Represent general searching form.""" 159 160 term = forms.CharField(label="Term", max_length=100) 161 no_redirect = forms.BooleanField(required=False, initial=False, widget=forms.HiddenInput) 162 helper = BootstrapHelper(add_cancel_button=False, use_get_method=True) 163 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/amy/dashboard/forms.py b/amy/dashboard/forms.py
--- a/amy/dashboard/forms.py
+++ b/amy/dashboard/forms.py
@@ -27,7 +27,7 @@
         required=False,
         queryset=Person.objects.filter(
             Q(is_superuser=True) | Q(groups__name="administrators")
-        ),
+        ).distinct(),
         widget=Select2Widget(),
     )
     helper = BootstrapHelper(
{"golden_diff": "diff --git a/amy/dashboard/forms.py b/amy/dashboard/forms.py\n--- a/amy/dashboard/forms.py\n+++ b/amy/dashboard/forms.py\n@@ -27,7 +27,7 @@\n required=False,\n queryset=Person.objects.filter(\n Q(is_superuser=True) | Q(groups__name=\"administrators\")\n- ),\n+ ).distinct(),\n widget=Select2Widget(),\n )\n helper = BootstrapHelper(\n", "issue": "Bug: assignment form queryset may return duplicate results\nIntroduced in v2.16, AssignmentForm contains a queryset that may yield duplicate results due to the filtering used.\n", "before_files": [{"content": "from django import forms\nfrom django.core.exceptions import ValidationError\nfrom django.db.models import Q\nfrom django_countries.fields import CountryField\n\nfrom workshops.models import (\n Language,\n GenderMixin,\n Person,\n TrainingProgress,\n TrainingRequirement,\n)\n\nfrom workshops.forms import BootstrapHelper\n# this is used instead of Django Autocomplete Light widgets\n# see issue #1330: https://github.com/swcarpentry/amy/issues/1330\nfrom workshops.fields import (\n Select2Widget,\n ModelSelect2MultipleWidget,\n RadioSelectWithOther,\n)\n\n\nclass AssignmentForm(forms.Form):\n assigned_to = forms.ModelChoiceField(\n label=\"Assigned to:\",\n required=False,\n queryset=Person.objects.filter(\n Q(is_superuser=True) | Q(groups__name=\"administrators\")\n ),\n widget=Select2Widget(),\n )\n helper = BootstrapHelper(\n add_submit_button=False,\n add_cancel_button=False,\n wider_labels=True,\n use_get_method=True,\n form_id=\"assignment-form\"\n )\n\n\nclass AutoUpdateProfileForm(forms.ModelForm):\n username = forms.CharField(disabled=True, required=False)\n email = forms.CharField(\n disabled=True, required=False,\n label=Person._meta.get_field('email').verbose_name,\n help_text=Person._meta.get_field('email').help_text,\n )\n github = forms.CharField(\n disabled=True, required=False,\n help_text='If you want to change your github username, please email '\n 'us at <a href=\"mailto:[email protected]\">'\n '[email protected]</a>.')\n\n country = CountryField().formfield(\n required=False,\n help_text='Your country of residence.',\n widget=Select2Widget,\n )\n\n languages = forms.ModelMultipleChoiceField(\n label='Languages',\n required=False,\n queryset=Language.objects.all(),\n widget=ModelSelect2MultipleWidget(data_view='language-lookup')\n )\n\n helper = BootstrapHelper(add_cancel_button=False)\n\n class Meta:\n model = Person\n fields = [\n 'personal',\n 'middle',\n 'family',\n 'email',\n 'secondary_email',\n 'gender',\n 'gender_other',\n 'may_contact',\n 'publish_profile',\n 'lesson_publication_consent',\n 'country',\n 'airport',\n 'github',\n 'twitter',\n 'url',\n 'username',\n 'affiliation',\n 'domains',\n 'lessons',\n 'languages',\n 'occupation',\n 'orcid',\n ]\n readonly_fields = (\n 'username',\n 'github',\n )\n widgets = {\n 'gender': RadioSelectWithOther('gender_other'),\n 'domains': forms.CheckboxSelectMultiple(),\n 'lessons': forms.CheckboxSelectMultiple(),\n 'airport': Select2Widget,\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n # set up a layout object for the helper\n self.helper.layout = self.helper.build_default_layout(self)\n\n # set up `*WithOther` widgets so that they can display additional\n # fields inline\n self['gender'].field.widget.other_field = self['gender_other']\n\n # remove additional fields\n self.helper.layout.fields.remove('gender_other')\n\n def clean(self):\n super().clean()\n errors = dict()\n\n # 1: require \"other gender\" field if \"other\" 
was selected in\n # \"gender\" field\n gender = self.cleaned_data.get('gender', '')\n gender_other = self.cleaned_data.get('gender_other', '')\n if gender == GenderMixin.OTHER and not gender_other:\n errors['gender'] = ValidationError(\"This field is required.\")\n elif gender != GenderMixin.OTHER and gender_other:\n errors['gender'] = ValidationError(\n 'If you entered data in \"Other\" field, please select that '\n \"option.\")\n\n # raise errors if any present\n if errors:\n raise ValidationError(errors)\n\n\nclass SendHomeworkForm(forms.ModelForm):\n url = forms.URLField(label='URL')\n requirement = forms.ModelChoiceField(\n queryset=TrainingRequirement.objects.filter(name__endswith=\"Homework\"),\n label=\"Type\", required=True,\n )\n\n helper = BootstrapHelper(add_cancel_button=False)\n\n class Meta:\n model = TrainingProgress\n fields = [\n 'requirement',\n 'url',\n ]\n\n\nclass SearchForm(forms.Form):\n \"\"\"Represent general searching form.\"\"\"\n\n term = forms.CharField(label=\"Term\", max_length=100)\n no_redirect = forms.BooleanField(required=False, initial=False, widget=forms.HiddenInput)\n helper = BootstrapHelper(add_cancel_button=False, use_get_method=True)\n", "path": "amy/dashboard/forms.py"}], "after_files": [{"content": "from django import forms\nfrom django.core.exceptions import ValidationError\nfrom django.db.models import Q\nfrom django_countries.fields import CountryField\n\nfrom workshops.models import (\n Language,\n GenderMixin,\n Person,\n TrainingProgress,\n TrainingRequirement,\n)\n\nfrom workshops.forms import BootstrapHelper\n# this is used instead of Django Autocomplete Light widgets\n# see issue #1330: https://github.com/swcarpentry/amy/issues/1330\nfrom workshops.fields import (\n Select2Widget,\n ModelSelect2MultipleWidget,\n RadioSelectWithOther,\n)\n\n\nclass AssignmentForm(forms.Form):\n assigned_to = forms.ModelChoiceField(\n label=\"Assigned to:\",\n required=False,\n queryset=Person.objects.filter(\n Q(is_superuser=True) | Q(groups__name=\"administrators\")\n ).distinct(),\n widget=Select2Widget(),\n )\n helper = BootstrapHelper(\n add_submit_button=False,\n add_cancel_button=False,\n wider_labels=True,\n use_get_method=True,\n form_id=\"assignment-form\"\n )\n\n\nclass AutoUpdateProfileForm(forms.ModelForm):\n username = forms.CharField(disabled=True, required=False)\n email = forms.CharField(\n disabled=True, required=False,\n label=Person._meta.get_field('email').verbose_name,\n help_text=Person._meta.get_field('email').help_text,\n )\n github = forms.CharField(\n disabled=True, required=False,\n help_text='If you want to change your github username, please email '\n 'us at <a href=\"mailto:[email protected]\">'\n '[email protected]</a>.')\n\n country = CountryField().formfield(\n required=False,\n help_text='Your country of residence.',\n widget=Select2Widget,\n )\n\n languages = forms.ModelMultipleChoiceField(\n label='Languages',\n required=False,\n queryset=Language.objects.all(),\n widget=ModelSelect2MultipleWidget(data_view='language-lookup')\n )\n\n helper = BootstrapHelper(add_cancel_button=False)\n\n class Meta:\n model = Person\n fields = [\n 'personal',\n 'middle',\n 'family',\n 'email',\n 'secondary_email',\n 'gender',\n 'gender_other',\n 'may_contact',\n 'publish_profile',\n 'lesson_publication_consent',\n 'country',\n 'airport',\n 'github',\n 'twitter',\n 'url',\n 'username',\n 'affiliation',\n 'domains',\n 'lessons',\n 'languages',\n 'occupation',\n 'orcid',\n ]\n readonly_fields = (\n 'username',\n 'github',\n )\n widgets = 
{\n 'gender': RadioSelectWithOther('gender_other'),\n 'domains': forms.CheckboxSelectMultiple(),\n 'lessons': forms.CheckboxSelectMultiple(),\n 'airport': Select2Widget,\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n # set up a layout object for the helper\n self.helper.layout = self.helper.build_default_layout(self)\n\n # set up `*WithOther` widgets so that they can display additional\n # fields inline\n self['gender'].field.widget.other_field = self['gender_other']\n\n # remove additional fields\n self.helper.layout.fields.remove('gender_other')\n\n def clean(self):\n super().clean()\n errors = dict()\n\n # 1: require \"other gender\" field if \"other\" was selected in\n # \"gender\" field\n gender = self.cleaned_data.get('gender', '')\n gender_other = self.cleaned_data.get('gender_other', '')\n if gender == GenderMixin.OTHER and not gender_other:\n errors['gender'] = ValidationError(\"This field is required.\")\n elif gender != GenderMixin.OTHER and gender_other:\n errors['gender'] = ValidationError(\n 'If you entered data in \"Other\" field, please select that '\n \"option.\")\n\n # raise errors if any present\n if errors:\n raise ValidationError(errors)\n\n\nclass SendHomeworkForm(forms.ModelForm):\n url = forms.URLField(label='URL')\n requirement = forms.ModelChoiceField(\n queryset=TrainingRequirement.objects.filter(name__endswith=\"Homework\"),\n label=\"Type\", required=True,\n )\n\n helper = BootstrapHelper(add_cancel_button=False)\n\n class Meta:\n model = TrainingProgress\n fields = [\n 'requirement',\n 'url',\n ]\n\n\nclass SearchForm(forms.Form):\n \"\"\"Represent general searching form.\"\"\"\n\n term = forms.CharField(label=\"Term\", max_length=100)\n no_redirect = forms.BooleanField(required=False, initial=False, widget=forms.HiddenInput)\n helper = BootstrapHelper(add_cancel_button=False, use_get_method=True)\n", "path": "amy/dashboard/forms.py"}]}
1686
93
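The small-looking fix above (`.distinct()`) addresses a classic ORM subtlety: filtering across the `groups` many-to-many relation produces a SQL join, so a person who satisfies more than one matching row — e.g. a superuser who is also in the "administrators" group — comes back once per joined row. A sketch of the pattern, assuming the `Person` model imported in the record's `forms.py`:

```python
from django.db.models import Q
from workshops.models import Person  # as imported in the record's forms.py

# Without .distinct(), the OR across the groups join can yield the same
# Person several times, which then shows up as duplicate <option>s in
# the ModelChoiceField built from this queryset.
queryset = Person.objects.filter(
    Q(is_superuser=True) | Q(groups__name="administrators")
).distinct()  # collapse the join duplicates
```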
gh_patches_debug_201
rasdani/github-patches
git_diff
blaze__blaze-475
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Make blaze.test() return True or False @asmeurer suggests this. Currently we're passing through pytest.main() which is like the error code from command line programs. <!--- @huboard:{"order":398.859375,"milestone_order":452,"custom_state":""} --> --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `blaze/__init__.py` Content: ``` 1 from __future__ import absolute_import, division, print_function 2 3 import logging 4 5 from dynd import nd 6 from pandas import DataFrame 7 import h5py 8 9 from multipledispatch import halt_ordering, restart_ordering 10 11 halt_ordering() # Turn off multipledispatch ordering 12 13 from .expr import * 14 from .expr.functions import * 15 from .api import * 16 from .data.csv import * 17 from .data.json import * 18 from .data.hdf5 import * 19 from .compute.python import * 20 from .data.meta import * 21 from .compute.pandas import * 22 from .compute.numpy import * 23 from .compute.core import * 24 from .compute.core import compute 25 from .sql import * 26 27 try: 28 from .spark import * 29 except ImportError: 30 pass 31 try: 32 from .compute.pytables import * 33 except ImportError: 34 pass 35 try: 36 from .compute.chunks import * 37 except ImportError: 38 pass 39 try: 40 from .bcolz import * 41 except ImportError: 42 pass 43 try: 44 from .mongo import * 45 except ImportError: 46 pass 47 48 restart_ordering() # Restart multipledispatch ordering and do ordering 49 50 logging.basicConfig() 51 logger = logging.getLogger(__name__) 52 logger.setLevel(logging.WARNING) 53 54 55 inf = float('inf') 56 nan = float('nan') 57 58 __version__ = '0.6.1' 59 60 # If IPython is already loaded, register the Blaze catalog magic 61 # from . import catalog 62 # import sys 63 # if 'IPython' in sys.modules: 64 # catalog.register_ipy_magic() 65 # del sys 66 67 def print_versions(): 68 """Print all the versions of software that Blaze relies on.""" 69 import sys, platform 70 import numpy as np 71 import dynd 72 import datashape 73 print("-=" * 38) 74 print("Blaze version: %s" % __version__) 75 print("Datashape version: %s" % datashape.__version__) 76 print("NumPy version: %s" % np.__version__) 77 print("DyND version: %s / LibDyND %s" % 78 (dynd.__version__, dynd.__libdynd_version__)) 79 print("Python version: %s" % sys.version) 80 (sysname, nodename, release, version, machine, processor) = \ 81 platform.uname() 82 print("Platform: %s-%s-%s (%s)" % (sysname, release, machine, version)) 83 if sysname == "Linux": 84 print("Linux dist: %s" % " ".join(platform.linux_distribution()[:-1])) 85 if not processor: 86 processor = "not recognized" 87 print("Processor: %s" % processor) 88 print("Byte-ordering: %s" % sys.byteorder) 89 print("-=" * 38) 90 91 92 def test(verbose=False, junitfile=None, exit=False): 93 """ 94 Runs the full Blaze test suite, outputting 95 the results of the tests to sys.stdout. 96 97 This uses py.test to discover which tests to 98 run, and runs tests in any 'tests' subdirectory 99 within the Blaze module. 100 101 Parameters 102 ---------- 103 verbose : int, optional 104 Value 0 prints very little, 1 prints a little bit, 105 and 2 prints the test names while testing. 106 junitfile : string, optional 107 If provided, writes the test results to an junit xml 108 style xml file. This is useful for running the tests 109 in a CI server such as Jenkins. 
110 exit : bool, optional 111 If True, the function will call sys.exit with an 112 error code after the tests are finished. 113 """ 114 import os 115 import sys 116 import pytest 117 118 args = [] 119 120 if verbose: 121 args.append('--verbose') 122 123 # Output an xunit file if requested 124 if junitfile is not None: 125 args.append('--junit-xml=%s' % junitfile) 126 127 # Add all 'tests' subdirectories to the options 128 rootdir = os.path.dirname(__file__) 129 for root, dirs, files in os.walk(rootdir): 130 if 'tests' in dirs: 131 testsdir = os.path.join(root, 'tests') 132 args.append(testsdir) 133 print('Test dir: %s' % testsdir[len(rootdir) + 1:]) 134 # print versions (handy when reporting problems) 135 print_versions() 136 sys.stdout.flush() 137 138 # Ask pytest to do its thing 139 error_code = pytest.main(args=args) 140 if exit: 141 return sys.exit(error_code) 142 return error_code 143 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/blaze/__init__.py b/blaze/__init__.py
--- a/blaze/__init__.py
+++ b/blaze/__init__.py
@@ -139,4 +139,4 @@
     error_code = pytest.main(args=args)
     if exit:
         return sys.exit(error_code)
-    return error_code
+    return error_code == 0
{"golden_diff": "diff --git a/blaze/__init__.py b/blaze/__init__.py\n--- a/blaze/__init__.py\n+++ b/blaze/__init__.py\n@@ -139,4 +139,4 @@\n error_code = pytest.main(args=args)\n if exit:\n return sys.exit(error_code)\n- return error_code\n+ return error_code == 0\n", "issue": "Make blaze.test() return True or False\n@asmeurer suggests this. Currently we're passing through pytest.main() which is like the error code from command line programs.\n\n<!---\n@huboard:{\"order\":398.859375,\"milestone_order\":452,\"custom_state\":\"\"}\n-->\n\n", "before_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport logging\n\nfrom dynd import nd\nfrom pandas import DataFrame\nimport h5py\n\nfrom multipledispatch import halt_ordering, restart_ordering\n\nhalt_ordering() # Turn off multipledispatch ordering\n\nfrom .expr import *\nfrom .expr.functions import *\nfrom .api import *\nfrom .data.csv import *\nfrom .data.json import *\nfrom .data.hdf5 import *\nfrom .compute.python import *\nfrom .data.meta import *\nfrom .compute.pandas import *\nfrom .compute.numpy import *\nfrom .compute.core import *\nfrom .compute.core import compute\nfrom .sql import *\n\ntry:\n from .spark import *\nexcept ImportError:\n pass\ntry:\n from .compute.pytables import *\nexcept ImportError:\n pass\ntry:\n from .compute.chunks import *\nexcept ImportError:\n pass\ntry:\n from .bcolz import *\nexcept ImportError:\n pass\ntry:\n from .mongo import *\nexcept ImportError:\n pass\n\nrestart_ordering() # Restart multipledispatch ordering and do ordering\n\nlogging.basicConfig()\nlogger = logging.getLogger(__name__)\nlogger.setLevel(logging.WARNING)\n\n\ninf = float('inf')\nnan = float('nan')\n\n__version__ = '0.6.1'\n\n# If IPython is already loaded, register the Blaze catalog magic\n# from . import catalog\n# import sys\n# if 'IPython' in sys.modules:\n# catalog.register_ipy_magic()\n# del sys\n\ndef print_versions():\n \"\"\"Print all the versions of software that Blaze relies on.\"\"\"\n import sys, platform\n import numpy as np\n import dynd\n import datashape\n print(\"-=\" * 38)\n print(\"Blaze version: %s\" % __version__)\n print(\"Datashape version: %s\" % datashape.__version__)\n print(\"NumPy version: %s\" % np.__version__)\n print(\"DyND version: %s / LibDyND %s\" %\n (dynd.__version__, dynd.__libdynd_version__))\n print(\"Python version: %s\" % sys.version)\n (sysname, nodename, release, version, machine, processor) = \\\n platform.uname()\n print(\"Platform: %s-%s-%s (%s)\" % (sysname, release, machine, version))\n if sysname == \"Linux\":\n print(\"Linux dist: %s\" % \" \".join(platform.linux_distribution()[:-1]))\n if not processor:\n processor = \"not recognized\"\n print(\"Processor: %s\" % processor)\n print(\"Byte-ordering: %s\" % sys.byteorder)\n print(\"-=\" * 38)\n\n\ndef test(verbose=False, junitfile=None, exit=False):\n \"\"\"\n Runs the full Blaze test suite, outputting\n the results of the tests to sys.stdout.\n\n This uses py.test to discover which tests to\n run, and runs tests in any 'tests' subdirectory\n within the Blaze module.\n\n Parameters\n ----------\n verbose : int, optional\n Value 0 prints very little, 1 prints a little bit,\n and 2 prints the test names while testing.\n junitfile : string, optional\n If provided, writes the test results to an junit xml\n style xml file. 
This is useful for running the tests\n in a CI server such as Jenkins.\n exit : bool, optional\n If True, the function will call sys.exit with an\n error code after the tests are finished.\n \"\"\"\n import os\n import sys\n import pytest\n\n args = []\n\n if verbose:\n args.append('--verbose')\n\n # Output an xunit file if requested\n if junitfile is not None:\n args.append('--junit-xml=%s' % junitfile)\n\n # Add all 'tests' subdirectories to the options\n rootdir = os.path.dirname(__file__)\n for root, dirs, files in os.walk(rootdir):\n if 'tests' in dirs:\n testsdir = os.path.join(root, 'tests')\n args.append(testsdir)\n print('Test dir: %s' % testsdir[len(rootdir) + 1:])\n # print versions (handy when reporting problems)\n print_versions()\n sys.stdout.flush()\n\n # Ask pytest to do its thing\n error_code = pytest.main(args=args)\n if exit:\n return sys.exit(error_code)\n return error_code\n", "path": "blaze/__init__.py"}], "after_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport logging\n\nfrom dynd import nd\nfrom pandas import DataFrame\nimport h5py\n\nfrom multipledispatch import halt_ordering, restart_ordering\n\nhalt_ordering() # Turn off multipledispatch ordering\n\nfrom .expr import *\nfrom .expr.functions import *\nfrom .api import *\nfrom .data.csv import *\nfrom .data.json import *\nfrom .data.hdf5 import *\nfrom .compute.python import *\nfrom .data.meta import *\nfrom .compute.pandas import *\nfrom .compute.numpy import *\nfrom .compute.core import *\nfrom .compute.core import compute\nfrom .sql import *\n\ntry:\n from .spark import *\nexcept ImportError:\n pass\ntry:\n from .compute.pytables import *\nexcept ImportError:\n pass\ntry:\n from .compute.chunks import *\nexcept ImportError:\n pass\ntry:\n from .bcolz import *\nexcept ImportError:\n pass\ntry:\n from .mongo import *\nexcept ImportError:\n pass\n\nrestart_ordering() # Restart multipledispatch ordering and do ordering\n\nlogging.basicConfig()\nlogger = logging.getLogger(__name__)\nlogger.setLevel(logging.WARNING)\n\n\ninf = float('inf')\nnan = float('nan')\n\n__version__ = '0.6.1'\n\n# If IPython is already loaded, register the Blaze catalog magic\n# from . 
import catalog\n# import sys\n# if 'IPython' in sys.modules:\n# catalog.register_ipy_magic()\n# del sys\n\ndef print_versions():\n \"\"\"Print all the versions of software that Blaze relies on.\"\"\"\n import sys, platform\n import numpy as np\n import dynd\n import datashape\n print(\"-=\" * 38)\n print(\"Blaze version: %s\" % __version__)\n print(\"Datashape version: %s\" % datashape.__version__)\n print(\"NumPy version: %s\" % np.__version__)\n print(\"DyND version: %s / LibDyND %s\" %\n (dynd.__version__, dynd.__libdynd_version__))\n print(\"Python version: %s\" % sys.version)\n (sysname, nodename, release, version, machine, processor) = \\\n platform.uname()\n print(\"Platform: %s-%s-%s (%s)\" % (sysname, release, machine, version))\n if sysname == \"Linux\":\n print(\"Linux dist: %s\" % \" \".join(platform.linux_distribution()[:-1]))\n if not processor:\n processor = \"not recognized\"\n print(\"Processor: %s\" % processor)\n print(\"Byte-ordering: %s\" % sys.byteorder)\n print(\"-=\" * 38)\n\n\ndef test(verbose=False, junitfile=None, exit=False):\n \"\"\"\n Runs the full Blaze test suite, outputting\n the results of the tests to sys.stdout.\n\n This uses py.test to discover which tests to\n run, and runs tests in any 'tests' subdirectory\n within the Blaze module.\n\n Parameters\n ----------\n verbose : int, optional\n Value 0 prints very little, 1 prints a little bit,\n and 2 prints the test names while testing.\n junitfile : string, optional\n If provided, writes the test results to an junit xml\n style xml file. This is useful for running the tests\n in a CI server such as Jenkins.\n exit : bool, optional\n If True, the function will call sys.exit with an\n error code after the tests are finished.\n \"\"\"\n import os\n import sys\n import pytest\n\n args = []\n\n if verbose:\n args.append('--verbose')\n\n # Output an xunit file if requested\n if junitfile is not None:\n args.append('--junit-xml=%s' % junitfile)\n\n # Add all 'tests' subdirectories to the options\n rootdir = os.path.dirname(__file__)\n for root, dirs, files in os.walk(rootdir):\n if 'tests' in dirs:\n testsdir = os.path.join(root, 'tests')\n args.append(testsdir)\n print('Test dir: %s' % testsdir[len(rootdir) + 1:])\n # print versions (handy when reporting problems)\n print_versions()\n sys.stdout.flush()\n\n # Ask pytest to do its thing\n error_code = pytest.main(args=args)\n if exit:\n return sys.exit(error_code)\n return error_code == 0\n", "path": "blaze/__init__.py"}]}
1626
85
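The blaze patch leans on the convention that `pytest.main()` returns a process-style exit status, where `0` means the whole suite passed; comparing against `0` is what turns that status into the boolean the issue asks for. A minimal illustration (the `tests/` path is a placeholder):

```python
import pytest

# pytest.main() returns an exit status: 0 on success, non-zero otherwise.
error_code = pytest.main(args=["tests/"])
all_passed = error_code == 0
print(all_passed)  # True only when every collected test passed
```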
gh_patches_debug_30662
rasdani/github-patches
git_diff
goauthentik__authentik-4428
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- System Tasks: Show start timestamp and calculate Duration **Is your feature request related to a problem? Please describe.** For debugging purposes, I need the info when a task started, and when it finished. **Describe the solution you'd like** I have seen that the TaskInfo object actually holds that information, but it is not returned by the API, and not shown in the "SystemTasks" table of the web UI. It would also make sense to calculate the duration for easier debugging. **Describe alternatives you've considered** I could look this up in the database, but this would be questionable UX, since there is already a view in the web app which should show this information. **Additional context** (none) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `authentik/admin/api/tasks.py` Content: ``` 1 """Tasks API""" 2 from importlib import import_module 3 4 from django.contrib import messages 5 from django.http.response import Http404 6 from django.utils.translation import gettext_lazy as _ 7 from drf_spectacular.types import OpenApiTypes 8 from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema 9 from rest_framework.decorators import action 10 from rest_framework.fields import CharField, ChoiceField, DateTimeField, ListField 11 from rest_framework.permissions import IsAdminUser 12 from rest_framework.request import Request 13 from rest_framework.response import Response 14 from rest_framework.viewsets import ViewSet 15 from structlog.stdlib import get_logger 16 17 from authentik.core.api.utils import PassiveSerializer 18 from authentik.events.monitored_tasks import TaskInfo, TaskResultStatus 19 20 LOGGER = get_logger() 21 22 23 class TaskSerializer(PassiveSerializer): 24 """Serialize TaskInfo and TaskResult""" 25 26 task_name = CharField() 27 task_description = CharField() 28 task_finish_timestamp = DateTimeField(source="finish_time") 29 30 status = ChoiceField( 31 source="result.status.name", 32 choices=[(x.name, x.name) for x in TaskResultStatus], 33 ) 34 messages = ListField(source="result.messages") 35 36 def to_representation(self, instance): 37 """When a new version of authentik adds fields to TaskInfo, 38 the API will fail with an AttributeError, as the classes 39 are pickled in cache. 
In that case, just delete the info""" 40 try: 41 return super().to_representation(instance) 42 except AttributeError: # pragma: no cover 43 if isinstance(self.instance, list): 44 for inst in self.instance: 45 inst.delete() 46 else: 47 self.instance.delete() 48 return {} 49 50 51 class TaskViewSet(ViewSet): 52 """Read-only view set that returns all background tasks""" 53 54 permission_classes = [IsAdminUser] 55 serializer_class = TaskSerializer 56 57 @extend_schema( 58 responses={ 59 200: TaskSerializer(many=False), 60 404: OpenApiResponse(description="Task not found"), 61 }, 62 parameters=[ 63 OpenApiParameter( 64 "id", 65 type=OpenApiTypes.STR, 66 location=OpenApiParameter.PATH, 67 required=True, 68 ), 69 ], 70 ) 71 # pylint: disable=invalid-name 72 def retrieve(self, request: Request, pk=None) -> Response: 73 """Get a single system task""" 74 task = TaskInfo.by_name(pk) 75 if not task: 76 raise Http404 77 return Response(TaskSerializer(task, many=False).data) 78 79 @extend_schema(responses={200: TaskSerializer(many=True)}) 80 def list(self, request: Request) -> Response: 81 """List system tasks""" 82 tasks = sorted(TaskInfo.all().values(), key=lambda task: task.task_name) 83 return Response(TaskSerializer(tasks, many=True).data) 84 85 @extend_schema( 86 request=OpenApiTypes.NONE, 87 responses={ 88 204: OpenApiResponse(description="Task retried successfully"), 89 404: OpenApiResponse(description="Task not found"), 90 500: OpenApiResponse(description="Failed to retry task"), 91 }, 92 parameters=[ 93 OpenApiParameter( 94 "id", 95 type=OpenApiTypes.STR, 96 location=OpenApiParameter.PATH, 97 required=True, 98 ), 99 ], 100 ) 101 @action(detail=True, methods=["post"]) 102 # pylint: disable=invalid-name 103 def retry(self, request: Request, pk=None) -> Response: 104 """Retry task""" 105 task = TaskInfo.by_name(pk) 106 if not task: 107 raise Http404 108 try: 109 task_module = import_module(task.task_call_module) 110 task_func = getattr(task_module, task.task_call_func) 111 LOGGER.debug("Running task", task=task_func) 112 task_func.delay(*task.task_call_args, **task.task_call_kwargs) 113 messages.success( 114 self.request, 115 _("Successfully re-scheduled Task %(name)s!" % {"name": task.task_name}), 116 ) 117 return Response(status=204) 118 except (ImportError, AttributeError): # pragma: no cover 119 LOGGER.warning("Failed to run task, remove state", task=task) 120 # if we get an import error, the module path has probably changed 121 task.delete() 122 return Response(status=500) 123 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/authentik/admin/api/tasks.py b/authentik/admin/api/tasks.py
--- a/authentik/admin/api/tasks.py
+++ b/authentik/admin/api/tasks.py
@@ -7,7 +7,13 @@
 from drf_spectacular.types import OpenApiTypes
 from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema
 from rest_framework.decorators import action
-from rest_framework.fields import CharField, ChoiceField, DateTimeField, ListField
+from rest_framework.fields import (
+    CharField,
+    ChoiceField,
+    DateTimeField,
+    ListField,
+    SerializerMethodField,
+)
 from rest_framework.permissions import IsAdminUser
 from rest_framework.request import Request
 from rest_framework.response import Response
@@ -26,6 +32,7 @@
     task_name = CharField()
     task_description = CharField()
     task_finish_timestamp = DateTimeField(source="finish_time")
+    task_duration = SerializerMethodField()
 
     status = ChoiceField(
         source="result.status.name",
@@ -33,7 +40,11 @@
     )
     messages = ListField(source="result.messages")
 
-    def to_representation(self, instance):
+    def get_task_duration(self, instance: TaskInfo) -> int:
+        """Get the duration a task took to run"""
+        return max(instance.finish_timestamp - instance.start_timestamp, 0)
+
+    def to_representation(self, instance: TaskInfo):
         """When a new version of authentik adds fields to TaskInfo,
         the API will fail with an AttributeError, as the classes
         are pickled in cache. In that case, just delete the info"""
{"golden_diff": "diff --git a/authentik/admin/api/tasks.py b/authentik/admin/api/tasks.py\n--- a/authentik/admin/api/tasks.py\n+++ b/authentik/admin/api/tasks.py\n@@ -7,7 +7,13 @@\n from drf_spectacular.types import OpenApiTypes\n from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema\n from rest_framework.decorators import action\n-from rest_framework.fields import CharField, ChoiceField, DateTimeField, ListField\n+from rest_framework.fields import (\n+ CharField,\n+ ChoiceField,\n+ DateTimeField,\n+ ListField,\n+ SerializerMethodField,\n+)\n from rest_framework.permissions import IsAdminUser\n from rest_framework.request import Request\n from rest_framework.response import Response\n@@ -26,6 +32,7 @@\n task_name = CharField()\n task_description = CharField()\n task_finish_timestamp = DateTimeField(source=\"finish_time\")\n+ task_duration = SerializerMethodField()\n \n status = ChoiceField(\n source=\"result.status.name\",\n@@ -33,7 +40,11 @@\n )\n messages = ListField(source=\"result.messages\")\n \n- def to_representation(self, instance):\n+ def get_task_duration(self, instance: TaskInfo) -> int:\n+ \"\"\"Get the duration a task took to run\"\"\"\n+ return max(instance.finish_timestamp - instance.start_timestamp, 0)\n+\n+ def to_representation(self, instance: TaskInfo):\n \"\"\"When a new version of authentik adds fields to TaskInfo,\n the API will fail with an AttributeError, as the classes\n are pickled in cache. In that case, just delete the info\"\"\"\n", "issue": "System Tasks: Show start timestamp and calculate Duration\n**Is your feature request related to a problem? Please describe.**\r\nFor debugging purposes, I need the info when a task started, and when it finished. \r\n\r\n**Describe the solution you'd like**\r\nI have seen that the TaskInfo object actually holds that information, but it is not returned by the API, and not shown in the \"SystemTasks\" table of the web UI. 
\r\nIt would also make sense to calculate the duration for easier debugging.\r\n\r\n**Describe alternatives you've considered**\r\nI could look this up in the database, but this would be questionable UX, since there is already a view in the web app which should show this information.\r\n\r\n**Additional context**\r\n(none)\r\n\n", "before_files": [{"content": "\"\"\"Tasks API\"\"\"\nfrom importlib import import_module\n\nfrom django.contrib import messages\nfrom django.http.response import Http404\nfrom django.utils.translation import gettext_lazy as _\nfrom drf_spectacular.types import OpenApiTypes\nfrom drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema\nfrom rest_framework.decorators import action\nfrom rest_framework.fields import CharField, ChoiceField, DateTimeField, ListField\nfrom rest_framework.permissions import IsAdminUser\nfrom rest_framework.request import Request\nfrom rest_framework.response import Response\nfrom rest_framework.viewsets import ViewSet\nfrom structlog.stdlib import get_logger\n\nfrom authentik.core.api.utils import PassiveSerializer\nfrom authentik.events.monitored_tasks import TaskInfo, TaskResultStatus\n\nLOGGER = get_logger()\n\n\nclass TaskSerializer(PassiveSerializer):\n \"\"\"Serialize TaskInfo and TaskResult\"\"\"\n\n task_name = CharField()\n task_description = CharField()\n task_finish_timestamp = DateTimeField(source=\"finish_time\")\n\n status = ChoiceField(\n source=\"result.status.name\",\n choices=[(x.name, x.name) for x in TaskResultStatus],\n )\n messages = ListField(source=\"result.messages\")\n\n def to_representation(self, instance):\n \"\"\"When a new version of authentik adds fields to TaskInfo,\n the API will fail with an AttributeError, as the classes\n are pickled in cache. 
…", "path": "authentik/admin/api/tasks.py"}], "after_files": [{"content": "…", "path": "authentik/admin/api/tasks.py"}]}
1,594
360
gh_patches_debug_22614
rasdani/github-patches
git_diff
aws-cloudformation__cfn-lint-269
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- E2502 is mistaken about IamInstanceProfile *cfn-lint version: 0.4.2* *Description of issue.* Linting a template returned: ``` E2502 Property IamInstanceProfile shouldn't be an ARN for Resources/BuildkiteSpotfleet/Properties/SpotFleetRequestConfigData/LaunchSpecifications/0/IamInstanceProfile/Arn/Fn::GetAtt ``` However that property can be an ARN according to https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-ec2-spotfleet-spotfleetrequestconfigdata-launchspecifications.html#cfn-ec2-spotfleet-spotfleetrequestconfigdata-launchspecifications-iaminstanceprofile It can be an `{"Arn": "profile_arn"}` structure. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/cfnlint/rules/resources/iam/InstanceProfile.py` Content: ``` 1 """ 2 Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved. 3 4 Permission is hereby granted, free of charge, to any person obtaining a copy of this 5 software and associated documentation files (the "Software"), to deal in the Software 6 without restriction, including without limitation the rights to use, copy, modify, 7 merge, publish, distribute, sublicense, and/or sell copies of the Software, and to 8 permit persons to whom the Software is furnished to do so. 9 10 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, 11 INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A 12 PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT 13 HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION 14 OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE 15 SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
16 """ 17 from cfnlint import CloudFormationLintRule 18 from cfnlint import RuleMatch 19 20 21 class InstanceProfile(CloudFormationLintRule): 22 """Check if IamInstanceProfile are used""" 23 id = 'E2502' 24 shortdesc = 'Check if IamInstanceProfile are using the name and not ARN' 25 description = 'See if there are any properties IamInstanceProfile' + \ 26 'are using name and not ARN' 27 source_url = 'https://github.com/awslabs/cfn-python-lint' 28 tags = ['properties'] 29 30 def match(self, cfn): 31 """Check CloudFormation IamInstanceProfile Parameters""" 32 33 matches = list() 34 35 # Build the list of keys 36 trees = cfn.search_deep_keys('Fn::GetAtt') 37 # Filter only resoureces 38 # Disable pylint for Pylint 2 39 # pylint: disable=W0110 40 trees = filter(lambda x: x[0] == 'Resources', trees) 41 for tree in trees: 42 if any(e == 'IamInstanceProfile' for e in tree): 43 obj = tree[-1] 44 objtype = cfn.template.get('Resources', {}).get(obj[0], {}).get('Type') 45 if objtype: 46 if objtype != 'AWS::IAM::InstanceProfile': 47 message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % ( 48 '/'.join(map(str, tree[:-1]))) 49 matches.append(RuleMatch(tree[:-1], message)) 50 else: 51 if obj[1] == 'Arn': 52 message = 'Property IamInstanceProfile shouldn\'t be an ARN for %s' % ( 53 '/'.join(map(str, tree[:-1]))) 54 matches.append(RuleMatch(tree[:-1], message)) 55 56 # Search Refs 57 trees = cfn.search_deep_keys('Ref') 58 # Filter only resoureces 59 trees = filter(lambda x: x[0] == 'Resources', trees) 60 for tree in trees: 61 if any(e == 'IamInstanceProfile' for e in tree): 62 obj = tree[-1] 63 objtype = cfn.template.get('Resources', {}).get(obj, {}).get('Type') 64 if objtype: 65 if objtype != 'AWS::IAM::InstanceProfile': 66 message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % ( 67 '/'.join(map(str, tree[:-1]))) 68 matches.append(RuleMatch(tree[:-1], message)) 69 70 return matches 71 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/cfnlint/rules/resources/iam/InstanceProfile.py b/src/cfnlint/rules/resources/iam/InstanceProfile.py --- a/src/cfnlint/rules/resources/iam/InstanceProfile.py +++ b/src/cfnlint/rules/resources/iam/InstanceProfile.py @@ -48,10 +48,16 @@ '/'.join(map(str, tree[:-1]))) matches.append(RuleMatch(tree[:-1], message)) else: - if obj[1] == 'Arn': - message = 'Property IamInstanceProfile shouldn\'t be an ARN for %s' % ( - '/'.join(map(str, tree[:-1]))) - matches.append(RuleMatch(tree[:-1], message)) + if cfn.template.get('Resources', {}).get(tree[1], {}).get('Type') in ['AWS::EC2::SpotFleet']: + if obj[1] != 'Arn': + message = 'Property IamInstanceProfile should be an ARN for %s' % ( + '/'.join(map(str, tree[:-1]))) + matches.append(RuleMatch(tree[:-1], message)) + else: + if obj[1] == 'Arn': + message = 'Property IamInstanceProfile shouldn\'t be an ARN for %s' % ( + '/'.join(map(str, tree[:-1]))) + matches.append(RuleMatch(tree[:-1], message)) # Search Refs trees = cfn.search_deep_keys('Ref')
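For context, a minimal sketch of the two template shapes the patched rule now has to tell apart (the logical resource names below are invented for illustration):

```python
# Hypothetical CloudFormation fragments, expressed as Python dicts.
# An EC2 instance's IamInstanceProfile expects the profile *name*
# (e.g. a Ref to an AWS::IAM::InstanceProfile), while a SpotFleet
# launch specification expects an {"Arn": ...} structure, so the
# patched E2502 must only flag the ARN form outside of SpotFleets.
ec2_instance = {
    "Type": "AWS::EC2::Instance",
    "Properties": {
        "IamInstanceProfile": {"Ref": "MyInstanceProfile"},  # name form
    },
}

spot_fleet = {
    "Type": "AWS::EC2::SpotFleet",
    "Properties": {
        "SpotFleetRequestConfigData": {
            "LaunchSpecifications": [
                {
                    "IamInstanceProfile": {
                        # ARN form is required here and must not be flagged
                        "Arn": {"Fn::GetAtt": ["MyInstanceProfile", "Arn"]},
                    },
                },
            ],
        },
    },
}
```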
{"golden_diff": "diff --git a/src/cfnlint/rules/resources/iam/InstanceProfile.py b/src/cfnlint/rules/resources/iam/InstanceProfile.py\n--- a/src/cfnlint/rules/resources/iam/InstanceProfile.py\n+++ b/src/cfnlint/rules/resources/iam/InstanceProfile.py\n@@ -48,10 +48,16 @@\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n else:\n- if obj[1] == 'Arn':\n- message = 'Property IamInstanceProfile shouldn\\'t be an ARN for %s' % (\n- '/'.join(map(str, tree[:-1])))\n- matches.append(RuleMatch(tree[:-1], message))\n+ if cfn.template.get('Resources', {}).get(tree[1], {}).get('Type') in ['AWS::EC2::SpotFleet']:\n+ if obj[1] != 'Arn':\n+ message = 'Property IamInstanceProfile should be an ARN for %s' % (\n+ '/'.join(map(str, tree[:-1])))\n+ matches.append(RuleMatch(tree[:-1], message))\n+ else:\n+ if obj[1] == 'Arn':\n+ message = 'Property IamInstanceProfile shouldn\\'t be an ARN for %s' % (\n+ '/'.join(map(str, tree[:-1])))\n+ matches.append(RuleMatch(tree[:-1], message))\n \n # Search Refs\n trees = cfn.search_deep_keys('Ref')\n", "issue": "E2502 is mistaken about IamInstanceProfile\n*cfn-lint version: 0.4.2*\r\n\r\n*Description of issue.*\r\n\r\nLinting a template returned:\r\n```\r\nE2502 Property IamInstanceProfile shouldn't be an ARN for Resources/BuildkiteSpotfleet/Properties/SpotFleetRequestConfigData/LaunchSpecifications/0/IamInstanceProfile/Arn/Fn::GetAtt\r\n```\r\n\r\nHowever that property can be an ARN according to https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-ec2-spotfleet-spotfleetrequestconfigdata-launchspecifications.html#cfn-ec2-spotfleet-spotfleetrequestconfigdata-launchspecifications-iaminstanceprofile\r\n\r\nIt can be an `{\"Arn\": \"profile_arn\"}` structure.\n", "before_files": [{"content": "\"\"\"\n Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass InstanceProfile(CloudFormationLintRule):\n \"\"\"Check if IamInstanceProfile are used\"\"\"\n id = 'E2502'\n shortdesc = 'Check if IamInstanceProfile are using the name and not ARN'\n description = 'See if there are any properties IamInstanceProfile' + \\\n 'are using name and not ARN'\n source_url = 'https://github.com/awslabs/cfn-python-lint'\n tags = ['properties']\n\n def match(self, cfn):\n \"\"\"Check CloudFormation IamInstanceProfile Parameters\"\"\"\n\n matches = list()\n\n # Build the list of keys\n trees = cfn.search_deep_keys('Fn::GetAtt')\n # Filter only resoureces\n # Disable pylint for Pylint 2\n # pylint: disable=W0110\n trees = filter(lambda x: x[0] == 'Resources', trees)\n for tree in trees:\n if any(e == 'IamInstanceProfile' for e in tree):\n obj = tree[-1]\n objtype = cfn.template.get('Resources', {}).get(obj[0], {}).get('Type')\n if objtype:\n if objtype != 'AWS::IAM::InstanceProfile':\n message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n else:\n if obj[1] == 'Arn':\n message = 'Property IamInstanceProfile shouldn\\'t be an ARN for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n\n # Search Refs\n trees = cfn.search_deep_keys('Ref')\n # Filter only resoureces\n trees = filter(lambda x: x[0] == 'Resources', trees)\n for tree in trees:\n if any(e == 'IamInstanceProfile' for e in tree):\n obj = tree[-1]\n objtype = cfn.template.get('Resources', {}).get(obj, {}).get('Type')\n if objtype:\n if objtype != 'AWS::IAM::InstanceProfile':\n message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n\n return matches\n", "path": "src/cfnlint/rules/resources/iam/InstanceProfile.py"}], "after_files": [{"content": "\"\"\"\n Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass InstanceProfile(CloudFormationLintRule):\n \"\"\"Check if IamInstanceProfile are used\"\"\"\n id = 'E2502'\n shortdesc = 'Check if IamInstanceProfile are using the name and not ARN'\n description = 'See if there are any properties IamInstanceProfile' + \\\n 'are using name and not ARN'\n source_url = 'https://github.com/awslabs/cfn-python-lint'\n tags = ['properties']\n\n def match(self, cfn):\n \"\"\"Check CloudFormation IamInstanceProfile Parameters\"\"\"\n\n matches = list()\n\n # Build the list of keys\n trees = cfn.search_deep_keys('Fn::GetAtt')\n # Filter only resoureces\n # Disable pylint for Pylint 2\n # pylint: disable=W0110\n trees = filter(lambda x: x[0] == 'Resources', trees)\n for tree in trees:\n if any(e == 'IamInstanceProfile' for e in tree):\n obj = tree[-1]\n objtype = cfn.template.get('Resources', {}).get(obj[0], {}).get('Type')\n if objtype:\n if objtype != 'AWS::IAM::InstanceProfile':\n message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n else:\n if cfn.template.get('Resources', {}).get(tree[1], {}).get('Type') in ['AWS::EC2::SpotFleet']:\n if obj[1] != 'Arn':\n message = 'Property IamInstanceProfile should be an ARN for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n else:\n if obj[1] == 'Arn':\n message = 'Property IamInstanceProfile shouldn\\'t be an ARN for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n\n # Search Refs\n trees = cfn.search_deep_keys('Ref')\n # Filter only resoureces\n trees = filter(lambda x: x[0] == 'Resources', trees)\n for tree in trees:\n if any(e == 'IamInstanceProfile' for e in tree):\n obj = tree[-1]\n objtype = cfn.template.get('Resources', {}).get(obj, {}).get('Type')\n if objtype:\n if objtype != 'AWS::IAM::InstanceProfile':\n message = 'Property IamInstanceProfile should relate to AWS::IAM::InstanceProfile for %s' % (\n '/'.join(map(str, tree[:-1])))\n matches.append(RuleMatch(tree[:-1], message))\n\n return matches\n", "path": "src/cfnlint/rules/resources/iam/InstanceProfile.py"}]}
1,328
329
gh_patches_debug_7860
rasdani/github-patches
git_diff
encode__httpx-321
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Away-from-origin redirects should add a new host header Prompted by #310 Away-from-origin redirects currently remove the `Host` header, without adding a new one. I think we ought to be using `headers['Host'] = url.authority.encode("ascii")` instead of simply deleting the header. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `httpx/middleware.py` Content: ``` 1 import functools 2 import typing 3 from base64 import b64encode 4 5 from .config import DEFAULT_MAX_REDIRECTS 6 from .exceptions import RedirectBodyUnavailable, RedirectLoop, TooManyRedirects 7 from .models import URL, AsyncRequest, AsyncResponse, Cookies, Headers 8 from .status_codes import codes 9 10 11 class BaseMiddleware: 12 async def __call__( 13 self, request: AsyncRequest, get_response: typing.Callable 14 ) -> AsyncResponse: 15 raise NotImplementedError # pragma: no cover 16 17 18 class BasicAuthMiddleware(BaseMiddleware): 19 def __init__( 20 self, username: typing.Union[str, bytes], password: typing.Union[str, bytes] 21 ): 22 if isinstance(username, str): 23 username = username.encode("latin1") 24 25 if isinstance(password, str): 26 password = password.encode("latin1") 27 28 userpass = b":".join((username, password)) 29 token = b64encode(userpass).decode().strip() 30 31 self.authorization_header = f"Basic {token}" 32 33 async def __call__( 34 self, request: AsyncRequest, get_response: typing.Callable 35 ) -> AsyncResponse: 36 request.headers["Authorization"] = self.authorization_header 37 return await get_response(request) 38 39 40 class CustomAuthMiddleware(BaseMiddleware): 41 def __init__(self, auth: typing.Callable[[AsyncRequest], AsyncRequest]): 42 self.auth = auth 43 44 async def __call__( 45 self, request: AsyncRequest, get_response: typing.Callable 46 ) -> AsyncResponse: 47 request = self.auth(request) 48 return await get_response(request) 49 50 51 class RedirectMiddleware(BaseMiddleware): 52 def __init__( 53 self, 54 allow_redirects: bool = True, 55 max_redirects: int = DEFAULT_MAX_REDIRECTS, 56 cookies: typing.Optional[Cookies] = None, 57 ): 58 self.allow_redirects = allow_redirects 59 self.max_redirects = max_redirects 60 self.cookies = cookies 61 self.history: typing.List[AsyncResponse] = [] 62 63 async def __call__( 64 self, request: AsyncRequest, get_response: typing.Callable 65 ) -> AsyncResponse: 66 if len(self.history) > self.max_redirects: 67 raise TooManyRedirects() 68 if request.url in (response.url for response in self.history): 69 raise RedirectLoop() 70 71 response = await get_response(request) 72 response.history = list(self.history) 73 74 if not response.is_redirect: 75 return response 76 77 self.history.append(response) 78 next_request = self.build_redirect_request(request, response) 79 80 if self.allow_redirects: 81 return await self(next_request, get_response) 82 83 response.call_next = functools.partial(self, next_request, get_response) 84 return response 85 86 def build_redirect_request( 87 self, request: AsyncRequest, response: AsyncResponse 88 ) -> AsyncRequest: 89 method = self.redirect_method(request, response) 90 url = self.redirect_url(request, response) 91 headers = self.redirect_headers(request, url, method) # TODO: merge headers? 
92 content = self.redirect_content(request, method) 93 cookies = Cookies(self.cookies) 94 cookies.update(request.cookies) 95 return AsyncRequest( 96 method=method, url=url, headers=headers, data=content, cookies=cookies 97 ) 98 99 def redirect_method(self, request: AsyncRequest, response: AsyncResponse) -> str: 100 """ 101 When being redirected we may want to change the method of the request 102 based on certain specs or browser behavior. 103 """ 104 method = request.method 105 106 # https://tools.ietf.org/html/rfc7231#section-6.4.4 107 if response.status_code == codes.SEE_OTHER and method != "HEAD": 108 method = "GET" 109 110 # Do what the browsers do, despite standards... 111 # Turn 302s into GETs. 112 if response.status_code == codes.FOUND and method != "HEAD": 113 method = "GET" 114 115 # If a POST is responded to with a 301, turn it into a GET. 116 # This bizarre behaviour is explained in 'requests' issue 1704. 117 if response.status_code == codes.MOVED_PERMANENTLY and method == "POST": 118 method = "GET" 119 120 return method 121 122 def redirect_url(self, request: AsyncRequest, response: AsyncResponse) -> URL: 123 """ 124 Return the URL for the redirect to follow. 125 """ 126 location = response.headers["Location"] 127 128 url = URL(location, allow_relative=True) 129 130 # Facilitate relative 'Location' headers, as allowed by RFC 7231. 131 # (e.g. '/path/to/resource' instead of 'http://domain.tld/path/to/resource') 132 if url.is_relative_url: 133 url = request.url.join(url) 134 135 # Attach previous fragment if needed (RFC 7231 7.1.2) 136 if request.url.fragment and not url.fragment: 137 url = url.copy_with(fragment=request.url.fragment) 138 139 return url 140 141 def redirect_headers(self, request: AsyncRequest, url: URL, method: str) -> Headers: 142 """ 143 Return the headers that should be used for the redirect request. 144 """ 145 headers = Headers(request.headers) 146 147 if url.origin != request.url.origin: 148 # Strip Authorization headers when responses are redirected away from 149 # the origin. 150 del headers["Authorization"] 151 del headers["Host"] 152 153 if method != request.method and method == "GET": 154 # If we've switch to a 'GET' request, then strip any headers which 155 # are only relevant to the request body. 156 del headers["Content-Length"] 157 del headers["Transfer-Encoding"] 158 159 return headers 160 161 def redirect_content(self, request: AsyncRequest, method: str) -> bytes: 162 """ 163 Return the body that should be used for the redirect request. 164 """ 165 if method != request.method and method == "GET": 166 return b"" 167 if request.is_streaming: 168 raise RedirectBodyUnavailable() 169 return request.content 170 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/httpx/middleware.py b/httpx/middleware.py --- a/httpx/middleware.py +++ b/httpx/middleware.py @@ -148,7 +148,7 @@ # Strip Authorization headers when responses are redirected away from # the origin. del headers["Authorization"] - del headers["Host"] + headers["Host"] = url.authority if method != request.method and method == "GET": # If we've switch to a 'GET' request, then strip any headers which
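A self-contained sketch of the patched behavior, modeled with `urllib.parse` instead of httpx's own `URL`/`Headers` types (so the function below is a simplification, not the library's API):

```python
from urllib.parse import urlsplit


def redirect_headers(request_url: str, redirect_url: str, headers: dict) -> dict:
    """Simplified model of the fix: on an away-from-origin redirect,
    strip Authorization and set Host to the new authority instead of
    leaving the redirected request with no Host header at all."""
    old, new = urlsplit(request_url), urlsplit(redirect_url)
    out = dict(headers)
    if (old.scheme, old.netloc) != (new.scheme, new.netloc):
        out.pop("Authorization", None)
        out["Host"] = new.netloc  # analogous to url.authority in the patch
    return out


print(redirect_headers(
    "https://example.com/start",
    "https://other.example.org/finish",
    {"Host": "example.com", "Authorization": "Basic dXNlcjpwYXNz"},
))
# -> {'Host': 'other.example.org'}
```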
{"golden_diff": "diff --git a/httpx/middleware.py b/httpx/middleware.py\n--- a/httpx/middleware.py\n+++ b/httpx/middleware.py\n@@ -148,7 +148,7 @@\n # Strip Authorization headers when responses are redirected away from\n # the origin.\n del headers[\"Authorization\"]\n- del headers[\"Host\"]\n+ headers[\"Host\"] = url.authority\n \n if method != request.method and method == \"GET\":\n # If we've switch to a 'GET' request, then strip any headers which\n", "issue": "Away-from-origin redirects should add a new host header\nPrompted by #310\r\n\r\nAway-from-origin redirects currently remove the `Host` header, without adding a new one.\r\nI think we ought to be using `headers['Host'] = url.authority.encode(\"ascii\")` instead of simply deleting the header.\n", "before_files": [{"content": "import functools\nimport typing\nfrom base64 import b64encode\n\nfrom .config import DEFAULT_MAX_REDIRECTS\nfrom .exceptions import RedirectBodyUnavailable, RedirectLoop, TooManyRedirects\nfrom .models import URL, AsyncRequest, AsyncResponse, Cookies, Headers\nfrom .status_codes import codes\n\n\nclass BaseMiddleware:\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n raise NotImplementedError # pragma: no cover\n\n\nclass BasicAuthMiddleware(BaseMiddleware):\n def __init__(\n self, username: typing.Union[str, bytes], password: typing.Union[str, bytes]\n ):\n if isinstance(username, str):\n username = username.encode(\"latin1\")\n\n if isinstance(password, str):\n password = password.encode(\"latin1\")\n\n userpass = b\":\".join((username, password))\n token = b64encode(userpass).decode().strip()\n\n self.authorization_header = f\"Basic {token}\"\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n request.headers[\"Authorization\"] = self.authorization_header\n return await get_response(request)\n\n\nclass CustomAuthMiddleware(BaseMiddleware):\n def __init__(self, auth: typing.Callable[[AsyncRequest], AsyncRequest]):\n self.auth = auth\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n request = self.auth(request)\n return await get_response(request)\n\n\nclass RedirectMiddleware(BaseMiddleware):\n def __init__(\n self,\n allow_redirects: bool = True,\n max_redirects: int = DEFAULT_MAX_REDIRECTS,\n cookies: typing.Optional[Cookies] = None,\n ):\n self.allow_redirects = allow_redirects\n self.max_redirects = max_redirects\n self.cookies = cookies\n self.history: typing.List[AsyncResponse] = []\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n if len(self.history) > self.max_redirects:\n raise TooManyRedirects()\n if request.url in (response.url for response in self.history):\n raise RedirectLoop()\n\n response = await get_response(request)\n response.history = list(self.history)\n\n if not response.is_redirect:\n return response\n\n self.history.append(response)\n next_request = self.build_redirect_request(request, response)\n\n if self.allow_redirects:\n return await self(next_request, get_response)\n\n response.call_next = functools.partial(self, next_request, get_response)\n return response\n\n def build_redirect_request(\n self, request: AsyncRequest, response: AsyncResponse\n ) -> AsyncRequest:\n method = self.redirect_method(request, response)\n url = self.redirect_url(request, response)\n headers = self.redirect_headers(request, url, method) # TODO: merge headers?\n content = 
self.redirect_content(request, method)\n cookies = Cookies(self.cookies)\n cookies.update(request.cookies)\n return AsyncRequest(\n method=method, url=url, headers=headers, data=content, cookies=cookies\n )\n\n def redirect_method(self, request: AsyncRequest, response: AsyncResponse) -> str:\n \"\"\"\n When being redirected we may want to change the method of the request\n based on certain specs or browser behavior.\n \"\"\"\n method = request.method\n\n # https://tools.ietf.org/html/rfc7231#section-6.4.4\n if response.status_code == codes.SEE_OTHER and method != \"HEAD\":\n method = \"GET\"\n\n # Do what the browsers do, despite standards...\n # Turn 302s into GETs.\n if response.status_code == codes.FOUND and method != \"HEAD\":\n method = \"GET\"\n\n # If a POST is responded to with a 301, turn it into a GET.\n # This bizarre behaviour is explained in 'requests' issue 1704.\n if response.status_code == codes.MOVED_PERMANENTLY and method == \"POST\":\n method = \"GET\"\n\n return method\n\n def redirect_url(self, request: AsyncRequest, response: AsyncResponse) -> URL:\n \"\"\"\n Return the URL for the redirect to follow.\n \"\"\"\n location = response.headers[\"Location\"]\n\n url = URL(location, allow_relative=True)\n\n # Facilitate relative 'Location' headers, as allowed by RFC 7231.\n # (e.g. '/path/to/resource' instead of 'http://domain.tld/path/to/resource')\n if url.is_relative_url:\n url = request.url.join(url)\n\n # Attach previous fragment if needed (RFC 7231 7.1.2)\n if request.url.fragment and not url.fragment:\n url = url.copy_with(fragment=request.url.fragment)\n\n return url\n\n def redirect_headers(self, request: AsyncRequest, url: URL, method: str) -> Headers:\n \"\"\"\n Return the headers that should be used for the redirect request.\n \"\"\"\n headers = Headers(request.headers)\n\n if url.origin != request.url.origin:\n # Strip Authorization headers when responses are redirected away from\n # the origin.\n del headers[\"Authorization\"]\n del headers[\"Host\"]\n\n if method != request.method and method == \"GET\":\n # If we've switch to a 'GET' request, then strip any headers which\n # are only relevant to the request body.\n del headers[\"Content-Length\"]\n del headers[\"Transfer-Encoding\"]\n\n return headers\n\n def redirect_content(self, request: AsyncRequest, method: str) -> bytes:\n \"\"\"\n Return the body that should be used for the redirect request.\n \"\"\"\n if method != request.method and method == \"GET\":\n return b\"\"\n if request.is_streaming:\n raise RedirectBodyUnavailable()\n return request.content\n", "path": "httpx/middleware.py"}], "after_files": [{"content": "import functools\nimport typing\nfrom base64 import b64encode\n\nfrom .config import DEFAULT_MAX_REDIRECTS\nfrom .exceptions import RedirectBodyUnavailable, RedirectLoop, TooManyRedirects\nfrom .models import URL, AsyncRequest, AsyncResponse, Cookies, Headers\nfrom .status_codes import codes\n\n\nclass BaseMiddleware:\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n raise NotImplementedError # pragma: no cover\n\n\nclass BasicAuthMiddleware(BaseMiddleware):\n def __init__(\n self, username: typing.Union[str, bytes], password: typing.Union[str, bytes]\n ):\n if isinstance(username, str):\n username = username.encode(\"latin1\")\n\n if isinstance(password, str):\n password = password.encode(\"latin1\")\n\n userpass = b\":\".join((username, password))\n token = b64encode(userpass).decode().strip()\n\n self.authorization_header = 
f\"Basic {token}\"\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n request.headers[\"Authorization\"] = self.authorization_header\n return await get_response(request)\n\n\nclass CustomAuthMiddleware(BaseMiddleware):\n def __init__(self, auth: typing.Callable[[AsyncRequest], AsyncRequest]):\n self.auth = auth\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n request = self.auth(request)\n return await get_response(request)\n\n\nclass RedirectMiddleware(BaseMiddleware):\n def __init__(\n self,\n allow_redirects: bool = True,\n max_redirects: int = DEFAULT_MAX_REDIRECTS,\n cookies: typing.Optional[Cookies] = None,\n ):\n self.allow_redirects = allow_redirects\n self.max_redirects = max_redirects\n self.cookies = cookies\n self.history: typing.List[AsyncResponse] = []\n\n async def __call__(\n self, request: AsyncRequest, get_response: typing.Callable\n ) -> AsyncResponse:\n if len(self.history) > self.max_redirects:\n raise TooManyRedirects()\n if request.url in (response.url for response in self.history):\n raise RedirectLoop()\n\n response = await get_response(request)\n response.history = list(self.history)\n\n if not response.is_redirect:\n return response\n\n self.history.append(response)\n next_request = self.build_redirect_request(request, response)\n\n if self.allow_redirects:\n return await self(next_request, get_response)\n\n response.call_next = functools.partial(self, next_request, get_response)\n return response\n\n def build_redirect_request(\n self, request: AsyncRequest, response: AsyncResponse\n ) -> AsyncRequest:\n method = self.redirect_method(request, response)\n url = self.redirect_url(request, response)\n headers = self.redirect_headers(request, url, method) # TODO: merge headers?\n content = self.redirect_content(request, method)\n cookies = Cookies(self.cookies)\n cookies.update(request.cookies)\n return AsyncRequest(\n method=method, url=url, headers=headers, data=content, cookies=cookies\n )\n\n def redirect_method(self, request: AsyncRequest, response: AsyncResponse) -> str:\n \"\"\"\n When being redirected we may want to change the method of the request\n based on certain specs or browser behavior.\n \"\"\"\n method = request.method\n\n # https://tools.ietf.org/html/rfc7231#section-6.4.4\n if response.status_code == codes.SEE_OTHER and method != \"HEAD\":\n method = \"GET\"\n\n # Do what the browsers do, despite standards...\n # Turn 302s into GETs.\n if response.status_code == codes.FOUND and method != \"HEAD\":\n method = \"GET\"\n\n # If a POST is responded to with a 301, turn it into a GET.\n # This bizarre behaviour is explained in 'requests' issue 1704.\n if response.status_code == codes.MOVED_PERMANENTLY and method == \"POST\":\n method = \"GET\"\n\n return method\n\n def redirect_url(self, request: AsyncRequest, response: AsyncResponse) -> URL:\n \"\"\"\n Return the URL for the redirect to follow.\n \"\"\"\n location = response.headers[\"Location\"]\n\n url = URL(location, allow_relative=True)\n\n # Facilitate relative 'Location' headers, as allowed by RFC 7231.\n # (e.g. 
'/path/to/resource' instead of 'http://domain.tld/path/to/resource')\n if url.is_relative_url:\n url = request.url.join(url)\n\n # Attach previous fragment if needed (RFC 7231 7.1.2)\n if request.url.fragment and not url.fragment:\n url = url.copy_with(fragment=request.url.fragment)\n\n return url\n\n def redirect_headers(self, request: AsyncRequest, url: URL, method: str) -> Headers:\n \"\"\"\n Return the headers that should be used for the redirect request.\n \"\"\"\n headers = Headers(request.headers)\n\n if url.origin != request.url.origin:\n # Strip Authorization headers when responses are redirected away from\n # the origin.\n del headers[\"Authorization\"]\n headers[\"Host\"] = url.authority\n\n if method != request.method and method == \"GET\":\n # If we've switch to a 'GET' request, then strip any headers which\n # are only relevant to the request body.\n del headers[\"Content-Length\"]\n del headers[\"Transfer-Encoding\"]\n\n return headers\n\n def redirect_content(self, request: AsyncRequest, method: str) -> bytes:\n \"\"\"\n Return the body that should be used for the redirect request.\n \"\"\"\n if method != request.method and method == \"GET\":\n return b\"\"\n if request.is_streaming:\n raise RedirectBodyUnavailable()\n return request.content\n", "path": "httpx/middleware.py"}]}
2,014
118
gh_patches_debug_57932
rasdani/github-patches
git_diff
scrapy__scrapy-3825
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Issue with Twisted and Python 3.4 Twisted had a patch 3 days ago and it's causing test suite to fail for py34 environment. Twisted , according to their Readme, support Python 3.5+. This needs to be fixed if the builds need to pass --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 from os.path import dirname, join 2 from pkg_resources import parse_version 3 from setuptools import setup, find_packages, __version__ as setuptools_version 4 5 6 with open(join(dirname(__file__), 'scrapy/VERSION'), 'rb') as f: 7 version = f.read().decode('ascii').strip() 8 9 10 def has_environment_marker_platform_impl_support(): 11 """Code extracted from 'pytest/setup.py' 12 https://github.com/pytest-dev/pytest/blob/7538680c/setup.py#L31 13 14 The first known release to support environment marker with range operators 15 it is 18.5, see: 16 https://setuptools.readthedocs.io/en/latest/history.html#id235 17 """ 18 return parse_version(setuptools_version) >= parse_version('18.5') 19 20 21 extras_require = {} 22 23 if has_environment_marker_platform_impl_support(): 24 extras_require[':platform_python_implementation == "PyPy"'] = [ 25 'PyPyDispatcher>=2.1.0', 26 ] 27 28 29 setup( 30 name='Scrapy', 31 version=version, 32 url='https://scrapy.org', 33 description='A high-level Web Crawling and Web Scraping framework', 34 long_description=open('README.rst').read(), 35 author='Scrapy developers', 36 maintainer='Pablo Hoffman', 37 maintainer_email='[email protected]', 38 license='BSD', 39 packages=find_packages(exclude=('tests', 'tests.*')), 40 include_package_data=True, 41 zip_safe=False, 42 entry_points={ 43 'console_scripts': ['scrapy = scrapy.cmdline:execute'] 44 }, 45 classifiers=[ 46 'Framework :: Scrapy', 47 'Development Status :: 5 - Production/Stable', 48 'Environment :: Console', 49 'Intended Audience :: Developers', 50 'License :: OSI Approved :: BSD License', 51 'Operating System :: OS Independent', 52 'Programming Language :: Python', 53 'Programming Language :: Python :: 2', 54 'Programming Language :: Python :: 2.7', 55 'Programming Language :: Python :: 3', 56 'Programming Language :: Python :: 3.4', 57 'Programming Language :: Python :: 3.5', 58 'Programming Language :: Python :: 3.6', 59 'Programming Language :: Python :: 3.7', 60 'Programming Language :: Python :: Implementation :: CPython', 61 'Programming Language :: Python :: Implementation :: PyPy', 62 'Topic :: Internet :: WWW/HTTP', 63 'Topic :: Software Development :: Libraries :: Application Frameworks', 64 'Topic :: Software Development :: Libraries :: Python Modules', 65 ], 66 python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*', 67 install_requires=[ 68 'Twisted>=13.1.0', 69 'w3lib>=1.17.0', 70 'queuelib', 71 'lxml', 72 'pyOpenSSL', 73 'cssselect>=0.9', 74 'six>=1.5.2', 75 'parsel>=1.5', 76 'PyDispatcher>=2.0.5', 77 'service_identity', 78 ], 79 extras_require=extras_require, 80 ) 81 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -65,7 +65,8 @@ ], python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*', install_requires=[ - 'Twisted>=13.1.0', + 'Twisted>=13.1.0;python_version!="3.4"', + 'Twisted>=13.1.0,<=19.2.0;python_version=="3.4"', 'w3lib>=1.17.0', 'queuelib', 'lxml',
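The fix leans on PEP 508 environment markers. A small sketch of how such markers evaluate, assuming the third-party `packaging` library is available:

```python
from packaging.markers import Marker  # pip install packaging

pinned = Marker('python_version == "3.4"')    # row that caps Twisted at 19.2.0
unpinned = Marker('python_version != "3.4"')  # row with no upper bound

for version in ("3.4", "3.7"):
    env = {"python_version": version}
    print(version, "pinned:", pinned.evaluate(env),
          "unpinned:", unpinned.evaluate(env))
# 3.4 pinned: True unpinned: False
# 3.7 pinned: False unpinned: True
```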
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -65,7 +65,8 @@\n ],\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',\n install_requires=[\n- 'Twisted>=13.1.0',\n+ 'Twisted>=13.1.0;python_version!=\"3.4\"',\n+ 'Twisted>=13.1.0,<=19.2.0;python_version==\"3.4\"',\n 'w3lib>=1.17.0',\n 'queuelib',\n 'lxml',\n", "issue": "Issue with Twisted and Python 3.4 \nTwisted had a patch 3 days ago and it's causing test suite to fail for py34 environment. \r\nTwisted , according to their Readme, support Python 3.5+. This needs to be fixed if the builds need to pass\n", "before_files": [{"content": "from os.path import dirname, join\nfrom pkg_resources import parse_version\nfrom setuptools import setup, find_packages, __version__ as setuptools_version\n\n\nwith open(join(dirname(__file__), 'scrapy/VERSION'), 'rb') as f:\n version = f.read().decode('ascii').strip()\n\n\ndef has_environment_marker_platform_impl_support():\n \"\"\"Code extracted from 'pytest/setup.py'\n https://github.com/pytest-dev/pytest/blob/7538680c/setup.py#L31\n\n The first known release to support environment marker with range operators\n it is 18.5, see:\n https://setuptools.readthedocs.io/en/latest/history.html#id235\n \"\"\"\n return parse_version(setuptools_version) >= parse_version('18.5')\n\n\nextras_require = {}\n\nif has_environment_marker_platform_impl_support():\n extras_require[':platform_python_implementation == \"PyPy\"'] = [\n 'PyPyDispatcher>=2.1.0',\n ]\n\n\nsetup(\n name='Scrapy',\n version=version,\n url='https://scrapy.org',\n description='A high-level Web Crawling and Web Scraping framework',\n long_description=open('README.rst').read(),\n author='Scrapy developers',\n maintainer='Pablo Hoffman',\n maintainer_email='[email protected]',\n license='BSD',\n packages=find_packages(exclude=('tests', 'tests.*')),\n include_package_data=True,\n zip_safe=False,\n entry_points={\n 'console_scripts': ['scrapy = scrapy.cmdline:execute']\n },\n classifiers=[\n 'Framework :: Scrapy',\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: BSD License',\n 'Operating System :: OS Independent',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Software Development :: Libraries :: Application Frameworks',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',\n install_requires=[\n 'Twisted>=13.1.0',\n 'w3lib>=1.17.0',\n 'queuelib',\n 'lxml',\n 'pyOpenSSL',\n 'cssselect>=0.9',\n 'six>=1.5.2',\n 'parsel>=1.5',\n 'PyDispatcher>=2.0.5',\n 'service_identity',\n ],\n extras_require=extras_require,\n)\n", "path": "setup.py"}], "after_files": [{"content": "from os.path import dirname, join\nfrom pkg_resources import parse_version\nfrom setuptools import setup, find_packages, __version__ as setuptools_version\n\n\nwith open(join(dirname(__file__), 'scrapy/VERSION'), 'rb') as f:\n version = f.read().decode('ascii').strip()\n\n\ndef 
has_environment_marker_platform_impl_support():\n \"\"\"Code extracted from 'pytest/setup.py'\n https://github.com/pytest-dev/pytest/blob/7538680c/setup.py#L31\n\n The first known release to support environment marker with range operators\n it is 18.5, see:\n https://setuptools.readthedocs.io/en/latest/history.html#id235\n \"\"\"\n return parse_version(setuptools_version) >= parse_version('18.5')\n\n\nextras_require = {}\n\nif has_environment_marker_platform_impl_support():\n extras_require[':platform_python_implementation == \"PyPy\"'] = [\n 'PyPyDispatcher>=2.1.0',\n ]\n\n\nsetup(\n name='Scrapy',\n version=version,\n url='https://scrapy.org',\n description='A high-level Web Crawling and Web Scraping framework',\n long_description=open('README.rst').read(),\n author='Scrapy developers',\n maintainer='Pablo Hoffman',\n maintainer_email='[email protected]',\n license='BSD',\n packages=find_packages(exclude=('tests', 'tests.*')),\n include_package_data=True,\n zip_safe=False,\n entry_points={\n 'console_scripts': ['scrapy = scrapy.cmdline:execute']\n },\n classifiers=[\n 'Framework :: Scrapy',\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: BSD License',\n 'Operating System :: OS Independent',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Software Development :: Libraries :: Application Frameworks',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',\n install_requires=[\n 'Twisted>=13.1.0;python_version!=\"3.4\"',\n 'Twisted>=13.1.0,<=19.2.0;python_version==\"3.4\"',\n 'w3lib>=1.17.0',\n 'queuelib',\n 'lxml',\n 'pyOpenSSL',\n 'cssselect>=0.9',\n 'six>=1.5.2',\n 'parsel>=1.5',\n 'PyDispatcher>=2.0.5',\n 'service_identity',\n ],\n extras_require=extras_require,\n)\n", "path": "setup.py"}]}
1,152
153
gh_patches_debug_31720
rasdani/github-patches
git_diff
meltano__meltano-7115
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Allow vendor-specific annotations in `meltano.yml` Discussed in https://github.com/meltano/meltano/discussions/7053 We should update `meltano.schema.json` to permit an object with arbitrary fields (i.e. `"additionalProperties": true`) within any object in the schema that does not already impose restrictions on the permitted additional properties. That is to say: - If the object has `"additionalProperties": true`, then it can have `annotations`, and no changes are required to the schema. - If the object has `"additionalProperties": false`, then it can have `annotations` and we must explicitly add it to the properties. - If the object has `additionalProperties` set to anything else (e.g. limiting it to strings as `env` does), then annotations are not supported, and no change to the schema is necessary. So the only places in the schema that need to be updated are where it says `"additionalProperties": false`. Documentation around what this field is for should be added. It should be made clear that the core Meltano library/CLI does not use this field, but it may be used for Meltano Cloud, or third-party tools. The documentation should also state that we impose no limitation on how it can be used except for requiring that the top-level of each annotation object may only contain objects as properties, and that we recommend that third-party tools keep their annotations under a sensible key, such as the name of the tool. Testing should be done by adding annotations liberally to integration tests, and ensuring that they are ignored. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/meltano/core/meltano_file.py` Content: ``` 1 """Module for working with meltano.yml files.""" 2 from __future__ import annotations 3 4 import copy 5 from typing import Iterable 6 7 from meltano.core.behavior.canonical import Canonical 8 from meltano.core.environment import Environment 9 from meltano.core.plugin import PluginType 10 from meltano.core.plugin.project_plugin import ProjectPlugin 11 from meltano.core.schedule import Schedule 12 from meltano.core.task_sets import TaskSets 13 14 VERSION = 1 15 16 17 class MeltanoFile(Canonical): 18 """Data and loading methods for meltano.yml files.""" 19 20 def __init__( 21 self, 22 version: int = VERSION, 23 plugins: dict[str, dict] = None, 24 schedules: list[dict] = None, 25 environments: list[dict] = None, 26 jobs: list[dict] = None, 27 env: dict[str, str] = None, 28 **extras, 29 ): 30 """Construct a new MeltanoFile object from meltano.yml file. 31 32 Args: 33 version: The meltano.yml version, currently always 1. 34 plugins: Plugin configuration for this project. 35 schedules: Schedule configuration for this project. 36 environments: Environment configuration for this project. 37 jobs: Job configuration for this project. 38 env: Environment variables for this project. 39 extras: Additional configuration for this project. 
40 """ 41 super().__init__( 42 # Attributes will be listed in meltano.yml in this order: 43 version=version, 44 extras=extras, 45 plugins=self.load_plugins(plugins or {}), 46 schedules=self.load_schedules(schedules or []), 47 environments=self.load_environments(environments or []), 48 jobs=self.load_job_tasks(jobs or []), 49 env=env or {}, 50 ) 51 52 def load_plugins(self, plugins: dict[str, dict]) -> Canonical: 53 """Parse the `meltano.yml` file and return it as `ProjectPlugin` instances. 54 55 Args: 56 plugins: Dictionary of plugin configurations. 57 58 Returns: 59 New ProjectPlugin instances. 60 """ 61 plugin_type_plugins = Canonical() 62 63 for ptype in PluginType: 64 plugin_type_plugins[ptype] = [] 65 66 # this will parse the meltano.yml file and create an instance of the 67 # corresponding `plugin_class` for all the plugins. 68 for plugin_type, raw_plugins in plugins.items(): 69 if plugin_type == PluginType.MAPPERS: 70 for mapper in raw_plugins: 71 plugin_type_plugins[PluginType.MAPPERS].append( 72 ProjectPlugin(PluginType.MAPPERS, **mapper) 73 ) 74 plugin_type_plugins[PluginType.MAPPERS].extend( 75 self.get_plugins_for_mappings(mapper) 76 ) 77 else: 78 for raw_plugin in raw_plugins: 79 plugin = ProjectPlugin(PluginType(plugin_type), **raw_plugin) 80 plugin_type_plugins[plugin.type].append(plugin) 81 82 return plugin_type_plugins 83 84 def load_schedules(self, schedules: list[dict]) -> list[Schedule]: 85 """Parse the meltano.yml file and return it as Schedule instances. 86 87 Args: 88 schedules: List of schedule configurations. 89 90 Returns: 91 List of new Schedule instances. 92 """ 93 return list(map(Schedule.parse, schedules)) 94 95 @staticmethod 96 def load_environments(environments: Iterable[dict]) -> list[Environment]: 97 """Parse `Environment` objects from python objects. 98 99 Args: 100 environments: Sequence of environment dictionaries. 101 102 Returns: 103 A list of `Environment` objects. 104 """ 105 return [Environment.parse(obj) for obj in environments] 106 107 @staticmethod 108 def load_job_tasks(jobs: Iterable[dict]) -> list[TaskSets]: 109 """Parse `TaskSets` objects from python objects. 110 111 Args: 112 jobs: Sequence of job dictionaries. 113 114 Returns: 115 A list of `Job` objects. 116 """ 117 return [TaskSets.parse(obj) for obj in jobs] 118 119 @staticmethod 120 def get_plugins_for_mappings(mapper_config: dict) -> list[ProjectPlugin]: 121 """Mapper plugins are a special case. They are not a single plugin, but actually a list of plugins generated from the mapping config defined within the mapper config. 122 123 Args: 124 mapper_config: The dict representation of a mapper config found in in meltano.yml. 125 126 Returns: 127 A list of `ProjectPlugin` instances. 128 """ 129 mapping_plugins: list[ProjectPlugin] = [] 130 for mapping in mapper_config.get("mappings", []): 131 raw_mapping_plugin = copy.deepcopy(mapper_config) 132 raw_mapping_plugin["mapping"] = True 133 raw_mapping_plugin["mapping_name"] = mapping.get("name") 134 raw_mapping_plugin["config"] = mapping.get("config") 135 mapping_plugins.append( 136 ProjectPlugin(PluginType.MAPPERS, **raw_mapping_plugin) 137 ) 138 return mapping_plugins 139 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/meltano/core/meltano_file.py b/src/meltano/core/meltano_file.py --- a/src/meltano/core/meltano_file.py +++ b/src/meltano/core/meltano_file.py @@ -1,8 +1,9 @@ """Module for working with meltano.yml files.""" + from __future__ import annotations import copy -from typing import Iterable +from typing import Any, Iterable from meltano.core.behavior.canonical import Canonical from meltano.core.environment import Environment @@ -25,6 +26,7 @@ environments: list[dict] = None, jobs: list[dict] = None, env: dict[str, str] = None, + annotations: dict[str, dict[Any, Any]] | None = None, # noqa: WPS442 **extras, ): """Construct a new MeltanoFile object from meltano.yml file. @@ -36,6 +38,7 @@ environments: Environment configuration for this project. jobs: Job configuration for this project. env: Environment variables for this project. + annotations: Annotations for external tools/vendors - do not access. extras: Additional configuration for this project. """ super().__init__( @@ -47,6 +50,7 @@ environments=self.load_environments(environments or []), jobs=self.load_job_tasks(jobs or []), env=env or {}, + annotations=annotations, ) def load_plugins(self, plugins: dict[str, dict]) -> Canonical:
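A sketch of what such an annotations block could look like and how it parses; the vendor key and fields are invented for illustration, and PyYAML is assumed:

```python
import yaml  # pip install pyyaml

MELTANO_YML = """\
version: 1
annotations:        # opaque to Meltano core; reserved for external tools
  example_vendor:   # hypothetical vendor key
    cost_center: data-platform
env: {}
"""

config = yaml.safe_load(MELTANO_YML)
# The patched MeltanoFile.__init__ accepts this mapping as a first-class
# field and carries it through unchanged instead of lumping it into extras.
print(config["annotations"])
# -> {'example_vendor': {'cost_center': 'data-platform'}}
```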
{"golden_diff": "diff --git a/src/meltano/core/meltano_file.py b/src/meltano/core/meltano_file.py\n--- a/src/meltano/core/meltano_file.py\n+++ b/src/meltano/core/meltano_file.py\n@@ -1,8 +1,9 @@\n \"\"\"Module for working with meltano.yml files.\"\"\"\n+\n from __future__ import annotations\n \n import copy\n-from typing import Iterable\n+from typing import Any, Iterable\n \n from meltano.core.behavior.canonical import Canonical\n from meltano.core.environment import Environment\n@@ -25,6 +26,7 @@\n environments: list[dict] = None,\n jobs: list[dict] = None,\n env: dict[str, str] = None,\n+ annotations: dict[str, dict[Any, Any]] | None = None, # noqa: WPS442\n **extras,\n ):\n \"\"\"Construct a new MeltanoFile object from meltano.yml file.\n@@ -36,6 +38,7 @@\n environments: Environment configuration for this project.\n jobs: Job configuration for this project.\n env: Environment variables for this project.\n+ annotations: Annotations for external tools/vendors - do not access.\n extras: Additional configuration for this project.\n \"\"\"\n super().__init__(\n@@ -47,6 +50,7 @@\n environments=self.load_environments(environments or []),\n jobs=self.load_job_tasks(jobs or []),\n env=env or {},\n+ annotations=annotations,\n )\n \n def load_plugins(self, plugins: dict[str, dict]) -> Canonical:\n", "issue": "Allow vendor-specific annotations in `meltano.yml`\nDiscussed in https://github.com/meltano/meltano/discussions/7053\r\n\r\nWe should update `meltano.schema.json` to permit an object with arbitrary fields (i.e. `\"additionalProperties\": true`) within any object in the schema that does not already impose restrictions on the permitted additional properties. That is to say:\r\n- If the object has `\"additionalProperties\": true`, then it can have `annotations`, and no changes are required to the schema.\r\n- If the object has `\"additionalProperties\": false`, then it can have `annotations` and we must explicitly add it to the properties.\r\n- If the object has `additionalProperties` set to anything else (e.g. limiting it to strings as `env` does), then annotations are not supported, and no change to the schema is necessary.\r\n\r\nSo the only places in the schema that need to be updated are where it says `\"additionalProperties\": false`.\r\n\r\nDocumentation around what this field is for should be added. 
It should be made clear that the core Meltano library/CLI does not use this field, but it may be used for Meltano Cloud, or third-party tools.\r\n\r\nThe documentation should also state that we impose no limitation on how it can be used except for requiring that the top-level of each annotation object may only contain objects as properties, and that we recommend that third-party tools keep their annotations under a sensible key, such as the name of the tool.\r\n\r\nTesting should be done by adding annotations liberally to integration tests, and ensuring that they are ignored.\n", "before_files": [{"content": "\"\"\"Module for working with meltano.yml files.\"\"\"\nfrom __future__ import annotations\n\nimport copy\nfrom typing import Iterable\n\nfrom meltano.core.behavior.canonical import Canonical\nfrom meltano.core.environment import Environment\nfrom meltano.core.plugin import PluginType\nfrom meltano.core.plugin.project_plugin import ProjectPlugin\nfrom meltano.core.schedule import Schedule\nfrom meltano.core.task_sets import TaskSets\n\nVERSION = 1\n\n\nclass MeltanoFile(Canonical):\n \"\"\"Data and loading methods for meltano.yml files.\"\"\"\n\n def __init__(\n self,\n version: int = VERSION,\n plugins: dict[str, dict] = None,\n schedules: list[dict] = None,\n environments: list[dict] = None,\n jobs: list[dict] = None,\n env: dict[str, str] = None,\n **extras,\n ):\n \"\"\"Construct a new MeltanoFile object from meltano.yml file.\n\n Args:\n version: The meltano.yml version, currently always 1.\n plugins: Plugin configuration for this project.\n schedules: Schedule configuration for this project.\n environments: Environment configuration for this project.\n jobs: Job configuration for this project.\n env: Environment variables for this project.\n extras: Additional configuration for this project.\n \"\"\"\n super().__init__(\n # Attributes will be listed in meltano.yml in this order:\n version=version,\n extras=extras,\n plugins=self.load_plugins(plugins or {}),\n schedules=self.load_schedules(schedules or []),\n environments=self.load_environments(environments or []),\n jobs=self.load_job_tasks(jobs or []),\n env=env or {},\n )\n\n def load_plugins(self, plugins: dict[str, dict]) -> Canonical:\n \"\"\"Parse the `meltano.yml` file and return it as `ProjectPlugin` instances.\n\n Args:\n plugins: Dictionary of plugin configurations.\n\n Returns:\n New ProjectPlugin instances.\n \"\"\"\n plugin_type_plugins = Canonical()\n\n for ptype in PluginType:\n plugin_type_plugins[ptype] = []\n\n # this will parse the meltano.yml file and create an instance of the\n # corresponding `plugin_class` for all the plugins.\n for plugin_type, raw_plugins in plugins.items():\n if plugin_type == PluginType.MAPPERS:\n for mapper in raw_plugins:\n plugin_type_plugins[PluginType.MAPPERS].append(\n ProjectPlugin(PluginType.MAPPERS, **mapper)\n )\n plugin_type_plugins[PluginType.MAPPERS].extend(\n self.get_plugins_for_mappings(mapper)\n )\n else:\n for raw_plugin in raw_plugins:\n plugin = ProjectPlugin(PluginType(plugin_type), **raw_plugin)\n plugin_type_plugins[plugin.type].append(plugin)\n\n return plugin_type_plugins\n\n def load_schedules(self, schedules: list[dict]) -> list[Schedule]:\n \"\"\"Parse the meltano.yml file and return it as Schedule instances.\n\n Args:\n schedules: List of schedule configurations.\n\n Returns:\n List of new Schedule instances.\n \"\"\"\n return list(map(Schedule.parse, schedules))\n\n @staticmethod\n def load_environments(environments: Iterable[dict]) -> list[Environment]:\n 
\"\"\"Parse `Environment` objects from python objects.\n\n Args:\n environments: Sequence of environment dictionaries.\n\n Returns:\n A list of `Environment` objects.\n \"\"\"\n return [Environment.parse(obj) for obj in environments]\n\n @staticmethod\n def load_job_tasks(jobs: Iterable[dict]) -> list[TaskSets]:\n \"\"\"Parse `TaskSets` objects from python objects.\n\n Args:\n jobs: Sequence of job dictionaries.\n\n Returns:\n A list of `Job` objects.\n \"\"\"\n return [TaskSets.parse(obj) for obj in jobs]\n\n @staticmethod\n def get_plugins_for_mappings(mapper_config: dict) -> list[ProjectPlugin]:\n \"\"\"Mapper plugins are a special case. They are not a single plugin, but actually a list of plugins generated from the mapping config defined within the mapper config.\n\n Args:\n mapper_config: The dict representation of a mapper config found in in meltano.yml.\n\n Returns:\n A list of `ProjectPlugin` instances.\n \"\"\"\n mapping_plugins: list[ProjectPlugin] = []\n for mapping in mapper_config.get(\"mappings\", []):\n raw_mapping_plugin = copy.deepcopy(mapper_config)\n raw_mapping_plugin[\"mapping\"] = True\n raw_mapping_plugin[\"mapping_name\"] = mapping.get(\"name\")\n raw_mapping_plugin[\"config\"] = mapping.get(\"config\")\n mapping_plugins.append(\n ProjectPlugin(PluginType.MAPPERS, **raw_mapping_plugin)\n )\n return mapping_plugins\n", "path": "src/meltano/core/meltano_file.py"}], "after_files": [{"content": "\"\"\"Module for working with meltano.yml files.\"\"\"\n\nfrom __future__ import annotations\n\nimport copy\nfrom typing import Any, Iterable\n\nfrom meltano.core.behavior.canonical import Canonical\nfrom meltano.core.environment import Environment\nfrom meltano.core.plugin import PluginType\nfrom meltano.core.plugin.project_plugin import ProjectPlugin\nfrom meltano.core.schedule import Schedule\nfrom meltano.core.task_sets import TaskSets\n\nVERSION = 1\n\n\nclass MeltanoFile(Canonical):\n \"\"\"Data and loading methods for meltano.yml files.\"\"\"\n\n def __init__(\n self,\n version: int = VERSION,\n plugins: dict[str, dict] = None,\n schedules: list[dict] = None,\n environments: list[dict] = None,\n jobs: list[dict] = None,\n env: dict[str, str] = None,\n annotations: dict[str, dict[Any, Any]] | None = None, # noqa: WPS442\n **extras,\n ):\n \"\"\"Construct a new MeltanoFile object from meltano.yml file.\n\n Args:\n version: The meltano.yml version, currently always 1.\n plugins: Plugin configuration for this project.\n schedules: Schedule configuration for this project.\n environments: Environment configuration for this project.\n jobs: Job configuration for this project.\n env: Environment variables for this project.\n annotations: Annotations for external tools/vendors - do not access.\n extras: Additional configuration for this project.\n \"\"\"\n super().__init__(\n # Attributes will be listed in meltano.yml in this order:\n version=version,\n extras=extras,\n plugins=self.load_plugins(plugins or {}),\n schedules=self.load_schedules(schedules or []),\n environments=self.load_environments(environments or []),\n jobs=self.load_job_tasks(jobs or []),\n env=env or {},\n annotations=annotations,\n )\n\n def load_plugins(self, plugins: dict[str, dict]) -> Canonical:\n \"\"\"Parse the `meltano.yml` file and return it as `ProjectPlugin` instances.\n\n Args:\n plugins: Dictionary of plugin configurations.\n\n Returns:\n New ProjectPlugin instances.\n \"\"\"\n plugin_type_plugins = Canonical()\n\n for ptype in PluginType:\n plugin_type_plugins[ptype] = []\n\n # this will parse the 
meltano.yml file and create an instance of the\n # corresponding `plugin_class` for all the plugins.\n for plugin_type, raw_plugins in plugins.items():\n if plugin_type == PluginType.MAPPERS:\n for mapper in raw_plugins:\n plugin_type_plugins[PluginType.MAPPERS].append(\n ProjectPlugin(PluginType.MAPPERS, **mapper)\n )\n plugin_type_plugins[PluginType.MAPPERS].extend(\n self.get_plugins_for_mappings(mapper)\n )\n else:\n for raw_plugin in raw_plugins:\n plugin = ProjectPlugin(PluginType(plugin_type), **raw_plugin)\n plugin_type_plugins[plugin.type].append(plugin)\n\n return plugin_type_plugins\n\n def load_schedules(self, schedules: list[dict]) -> list[Schedule]:\n \"\"\"Parse the meltano.yml file and return it as Schedule instances.\n\n Args:\n schedules: List of schedule configurations.\n\n Returns:\n List of new Schedule instances.\n \"\"\"\n return list(map(Schedule.parse, schedules))\n\n @staticmethod\n def load_environments(environments: Iterable[dict]) -> list[Environment]:\n \"\"\"Parse `Environment` objects from python objects.\n\n Args:\n environments: Sequence of environment dictionaries.\n\n Returns:\n A list of `Environment` objects.\n \"\"\"\n return [Environment.parse(obj) for obj in environments]\n\n @staticmethod\n def load_job_tasks(jobs: Iterable[dict]) -> list[TaskSets]:\n \"\"\"Parse `TaskSets` objects from python objects.\n\n Args:\n jobs: Sequence of job dictionaries.\n\n Returns:\n A list of `Job` objects.\n \"\"\"\n return [TaskSets.parse(obj) for obj in jobs]\n\n @staticmethod\n def get_plugins_for_mappings(mapper_config: dict) -> list[ProjectPlugin]:\n \"\"\"Mapper plugins are a special case. They are not a single plugin, but actually a list of plugins generated from the mapping config defined within the mapper config.\n\n Args:\n mapper_config: The dict representation of a mapper config found in in meltano.yml.\n\n Returns:\n A list of `ProjectPlugin` instances.\n \"\"\"\n mapping_plugins: list[ProjectPlugin] = []\n for mapping in mapper_config.get(\"mappings\", []):\n raw_mapping_plugin = copy.deepcopy(mapper_config)\n raw_mapping_plugin[\"mapping\"] = True\n raw_mapping_plugin[\"mapping_name\"] = mapping.get(\"name\")\n raw_mapping_plugin[\"config\"] = mapping.get(\"config\")\n mapping_plugins.append(\n ProjectPlugin(PluginType.MAPPERS, **raw_mapping_plugin)\n )\n return mapping_plugins\n", "path": "src/meltano/core/meltano_file.py"}]}
1,904
339
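The Meltano patch in the record above threads a new optional `annotations` mapping through `MeltanoFile.__init__` and stores it verbatim, so third-party tools can attach metadata that the core library never interprets. A minimal sketch of that pass-through pattern — `SimpleCanonical` is a hypothetical stand-in for meltano's `Canonical` base class, not the real implementation:

```python
from __future__ import annotations

from typing import Any


class SimpleCanonical:
    """Stand-in (assumed) for meltano.core.behavior.canonical.Canonical."""

    def __init__(self, **attrs: Any):
        self.__dict__.update(attrs)


class SimpleMeltanoFile(SimpleCanonical):
    def __init__(
        self,
        version: int = 1,
        annotations: dict[str, dict[Any, Any]] | None = None,
        **extras: Any,
    ):
        # `annotations` is stored verbatim; the core never reads it.
        super().__init__(version=version, annotations=annotations, extras=extras)


mf = SimpleMeltanoFile(annotations={"my-tool": {"enabled": True}})
assert mf.annotations == {"my-tool": {"enabled": True}}
```

Because the field is only stored, round-tripping a meltano.yml file leaves unknown annotations untouched — the behaviour the issue asks integration tests to verify.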
gh_patches_debug_2557
rasdani/github-patches
git_diff
ManimCommunity__manim-235
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- -f broken on windows Basically the title. When passing -f on windows in show the video file with the default video browser (like -p does) and not in the file explorer. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `manim/__main__.py` Content: ``` 1 import inspect 2 import os 3 import platform 4 import subprocess as sp 5 import sys 6 import re 7 import traceback 8 import importlib.util 9 import types 10 11 from .config import file_writer_config 12 from .scene.scene import Scene 13 from .utils.sounds import play_error_sound 14 from .utils.sounds import play_finish_sound 15 from . import constants 16 from .logger import logger, console 17 18 19 def open_file_if_needed(file_writer): 20 if file_writer_config["quiet"]: 21 curr_stdout = sys.stdout 22 sys.stdout = open(os.devnull, "w") 23 24 open_file = any( 25 [file_writer_config["preview"], file_writer_config["show_file_in_finder"]] 26 ) 27 if open_file: 28 current_os = platform.system() 29 file_paths = [] 30 31 if file_writer_config["save_last_frame"]: 32 file_paths.append(file_writer.get_image_file_path()) 33 if file_writer_config["write_to_movie"]: 34 file_paths.append(file_writer.get_movie_file_path()) 35 36 for file_path in file_paths: 37 if current_os == "Windows": 38 os.startfile(file_path) 39 else: 40 commands = [] 41 if current_os == "Linux": 42 commands.append("xdg-open") 43 elif current_os.startswith("CYGWIN"): 44 commands.append("cygstart") 45 else: # Assume macOS 46 commands.append("open") 47 48 if file_writer_config["show_file_in_finder"]: 49 commands.append("-R") 50 51 commands.append(file_path) 52 53 # commands.append("-g") 54 FNULL = open(os.devnull, "w") 55 sp.call(commands, stdout=FNULL, stderr=sp.STDOUT) 56 FNULL.close() 57 58 if file_writer_config["quiet"]: 59 sys.stdout.close() 60 sys.stdout = curr_stdout 61 62 63 def is_child_scene(obj, module): 64 return ( 65 inspect.isclass(obj) 66 and issubclass(obj, Scene) 67 and obj != Scene 68 and obj.__module__.startswith(module.__name__) 69 ) 70 71 72 def prompt_user_for_choice(scene_classes): 73 num_to_class = {} 74 for count, scene_class in enumerate(scene_classes): 75 count += 1 # start with 1 instead of 0 76 name = scene_class.__name__ 77 console.print(f"{count}: {name}", style="logging.level.info") 78 num_to_class[count] = scene_class 79 try: 80 user_input = console.input( 81 f"[log.message] {constants.CHOOSE_NUMBER_MESSAGE} [/log.message]" 82 ) 83 return [ 84 num_to_class[int(num_str)] 85 for num_str in re.split(r"\s*,\s*", user_input.strip()) 86 ] 87 except KeyError: 88 logger.error(constants.INVALID_NUMBER_MESSAGE) 89 sys.exit(2) 90 except EOFError: 91 sys.exit(1) 92 93 94 def get_scenes_to_render(scene_classes): 95 if not scene_classes: 96 logger.error(constants.NO_SCENE_MESSAGE) 97 return [] 98 if file_writer_config["write_all"]: 99 return scene_classes 100 result = [] 101 for scene_name in file_writer_config["scene_names"]: 102 found = False 103 for scene_class in scene_classes: 104 if scene_class.__name__ == scene_name: 105 result.append(scene_class) 106 found = True 107 break 108 if not found and (scene_name != ""): 109 logger.error(constants.SCENE_NOT_FOUND_MESSAGE.format(scene_name)) 110 if result: 111 return result 112 return ( 113 [scene_classes[0]] 114 if len(scene_classes) == 1 115 else prompt_user_for_choice(scene_classes) 116 ) 117 118 119 def 
get_scene_classes_from_module(module): 120 return [ 121 member[1] 122 for member in inspect.getmembers(module, lambda x: is_child_scene(x, module)) 123 ] 124 125 126 def get_module(file_name): 127 if file_name == "-": 128 module = types.ModuleType("input_scenes") 129 logger.info( 130 "Enter the animation's code & end with an EOF (CTRL+D on Linux/Unix, CTRL+Z on Windows):" 131 ) 132 code = sys.stdin.read() 133 if not code.startswith("from manim import"): 134 logger.warn( 135 "Didn't find an import statement for Manim. Importing automatically..." 136 ) 137 code = "from manim import *\n" + code 138 logger.info("Rendering animation from typed code...") 139 try: 140 exec(code, module.__dict__) 141 return module 142 except Exception as e: 143 logger.error(f"Failed to render scene: {str(e)}") 144 sys.exit(2) 145 else: 146 if os.path.exists(file_name): 147 if file_name[-3:] != ".py": 148 raise Exception(f"{file_name} is not a valid Manim python script.") 149 module_name = file_name[:-3].replace(os.sep, ".").split(".")[-1] 150 spec = importlib.util.spec_from_file_location(module_name, file_name) 151 module = importlib.util.module_from_spec(spec) 152 spec.loader.exec_module(module) 153 return module 154 else: 155 raise FileNotFoundError(f"{file_name} not found") 156 157 158 def main(): 159 module = get_module(file_writer_config["input_file"]) 160 all_scene_classes = get_scene_classes_from_module(module) 161 scene_classes_to_render = get_scenes_to_render(all_scene_classes) 162 sound_on = file_writer_config["sound"] 163 for SceneClass in scene_classes_to_render: 164 try: 165 # By invoking, this renders the full scene 166 scene = SceneClass() 167 open_file_if_needed(scene.file_writer) 168 if sound_on: 169 play_finish_sound() 170 except Exception: 171 print("\n\n") 172 traceback.print_exc() 173 print("\n\n") 174 if sound_on: 175 play_error_sound() 176 177 178 if __name__ == "__main__": 179 main() 180 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/manim/__main__.py b/manim/__main__.py --- a/manim/__main__.py +++ b/manim/__main__.py @@ -36,7 +36,7 @@ for file_path in file_paths: if current_os == "Windows": - os.startfile(file_path) + os.startfile(os.path.dirname(file_path)) else: commands = [] if current_os == "Linux":
{"golden_diff": "diff --git a/manim/__main__.py b/manim/__main__.py\n--- a/manim/__main__.py\n+++ b/manim/__main__.py\n@@ -36,7 +36,7 @@\n \n for file_path in file_paths:\n if current_os == \"Windows\":\n- os.startfile(file_path)\n+ os.startfile(os.path.dirname(file_path))\n else:\n commands = []\n if current_os == \"Linux\":\n", "issue": "-f broken on windows\nBasically the title.\r\nWhen passing -f on windows in show the video file with the default video browser (like -p does) and not in the file explorer. \n", "before_files": [{"content": "import inspect\nimport os\nimport platform\nimport subprocess as sp\nimport sys\nimport re\nimport traceback\nimport importlib.util\nimport types\n\nfrom .config import file_writer_config\nfrom .scene.scene import Scene\nfrom .utils.sounds import play_error_sound\nfrom .utils.sounds import play_finish_sound\nfrom . import constants\nfrom .logger import logger, console\n\n\ndef open_file_if_needed(file_writer):\n if file_writer_config[\"quiet\"]:\n curr_stdout = sys.stdout\n sys.stdout = open(os.devnull, \"w\")\n\n open_file = any(\n [file_writer_config[\"preview\"], file_writer_config[\"show_file_in_finder\"]]\n )\n if open_file:\n current_os = platform.system()\n file_paths = []\n\n if file_writer_config[\"save_last_frame\"]:\n file_paths.append(file_writer.get_image_file_path())\n if file_writer_config[\"write_to_movie\"]:\n file_paths.append(file_writer.get_movie_file_path())\n\n for file_path in file_paths:\n if current_os == \"Windows\":\n os.startfile(file_path)\n else:\n commands = []\n if current_os == \"Linux\":\n commands.append(\"xdg-open\")\n elif current_os.startswith(\"CYGWIN\"):\n commands.append(\"cygstart\")\n else: # Assume macOS\n commands.append(\"open\")\n\n if file_writer_config[\"show_file_in_finder\"]:\n commands.append(\"-R\")\n\n commands.append(file_path)\n\n # commands.append(\"-g\")\n FNULL = open(os.devnull, \"w\")\n sp.call(commands, stdout=FNULL, stderr=sp.STDOUT)\n FNULL.close()\n\n if file_writer_config[\"quiet\"]:\n sys.stdout.close()\n sys.stdout = curr_stdout\n\n\ndef is_child_scene(obj, module):\n return (\n inspect.isclass(obj)\n and issubclass(obj, Scene)\n and obj != Scene\n and obj.__module__.startswith(module.__name__)\n )\n\n\ndef prompt_user_for_choice(scene_classes):\n num_to_class = {}\n for count, scene_class in enumerate(scene_classes):\n count += 1 # start with 1 instead of 0\n name = scene_class.__name__\n console.print(f\"{count}: {name}\", style=\"logging.level.info\")\n num_to_class[count] = scene_class\n try:\n user_input = console.input(\n f\"[log.message] {constants.CHOOSE_NUMBER_MESSAGE} [/log.message]\"\n )\n return [\n num_to_class[int(num_str)]\n for num_str in re.split(r\"\\s*,\\s*\", user_input.strip())\n ]\n except KeyError:\n logger.error(constants.INVALID_NUMBER_MESSAGE)\n sys.exit(2)\n except EOFError:\n sys.exit(1)\n\n\ndef get_scenes_to_render(scene_classes):\n if not scene_classes:\n logger.error(constants.NO_SCENE_MESSAGE)\n return []\n if file_writer_config[\"write_all\"]:\n return scene_classes\n result = []\n for scene_name in file_writer_config[\"scene_names\"]:\n found = False\n for scene_class in scene_classes:\n if scene_class.__name__ == scene_name:\n result.append(scene_class)\n found = True\n break\n if not found and (scene_name != \"\"):\n logger.error(constants.SCENE_NOT_FOUND_MESSAGE.format(scene_name))\n if result:\n return result\n return (\n [scene_classes[0]]\n if len(scene_classes) == 1\n else prompt_user_for_choice(scene_classes)\n )\n\n\ndef 
get_scene_classes_from_module(module):\n return [\n member[1]\n for member in inspect.getmembers(module, lambda x: is_child_scene(x, module))\n ]\n\n\ndef get_module(file_name):\n if file_name == \"-\":\n module = types.ModuleType(\"input_scenes\")\n logger.info(\n \"Enter the animation's code & end with an EOF (CTRL+D on Linux/Unix, CTRL+Z on Windows):\"\n )\n code = sys.stdin.read()\n if not code.startswith(\"from manim import\"):\n logger.warn(\n \"Didn't find an import statement for Manim. Importing automatically...\"\n )\n code = \"from manim import *\\n\" + code\n logger.info(\"Rendering animation from typed code...\")\n try:\n exec(code, module.__dict__)\n return module\n except Exception as e:\n logger.error(f\"Failed to render scene: {str(e)}\")\n sys.exit(2)\n else:\n if os.path.exists(file_name):\n if file_name[-3:] != \".py\":\n raise Exception(f\"{file_name} is not a valid Manim python script.\")\n module_name = file_name[:-3].replace(os.sep, \".\").split(\".\")[-1]\n spec = importlib.util.spec_from_file_location(module_name, file_name)\n module = importlib.util.module_from_spec(spec)\n spec.loader.exec_module(module)\n return module\n else:\n raise FileNotFoundError(f\"{file_name} not found\")\n\n\ndef main():\n module = get_module(file_writer_config[\"input_file\"])\n all_scene_classes = get_scene_classes_from_module(module)\n scene_classes_to_render = get_scenes_to_render(all_scene_classes)\n sound_on = file_writer_config[\"sound\"]\n for SceneClass in scene_classes_to_render:\n try:\n # By invoking, this renders the full scene\n scene = SceneClass()\n open_file_if_needed(scene.file_writer)\n if sound_on:\n play_finish_sound()\n except Exception:\n print(\"\\n\\n\")\n traceback.print_exc()\n print(\"\\n\\n\")\n if sound_on:\n play_error_sound()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "manim/__main__.py"}], "after_files": [{"content": "import inspect\nimport os\nimport platform\nimport subprocess as sp\nimport sys\nimport re\nimport traceback\nimport importlib.util\nimport types\n\nfrom .config import file_writer_config\nfrom .scene.scene import Scene\nfrom .utils.sounds import play_error_sound\nfrom .utils.sounds import play_finish_sound\nfrom . 
import constants\nfrom .logger import logger,console\n\n\ndef open_file_if_needed(file_writer):\n if file_writer_config[\"quiet\"]:\n curr_stdout = sys.stdout\n sys.stdout = open(os.devnull, \"w\")\n\n open_file = any([\n file_writer_config[\"preview\"],\n file_writer_config[\"show_file_in_finder\"]\n ])\n if open_file:\n current_os = platform.system()\n file_paths = []\n\n if file_writer_config[\"save_last_frame\"]:\n file_paths.append(file_writer.get_image_file_path())\n if file_writer_config[\"write_to_movie\"]:\n file_paths.append(file_writer.get_movie_file_path())\n\n for file_path in file_paths:\n if current_os == \"Windows\":\n os.startfile(os.path.dirname(file_path))\n else:\n commands = []\n if current_os == \"Linux\":\n commands.append(\"xdg-open\")\n elif current_os.startswith(\"CYGWIN\"):\n commands.append(\"cygstart\")\n else: # Assume macOS\n commands.append(\"open\")\n\n if file_writer_config[\"show_file_in_finder\"]:\n commands.append(\"-R\")\n\n commands.append(file_path)\n\n # commands.append(\"-g\")\n FNULL = open(os.devnull, 'w')\n sp.call(commands, stdout=FNULL, stderr=sp.STDOUT)\n FNULL.close()\n\n if file_writer_config[\"quiet\"]:\n sys.stdout.close()\n sys.stdout = curr_stdout\n\n\ndef is_child_scene(obj, module):\n return (inspect.isclass(obj)\n and issubclass(obj, Scene)\n and obj != Scene\n and obj.__module__.startswith(module.__name__))\n\n\ndef prompt_user_for_choice(scene_classes):\n num_to_class = {}\n for count, scene_class in enumerate(scene_classes):\n count += 1 # start with 1 instead of 0\n name = scene_class.__name__\n console.print(f\"{count}: {name}\", style=\"logging.level.info\")\n num_to_class[count] = scene_class\n try:\n user_input = console.input(f\"[log.message] {constants.CHOOSE_NUMBER_MESSAGE} [/log.message]\")\n return [num_to_class[int(num_str)]\n for num_str in re.split(r\"\\s*,\\s*\", user_input.strip())]\n except KeyError:\n logger.error(constants.INVALID_NUMBER_MESSAGE)\n sys.exit(2)\n except EOFError:\n sys.exit(1)\n\n\ndef get_scenes_to_render(scene_classes):\n if not scene_classes:\n logger.error(constants.NO_SCENE_MESSAGE)\n return []\n if file_writer_config[\"write_all\"]:\n return scene_classes\n result = []\n for scene_name in file_writer_config[\"scene_names\"]:\n found = False\n for scene_class in scene_classes:\n if scene_class.__name__ == scene_name:\n result.append(scene_class)\n found = True\n break\n if not found and (scene_name != \"\"):\n logger.error(\n constants.SCENE_NOT_FOUND_MESSAGE.format(\n scene_name\n )\n )\n if result:\n return result\n return [scene_classes[0]] if len(scene_classes) == 1 else prompt_user_for_choice(scene_classes)\n\n\ndef get_scene_classes_from_module(module):\n return [\n member[1]\n for member in inspect.getmembers(\n module,\n lambda x: is_child_scene(x, module)\n )\n ]\n\n\ndef get_module(file_name):\n if file_name == \"-\":\n module = types.ModuleType(\"input_scenes\")\n logger.info(\"Enter the animation's code & end with an EOF (CTRL+D on Linux/Unix, CTRL+Z on Windows):\")\n code = sys.stdin.read()\n if not code.startswith(\"from manim import\"):\n logger.warn(\"Didn't find an import statement for Manim. 
Importing automatically...\")\n code=\"from manim import *\\n\"+code\n logger.info(\"Rendering animation from typed code...\")\n try:\n exec(code, module.__dict__)\n return module\n except Exception as e:\n logger.error(f\"Failed to render scene: {str(e)}\")\n sys.exit(2)\n else:\n if os.path.exists(file_name):\n if file_name[-3:] != \".py\":\n raise Exception(f\"{file_name} is not a valid Manim python script.\")\n module_name = file_name[:-3].replace(os.sep, '.').split('.')[-1]\n spec = importlib.util.spec_from_file_location(module_name, file_name)\n module = importlib.util.module_from_spec(spec)\n spec.loader.exec_module(module)\n return module\n else:\n raise FileNotFoundError(f'{file_name} not found')\n\n\ndef main():\n module = get_module(file_writer_config[\"input_file\"])\n all_scene_classes = get_scene_classes_from_module(module)\n scene_classes_to_render = get_scenes_to_render(all_scene_classes)\n sound_on = file_writer_config[\"sound\"]\n for SceneClass in scene_classes_to_render:\n try:\n # By invoking, this renders the full scene\n scene = SceneClass()\n open_file_if_needed(scene.file_writer)\n if sound_on:\n play_finish_sound()\n except Exception:\n print(\"\\n\\n\")\n traceback.print_exc()\n print(\"\\n\\n\")\n if sound_on:\n play_error_sound()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "manim/__main__.py"}]}
1,934
99
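The manim fix above is a one-line change: on Windows, `-f` must reveal the rendered file's directory (`os.startfile(os.path.dirname(file_path))`) rather than the file itself, which Windows hands to the default video player. A minimal cross-platform sketch of the corrected behaviour — the helper name is illustrative, not manim API:

```python
import os
import platform
import subprocess as sp


def show_file_in_explorer(file_path: str) -> None:
    """Reveal `file_path` in the OS file browser instead of playing it."""
    current_os = platform.system()
    if current_os == "Windows":
        # Opening the *directory* launches File Explorer; opening the
        # file itself would start the default video player (the bug).
        os.startfile(os.path.dirname(file_path))
    elif current_os == "Linux":
        sp.call(["xdg-open", os.path.dirname(file_path)])
    else:  # assume macOS; "-R" reveals the file in Finder
        sp.call(["open", "-R", file_path])
```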
gh_patches_debug_8898
rasdani/github-patches
git_diff
speechbrain__speechbrain-71
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Learning rate printing Now it will only print 0.00. Maybe we should print it with scientific notation. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `speechbrain/utils/train_logger.py` Content: ``` 1 import logging 2 from speechbrain.utils.edit_distance import wer_summary 3 4 logger = logging.getLogger(__name__) 5 6 7 class TrainLogger: 8 """Abstract class defining an interface for training loggers.""" 9 10 def log_stats( 11 self, 12 stats_meta, 13 train_stats=None, 14 valid_stats=None, 15 test_stats=None, 16 verbose=False, 17 ): 18 """Log the stats for one epoch. 19 20 Arguments 21 --------- 22 stats_meta : dict of str:scalar pairs 23 Meta information about the stats (e.g. epoch, learning-rate, etc.) 24 train_stats : dict of str:list pairs 25 Each loss type is represented with a str : list pair including 26 all the values for the training pass. 27 valid_stats : dict of str:list pairs 28 Each loss type is represented with a str : list pair including 29 all the values for the validation pass. 30 test_stats : dict of str:list pairs 31 Each loss type is represented with a str : list pair including 32 all the values for the test pass. 33 verbose : bool 34 Whether to also put logging information to the standard logger. 35 """ 36 raise NotImplementedError 37 38 39 class FileTrainLogger(TrainLogger): 40 """Text logger of training information 41 42 Arguments 43 --------- 44 save_file : str 45 The file to use for logging train information. 46 summary_fns : dict of str:function pairs 47 Each summary function should take a list produced as output 48 from a training/validation pass and summarize it to a single scalar. 49 """ 50 51 def __init__(self, save_file, summary_fns): 52 self.save_file = save_file 53 self.summary_fns = summary_fns 54 55 def _item_to_string(self, key, value, dataset=None): 56 """Convert one item to string, handling floats""" 57 if isinstance(value, float): 58 value = f"{value:.2f}" 59 if dataset is not None: 60 key = f"{dataset} {key}" 61 return f"{key}: {value}" 62 63 def _stats_to_string(self, stats, dataset=None): 64 """Convert all stats to a single string summary""" 65 return ", ".join( 66 [self._item_to_string(k, v, dataset) for k, v in stats.items()] 67 ) 68 69 def log_stats( 70 self, 71 stats_meta, 72 train_stats=None, 73 valid_stats=None, 74 test_stats=None, 75 verbose=True, 76 ): 77 """See TrainLogger.log_stats()""" 78 string_summary = self._stats_to_string(stats_meta) 79 for dataset, stats in [ 80 ("train", train_stats), 81 ("valid", valid_stats), 82 ("test", test_stats), 83 ]: 84 if stats is None: 85 continue 86 summary = {} 87 for stat, value_list in stats.items(): 88 summary[stat] = self.summary_fns[stat](value_list) 89 string_summary += " - " + self._stats_to_string(summary, dataset) 90 91 with open(self.save_file, "a") as fout: 92 print(string_summary, file=fout) 93 if verbose: 94 logger.info(string_summary) 95 96 97 class TensorboardLogger(TrainLogger): 98 """Logs training information in the format required by Tensorboard. 99 100 Arguments 101 --------- 102 save_dir : str 103 A directory for storing all the relevant logs 104 105 Raises 106 ------ 107 ImportError if Tensorboard is not installed. 
108 """ 109 110 def __init__(self, save_dir): 111 self.save_dir = save_dir 112 113 # Raises ImportError if TensorBoard is not installed 114 from torch.utils.tensorboard import SummaryWriter 115 116 self.writer = SummaryWriter(self.save_dir) 117 self.global_step = {"train": {}, "valid": {}, "meta": 0} 118 119 def log_stats( 120 self, 121 stats_meta, 122 train_stats=None, 123 valid_stats=None, 124 test_stats=None, 125 verbose=False, 126 ): 127 """See TrainLogger.log_stats()""" 128 self.global_step["meta"] += 1 129 for name, value in stats_meta.items(): 130 self.writer.add_scalar(name, value, self.global_step["meta"]) 131 132 for dataset, stats in [ 133 ("train", train_stats), 134 ("valid", valid_stats), 135 ("test", test_stats), 136 ]: 137 if stats is None: 138 continue 139 for stat, value_list in stats.items(): 140 if stat not in self.global_step[dataset]: 141 self.global_step[dataset][stat] = 0 142 tag = f"{stat}/{dataset}" 143 for value in value_list: 144 new_global_step = self.global_step[dataset][stat] + 1 145 self.writer.add_scalar(tag, value, new_global_step) 146 self.global_step[dataset][stat] = new_global_step 147 148 149 def summarize_average(stat_list): 150 return float(sum(stat_list) / len(stat_list)) 151 152 153 def summarize_error_rate(stat_list): 154 summary = wer_summary(stat_list) 155 return summary["WER"] 156 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/speechbrain/utils/train_logger.py b/speechbrain/utils/train_logger.py --- a/speechbrain/utils/train_logger.py +++ b/speechbrain/utils/train_logger.py @@ -54,8 +54,10 @@ def _item_to_string(self, key, value, dataset=None): """Convert one item to string, handling floats""" - if isinstance(value, float): + if isinstance(value, float) and 0.01 < value < 100.0: value = f"{value:.2f}" + elif isinstance(value, float): + value = f"{value:.2e}" if dataset is not None: key = f"{dataset} {key}" return f"{key}: {value}"
{"golden_diff": "diff --git a/speechbrain/utils/train_logger.py b/speechbrain/utils/train_logger.py\n--- a/speechbrain/utils/train_logger.py\n+++ b/speechbrain/utils/train_logger.py\n@@ -54,8 +54,10 @@\n \n def _item_to_string(self, key, value, dataset=None):\n \"\"\"Convert one item to string, handling floats\"\"\"\n- if isinstance(value, float):\n+ if isinstance(value, float) and 0.01 < value < 100.0:\n value = f\"{value:.2f}\"\n+ elif isinstance(value, float):\n+ value = f\"{value:.2e}\"\n if dataset is not None:\n key = f\"{dataset} {key}\"\n return f\"{key}: {value}\"\n", "issue": "Learning rate printing\nNow it will only print 0.00. Maybe we should print it with scientific notation.\n", "before_files": [{"content": "import logging\nfrom speechbrain.utils.edit_distance import wer_summary\n\nlogger = logging.getLogger(__name__)\n\n\nclass TrainLogger:\n \"\"\"Abstract class defining an interface for training loggers.\"\"\"\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=False,\n ):\n \"\"\"Log the stats for one epoch.\n\n Arguments\n ---------\n stats_meta : dict of str:scalar pairs\n Meta information about the stats (e.g. epoch, learning-rate, etc.)\n train_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the training pass.\n valid_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the validation pass.\n test_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the test pass.\n verbose : bool\n Whether to also put logging information to the standard logger.\n \"\"\"\n raise NotImplementedError\n\n\nclass FileTrainLogger(TrainLogger):\n \"\"\"Text logger of training information\n\n Arguments\n ---------\n save_file : str\n The file to use for logging train information.\n summary_fns : dict of str:function pairs\n Each summary function should take a list produced as output\n from a training/validation pass and summarize it to a single scalar.\n \"\"\"\n\n def __init__(self, save_file, summary_fns):\n self.save_file = save_file\n self.summary_fns = summary_fns\n\n def _item_to_string(self, key, value, dataset=None):\n \"\"\"Convert one item to string, handling floats\"\"\"\n if isinstance(value, float):\n value = f\"{value:.2f}\"\n if dataset is not None:\n key = f\"{dataset} {key}\"\n return f\"{key}: {value}\"\n\n def _stats_to_string(self, stats, dataset=None):\n \"\"\"Convert all stats to a single string summary\"\"\"\n return \", \".join(\n [self._item_to_string(k, v, dataset) for k, v in stats.items()]\n )\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=True,\n ):\n \"\"\"See TrainLogger.log_stats()\"\"\"\n string_summary = self._stats_to_string(stats_meta)\n for dataset, stats in [\n (\"train\", train_stats),\n (\"valid\", valid_stats),\n (\"test\", test_stats),\n ]:\n if stats is None:\n continue\n summary = {}\n for stat, value_list in stats.items():\n summary[stat] = self.summary_fns[stat](value_list)\n string_summary += \" - \" + self._stats_to_string(summary, dataset)\n\n with open(self.save_file, \"a\") as fout:\n print(string_summary, file=fout)\n if verbose:\n logger.info(string_summary)\n\n\nclass TensorboardLogger(TrainLogger):\n \"\"\"Logs training information in the format required by Tensorboard.\n\n Arguments\n ---------\n save_dir : str\n A directory 
for storing all the relevant logs\n\n Raises\n ------\n ImportError if Tensorboard is not installed.\n \"\"\"\n\n def __init__(self, save_dir):\n self.save_dir = save_dir\n\n # Raises ImportError if TensorBoard is not installed\n from torch.utils.tensorboard import SummaryWriter\n\n self.writer = SummaryWriter(self.save_dir)\n self.global_step = {\"train\": {}, \"valid\": {}, \"meta\": 0}\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=False,\n ):\n \"\"\"See TrainLogger.log_stats()\"\"\"\n self.global_step[\"meta\"] += 1\n for name, value in stats_meta.items():\n self.writer.add_scalar(name, value, self.global_step[\"meta\"])\n\n for dataset, stats in [\n (\"train\", train_stats),\n (\"valid\", valid_stats),\n (\"test\", test_stats),\n ]:\n if stats is None:\n continue\n for stat, value_list in stats.items():\n if stat not in self.global_step[dataset]:\n self.global_step[dataset][stat] = 0\n tag = f\"{stat}/{dataset}\"\n for value in value_list:\n new_global_step = self.global_step[dataset][stat] + 1\n self.writer.add_scalar(tag, value, new_global_step)\n self.global_step[dataset][stat] = new_global_step\n\n\ndef summarize_average(stat_list):\n return float(sum(stat_list) / len(stat_list))\n\n\ndef summarize_error_rate(stat_list):\n summary = wer_summary(stat_list)\n return summary[\"WER\"]\n", "path": "speechbrain/utils/train_logger.py"}], "after_files": [{"content": "import logging\nfrom speechbrain.utils.edit_distance import wer_summary\n\nlogger = logging.getLogger(__name__)\n\n\nclass TrainLogger:\n \"\"\"Abstract class defining an interface for training loggers.\"\"\"\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=False,\n ):\n \"\"\"Log the stats for one epoch.\n\n Arguments\n ---------\n stats_meta : dict of str:scalar pairs\n Meta information about the stats (e.g. 
epoch, learning-rate, etc.)\n train_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the training pass.\n valid_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the validation pass.\n test_stats : dict of str:list pairs\n Each loss type is represented with a str : list pair including\n all the values for the test pass.\n verbose : bool\n Whether to also put logging information to the standard logger.\n \"\"\"\n raise NotImplementedError\n\n\nclass FileTrainLogger(TrainLogger):\n \"\"\"Text logger of training information\n\n Arguments\n ---------\n save_file : str\n The file to use for logging train information.\n summary_fns : dict of str:function pairs\n Each summary function should take a list produced as output\n from a training/validation pass and summarize it to a single scalar.\n \"\"\"\n\n def __init__(self, save_file, summary_fns):\n self.save_file = save_file\n self.summary_fns = summary_fns\n\n def _item_to_string(self, key, value, dataset=None):\n \"\"\"Convert one item to string, handling floats\"\"\"\n if isinstance(value, float) and 0.01 < value < 100.0:\n value = f\"{value:.2f}\"\n elif isinstance(value, float):\n value = f\"{value:.2e}\"\n if dataset is not None:\n key = f\"{dataset} {key}\"\n return f\"{key}: {value}\"\n\n def _stats_to_string(self, stats, dataset=None):\n \"\"\"Convert all stats to a single string summary\"\"\"\n return \", \".join(\n [self._item_to_string(k, v, dataset) for k, v in stats.items()]\n )\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=True,\n ):\n \"\"\"See TrainLogger.log_stats()\"\"\"\n string_summary = self._stats_to_string(stats_meta)\n for dataset, stats in [\n (\"train\", train_stats),\n (\"valid\", valid_stats),\n (\"test\", test_stats),\n ]:\n if stats is None:\n continue\n summary = {}\n for stat, value_list in stats.items():\n summary[stat] = self.summary_fns[stat](value_list)\n string_summary += \" - \" + self._stats_to_string(summary, dataset)\n\n with open(self.save_file, \"a\") as fout:\n print(string_summary, file=fout)\n if verbose:\n logger.info(string_summary)\n\n\nclass TensorboardLogger(TrainLogger):\n \"\"\"Logs training information in the format required by Tensorboard.\n\n Arguments\n ---------\n save_dir : str\n A directory for storing all the relevant logs\n\n Raises\n ------\n ImportError if Tensorboard is not installed.\n \"\"\"\n\n def __init__(self, save_dir):\n self.save_dir = save_dir\n\n # Raises ImportError if TensorBoard is not installed\n from torch.utils.tensorboard import SummaryWriter\n\n self.writer = SummaryWriter(self.save_dir)\n self.global_step = {\"train\": {}, \"valid\": {}, \"meta\": 0}\n\n def log_stats(\n self,\n stats_meta,\n train_stats=None,\n valid_stats=None,\n test_stats=None,\n verbose=False,\n ):\n \"\"\"See TrainLogger.log_stats()\"\"\"\n self.global_step[\"meta\"] += 1\n for name, value in stats_meta.items():\n self.writer.add_scalar(name, value, self.global_step[\"meta\"])\n\n for dataset, stats in [\n (\"train\", train_stats),\n (\"valid\", valid_stats),\n (\"test\", test_stats),\n ]:\n if stats is None:\n continue\n for stat, value_list in stats.items():\n if stat not in self.global_step[dataset]:\n self.global_step[dataset][stat] = 0\n tag = f\"{stat}/{dataset}\"\n for value in value_list:\n new_global_step = self.global_step[dataset][stat] + 1\n self.writer.add_scalar(tag, value, 
new_global_step)\n self.global_step[dataset][stat] = new_global_step\n\n\ndef summarize_average(stat_list):\n return float(sum(stat_list) / len(stat_list))\n\n\ndef summarize_error_rate(stat_list):\n summary = wer_summary(stat_list)\n return summary[\"WER\"]\n", "path": "speechbrain/utils/train_logger.py"}]}
1,687
167
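The speechbrain patch above keeps two-decimal formatting only for floats in a readable range and falls back to scientific notation otherwise, so a learning rate of 3e-4 no longer prints as `0.00`. A standalone sketch of the same rule:

```python
def format_stat(value):
    """Two decimals in a readable range, scientific notation otherwise."""
    if isinstance(value, float) and 0.01 < value < 100.0:
        return f"{value:.2f}"
    if isinstance(value, float):
        return f"{value:.2e}"
    return str(value)


assert format_stat(3.14159) == "3.14"
assert format_stat(0.0003) == "3.00e-04"  # previously printed as "0.00"
```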
gh_patches_debug_5285
rasdani/github-patches
git_diff
freedomofpress__securedrop-7140
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Release SecureDrop 2.8.0 This is a tracking issue for the release of SecureDrop 2.8.0 Tentatively scheduled as follows: **Pre-release announcement:** 2023-03-05 **Release date:** 2024-03-12 **Release manager:** @zenmonkeykstop **Deputy release manager:** @legoktm **Localization manager:** @cfm **Communications manager:** @eloquence _SecureDrop maintainers and testers:_ As you QA 2.8.0, please report back your testing results as comments on this ticket. File GitHub issues for any problems found, tag them "QA: Release". Test debian packages will be posted on https://apt-test.freedom.press signed with [the test key](https://gist.githubusercontent.com/conorsch/ec4008b111bc3142fca522693f3cce7e/raw/2968621e8ad92db4505a31fcc5776422d7d26729/apt-test%2520apt%2520pubkey). # [QA Matrix for 2.8.0](https://docs.google.com/spreadsheets/d/1hcSrgbid03so0tQz3zfwvMaWJ8x7OOZsCfEz1I_PjAE/edit#gid=96348658) # [Test Plan for 2.8.0](https://github.com/freedomofpress/securedrop/wiki/2.8.0-Test-Plan) # [Tails-only test plan for 2.8.0-rc2](https://github.com/freedomofpress/securedrop/issues/7121#issuecomment-1988954749) (complete if you've already tested 2.8.0-rc1, there are no server changes in rc2) # Prepare release candidate (2.8.0~rc1) - [ ] Link to latest version of Tails, including release candidates, to test against during QA - [ ] Tails 5 - [ ] Tails 6 - [x] Prepare 2.8.0~rc1 release changelog - [x] Branch off release/2.8.0 from develop - [x] Prepare 2.8.0 - [ ] Build debs, preserving build log, and put up `2.8.0~rc1` on test apt server - [ ] Commit build log. # Prepare release candidate (2.8.0~rc2) - [ ] Link to latest version of Tails, including release candidates, to test against during QA - [x] Tails 5 - [x] Tails 6 - [x] Prepare 2.8.0~rc2 release changelog - [x] Branch off release/2.8.0 from develop - [x] Prepare 2.8.0-rc2 - [ ] ~Build debs, preserving build log, and put up `2.8.0~rc1` on test apt server~ skipped, as changes are Tails-only. - [ ] ~Commit build log.~ After each test, please update the QA matrix and post details for Basic Server Testing, Application Acceptance Testing and release-specific testing below in comments to this ticket. # Final release - [ ] ~Ensure builder in release branch is updated and/or update builder image~ (no longer in use) - [x] Push signed tag - [x] Pre-Flight: Test updater logic in Tails (apt-qa tracks the `release` branch in the LFS repo) - [x] Build final Debian packages(and preserve build log) - [x] Commit package build log to https://github.com/freedomofpress/build-logs - [x] Pre-Flight: Test that install and upgrade from 2.7.0 to 2.8.0 works w/ prod repo debs (apt-qa.freedom.press polls the `release` branch in the LFS repo for the debs) - [ ] Flip apt QA server to prod status (merge to `main` in the LFS repo) - [ ] Merge Docs branch changes to ``main`` and verify new docs build in securedrop-docs repo - [ ] Prepare release messaging # Post release - [ ] Create GitHub release object - [ ] Once release object is created, update versions in `securedrop-docs` and Wagtail - [ ] Verify new docs show up on https://docs.securedrop.org - [ ] Publish announcements - [ ] Merge changelog back to `develop` - [ ] Update roadmap wiki page: https://github.com/freedomofpress/securedrop/wiki/Development-Roadmap --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `securedrop/version.py` Content: ``` 1 __version__ = "2.8.0~rc1" 2 ``` Path: `securedrop/setup.py` Content: ``` 1 import setuptools 2 3 long_description = "The SecureDrop whistleblower platform." 4 5 setuptools.setup( 6 name="securedrop-app-code", 7 version="2.8.0~rc1", 8 author="Freedom of the Press Foundation", 9 author_email="[email protected]", 10 description="SecureDrop Server", 11 long_description=long_description, 12 long_description_content_type="text/markdown", 13 license="AGPLv3+", 14 python_requires=">=3.8", 15 url="https://github.com/freedomofpress/securedrop", 16 classifiers=[ 17 "Development Status :: 5 - Stable", 18 "Programming Language :: Python :: 3", 19 "Topic :: Software Development :: Libraries :: Python Modules", 20 "License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)", 21 "Intended Audience :: Developers", 22 "Operating System :: OS Independent", 23 ], 24 ) 25 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/securedrop/setup.py b/securedrop/setup.py --- a/securedrop/setup.py +++ b/securedrop/setup.py @@ -4,7 +4,7 @@ setuptools.setup( name="securedrop-app-code", - version="2.8.0~rc1", + version="2.9.0~rc1", author="Freedom of the Press Foundation", author_email="[email protected]", description="SecureDrop Server", diff --git a/securedrop/version.py b/securedrop/version.py --- a/securedrop/version.py +++ b/securedrop/version.py @@ -1 +1 @@ -__version__ = "2.8.0~rc1" +__version__ = "2.9.0~rc1"
{"golden_diff": "diff --git a/securedrop/setup.py b/securedrop/setup.py\n--- a/securedrop/setup.py\n+++ b/securedrop/setup.py\n@@ -4,7 +4,7 @@\n \n setuptools.setup(\n name=\"securedrop-app-code\",\n- version=\"2.8.0~rc1\",\n+ version=\"2.9.0~rc1\",\n author=\"Freedom of the Press Foundation\",\n author_email=\"[email protected]\",\n description=\"SecureDrop Server\",\ndiff --git a/securedrop/version.py b/securedrop/version.py\n--- a/securedrop/version.py\n+++ b/securedrop/version.py\n@@ -1 +1 @@\n-__version__ = \"2.8.0~rc1\"\n+__version__ = \"2.9.0~rc1\"\n", "issue": "Release SecureDrop 2.8.0\nThis is a tracking issue for the release of SecureDrop 2.8.0\r\n\r\nTentatively scheduled as follows:\r\n\r\n**Pre-release announcement:** 2023-03-05\r\n**Release date:** 2024-03-12\r\n\r\n**Release manager:** @zenmonkeykstop \r\n**Deputy release manager:** @legoktm \r\n**Localization manager:** @cfm\r\n**Communications manager:** @eloquence\r\n\r\n_SecureDrop maintainers and testers:_ As you QA 2.8.0, please report back your testing results as comments on this ticket. File GitHub issues for any problems found, tag them \"QA: Release\".\r\n\r\nTest debian packages will be posted on https://apt-test.freedom.press signed with [the test key](https://gist.githubusercontent.com/conorsch/ec4008b111bc3142fca522693f3cce7e/raw/2968621e8ad92db4505a31fcc5776422d7d26729/apt-test%2520apt%2520pubkey).\r\n\r\n# [QA Matrix for 2.8.0](https://docs.google.com/spreadsheets/d/1hcSrgbid03so0tQz3zfwvMaWJ8x7OOZsCfEz1I_PjAE/edit#gid=96348658)\r\n# [Test Plan for 2.8.0](https://github.com/freedomofpress/securedrop/wiki/2.8.0-Test-Plan)\r\n# [Tails-only test plan for 2.8.0-rc2](https://github.com/freedomofpress/securedrop/issues/7121#issuecomment-1988954749)\r\n(complete if you've already tested 2.8.0-rc1, there are no server changes in rc2)\r\n\r\n# Prepare release candidate (2.8.0~rc1)\r\n- [ ] Link to latest version of Tails, including release candidates, to test against during QA\r\n - [ ] Tails 5 \r\n - [ ] Tails 6 \r\n- [x] Prepare 2.8.0~rc1 release changelog\r\n- [x] Branch off release/2.8.0 from develop\r\n- [x] Prepare 2.8.0\r\n- [ ] Build debs, preserving build log, and put up `2.8.0~rc1` on test apt server\r\n- [ ] Commit build log.\r\n\r\n# Prepare release candidate (2.8.0~rc2)\r\n- [ ] Link to latest version of Tails, including release candidates, to test against during QA\r\n - [x] Tails 5 \r\n - [x] Tails 6 \r\n- [x] Prepare 2.8.0~rc2 release changelog\r\n- [x] Branch off release/2.8.0 from develop\r\n- [x] Prepare 2.8.0-rc2\r\n- [ ] ~Build debs, preserving build log, and put up `2.8.0~rc1` on test apt server~ skipped, as changes are Tails-only.\r\n- [ ] ~Commit build log.~\r\n\r\n\r\nAfter each test, please update the QA matrix and post details for Basic Server Testing, Application Acceptance Testing and release-specific testing below in comments to this ticket.\r\n\r\n# Final release\r\n- [ ] ~Ensure builder in release branch is updated and/or update builder image~ (no longer in use)\r\n- [x] Push signed tag \r\n- [x] Pre-Flight: Test updater logic in Tails (apt-qa tracks the `release` branch in the LFS repo)\r\n- [x] Build final Debian packages(and preserve build log)\r\n- [x] Commit package build log to https://github.com/freedomofpress/build-logs\r\n- [x] Pre-Flight: Test that install and upgrade from 2.7.0 to 2.8.0 works w/ prod repo debs (apt-qa.freedom.press polls the `release` branch in the LFS repo for the debs)\r\n- [ ] Flip apt QA server to prod status (merge to `main` in the LFS repo)\r\n- [ ] 
Merge Docs branch changes to ``main`` and verify new docs build in securedrop-docs repo\r\n- [ ] Prepare release messaging\r\n\r\n# Post release\r\n- [ ] Create GitHub release object \r\n- [ ] Once release object is created, update versions in `securedrop-docs` and Wagtail\r\n- [ ] Verify new docs show up on https://docs.securedrop.org\r\n- [ ] Publish announcements\r\n- [ ] Merge changelog back to `develop`\r\n- [ ] Update roadmap wiki page: https://github.com/freedomofpress/securedrop/wiki/Development-Roadmap\n", "before_files": [{"content": "__version__ = \"2.8.0~rc1\"\n", "path": "securedrop/version.py"}, {"content": "import setuptools\n\nlong_description = \"The SecureDrop whistleblower platform.\"\n\nsetuptools.setup(\n name=\"securedrop-app-code\",\n version=\"2.8.0~rc1\",\n author=\"Freedom of the Press Foundation\",\n author_email=\"[email protected]\",\n description=\"SecureDrop Server\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n license=\"AGPLv3+\",\n python_requires=\">=3.8\",\n url=\"https://github.com/freedomofpress/securedrop\",\n classifiers=[\n \"Development Status :: 5 - Stable\",\n \"Programming Language :: Python :: 3\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n \"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)\",\n \"Intended Audience :: Developers\",\n \"Operating System :: OS Independent\",\n ],\n)\n", "path": "securedrop/setup.py"}], "after_files": [{"content": "__version__ = \"2.9.0~rc1\"\n", "path": "securedrop/version.py"}, {"content": "import setuptools\n\nlong_description = \"The SecureDrop whistleblower platform.\"\n\nsetuptools.setup(\n name=\"securedrop-app-code\",\n version=\"2.9.0~rc1\",\n author=\"Freedom of the Press Foundation\",\n author_email=\"[email protected]\",\n description=\"SecureDrop Server\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n license=\"AGPLv3+\",\n python_requires=\">=3.8\",\n url=\"https://github.com/freedomofpress/securedrop\",\n classifiers=[\n \"Development Status :: 5 - Stable\",\n \"Programming Language :: Python :: 3\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n \"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)\",\n \"Intended Audience :: Developers\",\n \"Operating System :: OS Independent\",\n ],\n)\n", "path": "securedrop/setup.py"}]}
1,591
175
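The securedrop diff above is a routine post-release version bump, but it must edit the same version string in two files (`securedrop/version.py` and `securedrop/setup.py`). A small, hypothetical consistency check — not part of the SecureDrop codebase — makes that invariant explicit:

```python
import re
from pathlib import Path


def read_versions(repo_root: Path) -> tuple[str, str]:
    """Return the version strings declared in version.py and setup.py."""
    version_py = (repo_root / "securedrop" / "version.py").read_text()
    setup_py = (repo_root / "securedrop" / "setup.py").read_text()
    v1 = re.search(r'__version__ = "([^"]+)"', version_py).group(1)
    v2 = re.search(r'version="([^"]+)"', setup_py).group(1)
    return v1, v2


# After the bump both should read "2.9.0~rc1":
# v1, v2 = read_versions(Path("."))
# assert v1 == v2
```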
gh_patches_debug_29242
rasdani/github-patches
git_diff
larq__larq-34
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- tf.sign(0) = 0 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `xquant/quantizers.py` Content: ``` 1 import tensorflow as tf 2 from xquant import utils 3 4 5 @utils.register_keras_custom_object 6 @tf.custom_gradient 7 def ste_sign(x): 8 r""" 9 Sign binarization function. 10 \\[q(x) = \mathrm{Sign}(x)\\] 11 12 The gradient is estimated using the Straight-Through Estimator. 13 \\[\frac{\partial q(x)}{\partial x} = x\\] 14 15 # Arguments 16 x: Input tensor. 17 18 # Returns 19 Binarized tensor. 20 21 # References 22 - [Binarized Neural Networks: Training Deep Neural Networks with Weights and 23 Activations Constrained to +1 or -1](http://arxiv.org/abs/1602.02830) 24 """ 25 26 def grad(dy): 27 return dy 28 29 return tf.sign(x), grad 30 31 32 @utils.register_keras_custom_object 33 @tf.custom_gradient 34 def approx_sign(x): 35 r""" 36 Sign binarization function. 37 \\[q(x) = \mathrm{Sign}(x)\\] 38 39 The gradient is estimated using the ApproxSign method. 40 \\[\frac{\partial q(x)}{\partial x} = (2 - 2 \left|x\right|))\\] 41 42 # Arguments 43 x: Input tensor. 44 45 # Returns 46 Binarized tensor. 47 48 # References 49 - [Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved 50 Representational Capability and Advanced 51 Training Algorithm](http://arxiv.org/abs/1808.00278) 52 """ 53 54 def grad(dy): 55 return (1 - tf.abs(x)) * 2 * dy 56 57 return tf.sign(x), grad 58 59 60 def serialize(initializer): 61 return tf.keras.utils.serialize_keras_object(initializer) 62 63 64 def deserialize(name, custom_objects=None): 65 return tf.keras.utils.deserialize_keras_object( 66 name, 67 module_objects=globals(), 68 custom_objects=custom_objects, 69 printable_module_name="quantization function", 70 ) 71 72 73 def get(identifier): 74 if identifier is None: 75 return None 76 if isinstance(identifier, str): 77 return deserialize(str(identifier)) 78 if callable(identifier): 79 return identifier 80 raise ValueError( 81 "Could not interpret quantization function identifier:", identifier 82 ) 83 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/xquant/quantizers.py b/xquant/quantizers.py --- a/xquant/quantizers.py +++ b/xquant/quantizers.py @@ -2,12 +2,22 @@ from xquant import utils +def sign(x): + """A sign function that will never be zero""" + return tf.sign(tf.sign(x) + 1e-10) + + @utils.register_keras_custom_object @tf.custom_gradient def ste_sign(x): r""" Sign binarization function. - \\[q(x) = \mathrm{Sign}(x)\\] + \\[ + q(x) = \begin{cases} + -1 & x < 0 \\\ + 1 & x \geq 0 + \end{cases} + \\] The gradient is estimated using the Straight-Through Estimator. \\[\frac{\partial q(x)}{\partial x} = x\\] @@ -26,7 +36,7 @@ def grad(dy): return dy - return tf.sign(x), grad + return sign(x), grad @utils.register_keras_custom_object @@ -34,7 +44,12 @@ def approx_sign(x): r""" Sign binarization function. - \\[q(x) = \mathrm{Sign}(x)\\] + \\[ + q(x) = \begin{cases} + -1 & x < 0 \\\ + 1 & x \geq 0 + \end{cases} + \\] The gradient is estimated using the ApproxSign method. \\[\frac{\partial q(x)}{\partial x} = (2 - 2 \left|x\right|))\\] @@ -54,7 +69,7 @@ def grad(dy): return (1 - tf.abs(x)) * 2 * dy - return tf.sign(x), grad + return sign(x), grad def serialize(initializer):
{"golden_diff": "diff --git a/xquant/quantizers.py b/xquant/quantizers.py\n--- a/xquant/quantizers.py\n+++ b/xquant/quantizers.py\n@@ -2,12 +2,22 @@\n from xquant import utils\n \n \n+def sign(x):\n+ \"\"\"A sign function that will never be zero\"\"\"\n+ return tf.sign(tf.sign(x) + 1e-10)\n+\n+\n @utils.register_keras_custom_object\n @tf.custom_gradient\n def ste_sign(x):\n r\"\"\"\n Sign binarization function.\n- \\\\[q(x) = \\mathrm{Sign}(x)\\\\]\n+ \\\\[\n+ q(x) = \\begin{cases}\n+ -1 & x < 0 \\\\\\\n+ 1 & x \\geq 0\n+ \\end{cases}\n+ \\\\]\n \n The gradient is estimated using the Straight-Through Estimator.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = x\\\\]\n@@ -26,7 +36,7 @@\n def grad(dy):\n return dy\n \n- return tf.sign(x), grad\n+ return sign(x), grad\n \n \n @utils.register_keras_custom_object\n@@ -34,7 +44,12 @@\n def approx_sign(x):\n r\"\"\"\n Sign binarization function.\n- \\\\[q(x) = \\mathrm{Sign}(x)\\\\]\n+ \\\\[\n+ q(x) = \\begin{cases}\n+ -1 & x < 0 \\\\\\\n+ 1 & x \\geq 0\n+ \\end{cases}\n+ \\\\]\n \n The gradient is estimated using the ApproxSign method.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = (2 - 2 \\left|x\\right|))\\\\]\n@@ -54,7 +69,7 @@\n def grad(dy):\n return (1 - tf.abs(x)) * 2 * dy\n \n- return tf.sign(x), grad\n+ return sign(x), grad\n \n \n def serialize(initializer):\n", "issue": "tf.sign(0) = 0\n\n", "before_files": [{"content": "import tensorflow as tf\nfrom xquant import utils\n\n\[email protected]_keras_custom_object\[email protected]_gradient\ndef ste_sign(x):\n r\"\"\"\n Sign binarization function.\n \\\\[q(x) = \\mathrm{Sign}(x)\\\\]\n\n The gradient is estimated using the Straight-Through Estimator.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = x\\\\]\n\n # Arguments\n x: Input tensor.\n\n # Returns\n Binarized tensor.\n\n # References\n - [Binarized Neural Networks: Training Deep Neural Networks with Weights and\n Activations Constrained to +1 or -1](http://arxiv.org/abs/1602.02830)\n \"\"\"\n\n def grad(dy):\n return dy\n\n return tf.sign(x), grad\n\n\[email protected]_keras_custom_object\[email protected]_gradient\ndef approx_sign(x):\n r\"\"\"\n Sign binarization function.\n \\\\[q(x) = \\mathrm{Sign}(x)\\\\]\n\n The gradient is estimated using the ApproxSign method.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = (2 - 2 \\left|x\\right|))\\\\]\n\n # Arguments\n x: Input tensor.\n\n # Returns\n Binarized tensor.\n\n # References\n - [Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved\n Representational Capability and Advanced\n Training Algorithm](http://arxiv.org/abs/1808.00278)\n \"\"\"\n\n def grad(dy):\n return (1 - tf.abs(x)) * 2 * dy\n\n return tf.sign(x), grad\n\n\ndef serialize(initializer):\n return tf.keras.utils.serialize_keras_object(initializer)\n\n\ndef deserialize(name, custom_objects=None):\n return tf.keras.utils.deserialize_keras_object(\n name,\n module_objects=globals(),\n custom_objects=custom_objects,\n printable_module_name=\"quantization function\",\n )\n\n\ndef get(identifier):\n if identifier is None:\n return None\n if isinstance(identifier, str):\n return deserialize(str(identifier))\n if callable(identifier):\n return identifier\n raise ValueError(\n \"Could not interpret quantization function identifier:\", identifier\n )\n", "path": "xquant/quantizers.py"}], "after_files": [{"content": "import tensorflow as tf\nfrom xquant import utils\n\n\ndef sign(x):\n \"\"\"A sign function that will never be zero\"\"\"\n return tf.sign(tf.sign(x) + 1e-10)\n\n\[email protected]_keras_custom_object\[email 
protected]_gradient\ndef ste_sign(x):\n r\"\"\"\n Sign binarization function.\n \\\\[\n q(x) = \\begin{cases}\n -1 & x < 0 \\\\\\\n 1 & x \\geq 0\n \\end{cases}\n \\\\]\n\n The gradient is estimated using the Straight-Through Estimator.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = x\\\\]\n\n # Arguments\n x: Input tensor.\n\n # Returns\n Binarized tensor.\n\n # References\n - [Binarized Neural Networks: Training Deep Neural Networks with Weights and\n Activations Constrained to +1 or -1](http://arxiv.org/abs/1602.02830)\n \"\"\"\n\n def grad(dy):\n return dy\n\n return sign(x), grad\n\n\[email protected]_keras_custom_object\[email protected]_gradient\ndef approx_sign(x):\n r\"\"\"\n Sign binarization function.\n \\\\[\n q(x) = \\begin{cases}\n -1 & x < 0 \\\\\\\n 1 & x \\geq 0\n \\end{cases}\n \\\\]\n\n The gradient is estimated using the ApproxSign method.\n \\\\[\\frac{\\partial q(x)}{\\partial x} = (2 - 2 \\left|x\\right|))\\\\]\n\n # Arguments\n x: Input tensor.\n\n # Returns\n Binarized tensor.\n\n # References\n - [Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved\n Representational Capability and Advanced\n Training Algorithm](http://arxiv.org/abs/1808.00278)\n \"\"\"\n\n def grad(dy):\n return (1 - tf.abs(x)) * 2 * dy\n\n return sign(x), grad\n\n\ndef serialize(initializer):\n return tf.keras.utils.serialize_keras_object(initializer)\n\n\ndef deserialize(name, custom_objects=None):\n return tf.keras.utils.deserialize_keras_object(\n name,\n module_objects=globals(),\n custom_objects=custom_objects,\n printable_module_name=\"quantization function\",\n )\n\n\ndef get(identifier):\n if identifier is None:\n return None\n if isinstance(identifier, str):\n return deserialize(str(identifier))\n if callable(identifier):\n return identifier\n raise ValueError(\n \"Could not interpret quantization function identifier:\", identifier\n )\n", "path": "xquant/quantizers.py"}]}
929
454
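
The root cause in the record above is that `tf.sign(0)` returns `0`, so a "binarized" value of exactly zero stays at zero instead of becoming ±1; the patch's `tf.sign(tf.sign(x) + 1e-10)` nudges the inner result by a tiny epsilon so the outer sign can never be zero. A minimal sketch of the same identity, with NumPy standing in for TensorFlow purely so it runs without a TF install (`nonzero_sign` is an illustrative name, not part of the patch):

```python
import numpy as np

def nonzero_sign(x):
    # np.sign maps 0.0 (and -0.0) to 0, just like tf.sign. Taking the sign,
    # adding a tiny positive epsilon, and taking the sign again pushes 0 to +1
    # while leaving strictly negative/positive inputs at -1/+1.
    return np.sign(np.sign(x) + 1e-10)

x = np.array([-2.0, -0.0, 0.0, 3.0])
print(np.sign(x))       # [-1. -0.  0.  1.]  zeros survive, breaking binarization
print(nonzero_sign(x))  # [-1.  1.  1.  1.]  every output is strictly +1 or -1
```

This matches the piecewise definition the patch adds to the docstrings: -1 for x < 0 and 1 for x >= 0.
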
gh_patches_debug_13023
rasdani/github-patches
git_diff
readthedocs__readthedocs.org-4833
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Redirect full path to default version I'd like to be able to create versionless links to the documentation, for use in error messages, code comments, etc. For example, a message like `see https://click.palletsprojects.com/windows for more information`. I don't want to use URLs with versions because I would have to remember to modify all instances of it before releasing a new version. Currently, only the root path redirects to the default version, other paths raise a 404. Instead, the path should be preserved and appended to the default version path on redirect. ``` Works: https://click.palletsprojects.com/ -> https://click.palletsprojects.com/en/7.x/ Doesn't work, 404: https://click.palletsprojects.com/windows -> https://click.palletsprojects.com/en/7.x/windows ``` I do not want to use the "latest" or "stable" versions because I would like the URLs that people land on and share to contain the actual version. I already do this with the transitional redirects I set up from `click.pocoo.org` to `click.palletsprojects.com`. A similar approach could probably be used to extend RTD's default redirect. ```nginx location ~ ^/dev(.*)$ { return 301 https://click.palletsprojects.com/en/master$1; } location ~ ^/(\d)(.*)$ { return 301 https://click.palletsprojects.com/en/$1.x$2; } location ~ ^/latest(.*)$ { return 301 https://click.palletsprojects.com/en/7.x$1; } location / { return 301 https://click.palletsprojects.com/en/7.x$request_uri; } ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `readthedocs/core/views/__init__.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 3 """ 4 Core views, including the main homepage, 5 6 documentation and header rendering, and server errors. 
7 """ 8 9 from __future__ import absolute_import 10 from __future__ import division 11 import os 12 import logging 13 14 from django.conf import settings 15 from django.http import HttpResponseRedirect, Http404, JsonResponse 16 from django.shortcuts import render, get_object_or_404, redirect 17 from django.views.decorators.csrf import csrf_exempt 18 from django.views.generic import TemplateView 19 20 from readthedocs.builds.models import Version 21 from readthedocs.core.utils import broadcast 22 from readthedocs.projects.models import Project, ImportedFile 23 from readthedocs.projects.tasks import remove_dir 24 from readthedocs.redirects.utils import get_redirect_response 25 26 log = logging.getLogger(__name__) 27 28 29 class NoProjectException(Exception): 30 pass 31 32 33 class HomepageView(TemplateView): 34 35 template_name = 'homepage.html' 36 37 def get_context_data(self, **kwargs): 38 """Add latest builds and featured projects.""" 39 context = super(HomepageView, self).get_context_data(**kwargs) 40 context['featured_list'] = Project.objects.filter(featured=True) 41 context['projects_count'] = Project.objects.count() 42 return context 43 44 45 class SupportView(TemplateView): 46 template_name = 'support.html' 47 48 def get_context_data(self, **kwargs): 49 context = super(SupportView, self).get_context_data(**kwargs) 50 support_email = getattr(settings, 'SUPPORT_EMAIL', None) 51 if not support_email: 52 support_email = 'support@{domain}'.format( 53 domain=getattr( 54 settings, 55 'PRODUCTION_DOMAIN', 56 'readthedocs.org', 57 ), 58 ) 59 60 context['support_email'] = support_email 61 return context 62 63 64 def random_page(request, project_slug=None): # pylint: disable=unused-argument 65 imported_file = ImportedFile.objects.order_by('?') 66 if project_slug: 67 imported_file = imported_file.filter(project__slug=project_slug) 68 imported_file = imported_file.first() 69 if imported_file is None: 70 raise Http404 71 url = imported_file.get_absolute_url() 72 return HttpResponseRedirect(url) 73 74 75 @csrf_exempt 76 def wipe_version(request, project_slug, version_slug): 77 version = get_object_or_404( 78 Version, 79 project__slug=project_slug, 80 slug=version_slug, 81 ) 82 # We need to check by ``for_admin_user`` here to allow members of the 83 # ``Admin`` team (which doesn't own the project) under the corporate site. 84 if version.project not in Project.objects.for_admin_user(user=request.user): 85 raise Http404('You must own this project to wipe it.') 86 87 if request.method == 'POST': 88 del_dirs = [ 89 os.path.join(version.project.doc_path, 'checkouts', version.slug), 90 os.path.join(version.project.doc_path, 'envs', version.slug), 91 os.path.join(version.project.doc_path, 'conda', version.slug), 92 ] 93 for del_dir in del_dirs: 94 broadcast(type='build', task=remove_dir, args=[del_dir]) 95 return redirect('project_version_list', project_slug) 96 return render( 97 request, 98 'wipe_version.html', 99 {'version': version, 'project': version.project}, 100 ) 101 102 103 def server_error_500(request, template_name='500.html'): 104 """A simple 500 handler so we get media.""" 105 r = render(request, template_name) 106 r.status_code = 500 107 return r 108 109 110 def server_error_404(request, exception=None, template_name='404.html'): # pylint: disable=unused-argument # noqa 111 """ 112 A simple 404 handler so we get media. 113 114 .. note:: 115 116 Marking exception as optional to make /404/ testing page to work. 
117 """ 118 response = get_redirect_response(request, path=request.get_full_path()) 119 if response: 120 return response 121 r = render(request, template_name) 122 r.status_code = 404 123 return r 124 125 126 def do_not_track(request): 127 dnt_header = request.META.get('HTTP_DNT') 128 129 # https://w3c.github.io/dnt/drafts/tracking-dnt.html#status-representation 130 return JsonResponse({ # pylint: disable=redundant-content-type-for-json-response 131 'policy': 'https://docs.readthedocs.io/en/latest/privacy-policy.html', 132 'same-party': [ 133 'readthedocs.org', 134 'readthedocs.com', 135 'readthedocs.io', # .org Documentation Sites 136 'readthedocs-hosted.com', # .com Documentation Sites 137 ], 138 'tracking': 'N' if dnt_header == '1' else 'T', 139 }, content_type='application/tracking-status+json') 140 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/readthedocs/core/views/__init__.py b/readthedocs/core/views/__init__.py --- a/readthedocs/core/views/__init__.py +++ b/readthedocs/core/views/__init__.py @@ -116,8 +116,16 @@ Marking exception as optional to make /404/ testing page to work. """ response = get_redirect_response(request, path=request.get_full_path()) + if response: - return response + if response.url == request.build_absolute_uri(): + # check that we do have a response and avoid infinite redirect + log.warning( + 'Infinite Redirect: FROM URL is the same than TO URL. url=%s', + response.url, + ) + else: + return response r = render(request, template_name) r.status_code = 404 return r
{"golden_diff": "diff --git a/readthedocs/core/views/__init__.py b/readthedocs/core/views/__init__.py\n--- a/readthedocs/core/views/__init__.py\n+++ b/readthedocs/core/views/__init__.py\n@@ -116,8 +116,16 @@\n Marking exception as optional to make /404/ testing page to work.\n \"\"\"\n response = get_redirect_response(request, path=request.get_full_path())\n+\n if response:\n- return response\n+ if response.url == request.build_absolute_uri():\n+ # check that we do have a response and avoid infinite redirect\n+ log.warning(\n+ 'Infinite Redirect: FROM URL is the same than TO URL. url=%s',\n+ response.url,\n+ )\n+ else:\n+ return response\n r = render(request, template_name)\n r.status_code = 404\n return r\n", "issue": "Redirect full path to default version\nI'd like to be able to create versionless links to the documentation, for use in error messages, code comments, etc. For example, a message like `see https://click.palletsprojects.com/windows for more information`. I don't want to use URLs with versions because I would have to remember to modify all instances of it before releasing a new version.\r\n\r\nCurrently, only the root path redirects to the default version, other paths raise a 404. Instead, the path should be preserved and appended to the default version path on redirect.\r\n\r\n```\r\nWorks:\r\nhttps://click.palletsprojects.com/ -> https://click.palletsprojects.com/en/7.x/\r\n\r\nDoesn't work, 404:\r\nhttps://click.palletsprojects.com/windows -> https://click.palletsprojects.com/en/7.x/windows\r\n```\r\n\r\nI do not want to use the \"latest\" or \"stable\" versions because I would like the URLs that people land on and share to contain the actual version.\r\n\r\nI already do this with the transitional redirects I set up from `click.pocoo.org` to `click.palletsprojects.com`. 
A similar approach could probably be used to extend RTD's default redirect.\r\n\r\n```nginx\r\nlocation ~ ^/dev(.*)$ {\r\n return 301 https://click.palletsprojects.com/en/master$1;\r\n}\r\n\r\nlocation ~ ^/(\\d)(.*)$ {\r\n return 301 https://click.palletsprojects.com/en/$1.x$2;\r\n}\r\n\r\nlocation ~ ^/latest(.*)$ {\r\n return 301 https://click.palletsprojects.com/en/7.x$1;\r\n}\r\n\r\nlocation / {\r\n return 301 https://click.palletsprojects.com/en/7.x$request_uri;\r\n}\r\n```\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"\nCore views, including the main homepage,\n\ndocumentation and header rendering, and server errors.\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nimport os\nimport logging\n\nfrom django.conf import settings\nfrom django.http import HttpResponseRedirect, Http404, JsonResponse\nfrom django.shortcuts import render, get_object_or_404, redirect\nfrom django.views.decorators.csrf import csrf_exempt\nfrom django.views.generic import TemplateView\n\nfrom readthedocs.builds.models import Version\nfrom readthedocs.core.utils import broadcast\nfrom readthedocs.projects.models import Project, ImportedFile\nfrom readthedocs.projects.tasks import remove_dir\nfrom readthedocs.redirects.utils import get_redirect_response\n\nlog = logging.getLogger(__name__)\n\n\nclass NoProjectException(Exception):\n pass\n\n\nclass HomepageView(TemplateView):\n\n template_name = 'homepage.html'\n\n def get_context_data(self, **kwargs):\n \"\"\"Add latest builds and featured projects.\"\"\"\n context = super(HomepageView, self).get_context_data(**kwargs)\n context['featured_list'] = Project.objects.filter(featured=True)\n context['projects_count'] = Project.objects.count()\n return context\n\n\nclass SupportView(TemplateView):\n template_name = 'support.html'\n\n def get_context_data(self, **kwargs):\n context = super(SupportView, self).get_context_data(**kwargs)\n support_email = getattr(settings, 'SUPPORT_EMAIL', None)\n if not support_email:\n support_email = 'support@{domain}'.format(\n domain=getattr(\n settings,\n 'PRODUCTION_DOMAIN',\n 'readthedocs.org',\n ),\n )\n\n context['support_email'] = support_email\n return context\n\n\ndef random_page(request, project_slug=None): # pylint: disable=unused-argument\n imported_file = ImportedFile.objects.order_by('?')\n if project_slug:\n imported_file = imported_file.filter(project__slug=project_slug)\n imported_file = imported_file.first()\n if imported_file is None:\n raise Http404\n url = imported_file.get_absolute_url()\n return HttpResponseRedirect(url)\n\n\n@csrf_exempt\ndef wipe_version(request, project_slug, version_slug):\n version = get_object_or_404(\n Version,\n project__slug=project_slug,\n slug=version_slug,\n )\n # We need to check by ``for_admin_user`` here to allow members of the\n # ``Admin`` team (which doesn't own the project) under the corporate site.\n if version.project not in Project.objects.for_admin_user(user=request.user):\n raise Http404('You must own this project to wipe it.')\n\n if request.method == 'POST':\n del_dirs = [\n os.path.join(version.project.doc_path, 'checkouts', version.slug),\n os.path.join(version.project.doc_path, 'envs', version.slug),\n os.path.join(version.project.doc_path, 'conda', version.slug),\n ]\n for del_dir in del_dirs:\n broadcast(type='build', task=remove_dir, args=[del_dir])\n return redirect('project_version_list', project_slug)\n return render(\n request,\n 'wipe_version.html',\n {'version': version, 'project': version.project},\n 
)\n\n\ndef server_error_500(request, template_name='500.html'):\n \"\"\"A simple 500 handler so we get media.\"\"\"\n r = render(request, template_name)\n r.status_code = 500\n return r\n\n\ndef server_error_404(request, exception=None, template_name='404.html'): # pylint: disable=unused-argument # noqa\n \"\"\"\n A simple 404 handler so we get media.\n\n .. note::\n\n Marking exception as optional to make /404/ testing page to work.\n \"\"\"\n response = get_redirect_response(request, path=request.get_full_path())\n if response:\n return response\n r = render(request, template_name)\n r.status_code = 404\n return r\n\n\ndef do_not_track(request):\n dnt_header = request.META.get('HTTP_DNT')\n\n # https://w3c.github.io/dnt/drafts/tracking-dnt.html#status-representation\n return JsonResponse({ # pylint: disable=redundant-content-type-for-json-response\n 'policy': 'https://docs.readthedocs.io/en/latest/privacy-policy.html',\n 'same-party': [\n 'readthedocs.org',\n 'readthedocs.com',\n 'readthedocs.io', # .org Documentation Sites\n 'readthedocs-hosted.com', # .com Documentation Sites\n ],\n 'tracking': 'N' if dnt_header == '1' else 'T',\n }, content_type='application/tracking-status+json')\n", "path": "readthedocs/core/views/__init__.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"\nCore views, including the main homepage,\n\ndocumentation and header rendering, and server errors.\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nimport os\nimport logging\n\nfrom django.conf import settings\nfrom django.http import HttpResponseRedirect, Http404, JsonResponse\nfrom django.shortcuts import render, get_object_or_404, redirect\nfrom django.views.decorators.csrf import csrf_exempt\nfrom django.views.generic import TemplateView\n\nfrom readthedocs.builds.models import Version\nfrom readthedocs.core.utils import broadcast\nfrom readthedocs.projects.models import Project, ImportedFile\nfrom readthedocs.projects.tasks import remove_dir\nfrom readthedocs.redirects.utils import get_redirect_response\n\nlog = logging.getLogger(__name__)\n\n\nclass NoProjectException(Exception):\n pass\n\n\nclass HomepageView(TemplateView):\n\n template_name = 'homepage.html'\n\n def get_context_data(self, **kwargs):\n \"\"\"Add latest builds and featured projects.\"\"\"\n context = super(HomepageView, self).get_context_data(**kwargs)\n context['featured_list'] = Project.objects.filter(featured=True)\n context['projects_count'] = Project.objects.count()\n return context\n\n\nclass SupportView(TemplateView):\n template_name = 'support.html'\n\n def get_context_data(self, **kwargs):\n context = super(SupportView, self).get_context_data(**kwargs)\n support_email = getattr(settings, 'SUPPORT_EMAIL', None)\n if not support_email:\n support_email = 'support@{domain}'.format(\n domain=getattr(\n settings,\n 'PRODUCTION_DOMAIN',\n 'readthedocs.org',\n ),\n )\n\n context['support_email'] = support_email\n return context\n\n\ndef random_page(request, project_slug=None): # pylint: disable=unused-argument\n imported_file = ImportedFile.objects.order_by('?')\n if project_slug:\n imported_file = imported_file.filter(project__slug=project_slug)\n imported_file = imported_file.first()\n if imported_file is None:\n raise Http404\n url = imported_file.get_absolute_url()\n return HttpResponseRedirect(url)\n\n\n@csrf_exempt\ndef wipe_version(request, project_slug, version_slug):\n version = get_object_or_404(\n Version,\n project__slug=project_slug,\n slug=version_slug,\n )\n # We need to 
check by ``for_admin_user`` here to allow members of the\n # ``Admin`` team (which doesn't own the project) under the corporate site.\n if version.project not in Project.objects.for_admin_user(user=request.user):\n raise Http404('You must own this project to wipe it.')\n\n if request.method == 'POST':\n del_dirs = [\n os.path.join(version.project.doc_path, 'checkouts', version.slug),\n os.path.join(version.project.doc_path, 'envs', version.slug),\n os.path.join(version.project.doc_path, 'conda', version.slug),\n ]\n for del_dir in del_dirs:\n broadcast(type='build', task=remove_dir, args=[del_dir])\n return redirect('project_version_list', project_slug)\n return render(\n request,\n 'wipe_version.html',\n {'version': version, 'project': version.project},\n )\n\n\ndef server_error_500(request, template_name='500.html'):\n \"\"\"A simple 500 handler so we get media.\"\"\"\n r = render(request, template_name)\n r.status_code = 500\n return r\n\n\ndef server_error_404(request, exception=None, template_name='404.html'): # pylint: disable=unused-argument # noqa\n \"\"\"\n A simple 404 handler so we get media.\n\n .. note::\n\n Marking exception as optional to make /404/ testing page to work.\n \"\"\"\n response = get_redirect_response(request, path=request.get_full_path())\n\n if response:\n if response.url == request.build_absolute_uri():\n # check that we do have a response and avoid infinite redirect\n log.warning(\n 'Infinite Redirect: FROM URL is the same than TO URL. url=%s',\n response.url,\n )\n else:\n return response\n r = render(request, template_name)\n r.status_code = 404\n return r\n\n\ndef do_not_track(request):\n dnt_header = request.META.get('HTTP_DNT')\n\n # https://w3c.github.io/dnt/drafts/tracking-dnt.html#status-representation\n return JsonResponse({ # pylint: disable=redundant-content-type-for-json-response\n 'policy': 'https://docs.readthedocs.io/en/latest/privacy-policy.html',\n 'same-party': [\n 'readthedocs.org',\n 'readthedocs.com',\n 'readthedocs.io', # .org Documentation Sites\n 'readthedocs-hosted.com', # .com Documentation Sites\n ],\n 'tracking': 'N' if dnt_header == '1' else 'T',\n }, content_type='application/tracking-status+json')\n", "path": "readthedocs/core/views/__init__.py"}]}
2004
198
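
Besides preserving the request path, the patch above adds a loop guard: when the redirect computed by the 404 handler points back at the URL that was just requested, following it would bounce the browser forever, so the handler logs a warning and falls through to the plain 404 page instead. A standalone sketch of that guard, with plain strings standing in for Django's `response.url` and `request.build_absolute_uri()` (`choose_redirect` is an illustrative name):

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger(__name__)

def choose_redirect(requested_url, redirect_url):
    """Return the redirect target, or None when no safe redirect exists."""
    if redirect_url is None:
        return None  # no redirect configured; render the 404 page
    if redirect_url == requested_url:
        # Following this would redirect the URL to itself, forever.
        log.warning(
            'Infinite Redirect: FROM URL is the same than TO URL. url=%s',
            redirect_url,
        )
        return None
    return redirect_url

# The path-preserving case the issue asks for:
assert choose_redirect(
    'https://click.palletsprojects.com/windows',
    'https://click.palletsprojects.com/en/7.x/windows',
) == 'https://click.palletsprojects.com/en/7.x/windows'
# The degenerate case the patch guards against:
assert choose_redirect('https://example.org/page', 'https://example.org/page') is None
```
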
gh_patches_debug_17349
rasdani/github-patches
git_diff
conan-io__conan-center-index-19060
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [request] fast-cdr/1.1.0 ### Package Name/Version fast-cdr/1.1.0 ### Changelog https://github.com/eProsima/Fast-CDR/releases/tag/v1.1.0 ### Context about the new update The Conan Center Bot detects the updatable recipe in #3470. Open branch [qchateau/conan-center-index/ccb-fast-cdr-1.1.0](https://github.com/qchateau/conan-center-index/tree/ccb-fast-cdr-1.1.0) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `recipes/fast-cdr/all/conanfile.py` Content: ``` 1 from conan import ConanFile 2 from conan.errors import ConanInvalidConfiguration 3 from conan.tools.build import check_min_cppstd 4 from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout 5 from conan.tools.files import collect_libs, copy, get, rm, rmdir, save 6 from conan.tools.microsoft import is_msvc, is_msvc_static_runtime 7 import os 8 import textwrap 9 10 required_conan_version = ">=1.54.0" 11 12 13 class FastCDRConan(ConanFile): 14 name = "fast-cdr" 15 license = "Apache-2.0" 16 homepage = "https://github.com/eProsima/Fast-CDR" 17 url = "https://github.com/conan-io/conan-center-index" 18 description = "eProsima FastCDR library for serialization" 19 topics = ("dds", "middleware", "serialization") 20 21 package_type = "library" 22 settings = "os", "arch", "compiler", "build_type" 23 options = { 24 "shared": [True, False], 25 "fPIC": [True, False], 26 } 27 default_options = { 28 "shared": False, 29 "fPIC": True, 30 } 31 32 def config_options(self): 33 if self.settings.os == "Windows": 34 del self.options.fPIC 35 36 def configure(self): 37 if self.options.shared: 38 self.options.rm_safe("fPIC") 39 40 def layout(self): 41 cmake_layout(self, src_folder="src") 42 43 def validate(self): 44 if self.settings.compiler.get_safe("cppstd"): 45 check_min_cppstd(self, 11) 46 if self.options.shared and is_msvc(self) and is_msvc_static_runtime(self): 47 # This combination leads to an fast-cdr error when linking 48 # linking dynamic '*.dll' and static MT runtime 49 # see https://github.com/eProsima/Fast-CDR/blob/v1.0.21/include/fastcdr/eProsima_auto_link.h#L37 50 # (2021-05-31) 51 raise ConanInvalidConfiguration("Mixing a dll eprosima library with a static runtime is a bad idea") 52 53 def source(self): 54 get(self, **self.conan_data["sources"][self.version], strip_root=True) 55 56 def generate(self): 57 tc = CMakeToolchain(self) 58 tc.variables["BUILD_STATIC"] = not self.options.shared 59 tc.generate() 60 61 def build(self): 62 cmake = CMake(self) 63 cmake.configure() 64 cmake.build() 65 66 def package(self): 67 copy(self, "LICENSE", src=self.source_folder, dst=os.path.join(self.package_folder, "licenses")) 68 cmake = CMake(self) 69 cmake.install() 70 rmdir(self, os.path.join(self.package_folder, "lib", "cmake")) 71 rmdir(self, os.path.join(self.package_folder, "share")) 72 rm(self, "*.pdb", os.path.join(self.package_folder, "lib")) 73 rm(self, "*.pdb", os.path.join(self.package_folder, "bin")) 74 75 # TODO: to remove in conan v2 once cmake_find_package_* generators removed 76 self._create_cmake_module_alias_targets( 77 os.path.join(self.package_folder, self._module_file_rel_path), 78 {"fastcdr": "fastcdr::fastcdr"} 79 ) 80 81 def _create_cmake_module_alias_targets(self, module_file, targets): 82 content = "" 83 for alias, aliased in targets.items(): 84 content += textwrap.dedent(f"""\ 85 if(TARGET {aliased} AND NOT TARGET {alias}) 
86 add_library({alias} INTERFACE IMPORTED) 87 set_property(TARGET {alias} PROPERTY INTERFACE_LINK_LIBRARIES {aliased}) 88 endif() 89 """) 90 save(self, module_file, content) 91 92 @property 93 def _module_file_rel_path(self): 94 return os.path.join("lib", "cmake", f"conan-official-{self.name}-targets.cmake") 95 96 def package_info(self): 97 self.cpp_info.set_property("cmake_file_name", "fastcdr") 98 self.cpp_info.set_property("cmake_target_name", "fastcdr") 99 self.cpp_info.libs = collect_libs(self) 100 if self.settings.os == "Windows" and self.options.shared: 101 self.cpp_info.defines.append("FASTCDR_DYN_LINK") 102 103 # TODO: to remove in conan v2 once cmake_find_package_* generators removed 104 self.cpp_info.names["cmake_find_package"] = "fastcdr" 105 self.cpp_info.names["cmake_find_package_multi"] = "fastcdr" 106 self.cpp_info.build_modules["cmake_find_package"] = [self._module_file_rel_path] 107 self.cpp_info.build_modules["cmake_find_package_multi"] = [self._module_file_rel_path] 108 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/recipes/fast-cdr/all/conanfile.py b/recipes/fast-cdr/all/conanfile.py --- a/recipes/fast-cdr/all/conanfile.py +++ b/recipes/fast-cdr/all/conanfile.py @@ -4,6 +4,7 @@ from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout from conan.tools.files import collect_libs, copy, get, rm, rmdir, save from conan.tools.microsoft import is_msvc, is_msvc_static_runtime +from conan.tools.scm import Version import os import textwrap @@ -40,6 +41,10 @@ def layout(self): cmake_layout(self, src_folder="src") + def build_requirements(self): + if Version(self.version) >= "1.1.0": + self.tool_requires("cmake/[>=3.16.3 <4]") + def validate(self): if self.settings.compiler.get_safe("cppstd"): check_min_cppstd(self, 11)
{"golden_diff": "diff --git a/recipes/fast-cdr/all/conanfile.py b/recipes/fast-cdr/all/conanfile.py\n--- a/recipes/fast-cdr/all/conanfile.py\n+++ b/recipes/fast-cdr/all/conanfile.py\n@@ -4,6 +4,7 @@\n from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout\n from conan.tools.files import collect_libs, copy, get, rm, rmdir, save\n from conan.tools.microsoft import is_msvc, is_msvc_static_runtime\n+from conan.tools.scm import Version\n import os\n import textwrap\n \n@@ -40,6 +41,10 @@\n def layout(self):\n cmake_layout(self, src_folder=\"src\")\n \n+ def build_requirements(self):\n+ if Version(self.version) >= \"1.1.0\":\n+ self.tool_requires(\"cmake/[>=3.16.3 <4]\")\n+\n def validate(self):\n if self.settings.compiler.get_safe(\"cppstd\"):\n check_min_cppstd(self, 11)\n", "issue": "[request] fast-cdr/1.1.0\n### Package Name/Version\n\nfast-cdr/1.1.0\n\n### Changelog\n\nhttps://github.com/eProsima/Fast-CDR/releases/tag/v1.1.0\n\n### Context about the new update\n\nThe Conan Center Bot detects the updatable recipe in #3470.\r\nOpen branch [qchateau/conan-center-index/ccb-fast-cdr-1.1.0](https://github.com/qchateau/conan-center-index/tree/ccb-fast-cdr-1.1.0)\n", "before_files": [{"content": "from conan import ConanFile\nfrom conan.errors import ConanInvalidConfiguration\nfrom conan.tools.build import check_min_cppstd\nfrom conan.tools.cmake import CMake, CMakeToolchain, cmake_layout\nfrom conan.tools.files import collect_libs, copy, get, rm, rmdir, save\nfrom conan.tools.microsoft import is_msvc, is_msvc_static_runtime\nimport os\nimport textwrap\n\nrequired_conan_version = \">=1.54.0\"\n\n\nclass FastCDRConan(ConanFile):\n name = \"fast-cdr\"\n license = \"Apache-2.0\"\n homepage = \"https://github.com/eProsima/Fast-CDR\"\n url = \"https://github.com/conan-io/conan-center-index\"\n description = \"eProsima FastCDR library for serialization\"\n topics = (\"dds\", \"middleware\", \"serialization\")\n\n package_type = \"library\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n }\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.options.shared:\n self.options.rm_safe(\"fPIC\")\n\n def layout(self):\n cmake_layout(self, src_folder=\"src\")\n\n def validate(self):\n if self.settings.compiler.get_safe(\"cppstd\"):\n check_min_cppstd(self, 11)\n if self.options.shared and is_msvc(self) and is_msvc_static_runtime(self):\n # This combination leads to an fast-cdr error when linking\n # linking dynamic '*.dll' and static MT runtime\n # see https://github.com/eProsima/Fast-CDR/blob/v1.0.21/include/fastcdr/eProsima_auto_link.h#L37\n # (2021-05-31)\n raise ConanInvalidConfiguration(\"Mixing a dll eprosima library with a static runtime is a bad idea\")\n\n def source(self):\n get(self, **self.conan_data[\"sources\"][self.version], strip_root=True)\n\n def generate(self):\n tc = CMakeToolchain(self)\n tc.variables[\"BUILD_STATIC\"] = not self.options.shared\n tc.generate()\n\n def build(self):\n cmake = CMake(self)\n cmake.configure()\n cmake.build()\n\n def package(self):\n copy(self, \"LICENSE\", src=self.source_folder, dst=os.path.join(self.package_folder, \"licenses\"))\n cmake = CMake(self)\n cmake.install()\n rmdir(self, os.path.join(self.package_folder, \"lib\", \"cmake\"))\n rmdir(self, os.path.join(self.package_folder, \"share\"))\n rm(self, \"*.pdb\", 
os.path.join(self.package_folder, \"lib\"))\n rm(self, \"*.pdb\", os.path.join(self.package_folder, \"bin\"))\n\n # TODO: to remove in conan v2 once cmake_find_package_* generators removed\n self._create_cmake_module_alias_targets(\n os.path.join(self.package_folder, self._module_file_rel_path),\n {\"fastcdr\": \"fastcdr::fastcdr\"}\n )\n\n def _create_cmake_module_alias_targets(self, module_file, targets):\n content = \"\"\n for alias, aliased in targets.items():\n content += textwrap.dedent(f\"\"\"\\\n if(TARGET {aliased} AND NOT TARGET {alias})\n add_library({alias} INTERFACE IMPORTED)\n set_property(TARGET {alias} PROPERTY INTERFACE_LINK_LIBRARIES {aliased})\n endif()\n \"\"\")\n save(self, module_file, content)\n\n @property\n def _module_file_rel_path(self):\n return os.path.join(\"lib\", \"cmake\", f\"conan-official-{self.name}-targets.cmake\")\n\n def package_info(self):\n self.cpp_info.set_property(\"cmake_file_name\", \"fastcdr\")\n self.cpp_info.set_property(\"cmake_target_name\", \"fastcdr\")\n self.cpp_info.libs = collect_libs(self)\n if self.settings.os == \"Windows\" and self.options.shared:\n self.cpp_info.defines.append(\"FASTCDR_DYN_LINK\")\n\n # TODO: to remove in conan v2 once cmake_find_package_* generators removed\n self.cpp_info.names[\"cmake_find_package\"] = \"fastcdr\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"fastcdr\"\n self.cpp_info.build_modules[\"cmake_find_package\"] = [self._module_file_rel_path]\n self.cpp_info.build_modules[\"cmake_find_package_multi\"] = [self._module_file_rel_path]\n", "path": "recipes/fast-cdr/all/conanfile.py"}], "after_files": [{"content": "from conan import ConanFile\nfrom conan.errors import ConanInvalidConfiguration\nfrom conan.tools.build import check_min_cppstd\nfrom conan.tools.cmake import CMake, CMakeToolchain, cmake_layout\nfrom conan.tools.files import collect_libs, copy, get, rm, rmdir, save\nfrom conan.tools.microsoft import is_msvc, is_msvc_static_runtime\nfrom conan.tools.scm import Version\nimport os\nimport textwrap\n\nrequired_conan_version = \">=1.54.0\"\n\n\nclass FastCDRConan(ConanFile):\n name = \"fast-cdr\"\n license = \"Apache-2.0\"\n homepage = \"https://github.com/eProsima/Fast-CDR\"\n url = \"https://github.com/conan-io/conan-center-index\"\n description = \"eProsima FastCDR library for serialization\"\n topics = (\"dds\", \"middleware\", \"serialization\")\n\n package_type = \"library\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n }\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.options.shared:\n self.options.rm_safe(\"fPIC\")\n\n def layout(self):\n cmake_layout(self, src_folder=\"src\")\n\n def build_requirements(self):\n if Version(self.version) >= \"1.1.0\":\n self.tool_requires(\"cmake/[>=3.16.3 <4]\")\n\n def validate(self):\n if self.settings.compiler.get_safe(\"cppstd\"):\n check_min_cppstd(self, 11)\n if self.options.shared and is_msvc(self) and is_msvc_static_runtime(self):\n # This combination leads to an fast-cdr error when linking\n # linking dynamic '*.dll' and static MT runtime\n # see https://github.com/eProsima/Fast-CDR/blob/v1.0.21/include/fastcdr/eProsima_auto_link.h#L37\n # (2021-05-31)\n raise ConanInvalidConfiguration(\"Mixing a dll eprosima library with a static runtime is a bad idea\")\n\n def source(self):\n get(self, 
**self.conan_data[\"sources\"][self.version], strip_root=True)\n\n def generate(self):\n tc = CMakeToolchain(self)\n tc.variables[\"BUILD_STATIC\"] = not self.options.shared\n tc.generate()\n\n def build(self):\n cmake = CMake(self)\n cmake.configure()\n cmake.build()\n\n def package(self):\n copy(self, \"LICENSE\", src=self.source_folder, dst=os.path.join(self.package_folder, \"licenses\"))\n cmake = CMake(self)\n cmake.install()\n rmdir(self, os.path.join(self.package_folder, \"lib\", \"cmake\"))\n rmdir(self, os.path.join(self.package_folder, \"share\"))\n rm(self, \"*.pdb\", os.path.join(self.package_folder, \"lib\"))\n rm(self, \"*.pdb\", os.path.join(self.package_folder, \"bin\"))\n\n # TODO: to remove in conan v2 once cmake_find_package_* generators removed\n self._create_cmake_module_alias_targets(\n os.path.join(self.package_folder, self._module_file_rel_path),\n {\"fastcdr\": \"fastcdr::fastcdr\"}\n )\n\n def _create_cmake_module_alias_targets(self, module_file, targets):\n content = \"\"\n for alias, aliased in targets.items():\n content += textwrap.dedent(f\"\"\"\\\n if(TARGET {aliased} AND NOT TARGET {alias})\n add_library({alias} INTERFACE IMPORTED)\n set_property(TARGET {alias} PROPERTY INTERFACE_LINK_LIBRARIES {aliased})\n endif()\n \"\"\")\n save(self, module_file, content)\n\n @property\n def _module_file_rel_path(self):\n return os.path.join(\"lib\", \"cmake\", f\"conan-official-{self.name}-targets.cmake\")\n\n def package_info(self):\n self.cpp_info.set_property(\"cmake_file_name\", \"fastcdr\")\n self.cpp_info.set_property(\"cmake_target_name\", \"fastcdr\")\n self.cpp_info.libs = collect_libs(self)\n if self.settings.os == \"Windows\" and self.options.shared:\n self.cpp_info.defines.append(\"FASTCDR_DYN_LINK\")\n\n # TODO: to remove in conan v2 once cmake_find_package_* generators removed\n self.cpp_info.names[\"cmake_find_package\"] = \"fastcdr\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"fastcdr\"\n self.cpp_info.build_modules[\"cmake_find_package\"] = [self._module_file_rel_path]\n self.cpp_info.build_modules[\"cmake_find_package_multi\"] = [self._module_file_rel_path]\n", "path": "recipes/fast-cdr/all/conanfile.py"}]}
1632
232
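
The recipe change above gates a build-time CMake requirement on the package version, presumably because Fast-CDR 1.1.0 raised the minimum CMake it can be configured with. A sketch of the same version gate, with `packaging.version` standing in for `conan.tools.scm.Version` so it runs without a Conan install (here `build_requirements` is a plain function rather than a ConanFile method):

```python
from packaging.version import Version

def build_requirements(recipe_version):
    """Return the tool requirements a hypothetical recipe would declare."""
    tools = []
    if Version(recipe_version) >= Version("1.1.0"):
        # "[>=3.16.3 <4]" is Conan's version-range syntax: any CMake from
        # 3.16.3 up to, but excluding, 4.0 satisfies the requirement.
        tools.append("cmake/[>=3.16.3 <4]")
    return tools

assert build_requirements("1.0.21") == []
assert build_requirements("1.1.0") == ["cmake/[>=3.16.3 <4]"]
```
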
gh_patches_debug_1457
rasdani/github-patches
git_diff
liqd__a4-meinberlin-539
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- use list markup for lists of elements Part of the BITV-Test: "1.3.1b HTML-Strukturelemente für Listen" ---- - [x] list of blueprints - [x] list of projects unsure: - [ ] list of questions in poll contents - [ ] list of choices in poll contents --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `apps/contrib/templatetags/contrib_tags.py` Content: ``` 1 from django import template 2 from django.template.loader import render_to_string 3 4 register = template.Library() 5 6 7 @register.assignment_tag 8 def include_template_string(template, **kwargs): 9 rendered_template = render_to_string(template, kwargs) 10 return str(rendered_template) 11 12 13 @register.assignment_tag 14 def combined_url_parameter(request_query_dict, **kwargs): 15 combined_query_dict = request_query_dict.copy() 16 for key in kwargs: 17 combined_query_dict.setlist(key, [kwargs[key]]) 18 encoded_parameter = '?' + combined_query_dict.urlencode() 19 return encoded_parameter 20 21 22 @register.assignment_tag 23 def filter_has_perm(perm, user, objects): 24 """Filter a list of objects based on user permissions.""" 25 if not hasattr(user, 'has_perm'): 26 # If the swapped user model does not support permissions, all objects 27 # will be returned. This is taken from rules.templatetags.has_perm. 28 return objects 29 else: 30 return (obj for obj in objects if user.has_perm(perm, obj)) 31 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/apps/contrib/templatetags/contrib_tags.py b/apps/contrib/templatetags/contrib_tags.py --- a/apps/contrib/templatetags/contrib_tags.py +++ b/apps/contrib/templatetags/contrib_tags.py @@ -27,4 +27,4 @@ # will be returned. This is taken from rules.templatetags.has_perm. return objects else: - return (obj for obj in objects if user.has_perm(perm, obj)) + return [obj for obj in objects if user.has_perm(perm, obj)]
{"golden_diff": "diff --git a/apps/contrib/templatetags/contrib_tags.py b/apps/contrib/templatetags/contrib_tags.py\n--- a/apps/contrib/templatetags/contrib_tags.py\n+++ b/apps/contrib/templatetags/contrib_tags.py\n@@ -27,4 +27,4 @@\n # will be returned. This is taken from rules.templatetags.has_perm.\n return objects\n else:\n- return (obj for obj in objects if user.has_perm(perm, obj))\n+ return [obj for obj in objects if user.has_perm(perm, obj)]\n", "issue": "use list markup for lists of elements\nPart of the BITV-Test: \"1.3.1b HTML-Strukturelemente f\u00fcr Listen\"\r\n----\r\n- [x] list of blueprints\r\n- [x] list of projects\r\n\r\nunsure:\r\n- [ ] list of questions in poll contents\r\n- [ ] list of choices in poll contents \n", "before_files": [{"content": "from django import template\nfrom django.template.loader import render_to_string\n\nregister = template.Library()\n\n\[email protected]_tag\ndef include_template_string(template, **kwargs):\n rendered_template = render_to_string(template, kwargs)\n return str(rendered_template)\n\n\[email protected]_tag\ndef combined_url_parameter(request_query_dict, **kwargs):\n combined_query_dict = request_query_dict.copy()\n for key in kwargs:\n combined_query_dict.setlist(key, [kwargs[key]])\n encoded_parameter = '?' + combined_query_dict.urlencode()\n return encoded_parameter\n\n\[email protected]_tag\ndef filter_has_perm(perm, user, objects):\n \"\"\"Filter a list of objects based on user permissions.\"\"\"\n if not hasattr(user, 'has_perm'):\n # If the swapped user model does not support permissions, all objects\n # will be returned. This is taken from rules.templatetags.has_perm.\n return objects\n else:\n return (obj for obj in objects if user.has_perm(perm, obj))\n", "path": "apps/contrib/templatetags/contrib_tags.py"}], "after_files": [{"content": "from django import template\nfrom django.template.loader import render_to_string\n\nregister = template.Library()\n\n\[email protected]_tag\ndef include_template_string(template, **kwargs):\n rendered_template = render_to_string(template, kwargs)\n return str(rendered_template)\n\n\[email protected]_tag\ndef combined_url_parameter(request_query_dict, **kwargs):\n combined_query_dict = request_query_dict.copy()\n for key in kwargs:\n combined_query_dict.setlist(key, [kwargs[key]])\n encoded_parameter = '?' + combined_query_dict.urlencode()\n return encoded_parameter\n\n\[email protected]_tag\ndef filter_has_perm(perm, user, objects):\n \"\"\"Filter a list of objects based on user permissions.\"\"\"\n if not hasattr(user, 'has_perm'):\n # If the swapped user model does not support permissions, all objects\n # will be returned. This is taken from rules.templatetags.has_perm.\n return objects\n else:\n return [obj for obj in objects if user.has_perm(perm, obj)]\n", "path": "apps/contrib/templatetags/contrib_tags.py"}]}
609
134
gh_patches_debug_17267
rasdani/github-patches
git_diff
pulp__pulpcore-239
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Fix bug where Last-Modified header was being updated on duplicate package uploads Fixes a bug where the Last-Modified header of a package stored in django-storages was being updated on duplicate uploads. Closes #5149 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 from setuptools import find_packages, setup 2 3 with open('README.md') as f: 4 long_description = f.read() 5 6 requirements = [ 7 'coreapi', 8 'Django~=2.2', # LTS version, switch only if we have a compelling reason to 9 'django-filter', 10 'djangorestframework', 11 'djangorestframework-queryfields', 12 'drf-nested-routers', 13 'drf-yasg', 14 'gunicorn', 15 'packaging', # until drf-yasg 1.16.2 is out https://github.com/axnsan12/drf-yasg/issues/412 16 'PyYAML', 17 'rq~=1.0', 18 'redis<3.2.0', 19 'setuptools', 20 'dynaconf~=2.0', 21 'whitenoise', 22 ] 23 24 setup( 25 name='pulpcore', 26 version='3.0.0rc5.dev', 27 description='Pulp Django Application and Related Modules', 28 long_description=long_description, 29 long_description_content_type="text/markdown", 30 license='GPLv2+', 31 packages=find_packages(exclude=['test']), 32 author='Pulp Team', 33 author_email='[email protected]', 34 url='http://www.pulpproject.org', 35 python_requires='>=3.6', 36 install_requires=requirements, 37 extras_require={ 38 'postgres': ['psycopg2-binary'], 39 'mysql': ['mysqlclient'] 40 }, 41 include_package_data=True, 42 classifiers=( 43 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)', 44 'Operating System :: POSIX :: Linux', 45 'Development Status :: 4 - Beta', 46 'Framework :: Django', 47 'Programming Language :: Python', 48 'Programming Language :: Python :: 3', 49 'Programming Language :: Python :: 3.6', 50 'Programming Language :: Python :: 3.7', 51 ), 52 scripts=['bin/pulp-content'], 53 ) 54 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -4,21 +4,21 @@ long_description = f.read() requirements = [ - 'coreapi', - 'Django~=2.2', # LTS version, switch only if we have a compelling reason to - 'django-filter', - 'djangorestframework', - 'djangorestframework-queryfields', - 'drf-nested-routers', - 'drf-yasg', - 'gunicorn', + 'coreapi~=2.3.3', + 'Django~=2.2.3', # LTS version, switch only if we have a compelling reason to + 'django-filter~=2.2.0', + 'djangorestframework~=3.10.2', + 'djangorestframework-queryfields~=1.0.0', + 'drf-nested-routers~=0.91.0', + 'drf-yasg~=1.16.1', + 'gunicorn~=19.9.0', 'packaging', # until drf-yasg 1.16.2 is out https://github.com/axnsan12/drf-yasg/issues/412 - 'PyYAML', - 'rq~=1.0', - 'redis<3.2.0', - 'setuptools', - 'dynaconf~=2.0', - 'whitenoise', + 'PyYAML~=5.1.1', + 'rq~=1.1.0', + 'redis~=3.1.0', + 'setuptools~=41.0.1', + 'dynaconf~=2.0.3', + 'whitenoise~=4.1.3', ] setup(
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -4,21 +4,21 @@\n long_description = f.read()\n \n requirements = [\n- 'coreapi',\n- 'Django~=2.2', # LTS version, switch only if we have a compelling reason to\n- 'django-filter',\n- 'djangorestframework',\n- 'djangorestframework-queryfields',\n- 'drf-nested-routers',\n- 'drf-yasg',\n- 'gunicorn',\n+ 'coreapi~=2.3.3',\n+ 'Django~=2.2.3', # LTS version, switch only if we have a compelling reason to\n+ 'django-filter~=2.2.0',\n+ 'djangorestframework~=3.10.2',\n+ 'djangorestframework-queryfields~=1.0.0',\n+ 'drf-nested-routers~=0.91.0',\n+ 'drf-yasg~=1.16.1',\n+ 'gunicorn~=19.9.0',\n 'packaging', # until drf-yasg 1.16.2 is out https://github.com/axnsan12/drf-yasg/issues/412\n- 'PyYAML',\n- 'rq~=1.0',\n- 'redis<3.2.0',\n- 'setuptools',\n- 'dynaconf~=2.0',\n- 'whitenoise',\n+ 'PyYAML~=5.1.1',\n+ 'rq~=1.1.0',\n+ 'redis~=3.1.0',\n+ 'setuptools~=41.0.1',\n+ 'dynaconf~=2.0.3',\n+ 'whitenoise~=4.1.3',\n ]\n \n setup(\n", "issue": "Fix bug where Last-Modified header was being updated on duplicate package uploads\nFixes a bug where the Last-Modified header of a package stored in django-storages was being updated on duplicate uploads.\r\n\r\nCloses #5149\n", "before_files": [{"content": "from setuptools import find_packages, setup\n\nwith open('README.md') as f:\n long_description = f.read()\n\nrequirements = [\n 'coreapi',\n 'Django~=2.2', # LTS version, switch only if we have a compelling reason to\n 'django-filter',\n 'djangorestframework',\n 'djangorestframework-queryfields',\n 'drf-nested-routers',\n 'drf-yasg',\n 'gunicorn',\n 'packaging', # until drf-yasg 1.16.2 is out https://github.com/axnsan12/drf-yasg/issues/412\n 'PyYAML',\n 'rq~=1.0',\n 'redis<3.2.0',\n 'setuptools',\n 'dynaconf~=2.0',\n 'whitenoise',\n]\n\nsetup(\n name='pulpcore',\n version='3.0.0rc5.dev',\n description='Pulp Django Application and Related Modules',\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n license='GPLv2+',\n packages=find_packages(exclude=['test']),\n author='Pulp Team',\n author_email='[email protected]',\n url='http://www.pulpproject.org',\n python_requires='>=3.6',\n install_requires=requirements,\n extras_require={\n 'postgres': ['psycopg2-binary'],\n 'mysql': ['mysqlclient']\n },\n include_package_data=True,\n classifiers=(\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Operating System :: POSIX :: Linux',\n 'Development Status :: 4 - Beta',\n 'Framework :: Django',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n ),\n scripts=['bin/pulp-content'],\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import find_packages, setup\n\nwith open('README.md') as f:\n long_description = f.read()\n\nrequirements = [\n 'coreapi~=2.3.3',\n 'Django~=2.2.3', # LTS version, switch only if we have a compelling reason to\n 'django-filter~=2.2.0',\n 'djangorestframework~=3.10.2',\n 'djangorestframework-queryfields~=1.0.0',\n 'drf-nested-routers~=0.91.0',\n 'drf-yasg~=1.16.1',\n 'gunicorn~=19.9.0',\n 'packaging', # until drf-yasg 1.16.2 is out https://github.com/axnsan12/drf-yasg/issues/412\n 'PyYAML~=5.1.1',\n 'rq~=1.1.0',\n 'redis~=3.1.0',\n 'setuptools~=41.0.1',\n 'dynaconf~=2.0.3',\n 'whitenoise~=4.1.3',\n]\n\nsetup(\n name='pulpcore',\n version='3.0.0rc5.dev',\n description='Pulp Django Application and Related Modules',\n 
long_description=long_description,\n long_description_content_type=\"text/markdown\",\n license='GPLv2+',\n packages=find_packages(exclude=['test']),\n author='Pulp Team',\n author_email='[email protected]',\n url='http://www.pulpproject.org',\n python_requires='>=3.6',\n install_requires=requirements,\n extras_require={\n 'postgres': ['psycopg2-binary'],\n 'mysql': ['mysqlclient']\n },\n include_package_data=True,\n classifiers=(\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Operating System :: POSIX :: Linux',\n 'Development Status :: 4 - Beta',\n 'Framework :: Django',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n ),\n scripts=['bin/pulp-content'],\n)\n", "path": "setup.py"}]}
840
425
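
The setup.py patch above replaces open-ended requirements with `~=` "compatible release" pins. A sketch of what that operator admits, using the `packaging` library (the same machinery pip and setuptools use to evaluate version specifiers):

```python
from packaging.specifiers import SpecifierSet

spec = SpecifierSet("~=2.2.3")  # equivalent to: >=2.2.3, ==2.2.*

print("2.2.3" in spec)   # True  -- the pinned version itself
print("2.2.9" in spec)   # True  -- later patch releases are accepted
print("2.3.0" in spec)   # False -- minor bumps are excluded
print("3.0.0" in spec)   # False -- so are major bumps
```

Pinning every dependency this way trades automatic minor upgrades for reproducible builds, accepting only bug-fix releases.
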
gh_patches_debug_25641
rasdani/github-patches
git_diff
sublimelsp__LSP-1573
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- "Rename…" code action is broken (v1.1.6) The _Rename_ code action doesn’t work correctly. I wasn’t able to use it in VSCode so I can’t tell whether it comes from the language server of the Sublime extension. Here is a minimal example: ```elm module Test exposing (..) import Html exposing (Html) view : Html msg view = Html.text body body : String body = "..." ``` When renaming `body` for instance: * if the new name has the same length, it seems to work fine * if the new name is longer (`bodyxyz` in the example below), the last few characters are duplicated: ```elm view : Html msg view = Html.text bodyxyzxyz bodyxyzxyz : String bodyxyzxyz = "..." ``` * if the new name is shorter (`a` in this example), well… ```elm view : Html msg view = Html.text aaString a "..." ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `plugin/core/edit.py` Content: ``` 1 from .logging import debug 2 from .open import open_file 3 from .promise import Promise 4 from .typing import List, Dict, Any, Iterable, Optional, Tuple 5 from .url import uri_to_filename 6 from functools import partial 7 import operator 8 import sublime 9 10 11 # tuple of start, end, newText, version 12 TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str, Optional[int]] 13 14 15 def parse_workspace_edit(workspace_edit: Dict[str, Any]) -> Dict[str, List[TextEdit]]: 16 changes = {} # type: Dict[str, List[TextEdit]] 17 raw_changes = workspace_edit.get('changes') 18 if isinstance(raw_changes, dict): 19 for uri, file_changes in raw_changes.items(): 20 changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes) 21 document_changes = workspace_edit.get('documentChanges') 22 if isinstance(document_changes, list): 23 for document_change in document_changes: 24 if 'kind' in document_change: 25 debug('Ignoring unsupported "resourceOperations" edit type') 26 continue 27 uri = document_change.get('textDocument').get('uri') 28 version = document_change.get('textDocument').get('version') 29 text_edit = list(parse_text_edit(change, version) for change in document_change.get('edits')) 30 changes.setdefault(uri_to_filename(uri), []).extend(text_edit) 31 return changes 32 33 34 def parse_range(range: Dict[str, int]) -> Tuple[int, int]: 35 return range['line'], range['character'] 36 37 38 def parse_text_edit(text_edit: Dict[str, Any], version: int = None) -> TextEdit: 39 return ( 40 parse_range(text_edit['range']['start']), 41 parse_range(text_edit['range']['end']), 42 # Strip away carriage returns -- SublimeText takes care of that. 43 text_edit.get('newText', '').replace("\r", ""), 44 version 45 ) 46 47 48 def sort_by_application_order(changes: Iterable[TextEdit]) -> List[TextEdit]: 49 # The spec reads: 50 # > However, it is possible that multiple edits have the same start position: multiple 51 # > inserts, or any number of inserts followed by a single remove or replace edit. If 52 # > multiple inserts have the same position, the order in the array defines the order in 53 # > which the inserted strings appear in the resulting text. 54 # So we sort by start position. But if multiple text edits start at the same position, 55 # we use the index in the array as the key. 
56 57 return list(sorted(changes, key=operator.itemgetter(0))) 58 59 60 def apply_workspace_edit(window: sublime.Window, changes: Dict[str, List[TextEdit]]) -> Promise: 61 """Apply workspace edits. This function must be called from the main thread!""" 62 return Promise.all([open_file(window, fn).then(partial(_apply_edits, edits)) for fn, edits in changes.items()]) 63 64 65 def _apply_edits(edits: List[TextEdit], view: Optional[sublime.View]) -> None: 66 if view and view.is_valid(): 67 # Text commands run blocking. After this call has returned the changes are applied. 68 view.run_command("lsp_apply_document_edit", {"changes": edits}) 69 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/plugin/core/edit.py b/plugin/core/edit.py --- a/plugin/core/edit.py +++ b/plugin/core/edit.py @@ -14,10 +14,6 @@ def parse_workspace_edit(workspace_edit: Dict[str, Any]) -> Dict[str, List[TextEdit]]: changes = {} # type: Dict[str, List[TextEdit]] - raw_changes = workspace_edit.get('changes') - if isinstance(raw_changes, dict): - for uri, file_changes in raw_changes.items(): - changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes) document_changes = workspace_edit.get('documentChanges') if isinstance(document_changes, list): for document_change in document_changes: @@ -28,6 +24,11 @@ version = document_change.get('textDocument').get('version') text_edit = list(parse_text_edit(change, version) for change in document_change.get('edits')) changes.setdefault(uri_to_filename(uri), []).extend(text_edit) + else: + raw_changes = workspace_edit.get('changes') + if isinstance(raw_changes, dict): + for uri, file_changes in raw_changes.items(): + changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes) return changes
{"golden_diff": "diff --git a/plugin/core/edit.py b/plugin/core/edit.py\n--- a/plugin/core/edit.py\n+++ b/plugin/core/edit.py\n@@ -14,10 +14,6 @@\n \n def parse_workspace_edit(workspace_edit: Dict[str, Any]) -> Dict[str, List[TextEdit]]:\n changes = {} # type: Dict[str, List[TextEdit]]\n- raw_changes = workspace_edit.get('changes')\n- if isinstance(raw_changes, dict):\n- for uri, file_changes in raw_changes.items():\n- changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n document_changes = workspace_edit.get('documentChanges')\n if isinstance(document_changes, list):\n for document_change in document_changes:\n@@ -28,6 +24,11 @@\n version = document_change.get('textDocument').get('version')\n text_edit = list(parse_text_edit(change, version) for change in document_change.get('edits'))\n changes.setdefault(uri_to_filename(uri), []).extend(text_edit)\n+ else:\n+ raw_changes = workspace_edit.get('changes')\n+ if isinstance(raw_changes, dict):\n+ for uri, file_changes in raw_changes.items():\n+ changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n return changes\n", "issue": "\"Rename\u2026\" code action is broken\n(v1.1.6) The _Rename_ code action doesn\u2019t work correctly. I wasn\u2019t able to use it in VSCode so I can\u2019t tell whether it comes from the language server of the Sublime extension.\r\n\r\nHere is a minimal example:\r\n\r\n```elm\r\nmodule Test exposing (..)\r\n\r\nimport Html exposing (Html)\r\n\r\n\r\nview : Html msg\r\nview =\r\n Html.text body\r\n\r\n\r\nbody : String\r\nbody =\r\n \"...\"\r\n```\r\n\r\nWhen renaming `body` for instance:\r\n\r\n* if the new name has the same length, it seems to work fine\r\n* if the new name is longer (`bodyxyz` in the example below), the last few characters are duplicated:\r\n```elm\r\n\r\nview : Html msg\r\nview =\r\n Html.text bodyxyzxyz\r\n\r\n\r\nbodyxyzxyz : String\r\nbodyxyzxyz =\r\n \"...\"\r\n```\r\n* if the new name is shorter (`a` in this example), well\u2026\r\n```elm\r\nview : Html msg\r\nview =\r\n Html.text aaString\r\na \"...\"\r\n```\n", "before_files": [{"content": "from .logging import debug\nfrom .open import open_file\nfrom .promise import Promise\nfrom .typing import List, Dict, Any, Iterable, Optional, Tuple\nfrom .url import uri_to_filename\nfrom functools import partial\nimport operator\nimport sublime\n\n\n# tuple of start, end, newText, version\nTextEdit = Tuple[Tuple[int, int], Tuple[int, int], str, Optional[int]]\n\n\ndef parse_workspace_edit(workspace_edit: Dict[str, Any]) -> Dict[str, List[TextEdit]]:\n changes = {} # type: Dict[str, List[TextEdit]]\n raw_changes = workspace_edit.get('changes')\n if isinstance(raw_changes, dict):\n for uri, file_changes in raw_changes.items():\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n document_changes = workspace_edit.get('documentChanges')\n if isinstance(document_changes, list):\n for document_change in document_changes:\n if 'kind' in document_change:\n debug('Ignoring unsupported \"resourceOperations\" edit type')\n continue\n uri = document_change.get('textDocument').get('uri')\n version = document_change.get('textDocument').get('version')\n text_edit = list(parse_text_edit(change, version) for change in document_change.get('edits'))\n changes.setdefault(uri_to_filename(uri), []).extend(text_edit)\n return changes\n\n\ndef parse_range(range: Dict[str, int]) -> Tuple[int, int]:\n return range['line'], range['character']\n\n\ndef 
parse_text_edit(text_edit: Dict[str, Any], version: int = None) -> TextEdit:\n return (\n parse_range(text_edit['range']['start']),\n parse_range(text_edit['range']['end']),\n # Strip away carriage returns -- SublimeText takes care of that.\n text_edit.get('newText', '').replace(\"\\r\", \"\"),\n version\n )\n\n\ndef sort_by_application_order(changes: Iterable[TextEdit]) -> List[TextEdit]:\n # The spec reads:\n # > However, it is possible that multiple edits have the same start position: multiple\n # > inserts, or any number of inserts followed by a single remove or replace edit. If\n # > multiple inserts have the same position, the order in the array defines the order in\n # > which the inserted strings appear in the resulting text.\n # So we sort by start position. But if multiple text edits start at the same position,\n # we use the index in the array as the key.\n\n return list(sorted(changes, key=operator.itemgetter(0)))\n\n\ndef apply_workspace_edit(window: sublime.Window, changes: Dict[str, List[TextEdit]]) -> Promise:\n \"\"\"Apply workspace edits. This function must be called from the main thread!\"\"\"\n return Promise.all([open_file(window, fn).then(partial(_apply_edits, edits)) for fn, edits in changes.items()])\n\n\ndef _apply_edits(edits: List[TextEdit], view: Optional[sublime.View]) -> None:\n if view and view.is_valid():\n # Text commands run blocking. After this call has returned the changes are applied.\n view.run_command(\"lsp_apply_document_edit\", {\"changes\": edits})\n", "path": "plugin/core/edit.py"}], "after_files": [{"content": "from .logging import debug\nfrom .open import open_file\nfrom .promise import Promise\nfrom .typing import List, Dict, Any, Iterable, Optional, Tuple\nfrom .url import uri_to_filename\nfrom functools import partial\nimport operator\nimport sublime\n\n\n# tuple of start, end, newText, version\nTextEdit = Tuple[Tuple[int, int], Tuple[int, int], str, Optional[int]]\n\n\ndef parse_workspace_edit(workspace_edit: Dict[str, Any]) -> Dict[str, List[TextEdit]]:\n changes = {} # type: Dict[str, List[TextEdit]]\n document_changes = workspace_edit.get('documentChanges')\n if isinstance(document_changes, list):\n for document_change in document_changes:\n if 'kind' in document_change:\n debug('Ignoring unsupported \"resourceOperations\" edit type')\n continue\n uri = document_change.get('textDocument').get('uri')\n version = document_change.get('textDocument').get('version')\n text_edit = list(parse_text_edit(change, version) for change in document_change.get('edits'))\n changes.setdefault(uri_to_filename(uri), []).extend(text_edit)\n else:\n raw_changes = workspace_edit.get('changes')\n if isinstance(raw_changes, dict):\n for uri, file_changes in raw_changes.items():\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n return changes\n\n\ndef parse_range(range: Dict[str, int]) -> Tuple[int, int]:\n return range['line'], range['character']\n\n\ndef parse_text_edit(text_edit: Dict[str, Any], version: int = None) -> TextEdit:\n return (\n parse_range(text_edit['range']['start']),\n parse_range(text_edit['range']['end']),\n # Strip away carriage returns -- SublimeText takes care of that.\n text_edit.get('newText', '').replace(\"\\r\", \"\"),\n version\n )\n\n\ndef sort_by_application_order(changes: Iterable[TextEdit]) -> List[TextEdit]:\n # The spec reads:\n # > However, it is possible that multiple edits have the same start position: multiple\n # > inserts, or any number of inserts followed by a single remove or 
replace edit. If\n # > multiple inserts have the same position, the order in the array defines the order in\n # > which the inserted strings appear in the resulting text.\n # So we sort by start position. But if multiple text edits start at the same position,\n # we use the index in the array as the key.\n\n return list(sorted(changes, key=operator.itemgetter(0)))\n\n\ndef apply_workspace_edit(window: sublime.Window, changes: Dict[str, List[TextEdit]]) -> Promise:\n \"\"\"Apply workspace edits. This function must be called from the main thread!\"\"\"\n return Promise.all([open_file(window, fn).then(partial(_apply_edits, edits)) for fn, edits in changes.items()])\n\n\ndef _apply_edits(edits: List[TextEdit], view: Optional[sublime.View]) -> None:\n if view and view.is_valid():\n # Text commands run blocking. After this call has returned the changes are applied.\n view.run_command(\"lsp_apply_document_edit\", {\"changes\": edits})\n", "path": "plugin/core/edit.py"}]}
1,283
282
gh_patches_debug_14435
rasdani/github-patches
git_diff
fossasia__open-event-server-5247
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Inconsistent data on Admin Statistics end points **I'm submitting a ...** (check one with "x") - [x] bug report - [ ] feature request - [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-server **Current behavior:** <!-- Describe how the bug manifests. --> The admin statistics end points return various stats about events, users etc. Currently they are completely out of sync with the database. For instance, the `admin/statisitics/events` returns ![image](https://user-images.githubusercontent.com/17252805/39888957-1a5daf72-54b4-11e8-934c-1d77a9ab66b7.png) Where as the actual event count when generically querying for events is 92: ![image](https://user-images.githubusercontent.com/17252805/39889011-44af3d9a-54b4-11e8-8fb8-c45b0cef175d.png) **Expected behavior:** <!-- Describe what the behavior would be without the bug. --> The counts should be consistent. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `app/api/schema/admin_statistics_schema/events.py` Content: ``` 1 from marshmallow_jsonapi.flask import Schema 2 from marshmallow_jsonapi import fields 3 from app.models.event import Event 4 from app.api.helpers.db import get_count 5 from app.api.helpers.utilities import dasherize 6 from datetime import datetime 7 import pytz 8 9 10 class AdminStatisticsEventSchema(Schema): 11 """ 12 Api schema 13 """ 14 class Meta: 15 """ 16 Meta class 17 """ 18 type_ = 'admin-statistics-event' 19 self_view = 'v1.admin_statistics_event_detail' 20 inflect = dasherize 21 22 id = fields.String() 23 draft = fields.Method("events_draft_count") 24 published = fields.Method("events_published_count") 25 past = fields.Method("events_past_count") 26 27 def events_draft_count(self, obj): 28 return get_count(Event.query.filter_by(state='draft')) 29 30 def events_published_count(self, obj): 31 return get_count(Event.query.filter_by(state='published')) 32 33 def events_past_count(self, obj): 34 return get_count(Event.query.filter(Event.ends_at < datetime.now(pytz.utc))) 35 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/app/api/schema/admin_statistics_schema/events.py b/app/api/schema/admin_statistics_schema/events.py --- a/app/api/schema/admin_statistics_schema/events.py +++ b/app/api/schema/admin_statistics_schema/events.py @@ -25,10 +25,12 @@ past = fields.Method("events_past_count") def events_draft_count(self, obj): - return get_count(Event.query.filter_by(state='draft')) + events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc)) + return get_count(events.filter_by(state='draft')) def events_published_count(self, obj): - return get_count(Event.query.filter_by(state='published')) + events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc)) + return get_count(events.filter_by(state='published')) def events_past_count(self, obj): return get_count(Event.query.filter(Event.ends_at < datetime.now(pytz.utc)))
{"golden_diff": "diff --git a/app/api/schema/admin_statistics_schema/events.py b/app/api/schema/admin_statistics_schema/events.py\n--- a/app/api/schema/admin_statistics_schema/events.py\n+++ b/app/api/schema/admin_statistics_schema/events.py\n@@ -25,10 +25,12 @@\n past = fields.Method(\"events_past_count\")\n \n def events_draft_count(self, obj):\n- return get_count(Event.query.filter_by(state='draft'))\n+ events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc))\n+ return get_count(events.filter_by(state='draft'))\n \n def events_published_count(self, obj):\n- return get_count(Event.query.filter_by(state='published'))\n+ events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc))\n+ return get_count(events.filter_by(state='published'))\n \n def events_past_count(self, obj):\n return get_count(Event.query.filter(Event.ends_at < datetime.now(pytz.utc)))\n", "issue": "Inconsistent data on Admin Statistics end points\n**I'm submitting a ...** (check one with \"x\")\r\n- [x] bug report\r\n- [ ] feature request\r\n- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-server\r\n\r\n**Current behavior:**\r\n<!-- Describe how the bug manifests. -->\r\nThe admin statistics end points return various stats about events, users etc.\r\nCurrently they are completely out of sync with the database.\r\nFor instance, the `admin/statisitics/events` returns \r\n![image](https://user-images.githubusercontent.com/17252805/39888957-1a5daf72-54b4-11e8-934c-1d77a9ab66b7.png)\r\n\r\nWhere as the actual event count when generically querying for events is 92:\r\n![image](https://user-images.githubusercontent.com/17252805/39889011-44af3d9a-54b4-11e8-8fb8-c45b0cef175d.png)\r\n\r\n**Expected behavior:**\r\n<!-- Describe what the behavior would be without the bug. 
-->\r\nThe counts should be consistent.\n", "before_files": [{"content": "from marshmallow_jsonapi.flask import Schema\nfrom marshmallow_jsonapi import fields\nfrom app.models.event import Event\nfrom app.api.helpers.db import get_count\nfrom app.api.helpers.utilities import dasherize\nfrom datetime import datetime\nimport pytz\n\n\nclass AdminStatisticsEventSchema(Schema):\n \"\"\"\n Api schema\n \"\"\"\n class Meta:\n \"\"\"\n Meta class\n \"\"\"\n type_ = 'admin-statistics-event'\n self_view = 'v1.admin_statistics_event_detail'\n inflect = dasherize\n\n id = fields.String()\n draft = fields.Method(\"events_draft_count\")\n published = fields.Method(\"events_published_count\")\n past = fields.Method(\"events_past_count\")\n\n def events_draft_count(self, obj):\n return get_count(Event.query.filter_by(state='draft'))\n\n def events_published_count(self, obj):\n return get_count(Event.query.filter_by(state='published'))\n\n def events_past_count(self, obj):\n return get_count(Event.query.filter(Event.ends_at < datetime.now(pytz.utc)))\n", "path": "app/api/schema/admin_statistics_schema/events.py"}], "after_files": [{"content": "from marshmallow_jsonapi.flask import Schema\nfrom marshmallow_jsonapi import fields\nfrom app.models.event import Event\nfrom app.api.helpers.db import get_count\nfrom app.api.helpers.utilities import dasherize\nfrom datetime import datetime\nimport pytz\n\n\nclass AdminStatisticsEventSchema(Schema):\n \"\"\"\n Api schema\n \"\"\"\n class Meta:\n \"\"\"\n Meta class\n \"\"\"\n type_ = 'admin-statistics-event'\n self_view = 'v1.admin_statistics_event_detail'\n inflect = dasherize\n\n id = fields.String()\n draft = fields.Method(\"events_draft_count\")\n published = fields.Method(\"events_published_count\")\n past = fields.Method(\"events_past_count\")\n\n def events_draft_count(self, obj):\n events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc))\n return get_count(events.filter_by(state='draft'))\n\n def events_published_count(self, obj):\n events = Event.query.filter(Event.ends_at > datetime.now(pytz.utc))\n return get_count(events.filter_by(state='published'))\n\n def events_past_count(self, obj):\n return get_count(Event.query.filter(Event.ends_at < datetime.now(pytz.utc)))\n", "path": "app/api/schema/admin_statistics_schema/events.py"}]}
833
202
gh_patches_debug_49498
rasdani/github-patches
git_diff
pex-tool__pex-1516
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Release 2.1.55 On the docket: + [x] Add official support for Python 3.10 (#1512) + [x] Always register global options. (#1511) + [x] Fix RTD generation by pinning docutils low. (#1509) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `pex/version.py` Content: ``` 1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). 2 # Licensed under the Apache License, Version 2.0 (see LICENSE). 3 4 __version__ = "2.1.54" 5 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/pex/version.py b/pex/version.py --- a/pex/version.py +++ b/pex/version.py @@ -1,4 +1,4 @@ # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md). # Licensed under the Apache License, Version 2.0 (see LICENSE). -__version__ = "2.1.54" +__version__ = "2.1.55"
{"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.54\"\n+__version__ = \"2.1.55\"\n", "issue": "Release 2.1.55\nOn the docket:\r\n+ [x] Add official support for Python 3.10 (#1512)\r\n+ [x] Always register global options. (#1511)\r\n+ [x] Fix RTD generation by pinning docutils low. (#1509)\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.54\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.55\"\n", "path": "pex/version.py"}]}
377
96
gh_patches_debug_16916
rasdani/github-patches
git_diff
pyjanitor-devs__pyjanitor-452
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [DOC] Clarify original-dataframe mutation behavior in pyjanitor function docstrings # Brief Description of Fix Currently, some pyjanitor functions mutate the original dataframe and others return a copy. Solutions are heavily discussed in #79 but no conclusion has been reached. At the moment, it is unclear, without experimentation from the user, which behavior applies in each function. In the interim, I propose to explicitly clarify this behavior in each function's docstring so the user has a clear idea regarding the function's mutating behavior. Below is a sample of what this could look like for `.clean_names()`: """ Clean column names. Takes all column names, converts them to lowercase, then replaces all spaces with underscores. <b>Does not mutate original dataframe.</b> """ Happy to add this line somewhere else in the docstring if inappropriate here. - [Link to documentation page](https://pyjanitor.readthedocs.io/reference/index.html) - [Link to exact file to be edited](https://github.com/ericmjl/pyjanitor/blob/dev/janitor/functions.py) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `janitor/biology.py` Content: ``` 1 """ 2 Biology and bioinformatics-oriented data cleaning functions. 3 """ 4 5 import pandas as pd 6 import pandas_flavor as pf 7 8 from .utils import deprecated_alias, import_message 9 10 try: 11 from Bio import SeqIO 12 except ImportError: 13 import_message( 14 "biology", "biopython", "conda install -c conda-forge biopython" 15 ) 16 17 18 @pf.register_dataframe_method 19 @deprecated_alias(col_name="column_name") 20 def join_fasta( 21 df: pd.DataFrame, filename: str, id_col: str, column_name 22 ) -> pd.DataFrame: 23 """ 24 Convenience method to join in a FASTA file as a column. 25 26 This allows us to add the string sequence of a FASTA file as a new column 27 of data in the dataframe. 28 29 This function only attaches the string representation of the SeqRecord.Seq 30 object from Biopython. Does not attach the full SeqRecord. Alphabet is 31 also not stored, under the assumption that the data scientist has domain 32 knowledge of what kind of sequence is being read in (nucleotide vs. amino 33 acid.) 34 35 For more advanced functions, please use phylopandas. 36 37 :param df: A pandas DataFrame. 38 :param filename: Path to the FASTA file. 39 :param id_col: The column in the DataFrame that houses sequence IDs. 40 :param column_name: The name of the new column. 41 """ 42 seqrecords = { 43 x.id: x.seq.__str__() for x in SeqIO.parse(filename, "fasta") 44 } 45 seq_col = [seqrecords[i] for i in df[id_col]] 46 df[column_name] = seq_col 47 return df 48 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/janitor/biology.py b/janitor/biology.py --- a/janitor/biology.py +++ b/janitor/biology.py @@ -26,12 +26,14 @@ This allows us to add the string sequence of a FASTA file as a new column of data in the dataframe. - This function only attaches the string representation of the SeqRecord.Seq + This method only attaches the string representation of the SeqRecord.Seq object from Biopython. Does not attach the full SeqRecord. Alphabet is also not stored, under the assumption that the data scientist has domain knowledge of what kind of sequence is being read in (nucleotide vs. amino acid.) + This method mutates the original DataFrame. + For more advanced functions, please use phylopandas. :param df: A pandas DataFrame.
{"golden_diff": "diff --git a/janitor/biology.py b/janitor/biology.py\n--- a/janitor/biology.py\n+++ b/janitor/biology.py\n@@ -26,12 +26,14 @@\n This allows us to add the string sequence of a FASTA file as a new column\n of data in the dataframe.\n \n- This function only attaches the string representation of the SeqRecord.Seq\n+ This method only attaches the string representation of the SeqRecord.Seq\n object from Biopython. Does not attach the full SeqRecord. Alphabet is\n also not stored, under the assumption that the data scientist has domain\n knowledge of what kind of sequence is being read in (nucleotide vs. amino\n acid.)\n \n+ This method mutates the original DataFrame.\n+\n For more advanced functions, please use phylopandas.\n \n :param df: A pandas DataFrame.\n", "issue": "[DOC] Clarify original-dataframe mutation behavior in pyjanitor function docstrings\n# Brief Description of Fix\r\nCurrently, some pyjanitor functions mutate the original dataframe and others return a copy. Solutions are heavily discussed in #79 but no conclusion has been reached. At the moment, it is unclear, without experimentation from the user, which behavior applies in each function. \r\n\r\nIn the interim, I propose to explicitly clarify this behavior in each function's docstring so the user has a clear idea regarding the function's mutating behavior. Below is a sample of what this could look like for `.clean_names()`:\r\n\r\n\"\"\"\r\nClean column names.\r\n Takes all column names, converts them to lowercase, then replaces all\r\n spaces with underscores. <b>Does not mutate original dataframe.</b>\r\n\"\"\"\r\n\r\nHappy to add this line somewhere else in the docstring if inappropriate here. \r\n\r\n- [Link to documentation page](https://pyjanitor.readthedocs.io/reference/index.html)\r\n- [Link to exact file to be edited](https://github.com/ericmjl/pyjanitor/blob/dev/janitor/functions.py)\r\n\n", "before_files": [{"content": "\"\"\"\nBiology and bioinformatics-oriented data cleaning functions.\n\"\"\"\n\nimport pandas as pd\nimport pandas_flavor as pf\n\nfrom .utils import deprecated_alias, import_message\n\ntry:\n from Bio import SeqIO\nexcept ImportError:\n import_message(\n \"biology\", \"biopython\", \"conda install -c conda-forge biopython\"\n )\n\n\[email protected]_dataframe_method\n@deprecated_alias(col_name=\"column_name\")\ndef join_fasta(\n df: pd.DataFrame, filename: str, id_col: str, column_name\n) -> pd.DataFrame:\n \"\"\"\n Convenience method to join in a FASTA file as a column.\n\n This allows us to add the string sequence of a FASTA file as a new column\n of data in the dataframe.\n\n This function only attaches the string representation of the SeqRecord.Seq\n object from Biopython. Does not attach the full SeqRecord. Alphabet is\n also not stored, under the assumption that the data scientist has domain\n knowledge of what kind of sequence is being read in (nucleotide vs. 
amino\n acid.)\n\n For more advanced functions, please use phylopandas.\n\n :param df: A pandas DataFrame.\n :param filename: Path to the FASTA file.\n :param id_col: The column in the DataFrame that houses sequence IDs.\n :param column_name: The name of the new column.\n \"\"\"\n seqrecords = {\n x.id: x.seq.__str__() for x in SeqIO.parse(filename, \"fasta\")\n }\n seq_col = [seqrecords[i] for i in df[id_col]]\n df[column_name] = seq_col\n return df\n", "path": "janitor/biology.py"}], "after_files": [{"content": "\"\"\"\nBiology and bioinformatics-oriented data cleaning functions.\n\"\"\"\n\nimport pandas as pd\nimport pandas_flavor as pf\n\nfrom .utils import deprecated_alias, import_message\n\ntry:\n from Bio import SeqIO\nexcept ImportError:\n import_message(\n \"biology\", \"biopython\", \"conda install -c conda-forge biopython\"\n )\n\n\[email protected]_dataframe_method\n@deprecated_alias(col_name=\"column_name\")\ndef join_fasta(\n df: pd.DataFrame, filename: str, id_col: str, column_name\n) -> pd.DataFrame:\n \"\"\"\n Convenience method to join in a FASTA file as a column.\n\n This allows us to add the string sequence of a FASTA file as a new column\n of data in the dataframe.\n\n This method only attaches the string representation of the SeqRecord.Seq\n object from Biopython. Does not attach the full SeqRecord. Alphabet is\n also not stored, under the assumption that the data scientist has domain\n knowledge of what kind of sequence is being read in (nucleotide vs. amino\n acid.)\n\n This method mutates the original DataFrame.\n\n For more advanced functions, please use phylopandas.\n\n :param df: A pandas DataFrame.\n :param filename: Path to the FASTA file.\n :param id_col: The column in the DataFrame that houses sequence IDs.\n :param column_name: The name of the new column.\n \"\"\"\n seqrecords = {\n x.id: x.seq.__str__() for x in SeqIO.parse(filename, \"fasta\")\n }\n seq_col = [seqrecords[i] for i in df[id_col]]\n df[column_name] = seq_col\n return df\n", "path": "janitor/biology.py"}]}
932
198
gh_patches_debug_15954
rasdani/github-patches
git_diff
Nitrate__Nitrate-352
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Remove django 1.10 support - Remove django 1.10 testenv from tox.ini - Update django version in setup.py. Minimum django version is `1.11`. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 3 import sys 4 5 from setuptools import setup, find_packages 6 7 8 with open('VERSION.txt', 'r') as f: 9 pkg_version = f.read().strip() 10 11 12 def get_long_description(): 13 with open('README.rst', 'r') as f: 14 return f.read() 15 16 17 install_requires = [ 18 'PyMySQL == 0.7.11', 19 'beautifulsoup4 >= 4.1.1', 20 'celery == 4.1.0', 21 'django >= 1.10,<2.0', 22 'django-contrib-comments == 1.8.0', 23 'django-tinymce == 2.7.0', 24 'django-uuslug == 1.1.8', 25 'html2text', 26 'kobo == 0.7.0', 27 'odfpy >= 0.9.6', 28 'python-bugzilla', 29 'six', 30 'xmltodict', 31 ] 32 33 if sys.version_info.major < 3: 34 install_requires += [ 35 'enum34', 36 ] 37 38 extras_require = { 39 # Required for tcms.core.contrib.auth.backends.KerberosBackend 40 'krbauth': [ 41 'kerberos == 1.2.5' 42 ], 43 44 # Packages for building documentation 45 'docs': [ 46 'Sphinx >= 1.1.2', 47 'sphinx_rtd_theme', 48 ], 49 50 # Necessary packages for running tests 51 'tests': [ 52 'coverage', 53 'factory_boy', 54 'flake8', 55 'mock', 56 'pytest', 57 'pytest-cov', 58 'pytest-django', 59 ], 60 61 # Contain tools that assists the development 62 'devtools': [ 63 'django-debug-toolbar == 1.7', 64 'tox', 65 'django-extensions', 66 'pygraphviz', 67 'future-breakpoint', 68 ] 69 } 70 71 72 setup( 73 name='Nitrate', 74 version=pkg_version, 75 description='Test Case Management System', 76 long_description=get_long_description(), 77 author='Nitrate Team', 78 maintainer='Chenxiong Qi', 79 maintainer_email='[email protected]', 80 url='https://github.com/Nitrate/Nitrate/', 81 license='GPLv2+', 82 keywords='test case', 83 install_requires=install_requires, 84 extras_require=extras_require, 85 packages=find_packages(), 86 include_package_data=True, 87 classifiers=[ 88 'Framework :: Django', 89 'Framework :: Django :: 1.10', 90 'Framework :: Django :: 1.11', 91 'Intended Audience :: Developers', 92 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)', 93 'Programming Language :: Python :: 2', 94 'Programming Language :: Python :: 2.7', 95 'Programming Language :: Python :: 3', 96 'Programming Language :: Python :: 3.6', 97 'Topic :: Software Development :: Quality Assurance', 98 'Topic :: Software Development :: Testing', 99 ], 100 ) 101 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -18,7 +18,7 @@ 'PyMySQL == 0.7.11', 'beautifulsoup4 >= 4.1.1', 'celery == 4.1.0', - 'django >= 1.10,<2.0', + 'django >= 1.11,<2.0', 'django-contrib-comments == 1.8.0', 'django-tinymce == 2.7.0', 'django-uuslug == 1.1.8', @@ -86,7 +86,6 @@ include_package_data=True, classifiers=[ 'Framework :: Django', - 'Framework :: Django :: 1.10', 'Framework :: Django :: 1.11', 'Intended Audience :: Developers', 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -18,7 +18,7 @@\n 'PyMySQL == 0.7.11',\n 'beautifulsoup4 >= 4.1.1',\n 'celery == 4.1.0',\n- 'django >= 1.10,<2.0',\n+ 'django >= 1.11,<2.0',\n 'django-contrib-comments == 1.8.0',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n@@ -86,7 +86,6 @@\n include_package_data=True,\n classifiers=[\n 'Framework :: Django',\n- 'Framework :: Django :: 1.10',\n 'Framework :: Django :: 1.11',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n", "issue": "Remove django 1.10 support\n- Remove django 1.10 testenv from tox.ini\r\n- Update django version in setup.py. Minimum django version is `1.11`.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'PyMySQL == 0.7.11',\n 'beautifulsoup4 >= 4.1.1',\n 'celery == 4.1.0',\n 'django >= 1.10,<2.0',\n 'django-contrib-comments == 1.8.0',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n 'html2text',\n 'kobo == 0.7.0',\n 'odfpy >= 0.9.6',\n 'python-bugzilla',\n 'six',\n 'xmltodict',\n]\n\nif sys.version_info.major < 3:\n install_requires += [\n 'enum34',\n ]\n\nextras_require = {\n # Required for tcms.core.contrib.auth.backends.KerberosBackend\n 'krbauth': [\n 'kerberos == 1.2.5'\n ],\n\n # Packages for building documentation\n 'docs': [\n 'Sphinx >= 1.1.2',\n 'sphinx_rtd_theme',\n ],\n\n # Necessary packages for running tests\n 'tests': [\n 'coverage',\n 'factory_boy',\n 'flake8',\n 'mock',\n 'pytest',\n 'pytest-cov',\n 'pytest-django',\n ],\n\n # Contain tools that assists the development\n 'devtools': [\n 'django-debug-toolbar == 1.7',\n 'tox',\n 'django-extensions',\n 'pygraphviz',\n 'future-breakpoint',\n ]\n}\n\n\nsetup(\n name='Nitrate',\n version=pkg_version,\n description='Test Case Management System',\n long_description=get_long_description(),\n author='Nitrate Team',\n maintainer='Chenxiong Qi',\n maintainer_email='[email protected]',\n url='https://github.com/Nitrate/Nitrate/',\n license='GPLv2+',\n keywords='test case',\n install_requires=install_requires,\n extras_require=extras_require,\n packages=find_packages(),\n include_package_data=True,\n classifiers=[\n 'Framework :: Django',\n 'Framework :: Django :: 1.10',\n 'Framework :: Django :: 1.11',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Software Development :: Quality Assurance',\n 'Topic :: Software Development :: Testing',\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'PyMySQL == 0.7.11',\n 'beautifulsoup4 >= 4.1.1',\n 'celery == 4.1.0',\n 'django >= 1.11,<2.0',\n 'django-contrib-comments == 1.8.0',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n 'html2text',\n 'kobo == 0.7.0',\n 'odfpy >= 0.9.6',\n 'python-bugzilla',\n 
'six',\n 'xmltodict',\n]\n\nif sys.version_info.major < 3:\n install_requires += [\n 'enum34',\n ]\n\nextras_require = {\n # Required for tcms.core.contrib.auth.backends.KerberosBackend\n 'krbauth': [\n 'kerberos == 1.2.5'\n ],\n\n # Packages for building documentation\n 'docs': [\n 'Sphinx >= 1.1.2',\n 'sphinx_rtd_theme',\n ],\n\n # Necessary packages for running tests\n 'tests': [\n 'coverage',\n 'factory_boy',\n 'flake8',\n 'mock',\n 'pytest',\n 'pytest-cov',\n 'pytest-django',\n ],\n\n # Contain tools that assists the development\n 'devtools': [\n 'django-debug-toolbar == 1.7',\n 'tox',\n 'django-extensions',\n 'pygraphviz',\n 'future-breakpoint',\n ]\n}\n\n\nsetup(\n name='Nitrate',\n version=pkg_version,\n description='Test Case Management System',\n long_description=get_long_description(),\n author='Nitrate Team',\n maintainer='Chenxiong Qi',\n maintainer_email='[email protected]',\n url='https://github.com/Nitrate/Nitrate/',\n license='GPLv2+',\n keywords='test case',\n install_requires=install_requires,\n extras_require=extras_require,\n packages=find_packages(),\n include_package_data=True,\n classifiers=[\n 'Framework :: Django',\n 'Framework :: Django :: 1.11',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Software Development :: Quality Assurance',\n 'Topic :: Software Development :: Testing',\n ],\n)\n", "path": "setup.py"}]}
1,155
222
gh_patches_debug_50088
rasdani/github-patches
git_diff
python-telegram-bot__python-telegram-bot-1216
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Typo in comment in conversationbot2.py
<!--
Thanks for reporting issues of python-telegram-bot!

Use this template to notify us if you found a bug, or if you want to request a new feature.
If you're looking for help with programming your bot using our library, feel free to ask your
questions in out telegram group at: https://t.me/pythontelegrambotgroup

To make it easier for us to help you please enter detailed information below.

Please note, we only support the latest version of python-telegram-bot and
master branch. Please make sure to upgrade & recreate the issue on the latest
version prior to opening an issue.
-->
### Steps to reproduce
1. Not really a bug... wrong comment line in conversationbot2.py :)

### Expected behaviour
Should be: # Add conversation handler with the states CHOOSING, TYPING_CHOICE and TYPING_REPLY

### Actual behaviour
Actual comment in conversationbot2.py (leftover from conversationbot.py I guess :))
# Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO

### Configuration
**Operating System:**
Windows

**Version of Python, python-telegram-bot & dependencies:**
3.6

``$ python -m telegram``

### Logs
Insert logs here (if necessary)

--- END ISSUE ---

Below are some code segments, each from a relevant file. One or more of these files may contain bugs.

--- BEGIN FILES ---
Path: `examples/conversationbot2.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 #
4 # Simple Bot to reply to Telegram messages
5 # This program is dedicated to the public domain under the CC0 license.
6 """
7 This Bot uses the Updater class to handle the bot.
8 
9 First, a few callback functions are defined. Then, those functions are passed to
10 the Dispatcher and registered at their respective places.
11 Then, the bot is started and runs until we press Ctrl-C on the command line.
12 
13 Usage:
14 Example of a bot-user conversation using ConversationHandler.
15 Send /start to initiate the conversation.
16 Press Ctrl-C on the command line or send a signal to the process to stop the
17 bot.
18 """
19 
20 from telegram import ReplyKeyboardMarkup
21 from telegram.ext import (Updater, CommandHandler, MessageHandler, Filters, RegexHandler,
22                           ConversationHandler)
23 
24 import logging
25 
26 # Enable logging
27 logging.basicConfig(format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
28                     level=logging.INFO)
29 
30 logger = logging.getLogger(__name__)
31 
32 CHOOSING, TYPING_REPLY, TYPING_CHOICE = range(3)
33 
34 reply_keyboard = [['Age', 'Favourite colour'],
35                   ['Number of siblings', 'Something else...'],
36                   ['Done']]
37 markup = ReplyKeyboardMarkup(reply_keyboard, one_time_keyboard=True)
38 
39 
40 def facts_to_str(user_data):
41     facts = list()
42 
43     for key, value in user_data.items():
44         facts.append('{} - {}'.format(key, value))
45 
46     return "\n".join(facts).join(['\n', '\n'])
47 
48 
49 def start(bot, update):
50     update.message.reply_text(
51         "Hi! My name is Doctor Botter. I will hold a more complex conversation with you. "
52         "Why don't you tell me something about yourself?",
53         reply_markup=markup)
54 
55     return CHOOSING
56 
57 
58 def regular_choice(bot, update, user_data):
59     text = update.message.text
60     user_data['choice'] = text
61     update.message.reply_text(
62         'Your {}? Yes, I would love to hear about that!'.format(text.lower()))
63 
64     return TYPING_REPLY
65 
66 
67 def custom_choice(bot, update):
68     update.message.reply_text('Alright, please send me the category first, '
69                               'for example "Most impressive skill"')
70 
71     return TYPING_CHOICE
72 
73 
74 def received_information(bot, update, user_data):
75     text = update.message.text
76     category = user_data['choice']
77     user_data[category] = text
78     del user_data['choice']
79 
80     update.message.reply_text("Neat! Just so you know, this is what you already told me:"
81                               "{}"
82                               "You can tell me more, or change your opinion on something.".format(
83                                   facts_to_str(user_data)), reply_markup=markup)
84 
85     return CHOOSING
86 
87 
88 def done(bot, update, user_data):
89     if 'choice' in user_data:
90         del user_data['choice']
91 
92     update.message.reply_text("I learned these facts about you:"
93                               "{}"
94                               "Until next time!".format(facts_to_str(user_data)))
95 
96     user_data.clear()
97     return ConversationHandler.END
98 
99 
100 def error(bot, update, error):
101     """Log Errors caused by Updates."""
102     logger.warning('Update "%s" caused error "%s"', update, error)
103 
104 
105 def main():
106     # Create the Updater and pass it your bot's token.
107     updater = Updater("TOKEN")
108 
109     # Get the dispatcher to register handlers
110     dp = updater.dispatcher
111 
112     # Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO
113     conv_handler = ConversationHandler(
114         entry_points=[CommandHandler('start', start)],
115 
116         states={
117             CHOOSING: [RegexHandler('^(Age|Favourite colour|Number of siblings)$',
118                                     regular_choice,
119                                     pass_user_data=True),
120                        RegexHandler('^Something else...$',
121                                     custom_choice),
122                        ],
123 
124             TYPING_CHOICE: [MessageHandler(Filters.text,
125                                            regular_choice,
126                                            pass_user_data=True),
127                             ],
128 
129             TYPING_REPLY: [MessageHandler(Filters.text,
130                                           received_information,
131                                           pass_user_data=True),
132                            ],
133         },
134 
135         fallbacks=[RegexHandler('^Done$', done, pass_user_data=True)]
136     )
137 
138     dp.add_handler(conv_handler)
139 
140     # log all errors
141     dp.add_error_handler(error)
142 
143     # Start the Bot
144     updater.start_polling()
145 
146     # Run the bot until you press Ctrl-C or the process receives SIGINT,
147     # SIGTERM or SIGABRT. This should be used most of the time, since
148     # start_polling() is non-blocking and will stop the bot gracefully.
149     updater.idle()
150 
151 
152 if __name__ == '__main__':
153     main()
154 
```

--- END FILES ---

Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:

```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
 if __name__ == "__main__":
-    asyncio.run(run_async_server("."), debug=True)
+    asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
 if __name__ == "__main__":
-    server = run_sync_server(".")
+    server = run_sync_server()
     server.shutdown()
```
diff --git a/examples/conversationbot2.py b/examples/conversationbot2.py --- a/examples/conversationbot2.py +++ b/examples/conversationbot2.py @@ -109,7 +109,7 @@ # Get the dispatcher to register handlers dp = updater.dispatcher - # Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO + # Add conversation handler with the states CHOOSING, TYPING_CHOICE and TYPING_REPLY conv_handler = ConversationHandler( entry_points=[CommandHandler('start', start)],
{"golden_diff": "diff --git a/examples/conversationbot2.py b/examples/conversationbot2.py\n--- a/examples/conversationbot2.py\n+++ b/examples/conversationbot2.py\n@@ -109,7 +109,7 @@\n # Get the dispatcher to register handlers\n dp = updater.dispatcher\n \n- # Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO\n+ # Add conversation handler with the states CHOOSING, TYPING_CHOICE and TYPING_REPLY\n conv_handler = ConversationHandler(\n entry_points=[CommandHandler('start', start)],\n", "issue": "Typo in comment in conversationbot2.py\n<!--\r\nThanks for reporting issues of python-telegram-bot!\r\n\r\nUse this template to notify us if you found a bug, or if you want to request a new feature.\r\nIf you're looking for help with programming your bot using our library, feel free to ask your\r\nquestions in out telegram group at: https://t.me/pythontelegrambotgroup\r\n\r\nTo make it easier for us to help you please enter detailed information below.\r\n\r\nPlease note, we only support the latest version of python-telegram-bot and\r\nmaster branch. Please make sure to upgrade & recreate the issue on the latest\r\nversion prior to opening an issue.\r\n-->\r\n### Steps to reproduce\r\n1. Not really a bug... wrong comment line in conversationbot2.py :)\r\n\r\n### Expected behaviour\r\nShould be: # Add conversation handler with the states CHOOSING, TYPING_CHOICE and TYPING_REPLY\r\n\r\n### Actual behaviour\r\nActual comment in conversationbot2.py (leftover from conversationbot.py I guess :))\r\n# Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO\r\n\r\n### Configuration\r\n**Operating System:**\r\nWindows\r\n\r\n**Version of Python, python-telegram-bot & dependencies:**\r\n3.6\r\n\r\n``$ python -m telegram``\r\n\r\n### Logs\r\nInsert logs here (if necessary)\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Simple Bot to reply to Telegram messages\n# This program is dedicated to the public domain under the CC0 license.\n\"\"\"\nThis Bot uses the Updater class to handle the bot.\n\nFirst, a few callback functions are defined. Then, those functions are passed to\nthe Dispatcher and registered at their respective places.\nThen, the bot is started and runs until we press Ctrl-C on the command line.\n\nUsage:\nExample of a bot-user conversation using ConversationHandler.\nSend /start to initiate the conversation.\nPress Ctrl-C on the command line or send a signal to the process to stop the\nbot.\n\"\"\"\n\nfrom telegram import ReplyKeyboardMarkup\nfrom telegram.ext import (Updater, CommandHandler, MessageHandler, Filters, RegexHandler,\n ConversationHandler)\n\nimport logging\n\n# Enable logging\nlogging.basicConfig(format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',\n level=logging.INFO)\n\nlogger = logging.getLogger(__name__)\n\nCHOOSING, TYPING_REPLY, TYPING_CHOICE = range(3)\n\nreply_keyboard = [['Age', 'Favourite colour'],\n ['Number of siblings', 'Something else...'],\n ['Done']]\nmarkup = ReplyKeyboardMarkup(reply_keyboard, one_time_keyboard=True)\n\n\ndef facts_to_str(user_data):\n facts = list()\n\n for key, value in user_data.items():\n facts.append('{} - {}'.format(key, value))\n\n return \"\\n\".join(facts).join(['\\n', '\\n'])\n\n\ndef start(bot, update):\n update.message.reply_text(\n \"Hi! My name is Doctor Botter. I will hold a more complex conversation with you. 
\"\n \"Why don't you tell me something about yourself?\",\n reply_markup=markup)\n\n return CHOOSING\n\n\ndef regular_choice(bot, update, user_data):\n text = update.message.text\n user_data['choice'] = text\n update.message.reply_text(\n 'Your {}? Yes, I would love to hear about that!'.format(text.lower()))\n\n return TYPING_REPLY\n\n\ndef custom_choice(bot, update):\n update.message.reply_text('Alright, please send me the category first, '\n 'for example \"Most impressive skill\"')\n\n return TYPING_CHOICE\n\n\ndef received_information(bot, update, user_data):\n text = update.message.text\n category = user_data['choice']\n user_data[category] = text\n del user_data['choice']\n\n update.message.reply_text(\"Neat! Just so you know, this is what you already told me:\"\n \"{}\"\n \"You can tell me more, or change your opinion on something.\".format(\n facts_to_str(user_data)), reply_markup=markup)\n\n return CHOOSING\n\n\ndef done(bot, update, user_data):\n if 'choice' in user_data:\n del user_data['choice']\n\n update.message.reply_text(\"I learned these facts about you:\"\n \"{}\"\n \"Until next time!\".format(facts_to_str(user_data)))\n\n user_data.clear()\n return ConversationHandler.END\n\n\ndef error(bot, update, error):\n \"\"\"Log Errors caused by Updates.\"\"\"\n logger.warning('Update \"%s\" caused error \"%s\"', update, error)\n\n\ndef main():\n # Create the Updater and pass it your bot's token.\n updater = Updater(\"TOKEN\")\n\n # Get the dispatcher to register handlers\n dp = updater.dispatcher\n\n # Add conversation handler with the states GENDER, PHOTO, LOCATION and BIO\n conv_handler = ConversationHandler(\n entry_points=[CommandHandler('start', start)],\n\n states={\n CHOOSING: [RegexHandler('^(Age|Favourite colour|Number of siblings)$',\n regular_choice,\n pass_user_data=True),\n RegexHandler('^Something else...$',\n custom_choice),\n ],\n\n TYPING_CHOICE: [MessageHandler(Filters.text,\n regular_choice,\n pass_user_data=True),\n ],\n\n TYPING_REPLY: [MessageHandler(Filters.text,\n received_information,\n pass_user_data=True),\n ],\n },\n\n fallbacks=[RegexHandler('^Done$', done, pass_user_data=True)]\n )\n\n dp.add_handler(conv_handler)\n\n # log all errors\n dp.add_error_handler(error)\n\n # Start the Bot\n updater.start_polling()\n\n # Run the bot until you press Ctrl-C or the process receives SIGINT,\n # SIGTERM or SIGABRT. This should be used most of the time, since\n # start_polling() is non-blocking and will stop the bot gracefully.\n updater.idle()\n\n\nif __name__ == '__main__':\n main()\n", "path": "examples/conversationbot2.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Simple Bot to reply to Telegram messages\n# This program is dedicated to the public domain under the CC0 license.\n\"\"\"\nThis Bot uses the Updater class to handle the bot.\n\nFirst, a few callback functions are defined. 
Then, those functions are passed to\nthe Dispatcher and registered at their respective places.\nThen, the bot is started and runs until we press Ctrl-C on the command line.\n\nUsage:\nExample of a bot-user conversation using ConversationHandler.\nSend /start to initiate the conversation.\nPress Ctrl-C on the command line or send a signal to the process to stop the\nbot.\n\"\"\"\n\nfrom telegram import ReplyKeyboardMarkup\nfrom telegram.ext import (Updater, CommandHandler, MessageHandler, Filters, RegexHandler,\n ConversationHandler)\n\nimport logging\n\n# Enable logging\nlogging.basicConfig(format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',\n level=logging.INFO)\n\nlogger = logging.getLogger(__name__)\n\nCHOOSING, TYPING_REPLY, TYPING_CHOICE = range(3)\n\nreply_keyboard = [['Age', 'Favourite colour'],\n ['Number of siblings', 'Something else...'],\n ['Done']]\nmarkup = ReplyKeyboardMarkup(reply_keyboard, one_time_keyboard=True)\n\n\ndef facts_to_str(user_data):\n facts = list()\n\n for key, value in user_data.items():\n facts.append('{} - {}'.format(key, value))\n\n return \"\\n\".join(facts).join(['\\n', '\\n'])\n\n\ndef start(bot, update):\n update.message.reply_text(\n \"Hi! My name is Doctor Botter. I will hold a more complex conversation with you. \"\n \"Why don't you tell me something about yourself?\",\n reply_markup=markup)\n\n return CHOOSING\n\n\ndef regular_choice(bot, update, user_data):\n text = update.message.text\n user_data['choice'] = text\n update.message.reply_text(\n 'Your {}? Yes, I would love to hear about that!'.format(text.lower()))\n\n return TYPING_REPLY\n\n\ndef custom_choice(bot, update):\n update.message.reply_text('Alright, please send me the category first, '\n 'for example \"Most impressive skill\"')\n\n return TYPING_CHOICE\n\n\ndef received_information(bot, update, user_data):\n text = update.message.text\n category = user_data['choice']\n user_data[category] = text\n del user_data['choice']\n\n update.message.reply_text(\"Neat! 
Just so you know, this is what you already told me:\"\n \"{}\"\n \"You can tell me more, or change your opinion on something.\".format(\n facts_to_str(user_data)), reply_markup=markup)\n\n return CHOOSING\n\n\ndef done(bot, update, user_data):\n if 'choice' in user_data:\n del user_data['choice']\n\n update.message.reply_text(\"I learned these facts about you:\"\n \"{}\"\n \"Until next time!\".format(facts_to_str(user_data)))\n\n user_data.clear()\n return ConversationHandler.END\n\n\ndef error(bot, update, error):\n \"\"\"Log Errors caused by Updates.\"\"\"\n logger.warning('Update \"%s\" caused error \"%s\"', update, error)\n\n\ndef main():\n # Create the Updater and pass it your bot's token.\n updater = Updater(\"TOKEN\")\n\n # Get the dispatcher to register handlers\n dp = updater.dispatcher\n\n # Add conversation handler with the states CHOOSING, TYPING_CHOICE and TYPING_REPLY\n conv_handler = ConversationHandler(\n entry_points=[CommandHandler('start', start)],\n\n states={\n CHOOSING: [RegexHandler('^(Age|Favourite colour|Number of siblings)$',\n regular_choice,\n pass_user_data=True),\n RegexHandler('^Something else...$',\n custom_choice),\n ],\n\n TYPING_CHOICE: [MessageHandler(Filters.text,\n regular_choice,\n pass_user_data=True),\n ],\n\n TYPING_REPLY: [MessageHandler(Filters.text,\n received_information,\n pass_user_data=True),\n ],\n },\n\n fallbacks=[RegexHandler('^Done$', done, pass_user_data=True)]\n )\n\n dp.add_handler(conv_handler)\n\n # log all errors\n dp.add_error_handler(error)\n\n # Start the Bot\n updater.start_polling()\n\n # Run the bot until you press Ctrl-C or the process receives SIGINT,\n # SIGTERM or SIGABRT. This should be used most of the time, since\n # start_polling() is non-blocking and will stop the bot gracefully.\n updater.idle()\n\n\nif __name__ == '__main__':\n main()\n", "path": "examples/conversationbot2.py"}]}
1,902
125
gh_patches_debug_39573
rasdani/github-patches
git_diff
scoutapp__scout_apm_python-674
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support Jinja 3.x
Jinja 3.x was released last week and it has broken some functionality within the agent. The following tests are currently failing:

```
tests/integration/instruments/test_jinja2_py36plus.py::test_async_render
tests/integration/instruments/test_jinja2_py36plus.py::test_async_render_name
tests/integration/instruments/test_urllib3.py::test_request - pytest.PytestUnraisableExceptionWarning
tests/integration/instruments/test_urllib3.py::test_request_no_absolute_url
```

--- END ISSUE ---

Below are some code segments, each from a relevant file. One or more of these files may contain bugs.

--- BEGIN FILES ---
Path: `src/scout_apm/instruments/jinja2.py`
Content:
```
1 # coding=utf-8
2 from __future__ import absolute_import, division, print_function, unicode_literals
3 
4 import logging
5 import sys
6 
7 import wrapt
8 
9 from scout_apm.core.tracked_request import TrackedRequest
10 
11 try:
12     from jinja2 import Environment
13 except ImportError:  # pragma: no cover
14     Environment = None
15 
16 try:
17     from jinja2 import Template
18 except ImportError:  # pragma: no cover
19     Template = None
20 
21 # The async_ module can only be shipped on Python 3.6+
22 try:
23     from scout_apm.async_.instruments.jinja2 import wrapped_render_async
24 except ImportError:
25     wrapped_render_async = None
26 
27 
28 logger = logging.getLogger(__name__)
29 
30 
31 have_patched_environment_init = False
32 have_patched_template_render = False
33 have_patched_template_render_async = False
34 
35 
36 def ensure_installed():
37     global have_patched_environment_init
38     global have_patched_template_render
39 
40     logger.debug("Instrumenting Jinja2.")
41 
42     if Template is None:
43         logger.debug("Couldn't import jinja2.Template - probably not installed.")
44         return
45 
46     if not have_patched_environment_init:
47         try:
48             Environment.__init__ = wrapped_environment_init(Environment.__init__)
49         except Exception as exc:
50             logger.warning(
51                 "Failed to instrument jinja2.Environment.__init__: %r",
52                 exc,
53                 exc_info=exc,
54             )
55         else:
56             have_patched_environment_init = True
57 
58     if not have_patched_template_render:
59         try:
60             Template.render = wrapped_render(Template.render)
61         except Exception as exc:
62             logger.warning(
63                 "Failed to instrument jinja2.Template.render: %r", exc, exc_info=exc
64             )
65         else:
66             have_patched_template_render = True
67 
68 
69 @wrapt.decorator
70 def wrapped_render(wrapped, instance, args, kwargs):
71     tracked_request = TrackedRequest.instance()
72     with tracked_request.span(operation="Template/Render") as span:
73         span.tag("name", instance.name)
74         return wrapped(*args, **kwargs)
75 
76 
77 @wrapt.decorator
78 def wrapped_environment_init(wrapped, instance, args, kwargs):
79     """
80     Delayed wrapping of render_async(), since Template won't have this method
81     until after jinja2.asyncsupport is imported, which since Jinja2 2.11.0 is
82     done conditionally in Environment.__init__:
83     https://github.com/pallets/jinja/issues/765
84     """
85     global have_patched_template_render_async
86     result = wrapped(*args, **kwargs)
87 
88     if (
89         wrapped_render_async is not None
90         and not have_patched_template_render_async
91         and "jinja2.asyncsupport" in sys.modules
92     ):
93         try:
94             Template.render_async = wrapped_render_async(Template.render_async)
95         except Exception as exc:
96             logger.warning(
97                 "Failed to instrument jinja2.Template.render_async: %r",
98                 exc,
99                 exc_info=exc,
100             )
101         else:
102             have_patched_template_render_async = True
103 
104     return result
105 
```

--- END FILES ---

Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:

```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
 if __name__ == "__main__":
-    asyncio.run(run_async_server("."), debug=True)
+    asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
 if __name__ == "__main__":
-    server = run_sync_server(".")
+    server = run_sync_server()
     server.shutdown()
```
diff --git a/src/scout_apm/instruments/jinja2.py b/src/scout_apm/instruments/jinja2.py --- a/src/scout_apm/instruments/jinja2.py +++ b/src/scout_apm/instruments/jinja2.py @@ -34,7 +34,6 @@ def ensure_installed(): - global have_patched_environment_init global have_patched_template_render logger.debug("Instrumenting Jinja2.") @@ -43,9 +42,31 @@ logger.debug("Couldn't import jinja2.Template - probably not installed.") return - if not have_patched_environment_init: + instrument_render_async() + + if not have_patched_template_render: + try: + Template.render = wrapped_render(Template.render) + except Exception as exc: + logger.warning( + "Failed to instrument jinja2.Template.render: %r", exc, exc_info=exc + ) + else: + have_patched_template_render = True + + +def instrument_render_async(): + global have_patched_environment_init + global have_patched_template_render_async + + if wrapped_render_async is None: + return + + if not have_patched_environment_init and not hasattr(Template, "render_async"): try: - Environment.__init__ = wrapped_environment_init(Environment.__init__) + Environment.__init__ = wrapped_environment_init_jinja_v2( + Environment.__init__ + ) except Exception as exc: logger.warning( "Failed to instrument jinja2.Environment.__init__: %r", @@ -54,16 +75,17 @@ ) else: have_patched_environment_init = True - - if not have_patched_template_render: + elif hasattr(Template, "render_async") and not have_patched_template_render_async: try: - Template.render = wrapped_render(Template.render) + Template.render_async = wrapped_render_async(Template.render_async) except Exception as exc: logger.warning( - "Failed to instrument jinja2.Template.render: %r", exc, exc_info=exc + "Failed to instrument jinja2.Template.render_async: %r", + exc, + exc_info=exc, ) else: - have_patched_template_render = True + have_patched_template_render_async = True @wrapt.decorator @@ -75,12 +97,14 @@ @wrapt.decorator -def wrapped_environment_init(wrapped, instance, args, kwargs): +def wrapped_environment_init_jinja_v2(wrapped, instance, args, kwargs): """ Delayed wrapping of render_async(), since Template won't have this method until after jinja2.asyncsupport is imported, which since Jinja2 2.11.0 is done conditionally in Environment.__init__: https://github.com/pallets/jinja/issues/765 + + This is no longer needed since Jinja2 v3.0.0 """ global have_patched_template_render_async result = wrapped(*args, **kwargs)
{"golden_diff": "diff --git a/src/scout_apm/instruments/jinja2.py b/src/scout_apm/instruments/jinja2.py\n--- a/src/scout_apm/instruments/jinja2.py\n+++ b/src/scout_apm/instruments/jinja2.py\n@@ -34,7 +34,6 @@\n \n \n def ensure_installed():\n- global have_patched_environment_init\n global have_patched_template_render\n \n logger.debug(\"Instrumenting Jinja2.\")\n@@ -43,9 +42,31 @@\n logger.debug(\"Couldn't import jinja2.Template - probably not installed.\")\n return\n \n- if not have_patched_environment_init:\n+ instrument_render_async()\n+\n+ if not have_patched_template_render:\n+ try:\n+ Template.render = wrapped_render(Template.render)\n+ except Exception as exc:\n+ logger.warning(\n+ \"Failed to instrument jinja2.Template.render: %r\", exc, exc_info=exc\n+ )\n+ else:\n+ have_patched_template_render = True\n+\n+\n+def instrument_render_async():\n+ global have_patched_environment_init\n+ global have_patched_template_render_async\n+\n+ if wrapped_render_async is None:\n+ return\n+\n+ if not have_patched_environment_init and not hasattr(Template, \"render_async\"):\n try:\n- Environment.__init__ = wrapped_environment_init(Environment.__init__)\n+ Environment.__init__ = wrapped_environment_init_jinja_v2(\n+ Environment.__init__\n+ )\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Environment.__init__: %r\",\n@@ -54,16 +75,17 @@\n )\n else:\n have_patched_environment_init = True\n-\n- if not have_patched_template_render:\n+ elif hasattr(Template, \"render_async\") and not have_patched_template_render_async:\n try:\n- Template.render = wrapped_render(Template.render)\n+ Template.render_async = wrapped_render_async(Template.render_async)\n except Exception as exc:\n logger.warning(\n- \"Failed to instrument jinja2.Template.render: %r\", exc, exc_info=exc\n+ \"Failed to instrument jinja2.Template.render_async: %r\",\n+ exc,\n+ exc_info=exc,\n )\n else:\n- have_patched_template_render = True\n+ have_patched_template_render_async = True\n \n \n @wrapt.decorator\n@@ -75,12 +97,14 @@\n \n \n @wrapt.decorator\n-def wrapped_environment_init(wrapped, instance, args, kwargs):\n+def wrapped_environment_init_jinja_v2(wrapped, instance, args, kwargs):\n \"\"\"\n Delayed wrapping of render_async(), since Template won't have this method\n until after jinja2.asyncsupport is imported, which since Jinja2 2.11.0 is\n done conditionally in Environment.__init__:\n https://github.com/pallets/jinja/issues/765\n+\n+ This is no longer needed since Jinja2 v3.0.0\n \"\"\"\n global have_patched_template_render_async\n result = wrapped(*args, **kwargs)\n", "issue": "Support Jinja 3.x\nJinja 3.x was released last week and it has broken some functionality within the agent. 
The following tests are currently failing:\r\n\r\n```\r\ntests/integration/instruments/test_jinja2_py36plus.py::test_async_render\r\ntests/integration/instruments/test_jinja2_py36plus.py::test_async_render_name\r\ntests/integration/instruments/test_urllib3.py::test_request - pytest.PytestUnraisableExceptionWarning\r\ntests/integration/instruments/test_urllib3.py::test_request_no_absolute_url\r\n```\r\n\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport logging\nimport sys\n\nimport wrapt\n\nfrom scout_apm.core.tracked_request import TrackedRequest\n\ntry:\n from jinja2 import Environment\nexcept ImportError: # pragma: no cover\n Environment = None\n\ntry:\n from jinja2 import Template\nexcept ImportError: # pragma: no cover\n Template = None\n\n# The async_ module can only be shipped on Python 3.6+\ntry:\n from scout_apm.async_.instruments.jinja2 import wrapped_render_async\nexcept ImportError:\n wrapped_render_async = None\n\n\nlogger = logging.getLogger(__name__)\n\n\nhave_patched_environment_init = False\nhave_patched_template_render = False\nhave_patched_template_render_async = False\n\n\ndef ensure_installed():\n global have_patched_environment_init\n global have_patched_template_render\n\n logger.debug(\"Instrumenting Jinja2.\")\n\n if Template is None:\n logger.debug(\"Couldn't import jinja2.Template - probably not installed.\")\n return\n\n if not have_patched_environment_init:\n try:\n Environment.__init__ = wrapped_environment_init(Environment.__init__)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Environment.__init__: %r\",\n exc,\n exc_info=exc,\n )\n else:\n have_patched_environment_init = True\n\n if not have_patched_template_render:\n try:\n Template.render = wrapped_render(Template.render)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Template.render: %r\", exc, exc_info=exc\n )\n else:\n have_patched_template_render = True\n\n\[email protected]\ndef wrapped_render(wrapped, instance, args, kwargs):\n tracked_request = TrackedRequest.instance()\n with tracked_request.span(operation=\"Template/Render\") as span:\n span.tag(\"name\", instance.name)\n return wrapped(*args, **kwargs)\n\n\[email protected]\ndef wrapped_environment_init(wrapped, instance, args, kwargs):\n \"\"\"\n Delayed wrapping of render_async(), since Template won't have this method\n until after jinja2.asyncsupport is imported, which since Jinja2 2.11.0 is\n done conditionally in Environment.__init__:\n https://github.com/pallets/jinja/issues/765\n \"\"\"\n global have_patched_template_render_async\n result = wrapped(*args, **kwargs)\n\n if (\n wrapped_render_async is not None\n and not have_patched_template_render_async\n and \"jinja2.asyncsupport\" in sys.modules\n ):\n try:\n Template.render_async = wrapped_render_async(Template.render_async)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Template.render_async: %r\",\n exc,\n exc_info=exc,\n )\n else:\n have_patched_template_render_async = True\n\n return result\n", "path": "src/scout_apm/instruments/jinja2.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport logging\nimport sys\n\nimport wrapt\n\nfrom scout_apm.core.tracked_request import TrackedRequest\n\ntry:\n from jinja2 import Environment\nexcept ImportError: # pragma: no cover\n Environment = None\n\ntry:\n from jinja2 import 
Template\nexcept ImportError: # pragma: no cover\n Template = None\n\n# The async_ module can only be shipped on Python 3.6+\ntry:\n from scout_apm.async_.instruments.jinja2 import wrapped_render_async\nexcept ImportError:\n wrapped_render_async = None\n\n\nlogger = logging.getLogger(__name__)\n\n\nhave_patched_environment_init = False\nhave_patched_template_render = False\nhave_patched_template_render_async = False\n\n\ndef ensure_installed():\n global have_patched_template_render\n\n logger.debug(\"Instrumenting Jinja2.\")\n\n if Template is None:\n logger.debug(\"Couldn't import jinja2.Template - probably not installed.\")\n return\n\n instrument_render_async()\n\n if not have_patched_template_render:\n try:\n Template.render = wrapped_render(Template.render)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Template.render: %r\", exc, exc_info=exc\n )\n else:\n have_patched_template_render = True\n\n\ndef instrument_render_async():\n global have_patched_environment_init\n global have_patched_template_render_async\n\n if wrapped_render_async is None:\n return\n\n if not have_patched_environment_init and not hasattr(Template, \"render_async\"):\n try:\n Environment.__init__ = wrapped_environment_init_jinja_v2(\n Environment.__init__\n )\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Environment.__init__: %r\",\n exc,\n exc_info=exc,\n )\n else:\n have_patched_environment_init = True\n elif hasattr(Template, \"render_async\") and not have_patched_template_render_async:\n try:\n Template.render_async = wrapped_render_async(Template.render_async)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Template.render_async: %r\",\n exc,\n exc_info=exc,\n )\n else:\n have_patched_template_render_async = True\n\n\[email protected]\ndef wrapped_render(wrapped, instance, args, kwargs):\n tracked_request = TrackedRequest.instance()\n with tracked_request.span(operation=\"Template/Render\") as span:\n span.tag(\"name\", instance.name)\n return wrapped(*args, **kwargs)\n\n\[email protected]\ndef wrapped_environment_init_jinja_v2(wrapped, instance, args, kwargs):\n \"\"\"\n Delayed wrapping of render_async(), since Template won't have this method\n until after jinja2.asyncsupport is imported, which since Jinja2 2.11.0 is\n done conditionally in Environment.__init__:\n https://github.com/pallets/jinja/issues/765\n\n This is no longer needed since Jinja2 v3.0.0\n \"\"\"\n global have_patched_template_render_async\n result = wrapped(*args, **kwargs)\n\n if (\n wrapped_render_async is not None\n and not have_patched_template_render_async\n and \"jinja2.asyncsupport\" in sys.modules\n ):\n try:\n Template.render_async = wrapped_render_async(Template.render_async)\n except Exception as exc:\n logger.warning(\n \"Failed to instrument jinja2.Template.render_async: %r\",\n exc,\n exc_info=exc,\n )\n else:\n have_patched_template_render_async = True\n\n return result\n", "path": "src/scout_apm/instruments/jinja2.py"}]}
1,252
692
gh_patches_debug_12015
rasdani/github-patches
git_diff
iterative__dvc-8505
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- `exp run`: data gets re-imported on every call # Bug Report ## Description When a pipeline uses an imported data file (with `dvc import`), the data gets cloned(?) and hashed every time `dvc exp run` is called. ### Reproduce 1. dvc import [email protected]:iterative/dataset-registry.git use-cases/cats-dogs 2. dvc stage add -n foo -d cats-dogs echo foo 3. dvc exp run ### Expected When using `dvc repro` the imported data doesn't get re-hashed. I would expect `dvc exp run` to behave the same. ### Environment information **Output of `dvc doctor`:** ```console $ dvc doctor DVC version: 2.6.3 (pip) --------------------------------- Platform: Python 3.9.6 on macOS-10.16-x86_64-i386-64bit Supports: gdrive (pydrive2 = 1.9.1), http (requests = 2.26.0), https (requests = 2.26.0) Cache types: reflink, hardlink, symlink Cache directory: apfs on /dev/disk1s1s1 Caches: local Remotes: None Workspace directory: apfs on /dev/disk1s1s1 Repo: dvc, git ``` **Additional Information (if any):** ```console $ dvc repro -v 2021-08-25 11:11:55,186 DEBUG: Computed stage: 'cats-dogs.dvc' md5: '5a135b297ee3c96465ce4b320f44fb8b' 'cats-dogs.dvc' didn't change, skipping Stage 'foo' didn't change, skipping Data and pipelines are up to date. ``` ```console $ dvc exp run -v 2021-08-25 11:12:15,672 DEBUG: Detaching HEAD at 'HEAD' 2021-08-25 11:12:15,690 DEBUG: Stashing workspace 2021-08-25 11:12:15,700 DEBUG: No changes to stash 2021-08-25 11:12:15,749 DEBUG: Creating external repo [email protected]:iterative/dataset-registry.git@ca140591a21c6d75a7057d1e2eb3f51d3115c5f5 2021-08-25 11:12:15,749 DEBUG: erepo: git clone '[email protected]:iterative/dataset-registry.git' to a temporary dir Computing file/dir hashes (only done once) . . . ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `dvc/repo/commit.py` Content: ``` 1 from dvc import prompt 2 3 from . import locked 4 5 6 def _prepare_message(stage, changes): 7 changed_deps, changed_outs, changed_stage = changes 8 if changed_deps and changed_outs: 9 msg = "dependencies {deps} and outputs {outs} of {stage} changed." 10 elif changed_deps: 11 msg = "dependencies {deps} of {stage} changed." 12 elif changed_outs: 13 msg = "outputs {outs} of {stage} changed." 14 else: 15 msg = "{stage_changed}" 16 17 msg += " Are you sure you want to commit it?" 18 19 kw = { 20 "stage": stage, 21 "deps": changed_deps, 22 "outs": changed_outs, 23 "stage_changed": changed_stage, 24 } 25 return msg.format_map(kw) 26 27 28 def prompt_to_commit(stage, changes, force=False): 29 from dvc.stage.exceptions import StageCommitError 30 31 if not (force or prompt.confirm(_prepare_message(stage, changes))): 32 raise StageCommitError( 33 "unable to commit changed {}. 
Use `-f|--force` to " 34 "force.".format(stage) 35 ) 36 37 38 @locked 39 def commit( 40 self, 41 target, 42 with_deps=False, 43 recursive=False, 44 force=False, 45 allow_missing=False, 46 data_only=False, 47 ): 48 from dvc.dvcfile import Dvcfile 49 50 stages_info = [ 51 info 52 for info in self.stage.collect_granular( 53 target, with_deps=with_deps, recursive=recursive 54 ) 55 if not data_only or info.stage.is_data_source 56 ] 57 for stage_info in stages_info: 58 stage = stage_info.stage 59 changes = stage.changed_entries() 60 if any(changes): 61 prompt_to_commit(stage, changes, force=force) 62 stage.save(allow_missing=allow_missing) 63 stage.commit( 64 filter_info=stage_info.filter_info, allow_missing=allow_missing 65 ) 66 67 Dvcfile(self, stage.path).dump(stage, update_pipeline=False) 68 return [s.stage for s in stages_info] 69 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/dvc/repo/commit.py b/dvc/repo/commit.py --- a/dvc/repo/commit.py +++ b/dvc/repo/commit.py @@ -56,10 +56,13 @@ ] for stage_info in stages_info: stage = stage_info.stage - changes = stage.changed_entries() - if any(changes): - prompt_to_commit(stage, changes, force=force) + if force: stage.save(allow_missing=allow_missing) + else: + changes = stage.changed_entries() + if any(changes): + prompt_to_commit(stage, changes, force=force) + stage.save(allow_missing=allow_missing) stage.commit( filter_info=stage_info.filter_info, allow_missing=allow_missing )
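The performance point of the patch is that `stage.changed_entries()`, the call that re-hashes imported data, previously ran even when `force=True`, which is the path `dvc exp run` takes. A runnable sketch with a hypothetical `Stage` stub (the real class lives in dvc; only the control flow is reproduced here):

```python
# Hypothetical stub standing in for dvc's Stage; only the flow matters.
class Stage:
    def changed_entries(self):
        print("re-hashing imported data...")  # the expensive step from the report
        return ([], [], None)

    def save(self, allow_missing=False):
        print("stage saved")

def commit_stage(stage, force=False, allow_missing=False):
    if force:  # patched fast path: hashing is skipped entirely
        stage.save(allow_missing=allow_missing)
    else:      # unchanged slow path
        changes = stage.changed_entries()
        if any(changes):
            pass  # prompt_to_commit(stage, changes, force=force) in the real module
        stage.save(allow_missing=allow_missing)

commit_stage(Stage(), force=True)   # prints only "stage saved"
commit_stage(Stage(), force=False)  # re-hashes first, then saves
```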
{"golden_diff": "diff --git a/dvc/repo/commit.py b/dvc/repo/commit.py\n--- a/dvc/repo/commit.py\n+++ b/dvc/repo/commit.py\n@@ -56,10 +56,13 @@\n ]\n for stage_info in stages_info:\n stage = stage_info.stage\n- changes = stage.changed_entries()\n- if any(changes):\n- prompt_to_commit(stage, changes, force=force)\n+ if force:\n stage.save(allow_missing=allow_missing)\n+ else:\n+ changes = stage.changed_entries()\n+ if any(changes):\n+ prompt_to_commit(stage, changes, force=force)\n+ stage.save(allow_missing=allow_missing)\n stage.commit(\n filter_info=stage_info.filter_info, allow_missing=allow_missing\n )\n", "issue": "`exp run`: data gets re-imported on every call\n# Bug Report\r\n\r\n## Description\r\n\r\nWhen a pipeline uses an imported data file (with `dvc import`), the data gets cloned(?) and hashed every time `dvc exp run` is called.\r\n\r\n### Reproduce\r\n\r\n1. dvc import [email protected]:iterative/dataset-registry.git use-cases/cats-dogs\r\n2. dvc stage add -n foo -d cats-dogs echo foo\r\n3. dvc exp run\r\n\r\n### Expected\r\n\r\nWhen using `dvc repro` the imported data doesn't get re-hashed. I would expect `dvc exp run` to behave the same.\r\n\r\n### Environment information\r\n\r\n**Output of `dvc doctor`:**\r\n\r\n```console\r\n$ dvc doctor\r\nDVC version: 2.6.3 (pip)\r\n---------------------------------\r\nPlatform: Python 3.9.6 on macOS-10.16-x86_64-i386-64bit\r\nSupports:\r\n gdrive (pydrive2 = 1.9.1),\r\n http (requests = 2.26.0),\r\n https (requests = 2.26.0)\r\nCache types: reflink, hardlink, symlink\r\nCache directory: apfs on /dev/disk1s1s1\r\nCaches: local\r\nRemotes: None\r\nWorkspace directory: apfs on /dev/disk1s1s1\r\nRepo: dvc, git\r\n```\r\n\r\n**Additional Information (if any):**\r\n\r\n```console\r\n$ dvc repro -v\r\n2021-08-25 11:11:55,186 DEBUG: Computed stage: 'cats-dogs.dvc' md5: '5a135b297ee3c96465ce4b320f44fb8b'\r\n'cats-dogs.dvc' didn't change, skipping\r\nStage 'foo' didn't change, skipping\r\nData and pipelines are up to date.\r\n```\r\n\r\n```console\r\n$ dvc exp run -v\r\n2021-08-25 11:12:15,672 DEBUG: Detaching HEAD at 'HEAD' \r\n2021-08-25 11:12:15,690 DEBUG: Stashing workspace\r\n2021-08-25 11:12:15,700 DEBUG: No changes to stash\r\n2021-08-25 11:12:15,749 DEBUG: Creating external repo [email protected]:iterative/dataset-registry.git@ca140591a21c6d75a7057d1e2eb3f51d3115c5f5\r\n2021-08-25 11:12:15,749 DEBUG: erepo: git clone '[email protected]:iterative/dataset-registry.git' to a temporary dir\r\nComputing file/dir hashes (only done once) \r\n. . . \r\n```\r\n\r\n\n", "before_files": [{"content": "from dvc import prompt\n\nfrom . import locked\n\n\ndef _prepare_message(stage, changes):\n changed_deps, changed_outs, changed_stage = changes\n if changed_deps and changed_outs:\n msg = \"dependencies {deps} and outputs {outs} of {stage} changed.\"\n elif changed_deps:\n msg = \"dependencies {deps} of {stage} changed.\"\n elif changed_outs:\n msg = \"outputs {outs} of {stage} changed.\"\n else:\n msg = \"{stage_changed}\"\n\n msg += \" Are you sure you want to commit it?\"\n\n kw = {\n \"stage\": stage,\n \"deps\": changed_deps,\n \"outs\": changed_outs,\n \"stage_changed\": changed_stage,\n }\n return msg.format_map(kw)\n\n\ndef prompt_to_commit(stage, changes, force=False):\n from dvc.stage.exceptions import StageCommitError\n\n if not (force or prompt.confirm(_prepare_message(stage, changes))):\n raise StageCommitError(\n \"unable to commit changed {}. 
Use `-f|--force` to \"\n \"force.\".format(stage)\n )\n\n\n@locked\ndef commit(\n self,\n target,\n with_deps=False,\n recursive=False,\n force=False,\n allow_missing=False,\n data_only=False,\n):\n from dvc.dvcfile import Dvcfile\n\n stages_info = [\n info\n for info in self.stage.collect_granular(\n target, with_deps=with_deps, recursive=recursive\n )\n if not data_only or info.stage.is_data_source\n ]\n for stage_info in stages_info:\n stage = stage_info.stage\n changes = stage.changed_entries()\n if any(changes):\n prompt_to_commit(stage, changes, force=force)\n stage.save(allow_missing=allow_missing)\n stage.commit(\n filter_info=stage_info.filter_info, allow_missing=allow_missing\n )\n\n Dvcfile(self, stage.path).dump(stage, update_pipeline=False)\n return [s.stage for s in stages_info]\n", "path": "dvc/repo/commit.py"}], "after_files": [{"content": "from dvc import prompt\n\nfrom . import locked\n\n\ndef _prepare_message(stage, changes):\n changed_deps, changed_outs, changed_stage = changes\n if changed_deps and changed_outs:\n msg = \"dependencies {deps} and outputs {outs} of {stage} changed.\"\n elif changed_deps:\n msg = \"dependencies {deps} of {stage} changed.\"\n elif changed_outs:\n msg = \"outputs {outs} of {stage} changed.\"\n else:\n msg = \"{stage_changed}\"\n\n msg += \" Are you sure you want to commit it?\"\n\n kw = {\n \"stage\": stage,\n \"deps\": changed_deps,\n \"outs\": changed_outs,\n \"stage_changed\": changed_stage,\n }\n return msg.format_map(kw)\n\n\ndef prompt_to_commit(stage, changes, force=False):\n from dvc.stage.exceptions import StageCommitError\n\n if not (force or prompt.confirm(_prepare_message(stage, changes))):\n raise StageCommitError(\n \"unable to commit changed {}. Use `-f|--force` to \"\n \"force.\".format(stage)\n )\n\n\n@locked\ndef commit(\n self,\n target,\n with_deps=False,\n recursive=False,\n force=False,\n allow_missing=False,\n data_only=False,\n):\n from dvc.dvcfile import Dvcfile\n\n stages_info = [\n info\n for info in self.stage.collect_granular(\n target, with_deps=with_deps, recursive=recursive\n )\n if not data_only or info.stage.is_data_source\n ]\n for stage_info in stages_info:\n stage = stage_info.stage\n if force:\n stage.save(allow_missing=allow_missing)\n else:\n changes = stage.changed_entries()\n if any(changes):\n prompt_to_commit(stage, changes, force=force)\n stage.save(allow_missing=allow_missing)\n stage.commit(\n filter_info=stage_info.filter_info, allow_missing=allow_missing\n )\n\n Dvcfile(self, stage.path).dump(stage, update_pipeline=False)\n return [s.stage for s in stages_info]\n", "path": "dvc/repo/commit.py"}]}
1,518
176
gh_patches_debug_35163
rasdani/github-patches
git_diff
StackStorm__st2-4174
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Incomplete help for: st2 action-alias match ``` $ st2 action-alias match --help usage: st2 action-alias match [-t TOKEN] [--api-key API_KEY] [-j] [-y] [-a ATTR [ATTR ...]] [-w WIDTH [WIDTH ...]] command st2 action-alias match: error: too few arguments ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `st2client/st2client/commands/action_alias.py` Content: ``` 1 # Licensed to the StackStorm, Inc ('StackStorm') under one or more 2 # contributor license agreements. See the NOTICE file distributed with 3 # this work for additional information regarding copyright ownership. 4 # The ASF licenses this file to You under the Apache License, Version 2.0 5 # (the "License"); you may not use this file except in compliance with 6 # the License. You may obtain a copy of the License at 7 # 8 # http://www.apache.org/licenses/LICENSE-2.0 9 # 10 # Unless required by applicable law or agreed to in writing, software 11 # distributed under the License is distributed on an "AS IS" BASIS, 12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 # See the License for the specific language governing permissions and 14 # limitations under the License. 15 16 from __future__ import absolute_import 17 18 from st2client.models import core 19 from st2client.models.action_alias import ActionAlias 20 from st2client.models.action_alias import ActionAliasMatch 21 from st2client.commands import resource 22 from st2client.formatters import table 23 24 25 __all__ = [ 26 'ActionAliasBranch', 27 'ActionAliasMatchCommand', 28 'ActionAliasExecuteCommand' 29 ] 30 31 32 class ActionAliasBranch(resource.ResourceBranch): 33 def __init__(self, description, app, subparsers, parent_parser=None): 34 super(ActionAliasBranch, self).__init__( 35 ActionAlias, description, app, subparsers, 36 parent_parser=parent_parser, read_only=False, 37 commands={ 38 'list': ActionAliasListCommand, 39 'get': ActionAliasGetCommand 40 }) 41 42 self.commands['match'] = ActionAliasMatchCommand( 43 self.resource, self.app, self.subparsers, 44 add_help=False) 45 self.commands['execute'] = ActionAliasExecuteCommand( 46 self.resource, self.app, self.subparsers, 47 add_help=False) 48 49 50 class ActionAliasListCommand(resource.ContentPackResourceListCommand): 51 display_attributes = ['ref', 'pack', 'description', 'enabled'] 52 53 54 class ActionAliasGetCommand(resource.ContentPackResourceGetCommand): 55 display_attributes = ['all'] 56 attribute_display_order = ['id', 'ref', 'pack', 'name', 'description', 57 'enabled', 'action_ref', 'formats'] 58 59 60 class ActionAliasMatchCommand(resource.ResourceCommand): 61 display_attributes = ['name', 'description'] 62 63 def __init__(self, resource, *args, **kwargs): 64 super(ActionAliasMatchCommand, self).__init__( 65 resource, 'match', 66 'Get the list of %s that match the command text.' % 67 resource.get_plural_display_name().lower(), 68 *args, **kwargs) 69 70 self.parser.add_argument('match_text', 71 metavar='command', 72 help=help) 73 self.parser.add_argument('-h', '--help', 74 action='store_true', dest='help', 75 help='Print usage for the given action.') 76 self.parser.add_argument('-a', '--attr', nargs='+', 77 default=self.display_attributes, 78 help=('List of attributes to include in the ' 79 'output. 
"all" will return all ' 80 'attributes.')) 81 self.parser.add_argument('-w', '--width', nargs='+', type=int, 82 default=None, 83 help=('Set the width of columns in output.')) 84 85 @resource.add_auth_token_to_kwargs_from_cli 86 def run(self, args, **kwargs): 87 alias_match = ActionAliasMatch() 88 alias_match.command = args.match_text 89 90 match, _ = self.manager.match(alias_match, **kwargs) 91 return [match] 92 93 def run_and_print(self, args, **kwargs): 94 instances = self.run(args, **kwargs) 95 self.print_output(instances, table.MultiColumnTable, 96 attributes=args.attr, widths=args.width, 97 json=args.json, yaml=args.yaml) 98 99 100 class ActionAliasExecuteCommand(resource.ResourceCommand): 101 display_attributes = ['name'] 102 103 def __init__(self, resource, *args, **kwargs): 104 super(ActionAliasExecuteCommand, self).__init__( 105 resource, 'execute', 106 ('Execute the command text by finding a matching %s.' % 107 resource.get_display_name().lower()), *args, **kwargs) 108 109 self.parser.add_argument('command_text', 110 metavar='command', 111 help=help) 112 self.parser.add_argument('-h', '--help', 113 action='store_true', dest='help', 114 help='Print usage for the given action.') 115 self.parser.add_argument('-u', '--user', type=str, default=None, 116 help='User under which to run the action (admins only).') 117 118 @resource.add_auth_token_to_kwargs_from_cli 119 def run(self, args, **kwargs): 120 payload = core.Resource() 121 payload.command = args.command_text 122 payload.user = args.user 123 payload.source_channel = 'cli' 124 125 alias_execution_mgr = self.app.client.managers['ActionAliasExecution'] 126 execution = alias_execution_mgr.match_and_execute(payload) 127 return execution 128 129 def run_and_print(self, args, **kwargs): 130 execution = self.run(args, **kwargs) 131 print("Matching Action-alias: '%s'" % execution.actionalias['ref']) 132 print("To get the results, execute:\n st2 execution get %s" % 133 (execution.execution['id'])) 134 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/st2client/st2client/commands/action_alias.py b/st2client/st2client/commands/action_alias.py --- a/st2client/st2client/commands/action_alias.py +++ b/st2client/st2client/commands/action_alias.py @@ -41,10 +41,10 @@ self.commands['match'] = ActionAliasMatchCommand( self.resource, self.app, self.subparsers, - add_help=False) + add_help=True) self.commands['execute'] = ActionAliasExecuteCommand( self.resource, self.app, self.subparsers, - add_help=False) + add_help=True) class ActionAliasListCommand(resource.ContentPackResourceListCommand): @@ -69,10 +69,8 @@ self.parser.add_argument('match_text', metavar='command', - help=help) - self.parser.add_argument('-h', '--help', - action='store_true', dest='help', - help='Print usage for the given action.') + help=('Get the list of %s that match the command text.' % + resource.get_display_name().lower())) self.parser.add_argument('-a', '--attr', nargs='+', default=self.display_attributes, help=('List of attributes to include in the ' @@ -108,10 +106,8 @@ self.parser.add_argument('command_text', metavar='command', - help=help) - self.parser.add_argument('-h', '--help', - action='store_true', dest='help', - help='Print usage for the given action.') + help=('Execute the command text by finding a matching %s.' % + resource.get_display_name().lower())) self.parser.add_argument('-u', '--user', type=str, default=None, help='User under which to run the action (admins only).')
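Besides re-enabling argparse's built-in help with `add_help=True`, the patch drops two `help=help` arguments, which passed Python's builtin `help` object where argparse expects a string. A small sketch of the pitfall and the corrected form (hypothetical `prog` name, illustration only):

```python
import argparse

parser = argparse.ArgumentParser(prog="st2 action-alias match")  # add_help=True by default
# Buggy form: `help` here is the Python builtin, not a description string,
# so argparse cannot format usage text from it:
# parser.add_argument("match_text", metavar="command", help=help)
# Fixed form, matching the patch:
parser.add_argument(
    "match_text", metavar="command",
    help="Get the list of action aliases that match the command text.")
print(parser.format_help())
```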
{"golden_diff": "diff --git a/st2client/st2client/commands/action_alias.py b/st2client/st2client/commands/action_alias.py\n--- a/st2client/st2client/commands/action_alias.py\n+++ b/st2client/st2client/commands/action_alias.py\n@@ -41,10 +41,10 @@\n \n self.commands['match'] = ActionAliasMatchCommand(\n self.resource, self.app, self.subparsers,\n- add_help=False)\n+ add_help=True)\n self.commands['execute'] = ActionAliasExecuteCommand(\n self.resource, self.app, self.subparsers,\n- add_help=False)\n+ add_help=True)\n \n \n class ActionAliasListCommand(resource.ContentPackResourceListCommand):\n@@ -69,10 +69,8 @@\n \n self.parser.add_argument('match_text',\n metavar='command',\n- help=help)\n- self.parser.add_argument('-h', '--help',\n- action='store_true', dest='help',\n- help='Print usage for the given action.')\n+ help=('Get the list of %s that match the command text.' %\n+ resource.get_display_name().lower()))\n self.parser.add_argument('-a', '--attr', nargs='+',\n default=self.display_attributes,\n help=('List of attributes to include in the '\n@@ -108,10 +106,8 @@\n \n self.parser.add_argument('command_text',\n metavar='command',\n- help=help)\n- self.parser.add_argument('-h', '--help',\n- action='store_true', dest='help',\n- help='Print usage for the given action.')\n+ help=('Execute the command text by finding a matching %s.' %\n+ resource.get_display_name().lower()))\n self.parser.add_argument('-u', '--user', type=str, default=None,\n help='User under which to run the action (admins only).')\n", "issue": "Incomplete help for: st2 action-alias match\n```\r\n$ st2 action-alias match --help\r\nusage: st2 action-alias match [-t TOKEN] [--api-key API_KEY] [-j] [-y]\r\n [-a ATTR [ATTR ...]] [-w WIDTH [WIDTH ...]]\r\n command\r\nst2 action-alias match: error: too few arguments\r\n```\n", "before_files": [{"content": "# Licensed to the StackStorm, Inc ('StackStorm') under one or more\n# contributor license agreements. See the NOTICE file distributed with\n# this work for additional information regarding copyright ownership.\n# The ASF licenses this file to You under the Apache License, Version 2.0\n# (the \"License\"); you may not use this file except in compliance with\n# the License. 
You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import absolute_import\n\nfrom st2client.models import core\nfrom st2client.models.action_alias import ActionAlias\nfrom st2client.models.action_alias import ActionAliasMatch\nfrom st2client.commands import resource\nfrom st2client.formatters import table\n\n\n__all__ = [\n 'ActionAliasBranch',\n 'ActionAliasMatchCommand',\n 'ActionAliasExecuteCommand'\n]\n\n\nclass ActionAliasBranch(resource.ResourceBranch):\n def __init__(self, description, app, subparsers, parent_parser=None):\n super(ActionAliasBranch, self).__init__(\n ActionAlias, description, app, subparsers,\n parent_parser=parent_parser, read_only=False,\n commands={\n 'list': ActionAliasListCommand,\n 'get': ActionAliasGetCommand\n })\n\n self.commands['match'] = ActionAliasMatchCommand(\n self.resource, self.app, self.subparsers,\n add_help=False)\n self.commands['execute'] = ActionAliasExecuteCommand(\n self.resource, self.app, self.subparsers,\n add_help=False)\n\n\nclass ActionAliasListCommand(resource.ContentPackResourceListCommand):\n display_attributes = ['ref', 'pack', 'description', 'enabled']\n\n\nclass ActionAliasGetCommand(resource.ContentPackResourceGetCommand):\n display_attributes = ['all']\n attribute_display_order = ['id', 'ref', 'pack', 'name', 'description',\n 'enabled', 'action_ref', 'formats']\n\n\nclass ActionAliasMatchCommand(resource.ResourceCommand):\n display_attributes = ['name', 'description']\n\n def __init__(self, resource, *args, **kwargs):\n super(ActionAliasMatchCommand, self).__init__(\n resource, 'match',\n 'Get the list of %s that match the command text.' %\n resource.get_plural_display_name().lower(),\n *args, **kwargs)\n\n self.parser.add_argument('match_text',\n metavar='command',\n help=help)\n self.parser.add_argument('-h', '--help',\n action='store_true', dest='help',\n help='Print usage for the given action.')\n self.parser.add_argument('-a', '--attr', nargs='+',\n default=self.display_attributes,\n help=('List of attributes to include in the '\n 'output. \"all\" will return all '\n 'attributes.'))\n self.parser.add_argument('-w', '--width', nargs='+', type=int,\n default=None,\n help=('Set the width of columns in output.'))\n\n @resource.add_auth_token_to_kwargs_from_cli\n def run(self, args, **kwargs):\n alias_match = ActionAliasMatch()\n alias_match.command = args.match_text\n\n match, _ = self.manager.match(alias_match, **kwargs)\n return [match]\n\n def run_and_print(self, args, **kwargs):\n instances = self.run(args, **kwargs)\n self.print_output(instances, table.MultiColumnTable,\n attributes=args.attr, widths=args.width,\n json=args.json, yaml=args.yaml)\n\n\nclass ActionAliasExecuteCommand(resource.ResourceCommand):\n display_attributes = ['name']\n\n def __init__(self, resource, *args, **kwargs):\n super(ActionAliasExecuteCommand, self).__init__(\n resource, 'execute',\n ('Execute the command text by finding a matching %s.' 
%\n resource.get_display_name().lower()), *args, **kwargs)\n\n self.parser.add_argument('command_text',\n metavar='command',\n help=help)\n self.parser.add_argument('-h', '--help',\n action='store_true', dest='help',\n help='Print usage for the given action.')\n self.parser.add_argument('-u', '--user', type=str, default=None,\n help='User under which to run the action (admins only).')\n\n @resource.add_auth_token_to_kwargs_from_cli\n def run(self, args, **kwargs):\n payload = core.Resource()\n payload.command = args.command_text\n payload.user = args.user\n payload.source_channel = 'cli'\n\n alias_execution_mgr = self.app.client.managers['ActionAliasExecution']\n execution = alias_execution_mgr.match_and_execute(payload)\n return execution\n\n def run_and_print(self, args, **kwargs):\n execution = self.run(args, **kwargs)\n print(\"Matching Action-alias: '%s'\" % execution.actionalias['ref'])\n print(\"To get the results, execute:\\n st2 execution get %s\" %\n (execution.execution['id']))\n", "path": "st2client/st2client/commands/action_alias.py"}], "after_files": [{"content": "# Licensed to the StackStorm, Inc ('StackStorm') under one or more\n# contributor license agreements. See the NOTICE file distributed with\n# this work for additional information regarding copyright ownership.\n# The ASF licenses this file to You under the Apache License, Version 2.0\n# (the \"License\"); you may not use this file except in compliance with\n# the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import absolute_import\n\nfrom st2client.models import core\nfrom st2client.models.action_alias import ActionAlias\nfrom st2client.models.action_alias import ActionAliasMatch\nfrom st2client.commands import resource\nfrom st2client.formatters import table\n\n\n__all__ = [\n 'ActionAliasBranch',\n 'ActionAliasMatchCommand',\n 'ActionAliasExecuteCommand'\n]\n\n\nclass ActionAliasBranch(resource.ResourceBranch):\n def __init__(self, description, app, subparsers, parent_parser=None):\n super(ActionAliasBranch, self).__init__(\n ActionAlias, description, app, subparsers,\n parent_parser=parent_parser, read_only=False,\n commands={\n 'list': ActionAliasListCommand,\n 'get': ActionAliasGetCommand\n })\n\n self.commands['match'] = ActionAliasMatchCommand(\n self.resource, self.app, self.subparsers,\n add_help=True)\n self.commands['execute'] = ActionAliasExecuteCommand(\n self.resource, self.app, self.subparsers,\n add_help=True)\n\n\nclass ActionAliasListCommand(resource.ContentPackResourceListCommand):\n display_attributes = ['ref', 'pack', 'description', 'enabled']\n\n\nclass ActionAliasGetCommand(resource.ContentPackResourceGetCommand):\n display_attributes = ['all']\n attribute_display_order = ['id', 'ref', 'pack', 'name', 'description',\n 'enabled', 'action_ref', 'formats']\n\n\nclass ActionAliasMatchCommand(resource.ResourceCommand):\n display_attributes = ['name', 'description']\n\n def __init__(self, resource, *args, **kwargs):\n super(ActionAliasMatchCommand, self).__init__(\n resource, 'match',\n 'Get the list of %s that match the command text.' 
%\n resource.get_plural_display_name().lower(),\n *args, **kwargs)\n\n self.parser.add_argument('match_text',\n metavar='command',\n help=('Get the list of %s that match the command text.' %\n resource.get_display_name().lower()))\n self.parser.add_argument('-a', '--attr', nargs='+',\n default=self.display_attributes,\n help=('List of attributes to include in the '\n 'output. \"all\" will return all '\n 'attributes.'))\n self.parser.add_argument('-w', '--width', nargs='+', type=int,\n default=None,\n help=('Set the width of columns in output.'))\n\n @resource.add_auth_token_to_kwargs_from_cli\n def run(self, args, **kwargs):\n alias_match = ActionAliasMatch()\n alias_match.command = args.match_text\n\n match, _ = self.manager.match(alias_match, **kwargs)\n return [match]\n\n def run_and_print(self, args, **kwargs):\n instances = self.run(args, **kwargs)\n self.print_output(instances, table.MultiColumnTable,\n attributes=args.attr, widths=args.width,\n json=args.json, yaml=args.yaml)\n\n\nclass ActionAliasExecuteCommand(resource.ResourceCommand):\n display_attributes = ['name']\n\n def __init__(self, resource, *args, **kwargs):\n super(ActionAliasExecuteCommand, self).__init__(\n resource, 'execute',\n ('Execute the command text by finding a matching %s.' %\n resource.get_display_name().lower()), *args, **kwargs)\n\n self.parser.add_argument('command_text',\n metavar='command',\n help=('Execute the command text by finding a matching %s.' %\n resource.get_display_name().lower()))\n self.parser.add_argument('-u', '--user', type=str, default=None,\n help='User under which to run the action (admins only).')\n\n @resource.add_auth_token_to_kwargs_from_cli\n def run(self, args, **kwargs):\n payload = core.Resource()\n payload.command = args.command_text\n payload.user = args.user\n payload.source_channel = 'cli'\n\n alias_execution_mgr = self.app.client.managers['ActionAliasExecution']\n execution = alias_execution_mgr.match_and_execute(payload)\n return execution\n\n def run_and_print(self, args, **kwargs):\n execution = self.run(args, **kwargs)\n print(\"Matching Action-alias: '%s'\" % execution.actionalias['ref'])\n print(\"To get the results, execute:\\n st2 execution get %s\" %\n (execution.execution['id']))\n", "path": "st2client/st2client/commands/action_alias.py"}]}
1,754
406
gh_patches_debug_20315
rasdani/github-patches
git_diff
Qiskit__qiskit-2302
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- multi-language documentation <!-- ⚠️ If you do not respect this template, your issue will be closed --> <!-- ⚠️ Make sure to browse the opened and closed issues to confirm this idea does not exist. --> ### What is the expected behavior? I would like to remove all the multi-language documentation from this repository and add it to github.com/Qiskit/qiskit. I am not sure of the best way we do it and I let all discuss in that repository how we should do it. But we are not going to have documentation in the individual elements (only a readme about the element, contributing guidelines and other community files) @alfrisch could you take the lead on the German @rraymondhp could you take the lead on the Japanese @hanheepaik could you take the lead on the Korean @liupibm could you take the lead on the Chinese Thanks --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `docs/de/conf.py` Content: ``` 1 #!/usr/bin/env python3 2 # -*- coding: utf-8 -*- 3 """ 4 Language specific configuration file, inheriting from the main /doc 5 conf.py file and adjusting the variables that depend on the language. 6 """ 7 8 import os 9 import sys 10 11 sys.path.insert(0, os.path.abspath('..')) 12 from conf import * 13 14 language = 'de' 15 exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst'] 16 17 templates_path = ['../_templates'] 18 html_static_path = ['../theme/static/'] 19 html_logo = '../theme/static/qiskit-logo-white-no-margin.gif' 20 html_favicon = '../theme/static/favicon.ico' 21 ``` Path: `docs/ja/conf.py` Content: ``` 1 #!/usr/bin/env python3 2 # -*- coding: utf-8 -*- 3 """ 4 Language specific configuration file, inheriting from the main /doc 5 conf.py file and adjusting the variables that depend on the language. 6 """ 7 8 import os 9 import sys 10 11 sys.path.insert(0, os.path.abspath('..')) 12 from conf import * 13 14 language = 'ja' 15 exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst'] 16 17 templates_path = ['../_templates'] 18 html_static_path = ['../theme/static/'] 19 html_logo = '../theme/static/qiskit-logo-white-no-margin.gif' 20 html_favicon = '../theme/static/favicon.ico' 21 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/docs/de/conf.py b/docs/de/conf.py deleted file mode 100644 --- a/docs/de/conf.py +++ /dev/null @@ -1,20 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -""" -Language specific configuration file, inheriting from the main /doc -conf.py file and adjusting the variables that depend on the language. -""" - -import os -import sys - -sys.path.insert(0, os.path.abspath('..')) -from conf import * - -language = 'de' -exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst'] - -templates_path = ['../_templates'] -html_static_path = ['../theme/static/'] -html_logo = '../theme/static/qiskit-logo-white-no-margin.gif' -html_favicon = '../theme/static/favicon.ico' diff --git a/docs/ja/conf.py b/docs/ja/conf.py deleted file mode 100644 --- a/docs/ja/conf.py +++ /dev/null @@ -1,20 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -""" -Language specific configuration file, inheriting from the main /doc -conf.py file and adjusting the variables that depend on the language. -""" - -import os -import sys - -sys.path.insert(0, os.path.abspath('..')) -from conf import * - -language = 'ja' -exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst'] - -templates_path = ['../_templates'] -html_static_path = ['../theme/static/'] -html_logo = '../theme/static/qiskit-logo-white-no-margin.gif' -html_favicon = '../theme/static/favicon.ico'
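For context, the deleted files follow Sphinx's inherit-and-override pattern: import everything from the parent `conf.py`, then override only the language-dependent settings. A minimal sketch of that mechanism with a hypothetical language code (not one of the Qiskit files):

```python
# docs/<lang>/conf.py -- sketch of the pattern used by the removed files.
import os
import sys

sys.path.insert(0, os.path.abspath('..'))  # make the parent conf.py importable
from conf import *  # noqa: F401,F403 -- inherit every parent setting

language = 'fr'                      # the per-language override
templates_path = ['../_templates']   # shared assets live one level up
```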
{"golden_diff": "diff --git a/docs/de/conf.py b/docs/de/conf.py\ndeleted file mode 100644\n--- a/docs/de/conf.py\n+++ /dev/null\n@@ -1,20 +0,0 @@\n-#!/usr/bin/env python3\n-# -*- coding: utf-8 -*-\n-\"\"\"\n-Language specific configuration file, inheriting from the main /doc\n-conf.py file and adjusting the variables that depend on the language.\n-\"\"\"\n-\n-import os\n-import sys\n-\n-sys.path.insert(0, os.path.abspath('..'))\n-from conf import *\n-\n-language = 'de'\n-exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst']\n-\n-templates_path = ['../_templates']\n-html_static_path = ['../theme/static/']\n-html_logo = '../theme/static/qiskit-logo-white-no-margin.gif'\n-html_favicon = '../theme/static/favicon.ico'\ndiff --git a/docs/ja/conf.py b/docs/ja/conf.py\ndeleted file mode 100644\n--- a/docs/ja/conf.py\n+++ /dev/null\n@@ -1,20 +0,0 @@\n-#!/usr/bin/env python3\n-# -*- coding: utf-8 -*-\n-\"\"\"\n-Language specific configuration file, inheriting from the main /doc\n-conf.py file and adjusting the variables that depend on the language.\n-\"\"\"\n-\n-import os\n-import sys\n-\n-sys.path.insert(0, os.path.abspath('..'))\n-from conf import *\n-\n-language = 'ja'\n-exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst']\n-\n-templates_path = ['../_templates']\n-html_static_path = ['../theme/static/']\n-html_logo = '../theme/static/qiskit-logo-white-no-margin.gif'\n-html_favicon = '../theme/static/favicon.ico'\n", "issue": "multi-language documentation \n<!-- \u26a0\ufe0f If you do not respect this template, your issue will be closed -->\r\n<!-- \u26a0\ufe0f Make sure to browse the opened and closed issues to confirm this idea does not exist. -->\r\n\r\n### What is the expected behavior?\r\nI would like to remove all the multi-language documentation from this repository and add it to github.com/Qiskit/qiskit. I am not sure of the best way we do it and I let all discuss in that repository how we should do it. 
But we are not going to have documentation in the individual elements (only a readme about the element, contributing guidelines and other community files)\r\n\r\n@alfrisch could you take the lead on the German\r\n@rraymondhp could you take the lead on the Japanese\r\n@hanheepaik could you take the lead on the Korean\r\n@liupibm could you take the lead on the Chinese\r\n\r\nThanks \r\n\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\"\"\"\nLanguage specific configuration file, inheriting from the main /doc\nconf.py file and adjusting the variables that depend on the language.\n\"\"\"\n\nimport os\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\nfrom conf import *\n\nlanguage = 'de'\nexclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst']\n\ntemplates_path = ['../_templates']\nhtml_static_path = ['../theme/static/']\nhtml_logo = '../theme/static/qiskit-logo-white-no-margin.gif'\nhtml_favicon = '../theme/static/favicon.ico'\n", "path": "docs/de/conf.py"}, {"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\"\"\"\nLanguage specific configuration file, inheriting from the main /doc\nconf.py file and adjusting the variables that depend on the language.\n\"\"\"\n\nimport os\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\nfrom conf import *\n\nlanguage = 'ja'\nexclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '_autodoc/modules.rst']\n\ntemplates_path = ['../_templates']\nhtml_static_path = ['../theme/static/']\nhtml_logo = '../theme/static/qiskit-logo-white-no-margin.gif'\nhtml_favicon = '../theme/static/favicon.ico'\n", "path": "docs/ja/conf.py"}], "after_files": [{"content": null, "path": "docs/de/conf.py"}, {"content": null, "path": "docs/ja/conf.py"}]}
814
395
gh_patches_debug_43406
rasdani/github-patches
git_diff
sublimelsp__LSP-707
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Reduce impact of LSP color listener by default. @predragnikolic: Noticed while closing views after find-replace in pyls: ``` Traceback (most recent call last): File "/Applications/Sublime Text.app/Contents/MacOS/sublime_plugin.py", line 506, in run_async_view_listener_callback vel.__class__.__dict__[name](vel) File "/Users/tomv/Library/Application Support/Sublime Text 3/Packages/LSP/plugin/color.py", line 47, in on_activated_async self.schedule_request() File "/Users/tomv/Library/Application Support/Sublime Text 3/Packages/LSP/plugin/color.py", line 53, in schedule_request current_point = self.view.sel()[0].begin() File "/Applications/Sublime Text.app/Contents/MacOS/sublime.py", line 649, in __getitem__ raise IndexError() IndexError ``` * We should probably fix the above issue (maybe it's rare but I ran into this edge case pretty quickly) * We could avoid creating ViewEventListeners for a few kinds of irrelevant views: Read-only, transient, unsupported syntaxes ( Maybe `DocumentSyncListener` `is_applicable` is suitable?) * We could avoid running `LspColorListener`'s `schedule_request` until a session with colorProvider for a given view has been found. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `plugin/color.py` Content: ``` 1 import sublime_plugin 2 import sublime 3 4 try: 5 from typing import Any, List, Dict, Callable, Optional 6 assert Any and List and Dict and Callable and Optional 7 except ImportError: 8 pass 9 10 from .core.protocol import Request 11 from .core.url import filename_to_uri 12 from .core.registry import session_for_view 13 from .core.settings import settings 14 from .core.views import range_to_region 15 from .core.protocol import Range 16 17 18 def send_color_request(view, on_response_recieved: 'Callable'): 19 session = session_for_view(view) 20 if not session or not session.has_capability('colorProvider'): 21 # the server doesn't support colors, just return 22 return 23 24 params = { 25 "textDocument": { 26 "uri": filename_to_uri(view.file_name()) 27 } 28 } 29 session.client.send_request( 30 Request.documentColor(params), 31 lambda response: on_response_recieved(response)) 32 33 34 class LspColorListener(sublime_plugin.ViewEventListener): 35 def __init__(self, view: sublime.View) -> None: 36 super().__init__(view) 37 self.color_phantom_set = None # type: Optional[sublime.PhantomSet] 38 self._stored_point = -1 39 40 @classmethod 41 def is_applicable(cls, _settings): 42 return 'colorProvider' not in settings.disabled_capabilities 43 44 def on_activated_async(self): 45 self.schedule_request() 46 47 def on_modified_async(self): 48 self.schedule_request() 49 50 def schedule_request(self): 51 current_point = self.view.sel()[0].begin() 52 if self._stored_point != current_point: 53 self._stored_point = current_point 54 sublime.set_timeout_async(lambda: self.fire_request(current_point), 800) 55 56 def fire_request(self, current_point: int) -> None: 57 if current_point == self._stored_point: 58 send_color_request(self.view, self.handle_response) 59 60 def handle_response(self, response) -> None: 61 phantoms = [] 62 for val in response: 63 color = val['color'] 64 red = color['red'] * 255 65 green = color['green'] * 255 66 blue = color['blue'] * 255 67 alpha = color['alpha'] 68 69 content = """ 70 <div style='padding: 0.4em; 71 margin-top: 0.1em; 72 border: 1px solid 
color(var(--foreground) alpha(0.25)); 73 background-color: rgba({}, {}, {}, {})'> 74 </div>""".format(red, green, blue, alpha) 75 76 range = Range.from_lsp(val['range']) 77 region = range_to_region(range, self.view) 78 79 phantoms.append(sublime.Phantom(region, content, sublime.LAYOUT_INLINE)) 80 81 if phantoms: 82 if not self.color_phantom_set: 83 self.color_phantom_set = sublime.PhantomSet(self.view, "lsp_color") 84 self.color_phantom_set.update(phantoms) 85 else: 86 self.color_phantom_set = None 87 88 89 def remove_color_boxes(view): 90 view.erase_phantoms('lsp_color') 91 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/plugin/color.py b/plugin/color.py --- a/plugin/color.py +++ b/plugin/color.py @@ -9,26 +9,12 @@ from .core.protocol import Request from .core.url import filename_to_uri -from .core.registry import session_for_view -from .core.settings import settings +from .core.registry import session_for_view, config_for_scope +from .core.settings import settings, client_configs from .core.views import range_to_region from .core.protocol import Range - - -def send_color_request(view, on_response_recieved: 'Callable'): - session = session_for_view(view) - if not session or not session.has_capability('colorProvider'): - # the server doesn't support colors, just return - return - - params = { - "textDocument": { - "uri": filename_to_uri(view.file_name()) - } - } - session.client.send_request( - Request.documentColor(params), - lambda response: on_response_recieved(response)) +from .core.configurations import is_supported_syntax +from .core.documents import is_transient_view class LspColorListener(sublime_plugin.ViewEventListener): @@ -36,26 +22,72 @@ super().__init__(view) self.color_phantom_set = None # type: Optional[sublime.PhantomSet] self._stored_point = -1 + self.initialized = False + self.enabled = False @classmethod def is_applicable(cls, _settings): - return 'colorProvider' not in settings.disabled_capabilities + syntax = _settings.get('syntax') + is_supported = syntax and is_supported_syntax(syntax, client_configs.all) + disabled_by_user = 'colorProvider' in settings.disabled_capabilities + return is_supported and not disabled_by_user def on_activated_async(self): - self.schedule_request() + if not self.initialized: + self.initialize() + + def initialize(self, is_retry=False): + config = config_for_scope(self.view) + if not config: + self.initialized = True # no server enabled, re-open file to activate feature. + + session = session_for_view(self.view) + if session: + self.initialized = True + self.enabled = session.has_capability('colorProvider') + if self.enabled: + self.send_color_request() + elif not is_retry: + # session may be starting, try again once in a second. + sublime.set_timeout_async(lambda: self.initialize(is_retry=True), 1000) + else: + self.initialized = True # we retried but still no session available. def on_modified_async(self): - self.schedule_request() + if self.enabled: + self.schedule_request() def schedule_request(self): - current_point = self.view.sel()[0].begin() + sel = self.view.sel() + if len(sel) < 1: + return + + current_point = sel[0].begin() if self._stored_point != current_point: self._stored_point = current_point sublime.set_timeout_async(lambda: self.fire_request(current_point), 800) def fire_request(self, current_point: int) -> None: if current_point == self._stored_point: - send_color_request(self.view, self.handle_response) + self.send_color_request() + + def send_color_request(self): + if is_transient_view(self.view): + return + + session = session_for_view(self.view) + if not session: + return + + params = { + "textDocument": { + "uri": filename_to_uri(self.view.file_name()) + } + } + session.client.send_request( + Request.documentColor(params), + self.handle_response + ) def handle_response(self, response) -> None: phantoms = [] @@ -68,7 +100,7 @@ content = """ <div style='padding: 0.4em; - margin-top: 0.1em; + margin-top: 0.2em; border: 1px solid color(var(--foreground) alpha(0.25)); background-color: rgba({}, {}, {}, {})'> </div>""".format(red, green, blue, alpha)
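The traceback in the issue comes from indexing `view.sel()[0]` while a closing view has an empty selection; the guard the patch adds is easy to demonstrate in plain Python, with a list standing in for Sublime's selection object (a hypothetical stand-in, since the real type is a Sublime API object):

```python
# Plain-list stand-in for view.sel(); the unguarded code called sel[0].begin()
# and raised IndexError on an empty selection.
def first_selection_point(sel):
    if len(sel) < 1:   # the guard added in schedule_request()
        return None
    return sel[0]

print(first_selection_point([]))        # None instead of IndexError
print(first_selection_point([42, 99]))  # 42
```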
{"golden_diff": "diff --git a/plugin/color.py b/plugin/color.py\n--- a/plugin/color.py\n+++ b/plugin/color.py\n@@ -9,26 +9,12 @@\n \n from .core.protocol import Request\n from .core.url import filename_to_uri\n-from .core.registry import session_for_view\n-from .core.settings import settings\n+from .core.registry import session_for_view, config_for_scope\n+from .core.settings import settings, client_configs\n from .core.views import range_to_region\n from .core.protocol import Range\n-\n-\n-def send_color_request(view, on_response_recieved: 'Callable'):\n- session = session_for_view(view)\n- if not session or not session.has_capability('colorProvider'):\n- # the server doesn't support colors, just return\n- return\n-\n- params = {\n- \"textDocument\": {\n- \"uri\": filename_to_uri(view.file_name())\n- }\n- }\n- session.client.send_request(\n- Request.documentColor(params),\n- lambda response: on_response_recieved(response))\n+from .core.configurations import is_supported_syntax\n+from .core.documents import is_transient_view\n \n \n class LspColorListener(sublime_plugin.ViewEventListener):\n@@ -36,26 +22,72 @@\n super().__init__(view)\n self.color_phantom_set = None # type: Optional[sublime.PhantomSet]\n self._stored_point = -1\n+ self.initialized = False\n+ self.enabled = False\n \n @classmethod\n def is_applicable(cls, _settings):\n- return 'colorProvider' not in settings.disabled_capabilities\n+ syntax = _settings.get('syntax')\n+ is_supported = syntax and is_supported_syntax(syntax, client_configs.all)\n+ disabled_by_user = 'colorProvider' in settings.disabled_capabilities\n+ return is_supported and not disabled_by_user\n \n def on_activated_async(self):\n- self.schedule_request()\n+ if not self.initialized:\n+ self.initialize()\n+\n+ def initialize(self, is_retry=False):\n+ config = config_for_scope(self.view)\n+ if not config:\n+ self.initialized = True # no server enabled, re-open file to activate feature.\n+\n+ session = session_for_view(self.view)\n+ if session:\n+ self.initialized = True\n+ self.enabled = session.has_capability('colorProvider')\n+ if self.enabled:\n+ self.send_color_request()\n+ elif not is_retry:\n+ # session may be starting, try again once in a second.\n+ sublime.set_timeout_async(lambda: self.initialize(is_retry=True), 1000)\n+ else:\n+ self.initialized = True # we retried but still no session available.\n \n def on_modified_async(self):\n- self.schedule_request()\n+ if self.enabled:\n+ self.schedule_request()\n \n def schedule_request(self):\n- current_point = self.view.sel()[0].begin()\n+ sel = self.view.sel()\n+ if len(sel) < 1:\n+ return\n+\n+ current_point = sel[0].begin()\n if self._stored_point != current_point:\n self._stored_point = current_point\n sublime.set_timeout_async(lambda: self.fire_request(current_point), 800)\n \n def fire_request(self, current_point: int) -> None:\n if current_point == self._stored_point:\n- send_color_request(self.view, self.handle_response)\n+ self.send_color_request()\n+\n+ def send_color_request(self):\n+ if is_transient_view(self.view):\n+ return\n+\n+ session = session_for_view(self.view)\n+ if not session:\n+ return\n+\n+ params = {\n+ \"textDocument\": {\n+ \"uri\": filename_to_uri(self.view.file_name())\n+ }\n+ }\n+ session.client.send_request(\n+ Request.documentColor(params),\n+ self.handle_response\n+ )\n \n def handle_response(self, response) -> None:\n phantoms = []\n@@ -68,7 +100,7 @@\n \n content = \"\"\"\n <div style='padding: 0.4em;\n- margin-top: 0.1em;\n+ margin-top: 0.2em;\n border: 1px solid 
color(var(--foreground) alpha(0.25));\n background-color: rgba({}, {}, {}, {})'>\n </div>\"\"\".format(red, green, blue, alpha)\n", "issue": "Reduce impact of LSP color listener by default.\n@predragnikolic: Noticed while closing views after find-replace in pyls:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Applications/Sublime Text.app/Contents/MacOS/sublime_plugin.py\", line 506, in run_async_view_listener_callback\r\n vel.__class__.__dict__[name](vel)\r\n File \"/Users/tomv/Library/Application Support/Sublime Text 3/Packages/LSP/plugin/color.py\", line 47, in on_activated_async\r\n self.schedule_request()\r\n File \"/Users/tomv/Library/Application Support/Sublime Text 3/Packages/LSP/plugin/color.py\", line 53, in schedule_request\r\n current_point = self.view.sel()[0].begin()\r\n File \"/Applications/Sublime Text.app/Contents/MacOS/sublime.py\", line 649, in __getitem__\r\n raise IndexError()\r\nIndexError\r\n```\r\n\r\n* We should probably fix the above issue (maybe it's rare but I ran into this edge case pretty quickly)\r\n* We could avoid creating ViewEventListeners for a few kinds of irrelevant views: Read-only, transient, unsupported syntaxes ( Maybe `DocumentSyncListener` `is_applicable` is suitable?)\r\n* We could avoid running `LspColorListener`'s `schedule_request` until a session with colorProvider for a given view has been found.\r\n\n", "before_files": [{"content": "import sublime_plugin\nimport sublime\n\ntry:\n from typing import Any, List, Dict, Callable, Optional\n assert Any and List and Dict and Callable and Optional\nexcept ImportError:\n pass\n\nfrom .core.protocol import Request\nfrom .core.url import filename_to_uri\nfrom .core.registry import session_for_view\nfrom .core.settings import settings\nfrom .core.views import range_to_region\nfrom .core.protocol import Range\n\n\ndef send_color_request(view, on_response_recieved: 'Callable'):\n session = session_for_view(view)\n if not session or not session.has_capability('colorProvider'):\n # the server doesn't support colors, just return\n return\n\n params = {\n \"textDocument\": {\n \"uri\": filename_to_uri(view.file_name())\n }\n }\n session.client.send_request(\n Request.documentColor(params),\n lambda response: on_response_recieved(response))\n\n\nclass LspColorListener(sublime_plugin.ViewEventListener):\n def __init__(self, view: sublime.View) -> None:\n super().__init__(view)\n self.color_phantom_set = None # type: Optional[sublime.PhantomSet]\n self._stored_point = -1\n\n @classmethod\n def is_applicable(cls, _settings):\n return 'colorProvider' not in settings.disabled_capabilities\n\n def on_activated_async(self):\n self.schedule_request()\n\n def on_modified_async(self):\n self.schedule_request()\n\n def schedule_request(self):\n current_point = self.view.sel()[0].begin()\n if self._stored_point != current_point:\n self._stored_point = current_point\n sublime.set_timeout_async(lambda: self.fire_request(current_point), 800)\n\n def fire_request(self, current_point: int) -> None:\n if current_point == self._stored_point:\n send_color_request(self.view, self.handle_response)\n\n def handle_response(self, response) -> None:\n phantoms = []\n for val in response:\n color = val['color']\n red = color['red'] * 255\n green = color['green'] * 255\n blue = color['blue'] * 255\n alpha = color['alpha']\n\n content = \"\"\"\n <div style='padding: 0.4em;\n margin-top: 0.1em;\n border: 1px solid color(var(--foreground) alpha(0.25));\n background-color: rgba({}, {}, {}, {})'>\n </div>\"\"\".format(red, 
green, blue, alpha)\n\n range = Range.from_lsp(val['range'])\n region = range_to_region(range, self.view)\n\n phantoms.append(sublime.Phantom(region, content, sublime.LAYOUT_INLINE))\n\n if phantoms:\n if not self.color_phantom_set:\n self.color_phantom_set = sublime.PhantomSet(self.view, \"lsp_color\")\n self.color_phantom_set.update(phantoms)\n else:\n self.color_phantom_set = None\n\n\ndef remove_color_boxes(view):\n view.erase_phantoms('lsp_color')\n", "path": "plugin/color.py"}], "after_files": [{"content": "import sublime_plugin\nimport sublime\n\ntry:\n from typing import Any, List, Dict, Callable, Optional\n assert Any and List and Dict and Callable and Optional\nexcept ImportError:\n pass\n\nfrom .core.protocol import Request\nfrom .core.url import filename_to_uri\nfrom .core.registry import session_for_view, config_for_scope\nfrom .core.settings import settings, client_configs\nfrom .core.views import range_to_region\nfrom .core.protocol import Range\nfrom .core.configurations import is_supported_syntax\nfrom .core.documents import is_transient_view\n\n\nclass LspColorListener(sublime_plugin.ViewEventListener):\n def __init__(self, view: sublime.View) -> None:\n super().__init__(view)\n self.color_phantom_set = None # type: Optional[sublime.PhantomSet]\n self._stored_point = -1\n self.initialized = False\n self.enabled = False\n\n @classmethod\n def is_applicable(cls, _settings):\n syntax = _settings.get('syntax')\n is_supported = syntax and is_supported_syntax(syntax, client_configs.all)\n disabled_by_user = 'colorProvider' in settings.disabled_capabilities\n return is_supported and not disabled_by_user\n\n def on_activated_async(self):\n if not self.initialized:\n self.initialize()\n\n def initialize(self, is_retry=False):\n config = config_for_scope(self.view)\n if not config:\n self.initialized = True # no server enabled, re-open file to activate feature.\n\n session = session_for_view(self.view)\n if session:\n self.initialized = True\n self.enabled = session.has_capability('colorProvider')\n if self.enabled:\n self.send_color_request()\n elif not is_retry:\n # session may be starting, try again once in a second.\n sublime.set_timeout_async(lambda: self.initialize(is_retry=True), 1000)\n else:\n self.initialized = True # we retried but still no session available.\n\n def on_modified_async(self):\n if self.enabled:\n self.schedule_request()\n\n def schedule_request(self):\n sel = self.view.sel()\n if len(sel) < 1:\n return\n\n current_point = sel[0].begin()\n if self._stored_point != current_point:\n self._stored_point = current_point\n sublime.set_timeout_async(lambda: self.fire_request(current_point), 800)\n\n def fire_request(self, current_point: int) -> None:\n if current_point == self._stored_point:\n self.send_color_request()\n\n def send_color_request(self):\n if is_transient_view(self.view):\n return\n\n session = session_for_view(self.view)\n if not session:\n return\n\n params = {\n \"textDocument\": {\n \"uri\": filename_to_uri(self.view.file_name())\n }\n }\n session.client.send_request(\n Request.documentColor(params),\n self.handle_response\n )\n\n def handle_response(self, response) -> None:\n phantoms = []\n for val in response:\n color = val['color']\n red = color['red'] * 255\n green = color['green'] * 255\n blue = color['blue'] * 255\n alpha = color['alpha']\n\n content = \"\"\"\n <div style='padding: 0.4em;\n margin-top: 0.2em;\n border: 1px solid color(var(--foreground) alpha(0.25));\n background-color: rgba({}, {}, {}, {})'>\n </div>\"\"\".format(red, 
green, blue, alpha)\n\n range = Range.from_lsp(val['range'])\n region = range_to_region(range, self.view)\n\n phantoms.append(sublime.Phantom(region, content, sublime.LAYOUT_INLINE))\n\n if phantoms:\n if not self.color_phantom_set:\n self.color_phantom_set = sublime.PhantomSet(self.view, \"lsp_color\")\n self.color_phantom_set.update(phantoms)\n else:\n self.color_phantom_set = None\n\n\ndef remove_color_boxes(view):\n view.erase_phantoms('lsp_color')\n", "path": "plugin/color.py"}]}
1,393
959
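A minimal standalone sketch of the selection guard this record's patch introduces; plain Python lists stand in for the Sublime `view.sel()` API, so none of this is the real plugin code:

```python
# Plain-list stand-in for sublime.Selection: illustrates why the patch checks
# the selection length before indexing it.
def first_point(selection):
    """Return the first caret offset, or None when the selection is empty."""
    if len(selection) < 1:       # unpatched code ran selection[0].begin() directly,
        return None              # raising the IndexError quoted in the issue
    return selection[0]

assert first_point([]) is None       # view closed mid find-replace: no caret
assert first_point([12, 40]) == 12   # normal case: schedule request at caret 12
```

The same defensive shape recurs in the patch's lazy `initialize()` path, which retries once after a second instead of assuming a session already exists.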
gh_patches_debug_21965
rasdani/github-patches
git_diff
apache__tvm-10188
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [Bug] RPC Server Can't Serve Through Proxy Because of Missing Command Line Option Now we have RPC server, proxy, and tracker, so if the host machine can't access the RPC server directly, then we can put RPC proxy between host machine and RPC server like "Host --> RPC Proxy --> RPC Server", we can do it from Python API, but we can't do it through command line. Because "tvm.exec.rpc_server" haven't exposed the parameter "is_proxy" of class "tvm.rpc.server.Server" through command line option. ### Expected behavior Can register the RPC server to RPC proxy from command line like something below. ```shell python -m tvm.exec.rpc_server --host 0.0.0.0 --port 9090 --through-proxy ``` ### Actual behavior Currently haven't any command line option to set the parameter "is_proxy". ### Environment No ### Steps to reproduce Any --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `python/tvm/exec/rpc_server.py` Content: ``` 1 # Licensed to the Apache Software Foundation (ASF) under one 2 # or more contributor license agreements. See the NOTICE file 3 # distributed with this work for additional information 4 # regarding copyright ownership. The ASF licenses this file 5 # to you under the Apache License, Version 2.0 (the 6 # "License"); you may not use this file except in compliance 7 # with the License. You may obtain a copy of the License at 8 # 9 # http://www.apache.org/licenses/LICENSE-2.0 10 # 11 # Unless required by applicable law or agreed to in writing, 12 # software distributed under the License is distributed on an 13 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY 14 # KIND, either express or implied. See the License for the 15 # specific language governing permissions and limitations 16 # under the License. 17 # pylint: disable=redefined-outer-name, invalid-name 18 """Start an RPC server""" 19 import argparse 20 import logging 21 from .. import rpc 22 23 24 def main(args): 25 """Main function 26 27 Parameters 28 ---------- 29 args : argparse.Namespace 30 parsed args from command-line invocation 31 """ 32 if args.tracker: 33 url, port = args.tracker.rsplit(":", 1) 34 port = int(port) 35 tracker_addr = (url, port) 36 if not args.key: 37 raise RuntimeError("Need key to present type of resource when tracker is available") 38 else: 39 tracker_addr = None 40 41 server = rpc.Server( 42 args.host, 43 args.port, 44 args.port_end, 45 key=args.key, 46 tracker_addr=tracker_addr, 47 load_library=args.load_library, 48 custom_addr=args.custom_addr, 49 silent=args.silent, 50 no_fork=not args.fork, 51 ) 52 server.proc.join() 53 54 55 if __name__ == "__main__": 56 parser = argparse.ArgumentParser() 57 parser.add_argument( 58 "--host", type=str, default="0.0.0.0", help="The host IP address the tracker binds to" 59 ) 60 parser.add_argument("--port", type=int, default=9090, help="The port of the RPC") 61 parser.add_argument("--port-end", type=int, default=9199, help="The end search port of the RPC") 62 parser.add_argument( 63 "--tracker", 64 type=str, 65 help=("The address of RPC tracker in host:port format. " "e.g. (10.77.1.234:9190)"), 66 ) 67 parser.add_argument( 68 "--key", type=str, default="", help="The key used to identify the device type in tracker." 
69 ) 70 parser.add_argument("--silent", action="store_true", help="Whether run in silent mode.") 71 parser.add_argument("--load-library", type=str, help="Additional library to load") 72 parser.add_argument( 73 "--no-fork", 74 dest="fork", 75 action="store_false", 76 help="Use spawn mode to avoid fork. This option \ 77 is able to avoid potential fork problems with Metal, OpenCL \ 78 and ROCM compilers.", 79 ) 80 parser.add_argument( 81 "--custom-addr", type=str, help="Custom IP Address to Report to RPC Tracker" 82 ) 83 84 parser.set_defaults(fork=True) 85 args = parser.parse_args() 86 logging.basicConfig(level=logging.INFO) 87 if not args.fork is False and not args.silent: 88 logging.info( 89 "If you are running ROCM/Metal, fork will cause " 90 "compiler internal error. Try to launch with arg ```--no-fork```" 91 ) 92 main(args) 93 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/python/tvm/exec/rpc_server.py b/python/tvm/exec/rpc_server.py --- a/python/tvm/exec/rpc_server.py +++ b/python/tvm/exec/rpc_server.py @@ -42,6 +42,7 @@ args.host, args.port, args.port_end, + is_proxy=args.through_proxy, key=args.key, tracker_addr=tracker_addr, load_library=args.load_library, @@ -58,6 +59,15 @@ "--host", type=str, default="0.0.0.0", help="The host IP address the tracker binds to" ) parser.add_argument("--port", type=int, default=9090, help="The port of the RPC") + parser.add_argument( + "--through-proxy", + dest="through_proxy", + action="store_true", + help=( + "Whether this server provide service through a proxy. If this is true, the host and" + "port actually is the address of the proxy." + ), + ) parser.add_argument("--port-end", type=int, default=9199, help="The end search port of the RPC") parser.add_argument( "--tracker",
{"golden_diff": "diff --git a/python/tvm/exec/rpc_server.py b/python/tvm/exec/rpc_server.py\n--- a/python/tvm/exec/rpc_server.py\n+++ b/python/tvm/exec/rpc_server.py\n@@ -42,6 +42,7 @@\n args.host,\n args.port,\n args.port_end,\n+ is_proxy=args.through_proxy,\n key=args.key,\n tracker_addr=tracker_addr,\n load_library=args.load_library,\n@@ -58,6 +59,15 @@\n \"--host\", type=str, default=\"0.0.0.0\", help=\"The host IP address the tracker binds to\"\n )\n parser.add_argument(\"--port\", type=int, default=9090, help=\"The port of the RPC\")\n+ parser.add_argument(\n+ \"--through-proxy\",\n+ dest=\"through_proxy\",\n+ action=\"store_true\",\n+ help=(\n+ \"Whether this server provide service through a proxy. If this is true, the host and\"\n+ \"port actually is the address of the proxy.\"\n+ ),\n+ )\n parser.add_argument(\"--port-end\", type=int, default=9199, help=\"The end search port of the RPC\")\n parser.add_argument(\n \"--tracker\",\n", "issue": "[Bug] RPC Server Can't Serve Through Proxy Because of Missing Command Line Option\nNow we have RPC server, proxy, and tracker, so if the host machine can't access the RPC server directly, then we can put RPC proxy between host machine and RPC server like \"Host --> RPC Proxy --> RPC Server\", we can do it from Python API, but we can't do it through command line. Because \"tvm.exec.rpc_server\" haven't exposed the parameter \"is_proxy\" of class \"tvm.rpc.server.Server\" through command line option.\r\n\r\n### Expected behavior\r\n\r\nCan register the RPC server to RPC proxy from command line like something below.\r\n```shell\r\npython -m tvm.exec.rpc_server --host 0.0.0.0 --port 9090 --through-proxy\r\n```\r\n\r\n### Actual behavior\r\n\r\nCurrently haven't any command line option to set the parameter \"is_proxy\".\r\n\r\n### Environment\r\n\r\nNo\r\n\r\n### Steps to reproduce\r\n\r\nAny\n", "before_files": [{"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n# pylint: disable=redefined-outer-name, invalid-name\n\"\"\"Start an RPC server\"\"\"\nimport argparse\nimport logging\nfrom .. 
import rpc\n\n\ndef main(args):\n \"\"\"Main function\n\n Parameters\n ----------\n args : argparse.Namespace\n parsed args from command-line invocation\n \"\"\"\n if args.tracker:\n url, port = args.tracker.rsplit(\":\", 1)\n port = int(port)\n tracker_addr = (url, port)\n if not args.key:\n raise RuntimeError(\"Need key to present type of resource when tracker is available\")\n else:\n tracker_addr = None\n\n server = rpc.Server(\n args.host,\n args.port,\n args.port_end,\n key=args.key,\n tracker_addr=tracker_addr,\n load_library=args.load_library,\n custom_addr=args.custom_addr,\n silent=args.silent,\n no_fork=not args.fork,\n )\n server.proc.join()\n\n\nif __name__ == \"__main__\":\n parser = argparse.ArgumentParser()\n parser.add_argument(\n \"--host\", type=str, default=\"0.0.0.0\", help=\"The host IP address the tracker binds to\"\n )\n parser.add_argument(\"--port\", type=int, default=9090, help=\"The port of the RPC\")\n parser.add_argument(\"--port-end\", type=int, default=9199, help=\"The end search port of the RPC\")\n parser.add_argument(\n \"--tracker\",\n type=str,\n help=(\"The address of RPC tracker in host:port format. \" \"e.g. (10.77.1.234:9190)\"),\n )\n parser.add_argument(\n \"--key\", type=str, default=\"\", help=\"The key used to identify the device type in tracker.\"\n )\n parser.add_argument(\"--silent\", action=\"store_true\", help=\"Whether run in silent mode.\")\n parser.add_argument(\"--load-library\", type=str, help=\"Additional library to load\")\n parser.add_argument(\n \"--no-fork\",\n dest=\"fork\",\n action=\"store_false\",\n help=\"Use spawn mode to avoid fork. This option \\\n is able to avoid potential fork problems with Metal, OpenCL \\\n and ROCM compilers.\",\n )\n parser.add_argument(\n \"--custom-addr\", type=str, help=\"Custom IP Address to Report to RPC Tracker\"\n )\n\n parser.set_defaults(fork=True)\n args = parser.parse_args()\n logging.basicConfig(level=logging.INFO)\n if not args.fork is False and not args.silent:\n logging.info(\n \"If you are running ROCM/Metal, fork will cause \"\n \"compiler internal error. Try to launch with arg ```--no-fork```\"\n )\n main(args)\n", "path": "python/tvm/exec/rpc_server.py"}], "after_files": [{"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n# pylint: disable=redefined-outer-name, invalid-name\n\"\"\"Start an RPC server\"\"\"\nimport argparse\nimport logging\nfrom .. 
import rpc\n\n\ndef main(args):\n \"\"\"Main function\n\n Parameters\n ----------\n args : argparse.Namespace\n parsed args from command-line invocation\n \"\"\"\n if args.tracker:\n url, port = args.tracker.rsplit(\":\", 1)\n port = int(port)\n tracker_addr = (url, port)\n if not args.key:\n raise RuntimeError(\"Need key to present type of resource when tracker is available\")\n else:\n tracker_addr = None\n\n server = rpc.Server(\n args.host,\n args.port,\n args.port_end,\n is_proxy=args.through_proxy,\n key=args.key,\n tracker_addr=tracker_addr,\n load_library=args.load_library,\n custom_addr=args.custom_addr,\n silent=args.silent,\n no_fork=not args.fork,\n )\n server.proc.join()\n\n\nif __name__ == \"__main__\":\n parser = argparse.ArgumentParser()\n parser.add_argument(\n \"--host\", type=str, default=\"0.0.0.0\", help=\"The host IP address the tracker binds to\"\n )\n parser.add_argument(\"--port\", type=int, default=9090, help=\"The port of the RPC\")\n parser.add_argument(\n \"--through-proxy\",\n dest=\"through_proxy\",\n action=\"store_true\",\n help=(\n \"Whether this server provide service through a proxy. If this is true, the host and\"\n \"port actually is the address of the proxy.\"\n ),\n )\n parser.add_argument(\"--port-end\", type=int, default=9199, help=\"The end search port of the RPC\")\n parser.add_argument(\n \"--tracker\",\n type=str,\n help=(\"The address of RPC tracker in host:port format. \" \"e.g. (10.77.1.234:9190)\"),\n )\n parser.add_argument(\n \"--key\", type=str, default=\"\", help=\"The key used to identify the device type in tracker.\"\n )\n parser.add_argument(\"--silent\", action=\"store_true\", help=\"Whether run in silent mode.\")\n parser.add_argument(\"--load-library\", type=str, help=\"Additional library to load\")\n parser.add_argument(\n \"--no-fork\",\n dest=\"fork\",\n action=\"store_false\",\n help=\"Use spawn mode to avoid fork. This option \\\n is able to avoid potential fork problems with Metal, OpenCL \\\n and ROCM compilers.\",\n )\n parser.add_argument(\n \"--custom-addr\", type=str, help=\"Custom IP Address to Report to RPC Tracker\"\n )\n\n parser.set_defaults(fork=True)\n args = parser.parse_args()\n logging.basicConfig(level=logging.INFO)\n if not args.fork is False and not args.silent:\n logging.info(\n \"If you are running ROCM/Metal, fork will cause \"\n \"compiler internal error. Try to launch with arg ```--no-fork```\"\n )\n main(args)\n", "path": "python/tvm/exec/rpc_server.py"}]}
1,397
271
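A self-contained sketch of the command-line wiring the golden diff adds; only `argparse` is exercised here, and the `rpc.Server(...)` call is left as a comment since it needs a TVM install:

```python
import argparse

# Mirrors the new option from the diff: a boolean flag marking --host/--port
# as the address of an RPC proxy rather than of the server itself.
parser = argparse.ArgumentParser()
parser.add_argument("--host", type=str, default="0.0.0.0")
parser.add_argument("--port", type=int, default=9090)
parser.add_argument(
    "--through-proxy",
    dest="through_proxy",
    action="store_true",
    help="Serve through an RPC proxy; host/port then address the proxy.",
)

args = parser.parse_args(["--host", "0.0.0.0", "--port", "9090", "--through-proxy"])
assert args.through_proxy is True
# forwarded as: rpc.Server(args.host, args.port, ..., is_proxy=args.through_proxy)
```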
gh_patches_debug_5099
rasdani/github-patches
git_diff
translate__pootle-6747
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Add current character count when entering translations We're trying to use Pootle to translate metadata strings for an app in the iOS AppStore. The metadata includes individual messages for the app name, subtitle, description, privacy URL and so on and there are different limits on the number of characters allowed in each of them. For instance, an app's name can be no more than 30 characters. When entering translations, it would be really helpful to see the current number of characters that the translation uses as you type to ensure that you're not exceeding the limit. This could maybe fit on the lower right corner of the input view. You currently have timeline / comment / raw on the left. Current character count could just be a small label that floats to the right on the same line. # Environment (i.e. 'pootle --version', DB, OS, Browser): Pootle 2.8.0 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `pootle/core/templatetags/core.py` Content: ``` 1 #!/usr/bin/env python 2 # -*- coding: utf-8 -*- 3 # 4 # Copyright (C) Pootle contributors. 5 # 6 # This file is a part of the Pootle project. It is distributed under the GPL3 7 # or later license. See the LICENSE file for a copy of the license and the 8 # AUTHORS file for copyright and authorship information. 9 10 from django import template 11 from django.utils.html import escapejs 12 from django.utils.safestring import mark_safe 13 14 from ..utils.json import jsonify 15 16 17 register = template.Library() 18 19 20 @register.filter 21 def to_js(value): 22 """Returns a string which leaves the value readily available for JS 23 consumption. 24 """ 25 return mark_safe('JSON.parse("%s")' % escapejs(jsonify(value))) 26 27 28 @register.inclusion_tag('includes/formtable.html') 29 def formtable(formtable): 30 return dict(formtable=formtable) 31 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/pootle/core/templatetags/core.py b/pootle/core/templatetags/core.py --- a/pootle/core/templatetags/core.py +++ b/pootle/core/templatetags/core.py @@ -25,6 +25,13 @@ return mark_safe('JSON.parse("%s")' % escapejs(jsonify(value))) [email protected] +def map_to_lengths(value): + """Maps a list value by replacing each element with its length. + """ + return [len(e) for e in value] + + @register.inclusion_tag('includes/formtable.html') def formtable(formtable): return dict(formtable=formtable)
{"golden_diff": "diff --git a/pootle/core/templatetags/core.py b/pootle/core/templatetags/core.py\n--- a/pootle/core/templatetags/core.py\n+++ b/pootle/core/templatetags/core.py\n@@ -25,6 +25,13 @@\n return mark_safe('JSON.parse(\"%s\")' % escapejs(jsonify(value)))\n \n \[email protected]\n+def map_to_lengths(value):\n+ \"\"\"Maps a list value by replacing each element with its length.\n+ \"\"\"\n+ return [len(e) for e in value]\n+\n+\n @register.inclusion_tag('includes/formtable.html')\n def formtable(formtable):\n return dict(formtable=formtable)\n", "issue": "Add current character count when entering translations\nWe're trying to use Pootle to translate metadata strings for an app in the iOS AppStore. The metadata includes individual messages for the app name, subtitle, description, privacy URL and so on and there are different limits on the number of characters allowed in each of them. For instance, an app's name can be no more than 30 characters. \r\n\r\nWhen entering translations, it would be really helpful to see the current number of characters that the translation uses as you type to ensure that you're not exceeding the limit. This could maybe fit on the lower right corner of the input view. You currently have timeline / comment / raw on the left. Current character count could just be a small label that floats to the right on the same line.\r\n\r\n# Environment (i.e. 'pootle --version', DB, OS, Browser):\r\n\r\nPootle 2.8.0\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django import template\nfrom django.utils.html import escapejs\nfrom django.utils.safestring import mark_safe\n\nfrom ..utils.json import jsonify\n\n\nregister = template.Library()\n\n\[email protected]\ndef to_js(value):\n \"\"\"Returns a string which leaves the value readily available for JS\n consumption.\n \"\"\"\n return mark_safe('JSON.parse(\"%s\")' % escapejs(jsonify(value)))\n\n\[email protected]_tag('includes/formtable.html')\ndef formtable(formtable):\n return dict(formtable=formtable)\n", "path": "pootle/core/templatetags/core.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django import template\nfrom django.utils.html import escapejs\nfrom django.utils.safestring import mark_safe\n\nfrom ..utils.json import jsonify\n\n\nregister = template.Library()\n\n\[email protected]\ndef to_js(value):\n \"\"\"Returns a string which leaves the value readily available for JS\n consumption.\n \"\"\"\n return mark_safe('JSON.parse(\"%s\")' % escapejs(jsonify(value)))\n\n\[email protected]\ndef map_to_lengths(value):\n \"\"\"Maps a list value by replacing each element with its length.\n \"\"\"\n return [len(e) for e in value]\n\n\[email protected]_tag('includes/formtable.html')\ndef formtable(formtable):\n return dict(formtable=formtable)\n", "path": "pootle/core/templatetags/core.py"}]}
697
157
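The new template filter from this record's diff, restated standalone so it runs without Django (the `@register.filter` decorator is omitted):

```python
def map_to_lengths(value):
    """Maps a list value by replacing each element with its length."""
    return [len(e) for e in value]

assert map_to_lengths(["Pootle", "", "app name"]) == [6, 0, 8]
assert map_to_lengths([]) == []
```

In a template this would presumably feed the live character counter, e.g. something like `{{ unit.target.strings|map_to_lengths }}`; the exact template usage is an assumption, not shown in the diff.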
gh_patches_debug_27424
rasdani/github-patches
git_diff
learningequality__kolibri-2117
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- can no longer sign in using a pre-existing superuser account after upgrade It appears that my superuser account is no longer available since upgrading to the latest develop. I would have expected it to get migrated to an admin account with superuser flags enabled. It actually looks like the user might still be there: ![image](https://user-images.githubusercontent.com/2367265/29852726-81f2b1b4-8cf0-11e7-9b0a-adb9fb0cb264.png) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `kolibri/auth/migrations/0004_auto_20170816_1607.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 # Generated by Django 1.9.7 on 2017-08-16 23:07 3 from __future__ import unicode_literals 4 5 import django.core.validators 6 from django.db import migrations, models 7 from kolibri.auth.constants.role_kinds import ADMIN 8 9 10 def device_owner_to_super_user(apps, schema_editor): 11 DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner') 12 FacilityUser = apps.get_model('kolibriauth', 'FacilityUser') 13 Facility = apps.get_model('kolibriauth', 'Facility') 14 default_facility = Facility.objects.all().first() 15 DevicePermissions = apps.get_model('device', 'DevicePermissions') 16 DeviceSettings = apps.get_model('device', 'DeviceSettings') 17 Role = apps.get_model('kolibriauth', 'Role') 18 from kolibri.auth.models import FacilityUser as RealFacilityUser, Facility as RealFacility, Role as RealRole 19 real_default_facility = RealFacility.get_default_facility() 20 # Can't do much if no facilities exist, as no facility to FK the users onto 21 if default_facility: 22 for device_owner in DeviceOwner.objects.all(): 23 dataset_id = real_default_facility.dataset_id 24 real_superuser = RealFacilityUser( 25 username=device_owner.username, 26 facility=real_default_facility, 27 dataset_id=dataset_id 28 ) 29 uuid = real_superuser.calculate_uuid() 30 superuser = FacilityUser.objects.create( 31 username=device_owner.username, 32 password=device_owner.password, 33 facility=default_facility, 34 full_name=device_owner.full_name, 35 date_joined=device_owner.date_joined, 36 id=uuid, 37 dataset_id=dataset_id, 38 _morango_source_id=real_superuser._morango_source_id, 39 _morango_partition=real_superuser._morango_partition, 40 ) 41 real_role = RealRole( 42 user=real_superuser, 43 collection=real_default_facility, 44 kind=ADMIN, 45 dataset_id=dataset_id, 46 ) 47 role_uuid = real_role.calculate_uuid() 48 role = Role.objects.create( 49 user=superuser, 50 collection=default_facility, 51 kind=ADMIN, 52 id=role_uuid, 53 dataset_id=dataset_id, 54 _morango_source_id=real_role._morango_source_id, 55 _morango_partition=real_role._morango_partition, 56 ) 57 DevicePermissions.objects.create(user=superuser, is_superuser=True) 58 # Finally, set the is_provisioned flag 59 settings, created = DeviceSettings.objects.get_or_create(is_provisioned=True) 60 61 62 class Migration(migrations.Migration): 63 64 dependencies = [ 65 ('kolibriauth', '0003_auto_20170621_0958'), 66 ('device', '0001_initial') 67 ] 68 69 operations = [ 70 migrations.RunPython(device_owner_to_super_user, migrations.RunPython.noop), 71 migrations.DeleteModel( 72 name='DeviceOwner', 73 ), 74 ] 75 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/kolibri/auth/migrations/0004_auto_20170816_1607.py b/kolibri/auth/migrations/0004_auto_20170816_1607.py --- a/kolibri/auth/migrations/0004_auto_20170816_1607.py +++ b/kolibri/auth/migrations/0004_auto_20170816_1607.py @@ -8,17 +8,17 @@ def device_owner_to_super_user(apps, schema_editor): - DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner') - FacilityUser = apps.get_model('kolibriauth', 'FacilityUser') - Facility = apps.get_model('kolibriauth', 'Facility') - default_facility = Facility.objects.all().first() - DevicePermissions = apps.get_model('device', 'DevicePermissions') - DeviceSettings = apps.get_model('device', 'DeviceSettings') - Role = apps.get_model('kolibriauth', 'Role') from kolibri.auth.models import FacilityUser as RealFacilityUser, Facility as RealFacility, Role as RealRole real_default_facility = RealFacility.get_default_facility() # Can't do much if no facilities exist, as no facility to FK the users onto - if default_facility: + if real_default_facility: + DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner') + FacilityUser = apps.get_model('kolibriauth', 'FacilityUser') + Facility = apps.get_model('kolibriauth', 'Facility') + default_facility = Facility.objects.get(pk=real_default_facility.id) + DevicePermissions = apps.get_model('device', 'DevicePermissions') + DeviceSettings = apps.get_model('device', 'DeviceSettings') + Role = apps.get_model('kolibriauth', 'Role') for device_owner in DeviceOwner.objects.all(): dataset_id = real_default_facility.dataset_id real_superuser = RealFacilityUser(
{"golden_diff": "diff --git a/kolibri/auth/migrations/0004_auto_20170816_1607.py b/kolibri/auth/migrations/0004_auto_20170816_1607.py\n--- a/kolibri/auth/migrations/0004_auto_20170816_1607.py\n+++ b/kolibri/auth/migrations/0004_auto_20170816_1607.py\n@@ -8,17 +8,17 @@\n \n \n def device_owner_to_super_user(apps, schema_editor):\n- DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner')\n- FacilityUser = apps.get_model('kolibriauth', 'FacilityUser')\n- Facility = apps.get_model('kolibriauth', 'Facility')\n- default_facility = Facility.objects.all().first()\n- DevicePermissions = apps.get_model('device', 'DevicePermissions')\n- DeviceSettings = apps.get_model('device', 'DeviceSettings')\n- Role = apps.get_model('kolibriauth', 'Role')\n from kolibri.auth.models import FacilityUser as RealFacilityUser, Facility as RealFacility, Role as RealRole\n real_default_facility = RealFacility.get_default_facility()\n # Can't do much if no facilities exist, as no facility to FK the users onto\n- if default_facility:\n+ if real_default_facility:\n+ DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner')\n+ FacilityUser = apps.get_model('kolibriauth', 'FacilityUser')\n+ Facility = apps.get_model('kolibriauth', 'Facility')\n+ default_facility = Facility.objects.get(pk=real_default_facility.id)\n+ DevicePermissions = apps.get_model('device', 'DevicePermissions')\n+ DeviceSettings = apps.get_model('device', 'DeviceSettings')\n+ Role = apps.get_model('kolibriauth', 'Role')\n for device_owner in DeviceOwner.objects.all():\n dataset_id = real_default_facility.dataset_id\n real_superuser = RealFacilityUser(\n", "issue": "can no longer sign in using a pre-existing superuser account after upgrade\nIt appears that my superuser account is no longer available since upgrading to the latest develop. 
\r\n\r\nI would have expected it to get migrated to an admin account with superuser flags enabled.\r\n\r\nIt actually looks like the user might still be there:\r\n\r\n![image](https://user-images.githubusercontent.com/2367265/29852726-81f2b1b4-8cf0-11e7-9b0a-adb9fb0cb264.png)\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Generated by Django 1.9.7 on 2017-08-16 23:07\nfrom __future__ import unicode_literals\n\nimport django.core.validators\nfrom django.db import migrations, models\nfrom kolibri.auth.constants.role_kinds import ADMIN\n\n\ndef device_owner_to_super_user(apps, schema_editor):\n DeviceOwner = apps.get_model('kolibriauth', 'DeviceOwner')\n FacilityUser = apps.get_model('kolibriauth', 'FacilityUser')\n Facility = apps.get_model('kolibriauth', 'Facility')\n default_facility = Facility.objects.all().first()\n DevicePermissions = apps.get_model('device', 'DevicePermissions')\n DeviceSettings = apps.get_model('device', 'DeviceSettings')\n Role = apps.get_model('kolibriauth', 'Role')\n from kolibri.auth.models import FacilityUser as RealFacilityUser, Facility as RealFacility, Role as RealRole\n real_default_facility = RealFacility.get_default_facility()\n # Can't do much if no facilities exist, as no facility to FK the users onto\n if default_facility:\n for device_owner in DeviceOwner.objects.all():\n dataset_id = real_default_facility.dataset_id\n real_superuser = RealFacilityUser(\n username=device_owner.username,\n facility=real_default_facility,\n dataset_id=dataset_id\n )\n uuid = real_superuser.calculate_uuid()\n superuser = FacilityUser.objects.create(\n username=device_owner.username,\n password=device_owner.password,\n facility=default_facility,\n full_name=device_owner.full_name,\n date_joined=device_owner.date_joined,\n id=uuid,\n dataset_id=dataset_id,\n _morango_source_id=real_superuser._morango_source_id,\n _morango_partition=real_superuser._morango_partition,\n )\n real_role = RealRole(\n user=real_superuser,\n collection=real_default_facility,\n kind=ADMIN,\n dataset_id=dataset_id,\n )\n role_uuid = real_role.calculate_uuid()\n role = Role.objects.create(\n user=superuser,\n collection=default_facility,\n kind=ADMIN,\n id=role_uuid,\n dataset_id=dataset_id,\n _morango_source_id=real_role._morango_source_id,\n _morango_partition=real_role._morango_partition,\n )\n DevicePermissions.objects.create(user=superuser, is_superuser=True)\n # Finally, set the is_provisioned flag\n settings, created = DeviceSettings.objects.get_or_create(is_provisioned=True)\n\n\nclass Migration(migrations.Migration):\n\n dependencies = [\n ('kolibriauth', '0003_auto_20170621_0958'),\n ('device', '0001_initial')\n ]\n\n operations = [\n migrations.RunPython(device_owner_to_super_user, migrations.RunPython.noop),\n migrations.DeleteModel(\n name='DeviceOwner',\n ),\n ]\n", "path": "kolibri/auth/migrations/0004_auto_20170816_1607.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Generated by Django 1.9.7 on 2017-08-16 23:07\nfrom __future__ import unicode_literals\n\nimport django.core.validators\nfrom django.db import migrations, models\nfrom kolibri.auth.constants.role_kinds import ADMIN\n\n\ndef device_owner_to_super_user(apps, schema_editor):\n from kolibri.auth.models import FacilityUser as RealFacilityUser, Facility as RealFacility, Role as RealRole\n real_default_facility = RealFacility.get_default_facility()\n # Can't do much if no facilities exist, as no facility to FK the users onto\n if real_default_facility:\n DeviceOwner = 
apps.get_model('kolibriauth', 'DeviceOwner')\n FacilityUser = apps.get_model('kolibriauth', 'FacilityUser')\n Facility = apps.get_model('kolibriauth', 'Facility')\n default_facility = Facility.objects.get(pk=real_default_facility.id)\n DevicePermissions = apps.get_model('device', 'DevicePermissions')\n DeviceSettings = apps.get_model('device', 'DeviceSettings')\n Role = apps.get_model('kolibriauth', 'Role')\n for device_owner in DeviceOwner.objects.all():\n dataset_id = real_default_facility.dataset_id\n real_superuser = RealFacilityUser(\n username=device_owner.username,\n facility=real_default_facility,\n dataset_id=dataset_id\n )\n uuid = real_superuser.calculate_uuid()\n superuser = FacilityUser.objects.create(\n username=device_owner.username,\n password=device_owner.password,\n facility=default_facility,\n full_name=device_owner.full_name,\n date_joined=device_owner.date_joined,\n id=uuid,\n dataset_id=dataset_id,\n _morango_source_id=real_superuser._morango_source_id,\n _morango_partition=real_superuser._morango_partition,\n )\n real_role = RealRole(\n user=real_superuser,\n collection=real_default_facility,\n kind=ADMIN,\n dataset_id=dataset_id,\n )\n role_uuid = real_role.calculate_uuid()\n role = Role.objects.create(\n user=superuser,\n collection=default_facility,\n kind=ADMIN,\n id=role_uuid,\n dataset_id=dataset_id,\n _morango_source_id=real_role._morango_source_id,\n _morango_partition=real_role._morango_partition,\n )\n DevicePermissions.objects.create(user=superuser, is_superuser=True)\n # Finally, set the is_provisioned flag\n settings, created = DeviceSettings.objects.get_or_create(is_provisioned=True)\n\n\nclass Migration(migrations.Migration):\n\n dependencies = [\n ('kolibriauth', '0003_auto_20170621_0958'),\n ('device', '0001_initial')\n ]\n\n operations = [\n migrations.RunPython(device_owner_to_super_user, migrations.RunPython.noop),\n migrations.DeleteModel(\n name='DeviceOwner',\n ),\n ]\n", "path": "kolibri/auth/migrations/0004_auto_20170816_1607.py"}]}
1,189
471
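A Django-free sketch of the mismatch this record's diff fixes; the dicts below stand in for model rows and are invented for illustration. The buggy migration guarded on the *historical* `Facility.objects.all().first()` while building users against the *real* default facility, so the two could name different rows:

```python
real_default = {"id": "42", "name": "Default"}    # RealFacility.get_default_facility()
historical_rows = [{"id": "07"}, {"id": "42"}]    # historical Facility.objects.all()

buggy = historical_rows[0]                        # .first(): may be the wrong facility
fixed = next(r for r in historical_rows if r["id"] == real_default["id"])

assert buggy["id"] != real_default["id"]          # superuser FK'd onto facility "07"
assert fixed["id"] == real_default["id"]          # patched: objects.get(pk=real_default.id)
```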
gh_patches_debug_18591
rasdani/github-patches
git_diff
StackStorm__st2-4007
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Pinned eventlet version has outstanding bugs pinned version 0.17 of eventlet has outstanding bugs on it's monkey patching of the ssl module. e.g. https://github.com/eventlet/eventlet/issues/371 ``` # Note: 0.20.0 removed select.poll() on which some of our code and libraries we # depend on rely ``` @Kami committed this reversion in https://github.com/StackStorm/st2/commit/1ec43d294e6770e56ec8f9990c805cb9dffe98c5 What was the specific issue? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `st2common/st2common/util/monkey_patch.py` Content: ``` 1 # Licensed to the StackStorm, Inc ('StackStorm') under one or more 2 # contributor license agreements. See the NOTICE file distributed with 3 # this work for additional information regarding copyright ownership. 4 # The ASF licenses this file to You under the Apache License, Version 2.0 5 # (the "License"); you may not use this file except in compliance with 6 # the License. You may obtain a copy of the License at 7 # 8 # http://www.apache.org/licenses/LICENSE-2.0 9 # 10 # Unless required by applicable law or agreed to in writing, software 11 # distributed under the License is distributed on an "AS IS" BASIS, 12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 # See the License for the specific language governing permissions and 14 # 15 16 """ 17 Module for performing eventlet and other monkey patching. 18 """ 19 20 from __future__ import absolute_import 21 22 import sys 23 24 __all__ = [ 25 'monkey_patch', 26 'is_use_debugger_flag_provided' 27 ] 28 29 USE_DEBUGGER_FLAG = '--use-debugger' 30 PARENT_ARGS_FLAG = '--parent-args=' 31 32 33 def monkey_patch(): 34 """ 35 Function which performs eventlet monkey patching and also takes into account "--use-debugger" 36 argument in the command line arguments. 37 38 If this argument is found, no monkey patching is performed for the thread module. This allows 39 user to use remote debuggers. 40 """ 41 import eventlet 42 43 patch_thread = not is_use_debugger_flag_provided() 44 eventlet.monkey_patch(os=True, select=True, socket=True, thread=patch_thread, time=True) 45 46 47 def is_use_debugger_flag_provided(): 48 # 1. Check sys.argv directly 49 if USE_DEBUGGER_FLAG in sys.argv: 50 return True 51 52 # 2. Check "parent-args" arguments. This is used for spawned processes such as sensors and 53 # Python runner actions 54 55 for arg in sys.argv: 56 if arg.startswith(PARENT_ARGS_FLAG) and USE_DEBUGGER_FLAG in arg: 57 return True 58 59 return False 60 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/st2common/st2common/util/monkey_patch.py b/st2common/st2common/util/monkey_patch.py --- a/st2common/st2common/util/monkey_patch.py +++ b/st2common/st2common/util/monkey_patch.py @@ -23,6 +23,7 @@ __all__ = [ 'monkey_patch', + 'use_select_poll_workaround', 'is_use_debugger_flag_provided' ] @@ -44,6 +45,21 @@ eventlet.monkey_patch(os=True, select=True, socket=True, thread=patch_thread, time=True) +def use_select_poll_workaround(): + """ + Work around for some tests which injects original select module with select.poll() + available to sys.modules. + """ + import sys + import subprocess + import eventlet + + # Work around to get tests to pass with eventlet >= 0.20.0 + if 'nose' in sys.modules.keys(): + sys.modules['select'] = eventlet.patcher.original('select') + subprocess.select = eventlet.patcher.original('select') + + def is_use_debugger_flag_provided(): # 1. Check sys.argv directly if USE_DEBUGGER_FLAG in sys.argv:
{"golden_diff": "diff --git a/st2common/st2common/util/monkey_patch.py b/st2common/st2common/util/monkey_patch.py\n--- a/st2common/st2common/util/monkey_patch.py\n+++ b/st2common/st2common/util/monkey_patch.py\n@@ -23,6 +23,7 @@\n \n __all__ = [\n 'monkey_patch',\n+ 'use_select_poll_workaround',\n 'is_use_debugger_flag_provided'\n ]\n \n@@ -44,6 +45,21 @@\n eventlet.monkey_patch(os=True, select=True, socket=True, thread=patch_thread, time=True)\n \n \n+def use_select_poll_workaround():\n+ \"\"\"\n+ Work around for some tests which injects original select module with select.poll()\n+ available to sys.modules.\n+ \"\"\"\n+ import sys\n+ import subprocess\n+ import eventlet\n+\n+ # Work around to get tests to pass with eventlet >= 0.20.0\n+ if 'nose' in sys.modules.keys():\n+ sys.modules['select'] = eventlet.patcher.original('select')\n+ subprocess.select = eventlet.patcher.original('select')\n+\n+\n def is_use_debugger_flag_provided():\n # 1. Check sys.argv directly\n if USE_DEBUGGER_FLAG in sys.argv:\n", "issue": "Pinned eventlet version has outstanding bugs\npinned version 0.17 of eventlet has outstanding bugs on it's monkey patching of the ssl module.\r\n\r\ne.g.\r\nhttps://github.com/eventlet/eventlet/issues/371\r\n\r\n```\r\n# Note: 0.20.0 removed select.poll() on which some of our code and libraries we\r\n# depend on rely\r\n```\r\n\r\n@Kami committed this reversion in https://github.com/StackStorm/st2/commit/1ec43d294e6770e56ec8f9990c805cb9dffe98c5\r\n\r\nWhat was the specific issue?\n", "before_files": [{"content": "# Licensed to the StackStorm, Inc ('StackStorm') under one or more\n# contributor license agreements. See the NOTICE file distributed with\n# this work for additional information regarding copyright ownership.\n# The ASF licenses this file to You under the Apache License, Version 2.0\n# (the \"License\"); you may not use this file except in compliance with\n# the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n#\n\n\"\"\"\nModule for performing eventlet and other monkey patching.\n\"\"\"\n\nfrom __future__ import absolute_import\n\nimport sys\n\n__all__ = [\n 'monkey_patch',\n 'is_use_debugger_flag_provided'\n]\n\nUSE_DEBUGGER_FLAG = '--use-debugger'\nPARENT_ARGS_FLAG = '--parent-args='\n\n\ndef monkey_patch():\n \"\"\"\n Function which performs eventlet monkey patching and also takes into account \"--use-debugger\"\n argument in the command line arguments.\n\n If this argument is found, no monkey patching is performed for the thread module. This allows\n user to use remote debuggers.\n \"\"\"\n import eventlet\n\n patch_thread = not is_use_debugger_flag_provided()\n eventlet.monkey_patch(os=True, select=True, socket=True, thread=patch_thread, time=True)\n\n\ndef is_use_debugger_flag_provided():\n # 1. Check sys.argv directly\n if USE_DEBUGGER_FLAG in sys.argv:\n return True\n\n # 2. Check \"parent-args\" arguments. 
This is used for spawned processes such as sensors and\n # Python runner actions\n\n for arg in sys.argv:\n if arg.startswith(PARENT_ARGS_FLAG) and USE_DEBUGGER_FLAG in arg:\n return True\n\n return False\n", "path": "st2common/st2common/util/monkey_patch.py"}], "after_files": [{"content": "# Licensed to the StackStorm, Inc ('StackStorm') under one or more\n# contributor license agreements. See the NOTICE file distributed with\n# this work for additional information regarding copyright ownership.\n# The ASF licenses this file to You under the Apache License, Version 2.0\n# (the \"License\"); you may not use this file except in compliance with\n# the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n#\n\n\"\"\"\nModule for performing eventlet and other monkey patching.\n\"\"\"\n\nfrom __future__ import absolute_import\n\nimport sys\n\n__all__ = [\n 'monkey_patch',\n 'use_select_poll_workaround',\n 'is_use_debugger_flag_provided'\n]\n\nUSE_DEBUGGER_FLAG = '--use-debugger'\nPARENT_ARGS_FLAG = '--parent-args='\n\n\ndef monkey_patch():\n \"\"\"\n Function which performs eventlet monkey patching and also takes into account \"--use-debugger\"\n argument in the command line arguments.\n\n If this argument is found, no monkey patching is performed for the thread module. This allows\n user to use remote debuggers.\n \"\"\"\n import eventlet\n\n patch_thread = not is_use_debugger_flag_provided()\n eventlet.monkey_patch(os=True, select=True, socket=True, thread=patch_thread, time=True)\n\n\ndef use_select_poll_workaround():\n \"\"\"\n Work around for some tests which injects original select module with select.poll()\n available to sys.modules.\n \"\"\"\n import sys\n import subprocess\n import eventlet\n\n # Work around to get tests to pass with eventlet >= 0.20.0\n if 'nose' in sys.modules.keys():\n sys.modules['select'] = eventlet.patcher.original('select')\n subprocess.select = eventlet.patcher.original('select')\n\n\ndef is_use_debugger_flag_provided():\n # 1. Check sys.argv directly\n if USE_DEBUGGER_FLAG in sys.argv:\n return True\n\n # 2. Check \"parent-args\" arguments. This is used for spawned processes such as sensors and\n # Python runner actions\n\n for arg in sys.argv:\n if arg.startswith(PARENT_ARGS_FLAG) and USE_DEBUGGER_FLAG in arg:\n return True\n\n return False\n", "path": "st2common/st2common/util/monkey_patch.py"}]}
966
286
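A short sketch of the workaround helper this record's diff adds; it requires eventlet installed and uses `eventlet.patcher.original`, the same call the patch itself relies on:

```python
import eventlet

eventlet.monkey_patch(select=True)   # on eventlet >= 0.20.0 the patched module drops poll()
original_select = eventlet.patcher.original("select")  # unpatched stdlib select, poll() intact
# test runs can then be pointed at it, e.g.:
#   sys.modules["select"] = original_select
```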
gh_patches_debug_9640
rasdani/github-patches
git_diff
chainer__chainer-7760
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Refactor utility link in optimizer_hooks unit tests `chainer_tests/optimizer_hooks_tests` uses similar dummy links, which can be refactored to reduce repetition. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `chainer/optimizer_hooks/gradient_hard_clipping.py` Content: ``` 1 import chainer 2 from chainer import backend 3 4 5 class GradientHardClipping(object): 6 7 """Optimizer/UpdateRule hook function for gradient clipping. 8 9 This hook function clips all gradient arrays to be within a lower and upper 10 bound. 11 12 Args: 13 lower_bound (float): The lower bound of the gradient value. 14 upper_bound (float): The upper bound of the gradient value. 15 16 Attributes: 17 ~optimizer_hooks.GradientHardClipping.lower_bound (float): The 18 lower bound of the gradient value. 19 ~optimizer_hooks.GradientHardClipping.upper_bound (float): The 20 upper bound of the gradient value. 21 ~optimizer_hooks.GradientHardClipping.timing (string): Specifies 22 when this hook should be called by the 23 Optimizer/UpdateRule. Valid values are 'pre' 24 (before any updates) and 'post' 25 (after any updates). 26 ~optimizer_hooks.GradientHardClipping.call_for_each_param (bool): \ 27 Specifies if this hook is called for each parameter 28 (``True``) or only once (``False``) by an optimizer to 29 which this hook is registered. This function does 30 not expect users to switch the value from default one, 31 which is `True`. 32 33 .. versionadded:: 4.0.0 34 The *timing* parameter. 35 36 """ 37 name = 'GradientHardClipping' 38 call_for_each_param = True 39 timing = 'pre' 40 41 def __init__(self, lower_bound, upper_bound): 42 self.lower_bound = lower_bound 43 self.upper_bound = upper_bound 44 45 def __call__(self, rule, param): 46 grad = param.grad 47 if grad is None: 48 return 49 with chainer.using_device(param.device): 50 xp = param.device.xp 51 # TODO(kshitij12345): Fix when chainerx.clip 52 # supports kwarg `out`. 53 if xp == backend.chainerx \ 54 or isinstance(param.grad, backend.intel64.mdarray): 55 grad[:] = grad.clip(self.lower_bound, self.upper_bound) 56 else: 57 # Save on new object allocation when using numpy and cupy 58 # using kwarg `out` 59 xp.clip(grad, self.lower_bound, self.upper_bound, out=grad) 60 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/chainer/optimizer_hooks/gradient_hard_clipping.py b/chainer/optimizer_hooks/gradient_hard_clipping.py --- a/chainer/optimizer_hooks/gradient_hard_clipping.py +++ b/chainer/optimizer_hooks/gradient_hard_clipping.py @@ -52,7 +52,7 @@ # supports kwarg `out`. if xp == backend.chainerx \ or isinstance(param.grad, backend.intel64.mdarray): - grad[:] = grad.clip(self.lower_bound, self.upper_bound) + grad[...] = grad.clip(self.lower_bound, self.upper_bound) else: # Save on new object allocation when using numpy and cupy # using kwarg `out`
{"golden_diff": "diff --git a/chainer/optimizer_hooks/gradient_hard_clipping.py b/chainer/optimizer_hooks/gradient_hard_clipping.py\n--- a/chainer/optimizer_hooks/gradient_hard_clipping.py\n+++ b/chainer/optimizer_hooks/gradient_hard_clipping.py\n@@ -52,7 +52,7 @@\n # supports kwarg `out`.\n if xp == backend.chainerx \\\n or isinstance(param.grad, backend.intel64.mdarray):\n- grad[:] = grad.clip(self.lower_bound, self.upper_bound)\n+ grad[...] = grad.clip(self.lower_bound, self.upper_bound)\n else:\n # Save on new object allocation when using numpy and cupy\n # using kwarg `out`\n", "issue": "Refactor utility link in optimizer_hooks unit tests\n`chainer_tests/optimizer_hooks_tests` uses similar dummy links, which can be refactored to reduce repetition.\n", "before_files": [{"content": "import chainer\nfrom chainer import backend\n\n\nclass GradientHardClipping(object):\n\n \"\"\"Optimizer/UpdateRule hook function for gradient clipping.\n\n This hook function clips all gradient arrays to be within a lower and upper\n bound.\n\n Args:\n lower_bound (float): The lower bound of the gradient value.\n upper_bound (float): The upper bound of the gradient value.\n\n Attributes:\n ~optimizer_hooks.GradientHardClipping.lower_bound (float): The\n lower bound of the gradient value.\n ~optimizer_hooks.GradientHardClipping.upper_bound (float): The\n upper bound of the gradient value.\n ~optimizer_hooks.GradientHardClipping.timing (string): Specifies\n when this hook should be called by the\n Optimizer/UpdateRule. Valid values are 'pre'\n (before any updates) and 'post'\n (after any updates).\n ~optimizer_hooks.GradientHardClipping.call_for_each_param (bool): \\\n Specifies if this hook is called for each parameter\n (``True``) or only once (``False``) by an optimizer to\n which this hook is registered. This function does\n not expect users to switch the value from default one,\n which is `True`.\n\n .. versionadded:: 4.0.0\n The *timing* parameter.\n\n \"\"\"\n name = 'GradientHardClipping'\n call_for_each_param = True\n timing = 'pre'\n\n def __init__(self, lower_bound, upper_bound):\n self.lower_bound = lower_bound\n self.upper_bound = upper_bound\n\n def __call__(self, rule, param):\n grad = param.grad\n if grad is None:\n return\n with chainer.using_device(param.device):\n xp = param.device.xp\n # TODO(kshitij12345): Fix when chainerx.clip\n # supports kwarg `out`.\n if xp == backend.chainerx \\\n or isinstance(param.grad, backend.intel64.mdarray):\n grad[:] = grad.clip(self.lower_bound, self.upper_bound)\n else:\n # Save on new object allocation when using numpy and cupy\n # using kwarg `out`\n xp.clip(grad, self.lower_bound, self.upper_bound, out=grad)\n", "path": "chainer/optimizer_hooks/gradient_hard_clipping.py"}], "after_files": [{"content": "import chainer\nfrom chainer import backend\n\n\nclass GradientHardClipping(object):\n\n \"\"\"Optimizer/UpdateRule hook function for gradient clipping.\n\n This hook function clips all gradient arrays to be within a lower and upper\n bound.\n\n Args:\n lower_bound (float): The lower bound of the gradient value.\n upper_bound (float): The upper bound of the gradient value.\n\n Attributes:\n ~optimizer_hooks.GradientHardClipping.lower_bound (float): The\n lower bound of the gradient value.\n ~optimizer_hooks.GradientHardClipping.upper_bound (float): The\n upper bound of the gradient value.\n ~optimizer_hooks.GradientHardClipping.timing (string): Specifies\n when this hook should be called by the\n Optimizer/UpdateRule. 
Valid values are 'pre'\n (before any updates) and 'post'\n (after any updates).\n ~optimizer_hooks.GradientHardClipping.call_for_each_param (bool): \\\n Specifies if this hook is called for each parameter\n (``True``) or only once (``False``) by an optimizer to\n which this hook is registered. This function does\n not expect users to switch the value from default one,\n which is `True`.\n\n .. versionadded:: 4.0.0\n The *timing* parameter.\n\n \"\"\"\n name = 'GradientHardClipping'\n call_for_each_param = True\n timing = 'pre'\n\n def __init__(self, lower_bound, upper_bound):\n self.lower_bound = lower_bound\n self.upper_bound = upper_bound\n\n def __call__(self, rule, param):\n grad = param.grad\n if grad is None:\n return\n with chainer.using_device(param.device):\n xp = param.device.xp\n # TODO(kshitij12345): Fix when chainerx.clip\n # supports kwarg `out`.\n if xp == backend.chainerx \\\n or isinstance(param.grad, backend.intel64.mdarray):\n grad[...] = grad.clip(self.lower_bound, self.upper_bound)\n else:\n # Save on new object allocation when using numpy and cupy\n # using kwarg `out`\n xp.clip(grad, self.lower_bound, self.upper_bound, out=grad)\n", "path": "chainer/optimizer_hooks/gradient_hard_clipping.py"}]}
899
155
gh_patches_debug_22503
rasdani/github-patches
git_diff
TheAlgorithms__Python-10012
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Improve our test coverage ### Feature description Many of our existing algorithm files have little to no unit testing. This is problematic because this can easily let bugs slip through. We want some assurance that the code we currently have is correct and functional. We welcome all contributors to open PRs to help us add tests to our codebase. ### How to find low-coverage files Go to the Actions tab in this repository and find the most recent **build** workflow run. Open the logs under "Run Tests" and scroll down until you find the section on code coverage: ``` ---------- coverage: platform linux, python 3.12.0-final-0 ----------- Name Stmts Miss Cover Missing ----------------------------------------------------------------------------------------------------------- quantum/q_fourier_transform.py 30 30 0% 14-93 scripts/validate_solutions.py 54 54 0% 2-94 strings/min_cost_string_conversion.py 78 75 4% 20-57, 61-75, 79-129 ... ``` The "Cover" column tells you what percentage of the lines in that file are covered by tests. We want to increase this percentage for existing files. Find a file with low coverage percentage that you wish to write tests for, add doctests for each function, and open a PR with your changes. You do not need to have a perfect coverage percentage, but all functions should have doctests. Some files will naturally be hard to write tests for. For example, the file may be poorly written because they lack any functions. Other files might be how-tos, meaning they simply demonstrate how to use an existing library's functions rather than implementing the algorithm themselves. Ignore these kinds of files, as they will need to be rewritten eventually. Furthermore, ignore files in the `web_programming` and `project_euler` directories. Web programming files are inherently hard to test and Project Euler files have their own validation workflow, so don't worry about their test coverage. _**When you open your PR, put "Contributes to #9943" in the PR description.**_ Do not use the word "fixes", "resolves", or "closes". This issue is an ongoing one, and your PR will not single-handedly resolve this issue. ### How to add doctests A doctest is a unit test that is contained within the documentation comment (docstring) for a function. Here is an example of what doctests look like within a docstring: ```py def add(a: int, b: int) -> int: """ Adds two non-negative numbers. >>> add(1, 1) 2 >>> add(2, 5) 7 >>> add(1, 0) 1 >>> add(-1, -1) Traceback (most recent last): ... ValueError: Numbers must be non-negative """ ``` For every function in the file you choose, you should write doctests like the ones shown above in its docstring. If a function doesn't have a docstring, add one. Your doctests should be comprehensive but not excessive: you should write just enough tests to cover all basic cases as well as all edge cases (e.g., negative numbers, empty lists, etc). Do not simply run a function on some example inputs and put its output as the expected output for a doctest. This assumes that the function is implemented correctly when it might not be. Verify independently that your doctests and their expected outputs are correct. **Your PR will not be merged if it has failing tests.** If you happen to discover a bug while writing doctests, please fix it. 
_**Please read our [contributing guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) before you contribute.**_ --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `dynamic_programming/minimum_partition.py` Content: ``` 1 """ 2 Partition a set into two subsets such that the difference of subset sums is minimum 3 """ 4 5 6 def find_min(arr): 7 n = len(arr) 8 s = sum(arr) 9 10 dp = [[False for x in range(s + 1)] for y in range(n + 1)] 11 12 for i in range(1, n + 1): 13 dp[i][0] = True 14 15 for i in range(1, s + 1): 16 dp[0][i] = False 17 18 for i in range(1, n + 1): 19 for j in range(1, s + 1): 20 dp[i][j] = dp[i][j - 1] 21 22 if arr[i - 1] <= j: 23 dp[i][j] = dp[i][j] or dp[i - 1][j - arr[i - 1]] 24 25 for j in range(int(s / 2), -1, -1): 26 if dp[n][j] is True: 27 diff = s - 2 * j 28 break 29 30 return diff 31 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/dynamic_programming/minimum_partition.py b/dynamic_programming/minimum_partition.py --- a/dynamic_programming/minimum_partition.py +++ b/dynamic_programming/minimum_partition.py @@ -3,13 +3,25 @@ """ -def find_min(arr): +def find_min(arr: list[int]) -> int: + """ + >>> find_min([1, 2, 3, 4, 5]) + 1 + >>> find_min([5, 5, 5, 5, 5]) + 5 + >>> find_min([5, 5, 5, 5]) + 0 + >>> find_min([3]) + 3 + >>> find_min([]) + 0 + """ n = len(arr) s = sum(arr) dp = [[False for x in range(s + 1)] for y in range(n + 1)] - for i in range(1, n + 1): + for i in range(n + 1): dp[i][0] = True for i in range(1, s + 1): @@ -17,7 +29,7 @@ for i in range(1, n + 1): for j in range(1, s + 1): - dp[i][j] = dp[i][j - 1] + dp[i][j] = dp[i - 1][j] if arr[i - 1] <= j: dp[i][j] = dp[i][j] or dp[i - 1][j - arr[i - 1]] @@ -28,3 +40,9 @@ break return diff + + +if __name__ == "__main__": + from doctest import testmod + + testmod()
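Besides adding doctests, the patch fixes two genuine bugs: `dp[0][0]` was never initialized to `True` (the old loop started at 1), and the exclude-this-element case read `dp[i][j - 1]`, the previous column, instead of `dp[i - 1][j]`, the previous row. Since `dp[i][0]` is `True`, the wrong recurrence floods every row with `True`, so the old function always returned `sum(arr) % 2`. A brute-force reference (written here for illustration, not part of the record) gives the correct answer on one of the new doctest inputs:

```python
from itertools import combinations

def min_partition_diff_bruteforce(arr):
    """Exhaustive reference: minimum |sum(A) - sum(B)| over all 2-partitions."""
    s = sum(arr)
    return min(
        abs(s - 2 * sum(subset))
        for r in range(len(arr) + 1)
        for subset in combinations(arr, r)
    )

print(min_partition_diff_bruteforce([5, 5, 5, 5, 5]))  # 5, matching the new
# doctest; the pre-patch recurrence would have reported 25 % 2 == 1 instead.
```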
{"golden_diff": "diff --git a/dynamic_programming/minimum_partition.py b/dynamic_programming/minimum_partition.py\n--- a/dynamic_programming/minimum_partition.py\n+++ b/dynamic_programming/minimum_partition.py\n@@ -3,13 +3,25 @@\n \"\"\"\n \n \n-def find_min(arr):\n+def find_min(arr: list[int]) -> int:\n+ \"\"\"\n+ >>> find_min([1, 2, 3, 4, 5])\n+ 1\n+ >>> find_min([5, 5, 5, 5, 5])\n+ 5\n+ >>> find_min([5, 5, 5, 5])\n+ 0\n+ >>> find_min([3])\n+ 3\n+ >>> find_min([])\n+ 0\n+ \"\"\"\n n = len(arr)\n s = sum(arr)\n \n dp = [[False for x in range(s + 1)] for y in range(n + 1)]\n \n- for i in range(1, n + 1):\n+ for i in range(n + 1):\n dp[i][0] = True\n \n for i in range(1, s + 1):\n@@ -17,7 +29,7 @@\n \n for i in range(1, n + 1):\n for j in range(1, s + 1):\n- dp[i][j] = dp[i][j - 1]\n+ dp[i][j] = dp[i - 1][j]\n \n if arr[i - 1] <= j:\n dp[i][j] = dp[i][j] or dp[i - 1][j - arr[i - 1]]\n@@ -28,3 +40,9 @@\n break\n \n return diff\n+\n+\n+if __name__ == \"__main__\":\n+ from doctest import testmod\n+\n+ testmod()\n", "issue": "Improve our test coverage\n### Feature description\r\n\r\nMany of our existing algorithm files have little to no unit testing. This is problematic because this can easily let bugs slip through. We want some assurance that the code we currently have is correct and functional. We welcome all contributors to open PRs to help us add tests to our codebase.\r\n\r\n### How to find low-coverage files\r\n\r\nGo to the Actions tab in this repository and find the most recent **build** workflow run. Open the logs under \"Run Tests\" and scroll down until you find the section on code coverage:\r\n```\r\n---------- coverage: platform linux, python 3.12.0-final-0 -----------\r\nName Stmts Miss Cover Missing\r\n-----------------------------------------------------------------------------------------------------------\r\nquantum/q_fourier_transform.py 30 30 0% 14-93\r\nscripts/validate_solutions.py 54 54 0% 2-94\r\nstrings/min_cost_string_conversion.py 78 75 4% 20-57, 61-75, 79-129\r\n...\r\n```\r\nThe \"Cover\" column tells you what percentage of the lines in that file are covered by tests. We want to increase this percentage for existing files. Find a file with low coverage percentage that you wish to write tests for, add doctests for each function, and open a PR with your changes. You do not need to have a perfect coverage percentage, but all functions should have doctests.\r\n\r\nSome files will naturally be hard to write tests for. For example, the file may be poorly written because they lack any functions. Other files might be how-tos, meaning they simply demonstrate how to use an existing library's functions rather than implementing the algorithm themselves. Ignore these kinds of files, as they will need to be rewritten eventually. Furthermore, ignore files in the `web_programming` and `project_euler` directories. Web programming files are inherently hard to test and Project Euler files have their own validation workflow, so don't worry about their test coverage.\r\n\r\n_**When you open your PR, put \"Contributes to #9943\" in the PR description.**_ Do not use the word \"fixes\", \"resolves\", or \"closes\". This issue is an ongoing one, and your PR will not single-handedly resolve this issue.\r\n\r\n### How to add doctests\r\n\r\nA doctest is a unit test that is contained within the documentation comment (docstring) for a function. 
Here is an example of what doctests look like within a docstring:\r\n```py\r\ndef add(a: int, b: int) -> int:\r\n \"\"\"\r\n Adds two non-negative numbers.\r\n >>> add(1, 1)\r\n 2\r\n >>> add(2, 5)\r\n 7\r\n >>> add(1, 0)\r\n 1\r\n >>> add(-1, -1)\r\n Traceback (most recent last):\r\n ...\r\n ValueError: Numbers must be non-negative\r\n \"\"\"\r\n```\r\nFor every function in the file you choose, you should write doctests like the ones shown above in its docstring. If a function doesn't have a docstring, add one. Your doctests should be comprehensive but not excessive: you should write just enough tests to cover all basic cases as well as all edge cases (e.g., negative numbers, empty lists, etc).\r\n\r\nDo not simply run a function on some example inputs and put its output as the expected output for a doctest. This assumes that the function is implemented correctly when it might not be. Verify independently that your doctests and their expected outputs are correct. **Your PR will not be merged if it has failing tests.** If you happen to discover a bug while writing doctests, please fix it.\r\n\r\n_**Please read our [contributing guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) before you contribute.**_\n", "before_files": [{"content": "\"\"\"\nPartition a set into two subsets such that the difference of subset sums is minimum\n\"\"\"\n\n\ndef find_min(arr):\n n = len(arr)\n s = sum(arr)\n\n dp = [[False for x in range(s + 1)] for y in range(n + 1)]\n\n for i in range(1, n + 1):\n dp[i][0] = True\n\n for i in range(1, s + 1):\n dp[0][i] = False\n\n for i in range(1, n + 1):\n for j in range(1, s + 1):\n dp[i][j] = dp[i][j - 1]\n\n if arr[i - 1] <= j:\n dp[i][j] = dp[i][j] or dp[i - 1][j - arr[i - 1]]\n\n for j in range(int(s / 2), -1, -1):\n if dp[n][j] is True:\n diff = s - 2 * j\n break\n\n return diff\n", "path": "dynamic_programming/minimum_partition.py"}], "after_files": [{"content": "\"\"\"\nPartition a set into two subsets such that the difference of subset sums is minimum\n\"\"\"\n\n\ndef find_min(arr: list[int]) -> int:\n \"\"\"\n >>> find_min([1, 2, 3, 4, 5])\n 1\n >>> find_min([5, 5, 5, 5, 5])\n 5\n >>> find_min([5, 5, 5, 5])\n 0\n >>> find_min([3])\n 3\n >>> find_min([])\n 0\n \"\"\"\n n = len(arr)\n s = sum(arr)\n\n dp = [[False for x in range(s + 1)] for y in range(n + 1)]\n\n for i in range(n + 1):\n dp[i][0] = True\n\n for i in range(1, s + 1):\n dp[0][i] = False\n\n for i in range(1, n + 1):\n for j in range(1, s + 1):\n dp[i][j] = dp[i - 1][j]\n\n if arr[i - 1] <= j:\n dp[i][j] = dp[i][j] or dp[i - 1][j - arr[i - 1]]\n\n for j in range(int(s / 2), -1, -1):\n if dp[n][j] is True:\n diff = s - 2 * j\n break\n\n return diff\n\n\nif __name__ == \"__main__\":\n from doctest import testmod\n\n testmod()\n", "path": "dynamic_programming/minimum_partition.py"}]}
1,390
416
gh_patches_debug_17712
rasdani/github-patches
git_diff
pypa__virtualenv-1509
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Integration with virtualenvwrapper Congratulations on the rewrite. I notice that `virtualenvwrapper` doesn't integrate with the new beta: ``` $ mkvirtualenv foo ERROR: Environment '/Users/brettmz/.virtualenvs/foo' does not contain an activate script. ``` Sure enough - I believe it is looking for a script `activate` - but the script file now depends on the shell - `activate.sh` etc. It would be good if this could work somehow - would it be reasonable to create / link a default `activate` script for this case? Or should virtualenvwrapper be updated? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/virtualenv/activation/bash/__init__.py` Content: ``` 1 from __future__ import absolute_import, unicode_literals 2 3 from virtualenv.util.path import Path 4 5 from ..via_template import ViaTemplateActivator 6 7 8 class BashActivator(ViaTemplateActivator): 9 @classmethod 10 def supports(cls, interpreter): 11 return interpreter.os != "nt" 12 13 def templates(self): 14 yield Path("activate.sh") 15 ``` Path: `src/virtualenv/activation/via_template.py` Content: ``` 1 from __future__ import absolute_import, unicode_literals 2 3 import os 4 import sys 5 from abc import ABCMeta, abstractmethod 6 7 import six 8 9 from .activator import Activator 10 11 if sys.version_info >= (3, 7): 12 from importlib.resources import read_text 13 else: 14 from importlib_resources import read_text 15 16 17 @six.add_metaclass(ABCMeta) 18 class ViaTemplateActivator(Activator): 19 @abstractmethod 20 def templates(self): 21 raise NotImplementedError 22 23 def generate(self, creator): 24 dest_folder = creator.bin_dir 25 replacements = self.replacements(creator, dest_folder) 26 self._generate(replacements, self.templates(), dest_folder, creator) 27 if self.flag_prompt is not None: 28 creator.pyenv_cfg["prompt"] = self.flag_prompt 29 30 def replacements(self, creator, dest_folder): 31 return { 32 "__VIRTUAL_PROMPT__": "" if self.flag_prompt is None else self.flag_prompt, 33 "__VIRTUAL_ENV__": six.ensure_text(str(creator.dest)), 34 "__VIRTUAL_NAME__": creator.env_name, 35 "__BIN_NAME__": six.ensure_text(str(creator.bin_dir.relative_to(creator.dest))), 36 "__PATH_SEP__": six.ensure_text(os.pathsep), 37 } 38 39 def _generate(self, replacements, templates, to_folder, creator): 40 for template in templates: 41 text = self.instantiate_template(replacements, template, creator) 42 (to_folder / template).write_text(text, encoding="utf-8") 43 44 def instantiate_template(self, replacements, template, creator): 45 # read text and do replacements 46 text = read_text(self.__module__, str(template), encoding="utf-8", errors="strict") 47 for key, value in replacements.items(): 48 value = self._repr_unicode(creator, value) 49 text = text.replace(key, value) 50 return text 51 52 @staticmethod 53 def _repr_unicode(creator, value): 54 # by default we just let it be unicode 55 return value 56 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/virtualenv/activation/bash/__init__.py b/src/virtualenv/activation/bash/__init__.py --- a/src/virtualenv/activation/bash/__init__.py +++ b/src/virtualenv/activation/bash/__init__.py @@ -12,3 +12,6 @@ def templates(self): yield Path("activate.sh") + + def as_name(self, template): + return template.stem diff --git a/src/virtualenv/activation/via_template.py b/src/virtualenv/activation/via_template.py --- a/src/virtualenv/activation/via_template.py +++ b/src/virtualenv/activation/via_template.py @@ -39,7 +39,11 @@ def _generate(self, replacements, templates, to_folder, creator): for template in templates: text = self.instantiate_template(replacements, template, creator) - (to_folder / template).write_text(text, encoding="utf-8") + dest = to_folder / self.as_name(template) + dest.write_text(text, encoding="utf-8") + + def as_name(self, template): + return template.name def instantiate_template(self, replacements, template, creator): # read text and do replacements
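The fix hinges on pathlib's `name` versus `stem` distinction: the new default `as_name` keeps the template's full filename, while the bash override drops the suffix, so the `activate.sh` template is written out as the plain `activate` file that virtualenvwrapper's `mkvirtualenv` expects. A two-line illustration:

```python
from pathlib import Path

template = Path("activate.sh")
print(template.name)  # 'activate.sh', the default ViaTemplateActivator.as_name
print(template.stem)  # 'activate', the BashActivator override
```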
{"golden_diff": "diff --git a/src/virtualenv/activation/bash/__init__.py b/src/virtualenv/activation/bash/__init__.py\n--- a/src/virtualenv/activation/bash/__init__.py\n+++ b/src/virtualenv/activation/bash/__init__.py\n@@ -12,3 +12,6 @@\n \n def templates(self):\n yield Path(\"activate.sh\")\n+\n+ def as_name(self, template):\n+ return template.stem\ndiff --git a/src/virtualenv/activation/via_template.py b/src/virtualenv/activation/via_template.py\n--- a/src/virtualenv/activation/via_template.py\n+++ b/src/virtualenv/activation/via_template.py\n@@ -39,7 +39,11 @@\n def _generate(self, replacements, templates, to_folder, creator):\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n- (to_folder / template).write_text(text, encoding=\"utf-8\")\n+ dest = to_folder / self.as_name(template)\n+ dest.write_text(text, encoding=\"utf-8\")\n+\n+ def as_name(self, template):\n+ return template.name\n \n def instantiate_template(self, replacements, template, creator):\n # read text and do replacements\n", "issue": "Integration with virtualenvwrapper\nCongratulations on the rewrite.\r\n\r\nI notice that `virtualenvwrapper` doesn't integrate with the new beta:\r\n\r\n```\r\n$ mkvirtualenv foo\r\nERROR: Environment '/Users/brettmz/.virtualenvs/foo' does not contain an activate script.\r\n```\r\n\r\nSure enough - I believe it is looking for a script `activate` - but the script file now depends on the shell - `activate.sh` etc. \r\n\r\nIt would be good if this could work somehow - would it be reasonable to create / link a default `activate` script for this case? Or should virtualenvwrapper be updated?\n", "before_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom virtualenv.util.path import Path\n\nfrom ..via_template import ViaTemplateActivator\n\n\nclass BashActivator(ViaTemplateActivator):\n @classmethod\n def supports(cls, interpreter):\n return interpreter.os != \"nt\"\n\n def templates(self):\n yield Path(\"activate.sh\")\n", "path": "src/virtualenv/activation/bash/__init__.py"}, {"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nimport sys\nfrom abc import ABCMeta, abstractmethod\n\nimport six\n\nfrom .activator import Activator\n\nif sys.version_info >= (3, 7):\n from importlib.resources import read_text\nelse:\n from importlib_resources import read_text\n\n\[email protected]_metaclass(ABCMeta)\nclass ViaTemplateActivator(Activator):\n @abstractmethod\n def templates(self):\n raise NotImplementedError\n\n def generate(self, creator):\n dest_folder = creator.bin_dir\n replacements = self.replacements(creator, dest_folder)\n self._generate(replacements, self.templates(), dest_folder, creator)\n if self.flag_prompt is not None:\n creator.pyenv_cfg[\"prompt\"] = self.flag_prompt\n\n def replacements(self, creator, dest_folder):\n return {\n \"__VIRTUAL_PROMPT__\": \"\" if self.flag_prompt is None else self.flag_prompt,\n \"__VIRTUAL_ENV__\": six.ensure_text(str(creator.dest)),\n \"__VIRTUAL_NAME__\": creator.env_name,\n \"__BIN_NAME__\": six.ensure_text(str(creator.bin_dir.relative_to(creator.dest))),\n \"__PATH_SEP__\": six.ensure_text(os.pathsep),\n }\n\n def _generate(self, replacements, templates, to_folder, creator):\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n (to_folder / template).write_text(text, encoding=\"utf-8\")\n\n def instantiate_template(self, replacements, template, creator):\n # read text and do replacements\n text = 
read_text(self.__module__, str(template), encoding=\"utf-8\", errors=\"strict\")\n for key, value in replacements.items():\n value = self._repr_unicode(creator, value)\n text = text.replace(key, value)\n return text\n\n @staticmethod\n def _repr_unicode(creator, value):\n # by default we just let it be unicode\n return value\n", "path": "src/virtualenv/activation/via_template.py"}], "after_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom virtualenv.util.path import Path\n\nfrom ..via_template import ViaTemplateActivator\n\n\nclass BashActivator(ViaTemplateActivator):\n @classmethod\n def supports(cls, interpreter):\n return interpreter.os != \"nt\"\n\n def templates(self):\n yield Path(\"activate.sh\")\n\n def as_name(self, template):\n return template.stem\n", "path": "src/virtualenv/activation/bash/__init__.py"}, {"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nimport sys\nfrom abc import ABCMeta, abstractmethod\n\nimport six\n\nfrom .activator import Activator\n\nif sys.version_info >= (3, 7):\n from importlib.resources import read_text\nelse:\n from importlib_resources import read_text\n\n\[email protected]_metaclass(ABCMeta)\nclass ViaTemplateActivator(Activator):\n @abstractmethod\n def templates(self):\n raise NotImplementedError\n\n def generate(self, creator):\n dest_folder = creator.bin_dir\n replacements = self.replacements(creator, dest_folder)\n self._generate(replacements, self.templates(), dest_folder, creator)\n if self.flag_prompt is not None:\n creator.pyenv_cfg[\"prompt\"] = self.flag_prompt\n\n def replacements(self, creator, dest_folder):\n return {\n \"__VIRTUAL_PROMPT__\": \"\" if self.flag_prompt is None else self.flag_prompt,\n \"__VIRTUAL_ENV__\": six.ensure_text(str(creator.dest)),\n \"__VIRTUAL_NAME__\": creator.env_name,\n \"__BIN_NAME__\": six.ensure_text(str(creator.bin_dir.relative_to(creator.dest))),\n \"__PATH_SEP__\": six.ensure_text(os.pathsep),\n }\n\n def _generate(self, replacements, templates, to_folder, creator):\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n dest = to_folder / self.as_name(template)\n dest.write_text(text, encoding=\"utf-8\")\n\n def as_name(self, template):\n return template.name\n\n def instantiate_template(self, replacements, template, creator):\n # read text and do replacements\n text = read_text(self.__module__, str(template), encoding=\"utf-8\", errors=\"strict\")\n for key, value in replacements.items():\n value = self._repr_unicode(creator, value)\n text = text.replace(key, value)\n return text\n\n @staticmethod\n def _repr_unicode(creator, value):\n # by default we just let it be unicode\n return value\n", "path": "src/virtualenv/activation/via_template.py"}]}
1,036
276
gh_patches_debug_30204
rasdani/github-patches
git_diff
ipython__ipython-9854
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Cannot close last figure window After updating ipython to 5.0 (from macports), I cannot close last figure window (neither with button in window bar, nor with 'close()'). The mouse cursor becomes spinning wheel inside the window, but I can still type command in the terminal window. However, if I type clf(), the figure window closes and a new one appears. This happens only when there is only one figure window remaining. Here's my system (OS X El Capitan, with python/ipython etc. installed via macports) : ``` [~] $ python -c "import IPython; print(IPython.sys_info())" {'commit_hash': u'<not found>', 'commit_source': '(none found)', 'default_encoding': 'UTF-8', 'ipython_path': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython', 'ipython_version': '5.0.0', 'os_name': 'posix', 'platform': 'Darwin-15.5.0-x86_64-i386-64bit', 'sys_executable': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python', 'sys_platform': 'darwin', 'sys_version': '2.7.12 (default, Jun 29 2016, 12:52:38) \n[GCC 4.2.1 Compatible Apple LLVM 7.0.2 (clang-700.1.81)]'} ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `IPython/terminal/pt_inputhooks/osx.py` Content: ``` 1 """Inputhook for OS X 2 3 Calls NSApp / CoreFoundation APIs via ctypes. 4 """ 5 6 # obj-c boilerplate from appnope, used under BSD 2-clause 7 8 import ctypes 9 import ctypes.util 10 11 objc = ctypes.cdll.LoadLibrary(ctypes.util.find_library('objc')) 12 13 void_p = ctypes.c_void_p 14 15 objc.objc_getClass.restype = void_p 16 objc.sel_registerName.restype = void_p 17 objc.objc_msgSend.restype = void_p 18 objc.objc_msgSend.argtypes = [void_p, void_p] 19 20 msg = objc.objc_msgSend 21 22 def _utf8(s): 23 """ensure utf8 bytes""" 24 if not isinstance(s, bytes): 25 s = s.encode('utf8') 26 return s 27 28 def n(name): 29 """create a selector name (for ObjC methods)""" 30 return objc.sel_registerName(_utf8(name)) 31 32 def C(classname): 33 """get an ObjC Class by name""" 34 return objc.objc_getClass(_utf8(classname)) 35 36 # end obj-c boilerplate from appnope 37 38 # CoreFoundation C-API calls we will use: 39 CoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library('CoreFoundation')) 40 41 CFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate 42 CFFileDescriptorCreate.restype = void_p 43 CFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p] 44 45 CFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor 46 CFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int 47 CFFileDescriptorGetNativeDescriptor.argtypes = [void_p] 48 49 CFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks 50 CFFileDescriptorEnableCallBacks.restype = None 51 CFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong] 52 53 CFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource 54 CFFileDescriptorCreateRunLoopSource.restype = void_p 55 CFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p] 56 57 CFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent 58 CFRunLoopGetCurrent.restype = void_p 59 60 CFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource 61 CFRunLoopAddSource.restype = None 62 CFRunLoopAddSource.argtypes = [void_p, void_p, 
void_p] 63 64 CFRelease = CoreFoundation.CFRelease 65 CFRelease.restype = None 66 CFRelease.argtypes = [void_p] 67 68 CFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate 69 CFFileDescriptorInvalidate.restype = None 70 CFFileDescriptorInvalidate.argtypes = [void_p] 71 72 # From CFFileDescriptor.h 73 kCFFileDescriptorReadCallBack = 1 74 kCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes') 75 76 77 def _NSApp(): 78 """Return the global NSApplication instance (NSApp)""" 79 return msg(C('NSApplication'), n('sharedApplication')) 80 81 82 def _wake(NSApp): 83 """Wake the Application""" 84 event = msg(C('NSEvent'), 85 n('otherEventWithType:location:modifierFlags:' 86 'timestamp:windowNumber:context:subtype:data1:data2:'), 87 15, # Type 88 0, # location 89 0, # flags 90 0, # timestamp 91 0, # window 92 None, # context 93 0, # subtype 94 0, # data1 95 0, # data2 96 ) 97 msg(NSApp, n('postEvent:atStart:'), void_p(event), True) 98 99 100 def _input_callback(fdref, flags, info): 101 """Callback to fire when there's input to be read""" 102 CFFileDescriptorInvalidate(fdref) 103 CFRelease(fdref) 104 NSApp = _NSApp() 105 msg(NSApp, n('stop:'), NSApp) 106 _wake(NSApp) 107 108 _c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p) 109 _c_input_callback = _c_callback_func_type(_input_callback) 110 111 112 def _stop_on_read(fd): 113 """Register callback to stop eventloop when there's data on fd""" 114 fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None) 115 CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack) 116 source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0) 117 loop = CFRunLoopGetCurrent() 118 CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes) 119 CFRelease(source) 120 121 122 def inputhook(context): 123 """Inputhook for Cocoa (NSApp)""" 124 NSApp = _NSApp() 125 window_count = msg( 126 msg(NSApp, n('windows')), 127 n('count') 128 ) 129 if not window_count: 130 return 131 _stop_on_read(context.fileno()) 132 msg(NSApp, n('run')) 133 134 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/IPython/terminal/pt_inputhooks/osx.py b/IPython/terminal/pt_inputhooks/osx.py --- a/IPython/terminal/pt_inputhooks/osx.py +++ b/IPython/terminal/pt_inputhooks/osx.py @@ -7,6 +7,7 @@ import ctypes import ctypes.util +from threading import Event objc = ctypes.cdll.LoadLibrary(ctypes.util.find_library('objc')) @@ -97,8 +98,11 @@ msg(NSApp, n('postEvent:atStart:'), void_p(event), True) +_triggered = Event() + def _input_callback(fdref, flags, info): """Callback to fire when there's input to be read""" + _triggered.set() CFFileDescriptorInvalidate(fdref) CFRelease(fdref) NSApp = _NSApp() @@ -111,6 +115,7 @@ def _stop_on_read(fd): """Register callback to stop eventloop when there's data on fd""" + _triggered.clear() fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None) CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack) source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0) @@ -130,4 +135,9 @@ return _stop_on_read(context.fileno()) msg(NSApp, n('run')) - + if not _triggered.is_set(): + # app closed without firing callback, + # probably due to last window being closed. + # Run the loop manually in this case, + # since there may be events still to process (#9734) + CoreFoundation.CFRunLoopRun()
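Stripped of the ctypes and Cocoa plumbing, the control flow this patch adds reduces to: record whether the stdin callback ever fired, and if `NSApp run` returned without it (which is what happens when closing the last window stops the app on its own), keep pumping a bare CoreFoundation run loop so the pending window-teardown events are still processed. A schematic of that guard, with the two native loop entry points replaced by plain callables for illustration:

```python
from threading import Event

_triggered = Event()

def _input_callback():
    # In the real hook this fires when stdin has data, then stops NSApp.
    _triggered.set()

def inputhook(run_app_loop, run_cf_loop):
    _triggered.clear()
    run_app_loop()      # msg(NSApp, n('run')) in osx.py
    if not _triggered.is_set():
        # NSApp stopped for some other reason (e.g. the last window closed),
        # so fall back to a plain run loop until stdin wakes us up.
        run_cf_loop()   # CoreFoundation.CFRunLoopRun() in osx.py

# An app loop that returns without the callback firing takes the fallback:
inputhook(run_app_loop=lambda: None,
          run_cf_loop=lambda: print("fallback CFRunLoopRun"))
```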
{"golden_diff": "diff --git a/IPython/terminal/pt_inputhooks/osx.py b/IPython/terminal/pt_inputhooks/osx.py\n--- a/IPython/terminal/pt_inputhooks/osx.py\n+++ b/IPython/terminal/pt_inputhooks/osx.py\n@@ -7,6 +7,7 @@\n \n import ctypes\n import ctypes.util\n+from threading import Event\n \n objc = ctypes.cdll.LoadLibrary(ctypes.util.find_library('objc'))\n \n@@ -97,8 +98,11 @@\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n \n \n+_triggered = Event()\n+\n def _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n+ _triggered.set()\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n@@ -111,6 +115,7 @@\n \n def _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n+ _triggered.clear()\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n@@ -130,4 +135,9 @@\n return\n _stop_on_read(context.fileno())\n msg(NSApp, n('run'))\n-\n+ if not _triggered.is_set():\n+ # app closed without firing callback,\n+ # probably due to last window being closed.\n+ # Run the loop manually in this case,\n+ # since there may be events still to process (#9734)\n+ CoreFoundation.CFRunLoopRun()\n", "issue": "Cannot close last figure window\nAfter updating ipython to 5.0 (from macports), I cannot close last figure window (neither with button in window bar, nor with 'close()'). The mouse cursor becomes spinning wheel inside the window, but I can still type command in the terminal window.\n\nHowever, if I type clf(), the figure window closes and a new one appears.\n\nThis happens only when there is only one figure window remaining. \n\nHere's my system (OS X El Capitan, with python/ipython etc. 
installed via macports) :\n\n```\n[~] $ python -c \"import IPython; print(IPython.sys_info())\"\n{'commit_hash': u'<not found>',\n 'commit_source': '(none found)',\n 'default_encoding': 'UTF-8',\n 'ipython_path': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython',\n 'ipython_version': '5.0.0',\n 'os_name': 'posix',\n 'platform': 'Darwin-15.5.0-x86_64-i386-64bit',\n 'sys_executable': '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python',\n 'sys_platform': 'darwin',\n 'sys_version': '2.7.12 (default, Jun 29 2016, 12:52:38) \\n[GCC 4.2.1 Compatible Apple LLVM 7.0.2 (clang-700.1.81)]'}\n```\n\n", "before_files": [{"content": "\"\"\"Inputhook for OS X\n\nCalls NSApp / CoreFoundation APIs via ctypes.\n\"\"\"\n\n# obj-c boilerplate from appnope, used under BSD 2-clause\n\nimport ctypes\nimport ctypes.util\n\nobjc = ctypes.cdll.LoadLibrary(ctypes.util.find_library('objc'))\n\nvoid_p = ctypes.c_void_p\n\nobjc.objc_getClass.restype = void_p\nobjc.sel_registerName.restype = void_p\nobjc.objc_msgSend.restype = void_p\nobjc.objc_msgSend.argtypes = [void_p, void_p]\n\nmsg = objc.objc_msgSend\n\ndef _utf8(s):\n \"\"\"ensure utf8 bytes\"\"\"\n if not isinstance(s, bytes):\n s = s.encode('utf8')\n return s\n\ndef n(name):\n \"\"\"create a selector name (for ObjC methods)\"\"\"\n return objc.sel_registerName(_utf8(name))\n\ndef C(classname):\n \"\"\"get an ObjC Class by name\"\"\"\n return objc.objc_getClass(_utf8(classname))\n\n# end obj-c boilerplate from appnope\n\n# CoreFoundation C-API calls we will use:\nCoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library('CoreFoundation'))\n\nCFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate\nCFFileDescriptorCreate.restype = void_p\nCFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p]\n\nCFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor\nCFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int\nCFFileDescriptorGetNativeDescriptor.argtypes = [void_p]\n\nCFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks\nCFFileDescriptorEnableCallBacks.restype = None\nCFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong]\n\nCFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource\nCFFileDescriptorCreateRunLoopSource.restype = void_p\nCFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p]\n\nCFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent\nCFRunLoopGetCurrent.restype = void_p\n\nCFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource\nCFRunLoopAddSource.restype = None\nCFRunLoopAddSource.argtypes = [void_p, void_p, void_p]\n\nCFRelease = CoreFoundation.CFRelease\nCFRelease.restype = None\nCFRelease.argtypes = [void_p]\n\nCFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate\nCFFileDescriptorInvalidate.restype = None\nCFFileDescriptorInvalidate.argtypes = [void_p]\n\n# From CFFileDescriptor.h\nkCFFileDescriptorReadCallBack = 1\nkCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes')\n\n\ndef _NSApp():\n \"\"\"Return the global NSApplication instance (NSApp)\"\"\"\n return msg(C('NSApplication'), n('sharedApplication'))\n\n\ndef _wake(NSApp):\n \"\"\"Wake the Application\"\"\"\n event = msg(C('NSEvent'),\n n('otherEventWithType:location:modifierFlags:'\n 'timestamp:windowNumber:context:subtype:data1:data2:'),\n 15, # Type\n 0, # location\n 0, # flags\n 0, # 
timestamp\n 0, # window\n None, # context\n 0, # subtype\n 0, # data1\n 0, # data2\n )\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n\n\ndef _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n msg(NSApp, n('stop:'), NSApp)\n _wake(NSApp)\n\n_c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p)\n_c_input_callback = _c_callback_func_type(_input_callback)\n\n\ndef _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n loop = CFRunLoopGetCurrent()\n CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes)\n CFRelease(source)\n\n\ndef inputhook(context):\n \"\"\"Inputhook for Cocoa (NSApp)\"\"\"\n NSApp = _NSApp()\n window_count = msg(\n msg(NSApp, n('windows')),\n n('count')\n )\n if not window_count:\n return\n _stop_on_read(context.fileno())\n msg(NSApp, n('run'))\n\n", "path": "IPython/terminal/pt_inputhooks/osx.py"}], "after_files": [{"content": "\"\"\"Inputhook for OS X\n\nCalls NSApp / CoreFoundation APIs via ctypes.\n\"\"\"\n\n# obj-c boilerplate from appnope, used under BSD 2-clause\n\nimport ctypes\nimport ctypes.util\nfrom threading import Event\n\nobjc = ctypes.cdll.LoadLibrary(ctypes.util.find_library('objc'))\n\nvoid_p = ctypes.c_void_p\n\nobjc.objc_getClass.restype = void_p\nobjc.sel_registerName.restype = void_p\nobjc.objc_msgSend.restype = void_p\nobjc.objc_msgSend.argtypes = [void_p, void_p]\n\nmsg = objc.objc_msgSend\n\ndef _utf8(s):\n \"\"\"ensure utf8 bytes\"\"\"\n if not isinstance(s, bytes):\n s = s.encode('utf8')\n return s\n\ndef n(name):\n \"\"\"create a selector name (for ObjC methods)\"\"\"\n return objc.sel_registerName(_utf8(name))\n\ndef C(classname):\n \"\"\"get an ObjC Class by name\"\"\"\n return objc.objc_getClass(_utf8(classname))\n\n# end obj-c boilerplate from appnope\n\n# CoreFoundation C-API calls we will use:\nCoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library('CoreFoundation'))\n\nCFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate\nCFFileDescriptorCreate.restype = void_p\nCFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p]\n\nCFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor\nCFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int\nCFFileDescriptorGetNativeDescriptor.argtypes = [void_p]\n\nCFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks\nCFFileDescriptorEnableCallBacks.restype = None\nCFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong]\n\nCFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource\nCFFileDescriptorCreateRunLoopSource.restype = void_p\nCFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p]\n\nCFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent\nCFRunLoopGetCurrent.restype = void_p\n\nCFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource\nCFRunLoopAddSource.restype = None\nCFRunLoopAddSource.argtypes = [void_p, void_p, void_p]\n\nCFRelease = CoreFoundation.CFRelease\nCFRelease.restype = None\nCFRelease.argtypes = [void_p]\n\nCFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate\nCFFileDescriptorInvalidate.restype = 
None\nCFFileDescriptorInvalidate.argtypes = [void_p]\n\n# From CFFileDescriptor.h\nkCFFileDescriptorReadCallBack = 1\nkCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes')\n\n\ndef _NSApp():\n \"\"\"Return the global NSApplication instance (NSApp)\"\"\"\n return msg(C('NSApplication'), n('sharedApplication'))\n\n\ndef _wake(NSApp):\n \"\"\"Wake the Application\"\"\"\n event = msg(C('NSEvent'),\n n('otherEventWithType:location:modifierFlags:'\n 'timestamp:windowNumber:context:subtype:data1:data2:'),\n 15, # Type\n 0, # location\n 0, # flags\n 0, # timestamp\n 0, # window\n None, # context\n 0, # subtype\n 0, # data1\n 0, # data2\n )\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n\n\n_triggered = Event()\n\ndef _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n _triggered.set()\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n msg(NSApp, n('stop:'), NSApp)\n _wake(NSApp)\n\n_c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p)\n_c_input_callback = _c_callback_func_type(_input_callback)\n\n\ndef _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n _triggered.clear()\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n loop = CFRunLoopGetCurrent()\n CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes)\n CFRelease(source)\n\n\ndef inputhook(context):\n \"\"\"Inputhook for Cocoa (NSApp)\"\"\"\n NSApp = _NSApp()\n window_count = msg(\n msg(NSApp, n('windows')),\n n('count')\n )\n if not window_count:\n return\n _stop_on_read(context.fileno())\n msg(NSApp, n('run'))\n if not _triggered.is_set():\n # app closed without firing callback,\n # probably due to last window being closed.\n # Run the loop manually in this case,\n # since there may be events still to process (#9734)\n CoreFoundation.CFRunLoopRun()\n", "path": "IPython/terminal/pt_inputhooks/osx.py"}]}
1,973
393
gh_patches_debug_41219
rasdani/github-patches
git_diff
beeware__toga-863
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Progressbar crash on start ## Expected Behavior <!--- If you're describing a bug, tell us what you expect to happen. --> Progress bar to load <!--- If you're requesting a new feature, tell us why you'd like this feature. --> ## Current Behavior <!--- If you're describing a bug, what currently happens? --> ``` python -m progressbar Unhandled Exception: Python.Runtime.PythonException: TypeError : 'NoneType' value cannot be converted to System.Int32 at Python.Runtime.Dispatcher.Dispatch(ArrayList args) at __System_Threading_ThreadStartDispatcher.Invoke() at System.Threading.ThreadHelper.ThreadStart_Context(Object state) at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Threading.ThreadHelper.ThreadStart() ``` ## Steps to reproduce <!--- Provide a set of steps describing how to reproduce this bug. If you have a live example, provide the link below --> 1. launch progressbar ## Your Environment <!--- Provide details on your current environment you found the bug in --> * Python Version (list the specific version number) 3.6.5 * Operating System and Version (select from the following and list the specific version number; if your OS is not listed, list that as well) - [ ] macOS - version: - [ ] Linux - distro: - version: - [ * ] Windows - version: Win10 1607 - [ ] Other - name: - version: * Toga Target (the type of app you are trying to generate) - [ ] android - [ ] cocoa - [ ] django - [ ] gtk - [ ] iOS - [ ] tvOS - [ ] watchOS - [ * ] winforms - [ ] win32 - [ ] Other (please specify) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `src/winforms/toga_winforms/widgets/progressbar.py` Content: ``` 1 from travertino.size import at_least 2 3 from toga_winforms.libs import WinForms 4 5 from .base import Widget 6 7 8 class ProgressBar(Widget): 9 def create(self): 10 self.native = WinForms.ProgressBar() 11 12 def start(self): 13 '''Not supported for WinForms implementation''' 14 self.interface.factory.not_implemented('ProgressBar.start()') 15 16 # possible implementation (not tested): 17 # self.native.Style = ProgressBarStyle.Marquee 18 19 def stop(self): 20 '''Not supported for WinForms implementation''' 21 self.interface.factory.not_implemented('ProgressBar.stop()') 22 23 # possible implementation (not tested): 24 # self.native.Style = ProgressBarStyle.Continuous 25 26 def set_max(self, value): 27 self.native.Maximum = value 28 29 def set_value(self, value): 30 self.native.Value = value 31 32 def rehint(self): 33 # Height must be non-zero 34 # Set a sensible min-width 35 self.interface.intrinsic.width = at_least(self.interface.MIN_WIDTH) 36 self.interface.intrinsic.height = self.native.PreferredSize.Height 37 ``` Path: `examples/progressbar/progressbar/app.py` Content: ``` 1 import toga 2 from toga.style import Pack 3 from toga.constants import ROW, COLUMN 4 5 6 class ProgressBarApp(toga.App): 7 8 def startup(self): 9 # Main window of the application with title and size 10 self.main_window = toga.MainWindow(title=self.name, size=(500, 500)) 11 12 # the user may change the value with +/- buttons 13 self.progress_adder = toga.ProgressBar() 14 15 # the user may switch between "running" mode and a set value 16 self.progress_runner = toga.ProgressBar(max=None) 17 18 # set up common styles 19 label_style = Pack(flex=1, padding_right=24) 20 row_box_style = Pack(direction=ROW, padding=24) 21 col_box_style = Pack(direction=COLUMN, padding=24) 22 23 # Add the content on the main window 24 self.main_window.content = toga.Box(style=col_box_style, children=[ 25 toga.Box(style=col_box_style, children=[ 26 toga.Label("Use the +/- buttons to change the progress", 27 style=label_style), 28 29 self.progress_adder, 30 31 toga.Box(children=[ 32 toga.Button("+", on_press=self.increase_progress, 33 style=Pack(flex=1)), 34 toga.Button("-", on_press=self.decrease_progress, 35 style=Pack(flex=1)), 36 ]), 37 38 toga.Switch("Toggle running mode", on_toggle=self.toggle_running) 39 ]), 40 41 toga.Box(style=row_box_style, children=[ 42 toga.Label("default ProgressBar", style=label_style), 43 toga.ProgressBar(), 44 ]), 45 46 toga.Box(style=row_box_style, children=[ 47 toga.Label("disabled ProgressBar", style=label_style), 48 toga.ProgressBar(max=None, running=False), 49 ]), 50 51 toga.Box(style=row_box_style, children=[ 52 toga.Label("indeterminate ProgressBar", style=label_style), 53 toga.ProgressBar(max=None, running=True), 54 ]), 55 56 toga.Box(style=row_box_style, children=[ 57 toga.Label("determinate ProgressBar", style=label_style), 58 toga.ProgressBar(max=1, running=False, value=0.5), 59 ]), 60 61 toga.Box(style=row_box_style, children=[ 62 toga.Label("running determinate ProgressBar", style=label_style), 63 toga.ProgressBar(max=1, running=True, value=0.5), 64 ]), 65 ]) 66 67 self.main_window.show() 68 69 def increase_progress(self, button, **kw): 70 if not self.progress_adder.is_running: 71 self.progress_adder.value += 0.1 * self.progress_adder.max 72 73 def decrease_progress(self, button, **kw): 74 if not self.progress_adder.is_running: 75 self.progress_adder.value -= 0.1 * self.progress_adder.max 76 77 def 
toggle_running(self, switch, **kw): 78 if switch.is_on: 79 self.progress_adder.max = None 80 self.progress_adder.start() 81 else: 82 self.progress_adder.max = 1 83 self.progress_adder.stop() 84 85 86 def main(): 87 # App name and namespace 88 return ProgressBarApp('ProgressBar', 'org.beeware.examples.progressbar') 89 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/examples/progressbar/progressbar/app.py b/examples/progressbar/progressbar/app.py --- a/examples/progressbar/progressbar/app.py +++ b/examples/progressbar/progressbar/app.py @@ -2,6 +2,8 @@ from toga.style import Pack from toga.constants import ROW, COLUMN +MAX_PROGRESSBAR_VALUE = 100 + class ProgressBarApp(toga.App): @@ -10,10 +12,7 @@ self.main_window = toga.MainWindow(title=self.name, size=(500, 500)) # the user may change the value with +/- buttons - self.progress_adder = toga.ProgressBar() - - # the user may switch between "running" mode and a set value - self.progress_runner = toga.ProgressBar(max=None) + self.progress_adder = toga.ProgressBar(max=MAX_PROGRESSBAR_VALUE) # set up common styles label_style = Pack(flex=1, padding_right=24) @@ -55,12 +54,18 @@ toga.Box(style=row_box_style, children=[ toga.Label("determinate ProgressBar", style=label_style), - toga.ProgressBar(max=1, running=False, value=0.5), + toga.ProgressBar( + max=MAX_PROGRESSBAR_VALUE, + running=False, + value=0.5 * MAX_PROGRESSBAR_VALUE), ]), toga.Box(style=row_box_style, children=[ toga.Label("running determinate ProgressBar", style=label_style), - toga.ProgressBar(max=1, running=True, value=0.5), + toga.ProgressBar( + max=MAX_PROGRESSBAR_VALUE, + running=True, + value=0.5 * MAX_PROGRESSBAR_VALUE), ]), ]) @@ -79,7 +84,7 @@ self.progress_adder.max = None self.progress_adder.start() else: - self.progress_adder.max = 1 + self.progress_adder.max = MAX_PROGRESSBAR_VALUE self.progress_adder.stop() diff --git a/src/winforms/toga_winforms/widgets/progressbar.py b/src/winforms/toga_winforms/widgets/progressbar.py --- a/src/winforms/toga_winforms/widgets/progressbar.py +++ b/src/winforms/toga_winforms/widgets/progressbar.py @@ -10,21 +10,31 @@ self.native = WinForms.ProgressBar() def start(self): - '''Not supported for WinForms implementation''' - self.interface.factory.not_implemented('ProgressBar.start()') - - # possible implementation (not tested): - # self.native.Style = ProgressBarStyle.Marquee + self.set_running_style() def stop(self): - '''Not supported for WinForms implementation''' - self.interface.factory.not_implemented('ProgressBar.stop()') + self.set_stopping_style() - # possible implementation (not tested): - # self.native.Style = ProgressBarStyle.Continuous + @property + def max(self): + return self.interface.max def set_max(self, value): - self.native.Maximum = value + if value is not None: + self.native.Maximum = value + if self.interface.is_running: + self.set_running_style() + else: + self.set_stopping_style() + + def set_running_style(self): + if self.max is None: + self.native.Style = WinForms.ProgressBarStyle.Marquee + else: + self.native.Style = WinForms.ProgressBarStyle.Blocks + + def set_stopping_style(self): + self.native.Style = WinForms.ProgressBarStyle.Continuous def set_value(self, value): self.native.Value = value
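The traceback in the issue is the unguarded `self.native.Maximum = value` receiving `None`: toga uses `max=None` to mean an indeterminate bar, but the underlying .NET property only accepts an `Int32`. The patch forwards only concrete maxima and expresses `max=None` through the Marquee style instead. A pure-Python stand-in (class and helper invented here for illustration) that reproduces both the failure mode and the guard:

```python
class FakeNativeBar:
    """Mimics WinForms.ProgressBar: Maximum must be an int."""

    def __init__(self):
        self._maximum = 100
        self.Style = "Blocks"

    @property
    def Maximum(self):
        return self._maximum

    @Maximum.setter
    def Maximum(self, value):
        if not isinstance(value, int):
            raise TypeError("'%s' value cannot be converted to System.Int32"
                            % type(value).__name__)
        self._maximum = value

def set_max(native, value, running):
    if value is not None:       # pre-patch this assignment was unconditional
        native.Maximum = value
    if running:
        native.Style = "Marquee" if value is None else "Blocks"
    else:
        native.Style = "Continuous"

bar = FakeNativeBar()
set_max(bar, None, running=True)   # no crash now; bar.Style == "Marquee"
set_max(bar, 100, running=False)   # bar.Maximum == 100, bar.Style == "Continuous"
```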
{"golden_diff": "diff --git a/examples/progressbar/progressbar/app.py b/examples/progressbar/progressbar/app.py\n--- a/examples/progressbar/progressbar/app.py\n+++ b/examples/progressbar/progressbar/app.py\n@@ -2,6 +2,8 @@\n from toga.style import Pack\n from toga.constants import ROW, COLUMN\n \n+MAX_PROGRESSBAR_VALUE = 100\n+\n \n class ProgressBarApp(toga.App):\n \n@@ -10,10 +12,7 @@\n self.main_window = toga.MainWindow(title=self.name, size=(500, 500))\n \n # the user may change the value with +/- buttons\n- self.progress_adder = toga.ProgressBar()\n-\n- # the user may switch between \"running\" mode and a set value\n- self.progress_runner = toga.ProgressBar(max=None)\n+ self.progress_adder = toga.ProgressBar(max=MAX_PROGRESSBAR_VALUE)\n \n # set up common styles\n label_style = Pack(flex=1, padding_right=24)\n@@ -55,12 +54,18 @@\n \n toga.Box(style=row_box_style, children=[\n toga.Label(\"determinate ProgressBar\", style=label_style),\n- toga.ProgressBar(max=1, running=False, value=0.5),\n+ toga.ProgressBar(\n+ max=MAX_PROGRESSBAR_VALUE,\n+ running=False,\n+ value=0.5 * MAX_PROGRESSBAR_VALUE),\n ]),\n \n toga.Box(style=row_box_style, children=[\n toga.Label(\"running determinate ProgressBar\", style=label_style),\n- toga.ProgressBar(max=1, running=True, value=0.5),\n+ toga.ProgressBar(\n+ max=MAX_PROGRESSBAR_VALUE,\n+ running=True,\n+ value=0.5 * MAX_PROGRESSBAR_VALUE),\n ]),\n ])\n \n@@ -79,7 +84,7 @@\n self.progress_adder.max = None\n self.progress_adder.start()\n else:\n- self.progress_adder.max = 1\n+ self.progress_adder.max = MAX_PROGRESSBAR_VALUE\n self.progress_adder.stop()\n \n \ndiff --git a/src/winforms/toga_winforms/widgets/progressbar.py b/src/winforms/toga_winforms/widgets/progressbar.py\n--- a/src/winforms/toga_winforms/widgets/progressbar.py\n+++ b/src/winforms/toga_winforms/widgets/progressbar.py\n@@ -10,21 +10,31 @@\n self.native = WinForms.ProgressBar()\n \n def start(self):\n- '''Not supported for WinForms implementation'''\n- self.interface.factory.not_implemented('ProgressBar.start()')\n-\n- # possible implementation (not tested):\n- # self.native.Style = ProgressBarStyle.Marquee\n+ self.set_running_style()\n \n def stop(self):\n- '''Not supported for WinForms implementation'''\n- self.interface.factory.not_implemented('ProgressBar.stop()')\n+ self.set_stopping_style()\n \n- # possible implementation (not tested):\n- # self.native.Style = ProgressBarStyle.Continuous\n+ @property\n+ def max(self):\n+ return self.interface.max\n \n def set_max(self, value):\n- self.native.Maximum = value\n+ if value is not None:\n+ self.native.Maximum = value\n+ if self.interface.is_running:\n+ self.set_running_style()\n+ else:\n+ self.set_stopping_style()\n+\n+ def set_running_style(self):\n+ if self.max is None:\n+ self.native.Style = WinForms.ProgressBarStyle.Marquee\n+ else:\n+ self.native.Style = WinForms.ProgressBarStyle.Blocks\n+\n+ def set_stopping_style(self):\n+ self.native.Style = WinForms.ProgressBarStyle.Continuous\n \n def set_value(self, value):\n self.native.Value = value\n", "issue": "Progressbar crash on start\n## Expected Behavior\r\n<!--- If you're describing a bug, tell us what you expect to happen. -->\r\nProgress bar to load\r\n<!--- If you're requesting a new feature, tell us why you'd like this feature. -->\r\n\r\n\r\n## Current Behavior\r\n<!--- If you're describing a bug, what currently happens? 
-->\r\n```\r\npython -m progressbar\r\nUnhandled Exception: Python.Runtime.PythonException: TypeError : 'NoneType' value cannot be converted to System.Int32\r\n at Python.Runtime.Dispatcher.Dispatch(ArrayList args)\r\n at __System_Threading_ThreadStartDispatcher.Invoke()\r\n at System.Threading.ThreadHelper.ThreadStart_Context(Object state)\r\n at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)\r\n at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)\r\n at System.Threading.ThreadHelper.ThreadStart()\r\n```\r\n## Steps to reproduce\r\n<!--- Provide a set of steps describing how to reproduce this bug. If you have a live example, provide the link below -->\r\n1. launch progressbar\r\n\r\n## Your Environment\r\n<!--- Provide details on your current environment you found the bug in -->\r\n\r\n* Python Version (list the specific version number) 3.6.5\r\n\r\n* Operating System and Version (select from the following and list the specific version number; if your OS is not listed, list that as well)\r\n\r\n - [ ] macOS - version: \r\n - [ ] Linux - distro: - version:\r\n - [ * ] Windows - version: Win10 1607\r\n - [ ] Other - name: - version:\r\n\r\n* Toga Target (the type of app you are trying to generate)\r\n \r\n - [ ] android\r\n - [ ] cocoa\r\n - [ ] django \r\n - [ ] gtk\r\n - [ ] iOS\r\n - [ ] tvOS\r\n - [ ] watchOS\r\n - [ * ] winforms \r\n - [ ] win32\r\n - [ ] Other (please specify)\r\n\n", "before_files": [{"content": "from travertino.size import at_least\n\nfrom toga_winforms.libs import WinForms\n\nfrom .base import Widget\n\n\nclass ProgressBar(Widget):\n def create(self):\n self.native = WinForms.ProgressBar()\n\n def start(self):\n '''Not supported for WinForms implementation'''\n self.interface.factory.not_implemented('ProgressBar.start()')\n\n # possible implementation (not tested):\n # self.native.Style = ProgressBarStyle.Marquee\n\n def stop(self):\n '''Not supported for WinForms implementation'''\n self.interface.factory.not_implemented('ProgressBar.stop()')\n\n # possible implementation (not tested):\n # self.native.Style = ProgressBarStyle.Continuous\n\n def set_max(self, value):\n self.native.Maximum = value\n\n def set_value(self, value):\n self.native.Value = value\n\n def rehint(self):\n # Height must be non-zero\n # Set a sensible min-width\n self.interface.intrinsic.width = at_least(self.interface.MIN_WIDTH)\n self.interface.intrinsic.height = self.native.PreferredSize.Height\n", "path": "src/winforms/toga_winforms/widgets/progressbar.py"}, {"content": "import toga\nfrom toga.style import Pack\nfrom toga.constants import ROW, COLUMN\n\n\nclass ProgressBarApp(toga.App):\n\n def startup(self):\n # Main window of the application with title and size\n self.main_window = toga.MainWindow(title=self.name, size=(500, 500))\n\n # the user may change the value with +/- buttons\n self.progress_adder = toga.ProgressBar()\n\n # the user may switch between \"running\" mode and a set value\n self.progress_runner = toga.ProgressBar(max=None)\n\n # set up common styles\n label_style = Pack(flex=1, padding_right=24)\n row_box_style = Pack(direction=ROW, padding=24)\n col_box_style = Pack(direction=COLUMN, padding=24)\n\n # Add the content on the main window\n self.main_window.content = 
toga.Box(style=col_box_style, children=[\n toga.Box(style=col_box_style, children=[\n toga.Label(\"Use the +/- buttons to change the progress\",\n style=label_style),\n\n self.progress_adder,\n\n toga.Box(children=[\n toga.Button(\"+\", on_press=self.increase_progress,\n style=Pack(flex=1)),\n toga.Button(\"-\", on_press=self.decrease_progress,\n style=Pack(flex=1)),\n ]),\n\n toga.Switch(\"Toggle running mode\", on_toggle=self.toggle_running)\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"default ProgressBar\", style=label_style),\n toga.ProgressBar(),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"disabled ProgressBar\", style=label_style),\n toga.ProgressBar(max=None, running=False),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"indeterminate ProgressBar\", style=label_style),\n toga.ProgressBar(max=None, running=True),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"determinate ProgressBar\", style=label_style),\n toga.ProgressBar(max=1, running=False, value=0.5),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"running determinate ProgressBar\", style=label_style),\n toga.ProgressBar(max=1, running=True, value=0.5),\n ]),\n ])\n\n self.main_window.show()\n\n def increase_progress(self, button, **kw):\n if not self.progress_adder.is_running:\n self.progress_adder.value += 0.1 * self.progress_adder.max\n\n def decrease_progress(self, button, **kw):\n if not self.progress_adder.is_running:\n self.progress_adder.value -= 0.1 * self.progress_adder.max\n\n def toggle_running(self, switch, **kw):\n if switch.is_on:\n self.progress_adder.max = None\n self.progress_adder.start()\n else:\n self.progress_adder.max = 1\n self.progress_adder.stop()\n\n\ndef main():\n # App name and namespace\n return ProgressBarApp('ProgressBar', 'org.beeware.examples.progressbar')\n", "path": "examples/progressbar/progressbar/app.py"}], "after_files": [{"content": "from travertino.size import at_least\n\nfrom toga_winforms.libs import WinForms\n\nfrom .base import Widget\n\n\nclass ProgressBar(Widget):\n def create(self):\n self.native = WinForms.ProgressBar()\n\n def start(self):\n self.set_running_style()\n\n def stop(self):\n self.set_stopping_style()\n\n @property\n def max(self):\n return self.interface.max\n\n def set_max(self, value):\n if value is not None:\n self.native.Maximum = value\n if self.interface.is_running:\n self.set_running_style()\n else:\n self.set_stopping_style()\n\n def set_running_style(self):\n if self.max is None:\n self.native.Style = WinForms.ProgressBarStyle.Marquee\n else:\n self.native.Style = WinForms.ProgressBarStyle.Blocks\n\n def set_stopping_style(self):\n self.native.Style = WinForms.ProgressBarStyle.Continuous\n\n def set_value(self, value):\n self.native.Value = value\n\n def rehint(self):\n # Height must be non-zero\n # Set a sensible min-width\n self.interface.intrinsic.width = at_least(self.interface.MIN_WIDTH)\n self.interface.intrinsic.height = self.native.PreferredSize.Height\n", "path": "src/winforms/toga_winforms/widgets/progressbar.py"}, {"content": "import toga\nfrom toga.style import Pack\nfrom toga.constants import ROW, COLUMN\n\nMAX_PROGRESSBAR_VALUE = 100\n\n\nclass ProgressBarApp(toga.App):\n\n def startup(self):\n # Main window of the application with title and size\n self.main_window = toga.MainWindow(title=self.name, size=(500, 500))\n\n # the user may change the value with +/- buttons\n self.progress_adder = toga.ProgressBar(max=MAX_PROGRESSBAR_VALUE)\n\n # set 
up common styles\n label_style = Pack(flex=1, padding_right=24)\n row_box_style = Pack(direction=ROW, padding=24)\n col_box_style = Pack(direction=COLUMN, padding=24)\n\n # Add the content on the main window\n self.main_window.content = toga.Box(style=col_box_style, children=[\n toga.Box(style=col_box_style, children=[\n toga.Label(\"Use the +/- buttons to change the progress\",\n style=label_style),\n\n self.progress_adder,\n\n toga.Box(children=[\n toga.Button(\"+\", on_press=self.increase_progress,\n style=Pack(flex=1)),\n toga.Button(\"-\", on_press=self.decrease_progress,\n style=Pack(flex=1)),\n ]),\n\n toga.Switch(\"Toggle running mode\", on_toggle=self.toggle_running)\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"default ProgressBar\", style=label_style),\n toga.ProgressBar(),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"disabled ProgressBar\", style=label_style),\n toga.ProgressBar(max=None, running=False),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"indeterminate ProgressBar\", style=label_style),\n toga.ProgressBar(max=None, running=True),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"determinate ProgressBar\", style=label_style),\n toga.ProgressBar(\n max=MAX_PROGRESSBAR_VALUE,\n running=False,\n value=0.5 * MAX_PROGRESSBAR_VALUE),\n ]),\n\n toga.Box(style=row_box_style, children=[\n toga.Label(\"running determinate ProgressBar\", style=label_style),\n toga.ProgressBar(\n max=MAX_PROGRESSBAR_VALUE,\n running=True,\n value=0.5 * MAX_PROGRESSBAR_VALUE),\n ]),\n ])\n\n self.main_window.show()\n\n def increase_progress(self, button, **kw):\n if not self.progress_adder.is_running:\n self.progress_adder.value += 0.1 * self.progress_adder.max\n\n def decrease_progress(self, button, **kw):\n if not self.progress_adder.is_running:\n self.progress_adder.value -= 0.1 * self.progress_adder.max\n\n def toggle_running(self, switch, **kw):\n if switch.is_on:\n self.progress_adder.max = None\n self.progress_adder.start()\n else:\n self.progress_adder.max = MAX_PROGRESSBAR_VALUE\n self.progress_adder.stop()\n\n\ndef main():\n # App name and namespace\n return ProgressBarApp('ProgressBar', 'org.beeware.examples.progressbar')\n", "path": "examples/progressbar/progressbar/app.py"}]}
1,876
814
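The record above patches `set_max` in the WinForms backend so that an indeterminate bar (`max=None`) switches ProgressBar styles instead of assigning `None` to the Int32 `Maximum` property. The control flow is easy to lose inside the diff, so here it is as a runnable sketch; `WinFormsStyle` and `FakeNativeBar` are plain-Python stand-ins for the real pythonnet objects, not toga's actual classes:

```python
class WinFormsStyle:
    MARQUEE = "Marquee"        # indeterminate animation
    BLOCKS = "Blocks"          # determinate, running
    CONTINUOUS = "Continuous"  # determinate, stopped


class FakeNativeBar:
    def __init__(self):
        self.Maximum = 100
        self.Style = WinFormsStyle.CONTINUOUS


def set_max(native, value, is_running):
    # Only assign Maximum when a concrete int exists; None crashed the old code.
    if value is not None:
        native.Maximum = value
    # The indeterminate case is encoded in the Style, not in Maximum.
    if is_running and value is None:
        native.Style = WinFormsStyle.MARQUEE
    elif is_running:
        native.Style = WinFormsStyle.BLOCKS
    else:
        native.Style = WinFormsStyle.CONTINUOUS


bar = FakeNativeBar()
set_max(bar, None, is_running=True)   # exactly the call that used to crash
assert bar.Style == WinFormsStyle.MARQUEE and bar.Maximum == 100
set_max(bar, 100, is_running=False)
assert bar.Style == WinFormsStyle.CONTINUOUS and bar.Maximum == 100
```

The first call is the case that raised the `TypeError` before the patch: `None` never reaches `Maximum`, and the marquee style carries the indeterminate state instead.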
gh_patches_debug_25048
rasdani/github-patches
git_diff
mosaicml__composer-1493
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- HuggingFace model should update word embeddings dimension according to tokenizer ** To reproduce Steps to reproduce the behavior: Right now if I call ``` model = transformers.AutoModelForCausalLM.from_pretrained('bigscience/bloom-560m') tokenizer = transformers.AutoTokenizer.from_pretrained('gpt2') model = HuggingFaceModel(model=model, tokenizer=tokenizer) ``` The bloom model will have a [250880, 1024] dim word embedding matrix and a vocab size of 250880 even though a gpt2 tokenizer with vocab size of 50257 is used. ## Expected behavior The class HuggingFaceModel needs to update word embedding matrix if a tokenizer is supplied. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `composer/models/huggingface.py` Content: ``` 1 # Copyright 2022 MosaicML Composer authors 2 # SPDX-License-Identifier: Apache-2.0 3 4 """A wrapper class that converts 🤗 Transformers models to composer models""" 5 6 from __future__ import annotations 7 8 from typing import TYPE_CHECKING, Any, Dict, List, Optional, Union 9 10 from torchmetrics import Metric 11 12 from composer.models.base import ComposerModel 13 from composer.utils.import_helpers import MissingConditionalImportError 14 15 if TYPE_CHECKING: 16 import transformers 17 18 __all__ = ['HuggingFaceModel'] 19 20 21 class HuggingFaceModel(ComposerModel): 22 """ 23 A wrapper class that converts 🤗 Transformers models to composer models. 24 25 Args: 26 model (transformers.PreTrainedModel): A 🤗 Transformers model. 27 tokenizer (transformers.PreTrainedTokenizer): Tokenizer used to prepare the dataset and validate model inputs during training. Default ``None``. 28 use_logits (bool, optional): If True, the model's output logits will be used to calculate validation metrics. Else, metrics will be inferred from the HuggingFaceModel directly. Default: ``False`` 29 metrics (list[Metric], optional): list of torchmetrics to apply to the output of `validate`. Default: ``None``. 30 .. warning:: This wrapper is designed to work with 🤗 datasets that define a `labels` column. 31 32 Example: 33 34 .. 
testcode:: 35 36 import transformers 37 from composer.models import HuggingFaceModel 38 39 hf_model = transformers.AutoModelForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2) 40 model = HuggingFaceModel(hf_model) 41 """ 42 43 def __init__(self, 44 model: transformers.PreTrainedModel, 45 tokenizer: Optional[Union[transformers.PreTrainedTokenizer, 46 transformers.PreTrainedTokenizerFast]] = None, 47 use_logits: Optional[bool] = False, 48 metrics: Optional[List[Metric]] = None) -> None: 49 try: 50 import transformers 51 except ImportError as e: 52 raise MissingConditionalImportError(extra_deps_group='nlp', conda_package='transformers') from e 53 54 super().__init__() 55 self.model = model 56 self.config = model.config 57 58 # the set of inputs that a model expects inferred from the model type or 59 # tokenizer if provided 60 if tokenizer is None: 61 if isinstance(self.model.base_model, transformers.GPT2Model): 62 self.model_inputs = {'input_ids', 'attention_mask'} 63 elif isinstance(self.model.base_model, transformers.BertModel): 64 self.model_inputs = {'input_ids', 'attention_mask', 'token_type_ids'} 65 else: 66 assert tokenizer.model_input_names is not None, 'the tokenizer should have a model input name' 67 self.model_inputs = set(tokenizer.model_input_names) 68 69 self.use_logits = use_logits 70 71 self.train_metrics = None 72 self.val_metrics = None 73 74 if metrics: 75 self.train_metrics = {metric.__class__.__name__: metric for metric in metrics} 76 self.val_metrics = {metric.__class__.__name__: metric for metric in metrics} 77 78 self.labels = None # set in eval_forward() if exists 79 80 def forward(self, batch): 81 for key in self.model_inputs: 82 if key not in batch.keys(): 83 raise ValueError(f'Batch missing key: {key}') 84 85 output = self.model(**batch) # type: ignore (thirdparty) 86 return output 87 88 def loss(self, outputs, batch): 89 return outputs['loss'] 90 91 def eval_forward(self, batch, outputs: Optional[Any] = None): 92 output = outputs if outputs else self.forward(batch) 93 if self.use_logits: 94 self.labels = batch.pop('labels') 95 output = output['logits'] 96 97 # if we are in the single class case, then remove the classes dimension 98 if output.shape[1] == 1: 99 output = output.squeeze(dim=1) 100 101 return output 102 103 def get_metrics(self, is_train: bool = False) -> Dict[str, Metric]: 104 if is_train: 105 metrics = self.train_metrics 106 else: 107 metrics = self.val_metrics 108 109 return metrics if metrics else {} 110 111 def update_metric(self, batch: Any, outputs: Any, metric: Metric) -> None: 112 metric.update(outputs, self.labels) 113 114 def get_model_inputs(self): 115 """Returns a set of inputs that the model expects in the forward pass. 116 If an algorithm wants to interact with the model inputs (for instance, 117 popping the labels for a custom loss fn, or adding attention head masks 118 for head pruning, it must access self.set_model_inputs(). 119 Returns: 120 model_inputs: The set of keys that are expected in the Mapping used to compute the forward pass. 121 """ 122 123 return self.model_inputs 124 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/composer/models/huggingface.py b/composer/models/huggingface.py --- a/composer/models/huggingface.py +++ b/composer/models/huggingface.py @@ -5,6 +5,7 @@ from __future__ import annotations +import logging from typing import TYPE_CHECKING, Any, Dict, List, Optional, Union from torchmetrics import Metric @@ -15,6 +16,8 @@ if TYPE_CHECKING: import transformers +log = logging.getLogger(__name__) + __all__ = ['HuggingFaceModel'] @@ -66,6 +69,13 @@ assert tokenizer.model_input_names is not None, 'the tokenizer should have a model input name' self.model_inputs = set(tokenizer.model_input_names) + if self.config.vocab_size != len(tokenizer): + # set model's word embedding matrix and final lm_head to vocab size according to tokenizer + log.warning( + f'The number of tokens in the tokenizer and the number of tokens in the model are different.' + f' Resizing the model tokenizer to {len(tokenizer)} from {self.config.vocab_size}.') + self.model.resize_token_embeddings(len(tokenizer)) + self.use_logits = use_logits self.train_metrics = None
{"golden_diff": "diff --git a/composer/models/huggingface.py b/composer/models/huggingface.py\n--- a/composer/models/huggingface.py\n+++ b/composer/models/huggingface.py\n@@ -5,6 +5,7 @@\n \n from __future__ import annotations\n \n+import logging\n from typing import TYPE_CHECKING, Any, Dict, List, Optional, Union\n \n from torchmetrics import Metric\n@@ -15,6 +16,8 @@\n if TYPE_CHECKING:\n import transformers\n \n+log = logging.getLogger(__name__)\n+\n __all__ = ['HuggingFaceModel']\n \n \n@@ -66,6 +69,13 @@\n assert tokenizer.model_input_names is not None, 'the tokenizer should have a model input name'\n self.model_inputs = set(tokenizer.model_input_names)\n \n+ if self.config.vocab_size != len(tokenizer):\n+ # set model's word embedding matrix and final lm_head to vocab size according to tokenizer\n+ log.warning(\n+ f'The number of tokens in the tokenizer and the number of tokens in the model are different.'\n+ f' Resizing the model tokenizer to {len(tokenizer)} from {self.config.vocab_size}.')\n+ self.model.resize_token_embeddings(len(tokenizer))\n+\n self.use_logits = use_logits\n \n self.train_metrics = None\n", "issue": "HuggingFace model should update word embeddings dimension according to tokenizer\n** To reproduce\r\n\r\nSteps to reproduce the behavior:\r\n\r\nRight now if I call \r\n\r\n```\r\nmodel = transformers.AutoModelForCausalLM.from_pretrained('bigscience/bloom-560m')\r\ntokenizer = transformers.AutoTokenizer.from_pretrained('gpt2')\r\n\r\nmodel = HuggingFaceModel(model=model, tokenizer=tokenizer)\r\n```\r\n\r\nThe bloom model will have a [250880, 1024] dim word embedding matrix and a vocab size of 250880 even though a gpt2 tokenizer with vocab size of 50257 is used.\r\n\r\n## Expected behavior\r\n\r\nThe class HuggingFaceModel needs to update word embedding matrix if a tokenizer is supplied.\r\n\r\n\n", "before_files": [{"content": "# Copyright 2022 MosaicML Composer authors\n# SPDX-License-Identifier: Apache-2.0\n\n\"\"\"A wrapper class that converts \ud83e\udd17 Transformers models to composer models\"\"\"\n\nfrom __future__ import annotations\n\nfrom typing import TYPE_CHECKING, Any, Dict, List, Optional, Union\n\nfrom torchmetrics import Metric\n\nfrom composer.models.base import ComposerModel\nfrom composer.utils.import_helpers import MissingConditionalImportError\n\nif TYPE_CHECKING:\n import transformers\n\n__all__ = ['HuggingFaceModel']\n\n\nclass HuggingFaceModel(ComposerModel):\n \"\"\"\n A wrapper class that converts \ud83e\udd17 Transformers models to composer models.\n\n Args:\n model (transformers.PreTrainedModel): A \ud83e\udd17 Transformers model.\n tokenizer (transformers.PreTrainedTokenizer): Tokenizer used to prepare the dataset and validate model inputs during training. Default ``None``.\n use_logits (bool, optional): If True, the model's output logits will be used to calculate validation metrics. Else, metrics will be inferred from the HuggingFaceModel directly. Default: ``False``\n metrics (list[Metric], optional): list of torchmetrics to apply to the output of `validate`. Default: ``None``.\n .. warning:: This wrapper is designed to work with \ud83e\udd17 datasets that define a `labels` column.\n\n Example:\n\n .. 
testcode::\n\n import transformers\n from composer.models import HuggingFaceModel\n\n hf_model = transformers.AutoModelForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)\n model = HuggingFaceModel(hf_model)\n \"\"\"\n\n def __init__(self,\n model: transformers.PreTrainedModel,\n tokenizer: Optional[Union[transformers.PreTrainedTokenizer,\n transformers.PreTrainedTokenizerFast]] = None,\n use_logits: Optional[bool] = False,\n metrics: Optional[List[Metric]] = None) -> None:\n try:\n import transformers\n except ImportError as e:\n raise MissingConditionalImportError(extra_deps_group='nlp', conda_package='transformers') from e\n\n super().__init__()\n self.model = model\n self.config = model.config\n\n # the set of inputs that a model expects inferred from the model type or\n # tokenizer if provided\n if tokenizer is None:\n if isinstance(self.model.base_model, transformers.GPT2Model):\n self.model_inputs = {'input_ids', 'attention_mask'}\n elif isinstance(self.model.base_model, transformers.BertModel):\n self.model_inputs = {'input_ids', 'attention_mask', 'token_type_ids'}\n else:\n assert tokenizer.model_input_names is not None, 'the tokenizer should have a model input name'\n self.model_inputs = set(tokenizer.model_input_names)\n\n self.use_logits = use_logits\n\n self.train_metrics = None\n self.val_metrics = None\n\n if metrics:\n self.train_metrics = {metric.__class__.__name__: metric for metric in metrics}\n self.val_metrics = {metric.__class__.__name__: metric for metric in metrics}\n\n self.labels = None # set in eval_forward() if exists\n\n def forward(self, batch):\n for key in self.model_inputs:\n if key not in batch.keys():\n raise ValueError(f'Batch missing key: {key}')\n\n output = self.model(**batch) # type: ignore (thirdparty)\n return output\n\n def loss(self, outputs, batch):\n return outputs['loss']\n\n def eval_forward(self, batch, outputs: Optional[Any] = None):\n output = outputs if outputs else self.forward(batch)\n if self.use_logits:\n self.labels = batch.pop('labels')\n output = output['logits']\n\n # if we are in the single class case, then remove the classes dimension\n if output.shape[1] == 1:\n output = output.squeeze(dim=1)\n\n return output\n\n def get_metrics(self, is_train: bool = False) -> Dict[str, Metric]:\n if is_train:\n metrics = self.train_metrics\n else:\n metrics = self.val_metrics\n\n return metrics if metrics else {}\n\n def update_metric(self, batch: Any, outputs: Any, metric: Metric) -> None:\n metric.update(outputs, self.labels)\n\n def get_model_inputs(self):\n \"\"\"Returns a set of inputs that the model expects in the forward pass.\n If an algorithm wants to interact with the model inputs (for instance,\n popping the labels for a custom loss fn, or adding attention head masks\n for head pruning, it must access self.set_model_inputs().\n Returns:\n model_inputs: The set of keys that are expected in the Mapping used to compute the forward pass.\n \"\"\"\n\n return self.model_inputs\n", "path": "composer/models/huggingface.py"}], "after_files": [{"content": "# Copyright 2022 MosaicML Composer authors\n# SPDX-License-Identifier: Apache-2.0\n\n\"\"\"A wrapper class that converts \ud83e\udd17 Transformers models to composer models\"\"\"\n\nfrom __future__ import annotations\n\nimport logging\nfrom typing import TYPE_CHECKING, Any, Dict, List, Optional, Union\n\nfrom torchmetrics import Metric\n\nfrom composer.models.base import ComposerModel\nfrom composer.utils.import_helpers import MissingConditionalImportError\n\nif 
TYPE_CHECKING:\n import transformers\n\nlog = logging.getLogger(__name__)\n\n__all__ = ['HuggingFaceModel']\n\n\nclass HuggingFaceModel(ComposerModel):\n \"\"\"\n A wrapper class that converts \ud83e\udd17 Transformers models to composer models.\n\n Args:\n model (transformers.PreTrainedModel): A \ud83e\udd17 Transformers model.\n tokenizer (transformers.PreTrainedTokenizer): Tokenizer used to prepare the dataset and validate model inputs during training. Default ``None``.\n use_logits (bool, optional): If True, the model's output logits will be used to calculate validation metrics. Else, metrics will be inferred from the HuggingFaceModel directly. Default: ``False``\n metrics (list[Metric], optional): list of torchmetrics to apply to the output of `validate`. Default: ``None``.\n .. warning:: This wrapper is designed to work with \ud83e\udd17 datasets that define a `labels` column.\n\n Example:\n\n .. testcode::\n\n import transformers\n from composer.models import HuggingFaceModel\n\n hf_model = transformers.AutoModelForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)\n model = HuggingFaceModel(hf_model)\n \"\"\"\n\n def __init__(self,\n model: transformers.PreTrainedModel,\n tokenizer: Optional[Union[transformers.PreTrainedTokenizer,\n transformers.PreTrainedTokenizerFast]] = None,\n use_logits: Optional[bool] = False,\n metrics: Optional[List[Metric]] = None) -> None:\n try:\n import transformers\n except ImportError as e:\n raise MissingConditionalImportError(extra_deps_group='nlp', conda_package='transformers') from e\n\n super().__init__()\n self.model = model\n self.config = model.config\n\n # the set of inputs that a model expects inferred from the model type or\n # tokenizer if provided\n if tokenizer is None:\n if isinstance(self.model.base_model, transformers.GPT2Model):\n self.model_inputs = {'input_ids', 'attention_mask'}\n elif isinstance(self.model.base_model, transformers.BertModel):\n self.model_inputs = {'input_ids', 'attention_mask', 'token_type_ids'}\n else:\n assert tokenizer.model_input_names is not None, 'the tokenizer should have a model input name'\n self.model_inputs = set(tokenizer.model_input_names)\n\n if self.config.vocab_size != len(tokenizer):\n # set model's word embedding matrix and final lm_head to vocab size according to tokenizer\n log.warning(\n f'The number of tokens in the tokenizer and the number of tokens in the model are different.'\n f' Resizing the model tokenizer to {len(tokenizer)} from {self.config.vocab_size}.')\n self.model.resize_token_embeddings(len(tokenizer))\n\n self.use_logits = use_logits\n\n self.train_metrics = None\n self.val_metrics = None\n\n if metrics:\n self.train_metrics = {metric.__class__.__name__: metric for metric in metrics}\n self.val_metrics = {metric.__class__.__name__: metric for metric in metrics}\n\n self.labels = None # set in eval_forward() if exists\n\n def forward(self, batch):\n for key in self.model_inputs:\n if key not in batch.keys():\n raise ValueError(f'Batch missing key: {key}')\n\n output = self.model(**batch) # type: ignore (thirdparty)\n return output\n\n def loss(self, outputs, batch):\n return outputs['loss']\n\n def eval_forward(self, batch, outputs: Optional[Any] = None):\n output = outputs if outputs else self.forward(batch)\n if self.use_logits:\n self.labels = batch.pop('labels')\n output = output['logits']\n\n # if we are in the single class case, then remove the classes dimension\n if output.shape[1] == 1:\n output = output.squeeze(dim=1)\n\n return output\n\n def 
get_metrics(self, is_train: bool = False) -> Dict[str, Metric]:\n if is_train:\n metrics = self.train_metrics\n else:\n metrics = self.val_metrics\n\n return metrics if metrics else {}\n\n def update_metric(self, batch: Any, outputs: Any, metric: Metric) -> None:\n metric.update(outputs, self.labels)\n\n def get_model_inputs(self):\n \"\"\"Returns a set of inputs that the model expects in the forward pass.\n If an algorithm wants to interact with the model inputs (for instance,\n popping the labels for a custom loss fn, or adding attention head masks\n for head pruning, it must access self.set_model_inputs().\n Returns:\n model_inputs: The set of keys that are expected in the Mapping used to compute the forward pass.\n \"\"\"\n\n return self.model_inputs\n", "path": "composer/models/huggingface.py"}]}
1,711
281
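The patch above adds a `resize_token_embeddings` call whenever the tokenizer and the model disagree on vocabulary size. A minimal standalone demonstration of that reconciliation, using small public checkpoints (the model and tokenizer names here are arbitrary examples and downloading them requires network access):

```python
import transformers

model = transformers.AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
tokenizer = transformers.AutoTokenizer.from_pretrained("bert-base-uncased")

print(model.config.vocab_size, len(tokenizer))  # mismatched, e.g. 50257 vs 30522

if model.config.vocab_size != len(tokenizer):
    # Resizes the input embedding matrix (and any weight-tied lm_head) in place.
    model.resize_token_embeddings(len(tokenizer))

assert model.get_input_embeddings().num_embeddings == len(tokenizer)
```

`resize_token_embeddings` also updates `config.vocab_size`, which is why the wrapper only needs the single call.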
gh_patches_debug_16540
rasdani/github-patches
git_diff
Kinto__kinto-1343
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- `kinto create-user` doesn't override the password if the user already exists. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `kinto/plugins/accounts/scripts.py` Content: ``` 1 import logging 2 import getpass 3 4 import transaction as current_transaction 5 from pyramid.settings import asbool 6 7 from .utils import hash_password 8 from .views import AccountIdGenerator 9 10 11 logger = logging.getLogger(__name__) 12 13 14 def create_user(env, username=None, password=None): 15 """Administrative command to create a new user.""" 16 registry = env['registry'] 17 settings = registry.settings 18 readonly_mode = asbool(settings.get('readonly', False)) 19 if readonly_mode: 20 message = 'Cannot create a user with a readonly server.' 21 logger.error(message) 22 return 51 23 24 if 'kinto.plugins.accounts' not in settings['includes']: 25 message = 'Cannot create a user when the accounts plugin is not installed.' 26 logger.error(message) 27 return 52 28 29 try: 30 validator = AccountIdGenerator() 31 if username is None: 32 username = input('Username: ') 33 while not validator.match(username): 34 print('{} is not a valid username.') 35 print('Username should match {0!r}, please try again.'.format(validator.regexp)) 36 username = input('Username: ') 37 38 if password is None: 39 while True: # The user didn't entered twice the same password 40 password = getpass.getpass('Please enter a password for {}: '.format(username)) 41 confirm = getpass.getpass('Please confirm the password: '.format(username)) 42 43 if password != confirm: 44 print('Sorry, passwords do not match, please try again.') 45 else: 46 break 47 except EOFError: 48 print('User creation aborted') 49 return 53 50 51 print("Creating user '{}'".format(username)) 52 record = {'id': username, 'password': hash_password(password)} 53 registry.storage.create(collection_id='account', 54 parent_id=username, 55 record=record, 56 ignore_conflict=True) 57 registry.permission.add_principal_to_ace('/accounts/{}'.format(username), 58 'write', 59 'account:{}'.format(username)) 60 61 current_transaction.commit() 62 63 return 0 64 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/kinto/plugins/accounts/scripts.py b/kinto/plugins/accounts/scripts.py --- a/kinto/plugins/accounts/scripts.py +++ b/kinto/plugins/accounts/scripts.py @@ -50,10 +50,10 @@ print("Creating user '{}'".format(username)) record = {'id': username, 'password': hash_password(password)} - registry.storage.create(collection_id='account', + registry.storage.update(collection_id='account', parent_id=username, - record=record, - ignore_conflict=True) + object_id=username, + record=record) registry.permission.add_principal_to_ace('/accounts/{}'.format(username), 'write', 'account:{}'.format(username))
{"golden_diff": "diff --git a/kinto/plugins/accounts/scripts.py b/kinto/plugins/accounts/scripts.py\n--- a/kinto/plugins/accounts/scripts.py\n+++ b/kinto/plugins/accounts/scripts.py\n@@ -50,10 +50,10 @@\n \n print(\"Creating user '{}'\".format(username))\n record = {'id': username, 'password': hash_password(password)}\n- registry.storage.create(collection_id='account',\n+ registry.storage.update(collection_id='account',\n parent_id=username,\n- record=record,\n- ignore_conflict=True)\n+ object_id=username,\n+ record=record)\n registry.permission.add_principal_to_ace('/accounts/{}'.format(username),\n 'write',\n 'account:{}'.format(username))\n", "issue": "`kinto create-user` doesn't override the password if the user already exists.\n\n", "before_files": [{"content": "import logging\nimport getpass\n\nimport transaction as current_transaction\nfrom pyramid.settings import asbool\n\nfrom .utils import hash_password\nfrom .views import AccountIdGenerator\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef create_user(env, username=None, password=None):\n \"\"\"Administrative command to create a new user.\"\"\"\n registry = env['registry']\n settings = registry.settings\n readonly_mode = asbool(settings.get('readonly', False))\n if readonly_mode:\n message = 'Cannot create a user with a readonly server.'\n logger.error(message)\n return 51\n\n if 'kinto.plugins.accounts' not in settings['includes']:\n message = 'Cannot create a user when the accounts plugin is not installed.'\n logger.error(message)\n return 52\n\n try:\n validator = AccountIdGenerator()\n if username is None:\n username = input('Username: ')\n while not validator.match(username):\n print('{} is not a valid username.')\n print('Username should match {0!r}, please try again.'.format(validator.regexp))\n username = input('Username: ')\n\n if password is None:\n while True: # The user didn't entered twice the same password\n password = getpass.getpass('Please enter a password for {}: '.format(username))\n confirm = getpass.getpass('Please confirm the password: '.format(username))\n\n if password != confirm:\n print('Sorry, passwords do not match, please try again.')\n else:\n break\n except EOFError:\n print('User creation aborted')\n return 53\n\n print(\"Creating user '{}'\".format(username))\n record = {'id': username, 'password': hash_password(password)}\n registry.storage.create(collection_id='account',\n parent_id=username,\n record=record,\n ignore_conflict=True)\n registry.permission.add_principal_to_ace('/accounts/{}'.format(username),\n 'write',\n 'account:{}'.format(username))\n\n current_transaction.commit()\n\n return 0\n", "path": "kinto/plugins/accounts/scripts.py"}], "after_files": [{"content": "import logging\nimport getpass\n\nimport transaction as current_transaction\nfrom pyramid.settings import asbool\n\nfrom .utils import hash_password\nfrom .views import AccountIdGenerator\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef create_user(env, username=None, password=None):\n \"\"\"Administrative command to create a new user.\"\"\"\n registry = env['registry']\n settings = registry.settings\n readonly_mode = asbool(settings.get('readonly', False))\n if readonly_mode:\n message = 'Cannot create a user with a readonly server.'\n logger.error(message)\n return 51\n\n if 'kinto.plugins.accounts' not in settings['includes']:\n message = 'Cannot create a user when the accounts plugin is not installed.'\n logger.error(message)\n return 52\n\n try:\n validator = AccountIdGenerator()\n if username is None:\n username = 
input('Username: ')\n while not validator.match(username):\n print('{} is not a valid username.')\n print('Username should match {0!r}, please try again.'.format(validator.regexp))\n username = input('Username: ')\n\n if password is None:\n while True: # The user didn't entered twice the same password\n password = getpass.getpass('Please enter a password for {}: '.format(username))\n confirm = getpass.getpass('Please confirm the password: '.format(username))\n\n if password != confirm:\n print('Sorry, passwords do not match, please try again.')\n else:\n break\n except EOFError:\n print('User creation aborted')\n return 53\n\n print(\"Creating user '{}'\".format(username))\n record = {'id': username, 'password': hash_password(password)}\n registry.storage.update(collection_id='account',\n parent_id=username,\n object_id=username,\n record=record)\n registry.permission.add_principal_to_ace('/accounts/{}'.format(username),\n 'write',\n 'account:{}'.format(username))\n\n current_transaction.commit()\n\n return 0\n", "path": "kinto/plugins/accounts/scripts.py"}]}
826
153
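The one-line semantic change in the patch above is `create(..., ignore_conflict=True)` versus `update(..., object_id=...)`: the former keeps the existing record when the id is taken, the latter upserts. A toy in-memory storage (a deliberately simplified stand-in, not kinto's real storage API) that makes the difference concrete:

```python
class ToyStorage:
    def __init__(self):
        self._records = {}

    def create(self, object_id, record, ignore_conflict=False):
        if object_id in self._records:
            if ignore_conflict:
                return self._records[object_id]  # old record survives
            raise ValueError("record exists")
        self._records[object_id] = record
        return record

    def update(self, object_id, record):
        self._records[object_id] = record  # unconditional upsert
        return record


storage = ToyStorage()
storage.create("alice", {"id": "alice", "password": "old-hash"})
storage.create("alice", {"id": "alice", "password": "new-hash"}, ignore_conflict=True)
assert storage._records["alice"]["password"] == "old-hash"   # the reported bug
storage.update("alice", {"id": "alice", "password": "new-hash"})
assert storage._records["alice"]["password"] == "new-hash"   # behavior after the fix
```

With `create` the second password is silently discarded; with `update` the command overrides it, which is what the issue expects.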
gh_patches_debug_22380
rasdani/github-patches
git_diff
getsentry__sentry-python-355
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- 0.7.10 CeleryIntegration captures task Ignore exception Similar to #252, I've noticed that the Sentry client with the CeleryIntegration is capturing task Ignore exceptions, which are often used with dynamic tasks (see also https://github.com/celery/celery/issues/3437). I believe that since Retries are ignored, the `Ignore` exception should also be ignored. The exception in Sentry is showing `celery/app/task.py in replace at line 894`: ```python raise Ignore('Replaced by new task') ``` celery: 4.3 sentry-sdk: 0.7.10 python: 3.6 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `sentry_sdk/integrations/celery.py` Content: ``` 1 from __future__ import absolute_import 2 3 import sys 4 5 from celery.exceptions import SoftTimeLimitExceeded, Retry # type: ignore 6 7 from sentry_sdk.hub import Hub 8 from sentry_sdk.utils import capture_internal_exceptions, event_from_exception 9 from sentry_sdk._compat import reraise 10 from sentry_sdk.integrations import Integration 11 from sentry_sdk.integrations.logging import ignore_logger 12 13 14 class CeleryIntegration(Integration): 15 identifier = "celery" 16 17 @staticmethod 18 def setup_once(): 19 import celery.app.trace as trace # type: ignore 20 21 old_build_tracer = trace.build_tracer 22 23 def sentry_build_tracer(name, task, *args, **kwargs): 24 # Need to patch both methods because older celery sometimes 25 # short-circuits to task.run if it thinks it's safe. 26 task.__call__ = _wrap_task_call(task, task.__call__) 27 task.run = _wrap_task_call(task, task.run) 28 return _wrap_tracer(task, old_build_tracer(name, task, *args, **kwargs)) 29 30 trace.build_tracer = sentry_build_tracer 31 32 _patch_worker_exit() 33 34 # This logger logs every status of every task that ran on the worker. 35 # Meaning that every task's breadcrumbs are full of stuff like "Task 36 # <foo> raised unexpected <bar>". 37 ignore_logger("celery.worker.job") 38 39 40 def _wrap_tracer(task, f): 41 # Need to wrap tracer for pushing the scope before prerun is sent, and 42 # popping it after postrun is sent. 43 # 44 # This is the reason we don't use signals for hooking in the first place. 45 # Also because in Celery 3, signal dispatch returns early if one handler 46 # crashes. 47 def _inner(*args, **kwargs): 48 hub = Hub.current 49 if hub.get_integration(CeleryIntegration) is None: 50 return f(*args, **kwargs) 51 52 with hub.push_scope() as scope: 53 scope._name = "celery" 54 scope.clear_breadcrumbs() 55 scope.add_event_processor(_make_event_processor(task, *args, **kwargs)) 56 57 return f(*args, **kwargs) 58 59 return _inner 60 61 62 def _wrap_task_call(task, f): 63 # Need to wrap task call because the exception is caught before we get to 64 # see it. Also celery's reported stacktrace is untrustworthy. 
65 def _inner(*args, **kwargs): 66 try: 67 return f(*args, **kwargs) 68 except Exception: 69 exc_info = sys.exc_info() 70 with capture_internal_exceptions(): 71 _capture_exception(task, exc_info) 72 reraise(*exc_info) 73 74 return _inner 75 76 77 def _make_event_processor(task, uuid, args, kwargs, request=None): 78 def event_processor(event, hint): 79 with capture_internal_exceptions(): 80 event["transaction"] = task.name 81 82 with capture_internal_exceptions(): 83 extra = event.setdefault("extra", {}) 84 extra["celery-job"] = { 85 "task_name": task.name, 86 "args": args, 87 "kwargs": kwargs, 88 } 89 90 if "exc_info" in hint: 91 with capture_internal_exceptions(): 92 if issubclass(hint["exc_info"][0], SoftTimeLimitExceeded): 93 event["fingerprint"] = [ 94 "celery", 95 "SoftTimeLimitExceeded", 96 getattr(task, "name", task), 97 ] 98 99 return event 100 101 return event_processor 102 103 104 def _capture_exception(task, exc_info): 105 hub = Hub.current 106 107 if hub.get_integration(CeleryIntegration) is None: 108 return 109 if isinstance(exc_info[1], Retry): 110 return 111 if hasattr(task, "throws") and isinstance(exc_info[1], task.throws): 112 return 113 114 event, hint = event_from_exception( 115 exc_info, 116 client_options=hub.client.options, 117 mechanism={"type": "celery", "handled": False}, 118 ) 119 120 hub.capture_event(event, hint=hint) 121 122 123 def _patch_worker_exit(): 124 # Need to flush queue before worker shutdown because a crashing worker will 125 # call os._exit 126 from billiard.pool import Worker # type: ignore 127 128 old_workloop = Worker.workloop 129 130 def sentry_workloop(*args, **kwargs): 131 try: 132 return old_workloop(*args, **kwargs) 133 finally: 134 with capture_internal_exceptions(): 135 hub = Hub.current 136 if hub.get_integration(CeleryIntegration) is not None: 137 hub.flush() 138 139 Worker.workloop = sentry_workloop 140 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/sentry_sdk/integrations/celery.py b/sentry_sdk/integrations/celery.py --- a/sentry_sdk/integrations/celery.py +++ b/sentry_sdk/integrations/celery.py @@ -2,7 +2,12 @@ import sys -from celery.exceptions import SoftTimeLimitExceeded, Retry # type: ignore +from celery.exceptions import ( # type: ignore + SoftTimeLimitExceeded, + Retry, + Ignore, + Reject, +) from sentry_sdk.hub import Hub from sentry_sdk.utils import capture_internal_exceptions, event_from_exception @@ -11,6 +16,9 @@ from sentry_sdk.integrations.logging import ignore_logger +CELERY_CONTROL_FLOW_EXCEPTIONS = (Retry, Ignore, Reject) + + class CeleryIntegration(Integration): identifier = "celery" @@ -106,7 +114,7 @@ if hub.get_integration(CeleryIntegration) is None: return - if isinstance(exc_info[1], Retry): + if isinstance(exc_info[1], CELERY_CONTROL_FLOW_EXCEPTIONS): return if hasattr(task, "throws") and isinstance(exc_info[1], task.throws): return
{"golden_diff": "diff --git a/sentry_sdk/integrations/celery.py b/sentry_sdk/integrations/celery.py\n--- a/sentry_sdk/integrations/celery.py\n+++ b/sentry_sdk/integrations/celery.py\n@@ -2,7 +2,12 @@\n \n import sys\n \n-from celery.exceptions import SoftTimeLimitExceeded, Retry # type: ignore\n+from celery.exceptions import ( # type: ignore\n+ SoftTimeLimitExceeded,\n+ Retry,\n+ Ignore,\n+ Reject,\n+)\n \n from sentry_sdk.hub import Hub\n from sentry_sdk.utils import capture_internal_exceptions, event_from_exception\n@@ -11,6 +16,9 @@\n from sentry_sdk.integrations.logging import ignore_logger\n \n \n+CELERY_CONTROL_FLOW_EXCEPTIONS = (Retry, Ignore, Reject)\n+\n+\n class CeleryIntegration(Integration):\n identifier = \"celery\"\n \n@@ -106,7 +114,7 @@\n \n if hub.get_integration(CeleryIntegration) is None:\n return\n- if isinstance(exc_info[1], Retry):\n+ if isinstance(exc_info[1], CELERY_CONTROL_FLOW_EXCEPTIONS):\n return\n if hasattr(task, \"throws\") and isinstance(exc_info[1], task.throws):\n return\n", "issue": "0.7.10 CeleryIntegration captures task Ignore exception\nSimilar to #252, I've noticed that the Sentry client with the CeleryIntegration is capturing task Ignore exceptions, which are often used with dynamic tasks (see also https://github.com/celery/celery/issues/3437). I believe that since Retries are ignored, the `Ignore` exception should also be ignored.\r\n\r\nThe exception in Sentry is showing `celery/app/task.py in replace at line 894`:\r\n```python\r\nraise Ignore('Replaced by new task')\r\n```\r\n\r\ncelery: 4.3\r\nsentry-sdk: 0.7.10\r\npython: 3.6\n", "before_files": [{"content": "from __future__ import absolute_import\n\nimport sys\n\nfrom celery.exceptions import SoftTimeLimitExceeded, Retry # type: ignore\n\nfrom sentry_sdk.hub import Hub\nfrom sentry_sdk.utils import capture_internal_exceptions, event_from_exception\nfrom sentry_sdk._compat import reraise\nfrom sentry_sdk.integrations import Integration\nfrom sentry_sdk.integrations.logging import ignore_logger\n\n\nclass CeleryIntegration(Integration):\n identifier = \"celery\"\n\n @staticmethod\n def setup_once():\n import celery.app.trace as trace # type: ignore\n\n old_build_tracer = trace.build_tracer\n\n def sentry_build_tracer(name, task, *args, **kwargs):\n # Need to patch both methods because older celery sometimes\n # short-circuits to task.run if it thinks it's safe.\n task.__call__ = _wrap_task_call(task, task.__call__)\n task.run = _wrap_task_call(task, task.run)\n return _wrap_tracer(task, old_build_tracer(name, task, *args, **kwargs))\n\n trace.build_tracer = sentry_build_tracer\n\n _patch_worker_exit()\n\n # This logger logs every status of every task that ran on the worker.\n # Meaning that every task's breadcrumbs are full of stuff like \"Task\n # <foo> raised unexpected <bar>\".\n ignore_logger(\"celery.worker.job\")\n\n\ndef _wrap_tracer(task, f):\n # Need to wrap tracer for pushing the scope before prerun is sent, and\n # popping it after postrun is sent.\n #\n # This is the reason we don't use signals for hooking in the first place.\n # Also because in Celery 3, signal dispatch returns early if one handler\n # crashes.\n def _inner(*args, **kwargs):\n hub = Hub.current\n if hub.get_integration(CeleryIntegration) is None:\n return f(*args, **kwargs)\n\n with hub.push_scope() as scope:\n scope._name = \"celery\"\n scope.clear_breadcrumbs()\n scope.add_event_processor(_make_event_processor(task, *args, **kwargs))\n\n return f(*args, **kwargs)\n\n return _inner\n\n\ndef _wrap_task_call(task, 
f):\n # Need to wrap task call because the exception is caught before we get to\n # see it. Also celery's reported stacktrace is untrustworthy.\n def _inner(*args, **kwargs):\n try:\n return f(*args, **kwargs)\n except Exception:\n exc_info = sys.exc_info()\n with capture_internal_exceptions():\n _capture_exception(task, exc_info)\n reraise(*exc_info)\n\n return _inner\n\n\ndef _make_event_processor(task, uuid, args, kwargs, request=None):\n def event_processor(event, hint):\n with capture_internal_exceptions():\n event[\"transaction\"] = task.name\n\n with capture_internal_exceptions():\n extra = event.setdefault(\"extra\", {})\n extra[\"celery-job\"] = {\n \"task_name\": task.name,\n \"args\": args,\n \"kwargs\": kwargs,\n }\n\n if \"exc_info\" in hint:\n with capture_internal_exceptions():\n if issubclass(hint[\"exc_info\"][0], SoftTimeLimitExceeded):\n event[\"fingerprint\"] = [\n \"celery\",\n \"SoftTimeLimitExceeded\",\n getattr(task, \"name\", task),\n ]\n\n return event\n\n return event_processor\n\n\ndef _capture_exception(task, exc_info):\n hub = Hub.current\n\n if hub.get_integration(CeleryIntegration) is None:\n return\n if isinstance(exc_info[1], Retry):\n return\n if hasattr(task, \"throws\") and isinstance(exc_info[1], task.throws):\n return\n\n event, hint = event_from_exception(\n exc_info,\n client_options=hub.client.options,\n mechanism={\"type\": \"celery\", \"handled\": False},\n )\n\n hub.capture_event(event, hint=hint)\n\n\ndef _patch_worker_exit():\n # Need to flush queue before worker shutdown because a crashing worker will\n # call os._exit\n from billiard.pool import Worker # type: ignore\n\n old_workloop = Worker.workloop\n\n def sentry_workloop(*args, **kwargs):\n try:\n return old_workloop(*args, **kwargs)\n finally:\n with capture_internal_exceptions():\n hub = Hub.current\n if hub.get_integration(CeleryIntegration) is not None:\n hub.flush()\n\n Worker.workloop = sentry_workloop\n", "path": "sentry_sdk/integrations/celery.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nimport sys\n\nfrom celery.exceptions import ( # type: ignore\n SoftTimeLimitExceeded,\n Retry,\n Ignore,\n Reject,\n)\n\nfrom sentry_sdk.hub import Hub\nfrom sentry_sdk.utils import capture_internal_exceptions, event_from_exception\nfrom sentry_sdk._compat import reraise\nfrom sentry_sdk.integrations import Integration\nfrom sentry_sdk.integrations.logging import ignore_logger\n\n\nCELERY_CONTROL_FLOW_EXCEPTIONS = (Retry, Ignore, Reject)\n\n\nclass CeleryIntegration(Integration):\n identifier = \"celery\"\n\n @staticmethod\n def setup_once():\n import celery.app.trace as trace # type: ignore\n\n old_build_tracer = trace.build_tracer\n\n def sentry_build_tracer(name, task, *args, **kwargs):\n # Need to patch both methods because older celery sometimes\n # short-circuits to task.run if it thinks it's safe.\n task.__call__ = _wrap_task_call(task, task.__call__)\n task.run = _wrap_task_call(task, task.run)\n return _wrap_tracer(task, old_build_tracer(name, task, *args, **kwargs))\n\n trace.build_tracer = sentry_build_tracer\n\n _patch_worker_exit()\n\n # This logger logs every status of every task that ran on the worker.\n # Meaning that every task's breadcrumbs are full of stuff like \"Task\n # <foo> raised unexpected <bar>\".\n ignore_logger(\"celery.worker.job\")\n\n\ndef _wrap_tracer(task, f):\n # Need to wrap tracer for pushing the scope before prerun is sent, and\n # popping it after postrun is sent.\n #\n # This is the reason we don't use signals for hooking in 
the first place.\n # Also because in Celery 3, signal dispatch returns early if one handler\n # crashes.\n def _inner(*args, **kwargs):\n hub = Hub.current\n if hub.get_integration(CeleryIntegration) is None:\n return f(*args, **kwargs)\n\n with hub.push_scope() as scope:\n scope._name = \"celery\"\n scope.clear_breadcrumbs()\n scope.add_event_processor(_make_event_processor(task, *args, **kwargs))\n\n return f(*args, **kwargs)\n\n return _inner\n\n\ndef _wrap_task_call(task, f):\n # Need to wrap task call because the exception is caught before we get to\n # see it. Also celery's reported stacktrace is untrustworthy.\n def _inner(*args, **kwargs):\n try:\n return f(*args, **kwargs)\n except Exception:\n exc_info = sys.exc_info()\n with capture_internal_exceptions():\n _capture_exception(task, exc_info)\n reraise(*exc_info)\n\n return _inner\n\n\ndef _make_event_processor(task, uuid, args, kwargs, request=None):\n def event_processor(event, hint):\n with capture_internal_exceptions():\n event[\"transaction\"] = task.name\n\n with capture_internal_exceptions():\n extra = event.setdefault(\"extra\", {})\n extra[\"celery-job\"] = {\n \"task_name\": task.name,\n \"args\": args,\n \"kwargs\": kwargs,\n }\n\n if \"exc_info\" in hint:\n with capture_internal_exceptions():\n if issubclass(hint[\"exc_info\"][0], SoftTimeLimitExceeded):\n event[\"fingerprint\"] = [\n \"celery\",\n \"SoftTimeLimitExceeded\",\n getattr(task, \"name\", task),\n ]\n\n return event\n\n return event_processor\n\n\ndef _capture_exception(task, exc_info):\n hub = Hub.current\n\n if hub.get_integration(CeleryIntegration) is None:\n return\n if isinstance(exc_info[1], CELERY_CONTROL_FLOW_EXCEPTIONS):\n return\n if hasattr(task, \"throws\") and isinstance(exc_info[1], task.throws):\n return\n\n event, hint = event_from_exception(\n exc_info,\n client_options=hub.client.options,\n mechanism={\"type\": \"celery\", \"handled\": False},\n )\n\n hub.capture_event(event, hint=hint)\n\n\ndef _patch_worker_exit():\n # Need to flush queue before worker shutdown because a crashing worker will\n # call os._exit\n from billiard.pool import Worker # type: ignore\n\n old_workloop = Worker.workloop\n\n def sentry_workloop(*args, **kwargs):\n try:\n return old_workloop(*args, **kwargs)\n finally:\n with capture_internal_exceptions():\n hub = Hub.current\n if hub.get_integration(CeleryIntegration) is not None:\n hub.flush()\n\n Worker.workloop = sentry_workloop\n", "path": "sentry_sdk/integrations/celery.py"}]}
1,725
282
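The fix above generalizes the `Retry` special case into a tuple covering all of Celery's control-flow exceptions. Reduced to its essence (requires celery installed; `should_capture` is a distillation for illustration, not the SDK's actual function name):

```python
from celery.exceptions import Retry, Ignore, Reject

CELERY_CONTROL_FLOW_EXCEPTIONS = (Retry, Ignore, Reject)


def should_capture(exc, task_throws=()):
    if isinstance(exc, CELERY_CONTROL_FLOW_EXCEPTIONS):
        return False  # the task is signalling the worker, not failing
    if isinstance(exc, task_throws):
        return False  # exceptions the task declared as expected
    return True


assert not should_capture(Ignore("Replaced by new task"))  # the case in the issue
assert not should_capture(Retry())
assert should_capture(ValueError("real bug"))
```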
gh_patches_debug_9738
rasdani/github-patches
git_diff
modin-project__modin-6337
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Ray is incompatible with pydantic>=2.0 We should pin `pydantic<2.0` to workaround the issues on Ray side. Example [from](https://github.com/modin-project/modin/actions/runs/5425526005/jobs/9866377841): ```bash File "/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/__init__.py", line 1, in <module> from ray.util.state.api import ( File "/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/api.py", line 17, in <module> from ray.util.state.common import ( File "/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/common.py", line 120, in <module> @dataclass(init=True) File "/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/pydantic/dataclasses.py", line 139, in dataclass assert init is False, 'pydantic.dataclasses.dataclass only supports init=False' AssertionError: pydantic.dataclasses.dataclass only supports init=False ``` --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 from setuptools import setup, find_packages 2 import versioneer 3 4 with open("README.md", "r", encoding="utf-8") as fh: 5 long_description = fh.read() 6 7 dask_deps = ["dask>=2.22.0", "distributed>=2.22.0"] 8 # ray==2.5.0 broken: https://github.com/conda-forge/ray-packages-feedstock/issues/100 9 ray_deps = ["ray[default]>=1.13.0,!=2.5.0", "pyarrow"] 10 unidist_deps = ["unidist[mpi]>=0.2.1"] 11 remote_deps = ["rpyc==4.1.5", "cloudpickle", "boto3"] 12 spreadsheet_deps = ["modin-spreadsheet>=0.1.0"] 13 sql_deps = ["dfsql>=0.4.2", "pyparsing<=2.4.7"] 14 all_deps = dask_deps + ray_deps + unidist_deps + remote_deps + spreadsheet_deps 15 16 # Distribute 'modin-autoimport-pandas.pth' along with binary and source distributions. 17 # This file provides the "import pandas before Ray init" feature if specific 18 # environment variable is set (see https://github.com/modin-project/modin/issues/4564). 
19 cmdclass = versioneer.get_cmdclass() 20 extra_files = ["modin-autoimport-pandas.pth"] 21 22 23 class AddPthFileBuild(cmdclass["build_py"]): 24 def _get_data_files(self): 25 return (super()._get_data_files() or []) + [ 26 (".", ".", self.build_lib, extra_files) 27 ] 28 29 30 class AddPthFileSDist(cmdclass["sdist"]): 31 def make_distribution(self): 32 self.filelist.extend(extra_files) 33 return super().make_distribution() 34 35 36 cmdclass["build_py"] = AddPthFileBuild 37 cmdclass["sdist"] = AddPthFileSDist 38 39 setup( 40 name="modin", 41 version=versioneer.get_version(), 42 cmdclass=cmdclass, 43 description="Modin: Make your pandas code run faster by changing one line of code.", 44 packages=find_packages(exclude=["scripts", "scripts.*"]), 45 include_package_data=True, 46 license="Apache 2", 47 url="https://github.com/modin-project/modin", 48 long_description=long_description, 49 long_description_content_type="text/markdown", 50 install_requires=[ 51 "pandas>=2,<2.1", 52 "packaging", 53 "numpy>=1.18.5", 54 "fsspec", 55 "psutil", 56 ], 57 extras_require={ 58 # can be installed by pip install modin[dask] 59 "dask": dask_deps, 60 "ray": ray_deps, 61 "unidist": unidist_deps, 62 "remote": remote_deps, 63 "spreadsheet": spreadsheet_deps, 64 "sql": sql_deps, 65 "all": all_deps, 66 }, 67 python_requires=">=3.8", 68 ) 69 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -6,7 +6,8 @@ dask_deps = ["dask>=2.22.0", "distributed>=2.22.0"] # ray==2.5.0 broken: https://github.com/conda-forge/ray-packages-feedstock/issues/100 -ray_deps = ["ray[default]>=1.13.0,!=2.5.0", "pyarrow"] +# pydantic<2: https://github.com/modin-project/modin/issues/6336 +ray_deps = ["ray[default]>=1.13.0,!=2.5.0", "pyarrow", "pydantic<2"] unidist_deps = ["unidist[mpi]>=0.2.1"] remote_deps = ["rpyc==4.1.5", "cloudpickle", "boto3"] spreadsheet_deps = ["modin-spreadsheet>=0.1.0"]
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -6,7 +6,8 @@\n \n dask_deps = [\"dask>=2.22.0\", \"distributed>=2.22.0\"]\n # ray==2.5.0 broken: https://github.com/conda-forge/ray-packages-feedstock/issues/100\n-ray_deps = [\"ray[default]>=1.13.0,!=2.5.0\", \"pyarrow\"]\n+# pydantic<2: https://github.com/modin-project/modin/issues/6336\n+ray_deps = [\"ray[default]>=1.13.0,!=2.5.0\", \"pyarrow\", \"pydantic<2\"]\n unidist_deps = [\"unidist[mpi]>=0.2.1\"]\n remote_deps = [\"rpyc==4.1.5\", \"cloudpickle\", \"boto3\"]\n spreadsheet_deps = [\"modin-spreadsheet>=0.1.0\"]\n", "issue": "Ray is incompatible with pydantic>=2.0\nWe should pin `pydantic<2.0` to workaround the issues on Ray side.\r\n\r\n\r\nExample [from](https://github.com/modin-project/modin/actions/runs/5425526005/jobs/9866377841):\r\n```bash\r\n File \"/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/__init__.py\", line 1, in <module>\r\n from ray.util.state.api import (\r\n File \"/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/api.py\", line 17, in <module>\r\n from ray.util.state.common import (\r\n File \"/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/ray/util/state/common.py\", line 120, in <module>\r\n @dataclass(init=True)\r\n File \"/usr/share/miniconda/envs/modin/lib/python3.8/site-packages/pydantic/dataclasses.py\", line 139, in dataclass\r\n assert init is False, 'pydantic.dataclasses.dataclass only supports init=False'\r\nAssertionError: pydantic.dataclasses.dataclass only supports init=False\r\n```\n", "before_files": [{"content": "from setuptools import setup, find_packages\nimport versioneer\n\nwith open(\"README.md\", \"r\", encoding=\"utf-8\") as fh:\n long_description = fh.read()\n\ndask_deps = [\"dask>=2.22.0\", \"distributed>=2.22.0\"]\n# ray==2.5.0 broken: https://github.com/conda-forge/ray-packages-feedstock/issues/100\nray_deps = [\"ray[default]>=1.13.0,!=2.5.0\", \"pyarrow\"]\nunidist_deps = [\"unidist[mpi]>=0.2.1\"]\nremote_deps = [\"rpyc==4.1.5\", \"cloudpickle\", \"boto3\"]\nspreadsheet_deps = [\"modin-spreadsheet>=0.1.0\"]\nsql_deps = [\"dfsql>=0.4.2\", \"pyparsing<=2.4.7\"]\nall_deps = dask_deps + ray_deps + unidist_deps + remote_deps + spreadsheet_deps\n\n# Distribute 'modin-autoimport-pandas.pth' along with binary and source distributions.\n# This file provides the \"import pandas before Ray init\" feature if specific\n# environment variable is set (see https://github.com/modin-project/modin/issues/4564).\ncmdclass = versioneer.get_cmdclass()\nextra_files = [\"modin-autoimport-pandas.pth\"]\n\n\nclass AddPthFileBuild(cmdclass[\"build_py\"]):\n def _get_data_files(self):\n return (super()._get_data_files() or []) + [\n (\".\", \".\", self.build_lib, extra_files)\n ]\n\n\nclass AddPthFileSDist(cmdclass[\"sdist\"]):\n def make_distribution(self):\n self.filelist.extend(extra_files)\n return super().make_distribution()\n\n\ncmdclass[\"build_py\"] = AddPthFileBuild\ncmdclass[\"sdist\"] = AddPthFileSDist\n\nsetup(\n name=\"modin\",\n version=versioneer.get_version(),\n cmdclass=cmdclass,\n description=\"Modin: Make your pandas code run faster by changing one line of code.\",\n packages=find_packages(exclude=[\"scripts\", \"scripts.*\"]),\n include_package_data=True,\n license=\"Apache 2\",\n url=\"https://github.com/modin-project/modin\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n install_requires=[\n \"pandas>=2,<2.1\",\n \"packaging\",\n 
\"numpy>=1.18.5\",\n \"fsspec\",\n \"psutil\",\n ],\n extras_require={\n # can be installed by pip install modin[dask]\n \"dask\": dask_deps,\n \"ray\": ray_deps,\n \"unidist\": unidist_deps,\n \"remote\": remote_deps,\n \"spreadsheet\": spreadsheet_deps,\n \"sql\": sql_deps,\n \"all\": all_deps,\n },\n python_requires=\">=3.8\",\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\nimport versioneer\n\nwith open(\"README.md\", \"r\", encoding=\"utf-8\") as fh:\n long_description = fh.read()\n\ndask_deps = [\"dask>=2.22.0\", \"distributed>=2.22.0\"]\n# ray==2.5.0 broken: https://github.com/conda-forge/ray-packages-feedstock/issues/100\n# pydantic<2: https://github.com/modin-project/modin/issues/6336\nray_deps = [\"ray[default]>=1.13.0,!=2.5.0\", \"pyarrow\", \"pydantic<2\"]\nunidist_deps = [\"unidist[mpi]>=0.2.1\"]\nremote_deps = [\"rpyc==4.1.5\", \"cloudpickle\", \"boto3\"]\nspreadsheet_deps = [\"modin-spreadsheet>=0.1.0\"]\nsql_deps = [\"dfsql>=0.4.2\", \"pyparsing<=2.4.7\"]\nall_deps = dask_deps + ray_deps + unidist_deps + remote_deps + spreadsheet_deps\n\n# Distribute 'modin-autoimport-pandas.pth' along with binary and source distributions.\n# This file provides the \"import pandas before Ray init\" feature if specific\n# environment variable is set (see https://github.com/modin-project/modin/issues/4564).\ncmdclass = versioneer.get_cmdclass()\nextra_files = [\"modin-autoimport-pandas.pth\"]\n\n\nclass AddPthFileBuild(cmdclass[\"build_py\"]):\n def _get_data_files(self):\n return (super()._get_data_files() or []) + [\n (\".\", \".\", self.build_lib, extra_files)\n ]\n\n\nclass AddPthFileSDist(cmdclass[\"sdist\"]):\n def make_distribution(self):\n self.filelist.extend(extra_files)\n return super().make_distribution()\n\n\ncmdclass[\"build_py\"] = AddPthFileBuild\ncmdclass[\"sdist\"] = AddPthFileSDist\n\nsetup(\n name=\"modin\",\n version=versioneer.get_version(),\n cmdclass=cmdclass,\n description=\"Modin: Make your pandas code run faster by changing one line of code.\",\n packages=find_packages(exclude=[\"scripts\", \"scripts.*\"]),\n include_package_data=True,\n license=\"Apache 2\",\n url=\"https://github.com/modin-project/modin\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n install_requires=[\n \"pandas>=2,<2.1\",\n \"packaging\",\n \"numpy>=1.18.5\",\n \"fsspec\",\n \"psutil\",\n ],\n extras_require={\n # can be installed by pip install modin[dask]\n \"dask\": dask_deps,\n \"ray\": ray_deps,\n \"unidist\": unidist_deps,\n \"remote\": remote_deps,\n \"spreadsheet\": spreadsheet_deps,\n \"sql\": sql_deps,\n \"all\": all_deps,\n },\n python_requires=\">=3.8\",\n)\n", "path": "setup.py"}]}
1309
225
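A minimal sketch of how the `pydantic<2` bound added above can be checked at runtime. This is illustrative only; it assumes Python 3.8+ (for `importlib.metadata`) and the third-party `packaging` library, neither of which the record itself mandates:

```python
from importlib.metadata import PackageNotFoundError, version

from packaging.specifiers import SpecifierSet


def ray_pydantic_pin_satisfied() -> bool:
    """Return True if the installed pydantic satisfies the '<2' pin."""
    try:
        installed = version("pydantic")
    except PackageNotFoundError:
        return False  # pydantic is not installed at all
    # Per the issue's traceback, Ray's state API uses @dataclass(init=True),
    # which pydantic's dataclass wrapper rejects with an assertion error;
    # hence the upper bound added to ray_deps.
    return installed in SpecifierSet("<2")
```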
gh_patches_debug_3681
rasdani/github-patches
git_diff
ivy-llc__ivy-13823
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- rand_like --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `ivy/functional/frontends/torch/random_sampling.py` Content: ``` 1 import ivy 2 from ivy.func_wrapper import with_supported_dtypes 3 from ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back 4 5 try: 6 from torch import Generator 7 except ImportError: 8 from types import SimpleNamespace 9 10 Generator = SimpleNamespace 11 12 13 def seed() -> int: 14 """Returns a 64 bit number used to seed the RNG""" 15 return int(ivy.randint(-(2**63), 2**63 - 1)) 16 17 18 @to_ivy_arrays_and_back 19 def manual_seed(seed: int): 20 ivy.seed(seed_value=seed) 21 return Generator().manual_seed(seed) 22 23 24 @with_supported_dtypes( 25 { 26 "1.11.0 and below": ( 27 "float32", 28 "float64", 29 ) 30 }, 31 "torch", 32 ) 33 @to_ivy_arrays_and_back 34 def multinomial(input, num_samples, replacement=False, *, generator=None, out=None): 35 return ivy.multinomial( 36 num_samples + 1, # doesn't matter because `probs` is provided, but should be 37 # greater than the number of samples 38 num_samples, 39 probs=input, 40 replace=replacement, 41 out=out, 42 ) 43 44 45 @with_supported_dtypes( 46 { 47 "1.11.0 and below": ( 48 "float32", 49 "float64", 50 ) 51 }, 52 "torch", 53 ) 54 @to_ivy_arrays_and_back 55 def poisson(input, generator=None): 56 return ivy.poisson(input, shape=None) 57 58 59 @to_ivy_arrays_and_back 60 def rand( 61 size, 62 *, 63 generator=None, 64 out=None, 65 dtype=None, 66 layout=None, 67 device=None, 68 requires_grad=False, 69 pin_memory=False 70 ): 71 return ivy.random_uniform( 72 shape=size, 73 out=out, 74 dtype=dtype, 75 device=device, 76 ) 77 78 79 @to_ivy_arrays_and_back 80 def randn( 81 size, 82 *, 83 generator=None, 84 out=None, 85 dtype=None, 86 layout=None, 87 device=None, 88 requires_grad=False, 89 pin_memory=False 90 ): 91 return ivy.random_normal( 92 shape=size, 93 out=out, 94 dtype=dtype, 95 device=device, 96 ) 97 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/ivy/functional/frontends/torch/random_sampling.py b/ivy/functional/frontends/torch/random_sampling.py --- a/ivy/functional/frontends/torch/random_sampling.py +++ b/ivy/functional/frontends/torch/random_sampling.py @@ -76,6 +76,27 @@ ) +@to_ivy_arrays_and_back +def rand_like( + input, + *, + dtype=None, + layout=None, + device=None, + requires_grad=False, + memory_format=False +): + shape = input.shape + if not dtype: + dtype = input.dtype + + return ivy.random_uniform( + shape=shape, + dtype=dtype, + device=device, + ) + + @to_ivy_arrays_and_back def randn( size,
{"golden_diff": "diff --git a/ivy/functional/frontends/torch/random_sampling.py b/ivy/functional/frontends/torch/random_sampling.py\n--- a/ivy/functional/frontends/torch/random_sampling.py\n+++ b/ivy/functional/frontends/torch/random_sampling.py\n@@ -76,6 +76,27 @@\n )\n \n \n+@to_ivy_arrays_and_back\n+def rand_like(\n+ input,\n+ *,\n+ dtype=None,\n+ layout=None,\n+ device=None,\n+ requires_grad=False,\n+ memory_format=False\n+):\n+ shape = input.shape\n+ if not dtype:\n+ dtype = input.dtype\n+\n+ return ivy.random_uniform(\n+ shape=shape,\n+ dtype=dtype,\n+ device=device,\n+ )\n+\n+\n @to_ivy_arrays_and_back\n def randn(\n size,\n", "issue": "rand_like\n\n", "before_files": [{"content": "import ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back\n\ntry:\n from torch import Generator\nexcept ImportError:\n from types import SimpleNamespace\n\n Generator = SimpleNamespace\n\n\ndef seed() -> int:\n \"\"\"Returns a 64 bit number used to seed the RNG\"\"\"\n return int(ivy.randint(-(2**63), 2**63 - 1))\n\n\n@to_ivy_arrays_and_back\ndef manual_seed(seed: int):\n ivy.seed(seed_value=seed)\n return Generator().manual_seed(seed)\n\n\n@with_supported_dtypes(\n {\n \"1.11.0 and below\": (\n \"float32\",\n \"float64\",\n )\n },\n \"torch\",\n)\n@to_ivy_arrays_and_back\ndef multinomial(input, num_samples, replacement=False, *, generator=None, out=None):\n return ivy.multinomial(\n num_samples + 1, # doesn't matter because `probs` is provided, but should be\n # greater than the number of samples\n num_samples,\n probs=input,\n replace=replacement,\n out=out,\n )\n\n\n@with_supported_dtypes(\n {\n \"1.11.0 and below\": (\n \"float32\",\n \"float64\",\n )\n },\n \"torch\",\n)\n@to_ivy_arrays_and_back\ndef poisson(input, generator=None):\n return ivy.poisson(input, shape=None)\n\n\n@to_ivy_arrays_and_back\ndef rand(\n size,\n *,\n generator=None,\n out=None,\n dtype=None,\n layout=None,\n device=None,\n requires_grad=False,\n pin_memory=False\n):\n return ivy.random_uniform(\n shape=size,\n out=out,\n dtype=dtype,\n device=device,\n )\n\n\n@to_ivy_arrays_and_back\ndef randn(\n size,\n *,\n generator=None,\n out=None,\n dtype=None,\n layout=None,\n device=None,\n requires_grad=False,\n pin_memory=False\n):\n return ivy.random_normal(\n shape=size,\n out=out,\n dtype=dtype,\n device=device,\n )\n", "path": "ivy/functional/frontends/torch/random_sampling.py"}], "after_files": [{"content": "import ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back\n\ntry:\n from torch import Generator\nexcept ImportError:\n from types import SimpleNamespace\n\n Generator = SimpleNamespace\n\n\ndef seed() -> int:\n \"\"\"Returns a 64 bit number used to seed the RNG\"\"\"\n return int(ivy.randint(-(2**63), 2**63 - 1))\n\n\n@to_ivy_arrays_and_back\ndef manual_seed(seed: int):\n ivy.seed(seed_value=seed)\n return Generator().manual_seed(seed)\n\n\n@with_supported_dtypes(\n {\n \"1.11.0 and below\": (\n \"float32\",\n \"float64\",\n )\n },\n \"torch\",\n)\n@to_ivy_arrays_and_back\ndef multinomial(input, num_samples, replacement=False, *, generator=None, out=None):\n return ivy.multinomial(\n num_samples + 1, # doesn't matter because `probs` is provided, but should be\n # greater than the number of samples\n num_samples,\n probs=input,\n replace=replacement,\n out=out,\n )\n\n\n@with_supported_dtypes(\n {\n \"1.11.0 and below\": (\n \"float32\",\n \"float64\",\n )\n 
},\n \"torch\",\n)\n@to_ivy_arrays_and_back\ndef poisson(input, generator=None):\n return ivy.poisson(input, shape=None)\n\n\n@to_ivy_arrays_and_back\ndef rand(\n size,\n *,\n generator=None,\n out=None,\n dtype=None,\n layout=None,\n device=None,\n requires_grad=False,\n pin_memory=False\n):\n return ivy.random_uniform(\n shape=size,\n out=out,\n dtype=dtype,\n device=device,\n )\n\n\n@to_ivy_arrays_and_back\ndef rand_like(\n input,\n *,\n dtype=None,\n layout=None,\n device=None,\n requires_grad=False,\n memory_format=False\n):\n shape = input.shape\n if not dtype:\n dtype = input.dtype\n\n return ivy.random_uniform(\n shape=shape,\n dtype=dtype,\n device=device,\n )\n\n\n@to_ivy_arrays_and_back\ndef randn(\n size,\n *,\n generator=None,\n out=None,\n dtype=None,\n layout=None,\n device=None,\n requires_grad=False,\n pin_memory=False\n):\n return ivy.random_normal(\n shape=size,\n out=out,\n dtype=dtype,\n device=device,\n )\n", "path": "ivy/functional/frontends/torch/random_sampling.py"}]}
940
188
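A NumPy stand-in for the semantics the `rand_like` patch above introduces: shape taken from the input, dtype defaulting to the input's dtype, values uniform on [0, 1). Illustrative only; the real frontend delegates to `ivy.random_uniform`, and, as the diff shows, it accepts but ignores `layout`, `requires_grad`, and `memory_format` (whose `False` default differs from torch's `torch.preserve_format`, harmlessly, since the value is unused):

```python
import numpy as np


def rand_like(x, *, dtype=None):
    # Mirrors the patched frontend's `if not dtype:` defaulting.
    if not dtype:
        dtype = x.dtype
    return np.random.uniform(size=x.shape).astype(dtype)


x = np.zeros((2, 3), dtype=np.float32)
y = rand_like(x)
assert y.shape == x.shape and y.dtype == x.dtype
```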
gh_patches_debug_66902
rasdani/github-patches
git_diff
ivy-llc__ivy-17524
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- argsort --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `ivy/functional/frontends/paddle/tensor/search.py` Content: ``` 1 # global 2 import ivy 3 from ivy.func_wrapper import with_supported_dtypes 4 from ivy.functional.frontends.paddle.func_wrapper import ( 5 to_ivy_arrays_and_back, 6 ) 7 8 9 @with_supported_dtypes( 10 {"2.4.2 and below": ("float32", "float64", "int16", "int32", "int64", "uint8")}, 11 "paddle", 12 ) 13 @to_ivy_arrays_and_back 14 def argmax(x, /, *, axis=None, keepdim=False, dtype="int64", name=None): 15 return ivy.argmax(x, axis=axis, keepdims=keepdim, dtype=dtype) 16 17 18 @with_supported_dtypes( 19 {"2.4.2 and below": ("float32", "float64", "int16", "int32", "int64", "uint8")}, 20 "paddle", 21 ) 22 @to_ivy_arrays_and_back 23 def argmin(x, /, *, axis=None, keepdim=False, dtype="int64", name=None): 24 return ivy.argmin(x, axis=axis, keepdims=keepdim, dtype=dtype) 25 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/ivy/functional/frontends/paddle/tensor/search.py b/ivy/functional/frontends/paddle/tensor/search.py --- a/ivy/functional/frontends/paddle/tensor/search.py +++ b/ivy/functional/frontends/paddle/tensor/search.py @@ -22,3 +22,12 @@ @to_ivy_arrays_and_back def argmin(x, /, *, axis=None, keepdim=False, dtype="int64", name=None): return ivy.argmin(x, axis=axis, keepdims=keepdim, dtype=dtype) + + +@with_supported_dtypes( + {"2.4.2 and below": ("float32", "float64", "int16", "int32", "int64", "uint8")}, + "paddle", +) +@to_ivy_arrays_and_back +def argsort(x, /, *, axis=-1, descending=False, name=None): + return ivy.argsort(x, axis=axis, descending=descending)
{"golden_diff": "diff --git a/ivy/functional/frontends/paddle/tensor/search.py b/ivy/functional/frontends/paddle/tensor/search.py\n--- a/ivy/functional/frontends/paddle/tensor/search.py\n+++ b/ivy/functional/frontends/paddle/tensor/search.py\n@@ -22,3 +22,12 @@\n @to_ivy_arrays_and_back\n def argmin(x, /, *, axis=None, keepdim=False, dtype=\"int64\", name=None):\n return ivy.argmin(x, axis=axis, keepdims=keepdim, dtype=dtype)\n+\n+\n+@with_supported_dtypes(\n+ {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n+ \"paddle\",\n+)\n+@to_ivy_arrays_and_back\n+def argsort(x, /, *, axis=-1, descending=False, name=None):\n+ return ivy.argsort(x, axis=axis, descending=descending)\n", "issue": "argsort\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_supported_dtypes(\n {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef argmax(x, /, *, axis=None, keepdim=False, dtype=\"int64\", name=None):\n return ivy.argmax(x, axis=axis, keepdims=keepdim, dtype=dtype)\n\n\n@with_supported_dtypes(\n {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef argmin(x, /, *, axis=None, keepdim=False, dtype=\"int64\", name=None):\n return ivy.argmin(x, axis=axis, keepdims=keepdim, dtype=dtype)\n", "path": "ivy/functional/frontends/paddle/tensor/search.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_supported_dtypes(\n {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef argmax(x, /, *, axis=None, keepdim=False, dtype=\"int64\", name=None):\n return ivy.argmax(x, axis=axis, keepdims=keepdim, dtype=dtype)\n\n\n@with_supported_dtypes(\n {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef argmin(x, /, *, axis=None, keepdim=False, dtype=\"int64\", name=None):\n return ivy.argmin(x, axis=axis, keepdims=keepdim, dtype=dtype)\n\n\n@with_supported_dtypes(\n {\"2.4.2 and below\": (\"float32\", \"float64\", \"int16\", \"int32\", \"int64\", \"uint8\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef argsort(x, /, *, axis=-1, descending=False, name=None):\n return ivy.argsort(x, axis=axis, descending=descending)\n", "path": "ivy/functional/frontends/paddle/tensor/search.py"}]}
561
228
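A NumPy stand-in for the `argsort` wrapper above, including the `descending` flag that `numpy.argsort` itself lacks. Illustrative only; the real wrapper simply forwards to `ivy.argsort`:

```python
import numpy as np


def argsort(x, axis=-1, descending=False):
    idx = np.argsort(x, axis=axis)
    # Flipping an ascending argsort yields descending order, though it
    # also reverses the relative order of tied elements (not stable).
    return np.flip(idx, axis=axis) if descending else idx


x = np.array([[3, 1, 2], [9, 7, 8]])
print(argsort(x).tolist())                   # [[1, 2, 0], [1, 2, 0]]
print(argsort(x, descending=True).tolist())  # [[0, 2, 1], [0, 2, 1]]
```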
gh_patches_debug_11382
rasdani/github-patches
git_diff
activeloopai__deeplake-683
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [BUG] ## 🐛🐛 Bug Report ### ⚗️ Current Behavior Hub's version info is present in two locations, `setup.py` and `hub/version.py`. As result, the released version displays the wrong version info (1.2.3 instead of 1.3.0) when users do `hub --version`. ### 🧰 Possible Solution (optional) Remove version info from `setup.py`. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 """ 2 License: 3 This Source Code Form is subject to the terms of the Mozilla Public License, v. 2.0. 4 If a copy of the MPL was not distributed with this file, You can obtain one at https://mozilla.org/MPL/2.0/. 5 """ 6 7 import os 8 9 from setuptools import find_packages, setup 10 11 project = "hub" 12 VERSION = "1.3.0" 13 14 this_directory = os.path.abspath(os.path.dirname(__file__)) 15 with open(os.path.join(this_directory, "README.md"), encoding="utf-8") as f: 16 long_description = f.read() 17 18 with open(os.path.join(this_directory, "requirements.txt")) as f: 19 requirements = f.readlines() 20 21 setup( 22 name=project, 23 version=VERSION, 24 description="Activeloop Hub", 25 long_description=long_description, 26 long_description_content_type="text/markdown", 27 author="Snark AI Inc.", 28 author_email="[email protected]", 29 license="MPL 2.0", 30 url="https://github.com/activeloopai/Hub", 31 packages=find_packages(), 32 include_package_data=True, 33 zip_safe=False, 34 keywords="snark-hub", 35 project_urls={ 36 "Documentation": "https://docs.activeloop.ai/", 37 "Source": "https://github.com/activeloopai/Hub", 38 }, 39 classifiers=[ 40 "License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)", 41 "Programming Language :: Python :: 3", 42 "Programming Language :: Python :: 3.6", 43 "Programming Language :: Python :: 3.7", 44 "Programming Language :: Python :: 3.8", 45 "Programming Language :: Python :: 3.9", 46 "Programming Language :: Python :: 3 :: Only", 47 ], 48 python_requires=">=3.6", 49 install_requires=requirements, 50 setup_requires=[], 51 dependency_links=[], 52 entry_points={ 53 "console_scripts": [ 54 "activeloop = hub.cli.command:cli", 55 "activeloop-local = hub.cli.local:cli", 56 "activeloop-dev = hub.cli.dev:cli", 57 "hub = hub.cli.command:cli", 58 ] 59 }, 60 tests_require=["pytest", "mock>=1.0.1"], 61 ) 62 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -9,7 +9,6 @@ from setuptools import find_packages, setup project = "hub" -VERSION = "1.3.0" this_directory = os.path.abspath(os.path.dirname(__file__)) with open(os.path.join(this_directory, "README.md")) as f: @@ -20,7 +19,6 @@ setup( name=project, - version=VERSION, description="Activeloop Hub", long_description=long_description, long_description_content_type="text/markdown",
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -9,7 +9,6 @@\n from setuptools import find_packages, setup\n \n project = \"hub\"\n-VERSION = \"1.3.0\"\n \n this_directory = os.path.abspath(os.path.dirname(__file__))\n with open(os.path.join(this_directory, \"README.md\")) as f:\n@@ -20,7 +19,6 @@\n \n setup(\n name=project,\n- version=VERSION,\n description=\"Activeloop Hub\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n", "issue": "[BUG]\n## \ud83d\udc1b\ud83d\udc1b Bug Report\r\n\r\n\r\n### \u2697\ufe0f Current Behavior\r\nHub's version info is present in two locations, `setup.py` and `hub/version.py`. As result, the released version displays the wrong version info (1.2.3 instead of 1.3.0) when users do `hub --version`.\r\n\r\n### \ud83e\uddf0 Possible Solution (optional)\r\nRemove version info from `setup.py`.\n", "before_files": [{"content": "\"\"\"\nLicense:\nThis Source Code Form is subject to the terms of the Mozilla Public License, v. 2.0.\nIf a copy of the MPL was not distributed with this file, You can obtain one at https://mozilla.org/MPL/2.0/.\n\"\"\"\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nproject = \"hub\"\nVERSION = \"1.3.0\"\n\nthis_directory = os.path.abspath(os.path.dirname(__file__))\nwith open(os.path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nwith open(os.path.join(this_directory, \"requirements.txt\")) as f:\n requirements = f.readlines()\n\nsetup(\n name=project,\n version=VERSION,\n description=\"Activeloop Hub\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n author=\"Snark AI Inc.\",\n author_email=\"[email protected]\",\n license=\"MPL 2.0\",\n url=\"https://github.com/activeloopai/Hub\",\n packages=find_packages(),\n include_package_data=True,\n zip_safe=False,\n keywords=\"snark-hub\",\n project_urls={\n \"Documentation\": \"https://docs.activeloop.ai/\",\n \"Source\": \"https://github.com/activeloopai/Hub\",\n },\n classifiers=[\n \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3 :: Only\",\n ],\n python_requires=\">=3.6\",\n install_requires=requirements,\n setup_requires=[],\n dependency_links=[],\n entry_points={\n \"console_scripts\": [\n \"activeloop = hub.cli.command:cli\",\n \"activeloop-local = hub.cli.local:cli\",\n \"activeloop-dev = hub.cli.dev:cli\",\n \"hub = hub.cli.command:cli\",\n ]\n },\n tests_require=[\"pytest\", \"mock>=1.0.1\"],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"\nLicense:\nThis Source Code Form is subject to the terms of the Mozilla Public License, v. 
2.0.\nIf a copy of the MPL was not distributed with this file, You can obtain one at https://mozilla.org/MPL/2.0/.\n\"\"\"\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nproject = \"hub\"\n\nthis_directory = os.path.abspath(os.path.dirname(__file__))\nwith open(os.path.join(this_directory, \"README.md\")) as f:\n long_description = f.read()\n\nwith open(os.path.join(this_directory, \"requirements.txt\")) as f:\n requirements = f.readlines()\n\nsetup(\n name=project,\n description=\"Activeloop Hub\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n author=\"Snark AI Inc.\",\n author_email=\"[email protected]\",\n license=\"MPL 2.0\",\n url=\"https://github.com/activeloopai/Hub\",\n packages=find_packages(),\n include_package_data=True,\n zip_safe=False,\n keywords=\"snark-hub\",\n project_urls={\n \"Documentation\": \"https://docs.activeloop.ai/\",\n \"Source\": \"https://github.com/activeloopai/Hub\",\n },\n classifiers=[\n \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3 :: Only\",\n ],\n python_requires=\">=3.6\",\n install_requires=requirements,\n setup_requires=[],\n dependency_links=[],\n entry_points={\n \"console_scripts\": [\n \"activeloop = hub.cli.command:cli\",\n \"activeloop-local = hub.cli.local:cli\",\n \"activeloop-dev = hub.cli.dev:cli\",\n \"hub = hub.cli.command:cli\",\n ]\n },\n tests_require=[\"pytest\", \"mock>=1.0.1\"],\n)\n", "path": "setup.py"}]}
950
132
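The fix above deletes the hard-coded `VERSION` so that `hub/version.py` becomes the single source of truth, as the issue requests. One common way to wire that up in a `setup.py` is sketched below; this is a generic pattern under the assumption that `hub/version.py` defines `__version__ = "..."`, not the project's actual code:

```python
import os
import re


def read_version(package_dir="hub"):
    """Parse __version__ from <package_dir>/version.py without importing it."""
    path = os.path.join(package_dir, "version.py")
    with open(path) as f:
        match = re.search(r"__version__\s*=\s*['\"]([^'\"]+)['\"]", f.read())
    if match is None:
        raise RuntimeError("no __version__ found in " + path)
    return match.group(1)
```

One consistency wart in this record: the `golden_diff` context line for the README `open()` omits the `encoding="utf-8"` argument that `before_files` contains, so the stored diff does not apply cleanly to the stored before-state.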
gh_patches_debug_63280
rasdani/github-patches
git_diff
pre-commit__pre-commit-1113
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- rust hook requires `--path` attribute Cargo has changed how packages get installed and requires an extra `--path <destination>` attribute. Symptom: ``` [INFO] Initializing environment for https://github.com/nix-community/nixpkgs-fmt. [INFO] Installing environment for https://github.com/nix-community/nixpkgs-fmt. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... An unexpected error has occurred: CalledProcessError: Command: ('/nix/store/fcc3x8zwq1c0667xjs7bkn6ay8j4fdpz-rust-1.38.0-nightly-2019-08-07-ad7c55e1f/bin/cargo', 'install', '--bins', '--root', '/home/zimbatm/.cache/pre-commit/repoeft6xm6t/rustenv-default') Return code: 101 Expected return code: 0 Output: (none) Errors: error: Using `cargo install` to install the binaries for the package in current working directory is no longer supported, use `cargo install --path .` instead. Use `cargo build` if you want to simply build the package. ``` I guess the fix should be done where here: https://github.com/pre-commit/pre-commit/blob/9c6a1d80d6b94c86a1785a40a51389e83accac3e/pre_commit/languages/rust.py#L87 Do we want to make pre-commit compatible with multiple versions of cargo or just the latest one? /cc @asottile @chriskuehl --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `pre_commit/languages/rust.py` Content: ``` 1 from __future__ import unicode_literals 2 3 import contextlib 4 import os.path 5 6 import toml 7 8 import pre_commit.constants as C 9 from pre_commit.envcontext import envcontext 10 from pre_commit.envcontext import Var 11 from pre_commit.languages import helpers 12 from pre_commit.util import clean_path_on_failure 13 from pre_commit.util import cmd_output 14 15 16 ENVIRONMENT_DIR = 'rustenv' 17 get_default_version = helpers.basic_get_default_version 18 healthy = helpers.basic_healthy 19 20 21 def get_env_patch(target_dir): 22 return ( 23 ( 24 'PATH', 25 (os.path.join(target_dir, 'bin'), os.pathsep, Var('PATH')), 26 ), 27 ) 28 29 30 @contextlib.contextmanager 31 def in_env(prefix): 32 target_dir = prefix.path( 33 helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT), 34 ) 35 with envcontext(get_env_patch(target_dir)): 36 yield 37 38 39 def _add_dependencies(cargo_toml_path, additional_dependencies): 40 with open(cargo_toml_path, 'r+') as f: 41 cargo_toml = toml.load(f) 42 cargo_toml.setdefault('dependencies', {}) 43 for dep in additional_dependencies: 44 name, _, spec = dep.partition(':') 45 cargo_toml['dependencies'][name] = spec or '*' 46 f.seek(0) 47 toml.dump(cargo_toml, f) 48 f.truncate() 49 50 51 def install_environment(prefix, version, additional_dependencies): 52 helpers.assert_version_default('rust', version) 53 directory = prefix.path( 54 helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT), 55 ) 56 57 # There are two cases where we might want to specify more dependencies: 58 # as dependencies for the library being built, and as binary packages 59 # to be `cargo install`'d. 60 # 61 # Unlike e.g. Python, if we just `cargo install` a library, it won't be 62 # used for compilation. And if we add a crate providing a binary to the 63 # `Cargo.toml`, the binary won't be built. 64 # 65 # Because of this, we allow specifying "cli" dependencies by prefixing 66 # with 'cli:'. 
67 cli_deps = { 68 dep for dep in additional_dependencies if dep.startswith('cli:') 69 } 70 lib_deps = set(additional_dependencies) - cli_deps 71 72 if len(lib_deps) > 0: 73 _add_dependencies(prefix.path('Cargo.toml'), lib_deps) 74 75 with clean_path_on_failure(directory): 76 packages_to_install = {()} 77 for cli_dep in cli_deps: 78 cli_dep = cli_dep[len('cli:'):] 79 package, _, version = cli_dep.partition(':') 80 if version != '': 81 packages_to_install.add((package, '--version', version)) 82 else: 83 packages_to_install.add((package,)) 84 85 for package in packages_to_install: 86 cmd_output( 87 'cargo', 'install', '--bins', '--root', directory, *package, 88 cwd=prefix.prefix_dir 89 ) 90 91 92 def run_hook(hook, file_args): 93 with in_env(hook.prefix): 94 return helpers.run_xargs(hook, helpers.to_cmd(hook), file_args) 95 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/pre_commit/languages/rust.py b/pre_commit/languages/rust.py --- a/pre_commit/languages/rust.py +++ b/pre_commit/languages/rust.py @@ -73,7 +73,7 @@ _add_dependencies(prefix.path('Cargo.toml'), lib_deps) with clean_path_on_failure(directory): - packages_to_install = {()} + packages_to_install = {('--path', '.')} for cli_dep in cli_deps: cli_dep = cli_dep[len('cli:'):] package, _, version = cli_dep.partition(':')
{"golden_diff": "diff --git a/pre_commit/languages/rust.py b/pre_commit/languages/rust.py\n--- a/pre_commit/languages/rust.py\n+++ b/pre_commit/languages/rust.py\n@@ -73,7 +73,7 @@\n _add_dependencies(prefix.path('Cargo.toml'), lib_deps)\n \n with clean_path_on_failure(directory):\n- packages_to_install = {()}\n+ packages_to_install = {('--path', '.')}\n for cli_dep in cli_deps:\n cli_dep = cli_dep[len('cli:'):]\n package, _, version = cli_dep.partition(':')\n", "issue": "rust hook requires `--path` attribute\nCargo has changed how packages get installed and requires an extra `--path <destination>` attribute.\r\n\r\nSymptom:\r\n```\r\n[INFO] Initializing environment for https://github.com/nix-community/nixpkgs-fmt.\r\n[INFO] Installing environment for https://github.com/nix-community/nixpkgs-fmt.\r\n[INFO] Once installed this environment will be reused.\r\n[INFO] This may take a few minutes...\r\nAn unexpected error has occurred: CalledProcessError: Command: ('/nix/store/fcc3x8zwq1c0667xjs7bkn6ay8j4fdpz-rust-1.38.0-nightly-2019-08-07-ad7c55e1f/bin/cargo', 'install', '--bins', '--root', '/home/zimbatm/.cache/pre-commit/repoeft6xm6t/rustenv-default')\r\nReturn code: 101\r\nExpected return code: 0\r\nOutput: (none)\r\nErrors: \r\n error: Using `cargo install` to install the binaries for the package in current working directory is no longer supported, use `cargo install --path .` instead. Use `cargo build` if you want to simply build the package.\r\n```\r\n\r\nI guess the fix should be done where here: https://github.com/pre-commit/pre-commit/blob/9c6a1d80d6b94c86a1785a40a51389e83accac3e/pre_commit/languages/rust.py#L87\r\n\r\nDo we want to make pre-commit compatible with multiple versions of cargo or just the latest one?\r\n\r\n/cc @asottile @chriskuehl \n", "before_files": [{"content": "from __future__ import unicode_literals\n\nimport contextlib\nimport os.path\n\nimport toml\n\nimport pre_commit.constants as C\nfrom pre_commit.envcontext import envcontext\nfrom pre_commit.envcontext import Var\nfrom pre_commit.languages import helpers\nfrom pre_commit.util import clean_path_on_failure\nfrom pre_commit.util import cmd_output\n\n\nENVIRONMENT_DIR = 'rustenv'\nget_default_version = helpers.basic_get_default_version\nhealthy = helpers.basic_healthy\n\n\ndef get_env_patch(target_dir):\n return (\n (\n 'PATH',\n (os.path.join(target_dir, 'bin'), os.pathsep, Var('PATH')),\n ),\n )\n\n\[email protected]\ndef in_env(prefix):\n target_dir = prefix.path(\n helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT),\n )\n with envcontext(get_env_patch(target_dir)):\n yield\n\n\ndef _add_dependencies(cargo_toml_path, additional_dependencies):\n with open(cargo_toml_path, 'r+') as f:\n cargo_toml = toml.load(f)\n cargo_toml.setdefault('dependencies', {})\n for dep in additional_dependencies:\n name, _, spec = dep.partition(':')\n cargo_toml['dependencies'][name] = spec or '*'\n f.seek(0)\n toml.dump(cargo_toml, f)\n f.truncate()\n\n\ndef install_environment(prefix, version, additional_dependencies):\n helpers.assert_version_default('rust', version)\n directory = prefix.path(\n helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT),\n )\n\n # There are two cases where we might want to specify more dependencies:\n # as dependencies for the library being built, and as binary packages\n # to be `cargo install`'d.\n #\n # Unlike e.g. Python, if we just `cargo install` a library, it won't be\n # used for compilation. 
And if we add a crate providing a binary to the\n # `Cargo.toml`, the binary won't be built.\n #\n # Because of this, we allow specifying \"cli\" dependencies by prefixing\n # with 'cli:'.\n cli_deps = {\n dep for dep in additional_dependencies if dep.startswith('cli:')\n }\n lib_deps = set(additional_dependencies) - cli_deps\n\n if len(lib_deps) > 0:\n _add_dependencies(prefix.path('Cargo.toml'), lib_deps)\n\n with clean_path_on_failure(directory):\n packages_to_install = {()}\n for cli_dep in cli_deps:\n cli_dep = cli_dep[len('cli:'):]\n package, _, version = cli_dep.partition(':')\n if version != '':\n packages_to_install.add((package, '--version', version))\n else:\n packages_to_install.add((package,))\n\n for package in packages_to_install:\n cmd_output(\n 'cargo', 'install', '--bins', '--root', directory, *package,\n cwd=prefix.prefix_dir\n )\n\n\ndef run_hook(hook, file_args):\n with in_env(hook.prefix):\n return helpers.run_xargs(hook, helpers.to_cmd(hook), file_args)\n", "path": "pre_commit/languages/rust.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nimport contextlib\nimport os.path\n\nimport toml\n\nimport pre_commit.constants as C\nfrom pre_commit.envcontext import envcontext\nfrom pre_commit.envcontext import Var\nfrom pre_commit.languages import helpers\nfrom pre_commit.util import clean_path_on_failure\nfrom pre_commit.util import cmd_output\n\n\nENVIRONMENT_DIR = 'rustenv'\nget_default_version = helpers.basic_get_default_version\nhealthy = helpers.basic_healthy\n\n\ndef get_env_patch(target_dir):\n return (\n (\n 'PATH',\n (os.path.join(target_dir, 'bin'), os.pathsep, Var('PATH')),\n ),\n )\n\n\[email protected]\ndef in_env(prefix):\n target_dir = prefix.path(\n helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT),\n )\n with envcontext(get_env_patch(target_dir)):\n yield\n\n\ndef _add_dependencies(cargo_toml_path, additional_dependencies):\n with open(cargo_toml_path, 'r+') as f:\n cargo_toml = toml.load(f)\n cargo_toml.setdefault('dependencies', {})\n for dep in additional_dependencies:\n name, _, spec = dep.partition(':')\n cargo_toml['dependencies'][name] = spec or '*'\n f.seek(0)\n toml.dump(cargo_toml, f)\n f.truncate()\n\n\ndef install_environment(prefix, version, additional_dependencies):\n helpers.assert_version_default('rust', version)\n directory = prefix.path(\n helpers.environment_dir(ENVIRONMENT_DIR, C.DEFAULT),\n )\n\n # There are two cases where we might want to specify more dependencies:\n # as dependencies for the library being built, and as binary packages\n # to be `cargo install`'d.\n #\n # Unlike e.g. Python, if we just `cargo install` a library, it won't be\n # used for compilation. 
And if we add a crate providing a binary to the\n # `Cargo.toml`, the binary won't be built.\n #\n # Because of this, we allow specifying \"cli\" dependencies by prefixing\n # with 'cli:'.\n cli_deps = {\n dep for dep in additional_dependencies if dep.startswith('cli:')\n }\n lib_deps = set(additional_dependencies) - cli_deps\n\n if len(lib_deps) > 0:\n _add_dependencies(prefix.path('Cargo.toml'), lib_deps)\n\n with clean_path_on_failure(directory):\n packages_to_install = {('--path', '.')}\n for cli_dep in cli_deps:\n cli_dep = cli_dep[len('cli:'):]\n package, _, version = cli_dep.partition(':')\n if version != '':\n packages_to_install.add((package, '--version', version))\n else:\n packages_to_install.add((package,))\n\n for package in packages_to_install:\n cmd_output(\n 'cargo', 'install', '--bins', '--root', directory, *package,\n cwd=prefix.prefix_dir\n )\n\n\ndef run_hook(hook, file_args):\n with in_env(hook.prefix):\n return helpers.run_xargs(hook, helpers.to_cmd(hook), file_args)\n", "path": "pre_commit/languages/rust.py"}]}
1489
125
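The merged fix above settles the issue's closing question by unconditionally passing `--path .` rather than branching on the cargo version; `--path` is also accepted by older cargo for local-crate installs, so a single code path suffices. A minimal sketch of the fixed invocation, illustrative only since the real code goes through pre-commit's `cmd_output` helper:

```python
import subprocess


def install_local_crate(root_dir, crate_dir):
    # `--path .` installs the crate located in `crate_dir`; without it,
    # the cargo nightly in the issue's log refuses the implicit-cwd form.
    subprocess.check_call(
        ("cargo", "install", "--bins", "--root", root_dir, "--path", "."),
        cwd=crate_dir,
    )
```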
gh_patches_debug_17363
rasdani/github-patches
git_diff
DataDog__dd-trace-py-4009
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Jinja2: Handle non-string template names as span resources Jinja2 template names may not be strings, but need to be serialized as strings prior to being set as a span resource. Problem areas can be found in `ddtrace/contrib/jinja2/patch.py` on lines 63, 81 and 97 where the value from Jinja2 is trusted to be a string, but may not be. https://github.com/DataDog/dd-trace-py/blob/fb8dfa2f33fff37d21df9728d8386c0260df9744/ddtrace/contrib/jinja2/patch.py#L51-L100 Here is an example of the exception we are seeing when `span.finish` is called: ``` File "/opt/venv/lib/python3.8/site-packages/ddtrace/span.py", line 237, in finish cb(self) File "/opt/venv/lib/python3.8/site-packages/ddtrace/tracer.py", line 712, in _on_span_finish p.on_span_finish(span) File "/opt/venv/lib/python3.8/site-packages/ddtrace/internal/processor/trace.py", line 208, in on_span_finish self._writer.write(spans) File "/opt/venv/lib/python3.8/site-packages/ddtrace/internal/writer.py", line 518, in write self._encoder.put(spans) File "ddtrace/internal/_encoding.pyx", line 456, in ddtrace.internal._encoding.MsgpackEncoderBase.put File "ddtrace/internal/_encoding.pyx", line 460, in ddtrace.internal._encoding.MsgpackEncoderBase.put File "ddtrace/internal/_encoding.pyx", line 483, in ddtrace.internal._encoding.MsgpackEncoderBase.put File "ddtrace/internal/_encoding.pyx", line 464, in ddtrace.internal._encoding.MsgpackEncoderBase.put File "ddtrace/internal/_encoding.pyx", line 451, in ddtrace.internal._encoding.MsgpackEncoderBase._pack_trace File "ddtrace/internal/_encoding.pyx", line 600, in ddtrace.internal._encoding.MsgpackEncoderV03.pack_span File "ddtrace/internal/_encoding.pyx", line 142, in ddtrace.internal._encoding.pack_text TypeError: Unhandled text type: <class 'int'> ``` Relevant pip dependencies: datadogpy == 0.44.0 dd-trace-py == 1.2.3 jinja2 == 3.0.3 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `ddtrace/contrib/jinja2/patch.py` Content: ``` 1 import os 2 3 import jinja2 4 5 from ddtrace import config 6 from ddtrace.vendor.wrapt import wrap_function_wrapper as _w 7 8 from ...constants import SPAN_MEASURED_KEY 9 from ...ext import SpanTypes 10 from ...internal.utils import ArgumentError 11 from ...internal.utils import get_argument_value 12 from ...pin import Pin 13 from ..trace_utils import unwrap as _u 14 from .constants import DEFAULT_TEMPLATE_NAME 15 16 17 # default settings 18 config._add( 19 "jinja2", 20 { 21 "service_name": os.getenv("DD_JINJA2_SERVICE_NAME"), 22 }, 23 ) 24 25 26 def patch(): 27 if getattr(jinja2, "__datadog_patch", False): 28 # already patched 29 return 30 setattr(jinja2, "__datadog_patch", True) 31 Pin( 32 service=config.jinja2["service_name"], 33 _config=config.jinja2, 34 ).onto(jinja2.environment.Environment) 35 _w(jinja2, "environment.Template.render", _wrap_render) 36 _w(jinja2, "environment.Template.generate", _wrap_render) 37 _w(jinja2, "environment.Environment.compile", _wrap_compile) 38 _w(jinja2, "environment.Environment._load_template", _wrap_load_template) 39 40 41 def unpatch(): 42 if not getattr(jinja2, "__datadog_patch", False): 43 return 44 setattr(jinja2, "__datadog_patch", False) 45 _u(jinja2.Template, "render") 46 _u(jinja2.Template, "generate") 47 _u(jinja2.Environment, "compile") 48 _u(jinja2.Environment, "_load_template") 49 50 51 def _wrap_render(wrapped, instance, args, kwargs): 52 """Wrap `Template.render()` or `Template.generate()`""" 53 pin = Pin.get_from(instance.environment) 54 if not pin or not pin.enabled(): 55 return wrapped(*args, **kwargs) 56 57 template_name = instance.name or DEFAULT_TEMPLATE_NAME 58 with pin.tracer.trace("jinja2.render", pin.service, span_type=SpanTypes.TEMPLATE) as span: 59 span.set_tag(SPAN_MEASURED_KEY) 60 try: 61 return wrapped(*args, **kwargs) 62 finally: 63 span.resource = template_name 64 span.set_tag("jinja2.template_name", template_name) 65 66 67 def _wrap_compile(wrapped, instance, args, kwargs): 68 pin = Pin.get_from(instance) 69 if not pin or not pin.enabled(): 70 return wrapped(*args, **kwargs) 71 72 try: 73 template_name = get_argument_value(args, kwargs, 1, "name") 74 except ArgumentError: 75 template_name = DEFAULT_TEMPLATE_NAME 76 77 with pin.tracer.trace("jinja2.compile", pin.service, span_type=SpanTypes.TEMPLATE) as span: 78 try: 79 return wrapped(*args, **kwargs) 80 finally: 81 span.resource = template_name 82 span.set_tag("jinja2.template_name", template_name) 83 84 85 def _wrap_load_template(wrapped, instance, args, kwargs): 86 pin = Pin.get_from(instance) 87 if not pin or not pin.enabled(): 88 return wrapped(*args, **kwargs) 89 90 template_name = get_argument_value(args, kwargs, 0, "name") 91 with pin.tracer.trace("jinja2.load", pin.service, span_type=SpanTypes.TEMPLATE) as span: 92 template = None 93 try: 94 template = wrapped(*args, **kwargs) 95 return template 96 finally: 97 span.resource = template_name 98 span.set_tag("jinja2.template_name", template_name) 99 if template: 100 span.set_tag("jinja2.template_path", template.filename) 101 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/ddtrace/contrib/jinja2/patch.py b/ddtrace/contrib/jinja2/patch.py --- a/ddtrace/contrib/jinja2/patch.py +++ b/ddtrace/contrib/jinja2/patch.py @@ -7,6 +7,7 @@ from ...constants import SPAN_MEASURED_KEY from ...ext import SpanTypes +from ...internal.compat import stringify from ...internal.utils import ArgumentError from ...internal.utils import get_argument_value from ...pin import Pin @@ -54,7 +55,7 @@ if not pin or not pin.enabled(): return wrapped(*args, **kwargs) - template_name = instance.name or DEFAULT_TEMPLATE_NAME + template_name = stringify(instance.name or DEFAULT_TEMPLATE_NAME) with pin.tracer.trace("jinja2.render", pin.service, span_type=SpanTypes.TEMPLATE) as span: span.set_tag(SPAN_MEASURED_KEY) try:
{"golden_diff": "diff --git a/ddtrace/contrib/jinja2/patch.py b/ddtrace/contrib/jinja2/patch.py\n--- a/ddtrace/contrib/jinja2/patch.py\n+++ b/ddtrace/contrib/jinja2/patch.py\n@@ -7,6 +7,7 @@\n \n from ...constants import SPAN_MEASURED_KEY\n from ...ext import SpanTypes\n+from ...internal.compat import stringify\n from ...internal.utils import ArgumentError\n from ...internal.utils import get_argument_value\n from ...pin import Pin\n@@ -54,7 +55,7 @@\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n \n- template_name = instance.name or DEFAULT_TEMPLATE_NAME\n+ template_name = stringify(instance.name or DEFAULT_TEMPLATE_NAME)\n with pin.tracer.trace(\"jinja2.render\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n span.set_tag(SPAN_MEASURED_KEY)\n try:\n", "issue": "Jinja2: Handle non-string template names as span resources\nJinja2 template names may not be strings, but need to be serialized as strings prior to being set as a span resource. Problem areas can be found in `ddtrace/contrib/jinja2/patch.py` on lines 63, 81 and 97 where the value from Jinja2 is trusted to be a string, but may not be.\r\n\r\nhttps://github.com/DataDog/dd-trace-py/blob/fb8dfa2f33fff37d21df9728d8386c0260df9744/ddtrace/contrib/jinja2/patch.py#L51-L100\r\n\r\n\r\nHere is an example of the exception we are seeing when `span.finish` is called:\r\n\r\n```\r\n File \"/opt/venv/lib/python3.8/site-packages/ddtrace/span.py\", line 237, in finish\r\n cb(self)\r\n File \"/opt/venv/lib/python3.8/site-packages/ddtrace/tracer.py\", line 712, in _on_span_finish\r\n p.on_span_finish(span)\r\n File \"/opt/venv/lib/python3.8/site-packages/ddtrace/internal/processor/trace.py\", line 208, in on_span_finish\r\n self._writer.write(spans)\r\n File \"/opt/venv/lib/python3.8/site-packages/ddtrace/internal/writer.py\", line 518, in write\r\n self._encoder.put(spans)\r\n File \"ddtrace/internal/_encoding.pyx\", line 456, in ddtrace.internal._encoding.MsgpackEncoderBase.put\r\n File \"ddtrace/internal/_encoding.pyx\", line 460, in ddtrace.internal._encoding.MsgpackEncoderBase.put\r\n File \"ddtrace/internal/_encoding.pyx\", line 483, in ddtrace.internal._encoding.MsgpackEncoderBase.put\r\n File \"ddtrace/internal/_encoding.pyx\", line 464, in ddtrace.internal._encoding.MsgpackEncoderBase.put\r\n File \"ddtrace/internal/_encoding.pyx\", line 451, in ddtrace.internal._encoding.MsgpackEncoderBase._pack_trace\r\n File \"ddtrace/internal/_encoding.pyx\", line 600, in ddtrace.internal._encoding.MsgpackEncoderV03.pack_span\r\n File \"ddtrace/internal/_encoding.pyx\", line 142, in ddtrace.internal._encoding.pack_text\r\nTypeError: Unhandled text type: <class 'int'>\r\n```\r\n\r\nRelevant pip dependencies:\r\ndatadogpy == 0.44.0\r\ndd-trace-py == 1.2.3\r\njinja2 == 3.0.3\r\n\n", "before_files": [{"content": "import os\n\nimport jinja2\n\nfrom ddtrace import config\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\n\nfrom ...constants import SPAN_MEASURED_KEY\nfrom ...ext import SpanTypes\nfrom ...internal.utils import ArgumentError\nfrom ...internal.utils import get_argument_value\nfrom ...pin import Pin\nfrom ..trace_utils import unwrap as _u\nfrom .constants import DEFAULT_TEMPLATE_NAME\n\n\n# default settings\nconfig._add(\n \"jinja2\",\n {\n \"service_name\": os.getenv(\"DD_JINJA2_SERVICE_NAME\"),\n },\n)\n\n\ndef patch():\n if getattr(jinja2, \"__datadog_patch\", False):\n # already patched\n return\n setattr(jinja2, \"__datadog_patch\", True)\n Pin(\n service=config.jinja2[\"service_name\"],\n 
_config=config.jinja2,\n ).onto(jinja2.environment.Environment)\n _w(jinja2, \"environment.Template.render\", _wrap_render)\n _w(jinja2, \"environment.Template.generate\", _wrap_render)\n _w(jinja2, \"environment.Environment.compile\", _wrap_compile)\n _w(jinja2, \"environment.Environment._load_template\", _wrap_load_template)\n\n\ndef unpatch():\n if not getattr(jinja2, \"__datadog_patch\", False):\n return\n setattr(jinja2, \"__datadog_patch\", False)\n _u(jinja2.Template, \"render\")\n _u(jinja2.Template, \"generate\")\n _u(jinja2.Environment, \"compile\")\n _u(jinja2.Environment, \"_load_template\")\n\n\ndef _wrap_render(wrapped, instance, args, kwargs):\n \"\"\"Wrap `Template.render()` or `Template.generate()`\"\"\"\n pin = Pin.get_from(instance.environment)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n template_name = instance.name or DEFAULT_TEMPLATE_NAME\n with pin.tracer.trace(\"jinja2.render\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n span.set_tag(SPAN_MEASURED_KEY)\n try:\n return wrapped(*args, **kwargs)\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n\n\ndef _wrap_compile(wrapped, instance, args, kwargs):\n pin = Pin.get_from(instance)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n try:\n template_name = get_argument_value(args, kwargs, 1, \"name\")\n except ArgumentError:\n template_name = DEFAULT_TEMPLATE_NAME\n\n with pin.tracer.trace(\"jinja2.compile\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n try:\n return wrapped(*args, **kwargs)\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n\n\ndef _wrap_load_template(wrapped, instance, args, kwargs):\n pin = Pin.get_from(instance)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n template_name = get_argument_value(args, kwargs, 0, \"name\")\n with pin.tracer.trace(\"jinja2.load\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n template = None\n try:\n template = wrapped(*args, **kwargs)\n return template\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n if template:\n span.set_tag(\"jinja2.template_path\", template.filename)\n", "path": "ddtrace/contrib/jinja2/patch.py"}], "after_files": [{"content": "import os\n\nimport jinja2\n\nfrom ddtrace import config\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\n\nfrom ...constants import SPAN_MEASURED_KEY\nfrom ...ext import SpanTypes\nfrom ...internal.compat import stringify\nfrom ...internal.utils import ArgumentError\nfrom ...internal.utils import get_argument_value\nfrom ...pin import Pin\nfrom ..trace_utils import unwrap as _u\nfrom .constants import DEFAULT_TEMPLATE_NAME\n\n\n# default settings\nconfig._add(\n \"jinja2\",\n {\n \"service_name\": os.getenv(\"DD_JINJA2_SERVICE_NAME\"),\n },\n)\n\n\ndef patch():\n if getattr(jinja2, \"__datadog_patch\", False):\n # already patched\n return\n setattr(jinja2, \"__datadog_patch\", True)\n Pin(\n service=config.jinja2[\"service_name\"],\n _config=config.jinja2,\n ).onto(jinja2.environment.Environment)\n _w(jinja2, \"environment.Template.render\", _wrap_render)\n _w(jinja2, \"environment.Template.generate\", _wrap_render)\n _w(jinja2, \"environment.Environment.compile\", _wrap_compile)\n _w(jinja2, \"environment.Environment._load_template\", _wrap_load_template)\n\n\ndef unpatch():\n if not getattr(jinja2, \"__datadog_patch\", False):\n return\n setattr(jinja2, 
\"__datadog_patch\", False)\n _u(jinja2.Template, \"render\")\n _u(jinja2.Template, \"generate\")\n _u(jinja2.Environment, \"compile\")\n _u(jinja2.Environment, \"_load_template\")\n\n\ndef _wrap_render(wrapped, instance, args, kwargs):\n \"\"\"Wrap `Template.render()` or `Template.generate()`\"\"\"\n pin = Pin.get_from(instance.environment)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n template_name = stringify(instance.name or DEFAULT_TEMPLATE_NAME)\n with pin.tracer.trace(\"jinja2.render\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n span.set_tag(SPAN_MEASURED_KEY)\n try:\n return wrapped(*args, **kwargs)\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n\n\ndef _wrap_compile(wrapped, instance, args, kwargs):\n pin = Pin.get_from(instance)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n try:\n template_name = get_argument_value(args, kwargs, 1, \"name\")\n except ArgumentError:\n template_name = DEFAULT_TEMPLATE_NAME\n\n with pin.tracer.trace(\"jinja2.compile\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n try:\n return wrapped(*args, **kwargs)\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n\n\ndef _wrap_load_template(wrapped, instance, args, kwargs):\n pin = Pin.get_from(instance)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n template_name = get_argument_value(args, kwargs, 0, \"name\")\n with pin.tracer.trace(\"jinja2.load\", pin.service, span_type=SpanTypes.TEMPLATE) as span:\n template = None\n try:\n template = wrapped(*args, **kwargs)\n return template\n finally:\n span.resource = template_name\n span.set_tag(\"jinja2.template_name\", template_name)\n if template:\n span.set_tag(\"jinja2.template_path\", template.filename)\n", "path": "ddtrace/contrib/jinja2/patch.py"}]}
1822
207
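The patch above routes the render-path template name through `ddtrace.internal.compat.stringify` (effectively `str` on Python 3) before it becomes a span resource. Two details worth noting: the issue flags three assignment sites (render, compile, load), while the hunk shown coerces only `_wrap_render`; and the `or`-based fallback means falsy names such as `0` or `""` also collapse to the default. A stand-alone sketch of the coercion, where the value of `DEFAULT_TEMPLATE_NAME` is an assumption:

```python
DEFAULT_TEMPLATE_NAME = "<memory>"  # assumed value of ddtrace's constant


def template_name_as_text(name):
    # Jinja2 does not require template names to be strings (a custom
    # loader may key templates by ints, as in the reported traceback),
    # but the msgpack span encoder only accepts text.
    return str(name or DEFAULT_TEMPLATE_NAME)


assert template_name_as_text(42) == "42"
assert template_name_as_text(None) == DEFAULT_TEMPLATE_NAME
assert template_name_as_text("index.html") == "index.html"
```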
gh_patches_debug_22154
rasdani/github-patches
git_diff
mathesar-foundation__mathesar-3490
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- New connection creation flow does not handle schema creation failure scenarios ## Description 1. Create a database manually, say `test_db`, and create a schema called `Library Management` in it. 1. In Mathesar, create a new connection to the database. 1. While creating the connection, in the "New PostgreSQL Database Connection" modal, check the "Library Management" schema in the "Schemas to install" field. ![Screenshot 2024-01-26 at 1 01 21 PM](https://github.com/mathesar-foundation/mathesar/assets/11850603/ec5c109b-d779-4d56-96a0-1c209e3deb3d) 1. Click on 'Add Connection'. 1. An error is thrown: - `"(psycopg2.errors.DuplicateSchema) schema \"Library Management\" already exists\n\n[SQL: CREATE SCHEMA \"Library Management\";]\n(Background on this error at: https://sqlalche.me/e/14/f405)"`. - This is expected. 1. The modal remains open. Uncheck the "Library Management" schema, and try clicking on 'Add Connection' again. 1. Another error is thrown: - `duplicate key value violates unique constraint "mathesar_database_name_key" DETAIL: Key (name)=(pavish_install_test_1) already exists.` - From the user perpective, there's no clear next step for them to take when this happens. 1. Close the modal and refresh the page, notice that the connection has been created already. 1. Open the connection, and notice that it shows `Schemas (0)`. 1. Forcing a manual reflection using the "Sync external changes" button shows the public schema and existing tables. ### Note: * I performed the above steps 1-3 only as a means to force an error in the schema creation flow. While testing, I encountered an error which occurred due to an unknown reason. * This issue primarily focuses on steps 4-10. ## Expected behavior * When an error occurs, there should not be an inconsistent middle state in the DB vs the UI. * When schema creation fails, we should try to revert back to the original state, or we should make it extra clear on the UI that it's a multi-step process. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. 
--- BEGIN FILES --- Path: `mathesar/utils/connections.py` Content: ``` 1 """Utilities to help with creating and managing connections in Mathesar.""" 2 from psycopg2.errors import DuplicateSchema 3 from sqlalchemy.exc import OperationalError 4 from mathesar.models.base import Database 5 from db import install, connection as dbconn 6 from mathesar.state import reset_reflection 7 from demo.install.library_dataset import load_library_dataset 8 from demo.install.movies_dataset import load_movies_dataset 9 10 11 class BadInstallationTarget(Exception): 12 """Raise when an attempt is made to install on a disallowed target""" 13 pass 14 15 16 def copy_connection_from_preexisting( 17 connection, nickname, db_name, create_db, sample_data 18 ): 19 if connection['connection_type'] == 'internal_database': 20 db_model = Database.create_from_settings_key('default') 21 elif connection['connection_type'] == 'user_database': 22 db_model = Database.current_objects.get(id=connection['id']) 23 db_model.id = None 24 else: 25 raise KeyError("connection_type") 26 root_db = db_model.db_name 27 return _save_and_install( 28 db_model, db_name, root_db, nickname, create_db, sample_data 29 ) 30 31 32 def create_connection_from_scratch( 33 user, password, host, port, nickname, db_name, sample_data 34 ): 35 db_model = Database(username=user, password=password, host=host, port=port) 36 root_db = db_name 37 return _save_and_install( 38 db_model, db_name, root_db, nickname, False, sample_data 39 ) 40 41 42 def create_connection_with_new_user( 43 connection, user, password, nickname, db_name, create_db, sample_data 44 ): 45 db_model = copy_connection_from_preexisting( 46 connection, nickname, db_name, create_db, [] 47 ) 48 engine = db_model._sa_engine 49 db_model.username = user 50 db_model.password = password 51 db_model.save() 52 dbconn.execute_msar_func_with_engine( 53 engine, 54 'create_basic_mathesar_user', 55 db_model.username, 56 db_model.password 57 ) 58 _load_sample_data(db_model._sa_engine, sample_data) 59 return db_model 60 61 62 def _save_and_install( 63 db_model, db_name, root_db, nickname, create_db, sample_data 64 ): 65 db_model.name = nickname 66 db_model.db_name = db_name 67 _validate_db_model(db_model) 68 db_model.save() 69 try: 70 install.install_mathesar( 71 database_name=db_model.db_name, 72 username=db_model.username, 73 password=db_model.password, 74 hostname=db_model.host, 75 port=db_model.port, 76 skip_confirm=True, 77 create_db=create_db, 78 root_db=root_db, 79 ) 80 except OperationalError as e: 81 db_model.delete() 82 raise e 83 _load_sample_data(db_model._sa_engine, sample_data) 84 return db_model 85 86 87 def _load_sample_data(engine, sample_data): 88 DATASET_MAP = { 89 'library_management': load_library_dataset, 90 'movie_collection': load_movies_dataset, 91 } 92 for key in sample_data: 93 try: 94 DATASET_MAP[key](engine, safe_mode=True) 95 except DuplicateSchema: 96 # We swallow this error, since otherwise we'll raise an error on the 97 # front end even though installation generally succeeded. 
98 continue 99 reset_reflection() 100 101 102 def _validate_db_model(db_model): 103 internal_db_model = Database.create_from_settings_key('default') 104 if ( 105 internal_db_model is not None 106 and db_model.host == internal_db_model.host 107 and db_model.port == internal_db_model.port 108 and db_model.db_name == internal_db_model.db_name 109 ): 110 raise BadInstallationTarget( 111 "Mathesar can't be installed in the internal DB namespace" 112 ) 113 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/mathesar/utils/connections.py b/mathesar/utils/connections.py --- a/mathesar/utils/connections.py +++ b/mathesar/utils/connections.py @@ -1,6 +1,6 @@ """Utilities to help with creating and managing connections in Mathesar.""" from psycopg2.errors import DuplicateSchema -from sqlalchemy.exc import OperationalError +from sqlalchemy.exc import OperationalError, ProgrammingError from mathesar.models.base import Database from db import install, connection as dbconn from mathesar.state import reset_reflection @@ -92,10 +92,11 @@ for key in sample_data: try: DATASET_MAP[key](engine, safe_mode=True) - except DuplicateSchema: - # We swallow this error, since otherwise we'll raise an error on the - # front end even though installation generally succeeded. - continue + except ProgrammingError as e: + if isinstance(e.orig, DuplicateSchema): + # We swallow this error, since otherwise we'll raise an error on the + # front end even though installation generally succeeded. + continue reset_reflection()
{"golden_diff": "diff --git a/mathesar/utils/connections.py b/mathesar/utils/connections.py\n--- a/mathesar/utils/connections.py\n+++ b/mathesar/utils/connections.py\n@@ -1,6 +1,6 @@\n \"\"\"Utilities to help with creating and managing connections in Mathesar.\"\"\"\n from psycopg2.errors import DuplicateSchema\n-from sqlalchemy.exc import OperationalError\n+from sqlalchemy.exc import OperationalError, ProgrammingError\n from mathesar.models.base import Database\n from db import install, connection as dbconn\n from mathesar.state import reset_reflection\n@@ -92,10 +92,11 @@\n for key in sample_data:\n try:\n DATASET_MAP[key](engine, safe_mode=True)\n- except DuplicateSchema:\n- # We swallow this error, since otherwise we'll raise an error on the\n- # front end even though installation generally succeeded.\n- continue\n+ except ProgrammingError as e:\n+ if isinstance(e.orig, DuplicateSchema):\n+ # We swallow this error, since otherwise we'll raise an error on the\n+ # front end even though installation generally succeeded.\n+ continue\n reset_reflection()\n", "issue": "New connection creation flow does not handle schema creation failure scenarios\n## Description\r\n1. Create a database manually, say `test_db`, and create a schema called `Library Management` in it.\r\n1. In Mathesar, create a new connection to the database.\r\n1. While creating the connection, in the \"New PostgreSQL Database Connection\" modal, check the \"Library Management\" schema in the \"Schemas to install\" field.\r\n ![Screenshot 2024-01-26 at 1 01 21\u202fPM](https://github.com/mathesar-foundation/mathesar/assets/11850603/ec5c109b-d779-4d56-96a0-1c209e3deb3d)\r\n1. Click on 'Add Connection'.\r\n1. An error is thrown:\r\n - `\"(psycopg2.errors.DuplicateSchema) schema \\\"Library Management\\\" already exists\\n\\n[SQL: CREATE SCHEMA \\\"Library Management\\\";]\\n(Background on this error at: https://sqlalche.me/e/14/f405)\"`.\r\n - This is expected.\r\n1. The modal remains open. Uncheck the \"Library Management\" schema, and try clicking on 'Add Connection' again.\r\n1. Another error is thrown:\r\n - `duplicate key value violates unique constraint \"mathesar_database_name_key\" DETAIL: Key (name)=(pavish_install_test_1) already exists.`\r\n - From the user perpective, there's no clear next step for them to take when this happens.\r\n1. Close the modal and refresh the page, notice that the connection has been created already.\r\n1. Open the connection, and notice that it shows `Schemas (0)`.\r\n1. Forcing a manual reflection using the \"Sync external changes\" button shows the public schema and existing tables.\r\n\r\n### Note:\r\n* I performed the above steps 1-3 only as a means to force an error in the schema creation flow. While testing, I encountered an error which occurred due to an unknown reason. 
\r\n* This issue primarily focuses on steps 4-10.\r\n\r\n## Expected behavior\r\n* When an error occurs, there should not be an inconsistent middle state in the DB vs the UI.\r\n* When schema creation fails, we should try to revert back to the original state, or we should make it extra clear on the UI that it's a multi-step process.\n", "before_files": [{"content": "\"\"\"Utilities to help with creating and managing connections in Mathesar.\"\"\"\nfrom psycopg2.errors import DuplicateSchema\nfrom sqlalchemy.exc import OperationalError\nfrom mathesar.models.base import Database\nfrom db import install, connection as dbconn\nfrom mathesar.state import reset_reflection\nfrom demo.install.library_dataset import load_library_dataset\nfrom demo.install.movies_dataset import load_movies_dataset\n\n\nclass BadInstallationTarget(Exception):\n \"\"\"Raise when an attempt is made to install on a disallowed target\"\"\"\n pass\n\n\ndef copy_connection_from_preexisting(\n connection, nickname, db_name, create_db, sample_data\n):\n if connection['connection_type'] == 'internal_database':\n db_model = Database.create_from_settings_key('default')\n elif connection['connection_type'] == 'user_database':\n db_model = Database.current_objects.get(id=connection['id'])\n db_model.id = None\n else:\n raise KeyError(\"connection_type\")\n root_db = db_model.db_name\n return _save_and_install(\n db_model, db_name, root_db, nickname, create_db, sample_data\n )\n\n\ndef create_connection_from_scratch(\n user, password, host, port, nickname, db_name, sample_data\n):\n db_model = Database(username=user, password=password, host=host, port=port)\n root_db = db_name\n return _save_and_install(\n db_model, db_name, root_db, nickname, False, sample_data\n )\n\n\ndef create_connection_with_new_user(\n connection, user, password, nickname, db_name, create_db, sample_data\n):\n db_model = copy_connection_from_preexisting(\n connection, nickname, db_name, create_db, []\n )\n engine = db_model._sa_engine\n db_model.username = user\n db_model.password = password\n db_model.save()\n dbconn.execute_msar_func_with_engine(\n engine,\n 'create_basic_mathesar_user',\n db_model.username,\n db_model.password\n )\n _load_sample_data(db_model._sa_engine, sample_data)\n return db_model\n\n\ndef _save_and_install(\n db_model, db_name, root_db, nickname, create_db, sample_data\n):\n db_model.name = nickname\n db_model.db_name = db_name\n _validate_db_model(db_model)\n db_model.save()\n try:\n install.install_mathesar(\n database_name=db_model.db_name,\n username=db_model.username,\n password=db_model.password,\n hostname=db_model.host,\n port=db_model.port,\n skip_confirm=True,\n create_db=create_db,\n root_db=root_db,\n )\n except OperationalError as e:\n db_model.delete()\n raise e\n _load_sample_data(db_model._sa_engine, sample_data)\n return db_model\n\n\ndef _load_sample_data(engine, sample_data):\n DATASET_MAP = {\n 'library_management': load_library_dataset,\n 'movie_collection': load_movies_dataset,\n }\n for key in sample_data:\n try:\n DATASET_MAP[key](engine, safe_mode=True)\n except DuplicateSchema:\n # We swallow this error, since otherwise we'll raise an error on the\n # front end even though installation generally succeeded.\n continue\n reset_reflection()\n\n\ndef _validate_db_model(db_model):\n internal_db_model = Database.create_from_settings_key('default')\n if (\n internal_db_model is not None\n and db_model.host == internal_db_model.host\n and db_model.port == internal_db_model.port\n and db_model.db_name == 
internal_db_model.db_name\n ):\n raise BadInstallationTarget(\n \"Mathesar can't be installed in the internal DB namespace\"\n )\n", "path": "mathesar/utils/connections.py"}], "after_files": [{"content": "\"\"\"Utilities to help with creating and managing connections in Mathesar.\"\"\"\nfrom psycopg2.errors import DuplicateSchema\nfrom sqlalchemy.exc import OperationalError, ProgrammingError\nfrom mathesar.models.base import Database\nfrom db import install, connection as dbconn\nfrom mathesar.state import reset_reflection\nfrom demo.install.library_dataset import load_library_dataset\nfrom demo.install.movies_dataset import load_movies_dataset\n\n\nclass BadInstallationTarget(Exception):\n \"\"\"Raise when an attempt is made to install on a disallowed target\"\"\"\n pass\n\n\ndef copy_connection_from_preexisting(\n connection, nickname, db_name, create_db, sample_data\n):\n if connection['connection_type'] == 'internal_database':\n db_model = Database.create_from_settings_key('default')\n elif connection['connection_type'] == 'user_database':\n db_model = Database.current_objects.get(id=connection['id'])\n db_model.id = None\n else:\n raise KeyError(\"connection_type\")\n root_db = db_model.db_name\n return _save_and_install(\n db_model, db_name, root_db, nickname, create_db, sample_data\n )\n\n\ndef create_connection_from_scratch(\n user, password, host, port, nickname, db_name, sample_data\n):\n db_model = Database(username=user, password=password, host=host, port=port)\n root_db = db_name\n return _save_and_install(\n db_model, db_name, root_db, nickname, False, sample_data\n )\n\n\ndef create_connection_with_new_user(\n connection, user, password, nickname, db_name, create_db, sample_data\n):\n db_model = copy_connection_from_preexisting(\n connection, nickname, db_name, create_db, []\n )\n engine = db_model._sa_engine\n db_model.username = user\n db_model.password = password\n db_model.save()\n dbconn.execute_msar_func_with_engine(\n engine,\n 'create_basic_mathesar_user',\n db_model.username,\n db_model.password\n )\n _load_sample_data(db_model._sa_engine, sample_data)\n return db_model\n\n\ndef _save_and_install(\n db_model, db_name, root_db, nickname, create_db, sample_data\n):\n db_model.name = nickname\n db_model.db_name = db_name\n _validate_db_model(db_model)\n db_model.save()\n try:\n install.install_mathesar(\n database_name=db_model.db_name,\n username=db_model.username,\n password=db_model.password,\n hostname=db_model.host,\n port=db_model.port,\n skip_confirm=True,\n create_db=create_db,\n root_db=root_db,\n )\n except OperationalError as e:\n db_model.delete()\n raise e\n _load_sample_data(db_model._sa_engine, sample_data)\n return db_model\n\n\ndef _load_sample_data(engine, sample_data):\n DATASET_MAP = {\n 'library_management': load_library_dataset,\n 'movie_collection': load_movies_dataset,\n }\n for key in sample_data:\n try:\n DATASET_MAP[key](engine, safe_mode=True)\n except ProgrammingError as e:\n if isinstance(e.orig, DuplicateSchema):\n # We swallow this error, since otherwise we'll raise an error on the\n # front end even though installation generally succeeded.\n continue\n reset_reflection()\n\n\ndef _validate_db_model(db_model):\n internal_db_model = Database.create_from_settings_key('default')\n if (\n internal_db_model is not None\n and db_model.host == internal_db_model.host\n and db_model.port == internal_db_model.port\n and db_model.db_name == internal_db_model.db_name\n ):\n raise BadInstallationTarget(\n \"Mathesar can't be installed in the internal 
DB namespace\"\n )\n", "path": "mathesar/utils/connections.py"}]}
num_tokens: 1,788
num_tokens_diff: 243
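The golden diff above hinges on how SQLAlchemy surfaces driver errors: `psycopg2.errors.DuplicateSchema` is never raised directly, it arrives wrapped in `sqlalchemy.exc.ProgrammingError` with the driver exception attached as `.orig`, which is why the original `except DuplicateSchema` clause never fired. A minimal sketch of that unwrapping pattern in isolation — the connection URL and schema name here are placeholders, not Mathesar code:

```python
# Minimal sketch of the error-unwrapping pattern from the patch, run outside
# Mathesar. SQLAlchemy wraps DBAPI exceptions in ProgrammingError and keeps
# the original psycopg2 exception on `.orig`, so a bare
# `except DuplicateSchema` never matches.
from psycopg2.errors import DuplicateSchema
from sqlalchemy import create_engine, text
from sqlalchemy.exc import ProgrammingError

# Placeholder URL; assumes a reachable PostgreSQL instance.
engine = create_engine("postgresql+psycopg2://user:secret@localhost/test_db")

def create_schema_if_absent(name: str) -> None:
    try:
        with engine.begin() as conn:  # begin() commits on success
            conn.execute(text(f'CREATE SCHEMA "{name}";'))
    except ProgrammingError as e:
        # The driver exception rides along as e.orig; unwrap before checking.
        if not isinstance(e.orig, DuplicateSchema):
            raise  # anything other than "schema already exists" still surfaces
```

The golden diff applies exactly this `e.orig` check inside `_load_sample_data`, so a pre-existing sample-data schema no longer aborts the whole connection flow.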
problem_id: gh_patches_debug_25965
source: rasdani/github-patches
task_type: git_diff
in_source_id: facebookresearch__fairseq-4808
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [fairseq] Guard call to `shape_as_tensor` with `is_in_onnx_export()` This is a no-op in eager and in ONNX export, but it's better for other tracers if this is preserved as shapes directly instead of converted to a tensor. There is a little annoying code duplication with `torch.jit.is_scripting()`, which is unforunately necessary because we didn't implement compile-time short circuiting correctly in TorchScript lol. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `fairseq/modules/sinusoidal_positional_embedding.py` Content: ``` 1 # Copyright (c) Facebook, Inc. and its affiliates. 2 # 3 # This source code is licensed under the MIT license found in the 4 # LICENSE file in the root directory of this source tree. 5 6 import math 7 from typing import Any, Optional 8 9 import torch 10 import torch.onnx.operators 11 from fairseq import utils 12 from torch import Tensor, nn 13 14 15 class SinusoidalPositionalEmbedding(nn.Module): 16 """This module produces sinusoidal positional embeddings of any length. 17 18 Padding symbols are ignored. 19 """ 20 21 def __init__(self, embedding_dim, padding_idx, init_size=1024): 22 super().__init__() 23 self.embedding_dim = embedding_dim 24 self.padding_idx = padding_idx if padding_idx is not None else 0 25 self.weights = SinusoidalPositionalEmbedding.get_embedding( 26 init_size, embedding_dim, padding_idx 27 ) 28 self.onnx_trace = False 29 self.register_buffer("_float_tensor", torch.FloatTensor(1)) 30 self.max_positions = int(1e5) 31 32 def prepare_for_onnx_export_(self): 33 self.onnx_trace = True 34 35 @staticmethod 36 def get_embedding( 37 num_embeddings: int, embedding_dim: int, padding_idx: Optional[int] = None 38 ): 39 """Build sinusoidal embeddings. 40 41 This matches the implementation in tensor2tensor, but differs slightly 42 from the description in Section 3.5 of "Attention Is All You Need". 
43 """ 44 half_dim = embedding_dim // 2 45 emb = math.log(10000) / (half_dim - 1) 46 emb = torch.exp(torch.arange(half_dim, dtype=torch.float) * -emb) 47 emb = torch.arange(num_embeddings, dtype=torch.float).unsqueeze( 48 1 49 ) * emb.unsqueeze(0) 50 emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1).view( 51 num_embeddings, -1 52 ) 53 if embedding_dim % 2 == 1: 54 # zero pad 55 emb = torch.cat([emb, torch.zeros(num_embeddings, 1)], dim=1) 56 if padding_idx is not None: 57 emb[padding_idx, :] = 0 58 return emb 59 60 def forward( 61 self, 62 input, 63 incremental_state: Optional[Any] = None, 64 timestep: Optional[Tensor] = None, 65 positions: Optional[Any] = None, 66 ): 67 """Input is expected to be of size [bsz x seqlen].""" 68 if torch.jit.is_scripting(): 69 bspair = torch.onnx.operators.shape_as_tensor(input) 70 elif torch.onnx.is_in_onnx_export(): 71 bspair = torch.onnx.operators.shape_as_tensor(input) 72 else: 73 bspair = input.size() 74 bsz, seq_len = bspair[0], bspair[1] 75 max_pos = self.padding_idx + 1 + seq_len 76 if self.weights is None or max_pos > self.weights.size(0): 77 # recompute/expand embeddings if needed 78 self.weights = SinusoidalPositionalEmbedding.get_embedding( 79 max_pos, self.embedding_dim, self.padding_idx 80 ) 81 self.weights = self.weights.to(self._float_tensor) 82 83 if incremental_state is not None: 84 # positions is the same for every token when decoding a single step 85 pos = timestep.view(-1)[0] + 1 if timestep is not None else seq_len 86 if self.onnx_trace: 87 return ( 88 self.weights.index_select(index=self.padding_idx + pos, dim=0) 89 .unsqueeze(1) 90 .repeat(bsz, 1, 1) 91 ) 92 return self.weights[self.padding_idx + pos, :].expand(bsz, 1, -1) 93 94 positions = utils.make_positions( 95 input, self.padding_idx, onnx_trace=self.onnx_trace 96 ) 97 if self.onnx_trace: 98 flat_embeddings = self.weights.detach().index_select(0, positions.view(-1)) 99 embedding_shape = torch.cat( 100 (bsz, seq_len, torch.tensor([-1], dtype=torch.long)) 101 ) 102 embeddings = torch.onnx.operators.reshape_from_tensor_shape( 103 flat_embeddings, embedding_shape 104 ) 105 return embeddings 106 return ( 107 self.weights.index_select(0, positions.view(-1)) 108 .view(bsz, seq_len, -1) 109 .detach() 110 ) 111 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/fairseq/modules/sinusoidal_positional_embedding.py b/fairseq/modules/sinusoidal_positional_embedding.py --- a/fairseq/modules/sinusoidal_positional_embedding.py +++ b/fairseq/modules/sinusoidal_positional_embedding.py @@ -65,12 +65,7 @@ positions: Optional[Any] = None, ): """Input is expected to be of size [bsz x seqlen].""" - if torch.jit.is_scripting(): - bspair = torch.onnx.operators.shape_as_tensor(input) - elif torch.onnx.is_in_onnx_export(): - bspair = torch.onnx.operators.shape_as_tensor(input) - else: - bspair = input.size() + bspair = torch.onnx.operators.shape_as_tensor(input) bsz, seq_len = bspair[0], bspair[1] max_pos = self.padding_idx + 1 + seq_len if self.weights is None or max_pos > self.weights.size(0): @@ -97,7 +92,7 @@ if self.onnx_trace: flat_embeddings = self.weights.detach().index_select(0, positions.view(-1)) embedding_shape = torch.cat( - (bsz, seq_len, torch.tensor([-1], dtype=torch.long)) + (bsz.view(1), seq_len.view(1), torch.tensor([-1], dtype=torch.long)) ) embeddings = torch.onnx.operators.reshape_from_tensor_shape( flat_embeddings, embedding_shape
{"golden_diff": "diff --git a/fairseq/modules/sinusoidal_positional_embedding.py b/fairseq/modules/sinusoidal_positional_embedding.py\n--- a/fairseq/modules/sinusoidal_positional_embedding.py\n+++ b/fairseq/modules/sinusoidal_positional_embedding.py\n@@ -65,12 +65,7 @@\n positions: Optional[Any] = None,\n ):\n \"\"\"Input is expected to be of size [bsz x seqlen].\"\"\"\n- if torch.jit.is_scripting():\n- bspair = torch.onnx.operators.shape_as_tensor(input)\n- elif torch.onnx.is_in_onnx_export():\n- bspair = torch.onnx.operators.shape_as_tensor(input)\n- else:\n- bspair = input.size()\n+ bspair = torch.onnx.operators.shape_as_tensor(input)\n bsz, seq_len = bspair[0], bspair[1]\n max_pos = self.padding_idx + 1 + seq_len\n if self.weights is None or max_pos > self.weights.size(0):\n@@ -97,7 +92,7 @@\n if self.onnx_trace:\n flat_embeddings = self.weights.detach().index_select(0, positions.view(-1))\n embedding_shape = torch.cat(\n- (bsz, seq_len, torch.tensor([-1], dtype=torch.long))\n+ (bsz.view(1), seq_len.view(1), torch.tensor([-1], dtype=torch.long))\n )\n embeddings = torch.onnx.operators.reshape_from_tensor_shape(\n flat_embeddings, embedding_shape\n", "issue": "[fairseq] Guard call to `shape_as_tensor` with `is_in_onnx_export()`\nThis is a no-op in eager and in ONNX export, but it's better for other\ntracers if this is preserved as shapes directly instead of converted to\na tensor.\n\nThere is a little annoying code duplication with\n`torch.jit.is_scripting()`, which is unforunately necessary because we\ndidn't implement compile-time short circuiting correctly in TorchScript\nlol.\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\nimport math\nfrom typing import Any, Optional\n\nimport torch\nimport torch.onnx.operators\nfrom fairseq import utils\nfrom torch import Tensor, nn\n\n\nclass SinusoidalPositionalEmbedding(nn.Module):\n \"\"\"This module produces sinusoidal positional embeddings of any length.\n\n Padding symbols are ignored.\n \"\"\"\n\n def __init__(self, embedding_dim, padding_idx, init_size=1024):\n super().__init__()\n self.embedding_dim = embedding_dim\n self.padding_idx = padding_idx if padding_idx is not None else 0\n self.weights = SinusoidalPositionalEmbedding.get_embedding(\n init_size, embedding_dim, padding_idx\n )\n self.onnx_trace = False\n self.register_buffer(\"_float_tensor\", torch.FloatTensor(1))\n self.max_positions = int(1e5)\n\n def prepare_for_onnx_export_(self):\n self.onnx_trace = True\n\n @staticmethod\n def get_embedding(\n num_embeddings: int, embedding_dim: int, padding_idx: Optional[int] = None\n ):\n \"\"\"Build sinusoidal embeddings.\n\n This matches the implementation in tensor2tensor, but differs slightly\n from the description in Section 3.5 of \"Attention Is All You Need\".\n \"\"\"\n half_dim = embedding_dim // 2\n emb = math.log(10000) / (half_dim - 1)\n emb = torch.exp(torch.arange(half_dim, dtype=torch.float) * -emb)\n emb = torch.arange(num_embeddings, dtype=torch.float).unsqueeze(\n 1\n ) * emb.unsqueeze(0)\n emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1).view(\n num_embeddings, -1\n )\n if embedding_dim % 2 == 1:\n # zero pad\n emb = torch.cat([emb, torch.zeros(num_embeddings, 1)], dim=1)\n if padding_idx is not None:\n emb[padding_idx, :] = 0\n return emb\n\n def forward(\n self,\n input,\n incremental_state: Optional[Any] = None,\n timestep: Optional[Tensor] 
= None,\n positions: Optional[Any] = None,\n ):\n \"\"\"Input is expected to be of size [bsz x seqlen].\"\"\"\n if torch.jit.is_scripting():\n bspair = torch.onnx.operators.shape_as_tensor(input)\n elif torch.onnx.is_in_onnx_export():\n bspair = torch.onnx.operators.shape_as_tensor(input)\n else:\n bspair = input.size()\n bsz, seq_len = bspair[0], bspair[1]\n max_pos = self.padding_idx + 1 + seq_len\n if self.weights is None or max_pos > self.weights.size(0):\n # recompute/expand embeddings if needed\n self.weights = SinusoidalPositionalEmbedding.get_embedding(\n max_pos, self.embedding_dim, self.padding_idx\n )\n self.weights = self.weights.to(self._float_tensor)\n\n if incremental_state is not None:\n # positions is the same for every token when decoding a single step\n pos = timestep.view(-1)[0] + 1 if timestep is not None else seq_len\n if self.onnx_trace:\n return (\n self.weights.index_select(index=self.padding_idx + pos, dim=0)\n .unsqueeze(1)\n .repeat(bsz, 1, 1)\n )\n return self.weights[self.padding_idx + pos, :].expand(bsz, 1, -1)\n\n positions = utils.make_positions(\n input, self.padding_idx, onnx_trace=self.onnx_trace\n )\n if self.onnx_trace:\n flat_embeddings = self.weights.detach().index_select(0, positions.view(-1))\n embedding_shape = torch.cat(\n (bsz, seq_len, torch.tensor([-1], dtype=torch.long))\n )\n embeddings = torch.onnx.operators.reshape_from_tensor_shape(\n flat_embeddings, embedding_shape\n )\n return embeddings\n return (\n self.weights.index_select(0, positions.view(-1))\n .view(bsz, seq_len, -1)\n .detach()\n )\n", "path": "fairseq/modules/sinusoidal_positional_embedding.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\nimport math\nfrom typing import Any, Optional\n\nimport torch\nimport torch.onnx.operators\nfrom fairseq import utils\nfrom torch import Tensor, nn\n\n\nclass SinusoidalPositionalEmbedding(nn.Module):\n \"\"\"This module produces sinusoidal positional embeddings of any length.\n\n Padding symbols are ignored.\n \"\"\"\n\n def __init__(self, embedding_dim, padding_idx, init_size=1024):\n super().__init__()\n self.embedding_dim = embedding_dim\n self.padding_idx = padding_idx if padding_idx is not None else 0\n self.weights = SinusoidalPositionalEmbedding.get_embedding(\n init_size, embedding_dim, padding_idx\n )\n self.onnx_trace = False\n self.register_buffer(\"_float_tensor\", torch.FloatTensor(1))\n self.max_positions = int(1e5)\n\n def prepare_for_onnx_export_(self):\n self.onnx_trace = True\n\n @staticmethod\n def get_embedding(\n num_embeddings: int, embedding_dim: int, padding_idx: Optional[int] = None\n ):\n \"\"\"Build sinusoidal embeddings.\n\n This matches the implementation in tensor2tensor, but differs slightly\n from the description in Section 3.5 of \"Attention Is All You Need\".\n \"\"\"\n half_dim = embedding_dim // 2\n emb = math.log(10000) / (half_dim - 1)\n emb = torch.exp(torch.arange(half_dim, dtype=torch.float) * -emb)\n emb = torch.arange(num_embeddings, dtype=torch.float).unsqueeze(\n 1\n ) * emb.unsqueeze(0)\n emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1).view(\n num_embeddings, -1\n )\n if embedding_dim % 2 == 1:\n # zero pad\n emb = torch.cat([emb, torch.zeros(num_embeddings, 1)], dim=1)\n if padding_idx is not None:\n emb[padding_idx, :] = 0\n return emb\n\n def forward(\n self,\n input,\n incremental_state: Optional[Any] = None,\n 
timestep: Optional[Tensor] = None,\n positions: Optional[Any] = None,\n ):\n \"\"\"Input is expected to be of size [bsz x seqlen].\"\"\"\n bspair = torch.onnx.operators.shape_as_tensor(input)\n bsz, seq_len = bspair[0], bspair[1]\n max_pos = self.padding_idx + 1 + seq_len\n if self.weights is None or max_pos > self.weights.size(0):\n # recompute/expand embeddings if needed\n self.weights = SinusoidalPositionalEmbedding.get_embedding(\n max_pos, self.embedding_dim, self.padding_idx\n )\n self.weights = self.weights.to(self._float_tensor)\n\n if incremental_state is not None:\n # positions is the same for every token when decoding a single step\n pos = timestep.view(-1)[0] + 1 if timestep is not None else seq_len\n if self.onnx_trace:\n return (\n self.weights.index_select(index=self.padding_idx + pos, dim=0)\n .unsqueeze(1)\n .repeat(bsz, 1, 1)\n )\n return self.weights[self.padding_idx + pos, :].expand(bsz, 1, -1)\n\n positions = utils.make_positions(\n input, self.padding_idx, onnx_trace=self.onnx_trace\n )\n if self.onnx_trace:\n flat_embeddings = self.weights.detach().index_select(0, positions.view(-1))\n embedding_shape = torch.cat(\n (bsz.view(1), seq_len.view(1), torch.tensor([-1], dtype=torch.long))\n )\n embeddings = torch.onnx.operators.reshape_from_tensor_shape(\n flat_embeddings, embedding_shape\n )\n return embeddings\n return (\n self.weights.index_select(0, positions.view(-1))\n .view(bsz, seq_len, -1)\n .detach()\n )\n", "path": "fairseq/modules/sinusoidal_positional_embedding.py"}]}
num_tokens: 1,537
num_tokens_diff: 338
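The second hunk of the diff above is the subtle one: once `bspair` always comes from `shape_as_tensor`, indexing it in eager mode yields zero-dimensional tensors, and `torch.cat` rejects those — hence the added `.view(1)` calls. A small runnable sketch reproducing that behavior with an arbitrary stand-in input:

```python
# Stand-in input for [bsz x seqlen]; the values are arbitrary.
import torch
import torch.onnx.operators

x = torch.zeros(2, 5, dtype=torch.long)
bspair = torch.onnx.operators.shape_as_tensor(x)  # 1-D LongTensor([2, 5]) in eager mode
bsz, seq_len = bspair[0], bspair[1]               # indexing yields 0-dim tensors

# torch.cat refuses zero-dimensional tensors, so the patch reshapes them
# to 1-D before concatenating, exactly as in the second hunk:
shape = torch.cat((bsz.view(1), seq_len.view(1), torch.tensor([-1], dtype=torch.long)))
print(shape)  # tensor([ 2,  5, -1])
```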
problem_id: gh_patches_debug_14196
source: rasdani/github-patches
task_type: git_diff
in_source_id: mozilla__pontoon-2826
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Pretranslation issues with multiline strings Pretranslation struggles to translate Fluent strings where line breaks are used to limit the line width, as it treats each line as a separate sentence. ``` rec-pw-1-2 = Make this password unique and different from any others you use. A good strategy to follow is to combine two or more unrelated words to create an entire pass phrase, and include numbers and symbols. ``` I don't think we can completely ignore line breaks, because there are some cases where these have an effect on display ([example](https://searchfox.org/mozilla-central/rev/169bf38e150667afac81ab73ef8b5ace8f1dfa8d/browser/locales/en-US/browser/downloads.ftl#244-248)). But maybe we can strip line breaks (i.e. replace them with a whitespace) when sending the translation to the machine translation engine, hoping that translators will catch edge cases when reviewing? Cc @eemeli because that's a topic that we discussed at some point for the Fluent editor. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `pontoon/pretranslation/pretranslate.py` Content: ``` 1 import logging 2 import operator 3 import re 4 5 from django.db.models import CharField, Value as V 6 from django.db.models.functions import Concat 7 8 from fluent.syntax import FluentParser, FluentSerializer 9 from functools import reduce 10 11 from pontoon.base.models import User, TranslatedResource 12 from pontoon.base.fluent import FlatTransformer, create_locale_plural_variants 13 from pontoon.machinery.utils import ( 14 get_google_translate_data, 15 get_translation_memory_data, 16 ) 17 18 19 log = logging.getLogger(__name__) 20 21 parser = FluentParser() 22 serializer = FluentSerializer() 23 24 25 class PretranslationTransformer(FlatTransformer): 26 def __init__(self, locale): 27 self.services = [] 28 self.locale = locale 29 30 def visit_SelectExpression(self, node): 31 create_locale_plural_variants(node, self.locale) 32 return self.generic_visit(node) 33 34 def visit_TextElement(self, node): 35 pretranslation, service = get_pretranslated_data(node.value, self.locale) 36 37 if pretranslation is None: 38 raise ValueError( 39 f"Pretranslation for `{node.value}` to {self.locale.code} not available." 40 ) 41 42 node.value = pretranslation 43 self.services.append(service) 44 return node 45 46 47 def get_pretranslations(entity, locale): 48 """ 49 Get pretranslations for the entity-locale pair using internal translation memory and 50 Google's machine translation. 51 52 For Fluent strings, uplift SelectExpressions, serialize Placeables as TextElements 53 and then only pretranslate TextElements. Set the most frequent TextElement 54 pretranslation author as the author of the entire pretranslation. 
55 56 :arg Entity entity: the Entity object 57 :arg Locale locale: the Locale object 58 59 :returns: a list of tuples, consisting of: 60 - a pretranslation of the entity 61 - a plural form 62 - a user (representing TM or GT service) 63 """ 64 source = entity.string 65 services = { 66 "tm": User.objects.get(email="[email protected]"), 67 "gt": User.objects.get(email="[email protected]"), 68 } 69 70 if entity.resource.format == "ftl": 71 source_ast = parser.parse_entry(source) 72 pt_transformer = PretranslationTransformer(locale) 73 74 try: 75 pretranslated_ast = pt_transformer.visit(source_ast) 76 except ValueError as e: 77 log.info(f"Fluent pretranslation error: {e}") 78 return [] 79 80 pretranslation = serializer.serialize_entry(pretranslated_ast) 81 82 authors = [services[service] for service in pt_transformer.services] 83 author = max(set(authors), key=authors.count) if authors else services["tm"] 84 85 return [(pretranslation, None, author)] 86 87 else: 88 pretranslation, service = get_pretranslated_data(source, locale) 89 90 if pretranslation is None: 91 return [] 92 93 author = services[service] 94 if entity.string_plural == "": 95 return [(pretranslation, None, author)] 96 else: 97 plural_forms = range(0, locale.nplurals or 1) 98 return [ 99 (pretranslation, plural_form, author) for plural_form in plural_forms 100 ] 101 102 103 def get_pretranslated_data(source, locale): 104 # Empty strings do not need translation 105 if re.search("^\\s*$", source): 106 return source, "tm" 107 108 # Try to get matches from Translation Memory 109 tm_response = get_translation_memory_data(text=source, locale=locale) 110 tm_perfect = [t for t in tm_response if int(t["quality"]) == 100] 111 if tm_perfect: 112 return tm_perfect[0]["target"], "tm" 113 114 # Fetch from Google Translate 115 elif locale.google_translate_code: 116 gt_response = get_google_translate_data(text=source, locale=locale) 117 if gt_response["status"]: 118 return gt_response["translation"], "gt" 119 120 return None, None 121 122 123 def update_changed_instances(tr_filter, tr_dict, translations): 124 """ 125 Update the latest activity and stats for changed Locales, ProjectLocales 126 & TranslatedResources 127 """ 128 tr_filter = tuple(tr_filter) 129 # Combine all generated filters with an OK operator. 130 # `operator.ior` is the '|' Python operator, which turns into a logical OR 131 # when used between django ORM query objects. 132 tr_query = reduce(operator.ior, tr_filter) 133 134 translatedresources = TranslatedResource.objects.filter(tr_query).annotate( 135 locale_resource=Concat( 136 "locale_id", V("-"), "resource_id", output_field=CharField() 137 ) 138 ) 139 140 translatedresources.update_stats() 141 142 for tr in translatedresources: 143 index = tr_dict[tr.locale_resource] 144 translation = translations[index] 145 translation.update_latest_translation() 146 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/pontoon/pretranslation/pretranslate.py b/pontoon/pretranslation/pretranslate.py --- a/pontoon/pretranslation/pretranslate.py +++ b/pontoon/pretranslation/pretranslate.py @@ -32,11 +32,15 @@ return self.generic_visit(node) def visit_TextElement(self, node): - pretranslation, service = get_pretranslated_data(node.value, self.locale) + # Machine translation treats each line as separate sentence, + # hence we replace newline characters with spaces. + source = node.value.replace("\n", " ") + + pretranslation, service = get_pretranslated_data(source, self.locale) if pretranslation is None: raise ValueError( - f"Pretranslation for `{node.value}` to {self.locale.code} not available." + f"Pretranslation for `{source}` to {self.locale.code} not available." ) node.value = pretranslation
{"golden_diff": "diff --git a/pontoon/pretranslation/pretranslate.py b/pontoon/pretranslation/pretranslate.py\n--- a/pontoon/pretranslation/pretranslate.py\n+++ b/pontoon/pretranslation/pretranslate.py\n@@ -32,11 +32,15 @@\n return self.generic_visit(node)\n \n def visit_TextElement(self, node):\n- pretranslation, service = get_pretranslated_data(node.value, self.locale)\n+ # Machine translation treats each line as separate sentence,\n+ # hence we replace newline characters with spaces.\n+ source = node.value.replace(\"\\n\", \" \")\n+\n+ pretranslation, service = get_pretranslated_data(source, self.locale)\n \n if pretranslation is None:\n raise ValueError(\n- f\"Pretranslation for `{node.value}` to {self.locale.code} not available.\"\n+ f\"Pretranslation for `{source}` to {self.locale.code} not available.\"\n )\n \n node.value = pretranslation\n", "issue": "Pretranslation issues with multiline strings\nPretranslation struggles to translate Fluent strings where line breaks are used to limit the line width, as it treats each line as a separate sentence.\r\n\r\n```\r\nrec-pw-1-2 =\r\n Make this password unique and different from any others you use.\r\n A good strategy to follow is to combine two or more unrelated\r\n words to create an entire pass phrase, and include numbers and symbols.\r\n```\r\n\r\nI don't think we can completely ignore line breaks, because there are some cases where these have an effect on display ([example](https://searchfox.org/mozilla-central/rev/169bf38e150667afac81ab73ef8b5ace8f1dfa8d/browser/locales/en-US/browser/downloads.ftl#244-248)). But maybe we can strip line breaks (i.e. replace them with a whitespace) when sending the translation to the machine translation engine, hoping that translators will catch edge cases when reviewing?\r\n\r\nCc @eemeli because that's a topic that we discussed at some point for the Fluent editor.\n", "before_files": [{"content": "import logging\nimport operator\nimport re\n\nfrom django.db.models import CharField, Value as V\nfrom django.db.models.functions import Concat\n\nfrom fluent.syntax import FluentParser, FluentSerializer\nfrom functools import reduce\n\nfrom pontoon.base.models import User, TranslatedResource\nfrom pontoon.base.fluent import FlatTransformer, create_locale_plural_variants\nfrom pontoon.machinery.utils import (\n get_google_translate_data,\n get_translation_memory_data,\n)\n\n\nlog = logging.getLogger(__name__)\n\nparser = FluentParser()\nserializer = FluentSerializer()\n\n\nclass PretranslationTransformer(FlatTransformer):\n def __init__(self, locale):\n self.services = []\n self.locale = locale\n\n def visit_SelectExpression(self, node):\n create_locale_plural_variants(node, self.locale)\n return self.generic_visit(node)\n\n def visit_TextElement(self, node):\n pretranslation, service = get_pretranslated_data(node.value, self.locale)\n\n if pretranslation is None:\n raise ValueError(\n f\"Pretranslation for `{node.value}` to {self.locale.code} not available.\"\n )\n\n node.value = pretranslation\n self.services.append(service)\n return node\n\n\ndef get_pretranslations(entity, locale):\n \"\"\"\n Get pretranslations for the entity-locale pair using internal translation memory and\n Google's machine translation.\n\n For Fluent strings, uplift SelectExpressions, serialize Placeables as TextElements\n and then only pretranslate TextElements. 
Set the most frequent TextElement\n pretranslation author as the author of the entire pretranslation.\n\n :arg Entity entity: the Entity object\n :arg Locale locale: the Locale object\n\n :returns: a list of tuples, consisting of:\n - a pretranslation of the entity\n - a plural form\n - a user (representing TM or GT service)\n \"\"\"\n source = entity.string\n services = {\n \"tm\": User.objects.get(email=\"[email protected]\"),\n \"gt\": User.objects.get(email=\"[email protected]\"),\n }\n\n if entity.resource.format == \"ftl\":\n source_ast = parser.parse_entry(source)\n pt_transformer = PretranslationTransformer(locale)\n\n try:\n pretranslated_ast = pt_transformer.visit(source_ast)\n except ValueError as e:\n log.info(f\"Fluent pretranslation error: {e}\")\n return []\n\n pretranslation = serializer.serialize_entry(pretranslated_ast)\n\n authors = [services[service] for service in pt_transformer.services]\n author = max(set(authors), key=authors.count) if authors else services[\"tm\"]\n\n return [(pretranslation, None, author)]\n\n else:\n pretranslation, service = get_pretranslated_data(source, locale)\n\n if pretranslation is None:\n return []\n\n author = services[service]\n if entity.string_plural == \"\":\n return [(pretranslation, None, author)]\n else:\n plural_forms = range(0, locale.nplurals or 1)\n return [\n (pretranslation, plural_form, author) for plural_form in plural_forms\n ]\n\n\ndef get_pretranslated_data(source, locale):\n # Empty strings do not need translation\n if re.search(\"^\\\\s*$\", source):\n return source, \"tm\"\n\n # Try to get matches from Translation Memory\n tm_response = get_translation_memory_data(text=source, locale=locale)\n tm_perfect = [t for t in tm_response if int(t[\"quality\"]) == 100]\n if tm_perfect:\n return tm_perfect[0][\"target\"], \"tm\"\n\n # Fetch from Google Translate\n elif locale.google_translate_code:\n gt_response = get_google_translate_data(text=source, locale=locale)\n if gt_response[\"status\"]:\n return gt_response[\"translation\"], \"gt\"\n\n return None, None\n\n\ndef update_changed_instances(tr_filter, tr_dict, translations):\n \"\"\"\n Update the latest activity and stats for changed Locales, ProjectLocales\n & TranslatedResources\n \"\"\"\n tr_filter = tuple(tr_filter)\n # Combine all generated filters with an OK operator.\n # `operator.ior` is the '|' Python operator, which turns into a logical OR\n # when used between django ORM query objects.\n tr_query = reduce(operator.ior, tr_filter)\n\n translatedresources = TranslatedResource.objects.filter(tr_query).annotate(\n locale_resource=Concat(\n \"locale_id\", V(\"-\"), \"resource_id\", output_field=CharField()\n )\n )\n\n translatedresources.update_stats()\n\n for tr in translatedresources:\n index = tr_dict[tr.locale_resource]\n translation = translations[index]\n translation.update_latest_translation()\n", "path": "pontoon/pretranslation/pretranslate.py"}], "after_files": [{"content": "import logging\nimport operator\nimport re\n\nfrom django.db.models import CharField, Value as V\nfrom django.db.models.functions import Concat\n\nfrom fluent.syntax import FluentParser, FluentSerializer\nfrom functools import reduce\n\nfrom pontoon.base.models import User, TranslatedResource\nfrom pontoon.base.fluent import FlatTransformer, create_locale_plural_variants\nfrom pontoon.machinery.utils import (\n get_google_translate_data,\n get_translation_memory_data,\n)\n\n\nlog = logging.getLogger(__name__)\n\nparser = FluentParser()\nserializer = FluentSerializer()\n\n\nclass 
PretranslationTransformer(FlatTransformer):\n def __init__(self, locale):\n self.services = []\n self.locale = locale\n\n def visit_SelectExpression(self, node):\n create_locale_plural_variants(node, self.locale)\n return self.generic_visit(node)\n\n def visit_TextElement(self, node):\n # Machine translation treats each line as separate sentence,\n # hence we replace newline characters with spaces.\n source = node.value.replace(\"\\n\", \" \")\n\n pretranslation, service = get_pretranslated_data(source, self.locale)\n\n if pretranslation is None:\n raise ValueError(\n f\"Pretranslation for `{source}` to {self.locale.code} not available.\"\n )\n\n node.value = pretranslation\n self.services.append(service)\n return node\n\n\ndef get_pretranslations(entity, locale):\n \"\"\"\n Get pretranslations for the entity-locale pair using internal translation memory and\n Google's machine translation.\n\n For Fluent strings, uplift SelectExpressions, serialize Placeables as TextElements\n and then only pretranslate TextElements. Set the most frequent TextElement\n pretranslation author as the author of the entire pretranslation.\n\n :arg Entity entity: the Entity object\n :arg Locale locale: the Locale object\n\n :returns: a list of tuples, consisting of:\n - a pretranslation of the entity\n - a plural form\n - a user (representing TM or GT service)\n \"\"\"\n source = entity.string\n services = {\n \"tm\": User.objects.get(email=\"[email protected]\"),\n \"gt\": User.objects.get(email=\"[email protected]\"),\n }\n\n if entity.resource.format == \"ftl\":\n source_ast = parser.parse_entry(source)\n pt_transformer = PretranslationTransformer(locale)\n\n try:\n pretranslated_ast = pt_transformer.visit(source_ast)\n except ValueError as e:\n log.info(f\"Fluent pretranslation error: {e}\")\n return []\n\n pretranslation = serializer.serialize_entry(pretranslated_ast)\n\n authors = [services[service] for service in pt_transformer.services]\n author = max(set(authors), key=authors.count) if authors else services[\"tm\"]\n\n return [(pretranslation, None, author)]\n\n else:\n pretranslation, service = get_pretranslated_data(source, locale)\n\n if pretranslation is None:\n return []\n\n author = services[service]\n if entity.string_plural == \"\":\n return [(pretranslation, None, author)]\n else:\n plural_forms = range(0, locale.nplurals or 1)\n return [\n (pretranslation, plural_form, author) for plural_form in plural_forms\n ]\n\n\ndef get_pretranslated_data(source, locale):\n # Empty strings do not need translation\n if re.search(\"^\\\\s*$\", source):\n return source, \"tm\"\n\n # Try to get matches from Translation Memory\n tm_response = get_translation_memory_data(text=source, locale=locale)\n tm_perfect = [t for t in tm_response if int(t[\"quality\"]) == 100]\n if tm_perfect:\n return tm_perfect[0][\"target\"], \"tm\"\n\n # Fetch from Google Translate\n elif locale.google_translate_code:\n gt_response = get_google_translate_data(text=source, locale=locale)\n if gt_response[\"status\"]:\n return gt_response[\"translation\"], \"gt\"\n\n return None, None\n\n\ndef update_changed_instances(tr_filter, tr_dict, translations):\n \"\"\"\n Update the latest activity and stats for changed Locales, ProjectLocales\n & TranslatedResources\n \"\"\"\n tr_filter = tuple(tr_filter)\n # Combine all generated filters with an OK operator.\n # `operator.ior` is the '|' Python operator, which turns into a logical OR\n # when used between django ORM query objects.\n tr_query = reduce(operator.ior, tr_filter)\n\n 
translatedresources = TranslatedResource.objects.filter(tr_query).annotate(\n locale_resource=Concat(\n \"locale_id\", V(\"-\"), \"resource_id\", output_field=CharField()\n )\n )\n\n translatedresources.update_stats()\n\n for tr in translatedresources:\n index = tr_dict[tr.locale_resource]\n translation = translations[index]\n translation.update_latest_translation()\n", "path": "pontoon/pretranslation/pretranslate.py"}]}
num_tokens: 1,834
num_tokens_diff: 205
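The fix above only rewrites the string handed to the translation backend; the Fluent AST keeps its structure, and `node.value` is later overwritten with the pretranslation. A minimal sketch of the same newline-flattening step using python-fluent directly, with the message body borrowed from the issue and the translation call itself left out — no Pontoon internals are involved:

```python
# Parse a hard-wrapped Fluent message, flatten the newlines inside its
# TextElements the way the patched visit_TextElement does before calling
# the translation service, and serialize the result back.
from fluent.syntax import FluentParser, FluentSerializer, ast

parser = FluentParser()
serializer = FluentSerializer()

entry = parser.parse_entry(
    "rec-pw-1-2 =\n"
    "    Make this password unique and different from any others you use.\n"
    "    A good strategy to follow is to combine two or more unrelated\n"
    "    words to create an entire pass phrase.\n"
)

for element in entry.value.elements:  # the message's Pattern elements
    if isinstance(element, ast.TextElement):
        # The parser keeps the line breaks; replace them with spaces so a
        # machine-translation engine sees whole sentences, not fragments.
        element.value = element.value.replace("\n", " ")

print(serializer.serialize_entry(entry))
```

Running this prints the message as a single-line value, which is the form the patch sends to the TM/Google lookup.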
problem_id: gh_patches_debug_7431
source: rasdani/github-patches
task_type: git_diff
in_source_id: plone__Products.CMFPlone-3501
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Allow TinyMCE to be used in inline-mode Inline-mode of TinyMCE (https://www.tiny.cloud/docs/demo/inline) is nice among other reasons because the edited text looks exactly the way as the saved result. To enable it you could simply set `{"inline": "true"}` as the value in the field "Other settings" of the TinyMCE controlpanel. That is enough to trigger it. But that fails with `Could not initialize inline editor on invalid inline target element` since the html-tag that renders the TinyMCE is a `<textarea>`. See https://www.tiny.cloud/blog/tinymce-inline-option-not-working for details about that constraint. The `<textarea>` thing is defined in `plone.app.z3cform.widget.RichTextWidget` which has `TextareaWidget` as `_base`. In `render_input_mode` of the widget it is possible to conditionally render a different html element like `div` around the text. But then fallback to textarea no longer works and saving also fails and and and... So it seems that using inline-mode in not straightforward. The question is do we like that option enough to do the required work? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `Products/CMFPlone/patterns/settings.py` Content: ``` 1 from Acquisition import aq_inner 2 from Acquisition import aq_parent 3 from borg.localrole.interfaces import IFactoryTempFolder 4 from plone.app.content.browser.interfaces import IFolderContentsView 5 from plone.app.widgets.utils import get_relateditems_options 6 from plone.app.z3cform.utils import call_callables 7 from plone.registry.interfaces import IRegistry 8 from plone.uuid.interfaces import IUUID 9 from Products.CMFCore.interfaces._content import IFolderish 10 from plone.base.interfaces import ILinkSchema 11 from plone.base.interfaces import IPatternsSettings 12 from plone.base.interfaces import IPloneSiteRoot 13 from Products.CMFPlone.patterns.tinymce import TinyMCESettingsGenerator 14 from Products.CMFPlone.utils import get_portal 15 from zope.component import getUtility 16 from zope.i18n import translate 17 from zope.interface import implementer 18 from zope.schema.interfaces import IVocabularyFactory 19 20 import json 21 22 23 @implementer(IPatternsSettings) 24 class PatternSettingsAdapter: 25 """ 26 Provides default plone settings relevant for patterns. 27 """ 28 29 def __init__(self, context, request, field): 30 self.request = request 31 self.context = context 32 self.field = field 33 34 def __call__(self): 35 data = {} 36 data.update(self.mark_special_links()) 37 data.update(self.structure_updater()) 38 return data 39 40 def structure_updater(self): 41 """Generate the options for the structure updater pattern. 42 If we're not in folder contents view, do not expose these options. 
43 """ 44 data = {} 45 view = self.request.get("PUBLISHED", None) 46 if IFolderContentsView.providedBy(view): 47 data = { 48 "data-pat-structureupdater": json.dumps( 49 { 50 "titleSelector": ".documentFirstHeading", 51 "descriptionSelector": ".documentDescription", 52 } 53 ) 54 } 55 return data 56 57 def mark_special_links(self): 58 result = {} 59 60 registry = getUtility(IRegistry) 61 settings = registry.forInterface(ILinkSchema, prefix="plone", check=False) 62 63 msl = settings.mark_special_links 64 elonw = settings.external_links_open_new_window 65 if msl or elonw: 66 result = { 67 "data-pat-markspeciallinks": json.dumps( 68 {"external_links_open_new_window": elonw, "mark_special_links": msl} 69 ) 70 } 71 return result 72 73 @property 74 def image_scales(self): 75 factory = getUtility(IVocabularyFactory, "plone.app.vocabularies.ImagesScales") 76 vocabulary = factory(self.context) 77 ret = [{"title": translate(it.title), "value": it.value} for it in vocabulary] 78 ret = sorted(ret, key=lambda it: it["title"]) 79 return json.dumps(ret) 80 81 def tinymce(self): 82 """ 83 data-pat-tinymce : JSON.stringify({ 84 relatedItems: { 85 vocabularyUrl: config.portal_url + 86 '/@@getVocabulary?name=plone.app.vocabularies.Catalog' 87 }, 88 tiny: config, 89 prependToUrl: 'resolveuid/', 90 linkAttribute: 'UID', 91 prependToScalePart: '/@@images/image/' 92 }) 93 """ 94 95 generator = TinyMCESettingsGenerator(self.context, self.request) 96 settings = generator.settings 97 folder = aq_inner(self.context) 98 99 # Test if we are currently creating an Archetype object 100 if IFactoryTempFolder.providedBy(aq_parent(folder)): 101 folder = aq_parent(aq_parent(aq_parent(folder))) 102 if not IFolderish.providedBy(folder): 103 folder = aq_parent(folder) 104 105 if IPloneSiteRoot.providedBy(folder): 106 initial = None 107 else: 108 initial = IUUID(folder, None) 109 110 portal = get_portal() 111 portal_url = portal.absolute_url() 112 current_path = folder.absolute_url()[len(portal_url) :] 113 114 image_types = settings.image_objects or [] 115 116 server_url = self.request.get("SERVER_URL", "") 117 site_path = portal_url[len(server_url) :] 118 119 related_items_config = get_relateditems_options( 120 context=self.context, 121 value=None, 122 separator=";", 123 vocabulary_name="plone.app.vocabularies.Catalog", 124 vocabulary_view="@@getVocabulary", 125 field_name=None, 126 ) 127 related_items_config = call_callables(related_items_config, self.context) 128 129 configuration = { 130 "base_url": self.context.absolute_url(), 131 "imageTypes": image_types, 132 "imageScales": self.image_scales, 133 "linkAttribute": "UID", 134 # This is for loading the languages on tinymce 135 "loadingBaseUrl": "{}/++plone++static/components/tinymce-builded/" 136 "js/tinymce".format(portal_url), 137 "relatedItems": related_items_config, 138 "prependToScalePart": "/@@images/image/", 139 "prependToUrl": "{}/resolveuid/".format(site_path.rstrip("/")), 140 "tiny": generator.get_tiny_config(), 141 "upload": { 142 "baseUrl": portal_url, 143 "currentPath": current_path, 144 "initialFolder": initial, 145 "maxFiles": 1, 146 "relativePath": "@@fileUpload", 147 "showTitle": False, 148 "uploadMultiple": False, 149 }, 150 } 151 return {"data-pat-tinymce": json.dumps(configuration)} 152 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. 
Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/Products/CMFPlone/patterns/settings.py b/Products/CMFPlone/patterns/settings.py --- a/Products/CMFPlone/patterns/settings.py +++ b/Products/CMFPlone/patterns/settings.py @@ -137,6 +137,7 @@ "relatedItems": related_items_config, "prependToScalePart": "/@@images/image/", "prependToUrl": "{}/resolveuid/".format(site_path.rstrip("/")), + "inline": settings.inline, "tiny": generator.get_tiny_config(), "upload": { "baseUrl": portal_url,
{"golden_diff": "diff --git a/Products/CMFPlone/patterns/settings.py b/Products/CMFPlone/patterns/settings.py\n--- a/Products/CMFPlone/patterns/settings.py\n+++ b/Products/CMFPlone/patterns/settings.py\n@@ -137,6 +137,7 @@\n \"relatedItems\": related_items_config,\n \"prependToScalePart\": \"/@@images/image/\",\n \"prependToUrl\": \"{}/resolveuid/\".format(site_path.rstrip(\"/\")),\n+ \"inline\": settings.inline,\n \"tiny\": generator.get_tiny_config(),\n \"upload\": {\n \"baseUrl\": portal_url,\n", "issue": "Allow TinyMCE to be used in inline-mode\nInline-mode of TinyMCE (https://www.tiny.cloud/docs/demo/inline) is nice among other reasons because the edited text looks exactly the way as the saved result. \r\n\r\nTo enable it you could simply set `{\"inline\": \"true\"}` as the value in the field \"Other settings\" of the TinyMCE controlpanel. That is enough to trigger it. \r\n\r\nBut that fails with `Could not initialize inline editor on invalid inline target element` since the html-tag that renders the TinyMCE is a `<textarea>`. See https://www.tiny.cloud/blog/tinymce-inline-option-not-working for details about that constraint.\r\n\r\nThe `<textarea>` thing is defined in `plone.app.z3cform.widget.RichTextWidget` which has `TextareaWidget` as `_base`. In `render_input_mode` of the widget it is possible to conditionally render a different html element like `div` around the text. But then fallback to textarea no longer works and saving also fails and and and... \r\n\r\nSo it seems that using inline-mode in not straightforward. The question is do we like that option enough to do the required work?\r\n\n", "before_files": [{"content": "from Acquisition import aq_inner\nfrom Acquisition import aq_parent\nfrom borg.localrole.interfaces import IFactoryTempFolder\nfrom plone.app.content.browser.interfaces import IFolderContentsView\nfrom plone.app.widgets.utils import get_relateditems_options\nfrom plone.app.z3cform.utils import call_callables\nfrom plone.registry.interfaces import IRegistry\nfrom plone.uuid.interfaces import IUUID\nfrom Products.CMFCore.interfaces._content import IFolderish\nfrom plone.base.interfaces import ILinkSchema\nfrom plone.base.interfaces import IPatternsSettings\nfrom plone.base.interfaces import IPloneSiteRoot\nfrom Products.CMFPlone.patterns.tinymce import TinyMCESettingsGenerator\nfrom Products.CMFPlone.utils import get_portal\nfrom zope.component import getUtility\nfrom zope.i18n import translate\nfrom zope.interface import implementer\nfrom zope.schema.interfaces import IVocabularyFactory\n\nimport json\n\n\n@implementer(IPatternsSettings)\nclass PatternSettingsAdapter:\n \"\"\"\n Provides default plone settings relevant for patterns.\n \"\"\"\n\n def __init__(self, context, request, field):\n self.request = request\n self.context = context\n self.field = field\n\n def __call__(self):\n data = {}\n data.update(self.mark_special_links())\n data.update(self.structure_updater())\n return data\n\n def structure_updater(self):\n \"\"\"Generate the options for the structure updater pattern.\n If we're not in folder contents view, do not expose these options.\n \"\"\"\n data = {}\n view = self.request.get(\"PUBLISHED\", None)\n if IFolderContentsView.providedBy(view):\n data = {\n \"data-pat-structureupdater\": json.dumps(\n {\n \"titleSelector\": \".documentFirstHeading\",\n \"descriptionSelector\": \".documentDescription\",\n }\n )\n }\n return data\n\n def mark_special_links(self):\n result = {}\n\n registry = getUtility(IRegistry)\n settings = 
registry.forInterface(ILinkSchema, prefix=\"plone\", check=False)\n\n msl = settings.mark_special_links\n elonw = settings.external_links_open_new_window\n if msl or elonw:\n result = {\n \"data-pat-markspeciallinks\": json.dumps(\n {\"external_links_open_new_window\": elonw, \"mark_special_links\": msl}\n )\n }\n return result\n\n @property\n def image_scales(self):\n factory = getUtility(IVocabularyFactory, \"plone.app.vocabularies.ImagesScales\")\n vocabulary = factory(self.context)\n ret = [{\"title\": translate(it.title), \"value\": it.value} for it in vocabulary]\n ret = sorted(ret, key=lambda it: it[\"title\"])\n return json.dumps(ret)\n\n def tinymce(self):\n \"\"\"\n data-pat-tinymce : JSON.stringify({\n relatedItems: {\n vocabularyUrl: config.portal_url +\n '/@@getVocabulary?name=plone.app.vocabularies.Catalog'\n },\n tiny: config,\n prependToUrl: 'resolveuid/',\n linkAttribute: 'UID',\n prependToScalePart: '/@@images/image/'\n })\n \"\"\"\n\n generator = TinyMCESettingsGenerator(self.context, self.request)\n settings = generator.settings\n folder = aq_inner(self.context)\n\n # Test if we are currently creating an Archetype object\n if IFactoryTempFolder.providedBy(aq_parent(folder)):\n folder = aq_parent(aq_parent(aq_parent(folder)))\n if not IFolderish.providedBy(folder):\n folder = aq_parent(folder)\n\n if IPloneSiteRoot.providedBy(folder):\n initial = None\n else:\n initial = IUUID(folder, None)\n\n portal = get_portal()\n portal_url = portal.absolute_url()\n current_path = folder.absolute_url()[len(portal_url) :]\n\n image_types = settings.image_objects or []\n\n server_url = self.request.get(\"SERVER_URL\", \"\")\n site_path = portal_url[len(server_url) :]\n\n related_items_config = get_relateditems_options(\n context=self.context,\n value=None,\n separator=\";\",\n vocabulary_name=\"plone.app.vocabularies.Catalog\",\n vocabulary_view=\"@@getVocabulary\",\n field_name=None,\n )\n related_items_config = call_callables(related_items_config, self.context)\n\n configuration = {\n \"base_url\": self.context.absolute_url(),\n \"imageTypes\": image_types,\n \"imageScales\": self.image_scales,\n \"linkAttribute\": \"UID\",\n # This is for loading the languages on tinymce\n \"loadingBaseUrl\": \"{}/++plone++static/components/tinymce-builded/\"\n \"js/tinymce\".format(portal_url),\n \"relatedItems\": related_items_config,\n \"prependToScalePart\": \"/@@images/image/\",\n \"prependToUrl\": \"{}/resolveuid/\".format(site_path.rstrip(\"/\")),\n \"tiny\": generator.get_tiny_config(),\n \"upload\": {\n \"baseUrl\": portal_url,\n \"currentPath\": current_path,\n \"initialFolder\": initial,\n \"maxFiles\": 1,\n \"relativePath\": \"@@fileUpload\",\n \"showTitle\": False,\n \"uploadMultiple\": False,\n },\n }\n return {\"data-pat-tinymce\": json.dumps(configuration)}\n", "path": "Products/CMFPlone/patterns/settings.py"}], "after_files": [{"content": "from Acquisition import aq_inner\nfrom Acquisition import aq_parent\nfrom borg.localrole.interfaces import IFactoryTempFolder\nfrom plone.app.content.browser.interfaces import IFolderContentsView\nfrom plone.app.widgets.utils import get_relateditems_options\nfrom plone.app.z3cform.utils import call_callables\nfrom plone.registry.interfaces import IRegistry\nfrom plone.uuid.interfaces import IUUID\nfrom Products.CMFCore.interfaces._content import IFolderish\nfrom plone.base.interfaces import ILinkSchema\nfrom plone.base.interfaces import IPatternsSettings\nfrom plone.base.interfaces import IPloneSiteRoot\nfrom Products.CMFPlone.patterns.tinymce 
import TinyMCESettingsGenerator\nfrom Products.CMFPlone.utils import get_portal\nfrom zope.component import getUtility\nfrom zope.i18n import translate\nfrom zope.interface import implementer\nfrom zope.schema.interfaces import IVocabularyFactory\n\nimport json\n\n\n@implementer(IPatternsSettings)\nclass PatternSettingsAdapter:\n \"\"\"\n Provides default plone settings relevant for patterns.\n \"\"\"\n\n def __init__(self, context, request, field):\n self.request = request\n self.context = context\n self.field = field\n\n def __call__(self):\n data = {}\n data.update(self.mark_special_links())\n data.update(self.structure_updater())\n return data\n\n def structure_updater(self):\n \"\"\"Generate the options for the structure updater pattern.\n If we're not in folder contents view, do not expose these options.\n \"\"\"\n data = {}\n view = self.request.get(\"PUBLISHED\", None)\n if IFolderContentsView.providedBy(view):\n data = {\n \"data-pat-structureupdater\": json.dumps(\n {\n \"titleSelector\": \".documentFirstHeading\",\n \"descriptionSelector\": \".documentDescription\",\n }\n )\n }\n return data\n\n def mark_special_links(self):\n result = {}\n\n registry = getUtility(IRegistry)\n settings = registry.forInterface(ILinkSchema, prefix=\"plone\", check=False)\n\n msl = settings.mark_special_links\n elonw = settings.external_links_open_new_window\n if msl or elonw:\n result = {\n \"data-pat-markspeciallinks\": json.dumps(\n {\"external_links_open_new_window\": elonw, \"mark_special_links\": msl}\n )\n }\n return result\n\n @property\n def image_scales(self):\n factory = getUtility(IVocabularyFactory, \"plone.app.vocabularies.ImagesScales\")\n vocabulary = factory(self.context)\n ret = [{\"title\": translate(it.title), \"value\": it.value} for it in vocabulary]\n ret = sorted(ret, key=lambda it: it[\"title\"])\n return json.dumps(ret)\n\n def tinymce(self):\n \"\"\"\n data-pat-tinymce : JSON.stringify({\n relatedItems: {\n vocabularyUrl: config.portal_url +\n '/@@getVocabulary?name=plone.app.vocabularies.Catalog'\n },\n tiny: config,\n prependToUrl: 'resolveuid/',\n linkAttribute: 'UID',\n prependToScalePart: '/@@images/image/'\n })\n \"\"\"\n\n generator = TinyMCESettingsGenerator(self.context, self.request)\n settings = generator.settings\n folder = aq_inner(self.context)\n\n # Test if we are currently creating an Archetype object\n if IFactoryTempFolder.providedBy(aq_parent(folder)):\n folder = aq_parent(aq_parent(aq_parent(folder)))\n if not IFolderish.providedBy(folder):\n folder = aq_parent(folder)\n\n if IPloneSiteRoot.providedBy(folder):\n initial = None\n else:\n initial = IUUID(folder, None)\n\n portal = get_portal()\n portal_url = portal.absolute_url()\n current_path = folder.absolute_url()[len(portal_url) :]\n\n image_types = settings.image_objects or []\n\n server_url = self.request.get(\"SERVER_URL\", \"\")\n site_path = portal_url[len(server_url) :]\n\n related_items_config = get_relateditems_options(\n context=self.context,\n value=None,\n separator=\";\",\n vocabulary_name=\"plone.app.vocabularies.Catalog\",\n vocabulary_view=\"@@getVocabulary\",\n field_name=None,\n )\n related_items_config = call_callables(related_items_config, self.context)\n\n configuration = {\n \"base_url\": self.context.absolute_url(),\n \"imageTypes\": image_types,\n \"imageScales\": self.image_scales,\n \"linkAttribute\": \"UID\",\n # This is for loading the languages on tinymce\n \"loadingBaseUrl\": \"{}/++plone++static/components/tinymce-builded/\"\n \"js/tinymce\".format(portal_url),\n 
\"relatedItems\": related_items_config,\n \"prependToScalePart\": \"/@@images/image/\",\n \"prependToUrl\": \"{}/resolveuid/\".format(site_path.rstrip(\"/\")),\n \"inline\": settings.inline,\n \"tiny\": generator.get_tiny_config(),\n \"upload\": {\n \"baseUrl\": portal_url,\n \"currentPath\": current_path,\n \"initialFolder\": initial,\n \"maxFiles\": 1,\n \"relativePath\": \"@@fileUpload\",\n \"showTitle\": False,\n \"uploadMultiple\": False,\n },\n }\n return {\"data-pat-tinymce\": json.dumps(configuration)}\n", "path": "Products/CMFPlone/patterns/settings.py"}]}
2,013
143
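A minimal sketch of what the Plone patch in the record above changes at runtime: the TinyMCE control panel's `inline` registry setting is now forwarded into the `data-pat-tinymce` JSON payload, so the browser-side pattern can boot TinyMCE in inline mode instead of the textarea-based editor. Everything below except the `"inline"` key is an illustrative assumption; only that key comes from the actual diff.

```python
# Illustrative sketch only -- not the real PatternSettingsAdapter.tinymce().
# "inline" is the single key the patch adds; the rest of the payload is elided.
import json

settings_inline = True  # assumed value of the TinyMCE "inline" registry field

configuration = {
    "prependToUrl": "/Plone/resolveuid/",   # placeholder site path
    "inline": settings_inline,              # the key added by the patch
    # ... "tiny", "upload", "relatedItems", etc. unchanged ...
}

attrs = {"data-pat-tinymce": json.dumps(configuration)}
print(attrs["data-pat-tinymce"])
```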
gh_patches_debug_1148
rasdani/github-patches
git_diff
PaddlePaddle__PaddleSpeech-19
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Fix some problems in the ctc beam search decoder - [x] Make character's index in FST starting from one, otherwise wrong decoding results would be produced especially when space is the first character in the vocabulary; - [x] Add version check in the setup script; - [x] Remove unused code. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `decoders/swig/setup.py` Content: ``` 1 """Script to build and install decoder package.""" 2 from __future__ import absolute_import 3 from __future__ import division 4 from __future__ import print_function 5 6 from setuptools import setup, Extension, distutils 7 import glob 8 import platform 9 import os, sys 10 import multiprocessing.pool 11 import argparse 12 13 parser = argparse.ArgumentParser(description=__doc__) 14 parser.add_argument( 15 "--num_processes", 16 default=1, 17 type=int, 18 help="Number of cpu processes to build package. (default: %(default)d)") 19 args = parser.parse_known_args() 20 21 # reconstruct sys.argv to pass to setup below 22 sys.argv = [sys.argv[0]] + args[1] 23 24 25 # monkey-patch for parallel compilation 26 # See: https://stackoverflow.com/a/13176803 27 def parallelCCompile(self, 28 sources, 29 output_dir=None, 30 macros=None, 31 include_dirs=None, 32 debug=0, 33 extra_preargs=None, 34 extra_postargs=None, 35 depends=None): 36 # those lines are copied from distutils.ccompiler.CCompiler directly 37 macros, objects, extra_postargs, pp_opts, build = self._setup_compile( 38 output_dir, macros, include_dirs, sources, depends, extra_postargs) 39 cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) 40 41 # parallel code 42 def _single_compile(obj): 43 try: 44 src, ext = build[obj] 45 except KeyError: 46 return 47 self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 48 49 # convert to list, imap is evaluated on-demand 50 thread_pool = multiprocessing.pool.ThreadPool(args[0].num_processes) 51 list(thread_pool.imap(_single_compile, objects)) 52 return objects 53 54 55 def compile_test(header, library): 56 dummy_path = os.path.join(os.path.dirname(__file__), "dummy") 57 command = "bash -c \"g++ -include " + header \ 58 + " -l" + library + " -x c++ - <<<'int main() {}' -o " \ 59 + dummy_path + " >/dev/null 2>/dev/null && rm " \ 60 + dummy_path + " 2>/dev/null\"" 61 return os.system(command) == 0 62 63 64 # hack compile to support parallel compiling 65 distutils.ccompiler.CCompiler.compile = parallelCCompile 66 67 FILES = glob.glob('kenlm/util/*.cc') \ 68 + glob.glob('kenlm/lm/*.cc') \ 69 + glob.glob('kenlm/util/double-conversion/*.cc') 70 71 FILES += glob.glob('openfst-1.6.3/src/lib/*.cc') 72 73 FILES = [ 74 fn for fn in FILES 75 if not (fn.endswith('main.cc') or fn.endswith('test.cc') or fn.endswith( 76 'unittest.cc')) 77 ] 78 79 LIBS = ['stdc++'] 80 if platform.system() != 'Darwin': 81 LIBS.append('rt') 82 83 ARGS = ['-O3', '-DNDEBUG', '-DKENLM_MAX_ORDER=6', '-std=c++11'] 84 85 if compile_test('zlib.h', 'z'): 86 ARGS.append('-DHAVE_ZLIB') 87 LIBS.append('z') 88 89 if compile_test('bzlib.h', 'bz2'): 90 ARGS.append('-DHAVE_BZLIB') 91 LIBS.append('bz2') 92 93 if compile_test('lzma.h', 'lzma'): 94 ARGS.append('-DHAVE_XZLIB') 95 LIBS.append('lzma') 96 97 os.system('swig -python -c++ ./decoders.i') 98 99 decoders_module = [ 100 Extension( 101 name='_swig_decoders', 102 sources=FILES + glob.glob('*.cxx') + 
glob.glob('*.cpp'), 103 language='c++', 104 include_dirs=[ 105 '.', 106 'kenlm', 107 'openfst-1.6.3/src/include', 108 'ThreadPool', 109 ], 110 libraries=LIBS, 111 extra_compile_args=ARGS) 112 ] 113 114 setup( 115 name='swig_decoders', 116 version='1.0', 117 description="""CTC decoders""", 118 ext_modules=decoders_module, 119 py_modules=['swig_decoders'], ) 120 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/decoders/swig/setup.py b/decoders/swig/setup.py --- a/decoders/swig/setup.py +++ b/decoders/swig/setup.py @@ -113,7 +113,7 @@ setup( name='swig_decoders', - version='1.0', + version='1.1', description="""CTC decoders""", ext_modules=decoders_module, py_modules=['swig_decoders'], )
{"golden_diff": "diff --git a/decoders/swig/setup.py b/decoders/swig/setup.py\n--- a/decoders/swig/setup.py\n+++ b/decoders/swig/setup.py\n@@ -113,7 +113,7 @@\n \n setup(\n name='swig_decoders',\n- version='1.0',\n+ version='1.1',\n description=\"\"\"CTC decoders\"\"\",\n ext_modules=decoders_module,\n py_modules=['swig_decoders'], )\n", "issue": "Fix some problems in the ctc beam search decoder\n- [x] Make character's index in FST starting from one, otherwise wrong decoding results would be produced especially when space is the first character in the vocabulary;\r\n- [x] Add version check in the setup script;\r\n- [x] Remove unused code. \r\n\n", "before_files": [{"content": "\"\"\"Script to build and install decoder package.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom setuptools import setup, Extension, distutils\nimport glob\nimport platform\nimport os, sys\nimport multiprocessing.pool\nimport argparse\n\nparser = argparse.ArgumentParser(description=__doc__)\nparser.add_argument(\n \"--num_processes\",\n default=1,\n type=int,\n help=\"Number of cpu processes to build package. (default: %(default)d)\")\nargs = parser.parse_known_args()\n\n# reconstruct sys.argv to pass to setup below\nsys.argv = [sys.argv[0]] + args[1]\n\n\n# monkey-patch for parallel compilation\n# See: https://stackoverflow.com/a/13176803\ndef parallelCCompile(self,\n sources,\n output_dir=None,\n macros=None,\n include_dirs=None,\n debug=0,\n extra_preargs=None,\n extra_postargs=None,\n depends=None):\n # those lines are copied from distutils.ccompiler.CCompiler directly\n macros, objects, extra_postargs, pp_opts, build = self._setup_compile(\n output_dir, macros, include_dirs, sources, depends, extra_postargs)\n cc_args = self._get_cc_args(pp_opts, debug, extra_preargs)\n\n # parallel code\n def _single_compile(obj):\n try:\n src, ext = build[obj]\n except KeyError:\n return\n self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)\n\n # convert to list, imap is evaluated on-demand\n thread_pool = multiprocessing.pool.ThreadPool(args[0].num_processes)\n list(thread_pool.imap(_single_compile, objects))\n return objects\n\n\ndef compile_test(header, library):\n dummy_path = os.path.join(os.path.dirname(__file__), \"dummy\")\n command = \"bash -c \\\"g++ -include \" + header \\\n + \" -l\" + library + \" -x c++ - <<<'int main() {}' -o \" \\\n + dummy_path + \" >/dev/null 2>/dev/null && rm \" \\\n + dummy_path + \" 2>/dev/null\\\"\"\n return os.system(command) == 0\n\n\n# hack compile to support parallel compiling\ndistutils.ccompiler.CCompiler.compile = parallelCCompile\n\nFILES = glob.glob('kenlm/util/*.cc') \\\n + glob.glob('kenlm/lm/*.cc') \\\n + glob.glob('kenlm/util/double-conversion/*.cc')\n\nFILES += glob.glob('openfst-1.6.3/src/lib/*.cc')\n\nFILES = [\n fn for fn in FILES\n if not (fn.endswith('main.cc') or fn.endswith('test.cc') or fn.endswith(\n 'unittest.cc'))\n]\n\nLIBS = ['stdc++']\nif platform.system() != 'Darwin':\n LIBS.append('rt')\n\nARGS = ['-O3', '-DNDEBUG', '-DKENLM_MAX_ORDER=6', '-std=c++11']\n\nif compile_test('zlib.h', 'z'):\n ARGS.append('-DHAVE_ZLIB')\n LIBS.append('z')\n\nif compile_test('bzlib.h', 'bz2'):\n ARGS.append('-DHAVE_BZLIB')\n LIBS.append('bz2')\n\nif compile_test('lzma.h', 'lzma'):\n ARGS.append('-DHAVE_XZLIB')\n LIBS.append('lzma')\n\nos.system('swig -python -c++ ./decoders.i')\n\ndecoders_module = [\n Extension(\n name='_swig_decoders',\n sources=FILES + glob.glob('*.cxx') + 
glob.glob('*.cpp'),\n language='c++',\n include_dirs=[\n '.',\n 'kenlm',\n 'openfst-1.6.3/src/include',\n 'ThreadPool',\n ],\n libraries=LIBS,\n extra_compile_args=ARGS)\n]\n\nsetup(\n name='swig_decoders',\n version='1.0',\n description=\"\"\"CTC decoders\"\"\",\n ext_modules=decoders_module,\n py_modules=['swig_decoders'], )\n", "path": "decoders/swig/setup.py"}], "after_files": [{"content": "\"\"\"Script to build and install decoder package.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom setuptools import setup, Extension, distutils\nimport glob\nimport platform\nimport os, sys\nimport multiprocessing.pool\nimport argparse\n\nparser = argparse.ArgumentParser(description=__doc__)\nparser.add_argument(\n \"--num_processes\",\n default=1,\n type=int,\n help=\"Number of cpu processes to build package. (default: %(default)d)\")\nargs = parser.parse_known_args()\n\n# reconstruct sys.argv to pass to setup below\nsys.argv = [sys.argv[0]] + args[1]\n\n\n# monkey-patch for parallel compilation\n# See: https://stackoverflow.com/a/13176803\ndef parallelCCompile(self,\n sources,\n output_dir=None,\n macros=None,\n include_dirs=None,\n debug=0,\n extra_preargs=None,\n extra_postargs=None,\n depends=None):\n # those lines are copied from distutils.ccompiler.CCompiler directly\n macros, objects, extra_postargs, pp_opts, build = self._setup_compile(\n output_dir, macros, include_dirs, sources, depends, extra_postargs)\n cc_args = self._get_cc_args(pp_opts, debug, extra_preargs)\n\n # parallel code\n def _single_compile(obj):\n try:\n src, ext = build[obj]\n except KeyError:\n return\n self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)\n\n # convert to list, imap is evaluated on-demand\n thread_pool = multiprocessing.pool.ThreadPool(args[0].num_processes)\n list(thread_pool.imap(_single_compile, objects))\n return objects\n\n\ndef compile_test(header, library):\n dummy_path = os.path.join(os.path.dirname(__file__), \"dummy\")\n command = \"bash -c \\\"g++ -include \" + header \\\n + \" -l\" + library + \" -x c++ - <<<'int main() {}' -o \" \\\n + dummy_path + \" >/dev/null 2>/dev/null && rm \" \\\n + dummy_path + \" 2>/dev/null\\\"\"\n return os.system(command) == 0\n\n\n# hack compile to support parallel compiling\ndistutils.ccompiler.CCompiler.compile = parallelCCompile\n\nFILES = glob.glob('kenlm/util/*.cc') \\\n + glob.glob('kenlm/lm/*.cc') \\\n + glob.glob('kenlm/util/double-conversion/*.cc')\n\nFILES += glob.glob('openfst-1.6.3/src/lib/*.cc')\n\nFILES = [\n fn for fn in FILES\n if not (fn.endswith('main.cc') or fn.endswith('test.cc') or fn.endswith(\n 'unittest.cc'))\n]\n\nLIBS = ['stdc++']\nif platform.system() != 'Darwin':\n LIBS.append('rt')\n\nARGS = ['-O3', '-DNDEBUG', '-DKENLM_MAX_ORDER=6', '-std=c++11']\n\nif compile_test('zlib.h', 'z'):\n ARGS.append('-DHAVE_ZLIB')\n LIBS.append('z')\n\nif compile_test('bzlib.h', 'bz2'):\n ARGS.append('-DHAVE_BZLIB')\n LIBS.append('bz2')\n\nif compile_test('lzma.h', 'lzma'):\n ARGS.append('-DHAVE_XZLIB')\n LIBS.append('lzma')\n\nos.system('swig -python -c++ ./decoders.i')\n\ndecoders_module = [\n Extension(\n name='_swig_decoders',\n sources=FILES + glob.glob('*.cxx') + glob.glob('*.cpp'),\n language='c++',\n include_dirs=[\n '.',\n 'kenlm',\n 'openfst-1.6.3/src/include',\n 'ThreadPool',\n ],\n libraries=LIBS,\n extra_compile_args=ARGS)\n]\n\nsetup(\n name='swig_decoders',\n version='1.1',\n description=\"\"\"CTC decoders\"\"\",\n ext_modules=decoders_module,\n 
py_modules=['swig_decoders'], )\n", "path": "decoders/swig/setup.py"}]}
1,469
107
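The golden diff in the record above resolves the issue's "add version check in the setup script" item by bumping the published `swig_decoders` version from 1.0 to 1.1. How downstream code actually verifies that version is not part of the record; a hedged consumer-side check might look like the sketch below, where the `1.1` floor mirrors the diff and the helper name is invented.

```python
# Hypothetical downstream check against the version bumped in the diff above.
# pkg_resources ships with setuptools; "swig_decoders" is the package name
# declared in the record's setup.py.
import pkg_resources

def require_swig_decoders(minimum="1.1"):
    """Raise if the installed swig_decoders package is older than `minimum`."""
    installed = pkg_resources.get_distribution("swig_decoders").version
    if pkg_resources.parse_version(installed) < pkg_resources.parse_version(minimum):
        raise RuntimeError(
            "swig_decoders>=%s required, found %s" % (minimum, installed)
        )
```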
gh_patches_debug_12031
rasdani/github-patches
git_diff
pyca__cryptography-2766
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- x509.CertifcateBuilder().sign() fails with "Unknown OpenSSL error' when subject has bad country code The x509.CertifcateBuilder().sign() function fails with an unknown OpenSSL error when the builder is provided with an invalid country code: ``` Traceback (most recent call last): File "./demo.py", line 30, in <module> ca_crt = builder.sign(private_key=ca_key, algorithm=hashes.SHA256(), backend=default_backend()) File "/usr/local/lib/python3.4/dist-packages/cryptography/x509/base.py", line 520, in sign return backend.create_x509_certificate(self, private_key, algorithm) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/multibackend.py", line 381, in create_x509_certificate return b.create_x509_certificate(builder, private_key, algorithm) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py", line 1402, in create_x509_certificate x509_cert, _encode_name_gc(self, list(builder._subject_name)) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py", line 158, in _encode_name_gc subject = _encode_name(backend, attributes) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py", line 153, in _encode_name backend.openssl_assert(res == 1) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py", line 719, in openssl_assert return binding._openssl_assert(self._lib, ok) File "/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/bindings/openssl/binding.py", line 43, in _openssl_assert errors cryptography.exceptions.InternalError: Unknown OpenSSL error. Please file an issue at https://github.com/pyca/cryptography/issues with information on how to reproduce this. ([_OpenSSLError(code=218603671, lib=13, func=122, reason=151)]) ``` The demo code below reproduces this error. Tested on Ubuntu 14.04.3 x64 w/ Python 3.4.3, cryptography 1.2.1, and libssl-dev 1.0.1f-1ubuntu2.16. ``` #!/usr/bin/env python3 import datetime from cryptography import x509 from cryptography.hazmat.backends import default_backend from cryptography.hazmat.primitives import hashes from cryptography.hazmat.primitives.asymmetric import rsa DUR_ONE_DAY = datetime.timedelta(1, 0, 0) DUR_ONE_YEAR = datetime.timedelta(366, 0, 0) ca_key = rsa.generate_private_key(65537, 4096, default_backend()) sub_attr = [x509.NameAttribute(x509.NameOID.COUNTRY_NAME, "InvalidCC")] builder = x509.CertificateBuilder() builder = builder.issuer_name(x509.Name(sub_attr)) builder = builder.subject_name(x509.Name(sub_attr)) builder = builder.not_valid_before(datetime.datetime.today() - DUR_ONE_DAY) builder = builder.not_valid_after(datetime.datetime.today() + DUR_ONE_YEAR) builder = builder.serial_number(3) builder = builder.public_key(ca_key.public_key()) extensions = [] extensions.append(x509.BasicConstraints(ca=True, path_length=1)) for ext in extensions: builder = builder.add_extension(ext, critical=True) ca_crt = builder.sign(private_key=ca_key, algorithm=hashes.SHA256(), backend=default_backend()) # builder.sign() will fail with when CONTRY_NAME is invalid country code: # cryptography.exceptions.InternalError: Unknown # OpenSSL error. Please file an issue at # https://github.com/pyca/cryptography/issues with information on # how to reproduce this. ([_OpenSSLError(code=218603671, lib=13, # func=122, reason=151)]) ``` Might be nice to raise a more relevant error message. 
Or catch this even sooner when the NameAttribute is added. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `src/cryptography/x509/name.py` Content: ``` 1 # This file is dual licensed under the terms of the Apache License, Version 2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository 3 # for complete details. 4 5 from __future__ import absolute_import, division, print_function 6 7 import six 8 9 from cryptography import utils 10 from cryptography.x509.oid import ObjectIdentifier 11 12 13 class NameAttribute(object): 14 def __init__(self, oid, value): 15 if not isinstance(oid, ObjectIdentifier): 16 raise TypeError( 17 "oid argument must be an ObjectIdentifier instance." 18 ) 19 20 if not isinstance(value, six.text_type): 21 raise TypeError( 22 "value argument must be a text type." 23 ) 24 25 self._oid = oid 26 self._value = value 27 28 oid = utils.read_only_property("_oid") 29 value = utils.read_only_property("_value") 30 31 def __eq__(self, other): 32 if not isinstance(other, NameAttribute): 33 return NotImplemented 34 35 return ( 36 self.oid == other.oid and 37 self.value == other.value 38 ) 39 40 def __ne__(self, other): 41 return not self == other 42 43 def __hash__(self): 44 return hash((self.oid, self.value)) 45 46 def __repr__(self): 47 return "<NameAttribute(oid={0.oid}, value={0.value!r})>".format(self) 48 49 50 class Name(object): 51 def __init__(self, attributes): 52 self._attributes = attributes 53 54 def get_attributes_for_oid(self, oid): 55 return [i for i in self if i.oid == oid] 56 57 def __eq__(self, other): 58 if not isinstance(other, Name): 59 return NotImplemented 60 61 return self._attributes == other._attributes 62 63 def __ne__(self, other): 64 return not self == other 65 66 def __hash__(self): 67 # TODO: this is relatively expensive, if this looks like a bottleneck 68 # for you, consider optimizing! 69 return hash(tuple(self._attributes)) 70 71 def __iter__(self): 72 return iter(self._attributes) 73 74 def __len__(self): 75 return len(self._attributes) 76 77 def __repr__(self): 78 return "<Name({0!r})>".format(self._attributes) 79 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/src/cryptography/x509/name.py b/src/cryptography/x509/name.py --- a/src/cryptography/x509/name.py +++ b/src/cryptography/x509/name.py @@ -7,7 +7,7 @@ import six from cryptography import utils -from cryptography.x509.oid import ObjectIdentifier +from cryptography.x509.oid import NameOID, ObjectIdentifier class NameAttribute(object): @@ -22,6 +22,11 @@ "value argument must be a text type." ) + if oid == NameOID.COUNTRY_NAME and len(value.encode("utf8")) != 2: + raise ValueError( + "Country name must be a 2 character country code" + ) + self._oid = oid self._value = value
{"golden_diff": "diff --git a/src/cryptography/x509/name.py b/src/cryptography/x509/name.py\n--- a/src/cryptography/x509/name.py\n+++ b/src/cryptography/x509/name.py\n@@ -7,7 +7,7 @@\n import six\n \n from cryptography import utils\n-from cryptography.x509.oid import ObjectIdentifier\n+from cryptography.x509.oid import NameOID, ObjectIdentifier\n \n \n class NameAttribute(object):\n@@ -22,6 +22,11 @@\n \"value argument must be a text type.\"\n )\n \n+ if oid == NameOID.COUNTRY_NAME and len(value.encode(\"utf8\")) != 2:\n+ raise ValueError(\n+ \"Country name must be a 2 character country code\"\n+ )\n+\n self._oid = oid\n self._value = value\n", "issue": "x509.CertifcateBuilder().sign() fails with \"Unknown OpenSSL error' when subject has bad country code\nThe x509.CertifcateBuilder().sign() function fails with an unknown OpenSSL error when the builder is provided with an invalid country code:\n\n```\nTraceback (most recent call last):\n File \"./demo.py\", line 30, in <module>\n ca_crt = builder.sign(private_key=ca_key, algorithm=hashes.SHA256(), backend=default_backend())\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/x509/base.py\", line 520, in sign\n return backend.create_x509_certificate(self, private_key, algorithm)\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/multibackend.py\", line 381, in create_x509_certificate\n return b.create_x509_certificate(builder, private_key, algorithm)\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py\", line 1402, in create_x509_certificate\n x509_cert, _encode_name_gc(self, list(builder._subject_name))\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py\", line 158, in _encode_name_gc\n subject = _encode_name(backend, attributes)\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py\", line 153, in _encode_name\n backend.openssl_assert(res == 1)\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/backends/openssl/backend.py\", line 719, in openssl_assert\n return binding._openssl_assert(self._lib, ok)\n File \"/usr/local/lib/python3.4/dist-packages/cryptography/hazmat/bindings/openssl/binding.py\", line 43, in _openssl_assert\n errors\ncryptography.exceptions.InternalError: Unknown OpenSSL error. Please file an issue at https://github.com/pyca/cryptography/issues with information on how to reproduce this. ([_OpenSSLError(code=218603671, lib=13, func=122, reason=151)])\n```\n\nThe demo code below reproduces this error. 
Tested on Ubuntu 14.04.3 x64 w/ Python 3.4.3, cryptography 1.2.1, and libssl-dev 1.0.1f-1ubuntu2.16.\n\n```\n#!/usr/bin/env python3\n\nimport datetime\n\nfrom cryptography import x509\nfrom cryptography.hazmat.backends import default_backend\nfrom cryptography.hazmat.primitives import hashes\nfrom cryptography.hazmat.primitives.asymmetric import rsa\n\nDUR_ONE_DAY = datetime.timedelta(1, 0, 0)\nDUR_ONE_YEAR = datetime.timedelta(366, 0, 0)\n\nca_key = rsa.generate_private_key(65537, 4096, default_backend())\n\nsub_attr = [x509.NameAttribute(x509.NameOID.COUNTRY_NAME, \"InvalidCC\")]\n\nbuilder = x509.CertificateBuilder()\nbuilder = builder.issuer_name(x509.Name(sub_attr))\nbuilder = builder.subject_name(x509.Name(sub_attr))\nbuilder = builder.not_valid_before(datetime.datetime.today() - DUR_ONE_DAY)\nbuilder = builder.not_valid_after(datetime.datetime.today() + DUR_ONE_YEAR)\nbuilder = builder.serial_number(3)\nbuilder = builder.public_key(ca_key.public_key())\n\nextensions = []\nextensions.append(x509.BasicConstraints(ca=True, path_length=1))\nfor ext in extensions:\n builder = builder.add_extension(ext, critical=True)\n\nca_crt = builder.sign(private_key=ca_key, algorithm=hashes.SHA256(), backend=default_backend())\n\n# builder.sign() will fail with when CONTRY_NAME is invalid country code:\n# cryptography.exceptions.InternalError: Unknown\n# OpenSSL error. Please file an issue at\n# https://github.com/pyca/cryptography/issues with information on\n# how to reproduce this. ([_OpenSSLError(code=218603671, lib=13,\n# func=122, reason=151)])\n```\n\nMight be nice to raise a more relevant error message. Or catch this even sooner when the NameAttribute is added. \n\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport six\n\nfrom cryptography import utils\nfrom cryptography.x509.oid import ObjectIdentifier\n\n\nclass NameAttribute(object):\n def __init__(self, oid, value):\n if not isinstance(oid, ObjectIdentifier):\n raise TypeError(\n \"oid argument must be an ObjectIdentifier instance.\"\n )\n\n if not isinstance(value, six.text_type):\n raise TypeError(\n \"value argument must be a text type.\"\n )\n\n self._oid = oid\n self._value = value\n\n oid = utils.read_only_property(\"_oid\")\n value = utils.read_only_property(\"_value\")\n\n def __eq__(self, other):\n if not isinstance(other, NameAttribute):\n return NotImplemented\n\n return (\n self.oid == other.oid and\n self.value == other.value\n )\n\n def __ne__(self, other):\n return not self == other\n\n def __hash__(self):\n return hash((self.oid, self.value))\n\n def __repr__(self):\n return \"<NameAttribute(oid={0.oid}, value={0.value!r})>\".format(self)\n\n\nclass Name(object):\n def __init__(self, attributes):\n self._attributes = attributes\n\n def get_attributes_for_oid(self, oid):\n return [i for i in self if i.oid == oid]\n\n def __eq__(self, other):\n if not isinstance(other, Name):\n return NotImplemented\n\n return self._attributes == other._attributes\n\n def __ne__(self, other):\n return not self == other\n\n def __hash__(self):\n # TODO: this is relatively expensive, if this looks like a bottleneck\n # for you, consider optimizing!\n return hash(tuple(self._attributes))\n\n def __iter__(self):\n return iter(self._attributes)\n\n def __len__(self):\n return len(self._attributes)\n\n def __repr__(self):\n return \"<Name({0!r})>\".format(self._attributes)\n", "path": "src/cryptography/x509/name.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport six\n\nfrom cryptography import utils\nfrom cryptography.x509.oid import NameOID, ObjectIdentifier\n\n\nclass NameAttribute(object):\n def __init__(self, oid, value):\n if not isinstance(oid, ObjectIdentifier):\n raise TypeError(\n \"oid argument must be an ObjectIdentifier instance.\"\n )\n\n if not isinstance(value, six.text_type):\n raise TypeError(\n \"value argument must be a text type.\"\n )\n\n if oid == NameOID.COUNTRY_NAME and len(value.encode(\"utf8\")) != 2:\n raise ValueError(\n \"Country name must be a 2 character country code\"\n )\n\n self._oid = oid\n self._value = value\n\n oid = utils.read_only_property(\"_oid\")\n value = utils.read_only_property(\"_value\")\n\n def __eq__(self, other):\n if not isinstance(other, NameAttribute):\n return NotImplemented\n\n return (\n self.oid == other.oid and\n self.value == other.value\n )\n\n def __ne__(self, other):\n return not self == other\n\n def __hash__(self):\n return hash((self.oid, self.value))\n\n def __repr__(self):\n return \"<NameAttribute(oid={0.oid}, value={0.value!r})>\".format(self)\n\n\nclass Name(object):\n def __init__(self, attributes):\n self._attributes = attributes\n\n def get_attributes_for_oid(self, oid):\n return [i for i in self if i.oid == oid]\n\n def __eq__(self, other):\n if not isinstance(other, Name):\n return NotImplemented\n\n return self._attributes == other._attributes\n\n def __ne__(self, other):\n return not self == other\n\n def __hash__(self):\n # TODO: this is relatively expensive, if this looks like a bottleneck\n # for you, consider optimizing!\n return hash(tuple(self._attributes))\n\n def __iter__(self):\n return iter(self._attributes)\n\n def __len__(self):\n return len(self._attributes)\n\n def __repr__(self):\n return \"<Name({0!r})>\".format(self._attributes)\n", "path": "src/cryptography/x509/name.py"}]}
1,878
185
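The behaviour added by the cryptography patch above can be demonstrated directly: `NameAttribute` now rejects a malformed `COUNTRY_NAME` with a clear `ValueError` at construction time, instead of surfacing the opaque OpenSSL `InternalError` from the issue much later inside `sign()`. This sketch assumes a cryptography release that contains the patch.

```python
# Demonstrates the validation added in src/cryptography/x509/name.py.
from cryptography import x509
from cryptography.x509.oid import NameOID

x509.NameAttribute(NameOID.COUNTRY_NAME, u"US")  # two-byte code: accepted

try:
    x509.NameAttribute(NameOID.COUNTRY_NAME, u"InvalidCC")
except ValueError as exc:
    print(exc)  # Country name must be a 2 character country code
```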
gh_patches_debug_822
rasdani/github-patches
git_diff
opendatacube__datacube-core-348
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Unnecessary dependency on `pathlib` when running in python3 ### Expected behaviour Datacube shouldn't depend on unnecessary packages when running in Python 3. ### Actual behaviour There's a dependency on `pathlib`, which is included in the Python 3 standard library, and so doesn't need to be installed. This causes trouble on the NCI deployment when trying to load `stats` modules which use the `setuptools` entry_points for their registration. And returns error messages to users trying to load them. ### Steps to reproduce the behaviour ``` module load agdc-py3-prod agdc_statistics dra547@raijin4:~ $ python Python 3.6.3 | packaged by conda-forge | (default, Nov 4 2017, 10:10:56) [GCC 4.8.2 20140120 (Red Hat 4.8.2-15)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import datacube_stats.statistics Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/g/data/v10/public/modules/agdc_statistics/0.9a7/lib/python3.6/site-packages/datacube_stats/statistics.py", line 769, in <module> STATS[entry_point.name] = entry_point.load() File "/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2404, in load self.require(*args, **kwargs) File "/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2427, in require items = working_set.resolve(reqs, env, installer, extras=self.extras) File "/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py", line 870, in resolve raise DistributionNotFound(req, requirers) pkg_resources.DistributionNotFound: The 'pathlib' distribution was not found and is required by datacube >>> ``` ### The Fix Modify `setup.py` to use [platform specific dependencies](https://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies) to only require `pathlib` when not running on python 3. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 #!/usr/bin/env python 2 3 import versioneer 4 from setuptools import setup, find_packages 5 6 tests_require = [ 7 'pytest', 'pytest-cov', 'mock', 'pep8', 'pylint', 'hypothesis', 'compliance-checker', 'objgraph' 8 ] 9 10 extras_require = { 11 'performance': ['ciso8601', 'bottleneck'], 12 'interactive': ['matplotlib', 'fiona'], 13 'distributed': ['distributed', 'dask[distributed]'], 14 'analytics': ['scipy', 'pyparsing', 'numexpr'], 15 'doc': ['Sphinx', 'setuptools'], 16 'replicas': ['paramiko', 'sshtunnel', 'tqdm'], 17 'celery': ['celery>=4', 'redis'], 18 's3': ['boto3==1.4.3', 'SharedArray', 'pathos', 'zstandard'], 19 'test': tests_require, 20 } 21 # An 'all' option, following ipython naming conventions. 
22 extras_require['all'] = sorted(set(sum(extras_require.values(), []))) 23 24 setup( 25 name='datacube', 26 version=versioneer.get_version(), 27 cmdclass=versioneer.get_cmdclass(), 28 29 url='https://github.com/opendatacube/datacube-core', 30 author='AGDC Collaboration', 31 maintainer='AGDC Collaboration', 32 maintainer_email='', 33 description='An analysis environment for satellite and other earth observation data', 34 long_description=open('README.rst').read(), 35 license='Apache License 2.0', 36 classifiers=[ 37 "Development Status :: 4 - Beta", 38 "Intended Audience :: Developers", 39 "Intended Audience :: Science/Research", 40 "License :: OSI Approved :: Apache Software License", 41 "Natural Language :: English", 42 "Operating System :: MacOS :: MacOS X", 43 "Operating System :: POSIX", 44 "Operating System :: POSIX :: BSD", 45 "Operating System :: POSIX :: Linux", 46 "Operating System :: Microsoft :: Windows", 47 "Programming Language :: Python", 48 "Programming Language :: Python :: 2", 49 "Programming Language :: Python :: 2.7", 50 "Programming Language :: Python :: 3", 51 "Programming Language :: Python :: 3.5", 52 "Topic :: Scientific/Engineering :: GIS", 53 "Topic :: Scientific/Engineering :: Information Analysis", 54 ], 55 56 packages=find_packages( 57 exclude=('tests', 'tests.*', 58 'integration_tests', 'integration_tests.*') 59 ), 60 package_data={ 61 '': ['*.yaml', '*/*.yaml'], 62 }, 63 scripts=[ 64 'datacube_apps/scripts/pbs_helpers.sh' 65 ], 66 setup_requires=[ 67 'pytest-runner' 68 ], 69 install_requires=[ 70 'affine', 71 'cachetools', 72 'click>=5.0', 73 'cloudpickle>=0.4', 74 'dask[array]', 75 'gdal>=1.9', 76 'jsonschema', 77 'netcdf4', 78 'numpy', 79 'pathlib', 80 'psycopg2', 81 'pypeg2', 82 'python-dateutil', 83 'pyyaml', 84 'rasterio>=0.9', # required for zip reading, 0.9 gets around 1.0a ordering problems 85 'singledispatch', 86 'sqlalchemy', 87 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost 88 ], 89 extras_require=extras_require, 90 tests_require=tests_require, 91 92 entry_points={ 93 'console_scripts': [ 94 'datacube-search = datacube.scripts.search_tool:cli', 95 'datacube = datacube.scripts.cli_app:cli', 96 'datacube-stacker = datacube_apps.stacker:main', 97 'datacube-worker = datacube.execution.worker:main', 98 'datacube-fixer = datacube_apps.stacker:fixer_main', 99 'datacube-ncml = datacube_apps.ncml:ncml_app', 100 'pixeldrill = datacube_apps.pixeldrill:main [interactive]', 101 'movie_generator = datacube_apps.movie_generator:main', 102 'datacube-simple-replica = datacube_apps.simple_replica:replicate' 103 ] 104 }, 105 ) 106 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -76,7 +76,7 @@ 'jsonschema', 'netcdf4', 'numpy', - 'pathlib', + 'pathlib;python_version<"3"', 'psycopg2', 'pypeg2', 'python-dateutil',
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -76,7 +76,7 @@\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n- 'pathlib',\n+ 'pathlib;python_version<\"3\"',\n 'psycopg2',\n 'pypeg2',\n 'python-dateutil',\n", "issue": "Unnecessary dependency on `pathlib` when running in python3\n### Expected behaviour\r\nDatacube shouldn't depend on unnecessary packages when running in Python 3.\r\n\r\n### Actual behaviour\r\nThere's a dependency on `pathlib`, which is included in the Python 3 standard library, and so doesn't need to be installed.\r\n\r\nThis causes trouble on the NCI deployment when trying to load `stats` modules which use the `setuptools` entry_points for their registration. And returns error messages to users trying to load them.\r\n\r\n### Steps to reproduce the behaviour\r\n```\r\nmodule load agdc-py3-prod agdc_statistics\r\ndra547@raijin4:~ $ python\r\nPython 3.6.3 | packaged by conda-forge | (default, Nov 4 2017, 10:10:56)\r\n[GCC 4.8.2 20140120 (Red Hat 4.8.2-15)] on linux\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import datacube_stats.statistics\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\n File \"/g/data/v10/public/modules/agdc_statistics/0.9a7/lib/python3.6/site-packages/datacube_stats/statistics.py\", line 769, in <module>\r\n STATS[entry_point.name] = entry_point.load()\r\n File \"/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2404, in load\r\n self.require(*args, **kwargs)\r\n File \"/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2427, in require\r\n items = working_set.resolve(reqs, env, installer, extras=self.extras)\r\n File \"/g/data/v10/public/modules/agdc-py3-env/20171214/envs/agdc/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 870, in resolve\r\n raise DistributionNotFound(req, requirers)\r\npkg_resources.DistributionNotFound: The 'pathlib' distribution was not found and is required by datacube\r\n>>>\r\n```\r\n\r\n### The Fix\r\nModify `setup.py` to use [platform specific dependencies](https://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies) to only require `pathlib` when not running on python 3.\n", "before_files": [{"content": "#!/usr/bin/env python\n\nimport versioneer\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest', 'pytest-cov', 'mock', 'pep8', 'pylint', 'hypothesis', 'compliance-checker', 'objgraph'\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'interactive': ['matplotlib', 'fiona'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'analytics': ['scipy', 'pyparsing', 'numexpr'],\n 'doc': ['Sphinx', 'setuptools'],\n 'replicas': ['paramiko', 'sshtunnel', 'tqdm'],\n 'celery': ['celery>=4', 'redis'],\n 's3': ['boto3==1.4.3', 'SharedArray', 'pathos', 'zstandard'],\n 'test': tests_require,\n}\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nsetup(\n name='datacube',\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n\n url='https://github.com/opendatacube/datacube-core',\n author='AGDC Collaboration',\n maintainer='AGDC Collaboration',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n 
long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Topic :: Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n },\n scripts=[\n 'datacube_apps/scripts/pbs_helpers.sh'\n ],\n setup_requires=[\n 'pytest-runner'\n ],\n install_requires=[\n 'affine',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'gdal>=1.9',\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n 'pathlib',\n 'psycopg2',\n 'pypeg2',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio>=0.9', # required for zip reading, 0.9 gets around 1.0a ordering problems\n 'singledispatch',\n 'sqlalchemy',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-stacker = datacube_apps.stacker:main',\n 'datacube-worker = datacube.execution.worker:main',\n 'datacube-fixer = datacube_apps.stacker:fixer_main',\n 'datacube-ncml = datacube_apps.ncml:ncml_app',\n 'pixeldrill = datacube_apps.pixeldrill:main [interactive]',\n 'movie_generator = datacube_apps.movie_generator:main',\n 'datacube-simple-replica = datacube_apps.simple_replica:replicate'\n ]\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nimport versioneer\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest', 'pytest-cov', 'mock', 'pep8', 'pylint', 'hypothesis', 'compliance-checker', 'objgraph'\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'interactive': ['matplotlib', 'fiona'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'analytics': ['scipy', 'pyparsing', 'numexpr'],\n 'doc': ['Sphinx', 'setuptools'],\n 'replicas': ['paramiko', 'sshtunnel', 'tqdm'],\n 'celery': ['celery>=4', 'redis'],\n 's3': ['boto3==1.4.3', 'SharedArray', 'pathos', 'zstandard'],\n 'test': tests_require,\n}\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nsetup(\n name='datacube',\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n\n url='https://github.com/opendatacube/datacube-core',\n author='AGDC Collaboration',\n maintainer='AGDC Collaboration',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI 
Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Topic :: Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n },\n scripts=[\n 'datacube_apps/scripts/pbs_helpers.sh'\n ],\n setup_requires=[\n 'pytest-runner'\n ],\n install_requires=[\n 'affine',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'gdal>=1.9',\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n 'pathlib;python_version<\"3\"',\n 'psycopg2',\n 'pypeg2',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio>=0.9', # required for zip reading, 0.9 gets around 1.0a ordering problems\n 'singledispatch',\n 'sqlalchemy',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-stacker = datacube_apps.stacker:main',\n 'datacube-worker = datacube.execution.worker:main',\n 'datacube-fixer = datacube_apps.stacker:fixer_main',\n 'datacube-ncml = datacube_apps.ncml:ncml_app',\n 'pixeldrill = datacube_apps.pixeldrill:main [interactive]',\n 'movie_generator = datacube_apps.movie_generator:main',\n 'datacube-simple-replica = datacube_apps.simple_replica:replicate'\n ]\n },\n)\n", "path": "setup.py"}]}
1,961
83
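The datacube fix above relies on a PEP 508 environment marker: everything after the `;` in a requirement string is a condition evaluated at install time, so the `pathlib` backport is pulled in only on Python 2 and never shadows the Python 3 standard-library module. A stripped-down `setup.py` using the same pattern is sketched below; the marker on `pathlib` is the exact one from the patch, while the second requirement is merely another common backport shown for illustration.

```python
# Environment markers in install_requires (PEP 508), as used in the diff above.
from setuptools import setup

setup(
    name="example-pkg",          # placeholder package metadata
    version="0.1",
    install_requires=[
        'pathlib;python_version<"3"',    # the exact marker from the patch
        'enum34;python_version<"3.4"',   # illustrative: another backport marker
    ],
)
```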
gh_patches_debug_30084
rasdani/github-patches
git_diff
cobbler__cobbler-3264
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Task logs don't end ### Describe the bug Task logs contain the normal server logs, even though the task has stopped. This leads to duplicated logs in all created files. ### Steps to reproduce 1. `cobbler mkloaders` (or any other task) 2. Repeat step 1 3. See logs being appended to completed tasks ### Expected behavior The files should only contain the logs for the task. ### Cobbler version Commit: 2a5c7ce9c1533bbf6f6b9050198d98aca45a06ae ````paste below Cobbler 3.4.0 source: ?, ? build time: Fri Oct 14 14:17:09 2022 ```` ### Operating system ``` e62e937a2de9:/var/log/cobbler/tasks # cat /etc/os-release NAME="openSUSE Leap" VERSION="15.3" ID="opensuse-leap" ID_LIKE="suse opensuse" VERSION_ID="15.3" PRETTY_NAME="openSUSE Leap 15.3" ANSI_COLOR="0;32" CPE_NAME="cpe:/o:opensuse:leap:15.3" BUG_REPORT_URL="https://bugs.opensuse.org" HOME_URL="https://www.opensuse.org/" ``` ### Cobbler log Not relevant ### Screenshots Not relevant ### Additional information Found by accident during another debugging session --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `cobbler/utils/thread.py` Content: ``` 1 """ 2 This module is responsible for managing the custom common threading logic Cobbler has. 3 """ 4 5 import logging 6 import pathlib 7 from threading import Thread 8 from typing import Callable 9 10 from cobbler import enums 11 from cobbler import utils 12 13 14 class CobblerThread(Thread): 15 """ 16 This is a custom thread that has a custom logger as well as logic to execute Cobbler triggers. 17 """ 18 19 def __init__( 20 self, 21 event_id: str, 22 remote, 23 options: dict, 24 task_name: str, 25 api, 26 run: Callable, 27 on_done: Callable = None, 28 ): 29 """ 30 This constructor creates a Cobbler thread which then may be run by calling ``run()``. 31 32 :param event_id: The event-id which is associated with this thread. Also used as thread name 33 :param remote: The Cobbler remote object to execute actions with. 34 :param options: Additional options which can be passed into the Thread. 35 :param task_name: The high level task name which is used to trigger pre- and post-task triggers 36 :param api: The Cobbler api object to resolve information with. 37 :param run: The callable that is going to be executed with this thread. 38 :param on_done: An optional callable that is going to be executed after ``run`` but before the triggers. 39 """ 40 super().__init__(name=event_id) 41 self.event_id = event_id 42 self.remote = remote 43 self.logger = logging.getLogger() 44 self.__setup_logger() 45 self._run = run 46 self.on_done = on_done 47 if options is None: 48 options = {} 49 self.options = options 50 self.task_name = task_name 51 self.api = api 52 53 def __setup_logger(self): 54 """ 55 Utility function that will set up the Python logger for the tasks in a special directory. 56 """ 57 filename = pathlib.Path("/var/log/cobbler/tasks") / f"{self.event_id}.log" 58 task_log_handler = logging.FileHandler(str(filename), encoding="utf-8") 59 task_log_formatter = logging.Formatter( 60 "[%(threadName)s] %(asctime)s - %(levelname)s | %(message)s" 61 ) 62 task_log_handler.setFormatter(task_log_formatter) 63 self.logger.setLevel(logging.INFO) 64 self.logger.addHandler(task_log_handler) 65 66 def _set_task_state(self, new_state: enums.EventStatus): 67 """ 68 Set the state of the task. 
(For internal use only) 69 70 :param new_state: The new state of the task. 71 """ 72 if not isinstance(new_state, enums.EventStatus): 73 raise TypeError('"new_state" needs to be of type enums.EventStatus!') 74 if self.event_id not in self.remote.events: 75 raise ValueError('"event_id" not existing!') 76 self.remote.events[self.event_id].state = new_state 77 # clear the list of who has read it 78 self.remote.events[self.event_id].read_by_who = [] 79 if new_state == enums.EventStatus.COMPLETE: 80 self.logger.info("### TASK COMPLETE ###") 81 elif new_state == enums.EventStatus.FAILED: 82 self.logger.error("### TASK FAILED ###") 83 84 def run(self): 85 """ 86 Run the thread. 87 88 :return: The return code of the action. This may a boolean or a Linux return code. 89 """ 90 self.logger.info("start_task(%s); event_id(%s)", self.task_name, self.event_id) 91 try: 92 if utils.run_triggers( 93 api=self.api, 94 globber=f"/var/lib/cobbler/triggers/task/{self.task_name}/pre/*", 95 additional=self.options, 96 ): 97 self._set_task_state(enums.EventStatus.FAILED) 98 return False 99 rc = self._run(self) 100 if rc is not None and not rc: 101 self._set_task_state(enums.EventStatus.FAILED) 102 else: 103 self._set_task_state(enums.EventStatus.COMPLETE) 104 if self.on_done is not None: 105 self.on_done() 106 utils.run_triggers( 107 api=self.api, 108 globber=f"/var/lib/cobbler/triggers/task/{self.task_name}/post/*", 109 additional=self.options, 110 ) 111 return rc 112 except Exception: 113 utils.log_exc() 114 self._set_task_state(enums.EventStatus.FAILED) 115 return False 116 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/cobbler/utils/thread.py b/cobbler/utils/thread.py --- a/cobbler/utils/thread.py +++ b/cobbler/utils/thread.py @@ -41,6 +41,7 @@ self.event_id = event_id self.remote = remote self.logger = logging.getLogger() + self.__task_log_handler = None self.__setup_logger() self._run = run self.on_done = on_done @@ -55,13 +56,13 @@ Utility function that will set up the Python logger for the tasks in a special directory. """ filename = pathlib.Path("/var/log/cobbler/tasks") / f"{self.event_id}.log" - task_log_handler = logging.FileHandler(str(filename), encoding="utf-8") + self.__task_log_handler = logging.FileHandler(str(filename), encoding="utf-8") task_log_formatter = logging.Formatter( "[%(threadName)s] %(asctime)s - %(levelname)s | %(message)s" ) - task_log_handler.setFormatter(task_log_formatter) + self.__task_log_handler.setFormatter(task_log_formatter) self.logger.setLevel(logging.INFO) - self.logger.addHandler(task_log_handler) + self.logger.addHandler(self.__task_log_handler) def _set_task_state(self, new_state: enums.EventStatus): """ @@ -113,3 +114,5 @@ utils.log_exc() self._set_task_state(enums.EventStatus.FAILED) return False + finally: + self.logger.removeHandler(self.__task_log_handler)
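The diff above fixes a classic logging leak: `CobblerThread` attaches a `FileHandler` to the *root* logger and, before the patch, never detached it, so every subsequent log record from the whole process kept flowing into old task files. A stripped-down reproduction of the pattern and the fix is sketched below, with paths and names simplified for illustration.

```python
# Minimal reproduction of the handler leak fixed above. The key point is the
# finally block: without removeHandler(), the shared root logger keeps writing
# to this task's file long after the task has finished.
import logging
import tempfile

def run_task(event_id):
    logfile = tempfile.gettempdir() + "/task-%s.log" % event_id
    handler = logging.FileHandler(logfile, encoding="utf-8")
    logger = logging.getLogger()          # root logger, shared process-wide
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    try:
        logger.info("### TASK %s RUNNING ###", event_id)
    finally:
        logger.removeHandler(handler)     # the step the patch effectively adds
        handler.close()
```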
{"golden_diff": "diff --git a/cobbler/utils/thread.py b/cobbler/utils/thread.py\n--- a/cobbler/utils/thread.py\n+++ b/cobbler/utils/thread.py\n@@ -41,6 +41,7 @@\n self.event_id = event_id\n self.remote = remote\n self.logger = logging.getLogger()\n+ self.__task_log_handler = None\n self.__setup_logger()\n self._run = run\n self.on_done = on_done\n@@ -55,13 +56,13 @@\n Utility function that will set up the Python logger for the tasks in a special directory.\n \"\"\"\n filename = pathlib.Path(\"/var/log/cobbler/tasks\") / f\"{self.event_id}.log\"\n- task_log_handler = logging.FileHandler(str(filename), encoding=\"utf-8\")\n+ self.__task_log_handler = logging.FileHandler(str(filename), encoding=\"utf-8\")\n task_log_formatter = logging.Formatter(\n \"[%(threadName)s] %(asctime)s - %(levelname)s | %(message)s\"\n )\n- task_log_handler.setFormatter(task_log_formatter)\n+ self.__task_log_handler.setFormatter(task_log_formatter)\n self.logger.setLevel(logging.INFO)\n- self.logger.addHandler(task_log_handler)\n+ self.logger.addHandler(self.__task_log_handler)\n \n def _set_task_state(self, new_state: enums.EventStatus):\n \"\"\"\n@@ -113,3 +114,5 @@\n utils.log_exc()\n self._set_task_state(enums.EventStatus.FAILED)\n return False\n+ finally:\n+ self.logger.removeHandler(self.__task_log_handler)\n", "issue": "Task logs don't end\n### Describe the bug\r\n\r\nTask logs contain the normal server logs, even though the task has stopped. This leads to duplicated logs in all created files.\r\n\r\n### Steps to reproduce\r\n\r\n1. `cobbler mkloaders` (or any other task)\r\n2. Repeat step 1\r\n3. See logs being appended to completed tasks\r\n\r\n### Expected behavior\r\n\r\nThe files should only contain the logs for the task.\r\n\r\n### Cobbler version\r\n\r\nCommit: 2a5c7ce9c1533bbf6f6b9050198d98aca45a06ae\r\n\r\n````paste below\r\nCobbler 3.4.0\r\n source: ?, ?\r\n build time: Fri Oct 14 14:17:09 2022\r\n````\r\n\r\n### Operating system\r\n\r\n```\r\ne62e937a2de9:/var/log/cobbler/tasks # cat /etc/os-release \r\nNAME=\"openSUSE Leap\"\r\nVERSION=\"15.3\"\r\nID=\"opensuse-leap\"\r\nID_LIKE=\"suse opensuse\"\r\nVERSION_ID=\"15.3\"\r\nPRETTY_NAME=\"openSUSE Leap 15.3\"\r\nANSI_COLOR=\"0;32\"\r\nCPE_NAME=\"cpe:/o:opensuse:leap:15.3\"\r\nBUG_REPORT_URL=\"https://bugs.opensuse.org\"\r\nHOME_URL=\"https://www.opensuse.org/\"\r\n```\r\n\r\n### Cobbler log\r\n\r\nNot relevant\r\n\r\n### Screenshots\r\n\r\nNot relevant\r\n\r\n### Additional information\r\n\r\nFound by accident during another debugging session\r\n\n", "before_files": [{"content": "\"\"\"\nThis module is responsible for managing the custom common threading logic Cobbler has.\n\"\"\"\n\nimport logging\nimport pathlib\nfrom threading import Thread\nfrom typing import Callable\n\nfrom cobbler import enums\nfrom cobbler import utils\n\n\nclass CobblerThread(Thread):\n \"\"\"\n This is a custom thread that has a custom logger as well as logic to execute Cobbler triggers.\n \"\"\"\n\n def __init__(\n self,\n event_id: str,\n remote,\n options: dict,\n task_name: str,\n api,\n run: Callable,\n on_done: Callable = None,\n ):\n \"\"\"\n This constructor creates a Cobbler thread which then may be run by calling ``run()``.\n\n :param event_id: The event-id which is associated with this thread. 
Also used as thread name\n :param remote: The Cobbler remote object to execute actions with.\n :param options: Additional options which can be passed into the Thread.\n :param task_name: The high level task name which is used to trigger pre- and post-task triggers\n :param api: The Cobbler api object to resolve information with.\n :param run: The callable that is going to be executed with this thread.\n :param on_done: An optional callable that is going to be executed after ``run`` but before the triggers.\n \"\"\"\n super().__init__(name=event_id)\n self.event_id = event_id\n self.remote = remote\n self.logger = logging.getLogger()\n self.__setup_logger()\n self._run = run\n self.on_done = on_done\n if options is None:\n options = {}\n self.options = options\n self.task_name = task_name\n self.api = api\n\n def __setup_logger(self):\n \"\"\"\n Utility function that will set up the Python logger for the tasks in a special directory.\n \"\"\"\n filename = pathlib.Path(\"/var/log/cobbler/tasks\") / f\"{self.event_id}.log\"\n task_log_handler = logging.FileHandler(str(filename), encoding=\"utf-8\")\n task_log_formatter = logging.Formatter(\n \"[%(threadName)s] %(asctime)s - %(levelname)s | %(message)s\"\n )\n task_log_handler.setFormatter(task_log_formatter)\n self.logger.setLevel(logging.INFO)\n self.logger.addHandler(task_log_handler)\n\n def _set_task_state(self, new_state: enums.EventStatus):\n \"\"\"\n Set the state of the task. (For internal use only)\n\n :param new_state: The new state of the task.\n \"\"\"\n if not isinstance(new_state, enums.EventStatus):\n raise TypeError('\"new_state\" needs to be of type enums.EventStatus!')\n if self.event_id not in self.remote.events:\n raise ValueError('\"event_id\" not existing!')\n self.remote.events[self.event_id].state = new_state\n # clear the list of who has read it\n self.remote.events[self.event_id].read_by_who = []\n if new_state == enums.EventStatus.COMPLETE:\n self.logger.info(\"### TASK COMPLETE ###\")\n elif new_state == enums.EventStatus.FAILED:\n self.logger.error(\"### TASK FAILED ###\")\n\n def run(self):\n \"\"\"\n Run the thread.\n\n :return: The return code of the action. 
This may a boolean or a Linux return code.\n \"\"\"\n self.logger.info(\"start_task(%s); event_id(%s)\", self.task_name, self.event_id)\n try:\n if utils.run_triggers(\n api=self.api,\n globber=f\"/var/lib/cobbler/triggers/task/{self.task_name}/pre/*\",\n additional=self.options,\n ):\n self._set_task_state(enums.EventStatus.FAILED)\n return False\n rc = self._run(self)\n if rc is not None and not rc:\n self._set_task_state(enums.EventStatus.FAILED)\n else:\n self._set_task_state(enums.EventStatus.COMPLETE)\n if self.on_done is not None:\n self.on_done()\n utils.run_triggers(\n api=self.api,\n globber=f\"/var/lib/cobbler/triggers/task/{self.task_name}/post/*\",\n additional=self.options,\n )\n return rc\n except Exception:\n utils.log_exc()\n self._set_task_state(enums.EventStatus.FAILED)\n return False\n", "path": "cobbler/utils/thread.py"}], "after_files": [{"content": "\"\"\"\nThis module is responsible for managing the custom common threading logic Cobbler has.\n\"\"\"\n\nimport logging\nimport pathlib\nfrom threading import Thread\nfrom typing import Callable\n\nfrom cobbler import enums\nfrom cobbler import utils\n\n\nclass CobblerThread(Thread):\n \"\"\"\n This is a custom thread that has a custom logger as well as logic to execute Cobbler triggers.\n \"\"\"\n\n def __init__(\n self,\n event_id: str,\n remote,\n options: dict,\n task_name: str,\n api,\n run: Callable,\n on_done: Callable = None,\n ):\n \"\"\"\n This constructor creates a Cobbler thread which then may be run by calling ``run()``.\n\n :param event_id: The event-id which is associated with this thread. Also used as thread name\n :param remote: The Cobbler remote object to execute actions with.\n :param options: Additional options which can be passed into the Thread.\n :param task_name: The high level task name which is used to trigger pre- and post-task triggers\n :param api: The Cobbler api object to resolve information with.\n :param run: The callable that is going to be executed with this thread.\n :param on_done: An optional callable that is going to be executed after ``run`` but before the triggers.\n \"\"\"\n super().__init__(name=event_id)\n self.event_id = event_id\n self.remote = remote\n self.logger = logging.getLogger()\n self.__task_log_handler = None\n self.__setup_logger()\n self._run = run\n self.on_done = on_done\n if options is None:\n options = {}\n self.options = options\n self.task_name = task_name\n self.api = api\n\n def __setup_logger(self):\n \"\"\"\n Utility function that will set up the Python logger for the tasks in a special directory.\n \"\"\"\n filename = pathlib.Path(\"/var/log/cobbler/tasks\") / f\"{self.event_id}.log\"\n self.__task_log_handler = logging.FileHandler(str(filename), encoding=\"utf-8\")\n task_log_formatter = logging.Formatter(\n \"[%(threadName)s] %(asctime)s - %(levelname)s | %(message)s\"\n )\n self.__task_log_handler.setFormatter(task_log_formatter)\n self.logger.setLevel(logging.INFO)\n self.logger.addHandler(self.__task_log_handler)\n\n def _set_task_state(self, new_state: enums.EventStatus):\n \"\"\"\n Set the state of the task. 
(For internal use only)\n\n :param new_state: The new state of the task.\n \"\"\"\n if not isinstance(new_state, enums.EventStatus):\n raise TypeError('\"new_state\" needs to be of type enums.EventStatus!')\n if self.event_id not in self.remote.events:\n raise ValueError('\"event_id\" not existing!')\n self.remote.events[self.event_id].state = new_state\n # clear the list of who has read it\n self.remote.events[self.event_id].read_by_who = []\n if new_state == enums.EventStatus.COMPLETE:\n self.logger.info(\"### TASK COMPLETE ###\")\n elif new_state == enums.EventStatus.FAILED:\n self.logger.error(\"### TASK FAILED ###\")\n\n def run(self):\n \"\"\"\n Run the thread.\n\n :return: The return code of the action. This may a boolean or a Linux return code.\n \"\"\"\n self.logger.info(\"start_task(%s); event_id(%s)\", self.task_name, self.event_id)\n try:\n if utils.run_triggers(\n api=self.api,\n globber=f\"/var/lib/cobbler/triggers/task/{self.task_name}/pre/*\",\n additional=self.options,\n ):\n self._set_task_state(enums.EventStatus.FAILED)\n return False\n rc = self._run(self)\n if rc is not None and not rc:\n self._set_task_state(enums.EventStatus.FAILED)\n else:\n self._set_task_state(enums.EventStatus.COMPLETE)\n if self.on_done is not None:\n self.on_done()\n utils.run_triggers(\n api=self.api,\n globber=f\"/var/lib/cobbler/triggers/task/{self.task_name}/post/*\",\n additional=self.options,\n )\n return rc\n except Exception:\n utils.log_exc()\n self._set_task_state(enums.EventStatus.FAILED)\n return False\n finally:\n self.logger.removeHandler(self.__task_log_handler)\n", "path": "cobbler/utils/thread.py"}]}
1,763
345
gh_patches_debug_43152
rasdani/github-patches
git_diff
sktime__sktime-3561
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- [BUG] `CNNRegressor` missing essential initializing parameters **Describe the bug** <!-- A clear and concise description of what the bug is. --> `CNNRegressor` misses parameters like `optimizer` in its constructor. There might be more things which don't coincide with `CNNClassifier` which should be added but since I'm not sure what all things might be missing, I'm opening this issue to bring it to notice. --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `sktime/regression/deep_learning/cnn.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 """Time Convolutional Neural Network (CNN) for regression.""" 3 4 __author__ = ["AurumnPegasus"] 5 __all__ = ["CNNRegressor"] 6 7 from sktime.networks.cnn import CNNNetwork 8 from sktime.regression.deep_learning.base import BaseDeepRegressor 9 from sktime.utils.validation._dependencies import _check_dl_dependencies 10 11 _check_dl_dependencies(severity="warning") 12 13 14 class CNNRegressor(BaseDeepRegressor): 15 """Time Convolutional Neural Network (CNN), as described in [1]. 16 17 Parameters 18 ---------- 19 should inherited fields be listed here? 20 n_epochs : int, default = 2000 21 the number of epochs to train the model 22 batch_size : int, default = 16 23 the number of samples per gradient update. 24 kernel_size : int, default = 7 25 the length of the 1D convolution window 26 avg_pool_size : int, default = 3 27 size of the average pooling windows 28 n_conv_layers : int, default = 2 29 the number of convolutional plus average pooling layers 30 filter_sizes : array of shape (n_conv_layers) default = [6, 12] 31 random_state : int or None, default=None 32 Seed for random number generation. 33 verbose : boolean, default = False 34 whether to output extra information 35 loss : string, default="mean_squared_error" 36 fit parameter for the keras model 37 optimizer : keras.optimizer, default=keras.optimizers.Adam(), 38 metrics : list of strings, default=["accuracy"], 39 40 Notes 41 ----- 42 .. [1] Zhao et. al, Convolutional neural networks for 43 time series classification, Journal of 44 Systems Engineering and Electronics, 28(1):2017. 45 46 Adapted from the implementation from Fawaz et. al 47 https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/cnn.py 48 """ 49 50 def __init__( 51 self, 52 n_epochs=2000, 53 batch_size=16, 54 kernel_size=7, 55 avg_pool_size=3, 56 n_conv_layers=2, 57 callbacks=None, 58 verbose=False, 59 loss="mean_squared_error", 60 metrics=None, 61 random_seed=0, 62 ): 63 _check_dl_dependencies(severity="error") 64 super(CNNRegressor, self).__init__( 65 batch_size=batch_size, 66 ) 67 self.n_conv_layers = n_conv_layers 68 self.avg_pool_size = avg_pool_size 69 self.kernel_size = kernel_size 70 self.callbacks = callbacks 71 self.n_epochs = n_epochs 72 self.batch_size = batch_size 73 self.verbose = verbose 74 self.loss = loss 75 self.metrics = metrics 76 self.random_seed = random_seed 77 self._network = CNNNetwork() 78 79 def build_model(self, input_shape, **kwargs): 80 """Construct a compiled, un-trained, keras model that is ready for training. 81 82 In sktime, time series are stored in numpy arrays of shape (d,m), where d 83 is the number of dimensions, m is the series length. Keras/tensorflow assume 84 data is in shape (m,d). This method also assumes (m,d). Transpose should 85 happen in fit. 86 87 Parameters 88 ---------- 89 input_shape : tuple 90 The shape of the data fed into the input layer, should be (m,d) 91 92 Returns 93 ------- 94 output : a compiled Keras Model 95 """ 96 import tensorflow as tf 97 from tensorflow import keras 98 99 tf.random.set_seed(self.random_seed) 100 101 if self.metrics is None: 102 metrics = ["accuracy"] 103 else: 104 metrics = self.metrics 105 106 input_layer, output_layer = self._network.build_network(input_shape, **kwargs) 107 108 output_layer = keras.layers.Dense(units=1, activation="sigmoid")(output_layer) 109 110 model = keras.models.Model(inputs=input_layer, outputs=output_layer) 111 112 model.compile( 113 loss=self.loss, 114 optimizer=keras.optimizers.Adam(), 115 metrics=metrics, 116 ) 117 return model 118 119 def _fit(self, X, y): 120 """Fit the classifier on the training set (X, y). 121 122 Parameters 123 ---------- 124 X : np.ndarray of shape = (n_instances (n), n_dimensions (d), series_length (m)) 125 The training input samples. 126 y : np.ndarray of shape n 127 The training data class labels. 128 129 Returns 130 ------- 131 self : object 132 """ 133 if self.callbacks is None: 134 self._callbacks = [] 135 136 # Transpose to conform to Keras input style. 137 X = X.transpose(0, 2, 1) 138 139 self.input_shape = X.shape[1:] 140 self.model_ = self.build_model(self.input_shape) 141 if self.verbose: 142 self.model.summary() 143 144 self.history = self.model_.fit( 145 X, 146 y, 147 batch_size=self.batch_size, 148 epochs=self.n_epochs, 149 verbose=self.verbose, 150 callbacks=self._callbacks, 151 ) 152 return self 153 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/sktime/regression/deep_learning/cnn.py b/sktime/regression/deep_learning/cnn.py --- a/sktime/regression/deep_learning/cnn.py +++ b/sktime/regression/deep_learning/cnn.py @@ -1,9 +1,11 @@ # -*- coding: utf-8 -*- """Time Convolutional Neural Network (CNN) for regression.""" -__author__ = ["AurumnPegasus"] +__author__ = ["AurumnPegasus", "achieveordie"] __all__ = ["CNNRegressor"] +from sklearn.utils import check_random_state + from sktime.networks.cnn import CNNNetwork from sktime.regression.deep_learning.base import BaseDeepRegressor from sktime.utils.validation._dependencies import _check_dl_dependencies @@ -34,7 +36,12 @@ whether to output extra information loss : string, default="mean_squared_error" fit parameter for the keras model - optimizer : keras.optimizer, default=keras.optimizers.Adam(), + activation : keras.activations or string, default `linear` + function to use in the output layer. + optimizer : keras.optimizers or string, default `None`. + when `None`, internally uses `keras.optimizers.Adam(0.01)` + use_bias : bool, default=True + whether to use bias in the output layer. metrics : list of strings, default=["accuracy"], Notes @@ -58,7 +65,10 @@ verbose=False, loss="mean_squared_error", metrics=None, - random_seed=0, + random_state=0, + activation="linear", + use_bias=True, + optimizer=None, ): _check_dl_dependencies(severity="error") super(CNNRegressor, self).__init__( @@ -73,7 +83,11 @@ self.verbose = verbose self.loss = loss self.metrics = metrics - self.random_seed = random_seed + self.random_state = random_state + self.activation = activation + self.use_bias = use_bias + self.optimizer = optimizer + self.history = None self._network = CNNNetwork() def build_model(self, input_shape, **kwargs): @@ -96,7 +110,7 @@ import tensorflow as tf from tensorflow import keras - tf.random.set_seed(self.random_seed) + tf.random.set_seed(self.random_state) if self.metrics is None: metrics = ["accuracy"] @@ -105,13 +119,23 @@ input_layer, output_layer = self._network.build_network(input_shape, **kwargs) - output_layer = keras.layers.Dense(units=1, activation="sigmoid")(output_layer) + output_layer = keras.layers.Dense( + units=1, + activation=self.activation, + use_bias=self.use_bias, + )(output_layer) + + self.optimizer_ = ( + keras.optimizers.Adam(learning_rate=0.01) + if self.optimizer is None + else self.optimizer + ) model = keras.models.Model(inputs=input_layer, outputs=output_layer) model.compile( loss=self.loss, - optimizer=keras.optimizers.Adam(), + optimizer=self.optimizer_, metrics=metrics, ) return model @@ -136,6 +160,7 @@ # Transpose to conform to Keras input style. X = X.transpose(0, 2, 1) + check_random_state(self.random_state) self.input_shape = X.shape[1:] self.model_ = self.build_model(self.input_shape) if self.verbose:
{"golden_diff": "diff --git a/sktime/regression/deep_learning/cnn.py b/sktime/regression/deep_learning/cnn.py\n--- a/sktime/regression/deep_learning/cnn.py\n+++ b/sktime/regression/deep_learning/cnn.py\n@@ -1,9 +1,11 @@\n # -*- coding: utf-8 -*-\n \"\"\"Time Convolutional Neural Network (CNN) for regression.\"\"\"\n \n-__author__ = [\"AurumnPegasus\"]\n+__author__ = [\"AurumnPegasus\", \"achieveordie\"]\n __all__ = [\"CNNRegressor\"]\n \n+from sklearn.utils import check_random_state\n+\n from sktime.networks.cnn import CNNNetwork\n from sktime.regression.deep_learning.base import BaseDeepRegressor\n from sktime.utils.validation._dependencies import _check_dl_dependencies\n@@ -34,7 +36,12 @@\n whether to output extra information\n loss : string, default=\"mean_squared_error\"\n fit parameter for the keras model\n- optimizer : keras.optimizer, default=keras.optimizers.Adam(),\n+ activation : keras.activations or string, default `linear`\n+ function to use in the output layer.\n+ optimizer : keras.optimizers or string, default `None`.\n+ when `None`, internally uses `keras.optimizers.Adam(0.01)`\n+ use_bias : bool, default=True\n+ whether to use bias in the output layer.\n metrics : list of strings, default=[\"accuracy\"],\n \n Notes\n@@ -58,7 +65,10 @@\n verbose=False,\n loss=\"mean_squared_error\",\n metrics=None,\n- random_seed=0,\n+ random_state=0,\n+ activation=\"linear\",\n+ use_bias=True,\n+ optimizer=None,\n ):\n _check_dl_dependencies(severity=\"error\")\n super(CNNRegressor, self).__init__(\n@@ -73,7 +83,11 @@\n self.verbose = verbose\n self.loss = loss\n self.metrics = metrics\n- self.random_seed = random_seed\n+ self.random_state = random_state\n+ self.activation = activation\n+ self.use_bias = use_bias\n+ self.optimizer = optimizer\n+ self.history = None\n self._network = CNNNetwork()\n \n def build_model(self, input_shape, **kwargs):\n@@ -96,7 +110,7 @@\n import tensorflow as tf\n from tensorflow import keras\n \n- tf.random.set_seed(self.random_seed)\n+ tf.random.set_seed(self.random_state)\n \n if self.metrics is None:\n metrics = [\"accuracy\"]\n@@ -105,13 +119,23 @@\n \n input_layer, output_layer = self._network.build_network(input_shape, **kwargs)\n \n- output_layer = keras.layers.Dense(units=1, activation=\"sigmoid\")(output_layer)\n+ output_layer = keras.layers.Dense(\n+ units=1,\n+ activation=self.activation,\n+ use_bias=self.use_bias,\n+ )(output_layer)\n+\n+ self.optimizer_ = (\n+ keras.optimizers.Adam(learning_rate=0.01)\n+ if self.optimizer is None\n+ else self.optimizer\n+ )\n \n model = keras.models.Model(inputs=input_layer, outputs=output_layer)\n \n model.compile(\n loss=self.loss,\n- optimizer=keras.optimizers.Adam(),\n+ optimizer=self.optimizer_,\n metrics=metrics,\n )\n return model\n@@ -136,6 +160,7 @@\n # Transpose to conform to Keras input style.\n X = X.transpose(0, 2, 1)\n \n+ check_random_state(self.random_state)\n self.input_shape = X.shape[1:]\n self.model_ = self.build_model(self.input_shape)\n if self.verbose:\n", "issue": "[BUG] `CNNRegressor` missing essential initializing parameters\n**Describe the bug**\r\n<!--\r\nA clear and concise description of what the bug is.\r\n-->\r\n`CNNRegressor` misses parameters like `optimizer` in its constructor. 
There might be more things which don't coincide with `CNNClassifier` which should be added but since I'm not sure what all things might be missing, I'm opening this issue to bring it to notice.\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Time Convolutional Neural Network (CNN) for regression.\"\"\"\n\n__author__ = [\"AurumnPegasus\"]\n__all__ = [\"CNNRegressor\"]\n\nfrom sktime.networks.cnn import CNNNetwork\nfrom sktime.regression.deep_learning.base import BaseDeepRegressor\nfrom sktime.utils.validation._dependencies import _check_dl_dependencies\n\n_check_dl_dependencies(severity=\"warning\")\n\n\nclass CNNRegressor(BaseDeepRegressor):\n \"\"\"Time Convolutional Neural Network (CNN), as described in [1].\n\n Parameters\n ----------\n should inherited fields be listed here?\n n_epochs : int, default = 2000\n the number of epochs to train the model\n batch_size : int, default = 16\n the number of samples per gradient update.\n kernel_size : int, default = 7\n the length of the 1D convolution window\n avg_pool_size : int, default = 3\n size of the average pooling windows\n n_conv_layers : int, default = 2\n the number of convolutional plus average pooling layers\n filter_sizes : array of shape (n_conv_layers) default = [6, 12]\n random_state : int or None, default=None\n Seed for random number generation.\n verbose : boolean, default = False\n whether to output extra information\n loss : string, default=\"mean_squared_error\"\n fit parameter for the keras model\n optimizer : keras.optimizer, default=keras.optimizers.Adam(),\n metrics : list of strings, default=[\"accuracy\"],\n\n Notes\n -----\n .. [1] Zhao et. al, Convolutional neural networks for\n time series classification, Journal of\n Systems Engineering and Electronics, 28(1):2017.\n\n Adapted from the implementation from Fawaz et. al\n https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/cnn.py\n \"\"\"\n\n def __init__(\n self,\n n_epochs=2000,\n batch_size=16,\n kernel_size=7,\n avg_pool_size=3,\n n_conv_layers=2,\n callbacks=None,\n verbose=False,\n loss=\"mean_squared_error\",\n metrics=None,\n random_seed=0,\n ):\n _check_dl_dependencies(severity=\"error\")\n super(CNNRegressor, self).__init__(\n batch_size=batch_size,\n )\n self.n_conv_layers = n_conv_layers\n self.avg_pool_size = avg_pool_size\n self.kernel_size = kernel_size\n self.callbacks = callbacks\n self.n_epochs = n_epochs\n self.batch_size = batch_size\n self.verbose = verbose\n self.loss = loss\n self.metrics = metrics\n self.random_seed = random_seed\n self._network = CNNNetwork()\n\n def build_model(self, input_shape, **kwargs):\n \"\"\"Construct a compiled, un-trained, keras model that is ready for training.\n\n In sktime, time series are stored in numpy arrays of shape (d,m), where d\n is the number of dimensions, m is the series length. Keras/tensorflow assume\n data is in shape (m,d). This method also assumes (m,d). 
Transpose should\n happen in fit.\n\n Parameters\n ----------\n input_shape : tuple\n The shape of the data fed into the input layer, should be (m,d)\n\n Returns\n -------\n output : a compiled Keras Model\n \"\"\"\n import tensorflow as tf\n from tensorflow import keras\n\n tf.random.set_seed(self.random_seed)\n\n if self.metrics is None:\n metrics = [\"accuracy\"]\n else:\n metrics = self.metrics\n\n input_layer, output_layer = self._network.build_network(input_shape, **kwargs)\n\n output_layer = keras.layers.Dense(units=1, activation=\"sigmoid\")(output_layer)\n\n model = keras.models.Model(inputs=input_layer, outputs=output_layer)\n\n model.compile(\n loss=self.loss,\n optimizer=keras.optimizers.Adam(),\n metrics=metrics,\n )\n return model\n\n def _fit(self, X, y):\n \"\"\"Fit the classifier on the training set (X, y).\n\n Parameters\n ----------\n X : np.ndarray of shape = (n_instances (n), n_dimensions (d), series_length (m))\n The training input samples.\n y : np.ndarray of shape n\n The training data class labels.\n\n Returns\n -------\n self : object\n \"\"\"\n if self.callbacks is None:\n self._callbacks = []\n\n # Transpose to conform to Keras input style.\n X = X.transpose(0, 2, 1)\n\n self.input_shape = X.shape[1:]\n self.model_ = self.build_model(self.input_shape)\n if self.verbose:\n self.model.summary()\n\n self.history = self.model_.fit(\n X,\n y,\n batch_size=self.batch_size,\n epochs=self.n_epochs,\n verbose=self.verbose,\n callbacks=self._callbacks,\n )\n return self\n", "path": "sktime/regression/deep_learning/cnn.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Time Convolutional Neural Network (CNN) for regression.\"\"\"\n\n__author__ = [\"AurumnPegasus\", \"achieveordie\"]\n__all__ = [\"CNNRegressor\"]\n\nfrom sklearn.utils import check_random_state\n\nfrom sktime.networks.cnn import CNNNetwork\nfrom sktime.regression.deep_learning.base import BaseDeepRegressor\nfrom sktime.utils.validation._dependencies import _check_dl_dependencies\n\n_check_dl_dependencies(severity=\"warning\")\n\n\nclass CNNRegressor(BaseDeepRegressor):\n \"\"\"Time Convolutional Neural Network (CNN), as described in [1].\n\n Parameters\n ----------\n should inherited fields be listed here?\n n_epochs : int, default = 2000\n the number of epochs to train the model\n batch_size : int, default = 16\n the number of samples per gradient update.\n kernel_size : int, default = 7\n the length of the 1D convolution window\n avg_pool_size : int, default = 3\n size of the average pooling windows\n n_conv_layers : int, default = 2\n the number of convolutional plus average pooling layers\n filter_sizes : array of shape (n_conv_layers) default = [6, 12]\n random_state : int or None, default=None\n Seed for random number generation.\n verbose : boolean, default = False\n whether to output extra information\n loss : string, default=\"mean_squared_error\"\n fit parameter for the keras model\n activation : keras.activations or string, default `linear`\n function to use in the output layer.\n optimizer : keras.optimizers or string, default `None`.\n when `None`, internally uses `keras.optimizers.Adam(0.01)`\n use_bias : bool, default=True\n whether to use bias in the output layer.\n metrics : list of strings, default=[\"accuracy\"],\n\n Notes\n -----\n .. [1] Zhao et. al, Convolutional neural networks for\n time series classification, Journal of\n Systems Engineering and Electronics, 28(1):2017.\n\n Adapted from the implementation from Fawaz et. 
al\n https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/cnn.py\n \"\"\"\n\n def __init__(\n self,\n n_epochs=2000,\n batch_size=16,\n kernel_size=7,\n avg_pool_size=3,\n n_conv_layers=2,\n callbacks=None,\n verbose=False,\n loss=\"mean_squared_error\",\n metrics=None,\n random_state=0,\n activation=\"linear\",\n use_bias=True,\n optimizer=None,\n ):\n _check_dl_dependencies(severity=\"error\")\n super(CNNRegressor, self).__init__(\n batch_size=batch_size,\n )\n self.n_conv_layers = n_conv_layers\n self.avg_pool_size = avg_pool_size\n self.kernel_size = kernel_size\n self.callbacks = callbacks\n self.n_epochs = n_epochs\n self.batch_size = batch_size\n self.verbose = verbose\n self.loss = loss\n self.metrics = metrics\n self.random_state = random_state\n self.activation = activation\n self.use_bias = use_bias\n self.optimizer = optimizer\n self.history = None\n self._network = CNNNetwork()\n\n def build_model(self, input_shape, **kwargs):\n \"\"\"Construct a compiled, un-trained, keras model that is ready for training.\n\n In sktime, time series are stored in numpy arrays of shape (d,m), where d\n is the number of dimensions, m is the series length. Keras/tensorflow assume\n data is in shape (m,d). This method also assumes (m,d). Transpose should\n happen in fit.\n\n Parameters\n ----------\n input_shape : tuple\n The shape of the data fed into the input layer, should be (m,d)\n\n Returns\n -------\n output : a compiled Keras Model\n \"\"\"\n import tensorflow as tf\n from tensorflow import keras\n\n tf.random.set_seed(self.random_state)\n\n if self.metrics is None:\n metrics = [\"accuracy\"]\n else:\n metrics = self.metrics\n\n input_layer, output_layer = self._network.build_network(input_shape, **kwargs)\n\n output_layer = keras.layers.Dense(\n units=1,\n activation=self.activation,\n use_bias=self.use_bias,\n )(output_layer)\n\n self.optimizer_ = (\n keras.optimizers.Adam(learning_rate=0.01)\n if self.optimizer is None\n else self.optimizer\n )\n\n model = keras.models.Model(inputs=input_layer, outputs=output_layer)\n\n model.compile(\n loss=self.loss,\n optimizer=self.optimizer_,\n metrics=metrics,\n )\n return model\n\n def _fit(self, X, y):\n \"\"\"Fit the classifier on the training set (X, y).\n\n Parameters\n ----------\n X : np.ndarray of shape = (n_instances (n), n_dimensions (d), series_length (m))\n The training input samples.\n y : np.ndarray of shape n\n The training data class labels.\n\n Returns\n -------\n self : object\n \"\"\"\n if self.callbacks is None:\n self._callbacks = []\n\n # Transpose to conform to Keras input style.\n X = X.transpose(0, 2, 1)\n\n check_random_state(self.random_state)\n self.input_shape = X.shape[1:]\n self.model_ = self.build_model(self.input_shape)\n if self.verbose:\n self.model.summary()\n\n self.history = self.model_.fit(\n X,\n y,\n batch_size=self.batch_size,\n epochs=self.n_epochs,\n verbose=self.verbose,\n callbacks=self._callbacks,\n )\n return self\n", "path": "sktime/regression/deep_learning/cnn.py"}]}
1,807
828
gh_patches_debug_14660
rasdani/github-patches
git_diff
lhotse-speech__lhotse-103
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- A few questions... I came across a few confusions while I was reading the code in order to write an example. It would be helpful for me if they are clarified ( I may have missed something). https://github.com/lhotse-speech/lhotse/blob/master/lhotse/kaldi.py#L68 why `duration - start` rather than just `duration`? https://github.com/lhotse-speech/lhotse/blob/master/lhotse/audio.py#L178 why not `[n_sources, n_channels, n_samples]`? Thanks! --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `lhotse/kaldi.py` Content: ``` 1 from collections import defaultdict 2 from pathlib import Path 3 from typing import Dict, Optional, Tuple 4 5 from lhotse.audio import AudioSource, Recording, RecordingSet 6 from lhotse.supervision import SupervisionSegment, SupervisionSet 7 from lhotse.utils import Pathlike 8 9 10 def load_kaldi_data_dir(path: Pathlike, sampling_rate: int) -> Tuple[RecordingSet, Optional[SupervisionSet]]: 11 """ 12 Load a Kaldi data directory and convert it to a Lhotse RecordingSet and SupervisionSet manifests. 13 For this to work, at least the wav.scp file must exist. 14 SupervisionSet is created only when a segments file exists. 15 All the other files (text, utt2spk, etc.) are optional, and some of them might not be handled yet. 16 In particular, feats.scp files are ignored. 17 """ 18 path = Path(path) 19 assert path.is_dir() 20 21 # must exist for RecordingSet 22 recordings = load_kaldi_text_mapping(path / 'wav.scp', must_exist=True) 23 24 durations = defaultdict(float) 25 reco2dur = path / 'reco2dur' 26 if not reco2dur.is_file(): 27 raise ValueError(f"No such file: '{reco2dur}' -- fix it by running: utils/data/get_reco2dur.sh <data-dir>") 28 with reco2dur.open() as f: 29 for line in f: 30 recording_id, dur = line.strip().split() 31 durations[recording_id] = float(dur) 32 33 audio_set = RecordingSet.from_recordings( 34 Recording( 35 id=recording_id, 36 sources=[ 37 AudioSource( 38 type='command' if path_or_cmd.endswith('|') else 'file', 39 channels=[0], 40 source=path_or_cmd[:-1] if path_or_cmd.endswith('|') else path_or_cmd 41 ) 42 ], 43 sampling_rate=sampling_rate, 44 num_samples=int(durations[recording_id] * sampling_rate), 45 duration=durations[recording_id] 46 ) 47 for recording_id, path_or_cmd in recordings.items() 48 ) 49 50 # must exist for SupervisionSet 51 segments = path / 'segments' 52 if not segments.is_file(): 53 return audio_set, None 54 55 with segments.open() as f: 56 supervision_segments = [l.strip().split() for l in f] 57 58 texts = load_kaldi_text_mapping(path / 'text') 59 speakers = load_kaldi_text_mapping(path / 'utt2spk') 60 genders = load_kaldi_text_mapping(path / 'spk2gender') 61 languages = load_kaldi_text_mapping(path / 'utt2lang') 62 63 supervision_set = SupervisionSet.from_segments( 64 SupervisionSegment( 65 id=segment_id, 66 recording_id=recording_id, 67 start=float(start), 68 duration=float(duration) - float(start), 69 channel=0, 70 text=texts[segment_id], 71 language=languages[segment_id], 72 speaker=speakers[segment_id], 73 gender=genders[speakers[segment_id]] 74 ) 75 for segment_id, recording_id, start, duration in supervision_segments 76 ) 77 78 return audio_set, supervision_set 79 80 81 def load_kaldi_text_mapping(path: Path, must_exist: bool = False) -> Dict[str, Optional[str]]: 82 """Load Kaldi files such as utt2spk, spk2gender, text, etc. as a dict.""" 83 mapping = defaultdict(lambda: None) 84 if path.is_file(): 85 with path.open() as f: 86 mapping = dict(line.strip().split(' ', maxsplit=1) for line in f) 87 elif must_exist: 88 raise ValueError(f"No such file: {path}") 89 return mapping 90 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/lhotse/kaldi.py b/lhotse/kaldi.py --- a/lhotse/kaldi.py +++ b/lhotse/kaldi.py @@ -65,14 +65,14 @@ id=segment_id, recording_id=recording_id, start=float(start), - duration=float(duration) - float(start), + duration=float(end) - float(start), channel=0, text=texts[segment_id], language=languages[segment_id], speaker=speakers[segment_id], gender=genders[speakers[segment_id]] ) - for segment_id, recording_id, start, duration in supervision_segments + for segment_id, recording_id, start, end in supervision_segments ) return audio_set, supervision_set
{"golden_diff": "diff --git a/lhotse/kaldi.py b/lhotse/kaldi.py\n--- a/lhotse/kaldi.py\n+++ b/lhotse/kaldi.py\n@@ -65,14 +65,14 @@\n id=segment_id,\n recording_id=recording_id,\n start=float(start),\n- duration=float(duration) - float(start),\n+ duration=float(end) - float(start),\n channel=0,\n text=texts[segment_id],\n language=languages[segment_id],\n speaker=speakers[segment_id],\n gender=genders[speakers[segment_id]]\n )\n- for segment_id, recording_id, start, duration in supervision_segments\n+ for segment_id, recording_id, start, end in supervision_segments\n )\n \n return audio_set, supervision_set\n", "issue": "A few questions...\nI came across a few confusions while I was reading the code in order to write an example. It would be helpful for me if they are clarified ( I may have missed something).\r\n\r\nhttps://github.com/lhotse-speech/lhotse/blob/master/lhotse/kaldi.py#L68\r\nwhy `duration - start` rather than just `duration`?\r\n\r\nhttps://github.com/lhotse-speech/lhotse/blob/master/lhotse/audio.py#L178\r\nwhy not `[n_sources, n_channels, n_samples]`?\r\n\r\nThanks!\n", "before_files": [{"content": "from collections import defaultdict\nfrom pathlib import Path\nfrom typing import Dict, Optional, Tuple\n\nfrom lhotse.audio import AudioSource, Recording, RecordingSet\nfrom lhotse.supervision import SupervisionSegment, SupervisionSet\nfrom lhotse.utils import Pathlike\n\n\ndef load_kaldi_data_dir(path: Pathlike, sampling_rate: int) -> Tuple[RecordingSet, Optional[SupervisionSet]]:\n \"\"\"\n Load a Kaldi data directory and convert it to a Lhotse RecordingSet and SupervisionSet manifests.\n For this to work, at least the wav.scp file must exist.\n SupervisionSet is created only when a segments file exists.\n All the other files (text, utt2spk, etc.) 
are optional, and some of them might not be handled yet.\n In particular, feats.scp files are ignored.\n \"\"\"\n path = Path(path)\n assert path.is_dir()\n\n # must exist for RecordingSet\n recordings = load_kaldi_text_mapping(path / 'wav.scp', must_exist=True)\n\n durations = defaultdict(float)\n reco2dur = path / 'reco2dur'\n if not reco2dur.is_file():\n raise ValueError(f\"No such file: '{reco2dur}' -- fix it by running: utils/data/get_reco2dur.sh <data-dir>\")\n with reco2dur.open() as f:\n for line in f:\n recording_id, dur = line.strip().split()\n durations[recording_id] = float(dur)\n\n audio_set = RecordingSet.from_recordings(\n Recording(\n id=recording_id,\n sources=[\n AudioSource(\n type='command' if path_or_cmd.endswith('|') else 'file',\n channels=[0],\n source=path_or_cmd[:-1] if path_or_cmd.endswith('|') else path_or_cmd\n )\n ],\n sampling_rate=sampling_rate,\n num_samples=int(durations[recording_id] * sampling_rate),\n duration=durations[recording_id]\n )\n for recording_id, path_or_cmd in recordings.items()\n )\n\n # must exist for SupervisionSet\n segments = path / 'segments'\n if not segments.is_file():\n return audio_set, None\n\n with segments.open() as f:\n supervision_segments = [l.strip().split() for l in f]\n\n texts = load_kaldi_text_mapping(path / 'text')\n speakers = load_kaldi_text_mapping(path / 'utt2spk')\n genders = load_kaldi_text_mapping(path / 'spk2gender')\n languages = load_kaldi_text_mapping(path / 'utt2lang')\n\n supervision_set = SupervisionSet.from_segments(\n SupervisionSegment(\n id=segment_id,\n recording_id=recording_id,\n start=float(start),\n duration=float(duration) - float(start),\n channel=0,\n text=texts[segment_id],\n language=languages[segment_id],\n speaker=speakers[segment_id],\n gender=genders[speakers[segment_id]]\n )\n for segment_id, recording_id, start, duration in supervision_segments\n )\n\n return audio_set, supervision_set\n\n\ndef load_kaldi_text_mapping(path: Path, must_exist: bool = False) -> Dict[str, Optional[str]]:\n \"\"\"Load Kaldi files such as utt2spk, spk2gender, text, etc. as a dict.\"\"\"\n mapping = defaultdict(lambda: None)\n if path.is_file():\n with path.open() as f:\n mapping = dict(line.strip().split(' ', maxsplit=1) for line in f)\n elif must_exist:\n raise ValueError(f\"No such file: {path}\")\n return mapping\n", "path": "lhotse/kaldi.py"}], "after_files": [{"content": "from collections import defaultdict\nfrom pathlib import Path\nfrom typing import Dict, Optional, Tuple\n\nfrom lhotse.audio import AudioSource, Recording, RecordingSet\nfrom lhotse.supervision import SupervisionSegment, SupervisionSet\nfrom lhotse.utils import Pathlike\n\n\ndef load_kaldi_data_dir(path: Pathlike, sampling_rate: int) -> Tuple[RecordingSet, Optional[SupervisionSet]]:\n \"\"\"\n Load a Kaldi data directory and convert it to a Lhotse RecordingSet and SupervisionSet manifests.\n For this to work, at least the wav.scp file must exist.\n SupervisionSet is created only when a segments file exists.\n All the other files (text, utt2spk, etc.) 
are optional, and some of them might not be handled yet.\n In particular, feats.scp files are ignored.\n \"\"\"\n path = Path(path)\n assert path.is_dir()\n\n # must exist for RecordingSet\n recordings = load_kaldi_text_mapping(path / 'wav.scp', must_exist=True)\n\n durations = defaultdict(float)\n reco2dur = path / 'reco2dur'\n if not reco2dur.is_file():\n raise ValueError(f\"No such file: '{reco2dur}' -- fix it by running: utils/data/get_reco2dur.sh <data-dir>\")\n with reco2dur.open() as f:\n for line in f:\n recording_id, dur = line.strip().split()\n durations[recording_id] = float(dur)\n\n audio_set = RecordingSet.from_recordings(\n Recording(\n id=recording_id,\n sources=[\n AudioSource(\n type='command' if path_or_cmd.endswith('|') else 'file',\n channels=[0],\n source=path_or_cmd[:-1] if path_or_cmd.endswith('|') else path_or_cmd\n )\n ],\n sampling_rate=sampling_rate,\n num_samples=int(durations[recording_id] * sampling_rate),\n duration=durations[recording_id]\n )\n for recording_id, path_or_cmd in recordings.items()\n )\n\n # must exist for SupervisionSet\n segments = path / 'segments'\n if not segments.is_file():\n return audio_set, None\n\n with segments.open() as f:\n supervision_segments = [l.strip().split() for l in f]\n\n texts = load_kaldi_text_mapping(path / 'text')\n speakers = load_kaldi_text_mapping(path / 'utt2spk')\n genders = load_kaldi_text_mapping(path / 'spk2gender')\n languages = load_kaldi_text_mapping(path / 'utt2lang')\n\n supervision_set = SupervisionSet.from_segments(\n SupervisionSegment(\n id=segment_id,\n recording_id=recording_id,\n start=float(start),\n duration=float(end) - float(start),\n channel=0,\n text=texts[segment_id],\n language=languages[segment_id],\n speaker=speakers[segment_id],\n gender=genders[speakers[segment_id]]\n )\n for segment_id, recording_id, start, end in supervision_segments\n )\n\n return audio_set, supervision_set\n\n\ndef load_kaldi_text_mapping(path: Path, must_exist: bool = False) -> Dict[str, Optional[str]]:\n \"\"\"Load Kaldi files such as utt2spk, spk2gender, text, etc. as a dict.\"\"\"\n mapping = defaultdict(lambda: None)\n if path.is_file():\n with path.open() as f:\n mapping = dict(line.strip().split(' ', maxsplit=1) for line in f)\n elif must_exist:\n raise ValueError(f\"No such file: {path}\")\n return mapping\n", "path": "lhotse/kaldi.py"}]}
1,343
175
gh_patches_debug_8194
rasdani/github-patches
git_diff
readthedocs__readthedocs.org-6402
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Improve regex for semver in automation rules We have a very basic pattern, but on the semver faq they have a better one https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string This needs to be changed and with a comment to the link from where it came from. https://github.com/readthedocs/readthedocs.org/blob/ff0ad67991e599f79e9d956a3733785ccb78c505/readthedocs/builds/constants.py#L113-L113 --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `readthedocs/builds/constants.py` Content: ``` 1 """Constants for the builds app.""" 2 3 from django.conf import settings 4 from django.utils.translation import ugettext_lazy as _ 5 6 7 BUILD_STATE_TRIGGERED = 'triggered' 8 BUILD_STATE_CLONING = 'cloning' 9 BUILD_STATE_INSTALLING = 'installing' 10 BUILD_STATE_BUILDING = 'building' 11 BUILD_STATE_FINISHED = 'finished' 12 13 BUILD_STATE = ( 14 (BUILD_STATE_TRIGGERED, _('Triggered')), 15 (BUILD_STATE_CLONING, _('Cloning')), 16 (BUILD_STATE_INSTALLING, _('Installing')), 17 (BUILD_STATE_BUILDING, _('Building')), 18 (BUILD_STATE_FINISHED, _('Finished')), 19 ) 20 21 BUILD_TYPES = ( 22 ('html', _('HTML')), 23 ('pdf', _('PDF')), 24 ('epub', _('Epub')), 25 # There is currently no support for building man/dash formats, but we keep 26 # it there since the DB might still contain those values for legacy 27 # projects. 28 ('man', _('Manpage')), 29 ('dash', _('Dash')), 30 ) 31 32 # Manager name for Internal Versions or Builds. 33 # ie: Versions and Builds Excluding pull request/merge request Versions and Builds. 34 INTERNAL = 'internal' 35 # Manager name for External Versions or Builds. 36 # ie: Only pull request/merge request Versions and Builds. 37 EXTERNAL = 'external' 38 EXTERNAL_TEXT = _('External') 39 40 BRANCH = 'branch' 41 BRANCH_TEXT = _('Branch') 42 TAG = 'tag' 43 TAG_TEXT = _('Tag') 44 UNKNOWN = 'unknown' 45 UNKNOWN_TEXT = _('Unknown') 46 47 VERSION_TYPES = ( 48 (BRANCH, BRANCH_TEXT), 49 (TAG, TAG_TEXT), 50 (EXTERNAL, EXTERNAL_TEXT), 51 (UNKNOWN, UNKNOWN_TEXT), 52 ) 53 54 LATEST = settings.RTD_LATEST 55 LATEST_VERBOSE_NAME = settings.RTD_LATEST_VERBOSE_NAME 56 57 STABLE = settings.RTD_STABLE 58 STABLE_VERBOSE_NAME = settings.RTD_STABLE_VERBOSE_NAME 59 60 # Those names are specialcased version names. They do not correspond to 61 # branches/tags in a project's repository. 62 NON_REPOSITORY_VERSIONS = ( 63 LATEST, 64 STABLE, 65 ) 66 67 # General Build Statuses 68 BUILD_STATUS_FAILURE = 'failed' 69 BUILD_STATUS_PENDING = 'pending' 70 BUILD_STATUS_SUCCESS = 'success' 71 72 # GitHub Build Statuses 73 GITHUB_BUILD_STATUS_FAILURE = 'failure' 74 GITHUB_BUILD_STATUS_PENDING = 'pending' 75 GITHUB_BUILD_STATUS_SUCCESS = 'success' 76 77 # GitLab Build Statuses 78 GITLAB_BUILD_STATUS_FAILURE = 'failed' 79 GITLAB_BUILD_STATUS_PENDING = 'pending' 80 GITLAB_BUILD_STATUS_SUCCESS = 'success' 81 82 # Used to select correct Build status and description to be sent to each service API 83 SELECT_BUILD_STATUS = { 84 BUILD_STATUS_FAILURE: { 85 'github': GITHUB_BUILD_STATUS_FAILURE, 86 'gitlab': GITLAB_BUILD_STATUS_FAILURE, 87 'description': 'Read the Docs build failed!', 88 }, 89 BUILD_STATUS_PENDING: { 90 'github': GITHUB_BUILD_STATUS_PENDING, 91 'gitlab': GITLAB_BUILD_STATUS_PENDING, 92 'description': 'Read the Docs build is in progress!', 93 }, 94 BUILD_STATUS_SUCCESS: { 95 'github': GITHUB_BUILD_STATUS_SUCCESS, 96 'gitlab': GITLAB_BUILD_STATUS_SUCCESS, 97 'description': 'Read the Docs build succeeded!', 98 }, 99 } 100 101 RTD_BUILD_STATUS_API_NAME = 'continuous-documentation/read-the-docs' 102 103 GITHUB_EXTERNAL_VERSION_NAME = 'Pull Request' 104 GITLAB_EXTERNAL_VERSION_NAME = 'Merge Request' 105 GENERIC_EXTERNAL_VERSION_NAME = 'External Version' 106 107 108 # Automation rules 109 110 ALL_VERSIONS = 'all-versions' 111 ALL_VERSIONS_REGEX = r'.*' 112 SEMVER_VERSIONS = 'semver-versions' 113 SEMVER_VERSIONS_REGEX = r'^v?(\d+\.)(\d+\.)(\d+)(-.+)?$' 114 115 116 PREDEFINED_MATCH_ARGS = ( 117 (ALL_VERSIONS, _('Any version')), 118 (SEMVER_VERSIONS, _('SemVer versions')), 119 (None, _('Custom match')), 120 ) 121 122 PREDEFINED_MATCH_ARGS_VALUES = { 123 ALL_VERSIONS: ALL_VERSIONS_REGEX, 124 SEMVER_VERSIONS: SEMVER_VERSIONS_REGEX, 125 } 126 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/readthedocs/builds/constants.py b/readthedocs/builds/constants.py --- a/readthedocs/builds/constants.py +++ b/readthedocs/builds/constants.py @@ -110,7 +110,10 @@ ALL_VERSIONS = 'all-versions' ALL_VERSIONS_REGEX = r'.*' SEMVER_VERSIONS = 'semver-versions' -SEMVER_VERSIONS_REGEX = r'^v?(\d+\.)(\d+\.)(\d+)(-.+)?$' + +# Pattern referred from +# https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string +SEMVER_VERSIONS_REGEX = r'^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$' # noqa PREDEFINED_MATCH_ARGS = (
{"golden_diff": "diff --git a/readthedocs/builds/constants.py b/readthedocs/builds/constants.py\n--- a/readthedocs/builds/constants.py\n+++ b/readthedocs/builds/constants.py\n@@ -110,7 +110,10 @@\n ALL_VERSIONS = 'all-versions'\n ALL_VERSIONS_REGEX = r'.*'\n SEMVER_VERSIONS = 'semver-versions'\n-SEMVER_VERSIONS_REGEX = r'^v?(\\d+\\.)(\\d+\\.)(\\d+)(-.+)?$'\n+\n+# Pattern referred from\n+# https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string\n+SEMVER_VERSIONS_REGEX = r'^v?(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?$' # noqa\n \n \n PREDEFINED_MATCH_ARGS = (\n", "issue": "Improve regex for semver in automation rules\nWe have a very basic pattern, but on the semver faq they have a better one\r\n\r\nhttps://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string\r\n\r\nThis needs to be changed and with a comment to the link from where it came from.\r\n\r\nhttps://github.com/readthedocs/readthedocs.org/blob/ff0ad67991e599f79e9d956a3733785ccb78c505/readthedocs/builds/constants.py#L113-L113\n", "before_files": [{"content": "\"\"\"Constants for the builds app.\"\"\"\n\nfrom django.conf import settings\nfrom django.utils.translation import ugettext_lazy as _\n\n\nBUILD_STATE_TRIGGERED = 'triggered'\nBUILD_STATE_CLONING = 'cloning'\nBUILD_STATE_INSTALLING = 'installing'\nBUILD_STATE_BUILDING = 'building'\nBUILD_STATE_FINISHED = 'finished'\n\nBUILD_STATE = (\n (BUILD_STATE_TRIGGERED, _('Triggered')),\n (BUILD_STATE_CLONING, _('Cloning')),\n (BUILD_STATE_INSTALLING, _('Installing')),\n (BUILD_STATE_BUILDING, _('Building')),\n (BUILD_STATE_FINISHED, _('Finished')),\n)\n\nBUILD_TYPES = (\n ('html', _('HTML')),\n ('pdf', _('PDF')),\n ('epub', _('Epub')),\n # There is currently no support for building man/dash formats, but we keep\n # it there since the DB might still contain those values for legacy\n # projects.\n ('man', _('Manpage')),\n ('dash', _('Dash')),\n)\n\n# Manager name for Internal Versions or Builds.\n# ie: Versions and Builds Excluding pull request/merge request Versions and Builds.\nINTERNAL = 'internal'\n# Manager name for External Versions or Builds.\n# ie: Only pull request/merge request Versions and Builds.\nEXTERNAL = 'external'\nEXTERNAL_TEXT = _('External')\n\nBRANCH = 'branch'\nBRANCH_TEXT = _('Branch')\nTAG = 'tag'\nTAG_TEXT = _('Tag')\nUNKNOWN = 'unknown'\nUNKNOWN_TEXT = _('Unknown')\n\nVERSION_TYPES = (\n (BRANCH, BRANCH_TEXT),\n (TAG, TAG_TEXT),\n (EXTERNAL, EXTERNAL_TEXT),\n (UNKNOWN, UNKNOWN_TEXT),\n)\n\nLATEST = settings.RTD_LATEST\nLATEST_VERBOSE_NAME = settings.RTD_LATEST_VERBOSE_NAME\n\nSTABLE = settings.RTD_STABLE\nSTABLE_VERBOSE_NAME = settings.RTD_STABLE_VERBOSE_NAME\n\n# Those names are specialcased version names. 
They do not correspond to\n# branches/tags in a project's repository.\nNON_REPOSITORY_VERSIONS = (\n LATEST,\n STABLE,\n)\n\n# General Build Statuses\nBUILD_STATUS_FAILURE = 'failed'\nBUILD_STATUS_PENDING = 'pending'\nBUILD_STATUS_SUCCESS = 'success'\n\n# GitHub Build Statuses\nGITHUB_BUILD_STATUS_FAILURE = 'failure'\nGITHUB_BUILD_STATUS_PENDING = 'pending'\nGITHUB_BUILD_STATUS_SUCCESS = 'success'\n\n# GitLab Build Statuses\nGITLAB_BUILD_STATUS_FAILURE = 'failed'\nGITLAB_BUILD_STATUS_PENDING = 'pending'\nGITLAB_BUILD_STATUS_SUCCESS = 'success'\n\n# Used to select correct Build status and description to be sent to each service API\nSELECT_BUILD_STATUS = {\n BUILD_STATUS_FAILURE: {\n 'github': GITHUB_BUILD_STATUS_FAILURE,\n 'gitlab': GITLAB_BUILD_STATUS_FAILURE,\n 'description': 'Read the Docs build failed!',\n },\n BUILD_STATUS_PENDING: {\n 'github': GITHUB_BUILD_STATUS_PENDING,\n 'gitlab': GITLAB_BUILD_STATUS_PENDING,\n 'description': 'Read the Docs build is in progress!',\n },\n BUILD_STATUS_SUCCESS: {\n 'github': GITHUB_BUILD_STATUS_SUCCESS,\n 'gitlab': GITLAB_BUILD_STATUS_SUCCESS,\n 'description': 'Read the Docs build succeeded!',\n },\n}\n\nRTD_BUILD_STATUS_API_NAME = 'continuous-documentation/read-the-docs'\n\nGITHUB_EXTERNAL_VERSION_NAME = 'Pull Request'\nGITLAB_EXTERNAL_VERSION_NAME = 'Merge Request'\nGENERIC_EXTERNAL_VERSION_NAME = 'External Version'\n\n\n# Automation rules\n\nALL_VERSIONS = 'all-versions'\nALL_VERSIONS_REGEX = r'.*'\nSEMVER_VERSIONS = 'semver-versions'\nSEMVER_VERSIONS_REGEX = r'^v?(\\d+\\.)(\\d+\\.)(\\d+)(-.+)?$'\n\n\nPREDEFINED_MATCH_ARGS = (\n (ALL_VERSIONS, _('Any version')),\n (SEMVER_VERSIONS, _('SemVer versions')),\n (None, _('Custom match')),\n)\n\nPREDEFINED_MATCH_ARGS_VALUES = {\n ALL_VERSIONS: ALL_VERSIONS_REGEX,\n SEMVER_VERSIONS: SEMVER_VERSIONS_REGEX,\n}\n", "path": "readthedocs/builds/constants.py"}], "after_files": [{"content": "\"\"\"Constants for the builds app.\"\"\"\n\nfrom django.conf import settings\nfrom django.utils.translation import ugettext_lazy as _\n\n\nBUILD_STATE_TRIGGERED = 'triggered'\nBUILD_STATE_CLONING = 'cloning'\nBUILD_STATE_INSTALLING = 'installing'\nBUILD_STATE_BUILDING = 'building'\nBUILD_STATE_FINISHED = 'finished'\n\nBUILD_STATE = (\n (BUILD_STATE_TRIGGERED, _('Triggered')),\n (BUILD_STATE_CLONING, _('Cloning')),\n (BUILD_STATE_INSTALLING, _('Installing')),\n (BUILD_STATE_BUILDING, _('Building')),\n (BUILD_STATE_FINISHED, _('Finished')),\n)\n\nBUILD_TYPES = (\n ('html', _('HTML')),\n ('pdf', _('PDF')),\n ('epub', _('Epub')),\n # There is currently no support for building man/dash formats, but we keep\n # it there since the DB might still contain those values for legacy\n # projects.\n ('man', _('Manpage')),\n ('dash', _('Dash')),\n)\n\n# Manager name for Internal Versions or Builds.\n# ie: Versions and Builds Excluding pull request/merge request Versions and Builds.\nINTERNAL = 'internal'\n# Manager name for External Versions or Builds.\n# ie: Only pull request/merge request Versions and Builds.\nEXTERNAL = 'external'\nEXTERNAL_TEXT = _('External')\n\nBRANCH = 'branch'\nBRANCH_TEXT = _('Branch')\nTAG = 'tag'\nTAG_TEXT = _('Tag')\nUNKNOWN = 'unknown'\nUNKNOWN_TEXT = _('Unknown')\n\nVERSION_TYPES = (\n (BRANCH, BRANCH_TEXT),\n (TAG, TAG_TEXT),\n (EXTERNAL, EXTERNAL_TEXT),\n (UNKNOWN, UNKNOWN_TEXT),\n)\n\nLATEST = settings.RTD_LATEST\nLATEST_VERBOSE_NAME = settings.RTD_LATEST_VERBOSE_NAME\n\nSTABLE = settings.RTD_STABLE\nSTABLE_VERBOSE_NAME = settings.RTD_STABLE_VERBOSE_NAME\n\n# Those names are specialcased version names. 
They do not correspond to\n# branches/tags in a project's repository.\nNON_REPOSITORY_VERSIONS = (\n LATEST,\n STABLE,\n)\n\n# General Build Statuses\nBUILD_STATUS_FAILURE = 'failed'\nBUILD_STATUS_PENDING = 'pending'\nBUILD_STATUS_SUCCESS = 'success'\n\n# GitHub Build Statuses\nGITHUB_BUILD_STATUS_FAILURE = 'failure'\nGITHUB_BUILD_STATUS_PENDING = 'pending'\nGITHUB_BUILD_STATUS_SUCCESS = 'success'\n\n# GitLab Build Statuses\nGITLAB_BUILD_STATUS_FAILURE = 'failed'\nGITLAB_BUILD_STATUS_PENDING = 'pending'\nGITLAB_BUILD_STATUS_SUCCESS = 'success'\n\n# Used to select correct Build status and description to be sent to each service API\nSELECT_BUILD_STATUS = {\n BUILD_STATUS_FAILURE: {\n 'github': GITHUB_BUILD_STATUS_FAILURE,\n 'gitlab': GITLAB_BUILD_STATUS_FAILURE,\n 'description': 'Read the Docs build failed!',\n },\n BUILD_STATUS_PENDING: {\n 'github': GITHUB_BUILD_STATUS_PENDING,\n 'gitlab': GITLAB_BUILD_STATUS_PENDING,\n 'description': 'Read the Docs build is in progress!',\n },\n BUILD_STATUS_SUCCESS: {\n 'github': GITHUB_BUILD_STATUS_SUCCESS,\n 'gitlab': GITLAB_BUILD_STATUS_SUCCESS,\n 'description': 'Read the Docs build succeeded!',\n },\n}\n\nRTD_BUILD_STATUS_API_NAME = 'continuous-documentation/read-the-docs'\n\nGITHUB_EXTERNAL_VERSION_NAME = 'Pull Request'\nGITLAB_EXTERNAL_VERSION_NAME = 'Merge Request'\nGENERIC_EXTERNAL_VERSION_NAME = 'External Version'\n\n\n# Automation rules\n\nALL_VERSIONS = 'all-versions'\nALL_VERSIONS_REGEX = r'.*'\nSEMVER_VERSIONS = 'semver-versions'\n\n# Pattern referred from\n# https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string\nSEMVER_VERSIONS_REGEX = r'^v?(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?$' # noqa\n\n\nPREDEFINED_MATCH_ARGS = (\n (ALL_VERSIONS, _('Any version')),\n (SEMVER_VERSIONS, _('SemVer versions')),\n (None, _('Custom match')),\n)\n\nPREDEFINED_MATCH_ARGS_VALUES = {\n ALL_VERSIONS: ALL_VERSIONS_REGEX,\n SEMVER_VERSIONS: SEMVER_VERSIONS_REGEX,\n}\n", "path": "readthedocs/builds/constants.py"}]}
1,530
281
gh_patches_debug_56402
rasdani/github-patches
git_diff
encode__httpx-361
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- Missing py.typed declaration? `mypy` is complaining about not being able to find type annotations for `httpx`: `error: Cannot find module named 'httpx'` I'm somewhat new to using type annotations/static type checking in Python, but from the mypy documentation [here](https://mypy.readthedocs.io/en/latest/installed_packages.html#making-pep-561-compatible-packages) it looks like there may be a missing declaration in `setup.py`? --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 #!/usr/bin/env python 2 # -*- coding: utf-8 -*- 3 4 import re 5 from pathlib import Path 6 7 from setuptools import setup 8 9 10 def get_version(package): 11 """ 12 Return package version as listed in `__version__` in `init.py`. 13 """ 14 version = Path(package, "__version__.py").read_text() 15 return re.search("__version__ = ['\"]([^'\"]+)['\"]", version).group(1) 16 17 18 def get_long_description(): 19 """ 20 Return the README. 21 """ 22 long_description = "" 23 with open("README.md", encoding="utf8") as f: 24 long_description += f.read() 25 long_description += "\n\n" 26 with open("CHANGELOG.md", encoding="utf8") as f: 27 long_description += f.read() 28 return long_description 29 30 31 def get_packages(package): 32 """ 33 Return root package and all sub-packages. 34 """ 35 return [str(path.parent) for path in Path(package).glob("**/__init__.py")] 36 37 38 setup( 39 name="httpx", 40 python_requires=">=3.6", 41 version=get_version("httpx"), 42 url="https://github.com/encode/httpx", 43 license="BSD", 44 description="The next generation HTTP client.", 45 long_description=get_long_description(), 46 long_description_content_type="text/markdown", 47 author="Tom Christie", 48 author_email="[email protected]", 49 package_data={"httpx": ["py.typed"]}, 50 packages=get_packages("httpx"), 51 include_package_data=True, 52 install_requires=[ 53 "certifi", 54 "chardet==3.*", 55 "h11==0.8.*", 56 "h2==3.*", 57 "hstspreload>=2019.8.27", 58 "idna==2.*", 59 "rfc3986==1.*", 60 ], 61 classifiers=[ 62 "Development Status :: 3 - Alpha", 63 "Environment :: Web Environment", 64 "Intended Audience :: Developers", 65 "License :: OSI Approved :: BSD License", 66 "Operating System :: OS Independent", 67 "Topic :: Internet :: WWW/HTTP", 68 "Programming Language :: Python :: 3", 69 "Programming Language :: Python :: 3.6", 70 "Programming Language :: Python :: 3.7", 71 "Programming Language :: Python :: 3.8", 72 ], 73 ) 74 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -49,6 +49,7 @@ package_data={"httpx": ["py.typed"]}, packages=get_packages("httpx"), include_package_data=True, + zip_safe=False, install_requires=[ "certifi", "chardet==3.*",
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -49,6 +49,7 @@\n package_data={\"httpx\": [\"py.typed\"]},\n packages=get_packages(\"httpx\"),\n include_package_data=True,\n+ zip_safe=False,\n install_requires=[\n \"certifi\",\n \"chardet==3.*\",\n", "issue": "Missing py.typed declaration?\n`mypy` is complaining about not being able to find type annotations for `httpx`: \r\n\r\n`error: Cannot find module named 'httpx'`\r\n\r\nI'm somewhat new to using type annotations/static type checking in Python, but from the mypy documentation [here](https://mypy.readthedocs.io/en/latest/installed_packages.html#making-pep-561-compatible-packages) it looks like there may be a missing declaration in `setup.py`?\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\nimport re\nfrom pathlib import Path\n\nfrom setuptools import setup\n\n\ndef get_version(package):\n \"\"\"\n Return package version as listed in `__version__` in `init.py`.\n \"\"\"\n version = Path(package, \"__version__.py\").read_text()\n return re.search(\"__version__ = ['\\\"]([^'\\\"]+)['\\\"]\", version).group(1)\n\n\ndef get_long_description():\n \"\"\"\n Return the README.\n \"\"\"\n long_description = \"\"\n with open(\"README.md\", encoding=\"utf8\") as f:\n long_description += f.read()\n long_description += \"\\n\\n\"\n with open(\"CHANGELOG.md\", encoding=\"utf8\") as f:\n long_description += f.read()\n return long_description\n\n\ndef get_packages(package):\n \"\"\"\n Return root package and all sub-packages.\n \"\"\"\n return [str(path.parent) for path in Path(package).glob(\"**/__init__.py\")]\n\n\nsetup(\n name=\"httpx\",\n python_requires=\">=3.6\",\n version=get_version(\"httpx\"),\n url=\"https://github.com/encode/httpx\",\n license=\"BSD\",\n description=\"The next generation HTTP client.\",\n long_description=get_long_description(),\n long_description_content_type=\"text/markdown\",\n author=\"Tom Christie\",\n author_email=\"[email protected]\",\n package_data={\"httpx\": [\"py.typed\"]},\n packages=get_packages(\"httpx\"),\n include_package_data=True,\n install_requires=[\n \"certifi\",\n \"chardet==3.*\",\n \"h11==0.8.*\",\n \"h2==3.*\",\n \"hstspreload>=2019.8.27\",\n \"idna==2.*\",\n \"rfc3986==1.*\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Environment :: Web Environment\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\nimport re\nfrom pathlib import Path\n\nfrom setuptools import setup\n\n\ndef get_version(package):\n \"\"\"\n Return package version as listed in `__version__` in `init.py`.\n \"\"\"\n version = Path(package, \"__version__.py\").read_text()\n return re.search(\"__version__ = ['\\\"]([^'\\\"]+)['\\\"]\", version).group(1)\n\n\ndef get_long_description():\n \"\"\"\n Return the README.\n \"\"\"\n long_description = \"\"\n with open(\"README.md\", encoding=\"utf8\") as f:\n long_description += f.read()\n long_description += \"\\n\\n\"\n with open(\"CHANGELOG.md\", encoding=\"utf8\") as f:\n long_description += f.read()\n return long_description\n\n\ndef get_packages(package):\n \"\"\"\n Return root package and all 
sub-packages.\n \"\"\"\n return [str(path.parent) for path in Path(package).glob(\"**/__init__.py\")]\n\n\nsetup(\n name=\"httpx\",\n python_requires=\">=3.6\",\n version=get_version(\"httpx\"),\n url=\"https://github.com/encode/httpx\",\n license=\"BSD\",\n description=\"The next generation HTTP client.\",\n long_description=get_long_description(),\n long_description_content_type=\"text/markdown\",\n author=\"Tom Christie\",\n author_email=\"[email protected]\",\n package_data={\"httpx\": [\"py.typed\"]},\n packages=get_packages(\"httpx\"),\n include_package_data=True,\n zip_safe=False,\n install_requires=[\n \"certifi\",\n \"chardet==3.*\",\n \"h11==0.8.*\",\n \"h2==3.*\",\n \"hstspreload>=2019.8.27\",\n \"idna==2.*\",\n \"rfc3986==1.*\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Environment :: Web Environment\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n ],\n)\n", "path": "setup.py"}]}
1,005
83
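A transferable note on the httpx record above: shipping `py.typed` via `package_data` is only half of PEP 561 discovery. A zip-installed egg still hides the marker from mypy, which is what the `zip_safe=False` line in the golden diff addresses. A minimal sketch of the full recipe, with `mypkg` as a hypothetical package name standing in for a real project:

```python
# Minimal PEP 561 packaging sketch; "mypkg" is a hypothetical package name.
# Two pieces matter for type checkers: shipping the empty mypkg/py.typed
# marker file, and forcing an unzipped install so mypy can actually read it.
from setuptools import find_packages, setup

setup(
    name="mypkg",
    version="0.1.0",
    packages=find_packages(exclude=["tests", "tests.*"]),
    package_data={"mypkg": ["py.typed"]},  # bundle the marker into wheels/sdists
    include_package_data=True,
    zip_safe=False,  # zipped eggs hide py.typed from mypy's module finder
)
```

With both pieces in place, downstream `mypy` runs stop reporting `Cannot find module named 'mypkg'`, which is the behaviour the issue reporter expected from httpx.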
gh_patches_debug_2399
rasdani/github-patches
git_diff
WeblateOrg__weblate-1655
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- File download is outdated ### Steps to reproduce 1. Edit string. 2. Donwload original translation file (without conversion). ### Actual behaviour The file does not have recent changes. ### Expected behaviour All changes should be reflected. ### Server configuration Current master --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `weblate/trans/views/helper.py` Content: ``` 1 # -*- coding: utf-8 -*- 2 # 3 # Copyright © 2012 - 2017 Michal Čihař <[email protected]> 4 # 5 # This file is part of Weblate <https://weblate.org/> 6 # 7 # This program is free software: you can redistribute it and/or modify 8 # it under the terms of the GNU General Public License as published by 9 # the Free Software Foundation, either version 3 of the License, or 10 # (at your option) any later version. 11 # 12 # This program is distributed in the hope that it will be useful, 13 # but WITHOUT ANY WARRANTY; without even the implied warranty of 14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the 15 # GNU General Public License for more details. 16 # 17 # You should have received a copy of the GNU General Public License 18 # along with this program. If not, see <https://www.gnu.org/licenses/>. 19 # 20 """Helper methods for views.""" 21 22 from django.http import HttpResponse, Http404 23 from django.shortcuts import get_object_or_404 24 import django.utils.translation 25 from django.utils.translation import trans_real, ugettext as _ 26 27 from weblate.utils import messages 28 from weblate.permissions.helpers import check_access 29 from weblate.trans.exporters import get_exporter 30 from weblate.trans.models import Project, SubProject, Translation 31 32 33 def get_translation(request, project, subproject, lang, skip_acl=False): 34 """Return translation matching parameters.""" 35 translation = get_object_or_404( 36 Translation.objects.prefetch(), 37 language__code=lang, 38 subproject__slug=subproject, 39 subproject__project__slug=project, 40 enabled=True 41 ) 42 if not skip_acl: 43 check_access(request, translation.subproject.project) 44 return translation 45 46 47 def get_subproject(request, project, subproject, skip_acl=False): 48 """Return subproject matching parameters.""" 49 subproject = get_object_or_404( 50 SubProject.objects.prefetch(), 51 project__slug=project, 52 slug=subproject 53 ) 54 if not skip_acl: 55 check_access(request, subproject.project) 56 return subproject 57 58 59 def get_project(request, project, skip_acl=False): 60 """Return project matching parameters.""" 61 project = get_object_or_404( 62 Project, 63 slug=project, 64 ) 65 if not skip_acl: 66 check_access(request, project) 67 return project 68 69 70 def get_project_translation(request, project=None, subproject=None, lang=None): 71 """Return project, subproject, translation tuple for given parameters.""" 72 73 if lang is not None and subproject is not None: 74 # Language defined? We can get all 75 translation = get_translation(request, project, subproject, lang) 76 subproject = translation.subproject 77 project = subproject.project 78 else: 79 translation = None 80 if subproject is not None: 81 # Component defined? 82 subproject = get_subproject(request, project, subproject) 83 project = subproject.project 84 elif project is not None: 85 # Only project defined? 
86 project = get_project(request, project) 87 88 # Return tuple 89 return project, subproject, translation 90 91 92 def try_set_language(lang): 93 """Try to activate language""" 94 95 try: 96 django.utils.translation.activate(lang) 97 # workaround for https://code.djangoproject.com/ticket/26050 98 # pylint: disable=W0212 99 if trans_real.catalog()._catalog is None: 100 raise Exception('Invalid language!') 101 except Exception: 102 # Ignore failure on activating language 103 django.utils.translation.activate('en') 104 105 106 def import_message(request, count, message_none, message_ok): 107 if count == 0: 108 messages.warning(request, message_none) 109 else: 110 messages.success(request, message_ok % count) 111 112 113 def download_translation_file(translation, fmt=None): 114 if fmt is not None: 115 try: 116 exporter = get_exporter(fmt)(translation=translation) 117 except KeyError: 118 raise Http404('File format not supported') 119 exporter.add_units(translation) 120 return exporter.get_response( 121 '{{project}}-{0}-{{language}}.{{extension}}'.format( 122 translation.subproject.slug 123 ) 124 ) 125 126 srcfilename = translation.get_filename() 127 128 # Construct file name (do not use real filename as it is usually not 129 # that useful) 130 filename = '{0}-{1}-{2}.{3}'.format( 131 translation.subproject.project.slug, 132 translation.subproject.slug, 133 translation.language.code, 134 translation.store.extension 135 ) 136 137 # Create response 138 with open(srcfilename) as handle: 139 response = HttpResponse( 140 handle.read(), 141 content_type=translation.store.mimetype 142 ) 143 144 # Fill in response headers 145 response['Content-Disposition'] = 'attachment; filename={0}'.format( 146 filename 147 ) 148 149 return response 150 151 152 def show_form_errors(request, form): 153 """Show all form errors as a message.""" 154 for error in form.non_field_errors(): 155 messages.error(request, error) 156 for field in form: 157 for error in field.errors: 158 messages.error( 159 request, 160 _('Error in parameter %(field)s: %(error)s') % { 161 'field': field.name, 162 'error': error 163 } 164 ) 165 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/weblate/trans/views/helper.py b/weblate/trans/views/helper.py --- a/weblate/trans/views/helper.py +++ b/weblate/trans/views/helper.py @@ -123,6 +123,10 @@ ) ) + # Force flushing pending units + author = translation.get_last_author(True) + translation.update_units(author) + srcfilename = translation.get_filename() # Construct file name (do not use real filename as it is usually not
{"golden_diff": "diff --git a/weblate/trans/views/helper.py b/weblate/trans/views/helper.py\n--- a/weblate/trans/views/helper.py\n+++ b/weblate/trans/views/helper.py\n@@ -123,6 +123,10 @@\n )\n )\n \n+ # Force flushing pending units\n+ author = translation.get_last_author(True)\n+ translation.update_units(author)\n+\n srcfilename = translation.get_filename()\n \n # Construct file name (do not use real filename as it is usually not\n", "issue": "File download is outdated\n### Steps to reproduce\r\n1. Edit string.\r\n2. Donwload original translation file (without conversion).\r\n\r\n### Actual behaviour\r\nThe file does not have recent changes.\r\n\r\n### Expected behaviour\r\nAll changes should be reflected.\r\n\r\n### Server configuration\r\nCurrent master\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright \u00a9 2012 - 2017 Michal \u010ciha\u0159 <[email protected]>\n#\n# This file is part of Weblate <https://weblate.org/>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program. If not, see <https://www.gnu.org/licenses/>.\n#\n\"\"\"Helper methods for views.\"\"\"\n\nfrom django.http import HttpResponse, Http404\nfrom django.shortcuts import get_object_or_404\nimport django.utils.translation\nfrom django.utils.translation import trans_real, ugettext as _\n\nfrom weblate.utils import messages\nfrom weblate.permissions.helpers import check_access\nfrom weblate.trans.exporters import get_exporter\nfrom weblate.trans.models import Project, SubProject, Translation\n\n\ndef get_translation(request, project, subproject, lang, skip_acl=False):\n \"\"\"Return translation matching parameters.\"\"\"\n translation = get_object_or_404(\n Translation.objects.prefetch(),\n language__code=lang,\n subproject__slug=subproject,\n subproject__project__slug=project,\n enabled=True\n )\n if not skip_acl:\n check_access(request, translation.subproject.project)\n return translation\n\n\ndef get_subproject(request, project, subproject, skip_acl=False):\n \"\"\"Return subproject matching parameters.\"\"\"\n subproject = get_object_or_404(\n SubProject.objects.prefetch(),\n project__slug=project,\n slug=subproject\n )\n if not skip_acl:\n check_access(request, subproject.project)\n return subproject\n\n\ndef get_project(request, project, skip_acl=False):\n \"\"\"Return project matching parameters.\"\"\"\n project = get_object_or_404(\n Project,\n slug=project,\n )\n if not skip_acl:\n check_access(request, project)\n return project\n\n\ndef get_project_translation(request, project=None, subproject=None, lang=None):\n \"\"\"Return project, subproject, translation tuple for given parameters.\"\"\"\n\n if lang is not None and subproject is not None:\n # Language defined? 
We can get all\n translation = get_translation(request, project, subproject, lang)\n subproject = translation.subproject\n project = subproject.project\n else:\n translation = None\n if subproject is not None:\n # Component defined?\n subproject = get_subproject(request, project, subproject)\n project = subproject.project\n elif project is not None:\n # Only project defined?\n project = get_project(request, project)\n\n # Return tuple\n return project, subproject, translation\n\n\ndef try_set_language(lang):\n \"\"\"Try to activate language\"\"\"\n\n try:\n django.utils.translation.activate(lang)\n # workaround for https://code.djangoproject.com/ticket/26050\n # pylint: disable=W0212\n if trans_real.catalog()._catalog is None:\n raise Exception('Invalid language!')\n except Exception:\n # Ignore failure on activating language\n django.utils.translation.activate('en')\n\n\ndef import_message(request, count, message_none, message_ok):\n if count == 0:\n messages.warning(request, message_none)\n else:\n messages.success(request, message_ok % count)\n\n\ndef download_translation_file(translation, fmt=None):\n if fmt is not None:\n try:\n exporter = get_exporter(fmt)(translation=translation)\n except KeyError:\n raise Http404('File format not supported')\n exporter.add_units(translation)\n return exporter.get_response(\n '{{project}}-{0}-{{language}}.{{extension}}'.format(\n translation.subproject.slug\n )\n )\n\n srcfilename = translation.get_filename()\n\n # Construct file name (do not use real filename as it is usually not\n # that useful)\n filename = '{0}-{1}-{2}.{3}'.format(\n translation.subproject.project.slug,\n translation.subproject.slug,\n translation.language.code,\n translation.store.extension\n )\n\n # Create response\n with open(srcfilename) as handle:\n response = HttpResponse(\n handle.read(),\n content_type=translation.store.mimetype\n )\n\n # Fill in response headers\n response['Content-Disposition'] = 'attachment; filename={0}'.format(\n filename\n )\n\n return response\n\n\ndef show_form_errors(request, form):\n \"\"\"Show all form errors as a message.\"\"\"\n for error in form.non_field_errors():\n messages.error(request, error)\n for field in form:\n for error in field.errors:\n messages.error(\n request,\n _('Error in parameter %(field)s: %(error)s') % {\n 'field': field.name,\n 'error': error\n }\n )\n", "path": "weblate/trans/views/helper.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright \u00a9 2012 - 2017 Michal \u010ciha\u0159 <[email protected]>\n#\n# This file is part of Weblate <https://weblate.org/>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program. 
If not, see <https://www.gnu.org/licenses/>.\n#\n\"\"\"Helper methods for views.\"\"\"\n\nfrom django.http import HttpResponse, Http404\nfrom django.shortcuts import get_object_or_404\nimport django.utils.translation\nfrom django.utils.translation import trans_real, ugettext as _\n\nfrom weblate.utils import messages\nfrom weblate.permissions.helpers import check_access\nfrom weblate.trans.exporters import get_exporter\nfrom weblate.trans.models import Project, SubProject, Translation\n\n\ndef get_translation(request, project, subproject, lang, skip_acl=False):\n \"\"\"Return translation matching parameters.\"\"\"\n translation = get_object_or_404(\n Translation.objects.prefetch(),\n language__code=lang,\n subproject__slug=subproject,\n subproject__project__slug=project,\n enabled=True\n )\n if not skip_acl:\n check_access(request, translation.subproject.project)\n return translation\n\n\ndef get_subproject(request, project, subproject, skip_acl=False):\n \"\"\"Return subproject matching parameters.\"\"\"\n subproject = get_object_or_404(\n SubProject.objects.prefetch(),\n project__slug=project,\n slug=subproject\n )\n if not skip_acl:\n check_access(request, subproject.project)\n return subproject\n\n\ndef get_project(request, project, skip_acl=False):\n \"\"\"Return project matching parameters.\"\"\"\n project = get_object_or_404(\n Project,\n slug=project,\n )\n if not skip_acl:\n check_access(request, project)\n return project\n\n\ndef get_project_translation(request, project=None, subproject=None, lang=None):\n \"\"\"Return project, subproject, translation tuple for given parameters.\"\"\"\n\n if lang is not None and subproject is not None:\n # Language defined? We can get all\n translation = get_translation(request, project, subproject, lang)\n subproject = translation.subproject\n project = subproject.project\n else:\n translation = None\n if subproject is not None:\n # Component defined?\n subproject = get_subproject(request, project, subproject)\n project = subproject.project\n elif project is not None:\n # Only project defined?\n project = get_project(request, project)\n\n # Return tuple\n return project, subproject, translation\n\n\ndef try_set_language(lang):\n \"\"\"Try to activate language\"\"\"\n\n try:\n django.utils.translation.activate(lang)\n # workaround for https://code.djangoproject.com/ticket/26050\n # pylint: disable=W0212\n if trans_real.catalog()._catalog is None:\n raise Exception('Invalid language!')\n except Exception:\n # Ignore failure on activating language\n django.utils.translation.activate('en')\n\n\ndef import_message(request, count, message_none, message_ok):\n if count == 0:\n messages.warning(request, message_none)\n else:\n messages.success(request, message_ok % count)\n\n\ndef download_translation_file(translation, fmt=None):\n if fmt is not None:\n try:\n exporter = get_exporter(fmt)(translation=translation)\n except KeyError:\n raise Http404('File format not supported')\n exporter.add_units(translation)\n return exporter.get_response(\n '{{project}}-{0}-{{language}}.{{extension}}'.format(\n translation.subproject.slug\n )\n )\n\n # Force flushing pending units\n author = translation.get_last_author(True)\n translation.update_units(author)\n\n srcfilename = translation.get_filename()\n\n # Construct file name (do not use real filename as it is usually not\n # that useful)\n filename = '{0}-{1}-{2}.{3}'.format(\n translation.subproject.project.slug,\n translation.subproject.slug,\n translation.language.code,\n translation.store.extension\n )\n\n # 
Create response\n with open(srcfilename) as handle:\n response = HttpResponse(\n handle.read(),\n content_type=translation.store.mimetype\n )\n\n # Fill in response headers\n response['Content-Disposition'] = 'attachment; filename={0}'.format(\n filename\n )\n\n return response\n\n\ndef show_form_errors(request, form):\n \"\"\"Show all form errors as a message.\"\"\"\n for error in form.non_field_errors():\n messages.error(request, error)\n for field in form:\n for error in field.errors:\n messages.error(\n request,\n _('Error in parameter %(field)s: %(error)s') % {\n 'field': field.name,\n 'error': error\n }\n )\n", "path": "weblate/trans/views/helper.py"}]}
1,843
112
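The weblate record above is a stale-cache bug: recent edits live in the database as pending units until flushed to the underlying translation file, so the raw download served an out-of-date file. A condensed sketch of the pattern the golden diff applies, reusing the `Translation` methods shown there (the response wiring is a simplified stand-in and the fixed filename is a placeholder, not Weblate's full view):

```python
# Sketch of "flush pending edits before serving the raw file". The two
# Translation calls mirror the golden diff; the rest is simplified stand-in
# code, not Weblate's complete download view.
from django.http import HttpResponse

def download_translation_file(translation):
    # Write pending database-side edits to the on-disk store first, so the
    # served bytes include the most recent changes.
    author = translation.get_last_author(True)
    translation.update_units(author)

    with open(translation.get_filename()) as handle:
        response = HttpResponse(
            handle.read(),
            content_type=translation.store.mimetype,
        )
    response["Content-Disposition"] = 'attachment; filename="translation.po"'
    return response
```

The general lesson: any endpoint that streams a file shadowed by buffered writes needs an explicit flush on the read path.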
gh_patches_debug_3989
rasdani/github-patches
git_diff
matrix-org__synapse-8583
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- mypy==0.782 will cause mypy-zope dependency regression in future pip releases ## Description Pip is planning to use a [new resolver by default from `20.3`](https://pip.pypa.io/en/latest/user_guide/#changes-to-the-pip-dependency-resolver-in-20-2-2020), which is more strict in it's dependency handling. Currently, `tox`'s `mypy` test environment defines this: ``` deps = {[base]deps} mypy==0.782 mypy-zope ``` Given that input, `pip>=20.3` will likely either: 1. Choose an older `mypy-zope` package that works with `mypy==0.782`, causing a regression. 2. ~~Reject installing outright, as this stricter dependency resolver will refuse to install incompatible packages.~~ --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `setup.py` Content: ``` 1 #!/usr/bin/env python 2 3 # Copyright 2014-2017 OpenMarket Ltd 4 # Copyright 2017 Vector Creations Ltd 5 # Copyright 2017-2018 New Vector Ltd 6 # 7 # Licensed under the Apache License, Version 2.0 (the "License"); 8 # you may not use this file except in compliance with the License. 9 # You may obtain a copy of the License at 10 # 11 # http://www.apache.org/licenses/LICENSE-2.0 12 # 13 # Unless required by applicable law or agreed to in writing, software 14 # distributed under the License is distributed on an "AS IS" BASIS, 15 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 # See the License for the specific language governing permissions and 17 # limitations under the License. 18 import glob 19 import os 20 21 from setuptools import Command, find_packages, setup 22 23 here = os.path.abspath(os.path.dirname(__file__)) 24 25 26 # Some notes on `setup.py test`: 27 # 28 # Once upon a time we used to try to make `setup.py test` run `tox` to run the 29 # tests. That's a bad idea for three reasons: 30 # 31 # 1: `setup.py test` is supposed to find out whether the tests work in the 32 # *current* environmentt, not whatever tox sets up. 33 # 2: Empirically, trying to install tox during the test run wasn't working ("No 34 # module named virtualenv"). 35 # 3: The tox documentation advises against it[1]. 36 # 37 # Even further back in time, we used to use setuptools_trial [2]. That has its 38 # own set of issues: for instance, it requires installation of Twisted to build 39 # an sdist (because the recommended mode of usage is to add it to 40 # `setup_requires`). That in turn means that in order to successfully run tox 41 # you have to have the python header files installed for whichever version of 42 # python tox uses (which is python3 on recent ubuntus, for example). 43 # 44 # So, for now at least, we stick with what appears to be the convention among 45 # Twisted projects, and don't attempt to do anything when someone runs 46 # `setup.py test`; instead we direct people to run `trial` directly if they 47 # care. 48 # 49 # [1]: http://tox.readthedocs.io/en/2.5.0/example/basic.html#integration-with-setup-py-test-command 50 # [2]: https://pypi.python.org/pypi/setuptools_trial 51 class TestCommand(Command): 52 user_options = [] 53 54 def initialize_options(self): 55 pass 56 57 def finalize_options(self): 58 pass 59 60 def run(self): 61 print( 62 """Synapse's tests cannot be run via setup.py. To run them, try: 63 PYTHONPATH="." trial tests 64 """ 65 ) 66 67 68 def read_file(path_segments): 69 """Read a file from the package. 
Takes a list of strings to join to 70 make the path""" 71 file_path = os.path.join(here, *path_segments) 72 with open(file_path) as f: 73 return f.read() 74 75 76 def exec_file(path_segments): 77 """Execute a single python file to get the variables defined in it""" 78 result = {} 79 code = read_file(path_segments) 80 exec(code, result) 81 return result 82 83 84 version = exec_file(("synapse", "__init__.py"))["__version__"] 85 dependencies = exec_file(("synapse", "python_dependencies.py")) 86 long_description = read_file(("README.rst",)) 87 88 REQUIREMENTS = dependencies["REQUIREMENTS"] 89 CONDITIONAL_REQUIREMENTS = dependencies["CONDITIONAL_REQUIREMENTS"] 90 ALL_OPTIONAL_REQUIREMENTS = dependencies["ALL_OPTIONAL_REQUIREMENTS"] 91 92 # Make `pip install matrix-synapse[all]` install all the optional dependencies. 93 CONDITIONAL_REQUIREMENTS["all"] = list(ALL_OPTIONAL_REQUIREMENTS) 94 95 # Developer dependencies should not get included in "all". 96 # 97 # We pin black so that our tests don't start failing on new releases. 98 CONDITIONAL_REQUIREMENTS["lint"] = [ 99 "isort==5.0.3", 100 "black==19.10b0", 101 "flake8-comprehensions", 102 "flake8", 103 ] 104 105 # Dependencies which are exclusively required by unit test code. This is 106 # NOT a list of all modules that are necessary to run the unit tests. 107 # Tests assume that all optional dependencies are installed. 108 # 109 # parameterized_class decorator was introduced in parameterized 0.7.0 110 CONDITIONAL_REQUIREMENTS["test"] = ["mock>=2.0", "parameterized>=0.7.0"] 111 112 setup( 113 name="matrix-synapse", 114 version=version, 115 packages=find_packages(exclude=["tests", "tests.*"]), 116 description="Reference homeserver for the Matrix decentralised comms protocol", 117 install_requires=REQUIREMENTS, 118 extras_require=CONDITIONAL_REQUIREMENTS, 119 include_package_data=True, 120 zip_safe=False, 121 long_description=long_description, 122 python_requires="~=3.5", 123 classifiers=[ 124 "Development Status :: 5 - Production/Stable", 125 "Topic :: Communications :: Chat", 126 "License :: OSI Approved :: Apache Software License", 127 "Programming Language :: Python :: 3 :: Only", 128 "Programming Language :: Python :: 3.5", 129 "Programming Language :: Python :: 3.6", 130 "Programming Language :: Python :: 3.7", 131 "Programming Language :: Python :: 3.8", 132 ], 133 scripts=["synctl"] + glob.glob("scripts/*"), 134 cmdclass={"test": TestCommand}, 135 ) 136 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -102,6 +102,8 @@ "flake8", ] +CONDITIONAL_REQUIREMENTS["mypy"] = ["mypy==0.790", "mypy-zope"] + # Dependencies which are exclusively required by unit test code. This is # NOT a list of all modules that are necessary to run the unit tests. # Tests assume that all optional dependencies are installed.
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -102,6 +102,8 @@\n \"flake8\",\n ]\n \n+CONDITIONAL_REQUIREMENTS[\"mypy\"] = [\"mypy==0.790\", \"mypy-zope\"]\n+\n # Dependencies which are exclusively required by unit test code. This is\n # NOT a list of all modules that are necessary to run the unit tests.\n # Tests assume that all optional dependencies are installed.\n", "issue": "mypy==0.782 will cause mypy-zope dependency regression in future pip releases\n## Description\r\n\r\nPip is planning to use a [new resolver by default from `20.3`](https://pip.pypa.io/en/latest/user_guide/#changes-to-the-pip-dependency-resolver-in-20-2-2020), which is more strict in it's dependency handling.\r\n\r\nCurrently, `tox`'s `mypy` test environment defines this:\r\n\r\n```\r\ndeps =\r\n {[base]deps}\r\n mypy==0.782\r\n mypy-zope\r\n```\r\n\r\nGiven that input, `pip>=20.3` will likely either:\r\n1. Choose an older `mypy-zope` package that works with `mypy==0.782`, causing a regression.\r\n2. ~~Reject installing outright, as this stricter dependency resolver will refuse to install incompatible packages.~~\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n\n# Copyright 2014-2017 OpenMarket Ltd\n# Copyright 2017 Vector Creations Ltd\n# Copyright 2017-2018 New Vector Ltd\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport glob\nimport os\n\nfrom setuptools import Command, find_packages, setup\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\n# Some notes on `setup.py test`:\n#\n# Once upon a time we used to try to make `setup.py test` run `tox` to run the\n# tests. That's a bad idea for three reasons:\n#\n# 1: `setup.py test` is supposed to find out whether the tests work in the\n# *current* environmentt, not whatever tox sets up.\n# 2: Empirically, trying to install tox during the test run wasn't working (\"No\n# module named virtualenv\").\n# 3: The tox documentation advises against it[1].\n#\n# Even further back in time, we used to use setuptools_trial [2]. That has its\n# own set of issues: for instance, it requires installation of Twisted to build\n# an sdist (because the recommended mode of usage is to add it to\n# `setup_requires`). That in turn means that in order to successfully run tox\n# you have to have the python header files installed for whichever version of\n# python tox uses (which is python3 on recent ubuntus, for example).\n#\n# So, for now at least, we stick with what appears to be the convention among\n# Twisted projects, and don't attempt to do anything when someone runs\n# `setup.py test`; instead we direct people to run `trial` directly if they\n# care.\n#\n# [1]: http://tox.readthedocs.io/en/2.5.0/example/basic.html#integration-with-setup-py-test-command\n# [2]: https://pypi.python.org/pypi/setuptools_trial\nclass TestCommand(Command):\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n def run(self):\n print(\n \"\"\"Synapse's tests cannot be run via setup.py. 
To run them, try:\n PYTHONPATH=\".\" trial tests\n\"\"\"\n )\n\n\ndef read_file(path_segments):\n \"\"\"Read a file from the package. Takes a list of strings to join to\n make the path\"\"\"\n file_path = os.path.join(here, *path_segments)\n with open(file_path) as f:\n return f.read()\n\n\ndef exec_file(path_segments):\n \"\"\"Execute a single python file to get the variables defined in it\"\"\"\n result = {}\n code = read_file(path_segments)\n exec(code, result)\n return result\n\n\nversion = exec_file((\"synapse\", \"__init__.py\"))[\"__version__\"]\ndependencies = exec_file((\"synapse\", \"python_dependencies.py\"))\nlong_description = read_file((\"README.rst\",))\n\nREQUIREMENTS = dependencies[\"REQUIREMENTS\"]\nCONDITIONAL_REQUIREMENTS = dependencies[\"CONDITIONAL_REQUIREMENTS\"]\nALL_OPTIONAL_REQUIREMENTS = dependencies[\"ALL_OPTIONAL_REQUIREMENTS\"]\n\n# Make `pip install matrix-synapse[all]` install all the optional dependencies.\nCONDITIONAL_REQUIREMENTS[\"all\"] = list(ALL_OPTIONAL_REQUIREMENTS)\n\n# Developer dependencies should not get included in \"all\".\n#\n# We pin black so that our tests don't start failing on new releases.\nCONDITIONAL_REQUIREMENTS[\"lint\"] = [\n \"isort==5.0.3\",\n \"black==19.10b0\",\n \"flake8-comprehensions\",\n \"flake8\",\n]\n\n# Dependencies which are exclusively required by unit test code. This is\n# NOT a list of all modules that are necessary to run the unit tests.\n# Tests assume that all optional dependencies are installed.\n#\n# parameterized_class decorator was introduced in parameterized 0.7.0\nCONDITIONAL_REQUIREMENTS[\"test\"] = [\"mock>=2.0\", \"parameterized>=0.7.0\"]\n\nsetup(\n name=\"matrix-synapse\",\n version=version,\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n description=\"Reference homeserver for the Matrix decentralised comms protocol\",\n install_requires=REQUIREMENTS,\n extras_require=CONDITIONAL_REQUIREMENTS,\n include_package_data=True,\n zip_safe=False,\n long_description=long_description,\n python_requires=\"~=3.5\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Communications :: Chat\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n ],\n scripts=[\"synctl\"] + glob.glob(\"scripts/*\"),\n cmdclass={\"test\": TestCommand},\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n# Copyright 2014-2017 OpenMarket Ltd\n# Copyright 2017 Vector Creations Ltd\n# Copyright 2017-2018 New Vector Ltd\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport glob\nimport os\n\nfrom setuptools import Command, find_packages, setup\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\n# Some notes on `setup.py test`:\n#\n# Once upon a time we used to try to make `setup.py test` run `tox` to run the\n# tests. 
That's a bad idea for three reasons:\n#\n# 1: `setup.py test` is supposed to find out whether the tests work in the\n# *current* environmentt, not whatever tox sets up.\n# 2: Empirically, trying to install tox during the test run wasn't working (\"No\n# module named virtualenv\").\n# 3: The tox documentation advises against it[1].\n#\n# Even further back in time, we used to use setuptools_trial [2]. That has its\n# own set of issues: for instance, it requires installation of Twisted to build\n# an sdist (because the recommended mode of usage is to add it to\n# `setup_requires`). That in turn means that in order to successfully run tox\n# you have to have the python header files installed for whichever version of\n# python tox uses (which is python3 on recent ubuntus, for example).\n#\n# So, for now at least, we stick with what appears to be the convention among\n# Twisted projects, and don't attempt to do anything when someone runs\n# `setup.py test`; instead we direct people to run `trial` directly if they\n# care.\n#\n# [1]: http://tox.readthedocs.io/en/2.5.0/example/basic.html#integration-with-setup-py-test-command\n# [2]: https://pypi.python.org/pypi/setuptools_trial\nclass TestCommand(Command):\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n def run(self):\n print(\n \"\"\"Synapse's tests cannot be run via setup.py. To run them, try:\n PYTHONPATH=\".\" trial tests\n\"\"\"\n )\n\n\ndef read_file(path_segments):\n \"\"\"Read a file from the package. Takes a list of strings to join to\n make the path\"\"\"\n file_path = os.path.join(here, *path_segments)\n with open(file_path) as f:\n return f.read()\n\n\ndef exec_file(path_segments):\n \"\"\"Execute a single python file to get the variables defined in it\"\"\"\n result = {}\n code = read_file(path_segments)\n exec(code, result)\n return result\n\n\nversion = exec_file((\"synapse\", \"__init__.py\"))[\"__version__\"]\ndependencies = exec_file((\"synapse\", \"python_dependencies.py\"))\nlong_description = read_file((\"README.rst\",))\n\nREQUIREMENTS = dependencies[\"REQUIREMENTS\"]\nCONDITIONAL_REQUIREMENTS = dependencies[\"CONDITIONAL_REQUIREMENTS\"]\nALL_OPTIONAL_REQUIREMENTS = dependencies[\"ALL_OPTIONAL_REQUIREMENTS\"]\n\n# Make `pip install matrix-synapse[all]` install all the optional dependencies.\nCONDITIONAL_REQUIREMENTS[\"all\"] = list(ALL_OPTIONAL_REQUIREMENTS)\n\n# Developer dependencies should not get included in \"all\".\n#\n# We pin black so that our tests don't start failing on new releases.\nCONDITIONAL_REQUIREMENTS[\"lint\"] = [\n \"isort==5.0.3\",\n \"black==19.10b0\",\n \"flake8-comprehensions\",\n \"flake8\",\n]\n\nCONDITIONAL_REQUIREMENTS[\"mypy\"] = [\"mypy==0.790\", \"mypy-zope\"]\n\n# Dependencies which are exclusively required by unit test code. 
This is\n# NOT a list of all modules that are necessary to run the unit tests.\n# Tests assume that all optional dependencies are installed.\n#\n# parameterized_class decorator was introduced in parameterized 0.7.0\nCONDITIONAL_REQUIREMENTS[\"test\"] = [\"mock>=2.0\", \"parameterized>=0.7.0\"]\n\nsetup(\n name=\"matrix-synapse\",\n version=version,\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n description=\"Reference homeserver for the Matrix decentralised comms protocol\",\n install_requires=REQUIREMENTS,\n extras_require=CONDITIONAL_REQUIREMENTS,\n include_package_data=True,\n zip_safe=False,\n long_description=long_description,\n python_requires=\"~=3.5\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Communications :: Chat\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n ],\n scripts=[\"synctl\"] + glob.glob(\"scripts/*\"),\n cmdclass={\"test\": TestCommand},\n)\n", "path": "setup.py"}]}
1,950
108
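For the synapse record above, the underlying hazard is two coupled tools (`mypy` and `mypy-zope`) being resolved independently: with pip 20.3's stricter resolver, an unpinned `mypy-zope` next to `mypy==0.782` silently backtracks to an old `mypy-zope` release. The golden diff centralizes the pair as a setuptools extra; a trimmed sketch, with a hypothetical project name:

```python
# Trimmed sketch of grouping a coupled tool pair into one extra so pip's
# resolver evaluates both constraints together. Mirrors the golden diff's
# CONDITIONAL_REQUIREMENTS["mypy"] entry; other setup() arguments omitted.
from setuptools import setup

setup(
    name="example-project",  # hypothetical stand-in for matrix-synapse
    version="0.0.1",
    extras_require={
        # Pinning mypy here lets the resolver pick the mypy-zope release
        # whose metadata declares compatibility with exactly this mypy.
        "mypy": ["mypy==0.790", "mypy-zope"],
    },
)
```

A tox environment can then depend on `.[mypy]` instead of repeating the pins, keeping a single source of truth; that follow-up is an assumption on my part and is not shown in the diff.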
gh_patches_debug_23
rasdani/github-patches
git_diff
horovod__horovod-3745
We are currently solving the following issue within our repository. Here is the issue text: --- BEGIN ISSUE --- No module named 'packaging' when install horovod It seems that the horovod v0.26.0 has some dependency problems. How long does it take for a new patch version to be released or should I pin the horovod version? ^_^ ![image](https://user-images.githubusercontent.com/32220263/195746141-2a0050f5-1eaf-4f7b-9a62-50fd6b13f2ff.png) --- END ISSUE --- Below are some code segments, each from a relevant file. One or more of these files may contain bugs. --- BEGIN FILES --- Path: `horovod/__init__.py` Content: ``` 1 from horovod.runner import run 2 3 __version__ = '0.26.0' 4 ``` --- END FILES --- Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks. Here is an example: ```diff diff --git a/examples/server_async.py b/examples/server_async.py --- a/examples/server_async.py +++ b/examples/server_async.py @@ -313,4 +313,4 @@ if __name__ == "__main__": - asyncio.run(run_async_server("."), debug=True) + asyncio.run(run_async_server(), debug=True) diff --git a/examples/server_sync.py b/examples/server_sync.py --- a/examples/server_sync.py +++ b/examples/server_sync.py @@ -313,5 +313,5 @@ if __name__ == "__main__": - server = run_sync_server(".") + server = run_sync_server() server.shutdown() ```
diff --git a/horovod/__init__.py b/horovod/__init__.py --- a/horovod/__init__.py +++ b/horovod/__init__.py @@ -1,3 +1,3 @@ from horovod.runner import run -__version__ = '0.26.0' +__version__ = '0.26.1'
{"golden_diff": "diff --git a/horovod/__init__.py b/horovod/__init__.py\n--- a/horovod/__init__.py\n+++ b/horovod/__init__.py\n@@ -1,3 +1,3 @@\n from horovod.runner import run\n \n-__version__ = '0.26.0'\n+__version__ = '0.26.1'\n", "issue": "No module named 'packaging' when install horovod\nIt seems that the horovod v0.26.0 has some dependency problems.\r\n\r\nHow long does it take for a new patch version to be released or should I pin the horovod version? ^_^\r\n\r\n![image](https://user-images.githubusercontent.com/32220263/195746141-2a0050f5-1eaf-4f7b-9a62-50fd6b13f2ff.png)\r\n\n", "before_files": [{"content": "from horovod.runner import run\n\n__version__ = '0.26.0'\n", "path": "horovod/__init__.py"}], "after_files": [{"content": "from horovod.runner import run\n\n__version__ = '0.26.1'\n", "path": "horovod/__init__.py"}]}
410
88
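The horovod golden diff is only the version bump to 0.26.1; the traceback in the issue (`No module named 'packaging'`) is typically the symptom of build-time code importing a helper that was never declared as a build requirement. A hypothetical illustration of the failure shape and one defensive guard; this is not horovod's actual setup code:

```python
# Hypothetical sketch, not horovod's real build script: setup-time code that
# needs the third-party "packaging" distribution and fails loudly with an
# actionable hint instead of crashing with a bare ModuleNotFoundError.
try:
    from packaging.version import Version
except ImportError as exc:
    raise ImportError(
        "'packaging' is required at build time; run `pip install packaging` "
        "or declare it under [build-system].requires in pyproject.toml"
    ) from exc

MINIMUM_PYTHON = Version("3.6")

def check_python_version(current: str) -> None:
    # Version() gives real semantic comparison, unlike raw string comparison.
    if Version(current) < MINIMUM_PYTHON:
        raise RuntimeError(f"Python >= {MINIMUM_PYTHON} is required, got {current}")
```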