problem_id (stringlengths 18-22) | source (stringclasses 1 value) | task_type (stringclasses 1 value) | in_source_id (stringlengths 13-58) | prompt (stringlengths 1.1k-25.4k) | golden_diff (stringlengths 145-5.13k) | verification_info (stringlengths 582-39.1k) | num_tokens (int64 271-4.1k) | num_tokens_diff (int64 47-1.02k) |
---|---|---|---|---|---|---|---|---|
gh_patches_debug_64706 | rasdani/github-patches | git_diff | ansible__ansible-modules-extras-3417 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ec2_lc_find not returning AssociatePublicIpAddress
##### ISSUE TYPE
Bug Report
##### COMPONENT NAME
ec2_lc_find
##### ANSIBLE VERSION
```
ansible 2.2.0.0
config file = /home/centos/ansiblebase/ansible.cfg
configured module search path = Default w/o overrides
```
##### CONFIGURATION
No significant changes
##### OS / ENVIRONMENT
Started with Ansible Tower 3.0.3 on CentOS 7 x86_64
Did a yum update on ansible to 2.2.
Did pip install boto3.
##### SUMMARY
Running ec2_lc_find fails with a missing key for AssociatePublicIpAddress
##### STEPS TO REPRODUCE
```
- ec2_lc_find:
region: "{{ region }}"
name_regex: lc_name-*"
sort_order: ascending
limit: 3
register: old_lc_result
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
Correctly returns load configurations matching regex.
##### ACTUAL RESULTS
```
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'AssociatePublicIpAddress'
fatal: [localhost]: FAILED! => {"changed": false, "failed": true, "module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\", line 225, in <module>\n main()\n File \"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\", line 217, in main\n find_launch_configs(client, module)\n File \"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\", line 187, in find_launch_configs\n 'associate_public_address': lc['AssociatePublicIpAddress'],\nKeyError: 'AssociatePublicIpAddress'\n", "module_stdout": "", "msg": "MODULE FAILURE"}
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cloud/amazon/ec2_lc_find.py`
Content:
```
1 #!/usr/bin/python
2 # encoding: utf-8
3
4 # (c) 2015, Jose Armesto <[email protected]>
5 #
6 # This file is part of Ansible
7 #
8 # This module is free software: you can redistribute it and/or modify
9 # it under the terms of the GNU General Public License as published by
10 # the Free Software Foundation, either version 3 of the License, or
11 # (at your option) any later version.
12 #
13 # This software is distributed in the hope that it will be useful,
14 # but WITHOUT ANY WARRANTY; without even the implied warranty of
15 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 # GNU General Public License for more details.
17 #
18 # You should have received a copy of the GNU General Public License
19 # along with this software. If not, see <http://www.gnu.org/licenses/>.
20
21 DOCUMENTATION = """
22 ---
23 module: ec2_lc_find
24 short_description: Find AWS Autoscaling Launch Configurations
25 description:
26 - Returns list of matching Launch Configurations for a given name, along with other useful information
27 - Results can be sorted and sliced
28 - It depends on boto
29 - Based on the work by Tom Bamford (https://github.com/tombamford)
30
31 version_added: "2.2"
32 author: "Jose Armesto (@fiunchinho)"
33 options:
34 region:
35 description:
36 - The AWS region to use.
37 required: true
38 aliases: ['aws_region', 'ec2_region']
39 name_regex:
40 description:
41 - A Launch Configuration to match
42 - It'll be compiled as regex
43 required: True
44 sort_order:
45 description:
46 - Order in which to sort results.
47 choices: ['ascending', 'descending']
48 default: 'ascending'
49 required: false
50 limit:
51 description:
52 - How many results to show.
53 - Corresponds to Python slice notation like list[:limit].
54 default: null
55 required: false
56 requirements:
57 - "python >= 2.6"
58 - boto3
59 """
60
61 EXAMPLES = '''
62 # Note: These examples do not set authentication details, see the AWS Guide for details.
63
64 # Search for the Launch Configurations that start with "app"
65 - ec2_lc_find:
66 name_regex: app.*
67 sort_order: descending
68 limit: 2
69 '''
70
71 RETURN = '''
72 image_id:
73 description: AMI id
74 returned: when Launch Configuration was found
75 type: string
76 sample: "ami-0d75df7e"
77 user_data:
78 description: User data used to start instance
79 returned: when Launch Configuration was found
80 type: string
81 user_data: "ZXhwb3J0IENMT1VE"
82 name:
83 description: Name of the AMI
84 returned: when Launch Configuration was found
85 type: string
86 sample: "myapp-v123"
87 arn:
88 description: Name of the AMI
89 returned: when Launch Configuration was found
90 type: string
91 sample: "arn:aws:autoscaling:eu-west-1:12345:launchConfiguration:d82f050e-e315:launchConfigurationName/yourproject"
92 instance_type:
93 description: Type of ec2 instance
94 returned: when Launch Configuration was found
95 type: string
96 sample: "t2.small"
97 created_time:
98 description: When it was created
99 returned: when Launch Configuration was found
100 type: string
101 sample: "2016-06-29T14:59:22.222000+00:00"
102 ebs_optimized:
103 description: Launch Configuration EBS optimized property
104 returned: when Launch Configuration was found
105 type: boolean
106 sample: False
107 instance_monitoring:
108 description: Launch Configuration instance monitoring property
109 returned: when Launch Configuration was found
110 type: string
111 sample: {"Enabled": false}
112 classic_link_vpc_security_groups:
113 description: Launch Configuration classic link vpc security groups property
114 returned: when Launch Configuration was found
115 type: list
116 sample: []
117 block_device_mappings:
118 description: Launch Configuration block device mappings property
119 returned: when Launch Configuration was found
120 type: list
121 sample: []
122 keyname:
123 description: Launch Configuration ssh key
124 returned: when Launch Configuration was found
125 type: string
126 sample: mykey
127 security_groups:
128 description: Launch Configuration security groups
129 returned: when Launch Configuration was found
130 type: list
131 sample: []
132 kernel_id:
133 description: Launch Configuration kernel to use
134 returned: when Launch Configuration was found
135 type: string
136 sample: ''
137 ram_disk_id:
138 description: Launch Configuration ram disk property
139 returned: when Launch Configuration was found
140 type: string
141 sample: ''
142 associate_public_address:
143 description: Assign public address or not
144 returned: when Launch Configuration was found
145 type: boolean
146 sample: True
147 ...
148 '''
149
150
151 def find_launch_configs(client, module):
152 name_regex = module.params.get('name_regex')
153 sort_order = module.params.get('sort_order')
154 limit = module.params.get('limit')
155
156 paginator = client.get_paginator('describe_launch_configurations')
157
158 response_iterator = paginator.paginate(
159 PaginationConfig={
160 'MaxItems': 1000,
161 'PageSize': 100
162 }
163 )
164
165 results = []
166
167 for response in response_iterator:
168 response['LaunchConfigurations'] = filter(lambda lc: re.compile(name_regex).match(lc['LaunchConfigurationName']),
169 response['LaunchConfigurations'])
170
171 for lc in response['LaunchConfigurations']:
172 data = {
173 'name': lc['LaunchConfigurationName'],
174 'arn': lc['LaunchConfigurationARN'],
175 'created_time': lc['CreatedTime'],
176 'user_data': lc['UserData'],
177 'instance_type': lc['InstanceType'],
178 'image_id': lc['ImageId'],
179 'ebs_optimized': lc['EbsOptimized'],
180 'instance_monitoring': lc['InstanceMonitoring'],
181 'classic_link_vpc_security_groups': lc['ClassicLinkVPCSecurityGroups'],
182 'block_device_mappings': lc['BlockDeviceMappings'],
183 'keyname': lc['KeyName'],
184 'security_groups': lc['SecurityGroups'],
185 'kernel_id': lc['KernelId'],
186 'ram_disk_id': lc['RamdiskId'],
187 'associate_public_address': lc['AssociatePublicIpAddress'],
188 }
189
190 results.append(data)
191
192 results.sort(key=lambda e: e['name'], reverse=(sort_order == 'descending'))
193
194 if limit:
195 results = results[:int(limit)]
196
197 module.exit_json(changed=False, results=results)
198
199
200 def main():
201 argument_spec = ec2_argument_spec()
202 argument_spec.update(dict(
203 region=dict(required=True, aliases=['aws_region', 'ec2_region']),
204 name_regex=dict(required=True),
205 sort_order=dict(required=False, default='ascending', choices=['ascending', 'descending']),
206 limit=dict(required=False, type='int'),
207 )
208 )
209
210 module = AnsibleModule(
211 argument_spec=argument_spec,
212 )
213
214 region, ec2_url, aws_connect_params = get_aws_connection_info(module, True)
215
216 client = boto3_conn(module=module, conn_type='client', resource='autoscaling', region=region, **aws_connect_params)
217 find_launch_configs(client, module)
218
219
220 # import module snippets
221 from ansible.module_utils.basic import *
222 from ansible.module_utils.ec2 import *
223
224 if __name__ == '__main__':
225 main()
226
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cloud/amazon/ec2_lc_find.py b/cloud/amazon/ec2_lc_find.py
--- a/cloud/amazon/ec2_lc_find.py
+++ b/cloud/amazon/ec2_lc_find.py
@@ -184,7 +184,7 @@
'security_groups': lc['SecurityGroups'],
'kernel_id': lc['KernelId'],
'ram_disk_id': lc['RamdiskId'],
- 'associate_public_address': lc['AssociatePublicIpAddress'],
+ 'associate_public_address': lc.get('AssociatePublicIpAddress', False),
}
results.append(data)
| {"golden_diff": "diff --git a/cloud/amazon/ec2_lc_find.py b/cloud/amazon/ec2_lc_find.py\n--- a/cloud/amazon/ec2_lc_find.py\n+++ b/cloud/amazon/ec2_lc_find.py\n@@ -184,7 +184,7 @@\n 'security_groups': lc['SecurityGroups'],\n 'kernel_id': lc['KernelId'],\n 'ram_disk_id': lc['RamdiskId'],\n- 'associate_public_address': lc['AssociatePublicIpAddress'],\n+ 'associate_public_address': lc.get('AssociatePublicIpAddress', False),\n }\n \n results.append(data)\n", "issue": "ec2_lc_find not returning AssociatePublicIpAddress\n##### ISSUE TYPE\r\nBug Report\r\n\r\n##### COMPONENT NAME\r\nec2_lc_find\r\n\r\n##### ANSIBLE VERSION\r\n```\r\nansible 2.2.0.0\r\n config file = /home/centos/ansiblebase/ansible.cfg\r\n configured module search path = Default w/o overrides\r\n```\r\n\r\n##### CONFIGURATION\r\nNo significant changes\r\n\r\n##### OS / ENVIRONMENT\r\nStarted with Ansible Tower 3.0.3 on CentOS 7 x86_64\r\nDid a yum update on ansible to 2.2.\r\nDid pip install boto3.\r\n\r\n##### SUMMARY\r\nRunning ec2_lc_find fails with a missing key for AssociatePublicIpAddress\r\n\r\n##### STEPS TO REPRODUCE\r\n```\r\n- ec2_lc_find:\r\n region: \"{{ region }}\"\r\n name_regex: lc_name-*\"\r\n sort_order: ascending\r\n limit: 3\r\n register: old_lc_result\r\n```\r\n\r\n<!--- You can also paste gist.github.com links for larger files -->\r\n\r\n##### EXPECTED RESULTS\r\nCorrectly returns load configurations matching regex.\r\n\r\n##### ACTUAL RESULTS\r\n```\r\nAn exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'AssociatePublicIpAddress'\r\nfatal: [localhost]: FAILED! => {\"changed\": false, \"failed\": true, \"module_stderr\": \"Traceback (most recent call last):\\n File \\\"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\\\", line 225, in <module>\\n main()\\n File \\\"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\\\", line 217, in main\\n find_launch_configs(client, module)\\n File \\\"/tmp/ansible_dJ3ho0/ansible_module_ec2_lc_find.py\\\", line 187, in find_launch_configs\\n 'associate_public_address': lc['AssociatePublicIpAddress'],\\nKeyError: 'AssociatePublicIpAddress'\\n\", \"module_stdout\": \"\", \"msg\": \"MODULE FAILURE\"}\r\n```\r\n\n", "before_files": [{"content": "#!/usr/bin/python\n# encoding: utf-8\n\n# (c) 2015, Jose Armesto <[email protected]>\n#\n# This file is part of Ansible\n#\n# This module is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This software is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this software. 
If not, see <http://www.gnu.org/licenses/>.\n\nDOCUMENTATION = \"\"\"\n---\nmodule: ec2_lc_find\nshort_description: Find AWS Autoscaling Launch Configurations\ndescription:\n - Returns list of matching Launch Configurations for a given name, along with other useful information\n - Results can be sorted and sliced\n - It depends on boto\n - Based on the work by Tom Bamford (https://github.com/tombamford)\n\nversion_added: \"2.2\"\nauthor: \"Jose Armesto (@fiunchinho)\"\noptions:\n region:\n description:\n - The AWS region to use.\n required: true\n aliases: ['aws_region', 'ec2_region']\n name_regex:\n description:\n - A Launch Configuration to match\n - It'll be compiled as regex\n required: True\n sort_order:\n description:\n - Order in which to sort results.\n choices: ['ascending', 'descending']\n default: 'ascending'\n required: false\n limit:\n description:\n - How many results to show.\n - Corresponds to Python slice notation like list[:limit].\n default: null\n required: false\nrequirements:\n - \"python >= 2.6\"\n - boto3\n\"\"\"\n\nEXAMPLES = '''\n# Note: These examples do not set authentication details, see the AWS Guide for details.\n\n# Search for the Launch Configurations that start with \"app\"\n- ec2_lc_find:\n name_regex: app.*\n sort_order: descending\n limit: 2\n'''\n\nRETURN = '''\nimage_id:\n description: AMI id\n returned: when Launch Configuration was found\n type: string\n sample: \"ami-0d75df7e\"\nuser_data:\n description: User data used to start instance\n returned: when Launch Configuration was found\n type: string\n user_data: \"ZXhwb3J0IENMT1VE\"\nname:\n description: Name of the AMI\n returned: when Launch Configuration was found\n type: string\n sample: \"myapp-v123\"\narn:\n description: Name of the AMI\n returned: when Launch Configuration was found\n type: string\n sample: \"arn:aws:autoscaling:eu-west-1:12345:launchConfiguration:d82f050e-e315:launchConfigurationName/yourproject\"\ninstance_type:\n description: Type of ec2 instance\n returned: when Launch Configuration was found\n type: string\n sample: \"t2.small\"\ncreated_time:\n description: When it was created\n returned: when Launch Configuration was found\n type: string\n sample: \"2016-06-29T14:59:22.222000+00:00\"\nebs_optimized:\n description: Launch Configuration EBS optimized property\n returned: when Launch Configuration was found\n type: boolean\n sample: False\ninstance_monitoring:\n description: Launch Configuration instance monitoring property\n returned: when Launch Configuration was found\n type: string\n sample: {\"Enabled\": false}\nclassic_link_vpc_security_groups:\n description: Launch Configuration classic link vpc security groups property\n returned: when Launch Configuration was found\n type: list\n sample: []\nblock_device_mappings:\n description: Launch Configuration block device mappings property\n returned: when Launch Configuration was found\n type: list\n sample: []\nkeyname:\n description: Launch Configuration ssh key\n returned: when Launch Configuration was found\n type: string\n sample: mykey\nsecurity_groups:\n description: Launch Configuration security groups\n returned: when Launch Configuration was found\n type: list\n sample: []\nkernel_id:\n description: Launch Configuration kernel to use\n returned: when Launch Configuration was found\n type: string\n sample: ''\nram_disk_id:\n description: Launch Configuration ram disk property\n returned: when Launch Configuration was found\n type: string\n sample: ''\nassociate_public_address:\n description: Assign public address 
or not\n returned: when Launch Configuration was found\n type: boolean\n sample: True\n...\n'''\n\n\ndef find_launch_configs(client, module):\n name_regex = module.params.get('name_regex')\n sort_order = module.params.get('sort_order')\n limit = module.params.get('limit')\n\n paginator = client.get_paginator('describe_launch_configurations')\n\n response_iterator = paginator.paginate(\n PaginationConfig={\n 'MaxItems': 1000,\n 'PageSize': 100\n }\n )\n\n results = []\n\n for response in response_iterator:\n response['LaunchConfigurations'] = filter(lambda lc: re.compile(name_regex).match(lc['LaunchConfigurationName']),\n response['LaunchConfigurations'])\n\n for lc in response['LaunchConfigurations']:\n data = {\n 'name': lc['LaunchConfigurationName'],\n 'arn': lc['LaunchConfigurationARN'],\n 'created_time': lc['CreatedTime'],\n 'user_data': lc['UserData'],\n 'instance_type': lc['InstanceType'],\n 'image_id': lc['ImageId'],\n 'ebs_optimized': lc['EbsOptimized'],\n 'instance_monitoring': lc['InstanceMonitoring'],\n 'classic_link_vpc_security_groups': lc['ClassicLinkVPCSecurityGroups'],\n 'block_device_mappings': lc['BlockDeviceMappings'],\n 'keyname': lc['KeyName'],\n 'security_groups': lc['SecurityGroups'],\n 'kernel_id': lc['KernelId'],\n 'ram_disk_id': lc['RamdiskId'],\n 'associate_public_address': lc['AssociatePublicIpAddress'],\n }\n\n results.append(data)\n\n results.sort(key=lambda e: e['name'], reverse=(sort_order == 'descending'))\n\n if limit:\n results = results[:int(limit)]\n\n module.exit_json(changed=False, results=results)\n\n\ndef main():\n argument_spec = ec2_argument_spec()\n argument_spec.update(dict(\n region=dict(required=True, aliases=['aws_region', 'ec2_region']),\n name_regex=dict(required=True),\n sort_order=dict(required=False, default='ascending', choices=['ascending', 'descending']),\n limit=dict(required=False, type='int'),\n )\n )\n\n module = AnsibleModule(\n argument_spec=argument_spec,\n )\n\n region, ec2_url, aws_connect_params = get_aws_connection_info(module, True)\n\n client = boto3_conn(module=module, conn_type='client', resource='autoscaling', region=region, **aws_connect_params)\n find_launch_configs(client, module)\n\n\n# import module snippets\nfrom ansible.module_utils.basic import *\nfrom ansible.module_utils.ec2 import *\n\nif __name__ == '__main__':\n main()\n", "path": "cloud/amazon/ec2_lc_find.py"}], "after_files": [{"content": "#!/usr/bin/python\n# encoding: utf-8\n\n# (c) 2015, Jose Armesto <[email protected]>\n#\n# This file is part of Ansible\n#\n# This module is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This software is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this software. 
If not, see <http://www.gnu.org/licenses/>.\n\nDOCUMENTATION = \"\"\"\n---\nmodule: ec2_lc_find\nshort_description: Find AWS Autoscaling Launch Configurations\ndescription:\n - Returns list of matching Launch Configurations for a given name, along with other useful information\n - Results can be sorted and sliced\n - It depends on boto\n - Based on the work by Tom Bamford (https://github.com/tombamford)\n\nversion_added: \"2.2\"\nauthor: \"Jose Armesto (@fiunchinho)\"\noptions:\n region:\n description:\n - The AWS region to use.\n required: true\n aliases: ['aws_region', 'ec2_region']\n name_regex:\n description:\n - A Launch Configuration to match\n - It'll be compiled as regex\n required: True\n sort_order:\n description:\n - Order in which to sort results.\n choices: ['ascending', 'descending']\n default: 'ascending'\n required: false\n limit:\n description:\n - How many results to show.\n - Corresponds to Python slice notation like list[:limit].\n default: null\n required: false\nrequirements:\n - \"python >= 2.6\"\n - boto3\n\"\"\"\n\nEXAMPLES = '''\n# Note: These examples do not set authentication details, see the AWS Guide for details.\n\n# Search for the Launch Configurations that start with \"app\"\n- ec2_lc_find:\n name_regex: app.*\n sort_order: descending\n limit: 2\n'''\n\nRETURN = '''\nimage_id:\n description: AMI id\n returned: when Launch Configuration was found\n type: string\n sample: \"ami-0d75df7e\"\nuser_data:\n description: User data used to start instance\n returned: when Launch Configuration was found\n type: string\n user_data: \"ZXhwb3J0IENMT1VE\"\nname:\n description: Name of the AMI\n returned: when Launch Configuration was found\n type: string\n sample: \"myapp-v123\"\narn:\n description: Name of the AMI\n returned: when Launch Configuration was found\n type: string\n sample: \"arn:aws:autoscaling:eu-west-1:12345:launchConfiguration:d82f050e-e315:launchConfigurationName/yourproject\"\ninstance_type:\n description: Type of ec2 instance\n returned: when Launch Configuration was found\n type: string\n sample: \"t2.small\"\ncreated_time:\n description: When it was created\n returned: when Launch Configuration was found\n type: string\n sample: \"2016-06-29T14:59:22.222000+00:00\"\nebs_optimized:\n description: Launch Configuration EBS optimized property\n returned: when Launch Configuration was found\n type: boolean\n sample: False\ninstance_monitoring:\n description: Launch Configuration instance monitoring property\n returned: when Launch Configuration was found\n type: string\n sample: {\"Enabled\": false}\nclassic_link_vpc_security_groups:\n description: Launch Configuration classic link vpc security groups property\n returned: when Launch Configuration was found\n type: list\n sample: []\nblock_device_mappings:\n description: Launch Configuration block device mappings property\n returned: when Launch Configuration was found\n type: list\n sample: []\nkeyname:\n description: Launch Configuration ssh key\n returned: when Launch Configuration was found\n type: string\n sample: mykey\nsecurity_groups:\n description: Launch Configuration security groups\n returned: when Launch Configuration was found\n type: list\n sample: []\nkernel_id:\n description: Launch Configuration kernel to use\n returned: when Launch Configuration was found\n type: string\n sample: ''\nram_disk_id:\n description: Launch Configuration ram disk property\n returned: when Launch Configuration was found\n type: string\n sample: ''\nassociate_public_address:\n description: Assign public address 
or not\n returned: when Launch Configuration was found\n type: boolean\n sample: True\n...\n'''\n\n\ndef find_launch_configs(client, module):\n name_regex = module.params.get('name_regex')\n sort_order = module.params.get('sort_order')\n limit = module.params.get('limit')\n\n paginator = client.get_paginator('describe_launch_configurations')\n\n response_iterator = paginator.paginate(\n PaginationConfig={\n 'MaxItems': 1000,\n 'PageSize': 100\n }\n )\n\n results = []\n\n for response in response_iterator:\n response['LaunchConfigurations'] = filter(lambda lc: re.compile(name_regex).match(lc['LaunchConfigurationName']),\n response['LaunchConfigurations'])\n\n for lc in response['LaunchConfigurations']:\n data = {\n 'name': lc['LaunchConfigurationName'],\n 'arn': lc['LaunchConfigurationARN'],\n 'created_time': lc['CreatedTime'],\n 'user_data': lc['UserData'],\n 'instance_type': lc['InstanceType'],\n 'image_id': lc['ImageId'],\n 'ebs_optimized': lc['EbsOptimized'],\n 'instance_monitoring': lc['InstanceMonitoring'],\n 'classic_link_vpc_security_groups': lc['ClassicLinkVPCSecurityGroups'],\n 'block_device_mappings': lc['BlockDeviceMappings'],\n 'keyname': lc['KeyName'],\n 'security_groups': lc['SecurityGroups'],\n 'kernel_id': lc['KernelId'],\n 'ram_disk_id': lc['RamdiskId'],\n 'associate_public_address': lc.get('AssociatePublicIpAddress', False),\n }\n\n results.append(data)\n\n results.sort(key=lambda e: e['name'], reverse=(sort_order == 'descending'))\n\n if limit:\n results = results[:int(limit)]\n\n module.exit_json(changed=False, results=results)\n\n\ndef main():\n argument_spec = ec2_argument_spec()\n argument_spec.update(dict(\n region=dict(required=True, aliases=['aws_region', 'ec2_region']),\n name_regex=dict(required=True),\n sort_order=dict(required=False, default='ascending', choices=['ascending', 'descending']),\n limit=dict(required=False, type='int'),\n )\n )\n\n module = AnsibleModule(\n argument_spec=argument_spec,\n )\n\n region, ec2_url, aws_connect_params = get_aws_connection_info(module, True)\n\n client = boto3_conn(module=module, conn_type='client', resource='autoscaling', region=region, **aws_connect_params)\n find_launch_configs(client, module)\n\n\n# import module snippets\nfrom ansible.module_utils.basic import *\nfrom ansible.module_utils.ec2 import *\n\nif __name__ == '__main__':\n main()\n", "path": "cloud/amazon/ec2_lc_find.py"}]} | 2,915 | 127 |
gh_patches_debug_40775 | rasdani/github-patches | git_diff | streamlink__streamlink-3662 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
plugins.bfmtv: No playable streams found
Hello. for few days, the plugin isn't working anymore
/usr/local/bin/streamlink --loglevel debug https://rmcdecouverte.bfmtv.com/mediaplayer-direct/ best
[cli][info] streamlink is running as root! Be careful!
[cli][debug] OS: Linux-5.8.0-44-generic-x86_64-with-glibc2.29
[cli][debug] Python: 3.8.5
[cli][debug] Streamlink: 2.1.1
[cli][debug] Requests(2.22.0), Socks(1.7.1), Websocket(0.58.0)
[cli][debug] Arguments:
[cli][debug] url=https://rmcdecouverte.bfmtv.com/mediaplayer-direct/
[cli][debug] stream=['best']
[cli][debug] --loglevel=debug
[cli][info] Found matching plugin bfmtv for URL https://rmcdecouverte.bfmtv.com/mediaplayer-direct/
error: No playable streams found on this URL: https://rmcdecouverte.bfmtv.com/mediaplayer-direct/
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/bfmtv.py`
Content:
```
1 import logging
2 import re
3
4 from streamlink.plugin import Plugin
5 from streamlink.plugins.brightcove import BrightcovePlayer
6
7 log = logging.getLogger(__name__)
8
9
10 class BFMTV(Plugin):
11 _url_re = re.compile(r'https://.+\.(?:bfmtv|01net)\.com')
12 _dailymotion_url = 'https://www.dailymotion.com/embed/video/{}'
13 _brightcove_video_re = re.compile(
14 r'accountid="(?P<account_id>[0-9]+).*?videoid="(?P<video_id>[0-9]+)"',
15 re.DOTALL
16 )
17 _brightcove_video_alt_re = re.compile(
18 r'data-account="(?P<account_id>[0-9]+).*?data-video-id="(?P<video_id>[0-9]+)"',
19 re.DOTALL
20 )
21 _embed_video_id_re = re.compile(
22 r'<iframe.*?src=".*?/(?P<video_id>\w+)"',
23 re.DOTALL
24 )
25
26 @classmethod
27 def can_handle_url(cls, url):
28 return cls._url_re.match(url) is not None
29
30 def _get_streams(self):
31 # Retrieve URL page and search for Brightcove video data
32 res = self.session.http.get(self.url)
33 match = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)
34 if match is not None:
35 account_id = match.group('account_id')
36 log.debug(f'Account ID: {account_id}')
37 video_id = match.group('video_id')
38 log.debug(f'Video ID: {video_id}')
39 player = BrightcovePlayer(self.session, account_id)
40 yield from player.get_streams(video_id)
41 else:
42 # Try to find the Dailymotion video ID
43 match = self._embed_video_id_re.search(res.text)
44 if match is not None:
45 video_id = match.group('video_id')
46 log.debug(f'Video ID: {video_id}')
47 yield from self.session.streams(self._dailymotion_url.format(video_id)).items()
48
49
50 __plugin__ = BFMTV
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/bfmtv.py b/src/streamlink/plugins/bfmtv.py
--- a/src/streamlink/plugins/bfmtv.py
+++ b/src/streamlink/plugins/bfmtv.py
@@ -1,8 +1,11 @@
import logging
import re
+from urllib.parse import urljoin, urlparse
from streamlink.plugin import Plugin
+from streamlink.plugin.api.utils import itertags
from streamlink.plugins.brightcove import BrightcovePlayer
+from streamlink.stream import HTTPStream
log = logging.getLogger(__name__)
@@ -22,29 +25,68 @@
r'<iframe.*?src=".*?/(?P<video_id>\w+)"',
re.DOTALL
)
+ _main_js_url_re = re.compile(r'src="([\w/]+/main\.\w+\.js)"')
+ _js_brightcove_video_re = re.compile(
+ r'i\?\([A-Z]="[^"]+",y="(?P<video_id>[0-9]+).*"data-account"\s*:\s*"(?P<account_id>[0-9]+)',
+ )
@classmethod
def can_handle_url(cls, url):
return cls._url_re.match(url) is not None
def _get_streams(self):
- # Retrieve URL page and search for Brightcove video data
res = self.session.http.get(self.url)
- match = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)
- if match is not None:
- account_id = match.group('account_id')
+
+ m = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)
+ if m:
+ account_id = m.group('account_id')
log.debug(f'Account ID: {account_id}')
- video_id = match.group('video_id')
+ video_id = m.group('video_id')
log.debug(f'Video ID: {video_id}')
player = BrightcovePlayer(self.session, account_id)
yield from player.get_streams(video_id)
- else:
- # Try to find the Dailymotion video ID
- match = self._embed_video_id_re.search(res.text)
- if match is not None:
- video_id = match.group('video_id')
+ return
+
+ # Try to find the Dailymotion video ID
+ m = self._embed_video_id_re.search(res.text)
+ if m:
+ video_id = m.group('video_id')
+ log.debug(f'Video ID: {video_id}')
+ yield from self.session.streams(self._dailymotion_url.format(video_id)).items()
+ return
+
+ # Try the JS for Brightcove video data
+ m = self._main_js_url_re.search(res.text)
+ if m:
+ log.debug(f'JS URL: {urljoin(self.url, m.group(1))}')
+ res = self.session.http.get(urljoin(self.url, m.group(1)))
+ m = self._js_brightcove_video_re.search(res.text)
+ if m:
+ account_id = m.group('account_id')
+ log.debug(f'Account ID: {account_id}')
+ video_id = m.group('video_id')
log.debug(f'Video ID: {video_id}')
- yield from self.session.streams(self._dailymotion_url.format(video_id)).items()
+ player = BrightcovePlayer(self.session, account_id)
+ yield from player.get_streams(video_id)
+ return
+
+ # Audio Live
+ audio_url = None
+ for source in itertags(res.text, 'source'):
+ url = source.attributes.get('src')
+ if url:
+ p_url = urlparse(url)
+ if p_url.path.endswith(('.mp3')):
+ audio_url = url
+
+ # Audio VOD
+ for div in itertags(res.text, 'div'):
+ if div.attributes.get('class') == 'audio-player':
+ audio_url = div.attributes.get('data-media-url')
+
+ if audio_url:
+ yield 'audio', HTTPStream(self.session, audio_url)
+ return
__plugin__ = BFMTV
| {"golden_diff": "diff --git a/src/streamlink/plugins/bfmtv.py b/src/streamlink/plugins/bfmtv.py\n--- a/src/streamlink/plugins/bfmtv.py\n+++ b/src/streamlink/plugins/bfmtv.py\n@@ -1,8 +1,11 @@\n import logging\n import re\n+from urllib.parse import urljoin, urlparse\n \n from streamlink.plugin import Plugin\n+from streamlink.plugin.api.utils import itertags\n from streamlink.plugins.brightcove import BrightcovePlayer\n+from streamlink.stream import HTTPStream\n \n log = logging.getLogger(__name__)\n \n@@ -22,29 +25,68 @@\n r'<iframe.*?src=\".*?/(?P<video_id>\\w+)\"',\n re.DOTALL\n )\n+ _main_js_url_re = re.compile(r'src=\"([\\w/]+/main\\.\\w+\\.js)\"')\n+ _js_brightcove_video_re = re.compile(\n+ r'i\\?\\([A-Z]=\"[^\"]+\",y=\"(?P<video_id>[0-9]+).*\"data-account\"\\s*:\\s*\"(?P<account_id>[0-9]+)',\n+ )\n \n @classmethod\n def can_handle_url(cls, url):\n return cls._url_re.match(url) is not None\n \n def _get_streams(self):\n- # Retrieve URL page and search for Brightcove video data\n res = self.session.http.get(self.url)\n- match = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)\n- if match is not None:\n- account_id = match.group('account_id')\n+\n+ m = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)\n+ if m:\n+ account_id = m.group('account_id')\n log.debug(f'Account ID: {account_id}')\n- video_id = match.group('video_id')\n+ video_id = m.group('video_id')\n log.debug(f'Video ID: {video_id}')\n player = BrightcovePlayer(self.session, account_id)\n yield from player.get_streams(video_id)\n- else:\n- # Try to find the Dailymotion video ID\n- match = self._embed_video_id_re.search(res.text)\n- if match is not None:\n- video_id = match.group('video_id')\n+ return\n+\n+ # Try to find the Dailymotion video ID\n+ m = self._embed_video_id_re.search(res.text)\n+ if m:\n+ video_id = m.group('video_id')\n+ log.debug(f'Video ID: {video_id}')\n+ yield from self.session.streams(self._dailymotion_url.format(video_id)).items()\n+ return\n+\n+ # Try the JS for Brightcove video data\n+ m = self._main_js_url_re.search(res.text)\n+ if m:\n+ log.debug(f'JS URL: {urljoin(self.url, m.group(1))}')\n+ res = self.session.http.get(urljoin(self.url, m.group(1)))\n+ m = self._js_brightcove_video_re.search(res.text)\n+ if m:\n+ account_id = m.group('account_id')\n+ log.debug(f'Account ID: {account_id}')\n+ video_id = m.group('video_id')\n log.debug(f'Video ID: {video_id}')\n- yield from self.session.streams(self._dailymotion_url.format(video_id)).items()\n+ player = BrightcovePlayer(self.session, account_id)\n+ yield from player.get_streams(video_id)\n+ return\n+\n+ # Audio Live\n+ audio_url = None\n+ for source in itertags(res.text, 'source'):\n+ url = source.attributes.get('src')\n+ if url:\n+ p_url = urlparse(url)\n+ if p_url.path.endswith(('.mp3')):\n+ audio_url = url\n+\n+ # Audio VOD\n+ for div in itertags(res.text, 'div'):\n+ if div.attributes.get('class') == 'audio-player':\n+ audio_url = div.attributes.get('data-media-url')\n+\n+ if audio_url:\n+ yield 'audio', HTTPStream(self.session, audio_url)\n+ return\n \n \n __plugin__ = BFMTV\n", "issue": "plugins.bfmtv: No playable streams found\n Hello. for few days, the plugin isn't working anymore\r\n\r\n\r\n/usr/local/bin/streamlink --loglevel debug https://rmcdecouverte.bfmtv.com/mediaplayer-direct/ best\r\n[cli][info] streamlink is running as root! 
Be careful!\r\n[cli][debug] OS: Linux-5.8.0-44-generic-x86_64-with-glibc2.29\r\n[cli][debug] Python: 3.8.5\r\n[cli][debug] Streamlink: 2.1.1\r\n[cli][debug] Requests(2.22.0), Socks(1.7.1), Websocket(0.58.0)\r\n[cli][debug] Arguments:\r\n[cli][debug] url=https://rmcdecouverte.bfmtv.com/mediaplayer-direct/\r\n[cli][debug] stream=['best']\r\n[cli][debug] --loglevel=debug\r\n[cli][info] Found matching plugin bfmtv for URL https://rmcdecouverte.bfmtv.com/mediaplayer-direct/\r\nerror: No playable streams found on this URL: https://rmcdecouverte.bfmtv.com/mediaplayer-direct/\n", "before_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugins.brightcove import BrightcovePlayer\n\nlog = logging.getLogger(__name__)\n\n\nclass BFMTV(Plugin):\n _url_re = re.compile(r'https://.+\\.(?:bfmtv|01net)\\.com')\n _dailymotion_url = 'https://www.dailymotion.com/embed/video/{}'\n _brightcove_video_re = re.compile(\n r'accountid=\"(?P<account_id>[0-9]+).*?videoid=\"(?P<video_id>[0-9]+)\"',\n re.DOTALL\n )\n _brightcove_video_alt_re = re.compile(\n r'data-account=\"(?P<account_id>[0-9]+).*?data-video-id=\"(?P<video_id>[0-9]+)\"',\n re.DOTALL\n )\n _embed_video_id_re = re.compile(\n r'<iframe.*?src=\".*?/(?P<video_id>\\w+)\"',\n re.DOTALL\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return cls._url_re.match(url) is not None\n\n def _get_streams(self):\n # Retrieve URL page and search for Brightcove video data\n res = self.session.http.get(self.url)\n match = self._brightcove_video_re.search(res.text) or self._brightcove_video_alt_re.search(res.text)\n if match is not None:\n account_id = match.group('account_id')\n log.debug(f'Account ID: {account_id}')\n video_id = match.group('video_id')\n log.debug(f'Video ID: {video_id}')\n player = BrightcovePlayer(self.session, account_id)\n yield from player.get_streams(video_id)\n else:\n # Try to find the Dailymotion video ID\n match = self._embed_video_id_re.search(res.text)\n if match is not None:\n video_id = match.group('video_id')\n log.debug(f'Video ID: {video_id}')\n yield from self.session.streams(self._dailymotion_url.format(video_id)).items()\n\n\n__plugin__ = BFMTV\n", "path": "src/streamlink/plugins/bfmtv.py"}], "after_files": [{"content": "import logging\nimport re\nfrom urllib.parse import urljoin, urlparse\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api.utils import itertags\nfrom streamlink.plugins.brightcove import BrightcovePlayer\nfrom streamlink.stream import HTTPStream\n\nlog = logging.getLogger(__name__)\n\n\nclass BFMTV(Plugin):\n _url_re = re.compile(r'https://.+\\.(?:bfmtv|01net)\\.com')\n _dailymotion_url = 'https://www.dailymotion.com/embed/video/{}'\n _brightcove_video_re = re.compile(\n r'accountid=\"(?P<account_id>[0-9]+).*?videoid=\"(?P<video_id>[0-9]+)\"',\n re.DOTALL\n )\n _brightcove_video_alt_re = re.compile(\n r'data-account=\"(?P<account_id>[0-9]+).*?data-video-id=\"(?P<video_id>[0-9]+)\"',\n re.DOTALL\n )\n _embed_video_id_re = re.compile(\n r'<iframe.*?src=\".*?/(?P<video_id>\\w+)\"',\n re.DOTALL\n )\n _main_js_url_re = re.compile(r'src=\"([\\w/]+/main\\.\\w+\\.js)\"')\n _js_brightcove_video_re = re.compile(\n r'i\\?\\([A-Z]=\"[^\"]+\",y=\"(?P<video_id>[0-9]+).*\"data-account\"\\s*:\\s*\"(?P<account_id>[0-9]+)',\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return cls._url_re.match(url) is not None\n\n def _get_streams(self):\n res = self.session.http.get(self.url)\n\n m = self._brightcove_video_re.search(res.text) or 
self._brightcove_video_alt_re.search(res.text)\n if m:\n account_id = m.group('account_id')\n log.debug(f'Account ID: {account_id}')\n video_id = m.group('video_id')\n log.debug(f'Video ID: {video_id}')\n player = BrightcovePlayer(self.session, account_id)\n yield from player.get_streams(video_id)\n return\n\n # Try to find the Dailymotion video ID\n m = self._embed_video_id_re.search(res.text)\n if m:\n video_id = m.group('video_id')\n log.debug(f'Video ID: {video_id}')\n yield from self.session.streams(self._dailymotion_url.format(video_id)).items()\n return\n\n # Try the JS for Brightcove video data\n m = self._main_js_url_re.search(res.text)\n if m:\n log.debug(f'JS URL: {urljoin(self.url, m.group(1))}')\n res = self.session.http.get(urljoin(self.url, m.group(1)))\n m = self._js_brightcove_video_re.search(res.text)\n if m:\n account_id = m.group('account_id')\n log.debug(f'Account ID: {account_id}')\n video_id = m.group('video_id')\n log.debug(f'Video ID: {video_id}')\n player = BrightcovePlayer(self.session, account_id)\n yield from player.get_streams(video_id)\n return\n\n # Audio Live\n audio_url = None\n for source in itertags(res.text, 'source'):\n url = source.attributes.get('src')\n if url:\n p_url = urlparse(url)\n if p_url.path.endswith(('.mp3')):\n audio_url = url\n\n # Audio VOD\n for div in itertags(res.text, 'div'):\n if div.attributes.get('class') == 'audio-player':\n audio_url = div.attributes.get('data-media-url')\n\n if audio_url:\n yield 'audio', HTTPStream(self.session, audio_url)\n return\n\n\n__plugin__ = BFMTV\n", "path": "src/streamlink/plugins/bfmtv.py"}]} | 1,111 | 955 |
gh_patches_debug_486 | rasdani/github-patches | git_diff | DDMAL__CantusDB-228 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove the "Users Online" section in footer.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django/cantusdb_project/main_app/templatetags/helper_tags.py`
Content:
```
1 import calendar
2 from typing import Union, Optional
3 from django.utils.http import urlencode
4 from django import template
5 from main_app.models import Source
6 from django.utils.safestring import mark_safe
7
8 register = template.Library()
9
10
11 @register.filter(name="month_to_string")
12 def month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:
13 """Converts month number to textual representation, 3 letters (Jan, Mar, etc)"""
14 if type(value) == int and value in range(1, 13):
15 return calendar.month_abbr[value]
16 else:
17 return value
18
19
20 @register.simple_tag(takes_context=True)
21 def url_add_get_params(context, **kwargs):
22 query = context["request"].GET.copy()
23 query.pop("page", None)
24 query.update(kwargs)
25 return query.urlencode()
26
27
28 @register.simple_tag(takes_context=False)
29 def source_links():
30 sources = (
31 Source.objects.filter(public=True, visible=True, segment__id=4063)
32 .exclude(siglum=None)
33 .values("siglum", "id")
34 .order_by("siglum")
35 )
36 options = ""
37 # <option value="source1">Source 1</option>
38 # <option value="source2">Source 2</option>
39 # <option value="source3">Source 3</option>
40 for source in sources:
41 option_str = (
42 f"<option value=source/{source['id']}>{source['siglum']}</option>\n"
43 )
44 options += option_str
45
46 return mark_safe(options)
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django/cantusdb_project/main_app/templatetags/helper_tags.py b/django/cantusdb_project/main_app/templatetags/helper_tags.py
--- a/django/cantusdb_project/main_app/templatetags/helper_tags.py
+++ b/django/cantusdb_project/main_app/templatetags/helper_tags.py
@@ -44,3 +44,7 @@
options += option_str
return mark_safe(options)
+
[email protected](name='has_group')
+def has_group(user, group_name):
+ return user.groups.filter(name=group_name).exists()
| {"golden_diff": "diff --git a/django/cantusdb_project/main_app/templatetags/helper_tags.py b/django/cantusdb_project/main_app/templatetags/helper_tags.py\n--- a/django/cantusdb_project/main_app/templatetags/helper_tags.py\n+++ b/django/cantusdb_project/main_app/templatetags/helper_tags.py\n@@ -44,3 +44,7 @@\n options += option_str\n \n return mark_safe(options)\n+\[email protected](name='has_group') \n+def has_group(user, group_name):\n+ return user.groups.filter(name=group_name).exists()\n", "issue": "Remove the \"Users Online\" section in footer.\n\n", "before_files": [{"content": "import calendar\nfrom typing import Union, Optional\nfrom django.utils.http import urlencode\nfrom django import template\nfrom main_app.models import Source\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\n\[email protected](name=\"month_to_string\")\ndef month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:\n \"\"\"Converts month number to textual representation, 3 letters (Jan, Mar, etc)\"\"\"\n if type(value) == int and value in range(1, 13):\n return calendar.month_abbr[value]\n else:\n return value\n\n\[email protected]_tag(takes_context=True)\ndef url_add_get_params(context, **kwargs):\n query = context[\"request\"].GET.copy()\n query.pop(\"page\", None)\n query.update(kwargs)\n return query.urlencode()\n\n\[email protected]_tag(takes_context=False)\ndef source_links():\n sources = (\n Source.objects.filter(public=True, visible=True, segment__id=4063)\n .exclude(siglum=None)\n .values(\"siglum\", \"id\")\n .order_by(\"siglum\")\n )\n options = \"\"\n # <option value=\"source1\">Source 1</option>\n # <option value=\"source2\">Source 2</option>\n # <option value=\"source3\">Source 3</option>\n for source in sources:\n option_str = (\n f\"<option value=source/{source['id']}>{source['siglum']}</option>\\n\"\n )\n options += option_str\n\n return mark_safe(options)\n", "path": "django/cantusdb_project/main_app/templatetags/helper_tags.py"}], "after_files": [{"content": "import calendar\nfrom typing import Union, Optional\nfrom django.utils.http import urlencode\nfrom django import template\nfrom main_app.models import Source\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\n\[email protected](name=\"month_to_string\")\ndef month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:\n \"\"\"Converts month number to textual representation, 3 letters (Jan, Mar, etc)\"\"\"\n if type(value) == int and value in range(1, 13):\n return calendar.month_abbr[value]\n else:\n return value\n\n\[email protected]_tag(takes_context=True)\ndef url_add_get_params(context, **kwargs):\n query = context[\"request\"].GET.copy()\n query.pop(\"page\", None)\n query.update(kwargs)\n return query.urlencode()\n\n\[email protected]_tag(takes_context=False)\ndef source_links():\n sources = (\n Source.objects.filter(public=True, visible=True, segment__id=4063)\n .exclude(siglum=None)\n .values(\"siglum\", \"id\")\n .order_by(\"siglum\")\n )\n options = \"\"\n # <option value=\"source1\">Source 1</option>\n # <option value=\"source2\">Source 2</option>\n # <option value=\"source3\">Source 3</option>\n for source in sources:\n option_str = (\n f\"<option value=source/{source['id']}>{source['siglum']}</option>\\n\"\n )\n options += option_str\n\n return mark_safe(options)\n\[email protected](name='has_group') \ndef has_group(user, group_name):\n return user.groups.filter(name=group_name).exists() \n", "path": 
"django/cantusdb_project/main_app/templatetags/helper_tags.py"}]} | 712 | 137 |
gh_patches_debug_13979 | rasdani/github-patches | git_diff | facebookresearch__fairscale-975 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
the main branch is not compatible with python 3.6, but setup.py only requires ">=3.6"
python 3.6 can pip install latest fairscale
https://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/setup.py#L67
but, some code is not compatible with python 3.6
https://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/fairscale/experimental/nn/ssd_offload.py#L6
and python<3.7 has no dataclasses
https://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/fairscale/nn/data_parallel/fully_sharded_data_parallel.py#L8
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python3
2
3 # Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.
4 #
5 # This source code is licensed under the BSD license found in the
6 # LICENSE file in the root directory of this source tree.
7
8 import os
9 import re
10
11 import setuptools
12
13 this_dir = os.path.dirname(os.path.abspath(__file__))
14
15
16 def fetch_requirements():
17 with open("requirements.txt") as f:
18 reqs = f.read().strip().split("\n")
19 return reqs
20
21
22 # https://packaging.python.org/guides/single-sourcing-package-version/
23 def find_version(version_file_path) -> str:
24 with open(version_file_path) as version_file:
25 version_match = re.search(r"^__version_tuple__ = (.*)", version_file.read(), re.M)
26 if version_match:
27 ver_tup = eval(version_match.group(1))
28 ver_str = ".".join([str(x) for x in ver_tup])
29 return ver_str
30 raise RuntimeError("Unable to find version tuple.")
31
32
33 extensions = []
34 cmdclass = {}
35
36 if os.getenv("BUILD_CUDA_EXTENSIONS", "0") == "1":
37 from torch.utils.cpp_extension import BuildExtension, CUDAExtension
38
39 extensions.extend(
40 [
41 CUDAExtension(
42 name="fairscale.fused_adam_cuda",
43 include_dirs=[os.path.join(this_dir, "fairscale/clib/fused_adam_cuda")],
44 sources=[
45 "fairscale/clib/fused_adam_cuda/fused_adam_cuda.cpp",
46 "fairscale/clib/fused_adam_cuda/fused_adam_cuda_kernel.cu",
47 ],
48 extra_compile_args={"cxx": ["-O3"], "nvcc": ["-O3", "--use_fast_math"]},
49 )
50 ]
51 )
52
53 cmdclass["build_ext"] = BuildExtension
54
55
56 if __name__ == "__main__":
57 setuptools.setup(
58 name="fairscale",
59 description="FairScale: A PyTorch library for large-scale and high-performance training.",
60 version=find_version("fairscale/version.py"),
61 setup_requires=["ninja"], # ninja is required to build extensions
62 install_requires=fetch_requirements(),
63 include_package_data=True,
64 packages=setuptools.find_packages(exclude=("tests", "tests.*")),
65 ext_modules=extensions,
66 cmdclass=cmdclass,
67 python_requires=">=3.6",
68 author="Facebook AI Research",
69 author_email="[email protected]",
70 long_description="FairScale is a PyTorch extension library for high performance and large scale training on one or multiple machines/nodes. This library extends basic PyTorch capabilities while adding new experimental ones.",
71 long_description_content_type="text/markdown",
72 classifiers=[
73 "Programming Language :: Python :: 3.7",
74 "Programming Language :: Python :: 3.8",
75 "Programming Language :: Python :: 3.9",
76 "License :: OSI Approved :: BSD License",
77 "Topic :: Scientific/Engineering :: Artificial Intelligence",
78 "Operating System :: OS Independent",
79 ],
80 )
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -64,7 +64,7 @@
packages=setuptools.find_packages(exclude=("tests", "tests.*")),
ext_modules=extensions,
cmdclass=cmdclass,
- python_requires=">=3.6",
+ python_requires=">=3.7",
author="Facebook AI Research",
author_email="[email protected]",
long_description="FairScale is a PyTorch extension library for high performance and large scale training on one or multiple machines/nodes. This library extends basic PyTorch capabilities while adding new experimental ones.",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -64,7 +64,7 @@\n packages=setuptools.find_packages(exclude=(\"tests\", \"tests.*\")),\n ext_modules=extensions,\n cmdclass=cmdclass,\n- python_requires=\">=3.6\",\n+ python_requires=\">=3.7\",\n author=\"Facebook AI Research\",\n author_email=\"[email protected]\",\n long_description=\"FairScale is a PyTorch extension library for high performance and large scale training on one or multiple machines/nodes. This library extends basic PyTorch capabilities while adding new experimental ones.\",\n", "issue": "the main branch is not compatible with python 3.6, but setup.py only requires \">=3.6\"\npython 3.6 can pip install latest fairscale\r\nhttps://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/setup.py#L67\r\n\r\nbut, some code is not compatible with python 3.6\r\nhttps://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/fairscale/experimental/nn/ssd_offload.py#L6\r\nand python<3.7 has no dataclasses\r\nhttps://github.com/facebookresearch/fairscale/blob/1bc96fa8c69def6d990e42bfbd75f86146ce29bd/fairscale/nn/data_parallel/fully_sharded_data_parallel.py#L8\r\n\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.\n#\n# This source code is licensed under the BSD license found in the\n# LICENSE file in the root directory of this source tree.\n\nimport os\nimport re\n\nimport setuptools\n\nthis_dir = os.path.dirname(os.path.abspath(__file__))\n\n\ndef fetch_requirements():\n with open(\"requirements.txt\") as f:\n reqs = f.read().strip().split(\"\\n\")\n return reqs\n\n\n# https://packaging.python.org/guides/single-sourcing-package-version/\ndef find_version(version_file_path) -> str:\n with open(version_file_path) as version_file:\n version_match = re.search(r\"^__version_tuple__ = (.*)\", version_file.read(), re.M)\n if version_match:\n ver_tup = eval(version_match.group(1))\n ver_str = \".\".join([str(x) for x in ver_tup])\n return ver_str\n raise RuntimeError(\"Unable to find version tuple.\")\n\n\nextensions = []\ncmdclass = {}\n\nif os.getenv(\"BUILD_CUDA_EXTENSIONS\", \"0\") == \"1\":\n from torch.utils.cpp_extension import BuildExtension, CUDAExtension\n\n extensions.extend(\n [\n CUDAExtension(\n name=\"fairscale.fused_adam_cuda\",\n include_dirs=[os.path.join(this_dir, \"fairscale/clib/fused_adam_cuda\")],\n sources=[\n \"fairscale/clib/fused_adam_cuda/fused_adam_cuda.cpp\",\n \"fairscale/clib/fused_adam_cuda/fused_adam_cuda_kernel.cu\",\n ],\n extra_compile_args={\"cxx\": [\"-O3\"], \"nvcc\": [\"-O3\", \"--use_fast_math\"]},\n )\n ]\n )\n\n cmdclass[\"build_ext\"] = BuildExtension\n\n\nif __name__ == \"__main__\":\n setuptools.setup(\n name=\"fairscale\",\n description=\"FairScale: A PyTorch library for large-scale and high-performance training.\",\n version=find_version(\"fairscale/version.py\"),\n setup_requires=[\"ninja\"], # ninja is required to build extensions\n install_requires=fetch_requirements(),\n include_package_data=True,\n packages=setuptools.find_packages(exclude=(\"tests\", \"tests.*\")),\n ext_modules=extensions,\n cmdclass=cmdclass,\n python_requires=\">=3.6\",\n author=\"Facebook AI Research\",\n author_email=\"[email protected]\",\n long_description=\"FairScale is a PyTorch extension library for high performance and large scale training on one or multiple machines/nodes. 
This library extends basic PyTorch capabilities while adding new experimental ones.\",\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"License :: OSI Approved :: BSD License\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n \"Operating System :: OS Independent\",\n ],\n )\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.\n#\n# This source code is licensed under the BSD license found in the\n# LICENSE file in the root directory of this source tree.\n\nimport os\nimport re\n\nimport setuptools\n\nthis_dir = os.path.dirname(os.path.abspath(__file__))\n\n\ndef fetch_requirements():\n with open(\"requirements.txt\") as f:\n reqs = f.read().strip().split(\"\\n\")\n return reqs\n\n\n# https://packaging.python.org/guides/single-sourcing-package-version/\ndef find_version(version_file_path) -> str:\n with open(version_file_path) as version_file:\n version_match = re.search(r\"^__version_tuple__ = (.*)\", version_file.read(), re.M)\n if version_match:\n ver_tup = eval(version_match.group(1))\n ver_str = \".\".join([str(x) for x in ver_tup])\n return ver_str\n raise RuntimeError(\"Unable to find version tuple.\")\n\n\nextensions = []\ncmdclass = {}\n\nif os.getenv(\"BUILD_CUDA_EXTENSIONS\", \"0\") == \"1\":\n from torch.utils.cpp_extension import BuildExtension, CUDAExtension\n\n extensions.extend(\n [\n CUDAExtension(\n name=\"fairscale.fused_adam_cuda\",\n include_dirs=[os.path.join(this_dir, \"fairscale/clib/fused_adam_cuda\")],\n sources=[\n \"fairscale/clib/fused_adam_cuda/fused_adam_cuda.cpp\",\n \"fairscale/clib/fused_adam_cuda/fused_adam_cuda_kernel.cu\",\n ],\n extra_compile_args={\"cxx\": [\"-O3\"], \"nvcc\": [\"-O3\", \"--use_fast_math\"]},\n )\n ]\n )\n\n cmdclass[\"build_ext\"] = BuildExtension\n\n\nif __name__ == \"__main__\":\n setuptools.setup(\n name=\"fairscale\",\n description=\"FairScale: A PyTorch library for large-scale and high-performance training.\",\n version=find_version(\"fairscale/version.py\"),\n setup_requires=[\"ninja\"], # ninja is required to build extensions\n install_requires=fetch_requirements(),\n include_package_data=True,\n packages=setuptools.find_packages(exclude=(\"tests\", \"tests.*\")),\n ext_modules=extensions,\n cmdclass=cmdclass,\n python_requires=\">=3.7\",\n author=\"Facebook AI Research\",\n author_email=\"[email protected]\",\n long_description=\"FairScale is a PyTorch extension library for high performance and large scale training on one or multiple machines/nodes. This library extends basic PyTorch capabilities while adding new experimental ones.\",\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"License :: OSI Approved :: BSD License\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n \"Operating System :: OS Independent\",\n ],\n )\n", "path": "setup.py"}]} | 1,297 | 138 |
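The fairscale example is a packaging fix: the code base uses features such as `dataclasses` that only exist from Python 3.7 on, so `python_requires=">=3.6"` let 3.6 users install a wheel they could not import. A small, illustrative compatibility probe (not part of the project) that shows why the floor has to be 3.7:

```python
import sys

# dataclasses entered the standard library in Python 3.7, which is the
# incompatibility the issue points out for fairscale on 3.6.
if sys.version_info < (3, 7):
    raise SystemExit("fairscale needs Python >= 3.7")

import dataclasses

@dataclasses.dataclass
class TrainingConfig:
    lr: float = 1e-3
    world_size: int = 1

print(dataclasses.asdict(TrainingConfig()))  # {'lr': 0.001, 'world_size': 1}
```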
gh_patches_debug_38255 | rasdani/github-patches | git_diff | hydroshare__hydroshare-5302 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enable management command to fix file issues on published resource
Currently, running `repair_resource` on published resources will fail.
Reproduce by:
* Create a resource and add some files
* Use the database or shell to remove some of the files from django to put the resource in a "broken" state
* Publish the resource
* Run the repair_resource --published management command and see that the published resource cannot be fixed by this script
**Additional context**
HS v2.11.3
--- END ISSUE ---
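As a side note, step 2 of the reproduction above can be forced from a Django shell with a minimal sketch along these lines; only `BaseResource` appears in the code further down, while the placeholder `short_id` and the `files` related manager are assumptions made for illustration.

```python
# Hedged sketch of forcing the "broken" state: delete a ResourceFile row in Django
# while leaving the corresponding iRODS file in place.
from hs_core.models import BaseResource

res = BaseResource.objects.get(short_id='abc123')  # 'abc123' is a placeholder id
orphan = res.files.first()                         # assumes a `files` related manager
if orphan is not None:
    orphan.delete()                                # Django forgets the file; iRODS still has it
# Publishing this resource and then running `repair_resource --published`
# reproduces the failure described above.
```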
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hs_core/management/commands/repair_resource.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 """
4 Check synchronization between iRODS and Django for multiple resources
5
6 This checks that:
7
8 1. every ResourceFile corresponds to an iRODS file
9 2. every iRODS file in {short_id}/data/contents corresponds to a ResourceFile
10 3. every iRODS directory {short_id} corresponds to a Django resource
11 """
12
13 from django.core.management.base import BaseCommand, CommandError
14 from hs_core.models import BaseResource
15 from hs_core.management.utils import repair_resource
16 from hs_core.views.utils import get_default_admin_user
17 from hs_core import hydroshare
18 from django.utils import timezone
19 from django.db.models import F
20 from datetime import timedelta
21
22 import logging
23
24
25 class Command(BaseCommand):
26 help = "Check synchronization between iRODS and Django."
27
28 def add_arguments(self, parser):
29 parser.add_argument('resource_ids', nargs='*', type=str)
30 parser.add_argument('--days', type=int, dest='days', help='include resources updated in the last X days')
31 parser.add_argument(
32 '--admin',
33 action='store_true', # True for presence, False for absence
34 dest='admin', # value is options['dry_run']
35 help='run process as admin user - this allows published resources to be modified',
36 )
37 parser.add_argument(
38 '--dryrun',
39 action='store_true', # True for presence, False for absence
40 dest='dry_run', # value is options['dry_run']
41 help='run process without saving changes',
42 )
43 parser.add_argument(
44 '--published',
45 action='store_true', # True for presence, False for absence
46 dest='published', # value is options['published']
47 help='filter to just published resources',
48 )
49
50 def handle(self, *args, **options):
51 logger = logging.getLogger(__name__)
52 resources_ids = options['resource_ids']
53 resources = BaseResource.objects.all()
54 days = options['days']
55 admin = options['admin']
56 dry_run = options['dry_run']
57 published = options['published']
58 site_url = hydroshare.utils.current_site_url()
59
60 if resources_ids: # an array of resource short_id to check.
61 print("CHECKING RESOURCES PROVIDED")
62 resources = resources.filter(short_id__in=resources_ids)
63 if published:
64 if not dry_run:
65 print("WARNING: Executing with --published arg without --dryrun. Published resources will be modified.")
66 print("FILTERING TO INCLUDE PUBLISHED RESOURCES ONLY")
67 resources = resources.filter(raccess__published=True)
68
69 if days:
70 print(f"FILTERING TO INCLUDE RESOURCES UPDATED IN LAST {days} DAYS")
71 if resources_ids:
72 print("Your supplied resource_ids will be filtered by the --days that you provided. ")
73 cuttoff_time = timezone.now() - timedelta(days)
74 resources = resources.filter(updated__gte=cuttoff_time)
75
76 if dry_run:
77 print("CONDUCTING A DRY RUN: FIXES WILL NOT BE SAVED")
78
79 if not resources:
80 print("NO RESOURCES FOUND MATCHING YOUR FILTER ARGUMENTS")
81 return
82
83 if admin:
84 print("PROCESSES WILL BE RUN AS ADMIN USER. ALLOWS DELETING DJANGO RESOURCE FILES ON PUBLISHED RESOURCES")
85 user = get_default_admin_user()
86 else:
87 user = None
88
89 resources = resources.order_by(F('updated').asc(nulls_first=True))
90
91 total_res_to_check = resources.count()
92 current_resource = 0
93 impacted_resources = 0
94 total_files_missing_in_django = 0
95 total_files_dangling_in_django = 0
96 resources_with_missing_django = []
97 resources_with_missing_irods = []
98 for resource in resources.iterator():
99 current_resource += 1
100 res_url = site_url + resource.absolute_url
101 print("*" * 100)
102 print(f"{current_resource}/{total_res_to_check}: Checking resource {res_url}")
103 if resource.raccess.published:
104 print("This Resource is published")
105 if admin:
106 print("Command running with --admin. Published resources will be repaired if needed.")
107 else:
108 print("Command running without --admin. Fixing a published resource raise ValidationError")
109 _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)
110 if dangling_in_django > 0 or missing_in_django > 0:
111 impacted_resources += 1
112 total_files_missing_in_django += missing_in_django
113 total_files_dangling_in_django += dangling_in_django
114 if missing_in_django > 0:
115 resources_with_missing_django.append(res_url)
116 if dangling_in_django > 0:
117 resources_with_missing_irods.append(res_url)
118 print(f"{dangling_in_django} files dangling in Django for this resource.")
119 print(f"{missing_in_django} files missing in Django for this resource.")
120 print(f"Resources thus far with at least one missing django file: {len(resources_with_missing_django)}")
121 print(f"Resources thus far with at least one dangling django file: {len(resources_with_missing_irods)}")
122 print(f"Total resources with discrepancies thus far: {impacted_resources}")
123 print("*" * 100)
124 print("*" * 100)
125 print(f"Number of resources that had at least one file issue: {impacted_resources}")
126
127 print("*" * 100)
128 print(f"Total number of files missing in Django (across all checked resources): \
129 {total_files_missing_in_django}")
130 print(f"Number of resources with at least one missing django file: {len(resources_with_missing_django)}")
131 for res in resources_with_missing_django:
132 print(res)
133
134 print("*" * 100)
135 print(f"Total number of files dangling in Django (across all checked resources): \
136 {total_files_dangling_in_django}")
137 print(f"Number of resources with at least one dangling Django file: {len(resources_with_missing_irods)}")
138 for res in resources_with_missing_irods:
139 print(res)
140
141 # Make it simple to detect clean/fail run in Jenkins
142 if impacted_resources:
143 raise CommandError("repair_resources detected problems")
144 else:
145 print("Completed run without detecting issues")
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hs_core/management/commands/repair_resource.py b/hs_core/management/commands/repair_resource.py
--- a/hs_core/management/commands/repair_resource.py
+++ b/hs_core/management/commands/repair_resource.py
@@ -11,6 +11,7 @@
"""
from django.core.management.base import BaseCommand, CommandError
+from django.core.exceptions import ValidationError
from hs_core.models import BaseResource
from hs_core.management.utils import repair_resource
from hs_core.views.utils import get_default_admin_user
@@ -95,6 +96,7 @@
total_files_dangling_in_django = 0
resources_with_missing_django = []
resources_with_missing_irods = []
+ failed_resources = []
for resource in resources.iterator():
current_resource += 1
res_url = site_url + resource.absolute_url
@@ -106,7 +108,13 @@
print("Command running with --admin. Published resources will be repaired if needed.")
else:
print("Command running without --admin. Fixing a published resource raise ValidationError")
- _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)
+ try:
+ _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)
+ except ValidationError as ve:
+ failed_resources.append(res_url)
+ print("Exception while attempting to repair resource:")
+ print(ve)
+ continue
if dangling_in_django > 0 or missing_in_django > 0:
impacted_resources += 1
total_files_missing_in_django += missing_in_django
@@ -139,7 +147,13 @@
print(res)
# Make it simple to detect clean/fail run in Jenkins
- if impacted_resources:
- raise CommandError("repair_resources detected problems")
+ if impacted_resources and dry_run:
+ raise CommandError("repair_resources detected resources in need of repair during dry run")
else:
- print("Completed run without detecting issues")
+ print("Completed run of repair_resource")
+ if failed_resources:
+ print("*" * 100)
+ print("Repair was attempted but failed for the following resources:")
+ for res in resources_with_missing_irods:
+ print(res)
+ raise CommandError("Repair was attempted but failed on at least one resource")
| {"golden_diff": "diff --git a/hs_core/management/commands/repair_resource.py b/hs_core/management/commands/repair_resource.py\n--- a/hs_core/management/commands/repair_resource.py\n+++ b/hs_core/management/commands/repair_resource.py\n@@ -11,6 +11,7 @@\n \"\"\"\n \n from django.core.management.base import BaseCommand, CommandError\n+from django.core.exceptions import ValidationError\n from hs_core.models import BaseResource\n from hs_core.management.utils import repair_resource\n from hs_core.views.utils import get_default_admin_user\n@@ -95,6 +96,7 @@\n total_files_dangling_in_django = 0\n resources_with_missing_django = []\n resources_with_missing_irods = []\n+ failed_resources = []\n for resource in resources.iterator():\n current_resource += 1\n res_url = site_url + resource.absolute_url\n@@ -106,7 +108,13 @@\n print(\"Command running with --admin. Published resources will be repaired if needed.\")\n else:\n print(\"Command running without --admin. Fixing a published resource raise ValidationError\")\n- _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)\n+ try:\n+ _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)\n+ except ValidationError as ve:\n+ failed_resources.append(res_url)\n+ print(\"Exception while attempting to repair resource:\")\n+ print(ve)\n+ continue\n if dangling_in_django > 0 or missing_in_django > 0:\n impacted_resources += 1\n total_files_missing_in_django += missing_in_django\n@@ -139,7 +147,13 @@\n print(res)\n \n # Make it simple to detect clean/fail run in Jenkins\n- if impacted_resources:\n- raise CommandError(\"repair_resources detected problems\")\n+ if impacted_resources and dry_run:\n+ raise CommandError(\"repair_resources detected resources in need of repair during dry run\")\n else:\n- print(\"Completed run without detecting issues\")\n+ print(\"Completed run of repair_resource\")\n+ if failed_resources:\n+ print(\"*\" * 100)\n+ print(\"Repair was attempted but failed for the following resources:\")\n+ for res in resources_with_missing_irods:\n+ print(res)\n+ raise CommandError(\"Repair was attempted but failed on at least one resource\")\n", "issue": "Enable management command to fix file issues on published resource\nCurrently, running `repair_resource` on published resources will fail.\r\n\r\nReproduce by:\r\n* Create a resource and add some files\r\n* Use the database or shell to remove some of the files from django to put the resource in a \"broken\" state\r\n* Publish the resource\r\n* Run the repair_resource --published management command and see that the published resource cannot be fixed by this script\r\n\r\n**Additional context**\r\nHS v2.11.3\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"\nCheck synchronization between iRODS and Django for multiple resources\n\nThis checks that:\n\n1. every ResourceFile corresponds to an iRODS file\n2. every iRODS file in {short_id}/data/contents corresponds to a ResourceFile\n3. 
every iRODS directory {short_id} corresponds to a Django resource\n\"\"\"\n\nfrom django.core.management.base import BaseCommand, CommandError\nfrom hs_core.models import BaseResource\nfrom hs_core.management.utils import repair_resource\nfrom hs_core.views.utils import get_default_admin_user\nfrom hs_core import hydroshare\nfrom django.utils import timezone\nfrom django.db.models import F\nfrom datetime import timedelta\n\nimport logging\n\n\nclass Command(BaseCommand):\n help = \"Check synchronization between iRODS and Django.\"\n\n def add_arguments(self, parser):\n parser.add_argument('resource_ids', nargs='*', type=str)\n parser.add_argument('--days', type=int, dest='days', help='include resources updated in the last X days')\n parser.add_argument(\n '--admin',\n action='store_true', # True for presence, False for absence\n dest='admin', # value is options['dry_run']\n help='run process as admin user - this allows published resources to be modified',\n )\n parser.add_argument(\n '--dryrun',\n action='store_true', # True for presence, False for absence\n dest='dry_run', # value is options['dry_run']\n help='run process without saving changes',\n )\n parser.add_argument(\n '--published',\n action='store_true', # True for presence, False for absence\n dest='published', # value is options['published']\n help='filter to just published resources',\n )\n\n def handle(self, *args, **options):\n logger = logging.getLogger(__name__)\n resources_ids = options['resource_ids']\n resources = BaseResource.objects.all()\n days = options['days']\n admin = options['admin']\n dry_run = options['dry_run']\n published = options['published']\n site_url = hydroshare.utils.current_site_url()\n\n if resources_ids: # an array of resource short_id to check.\n print(\"CHECKING RESOURCES PROVIDED\")\n resources = resources.filter(short_id__in=resources_ids)\n if published:\n if not dry_run:\n print(\"WARNING: Executing with --published arg without --dryrun. Published resources will be modified.\")\n print(\"FILTERING TO INCLUDE PUBLISHED RESOURCES ONLY\")\n resources = resources.filter(raccess__published=True)\n\n if days:\n print(f\"FILTERING TO INCLUDE RESOURCES UPDATED IN LAST {days} DAYS\")\n if resources_ids:\n print(\"Your supplied resource_ids will be filtered by the --days that you provided. \")\n cuttoff_time = timezone.now() - timedelta(days)\n resources = resources.filter(updated__gte=cuttoff_time)\n\n if dry_run:\n print(\"CONDUCTING A DRY RUN: FIXES WILL NOT BE SAVED\")\n\n if not resources:\n print(\"NO RESOURCES FOUND MATCHING YOUR FILTER ARGUMENTS\")\n return\n\n if admin:\n print(\"PROCESSES WILL BE RUN AS ADMIN USER. ALLOWS DELETING DJANGO RESOURCE FILES ON PUBLISHED RESOURCES\")\n user = get_default_admin_user()\n else:\n user = None\n\n resources = resources.order_by(F('updated').asc(nulls_first=True))\n\n total_res_to_check = resources.count()\n current_resource = 0\n impacted_resources = 0\n total_files_missing_in_django = 0\n total_files_dangling_in_django = 0\n resources_with_missing_django = []\n resources_with_missing_irods = []\n for resource in resources.iterator():\n current_resource += 1\n res_url = site_url + resource.absolute_url\n print(\"*\" * 100)\n print(f\"{current_resource}/{total_res_to_check}: Checking resource {res_url}\")\n if resource.raccess.published:\n print(\"This Resource is published\")\n if admin:\n print(\"Command running with --admin. Published resources will be repaired if needed.\")\n else:\n print(\"Command running without --admin. 
Fixing a published resource raise ValidationError\")\n _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)\n if dangling_in_django > 0 or missing_in_django > 0:\n impacted_resources += 1\n total_files_missing_in_django += missing_in_django\n total_files_dangling_in_django += dangling_in_django\n if missing_in_django > 0:\n resources_with_missing_django.append(res_url)\n if dangling_in_django > 0:\n resources_with_missing_irods.append(res_url)\n print(f\"{dangling_in_django} files dangling in Django for this resource.\")\n print(f\"{missing_in_django} files missing in Django for this resource.\")\n print(f\"Resources thus far with at least one missing django file: {len(resources_with_missing_django)}\")\n print(f\"Resources thus far with at least one dangling django file: {len(resources_with_missing_irods)}\")\n print(f\"Total resources with discrepancies thus far: {impacted_resources}\")\n print(\"*\" * 100)\n print(\"*\" * 100)\n print(f\"Number of resources that had at least one file issue: {impacted_resources}\")\n\n print(\"*\" * 100)\n print(f\"Total number of files missing in Django (across all checked resources): \\\n {total_files_missing_in_django}\")\n print(f\"Number of resources with at least one missing django file: {len(resources_with_missing_django)}\")\n for res in resources_with_missing_django:\n print(res)\n\n print(\"*\" * 100)\n print(f\"Total number of files dangling in Django (across all checked resources): \\\n {total_files_dangling_in_django}\")\n print(f\"Number of resources with at least one dangling Django file: {len(resources_with_missing_irods)}\")\n for res in resources_with_missing_irods:\n print(res)\n\n # Make it simple to detect clean/fail run in Jenkins\n if impacted_resources:\n raise CommandError(\"repair_resources detected problems\")\n else:\n print(\"Completed run without detecting issues\")\n", "path": "hs_core/management/commands/repair_resource.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"\nCheck synchronization between iRODS and Django for multiple resources\n\nThis checks that:\n\n1. every ResourceFile corresponds to an iRODS file\n2. every iRODS file in {short_id}/data/contents corresponds to a ResourceFile\n3. 
every iRODS directory {short_id} corresponds to a Django resource\n\"\"\"\n\nfrom django.core.management.base import BaseCommand, CommandError\nfrom django.core.exceptions import ValidationError\nfrom hs_core.models import BaseResource\nfrom hs_core.management.utils import repair_resource\nfrom hs_core.views.utils import get_default_admin_user\nfrom hs_core import hydroshare\nfrom django.utils import timezone\nfrom django.db.models import F\nfrom datetime import timedelta\n\nimport logging\n\n\nclass Command(BaseCommand):\n help = \"Check synchronization between iRODS and Django.\"\n\n def add_arguments(self, parser):\n parser.add_argument('resource_ids', nargs='*', type=str)\n parser.add_argument('--days', type=int, dest='days', help='include resources updated in the last X days')\n parser.add_argument(\n '--admin',\n action='store_true', # True for presence, False for absence\n dest='admin', # value is options['dry_run']\n help='run process as admin user - this allows published resources to be modified',\n )\n parser.add_argument(\n '--dryrun',\n action='store_true', # True for presence, False for absence\n dest='dry_run', # value is options['dry_run']\n help='run process without saving changes',\n )\n parser.add_argument(\n '--published',\n action='store_true', # True for presence, False for absence\n dest='published', # value is options['published']\n help='filter to just published resources',\n )\n\n def handle(self, *args, **options):\n logger = logging.getLogger(__name__)\n resources_ids = options['resource_ids']\n resources = BaseResource.objects.all()\n days = options['days']\n admin = options['admin']\n dry_run = options['dry_run']\n published = options['published']\n site_url = hydroshare.utils.current_site_url()\n\n if resources_ids: # an array of resource short_id to check.\n print(\"CHECKING RESOURCES PROVIDED\")\n resources = resources.filter(short_id__in=resources_ids)\n if published:\n if not dry_run:\n print(\"WARNING: Executing with --published arg without --dryrun. Published resources will be modified.\")\n print(\"FILTERING TO INCLUDE PUBLISHED RESOURCES ONLY\")\n resources = resources.filter(raccess__published=True)\n\n if days:\n print(f\"FILTERING TO INCLUDE RESOURCES UPDATED IN LAST {days} DAYS\")\n if resources_ids:\n print(\"Your supplied resource_ids will be filtered by the --days that you provided. \")\n cuttoff_time = timezone.now() - timedelta(days)\n resources = resources.filter(updated__gte=cuttoff_time)\n\n if dry_run:\n print(\"CONDUCTING A DRY RUN: FIXES WILL NOT BE SAVED\")\n\n if not resources:\n print(\"NO RESOURCES FOUND MATCHING YOUR FILTER ARGUMENTS\")\n return\n\n if admin:\n print(\"PROCESSES WILL BE RUN AS ADMIN USER. ALLOWS DELETING DJANGO RESOURCE FILES ON PUBLISHED RESOURCES\")\n user = get_default_admin_user()\n else:\n user = None\n\n resources = resources.order_by(F('updated').asc(nulls_first=True))\n\n total_res_to_check = resources.count()\n current_resource = 0\n impacted_resources = 0\n total_files_missing_in_django = 0\n total_files_dangling_in_django = 0\n resources_with_missing_django = []\n resources_with_missing_irods = []\n failed_resources = []\n for resource in resources.iterator():\n current_resource += 1\n res_url = site_url + resource.absolute_url\n print(\"*\" * 100)\n print(f\"{current_resource}/{total_res_to_check}: Checking resource {res_url}\")\n if resource.raccess.published:\n print(\"This Resource is published\")\n if admin:\n print(\"Command running with --admin. 
Published resources will be repaired if needed.\")\n else:\n print(\"Command running without --admin. Fixing a published resource raise ValidationError\")\n try:\n _, missing_in_django, dangling_in_django = repair_resource(resource, logger, dry_run=dry_run, user=user)\n except ValidationError as ve:\n failed_resources.append(res_url)\n print(\"Exception while attempting to repair resource:\")\n print(ve)\n continue\n if dangling_in_django > 0 or missing_in_django > 0:\n impacted_resources += 1\n total_files_missing_in_django += missing_in_django\n total_files_dangling_in_django += dangling_in_django\n if missing_in_django > 0:\n resources_with_missing_django.append(res_url)\n if dangling_in_django > 0:\n resources_with_missing_irods.append(res_url)\n print(f\"{dangling_in_django} files dangling in Django for this resource.\")\n print(f\"{missing_in_django} files missing in Django for this resource.\")\n print(f\"Resources thus far with at least one missing django file: {len(resources_with_missing_django)}\")\n print(f\"Resources thus far with at least one dangling django file: {len(resources_with_missing_irods)}\")\n print(f\"Total resources with discrepancies thus far: {impacted_resources}\")\n print(\"*\" * 100)\n print(\"*\" * 100)\n print(f\"Number of resources that had at least one file issue: {impacted_resources}\")\n\n print(\"*\" * 100)\n print(f\"Total number of files missing in Django (across all checked resources): \\\n {total_files_missing_in_django}\")\n print(f\"Number of resources with at least one missing django file: {len(resources_with_missing_django)}\")\n for res in resources_with_missing_django:\n print(res)\n\n print(\"*\" * 100)\n print(f\"Total number of files dangling in Django (across all checked resources): \\\n {total_files_dangling_in_django}\")\n print(f\"Number of resources with at least one dangling Django file: {len(resources_with_missing_irods)}\")\n for res in resources_with_missing_irods:\n print(res)\n\n # Make it simple to detect clean/fail run in Jenkins\n if impacted_resources and dry_run:\n raise CommandError(\"repair_resources detected resources in need of repair during dry run\")\n else:\n print(\"Completed run of repair_resource\")\n if failed_resources:\n print(\"*\" * 100)\n print(\"Repair was attempted but failed for the following resources:\")\n for res in resources_with_missing_irods:\n print(res)\n raise CommandError(\"Repair was attempted but failed on at least one resource\")\n", "path": "hs_core/management/commands/repair_resource.py"}]} | 2,073 | 542 |
gh_patches_debug_16300 | rasdani/github-patches | git_diff | pre-commit__pre-commit-399 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Drop python2.6?
Is it worth attempting to continue to support python2.6?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import find_packages
2 from setuptools import setup
3
4
5 setup(
6 name='pre_commit',
7 description=(
8 'A framework for managing and maintaining multi-language pre-commit '
9 'hooks.'
10 ),
11 url='https://github.com/pre-commit/pre-commit',
12 version='0.8.2',
13
14 author='Anthony Sottile',
15 author_email='[email protected]',
16
17 platforms='linux',
18 classifiers=[
19 'License :: OSI Approved :: MIT License',
20 'Programming Language :: Python :: 2',
21 'Programming Language :: Python :: 2.6',
22 'Programming Language :: Python :: 2.7',
23 'Programming Language :: Python :: 3',
24 'Programming Language :: Python :: 3.4',
25 'Programming Language :: Python :: 3.5',
26 'Programming Language :: Python :: Implementation :: CPython',
27 'Programming Language :: Python :: Implementation :: PyPy',
28 ],
29
30 packages=find_packages('.', exclude=('tests*', 'testing*')),
31 package_data={
32 'pre_commit': [
33 'resources/hook-tmpl',
34 'resources/pre-push-tmpl',
35 'resources/rbenv.tar.gz',
36 'resources/ruby-build.tar.gz',
37 'resources/ruby-download.tar.gz',
38 ]
39 },
40 install_requires=[
41 'aspy.yaml',
42 'cached-property',
43 'jsonschema',
44 'nodeenv>=0.11.1',
45 'pyterminalsize',
46 'pyyaml',
47 'virtualenv',
48 ],
49 extras_require={
50 ':python_version=="2.6"': ['argparse', 'ordereddict'],
51 },
52 entry_points={
53 'console_scripts': [
54 'pre-commit = pre_commit.main:main',
55 'pre-commit-validate-config = pre_commit.clientlib.validate_config:run', # noqa
56 'pre-commit-validate-manifest = pre_commit.clientlib.validate_manifest:run', # noqa
57 ],
58 },
59 )
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -18,7 +18,6 @@
classifiers=[
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 2',
- 'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
@@ -46,9 +45,6 @@
'pyyaml',
'virtualenv',
],
- extras_require={
- ':python_version=="2.6"': ['argparse', 'ordereddict'],
- },
entry_points={
'console_scripts': [
'pre-commit = pre_commit.main:main',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -18,7 +18,6 @@\n classifiers=[\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 2',\n- 'Programming Language :: Python :: 2.6',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n@@ -46,9 +45,6 @@\n 'pyyaml',\n 'virtualenv',\n ],\n- extras_require={\n- ':python_version==\"2.6\"': ['argparse', 'ordereddict'],\n- },\n entry_points={\n 'console_scripts': [\n 'pre-commit = pre_commit.main:main',\n", "issue": "Drop python2.6?\nIs it worth attempting to continue to support python2.6?\n\n", "before_files": [{"content": "from setuptools import find_packages\nfrom setuptools import setup\n\n\nsetup(\n name='pre_commit',\n description=(\n 'A framework for managing and maintaining multi-language pre-commit '\n 'hooks.'\n ),\n url='https://github.com/pre-commit/pre-commit',\n version='0.8.2',\n\n author='Anthony Sottile',\n author_email='[email protected]',\n\n platforms='linux',\n classifiers=[\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.6',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n ],\n\n packages=find_packages('.', exclude=('tests*', 'testing*')),\n package_data={\n 'pre_commit': [\n 'resources/hook-tmpl',\n 'resources/pre-push-tmpl',\n 'resources/rbenv.tar.gz',\n 'resources/ruby-build.tar.gz',\n 'resources/ruby-download.tar.gz',\n ]\n },\n install_requires=[\n 'aspy.yaml',\n 'cached-property',\n 'jsonschema',\n 'nodeenv>=0.11.1',\n 'pyterminalsize',\n 'pyyaml',\n 'virtualenv',\n ],\n extras_require={\n ':python_version==\"2.6\"': ['argparse', 'ordereddict'],\n },\n entry_points={\n 'console_scripts': [\n 'pre-commit = pre_commit.main:main',\n 'pre-commit-validate-config = pre_commit.clientlib.validate_config:run', # noqa\n 'pre-commit-validate-manifest = pre_commit.clientlib.validate_manifest:run', # noqa\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import find_packages\nfrom setuptools import setup\n\n\nsetup(\n name='pre_commit',\n description=(\n 'A framework for managing and maintaining multi-language pre-commit '\n 'hooks.'\n ),\n url='https://github.com/pre-commit/pre-commit',\n version='0.8.2',\n\n author='Anthony Sottile',\n author_email='[email protected]',\n\n platforms='linux',\n classifiers=[\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n ],\n\n packages=find_packages('.', exclude=('tests*', 'testing*')),\n package_data={\n 'pre_commit': [\n 'resources/hook-tmpl',\n 'resources/pre-push-tmpl',\n 'resources/rbenv.tar.gz',\n 'resources/ruby-build.tar.gz',\n 'resources/ruby-download.tar.gz',\n ]\n },\n install_requires=[\n 'aspy.yaml',\n 'cached-property',\n 'jsonschema',\n 'nodeenv>=0.11.1',\n 'pyterminalsize',\n 'pyyaml',\n 'virtualenv',\n ],\n entry_points={\n 'console_scripts': [\n 'pre-commit 
= pre_commit.main:main',\n 'pre-commit-validate-config = pre_commit.clientlib.validate_config:run', # noqa\n 'pre-commit-validate-manifest = pre_commit.clientlib.validate_manifest:run', # noqa\n ],\n },\n)\n", "path": "setup.py"}]} | 796 | 174 |
gh_patches_debug_21753 | rasdani/github-patches | git_diff | Flexget__Flexget-1600 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
nyaa changed TLD
hi peeps. it seems they switched the TLD from .eu to .se
i changed my local flexget/plugins/sites/nyaa.py, removed the .pyc and reloaded the daemon, and it's pulling stuff. but i don't have the skills to send a pull request, so i thought i'd do the next best thing and say something
if you don't want to do anything, i guess that's fine too. the old domain is redirecting to the new one
--- END ISSUE ---
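As a side note, the redirect mentioned above can be spot-checked with a minimal sketch like the following; the URLs and the redirect behaviour are assumptions taken from the report, not verified facts.

```python
# Illustrative check of the reported nyaa.eu -> nyaa.se redirect; network behaviour may change.
import requests

resp = requests.head('http://www.nyaa.eu/?page=rss', allow_redirects=True, timeout=10)
print(resp.url)                               # expected to end up on www.nyaa.se
print([r.status_code for r in resp.history])  # intermediate 3xx hops, if any
```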
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flexget/plugins/sites/nyaa.py`
Content:
```
1 from __future__ import unicode_literals, division, absolute_import
2 from builtins import * # noqa pylint: disable=unused-import, redefined-builtin
3 from future.moves.urllib.parse import quote
4
5 import logging
6
7 import feedparser
8
9 from flexget import plugin
10 from flexget.entry import Entry
11 from flexget.event import event
12 from flexget.utils.search import normalize_unicode
13
14 log = logging.getLogger('nyaa')
15
16 # TODO: Other categories
17 CATEGORIES = {'all': '0_0',
18 'anime': '1_0',
19 'anime eng': '1_37',
20 'anime non-eng': '1_38',
21 'anime raw': '1_11'}
22 FILTERS = ['all', 'filter remakes', 'trusted only', 'a+ only']
23
24
25 class UrlRewriteNyaa(object):
26 """Nyaa urlrewriter and search plugin."""
27
28 schema = {
29 'oneOf': [
30 {'type': 'string', 'enum': list(CATEGORIES)},
31 {
32 'type': 'object',
33 'properties': {
34 'category': {'type': 'string', 'enum': list(CATEGORIES)},
35 'filter': {'type': 'string', 'enum': list(FILTERS)}
36 },
37 'additionalProperties': False
38 }
39 ]
40 }
41
42 def search(self, task, entry, config):
43 if not isinstance(config, dict):
44 config = {'category': config}
45 config.setdefault('category', 'anime eng')
46 config.setdefault('filter', 'all')
47 entries = set()
48 for search_string in entry.get('search_strings', [entry['title']]):
49 name = normalize_unicode(search_string)
50 url = 'http://www.nyaa.eu/?page=rss&cats=%s&filter=%s&term=%s' % (
51 CATEGORIES[config['category']], FILTERS.index(config['filter']), quote(name.encode('utf-8')))
52
53 log.debug('requesting: %s' % url)
54 rss = feedparser.parse(url)
55
56 status = rss.get('status', False)
57 if status != 200:
58 log.debug('Search result not 200 (OK), received %s' % status)
59 if status >= 400:
60 continue
61
62 ex = rss.get('bozo_exception', False)
63 if ex:
64 log.error('Got bozo_exception (bad feed) on %s' % url)
65 continue
66
67 for item in rss.entries:
68 entry = Entry()
69 entry['title'] = item.title
70 entry['url'] = item.link
71 # TODO: parse some shit
72 # entry['torrent_seeds'] = int(item.seeds)
73 # entry['torrent_leeches'] = int(item.leechs)
74 # entry['search_sort'] = torrent_availability(entry['torrent_seeds'], entry['torrent_leeches'])
75 # entry['content_size'] = int(item.size) / 1024 / 1024
76
77 entries.add(entry)
78
79 return entries
80
81 def url_rewritable(self, task, entry):
82 return entry['url'].startswith('http://www.nyaa.eu/?page=torrentinfo&tid=')
83
84 def url_rewrite(self, task, entry):
85 entry['url'] = entry['url'].replace('torrentinfo', 'download')
86
87
88 @event('plugin.register')
89 def register_plugin():
90 plugin.register(UrlRewriteNyaa, 'nyaa', groups=['search', 'urlrewriter'], api_ver=2)
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flexget/plugins/sites/nyaa.py b/flexget/plugins/sites/nyaa.py
--- a/flexget/plugins/sites/nyaa.py
+++ b/flexget/plugins/sites/nyaa.py
@@ -47,7 +47,7 @@
entries = set()
for search_string in entry.get('search_strings', [entry['title']]):
name = normalize_unicode(search_string)
- url = 'http://www.nyaa.eu/?page=rss&cats=%s&filter=%s&term=%s' % (
+ url = 'http://www.nyaa.se/?page=rss&cats=%s&filter=%s&term=%s' % (
CATEGORIES[config['category']], FILTERS.index(config['filter']), quote(name.encode('utf-8')))
log.debug('requesting: %s' % url)
@@ -79,7 +79,7 @@
return entries
def url_rewritable(self, task, entry):
- return entry['url'].startswith('http://www.nyaa.eu/?page=torrentinfo&tid=')
+ return entry['url'].startswith('http://www.nyaa.se/?page=torrentinfo&tid=')
def url_rewrite(self, task, entry):
entry['url'] = entry['url'].replace('torrentinfo', 'download')
| {"golden_diff": "diff --git a/flexget/plugins/sites/nyaa.py b/flexget/plugins/sites/nyaa.py\n--- a/flexget/plugins/sites/nyaa.py\n+++ b/flexget/plugins/sites/nyaa.py\n@@ -47,7 +47,7 @@\n entries = set()\n for search_string in entry.get('search_strings', [entry['title']]):\n name = normalize_unicode(search_string)\n- url = 'http://www.nyaa.eu/?page=rss&cats=%s&filter=%s&term=%s' % (\n+ url = 'http://www.nyaa.se/?page=rss&cats=%s&filter=%s&term=%s' % (\n CATEGORIES[config['category']], FILTERS.index(config['filter']), quote(name.encode('utf-8')))\n \n log.debug('requesting: %s' % url)\n@@ -79,7 +79,7 @@\n return entries\n \n def url_rewritable(self, task, entry):\n- return entry['url'].startswith('http://www.nyaa.eu/?page=torrentinfo&tid=')\n+ return entry['url'].startswith('http://www.nyaa.se/?page=torrentinfo&tid=')\n \n def url_rewrite(self, task, entry):\n entry['url'] = entry['url'].replace('torrentinfo', 'download')\n", "issue": "nyaa changed TLD\nhi peeps. it seems they switched TLD from .eu to .se\r\n\r\ni changed my local flexget/plugins/sites/nyaa.py, removed the pyc & reloaded the daemon. its pulling stuff. but i aint got the skills to send a pull request, so i thought i'd do the next best thing and say something\r\n\r\nif you don't want to do anything, i guess thats fine too. the old is redirecting to the new\n", "before_files": [{"content": "from __future__ import unicode_literals, division, absolute_import\nfrom builtins import * # noqa pylint: disable=unused-import, redefined-builtin\nfrom future.moves.urllib.parse import quote\n\nimport logging\n\nimport feedparser\n\nfrom flexget import plugin\nfrom flexget.entry import Entry\nfrom flexget.event import event\nfrom flexget.utils.search import normalize_unicode\n\nlog = logging.getLogger('nyaa')\n\n# TODO: Other categories\nCATEGORIES = {'all': '0_0',\n 'anime': '1_0',\n 'anime eng': '1_37',\n 'anime non-eng': '1_38',\n 'anime raw': '1_11'}\nFILTERS = ['all', 'filter remakes', 'trusted only', 'a+ only']\n\n\nclass UrlRewriteNyaa(object):\n \"\"\"Nyaa urlrewriter and search plugin.\"\"\"\n\n schema = {\n 'oneOf': [\n {'type': 'string', 'enum': list(CATEGORIES)},\n {\n 'type': 'object',\n 'properties': {\n 'category': {'type': 'string', 'enum': list(CATEGORIES)},\n 'filter': {'type': 'string', 'enum': list(FILTERS)}\n },\n 'additionalProperties': False\n }\n ]\n }\n\n def search(self, task, entry, config):\n if not isinstance(config, dict):\n config = {'category': config}\n config.setdefault('category', 'anime eng')\n config.setdefault('filter', 'all')\n entries = set()\n for search_string in entry.get('search_strings', [entry['title']]):\n name = normalize_unicode(search_string)\n url = 'http://www.nyaa.eu/?page=rss&cats=%s&filter=%s&term=%s' % (\n CATEGORIES[config['category']], FILTERS.index(config['filter']), quote(name.encode('utf-8')))\n\n log.debug('requesting: %s' % url)\n rss = feedparser.parse(url)\n\n status = rss.get('status', False)\n if status != 200:\n log.debug('Search result not 200 (OK), received %s' % status)\n if status >= 400:\n continue\n\n ex = rss.get('bozo_exception', False)\n if ex:\n log.error('Got bozo_exception (bad feed) on %s' % url)\n continue\n\n for item in rss.entries:\n entry = Entry()\n entry['title'] = item.title\n entry['url'] = item.link\n # TODO: parse some shit\n # entry['torrent_seeds'] = int(item.seeds)\n # entry['torrent_leeches'] = int(item.leechs)\n # entry['search_sort'] = torrent_availability(entry['torrent_seeds'], entry['torrent_leeches'])\n # entry['content_size'] = 
int(item.size) / 1024 / 1024\n\n entries.add(entry)\n\n return entries\n\n def url_rewritable(self, task, entry):\n return entry['url'].startswith('http://www.nyaa.eu/?page=torrentinfo&tid=')\n\n def url_rewrite(self, task, entry):\n entry['url'] = entry['url'].replace('torrentinfo', 'download')\n\n\n@event('plugin.register')\ndef register_plugin():\n plugin.register(UrlRewriteNyaa, 'nyaa', groups=['search', 'urlrewriter'], api_ver=2)\n", "path": "flexget/plugins/sites/nyaa.py"}], "after_files": [{"content": "from __future__ import unicode_literals, division, absolute_import\nfrom builtins import * # noqa pylint: disable=unused-import, redefined-builtin\nfrom future.moves.urllib.parse import quote\n\nimport logging\n\nimport feedparser\n\nfrom flexget import plugin\nfrom flexget.entry import Entry\nfrom flexget.event import event\nfrom flexget.utils.search import normalize_unicode\n\nlog = logging.getLogger('nyaa')\n\n# TODO: Other categories\nCATEGORIES = {'all': '0_0',\n 'anime': '1_0',\n 'anime eng': '1_37',\n 'anime non-eng': '1_38',\n 'anime raw': '1_11'}\nFILTERS = ['all', 'filter remakes', 'trusted only', 'a+ only']\n\n\nclass UrlRewriteNyaa(object):\n \"\"\"Nyaa urlrewriter and search plugin.\"\"\"\n\n schema = {\n 'oneOf': [\n {'type': 'string', 'enum': list(CATEGORIES)},\n {\n 'type': 'object',\n 'properties': {\n 'category': {'type': 'string', 'enum': list(CATEGORIES)},\n 'filter': {'type': 'string', 'enum': list(FILTERS)}\n },\n 'additionalProperties': False\n }\n ]\n }\n\n def search(self, task, entry, config):\n if not isinstance(config, dict):\n config = {'category': config}\n config.setdefault('category', 'anime eng')\n config.setdefault('filter', 'all')\n entries = set()\n for search_string in entry.get('search_strings', [entry['title']]):\n name = normalize_unicode(search_string)\n url = 'http://www.nyaa.se/?page=rss&cats=%s&filter=%s&term=%s' % (\n CATEGORIES[config['category']], FILTERS.index(config['filter']), quote(name.encode('utf-8')))\n\n log.debug('requesting: %s' % url)\n rss = feedparser.parse(url)\n\n status = rss.get('status', False)\n if status != 200:\n log.debug('Search result not 200 (OK), received %s' % status)\n if status >= 400:\n continue\n\n ex = rss.get('bozo_exception', False)\n if ex:\n log.error('Got bozo_exception (bad feed) on %s' % url)\n continue\n\n for item in rss.entries:\n entry = Entry()\n entry['title'] = item.title\n entry['url'] = item.link\n # TODO: parse some shit\n # entry['torrent_seeds'] = int(item.seeds)\n # entry['torrent_leeches'] = int(item.leechs)\n # entry['search_sort'] = torrent_availability(entry['torrent_seeds'], entry['torrent_leeches'])\n # entry['content_size'] = int(item.size) / 1024 / 1024\n\n entries.add(entry)\n\n return entries\n\n def url_rewritable(self, task, entry):\n return entry['url'].startswith('http://www.nyaa.se/?page=torrentinfo&tid=')\n\n def url_rewrite(self, task, entry):\n entry['url'] = entry['url'].replace('torrentinfo', 'download')\n\n\n@event('plugin.register')\ndef register_plugin():\n plugin.register(UrlRewriteNyaa, 'nyaa', groups=['search', 'urlrewriter'], api_ver=2)\n", "path": "flexget/plugins/sites/nyaa.py"}]} | 1,294 | 292 |
gh_patches_debug_25182 | rasdani/github-patches | git_diff | mars-project__mars-1155 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] Execute groupby failed
**Describe the bug**
`df.groupby().execute()` may fail.
**To Reproduce**
To help us reproduce this bug, please provide the information below:
1. Your Python version
2. The version of Mars you use
3. Versions of crucial packages, such as numpy, scipy and protobuf
4. Full stack of the error.
5. Minimized code to reproduce the error.
```
In [1]: import pandas as pd; import numpy as np
In [2]: df = pd.DataFrame(np.random.rand(4, 3), index=np.arange(5, 1, -1))
In [4]: import mars.dataframe as md
In [5]: mdf = md.DataFrame(df)
In [6]: mdf.groupby(0).execute()
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-6-491b51043e08> in <module>
----> 1 mdf.groupby(0).execute()
~/Workspace/mars/mars/core.py in execute(self, session, **kw)
426 if session is None:
427 session = Session.default_or_local()
--> 428 return session.run(self, **kw)
429
430 def fetch(self, session=None, **kw):
~/Workspace/mars/mars/session.py in run(self, *tileables, **kw)
181 tileables = tuple(mt.tensor(t) if not isinstance(t, (Entity, Base)) else t
182 for t in tileables)
--> 183 result = self._sess.run(*tileables, **kw)
184
185 for t in tileables:
~/Workspace/mars/mars/session.py in run(self, *tileables, **kw)
88 # set number of running cores
89 self.context.set_ncores(kw['n_parallel'])
---> 90 res = self._executor.execute_tileables(tileables, **kw)
91 return res
92
~/Workspace/mars/mars/utils.py in _wrapped(*args, **kwargs)
380 _kernel_mode.eager = False
381 _kernel_mode.eager_count = enter_eager_count + 1
--> 382 return func(*args, **kwargs)
383 finally:
384 _kernel_mode.eager_count -= 1
~/Workspace/mars/mars/utils.py in inner(*args, **kwargs)
468 def inner(*args, **kwargs):
469 with build_mode():
--> 470 return func(*args, **kwargs)
471 return inner
472
~/Workspace/mars/mars/executor.py in execute_tileables(self, tileables, fetch, n_parallel, n_thread, print_progress, mock, compose)
828 # update shape of tileable and its chunks whatever it's successful or not
829 self._update_tileable_and_chunk_shape(
--> 830 tileable_graph, chunk_result, chunk_graph_builder.interrupted_ops)
831 if chunk_graph_builder.done:
832 if len(intermediate_result_keys) > 0:
~/Workspace/mars/mars/executor.py in _update_tileable_and_chunk_shape(self, tileable_graph, chunk_result, failed_ops)
726 continue
727 for c in tiled_n.chunks:
--> 728 c.data._shape = chunk_result[c.key].shape
729 new_nsplits = self.get_tileable_nsplits(n, chunk_result=chunk_result)
730 for node in (n, tiled_n):
~/Workspace/mars/mars/lib/groupby_wrapper.py in __getattr__(self, item)
74 if item in getattr(self.obj, 'columns', ()):
75 return self.__getitem__(item)
---> 76 return getattr(self.groupby_obj, item)
77
78 def __iter__(self):
~/miniconda3/lib/python3.7/site-packages/pandas/core/groupby/groupby.py in __getattr__(self, attr)
578
579 raise AttributeError(
--> 580 f"'{type(self).__name__}' object has no attribute '{attr}'"
581 )
582
AttributeError: 'DataFrameGroupBy' object has no attribute 'shape'
```
--- END ISSUE ---
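As a side note, the root cause in the traceback can be reproduced with pandas alone: `DataFrameGroupBy` exposes no `shape` attribute, so the wrapper's `__getattr__` delegation raises AttributeError when the executor asks a chunk for its shape. The sketch below is plain pandas, not Mars code.

```python
# Minimal pandas-only illustration of the failing attribute lookup.
import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.rand(4, 3), index=np.arange(5, 1, -1))
gb = df.groupby(0)
print(hasattr(gb, 'shape'))  # False: no 'shape' on DataFrameGroupBy, hence the AttributeError
```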
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mars/lib/groupby_wrapper.py`
Content:
```
1 # Copyright 1999-2020 Alibaba Group Holding Ltd.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from collections.abc import Iterable
16
17 import cloudpickle
18 from pandas.core.groupby import DataFrameGroupBy, SeriesGroupBy
19
20
21 class GroupByWrapper:
22 def __init__(self, obj, groupby_obj=None, keys=None, axis=0, level=None, grouper=None,
23 exclusions=None, selection=None, as_index=True, sort=True,
24 group_keys=True, squeeze=False, observed=False, mutated=False,
25 grouper_cache=None):
26
27 def fill_value(v, key):
28 return v if v is not None or groupby_obj is None else getattr(groupby_obj, key)
29
30 self.obj = obj
31 self.keys = fill_value(keys, 'keys')
32 self.axis = fill_value(axis, 'axis')
33 self.level = fill_value(level, 'level')
34 self.exclusions = fill_value(exclusions, 'exclusions')
35 self.selection = selection
36 self.as_index = fill_value(as_index, 'as_index')
37 self.sort = fill_value(sort, 'sort')
38 self.group_keys = fill_value(group_keys, 'group_keys')
39 self.squeeze = fill_value(squeeze, 'squeeze')
40 self.observed = fill_value(observed, 'observed')
41 self.mutated = fill_value(mutated, 'mutated')
42
43 if groupby_obj is None:
44 if obj.ndim == 2:
45 self.groupby_obj = DataFrameGroupBy(
46 obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,
47 as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,
48 mutated=mutated)
49 else:
50 self.groupby_obj = SeriesGroupBy(
51 obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,
52 as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,
53 mutated=mutated)
54 else:
55 self.groupby_obj = groupby_obj
56
57 self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)
58
59 if grouper_cache:
60 self.groupby_obj.grouper._cache = grouper_cache
61 if selection:
62 self.groupby_obj = self.groupby_obj[selection]
63
64 def __getitem__(self, item):
65 return GroupByWrapper(
66 self.obj, keys=self.keys, axis=self.axis, level=self.level,
67 grouper=self.groupby_obj.grouper, exclusions=self.exclusions, selection=item,
68 as_index=self.as_index, sort=self.sort, group_keys=self.group_keys,
69 squeeze=self.squeeze, observed=self.observed, mutated=self.mutated)
70
71 def __getattr__(self, item):
72 if item.startswith('_'): # pragma: no cover
73 return object.__getattribute__(self, item)
74 if item in getattr(self.obj, 'columns', ()):
75 return self.__getitem__(item)
76 return getattr(self.groupby_obj, item)
77
78 def __iter__(self):
79 return self.groupby_obj.__iter__()
80
81 @property
82 def empty(self):
83 return self.obj.empty
84
85 def to_tuple(self, truncate=False, pickle_function=False):
86 if self.selection and truncate:
87 if isinstance(self.selection, Iterable) and not isinstance(self.selection, str):
88 item_list = list(self.selection)
89 else:
90 item_list = [self.selection]
91 item_set = set(item_list)
92
93 if isinstance(self.keys, list):
94 sel_keys = self.keys
95 elif self.keys in self.obj.columns:
96 sel_keys = [self.keys]
97 else:
98 sel_keys = []
99
100 all_items = item_list + [k for k in sel_keys or () if k not in item_set]
101 if set(all_items) == set(self.obj.columns):
102 obj = self.obj
103 else:
104 obj = self.obj[all_items]
105 else:
106 obj = self.obj
107
108 if pickle_function and callable(self.keys):
109 keys = cloudpickle.dumps(self.keys)
110 else:
111 keys = self.keys
112
113 return obj, keys, self.axis, self.level, self.exclusions, self.selection, \
114 self.as_index, self.sort, self.group_keys, self.squeeze, self.observed, \
115 self.mutated, getattr(self.groupby_obj.grouper, '_cache', dict())
116
117 @classmethod
118 def from_tuple(cls, tp):
119 obj, keys, axis, level, exclusions, selection, as_index, sort, group_keys, squeeze, \
120 observed, mutated, grouper_cache = tp
121
122 if isinstance(keys, (bytes, bytearray)):
123 keys = cloudpickle.loads(keys)
124
125 return cls(obj, keys=keys, axis=axis, level=level, exclusions=exclusions, selection=selection,
126 as_index=as_index, sort=sort, group_keys=group_keys, squeeze=squeeze, observed=observed,
127 mutated=mutated, grouper_cache=grouper_cache)
128
129
130 def wrapped_groupby(obj, by=None, axis=0, level=None, as_index=True, sort=True, group_keys=True,
131 squeeze=False, observed=False):
132 groupby_obj = obj.groupby(by=by, axis=axis, level=level, as_index=as_index, sort=sort,
133 group_keys=group_keys, squeeze=squeeze, observed=observed)
134 return GroupByWrapper(obj, groupby_obj=groupby_obj)
135
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mars/lib/groupby_wrapper.py b/mars/lib/groupby_wrapper.py
--- a/mars/lib/groupby_wrapper.py
+++ b/mars/lib/groupby_wrapper.py
@@ -54,13 +54,13 @@
else:
self.groupby_obj = groupby_obj
- self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)
-
if grouper_cache:
self.groupby_obj.grouper._cache = grouper_cache
if selection:
self.groupby_obj = self.groupby_obj[selection]
+ self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)
+
def __getitem__(self, item):
return GroupByWrapper(
self.obj, keys=self.keys, axis=self.axis, level=self.level,
@@ -82,6 +82,13 @@
def empty(self):
return self.obj.empty
+ @property
+ def shape(self):
+ shape = list(self.groupby_obj.obj.shape)
+ if self.is_frame and self.selection:
+ shape[1] = len(self.selection)
+ return tuple(shape)
+
def to_tuple(self, truncate=False, pickle_function=False):
if self.selection and truncate:
if isinstance(self.selection, Iterable) and not isinstance(self.selection, str):
| {"golden_diff": "diff --git a/mars/lib/groupby_wrapper.py b/mars/lib/groupby_wrapper.py\n--- a/mars/lib/groupby_wrapper.py\n+++ b/mars/lib/groupby_wrapper.py\n@@ -54,13 +54,13 @@\n else:\n self.groupby_obj = groupby_obj\n \n- self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)\n-\n if grouper_cache:\n self.groupby_obj.grouper._cache = grouper_cache\n if selection:\n self.groupby_obj = self.groupby_obj[selection]\n \n+ self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)\n+\n def __getitem__(self, item):\n return GroupByWrapper(\n self.obj, keys=self.keys, axis=self.axis, level=self.level,\n@@ -82,6 +82,13 @@\n def empty(self):\n return self.obj.empty\n \n+ @property\n+ def shape(self):\n+ shape = list(self.groupby_obj.obj.shape)\n+ if self.is_frame and self.selection:\n+ shape[1] = len(self.selection)\n+ return tuple(shape)\n+\n def to_tuple(self, truncate=False, pickle_function=False):\n if self.selection and truncate:\n if isinstance(self.selection, Iterable) and not isinstance(self.selection, str):\n", "issue": "[BUG] Execute groupby failed\n<!--\r\nThank you for your contribution!\r\n\r\nPlease review https://github.com/mars-project/mars/blob/master/CONTRIBUTING.rst before opening an issue.\r\n-->\r\n\r\n**Describe the bug**\r\n\r\n`df.groupby().execute()` may fail.\r\n\r\n**To Reproduce**\r\nTo help us reproducing this bug, please provide information below:\r\n1. Your Python version\r\n2. The version of Mars you use\r\n3. Versions of crucial packages, such as numpy, scipy and protobuf\r\n4. Full stack of the error.\r\n5. Minimized code to reproduce the error.\r\n\r\n```\r\nIn [1]: import pandas as pd; import numpy as np \r\n\r\nIn [2]: df = pd.DataFrame(np.random.rand(4, 3), index=np.arange(5, 1, -1)) \r\n\r\nIn [4]: import mars.dataframe as md \r\n\r\nIn [5]: mdf = md.DataFrame(df) \r\n\r\nIn [6]: mdf.groupby(0).execute() \r\n---------------------------------------------------------------------------\r\nAttributeError Traceback (most recent call last)\r\n<ipython-input-6-491b51043e08> in <module>\r\n----> 1 mdf.groupby(0).execute()\r\n\r\n~/Workspace/mars/mars/core.py in execute(self, session, **kw)\r\n 426 if session is None:\r\n 427 session = Session.default_or_local()\r\n--> 428 return session.run(self, **kw)\r\n 429 \r\n 430 def fetch(self, session=None, **kw):\r\n\r\n~/Workspace/mars/mars/session.py in run(self, *tileables, **kw)\r\n 181 tileables = tuple(mt.tensor(t) if not isinstance(t, (Entity, Base)) else t\r\n 182 for t in tileables)\r\n--> 183 result = self._sess.run(*tileables, **kw)\r\n 184 \r\n 185 for t in tileables:\r\n\r\n~/Workspace/mars/mars/session.py in run(self, *tileables, **kw)\r\n 88 # set number of running cores\r\n 89 self.context.set_ncores(kw['n_parallel'])\r\n---> 90 res = self._executor.execute_tileables(tileables, **kw)\r\n 91 return res\r\n 92 \r\n\r\n~/Workspace/mars/mars/utils.py in _wrapped(*args, **kwargs)\r\n 380 _kernel_mode.eager = False\r\n 381 _kernel_mode.eager_count = enter_eager_count + 1\r\n--> 382 return func(*args, **kwargs)\r\n 383 finally:\r\n 384 _kernel_mode.eager_count -= 1\r\n\r\n~/Workspace/mars/mars/utils.py in inner(*args, **kwargs)\r\n 468 def inner(*args, **kwargs):\r\n 469 with build_mode():\r\n--> 470 return func(*args, **kwargs)\r\n 471 return inner\r\n 472 \r\n\r\n~/Workspace/mars/mars/executor.py in execute_tileables(self, tileables, fetch, n_parallel, n_thread, print_progress, mock, compose)\r\n 828 # update shape of tileable and its chunks whatever it's successful or not\r\n 829 
self._update_tileable_and_chunk_shape(\r\n--> 830 tileable_graph, chunk_result, chunk_graph_builder.interrupted_ops)\r\n 831 if chunk_graph_builder.done:\r\n 832 if len(intermediate_result_keys) > 0:\r\n\r\n~/Workspace/mars/mars/executor.py in _update_tileable_and_chunk_shape(self, tileable_graph, chunk_result, failed_ops)\r\n 726 continue\r\n 727 for c in tiled_n.chunks:\r\n--> 728 c.data._shape = chunk_result[c.key].shape\r\n 729 new_nsplits = self.get_tileable_nsplits(n, chunk_result=chunk_result)\r\n 730 for node in (n, tiled_n):\r\n\r\n~/Workspace/mars/mars/lib/groupby_wrapper.py in __getattr__(self, item)\r\n 74 if item in getattr(self.obj, 'columns', ()):\r\n 75 return self.__getitem__(item)\r\n---> 76 return getattr(self.groupby_obj, item)\r\n 77 \r\n 78 def __iter__(self):\r\n\r\n~/miniconda3/lib/python3.7/site-packages/pandas/core/groupby/groupby.py in __getattr__(self, attr)\r\n 578 \r\n 579 raise AttributeError(\r\n--> 580 f\"'{type(self).__name__}' object has no attribute '{attr}'\"\r\n 581 )\r\n 582 \r\n\r\nAttributeError: 'DataFrameGroupBy' object has no attribute 'shape'\r\n```\r\n\r\n\n", "before_files": [{"content": "# Copyright 1999-2020 Alibaba Group Holding Ltd.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom collections.abc import Iterable\n\nimport cloudpickle\nfrom pandas.core.groupby import DataFrameGroupBy, SeriesGroupBy\n\n\nclass GroupByWrapper:\n def __init__(self, obj, groupby_obj=None, keys=None, axis=0, level=None, grouper=None,\n exclusions=None, selection=None, as_index=True, sort=True,\n group_keys=True, squeeze=False, observed=False, mutated=False,\n grouper_cache=None):\n\n def fill_value(v, key):\n return v if v is not None or groupby_obj is None else getattr(groupby_obj, key)\n\n self.obj = obj\n self.keys = fill_value(keys, 'keys')\n self.axis = fill_value(axis, 'axis')\n self.level = fill_value(level, 'level')\n self.exclusions = fill_value(exclusions, 'exclusions')\n self.selection = selection\n self.as_index = fill_value(as_index, 'as_index')\n self.sort = fill_value(sort, 'sort')\n self.group_keys = fill_value(group_keys, 'group_keys')\n self.squeeze = fill_value(squeeze, 'squeeze')\n self.observed = fill_value(observed, 'observed')\n self.mutated = fill_value(mutated, 'mutated')\n\n if groupby_obj is None:\n if obj.ndim == 2:\n self.groupby_obj = DataFrameGroupBy(\n obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,\n as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated)\n else:\n self.groupby_obj = SeriesGroupBy(\n obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,\n as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated)\n else:\n self.groupby_obj = groupby_obj\n\n self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)\n\n if grouper_cache:\n self.groupby_obj.grouper._cache = grouper_cache\n if selection:\n self.groupby_obj = self.groupby_obj[selection]\n\n def __getitem__(self, item):\n 
return GroupByWrapper(\n self.obj, keys=self.keys, axis=self.axis, level=self.level,\n grouper=self.groupby_obj.grouper, exclusions=self.exclusions, selection=item,\n as_index=self.as_index, sort=self.sort, group_keys=self.group_keys,\n squeeze=self.squeeze, observed=self.observed, mutated=self.mutated)\n\n def __getattr__(self, item):\n if item.startswith('_'): # pragma: no cover\n return object.__getattribute__(self, item)\n if item in getattr(self.obj, 'columns', ()):\n return self.__getitem__(item)\n return getattr(self.groupby_obj, item)\n\n def __iter__(self):\n return self.groupby_obj.__iter__()\n\n @property\n def empty(self):\n return self.obj.empty\n\n def to_tuple(self, truncate=False, pickle_function=False):\n if self.selection and truncate:\n if isinstance(self.selection, Iterable) and not isinstance(self.selection, str):\n item_list = list(self.selection)\n else:\n item_list = [self.selection]\n item_set = set(item_list)\n\n if isinstance(self.keys, list):\n sel_keys = self.keys\n elif self.keys in self.obj.columns:\n sel_keys = [self.keys]\n else:\n sel_keys = []\n\n all_items = item_list + [k for k in sel_keys or () if k not in item_set]\n if set(all_items) == set(self.obj.columns):\n obj = self.obj\n else:\n obj = self.obj[all_items]\n else:\n obj = self.obj\n\n if pickle_function and callable(self.keys):\n keys = cloudpickle.dumps(self.keys)\n else:\n keys = self.keys\n\n return obj, keys, self.axis, self.level, self.exclusions, self.selection, \\\n self.as_index, self.sort, self.group_keys, self.squeeze, self.observed, \\\n self.mutated, getattr(self.groupby_obj.grouper, '_cache', dict())\n\n @classmethod\n def from_tuple(cls, tp):\n obj, keys, axis, level, exclusions, selection, as_index, sort, group_keys, squeeze, \\\n observed, mutated, grouper_cache = tp\n\n if isinstance(keys, (bytes, bytearray)):\n keys = cloudpickle.loads(keys)\n\n return cls(obj, keys=keys, axis=axis, level=level, exclusions=exclusions, selection=selection,\n as_index=as_index, sort=sort, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated, grouper_cache=grouper_cache)\n\n\ndef wrapped_groupby(obj, by=None, axis=0, level=None, as_index=True, sort=True, group_keys=True,\n squeeze=False, observed=False):\n groupby_obj = obj.groupby(by=by, axis=axis, level=level, as_index=as_index, sort=sort,\n group_keys=group_keys, squeeze=squeeze, observed=observed)\n return GroupByWrapper(obj, groupby_obj=groupby_obj)\n", "path": "mars/lib/groupby_wrapper.py"}], "after_files": [{"content": "# Copyright 1999-2020 Alibaba Group Holding Ltd.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom collections.abc import Iterable\n\nimport cloudpickle\nfrom pandas.core.groupby import DataFrameGroupBy, SeriesGroupBy\n\n\nclass GroupByWrapper:\n def __init__(self, obj, groupby_obj=None, keys=None, axis=0, level=None, grouper=None,\n exclusions=None, selection=None, as_index=True, sort=True,\n group_keys=True, squeeze=False, observed=False, mutated=False,\n grouper_cache=None):\n\n def 
fill_value(v, key):\n return v if v is not None or groupby_obj is None else getattr(groupby_obj, key)\n\n self.obj = obj\n self.keys = fill_value(keys, 'keys')\n self.axis = fill_value(axis, 'axis')\n self.level = fill_value(level, 'level')\n self.exclusions = fill_value(exclusions, 'exclusions')\n self.selection = selection\n self.as_index = fill_value(as_index, 'as_index')\n self.sort = fill_value(sort, 'sort')\n self.group_keys = fill_value(group_keys, 'group_keys')\n self.squeeze = fill_value(squeeze, 'squeeze')\n self.observed = fill_value(observed, 'observed')\n self.mutated = fill_value(mutated, 'mutated')\n\n if groupby_obj is None:\n if obj.ndim == 2:\n self.groupby_obj = DataFrameGroupBy(\n obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,\n as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated)\n else:\n self.groupby_obj = SeriesGroupBy(\n obj, keys=keys, axis=axis, level=level, grouper=grouper, exclusions=exclusions,\n as_index=as_index, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated)\n else:\n self.groupby_obj = groupby_obj\n\n if grouper_cache:\n self.groupby_obj.grouper._cache = grouper_cache\n if selection:\n self.groupby_obj = self.groupby_obj[selection]\n\n self.is_frame = isinstance(self.groupby_obj, DataFrameGroupBy)\n\n def __getitem__(self, item):\n return GroupByWrapper(\n self.obj, keys=self.keys, axis=self.axis, level=self.level,\n grouper=self.groupby_obj.grouper, exclusions=self.exclusions, selection=item,\n as_index=self.as_index, sort=self.sort, group_keys=self.group_keys,\n squeeze=self.squeeze, observed=self.observed, mutated=self.mutated)\n\n def __getattr__(self, item):\n if item.startswith('_'): # pragma: no cover\n return object.__getattribute__(self, item)\n if item in getattr(self.obj, 'columns', ()):\n return self.__getitem__(item)\n return getattr(self.groupby_obj, item)\n\n def __iter__(self):\n return self.groupby_obj.__iter__()\n\n @property\n def empty(self):\n return self.obj.empty\n\n @property\n def shape(self):\n shape = list(self.groupby_obj.obj.shape)\n if self.is_frame and self.selection:\n shape[1] = len(self.selection)\n return tuple(shape)\n\n def to_tuple(self, truncate=False, pickle_function=False):\n if self.selection and truncate:\n if isinstance(self.selection, Iterable) and not isinstance(self.selection, str):\n item_list = list(self.selection)\n else:\n item_list = [self.selection]\n item_set = set(item_list)\n\n if isinstance(self.keys, list):\n sel_keys = self.keys\n elif self.keys in self.obj.columns:\n sel_keys = [self.keys]\n else:\n sel_keys = []\n\n all_items = item_list + [k for k in sel_keys or () if k not in item_set]\n if set(all_items) == set(self.obj.columns):\n obj = self.obj\n else:\n obj = self.obj[all_items]\n else:\n obj = self.obj\n\n if pickle_function and callable(self.keys):\n keys = cloudpickle.dumps(self.keys)\n else:\n keys = self.keys\n\n return obj, keys, self.axis, self.level, self.exclusions, self.selection, \\\n self.as_index, self.sort, self.group_keys, self.squeeze, self.observed, \\\n self.mutated, getattr(self.groupby_obj.grouper, '_cache', dict())\n\n @classmethod\n def from_tuple(cls, tp):\n obj, keys, axis, level, exclusions, selection, as_index, sort, group_keys, squeeze, \\\n observed, mutated, grouper_cache = tp\n\n if isinstance(keys, (bytes, bytearray)):\n keys = cloudpickle.loads(keys)\n\n return cls(obj, keys=keys, axis=axis, level=level, exclusions=exclusions, selection=selection,\n 
as_index=as_index, sort=sort, group_keys=group_keys, squeeze=squeeze, observed=observed,\n mutated=mutated, grouper_cache=grouper_cache)\n\n\ndef wrapped_groupby(obj, by=None, axis=0, level=None, as_index=True, sort=True, group_keys=True,\n squeeze=False, observed=False):\n groupby_obj = obj.groupby(by=by, axis=axis, level=level, as_index=as_index, sort=sort,\n group_keys=group_keys, squeeze=squeeze, observed=observed)\n return GroupByWrapper(obj, groupby_obj=groupby_obj)\n", "path": "mars/lib/groupby_wrapper.py"}]} | 2,915 | 275 |
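The patch in the row above resolves `AttributeError: 'DataFrameGroupBy' object has no attribute 'shape'` by computing the shape from the wrapped object instead of delegating to pandas. A stripped-down sketch of that idea (illustrative class name only, not the actual Mars code):

```python
import pandas as pd

class ShapedGroupByWrapper:
    """Minimal stand-in showing the effect of the added shape property."""

    def __init__(self, obj, selection=None):
        self.obj = obj                  # the original DataFrame/Series
        self.selection = selection      # optional column selection
        self.is_frame = isinstance(obj, pd.DataFrame)

    @property
    def shape(self):
        # derive the shape from the wrapped object; a selection narrows columns
        shape = list(self.obj.shape)
        if self.is_frame and self.selection:
            shape[1] = len(self.selection)
        return tuple(shape)

df = pd.DataFrame({'a': [1, 2], 'b': [3, 4], 'c': [5, 6]})
print(ShapedGroupByWrapper(df, selection=['b']).shape)  # (2, 1)
```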
gh_patches_debug_31922 | rasdani/github-patches | git_diff | scrapy__scrapy-879 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enabled extensions, middlewares, pipelines
I found that this information isn't being printed anymore.
The responsible of this bug is [this line](https://github.com/scrapy/scrapy/pull/816/files#diff-fee03a44ad4de98d9361d89947c8aba3R83), seems that `spider` is `None` on `eventDict` at the moment the components are instantiated.
I'm not sure how to fix it because I'm not quite sure what it is attempting to block.
/cc @curita
--- END ISSUE ---
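To make the failure mode concrete, here is a minimal sketch of the kind of spider-based filtering the linked line performs (illustrative only — the real observer created by `log.start_from_crawler` differs in detail):

```python
# Illustrative sketch, not the actual Scrapy observer. Events are filtered on
# the 'spider' key of eventDict, so anything logged before crawler.spider is
# set -- e.g. the "Enabled extensions/middlewares/pipelines" messages emitted
# while components are instantiated -- never reaches the output.
def make_observer(crawler):
    def emit(event_dict):
        spider = event_dict.get('spider')
        if spider is None or spider is not crawler.spider:
            return  # dropped: this is where the startup messages disappear
        print(event_dict.get('message'))
    return emit

class _FakeCrawler:
    spider = None  # extensions/middlewares are created before any spider exists

observer = make_observer(_FakeCrawler())
observer({'message': 'Enabled extensions: ...', 'spider': None})  # prints nothing
```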
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/crawler.py`
Content:
```
1 import six
2 import signal
3 import warnings
4
5 from twisted.internet import reactor, defer
6
7 from scrapy.core.engine import ExecutionEngine
8 from scrapy.resolver import CachingThreadedResolver
9 from scrapy.extension import ExtensionManager
10 from scrapy.signalmanager import SignalManager
11 from scrapy.exceptions import ScrapyDeprecationWarning
12 from scrapy.utils.ossignal import install_shutdown_handlers, signal_names
13 from scrapy.utils.misc import load_object
14 from scrapy import log, signals
15
16
17 class Crawler(object):
18
19 def __init__(self, spidercls, settings):
20 self.spidercls = spidercls
21 self.settings = settings
22 self.signals = SignalManager(self)
23 self.stats = load_object(self.settings['STATS_CLASS'])(self)
24 lf_cls = load_object(self.settings['LOG_FORMATTER'])
25 self.logformatter = lf_cls.from_crawler(self)
26 self.extensions = ExtensionManager.from_crawler(self)
27
28 self.crawling = False
29 self.spider = None
30 self.engine = None
31
32 @property
33 def spiders(self):
34 if not hasattr(self, '_spiders'):
35 warnings.warn("Crawler.spiders is deprecated, use "
36 "CrawlerRunner.spiders or instantiate "
37 "scrapy.spidermanager.SpiderManager with your "
38 "settings.",
39 category=ScrapyDeprecationWarning, stacklevel=2)
40 spman_cls = load_object(self.settings['SPIDER_MANAGER_CLASS'])
41 self._spiders = spman_cls.from_settings(self.settings)
42 return self._spiders
43
44 @defer.inlineCallbacks
45 def crawl(self, *args, **kwargs):
46 assert not self.crawling, "Crawling already taking place"
47 self.crawling = True
48
49 try:
50 self.spider = self._create_spider(*args, **kwargs)
51 self.engine = self._create_engine()
52 start_requests = iter(self.spider.start_requests())
53 yield self.engine.open_spider(self.spider, start_requests)
54 yield defer.maybeDeferred(self.engine.start)
55 except Exception:
56 self.crawling = False
57 raise
58
59 def _create_spider(self, *args, **kwargs):
60 return self.spidercls.from_crawler(self, *args, **kwargs)
61
62 def _create_engine(self):
63 return ExecutionEngine(self, lambda _: self.stop())
64
65 @defer.inlineCallbacks
66 def stop(self):
67 if self.crawling:
68 self.crawling = False
69 yield defer.maybeDeferred(self.engine.stop)
70
71
72 class CrawlerRunner(object):
73
74 def __init__(self, settings):
75 self.settings = settings
76 smcls = load_object(settings['SPIDER_MANAGER_CLASS'])
77 self.spiders = smcls.from_settings(settings.frozencopy())
78 self.crawlers = set()
79 self.crawl_deferreds = set()
80
81 def crawl(self, spidercls, *args, **kwargs):
82 crawler = self._create_logged_crawler(spidercls)
83 self.crawlers.add(crawler)
84
85 d = crawler.crawl(*args, **kwargs)
86 self.crawl_deferreds.add(d)
87 return d
88
89 def _create_logged_crawler(self, spidercls):
90 crawler = self._create_crawler(spidercls)
91 log_observer = log.start_from_crawler(crawler)
92 if log_observer:
93 crawler.signals.connect(log_observer.stop, signals.engine_stopped)
94 return crawler
95
96 def _create_crawler(self, spidercls):
97 if isinstance(spidercls, six.string_types):
98 spidercls = self.spiders.load(spidercls)
99
100 crawler_settings = self.settings.copy()
101 spidercls.update_settings(crawler_settings)
102 crawler_settings.freeze()
103
104 crawler = Crawler(spidercls, crawler_settings)
105 return crawler
106
107 def stop(self):
108 return defer.DeferredList(c.stop() for c in self.crawlers)
109
110
111 class CrawlerProcess(CrawlerRunner):
112 """A class to run multiple scrapy crawlers in a process simultaneously"""
113
114 def __init__(self, settings):
115 super(CrawlerProcess, self).__init__(settings)
116 install_shutdown_handlers(self._signal_shutdown)
117 self.stopping = False
118
119 def _signal_shutdown(self, signum, _):
120 install_shutdown_handlers(self._signal_kill)
121 signame = signal_names[signum]
122 log.msg(format="Received %(signame)s, shutting down gracefully. Send again to force ",
123 level=log.INFO, signame=signame)
124 reactor.callFromThread(self.stop)
125
126 def _signal_kill(self, signum, _):
127 install_shutdown_handlers(signal.SIG_IGN)
128 signame = signal_names[signum]
129 log.msg(format='Received %(signame)s twice, forcing unclean shutdown',
130 level=log.INFO, signame=signame)
131 self._stop_logging()
132 reactor.callFromThread(self._stop_reactor)
133
134 def start(self, stop_after_crawl=True, start_reactor=True):
135 self.log_observer = log.start_from_settings(self.settings)
136 log.scrapy_info(self.settings)
137 if start_reactor:
138 self._start_reactor(stop_after_crawl)
139
140 def _start_reactor(self, stop_after_crawl=True):
141 if stop_after_crawl:
142 d = defer.DeferredList(self.crawl_deferreds)
143 if d.called:
144 # Don't start the reactor if the deferreds are already fired
145 return
146 d.addBoth(lambda _: self._stop_reactor())
147 if self.settings.getbool('DNSCACHE_ENABLED'):
148 reactor.installResolver(CachingThreadedResolver(reactor))
149 reactor.addSystemEventTrigger('before', 'shutdown', self.stop)
150 reactor.run(installSignalHandlers=False) # blocking call
151
152 def _stop_logging(self):
153 self.log_observer.stop()
154
155 def _stop_reactor(self, _=None):
156 try:
157 reactor.stop()
158 except RuntimeError: # raised if already stopped or in shutdown stage
159 pass
160
```
Path: `scrapy/commands/shell.py`
Content:
```
1 """
2 Scrapy Shell
3
4 See documentation in docs/topics/shell.rst
5 """
6
7 from threading import Thread
8
9 from scrapy.command import ScrapyCommand
10 from scrapy.shell import Shell
11 from scrapy.http import Request
12 from scrapy import log
13 from scrapy.utils.spider import spidercls_for_request, DefaultSpider
14
15
16 class Command(ScrapyCommand):
17
18 requires_project = False
19 default_settings = {'KEEP_ALIVE': True, 'LOGSTATS_INTERVAL': 0}
20
21 def syntax(self):
22 return "[url|file]"
23
24 def short_desc(self):
25 return "Interactive scraping console"
26
27 def long_desc(self):
28 return "Interactive console for scraping the given url"
29
30 def add_options(self, parser):
31 ScrapyCommand.add_options(self, parser)
32 parser.add_option("-c", dest="code",
33 help="evaluate the code in the shell, print the result and exit")
34 parser.add_option("--spider", dest="spider",
35 help="use this spider")
36
37 def update_vars(self, vars):
38 """You can use this function to update the Scrapy objects that will be
39 available in the shell
40 """
41 pass
42
43 def run(self, args, opts):
44 url = args[0] if args else None
45 spiders = self.crawler_process.spiders
46
47 spidercls = DefaultSpider
48 if opts.spider:
49 spidercls = spiders.load(opts.spider)
50 elif url:
51 spidercls = spidercls_for_request(spiders, Request(url),
52 spidercls, log_multiple=True)
53
54 # The crawler is created this way since the Shell manually handles the
55 # crawling engine, so the set up in the crawl method won't work
56 crawler = self.crawler_process._create_logged_crawler(spidercls)
57 # The Shell class needs a persistent engine in the crawler
58 crawler.engine = crawler._create_engine()
59 crawler.engine.start()
60
61 self.crawler_process.start(start_reactor=False)
62 self._start_crawler_thread()
63
64 shell = Shell(crawler, update_vars=self.update_vars, code=opts.code)
65 shell.start(url=url)
66
67 def _start_crawler_thread(self):
68 t = Thread(target=self.crawler_process._start_reactor,
69 kwargs={'stop_after_crawl': False})
70 t.daemon = True
71 t.start()
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scrapy/commands/shell.py b/scrapy/commands/shell.py
--- a/scrapy/commands/shell.py
+++ b/scrapy/commands/shell.py
@@ -58,14 +58,13 @@
crawler.engine = crawler._create_engine()
crawler.engine.start()
- self.crawler_process.start(start_reactor=False)
self._start_crawler_thread()
shell = Shell(crawler, update_vars=self.update_vars, code=opts.code)
shell.start(url=url)
def _start_crawler_thread(self):
- t = Thread(target=self.crawler_process._start_reactor,
+ t = Thread(target=self.crawler_process.start,
kwargs={'stop_after_crawl': False})
t.daemon = True
t.start()
diff --git a/scrapy/crawler.py b/scrapy/crawler.py
--- a/scrapy/crawler.py
+++ b/scrapy/crawler.py
@@ -115,6 +115,8 @@
super(CrawlerProcess, self).__init__(settings)
install_shutdown_handlers(self._signal_shutdown)
self.stopping = False
+ self.log_observer = log.start_from_settings(self.settings)
+ log.scrapy_info(settings)
def _signal_shutdown(self, signum, _):
install_shutdown_handlers(self._signal_kill)
@@ -131,13 +133,7 @@
self._stop_logging()
reactor.callFromThread(self._stop_reactor)
- def start(self, stop_after_crawl=True, start_reactor=True):
- self.log_observer = log.start_from_settings(self.settings)
- log.scrapy_info(self.settings)
- if start_reactor:
- self._start_reactor(stop_after_crawl)
-
- def _start_reactor(self, stop_after_crawl=True):
+ def start(self, stop_after_crawl=True):
if stop_after_crawl:
d = defer.DeferredList(self.crawl_deferreds)
if d.called:
| {"golden_diff": "diff --git a/scrapy/commands/shell.py b/scrapy/commands/shell.py\n--- a/scrapy/commands/shell.py\n+++ b/scrapy/commands/shell.py\n@@ -58,14 +58,13 @@\n crawler.engine = crawler._create_engine()\n crawler.engine.start()\n \n- self.crawler_process.start(start_reactor=False)\n self._start_crawler_thread()\n \n shell = Shell(crawler, update_vars=self.update_vars, code=opts.code)\n shell.start(url=url)\n \n def _start_crawler_thread(self):\n- t = Thread(target=self.crawler_process._start_reactor,\n+ t = Thread(target=self.crawler_process.start,\n kwargs={'stop_after_crawl': False})\n t.daemon = True\n t.start()\ndiff --git a/scrapy/crawler.py b/scrapy/crawler.py\n--- a/scrapy/crawler.py\n+++ b/scrapy/crawler.py\n@@ -115,6 +115,8 @@\n super(CrawlerProcess, self).__init__(settings)\n install_shutdown_handlers(self._signal_shutdown)\n self.stopping = False\n+ self.log_observer = log.start_from_settings(self.settings)\n+ log.scrapy_info(settings)\n \n def _signal_shutdown(self, signum, _):\n install_shutdown_handlers(self._signal_kill)\n@@ -131,13 +133,7 @@\n self._stop_logging()\n reactor.callFromThread(self._stop_reactor)\n \n- def start(self, stop_after_crawl=True, start_reactor=True):\n- self.log_observer = log.start_from_settings(self.settings)\n- log.scrapy_info(self.settings)\n- if start_reactor:\n- self._start_reactor(stop_after_crawl)\n-\n- def _start_reactor(self, stop_after_crawl=True):\n+ def start(self, stop_after_crawl=True):\n if stop_after_crawl:\n d = defer.DeferredList(self.crawl_deferreds)\n if d.called:\n", "issue": "Enabled extensions, middlewares, pipelines\nI found that this information isn't being printed anymore.\n\nThe responsible of this bug is [this line](https://github.com/scrapy/scrapy/pull/816/files#diff-fee03a44ad4de98d9361d89947c8aba3R83), seems that `spider` is `None` on `eventDict` at the moment the components are instantiated.\n\nI'm not sure how to fix it because I 'm not quite sure what it is attempting to block.\n\n/cc @curita\n\n", "before_files": [{"content": "import six\nimport signal\nimport warnings\n\nfrom twisted.internet import reactor, defer\n\nfrom scrapy.core.engine import ExecutionEngine\nfrom scrapy.resolver import CachingThreadedResolver\nfrom scrapy.extension import ExtensionManager\nfrom scrapy.signalmanager import SignalManager\nfrom scrapy.exceptions import ScrapyDeprecationWarning\nfrom scrapy.utils.ossignal import install_shutdown_handlers, signal_names\nfrom scrapy.utils.misc import load_object\nfrom scrapy import log, signals\n\n\nclass Crawler(object):\n\n def __init__(self, spidercls, settings):\n self.spidercls = spidercls\n self.settings = settings\n self.signals = SignalManager(self)\n self.stats = load_object(self.settings['STATS_CLASS'])(self)\n lf_cls = load_object(self.settings['LOG_FORMATTER'])\n self.logformatter = lf_cls.from_crawler(self)\n self.extensions = ExtensionManager.from_crawler(self)\n\n self.crawling = False\n self.spider = None\n self.engine = None\n\n @property\n def spiders(self):\n if not hasattr(self, '_spiders'):\n warnings.warn(\"Crawler.spiders is deprecated, use \"\n \"CrawlerRunner.spiders or instantiate \"\n \"scrapy.spidermanager.SpiderManager with your \"\n \"settings.\",\n category=ScrapyDeprecationWarning, stacklevel=2)\n spman_cls = load_object(self.settings['SPIDER_MANAGER_CLASS'])\n self._spiders = spman_cls.from_settings(self.settings)\n return self._spiders\n\n @defer.inlineCallbacks\n def crawl(self, *args, **kwargs):\n assert not self.crawling, \"Crawling already taking 
place\"\n self.crawling = True\n\n try:\n self.spider = self._create_spider(*args, **kwargs)\n self.engine = self._create_engine()\n start_requests = iter(self.spider.start_requests())\n yield self.engine.open_spider(self.spider, start_requests)\n yield defer.maybeDeferred(self.engine.start)\n except Exception:\n self.crawling = False\n raise\n\n def _create_spider(self, *args, **kwargs):\n return self.spidercls.from_crawler(self, *args, **kwargs)\n\n def _create_engine(self):\n return ExecutionEngine(self, lambda _: self.stop())\n\n @defer.inlineCallbacks\n def stop(self):\n if self.crawling:\n self.crawling = False\n yield defer.maybeDeferred(self.engine.stop)\n\n\nclass CrawlerRunner(object):\n\n def __init__(self, settings):\n self.settings = settings\n smcls = load_object(settings['SPIDER_MANAGER_CLASS'])\n self.spiders = smcls.from_settings(settings.frozencopy())\n self.crawlers = set()\n self.crawl_deferreds = set()\n\n def crawl(self, spidercls, *args, **kwargs):\n crawler = self._create_logged_crawler(spidercls)\n self.crawlers.add(crawler)\n\n d = crawler.crawl(*args, **kwargs)\n self.crawl_deferreds.add(d)\n return d\n\n def _create_logged_crawler(self, spidercls):\n crawler = self._create_crawler(spidercls)\n log_observer = log.start_from_crawler(crawler)\n if log_observer:\n crawler.signals.connect(log_observer.stop, signals.engine_stopped)\n return crawler\n\n def _create_crawler(self, spidercls):\n if isinstance(spidercls, six.string_types):\n spidercls = self.spiders.load(spidercls)\n\n crawler_settings = self.settings.copy()\n spidercls.update_settings(crawler_settings)\n crawler_settings.freeze()\n\n crawler = Crawler(spidercls, crawler_settings)\n return crawler\n\n def stop(self):\n return defer.DeferredList(c.stop() for c in self.crawlers)\n\n\nclass CrawlerProcess(CrawlerRunner):\n \"\"\"A class to run multiple scrapy crawlers in a process simultaneously\"\"\"\n\n def __init__(self, settings):\n super(CrawlerProcess, self).__init__(settings)\n install_shutdown_handlers(self._signal_shutdown)\n self.stopping = False\n\n def _signal_shutdown(self, signum, _):\n install_shutdown_handlers(self._signal_kill)\n signame = signal_names[signum]\n log.msg(format=\"Received %(signame)s, shutting down gracefully. 
Send again to force \",\n level=log.INFO, signame=signame)\n reactor.callFromThread(self.stop)\n\n def _signal_kill(self, signum, _):\n install_shutdown_handlers(signal.SIG_IGN)\n signame = signal_names[signum]\n log.msg(format='Received %(signame)s twice, forcing unclean shutdown',\n level=log.INFO, signame=signame)\n self._stop_logging()\n reactor.callFromThread(self._stop_reactor)\n\n def start(self, stop_after_crawl=True, start_reactor=True):\n self.log_observer = log.start_from_settings(self.settings)\n log.scrapy_info(self.settings)\n if start_reactor:\n self._start_reactor(stop_after_crawl)\n\n def _start_reactor(self, stop_after_crawl=True):\n if stop_after_crawl:\n d = defer.DeferredList(self.crawl_deferreds)\n if d.called:\n # Don't start the reactor if the deferreds are already fired\n return\n d.addBoth(lambda _: self._stop_reactor())\n if self.settings.getbool('DNSCACHE_ENABLED'):\n reactor.installResolver(CachingThreadedResolver(reactor))\n reactor.addSystemEventTrigger('before', 'shutdown', self.stop)\n reactor.run(installSignalHandlers=False) # blocking call\n\n def _stop_logging(self):\n self.log_observer.stop()\n\n def _stop_reactor(self, _=None):\n try:\n reactor.stop()\n except RuntimeError: # raised if already stopped or in shutdown stage\n pass\n", "path": "scrapy/crawler.py"}, {"content": "\"\"\"\nScrapy Shell\n\nSee documentation in docs/topics/shell.rst\n\"\"\"\n\nfrom threading import Thread\n\nfrom scrapy.command import ScrapyCommand\nfrom scrapy.shell import Shell\nfrom scrapy.http import Request\nfrom scrapy import log\nfrom scrapy.utils.spider import spidercls_for_request, DefaultSpider\n\n\nclass Command(ScrapyCommand):\n\n requires_project = False\n default_settings = {'KEEP_ALIVE': True, 'LOGSTATS_INTERVAL': 0}\n\n def syntax(self):\n return \"[url|file]\"\n\n def short_desc(self):\n return \"Interactive scraping console\"\n\n def long_desc(self):\n return \"Interactive console for scraping the given url\"\n\n def add_options(self, parser):\n ScrapyCommand.add_options(self, parser)\n parser.add_option(\"-c\", dest=\"code\",\n help=\"evaluate the code in the shell, print the result and exit\")\n parser.add_option(\"--spider\", dest=\"spider\",\n help=\"use this spider\")\n\n def update_vars(self, vars):\n \"\"\"You can use this function to update the Scrapy objects that will be\n available in the shell\n \"\"\"\n pass\n\n def run(self, args, opts):\n url = args[0] if args else None\n spiders = self.crawler_process.spiders\n\n spidercls = DefaultSpider\n if opts.spider:\n spidercls = spiders.load(opts.spider)\n elif url:\n spidercls = spidercls_for_request(spiders, Request(url),\n spidercls, log_multiple=True)\n\n # The crawler is created this way since the Shell manually handles the\n # crawling engine, so the set up in the crawl method won't work\n crawler = self.crawler_process._create_logged_crawler(spidercls)\n # The Shell class needs a persistent engine in the crawler\n crawler.engine = crawler._create_engine()\n crawler.engine.start()\n\n self.crawler_process.start(start_reactor=False)\n self._start_crawler_thread()\n\n shell = Shell(crawler, update_vars=self.update_vars, code=opts.code)\n shell.start(url=url)\n\n def _start_crawler_thread(self):\n t = Thread(target=self.crawler_process._start_reactor,\n kwargs={'stop_after_crawl': False})\n t.daemon = True\n t.start()\n", "path": "scrapy/commands/shell.py"}], "after_files": [{"content": "import six\nimport signal\nimport warnings\n\nfrom twisted.internet import reactor, defer\n\nfrom 
scrapy.core.engine import ExecutionEngine\nfrom scrapy.resolver import CachingThreadedResolver\nfrom scrapy.extension import ExtensionManager\nfrom scrapy.signalmanager import SignalManager\nfrom scrapy.exceptions import ScrapyDeprecationWarning\nfrom scrapy.utils.ossignal import install_shutdown_handlers, signal_names\nfrom scrapy.utils.misc import load_object\nfrom scrapy import log, signals\n\n\nclass Crawler(object):\n\n def __init__(self, spidercls, settings):\n self.spidercls = spidercls\n self.settings = settings\n self.signals = SignalManager(self)\n self.stats = load_object(self.settings['STATS_CLASS'])(self)\n lf_cls = load_object(self.settings['LOG_FORMATTER'])\n self.logformatter = lf_cls.from_crawler(self)\n self.extensions = ExtensionManager.from_crawler(self)\n\n self.crawling = False\n self.spider = None\n self.engine = None\n\n @property\n def spiders(self):\n if not hasattr(self, '_spiders'):\n warnings.warn(\"Crawler.spiders is deprecated, use \"\n \"CrawlerRunner.spiders or instantiate \"\n \"scrapy.spidermanager.SpiderManager with your \"\n \"settings.\",\n category=ScrapyDeprecationWarning, stacklevel=2)\n spman_cls = load_object(self.settings['SPIDER_MANAGER_CLASS'])\n self._spiders = spman_cls.from_settings(self.settings)\n return self._spiders\n\n @defer.inlineCallbacks\n def crawl(self, *args, **kwargs):\n assert not self.crawling, \"Crawling already taking place\"\n self.crawling = True\n\n try:\n self.spider = self._create_spider(*args, **kwargs)\n self.engine = self._create_engine()\n start_requests = iter(self.spider.start_requests())\n yield self.engine.open_spider(self.spider, start_requests)\n yield defer.maybeDeferred(self.engine.start)\n except Exception:\n self.crawling = False\n raise\n\n def _create_spider(self, *args, **kwargs):\n return self.spidercls.from_crawler(self, *args, **kwargs)\n\n def _create_engine(self):\n return ExecutionEngine(self, lambda _: self.stop())\n\n @defer.inlineCallbacks\n def stop(self):\n if self.crawling:\n self.crawling = False\n yield defer.maybeDeferred(self.engine.stop)\n\n\nclass CrawlerRunner(object):\n\n def __init__(self, settings):\n self.settings = settings\n smcls = load_object(settings['SPIDER_MANAGER_CLASS'])\n self.spiders = smcls.from_settings(settings.frozencopy())\n self.crawlers = set()\n self.crawl_deferreds = set()\n\n def crawl(self, spidercls, *args, **kwargs):\n crawler = self._create_logged_crawler(spidercls)\n self.crawlers.add(crawler)\n\n d = crawler.crawl(*args, **kwargs)\n self.crawl_deferreds.add(d)\n return d\n\n def _create_logged_crawler(self, spidercls):\n crawler = self._create_crawler(spidercls)\n log_observer = log.start_from_crawler(crawler)\n if log_observer:\n crawler.signals.connect(log_observer.stop, signals.engine_stopped)\n return crawler\n\n def _create_crawler(self, spidercls):\n if isinstance(spidercls, six.string_types):\n spidercls = self.spiders.load(spidercls)\n\n crawler_settings = self.settings.copy()\n spidercls.update_settings(crawler_settings)\n crawler_settings.freeze()\n\n crawler = Crawler(spidercls, crawler_settings)\n return crawler\n\n def stop(self):\n return defer.DeferredList(c.stop() for c in self.crawlers)\n\n\nclass CrawlerProcess(CrawlerRunner):\n \"\"\"A class to run multiple scrapy crawlers in a process simultaneously\"\"\"\n\n def __init__(self, settings):\n super(CrawlerProcess, self).__init__(settings)\n install_shutdown_handlers(self._signal_shutdown)\n self.stopping = False\n self.log_observer = log.start_from_settings(self.settings)\n 
log.scrapy_info(settings)\n\n def _signal_shutdown(self, signum, _):\n install_shutdown_handlers(self._signal_kill)\n signame = signal_names[signum]\n log.msg(format=\"Received %(signame)s, shutting down gracefully. Send again to force \",\n level=log.INFO, signame=signame)\n reactor.callFromThread(self.stop)\n\n def _signal_kill(self, signum, _):\n install_shutdown_handlers(signal.SIG_IGN)\n signame = signal_names[signum]\n log.msg(format='Received %(signame)s twice, forcing unclean shutdown',\n level=log.INFO, signame=signame)\n self._stop_logging()\n reactor.callFromThread(self._stop_reactor)\n\n def start(self, stop_after_crawl=True):\n if stop_after_crawl:\n d = defer.DeferredList(self.crawl_deferreds)\n if d.called:\n # Don't start the reactor if the deferreds are already fired\n return\n d.addBoth(lambda _: self._stop_reactor())\n if self.settings.getbool('DNSCACHE_ENABLED'):\n reactor.installResolver(CachingThreadedResolver(reactor))\n reactor.addSystemEventTrigger('before', 'shutdown', self.stop)\n reactor.run(installSignalHandlers=False) # blocking call\n\n def _stop_logging(self):\n self.log_observer.stop()\n\n def _stop_reactor(self, _=None):\n try:\n reactor.stop()\n except RuntimeError: # raised if already stopped or in shutdown stage\n pass\n", "path": "scrapy/crawler.py"}, {"content": "\"\"\"\nScrapy Shell\n\nSee documentation in docs/topics/shell.rst\n\"\"\"\n\nfrom threading import Thread\n\nfrom scrapy.command import ScrapyCommand\nfrom scrapy.shell import Shell\nfrom scrapy.http import Request\nfrom scrapy import log\nfrom scrapy.utils.spider import spidercls_for_request, DefaultSpider\n\n\nclass Command(ScrapyCommand):\n\n requires_project = False\n default_settings = {'KEEP_ALIVE': True, 'LOGSTATS_INTERVAL': 0}\n\n def syntax(self):\n return \"[url|file]\"\n\n def short_desc(self):\n return \"Interactive scraping console\"\n\n def long_desc(self):\n return \"Interactive console for scraping the given url\"\n\n def add_options(self, parser):\n ScrapyCommand.add_options(self, parser)\n parser.add_option(\"-c\", dest=\"code\",\n help=\"evaluate the code in the shell, print the result and exit\")\n parser.add_option(\"--spider\", dest=\"spider\",\n help=\"use this spider\")\n\n def update_vars(self, vars):\n \"\"\"You can use this function to update the Scrapy objects that will be\n available in the shell\n \"\"\"\n pass\n\n def run(self, args, opts):\n url = args[0] if args else None\n spiders = self.crawler_process.spiders\n\n spidercls = DefaultSpider\n if opts.spider:\n spidercls = spiders.load(opts.spider)\n elif url:\n spidercls = spidercls_for_request(spiders, Request(url),\n spidercls, log_multiple=True)\n\n # The crawler is created this way since the Shell manually handles the\n # crawling engine, so the set up in the crawl method won't work\n crawler = self.crawler_process._create_logged_crawler(spidercls)\n # The Shell class needs a persistent engine in the crawler\n crawler.engine = crawler._create_engine()\n crawler.engine.start()\n\n self._start_crawler_thread()\n\n shell = Shell(crawler, update_vars=self.update_vars, code=opts.code)\n shell.start(url=url)\n\n def _start_crawler_thread(self):\n t = Thread(target=self.crawler_process.start,\n kwargs={'stop_after_crawl': False})\n t.daemon = True\n t.start()\n", "path": "scrapy/commands/shell.py"}]} | 2,669 | 434 |
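For this record, the accepted fix (the golden diff above) wires logging up in `CrawlerProcess.__init__` and collapses `start()`/`_start_reactor()` into a single `start()`, which the shell command then runs in a thread. A rough, self-contained sketch of the resulting behaviour, using stand-in classes rather than the real Scrapy ones:

```python
# Stand-ins only; the real classes live in scrapy.crawler and scrapy.log.
class StubLogObserver:
    def stop(self):
        print("log observer stopped")

def start_from_settings(settings):
    print("Scrapy banner + enabled extensions/middlewares/pipelines ...")
    return StubLogObserver()

class StubCrawlerProcess:
    def __init__(self, settings):
        # logging starts here now, before any crawler components are built,
        # so their startup messages are printed again
        self.log_observer = start_from_settings(settings)

    def start(self, stop_after_crawl=True):
        # single entry point; the shell command runs this in a daemon thread
        # with stop_after_crawl=False
        print("reactor running, stop_after_crawl=%s" % stop_after_crawl)

StubCrawlerProcess({}).start(stop_after_crawl=False)
```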
gh_patches_debug_12880 | rasdani/github-patches | git_diff | ansible-collections__community.general-3800 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pipx install option --install-deps not in version 0.16.4
### Summary
The [pipx plugin](https://github.com/ansible-collections/community.general/blob/main/plugins/modules/packaging/language/pipx.py) has an option `install_deps` which adds an `--install-deps` option to the `pipx` command line. This causes the `pipx` command to fail with version 0.16.4 of `pipx`.
Version 0.16.4 of the program appears to provide the `--include-deps` option instead.
https://pypa.github.io/pipx/docs/#pipx-install
https://docs.ansible.com/ansible/latest/collections/community/general/pipx_module.html#parameter-install_deps
### Issue Type
Bug Report
### Component Name
pipx
### Ansible Version
```
ansible [core 2.11.6]
```
### Community.general Version
Global version
```
Collection Version
----------------- -------
community.general 3.8.1
```
Playbook local version
```
Collection Version
----------------- -------
community.general 4.1.0
```
### Configuration
```console (paste below)
$ ansible-config dump --only-changed
```
### OS / Environment
WSL2 (local)
Ubuntu 20.04 (remote)
### Steps to Reproduce
<!--- Paste example playbooks or commands between quotes below -->
```yaml
community.general.pipx:
name: "{{ pipx_item }}"
include_deps: true
```
### Expected Results
The module completes successfully.
### Actual Results
```console
failed: [laptop] (item=ansible-base) => changed=false
ansible_loop_var: pipx_item
cmd: /usr/bin/python3 -m pipx install --install-deps ansible-base
msg: |-
usage: __main__.py [-h] [--version]
{install,inject,upgrade,upgrade-all,uninstall,uninstall-all,reinstall,reinstall-all,list,run,runpip,ensurepath,completions}
...
__main__.py: error: unrecognized arguments: --install-deps
pipx_item: ansible-base
rc: 2
stderr: |-
usage: __main__.py [-h] [--version]
{install,inject,upgrade,upgrade-all,uninstall,uninstall-all,reinstall,reinstall-all,list,run,runpip,ensurepath,completions}
...
__main__.py: error: unrecognized arguments: --install-deps
stderr_lines: <omitted>
stdout: ''
stdout_lines: <omitted>
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct
--- END ISSUE ---
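The root cause is a single hard-coded flag string in the module's argument formatting. A simplified stand-in for the `ArgFormat.BOOLEAN` handling (the real helper lives in `module_utils.module_helper` and is more general) makes the difference visible:

```python
# Simplified stand-in for boolean flag formatting; illustrative only.
def boolean_flag(enabled, flag):
    return [flag] if enabled else []

# With the spec currently in the module, install_deps=True yields a flag that
# pipx 0.16.4 rejects; the flag pipx documents is --include-deps.
print(boolean_flag(True, "--install-deps"))  # ['--install-deps'] -> "unrecognized arguments"
print(boolean_flag(True, "--include-deps"))  # ['--include-deps'] -> accepted by pipx 0.16.x
```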
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugins/modules/packaging/language/pipx.py`
Content:
```
1 #!/usr/bin/python
2 # -*- coding: utf-8 -*-
3
4 # (c) 2021, Alexei Znamensky <[email protected]>
5 # GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
6
7 from __future__ import absolute_import, division, print_function
8 __metaclass__ = type
9
10
11 DOCUMENTATION = '''
12 ---
13 module: pipx
14 short_description: Manages applications installed with pipx
15 version_added: 3.8.0
16 description:
17 - Manage Python applications installed in isolated virtualenvs using pipx.
18 options:
19 state:
20 type: str
21 choices: [present, absent, install, uninstall, uninstall_all, inject, upgrade, upgrade_all, reinstall, reinstall_all]
22 default: install
23 description:
24 - Desired state for the application.
25 - The states C(present) and C(absent) are aliases to C(install) and C(uninstall), respectively.
26 name:
27 type: str
28 description:
29 - >
30 The name of the application to be installed. It must to be a simple package name.
31 For passing package specifications or installing from URLs or directories,
32 please use the I(source) option.
33 source:
34 type: str
35 description:
36 - >
37 If the application source, such as a package with version specifier, or an URL,
38 directory or any other accepted specification. See C(pipx) documentation for more details.
39 - When specified, the C(pipx) command will use I(source) instead of I(name).
40 install_deps:
41 description:
42 - Include applications of dependent packages.
43 - Only used when I(state=install) or I(state=upgrade).
44 type: bool
45 default: false
46 inject_packages:
47 description:
48 - Packages to be injected into an existing virtual environment.
49 - Only used when I(state=inject).
50 type: list
51 elements: str
52 force:
53 description:
54 - Force modification of the application's virtual environment. See C(pipx) for details.
55 - Only used when I(state=install), I(state=upgrade), I(state=upgrade_all), or I(state=inject).
56 type: bool
57 default: false
58 include_injected:
59 description:
60 - Upgrade the injected packages along with the application.
61 - Only used when I(state=upgrade) or I(state=upgrade_all).
62 type: bool
63 default: false
64 index_url:
65 description:
66 - Base URL of Python Package Index.
67 - Only used when I(state=install), I(state=upgrade), or I(state=inject).
68 type: str
69 python:
70 description:
71 - Python version to be used when creating the application virtual environment. Must be 3.6+.
72 - Only used when I(state=install), I(state=reinstall), or I(state=reinstall_all).
73 type: str
74 executable:
75 description:
76 - Path to the C(pipx) installed in the system.
77 - >
78 If not specified, the module will use C(python -m pipx) to run the tool,
79 using the same Python interpreter as ansible itself.
80 type: path
81 notes:
82 - This module does not install the C(pipx) python package, however that can be easily done with the module M(ansible.builtin.pip).
83 - This module does not require C(pipx) to be in the shell C(PATH), but it must be loadable by Python as a module.
84 - Please note that C(pipx) requires Python 3.6 or above.
85 - >
86 This first implementation does not verify whether a specified version constraint has been installed or not.
87 Hence, when using version operators, C(pipx) module will always try to execute the operation,
88 even when the application was previously installed.
89 This feature will be added in the future.
90 - See also the C(pipx) documentation at U(https://pypa.github.io/pipx/).
91 author:
92 - "Alexei Znamensky (@russoz)"
93 '''
94
95 EXAMPLES = '''
96 - name: Install tox
97 community.general.pipx:
98 name: tox
99
100 - name: Install tox from git repository
101 community.general.pipx:
102 name: tox
103 source: git+https://github.com/tox-dev/tox.git
104
105 - name: Upgrade tox
106 community.general.pipx:
107 name: tox
108 state: upgrade
109
110 - name: Reinstall black with specific Python version
111 community.general.pipx:
112 name: black
113 state: reinstall
114 python: 3.7
115
116 - name: Uninstall pycowsay
117 community.general.pipx:
118 name: pycowsay
119 state: absent
120 '''
121
122
123 import json
124
125 from ansible_collections.community.general.plugins.module_utils.module_helper import (
126 CmdStateModuleHelper, ArgFormat, ModuleHelperException
127 )
128 from ansible.module_utils.facts.compat import ansible_facts
129
130
131 _state_map = dict(
132 present='install',
133 absent='uninstall',
134 uninstall_all='uninstall-all',
135 upgrade_all='upgrade-all',
136 reinstall_all='reinstall-all',
137 )
138
139
140 class PipX(CmdStateModuleHelper):
141 output_params = ['name', 'source', 'index_url', 'force', 'installdeps']
142 module = dict(
143 argument_spec=dict(
144 state=dict(type='str', default='install',
145 choices=[
146 'present', 'absent', 'install', 'uninstall', 'uninstall_all',
147 'inject', 'upgrade', 'upgrade_all', 'reinstall', 'reinstall_all']),
148 name=dict(type='str'),
149 source=dict(type='str'),
150 install_deps=dict(type='bool', default=False),
151 inject_packages=dict(type='list', elements='str'),
152 force=dict(type='bool', default=False),
153 include_injected=dict(type='bool', default=False),
154 index_url=dict(type='str'),
155 python=dict(type='str'),
156 executable=dict(type='path')
157 ),
158 required_if=[
159 ('state', 'present', ['name']),
160 ('state', 'install', ['name']),
161 ('state', 'absent', ['name']),
162 ('state', 'uninstall', ['name']),
163 ('state', 'inject', ['name', 'inject_packages']),
164 ],
165 supports_check_mode=True,
166 )
167 command_args_formats = dict(
168 state=dict(fmt=lambda v: [_state_map.get(v, v)]),
169 name_source=dict(fmt=lambda n, s: [s] if s else [n], stars=1),
170 install_deps=dict(fmt="--install-deps", style=ArgFormat.BOOLEAN),
171 inject_packages=dict(fmt=lambda v: v),
172 force=dict(fmt="--force", style=ArgFormat.BOOLEAN),
173 include_injected=dict(fmt="--include-injected", style=ArgFormat.BOOLEAN),
174 index_url=dict(fmt=('--index-url', '{0}'),),
175 python=dict(fmt=('--python', '{0}'),),
176 _list=dict(fmt=('list', '--include-injected', '--json'), style=ArgFormat.BOOLEAN),
177 )
178 check_rc = True
179 run_command_fixed_options = dict(
180 environ_update={'USE_EMOJI': '0'}
181 )
182
183 def _retrieve_installed(self):
184 def process_list(rc, out, err):
185 if not out:
186 return {}
187
188 results = {}
189 raw_data = json.loads(out)
190 for venv_name, venv in raw_data['venvs'].items():
191 results[venv_name] = {
192 'version': venv['metadata']['main_package']['package_version'],
193 'injected': dict(
194 (k, v['package_version']) for k, v in venv['metadata']['injected_packages'].items()
195 ),
196 }
197 return results
198
199 installed = self.run_command(params=[{'_list': True}], process_output=process_list,
200 publish_rc=False, publish_out=False, publish_err=False, publish_cmd=False)
201
202 if self.vars.name is not None:
203 app_list = installed.get(self.vars.name)
204 if app_list:
205 return {self.vars.name: app_list}
206 else:
207 return {}
208
209 return installed
210
211 def __init_module__(self):
212 if self.vars.executable:
213 self.command = [self.vars.executable]
214 else:
215 facts = ansible_facts(self.module, gather_subset=['python'])
216 self.command = [facts['python']['executable'], '-m', 'pipx']
217
218 self.vars.set('application', self._retrieve_installed(), change=True, diff=True)
219
220 def __quit_module__(self):
221 self.vars.application = self._retrieve_installed()
222
223 def state_install(self):
224 if not self.vars.application or self.vars.force:
225 self.changed = True
226 if not self.module.check_mode:
227 self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'python',
228 {'name_source': [self.vars.name, self.vars.source]}])
229
230 state_present = state_install
231
232 def state_upgrade(self):
233 if not self.vars.application:
234 raise ModuleHelperException(
235 "Trying to upgrade a non-existent application: {0}".format(self.vars.name))
236 if self.vars.force:
237 self.changed = True
238 if not self.module.check_mode:
239 self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'name'])
240
241 def state_uninstall(self):
242 if self.vars.application and not self.module.check_mode:
243 self.run_command(params=['state', 'name'])
244
245 state_absent = state_uninstall
246
247 def state_reinstall(self):
248 if not self.vars.application:
249 raise ModuleHelperException(
250 "Trying to reinstall a non-existent application: {0}".format(self.vars.name))
251 self.changed = True
252 if not self.module.check_mode:
253 self.run_command(params=['state', 'name', 'python'])
254
255 def state_inject(self):
256 if not self.vars.application:
257 raise ModuleHelperException(
258 "Trying to inject packages into a non-existent application: {0}".format(self.vars.name))
259 if self.vars.force:
260 self.changed = True
261 if not self.module.check_mode:
262 self.run_command(params=['state', 'index_url', 'force', 'name', 'inject_packages'])
263
264 def state_uninstall_all(self):
265 if not self.module.check_mode:
266 self.run_command(params=['state'])
267
268 def state_reinstall_all(self):
269 if not self.module.check_mode:
270 self.run_command(params=['state', 'python'])
271
272 def state_upgrade_all(self):
273 if self.vars.force:
274 self.changed = True
275 if not self.module.check_mode:
276 self.run_command(params=['state', 'include_injected', 'force'])
277
278
279 def main():
280 PipX.execute()
281
282
283 if __name__ == '__main__':
284 main()
285
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugins/modules/packaging/language/pipx.py b/plugins/modules/packaging/language/pipx.py
--- a/plugins/modules/packaging/language/pipx.py
+++ b/plugins/modules/packaging/language/pipx.py
@@ -167,7 +167,7 @@
command_args_formats = dict(
state=dict(fmt=lambda v: [_state_map.get(v, v)]),
name_source=dict(fmt=lambda n, s: [s] if s else [n], stars=1),
- install_deps=dict(fmt="--install-deps", style=ArgFormat.BOOLEAN),
+ install_deps=dict(fmt="--include-deps", style=ArgFormat.BOOLEAN),
inject_packages=dict(fmt=lambda v: v),
force=dict(fmt="--force", style=ArgFormat.BOOLEAN),
include_injected=dict(fmt="--include-injected", style=ArgFormat.BOOLEAN),
| {"golden_diff": "diff --git a/plugins/modules/packaging/language/pipx.py b/plugins/modules/packaging/language/pipx.py\n--- a/plugins/modules/packaging/language/pipx.py\n+++ b/plugins/modules/packaging/language/pipx.py\n@@ -167,7 +167,7 @@\n command_args_formats = dict(\n state=dict(fmt=lambda v: [_state_map.get(v, v)]),\n name_source=dict(fmt=lambda n, s: [s] if s else [n], stars=1),\n- install_deps=dict(fmt=\"--install-deps\", style=ArgFormat.BOOLEAN),\n+ install_deps=dict(fmt=\"--include-deps\", style=ArgFormat.BOOLEAN),\n inject_packages=dict(fmt=lambda v: v),\n force=dict(fmt=\"--force\", style=ArgFormat.BOOLEAN),\n include_injected=dict(fmt=\"--include-injected\", style=ArgFormat.BOOLEAN),\n", "issue": "pipx install option --install-deps not in version 0.16.4\n### Summary\n\nThe [pipx plugin](https://github.com/ansible-collections/community.general/blob/main/plugins/modules/packaging/language/pipx.py) has an option `install_deps` which adds an `--install-deps` option to the the `pipx` command line. This causes the `pipx` command to fail with version 0.16.4 of `pipx`.\r\n\r\nVersion 0.16.4 of the program appears to provide the `--include-deps` option instead.\r\n\r\nhttps://pypa.github.io/pipx/docs/#pipx-install\r\nhttps://docs.ansible.com/ansible/latest/collections/community/general/pipx_module.html#parameter-install_deps\n\n### Issue Type\n\nBug Report\n\n### Component Name\n\npipx\n\n### Ansible Version\n\n```\r\nansible [core 2.11.6]\r\n```\r\n\n\n### Community.general Version\n\nGlobal version\r\n```\r\nCollection Version\r\n----------------- -------\r\ncommunity.general 3.8.1\r\n```\r\n\r\nPlaybook local version\r\n```\r\nCollection Version\r\n----------------- -------\r\ncommunity.general 4.1.0\r\n```\n\n### Configuration\n\n```console (paste below)\r\n$ ansible-config dump --only-changed\r\n\r\n```\r\n\n\n### OS / Environment\n\nWSL2 (local)\r\nUbuntu 20.04 (remote)\n\n### Steps to Reproduce\n\n<!--- Paste example playbooks or commands between quotes below -->\r\n```yaml\r\ncommunity.general.pipx:\r\n name: \"{{ pipx_item }}\"\r\n include_deps: true\r\n\r\n```\r\n\n\n### Expected Results\n\nThe module to complete succesfully\n\n### Actual Results\n\n```console\r\nfailed: [laptop] (item=ansible-base) => changed=false\r\n ansible_loop_var: pipx_item\r\n cmd: /usr/bin/python3 -m pipx install --install-deps ansible-base\r\n msg: |-\r\n usage: __main__.py [-h] [--version]\r\n {install,inject,upgrade,upgrade-all,uninstall,uninstall-all,reinstall,reinstall-all,list,run,runpip,ensurepath,completions}\r\n ...\r\n __main__.py: error: unrecognized arguments: --install-deps\r\n pipx_item: ansible-base\r\n rc: 2\r\n stderr: |-\r\n usage: __main__.py [-h] [--version]\r\n {install,inject,upgrade,upgrade-all,uninstall,uninstall-all,reinstall,reinstall-all,list,run,runpip,ensurepath,completions}\r\n ...\r\n __main__.py: error: unrecognized arguments: --install-deps\r\n stderr_lines: <omitted>\r\n stdout: ''\r\n stdout_lines: <omitted>\r\n```\r\n\n\n### Code of Conduct\n\n- [X] I agree to follow the Ansible Code of Conduct\n", "before_files": [{"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2021, Alexei Znamensky <[email protected]>\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: pipx\nshort_description: Manages applications installed with pipx\nversion_added: 3.8.0\ndescription:\n - Manage Python 
applications installed in isolated virtualenvs using pipx.\noptions:\n state:\n type: str\n choices: [present, absent, install, uninstall, uninstall_all, inject, upgrade, upgrade_all, reinstall, reinstall_all]\n default: install\n description:\n - Desired state for the application.\n - The states C(present) and C(absent) are aliases to C(install) and C(uninstall), respectively.\n name:\n type: str\n description:\n - >\n The name of the application to be installed. It must to be a simple package name.\n For passing package specifications or installing from URLs or directories,\n please use the I(source) option.\n source:\n type: str\n description:\n - >\n If the application source, such as a package with version specifier, or an URL,\n directory or any other accepted specification. See C(pipx) documentation for more details.\n - When specified, the C(pipx) command will use I(source) instead of I(name).\n install_deps:\n description:\n - Include applications of dependent packages.\n - Only used when I(state=install) or I(state=upgrade).\n type: bool\n default: false\n inject_packages:\n description:\n - Packages to be injected into an existing virtual environment.\n - Only used when I(state=inject).\n type: list\n elements: str\n force:\n description:\n - Force modification of the application's virtual environment. See C(pipx) for details.\n - Only used when I(state=install), I(state=upgrade), I(state=upgrade_all), or I(state=inject).\n type: bool\n default: false\n include_injected:\n description:\n - Upgrade the injected packages along with the application.\n - Only used when I(state=upgrade) or I(state=upgrade_all).\n type: bool\n default: false\n index_url:\n description:\n - Base URL of Python Package Index.\n - Only used when I(state=install), I(state=upgrade), or I(state=inject).\n type: str\n python:\n description:\n - Python version to be used when creating the application virtual environment. 
Must be 3.6+.\n - Only used when I(state=install), I(state=reinstall), or I(state=reinstall_all).\n type: str\n executable:\n description:\n - Path to the C(pipx) installed in the system.\n - >\n If not specified, the module will use C(python -m pipx) to run the tool,\n using the same Python interpreter as ansible itself.\n type: path\nnotes:\n - This module does not install the C(pipx) python package, however that can be easily done with the module M(ansible.builtin.pip).\n - This module does not require C(pipx) to be in the shell C(PATH), but it must be loadable by Python as a module.\n - Please note that C(pipx) requires Python 3.6 or above.\n - >\n This first implementation does not verify whether a specified version constraint has been installed or not.\n Hence, when using version operators, C(pipx) module will always try to execute the operation,\n even when the application was previously installed.\n This feature will be added in the future.\n - See also the C(pipx) documentation at U(https://pypa.github.io/pipx/).\nauthor:\n - \"Alexei Znamensky (@russoz)\"\n'''\n\nEXAMPLES = '''\n- name: Install tox\n community.general.pipx:\n name: tox\n\n- name: Install tox from git repository\n community.general.pipx:\n name: tox\n source: git+https://github.com/tox-dev/tox.git\n\n- name: Upgrade tox\n community.general.pipx:\n name: tox\n state: upgrade\n\n- name: Reinstall black with specific Python version\n community.general.pipx:\n name: black\n state: reinstall\n python: 3.7\n\n- name: Uninstall pycowsay\n community.general.pipx:\n name: pycowsay\n state: absent\n'''\n\n\nimport json\n\nfrom ansible_collections.community.general.plugins.module_utils.module_helper import (\n CmdStateModuleHelper, ArgFormat, ModuleHelperException\n)\nfrom ansible.module_utils.facts.compat import ansible_facts\n\n\n_state_map = dict(\n present='install',\n absent='uninstall',\n uninstall_all='uninstall-all',\n upgrade_all='upgrade-all',\n reinstall_all='reinstall-all',\n)\n\n\nclass PipX(CmdStateModuleHelper):\n output_params = ['name', 'source', 'index_url', 'force', 'installdeps']\n module = dict(\n argument_spec=dict(\n state=dict(type='str', default='install',\n choices=[\n 'present', 'absent', 'install', 'uninstall', 'uninstall_all',\n 'inject', 'upgrade', 'upgrade_all', 'reinstall', 'reinstall_all']),\n name=dict(type='str'),\n source=dict(type='str'),\n install_deps=dict(type='bool', default=False),\n inject_packages=dict(type='list', elements='str'),\n force=dict(type='bool', default=False),\n include_injected=dict(type='bool', default=False),\n index_url=dict(type='str'),\n python=dict(type='str'),\n executable=dict(type='path')\n ),\n required_if=[\n ('state', 'present', ['name']),\n ('state', 'install', ['name']),\n ('state', 'absent', ['name']),\n ('state', 'uninstall', ['name']),\n ('state', 'inject', ['name', 'inject_packages']),\n ],\n supports_check_mode=True,\n )\n command_args_formats = dict(\n state=dict(fmt=lambda v: [_state_map.get(v, v)]),\n name_source=dict(fmt=lambda n, s: [s] if s else [n], stars=1),\n install_deps=dict(fmt=\"--install-deps\", style=ArgFormat.BOOLEAN),\n inject_packages=dict(fmt=lambda v: v),\n force=dict(fmt=\"--force\", style=ArgFormat.BOOLEAN),\n include_injected=dict(fmt=\"--include-injected\", style=ArgFormat.BOOLEAN),\n index_url=dict(fmt=('--index-url', '{0}'),),\n python=dict(fmt=('--python', '{0}'),),\n _list=dict(fmt=('list', '--include-injected', '--json'), style=ArgFormat.BOOLEAN),\n )\n check_rc = True\n run_command_fixed_options = dict(\n 
environ_update={'USE_EMOJI': '0'}\n )\n\n def _retrieve_installed(self):\n def process_list(rc, out, err):\n if not out:\n return {}\n\n results = {}\n raw_data = json.loads(out)\n for venv_name, venv in raw_data['venvs'].items():\n results[venv_name] = {\n 'version': venv['metadata']['main_package']['package_version'],\n 'injected': dict(\n (k, v['package_version']) for k, v in venv['metadata']['injected_packages'].items()\n ),\n }\n return results\n\n installed = self.run_command(params=[{'_list': True}], process_output=process_list,\n publish_rc=False, publish_out=False, publish_err=False, publish_cmd=False)\n\n if self.vars.name is not None:\n app_list = installed.get(self.vars.name)\n if app_list:\n return {self.vars.name: app_list}\n else:\n return {}\n\n return installed\n\n def __init_module__(self):\n if self.vars.executable:\n self.command = [self.vars.executable]\n else:\n facts = ansible_facts(self.module, gather_subset=['python'])\n self.command = [facts['python']['executable'], '-m', 'pipx']\n\n self.vars.set('application', self._retrieve_installed(), change=True, diff=True)\n\n def __quit_module__(self):\n self.vars.application = self._retrieve_installed()\n\n def state_install(self):\n if not self.vars.application or self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'python',\n {'name_source': [self.vars.name, self.vars.source]}])\n\n state_present = state_install\n\n def state_upgrade(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to upgrade a non-existent application: {0}\".format(self.vars.name))\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'name'])\n\n def state_uninstall(self):\n if self.vars.application and not self.module.check_mode:\n self.run_command(params=['state', 'name'])\n\n state_absent = state_uninstall\n\n def state_reinstall(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to reinstall a non-existent application: {0}\".format(self.vars.name))\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'name', 'python'])\n\n def state_inject(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to inject packages into a non-existent application: {0}\".format(self.vars.name))\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'force', 'name', 'inject_packages'])\n\n def state_uninstall_all(self):\n if not self.module.check_mode:\n self.run_command(params=['state'])\n\n def state_reinstall_all(self):\n if not self.module.check_mode:\n self.run_command(params=['state', 'python'])\n\n def state_upgrade_all(self):\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'include_injected', 'force'])\n\n\ndef main():\n PipX.execute()\n\n\nif __name__ == '__main__':\n main()\n", "path": "plugins/modules/packaging/language/pipx.py"}], "after_files": [{"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2021, Alexei Znamensky <[email protected]>\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: pipx\nshort_description: Manages 
applications installed with pipx\nversion_added: 3.8.0\ndescription:\n - Manage Python applications installed in isolated virtualenvs using pipx.\noptions:\n state:\n type: str\n choices: [present, absent, install, uninstall, uninstall_all, inject, upgrade, upgrade_all, reinstall, reinstall_all]\n default: install\n description:\n - Desired state for the application.\n - The states C(present) and C(absent) are aliases to C(install) and C(uninstall), respectively.\n name:\n type: str\n description:\n - >\n The name of the application to be installed. It must to be a simple package name.\n For passing package specifications or installing from URLs or directories,\n please use the I(source) option.\n source:\n type: str\n description:\n - >\n If the application source, such as a package with version specifier, or an URL,\n directory or any other accepted specification. See C(pipx) documentation for more details.\n - When specified, the C(pipx) command will use I(source) instead of I(name).\n install_deps:\n description:\n - Include applications of dependent packages.\n - Only used when I(state=install) or I(state=upgrade).\n type: bool\n default: false\n inject_packages:\n description:\n - Packages to be injected into an existing virtual environment.\n - Only used when I(state=inject).\n type: list\n elements: str\n force:\n description:\n - Force modification of the application's virtual environment. See C(pipx) for details.\n - Only used when I(state=install), I(state=upgrade), I(state=upgrade_all), or I(state=inject).\n type: bool\n default: false\n include_injected:\n description:\n - Upgrade the injected packages along with the application.\n - Only used when I(state=upgrade) or I(state=upgrade_all).\n type: bool\n default: false\n index_url:\n description:\n - Base URL of Python Package Index.\n - Only used when I(state=install), I(state=upgrade), or I(state=inject).\n type: str\n python:\n description:\n - Python version to be used when creating the application virtual environment. 
Must be 3.6+.\n - Only used when I(state=install), I(state=reinstall), or I(state=reinstall_all).\n type: str\n executable:\n description:\n - Path to the C(pipx) installed in the system.\n - >\n If not specified, the module will use C(python -m pipx) to run the tool,\n using the same Python interpreter as ansible itself.\n type: path\nnotes:\n - This module does not install the C(pipx) python package, however that can be easily done with the module M(ansible.builtin.pip).\n - This module does not require C(pipx) to be in the shell C(PATH), but it must be loadable by Python as a module.\n - Please note that C(pipx) requires Python 3.6 or above.\n - >\n This first implementation does not verify whether a specified version constraint has been installed or not.\n Hence, when using version operators, C(pipx) module will always try to execute the operation,\n even when the application was previously installed.\n This feature will be added in the future.\n - See also the C(pipx) documentation at U(https://pypa.github.io/pipx/).\nauthor:\n - \"Alexei Znamensky (@russoz)\"\n'''\n\nEXAMPLES = '''\n- name: Install tox\n community.general.pipx:\n name: tox\n\n- name: Install tox from git repository\n community.general.pipx:\n name: tox\n source: git+https://github.com/tox-dev/tox.git\n\n- name: Upgrade tox\n community.general.pipx:\n name: tox\n state: upgrade\n\n- name: Reinstall black with specific Python version\n community.general.pipx:\n name: black\n state: reinstall\n python: 3.7\n\n- name: Uninstall pycowsay\n community.general.pipx:\n name: pycowsay\n state: absent\n'''\n\n\nimport json\n\nfrom ansible_collections.community.general.plugins.module_utils.module_helper import (\n CmdStateModuleHelper, ArgFormat, ModuleHelperException\n)\nfrom ansible.module_utils.facts.compat import ansible_facts\n\n\n_state_map = dict(\n present='install',\n absent='uninstall',\n uninstall_all='uninstall-all',\n upgrade_all='upgrade-all',\n reinstall_all='reinstall-all',\n)\n\n\nclass PipX(CmdStateModuleHelper):\n output_params = ['name', 'source', 'index_url', 'force', 'installdeps']\n module = dict(\n argument_spec=dict(\n state=dict(type='str', default='install',\n choices=[\n 'present', 'absent', 'install', 'uninstall', 'uninstall_all',\n 'inject', 'upgrade', 'upgrade_all', 'reinstall', 'reinstall_all']),\n name=dict(type='str'),\n source=dict(type='str'),\n install_deps=dict(type='bool', default=False),\n inject_packages=dict(type='list', elements='str'),\n force=dict(type='bool', default=False),\n include_injected=dict(type='bool', default=False),\n index_url=dict(type='str'),\n python=dict(type='str'),\n executable=dict(type='path')\n ),\n required_if=[\n ('state', 'present', ['name']),\n ('state', 'install', ['name']),\n ('state', 'absent', ['name']),\n ('state', 'uninstall', ['name']),\n ('state', 'inject', ['name', 'inject_packages']),\n ],\n supports_check_mode=True,\n )\n command_args_formats = dict(\n state=dict(fmt=lambda v: [_state_map.get(v, v)]),\n name_source=dict(fmt=lambda n, s: [s] if s else [n], stars=1),\n install_deps=dict(fmt=\"--include-deps\", style=ArgFormat.BOOLEAN),\n inject_packages=dict(fmt=lambda v: v),\n force=dict(fmt=\"--force\", style=ArgFormat.BOOLEAN),\n include_injected=dict(fmt=\"--include-injected\", style=ArgFormat.BOOLEAN),\n index_url=dict(fmt=('--index-url', '{0}'),),\n python=dict(fmt=('--python', '{0}'),),\n _list=dict(fmt=('list', '--include-injected', '--json'), style=ArgFormat.BOOLEAN),\n )\n check_rc = True\n run_command_fixed_options = dict(\n 
environ_update={'USE_EMOJI': '0'}\n )\n\n def _retrieve_installed(self):\n def process_list(rc, out, err):\n if not out:\n return {}\n\n results = {}\n raw_data = json.loads(out)\n for venv_name, venv in raw_data['venvs'].items():\n results[venv_name] = {\n 'version': venv['metadata']['main_package']['package_version'],\n 'injected': dict(\n (k, v['package_version']) for k, v in venv['metadata']['injected_packages'].items()\n ),\n }\n return results\n\n installed = self.run_command(params=[{'_list': True}], process_output=process_list,\n publish_rc=False, publish_out=False, publish_err=False, publish_cmd=False)\n\n if self.vars.name is not None:\n app_list = installed.get(self.vars.name)\n if app_list:\n return {self.vars.name: app_list}\n else:\n return {}\n\n return installed\n\n def __init_module__(self):\n if self.vars.executable:\n self.command = [self.vars.executable]\n else:\n facts = ansible_facts(self.module, gather_subset=['python'])\n self.command = [facts['python']['executable'], '-m', 'pipx']\n\n self.vars.set('application', self._retrieve_installed(), change=True, diff=True)\n\n def __quit_module__(self):\n self.vars.application = self._retrieve_installed()\n\n def state_install(self):\n if not self.vars.application or self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'python',\n {'name_source': [self.vars.name, self.vars.source]}])\n\n state_present = state_install\n\n def state_upgrade(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to upgrade a non-existent application: {0}\".format(self.vars.name))\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'install_deps', 'force', 'name'])\n\n def state_uninstall(self):\n if self.vars.application and not self.module.check_mode:\n self.run_command(params=['state', 'name'])\n\n state_absent = state_uninstall\n\n def state_reinstall(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to reinstall a non-existent application: {0}\".format(self.vars.name))\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'name', 'python'])\n\n def state_inject(self):\n if not self.vars.application:\n raise ModuleHelperException(\n \"Trying to inject packages into a non-existent application: {0}\".format(self.vars.name))\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'index_url', 'force', 'name', 'inject_packages'])\n\n def state_uninstall_all(self):\n if not self.module.check_mode:\n self.run_command(params=['state'])\n\n def state_reinstall_all(self):\n if not self.module.check_mode:\n self.run_command(params=['state', 'python'])\n\n def state_upgrade_all(self):\n if self.vars.force:\n self.changed = True\n if not self.module.check_mode:\n self.run_command(params=['state', 'include_injected', 'force'])\n\n\ndef main():\n PipX.execute()\n\n\nif __name__ == '__main__':\n main()\n", "path": "plugins/modules/packaging/language/pipx.py"}]} | 3,945 | 190 |
gh_patches_debug_10022 | rasdani/github-patches | git_diff | bokeh__bokeh-6724 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Set initial date in date picker in models/file/widgets
This is needed so that the image diff does not fail when the example is run on different days.
--- END ISSUE ---
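For orientation, the change the issue asks for amounts to giving the `DatePicker` a fixed initial value so the rendered output no longer depends on the current day. A minimal sketch of that idea (the concrete date is arbitrary; the golden diff for this record happens to use 2017-08-01):

```python
from datetime import date

from bokeh.models.widgets import DatePicker

# A fixed initial value keeps the rendered widget identical across runs,
# so an image-based diff no longer varies with the day the example is executed.
date_picker = DatePicker(value=date(2017, 8, 1))
```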
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/models/file/widgets.py`
Content:
```
1 from __future__ import print_function
2
3 #from datetime import date
4
5 from bokeh.document import Document
6 from bokeh.embed import file_html
7 from bokeh.resources import INLINE
8 from bokeh.util.browser import view
9 from bokeh.models import ColumnDataSource
10 from bokeh.models.layouts import Column, Row, WidgetBox
11 from bokeh.models.widgets import (
12 Button, Toggle, Dropdown,
13 CheckboxGroup, RadioGroup,
14 CheckboxButtonGroup, RadioButtonGroup,
15 TextInput, AutocompleteInput,
16 Select, MultiSelect,
17 Slider, RangeSlider, #DateRangeSlider,
18 DatePicker,
19 Paragraph, Div, PreText,
20 Panel, Tabs,
21 DataTable, TableColumn,
22 StringFormatter, NumberFormatter,
23 StringEditor, IntEditor, NumberEditor, SelectEditor,
24 )
25 from bokeh.plotting import figure
26 from bokeh.sampledata.iris import flowers
27 from bokeh.sampledata.autompg2 import autompg2 as mpg
28
29 button = Button(label="Button (disabled) - still has click event", button_type="primary", disabled=True)
30 toggle = Toggle(label="Toggle button", button_type="success")
31
32 menu = [("Item 1", "item_1_value"), ("Item 2", "item_2_value"), ("Item 3", "item_3_value")]
33
34 dropdown = Dropdown(label="Dropdown button", button_type="warning", menu=menu)
35 #dropdown_split = Dropdown(label="Split button", button_type="danger", menu=menu, default_value="default"))
36
37 checkbox_group = CheckboxGroup(labels=["Option 1", "Option 2", "Option 3"], active=[0, 1])
38 radio_group = RadioGroup(labels=["Option 1", "Option 2", "Option 3"], active=0)
39
40 checkbox_button_group = CheckboxButtonGroup(labels=["Option 1", "Option 2", "Option 3"], active=[0, 1])
41 radio_button_group = RadioButtonGroup(labels=["Option 1", "Option 2", "Option 3"], active=0)
42
43 text_input = TextInput(placeholder="Enter value ...")
44
45 completions = ["aaa", "aab", "aac", "baa", "caa"]
46 autocomplete_input = AutocompleteInput(placeholder="Enter value ...", completions=completions)
47
48 select = Select(options=["Option 1", "Option 2", "Option 3"])
49
50 multi_select = MultiSelect(options=["Option %d" % (i+1) for i in range(16)], size=6)
51
52 slider = Slider(value=10, start=0, end=100, step=0.5)
53
54 range_slider = RangeSlider(value=[10, 90], start=0, end=100, step=0.5)
55
56 #date_range_slider = DateRangeSlider(value=(date(2016, 1, 1), date(2016, 12, 31)))
57
58 date_picker = DatePicker()
59
60 paragraph = Paragraph(text="some text")
61
62 div = Div(text="some <b>text</b>")
63
64 pre_text = PreText(text="some text")
65
66 def mk_tab(color):
67 plot = figure(plot_width=300, plot_height=300)
68 plot.scatter(flowers["petal_length"], flowers["petal_width"], color=color, fill_alpha=0.2, size=12)
69 return Panel(title="Tab 1: %s" % color.capitalize(), child=plot)
70
71 tabs = Tabs(tabs=[mk_tab("red"), mk_tab("green"), mk_tab("blue")])
72
73 source = ColumnDataSource(data=mpg)
74 columns = [
75 TableColumn(field="manufacturer",
76 title="Manufacturer",
77 editor=SelectEditor(options=sorted(mpg["manufacturer"].unique())),
78 formatter=StringFormatter(font_style="bold")),
79 TableColumn(field="model",
80 title="Model",
81 editor=StringEditor(completions=sorted(mpg["model"].unique()))),
82 TableColumn(field="displ",
83 title="Displacement",
84 editor=NumberEditor(step=0.1),
85 formatter=NumberFormatter(format="0.0")),
86 TableColumn(field="year",
87 title="Year",
88 editor=IntEditor()),
89 TableColumn(field="cyl",
90 title="Cylinders",
91 editor=IntEditor()),
92 TableColumn(field="trans",
93 title="Transmission",
94 editor=SelectEditor(options=sorted(mpg["trans"].unique()))),
95 TableColumn(field="drv",
96 title="Drive",
97 editor=SelectEditor(options=sorted(mpg["drv"].unique()))),
98 TableColumn(field="class",
99 title="Class",
100 editor=SelectEditor(options=sorted(mpg["class"].unique()))),
101 TableColumn(field="cty",
102 title="City MPG",
103 editor=IntEditor()),
104 TableColumn(field="hwy",
105 title="Highway MPG",
106 editor=IntEditor()),
107 ]
108 table = DataTable(source=source, columns=columns, editable=True, width=800)
109
110 widgets = Column(children=[
111 Row(children=[
112 WidgetBox(children=[
113 button, toggle, dropdown, #dropdown_split,
114 checkbox_group, radio_group,
115 checkbox_button_group, radio_button_group,
116 ]),
117 WidgetBox(children=[
118 text_input, autocomplete_input,
119 select, multi_select,
120 slider, range_slider, #date_range_slider,
121 date_picker,
122 paragraph, div, pre_text,
123 ]),
124 WidgetBox(children=[
125 tabs,
126 ], width=400),
127 ]),
128 WidgetBox(children=[table]),
129 ])
130
131
132 doc = Document()
133 doc.add_root(widgets)
134
135 if __name__ == "__main__":
136 doc.validate()
137 filename = "widgets.html"
138 with open(filename, "w") as f:
139 f.write(file_html(doc, INLINE, "Widgets"))
140 print("Wrote %s" % filename)
141 view(filename)
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/models/file/widgets.py b/examples/models/file/widgets.py
--- a/examples/models/file/widgets.py
+++ b/examples/models/file/widgets.py
@@ -1,6 +1,6 @@
from __future__ import print_function
-#from datetime import date
+from datetime import date
from bokeh.document import Document
from bokeh.embed import file_html
@@ -55,7 +55,7 @@
#date_range_slider = DateRangeSlider(value=(date(2016, 1, 1), date(2016, 12, 31)))
-date_picker = DatePicker()
+date_picker = DatePicker(value=date(2017, 8, 1))
paragraph = Paragraph(text="some text")
| {"golden_diff": "diff --git a/examples/models/file/widgets.py b/examples/models/file/widgets.py\n--- a/examples/models/file/widgets.py\n+++ b/examples/models/file/widgets.py\n@@ -1,6 +1,6 @@\n from __future__ import print_function\n \n-#from datetime import date\n+from datetime import date\n \n from bokeh.document import Document\n from bokeh.embed import file_html\n@@ -55,7 +55,7 @@\n \n #date_range_slider = DateRangeSlider(value=(date(2016, 1, 1), date(2016, 12, 31)))\n \n-date_picker = DatePicker()\n+date_picker = DatePicker(value=date(2017, 8, 1))\n \n paragraph = Paragraph(text=\"some text\")\n", "issue": "Set initial date in date picker in models/file/widgets\nThis is needed to make image diff not fail when example is run on different days.\r\n\n", "before_files": [{"content": "from __future__ import print_function\n\n#from datetime import date\n\nfrom bokeh.document import Document\nfrom bokeh.embed import file_html\nfrom bokeh.resources import INLINE\nfrom bokeh.util.browser import view\nfrom bokeh.models import ColumnDataSource\nfrom bokeh.models.layouts import Column, Row, WidgetBox\nfrom bokeh.models.widgets import (\n Button, Toggle, Dropdown,\n CheckboxGroup, RadioGroup,\n CheckboxButtonGroup, RadioButtonGroup,\n TextInput, AutocompleteInput,\n Select, MultiSelect,\n Slider, RangeSlider, #DateRangeSlider,\n DatePicker,\n Paragraph, Div, PreText,\n Panel, Tabs,\n DataTable, TableColumn,\n StringFormatter, NumberFormatter,\n StringEditor, IntEditor, NumberEditor, SelectEditor,\n)\nfrom bokeh.plotting import figure\nfrom bokeh.sampledata.iris import flowers\nfrom bokeh.sampledata.autompg2 import autompg2 as mpg\n\nbutton = Button(label=\"Button (disabled) - still has click event\", button_type=\"primary\", disabled=True)\ntoggle = Toggle(label=\"Toggle button\", button_type=\"success\")\n\nmenu = [(\"Item 1\", \"item_1_value\"), (\"Item 2\", \"item_2_value\"), (\"Item 3\", \"item_3_value\")]\n\ndropdown = Dropdown(label=\"Dropdown button\", button_type=\"warning\", menu=menu)\n#dropdown_split = Dropdown(label=\"Split button\", button_type=\"danger\", menu=menu, default_value=\"default\"))\n\ncheckbox_group = CheckboxGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=[0, 1])\nradio_group = RadioGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=0)\n\ncheckbox_button_group = CheckboxButtonGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=[0, 1])\nradio_button_group = RadioButtonGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=0)\n\ntext_input = TextInput(placeholder=\"Enter value ...\")\n\ncompletions = [\"aaa\", \"aab\", \"aac\", \"baa\", \"caa\"]\nautocomplete_input = AutocompleteInput(placeholder=\"Enter value ...\", completions=completions)\n\nselect = Select(options=[\"Option 1\", \"Option 2\", \"Option 3\"])\n\nmulti_select = MultiSelect(options=[\"Option %d\" % (i+1) for i in range(16)], size=6)\n\nslider = Slider(value=10, start=0, end=100, step=0.5)\n\nrange_slider = RangeSlider(value=[10, 90], start=0, end=100, step=0.5)\n\n#date_range_slider = DateRangeSlider(value=(date(2016, 1, 1), date(2016, 12, 31)))\n\ndate_picker = DatePicker()\n\nparagraph = Paragraph(text=\"some text\")\n\ndiv = Div(text=\"some <b>text</b>\")\n\npre_text = PreText(text=\"some text\")\n\ndef mk_tab(color):\n plot = figure(plot_width=300, plot_height=300)\n plot.scatter(flowers[\"petal_length\"], flowers[\"petal_width\"], color=color, fill_alpha=0.2, size=12)\n return Panel(title=\"Tab 1: %s\" % color.capitalize(), child=plot)\n\ntabs = 
Tabs(tabs=[mk_tab(\"red\"), mk_tab(\"green\"), mk_tab(\"blue\")])\n\nsource = ColumnDataSource(data=mpg)\ncolumns = [\n TableColumn(field=\"manufacturer\",\n title=\"Manufacturer\",\n editor=SelectEditor(options=sorted(mpg[\"manufacturer\"].unique())),\n formatter=StringFormatter(font_style=\"bold\")),\n TableColumn(field=\"model\",\n title=\"Model\",\n editor=StringEditor(completions=sorted(mpg[\"model\"].unique()))),\n TableColumn(field=\"displ\",\n title=\"Displacement\",\n editor=NumberEditor(step=0.1),\n formatter=NumberFormatter(format=\"0.0\")),\n TableColumn(field=\"year\",\n title=\"Year\",\n editor=IntEditor()),\n TableColumn(field=\"cyl\",\n title=\"Cylinders\",\n editor=IntEditor()),\n TableColumn(field=\"trans\",\n title=\"Transmission\",\n editor=SelectEditor(options=sorted(mpg[\"trans\"].unique()))),\n TableColumn(field=\"drv\",\n title=\"Drive\",\n editor=SelectEditor(options=sorted(mpg[\"drv\"].unique()))),\n TableColumn(field=\"class\",\n title=\"Class\",\n editor=SelectEditor(options=sorted(mpg[\"class\"].unique()))),\n TableColumn(field=\"cty\",\n title=\"City MPG\",\n editor=IntEditor()),\n TableColumn(field=\"hwy\",\n title=\"Highway MPG\",\n editor=IntEditor()),\n]\ntable = DataTable(source=source, columns=columns, editable=True, width=800)\n\nwidgets = Column(children=[\n Row(children=[\n WidgetBox(children=[\n button, toggle, dropdown, #dropdown_split,\n checkbox_group, radio_group,\n checkbox_button_group, radio_button_group,\n ]),\n WidgetBox(children=[\n text_input, autocomplete_input,\n select, multi_select,\n slider, range_slider, #date_range_slider,\n date_picker,\n paragraph, div, pre_text,\n ]),\n WidgetBox(children=[\n tabs,\n ], width=400),\n ]),\n WidgetBox(children=[table]),\n])\n\n\ndoc = Document()\ndoc.add_root(widgets)\n\nif __name__ == \"__main__\":\n doc.validate()\n filename = \"widgets.html\"\n with open(filename, \"w\") as f:\n f.write(file_html(doc, INLINE, \"Widgets\"))\n print(\"Wrote %s\" % filename)\n view(filename)\n", "path": "examples/models/file/widgets.py"}], "after_files": [{"content": "from __future__ import print_function\n\nfrom datetime import date\n\nfrom bokeh.document import Document\nfrom bokeh.embed import file_html\nfrom bokeh.resources import INLINE\nfrom bokeh.util.browser import view\nfrom bokeh.models import ColumnDataSource\nfrom bokeh.models.layouts import Column, Row, WidgetBox\nfrom bokeh.models.widgets import (\n Button, Toggle, Dropdown,\n CheckboxGroup, RadioGroup,\n CheckboxButtonGroup, RadioButtonGroup,\n TextInput, AutocompleteInput,\n Select, MultiSelect,\n Slider, RangeSlider, #DateRangeSlider,\n DatePicker,\n Paragraph, Div, PreText,\n Panel, Tabs,\n DataTable, TableColumn,\n StringFormatter, NumberFormatter,\n StringEditor, IntEditor, NumberEditor, SelectEditor,\n)\nfrom bokeh.plotting import figure\nfrom bokeh.sampledata.iris import flowers\nfrom bokeh.sampledata.autompg2 import autompg2 as mpg\n\nbutton = Button(label=\"Button (disabled) - still has click event\", button_type=\"primary\", disabled=True)\ntoggle = Toggle(label=\"Toggle button\", button_type=\"success\")\n\nmenu = [(\"Item 1\", \"item_1_value\"), (\"Item 2\", \"item_2_value\"), (\"Item 3\", \"item_3_value\")]\n\ndropdown = Dropdown(label=\"Dropdown button\", button_type=\"warning\", menu=menu)\n#dropdown_split = Dropdown(label=\"Split button\", button_type=\"danger\", menu=menu, default_value=\"default\"))\n\ncheckbox_group = CheckboxGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=[0, 1])\nradio_group = 
RadioGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=0)\n\ncheckbox_button_group = CheckboxButtonGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=[0, 1])\nradio_button_group = RadioButtonGroup(labels=[\"Option 1\", \"Option 2\", \"Option 3\"], active=0)\n\ntext_input = TextInput(placeholder=\"Enter value ...\")\n\ncompletions = [\"aaa\", \"aab\", \"aac\", \"baa\", \"caa\"]\nautocomplete_input = AutocompleteInput(placeholder=\"Enter value ...\", completions=completions)\n\nselect = Select(options=[\"Option 1\", \"Option 2\", \"Option 3\"])\n\nmulti_select = MultiSelect(options=[\"Option %d\" % (i+1) for i in range(16)], size=6)\n\nslider = Slider(value=10, start=0, end=100, step=0.5)\n\nrange_slider = RangeSlider(value=[10, 90], start=0, end=100, step=0.5)\n\n#date_range_slider = DateRangeSlider(value=(date(2016, 1, 1), date(2016, 12, 31)))\n\ndate_picker = DatePicker(value=date(2017, 8, 1))\n\nparagraph = Paragraph(text=\"some text\")\n\ndiv = Div(text=\"some <b>text</b>\")\n\npre_text = PreText(text=\"some text\")\n\ndef mk_tab(color):\n plot = figure(plot_width=300, plot_height=300)\n plot.scatter(flowers[\"petal_length\"], flowers[\"petal_width\"], color=color, fill_alpha=0.2, size=12)\n return Panel(title=\"Tab 1: %s\" % color.capitalize(), child=plot)\n\ntabs = Tabs(tabs=[mk_tab(\"red\"), mk_tab(\"green\"), mk_tab(\"blue\")])\n\nsource = ColumnDataSource(data=mpg)\ncolumns = [\n TableColumn(field=\"manufacturer\",\n title=\"Manufacturer\",\n editor=SelectEditor(options=sorted(mpg[\"manufacturer\"].unique())),\n formatter=StringFormatter(font_style=\"bold\")),\n TableColumn(field=\"model\",\n title=\"Model\",\n editor=StringEditor(completions=sorted(mpg[\"model\"].unique()))),\n TableColumn(field=\"displ\",\n title=\"Displacement\",\n editor=NumberEditor(step=0.1),\n formatter=NumberFormatter(format=\"0.0\")),\n TableColumn(field=\"year\",\n title=\"Year\",\n editor=IntEditor()),\n TableColumn(field=\"cyl\",\n title=\"Cylinders\",\n editor=IntEditor()),\n TableColumn(field=\"trans\",\n title=\"Transmission\",\n editor=SelectEditor(options=sorted(mpg[\"trans\"].unique()))),\n TableColumn(field=\"drv\",\n title=\"Drive\",\n editor=SelectEditor(options=sorted(mpg[\"drv\"].unique()))),\n TableColumn(field=\"class\",\n title=\"Class\",\n editor=SelectEditor(options=sorted(mpg[\"class\"].unique()))),\n TableColumn(field=\"cty\",\n title=\"City MPG\",\n editor=IntEditor()),\n TableColumn(field=\"hwy\",\n title=\"Highway MPG\",\n editor=IntEditor()),\n]\ntable = DataTable(source=source, columns=columns, editable=True, width=800)\n\nwidgets = Column(children=[\n Row(children=[\n WidgetBox(children=[\n button, toggle, dropdown, #dropdown_split,\n checkbox_group, radio_group,\n checkbox_button_group, radio_button_group,\n ]),\n WidgetBox(children=[\n text_input, autocomplete_input,\n select, multi_select,\n slider, range_slider, #date_range_slider,\n date_picker,\n paragraph, div, pre_text,\n ]),\n WidgetBox(children=[\n tabs,\n ], width=400),\n ]),\n WidgetBox(children=[table]),\n])\n\n\ndoc = Document()\ndoc.add_root(widgets)\n\nif __name__ == \"__main__\":\n doc.validate()\n filename = \"widgets.html\"\n with open(filename, \"w\") as f:\n f.write(file_html(doc, INLINE, \"Widgets\"))\n print(\"Wrote %s\" % filename)\n view(filename)\n", "path": "examples/models/file/widgets.py"}]} | 1,837 | 163 |
gh_patches_debug_17871 | rasdani/github-patches | git_diff | fal-ai__dbt-fal-779 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support sqlserver
Add support for Microsoft SQL Server.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py`
Content:
```
1 import functools
2 from time import sleep
3 from typing import Any
4
5 import pandas as pd
6 import sqlalchemy
7 from contextlib import contextmanager
8 from dbt.adapters.base import BaseAdapter, BaseRelation, RelationType
9 from dbt.adapters.base.connections import AdapterResponse, Connection
10 from dbt.config import RuntimeConfig
11 from dbt.parser.manifest import MacroManifest, Manifest, ManifestLoader
12
13 from dbt.adapters import factory
14
15 _SQLALCHEMY_DIALECTS = {
16 "redshift": "redshift+psycopg2",
17 }
18
19
20 def _get_alchemy_engine(adapter: BaseAdapter, connection: Connection) -> Any:
21 # The following code heavily depends on the implementation
22 # details of the known adapters, hence it can't work for
23 # arbitrary ones.
24 adapter_type = adapter.type()
25
26 sqlalchemy_kwargs = {}
27 format_url = lambda url: url
28 if adapter_type == 'trino':
29 import dbt.adapters.fal_experimental.support.trino as support_trino
30 return support_trino.create_engine(adapter)
31
32 if adapter_type == "redshift":
33 # If the given adapter supports the DBAPI (PEP 249), we can
34 # use its connection directly for the engine.
35 sqlalchemy_kwargs["creator"] = lambda *args, **kwargs: connection.handle
36 url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + "://"
37 url = format_url(url)
38 else:
39 message = (
40 f"dbt-fal does not support {adapter_type} adapter. ",
41 f"If you need {adapter_type} support, you can create an issue ",
42 "in our GitHub repository: https://github.com/fal-ai/fal. ",
43 "We will look into it ASAP."
44 )
45 raise NotImplementedError(message)
46
47 return sqlalchemy.create_engine(url, **sqlalchemy_kwargs)
48
49
50 def drop_relation_if_it_exists(adapter: BaseAdapter, relation: BaseRelation) -> None:
51 if adapter.get_relation(
52 database=relation.database,
53 schema=relation.schema,
54 identifier=relation.identifier,
55 ):
56 adapter.drop_relation(relation)
57
58
59 def write_df_to_relation(
60 adapter: BaseAdapter,
61 relation: BaseRelation,
62 dataframe: pd.DataFrame,
63 *,
64 if_exists: str = "replace",
65 ) -> AdapterResponse:
66 """Generic version of the write_df_to_relation. Materialize the given
67 dataframe to the targeted relation on the adapter."""
68
69 adapter_type = adapter.type()
70
71 if adapter_type == "snowflake":
72 import dbt.adapters.fal_experimental.support.snowflake as support_snowflake
73
74 return support_snowflake.write_df_to_relation(adapter, dataframe, relation)
75
76 elif adapter_type == "bigquery":
77 import dbt.adapters.fal_experimental.support.bigquery as support_bq
78
79 return support_bq.write_df_to_relation(adapter, dataframe, relation)
80
81 elif adapter_type == "duckdb":
82 import dbt.adapters.fal_experimental.support.duckdb as support_duckdb
83
84 return support_duckdb.write_df_to_relation(adapter, dataframe, relation)
85
86 elif adapter_type == "postgres":
87 import dbt.adapters.fal_experimental.support.postgres as support_postgres
88
89 return support_postgres.write_df_to_relation(adapter, dataframe, relation)
90
91 else:
92 with new_connection(adapter, "fal:write_df_to_relation") as connection:
93 # TODO: this should probably live in the materialization macro.
94 temp_relation = relation.replace_path(
95 identifier=f"__dbt_fal_temp_{relation.identifier}"
96 )
97 drop_relation_if_it_exists(adapter, temp_relation)
98
99 alchemy_engine = _get_alchemy_engine(adapter, connection)
100
101 # TODO: probably worth handling errors here an returning
102 # a proper adapter response.
103 rows_affected = dataframe.to_sql(
104 con=alchemy_engine,
105 name=temp_relation.identifier,
106 schema=temp_relation.schema,
107 if_exists=if_exists,
108 index=False,
109 )
110 adapter.cache.add(temp_relation)
111 drop_relation_if_it_exists(adapter, relation)
112 adapter.rename_relation(temp_relation, relation)
113 adapter.commit_if_has_connection()
114
115 return AdapterResponse("OK", rows_affected=rows_affected)
116
117
118 def read_relation_as_df(adapter: BaseAdapter, relation: BaseRelation) -> pd.DataFrame:
119 """Generic version of the read_df_from_relation."""
120
121 adapter_type = adapter.type()
122
123 if adapter_type == "snowflake":
124 import dbt.adapters.fal_experimental.support.snowflake as support_snowflake
125
126 return support_snowflake.read_relation_as_df(adapter, relation)
127
128 elif adapter_type == "bigquery":
129 import dbt.adapters.fal_experimental.support.bigquery as support_bq
130
131 return support_bq.read_relation_as_df(adapter, relation)
132
133 elif adapter_type == "duckdb":
134 import dbt.adapters.fal_experimental.support.duckdb as support_duckdb
135
136 return support_duckdb.read_relation_as_df(adapter, relation)
137
138 elif adapter_type == "postgres":
139 import dbt.adapters.fal_experimental.support.postgres as support_postgres
140
141 return support_postgres.read_relation_as_df(adapter, relation)
142
143 else:
144 with new_connection(adapter, "fal:read_relation_as_df") as connection:
145 alchemy_engine = _get_alchemy_engine(adapter, connection)
146 return pd.read_sql_table(
147 con=alchemy_engine,
148 table_name=relation.identifier,
149 schema=relation.schema,
150 )
151
152
153 def prepare_for_adapter(adapter: BaseAdapter, function: Any) -> Any:
154 """Prepare the given function to be used with string-like inputs
155 (for relations) on the given adapter."""
156
157 @functools.wraps(function)
158 def wrapped(quoted_relation: str, *args, **kwargs) -> Any:
159 relation = adapter.Relation.create(
160 *quoted_relation.split("."), type=RelationType.Table
161 )
162 return function(adapter, relation, *args, **kwargs)
163
164 return wrapped
165
166
167 def reconstruct_adapter(config: RuntimeConfig, manifest: Manifest, macro_manifest: MacroManifest) -> BaseAdapter:
168 from dbt.tracking import do_not_track
169
170 # Prepare the DBT to not to track us.
171 do_not_track()
172
173 # Prepare the plugin loading system to handle the adapter.
174 factory.load_plugin(config.credentials.type)
175 factory.load_plugin(config.python_adapter_credentials.type)
176 factory.register_adapter(config)
177
178 # Initialize the adapter
179 db_adapter = factory.get_adapter(config)
180 db_adapter._macro_manifest_lazy = macro_manifest
181 reload_adapter_cache(db_adapter, manifest)
182
183 return db_adapter
184
185
186 def reload_adapter_cache(adapter: BaseAdapter, manifest: Manifest) -> None:
187 with new_connection(adapter, "fal:reload_adapter_cache"):
188 adapter.set_relations_cache(manifest, True)
189
190
191 @contextmanager
192 def new_connection(adapter: BaseAdapter, connection_name: str) -> Connection:
193 with adapter.connection_named(connection_name):
194 yield adapter.connections.get_thread_connection()
195
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py b/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py
--- a/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py
+++ b/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py
@@ -14,6 +14,7 @@
_SQLALCHEMY_DIALECTS = {
"redshift": "redshift+psycopg2",
+ "sqlserver": "mssql+pyodbc",
}
@@ -35,6 +36,10 @@
sqlalchemy_kwargs["creator"] = lambda *args, **kwargs: connection.handle
url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + "://"
url = format_url(url)
+ elif adapter_type == "sqlserver":
+ sqlalchemy_kwargs["creator"] = lambda *args, **kwargs: connection.handle
+ url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + "://"
+ url = format_url(url)
else:
message = (
f"dbt-fal does not support {adapter_type} adapter. ",
| {"golden_diff": "diff --git a/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py b/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py\n--- a/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py\n+++ b/projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py\n@@ -14,6 +14,7 @@\n \n _SQLALCHEMY_DIALECTS = {\n \"redshift\": \"redshift+psycopg2\",\n+ \"sqlserver\": \"mssql+pyodbc\",\n }\n \n \n@@ -35,6 +36,10 @@\n sqlalchemy_kwargs[\"creator\"] = lambda *args, **kwargs: connection.handle\n url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + \"://\"\n url = format_url(url)\n+ elif adapter_type == \"sqlserver\":\n+ sqlalchemy_kwargs[\"creator\"] = lambda *args, **kwargs: connection.handle\n+ url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + \"://\"\n+ url = format_url(url)\n else:\n message = (\n f\"dbt-fal does not support {adapter_type} adapter. \",\n", "issue": "Support sqlserver\nAdd support for MS sqlserver\n", "before_files": [{"content": "import functools\nfrom time import sleep\nfrom typing import Any\n\nimport pandas as pd\nimport sqlalchemy\nfrom contextlib import contextmanager\nfrom dbt.adapters.base import BaseAdapter, BaseRelation, RelationType\nfrom dbt.adapters.base.connections import AdapterResponse, Connection\nfrom dbt.config import RuntimeConfig\nfrom dbt.parser.manifest import MacroManifest, Manifest, ManifestLoader\n\nfrom dbt.adapters import factory\n\n_SQLALCHEMY_DIALECTS = {\n \"redshift\": \"redshift+psycopg2\",\n}\n\n\ndef _get_alchemy_engine(adapter: BaseAdapter, connection: Connection) -> Any:\n # The following code heavily depends on the implementation\n # details of the known adapters, hence it can't work for\n # arbitrary ones.\n adapter_type = adapter.type()\n\n sqlalchemy_kwargs = {}\n format_url = lambda url: url\n if adapter_type == 'trino':\n import dbt.adapters.fal_experimental.support.trino as support_trino\n return support_trino.create_engine(adapter)\n\n if adapter_type == \"redshift\":\n # If the given adapter supports the DBAPI (PEP 249), we can\n # use its connection directly for the engine.\n sqlalchemy_kwargs[\"creator\"] = lambda *args, **kwargs: connection.handle\n url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + \"://\"\n url = format_url(url)\n else:\n message = (\n f\"dbt-fal does not support {adapter_type} adapter. \",\n f\"If you need {adapter_type} support, you can create an issue \",\n \"in our GitHub repository: https://github.com/fal-ai/fal. \",\n \"We will look into it ASAP.\"\n )\n raise NotImplementedError(message)\n\n return sqlalchemy.create_engine(url, **sqlalchemy_kwargs)\n\n\ndef drop_relation_if_it_exists(adapter: BaseAdapter, relation: BaseRelation) -> None:\n if adapter.get_relation(\n database=relation.database,\n schema=relation.schema,\n identifier=relation.identifier,\n ):\n adapter.drop_relation(relation)\n\n\ndef write_df_to_relation(\n adapter: BaseAdapter,\n relation: BaseRelation,\n dataframe: pd.DataFrame,\n *,\n if_exists: str = \"replace\",\n) -> AdapterResponse:\n \"\"\"Generic version of the write_df_to_relation. 
Materialize the given\n dataframe to the targeted relation on the adapter.\"\"\"\n\n adapter_type = adapter.type()\n\n if adapter_type == \"snowflake\":\n import dbt.adapters.fal_experimental.support.snowflake as support_snowflake\n\n return support_snowflake.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"bigquery\":\n import dbt.adapters.fal_experimental.support.bigquery as support_bq\n\n return support_bq.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"duckdb\":\n import dbt.adapters.fal_experimental.support.duckdb as support_duckdb\n\n return support_duckdb.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"postgres\":\n import dbt.adapters.fal_experimental.support.postgres as support_postgres\n\n return support_postgres.write_df_to_relation(adapter, dataframe, relation)\n\n else:\n with new_connection(adapter, \"fal:write_df_to_relation\") as connection:\n # TODO: this should probably live in the materialization macro.\n temp_relation = relation.replace_path(\n identifier=f\"__dbt_fal_temp_{relation.identifier}\"\n )\n drop_relation_if_it_exists(adapter, temp_relation)\n\n alchemy_engine = _get_alchemy_engine(adapter, connection)\n\n # TODO: probably worth handling errors here an returning\n # a proper adapter response.\n rows_affected = dataframe.to_sql(\n con=alchemy_engine,\n name=temp_relation.identifier,\n schema=temp_relation.schema,\n if_exists=if_exists,\n index=False,\n )\n adapter.cache.add(temp_relation)\n drop_relation_if_it_exists(adapter, relation)\n adapter.rename_relation(temp_relation, relation)\n adapter.commit_if_has_connection()\n\n return AdapterResponse(\"OK\", rows_affected=rows_affected)\n\n\ndef read_relation_as_df(adapter: BaseAdapter, relation: BaseRelation) -> pd.DataFrame:\n \"\"\"Generic version of the read_df_from_relation.\"\"\"\n\n adapter_type = adapter.type()\n\n if adapter_type == \"snowflake\":\n import dbt.adapters.fal_experimental.support.snowflake as support_snowflake\n\n return support_snowflake.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"bigquery\":\n import dbt.adapters.fal_experimental.support.bigquery as support_bq\n\n return support_bq.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"duckdb\":\n import dbt.adapters.fal_experimental.support.duckdb as support_duckdb\n\n return support_duckdb.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"postgres\":\n import dbt.adapters.fal_experimental.support.postgres as support_postgres\n\n return support_postgres.read_relation_as_df(adapter, relation)\n\n else:\n with new_connection(adapter, \"fal:read_relation_as_df\") as connection:\n alchemy_engine = _get_alchemy_engine(adapter, connection)\n return pd.read_sql_table(\n con=alchemy_engine,\n table_name=relation.identifier,\n schema=relation.schema,\n )\n\n\ndef prepare_for_adapter(adapter: BaseAdapter, function: Any) -> Any:\n \"\"\"Prepare the given function to be used with string-like inputs\n (for relations) on the given adapter.\"\"\"\n\n @functools.wraps(function)\n def wrapped(quoted_relation: str, *args, **kwargs) -> Any:\n relation = adapter.Relation.create(\n *quoted_relation.split(\".\"), type=RelationType.Table\n )\n return function(adapter, relation, *args, **kwargs)\n\n return wrapped\n\n\ndef reconstruct_adapter(config: RuntimeConfig, manifest: Manifest, macro_manifest: MacroManifest) -> BaseAdapter:\n from dbt.tracking import do_not_track\n\n # Prepare the DBT to not to track us.\n 
do_not_track()\n\n # Prepare the plugin loading system to handle the adapter.\n factory.load_plugin(config.credentials.type)\n factory.load_plugin(config.python_adapter_credentials.type)\n factory.register_adapter(config)\n\n # Initialize the adapter\n db_adapter = factory.get_adapter(config)\n db_adapter._macro_manifest_lazy = macro_manifest\n reload_adapter_cache(db_adapter, manifest)\n\n return db_adapter\n\n\ndef reload_adapter_cache(adapter: BaseAdapter, manifest: Manifest) -> None:\n with new_connection(adapter, \"fal:reload_adapter_cache\"):\n adapter.set_relations_cache(manifest, True)\n\n\n@contextmanager\ndef new_connection(adapter: BaseAdapter, connection_name: str) -> Connection:\n with adapter.connection_named(connection_name):\n yield adapter.connections.get_thread_connection()\n", "path": "projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py"}], "after_files": [{"content": "import functools\nfrom time import sleep\nfrom typing import Any\n\nimport pandas as pd\nimport sqlalchemy\nfrom contextlib import contextmanager\nfrom dbt.adapters.base import BaseAdapter, BaseRelation, RelationType\nfrom dbt.adapters.base.connections import AdapterResponse, Connection\nfrom dbt.config import RuntimeConfig\nfrom dbt.parser.manifest import MacroManifest, Manifest, ManifestLoader\n\nfrom dbt.adapters import factory\n\n_SQLALCHEMY_DIALECTS = {\n \"redshift\": \"redshift+psycopg2\",\n \"sqlserver\": \"mssql+pyodbc\",\n}\n\n\ndef _get_alchemy_engine(adapter: BaseAdapter, connection: Connection) -> Any:\n # The following code heavily depends on the implementation\n # details of the known adapters, hence it can't work for\n # arbitrary ones.\n adapter_type = adapter.type()\n\n sqlalchemy_kwargs = {}\n format_url = lambda url: url\n if adapter_type == 'trino':\n import dbt.adapters.fal_experimental.support.trino as support_trino\n return support_trino.create_engine(adapter)\n\n if adapter_type == \"redshift\":\n # If the given adapter supports the DBAPI (PEP 249), we can\n # use its connection directly for the engine.\n sqlalchemy_kwargs[\"creator\"] = lambda *args, **kwargs: connection.handle\n url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + \"://\"\n url = format_url(url)\n elif adapter_type == \"sqlserver\":\n sqlalchemy_kwargs[\"creator\"] = lambda *args, **kwargs: connection.handle\n url = _SQLALCHEMY_DIALECTS.get(adapter_type, adapter_type) + \"://\"\n url = format_url(url)\n else:\n message = (\n f\"dbt-fal does not support {adapter_type} adapter. \",\n f\"If you need {adapter_type} support, you can create an issue \",\n \"in our GitHub repository: https://github.com/fal-ai/fal. \",\n \"We will look into it ASAP.\"\n )\n raise NotImplementedError(message)\n\n return sqlalchemy.create_engine(url, **sqlalchemy_kwargs)\n\n\ndef drop_relation_if_it_exists(adapter: BaseAdapter, relation: BaseRelation) -> None:\n if adapter.get_relation(\n database=relation.database,\n schema=relation.schema,\n identifier=relation.identifier,\n ):\n adapter.drop_relation(relation)\n\n\ndef write_df_to_relation(\n adapter: BaseAdapter,\n relation: BaseRelation,\n dataframe: pd.DataFrame,\n *,\n if_exists: str = \"replace\",\n) -> AdapterResponse:\n \"\"\"Generic version of the write_df_to_relation. 
Materialize the given\n dataframe to the targeted relation on the adapter.\"\"\"\n\n adapter_type = adapter.type()\n\n if adapter_type == \"snowflake\":\n import dbt.adapters.fal_experimental.support.snowflake as support_snowflake\n\n return support_snowflake.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"bigquery\":\n import dbt.adapters.fal_experimental.support.bigquery as support_bq\n\n return support_bq.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"duckdb\":\n import dbt.adapters.fal_experimental.support.duckdb as support_duckdb\n\n return support_duckdb.write_df_to_relation(adapter, dataframe, relation)\n\n elif adapter_type == \"postgres\":\n import dbt.adapters.fal_experimental.support.postgres as support_postgres\n\n return support_postgres.write_df_to_relation(adapter, dataframe, relation)\n\n else:\n with new_connection(adapter, \"fal:write_df_to_relation\") as connection:\n # TODO: this should probably live in the materialization macro.\n temp_relation = relation.replace_path(\n identifier=f\"__dbt_fal_temp_{relation.identifier}\"\n )\n drop_relation_if_it_exists(adapter, temp_relation)\n\n alchemy_engine = _get_alchemy_engine(adapter, connection)\n\n # TODO: probably worth handling errors here an returning\n # a proper adapter response.\n rows_affected = dataframe.to_sql(\n con=alchemy_engine,\n name=temp_relation.identifier,\n schema=temp_relation.schema,\n if_exists=if_exists,\n index=False,\n )\n adapter.cache.add(temp_relation)\n drop_relation_if_it_exists(adapter, relation)\n adapter.rename_relation(temp_relation, relation)\n adapter.commit_if_has_connection()\n\n return AdapterResponse(\"OK\", rows_affected=rows_affected)\n\n\ndef read_relation_as_df(adapter: BaseAdapter, relation: BaseRelation) -> pd.DataFrame:\n \"\"\"Generic version of the read_df_from_relation.\"\"\"\n\n adapter_type = adapter.type()\n\n if adapter_type == \"snowflake\":\n import dbt.adapters.fal_experimental.support.snowflake as support_snowflake\n\n return support_snowflake.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"bigquery\":\n import dbt.adapters.fal_experimental.support.bigquery as support_bq\n\n return support_bq.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"duckdb\":\n import dbt.adapters.fal_experimental.support.duckdb as support_duckdb\n\n return support_duckdb.read_relation_as_df(adapter, relation)\n\n elif adapter_type == \"postgres\":\n import dbt.adapters.fal_experimental.support.postgres as support_postgres\n\n return support_postgres.read_relation_as_df(adapter, relation)\n\n else:\n with new_connection(adapter, \"fal:read_relation_as_df\") as connection:\n alchemy_engine = _get_alchemy_engine(adapter, connection)\n return pd.read_sql_table(\n con=alchemy_engine,\n table_name=relation.identifier,\n schema=relation.schema,\n )\n\n\ndef prepare_for_adapter(adapter: BaseAdapter, function: Any) -> Any:\n \"\"\"Prepare the given function to be used with string-like inputs\n (for relations) on the given adapter.\"\"\"\n\n @functools.wraps(function)\n def wrapped(quoted_relation: str, *args, **kwargs) -> Any:\n relation = adapter.Relation.create(\n *quoted_relation.split(\".\"), type=RelationType.Table\n )\n return function(adapter, relation, *args, **kwargs)\n\n return wrapped\n\n\ndef reconstruct_adapter(config: RuntimeConfig, manifest: Manifest, macro_manifest: MacroManifest) -> BaseAdapter:\n from dbt.tracking import do_not_track\n\n # Prepare the DBT to not to track us.\n 
do_not_track()\n\n # Prepare the plugin loading system to handle the adapter.\n factory.load_plugin(config.credentials.type)\n factory.load_plugin(config.python_adapter_credentials.type)\n factory.register_adapter(config)\n\n # Initialize the adapter\n db_adapter = factory.get_adapter(config)\n db_adapter._macro_manifest_lazy = macro_manifest\n reload_adapter_cache(db_adapter, manifest)\n\n return db_adapter\n\n\ndef reload_adapter_cache(adapter: BaseAdapter, manifest: Manifest) -> None:\n with new_connection(adapter, \"fal:reload_adapter_cache\"):\n adapter.set_relations_cache(manifest, True)\n\n\n@contextmanager\ndef new_connection(adapter: BaseAdapter, connection_name: str) -> Connection:\n with adapter.connection_named(connection_name):\n yield adapter.connections.get_thread_connection()\n", "path": "projects/adapter/src/dbt/adapters/fal_experimental/adapter_support.py"}]} | 2,247 | 267 |
gh_patches_debug_5302 | rasdani/github-patches | git_diff | searx__searx-2991 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Only a lower case "ip" displays the IP address
When the feature to show a user's IP address for the query "ip" is enabled, it only works when the query is all lowercase. Querying "IP" does not return an IP. This seems like a bug; apologies if this was intended.
Thanks
--- END ISSUE ---
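For orientation, the behaviour described above is what an exact string comparison produces; normalising the query case is the straightforward remedy, and it is what the golden diff for this record applies. A minimal sketch of the distinction:

```python
query = "IP"

# An exact comparison misses upper- and mixed-case variants of the query.
print(query == "ip")          # False

# A case-insensitive comparison treats "ip", "IP", "Ip", ... the same way.
print(query.lower() == "ip")  # True
```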
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `searx/plugins/self_info.py`
Content:
```
1 '''
2 searx is free software: you can redistribute it and/or modify
3 it under the terms of the GNU Affero General Public License as published by
4 the Free Software Foundation, either version 3 of the License, or
5 (at your option) any later version.
6
7 searx is distributed in the hope that it will be useful,
8 but WITHOUT ANY WARRANTY; without even the implied warranty of
9 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 GNU Affero General Public License for more details.
11
12 You should have received a copy of the GNU Affero General Public License
13 along with searx. If not, see < http://www.gnu.org/licenses/ >.
14
15 (C) 2015 by Adam Tauber, <[email protected]>
16 '''
17 from flask_babel import gettext
18 import re
19 name = gettext('Self Informations')
20 description = gettext('Displays your IP if the query is "ip" and your user agent if the query contains "user agent".')
21 default_on = True
22
23
24 # Self User Agent regex
25 p = re.compile('.*user[ -]agent.*', re.IGNORECASE)
26
27
28 # attach callback to the post search hook
29 # request: flask request object
30 # ctx: the whole local context of the pre search hook
31 def post_search(request, search):
32 if search.search_query.pageno > 1:
33 return True
34 if search.search_query.query == 'ip':
35 x_forwarded_for = request.headers.getlist("X-Forwarded-For")
36 if x_forwarded_for:
37 ip = x_forwarded_for[0]
38 else:
39 ip = request.remote_addr
40 search.result_container.answers['ip'] = {'answer': ip}
41 elif p.match(search.search_query.query):
42 ua = request.user_agent
43 search.result_container.answers['user-agent'] = {'answer': ua}
44 return True
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/searx/plugins/self_info.py b/searx/plugins/self_info.py
--- a/searx/plugins/self_info.py
+++ b/searx/plugins/self_info.py
@@ -31,7 +31,7 @@
def post_search(request, search):
if search.search_query.pageno > 1:
return True
- if search.search_query.query == 'ip':
+ if search.search_query.query.lower() == 'ip':
x_forwarded_for = request.headers.getlist("X-Forwarded-For")
if x_forwarded_for:
ip = x_forwarded_for[0]
| {"golden_diff": "diff --git a/searx/plugins/self_info.py b/searx/plugins/self_info.py\n--- a/searx/plugins/self_info.py\n+++ b/searx/plugins/self_info.py\n@@ -31,7 +31,7 @@\n def post_search(request, search):\n if search.search_query.pageno > 1:\n return True\n- if search.search_query.query == 'ip':\n+ if search.search_query.query.lower() == 'ip':\n x_forwarded_for = request.headers.getlist(\"X-Forwarded-For\")\n if x_forwarded_for:\n ip = x_forwarded_for[0]\n", "issue": "Only a lower case \"ip\" displays the IP address\nWhen the feature is enabled to show a user's IP address when \"ip\" is entered into the search bar, it only does so when it is all lowercase. Querying \"IP\" does not return an IP. This seems like a bug, apologies if this was intended.\r\n\r\nThanks\n", "before_files": [{"content": "'''\nsearx is free software: you can redistribute it and/or modify\nit under the terms of the GNU Affero General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.\n\nsearx is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU Affero General Public License for more details.\n\nYou should have received a copy of the GNU Affero General Public License\nalong with searx. If not, see < http://www.gnu.org/licenses/ >.\n\n(C) 2015 by Adam Tauber, <[email protected]>\n'''\nfrom flask_babel import gettext\nimport re\nname = gettext('Self Informations')\ndescription = gettext('Displays your IP if the query is \"ip\" and your user agent if the query contains \"user agent\".')\ndefault_on = True\n\n\n# Self User Agent regex\np = re.compile('.*user[ -]agent.*', re.IGNORECASE)\n\n\n# attach callback to the post search hook\n# request: flask request object\n# ctx: the whole local context of the pre search hook\ndef post_search(request, search):\n if search.search_query.pageno > 1:\n return True\n if search.search_query.query == 'ip':\n x_forwarded_for = request.headers.getlist(\"X-Forwarded-For\")\n if x_forwarded_for:\n ip = x_forwarded_for[0]\n else:\n ip = request.remote_addr\n search.result_container.answers['ip'] = {'answer': ip}\n elif p.match(search.search_query.query):\n ua = request.user_agent\n search.result_container.answers['user-agent'] = {'answer': ua}\n return True\n", "path": "searx/plugins/self_info.py"}], "after_files": [{"content": "'''\nsearx is free software: you can redistribute it and/or modify\nit under the terms of the GNU Affero General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.\n\nsearx is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU Affero General Public License for more details.\n\nYou should have received a copy of the GNU Affero General Public License\nalong with searx. 
If not, see < http://www.gnu.org/licenses/ >.\n\n(C) 2015 by Adam Tauber, <[email protected]>\n'''\nfrom flask_babel import gettext\nimport re\nname = gettext('Self Informations')\ndescription = gettext('Displays your IP if the query is \"ip\" and your user agent if the query contains \"user agent\".')\ndefault_on = True\n\n\n# Self User Agent regex\np = re.compile('.*user[ -]agent.*', re.IGNORECASE)\n\n\n# attach callback to the post search hook\n# request: flask request object\n# ctx: the whole local context of the pre search hook\ndef post_search(request, search):\n if search.search_query.pageno > 1:\n return True\n if search.search_query.query.lower() == 'ip':\n x_forwarded_for = request.headers.getlist(\"X-Forwarded-For\")\n if x_forwarded_for:\n ip = x_forwarded_for[0]\n else:\n ip = request.remote_addr\n search.result_container.answers['ip'] = {'answer': ip}\n elif p.match(search.search_query.query):\n ua = request.user_agent\n search.result_container.answers['user-agent'] = {'answer': ua}\n return True\n", "path": "searx/plugins/self_info.py"}]} | 808 | 134 |
gh_patches_debug_24158 | rasdani/github-patches | git_diff | pystiche__pystiche-9 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
propagate_guide() of Encoder raises a TypeError
When running the replication of [Gatys et al. 2017](https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/replication/gatys_et_al_2017.py#L254), the following error is raised:
```
TypeError: Unions cannot be used with isinstance().
```
This points towards the [Encoder](https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/encoding/encoder.py#L12), specifically these `if` statements in the `propagate_guide()` method:
https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/encoding/encoder.py#L50-L53
`PoolModule` and `ConvModule` are defined in `pystiche.typing`:
https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/typing.py#L18-L23
--- END ISSUE ---
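For context, a minimal, self-contained illustration of the failure (no pystiche or torch required): `typing.Union` aliases are not classes, so `isinstance()` rejects them; a plain tuple of concrete types is what it expects instead.

```python
# Illustrative snippet only -- reproduces the reported TypeError with stdlib types.
from typing import Union

IntOrFloat = Union[int, float]

try:
    isinstance(3, IntOrFloat)        # typing.Union is not a class -> TypeError
except TypeError as err:
    print(err)                       # e.g. "Unions cannot be used with isinstance()"

print(isinstance(3, (int, float)))   # True -- a tuple of concrete types works
```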
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pystiche/typing.py`
Content:
```
1 from typing import Union, Sequence
2 import torch
3 from torch import nn
4
5 __all__ = [
6 "Numeric",
7 "TensorMeta",
8 "ConvModule",
9 "ConvModuleMeta",
10 "PoolModule",
11 "PoolModuleMeta",
12 ]
13
14 Numeric = Union[int, float]
15
16 TensorMeta = Union[torch.device, torch.dtype]
17
18 ConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv2d]
19 ConvModuleMeta = Union[int, Sequence[int]]
20
21 PoolModule = Union[
22 nn.AvgPool1d, nn.AvgPool2d, nn.AvgPool3d, nn.MaxPool1d, nn.MaxPool2d, nn.MaxPool3d
23 ]
24 PoolModuleMeta = Union[int, Sequence[int]]
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pystiche/typing.py b/pystiche/typing.py
--- a/pystiche/typing.py
+++ b/pystiche/typing.py
@@ -1,4 +1,4 @@
-from typing import Union, Sequence
+from typing import Union, Any, Sequence
import torch
from torch import nn
@@ -6,8 +6,10 @@
"Numeric",
"TensorMeta",
"ConvModule",
+ "is_conv_module",
"ConvModuleMeta",
"PoolModule",
+ "is_pool_module",
"PoolModuleMeta",
]
@@ -15,10 +17,32 @@
TensorMeta = Union[torch.device, torch.dtype]
-ConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv2d]
+ConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv3d]
+
+
+def is_conv_module(x: Any) -> bool:
+ return isinstance(x, (nn.Conv1d, nn.Conv2d, nn.Conv3d))
+
+
ConvModuleMeta = Union[int, Sequence[int]]
PoolModule = Union[
nn.AvgPool1d, nn.AvgPool2d, nn.AvgPool3d, nn.MaxPool1d, nn.MaxPool2d, nn.MaxPool3d
]
+
+
+def is_pool_module(x: Any) -> bool:
+ return isinstance(
+ x,
+ (
+ nn.AvgPool1d,
+ nn.AvgPool2d,
+ nn.AvgPool3d,
+ nn.MaxPool1d,
+ nn.MaxPool2d,
+ nn.MaxPool3d,
+ ),
+ )
+
+
PoolModuleMeta = Union[int, Sequence[int]]
| {"golden_diff": "diff --git a/pystiche/typing.py b/pystiche/typing.py\n--- a/pystiche/typing.py\n+++ b/pystiche/typing.py\n@@ -1,4 +1,4 @@\n-from typing import Union, Sequence\n+from typing import Union, Any, Sequence\n import torch\n from torch import nn\n \n@@ -6,8 +6,10 @@\n \"Numeric\",\n \"TensorMeta\",\n \"ConvModule\",\n+ \"is_conv_module\",\n \"ConvModuleMeta\",\n \"PoolModule\",\n+ \"is_pool_module\",\n \"PoolModuleMeta\",\n ]\n \n@@ -15,10 +17,32 @@\n \n TensorMeta = Union[torch.device, torch.dtype]\n \n-ConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv2d]\n+ConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv3d]\n+\n+\n+def is_conv_module(x: Any) -> bool:\n+ return isinstance(x, (nn.Conv1d, nn.Conv2d, nn.Conv3d))\n+\n+\n ConvModuleMeta = Union[int, Sequence[int]]\n \n PoolModule = Union[\n nn.AvgPool1d, nn.AvgPool2d, nn.AvgPool3d, nn.MaxPool1d, nn.MaxPool2d, nn.MaxPool3d\n ]\n+\n+\n+def is_pool_module(x: Any) -> bool:\n+ return isinstance(\n+ x,\n+ (\n+ nn.AvgPool1d,\n+ nn.AvgPool2d,\n+ nn.AvgPool3d,\n+ nn.MaxPool1d,\n+ nn.MaxPool2d,\n+ nn.MaxPool3d,\n+ ),\n+ )\n+\n+\n PoolModuleMeta = Union[int, Sequence[int]]\n", "issue": "propagate_guide() of Encoder raises a TypeError\nWhen running the replication of [Gatys et al. 2017](https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/replication/gatys_et_al_2017.py#L254), the following error is raised:\r\n\r\n```\r\nTypeError: Unions cannot be used with isinstance().\r\n```\r\n\r\nThis points towards the [Encoder](https://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/encoding/encoder.py#L12), specifically these `if` statements in the `propagate_guide()` method:\r\n\r\nhttps://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/encoding/encoder.py#L50-L53\r\n\r\n`PoolModule` and `ConvModule` are defined in `pystiche.typing`:\r\n\r\nhttps://github.com/pmeier/pystiche/blob/3260b68ea8dd88de433777ad3750d7abe3894743/pystiche/typing.py#L18-L23\r\n\n", "before_files": [{"content": "from typing import Union, Sequence\nimport torch\nfrom torch import nn\n\n__all__ = [\n \"Numeric\",\n \"TensorMeta\",\n \"ConvModule\",\n \"ConvModuleMeta\",\n \"PoolModule\",\n \"PoolModuleMeta\",\n]\n\nNumeric = Union[int, float]\n\nTensorMeta = Union[torch.device, torch.dtype]\n\nConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv2d]\nConvModuleMeta = Union[int, Sequence[int]]\n\nPoolModule = Union[\n nn.AvgPool1d, nn.AvgPool2d, nn.AvgPool3d, nn.MaxPool1d, nn.MaxPool2d, nn.MaxPool3d\n]\nPoolModuleMeta = Union[int, Sequence[int]]\n", "path": "pystiche/typing.py"}], "after_files": [{"content": "from typing import Union, Any, Sequence\nimport torch\nfrom torch import nn\n\n__all__ = [\n \"Numeric\",\n \"TensorMeta\",\n \"ConvModule\",\n \"is_conv_module\",\n \"ConvModuleMeta\",\n \"PoolModule\",\n \"is_pool_module\",\n \"PoolModuleMeta\",\n]\n\nNumeric = Union[int, float]\n\nTensorMeta = Union[torch.device, torch.dtype]\n\nConvModule = Union[nn.Conv1d, nn.Conv2d, nn.Conv3d]\n\n\ndef is_conv_module(x: Any) -> bool:\n return isinstance(x, (nn.Conv1d, nn.Conv2d, nn.Conv3d))\n\n\nConvModuleMeta = Union[int, Sequence[int]]\n\nPoolModule = Union[\n nn.AvgPool1d, nn.AvgPool2d, nn.AvgPool3d, nn.MaxPool1d, nn.MaxPool2d, nn.MaxPool3d\n]\n\n\ndef is_pool_module(x: Any) -> bool:\n return isinstance(\n x,\n (\n nn.AvgPool1d,\n nn.AvgPool2d,\n nn.AvgPool3d,\n nn.MaxPool1d,\n nn.MaxPool2d,\n nn.MaxPool3d,\n ),\n )\n\n\nPoolModuleMeta = Union[int, Sequence[int]]\n", "path": 
"pystiche/typing.py"}]} | 801 | 388 |
gh_patches_debug_21080 | rasdani/github-patches | git_diff | conda__conda-6131 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ValidationError: invalid value for timestamp
MODERATOR EDIT: This issue relates to an error that looks something like
ValidationError: Invalid value 1505856869.685 for timestamp
The issue relates to using conda 4.3.28, and then downgrading (presumably via conda-forge) to a previous version of conda, like 4.3.27.
# SOLUTION #
### To fix on macOS:
# in the command below, set PATH_TO_ENVIRONMENT yourself
sed -i '' -E 's|("timestamp": [0-9]+)\.|\1|' /PATH_TO_ENVIRONMENT/conda-meta/*.json
### To fix on Linux:
# in the command below, set PATH_TO_ENVIRONMENT yourself
sed -i -E 's|("timestamp": [0-9]+)\.|\1|' /PATH_TO_ENVIRONMENT/conda-meta/*.json
### To fix on Windows:
Open notepad, and copy the contents below to `c:\fix_timestamps.py`
```python
PATH_TO_ENVIRONMENT="c:\\ProgramData\\Anaconda3" # <-- fill this in yourself
# backslashes must be doubled
from glob import glob
import json
import os
for path in glob(os.path.join(PATH_TO_ENVIRONMENT, 'conda-meta', '*.json')):
with open(path) as fh:
content = json.load(fh)
if 'timestamp' in content:
old_timestamp = content['timestamp']
content['timestamp'] = int(old_timestamp)
if old_timestamp != content['timestamp']:
with open(path, 'w') as fh:
fh.write(json.dumps(content, indent=2, sort_keys=True, separators=(',', ': ')))
```
Also, change the path in the variable `PATH_TO_ENVIRONMENT` to point to the conda environment you want to fix. Then run the script with `python c:\fix_timestamps.py`.
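For reference, the transformation the script applies to each affected record is plain integer truncation of the timestamp (illustrative values):

```python
# Illustrative only: the same truncation the fix script above performs per record.
record = {"timestamp": 1505856869.685}    # fractional timestamp as reported in the issue
record["timestamp"] = int(record["timestamp"])
print(record)                             # {'timestamp': 1505856869}
```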
----
EDITED: I realized this is not a pyqt issue; it's a conda issue
### steps to reproduce
1. Install 64-bit miniconda on windows 10 (version 4.3.27)
2. conda update to `4.3.28-py36h9daa44c_0`
3. conda install -c anaconda spyder
4. conda config --add channels conda-forge
Now any calls to conda results in the print-out below.
### conda info
This prints any time I try to use a conda command.
```
conda install -c dsdale24 pyqt5
An unexpected error has occurred.
Please consider posting the following information to the
conda GitHub issue tracker at:
https://github.com/conda/conda/issues
Current conda install:
platform : win-64
conda version : 4.3.27
conda is private : False
conda-env version : 4.3.27
conda-build version : not installed
python version : 3.6.2.final.0
requests version : 2.18.4
root environment : C:\Users\jrinker\Miniconda3 (writable)
default environment : C:\Users\jrinker\Miniconda3
envs directories : C:\Users\jrinker\Miniconda3\envs
C:\Users\jrinker\AppData\Local\conda\conda\envs
C:\Users\jrinker\.conda\envs
package cache : C:\Users\jrinker\Miniconda3\pkgs
C:\Users\jrinker\AppData\Local\conda\conda\pkgs
channel URLs : https://conda.anaconda.org/dsdale24/win-64
https://conda.anaconda.org/dsdale24/noarch
https://conda.anaconda.org/conda-forge/win-64
https://conda.anaconda.org/conda-forge/noarch
https://repo.continuum.io/pkgs/main/win-64
https://repo.continuum.io/pkgs/main/noarch
https://repo.continuum.io/pkgs/free/win-64
https://repo.continuum.io/pkgs/free/noarch
https://repo.continuum.io/pkgs/r/win-64
https://repo.continuum.io/pkgs/r/noarch
https://repo.continuum.io/pkgs/pro/win-64
https://repo.continuum.io/pkgs/pro/noarch
https://repo.continuum.io/pkgs/msys2/win-64
https://repo.continuum.io/pkgs/msys2/noarch
config file : C:\Users\jrinker\.condarc
netrc file : None
offline mode : False
user-agent : conda/4.3.27 requests/2.18.4 CPython/3.6.2 Windows/10 Windows/10.0.15063
administrator : False
`$ C:\Users\jrinker\Miniconda3\Scripts\conda install -c dsdale24 pyqt5`
Traceback (most recent call last):
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\exceptions.py", line 640, in conda_exception_handler
return_value = func(*args, **kwargs)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\cli\main.py", line 140, in _main
exit_code = args.func(args, p)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\cli\main_install.py", line 80, in execute
install(args, parser, 'install')
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\cli\install.py", line 160, in install
linked_dists = install_linked(prefix)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\core\linked_data.py", line 123, in linked
return set(linked_data(prefix, ignore_channels=ignore_channels).keys())
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\core\linked_data.py", line 115, in linked_data
load_linked_data(prefix, dist_name, ignore_channels=ignore_channels)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\core\linked_data.py", line 68, in load_linked_data
linked_data_[prefix][dist] = rec = IndexRecord(**rec)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\_vendor\auxlib\entity.py", line 702, in __call__
instance = super(EntityType, cls).__call__(*args, **kwargs)
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\_vendor\auxlib\entity.py", line 719, in __init__
setattr(self, key, kwargs[key])
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\_vendor\auxlib\entity.py", line 424, in __set__
instance.__dict__[self.name] = self.validate(instance, self.box(instance, val))
File "C:\Users\jrinker\Miniconda3\lib\site-packages\conda\_vendor\auxlib\entity.py", line 465, in validate
raise ValidationError(getattr(self, 'name', 'undefined name'), val)
conda._vendor.auxlib.exceptions.ValidationError: Invalid value 1505856869.685 for timestamp
```
### things I've tried
- Removing conda forge from channels
- `conda update conda` (still prints out error)
--- END ISSUE ---
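As a rough sketch of the direction a fix can take (simplified, not conda's exact field API), the timestamp can be normalised to an integer number of milliseconds regardless of the unit it arrived in:

```python
# Sketch only; mirrors the idea of storing integer milliseconds instead of floats.
def make_milliseconds(val):
    if val:
        if val < 253402300799:   # values below 9999-12-31 expressed in seconds
            val *= 1000          # convert seconds to milliseconds
        val = int(val)           # drop the fraction that trips the validator
    return val

print(make_milliseconds(1505856869.685))  # 1505856869685
```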
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda/models/index_record.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 from functools import total_ordering
5
6 from .enums import LinkType, NoarchType, Platform
7 from .._vendor.auxlib.entity import (BooleanField, ComposableField, DictSafeMixin, Entity,
8 EnumField, Field, IntegerField, ListField, MapField,
9 StringField, NumberField)
10 from ..common.compat import string_types
11
12
13 @total_ordering
14 class Priority(object):
15
16 def __init__(self, priority):
17 self._priority = priority
18
19 def __int__(self):
20 return self._priority
21
22 def __lt__(self, other):
23 return self._priority < int(other)
24
25 def __eq__(self, other):
26 return self._priority == int(other)
27
28 def __repr__(self):
29 return "Priority(%d)" % self._priority
30
31
32 class PriorityField(Field):
33 _type = (int, Priority)
34
35 def unbox(self, instance, instance_type, val):
36 return int(val)
37
38
39 class LinkTypeField(EnumField):
40 def box(self, instance, val):
41 if isinstance(val, string_types):
42 val = val.replace('-', '').replace('_', '').lower()
43 if val == 'hard':
44 val = LinkType.hardlink
45 elif val == 'soft':
46 val = LinkType.softlink
47 return super(LinkTypeField, self).box(instance, val)
48
49
50 class NoarchField(EnumField):
51 def box(self, instance, val):
52 return super(NoarchField, self).box(instance, NoarchType.coerce(val))
53
54
55 class TimestampField(NumberField):
56
57 def box(self, instance, val):
58 val = super(TimestampField, self).box(instance, val)
59 if val and val > 253402300799: # 9999-12-31
60 val /= 1000 # convert milliseconds to seconds; see conda/conda-build#1988
61 return val
62
63
64 class Link(DictSafeMixin, Entity):
65 source = StringField()
66 type = LinkTypeField(LinkType, required=False)
67
68
69 EMPTY_LINK = Link(source='')
70
71 # TODO: eventually stop mixing Record with LinkedPackageData
72 # class LinkedPackageRecord(DictSafeMixin, Entity):
73 # arch = EnumField(Arch, nullable=True)
74 # build = StringField()
75 # build_number = IntegerField()
76 # channel = StringField(required=False)
77 # date = StringField(required=False)
78 # depends = ListField(string_types)
79 # files = ListField(string_types, required=False)
80 # license = StringField(required=False)
81 # link = ComposableField(Link, required=False)
82 # md5 = StringField(required=False, nullable=True)
83 # name = StringField()
84 # platform = EnumField(Platform)
85 # requires = ListField(string_types, required=False)
86 # size = IntegerField(required=False)
87 # subdir = StringField(required=False)
88 # url = StringField(required=False)
89 # version = StringField()
90
91
92 class IndexRecord(DictSafeMixin, Entity):
93 _lazy_validate = True
94
95 arch = StringField(required=False, nullable=True)
96 build = StringField()
97 build_number = IntegerField()
98 date = StringField(required=False)
99 depends = ListField(string_types, required=False)
100 features = StringField(required=False)
101 has_prefix = BooleanField(required=False)
102 license = StringField(required=False)
103 license_family = StringField(required=False)
104 md5 = StringField(required=False, nullable=True)
105 name = StringField()
106 noarch = NoarchField(NoarchType, required=False, nullable=True)
107 platform = EnumField(Platform, required=False, nullable=True)
108 requires = ListField(string_types, required=False)
109 size = IntegerField(required=False)
110 subdir = StringField(required=False)
111 timestamp = TimestampField(required=False)
112 track_features = StringField(required=False)
113 version = StringField()
114
115 fn = StringField(required=False, nullable=True)
116 schannel = StringField(required=False, nullable=True)
117 channel = StringField(required=False, nullable=True)
118 priority = PriorityField(required=False)
119 url = StringField(required=False, nullable=True)
120 auth = StringField(required=False, nullable=True)
121
122 files = ListField(string_types, default=(), required=False)
123 link = ComposableField(Link, required=False)
124
125 with_features_depends = MapField(required=False)
126 preferred_env = StringField(default=None, required=False, nullable=True)
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda/models/index_record.py b/conda/models/index_record.py
--- a/conda/models/index_record.py
+++ b/conda/models/index_record.py
@@ -54,12 +54,33 @@
class TimestampField(NumberField):
- def box(self, instance, val):
- val = super(TimestampField, self).box(instance, val)
- if val and val > 253402300799: # 9999-12-31
- val /= 1000 # convert milliseconds to seconds; see conda/conda-build#1988
+ # @staticmethod
+ # def _make_seconds(val):
+ # if val:
+ # val = int(val)
+ # if val > 253402300799: # 9999-12-31
+ # val //= 1000 # convert milliseconds to seconds; see conda/conda-build#1988
+ # return val
+
+ @staticmethod
+ def _make_milliseconds(val):
+ if val:
+ if val < 253402300799: # 9999-12-31
+ val *= 1000 # convert seconds to milliseconds
+ val = int(val)
return val
+ def box(self, instance, val):
+ return self._make_milliseconds(super(TimestampField, self).box(instance, val))
+
+ def unbox(self, instance, instance_type, val):
+ return self._make_milliseconds(
+ super(TimestampField, self).unbox(instance, instance_type, val)
+ )
+
+ def dump(self, val):
+ return self._make_milliseconds(super(TimestampField, self).dump(val))
+
class Link(DictSafeMixin, Entity):
source = StringField()
| {"golden_diff": "diff --git a/conda/models/index_record.py b/conda/models/index_record.py\n--- a/conda/models/index_record.py\n+++ b/conda/models/index_record.py\n@@ -54,12 +54,33 @@\n \n class TimestampField(NumberField):\n \n- def box(self, instance, val):\n- val = super(TimestampField, self).box(instance, val)\n- if val and val > 253402300799: # 9999-12-31\n- val /= 1000 # convert milliseconds to seconds; see conda/conda-build#1988\n+ # @staticmethod\n+ # def _make_seconds(val):\n+ # if val:\n+ # val = int(val)\n+ # if val > 253402300799: # 9999-12-31\n+ # val //= 1000 # convert milliseconds to seconds; see conda/conda-build#1988\n+ # return val\n+\n+ @staticmethod\n+ def _make_milliseconds(val):\n+ if val:\n+ if val < 253402300799: # 9999-12-31\n+ val *= 1000 # convert seconds to milliseconds\n+ val = int(val)\n return val\n \n+ def box(self, instance, val):\n+ return self._make_milliseconds(super(TimestampField, self).box(instance, val))\n+\n+ def unbox(self, instance, instance_type, val):\n+ return self._make_milliseconds(\n+ super(TimestampField, self).unbox(instance, instance_type, val)\n+ )\n+\n+ def dump(self, val):\n+ return self._make_milliseconds(super(TimestampField, self).dump(val))\n+\n \n class Link(DictSafeMixin, Entity):\n source = StringField()\n", "issue": "ValidationError: invalid value for timestamp\nMODERATOR EDIT: This issue relates to an error that looks something like\r\n\r\n ValidationError: Invalid value 1505856869.685 for timestamp\r\n\r\nThe issue relates to using conda 4.3.28, and then downgrading (presumably via conda-forge) to a previous version of conda, like 4.3.27.\r\n\r\n# SOLUTION #\r\n\r\n### To fix on macOS:\r\n\r\n # in the command below, set PATH_TO_ENVIRONMENT yourself\r\n sed -i '' -E 's|(\"timestamp\": [0-9]+)\\.|\\1|' /PATH_TO_ENVIRONMENT/conda-meta/*.json\r\n\r\n### To fix on Linux:\r\n\r\n # in the command below, set PATH_TO_ENVIRONMENT yourself\r\n sed -i -E 's|(\"timestamp\": [0-9]+)\\.|\\1|' /PATH_TO_ENVIRONMENT/conda-meta/*.json\r\n\r\n### To fix on Windows:\r\n\r\nOpen notepad, and copy the contents below to `c:\\fix_timestamps.py`\r\n\r\n```python\r\nPATH_TO_ENVIRONMENT=\"c:\\\\ProgramData\\\\Anaconda3\" # <-- fill this in yourself\r\n # backslashes must be doubled\r\n\r\nfrom glob import glob\r\nimport json\r\nimport os\r\nfor path in glob(os.path.join(PATH_TO_ENVIRONMENT, 'conda-meta', '*.json')):\r\n with open(path) as fh:\r\n content = json.load(fh)\r\n if 'timestamp' in content:\r\n old_timestamp = content['timestamp']\r\n content['timestamp'] = int(old_timestamp)\r\n if old_timestamp != content['timestamp']:\r\n with open(path, 'w') as fh:\r\n fh.write(json.dumps(content, indent=2, sort_keys=True, separators=(',', ': ')))\r\n```\r\n\r\nAlso, change the path in the variable `PATH_TO_ENVIRONMENT` to point to the conda environment you want to fix. Then run the script with `python c:\\fix_timestamps.py`.\r\n\r\n----\r\n\r\nEDITED: I realized this is not a pyqt issue, it's a conda issue\r\n\r\n### steps to reproduce\r\n\r\n1. Install 64-bit miniconda on windows 10 (version 4.3.27) \r\n2. conda update to `4.3.28-py36h9daa44c_0` \r\n3. conda install -c anaconda spyder \r\n4. 
conda config --add channels conda-forge \r\n\r\nNow any calls to conda results in the print-out below.\r\n\r\n### conda info\r\n\r\nThis prints any time I try to use a conda command.\r\n\r\n```conda install -c dsdale24 pyqt5\r\nAn unexpected error has occurred.\r\nPlease consider posting the following information to the\r\nconda GitHub issue tracker at:\r\n\r\n https://github.com/conda/conda/issues\r\n\r\n\r\n\r\nCurrent conda install:\r\n\r\n platform : win-64\r\n conda version : 4.3.27\r\n conda is private : False\r\n conda-env version : 4.3.27\r\n conda-build version : not installed\r\n python version : 3.6.2.final.0\r\n requests version : 2.18.4\r\n root environment : C:\\Users\\jrinker\\Miniconda3 (writable)\r\n default environment : C:\\Users\\jrinker\\Miniconda3\r\n envs directories : C:\\Users\\jrinker\\Miniconda3\\envs\r\n C:\\Users\\jrinker\\AppData\\Local\\conda\\conda\\envs\r\n C:\\Users\\jrinker\\.conda\\envs\r\n package cache : C:\\Users\\jrinker\\Miniconda3\\pkgs\r\n C:\\Users\\jrinker\\AppData\\Local\\conda\\conda\\pkgs\r\n channel URLs : https://conda.anaconda.org/dsdale24/win-64\r\n https://conda.anaconda.org/dsdale24/noarch\r\n https://conda.anaconda.org/conda-forge/win-64\r\n https://conda.anaconda.org/conda-forge/noarch\r\n https://repo.continuum.io/pkgs/main/win-64\r\n https://repo.continuum.io/pkgs/main/noarch\r\n https://repo.continuum.io/pkgs/free/win-64\r\n https://repo.continuum.io/pkgs/free/noarch\r\n https://repo.continuum.io/pkgs/r/win-64\r\n https://repo.continuum.io/pkgs/r/noarch\r\n https://repo.continuum.io/pkgs/pro/win-64\r\n https://repo.continuum.io/pkgs/pro/noarch\r\n https://repo.continuum.io/pkgs/msys2/win-64\r\n https://repo.continuum.io/pkgs/msys2/noarch\r\n config file : C:\\Users\\jrinker\\.condarc\r\n netrc file : None\r\n offline mode : False\r\n user-agent : conda/4.3.27 requests/2.18.4 CPython/3.6.2 Windows/10 Windows/10.0.15063\r\n administrator : False\r\n\r\n`$ C:\\Users\\jrinker\\Miniconda3\\Scripts\\conda install -c dsdale24 pyqt5`\r\n\r\n\r\n\r\n\r\n Traceback (most recent call last):\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\exceptions.py\", line 640, in conda_exception_handler\r\n return_value = func(*args, **kwargs)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\cli\\main.py\", line 140, in _main\r\n exit_code = args.func(args, p)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\cli\\main_install.py\", line 80, in execute\r\n install(args, parser, 'install')\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\cli\\install.py\", line 160, in install\r\n linked_dists = install_linked(prefix)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\core\\linked_data.py\", line 123, in linked\r\n return set(linked_data(prefix, ignore_channels=ignore_channels).keys())\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\core\\linked_data.py\", line 115, in linked_data\r\n load_linked_data(prefix, dist_name, ignore_channels=ignore_channels)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\core\\linked_data.py\", line 68, in load_linked_data\r\n linked_data_[prefix][dist] = rec = IndexRecord(**rec)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\_vendor\\auxlib\\entity.py\", line 702, in __call__\r\n instance = super(EntityType, cls).__call__(*args, **kwargs)\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\_vendor\\auxlib\\entity.py\", line 719, in 
__init__\r\n setattr(self, key, kwargs[key])\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\_vendor\\auxlib\\entity.py\", line 424, in __set__\r\n instance.__dict__[self.name] = self.validate(instance, self.box(instance, val))\r\n File \"C:\\Users\\jrinker\\Miniconda3\\lib\\site-packages\\conda\\_vendor\\auxlib\\entity.py\", line 465, in validate\r\n raise ValidationError(getattr(self, 'name', 'undefined name'), val)\r\n conda._vendor.auxlib.exceptions.ValidationError: Invalid value 1505856869.685 for timestamp\r\n``` \r\n\r\n### things I've tried\r\n\r\n- Removing conda forge from channels\r\n- `conda update conda` (still prints out error)\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom functools import total_ordering\n\nfrom .enums import LinkType, NoarchType, Platform\nfrom .._vendor.auxlib.entity import (BooleanField, ComposableField, DictSafeMixin, Entity,\n EnumField, Field, IntegerField, ListField, MapField,\n StringField, NumberField)\nfrom ..common.compat import string_types\n\n\n@total_ordering\nclass Priority(object):\n\n def __init__(self, priority):\n self._priority = priority\n\n def __int__(self):\n return self._priority\n\n def __lt__(self, other):\n return self._priority < int(other)\n\n def __eq__(self, other):\n return self._priority == int(other)\n\n def __repr__(self):\n return \"Priority(%d)\" % self._priority\n\n\nclass PriorityField(Field):\n _type = (int, Priority)\n\n def unbox(self, instance, instance_type, val):\n return int(val)\n\n\nclass LinkTypeField(EnumField):\n def box(self, instance, val):\n if isinstance(val, string_types):\n val = val.replace('-', '').replace('_', '').lower()\n if val == 'hard':\n val = LinkType.hardlink\n elif val == 'soft':\n val = LinkType.softlink\n return super(LinkTypeField, self).box(instance, val)\n\n\nclass NoarchField(EnumField):\n def box(self, instance, val):\n return super(NoarchField, self).box(instance, NoarchType.coerce(val))\n\n\nclass TimestampField(NumberField):\n\n def box(self, instance, val):\n val = super(TimestampField, self).box(instance, val)\n if val and val > 253402300799: # 9999-12-31\n val /= 1000 # convert milliseconds to seconds; see conda/conda-build#1988\n return val\n\n\nclass Link(DictSafeMixin, Entity):\n source = StringField()\n type = LinkTypeField(LinkType, required=False)\n\n\nEMPTY_LINK = Link(source='')\n\n# TODO: eventually stop mixing Record with LinkedPackageData\n# class LinkedPackageRecord(DictSafeMixin, Entity):\n# arch = EnumField(Arch, nullable=True)\n# build = StringField()\n# build_number = IntegerField()\n# channel = StringField(required=False)\n# date = StringField(required=False)\n# depends = ListField(string_types)\n# files = ListField(string_types, required=False)\n# license = StringField(required=False)\n# link = ComposableField(Link, required=False)\n# md5 = StringField(required=False, nullable=True)\n# name = StringField()\n# platform = EnumField(Platform)\n# requires = ListField(string_types, required=False)\n# size = IntegerField(required=False)\n# subdir = StringField(required=False)\n# url = StringField(required=False)\n# version = StringField()\n\n\nclass IndexRecord(DictSafeMixin, Entity):\n _lazy_validate = True\n\n arch = StringField(required=False, nullable=True)\n build = StringField()\n build_number = IntegerField()\n date = StringField(required=False)\n depends = ListField(string_types, required=False)\n features = 
StringField(required=False)\n has_prefix = BooleanField(required=False)\n license = StringField(required=False)\n license_family = StringField(required=False)\n md5 = StringField(required=False, nullable=True)\n name = StringField()\n noarch = NoarchField(NoarchType, required=False, nullable=True)\n platform = EnumField(Platform, required=False, nullable=True)\n requires = ListField(string_types, required=False)\n size = IntegerField(required=False)\n subdir = StringField(required=False)\n timestamp = TimestampField(required=False)\n track_features = StringField(required=False)\n version = StringField()\n\n fn = StringField(required=False, nullable=True)\n schannel = StringField(required=False, nullable=True)\n channel = StringField(required=False, nullable=True)\n priority = PriorityField(required=False)\n url = StringField(required=False, nullable=True)\n auth = StringField(required=False, nullable=True)\n\n files = ListField(string_types, default=(), required=False)\n link = ComposableField(Link, required=False)\n\n with_features_depends = MapField(required=False)\n preferred_env = StringField(default=None, required=False, nullable=True)\n", "path": "conda/models/index_record.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom functools import total_ordering\n\nfrom .enums import LinkType, NoarchType, Platform\nfrom .._vendor.auxlib.entity import (BooleanField, ComposableField, DictSafeMixin, Entity,\n EnumField, Field, IntegerField, ListField, MapField,\n StringField, NumberField)\nfrom ..common.compat import string_types\n\n\n@total_ordering\nclass Priority(object):\n\n def __init__(self, priority):\n self._priority = priority\n\n def __int__(self):\n return self._priority\n\n def __lt__(self, other):\n return self._priority < int(other)\n\n def __eq__(self, other):\n return self._priority == int(other)\n\n def __repr__(self):\n return \"Priority(%d)\" % self._priority\n\n\nclass PriorityField(Field):\n _type = (int, Priority)\n\n def unbox(self, instance, instance_type, val):\n return int(val)\n\n\nclass LinkTypeField(EnumField):\n def box(self, instance, val):\n if isinstance(val, string_types):\n val = val.replace('-', '').replace('_', '').lower()\n if val == 'hard':\n val = LinkType.hardlink\n elif val == 'soft':\n val = LinkType.softlink\n return super(LinkTypeField, self).box(instance, val)\n\n\nclass NoarchField(EnumField):\n def box(self, instance, val):\n return super(NoarchField, self).box(instance, NoarchType.coerce(val))\n\n\nclass TimestampField(NumberField):\n\n # @staticmethod\n # def _make_seconds(val):\n # if val:\n # val = int(val)\n # if val > 253402300799: # 9999-12-31\n # val //= 1000 # convert milliseconds to seconds; see conda/conda-build#1988\n # return val\n\n @staticmethod\n def _make_milliseconds(val):\n if val:\n if val < 253402300799: # 9999-12-31\n val *= 1000 # convert seconds to milliseconds\n val = int(val)\n return val\n\n def box(self, instance, val):\n return self._make_milliseconds(super(TimestampField, self).box(instance, val))\n\n def unbox(self, instance, instance_type, val):\n return self._make_milliseconds(\n super(TimestampField, self).unbox(instance, instance_type, val)\n )\n\n def dump(self, val):\n return self._make_milliseconds(super(TimestampField, self).dump(val))\n\n\nclass Link(DictSafeMixin, Entity):\n source = StringField()\n type = LinkTypeField(LinkType, required=False)\n\n\nEMPTY_LINK = Link(source='')\n\n# TODO: eventually stop 
mixing Record with LinkedPackageData\n# class LinkedPackageRecord(DictSafeMixin, Entity):\n# arch = EnumField(Arch, nullable=True)\n# build = StringField()\n# build_number = IntegerField()\n# channel = StringField(required=False)\n# date = StringField(required=False)\n# depends = ListField(string_types)\n# files = ListField(string_types, required=False)\n# license = StringField(required=False)\n# link = ComposableField(Link, required=False)\n# md5 = StringField(required=False, nullable=True)\n# name = StringField()\n# platform = EnumField(Platform)\n# requires = ListField(string_types, required=False)\n# size = IntegerField(required=False)\n# subdir = StringField(required=False)\n# url = StringField(required=False)\n# version = StringField()\n\n\nclass IndexRecord(DictSafeMixin, Entity):\n _lazy_validate = True\n\n arch = StringField(required=False, nullable=True)\n build = StringField()\n build_number = IntegerField()\n date = StringField(required=False)\n depends = ListField(string_types, required=False)\n features = StringField(required=False)\n has_prefix = BooleanField(required=False)\n license = StringField(required=False)\n license_family = StringField(required=False)\n md5 = StringField(required=False, nullable=True)\n name = StringField()\n noarch = NoarchField(NoarchType, required=False, nullable=True)\n platform = EnumField(Platform, required=False, nullable=True)\n requires = ListField(string_types, required=False)\n size = IntegerField(required=False)\n subdir = StringField(required=False)\n timestamp = TimestampField(required=False)\n track_features = StringField(required=False)\n version = StringField()\n\n fn = StringField(required=False, nullable=True)\n schannel = StringField(required=False, nullable=True)\n channel = StringField(required=False, nullable=True)\n priority = PriorityField(required=False)\n url = StringField(required=False, nullable=True)\n auth = StringField(required=False, nullable=True)\n\n files = ListField(string_types, default=(), required=False)\n link = ComposableField(Link, required=False)\n\n with_features_depends = MapField(required=False)\n preferred_env = StringField(default=None, required=False, nullable=True)\n", "path": "conda/models/index_record.py"}]} | 3,295 | 448 |
gh_patches_debug_39850 | rasdani/github-patches | git_diff | mozilla__pontoon-3075 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Store pretranslations for each locale separately
We currently retrieve pretranslations for each locale separately, but store them all at once with a single DB query covering every locale. That keeps DB overhead slightly lower, but if an error occurs partway through, pretranslations that were already retrieved may have to be fetched again.
Since retrieving pretranslations is significantly slower than writing them to the DB, we should write to DB for each locale separately.
--- END ISSUE ---
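A minimal sketch of the requested shape (illustrative helper names, not the actual pontoon code; the real task is in `tasks.py` below): the DB write moves inside the per-locale loop, so a failure while fetching one locale cannot discard work already stored for earlier locales.

```python
# Sketch only. `fetch_for_locale` stands in for the slow pretranslation retrieval,
# and `translation_model` for pontoon's Translation model with bulk_create().
def pretranslate_per_locale(locales, fetch_for_locale, translation_model):
    for locale in locales:
        translations = fetch_for_locale(locale)               # slow: external service
        if not translations:
            continue                                          # nothing fetched for this locale
        translation_model.objects.bulk_create(translations)   # fast: one DB write per locale
```

The cost is one `bulk_create()` per locale instead of one per project, which the issue argues is negligible next to retrieval time.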
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pontoon/pretranslation/tasks.py`
Content:
```
1 import logging
2
3 from django.db.models import Q, CharField, Value as V
4 from django.db.models.functions import Concat
5 from django.conf import settings
6 from pontoon.base.models import (
7 Project,
8 Entity,
9 TranslatedResource,
10 Translation,
11 User,
12 )
13 from pontoon.actionlog.models import ActionLog
14 from pontoon.pretranslation import AUTHORS
15 from pontoon.pretranslation.pretranslate import (
16 get_pretranslations,
17 update_changed_instances,
18 )
19 from pontoon.base.tasks import PontoonTask
20 from pontoon.sync.core import serial_task
21 from pontoon.checks.libraries import run_checks
22 from pontoon.checks.utils import bulk_run_checks
23
24
25 log = logging.getLogger(__name__)
26
27
28 @serial_task(settings.SYNC_TASK_TIMEOUT, base=PontoonTask, lock_key="project={0}")
29 def pretranslate(self, project_pk, locales=None, entities=None):
30 """
31 Identifies strings without any translations and any suggestions.
32 Engages TheAlgorithm (bug 1552796) to gather pretranslations.
33 Stores pretranslations as suggestions (approved=False) to DB.
34
35 :arg project_pk: the pk of the project to be pretranslated
36 :arg Queryset locales: the locales for the project to be pretranslated
37 :arg Queryset entites: the entities for the project to be pretranslated
38
39 :returns: None
40 """
41 project = Project.objects.get(pk=project_pk)
42
43 if not project.pretranslation_enabled:
44 log.info(f"Pretranslation not enabled for project {project.name}")
45 return
46
47 if locales:
48 locales = project.locales.filter(pk__in=locales)
49 else:
50 locales = project.locales
51
52 locales = locales.filter(
53 project_locale__project=project,
54 project_locale__pretranslation_enabled=True,
55 project_locale__readonly=False,
56 )
57
58 if not locales:
59 log.info(
60 f"Pretranslation not enabled for any locale within project {project.name}"
61 )
62 return
63
64 log.info(f"Fetching pretranslations for project {project.name} started")
65
66 if not entities:
67 entities = Entity.objects.filter(
68 resource__project=project,
69 obsolete=False,
70 )
71
72 entities = entities.prefetch_related("resource")
73
74 # Fetch all available locale-resource pairs (TranslatedResource objects)
75 tr_pairs = (
76 TranslatedResource.objects.filter(
77 resource__project=project,
78 locale__in=locales,
79 )
80 .annotate(
81 locale_resource=Concat(
82 "locale_id", V("-"), "resource_id", output_field=CharField()
83 )
84 )
85 .values_list("locale_resource", flat=True)
86 .distinct()
87 )
88
89 # Fetch all locale-entity pairs with non-rejected or pretranslated translations
90 pt_authors = [User.objects.get(email=email) for email in AUTHORS.values()]
91 translated_entities = (
92 Translation.objects.filter(
93 locale__in=locales,
94 entity__in=entities,
95 )
96 .filter(Q(rejected=False) | Q(user__in=pt_authors))
97 .annotate(
98 locale_entity=Concat(
99 "locale_id", V("-"), "entity_id", output_field=CharField()
100 )
101 )
102 .values_list("locale_entity", flat=True)
103 .distinct()
104 )
105
106 translated_entities = list(translated_entities)
107
108 translations = []
109
110 # To keep track of changed TranslatedResources and their latest_translation
111 tr_dict = {}
112
113 tr_filter = []
114 index = -1
115
116 for locale in locales:
117 log.info(f"Fetching pretranslations for locale {locale.code} started")
118 for entity in entities:
119 locale_entity = f"{locale.id}-{entity.id}"
120 locale_resource = f"{locale.id}-{entity.resource.id}"
121 if locale_entity in translated_entities or locale_resource not in tr_pairs:
122 continue
123
124 pretranslations = get_pretranslations(entity, locale)
125
126 if not pretranslations:
127 continue
128
129 failed_checks = run_checks(
130 entity,
131 locale.code,
132 entity.string,
133 pretranslations[0][0],
134 use_tt_checks=False,
135 )
136
137 if failed_checks:
138 pretranslations = get_pretranslations(
139 entity, locale, preserve_placeables=True
140 )
141
142 for string, plural_form, user in pretranslations:
143 t = Translation(
144 entity=entity,
145 locale=locale,
146 string=string,
147 user=user,
148 approved=False,
149 pretranslated=True,
150 active=True,
151 plural_form=plural_form,
152 )
153
154 index += 1
155 translations.append(t)
156
157 if locale_resource not in tr_dict:
158 tr_dict[locale_resource] = index
159
160 # Add query for fetching respective TranslatedResource.
161 tr_filter.append(
162 Q(locale__id=locale.id) & Q(resource__id=entity.resource.id)
163 )
164
165 # Update the latest translation index
166 tr_dict[locale_resource] = index
167
168 log.info(f"Fetching pretranslations for locale {locale.code} done")
169
170 if len(translations) == 0:
171 return
172
173 translations = Translation.objects.bulk_create(translations)
174
175 # Log creating actions
176 actions_to_log = [
177 ActionLog(
178 action_type=ActionLog.ActionType.TRANSLATION_CREATED,
179 performed_by=t.user,
180 translation=t,
181 )
182 for t in translations
183 ]
184
185 ActionLog.objects.bulk_create(actions_to_log)
186
187 # Run checks on all translations
188 translation_pks = {translation.pk for translation in translations}
189 bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))
190
191 # Mark translations as changed
192 changed_translations = Translation.objects.filter(
193 pk__in=translation_pks,
194 # Do not sync translations with errors and warnings
195 errors__isnull=True,
196 warnings__isnull=True,
197 )
198 changed_translations.bulk_mark_changed()
199
200 # Update latest activity and stats for changed instances.
201 update_changed_instances(tr_filter, tr_dict, translations)
202
203 log.info(f"Fetching pretranslations for project {project.name} done")
204
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pontoon/pretranslation/tasks.py b/pontoon/pretranslation/tasks.py
--- a/pontoon/pretranslation/tasks.py
+++ b/pontoon/pretranslation/tasks.py
@@ -105,16 +105,16 @@
translated_entities = list(translated_entities)
- translations = []
+ for locale in locales:
+ log.info(f"Fetching pretranslations for locale {locale.code} started")
- # To keep track of changed TranslatedResources and their latest_translation
- tr_dict = {}
+ translations = []
- tr_filter = []
- index = -1
+ # To keep track of changed TranslatedResources and their latest_translation
+ tr_dict = {}
+ tr_filter = []
+ index = -1
- for locale in locales:
- log.info(f"Fetching pretranslations for locale {locale.code} started")
for entity in entities:
locale_entity = f"{locale.id}-{entity.id}"
locale_resource = f"{locale.id}-{entity.resource.id}"
@@ -165,39 +165,42 @@
# Update the latest translation index
tr_dict[locale_resource] = index
- log.info(f"Fetching pretranslations for locale {locale.code} done")
+ if len(translations) == 0:
+ log.info(
+ f"Fetching pretranslations for locale {locale.code} done: No pretranslation fetched"
+ )
+ continue
- if len(translations) == 0:
- return
+ translations = Translation.objects.bulk_create(translations)
- translations = Translation.objects.bulk_create(translations)
+ # Log creating actions
+ actions_to_log = [
+ ActionLog(
+ action_type=ActionLog.ActionType.TRANSLATION_CREATED,
+ performed_by=t.user,
+ translation=t,
+ )
+ for t in translations
+ ]
- # Log creating actions
- actions_to_log = [
- ActionLog(
- action_type=ActionLog.ActionType.TRANSLATION_CREATED,
- performed_by=t.user,
- translation=t,
- )
- for t in translations
- ]
+ ActionLog.objects.bulk_create(actions_to_log)
- ActionLog.objects.bulk_create(actions_to_log)
+ # Run checks on all translations
+ translation_pks = {translation.pk for translation in translations}
+ bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))
- # Run checks on all translations
- translation_pks = {translation.pk for translation in translations}
- bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))
+ # Mark translations as changed
+ changed_translations = Translation.objects.filter(
+ pk__in=translation_pks,
+ # Do not sync translations with errors and warnings
+ errors__isnull=True,
+ warnings__isnull=True,
+ )
+ changed_translations.bulk_mark_changed()
- # Mark translations as changed
- changed_translations = Translation.objects.filter(
- pk__in=translation_pks,
- # Do not sync translations with errors and warnings
- errors__isnull=True,
- warnings__isnull=True,
- )
- changed_translations.bulk_mark_changed()
+ # Update latest activity and stats for changed instances.
+ update_changed_instances(tr_filter, tr_dict, translations)
- # Update latest activity and stats for changed instances.
- update_changed_instances(tr_filter, tr_dict, translations)
+ log.info(f"Fetching pretranslations for locale {locale.code} done")
log.info(f"Fetching pretranslations for project {project.name} done")
| {"golden_diff": "diff --git a/pontoon/pretranslation/tasks.py b/pontoon/pretranslation/tasks.py\n--- a/pontoon/pretranslation/tasks.py\n+++ b/pontoon/pretranslation/tasks.py\n@@ -105,16 +105,16 @@\n \n translated_entities = list(translated_entities)\n \n- translations = []\n+ for locale in locales:\n+ log.info(f\"Fetching pretranslations for locale {locale.code} started\")\n \n- # To keep track of changed TranslatedResources and their latest_translation\n- tr_dict = {}\n+ translations = []\n \n- tr_filter = []\n- index = -1\n+ # To keep track of changed TranslatedResources and their latest_translation\n+ tr_dict = {}\n+ tr_filter = []\n+ index = -1\n \n- for locale in locales:\n- log.info(f\"Fetching pretranslations for locale {locale.code} started\")\n for entity in entities:\n locale_entity = f\"{locale.id}-{entity.id}\"\n locale_resource = f\"{locale.id}-{entity.resource.id}\"\n@@ -165,39 +165,42 @@\n # Update the latest translation index\n tr_dict[locale_resource] = index\n \n- log.info(f\"Fetching pretranslations for locale {locale.code} done\")\n+ if len(translations) == 0:\n+ log.info(\n+ f\"Fetching pretranslations for locale {locale.code} done: No pretranslation fetched\"\n+ )\n+ continue\n \n- if len(translations) == 0:\n- return\n+ translations = Translation.objects.bulk_create(translations)\n \n- translations = Translation.objects.bulk_create(translations)\n+ # Log creating actions\n+ actions_to_log = [\n+ ActionLog(\n+ action_type=ActionLog.ActionType.TRANSLATION_CREATED,\n+ performed_by=t.user,\n+ translation=t,\n+ )\n+ for t in translations\n+ ]\n \n- # Log creating actions\n- actions_to_log = [\n- ActionLog(\n- action_type=ActionLog.ActionType.TRANSLATION_CREATED,\n- performed_by=t.user,\n- translation=t,\n- )\n- for t in translations\n- ]\n+ ActionLog.objects.bulk_create(actions_to_log)\n \n- ActionLog.objects.bulk_create(actions_to_log)\n+ # Run checks on all translations\n+ translation_pks = {translation.pk for translation in translations}\n+ bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))\n \n- # Run checks on all translations\n- translation_pks = {translation.pk for translation in translations}\n- bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))\n+ # Mark translations as changed\n+ changed_translations = Translation.objects.filter(\n+ pk__in=translation_pks,\n+ # Do not sync translations with errors and warnings\n+ errors__isnull=True,\n+ warnings__isnull=True,\n+ )\n+ changed_translations.bulk_mark_changed()\n \n- # Mark translations as changed\n- changed_translations = Translation.objects.filter(\n- pk__in=translation_pks,\n- # Do not sync translations with errors and warnings\n- errors__isnull=True,\n- warnings__isnull=True,\n- )\n- changed_translations.bulk_mark_changed()\n+ # Update latest activity and stats for changed instances.\n+ update_changed_instances(tr_filter, tr_dict, translations)\n \n- # Update latest activity and stats for changed instances.\n- update_changed_instances(tr_filter, tr_dict, translations)\n+ log.info(f\"Fetching pretranslations for locale {locale.code} done\")\n \n log.info(f\"Fetching pretranslations for project {project.name} done\")\n", "issue": "Store pretranslations for each locale separately\nWe currently retrieve pretranslations for each locale separately, and store them all at once with a single DB query (for all locales). 
That makes our DB performance a little bit better, but risks the need for additional retrieval of already retrieved pretranslations in case of an error.\r\n\r\nSince retrieving pretranslations is significantly slower than writing them to the DB, we should write to DB for each locale separately.\n", "before_files": [{"content": "import logging\n\nfrom django.db.models import Q, CharField, Value as V\nfrom django.db.models.functions import Concat\nfrom django.conf import settings\nfrom pontoon.base.models import (\n Project,\n Entity,\n TranslatedResource,\n Translation,\n User,\n)\nfrom pontoon.actionlog.models import ActionLog\nfrom pontoon.pretranslation import AUTHORS\nfrom pontoon.pretranslation.pretranslate import (\n get_pretranslations,\n update_changed_instances,\n)\nfrom pontoon.base.tasks import PontoonTask\nfrom pontoon.sync.core import serial_task\nfrom pontoon.checks.libraries import run_checks\nfrom pontoon.checks.utils import bulk_run_checks\n\n\nlog = logging.getLogger(__name__)\n\n\n@serial_task(settings.SYNC_TASK_TIMEOUT, base=PontoonTask, lock_key=\"project={0}\")\ndef pretranslate(self, project_pk, locales=None, entities=None):\n \"\"\"\n Identifies strings without any translations and any suggestions.\n Engages TheAlgorithm (bug 1552796) to gather pretranslations.\n Stores pretranslations as suggestions (approved=False) to DB.\n\n :arg project_pk: the pk of the project to be pretranslated\n :arg Queryset locales: the locales for the project to be pretranslated\n :arg Queryset entites: the entities for the project to be pretranslated\n\n :returns: None\n \"\"\"\n project = Project.objects.get(pk=project_pk)\n\n if not project.pretranslation_enabled:\n log.info(f\"Pretranslation not enabled for project {project.name}\")\n return\n\n if locales:\n locales = project.locales.filter(pk__in=locales)\n else:\n locales = project.locales\n\n locales = locales.filter(\n project_locale__project=project,\n project_locale__pretranslation_enabled=True,\n project_locale__readonly=False,\n )\n\n if not locales:\n log.info(\n f\"Pretranslation not enabled for any locale within project {project.name}\"\n )\n return\n\n log.info(f\"Fetching pretranslations for project {project.name} started\")\n\n if not entities:\n entities = Entity.objects.filter(\n resource__project=project,\n obsolete=False,\n )\n\n entities = entities.prefetch_related(\"resource\")\n\n # Fetch all available locale-resource pairs (TranslatedResource objects)\n tr_pairs = (\n TranslatedResource.objects.filter(\n resource__project=project,\n locale__in=locales,\n )\n .annotate(\n locale_resource=Concat(\n \"locale_id\", V(\"-\"), \"resource_id\", output_field=CharField()\n )\n )\n .values_list(\"locale_resource\", flat=True)\n .distinct()\n )\n\n # Fetch all locale-entity pairs with non-rejected or pretranslated translations\n pt_authors = [User.objects.get(email=email) for email in AUTHORS.values()]\n translated_entities = (\n Translation.objects.filter(\n locale__in=locales,\n entity__in=entities,\n )\n .filter(Q(rejected=False) | Q(user__in=pt_authors))\n .annotate(\n locale_entity=Concat(\n \"locale_id\", V(\"-\"), \"entity_id\", output_field=CharField()\n )\n )\n .values_list(\"locale_entity\", flat=True)\n .distinct()\n )\n\n translated_entities = list(translated_entities)\n\n translations = []\n\n # To keep track of changed TranslatedResources and their latest_translation\n tr_dict = {}\n\n tr_filter = []\n index = -1\n\n for locale in locales:\n log.info(f\"Fetching pretranslations for locale {locale.code} 
started\")\n for entity in entities:\n locale_entity = f\"{locale.id}-{entity.id}\"\n locale_resource = f\"{locale.id}-{entity.resource.id}\"\n if locale_entity in translated_entities or locale_resource not in tr_pairs:\n continue\n\n pretranslations = get_pretranslations(entity, locale)\n\n if not pretranslations:\n continue\n\n failed_checks = run_checks(\n entity,\n locale.code,\n entity.string,\n pretranslations[0][0],\n use_tt_checks=False,\n )\n\n if failed_checks:\n pretranslations = get_pretranslations(\n entity, locale, preserve_placeables=True\n )\n\n for string, plural_form, user in pretranslations:\n t = Translation(\n entity=entity,\n locale=locale,\n string=string,\n user=user,\n approved=False,\n pretranslated=True,\n active=True,\n plural_form=plural_form,\n )\n\n index += 1\n translations.append(t)\n\n if locale_resource not in tr_dict:\n tr_dict[locale_resource] = index\n\n # Add query for fetching respective TranslatedResource.\n tr_filter.append(\n Q(locale__id=locale.id) & Q(resource__id=entity.resource.id)\n )\n\n # Update the latest translation index\n tr_dict[locale_resource] = index\n\n log.info(f\"Fetching pretranslations for locale {locale.code} done\")\n\n if len(translations) == 0:\n return\n\n translations = Translation.objects.bulk_create(translations)\n\n # Log creating actions\n actions_to_log = [\n ActionLog(\n action_type=ActionLog.ActionType.TRANSLATION_CREATED,\n performed_by=t.user,\n translation=t,\n )\n for t in translations\n ]\n\n ActionLog.objects.bulk_create(actions_to_log)\n\n # Run checks on all translations\n translation_pks = {translation.pk for translation in translations}\n bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))\n\n # Mark translations as changed\n changed_translations = Translation.objects.filter(\n pk__in=translation_pks,\n # Do not sync translations with errors and warnings\n errors__isnull=True,\n warnings__isnull=True,\n )\n changed_translations.bulk_mark_changed()\n\n # Update latest activity and stats for changed instances.\n update_changed_instances(tr_filter, tr_dict, translations)\n\n log.info(f\"Fetching pretranslations for project {project.name} done\")\n", "path": "pontoon/pretranslation/tasks.py"}], "after_files": [{"content": "import logging\n\nfrom django.db.models import Q, CharField, Value as V\nfrom django.db.models.functions import Concat\nfrom django.conf import settings\nfrom pontoon.base.models import (\n Project,\n Entity,\n TranslatedResource,\n Translation,\n User,\n)\nfrom pontoon.actionlog.models import ActionLog\nfrom pontoon.pretranslation import AUTHORS\nfrom pontoon.pretranslation.pretranslate import (\n get_pretranslations,\n update_changed_instances,\n)\nfrom pontoon.base.tasks import PontoonTask\nfrom pontoon.sync.core import serial_task\nfrom pontoon.checks.libraries import run_checks\nfrom pontoon.checks.utils import bulk_run_checks\n\n\nlog = logging.getLogger(__name__)\n\n\n@serial_task(settings.SYNC_TASK_TIMEOUT, base=PontoonTask, lock_key=\"project={0}\")\ndef pretranslate(self, project_pk, locales=None, entities=None):\n \"\"\"\n Identifies strings without any translations and any suggestions.\n Engages TheAlgorithm (bug 1552796) to gather pretranslations.\n Stores pretranslations as suggestions (approved=False) to DB.\n\n :arg project_pk: the pk of the project to be pretranslated\n :arg Queryset locales: the locales for the project to be pretranslated\n :arg Queryset entites: the entities for the project to be pretranslated\n\n :returns: None\n \"\"\"\n 
project = Project.objects.get(pk=project_pk)\n\n if not project.pretranslation_enabled:\n log.info(f\"Pretranslation not enabled for project {project.name}\")\n return\n\n if locales:\n locales = project.locales.filter(pk__in=locales)\n else:\n locales = project.locales\n\n locales = locales.filter(\n project_locale__project=project,\n project_locale__pretranslation_enabled=True,\n project_locale__readonly=False,\n )\n\n if not locales:\n log.info(\n f\"Pretranslation not enabled for any locale within project {project.name}\"\n )\n return\n\n log.info(f\"Fetching pretranslations for project {project.name} started\")\n\n if not entities:\n entities = Entity.objects.filter(\n resource__project=project,\n obsolete=False,\n )\n\n entities = entities.prefetch_related(\"resource\")\n\n # Fetch all available locale-resource pairs (TranslatedResource objects)\n tr_pairs = (\n TranslatedResource.objects.filter(\n resource__project=project,\n locale__in=locales,\n )\n .annotate(\n locale_resource=Concat(\n \"locale_id\", V(\"-\"), \"resource_id\", output_field=CharField()\n )\n )\n .values_list(\"locale_resource\", flat=True)\n .distinct()\n )\n\n # Fetch all locale-entity pairs with non-rejected or pretranslated translations\n pt_authors = [User.objects.get(email=email) for email in AUTHORS.values()]\n translated_entities = (\n Translation.objects.filter(\n locale__in=locales,\n entity__in=entities,\n )\n .filter(Q(rejected=False) | Q(user__in=pt_authors))\n .annotate(\n locale_entity=Concat(\n \"locale_id\", V(\"-\"), \"entity_id\", output_field=CharField()\n )\n )\n .values_list(\"locale_entity\", flat=True)\n .distinct()\n )\n\n translated_entities = list(translated_entities)\n\n for locale in locales:\n log.info(f\"Fetching pretranslations for locale {locale.code} started\")\n\n translations = []\n\n # To keep track of changed TranslatedResources and their latest_translation\n tr_dict = {}\n tr_filter = []\n index = -1\n\n for entity in entities:\n locale_entity = f\"{locale.id}-{entity.id}\"\n locale_resource = f\"{locale.id}-{entity.resource.id}\"\n if locale_entity in translated_entities or locale_resource not in tr_pairs:\n continue\n\n pretranslations = get_pretranslations(entity, locale)\n\n if not pretranslations:\n continue\n\n failed_checks = run_checks(\n entity,\n locale.code,\n entity.string,\n pretranslations[0][0],\n use_tt_checks=False,\n )\n\n if failed_checks:\n pretranslations = get_pretranslations(\n entity, locale, preserve_placeables=True\n )\n\n for string, plural_form, user in pretranslations:\n t = Translation(\n entity=entity,\n locale=locale,\n string=string,\n user=user,\n approved=False,\n pretranslated=True,\n active=True,\n plural_form=plural_form,\n )\n\n index += 1\n translations.append(t)\n\n if locale_resource not in tr_dict:\n tr_dict[locale_resource] = index\n\n # Add query for fetching respective TranslatedResource.\n tr_filter.append(\n Q(locale__id=locale.id) & Q(resource__id=entity.resource.id)\n )\n\n # Update the latest translation index\n tr_dict[locale_resource] = index\n\n if len(translations) == 0:\n log.info(\n f\"Fetching pretranslations for locale {locale.code} done: No pretranslation fetched\"\n )\n continue\n\n translations = Translation.objects.bulk_create(translations)\n\n # Log creating actions\n actions_to_log = [\n ActionLog(\n action_type=ActionLog.ActionType.TRANSLATION_CREATED,\n performed_by=t.user,\n translation=t,\n )\n for t in translations\n ]\n\n ActionLog.objects.bulk_create(actions_to_log)\n\n # Run checks on all translations\n 
translation_pks = {translation.pk for translation in translations}\n bulk_run_checks(Translation.objects.for_checks().filter(pk__in=translation_pks))\n\n # Mark translations as changed\n changed_translations = Translation.objects.filter(\n pk__in=translation_pks,\n # Do not sync translations with errors and warnings\n errors__isnull=True,\n warnings__isnull=True,\n )\n changed_translations.bulk_mark_changed()\n\n # Update latest activity and stats for changed instances.\n update_changed_instances(tr_filter, tr_dict, translations)\n\n log.info(f\"Fetching pretranslations for locale {locale.code} done\")\n\n log.info(f\"Fetching pretranslations for project {project.name} done\")\n", "path": "pontoon/pretranslation/tasks.py"}]} | 2,142 | 809 |
gh_patches_debug_6863 | rasdani/github-patches | git_diff | prowler-cloud__prowler-2709 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
FileNotFoundError after version 3.8.1
### Discussed in https://github.com/prowler-cloud/prowler/discussions/2707
_Originally posted by **cerontrustly**, August 10, 2023:_
Hello guys!
After upgrading to version 3.8.1, Prowler stopped working for me, showing the following error:
```
Traceback (most recent call last):
  File "/home/prowler/.local/bin/prowler", line 8, in <module>
    sys.exit(prowler())
  File "/home/prowler/.local/lib/python3.9/site-packages/prowler/__main__.py", line 222, in prowler
    resolve_security_hub_previous_findings(
  File "/home/prowler/.local/lib/python3.9/site-packages/prowler/providers/aws/lib/security_hub/security_hub.py", line 66, in resolve_security_hub_previous_findings
    with open(f"{output_directory}/{output_filename}{json_asff_file_suffix}") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/prowler/output/None.asff.json'
```
My command line looks like this:
`docker run -v /tmp:/prowler/tmp toniblyx/prowler:stable -f us-west-2 -M json-asff -S -z`
I think it can be related to [this](https://github.com/prowler-cloud/prowler/pull/2687) change.
Can somebody help me?
Thank you!
--- END ISSUE ---
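The traceback quoted above pins the failure to the path built in `resolve_security_hub_previous_findings`: when no output filename is supplied on the command line, `args.output_filename` stays `None`, and formatting it into the ASFF path produces `/home/prowler/output/None.asff.json`. A minimal sketch of that failure mode, using the variable names from the traceback (the generated filename that actually exists is resolved elsewhere, on the output options object used by the fix further down):

```python
# Minimal reproduction of the broken path construction; names follow the traceback.
output_directory = "/home/prowler/output"
output_filename = None  # args.output_filename when no filename is given on the CLI
json_asff_file_suffix = ".asff.json"

path = f"{output_directory}/{output_filename}{json_asff_file_suffix}"
print(path)  # -> /home/prowler/output/None.asff.json, which open() cannot find
```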
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `prowler/__main__.py`
Content:
```
1 #!/usr/bin/env python3
2 # -*- coding: utf-8 -*-
3
4 import os
5 import sys
6
7 from prowler.lib.banner import print_banner
8 from prowler.lib.check.check import (
9 bulk_load_checks_metadata,
10 bulk_load_compliance_frameworks,
11 exclude_checks_to_run,
12 exclude_services_to_run,
13 execute_checks,
14 list_categories,
15 list_services,
16 parse_checks_from_folder,
17 print_categories,
18 print_checks,
19 print_compliance_frameworks,
20 print_compliance_requirements,
21 print_services,
22 remove_custom_checks_module,
23 )
24 from prowler.lib.check.checks_loader import load_checks_to_execute
25 from prowler.lib.check.compliance import update_checks_metadata_with_compliance
26 from prowler.lib.cli.parser import ProwlerArgumentParser
27 from prowler.lib.logger import logger, set_logging_config
28 from prowler.lib.outputs.compliance import display_compliance_table
29 from prowler.lib.outputs.html import add_html_footer, fill_html_overview_statistics
30 from prowler.lib.outputs.json import close_json
31 from prowler.lib.outputs.outputs import extract_findings_statistics, send_to_s3_bucket
32 from prowler.lib.outputs.slack import send_slack_message
33 from prowler.lib.outputs.summary_table import display_summary_table
34 from prowler.providers.aws.lib.security_hub.security_hub import (
35 resolve_security_hub_previous_findings,
36 )
37 from prowler.providers.common.allowlist import set_provider_allowlist
38 from prowler.providers.common.audit_info import (
39 set_provider_audit_info,
40 set_provider_execution_parameters,
41 )
42 from prowler.providers.common.outputs import set_provider_output_options
43 from prowler.providers.common.quick_inventory import run_provider_quick_inventory
44
45
46 def prowler():
47 # Parse Arguments
48 parser = ProwlerArgumentParser()
49 args = parser.parse()
50
51 # Save Arguments
52 provider = args.provider
53 checks = args.checks
54 excluded_checks = args.excluded_checks
55 excluded_services = args.excluded_services
56 services = args.services
57 categories = args.categories
58 checks_file = args.checks_file
59 checks_folder = args.checks_folder
60 severities = args.severity
61 compliance_framework = args.compliance
62
63 if not args.no_banner:
64 print_banner(args)
65
66 # We treat the compliance framework as another output format
67 if compliance_framework:
68 args.output_modes.extend(compliance_framework)
69
70 # Set Logger configuration
71 set_logging_config(args.log_level, args.log_file, args.only_logs)
72
73 if args.list_services:
74 print_services(list_services(provider))
75 sys.exit()
76
77 # Load checks metadata
78 logger.debug("Loading checks metadata from .metadata.json files")
79 bulk_checks_metadata = bulk_load_checks_metadata(provider)
80
81 if args.list_categories:
82 print_categories(list_categories(bulk_checks_metadata))
83 sys.exit()
84
85 bulk_compliance_frameworks = {}
86 # Load compliance frameworks
87 logger.debug("Loading compliance frameworks from .json files")
88
89 bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
90 # Complete checks metadata with the compliance framework specification
91 update_checks_metadata_with_compliance(
92 bulk_compliance_frameworks, bulk_checks_metadata
93 )
94 if args.list_compliance:
95 print_compliance_frameworks(bulk_compliance_frameworks)
96 sys.exit()
97 if args.list_compliance_requirements:
98 print_compliance_requirements(
99 bulk_compliance_frameworks, args.list_compliance_requirements
100 )
101 sys.exit()
102
103 # Load checks to execute
104 checks_to_execute = load_checks_to_execute(
105 bulk_checks_metadata,
106 bulk_compliance_frameworks,
107 checks_file,
108 checks,
109 services,
110 severities,
111 compliance_framework,
112 categories,
113 provider,
114 )
115
116 # If -l/--list-checks passed as argument, print checks to execute and quit
117 if args.list_checks:
118 print_checks(provider, sorted(checks_to_execute), bulk_checks_metadata)
119 sys.exit()
120
121 # Set the audit info based on the selected provider
122 audit_info = set_provider_audit_info(provider, args.__dict__)
123
124 # Import custom checks from folder
125 if checks_folder:
126 parse_checks_from_folder(audit_info, checks_folder, provider)
127
128 # Exclude checks if -e/--excluded-checks
129 if excluded_checks:
130 checks_to_execute = exclude_checks_to_run(checks_to_execute, excluded_checks)
131
132 # Exclude services if --excluded-services
133 if excluded_services:
134 checks_to_execute = exclude_services_to_run(
135 checks_to_execute, excluded_services, provider
136 )
137
138 # Once the audit_info is set and we have the eventual checks based on the resource identifier,
139 # it is time to check what Prowler's checks are going to be executed
140 if audit_info.audit_resources:
141 checks_to_execute = set_provider_execution_parameters(provider, audit_info)
142
143 # Sort final check list
144 checks_to_execute = sorted(checks_to_execute)
145
146 # Parse Allowlist
147 allowlist_file = set_provider_allowlist(provider, audit_info, args)
148
149 # Set output options based on the selected provider
150 audit_output_options = set_provider_output_options(
151 provider, args, audit_info, allowlist_file, bulk_checks_metadata
152 )
153
154 # Run the quick inventory for the provider if available
155 if hasattr(args, "quick_inventory") and args.quick_inventory:
156 run_provider_quick_inventory(provider, audit_info, args)
157 sys.exit()
158
159 # Execute checks
160 findings = []
161 if len(checks_to_execute):
162 findings = execute_checks(
163 checks_to_execute, provider, audit_info, audit_output_options
164 )
165 else:
166 logger.error(
167 "There are no checks to execute. Please, check your input arguments"
168 )
169
170 # Extract findings stats
171 stats = extract_findings_statistics(findings)
172
173 if args.slack:
174 if "SLACK_API_TOKEN" in os.environ and "SLACK_CHANNEL_ID" in os.environ:
175 _ = send_slack_message(
176 os.environ["SLACK_API_TOKEN"],
177 os.environ["SLACK_CHANNEL_ID"],
178 stats,
179 provider,
180 audit_info,
181 )
182 else:
183 logger.critical(
184 "Slack integration needs SLACK_API_TOKEN and SLACK_CHANNEL_ID environment variables (see more in https://docs.prowler.cloud/en/latest/tutorials/integrations/#slack)."
185 )
186 sys.exit(1)
187
188 if args.output_modes:
189 for mode in args.output_modes:
190 # Close json file if exists
191 if "json" in mode:
192 close_json(
193 audit_output_options.output_filename, args.output_directory, mode
194 )
195 if mode == "html":
196 add_html_footer(
197 audit_output_options.output_filename, args.output_directory
198 )
199 fill_html_overview_statistics(
200 stats, audit_output_options.output_filename, args.output_directory
201 )
202 # Send output to S3 if needed (-B / -D)
203 if provider == "aws" and (
204 args.output_bucket or args.output_bucket_no_assume
205 ):
206 output_bucket = args.output_bucket
207 bucket_session = audit_info.audit_session
208 # Check if -D was input
209 if args.output_bucket_no_assume:
210 output_bucket = args.output_bucket_no_assume
211 bucket_session = audit_info.original_session
212 send_to_s3_bucket(
213 audit_output_options.output_filename,
214 args.output_directory,
215 mode,
216 output_bucket,
217 bucket_session,
218 )
219
220 # Resolve previous fails of Security Hub
221 if provider == "aws" and args.security_hub and not args.skip_sh_update:
222 resolve_security_hub_previous_findings(
223 args.output_directory, args.output_filename, audit_info
224 )
225
226 # Display summary table
227 if not args.only_logs:
228 display_summary_table(
229 findings,
230 audit_info,
231 audit_output_options,
232 provider,
233 )
234
235 if compliance_framework and findings:
236 for compliance in compliance_framework:
237 # Display compliance table
238 display_compliance_table(
239 findings,
240 bulk_checks_metadata,
241 compliance,
242 audit_output_options.output_filename,
243 audit_output_options.output_directory,
244 )
245
246 # If custom checks were passed, remove the modules
247 if checks_folder:
248 remove_custom_checks_module(checks_folder, provider)
249
250 # If there are failed findings exit code 3, except if -z is input
251 if not args.ignore_exit_code_3 and stats["total_fail"] > 0:
252 sys.exit(3)
253
254
255 if __name__ == "__main__":
256 prowler()
257
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/prowler/__main__.py b/prowler/__main__.py
--- a/prowler/__main__.py
+++ b/prowler/__main__.py
@@ -220,7 +220,9 @@
# Resolve previous fails of Security Hub
if provider == "aws" and args.security_hub and not args.skip_sh_update:
resolve_security_hub_previous_findings(
- args.output_directory, args.output_filename, audit_info
+ audit_output_options.output_directory,
+ audit_output_options.output_filename,
+ audit_info,
)
# Display summary table
| {"golden_diff": "diff --git a/prowler/__main__.py b/prowler/__main__.py\n--- a/prowler/__main__.py\n+++ b/prowler/__main__.py\n@@ -220,7 +220,9 @@\n # Resolve previous fails of Security Hub\n if provider == \"aws\" and args.security_hub and not args.skip_sh_update:\n resolve_security_hub_previous_findings(\n- args.output_directory, args.output_filename, audit_info\n+ audit_output_options.output_directory,\n+ audit_output_options.output_filename,\n+ audit_info,\n )\n \n # Display summary table\n", "issue": "FileNotFoundError after version 3.8.1\n### Discussed in https://github.com/prowler-cloud/prowler/discussions/2707\r\n\r\n<div type='discussions-op-text'>\r\n\r\n<sup>Originally posted by **cerontrustly** August 10, 2023</sup>\r\nHello guys!\r\n\r\nAfter using version 3.8.1 Prowler stopped working for me showing the following error:\r\n\r\n`Traceback (most recent call last):\r\n File \"/home/prowler/.local/bin/prowler\", line 8, in <module>\r\n sys.exit(prowler())\r\n File \"/home/prowler/.local/lib/python3.9/site-packages/prowler/__main__.py\", line 222, in prowler\r\n resolve_security_hub_previous_findings(\r\n File \"/home/prowler/.local/lib/python3.9/site-packages/prowler/providers/aws/lib/security_hub/security_hub.py\", line 66, in resolve_security_hub_previous_findings\r\n with open(f\"{output_directory}/{output_filename}{json_asff_file_suffix}\") as f:\r\nFileNotFoundError: [Errno 2] No such file or directory: '/home/prowler/output/None.asff.json'`\r\n\r\nMy command line looks like this:\r\n\r\n`docker run -v /tmp:/prowler/tmp toniblyx/prowler:stable -f us-west-2 -M json-asff -S -z`\r\n\r\nI think it can be related to [this](https://github.com/prowler-cloud/prowler/pull/2687) change.\r\n\r\nCan somebody you help me?\r\n\r\nThank you!</div>\n", "before_files": [{"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\nimport os\nimport sys\n\nfrom prowler.lib.banner import print_banner\nfrom prowler.lib.check.check import (\n bulk_load_checks_metadata,\n bulk_load_compliance_frameworks,\n exclude_checks_to_run,\n exclude_services_to_run,\n execute_checks,\n list_categories,\n list_services,\n parse_checks_from_folder,\n print_categories,\n print_checks,\n print_compliance_frameworks,\n print_compliance_requirements,\n print_services,\n remove_custom_checks_module,\n)\nfrom prowler.lib.check.checks_loader import load_checks_to_execute\nfrom prowler.lib.check.compliance import update_checks_metadata_with_compliance\nfrom prowler.lib.cli.parser import ProwlerArgumentParser\nfrom prowler.lib.logger import logger, set_logging_config\nfrom prowler.lib.outputs.compliance import display_compliance_table\nfrom prowler.lib.outputs.html import add_html_footer, fill_html_overview_statistics\nfrom prowler.lib.outputs.json import close_json\nfrom prowler.lib.outputs.outputs import extract_findings_statistics, send_to_s3_bucket\nfrom prowler.lib.outputs.slack import send_slack_message\nfrom prowler.lib.outputs.summary_table import display_summary_table\nfrom prowler.providers.aws.lib.security_hub.security_hub import (\n resolve_security_hub_previous_findings,\n)\nfrom prowler.providers.common.allowlist import set_provider_allowlist\nfrom prowler.providers.common.audit_info import (\n set_provider_audit_info,\n set_provider_execution_parameters,\n)\nfrom prowler.providers.common.outputs import set_provider_output_options\nfrom prowler.providers.common.quick_inventory import run_provider_quick_inventory\n\n\ndef prowler():\n # Parse Arguments\n parser = ProwlerArgumentParser()\n args = 
parser.parse()\n\n # Save Arguments\n provider = args.provider\n checks = args.checks\n excluded_checks = args.excluded_checks\n excluded_services = args.excluded_services\n services = args.services\n categories = args.categories\n checks_file = args.checks_file\n checks_folder = args.checks_folder\n severities = args.severity\n compliance_framework = args.compliance\n\n if not args.no_banner:\n print_banner(args)\n\n # We treat the compliance framework as another output format\n if compliance_framework:\n args.output_modes.extend(compliance_framework)\n\n # Set Logger configuration\n set_logging_config(args.log_level, args.log_file, args.only_logs)\n\n if args.list_services:\n print_services(list_services(provider))\n sys.exit()\n\n # Load checks metadata\n logger.debug(\"Loading checks metadata from .metadata.json files\")\n bulk_checks_metadata = bulk_load_checks_metadata(provider)\n\n if args.list_categories:\n print_categories(list_categories(bulk_checks_metadata))\n sys.exit()\n\n bulk_compliance_frameworks = {}\n # Load compliance frameworks\n logger.debug(\"Loading compliance frameworks from .json files\")\n\n bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)\n # Complete checks metadata with the compliance framework specification\n update_checks_metadata_with_compliance(\n bulk_compliance_frameworks, bulk_checks_metadata\n )\n if args.list_compliance:\n print_compliance_frameworks(bulk_compliance_frameworks)\n sys.exit()\n if args.list_compliance_requirements:\n print_compliance_requirements(\n bulk_compliance_frameworks, args.list_compliance_requirements\n )\n sys.exit()\n\n # Load checks to execute\n checks_to_execute = load_checks_to_execute(\n bulk_checks_metadata,\n bulk_compliance_frameworks,\n checks_file,\n checks,\n services,\n severities,\n compliance_framework,\n categories,\n provider,\n )\n\n # If -l/--list-checks passed as argument, print checks to execute and quit\n if args.list_checks:\n print_checks(provider, sorted(checks_to_execute), bulk_checks_metadata)\n sys.exit()\n\n # Set the audit info based on the selected provider\n audit_info = set_provider_audit_info(provider, args.__dict__)\n\n # Import custom checks from folder\n if checks_folder:\n parse_checks_from_folder(audit_info, checks_folder, provider)\n\n # Exclude checks if -e/--excluded-checks\n if excluded_checks:\n checks_to_execute = exclude_checks_to_run(checks_to_execute, excluded_checks)\n\n # Exclude services if --excluded-services\n if excluded_services:\n checks_to_execute = exclude_services_to_run(\n checks_to_execute, excluded_services, provider\n )\n\n # Once the audit_info is set and we have the eventual checks based on the resource identifier,\n # it is time to check what Prowler's checks are going to be executed\n if audit_info.audit_resources:\n checks_to_execute = set_provider_execution_parameters(provider, audit_info)\n\n # Sort final check list\n checks_to_execute = sorted(checks_to_execute)\n\n # Parse Allowlist\n allowlist_file = set_provider_allowlist(provider, audit_info, args)\n\n # Set output options based on the selected provider\n audit_output_options = set_provider_output_options(\n provider, args, audit_info, allowlist_file, bulk_checks_metadata\n )\n\n # Run the quick inventory for the provider if available\n if hasattr(args, \"quick_inventory\") and args.quick_inventory:\n run_provider_quick_inventory(provider, audit_info, args)\n sys.exit()\n\n # Execute checks\n findings = []\n if len(checks_to_execute):\n findings = execute_checks(\n 
checks_to_execute, provider, audit_info, audit_output_options\n )\n else:\n logger.error(\n \"There are no checks to execute. Please, check your input arguments\"\n )\n\n # Extract findings stats\n stats = extract_findings_statistics(findings)\n\n if args.slack:\n if \"SLACK_API_TOKEN\" in os.environ and \"SLACK_CHANNEL_ID\" in os.environ:\n _ = send_slack_message(\n os.environ[\"SLACK_API_TOKEN\"],\n os.environ[\"SLACK_CHANNEL_ID\"],\n stats,\n provider,\n audit_info,\n )\n else:\n logger.critical(\n \"Slack integration needs SLACK_API_TOKEN and SLACK_CHANNEL_ID environment variables (see more in https://docs.prowler.cloud/en/latest/tutorials/integrations/#slack).\"\n )\n sys.exit(1)\n\n if args.output_modes:\n for mode in args.output_modes:\n # Close json file if exists\n if \"json\" in mode:\n close_json(\n audit_output_options.output_filename, args.output_directory, mode\n )\n if mode == \"html\":\n add_html_footer(\n audit_output_options.output_filename, args.output_directory\n )\n fill_html_overview_statistics(\n stats, audit_output_options.output_filename, args.output_directory\n )\n # Send output to S3 if needed (-B / -D)\n if provider == \"aws\" and (\n args.output_bucket or args.output_bucket_no_assume\n ):\n output_bucket = args.output_bucket\n bucket_session = audit_info.audit_session\n # Check if -D was input\n if args.output_bucket_no_assume:\n output_bucket = args.output_bucket_no_assume\n bucket_session = audit_info.original_session\n send_to_s3_bucket(\n audit_output_options.output_filename,\n args.output_directory,\n mode,\n output_bucket,\n bucket_session,\n )\n\n # Resolve previous fails of Security Hub\n if provider == \"aws\" and args.security_hub and not args.skip_sh_update:\n resolve_security_hub_previous_findings(\n args.output_directory, args.output_filename, audit_info\n )\n\n # Display summary table\n if not args.only_logs:\n display_summary_table(\n findings,\n audit_info,\n audit_output_options,\n provider,\n )\n\n if compliance_framework and findings:\n for compliance in compliance_framework:\n # Display compliance table\n display_compliance_table(\n findings,\n bulk_checks_metadata,\n compliance,\n audit_output_options.output_filename,\n audit_output_options.output_directory,\n )\n\n # If custom checks were passed, remove the modules\n if checks_folder:\n remove_custom_checks_module(checks_folder, provider)\n\n # If there are failed findings exit code 3, except if -z is input\n if not args.ignore_exit_code_3 and stats[\"total_fail\"] > 0:\n sys.exit(3)\n\n\nif __name__ == \"__main__\":\n prowler()\n", "path": "prowler/__main__.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\nimport os\nimport sys\n\nfrom prowler.lib.banner import print_banner\nfrom prowler.lib.check.check import (\n bulk_load_checks_metadata,\n bulk_load_compliance_frameworks,\n exclude_checks_to_run,\n exclude_services_to_run,\n execute_checks,\n list_categories,\n list_services,\n parse_checks_from_folder,\n print_categories,\n print_checks,\n print_compliance_frameworks,\n print_compliance_requirements,\n print_services,\n remove_custom_checks_module,\n)\nfrom prowler.lib.check.checks_loader import load_checks_to_execute\nfrom prowler.lib.check.compliance import update_checks_metadata_with_compliance\nfrom prowler.lib.cli.parser import ProwlerArgumentParser\nfrom prowler.lib.logger import logger, set_logging_config\nfrom prowler.lib.outputs.compliance import display_compliance_table\nfrom prowler.lib.outputs.html import add_html_footer, 
fill_html_overview_statistics\nfrom prowler.lib.outputs.json import close_json\nfrom prowler.lib.outputs.outputs import extract_findings_statistics, send_to_s3_bucket\nfrom prowler.lib.outputs.slack import send_slack_message\nfrom prowler.lib.outputs.summary_table import display_summary_table\nfrom prowler.providers.aws.lib.security_hub.security_hub import (\n resolve_security_hub_previous_findings,\n)\nfrom prowler.providers.common.allowlist import set_provider_allowlist\nfrom prowler.providers.common.audit_info import (\n set_provider_audit_info,\n set_provider_execution_parameters,\n)\nfrom prowler.providers.common.outputs import set_provider_output_options\nfrom prowler.providers.common.quick_inventory import run_provider_quick_inventory\n\n\ndef prowler():\n # Parse Arguments\n parser = ProwlerArgumentParser()\n args = parser.parse()\n\n # Save Arguments\n provider = args.provider\n checks = args.checks\n excluded_checks = args.excluded_checks\n excluded_services = args.excluded_services\n services = args.services\n categories = args.categories\n checks_file = args.checks_file\n checks_folder = args.checks_folder\n severities = args.severity\n compliance_framework = args.compliance\n\n if not args.no_banner:\n print_banner(args)\n\n # We treat the compliance framework as another output format\n if compliance_framework:\n args.output_modes.extend(compliance_framework)\n\n # Set Logger configuration\n set_logging_config(args.log_level, args.log_file, args.only_logs)\n\n if args.list_services:\n print_services(list_services(provider))\n sys.exit()\n\n # Load checks metadata\n logger.debug(\"Loading checks metadata from .metadata.json files\")\n bulk_checks_metadata = bulk_load_checks_metadata(provider)\n\n if args.list_categories:\n print_categories(list_categories(bulk_checks_metadata))\n sys.exit()\n\n bulk_compliance_frameworks = {}\n # Load compliance frameworks\n logger.debug(\"Loading compliance frameworks from .json files\")\n\n bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)\n # Complete checks metadata with the compliance framework specification\n update_checks_metadata_with_compliance(\n bulk_compliance_frameworks, bulk_checks_metadata\n )\n if args.list_compliance:\n print_compliance_frameworks(bulk_compliance_frameworks)\n sys.exit()\n if args.list_compliance_requirements:\n print_compliance_requirements(\n bulk_compliance_frameworks, args.list_compliance_requirements\n )\n sys.exit()\n\n # Load checks to execute\n checks_to_execute = load_checks_to_execute(\n bulk_checks_metadata,\n bulk_compliance_frameworks,\n checks_file,\n checks,\n services,\n severities,\n compliance_framework,\n categories,\n provider,\n )\n\n # If -l/--list-checks passed as argument, print checks to execute and quit\n if args.list_checks:\n print_checks(provider, sorted(checks_to_execute), bulk_checks_metadata)\n sys.exit()\n\n # Set the audit info based on the selected provider\n audit_info = set_provider_audit_info(provider, args.__dict__)\n\n # Import custom checks from folder\n if checks_folder:\n parse_checks_from_folder(audit_info, checks_folder, provider)\n\n # Exclude checks if -e/--excluded-checks\n if excluded_checks:\n checks_to_execute = exclude_checks_to_run(checks_to_execute, excluded_checks)\n\n # Exclude services if --excluded-services\n if excluded_services:\n checks_to_execute = exclude_services_to_run(\n checks_to_execute, excluded_services, provider\n )\n\n # Once the audit_info is set and we have the eventual checks based on the resource identifier,\n # it 
is time to check what Prowler's checks are going to be executed\n if audit_info.audit_resources:\n checks_to_execute = set_provider_execution_parameters(provider, audit_info)\n\n # Sort final check list\n checks_to_execute = sorted(checks_to_execute)\n\n # Parse Allowlist\n allowlist_file = set_provider_allowlist(provider, audit_info, args)\n\n # Set output options based on the selected provider\n audit_output_options = set_provider_output_options(\n provider, args, audit_info, allowlist_file, bulk_checks_metadata\n )\n\n # Run the quick inventory for the provider if available\n if hasattr(args, \"quick_inventory\") and args.quick_inventory:\n run_provider_quick_inventory(provider, audit_info, args)\n sys.exit()\n\n # Execute checks\n findings = []\n if len(checks_to_execute):\n findings = execute_checks(\n checks_to_execute, provider, audit_info, audit_output_options\n )\n else:\n logger.error(\n \"There are no checks to execute. Please, check your input arguments\"\n )\n\n # Extract findings stats\n stats = extract_findings_statistics(findings)\n\n if args.slack:\n if \"SLACK_API_TOKEN\" in os.environ and \"SLACK_CHANNEL_ID\" in os.environ:\n _ = send_slack_message(\n os.environ[\"SLACK_API_TOKEN\"],\n os.environ[\"SLACK_CHANNEL_ID\"],\n stats,\n provider,\n audit_info,\n )\n else:\n logger.critical(\n \"Slack integration needs SLACK_API_TOKEN and SLACK_CHANNEL_ID environment variables (see more in https://docs.prowler.cloud/en/latest/tutorials/integrations/#slack).\"\n )\n sys.exit(1)\n\n if args.output_modes:\n for mode in args.output_modes:\n # Close json file if exists\n if \"json\" in mode:\n close_json(\n audit_output_options.output_filename, args.output_directory, mode\n )\n if mode == \"html\":\n add_html_footer(\n audit_output_options.output_filename, args.output_directory\n )\n fill_html_overview_statistics(\n stats, audit_output_options.output_filename, args.output_directory\n )\n # Send output to S3 if needed (-B / -D)\n if provider == \"aws\" and (\n args.output_bucket or args.output_bucket_no_assume\n ):\n output_bucket = args.output_bucket\n bucket_session = audit_info.audit_session\n # Check if -D was input\n if args.output_bucket_no_assume:\n output_bucket = args.output_bucket_no_assume\n bucket_session = audit_info.original_session\n send_to_s3_bucket(\n audit_output_options.output_filename,\n args.output_directory,\n mode,\n output_bucket,\n bucket_session,\n )\n\n # Resolve previous fails of Security Hub\n if provider == \"aws\" and args.security_hub and not args.skip_sh_update:\n resolve_security_hub_previous_findings(\n audit_output_options.output_directory,\n audit_output_options.output_filename,\n audit_info,\n )\n\n # Display summary table\n if not args.only_logs:\n display_summary_table(\n findings,\n audit_info,\n audit_output_options,\n provider,\n )\n\n if compliance_framework and findings:\n for compliance in compliance_framework:\n # Display compliance table\n display_compliance_table(\n findings,\n bulk_checks_metadata,\n compliance,\n audit_output_options.output_filename,\n audit_output_options.output_directory,\n )\n\n # If custom checks were passed, remove the modules\n if checks_folder:\n remove_custom_checks_module(checks_folder, provider)\n\n # If there are failed findings exit code 3, except if -z is input\n if not args.ignore_exit_code_3 and stats[\"total_fail\"] > 0:\n sys.exit(3)\n\n\nif __name__ == \"__main__\":\n prowler()\n", "path": "prowler/__main__.py"}]} | 3,057 | 132 |
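The Prowler patch above works because the rest of the run already writes its reports under the filename resolved on `audit_output_options` (see `close_json` and `send_to_s3_bucket` in the listing), so the Security Hub resolution must read that same attribute rather than the raw `args` value. A toy illustration of the difference; the defaulting function here is a made-up stand-in for `set_provider_output_options`, not Prowler's real logic:

```python
from argparse import Namespace

# Stand-in for the parsed CLI args when -M json-asff is used without an explicit filename.
args = Namespace(output_directory="/home/prowler/output", output_filename=None)


def resolve_output_options(args):
    # Made-up default; Prowler generates its own name when none is supplied.
    filename = args.output_filename or "prowler-output-20230810105220"
    return Namespace(output_directory=args.output_directory, output_filename=filename)


audit_output_options = resolve_output_options(args)

print(f"{args.output_filename}.asff.json")                  # None.asff.json (broken)
print(f"{audit_output_options.output_filename}.asff.json")  # the file that was written
```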
gh_patches_debug_18302 | rasdani/github-patches | git_diff | paperless-ngx__paperless-ngx-705 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] Mail account login via STARTTLS broken
### Description
Mail account login via STARTTLS is broken. See the error in the log file.
### Expected behavior
STARTTLS mail accounts should work normally
### Steps to reproduce
1. Set up a mail account with STARTTLS security and enable a mail rule for synchronization
2. Wait for the task to trigger
### Webserver logs
```bash
10:52:20 [Q] INFO Process-1 created a task from schedule [Check all e-mail accounts]
10:52:20 [Q] INFO Process-1:13 processing [wisconsin-music-enemy-sierra]
10:52:20 [Q] INFO Process-1:13 stopped doing work
10:52:20 [Q] ERROR Failed [wisconsin-music-enemy-sierra] - __init__() got an unexpected keyword argument 'starttls' : Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/django_q/cluster.py", line 432, in worker
res = f(*task["args"], **task["kwargs"])
File "/usr/src/paperless/src/paperless_mail/tasks.py", line 15, in process_mail_accounts
total_new_documents += MailAccountHandler().handle_mail_account(account)
File "/usr/src/paperless/src/paperless_mail/mail.py", line 158, in handle_mail_account
with get_mailbox(
File "/usr/src/paperless/src/paperless_mail/mail.py", line 95, in get_mailbox
mailbox = MailBox(server, port, starttls=True)
TypeError: __init__() got an unexpected keyword argument 'starttls'
10:52:21 [Q] INFO recycled worker Process-1:13
10:52:21 [Q] INFO Process-1:17 ready for work at 574
```
### Screenshots
_No response_
### Paperless-ngx version
1.7.0-rc1
### Host OS
Debian 11
### Installation method
Docker
### Browser
Firefox
### Configuration changes
n.a.
### Other
_No response_
--- END ISSUE ---
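The traceback and the fix at the end of this entry both point at an API change in `imap_tools`: newer releases no longer accept a `starttls=` keyword on `MailBox` and instead ship a dedicated `MailBoxTls` class for STARTTLS connections. The failing code lives in `get_mailbox()` in `src/paperless_mail/mail.py`, shown in full below. A small sketch of that selector written against the newer API, with plain strings standing in for the `MailAccount.ImapSecurity` enum values:

```python
from imap_tools import MailBox, MailBoxUnencrypted
from imap_tools.mailbox import MailBoxTls


def get_mailbox_sketch(server: str, port: int, security: str):
    # Pick a mailbox class per security mode (sketch, not the paperless code).
    if security == "SSL":
        return MailBox(server, port)  # implicit TLS on connect
    if security == "STARTTLS":
        # Replaces the removed MailBox(server, port, starttls=True) spelling.
        return MailBoxTls(server, port)
    if security == "NONE":
        return MailBoxUnencrypted(server, port)
    raise NotImplementedError("Unknown IMAP security")
```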
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/paperless_mail/mail.py`
Content:
```
1 import os
2 import tempfile
3 from datetime import date
4 from datetime import timedelta
5 from fnmatch import fnmatch
6
7 import magic
8 import pathvalidate
9 from django.conf import settings
10 from django.db import DatabaseError
11 from django_q.tasks import async_task
12 from documents.loggers import LoggingMixin
13 from documents.models import Correspondent
14 from documents.parsers import is_mime_type_supported
15 from imap_tools import AND
16 from imap_tools import MailBox
17 from imap_tools import MailboxFolderSelectError
18 from imap_tools import MailBoxUnencrypted
19 from imap_tools import MailMessage
20 from imap_tools import MailMessageFlags
21 from paperless_mail.models import MailAccount
22 from paperless_mail.models import MailRule
23
24
25 class MailError(Exception):
26 pass
27
28
29 class BaseMailAction:
30 def get_criteria(self):
31 return {}
32
33 def post_consume(self, M, message_uids, parameter):
34 pass # pragma: nocover
35
36
37 class DeleteMailAction(BaseMailAction):
38 def post_consume(self, M, message_uids, parameter):
39 M.delete(message_uids)
40
41
42 class MarkReadMailAction(BaseMailAction):
43 def get_criteria(self):
44 return {"seen": False}
45
46 def post_consume(self, M, message_uids, parameter):
47 M.flag(message_uids, [MailMessageFlags.SEEN], True)
48
49
50 class MoveMailAction(BaseMailAction):
51 def post_consume(self, M, message_uids, parameter):
52 M.move(message_uids, parameter)
53
54
55 class FlagMailAction(BaseMailAction):
56 def get_criteria(self):
57 return {"flagged": False}
58
59 def post_consume(self, M, message_uids, parameter):
60 M.flag(message_uids, [MailMessageFlags.FLAGGED], True)
61
62
63 def get_rule_action(rule):
64 if rule.action == MailRule.AttachmentAction.FLAG:
65 return FlagMailAction()
66 elif rule.action == MailRule.AttachmentAction.DELETE:
67 return DeleteMailAction()
68 elif rule.action == MailRule.AttachmentAction.MOVE:
69 return MoveMailAction()
70 elif rule.action == MailRule.AttachmentAction.MARK_READ:
71 return MarkReadMailAction()
72 else:
73 raise NotImplementedError("Unknown action.") # pragma: nocover
74
75
76 def make_criterias(rule):
77 maximum_age = date.today() - timedelta(days=rule.maximum_age)
78 criterias = {}
79 if rule.maximum_age > 0:
80 criterias["date_gte"] = maximum_age
81 if rule.filter_from:
82 criterias["from_"] = rule.filter_from
83 if rule.filter_subject:
84 criterias["subject"] = rule.filter_subject
85 if rule.filter_body:
86 criterias["body"] = rule.filter_body
87
88 return {**criterias, **get_rule_action(rule).get_criteria()}
89
90
91 def get_mailbox(server, port, security):
92 if security == MailAccount.ImapSecurity.NONE:
93 mailbox = MailBoxUnencrypted(server, port)
94 elif security == MailAccount.ImapSecurity.STARTTLS:
95 mailbox = MailBox(server, port, starttls=True)
96 elif security == MailAccount.ImapSecurity.SSL:
97 mailbox = MailBox(server, port)
98 else:
99 raise NotImplementedError("Unknown IMAP security") # pragma: nocover
100 return mailbox
101
102
103 class MailAccountHandler(LoggingMixin):
104
105 logging_name = "paperless_mail"
106
107 def _correspondent_from_name(self, name):
108 try:
109 return Correspondent.objects.get_or_create(name=name)[0]
110 except DatabaseError as e:
111 self.log("error", f"Error while retrieving correspondent {name}: {e}")
112 return None
113
114 def get_title(self, message, att, rule):
115 if rule.assign_title_from == MailRule.TitleSource.FROM_SUBJECT:
116 return message.subject
117
118 elif rule.assign_title_from == MailRule.TitleSource.FROM_FILENAME:
119 return os.path.splitext(os.path.basename(att.filename))[0]
120
121 else:
122 raise NotImplementedError(
123 "Unknown title selector.",
124 ) # pragma: nocover
125
126 def get_correspondent(self, message: MailMessage, rule):
127 c_from = rule.assign_correspondent_from
128
129 if c_from == MailRule.CorrespondentSource.FROM_NOTHING:
130 return None
131
132 elif c_from == MailRule.CorrespondentSource.FROM_EMAIL:
133 return self._correspondent_from_name(message.from_)
134
135 elif c_from == MailRule.CorrespondentSource.FROM_NAME:
136 from_values = message.from_values
137 if from_values is not None and len(from_values.name) > 0:
138 return self._correspondent_from_name(from_values.name)
139 else:
140 return self._correspondent_from_name(message.from_)
141
142 elif c_from == MailRule.CorrespondentSource.FROM_CUSTOM:
143 return rule.assign_correspondent
144
145 else:
146 raise NotImplementedError(
147 "Unknwown correspondent selector",
148 ) # pragma: nocover
149
150 def handle_mail_account(self, account):
151
152 self.renew_logging_group()
153
154 self.log("debug", f"Processing mail account {account}")
155
156 total_processed_files = 0
157
158 with get_mailbox(
159 account.imap_server,
160 account.imap_port,
161 account.imap_security,
162 ) as M:
163
164 try:
165 M.login(account.username, account.password)
166 except Exception:
167 raise MailError(f"Error while authenticating account {account}")
168
169 self.log(
170 "debug",
171 f"Account {account}: Processing " f"{account.rules.count()} rule(s)",
172 )
173
174 for rule in account.rules.order_by("order"):
175 try:
176 total_processed_files += self.handle_mail_rule(M, rule)
177 except Exception as e:
178 self.log(
179 "error",
180 f"Rule {rule}: Error while processing rule: {e}",
181 exc_info=True,
182 )
183
184 return total_processed_files
185
186 def handle_mail_rule(self, M, rule):
187
188 self.log("debug", f"Rule {rule}: Selecting folder {rule.folder}")
189
190 try:
191 M.folder.set(rule.folder)
192 except MailboxFolderSelectError:
193 raise MailError(
194 f"Rule {rule}: Folder {rule.folder} "
195 f"does not exist in account {rule.account}",
196 )
197
198 criterias = make_criterias(rule)
199
200 self.log(
201 "debug",
202 f"Rule {rule}: Searching folder with criteria " f"{str(AND(**criterias))}",
203 )
204
205 try:
206 messages = M.fetch(
207 criteria=AND(**criterias),
208 mark_seen=False,
209 charset=rule.account.character_set,
210 )
211 except Exception:
212 raise MailError(f"Rule {rule}: Error while fetching folder {rule.folder}")
213
214 post_consume_messages = []
215
216 mails_processed = 0
217 total_processed_files = 0
218
219 for message in messages:
220 try:
221 processed_files = self.handle_message(message, rule)
222 if processed_files > 0:
223 post_consume_messages.append(message.uid)
224
225 total_processed_files += processed_files
226 mails_processed += 1
227 except Exception as e:
228 self.log(
229 "error",
230 f"Rule {rule}: Error while processing mail " f"{message.uid}: {e}",
231 exc_info=True,
232 )
233
234 self.log("debug", f"Rule {rule}: Processed {mails_processed} matching mail(s)")
235
236 self.log(
237 "debug",
238 f"Rule {rule}: Running mail actions on "
239 f"{len(post_consume_messages)} mails",
240 )
241
242 try:
243 get_rule_action(rule).post_consume(
244 M,
245 post_consume_messages,
246 rule.action_parameter,
247 )
248
249 except Exception as e:
250 raise MailError(
251 f"Rule {rule}: Error while processing post-consume actions: " f"{e}",
252 )
253
254 return total_processed_files
255
256 def handle_message(self, message, rule) -> int:
257 if not message.attachments:
258 return 0
259
260 self.log(
261 "debug",
262 f"Rule {rule}: "
263 f"Processing mail {message.subject} from {message.from_} with "
264 f"{len(message.attachments)} attachment(s)",
265 )
266
267 correspondent = self.get_correspondent(message, rule)
268 tag = rule.assign_tag
269 doc_type = rule.assign_document_type
270
271 processed_attachments = 0
272
273 for att in message.attachments:
274
275 if (
276 not att.content_disposition == "attachment"
277 and rule.attachment_type
278 == MailRule.AttachmentProcessing.ATTACHMENTS_ONLY
279 ):
280 self.log(
281 "debug",
282 f"Rule {rule}: "
283 f"Skipping attachment {att.filename} "
284 f"with content disposition {att.content_disposition}",
285 )
286 continue
287
288 if rule.filter_attachment_filename:
289 # Force the filename and pattern to the lowercase
290 # as this is system dependent otherwise
291 if not fnmatch(
292 att.filename.lower(),
293 rule.filter_attachment_filename.lower(),
294 ):
295 continue
296
297 title = self.get_title(message, att, rule)
298
299 # don't trust the content type of the attachment. Could be
300 # generic application/octet-stream.
301 mime_type = magic.from_buffer(att.payload, mime=True)
302
303 if is_mime_type_supported(mime_type):
304
305 os.makedirs(settings.SCRATCH_DIR, exist_ok=True)
306 _, temp_filename = tempfile.mkstemp(
307 prefix="paperless-mail-",
308 dir=settings.SCRATCH_DIR,
309 )
310 with open(temp_filename, "wb") as f:
311 f.write(att.payload)
312
313 self.log(
314 "info",
315 f"Rule {rule}: "
316 f"Consuming attachment {att.filename} from mail "
317 f"{message.subject} from {message.from_}",
318 )
319
320 async_task(
321 "documents.tasks.consume_file",
322 path=temp_filename,
323 override_filename=pathvalidate.sanitize_filename(
324 att.filename,
325 ),
326 override_title=title,
327 override_correspondent_id=correspondent.id
328 if correspondent
329 else None,
330 override_document_type_id=doc_type.id if doc_type else None,
331 override_tag_ids=[tag.id] if tag else None,
332 task_name=att.filename[:100],
333 )
334
335 processed_attachments += 1
336 else:
337 self.log(
338 "debug",
339 f"Rule {rule}: "
340 f"Skipping attachment {att.filename} "
341 f"since guessed mime type {mime_type} is not supported "
342 f"by paperless",
343 )
344
345 return processed_attachments
346
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/paperless_mail/mail.py b/src/paperless_mail/mail.py
--- a/src/paperless_mail/mail.py
+++ b/src/paperless_mail/mail.py
@@ -18,6 +18,7 @@
from imap_tools import MailBoxUnencrypted
from imap_tools import MailMessage
from imap_tools import MailMessageFlags
+from imap_tools.mailbox import MailBoxTls
from paperless_mail.models import MailAccount
from paperless_mail.models import MailRule
@@ -92,7 +93,7 @@
if security == MailAccount.ImapSecurity.NONE:
mailbox = MailBoxUnencrypted(server, port)
elif security == MailAccount.ImapSecurity.STARTTLS:
- mailbox = MailBox(server, port, starttls=True)
+ mailbox = MailBoxTls(server, port)
elif security == MailAccount.ImapSecurity.SSL:
mailbox = MailBox(server, port)
else:
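Downstream of `get_mailbox()` nothing else needs to change: `MailBoxTls` supports the same context-manager, `login()`, `folder.set()` and `fetch()` flow that `handle_mail_account()` and `handle_mail_rule()` already rely on. A hypothetical end-to-end usage with placeholder host and credentials (not values from the issue):

```python
from imap_tools import AND
from imap_tools.mailbox import MailBoxTls

# Placeholder connection details for illustration only.
server, port = "imap.example.com", 143
username, password = "user@example.com", "secret"

with MailBoxTls(server, port) as mailbox:
    mailbox.login(username, password)
    mailbox.folder.set("INBOX")
    for message in mailbox.fetch(criteria=AND(seen=False), mark_seen=False):
        print(message.uid, message.subject, len(message.attachments))
```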
| {"golden_diff": "diff --git a/src/paperless_mail/mail.py b/src/paperless_mail/mail.py\n--- a/src/paperless_mail/mail.py\n+++ b/src/paperless_mail/mail.py\n@@ -18,6 +18,7 @@\n from imap_tools import MailBoxUnencrypted\n from imap_tools import MailMessage\n from imap_tools import MailMessageFlags\n+from imap_tools.mailbox import MailBoxTls\n from paperless_mail.models import MailAccount\n from paperless_mail.models import MailRule\n \n@@ -92,7 +93,7 @@\n if security == MailAccount.ImapSecurity.NONE:\n mailbox = MailBoxUnencrypted(server, port)\n elif security == MailAccount.ImapSecurity.STARTTLS:\n- mailbox = MailBox(server, port, starttls=True)\n+ mailbox = MailBoxTls(server, port)\n elif security == MailAccount.ImapSecurity.SSL:\n mailbox = MailBox(server, port)\n else:\n", "issue": "[BUG] Mail account login via STARTTLS broken\n### Description\n\nMail account login via STARTTLS is broken. See the error in the log file.\n\n### Expected behavior\n\nSTARTTLS mail accounts should work normally\n\n### Steps to reproduce\n\n1. Setup a mail account with STARTTLS security and enable a mail rule for synchronization\r\n2. Wait for the task to trigger\n\n### Webserver logs\n\n```bash\n10:52:20 [Q] INFO Process-1 created a task from schedule [Check all e-mail accounts]\r\n10:52:20 [Q] INFO Process-1:13 processing [wisconsin-music-enemy-sierra]\r\n10:52:20 [Q] INFO Process-1:13 stopped doing work\r\n10:52:20 [Q] ERROR Failed [wisconsin-music-enemy-sierra] - __init__() got an unexpected keyword argument 'starttls' : Traceback (most recent call last):\r\n File \"/usr/local/lib/python3.9/site-packages/django_q/cluster.py\", line 432, in worker\r\n res = f(*task[\"args\"], **task[\"kwargs\"])\r\n File \"/usr/src/paperless/src/paperless_mail/tasks.py\", line 15, in process_mail_accounts\r\n total_new_documents += MailAccountHandler().handle_mail_account(account)\r\n File \"/usr/src/paperless/src/paperless_mail/mail.py\", line 158, in handle_mail_account\r\n with get_mailbox(\r\n File \"/usr/src/paperless/src/paperless_mail/mail.py\", line 95, in get_mailbox\r\n mailbox = MailBox(server, port, starttls=True)\r\nTypeError: __init__() got an unexpected keyword argument 'starttls'\r\n\r\n10:52:21 [Q] INFO recycled worker Process-1:13\r\n10:52:21 [Q] INFO Process-1:17 ready for work at 574\n```\n\n\n### Screenshots\n\n_No response_\n\n### Paperless-ngx version\n\n1.7.0-rc1\n\n### Host OS\n\nDebian 11\n\n### Installation method\n\nDocker\n\n### Browser\n\nFirefox\n\n### Configuration changes\n\nn.a.\n\n### Other\n\n_No response_\n", "before_files": [{"content": "import os\nimport tempfile\nfrom datetime import date\nfrom datetime import timedelta\nfrom fnmatch import fnmatch\n\nimport magic\nimport pathvalidate\nfrom django.conf import settings\nfrom django.db import DatabaseError\nfrom django_q.tasks import async_task\nfrom documents.loggers import LoggingMixin\nfrom documents.models import Correspondent\nfrom documents.parsers import is_mime_type_supported\nfrom imap_tools import AND\nfrom imap_tools import MailBox\nfrom imap_tools import MailboxFolderSelectError\nfrom imap_tools import MailBoxUnencrypted\nfrom imap_tools import MailMessage\nfrom imap_tools import MailMessageFlags\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\n\n\nclass MailError(Exception):\n pass\n\n\nclass BaseMailAction:\n def get_criteria(self):\n return {}\n\n def post_consume(self, M, message_uids, parameter):\n pass # pragma: nocover\n\n\nclass DeleteMailAction(BaseMailAction):\n def 
post_consume(self, M, message_uids, parameter):\n M.delete(message_uids)\n\n\nclass MarkReadMailAction(BaseMailAction):\n def get_criteria(self):\n return {\"seen\": False}\n\n def post_consume(self, M, message_uids, parameter):\n M.flag(message_uids, [MailMessageFlags.SEEN], True)\n\n\nclass MoveMailAction(BaseMailAction):\n def post_consume(self, M, message_uids, parameter):\n M.move(message_uids, parameter)\n\n\nclass FlagMailAction(BaseMailAction):\n def get_criteria(self):\n return {\"flagged\": False}\n\n def post_consume(self, M, message_uids, parameter):\n M.flag(message_uids, [MailMessageFlags.FLAGGED], True)\n\n\ndef get_rule_action(rule):\n if rule.action == MailRule.AttachmentAction.FLAG:\n return FlagMailAction()\n elif rule.action == MailRule.AttachmentAction.DELETE:\n return DeleteMailAction()\n elif rule.action == MailRule.AttachmentAction.MOVE:\n return MoveMailAction()\n elif rule.action == MailRule.AttachmentAction.MARK_READ:\n return MarkReadMailAction()\n else:\n raise NotImplementedError(\"Unknown action.\") # pragma: nocover\n\n\ndef make_criterias(rule):\n maximum_age = date.today() - timedelta(days=rule.maximum_age)\n criterias = {}\n if rule.maximum_age > 0:\n criterias[\"date_gte\"] = maximum_age\n if rule.filter_from:\n criterias[\"from_\"] = rule.filter_from\n if rule.filter_subject:\n criterias[\"subject\"] = rule.filter_subject\n if rule.filter_body:\n criterias[\"body\"] = rule.filter_body\n\n return {**criterias, **get_rule_action(rule).get_criteria()}\n\n\ndef get_mailbox(server, port, security):\n if security == MailAccount.ImapSecurity.NONE:\n mailbox = MailBoxUnencrypted(server, port)\n elif security == MailAccount.ImapSecurity.STARTTLS:\n mailbox = MailBox(server, port, starttls=True)\n elif security == MailAccount.ImapSecurity.SSL:\n mailbox = MailBox(server, port)\n else:\n raise NotImplementedError(\"Unknown IMAP security\") # pragma: nocover\n return mailbox\n\n\nclass MailAccountHandler(LoggingMixin):\n\n logging_name = \"paperless_mail\"\n\n def _correspondent_from_name(self, name):\n try:\n return Correspondent.objects.get_or_create(name=name)[0]\n except DatabaseError as e:\n self.log(\"error\", f\"Error while retrieving correspondent {name}: {e}\")\n return None\n\n def get_title(self, message, att, rule):\n if rule.assign_title_from == MailRule.TitleSource.FROM_SUBJECT:\n return message.subject\n\n elif rule.assign_title_from == MailRule.TitleSource.FROM_FILENAME:\n return os.path.splitext(os.path.basename(att.filename))[0]\n\n else:\n raise NotImplementedError(\n \"Unknown title selector.\",\n ) # pragma: nocover\n\n def get_correspondent(self, message: MailMessage, rule):\n c_from = rule.assign_correspondent_from\n\n if c_from == MailRule.CorrespondentSource.FROM_NOTHING:\n return None\n\n elif c_from == MailRule.CorrespondentSource.FROM_EMAIL:\n return self._correspondent_from_name(message.from_)\n\n elif c_from == MailRule.CorrespondentSource.FROM_NAME:\n from_values = message.from_values\n if from_values is not None and len(from_values.name) > 0:\n return self._correspondent_from_name(from_values.name)\n else:\n return self._correspondent_from_name(message.from_)\n\n elif c_from == MailRule.CorrespondentSource.FROM_CUSTOM:\n return rule.assign_correspondent\n\n else:\n raise NotImplementedError(\n \"Unknwown correspondent selector\",\n ) # pragma: nocover\n\n def handle_mail_account(self, account):\n\n self.renew_logging_group()\n\n self.log(\"debug\", f\"Processing mail account {account}\")\n\n total_processed_files = 0\n\n with 
get_mailbox(\n account.imap_server,\n account.imap_port,\n account.imap_security,\n ) as M:\n\n try:\n M.login(account.username, account.password)\n except Exception:\n raise MailError(f\"Error while authenticating account {account}\")\n\n self.log(\n \"debug\",\n f\"Account {account}: Processing \" f\"{account.rules.count()} rule(s)\",\n )\n\n for rule in account.rules.order_by(\"order\"):\n try:\n total_processed_files += self.handle_mail_rule(M, rule)\n except Exception as e:\n self.log(\n \"error\",\n f\"Rule {rule}: Error while processing rule: {e}\",\n exc_info=True,\n )\n\n return total_processed_files\n\n def handle_mail_rule(self, M, rule):\n\n self.log(\"debug\", f\"Rule {rule}: Selecting folder {rule.folder}\")\n\n try:\n M.folder.set(rule.folder)\n except MailboxFolderSelectError:\n raise MailError(\n f\"Rule {rule}: Folder {rule.folder} \"\n f\"does not exist in account {rule.account}\",\n )\n\n criterias = make_criterias(rule)\n\n self.log(\n \"debug\",\n f\"Rule {rule}: Searching folder with criteria \" f\"{str(AND(**criterias))}\",\n )\n\n try:\n messages = M.fetch(\n criteria=AND(**criterias),\n mark_seen=False,\n charset=rule.account.character_set,\n )\n except Exception:\n raise MailError(f\"Rule {rule}: Error while fetching folder {rule.folder}\")\n\n post_consume_messages = []\n\n mails_processed = 0\n total_processed_files = 0\n\n for message in messages:\n try:\n processed_files = self.handle_message(message, rule)\n if processed_files > 0:\n post_consume_messages.append(message.uid)\n\n total_processed_files += processed_files\n mails_processed += 1\n except Exception as e:\n self.log(\n \"error\",\n f\"Rule {rule}: Error while processing mail \" f\"{message.uid}: {e}\",\n exc_info=True,\n )\n\n self.log(\"debug\", f\"Rule {rule}: Processed {mails_processed} matching mail(s)\")\n\n self.log(\n \"debug\",\n f\"Rule {rule}: Running mail actions on \"\n f\"{len(post_consume_messages)} mails\",\n )\n\n try:\n get_rule_action(rule).post_consume(\n M,\n post_consume_messages,\n rule.action_parameter,\n )\n\n except Exception as e:\n raise MailError(\n f\"Rule {rule}: Error while processing post-consume actions: \" f\"{e}\",\n )\n\n return total_processed_files\n\n def handle_message(self, message, rule) -> int:\n if not message.attachments:\n return 0\n\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Processing mail {message.subject} from {message.from_} with \"\n f\"{len(message.attachments)} attachment(s)\",\n )\n\n correspondent = self.get_correspondent(message, rule)\n tag = rule.assign_tag\n doc_type = rule.assign_document_type\n\n processed_attachments = 0\n\n for att in message.attachments:\n\n if (\n not att.content_disposition == \"attachment\"\n and rule.attachment_type\n == MailRule.AttachmentProcessing.ATTACHMENTS_ONLY\n ):\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Skipping attachment {att.filename} \"\n f\"with content disposition {att.content_disposition}\",\n )\n continue\n\n if rule.filter_attachment_filename:\n # Force the filename and pattern to the lowercase\n # as this is system dependent otherwise\n if not fnmatch(\n att.filename.lower(),\n rule.filter_attachment_filename.lower(),\n ):\n continue\n\n title = self.get_title(message, att, rule)\n\n # don't trust the content type of the attachment. 
Could be\n # generic application/octet-stream.\n mime_type = magic.from_buffer(att.payload, mime=True)\n\n if is_mime_type_supported(mime_type):\n\n os.makedirs(settings.SCRATCH_DIR, exist_ok=True)\n _, temp_filename = tempfile.mkstemp(\n prefix=\"paperless-mail-\",\n dir=settings.SCRATCH_DIR,\n )\n with open(temp_filename, \"wb\") as f:\n f.write(att.payload)\n\n self.log(\n \"info\",\n f\"Rule {rule}: \"\n f\"Consuming attachment {att.filename} from mail \"\n f\"{message.subject} from {message.from_}\",\n )\n\n async_task(\n \"documents.tasks.consume_file\",\n path=temp_filename,\n override_filename=pathvalidate.sanitize_filename(\n att.filename,\n ),\n override_title=title,\n override_correspondent_id=correspondent.id\n if correspondent\n else None,\n override_document_type_id=doc_type.id if doc_type else None,\n override_tag_ids=[tag.id] if tag else None,\n task_name=att.filename[:100],\n )\n\n processed_attachments += 1\n else:\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Skipping attachment {att.filename} \"\n f\"since guessed mime type {mime_type} is not supported \"\n f\"by paperless\",\n )\n\n return processed_attachments\n", "path": "src/paperless_mail/mail.py"}], "after_files": [{"content": "import os\nimport tempfile\nfrom datetime import date\nfrom datetime import timedelta\nfrom fnmatch import fnmatch\n\nimport magic\nimport pathvalidate\nfrom django.conf import settings\nfrom django.db import DatabaseError\nfrom django_q.tasks import async_task\nfrom documents.loggers import LoggingMixin\nfrom documents.models import Correspondent\nfrom documents.parsers import is_mime_type_supported\nfrom imap_tools import AND\nfrom imap_tools import MailBox\nfrom imap_tools import MailboxFolderSelectError\nfrom imap_tools import MailBoxUnencrypted\nfrom imap_tools import MailMessage\nfrom imap_tools import MailMessageFlags\nfrom imap_tools.mailbox import MailBoxTls\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\n\n\nclass MailError(Exception):\n pass\n\n\nclass BaseMailAction:\n def get_criteria(self):\n return {}\n\n def post_consume(self, M, message_uids, parameter):\n pass # pragma: nocover\n\n\nclass DeleteMailAction(BaseMailAction):\n def post_consume(self, M, message_uids, parameter):\n M.delete(message_uids)\n\n\nclass MarkReadMailAction(BaseMailAction):\n def get_criteria(self):\n return {\"seen\": False}\n\n def post_consume(self, M, message_uids, parameter):\n M.flag(message_uids, [MailMessageFlags.SEEN], True)\n\n\nclass MoveMailAction(BaseMailAction):\n def post_consume(self, M, message_uids, parameter):\n M.move(message_uids, parameter)\n\n\nclass FlagMailAction(BaseMailAction):\n def get_criteria(self):\n return {\"flagged\": False}\n\n def post_consume(self, M, message_uids, parameter):\n M.flag(message_uids, [MailMessageFlags.FLAGGED], True)\n\n\ndef get_rule_action(rule):\n if rule.action == MailRule.AttachmentAction.FLAG:\n return FlagMailAction()\n elif rule.action == MailRule.AttachmentAction.DELETE:\n return DeleteMailAction()\n elif rule.action == MailRule.AttachmentAction.MOVE:\n return MoveMailAction()\n elif rule.action == MailRule.AttachmentAction.MARK_READ:\n return MarkReadMailAction()\n else:\n raise NotImplementedError(\"Unknown action.\") # pragma: nocover\n\n\ndef make_criterias(rule):\n maximum_age = date.today() - timedelta(days=rule.maximum_age)\n criterias = {}\n if rule.maximum_age > 0:\n criterias[\"date_gte\"] = maximum_age\n if rule.filter_from:\n criterias[\"from_\"] = rule.filter_from\n if 
rule.filter_subject:\n criterias[\"subject\"] = rule.filter_subject\n if rule.filter_body:\n criterias[\"body\"] = rule.filter_body\n\n return {**criterias, **get_rule_action(rule).get_criteria()}\n\n\ndef get_mailbox(server, port, security):\n if security == MailAccount.ImapSecurity.NONE:\n mailbox = MailBoxUnencrypted(server, port)\n elif security == MailAccount.ImapSecurity.STARTTLS:\n mailbox = MailBoxTls(server, port)\n elif security == MailAccount.ImapSecurity.SSL:\n mailbox = MailBox(server, port)\n else:\n raise NotImplementedError(\"Unknown IMAP security\") # pragma: nocover\n return mailbox\n\n\nclass MailAccountHandler(LoggingMixin):\n\n logging_name = \"paperless_mail\"\n\n def _correspondent_from_name(self, name):\n try:\n return Correspondent.objects.get_or_create(name=name)[0]\n except DatabaseError as e:\n self.log(\"error\", f\"Error while retrieving correspondent {name}: {e}\")\n return None\n\n def get_title(self, message, att, rule):\n if rule.assign_title_from == MailRule.TitleSource.FROM_SUBJECT:\n return message.subject\n\n elif rule.assign_title_from == MailRule.TitleSource.FROM_FILENAME:\n return os.path.splitext(os.path.basename(att.filename))[0]\n\n else:\n raise NotImplementedError(\n \"Unknown title selector.\",\n ) # pragma: nocover\n\n def get_correspondent(self, message: MailMessage, rule):\n c_from = rule.assign_correspondent_from\n\n if c_from == MailRule.CorrespondentSource.FROM_NOTHING:\n return None\n\n elif c_from == MailRule.CorrespondentSource.FROM_EMAIL:\n return self._correspondent_from_name(message.from_)\n\n elif c_from == MailRule.CorrespondentSource.FROM_NAME:\n from_values = message.from_values\n if from_values is not None and len(from_values.name) > 0:\n return self._correspondent_from_name(from_values.name)\n else:\n return self._correspondent_from_name(message.from_)\n\n elif c_from == MailRule.CorrespondentSource.FROM_CUSTOM:\n return rule.assign_correspondent\n\n else:\n raise NotImplementedError(\n \"Unknwown correspondent selector\",\n ) # pragma: nocover\n\n def handle_mail_account(self, account):\n\n self.renew_logging_group()\n\n self.log(\"debug\", f\"Processing mail account {account}\")\n\n total_processed_files = 0\n\n with get_mailbox(\n account.imap_server,\n account.imap_port,\n account.imap_security,\n ) as M:\n\n try:\n M.login(account.username, account.password)\n except Exception:\n raise MailError(f\"Error while authenticating account {account}\")\n\n self.log(\n \"debug\",\n f\"Account {account}: Processing \" f\"{account.rules.count()} rule(s)\",\n )\n\n for rule in account.rules.order_by(\"order\"):\n try:\n total_processed_files += self.handle_mail_rule(M, rule)\n except Exception as e:\n self.log(\n \"error\",\n f\"Rule {rule}: Error while processing rule: {e}\",\n exc_info=True,\n )\n\n return total_processed_files\n\n def handle_mail_rule(self, M, rule):\n\n self.log(\"debug\", f\"Rule {rule}: Selecting folder {rule.folder}\")\n\n try:\n M.folder.set(rule.folder)\n except MailboxFolderSelectError:\n raise MailError(\n f\"Rule {rule}: Folder {rule.folder} \"\n f\"does not exist in account {rule.account}\",\n )\n\n criterias = make_criterias(rule)\n\n self.log(\n \"debug\",\n f\"Rule {rule}: Searching folder with criteria \" f\"{str(AND(**criterias))}\",\n )\n\n try:\n messages = M.fetch(\n criteria=AND(**criterias),\n mark_seen=False,\n charset=rule.account.character_set,\n )\n except Exception:\n raise MailError(f\"Rule {rule}: Error while fetching folder {rule.folder}\")\n\n post_consume_messages = []\n\n 
mails_processed = 0\n total_processed_files = 0\n\n for message in messages:\n try:\n processed_files = self.handle_message(message, rule)\n if processed_files > 0:\n post_consume_messages.append(message.uid)\n\n total_processed_files += processed_files\n mails_processed += 1\n except Exception as e:\n self.log(\n \"error\",\n f\"Rule {rule}: Error while processing mail \" f\"{message.uid}: {e}\",\n exc_info=True,\n )\n\n self.log(\"debug\", f\"Rule {rule}: Processed {mails_processed} matching mail(s)\")\n\n self.log(\n \"debug\",\n f\"Rule {rule}: Running mail actions on \"\n f\"{len(post_consume_messages)} mails\",\n )\n\n try:\n get_rule_action(rule).post_consume(\n M,\n post_consume_messages,\n rule.action_parameter,\n )\n\n except Exception as e:\n raise MailError(\n f\"Rule {rule}: Error while processing post-consume actions: \" f\"{e}\",\n )\n\n return total_processed_files\n\n def handle_message(self, message, rule) -> int:\n if not message.attachments:\n return 0\n\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Processing mail {message.subject} from {message.from_} with \"\n f\"{len(message.attachments)} attachment(s)\",\n )\n\n correspondent = self.get_correspondent(message, rule)\n tag = rule.assign_tag\n doc_type = rule.assign_document_type\n\n processed_attachments = 0\n\n for att in message.attachments:\n\n if (\n not att.content_disposition == \"attachment\"\n and rule.attachment_type\n == MailRule.AttachmentProcessing.ATTACHMENTS_ONLY\n ):\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Skipping attachment {att.filename} \"\n f\"with content disposition {att.content_disposition}\",\n )\n continue\n\n if rule.filter_attachment_filename:\n # Force the filename and pattern to the lowercase\n # as this is system dependent otherwise\n if not fnmatch(\n att.filename.lower(),\n rule.filter_attachment_filename.lower(),\n ):\n continue\n\n title = self.get_title(message, att, rule)\n\n # don't trust the content type of the attachment. Could be\n # generic application/octet-stream.\n mime_type = magic.from_buffer(att.payload, mime=True)\n\n if is_mime_type_supported(mime_type):\n\n os.makedirs(settings.SCRATCH_DIR, exist_ok=True)\n _, temp_filename = tempfile.mkstemp(\n prefix=\"paperless-mail-\",\n dir=settings.SCRATCH_DIR,\n )\n with open(temp_filename, \"wb\") as f:\n f.write(att.payload)\n\n self.log(\n \"info\",\n f\"Rule {rule}: \"\n f\"Consuming attachment {att.filename} from mail \"\n f\"{message.subject} from {message.from_}\",\n )\n\n async_task(\n \"documents.tasks.consume_file\",\n path=temp_filename,\n override_filename=pathvalidate.sanitize_filename(\n att.filename,\n ),\n override_title=title,\n override_correspondent_id=correspondent.id\n if correspondent\n else None,\n override_document_type_id=doc_type.id if doc_type else None,\n override_tag_ids=[tag.id] if tag else None,\n task_name=att.filename[:100],\n )\n\n processed_attachments += 1\n else:\n self.log(\n \"debug\",\n f\"Rule {rule}: \"\n f\"Skipping attachment {att.filename} \"\n f\"since guessed mime type {mime_type} is not supported \"\n f\"by paperless\",\n )\n\n return processed_attachments\n", "path": "src/paperless_mail/mail.py"}]} | 3,939 | 202 |
gh_patches_debug_12132 | rasdani/github-patches | git_diff | angr__angr-1862 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Strange successors of the return block of a function
I'm analysing a MIPS binary when facing the problem.
The problem exists in the funcition `do_ssc`.
In the following block which has a return statement

When I run `node.successors` I get:
```
In [103]: end.successors
Out[103]:
[<CFGNode 0x40a7a8[28]>,
<CFGNode do_ssc+0x12c [28]>,
<CFGNode do_ssc+0x4c4 [28]>,
<CFGNode do_ssc+0x45c [24]>,
<CFGNode do_ssc+0x2a8 [24]>]
```
Their addresses are `0x40a7a8`, `0x40a33c`, `0x40a6d4` and `0x40a4b8` respectively.
I know the CFG of angr is interprocedural; however, only `0x40a7a8` is a caller of `do_ssc`.
May I know why the other three exist?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py`
Content:
```
1
2 import logging
3
4 import pyvex
5 import archinfo
6
7
8 from .... import options, BP_BEFORE
9 from ....blade import Blade
10 from ....annocfg import AnnotatedCFG
11 from ....exploration_techniques import Slicecutor
12
13 from .resolver import IndirectJumpResolver
14
15
16 l = logging.getLogger(name=__name__)
17
18
19 class MipsElfFastResolver(IndirectJumpResolver):
20 def __init__(self, project):
21 super(MipsElfFastResolver, self).__init__(project, timeless=True)
22
23 def filter(self, cfg, addr, func_addr, block, jumpkind):
24 if not isinstance(self.project.arch, (archinfo.ArchMIPS32, archinfo.ArchMIPS64, )):
25 return False
26 return True
27
28 def resolve(self, cfg, addr, func_addr, block, jumpkind):
29 """
30 Resolves the indirect jump in MIPS ELF binaries where all external function calls are indexed using gp.
31
32 :param cfg: A CFG instance.
33 :param int addr: IRSB address.
34 :param int func_addr: The function address.
35 :param pyvex.IRSB block: The IRSB.
36 :param str jumpkind: The jumpkind.
37 :return: If it was resolved and targets alongside it
38 :rtype: tuple
39 """
40
41 project = self.project
42
43 b = Blade(cfg.graph, addr, -1, cfg=cfg, project=project, ignore_sp=True, ignore_bp=True,
44 ignored_regs=('gp',)
45 )
46
47 sources = [n for n in b.slice.nodes() if b.slice.in_degree(n) == 0]
48 if not sources:
49 return False, []
50
51 source = sources[0]
52 source_addr = source[0]
53 annotated_cfg = AnnotatedCFG(project, None, detect_loops=False)
54 annotated_cfg.from_digraph(b.slice)
55
56 state = project.factory.blank_state(addr=source_addr, mode="fastpath",
57 remove_options=options.refs
58 )
59 func = cfg.kb.functions.function(addr=func_addr)
60
61 gp_offset = project.arch.registers['gp'][0]
62 if 'gp' not in func.info:
63 sec = project.loader.find_section_containing(func.addr)
64 if sec is None or sec.name != '.plt':
65 # this might a special case: gp is only used once in this function, and it can be initialized right before
66 # its use site.
67 # TODO: handle this case
68 l.debug('Failed to determine value of register gp for function %#x.', func.addr)
69 return False, [ ]
70 else:
71 state.regs.gp = func.info['gp']
72
73 def overwrite_tmp_value(state):
74 state.inspect.tmp_write_expr = state.solver.BVV(func.info['gp'], state.arch.bits)
75
76 # Special handling for cases where `gp` is stored on the stack
77 got_gp_stack_store = False
78 for block_addr_in_slice in set(slice_node[0] for slice_node in b.slice.nodes()):
79 for stmt in project.factory.block(block_addr_in_slice).vex.statements:
80 if isinstance(stmt, pyvex.IRStmt.Put) and stmt.offset == gp_offset and \
81 isinstance(stmt.data, pyvex.IRExpr.RdTmp):
82 tmp_offset = stmt.data.tmp # pylint:disable=cell-var-from-loop
83 # we must make sure value of that temporary variable equals to the correct gp value
84 state.inspect.make_breakpoint('tmp_write', when=BP_BEFORE,
85 condition=lambda s, bbl_addr_=block_addr_in_slice,
86 tmp_offset_=tmp_offset:
87 s.scratch.bbl_addr == bbl_addr_ and s.inspect.tmp_write_num == tmp_offset_,
88 action=overwrite_tmp_value
89 )
90 got_gp_stack_store = True
91 break
92 if got_gp_stack_store:
93 break
94
95 simgr = self.project.factory.simulation_manager(state)
96 simgr.use_technique(Slicecutor(annotated_cfg))
97 simgr.run()
98
99 if simgr.cut:
100 target = simgr.cut[0].addr
101
102 if self._is_target_valid(cfg, target):
103 l.debug("Indirect jump at %#x is resolved to target %#x.", addr, target)
104 return True, [ target ]
105
106 l.debug("Indirect jump at %#x is resolved to target %#x, which seems to be invalid.", addr, target)
107 return False, [ ]
108
109 l.debug("Indirect jump at %#x cannot be resolved by %s.", addr, repr(self))
110 return False, [ ]
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py b/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py
--- a/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py
+++ b/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py
@@ -97,7 +97,13 @@
simgr.run()
if simgr.cut:
- target = simgr.cut[0].addr
+ # pick the successor that is cut right after executing `addr`
+ try:
+ target_state = next(iter(cut for cut in simgr.cut if cut.history.addr == addr))
+ except StopIteration:
+ l.debug("Indirect jump at %#x cannot be resolved by %s.", addr, repr(self))
+ return False, [ ]
+ target = target_state.addr
if self._is_target_valid(cfg, target):
l.debug("Indirect jump at %#x is resolved to target %#x.", addr, target)
| {"golden_diff": "diff --git a/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py b/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py\n--- a/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py\n+++ b/angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py\n@@ -97,7 +97,13 @@\n simgr.run()\n \n if simgr.cut:\n- target = simgr.cut[0].addr\n+ # pick the successor that is cut right after executing `addr`\n+ try:\n+ target_state = next(iter(cut for cut in simgr.cut if cut.history.addr == addr))\n+ except StopIteration:\n+ l.debug(\"Indirect jump at %#x cannot be resolved by %s.\", addr, repr(self))\n+ return False, [ ]\n+ target = target_state.addr\n \n if self._is_target_valid(cfg, target):\n l.debug(\"Indirect jump at %#x is resolved to target %#x.\", addr, target)\n", "issue": "Strange successors of the return block of a function\nI'm analysing a MIPS binary when facing the problem.\r\n\r\nThe problem exists in the funcition `do_ssc`.\r\n\r\nIn the following block which has a return statement\r\n\r\n\r\nWhen I run `node.successors` I got\r\n```\r\nIn [103]: end.successors \r\nOut[103]: \r\n[<CFGNode 0x40a7a8[28]>,\r\n <CFGNode do_ssc+0x12c [28]>,\r\n <CFGNode do_ssc+0x4c4 [28]>,\r\n <CFGNode do_ssc+0x45c [24]>,\r\n <CFGNode do_ssc+0x2a8 [24]>]\r\n```\r\nTheir addresses are `0x40a7a8`, `0x40a33c`, `0x40a6d4` and `0x40a4b8` respectively.\r\n\r\nI know the cfg of angr is interfunctional, however, only `0x40a7a8` is an caller of `do_ssc`.\r\n\r\nMay I know why other threes exist?\r\n\r\n\r\n\n", "before_files": [{"content": "\nimport logging\n\nimport pyvex\nimport archinfo\n\n\nfrom .... import options, BP_BEFORE\nfrom ....blade import Blade\nfrom ....annocfg import AnnotatedCFG\nfrom ....exploration_techniques import Slicecutor\n\nfrom .resolver import IndirectJumpResolver\n\n\nl = logging.getLogger(name=__name__)\n\n\nclass MipsElfFastResolver(IndirectJumpResolver):\n def __init__(self, project):\n super(MipsElfFastResolver, self).__init__(project, timeless=True)\n\n def filter(self, cfg, addr, func_addr, block, jumpkind):\n if not isinstance(self.project.arch, (archinfo.ArchMIPS32, archinfo.ArchMIPS64, )):\n return False\n return True\n\n def resolve(self, cfg, addr, func_addr, block, jumpkind):\n \"\"\"\n Resolves the indirect jump in MIPS ELF binaries where all external function calls are indexed using gp.\n\n :param cfg: A CFG instance.\n :param int addr: IRSB address.\n :param int func_addr: The function address.\n :param pyvex.IRSB block: The IRSB.\n :param str jumpkind: The jumpkind.\n :return: If it was resolved and targets alongside it\n :rtype: tuple\n \"\"\"\n\n project = self.project\n\n b = Blade(cfg.graph, addr, -1, cfg=cfg, project=project, ignore_sp=True, ignore_bp=True,\n ignored_regs=('gp',)\n )\n\n sources = [n for n in b.slice.nodes() if b.slice.in_degree(n) == 0]\n if not sources:\n return False, []\n\n source = sources[0]\n source_addr = source[0]\n annotated_cfg = AnnotatedCFG(project, None, detect_loops=False)\n annotated_cfg.from_digraph(b.slice)\n\n state = project.factory.blank_state(addr=source_addr, mode=\"fastpath\",\n remove_options=options.refs\n )\n func = cfg.kb.functions.function(addr=func_addr)\n\n gp_offset = project.arch.registers['gp'][0]\n if 'gp' not in func.info:\n sec = project.loader.find_section_containing(func.addr)\n if sec is None or sec.name != '.plt':\n # this might a special case: gp is only used once in this function, and it can be initialized right before\n # its use site.\n # TODO: handle this case\n 
l.debug('Failed to determine value of register gp for function %#x.', func.addr)\n return False, [ ]\n else:\n state.regs.gp = func.info['gp']\n\n def overwrite_tmp_value(state):\n state.inspect.tmp_write_expr = state.solver.BVV(func.info['gp'], state.arch.bits)\n\n # Special handling for cases where `gp` is stored on the stack\n got_gp_stack_store = False\n for block_addr_in_slice in set(slice_node[0] for slice_node in b.slice.nodes()):\n for stmt in project.factory.block(block_addr_in_slice).vex.statements:\n if isinstance(stmt, pyvex.IRStmt.Put) and stmt.offset == gp_offset and \\\n isinstance(stmt.data, pyvex.IRExpr.RdTmp):\n tmp_offset = stmt.data.tmp # pylint:disable=cell-var-from-loop\n # we must make sure value of that temporary variable equals to the correct gp value\n state.inspect.make_breakpoint('tmp_write', when=BP_BEFORE,\n condition=lambda s, bbl_addr_=block_addr_in_slice,\n tmp_offset_=tmp_offset:\n s.scratch.bbl_addr == bbl_addr_ and s.inspect.tmp_write_num == tmp_offset_,\n action=overwrite_tmp_value\n )\n got_gp_stack_store = True\n break\n if got_gp_stack_store:\n break\n\n simgr = self.project.factory.simulation_manager(state)\n simgr.use_technique(Slicecutor(annotated_cfg))\n simgr.run()\n\n if simgr.cut:\n target = simgr.cut[0].addr\n\n if self._is_target_valid(cfg, target):\n l.debug(\"Indirect jump at %#x is resolved to target %#x.\", addr, target)\n return True, [ target ]\n\n l.debug(\"Indirect jump at %#x is resolved to target %#x, which seems to be invalid.\", addr, target)\n return False, [ ]\n\n l.debug(\"Indirect jump at %#x cannot be resolved by %s.\", addr, repr(self))\n return False, [ ]\n", "path": "angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py"}], "after_files": [{"content": "\nimport logging\n\nimport pyvex\nimport archinfo\n\n\nfrom .... 
import options, BP_BEFORE\nfrom ....blade import Blade\nfrom ....annocfg import AnnotatedCFG\nfrom ....exploration_techniques import Slicecutor\n\nfrom .resolver import IndirectJumpResolver\n\n\nl = logging.getLogger(name=__name__)\n\n\nclass MipsElfFastResolver(IndirectJumpResolver):\n def __init__(self, project):\n super(MipsElfFastResolver, self).__init__(project, timeless=True)\n\n def filter(self, cfg, addr, func_addr, block, jumpkind):\n if not isinstance(self.project.arch, (archinfo.ArchMIPS32, archinfo.ArchMIPS64, )):\n return False\n return True\n\n def resolve(self, cfg, addr, func_addr, block, jumpkind):\n \"\"\"\n Resolves the indirect jump in MIPS ELF binaries where all external function calls are indexed using gp.\n\n :param cfg: A CFG instance.\n :param int addr: IRSB address.\n :param int func_addr: The function address.\n :param pyvex.IRSB block: The IRSB.\n :param str jumpkind: The jumpkind.\n :return: If it was resolved and targets alongside it\n :rtype: tuple\n \"\"\"\n\n project = self.project\n\n b = Blade(cfg.graph, addr, -1, cfg=cfg, project=project, ignore_sp=True, ignore_bp=True,\n ignored_regs=('gp',)\n )\n\n sources = [n for n in b.slice.nodes() if b.slice.in_degree(n) == 0]\n if not sources:\n return False, []\n\n source = sources[0]\n source_addr = source[0]\n annotated_cfg = AnnotatedCFG(project, None, detect_loops=False)\n annotated_cfg.from_digraph(b.slice)\n\n state = project.factory.blank_state(addr=source_addr, mode=\"fastpath\",\n remove_options=options.refs\n )\n func = cfg.kb.functions.function(addr=func_addr)\n\n gp_offset = project.arch.registers['gp'][0]\n if 'gp' not in func.info:\n sec = project.loader.find_section_containing(func.addr)\n if sec is None or sec.name != '.plt':\n # this might a special case: gp is only used once in this function, and it can be initialized right before\n # its use site.\n # TODO: handle this case\n l.debug('Failed to determine value of register gp for function %#x.', func.addr)\n return False, [ ]\n else:\n state.regs.gp = func.info['gp']\n\n def overwrite_tmp_value(state):\n state.inspect.tmp_write_expr = state.solver.BVV(func.info['gp'], state.arch.bits)\n\n # Special handling for cases where `gp` is stored on the stack\n got_gp_stack_store = False\n for block_addr_in_slice in set(slice_node[0] for slice_node in b.slice.nodes()):\n for stmt in project.factory.block(block_addr_in_slice).vex.statements:\n if isinstance(stmt, pyvex.IRStmt.Put) and stmt.offset == gp_offset and \\\n isinstance(stmt.data, pyvex.IRExpr.RdTmp):\n tmp_offset = stmt.data.tmp # pylint:disable=cell-var-from-loop\n # we must make sure value of that temporary variable equals to the correct gp value\n state.inspect.make_breakpoint('tmp_write', when=BP_BEFORE,\n condition=lambda s, bbl_addr_=block_addr_in_slice,\n tmp_offset_=tmp_offset:\n s.scratch.bbl_addr == bbl_addr_ and s.inspect.tmp_write_num == tmp_offset_,\n action=overwrite_tmp_value\n )\n got_gp_stack_store = True\n break\n if got_gp_stack_store:\n break\n\n simgr = self.project.factory.simulation_manager(state)\n simgr.use_technique(Slicecutor(annotated_cfg))\n simgr.run()\n\n if simgr.cut:\n # pick the successor that is cut right after executing `addr`\n try:\n target_state = next(iter(cut for cut in simgr.cut if cut.history.addr == addr))\n except StopIteration:\n l.debug(\"Indirect jump at %#x cannot be resolved by %s.\", addr, repr(self))\n return False, [ ]\n target = target_state.addr\n\n if self._is_target_valid(cfg, target):\n l.debug(\"Indirect jump at %#x is resolved to 
target %#x.\", addr, target)\n return True, [ target ]\n\n l.debug(\"Indirect jump at %#x is resolved to target %#x, which seems to be invalid.\", addr, target)\n return False, [ ]\n\n l.debug(\"Indirect jump at %#x cannot be resolved by %s.\", addr, repr(self))\n return False, [ ]\n", "path": "angr/analyses/cfg/indirect_jump_resolvers/mips_elf_fast.py"}]} | 1,794 | 246 |
gh_patches_debug_1220 | rasdani/github-patches | git_diff | DataBiosphere__toil-239 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Jenkins should only deploy to PyPI when building off the master branch
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3 setup(
4 name='toil',
5 version='3.0.4',
6 description='Pipeline management software for clusters.',
7 author='Benedict Paten',
8 author_email='[email protected]',
9 url="https://github.com/BD2KGenomics/toil",
10 install_requires=['bd2k-python-lib>=1.7.dev1'],
11 extras_require={
12 'mesos': [
13 'mesos.interface==0.22.0',
14 'psutil==3.0.1' ],
15 'aws': [
16 'boto==2.38.0' ] },
17 package_dir={ '': 'src' },
18 packages=find_packages( 'src', exclude=[ '*.test' ] ),
19 entry_points={
20 'console_scripts': [
21 'toilKill = toil.utils.toilKill:main',
22 'toilStatus = toil.utils.toilStatus:main',
23 'toilStats = toil.utils.toilStats:main',
24 'toilRestarts = toil.utils.toilRestarts:main',
25 'multijob = toil.batchSystems.multijob:main',
26 'toil-mesos-executor = toil.batchSystems.mesos.executor:main [mesos]'] } )
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -2,7 +2,7 @@
setup(
name='toil',
- version='3.0.4',
+ version='3.0.5.dev1',
description='Pipeline management software for clusters.',
author='Benedict Paten',
author_email='[email protected]',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -2,7 +2,7 @@\n \n setup(\n name='toil',\n- version='3.0.4',\n+ version='3.0.5.dev1',\n description='Pipeline management software for clusters.',\n author='Benedict Paten',\n author_email='[email protected]',\n", "issue": "Jenkins should only deploy to PyPI when building off the master branch\n\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='toil',\n version='3.0.4',\n description='Pipeline management software for clusters.',\n author='Benedict Paten',\n author_email='[email protected]',\n url=\"https://github.com/BD2KGenomics/toil\",\n install_requires=['bd2k-python-lib>=1.7.dev1'],\n extras_require={\n 'mesos': [\n 'mesos.interface==0.22.0',\n 'psutil==3.0.1' ],\n 'aws': [\n 'boto==2.38.0' ] },\n package_dir={ '': 'src' },\n packages=find_packages( 'src', exclude=[ '*.test' ] ),\n entry_points={\n 'console_scripts': [\n 'toilKill = toil.utils.toilKill:main',\n 'toilStatus = toil.utils.toilStatus:main',\n 'toilStats = toil.utils.toilStats:main',\n 'toilRestarts = toil.utils.toilRestarts:main',\n 'multijob = toil.batchSystems.multijob:main',\n 'toil-mesos-executor = toil.batchSystems.mesos.executor:main [mesos]'] } )\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='toil',\n version='3.0.5.dev1',\n description='Pipeline management software for clusters.',\n author='Benedict Paten',\n author_email='[email protected]',\n url=\"https://github.com/BD2KGenomics/toil\",\n install_requires=['bd2k-python-lib>=1.7.dev1'],\n extras_require={\n 'mesos': [\n 'mesos.interface==0.22.0',\n 'psutil==3.0.1' ],\n 'aws': [\n 'boto==2.38.0' ] },\n package_dir={ '': 'src' },\n packages=find_packages( 'src', exclude=[ '*.test' ] ),\n entry_points={\n 'console_scripts': [\n 'toilKill = toil.utils.toilKill:main',\n 'toilStatus = toil.utils.toilStatus:main',\n 'toilStats = toil.utils.toilStats:main',\n 'toilRestarts = toil.utils.toilRestarts:main',\n 'multijob = toil.batchSystems.multijob:main',\n 'toil-mesos-executor = toil.batchSystems.mesos.executor:main [mesos]'] } )\n", "path": "setup.py"}]} | 600 | 93 |
gh_patches_debug_59245 | rasdani/github-patches | git_diff | facebookresearch__hydra-287 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] example of override fail in multirun
This fails
`python examples/tutorial/5_composition/my_app.py -m db=mysql,postgresql db.user=omry`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 import codecs
3 import distutils
4 import os
5 import re
6 import shutil
7 from os.path import join, exists, isdir
8
9 from setuptools import setup, find_packages
10
11 here = os.path.abspath(os.path.dirname(__file__))
12
13
14 def read(*parts):
15 with codecs.open(os.path.join(here, *parts), "r") as fp:
16 return fp.read()
17
18
19 def find_version(*file_paths):
20 version_file = read(*file_paths)
21 version_match = re.search(r"^__version__ = ['\"]([^'\"]*)['\"]", version_file, re.M)
22 if version_match:
23 return version_match.group(1)
24 raise RuntimeError("Unable to find version string.")
25
26
27 class CleanCommand(distutils.cmd.Command):
28 """
29 Our custom command to clean out junk files.
30 """
31
32 description = "Cleans out junk files we don't want in the repo"
33 user_options = []
34
35 def initialize_options(self):
36 pass
37
38 def finalize_options(self):
39 pass
40
41 @staticmethod
42 def find(root, includes, excludes=[]):
43 res = []
44 for parent, dirs, files in os.walk(root):
45 for f in dirs + files:
46 add = list()
47 for include in includes:
48 if re.findall(include, f):
49 add.append(join(parent, f))
50 res.extend(add)
51 final_list = []
52 # Exclude things that matches an exclude pattern
53 for ex in excludes:
54 for file in res:
55 if not re.findall(ex, file):
56 final_list.append(file)
57 return final_list
58
59 def run(self):
60 delete_patterns = [
61 ".eggs",
62 ".egg-info",
63 ".pytest_cache",
64 "build",
65 "dist",
66 "__pycache__",
67 ".pyc",
68 ]
69 deletion_list = CleanCommand.find(
70 ".", includes=delete_patterns, excludes=["\\.nox/.*"]
71 )
72
73 for f in deletion_list:
74 if exists(f):
75 if isdir(f):
76 shutil.rmtree(f, ignore_errors=True)
77 else:
78 os.unlink(f)
79
80
81 with open("README.md", "r") as fh:
82 LONG_DESC = fh.read()
83 setup(
84 cmdclass={"clean": CleanCommand},
85 name="hydra-core",
86 version=find_version("hydra", "__init__.py"),
87 author="Omry Yadan",
88 author_email="[email protected]",
89 description="Hydra is a library for writing flexible command line applications",
90 long_description=LONG_DESC,
91 long_description_content_type="text/markdown",
92 url="https://github.com/facebookresearch/hydra",
93 keywords="command-line configuration yaml tab-completion",
94 packages=find_packages(),
95 include_package_data=True,
96 classifiers=[
97 "License :: OSI Approved :: MIT License",
98 "Development Status :: 4 - Beta",
99 "Programming Language :: Python :: 2.7",
100 "Programming Language :: Python :: 3.6",
101 "Programming Language :: Python :: 3.7",
102 "Operating System :: POSIX :: Linux",
103 "Operating System :: MacOS",
104 "Operating System :: Microsoft :: Windows",
105 ],
106 install_requires=[
107 "omegaconf>=1.4.0rc2",
108 'pathlib2>=2.2.0;python_version<"3.0"',
109 ],
110 # Install development dependencies with
111 # pip install -e .[dev]
112 extras_require={
113 "dev": [
114 "black",
115 "coverage",
116 "flake8",
117 "flake8-copyright",
118 "nox",
119 "pre-commit",
120 "pytest",
121 "setuptools",
122 "towncrier",
123 "twine",
124 ]
125 },
126 )
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -104,7 +104,7 @@
"Operating System :: Microsoft :: Windows",
],
install_requires=[
- "omegaconf>=1.4.0rc2",
+ "omegaconf>=1.4.0rc3",
'pathlib2>=2.2.0;python_version<"3.0"',
],
# Install development dependencies with
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -104,7 +104,7 @@\n \"Operating System :: Microsoft :: Windows\",\n ],\n install_requires=[\n- \"omegaconf>=1.4.0rc2\",\n+ \"omegaconf>=1.4.0rc3\",\n 'pathlib2>=2.2.0;python_version<\"3.0\"',\n ],\n # Install development dependencies with\n", "issue": "[Bug] example of override fail in multirun\nThis fails\r\n\r\n`python examples/tutorial/5_composition/my_app.py -m db=mysql,postgresql db.user=omry`\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport codecs\nimport distutils\nimport os\nimport re\nimport shutil\nfrom os.path import join, exists, isdir\n\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n with codecs.open(os.path.join(here, *parts), \"r\") as fp:\n return fp.read()\n\n\ndef find_version(*file_paths):\n version_file = read(*file_paths)\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\", version_file, re.M)\n if version_match:\n return version_match.group(1)\n raise RuntimeError(\"Unable to find version string.\")\n\n\nclass CleanCommand(distutils.cmd.Command):\n \"\"\"\n Our custom command to clean out junk files.\n \"\"\"\n\n description = \"Cleans out junk files we don't want in the repo\"\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n @staticmethod\n def find(root, includes, excludes=[]):\n res = []\n for parent, dirs, files in os.walk(root):\n for f in dirs + files:\n add = list()\n for include in includes:\n if re.findall(include, f):\n add.append(join(parent, f))\n res.extend(add)\n final_list = []\n # Exclude things that matches an exclude pattern\n for ex in excludes:\n for file in res:\n if not re.findall(ex, file):\n final_list.append(file)\n return final_list\n\n def run(self):\n delete_patterns = [\n \".eggs\",\n \".egg-info\",\n \".pytest_cache\",\n \"build\",\n \"dist\",\n \"__pycache__\",\n \".pyc\",\n ]\n deletion_list = CleanCommand.find(\n \".\", includes=delete_patterns, excludes=[\"\\\\.nox/.*\"]\n )\n\n for f in deletion_list:\n if exists(f):\n if isdir(f):\n shutil.rmtree(f, ignore_errors=True)\n else:\n os.unlink(f)\n\n\nwith open(\"README.md\", \"r\") as fh:\n LONG_DESC = fh.read()\n setup(\n cmdclass={\"clean\": CleanCommand},\n name=\"hydra-core\",\n version=find_version(\"hydra\", \"__init__.py\"),\n author=\"Omry Yadan\",\n author_email=\"[email protected]\",\n description=\"Hydra is a library for writing flexible command line applications\",\n long_description=LONG_DESC,\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra\",\n keywords=\"command-line configuration yaml tab-completion\",\n packages=find_packages(),\n include_package_data=True,\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Development Status :: 4 - Beta\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: MacOS\",\n \"Operating System :: Microsoft :: Windows\",\n ],\n install_requires=[\n \"omegaconf>=1.4.0rc2\",\n 'pathlib2>=2.2.0;python_version<\"3.0\"',\n ],\n # Install development dependencies with\n # pip install -e .[dev]\n extras_require={\n \"dev\": [\n \"black\",\n \"coverage\",\n \"flake8\",\n \"flake8-copyright\",\n \"nox\",\n \"pre-commit\",\n \"pytest\",\n 
\"setuptools\",\n \"towncrier\",\n \"twine\",\n ]\n },\n )\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport codecs\nimport distutils\nimport os\nimport re\nimport shutil\nfrom os.path import join, exists, isdir\n\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n with codecs.open(os.path.join(here, *parts), \"r\") as fp:\n return fp.read()\n\n\ndef find_version(*file_paths):\n version_file = read(*file_paths)\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\", version_file, re.M)\n if version_match:\n return version_match.group(1)\n raise RuntimeError(\"Unable to find version string.\")\n\n\nclass CleanCommand(distutils.cmd.Command):\n \"\"\"\n Our custom command to clean out junk files.\n \"\"\"\n\n description = \"Cleans out junk files we don't want in the repo\"\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n @staticmethod\n def find(root, includes, excludes=[]):\n res = []\n for parent, dirs, files in os.walk(root):\n for f in dirs + files:\n add = list()\n for include in includes:\n if re.findall(include, f):\n add.append(join(parent, f))\n res.extend(add)\n final_list = []\n # Exclude things that matches an exclude pattern\n for ex in excludes:\n for file in res:\n if not re.findall(ex, file):\n final_list.append(file)\n return final_list\n\n def run(self):\n delete_patterns = [\n \".eggs\",\n \".egg-info\",\n \".pytest_cache\",\n \"build\",\n \"dist\",\n \"__pycache__\",\n \".pyc\",\n ]\n deletion_list = CleanCommand.find(\n \".\", includes=delete_patterns, excludes=[\"\\\\.nox/.*\"]\n )\n\n for f in deletion_list:\n if exists(f):\n if isdir(f):\n shutil.rmtree(f, ignore_errors=True)\n else:\n os.unlink(f)\n\n\nwith open(\"README.md\", \"r\") as fh:\n LONG_DESC = fh.read()\n setup(\n cmdclass={\"clean\": CleanCommand},\n name=\"hydra-core\",\n version=find_version(\"hydra\", \"__init__.py\"),\n author=\"Omry Yadan\",\n author_email=\"[email protected]\",\n description=\"Hydra is a library for writing flexible command line applications\",\n long_description=LONG_DESC,\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra\",\n keywords=\"command-line configuration yaml tab-completion\",\n packages=find_packages(),\n include_package_data=True,\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Development Status :: 4 - Beta\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: MacOS\",\n \"Operating System :: Microsoft :: Windows\",\n ],\n install_requires=[\n \"omegaconf>=1.4.0rc3\",\n 'pathlib2>=2.2.0;python_version<\"3.0\"',\n ],\n # Install development dependencies with\n # pip install -e .[dev]\n extras_require={\n \"dev\": [\n \"black\",\n \"coverage\",\n \"flake8\",\n \"flake8-copyright\",\n \"nox\",\n \"pre-commit\",\n \"pytest\",\n \"setuptools\",\n \"towncrier\",\n \"twine\",\n ]\n },\n )\n", "path": "setup.py"}]} | 1,366 | 105 |
gh_patches_debug_21735 | rasdani/github-patches | git_diff | akvo__akvo-rsr-4750 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Collaborate draft update bug
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/indicator_period_data.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7 import json
8 import os
9
10 from akvo.rsr.models import IndicatorPeriodData, IndicatorPeriodDataComment, Project
11 from akvo.rest.models import TastyTokenAuthentication, JWTAuthentication
12 from akvo.rsr.models.result.utils import QUANTITATIVE, PERCENTAGE_MEASURE
13
14 from ..serializers import (IndicatorPeriodDataSerializer, IndicatorPeriodDataFrameworkSerializer,
15 IndicatorPeriodDataCommentSerializer)
16 from ..viewsets import PublicProjectViewSet
17
18 from django.shortcuts import get_object_or_404
19 from django.http import HttpResponseBadRequest, HttpResponseForbidden
20 from django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION
21 from django.contrib.contenttypes.models import ContentType
22 from rest_framework import status
23 from rest_framework.authentication import SessionAuthentication
24 from rest_framework.decorators import api_view, authentication_classes
25 from rest_framework.response import Response
26 from rest_framework.utils.encoders import JSONEncoder
27
28
29 class IndicatorPeriodDataViewSet(PublicProjectViewSet):
30 """
31 """
32 queryset = IndicatorPeriodData.objects.select_related('user', 'approved_by').all()
33 serializer_class = IndicatorPeriodDataSerializer
34
35 project_relation = 'period__indicator__result__project__'
36
37 def filter_queryset(self, queryset):
38 queryset = super(IndicatorPeriodDataViewSet, self).filter_queryset(queryset)
39 return IndicatorPeriodData.get_user_viewable_updates(
40 queryset, self.request.user
41 )
42
43 def perform_create(self, serializer):
44 serializer.save(user=self.request.user)
45
46
47 class IndicatorPeriodDataFrameworkViewSet(PublicProjectViewSet):
48 """
49 """
50 authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)
51
52 queryset = IndicatorPeriodData.objects.select_related(
53 'period',
54 'user',
55 'approved_by',
56 ).prefetch_related(
57 'comments',
58 'disaggregations',
59 ).all()
60 serializer_class = IndicatorPeriodDataFrameworkSerializer
61 project_relation = 'period__indicator__result__project__'
62
63 def get_queryset(self):
64 queryset = getattr(self, '_c_queryset', None)
65 if queryset is None:
66 queryset = super(IndicatorPeriodDataFrameworkViewSet, self).get_queryset()
67 queryset = IndicatorPeriodData.get_user_viewable_updates(
68 queryset, self.request.user
69 )
70 self._c_queryset = queryset
71
72 return queryset
73
74 def perform_create(self, serializer):
75 data = {key: value for key, value in serializer.validated_data.items() if key not in ['period', 'files', 'photos', 'approved_by']}
76 if len(serializer._disaggregations_data) > 0:
77 data['disaggregations'] = [
78 {key: value for key, value in dsg.items() if key in ['id', 'dimension_value', 'value', 'numerator', 'denominator']}
79 for dsg in serializer._disaggregations_data
80 ]
81 user = self.request.user
82 serializer.save(user=user)
83 instance = serializer.instance
84 log_data = {'audit_trail': True, 'data': data}
85 LogEntry.objects.log_action(
86 user_id=user.id,
87 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
88 object_id=instance.id,
89 object_repr=str(instance),
90 action_flag=ADDITION,
91 change_message=json.dumps(log_data, cls=JSONEncoder)
92 )
93
94 def perform_update(self, serializer):
95 instance = serializer.instance
96 data = {
97 key: value
98 for key, value in serializer.validated_data.items()
99 if key not in ['period', 'files', 'photos', 'approved_by'] and (key == 'comments' or getattr(instance, key) != value)
100 }
101 if len(serializer._disaggregations_data) > 0:
102 indicator = instance.period.indicator
103 is_percentage = indicator.type == QUANTITATIVE and indicator.measure == PERCENTAGE_MEASURE
104 dsg_attrs = ['id', 'dimension_value', 'numerator', 'denominator'] if is_percentage else ['id', 'dimension_value', 'value']
105 data['disaggregations'] = [
106 {key: value for key, value in dsg.items() if key in dsg_attrs}
107 for dsg in serializer._disaggregations_data
108 ]
109 user = self.request.user
110 status = data.get('status', None)
111 if status == 'R' or status == 'A':
112 serializer.save()
113 else:
114 serializer.save(user=user)
115 log_data = {'audit_trail': True, 'data': data}
116 LogEntry.objects.log_action(
117 user_id=user.id,
118 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
119 object_id=instance.id,
120 object_repr=str(instance),
121 action_flag=CHANGE,
122 change_message=json.dumps(log_data, cls=JSONEncoder)
123 )
124
125 def perform_destroy(self, instance):
126 object_id = instance.id
127 object_repr = str(instance)
128 super().perform_destroy(instance)
129 LogEntry.objects.log_action(
130 user_id=self.request.user.id,
131 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
132 object_id=object_id,
133 object_repr=object_repr,
134 action_flag=DELETION,
135 change_message=json.dumps({'audit_trail': True})
136 )
137
138
139 class IndicatorPeriodDataCommentViewSet(PublicProjectViewSet):
140 """
141 """
142 authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)
143
144 # TODO: Is there more optimization possible?
145 queryset = IndicatorPeriodDataComment.objects.select_related(
146 'user'
147 ).prefetch_related(
148 'user__employers', 'user__employers__organisation'
149 )
150 serializer_class = IndicatorPeriodDataCommentSerializer
151 project_relation = 'data__period__indicator__result__project__'
152
153 def perform_create(self, serializer):
154 serializer.save(user=self.request.user)
155
156
157 @api_view(['POST', 'DELETE'])
158 @authentication_classes([SessionAuthentication, TastyTokenAuthentication, JWTAuthentication])
159 def period_update_files(request, update_pk, file_pk=None):
160 update = get_object_or_404(IndicatorPeriodData, pk=update_pk)
161 user = request.user
162 if not user.has_perm('rsr.change_indicatorperioddata', update):
163 return Response({'error': 'User has no permission to add/remove files'}, status=status.HTTP_403_FORBIDDEN)
164
165 if request.method == 'POST' and not file_pk:
166 serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)
167 serializer.is_valid(raise_exception=True)
168 files = [f"Uploaded file \"{file.name}\"" for file in serializer.validated_data.get('files', [])]
169 serializer.save(user=user)
170 log_data = {'audit_trail': True, 'data': {'files': files}}
171 LogEntry.objects.log_action(
172 user_id=user.id,
173 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
174 object_id=update.id,
175 object_repr=str(update),
176 action_flag=CHANGE,
177 change_message=json.dumps(log_data)
178 )
179 return Response(serializer.data['file_set'])
180
181 if request.method == 'DELETE' and file_pk:
182 file = update.indicatorperioddatafile_set.get(pk=file_pk)
183 filename = os.path.basename(file.file.name)
184 file.delete()
185 update.user = user
186 update.save(update_fields=['user'])
187 log_data = {'audit_trail': True, 'data': {'files': [f"Removed file \"{filename}\""]}}
188 LogEntry.objects.log_action(
189 user_id=user.id,
190 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
191 object_id=update.id,
192 object_repr=str(update),
193 action_flag=CHANGE,
194 change_message=json.dumps(log_data)
195 )
196 return Response(status=status.HTTP_204_NO_CONTENT)
197
198 return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
199
200
201 @api_view(['POST', 'DELETE'])
202 @authentication_classes([SessionAuthentication, TastyTokenAuthentication, JWTAuthentication])
203 def period_update_photos(request, update_pk, photo_pk=None):
204 update = get_object_or_404(IndicatorPeriodData, pk=update_pk)
205 user = request.user
206 if user != update.user:
207 return Response({'error': 'User has no permission to add/remove photos'}, status=status.HTTP_403_FORBIDDEN)
208
209 if request.method == 'POST' and not photo_pk:
210 serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)
211 serializer.is_valid(raise_exception=True)
212 serializer.save(user=user)
213 return Response(serializer.data['photo_set'])
214
215 if request.method == 'DELETE' and photo_pk:
216 photo = update.indicatorperioddataphoto_set.get(pk=photo_pk)
217 photo.delete()
218 return Response(status=status.HTTP_204_NO_CONTENT)
219
220 return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
221
222
223 @api_view(['POST', 'DELETE'])
224 def indicator_upload_file(request, pk=None):
225 """
226 Special API call for directly uploading a file.
227
228 :param request; A Django request object.
229 :param pk; The primary key of an IndicatorPeriodData instance.
230 """
231 # Permissions
232 user = getattr(request, 'user', None)
233 if not user:
234 return Response({'error': 'User is not logged in'}, status=status.HTTP_403_FORBIDDEN)
235 # TODO: Check if user is allowed to upload a file
236 # if not user.has_perm('rsr.change_project', update.period.indicator.result.project):
237 # return Response({'error': 'User has no permission to place an update'},
238 # status=status.HTTP_403_FORBIDDEN)
239
240 update = IndicatorPeriodData.objects.get(pk=pk)
241 if request.method == 'DELETE':
242 try:
243 if request.data['type'] == 'photo':
244 update.photo = ''
245 update.save(update_fields=['photo'])
246 return Response({}, status=status.HTTP_204_NO_CONTENT)
247 elif request.data['type'] == 'file':
248 update.file = ''
249 update.save(update_fields=['file'])
250 return Response({}, status=status.HTTP_204_NO_CONTENT)
251 except Exception as e:
252 return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)
253 else: # POST
254 upload_file = request.data['file']
255 try:
256 file_type = request.POST.copy()['type']
257 if file_type == 'photo':
258 update.photo = upload_file
259 update.save(update_fields=['photo'])
260 # Add photo member to be able to distinguish from file URL in new results version
261 # while keeping the old API
262 return Response({'file': update.photo.url, 'photo': update.photo.url})
263 elif file_type == 'file':
264 update.file = upload_file
265 update.save(update_fields=['file'])
266 return Response({'file': update.file.url})
267 except Exception as e:
268 return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)
269
270
271 @api_view(['POST'])
272 @authentication_classes([SessionAuthentication, TastyTokenAuthentication])
273 def set_updates_status(request, project_pk):
274 """Bulk update IndicatorPeriodData.status attributes of a project.
275 """
276 update_ids = request.data.get('updates', [])
277 status = request.data.get('status', None)
278 if len(update_ids) < 1 or status is None:
279 return HttpResponseBadRequest()
280 user = request.user
281 project = get_object_or_404(Project, pk=project_pk)
282 if not user.has_perm('rsr.change_project', project):
283 return HttpResponseForbidden()
284 IndicatorPeriodData.objects\
285 .filter(id__in=update_ids, period__indicator__result__project=project)\
286 .update(status=status)
287 log_data = {'audit_trail': True, 'data': {'status': status}}
288 for update_id in update_ids:
289 LogEntry.objects.log_action(
290 user_id=user.id,
291 content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,
292 object_id=update_id,
293 object_repr='IndicatorPeriodData',
294 action_flag=CHANGE,
295 change_message=json.dumps(log_data)
296 )
297 return Response({'success': True})
298
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rest/views/indicator_period_data.py b/akvo/rest/views/indicator_period_data.py
--- a/akvo/rest/views/indicator_period_data.py
+++ b/akvo/rest/views/indicator_period_data.py
@@ -60,15 +60,22 @@
serializer_class = IndicatorPeriodDataFrameworkSerializer
project_relation = 'period__indicator__result__project__'
- def get_queryset(self):
- queryset = getattr(self, '_c_queryset', None)
- if queryset is None:
- queryset = super(IndicatorPeriodDataFrameworkViewSet, self).get_queryset()
- queryset = IndicatorPeriodData.get_user_viewable_updates(
- queryset, self.request.user
- )
- self._c_queryset = queryset
+ def get_object(self):
+ obj = get_object_or_404(self.get_queryset(), pk=self.kwargs['pk'])
+ # check whether the user has permission
+ viewables = IndicatorPeriodData.get_user_viewable_updates(
+ self.get_queryset().filter(pk=self.kwargs['pk']),
+ self.request.user
+ )
+ if viewables.count() == 0:
+ self.permission_denied(self.request)
+ return obj
+ def filter_queryset(self, queryset):
+ queryset = super().filter_queryset(queryset)
+ queryset = IndicatorPeriodData.get_user_viewable_updates(
+ queryset, self.request.user
+ )
return queryset
def perform_create(self, serializer):
| {"golden_diff": "diff --git a/akvo/rest/views/indicator_period_data.py b/akvo/rest/views/indicator_period_data.py\n--- a/akvo/rest/views/indicator_period_data.py\n+++ b/akvo/rest/views/indicator_period_data.py\n@@ -60,15 +60,22 @@\n serializer_class = IndicatorPeriodDataFrameworkSerializer\n project_relation = 'period__indicator__result__project__'\n \n- def get_queryset(self):\n- queryset = getattr(self, '_c_queryset', None)\n- if queryset is None:\n- queryset = super(IndicatorPeriodDataFrameworkViewSet, self).get_queryset()\n- queryset = IndicatorPeriodData.get_user_viewable_updates(\n- queryset, self.request.user\n- )\n- self._c_queryset = queryset\n+ def get_object(self):\n+ obj = get_object_or_404(self.get_queryset(), pk=self.kwargs['pk'])\n+ # check whether the user has permission\n+ viewables = IndicatorPeriodData.get_user_viewable_updates(\n+ self.get_queryset().filter(pk=self.kwargs['pk']),\n+ self.request.user\n+ )\n+ if viewables.count() == 0:\n+ self.permission_denied(self.request)\n+ return obj\n \n+ def filter_queryset(self, queryset):\n+ queryset = super().filter_queryset(queryset)\n+ queryset = IndicatorPeriodData.get_user_viewable_updates(\n+ queryset, self.request.user\n+ )\n return queryset\n \n def perform_create(self, serializer):\n", "issue": "Collaborate draft update bug\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nimport json\nimport os\n\nfrom akvo.rsr.models import IndicatorPeriodData, IndicatorPeriodDataComment, Project\nfrom akvo.rest.models import TastyTokenAuthentication, JWTAuthentication\nfrom akvo.rsr.models.result.utils import QUANTITATIVE, PERCENTAGE_MEASURE\n\nfrom ..serializers import (IndicatorPeriodDataSerializer, IndicatorPeriodDataFrameworkSerializer,\n IndicatorPeriodDataCommentSerializer)\nfrom ..viewsets import PublicProjectViewSet\n\nfrom django.shortcuts import get_object_or_404\nfrom django.http import HttpResponseBadRequest, HttpResponseForbidden\nfrom django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\nfrom django.contrib.contenttypes.models import ContentType\nfrom rest_framework import status\nfrom rest_framework.authentication import SessionAuthentication\nfrom rest_framework.decorators import api_view, authentication_classes\nfrom rest_framework.response import Response\nfrom rest_framework.utils.encoders import JSONEncoder\n\n\nclass IndicatorPeriodDataViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorPeriodData.objects.select_related('user', 'approved_by').all()\n serializer_class = IndicatorPeriodDataSerializer\n\n project_relation = 'period__indicator__result__project__'\n\n def filter_queryset(self, queryset):\n queryset = super(IndicatorPeriodDataViewSet, self).filter_queryset(queryset)\n return IndicatorPeriodData.get_user_viewable_updates(\n queryset, self.request.user\n )\n\n def perform_create(self, serializer):\n serializer.save(user=self.request.user)\n\n\nclass IndicatorPeriodDataFrameworkViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)\n\n queryset = IndicatorPeriodData.objects.select_related(\n 'period',\n 'user',\n 'approved_by',\n ).prefetch_related(\n 'comments',\n 'disaggregations',\n ).all()\n 
serializer_class = IndicatorPeriodDataFrameworkSerializer\n project_relation = 'period__indicator__result__project__'\n\n def get_queryset(self):\n queryset = getattr(self, '_c_queryset', None)\n if queryset is None:\n queryset = super(IndicatorPeriodDataFrameworkViewSet, self).get_queryset()\n queryset = IndicatorPeriodData.get_user_viewable_updates(\n queryset, self.request.user\n )\n self._c_queryset = queryset\n\n return queryset\n\n def perform_create(self, serializer):\n data = {key: value for key, value in serializer.validated_data.items() if key not in ['period', 'files', 'photos', 'approved_by']}\n if len(serializer._disaggregations_data) > 0:\n data['disaggregations'] = [\n {key: value for key, value in dsg.items() if key in ['id', 'dimension_value', 'value', 'numerator', 'denominator']}\n for dsg in serializer._disaggregations_data\n ]\n user = self.request.user\n serializer.save(user=user)\n instance = serializer.instance\n log_data = {'audit_trail': True, 'data': data}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=instance.id,\n object_repr=str(instance),\n action_flag=ADDITION,\n change_message=json.dumps(log_data, cls=JSONEncoder)\n )\n\n def perform_update(self, serializer):\n instance = serializer.instance\n data = {\n key: value\n for key, value in serializer.validated_data.items()\n if key not in ['period', 'files', 'photos', 'approved_by'] and (key == 'comments' or getattr(instance, key) != value)\n }\n if len(serializer._disaggregations_data) > 0:\n indicator = instance.period.indicator\n is_percentage = indicator.type == QUANTITATIVE and indicator.measure == PERCENTAGE_MEASURE\n dsg_attrs = ['id', 'dimension_value', 'numerator', 'denominator'] if is_percentage else ['id', 'dimension_value', 'value']\n data['disaggregations'] = [\n {key: value for key, value in dsg.items() if key in dsg_attrs}\n for dsg in serializer._disaggregations_data\n ]\n user = self.request.user\n status = data.get('status', None)\n if status == 'R' or status == 'A':\n serializer.save()\n else:\n serializer.save(user=user)\n log_data = {'audit_trail': True, 'data': data}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=instance.id,\n object_repr=str(instance),\n action_flag=CHANGE,\n change_message=json.dumps(log_data, cls=JSONEncoder)\n )\n\n def perform_destroy(self, instance):\n object_id = instance.id\n object_repr = str(instance)\n super().perform_destroy(instance)\n LogEntry.objects.log_action(\n user_id=self.request.user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=object_id,\n object_repr=object_repr,\n action_flag=DELETION,\n change_message=json.dumps({'audit_trail': True})\n )\n\n\nclass IndicatorPeriodDataCommentViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)\n\n # TODO: Is there more optimization possible?\n queryset = IndicatorPeriodDataComment.objects.select_related(\n 'user'\n ).prefetch_related(\n 'user__employers', 'user__employers__organisation'\n )\n serializer_class = IndicatorPeriodDataCommentSerializer\n project_relation = 'data__period__indicator__result__project__'\n\n def perform_create(self, serializer):\n serializer.save(user=self.request.user)\n\n\n@api_view(['POST', 'DELETE'])\n@authentication_classes([SessionAuthentication, 
TastyTokenAuthentication, JWTAuthentication])\ndef period_update_files(request, update_pk, file_pk=None):\n update = get_object_or_404(IndicatorPeriodData, pk=update_pk)\n user = request.user\n if not user.has_perm('rsr.change_indicatorperioddata', update):\n return Response({'error': 'User has no permission to add/remove files'}, status=status.HTTP_403_FORBIDDEN)\n\n if request.method == 'POST' and not file_pk:\n serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n files = [f\"Uploaded file \\\"{file.name}\\\"\" for file in serializer.validated_data.get('files', [])]\n serializer.save(user=user)\n log_data = {'audit_trail': True, 'data': {'files': files}}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update.id,\n object_repr=str(update),\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response(serializer.data['file_set'])\n\n if request.method == 'DELETE' and file_pk:\n file = update.indicatorperioddatafile_set.get(pk=file_pk)\n filename = os.path.basename(file.file.name)\n file.delete()\n update.user = user\n update.save(update_fields=['user'])\n log_data = {'audit_trail': True, 'data': {'files': [f\"Removed file \\\"{filename}\\\"\"]}}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update.id,\n object_repr=str(update),\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response(status=status.HTTP_204_NO_CONTENT)\n\n return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)\n\n\n@api_view(['POST', 'DELETE'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication, JWTAuthentication])\ndef period_update_photos(request, update_pk, photo_pk=None):\n update = get_object_or_404(IndicatorPeriodData, pk=update_pk)\n user = request.user\n if user != update.user:\n return Response({'error': 'User has no permission to add/remove photos'}, status=status.HTTP_403_FORBIDDEN)\n\n if request.method == 'POST' and not photo_pk:\n serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n serializer.save(user=user)\n return Response(serializer.data['photo_set'])\n\n if request.method == 'DELETE' and photo_pk:\n photo = update.indicatorperioddataphoto_set.get(pk=photo_pk)\n photo.delete()\n return Response(status=status.HTTP_204_NO_CONTENT)\n\n return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)\n\n\n@api_view(['POST', 'DELETE'])\ndef indicator_upload_file(request, pk=None):\n \"\"\"\n Special API call for directly uploading a file.\n\n :param request; A Django request object.\n :param pk; The primary key of an IndicatorPeriodData instance.\n \"\"\"\n # Permissions\n user = getattr(request, 'user', None)\n if not user:\n return Response({'error': 'User is not logged in'}, status=status.HTTP_403_FORBIDDEN)\n # TODO: Check if user is allowed to upload a file\n # if not user.has_perm('rsr.change_project', update.period.indicator.result.project):\n # return Response({'error': 'User has no permission to place an update'},\n # status=status.HTTP_403_FORBIDDEN)\n\n update = IndicatorPeriodData.objects.get(pk=pk)\n if request.method == 'DELETE':\n try:\n if request.data['type'] == 'photo':\n update.photo = ''\n update.save(update_fields=['photo'])\n return Response({}, 
status=status.HTTP_204_NO_CONTENT)\n elif request.data['type'] == 'file':\n update.file = ''\n update.save(update_fields=['file'])\n return Response({}, status=status.HTTP_204_NO_CONTENT)\n except Exception as e:\n return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)\n else: # POST\n upload_file = request.data['file']\n try:\n file_type = request.POST.copy()['type']\n if file_type == 'photo':\n update.photo = upload_file\n update.save(update_fields=['photo'])\n # Add photo member to be able to distinguish from file URL in new results version\n # while keeping the old API\n return Response({'file': update.photo.url, 'photo': update.photo.url})\n elif file_type == 'file':\n update.file = upload_file\n update.save(update_fields=['file'])\n return Response({'file': update.file.url})\n except Exception as e:\n return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication])\ndef set_updates_status(request, project_pk):\n \"\"\"Bulk update IndicatorPeriodData.status attributes of a project.\n \"\"\"\n update_ids = request.data.get('updates', [])\n status = request.data.get('status', None)\n if len(update_ids) < 1 or status is None:\n return HttpResponseBadRequest()\n user = request.user\n project = get_object_or_404(Project, pk=project_pk)\n if not user.has_perm('rsr.change_project', project):\n return HttpResponseForbidden()\n IndicatorPeriodData.objects\\\n .filter(id__in=update_ids, period__indicator__result__project=project)\\\n .update(status=status)\n log_data = {'audit_trail': True, 'data': {'status': status}}\n for update_id in update_ids:\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update_id,\n object_repr='IndicatorPeriodData',\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response({'success': True})\n", "path": "akvo/rest/views/indicator_period_data.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nimport json\nimport os\n\nfrom akvo.rsr.models import IndicatorPeriodData, IndicatorPeriodDataComment, Project\nfrom akvo.rest.models import TastyTokenAuthentication, JWTAuthentication\nfrom akvo.rsr.models.result.utils import QUANTITATIVE, PERCENTAGE_MEASURE\n\nfrom ..serializers import (IndicatorPeriodDataSerializer, IndicatorPeriodDataFrameworkSerializer,\n IndicatorPeriodDataCommentSerializer)\nfrom ..viewsets import PublicProjectViewSet\n\nfrom django.shortcuts import get_object_or_404\nfrom django.http import HttpResponseBadRequest, HttpResponseForbidden\nfrom django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\nfrom django.contrib.contenttypes.models import ContentType\nfrom rest_framework import status\nfrom rest_framework.authentication import SessionAuthentication\nfrom rest_framework.decorators import api_view, authentication_classes\nfrom rest_framework.response import Response\nfrom rest_framework.utils.encoders import JSONEncoder\n\n\nclass IndicatorPeriodDataViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorPeriodData.objects.select_related('user', 'approved_by').all()\n serializer_class = IndicatorPeriodDataSerializer\n\n 
project_relation = 'period__indicator__result__project__'\n\n def filter_queryset(self, queryset):\n queryset = super(IndicatorPeriodDataViewSet, self).filter_queryset(queryset)\n return IndicatorPeriodData.get_user_viewable_updates(\n queryset, self.request.user\n )\n\n def perform_create(self, serializer):\n serializer.save(user=self.request.user)\n\n\nclass IndicatorPeriodDataFrameworkViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)\n\n queryset = IndicatorPeriodData.objects.select_related(\n 'period',\n 'user',\n 'approved_by',\n ).prefetch_related(\n 'comments',\n 'disaggregations',\n ).all()\n serializer_class = IndicatorPeriodDataFrameworkSerializer\n project_relation = 'period__indicator__result__project__'\n\n def get_object(self):\n obj = get_object_or_404(self.get_queryset(), pk=self.kwargs['pk'])\n # check whether the user has permission\n viewables = IndicatorPeriodData.get_user_viewable_updates(\n self.get_queryset().filter(pk=self.kwargs['pk']),\n self.request.user\n )\n if viewables.count() == 0:\n self.permission_denied(self.request)\n return obj\n\n def filter_queryset(self, queryset):\n queryset = super().filter_queryset(queryset)\n queryset = IndicatorPeriodData.get_user_viewable_updates(\n queryset, self.request.user\n )\n return queryset\n\n def perform_create(self, serializer):\n data = {key: value for key, value in serializer.validated_data.items() if key not in ['period', 'files', 'photos', 'approved_by']}\n if len(serializer._disaggregations_data) > 0:\n data['disaggregations'] = [\n {key: value for key, value in dsg.items() if key in ['id', 'dimension_value', 'value', 'numerator', 'denominator']}\n for dsg in serializer._disaggregations_data\n ]\n user = self.request.user\n serializer.save(user=user)\n instance = serializer.instance\n log_data = {'audit_trail': True, 'data': data}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=instance.id,\n object_repr=str(instance),\n action_flag=ADDITION,\n change_message=json.dumps(log_data, cls=JSONEncoder)\n )\n\n def perform_update(self, serializer):\n instance = serializer.instance\n data = {\n key: value\n for key, value in serializer.validated_data.items()\n if key not in ['period', 'files', 'photos', 'approved_by'] and (key == 'comments' or getattr(instance, key) != value)\n }\n if len(serializer._disaggregations_data) > 0:\n indicator = instance.period.indicator\n is_percentage = indicator.type == QUANTITATIVE and indicator.measure == PERCENTAGE_MEASURE\n dsg_attrs = ['id', 'dimension_value', 'numerator', 'denominator'] if is_percentage else ['id', 'dimension_value', 'value']\n data['disaggregations'] = [\n {key: value for key, value in dsg.items() if key in dsg_attrs}\n for dsg in serializer._disaggregations_data\n ]\n user = self.request.user\n status = data.get('status', None)\n if status == 'R' or status == 'A':\n serializer.save()\n else:\n serializer.save(user=user)\n log_data = {'audit_trail': True, 'data': data}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=instance.id,\n object_repr=str(instance),\n action_flag=CHANGE,\n change_message=json.dumps(log_data, cls=JSONEncoder)\n )\n\n def perform_destroy(self, instance):\n object_id = instance.id\n object_repr = str(instance)\n super().perform_destroy(instance)\n 
LogEntry.objects.log_action(\n user_id=self.request.user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=object_id,\n object_repr=object_repr,\n action_flag=DELETION,\n change_message=json.dumps({'audit_trail': True})\n )\n\n\nclass IndicatorPeriodDataCommentViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n authentication_classes = (SessionAuthentication, TastyTokenAuthentication, JWTAuthentication)\n\n # TODO: Is there more optimization possible?\n queryset = IndicatorPeriodDataComment.objects.select_related(\n 'user'\n ).prefetch_related(\n 'user__employers', 'user__employers__organisation'\n )\n serializer_class = IndicatorPeriodDataCommentSerializer\n project_relation = 'data__period__indicator__result__project__'\n\n def perform_create(self, serializer):\n serializer.save(user=self.request.user)\n\n\n@api_view(['POST', 'DELETE'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication, JWTAuthentication])\ndef period_update_files(request, update_pk, file_pk=None):\n update = get_object_or_404(IndicatorPeriodData, pk=update_pk)\n user = request.user\n if not user.has_perm('rsr.change_indicatorperioddata', update):\n return Response({'error': 'User has no permission to add/remove files'}, status=status.HTTP_403_FORBIDDEN)\n\n if request.method == 'POST' and not file_pk:\n serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n files = [f\"Uploaded file \\\"{file.name}\\\"\" for file in serializer.validated_data.get('files', [])]\n serializer.save(user=user)\n log_data = {'audit_trail': True, 'data': {'files': files}}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update.id,\n object_repr=str(update),\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response(serializer.data['file_set'])\n\n if request.method == 'DELETE' and file_pk:\n file = update.indicatorperioddatafile_set.get(pk=file_pk)\n filename = os.path.basename(file.file.name)\n file.delete()\n update.user = user\n update.save(update_fields=['user'])\n log_data = {'audit_trail': True, 'data': {'files': [f\"Removed file \\\"{filename}\\\"\"]}}\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update.id,\n object_repr=str(update),\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response(status=status.HTTP_204_NO_CONTENT)\n\n return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)\n\n\n@api_view(['POST', 'DELETE'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication, JWTAuthentication])\ndef period_update_photos(request, update_pk, photo_pk=None):\n update = get_object_or_404(IndicatorPeriodData, pk=update_pk)\n user = request.user\n if user != update.user:\n return Response({'error': 'User has no permission to add/remove photos'}, status=status.HTTP_403_FORBIDDEN)\n\n if request.method == 'POST' and not photo_pk:\n serializer = IndicatorPeriodDataFrameworkSerializer(instance=update, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n serializer.save(user=user)\n return Response(serializer.data['photo_set'])\n\n if request.method == 'DELETE' and photo_pk:\n photo = update.indicatorperioddataphoto_set.get(pk=photo_pk)\n photo.delete()\n return Response(status=status.HTTP_204_NO_CONTENT)\n\n return 
Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)\n\n\n@api_view(['POST', 'DELETE'])\ndef indicator_upload_file(request, pk=None):\n \"\"\"\n Special API call for directly uploading a file.\n\n :param request; A Django request object.\n :param pk; The primary key of an IndicatorPeriodData instance.\n \"\"\"\n # Permissions\n user = getattr(request, 'user', None)\n if not user:\n return Response({'error': 'User is not logged in'}, status=status.HTTP_403_FORBIDDEN)\n # TODO: Check if user is allowed to upload a file\n # if not user.has_perm('rsr.change_project', update.period.indicator.result.project):\n # return Response({'error': 'User has no permission to place an update'},\n # status=status.HTTP_403_FORBIDDEN)\n\n update = IndicatorPeriodData.objects.get(pk=pk)\n if request.method == 'DELETE':\n try:\n if request.data['type'] == 'photo':\n update.photo = ''\n update.save(update_fields=['photo'])\n return Response({}, status=status.HTTP_204_NO_CONTENT)\n elif request.data['type'] == 'file':\n update.file = ''\n update.save(update_fields=['file'])\n return Response({}, status=status.HTTP_204_NO_CONTENT)\n except Exception as e:\n return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)\n else: # POST\n upload_file = request.data['file']\n try:\n file_type = request.POST.copy()['type']\n if file_type == 'photo':\n update.photo = upload_file\n update.save(update_fields=['photo'])\n # Add photo member to be able to distinguish from file URL in new results version\n # while keeping the old API\n return Response({'file': update.photo.url, 'photo': update.photo.url})\n elif file_type == 'file':\n update.file = upload_file\n update.save(update_fields=['file'])\n return Response({'file': update.file.url})\n except Exception as e:\n return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication])\ndef set_updates_status(request, project_pk):\n \"\"\"Bulk update IndicatorPeriodData.status attributes of a project.\n \"\"\"\n update_ids = request.data.get('updates', [])\n status = request.data.get('status', None)\n if len(update_ids) < 1 or status is None:\n return HttpResponseBadRequest()\n user = request.user\n project = get_object_or_404(Project, pk=project_pk)\n if not user.has_perm('rsr.change_project', project):\n return HttpResponseForbidden()\n IndicatorPeriodData.objects\\\n .filter(id__in=update_ids, period__indicator__result__project=project)\\\n .update(status=status)\n log_data = {'audit_trail': True, 'data': {'status': status}}\n for update_id in update_ids:\n LogEntry.objects.log_action(\n user_id=user.id,\n content_type_id=ContentType.objects.get_for_model(IndicatorPeriodData).id,\n object_id=update_id,\n object_repr='IndicatorPeriodData',\n action_flag=CHANGE,\n change_message=json.dumps(log_data)\n )\n return Response({'success': True})\n", "path": "akvo/rest/views/indicator_period_data.py"}]} | 3,728 | 323 |
gh_patches_debug_32693 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-1972 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
status bar AND project finished both shown on pop up

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/plans/serializers.py`
Content:
```
1 from django.utils.translation import ugettext as _
2 from easy_thumbnails.files import get_thumbnailer
3 from rest_framework import serializers
4
5 from adhocracy4.projects.models import Project
6 from meinberlin.apps.projects import get_project_type
7
8 from .models import Plan
9
10
11 class CommonFields:
12
13 def get_district(self, instance):
14 city_wide = _('City wide')
15 district_name = str(city_wide)
16 if instance.administrative_district:
17 district_name = instance.administrative_district.name
18 return district_name
19
20 def get_point(self, instance):
21 point = instance.point
22 if not point:
23 point = ''
24 return point
25
26 def get_organisation(self, instance):
27 return instance.organisation.name
28
29
30 class ProjectSerializer(serializers.ModelSerializer, CommonFields):
31 type = serializers.SerializerMethodField()
32 subtype = serializers.SerializerMethodField()
33 title = serializers.SerializerMethodField()
34 url = serializers.SerializerMethodField()
35 point = serializers.SerializerMethodField()
36 point_label = serializers.SerializerMethodField()
37 cost = serializers.SerializerMethodField()
38 district = serializers.SerializerMethodField()
39 status = serializers.SerializerMethodField()
40 organisation = serializers.SerializerMethodField()
41 participation = serializers.SerializerMethodField()
42 participation_display = serializers.SerializerMethodField()
43 participation_active = serializers.SerializerMethodField()
44 participation_string = serializers.SerializerMethodField()
45 future_phase = serializers.SerializerMethodField()
46 active_phase = serializers.SerializerMethodField()
47 past_phase = serializers.SerializerMethodField()
48 tile_image = serializers.SerializerMethodField()
49 plan_url = serializers.SerializerMethodField()
50 plan_title = serializers.SerializerMethodField()
51 published_projects_count = serializers.SerializerMethodField()
52 created_or_modified = serializers.SerializerMethodField()
53
54 class Meta:
55 model = Project
56 fields = ['type', 'subtype', 'title', 'url',
57 'organisation', 'tile_image',
58 'tile_image_copyright',
59 'point', 'point_label', 'cost',
60 'district', 'topics',
61 'status',
62 'participation_string',
63 'participation_active',
64 'participation', 'participation_display', 'description',
65 'future_phase', 'active_phase',
66 'past_phase', 'plan_url', 'plan_title',
67 'published_projects_count', 'created_or_modified']
68
69 def _get_participation_status_project(self, instance):
70 project_phases = instance.phases
71
72 if project_phases.active_phases():
73 return _('running'), True
74
75 if project_phases.future_phases():
76 try:
77 return (_('starts at {}').format
78 (project_phases.future_phases().first().
79 start_date.date().strftime('%d.%m.%Y')),
80 True)
81 except AttributeError as e:
82 print(e)
83 return (_('starts in the future'),
84 True)
85 else:
86 return _('done'), False
87
88 def get_type(self, instance):
89 return 'project'
90
91 def get_subtype(self, instance):
92 subtype = get_project_type(instance)
93 if subtype in ('external', 'bplan'):
94 return 'external'
95 return subtype
96
97 def get_title(self, instance):
98 return instance.name
99
100 def get_url(self, instance):
101 if get_project_type(instance) in ('external', 'bplan'):
102 return instance.externalproject.url
103 return instance.get_absolute_url()
104
105 def get_tile_image(self, instance):
106 image_url = ''
107 if instance.tile_image:
108 image = get_thumbnailer(instance.tile_image)['project_tile']
109 image_url = image.url
110 return image_url
111
112 def get_status(self, instance):
113 project_phases = instance.phases
114 if project_phases.active_phases() or project_phases.future_phases():
115 return 0
116 return 1
117
118 def get_participation(self, instance):
119 return 0
120
121 def get_participation_display(self, instance):
122 return _('Yes')
123
124 def get_future_phase(self, instance):
125 if (instance.future_phases and
126 instance.future_phases.first().start_date):
127 return str(
128 instance.future_phases.first().start_date.date())
129 return False
130
131 def get_active_phase(self, instance):
132 project_phases = instance.phases
133 if project_phases.active_phases():
134 progress = instance.active_phase_progress
135 time_left = instance.time_left
136 end_date = str(project_phases.active_phases().last().end_date)
137 return [progress, time_left, end_date]
138 return False
139
140 def get_past_phase(self, instance):
141 project_phases = instance.phases
142 if (project_phases.past_phases() and
143 project_phases.past_phases().first().end_date):
144 return str(
145 project_phases.past_phases().first().end_date.date())
146 return False
147
148 def get_participation_string(self, instance):
149 participation_string, participation_active = \
150 self._get_participation_status_project(instance)
151 return str(participation_string)
152
153 def get_participation_active(self, instance):
154 participation_string, participation_active = \
155 self._get_participation_status_project(instance)
156 return participation_active
157
158 def get_plan_url(self, instance):
159 if instance.plans.exists():
160 return instance.plans.first().get_absolute_url()
161 return None
162
163 def get_plan_title(self, instance):
164 if instance.plans.exists():
165 return instance.plans.first().title
166 return None
167
168 def get_published_projects_count(self, instance):
169 if hasattr(instance, 'projectcontainer') and instance.projectcontainer:
170 return instance.projectcontainer.active_project_count
171
172 def get_point_label(self, instance):
173 return ''
174
175 def get_cost(self, instance):
176 return ''
177
178 def get_created_or_modified(self, instance):
179 if instance.modified:
180 return str(instance.modified)
181 return str(instance.created)
182
183
184 class PlanSerializer(serializers.ModelSerializer, CommonFields):
185 type = serializers.SerializerMethodField()
186 subtype = serializers.SerializerMethodField()
187 url = serializers.SerializerMethodField()
188 district = serializers.SerializerMethodField()
189 point = serializers.SerializerMethodField()
190 participation_active = serializers.SerializerMethodField()
191 participation_string = serializers.SerializerMethodField()
192 published_projects_count = serializers.SerializerMethodField()
193 organisation = serializers.SerializerMethodField()
194 created_or_modified = serializers.SerializerMethodField()
195
196 class Meta:
197 model = Plan
198 fields = ['type', 'subtype', 'title', 'url',
199 'organisation', 'point',
200 'point_label', 'cost',
201 'district', 'topics', 'status',
202 'participation',
203 'participation_string',
204 'participation_active',
205 'published_projects_count', 'created_or_modified']
206
207 def get_subtype(self, instance):
208 return 'plan'
209
210 def _get_participation_status_plan(self, item):
211 projects = item.published_projects
212 if not projects:
213 return item.get_participation_display(), False
214 else:
215 status_string = item.participation_string
216 if status_string:
217 return status_string, True
218 else:
219 return item.get_participation_display(), False
220
221 def get_type(self, instance):
222 return 'plan'
223
224 def get_url(self, instance):
225 return instance.get_absolute_url()
226
227 def get_published_projects_count(self, instance):
228 return instance.published_projects.count()
229
230 def get_participation_string(self, instance):
231 participation_string, participation_active = \
232 self._get_participation_status_plan(instance)
233 return str(participation_string)
234
235 def get_participation_active(self, instance):
236 participation_string, participation_active = \
237 self._get_participation_status_plan(instance)
238 return participation_active
239
240 def get_created_or_modified(self, instance):
241 if instance.modified:
242 return str(instance.modified)
243 return str(instance.created)
244
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/meinberlin/apps/plans/serializers.py b/meinberlin/apps/plans/serializers.py
--- a/meinberlin/apps/plans/serializers.py
+++ b/meinberlin/apps/plans/serializers.py
@@ -26,6 +26,11 @@
def get_organisation(self, instance):
return instance.organisation.name
+ def get_created_or_modified(self, instance):
+ if instance.modified:
+ return str(instance.modified)
+ return str(instance.created)
+
class ProjectSerializer(serializers.ModelSerializer, CommonFields):
type = serializers.SerializerMethodField()
@@ -130,7 +135,7 @@
def get_active_phase(self, instance):
project_phases = instance.phases
- if project_phases.active_phases():
+ if project_phases.active_phases() and instance.active_phase_progress:
progress = instance.active_phase_progress
time_left = instance.time_left
end_date = str(project_phases.active_phases().last().end_date)
@@ -175,11 +180,6 @@
def get_cost(self, instance):
return ''
- def get_created_or_modified(self, instance):
- if instance.modified:
- return str(instance.modified)
- return str(instance.created)
-
class PlanSerializer(serializers.ModelSerializer, CommonFields):
type = serializers.SerializerMethodField()
@@ -236,8 +236,3 @@
participation_string, participation_active = \
self._get_participation_status_plan(instance)
return participation_active
-
- def get_created_or_modified(self, instance):
- if instance.modified:
- return str(instance.modified)
- return str(instance.created)
| {"golden_diff": "diff --git a/meinberlin/apps/plans/serializers.py b/meinberlin/apps/plans/serializers.py\n--- a/meinberlin/apps/plans/serializers.py\n+++ b/meinberlin/apps/plans/serializers.py\n@@ -26,6 +26,11 @@\n def get_organisation(self, instance):\n return instance.organisation.name\n \n+ def get_created_or_modified(self, instance):\n+ if instance.modified:\n+ return str(instance.modified)\n+ return str(instance.created)\n+\n \n class ProjectSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n@@ -130,7 +135,7 @@\n \n def get_active_phase(self, instance):\n project_phases = instance.phases\n- if project_phases.active_phases():\n+ if project_phases.active_phases() and instance.active_phase_progress:\n progress = instance.active_phase_progress\n time_left = instance.time_left\n end_date = str(project_phases.active_phases().last().end_date)\n@@ -175,11 +180,6 @@\n def get_cost(self, instance):\n return ''\n \n- def get_created_or_modified(self, instance):\n- if instance.modified:\n- return str(instance.modified)\n- return str(instance.created)\n-\n \n class PlanSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n@@ -236,8 +236,3 @@\n participation_string, participation_active = \\\n self._get_participation_status_plan(instance)\n return participation_active\n-\n- def get_created_or_modified(self, instance):\n- if instance.modified:\n- return str(instance.modified)\n- return str(instance.created)\n", "issue": "status bar AND project finished both shown on pop up\n\r\n\n", "before_files": [{"content": "from django.utils.translation import ugettext as _\nfrom easy_thumbnails.files import get_thumbnailer\nfrom rest_framework import serializers\n\nfrom adhocracy4.projects.models import Project\nfrom meinberlin.apps.projects import get_project_type\n\nfrom .models import Plan\n\n\nclass CommonFields:\n\n def get_district(self, instance):\n city_wide = _('City wide')\n district_name = str(city_wide)\n if instance.administrative_district:\n district_name = instance.administrative_district.name\n return district_name\n\n def get_point(self, instance):\n point = instance.point\n if not point:\n point = ''\n return point\n\n def get_organisation(self, instance):\n return instance.organisation.name\n\n\nclass ProjectSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n subtype = serializers.SerializerMethodField()\n title = serializers.SerializerMethodField()\n url = serializers.SerializerMethodField()\n point = serializers.SerializerMethodField()\n point_label = serializers.SerializerMethodField()\n cost = serializers.SerializerMethodField()\n district = serializers.SerializerMethodField()\n status = serializers.SerializerMethodField()\n organisation = serializers.SerializerMethodField()\n participation = serializers.SerializerMethodField()\n participation_display = serializers.SerializerMethodField()\n participation_active = serializers.SerializerMethodField()\n participation_string = serializers.SerializerMethodField()\n future_phase = serializers.SerializerMethodField()\n active_phase = serializers.SerializerMethodField()\n past_phase = serializers.SerializerMethodField()\n tile_image = serializers.SerializerMethodField()\n plan_url = serializers.SerializerMethodField()\n plan_title = serializers.SerializerMethodField()\n published_projects_count = serializers.SerializerMethodField()\n created_or_modified = serializers.SerializerMethodField()\n\n 
class Meta:\n model = Project\n fields = ['type', 'subtype', 'title', 'url',\n 'organisation', 'tile_image',\n 'tile_image_copyright',\n 'point', 'point_label', 'cost',\n 'district', 'topics',\n 'status',\n 'participation_string',\n 'participation_active',\n 'participation', 'participation_display', 'description',\n 'future_phase', 'active_phase',\n 'past_phase', 'plan_url', 'plan_title',\n 'published_projects_count', 'created_or_modified']\n\n def _get_participation_status_project(self, instance):\n project_phases = instance.phases\n\n if project_phases.active_phases():\n return _('running'), True\n\n if project_phases.future_phases():\n try:\n return (_('starts at {}').format\n (project_phases.future_phases().first().\n start_date.date().strftime('%d.%m.%Y')),\n True)\n except AttributeError as e:\n print(e)\n return (_('starts in the future'),\n True)\n else:\n return _('done'), False\n\n def get_type(self, instance):\n return 'project'\n\n def get_subtype(self, instance):\n subtype = get_project_type(instance)\n if subtype in ('external', 'bplan'):\n return 'external'\n return subtype\n\n def get_title(self, instance):\n return instance.name\n\n def get_url(self, instance):\n if get_project_type(instance) in ('external', 'bplan'):\n return instance.externalproject.url\n return instance.get_absolute_url()\n\n def get_tile_image(self, instance):\n image_url = ''\n if instance.tile_image:\n image = get_thumbnailer(instance.tile_image)['project_tile']\n image_url = image.url\n return image_url\n\n def get_status(self, instance):\n project_phases = instance.phases\n if project_phases.active_phases() or project_phases.future_phases():\n return 0\n return 1\n\n def get_participation(self, instance):\n return 0\n\n def get_participation_display(self, instance):\n return _('Yes')\n\n def get_future_phase(self, instance):\n if (instance.future_phases and\n instance.future_phases.first().start_date):\n return str(\n instance.future_phases.first().start_date.date())\n return False\n\n def get_active_phase(self, instance):\n project_phases = instance.phases\n if project_phases.active_phases():\n progress = instance.active_phase_progress\n time_left = instance.time_left\n end_date = str(project_phases.active_phases().last().end_date)\n return [progress, time_left, end_date]\n return False\n\n def get_past_phase(self, instance):\n project_phases = instance.phases\n if (project_phases.past_phases() and\n project_phases.past_phases().first().end_date):\n return str(\n project_phases.past_phases().first().end_date.date())\n return False\n\n def get_participation_string(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_project(instance)\n return str(participation_string)\n\n def get_participation_active(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_project(instance)\n return participation_active\n\n def get_plan_url(self, instance):\n if instance.plans.exists():\n return instance.plans.first().get_absolute_url()\n return None\n\n def get_plan_title(self, instance):\n if instance.plans.exists():\n return instance.plans.first().title\n return None\n\n def get_published_projects_count(self, instance):\n if hasattr(instance, 'projectcontainer') and instance.projectcontainer:\n return instance.projectcontainer.active_project_count\n\n def get_point_label(self, instance):\n return ''\n\n def get_cost(self, instance):\n return ''\n\n def get_created_or_modified(self, instance):\n if instance.modified:\n 
return str(instance.modified)\n return str(instance.created)\n\n\nclass PlanSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n subtype = serializers.SerializerMethodField()\n url = serializers.SerializerMethodField()\n district = serializers.SerializerMethodField()\n point = serializers.SerializerMethodField()\n participation_active = serializers.SerializerMethodField()\n participation_string = serializers.SerializerMethodField()\n published_projects_count = serializers.SerializerMethodField()\n organisation = serializers.SerializerMethodField()\n created_or_modified = serializers.SerializerMethodField()\n\n class Meta:\n model = Plan\n fields = ['type', 'subtype', 'title', 'url',\n 'organisation', 'point',\n 'point_label', 'cost',\n 'district', 'topics', 'status',\n 'participation',\n 'participation_string',\n 'participation_active',\n 'published_projects_count', 'created_or_modified']\n\n def get_subtype(self, instance):\n return 'plan'\n\n def _get_participation_status_plan(self, item):\n projects = item.published_projects\n if not projects:\n return item.get_participation_display(), False\n else:\n status_string = item.participation_string\n if status_string:\n return status_string, True\n else:\n return item.get_participation_display(), False\n\n def get_type(self, instance):\n return 'plan'\n\n def get_url(self, instance):\n return instance.get_absolute_url()\n\n def get_published_projects_count(self, instance):\n return instance.published_projects.count()\n\n def get_participation_string(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_plan(instance)\n return str(participation_string)\n\n def get_participation_active(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_plan(instance)\n return participation_active\n\n def get_created_or_modified(self, instance):\n if instance.modified:\n return str(instance.modified)\n return str(instance.created)\n", "path": "meinberlin/apps/plans/serializers.py"}], "after_files": [{"content": "from django.utils.translation import ugettext as _\nfrom easy_thumbnails.files import get_thumbnailer\nfrom rest_framework import serializers\n\nfrom adhocracy4.projects.models import Project\nfrom meinberlin.apps.projects import get_project_type\n\nfrom .models import Plan\n\n\nclass CommonFields:\n\n def get_district(self, instance):\n city_wide = _('City wide')\n district_name = str(city_wide)\n if instance.administrative_district:\n district_name = instance.administrative_district.name\n return district_name\n\n def get_point(self, instance):\n point = instance.point\n if not point:\n point = ''\n return point\n\n def get_organisation(self, instance):\n return instance.organisation.name\n\n def get_created_or_modified(self, instance):\n if instance.modified:\n return str(instance.modified)\n return str(instance.created)\n\n\nclass ProjectSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n subtype = serializers.SerializerMethodField()\n title = serializers.SerializerMethodField()\n url = serializers.SerializerMethodField()\n point = serializers.SerializerMethodField()\n point_label = serializers.SerializerMethodField()\n cost = serializers.SerializerMethodField()\n district = serializers.SerializerMethodField()\n status = serializers.SerializerMethodField()\n organisation = serializers.SerializerMethodField()\n participation = 
serializers.SerializerMethodField()\n participation_display = serializers.SerializerMethodField()\n participation_active = serializers.SerializerMethodField()\n participation_string = serializers.SerializerMethodField()\n future_phase = serializers.SerializerMethodField()\n active_phase = serializers.SerializerMethodField()\n past_phase = serializers.SerializerMethodField()\n tile_image = serializers.SerializerMethodField()\n plan_url = serializers.SerializerMethodField()\n plan_title = serializers.SerializerMethodField()\n published_projects_count = serializers.SerializerMethodField()\n created_or_modified = serializers.SerializerMethodField()\n\n class Meta:\n model = Project\n fields = ['type', 'subtype', 'title', 'url',\n 'organisation', 'tile_image',\n 'tile_image_copyright',\n 'point', 'point_label', 'cost',\n 'district', 'topics',\n 'status',\n 'participation_string',\n 'participation_active',\n 'participation', 'participation_display', 'description',\n 'future_phase', 'active_phase',\n 'past_phase', 'plan_url', 'plan_title',\n 'published_projects_count', 'created_or_modified']\n\n def _get_participation_status_project(self, instance):\n project_phases = instance.phases\n\n if project_phases.active_phases():\n return _('running'), True\n\n if project_phases.future_phases():\n try:\n return (_('starts at {}').format\n (project_phases.future_phases().first().\n start_date.date().strftime('%d.%m.%Y')),\n True)\n except AttributeError as e:\n print(e)\n return (_('starts in the future'),\n True)\n else:\n return _('done'), False\n\n def get_type(self, instance):\n return 'project'\n\n def get_subtype(self, instance):\n subtype = get_project_type(instance)\n if subtype in ('external', 'bplan'):\n return 'external'\n return subtype\n\n def get_title(self, instance):\n return instance.name\n\n def get_url(self, instance):\n if get_project_type(instance) in ('external', 'bplan'):\n return instance.externalproject.url\n return instance.get_absolute_url()\n\n def get_tile_image(self, instance):\n image_url = ''\n if instance.tile_image:\n image = get_thumbnailer(instance.tile_image)['project_tile']\n image_url = image.url\n return image_url\n\n def get_status(self, instance):\n project_phases = instance.phases\n if project_phases.active_phases() or project_phases.future_phases():\n return 0\n return 1\n\n def get_participation(self, instance):\n return 0\n\n def get_participation_display(self, instance):\n return _('Yes')\n\n def get_future_phase(self, instance):\n if (instance.future_phases and\n instance.future_phases.first().start_date):\n return str(\n instance.future_phases.first().start_date.date())\n return False\n\n def get_active_phase(self, instance):\n project_phases = instance.phases\n if project_phases.active_phases() and instance.active_phase_progress:\n progress = instance.active_phase_progress\n time_left = instance.time_left\n end_date = str(project_phases.active_phases().last().end_date)\n return [progress, time_left, end_date]\n return False\n\n def get_past_phase(self, instance):\n project_phases = instance.phases\n if (project_phases.past_phases() and\n project_phases.past_phases().first().end_date):\n return str(\n project_phases.past_phases().first().end_date.date())\n return False\n\n def get_participation_string(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_project(instance)\n return str(participation_string)\n\n def get_participation_active(self, instance):\n participation_string, participation_active = \\\n 
self._get_participation_status_project(instance)\n return participation_active\n\n def get_plan_url(self, instance):\n if instance.plans.exists():\n return instance.plans.first().get_absolute_url()\n return None\n\n def get_plan_title(self, instance):\n if instance.plans.exists():\n return instance.plans.first().title\n return None\n\n def get_published_projects_count(self, instance):\n if hasattr(instance, 'projectcontainer') and instance.projectcontainer:\n return instance.projectcontainer.active_project_count\n\n def get_point_label(self, instance):\n return ''\n\n def get_cost(self, instance):\n return ''\n\n\nclass PlanSerializer(serializers.ModelSerializer, CommonFields):\n type = serializers.SerializerMethodField()\n subtype = serializers.SerializerMethodField()\n url = serializers.SerializerMethodField()\n district = serializers.SerializerMethodField()\n point = serializers.SerializerMethodField()\n participation_active = serializers.SerializerMethodField()\n participation_string = serializers.SerializerMethodField()\n published_projects_count = serializers.SerializerMethodField()\n organisation = serializers.SerializerMethodField()\n created_or_modified = serializers.SerializerMethodField()\n\n class Meta:\n model = Plan\n fields = ['type', 'subtype', 'title', 'url',\n 'organisation', 'point',\n 'point_label', 'cost',\n 'district', 'topics', 'status',\n 'participation',\n 'participation_string',\n 'participation_active',\n 'published_projects_count', 'created_or_modified']\n\n def get_subtype(self, instance):\n return 'plan'\n\n def _get_participation_status_plan(self, item):\n projects = item.published_projects\n if not projects:\n return item.get_participation_display(), False\n else:\n status_string = item.participation_string\n if status_string:\n return status_string, True\n else:\n return item.get_participation_display(), False\n\n def get_type(self, instance):\n return 'plan'\n\n def get_url(self, instance):\n return instance.get_absolute_url()\n\n def get_published_projects_count(self, instance):\n return instance.published_projects.count()\n\n def get_participation_string(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_plan(instance)\n return str(participation_string)\n\n def get_participation_active(self, instance):\n participation_string, participation_active = \\\n self._get_participation_status_plan(instance)\n return participation_active\n", "path": "meinberlin/apps/plans/serializers.py"}]} | 2,632 | 373 |
gh_patches_debug_20852 | rasdani/github-patches | git_diff | Lightning-AI__torchmetrics-1729 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
PearsonCorrCoeff returns nan when input is of type torch.float16 or torch.bfloat16.
## 🐛 Bug
PearsonCorrCoeff returns nan when input is of type torch.float16 or torch.bfloat16 and all values are close.
### To Reproduce
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
<details>
<summary>
```python
import torch
import torchmetrics as tm
pcc = tm.regression.PearsonCorrCoef().to("cuda")
pred = torch.tensor([0.4746, 0.4805, 0.4766, 0.4805, 0.4766, 0.4805, 0.4785, 0.4824, 0.4805],dtype=torch.float16).to("cuda")
target = torch.tensor([0.0336, 0.3676, 0.6302, 0.7192, 0.2295, 0.2886, 0.6302, 0.7096, 0.0208],dtype=torch.float16).to("cuda")
print(pcc(pred,target))
print(pcc(pred.to(torch.float32),target.to(torch.float32)))
tensor(nan, device='cuda:0')
tensor(0.3720, device='cuda:0')
```
</summary>
<!-- Ideally attach a minimal code sample to reproduce the decried issue.
Minimal means having the shortest code but still preserving the bug. -->
</details>
### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
### Environment
Python version: 3.10.9
Torch version: 1.12.1
TorchMetrics version: 0.11.1
GPU device name: Tesla T4
CUDA Version: 11.4
### Additional context
When running in a training loop I found that some fraction (~30%) of steps would not produce a nan number when using torch.float16 or bfloat16, while the other ~70% would.
This seems to occur because the values in pred above are not very different (changing one value of pred above to be more different than the rest will compute a correct PCC), however I think that this should still be able to be computed with half precision and the standard deviation of pred shown above.
<!-- Add any other context about the problem here. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/torchmetrics/functional/regression/pearson.py`
Content:
```
1 # Copyright The Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import Tuple
15
16 import torch
17 from torch import Tensor
18
19 from torchmetrics.functional.regression.utils import _check_data_shape_to_num_outputs
20 from torchmetrics.utilities.checks import _check_same_shape
21
22
23 def _pearson_corrcoef_update(
24 preds: Tensor,
25 target: Tensor,
26 mean_x: Tensor,
27 mean_y: Tensor,
28 var_x: Tensor,
29 var_y: Tensor,
30 corr_xy: Tensor,
31 n_prior: Tensor,
32 num_outputs: int,
33 ) -> Tuple[Tensor, Tensor, Tensor, Tensor, Tensor, Tensor]:
34 """Update and returns variables required to compute Pearson Correlation Coefficient.
35
36 Check for same shape of input tensors.
37
38 Args:
39 preds: estimated scores
40 target: ground truth scores
41 mean_x: current mean estimate of x tensor
42 mean_y: current mean estimate of y tensor
43 var_x: current variance estimate of x tensor
44 var_y: current variance estimate of y tensor
45 corr_xy: current covariance estimate between x and y tensor
46 n_prior: current number of observed observations
47 num_outputs: Number of outputs in multioutput setting
48 """
49 # Data checking
50 _check_same_shape(preds, target)
51 _check_data_shape_to_num_outputs(preds, target, num_outputs)
52
53 n_obs = preds.shape[0]
54 mx_new = (n_prior * mean_x + preds.mean(0) * n_obs) / (n_prior + n_obs)
55 my_new = (n_prior * mean_y + target.mean(0) * n_obs) / (n_prior + n_obs)
56 n_prior += n_obs
57 var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)
58 var_y += ((target - my_new) * (target - mean_y)).sum(0)
59 corr_xy += ((preds - mx_new) * (target - mean_y)).sum(0)
60 mean_x = mx_new
61 mean_y = my_new
62
63 return mean_x, mean_y, var_x, var_y, corr_xy, n_prior
64
65
66 def _pearson_corrcoef_compute(
67 var_x: Tensor,
68 var_y: Tensor,
69 corr_xy: Tensor,
70 nb: Tensor,
71 ) -> Tensor:
72 """Compute the final pearson correlation based on accumulated statistics.
73
74 Args:
75 var_x: variance estimate of x tensor
76 var_y: variance estimate of y tensor
77 corr_xy: covariance estimate between x and y tensor
78 nb: number of observations
79 """
80 var_x /= nb - 1
81 var_y /= nb - 1
82 corr_xy /= nb - 1
83 corrcoef = (corr_xy / (var_x * var_y).sqrt()).squeeze()
84 return torch.clamp(corrcoef, -1.0, 1.0)
85
86
87 def pearson_corrcoef(preds: Tensor, target: Tensor) -> Tensor:
88 """Compute pearson correlation coefficient.
89
90 Args:
91 preds: estimated scores
92 target: ground truth scores
93
94 Example (single output regression):
95 >>> from torchmetrics.functional.regression import pearson_corrcoef
96 >>> target = torch.tensor([3, -0.5, 2, 7])
97 >>> preds = torch.tensor([2.5, 0.0, 2, 8])
98 >>> pearson_corrcoef(preds, target)
99 tensor(0.9849)
100
101 Example (multi output regression):
102 >>> from torchmetrics.functional.regression import pearson_corrcoef
103 >>> target = torch.tensor([[3, -0.5], [2, 7]])
104 >>> preds = torch.tensor([[2.5, 0.0], [2, 8]])
105 >>> pearson_corrcoef(preds, target)
106 tensor([1., 1.])
107 """
108 d = preds.shape[1] if preds.ndim == 2 else 1
109 _temp = torch.zeros(d, dtype=preds.dtype, device=preds.device)
110 mean_x, mean_y, var_x = _temp.clone(), _temp.clone(), _temp.clone()
111 var_y, corr_xy, nb = _temp.clone(), _temp.clone(), _temp.clone()
112 _, _, var_x, var_y, corr_xy, nb = _pearson_corrcoef_update(
113 preds, target, mean_x, mean_y, var_x, var_y, corr_xy, nb, num_outputs=1 if preds.ndim == 1 else preds.shape[-1]
114 )
115 return _pearson_corrcoef_compute(var_x, var_y, corr_xy, nb)
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/torchmetrics/functional/regression/pearson.py b/src/torchmetrics/functional/regression/pearson.py
--- a/src/torchmetrics/functional/regression/pearson.py
+++ b/src/torchmetrics/functional/regression/pearson.py
@@ -49,13 +49,25 @@
# Data checking
_check_same_shape(preds, target)
_check_data_shape_to_num_outputs(preds, target, num_outputs)
+ cond = n_prior.mean() > 0
n_obs = preds.shape[0]
- mx_new = (n_prior * mean_x + preds.mean(0) * n_obs) / (n_prior + n_obs)
- my_new = (n_prior * mean_y + target.mean(0) * n_obs) / (n_prior + n_obs)
+ if cond:
+ mx_new = (n_prior * mean_x + preds.sum(0)) / (n_prior + n_obs)
+ my_new = (n_prior * mean_y + target.sum(0)) / (n_prior + n_obs)
+ else:
+ mx_new = preds.mean(0)
+ my_new = target.mean(0)
+
n_prior += n_obs
- var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)
- var_y += ((target - my_new) * (target - mean_y)).sum(0)
+
+ if cond:
+ var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)
+ var_y += ((target - my_new) * (target - mean_y)).sum(0)
+
+ else:
+ var_x += preds.var(0) * (n_obs - 1)
+ var_y += target.var(0) * (n_obs - 1)
corr_xy += ((preds - mx_new) * (target - mean_y)).sum(0)
mean_x = mx_new
mean_y = my_new
| {"golden_diff": "diff --git a/src/torchmetrics/functional/regression/pearson.py b/src/torchmetrics/functional/regression/pearson.py\n--- a/src/torchmetrics/functional/regression/pearson.py\n+++ b/src/torchmetrics/functional/regression/pearson.py\n@@ -49,13 +49,25 @@\n # Data checking\n _check_same_shape(preds, target)\n _check_data_shape_to_num_outputs(preds, target, num_outputs)\n+ cond = n_prior.mean() > 0\n \n n_obs = preds.shape[0]\n- mx_new = (n_prior * mean_x + preds.mean(0) * n_obs) / (n_prior + n_obs)\n- my_new = (n_prior * mean_y + target.mean(0) * n_obs) / (n_prior + n_obs)\n+ if cond:\n+ mx_new = (n_prior * mean_x + preds.sum(0)) / (n_prior + n_obs)\n+ my_new = (n_prior * mean_y + target.sum(0)) / (n_prior + n_obs)\n+ else:\n+ mx_new = preds.mean(0)\n+ my_new = target.mean(0)\n+\n n_prior += n_obs\n- var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)\n- var_y += ((target - my_new) * (target - mean_y)).sum(0)\n+\n+ if cond:\n+ var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)\n+ var_y += ((target - my_new) * (target - mean_y)).sum(0)\n+\n+ else:\n+ var_x += preds.var(0) * (n_obs - 1)\n+ var_y += target.var(0) * (n_obs - 1)\n corr_xy += ((preds - mx_new) * (target - mean_y)).sum(0)\n mean_x = mx_new\n mean_y = my_new\n", "issue": "PearsonCorrCoeff returns nan when input is of type torch.float16 or torch.bfloat16.\n## \ud83d\udc1b Bug\r\n\r\nPearsonCorrCoeff returns nan when input is of type torch.float16 or torch.bfloat16 and all values are close.\r\n\r\n### To Reproduce\r\n\r\n<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->\r\n\r\n<details>\r\n <summary>\r\n\r\n```python\r\nimport torch\r\nimport torchmetrics as tm\r\npcc = tm.regression.PearsonCorrCoef().to(\"cuda\")\r\npred = torch.tensor([0.4746, 0.4805, 0.4766, 0.4805, 0.4766, 0.4805, 0.4785, 0.4824, 0.4805],dtype=torch.float16).to(\"cuda\")\r\ntarget = torch.tensor([0.0336, 0.3676, 0.6302, 0.7192, 0.2295, 0.2886, 0.6302, 0.7096, 0.0208],dtype=torch.float16).to(\"cuda\")\r\nprint(pcc(pred,target))\r\nprint(pcc(pred.to(torch.float32),target.to(torch.float32)))\r\ntensor(nan, device='cuda:0')\r\ntensor(0.3720, device='cuda:0')\r\n```\r\n\r\n</summary>\r\n\r\n<!-- Ideally attach a minimal code sample to reproduce the decried issue.\r\nMinimal means having the shortest code but still preserving the bug. -->\r\n\r\n</details>\r\n\r\n### Expected behavior\r\n\r\n<!-- A clear and concise description of what you expected to happen. -->\r\n\r\n### Environment\r\n\r\nPython version: 3.10.9\r\nTorch version: 1.12.1\r\nTorchMetrics version: 0.11.1\r\nGPU device name: Tesla T4\r\nCUDA Version: 11.4 \r\n\r\n### Additional context\r\n\r\nWhen running in a training loop I found that some fraction (~30%) of steps would not produce a nan number when using torch.float16 or bfloat16, while the other ~70% would.\r\nThis seems to occur because the values in pred above are not very different (changing one value of pred above to be more different than the rest will compute a correct PCC), however I think that this should still be able to be computed with half precision and the standard deviation of pred shown above.\r\n<!-- Add any other context about the problem here. 
-->\r\n\n", "before_files": [{"content": "# Copyright The Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.functional.regression.utils import _check_data_shape_to_num_outputs\nfrom torchmetrics.utilities.checks import _check_same_shape\n\n\ndef _pearson_corrcoef_update(\n preds: Tensor,\n target: Tensor,\n mean_x: Tensor,\n mean_y: Tensor,\n var_x: Tensor,\n var_y: Tensor,\n corr_xy: Tensor,\n n_prior: Tensor,\n num_outputs: int,\n) -> Tuple[Tensor, Tensor, Tensor, Tensor, Tensor, Tensor]:\n \"\"\"Update and returns variables required to compute Pearson Correlation Coefficient.\n\n Check for same shape of input tensors.\n\n Args:\n preds: estimated scores\n target: ground truth scores\n mean_x: current mean estimate of x tensor\n mean_y: current mean estimate of y tensor\n var_x: current variance estimate of x tensor\n var_y: current variance estimate of y tensor\n corr_xy: current covariance estimate between x and y tensor\n n_prior: current number of observed observations\n num_outputs: Number of outputs in multioutput setting\n \"\"\"\n # Data checking\n _check_same_shape(preds, target)\n _check_data_shape_to_num_outputs(preds, target, num_outputs)\n\n n_obs = preds.shape[0]\n mx_new = (n_prior * mean_x + preds.mean(0) * n_obs) / (n_prior + n_obs)\n my_new = (n_prior * mean_y + target.mean(0) * n_obs) / (n_prior + n_obs)\n n_prior += n_obs\n var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)\n var_y += ((target - my_new) * (target - mean_y)).sum(0)\n corr_xy += ((preds - mx_new) * (target - mean_y)).sum(0)\n mean_x = mx_new\n mean_y = my_new\n\n return mean_x, mean_y, var_x, var_y, corr_xy, n_prior\n\n\ndef _pearson_corrcoef_compute(\n var_x: Tensor,\n var_y: Tensor,\n corr_xy: Tensor,\n nb: Tensor,\n) -> Tensor:\n \"\"\"Compute the final pearson correlation based on accumulated statistics.\n\n Args:\n var_x: variance estimate of x tensor\n var_y: variance estimate of y tensor\n corr_xy: covariance estimate between x and y tensor\n nb: number of observations\n \"\"\"\n var_x /= nb - 1\n var_y /= nb - 1\n corr_xy /= nb - 1\n corrcoef = (corr_xy / (var_x * var_y).sqrt()).squeeze()\n return torch.clamp(corrcoef, -1.0, 1.0)\n\n\ndef pearson_corrcoef(preds: Tensor, target: Tensor) -> Tensor:\n \"\"\"Compute pearson correlation coefficient.\n\n Args:\n preds: estimated scores\n target: ground truth scores\n\n Example (single output regression):\n >>> from torchmetrics.functional.regression import pearson_corrcoef\n >>> target = torch.tensor([3, -0.5, 2, 7])\n >>> preds = torch.tensor([2.5, 0.0, 2, 8])\n >>> pearson_corrcoef(preds, target)\n tensor(0.9849)\n\n Example (multi output regression):\n >>> from torchmetrics.functional.regression import pearson_corrcoef\n >>> target = torch.tensor([[3, -0.5], [2, 7]])\n >>> preds = torch.tensor([[2.5, 0.0], [2, 8]])\n >>> pearson_corrcoef(preds, target)\n tensor([1., 1.])\n \"\"\"\n d = preds.shape[1] if preds.ndim == 2 else 1\n _temp = 
torch.zeros(d, dtype=preds.dtype, device=preds.device)\n mean_x, mean_y, var_x = _temp.clone(), _temp.clone(), _temp.clone()\n var_y, corr_xy, nb = _temp.clone(), _temp.clone(), _temp.clone()\n _, _, var_x, var_y, corr_xy, nb = _pearson_corrcoef_update(\n preds, target, mean_x, mean_y, var_x, var_y, corr_xy, nb, num_outputs=1 if preds.ndim == 1 else preds.shape[-1]\n )\n return _pearson_corrcoef_compute(var_x, var_y, corr_xy, nb)\n", "path": "src/torchmetrics/functional/regression/pearson.py"}], "after_files": [{"content": "# Copyright The Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.functional.regression.utils import _check_data_shape_to_num_outputs\nfrom torchmetrics.utilities.checks import _check_same_shape\n\n\ndef _pearson_corrcoef_update(\n preds: Tensor,\n target: Tensor,\n mean_x: Tensor,\n mean_y: Tensor,\n var_x: Tensor,\n var_y: Tensor,\n corr_xy: Tensor,\n n_prior: Tensor,\n num_outputs: int,\n) -> Tuple[Tensor, Tensor, Tensor, Tensor, Tensor, Tensor]:\n \"\"\"Update and returns variables required to compute Pearson Correlation Coefficient.\n\n Check for same shape of input tensors.\n\n Args:\n preds: estimated scores\n target: ground truth scores\n mean_x: current mean estimate of x tensor\n mean_y: current mean estimate of y tensor\n var_x: current variance estimate of x tensor\n var_y: current variance estimate of y tensor\n corr_xy: current covariance estimate between x and y tensor\n n_prior: current number of observed observations\n num_outputs: Number of outputs in multioutput setting\n \"\"\"\n # Data checking\n _check_same_shape(preds, target)\n _check_data_shape_to_num_outputs(preds, target, num_outputs)\n cond = n_prior.mean() > 0\n\n n_obs = preds.shape[0]\n if cond:\n mx_new = (n_prior * mean_x + preds.sum(0)) / (n_prior + n_obs)\n my_new = (n_prior * mean_y + target.sum(0)) / (n_prior + n_obs)\n else:\n mx_new = preds.mean(0)\n my_new = target.mean(0)\n\n n_prior += n_obs\n\n if cond:\n var_x += ((preds - mx_new) * (preds - mean_x)).sum(0)\n var_y += ((target - my_new) * (target - mean_y)).sum(0)\n\n else:\n var_x += preds.var(0) * (n_obs - 1)\n var_y += target.var(0) * (n_obs - 1)\n corr_xy += ((preds - mx_new) * (target - mean_y)).sum(0)\n mean_x = mx_new\n mean_y = my_new\n\n return mean_x, mean_y, var_x, var_y, corr_xy, n_prior\n\n\ndef _pearson_corrcoef_compute(\n var_x: Tensor,\n var_y: Tensor,\n corr_xy: Tensor,\n nb: Tensor,\n) -> Tensor:\n \"\"\"Compute the final pearson correlation based on accumulated statistics.\n\n Args:\n var_x: variance estimate of x tensor\n var_y: variance estimate of y tensor\n corr_xy: covariance estimate between x and y tensor\n nb: number of observations\n \"\"\"\n var_x /= nb - 1\n var_y /= nb - 1\n corr_xy /= nb - 1\n corrcoef = (corr_xy / (var_x * var_y).sqrt()).squeeze()\n return torch.clamp(corrcoef, -1.0, 1.0)\n\n\ndef pearson_corrcoef(preds: Tensor, target: Tensor) -> Tensor:\n \"\"\"Compute pearson correlation 
coefficient.\n\n Args:\n preds: estimated scores\n target: ground truth scores\n\n Example (single output regression):\n >>> from torchmetrics.functional.regression import pearson_corrcoef\n >>> target = torch.tensor([3, -0.5, 2, 7])\n >>> preds = torch.tensor([2.5, 0.0, 2, 8])\n >>> pearson_corrcoef(preds, target)\n tensor(0.9849)\n\n Example (multi output regression):\n >>> from torchmetrics.functional.regression import pearson_corrcoef\n >>> target = torch.tensor([[3, -0.5], [2, 7]])\n >>> preds = torch.tensor([[2.5, 0.0], [2, 8]])\n >>> pearson_corrcoef(preds, target)\n tensor([1., 1.])\n \"\"\"\n d = preds.shape[1] if preds.ndim == 2 else 1\n _temp = torch.zeros(d, dtype=preds.dtype, device=preds.device)\n mean_x, mean_y, var_x = _temp.clone(), _temp.clone(), _temp.clone()\n var_y, corr_xy, nb = _temp.clone(), _temp.clone(), _temp.clone()\n _, _, var_x, var_y, corr_xy, nb = _pearson_corrcoef_update(\n preds, target, mean_x, mean_y, var_x, var_y, corr_xy, nb, num_outputs=1 if preds.ndim == 1 else preds.shape[-1]\n )\n return _pearson_corrcoef_compute(var_x, var_y, corr_xy, nb)\n", "path": "src/torchmetrics/functional/regression/pearson.py"}]} | 2,185 | 446 |
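A side note on the row above (editorial, not part of the dataset entry): the patch works around catastrophic cancellation when Pearson statistics are accumulated in float16/bfloat16. On the very first update the running mean is still zero, so the pre-fix accumulator sums products of comparatively large, nearly cancelling terms; the fixed branch accumulates a plain sample variance instead. A rough sketch of the two forms, with made-up near-constant values echoing the report (half-precision reductions on CPU need a reasonably recent PyTorch; the report itself ran on CUDA):

```python
import torch

# Made-up near-constant predictions in half precision, similar to the report.
preds = torch.tensor([0.4746, 0.4805, 0.4766, 0.4805, 0.4766, 0.4805], dtype=torch.float16)
mean_new = preds.mean(0)

# Pre-fix first update (paraphrased): the stored mean is still 0, so each term is
# roughly +/-0.48 * (small deviation) and the sum relies on heavy cancellation.
var_cancelling = ((preds - mean_new) * (preds - 0.0)).sum(0)

# Post-fix first update (paraphrased): sum of squared deviations, algebraically the
# same as the preds.var(0) * (n - 1) used in the patch, but with no cancellation.
var_stable = ((preds - mean_new) ** 2).sum(0)

# The two agree in exact arithmetic; in float16 they can drift apart, and over many
# batches that drift is what pushes the final correlation toward NaN.
print(var_cancelling.item(), var_stable.item())
```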
gh_patches_debug_23415 | rasdani/github-patches | git_diff | encode__starlette-563 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use "308 Permanent Redirect" for redirect slashes behavior.
Hi,
I stumbled upon a quirk in Starlette that is not properly documented. It seems like all of my HTTP requests to a route without a trailing slash are being redirected to the route with a trailing slash. Say I am hitting `http://hostname/mountpoint/api`; it will be redirected (302) to `http://hostname/mountpoint/api/`. This messes up API calls: if you POST to `http://hostname/mountpoint/api`, it will be redirected to a GET of `http://hostname/mountpoint/api/`.
I dug into the source and found the `redirect_slashes` flag; setting it to False fixes this. I feel this behavior (the automatic redirection) should be documented.
--- END ISSUE ---
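(Editorial note, not part of the issue above.) The method-switching behavior the reporter describes is a property of the redirect status code: clients may replay a 301/302 as a GET without the body, whereas 307 and 308 require the original method and body to be preserved. A minimal sketch of the distinction using Starlette's own `RedirectResponse`, with the URLs invented for illustration:

```python
from starlette.responses import RedirectResponse

# A 302 answer to POST /mountpoint/api: most clients follow it with a GET and an
# empty body, which is exactly the surprise described in the issue.
lossy = RedirectResponse(url="/mountpoint/api/", status_code=302)

# 307 (temporary) and 308 (permanent) oblige the client to replay the same method
# and body against the Location target, so a redirected POST stays a POST.
method_preserving = RedirectResponse(url="/mountpoint/api/", status_code=308)

assert lossy.headers["location"] == "/mountpoint/api/"
assert method_preserving.status_code == 308
```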
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/middleware/httpsredirect.py`
Content:
```
1 from starlette.datastructures import URL
2 from starlette.responses import RedirectResponse
3 from starlette.types import ASGIApp, Receive, Scope, Send
4
5
6 class HTTPSRedirectMiddleware:
7 def __init__(self, app: ASGIApp) -> None:
8 self.app = app
9
10 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
11 if scope["type"] in ("http", "websocket") and scope["scheme"] in ("http", "ws"):
12 url = URL(scope=scope)
13 redirect_scheme = {"http": "https", "ws": "wss"}[url.scheme]
14 netloc = url.hostname if url.port in (80, 443) else url.netloc
15 url = url.replace(scheme=redirect_scheme, netloc=netloc)
16 response = RedirectResponse(url, status_code=301)
17 await response(scope, receive, send)
18 else:
19 await self.app(scope, receive, send)
20
```
Path: `starlette/responses.py`
Content:
```
1 import hashlib
2 import http.cookies
3 import inspect
4 import json
5 import os
6 import stat
7 import typing
8 from email.utils import formatdate
9 from mimetypes import guess_type
10 from urllib.parse import quote_plus
11
12 from starlette.background import BackgroundTask
13 from starlette.concurrency import iterate_in_threadpool
14 from starlette.datastructures import URL, MutableHeaders
15 from starlette.types import Receive, Scope, Send
16
17 try:
18 import aiofiles
19 from aiofiles.os import stat as aio_stat
20 except ImportError: # pragma: nocover
21 aiofiles = None # type: ignore
22 aio_stat = None # type: ignore
23
24 try:
25 import ujson
26 except ImportError: # pragma: nocover
27 ujson = None # type: ignore
28
29
30 class Response:
31 media_type = None
32 charset = "utf-8"
33
34 def __init__(
35 self,
36 content: typing.Any = None,
37 status_code: int = 200,
38 headers: dict = None,
39 media_type: str = None,
40 background: BackgroundTask = None,
41 ) -> None:
42 self.body = self.render(content)
43 self.status_code = status_code
44 if media_type is not None:
45 self.media_type = media_type
46 self.background = background
47 self.init_headers(headers)
48
49 def render(self, content: typing.Any) -> bytes:
50 if content is None:
51 return b""
52 if isinstance(content, bytes):
53 return content
54 return content.encode(self.charset)
55
56 def init_headers(self, headers: typing.Mapping[str, str] = None) -> None:
57 if headers is None:
58 raw_headers = [] # type: typing.List[typing.Tuple[bytes, bytes]]
59 populate_content_length = True
60 populate_content_type = True
61 else:
62 raw_headers = [
63 (k.lower().encode("latin-1"), v.encode("latin-1"))
64 for k, v in headers.items()
65 ]
66 keys = [h[0] for h in raw_headers]
67 populate_content_length = b"content-length" not in keys
68 populate_content_type = b"content-type" not in keys
69
70 body = getattr(self, "body", b"")
71 if body and populate_content_length:
72 content_length = str(len(body))
73 raw_headers.append((b"content-length", content_length.encode("latin-1")))
74
75 content_type = self.media_type
76 if content_type is not None and populate_content_type:
77 if content_type.startswith("text/"):
78 content_type += "; charset=" + self.charset
79 raw_headers.append((b"content-type", content_type.encode("latin-1")))
80
81 self.raw_headers = raw_headers
82
83 @property
84 def headers(self) -> MutableHeaders:
85 if not hasattr(self, "_headers"):
86 self._headers = MutableHeaders(raw=self.raw_headers)
87 return self._headers
88
89 def set_cookie(
90 self,
91 key: str,
92 value: str = "",
93 max_age: int = None,
94 expires: int = None,
95 path: str = "/",
96 domain: str = None,
97 secure: bool = False,
98 httponly: bool = False,
99 ) -> None:
100 cookie = http.cookies.SimpleCookie()
101 cookie[key] = value
102 if max_age is not None:
103 cookie[key]["max-age"] = max_age # type: ignore
104 if expires is not None:
105 cookie[key]["expires"] = expires # type: ignore
106 if path is not None:
107 cookie[key]["path"] = path
108 if domain is not None:
109 cookie[key]["domain"] = domain
110 if secure:
111 cookie[key]["secure"] = True # type: ignore
112 if httponly:
113 cookie[key]["httponly"] = True # type: ignore
114 cookie_val = cookie.output(header="").strip()
115 self.raw_headers.append((b"set-cookie", cookie_val.encode("latin-1")))
116
117 def delete_cookie(self, key: str, path: str = "/", domain: str = None) -> None:
118 self.set_cookie(key, expires=0, max_age=0, path=path, domain=domain)
119
120 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
121 await send(
122 {
123 "type": "http.response.start",
124 "status": self.status_code,
125 "headers": self.raw_headers,
126 }
127 )
128 await send({"type": "http.response.body", "body": self.body})
129
130 if self.background is not None:
131 await self.background()
132
133
134 class HTMLResponse(Response):
135 media_type = "text/html"
136
137
138 class PlainTextResponse(Response):
139 media_type = "text/plain"
140
141
142 class JSONResponse(Response):
143 media_type = "application/json"
144
145 def render(self, content: typing.Any) -> bytes:
146 return json.dumps(
147 content,
148 ensure_ascii=False,
149 allow_nan=False,
150 indent=None,
151 separators=(",", ":"),
152 ).encode("utf-8")
153
154
155 class UJSONResponse(JSONResponse):
156 media_type = "application/json"
157
158 def render(self, content: typing.Any) -> bytes:
159 return ujson.dumps(content, ensure_ascii=False).encode("utf-8")
160
161
162 class RedirectResponse(Response):
163 def __init__(
164 self, url: typing.Union[str, URL], status_code: int = 302, headers: dict = None
165 ) -> None:
166 super().__init__(content=b"", status_code=status_code, headers=headers)
167 self.headers["location"] = quote_plus(str(url), safe=":/%#?&=@[]!$&'()*+,;")
168
169
170 class StreamingResponse(Response):
171 def __init__(
172 self,
173 content: typing.Any,
174 status_code: int = 200,
175 headers: dict = None,
176 media_type: str = None,
177 background: BackgroundTask = None,
178 ) -> None:
179 if inspect.isasyncgen(content):
180 self.body_iterator = content
181 else:
182 self.body_iterator = iterate_in_threadpool(content)
183 self.status_code = status_code
184 self.media_type = self.media_type if media_type is None else media_type
185 self.background = background
186 self.init_headers(headers)
187
188 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
189 await send(
190 {
191 "type": "http.response.start",
192 "status": self.status_code,
193 "headers": self.raw_headers,
194 }
195 )
196 async for chunk in self.body_iterator:
197 if not isinstance(chunk, bytes):
198 chunk = chunk.encode(self.charset)
199 await send({"type": "http.response.body", "body": chunk, "more_body": True})
200 await send({"type": "http.response.body", "body": b"", "more_body": False})
201
202 if self.background is not None:
203 await self.background()
204
205
206 class FileResponse(Response):
207 chunk_size = 4096
208
209 def __init__(
210 self,
211 path: str,
212 status_code: int = 200,
213 headers: dict = None,
214 media_type: str = None,
215 background: BackgroundTask = None,
216 filename: str = None,
217 stat_result: os.stat_result = None,
218 method: str = None,
219 ) -> None:
220 assert aiofiles is not None, "'aiofiles' must be installed to use FileResponse"
221 self.path = path
222 self.status_code = status_code
223 self.filename = filename
224 self.send_header_only = method is not None and method.upper() == "HEAD"
225 if media_type is None:
226 media_type = guess_type(filename or path)[0] or "text/plain"
227 self.media_type = media_type
228 self.background = background
229 self.init_headers(headers)
230 if self.filename is not None:
231 content_disposition = 'attachment; filename="{}"'.format(self.filename)
232 self.headers.setdefault("content-disposition", content_disposition)
233 self.stat_result = stat_result
234 if stat_result is not None:
235 self.set_stat_headers(stat_result)
236
237 def set_stat_headers(self, stat_result: os.stat_result) -> None:
238 content_length = str(stat_result.st_size)
239 last_modified = formatdate(stat_result.st_mtime, usegmt=True)
240 etag_base = str(stat_result.st_mtime) + "-" + str(stat_result.st_size)
241 etag = hashlib.md5(etag_base.encode()).hexdigest()
242
243 self.headers.setdefault("content-length", content_length)
244 self.headers.setdefault("last-modified", last_modified)
245 self.headers.setdefault("etag", etag)
246
247 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
248 if self.stat_result is None:
249 try:
250 stat_result = await aio_stat(self.path)
251 self.set_stat_headers(stat_result)
252 except FileNotFoundError:
253 raise RuntimeError(f"File at path {self.path} does not exist.")
254 else:
255 mode = stat_result.st_mode
256 if not stat.S_ISREG(mode):
257 raise RuntimeError(f"File at path {self.path} is not a file.")
258 await send(
259 {
260 "type": "http.response.start",
261 "status": self.status_code,
262 "headers": self.raw_headers,
263 }
264 )
265 if self.send_header_only:
266 await send({"type": "http.response.body"})
267 else:
268 async with aiofiles.open(self.path, mode="rb") as file:
269 more_body = True
270 while more_body:
271 chunk = await file.read(self.chunk_size)
272 more_body = len(chunk) == self.chunk_size
273 await send(
274 {
275 "type": "http.response.body",
276 "body": chunk,
277 "more_body": more_body,
278 }
279 )
280 if self.background is not None:
281 await self.background()
282
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/starlette/middleware/httpsredirect.py b/starlette/middleware/httpsredirect.py
--- a/starlette/middleware/httpsredirect.py
+++ b/starlette/middleware/httpsredirect.py
@@ -13,7 +13,7 @@
redirect_scheme = {"http": "https", "ws": "wss"}[url.scheme]
netloc = url.hostname if url.port in (80, 443) else url.netloc
url = url.replace(scheme=redirect_scheme, netloc=netloc)
- response = RedirectResponse(url, status_code=301)
+ response = RedirectResponse(url, status_code=308)
await response(scope, receive, send)
else:
await self.app(scope, receive, send)
diff --git a/starlette/responses.py b/starlette/responses.py
--- a/starlette/responses.py
+++ b/starlette/responses.py
@@ -161,7 +161,7 @@
class RedirectResponse(Response):
def __init__(
- self, url: typing.Union[str, URL], status_code: int = 302, headers: dict = None
+ self, url: typing.Union[str, URL], status_code: int = 307, headers: dict = None
) -> None:
super().__init__(content=b"", status_code=status_code, headers=headers)
self.headers["location"] = quote_plus(str(url), safe=":/%#?&=@[]!$&'()*+,;")
| {"golden_diff": "diff --git a/starlette/middleware/httpsredirect.py b/starlette/middleware/httpsredirect.py\n--- a/starlette/middleware/httpsredirect.py\n+++ b/starlette/middleware/httpsredirect.py\n@@ -13,7 +13,7 @@\n redirect_scheme = {\"http\": \"https\", \"ws\": \"wss\"}[url.scheme]\n netloc = url.hostname if url.port in (80, 443) else url.netloc\n url = url.replace(scheme=redirect_scheme, netloc=netloc)\n- response = RedirectResponse(url, status_code=301)\n+ response = RedirectResponse(url, status_code=308)\n await response(scope, receive, send)\n else:\n await self.app(scope, receive, send)\ndiff --git a/starlette/responses.py b/starlette/responses.py\n--- a/starlette/responses.py\n+++ b/starlette/responses.py\n@@ -161,7 +161,7 @@\n \n class RedirectResponse(Response):\n def __init__(\n- self, url: typing.Union[str, URL], status_code: int = 302, headers: dict = None\n+ self, url: typing.Union[str, URL], status_code: int = 307, headers: dict = None\n ) -> None:\n super().__init__(content=b\"\", status_code=status_code, headers=headers)\n self.headers[\"location\"] = quote_plus(str(url), safe=\":/%#?&=@[]!$&'()*+,;\")\n", "issue": "Use \"308 Permanent Redirect\" for redirect slashes behavior.\nHi,\r\nI stumbled upon a quirk in starlette that is not properly documented. It seems like all of my HTTP request to a route without trailing slash are being redirected to route with trailing slashes. Say I am hitting `http://hostname/mountpoint/api` it will be redirected (302) to `http://hostname/mountpoint/api/`. This messed up your api calls; if you call POST to `http://hostname/mountpoint/api` it will be redirected to GET `http://hostname/mountpoint/api/`.\r\n\r\nI dig on the source and see this redirect_slashes flag and setting it to False fix this. 
I feel this behavior (the auto redirection) should be documented.\n", "before_files": [{"content": "from starlette.datastructures import URL\nfrom starlette.responses import RedirectResponse\nfrom starlette.types import ASGIApp, Receive, Scope, Send\n\n\nclass HTTPSRedirectMiddleware:\n def __init__(self, app: ASGIApp) -> None:\n self.app = app\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if scope[\"type\"] in (\"http\", \"websocket\") and scope[\"scheme\"] in (\"http\", \"ws\"):\n url = URL(scope=scope)\n redirect_scheme = {\"http\": \"https\", \"ws\": \"wss\"}[url.scheme]\n netloc = url.hostname if url.port in (80, 443) else url.netloc\n url = url.replace(scheme=redirect_scheme, netloc=netloc)\n response = RedirectResponse(url, status_code=301)\n await response(scope, receive, send)\n else:\n await self.app(scope, receive, send)\n", "path": "starlette/middleware/httpsredirect.py"}, {"content": "import hashlib\nimport http.cookies\nimport inspect\nimport json\nimport os\nimport stat\nimport typing\nfrom email.utils import formatdate\nfrom mimetypes import guess_type\nfrom urllib.parse import quote_plus\n\nfrom starlette.background import BackgroundTask\nfrom starlette.concurrency import iterate_in_threadpool\nfrom starlette.datastructures import URL, MutableHeaders\nfrom starlette.types import Receive, Scope, Send\n\ntry:\n import aiofiles\n from aiofiles.os import stat as aio_stat\nexcept ImportError: # pragma: nocover\n aiofiles = None # type: ignore\n aio_stat = None # type: ignore\n\ntry:\n import ujson\nexcept ImportError: # pragma: nocover\n ujson = None # type: ignore\n\n\nclass Response:\n media_type = None\n charset = \"utf-8\"\n\n def __init__(\n self,\n content: typing.Any = None,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n ) -> None:\n self.body = self.render(content)\n self.status_code = status_code\n if media_type is not None:\n self.media_type = media_type\n self.background = background\n self.init_headers(headers)\n\n def render(self, content: typing.Any) -> bytes:\n if content is None:\n return b\"\"\n if isinstance(content, bytes):\n return content\n return content.encode(self.charset)\n\n def init_headers(self, headers: typing.Mapping[str, str] = None) -> None:\n if headers is None:\n raw_headers = [] # type: typing.List[typing.Tuple[bytes, bytes]]\n populate_content_length = True\n populate_content_type = True\n else:\n raw_headers = [\n (k.lower().encode(\"latin-1\"), v.encode(\"latin-1\"))\n for k, v in headers.items()\n ]\n keys = [h[0] for h in raw_headers]\n populate_content_length = b\"content-length\" not in keys\n populate_content_type = b\"content-type\" not in keys\n\n body = getattr(self, \"body\", b\"\")\n if body and populate_content_length:\n content_length = str(len(body))\n raw_headers.append((b\"content-length\", content_length.encode(\"latin-1\")))\n\n content_type = self.media_type\n if content_type is not None and populate_content_type:\n if content_type.startswith(\"text/\"):\n content_type += \"; charset=\" + self.charset\n raw_headers.append((b\"content-type\", content_type.encode(\"latin-1\")))\n\n self.raw_headers = raw_headers\n\n @property\n def headers(self) -> MutableHeaders:\n if not hasattr(self, \"_headers\"):\n self._headers = MutableHeaders(raw=self.raw_headers)\n return self._headers\n\n def set_cookie(\n self,\n key: str,\n value: str = \"\",\n max_age: int = None,\n expires: int = None,\n path: str = \"/\",\n domain: str = 
None,\n secure: bool = False,\n httponly: bool = False,\n ) -> None:\n cookie = http.cookies.SimpleCookie()\n cookie[key] = value\n if max_age is not None:\n cookie[key][\"max-age\"] = max_age # type: ignore\n if expires is not None:\n cookie[key][\"expires\"] = expires # type: ignore\n if path is not None:\n cookie[key][\"path\"] = path\n if domain is not None:\n cookie[key][\"domain\"] = domain\n if secure:\n cookie[key][\"secure\"] = True # type: ignore\n if httponly:\n cookie[key][\"httponly\"] = True # type: ignore\n cookie_val = cookie.output(header=\"\").strip()\n self.raw_headers.append((b\"set-cookie\", cookie_val.encode(\"latin-1\")))\n\n def delete_cookie(self, key: str, path: str = \"/\", domain: str = None) -> None:\n self.set_cookie(key, expires=0, max_age=0, path=path, domain=domain)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n await send(\n {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n await send({\"type\": \"http.response.body\", \"body\": self.body})\n\n if self.background is not None:\n await self.background()\n\n\nclass HTMLResponse(Response):\n media_type = \"text/html\"\n\n\nclass PlainTextResponse(Response):\n media_type = \"text/plain\"\n\n\nclass JSONResponse(Response):\n media_type = \"application/json\"\n\n def render(self, content: typing.Any) -> bytes:\n return json.dumps(\n content,\n ensure_ascii=False,\n allow_nan=False,\n indent=None,\n separators=(\",\", \":\"),\n ).encode(\"utf-8\")\n\n\nclass UJSONResponse(JSONResponse):\n media_type = \"application/json\"\n\n def render(self, content: typing.Any) -> bytes:\n return ujson.dumps(content, ensure_ascii=False).encode(\"utf-8\")\n\n\nclass RedirectResponse(Response):\n def __init__(\n self, url: typing.Union[str, URL], status_code: int = 302, headers: dict = None\n ) -> None:\n super().__init__(content=b\"\", status_code=status_code, headers=headers)\n self.headers[\"location\"] = quote_plus(str(url), safe=\":/%#?&=@[]!$&'()*+,;\")\n\n\nclass StreamingResponse(Response):\n def __init__(\n self,\n content: typing.Any,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n ) -> None:\n if inspect.isasyncgen(content):\n self.body_iterator = content\n else:\n self.body_iterator = iterate_in_threadpool(content)\n self.status_code = status_code\n self.media_type = self.media_type if media_type is None else media_type\n self.background = background\n self.init_headers(headers)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n await send(\n {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n async for chunk in self.body_iterator:\n if not isinstance(chunk, bytes):\n chunk = chunk.encode(self.charset)\n await send({\"type\": \"http.response.body\", \"body\": chunk, \"more_body\": True})\n await send({\"type\": \"http.response.body\", \"body\": b\"\", \"more_body\": False})\n\n if self.background is not None:\n await self.background()\n\n\nclass FileResponse(Response):\n chunk_size = 4096\n\n def __init__(\n self,\n path: str,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n filename: str = None,\n stat_result: os.stat_result = None,\n method: str = None,\n ) -> None:\n assert aiofiles is not None, \"'aiofiles' must be installed to use FileResponse\"\n self.path = path\n self.status_code = 
status_code\n self.filename = filename\n self.send_header_only = method is not None and method.upper() == \"HEAD\"\n if media_type is None:\n media_type = guess_type(filename or path)[0] or \"text/plain\"\n self.media_type = media_type\n self.background = background\n self.init_headers(headers)\n if self.filename is not None:\n content_disposition = 'attachment; filename=\"{}\"'.format(self.filename)\n self.headers.setdefault(\"content-disposition\", content_disposition)\n self.stat_result = stat_result\n if stat_result is not None:\n self.set_stat_headers(stat_result)\n\n def set_stat_headers(self, stat_result: os.stat_result) -> None:\n content_length = str(stat_result.st_size)\n last_modified = formatdate(stat_result.st_mtime, usegmt=True)\n etag_base = str(stat_result.st_mtime) + \"-\" + str(stat_result.st_size)\n etag = hashlib.md5(etag_base.encode()).hexdigest()\n\n self.headers.setdefault(\"content-length\", content_length)\n self.headers.setdefault(\"last-modified\", last_modified)\n self.headers.setdefault(\"etag\", etag)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if self.stat_result is None:\n try:\n stat_result = await aio_stat(self.path)\n self.set_stat_headers(stat_result)\n except FileNotFoundError:\n raise RuntimeError(f\"File at path {self.path} does not exist.\")\n else:\n mode = stat_result.st_mode\n if not stat.S_ISREG(mode):\n raise RuntimeError(f\"File at path {self.path} is not a file.\")\n await send(\n {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n if self.send_header_only:\n await send({\"type\": \"http.response.body\"})\n else:\n async with aiofiles.open(self.path, mode=\"rb\") as file:\n more_body = True\n while more_body:\n chunk = await file.read(self.chunk_size)\n more_body = len(chunk) == self.chunk_size\n await send(\n {\n \"type\": \"http.response.body\",\n \"body\": chunk,\n \"more_body\": more_body,\n }\n )\n if self.background is not None:\n await self.background()\n", "path": "starlette/responses.py"}], "after_files": [{"content": "from starlette.datastructures import URL\nfrom starlette.responses import RedirectResponse\nfrom starlette.types import ASGIApp, Receive, Scope, Send\n\n\nclass HTTPSRedirectMiddleware:\n def __init__(self, app: ASGIApp) -> None:\n self.app = app\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if scope[\"type\"] in (\"http\", \"websocket\") and scope[\"scheme\"] in (\"http\", \"ws\"):\n url = URL(scope=scope)\n redirect_scheme = {\"http\": \"https\", \"ws\": \"wss\"}[url.scheme]\n netloc = url.hostname if url.port in (80, 443) else url.netloc\n url = url.replace(scheme=redirect_scheme, netloc=netloc)\n response = RedirectResponse(url, status_code=308)\n await response(scope, receive, send)\n else:\n await self.app(scope, receive, send)\n", "path": "starlette/middleware/httpsredirect.py"}, {"content": "import hashlib\nimport http.cookies\nimport inspect\nimport json\nimport os\nimport stat\nimport typing\nfrom email.utils import formatdate\nfrom mimetypes import guess_type\nfrom urllib.parse import quote_plus\n\nfrom starlette.background import BackgroundTask\nfrom starlette.concurrency import iterate_in_threadpool\nfrom starlette.datastructures import URL, MutableHeaders\nfrom starlette.types import Receive, Scope, Send\n\ntry:\n import aiofiles\n from aiofiles.os import stat as aio_stat\nexcept ImportError: # pragma: nocover\n aiofiles = None # type: ignore\n aio_stat = None # type: 
ignore\n\ntry:\n import ujson\nexcept ImportError: # pragma: nocover\n ujson = None # type: ignore\n\n\nclass Response:\n media_type = None\n charset = \"utf-8\"\n\n def __init__(\n self,\n content: typing.Any = None,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n ) -> None:\n self.body = self.render(content)\n self.status_code = status_code\n if media_type is not None:\n self.media_type = media_type\n self.background = background\n self.init_headers(headers)\n\n def render(self, content: typing.Any) -> bytes:\n if content is None:\n return b\"\"\n if isinstance(content, bytes):\n return content\n return content.encode(self.charset)\n\n def init_headers(self, headers: typing.Mapping[str, str] = None) -> None:\n if headers is None:\n raw_headers = [] # type: typing.List[typing.Tuple[bytes, bytes]]\n populate_content_length = True\n populate_content_type = True\n else:\n raw_headers = [\n (k.lower().encode(\"latin-1\"), v.encode(\"latin-1\"))\n for k, v in headers.items()\n ]\n keys = [h[0] for h in raw_headers]\n populate_content_length = b\"content-length\" not in keys\n populate_content_type = b\"content-type\" not in keys\n\n body = getattr(self, \"body\", b\"\")\n if body and populate_content_length:\n content_length = str(len(body))\n raw_headers.append((b\"content-length\", content_length.encode(\"latin-1\")))\n\n content_type = self.media_type\n if content_type is not None and populate_content_type:\n if content_type.startswith(\"text/\"):\n content_type += \"; charset=\" + self.charset\n raw_headers.append((b\"content-type\", content_type.encode(\"latin-1\")))\n\n self.raw_headers = raw_headers\n\n @property\n def headers(self) -> MutableHeaders:\n if not hasattr(self, \"_headers\"):\n self._headers = MutableHeaders(raw=self.raw_headers)\n return self._headers\n\n def set_cookie(\n self,\n key: str,\n value: str = \"\",\n max_age: int = None,\n expires: int = None,\n path: str = \"/\",\n domain: str = None,\n secure: bool = False,\n httponly: bool = False,\n ) -> None:\n cookie = http.cookies.SimpleCookie()\n cookie[key] = value\n if max_age is not None:\n cookie[key][\"max-age\"] = max_age # type: ignore\n if expires is not None:\n cookie[key][\"expires\"] = expires # type: ignore\n if path is not None:\n cookie[key][\"path\"] = path\n if domain is not None:\n cookie[key][\"domain\"] = domain\n if secure:\n cookie[key][\"secure\"] = True # type: ignore\n if httponly:\n cookie[key][\"httponly\"] = True # type: ignore\n cookie_val = cookie.output(header=\"\").strip()\n self.raw_headers.append((b\"set-cookie\", cookie_val.encode(\"latin-1\")))\n\n def delete_cookie(self, key: str, path: str = \"/\", domain: str = None) -> None:\n self.set_cookie(key, expires=0, max_age=0, path=path, domain=domain)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n await send(\n {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n await send({\"type\": \"http.response.body\", \"body\": self.body})\n\n if self.background is not None:\n await self.background()\n\n\nclass HTMLResponse(Response):\n media_type = \"text/html\"\n\n\nclass PlainTextResponse(Response):\n media_type = \"text/plain\"\n\n\nclass JSONResponse(Response):\n media_type = \"application/json\"\n\n def render(self, content: typing.Any) -> bytes:\n return json.dumps(\n content,\n ensure_ascii=False,\n allow_nan=False,\n indent=None,\n separators=(\",\", \":\"),\n 
).encode(\"utf-8\")\n\n\nclass UJSONResponse(JSONResponse):\n media_type = \"application/json\"\n\n def render(self, content: typing.Any) -> bytes:\n return ujson.dumps(content, ensure_ascii=False).encode(\"utf-8\")\n\n\nclass RedirectResponse(Response):\n def __init__(\n self, url: typing.Union[str, URL], status_code: int = 307, headers: dict = None\n ) -> None:\n super().__init__(content=b\"\", status_code=status_code, headers=headers)\n self.headers[\"location\"] = quote_plus(str(url), safe=\":/%#?&=@[]!$&'()*+,;\")\n\n\nclass StreamingResponse(Response):\n def __init__(\n self,\n content: typing.Any,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n ) -> None:\n if inspect.isasyncgen(content):\n self.body_iterator = content\n else:\n self.body_iterator = iterate_in_threadpool(content)\n self.status_code = status_code\n self.media_type = self.media_type if media_type is None else media_type\n self.background = background\n self.init_headers(headers)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n await send(\n {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n async for chunk in self.body_iterator:\n if not isinstance(chunk, bytes):\n chunk = chunk.encode(self.charset)\n await send({\"type\": \"http.response.body\", \"body\": chunk, \"more_body\": True})\n await send({\"type\": \"http.response.body\", \"body\": b\"\", \"more_body\": False})\n\n if self.background is not None:\n await self.background()\n\n\nclass FileResponse(Response):\n chunk_size = 4096\n\n def __init__(\n self,\n path: str,\n status_code: int = 200,\n headers: dict = None,\n media_type: str = None,\n background: BackgroundTask = None,\n filename: str = None,\n stat_result: os.stat_result = None,\n method: str = None,\n ) -> None:\n assert aiofiles is not None, \"'aiofiles' must be installed to use FileResponse\"\n self.path = path\n self.status_code = status_code\n self.filename = filename\n self.send_header_only = method is not None and method.upper() == \"HEAD\"\n if media_type is None:\n media_type = guess_type(filename or path)[0] or \"text/plain\"\n self.media_type = media_type\n self.background = background\n self.init_headers(headers)\n if self.filename is not None:\n content_disposition = 'attachment; filename=\"{}\"'.format(self.filename)\n self.headers.setdefault(\"content-disposition\", content_disposition)\n self.stat_result = stat_result\n if stat_result is not None:\n self.set_stat_headers(stat_result)\n\n def set_stat_headers(self, stat_result: os.stat_result) -> None:\n content_length = str(stat_result.st_size)\n last_modified = formatdate(stat_result.st_mtime, usegmt=True)\n etag_base = str(stat_result.st_mtime) + \"-\" + str(stat_result.st_size)\n etag = hashlib.md5(etag_base.encode()).hexdigest()\n\n self.headers.setdefault(\"content-length\", content_length)\n self.headers.setdefault(\"last-modified\", last_modified)\n self.headers.setdefault(\"etag\", etag)\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if self.stat_result is None:\n try:\n stat_result = await aio_stat(self.path)\n self.set_stat_headers(stat_result)\n except FileNotFoundError:\n raise RuntimeError(f\"File at path {self.path} does not exist.\")\n else:\n mode = stat_result.st_mode\n if not stat.S_ISREG(mode):\n raise RuntimeError(f\"File at path {self.path} is not a file.\")\n await send(\n {\n \"type\": \"http.response.start\",\n 
\"status\": self.status_code,\n \"headers\": self.raw_headers,\n }\n )\n if self.send_header_only:\n await send({\"type\": \"http.response.body\"})\n else:\n async with aiofiles.open(self.path, mode=\"rb\") as file:\n more_body = True\n while more_body:\n chunk = await file.read(self.chunk_size)\n more_body = len(chunk) == self.chunk_size\n await send(\n {\n \"type\": \"http.response.body\",\n \"body\": chunk,\n \"more_body\": more_body,\n }\n )\n if self.background is not None:\n await self.background()\n", "path": "starlette/responses.py"}]} | 3,542 | 334 |
gh_patches_debug_30926 | rasdani/github-patches | git_diff | conda__conda-5820 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
conda 4.4 incorrect activate message
I updated to conda 4.4.0b2 from conda 4.3. I created a new environment, and it recommended the following, which I ran and got an error:
To activate this environment, use
$ conda activate anaconda2
To deactivate an active environment, use
$ conda deactivate
0288-csoja:scripts csoja$ conda activate anaconda2
Traceback (most recent call last):
File "/Users/csoja/Desktop/anaconda/lib/python3.5/site-packages/conda/cli/main.py", line 176, in main
raise CommandNotFoundError(argv1)
conda.exceptions.CommandNotFoundError: 'activate is not a conda command.
Did you mean 'source activate'?
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/csoja/Desktop/anaconda/bin/conda", line 6, in <module>
sys.exit(conda.cli.main())
File "/Users/csoja/Desktop/anaconda/lib/python3.5/site-packages/conda/cli/main.py", line 178, in main
from ..exceptions import handle_exception
ImportError: cannot import name 'handle_exception'
--- END ISSUE ---
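(Editorial note, not part of the issue above.) The broken suggestion comes from the new-environment message assuming `conda activate` is available, which is only true once the user's shell has sourced conda 4.4's hook; that state is signalled through the `CONDA_SHLVL` environment variable, and older setups still need `source activate` (or a bare `activate` on Windows). A rough sketch of picking the right hint, with the exact wording invented here rather than taken from conda:

```python
import os
import sys


def activation_hint(env_name):
    # conda 4.4's shell integration exports CONDA_SHLVL once its hook has been
    # sourced, so only then is `conda activate` a safe thing to recommend.
    if "CONDA_SHLVL" in os.environ:
        return "$ conda activate %s" % env_name
    # The legacy Windows batch scripts expose a bare `activate` command.
    if sys.platform == "win32":
        return "> activate %s" % env_name
    # Pre-4.4 POSIX shells fall back to sourcing the activate script.
    return "$ source activate %s" % env_name


print(activation_hint("anaconda2"))
```

The merged fix follows the same idea but uses conda's own `on_win` flag and also special-cases the fish shell.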
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda/cli/install.py`
Content:
```
1 # (c) Continuum Analytics, Inc. / http://continuum.io
2 # All Rights Reserved
3 #
4 # conda is distributed under the terms of the BSD 3-clause license.
5 # Consult LICENSE.txt or http://opensource.org/licenses/BSD-3-Clause.
6
7 from __future__ import absolute_import, division, print_function, unicode_literals
8
9 from logging import getLogger
10 import os
11 from os.path import abspath, basename, exists, isdir
12
13 from conda.models.match_spec import MatchSpec
14 from . import common
15 from .common import check_non_admin
16 from .._vendor.auxlib.ish import dals
17 from ..base.constants import ROOT_ENV_NAME
18 from ..base.context import context
19 from ..common.compat import text_type
20 from ..core.envs_manager import EnvsDirectory
21 from ..core.index import calculate_channel_urls, get_index
22 from ..core.solve import Solver
23 from ..exceptions import (CondaImportError, CondaOSError, CondaSystemExit, CondaValueError,
24 DirectoryNotFoundError, DryRunExit, EnvironmentLocationNotFound,
25 PackagesNotFoundError, TooManyArgumentsError,
26 UnsatisfiableError)
27 from ..misc import append_env, clone_env, explicit, touch_nonadmin
28 from ..plan import (revert_actions)
29 from ..resolve import ResolvePackageNotFound
30
31 log = getLogger(__name__)
32 stderrlog = getLogger('conda.stderr')
33
34
35 def check_prefix(prefix, json=False):
36 name = basename(prefix)
37 error = None
38 if name == ROOT_ENV_NAME:
39 error = "'%s' is a reserved environment name" % name
40 if exists(prefix):
41 if isdir(prefix) and 'conda-meta' not in os.listdir(prefix):
42 return None
43 error = "prefix already exists: %s" % prefix
44
45 if error:
46 raise CondaValueError(error, json)
47
48 if ' ' in prefix:
49 stderrlog.warn("WARNING: A space was detected in your requested environment path\n"
50 "'%s'\n"
51 "Spaces in paths can sometimes be problematic." % prefix)
52
53
54 def clone(src_arg, dst_prefix, json=False, quiet=False, index_args=None):
55 if os.sep in src_arg:
56 src_prefix = abspath(src_arg)
57 if not isdir(src_prefix):
58 raise DirectoryNotFoundError(src_arg)
59 else:
60 assert context._argparse_args.clone is not None
61 src_prefix = EnvsDirectory.locate_prefix_by_name(context._argparse_args.clone)
62
63 if not json:
64 print("Source: %s" % src_prefix)
65 print("Destination: %s" % dst_prefix)
66
67 actions, untracked_files = clone_env(src_prefix, dst_prefix,
68 verbose=not json,
69 quiet=quiet,
70 index_args=index_args)
71
72 if json:
73 common.stdout_json_success(
74 actions=actions,
75 untracked_files=list(untracked_files),
76 src_prefix=src_prefix,
77 dst_prefix=dst_prefix
78 )
79
80
81 def print_activate(env_name_or_prefix): # pragma: no cover
82 if not context.quiet and not context.json:
83 message = dals("""
84
85 To activate this environment, use
86
87 $ conda activate %s
88
89 To deactivate an active environment, use
90
91 $ conda deactivate
92 """) % env_name_or_prefix
93 print(message) # TODO: use logger
94
95
96 def get_revision(arg, json=False):
97 try:
98 return int(arg)
99 except ValueError:
100 CondaValueError("expected revision number, not: '%s'" % arg, json)
101
102
103 def install(args, parser, command='install'):
104 """
105 conda install, conda update, and conda create
106 """
107 context.validate_configuration()
108 check_non_admin()
109
110 newenv = bool(command == 'create')
111 isupdate = bool(command == 'update')
112 isinstall = bool(command == 'install')
113 if newenv:
114 common.ensure_name_or_prefix(args, command)
115 prefix = context.target_prefix
116 if newenv:
117 check_prefix(prefix, json=context.json)
118 if context.force_32bit and prefix == context.root_prefix:
119 raise CondaValueError("cannot use CONDA_FORCE_32BIT=1 in root env")
120 if isupdate and not (args.file or args.all or args.packages):
121 raise CondaValueError("""no package names supplied
122 # If you want to update to a newer version of Anaconda, type:
123 #
124 # $ conda update --prefix %s anaconda
125 """ % prefix)
126
127 args_packages = [s.strip('"\'') for s in args.packages]
128 if newenv and not args.no_default_packages:
129 # Override defaults if they are specified at the command line
130 # TODO: rework in 4.4 branch using MatchSpec
131 args_packages_names = [pkg.replace(' ', '=').split('=', 1)[0] for pkg in args_packages]
132 for default_pkg in context.create_default_packages:
133 default_pkg_name = default_pkg.replace(' ', '=').split('=', 1)[0]
134 if default_pkg_name not in args_packages_names:
135 args_packages.append(default_pkg)
136 args_packages.extend(text_type(MatchSpec(provides_features=ft)) for ft in args.features or ())
137
138 index_args = {
139 'use_cache': args.use_index_cache,
140 'channel_urls': context.channels,
141 'unknown': args.unknown,
142 'prepend': not args.override_channels,
143 'use_local': args.use_local
144 }
145
146 num_cp = sum(s.endswith('.tar.bz2') for s in args_packages)
147 if num_cp:
148 if num_cp == len(args_packages):
149 explicit(args_packages, prefix, verbose=not context.quiet)
150 return
151 else:
152 raise CondaValueError("cannot mix specifications with conda package"
153 " filenames")
154
155 specs = []
156 if args.file:
157 for fpath in args.file:
158 specs.extend(common.specs_from_url(fpath, json=context.json))
159 if '@EXPLICIT' in specs:
160 explicit(specs, prefix, verbose=not context.quiet, index_args=index_args)
161 return
162 specs.extend(common.specs_from_args(args_packages, json=context.json))
163
164 if isinstall and args.revision:
165 get_revision(args.revision, json=context.json)
166 elif isinstall and not (args.file or args_packages):
167 raise CondaValueError("too few arguments, "
168 "must supply command line package specs or --file")
169
170 if newenv and args.clone:
171 if args.packages:
172 raise TooManyArgumentsError(0, len(args.packages), list(args.packages),
173 'did not expect any arguments for --clone')
174
175 clone(args.clone, prefix, json=context.json, quiet=context.quiet, index_args=index_args)
176 append_env(prefix)
177 touch_nonadmin(prefix)
178 print_activate(args.name if args.name else prefix)
179 return
180
181 if not isdir(prefix) and not newenv:
182 if args.mkdir:
183 try:
184 os.makedirs(prefix)
185 except OSError:
186 raise CondaOSError("Error: could not create directory: %s" % prefix)
187 else:
188 raise EnvironmentLocationNotFound(prefix)
189
190 try:
191 if isinstall and args.revision:
192 index = get_index(channel_urls=index_args['channel_urls'],
193 prepend=index_args['prepend'], platform=None,
194 use_local=index_args['use_local'], use_cache=index_args['use_cache'],
195 unknown=index_args['unknown'], prefix=prefix)
196 unlink_link_transaction = revert_actions(prefix, get_revision(args.revision), index)
197 progressive_fetch_extract = unlink_link_transaction.get_pfe()
198 else:
199 solver = Solver(prefix, context.channels, context.subdirs, specs_to_add=specs)
200 unlink_link_transaction = solver.solve_for_transaction(
201 force_reinstall=context.force,
202 )
203 progressive_fetch_extract = unlink_link_transaction.get_pfe()
204
205 except ResolvePackageNotFound as e:
206 channels_urls = tuple(calculate_channel_urls(
207 channel_urls=index_args['channel_urls'],
208 prepend=index_args['prepend'],
209 platform=None,
210 use_local=index_args['use_local'],
211 ))
212 raise PackagesNotFoundError(e.bad_deps, channels_urls)
213
214 except (UnsatisfiableError, SystemExit) as e:
215 # Unsatisfiable package specifications/no such revision/import error
216 if e.args and 'could not import' in e.args[0]:
217 raise CondaImportError(text_type(e))
218 raise
219
220 handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv)
221
222
223 def handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv,
224 remove_op=False):
225 if unlink_link_transaction.nothing_to_do:
226 if remove_op:
227 # No packages found to remove from environment
228 raise PackagesNotFoundError(args.package_names)
229 elif not newenv:
230 if context.json:
231 common.stdout_json_success(message='All requested packages already installed.')
232 else:
233 print('\n# All requested packages already installed.\n')
234 return
235
236 if not context.json:
237 unlink_link_transaction.display_actions(progressive_fetch_extract)
238 common.confirm_yn()
239
240 elif context.dry_run:
241 common.stdout_json_success(unlink_link_transaction=unlink_link_transaction, prefix=prefix,
242 dry_run=True)
243 raise DryRunExit()
244
245 try:
246 progressive_fetch_extract.execute()
247 unlink_link_transaction.execute()
248
249 except SystemExit as e:
250 raise CondaSystemExit('Exiting', e)
251
252 if newenv:
253 append_env(prefix)
254 touch_nonadmin(prefix)
255 print_activate(args.name if args.name else prefix)
256
257 if context.json:
258 actions = unlink_link_transaction.make_legacy_action_groups(progressive_fetch_extract)[0]
259 common.stdout_json_success(actions=actions)
260
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda/cli/install.py b/conda/cli/install.py
--- a/conda/cli/install.py
+++ b/conda/cli/install.py
@@ -16,7 +16,7 @@
from .._vendor.auxlib.ish import dals
from ..base.constants import ROOT_ENV_NAME
from ..base.context import context
-from ..common.compat import text_type
+from ..common.compat import text_type, on_win
from ..core.envs_manager import EnvsDirectory
from ..core.index import calculate_channel_urls, get_index
from ..core.solve import Solver
@@ -80,16 +80,39 @@
def print_activate(env_name_or_prefix): # pragma: no cover
if not context.quiet and not context.json:
- message = dals("""
-
- To activate this environment, use
-
- $ conda activate %s
-
- To deactivate an active environment, use
-
- $ conda deactivate
- """) % env_name_or_prefix
+ if 'CONDA_SHLVL' in os.environ or os.path.split(os.environ.get('SHELL', ''))[-1] == 'fish':
+ message = dals("""
+ #
+ # To activate this environment, use
+ #
+ # $ conda activate %s
+ #
+ # To deactivate an active environment, use
+ #
+ # $ conda deactivate
+ """) % env_name_or_prefix
+ elif on_win:
+ message = dals("""
+ #
+ # To activate this environment, use:
+ # > activate %s
+ #
+ # To deactivate an active environment, use:
+ # > deactivate
+ #
+ # * for power-users using bash, you must source
+ #
+ """) % env_name_or_prefix
+ else:
+ message = dals("""
+ #
+ # To activate this environment, use:
+ # > source activate %s
+ #
+ # To deactivate an active environment, use:
+ # > source deactivate
+ #
+ """) % env_name_or_prefix
print(message) # TODO: use logger
| {"golden_diff": "diff --git a/conda/cli/install.py b/conda/cli/install.py\n--- a/conda/cli/install.py\n+++ b/conda/cli/install.py\n@@ -16,7 +16,7 @@\n from .._vendor.auxlib.ish import dals\n from ..base.constants import ROOT_ENV_NAME\n from ..base.context import context\n-from ..common.compat import text_type\n+from ..common.compat import text_type, on_win\n from ..core.envs_manager import EnvsDirectory\n from ..core.index import calculate_channel_urls, get_index\n from ..core.solve import Solver\n@@ -80,16 +80,39 @@\n \n def print_activate(env_name_or_prefix): # pragma: no cover\n if not context.quiet and not context.json:\n- message = dals(\"\"\"\n-\n- To activate this environment, use\n-\n- $ conda activate %s\n-\n- To deactivate an active environment, use\n-\n- $ conda deactivate\n- \"\"\") % env_name_or_prefix\n+ if 'CONDA_SHLVL' in os.environ or os.path.split(os.environ.get('SHELL', ''))[-1] == 'fish':\n+ message = dals(\"\"\"\n+ #\n+ # To activate this environment, use\n+ #\n+ # $ conda activate %s\n+ #\n+ # To deactivate an active environment, use\n+ #\n+ # $ conda deactivate\n+ \"\"\") % env_name_or_prefix\n+ elif on_win:\n+ message = dals(\"\"\"\n+ #\n+ # To activate this environment, use:\n+ # > activate %s\n+ #\n+ # To deactivate an active environment, use:\n+ # > deactivate\n+ #\n+ # * for power-users using bash, you must source\n+ #\n+ \"\"\") % env_name_or_prefix\n+ else:\n+ message = dals(\"\"\"\n+ #\n+ # To activate this environment, use:\n+ # > source activate %s\n+ #\n+ # To deactivate an active environment, use:\n+ # > source deactivate\n+ #\n+ \"\"\") % env_name_or_prefix\n print(message) # TODO: use logger\n", "issue": "conda 4.4 incorrect activate message\nI updated to conda 4.4.0b2 from conda v4.3. I created a new environment and it recommended the following - which I ran and got an error:\r\n\r\nTo activate this environment, use\r\n\r\n $ conda activate anaconda2\r\n\r\nTo deactivate an active environment, use\r\n\r\n $ conda deactivate\r\n\r\n0288-csoja:scripts csoja$ conda activate anaconda2\r\nTraceback (most recent call last):\r\n File \"/Users/csoja/Desktop/anaconda/lib/python3.5/site-packages/conda/cli/main.py\", line 176, in main\r\n raise CommandNotFoundError(argv1)\r\nconda.exceptions.CommandNotFoundError: 'activate is not a conda command.\r\nDid you mean 'source activate'?\r\n\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Users/csoja/Desktop/anaconda/bin/conda\", line 6, in <module>\r\n sys.exit(conda.cli.main())\r\n File \"/Users/csoja/Desktop/anaconda/lib/python3.5/site-packages/conda/cli/main.py\", line 178, in main\r\n from ..exceptions import handle_exception\r\nImportError: cannot import name 'handle_exception'\n", "before_files": [{"content": "# (c) Continuum Analytics, Inc. / http://continuum.io\n# All Rights Reserved\n#\n# conda is distributed under the terms of the BSD 3-clause license.\n# Consult LICENSE.txt or http://opensource.org/licenses/BSD-3-Clause.\n\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom logging import getLogger\nimport os\nfrom os.path import abspath, basename, exists, isdir\n\nfrom conda.models.match_spec import MatchSpec\nfrom . 
import common\nfrom .common import check_non_admin\nfrom .._vendor.auxlib.ish import dals\nfrom ..base.constants import ROOT_ENV_NAME\nfrom ..base.context import context\nfrom ..common.compat import text_type\nfrom ..core.envs_manager import EnvsDirectory\nfrom ..core.index import calculate_channel_urls, get_index\nfrom ..core.solve import Solver\nfrom ..exceptions import (CondaImportError, CondaOSError, CondaSystemExit, CondaValueError,\n DirectoryNotFoundError, DryRunExit, EnvironmentLocationNotFound,\n PackagesNotFoundError, TooManyArgumentsError,\n UnsatisfiableError)\nfrom ..misc import append_env, clone_env, explicit, touch_nonadmin\nfrom ..plan import (revert_actions)\nfrom ..resolve import ResolvePackageNotFound\n\nlog = getLogger(__name__)\nstderrlog = getLogger('conda.stderr')\n\n\ndef check_prefix(prefix, json=False):\n name = basename(prefix)\n error = None\n if name == ROOT_ENV_NAME:\n error = \"'%s' is a reserved environment name\" % name\n if exists(prefix):\n if isdir(prefix) and 'conda-meta' not in os.listdir(prefix):\n return None\n error = \"prefix already exists: %s\" % prefix\n\n if error:\n raise CondaValueError(error, json)\n\n if ' ' in prefix:\n stderrlog.warn(\"WARNING: A space was detected in your requested environment path\\n\"\n \"'%s'\\n\"\n \"Spaces in paths can sometimes be problematic.\" % prefix)\n\n\ndef clone(src_arg, dst_prefix, json=False, quiet=False, index_args=None):\n if os.sep in src_arg:\n src_prefix = abspath(src_arg)\n if not isdir(src_prefix):\n raise DirectoryNotFoundError(src_arg)\n else:\n assert context._argparse_args.clone is not None\n src_prefix = EnvsDirectory.locate_prefix_by_name(context._argparse_args.clone)\n\n if not json:\n print(\"Source: %s\" % src_prefix)\n print(\"Destination: %s\" % dst_prefix)\n\n actions, untracked_files = clone_env(src_prefix, dst_prefix,\n verbose=not json,\n quiet=quiet,\n index_args=index_args)\n\n if json:\n common.stdout_json_success(\n actions=actions,\n untracked_files=list(untracked_files),\n src_prefix=src_prefix,\n dst_prefix=dst_prefix\n )\n\n\ndef print_activate(env_name_or_prefix): # pragma: no cover\n if not context.quiet and not context.json:\n message = dals(\"\"\"\n\n To activate this environment, use\n\n $ conda activate %s\n\n To deactivate an active environment, use\n\n $ conda deactivate\n \"\"\") % env_name_or_prefix\n print(message) # TODO: use logger\n\n\ndef get_revision(arg, json=False):\n try:\n return int(arg)\n except ValueError:\n CondaValueError(\"expected revision number, not: '%s'\" % arg, json)\n\n\ndef install(args, parser, command='install'):\n \"\"\"\n conda install, conda update, and conda create\n \"\"\"\n context.validate_configuration()\n check_non_admin()\n\n newenv = bool(command == 'create')\n isupdate = bool(command == 'update')\n isinstall = bool(command == 'install')\n if newenv:\n common.ensure_name_or_prefix(args, command)\n prefix = context.target_prefix\n if newenv:\n check_prefix(prefix, json=context.json)\n if context.force_32bit and prefix == context.root_prefix:\n raise CondaValueError(\"cannot use CONDA_FORCE_32BIT=1 in root env\")\n if isupdate and not (args.file or args.all or args.packages):\n raise CondaValueError(\"\"\"no package names supplied\n# If you want to update to a newer version of Anaconda, type:\n#\n# $ conda update --prefix %s anaconda\n\"\"\" % prefix)\n\n args_packages = [s.strip('\"\\'') for s in args.packages]\n if newenv and not args.no_default_packages:\n # Override defaults if they are specified at the command line\n # TODO: 
rework in 4.4 branch using MatchSpec\n args_packages_names = [pkg.replace(' ', '=').split('=', 1)[0] for pkg in args_packages]\n for default_pkg in context.create_default_packages:\n default_pkg_name = default_pkg.replace(' ', '=').split('=', 1)[0]\n if default_pkg_name not in args_packages_names:\n args_packages.append(default_pkg)\n args_packages.extend(text_type(MatchSpec(provides_features=ft)) for ft in args.features or ())\n\n index_args = {\n 'use_cache': args.use_index_cache,\n 'channel_urls': context.channels,\n 'unknown': args.unknown,\n 'prepend': not args.override_channels,\n 'use_local': args.use_local\n }\n\n num_cp = sum(s.endswith('.tar.bz2') for s in args_packages)\n if num_cp:\n if num_cp == len(args_packages):\n explicit(args_packages, prefix, verbose=not context.quiet)\n return\n else:\n raise CondaValueError(\"cannot mix specifications with conda package\"\n \" filenames\")\n\n specs = []\n if args.file:\n for fpath in args.file:\n specs.extend(common.specs_from_url(fpath, json=context.json))\n if '@EXPLICIT' in specs:\n explicit(specs, prefix, verbose=not context.quiet, index_args=index_args)\n return\n specs.extend(common.specs_from_args(args_packages, json=context.json))\n\n if isinstall and args.revision:\n get_revision(args.revision, json=context.json)\n elif isinstall and not (args.file or args_packages):\n raise CondaValueError(\"too few arguments, \"\n \"must supply command line package specs or --file\")\n\n if newenv and args.clone:\n if args.packages:\n raise TooManyArgumentsError(0, len(args.packages), list(args.packages),\n 'did not expect any arguments for --clone')\n\n clone(args.clone, prefix, json=context.json, quiet=context.quiet, index_args=index_args)\n append_env(prefix)\n touch_nonadmin(prefix)\n print_activate(args.name if args.name else prefix)\n return\n\n if not isdir(prefix) and not newenv:\n if args.mkdir:\n try:\n os.makedirs(prefix)\n except OSError:\n raise CondaOSError(\"Error: could not create directory: %s\" % prefix)\n else:\n raise EnvironmentLocationNotFound(prefix)\n\n try:\n if isinstall and args.revision:\n index = get_index(channel_urls=index_args['channel_urls'],\n prepend=index_args['prepend'], platform=None,\n use_local=index_args['use_local'], use_cache=index_args['use_cache'],\n unknown=index_args['unknown'], prefix=prefix)\n unlink_link_transaction = revert_actions(prefix, get_revision(args.revision), index)\n progressive_fetch_extract = unlink_link_transaction.get_pfe()\n else:\n solver = Solver(prefix, context.channels, context.subdirs, specs_to_add=specs)\n unlink_link_transaction = solver.solve_for_transaction(\n force_reinstall=context.force,\n )\n progressive_fetch_extract = unlink_link_transaction.get_pfe()\n\n except ResolvePackageNotFound as e:\n channels_urls = tuple(calculate_channel_urls(\n channel_urls=index_args['channel_urls'],\n prepend=index_args['prepend'],\n platform=None,\n use_local=index_args['use_local'],\n ))\n raise PackagesNotFoundError(e.bad_deps, channels_urls)\n\n except (UnsatisfiableError, SystemExit) as e:\n # Unsatisfiable package specifications/no such revision/import error\n if e.args and 'could not import' in e.args[0]:\n raise CondaImportError(text_type(e))\n raise\n\n handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv)\n\n\ndef handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv,\n remove_op=False):\n if unlink_link_transaction.nothing_to_do:\n if remove_op:\n # No packages found to remove from environment\n raise 
PackagesNotFoundError(args.package_names)\n elif not newenv:\n if context.json:\n common.stdout_json_success(message='All requested packages already installed.')\n else:\n print('\\n# All requested packages already installed.\\n')\n return\n\n if not context.json:\n unlink_link_transaction.display_actions(progressive_fetch_extract)\n common.confirm_yn()\n\n elif context.dry_run:\n common.stdout_json_success(unlink_link_transaction=unlink_link_transaction, prefix=prefix,\n dry_run=True)\n raise DryRunExit()\n\n try:\n progressive_fetch_extract.execute()\n unlink_link_transaction.execute()\n\n except SystemExit as e:\n raise CondaSystemExit('Exiting', e)\n\n if newenv:\n append_env(prefix)\n touch_nonadmin(prefix)\n print_activate(args.name if args.name else prefix)\n\n if context.json:\n actions = unlink_link_transaction.make_legacy_action_groups(progressive_fetch_extract)[0]\n common.stdout_json_success(actions=actions)\n", "path": "conda/cli/install.py"}], "after_files": [{"content": "# (c) Continuum Analytics, Inc. / http://continuum.io\n# All Rights Reserved\n#\n# conda is distributed under the terms of the BSD 3-clause license.\n# Consult LICENSE.txt or http://opensource.org/licenses/BSD-3-Clause.\n\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom logging import getLogger\nimport os\nfrom os.path import abspath, basename, exists, isdir\n\nfrom conda.models.match_spec import MatchSpec\nfrom . import common\nfrom .common import check_non_admin\nfrom .._vendor.auxlib.ish import dals\nfrom ..base.constants import ROOT_ENV_NAME\nfrom ..base.context import context\nfrom ..common.compat import text_type, on_win\nfrom ..core.envs_manager import EnvsDirectory\nfrom ..core.index import calculate_channel_urls, get_index\nfrom ..core.solve import Solver\nfrom ..exceptions import (CondaImportError, CondaOSError, CondaSystemExit, CondaValueError,\n DirectoryNotFoundError, DryRunExit, EnvironmentLocationNotFound,\n PackagesNotFoundError, TooManyArgumentsError,\n UnsatisfiableError)\nfrom ..misc import append_env, clone_env, explicit, touch_nonadmin\nfrom ..plan import (revert_actions)\nfrom ..resolve import ResolvePackageNotFound\n\nlog = getLogger(__name__)\nstderrlog = getLogger('conda.stderr')\n\n\ndef check_prefix(prefix, json=False):\n name = basename(prefix)\n error = None\n if name == ROOT_ENV_NAME:\n error = \"'%s' is a reserved environment name\" % name\n if exists(prefix):\n if isdir(prefix) and 'conda-meta' not in os.listdir(prefix):\n return None\n error = \"prefix already exists: %s\" % prefix\n\n if error:\n raise CondaValueError(error, json)\n\n if ' ' in prefix:\n stderrlog.warn(\"WARNING: A space was detected in your requested environment path\\n\"\n \"'%s'\\n\"\n \"Spaces in paths can sometimes be problematic.\" % prefix)\n\n\ndef clone(src_arg, dst_prefix, json=False, quiet=False, index_args=None):\n if os.sep in src_arg:\n src_prefix = abspath(src_arg)\n if not isdir(src_prefix):\n raise DirectoryNotFoundError(src_arg)\n else:\n assert context._argparse_args.clone is not None\n src_prefix = EnvsDirectory.locate_prefix_by_name(context._argparse_args.clone)\n\n if not json:\n print(\"Source: %s\" % src_prefix)\n print(\"Destination: %s\" % dst_prefix)\n\n actions, untracked_files = clone_env(src_prefix, dst_prefix,\n verbose=not json,\n quiet=quiet,\n index_args=index_args)\n\n if json:\n common.stdout_json_success(\n actions=actions,\n untracked_files=list(untracked_files),\n src_prefix=src_prefix,\n dst_prefix=dst_prefix\n )\n\n\ndef 
print_activate(env_name_or_prefix): # pragma: no cover\n if not context.quiet and not context.json:\n if 'CONDA_SHLVL' in os.environ or os.path.split(os.environ.get('SHELL', ''))[-1] == 'fish':\n message = dals(\"\"\"\n #\n # To activate this environment, use\n #\n # $ conda activate %s\n #\n # To deactivate an active environment, use\n #\n # $ conda deactivate\n \"\"\") % env_name_or_prefix\n elif on_win:\n message = dals(\"\"\"\n #\n # To activate this environment, use:\n # > activate %s\n #\n # To deactivate an active environment, use:\n # > deactivate\n #\n # * for power-users using bash, you must source\n #\n \"\"\") % env_name_or_prefix\n else:\n message = dals(\"\"\"\n #\n # To activate this environment, use:\n # > source activate %s\n #\n # To deactivate an active environment, use:\n # > source deactivate\n #\n \"\"\") % env_name_or_prefix\n print(message) # TODO: use logger\n\n\ndef get_revision(arg, json=False):\n try:\n return int(arg)\n except ValueError:\n CondaValueError(\"expected revision number, not: '%s'\" % arg, json)\n\n\ndef install(args, parser, command='install'):\n \"\"\"\n conda install, conda update, and conda create\n \"\"\"\n context.validate_configuration()\n check_non_admin()\n\n newenv = bool(command == 'create')\n isupdate = bool(command == 'update')\n isinstall = bool(command == 'install')\n if newenv:\n common.ensure_name_or_prefix(args, command)\n prefix = context.target_prefix\n if newenv:\n check_prefix(prefix, json=context.json)\n if context.force_32bit and prefix == context.root_prefix:\n raise CondaValueError(\"cannot use CONDA_FORCE_32BIT=1 in root env\")\n if isupdate and not (args.file or args.all or args.packages):\n raise CondaValueError(\"\"\"no package names supplied\n# If you want to update to a newer version of Anaconda, type:\n#\n# $ conda update --prefix %s anaconda\n\"\"\" % prefix)\n\n args_packages = [s.strip('\"\\'') for s in args.packages]\n if newenv and not args.no_default_packages:\n # Override defaults if they are specified at the command line\n # TODO: rework in 4.4 branch using MatchSpec\n args_packages_names = [pkg.replace(' ', '=').split('=', 1)[0] for pkg in args_packages]\n for default_pkg in context.create_default_packages:\n default_pkg_name = default_pkg.replace(' ', '=').split('=', 1)[0]\n if default_pkg_name not in args_packages_names:\n args_packages.append(default_pkg)\n args_packages.extend(text_type(MatchSpec(provides_features=ft)) for ft in args.features or ())\n\n index_args = {\n 'use_cache': args.use_index_cache,\n 'channel_urls': context.channels,\n 'unknown': args.unknown,\n 'prepend': not args.override_channels,\n 'use_local': args.use_local\n }\n\n num_cp = sum(s.endswith('.tar.bz2') for s in args_packages)\n if num_cp:\n if num_cp == len(args_packages):\n explicit(args_packages, prefix, verbose=not context.quiet)\n return\n else:\n raise CondaValueError(\"cannot mix specifications with conda package\"\n \" filenames\")\n\n specs = []\n if args.file:\n for fpath in args.file:\n specs.extend(common.specs_from_url(fpath, json=context.json))\n if '@EXPLICIT' in specs:\n explicit(specs, prefix, verbose=not context.quiet, index_args=index_args)\n return\n specs.extend(common.specs_from_args(args_packages, json=context.json))\n\n if isinstall and args.revision:\n get_revision(args.revision, json=context.json)\n elif isinstall and not (args.file or args_packages):\n raise CondaValueError(\"too few arguments, \"\n \"must supply command line package specs or --file\")\n\n if newenv and args.clone:\n if 
args.packages:\n raise TooManyArgumentsError(0, len(args.packages), list(args.packages),\n 'did not expect any arguments for --clone')\n\n clone(args.clone, prefix, json=context.json, quiet=context.quiet, index_args=index_args)\n append_env(prefix)\n touch_nonadmin(prefix)\n print_activate(args.name if args.name else prefix)\n return\n\n if not isdir(prefix) and not newenv:\n if args.mkdir:\n try:\n os.makedirs(prefix)\n except OSError:\n raise CondaOSError(\"Error: could not create directory: %s\" % prefix)\n else:\n raise EnvironmentLocationNotFound(prefix)\n\n try:\n if isinstall and args.revision:\n index = get_index(channel_urls=index_args['channel_urls'],\n prepend=index_args['prepend'], platform=None,\n use_local=index_args['use_local'], use_cache=index_args['use_cache'],\n unknown=index_args['unknown'], prefix=prefix)\n unlink_link_transaction = revert_actions(prefix, get_revision(args.revision), index)\n progressive_fetch_extract = unlink_link_transaction.get_pfe()\n else:\n solver = Solver(prefix, context.channels, context.subdirs, specs_to_add=specs)\n unlink_link_transaction = solver.solve_for_transaction(\n force_reinstall=context.force,\n )\n progressive_fetch_extract = unlink_link_transaction.get_pfe()\n\n except ResolvePackageNotFound as e:\n channels_urls = tuple(calculate_channel_urls(\n channel_urls=index_args['channel_urls'],\n prepend=index_args['prepend'],\n platform=None,\n use_local=index_args['use_local'],\n ))\n raise PackagesNotFoundError(e.bad_deps, channels_urls)\n\n except (UnsatisfiableError, SystemExit) as e:\n # Unsatisfiable package specifications/no such revision/import error\n if e.args and 'could not import' in e.args[0]:\n raise CondaImportError(text_type(e))\n raise\n\n handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv)\n\n\ndef handle_txn(progressive_fetch_extract, unlink_link_transaction, prefix, args, newenv,\n remove_op=False):\n if unlink_link_transaction.nothing_to_do:\n if remove_op:\n # No packages found to remove from environment\n raise PackagesNotFoundError(args.package_names)\n elif not newenv:\n if context.json:\n common.stdout_json_success(message='All requested packages already installed.')\n else:\n print('\\n# All requested packages already installed.\\n')\n return\n\n if not context.json:\n unlink_link_transaction.display_actions(progressive_fetch_extract)\n common.confirm_yn()\n\n elif context.dry_run:\n common.stdout_json_success(unlink_link_transaction=unlink_link_transaction, prefix=prefix,\n dry_run=True)\n raise DryRunExit()\n\n try:\n progressive_fetch_extract.execute()\n unlink_link_transaction.execute()\n\n except SystemExit as e:\n raise CondaSystemExit('Exiting', e)\n\n if newenv:\n append_env(prefix)\n touch_nonadmin(prefix)\n print_activate(args.name if args.name else prefix)\n\n if context.json:\n actions = unlink_link_transaction.make_legacy_action_groups(progressive_fetch_extract)[0]\n common.stdout_json_success(actions=actions)\n", "path": "conda/cli/install.py"}]} | 3,270 | 485 |
gh_patches_debug_5110 | rasdani/github-patches | git_diff | mindsdb__mindsdb-177 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AttributeError: 'PredictTransactionOutputData' object has no attribute 'predicted_values'
**Describe the bug**
After running predict.py in the example mindsdb/docs/examples/time_series/ I got the following AttributeError:
```
Traceback (most recent call last):
File "predict.py", line 12, in <module>
print(result.predicted_values)
AttributeError: 'PredictTransactionOutputData' object has no attribute 'predicted_values'
```
**To Reproduce**
Steps to reproduce the behavior:
1. First run train.py, with python3 train.py
2. When training is finished, run predict.py with python3 predict.py
3. See error
**Expected behavior**
I expected to see the predicted values.
**Desktop (please complete the following information):**
- OS: Ubuntu 18.04.2 LTS
- mindsdb 1.0.5
- pip 19.1
- python 3.6.7
- virtualenv 15.1.0
- urllib3 1.24
**Additional context**
Before the Traceback I got the following warning many times:
```
WARNING:mindsdb-logger-core-logger:libs/backends/ludwig.py:141 - ('Missing previous predicted values for output column: '
'Main_Engine_Fuel_Consumption_MT_day, these should be included in your input '
'under the name: previous_Main_Engine_Fuel_Consumption_MT_day')
```
Finally, I've installed mindsdb using pip3 inside a virtualenvironment.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/examples/time_series/predict.py`
Content:
```
1 """
2
3 """
4
5 from mindsdb import Predictor
6
7 # Here we use the model to make predictions (NOTE: You need to run train.py first)
8 result = Predictor(name='fuel').predict(when_data = 'fuel_predict.csv')
9
10 # you can now print the results
11 print('The predicted main engine fuel consumption')
12 print(result.predicted_values)
```
Path: `docs/examples/nlp/predict.py`
Content:
```
1 from mindsdb import *
2
3 mdb = Predictor(name='real_estate_desc')
4
5 # Here we use the model to make predictions (NOTE: You need to run train.py first)
6 result = mdb.predict(
7 when={
8 "description": """A true gem
9 rooms: 2
10 bathrooms: 0
11 neighboorhood: thowsand_oaks
12 amenities: parking
13 area: 84.0291068642868
14 condition: great !
15 """
16 }
17 )
18
19 # you can now print the results
20 print('The predicted number of rooms')
21 print(result.predicted_values)
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/examples/nlp/predict.py b/docs/examples/nlp/predict.py
--- a/docs/examples/nlp/predict.py
+++ b/docs/examples/nlp/predict.py
@@ -18,4 +18,4 @@
# you can now print the results
print('The predicted number of rooms')
-print(result.predicted_values)
+print(result)
diff --git a/docs/examples/time_series/predict.py b/docs/examples/time_series/predict.py
--- a/docs/examples/time_series/predict.py
+++ b/docs/examples/time_series/predict.py
@@ -9,4 +9,5 @@
# you can now print the results
print('The predicted main engine fuel consumption')
-print(result.predicted_values)
\ No newline at end of file
+for row in result:
+ print(row)
| {"golden_diff": "diff --git a/docs/examples/nlp/predict.py b/docs/examples/nlp/predict.py\n--- a/docs/examples/nlp/predict.py\n+++ b/docs/examples/nlp/predict.py\n@@ -18,4 +18,4 @@\n \n # you can now print the results\n print('The predicted number of rooms')\n-print(result.predicted_values)\n+print(result)\ndiff --git a/docs/examples/time_series/predict.py b/docs/examples/time_series/predict.py\n--- a/docs/examples/time_series/predict.py\n+++ b/docs/examples/time_series/predict.py\n@@ -9,4 +9,5 @@\n \n # you can now print the results\n print('The predicted main engine fuel consumption')\n-print(result.predicted_values)\n\\ No newline at end of file\n+for row in result:\n+ print(row)\n", "issue": "AttributeError: 'PredictTransactionOutputData' object has no attribute 'predicted_values'\n**Describe the bug**\r\nAfter running predict.py in the example mindsdb/docs/examples/time_series/ I got the following AttributeError:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"predict.py\", line 12, in <module>\r\n print(result.predicted_values)\r\nAttributeError: 'PredictTransactionOutputData' object has no attribute 'predicted_values'\r\n```\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. First run train.py, with python3 train.py\r\n2. When training is finished, run predict.py with python3 predict.py\r\n3. See error\r\n\r\n**Expected behavior**\r\nI expected to see the predicted values.\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Ubuntu 18.04.2 LTS\r\n- mindsdb 1.0.5\r\n- pip 19.1\r\n- python 3.6.7\r\n- virtualenv 15.1.0\r\n- urllib3 1.24\r\n\r\n**Additional context**\r\nBefore the Traceback I got the following warning many times:\r\n\r\n```\r\nWARNING:mindsdb-logger-core-logger:libs/backends/ludwig.py:141 - ('Missing previous predicted values for output column: '\r\n 'Main_Engine_Fuel_Consumption_MT_day, these should be included in your input '\r\n 'under the name: previous_Main_Engine_Fuel_Consumption_MT_day')\r\n```\r\nFinally, I've installed mindsdb using pip3 inside a virtualenvironment.\r\n\n", "before_files": [{"content": "\"\"\"\n\n\"\"\"\n\nfrom mindsdb import Predictor\n\n# Here we use the model to make predictions (NOTE: You need to run train.py first)\nresult = Predictor(name='fuel').predict(when_data = 'fuel_predict.csv')\n\n# you can now print the results\nprint('The predicted main engine fuel consumption')\nprint(result.predicted_values)", "path": "docs/examples/time_series/predict.py"}, {"content": "from mindsdb import *\n\nmdb = Predictor(name='real_estate_desc')\n\n# Here we use the model to make predictions (NOTE: You need to run train.py first)\nresult = mdb.predict(\n when={\n \"description\": \"\"\"A true gem\n rooms: 2\n bathrooms: 0\n neighboorhood: thowsand_oaks\n amenities: parking\n area: 84.0291068642868\n condition: great !\n \"\"\"\n }\n)\n\n# you can now print the results\nprint('The predicted number of rooms')\nprint(result.predicted_values)\n", "path": "docs/examples/nlp/predict.py"}], "after_files": [{"content": "\"\"\"\n\n\"\"\"\n\nfrom mindsdb import Predictor\n\n# Here we use the model to make predictions (NOTE: You need to run train.py first)\nresult = Predictor(name='fuel').predict(when_data = 'fuel_predict.csv')\n\n# you can now print the results\nprint('The predicted main engine fuel consumption')\nfor row in result:\n print(row)\n", "path": "docs/examples/time_series/predict.py"}, {"content": "from mindsdb import *\n\nmdb = Predictor(name='real_estate_desc')\n\n# Here we use the model to make 
predictions (NOTE: You need to run train.py first)\nresult = mdb.predict(\n when={\n \"description\": \"\"\"A true gem\n rooms: 2\n bathrooms: 0\n neighboorhood: thowsand_oaks\n amenities: parking\n area: 84.0291068642868\n condition: great !\n \"\"\"\n }\n)\n\n# you can now print the results\nprint('The predicted number of rooms')\nprint(result)\n", "path": "docs/examples/nlp/predict.py"}]} | 864 | 172 |
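
For reference, the behaviour the golden diff above settles on can be summarised as a minimal sketch. This is not part of the dataset record; it simply mirrors the patched `docs/examples/time_series/predict.py`, and it assumes the environment described in the issue (mindsdb 1.x installed, `train.py` already run so the `'fuel'` predictor exists, and `fuel_predict.csv` present in the working directory).

```python
# Minimal sketch mirroring the patched docs/examples/time_series/predict.py.
# Assumes mindsdb 1.x is installed, train.py has already created the 'fuel'
# predictor, and fuel_predict.csv is in the current working directory.
from mindsdb import Predictor

result = Predictor(name='fuel').predict(when_data='fuel_predict.csv')

# In mindsdb 1.x the returned object has no `predicted_values` attribute
# (the AttributeError from the issue); it is iterable, so each predicted
# row is printed directly instead.
print('The predicted main engine fuel consumption')
for row in result:
    print(row)
```

The NLP example in the same record follows the simpler variant from the diff: printing `result` directly rather than iterating.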
gh_patches_debug_24483 | rasdani/github-patches | git_diff | sublimelsp__LSP-1335 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Triggers completion command on the wrong session
When having both LSP-css and LSP-stylelint running for open file (*.css), the completions are requested from LSP-css but the `command` that the completion item triggers runs on `LSP-stylelint` triggering an error dialog.
An example completion item looks like this:
```
{
"command": {
"command": "editor.action.triggerSuggest",
"title": "Suggest"
},
"documentation": {
"kind": "plaintext",
"value": "Specifies the width of the content area, padding area or border area (depending on 'box-sizing') of certain boxes.\n\nSyntax: <viewport-length>{1,2}\n\nMDN Reference: https://developer.mozilla.org/docs/Web/CSS/width"
},
"insertTextFormat": 2,
"kind": 10,
"label": "width",
"sortText": "d_0000",
"tags": [],
"textEdit": {
"newText": "width: $0;",
"range": {
"end": {
"character": 5,
"line": 4
},
"start": {
"character": 4,
"line": 4
}
}
}
},
```
It has a `command` that runs after inserting completion.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugin/completion.py`
Content:
```
1 import mdpopups
2 import sublime
3 import sublime_plugin
4 import webbrowser
5 from .core.css import css
6 from .core.logging import debug
7 from .core.edit import parse_text_edit
8 from .core.protocol import Request, InsertTextFormat, Range
9 from .core.registry import LspTextCommand
10 from .core.typing import Any, List, Dict, Optional, Generator, Union
11 from .core.views import FORMAT_STRING, FORMAT_MARKUP_CONTENT, minihtml
12 from .core.views import range_to_region
13
14
15 class LspResolveDocsCommand(LspTextCommand):
16
17 completions = [] # type: List[Dict[str, Any]]
18
19 def run(self, edit: sublime.Edit, index: int, event: Optional[dict] = None) -> None:
20 item = self.completions[index]
21 detail = self.format_documentation(item.get('detail') or "")
22 documentation = self.format_documentation(item.get("documentation") or "")
23 # don't show the detail in the cooperate AC popup if it is already shown in the AC details filed.
24 self.is_detail_shown = bool(detail)
25 if not detail or not documentation:
26 # To make sure that the detail or documentation fields doesn't exist we need to resove the completion item.
27 # If those fields appear after the item is resolved we show them in the popup.
28 session = self.best_session('completionProvider.resolveProvider')
29 if session:
30 session.send_request(Request.resolveCompletionItem(item), self.handle_resolve_response)
31 return
32 minihtml_content = self.get_content(documentation, detail)
33 self.show_popup(minihtml_content)
34
35 def format_documentation(self, content: Union[str, Dict[str, str]]) -> str:
36 return minihtml(self.view, content, allowed_formats=FORMAT_STRING | FORMAT_MARKUP_CONTENT)
37
38 def get_content(self, documentation: str, detail: str) -> str:
39 content = ""
40 if detail and not self.is_detail_shown:
41 content += "<div class='highlight'>{}</div>".format(detail)
42 if documentation:
43 content += "<div>{}</div>".format(documentation)
44 return content
45
46 def show_popup(self, minihtml_content: str) -> None:
47 viewport_width = self.view.viewport_extent()[0]
48 mdpopups.show_popup(
49 self.view,
50 minihtml_content,
51 flags=sublime.COOPERATE_WITH_AUTO_COMPLETE,
52 css=css().popups,
53 wrapper_class=css().popups_classname,
54 max_width=viewport_width,
55 on_navigate=self.on_navigate
56 )
57
58 def on_navigate(self, url: str) -> None:
59 webbrowser.open(url)
60
61 def handle_resolve_response(self, item: Optional[dict]) -> None:
62 detail = ""
63 documentation = ""
64 if item:
65 detail = self.format_documentation(item.get('detail') or "")
66 documentation = self.format_documentation(item.get("documentation") or "")
67 if not documentation:
68 documentation = self.format_documentation({"kind": "markdown", "value": "*No documentation available.*"})
69 minihtml_content = self.get_content(documentation, detail)
70 show = self.update_popup if self.view.is_popup_visible() else self.show_popup
71 # NOTE: Update/show popups from the main thread, or else the popup might make the AC widget disappear.
72 sublime.set_timeout(lambda: show(minihtml_content))
73
74 def update_popup(self, minihtml_content: str) -> None:
75 mdpopups.update_popup(
76 self.view,
77 minihtml_content,
78 css=css().popups,
79 wrapper_class=css().popups_classname,
80 )
81
82
83 class LspCompleteCommand(sublime_plugin.TextCommand):
84
85 def epilogue(self, item: Dict[str, Any]) -> None:
86 additional_edits = item.get('additionalTextEdits')
87 if additional_edits:
88 edits = [parse_text_edit(additional_edit) for additional_edit in additional_edits]
89 self.view.run_command("lsp_apply_document_edit", {'changes': edits})
90 command = item.get("command")
91 if command:
92 debug('Running server command "{}" for view {}'.format(command, self.view.id()))
93 self.view.run_command("lsp_execute", {"command_name": command})
94
95
96 class LspCompleteInsertTextCommand(LspCompleteCommand):
97
98 def run(self, edit: sublime.Edit, **item: Any) -> None:
99 insert_text = item.get("insertText") or item["label"]
100 if item.get("insertTextFormat", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:
101 self.view.run_command("insert_snippet", {"contents": insert_text})
102 else:
103 self.view.run_command("insert", {"characters": insert_text})
104 self.epilogue(item)
105
106
107 class LspCompleteTextEditCommand(LspCompleteCommand):
108
109 def run(self, edit: sublime.Edit, **item: Any) -> None:
110 text_edit = item["textEdit"]
111 new_text = text_edit['newText']
112 edit_region = range_to_region(Range.from_lsp(text_edit['range']), self.view)
113 if item.get("insertTextFormat", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:
114 for region in self.translated_regions(edit_region):
115 self.view.erase(edit, region)
116 self.view.run_command("insert_snippet", {"contents": new_text})
117 else:
118 for region in self.translated_regions(edit_region):
119 # NOTE: Cannot do .replace, because ST will select the replacement.
120 self.view.erase(edit, region)
121 self.view.insert(edit, region.a, new_text)
122 self.epilogue(item)
123
124 def translated_regions(self, edit_region: sublime.Region) -> Generator[sublime.Region, None, None]:
125 selection = self.view.sel()
126 primary_cursor_position = selection[0].b
127 for region in reversed(selection):
128 # For each selection region, apply the same removal as for the "primary" region.
129 # To do that, translate, or offset, the LSP edit region into the non-"primary" regions.
130 # The concept of "primary" is our own, and there is no mention of it in the LSP spec.
131 translation = region.b - primary_cursor_position
132 translated_edit_region = sublime.Region(edit_region.a + translation, edit_region.b + translation)
133 yield translated_edit_region
134
135
136 def resolve(completion_list: sublime.CompletionList, items: List[sublime.CompletionItem], flags: int = 0) -> None:
137 # Resolve the promise on the main thread to prevent any sort of data race for _set_target (see sublime_plugin.py).
138 sublime.set_timeout(lambda: completion_list.set_completions(items, flags))
139
```
Path: `plugin/execute_command.py`
Content:
```
1 import sublime
2 from .core.protocol import Request
3 from .core.registry import LspTextCommand
4 from .core.sessions import Session
5 from .core.typing import List, Optional, Dict, Any
6 from .core.views import uri_from_view, offset_to_point, region_to_range
7
8
9 class LspExecuteCommand(LspTextCommand):
10
11 capability = 'executeCommandProvider'
12
13 def run(self,
14 edit: sublime.Edit,
15 command_name: Optional[str] = None,
16 command_args: Optional[List[Any]] = None,
17 event: Optional[dict] = None) -> None:
18 session = self.best_session(self.capability)
19 if session and command_name:
20 window = self.view.window()
21 if window:
22 window.status_message("Running command {}".format(command_name))
23 if command_args:
24 self._expand_variables(command_args)
25 self._send_command(session, command_name, command_args)
26
27 def _expand_variables(self, command_args: List[Any]) -> None:
28 region = self.view.sel()[0]
29 for i, arg in enumerate(command_args):
30 if arg in ["$file_uri", "${file_uri}"]:
31 command_args[i] = uri_from_view(self.view)
32 elif arg in ["$selection", "${selection}"]:
33 command_args[i] = self.view.substr(region)
34 elif arg in ["$offset", "${offset}"]:
35 command_args[i] = region.b
36 elif arg in ["$selection_begin", "${selection_begin}"]:
37 command_args[i] = region.begin()
38 elif arg in ["$selection_end", "${selection_end}"]:
39 command_args[i] = region.end()
40 elif arg in ["$position", "${position}"]:
41 command_args[i] = offset_to_point(self.view, region.b).to_lsp()
42 elif arg in ["$range", "${range}"]:
43 command_args[i] = region_to_range(self.view, region).to_lsp()
44
45 def _handle_response(self, command: str, response: Optional[Any]) -> None:
46 msg = "command {} completed".format(command)
47 if response:
48 msg += "with response: {}".format(response)
49
50 window = self.view.window()
51 if window:
52 window.status_message(msg)
53
54 def _handle_error(self, command: str, error: Dict[str, Any]) -> None:
55 msg = "command {} failed. Reason: {}".format(command, error.get("message", "none provided by server :("))
56 sublime.message_dialog(msg)
57
58 def _send_command(self, session: Session, command_name: str, command_args: Optional[List[Any]]) -> None:
59 request = {"command": command_name, "arguments": command_args} if command_args else {"command": command_name}
60 session.send_request(Request.executeCommand(request),
61 lambda reponse: self._handle_response(command_name, reponse),
62 lambda error: self._handle_error(command_name, error))
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugin/completion.py b/plugin/completion.py
--- a/plugin/completion.py
+++ b/plugin/completion.py
@@ -90,7 +90,8 @@
command = item.get("command")
if command:
debug('Running server command "{}" for view {}'.format(command, self.view.id()))
- self.view.run_command("lsp_execute", {"command_name": command})
+ args = {"command_name": command["command"], "command_args": command.get("arguments")}
+ self.view.run_command("lsp_execute", args)
class LspCompleteInsertTextCommand(LspCompleteCommand):
diff --git a/plugin/execute_command.py b/plugin/execute_command.py
--- a/plugin/execute_command.py
+++ b/plugin/execute_command.py
@@ -15,6 +15,11 @@
command_name: Optional[str] = None,
command_args: Optional[List[Any]] = None,
event: Optional[dict] = None) -> None:
+ # Handle VSCode-specific command for triggering suggestions popup.
+ if command_name == "editor.action.triggerSuggest":
+ # Triggered from set_timeout as suggestions popup doesn't trigger otherwise.
+ sublime.set_timeout(lambda: self.view.run_command("auto_complete"))
+ return
session = self.best_session(self.capability)
if session and command_name:
window = self.view.window()
| {"golden_diff": "diff --git a/plugin/completion.py b/plugin/completion.py\n--- a/plugin/completion.py\n+++ b/plugin/completion.py\n@@ -90,7 +90,8 @@\n command = item.get(\"command\")\n if command:\n debug('Running server command \"{}\" for view {}'.format(command, self.view.id()))\n- self.view.run_command(\"lsp_execute\", {\"command_name\": command})\n+ args = {\"command_name\": command[\"command\"], \"command_args\": command.get(\"arguments\")}\n+ self.view.run_command(\"lsp_execute\", args)\n \n \n class LspCompleteInsertTextCommand(LspCompleteCommand):\ndiff --git a/plugin/execute_command.py b/plugin/execute_command.py\n--- a/plugin/execute_command.py\n+++ b/plugin/execute_command.py\n@@ -15,6 +15,11 @@\n command_name: Optional[str] = None,\n command_args: Optional[List[Any]] = None,\n event: Optional[dict] = None) -> None:\n+ # Handle VSCode-specific command for triggering suggestions popup.\n+ if command_name == \"editor.action.triggerSuggest\":\n+ # Triggered from set_timeout as suggestions popup doesn't trigger otherwise.\n+ sublime.set_timeout(lambda: self.view.run_command(\"auto_complete\"))\n+ return\n session = self.best_session(self.capability)\n if session and command_name:\n window = self.view.window()\n", "issue": "Triggers completion command on the wrong session\nWhen having both LSP-css and LSP-stylelint running for open file (*.css), the completions are requested from LSP-css but the `command` that the completion item triggers runs on `LSP-stylelint` triggering an error dialog.\r\n\r\nAn example completion item looks like this:\r\n```\r\n {\r\n \"command\": {\r\n \"command\": \"editor.action.triggerSuggest\",\r\n \"title\": \"Suggest\"\r\n },\r\n \"documentation\": {\r\n \"kind\": \"plaintext\",\r\n \"value\": \"Specifies the width of the content area, padding area or border area (depending on 'box-sizing') of certain boxes.\\n\\nSyntax: <viewport-length>{1,2}\\n\\nMDN Reference: https://developer.mozilla.org/docs/Web/CSS/width\"\r\n },\r\n \"insertTextFormat\": 2,\r\n \"kind\": 10,\r\n \"label\": \"width\",\r\n \"sortText\": \"d_0000\",\r\n \"tags\": [],\r\n \"textEdit\": {\r\n \"newText\": \"width: $0;\",\r\n \"range\": {\r\n \"end\": {\r\n \"character\": 5,\r\n \"line\": 4\r\n },\r\n \"start\": {\r\n \"character\": 4,\r\n \"line\": 4\r\n }\r\n }\r\n }\r\n },\r\n```\r\n\r\nIt has a `command` that runs after inserting completion.\n", "before_files": [{"content": "import mdpopups\nimport sublime\nimport sublime_plugin\nimport webbrowser\nfrom .core.css import css\nfrom .core.logging import debug\nfrom .core.edit import parse_text_edit\nfrom .core.protocol import Request, InsertTextFormat, Range\nfrom .core.registry import LspTextCommand\nfrom .core.typing import Any, List, Dict, Optional, Generator, Union\nfrom .core.views import FORMAT_STRING, FORMAT_MARKUP_CONTENT, minihtml\nfrom .core.views import range_to_region\n\n\nclass LspResolveDocsCommand(LspTextCommand):\n\n completions = [] # type: List[Dict[str, Any]]\n\n def run(self, edit: sublime.Edit, index: int, event: Optional[dict] = None) -> None:\n item = self.completions[index]\n detail = self.format_documentation(item.get('detail') or \"\")\n documentation = self.format_documentation(item.get(\"documentation\") or \"\")\n # don't show the detail in the cooperate AC popup if it is already shown in the AC details filed.\n self.is_detail_shown = bool(detail)\n if not detail or not documentation:\n # To make sure that the detail or documentation fields doesn't exist we need to resove the completion item.\n # If those 
fields appear after the item is resolved we show them in the popup.\n session = self.best_session('completionProvider.resolveProvider')\n if session:\n session.send_request(Request.resolveCompletionItem(item), self.handle_resolve_response)\n return\n minihtml_content = self.get_content(documentation, detail)\n self.show_popup(minihtml_content)\n\n def format_documentation(self, content: Union[str, Dict[str, str]]) -> str:\n return minihtml(self.view, content, allowed_formats=FORMAT_STRING | FORMAT_MARKUP_CONTENT)\n\n def get_content(self, documentation: str, detail: str) -> str:\n content = \"\"\n if detail and not self.is_detail_shown:\n content += \"<div class='highlight'>{}</div>\".format(detail)\n if documentation:\n content += \"<div>{}</div>\".format(documentation)\n return content\n\n def show_popup(self, minihtml_content: str) -> None:\n viewport_width = self.view.viewport_extent()[0]\n mdpopups.show_popup(\n self.view,\n minihtml_content,\n flags=sublime.COOPERATE_WITH_AUTO_COMPLETE,\n css=css().popups,\n wrapper_class=css().popups_classname,\n max_width=viewport_width,\n on_navigate=self.on_navigate\n )\n\n def on_navigate(self, url: str) -> None:\n webbrowser.open(url)\n\n def handle_resolve_response(self, item: Optional[dict]) -> None:\n detail = \"\"\n documentation = \"\"\n if item:\n detail = self.format_documentation(item.get('detail') or \"\")\n documentation = self.format_documentation(item.get(\"documentation\") or \"\")\n if not documentation:\n documentation = self.format_documentation({\"kind\": \"markdown\", \"value\": \"*No documentation available.*\"})\n minihtml_content = self.get_content(documentation, detail)\n show = self.update_popup if self.view.is_popup_visible() else self.show_popup\n # NOTE: Update/show popups from the main thread, or else the popup might make the AC widget disappear.\n sublime.set_timeout(lambda: show(minihtml_content))\n\n def update_popup(self, minihtml_content: str) -> None:\n mdpopups.update_popup(\n self.view,\n minihtml_content,\n css=css().popups,\n wrapper_class=css().popups_classname,\n )\n\n\nclass LspCompleteCommand(sublime_plugin.TextCommand):\n\n def epilogue(self, item: Dict[str, Any]) -> None:\n additional_edits = item.get('additionalTextEdits')\n if additional_edits:\n edits = [parse_text_edit(additional_edit) for additional_edit in additional_edits]\n self.view.run_command(\"lsp_apply_document_edit\", {'changes': edits})\n command = item.get(\"command\")\n if command:\n debug('Running server command \"{}\" for view {}'.format(command, self.view.id()))\n self.view.run_command(\"lsp_execute\", {\"command_name\": command})\n\n\nclass LspCompleteInsertTextCommand(LspCompleteCommand):\n\n def run(self, edit: sublime.Edit, **item: Any) -> None:\n insert_text = item.get(\"insertText\") or item[\"label\"]\n if item.get(\"insertTextFormat\", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:\n self.view.run_command(\"insert_snippet\", {\"contents\": insert_text})\n else:\n self.view.run_command(\"insert\", {\"characters\": insert_text})\n self.epilogue(item)\n\n\nclass LspCompleteTextEditCommand(LspCompleteCommand):\n\n def run(self, edit: sublime.Edit, **item: Any) -> None:\n text_edit = item[\"textEdit\"]\n new_text = text_edit['newText']\n edit_region = range_to_region(Range.from_lsp(text_edit['range']), self.view)\n if item.get(\"insertTextFormat\", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:\n for region in self.translated_regions(edit_region):\n self.view.erase(edit, region)\n 
self.view.run_command(\"insert_snippet\", {\"contents\": new_text})\n else:\n for region in self.translated_regions(edit_region):\n # NOTE: Cannot do .replace, because ST will select the replacement.\n self.view.erase(edit, region)\n self.view.insert(edit, region.a, new_text)\n self.epilogue(item)\n\n def translated_regions(self, edit_region: sublime.Region) -> Generator[sublime.Region, None, None]:\n selection = self.view.sel()\n primary_cursor_position = selection[0].b\n for region in reversed(selection):\n # For each selection region, apply the same removal as for the \"primary\" region.\n # To do that, translate, or offset, the LSP edit region into the non-\"primary\" regions.\n # The concept of \"primary\" is our own, and there is no mention of it in the LSP spec.\n translation = region.b - primary_cursor_position\n translated_edit_region = sublime.Region(edit_region.a + translation, edit_region.b + translation)\n yield translated_edit_region\n\n\ndef resolve(completion_list: sublime.CompletionList, items: List[sublime.CompletionItem], flags: int = 0) -> None:\n # Resolve the promise on the main thread to prevent any sort of data race for _set_target (see sublime_plugin.py).\n sublime.set_timeout(lambda: completion_list.set_completions(items, flags))\n", "path": "plugin/completion.py"}, {"content": "import sublime\nfrom .core.protocol import Request\nfrom .core.registry import LspTextCommand\nfrom .core.sessions import Session\nfrom .core.typing import List, Optional, Dict, Any\nfrom .core.views import uri_from_view, offset_to_point, region_to_range\n\n\nclass LspExecuteCommand(LspTextCommand):\n\n capability = 'executeCommandProvider'\n\n def run(self,\n edit: sublime.Edit,\n command_name: Optional[str] = None,\n command_args: Optional[List[Any]] = None,\n event: Optional[dict] = None) -> None:\n session = self.best_session(self.capability)\n if session and command_name:\n window = self.view.window()\n if window:\n window.status_message(\"Running command {}\".format(command_name))\n if command_args:\n self._expand_variables(command_args)\n self._send_command(session, command_name, command_args)\n\n def _expand_variables(self, command_args: List[Any]) -> None:\n region = self.view.sel()[0]\n for i, arg in enumerate(command_args):\n if arg in [\"$file_uri\", \"${file_uri}\"]:\n command_args[i] = uri_from_view(self.view)\n elif arg in [\"$selection\", \"${selection}\"]:\n command_args[i] = self.view.substr(region)\n elif arg in [\"$offset\", \"${offset}\"]:\n command_args[i] = region.b\n elif arg in [\"$selection_begin\", \"${selection_begin}\"]:\n command_args[i] = region.begin()\n elif arg in [\"$selection_end\", \"${selection_end}\"]:\n command_args[i] = region.end()\n elif arg in [\"$position\", \"${position}\"]:\n command_args[i] = offset_to_point(self.view, region.b).to_lsp()\n elif arg in [\"$range\", \"${range}\"]:\n command_args[i] = region_to_range(self.view, region).to_lsp()\n\n def _handle_response(self, command: str, response: Optional[Any]) -> None:\n msg = \"command {} completed\".format(command)\n if response:\n msg += \"with response: {}\".format(response)\n\n window = self.view.window()\n if window:\n window.status_message(msg)\n\n def _handle_error(self, command: str, error: Dict[str, Any]) -> None:\n msg = \"command {} failed. 
Reason: {}\".format(command, error.get(\"message\", \"none provided by server :(\"))\n sublime.message_dialog(msg)\n\n def _send_command(self, session: Session, command_name: str, command_args: Optional[List[Any]]) -> None:\n request = {\"command\": command_name, \"arguments\": command_args} if command_args else {\"command\": command_name}\n session.send_request(Request.executeCommand(request),\n lambda reponse: self._handle_response(command_name, reponse),\n lambda error: self._handle_error(command_name, error))\n", "path": "plugin/execute_command.py"}], "after_files": [{"content": "import mdpopups\nimport sublime\nimport sublime_plugin\nimport webbrowser\nfrom .core.css import css\nfrom .core.logging import debug\nfrom .core.edit import parse_text_edit\nfrom .core.protocol import Request, InsertTextFormat, Range\nfrom .core.registry import LspTextCommand\nfrom .core.typing import Any, List, Dict, Optional, Generator, Union\nfrom .core.views import FORMAT_STRING, FORMAT_MARKUP_CONTENT, minihtml\nfrom .core.views import range_to_region\n\n\nclass LspResolveDocsCommand(LspTextCommand):\n\n completions = [] # type: List[Dict[str, Any]]\n\n def run(self, edit: sublime.Edit, index: int, event: Optional[dict] = None) -> None:\n item = self.completions[index]\n detail = self.format_documentation(item.get('detail') or \"\")\n documentation = self.format_documentation(item.get(\"documentation\") or \"\")\n # don't show the detail in the cooperate AC popup if it is already shown in the AC details filed.\n self.is_detail_shown = bool(detail)\n if not detail or not documentation:\n # To make sure that the detail or documentation fields doesn't exist we need to resove the completion item.\n # If those fields appear after the item is resolved we show them in the popup.\n session = self.best_session('completionProvider.resolveProvider')\n if session:\n session.send_request(Request.resolveCompletionItem(item), self.handle_resolve_response)\n return\n minihtml_content = self.get_content(documentation, detail)\n self.show_popup(minihtml_content)\n\n def format_documentation(self, content: Union[str, Dict[str, str]]) -> str:\n return minihtml(self.view, content, allowed_formats=FORMAT_STRING | FORMAT_MARKUP_CONTENT)\n\n def get_content(self, documentation: str, detail: str) -> str:\n content = \"\"\n if detail and not self.is_detail_shown:\n content += \"<div class='highlight'>{}</div>\".format(detail)\n if documentation:\n content += \"<div>{}</div>\".format(documentation)\n return content\n\n def show_popup(self, minihtml_content: str) -> None:\n viewport_width = self.view.viewport_extent()[0]\n mdpopups.show_popup(\n self.view,\n minihtml_content,\n flags=sublime.COOPERATE_WITH_AUTO_COMPLETE,\n css=css().popups,\n wrapper_class=css().popups_classname,\n max_width=viewport_width,\n on_navigate=self.on_navigate\n )\n\n def on_navigate(self, url: str) -> None:\n webbrowser.open(url)\n\n def handle_resolve_response(self, item: Optional[dict]) -> None:\n detail = \"\"\n documentation = \"\"\n if item:\n detail = self.format_documentation(item.get('detail') or \"\")\n documentation = self.format_documentation(item.get(\"documentation\") or \"\")\n if not documentation:\n documentation = self.format_documentation({\"kind\": \"markdown\", \"value\": \"*No documentation available.*\"})\n minihtml_content = self.get_content(documentation, detail)\n show = self.update_popup if self.view.is_popup_visible() else self.show_popup\n # NOTE: Update/show popups from the main thread, or else the popup might make the AC 
widget disappear.\n sublime.set_timeout(lambda: show(minihtml_content))\n\n def update_popup(self, minihtml_content: str) -> None:\n mdpopups.update_popup(\n self.view,\n minihtml_content,\n css=css().popups,\n wrapper_class=css().popups_classname,\n )\n\n\nclass LspCompleteCommand(sublime_plugin.TextCommand):\n\n def epilogue(self, item: Dict[str, Any]) -> None:\n additional_edits = item.get('additionalTextEdits')\n if additional_edits:\n edits = [parse_text_edit(additional_edit) for additional_edit in additional_edits]\n self.view.run_command(\"lsp_apply_document_edit\", {'changes': edits})\n command = item.get(\"command\")\n if command:\n debug('Running server command \"{}\" for view {}'.format(command, self.view.id()))\n args = {\"command_name\": command[\"command\"], \"command_args\": command.get(\"arguments\")}\n self.view.run_command(\"lsp_execute\", args)\n\n\nclass LspCompleteInsertTextCommand(LspCompleteCommand):\n\n def run(self, edit: sublime.Edit, **item: Any) -> None:\n insert_text = item.get(\"insertText\") or item[\"label\"]\n if item.get(\"insertTextFormat\", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:\n self.view.run_command(\"insert_snippet\", {\"contents\": insert_text})\n else:\n self.view.run_command(\"insert\", {\"characters\": insert_text})\n self.epilogue(item)\n\n\nclass LspCompleteTextEditCommand(LspCompleteCommand):\n\n def run(self, edit: sublime.Edit, **item: Any) -> None:\n text_edit = item[\"textEdit\"]\n new_text = text_edit['newText']\n edit_region = range_to_region(Range.from_lsp(text_edit['range']), self.view)\n if item.get(\"insertTextFormat\", InsertTextFormat.PlainText) == InsertTextFormat.Snippet:\n for region in self.translated_regions(edit_region):\n self.view.erase(edit, region)\n self.view.run_command(\"insert_snippet\", {\"contents\": new_text})\n else:\n for region in self.translated_regions(edit_region):\n # NOTE: Cannot do .replace, because ST will select the replacement.\n self.view.erase(edit, region)\n self.view.insert(edit, region.a, new_text)\n self.epilogue(item)\n\n def translated_regions(self, edit_region: sublime.Region) -> Generator[sublime.Region, None, None]:\n selection = self.view.sel()\n primary_cursor_position = selection[0].b\n for region in reversed(selection):\n # For each selection region, apply the same removal as for the \"primary\" region.\n # To do that, translate, or offset, the LSP edit region into the non-\"primary\" regions.\n # The concept of \"primary\" is our own, and there is no mention of it in the LSP spec.\n translation = region.b - primary_cursor_position\n translated_edit_region = sublime.Region(edit_region.a + translation, edit_region.b + translation)\n yield translated_edit_region\n\n\ndef resolve(completion_list: sublime.CompletionList, items: List[sublime.CompletionItem], flags: int = 0) -> None:\n # Resolve the promise on the main thread to prevent any sort of data race for _set_target (see sublime_plugin.py).\n sublime.set_timeout(lambda: completion_list.set_completions(items, flags))\n", "path": "plugin/completion.py"}, {"content": "import sublime\nfrom .core.protocol import Request\nfrom .core.registry import LspTextCommand\nfrom .core.sessions import Session\nfrom .core.typing import List, Optional, Dict, Any\nfrom .core.views import uri_from_view, offset_to_point, region_to_range\n\n\nclass LspExecuteCommand(LspTextCommand):\n\n capability = 'executeCommandProvider'\n\n def run(self,\n edit: sublime.Edit,\n command_name: Optional[str] = None,\n command_args: Optional[List[Any]] = 
None,\n event: Optional[dict] = None) -> None:\n # Handle VSCode-specific command for triggering suggestions popup.\n if command_name == \"editor.action.triggerSuggest\":\n # Triggered from set_timeout as suggestions popup doesn't trigger otherwise.\n sublime.set_timeout(lambda: self.view.run_command(\"auto_complete\"))\n return\n session = self.best_session(self.capability)\n if session and command_name:\n window = self.view.window()\n if window:\n window.status_message(\"Running command {}\".format(command_name))\n if command_args:\n self._expand_variables(command_args)\n self._send_command(session, command_name, command_args)\n\n def _expand_variables(self, command_args: List[Any]) -> None:\n region = self.view.sel()[0]\n for i, arg in enumerate(command_args):\n if arg in [\"$file_uri\", \"${file_uri}\"]:\n command_args[i] = uri_from_view(self.view)\n elif arg in [\"$selection\", \"${selection}\"]:\n command_args[i] = self.view.substr(region)\n elif arg in [\"$offset\", \"${offset}\"]:\n command_args[i] = region.b\n elif arg in [\"$selection_begin\", \"${selection_begin}\"]:\n command_args[i] = region.begin()\n elif arg in [\"$selection_end\", \"${selection_end}\"]:\n command_args[i] = region.end()\n elif arg in [\"$position\", \"${position}\"]:\n command_args[i] = offset_to_point(self.view, region.b).to_lsp()\n elif arg in [\"$range\", \"${range}\"]:\n command_args[i] = region_to_range(self.view, region).to_lsp()\n\n def _handle_response(self, command: str, response: Optional[Any]) -> None:\n msg = \"command {} completed\".format(command)\n if response:\n msg += \"with response: {}\".format(response)\n\n window = self.view.window()\n if window:\n window.status_message(msg)\n\n def _handle_error(self, command: str, error: Dict[str, Any]) -> None:\n msg = \"command {} failed. Reason: {}\".format(command, error.get(\"message\", \"none provided by server :(\"))\n sublime.message_dialog(msg)\n\n def _send_command(self, session: Session, command_name: str, command_args: Optional[List[Any]]) -> None:\n request = {\"command\": command_name, \"arguments\": command_args} if command_args else {\"command\": command_name}\n session.send_request(Request.executeCommand(request),\n lambda reponse: self._handle_response(command_name, reponse),\n lambda error: self._handle_error(command_name, error))\n", "path": "plugin/execute_command.py"}]} | 3,022 | 298 |
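
To make the control flow of the patch above easier to follow outside a Sublime Text plugin host, here is a small standalone sketch of the dispatch it introduces. The function name `handle_completion_command` and the returned strings are illustrative placeholders only; in the real plugin they correspond to `self.view.run_command("auto_complete")` (wrapped in `sublime.set_timeout`) and to forwarding `workspace/executeCommand` via `Session.send_request`, as shown in the diff.

```python
# Standalone sketch of the completion-command dispatch from the patch above.
# No Sublime Text API is used; return values stand in for the real actions.
from typing import Any, Dict, Optional


def handle_completion_command(item: Dict[str, Any]) -> Optional[str]:
    """Decide what to do with the `command` attached to a completion item."""
    command = item.get("command")
    if not command:
        return None
    name = command["command"]
    args = command.get("arguments")
    if name == "editor.action.triggerSuggest":
        # VSCode-specific command: reopen the local suggestions popup instead
        # of sending it to a language server that may not implement it
        # (the source of the error dialog reported in the issue).
        return "trigger local auto_complete"
    # Anything else is forwarded to the session owning executeCommandProvider.
    return "send workspace/executeCommand: {} args={}".format(name, args)


if __name__ == "__main__":
    item = {"label": "width",
            "command": {"command": "editor.action.triggerSuggest", "title": "Suggest"}}
    print(handle_completion_command(item))  # -> trigger local auto_complete
```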
gh_patches_debug_58021 | rasdani/github-patches | git_diff | sopel-irc__sopel-949 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Problem in (at least) Wikipedia module: possibly Unicode related
Hi,
observe the following use case:
https://en.wikipedia.org/wiki/Hir%C5%8D_Onoda
@willie_5.4.1 KeyError: u'extract' (file "/usr/local/lib/python2.7/dist-packages/willie-5.4.1-py2.7.egg/willie/modules/wikipedia.py", line 89, in mw_snippet)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/modules/wikipedia.py`
Content:
```
1 # coding=utf-8
2 """
3 wikipedia.py - Sopel Wikipedia Module
4 Copyright 2013 Edward Powell - embolalia.net
5 Licensed under the Eiffel Forum License 2.
6
7 http://sopel.chat
8 """
9 from __future__ import unicode_literals, absolute_import, print_function, division
10 from sopel import web, tools
11 from sopel.config.types import StaticSection, ValidatedAttribute
12 from sopel.module import NOLIMIT, commands, example, rule
13 import json
14 import re
15
16 import sys
17 if sys.version_info.major < 3:
18 from urlparse import unquote
19 else:
20 from urllib.parse import unquote
21
22 REDIRECT = re.compile(r'^REDIRECT (.*)')
23
24
25 class WikipediaSection(StaticSection):
26 default_lang = ValidatedAttribute('default_lang', default='en')
27 """The default language to find articles from."""
28 lang_per_channel = ValidatedAttribute('lang_per_channel')
29
30
31 def setup(bot):
32 bot.config.define_section('wikipedia', WikipediaSection)
33
34 regex = re.compile('([a-z]+).(wikipedia.org/wiki/)([^ ]+)')
35 if not bot.memory.contains('url_callbacks'):
36 bot.memory['url_callbacks'] = tools.SopelMemory()
37 bot.memory['url_callbacks'][regex] = mw_info
38
39
40 def configure(config):
41 config.define_section('wikipedia', WikipediaSection)
42 config.wikipedia.configure_setting(
43 'default_lang',
44 "Enter the default language to find articles from."
45 )
46
47
48 def mw_search(server, query, num):
49 """
50 Searches the specified MediaWiki server for the given query, and returns
51 the specified number of results.
52 """
53 search_url = ('http://%s/w/api.php?format=json&action=query'
54 '&list=search&srlimit=%d&srprop=timestamp&srwhat=text'
55 '&srsearch=') % (server, num)
56 search_url += query
57 query = json.loads(web.get(search_url))
58 if 'query' in query:
59 query = query['query']['search']
60 return [r['title'] for r in query]
61 else:
62 return None
63
64
65 def say_snippet(bot, server, query, show_url=True):
66 page_name = query.replace('_', ' ')
67 query = query.replace(' ', '_')
68 snippet = mw_snippet(server, query)
69 msg = '[WIKIPEDIA] {} | "{}"'.format(page_name, snippet)
70 if show_url:
71 msg = msg + ' | https://{}/wiki/{}'.format(server, query)
72 bot.say(msg)
73
74
75 def mw_snippet(server, query):
76 """
77 Retrives a snippet of the specified length from the given page on the given
78 server.
79 """
80 snippet_url = ('https://' + server + '/w/api.php?format=json'
81 '&action=query&prop=extracts&exintro&explaintext'
82 '&exchars=300&redirects&titles=')
83 snippet_url += query
84 snippet = json.loads(web.get(snippet_url))
85 snippet = snippet['query']['pages']
86
87 # For some reason, the API gives the page *number* as the key, so we just
88 # grab the first page number in the results.
89 snippet = snippet[list(snippet.keys())[0]]
90
91 return snippet['extract']
92
93
94 @rule('.*/([a-z]+\.wikipedia.org)/wiki/([^ ]+).*')
95 def mw_info(bot, trigger, found_match=None):
96 """
97 Retrives a snippet of the specified length from the given page on the given
98 server.
99 """
100 match = found_match or trigger
101 say_snippet(bot, match.group(1), unquote(match.group(2)), show_url=False)
102
103
104 @commands('w', 'wiki', 'wik')
105 @example('.w San Francisco')
106 def wikipedia(bot, trigger):
107 lang = bot.config.wikipedia.default_lang
108
109 #change lang if channel has custom language set
110 if (trigger.sender and not trigger.sender.is_nick() and
111 bot.config.wikipedia.lang_per_channel):
112 customlang = re.search('(' + trigger.sender + '):(\w+)',
113 bot.config.wikipedia.lang_per_channel)
114 if customlang is not None:
115 lang = customlang.group(2)
116
117 if trigger.group(2) is None:
118 bot.reply("What do you want me to look up?")
119 return NOLIMIT
120
121 query = trigger.group(2)
122 args = re.search(r'^-([a-z]{2,12})\s(.*)', query)
123 if args is not None:
124 lang = args.group(1)
125 query = args.group(2)
126
127 if not query:
128 bot.reply('What do you want me to look up?')
129 return NOLIMIT
130 server = lang + '.wikipedia.org'
131 query = mw_search(server, query, 1)
132 if not query:
133 bot.reply("I can't find any results for that.")
134 return NOLIMIT
135 else:
136 query = query[0]
137 say_snippet(bot, server, query)
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sopel/modules/wikipedia.py b/sopel/modules/wikipedia.py
--- a/sopel/modules/wikipedia.py
+++ b/sopel/modules/wikipedia.py
@@ -15,7 +15,8 @@
import sys
if sys.version_info.major < 3:
- from urlparse import unquote
+ from urlparse import unquote as _unquote
+ unquote = lambda s: _unquote(s.encode('utf-8')).decode('utf-8')
else:
from urllib.parse import unquote
| {"golden_diff": "diff --git a/sopel/modules/wikipedia.py b/sopel/modules/wikipedia.py\n--- a/sopel/modules/wikipedia.py\n+++ b/sopel/modules/wikipedia.py\n@@ -15,7 +15,8 @@\n \n import sys\n if sys.version_info.major < 3:\n- from urlparse import unquote\n+ from urlparse import unquote as _unquote\n+ unquote = lambda s: _unquote(s.encode('utf-8')).decode('utf-8')\n else:\n from urllib.parse import unquote\n", "issue": "Problem in (at least) Wikipedia module: possibly Unicode related\nHi,\nobserve the following use case:\n https://en.wikipedia.org/wiki/Hir%C5%8D_Onoda\n @willie_5.4.1 KeyError: u'extract' (file \"/usr/local/lib/python2.7/dist-packages/willie-5.4.1-py2.7.egg/willie/modules/wikipedia.py\", line 89, in mw_snippet)\n\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"\nwikipedia.py - Sopel Wikipedia Module\nCopyright 2013 Edward Powell - embolalia.net\nLicensed under the Eiffel Forum License 2.\n\nhttp://sopel.chat\n\"\"\"\nfrom __future__ import unicode_literals, absolute_import, print_function, division\nfrom sopel import web, tools\nfrom sopel.config.types import StaticSection, ValidatedAttribute\nfrom sopel.module import NOLIMIT, commands, example, rule\nimport json\nimport re\n\nimport sys\nif sys.version_info.major < 3:\n from urlparse import unquote\nelse:\n from urllib.parse import unquote\n\nREDIRECT = re.compile(r'^REDIRECT (.*)')\n\n\nclass WikipediaSection(StaticSection):\n default_lang = ValidatedAttribute('default_lang', default='en')\n \"\"\"The default language to find articles from.\"\"\"\n lang_per_channel = ValidatedAttribute('lang_per_channel')\n\n\ndef setup(bot):\n bot.config.define_section('wikipedia', WikipediaSection)\n\n regex = re.compile('([a-z]+).(wikipedia.org/wiki/)([^ ]+)')\n if not bot.memory.contains('url_callbacks'):\n bot.memory['url_callbacks'] = tools.SopelMemory()\n bot.memory['url_callbacks'][regex] = mw_info\n\n\ndef configure(config):\n config.define_section('wikipedia', WikipediaSection)\n config.wikipedia.configure_setting(\n 'default_lang',\n \"Enter the default language to find articles from.\"\n )\n\n\ndef mw_search(server, query, num):\n \"\"\"\n Searches the specified MediaWiki server for the given query, and returns\n the specified number of results.\n \"\"\"\n search_url = ('http://%s/w/api.php?format=json&action=query'\n '&list=search&srlimit=%d&srprop=timestamp&srwhat=text'\n '&srsearch=') % (server, num)\n search_url += query\n query = json.loads(web.get(search_url))\n if 'query' in query:\n query = query['query']['search']\n return [r['title'] for r in query]\n else:\n return None\n\n\ndef say_snippet(bot, server, query, show_url=True):\n page_name = query.replace('_', ' ')\n query = query.replace(' ', '_')\n snippet = mw_snippet(server, query)\n msg = '[WIKIPEDIA] {} | \"{}\"'.format(page_name, snippet)\n if show_url:\n msg = msg + ' | https://{}/wiki/{}'.format(server, query)\n bot.say(msg)\n\n\ndef mw_snippet(server, query):\n \"\"\"\n Retrives a snippet of the specified length from the given page on the given\n server.\n \"\"\"\n snippet_url = ('https://' + server + '/w/api.php?format=json'\n '&action=query&prop=extracts&exintro&explaintext'\n '&exchars=300&redirects&titles=')\n snippet_url += query\n snippet = json.loads(web.get(snippet_url))\n snippet = snippet['query']['pages']\n\n # For some reason, the API gives the page *number* as the key, so we just\n # grab the first page number in the results.\n snippet = snippet[list(snippet.keys())[0]]\n\n return 
snippet['extract']\n\n\n@rule('.*/([a-z]+\\.wikipedia.org)/wiki/([^ ]+).*')\ndef mw_info(bot, trigger, found_match=None):\n \"\"\"\n Retrives a snippet of the specified length from the given page on the given\n server.\n \"\"\"\n match = found_match or trigger\n say_snippet(bot, match.group(1), unquote(match.group(2)), show_url=False)\n\n\n@commands('w', 'wiki', 'wik')\n@example('.w San Francisco')\ndef wikipedia(bot, trigger):\n lang = bot.config.wikipedia.default_lang\n\n #change lang if channel has custom language set\n if (trigger.sender and not trigger.sender.is_nick() and\n bot.config.wikipedia.lang_per_channel):\n customlang = re.search('(' + trigger.sender + '):(\\w+)',\n bot.config.wikipedia.lang_per_channel)\n if customlang is not None:\n lang = customlang.group(2)\n\n if trigger.group(2) is None:\n bot.reply(\"What do you want me to look up?\")\n return NOLIMIT\n\n query = trigger.group(2)\n args = re.search(r'^-([a-z]{2,12})\\s(.*)', query)\n if args is not None:\n lang = args.group(1)\n query = args.group(2)\n\n if not query:\n bot.reply('What do you want me to look up?')\n return NOLIMIT\n server = lang + '.wikipedia.org'\n query = mw_search(server, query, 1)\n if not query:\n bot.reply(\"I can't find any results for that.\")\n return NOLIMIT\n else:\n query = query[0]\n say_snippet(bot, server, query)\n", "path": "sopel/modules/wikipedia.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"\nwikipedia.py - Sopel Wikipedia Module\nCopyright 2013 Edward Powell - embolalia.net\nLicensed under the Eiffel Forum License 2.\n\nhttp://sopel.chat\n\"\"\"\nfrom __future__ import unicode_literals, absolute_import, print_function, division\nfrom sopel import web, tools\nfrom sopel.config.types import StaticSection, ValidatedAttribute\nfrom sopel.module import NOLIMIT, commands, example, rule\nimport json\nimport re\n\nimport sys\nif sys.version_info.major < 3:\n from urlparse import unquote as _unquote\n unquote = lambda s: _unquote(s.encode('utf-8')).decode('utf-8')\nelse:\n from urllib.parse import unquote\n\nREDIRECT = re.compile(r'^REDIRECT (.*)')\n\n\nclass WikipediaSection(StaticSection):\n default_lang = ValidatedAttribute('default_lang', default='en')\n \"\"\"The default language to find articles from.\"\"\"\n lang_per_channel = ValidatedAttribute('lang_per_channel')\n\n\ndef setup(bot):\n bot.config.define_section('wikipedia', WikipediaSection)\n\n regex = re.compile('([a-z]+).(wikipedia.org/wiki/)([^ ]+)')\n if not bot.memory.contains('url_callbacks'):\n bot.memory['url_callbacks'] = tools.SopelMemory()\n bot.memory['url_callbacks'][regex] = mw_info\n\n\ndef configure(config):\n config.define_section('wikipedia', WikipediaSection)\n config.wikipedia.configure_setting(\n 'default_lang',\n \"Enter the default language to find articles from.\"\n )\n\n\ndef mw_search(server, query, num):\n \"\"\"\n Searches the specified MediaWiki server for the given query, and returns\n the specified number of results.\n \"\"\"\n search_url = ('http://%s/w/api.php?format=json&action=query'\n '&list=search&srlimit=%d&srprop=timestamp&srwhat=text'\n '&srsearch=') % (server, num)\n search_url += query\n query = json.loads(web.get(search_url))\n if 'query' in query:\n query = query['query']['search']\n return [r['title'] for r in query]\n else:\n return None\n\n\ndef say_snippet(bot, server, query, show_url=True):\n page_name = query.replace('_', ' ')\n query = query.replace(' ', '_')\n snippet = mw_snippet(server, query)\n msg = '[WIKIPEDIA] {} | \"{}\"'.format(page_name, snippet)\n if 
show_url:\n msg = msg + ' | https://{}/wiki/{}'.format(server, query)\n bot.say(msg)\n\n\ndef mw_snippet(server, query):\n \"\"\"\n Retrives a snippet of the specified length from the given page on the given\n server.\n \"\"\"\n snippet_url = ('https://' + server + '/w/api.php?format=json'\n '&action=query&prop=extracts&exintro&explaintext'\n '&exchars=300&redirects&titles=')\n snippet_url += query\n snippet = json.loads(web.get(snippet_url))\n snippet = snippet['query']['pages']\n\n # For some reason, the API gives the page *number* as the key, so we just\n # grab the first page number in the results.\n snippet = snippet[list(snippet.keys())[0]]\n\n return snippet['extract']\n\n\n@rule('.*/([a-z]+\\.wikipedia.org)/wiki/([^ ]+).*')\ndef mw_info(bot, trigger, found_match=None):\n \"\"\"\n Retrives a snippet of the specified length from the given page on the given\n server.\n \"\"\"\n match = found_match or trigger\n say_snippet(bot, match.group(1), unquote(match.group(2)), show_url=False)\n\n\n@commands('w', 'wiki', 'wik')\n@example('.w San Francisco')\ndef wikipedia(bot, trigger):\n lang = bot.config.wikipedia.default_lang\n\n #change lang if channel has custom language set\n if (trigger.sender and not trigger.sender.is_nick() and\n bot.config.wikipedia.lang_per_channel):\n customlang = re.search('(' + trigger.sender + '):(\\w+)',\n bot.config.wikipedia.lang_per_channel)\n if customlang is not None:\n lang = customlang.group(2)\n\n if trigger.group(2) is None:\n bot.reply(\"What do you want me to look up?\")\n return NOLIMIT\n\n query = trigger.group(2)\n args = re.search(r'^-([a-z]{2,12})\\s(.*)', query)\n if args is not None:\n lang = args.group(1)\n query = args.group(2)\n\n if not query:\n bot.reply('What do you want me to look up?')\n return NOLIMIT\n server = lang + '.wikipedia.org'\n query = mw_search(server, query, 1)\n if not query:\n bot.reply(\"I can't find any results for that.\")\n return NOLIMIT\n else:\n query = query[0]\n say_snippet(bot, server, query)\n", "path": "sopel/modules/wikipedia.py"}]} | 1,752 | 119 |
gh_patches_debug_585 | rasdani/github-patches | git_diff | pex-tool__pex-1679 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.73
On the docket:
+ [x] Unexpected distribution hash #1683
+ [x] Pex fails to parse wheel tags correctly when resolving from a lock. #1676
+ [x] `pex3 lock create --style universal` does not fully patch ambient interpreter properties. #1681
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.72"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.72"
+__version__ = "2.1.73"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.72\"\n+__version__ = \"2.1.73\"\n", "issue": "Release 2.1.73\nOn the docket:\r\n+ [x] Unexpected distribution hash #1683 \r\n+ [x] Pex fails to parse wheel tags correctly when resolving from a lock. #1676 \r\n+ [x] `pex3 lock create --style universal` does not fully patch ambient interpreter properties. #1681 \n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.72\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.73\"\n", "path": "pex/version.py"}]} | 386 | 96 |
gh_patches_debug_40938 | rasdani/github-patches | git_diff | Cog-Creators__Red-DiscordBot-3911 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Mod cog sends owner notifications on fresh install.
# Other bugs
I got reminded about it when I saw a fix for #3587. Mod cog sends owner notifications about `[p]moveignoredchannels` and `[p]movedeletedelay` on fresh Red installs. Only viable solution seems to be looping through all guild settings and only send the message if `delete_delay` has been changed from the default in at least one of them though I'm basing that on my comment [here](https://github.com/Cog-Creators/Red-DiscordBot/pull/3638#discussion_r392119234).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redbot/cogs/mod/mod.py`
Content:
```
1 import asyncio
2 import logging
3 import re
4 from abc import ABC
5 from collections import defaultdict
6 from typing import List, Tuple
7
8 import discord
9 from redbot.core import Config, modlog, commands
10 from redbot.core.bot import Red
11 from redbot.core.i18n import Translator, cog_i18n
12 from redbot.core.utils._internal_utils import send_to_owners_with_prefix_replaced
13 from .casetypes import CASETYPES
14 from .events import Events
15 from .kickban import KickBanMixin
16 from .mutes import MuteMixin
17 from .names import ModInfo
18 from .slowmode import Slowmode
19 from .settings import ModSettings
20
21 _ = T_ = Translator("Mod", __file__)
22
23 __version__ = "1.2.0"
24
25
26 class CompositeMetaClass(type(commands.Cog), type(ABC)):
27 """
28 This allows the metaclass used for proper type detection to
29 coexist with discord.py's metaclass
30 """
31
32 pass
33
34
35 @cog_i18n(_)
36 class Mod(
37 ModSettings,
38 Events,
39 KickBanMixin,
40 MuteMixin,
41 ModInfo,
42 Slowmode,
43 commands.Cog,
44 metaclass=CompositeMetaClass,
45 ):
46 """Moderation tools."""
47
48 default_global_settings = {"version": ""}
49
50 default_guild_settings = {
51 "ban_mention_spam": False,
52 "delete_repeats": -1,
53 "ignored": False,
54 "respect_hierarchy": True,
55 "delete_delay": -1,
56 "reinvite_on_unban": False,
57 "current_tempbans": [],
58 "dm_on_kickban": False,
59 "default_days": 0,
60 }
61
62 default_channel_settings = {"ignored": False}
63
64 default_member_settings = {"past_nicks": [], "perms_cache": {}, "banned_until": False}
65
66 default_user_settings = {"past_names": []}
67
68 def __init__(self, bot: Red):
69 super().__init__()
70 self.bot = bot
71
72 self.config = Config.get_conf(self, 4961522000, force_registration=True)
73 self.config.register_global(**self.default_global_settings)
74 self.config.register_guild(**self.default_guild_settings)
75 self.config.register_channel(**self.default_channel_settings)
76 self.config.register_member(**self.default_member_settings)
77 self.config.register_user(**self.default_user_settings)
78 self.cache: dict = {}
79 self.tban_expiry_task = self.bot.loop.create_task(self.check_tempban_expirations())
80 self.last_case: dict = defaultdict(dict)
81
82 self._ready = asyncio.Event()
83
84 async def initialize(self):
85 await self._maybe_update_config()
86 self._ready.set()
87
88 async def cog_before_invoke(self, ctx: commands.Context) -> None:
89 await self._ready.wait()
90
91 def cog_unload(self):
92 self.tban_expiry_task.cancel()
93
94 async def _maybe_update_config(self):
95 """Maybe update `delete_delay` value set by Config prior to Mod 1.0.0."""
96 if not await self.config.version():
97 guild_dict = await self.config.all_guilds()
98 for guild_id, info in guild_dict.items():
99 delete_repeats = info.get("delete_repeats", False)
100 if delete_repeats:
101 val = 3
102 else:
103 val = -1
104 await self.config.guild(discord.Object(id=guild_id)).delete_repeats.set(val)
105 await self.config.version.set("1.0.0") # set version of last update
106 if await self.config.version() < "1.1.0":
107 msg = _(
108 "Ignored guilds and channels have been moved. "
109 "Please use `[p]moveignoredchannels` if "
110 "you were previously using these functions."
111 )
112 self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
113 await self.config.version.set("1.1.0")
114 if await self.config.version() < "1.2.0":
115 msg = _(
116 "Delete delay settings have been moved. "
117 "Please use `[p]movedeletedelay` if "
118 "you were previously using these functions."
119 )
120 self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
121 await self.config.version.set("1.2.0")
122
123 @commands.command()
124 @commands.is_owner()
125 async def moveignoredchannels(self, ctx: commands.Context) -> None:
126 """Move ignored channels and servers to core"""
127 all_guilds = await self.config.all_guilds()
128 all_channels = await self.config.all_channels()
129 for guild_id, settings in all_guilds.items():
130 await self.bot._config.guild_from_id(guild_id).ignored.set(settings["ignored"])
131 await self.config.guild_from_id(guild_id).ignored.clear()
132 for channel_id, settings in all_channels.items():
133 await self.bot._config.channel_from_id(channel_id).ignored.set(settings["ignored"])
134 await self.config.channel_from_id(channel_id).clear()
135 await ctx.send(_("Ignored channels and guilds restored."))
136
137 @commands.command()
138 @commands.is_owner()
139 async def movedeletedelay(self, ctx: commands.Context) -> None:
140 """
141 Move deletedelay settings to core
142 """
143 all_guilds = await self.config.all_guilds()
144 for guild_id, settings in all_guilds.items():
145 await self.bot._config.guild_from_id(guild_id).delete_delay.set(
146 settings["delete_delay"]
147 )
148 await self.config.guild_from_id(guild_id).delete_delay.clear()
149 await ctx.send(_("Delete delay settings restored."))
150
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/redbot/cogs/mod/mod.py b/redbot/cogs/mod/mod.py
--- a/redbot/cogs/mod/mod.py
+++ b/redbot/cogs/mod/mod.py
@@ -6,6 +6,8 @@
from typing import List, Tuple
import discord
+from redbot.core.utils import AsyncIter
+
from redbot.core import Config, modlog, commands
from redbot.core.bot import Red
from redbot.core.i18n import Translator, cog_i18n
@@ -95,7 +97,7 @@
"""Maybe update `delete_delay` value set by Config prior to Mod 1.0.0."""
if not await self.config.version():
guild_dict = await self.config.all_guilds()
- for guild_id, info in guild_dict.items():
+ async for guild_id, info in AsyncIter(guild_dict.items(), steps=25):
delete_repeats = info.get("delete_repeats", False)
if delete_repeats:
val = 3
@@ -104,20 +106,37 @@
await self.config.guild(discord.Object(id=guild_id)).delete_repeats.set(val)
await self.config.version.set("1.0.0") # set version of last update
if await self.config.version() < "1.1.0":
- msg = _(
- "Ignored guilds and channels have been moved. "
- "Please use `[p]moveignoredchannels` if "
- "you were previously using these functions."
- )
- self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
+ message_sent = False
+ async for e in AsyncIter((await self.config.all_channels()).values(), steps=25):
+ if e["ignored"] is not False:
+ msg = _(
+ "Ignored guilds and channels have been moved. "
+ "Please use `[p]moveignoredchannels` to migrate the old settings."
+ )
+ self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
+ message_sent = True
+ break
+ if message_sent is False:
+ async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):
+ if e["ignored"] is not False:
+ msg = _(
+ "Ignored guilds and channels have been moved. "
+ "Please use `[p]moveignoredchannels` to migrate the old settings."
+ )
+ self.bot.loop.create_task(
+ send_to_owners_with_prefix_replaced(self.bot, msg)
+ )
+ break
await self.config.version.set("1.1.0")
if await self.config.version() < "1.2.0":
- msg = _(
- "Delete delay settings have been moved. "
- "Please use `[p]movedeletedelay` if "
- "you were previously using these functions."
- )
- self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
+ async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):
+ if e["delete_delay"] != -1:
+ msg = _(
+ "Delete delay settings have been moved. "
+ "Please use `[p]movedeletedelay` to migrate the old settings."
+ )
+ self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))
+ break
await self.config.version.set("1.2.0")
@commands.command()
| {"golden_diff": "diff --git a/redbot/cogs/mod/mod.py b/redbot/cogs/mod/mod.py\n--- a/redbot/cogs/mod/mod.py\n+++ b/redbot/cogs/mod/mod.py\n@@ -6,6 +6,8 @@\n from typing import List, Tuple\n \n import discord\n+from redbot.core.utils import AsyncIter\n+\n from redbot.core import Config, modlog, commands\n from redbot.core.bot import Red\n from redbot.core.i18n import Translator, cog_i18n\n@@ -95,7 +97,7 @@\n \"\"\"Maybe update `delete_delay` value set by Config prior to Mod 1.0.0.\"\"\"\n if not await self.config.version():\n guild_dict = await self.config.all_guilds()\n- for guild_id, info in guild_dict.items():\n+ async for guild_id, info in AsyncIter(guild_dict.items(), steps=25):\n delete_repeats = info.get(\"delete_repeats\", False)\n if delete_repeats:\n val = 3\n@@ -104,20 +106,37 @@\n await self.config.guild(discord.Object(id=guild_id)).delete_repeats.set(val)\n await self.config.version.set(\"1.0.0\") # set version of last update\n if await self.config.version() < \"1.1.0\":\n- msg = _(\n- \"Ignored guilds and channels have been moved. \"\n- \"Please use `[p]moveignoredchannels` if \"\n- \"you were previously using these functions.\"\n- )\n- self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n+ message_sent = False\n+ async for e in AsyncIter((await self.config.all_channels()).values(), steps=25):\n+ if e[\"ignored\"] is not False:\n+ msg = _(\n+ \"Ignored guilds and channels have been moved. \"\n+ \"Please use `[p]moveignoredchannels` to migrate the old settings.\"\n+ )\n+ self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n+ message_sent = True\n+ break\n+ if message_sent is False:\n+ async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):\n+ if e[\"ignored\"] is not False:\n+ msg = _(\n+ \"Ignored guilds and channels have been moved. \"\n+ \"Please use `[p]moveignoredchannels` to migrate the old settings.\"\n+ )\n+ self.bot.loop.create_task(\n+ send_to_owners_with_prefix_replaced(self.bot, msg)\n+ )\n+ break\n await self.config.version.set(\"1.1.0\")\n if await self.config.version() < \"1.2.0\":\n- msg = _(\n- \"Delete delay settings have been moved. \"\n- \"Please use `[p]movedeletedelay` if \"\n- \"you were previously using these functions.\"\n- )\n- self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n+ async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):\n+ if e[\"delete_delay\"] != -1:\n+ msg = _(\n+ \"Delete delay settings have been moved. \"\n+ \"Please use `[p]movedeletedelay` to migrate the old settings.\"\n+ )\n+ self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n+ break\n await self.config.version.set(\"1.2.0\")\n \n @commands.command()\n", "issue": "Mod cog sends owner notifications on fresh install.\n# Other bugs\r\n\r\nI got reminded about it when I saw a fix for #3587. Mod cog sends owner notifications about `[p]moveignoredchannels` and `[p]movedeletedelay` on fresh Red installs. 
Only viable solution seems to be looping through all guild settings and only send the message if `delete_delay` has been changed from the default in at least one of them though I'm basing that on my comment [here](https://github.com/Cog-Creators/Red-DiscordBot/pull/3638#discussion_r392119234).\r\n\n", "before_files": [{"content": "import asyncio\nimport logging\nimport re\nfrom abc import ABC\nfrom collections import defaultdict\nfrom typing import List, Tuple\n\nimport discord\nfrom redbot.core import Config, modlog, commands\nfrom redbot.core.bot import Red\nfrom redbot.core.i18n import Translator, cog_i18n\nfrom redbot.core.utils._internal_utils import send_to_owners_with_prefix_replaced\nfrom .casetypes import CASETYPES\nfrom .events import Events\nfrom .kickban import KickBanMixin\nfrom .mutes import MuteMixin\nfrom .names import ModInfo\nfrom .slowmode import Slowmode\nfrom .settings import ModSettings\n\n_ = T_ = Translator(\"Mod\", __file__)\n\n__version__ = \"1.2.0\"\n\n\nclass CompositeMetaClass(type(commands.Cog), type(ABC)):\n \"\"\"\n This allows the metaclass used for proper type detection to\n coexist with discord.py's metaclass\n \"\"\"\n\n pass\n\n\n@cog_i18n(_)\nclass Mod(\n ModSettings,\n Events,\n KickBanMixin,\n MuteMixin,\n ModInfo,\n Slowmode,\n commands.Cog,\n metaclass=CompositeMetaClass,\n):\n \"\"\"Moderation tools.\"\"\"\n\n default_global_settings = {\"version\": \"\"}\n\n default_guild_settings = {\n \"ban_mention_spam\": False,\n \"delete_repeats\": -1,\n \"ignored\": False,\n \"respect_hierarchy\": True,\n \"delete_delay\": -1,\n \"reinvite_on_unban\": False,\n \"current_tempbans\": [],\n \"dm_on_kickban\": False,\n \"default_days\": 0,\n }\n\n default_channel_settings = {\"ignored\": False}\n\n default_member_settings = {\"past_nicks\": [], \"perms_cache\": {}, \"banned_until\": False}\n\n default_user_settings = {\"past_names\": []}\n\n def __init__(self, bot: Red):\n super().__init__()\n self.bot = bot\n\n self.config = Config.get_conf(self, 4961522000, force_registration=True)\n self.config.register_global(**self.default_global_settings)\n self.config.register_guild(**self.default_guild_settings)\n self.config.register_channel(**self.default_channel_settings)\n self.config.register_member(**self.default_member_settings)\n self.config.register_user(**self.default_user_settings)\n self.cache: dict = {}\n self.tban_expiry_task = self.bot.loop.create_task(self.check_tempban_expirations())\n self.last_case: dict = defaultdict(dict)\n\n self._ready = asyncio.Event()\n\n async def initialize(self):\n await self._maybe_update_config()\n self._ready.set()\n\n async def cog_before_invoke(self, ctx: commands.Context) -> None:\n await self._ready.wait()\n\n def cog_unload(self):\n self.tban_expiry_task.cancel()\n\n async def _maybe_update_config(self):\n \"\"\"Maybe update `delete_delay` value set by Config prior to Mod 1.0.0.\"\"\"\n if not await self.config.version():\n guild_dict = await self.config.all_guilds()\n for guild_id, info in guild_dict.items():\n delete_repeats = info.get(\"delete_repeats\", False)\n if delete_repeats:\n val = 3\n else:\n val = -1\n await self.config.guild(discord.Object(id=guild_id)).delete_repeats.set(val)\n await self.config.version.set(\"1.0.0\") # set version of last update\n if await self.config.version() < \"1.1.0\":\n msg = _(\n \"Ignored guilds and channels have been moved. 
\"\n \"Please use `[p]moveignoredchannels` if \"\n \"you were previously using these functions.\"\n )\n self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n await self.config.version.set(\"1.1.0\")\n if await self.config.version() < \"1.2.0\":\n msg = _(\n \"Delete delay settings have been moved. \"\n \"Please use `[p]movedeletedelay` if \"\n \"you were previously using these functions.\"\n )\n self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n await self.config.version.set(\"1.2.0\")\n\n @commands.command()\n @commands.is_owner()\n async def moveignoredchannels(self, ctx: commands.Context) -> None:\n \"\"\"Move ignored channels and servers to core\"\"\"\n all_guilds = await self.config.all_guilds()\n all_channels = await self.config.all_channels()\n for guild_id, settings in all_guilds.items():\n await self.bot._config.guild_from_id(guild_id).ignored.set(settings[\"ignored\"])\n await self.config.guild_from_id(guild_id).ignored.clear()\n for channel_id, settings in all_channels.items():\n await self.bot._config.channel_from_id(channel_id).ignored.set(settings[\"ignored\"])\n await self.config.channel_from_id(channel_id).clear()\n await ctx.send(_(\"Ignored channels and guilds restored.\"))\n\n @commands.command()\n @commands.is_owner()\n async def movedeletedelay(self, ctx: commands.Context) -> None:\n \"\"\"\n Move deletedelay settings to core\n \"\"\"\n all_guilds = await self.config.all_guilds()\n for guild_id, settings in all_guilds.items():\n await self.bot._config.guild_from_id(guild_id).delete_delay.set(\n settings[\"delete_delay\"]\n )\n await self.config.guild_from_id(guild_id).delete_delay.clear()\n await ctx.send(_(\"Delete delay settings restored.\"))\n", "path": "redbot/cogs/mod/mod.py"}], "after_files": [{"content": "import asyncio\nimport logging\nimport re\nfrom abc import ABC\nfrom collections import defaultdict\nfrom typing import List, Tuple\n\nimport discord\nfrom redbot.core.utils import AsyncIter\n\nfrom redbot.core import Config, modlog, commands\nfrom redbot.core.bot import Red\nfrom redbot.core.i18n import Translator, cog_i18n\nfrom redbot.core.utils._internal_utils import send_to_owners_with_prefix_replaced\nfrom .casetypes import CASETYPES\nfrom .events import Events\nfrom .kickban import KickBanMixin\nfrom .mutes import MuteMixin\nfrom .names import ModInfo\nfrom .slowmode import Slowmode\nfrom .settings import ModSettings\n\n_ = T_ = Translator(\"Mod\", __file__)\n\n__version__ = \"1.2.0\"\n\n\nclass CompositeMetaClass(type(commands.Cog), type(ABC)):\n \"\"\"\n This allows the metaclass used for proper type detection to\n coexist with discord.py's metaclass\n \"\"\"\n\n pass\n\n\n@cog_i18n(_)\nclass Mod(\n ModSettings,\n Events,\n KickBanMixin,\n MuteMixin,\n ModInfo,\n Slowmode,\n commands.Cog,\n metaclass=CompositeMetaClass,\n):\n \"\"\"Moderation tools.\"\"\"\n\n default_global_settings = {\"version\": \"\"}\n\n default_guild_settings = {\n \"ban_mention_spam\": False,\n \"delete_repeats\": -1,\n \"ignored\": False,\n \"respect_hierarchy\": True,\n \"delete_delay\": -1,\n \"reinvite_on_unban\": False,\n \"current_tempbans\": [],\n \"dm_on_kickban\": False,\n \"default_days\": 0,\n }\n\n default_channel_settings = {\"ignored\": False}\n\n default_member_settings = {\"past_nicks\": [], \"perms_cache\": {}, \"banned_until\": False}\n\n default_user_settings = {\"past_names\": []}\n\n def __init__(self, bot: Red):\n super().__init__()\n self.bot = bot\n\n self.config = Config.get_conf(self, 4961522000, 
force_registration=True)\n self.config.register_global(**self.default_global_settings)\n self.config.register_guild(**self.default_guild_settings)\n self.config.register_channel(**self.default_channel_settings)\n self.config.register_member(**self.default_member_settings)\n self.config.register_user(**self.default_user_settings)\n self.cache: dict = {}\n self.tban_expiry_task = self.bot.loop.create_task(self.check_tempban_expirations())\n self.last_case: dict = defaultdict(dict)\n\n self._ready = asyncio.Event()\n\n async def initialize(self):\n await self._maybe_update_config()\n self._ready.set()\n\n async def cog_before_invoke(self, ctx: commands.Context) -> None:\n await self._ready.wait()\n\n def cog_unload(self):\n self.tban_expiry_task.cancel()\n\n async def _maybe_update_config(self):\n \"\"\"Maybe update `delete_delay` value set by Config prior to Mod 1.0.0.\"\"\"\n if not await self.config.version():\n guild_dict = await self.config.all_guilds()\n async for guild_id, info in AsyncIter(guild_dict.items(), steps=25):\n delete_repeats = info.get(\"delete_repeats\", False)\n if delete_repeats:\n val = 3\n else:\n val = -1\n await self.config.guild(discord.Object(id=guild_id)).delete_repeats.set(val)\n await self.config.version.set(\"1.0.0\") # set version of last update\n if await self.config.version() < \"1.1.0\":\n message_sent = False\n async for e in AsyncIter((await self.config.all_channels()).values(), steps=25):\n if e[\"ignored\"] is not False:\n msg = _(\n \"Ignored guilds and channels have been moved. \"\n \"Please use `[p]moveignoredchannels` to migrate the old settings.\"\n )\n self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n message_sent = True\n break\n if message_sent is False:\n async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):\n if e[\"ignored\"] is not False:\n msg = _(\n \"Ignored guilds and channels have been moved. \"\n \"Please use `[p]moveignoredchannels` to migrate the old settings.\"\n )\n self.bot.loop.create_task(\n send_to_owners_with_prefix_replaced(self.bot, msg)\n )\n break\n await self.config.version.set(\"1.1.0\")\n if await self.config.version() < \"1.2.0\":\n async for e in AsyncIter((await self.config.all_guilds()).values(), steps=25):\n if e[\"delete_delay\"] != -1:\n msg = _(\n \"Delete delay settings have been moved. 
\"\n \"Please use `[p]movedeletedelay` to migrate the old settings.\"\n )\n self.bot.loop.create_task(send_to_owners_with_prefix_replaced(self.bot, msg))\n break\n await self.config.version.set(\"1.2.0\")\n\n @commands.command()\n @commands.is_owner()\n async def moveignoredchannels(self, ctx: commands.Context) -> None:\n \"\"\"Move ignored channels and servers to core\"\"\"\n all_guilds = await self.config.all_guilds()\n all_channels = await self.config.all_channels()\n for guild_id, settings in all_guilds.items():\n await self.bot._config.guild_from_id(guild_id).ignored.set(settings[\"ignored\"])\n await self.config.guild_from_id(guild_id).ignored.clear()\n for channel_id, settings in all_channels.items():\n await self.bot._config.channel_from_id(channel_id).ignored.set(settings[\"ignored\"])\n await self.config.channel_from_id(channel_id).clear()\n await ctx.send(_(\"Ignored channels and guilds restored.\"))\n\n @commands.command()\n @commands.is_owner()\n async def movedeletedelay(self, ctx: commands.Context) -> None:\n \"\"\"\n Move deletedelay settings to core\n \"\"\"\n all_guilds = await self.config.all_guilds()\n for guild_id, settings in all_guilds.items():\n await self.bot._config.guild_from_id(guild_id).delete_delay.set(\n settings[\"delete_delay\"]\n )\n await self.config.guild_from_id(guild_id).delete_delay.clear()\n await ctx.send(_(\"Delete delay settings restored.\"))\n", "path": "redbot/cogs/mod/mod.py"}]} | 1,965 | 800 |
gh_patches_debug_577 | rasdani/github-patches | git_diff | numba__numba-1356 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use CPython allocator in NRT
NRT should optionally use the CPython memory allocation functions (when imported from CPython). This would allow Numba-allocated memory to be seen by other utilities such as `sys.getallocatedblocks()`, `sys.debugmallocstats()`, and `tracemalloc`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `numba/runtime/nrt.py`
Content:
```
1 from __future__ import print_function, absolute_import, division
2
3 from collections import namedtuple
4
5 from . import atomicops
6 from llvmlite import binding as ll
7
8 from numba.utils import finalize as _finalize
9 from . import _nrt_python as _nrt
10
11 _nrt_mstats = namedtuple("nrt_mstats", ["alloc", "free", "mi_alloc", "mi_free"])
12
13
14 class _Runtime(object):
15 def __init__(self):
16 self._init = False
17
18 def initialize(self, ctx):
19 """Initializes the NRT
20
21 Must be called before any actual call to the NRT API.
22 Safe to be called multiple times.
23 """
24 if self._init:
25 # Already initialized
26 return
27
28 # Register globals into the system
29 for py_name in _nrt.c_helpers:
30 c_name = "NRT_" + py_name
31 c_address = _nrt.c_helpers[py_name]
32 ll.add_symbol(c_name, c_address)
33
34 # Compile atomic operations
35 self._library = atomicops.compile_nrt_functions(ctx)
36
37 self._ptr_inc = self._library.get_pointer_to_function("nrt_atomic_add")
38 self._ptr_dec = self._library.get_pointer_to_function("nrt_atomic_sub")
39 self._ptr_cas = self._library.get_pointer_to_function("nrt_atomic_cas")
40
41 # Install atomic ops to NRT
42 _nrt.memsys_set_atomic_inc_dec(self._ptr_inc, self._ptr_dec)
43 _nrt.memsys_set_atomic_cas(self._ptr_cas)
44
45 self._init = True
46
47 @staticmethod
48 def shutdown():
49 """
50 Shutdown the NRT
51 Safe to be called without calling Runtime.initialize first
52 """
53 _nrt.memsys_shutdown()
54
55 @property
56 def library(self):
57 """
58 Return the Library object containing the various NRT functions.
59 """
60 return self._library
61
62 def meminfo_new(self, data, pyobj):
63 """
64 Returns a MemInfo object that tracks memory at `data` owned by `pyobj`.
65 MemInfo will acquire a reference on `pyobj`.
66 The release of MemInfo will release a reference on `pyobj`.
67 """
68 mi = _nrt.meminfo_new(data, pyobj)
69 return MemInfo(mi)
70
71 def meminfo_alloc(self, size, safe=False):
72 """
73 Allocate a new memory of `size` bytes and returns a MemInfo object
74 that tracks the allocation. When there is no more reference to the
75 MemInfo object, the underlying memory will be deallocated.
76
77 If `safe` flag is True, the memory is allocated using the `safe` scheme.
78 This is used for debugging and testing purposes.
79 See `NRT_MemInfo_alloc_safe()` in "nrt.h" for details.
80 """
81 if safe:
82 mi = _nrt.meminfo_alloc_safe(size)
83 else:
84 mi = _nrt.meminfo_alloc(size)
85 return MemInfo(mi)
86
87 def get_allocation_stats(self):
88 """
89 Returns a namedtuple of (alloc, free, mi_alloc, mi_free) for count of
90 each memory operations.
91 """
92 return _nrt_mstats(alloc=_nrt.memsys_get_stats_alloc(),
93 free=_nrt.memsys_get_stats_free(),
94 mi_alloc=_nrt.memsys_get_stats_mi_alloc(),
95 mi_free=_nrt.memsys_get_stats_mi_free())
96
97
98 # Alias to _nrt_python._MemInfo
99 MemInfo = _nrt._MemInfo
100
101 # Create uninitialized runtime
102 rtsys = _Runtime()
103
104 # Install finalizer
105 _finalize(rtsys, _Runtime.shutdown)
106
107 # Avoid future use of the class
108 del _Runtime
109
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/numba/runtime/nrt.py b/numba/runtime/nrt.py
--- a/numba/runtime/nrt.py
+++ b/numba/runtime/nrt.py
@@ -98,7 +98,8 @@
# Alias to _nrt_python._MemInfo
MemInfo = _nrt._MemInfo
-# Create uninitialized runtime
+# Create runtime
+_nrt.memsys_use_cpython_allocator()
rtsys = _Runtime()
# Install finalizer
| {"golden_diff": "diff --git a/numba/runtime/nrt.py b/numba/runtime/nrt.py\n--- a/numba/runtime/nrt.py\n+++ b/numba/runtime/nrt.py\n@@ -98,7 +98,8 @@\n # Alias to _nrt_python._MemInfo\n MemInfo = _nrt._MemInfo\n \n-# Create uninitialized runtime\n+# Create runtime\n+_nrt.memsys_use_cpython_allocator()\n rtsys = _Runtime()\n \n # Install finalizer\n", "issue": "Use CPython allocator in NRT\nNRT should optionally use the CPython memory allocation functions (when imported from CPython). This would allow Numba-allocated memory to be seen by other utilities such as `sys.getallocatedblocks()`, `sys.debugmallocstats()`, and `tracemalloc`.\n\n", "before_files": [{"content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"alloc\", \"free\", \"mi_alloc\", \"mi_free\"])\n\n\nclass _Runtime(object):\n def __init__(self):\n self._init = False\n\n def initialize(self, ctx):\n \"\"\"Initializes the NRT\n\n Must be called before any actual call to the NRT API.\n Safe to be called multiple times.\n \"\"\"\n if self._init:\n # Already initialized\n return\n\n # Register globals into the system\n for py_name in _nrt.c_helpers:\n c_name = \"NRT_\" + py_name\n c_address = _nrt.c_helpers[py_name]\n ll.add_symbol(c_name, c_address)\n\n # Compile atomic operations\n self._library = atomicops.compile_nrt_functions(ctx)\n\n self._ptr_inc = self._library.get_pointer_to_function(\"nrt_atomic_add\")\n self._ptr_dec = self._library.get_pointer_to_function(\"nrt_atomic_sub\")\n self._ptr_cas = self._library.get_pointer_to_function(\"nrt_atomic_cas\")\n\n # Install atomic ops to NRT\n _nrt.memsys_set_atomic_inc_dec(self._ptr_inc, self._ptr_dec)\n _nrt.memsys_set_atomic_cas(self._ptr_cas)\n\n self._init = True\n\n @staticmethod\n def shutdown():\n \"\"\"\n Shutdown the NRT\n Safe to be called without calling Runtime.initialize first\n \"\"\"\n _nrt.memsys_shutdown()\n\n @property\n def library(self):\n \"\"\"\n Return the Library object containing the various NRT functions.\n \"\"\"\n return self._library\n\n def meminfo_new(self, data, pyobj):\n \"\"\"\n Returns a MemInfo object that tracks memory at `data` owned by `pyobj`.\n MemInfo will acquire a reference on `pyobj`.\n The release of MemInfo will release a reference on `pyobj`.\n \"\"\"\n mi = _nrt.meminfo_new(data, pyobj)\n return MemInfo(mi)\n\n def meminfo_alloc(self, size, safe=False):\n \"\"\"\n Allocate a new memory of `size` bytes and returns a MemInfo object\n that tracks the allocation. 
When there is no more reference to the\n MemInfo object, the underlying memory will be deallocated.\n\n If `safe` flag is True, the memory is allocated using the `safe` scheme.\n This is used for debugging and testing purposes.\n See `NRT_MemInfo_alloc_safe()` in \"nrt.h\" for details.\n \"\"\"\n if safe:\n mi = _nrt.meminfo_alloc_safe(size)\n else:\n mi = _nrt.meminfo_alloc(size)\n return MemInfo(mi)\n\n def get_allocation_stats(self):\n \"\"\"\n Returns a namedtuple of (alloc, free, mi_alloc, mi_free) for count of\n each memory operations.\n \"\"\"\n return _nrt_mstats(alloc=_nrt.memsys_get_stats_alloc(),\n free=_nrt.memsys_get_stats_free(),\n mi_alloc=_nrt.memsys_get_stats_mi_alloc(),\n mi_free=_nrt.memsys_get_stats_mi_free())\n\n\n# Alias to _nrt_python._MemInfo\nMemInfo = _nrt._MemInfo\n\n# Create uninitialized runtime\nrtsys = _Runtime()\n\n# Install finalizer\n_finalize(rtsys, _Runtime.shutdown)\n\n# Avoid future use of the class\ndel _Runtime\n", "path": "numba/runtime/nrt.py"}], "after_files": [{"content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"alloc\", \"free\", \"mi_alloc\", \"mi_free\"])\n\n\nclass _Runtime(object):\n def __init__(self):\n self._init = False\n\n def initialize(self, ctx):\n \"\"\"Initializes the NRT\n\n Must be called before any actual call to the NRT API.\n Safe to be called multiple times.\n \"\"\"\n if self._init:\n # Already initialized\n return\n\n # Register globals into the system\n for py_name in _nrt.c_helpers:\n c_name = \"NRT_\" + py_name\n c_address = _nrt.c_helpers[py_name]\n ll.add_symbol(c_name, c_address)\n\n # Compile atomic operations\n self._library = atomicops.compile_nrt_functions(ctx)\n\n self._ptr_inc = self._library.get_pointer_to_function(\"nrt_atomic_add\")\n self._ptr_dec = self._library.get_pointer_to_function(\"nrt_atomic_sub\")\n self._ptr_cas = self._library.get_pointer_to_function(\"nrt_atomic_cas\")\n\n # Install atomic ops to NRT\n _nrt.memsys_set_atomic_inc_dec(self._ptr_inc, self._ptr_dec)\n _nrt.memsys_set_atomic_cas(self._ptr_cas)\n\n self._init = True\n\n @staticmethod\n def shutdown():\n \"\"\"\n Shutdown the NRT\n Safe to be called without calling Runtime.initialize first\n \"\"\"\n _nrt.memsys_shutdown()\n\n @property\n def library(self):\n \"\"\"\n Return the Library object containing the various NRT functions.\n \"\"\"\n return self._library\n\n def meminfo_new(self, data, pyobj):\n \"\"\"\n Returns a MemInfo object that tracks memory at `data` owned by `pyobj`.\n MemInfo will acquire a reference on `pyobj`.\n The release of MemInfo will release a reference on `pyobj`.\n \"\"\"\n mi = _nrt.meminfo_new(data, pyobj)\n return MemInfo(mi)\n\n def meminfo_alloc(self, size, safe=False):\n \"\"\"\n Allocate a new memory of `size` bytes and returns a MemInfo object\n that tracks the allocation. 
When there is no more reference to the\n MemInfo object, the underlying memory will be deallocated.\n\n If `safe` flag is True, the memory is allocated using the `safe` scheme.\n This is used for debugging and testing purposes.\n See `NRT_MemInfo_alloc_safe()` in \"nrt.h\" for details.\n \"\"\"\n if safe:\n mi = _nrt.meminfo_alloc_safe(size)\n else:\n mi = _nrt.meminfo_alloc(size)\n return MemInfo(mi)\n\n def get_allocation_stats(self):\n \"\"\"\n Returns a namedtuple of (alloc, free, mi_alloc, mi_free) for count of\n each memory operations.\n \"\"\"\n return _nrt_mstats(alloc=_nrt.memsys_get_stats_alloc(),\n free=_nrt.memsys_get_stats_free(),\n mi_alloc=_nrt.memsys_get_stats_mi_alloc(),\n mi_free=_nrt.memsys_get_stats_mi_free())\n\n\n# Alias to _nrt_python._MemInfo\nMemInfo = _nrt._MemInfo\n\n# Create runtime\n_nrt.memsys_use_cpython_allocator()\nrtsys = _Runtime()\n\n# Install finalizer\n_finalize(rtsys, _Runtime.shutdown)\n\n# Avoid future use of the class\ndel _Runtime\n", "path": "numba/runtime/nrt.py"}]} | 1,355 | 106 |
gh_patches_debug_36848 | rasdani/github-patches | git_diff | pwndbg__pwndbg-1920 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The `ctx threads` (or `threads`) should display all threads no matter of context threads limit
cc: @CptGibbon we should probably add this option for convenience :)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/commands/tls.py`
Content:
```
1 """
2 Command to print the information of the current Thread Local Storage (TLS).
3 """
4 from __future__ import annotations
5
6 import argparse
7
8 import pwndbg.commands
9 import pwndbg.gdblib.tls
10 from pwndbg.color import message
11 from pwndbg.commands import CommandCategory
12
13 parser = argparse.ArgumentParser(
14 formatter_class=argparse.RawTextHelpFormatter,
15 description="Print out base address of the current Thread Local Storage (TLS).",
16 )
17
18 parser.add_argument(
19 "-p",
20 "--pthread-self",
21 action="store_true",
22 default=False,
23 help="Try to get the address of TLS by calling pthread_self().",
24 )
25
26
27 @pwndbg.commands.ArgparsedCommand(parser, category=CommandCategory.LINUX)
28 @pwndbg.commands.OnlyWhenRunning
29 @pwndbg.commands.OnlyWhenUserspace
30 def tls(pthread_self=False) -> None:
31 tls_base = (
32 pwndbg.gdblib.tls.find_address_with_register()
33 if not pthread_self
34 else pwndbg.gdblib.tls.find_address_with_pthread_self()
35 )
36 if pwndbg.gdblib.memory.is_readable_address(tls_base):
37 print(message.success("Thread Local Storage (TLS) base: %#x" % tls_base))
38 print(message.success("TLS is located at:"))
39 print(message.notice(pwndbg.gdblib.vmmap.find(tls_base)))
40 return
41 print(message.error("Couldn't find Thread Local Storage (TLS) base."))
42 if not pthread_self:
43 print(
44 message.notice(
45 "You can try to use -p/--pthread option to get the address of TLS by calling pthread_self().\n"
46 "(This might cause problems if the pthread_self() is not in libc or not initialized yet.)"
47 )
48 )
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pwndbg/commands/tls.py b/pwndbg/commands/tls.py
--- a/pwndbg/commands/tls.py
+++ b/pwndbg/commands/tls.py
@@ -5,6 +5,10 @@
import argparse
+import gdb
+from tabulate import tabulate
+
+import pwndbg.color.memory as M
import pwndbg.commands
import pwndbg.gdblib.tls
from pwndbg.color import message
@@ -46,3 +50,97 @@
"(This might cause problems if the pthread_self() is not in libc or not initialized yet.)"
)
)
+
+
+parser = argparse.ArgumentParser(
+ formatter_class=argparse.RawTextHelpFormatter,
+ description="List all threads belonging to the selected inferior.",
+)
+group = parser.add_mutually_exclusive_group()
+
+group.add_argument(
+ "num_threads",
+ type=int,
+ nargs="?",
+ default=None,
+ help="Number of threads to display. Omit to display all threads.",
+)
+
+group.add_argument(
+ "-c",
+ "--config",
+ action="store_true",
+ dest="respect_config",
+ help="Respect context-max-threads config to limit number of threads displayed.",
+)
+
+
[email protected](parser, category=CommandCategory.LINUX)
[email protected]
[email protected]
+def threads(num_threads, respect_config) -> None:
+ table = []
+ headers = ["global_num", "name", "status", "pc", "symbol"]
+ bold_green = lambda text: pwndbg.color.bold(pwndbg.color.green(text))
+
+ try:
+ original_thread = gdb.selected_thread()
+ except SystemError:
+ original_thread = None
+
+ all_threads = gdb.selected_inferior().threads()[::-1]
+
+ displayed_threads = []
+
+ if original_thread is not None and original_thread.is_valid():
+ displayed_threads.append(original_thread)
+
+ for thread in all_threads:
+ if respect_config and len(displayed_threads) >= int(
+ pwndbg.commands.context.config_max_threads_display
+ ):
+ break
+ elif num_threads is not None and len(displayed_threads) >= num_threads:
+ break
+
+ if thread.is_valid() and thread is not original_thread:
+ displayed_threads.append(thread)
+
+ for thread in displayed_threads:
+ name = thread.name or ""
+
+ if thread is original_thread:
+ row = [
+ bold_green(thread.global_num),
+ bold_green(name),
+ ]
+ else:
+ row = [
+ thread.global_num,
+ name,
+ ]
+
+ row.append(pwndbg.commands.context.get_thread_status(thread))
+
+ if thread.is_stopped():
+ thread.switch()
+ pc = gdb.selected_frame().pc()
+
+ pc_colored = M.get(pc)
+ symbol = pwndbg.gdblib.symbol.get(pc)
+
+ row.append(pc_colored)
+
+ if symbol:
+ if thread is original_thread:
+ row.append(bold_green(symbol))
+ else:
+ row.append(symbol)
+
+ table.append(row)
+
+ if original_thread is not None and original_thread.is_valid():
+ original_thread.switch()
+
+ print(tabulate(table, headers))
+ print(f"\nShowing {len(displayed_threads)} of {len(all_threads)} threads.")
| {"golden_diff": "diff --git a/pwndbg/commands/tls.py b/pwndbg/commands/tls.py\n--- a/pwndbg/commands/tls.py\n+++ b/pwndbg/commands/tls.py\n@@ -5,6 +5,10 @@\n \n import argparse\n \n+import gdb\n+from tabulate import tabulate\n+\n+import pwndbg.color.memory as M\n import pwndbg.commands\n import pwndbg.gdblib.tls\n from pwndbg.color import message\n@@ -46,3 +50,97 @@\n \"(This might cause problems if the pthread_self() is not in libc or not initialized yet.)\"\n )\n )\n+\n+\n+parser = argparse.ArgumentParser(\n+ formatter_class=argparse.RawTextHelpFormatter,\n+ description=\"List all threads belonging to the selected inferior.\",\n+)\n+group = parser.add_mutually_exclusive_group()\n+\n+group.add_argument(\n+ \"num_threads\",\n+ type=int,\n+ nargs=\"?\",\n+ default=None,\n+ help=\"Number of threads to display. Omit to display all threads.\",\n+)\n+\n+group.add_argument(\n+ \"-c\",\n+ \"--config\",\n+ action=\"store_true\",\n+ dest=\"respect_config\",\n+ help=\"Respect context-max-threads config to limit number of threads displayed.\",\n+)\n+\n+\[email protected](parser, category=CommandCategory.LINUX)\[email protected]\[email protected]\n+def threads(num_threads, respect_config) -> None:\n+ table = []\n+ headers = [\"global_num\", \"name\", \"status\", \"pc\", \"symbol\"]\n+ bold_green = lambda text: pwndbg.color.bold(pwndbg.color.green(text))\n+\n+ try:\n+ original_thread = gdb.selected_thread()\n+ except SystemError:\n+ original_thread = None\n+\n+ all_threads = gdb.selected_inferior().threads()[::-1]\n+\n+ displayed_threads = []\n+\n+ if original_thread is not None and original_thread.is_valid():\n+ displayed_threads.append(original_thread)\n+\n+ for thread in all_threads:\n+ if respect_config and len(displayed_threads) >= int(\n+ pwndbg.commands.context.config_max_threads_display\n+ ):\n+ break\n+ elif num_threads is not None and len(displayed_threads) >= num_threads:\n+ break\n+\n+ if thread.is_valid() and thread is not original_thread:\n+ displayed_threads.append(thread)\n+\n+ for thread in displayed_threads:\n+ name = thread.name or \"\"\n+\n+ if thread is original_thread:\n+ row = [\n+ bold_green(thread.global_num),\n+ bold_green(name),\n+ ]\n+ else:\n+ row = [\n+ thread.global_num,\n+ name,\n+ ]\n+\n+ row.append(pwndbg.commands.context.get_thread_status(thread))\n+\n+ if thread.is_stopped():\n+ thread.switch()\n+ pc = gdb.selected_frame().pc()\n+\n+ pc_colored = M.get(pc)\n+ symbol = pwndbg.gdblib.symbol.get(pc)\n+\n+ row.append(pc_colored)\n+\n+ if symbol:\n+ if thread is original_thread:\n+ row.append(bold_green(symbol))\n+ else:\n+ row.append(symbol)\n+\n+ table.append(row)\n+\n+ if original_thread is not None and original_thread.is_valid():\n+ original_thread.switch()\n+\n+ print(tabulate(table, headers))\n+ print(f\"\\nShowing {len(displayed_threads)} of {len(all_threads)} threads.\")\n", "issue": "The `ctx threads` (or `threads`) should display all threads no matter of context threads limit\ncc: @CptGibbon we should probably add this option for convenience :)\n", "before_files": [{"content": "\"\"\"\nCommand to print the information of the current Thread Local Storage (TLS).\n\"\"\"\nfrom __future__ import annotations\n\nimport argparse\n\nimport pwndbg.commands\nimport pwndbg.gdblib.tls\nfrom pwndbg.color import message\nfrom pwndbg.commands import CommandCategory\n\nparser = argparse.ArgumentParser(\n formatter_class=argparse.RawTextHelpFormatter,\n description=\"Print out base address of the current Thread Local Storage (TLS).\",\n)\n\nparser.add_argument(\n \"-p\",\n 
\"--pthread-self\",\n action=\"store_true\",\n default=False,\n help=\"Try to get the address of TLS by calling pthread_self().\",\n)\n\n\[email protected](parser, category=CommandCategory.LINUX)\[email protected]\[email protected]\ndef tls(pthread_self=False) -> None:\n tls_base = (\n pwndbg.gdblib.tls.find_address_with_register()\n if not pthread_self\n else pwndbg.gdblib.tls.find_address_with_pthread_self()\n )\n if pwndbg.gdblib.memory.is_readable_address(tls_base):\n print(message.success(\"Thread Local Storage (TLS) base: %#x\" % tls_base))\n print(message.success(\"TLS is located at:\"))\n print(message.notice(pwndbg.gdblib.vmmap.find(tls_base)))\n return\n print(message.error(\"Couldn't find Thread Local Storage (TLS) base.\"))\n if not pthread_self:\n print(\n message.notice(\n \"You can try to use -p/--pthread option to get the address of TLS by calling pthread_self().\\n\"\n \"(This might cause problems if the pthread_self() is not in libc or not initialized yet.)\"\n )\n )\n", "path": "pwndbg/commands/tls.py"}], "after_files": [{"content": "\"\"\"\nCommand to print the information of the current Thread Local Storage (TLS).\n\"\"\"\nfrom __future__ import annotations\n\nimport argparse\n\nimport gdb\nfrom tabulate import tabulate\n\nimport pwndbg.color.memory as M\nimport pwndbg.commands\nimport pwndbg.gdblib.tls\nfrom pwndbg.color import message\nfrom pwndbg.commands import CommandCategory\n\nparser = argparse.ArgumentParser(\n formatter_class=argparse.RawTextHelpFormatter,\n description=\"Print out base address of the current Thread Local Storage (TLS).\",\n)\n\nparser.add_argument(\n \"-p\",\n \"--pthread-self\",\n action=\"store_true\",\n default=False,\n help=\"Try to get the address of TLS by calling pthread_self().\",\n)\n\n\[email protected](parser, category=CommandCategory.LINUX)\[email protected]\[email protected]\ndef tls(pthread_self=False) -> None:\n tls_base = (\n pwndbg.gdblib.tls.find_address_with_register()\n if not pthread_self\n else pwndbg.gdblib.tls.find_address_with_pthread_self()\n )\n if pwndbg.gdblib.memory.is_readable_address(tls_base):\n print(message.success(\"Thread Local Storage (TLS) base: %#x\" % tls_base))\n print(message.success(\"TLS is located at:\"))\n print(message.notice(pwndbg.gdblib.vmmap.find(tls_base)))\n return\n print(message.error(\"Couldn't find Thread Local Storage (TLS) base.\"))\n if not pthread_self:\n print(\n message.notice(\n \"You can try to use -p/--pthread option to get the address of TLS by calling pthread_self().\\n\"\n \"(This might cause problems if the pthread_self() is not in libc or not initialized yet.)\"\n )\n )\n\n\nparser = argparse.ArgumentParser(\n formatter_class=argparse.RawTextHelpFormatter,\n description=\"List all threads belonging to the selected inferior.\",\n)\ngroup = parser.add_mutually_exclusive_group()\n\ngroup.add_argument(\n \"num_threads\",\n type=int,\n nargs=\"?\",\n default=None,\n help=\"Number of threads to display. 
Omit to display all threads.\",\n)\n\ngroup.add_argument(\n \"-c\",\n \"--config\",\n action=\"store_true\",\n dest=\"respect_config\",\n help=\"Respect context-max-threads config to limit number of threads displayed.\",\n)\n\n\[email protected](parser, category=CommandCategory.LINUX)\[email protected]\[email protected]\ndef threads(num_threads, respect_config) -> None:\n table = []\n headers = [\"global_num\", \"name\", \"status\", \"pc\", \"symbol\"]\n bold_green = lambda text: pwndbg.color.bold(pwndbg.color.green(text))\n\n try:\n original_thread = gdb.selected_thread()\n except SystemError:\n original_thread = None\n\n all_threads = gdb.selected_inferior().threads()[::-1]\n\n displayed_threads = []\n\n if original_thread is not None and original_thread.is_valid():\n displayed_threads.append(original_thread)\n\n for thread in all_threads:\n if respect_config and len(displayed_threads) >= int(\n pwndbg.commands.context.config_max_threads_display\n ):\n break\n elif num_threads is not None and len(displayed_threads) >= num_threads:\n break\n\n if thread.is_valid() and thread is not original_thread:\n displayed_threads.append(thread)\n\n for thread in displayed_threads:\n name = thread.name or \"\"\n\n if thread is original_thread:\n row = [\n bold_green(thread.global_num),\n bold_green(name),\n ]\n else:\n row = [\n thread.global_num,\n name,\n ]\n\n row.append(pwndbg.commands.context.get_thread_status(thread))\n\n if thread.is_stopped():\n thread.switch()\n pc = gdb.selected_frame().pc()\n\n pc_colored = M.get(pc)\n symbol = pwndbg.gdblib.symbol.get(pc)\n\n row.append(pc_colored)\n\n if symbol:\n if thread is original_thread:\n row.append(bold_green(symbol))\n else:\n row.append(symbol)\n\n table.append(row)\n\n if original_thread is not None and original_thread.is_valid():\n original_thread.switch()\n\n print(tabulate(table, headers))\n print(f\"\\nShowing {len(displayed_threads)} of {len(all_threads)} threads.\")\n", "path": "pwndbg/commands/tls.py"}]} | 766 | 781 |
gh_patches_debug_12510 | rasdani/github-patches | git_diff | web2py__web2py-2115 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python 3 compatibility fix
https://github.com/web2py/web2py/blob/master/gluon/contrib/webclient.py#L124 should be `data['_formname'] = list(self.forms.keys())[0]` to avoid `TypeError: 'dict_keys' object does not support indexing` when this code runs under Python 3.
--- END ISSUE ---
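For context, here is a minimal sketch (not part of the original report) of why the indexing fails on Python 3 and two equivalent workarounds; the `forms` dict below is a hypothetical stand-in for `self.forms`:

```python
# Python 2: dict.keys() returns a list, so keys()[0] works.
# Python 3: dict.keys() returns a non-indexable dict_keys view.
forms = {'login': 'abc123'}  # hypothetical stand-in for self.forms

try:
    first = forms.keys()[0]            # raises TypeError on Python 3
except TypeError:
    first = list(forms.keys())[0]      # workaround proposed in the issue
    # alternative that avoids building an intermediate list:
    first = next(iter(forms.keys()))

assert first == 'login'
```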
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gluon/contrib/webclient.py`
Content:
```
1 """
2 Developed by Massimo Di Pierro
3 Released under the web2py license (LGPL)
4
5 It an interface on top of urllib2 which simplifies scripting of http requests
6 mostly for testing purposes
7
8 - customizable
9 - supports basic auth
10 - supports cookies
11 - supports session cookies (tested with web2py sessions)
12 - detects broken session
13 - detects web2py form postbacks and handles formname and formkey
14 - detects web2py tickets
15
16 Some examples at the bottom.
17 """
18 from __future__ import print_function
19 from gluon._compat import urllib2, cookielib, iteritems, to_native, urlencode, to_bytes
20 import re
21 import time
22
23
24 DEFAULT_HEADERS = {
25 'user-agent': 'Mozilla/4.0', # some servers are picky
26 'accept-language': 'en',
27 }
28
29 FORM_REGEX = re.compile('(\<input name\="_formkey" type\="hidden" value\="(?P<formkey>.+?)" \/\>)?\<input name\="_formname" type\="hidden" value\="(?P<formname>.+?)" \/\>')
30
31 SESSION_REGEX = 'session_id_(?P<name>.+)'
32
33
34 class WebClient(object):
35
36 def __init__(self,
37 app='',
38 postbacks=True,
39 default_headers=DEFAULT_HEADERS,
40 session_regex=SESSION_REGEX):
41 self.app = app
42 self.postbacks = postbacks
43 self.forms = {}
44 self.history = []
45 self.cookies = {}
46 self.cookiejar = cookielib.CookieJar()
47 self.default_headers = default_headers
48 self.sessions = {}
49 self.session_regex = session_regex and re.compile(session_regex)
50 self.headers = {}
51
52 def _parse_headers_in_cookies(self):
53 self.cookies = {}
54 if 'set-cookie' in self.headers:
55 for item in self.headers['set-cookie'].split(','):
56 cookie = item[:item.find(';')]
57 pos = cookie.find('=')
58 key = cookie[:pos]
59 value = cookie[pos+1:]
60 self.cookies[key.strip()] = value.strip()
61
62 def get(self, url, cookies=None, headers=None, auth=None):
63 return self.post(url, data=None, cookies=cookies,
64 headers=headers, method='GET')
65
66 def post(self, url, data=None, cookies=None,
67 headers=None, auth=None, method='auto'):
68 self.url = self.app + url
69
70 # if this POST form requires a postback do it
71 if data and '_formname' in data and self.postbacks and \
72 self.history and self.history[-1][1] != self.url:
73 # to bypass the web2py CSRF need to get formkey
74 # before submitting the form
75 self.get(url, cookies=cookies, headers=headers, auth=auth)
76
77 # unless cookies are specified, recycle cookies
78 if cookies is None:
79 cookies = self.cookies
80 cookies = cookies or {}
81 headers = headers or {}
82
83 args = [
84 urllib2.HTTPCookieProcessor(self.cookiejar),
85 urllib2.HTTPHandler(debuglevel=0)
86 ]
87 # if required do basic auth
88 if auth:
89 auth_handler = urllib2.HTTPBasicAuthHandler()
90 auth_handler.add_password(**auth)
91 args.append(auth_handler)
92
93 opener = urllib2.build_opener(*args)
94
95 # copy headers from dict to list of key,value
96 headers_list = []
97 for key, value in iteritems(self.default_headers):
98 if not key in headers:
99 headers[key] = value
100 for key, value in iteritems(headers):
101 if isinstance(value, (list, tuple)):
102 for v in value:
103 headers_list.append((key, v))
104 else:
105 headers_list.append((key, value))
106
107 # move cookies to headers
108 for key, value in iteritems(cookies):
109 headers_list.append(('Cookie', '%s=%s' % (key, value)))
110
111 # add headers to request
112 for key, value in headers_list:
113 opener.addheaders.append((key, str(value)))
114
115 # assume everything is ok and make http request
116 error = None
117 try:
118 if isinstance(data, str):
119 self.method = 'POST' if method=='auto' else method
120 elif isinstance(data, dict):
121 self.method = 'POST' if method=='auto' else method
122 # if there is only one form, set _formname automatically
123 if not '_formname' in data and len(self.forms) == 1:
124 data['_formname'] = self.forms.keys()[0]
125
126 # if there is no formkey but it is known, set it
127 if '_formname' in data and not '_formkey' in data and \
128 data['_formname'] in self.forms:
129 data['_formkey'] = self.forms[data['_formname']]
130
131 # time the POST request
132 data = urlencode(data, doseq=True)
133 else:
134 self.method = 'GET' if method=='auto' else method
135 data = None
136 t0 = time.time()
137 self.response = opener.open(self.url, to_bytes(data))
138 self.time = time.time() - t0
139 except urllib2.HTTPError as er:
140 error = er
141 # catch HTTP errors
142 self.time = time.time() - t0
143 self.response = er
144
145 if hasattr(self.response, 'getcode'):
146 self.status = self.response.getcode()
147 else:#python2.5
148 self.status = None
149
150 self.text = to_native(self.response.read())
151 # In PY3 self.response.headers are case sensitive
152 self.headers = dict()
153 for h in self.response.headers:
154 self.headers[h.lower()] = self.response.headers[h]
155
156 # treat web2py tickets as special types of errors
157 if error is not None:
158 if 'web2py_error' in self.headers:
159 raise RuntimeError(self.headers['web2py_error'])
160 else:
161 raise error
162
163 self._parse_headers_in_cookies()
164
165 # check is a new session id has been issued, symptom of broken session
166 if self.session_regex is not None:
167 for cookie, value in iteritems(self.cookies):
168 match = self.session_regex.match(cookie)
169 if match:
170 name = match.group('name')
171 if name in self.sessions and self.sessions[name] != value:
172 print(RuntimeError('Changed session ID %s' % name))
173 self.sessions[name] = value
174
175 # find all forms and formkeys in page
176 self.forms = {}
177 for match in FORM_REGEX.finditer(to_native(self.text)):
178 self.forms[match.group('formname')] = match.group('formkey')
179
180 # log this request
181 self.history.append((self.method, self.url, self.status, self.time))
182
183
184 def test_web2py_registration_and_login():
185 # from gluon.contrib.webclient import WebClient
186 # start a web2py instance for testing
187
188 client = WebClient('http://127.0.0.1:8000/welcome/default/')
189 client.get('index')
190
191 # register
192 data = dict(first_name='Homer',
193 last_name='Simpson',
194 email='[email protected]',
195 password='test',
196 password_two='test',
197 _formname='register')
198 client.post('user/register', data=data)
199
200 # logout
201 client.get('user/logout')
202
203 # login
204 data = dict(email='[email protected]',
205 password='test',
206 _formname='login')
207 client.post('user/login', data=data)
208
209 # check registration and login were successful
210 client.get('user/profile')
211 assert 'Welcome Homer' in client.text
212
213 # print some variables
214 print('\nsessions:\n', client.sessions)
215 print('\nheaders:\n', client.headers)
216 print('\ncookies:\n', client.cookies)
217 print('\nforms:\n', client.forms)
218 print()
219 for method, url, status, t in client.history:
220 print(method, url, status, t)
221
222 if __name__ == '__main__':
223 test_web2py_registration_and_login()
224
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gluon/contrib/webclient.py b/gluon/contrib/webclient.py
--- a/gluon/contrib/webclient.py
+++ b/gluon/contrib/webclient.py
@@ -121,7 +121,7 @@
self.method = 'POST' if method=='auto' else method
# if there is only one form, set _formname automatically
if not '_formname' in data and len(self.forms) == 1:
- data['_formname'] = self.forms.keys()[0]
+ data['_formname'] = next(iter(self.forms.keys())) # Use the first key
# if there is no formkey but it is known, set it
if '_formname' in data and not '_formkey' in data and \
| {"golden_diff": "diff --git a/gluon/contrib/webclient.py b/gluon/contrib/webclient.py\n--- a/gluon/contrib/webclient.py\n+++ b/gluon/contrib/webclient.py\n@@ -121,7 +121,7 @@\n self.method = 'POST' if method=='auto' else method\n # if there is only one form, set _formname automatically\n if not '_formname' in data and len(self.forms) == 1:\n- data['_formname'] = self.forms.keys()[0]\n+ data['_formname'] = next(iter(self.forms.keys())) # Use the first key\n \n # if there is no formkey but it is known, set it\n if '_formname' in data and not '_formkey' in data and \\\n", "issue": "Python 3 compatibility fix\nhttps://github.com/web2py/web2py/blob/master/gluon/contrib/webclient.py#L124 should be `data['_formname'] = list(self.forms.keys())[0]` to avoid `TypeError: 'dict_keys' object does not support indexing` when this is executing under Python 3.\n", "before_files": [{"content": "\"\"\"\nDeveloped by Massimo Di Pierro\nReleased under the web2py license (LGPL)\n\nIt an interface on top of urllib2 which simplifies scripting of http requests\nmostly for testing purposes\n\n- customizable\n- supports basic auth\n- supports cookies\n- supports session cookies (tested with web2py sessions)\n- detects broken session\n- detects web2py form postbacks and handles formname and formkey\n- detects web2py tickets\n\nSome examples at the bottom.\n\"\"\"\nfrom __future__ import print_function\nfrom gluon._compat import urllib2, cookielib, iteritems, to_native, urlencode, to_bytes\nimport re\nimport time\n\n\nDEFAULT_HEADERS = {\n 'user-agent': 'Mozilla/4.0', # some servers are picky\n 'accept-language': 'en',\n}\n\nFORM_REGEX = re.compile('(\\<input name\\=\"_formkey\" type\\=\"hidden\" value\\=\"(?P<formkey>.+?)\" \\/\\>)?\\<input name\\=\"_formname\" type\\=\"hidden\" value\\=\"(?P<formname>.+?)\" \\/\\>')\n\nSESSION_REGEX = 'session_id_(?P<name>.+)'\n\n\nclass WebClient(object):\n\n def __init__(self,\n app='',\n postbacks=True,\n default_headers=DEFAULT_HEADERS,\n session_regex=SESSION_REGEX):\n self.app = app\n self.postbacks = postbacks\n self.forms = {}\n self.history = []\n self.cookies = {}\n self.cookiejar = cookielib.CookieJar()\n self.default_headers = default_headers\n self.sessions = {}\n self.session_regex = session_regex and re.compile(session_regex)\n self.headers = {}\n\n def _parse_headers_in_cookies(self):\n self.cookies = {}\n if 'set-cookie' in self.headers:\n for item in self.headers['set-cookie'].split(','):\n cookie = item[:item.find(';')]\n pos = cookie.find('=')\n key = cookie[:pos]\n value = cookie[pos+1:]\n self.cookies[key.strip()] = value.strip()\n\n def get(self, url, cookies=None, headers=None, auth=None):\n return self.post(url, data=None, cookies=cookies,\n headers=headers, method='GET')\n\n def post(self, url, data=None, cookies=None,\n headers=None, auth=None, method='auto'):\n self.url = self.app + url\n\n # if this POST form requires a postback do it\n if data and '_formname' in data and self.postbacks and \\\n self.history and self.history[-1][1] != self.url:\n # to bypass the web2py CSRF need to get formkey\n # before submitting the form\n self.get(url, cookies=cookies, headers=headers, auth=auth)\n\n # unless cookies are specified, recycle cookies\n if cookies is None:\n cookies = self.cookies\n cookies = cookies or {}\n headers = headers or {}\n\n args = [\n urllib2.HTTPCookieProcessor(self.cookiejar),\n urllib2.HTTPHandler(debuglevel=0)\n ]\n # if required do basic auth\n if auth:\n auth_handler = urllib2.HTTPBasicAuthHandler()\n 
auth_handler.add_password(**auth)\n args.append(auth_handler)\n\n opener = urllib2.build_opener(*args)\n\n # copy headers from dict to list of key,value\n headers_list = []\n for key, value in iteritems(self.default_headers):\n if not key in headers:\n headers[key] = value\n for key, value in iteritems(headers):\n if isinstance(value, (list, tuple)):\n for v in value:\n headers_list.append((key, v))\n else:\n headers_list.append((key, value))\n\n # move cookies to headers\n for key, value in iteritems(cookies):\n headers_list.append(('Cookie', '%s=%s' % (key, value)))\n\n # add headers to request\n for key, value in headers_list:\n opener.addheaders.append((key, str(value)))\n\n # assume everything is ok and make http request\n error = None\n try:\n if isinstance(data, str):\n self.method = 'POST' if method=='auto' else method\n elif isinstance(data, dict):\n self.method = 'POST' if method=='auto' else method\n # if there is only one form, set _formname automatically\n if not '_formname' in data and len(self.forms) == 1:\n data['_formname'] = self.forms.keys()[0]\n\n # if there is no formkey but it is known, set it\n if '_formname' in data and not '_formkey' in data and \\\n data['_formname'] in self.forms:\n data['_formkey'] = self.forms[data['_formname']]\n\n # time the POST request\n data = urlencode(data, doseq=True)\n else:\n self.method = 'GET' if method=='auto' else method\n data = None\n t0 = time.time()\n self.response = opener.open(self.url, to_bytes(data))\n self.time = time.time() - t0\n except urllib2.HTTPError as er:\n error = er\n # catch HTTP errors\n self.time = time.time() - t0\n self.response = er\n\n if hasattr(self.response, 'getcode'):\n self.status = self.response.getcode()\n else:#python2.5\n self.status = None\n\n self.text = to_native(self.response.read())\n # In PY3 self.response.headers are case sensitive\n self.headers = dict()\n for h in self.response.headers:\n self.headers[h.lower()] = self.response.headers[h]\n\n # treat web2py tickets as special types of errors\n if error is not None:\n if 'web2py_error' in self.headers:\n raise RuntimeError(self.headers['web2py_error'])\n else:\n raise error\n\n self._parse_headers_in_cookies()\n\n # check is a new session id has been issued, symptom of broken session\n if self.session_regex is not None:\n for cookie, value in iteritems(self.cookies):\n match = self.session_regex.match(cookie)\n if match:\n name = match.group('name')\n if name in self.sessions and self.sessions[name] != value:\n print(RuntimeError('Changed session ID %s' % name))\n self.sessions[name] = value\n\n # find all forms and formkeys in page\n self.forms = {}\n for match in FORM_REGEX.finditer(to_native(self.text)):\n self.forms[match.group('formname')] = match.group('formkey')\n\n # log this request\n self.history.append((self.method, self.url, self.status, self.time))\n\n\ndef test_web2py_registration_and_login():\n # from gluon.contrib.webclient import WebClient\n # start a web2py instance for testing\n\n client = WebClient('http://127.0.0.1:8000/welcome/default/')\n client.get('index')\n\n # register\n data = dict(first_name='Homer',\n last_name='Simpson',\n email='[email protected]',\n password='test',\n password_two='test',\n _formname='register')\n client.post('user/register', data=data)\n\n # logout\n client.get('user/logout')\n\n # login\n data = dict(email='[email protected]',\n password='test',\n _formname='login')\n client.post('user/login', data=data)\n\n # check registration and login were successful\n client.get('user/profile')\n 
assert 'Welcome Homer' in client.text\n\n # print some variables\n print('\\nsessions:\\n', client.sessions)\n print('\\nheaders:\\n', client.headers)\n print('\\ncookies:\\n', client.cookies)\n print('\\nforms:\\n', client.forms)\n print()\n for method, url, status, t in client.history:\n print(method, url, status, t)\n\nif __name__ == '__main__':\n test_web2py_registration_and_login()\n", "path": "gluon/contrib/webclient.py"}], "after_files": [{"content": "\"\"\"\nDeveloped by Massimo Di Pierro\nReleased under the web2py license (LGPL)\n\nIt an interface on top of urllib2 which simplifies scripting of http requests\nmostly for testing purposes\n\n- customizable\n- supports basic auth\n- supports cookies\n- supports session cookies (tested with web2py sessions)\n- detects broken session\n- detects web2py form postbacks and handles formname and formkey\n- detects web2py tickets\n\nSome examples at the bottom.\n\"\"\"\nfrom __future__ import print_function\nfrom gluon._compat import urllib2, cookielib, iteritems, to_native, urlencode, to_bytes\nimport re\nimport time\n\n\nDEFAULT_HEADERS = {\n 'user-agent': 'Mozilla/4.0', # some servers are picky\n 'accept-language': 'en',\n}\n\nFORM_REGEX = re.compile('(\\<input name\\=\"_formkey\" type\\=\"hidden\" value\\=\"(?P<formkey>.+?)\" \\/\\>)?\\<input name\\=\"_formname\" type\\=\"hidden\" value\\=\"(?P<formname>.+?)\" \\/\\>')\n\nSESSION_REGEX = 'session_id_(?P<name>.+)'\n\n\nclass WebClient(object):\n\n def __init__(self,\n app='',\n postbacks=True,\n default_headers=DEFAULT_HEADERS,\n session_regex=SESSION_REGEX):\n self.app = app\n self.postbacks = postbacks\n self.forms = {}\n self.history = []\n self.cookies = {}\n self.cookiejar = cookielib.CookieJar()\n self.default_headers = default_headers\n self.sessions = {}\n self.session_regex = session_regex and re.compile(session_regex)\n self.headers = {}\n\n def _parse_headers_in_cookies(self):\n self.cookies = {}\n if 'set-cookie' in self.headers:\n for item in self.headers['set-cookie'].split(','):\n cookie = item[:item.find(';')]\n pos = cookie.find('=')\n key = cookie[:pos]\n value = cookie[pos+1:]\n self.cookies[key.strip()] = value.strip()\n\n def get(self, url, cookies=None, headers=None, auth=None):\n return self.post(url, data=None, cookies=cookies,\n headers=headers, method='GET')\n\n def post(self, url, data=None, cookies=None,\n headers=None, auth=None, method='auto'):\n self.url = self.app + url\n\n # if this POST form requires a postback do it\n if data and '_formname' in data and self.postbacks and \\\n self.history and self.history[-1][1] != self.url:\n # to bypass the web2py CSRF need to get formkey\n # before submitting the form\n self.get(url, cookies=cookies, headers=headers, auth=auth)\n\n # unless cookies are specified, recycle cookies\n if cookies is None:\n cookies = self.cookies\n cookies = cookies or {}\n headers = headers or {}\n\n args = [\n urllib2.HTTPCookieProcessor(self.cookiejar),\n urllib2.HTTPHandler(debuglevel=0)\n ]\n # if required do basic auth\n if auth:\n auth_handler = urllib2.HTTPBasicAuthHandler()\n auth_handler.add_password(**auth)\n args.append(auth_handler)\n\n opener = urllib2.build_opener(*args)\n\n # copy headers from dict to list of key,value\n headers_list = []\n for key, value in iteritems(self.default_headers):\n if not key in headers:\n headers[key] = value\n for key, value in iteritems(headers):\n if isinstance(value, (list, tuple)):\n for v in value:\n headers_list.append((key, v))\n else:\n headers_list.append((key, value))\n\n # move 
cookies to headers\n for key, value in iteritems(cookies):\n headers_list.append(('Cookie', '%s=%s' % (key, value)))\n\n # add headers to request\n for key, value in headers_list:\n opener.addheaders.append((key, str(value)))\n\n # assume everything is ok and make http request\n error = None\n try:\n if isinstance(data, str):\n self.method = 'POST' if method=='auto' else method\n elif isinstance(data, dict):\n self.method = 'POST' if method=='auto' else method\n # if there is only one form, set _formname automatically\n if not '_formname' in data and len(self.forms) == 1:\n data['_formname'] = next(iter(self.forms.keys())) # Use the first key\n\n # if there is no formkey but it is known, set it\n if '_formname' in data and not '_formkey' in data and \\\n data['_formname'] in self.forms:\n data['_formkey'] = self.forms[data['_formname']]\n\n # time the POST request\n data = urlencode(data, doseq=True)\n else:\n self.method = 'GET' if method=='auto' else method\n data = None\n t0 = time.time()\n self.response = opener.open(self.url, to_bytes(data))\n self.time = time.time() - t0\n except urllib2.HTTPError as er:\n error = er\n # catch HTTP errors\n self.time = time.time() - t0\n self.response = er\n\n if hasattr(self.response, 'getcode'):\n self.status = self.response.getcode()\n else:#python2.5\n self.status = None\n\n self.text = to_native(self.response.read())\n # In PY3 self.response.headers are case sensitive\n self.headers = dict()\n for h in self.response.headers:\n self.headers[h.lower()] = self.response.headers[h]\n\n # treat web2py tickets as special types of errors\n if error is not None:\n if 'web2py_error' in self.headers:\n raise RuntimeError(self.headers['web2py_error'])\n else:\n raise error\n\n self._parse_headers_in_cookies()\n\n # check is a new session id has been issued, symptom of broken session\n if self.session_regex is not None:\n for cookie, value in iteritems(self.cookies):\n match = self.session_regex.match(cookie)\n if match:\n name = match.group('name')\n if name in self.sessions and self.sessions[name] != value:\n print(RuntimeError('Changed session ID %s' % name))\n self.sessions[name] = value\n\n # find all forms and formkeys in page\n self.forms = {}\n for match in FORM_REGEX.finditer(to_native(self.text)):\n self.forms[match.group('formname')] = match.group('formkey')\n\n # log this request\n self.history.append((self.method, self.url, self.status, self.time))\n\n\ndef test_web2py_registration_and_login():\n # from gluon.contrib.webclient import WebClient\n # start a web2py instance for testing\n\n client = WebClient('http://127.0.0.1:8000/welcome/default/')\n client.get('index')\n\n # register\n data = dict(first_name='Homer',\n last_name='Simpson',\n email='[email protected]',\n password='test',\n password_two='test',\n _formname='register')\n client.post('user/register', data=data)\n\n # logout\n client.get('user/logout')\n\n # login\n data = dict(email='[email protected]',\n password='test',\n _formname='login')\n client.post('user/login', data=data)\n\n # check registration and login were successful\n client.get('user/profile')\n assert 'Welcome Homer' in client.text\n\n # print some variables\n print('\\nsessions:\\n', client.sessions)\n print('\\nheaders:\\n', client.headers)\n print('\\ncookies:\\n', client.cookies)\n print('\\nforms:\\n', client.forms)\n print()\n for method, url, status, t in client.history:\n print(method, url, status, t)\n\nif __name__ == '__main__':\n test_web2py_registration_and_login()\n", "path": 
"gluon/contrib/webclient.py"}]} | 2,647 | 174 |
gh_patches_debug_36469 | rasdani/github-patches | git_diff | rlworkgroup__garage-861 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade pycma so that garage can install without numpy
The current version of pycma we use requires numpy to run its setup.py. This breaks most automated install processes. Later versions of pycma don't have this problem, but they have an API change which requires updating `garage.np.algos.cmaes`.
--- END ISSUE ---
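As background (an assumption-laden sketch, not taken from the issue), the pycma 2.x calling convention differs from 1.x roughly as follows: the population size moves into the options dict, `ask()` no longer takes a sample count, and the best solution is read from `es.best` rather than from an `es.result()` method call. The toy objective below is invented purely for illustration:

```python
import cma
import numpy as np

n_samples = 8
# cma >= 2.x: pass popsize via the options dict instead of ask(n).
es = cma.CMAEvolutionStrategy([0.5] * 5, 1.0, {'popsize': n_samples})

params = es.ask()                                        # popsize candidate vectors
returns = [-float(np.sum(np.square(p))) for p in params]  # toy "return" to maximize
es.tell(params, [-r for r in returns])                   # cma minimizes, so negate
best_params = es.best.get()[0]                           # best parameter vector so far
```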
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """setuptools based setup module."""
2 from setuptools import find_packages
3 from setuptools import setup
4
5 try:
6 # pylint: disable=unused-import
7 import numpy # noqa: F401
8 except ImportError:
9 raise RuntimeError(
10 'garage requires numpy in the environment to install. '
11 'Please install numpy==1.14.5 and try again. See '
12 'https://github.com/rlworkgroup/garage/issues/800 for more info.')
13
14 TF_VERSION = '<1.15,>=1.14.0'
15 GYM_VERSION = '==0.12.4'
16
17 # Required dependencies
18 required = [
19 # Please keep alphabetized
20 'akro==0.0.6',
21 'cached_property',
22 'click',
23 'cloudpickle',
24 'cma==1.1.06',
25 'dowel==0.0.2',
26 'gym[atari,box2d,classic_control]' + GYM_VERSION,
27 'joblib<0.13,>=0.12',
28 'matplotlib',
29 'numpy==1.14.5',
30 'psutil',
31 # Pyglet 1.4.0 introduces some api change which breaks some
32 # gym environments
33 # See: https://github.com/openai/gym/issues/1588
34 'pyglet<1.4.0,>=1.3.0',
35 'pyprind',
36 'python-dateutil',
37 'torch==1.1.0',
38 'ray',
39 'scikit-image',
40 'scipy',
41 'tensorflow' + TF_VERSION,
42 'tensorflow-probability<0.8.0,>=0.7.0', # for tensorflow 1.12
43 'torchvision==0.3.0'
44 ]
45
46 # Dependencies for optional features
47 extras = {}
48
49 extras['mujoco'] = [
50 'mujoco-py<2.1,>=2.0',
51 'gym[all]' + GYM_VERSION,
52 ]
53
54 extras['dm_control'] = [
55 # dm_control throws an error during install about not being able to
56 # find a build dependency (absl-py). Later pip executes the `install`
57 # command again and the install succeeds because absl-py has been
58 # installed. This is stupid, but harmless.
59 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # noqa: E501
60 ]
61
62 extras['all'] = list(set(sum(extras.values(), [])))
63
64 # dependencies for using gpu, not included in all
65 extras['gpu'] = ['tensorflow-gpu' + TF_VERSION]
66
67 # Development dependencies (*not* included in "all")
68 extras['dev'] = [
69 # Please keep alphabetized
70 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # noqa: E501
71 'flake8',
72 'flake8-docstrings==1.3.0',
73 'flake8-import-order',
74 'pandas',
75 'pep8-naming==0.7.0',
76 'pre-commit',
77 # pydocstyle 4.0.0 breaks flake8-docstrings 1.3.0
78 # See https://gitlab.com/pycqa/flake8-docstrings/issues/36
79 'pydocstyle<4.0.0',
80 'pylint==1.9.2',
81 'pytest>=3.6', # Required for pytest-cov on Python 3.6
82 'pytest-cov',
83 'pytest-xdist',
84 'recommonmark',
85 'sphinx',
86 'sphinx_rtd_theme',
87 'yapf==0.28.0',
88 ]
89
90 with open('README.md') as f:
91 readme = f.read()
92
93 # Get the package version dynamically
94 with open('VERSION') as v:
95 version = v.read().strip()
96
97 setup(
98 name='garage',
99 version=version,
100 author='Reinforcement Learning Working Group',
101 description='A framework for reproducible reinforcement learning research',
102 url='https://github.com/rlworkgroup/garage',
103 packages=find_packages(where='src'),
104 package_dir={'': 'src'},
105 scripts=['scripts/garage'],
106 python_requires='>=3.5',
107 install_requires=required,
108 extras_require=extras,
109 license='MIT',
110 long_description=readme,
111 long_description_content_type='text/markdown',
112 classifiers=[
113 'Development Status :: 4 - Beta',
114 'Intended Audience :: Developers',
115 'Intended Audience :: Education',
116 'Intended Audience :: Science/Research',
117 'License :: OSI Approved :: MIT License',
118 'Programming Language :: Python :: 3.5',
119 'Programming Language :: Python :: 3.6',
120 'Programming Language :: Python :: 3.7',
121 'Programming Language :: Python :: 3 :: Only',
122 'Topic :: Scientific/Engineering :: Artificial Intelligence',
123 'Topic :: Scientific/Engineering :: Mathematics',
124 'Topic :: Software Development :: Libraries',
125 ],
126 )
127
```
Path: `src/garage/np/algos/cma_es.py`
Content:
```
1 """Covariance Matrix Adaptation Evolution Strategy."""
2 import cma
3 from dowel import logger, tabular
4 import numpy as np
5
6 from garage.np.algos import BatchPolopt
7
8
9 class CMAES(BatchPolopt):
10 """Covariance Matrix Adaptation Evolution Strategy.
11
12 Note:
13 The CMA-ES method can hardly learn a successful policy even for
14 simple task. It is still maintained here only for consistency with
15 original rllab paper.
16
17 Args:
18 env_spec (garage.envs.EnvSpec): Environment specification.
19 policy (garage.np.policies.Policy): Action policy.
20 baseline (garage.np.baselines.Baseline): Baseline for GAE
21 (Generalized Advantage Estimation).
22 n_samples (int): Number of policies sampled in one epoch.
23 discount (float): Environment reward discount.
24 max_path_length (int): Maximum length of a single rollout.
25 sigma0 (float): Initial std for param distribution.
26
27 """
28
29 def __init__(self,
30 env_spec,
31 policy,
32 baseline,
33 n_samples,
34 discount=0.99,
35 max_path_length=500,
36 sigma0=1.):
37 super().__init__(policy, baseline, discount, max_path_length,
38 n_samples)
39 self.env_spec = env_spec
40 self.policy = policy
41
42 self.sigma0 = sigma0
43
44 def _sample_params(self):
45 return self.es.ask(self.n_samples)
46
47 def train(self, runner):
48 """Initialize variables and start training.
49
50 Args:
51 runner (LocalRunner): LocalRunner is passed to give algorithm
52 the access to runner.step_epochs(), which provides services
53 such as snapshotting and sampler control.
54
55 Returns:
56 The average return in last epoch cycle.
57
58 """
59 init_mean = self.policy.get_param_values()
60 self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0)
61 self.all_params = self._sample_params()
62 self.cur_params = self.all_params[0]
63 self.policy.set_param_values(self.cur_params)
64 self.all_returns = []
65
66 return super().train(runner)
67
68 def train_once(self, itr, paths):
69 """Perform one step of policy optimization given one batch of samples.
70
71 Args:
72 itr (int): Iteration number.
73 paths (list[dict]): A list of collected paths.
74
75 """
76 paths = self.process_samples(itr, paths)
77
78 epoch = itr // self.n_samples
79 i_sample = itr - epoch * self.n_samples
80
81 tabular.record('Epoch', epoch)
82 tabular.record('# Sample', i_sample)
83
84 rtn = paths['average_return']
85 self.all_returns.append(paths['average_return'])
86
87 if (itr + 1) % self.n_samples == 0:
88 avg_rtns = np.array(self.all_returns)
89 self.es.tell(self.all_params, -avg_rtns)
90 self.policy.set_param_values(self.es.result()[0])
91
92 # Clear for next epoch
93 rtn = max(self.all_returns)
94 self.all_returns.clear()
95 self.all_params = self._sample_params()
96
97 self.cur_params = self.all_params[(i_sample + 1) % self.n_samples]
98 self.policy.set_param_values(self.cur_params)
99
100 logger.log(tabular)
101 return rtn
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -2,15 +2,6 @@
from setuptools import find_packages
from setuptools import setup
-try:
- # pylint: disable=unused-import
- import numpy # noqa: F401
-except ImportError:
- raise RuntimeError(
- 'garage requires numpy in the environment to install. '
- 'Please install numpy==1.14.5 and try again. See '
- 'https://github.com/rlworkgroup/garage/issues/800 for more info.')
-
TF_VERSION = '<1.15,>=1.14.0'
GYM_VERSION = '==0.12.4'
@@ -21,12 +12,12 @@
'cached_property',
'click',
'cloudpickle',
- 'cma==1.1.06',
+ 'cma==2.7.0',
'dowel==0.0.2',
'gym[atari,box2d,classic_control]' + GYM_VERSION,
'joblib<0.13,>=0.12',
'matplotlib',
- 'numpy==1.14.5',
+ 'numpy>=1.14.5',
'psutil',
# Pyglet 1.4.0 introduces some api change which breaks some
# gym environments
diff --git a/src/garage/np/algos/cma_es.py b/src/garage/np/algos/cma_es.py
--- a/src/garage/np/algos/cma_es.py
+++ b/src/garage/np/algos/cma_es.py
@@ -42,7 +42,7 @@
self.sigma0 = sigma0
def _sample_params(self):
- return self.es.ask(self.n_samples)
+ return self.es.ask()
def train(self, runner):
"""Initialize variables and start training.
@@ -57,7 +57,8 @@
"""
init_mean = self.policy.get_param_values()
- self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0)
+ self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0,
+ {'popsize': self.n_samples})
self.all_params = self._sample_params()
self.cur_params = self.all_params[0]
self.policy.set_param_values(self.cur_params)
@@ -87,7 +88,7 @@
if (itr + 1) % self.n_samples == 0:
avg_rtns = np.array(self.all_returns)
self.es.tell(self.all_params, -avg_rtns)
- self.policy.set_param_values(self.es.result()[0])
+ self.policy.set_param_values(self.es.best.get()[0])
# Clear for next epoch
rtn = max(self.all_returns)
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -2,15 +2,6 @@\n from setuptools import find_packages\n from setuptools import setup\n \n-try:\n- # pylint: disable=unused-import\n- import numpy # noqa: F401\n-except ImportError:\n- raise RuntimeError(\n- 'garage requires numpy in the environment to install. '\n- 'Please install numpy==1.14.5 and try again. See '\n- 'https://github.com/rlworkgroup/garage/issues/800 for more info.')\n-\n TF_VERSION = '<1.15,>=1.14.0'\n GYM_VERSION = '==0.12.4'\n \n@@ -21,12 +12,12 @@\n 'cached_property',\n 'click',\n 'cloudpickle',\n- 'cma==1.1.06',\n+ 'cma==2.7.0',\n 'dowel==0.0.2',\n 'gym[atari,box2d,classic_control]' + GYM_VERSION,\n 'joblib<0.13,>=0.12',\n 'matplotlib',\n- 'numpy==1.14.5',\n+ 'numpy>=1.14.5',\n 'psutil',\n # Pyglet 1.4.0 introduces some api change which breaks some\n # gym environments\ndiff --git a/src/garage/np/algos/cma_es.py b/src/garage/np/algos/cma_es.py\n--- a/src/garage/np/algos/cma_es.py\n+++ b/src/garage/np/algos/cma_es.py\n@@ -42,7 +42,7 @@\n self.sigma0 = sigma0\n \n def _sample_params(self):\n- return self.es.ask(self.n_samples)\n+ return self.es.ask()\n \n def train(self, runner):\n \"\"\"Initialize variables and start training.\n@@ -57,7 +57,8 @@\n \n \"\"\"\n init_mean = self.policy.get_param_values()\n- self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0)\n+ self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0,\n+ {'popsize': self.n_samples})\n self.all_params = self._sample_params()\n self.cur_params = self.all_params[0]\n self.policy.set_param_values(self.cur_params)\n@@ -87,7 +88,7 @@\n if (itr + 1) % self.n_samples == 0:\n avg_rtns = np.array(self.all_returns)\n self.es.tell(self.all_params, -avg_rtns)\n- self.policy.set_param_values(self.es.result()[0])\n+ self.policy.set_param_values(self.es.best.get()[0])\n \n # Clear for next epoch\n rtn = max(self.all_returns)\n", "issue": "Upgrade pycma so that garage can install without numpy\nThe current version of pycma we use requires numpy to run its setup.py. This breaks most automated install processes. Later versions of pycma don't have this problem, but they have an API change which requires updating `garage.np.algos.cmaes`.\n", "before_files": [{"content": "\"\"\"setuptools based setup module.\"\"\"\nfrom setuptools import find_packages\nfrom setuptools import setup\n\ntry:\n # pylint: disable=unused-import\n import numpy # noqa: F401\nexcept ImportError:\n raise RuntimeError(\n 'garage requires numpy in the environment to install. '\n 'Please install numpy==1.14.5 and try again. 
See '\n 'https://github.com/rlworkgroup/garage/issues/800 for more info.')\n\nTF_VERSION = '<1.15,>=1.14.0'\nGYM_VERSION = '==0.12.4'\n\n# Required dependencies\nrequired = [\n # Please keep alphabetized\n 'akro==0.0.6',\n 'cached_property',\n 'click',\n 'cloudpickle',\n 'cma==1.1.06',\n 'dowel==0.0.2',\n 'gym[atari,box2d,classic_control]' + GYM_VERSION,\n 'joblib<0.13,>=0.12',\n 'matplotlib',\n 'numpy==1.14.5',\n 'psutil',\n # Pyglet 1.4.0 introduces some api change which breaks some\n # gym environments\n # See: https://github.com/openai/gym/issues/1588\n 'pyglet<1.4.0,>=1.3.0',\n 'pyprind',\n 'python-dateutil',\n 'torch==1.1.0',\n 'ray',\n 'scikit-image',\n 'scipy',\n 'tensorflow' + TF_VERSION,\n 'tensorflow-probability<0.8.0,>=0.7.0', # for tensorflow 1.12\n 'torchvision==0.3.0'\n]\n\n# Dependencies for optional features\nextras = {}\n\nextras['mujoco'] = [\n 'mujoco-py<2.1,>=2.0',\n 'gym[all]' + GYM_VERSION,\n]\n\nextras['dm_control'] = [\n # dm_control throws an error during install about not being able to\n # find a build dependency (absl-py). Later pip executes the `install`\n # command again and the install succeeds because absl-py has been\n # installed. This is stupid, but harmless.\n 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # noqa: E501\n]\n\nextras['all'] = list(set(sum(extras.values(), [])))\n\n# dependencies for using gpu, not included in all\nextras['gpu'] = ['tensorflow-gpu' + TF_VERSION]\n\n# Development dependencies (*not* included in \"all\")\nextras['dev'] = [\n # Please keep alphabetized\n 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # noqa: E501\n 'flake8',\n 'flake8-docstrings==1.3.0',\n 'flake8-import-order',\n 'pandas',\n 'pep8-naming==0.7.0',\n 'pre-commit',\n # pydocstyle 4.0.0 breaks flake8-docstrings 1.3.0\n # See https://gitlab.com/pycqa/flake8-docstrings/issues/36\n 'pydocstyle<4.0.0',\n 'pylint==1.9.2',\n 'pytest>=3.6', # Required for pytest-cov on Python 3.6\n 'pytest-cov',\n 'pytest-xdist',\n 'recommonmark',\n 'sphinx',\n 'sphinx_rtd_theme',\n 'yapf==0.28.0',\n]\n\nwith open('README.md') as f:\n readme = f.read()\n\n# Get the package version dynamically\nwith open('VERSION') as v:\n version = v.read().strip()\n\nsetup(\n name='garage',\n version=version,\n author='Reinforcement Learning Working Group',\n description='A framework for reproducible reinforcement learning research',\n url='https://github.com/rlworkgroup/garage',\n packages=find_packages(where='src'),\n package_dir={'': 'src'},\n scripts=['scripts/garage'],\n python_requires='>=3.5',\n install_requires=required,\n extras_require=extras,\n license='MIT',\n long_description=readme,\n long_description_content_type='text/markdown',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Scientific/Engineering :: Mathematics',\n 'Topic :: Software Development :: Libraries',\n ],\n)\n", "path": "setup.py"}, {"content": "\"\"\"Covariance Matrix Adaptation Evolution Strategy.\"\"\"\nimport cma\nfrom dowel import logger, tabular\nimport numpy as np\n\nfrom 
garage.np.algos import BatchPolopt\n\n\nclass CMAES(BatchPolopt):\n \"\"\"Covariance Matrix Adaptation Evolution Strategy.\n\n Note:\n The CMA-ES method can hardly learn a successful policy even for\n simple task. It is still maintained here only for consistency with\n original rllab paper.\n\n Args:\n env_spec (garage.envs.EnvSpec): Environment specification.\n policy (garage.np.policies.Policy): Action policy.\n baseline (garage.np.baselines.Baseline): Baseline for GAE\n (Generalized Advantage Estimation).\n n_samples (int): Number of policies sampled in one epoch.\n discount (float): Environment reward discount.\n max_path_length (int): Maximum length of a single rollout.\n sigma0 (float): Initial std for param distribution.\n\n \"\"\"\n\n def __init__(self,\n env_spec,\n policy,\n baseline,\n n_samples,\n discount=0.99,\n max_path_length=500,\n sigma0=1.):\n super().__init__(policy, baseline, discount, max_path_length,\n n_samples)\n self.env_spec = env_spec\n self.policy = policy\n\n self.sigma0 = sigma0\n\n def _sample_params(self):\n return self.es.ask(self.n_samples)\n\n def train(self, runner):\n \"\"\"Initialize variables and start training.\n\n Args:\n runner (LocalRunner): LocalRunner is passed to give algorithm\n the access to runner.step_epochs(), which provides services\n such as snapshotting and sampler control.\n\n Returns:\n The average return in last epoch cycle.\n\n \"\"\"\n init_mean = self.policy.get_param_values()\n self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0)\n self.all_params = self._sample_params()\n self.cur_params = self.all_params[0]\n self.policy.set_param_values(self.cur_params)\n self.all_returns = []\n\n return super().train(runner)\n\n def train_once(self, itr, paths):\n \"\"\"Perform one step of policy optimization given one batch of samples.\n\n Args:\n itr (int): Iteration number.\n paths (list[dict]): A list of collected paths.\n\n \"\"\"\n paths = self.process_samples(itr, paths)\n\n epoch = itr // self.n_samples\n i_sample = itr - epoch * self.n_samples\n\n tabular.record('Epoch', epoch)\n tabular.record('# Sample', i_sample)\n\n rtn = paths['average_return']\n self.all_returns.append(paths['average_return'])\n\n if (itr + 1) % self.n_samples == 0:\n avg_rtns = np.array(self.all_returns)\n self.es.tell(self.all_params, -avg_rtns)\n self.policy.set_param_values(self.es.result()[0])\n\n # Clear for next epoch\n rtn = max(self.all_returns)\n self.all_returns.clear()\n self.all_params = self._sample_params()\n\n self.cur_params = self.all_params[(i_sample + 1) % self.n_samples]\n self.policy.set_param_values(self.cur_params)\n\n logger.log(tabular)\n return rtn\n", "path": "src/garage/np/algos/cma_es.py"}], "after_files": [{"content": "\"\"\"setuptools based setup module.\"\"\"\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nTF_VERSION = '<1.15,>=1.14.0'\nGYM_VERSION = '==0.12.4'\n\n# Required dependencies\nrequired = [\n # Please keep alphabetized\n 'akro==0.0.6',\n 'cached_property',\n 'click',\n 'cloudpickle',\n 'cma==2.7.0',\n 'dowel==0.0.2',\n 'gym[atari,box2d,classic_control]' + GYM_VERSION,\n 'joblib<0.13,>=0.12',\n 'matplotlib',\n 'numpy>=1.14.5',\n 'psutil',\n # Pyglet 1.4.0 introduces some api change which breaks some\n # gym environments\n # See: https://github.com/openai/gym/issues/1588\n 'pyglet<1.4.0,>=1.3.0',\n 'pyprind',\n 'python-dateutil',\n 'torch==1.1.0',\n 'ray',\n 'scikit-image',\n 'scipy',\n 'tensorflow' + TF_VERSION,\n 'tensorflow-probability<0.8.0,>=0.7.0', # for tensorflow 1.12\n 
'torchvision==0.3.0'\n]\n\n# Dependencies for optional features\nextras = {}\n\nextras['mujoco'] = [\n 'mujoco-py<2.1,>=2.0',\n 'gym[all]' + GYM_VERSION,\n]\n\nextras['dm_control'] = [\n # dm_control throws an error during install about not being able to\n # find a build dependency (absl-py). Later pip executes the `install`\n # command again and the install succeeds because absl-py has been\n # installed. This is stupid, but harmless.\n 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # noqa: E501\n]\n\nextras['all'] = list(set(sum(extras.values(), [])))\n\n# dependencies for using gpu, not included in all\nextras['gpu'] = ['tensorflow-gpu' + TF_VERSION]\n\n# Development dependencies (*not* included in \"all\")\nextras['dev'] = [\n # Please keep alphabetized\n 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # noqa: E501\n 'flake8',\n 'flake8-docstrings==1.3.0',\n 'flake8-import-order',\n 'pandas',\n 'pep8-naming==0.7.0',\n 'pre-commit',\n # pydocstyle 4.0.0 breaks flake8-docstrings 1.3.0\n # See https://gitlab.com/pycqa/flake8-docstrings/issues/36\n 'pydocstyle<4.0.0',\n 'pylint==1.9.2',\n 'pytest>=3.6', # Required for pytest-cov on Python 3.6\n 'pytest-cov',\n 'pytest-xdist',\n 'recommonmark',\n 'sphinx',\n 'sphinx_rtd_theme',\n 'yapf==0.28.0',\n]\n\nwith open('README.md') as f:\n readme = f.read()\n\n# Get the package version dynamically\nwith open('VERSION') as v:\n version = v.read().strip()\n\nsetup(\n name='garage',\n version=version,\n author='Reinforcement Learning Working Group',\n description='A framework for reproducible reinforcement learning research',\n url='https://github.com/rlworkgroup/garage',\n packages=find_packages(where='src'),\n package_dir={'': 'src'},\n scripts=['scripts/garage'],\n python_requires='>=3.5',\n install_requires=required,\n extras_require=extras,\n license='MIT',\n long_description=readme,\n long_description_content_type='text/markdown',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Scientific/Engineering :: Mathematics',\n 'Topic :: Software Development :: Libraries',\n ],\n)\n", "path": "setup.py"}, {"content": "\"\"\"Covariance Matrix Adaptation Evolution Strategy.\"\"\"\nimport cma\nfrom dowel import logger, tabular\nimport numpy as np\n\nfrom garage.np.algos import BatchPolopt\n\n\nclass CMAES(BatchPolopt):\n \"\"\"Covariance Matrix Adaptation Evolution Strategy.\n\n Note:\n The CMA-ES method can hardly learn a successful policy even for\n simple task. 
It is still maintained here only for consistency with\n original rllab paper.\n\n Args:\n env_spec (garage.envs.EnvSpec): Environment specification.\n policy (garage.np.policies.Policy): Action policy.\n baseline (garage.np.baselines.Baseline): Baseline for GAE\n (Generalized Advantage Estimation).\n n_samples (int): Number of policies sampled in one epoch.\n discount (float): Environment reward discount.\n max_path_length (int): Maximum length of a single rollout.\n sigma0 (float): Initial std for param distribution.\n\n \"\"\"\n\n def __init__(self,\n env_spec,\n policy,\n baseline,\n n_samples,\n discount=0.99,\n max_path_length=500,\n sigma0=1.):\n super().__init__(policy, baseline, discount, max_path_length,\n n_samples)\n self.env_spec = env_spec\n self.policy = policy\n\n self.sigma0 = sigma0\n\n def _sample_params(self):\n return self.es.ask()\n\n def train(self, runner):\n \"\"\"Initialize variables and start training.\n\n Args:\n runner (LocalRunner): LocalRunner is passed to give algorithm\n the access to runner.step_epochs(), which provides services\n such as snapshotting and sampler control.\n\n Returns:\n The average return in last epoch cycle.\n\n \"\"\"\n init_mean = self.policy.get_param_values()\n self.es = cma.CMAEvolutionStrategy(init_mean, self.sigma0,\n {'popsize': self.n_samples})\n self.all_params = self._sample_params()\n self.cur_params = self.all_params[0]\n self.policy.set_param_values(self.cur_params)\n self.all_returns = []\n\n return super().train(runner)\n\n def train_once(self, itr, paths):\n \"\"\"Perform one step of policy optimization given one batch of samples.\n\n Args:\n itr (int): Iteration number.\n paths (list[dict]): A list of collected paths.\n\n \"\"\"\n paths = self.process_samples(itr, paths)\n\n epoch = itr // self.n_samples\n i_sample = itr - epoch * self.n_samples\n\n tabular.record('Epoch', epoch)\n tabular.record('# Sample', i_sample)\n\n rtn = paths['average_return']\n self.all_returns.append(paths['average_return'])\n\n if (itr + 1) % self.n_samples == 0:\n avg_rtns = np.array(self.all_returns)\n self.es.tell(self.all_params, -avg_rtns)\n self.policy.set_param_values(self.es.best.get()[0])\n\n # Clear for next epoch\n rtn = max(self.all_returns)\n self.all_returns.clear()\n self.all_params = self._sample_params()\n\n self.cur_params = self.all_params[(i_sample + 1) % self.n_samples]\n self.policy.set_param_values(self.cur_params)\n\n logger.log(tabular)\n return rtn\n", "path": "src/garage/np/algos/cma_es.py"}]} | 2,727 | 646 |
gh_patches_debug_16592 | rasdani/github-patches | git_diff | pyg-team__pytorch_geometric-2691 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
PointCNN dilated KNN might select less than `K` neighbours
## 🐛 Bug
In the current X-Conv implementation, I can see the following:
https://github.com/rusty1s/pytorch_geometric/blob/e8e004439e3204a7b888a21e508c45d166c5817c/torch_geometric/nn/conv/x_conv.py#L130-L137
Since `torch.randint` samples with replacement, there's a chance that the number of distinct neighbours actually selected can be less than `K`. I am thinking we can fix this using something like:
```
indices = torch.randperm(K * dil)[:K]
```
--- END ISSUE ---
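To make the with-/without-replacement distinction concrete, here is a small self-contained illustration (not from the issue; the `K` and `dilation` values are arbitrary):

```python
import torch

K, dilation = 4, 3
pool = K * dilation  # candidate neighbours per centre point after dilation

with_replacement = torch.randint(pool, (K,))     # duplicates are possible
without_replacement = torch.randperm(pool)[:K]   # K distinct indices, as proposed

# randperm guarantees K unique neighbour slots; randint does not.
assert without_replacement.unique().numel() == K
```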
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torch_geometric/nn/conv/x_conv.py`
Content:
```
1 from typing import Optional
2
3 from math import ceil
4
5 import torch
6 from torch import Tensor
7 from torch.nn import Sequential as S, Linear as L, BatchNorm1d as BN
8 from torch.nn import ELU, Conv1d
9 from torch_geometric.nn import Reshape
10
11 from ..inits import reset
12
13 try:
14 from torch_cluster import knn_graph
15 except ImportError:
16 knn_graph = None
17
18
19 class XConv(torch.nn.Module):
20 r"""The convolutional operator on :math:`\mathcal{X}`-transformed points
21 from the `"PointCNN: Convolution On X-Transformed Points"
22 <https://arxiv.org/abs/1801.07791>`_ paper
23
24 .. math::
25 \mathbf{x}^{\prime}_i = \mathrm{Conv}\left(\mathbf{K},
26 \gamma_{\mathbf{\Theta}}(\mathbf{P}_i - \mathbf{p}_i) \times
27 \left( h_\mathbf{\Theta}(\mathbf{P}_i - \mathbf{p}_i) \, \Vert \,
28 \mathbf{x}_i \right) \right),
29
30 where :math:`\mathbf{K}` and :math:`\mathbf{P}_i` denote the trainable
31 filter and neighboring point positions of :math:`\mathbf{x}_i`,
32 respectively.
33 :math:`\gamma_{\mathbf{\Theta}}` and :math:`h_{\mathbf{\Theta}}` describe
34 neural networks, *i.e.* MLPs, where :math:`h_{\mathbf{\Theta}}`
35 individually lifts each point into a higher-dimensional space, and
36 :math:`\gamma_{\mathbf{\Theta}}` computes the :math:`\mathcal{X}`-
37 transformation matrix based on *all* points in a neighborhood.
38
39 Args:
40 in_channels (int): Size of each input sample.
41 out_channels (int): Size of each output sample.
42 dim (int): Point cloud dimensionality.
43 kernel_size (int): Size of the convolving kernel, *i.e.* number of
44 neighbors including self-loops.
45 hidden_channels (int, optional): Output size of
46 :math:`h_{\mathbf{\Theta}}`, *i.e.* dimensionality of lifted
47 points. If set to :obj:`None`, will be automatically set to
48 :obj:`in_channels / 4`. (default: :obj:`None`)
49 dilation (int, optional): The factor by which the neighborhood is
50 extended, from which :obj:`kernel_size` neighbors are then
51 uniformly sampled. Can be interpreted as the dilation rate of
52 classical convolutional operators. (default: :obj:`1`)
53 bias (bool, optional): If set to :obj:`False`, the layer will not learn
54 an additive bias. (default: :obj:`True`)
55 num_workers (int): Number of workers to use for k-NN computation.
56 Has no effect in case :obj:`batch` is not :obj:`None`, or the input
57 lies on the GPU. (default: :obj:`1`)
58 """
59 def __init__(self, in_channels: int, out_channels: int, dim: int,
60 kernel_size: int, hidden_channels: Optional[int] = None,
61 dilation: int = 1, bias: bool = True, num_workers: int = 1):
62 super(XConv, self).__init__()
63
64 if knn_graph is None:
65 raise ImportError('`XConv` requires `torch-cluster`.')
66
67 self.in_channels = in_channels
68 if hidden_channels is None:
69 hidden_channels = in_channels // 4
70 assert hidden_channels > 0
71 self.hidden_channels = hidden_channels
72 self.out_channels = out_channels
73 self.dim = dim
74 self.kernel_size = kernel_size
75 self.dilation = dilation
76 self.num_workers = num_workers
77
78 C_in, C_delta, C_out = in_channels, hidden_channels, out_channels
79 D, K = dim, kernel_size
80
81 self.mlp1 = S(
82 L(dim, C_delta),
83 ELU(),
84 BN(C_delta),
85 L(C_delta, C_delta),
86 ELU(),
87 BN(C_delta),
88 Reshape(-1, K, C_delta),
89 )
90
91 self.mlp2 = S(
92 L(D * K, K**2),
93 ELU(),
94 BN(K**2),
95 Reshape(-1, K, K),
96 Conv1d(K, K**2, K, groups=K),
97 ELU(),
98 BN(K**2),
99 Reshape(-1, K, K),
100 Conv1d(K, K**2, K, groups=K),
101 BN(K**2),
102 Reshape(-1, K, K),
103 )
104
105 C_in = C_in + C_delta
106 depth_multiplier = int(ceil(C_out / C_in))
107 self.conv = S(
108 Conv1d(C_in, C_in * depth_multiplier, K, groups=C_in),
109 Reshape(-1, C_in * depth_multiplier),
110 L(C_in * depth_multiplier, C_out, bias=bias),
111 )
112
113 self.reset_parameters()
114
115 def reset_parameters(self):
116 reset(self.mlp1)
117 reset(self.mlp2)
118 reset(self.conv)
119
120 def forward(self, x: Tensor, pos: Tensor, batch: Optional[Tensor] = None):
121 """"""
122 pos = pos.unsqueeze(-1) if pos.dim() == 1 else pos
123 (N, D), K = pos.size(), self.kernel_size
124
125 edge_index = knn_graph(pos, K * self.dilation, batch, loop=True,
126 flow='target_to_source',
127 num_workers=self.num_workers)
128 row, col = edge_index[0], edge_index[1]
129
130 if self.dilation > 1:
131 dil = self.dilation
132 index = torch.randint(K * dil, (N, K), dtype=torch.long,
133 device=row.device)
134 arange = torch.arange(N, dtype=torch.long, device=row.device)
135 arange = arange * (K * dil)
136 index = (index + arange.view(-1, 1)).view(-1)
137 row, col = row[index], col[index]
138
139 pos = pos[col] - pos[row]
140
141 x_star = self.mlp1(pos)
142 if x is not None:
143 x = x.unsqueeze(-1) if x.dim() == 1 else x
144 x = x[col].view(N, K, self.in_channels)
145 x_star = torch.cat([x_star, x], dim=-1)
146 x_star = x_star.transpose(1, 2).contiguous()
147
148 transform_matrix = self.mlp2(pos.view(N, K * D))
149
150 x_transformed = torch.matmul(x_star, transform_matrix)
151
152 out = self.conv(x_transformed)
153
154 return out
155
156 def __repr__(self):
157 return '{}({}, {})'.format(self.__class__.__name__, self.in_channels,
158 self.out_channels)
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torch_geometric/nn/conv/x_conv.py b/torch_geometric/nn/conv/x_conv.py
--- a/torch_geometric/nn/conv/x_conv.py
+++ b/torch_geometric/nn/conv/x_conv.py
@@ -125,16 +125,11 @@
edge_index = knn_graph(pos, K * self.dilation, batch, loop=True,
flow='target_to_source',
num_workers=self.num_workers)
- row, col = edge_index[0], edge_index[1]
if self.dilation > 1:
- dil = self.dilation
- index = torch.randint(K * dil, (N, K), dtype=torch.long,
- device=row.device)
- arange = torch.arange(N, dtype=torch.long, device=row.device)
- arange = arange * (K * dil)
- index = (index + arange.view(-1, 1)).view(-1)
- row, col = row[index], col[index]
+ edge_index = edge_index[:, ::K]
+
+ row, col = edge_index[0], edge_index[1]
pos = pos[col] - pos[row]
| {"golden_diff": "diff --git a/torch_geometric/nn/conv/x_conv.py b/torch_geometric/nn/conv/x_conv.py\n--- a/torch_geometric/nn/conv/x_conv.py\n+++ b/torch_geometric/nn/conv/x_conv.py\n@@ -125,16 +125,11 @@\n edge_index = knn_graph(pos, K * self.dilation, batch, loop=True,\n flow='target_to_source',\n num_workers=self.num_workers)\n- row, col = edge_index[0], edge_index[1]\n \n if self.dilation > 1:\n- dil = self.dilation\n- index = torch.randint(K * dil, (N, K), dtype=torch.long,\n- device=row.device)\n- arange = torch.arange(N, dtype=torch.long, device=row.device)\n- arange = arange * (K * dil)\n- index = (index + arange.view(-1, 1)).view(-1)\n- row, col = row[index], col[index]\n+ edge_index = edge_index[:, ::K]\n+\n+ row, col = edge_index[0], edge_index[1]\n \n pos = pos[col] - pos[row]\n", "issue": "PointCNN dilated KNN might select less than `K` neighbours \n## \ud83d\udc1b Bug\r\n\r\nIn the current X-Conv implementation, I can see the following:\r\n\r\nhttps://github.com/rusty1s/pytorch_geometric/blob/e8e004439e3204a7b888a21e508c45d166c5817c/torch_geometric/nn/conv/x_conv.py#L130-L137\r\n\r\nAs, `torch.randint` would sample with replacement, there's a chance that the number of actual neighbours can be less than `K`. I am thinking we can fix this using something like:\r\n\r\n```\r\nindices = torch.randperm(K * dil)[:K]\r\n```\n", "before_files": [{"content": "from typing import Optional\n\nfrom math import ceil\n\nimport torch\nfrom torch import Tensor\nfrom torch.nn import Sequential as S, Linear as L, BatchNorm1d as BN\nfrom torch.nn import ELU, Conv1d\nfrom torch_geometric.nn import Reshape\n\nfrom ..inits import reset\n\ntry:\n from torch_cluster import knn_graph\nexcept ImportError:\n knn_graph = None\n\n\nclass XConv(torch.nn.Module):\n r\"\"\"The convolutional operator on :math:`\\mathcal{X}`-transformed points\n from the `\"PointCNN: Convolution On X-Transformed Points\"\n <https://arxiv.org/abs/1801.07791>`_ paper\n\n .. math::\n \\mathbf{x}^{\\prime}_i = \\mathrm{Conv}\\left(\\mathbf{K},\n \\gamma_{\\mathbf{\\Theta}}(\\mathbf{P}_i - \\mathbf{p}_i) \\times\n \\left( h_\\mathbf{\\Theta}(\\mathbf{P}_i - \\mathbf{p}_i) \\, \\Vert \\,\n \\mathbf{x}_i \\right) \\right),\n\n where :math:`\\mathbf{K}` and :math:`\\mathbf{P}_i` denote the trainable\n filter and neighboring point positions of :math:`\\mathbf{x}_i`,\n respectively.\n :math:`\\gamma_{\\mathbf{\\Theta}}` and :math:`h_{\\mathbf{\\Theta}}` describe\n neural networks, *i.e.* MLPs, where :math:`h_{\\mathbf{\\Theta}}`\n individually lifts each point into a higher-dimensional space, and\n :math:`\\gamma_{\\mathbf{\\Theta}}` computes the :math:`\\mathcal{X}`-\n transformation matrix based on *all* points in a neighborhood.\n\n Args:\n in_channels (int): Size of each input sample.\n out_channels (int): Size of each output sample.\n dim (int): Point cloud dimensionality.\n kernel_size (int): Size of the convolving kernel, *i.e.* number of\n neighbors including self-loops.\n hidden_channels (int, optional): Output size of\n :math:`h_{\\mathbf{\\Theta}}`, *i.e.* dimensionality of lifted\n points. If set to :obj:`None`, will be automatically set to\n :obj:`in_channels / 4`. (default: :obj:`None`)\n dilation (int, optional): The factor by which the neighborhood is\n extended, from which :obj:`kernel_size` neighbors are then\n uniformly sampled. Can be interpreted as the dilation rate of\n classical convolutional operators. 
(default: :obj:`1`)\n bias (bool, optional): If set to :obj:`False`, the layer will not learn\n an additive bias. (default: :obj:`True`)\n num_workers (int): Number of workers to use for k-NN computation.\n Has no effect in case :obj:`batch` is not :obj:`None`, or the input\n lies on the GPU. (default: :obj:`1`)\n \"\"\"\n def __init__(self, in_channels: int, out_channels: int, dim: int,\n kernel_size: int, hidden_channels: Optional[int] = None,\n dilation: int = 1, bias: bool = True, num_workers: int = 1):\n super(XConv, self).__init__()\n\n if knn_graph is None:\n raise ImportError('`XConv` requires `torch-cluster`.')\n\n self.in_channels = in_channels\n if hidden_channels is None:\n hidden_channels = in_channels // 4\n assert hidden_channels > 0\n self.hidden_channels = hidden_channels\n self.out_channels = out_channels\n self.dim = dim\n self.kernel_size = kernel_size\n self.dilation = dilation\n self.num_workers = num_workers\n\n C_in, C_delta, C_out = in_channels, hidden_channels, out_channels\n D, K = dim, kernel_size\n\n self.mlp1 = S(\n L(dim, C_delta),\n ELU(),\n BN(C_delta),\n L(C_delta, C_delta),\n ELU(),\n BN(C_delta),\n Reshape(-1, K, C_delta),\n )\n\n self.mlp2 = S(\n L(D * K, K**2),\n ELU(),\n BN(K**2),\n Reshape(-1, K, K),\n Conv1d(K, K**2, K, groups=K),\n ELU(),\n BN(K**2),\n Reshape(-1, K, K),\n Conv1d(K, K**2, K, groups=K),\n BN(K**2),\n Reshape(-1, K, K),\n )\n\n C_in = C_in + C_delta\n depth_multiplier = int(ceil(C_out / C_in))\n self.conv = S(\n Conv1d(C_in, C_in * depth_multiplier, K, groups=C_in),\n Reshape(-1, C_in * depth_multiplier),\n L(C_in * depth_multiplier, C_out, bias=bias),\n )\n\n self.reset_parameters()\n\n def reset_parameters(self):\n reset(self.mlp1)\n reset(self.mlp2)\n reset(self.conv)\n\n def forward(self, x: Tensor, pos: Tensor, batch: Optional[Tensor] = None):\n \"\"\"\"\"\"\n pos = pos.unsqueeze(-1) if pos.dim() == 1 else pos\n (N, D), K = pos.size(), self.kernel_size\n\n edge_index = knn_graph(pos, K * self.dilation, batch, loop=True,\n flow='target_to_source',\n num_workers=self.num_workers)\n row, col = edge_index[0], edge_index[1]\n\n if self.dilation > 1:\n dil = self.dilation\n index = torch.randint(K * dil, (N, K), dtype=torch.long,\n device=row.device)\n arange = torch.arange(N, dtype=torch.long, device=row.device)\n arange = arange * (K * dil)\n index = (index + arange.view(-1, 1)).view(-1)\n row, col = row[index], col[index]\n\n pos = pos[col] - pos[row]\n\n x_star = self.mlp1(pos)\n if x is not None:\n x = x.unsqueeze(-1) if x.dim() == 1 else x\n x = x[col].view(N, K, self.in_channels)\n x_star = torch.cat([x_star, x], dim=-1)\n x_star = x_star.transpose(1, 2).contiguous()\n\n transform_matrix = self.mlp2(pos.view(N, K * D))\n\n x_transformed = torch.matmul(x_star, transform_matrix)\n\n out = self.conv(x_transformed)\n\n return out\n\n def __repr__(self):\n return '{}({}, {})'.format(self.__class__.__name__, self.in_channels,\n self.out_channels)\n", "path": "torch_geometric/nn/conv/x_conv.py"}], "after_files": [{"content": "from typing import Optional\n\nfrom math import ceil\n\nimport torch\nfrom torch import Tensor\nfrom torch.nn import Sequential as S, Linear as L, BatchNorm1d as BN\nfrom torch.nn import ELU, Conv1d\nfrom torch_geometric.nn import Reshape\n\nfrom ..inits import reset\n\ntry:\n from torch_cluster import knn_graph\nexcept ImportError:\n knn_graph = None\n\n\nclass XConv(torch.nn.Module):\n r\"\"\"The convolutional operator on :math:`\\mathcal{X}`-transformed points\n from the `\"PointCNN: Convolution On 
X-Transformed Points\"\n <https://arxiv.org/abs/1801.07791>`_ paper\n\n .. math::\n \\mathbf{x}^{\\prime}_i = \\mathrm{Conv}\\left(\\mathbf{K},\n \\gamma_{\\mathbf{\\Theta}}(\\mathbf{P}_i - \\mathbf{p}_i) \\times\n \\left( h_\\mathbf{\\Theta}(\\mathbf{P}_i - \\mathbf{p}_i) \\, \\Vert \\,\n \\mathbf{x}_i \\right) \\right),\n\n where :math:`\\mathbf{K}` and :math:`\\mathbf{P}_i` denote the trainable\n filter and neighboring point positions of :math:`\\mathbf{x}_i`,\n respectively.\n :math:`\\gamma_{\\mathbf{\\Theta}}` and :math:`h_{\\mathbf{\\Theta}}` describe\n neural networks, *i.e.* MLPs, where :math:`h_{\\mathbf{\\Theta}}`\n individually lifts each point into a higher-dimensional space, and\n :math:`\\gamma_{\\mathbf{\\Theta}}` computes the :math:`\\mathcal{X}`-\n transformation matrix based on *all* points in a neighborhood.\n\n Args:\n in_channels (int): Size of each input sample.\n out_channels (int): Size of each output sample.\n dim (int): Point cloud dimensionality.\n kernel_size (int): Size of the convolving kernel, *i.e.* number of\n neighbors including self-loops.\n hidden_channels (int, optional): Output size of\n :math:`h_{\\mathbf{\\Theta}}`, *i.e.* dimensionality of lifted\n points. If set to :obj:`None`, will be automatically set to\n :obj:`in_channels / 4`. (default: :obj:`None`)\n dilation (int, optional): The factor by which the neighborhood is\n extended, from which :obj:`kernel_size` neighbors are then\n uniformly sampled. Can be interpreted as the dilation rate of\n classical convolutional operators. (default: :obj:`1`)\n bias (bool, optional): If set to :obj:`False`, the layer will not learn\n an additive bias. (default: :obj:`True`)\n num_workers (int): Number of workers to use for k-NN computation.\n Has no effect in case :obj:`batch` is not :obj:`None`, or the input\n lies on the GPU. 
(default: :obj:`1`)\n \"\"\"\n def __init__(self, in_channels: int, out_channels: int, dim: int,\n kernel_size: int, hidden_channels: Optional[int] = None,\n dilation: int = 1, bias: bool = True, num_workers: int = 1):\n super(XConv, self).__init__()\n\n if knn_graph is None:\n raise ImportError('`XConv` requires `torch-cluster`.')\n\n self.in_channels = in_channels\n if hidden_channels is None:\n hidden_channels = in_channels // 4\n assert hidden_channels > 0\n self.hidden_channels = hidden_channels\n self.out_channels = out_channels\n self.dim = dim\n self.kernel_size = kernel_size\n self.dilation = dilation\n self.num_workers = num_workers\n\n C_in, C_delta, C_out = in_channels, hidden_channels, out_channels\n D, K = dim, kernel_size\n\n self.mlp1 = S(\n L(dim, C_delta),\n ELU(),\n BN(C_delta),\n L(C_delta, C_delta),\n ELU(),\n BN(C_delta),\n Reshape(-1, K, C_delta),\n )\n\n self.mlp2 = S(\n L(D * K, K**2),\n ELU(),\n BN(K**2),\n Reshape(-1, K, K),\n Conv1d(K, K**2, K, groups=K),\n ELU(),\n BN(K**2),\n Reshape(-1, K, K),\n Conv1d(K, K**2, K, groups=K),\n BN(K**2),\n Reshape(-1, K, K),\n )\n\n C_in = C_in + C_delta\n depth_multiplier = int(ceil(C_out / C_in))\n self.conv = S(\n Conv1d(C_in, C_in * depth_multiplier, K, groups=C_in),\n Reshape(-1, C_in * depth_multiplier),\n L(C_in * depth_multiplier, C_out, bias=bias),\n )\n\n self.reset_parameters()\n\n def reset_parameters(self):\n reset(self.mlp1)\n reset(self.mlp2)\n reset(self.conv)\n\n def forward(self, x: Tensor, pos: Tensor, batch: Optional[Tensor] = None):\n \"\"\"\"\"\"\n pos = pos.unsqueeze(-1) if pos.dim() == 1 else pos\n (N, D), K = pos.size(), self.kernel_size\n\n edge_index = knn_graph(pos, K * self.dilation, batch, loop=True,\n flow='target_to_source',\n num_workers=self.num_workers)\n\n if self.dilation > 1:\n edge_index = edge_index[:, ::K]\n\n row, col = edge_index[0], edge_index[1]\n\n pos = pos[col] - pos[row]\n\n x_star = self.mlp1(pos)\n if x is not None:\n x = x.unsqueeze(-1) if x.dim() == 1 else x\n x = x[col].view(N, K, self.in_channels)\n x_star = torch.cat([x_star, x], dim=-1)\n x_star = x_star.transpose(1, 2).contiguous()\n\n transform_matrix = self.mlp2(pos.view(N, K * D))\n\n x_transformed = torch.matmul(x_star, transform_matrix)\n\n out = self.conv(x_transformed)\n\n return out\n\n def __repr__(self):\n return '{}({}, {})'.format(self.__class__.__name__, self.in_channels,\n self.out_channels)\n", "path": "torch_geometric/nn/conv/x_conv.py"}]} | 2,344 | 266 |
gh_patches_debug_27532 | rasdani/github-patches | git_diff | python-discord__site-432 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support deleting infractions
The bot tries to delete infractions that fail to apply on the Discord side of things. This is because it sends the infraction to our API before applying it on Discord. However, our API doesn't actually support the delete method, so it returns a 405.
--- END ISSUE ---
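In Django REST Framework, a viewset assembled from mixins only answers the HTTP verbs those mixins provide, which is why DELETE comes back as 405 here. The snippet below is a minimal, hypothetical sketch of that pattern, not the project's actual code; the `Widget` model and serializer are placeholders:

```python
# Minimal sketch: a mixin-composed DRF viewset that accepts DELETE.
# Assumes a configured Django project with rest_framework installed,
# plus a hypothetical Widget model and WidgetSerializer.
from rest_framework.mixins import (
    CreateModelMixin,
    DestroyModelMixin,   # without this mixin, DELETE returns 405
    ListModelMixin,
    RetrieveModelMixin,
)
from rest_framework.viewsets import GenericViewSet

from myapp.models import Widget                  # hypothetical model
from myapp.serializers import WidgetSerializer   # hypothetical serializer


class WidgetViewSet(
    CreateModelMixin,
    RetrieveModelMixin,
    ListModelMixin,
    DestroyModelMixin,
    GenericViewSet,
):
    """Supports POST, GET (list/detail) and DELETE, but not PUT/PATCH."""

    queryset = Widget.objects.all()
    serializer_class = WidgetSerializer
```

The router then maps `DELETE /widgets/<pk>/` to the `destroy()` handler supplied by `DestroyModelMixin`, returning 204 on success.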
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pydis_site/apps/api/viewsets/bot/infraction.py`
Content:
```
1 from django.http.request import HttpRequest
2 from django_filters.rest_framework import DjangoFilterBackend
3 from rest_framework.decorators import action
4 from rest_framework.exceptions import ValidationError
5 from rest_framework.filters import OrderingFilter, SearchFilter
6 from rest_framework.mixins import (
7 CreateModelMixin,
8 ListModelMixin,
9 RetrieveModelMixin
10 )
11 from rest_framework.response import Response
12 from rest_framework.viewsets import GenericViewSet
13
14 from pydis_site.apps.api.models.bot.infraction import Infraction
15 from pydis_site.apps.api.serializers import (
16 ExpandedInfractionSerializer,
17 InfractionSerializer
18 )
19
20
21 class InfractionViewSet(CreateModelMixin, RetrieveModelMixin, ListModelMixin, GenericViewSet):
22 """
23 View providing CRUD operations on infractions for Discord users.
24
25 ## Routes
26 ### GET /bot/infractions
27 Retrieve all infractions.
28 May be filtered by the query parameters.
29
30 #### Query parameters
31 - **active** `bool`: whether the infraction is still active
32 - **actor__id** `int`: snowflake of the user which applied the infraction
33 - **hidden** `bool`: whether the infraction is a shadow infraction
34 - **search** `str`: regular expression applied to the infraction's reason
35 - **type** `str`: the type of the infraction
36 - **user__id** `int`: snowflake of the user to which the infraction was applied
37 - **ordering** `str`: comma-separated sequence of fields to order the returned results
38
39 Invalid query parameters are ignored.
40
41 #### Response format
42 >>> [
43 ... {
44 ... 'id': 5,
45 ... 'inserted_at': '2018-11-22T07:24:06.132307Z',
46 ... 'expires_at': '5018-11-20T15:52:00Z',
47 ... 'active': False,
48 ... 'user': 172395097705414656,
49 ... 'actor': 125435062127820800,
50 ... 'type': 'ban',
51 ... 'reason': 'He terk my jerb!',
52 ... 'hidden': True
53 ... }
54 ... ]
55
56 #### Status codes
57 - 200: returned on success
58
59 ### GET /bot/infractions/<id:int>
60 Retrieve a single infraction by ID.
61
62 #### Response format
63 See `GET /bot/infractions`.
64
65 #### Status codes
66 - 200: returned on success
67 - 404: if an infraction with the given `id` could not be found
68
69 ### POST /bot/infractions
70 Create a new infraction and return the created infraction.
71 Only `actor`, `type`, and `user` are required.
72 The `actor` and `user` must be users known by the site.
73
74 #### Request body
75 >>> {
76 ... 'active': False,
77 ... 'actor': 125435062127820800,
78 ... 'expires_at': '5018-11-20T15:52:00+00:00',
79 ... 'hidden': True,
80 ... 'type': 'ban',
81 ... 'reason': 'He terk my jerb!',
82 ... 'user': 172395097705414656
83 ... }
84
85 #### Response format
86 See `GET /bot/infractions`.
87
88 #### Status codes
89 - 201: returned on success
90 - 400: if a given user is unknown or a field in the request body is invalid
91
92 ### PATCH /bot/infractions/<id:int>
93 Update the infraction with the given `id` and return the updated infraction.
94 Only `active`, `reason`, and `expires_at` may be updated.
95
96 #### Request body
97 >>> {
98 ... 'active': True,
99 ... 'expires_at': '4143-02-15T21:04:31+00:00',
100 ... 'reason': 'durka derr'
101 ... }
102
103 #### Response format
104 See `GET /bot/infractions`.
105
106 #### Status codes
107 - 200: returned on success
108 - 400: if a field in the request body is invalid or disallowed
109 - 404: if an infraction with the given `id` could not be found
110
111 ### Expanded routes
112 All routes support expansion of `user` and `actor` in responses. To use an expanded route,
113 append `/expanded` to the end of the route e.g. `GET /bot/infractions/expanded`.
114
115 #### Response format
116 See `GET /bot/users/<snowflake:int>` for the expanded formats of `user` and `actor`. Responses
117 are otherwise identical to their non-expanded counterparts.
118 """
119
120 serializer_class = InfractionSerializer
121 queryset = Infraction.objects.all()
122 filter_backends = (DjangoFilterBackend, SearchFilter, OrderingFilter)
123 filter_fields = ('user__id', 'actor__id', 'active', 'hidden', 'type')
124 search_fields = ('$reason',)
125 frozen_fields = ('id', 'inserted_at', 'type', 'user', 'actor', 'hidden')
126
127 def partial_update(self, request: HttpRequest, *_args, **_kwargs) -> Response:
128 """Method that handles the nuts and bolts of updating an Infraction."""
129 for field in request.data:
130 if field in self.frozen_fields:
131 raise ValidationError({field: ['This field cannot be updated.']})
132
133 instance = self.get_object()
134 serializer = self.get_serializer(instance, data=request.data, partial=True)
135 serializer.is_valid(raise_exception=True)
136 serializer.save()
137
138 return Response(serializer.data)
139
140 @action(url_path='expanded', detail=False)
141 def list_expanded(self, *args, **kwargs) -> Response:
142 """
143 DRF method for listing Infraction entries.
144
145 Called by the Django Rest Framework in response to the corresponding HTTP request.
146 """
147 self.serializer_class = ExpandedInfractionSerializer
148 return self.list(*args, **kwargs)
149
150 @list_expanded.mapping.post
151 def create_expanded(self, *args, **kwargs) -> Response:
152 """
153 DRF method for creating an Infraction.
154
155 Called by the Django Rest Framework in response to the corresponding HTTP request.
156 """
157 self.serializer_class = ExpandedInfractionSerializer
158 return self.create(*args, **kwargs)
159
160 @action(url_path='expanded', url_name='detail-expanded', detail=True)
161 def retrieve_expanded(self, *args, **kwargs) -> Response:
162 """
163 DRF method for retrieving a specific Infraction.
164
165 Called by the Django Rest Framework in response to the corresponding HTTP request.
166 """
167 self.serializer_class = ExpandedInfractionSerializer
168 return self.retrieve(*args, **kwargs)
169
170 @retrieve_expanded.mapping.patch
171 def partial_update_expanded(self, *args, **kwargs) -> Response:
172 """
173 DRF method for updating an Infraction.
174
175 Called by the Django Rest Framework in response to the corresponding HTTP request.
176 """
177 self.serializer_class = ExpandedInfractionSerializer
178 return self.partial_update(*args, **kwargs)
179
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pydis_site/apps/api/viewsets/bot/infraction.py b/pydis_site/apps/api/viewsets/bot/infraction.py
--- a/pydis_site/apps/api/viewsets/bot/infraction.py
+++ b/pydis_site/apps/api/viewsets/bot/infraction.py
@@ -5,6 +5,7 @@
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.mixins import (
CreateModelMixin,
+ DestroyModelMixin,
ListModelMixin,
RetrieveModelMixin
)
@@ -18,7 +19,13 @@
)
-class InfractionViewSet(CreateModelMixin, RetrieveModelMixin, ListModelMixin, GenericViewSet):
+class InfractionViewSet(
+ CreateModelMixin,
+ RetrieveModelMixin,
+ ListModelMixin,
+ GenericViewSet,
+ DestroyModelMixin
+):
"""
View providing CRUD operations on infractions for Discord users.
@@ -108,6 +115,13 @@
- 400: if a field in the request body is invalid or disallowed
- 404: if an infraction with the given `id` could not be found
+ ### DELETE /bot/infractions/<id:int>
+ Delete the infraction with the given `id`.
+
+ #### Status codes
+ - 204: returned on success
+ - 404: if a infraction with the given `id` does not exist
+
### Expanded routes
All routes support expansion of `user` and `actor` in responses. To use an expanded route,
append `/expanded` to the end of the route e.g. `GET /bot/infractions/expanded`.
| {"golden_diff": "diff --git a/pydis_site/apps/api/viewsets/bot/infraction.py b/pydis_site/apps/api/viewsets/bot/infraction.py\n--- a/pydis_site/apps/api/viewsets/bot/infraction.py\n+++ b/pydis_site/apps/api/viewsets/bot/infraction.py\n@@ -5,6 +5,7 @@\n from rest_framework.filters import OrderingFilter, SearchFilter\n from rest_framework.mixins import (\n CreateModelMixin,\n+ DestroyModelMixin,\n ListModelMixin,\n RetrieveModelMixin\n )\n@@ -18,7 +19,13 @@\n )\n \n \n-class InfractionViewSet(CreateModelMixin, RetrieveModelMixin, ListModelMixin, GenericViewSet):\n+class InfractionViewSet(\n+ CreateModelMixin,\n+ RetrieveModelMixin,\n+ ListModelMixin,\n+ GenericViewSet,\n+ DestroyModelMixin\n+):\n \"\"\"\n View providing CRUD operations on infractions for Discord users.\n \n@@ -108,6 +115,13 @@\n - 400: if a field in the request body is invalid or disallowed\n - 404: if an infraction with the given `id` could not be found\n \n+ ### DELETE /bot/infractions/<id:int>\n+ Delete the infraction with the given `id`.\n+\n+ #### Status codes\n+ - 204: returned on success\n+ - 404: if a infraction with the given `id` does not exist\n+\n ### Expanded routes\n All routes support expansion of `user` and `actor` in responses. To use an expanded route,\n append `/expanded` to the end of the route e.g. `GET /bot/infractions/expanded`.\n", "issue": "Support deleting infractions\nThe bot tries to delete infractions that fail to apply on the Discord side of things. This is cause it sends the infraction to our API before applying it on Discord. However, our API doesn't actually support the delete method so it returns a 405.\n", "before_files": [{"content": "from django.http.request import HttpRequest\nfrom django_filters.rest_framework import DjangoFilterBackend\nfrom rest_framework.decorators import action\nfrom rest_framework.exceptions import ValidationError\nfrom rest_framework.filters import OrderingFilter, SearchFilter\nfrom rest_framework.mixins import (\n CreateModelMixin,\n ListModelMixin,\n RetrieveModelMixin\n)\nfrom rest_framework.response import Response\nfrom rest_framework.viewsets import GenericViewSet\n\nfrom pydis_site.apps.api.models.bot.infraction import Infraction\nfrom pydis_site.apps.api.serializers import (\n ExpandedInfractionSerializer,\n InfractionSerializer\n)\n\n\nclass InfractionViewSet(CreateModelMixin, RetrieveModelMixin, ListModelMixin, GenericViewSet):\n \"\"\"\n View providing CRUD operations on infractions for Discord users.\n\n ## Routes\n ### GET /bot/infractions\n Retrieve all infractions.\n May be filtered by the query parameters.\n\n #### Query parameters\n - **active** `bool`: whether the infraction is still active\n - **actor__id** `int`: snowflake of the user which applied the infraction\n - **hidden** `bool`: whether the infraction is a shadow infraction\n - **search** `str`: regular expression applied to the infraction's reason\n - **type** `str`: the type of the infraction\n - **user__id** `int`: snowflake of the user to which the infraction was applied\n - **ordering** `str`: comma-separated sequence of fields to order the returned results\n\n Invalid query parameters are ignored.\n\n #### Response format\n >>> [\n ... {\n ... 'id': 5,\n ... 'inserted_at': '2018-11-22T07:24:06.132307Z',\n ... 'expires_at': '5018-11-20T15:52:00Z',\n ... 'active': False,\n ... 'user': 172395097705414656,\n ... 'actor': 125435062127820800,\n ... 'type': 'ban',\n ... 'reason': 'He terk my jerb!',\n ... 'hidden': True\n ... }\n ... 
]\n\n #### Status codes\n - 200: returned on success\n\n ### GET /bot/infractions/<id:int>\n Retrieve a single infraction by ID.\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 200: returned on success\n - 404: if an infraction with the given `id` could not be found\n\n ### POST /bot/infractions\n Create a new infraction and return the created infraction.\n Only `actor`, `type`, and `user` are required.\n The `actor` and `user` must be users known by the site.\n\n #### Request body\n >>> {\n ... 'active': False,\n ... 'actor': 125435062127820800,\n ... 'expires_at': '5018-11-20T15:52:00+00:00',\n ... 'hidden': True,\n ... 'type': 'ban',\n ... 'reason': 'He terk my jerb!',\n ... 'user': 172395097705414656\n ... }\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 201: returned on success\n - 400: if a given user is unknown or a field in the request body is invalid\n\n ### PATCH /bot/infractions/<id:int>\n Update the infraction with the given `id` and return the updated infraction.\n Only `active`, `reason`, and `expires_at` may be updated.\n\n #### Request body\n >>> {\n ... 'active': True,\n ... 'expires_at': '4143-02-15T21:04:31+00:00',\n ... 'reason': 'durka derr'\n ... }\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 200: returned on success\n - 400: if a field in the request body is invalid or disallowed\n - 404: if an infraction with the given `id` could not be found\n\n ### Expanded routes\n All routes support expansion of `user` and `actor` in responses. To use an expanded route,\n append `/expanded` to the end of the route e.g. `GET /bot/infractions/expanded`.\n\n #### Response format\n See `GET /bot/users/<snowflake:int>` for the expanded formats of `user` and `actor`. 
Responses\n are otherwise identical to their non-expanded counterparts.\n \"\"\"\n\n serializer_class = InfractionSerializer\n queryset = Infraction.objects.all()\n filter_backends = (DjangoFilterBackend, SearchFilter, OrderingFilter)\n filter_fields = ('user__id', 'actor__id', 'active', 'hidden', 'type')\n search_fields = ('$reason',)\n frozen_fields = ('id', 'inserted_at', 'type', 'user', 'actor', 'hidden')\n\n def partial_update(self, request: HttpRequest, *_args, **_kwargs) -> Response:\n \"\"\"Method that handles the nuts and bolts of updating an Infraction.\"\"\"\n for field in request.data:\n if field in self.frozen_fields:\n raise ValidationError({field: ['This field cannot be updated.']})\n\n instance = self.get_object()\n serializer = self.get_serializer(instance, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n serializer.save()\n\n return Response(serializer.data)\n\n @action(url_path='expanded', detail=False)\n def list_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for listing Infraction entries.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.list(*args, **kwargs)\n\n @list_expanded.mapping.post\n def create_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for creating an Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.create(*args, **kwargs)\n\n @action(url_path='expanded', url_name='detail-expanded', detail=True)\n def retrieve_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for retrieving a specific Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.retrieve(*args, **kwargs)\n\n @retrieve_expanded.mapping.patch\n def partial_update_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for updating an Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.partial_update(*args, **kwargs)\n", "path": "pydis_site/apps/api/viewsets/bot/infraction.py"}], "after_files": [{"content": "from django.http.request import HttpRequest\nfrom django_filters.rest_framework import DjangoFilterBackend\nfrom rest_framework.decorators import action\nfrom rest_framework.exceptions import ValidationError\nfrom rest_framework.filters import OrderingFilter, SearchFilter\nfrom rest_framework.mixins import (\n CreateModelMixin,\n DestroyModelMixin,\n ListModelMixin,\n RetrieveModelMixin\n)\nfrom rest_framework.response import Response\nfrom rest_framework.viewsets import GenericViewSet\n\nfrom pydis_site.apps.api.models.bot.infraction import Infraction\nfrom pydis_site.apps.api.serializers import (\n ExpandedInfractionSerializer,\n InfractionSerializer\n)\n\n\nclass InfractionViewSet(\n CreateModelMixin,\n RetrieveModelMixin,\n ListModelMixin,\n GenericViewSet,\n DestroyModelMixin\n):\n \"\"\"\n View providing CRUD operations on infractions for Discord users.\n\n ## Routes\n ### GET /bot/infractions\n Retrieve all infractions.\n May be filtered by the query parameters.\n\n #### Query parameters\n - **active** `bool`: whether the infraction is still active\n - **actor__id** `int`: snowflake of the user which applied 
the infraction\n - **hidden** `bool`: whether the infraction is a shadow infraction\n - **search** `str`: regular expression applied to the infraction's reason\n - **type** `str`: the type of the infraction\n - **user__id** `int`: snowflake of the user to which the infraction was applied\n - **ordering** `str`: comma-separated sequence of fields to order the returned results\n\n Invalid query parameters are ignored.\n\n #### Response format\n >>> [\n ... {\n ... 'id': 5,\n ... 'inserted_at': '2018-11-22T07:24:06.132307Z',\n ... 'expires_at': '5018-11-20T15:52:00Z',\n ... 'active': False,\n ... 'user': 172395097705414656,\n ... 'actor': 125435062127820800,\n ... 'type': 'ban',\n ... 'reason': 'He terk my jerb!',\n ... 'hidden': True\n ... }\n ... ]\n\n #### Status codes\n - 200: returned on success\n\n ### GET /bot/infractions/<id:int>\n Retrieve a single infraction by ID.\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 200: returned on success\n - 404: if an infraction with the given `id` could not be found\n\n ### POST /bot/infractions\n Create a new infraction and return the created infraction.\n Only `actor`, `type`, and `user` are required.\n The `actor` and `user` must be users known by the site.\n\n #### Request body\n >>> {\n ... 'active': False,\n ... 'actor': 125435062127820800,\n ... 'expires_at': '5018-11-20T15:52:00+00:00',\n ... 'hidden': True,\n ... 'type': 'ban',\n ... 'reason': 'He terk my jerb!',\n ... 'user': 172395097705414656\n ... }\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 201: returned on success\n - 400: if a given user is unknown or a field in the request body is invalid\n\n ### PATCH /bot/infractions/<id:int>\n Update the infraction with the given `id` and return the updated infraction.\n Only `active`, `reason`, and `expires_at` may be updated.\n\n #### Request body\n >>> {\n ... 'active': True,\n ... 'expires_at': '4143-02-15T21:04:31+00:00',\n ... 'reason': 'durka derr'\n ... }\n\n #### Response format\n See `GET /bot/infractions`.\n\n #### Status codes\n - 200: returned on success\n - 400: if a field in the request body is invalid or disallowed\n - 404: if an infraction with the given `id` could not be found\n\n ### DELETE /bot/infractions/<id:int>\n Delete the infraction with the given `id`.\n\n #### Status codes\n - 204: returned on success\n - 404: if a infraction with the given `id` does not exist\n\n ### Expanded routes\n All routes support expansion of `user` and `actor` in responses. To use an expanded route,\n append `/expanded` to the end of the route e.g. `GET /bot/infractions/expanded`.\n\n #### Response format\n See `GET /bot/users/<snowflake:int>` for the expanded formats of `user` and `actor`. 
Responses\n are otherwise identical to their non-expanded counterparts.\n \"\"\"\n\n serializer_class = InfractionSerializer\n queryset = Infraction.objects.all()\n filter_backends = (DjangoFilterBackend, SearchFilter, OrderingFilter)\n filter_fields = ('user__id', 'actor__id', 'active', 'hidden', 'type')\n search_fields = ('$reason',)\n frozen_fields = ('id', 'inserted_at', 'type', 'user', 'actor', 'hidden')\n\n def partial_update(self, request: HttpRequest, *_args, **_kwargs) -> Response:\n \"\"\"Method that handles the nuts and bolts of updating an Infraction.\"\"\"\n for field in request.data:\n if field in self.frozen_fields:\n raise ValidationError({field: ['This field cannot be updated.']})\n\n instance = self.get_object()\n serializer = self.get_serializer(instance, data=request.data, partial=True)\n serializer.is_valid(raise_exception=True)\n serializer.save()\n\n return Response(serializer.data)\n\n @action(url_path='expanded', detail=False)\n def list_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for listing Infraction entries.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.list(*args, **kwargs)\n\n @list_expanded.mapping.post\n def create_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for creating an Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.create(*args, **kwargs)\n\n @action(url_path='expanded', url_name='detail-expanded', detail=True)\n def retrieve_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for retrieving a specific Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.retrieve(*args, **kwargs)\n\n @retrieve_expanded.mapping.patch\n def partial_update_expanded(self, *args, **kwargs) -> Response:\n \"\"\"\n DRF method for updating an Infraction.\n\n Called by the Django Rest Framework in response to the corresponding HTTP request.\n \"\"\"\n self.serializer_class = ExpandedInfractionSerializer\n return self.partial_update(*args, **kwargs)\n", "path": "pydis_site/apps/api/viewsets/bot/infraction.py"}]} | 2,407 | 375 |
gh_patches_debug_8221 | rasdani/github-patches | git_diff | cisagov__manage.get.gov-1094 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Check Domain availability via epp-Testing
### Issue Description
When adding the /availability endpoint we will need to send a CheckDomain request to epp to see if the domain is available. This epp function is already implemented in domain.py and is called available(). It just needs to be tested and updated if the tests show any problems with the implementation.
### AC
- [x] unit tests added for available
- [x] manually test via sandbox with OT&E to be sure that this is working as expected
- [x] update the implementation as needed or desired
- [x] in your tests, ensure that this function can be called by just doing Domain.available() and not by having an instance of a domain
### Additional Context (optional)
This must be tested by using Domain.available because the /availability endpoint (when implemented) will not have access to any particular domain object and this function needs to be able to be performed on its own.
### Issue Link
blocks: #1015
--- END ISSUE ---
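A class-level check is enough here because EPP's CheckDomain works on a bare domain name, not on a stored record. The sketch below assumes the project's `epplibwrapper` client is already connected and logged in, that `send()` accepts a `cleaned=True` flag, and that the check response exposes `res_data[0].avail` as fred-epplib's models do; names and signatures are illustrative, not the repository's exact implementation:

```python
# Illustrative only: CLIENT, commands and RegistryError come from the
# epplibwrapper package shown below; the send() signature is assumed.
from epplibwrapper import CLIENT, commands


class Domain:
    @classmethod
    def available(cls, domain: str) -> bool:
        """Ask the registry whether `domain` can be registered."""
        request = commands.CheckDomain(names=[domain])
        response = CLIENT.send(request, cleaned=True)
        # Check responses carry one result item per queried name.
        return bool(response.res_data[0].avail)


# Usable without instantiating a Domain, e.g. from the /availability view:
# Domain.available("example.gov")
```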
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/epplibwrapper/__init__.py`
Content:
```
1 import logging
2 from types import SimpleNamespace
3
4 try:
5 from epplib import constants
6 except ImportError:
7 # allow epplibwrapper to load without epplib, for testing and development
8 pass
9
10 logger = logging.getLogger(__name__)
11
12 NAMESPACE = SimpleNamespace(
13 EPP="urn:ietf:params:xml:ns:epp-1.0",
14 XSI="http://www.w3.org/2001/XMLSchema-instance",
15 FRED="noop",
16 NIC_CONTACT="urn:ietf:params:xml:ns:contact-1.0",
17 NIC_DOMAIN="urn:ietf:params:xml:ns:domain-1.0",
18 NIC_ENUMVAL="noop",
19 NIC_EXTRA_ADDR="noop",
20 NIC_HOST="urn:ietf:params:xml:ns:host-1.0",
21 NIC_KEYSET="noop",
22 NIC_NSSET="noop",
23 )
24
25 SCHEMA_LOCATION = SimpleNamespace(
26 XSI="urn:ietf:params:xml:ns:epp-1.0 epp-1.0.xsd",
27 FRED="noop fred-1.5.0.xsd",
28 NIC_CONTACT="urn:ietf:params:xml:ns:contact-1.0 contact-1.0.xsd",
29 NIC_DOMAIN="urn:ietf:params:xml:ns:domain-1.0 domain-1.0.xsd",
30 NIC_ENUMVAL="noop enumval-1.2.0.xsd",
31 NIC_EXTRA_ADDR="noop extra-addr-1.0.0.xsd",
32 NIC_HOST="urn:ietf:params:xml:ns:host-1.0 host-1.0.xsd",
33 NIC_KEYSET="noop keyset-1.3.2.xsd",
34 NIC_NSSET="noop nsset-1.2.2.xsd",
35 )
36
37 try:
38 constants.NAMESPACE = NAMESPACE
39 constants.SCHEMA_LOCATION = SCHEMA_LOCATION
40 except NameError:
41 pass
42
43 # Attn: these imports should NOT be at the top of the file
44 try:
45 from .client import CLIENT, commands
46 from .errors import RegistryError, ErrorCode
47 from epplib.models import common
48 except ImportError:
49 pass
50
51 __all__ = [
52 "CLIENT",
53 "commands",
54 "common",
55 "ErrorCode",
56 "RegistryError",
57 ]
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/epplibwrapper/__init__.py b/src/epplibwrapper/__init__.py
--- a/src/epplibwrapper/__init__.py
+++ b/src/epplibwrapper/__init__.py
@@ -45,6 +45,7 @@
from .client import CLIENT, commands
from .errors import RegistryError, ErrorCode
from epplib.models import common
+ from epplib import responses
except ImportError:
pass
@@ -52,6 +53,7 @@
"CLIENT",
"commands",
"common",
+ "responses",
"ErrorCode",
"RegistryError",
]
| {"golden_diff": "diff --git a/src/epplibwrapper/__init__.py b/src/epplibwrapper/__init__.py\n--- a/src/epplibwrapper/__init__.py\n+++ b/src/epplibwrapper/__init__.py\n@@ -45,6 +45,7 @@\n from .client import CLIENT, commands\n from .errors import RegistryError, ErrorCode\n from epplib.models import common\n+ from epplib import responses\n except ImportError:\n pass\n \n@@ -52,6 +53,7 @@\n \"CLIENT\",\n \"commands\",\n \"common\",\n+ \"responses\",\n \"ErrorCode\",\n \"RegistryError\",\n ]\n", "issue": "Check Domain availability via epp-Testing\n### Issue Description\r\n\r\nWhen adding the /availability endpoint we will need to send a CheckDomain request to epp to see if the domain is available. This epp function is already implemented in domain.py and is called available(). It just needs to be tested and updated if the test show any problem with the implementation\r\n\r\n### AC\r\n\r\n- [x] unit tests added for available\r\n- [x] manually test via sandbox with OT&E to be sure that this is working as expected \r\n- [x] update the implementation as needed or desired\r\n- [x] in your tests, ensure that this function can be called by just doing Domain.available() and not by having an instance of a domain\r\n\r\n### Additional Context (optional)\r\n\r\nThis must be tested by using Domain.available because the /availability endpoint (when implemented) will not have access to any particular domain object and this function needs to be able to be performed on its own.\r\n\r\n### Issue Link\r\nblocks: #1015 \n", "before_files": [{"content": "import logging\nfrom types import SimpleNamespace\n\ntry:\n from epplib import constants\nexcept ImportError:\n # allow epplibwrapper to load without epplib, for testing and development\n pass\n\nlogger = logging.getLogger(__name__)\n\nNAMESPACE = SimpleNamespace(\n EPP=\"urn:ietf:params:xml:ns:epp-1.0\",\n XSI=\"http://www.w3.org/2001/XMLSchema-instance\",\n FRED=\"noop\",\n NIC_CONTACT=\"urn:ietf:params:xml:ns:contact-1.0\",\n NIC_DOMAIN=\"urn:ietf:params:xml:ns:domain-1.0\",\n NIC_ENUMVAL=\"noop\",\n NIC_EXTRA_ADDR=\"noop\",\n NIC_HOST=\"urn:ietf:params:xml:ns:host-1.0\",\n NIC_KEYSET=\"noop\",\n NIC_NSSET=\"noop\",\n)\n\nSCHEMA_LOCATION = SimpleNamespace(\n XSI=\"urn:ietf:params:xml:ns:epp-1.0 epp-1.0.xsd\",\n FRED=\"noop fred-1.5.0.xsd\",\n NIC_CONTACT=\"urn:ietf:params:xml:ns:contact-1.0 contact-1.0.xsd\",\n NIC_DOMAIN=\"urn:ietf:params:xml:ns:domain-1.0 domain-1.0.xsd\",\n NIC_ENUMVAL=\"noop enumval-1.2.0.xsd\",\n NIC_EXTRA_ADDR=\"noop extra-addr-1.0.0.xsd\",\n NIC_HOST=\"urn:ietf:params:xml:ns:host-1.0 host-1.0.xsd\",\n NIC_KEYSET=\"noop keyset-1.3.2.xsd\",\n NIC_NSSET=\"noop nsset-1.2.2.xsd\",\n)\n\ntry:\n constants.NAMESPACE = NAMESPACE\n constants.SCHEMA_LOCATION = SCHEMA_LOCATION\nexcept NameError:\n pass\n\n# Attn: these imports should NOT be at the top of the file\ntry:\n from .client import CLIENT, commands\n from .errors import RegistryError, ErrorCode\n from epplib.models import common\nexcept ImportError:\n pass\n\n__all__ = [\n \"CLIENT\",\n \"commands\",\n \"common\",\n \"ErrorCode\",\n \"RegistryError\",\n]\n", "path": "src/epplibwrapper/__init__.py"}], "after_files": [{"content": "import logging\nfrom types import SimpleNamespace\n\ntry:\n from epplib import constants\nexcept ImportError:\n # allow epplibwrapper to load without epplib, for testing and development\n pass\n\nlogger = logging.getLogger(__name__)\n\nNAMESPACE = SimpleNamespace(\n EPP=\"urn:ietf:params:xml:ns:epp-1.0\",\n XSI=\"http://www.w3.org/2001/XMLSchema-instance\",\n 
FRED=\"noop\",\n NIC_CONTACT=\"urn:ietf:params:xml:ns:contact-1.0\",\n NIC_DOMAIN=\"urn:ietf:params:xml:ns:domain-1.0\",\n NIC_ENUMVAL=\"noop\",\n NIC_EXTRA_ADDR=\"noop\",\n NIC_HOST=\"urn:ietf:params:xml:ns:host-1.0\",\n NIC_KEYSET=\"noop\",\n NIC_NSSET=\"noop\",\n)\n\nSCHEMA_LOCATION = SimpleNamespace(\n XSI=\"urn:ietf:params:xml:ns:epp-1.0 epp-1.0.xsd\",\n FRED=\"noop fred-1.5.0.xsd\",\n NIC_CONTACT=\"urn:ietf:params:xml:ns:contact-1.0 contact-1.0.xsd\",\n NIC_DOMAIN=\"urn:ietf:params:xml:ns:domain-1.0 domain-1.0.xsd\",\n NIC_ENUMVAL=\"noop enumval-1.2.0.xsd\",\n NIC_EXTRA_ADDR=\"noop extra-addr-1.0.0.xsd\",\n NIC_HOST=\"urn:ietf:params:xml:ns:host-1.0 host-1.0.xsd\",\n NIC_KEYSET=\"noop keyset-1.3.2.xsd\",\n NIC_NSSET=\"noop nsset-1.2.2.xsd\",\n)\n\ntry:\n constants.NAMESPACE = NAMESPACE\n constants.SCHEMA_LOCATION = SCHEMA_LOCATION\nexcept NameError:\n pass\n\n# Attn: these imports should NOT be at the top of the file\ntry:\n from .client import CLIENT, commands\n from .errors import RegistryError, ErrorCode\n from epplib.models import common\n from epplib import responses\nexcept ImportError:\n pass\n\n__all__ = [\n \"CLIENT\",\n \"commands\",\n \"common\",\n \"responses\",\n \"ErrorCode\",\n \"RegistryError\",\n]\n", "path": "src/epplibwrapper/__init__.py"}]} | 1,067 | 140 |
gh_patches_debug_13162 | rasdani/github-patches | git_diff | chainer__chainer-2143 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Stop using ABC in Serializer
AbstractSerializer is currently written as an abstract base class. I don't think ABC support is needed here.
--- END ISSUE ---
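Dropping ABC usually means falling back to a plain base class whose methods raise `NotImplementedError`; subclasses still have to override them, the error just moves from instantiation time to call time. A small self-contained sketch of that trade-off:

```python
# Plain base class: no abc metaclass, no six dependency.
class AbstractSerializer(object):
    """Interface-style base class; subclasses must override both methods."""

    def __getitem__(self, key):
        raise NotImplementedError

    def __call__(self, key, value):
        raise NotImplementedError


class DictSerializer(AbstractSerializer):
    """Tiny concrete example that just records values in a dict."""

    def __init__(self):
        self.target = {}

    def __getitem__(self, key):
        # A real serializer would return a child serializer here.
        return self

    def __call__(self, key, value):
        self.target[key] = value
        return value


s = DictSerializer()
s("lr", 0.01)
assert s.target == {"lr": 0.01}
```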
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainer/serializer.py`
Content:
```
1 import abc
2
3 import six
4
5
6 @six.add_metaclass(abc.ABCMeta)
7 class AbstractSerializer(object):
8
9 """Abstract base class of all serializers and deserializers."""
10
11 @abc.abstractmethod
12 def __getitem__(self, key):
13 """Gets a child serializer.
14
15 This operator creates a _child_ serializer represented by the given
16 key.
17
18 Args:
19 key (str): Name of the child serializer.
20
21 """
22 raise NotImplementedError
23
24 @abc.abstractmethod
25 def __call__(self, key, value):
26 """Serializes or deserializes a value by given name.
27
28 This operator saves or loads a value by given name.
29
30 If this is a serializer, then the value is simply saved at the key.
31 Note that some type information might be missed depending on the
32 implementation (and the target file format).
33
34 If this is a deserializer, then the value is loaded by the key. The
35 deserialization differently works on scalars and arrays. For scalars,
36 the ``value`` argument is used just for determining the type of
37 restored value to be converted, and the converted value is returned.
38 For arrays, the restored elements are directly copied into the
39 ``value`` argument. String values are treated like scalars. If the
40 ``value`` argument is ``None``, the type of the restored value will
41 typically be a numpy array but can depend on the particular subclass
42 implementation.
43
44 Args:
45 key (str): Name of the serialization entry.
46 value (scalar, array, None, or str): Object to be (de)serialized.
47 ``None`` is only supported by deserializers.
48
49 Returns:
50 Serialized or deserialized value.
51
52 """
53 raise NotImplementedError
54
55
56 class Serializer(AbstractSerializer):
57
58 """Base class of all serializers."""
59
60 def save(self, obj):
61 """Saves an object by this serializer.
62
63 This is equivalent to ``obj.serialize(self)``.
64
65 Args:
66 obj: Target object to be serialized.
67
68 """
69 obj.serialize(self)
70
71
72 class Deserializer(AbstractSerializer):
73
74 """Base class of all deserializers."""
75
76 def load(self, obj):
77 """Loads an object from this deserializer.
78
79 This is equivalent to ``obj.serialize(self)``.
80
81 Args:
82 obj: Target object to be serialized.
83
84 """
85 obj.serialize(self)
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/chainer/serializer.py b/chainer/serializer.py
--- a/chainer/serializer.py
+++ b/chainer/serializer.py
@@ -1,14 +1,7 @@
-import abc
-
-import six
-
-
[email protected]_metaclass(abc.ABCMeta)
class AbstractSerializer(object):
"""Abstract base class of all serializers and deserializers."""
- @abc.abstractmethod
def __getitem__(self, key):
"""Gets a child serializer.
@@ -21,7 +14,6 @@
"""
raise NotImplementedError
- @abc.abstractmethod
def __call__(self, key, value):
"""Serializes or deserializes a value by given name.
| {"golden_diff": "diff --git a/chainer/serializer.py b/chainer/serializer.py\n--- a/chainer/serializer.py\n+++ b/chainer/serializer.py\n@@ -1,14 +1,7 @@\n-import abc\n-\n-import six\n-\n-\[email protected]_metaclass(abc.ABCMeta)\n class AbstractSerializer(object):\n \n \"\"\"Abstract base class of all serializers and deserializers.\"\"\"\n \n- @abc.abstractmethod\n def __getitem__(self, key):\n \"\"\"Gets a child serializer.\n \n@@ -21,7 +14,6 @@\n \"\"\"\n raise NotImplementedError\n \n- @abc.abstractmethod\n def __call__(self, key, value):\n \"\"\"Serializes or deserializes a value by given name.\n", "issue": "Stop using ABC in Serializer\nAbstractSerializer is currently written as an abstract base class. I don't think it is needed to support ABC.\n", "before_files": [{"content": "import abc\n\nimport six\n\n\[email protected]_metaclass(abc.ABCMeta)\nclass AbstractSerializer(object):\n\n \"\"\"Abstract base class of all serializers and deserializers.\"\"\"\n\n @abc.abstractmethod\n def __getitem__(self, key):\n \"\"\"Gets a child serializer.\n\n This operator creates a _child_ serializer represented by the given\n key.\n\n Args:\n key (str): Name of the child serializer.\n\n \"\"\"\n raise NotImplementedError\n\n @abc.abstractmethod\n def __call__(self, key, value):\n \"\"\"Serializes or deserializes a value by given name.\n\n This operator saves or loads a value by given name.\n\n If this is a serializer, then the value is simply saved at the key.\n Note that some type information might be missed depending on the\n implementation (and the target file format).\n\n If this is a deserializer, then the value is loaded by the key. The\n deserialization differently works on scalars and arrays. For scalars,\n the ``value`` argument is used just for determining the type of\n restored value to be converted, and the converted value is returned.\n For arrays, the restored elements are directly copied into the\n ``value`` argument. String values are treated like scalars. 
If the\n ``value`` argument is ``None``, the type of the restored value will\n typically be a numpy array but can depend on the particular subclass\n implementation.\n\n Args:\n key (str): Name of the serialization entry.\n value (scalar, array, None, or str): Object to be (de)serialized.\n ``None`` is only supported by deserializers.\n\n Returns:\n Serialized or deserialized value.\n\n \"\"\"\n raise NotImplementedError\n\n\nclass Serializer(AbstractSerializer):\n\n \"\"\"Base class of all serializers.\"\"\"\n\n def save(self, obj):\n \"\"\"Saves an object by this serializer.\n\n This is equivalent to ``obj.serialize(self)``.\n\n Args:\n obj: Target object to be serialized.\n\n \"\"\"\n obj.serialize(self)\n\n\nclass Deserializer(AbstractSerializer):\n\n \"\"\"Base class of all deserializers.\"\"\"\n\n def load(self, obj):\n \"\"\"Loads an object from this deserializer.\n\n This is equivalent to ``obj.serialize(self)``.\n\n Args:\n obj: Target object to be serialized.\n\n \"\"\"\n obj.serialize(self)\n", "path": "chainer/serializer.py"}], "after_files": [{"content": "class AbstractSerializer(object):\n\n \"\"\"Abstract base class of all serializers and deserializers.\"\"\"\n\n def __getitem__(self, key):\n \"\"\"Gets a child serializer.\n\n This operator creates a _child_ serializer represented by the given\n key.\n\n Args:\n key (str): Name of the child serializer.\n\n \"\"\"\n raise NotImplementedError\n\n def __call__(self, key, value):\n \"\"\"Serializes or deserializes a value by given name.\n\n This operator saves or loads a value by given name.\n\n If this is a serializer, then the value is simply saved at the key.\n Note that some type information might be missed depending on the\n implementation (and the target file format).\n\n If this is a deserializer, then the value is loaded by the key. The\n deserialization differently works on scalars and arrays. For scalars,\n the ``value`` argument is used just for determining the type of\n restored value to be converted, and the converted value is returned.\n For arrays, the restored elements are directly copied into the\n ``value`` argument. String values are treated like scalars. If the\n ``value`` argument is ``None``, the type of the restored value will\n typically be a numpy array but can depend on the particular subclass\n implementation.\n\n Args:\n key (str): Name of the serialization entry.\n value (scalar, array, None, or str): Object to be (de)serialized.\n ``None`` is only supported by deserializers.\n\n Returns:\n Serialized or deserialized value.\n\n \"\"\"\n raise NotImplementedError\n\n\nclass Serializer(AbstractSerializer):\n\n \"\"\"Base class of all serializers.\"\"\"\n\n def save(self, obj):\n \"\"\"Saves an object by this serializer.\n\n This is equivalent to ``obj.serialize(self)``.\n\n Args:\n obj: Target object to be serialized.\n\n \"\"\"\n obj.serialize(self)\n\n\nclass Deserializer(AbstractSerializer):\n\n \"\"\"Base class of all deserializers.\"\"\"\n\n def load(self, obj):\n \"\"\"Loads an object from this deserializer.\n\n This is equivalent to ``obj.serialize(self)``.\n\n Args:\n obj: Target object to be serialized.\n\n \"\"\"\n obj.serialize(self)\n", "path": "chainer/serializer.py"}]} | 969 | 158 |
gh_patches_debug_29494 | rasdani/github-patches | git_diff | ray-project__ray-4336 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
rllib rollout does not load the model automatically from params.json
<!--
General questions should be asked on the mailing list [email protected].
Questions about how to use Ray should be asked on
[StackOverflow](https://stackoverflow.com/questions/tagged/ray).
Before submitting an issue, please fill out the following form.
-->
### System information
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Linux 4.4.0-135-generic x86_64
- **Python version**: Python 3.6.5
<!--
You can obtain the Ray version with
python -c "import ray; print(ray.__version__)"
-->
### Describe the problem
<!-- Describe the problem clearly here. -->
### Source code / logs
<!-- Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem. -->
rllib rollout does not load the model automatically from params.json for a simple 256x256x256x256 model.
When I run rllib rollout without specifying --config with "model": {"fcnet_hiddens": [256, 256, 256, 256]}, it fails with the following error:
```
assert len(vector) == i, "Passed weight does not have the correct shape."
AssertionError: Passed weight does not have the correct shape.
```
--- END ISSUE ---
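The behaviour being asked for is: always read the config saved next to the checkpoint and let any `--config` flags override it, instead of skipping the saved file whenever `--config` is non-empty (which is what produces the weight-shape mismatch above). A rough sketch of that logic, with illustrative paths and a simple merge helper rather than RLlib's exact internals:

```python
# Sketch: start from the config pickled next to the checkpoint and apply
# CLI overrides on top. Paths and helper names are illustrative.
import os
import pickle


def deep_merge(base, overrides):
    """Recursively merge `overrides` into a copy of `base`."""
    merged = dict(base)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


def load_rollout_config(checkpoint_path, cli_config):
    config_dir = os.path.dirname(checkpoint_path)
    candidates = ("params.pkl", os.path.join("..", "params.pkl"))
    for candidate in candidates:
        path = os.path.join(config_dir, candidate)
        if os.path.exists(path):
            with open(path, "rb") as f:
                saved = pickle.load(f)
            break
    else:
        saved = {}  # no saved config found; fall back to CLI-only config
    return deep_merge(saved, cli_config)
```

With this, the saved `"model": {"fcnet_hiddens": [...]}` entry is picked up automatically, so the restored weights match the network that gets built.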
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/ray/rllib/rollout.py`
Content:
```
1 #!/usr/bin/env python
2
3 from __future__ import absolute_import
4 from __future__ import division
5 from __future__ import print_function
6
7 import argparse
8 import json
9 import os
10 import pickle
11
12 import gym
13 import ray
14 from ray.rllib.agents.registry import get_agent_class
15
16 EXAMPLE_USAGE = """
17 Example Usage via RLlib CLI:
18 rllib rollout /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN
19 --env CartPole-v0 --steps 1000000 --out rollouts.pkl
20
21 Example Usage via executable:
22 ./rollout.py /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN
23 --env CartPole-v0 --steps 1000000 --out rollouts.pkl
24 """
25
26 # Note: if you use any custom models or envs, register them here first, e.g.:
27 #
28 # ModelCatalog.register_custom_model("pa_model", ParametricActionsModel)
29 # register_env("pa_cartpole", lambda _: ParametricActionCartpole(10))
30
31
32 def create_parser(parser_creator=None):
33 parser_creator = parser_creator or argparse.ArgumentParser
34 parser = parser_creator(
35 formatter_class=argparse.RawDescriptionHelpFormatter,
36 description="Roll out a reinforcement learning agent "
37 "given a checkpoint.",
38 epilog=EXAMPLE_USAGE)
39
40 parser.add_argument(
41 "checkpoint", type=str, help="Checkpoint from which to roll out.")
42 required_named = parser.add_argument_group("required named arguments")
43 required_named.add_argument(
44 "--run",
45 type=str,
46 required=True,
47 help="The algorithm or model to train. This may refer to the name "
48 "of a built-on algorithm (e.g. RLLib's DQN or PPO), or a "
49 "user-defined trainable function or class registered in the "
50 "tune registry.")
51 required_named.add_argument(
52 "--env", type=str, help="The gym environment to use.")
53 parser.add_argument(
54 "--no-render",
55 default=False,
56 action="store_const",
57 const=True,
58 help="Surpress rendering of the environment.")
59 parser.add_argument(
60 "--steps", default=10000, help="Number of steps to roll out.")
61 parser.add_argument("--out", default=None, help="Output filename.")
62 parser.add_argument(
63 "--config",
64 default="{}",
65 type=json.loads,
66 help="Algorithm-specific configuration (e.g. env, hyperparams). "
67 "Surpresses loading of configuration from checkpoint.")
68 return parser
69
70
71 def run(args, parser):
72 config = args.config
73 if not config:
74 # Load configuration from file
75 config_dir = os.path.dirname(args.checkpoint)
76 config_path = os.path.join(config_dir, "params.pkl")
77 if not os.path.exists(config_path):
78 config_path = os.path.join(config_dir, "../params.pkl")
79 if not os.path.exists(config_path):
80 raise ValueError(
81 "Could not find params.pkl in either the checkpoint dir or "
82 "its parent directory.")
83 with open(config_path, 'rb') as f:
84 config = pickle.load(f)
85 if "num_workers" in config:
86 config["num_workers"] = min(2, config["num_workers"])
87
88 if not args.env:
89 if not config.get("env"):
90 parser.error("the following arguments are required: --env")
91 args.env = config.get("env")
92
93 ray.init()
94
95 cls = get_agent_class(args.run)
96 agent = cls(env=args.env, config=config)
97 agent.restore(args.checkpoint)
98 num_steps = int(args.steps)
99 rollout(agent, args.env, num_steps, args.out, args.no_render)
100
101
102 def rollout(agent, env_name, num_steps, out=None, no_render=True):
103 if hasattr(agent, "local_evaluator"):
104 env = agent.local_evaluator.env
105 multiagent = agent.local_evaluator.multiagent
106 if multiagent:
107 policy_agent_mapping = agent.config["multiagent"][
108 "policy_mapping_fn"]
109 mapping_cache = {}
110 policy_map = agent.local_evaluator.policy_map
111 state_init = {p: m.get_initial_state() for p, m in policy_map.items()}
112 use_lstm = {p: len(s) > 0 for p, s in state_init.items()}
113 else:
114 env = gym.make(env_name)
115 multiagent = False
116 use_lstm = {'default': False}
117
118 if out is not None:
119 rollouts = []
120 steps = 0
121 while steps < (num_steps or steps + 1):
122 if out is not None:
123 rollout = []
124 state = env.reset()
125 done = False
126 reward_total = 0.0
127 while not done and steps < (num_steps or steps + 1):
128 if multiagent:
129 action_dict = {}
130 for agent_id in state.keys():
131 a_state = state[agent_id]
132 if a_state is not None:
133 policy_id = mapping_cache.setdefault(
134 agent_id, policy_agent_mapping(agent_id))
135 p_use_lstm = use_lstm[policy_id]
136 if p_use_lstm:
137 a_action, p_state_init, _ = agent.compute_action(
138 a_state,
139 state=state_init[policy_id],
140 policy_id=policy_id)
141 state_init[policy_id] = p_state_init
142 else:
143 a_action = agent.compute_action(
144 a_state, policy_id=policy_id)
145 action_dict[agent_id] = a_action
146 action = action_dict
147 else:
148 if use_lstm["default"]:
149 action, state_init, _ = agent.compute_action(
150 state, state=state_init)
151 else:
152 action = agent.compute_action(state)
153
154 next_state, reward, done, _ = env.step(action)
155
156 if multiagent:
157 done = done["__all__"]
158 reward_total += sum(reward.values())
159 else:
160 reward_total += reward
161 if not no_render:
162 env.render()
163 if out is not None:
164 rollout.append([state, action, next_state, reward, done])
165 steps += 1
166 state = next_state
167 if out is not None:
168 rollouts.append(rollout)
169 print("Episode reward", reward_total)
170
171 if out is not None:
172 pickle.dump(rollouts, open(out, "wb"))
173
174
175 if __name__ == "__main__":
176 parser = create_parser()
177 args = parser.parse_args()
178 run(args, parser)
179
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/ray/rllib/rollout.py b/python/ray/rllib/rollout.py
--- a/python/ray/rllib/rollout.py
+++ b/python/ray/rllib/rollout.py
@@ -12,6 +12,7 @@
import gym
import ray
from ray.rllib.agents.registry import get_agent_class
+from ray.tune.util import merge_dicts
EXAMPLE_USAGE = """
Example Usage via RLlib CLI:
@@ -69,22 +70,23 @@
def run(args, parser):
- config = args.config
- if not config:
- # Load configuration from file
- config_dir = os.path.dirname(args.checkpoint)
- config_path = os.path.join(config_dir, "params.pkl")
- if not os.path.exists(config_path):
- config_path = os.path.join(config_dir, "../params.pkl")
- if not os.path.exists(config_path):
+ config = {}
+ # Load configuration from file
+ config_dir = os.path.dirname(args.checkpoint)
+ config_path = os.path.join(config_dir, "params.pkl")
+ if not os.path.exists(config_path):
+ config_path = os.path.join(config_dir, "../params.pkl")
+ if not os.path.exists(config_path):
+ if not args.config:
raise ValueError(
"Could not find params.pkl in either the checkpoint dir or "
"its parent directory.")
+ else:
with open(config_path, 'rb') as f:
config = pickle.load(f)
- if "num_workers" in config:
- config["num_workers"] = min(2, config["num_workers"])
-
+ if "num_workers" in config:
+ config["num_workers"] = min(2, config["num_workers"])
+ config = merge_dicts(config, args.config)
if not args.env:
if not config.get("env"):
parser.error("the following arguments are required: --env")
| {"golden_diff": "diff --git a/python/ray/rllib/rollout.py b/python/ray/rllib/rollout.py\n--- a/python/ray/rllib/rollout.py\n+++ b/python/ray/rllib/rollout.py\n@@ -12,6 +12,7 @@\n import gym\n import ray\n from ray.rllib.agents.registry import get_agent_class\n+from ray.tune.util import merge_dicts\n \n EXAMPLE_USAGE = \"\"\"\n Example Usage via RLlib CLI:\n@@ -69,22 +70,23 @@\n \n \n def run(args, parser):\n- config = args.config\n- if not config:\n- # Load configuration from file\n- config_dir = os.path.dirname(args.checkpoint)\n- config_path = os.path.join(config_dir, \"params.pkl\")\n- if not os.path.exists(config_path):\n- config_path = os.path.join(config_dir, \"../params.pkl\")\n- if not os.path.exists(config_path):\n+ config = {}\n+ # Load configuration from file\n+ config_dir = os.path.dirname(args.checkpoint)\n+ config_path = os.path.join(config_dir, \"params.pkl\")\n+ if not os.path.exists(config_path):\n+ config_path = os.path.join(config_dir, \"../params.pkl\")\n+ if not os.path.exists(config_path):\n+ if not args.config:\n raise ValueError(\n \"Could not find params.pkl in either the checkpoint dir or \"\n \"its parent directory.\")\n+ else:\n with open(config_path, 'rb') as f:\n config = pickle.load(f)\n- if \"num_workers\" in config:\n- config[\"num_workers\"] = min(2, config[\"num_workers\"])\n-\n+ if \"num_workers\" in config:\n+ config[\"num_workers\"] = min(2, config[\"num_workers\"])\n+ config = merge_dicts(config, args.config)\n if not args.env:\n if not config.get(\"env\"):\n parser.error(\"the following arguments are required: --env\")\n", "issue": "rllib rollout does not load the model automatically from params.json\n<!--\r\nGeneral questions should be asked on the mailing list [email protected].\r\nQuestions about how to use Ray should be asked on\r\n[StackOverflow](https://stackoverflow.com/questions/tagged/ray).\r\n\r\nBefore submitting an issue, please fill out the following form.\r\n-->\r\n\r\n### System information\r\n- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Linux 4.4.0-135-generic x86_64 \r\n- **Python version**: Python 3.6.5 \r\n\r\n<!--\r\nYou can obtain the Ray version with\r\n\r\npython -c \"import ray; print(ray.__version__)\"\r\n-->\r\n\r\n### Describe the problem\r\n<!-- Describe the problem clearly here. -->\r\n\r\n### Source code / logs\r\n<!-- Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem. -->\r\nrllib rollout does not load the model automatically from params.json for a simple 256x256x256x256 model. 
\r\nWhen I run rllib rollout without specifying --config with \"model\": {\"fcnet_hiddens\": [256, 256, 256, 256]} it fails with the following error:\r\n```\r\n assert len(vector) == i, \"Passed weight does not have the correct shape.\"\r\nAssertionError: Passed weight does not have the correct shape.\r\n```\n", "before_files": [{"content": "#!/usr/bin/env python\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport json\nimport os\nimport pickle\n\nimport gym\nimport ray\nfrom ray.rllib.agents.registry import get_agent_class\n\nEXAMPLE_USAGE = \"\"\"\nExample Usage via RLlib CLI:\n rllib rollout /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN\n --env CartPole-v0 --steps 1000000 --out rollouts.pkl\n\nExample Usage via executable:\n ./rollout.py /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN\n --env CartPole-v0 --steps 1000000 --out rollouts.pkl\n\"\"\"\n\n# Note: if you use any custom models or envs, register them here first, e.g.:\n#\n# ModelCatalog.register_custom_model(\"pa_model\", ParametricActionsModel)\n# register_env(\"pa_cartpole\", lambda _: ParametricActionCartpole(10))\n\n\ndef create_parser(parser_creator=None):\n parser_creator = parser_creator or argparse.ArgumentParser\n parser = parser_creator(\n formatter_class=argparse.RawDescriptionHelpFormatter,\n description=\"Roll out a reinforcement learning agent \"\n \"given a checkpoint.\",\n epilog=EXAMPLE_USAGE)\n\n parser.add_argument(\n \"checkpoint\", type=str, help=\"Checkpoint from which to roll out.\")\n required_named = parser.add_argument_group(\"required named arguments\")\n required_named.add_argument(\n \"--run\",\n type=str,\n required=True,\n help=\"The algorithm or model to train. This may refer to the name \"\n \"of a built-on algorithm (e.g. RLLib's DQN or PPO), or a \"\n \"user-defined trainable function or class registered in the \"\n \"tune registry.\")\n required_named.add_argument(\n \"--env\", type=str, help=\"The gym environment to use.\")\n parser.add_argument(\n \"--no-render\",\n default=False,\n action=\"store_const\",\n const=True,\n help=\"Surpress rendering of the environment.\")\n parser.add_argument(\n \"--steps\", default=10000, help=\"Number of steps to roll out.\")\n parser.add_argument(\"--out\", default=None, help=\"Output filename.\")\n parser.add_argument(\n \"--config\",\n default=\"{}\",\n type=json.loads,\n help=\"Algorithm-specific configuration (e.g. env, hyperparams). 
\"\n \"Surpresses loading of configuration from checkpoint.\")\n return parser\n\n\ndef run(args, parser):\n config = args.config\n if not config:\n # Load configuration from file\n config_dir = os.path.dirname(args.checkpoint)\n config_path = os.path.join(config_dir, \"params.pkl\")\n if not os.path.exists(config_path):\n config_path = os.path.join(config_dir, \"../params.pkl\")\n if not os.path.exists(config_path):\n raise ValueError(\n \"Could not find params.pkl in either the checkpoint dir or \"\n \"its parent directory.\")\n with open(config_path, 'rb') as f:\n config = pickle.load(f)\n if \"num_workers\" in config:\n config[\"num_workers\"] = min(2, config[\"num_workers\"])\n\n if not args.env:\n if not config.get(\"env\"):\n parser.error(\"the following arguments are required: --env\")\n args.env = config.get(\"env\")\n\n ray.init()\n\n cls = get_agent_class(args.run)\n agent = cls(env=args.env, config=config)\n agent.restore(args.checkpoint)\n num_steps = int(args.steps)\n rollout(agent, args.env, num_steps, args.out, args.no_render)\n\n\ndef rollout(agent, env_name, num_steps, out=None, no_render=True):\n if hasattr(agent, \"local_evaluator\"):\n env = agent.local_evaluator.env\n multiagent = agent.local_evaluator.multiagent\n if multiagent:\n policy_agent_mapping = agent.config[\"multiagent\"][\n \"policy_mapping_fn\"]\n mapping_cache = {}\n policy_map = agent.local_evaluator.policy_map\n state_init = {p: m.get_initial_state() for p, m in policy_map.items()}\n use_lstm = {p: len(s) > 0 for p, s in state_init.items()}\n else:\n env = gym.make(env_name)\n multiagent = False\n use_lstm = {'default': False}\n\n if out is not None:\n rollouts = []\n steps = 0\n while steps < (num_steps or steps + 1):\n if out is not None:\n rollout = []\n state = env.reset()\n done = False\n reward_total = 0.0\n while not done and steps < (num_steps or steps + 1):\n if multiagent:\n action_dict = {}\n for agent_id in state.keys():\n a_state = state[agent_id]\n if a_state is not None:\n policy_id = mapping_cache.setdefault(\n agent_id, policy_agent_mapping(agent_id))\n p_use_lstm = use_lstm[policy_id]\n if p_use_lstm:\n a_action, p_state_init, _ = agent.compute_action(\n a_state,\n state=state_init[policy_id],\n policy_id=policy_id)\n state_init[policy_id] = p_state_init\n else:\n a_action = agent.compute_action(\n a_state, policy_id=policy_id)\n action_dict[agent_id] = a_action\n action = action_dict\n else:\n if use_lstm[\"default\"]:\n action, state_init, _ = agent.compute_action(\n state, state=state_init)\n else:\n action = agent.compute_action(state)\n\n next_state, reward, done, _ = env.step(action)\n\n if multiagent:\n done = done[\"__all__\"]\n reward_total += sum(reward.values())\n else:\n reward_total += reward\n if not no_render:\n env.render()\n if out is not None:\n rollout.append([state, action, next_state, reward, done])\n steps += 1\n state = next_state\n if out is not None:\n rollouts.append(rollout)\n print(\"Episode reward\", reward_total)\n\n if out is not None:\n pickle.dump(rollouts, open(out, \"wb\"))\n\n\nif __name__ == \"__main__\":\n parser = create_parser()\n args = parser.parse_args()\n run(args, parser)\n", "path": "python/ray/rllib/rollout.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport json\nimport os\nimport pickle\n\nimport gym\nimport ray\nfrom ray.rllib.agents.registry import get_agent_class\nfrom ray.tune.util import 
merge_dicts\n\nEXAMPLE_USAGE = \"\"\"\nExample Usage via RLlib CLI:\n rllib rollout /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN\n --env CartPole-v0 --steps 1000000 --out rollouts.pkl\n\nExample Usage via executable:\n ./rollout.py /tmp/ray/checkpoint_dir/checkpoint-0 --run DQN\n --env CartPole-v0 --steps 1000000 --out rollouts.pkl\n\"\"\"\n\n# Note: if you use any custom models or envs, register them here first, e.g.:\n#\n# ModelCatalog.register_custom_model(\"pa_model\", ParametricActionsModel)\n# register_env(\"pa_cartpole\", lambda _: ParametricActionCartpole(10))\n\n\ndef create_parser(parser_creator=None):\n parser_creator = parser_creator or argparse.ArgumentParser\n parser = parser_creator(\n formatter_class=argparse.RawDescriptionHelpFormatter,\n description=\"Roll out a reinforcement learning agent \"\n \"given a checkpoint.\",\n epilog=EXAMPLE_USAGE)\n\n parser.add_argument(\n \"checkpoint\", type=str, help=\"Checkpoint from which to roll out.\")\n required_named = parser.add_argument_group(\"required named arguments\")\n required_named.add_argument(\n \"--run\",\n type=str,\n required=True,\n help=\"The algorithm or model to train. This may refer to the name \"\n \"of a built-on algorithm (e.g. RLLib's DQN or PPO), or a \"\n \"user-defined trainable function or class registered in the \"\n \"tune registry.\")\n required_named.add_argument(\n \"--env\", type=str, help=\"The gym environment to use.\")\n parser.add_argument(\n \"--no-render\",\n default=False,\n action=\"store_const\",\n const=True,\n help=\"Surpress rendering of the environment.\")\n parser.add_argument(\n \"--steps\", default=10000, help=\"Number of steps to roll out.\")\n parser.add_argument(\"--out\", default=None, help=\"Output filename.\")\n parser.add_argument(\n \"--config\",\n default=\"{}\",\n type=json.loads,\n help=\"Algorithm-specific configuration (e.g. env, hyperparams). 
\"\n \"Surpresses loading of configuration from checkpoint.\")\n return parser\n\n\ndef run(args, parser):\n config = {}\n # Load configuration from file\n config_dir = os.path.dirname(args.checkpoint)\n config_path = os.path.join(config_dir, \"params.pkl\")\n if not os.path.exists(config_path):\n config_path = os.path.join(config_dir, \"../params.pkl\")\n if not os.path.exists(config_path):\n if not args.config:\n raise ValueError(\n \"Could not find params.pkl in either the checkpoint dir or \"\n \"its parent directory.\")\n else:\n with open(config_path, 'rb') as f:\n config = pickle.load(f)\n if \"num_workers\" in config:\n config[\"num_workers\"] = min(2, config[\"num_workers\"])\n config = merge_dicts(config, args.config)\n if not args.env:\n if not config.get(\"env\"):\n parser.error(\"the following arguments are required: --env\")\n args.env = config.get(\"env\")\n\n ray.init()\n\n cls = get_agent_class(args.run)\n agent = cls(env=args.env, config=config)\n agent.restore(args.checkpoint)\n num_steps = int(args.steps)\n rollout(agent, args.env, num_steps, args.out, args.no_render)\n\n\ndef rollout(agent, env_name, num_steps, out=None, no_render=True):\n if hasattr(agent, \"local_evaluator\"):\n env = agent.local_evaluator.env\n multiagent = agent.local_evaluator.multiagent\n if multiagent:\n policy_agent_mapping = agent.config[\"multiagent\"][\n \"policy_mapping_fn\"]\n mapping_cache = {}\n policy_map = agent.local_evaluator.policy_map\n state_init = {p: m.get_initial_state() for p, m in policy_map.items()}\n use_lstm = {p: len(s) > 0 for p, s in state_init.items()}\n else:\n env = gym.make(env_name)\n multiagent = False\n use_lstm = {'default': False}\n\n if out is not None:\n rollouts = []\n steps = 0\n while steps < (num_steps or steps + 1):\n if out is not None:\n rollout = []\n state = env.reset()\n done = False\n reward_total = 0.0\n while not done and steps < (num_steps or steps + 1):\n if multiagent:\n action_dict = {}\n for agent_id in state.keys():\n a_state = state[agent_id]\n if a_state is not None:\n policy_id = mapping_cache.setdefault(\n agent_id, policy_agent_mapping(agent_id))\n p_use_lstm = use_lstm[policy_id]\n if p_use_lstm:\n a_action, p_state_init, _ = agent.compute_action(\n a_state,\n state=state_init[policy_id],\n policy_id=policy_id)\n state_init[policy_id] = p_state_init\n else:\n a_action = agent.compute_action(\n a_state, policy_id=policy_id)\n action_dict[agent_id] = a_action\n action = action_dict\n else:\n if use_lstm[\"default\"]:\n action, state_init, _ = agent.compute_action(\n state, state=state_init)\n else:\n action = agent.compute_action(state)\n\n next_state, reward, done, _ = env.step(action)\n\n if multiagent:\n done = done[\"__all__\"]\n reward_total += sum(reward.values())\n else:\n reward_total += reward\n if not no_render:\n env.render()\n if out is not None:\n rollout.append([state, action, next_state, reward, done])\n steps += 1\n state = next_state\n if out is not None:\n rollouts.append(rollout)\n print(\"Episode reward\", reward_total)\n\n if out is not None:\n pickle.dump(rollouts, open(out, \"wb\"))\n\n\nif __name__ == \"__main__\":\n parser = create_parser()\n args = parser.parse_args()\n run(args, parser)\n", "path": "python/ray/rllib/rollout.py"}]} | 2,424 | 425 |
gh_patches_debug_9091 | rasdani/github-patches | git_diff | pytorch__ignite-320 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ParamScheduler docs missing
No docs on `ParamScheduler` and related classes on the [site](https://pytorch.org/ignite/contrib/handlers.html).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ignite/contrib/handlers/__init__.py`
Content:
```
1
2 from ignite.contrib.handlers.param_scheduler import ParamScheduler, CyclicalScheduler, \
3 LinearCyclicalScheduler, CosineAnnealingScheduler
4
5 from ignite.contrib.handlers.tqdm_logger import ProgressBar
6
7 __all__ = ['ProgressBar']
8
```
Path: `ignite/contrib/engines/__init__.py`
Content:
```
1 # coding: utf-8
2
3 from ignite.contrib.engines.tbptt import create_supervised_tbptt_trainer
4 from ignite.contrib.engines.tbptt import Tbptt_Events
5
6
7 __all__ = ["create_supervised_tbptt_trainer", "Tbptt_Events"]
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ignite/contrib/engines/__init__.py b/ignite/contrib/engines/__init__.py
--- a/ignite/contrib/engines/__init__.py
+++ b/ignite/contrib/engines/__init__.py
@@ -2,6 +2,3 @@
from ignite.contrib.engines.tbptt import create_supervised_tbptt_trainer
from ignite.contrib.engines.tbptt import Tbptt_Events
-
-
-__all__ = ["create_supervised_tbptt_trainer", "Tbptt_Events"]
diff --git a/ignite/contrib/handlers/__init__.py b/ignite/contrib/handlers/__init__.py
--- a/ignite/contrib/handlers/__init__.py
+++ b/ignite/contrib/handlers/__init__.py
@@ -3,5 +3,3 @@
LinearCyclicalScheduler, CosineAnnealingScheduler
from ignite.contrib.handlers.tqdm_logger import ProgressBar
-
-__all__ = ['ProgressBar']
| {"golden_diff": "diff --git a/ignite/contrib/engines/__init__.py b/ignite/contrib/engines/__init__.py\n--- a/ignite/contrib/engines/__init__.py\n+++ b/ignite/contrib/engines/__init__.py\n@@ -2,6 +2,3 @@\n \n from ignite.contrib.engines.tbptt import create_supervised_tbptt_trainer\n from ignite.contrib.engines.tbptt import Tbptt_Events\n-\n-\n-__all__ = [\"create_supervised_tbptt_trainer\", \"Tbptt_Events\"]\ndiff --git a/ignite/contrib/handlers/__init__.py b/ignite/contrib/handlers/__init__.py\n--- a/ignite/contrib/handlers/__init__.py\n+++ b/ignite/contrib/handlers/__init__.py\n@@ -3,5 +3,3 @@\n LinearCyclicalScheduler, CosineAnnealingScheduler\n \n from ignite.contrib.handlers.tqdm_logger import ProgressBar\n-\n-__all__ = ['ProgressBar']\n", "issue": "ParamScheduler docs missing\nNo docs on `ParamScheduler` and related classes on the [site](https://pytorch.org/ignite/contrib/handlers.html).\n", "before_files": [{"content": "\nfrom ignite.contrib.handlers.param_scheduler import ParamScheduler, CyclicalScheduler, \\\n LinearCyclicalScheduler, CosineAnnealingScheduler\n\nfrom ignite.contrib.handlers.tqdm_logger import ProgressBar\n\n__all__ = ['ProgressBar']\n", "path": "ignite/contrib/handlers/__init__.py"}, {"content": "# coding: utf-8\n\nfrom ignite.contrib.engines.tbptt import create_supervised_tbptt_trainer\nfrom ignite.contrib.engines.tbptt import Tbptt_Events\n\n\n__all__ = [\"create_supervised_tbptt_trainer\", \"Tbptt_Events\"]\n", "path": "ignite/contrib/engines/__init__.py"}], "after_files": [{"content": "\nfrom ignite.contrib.handlers.param_scheduler import ParamScheduler, CyclicalScheduler, \\\n LinearCyclicalScheduler, CosineAnnealingScheduler\n\nfrom ignite.contrib.handlers.tqdm_logger import ProgressBar\n", "path": "ignite/contrib/handlers/__init__.py"}, {"content": "# coding: utf-8\n\nfrom ignite.contrib.engines.tbptt import create_supervised_tbptt_trainer\nfrom ignite.contrib.engines.tbptt import Tbptt_Events\n", "path": "ignite/contrib/engines/__init__.py"}]} | 449 | 224 |
gh_patches_debug_11762 | rasdani/github-patches | git_diff | common-workflow-language__cwltool-672 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Information about javascript timeouts not printed to the console
At https://github.com/common-workflow-language/cwltool/blob/f454efb1e980b2a9374afba4a8b793f73f4d891a/cwltool/sandboxjs.py#L297-L301, if `debug` is false, no information about the snippet of javascript that caused the timeout is printed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cwltool/sandboxjs.py`
Content:
```
1 from __future__ import absolute_import
2 import errno
3 import json
4 import logging
5 import os
6 import re
7 import select
8 import subprocess
9 import threading
10 import sys
11 from io import BytesIO
12 from typing import Any, Dict, List, Mapping, Text, Tuple, Union
13 from .utils import onWindows
14 from pkg_resources import resource_stream
15
16 import six
17
18 try:
19 import queue # type: ignore
20 except ImportError:
21 import Queue as queue # type: ignore
22
23 class JavascriptException(Exception):
24 pass
25
26
27 _logger = logging.getLogger("cwltool")
28
29 JSON = Union[Dict[Text, Any], List[Any], Text, int, float, bool, None]
30
31 localdata = threading.local()
32
33 have_node_slim = False
34 # minimum acceptable version of nodejs engine
35 minimum_node_version_str = '0.10.26'
36
37 def check_js_threshold_version(working_alias):
38 # type: (str) -> bool
39
40 """Checks if the nodeJS engine version on the system
41 with the allowed minimum version.
42 https://github.com/nodejs/node/blob/master/CHANGELOG.md#nodejs-changelog
43 """
44 # parse nodejs version into int Tuple: 'v4.2.6\n' -> [4, 2, 6]
45 current_version_str = subprocess.check_output(
46 [working_alias, "-v"]).decode('utf-8')
47
48 current_version = [int(v) for v in current_version_str.strip().strip('v').split('.')]
49 minimum_node_version = [int(v) for v in minimum_node_version_str.split('.')]
50
51 if current_version >= minimum_node_version:
52 return True
53 else:
54 return False
55
56
57 def new_js_proc(force_docker_pull=False, js_console=False):
58 # type: (bool, bool) -> subprocess.Popen
59
60 cwl_node_engine_js = 'cwlNodeEngine.js'
61 if js_console:
62 cwl_node_engine_js = 'cwlNodeEngineJSConsole.js'
63 _logger.warn("Running with support for javascript console in expressions (DO NOT USE IN PRODUCTION)")
64
65 res = resource_stream(__name__, cwl_node_engine_js)
66 nodecode = res.read().decode('utf-8')
67
68 required_node_version, docker = (False,)*2
69 nodejs = None
70 trynodes = ("nodejs", "node")
71 for n in trynodes:
72 try:
73 if subprocess.check_output([n, "--eval", "process.stdout.write('t')"]).decode('utf-8') != "t":
74 continue
75 else:
76 nodejs = subprocess.Popen([n, "--eval", nodecode],
77 stdin=subprocess.PIPE,
78 stdout=subprocess.PIPE,
79 stderr=subprocess.PIPE)
80
81 required_node_version = check_js_threshold_version(n)
82 break
83 except subprocess.CalledProcessError:
84 pass
85 except OSError as e:
86 if e.errno == errno.ENOENT:
87 pass
88 else:
89 raise
90
91 if nodejs is None or nodejs is not None and required_node_version is False:
92 try:
93 nodeimg = "node:slim"
94 global have_node_slim
95
96 if not have_node_slim:
97 dockerimgs = subprocess.check_output(["docker", "images", "-q", nodeimg]).decode('utf-8')
98 # if output is an empty string
99 if (len(dockerimgs.split("\n")) <= 1) or force_docker_pull:
100 # pull node:slim docker container
101 nodejsimg = subprocess.check_output(["docker", "pull", nodeimg]).decode('utf-8')
102 _logger.info("Pulled Docker image %s %s", nodeimg, nodejsimg)
103 have_node_slim = True
104 nodejs = subprocess.Popen(["docker", "run",
105 "--attach=STDIN", "--attach=STDOUT", "--attach=STDERR",
106 "--sig-proxy=true", "--interactive",
107 "--rm", nodeimg, "node", "--eval", nodecode],
108 stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
109 docker = True
110 except OSError as e:
111 if e.errno == errno.ENOENT:
112 pass
113 else:
114 raise
115 except subprocess.CalledProcessError:
116 pass
117
118 # docker failed and nodejs not on system
119 if nodejs is None:
120 raise JavascriptException(
121 u"cwltool requires Node.js engine to evaluate Javascript "
122 "expressions, but couldn't find it. Tried %s, docker run "
123 "node:slim" % u", ".join(trynodes))
124
125 # docker failed, but nodejs is installed on system but the version is below the required version
126 if docker is False and required_node_version is False:
127 raise JavascriptException(
128 u'cwltool requires minimum v{} version of Node.js engine.'.format(minimum_node_version_str),
129 u'Try updating: https://docs.npmjs.com/getting-started/installing-node')
130
131 return nodejs
132
133
134 def execjs(js, jslib, timeout=None, force_docker_pull=False, debug=False, js_console=False): # type: (Union[Mapping, Text], Any, int, bool, bool, bool) -> JSON
135
136 if not hasattr(localdata, "proc") or localdata.proc.poll() is not None or onWindows():
137 localdata.proc = new_js_proc(force_docker_pull=force_docker_pull, js_console=js_console)
138
139 nodejs = localdata.proc
140
141 fn = u"\"use strict\";\n%s\n(function()%s)()" %\
142 (jslib, js if isinstance(js, six.string_types) and len(js) > 1 and js[0] == '{' else ("{return (%s);}" % js))
143
144 killed = []
145
146 """ Kill the node process if it exceeds timeout limit"""
147 def terminate():
148 try:
149 killed.append(True)
150 nodejs.kill()
151 except OSError:
152 pass
153
154 if timeout is None:
155 timeout = 20
156
157 tm = threading.Timer(timeout, terminate)
158 tm.start()
159
160 stdin_buf = BytesIO((json.dumps(fn) + "\n").encode('utf-8'))
161 stdout_buf = BytesIO()
162 stderr_buf = BytesIO()
163
164 rselect = [nodejs.stdout, nodejs.stderr] # type: List[BytesIO]
165 wselect = [nodejs.stdin] # type: List[BytesIO]
166
167 PROCESS_FINISHED_STR = "r1cepzbhUTxtykz5XTC4\n"
168
169 def process_finished(): # type: () -> bool
170 return stdout_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR) and \
171 stderr_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR)
172
173 # On windows system standard input/output are not handled properly by select module
174 # (modules like pywin32, msvcrt, gevent don't work either)
175 if sys.platform=='win32':
176 READ_BYTES_SIZE = 512
177
178 # creating queue for reading from a thread to queue
179 input_queue = queue.Queue()
180 output_queue = queue.Queue()
181 error_queue = queue.Queue()
182
183 # To tell threads that output has ended and threads can safely exit
184 no_more_output = threading.Lock()
185 no_more_output.acquire()
186 no_more_error = threading.Lock()
187 no_more_error.acquire()
188
189 # put constructed command to input queue which then will be passed to nodejs's stdin
190 def put_input(input_queue):
191 while True:
192 b = stdin_buf.read(READ_BYTES_SIZE)
193 if b:
194 input_queue.put(b)
195 else:
196 break
197
198 # get the output from nodejs's stdout and continue till otuput ends
199 def get_output(output_queue):
200 while not no_more_output.acquire(False):
201 b=os.read(nodejs.stdout.fileno(), READ_BYTES_SIZE)
202 if b:
203 output_queue.put(b)
204
205 # get the output from nodejs's stderr and continue till error output ends
206 def get_error(error_queue):
207 while not no_more_error.acquire(False):
208 b = os.read(nodejs.stderr.fileno(), READ_BYTES_SIZE)
209 if b:
210 error_queue.put(b)
211
212 # Threads managing nodejs.stdin, nodejs.stdout and nodejs.stderr respectively
213 input_thread = threading.Thread(target=put_input, args=(input_queue,))
214 input_thread.daemon=True
215 input_thread.start()
216 output_thread = threading.Thread(target=get_output, args=(output_queue,))
217 output_thread.daemon=True
218 output_thread.start()
219 error_thread = threading.Thread(target=get_error, args=(error_queue,))
220 error_thread.daemon=True
221 error_thread.start()
222
223 finished = False
224
225 while not finished and tm.is_alive():
226 try:
227 if nodejs.stdin in wselect:
228 if not input_queue.empty():
229 os.write(nodejs.stdin.fileno(), input_queue.get())
230 elif not input_thread.is_alive():
231 wselect = []
232 if nodejs.stdout in rselect:
233 if not output_queue.empty():
234 stdout_buf.write(output_queue.get())
235
236 if nodejs.stderr in rselect:
237 if not error_queue.empty():
238 stderr_buf.write(error_queue.get())
239
240 if process_finished() and error_queue.empty() and output_queue.empty():
241 finished = True
242 no_more_output.release()
243 no_more_error.release()
244 except OSError as e:
245 break
246
247 else:
248 while not process_finished() and tm.is_alive():
249 rready, wready, _ = select.select(rselect, wselect, [])
250 try:
251 if nodejs.stdin in wready:
252 b = stdin_buf.read(select.PIPE_BUF)
253 if b:
254 os.write(nodejs.stdin.fileno(), b)
255 for pipes in ((nodejs.stdout, stdout_buf), (nodejs.stderr, stderr_buf)):
256 if pipes[0] in rready:
257 b = os.read(pipes[0].fileno(), select.PIPE_BUF)
258 if b:
259 pipes[1].write(b)
260 except OSError as e:
261 break
262 tm.cancel()
263
264 stdin_buf.close()
265 stdoutdata = stdout_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]
266 stderrdata = stderr_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]
267
268 def fn_linenum(): # type: () -> Text
269 lines = fn.splitlines()
270 ofs = 0
271 maxlines = 99
272 if len(lines) > maxlines:
273 ofs = len(lines) - maxlines
274 lines = lines[-maxlines:]
275 return u"\n".join(u"%02i %s" % (i + ofs + 1, b) for i, b in enumerate(lines))
276
277 def stdfmt(data): # type: (Text) -> Text
278 if "\n" in data:
279 return "\n" + data.strip()
280 return data
281
282 nodejs.poll()
283
284 if js_console:
285 if len(stderrdata) > 0:
286 _logger.info("Javascript console output:")
287 _logger.info("----------------------------------------")
288 _logger.info('\n'.join(re.findall(r'^[[](?:log|err)[]].*$', stderrdata.decode('utf-8'), flags=re.MULTILINE)))
289 _logger.info("----------------------------------------")
290
291 if debug:
292 info = u"returncode was: %s\nscript was:\n%s\nstdout was: %s\nstderr was: %s\n" %\
293 (nodejs.returncode, fn_linenum(), stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))
294 else:
295 info = stdfmt(stderrdata.decode('utf-8'))
296
297 if nodejs.poll() not in (None, 0):
298 if killed:
299 raise JavascriptException(u"Long-running script killed after %s seconds: %s" % (timeout, info))
300 else:
301 raise JavascriptException(info)
302 else:
303 try:
304 # On windows currently a new instance of nodejs process is used due to problem with blocking on read operation on windows
305 if onWindows():
306 nodejs.kill()
307 return json.loads(stdoutdata.decode('utf-8'))
308 except ValueError as e:
309 raise JavascriptException(u"%s\nscript was:\n%s\nstdout was: '%s'\nstderr was: '%s'\n" %
310 (e, fn_linenum(), stdoutdata, stderrdata))
311
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cwltool/sandboxjs.py b/cwltool/sandboxjs.py
--- a/cwltool/sandboxjs.py
+++ b/cwltool/sandboxjs.py
@@ -292,7 +292,8 @@
info = u"returncode was: %s\nscript was:\n%s\nstdout was: %s\nstderr was: %s\n" %\
(nodejs.returncode, fn_linenum(), stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))
else:
- info = stdfmt(stderrdata.decode('utf-8'))
+ info = u"Javascript expression was: %s\nstdout was: %s\nstderr was: %s" %\
+ (js, stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))
if nodejs.poll() not in (None, 0):
if killed:
| {"golden_diff": "diff --git a/cwltool/sandboxjs.py b/cwltool/sandboxjs.py\n--- a/cwltool/sandboxjs.py\n+++ b/cwltool/sandboxjs.py\n@@ -292,7 +292,8 @@\n info = u\"returncode was: %s\\nscript was:\\n%s\\nstdout was: %s\\nstderr was: %s\\n\" %\\\n (nodejs.returncode, fn_linenum(), stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))\n else:\n- info = stdfmt(stderrdata.decode('utf-8'))\n+ info = u\"Javascript expression was: %s\\nstdout was: %s\\nstderr was: %s\" %\\\n+ (js, stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))\n \n if nodejs.poll() not in (None, 0):\n if killed:\n", "issue": "Information about javascript timeouts not printed to the console\nAt https://github.com/common-workflow-language/cwltool/blob/f454efb1e980b2a9374afba4a8b793f73f4d891a/cwltool/sandboxjs.py#L297-L301, if `debug` is false, no information about the snippet of javascript that caused the timeout is printed.\nInformation about javascript timeouts not printed to the console\nAt https://github.com/common-workflow-language/cwltool/blob/f454efb1e980b2a9374afba4a8b793f73f4d891a/cwltool/sandboxjs.py#L297-L301, if `debug` is false, no information about the snippet of javascript that caused the timeout is printed.\n", "before_files": [{"content": "from __future__ import absolute_import\nimport errno\nimport json\nimport logging\nimport os\nimport re\nimport select\nimport subprocess\nimport threading\nimport sys\nfrom io import BytesIO\nfrom typing import Any, Dict, List, Mapping, Text, Tuple, Union\nfrom .utils import onWindows\nfrom pkg_resources import resource_stream\n\nimport six\n\ntry:\n import queue # type: ignore\nexcept ImportError:\n import Queue as queue # type: ignore\n\nclass JavascriptException(Exception):\n pass\n\n\n_logger = logging.getLogger(\"cwltool\")\n\nJSON = Union[Dict[Text, Any], List[Any], Text, int, float, bool, None]\n\nlocaldata = threading.local()\n\nhave_node_slim = False\n# minimum acceptable version of nodejs engine\nminimum_node_version_str = '0.10.26'\n\ndef check_js_threshold_version(working_alias):\n # type: (str) -> bool\n\n \"\"\"Checks if the nodeJS engine version on the system\n with the allowed minimum version.\n https://github.com/nodejs/node/blob/master/CHANGELOG.md#nodejs-changelog\n \"\"\"\n # parse nodejs version into int Tuple: 'v4.2.6\\n' -> [4, 2, 6]\n current_version_str = subprocess.check_output(\n [working_alias, \"-v\"]).decode('utf-8')\n\n current_version = [int(v) for v in current_version_str.strip().strip('v').split('.')]\n minimum_node_version = [int(v) for v in minimum_node_version_str.split('.')]\n\n if current_version >= minimum_node_version:\n return True\n else:\n return False\n\n\ndef new_js_proc(force_docker_pull=False, js_console=False):\n # type: (bool, bool) -> subprocess.Popen\n \n cwl_node_engine_js = 'cwlNodeEngine.js'\n if js_console:\n cwl_node_engine_js = 'cwlNodeEngineJSConsole.js'\n _logger.warn(\"Running with support for javascript console in expressions (DO NOT USE IN PRODUCTION)\")\n\n res = resource_stream(__name__, cwl_node_engine_js)\n nodecode = res.read().decode('utf-8')\n\n required_node_version, docker = (False,)*2\n nodejs = None\n trynodes = (\"nodejs\", \"node\")\n for n in trynodes:\n try:\n if subprocess.check_output([n, \"--eval\", \"process.stdout.write('t')\"]).decode('utf-8') != \"t\":\n continue\n else:\n nodejs = subprocess.Popen([n, \"--eval\", nodecode],\n stdin=subprocess.PIPE,\n stdout=subprocess.PIPE,\n stderr=subprocess.PIPE)\n\n required_node_version = 
check_js_threshold_version(n)\n break\n except subprocess.CalledProcessError:\n pass\n except OSError as e:\n if e.errno == errno.ENOENT:\n pass\n else:\n raise\n\n if nodejs is None or nodejs is not None and required_node_version is False:\n try:\n nodeimg = \"node:slim\"\n global have_node_slim\n\n if not have_node_slim:\n dockerimgs = subprocess.check_output([\"docker\", \"images\", \"-q\", nodeimg]).decode('utf-8')\n # if output is an empty string\n if (len(dockerimgs.split(\"\\n\")) <= 1) or force_docker_pull:\n # pull node:slim docker container\n nodejsimg = subprocess.check_output([\"docker\", \"pull\", nodeimg]).decode('utf-8')\n _logger.info(\"Pulled Docker image %s %s\", nodeimg, nodejsimg)\n have_node_slim = True\n nodejs = subprocess.Popen([\"docker\", \"run\",\n \"--attach=STDIN\", \"--attach=STDOUT\", \"--attach=STDERR\",\n \"--sig-proxy=true\", \"--interactive\",\n \"--rm\", nodeimg, \"node\", \"--eval\", nodecode],\n stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n docker = True\n except OSError as e:\n if e.errno == errno.ENOENT:\n pass\n else:\n raise\n except subprocess.CalledProcessError:\n pass\n\n # docker failed and nodejs not on system\n if nodejs is None:\n raise JavascriptException(\n u\"cwltool requires Node.js engine to evaluate Javascript \"\n \"expressions, but couldn't find it. Tried %s, docker run \"\n \"node:slim\" % u\", \".join(trynodes))\n\n # docker failed, but nodejs is installed on system but the version is below the required version\n if docker is False and required_node_version is False:\n raise JavascriptException(\n u'cwltool requires minimum v{} version of Node.js engine.'.format(minimum_node_version_str),\n u'Try updating: https://docs.npmjs.com/getting-started/installing-node')\n\n return nodejs\n\n\ndef execjs(js, jslib, timeout=None, force_docker_pull=False, debug=False, js_console=False): # type: (Union[Mapping, Text], Any, int, bool, bool, bool) -> JSON\n\n if not hasattr(localdata, \"proc\") or localdata.proc.poll() is not None or onWindows():\n localdata.proc = new_js_proc(force_docker_pull=force_docker_pull, js_console=js_console)\n\n nodejs = localdata.proc\n\n fn = u\"\\\"use strict\\\";\\n%s\\n(function()%s)()\" %\\\n (jslib, js if isinstance(js, six.string_types) and len(js) > 1 and js[0] == '{' else (\"{return (%s);}\" % js))\n\n killed = []\n\n \"\"\" Kill the node process if it exceeds timeout limit\"\"\"\n def terminate():\n try:\n killed.append(True)\n nodejs.kill()\n except OSError:\n pass\n\n if timeout is None:\n timeout = 20\n\n tm = threading.Timer(timeout, terminate)\n tm.start()\n\n stdin_buf = BytesIO((json.dumps(fn) + \"\\n\").encode('utf-8'))\n stdout_buf = BytesIO()\n stderr_buf = BytesIO()\n\n rselect = [nodejs.stdout, nodejs.stderr] # type: List[BytesIO]\n wselect = [nodejs.stdin] # type: List[BytesIO]\n\n PROCESS_FINISHED_STR = \"r1cepzbhUTxtykz5XTC4\\n\"\n\n def process_finished(): # type: () -> bool\n return stdout_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR) and \\\n stderr_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR)\n\n # On windows system standard input/output are not handled properly by select module\n # (modules like pywin32, msvcrt, gevent don't work either)\n if sys.platform=='win32':\n READ_BYTES_SIZE = 512\n\n # creating queue for reading from a thread to queue\n input_queue = queue.Queue()\n output_queue = queue.Queue()\n error_queue = queue.Queue()\n\n # To tell threads that output has ended and threads can safely exit\n no_more_output = threading.Lock()\n 
no_more_output.acquire()\n no_more_error = threading.Lock()\n no_more_error.acquire()\n\n # put constructed command to input queue which then will be passed to nodejs's stdin\n def put_input(input_queue):\n while True:\n b = stdin_buf.read(READ_BYTES_SIZE)\n if b:\n input_queue.put(b)\n else:\n break\n\n # get the output from nodejs's stdout and continue till otuput ends\n def get_output(output_queue):\n while not no_more_output.acquire(False):\n b=os.read(nodejs.stdout.fileno(), READ_BYTES_SIZE)\n if b:\n output_queue.put(b)\n\n # get the output from nodejs's stderr and continue till error output ends\n def get_error(error_queue):\n while not no_more_error.acquire(False):\n b = os.read(nodejs.stderr.fileno(), READ_BYTES_SIZE)\n if b:\n error_queue.put(b)\n\n # Threads managing nodejs.stdin, nodejs.stdout and nodejs.stderr respectively\n input_thread = threading.Thread(target=put_input, args=(input_queue,))\n input_thread.daemon=True\n input_thread.start()\n output_thread = threading.Thread(target=get_output, args=(output_queue,))\n output_thread.daemon=True\n output_thread.start()\n error_thread = threading.Thread(target=get_error, args=(error_queue,))\n error_thread.daemon=True\n error_thread.start()\n\n finished = False\n\n while not finished and tm.is_alive():\n try:\n if nodejs.stdin in wselect:\n if not input_queue.empty():\n os.write(nodejs.stdin.fileno(), input_queue.get())\n elif not input_thread.is_alive():\n wselect = []\n if nodejs.stdout in rselect:\n if not output_queue.empty():\n stdout_buf.write(output_queue.get())\n\n if nodejs.stderr in rselect:\n if not error_queue.empty():\n stderr_buf.write(error_queue.get()) \n\n if process_finished() and error_queue.empty() and output_queue.empty():\n finished = True\n no_more_output.release()\n no_more_error.release()\n except OSError as e:\n break\n\n else:\n while not process_finished() and tm.is_alive():\n rready, wready, _ = select.select(rselect, wselect, [])\n try:\n if nodejs.stdin in wready:\n b = stdin_buf.read(select.PIPE_BUF)\n if b:\n os.write(nodejs.stdin.fileno(), b)\n for pipes in ((nodejs.stdout, stdout_buf), (nodejs.stderr, stderr_buf)):\n if pipes[0] in rready:\n b = os.read(pipes[0].fileno(), select.PIPE_BUF)\n if b:\n pipes[1].write(b)\n except OSError as e:\n break\n tm.cancel()\n\n stdin_buf.close()\n stdoutdata = stdout_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]\n stderrdata = stderr_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]\n\n def fn_linenum(): # type: () -> Text\n lines = fn.splitlines()\n ofs = 0\n maxlines = 99\n if len(lines) > maxlines:\n ofs = len(lines) - maxlines\n lines = lines[-maxlines:]\n return u\"\\n\".join(u\"%02i %s\" % (i + ofs + 1, b) for i, b in enumerate(lines))\n\n def stdfmt(data): # type: (Text) -> Text\n if \"\\n\" in data:\n return \"\\n\" + data.strip()\n return data\n\n nodejs.poll()\n\n if js_console:\n if len(stderrdata) > 0:\n _logger.info(\"Javascript console output:\")\n _logger.info(\"----------------------------------------\")\n _logger.info('\\n'.join(re.findall(r'^[[](?:log|err)[]].*$', stderrdata.decode('utf-8'), flags=re.MULTILINE)))\n _logger.info(\"----------------------------------------\")\n\n if debug:\n info = u\"returncode was: %s\\nscript was:\\n%s\\nstdout was: %s\\nstderr was: %s\\n\" %\\\n (nodejs.returncode, fn_linenum(), stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))\n else:\n info = stdfmt(stderrdata.decode('utf-8'))\n\n if nodejs.poll() not in (None, 0):\n if killed:\n raise JavascriptException(u\"Long-running script 
killed after %s seconds: %s\" % (timeout, info))\n else:\n raise JavascriptException(info)\n else:\n try:\n # On windows currently a new instance of nodejs process is used due to problem with blocking on read operation on windows\n if onWindows():\n nodejs.kill()\n return json.loads(stdoutdata.decode('utf-8'))\n except ValueError as e:\n raise JavascriptException(u\"%s\\nscript was:\\n%s\\nstdout was: '%s'\\nstderr was: '%s'\\n\" %\n (e, fn_linenum(), stdoutdata, stderrdata))\n", "path": "cwltool/sandboxjs.py"}], "after_files": [{"content": "from __future__ import absolute_import\nimport errno\nimport json\nimport logging\nimport os\nimport re\nimport select\nimport subprocess\nimport threading\nimport sys\nfrom io import BytesIO\nfrom typing import Any, Dict, List, Mapping, Text, Tuple, Union\nfrom .utils import onWindows\nfrom pkg_resources import resource_stream\n\nimport six\n\ntry:\n import queue # type: ignore\nexcept ImportError:\n import Queue as queue # type: ignore\n\nclass JavascriptException(Exception):\n pass\n\n\n_logger = logging.getLogger(\"cwltool\")\n\nJSON = Union[Dict[Text, Any], List[Any], Text, int, float, bool, None]\n\nlocaldata = threading.local()\n\nhave_node_slim = False\n# minimum acceptable version of nodejs engine\nminimum_node_version_str = '0.10.26'\n\ndef check_js_threshold_version(working_alias):\n # type: (str) -> bool\n\n \"\"\"Checks if the nodeJS engine version on the system\n with the allowed minimum version.\n https://github.com/nodejs/node/blob/master/CHANGELOG.md#nodejs-changelog\n \"\"\"\n # parse nodejs version into int Tuple: 'v4.2.6\\n' -> [4, 2, 6]\n current_version_str = subprocess.check_output(\n [working_alias, \"-v\"]).decode('utf-8')\n\n current_version = [int(v) for v in current_version_str.strip().strip('v').split('.')]\n minimum_node_version = [int(v) for v in minimum_node_version_str.split('.')]\n\n if current_version >= minimum_node_version:\n return True\n else:\n return False\n\n\ndef new_js_proc(force_docker_pull=False, js_console=False):\n # type: (bool, bool) -> subprocess.Popen\n \n cwl_node_engine_js = 'cwlNodeEngine.js'\n if js_console:\n cwl_node_engine_js = 'cwlNodeEngineJSConsole.js'\n _logger.warn(\"Running with support for javascript console in expressions (DO NOT USE IN PRODUCTION)\")\n\n res = resource_stream(__name__, cwl_node_engine_js)\n nodecode = res.read().decode('utf-8')\n\n required_node_version, docker = (False,)*2\n nodejs = None\n trynodes = (\"nodejs\", \"node\")\n for n in trynodes:\n try:\n if subprocess.check_output([n, \"--eval\", \"process.stdout.write('t')\"]).decode('utf-8') != \"t\":\n continue\n else:\n nodejs = subprocess.Popen([n, \"--eval\", nodecode],\n stdin=subprocess.PIPE,\n stdout=subprocess.PIPE,\n stderr=subprocess.PIPE)\n\n required_node_version = check_js_threshold_version(n)\n break\n except subprocess.CalledProcessError:\n pass\n except OSError as e:\n if e.errno == errno.ENOENT:\n pass\n else:\n raise\n\n if nodejs is None or nodejs is not None and required_node_version is False:\n try:\n nodeimg = \"node:slim\"\n global have_node_slim\n\n if not have_node_slim:\n dockerimgs = subprocess.check_output([\"docker\", \"images\", \"-q\", nodeimg]).decode('utf-8')\n # if output is an empty string\n if (len(dockerimgs.split(\"\\n\")) <= 1) or force_docker_pull:\n # pull node:slim docker container\n nodejsimg = subprocess.check_output([\"docker\", \"pull\", nodeimg]).decode('utf-8')\n _logger.info(\"Pulled Docker image %s %s\", nodeimg, nodejsimg)\n have_node_slim = True\n nodejs = 
subprocess.Popen([\"docker\", \"run\",\n \"--attach=STDIN\", \"--attach=STDOUT\", \"--attach=STDERR\",\n \"--sig-proxy=true\", \"--interactive\",\n \"--rm\", nodeimg, \"node\", \"--eval\", nodecode],\n stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n docker = True\n except OSError as e:\n if e.errno == errno.ENOENT:\n pass\n else:\n raise\n except subprocess.CalledProcessError:\n pass\n\n # docker failed and nodejs not on system\n if nodejs is None:\n raise JavascriptException(\n u\"cwltool requires Node.js engine to evaluate Javascript \"\n \"expressions, but couldn't find it. Tried %s, docker run \"\n \"node:slim\" % u\", \".join(trynodes))\n\n # docker failed, but nodejs is installed on system but the version is below the required version\n if docker is False and required_node_version is False:\n raise JavascriptException(\n u'cwltool requires minimum v{} version of Node.js engine.'.format(minimum_node_version_str),\n u'Try updating: https://docs.npmjs.com/getting-started/installing-node')\n\n return nodejs\n\n\ndef execjs(js, jslib, timeout=None, force_docker_pull=False, debug=False, js_console=False): # type: (Union[Mapping, Text], Any, int, bool, bool, bool) -> JSON\n\n if not hasattr(localdata, \"proc\") or localdata.proc.poll() is not None or onWindows():\n localdata.proc = new_js_proc(force_docker_pull=force_docker_pull, js_console=js_console)\n\n nodejs = localdata.proc\n\n fn = u\"\\\"use strict\\\";\\n%s\\n(function()%s)()\" %\\\n (jslib, js if isinstance(js, six.string_types) and len(js) > 1 and js[0] == '{' else (\"{return (%s);}\" % js))\n\n killed = []\n\n \"\"\" Kill the node process if it exceeds timeout limit\"\"\"\n def terminate():\n try:\n killed.append(True)\n nodejs.kill()\n except OSError:\n pass\n\n if timeout is None:\n timeout = 20\n\n tm = threading.Timer(timeout, terminate)\n tm.start()\n\n stdin_buf = BytesIO((json.dumps(fn) + \"\\n\").encode('utf-8'))\n stdout_buf = BytesIO()\n stderr_buf = BytesIO()\n\n rselect = [nodejs.stdout, nodejs.stderr] # type: List[BytesIO]\n wselect = [nodejs.stdin] # type: List[BytesIO]\n\n PROCESS_FINISHED_STR = \"r1cepzbhUTxtykz5XTC4\\n\"\n\n def process_finished(): # type: () -> bool\n return stdout_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR) and \\\n stderr_buf.getvalue().decode().endswith(PROCESS_FINISHED_STR)\n\n # On windows system standard input/output are not handled properly by select module\n # (modules like pywin32, msvcrt, gevent don't work either)\n if sys.platform=='win32':\n READ_BYTES_SIZE = 512\n\n # creating queue for reading from a thread to queue\n input_queue = queue.Queue()\n output_queue = queue.Queue()\n error_queue = queue.Queue()\n\n # To tell threads that output has ended and threads can safely exit\n no_more_output = threading.Lock()\n no_more_output.acquire()\n no_more_error = threading.Lock()\n no_more_error.acquire()\n\n # put constructed command to input queue which then will be passed to nodejs's stdin\n def put_input(input_queue):\n while True:\n b = stdin_buf.read(READ_BYTES_SIZE)\n if b:\n input_queue.put(b)\n else:\n break\n\n # get the output from nodejs's stdout and continue till otuput ends\n def get_output(output_queue):\n while not no_more_output.acquire(False):\n b=os.read(nodejs.stdout.fileno(), READ_BYTES_SIZE)\n if b:\n output_queue.put(b)\n\n # get the output from nodejs's stderr and continue till error output ends\n def get_error(error_queue):\n while not no_more_error.acquire(False):\n b = os.read(nodejs.stderr.fileno(), READ_BYTES_SIZE)\n if b:\n 
error_queue.put(b)\n\n # Threads managing nodejs.stdin, nodejs.stdout and nodejs.stderr respectively\n input_thread = threading.Thread(target=put_input, args=(input_queue,))\n input_thread.daemon=True\n input_thread.start()\n output_thread = threading.Thread(target=get_output, args=(output_queue,))\n output_thread.daemon=True\n output_thread.start()\n error_thread = threading.Thread(target=get_error, args=(error_queue,))\n error_thread.daemon=True\n error_thread.start()\n\n finished = False\n\n while not finished and tm.is_alive():\n try:\n if nodejs.stdin in wselect:\n if not input_queue.empty():\n os.write(nodejs.stdin.fileno(), input_queue.get())\n elif not input_thread.is_alive():\n wselect = []\n if nodejs.stdout in rselect:\n if not output_queue.empty():\n stdout_buf.write(output_queue.get())\n\n if nodejs.stderr in rselect:\n if not error_queue.empty():\n stderr_buf.write(error_queue.get()) \n\n if process_finished() and error_queue.empty() and output_queue.empty():\n finished = True\n no_more_output.release()\n no_more_error.release()\n except OSError as e:\n break\n\n else:\n while not process_finished() and tm.is_alive():\n rready, wready, _ = select.select(rselect, wselect, [])\n try:\n if nodejs.stdin in wready:\n b = stdin_buf.read(select.PIPE_BUF)\n if b:\n os.write(nodejs.stdin.fileno(), b)\n for pipes in ((nodejs.stdout, stdout_buf), (nodejs.stderr, stderr_buf)):\n if pipes[0] in rready:\n b = os.read(pipes[0].fileno(), select.PIPE_BUF)\n if b:\n pipes[1].write(b)\n except OSError as e:\n break\n tm.cancel()\n\n stdin_buf.close()\n stdoutdata = stdout_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]\n stderrdata = stderr_buf.getvalue()[:-len(PROCESS_FINISHED_STR) - 1]\n\n def fn_linenum(): # type: () -> Text\n lines = fn.splitlines()\n ofs = 0\n maxlines = 99\n if len(lines) > maxlines:\n ofs = len(lines) - maxlines\n lines = lines[-maxlines:]\n return u\"\\n\".join(u\"%02i %s\" % (i + ofs + 1, b) for i, b in enumerate(lines))\n\n def stdfmt(data): # type: (Text) -> Text\n if \"\\n\" in data:\n return \"\\n\" + data.strip()\n return data\n\n nodejs.poll()\n\n if js_console:\n if len(stderrdata) > 0:\n _logger.info(\"Javascript console output:\")\n _logger.info(\"----------------------------------------\")\n _logger.info('\\n'.join(re.findall(r'^[[](?:log|err)[]].*$', stderrdata.decode('utf-8'), flags=re.MULTILINE)))\n _logger.info(\"----------------------------------------\")\n\n if debug:\n info = u\"returncode was: %s\\nscript was:\\n%s\\nstdout was: %s\\nstderr was: %s\\n\" %\\\n (nodejs.returncode, fn_linenum(), stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))\n else:\n info = u\"Javascript expression was: %s\\nstdout was: %s\\nstderr was: %s\" %\\\n (js, stdfmt(stdoutdata.decode('utf-8')), stdfmt(stderrdata.decode('utf-8')))\n\n if nodejs.poll() not in (None, 0):\n if killed:\n raise JavascriptException(u\"Long-running script killed after %s seconds: %s\" % (timeout, info))\n else:\n raise JavascriptException(info)\n else:\n try:\n # On windows currently a new instance of nodejs process is used due to problem with blocking on read operation on windows\n if onWindows():\n nodejs.kill()\n return json.loads(stdoutdata.decode('utf-8'))\n except ValueError as e:\n raise JavascriptException(u\"%s\\nscript was:\\n%s\\nstdout was: '%s'\\nstderr was: '%s'\\n\" %\n (e, fn_linenum(), stdoutdata, stderrdata))\n", "path": "cwltool/sandboxjs.py"}]} | 3,949 | 206 |
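The change above only rewrites the `info` string that `execjs` attaches to a `JavascriptException` when `debug` is off, so a timeout now reports which expression was running along with its stdout and stderr. A minimal sketch of exercising that path is below, assuming cwltool is installed and a Node.js engine (or the `node:slim` Docker image) is available; the looping expression and the two-second timeout are made up for the example.

```python
from cwltool.sandboxjs import JavascriptException, execjs

try:
    # A deliberately non-terminating expression so the watchdog timer fires.
    execjs("{while (true) {}; return 1;}", "", timeout=2)
except JavascriptException as err:
    # Post-patch the message includes the offending expression even without
    # debug=True, e.g. "Long-running script killed after 2 seconds:
    # Javascript expression was: {while (true) {}; return 1;} ..."
    print(err)
```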
gh_patches_debug_5758 | rasdani/github-patches | git_diff | fossasia__open-event-server-2489 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Propose attendees/ticketing API
With the orga app and the implementation of API endpoints in this PR https://github.com/fossasia/open-event-orga-server/pull/2379, we have the first steps toward an attendee API. To what extent would that overlap with a ticketing API?
What is the best way to implement this and keep it generic? Do we need two APIs - Attendees and Ticketing or would that be handled in one API?
Also related to https://github.com/fossasia/open-event-orga-server/issues/904
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/api/tickets.py`
Content:
```
1 from flask.ext.restplus import Namespace
2
3 from app.helpers.ticketing import TicketingManager
4
5 from .helpers.helpers import (
6 requires_auth,
7 can_access)
8 from .helpers.utils import POST_RESPONSES
9 from .helpers.utils import Resource
10 from .helpers import custom_fields as fields
11 from ..helpers.data_getter import DataGetter
12
13 api = Namespace('tickets', description='Tickets', path='/')
14
15 ORDER = api.model('Order', {
16 'id': fields.Integer(),
17 'identifier': fields.String(),
18 'amount': fields.Float(),
19 'paid_via': fields.String(),
20 'invoice_number': fields.String(),
21 'payment_mode': fields.String(),
22 'status': fields.String(),
23 'completed_at': fields.DateTime(),
24 })
25
26 TICKET = api.model('Ticket', {
27 'id': fields.Integer(),
28 'name': fields.String(),
29 'description': fields.String(),
30 'type': fields.String(),
31 'price': fields.Float(),
32 'quantity': fields.Integer(),
33 })
34
35
36 @api.route('/events/<int:event_id>/tickets/')
37 class TicketsList(Resource):
38 @requires_auth
39 @api.doc('tickets', responses=POST_RESPONSES)
40 @api.marshal_list_with(TICKET)
41 def get(self, event_id):
42 """Get tickets of the event"""
43 return DataGetter.get_sales_open_tickets(event_id=event_id).all()
44
45
46 @api.route('/events/<int:event_id>/tickets/<int:ticket_id>')
47 class Ticket(Resource):
48 @requires_auth
49 @api.doc('ticket', responses=POST_RESPONSES)
50 @api.marshal_with(TICKET)
51 def get(self, event_id, ticket_id):
52 """Get information about a ticket"""
53 return TicketingManager.get_ticket(ticket_id=ticket_id)
54
55
56
57
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/api/tickets.py b/app/api/tickets.py
--- a/app/api/tickets.py
+++ b/app/api/tickets.py
@@ -52,5 +52,13 @@
"""Get information about a ticket"""
return TicketingManager.get_ticket(ticket_id=ticket_id)
[email protected]('/events/<int:event_id>/orders/<string:identifier>')
+class Order(Resource):
+ @requires_auth
+ @api.doc('order', responses=POST_RESPONSES)
+ @api.marshal_with(ORDER)
+ def get(self, event_id, identifier):
+ """Get information about a ticket"""
+ return TicketingManager.get_order_by_identifier(identifier=identifier)
| {"golden_diff": "diff --git a/app/api/tickets.py b/app/api/tickets.py\n--- a/app/api/tickets.py\n+++ b/app/api/tickets.py\n@@ -52,5 +52,13 @@\n \"\"\"Get information about a ticket\"\"\"\n return TicketingManager.get_ticket(ticket_id=ticket_id)\n \[email protected]('/events/<int:event_id>/orders/<string:identifier>')\n+class Order(Resource):\n+ @requires_auth\n+ @api.doc('order', responses=POST_RESPONSES)\n+ @api.marshal_with(ORDER)\n+ def get(self, event_id, identifier):\n+ \"\"\"Get information about a ticket\"\"\"\n+ return TicketingManager.get_order_by_identifier(identifier=identifier)\n", "issue": "Propose attendees/ticketing API\nWith the orga app and the implementation of API endpoints in this PR https://github.com/fossasia/open-event-orga-server/pull/2379 we have the first steps to an attendee API. In how far would that overlap with a ticketing API?\n\nWhat is the best way to implement this and keep it generic? Do we need two APIs - Attendees and Ticketing or would that be handled in one API?\n\nAlso related to https://github.com/fossasia/open-event-orga-server/issues/904\n\n", "before_files": [{"content": "from flask.ext.restplus import Namespace\n\nfrom app.helpers.ticketing import TicketingManager\n\nfrom .helpers.helpers import (\n requires_auth,\n can_access)\nfrom .helpers.utils import POST_RESPONSES\nfrom .helpers.utils import Resource\nfrom .helpers import custom_fields as fields\nfrom ..helpers.data_getter import DataGetter\n\napi = Namespace('tickets', description='Tickets', path='/')\n\nORDER = api.model('Order', {\n 'id': fields.Integer(),\n 'identifier': fields.String(),\n 'amount': fields.Float(),\n 'paid_via': fields.String(),\n 'invoice_number': fields.String(),\n 'payment_mode': fields.String(),\n 'status': fields.String(),\n 'completed_at': fields.DateTime(),\n})\n\nTICKET = api.model('Ticket', {\n 'id': fields.Integer(),\n 'name': fields.String(),\n 'description': fields.String(),\n 'type': fields.String(),\n 'price': fields.Float(),\n 'quantity': fields.Integer(),\n})\n\n\[email protected]('/events/<int:event_id>/tickets/')\nclass TicketsList(Resource):\n @requires_auth\n @api.doc('tickets', responses=POST_RESPONSES)\n @api.marshal_list_with(TICKET)\n def get(self, event_id):\n \"\"\"Get tickets of the event\"\"\"\n return DataGetter.get_sales_open_tickets(event_id=event_id).all()\n\n\[email protected]('/events/<int:event_id>/tickets/<int:ticket_id>')\nclass Ticket(Resource):\n @requires_auth\n @api.doc('ticket', responses=POST_RESPONSES)\n @api.marshal_with(TICKET)\n def get(self, event_id, ticket_id):\n \"\"\"Get information about a ticket\"\"\"\n return TicketingManager.get_ticket(ticket_id=ticket_id)\n\n\n\n", "path": "app/api/tickets.py"}], "after_files": [{"content": "from flask.ext.restplus import Namespace\n\nfrom app.helpers.ticketing import TicketingManager\n\nfrom .helpers.helpers import (\n requires_auth,\n can_access)\nfrom .helpers.utils import POST_RESPONSES\nfrom .helpers.utils import Resource\nfrom .helpers import custom_fields as fields\nfrom ..helpers.data_getter import DataGetter\n\napi = Namespace('tickets', description='Tickets', path='/')\n\nORDER = api.model('Order', {\n 'id': fields.Integer(),\n 'identifier': fields.String(),\n 'amount': fields.Float(),\n 'paid_via': fields.String(),\n 'invoice_number': fields.String(),\n 'payment_mode': fields.String(),\n 'status': fields.String(),\n 'completed_at': fields.DateTime(),\n})\n\nTICKET = api.model('Ticket', {\n 'id': fields.Integer(),\n 'name': fields.String(),\n 'description': 
fields.String(),\n 'type': fields.String(),\n 'price': fields.Float(),\n 'quantity': fields.Integer(),\n})\n\n\[email protected]('/events/<int:event_id>/tickets/')\nclass TicketsList(Resource):\n @requires_auth\n @api.doc('tickets', responses=POST_RESPONSES)\n @api.marshal_list_with(TICKET)\n def get(self, event_id):\n \"\"\"Get tickets of the event\"\"\"\n return DataGetter.get_sales_open_tickets(event_id=event_id).all()\n\n\[email protected]('/events/<int:event_id>/tickets/<int:ticket_id>')\nclass Ticket(Resource):\n @requires_auth\n @api.doc('ticket', responses=POST_RESPONSES)\n @api.marshal_with(TICKET)\n def get(self, event_id, ticket_id):\n \"\"\"Get information about a ticket\"\"\"\n return TicketingManager.get_ticket(ticket_id=ticket_id)\n\[email protected]('/events/<int:event_id>/orders/<string:identifier>')\nclass Order(Resource):\n @requires_auth\n @api.doc('order', responses=POST_RESPONSES)\n @api.marshal_with(ORDER)\n def get(self, event_id, identifier):\n \"\"\"Get information about a ticket\"\"\"\n return TicketingManager.get_order_by_identifier(identifier=identifier)\n\n\n", "path": "app/api/tickets.py"}]} | 852 | 153 |
gh_patches_debug_17306 | rasdani/github-patches | git_diff | cal-itp__benefits-2116 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Refactor claims handling for integer claims
During the OAuth `authorize` flow, we look for [boolean claim values](https://github.com/cal-itp/benefits/blob/dev/benefits/oauth/views.py#L75) to determine if the user is eligible.
IdG is changing their claims implementation to cut down on the size of the token being sent to Benefits. Instead of booleans, they will use integers to indicate claim values:
* `0` will indicate `False` (i.e. the claim indicates eligibility failed)
* `1` will indicate `True` (i.e. the claim indicates eligibility succeeded)
* Any other integer `>= 10` will indicate an error code
**Note:** the claim values are transmitted in the token as `str`, and should be parsed to `int` before usage.
## Acceptance Criteria
<!-- Remember to consider edge cases -->
- [ ] `authorize` processes integer claims as described above
## Additional context
While we work to implement this change, existing flows for Older Adults and Veterans will use both claim styles. New flows for CalFresh and the new Veterans API will ~only use the newer integer claim style, so this refactor is necessary for supporting those flows.~ also support both styles to allow us time to implement and cut over. There are an entirely new set of scopes created for the integer-based claims so as not to interfere with the existing implementation.
Once we have this change tested and deployed, IdG will cutover all flows to use the integer style only.
Mapping error codes to error messages and analytics will be handled in #2049.
See [this Slack thread](https://cal-itp.slack.com/archives/C037Y3UE71P/p1714434750536319) from @johnatstate for more context.
--- END ISSUE ---
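The integer-claim convention spelled out in the issue can be summarised in a few lines of Python. This is an illustrative sketch only — the helper name and structure are invented for this note and are not taken from the benefits code base; it simply encodes the `0` / `1` / `>= 10` rule and the str-to-int parsing the issue calls for.

```python
# Illustrative sketch (hypothetical helper, not part of the repository).
# Claims arrive as strings: "1" means eligible, "0" means not eligible,
# and any value >= 10 is an error code to be mapped later (see #2049).
def interpret_claim(raw_value):
    if raw_value is None:
        return None          # claim missing from userinfo
    value = int(raw_value)   # claims are transmitted as str and must be parsed
    if value == 1:
        return True          # eligibility succeeded
    if value == 0:
        return False         # eligibility failed
    return value             # >= 10: an error code
```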
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `benefits/oauth/views.py`
Content:
```
1 import logging
2
3 from django.shortcuts import redirect
4 from django.urls import reverse
5 from django.utils.decorators import decorator_from_middleware
6
7 from benefits.core import session
8 from . import analytics, redirects
9 from .client import oauth
10 from .middleware import VerifierUsesAuthVerificationSessionRequired
11
12
13 logger = logging.getLogger(__name__)
14
15
16 ROUTE_AUTH = "oauth:authorize"
17 ROUTE_START = "eligibility:start"
18 ROUTE_CONFIRM = "eligibility:confirm"
19 ROUTE_UNVERIFIED = "eligibility:unverified"
20 ROUTE_POST_LOGOUT = "oauth:post_logout"
21
22
23 @decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)
24 def login(request):
25 """View implementing OIDC authorize_redirect."""
26 verifier = session.verifier(request)
27 oauth_client = oauth.create_client(verifier.auth_provider.client_name)
28
29 if not oauth_client:
30 raise Exception(f"oauth_client not registered: {verifier.auth_provider.client_name}")
31
32 route = reverse(ROUTE_AUTH)
33 redirect_uri = redirects.generate_redirect_uri(request, route)
34
35 logger.debug(f"OAuth authorize_redirect with redirect_uri: {redirect_uri}")
36
37 analytics.started_sign_in(request)
38
39 return oauth_client.authorize_redirect(request, redirect_uri)
40
41
42 @decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)
43 def authorize(request):
44 """View implementing OIDC token authorization."""
45 verifier = session.verifier(request)
46 oauth_client = oauth.create_client(verifier.auth_provider.client_name)
47
48 if not oauth_client:
49 raise Exception(f"oauth_client not registered: {verifier.auth_provider.client_name}")
50
51 logger.debug("Attempting to authorize OAuth access token")
52 token = oauth_client.authorize_access_token(request)
53
54 if token is None:
55 logger.warning("Could not authorize OAuth access token")
56 return redirect(ROUTE_START)
57
58 logger.debug("OAuth access token authorized")
59
60 # We store the id_token in the user's session. This is the minimal amount of information needed later to log the user out.
61 id_token = token["id_token"]
62
63 # We store the returned claim in case it can be used later in eligibility verification.
64 verifier_claim = verifier.auth_provider.claim
65 stored_claim = None
66
67 if verifier_claim:
68 userinfo = token.get("userinfo")
69
70 if userinfo:
71 claim_value = userinfo.get(verifier_claim)
72 # the claim comes back in userinfo like { "claim": "True" | "False" }
73 if claim_value is None:
74 logger.warning(f"userinfo did not contain: {verifier_claim}")
75 elif claim_value.lower() == "true":
76 # if userinfo contains our claim and the flag is true, store the *claim*
77 stored_claim = verifier_claim
78
79 session.update(request, oauth_token=id_token, oauth_claim=stored_claim)
80
81 analytics.finished_sign_in(request)
82
83 return redirect(ROUTE_CONFIRM)
84
85
86 @decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)
87 def cancel(request):
88 """View implementing cancellation of OIDC authorization."""
89
90 analytics.canceled_sign_in(request)
91
92 return redirect(ROUTE_UNVERIFIED)
93
94
95 @decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)
96 def logout(request):
97 """View implementing OIDC and application sign out."""
98 verifier = session.verifier(request)
99 oauth_client = oauth.create_client(verifier.auth_provider.client_name)
100
101 if not oauth_client:
102 raise Exception(f"oauth_client not registered: {verifier.auth_provider.client_name}")
103
104 analytics.started_sign_out(request)
105
106 # overwrite the oauth session token, the user is signed out of the app
107 token = session.oauth_token(request)
108 session.logout(request)
109
110 route = reverse(ROUTE_POST_LOGOUT)
111 redirect_uri = redirects.generate_redirect_uri(request, route)
112
113 logger.debug(f"OAuth end_session_endpoint with redirect_uri: {redirect_uri}")
114
115 # send the user through the end_session_endpoint, redirecting back to
116 # the post_logout route
117 return redirects.deauthorize_redirect(oauth_client, token, redirect_uri)
118
119
120 @decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)
121 def post_logout(request):
122 """View routes the user to their origin after sign out."""
123
124 analytics.finished_sign_out(request)
125
126 origin = session.origin(request)
127 return redirect(origin)
128
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/benefits/oauth/views.py b/benefits/oauth/views.py
--- a/benefits/oauth/views.py
+++ b/benefits/oauth/views.py
@@ -69,11 +69,12 @@
if userinfo:
claim_value = userinfo.get(verifier_claim)
- # the claim comes back in userinfo like { "claim": "True" | "False" }
+ # the claim comes back in userinfo like { "claim": "1" | "0" }
+ claim_value = int(claim_value) if claim_value else None
if claim_value is None:
logger.warning(f"userinfo did not contain: {verifier_claim}")
- elif claim_value.lower() == "true":
- # if userinfo contains our claim and the flag is true, store the *claim*
+ elif claim_value == 1:
+ # if userinfo contains our claim and the flag is 1 (true), store the *claim*
stored_claim = verifier_claim
session.update(request, oauth_token=id_token, oauth_claim=stored_claim)
| {"golden_diff": "diff --git a/benefits/oauth/views.py b/benefits/oauth/views.py\n--- a/benefits/oauth/views.py\n+++ b/benefits/oauth/views.py\n@@ -69,11 +69,12 @@\n \n if userinfo:\n claim_value = userinfo.get(verifier_claim)\n- # the claim comes back in userinfo like { \"claim\": \"True\" | \"False\" }\n+ # the claim comes back in userinfo like { \"claim\": \"1\" | \"0\" }\n+ claim_value = int(claim_value) if claim_value else None\n if claim_value is None:\n logger.warning(f\"userinfo did not contain: {verifier_claim}\")\n- elif claim_value.lower() == \"true\":\n- # if userinfo contains our claim and the flag is true, store the *claim*\n+ elif claim_value == 1:\n+ # if userinfo contains our claim and the flag is 1 (true), store the *claim*\n stored_claim = verifier_claim\n \n session.update(request, oauth_token=id_token, oauth_claim=stored_claim)\n", "issue": "Refactor claims handling for integer claims\nDuring the OAuth `authorize` flow, we look for [boolean claim values](https://github.com/cal-itp/benefits/blob/dev/benefits/oauth/views.py#L75) to determine if the user is eligible.\n\nIdG is changing their claims implementation to cut down on the size of the token being sent to Benefits. Instead of booleans, they will use integers to indicate claim values:\n\n* `0` will indicate `False` (i.e. the claim indicates eligibility failed)\n* `1` will indicate `True` (i.e. the claim indicates eligibility succeeded)\n* Any other integer `>= 10` will indicate an error code\n\n**Note:** the claim values are transmitted in the token as `str`, and should be parsed to `int` before usage.\n\n## Acceptance Criteria\n\n<!-- Remember to consider edge cases -->\n\n- [ ] `authorize` processes integer claims as described above\n\n## Additional context\n\nWhile we work to implement this change, existing flows for Older Adults and Veterans will use both claim styles. New flows for CalFresh and the new Veterans API will ~only use the newer integer claim style, so this refactor is necessary for supporting those flows.~ also support both styles to allow us time to implement and cut over. There are an entirely new set of scopes created for the integer-based claims so as not to interfere with the existing implementation.\n\nOnce we have this change tested and deployed, IdG will cutover all flows to use the integer style only.\n\nMapping error codes to error messages and analytics will be handled in #2049.\n\nSee [this Slack thread](https://cal-itp.slack.com/archives/C037Y3UE71P/p1714434750536319) from @johnatstate for more context.\n", "before_files": [{"content": "import logging\n\nfrom django.shortcuts import redirect\nfrom django.urls import reverse\nfrom django.utils.decorators import decorator_from_middleware\n\nfrom benefits.core import session\nfrom . 
import analytics, redirects\nfrom .client import oauth\nfrom .middleware import VerifierUsesAuthVerificationSessionRequired\n\n\nlogger = logging.getLogger(__name__)\n\n\nROUTE_AUTH = \"oauth:authorize\"\nROUTE_START = \"eligibility:start\"\nROUTE_CONFIRM = \"eligibility:confirm\"\nROUTE_UNVERIFIED = \"eligibility:unverified\"\nROUTE_POST_LOGOUT = \"oauth:post_logout\"\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef login(request):\n \"\"\"View implementing OIDC authorize_redirect.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n route = reverse(ROUTE_AUTH)\n redirect_uri = redirects.generate_redirect_uri(request, route)\n\n logger.debug(f\"OAuth authorize_redirect with redirect_uri: {redirect_uri}\")\n\n analytics.started_sign_in(request)\n\n return oauth_client.authorize_redirect(request, redirect_uri)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef authorize(request):\n \"\"\"View implementing OIDC token authorization.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n logger.debug(\"Attempting to authorize OAuth access token\")\n token = oauth_client.authorize_access_token(request)\n\n if token is None:\n logger.warning(\"Could not authorize OAuth access token\")\n return redirect(ROUTE_START)\n\n logger.debug(\"OAuth access token authorized\")\n\n # We store the id_token in the user's session. This is the minimal amount of information needed later to log the user out.\n id_token = token[\"id_token\"]\n\n # We store the returned claim in case it can be used later in eligibility verification.\n verifier_claim = verifier.auth_provider.claim\n stored_claim = None\n\n if verifier_claim:\n userinfo = token.get(\"userinfo\")\n\n if userinfo:\n claim_value = userinfo.get(verifier_claim)\n # the claim comes back in userinfo like { \"claim\": \"True\" | \"False\" }\n if claim_value is None:\n logger.warning(f\"userinfo did not contain: {verifier_claim}\")\n elif claim_value.lower() == \"true\":\n # if userinfo contains our claim and the flag is true, store the *claim*\n stored_claim = verifier_claim\n\n session.update(request, oauth_token=id_token, oauth_claim=stored_claim)\n\n analytics.finished_sign_in(request)\n\n return redirect(ROUTE_CONFIRM)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef cancel(request):\n \"\"\"View implementing cancellation of OIDC authorization.\"\"\"\n\n analytics.canceled_sign_in(request)\n\n return redirect(ROUTE_UNVERIFIED)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef logout(request):\n \"\"\"View implementing OIDC and application sign out.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n analytics.started_sign_out(request)\n\n # overwrite the oauth session token, the user is signed out of the app\n token = session.oauth_token(request)\n session.logout(request)\n\n route = reverse(ROUTE_POST_LOGOUT)\n redirect_uri = redirects.generate_redirect_uri(request, route)\n\n 
logger.debug(f\"OAuth end_session_endpoint with redirect_uri: {redirect_uri}\")\n\n # send the user through the end_session_endpoint, redirecting back to\n # the post_logout route\n return redirects.deauthorize_redirect(oauth_client, token, redirect_uri)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef post_logout(request):\n \"\"\"View routes the user to their origin after sign out.\"\"\"\n\n analytics.finished_sign_out(request)\n\n origin = session.origin(request)\n return redirect(origin)\n", "path": "benefits/oauth/views.py"}], "after_files": [{"content": "import logging\n\nfrom django.shortcuts import redirect\nfrom django.urls import reverse\nfrom django.utils.decorators import decorator_from_middleware\n\nfrom benefits.core import session\nfrom . import analytics, redirects\nfrom .client import oauth\nfrom .middleware import VerifierUsesAuthVerificationSessionRequired\n\n\nlogger = logging.getLogger(__name__)\n\n\nROUTE_AUTH = \"oauth:authorize\"\nROUTE_START = \"eligibility:start\"\nROUTE_CONFIRM = \"eligibility:confirm\"\nROUTE_UNVERIFIED = \"eligibility:unverified\"\nROUTE_POST_LOGOUT = \"oauth:post_logout\"\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef login(request):\n \"\"\"View implementing OIDC authorize_redirect.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n route = reverse(ROUTE_AUTH)\n redirect_uri = redirects.generate_redirect_uri(request, route)\n\n logger.debug(f\"OAuth authorize_redirect with redirect_uri: {redirect_uri}\")\n\n analytics.started_sign_in(request)\n\n return oauth_client.authorize_redirect(request, redirect_uri)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef authorize(request):\n \"\"\"View implementing OIDC token authorization.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n logger.debug(\"Attempting to authorize OAuth access token\")\n token = oauth_client.authorize_access_token(request)\n\n if token is None:\n logger.warning(\"Could not authorize OAuth access token\")\n return redirect(ROUTE_START)\n\n logger.debug(\"OAuth access token authorized\")\n\n # We store the id_token in the user's session. 
This is the minimal amount of information needed later to log the user out.\n id_token = token[\"id_token\"]\n\n # We store the returned claim in case it can be used later in eligibility verification.\n verifier_claim = verifier.auth_provider.claim\n stored_claim = None\n\n if verifier_claim:\n userinfo = token.get(\"userinfo\")\n\n if userinfo:\n claim_value = userinfo.get(verifier_claim)\n # the claim comes back in userinfo like { \"claim\": \"1\" | \"0\" }\n claim_value = int(claim_value) if claim_value else None\n if claim_value is None:\n logger.warning(f\"userinfo did not contain: {verifier_claim}\")\n elif claim_value == 1:\n # if userinfo contains our claim and the flag is 1 (true), store the *claim*\n stored_claim = verifier_claim\n\n session.update(request, oauth_token=id_token, oauth_claim=stored_claim)\n\n analytics.finished_sign_in(request)\n\n return redirect(ROUTE_CONFIRM)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef cancel(request):\n \"\"\"View implementing cancellation of OIDC authorization.\"\"\"\n\n analytics.canceled_sign_in(request)\n\n return redirect(ROUTE_UNVERIFIED)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef logout(request):\n \"\"\"View implementing OIDC and application sign out.\"\"\"\n verifier = session.verifier(request)\n oauth_client = oauth.create_client(verifier.auth_provider.client_name)\n\n if not oauth_client:\n raise Exception(f\"oauth_client not registered: {verifier.auth_provider.client_name}\")\n\n analytics.started_sign_out(request)\n\n # overwrite the oauth session token, the user is signed out of the app\n token = session.oauth_token(request)\n session.logout(request)\n\n route = reverse(ROUTE_POST_LOGOUT)\n redirect_uri = redirects.generate_redirect_uri(request, route)\n\n logger.debug(f\"OAuth end_session_endpoint with redirect_uri: {redirect_uri}\")\n\n # send the user through the end_session_endpoint, redirecting back to\n # the post_logout route\n return redirects.deauthorize_redirect(oauth_client, token, redirect_uri)\n\n\n@decorator_from_middleware(VerifierUsesAuthVerificationSessionRequired)\ndef post_logout(request):\n \"\"\"View routes the user to their origin after sign out.\"\"\"\n\n analytics.finished_sign_out(request)\n\n origin = session.origin(request)\n return redirect(origin)\n", "path": "benefits/oauth/views.py"}]} | 1,824 | 231 |
gh_patches_debug_6673 | rasdani/github-patches | git_diff | freqtrade__freqtrade-2884 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Strange results for download-data
I recalled #2716, tried to reproduce and obtained very strange results:
* remove config.json
* then run:
```
$ freqtrade download-data --exchange gemini --pairs 'BTC/USD' 'LTC/USD' 'BCH/USD' 'ETH/USD' -t 1h --days 5 -v
2020-02-08 03:48:33,819 - freqtrade.loggers - INFO - Verbosity set to 1
2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using exchange gemini
2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using user-data directory: /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data ...
2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using data directory: /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data/data/gemini ...
2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using pairs ['BTC/USD', 'LTC/USD', 'BCH/USD', 'ETH/USD']
2020-02-08 03:48:33,820 - freqtrade.configuration.configuration - INFO - timeframes --timeframes: ['1h']
2020-02-08 03:48:33,820 - freqtrade.configuration.configuration - INFO - Detected --days: 5
2020-02-08 03:48:33,820 - freqtrade.configuration.check_exchange - INFO - Checking exchange...
2020-02-08 03:48:33,820 - freqtrade.configuration.check_exchange - WARNING - Exchange "gemini" is known to the the ccxt library, available for the bot, but not officially supported by the Freqtrade development team. It may work flawlessly (please report back) or have serious issues. Use it at your own discretion.
2020-02-08 03:48:33,820 - freqtrade.configuration.deprecated_settings - WARNING - DEPRECATED: Pairlists must be defined explicitly in the future.Defaulting to StaticPairList for now.
2020-02-08 03:48:33,820 - freqtrade.configuration.config_validation - INFO - Validating configuration ...
2020-02-08 03:48:33,824 - freqtrade.commands.data_commands - INFO - About to download pairs: ['BTC/USD', 'LTC/USD', 'BCH/USD', 'ETH/USD'], intervals: ['1h'] to /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data/data/gemini
2020-02-08 03:48:33,824 - freqtrade.resolvers.exchange_resolver - INFO - No Gemini specific subclass found. Using the generic class instead.
2020-02-08 03:48:33,824 - freqtrade.exchange.exchange - INFO - Instance is running with dry_run enabled
2020-02-08 03:48:33,826 - freqtrade.exchange.exchange - INFO - Applying additional ccxt config: {'enableRateLimit': True}
2020-02-08 03:48:33,826 - asyncio - DEBUG - Using selector: EpollSelector
2020-02-08 03:48:33,828 - freqtrade.exchange.exchange - INFO - Using Exchange "Gemini"
2020-02-08 03:48:35,437 - chardet.charsetprober - DEBUG - utf-8 confidence = 0.99
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - SHIFT_JIS Japanese confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-JP Japanese confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - GB2312 Chinese confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-KR Korean confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - CP949 Korean confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - Big5 Chinese confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-TW Taiwan confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1251 Russian confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - KOI8-R Russian confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-5 Russian confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - MacCyrillic Russian confidence = 0.0
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - IBM866 Russian confidence = 0.0
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - IBM855 Russian confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-7 Greek confidence = 0.0
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1253 Greek confidence = 0.0
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-5 Bulgairan confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1251 Bulgarian confidence = 0.01
2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - TIS-620 Thai confidence = 0.0
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - ISO-8859-9 Turkish confidence = 0.6551158277756192
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - utf-8 confidence = 0.99
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - SHIFT_JIS Japanese confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-JP Japanese confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - GB2312 Chinese confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-KR Korean confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - CP949 Korean confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - Big5 Chinese confidence = 0.01
2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-TW Taiwan confidence = 0.01
2020-02-08 03:48:35,441 - freqtrade - ERROR - is not available as stake on Gemini. Available currencies are: BCH, BTC, ETH, LTC, USD
2020-02-08 03:48:35,441 - freqtrade.exchange.exchange - DEBUG - Exchange object destroyed, closing async loop
```
-- note 1) strange messages from "chardet.charsetprober" module; 2) empty "something" between "ERROR - is not available". 3) strange module "freqtrade" for this message...
download-data validates whitelist when --pairs is used
The download-data subcommands validates whitelist when `--pairs` is used:
```
$ freqtrade download-data -c config.json --exchange kraken --pairs BTC/USD -t 1d --days 100 -v
...
2019-12-17 20:07:49,885 - freqtrade - ERROR - Pair NXT/BTC is not available on Kraken. Please remove NXT/BTC from your whitelist.
```
Expected behavior: ignore pair_whitelist, work only with pairs set in the cli.
--- END ISSUE ---
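In other words, the expected behaviour is to skip the automatic whitelist validation entirely and validate only what was passed on the command line. A rough sketch of that idea, reusing names from the freqtrade module shown below and assumed to run inside `start_download_data` where `config` is already defined (illustrative only, not the final patch):

```python
# Sketch: load the exchange without the automatic pair_whitelist validation,
# then validate just the CLI-provided pairs and timeframes.
from freqtrade.resolvers import ExchangeResolver

exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config, validate=False)
exchange.validate_pairs(config["pairs"])            # only the pairs given via --pairs
for timeframe in config["timeframes"]:
    exchange.validate_timeframes(timeframe)
```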
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `freqtrade/commands/data_commands.py`
Content:
```
1 import logging
2 import sys
3 from typing import Any, Dict, List
4
5 import arrow
6
7 from freqtrade.configuration import TimeRange, setup_utils_configuration
8 from freqtrade.data.history import (convert_trades_to_ohlcv,
9 refresh_backtest_ohlcv_data,
10 refresh_backtest_trades_data)
11 from freqtrade.exceptions import OperationalException
12 from freqtrade.resolvers import ExchangeResolver
13 from freqtrade.state import RunMode
14
15 logger = logging.getLogger(__name__)
16
17
18 def start_download_data(args: Dict[str, Any]) -> None:
19 """
20 Download data (former download_backtest_data.py script)
21 """
22 config = setup_utils_configuration(args, RunMode.UTIL_EXCHANGE)
23
24 timerange = TimeRange()
25 if 'days' in config:
26 time_since = arrow.utcnow().shift(days=-config['days']).strftime("%Y%m%d")
27 timerange = TimeRange.parse_timerange(f'{time_since}-')
28
29 if 'pairs' not in config:
30 raise OperationalException(
31 "Downloading data requires a list of pairs. "
32 "Please check the documentation on how to configure this.")
33
34 logger.info(f'About to download pairs: {config["pairs"]}, '
35 f'intervals: {config["timeframes"]} to {config["datadir"]}')
36
37 pairs_not_available: List[str] = []
38
39 # Init exchange
40 exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config)
41 try:
42
43 if config.get('download_trades'):
44 pairs_not_available = refresh_backtest_trades_data(
45 exchange, pairs=config["pairs"], datadir=config['datadir'],
46 timerange=timerange, erase=bool(config.get("erase")))
47
48 # Convert downloaded trade data to different timeframes
49 convert_trades_to_ohlcv(
50 pairs=config["pairs"], timeframes=config["timeframes"],
51 datadir=config['datadir'], timerange=timerange,
52 erase=bool(config.get("erase")))
53 else:
54 pairs_not_available = refresh_backtest_ohlcv_data(
55 exchange, pairs=config["pairs"], timeframes=config["timeframes"],
56 datadir=config['datadir'], timerange=timerange,
57 erase=bool(config.get("erase")))
58
59 except KeyboardInterrupt:
60 sys.exit("SIGINT received, aborting ...")
61
62 finally:
63 if pairs_not_available:
64 logger.info(f"Pairs [{','.join(pairs_not_available)}] not available "
65 f"on exchange {exchange.name}.")
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/freqtrade/commands/data_commands.py b/freqtrade/commands/data_commands.py
--- a/freqtrade/commands/data_commands.py
+++ b/freqtrade/commands/data_commands.py
@@ -37,7 +37,12 @@
pairs_not_available: List[str] = []
# Init exchange
- exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config)
+ exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config, validate=False)
+ # Manual validations of relevant settings
+ exchange.validate_pairs(config['pairs'])
+ for timeframe in config['timeframes']:
+ exchange.validate_timeframes(timeframe)
+
try:
if config.get('download_trades'):
| {"golden_diff": "diff --git a/freqtrade/commands/data_commands.py b/freqtrade/commands/data_commands.py\n--- a/freqtrade/commands/data_commands.py\n+++ b/freqtrade/commands/data_commands.py\n@@ -37,7 +37,12 @@\n pairs_not_available: List[str] = []\n \n # Init exchange\n- exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config)\n+ exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config, validate=False)\n+ # Manual validations of relevant settings\n+ exchange.validate_pairs(config['pairs'])\n+ for timeframe in config['timeframes']:\n+ exchange.validate_timeframes(timeframe)\n+\n try:\n \n if config.get('download_trades'):\n", "issue": "Strange results for download-data\nI recalled #2716, tried to reproduce and obtained very strange results:\r\n\r\n* remove config.json\r\n* then run:\r\n```\r\n$ freqtrade download-data --exchange gemini --pairs 'BTC/USD' 'LTC/USD' 'BCH/USD' 'ETH/USD' -t 1h --days 5 -v\r\n2020-02-08 03:48:33,819 - freqtrade.loggers - INFO - Verbosity set to 1\r\n2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using exchange gemini\r\n2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using user-data directory: /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data ...\r\n2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using data directory: /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data/data/gemini ...\r\n2020-02-08 03:48:33,819 - freqtrade.configuration.configuration - INFO - Using pairs ['BTC/USD', 'LTC/USD', 'BCH/USD', 'ETH/USD']\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.configuration - INFO - timeframes --timeframes: ['1h']\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.configuration - INFO - Detected --days: 5\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.check_exchange - INFO - Checking exchange...\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.check_exchange - WARNING - Exchange \"gemini\" is known to the the ccxt library, available for the bot, but not officially supported by the Freqtrade development team. It may work flawlessly (please report back) or have serious issues. Use it at your own discretion.\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.deprecated_settings - WARNING - DEPRECATED: Pairlists must be defined explicitly in the future.Defaulting to StaticPairList for now.\r\n2020-02-08 03:48:33,820 - freqtrade.configuration.config_validation - INFO - Validating configuration ...\r\n2020-02-08 03:48:33,824 - freqtrade.commands.data_commands - INFO - About to download pairs: ['BTC/USD', 'LTC/USD', 'BCH/USD', 'ETH/USD'], intervals: ['1h'] to /home/user/freqtrade-wrk/github-hroff-1902/freqtrade/user_data/data/gemini\r\n2020-02-08 03:48:33,824 - freqtrade.resolvers.exchange_resolver - INFO - No Gemini specific subclass found. 
Using the generic class instead.\r\n2020-02-08 03:48:33,824 - freqtrade.exchange.exchange - INFO - Instance is running with dry_run enabled\r\n2020-02-08 03:48:33,826 - freqtrade.exchange.exchange - INFO - Applying additional ccxt config: {'enableRateLimit': True}\r\n2020-02-08 03:48:33,826 - asyncio - DEBUG - Using selector: EpollSelector\r\n2020-02-08 03:48:33,828 - freqtrade.exchange.exchange - INFO - Using Exchange \"Gemini\"\r\n2020-02-08 03:48:35,437 - chardet.charsetprober - DEBUG - utf-8 confidence = 0.99\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - SHIFT_JIS Japanese confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-JP Japanese confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - GB2312 Chinese confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-KR Korean confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - CP949 Korean confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - Big5 Chinese confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - EUC-TW Taiwan confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1251 Russian confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - KOI8-R Russian confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-5 Russian confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - MacCyrillic Russian confidence = 0.0\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - IBM866 Russian confidence = 0.0\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - IBM855 Russian confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-7 Greek confidence = 0.0\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1253 Greek confidence = 0.0\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - ISO-8859-5 Bulgairan confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - windows-1251 Bulgarian confidence = 0.01\r\n2020-02-08 03:48:35,438 - chardet.charsetprober - DEBUG - TIS-620 Thai confidence = 0.0\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - ISO-8859-9 Turkish confidence = 0.6551158277756192\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - windows-1255 Hebrew confidence = 0.0\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - utf-8 confidence = 0.99\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - SHIFT_JIS Japanese confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-JP Japanese confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - GB2312 Chinese confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-KR Korean confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - CP949 Korean confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - Big5 Chinese confidence = 0.01\r\n2020-02-08 03:48:35,439 - chardet.charsetprober - DEBUG - EUC-TW Taiwan confidence = 0.01\r\n2020-02-08 03:48:35,441 - freqtrade - ERROR - is not available as stake on Gemini. 
Available currencies are: BCH, BTC, ETH, LTC, USD\r\n2020-02-08 03:48:35,441 - freqtrade.exchange.exchange - DEBUG - Exchange object destroyed, closing async loop\r\n```\r\n\r\n-- note 1) strange messages from \"chardet.charsetprober\" module; 2) empty \"something\" between \"ERROR - is not available\". 3) strange module \"freqtrade\" for this message...\r\n\ndownload-data validates whitelist when --pairs is used\nThe download-data subcommands validates whitelist when `--pairs` is used:\r\n\r\n```\r\n$ freqtrade download-data -c config.json --exchange kraken --pairs BTC/USD -t 1d --days 100 -v\r\n...\r\n2019-12-17 20:07:49,885 - freqtrade - ERROR - Pair NXT/BTC is not available on Kraken. Please remove NXT/BTC from your whitelist.\r\n```\r\n\r\nExpected behavior: ignore pair_whitelist, work only with pairs set in the cli.\r\n\n", "before_files": [{"content": "import logging\nimport sys\nfrom typing import Any, Dict, List\n\nimport arrow\n\nfrom freqtrade.configuration import TimeRange, setup_utils_configuration\nfrom freqtrade.data.history import (convert_trades_to_ohlcv,\n refresh_backtest_ohlcv_data,\n refresh_backtest_trades_data)\nfrom freqtrade.exceptions import OperationalException\nfrom freqtrade.resolvers import ExchangeResolver\nfrom freqtrade.state import RunMode\n\nlogger = logging.getLogger(__name__)\n\n\ndef start_download_data(args: Dict[str, Any]) -> None:\n \"\"\"\n Download data (former download_backtest_data.py script)\n \"\"\"\n config = setup_utils_configuration(args, RunMode.UTIL_EXCHANGE)\n\n timerange = TimeRange()\n if 'days' in config:\n time_since = arrow.utcnow().shift(days=-config['days']).strftime(\"%Y%m%d\")\n timerange = TimeRange.parse_timerange(f'{time_since}-')\n\n if 'pairs' not in config:\n raise OperationalException(\n \"Downloading data requires a list of pairs. 
\"\n \"Please check the documentation on how to configure this.\")\n\n logger.info(f'About to download pairs: {config[\"pairs\"]}, '\n f'intervals: {config[\"timeframes\"]} to {config[\"datadir\"]}')\n\n pairs_not_available: List[str] = []\n\n # Init exchange\n exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config)\n try:\n\n if config.get('download_trades'):\n pairs_not_available = refresh_backtest_trades_data(\n exchange, pairs=config[\"pairs\"], datadir=config['datadir'],\n timerange=timerange, erase=bool(config.get(\"erase\")))\n\n # Convert downloaded trade data to different timeframes\n convert_trades_to_ohlcv(\n pairs=config[\"pairs\"], timeframes=config[\"timeframes\"],\n datadir=config['datadir'], timerange=timerange,\n erase=bool(config.get(\"erase\")))\n else:\n pairs_not_available = refresh_backtest_ohlcv_data(\n exchange, pairs=config[\"pairs\"], timeframes=config[\"timeframes\"],\n datadir=config['datadir'], timerange=timerange,\n erase=bool(config.get(\"erase\")))\n\n except KeyboardInterrupt:\n sys.exit(\"SIGINT received, aborting ...\")\n\n finally:\n if pairs_not_available:\n logger.info(f\"Pairs [{','.join(pairs_not_available)}] not available \"\n f\"on exchange {exchange.name}.\")\n", "path": "freqtrade/commands/data_commands.py"}], "after_files": [{"content": "import logging\nimport sys\nfrom typing import Any, Dict, List\n\nimport arrow\n\nfrom freqtrade.configuration import TimeRange, setup_utils_configuration\nfrom freqtrade.data.history import (convert_trades_to_ohlcv,\n refresh_backtest_ohlcv_data,\n refresh_backtest_trades_data)\nfrom freqtrade.exceptions import OperationalException\nfrom freqtrade.resolvers import ExchangeResolver\nfrom freqtrade.state import RunMode\n\nlogger = logging.getLogger(__name__)\n\n\ndef start_download_data(args: Dict[str, Any]) -> None:\n \"\"\"\n Download data (former download_backtest_data.py script)\n \"\"\"\n config = setup_utils_configuration(args, RunMode.UTIL_EXCHANGE)\n\n timerange = TimeRange()\n if 'days' in config:\n time_since = arrow.utcnow().shift(days=-config['days']).strftime(\"%Y%m%d\")\n timerange = TimeRange.parse_timerange(f'{time_since}-')\n\n if 'pairs' not in config:\n raise OperationalException(\n \"Downloading data requires a list of pairs. 
\"\n \"Please check the documentation on how to configure this.\")\n\n logger.info(f'About to download pairs: {config[\"pairs\"]}, '\n f'intervals: {config[\"timeframes\"]} to {config[\"datadir\"]}')\n\n pairs_not_available: List[str] = []\n\n # Init exchange\n exchange = ExchangeResolver.load_exchange(config['exchange']['name'], config, validate=False)\n # Manual validations of relevant settings\n exchange.validate_pairs(config['pairs'])\n for timeframe in config['timeframes']:\n exchange.validate_timeframes(timeframe)\n\n try:\n\n if config.get('download_trades'):\n pairs_not_available = refresh_backtest_trades_data(\n exchange, pairs=config[\"pairs\"], datadir=config['datadir'],\n timerange=timerange, erase=bool(config.get(\"erase\")))\n\n # Convert downloaded trade data to different timeframes\n convert_trades_to_ohlcv(\n pairs=config[\"pairs\"], timeframes=config[\"timeframes\"],\n datadir=config['datadir'], timerange=timerange,\n erase=bool(config.get(\"erase\")))\n else:\n pairs_not_available = refresh_backtest_ohlcv_data(\n exchange, pairs=config[\"pairs\"], timeframes=config[\"timeframes\"],\n datadir=config['datadir'], timerange=timerange,\n erase=bool(config.get(\"erase\")))\n\n except KeyboardInterrupt:\n sys.exit(\"SIGINT received, aborting ...\")\n\n finally:\n if pairs_not_available:\n logger.info(f\"Pairs [{','.join(pairs_not_available)}] not available \"\n f\"on exchange {exchange.name}.\")\n", "path": "freqtrade/commands/data_commands.py"}]} | 3,546 | 159 |
gh_patches_debug_43093 | rasdani/github-patches | git_diff | python-telegram-bot__python-telegram-bot-3261 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Suggestion] Add chat(s) parameter to ChatJoinRequestHandler
This param should allow to filter out chats which will be handled by the ChatJoinRequestHandler, much like the pattern argument of the CallbackQueryHandler. It should allow "username" strings as well as ids and if set, the handler should check if the incoming update is from that chat.
For first time contributors, check how CallbackQueryHandler implements the pattern argument in check_update: https://github.com/python-telegram-bot/python-telegram-bot/blob/master/telegram/ext/_callbackqueryhandler.py#L123
--- END ISSUE ---
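For illustration, a minimal usage sketch of what such a filter could look like from the bot author's side. The `chat_id` keyword argument is the proposed addition (not yet part of the handler shown below), the token is a placeholder, and the exact filtering semantics are whatever `check_update` ends up implementing.

```python
# Hypothetical usage sketch (python-telegram-bot v20-style application setup).
from telegram.ext import Application, ChatJoinRequestHandler


async def on_join_request(update, context):
    # Approve requests for the chat(s) this handler was registered for.
    await update.chat_join_request.approve()


app = Application.builder().token("BOT_TOKEN").build()  # placeholder token
app.add_handler(
    ChatJoinRequestHandler(on_join_request, chat_id=-1001234567890)  # proposed filter
)
app.run_polling()
```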
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `telegram/ext/_chatjoinrequesthandler.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # A library that provides a Python interface to the Telegram Bot API
4 # Copyright (C) 2015-2022
5 # Leandro Toledo de Souza <[email protected]>
6 #
7 # This program is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU Lesser Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # This program is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU Lesser Public License for more details.
16 #
17 # You should have received a copy of the GNU Lesser Public License
18 # along with this program. If not, see [http://www.gnu.org/licenses/].
19 """This module contains the ChatJoinRequestHandler class."""
20
21
22 from telegram import Update
23 from telegram.ext._handler import BaseHandler
24 from telegram.ext._utils.types import CCT
25
26
27 class ChatJoinRequestHandler(BaseHandler[Update, CCT]):
28 """BaseHandler class to handle Telegram updates that contain
29 :attr:`telegram.Update.chat_join_request`.
30
31 Warning:
32 When setting :paramref:`block` to :obj:`False`, you cannot rely on adding custom
33 attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.
34
35 .. versionadded:: 13.8
36
37 Args:
38 callback (:term:`coroutine function`): The callback function for this handler. Will be
39 called when :meth:`check_update` has determined that an update should be processed by
40 this handler. Callback signature::
41
42 async def callback(update: Update, context: CallbackContext)
43
44 The return value of the callback is usually ignored except for the special case of
45 :class:`telegram.ext.ConversationHandler`.
46 block (:obj:`bool`, optional): Determines whether the return value of the callback should
47 be awaited before processing the next handler in
48 :meth:`telegram.ext.Application.process_update`. Defaults to :obj:`True`.
49
50 Attributes:
51 callback (:term:`coroutine function`): The callback function for this handler.
52 block (:obj:`bool`): Determines whether the callback will run in a blocking way..
53
54 """
55
56 __slots__ = ()
57
58 def check_update(self, update: object) -> bool:
59 """Determines whether an update should be passed to this handler's :attr:`callback`.
60
61 Args:
62 update (:class:`telegram.Update` | :obj:`object`): Incoming update.
63
64 Returns:
65 :obj:`bool`
66
67 """
68 return isinstance(update, Update) and bool(update.chat_join_request)
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/telegram/ext/_chatjoinrequesthandler.py b/telegram/ext/_chatjoinrequesthandler.py
--- a/telegram/ext/_chatjoinrequesthandler.py
+++ b/telegram/ext/_chatjoinrequesthandler.py
@@ -18,16 +18,27 @@
# along with this program. If not, see [http://www.gnu.org/licenses/].
"""This module contains the ChatJoinRequestHandler class."""
+from typing import FrozenSet, Optional
from telegram import Update
+from telegram._utils.defaultvalue import DEFAULT_TRUE
+from telegram._utils.types import RT, SCT, DVInput
from telegram.ext._handler import BaseHandler
-from telegram.ext._utils.types import CCT
+from telegram.ext._utils.types import CCT, HandlerCallback
class ChatJoinRequestHandler(BaseHandler[Update, CCT]):
"""BaseHandler class to handle Telegram updates that contain
:attr:`telegram.Update.chat_join_request`.
+ Note:
+ If neither of :paramref:`username` and the :paramref:`chat_id` are passed, this handler
+ accepts *any* join request. Otherwise, this handler accepts all requests to join chats
+ for which the chat ID is listed in :paramref:`chat_id` or the username is listed in
+ :paramref:`username`, or both.
+
+ .. versionadded:: 20.0
+
Warning:
When setting :paramref:`block` to :obj:`False`, you cannot rely on adding custom
attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.
@@ -43,6 +54,14 @@
The return value of the callback is usually ignored except for the special case of
:class:`telegram.ext.ConversationHandler`.
+ chat_id (:obj:`int` | Collection[:obj:`int`], optional): Filters requests to allow only
+ those which are asking to join the specified chat ID(s).
+
+ .. versionadded:: 20.0
+ username (:obj:`str` | Collection[:obj:`str`], optional): Filters requests to allow only
+ those which are asking to join the specified username(s).
+
+ .. versionadded:: 20.0
block (:obj:`bool`, optional): Determines whether the return value of the callback should
be awaited before processing the next handler in
:meth:`telegram.ext.Application.process_update`. Defaults to :obj:`True`.
@@ -53,7 +72,38 @@
"""
- __slots__ = ()
+ __slots__ = (
+ "_chat_ids",
+ "_usernames",
+ )
+
+ def __init__(
+ self,
+ callback: HandlerCallback[Update, CCT, RT],
+ chat_id: SCT[int] = None,
+ username: SCT[str] = None,
+ block: DVInput[bool] = DEFAULT_TRUE,
+ ):
+ super().__init__(callback, block=block)
+
+ self._chat_ids = self._parse_chat_id(chat_id)
+ self._usernames = self._parse_username(username)
+
+ @staticmethod
+ def _parse_chat_id(chat_id: Optional[SCT[int]]) -> FrozenSet[int]:
+ if chat_id is None:
+ return frozenset()
+ if isinstance(chat_id, int):
+ return frozenset({chat_id})
+ return frozenset(chat_id)
+
+ @staticmethod
+ def _parse_username(username: Optional[SCT[str]]) -> FrozenSet[str]:
+ if username is None:
+ return frozenset()
+ if isinstance(username, str):
+ return frozenset({username[1:] if username.startswith("@") else username})
+ return frozenset({usr[1:] if usr.startswith("@") else usr for usr in username})
def check_update(self, update: object) -> bool:
"""Determines whether an update should be passed to this handler's :attr:`callback`.
@@ -65,4 +115,12 @@
:obj:`bool`
"""
- return isinstance(update, Update) and bool(update.chat_join_request)
+ if isinstance(update, Update) and update.chat_join_request:
+ if not self._chat_ids and not self._usernames:
+ return True
+ if update.chat_join_request.chat.id in self._chat_ids:
+ return True
+ if update.chat_join_request.from_user.username in self._usernames:
+ return True
+ return False
+ return False
| {"golden_diff": "diff --git a/telegram/ext/_chatjoinrequesthandler.py b/telegram/ext/_chatjoinrequesthandler.py\n--- a/telegram/ext/_chatjoinrequesthandler.py\n+++ b/telegram/ext/_chatjoinrequesthandler.py\n@@ -18,16 +18,27 @@\n # along with this program. If not, see [http://www.gnu.org/licenses/].\n \"\"\"This module contains the ChatJoinRequestHandler class.\"\"\"\n \n+from typing import FrozenSet, Optional\n \n from telegram import Update\n+from telegram._utils.defaultvalue import DEFAULT_TRUE\n+from telegram._utils.types import RT, SCT, DVInput\n from telegram.ext._handler import BaseHandler\n-from telegram.ext._utils.types import CCT\n+from telegram.ext._utils.types import CCT, HandlerCallback\n \n \n class ChatJoinRequestHandler(BaseHandler[Update, CCT]):\n \"\"\"BaseHandler class to handle Telegram updates that contain\n :attr:`telegram.Update.chat_join_request`.\n \n+ Note:\n+ If neither of :paramref:`username` and the :paramref:`chat_id` are passed, this handler\n+ accepts *any* join request. Otherwise, this handler accepts all requests to join chats\n+ for which the chat ID is listed in :paramref:`chat_id` or the username is listed in\n+ :paramref:`username`, or both.\n+\n+ .. versionadded:: 20.0\n+\n Warning:\n When setting :paramref:`block` to :obj:`False`, you cannot rely on adding custom\n attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.\n@@ -43,6 +54,14 @@\n \n The return value of the callback is usually ignored except for the special case of\n :class:`telegram.ext.ConversationHandler`.\n+ chat_id (:obj:`int` | Collection[:obj:`int`], optional): Filters requests to allow only\n+ those which are asking to join the specified chat ID(s).\n+\n+ .. versionadded:: 20.0\n+ username (:obj:`str` | Collection[:obj:`str`], optional): Filters requests to allow only\n+ those which are asking to join the specified username(s).\n+\n+ .. versionadded:: 20.0\n block (:obj:`bool`, optional): Determines whether the return value of the callback should\n be awaited before processing the next handler in\n :meth:`telegram.ext.Application.process_update`. 
Defaults to :obj:`True`.\n@@ -53,7 +72,38 @@\n \n \"\"\"\n \n- __slots__ = ()\n+ __slots__ = (\n+ \"_chat_ids\",\n+ \"_usernames\",\n+ )\n+\n+ def __init__(\n+ self,\n+ callback: HandlerCallback[Update, CCT, RT],\n+ chat_id: SCT[int] = None,\n+ username: SCT[str] = None,\n+ block: DVInput[bool] = DEFAULT_TRUE,\n+ ):\n+ super().__init__(callback, block=block)\n+\n+ self._chat_ids = self._parse_chat_id(chat_id)\n+ self._usernames = self._parse_username(username)\n+\n+ @staticmethod\n+ def _parse_chat_id(chat_id: Optional[SCT[int]]) -> FrozenSet[int]:\n+ if chat_id is None:\n+ return frozenset()\n+ if isinstance(chat_id, int):\n+ return frozenset({chat_id})\n+ return frozenset(chat_id)\n+\n+ @staticmethod\n+ def _parse_username(username: Optional[SCT[str]]) -> FrozenSet[str]:\n+ if username is None:\n+ return frozenset()\n+ if isinstance(username, str):\n+ return frozenset({username[1:] if username.startswith(\"@\") else username})\n+ return frozenset({usr[1:] if usr.startswith(\"@\") else usr for usr in username})\n \n def check_update(self, update: object) -> bool:\n \"\"\"Determines whether an update should be passed to this handler's :attr:`callback`.\n@@ -65,4 +115,12 @@\n :obj:`bool`\n \n \"\"\"\n- return isinstance(update, Update) and bool(update.chat_join_request)\n+ if isinstance(update, Update) and update.chat_join_request:\n+ if not self._chat_ids and not self._usernames:\n+ return True\n+ if update.chat_join_request.chat.id in self._chat_ids:\n+ return True\n+ if update.chat_join_request.from_user.username in self._usernames:\n+ return True\n+ return False\n+ return False\n", "issue": "[Suggestion] Add chat(s) parameter to ChatJoinRequestHandler\nThis param should allow to filter out chats which will be handled by the ChatJoinRequestHandler, much like the pattern argument of the CallbackQueryHandler. It should allow \"username\" strings as well as ids and if set, the handler should check if the incoming update is from that chat.\r\n\r\nFor first time contributors, check how CallbackQueryHandler implements the pattern argument in check_update: https://github.com/python-telegram-bot/python-telegram-bot/blob/master/telegram/ext/_callbackqueryhandler.py#L123\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2022\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. 
If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains the ChatJoinRequestHandler class.\"\"\"\n\n\nfrom telegram import Update\nfrom telegram.ext._handler import BaseHandler\nfrom telegram.ext._utils.types import CCT\n\n\nclass ChatJoinRequestHandler(BaseHandler[Update, CCT]):\n \"\"\"BaseHandler class to handle Telegram updates that contain\n :attr:`telegram.Update.chat_join_request`.\n\n Warning:\n When setting :paramref:`block` to :obj:`False`, you cannot rely on adding custom\n attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.\n\n .. versionadded:: 13.8\n\n Args:\n callback (:term:`coroutine function`): The callback function for this handler. Will be\n called when :meth:`check_update` has determined that an update should be processed by\n this handler. Callback signature::\n\n async def callback(update: Update, context: CallbackContext)\n\n The return value of the callback is usually ignored except for the special case of\n :class:`telegram.ext.ConversationHandler`.\n block (:obj:`bool`, optional): Determines whether the return value of the callback should\n be awaited before processing the next handler in\n :meth:`telegram.ext.Application.process_update`. Defaults to :obj:`True`.\n\n Attributes:\n callback (:term:`coroutine function`): The callback function for this handler.\n block (:obj:`bool`): Determines whether the callback will run in a blocking way..\n\n \"\"\"\n\n __slots__ = ()\n\n def check_update(self, update: object) -> bool:\n \"\"\"Determines whether an update should be passed to this handler's :attr:`callback`.\n\n Args:\n update (:class:`telegram.Update` | :obj:`object`): Incoming update.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n return isinstance(update, Update) and bool(update.chat_join_request)\n", "path": "telegram/ext/_chatjoinrequesthandler.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2022\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains the ChatJoinRequestHandler class.\"\"\"\n\nfrom typing import FrozenSet, Optional\n\nfrom telegram import Update\nfrom telegram._utils.defaultvalue import DEFAULT_TRUE\nfrom telegram._utils.types import RT, SCT, DVInput\nfrom telegram.ext._handler import BaseHandler\nfrom telegram.ext._utils.types import CCT, HandlerCallback\n\n\nclass ChatJoinRequestHandler(BaseHandler[Update, CCT]):\n \"\"\"BaseHandler class to handle Telegram updates that contain\n :attr:`telegram.Update.chat_join_request`.\n\n Note:\n If neither of :paramref:`username` and the :paramref:`chat_id` are passed, this handler\n accepts *any* join request. Otherwise, this handler accepts all requests to join chats\n for which the chat ID is listed in :paramref:`chat_id` or the username is listed in\n :paramref:`username`, or both.\n\n .. 
versionadded:: 20.0\n\n Warning:\n When setting :paramref:`block` to :obj:`False`, you cannot rely on adding custom\n attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.\n\n .. versionadded:: 13.8\n\n Args:\n callback (:term:`coroutine function`): The callback function for this handler. Will be\n called when :meth:`check_update` has determined that an update should be processed by\n this handler. Callback signature::\n\n async def callback(update: Update, context: CallbackContext)\n\n The return value of the callback is usually ignored except for the special case of\n :class:`telegram.ext.ConversationHandler`.\n chat_id (:obj:`int` | Collection[:obj:`int`], optional): Filters requests to allow only\n those which are asking to join the specified chat ID(s).\n\n .. versionadded:: 20.0\n username (:obj:`str` | Collection[:obj:`str`], optional): Filters requests to allow only\n those which are asking to join the specified username(s).\n\n .. versionadded:: 20.0\n block (:obj:`bool`, optional): Determines whether the return value of the callback should\n be awaited before processing the next handler in\n :meth:`telegram.ext.Application.process_update`. Defaults to :obj:`True`.\n\n Attributes:\n callback (:term:`coroutine function`): The callback function for this handler.\n block (:obj:`bool`): Determines whether the callback will run in a blocking way..\n\n \"\"\"\n\n __slots__ = (\n \"_chat_ids\",\n \"_usernames\",\n )\n\n def __init__(\n self,\n callback: HandlerCallback[Update, CCT, RT],\n chat_id: SCT[int] = None,\n username: SCT[str] = None,\n block: DVInput[bool] = DEFAULT_TRUE,\n ):\n super().__init__(callback, block=block)\n\n self._chat_ids = self._parse_chat_id(chat_id)\n self._usernames = self._parse_username(username)\n\n @staticmethod\n def _parse_chat_id(chat_id: Optional[SCT[int]]) -> FrozenSet[int]:\n if chat_id is None:\n return frozenset()\n if isinstance(chat_id, int):\n return frozenset({chat_id})\n return frozenset(chat_id)\n\n @staticmethod\n def _parse_username(username: Optional[SCT[str]]) -> FrozenSet[str]:\n if username is None:\n return frozenset()\n if isinstance(username, str):\n return frozenset({username[1:] if username.startswith(\"@\") else username})\n return frozenset({usr[1:] if usr.startswith(\"@\") else usr for usr in username})\n\n def check_update(self, update: object) -> bool:\n \"\"\"Determines whether an update should be passed to this handler's :attr:`callback`.\n\n Args:\n update (:class:`telegram.Update` | :obj:`object`): Incoming update.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n if isinstance(update, Update) and update.chat_join_request:\n if not self._chat_ids and not self._usernames:\n return True\n if update.chat_join_request.chat.id in self._chat_ids:\n return True\n if update.chat_join_request.from_user.username in self._usernames:\n return True\n return False\n return False\n", "path": "telegram/ext/_chatjoinrequesthandler.py"}]} | 1,103 | 987 |
gh_patches_debug_8876 | rasdani/github-patches | git_diff | microsoft__MLOS-211 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Sphinx Python API docs generation broken in recent nightly CI runs
For example: <https://github.com/microsoft/MLOS/runs/1635132574?check_suite_focus=true>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/Mlos.Python/mlos/Spaces/Point.py`
Content:
```
1 #
2 # Copyright (c) Microsoft Corporation.
3 # Licensed under the MIT License.
4 #
5 import json
6 from numbers import Number
7
8 import pandas as pd
9 from mlos.Spaces.Dimensions.Dimension import Dimension
10
11
12 class Point:
13 """ Models a point in a Hypergrid.
14
15 """
16 def __init__(self, **kwargs):
17 self.dimension_value_dict = dict()
18 for dimension_name, value in kwargs.items():
19 self[dimension_name] = value
20
21 def copy(self):
22 return Point(**{key: value for key, value in self})
23
24 def flat_copy(self):
25 """ Creates a copy of the point but all dimension names are flattened.
26
27 :return:
28 """
29 flat_dict = {
30 Dimension.flatten_dimension_name(dimension_name): value
31 for dimension_name, value in self
32 }
33 return Point(**flat_dict)
34
35 def __eq__(self, other):
36 if not isinstance(other, Point):
37 return False
38 return \
39 all(other.get(dimension_name, None) == value for dimension_name, value in self) \
40 and \
41 all(self.get(dimension_name, None) == value for dimension_name, value in other)
42
43 def __ne__(self, other):
44 return not self == other
45
46 def __iter__(self):
47 for dimension_name, value in self.dimension_value_dict.items():
48 if not isinstance(value, Point):
49 yield dimension_name, value
50 else:
51 for sub_dimension_name, sub_dimension_value in value:
52 yield dimension_name + "." + sub_dimension_name, sub_dimension_value
53
54 def __getattr__(self, dimension_name):
55 if dimension_name == "__isabstractmethod__":
56 # A sad but necessary way to deal with ABC.
57 return False
58 return self[dimension_name]
59
60 def __setattr__(self, name, value):
61 if name == "dimension_value_dict":
62 self.__dict__[name] = value
63 else:
64 dimension_name = name
65 subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)
66 if subgrid_name is None:
67 self.dimension_value_dict[dimension_name] = value
68 else:
69 point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())
70 point_in_subgrid[dimension_name_without_subgrid_name] = value
71 self.dimension_value_dict[subgrid_name] = point_in_subgrid
72
73 def __getitem__(self, dimension_name):
74 if dimension_name not in self:
75 raise KeyError(f"This Point does not have a value along dimension: {dimension_name}")
76 subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)
77 if subgrid_name is None:
78 return self.dimension_value_dict[dimension_name]
79 return self[subgrid_name][dimension_name_without_subgrid_name]
80
81 def get(self, dimension_name, default=None):
82 try:
83 return self[dimension_name]
84 except KeyError:
85 return default
86
87 def __setitem__(self, dimension_name, value):
88 subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)
89 if subgrid_name is None:
90 self.dimension_value_dict[dimension_name] = value
91 else:
92 point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())
93 point_in_subgrid[dimension_name_without_subgrid_name] = value
94 self.dimension_value_dict[subgrid_name] = point_in_subgrid
95
96 def __contains__(self, dimension_name):
97 subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)
98 if subgrid_name is None:
99 return dimension_name in self.dimension_value_dict
100 if subgrid_name not in self.dimension_value_dict:
101 return False
102 return dimension_name_without_subgrid_name in self[subgrid_name]
103
104 def __repr__(self):
105 return self.__str__()
106
107 def __str__(self):
108 return str(self.to_json(indent=2))
109
110 def __getstate__(self):
111 return self.to_json()
112
113 def __setstate__(self, state):
114 temp_point = self.from_json(state)
115 self.dimension_value_dict = temp_point.dimension_value_dict
116
117 def to_json(self, indent=None):
118 if indent is not None:
119 return json.dumps(self.to_dict(), indent=indent)
120 return json.dumps(self.to_dict())
121
122 @classmethod
123 def from_json(cls, json_str):
124 coordinates = json.loads(json_str)
125 return Point(**coordinates)
126
127 def to_dict(self):
128 return_dict = {}
129 for param_name, value in self:
130 if isinstance(value, Number) and int(value) == value and not isinstance(value, bool):
131 value = int(value)
132 return_dict[param_name] = value
133 return return_dict
134
135 def to_dataframe(self):
136 return pd.DataFrame({param_name: [value] for param_name, value in self})
137
138 @classmethod
139 def from_dataframe(cls, dataframe: pd.DataFrame):
140 assert len(dataframe.index) == 1
141 dataframe = dataframe.dropna(axis=1)
142 dataframe_dict = dataframe.to_dict(orient='list')
143 point_dict = {key: values[0] for key, values in dataframe_dict.items()}
144 return Point(**point_dict)
145
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/source/Mlos.Python/mlos/Spaces/Point.py b/source/Mlos.Python/mlos/Spaces/Point.py
--- a/source/Mlos.Python/mlos/Spaces/Point.py
+++ b/source/Mlos.Python/mlos/Spaces/Point.py
@@ -55,7 +55,10 @@
if dimension_name == "__isabstractmethod__":
# A sad but necessary way to deal with ABC.
return False
- return self[dimension_name]
+ try:
+ return self[dimension_name]
+ except KeyError:
+ raise AttributeError(f"This Point does not have a {dimension_name} attribute.")
def __setattr__(self, name, value):
if name == "dimension_value_dict":
| {"golden_diff": "diff --git a/source/Mlos.Python/mlos/Spaces/Point.py b/source/Mlos.Python/mlos/Spaces/Point.py\n--- a/source/Mlos.Python/mlos/Spaces/Point.py\n+++ b/source/Mlos.Python/mlos/Spaces/Point.py\n@@ -55,7 +55,10 @@\n if dimension_name == \"__isabstractmethod__\":\r\n # A sad but necessary way to deal with ABC.\r\n return False\r\n- return self[dimension_name]\r\n+ try:\r\n+ return self[dimension_name]\r\n+ except KeyError:\r\n+ raise AttributeError(f\"This Point does not have a {dimension_name} attribute.\")\r\n \r\n def __setattr__(self, name, value):\r\n if name == \"dimension_value_dict\":\n", "issue": "Sphinx Python API docs generation broken in recent nightly CI runs\nFor example: <https://github.com/microsoft/MLOS/runs/1635132574?check_suite_focus=true>\n", "before_files": [{"content": "#\r\n# Copyright (c) Microsoft Corporation.\r\n# Licensed under the MIT License.\r\n#\r\nimport json\r\nfrom numbers import Number\r\n\r\nimport pandas as pd\r\nfrom mlos.Spaces.Dimensions.Dimension import Dimension\r\n\r\n\r\nclass Point:\r\n \"\"\" Models a point in a Hypergrid.\r\n\r\n \"\"\"\r\n def __init__(self, **kwargs):\r\n self.dimension_value_dict = dict()\r\n for dimension_name, value in kwargs.items():\r\n self[dimension_name] = value\r\n\r\n def copy(self):\r\n return Point(**{key: value for key, value in self})\r\n\r\n def flat_copy(self):\r\n \"\"\" Creates a copy of the point but all dimension names are flattened.\r\n\r\n :return:\r\n \"\"\"\r\n flat_dict = {\r\n Dimension.flatten_dimension_name(dimension_name): value\r\n for dimension_name, value in self\r\n }\r\n return Point(**flat_dict)\r\n\r\n def __eq__(self, other):\r\n if not isinstance(other, Point):\r\n return False\r\n return \\\r\n all(other.get(dimension_name, None) == value for dimension_name, value in self) \\\r\n and \\\r\n all(self.get(dimension_name, None) == value for dimension_name, value in other)\r\n\r\n def __ne__(self, other):\r\n return not self == other\r\n\r\n def __iter__(self):\r\n for dimension_name, value in self.dimension_value_dict.items():\r\n if not isinstance(value, Point):\r\n yield dimension_name, value\r\n else:\r\n for sub_dimension_name, sub_dimension_value in value:\r\n yield dimension_name + \".\" + sub_dimension_name, sub_dimension_value\r\n\r\n def __getattr__(self, dimension_name):\r\n if dimension_name == \"__isabstractmethod__\":\r\n # A sad but necessary way to deal with ABC.\r\n return False\r\n return self[dimension_name]\r\n\r\n def __setattr__(self, name, value):\r\n if name == \"dimension_value_dict\":\r\n self.__dict__[name] = value\r\n else:\r\n dimension_name = name\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n self.dimension_value_dict[dimension_name] = value\r\n else:\r\n point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())\r\n point_in_subgrid[dimension_name_without_subgrid_name] = value\r\n self.dimension_value_dict[subgrid_name] = point_in_subgrid\r\n\r\n def __getitem__(self, dimension_name):\r\n if dimension_name not in self:\r\n raise KeyError(f\"This Point does not have a value along dimension: {dimension_name}\")\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n return self.dimension_value_dict[dimension_name]\r\n return self[subgrid_name][dimension_name_without_subgrid_name]\r\n\r\n def get(self, dimension_name, default=None):\r\n try:\r\n return 
self[dimension_name]\r\n except KeyError:\r\n return default\r\n\r\n def __setitem__(self, dimension_name, value):\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n self.dimension_value_dict[dimension_name] = value\r\n else:\r\n point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())\r\n point_in_subgrid[dimension_name_without_subgrid_name] = value\r\n self.dimension_value_dict[subgrid_name] = point_in_subgrid\r\n\r\n def __contains__(self, dimension_name):\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n return dimension_name in self.dimension_value_dict\r\n if subgrid_name not in self.dimension_value_dict:\r\n return False\r\n return dimension_name_without_subgrid_name in self[subgrid_name]\r\n\r\n def __repr__(self):\r\n return self.__str__()\r\n\r\n def __str__(self):\r\n return str(self.to_json(indent=2))\r\n\r\n def __getstate__(self):\r\n return self.to_json()\r\n\r\n def __setstate__(self, state):\r\n temp_point = self.from_json(state)\r\n self.dimension_value_dict = temp_point.dimension_value_dict\r\n\r\n def to_json(self, indent=None):\r\n if indent is not None:\r\n return json.dumps(self.to_dict(), indent=indent)\r\n return json.dumps(self.to_dict())\r\n\r\n @classmethod\r\n def from_json(cls, json_str):\r\n coordinates = json.loads(json_str)\r\n return Point(**coordinates)\r\n\r\n def to_dict(self):\r\n return_dict = {}\r\n for param_name, value in self:\r\n if isinstance(value, Number) and int(value) == value and not isinstance(value, bool):\r\n value = int(value)\r\n return_dict[param_name] = value\r\n return return_dict\r\n\r\n def to_dataframe(self):\r\n return pd.DataFrame({param_name: [value] for param_name, value in self})\r\n\r\n @classmethod\r\n def from_dataframe(cls, dataframe: pd.DataFrame):\r\n assert len(dataframe.index) == 1\r\n dataframe = dataframe.dropna(axis=1)\r\n dataframe_dict = dataframe.to_dict(orient='list')\r\n point_dict = {key: values[0] for key, values in dataframe_dict.items()}\r\n return Point(**point_dict)\r\n", "path": "source/Mlos.Python/mlos/Spaces/Point.py"}], "after_files": [{"content": "#\r\n# Copyright (c) Microsoft Corporation.\r\n# Licensed under the MIT License.\r\n#\r\nimport json\r\nfrom numbers import Number\r\n\r\nimport pandas as pd\r\nfrom mlos.Spaces.Dimensions.Dimension import Dimension\r\n\r\n\r\nclass Point:\r\n \"\"\" Models a point in a Hypergrid.\r\n\r\n \"\"\"\r\n def __init__(self, **kwargs):\r\n self.dimension_value_dict = dict()\r\n for dimension_name, value in kwargs.items():\r\n self[dimension_name] = value\r\n\r\n def copy(self):\r\n return Point(**{key: value for key, value in self})\r\n\r\n def flat_copy(self):\r\n \"\"\" Creates a copy of the point but all dimension names are flattened.\r\n\r\n :return:\r\n \"\"\"\r\n flat_dict = {\r\n Dimension.flatten_dimension_name(dimension_name): value\r\n for dimension_name, value in self\r\n }\r\n return Point(**flat_dict)\r\n\r\n def __eq__(self, other):\r\n if not isinstance(other, Point):\r\n return False\r\n return \\\r\n all(other.get(dimension_name, None) == value for dimension_name, value in self) \\\r\n and \\\r\n all(self.get(dimension_name, None) == value for dimension_name, value in other)\r\n\r\n def __ne__(self, other):\r\n return not self == other\r\n\r\n def __iter__(self):\r\n for dimension_name, value in self.dimension_value_dict.items():\r\n if not isinstance(value, 
Point):\r\n yield dimension_name, value\r\n else:\r\n for sub_dimension_name, sub_dimension_value in value:\r\n yield dimension_name + \".\" + sub_dimension_name, sub_dimension_value\r\n\r\n def __getattr__(self, dimension_name):\r\n if dimension_name == \"__isabstractmethod__\":\r\n # A sad but necessary way to deal with ABC.\r\n return False\r\n try:\r\n return self[dimension_name]\r\n except KeyError:\r\n raise AttributeError(f\"This Point does not have a {dimension_name} attribute.\")\r\n\r\n def __setattr__(self, name, value):\r\n if name == \"dimension_value_dict\":\r\n self.__dict__[name] = value\r\n else:\r\n dimension_name = name\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n self.dimension_value_dict[dimension_name] = value\r\n else:\r\n point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())\r\n point_in_subgrid[dimension_name_without_subgrid_name] = value\r\n self.dimension_value_dict[subgrid_name] = point_in_subgrid\r\n\r\n def __getitem__(self, dimension_name):\r\n if dimension_name not in self:\r\n raise KeyError(f\"This Point does not have a value along dimension: {dimension_name}\")\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n return self.dimension_value_dict[dimension_name]\r\n return self[subgrid_name][dimension_name_without_subgrid_name]\r\n\r\n def get(self, dimension_name, default=None):\r\n try:\r\n return self[dimension_name]\r\n except KeyError:\r\n return default\r\n\r\n def __setitem__(self, dimension_name, value):\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n self.dimension_value_dict[dimension_name] = value\r\n else:\r\n point_in_subgrid = self.dimension_value_dict.get(subgrid_name, Point())\r\n point_in_subgrid[dimension_name_without_subgrid_name] = value\r\n self.dimension_value_dict[subgrid_name] = point_in_subgrid\r\n\r\n def __contains__(self, dimension_name):\r\n subgrid_name, dimension_name_without_subgrid_name = Dimension.split_dimension_name(dimension_name)\r\n if subgrid_name is None:\r\n return dimension_name in self.dimension_value_dict\r\n if subgrid_name not in self.dimension_value_dict:\r\n return False\r\n return dimension_name_without_subgrid_name in self[subgrid_name]\r\n\r\n def __repr__(self):\r\n return self.__str__()\r\n\r\n def __str__(self):\r\n return str(self.to_json(indent=2))\r\n\r\n def __getstate__(self):\r\n return self.to_json()\r\n\r\n def __setstate__(self, state):\r\n temp_point = self.from_json(state)\r\n self.dimension_value_dict = temp_point.dimension_value_dict\r\n\r\n def to_json(self, indent=None):\r\n if indent is not None:\r\n return json.dumps(self.to_dict(), indent=indent)\r\n return json.dumps(self.to_dict())\r\n\r\n @classmethod\r\n def from_json(cls, json_str):\r\n coordinates = json.loads(json_str)\r\n return Point(**coordinates)\r\n\r\n def to_dict(self):\r\n return_dict = {}\r\n for param_name, value in self:\r\n if isinstance(value, Number) and int(value) == value and not isinstance(value, bool):\r\n value = int(value)\r\n return_dict[param_name] = value\r\n return return_dict\r\n\r\n def to_dataframe(self):\r\n return pd.DataFrame({param_name: [value] for param_name, value in self})\r\n\r\n @classmethod\r\n def from_dataframe(cls, dataframe: pd.DataFrame):\r\n assert len(dataframe.index) == 1\r\n dataframe = 
dataframe.dropna(axis=1)\r\n dataframe_dict = dataframe.to_dict(orient='list')\r\n point_dict = {key: values[0] for key, values in dataframe_dict.items()}\r\n return Point(**point_dict)\r\n", "path": "source/Mlos.Python/mlos/Spaces/Point.py"}]} | 1,758 | 162 |
gh_patches_debug_37529 | rasdani/github-patches | git_diff | dmlc__dgl-5377 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Sparse] Support column-wise softmax.
## 🔨Work Item
**IMPORTANT:**
* This template is only for dev team to track project progress. For feature request or bug report, please use the corresponding issue templates.
* DO NOT create a new work item if the purpose is to fix an existing issue or feature request. We will directly use the issue in the project tracker.
Project tracker: https://github.com/orgs/dmlc/projects/2
## Description
<!-- short description of the work item -->
## Depending work items or issues
<!-- what must be done before this -->
--- END ISSUE ---
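Since the work item above only names the feature, a quick illustration of the terminology (plain dense `torch`, not the DGL sparse API): row-wise softmax normalises each row of values to sum to one, column-wise softmax normalises each column; for a sparse matrix the same normalisation is applied only over the stored non-zero entries sharing a row or a column.

```python
# Dense illustration only; the sparse case applies the same normalisation
# over the non-zeros sharing a row (dim=1) or a column (dim=0).
import torch

A = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

row_wise = torch.softmax(A, dim=1)   # each row sums to 1
col_wise = torch.softmax(A, dim=0)   # each column sums to 1
print(row_wise.sum(dim=1))           # tensor([1., 1.])
print(col_wise.sum(dim=0))           # tensor([1., 1., 1.])
```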
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/dgl/sparse/softmax.py`
Content:
```
1 """Softmax op for SparseMatrix"""
2 # pylint: disable=invalid-name, W0622
3
4 import torch
5
6 from .sparse_matrix import SparseMatrix
7
8 __all__ = ["softmax"]
9
10
11 def softmax(input: SparseMatrix) -> SparseMatrix:
12 """Applies row-wise softmax to the non-zero elements of the sparse matrix.
13
14 Equivalently, applies softmax to the non-zero elements of the sparse
15 matrix along the column (``dim=1``) dimension.
16
17 If :attr:`input.val` takes shape ``(nnz, D)``, then the output matrix
18 :attr:`output` and :attr:`output.val` take the same shape as :attr:`input`
19 and :attr:`input.val`. :attr:`output.val[:, i]` is calculated based on
20 :attr:`input.val[:, i]`.
21
22 Parameters
23 ----------
24 input : SparseMatrix
25 The input sparse matrix
26
27 Returns
28 -------
29 SparseMatrix
30 The output sparse matrix
31
32 Examples
33 --------
34
35 Case1: matrix with values of shape (nnz)
36
37 >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])
38 >>> nnz = len(row)
39 >>> val = torch.arange(nnz).float()
40 >>> A = dglsp.spmatrix(indices, val)
41 >>> dglsp.softmax(A)
42 SparseMatrix(indices=tensor([[0, 0, 1, 2],
43 [1, 2, 2, 0]]),
44 values=tensor([0.2689, 0.7311, 1.0000, 1.0000]),
45 shape=(3, 3), nnz=4)
46
47 Case2: matrix with values of shape (nnz, D)
48
49 >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])
50 >>> val = torch.tensor([[0., 7.], [1., 3.], [2., 2.], [3., 1.]])
51 >>> A = dglsp.spmatrix(indices, val)
52 >>> dglsp.softmax(A)
53 SparseMatrix(indices=tensor([[0, 0, 1, 2],
54 [1, 2, 2, 0]]),
55 values=tensor([[0.2689, 0.9820],
56 [0.7311, 0.0180],
57 [1.0000, 1.0000],
58 [1.0000, 1.0000]]),
59 shape=(3, 3), nnz=4, val_size=(2,))
60 """
61 return SparseMatrix(torch.ops.dgl_sparse.softmax(input.c_sparse_matrix))
62
63
64 SparseMatrix.softmax = softmax
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/dgl/sparse/softmax.py b/python/dgl/sparse/softmax.py
--- a/python/dgl/sparse/softmax.py
+++ b/python/dgl/sparse/softmax.py
@@ -8,11 +8,10 @@
__all__ = ["softmax"]
-def softmax(input: SparseMatrix) -> SparseMatrix:
- """Applies row-wise softmax to the non-zero elements of the sparse matrix.
-
- Equivalently, applies softmax to the non-zero elements of the sparse
- matrix along the column (``dim=1``) dimension.
+def softmax(input: SparseMatrix, dim: int = 1) -> SparseMatrix:
+ """Applies softmax to the non-zero elements of the sparse matrix on the
+ dimension :attr:``dim``. dim = 0 or 1 indicates column-wise or row-wise
+ softmax respectively.
If :attr:`input.val` takes shape ``(nnz, D)``, then the output matrix
:attr:`output` and :attr:`output.val` take the same shape as :attr:`input`
@@ -32,11 +31,10 @@
Examples
--------
- Case1: matrix with values of shape (nnz)
+ Case1: row-wise softmax on matrix with values of shape (nnz)
>>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])
- >>> nnz = len(row)
- >>> val = torch.arange(nnz).float()
+ >>> val = torch.tensor([0., 1., 2., 3.])
>>> A = dglsp.spmatrix(indices, val)
>>> dglsp.softmax(A)
SparseMatrix(indices=tensor([[0, 0, 1, 2],
@@ -44,7 +42,7 @@
values=tensor([0.2689, 0.7311, 1.0000, 1.0000]),
shape=(3, 3), nnz=4)
- Case2: matrix with values of shape (nnz, D)
+ Case2: row-wise softmax on matrix with values of shape (nnz, D)
>>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])
>>> val = torch.tensor([[0., 7.], [1., 3.], [2., 2.], [3., 1.]])
@@ -57,8 +55,21 @@
[1.0000, 1.0000],
[1.0000, 1.0000]]),
shape=(3, 3), nnz=4, val_size=(2,))
+
+ Case3: column-wise softmax on matrix with values of shape (nnz)
+
+ >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])
+ >>> val = torch.tensor([0., 1., 2., 3.])
+ >>> A = dglsp.spmatrix(indices, val)
+ >>> dglsp.softmax(A, 0)
+ SparseMatrix(indices=tensor([[0, 0, 1, 2],
+ [1, 2, 2, 0]]),
+ values=tensor([1.0000, 0.2689, 0.7311, 1.0000]),
+ shape=(3, 3), nnz=4)
"""
- return SparseMatrix(torch.ops.dgl_sparse.softmax(input.c_sparse_matrix))
+ return SparseMatrix(
+ torch.ops.dgl_sparse.softmax(input.c_sparse_matrix, dim)
+ )
SparseMatrix.softmax = softmax
| {"golden_diff": "diff --git a/python/dgl/sparse/softmax.py b/python/dgl/sparse/softmax.py\n--- a/python/dgl/sparse/softmax.py\n+++ b/python/dgl/sparse/softmax.py\n@@ -8,11 +8,10 @@\n __all__ = [\"softmax\"]\n \n \n-def softmax(input: SparseMatrix) -> SparseMatrix:\n- \"\"\"Applies row-wise softmax to the non-zero elements of the sparse matrix.\n-\n- Equivalently, applies softmax to the non-zero elements of the sparse\n- matrix along the column (``dim=1``) dimension.\n+def softmax(input: SparseMatrix, dim: int = 1) -> SparseMatrix:\n+ \"\"\"Applies softmax to the non-zero elements of the sparse matrix on the\n+ dimension :attr:``dim``. dim = 0 or 1 indicates column-wise or row-wise\n+ softmax respectively.\n \n If :attr:`input.val` takes shape ``(nnz, D)``, then the output matrix\n :attr:`output` and :attr:`output.val` take the same shape as :attr:`input`\n@@ -32,11 +31,10 @@\n Examples\n --------\n \n- Case1: matrix with values of shape (nnz)\n+ Case1: row-wise softmax on matrix with values of shape (nnz)\n \n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n- >>> nnz = len(row)\n- >>> val = torch.arange(nnz).float()\n+ >>> val = torch.tensor([0., 1., 2., 3.])\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n@@ -44,7 +42,7 @@\n values=tensor([0.2689, 0.7311, 1.0000, 1.0000]),\n shape=(3, 3), nnz=4)\n \n- Case2: matrix with values of shape (nnz, D)\n+ Case2: row-wise softmax on matrix with values of shape (nnz, D)\n \n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> val = torch.tensor([[0., 7.], [1., 3.], [2., 2.], [3., 1.]])\n@@ -57,8 +55,21 @@\n [1.0000, 1.0000],\n [1.0000, 1.0000]]),\n shape=(3, 3), nnz=4, val_size=(2,))\n+\n+ Case3: column-wise softmax on matrix with values of shape (nnz)\n+\n+ >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n+ >>> val = torch.tensor([0., 1., 2., 3.])\n+ >>> A = dglsp.spmatrix(indices, val)\n+ >>> dglsp.softmax(A, 0)\n+ SparseMatrix(indices=tensor([[0, 0, 1, 2],\n+ [1, 2, 2, 0]]),\n+ values=tensor([1.0000, 0.2689, 0.7311, 1.0000]),\n+ shape=(3, 3), nnz=4)\n \"\"\"\n- return SparseMatrix(torch.ops.dgl_sparse.softmax(input.c_sparse_matrix))\n+ return SparseMatrix(\n+ torch.ops.dgl_sparse.softmax(input.c_sparse_matrix, dim)\n+ )\n \n \n SparseMatrix.softmax = softmax\n", "issue": "[Sparse] Support column-wise softmax.\n## \ud83d\udd28Work Item\r\n\r\n**IMPORTANT:**\r\n* This template is only for dev team to track project progress. For feature request or bug report, please use the corresponding issue templates.\r\n* DO NOT create a new work item if the purpose is to fix an existing issue or feature request. 
We will directly use the issue in the project tracker.\r\n\r\nProject tracker: https://github.com/orgs/dmlc/projects/2\r\n\r\n## Description\r\n\r\n<!-- short description of the work item -->\r\n\r\n## Depending work items or issues\r\n\r\n<!-- what must be done before this -->\r\n\n", "before_files": [{"content": "\"\"\"Softmax op for SparseMatrix\"\"\"\n# pylint: disable=invalid-name, W0622\n\nimport torch\n\nfrom .sparse_matrix import SparseMatrix\n\n__all__ = [\"softmax\"]\n\n\ndef softmax(input: SparseMatrix) -> SparseMatrix:\n \"\"\"Applies row-wise softmax to the non-zero elements of the sparse matrix.\n\n Equivalently, applies softmax to the non-zero elements of the sparse\n matrix along the column (``dim=1``) dimension.\n\n If :attr:`input.val` takes shape ``(nnz, D)``, then the output matrix\n :attr:`output` and :attr:`output.val` take the same shape as :attr:`input`\n and :attr:`input.val`. :attr:`output.val[:, i]` is calculated based on\n :attr:`input.val[:, i]`.\n\n Parameters\n ----------\n input : SparseMatrix\n The input sparse matrix\n\n Returns\n -------\n SparseMatrix\n The output sparse matrix\n\n Examples\n --------\n\n Case1: matrix with values of shape (nnz)\n\n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> nnz = len(row)\n >>> val = torch.arange(nnz).float()\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n [1, 2, 2, 0]]),\n values=tensor([0.2689, 0.7311, 1.0000, 1.0000]),\n shape=(3, 3), nnz=4)\n\n Case2: matrix with values of shape (nnz, D)\n\n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> val = torch.tensor([[0., 7.], [1., 3.], [2., 2.], [3., 1.]])\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n [1, 2, 2, 0]]),\n values=tensor([[0.2689, 0.9820],\n [0.7311, 0.0180],\n [1.0000, 1.0000],\n [1.0000, 1.0000]]),\n shape=(3, 3), nnz=4, val_size=(2,))\n \"\"\"\n return SparseMatrix(torch.ops.dgl_sparse.softmax(input.c_sparse_matrix))\n\n\nSparseMatrix.softmax = softmax\n", "path": "python/dgl/sparse/softmax.py"}], "after_files": [{"content": "\"\"\"Softmax op for SparseMatrix\"\"\"\n# pylint: disable=invalid-name, W0622\n\nimport torch\n\nfrom .sparse_matrix import SparseMatrix\n\n__all__ = [\"softmax\"]\n\n\ndef softmax(input: SparseMatrix, dim: int = 1) -> SparseMatrix:\n \"\"\"Applies softmax to the non-zero elements of the sparse matrix on the\n dimension :attr:``dim``. dim = 0 or 1 indicates column-wise or row-wise\n softmax respectively.\n\n If :attr:`input.val` takes shape ``(nnz, D)``, then the output matrix\n :attr:`output` and :attr:`output.val` take the same shape as :attr:`input`\n and :attr:`input.val`. 
:attr:`output.val[:, i]` is calculated based on\n :attr:`input.val[:, i]`.\n\n Parameters\n ----------\n input : SparseMatrix\n The input sparse matrix\n\n Returns\n -------\n SparseMatrix\n The output sparse matrix\n\n Examples\n --------\n\n Case1: row-wise softmax on matrix with values of shape (nnz)\n\n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> val = torch.tensor([0., 1., 2., 3.])\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n [1, 2, 2, 0]]),\n values=tensor([0.2689, 0.7311, 1.0000, 1.0000]),\n shape=(3, 3), nnz=4)\n\n Case2: row-wise softmax on matrix with values of shape (nnz, D)\n\n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> val = torch.tensor([[0., 7.], [1., 3.], [2., 2.], [3., 1.]])\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n [1, 2, 2, 0]]),\n values=tensor([[0.2689, 0.9820],\n [0.7311, 0.0180],\n [1.0000, 1.0000],\n [1.0000, 1.0000]]),\n shape=(3, 3), nnz=4, val_size=(2,))\n\n Case3: column-wise softmax on matrix with values of shape (nnz)\n\n >>> indices = torch.tensor([[0, 0, 1, 2], [1, 2, 2, 0]])\n >>> val = torch.tensor([0., 1., 2., 3.])\n >>> A = dglsp.spmatrix(indices, val)\n >>> dglsp.softmax(A, 0)\n SparseMatrix(indices=tensor([[0, 0, 1, 2],\n [1, 2, 2, 0]]),\n values=tensor([1.0000, 0.2689, 0.7311, 1.0000]),\n shape=(3, 3), nnz=4)\n \"\"\"\n return SparseMatrix(\n torch.ops.dgl_sparse.softmax(input.c_sparse_matrix, dim)\n )\n\n\nSparseMatrix.softmax = softmax\n", "path": "python/dgl/sparse/softmax.py"}]} | 1,143 | 863 |
gh_patches_debug_23551 | rasdani/github-patches | git_diff | getnikola__nikola-1468 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
github_deploy broken if no remote `gh-pages` branch
If you just started out with Nikola and your remote doesn't have a `gh-pages` branch yet (which will be the case for most people), then `github_deploy` crashes with:
```
ERROR: github_deploy: Failed GitHub deployment — command ['git', 'pull', 'origin', 'gh-pages:gh-pages'] returned 1
```
`git pull origin gh-pages:gh-pages` won't work if there's no remote `gh-pages` branch. This is a big problem, because creating and pushing the blank branch is possible but nontrivial, and is one of the main automation "bonuses" of having a `nikola github_deploy` in the first place.
--- END ISSUE ---
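As an aside on the report above: `git pull <remote> gh-pages:gh-pages` can only succeed once the remote branch exists, so a first-time deploy has to publish the branch before any pull happens. The sketch below shows a minimal bootstrap of that kind; it is not Nikola code, and the helper name and the remote/branch defaults are assumptions.

```python
# Hypothetical bootstrap sketch (not part of Nikola): make sure the remote
# deploy branch exists so that a later `git pull remote gh-pages:gh-pages`
# has something to fetch. Assumes we are currently on the source branch.
import subprocess

def ensure_remote_deploy_branch(remote="origin", deploy="gh-pages"):
    heads = subprocess.check_output(
        ["git", "ls-remote", "--heads", remote, deploy]
    ).decode("utf-8")
    if not heads.strip():
        # Create the branch locally at the current commit and publish it.
        subprocess.check_call(["git", "branch", "--force", deploy])
        subprocess.check_call(["git", "push", remote, "%s:%s" % (deploy, deploy)])
```

A truly history-free branch would instead be created with `git checkout --orphan`, which is essentially what the plugin's own `_create_orphan_deploy_branch` (shown in the file listing below) already does locally.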
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nikola/plugins/command/github_deploy.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright © 2014 Puneeth Chaganti and others.
4
5 # Permission is hereby granted, free of charge, to any
6 # person obtaining a copy of this software and associated
7 # documentation files (the "Software"), to deal in the
8 # Software without restriction, including without limitation
9 # the rights to use, copy, modify, merge, publish,
10 # distribute, sublicense, and/or sell copies of the
11 # Software, and to permit persons to whom the Software is
12 # furnished to do so, subject to the following conditions:
13 #
14 # The above copyright notice and this permission notice
15 # shall be included in all copies or substantial portions of
16 # the Software.
17 #
18 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
19 # KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
20 # WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
21 # PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
22 # OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
23 # OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
24 # OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
25 # SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
26
27 from __future__ import print_function
28 import os
29 import shutil
30 import subprocess
31 import sys
32 from textwrap import dedent
33
34 from nikola.plugin_categories import Command
35 from nikola.plugins.command.check import real_scan_files
36 from nikola.utils import ask_yesno, get_logger
37 from nikola.__main__ import main
38 from nikola import __version__
39
40
41 def uni_check_output(*args, **kwargs):
42 o = subprocess.check_output(*args, **kwargs)
43 return o.decode('utf-8')
44
45
46 class CommandGitHubDeploy(Command):
47 """ Deploy site to GitHub pages. """
48 name = 'github_deploy'
49
50 doc_usage = ''
51 doc_purpose = 'deploy the site to GitHub pages'
52 doc_description = dedent(
53 """\
54 This command can be used to deploy your site to GitHub pages.
55 It performs the following actions:
56
57 1. Ensure that your site is a git repository, and git is on the PATH.
58 2. Ensure that the output directory is not committed on the
59 source branch.
60 3. Check for changes, and prompt the user to continue, if required.
61 4. Build the site
62 5. Clean any files that are "unknown" to Nikola.
63 6. Create a deploy branch, if one doesn't exist.
64 7. Commit the output to this branch. (NOTE: Any untracked source
65 files, may get committed at this stage, on the wrong branch!)
66 8. Push and deploy!
67
68 NOTE: This command needs your site to be a git repository, with a
69 master branch (or a different branch, configured using
70 GITHUB_SOURCE_BRANCH if you are pushing to user.github
71 .io/organization.github.io pages) containing the sources of your
72 site. You also, obviously, need to have `git` on your PATH,
73 and should be able to push to the repository specified as the remote
74 (origin, by default).
75 """
76 )
77
78 logger = None
79
80 _deploy_branch = ''
81 _source_branch = ''
82 _remote_name = ''
83
84 def _execute(self, command, args):
85
86 self.logger = get_logger(
87 CommandGitHubDeploy.name, self.site.loghandlers
88 )
89 self._source_branch = self.site.config.get(
90 'GITHUB_SOURCE_BRANCH', 'master'
91 )
92 self._deploy_branch = self.site.config.get(
93 'GITHUB_DEPLOY_BRANCH', 'gh-pages'
94 )
95 self._remote_name = self.site.config.get(
96 'GITHUB_REMOTE_NAME', 'origin'
97 )
98
99 self._ensure_git_repo()
100
101 self._exit_if_output_committed()
102
103 if not self._prompt_continue():
104 return
105
106 build = main(['build'])
107 if build != 0:
108 self.logger.error('Build failed, not deploying to GitHub')
109 sys.exit(build)
110
111 only_on_output, _ = real_scan_files(self.site)
112 for f in only_on_output:
113 os.unlink(f)
114
115 self._checkout_deploy_branch()
116
117 self._copy_output()
118
119 self._commit_and_push()
120
121 return
122
123 def _commit_and_push(self):
124 """ Commit all the files and push. """
125
126 deploy = self._deploy_branch
127 source = self._source_branch
128 remote = self._remote_name
129
130 source_commit = uni_check_output(['git', 'rev-parse', source])
131 commit_message = (
132 'Nikola auto commit.\n\n'
133 'Source commit: %s'
134 'Nikola version: %s' % (source_commit, __version__)
135 )
136
137 commands = [
138 ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)],
139 ['git', 'add', '-A'],
140 ['git', 'commit', '-m', commit_message],
141 ['git', 'push', '--force', remote, '%s:%s' % (deploy, deploy)],
142 ['git', 'checkout', source],
143 ]
144
145 for command in commands:
146 self.logger.info("==> {0}".format(command))
147 try:
148 subprocess.check_call(command)
149 except subprocess.CalledProcessError as e:
150 self.logger.error(
151 'Failed GitHub deployment — command {0} '
152 'returned {1}'.format(e.cmd, e.returncode)
153 )
154 sys.exit(e.returncode)
155
156 def _copy_output(self):
157 """ Copy all output to the top level directory. """
158 output_folder = self.site.config['OUTPUT_FOLDER']
159 for each in os.listdir(output_folder):
160 if os.path.exists(each):
161 if os.path.isdir(each):
162 shutil.rmtree(each)
163
164 else:
165 os.unlink(each)
166
167 shutil.move(os.path.join(output_folder, each), '.')
168
169 def _checkout_deploy_branch(self):
170 """ Check out the deploy branch
171
172 Creates an orphan branch if not present.
173
174 """
175
176 deploy = self._deploy_branch
177
178 try:
179 subprocess.check_call(
180 [
181 'git', 'show-ref', '--verify', '--quiet',
182 'refs/heads/%s' % deploy
183 ]
184 )
185 except subprocess.CalledProcessError:
186 self._create_orphan_deploy_branch()
187 else:
188 subprocess.check_call(['git', 'checkout', deploy])
189
190 def _create_orphan_deploy_branch(self):
191 """ Create an orphan deploy branch """
192
193 result = subprocess.check_call(
194 ['git', 'checkout', '--orphan', self._deploy_branch]
195 )
196 if result != 0:
197 self.logger.error('Failed to create a deploy branch')
198 sys.exit(1)
199
200 result = subprocess.check_call(['git', 'rm', '-rf', '.'])
201 if result != 0:
202 self.logger.error('Failed to create a deploy branch')
203 sys.exit(1)
204
205 with open('.gitignore', 'w') as f:
206 f.write('%s\n' % self.site.config['OUTPUT_FOLDER'])
207 f.write('%s\n' % self.site.config['CACHE_FOLDER'])
208 f.write('*.pyc\n')
209 f.write('*.db\n')
210
211 subprocess.check_call(['git', 'add', '.gitignore'])
212 subprocess.check_call(['git', 'commit', '-m', 'Add .gitignore'])
213
214 def _ensure_git_repo(self):
215 """ Ensure that the site is a git-repo.
216
217 Also make sure that a remote with the specified name exists.
218
219 """
220
221 try:
222 remotes = uni_check_output(['git', 'remote'])
223 except subprocess.CalledProcessError as e:
224 self.logger.notice('github_deploy needs a git repository!')
225 sys.exit(e.returncode)
226 except OSError as e:
227 import errno
228 self.logger.error('Running git failed with {0}'.format(e))
229 if e.errno == errno.ENOENT:
230 self.logger.notice('Is git on the PATH?')
231 sys.exit(1)
232 else:
233 if self._remote_name not in remotes:
234 self.logger.error(
235 'Need a remote called "%s" configured' % self._remote_name
236 )
237 sys.exit(1)
238
239 def _exit_if_output_committed(self):
240 """ Exit if the output folder is committed on the source branch. """
241
242 source = self._source_branch
243 subprocess.check_call(['git', 'checkout', source])
244
245 output_folder = self.site.config['OUTPUT_FOLDER']
246 output_log = uni_check_output(
247 ['git', 'ls-files', '--', output_folder]
248 )
249
250 if len(output_log.strip()) > 0:
251 self.logger.error(
252 'Output folder is committed on the source branch. '
253 'Cannot proceed until it is removed.'
254 )
255 sys.exit(1)
256
257 def _prompt_continue(self):
258 """ Show uncommitted changes, and ask if user wants to continue. """
259
260 changes = uni_check_output(['git', 'status', '--porcelain'])
261 if changes.strip():
262 changes = uni_check_output(['git', 'status']).strip()
263 message = (
264 "You have the following changes:\n%s\n\n"
265 "Anything not committed, and unknown to Nikola may be lost, "
266 "or committed onto the wrong branch. Do you wish to continue?"
267 ) % changes
268 proceed = ask_yesno(message, False)
269 else:
270 proceed = True
271
272 return proceed
273
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nikola/plugins/command/github_deploy.py b/nikola/plugins/command/github_deploy.py
--- a/nikola/plugins/command/github_deploy.py
+++ b/nikola/plugins/command/github_deploy.py
@@ -95,6 +95,9 @@
self._remote_name = self.site.config.get(
'GITHUB_REMOTE_NAME', 'origin'
)
+ self._pull_before_commit = self.site.config.get(
+ 'GITHUB_PULL_BEFORE_COMMIT', False
+ )
self._ensure_git_repo()
@@ -135,13 +138,15 @@
)
commands = [
- ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)],
['git', 'add', '-A'],
['git', 'commit', '-m', commit_message],
['git', 'push', '--force', remote, '%s:%s' % (deploy, deploy)],
['git', 'checkout', source],
]
+ if self._pull_before_commit:
+ commands.insert(0, ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)])
+
for command in commands:
self.logger.info("==> {0}".format(command))
try:
| {"golden_diff": "diff --git a/nikola/plugins/command/github_deploy.py b/nikola/plugins/command/github_deploy.py\n--- a/nikola/plugins/command/github_deploy.py\n+++ b/nikola/plugins/command/github_deploy.py\n@@ -95,6 +95,9 @@\n self._remote_name = self.site.config.get(\n 'GITHUB_REMOTE_NAME', 'origin'\n )\n+ self._pull_before_commit = self.site.config.get(\n+ 'GITHUB_PULL_BEFORE_COMMIT', False\n+ )\n \n self._ensure_git_repo()\n \n@@ -135,13 +138,15 @@\n )\n \n commands = [\n- ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)],\n ['git', 'add', '-A'],\n ['git', 'commit', '-m', commit_message],\n ['git', 'push', '--force', remote, '%s:%s' % (deploy, deploy)],\n ['git', 'checkout', source],\n ]\n \n+ if self._pull_before_commit:\n+ commands.insert(0, ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)])\n+\n for command in commands:\n self.logger.info(\"==> {0}\".format(command))\n try:\n", "issue": "github_deploy broken if no remote `gh-pages` branch\nIf you just started out with Nikola and your remote doesn't have a `gh-pages` branch yet (which will be the case for most people), then `github_deploy` crashes with:\n\n```\nERROR: github_deploy: Failed GitHub deployment \u2014 command ['git', 'pull', 'origin', 'gh-pages:gh-pages'] returned 1\n```\n\nSince `git pull origin gh-pages:gh-pages` won't work if there's no remote `gh-pages` branch. This is a big problem, because creating and pushing the blank branch is possible but nontrivial, and is one of the main automation \"bonuses\" of having a `nikola github_deploy` in the first place.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2014 Puneeth Chaganti and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\nfrom __future__ import print_function\nimport os\nimport shutil\nimport subprocess\nimport sys\nfrom textwrap import dedent\n\nfrom nikola.plugin_categories import Command\nfrom nikola.plugins.command.check import real_scan_files\nfrom nikola.utils import ask_yesno, get_logger\nfrom nikola.__main__ import main\nfrom nikola import __version__\n\n\ndef uni_check_output(*args, **kwargs):\n o = subprocess.check_output(*args, **kwargs)\n return o.decode('utf-8')\n\n\nclass CommandGitHubDeploy(Command):\n \"\"\" Deploy site to GitHub pages. 
\"\"\"\n name = 'github_deploy'\n\n doc_usage = ''\n doc_purpose = 'deploy the site to GitHub pages'\n doc_description = dedent(\n \"\"\"\\\n This command can be used to deploy your site to GitHub pages.\n It performs the following actions:\n\n 1. Ensure that your site is a git repository, and git is on the PATH.\n 2. Ensure that the output directory is not committed on the\n source branch.\n 3. Check for changes, and prompt the user to continue, if required.\n 4. Build the site\n 5. Clean any files that are \"unknown\" to Nikola.\n 6. Create a deploy branch, if one doesn't exist.\n 7. Commit the output to this branch. (NOTE: Any untracked source\n files, may get committed at this stage, on the wrong branch!)\n 8. Push and deploy!\n\n NOTE: This command needs your site to be a git repository, with a\n master branch (or a different branch, configured using\n GITHUB_SOURCE_BRANCH if you are pushing to user.github\n .io/organization.github.io pages) containing the sources of your\n site. You also, obviously, need to have `git` on your PATH,\n and should be able to push to the repository specified as the remote\n (origin, by default).\n \"\"\"\n )\n\n logger = None\n\n _deploy_branch = ''\n _source_branch = ''\n _remote_name = ''\n\n def _execute(self, command, args):\n\n self.logger = get_logger(\n CommandGitHubDeploy.name, self.site.loghandlers\n )\n self._source_branch = self.site.config.get(\n 'GITHUB_SOURCE_BRANCH', 'master'\n )\n self._deploy_branch = self.site.config.get(\n 'GITHUB_DEPLOY_BRANCH', 'gh-pages'\n )\n self._remote_name = self.site.config.get(\n 'GITHUB_REMOTE_NAME', 'origin'\n )\n\n self._ensure_git_repo()\n\n self._exit_if_output_committed()\n\n if not self._prompt_continue():\n return\n\n build = main(['build'])\n if build != 0:\n self.logger.error('Build failed, not deploying to GitHub')\n sys.exit(build)\n\n only_on_output, _ = real_scan_files(self.site)\n for f in only_on_output:\n os.unlink(f)\n\n self._checkout_deploy_branch()\n\n self._copy_output()\n\n self._commit_and_push()\n\n return\n\n def _commit_and_push(self):\n \"\"\" Commit all the files and push. \"\"\"\n\n deploy = self._deploy_branch\n source = self._source_branch\n remote = self._remote_name\n\n source_commit = uni_check_output(['git', 'rev-parse', source])\n commit_message = (\n 'Nikola auto commit.\\n\\n'\n 'Source commit: %s'\n 'Nikola version: %s' % (source_commit, __version__)\n )\n\n commands = [\n ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)],\n ['git', 'add', '-A'],\n ['git', 'commit', '-m', commit_message],\n ['git', 'push', '--force', remote, '%s:%s' % (deploy, deploy)],\n ['git', 'checkout', source],\n ]\n\n for command in commands:\n self.logger.info(\"==> {0}\".format(command))\n try:\n subprocess.check_call(command)\n except subprocess.CalledProcessError as e:\n self.logger.error(\n 'Failed GitHub deployment \u2014 command {0} '\n 'returned {1}'.format(e.cmd, e.returncode)\n )\n sys.exit(e.returncode)\n\n def _copy_output(self):\n \"\"\" Copy all output to the top level directory. 
\"\"\"\n output_folder = self.site.config['OUTPUT_FOLDER']\n for each in os.listdir(output_folder):\n if os.path.exists(each):\n if os.path.isdir(each):\n shutil.rmtree(each)\n\n else:\n os.unlink(each)\n\n shutil.move(os.path.join(output_folder, each), '.')\n\n def _checkout_deploy_branch(self):\n \"\"\" Check out the deploy branch\n\n Creates an orphan branch if not present.\n\n \"\"\"\n\n deploy = self._deploy_branch\n\n try:\n subprocess.check_call(\n [\n 'git', 'show-ref', '--verify', '--quiet',\n 'refs/heads/%s' % deploy\n ]\n )\n except subprocess.CalledProcessError:\n self._create_orphan_deploy_branch()\n else:\n subprocess.check_call(['git', 'checkout', deploy])\n\n def _create_orphan_deploy_branch(self):\n \"\"\" Create an orphan deploy branch \"\"\"\n\n result = subprocess.check_call(\n ['git', 'checkout', '--orphan', self._deploy_branch]\n )\n if result != 0:\n self.logger.error('Failed to create a deploy branch')\n sys.exit(1)\n\n result = subprocess.check_call(['git', 'rm', '-rf', '.'])\n if result != 0:\n self.logger.error('Failed to create a deploy branch')\n sys.exit(1)\n\n with open('.gitignore', 'w') as f:\n f.write('%s\\n' % self.site.config['OUTPUT_FOLDER'])\n f.write('%s\\n' % self.site.config['CACHE_FOLDER'])\n f.write('*.pyc\\n')\n f.write('*.db\\n')\n\n subprocess.check_call(['git', 'add', '.gitignore'])\n subprocess.check_call(['git', 'commit', '-m', 'Add .gitignore'])\n\n def _ensure_git_repo(self):\n \"\"\" Ensure that the site is a git-repo.\n\n Also make sure that a remote with the specified name exists.\n\n \"\"\"\n\n try:\n remotes = uni_check_output(['git', 'remote'])\n except subprocess.CalledProcessError as e:\n self.logger.notice('github_deploy needs a git repository!')\n sys.exit(e.returncode)\n except OSError as e:\n import errno\n self.logger.error('Running git failed with {0}'.format(e))\n if e.errno == errno.ENOENT:\n self.logger.notice('Is git on the PATH?')\n sys.exit(1)\n else:\n if self._remote_name not in remotes:\n self.logger.error(\n 'Need a remote called \"%s\" configured' % self._remote_name\n )\n sys.exit(1)\n\n def _exit_if_output_committed(self):\n \"\"\" Exit if the output folder is committed on the source branch. \"\"\"\n\n source = self._source_branch\n subprocess.check_call(['git', 'checkout', source])\n\n output_folder = self.site.config['OUTPUT_FOLDER']\n output_log = uni_check_output(\n ['git', 'ls-files', '--', output_folder]\n )\n\n if len(output_log.strip()) > 0:\n self.logger.error(\n 'Output folder is committed on the source branch. '\n 'Cannot proceed until it is removed.'\n )\n sys.exit(1)\n\n def _prompt_continue(self):\n \"\"\" Show uncommitted changes, and ask if user wants to continue. \"\"\"\n\n changes = uni_check_output(['git', 'status', '--porcelain'])\n if changes.strip():\n changes = uni_check_output(['git', 'status']).strip()\n message = (\n \"You have the following changes:\\n%s\\n\\n\"\n \"Anything not committed, and unknown to Nikola may be lost, \"\n \"or committed onto the wrong branch. 
Do you wish to continue?\"\n ) % changes\n proceed = ask_yesno(message, False)\n else:\n proceed = True\n\n return proceed\n", "path": "nikola/plugins/command/github_deploy.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2014 Puneeth Chaganti and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\nfrom __future__ import print_function\nimport os\nimport shutil\nimport subprocess\nimport sys\nfrom textwrap import dedent\n\nfrom nikola.plugin_categories import Command\nfrom nikola.plugins.command.check import real_scan_files\nfrom nikola.utils import ask_yesno, get_logger\nfrom nikola.__main__ import main\nfrom nikola import __version__\n\n\ndef uni_check_output(*args, **kwargs):\n o = subprocess.check_output(*args, **kwargs)\n return o.decode('utf-8')\n\n\nclass CommandGitHubDeploy(Command):\n \"\"\" Deploy site to GitHub pages. \"\"\"\n name = 'github_deploy'\n\n doc_usage = ''\n doc_purpose = 'deploy the site to GitHub pages'\n doc_description = dedent(\n \"\"\"\\\n This command can be used to deploy your site to GitHub pages.\n It performs the following actions:\n\n 1. Ensure that your site is a git repository, and git is on the PATH.\n 2. Ensure that the output directory is not committed on the\n source branch.\n 3. Check for changes, and prompt the user to continue, if required.\n 4. Build the site\n 5. Clean any files that are \"unknown\" to Nikola.\n 6. Create a deploy branch, if one doesn't exist.\n 7. Commit the output to this branch. (NOTE: Any untracked source\n files, may get committed at this stage, on the wrong branch!)\n 8. Push and deploy!\n\n NOTE: This command needs your site to be a git repository, with a\n master branch (or a different branch, configured using\n GITHUB_SOURCE_BRANCH if you are pushing to user.github\n .io/organization.github.io pages) containing the sources of your\n site. 
You also, obviously, need to have `git` on your PATH,\n and should be able to push to the repository specified as the remote\n (origin, by default).\n \"\"\"\n )\n\n logger = None\n\n _deploy_branch = ''\n _source_branch = ''\n _remote_name = ''\n\n def _execute(self, command, args):\n\n self.logger = get_logger(\n CommandGitHubDeploy.name, self.site.loghandlers\n )\n self._source_branch = self.site.config.get(\n 'GITHUB_SOURCE_BRANCH', 'master'\n )\n self._deploy_branch = self.site.config.get(\n 'GITHUB_DEPLOY_BRANCH', 'gh-pages'\n )\n self._remote_name = self.site.config.get(\n 'GITHUB_REMOTE_NAME', 'origin'\n )\n self._pull_before_commit = self.site.config.get(\n 'GITHUB_PULL_BEFORE_COMMIT', False\n )\n\n self._ensure_git_repo()\n\n self._exit_if_output_committed()\n\n if not self._prompt_continue():\n return\n\n build = main(['build'])\n if build != 0:\n self.logger.error('Build failed, not deploying to GitHub')\n sys.exit(build)\n\n only_on_output, _ = real_scan_files(self.site)\n for f in only_on_output:\n os.unlink(f)\n\n self._checkout_deploy_branch()\n\n self._copy_output()\n\n self._commit_and_push()\n\n return\n\n def _commit_and_push(self):\n \"\"\" Commit all the files and push. \"\"\"\n\n deploy = self._deploy_branch\n source = self._source_branch\n remote = self._remote_name\n\n source_commit = uni_check_output(['git', 'rev-parse', source])\n commit_message = (\n 'Nikola auto commit.\\n\\n'\n 'Source commit: %s'\n 'Nikola version: %s' % (source_commit, __version__)\n )\n\n commands = [\n ['git', 'add', '-A'],\n ['git', 'commit', '-m', commit_message],\n ['git', 'push', '--force', remote, '%s:%s' % (deploy, deploy)],\n ['git', 'checkout', source],\n ]\n\n if self._pull_before_commit:\n commands.insert(0, ['git', 'pull', '--rebase=false', remote, '%s:%s' % (deploy, deploy)])\n\n for command in commands:\n self.logger.info(\"==> {0}\".format(command))\n try:\n subprocess.check_call(command)\n except subprocess.CalledProcessError as e:\n self.logger.error(\n 'Failed GitHub deployment \u2014 command {0} '\n 'returned {1}'.format(e.cmd, e.returncode)\n )\n sys.exit(e.returncode)\n\n def _copy_output(self):\n \"\"\" Copy all output to the top level directory. 
\"\"\"\n output_folder = self.site.config['OUTPUT_FOLDER']\n for each in os.listdir(output_folder):\n if os.path.exists(each):\n if os.path.isdir(each):\n shutil.rmtree(each)\n\n else:\n os.unlink(each)\n\n shutil.move(os.path.join(output_folder, each), '.')\n\n def _checkout_deploy_branch(self):\n \"\"\" Check out the deploy branch\n\n Creates an orphan branch if not present.\n\n \"\"\"\n\n deploy = self._deploy_branch\n\n try:\n subprocess.check_call(\n [\n 'git', 'show-ref', '--verify', '--quiet',\n 'refs/heads/%s' % deploy\n ]\n )\n except subprocess.CalledProcessError:\n self._create_orphan_deploy_branch()\n else:\n subprocess.check_call(['git', 'checkout', deploy])\n\n def _create_orphan_deploy_branch(self):\n \"\"\" Create an orphan deploy branch \"\"\"\n\n result = subprocess.check_call(\n ['git', 'checkout', '--orphan', self._deploy_branch]\n )\n if result != 0:\n self.logger.error('Failed to create a deploy branch')\n sys.exit(1)\n\n result = subprocess.check_call(['git', 'rm', '-rf', '.'])\n if result != 0:\n self.logger.error('Failed to create a deploy branch')\n sys.exit(1)\n\n with open('.gitignore', 'w') as f:\n f.write('%s\\n' % self.site.config['OUTPUT_FOLDER'])\n f.write('%s\\n' % self.site.config['CACHE_FOLDER'])\n f.write('*.pyc\\n')\n f.write('*.db\\n')\n\n subprocess.check_call(['git', 'add', '.gitignore'])\n subprocess.check_call(['git', 'commit', '-m', 'Add .gitignore'])\n\n def _ensure_git_repo(self):\n \"\"\" Ensure that the site is a git-repo.\n\n Also make sure that a remote with the specified name exists.\n\n \"\"\"\n\n try:\n remotes = uni_check_output(['git', 'remote'])\n except subprocess.CalledProcessError as e:\n self.logger.notice('github_deploy needs a git repository!')\n sys.exit(e.returncode)\n except OSError as e:\n import errno\n self.logger.error('Running git failed with {0}'.format(e))\n if e.errno == errno.ENOENT:\n self.logger.notice('Is git on the PATH?')\n sys.exit(1)\n else:\n if self._remote_name not in remotes:\n self.logger.error(\n 'Need a remote called \"%s\" configured' % self._remote_name\n )\n sys.exit(1)\n\n def _exit_if_output_committed(self):\n \"\"\" Exit if the output folder is committed on the source branch. \"\"\"\n\n source = self._source_branch\n subprocess.check_call(['git', 'checkout', source])\n\n output_folder = self.site.config['OUTPUT_FOLDER']\n output_log = uni_check_output(\n ['git', 'ls-files', '--', output_folder]\n )\n\n if len(output_log.strip()) > 0:\n self.logger.error(\n 'Output folder is committed on the source branch. '\n 'Cannot proceed until it is removed.'\n )\n sys.exit(1)\n\n def _prompt_continue(self):\n \"\"\" Show uncommitted changes, and ask if user wants to continue. \"\"\"\n\n changes = uni_check_output(['git', 'status', '--porcelain'])\n if changes.strip():\n changes = uni_check_output(['git', 'status']).strip()\n message = (\n \"You have the following changes:\\n%s\\n\\n\"\n \"Anything not committed, and unknown to Nikola may be lost, \"\n \"or committed onto the wrong branch. Do you wish to continue?\"\n ) % changes\n proceed = ask_yesno(message, False)\n else:\n proceed = True\n\n return proceed\n", "path": "nikola/plugins/command/github_deploy.py"}]} | 3,166 | 284 |
gh_patches_debug_20857 | rasdani/github-patches | git_diff | bridgecrewio__checkov-3127 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
baseline output can change resource order for each run
If I generate a baseline file, then make some improvements to my Terraform code and run the baseline again, what I am finding is that the order of the resources for each file can often change, which then shows up as a diff against the previous baseline file - when in reality nothing has changed but the order of the resources in the findings array in the baseline output file.
I was wondering: could the findings array just be sorted before being output? Then the resource order would be fixed, and any actual diffs would be real changes to check_ids (which is sorted already) or new resources being added.
e.g. this is a diff from two runs of generating a baseline file; nothing has actually changed, just resources moved around in the array.
```
@@ -100,13 +100,12 @@
"file": "/main.tf",
"findings": [
{
- "resource": "aws_s3_bucket.canary_artifacts",
+ "resource": "aws_s3_bucket.backups",
"check_ids": [
"CKV2_AWS_6",
"CKV_AWS_144",
"CKV_AWS_145",
- "CKV_AWS_18",
- "CKV_AWS_21"
+ "CKV_AWS_18"
]
},
{
@@ -119,12 +118,13 @@
]
},
{
- "resource": "aws_s3_bucket.lambdas",
+ "resource": "aws_s3_bucket.canary_artifacts",
"check_ids": [
"CKV2_AWS_6",
"CKV_AWS_144",
"CKV_AWS_145",
- "CKV_AWS_18"
+ "CKV_AWS_18",
+ "CKV_AWS_21"
]
},
{
@@ -137,7 +137,7 @@
]
},
{
- "resource": "aws_s3_bucket.backups",
+ "resource": "aws_s3_bucket.lambdas",
"check_ids": [
"CKV2_AWS_6",
"CKV_AWS_144",
```
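
One way to make the output deterministic, as proposed above, is to sort both the per-file findings and the per-file entries before writing the baseline. This is only a rough sketch, assuming each finding is a plain dict with `resource` and `check_ids` keys:

```python
from operator import itemgetter

def stable_failed_checks(path_failed_checks_map):
    """Return the failed_checks list in a deterministic order (sketch)."""
    failed_checks = []
    for file, findings in path_failed_checks_map.items():
        ordered = sorted(
            (
                {"resource": f["resource"], "check_ids": sorted(f["check_ids"])}
                for f in findings
            ),
            key=itemgetter("resource"),
        )
        failed_checks.append({"file": file, "findings": ordered})
    # Sort the per-file entries too, so the whole document is stable between runs
    return sorted(failed_checks, key=itemgetter("file"))
```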
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/common/output/baseline.py`
Content:
```
1 from __future__ import annotations
2
3 import json
4 from collections import defaultdict
5 from checkov.common.models.enums import CheckResult
6 from typing import Any, TYPE_CHECKING
7
8 if TYPE_CHECKING:
9 from checkov.common.output.record import Record
10 from checkov.common.output.report import Report
11 from checkov.common.typing import _BaselineFinding, _BaselineFailedChecks
12
13
14 class Baseline:
15 def __init__(self, output_skipped: bool = False) -> None:
16 self.path = ""
17 self.path_failed_checks_map: dict[str, list[_BaselineFinding]] = defaultdict(list)
18 self.failed_checks: list[_BaselineFailedChecks] = []
19 self.output_skipped = output_skipped
20
21 def add_findings_from_report(self, report: Report) -> None:
22 for check in report.failed_checks:
23 try:
24 existing = next(
25 x for x in self.path_failed_checks_map[check.file_path] if x["resource"] == check.resource
26 )
27 except StopIteration:
28 existing = {"resource": check.resource, "check_ids": []}
29 self.path_failed_checks_map[check.file_path].append(existing)
30 existing["check_ids"].append(check.check_id)
31 existing["check_ids"].sort() # Sort the check IDs to be nicer to the eye
32
33 def to_dict(self) -> dict[str, Any]:
34 """
35 The output of this class needs to be very explicit, hence the following structure of the dict:
36 {
37 "failed_checks": [
38 {
39 "file": "path/to/file",
40 "findings: [
41 {
42 "resource": "aws_s3_bucket.this",
43 "check_ids": [
44 "CKV_AWS_1",
45 "CKV_AWS_2",
46 "CKV_AWS_3"
47 ]
48 }
49 ]
50 }
51 ]
52 }
53 """
54 failed_checks_list = []
55 for file, findings in self.path_failed_checks_map.items():
56 formatted_findings = []
57 for finding in findings:
58 formatted_findings.append({"resource": finding["resource"], "check_ids": finding["check_ids"]})
59 failed_checks_list.append({"file": file, "findings": formatted_findings})
60
61 resp = {"failed_checks": failed_checks_list}
62 return resp
63
64 def compare_and_reduce_reports(self, scan_reports: list[Report]) -> None:
65 for scan_report in scan_reports:
66 scan_report.passed_checks = [
67 check for check in scan_report.passed_checks if self._is_check_in_baseline(check)
68 ]
69 scan_report.skipped_checks = [
70 check for check in scan_report.skipped_checks if self._is_check_in_baseline(check)
71 ]
72 if self.output_skipped:
73 for check in scan_report.failed_checks:
74 if self._is_check_in_baseline(check):
75 check.check_result["suppress_comment"] = "baseline-skipped"
76 check.check_result["result"] = CheckResult.SKIPPED
77 scan_report.skipped_checks.append(check)
78 scan_report.failed_checks = [
79 check for check in scan_report.failed_checks if not self._is_check_in_baseline(check)
80 ]
81
82 def _is_check_in_baseline(self, check: Record) -> bool:
83 failed_check_id = check.check_id
84 failed_check_resource = check.resource
85 for baseline_failed_check in self.failed_checks:
86 for finding in baseline_failed_check["findings"]:
87 if finding["resource"] == failed_check_resource and failed_check_id in finding["check_ids"]:
88 return True
89 return False
90
91 def from_json(self, file_path: str) -> None:
92 self.path = file_path
93 with open(file_path, "r") as f:
94 baseline_raw = json.load(f)
95 self.failed_checks = baseline_raw.get("failed_checks", {})
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/common/output/baseline.py b/checkov/common/output/baseline.py
--- a/checkov/common/output/baseline.py
+++ b/checkov/common/output/baseline.py
@@ -2,6 +2,8 @@
import json
from collections import defaultdict
+from operator import itemgetter
+
from checkov.common.models.enums import CheckResult
from typing import Any, TYPE_CHECKING
@@ -56,9 +58,9 @@
formatted_findings = []
for finding in findings:
formatted_findings.append({"resource": finding["resource"], "check_ids": finding["check_ids"]})
- failed_checks_list.append({"file": file, "findings": formatted_findings})
+ failed_checks_list.append({"file": file, "findings": sorted(formatted_findings, key=itemgetter("resource"))})
- resp = {"failed_checks": failed_checks_list}
+ resp = {"failed_checks": sorted(failed_checks_list, key=itemgetter("file"))}
return resp
def compare_and_reduce_reports(self, scan_reports: list[Report]) -> None:
| {"golden_diff": "diff --git a/checkov/common/output/baseline.py b/checkov/common/output/baseline.py\n--- a/checkov/common/output/baseline.py\n+++ b/checkov/common/output/baseline.py\n@@ -2,6 +2,8 @@\n \n import json\n from collections import defaultdict\n+from operator import itemgetter\n+\n from checkov.common.models.enums import CheckResult\n from typing import Any, TYPE_CHECKING\n \n@@ -56,9 +58,9 @@\n formatted_findings = []\n for finding in findings:\n formatted_findings.append({\"resource\": finding[\"resource\"], \"check_ids\": finding[\"check_ids\"]})\n- failed_checks_list.append({\"file\": file, \"findings\": formatted_findings})\n+ failed_checks_list.append({\"file\": file, \"findings\": sorted(formatted_findings, key=itemgetter(\"resource\"))})\n \n- resp = {\"failed_checks\": failed_checks_list}\n+ resp = {\"failed_checks\": sorted(failed_checks_list, key=itemgetter(\"file\"))}\n return resp\n \n def compare_and_reduce_reports(self, scan_reports: list[Report]) -> None:\n", "issue": "baseline output can change resource order for each run\nIf I generate a baseline file and I have then made some improvements to my Terraform code and I run the baseline again. What I am finding is that the order of the resources for each file can often change which then shows up as a diff against the prevous baseline file - when in reality nothing has change but the order of the resources in the findings array in the baseline output file \r\n\r\nI was wondering could the findings array just be sorted before being output? Then the resource order should be fixed and any actual diffs should be real changes to check_ids (which is sorted already) or new resources being added?\r\n\r\ne.g. this is a diff from two runs of generating a baseline file nothing has actually change just resources moved around in the array.\r\n\r\n```\r\n@@ -100,13 +100,12 @@\r\n \"file\": \"/main.tf\",\r\n \"findings\": [\r\n {\r\n- \"resource\": \"aws_s3_bucket.canary_artifacts\",\r\n+ \"resource\": \"aws_s3_bucket.backups\",\r\n \"check_ids\": [\r\n \"CKV2_AWS_6\",\r\n \"CKV_AWS_144\",\r\n \"CKV_AWS_145\",\r\n- \"CKV_AWS_18\",\r\n- \"CKV_AWS_21\"\r\n+ \"CKV_AWS_18\"\r\n ]\r\n },\r\n {\r\n@@ -119,12 +118,13 @@\r\n ]\r\n },\r\n {\r\n- \"resource\": \"aws_s3_bucket.lambdas\",\r\n+ \"resource\": \"aws_s3_bucket.canary_artifacts\",\r\n \"check_ids\": [\r\n \"CKV2_AWS_6\",\r\n \"CKV_AWS_144\",\r\n \"CKV_AWS_145\",\r\n- \"CKV_AWS_18\"\r\n+ \"CKV_AWS_18\",\r\n+ \"CKV_AWS_21\"\r\n ]\r\n },\r\n {\r\n@@ -137,7 +137,7 @@\r\n ]\r\n },\r\n {\r\n- \"resource\": \"aws_s3_bucket.backups\",\r\n+ \"resource\": \"aws_s3_bucket.lambdas\",\r\n \"check_ids\": [\r\n \"CKV2_AWS_6\",\r\n \"CKV_AWS_144\",\r\n```\n", "before_files": [{"content": "from __future__ import annotations\n\nimport json\nfrom collections import defaultdict\nfrom checkov.common.models.enums import CheckResult\nfrom typing import Any, TYPE_CHECKING\n\nif TYPE_CHECKING:\n from checkov.common.output.record import Record\n from checkov.common.output.report import Report\n from checkov.common.typing import _BaselineFinding, _BaselineFailedChecks\n\n\nclass Baseline:\n def __init__(self, output_skipped: bool = False) -> None:\n self.path = \"\"\n self.path_failed_checks_map: dict[str, list[_BaselineFinding]] = defaultdict(list)\n self.failed_checks: list[_BaselineFailedChecks] = []\n self.output_skipped = output_skipped\n\n def add_findings_from_report(self, report: Report) -> None:\n for check in report.failed_checks:\n try:\n existing = next(\n x for x in 
self.path_failed_checks_map[check.file_path] if x[\"resource\"] == check.resource\n )\n except StopIteration:\n existing = {\"resource\": check.resource, \"check_ids\": []}\n self.path_failed_checks_map[check.file_path].append(existing)\n existing[\"check_ids\"].append(check.check_id)\n existing[\"check_ids\"].sort() # Sort the check IDs to be nicer to the eye\n\n def to_dict(self) -> dict[str, Any]:\n \"\"\"\n The output of this class needs to be very explicit, hence the following structure of the dict:\n {\n \"failed_checks\": [\n {\n \"file\": \"path/to/file\",\n \"findings: [\n {\n \"resource\": \"aws_s3_bucket.this\",\n \"check_ids\": [\n \"CKV_AWS_1\",\n \"CKV_AWS_2\",\n \"CKV_AWS_3\"\n ]\n }\n ]\n }\n ]\n }\n \"\"\"\n failed_checks_list = []\n for file, findings in self.path_failed_checks_map.items():\n formatted_findings = []\n for finding in findings:\n formatted_findings.append({\"resource\": finding[\"resource\"], \"check_ids\": finding[\"check_ids\"]})\n failed_checks_list.append({\"file\": file, \"findings\": formatted_findings})\n\n resp = {\"failed_checks\": failed_checks_list}\n return resp\n\n def compare_and_reduce_reports(self, scan_reports: list[Report]) -> None:\n for scan_report in scan_reports:\n scan_report.passed_checks = [\n check for check in scan_report.passed_checks if self._is_check_in_baseline(check)\n ]\n scan_report.skipped_checks = [\n check for check in scan_report.skipped_checks if self._is_check_in_baseline(check)\n ]\n if self.output_skipped:\n for check in scan_report.failed_checks:\n if self._is_check_in_baseline(check):\n check.check_result[\"suppress_comment\"] = \"baseline-skipped\"\n check.check_result[\"result\"] = CheckResult.SKIPPED\n scan_report.skipped_checks.append(check)\n scan_report.failed_checks = [\n check for check in scan_report.failed_checks if not self._is_check_in_baseline(check)\n ]\n\n def _is_check_in_baseline(self, check: Record) -> bool:\n failed_check_id = check.check_id\n failed_check_resource = check.resource\n for baseline_failed_check in self.failed_checks:\n for finding in baseline_failed_check[\"findings\"]:\n if finding[\"resource\"] == failed_check_resource and failed_check_id in finding[\"check_ids\"]:\n return True\n return False\n\n def from_json(self, file_path: str) -> None:\n self.path = file_path\n with open(file_path, \"r\") as f:\n baseline_raw = json.load(f)\n self.failed_checks = baseline_raw.get(\"failed_checks\", {})\n", "path": "checkov/common/output/baseline.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport json\nfrom collections import defaultdict\nfrom operator import itemgetter\n\nfrom checkov.common.models.enums import CheckResult\nfrom typing import Any, TYPE_CHECKING\n\nif TYPE_CHECKING:\n from checkov.common.output.record import Record\n from checkov.common.output.report import Report\n from checkov.common.typing import _BaselineFinding, _BaselineFailedChecks\n\n\nclass Baseline:\n def __init__(self, output_skipped: bool = False) -> None:\n self.path = \"\"\n self.path_failed_checks_map: dict[str, list[_BaselineFinding]] = defaultdict(list)\n self.failed_checks: list[_BaselineFailedChecks] = []\n self.output_skipped = output_skipped\n\n def add_findings_from_report(self, report: Report) -> None:\n for check in report.failed_checks:\n try:\n existing = next(\n x for x in self.path_failed_checks_map[check.file_path] if x[\"resource\"] == check.resource\n )\n except StopIteration:\n existing = {\"resource\": check.resource, \"check_ids\": []}\n 
self.path_failed_checks_map[check.file_path].append(existing)\n existing[\"check_ids\"].append(check.check_id)\n existing[\"check_ids\"].sort() # Sort the check IDs to be nicer to the eye\n\n def to_dict(self) -> dict[str, Any]:\n \"\"\"\n The output of this class needs to be very explicit, hence the following structure of the dict:\n {\n \"failed_checks\": [\n {\n \"file\": \"path/to/file\",\n \"findings: [\n {\n \"resource\": \"aws_s3_bucket.this\",\n \"check_ids\": [\n \"CKV_AWS_1\",\n \"CKV_AWS_2\",\n \"CKV_AWS_3\"\n ]\n }\n ]\n }\n ]\n }\n \"\"\"\n failed_checks_list = []\n for file, findings in self.path_failed_checks_map.items():\n formatted_findings = []\n for finding in findings:\n formatted_findings.append({\"resource\": finding[\"resource\"], \"check_ids\": finding[\"check_ids\"]})\n failed_checks_list.append({\"file\": file, \"findings\": sorted(formatted_findings, key=itemgetter(\"resource\"))})\n\n resp = {\"failed_checks\": sorted(failed_checks_list, key=itemgetter(\"file\"))}\n return resp\n\n def compare_and_reduce_reports(self, scan_reports: list[Report]) -> None:\n for scan_report in scan_reports:\n scan_report.passed_checks = [\n check for check in scan_report.passed_checks if self._is_check_in_baseline(check)\n ]\n scan_report.skipped_checks = [\n check for check in scan_report.skipped_checks if self._is_check_in_baseline(check)\n ]\n if self.output_skipped:\n for check in scan_report.failed_checks:\n if self._is_check_in_baseline(check):\n check.check_result[\"suppress_comment\"] = \"baseline-skipped\"\n check.check_result[\"result\"] = CheckResult.SKIPPED\n scan_report.skipped_checks.append(check)\n scan_report.failed_checks = [\n check for check in scan_report.failed_checks if not self._is_check_in_baseline(check)\n ]\n\n def _is_check_in_baseline(self, check: Record) -> bool:\n failed_check_id = check.check_id\n failed_check_resource = check.resource\n for baseline_failed_check in self.failed_checks:\n for finding in baseline_failed_check[\"findings\"]:\n if finding[\"resource\"] == failed_check_resource and failed_check_id in finding[\"check_ids\"]:\n return True\n return False\n\n def from_json(self, file_path: str) -> None:\n self.path = file_path\n with open(file_path, \"r\") as f:\n baseline_raw = json.load(f)\n self.failed_checks = baseline_raw.get(\"failed_checks\", {})\n", "path": "checkov/common/output/baseline.py"}]} | 1,745 | 235 |
gh_patches_debug_34812 | rasdani/github-patches | git_diff | wagtail__wagtail-10961 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Avoid purging Revisions in use by third-party packages
I originally raised this as part of some rework on the `purge_revisions` management command in https://github.com/wagtail/wagtail/pull/10619#pullrequestreview-1528373346. We considered it out of scope to fix the issue in that PR since Wagtail has always worked like this. Hence I'm raising it again separately in this issue.
### Is your proposal related to a problem?
Wagtail provides a `purge_revisions` management command to delete stale `Revision` objects from the database. This command is aware of revisions that are still in use by Wagtail's workflow feature and avoids deleting those revisions ([link to source](https://github.com/wagtail/wagtail/blob/3d0282573d0d23f9eabe2fb6427e58b01f79ffe0/wagtail/management/commands/purge_revisions.py#L61-L66)).
However, it is somewhat common for third-party packages to rely on a specific revision. [wagtail-ab-testing](https://github.com/wagtail-nest/wagtail-ab-testing/blob/d04bd0032c4335a839c781851b12fe16037950de/wagtail_ab_testing/models.py#L76) is a real world example of this. It maintains a ForeignKey to a specific revision to track an AB experiment. If `purge_revisions` were to remove a revision in use by wagtail-ab-testing, experiment data loss would occur.
Data loss can be prevented by specifying the ForeignKey with `on_delete=models.PROTECT`, but `purge_revisions` wouldn't be able to handle this and would crash with a `ProtectedError`. This means third-party packages have no recourse: either they allow the revisions they depend on to be deleted, or they mark their relations as protected and break Wagtail's `purge_revisions` command.
### Describe the solution you'd like
I propose `purge_revisions` gracefully recovers from `ProtectedError` when trying to delete a revision. I also propose we document that relations to a `Revision` should be marked with `on_delete=models.PROTECT` if data loss is not desirable.
I also feel like the `purge_revisions` command should report the number of revisions that it couldn't delete because they were protected.
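
A minimal sketch of that graceful handling, assuming revisions are deleted one at a time so a protected revision can simply be skipped and counted:

```python
from django.db.models.deletion import ProtectedError

def delete_purgeable(purgeable_revisions):
    """Delete stale revisions, skipping any that other apps protect (sketch)."""
    deleted, protected = 0, 0
    for revision in purgeable_revisions.iterator():
        if revision.is_latest_revision():
            continue  # never delete the latest revision
        try:
            revision.delete()
            deleted += 1
        except ProtectedError:
            # e.g. wagtail-ab-testing still holds a protected FK to this revision
            protected += 1
    return deleted, protected
```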
### Describe alternatives you've considered
An alternative mechanism I considered would be an API that allows packages to inform Wagtail which revisions they still use. The `purge_revisions` command can take this information into account and avoid revisions still in use.
This is a more complex solution with no apparent benefit over the solution described above.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/management/commands/purge_revisions.py`
Content:
```
1 from django.conf import settings
2 from django.core.management.base import BaseCommand
3 from django.db.models import Q
4 from django.utils import timezone
5
6 from wagtail.models import Revision, WorkflowState
7
8
9 class Command(BaseCommand):
10 help = "Delete revisions which are not the latest revision, published or scheduled to be published, or in moderation"
11
12 def add_arguments(self, parser):
13 parser.add_argument(
14 "--days",
15 type=int,
16 help="Only delete revisions older than this number of days",
17 )
18 parser.add_argument(
19 "--pages",
20 action="store_true",
21 help="Only delete revisions of page models",
22 )
23 parser.add_argument(
24 "--non-pages",
25 action="store_true",
26 help="Only delete revisions of non-page models",
27 )
28
29 def handle(self, *args, **options):
30 days = options.get("days")
31 pages = options.get("pages")
32 non_pages = options.get("non_pages")
33
34 revisions_deleted = purge_revisions(days=days, pages=pages, non_pages=non_pages)
35
36 if revisions_deleted:
37 self.stdout.write(
38 self.style.SUCCESS(
39 "Successfully deleted %s revisions" % revisions_deleted
40 )
41 )
42 else:
43 self.stdout.write("No revisions deleted")
44
45
46 def purge_revisions(days=None, pages=True, non_pages=True):
47 if pages == non_pages:
48 # If both are True or both are False, purge revisions of pages and non-pages
49 objects = Revision.objects.all()
50 elif pages:
51 objects = Revision.objects.page_revisions()
52 elif non_pages:
53 objects = Revision.objects.not_page_revisions()
54
55 # exclude revisions which have been submitted for moderation in the old system
56 # RemovedInWagtail60Warning
57 # Remove this when the deprecation period for the legacy
58 # moderation system ends.
59 purgeable_revisions = objects.exclude(submitted_for_moderation=True).exclude(
60 # and exclude revisions with an approved_go_live_at date
61 approved_go_live_at__isnull=False
62 )
63
64 if getattr(settings, "WAGTAIL_WORKFLOW_ENABLED", True):
65 purgeable_revisions = purgeable_revisions.exclude(
66 # and exclude revisions linked to an in progress or needs changes workflow state
67 Q(task_states__workflow_state__status=WorkflowState.STATUS_IN_PROGRESS)
68 | Q(task_states__workflow_state__status=WorkflowState.STATUS_NEEDS_CHANGES)
69 )
70
71 if days:
72 purgeable_until = timezone.now() - timezone.timedelta(days=days)
73 # only include revisions which were created before the cut off date
74 purgeable_revisions = purgeable_revisions.filter(created_at__lt=purgeable_until)
75
76 deleted_revisions_count = 0
77
78 for revision in purgeable_revisions.iterator():
79 # don't delete the latest revision
80 if not revision.is_latest_revision():
81 revision.delete()
82 deleted_revisions_count += 1
83
84 return deleted_revisions_count
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/management/commands/purge_revisions.py b/wagtail/management/commands/purge_revisions.py
--- a/wagtail/management/commands/purge_revisions.py
+++ b/wagtail/management/commands/purge_revisions.py
@@ -1,6 +1,7 @@
from django.conf import settings
from django.core.management.base import BaseCommand
from django.db.models import Q
+from django.db.models.deletion import ProtectedError
from django.utils import timezone
from wagtail.models import Revision, WorkflowState
@@ -31,7 +32,9 @@
pages = options.get("pages")
non_pages = options.get("non_pages")
- revisions_deleted = purge_revisions(days=days, pages=pages, non_pages=non_pages)
+ revisions_deleted, protected_error_count = purge_revisions(
+ days=days, pages=pages, non_pages=non_pages
+ )
if revisions_deleted:
self.stdout.write(
@@ -39,6 +42,12 @@
"Successfully deleted %s revisions" % revisions_deleted
)
)
+ self.stdout.write(
+ self.style.SUCCESS(
+ "Ignored %s revisions because one or more protected relations exist that prevent deletion."
+ % protected_error_count
+ )
+ )
else:
self.stdout.write("No revisions deleted")
@@ -74,11 +83,15 @@
purgeable_revisions = purgeable_revisions.filter(created_at__lt=purgeable_until)
deleted_revisions_count = 0
+ protected_error_count = 0
for revision in purgeable_revisions.iterator():
# don't delete the latest revision
if not revision.is_latest_revision():
- revision.delete()
- deleted_revisions_count += 1
+ try:
+ revision.delete()
+ deleted_revisions_count += 1
+ except ProtectedError:
+ protected_error_count += 1
- return deleted_revisions_count
+ return deleted_revisions_count, protected_error_count
| {"golden_diff": "diff --git a/wagtail/management/commands/purge_revisions.py b/wagtail/management/commands/purge_revisions.py\n--- a/wagtail/management/commands/purge_revisions.py\n+++ b/wagtail/management/commands/purge_revisions.py\n@@ -1,6 +1,7 @@\n from django.conf import settings\n from django.core.management.base import BaseCommand\n from django.db.models import Q\n+from django.db.models.deletion import ProtectedError\n from django.utils import timezone\n \n from wagtail.models import Revision, WorkflowState\n@@ -31,7 +32,9 @@\n pages = options.get(\"pages\")\n non_pages = options.get(\"non_pages\")\n \n- revisions_deleted = purge_revisions(days=days, pages=pages, non_pages=non_pages)\n+ revisions_deleted, protected_error_count = purge_revisions(\n+ days=days, pages=pages, non_pages=non_pages\n+ )\n \n if revisions_deleted:\n self.stdout.write(\n@@ -39,6 +42,12 @@\n \"Successfully deleted %s revisions\" % revisions_deleted\n )\n )\n+ self.stdout.write(\n+ self.style.SUCCESS(\n+ \"Ignored %s revisions because one or more protected relations exist that prevent deletion.\"\n+ % protected_error_count\n+ )\n+ )\n else:\n self.stdout.write(\"No revisions deleted\")\n \n@@ -74,11 +83,15 @@\n purgeable_revisions = purgeable_revisions.filter(created_at__lt=purgeable_until)\n \n deleted_revisions_count = 0\n+ protected_error_count = 0\n \n for revision in purgeable_revisions.iterator():\n # don't delete the latest revision\n if not revision.is_latest_revision():\n- revision.delete()\n- deleted_revisions_count += 1\n+ try:\n+ revision.delete()\n+ deleted_revisions_count += 1\n+ except ProtectedError:\n+ protected_error_count += 1\n \n- return deleted_revisions_count\n+ return deleted_revisions_count, protected_error_count\n", "issue": "Avoid purging Revisions in use by third-party packages\nI originally raised this as part of some rework on the `purge_revisions` management command in https://github.com/wagtail/wagtail/pull/10619#pullrequestreview-1528373346. We considered it out of scope to fix the issue in that PR since Wagtail has always worked like this. Hence I'm raising it again separately in this issue.\r\n\r\n### Is your proposal related to a problem?\r\n\r\nWagtail provides a `purge_revisions` management command to delete stale `Revision` objects from the database. This command is aware of revisions that are still in use by Wagtail's workflow feature and avoids deleting those revisions ([link to source](https://github.com/wagtail/wagtail/blob/3d0282573d0d23f9eabe2fb6427e58b01f79ffe0/wagtail/management/commands/purge_revisions.py#L61-L66)).\r\n\r\nHowever, it is somewhat common for third-party packages to rely on a specific revision. [wagtail-ab-testing](https://github.com/wagtail-nest/wagtail-ab-testing/blob/d04bd0032c4335a839c781851b12fe16037950de/wagtail_ab_testing/models.py#L76) is a real world example of this. It maintains a ForeignKey to a specific revision to track an AB experiment. If `purge_revisions` were to remove a revision in use by wagtail-ab-testing, experiment data loss would occur.\r\n\r\nData loss can be prevented by specifying the ForeignKey with `on_delete=models.PROTECT` but `purge_revisions` wouldn't be able to handle this and crash with a `ProtectedError`. 
This means third-party packages have no recourse: either they allow the revisions they depend on to be deleted or they mark their relations as protected and break Wagtails `purge_revisions` command.\r\n\r\n### Describe the solution you'd like\r\n\r\nI propose `purge_revisions` gracefully recovers from `ProtectedError` when trying to delete a revision. I also propose we document that relations to a `Revision` should be marked with `on_delete=models.PROTECT` if data loss is not desirable.\r\n\r\nI also feel like the `purge_revisions` command should report the number of revisions that it couldn't delete because they were protected.\r\n\r\n### Describe alternatives you've considered\r\n\r\nAn alternative mechanism I considered would be an API that allows packages to inform Wagtail which revisions they still use. The `purge_revisions` command can take this information into account and avoid revisions still in use.\r\n\r\nThis is a more complex solution with no apparent benefit over the solution described above.\nAvoid purging Revisions in use by third-party packages\nI originally raised this as part of some rework on the `purge_revisions` management command in https://github.com/wagtail/wagtail/pull/10619#pullrequestreview-1528373346. We considered it out of scope to fix the issue in that PR since Wagtail has always worked like this. Hence I'm raising it again separately in this issue.\r\n\r\n### Is your proposal related to a problem?\r\n\r\nWagtail provides a `purge_revisions` management command to delete stale `Revision` objects from the database. This command is aware of revisions that are still in use by Wagtail's workflow feature and avoids deleting those revisions ([link to source](https://github.com/wagtail/wagtail/blob/3d0282573d0d23f9eabe2fb6427e58b01f79ffe0/wagtail/management/commands/purge_revisions.py#L61-L66)).\r\n\r\nHowever, it is somewhat common for third-party packages to rely on a specific revision. [wagtail-ab-testing](https://github.com/wagtail-nest/wagtail-ab-testing/blob/d04bd0032c4335a839c781851b12fe16037950de/wagtail_ab_testing/models.py#L76) is a real world example of this. It maintains a ForeignKey to a specific revision to track an AB experiment. If `purge_revisions` were to remove a revision in use by wagtail-ab-testing, experiment data loss would occur.\r\n\r\nData loss can be prevented by specifying the ForeignKey with `on_delete=models.PROTECT` but `purge_revisions` wouldn't be able to handle this and crash with a `ProtectedError`. This means third-party packages have no recourse: either they allow the revisions they depend on to be deleted or they mark their relations as protected and break Wagtails `purge_revisions` command.\r\n\r\n### Describe the solution you'd like\r\n\r\nI propose `purge_revisions` gracefully recovers from `ProtectedError` when trying to delete a revision. I also propose we document that relations to a `Revision` should be marked with `on_delete=models.PROTECT` if data loss is not desirable.\r\n\r\nI also feel like the `purge_revisions` command should report the number of revisions that it couldn't delete because they were protected.\r\n\r\n### Describe alternatives you've considered\r\n\r\nAn alternative mechanism I considered would be an API that allows packages to inform Wagtail which revisions they still use. 
The `purge_revisions` command can take this information into account and avoid revisions still in use.\r\n\r\nThis is a more complex solution with no apparent benefit over the solution described above.\n", "before_files": [{"content": "from django.conf import settings\nfrom django.core.management.base import BaseCommand\nfrom django.db.models import Q\nfrom django.utils import timezone\n\nfrom wagtail.models import Revision, WorkflowState\n\n\nclass Command(BaseCommand):\n help = \"Delete revisions which are not the latest revision, published or scheduled to be published, or in moderation\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n \"--days\",\n type=int,\n help=\"Only delete revisions older than this number of days\",\n )\n parser.add_argument(\n \"--pages\",\n action=\"store_true\",\n help=\"Only delete revisions of page models\",\n )\n parser.add_argument(\n \"--non-pages\",\n action=\"store_true\",\n help=\"Only delete revisions of non-page models\",\n )\n\n def handle(self, *args, **options):\n days = options.get(\"days\")\n pages = options.get(\"pages\")\n non_pages = options.get(\"non_pages\")\n\n revisions_deleted = purge_revisions(days=days, pages=pages, non_pages=non_pages)\n\n if revisions_deleted:\n self.stdout.write(\n self.style.SUCCESS(\n \"Successfully deleted %s revisions\" % revisions_deleted\n )\n )\n else:\n self.stdout.write(\"No revisions deleted\")\n\n\ndef purge_revisions(days=None, pages=True, non_pages=True):\n if pages == non_pages:\n # If both are True or both are False, purge revisions of pages and non-pages\n objects = Revision.objects.all()\n elif pages:\n objects = Revision.objects.page_revisions()\n elif non_pages:\n objects = Revision.objects.not_page_revisions()\n\n # exclude revisions which have been submitted for moderation in the old system\n # RemovedInWagtail60Warning\n # Remove this when the deprecation period for the legacy\n # moderation system ends.\n purgeable_revisions = objects.exclude(submitted_for_moderation=True).exclude(\n # and exclude revisions with an approved_go_live_at date\n approved_go_live_at__isnull=False\n )\n\n if getattr(settings, \"WAGTAIL_WORKFLOW_ENABLED\", True):\n purgeable_revisions = purgeable_revisions.exclude(\n # and exclude revisions linked to an in progress or needs changes workflow state\n Q(task_states__workflow_state__status=WorkflowState.STATUS_IN_PROGRESS)\n | Q(task_states__workflow_state__status=WorkflowState.STATUS_NEEDS_CHANGES)\n )\n\n if days:\n purgeable_until = timezone.now() - timezone.timedelta(days=days)\n # only include revisions which were created before the cut off date\n purgeable_revisions = purgeable_revisions.filter(created_at__lt=purgeable_until)\n\n deleted_revisions_count = 0\n\n for revision in purgeable_revisions.iterator():\n # don't delete the latest revision\n if not revision.is_latest_revision():\n revision.delete()\n deleted_revisions_count += 1\n\n return deleted_revisions_count\n", "path": "wagtail/management/commands/purge_revisions.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.core.management.base import BaseCommand\nfrom django.db.models import Q\nfrom django.db.models.deletion import ProtectedError\nfrom django.utils import timezone\n\nfrom wagtail.models import Revision, WorkflowState\n\n\nclass Command(BaseCommand):\n help = \"Delete revisions which are not the latest revision, published or scheduled to be published, or in moderation\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n \"--days\",\n 
type=int,\n help=\"Only delete revisions older than this number of days\",\n )\n parser.add_argument(\n \"--pages\",\n action=\"store_true\",\n help=\"Only delete revisions of page models\",\n )\n parser.add_argument(\n \"--non-pages\",\n action=\"store_true\",\n help=\"Only delete revisions of non-page models\",\n )\n\n def handle(self, *args, **options):\n days = options.get(\"days\")\n pages = options.get(\"pages\")\n non_pages = options.get(\"non_pages\")\n\n revisions_deleted, protected_error_count = purge_revisions(\n days=days, pages=pages, non_pages=non_pages\n )\n\n if revisions_deleted:\n self.stdout.write(\n self.style.SUCCESS(\n \"Successfully deleted %s revisions\" % revisions_deleted\n )\n )\n self.stdout.write(\n self.style.SUCCESS(\n \"Ignored %s revisions because one or more protected relations exist that prevent deletion.\"\n % protected_error_count\n )\n )\n else:\n self.stdout.write(\"No revisions deleted\")\n\n\ndef purge_revisions(days=None, pages=True, non_pages=True):\n if pages == non_pages:\n # If both are True or both are False, purge revisions of pages and non-pages\n objects = Revision.objects.all()\n elif pages:\n objects = Revision.objects.page_revisions()\n elif non_pages:\n objects = Revision.objects.not_page_revisions()\n\n # exclude revisions which have been submitted for moderation in the old system\n # RemovedInWagtail60Warning\n # Remove this when the deprecation period for the legacy\n # moderation system ends.\n purgeable_revisions = objects.exclude(submitted_for_moderation=True).exclude(\n # and exclude revisions with an approved_go_live_at date\n approved_go_live_at__isnull=False\n )\n\n if getattr(settings, \"WAGTAIL_WORKFLOW_ENABLED\", True):\n purgeable_revisions = purgeable_revisions.exclude(\n # and exclude revisions linked to an in progress or needs changes workflow state\n Q(task_states__workflow_state__status=WorkflowState.STATUS_IN_PROGRESS)\n | Q(task_states__workflow_state__status=WorkflowState.STATUS_NEEDS_CHANGES)\n )\n\n if days:\n purgeable_until = timezone.now() - timezone.timedelta(days=days)\n # only include revisions which were created before the cut off date\n purgeable_revisions = purgeable_revisions.filter(created_at__lt=purgeable_until)\n\n deleted_revisions_count = 0\n protected_error_count = 0\n\n for revision in purgeable_revisions.iterator():\n # don't delete the latest revision\n if not revision.is_latest_revision():\n try:\n revision.delete()\n deleted_revisions_count += 1\n except ProtectedError:\n protected_error_count += 1\n\n return deleted_revisions_count, protected_error_count\n", "path": "wagtail/management/commands/purge_revisions.py"}]} | 2,250 | 450 |
gh_patches_debug_10241 | rasdani/github-patches | git_diff | rootpy__rootpy-748 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error when using root_open: 'TDirectory' object has no attribute 'func'
As above:
`AttributeError: 'TDirectory' object has no attribute 'func'`
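
A minimal reproduction sketch, assuming one of the affected ROOT 6 builds and any readable ROOT file (the file name is only a placeholder):

```python
from rootpy.io import root_open

with root_open('example.root') as f:  # placeholder file name
    pass  # fails with: AttributeError: 'TDirectory' object has no attribute 'func'
```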
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rootpy/ROOT.py`
Content:
```
1 # Copyright 2012 the rootpy developers
2 # distributed under the terms of the GNU General Public License
3 """
4 :py:mod:`rootpy.ROOT`
5 =====================
6
7 This module is intended to be a drop-in replacement for ordinary
8 PyROOT imports by mimicking PyROOT's interface. If you find a case where it is
9 not, please report an issue to the rootpy developers.
10
11 Both ROOT and rootpy classes can be accessed in a harmonized way through this
12 module. This means you can take advantage of rootpy classes automatically by
13 replacing ``import ROOT`` with ``import rootpy.ROOT as ROOT`` or
14 ``from rootpy import ROOT`` in your code, while maintaining backward
15 compatibility with existing use of ROOT's classes.
16
17 ROOT classes are automatically "asrootpy'd" *after* the constructor in ROOT has
18 been called:
19
20 .. sourcecode:: python
21
22 >>> import rootpy.ROOT as ROOT
23 >>> h = ROOT.TH1F('name', 'title', 10, 0, 1)
24 >>> h
25 Hist('name')
26 >>> h.TYPE
27 'F'
28
29 Also access rootpy classes under this same module without needing to remember
30 where to import them from in rootpy:
31
32 .. sourcecode:: python
33
34 >>> import rootpy.ROOT as ROOT
35 >>> h = ROOT.Hist(10, 0, 1, name='name', type='F')
36 >>> h
37 Hist('name')
38 >>> h.TYPE
39 'F'
40
41 Plain old ROOT can still be accessed through the ``R`` property:
42
43 .. sourcecode:: python
44
45 >>> from rootpy import ROOT
46 >>> ROOT.R.TFile
47 <class 'ROOT.TFile'>
48
49 """
50 from __future__ import absolute_import
51
52 from copy import copy
53
54 import ROOT
55
56 from . import asrootpy, lookup_rootpy, ROOT_VERSION
57 from . import QROOT, stl
58 from .utils.module_facade import Facade
59
60 __all__ = []
61
62
63 def proxy_global(name, no_expand_macro=False):
64 """
65 Used to automatically asrootpy ROOT's thread local variables
66 """
67 if no_expand_macro: # pragma: no cover
68 # handle older ROOT versions without _ExpandMacroFunction wrapping
69 @property
70 def gSomething_no_func(self):
71 glob = self(getattr(ROOT, name))
72 # create a fake func() that just returns self
73 def func():
74 return glob
75 glob.func = func
76 return glob
77 return gSomething_no_func
78
79 @property
80 def gSomething(self):
81 glob = getattr(ROOT, name)
82 orig_func = glob.func
83
84 def asrootpy_izing_func():
85 return self(orig_func())
86
87 # new_glob = copy(glob)
88 new_glob = glob.__class__.__new__(glob.__class__)
89 new_glob.func = asrootpy_izing_func
90 # Memoize
91 setattr(type(self), name, new_glob)
92 return new_glob
93 return gSomething
94
95
96 @Facade(__name__, expose_internal=False)
97 class Module(object):
98
99 __version__ = ROOT_VERSION
100
101 def __call__(self, arg, after_init=False):
102 return asrootpy(arg, warn=False, after_init=after_init)
103
104 def __getattr__(self, what):
105 try:
106 # check ROOT
107 result = self(getattr(ROOT, what), after_init=True)
108 except AttributeError:
109 # check rootpy
110 result = lookup_rootpy(what)
111 if result is None:
112 raise AttributeError(
113 'ROOT does not have the attribute `{0}` '
114 'and rootpy does not contain the class `{0}`'.format(what))
115 return result
116
117 try:
118 # Memoize
119 setattr(self, what, result)
120 except AttributeError:
121 # Oops... Oh well. I tried.
122 pass
123
124 return result
125
126 @property
127 def R(self):
128 return ROOT
129
130 gPad = proxy_global("gPad")
131 gVirtualX = proxy_global("gVirtualX")
132
133 if ROOT_VERSION < (5, 32, 0): # pragma: no cover
134 # handle versions of ROOT older than 5.32.00
135 gDirectory = proxy_global("gDirectory", no_expand_macro=True)
136 gFile = proxy_global("gFile", no_expand_macro=True)
137 gInterpreter = proxy_global("gInterpreter", no_expand_macro=True)
138 else:
139 gDirectory = proxy_global("gDirectory")
140 gFile = proxy_global("gFile")
141 gInterpreter = proxy_global("gInterpreter")
142
143 # use the smart template STL types from rootpy.stl instead
144 for t in QROOT.std.stlclasses:
145 locals()[t] = getattr(stl, t)
146 del t
147
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/rootpy/ROOT.py b/rootpy/ROOT.py
--- a/rootpy/ROOT.py
+++ b/rootpy/ROOT.py
@@ -130,8 +130,7 @@
gPad = proxy_global("gPad")
gVirtualX = proxy_global("gVirtualX")
- if ROOT_VERSION < (5, 32, 0): # pragma: no cover
- # handle versions of ROOT older than 5.32.00
+ if ROOT_VERSION < (5, 32, 0) or ROOT_VERSION >= (6, 9, 2): # pragma: no cover
gDirectory = proxy_global("gDirectory", no_expand_macro=True)
gFile = proxy_global("gFile", no_expand_macro=True)
gInterpreter = proxy_global("gInterpreter", no_expand_macro=True)
| {"golden_diff": "diff --git a/rootpy/ROOT.py b/rootpy/ROOT.py\n--- a/rootpy/ROOT.py\n+++ b/rootpy/ROOT.py\n@@ -130,8 +130,7 @@\n gPad = proxy_global(\"gPad\")\n gVirtualX = proxy_global(\"gVirtualX\")\n \n- if ROOT_VERSION < (5, 32, 0): # pragma: no cover\n- # handle versions of ROOT older than 5.32.00\n+ if ROOT_VERSION < (5, 32, 0) or ROOT_VERSION >= (6, 9, 2): # pragma: no cover\n gDirectory = proxy_global(\"gDirectory\", no_expand_macro=True)\n gFile = proxy_global(\"gFile\", no_expand_macro=True)\n gInterpreter = proxy_global(\"gInterpreter\", no_expand_macro=True)\n", "issue": "Error when using root_open: 'TDirectory' object has no attribute 'func'\nAs above:\r\n\r\n`AttributeError: 'TDirectory' object has no attribute 'func'`\n", "before_files": [{"content": "# Copyright 2012 the rootpy developers\n# distributed under the terms of the GNU General Public License\n\"\"\"\n:py:mod:`rootpy.ROOT`\n=====================\n\nThis module is intended to be a drop-in replacement for ordinary\nPyROOT imports by mimicking PyROOT's interface. If you find a case where it is\nnot, please report an issue to the rootpy developers.\n\nBoth ROOT and rootpy classes can be accessed in a harmonized way through this\nmodule. This means you can take advantage of rootpy classes automatically by\nreplacing ``import ROOT`` with ``import rootpy.ROOT as ROOT`` or\n``from rootpy import ROOT`` in your code, while maintaining backward\ncompatibility with existing use of ROOT's classes.\n\nROOT classes are automatically \"asrootpy'd\" *after* the constructor in ROOT has\nbeen called:\n\n.. sourcecode:: python\n\n >>> import rootpy.ROOT as ROOT\n >>> h = ROOT.TH1F('name', 'title', 10, 0, 1)\n >>> h\n Hist('name')\n >>> h.TYPE\n 'F'\n\nAlso access rootpy classes under this same module without needing to remember\nwhere to import them from in rootpy:\n\n.. sourcecode:: python\n\n >>> import rootpy.ROOT as ROOT\n >>> h = ROOT.Hist(10, 0, 1, name='name', type='F')\n >>> h\n Hist('name')\n >>> h.TYPE\n 'F'\n\nPlain old ROOT can still be accessed through the ``R`` property:\n\n.. sourcecode:: python\n\n >>> from rootpy import ROOT\n >>> ROOT.R.TFile\n <class 'ROOT.TFile'>\n\n\"\"\"\nfrom __future__ import absolute_import\n\nfrom copy import copy\n\nimport ROOT\n\nfrom . import asrootpy, lookup_rootpy, ROOT_VERSION\nfrom . 
import QROOT, stl\nfrom .utils.module_facade import Facade\n\n__all__ = []\n\n\ndef proxy_global(name, no_expand_macro=False):\n \"\"\"\n Used to automatically asrootpy ROOT's thread local variables\n \"\"\"\n if no_expand_macro: # pragma: no cover\n # handle older ROOT versions without _ExpandMacroFunction wrapping\n @property\n def gSomething_no_func(self):\n glob = self(getattr(ROOT, name))\n # create a fake func() that just returns self\n def func():\n return glob\n glob.func = func\n return glob\n return gSomething_no_func\n\n @property\n def gSomething(self):\n glob = getattr(ROOT, name)\n orig_func = glob.func\n\n def asrootpy_izing_func():\n return self(orig_func())\n\n # new_glob = copy(glob)\n new_glob = glob.__class__.__new__(glob.__class__)\n new_glob.func = asrootpy_izing_func\n # Memoize\n setattr(type(self), name, new_glob)\n return new_glob\n return gSomething\n\n\n@Facade(__name__, expose_internal=False)\nclass Module(object):\n\n __version__ = ROOT_VERSION\n\n def __call__(self, arg, after_init=False):\n return asrootpy(arg, warn=False, after_init=after_init)\n\n def __getattr__(self, what):\n try:\n # check ROOT\n result = self(getattr(ROOT, what), after_init=True)\n except AttributeError:\n # check rootpy\n result = lookup_rootpy(what)\n if result is None:\n raise AttributeError(\n 'ROOT does not have the attribute `{0}` '\n 'and rootpy does not contain the class `{0}`'.format(what))\n return result\n\n try:\n # Memoize\n setattr(self, what, result)\n except AttributeError:\n # Oops... Oh well. I tried.\n pass\n\n return result\n\n @property\n def R(self):\n return ROOT\n\n gPad = proxy_global(\"gPad\")\n gVirtualX = proxy_global(\"gVirtualX\")\n\n if ROOT_VERSION < (5, 32, 0): # pragma: no cover\n # handle versions of ROOT older than 5.32.00\n gDirectory = proxy_global(\"gDirectory\", no_expand_macro=True)\n gFile = proxy_global(\"gFile\", no_expand_macro=True)\n gInterpreter = proxy_global(\"gInterpreter\", no_expand_macro=True)\n else:\n gDirectory = proxy_global(\"gDirectory\")\n gFile = proxy_global(\"gFile\")\n gInterpreter = proxy_global(\"gInterpreter\")\n\n # use the smart template STL types from rootpy.stl instead\n for t in QROOT.std.stlclasses:\n locals()[t] = getattr(stl, t)\n del t\n", "path": "rootpy/ROOT.py"}], "after_files": [{"content": "# Copyright 2012 the rootpy developers\n# distributed under the terms of the GNU General Public License\n\"\"\"\n:py:mod:`rootpy.ROOT`\n=====================\n\nThis module is intended to be a drop-in replacement for ordinary\nPyROOT imports by mimicking PyROOT's interface. If you find a case where it is\nnot, please report an issue to the rootpy developers.\n\nBoth ROOT and rootpy classes can be accessed in a harmonized way through this\nmodule. This means you can take advantage of rootpy classes automatically by\nreplacing ``import ROOT`` with ``import rootpy.ROOT as ROOT`` or\n``from rootpy import ROOT`` in your code, while maintaining backward\ncompatibility with existing use of ROOT's classes.\n\nROOT classes are automatically \"asrootpy'd\" *after* the constructor in ROOT has\nbeen called:\n\n.. sourcecode:: python\n\n >>> import rootpy.ROOT as ROOT\n >>> h = ROOT.TH1F('name', 'title', 10, 0, 1)\n >>> h\n Hist('name')\n >>> h.TYPE\n 'F'\n\nAlso access rootpy classes under this same module without needing to remember\nwhere to import them from in rootpy:\n\n.. 
sourcecode:: python\n\n >>> import rootpy.ROOT as ROOT\n >>> h = ROOT.Hist(10, 0, 1, name='name', type='F')\n >>> h\n Hist('name')\n >>> h.TYPE\n 'F'\n\nPlain old ROOT can still be accessed through the ``R`` property:\n\n.. sourcecode:: python\n\n >>> from rootpy import ROOT\n >>> ROOT.R.TFile\n <class 'ROOT.TFile'>\n\n\"\"\"\nfrom __future__ import absolute_import\n\nfrom copy import copy\n\nimport ROOT\n\nfrom . import asrootpy, lookup_rootpy, ROOT_VERSION\nfrom . import QROOT, stl\nfrom .utils.module_facade import Facade\n\n__all__ = []\n\n\ndef proxy_global(name, no_expand_macro=False):\n \"\"\"\n Used to automatically asrootpy ROOT's thread local variables\n \"\"\"\n if no_expand_macro: # pragma: no cover\n # handle older ROOT versions without _ExpandMacroFunction wrapping\n @property\n def gSomething_no_func(self):\n glob = self(getattr(ROOT, name))\n # create a fake func() that just returns self\n def func():\n return glob\n glob.func = func\n return glob\n return gSomething_no_func\n\n @property\n def gSomething(self):\n glob = getattr(ROOT, name)\n orig_func = glob.func\n\n def asrootpy_izing_func():\n return self(orig_func())\n\n # new_glob = copy(glob)\n new_glob = glob.__class__.__new__(glob.__class__)\n new_glob.func = asrootpy_izing_func\n # Memoize\n setattr(type(self), name, new_glob)\n return new_glob\n return gSomething\n\n\n@Facade(__name__, expose_internal=False)\nclass Module(object):\n\n __version__ = ROOT_VERSION\n\n def __call__(self, arg, after_init=False):\n return asrootpy(arg, warn=False, after_init=after_init)\n\n def __getattr__(self, what):\n try:\n # check ROOT\n result = self(getattr(ROOT, what), after_init=True)\n except AttributeError:\n # check rootpy\n result = lookup_rootpy(what)\n if result is None:\n raise AttributeError(\n 'ROOT does not have the attribute `{0}` '\n 'and rootpy does not contain the class `{0}`'.format(what))\n return result\n\n try:\n # Memoize\n setattr(self, what, result)\n except AttributeError:\n # Oops... Oh well. I tried.\n pass\n\n return result\n\n @property\n def R(self):\n return ROOT\n\n gPad = proxy_global(\"gPad\")\n gVirtualX = proxy_global(\"gVirtualX\")\n\n if ROOT_VERSION < (5, 32, 0) or ROOT_VERSION >= (6, 9, 2): # pragma: no cover\n gDirectory = proxy_global(\"gDirectory\", no_expand_macro=True)\n gFile = proxy_global(\"gFile\", no_expand_macro=True)\n gInterpreter = proxy_global(\"gInterpreter\", no_expand_macro=True)\n else:\n gDirectory = proxy_global(\"gDirectory\")\n gFile = proxy_global(\"gFile\")\n gInterpreter = proxy_global(\"gInterpreter\")\n\n # use the smart template STL types from rootpy.stl instead\n for t in QROOT.std.stlclasses:\n locals()[t] = getattr(stl, t)\n del t\n", "path": "rootpy/ROOT.py"}]} | 1,662 | 191 |
gh_patches_debug_22709 | rasdani/github-patches | git_diff | sopel-irc__sopel-2494 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Root module description is a mini-rant about LC_ALL rather than a description of the library
### Description
Looking at the `sopel` module with `pydoc` (or `help()` in an interactive prompt) exposes the user to [a short rant](https://github.com/sopel-irc/sopel/blob/c26914b68913bc25bdd1f5fed9c5942a87fdfee6/sopel/__init__.py#L1-L4) about the behavior of `LC_ALL` and instructions to use only ASCII in this module.
I'm sympathetic to the frustration over #984 that led to this, but it would be an improvement to add a docstring to the module with a short description.
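
For example, even a minimal module docstring along these lines (the wording here is only a suggestion) would already read better in `pydoc`:

```python
"""Sopel is a simple, easy-to-use, open-source IRC utility bot, written in Python."""
```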
### Reproduction steps
Run `python3 -m pydoc sopel` or `import sopel; help(sopel)` in an interactive prompt.
### Expected behavior
The user should see a short description of Sopel
### Relevant logs
_No response_
### Notes
_No response_
### Sopel version
c26914b
### Installation method
`pip install`
### Python version
_No response_
### Operating system
_No response_
### IRCd
_No response_
### Relevant plugins
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/__init__.py`
Content:
```
1 # ASCII ONLY IN THIS FILE THOUGH!!!!!!!
2 # Python does some stupid bullshit of respecting LC_ALL over the encoding on the
3 # file, so in order to undo Python's ridiculous fucking idiocy, we have to have
4 # our own check.
5
6 # Copyright 2008, Sean B. Palmer, inamidst.com
7 # Copyright 2012, Elsie Powell, http://embolalia.com
8 # Copyright 2012, Elad Alfassa <[email protected]>
9 #
10 # Licensed under the Eiffel Forum License 2.
11
12 from __future__ import annotations
13
14 from collections import namedtuple
15 import locale
16 import re
17 import sys
18
19 # TODO: replace with stdlib importlib.metadata when dropping py3.7
20 # version info used in this module works from py3.8+
21 import importlib_metadata
22
23 __all__ = [
24 'bot',
25 'config',
26 'db',
27 'formatting',
28 'irc',
29 'loader',
30 'logger',
31 'module', # deprecated in 7.1, removed in 9.0
32 'plugin',
33 'tools',
34 'trigger',
35 'version_info',
36 ]
37
38 loc = locale.getlocale()
39 if not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):
40 print('WARNING!!! You are running with a non-UTF8 locale environment '
41 'variable (e.g. LC_ALL is set to "C"), which makes Python 3 do '
42 'stupid things. If you get strange errors, please set it to '
43 'something like "en_US.UTF-8".', file=sys.stderr)
44
45
46 __version__ = importlib_metadata.version('sopel')
47
48
49 def _version_info(version=__version__):
50 regex = re.compile(r'(\d+)\.(\d+)\.(\d+)(?:[\-\.]?(a|b|rc)(\d+))?.*')
51 version_match = regex.match(version)
52
53 if version_match is None:
54 raise RuntimeError("Can't parse version number!")
55
56 version_groups = version_match.groups()
57 major, minor, micro = (int(piece) for piece in version_groups[0:3])
58 level = version_groups[3]
59 serial = int(version_groups[4] or 0)
60 if level == 'a':
61 level = 'alpha'
62 elif level == 'b':
63 level = 'beta'
64 elif level == 'rc':
65 level = 'candidate'
66 elif not level and version_groups[4] is None:
67 level = 'final'
68 else:
69 level = 'alpha'
70
71 VersionInfo = namedtuple('VersionInfo',
72 'major, minor, micro, releaselevel, serial')
73 return VersionInfo(major, minor, micro, level, serial)
74
75
76 version_info = _version_info()
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sopel/__init__.py b/sopel/__init__.py
--- a/sopel/__init__.py
+++ b/sopel/__init__.py
@@ -1,8 +1,9 @@
-# ASCII ONLY IN THIS FILE THOUGH!!!!!!!
-# Python does some stupid bullshit of respecting LC_ALL over the encoding on the
-# file, so in order to undo Python's ridiculous fucking idiocy, we have to have
-# our own check.
+"""
+Sopel is a simple, easy-to-use, open-source IRC utility bot, written in Python.
+It’s designed to be easy to use, easy to run, and easy to extend.
+"""
+#
# Copyright 2008, Sean B. Palmer, inamidst.com
# Copyright 2012, Elsie Powell, http://embolalia.com
# Copyright 2012, Elad Alfassa <[email protected]>
@@ -37,9 +38,8 @@
loc = locale.getlocale()
if not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):
- print('WARNING!!! You are running with a non-UTF8 locale environment '
- 'variable (e.g. LC_ALL is set to "C"), which makes Python 3 do '
- 'stupid things. If you get strange errors, please set it to '
+ print('Warning: Running with a non-UTF8 locale. If you see strange '
+ 'encoding errors, try setting the LC_ALL environment variable to '
'something like "en_US.UTF-8".', file=sys.stderr)
| {"golden_diff": "diff --git a/sopel/__init__.py b/sopel/__init__.py\n--- a/sopel/__init__.py\n+++ b/sopel/__init__.py\n@@ -1,8 +1,9 @@\n-# ASCII ONLY IN THIS FILE THOUGH!!!!!!!\n-# Python does some stupid bullshit of respecting LC_ALL over the encoding on the\n-# file, so in order to undo Python's ridiculous fucking idiocy, we have to have\n-# our own check.\n+\"\"\"\n+Sopel is a simple, easy-to-use, open-source IRC utility bot, written in Python.\n \n+It\u2019s designed to be easy to use, easy to run, and easy to extend.\n+\"\"\"\n+#\n # Copyright 2008, Sean B. Palmer, inamidst.com\n # Copyright 2012, Elsie Powell, http://embolalia.com\n # Copyright 2012, Elad Alfassa <[email protected]>\n@@ -37,9 +38,8 @@\n \n loc = locale.getlocale()\n if not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):\n- print('WARNING!!! You are running with a non-UTF8 locale environment '\n- 'variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do '\n- 'stupid things. If you get strange errors, please set it to '\n+ print('Warning: Running with a non-UTF8 locale. If you see strange '\n+ 'encoding errors, try setting the LC_ALL environment variable to '\n 'something like \"en_US.UTF-8\".', file=sys.stderr)\n", "issue": "Root module description is a mini-rant about LC_ALL rather than a description of the library\n### Description\n\nLooking at the `sopel` module with `pydoc` in an interactive prompt) exposes the user to [a short rant](https://github.com/sopel-irc/sopel/blob/c26914b68913bc25bdd1f5fed9c5942a87fdfee6/sopel/__init__.py#L1-L4) about the behavior of `LC_ALL` and instructions to use only ASCII in this module.\r\n\r\nI'm sympathetic to the frustration over #984 that led to this, but it will be an improvement to add a docstring to the module with a short description.\n\n### Reproduction steps\n\nRun `python3 -m pydoc sopel` or `import sopel; help(sopel)` in an interactive prompt.\n\n### Expected behavior\n\nThe user should see a short description of Sopel\n\n### Relevant logs\n\n_No response_\n\n### Notes\n\n_No response_\n\n### Sopel version\n\nc26914b\n\n### Installation method\n\n`pip install`\n\n### Python version\n\n_No response_\n\n### Operating system\n\n_No response_\n\n### IRCd\n\n_No response_\n\n### Relevant plugins\n\n_No response_\n", "before_files": [{"content": "# ASCII ONLY IN THIS FILE THOUGH!!!!!!!\n# Python does some stupid bullshit of respecting LC_ALL over the encoding on the\n# file, so in order to undo Python's ridiculous fucking idiocy, we have to have\n# our own check.\n\n# Copyright 2008, Sean B. Palmer, inamidst.com\n# Copyright 2012, Elsie Powell, http://embolalia.com\n# Copyright 2012, Elad Alfassa <[email protected]>\n#\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import annotations\n\nfrom collections import namedtuple\nimport locale\nimport re\nimport sys\n\n# TODO: replace with stdlib importlib.metadata when dropping py3.7\n# version info used in this module works from py3.8+\nimport importlib_metadata\n\n__all__ = [\n 'bot',\n 'config',\n 'db',\n 'formatting',\n 'irc',\n 'loader',\n 'logger',\n 'module', # deprecated in 7.1, removed in 9.0\n 'plugin',\n 'tools',\n 'trigger',\n 'version_info',\n]\n\nloc = locale.getlocale()\nif not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):\n print('WARNING!!! You are running with a non-UTF8 locale environment '\n 'variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do '\n 'stupid things. 
If you get strange errors, please set it to '\n 'something like \"en_US.UTF-8\".', file=sys.stderr)\n\n\n__version__ = importlib_metadata.version('sopel')\n\n\ndef _version_info(version=__version__):\n regex = re.compile(r'(\\d+)\\.(\\d+)\\.(\\d+)(?:[\\-\\.]?(a|b|rc)(\\d+))?.*')\n version_match = regex.match(version)\n\n if version_match is None:\n raise RuntimeError(\"Can't parse version number!\")\n\n version_groups = version_match.groups()\n major, minor, micro = (int(piece) for piece in version_groups[0:3])\n level = version_groups[3]\n serial = int(version_groups[4] or 0)\n if level == 'a':\n level = 'alpha'\n elif level == 'b':\n level = 'beta'\n elif level == 'rc':\n level = 'candidate'\n elif not level and version_groups[4] is None:\n level = 'final'\n else:\n level = 'alpha'\n\n VersionInfo = namedtuple('VersionInfo',\n 'major, minor, micro, releaselevel, serial')\n return VersionInfo(major, minor, micro, level, serial)\n\n\nversion_info = _version_info()\n", "path": "sopel/__init__.py"}], "after_files": [{"content": "\"\"\"\nSopel is a simple, easy-to-use, open-source IRC utility bot, written in Python.\n\nIt\u2019s designed to be easy to use, easy to run, and easy to extend.\n\"\"\"\n#\n# Copyright 2008, Sean B. Palmer, inamidst.com\n# Copyright 2012, Elsie Powell, http://embolalia.com\n# Copyright 2012, Elad Alfassa <[email protected]>\n#\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import annotations\n\nfrom collections import namedtuple\nimport locale\nimport re\nimport sys\n\n# TODO: replace with stdlib importlib.metadata when dropping py3.7\n# version info used in this module works from py3.8+\nimport importlib_metadata\n\n__all__ = [\n 'bot',\n 'config',\n 'db',\n 'formatting',\n 'irc',\n 'loader',\n 'logger',\n 'module', # deprecated in 7.1, removed in 9.0\n 'plugin',\n 'tools',\n 'trigger',\n 'version_info',\n]\n\nloc = locale.getlocale()\nif not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):\n print('Warning: Running with a non-UTF8 locale. If you see strange '\n 'encoding errors, try setting the LC_ALL environment variable to '\n 'something like \"en_US.UTF-8\".', file=sys.stderr)\n\n\n__version__ = importlib_metadata.version('sopel')\n\n\ndef _version_info(version=__version__):\n regex = re.compile(r'(\\d+)\\.(\\d+)\\.(\\d+)(?:[\\-\\.]?(a|b|rc)(\\d+))?.*')\n version_match = regex.match(version)\n\n if version_match is None:\n raise RuntimeError(\"Can't parse version number!\")\n\n version_groups = version_match.groups()\n major, minor, micro = (int(piece) for piece in version_groups[0:3])\n level = version_groups[3]\n serial = int(version_groups[4] or 0)\n if level == 'a':\n level = 'alpha'\n elif level == 'b':\n level = 'beta'\n elif level == 'rc':\n level = 'candidate'\n elif not level and version_groups[4] is None:\n level = 'final'\n else:\n level = 'alpha'\n\n VersionInfo = namedtuple('VersionInfo',\n 'major, minor, micro, releaselevel, serial')\n return VersionInfo(major, minor, micro, level, serial)\n\n\nversion_info = _version_info()\n", "path": "sopel/__init__.py"}]} | 1,299 | 370 |
gh_patches_debug_30963 | rasdani/github-patches | git_diff | bridgecrewio__checkov-5638 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CKV_AZURE_226: error in check and testcase
**Describe the issue**
CKV_AZURE_226 checks for ephemeral disks within the "main resource" azurerm_kubernetes_cluster but the cluster itself doesn't have any argument called os_disk_type. The argument os_disk_type is part of the node pool.
The test case [here](https://github.com/bridgecrewio/checkov/pull/5584/files#diff-c0b8f08537766f6eff2a5d10b9439d227fdaaebe6ff7903008825c5f9d51c22dR1) is misleading, and so is the check itself [here](https://github.com/bridgecrewio/checkov/pull/5584/files#diff-c9248390aa120f7af4643f1908d3d824fb903fd3c6cd63e9e77fe8e9ecd59289R28).
In my opinion this must be something like
```
    def get_inspected_key(self) -> str:
        return "default_node_pool/[0]/os_disk_type"
```
otherwise it won't work?
Same for CKV_AZURE_227.
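
A sketch of the analogous change for CKV_AZURE_227 could look something like the following (hypothetical, since that check also supports the standalone `azurerm_kubernetes_cluster_node_pool` resource, where `enable_host_encryption` stays top-level):

```python
    def get_inspected_key(self) -> str:
        # sketch: pick the key depending on which resource type is being checked
        if self.entity_type == "azurerm_kubernetes_cluster":
            return "default_node_pool/[0]/enable_host_encryption"
        return "enable_host_encryption"
```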
**Examples**
```
[root] # head -30 aks.tf
resource "azurerm_kubernetes_cluster" "this" {
name = local.name_prefix
location = var.resource_group.location
resource_group_name = var.resource_group.name
node_resource_group = "${local.name_prefix}-node-pool"
dns_prefix = local.name_prefix
kubernetes_version = local.kubernetes_version
sku_tier = var.sku_tier
api_server_access_profile {
authorized_ip_ranges = var.api_server_authorized_ip_ranges
}
default_node_pool {
name = "default"
enable_host_encryption = true
vm_size = "Standard_E4ads_v5"
os_disk_type = "Ephemeral"
zones = [1, 2, 3]
only_critical_addons_enabled = true
type = "VirtualMachineScaleSets"
vnet_subnet_id = var.subnet_id
enable_auto_scaling = true
max_count = 6
min_count = 2
orchestrator_version = local.kubernetes_version
upgrade_settings {
```
results in
```
[root] # checkov --skip-framework kubernetes --skip-framework helm --quiet --compact -o junitxml -o cli --directory .
2023-10-02 11:58:47,399 [MainThread ] [WARNI] The framework "sca_image" is part of the "SCA" module, which is not enabled in the platform
2023-10-02 11:58:47,399 [MainThread ] [WARNI] The framework "sca_package" is part of the "SCA" module, which is not enabled in the platform
terraform scan results:
Passed checks: 6, Failed checks: 11, Skipped checks: 0
[...]
Check: CKV_AZURE_226: "Ensure ephemeral disks are used for OS disks"
FAILED for resource: azurerm_kubernetes_cluster.this
File: /aks.tf:1-64
Check: CKV_AZURE_227: "Ensure that the AKS cluster encrypt temp disks, caches, and data flows between Compute and Storage resources"
FAILED for resource: azurerm_kubernetes_cluster.this
File: /aks.tf:1-64
[...]
```
Please also see https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/kubernetes_cluster for code example.
**Version (please complete the following information):**
- Checkov Version 2.4.58
**Additional context**
This is related to https://github.com/bridgecrewio/checkov/pull/5584 and https://github.com/bridgecrewio/checkov/pull/5588.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py`
Content:
```
1 from checkov.common.models.enums import CheckCategories
2 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
3 from typing import Any
4
5
6 class AKSEphemeralOSDisks(BaseResourceValueCheck):
7 def __init__(self) -> None:
8 """
9 Temporary data can contain sensitive data at some points, by using ephemeral disks,
10 we ensure that data written to OS disk is stored on local VM storage and isn't persisted to Azure Storage
11
12 Azure automatically replicates data stored in the managed OS disk of a virtual machine to Azure storage
13 to avoid data loss in case the virtual machine needs to be relocated to another host.
14 Generally speaking, containers are not designed to have local state persisted to the managed OS disk,
15 hence this behavior offers limited value to AKS hosted while providing some drawbacks,
16 including slower node provisioning and higher read/write latency.
17
18 Ephemeral disks allow us also to have faster cluster operations like scale or upgrade
19 due to faster re-imaging and boot times.
20 """
21 name = "Ensure ephemeral disks are used for OS disks"
22 id = "CKV_AZURE_226"
23 supported_resources = ("azurerm_kubernetes_cluster",)
24 categories = (CheckCategories.KUBERNETES,)
25 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
26
27 def get_inspected_key(self) -> str:
28 return "os_disk_type"
29
30 def get_expected_value(self) -> Any:
31 return "Ephemeral"
32
33
34 check = AKSEphemeralOSDisks()
35
```
Path: `checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py`
Content:
```
1
2 from checkov.common.models.enums import CheckCategories, CheckResult
3 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
4
5
6 class AKSEncryptionAtHostEnabled(BaseResourceValueCheck):
7 def __init__(self) -> None:
8 """
9 With host-based encryption, the data stored on the VM host of
10 your AKS agent nodes' VMs is encrypted at rest and flows encrypted to the Storage service.
11
12 This means the temp disks are encrypted at rest with platform-managed keys.
13 The cache of OS and data disks is encrypted at rest with either platform-managed keys
14 or customer-managed keys depending on the encryption type set on those disks.
15 """
16 name = "Ensure that the AKS cluster encrypt temp disks, caches, and data flows "
17 name += "between Compute and Storage resources"
18 id = "CKV_AZURE_227"
19 supported_resources = ("azurerm_kubernetes_cluster", "azurerm_kubernetes_cluster_node_pool")
20 categories = (CheckCategories.KUBERNETES,)
21 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
22 missing_block_result=CheckResult.FAILED)
23
24 def get_inspected_key(self) -> str:
25 return "enable_host_encryption"
26
27
28 check = AKSEncryptionAtHostEnabled()
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py b/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py
--- a/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py
+++ b/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py
@@ -1,4 +1,3 @@
-
from checkov.common.models.enums import CheckCategories, CheckResult
from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
@@ -18,11 +17,19 @@
id = "CKV_AZURE_227"
supported_resources = ("azurerm_kubernetes_cluster", "azurerm_kubernetes_cluster_node_pool")
categories = (CheckCategories.KUBERNETES,)
- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
- missing_block_result=CheckResult.FAILED)
+ super().__init__(
+ name=name,
+ id=id,
+ categories=categories,
+ supported_resources=supported_resources,
+ missing_block_result=CheckResult.FAILED,
+ )
def get_inspected_key(self) -> str:
- return "enable_host_encryption"
+ if self.entity_type == "azurerm_kubernetes_cluster":
+ return "default_node_pool/[0]/enable_host_encryption"
+ else:
+ return "enable_host_encryption"
check = AKSEncryptionAtHostEnabled()
diff --git a/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py b/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py
--- a/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py
+++ b/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py
@@ -25,7 +25,7 @@
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
def get_inspected_key(self) -> str:
- return "os_disk_type"
+ return "default_node_pool/[0]/os_disk_type"
def get_expected_value(self) -> Any:
return "Ephemeral"
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py b/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py\n--- a/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py\n+++ b/checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py\n@@ -1,4 +1,3 @@\n-\n from checkov.common.models.enums import CheckCategories, CheckResult\n from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n \n@@ -18,11 +17,19 @@\n id = \"CKV_AZURE_227\"\n supported_resources = (\"azurerm_kubernetes_cluster\", \"azurerm_kubernetes_cluster_node_pool\")\n categories = (CheckCategories.KUBERNETES,)\n- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n- missing_block_result=CheckResult.FAILED)\n+ super().__init__(\n+ name=name,\n+ id=id,\n+ categories=categories,\n+ supported_resources=supported_resources,\n+ missing_block_result=CheckResult.FAILED,\n+ )\n \n def get_inspected_key(self) -> str:\n- return \"enable_host_encryption\"\n+ if self.entity_type == \"azurerm_kubernetes_cluster\":\n+ return \"default_node_pool/[0]/enable_host_encryption\"\n+ else:\n+ return \"enable_host_encryption\"\n \n \n check = AKSEncryptionAtHostEnabled()\ndiff --git a/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py b/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py\n--- a/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py\n+++ b/checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py\n@@ -25,7 +25,7 @@\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n def get_inspected_key(self) -> str:\n- return \"os_disk_type\"\n+ return \"default_node_pool/[0]/os_disk_type\"\n \n def get_expected_value(self) -> Any:\n return \"Ephemeral\"\n", "issue": "CKV_AZURE_226: error in check and testcase\n**Describe the issue**\r\nCKV_AZURE_226 checks for ephemeral disks within the \"main resource\" azurerm_kubernetes_cluster but the cluster itself doesn't have any argument called os_disk_type. The argument os_disk_type is part of the node pool. \r\nThe testcase [here](https://github.com/bridgecrewio/checkov/pull/5584/files#diff-c0b8f08537766f6eff2a5d10b9439d227fdaaebe6ff7903008825c5f9d51c22dR1) is misleading and the check itself [here](https://github.com/bridgecrewio/checkov/pull/5584/files#diff-c9248390aa120f7af4643f1908d3d824fb903fd3c6cd63e9e77fe8e9ecd59289R28) too. 
\r\n\r\nIn my opinion this must be something like \r\n```\r\n def get_inspected_key(self) -> str:\r\n return \"default_node_pool/[0]/os_disk_type\"\r\n```\r\notherwise it won't work?\r\n\r\nSame for CKV_AZURE_227.\r\n\r\n**Examples**\r\n```\r\n[root] # head -30 aks.tf\r\nresource \"azurerm_kubernetes_cluster\" \"this\" {\r\n name = local.name_prefix\r\n location = var.resource_group.location\r\n resource_group_name = var.resource_group.name\r\n node_resource_group = \"${local.name_prefix}-node-pool\"\r\n dns_prefix = local.name_prefix\r\n kubernetes_version = local.kubernetes_version\r\n sku_tier = var.sku_tier\r\n\r\n api_server_access_profile {\r\n authorized_ip_ranges = var.api_server_authorized_ip_ranges\r\n }\r\n\r\n default_node_pool {\r\n name = \"default\"\r\n\r\n enable_host_encryption = true\r\n vm_size = \"Standard_E4ads_v5\"\r\n os_disk_type = \"Ephemeral\"\r\n zones = [1, 2, 3]\r\n only_critical_addons_enabled = true\r\n\r\n type = \"VirtualMachineScaleSets\"\r\n vnet_subnet_id = var.subnet_id\r\n enable_auto_scaling = true\r\n max_count = 6\r\n min_count = 2\r\n orchestrator_version = local.kubernetes_version\r\n\r\n upgrade_settings {\r\n```\r\n\r\nresults in\r\n```\r\n[root] # checkov --skip-framework kubernetes --skip-framework helm --quiet --compact -o junitxml -o cli --directory .\r\n2023-10-02 11:58:47,399 [MainThread ] [WARNI] The framework \"sca_image\" is part of the \"SCA\" module, which is not enabled in the platform\r\n2023-10-02 11:58:47,399 [MainThread ] [WARNI] The framework \"sca_package\" is part of the \"SCA\" module, which is not enabled in the platform\r\nterraform scan results:\r\n\r\nPassed checks: 6, Failed checks: 11, Skipped checks: 0\r\n\r\n[...]\r\nCheck: CKV_AZURE_226: \"Ensure ephemeral disks are used for OS disks\"\r\n FAILED for resource: azurerm_kubernetes_cluster.this\r\n File: /aks.tf:1-64\r\nCheck: CKV_AZURE_227: \"Ensure that the AKS cluster encrypt temp disks, caches, and data flows between Compute and Storage resources\"\r\n FAILED for resource: azurerm_kubernetes_cluster.this\r\n File: /aks.tf:1-64\r\n[...]\r\n```\r\n\r\nPlease also see https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/kubernetes_cluster for code example.\r\n\r\n**Version (please complete the following information):**\r\n - Checkov Version 2.4.58\r\n\r\n**Additional context**\r\nThis is related to https://github.com/bridgecrewio/checkov/pull/5584 and https://github.com/bridgecrewio/checkov/pull/5588.\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\nfrom typing import Any\n\n\nclass AKSEphemeralOSDisks(BaseResourceValueCheck):\n def __init__(self) -> None:\n \"\"\"\n Temporary data can contain sensitive data at some points, by using ephemeral disks,\n we ensure that data written to OS disk is stored on local VM storage and isn't persisted to Azure Storage\n\n Azure automatically replicates data stored in the managed OS disk of a virtual machine to Azure storage\n to avoid data loss in case the virtual machine needs to be relocated to another host.\n Generally speaking, containers are not designed to have local state persisted to the managed OS disk,\n hence this behavior offers limited value to AKS hosted while providing some drawbacks,\n including slower node provisioning and higher read/write latency.\n\n Ephemeral disks allow us also to have faster cluster operations like scale or upgrade\n due to 
faster re-imaging and boot times.\n \"\"\"\n name = \"Ensure ephemeral disks are used for OS disks\"\n id = \"CKV_AZURE_226\"\n supported_resources = (\"azurerm_kubernetes_cluster\",)\n categories = (CheckCategories.KUBERNETES,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def get_inspected_key(self) -> str:\n return \"os_disk_type\"\n\n def get_expected_value(self) -> Any:\n return \"Ephemeral\"\n\n\ncheck = AKSEphemeralOSDisks()\n", "path": "checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py"}, {"content": "\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AKSEncryptionAtHostEnabled(BaseResourceValueCheck):\n def __init__(self) -> None:\n \"\"\"\n With host-based encryption, the data stored on the VM host of\n your AKS agent nodes' VMs is encrypted at rest and flows encrypted to the Storage service.\n\n This means the temp disks are encrypted at rest with platform-managed keys.\n The cache of OS and data disks is encrypted at rest with either platform-managed keys\n or customer-managed keys depending on the encryption type set on those disks.\n \"\"\"\n name = \"Ensure that the AKS cluster encrypt temp disks, caches, and data flows \"\n name += \"between Compute and Storage resources\"\n id = \"CKV_AZURE_227\"\n supported_resources = (\"azurerm_kubernetes_cluster\", \"azurerm_kubernetes_cluster_node_pool\")\n categories = (CheckCategories.KUBERNETES,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n missing_block_result=CheckResult.FAILED)\n\n def get_inspected_key(self) -> str:\n return \"enable_host_encryption\"\n\n\ncheck = AKSEncryptionAtHostEnabled()\n", "path": "checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py"}], "after_files": [{"content": "from checkov.common.models.enums import CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\nfrom typing import Any\n\n\nclass AKSEphemeralOSDisks(BaseResourceValueCheck):\n def __init__(self) -> None:\n \"\"\"\n Temporary data can contain sensitive data at some points, by using ephemeral disks,\n we ensure that data written to OS disk is stored on local VM storage and isn't persisted to Azure Storage\n\n Azure automatically replicates data stored in the managed OS disk of a virtual machine to Azure storage\n to avoid data loss in case the virtual machine needs to be relocated to another host.\n Generally speaking, containers are not designed to have local state persisted to the managed OS disk,\n hence this behavior offers limited value to AKS hosted while providing some drawbacks,\n including slower node provisioning and higher read/write latency.\n\n Ephemeral disks allow us also to have faster cluster operations like scale or upgrade\n due to faster re-imaging and boot times.\n \"\"\"\n name = \"Ensure ephemeral disks are used for OS disks\"\n id = \"CKV_AZURE_226\"\n supported_resources = (\"azurerm_kubernetes_cluster\",)\n categories = (CheckCategories.KUBERNETES,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def get_inspected_key(self) -> str:\n return \"default_node_pool/[0]/os_disk_type\"\n\n def get_expected_value(self) -> Any:\n return \"Ephemeral\"\n\n\ncheck = AKSEphemeralOSDisks()\n", "path": 
"checkov/terraform/checks/resource/azure/AKSEphemeralOSDisks.py"}, {"content": "from checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AKSEncryptionAtHostEnabled(BaseResourceValueCheck):\n def __init__(self) -> None:\n \"\"\"\n With host-based encryption, the data stored on the VM host of\n your AKS agent nodes' VMs is encrypted at rest and flows encrypted to the Storage service.\n\n This means the temp disks are encrypted at rest with platform-managed keys.\n The cache of OS and data disks is encrypted at rest with either platform-managed keys\n or customer-managed keys depending on the encryption type set on those disks.\n \"\"\"\n name = \"Ensure that the AKS cluster encrypt temp disks, caches, and data flows \"\n name += \"between Compute and Storage resources\"\n id = \"CKV_AZURE_227\"\n supported_resources = (\"azurerm_kubernetes_cluster\", \"azurerm_kubernetes_cluster_node_pool\")\n categories = (CheckCategories.KUBERNETES,)\n super().__init__(\n name=name,\n id=id,\n categories=categories,\n supported_resources=supported_resources,\n missing_block_result=CheckResult.FAILED,\n )\n\n def get_inspected_key(self) -> str:\n if self.entity_type == \"azurerm_kubernetes_cluster\":\n return \"default_node_pool/[0]/enable_host_encryption\"\n else:\n return \"enable_host_encryption\"\n\n\ncheck = AKSEncryptionAtHostEnabled()\n", "path": "checkov/terraform/checks/resource/azure/AKSEncryptionAtHostEnabled.py"}]} | 1,981 | 511 |
gh_patches_debug_22890 | rasdani/github-patches | git_diff | bokeh__bokeh-4524 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Undeprecate VBox and HBox
I would like the names `VBox` and `HBox` to stay, because they are more meaningful from a UI point of view than the new counterparts.
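
One way to keep them (just a sketch of the idea, not a full proposal) would be to leave `HBox` and `VBox` in `bokeh.models.layouts` as plain aliases instead of deprecated shims:

```python
def HBox(*args, **kwargs):
    # keep the familiar name as a thin wrapper around Row
    return Row(*args, **kwargs)


def VBox(*args, **kwargs):
    # likewise, a thin wrapper around Column
    return Column(*args, **kwargs)
```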
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bokeh/models/layouts.py`
Content:
```
1 """ Various kinds of layout components.
2
3 """
4 from __future__ import absolute_import
5
6 import warnings
7 import logging
8 logger = logging.getLogger(__name__)
9
10 from ..core import validation
11
12 from ..core.validation.warnings import (
13 EMPTY_LAYOUT,
14 BOTH_CHILD_AND_ROOT,
15 )
16 from ..core.enums import Location, Responsive as ResponsiveEnum
17 from ..core.properties import abstract, Bool, Int, Instance, List, Responsive, Override
18 from ..embed import notebook_div
19 from ..model import Model
20 from ..util.deprecate import deprecated
21
22
23 @abstract
24 class LayoutDOM(Model):
25 """ An abstract base class for layout components. ``LayoutDOM`` is not
26 generally useful to instantiate on its own.
27
28 """
29
30 width = Int(help="""
31 An optional width for the component (in pixels).
32 """)
33
34 height = Int(help="""
35 An optional height for the component (in pixels).
36 """)
37
38 disabled = Bool(False, help="""
39 Whether the widget will be disabled when rendered. If ``True``,
40 the widget will be greyed-out, and not respond to UI events.
41 """)
42
43 responsive = Responsive(help="""
44 The type of responsiveness for the item being displayed. Possible values are
45 ``"fixed"`` (or ``False``), ``"scale_width"`` (or ``True``),
46 ``"scale_height"``, ``"scale_both"``, ``"stretch_both"``.
47
48 ``"stretch_both"`` elements are completely responsive (independently in width and height) and
49 will resize to occupy all available space, even if this changes the aspect ratio of the element.
50 This is sometimes called outside-in, and is a typical behavior for desktop applications.
51
52 ``"fixed"`` elements are not responsive. They will retain their original width and height
53 regardless of any subsequent browser window resize events.
54
55 ``"scale_width"`` elements will responsively resize to fit to the width available, *while
56 maintaining the original aspect ratio*. This is a typical behavior for modern websites. For a
57 ``Plot``, the aspect ratio ``plot_width/plot_height`` is maintained.
58
59 ``"scale_height"`` elements will responsively resize to fit to the height available, *while
60 maintaining the original aspect ratio*. For a ``Plot``, the aspect ratio
61 ``plot_width/plot_height`` is maintained. A plot with ``"scale_height"`` mode needs
62 to be wrapped in a ``Row`` or ``Column`` to be responsive.
63
64
65 ``"scale_both"`` elements will responsively resize to fir both the width and height available,
66 *while maintaining the original aspect ratio*.
67
68 """)
69
70 # TODO: (mp) Not yet, because it breaks plotting/notebook examples.
71 # Rename to _repr_html_ if we decide to enable this by default.
72 def __repr_html__(self):
73 return notebook_div(self)
74
75 @property
76 def html(self):
77 from IPython.core.display import HTML
78 return HTML(self.__repr_html__())
79
80
81 class Spacer(LayoutDOM):
82 """ A container for space used to fill an empty spot in a row or column.
83
84 """
85
86
87 class WidgetBox(LayoutDOM):
88 """ A container for widgets that are part of a layout."""
89 def __init__(self, *args, **kwargs):
90 if len(args) > 0 and "children" in kwargs:
91 raise ValueError("'children' keyword cannot be used with positional arguments")
92 elif len(args) > 0:
93 kwargs["children"] = list(args)
94 super(WidgetBox, self).__init__(**kwargs)
95
96 @validation.warning(EMPTY_LAYOUT)
97 def _check_empty_layout(self):
98 from itertools import chain
99 if not list(chain(self.children)):
100 return str(self)
101
102 @validation.warning(BOTH_CHILD_AND_ROOT)
103 def _check_child_is_also_root(self):
104 problems = []
105 for c in self.children:
106 if c.document is not None and c in c.document.roots:
107 problems.append(str(c))
108 if problems:
109 return ", ".join(problems)
110 else:
111 return None
112
113 children = List(Instance('bokeh.models.widgets.Widget'), help="""
114 The list of widgets to put in the layout box.
115 """)
116
117 responsive = Override(default='fixed')
118
119
120 @abstract
121 class Box(LayoutDOM):
122 """ Abstract base class for Row and Column. Do not use directly.
123 """
124
125 def __init__(self, *args, **kwargs):
126
127 if len(args) > 0 and "children" in kwargs:
128 raise ValueError("'children' keyword cannot be used with positional arguments")
129 elif len(args) > 0:
130 kwargs["children"] = list(args)
131
132 unwrapped_children = kwargs.get("children", [])
133 kwargs["children"] = self._wrap_children(unwrapped_children)
134 super(Box, self).__init__(**kwargs)
135
136 def _wrap_children(self, children):
137 """ Wrap any Widgets of a list of child layouts in a WidgetBox.
138 This allows for the convenience of just spelling Row(button1, button2).
139 """
140 from .widgets.widget import Widget
141 wrapped_children = []
142 for child in children:
143 if isinstance(child, Widget):
144 child = WidgetBox(
145 children=[child],
146 responsive=child.responsive,
147 width=child.width,
148 height=child.height,
149 disabled=child.disabled
150 )
151 wrapped_children.append(child)
152 return wrapped_children
153
154 @validation.warning(EMPTY_LAYOUT)
155 def _check_empty_layout(self):
156 from itertools import chain
157 if not list(chain(self.children)):
158 return str(self)
159
160 @validation.warning(BOTH_CHILD_AND_ROOT)
161 def _check_child_is_also_root(self):
162 problems = []
163 for c in self.children:
164 if c.document is not None and c in c.document.roots:
165 problems.append(str(c))
166 if problems:
167 return ", ".join(problems)
168 else:
169 return None
170
171 #TODO Debating the following instead to prevent people adding just a plain
172 # widget into a box, which sometimes works and sometimes looks disastrous
173 #children = List(
174 # Either(
175 # Instance('bokeh.models.layouts.Row'),
176 # Instance('bokeh.models.layouts.Column'),
177 # Instance('bokeh.models.plots.Plot'),
178 # Instance('bokeh.models.layouts.WidgetBox')
179 # ), help="""
180 # The list of children, which can be other components including plots, rows, columns, and widgets.
181 #""")
182 children = List(Instance(LayoutDOM), help="""
183 The list of children, which can be other components including plots, rows, columns, and widgets.
184 """)
185
186 responsive = Override(default='fixed')
187
188
189 class Row(Box):
190 """ Lay out child components in a single horizontal row.
191
192 Children can be specified as positional arguments, as a single argument
193 that is a sequence, or using the ``children`` keyword argument.
194 """
195
196
197 class Column(Box):
198 """ Lay out child components in a single vertical row.
199
200 Children can be specified as positional arguments, as a single argument
201 that is a sequence, or using the ``children`` keyword argument.
202 """
203
204
205 # ---- DEPRECATIONS
206
207 @deprecated("Bokeh 0.12.0", "bokeh.layouts.gridplot")
208 def GridPlot(*args, **kwargs):
209 from bokeh.layouts import gridplot
210 return gridplot(*args, **kwargs)
211
212
213 @deprecated("Bokeh 0.12.0", "bokeh.models.layouts.Row")
214 def HBox(*args, **kwargs):
215 return Row(*args, **kwargs)
216
217
218 @deprecated("Bokeh 0.12.0", "bokeh.models.layouts.Column")
219 def VBox(*args, **kwargs):
220 return Column(*args, **kwargs)
221
222
223 @deprecated("Bokeh 0.12.0", "bokeh.models.layouts.WidgetBox")
224 def VBoxForm(*args, **kwargs):
225 from bokeh.models.widgets.widget import Widget
226
227 if len(args) > 0 and "children" in kwargs:
228 raise ValueError("'children' keyword cannot be used with positional arguments")
229 elif len(args) > 0:
230 children = list(args)
231 else:
232 children = kwargs.get("children", [])
233 is_widget = [isinstance(item, Widget) for item in children]
234 if all(is_widget):
235 return WidgetBox(*args, **kwargs)
236 else:
237 warnings.warn(
238 """WARNING: Non-widgets added to VBoxForm! VBoxForm is deprecated and is
239 being replaced with WidgetBox. WidgetBox does not allow you to add non-widgets to it.
240 We have transformed your request into a Column, with your Plots and WidgetBox(es) inside
241 it. In the future, you will need to update your code to use Row and Column. You may
242 find the new bokeh.layouts functions helpful.
243 """)
244 return Column(*args, **kwargs)
245
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bokeh/models/layouts.py b/bokeh/models/layouts.py
--- a/bokeh/models/layouts.py
+++ b/bokeh/models/layouts.py
@@ -202,23 +202,33 @@
"""
-# ---- DEPRECATIONS
-
-@deprecated("Bokeh 0.12.0", "bokeh.layouts.gridplot")
-def GridPlot(*args, **kwargs):
- from bokeh.layouts import gridplot
- return gridplot(*args, **kwargs)
+def HBox(*args, **kwargs):
+ """ Lay out child components in a single horizontal row.
+ Children can be specified as positional arguments, as a single argument
+ that is a sequence, or using the ``children`` keyword argument.
-@deprecated("Bokeh 0.12.0", "bokeh.models.layouts.Row")
-def HBox(*args, **kwargs):
+ Returns a Row instance.
+ """
return Row(*args, **kwargs)
-@deprecated("Bokeh 0.12.0", "bokeh.models.layouts.Column")
def VBox(*args, **kwargs):
+ """ Lay out child components in a single vertical row.
+
+ Children can be specified as positional arguments, as a single argument
+ that is a sequence, or using the ``children`` keyword argument.
+
+ Returns a Column instance.
+ """
return Column(*args, **kwargs)
+# ---- DEPRECATIONS
+
+@deprecated("Bokeh 0.12.0", "bokeh.layouts.gridplot")
+def GridPlot(*args, **kwargs):
+ from bokeh.layouts import gridplot
+ return gridplot(*args, **kwargs)
@deprecated("Bokeh 0.12.0", "bokeh.models.layouts.WidgetBox")
def VBoxForm(*args, **kwargs):
| {"golden_diff": "diff --git a/bokeh/models/layouts.py b/bokeh/models/layouts.py\n--- a/bokeh/models/layouts.py\n+++ b/bokeh/models/layouts.py\n@@ -202,23 +202,33 @@\n \"\"\"\n \n \n-# ---- DEPRECATIONS\n-\n-@deprecated(\"Bokeh 0.12.0\", \"bokeh.layouts.gridplot\")\n-def GridPlot(*args, **kwargs):\n- from bokeh.layouts import gridplot\n- return gridplot(*args, **kwargs)\n+def HBox(*args, **kwargs):\n+ \"\"\" Lay out child components in a single horizontal row.\n \n+ Children can be specified as positional arguments, as a single argument\n+ that is a sequence, or using the ``children`` keyword argument.\n \n-@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.Row\")\n-def HBox(*args, **kwargs):\n+ Returns a Row instance.\n+ \"\"\"\n return Row(*args, **kwargs)\n \n \n-@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.Column\")\n def VBox(*args, **kwargs):\n+ \"\"\" Lay out child components in a single vertical row.\n+\n+ Children can be specified as positional arguments, as a single argument\n+ that is a sequence, or using the ``children`` keyword argument.\n+\n+ Returns a Column instance.\n+ \"\"\"\n return Column(*args, **kwargs)\n \n+# ---- DEPRECATIONS\n+\n+@deprecated(\"Bokeh 0.12.0\", \"bokeh.layouts.gridplot\")\n+def GridPlot(*args, **kwargs):\n+ from bokeh.layouts import gridplot\n+ return gridplot(*args, **kwargs)\n \n @deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.WidgetBox\")\n def VBoxForm(*args, **kwargs):\n", "issue": "Undeprecate VBox and HBox\nI would like names `VBox` and `HBox` to stay, because they are more meaningful to from UI point of view than the new counterparts.\n\n", "before_files": [{"content": "\"\"\" Various kinds of layout components.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nimport warnings\nimport logging\nlogger = logging.getLogger(__name__)\n\nfrom ..core import validation\n\nfrom ..core.validation.warnings import (\n EMPTY_LAYOUT,\n BOTH_CHILD_AND_ROOT,\n)\nfrom ..core.enums import Location, Responsive as ResponsiveEnum\nfrom ..core.properties import abstract, Bool, Int, Instance, List, Responsive, Override\nfrom ..embed import notebook_div\nfrom ..model import Model\nfrom ..util.deprecate import deprecated\n\n\n@abstract\nclass LayoutDOM(Model):\n \"\"\" An abstract base class for layout components. ``LayoutDOM`` is not\n generally useful to instantiate on its own.\n\n \"\"\"\n\n width = Int(help=\"\"\"\n An optional width for the component (in pixels).\n \"\"\")\n\n height = Int(help=\"\"\"\n An optional height for the component (in pixels).\n \"\"\")\n\n disabled = Bool(False, help=\"\"\"\n Whether the widget will be disabled when rendered. If ``True``,\n the widget will be greyed-out, and not respond to UI events.\n \"\"\")\n\n responsive = Responsive(help=\"\"\"\n The type of responsiveness for the item being displayed. Possible values are\n ``\"fixed\"`` (or ``False``), ``\"scale_width\"`` (or ``True``),\n ``\"scale_height\"``, ``\"scale_both\"``, ``\"stretch_both\"``.\n\n ``\"stretch_both\"`` elements are completely responsive (independently in width and height) and\n will resize to occupy all available space, even if this changes the aspect ratio of the element.\n This is sometimes called outside-in, and is a typical behavior for desktop applications.\n\n ``\"fixed\"`` elements are not responsive. 
They will retain their original width and height\n regardless of any subsequent browser window resize events.\n\n ``\"scale_width\"`` elements will responsively resize to fit to the width available, *while\n maintaining the original aspect ratio*. This is a typical behavior for modern websites. For a\n ``Plot``, the aspect ratio ``plot_width/plot_height`` is maintained.\n\n ``\"scale_height\"`` elements will responsively resize to fit to the height available, *while\n maintaining the original aspect ratio*. For a ``Plot``, the aspect ratio\n ``plot_width/plot_height`` is maintained. A plot with ``\"scale_height\"`` mode needs\n to be wrapped in a ``Row`` or ``Column`` to be responsive.\n\n\n ``\"scale_both\"`` elements will responsively resize to fir both the width and height available,\n *while maintaining the original aspect ratio*.\n\n \"\"\")\n\n # TODO: (mp) Not yet, because it breaks plotting/notebook examples.\n # Rename to _repr_html_ if we decide to enable this by default.\n def __repr_html__(self):\n return notebook_div(self)\n\n @property\n def html(self):\n from IPython.core.display import HTML\n return HTML(self.__repr_html__())\n\n\nclass Spacer(LayoutDOM):\n \"\"\" A container for space used to fill an empty spot in a row or column.\n\n \"\"\"\n\n\nclass WidgetBox(LayoutDOM):\n \"\"\" A container for widgets that are part of a layout.\"\"\"\n def __init__(self, *args, **kwargs):\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n kwargs[\"children\"] = list(args)\n super(WidgetBox, self).__init__(**kwargs)\n\n @validation.warning(EMPTY_LAYOUT)\n def _check_empty_layout(self):\n from itertools import chain\n if not list(chain(self.children)):\n return str(self)\n\n @validation.warning(BOTH_CHILD_AND_ROOT)\n def _check_child_is_also_root(self):\n problems = []\n for c in self.children:\n if c.document is not None and c in c.document.roots:\n problems.append(str(c))\n if problems:\n return \", \".join(problems)\n else:\n return None\n\n children = List(Instance('bokeh.models.widgets.Widget'), help=\"\"\"\n The list of widgets to put in the layout box.\n \"\"\")\n\n responsive = Override(default='fixed')\n\n\n@abstract\nclass Box(LayoutDOM):\n \"\"\" Abstract base class for Row and Column. 
Do not use directly.\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n kwargs[\"children\"] = list(args)\n\n unwrapped_children = kwargs.get(\"children\", [])\n kwargs[\"children\"] = self._wrap_children(unwrapped_children)\n super(Box, self).__init__(**kwargs)\n\n def _wrap_children(self, children):\n \"\"\" Wrap any Widgets of a list of child layouts in a WidgetBox.\n This allows for the convenience of just spelling Row(button1, button2).\n \"\"\"\n from .widgets.widget import Widget\n wrapped_children = []\n for child in children:\n if isinstance(child, Widget):\n child = WidgetBox(\n children=[child],\n responsive=child.responsive,\n width=child.width,\n height=child.height,\n disabled=child.disabled\n )\n wrapped_children.append(child)\n return wrapped_children\n\n @validation.warning(EMPTY_LAYOUT)\n def _check_empty_layout(self):\n from itertools import chain\n if not list(chain(self.children)):\n return str(self)\n\n @validation.warning(BOTH_CHILD_AND_ROOT)\n def _check_child_is_also_root(self):\n problems = []\n for c in self.children:\n if c.document is not None and c in c.document.roots:\n problems.append(str(c))\n if problems:\n return \", \".join(problems)\n else:\n return None\n\n #TODO Debating the following instead to prevent people adding just a plain\n # widget into a box, which sometimes works and sometimes looks disastrous\n #children = List(\n # Either(\n # Instance('bokeh.models.layouts.Row'),\n # Instance('bokeh.models.layouts.Column'),\n # Instance('bokeh.models.plots.Plot'),\n # Instance('bokeh.models.layouts.WidgetBox')\n # ), help=\"\"\"\n # The list of children, which can be other components including plots, rows, columns, and widgets.\n #\"\"\")\n children = List(Instance(LayoutDOM), help=\"\"\"\n The list of children, which can be other components including plots, rows, columns, and widgets.\n \"\"\")\n\n responsive = Override(default='fixed')\n\n\nclass Row(Box):\n \"\"\" Lay out child components in a single horizontal row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n \"\"\"\n\n\nclass Column(Box):\n \"\"\" Lay out child components in a single vertical row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n \"\"\"\n\n\n# ---- DEPRECATIONS\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.layouts.gridplot\")\ndef GridPlot(*args, **kwargs):\n from bokeh.layouts import gridplot\n return gridplot(*args, **kwargs)\n\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.Row\")\ndef HBox(*args, **kwargs):\n return Row(*args, **kwargs)\n\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.Column\")\ndef VBox(*args, **kwargs):\n return Column(*args, **kwargs)\n\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.WidgetBox\")\ndef VBoxForm(*args, **kwargs):\n from bokeh.models.widgets.widget import Widget\n\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n children = list(args)\n else:\n children = kwargs.get(\"children\", [])\n is_widget = [isinstance(item, Widget) for item in children]\n if all(is_widget):\n return WidgetBox(*args, **kwargs)\n else:\n warnings.warn(\n \"\"\"WARNING: Non-widgets added to VBoxForm! 
VBoxForm is deprecated and is\n being replaced with WidgetBox. WidgetBox does not allow you to add non-widgets to it.\n We have transformed your request into a Column, with your Plots and WidgetBox(es) inside\n it. In the future, you will need to update your code to use Row and Column. You may\n find the new bokeh.layouts functions helpful.\n \"\"\")\n return Column(*args, **kwargs)\n", "path": "bokeh/models/layouts.py"}], "after_files": [{"content": "\"\"\" Various kinds of layout components.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nimport warnings\nimport logging\nlogger = logging.getLogger(__name__)\n\nfrom ..core import validation\n\nfrom ..core.validation.warnings import (\n EMPTY_LAYOUT,\n BOTH_CHILD_AND_ROOT,\n)\nfrom ..core.enums import Location, Responsive as ResponsiveEnum\nfrom ..core.properties import abstract, Bool, Int, Instance, List, Responsive, Override\nfrom ..embed import notebook_div\nfrom ..model import Model\nfrom ..util.deprecate import deprecated\n\n\n@abstract\nclass LayoutDOM(Model):\n \"\"\" An abstract base class for layout components. ``LayoutDOM`` is not\n generally useful to instantiate on its own.\n\n \"\"\"\n\n width = Int(help=\"\"\"\n An optional width for the component (in pixels).\n \"\"\")\n\n height = Int(help=\"\"\"\n An optional height for the component (in pixels).\n \"\"\")\n\n disabled = Bool(False, help=\"\"\"\n Whether the widget will be disabled when rendered. If ``True``,\n the widget will be greyed-out, and not respond to UI events.\n \"\"\")\n\n responsive = Responsive(help=\"\"\"\n The type of responsiveness for the item being displayed. Possible values are\n ``\"fixed\"`` (or ``False``), ``\"scale_width\"`` (or ``True``),\n ``\"scale_height\"``, ``\"scale_both\"``, ``\"stretch_both\"``.\n\n ``\"stretch_both\"`` elements are completely responsive (independently in width and height) and\n will resize to occupy all available space, even if this changes the aspect ratio of the element.\n This is sometimes called outside-in, and is a typical behavior for desktop applications.\n\n ``\"fixed\"`` elements are not responsive. They will retain their original width and height\n regardless of any subsequent browser window resize events.\n\n ``\"scale_width\"`` elements will responsively resize to fit to the width available, *while\n maintaining the original aspect ratio*. This is a typical behavior for modern websites. For a\n ``Plot``, the aspect ratio ``plot_width/plot_height`` is maintained.\n\n ``\"scale_height\"`` elements will responsively resize to fit to the height available, *while\n maintaining the original aspect ratio*. For a ``Plot``, the aspect ratio\n ``plot_width/plot_height`` is maintained. 
A plot with ``\"scale_height\"`` mode needs\n to be wrapped in a ``Row`` or ``Column`` to be responsive.\n\n\n ``\"scale_both\"`` elements will responsively resize to fir both the width and height available,\n *while maintaining the original aspect ratio*.\n\n \"\"\")\n\n # TODO: (mp) Not yet, because it breaks plotting/notebook examples.\n # Rename to _repr_html_ if we decide to enable this by default.\n def __repr_html__(self):\n return notebook_div(self)\n\n @property\n def html(self):\n from IPython.core.display import HTML\n return HTML(self.__repr_html__())\n\n\nclass Spacer(LayoutDOM):\n \"\"\" A container for space used to fill an empty spot in a row or column.\n\n \"\"\"\n\n\nclass WidgetBox(LayoutDOM):\n \"\"\" A container for widgets that are part of a layout.\"\"\"\n def __init__(self, *args, **kwargs):\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n kwargs[\"children\"] = list(args)\n super(WidgetBox, self).__init__(**kwargs)\n\n @validation.warning(EMPTY_LAYOUT)\n def _check_empty_layout(self):\n from itertools import chain\n if not list(chain(self.children)):\n return str(self)\n\n @validation.warning(BOTH_CHILD_AND_ROOT)\n def _check_child_is_also_root(self):\n problems = []\n for c in self.children:\n if c.document is not None and c in c.document.roots:\n problems.append(str(c))\n if problems:\n return \", \".join(problems)\n else:\n return None\n\n children = List(Instance('bokeh.models.widgets.Widget'), help=\"\"\"\n The list of widgets to put in the layout box.\n \"\"\")\n\n responsive = Override(default='fixed')\n\n\n@abstract\nclass Box(LayoutDOM):\n \"\"\" Abstract base class for Row and Column. Do not use directly.\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n kwargs[\"children\"] = list(args)\n\n unwrapped_children = kwargs.get(\"children\", [])\n kwargs[\"children\"] = self._wrap_children(unwrapped_children)\n super(Box, self).__init__(**kwargs)\n\n def _wrap_children(self, children):\n \"\"\" Wrap any Widgets of a list of child layouts in a WidgetBox.\n This allows for the convenience of just spelling Row(button1, button2).\n \"\"\"\n from .widgets.widget import Widget\n wrapped_children = []\n for child in children:\n if isinstance(child, Widget):\n child = WidgetBox(\n children=[child],\n responsive=child.responsive,\n width=child.width,\n height=child.height,\n disabled=child.disabled\n )\n wrapped_children.append(child)\n return wrapped_children\n\n @validation.warning(EMPTY_LAYOUT)\n def _check_empty_layout(self):\n from itertools import chain\n if not list(chain(self.children)):\n return str(self)\n\n @validation.warning(BOTH_CHILD_AND_ROOT)\n def _check_child_is_also_root(self):\n problems = []\n for c in self.children:\n if c.document is not None and c in c.document.roots:\n problems.append(str(c))\n if problems:\n return \", \".join(problems)\n else:\n return None\n\n #TODO Debating the following instead to prevent people adding just a plain\n # widget into a box, which sometimes works and sometimes looks disastrous\n #children = List(\n # Either(\n # Instance('bokeh.models.layouts.Row'),\n # Instance('bokeh.models.layouts.Column'),\n # Instance('bokeh.models.plots.Plot'),\n # Instance('bokeh.models.layouts.WidgetBox')\n # ), help=\"\"\"\n # The list of children, which can be other 
components including plots, rows, columns, and widgets.\n #\"\"\")\n children = List(Instance(LayoutDOM), help=\"\"\"\n The list of children, which can be other components including plots, rows, columns, and widgets.\n \"\"\")\n\n responsive = Override(default='fixed')\n\n\nclass Row(Box):\n \"\"\" Lay out child components in a single horizontal row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n \"\"\"\n\n\nclass Column(Box):\n \"\"\" Lay out child components in a single vertical row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n \"\"\"\n\n\ndef HBox(*args, **kwargs):\n \"\"\" Lay out child components in a single horizontal row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n\n Returns a Row instance.\n \"\"\"\n return Row(*args, **kwargs)\n\n\ndef VBox(*args, **kwargs):\n \"\"\" Lay out child components in a single vertical row.\n\n Children can be specified as positional arguments, as a single argument\n that is a sequence, or using the ``children`` keyword argument.\n\n Returns a Column instance.\n \"\"\"\n return Column(*args, **kwargs)\n\n# ---- DEPRECATIONS\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.layouts.gridplot\")\ndef GridPlot(*args, **kwargs):\n from bokeh.layouts import gridplot\n return gridplot(*args, **kwargs)\n\n@deprecated(\"Bokeh 0.12.0\", \"bokeh.models.layouts.WidgetBox\")\ndef VBoxForm(*args, **kwargs):\n from bokeh.models.widgets.widget import Widget\n\n if len(args) > 0 and \"children\" in kwargs:\n raise ValueError(\"'children' keyword cannot be used with positional arguments\")\n elif len(args) > 0:\n children = list(args)\n else:\n children = kwargs.get(\"children\", [])\n is_widget = [isinstance(item, Widget) for item in children]\n if all(is_widget):\n return WidgetBox(*args, **kwargs)\n else:\n warnings.warn(\n \"\"\"WARNING: Non-widgets added to VBoxForm! VBoxForm is deprecated and is\n being replaced with WidgetBox. WidgetBox does not allow you to add non-widgets to it.\n We have transformed your request into a Column, with your Plots and WidgetBox(es) inside\n it. In the future, you will need to update your code to use Row and Column. You may\n find the new bokeh.layouts functions helpful.\n \"\"\")\n return Column(*args, **kwargs)\n", "path": "bokeh/models/layouts.py"}]} | 2,850 | 405 |
gh_patches_debug_20164 | rasdani/github-patches | git_diff | pytorch__vision-2258 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Raise error if target boxes are degenerate in Faster R-CNN
We have had a number of reports from users saying that their training loss is NaN after a few iterations.
Most of the time, this is due to degenerate boxes (i.e., boxes with negative sizes or zero area). We should improve the user experience in those situations.
I think that raising an error in `GeneralizedRCNN` if the target boxes are degenerate would be a good compromise.
Related issues: https://github.com/pytorch/vision/issues/2235 https://github.com/pytorch/vision/issues/1994 https://github.com/pytorch/vision/issues/1176 https://github.com/pytorch/vision/issues/1128 #1120 and #997
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/models/detection/generalized_rcnn.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
2 """
3 Implements the Generalized R-CNN framework
4 """
5
6 from collections import OrderedDict
7 import torch
8 from torch import nn
9 import warnings
10 from torch.jit.annotations import Tuple, List, Dict, Optional
11 from torch import Tensor
12
13
14 class GeneralizedRCNN(nn.Module):
15 """
16 Main class for Generalized R-CNN.
17
18 Arguments:
19 backbone (nn.Module):
20 rpn (nn.Module):
21 roi_heads (nn.Module): takes the features + the proposals from the RPN and computes
22 detections / masks from it.
23 transform (nn.Module): performs the data transformation from the inputs to feed into
24 the model
25 """
26
27 def __init__(self, backbone, rpn, roi_heads, transform):
28 super(GeneralizedRCNN, self).__init__()
29 self.transform = transform
30 self.backbone = backbone
31 self.rpn = rpn
32 self.roi_heads = roi_heads
33 # used only on torchscript mode
34 self._has_warned = False
35
36 @torch.jit.unused
37 def eager_outputs(self, losses, detections):
38 # type: (Dict[str, Tensor], List[Dict[str, Tensor]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]
39 if self.training:
40 return losses
41
42 return detections
43
44 def forward(self, images, targets=None):
45 # type: (List[Tensor], Optional[List[Dict[str, Tensor]]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]
46 """
47 Arguments:
48 images (list[Tensor]): images to be processed
49 targets (list[Dict[Tensor]]): ground-truth boxes present in the image (optional)
50
51 Returns:
52 result (list[BoxList] or dict[Tensor]): the output from the model.
53 During training, it returns a dict[Tensor] which contains the losses.
54 During testing, it returns list[BoxList] contains additional fields
55 like `scores`, `labels` and `mask` (for Mask R-CNN models).
56
57 """
58 if self.training and targets is None:
59 raise ValueError("In training mode, targets should be passed")
60 if self.training:
61 assert targets is not None
62 for target in targets:
63 boxes = target["boxes"]
64 if isinstance(boxes, torch.Tensor):
65 if len(boxes.shape) != 2 or boxes.shape[-1] != 4:
66 raise ValueError("Expected target boxes to be a tensor"
67 "of shape [N, 4], got {:}.".format(
68 boxes.shape))
69 else:
70 raise ValueError("Expected target boxes to be of type "
71 "Tensor, got {:}.".format(type(boxes)))
72
73 original_image_sizes = torch.jit.annotate(List[Tuple[int, int]], [])
74 for img in images:
75 val = img.shape[-2:]
76 assert len(val) == 2
77 original_image_sizes.append((val[0], val[1]))
78
79 images, targets = self.transform(images, targets)
80 features = self.backbone(images.tensors)
81 if isinstance(features, torch.Tensor):
82 features = OrderedDict([('0', features)])
83 proposals, proposal_losses = self.rpn(images, features, targets)
84 detections, detector_losses = self.roi_heads(features, proposals, images.image_sizes, targets)
85 detections = self.transform.postprocess(detections, images.image_sizes, original_image_sizes)
86
87 losses = {}
88 losses.update(detector_losses)
89 losses.update(proposal_losses)
90
91 if torch.jit.is_scripting():
92 if not self._has_warned:
93 warnings.warn("RCNN always returns a (Losses, Detections) tuple in scripting")
94 self._has_warned = True
95 return (losses, detections)
96 else:
97 return self.eager_outputs(losses, detections)
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchvision/models/detection/generalized_rcnn.py b/torchvision/models/detection/generalized_rcnn.py
--- a/torchvision/models/detection/generalized_rcnn.py
+++ b/torchvision/models/detection/generalized_rcnn.py
@@ -77,6 +77,21 @@
original_image_sizes.append((val[0], val[1]))
images, targets = self.transform(images, targets)
+
+ # Check for degenerate boxes
+ # TODO: Move this to a function
+ if targets is not None:
+ for target_idx, target in enumerate(targets):
+ boxes = target["boxes"]
+ degenerate_boxes = boxes[:, 2:] <= boxes[:, :2]
+ if degenerate_boxes.any():
+ # print the first degenrate box
+ bb_idx = degenerate_boxes.any(dim=1).nonzero().view(-1)[0]
+ degen_bb: List[float] = boxes[bb_idx].tolist()
+ raise ValueError("All bounding boxes should have positive height and width."
+ " Found invaid box {} for target at index {}."
+ .format(degen_bb, target_idx))
+
features = self.backbone(images.tensors)
if isinstance(features, torch.Tensor):
features = OrderedDict([('0', features)])
| {"golden_diff": "diff --git a/torchvision/models/detection/generalized_rcnn.py b/torchvision/models/detection/generalized_rcnn.py\n--- a/torchvision/models/detection/generalized_rcnn.py\n+++ b/torchvision/models/detection/generalized_rcnn.py\n@@ -77,6 +77,21 @@\n original_image_sizes.append((val[0], val[1]))\n \n images, targets = self.transform(images, targets)\n+\n+ # Check for degenerate boxes\n+ # TODO: Move this to a function\n+ if targets is not None:\n+ for target_idx, target in enumerate(targets):\n+ boxes = target[\"boxes\"]\n+ degenerate_boxes = boxes[:, 2:] <= boxes[:, :2]\n+ if degenerate_boxes.any():\n+ # print the first degenrate box\n+ bb_idx = degenerate_boxes.any(dim=1).nonzero().view(-1)[0]\n+ degen_bb: List[float] = boxes[bb_idx].tolist()\n+ raise ValueError(\"All bounding boxes should have positive height and width.\"\n+ \" Found invaid box {} for target at index {}.\"\n+ .format(degen_bb, target_idx))\n+\n features = self.backbone(images.tensors)\n if isinstance(features, torch.Tensor):\n features = OrderedDict([('0', features)])\n", "issue": "Raise error if target boxes are degenerate in Faster R-CNN\nWe have had a number of reports with users saying that their training loss is nan after a few iterations.\r\n\r\nMost of the time, this is due to degenerate boxes (i.e., boxes with negative sizes or zero area). We should improve the user experience in those situations.\r\n\r\nI think that raising an error in `GeneralizedRCNN` if the target boxes are degenerate would be a good compromise.\r\n\r\nRelated issues: https://github.com/pytorch/vision/issues/2235 https://github.com/pytorch/vision/issues/1994 https://github.com/pytorch/vision/issues/1176 https://github.com/pytorch/vision/issues/1128 #1120 and #997\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved.\n\"\"\"\nImplements the Generalized R-CNN framework\n\"\"\"\n\nfrom collections import OrderedDict\nimport torch\nfrom torch import nn\nimport warnings\nfrom torch.jit.annotations import Tuple, List, Dict, Optional\nfrom torch import Tensor\n\n\nclass GeneralizedRCNN(nn.Module):\n \"\"\"\n Main class for Generalized R-CNN.\n\n Arguments:\n backbone (nn.Module):\n rpn (nn.Module):\n roi_heads (nn.Module): takes the features + the proposals from the RPN and computes\n detections / masks from it.\n transform (nn.Module): performs the data transformation from the inputs to feed into\n the model\n \"\"\"\n\n def __init__(self, backbone, rpn, roi_heads, transform):\n super(GeneralizedRCNN, self).__init__()\n self.transform = transform\n self.backbone = backbone\n self.rpn = rpn\n self.roi_heads = roi_heads\n # used only on torchscript mode\n self._has_warned = False\n\n @torch.jit.unused\n def eager_outputs(self, losses, detections):\n # type: (Dict[str, Tensor], List[Dict[str, Tensor]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]\n if self.training:\n return losses\n\n return detections\n\n def forward(self, images, targets=None):\n # type: (List[Tensor], Optional[List[Dict[str, Tensor]]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]\n \"\"\"\n Arguments:\n images (list[Tensor]): images to be processed\n targets (list[Dict[Tensor]]): ground-truth boxes present in the image (optional)\n\n Returns:\n result (list[BoxList] or dict[Tensor]): the output from the model.\n During training, it returns a dict[Tensor] which contains the losses.\n During testing, it returns list[BoxList] contains additional fields\n like `scores`, `labels` and `mask` (for Mask R-CNN models).\n\n \"\"\"\n if self.training and targets is None:\n raise ValueError(\"In training mode, targets should be passed\")\n if self.training:\n assert targets is not None\n for target in targets:\n boxes = target[\"boxes\"]\n if isinstance(boxes, torch.Tensor):\n if len(boxes.shape) != 2 or boxes.shape[-1] != 4:\n raise ValueError(\"Expected target boxes to be a tensor\"\n \"of shape [N, 4], got {:}.\".format(\n boxes.shape))\n else:\n raise ValueError(\"Expected target boxes to be of type \"\n \"Tensor, got {:}.\".format(type(boxes)))\n\n original_image_sizes = torch.jit.annotate(List[Tuple[int, int]], [])\n for img in images:\n val = img.shape[-2:]\n assert len(val) == 2\n original_image_sizes.append((val[0], val[1]))\n\n images, targets = self.transform(images, targets)\n features = self.backbone(images.tensors)\n if isinstance(features, torch.Tensor):\n features = OrderedDict([('0', features)])\n proposals, proposal_losses = self.rpn(images, features, targets)\n detections, detector_losses = self.roi_heads(features, proposals, images.image_sizes, targets)\n detections = self.transform.postprocess(detections, images.image_sizes, original_image_sizes)\n\n losses = {}\n losses.update(detector_losses)\n losses.update(proposal_losses)\n\n if torch.jit.is_scripting():\n if not self._has_warned:\n warnings.warn(\"RCNN always returns a (Losses, Detections) tuple in scripting\")\n self._has_warned = True\n return (losses, detections)\n else:\n return self.eager_outputs(losses, detections)\n", "path": "torchvision/models/detection/generalized_rcnn.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved.\n\"\"\"\nImplements the Generalized R-CNN framework\n\"\"\"\n\nfrom collections import OrderedDict\nimport torch\nfrom torch import nn\nimport warnings\nfrom torch.jit.annotations import Tuple, List, Dict, Optional\nfrom torch import Tensor\n\n\nclass GeneralizedRCNN(nn.Module):\n \"\"\"\n Main class for Generalized R-CNN.\n\n Arguments:\n backbone (nn.Module):\n rpn (nn.Module):\n roi_heads (nn.Module): takes the features + the proposals from the RPN and computes\n detections / masks from it.\n transform (nn.Module): performs the data transformation from the inputs to feed into\n the model\n \"\"\"\n\n def __init__(self, backbone, rpn, roi_heads, transform):\n super(GeneralizedRCNN, self).__init__()\n self.transform = transform\n self.backbone = backbone\n self.rpn = rpn\n self.roi_heads = roi_heads\n # used only on torchscript mode\n self._has_warned = False\n\n @torch.jit.unused\n def eager_outputs(self, losses, detections):\n # type: (Dict[str, Tensor], List[Dict[str, Tensor]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]\n if self.training:\n return losses\n\n return detections\n\n def forward(self, images, targets=None):\n # type: (List[Tensor], Optional[List[Dict[str, Tensor]]]) -> Tuple[Dict[str, Tensor], List[Dict[str, Tensor]]]\n \"\"\"\n Arguments:\n images (list[Tensor]): images to be processed\n targets (list[Dict[Tensor]]): ground-truth boxes present in the image (optional)\n\n Returns:\n result (list[BoxList] or dict[Tensor]): the output from the model.\n During training, it returns a dict[Tensor] which contains the losses.\n During testing, it returns list[BoxList] contains additional fields\n like `scores`, `labels` and `mask` (for Mask R-CNN models).\n\n \"\"\"\n if self.training and targets is None:\n raise ValueError(\"In training mode, targets should be passed\")\n if self.training:\n assert targets is not None\n for target in targets:\n boxes = target[\"boxes\"]\n if isinstance(boxes, torch.Tensor):\n if len(boxes.shape) != 2 or boxes.shape[-1] != 4:\n raise ValueError(\"Expected target boxes to be a tensor\"\n \"of shape [N, 4], got {:}.\".format(\n boxes.shape))\n else:\n raise ValueError(\"Expected target boxes to be of type \"\n \"Tensor, got {:}.\".format(type(boxes)))\n\n original_image_sizes = torch.jit.annotate(List[Tuple[int, int]], [])\n for img in images:\n val = img.shape[-2:]\n assert len(val) == 2\n original_image_sizes.append((val[0], val[1]))\n\n images, targets = self.transform(images, targets)\n\n # Check for degenerate boxes\n # TODO: Move this to a function\n if targets is not None:\n for target_idx, target in enumerate(targets):\n boxes = target[\"boxes\"]\n degenerate_boxes = boxes[:, 2:] <= boxes[:, :2]\n if degenerate_boxes.any():\n # print the first degenrate box\n bb_idx = degenerate_boxes.any(dim=1).nonzero().view(-1)[0]\n degen_bb: List[float] = boxes[bb_idx].tolist()\n raise ValueError(\"All bounding boxes should have positive height and width.\"\n \" Found invaid box {} for target at index {}.\"\n .format(degen_bb, target_idx))\n\n features = self.backbone(images.tensors)\n if isinstance(features, torch.Tensor):\n features = OrderedDict([('0', features)])\n proposals, proposal_losses = self.rpn(images, features, targets)\n detections, detector_losses = self.roi_heads(features, proposals, images.image_sizes, targets)\n detections = self.transform.postprocess(detections, images.image_sizes, original_image_sizes)\n\n losses = {}\n losses.update(detector_losses)\n losses.update(proposal_losses)\n\n if 
torch.jit.is_scripting():\n if not self._has_warned:\n warnings.warn(\"RCNN always returns a (Losses, Detections) tuple in scripting\")\n self._has_warned = True\n return (losses, detections)\n else:\n return self.eager_outputs(losses, detections)\n", "path": "torchvision/models/detection/generalized_rcnn.py"}]} | 1,445 | 286 |
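The patch above boils down to validating every target box before the images reach the backbone. As a hedged, standalone sketch of that same check (the helper name and the toy tensors below are illustrative, not part of torchvision):

```python
import torch


def validate_target_boxes(targets):
    """Raise a clear error when any target box has non-positive width or height.

    Mirrors the check the patch adds to GeneralizedRCNN.forward; `targets` is
    assumed to be a list of dicts whose "boxes" tensor has shape [N, 4] in
    (x1, y1, x2, y2) order.
    """
    for target_idx, target in enumerate(targets):
        boxes = target["boxes"]
        # a box is degenerate when x2 <= x1 or y2 <= y1
        degenerate_boxes = boxes[:, 2:] <= boxes[:, :2]
        if degenerate_boxes.any():
            bb_idx = degenerate_boxes.any(dim=1).nonzero().view(-1)[0]
            raise ValueError(
                "All bounding boxes should have positive height and width. "
                "Found invalid box {} for target at index {}.".format(
                    boxes[bb_idx].tolist(), target_idx))


# The second box below has zero width, so the call raises ValueError instead
# of letting the loss silently turn into NaN a few iterations later.
targets = [{"boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0],
                                   [5.0, 5.0, 5.0, 9.0]])}]
validate_target_boxes(targets)
```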
gh_patches_debug_1514 | rasdani/github-patches | git_diff | ocadotechnology__aimmo-543 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Latest minikube not starting on Travis CI
Same issue (and hopefully the same fix) as this one: https://github.com/kubernetes/minikube/issues/2704
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from setuptools import find_packages, setup
3
4 import versioneer
5
6 setup(
7 name='aimmo',
8 cmdclass=versioneer.get_cmdclass(),
9 packages=find_packages(),
10 include_package_data=True,
11 install_requires=[
12 'django >= 1.8.3, < 1.9.0',
13 'django-autoconfig >= 0.3.6, < 1.0.0',
14 'django-forms-bootstrap',
15 'django-js-reverse',
16 'eventlet',
17 'flask',
18 'flask-socketio',
19 'requests',
20 'six',
21 'pykube',
22 'hypothesis',
23 'flask-cors >= 3.0, < 3.1',
24 'psutil >= 5.4, < 5.5',
25 ],
26 tests_require=[
27 'django-setuptest',
28 'httmock',
29 'mock == 2.0.0',
30 'docker == 2.7.0',
31 'kubernetes == 4.0.0',
32 'PyYAML == 3.12',
33 ],
34 test_suite='setuptest.setuptest.SetupTestSuite',
35 version=versioneer.get_version(),
36 zip_safe=False,
37 )
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -28,7 +28,7 @@
'httmock',
'mock == 2.0.0',
'docker == 2.7.0',
- 'kubernetes == 4.0.0',
+ 'kubernetes == 5.0.0',
'PyYAML == 3.12',
],
test_suite='setuptest.setuptest.SetupTestSuite',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -28,7 +28,7 @@\n 'httmock',\n 'mock == 2.0.0',\n 'docker == 2.7.0',\n- 'kubernetes == 4.0.0',\n+ 'kubernetes == 5.0.0',\n 'PyYAML == 3.12',\n ],\n test_suite='setuptest.setuptest.SetupTestSuite',\n", "issue": "Latest minikube not starting on Travis CI\nSame issue and hopefully fix as this https://github.com/kubernetes/minikube/issues/2704\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom setuptools import find_packages, setup\n\nimport versioneer\n\nsetup(\n name='aimmo',\n cmdclass=versioneer.get_cmdclass(),\n packages=find_packages(),\n include_package_data=True,\n install_requires=[\n 'django >= 1.8.3, < 1.9.0',\n 'django-autoconfig >= 0.3.6, < 1.0.0',\n 'django-forms-bootstrap',\n 'django-js-reverse',\n 'eventlet',\n 'flask',\n 'flask-socketio',\n 'requests',\n 'six',\n 'pykube',\n 'hypothesis',\n 'flask-cors >= 3.0, < 3.1',\n 'psutil >= 5.4, < 5.5',\n ],\n tests_require=[\n 'django-setuptest',\n 'httmock',\n 'mock == 2.0.0',\n 'docker == 2.7.0',\n 'kubernetes == 4.0.0',\n 'PyYAML == 3.12',\n ],\n test_suite='setuptest.setuptest.SetupTestSuite',\n version=versioneer.get_version(),\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom setuptools import find_packages, setup\n\nimport versioneer\n\nsetup(\n name='aimmo',\n cmdclass=versioneer.get_cmdclass(),\n packages=find_packages(),\n include_package_data=True,\n install_requires=[\n 'django >= 1.8.3, < 1.9.0',\n 'django-autoconfig >= 0.3.6, < 1.0.0',\n 'django-forms-bootstrap',\n 'django-js-reverse',\n 'eventlet',\n 'flask',\n 'flask-socketio',\n 'requests',\n 'six',\n 'pykube',\n 'hypothesis',\n 'flask-cors >= 3.0, < 3.1',\n 'psutil >= 5.4, < 5.5',\n ],\n tests_require=[\n 'django-setuptest',\n 'httmock',\n 'mock == 2.0.0',\n 'docker == 2.7.0',\n 'kubernetes == 5.0.0',\n 'PyYAML == 3.12',\n ],\n test_suite='setuptest.setuptest.SetupTestSuite',\n version=versioneer.get_version(),\n zip_safe=False,\n)\n", "path": "setup.py"}]} | 630 | 114 |
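The fix itself is only a test-dependency bump (kubernetes client 4.0.0 → 5.0.0). If it helps to confirm on Travis that the new pin actually resolved before the suite runs, a small sanity check along these lines could be added to the CI script (purely illustrative; the assertion and message are not part of the repository):

```python
# Verify that the pinned kubernetes client is the one installed in the CI
# environment before the test suite starts.
import pkg_resources

version = pkg_resources.get_distribution("kubernetes").version
assert version.startswith("5."), "unexpected kubernetes client version: %s" % version
print("kubernetes client", version)
```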
gh_patches_debug_1492 | rasdani/github-patches | git_diff | wright-group__WrightTools-590 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Change __version__ to match PEP 440
Specifically, when a branch is specified, the version string should use a plus sign instead of a minus sign.
https://www.python.org/dev/peps/pep-0440/#local-version-identifiers
https://github.com/wright-group/WrightTools/blob/490a4a3d6fb6f016e7033d661b553b72c2d86fcb/WrightTools/__version__.py#L33
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `WrightTools/__version__.py`
Content:
```
1 """Define WrightTools version."""
2
3
4 # --- import --------------------------------------------------------------------------------------
5
6
7 import os
8
9
10 # ---- define -------------------------------------------------------------------------------------
11
12
13 here = os.path.abspath(os.path.dirname(__file__))
14
15
16 __all__ = ['__version__', '__branch__']
17
18
19 # --- version -------------------------------------------------------------------------------------
20
21
22 # read from VERSION file
23 with open(os.path.join(os.path.dirname(here), 'VERSION')) as f:
24 __version__ = f.read().strip()
25
26
27 # add git branch, if appropriate
28 p = os.path.join(os.path.dirname(here), '.git', 'HEAD')
29 if os.path.isfile(p):
30 with open(p) as f:
31 __branch__ = f.readline().rstrip().split(r'/')[-1]
32 if __branch__ != 'master':
33 __version__ += '-' + __branch__
34 else:
35 __branch__ = None
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/WrightTools/__version__.py b/WrightTools/__version__.py
--- a/WrightTools/__version__.py
+++ b/WrightTools/__version__.py
@@ -30,6 +30,6 @@
with open(p) as f:
__branch__ = f.readline().rstrip().split(r'/')[-1]
if __branch__ != 'master':
- __version__ += '-' + __branch__
+ __version__ += '+' + __branch__
else:
__branch__ = None
| {"golden_diff": "diff --git a/WrightTools/__version__.py b/WrightTools/__version__.py\n--- a/WrightTools/__version__.py\n+++ b/WrightTools/__version__.py\n@@ -30,6 +30,6 @@\n with open(p) as f:\n __branch__ = f.readline().rstrip().split(r'/')[-1]\n if __branch__ != 'master':\n- __version__ += '-' + __branch__\n+ __version__ += '+' + __branch__\n else:\n __branch__ = None\n", "issue": "Change __version__ to match pep 440\nSpecifically, when a branch is specified, it should use a plus sign instead of minus\r\n\r\nhttps://www.python.org/dev/peps/pep-0440/#local-version-identifiers\r\n\r\nhttps://github.com/wright-group/WrightTools/blob/490a4a3d6fb6f016e7033d661b553b72c2d86fcb/WrightTools/__version__.py#L33\n", "before_files": [{"content": "\"\"\"Define WrightTools version.\"\"\"\n\n\n# --- import --------------------------------------------------------------------------------------\n\n\nimport os\n\n\n# ---- define -------------------------------------------------------------------------------------\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\n__all__ = ['__version__', '__branch__']\n\n\n# --- version -------------------------------------------------------------------------------------\n\n\n# read from VERSION file\nwith open(os.path.join(os.path.dirname(here), 'VERSION')) as f:\n __version__ = f.read().strip()\n\n\n# add git branch, if appropriate\np = os.path.join(os.path.dirname(here), '.git', 'HEAD')\nif os.path.isfile(p):\n with open(p) as f:\n __branch__ = f.readline().rstrip().split(r'/')[-1]\n if __branch__ != 'master':\n __version__ += '-' + __branch__\nelse:\n __branch__ = None\n", "path": "WrightTools/__version__.py"}], "after_files": [{"content": "\"\"\"Define WrightTools version.\"\"\"\n\n\n# --- import --------------------------------------------------------------------------------------\n\n\nimport os\n\n\n# ---- define -------------------------------------------------------------------------------------\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\n__all__ = ['__version__', '__branch__']\n\n\n# --- version -------------------------------------------------------------------------------------\n\n\n# read from VERSION file\nwith open(os.path.join(os.path.dirname(here), 'VERSION')) as f:\n __version__ = f.read().strip()\n\n\n# add git branch, if appropriate\np = os.path.join(os.path.dirname(here), '.git', 'HEAD')\nif os.path.isfile(p):\n with open(p) as f:\n __branch__ = f.readline().rstrip().split(r'/')[-1]\n if __branch__ != 'master':\n __version__ += '+' + __branch__\nelse:\n __branch__ = None\n", "path": "WrightTools/__version__.py"}]} | 621 | 117 |
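For context on the one-character change above: PEP 440 only recognises a local version label when it is introduced with "+", whereas a "-branch" suffix has no defined meaning and fails strict parsing. A small hedged demonstration using the third-party `packaging` library (shown only for illustration; it is not a WrightTools dependency):

```python
from packaging.version import InvalidVersion, Version

base, branch = "3.0.0", "develop"

# "+" starts a PEP 440 local version label, so this parses cleanly.
print(Version(f"{base}+{branch}"))  # -> 3.0.0+develop

# The hyphenated spelling the old code produced is not a valid version.
try:
    Version(f"{base}-{branch}")
except InvalidVersion as exc:
    print("rejected:", exc)
```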
gh_patches_debug_30334 | rasdani/github-patches | git_diff | Lightning-AI__pytorch-lightning-1360 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
WandbLogger cannot be used with 'ddp'
<!--
### Common bugs:
1. Tensorboard not showing in Jupyter-notebook see [issue 79](https://github.com/PyTorchLightning/pytorch-lightning/issues/79).
2. PyTorch 1.1.0 vs 1.2.0 support [see FAQ](https://github.com/PyTorchLightning/pytorch-lightning#faq)
-->
## 🐛 Bug
wandb modifies `init` such that a child process calling `init` returns None if the master process has already called `init`. This seems to cause a bug with 'ddp', and results in rank zero having `experiment = None`, which crashes the program.
### To Reproduce
Can be reproduced with the basic MNIST gpu template, simply add a WandbLogger and pass 'ddp' as the distributed backend.
```
-- Process 0 terminated with the following error:
Traceback (most recent call last):
File "/home/rmrao/anaconda3/lib/python3.6/site-packages/torch/multiprocessing/spawn.py", line 19, in _wrap
fn(i, *args)
File "/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py", line 331, in ddp_train
self.run_pretrain_routine(model)
File "/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py", line 757, in run_pretrain_routine
self.logger.log_hyperparams(ref_model.hparams)
File "/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/logging/base.py", line 14, in wrapped_fn
fn(self, *args, **kwargs)
File "/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/logging/wandb.py", line 79, in log_hyperparams
self.experiment.config.update(params)
AttributeError: 'NoneType' object has no attribute 'config'
```
This occurs with the latest wandb version and with pytorch-lightning 0.6.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pytorch_lightning/loggers/wandb.py`
Content:
```
1 r"""
2
3 .. _wandb:
4
5 WandbLogger
6 -------------
7 """
8 import os
9 from argparse import Namespace
10 from typing import Optional, List, Dict, Union, Any
11
12 import torch.nn as nn
13
14 try:
15 import wandb
16 from wandb.wandb_run import Run
17 except ImportError: # pragma: no-cover
18 raise ImportError('You want to use `wandb` logger which is not installed yet,' # pragma: no-cover
19 ' install it with `pip install wandb`.')
20
21 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only
22
23
24 class WandbLogger(LightningLoggerBase):
25 """
26 Logger for `W&B <https://www.wandb.com/>`_.
27
28 Args:
29 name (str): display name for the run.
30 save_dir (str): path where data is saved.
31 offline (bool): run offline (data can be streamed later to wandb servers).
32 id or version (str): sets the version, mainly used to resume a previous run.
33 anonymous (bool): enables or explicitly disables anonymous logging.
34 project (str): the name of the project to which this run will belong.
35 tags (list of str): tags associated with this run.
36 log_model (bool): save checkpoints in wandb dir to upload on W&B servers.
37
38 Example
39 --------
40 .. code-block:: python
41
42 from pytorch_lightning.loggers import WandbLogger
43 from pytorch_lightning import Trainer
44
45 wandb_logger = WandbLogger()
46 trainer = Trainer(logger=wandb_logger)
47 """
48
49 def __init__(self, name: Optional[str] = None, save_dir: Optional[str] = None,
50 offline: bool = False, id: Optional[str] = None, anonymous: bool = False,
51 version: Optional[str] = None, project: Optional[str] = None,
52 tags: Optional[List[str]] = None, log_model: bool = False,
53 experiment=None, entity=None):
54 super().__init__()
55 self._name = name
56 self._save_dir = save_dir
57 self._anonymous = 'allow' if anonymous else None
58 self._id = version or id
59 self._tags = tags
60 self._project = project
61 self._experiment = experiment
62 self._offline = offline
63 self._entity = entity
64 self._log_model = log_model
65
66 def __getstate__(self):
67 state = self.__dict__.copy()
68 # cannot be pickled
69 state['_experiment'] = None
70 # args needed to reload correct experiment
71 state['_id'] = self.experiment.id
72 return state
73
74 @property
75 def experiment(self) -> Run:
76 r"""
77
78 Actual wandb object. To use wandb features do the following.
79
80 Example::
81
82 self.logger.experiment.some_wandb_function()
83
84 """
85 if self._experiment is None:
86 if self._offline:
87 os.environ['WANDB_MODE'] = 'dryrun'
88 self._experiment = wandb.init(
89 name=self._name, dir=self._save_dir, project=self._project, anonymous=self._anonymous,
90 id=self._id, resume='allow', tags=self._tags, entity=self._entity)
91 # save checkpoints in wandb dir to upload on W&B servers
92 if self._log_model:
93 self.save_dir = self._experiment.dir
94 return self._experiment
95
96 def watch(self, model: nn.Module, log: str = 'gradients', log_freq: int = 100):
97 wandb.watch(model, log=log, log_freq=log_freq)
98
99 @rank_zero_only
100 def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:
101 params = self._convert_params(params)
102 self.experiment.config.update(params)
103
104 @rank_zero_only
105 def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:
106 if step is not None:
107 metrics['global_step'] = step
108 self.experiment.log(metrics)
109
110 @property
111 def name(self) -> str:
112 return self.experiment.project_name()
113
114 @property
115 def version(self) -> str:
116 return self.experiment.id
117
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pytorch_lightning/loggers/wandb.py b/pytorch_lightning/loggers/wandb.py
--- a/pytorch_lightning/loggers/wandb.py
+++ b/pytorch_lightning/loggers/wandb.py
@@ -65,10 +65,11 @@
def __getstate__(self):
state = self.__dict__.copy()
+ # args needed to reload correct experiment
+ state['_id'] = self._experiment.id if self._experiment is not None else None
+
# cannot be pickled
state['_experiment'] = None
- # args needed to reload correct experiment
- state['_id'] = self.experiment.id
return state
@property
@@ -87,7 +88,7 @@
os.environ['WANDB_MODE'] = 'dryrun'
self._experiment = wandb.init(
name=self._name, dir=self._save_dir, project=self._project, anonymous=self._anonymous,
- id=self._id, resume='allow', tags=self._tags, entity=self._entity)
+ reinit=True, id=self._id, resume='allow', tags=self._tags, entity=self._entity)
# save checkpoints in wandb dir to upload on W&B servers
if self._log_model:
self.save_dir = self._experiment.dir
@@ -109,8 +110,11 @@
@property
def name(self) -> str:
- return self.experiment.project_name()
+ # don't create an experiment if we don't have one
+ name = self._experiment.project_name() if self._experiment else None
+ return name
@property
def version(self) -> str:
- return self.experiment.id
+ # don't create an experiment if we don't have one
+ return self._experiment.id if self._experiment else None
| {"golden_diff": "diff --git a/pytorch_lightning/loggers/wandb.py b/pytorch_lightning/loggers/wandb.py\n--- a/pytorch_lightning/loggers/wandb.py\n+++ b/pytorch_lightning/loggers/wandb.py\n@@ -65,10 +65,11 @@\n \n def __getstate__(self):\n state = self.__dict__.copy()\n+ # args needed to reload correct experiment\n+ state['_id'] = self._experiment.id if self._experiment is not None else None\n+\n # cannot be pickled\n state['_experiment'] = None\n- # args needed to reload correct experiment\n- state['_id'] = self.experiment.id\n return state\n \n @property\n@@ -87,7 +88,7 @@\n os.environ['WANDB_MODE'] = 'dryrun'\n self._experiment = wandb.init(\n name=self._name, dir=self._save_dir, project=self._project, anonymous=self._anonymous,\n- id=self._id, resume='allow', tags=self._tags, entity=self._entity)\n+ reinit=True, id=self._id, resume='allow', tags=self._tags, entity=self._entity)\n # save checkpoints in wandb dir to upload on W&B servers\n if self._log_model:\n self.save_dir = self._experiment.dir\n@@ -109,8 +110,11 @@\n \n @property\n def name(self) -> str:\n- return self.experiment.project_name()\n+ # don't create an experiment if we don't have one\n+ name = self._experiment.project_name() if self._experiment else None\n+ return name\n \n @property\n def version(self) -> str:\n- return self.experiment.id\n+ # don't create an experiment if we don't have one\n+ return self._experiment.id if self._experiment else None\n", "issue": "WandbLogger cannot be used with 'ddp'\n<!-- \r\n### Common bugs:\r\n1. Tensorboard not showing in Jupyter-notebook see [issue 79](https://github.com/PyTorchLightning/pytorch-lightning/issues/79). \r\n2. PyTorch 1.1.0 vs 1.2.0 support [see FAQ](https://github.com/PyTorchLightning/pytorch-lightning#faq) \r\n-->\r\n\r\n## \ud83d\udc1b Bug\r\n\r\nwandb modifies `init` such that a child process calling init returns None if the master process has called init. This seems to cause a bug with ddp, and results in rank zero having experiment = None, which crashes the program.\r\n\r\n### To Reproduce\r\n\r\nCan be reproduced with the basic MNIST gpu template, simply add a WandbLogger and pass 'ddp' as the distributed backend.\r\n```\r\n-- Process 0 terminated with the following error:\r\nTraceback (most recent call last):\r\n File \"/home/rmrao/anaconda3/lib/python3.6/site-packages/torch/multiprocessing/spawn.py\", line 19, in _wrap\r\n fn(i, *args)\r\n File \"/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py\", line 331, in ddp_train\r\n self.run_pretrain_routine(model)\r\n File \"/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py\", line 757, in run_pretrain_routine\r\n self.logger.log_hyperparams(ref_model.hparams)\r\n File \"/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/logging/base.py\", line 14, in wrapped_fn\r\n fn(self, *args, **kwargs)\r\n File \"/home/rmrao/anaconda3/lib/python3.6/site-packages/pytorch_lightning/logging/wandb.py\", line 79, in log_hyperparams\r\n self.experiment.config.update(params)\r\nAttributeError: 'NoneType' object has no attribute 'config'\r\n```\r\n\r\nThis occurs with the latest wandb version and with pytorch-lightning 0.6.\r\n\n", "before_files": [{"content": "r\"\"\"\n\n.. 
_wandb:\n\nWandbLogger\n-------------\n\"\"\"\nimport os\nfrom argparse import Namespace\nfrom typing import Optional, List, Dict, Union, Any\n\nimport torch.nn as nn\n\ntry:\n import wandb\n from wandb.wandb_run import Run\nexcept ImportError: # pragma: no-cover\n raise ImportError('You want to use `wandb` logger which is not installed yet,' # pragma: no-cover\n ' install it with `pip install wandb`.')\n\nfrom pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only\n\n\nclass WandbLogger(LightningLoggerBase):\n \"\"\"\n Logger for `W&B <https://www.wandb.com/>`_.\n\n Args:\n name (str): display name for the run.\n save_dir (str): path where data is saved.\n offline (bool): run offline (data can be streamed later to wandb servers).\n id or version (str): sets the version, mainly used to resume a previous run.\n anonymous (bool): enables or explicitly disables anonymous logging.\n project (str): the name of the project to which this run will belong.\n tags (list of str): tags associated with this run.\n log_model (bool): save checkpoints in wandb dir to upload on W&B servers.\n\n Example\n --------\n .. code-block:: python\n\n from pytorch_lightning.loggers import WandbLogger\n from pytorch_lightning import Trainer\n\n wandb_logger = WandbLogger()\n trainer = Trainer(logger=wandb_logger)\n \"\"\"\n\n def __init__(self, name: Optional[str] = None, save_dir: Optional[str] = None,\n offline: bool = False, id: Optional[str] = None, anonymous: bool = False,\n version: Optional[str] = None, project: Optional[str] = None,\n tags: Optional[List[str]] = None, log_model: bool = False,\n experiment=None, entity=None):\n super().__init__()\n self._name = name\n self._save_dir = save_dir\n self._anonymous = 'allow' if anonymous else None\n self._id = version or id\n self._tags = tags\n self._project = project\n self._experiment = experiment\n self._offline = offline\n self._entity = entity\n self._log_model = log_model\n\n def __getstate__(self):\n state = self.__dict__.copy()\n # cannot be pickled\n state['_experiment'] = None\n # args needed to reload correct experiment\n state['_id'] = self.experiment.id\n return state\n\n @property\n def experiment(self) -> Run:\n r\"\"\"\n\n Actual wandb object. To use wandb features do the following.\n\n Example::\n\n self.logger.experiment.some_wandb_function()\n\n \"\"\"\n if self._experiment is None:\n if self._offline:\n os.environ['WANDB_MODE'] = 'dryrun'\n self._experiment = wandb.init(\n name=self._name, dir=self._save_dir, project=self._project, anonymous=self._anonymous,\n id=self._id, resume='allow', tags=self._tags, entity=self._entity)\n # save checkpoints in wandb dir to upload on W&B servers\n if self._log_model:\n self.save_dir = self._experiment.dir\n return self._experiment\n\n def watch(self, model: nn.Module, log: str = 'gradients', log_freq: int = 100):\n wandb.watch(model, log=log, log_freq=log_freq)\n\n @rank_zero_only\n def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:\n params = self._convert_params(params)\n self.experiment.config.update(params)\n\n @rank_zero_only\n def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:\n if step is not None:\n metrics['global_step'] = step\n self.experiment.log(metrics)\n\n @property\n def name(self) -> str:\n return self.experiment.project_name()\n\n @property\n def version(self) -> str:\n return self.experiment.id\n", "path": "pytorch_lightning/loggers/wandb.py"}], "after_files": [{"content": "r\"\"\"\n\n.. 
_wandb:\n\nWandbLogger\n-------------\n\"\"\"\nimport os\nfrom argparse import Namespace\nfrom typing import Optional, List, Dict, Union, Any\n\nimport torch.nn as nn\n\ntry:\n import wandb\n from wandb.wandb_run import Run\nexcept ImportError: # pragma: no-cover\n raise ImportError('You want to use `wandb` logger which is not installed yet,' # pragma: no-cover\n ' install it with `pip install wandb`.')\n\nfrom pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only\n\n\nclass WandbLogger(LightningLoggerBase):\n \"\"\"\n Logger for `W&B <https://www.wandb.com/>`_.\n\n Args:\n name (str): display name for the run.\n save_dir (str): path where data is saved.\n offline (bool): run offline (data can be streamed later to wandb servers).\n id or version (str): sets the version, mainly used to resume a previous run.\n anonymous (bool): enables or explicitly disables anonymous logging.\n project (str): the name of the project to which this run will belong.\n tags (list of str): tags associated with this run.\n log_model (bool): save checkpoints in wandb dir to upload on W&B servers.\n\n Example\n --------\n .. code-block:: python\n\n from pytorch_lightning.loggers import WandbLogger\n from pytorch_lightning import Trainer\n\n wandb_logger = WandbLogger()\n trainer = Trainer(logger=wandb_logger)\n \"\"\"\n\n def __init__(self, name: Optional[str] = None, save_dir: Optional[str] = None,\n offline: bool = False, id: Optional[str] = None, anonymous: bool = False,\n version: Optional[str] = None, project: Optional[str] = None,\n tags: Optional[List[str]] = None, log_model: bool = False,\n experiment=None, entity=None):\n super().__init__()\n self._name = name\n self._save_dir = save_dir\n self._anonymous = 'allow' if anonymous else None\n self._id = version or id\n self._tags = tags\n self._project = project\n self._experiment = experiment\n self._offline = offline\n self._entity = entity\n self._log_model = log_model\n\n def __getstate__(self):\n state = self.__dict__.copy()\n # args needed to reload correct experiment\n state['_id'] = self._experiment.id if self._experiment is not None else None\n\n # cannot be pickled\n state['_experiment'] = None\n return state\n\n @property\n def experiment(self) -> Run:\n r\"\"\"\n\n Actual wandb object. 
To use wandb features do the following.\n\n Example::\n\n self.logger.experiment.some_wandb_function()\n\n \"\"\"\n if self._experiment is None:\n if self._offline:\n os.environ['WANDB_MODE'] = 'dryrun'\n self._experiment = wandb.init(\n name=self._name, dir=self._save_dir, project=self._project, anonymous=self._anonymous,\n reinit=True, id=self._id, resume='allow', tags=self._tags, entity=self._entity)\n # save checkpoints in wandb dir to upload on W&B servers\n if self._log_model:\n self.save_dir = self._experiment.dir\n return self._experiment\n\n def watch(self, model: nn.Module, log: str = 'gradients', log_freq: int = 100):\n wandb.watch(model, log=log, log_freq=log_freq)\n\n @rank_zero_only\n def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:\n params = self._convert_params(params)\n self.experiment.config.update(params)\n\n @rank_zero_only\n def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:\n if step is not None:\n metrics['global_step'] = step\n self.experiment.log(metrics)\n\n @property\n def name(self) -> str:\n # don't create an experiment if we don't have one\n name = self._experiment.project_name() if self._experiment else None\n return name\n\n @property\n def version(self) -> str:\n # don't create an experiment if we don't have one\n return self._experiment.id if self._experiment else None\n", "path": "pytorch_lightning/loggers/wandb.py"}]} | 1,917 | 419 |
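Condensed into a hedged, standalone sketch (the class and attribute names below are illustrative, not the actual logger), the patch relies on two behaviours: pass `reinit=True` so a spawned rank can start its own run even though the master already called `wandb.init`, and never force-create an experiment just to answer name/version queries:

```python
import wandb


class LazyWandb:
    """Toy logger that only creates a wandb run when it is really needed."""

    def __init__(self, project=None):
        self._project = project
        self._experiment = None

    @property
    def experiment(self):
        if self._experiment is None:
            # reinit=True lets a child process get a real Run object instead
            # of None when the parent has already initialised wandb.
            self._experiment = wandb.init(project=self._project,
                                          reinit=True, resume="allow")
        return self._experiment

    @property
    def version(self):
        # Don't spin up a run just to report an id; callers must tolerate
        # None here.
        return self._experiment.id if self._experiment else None
```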
gh_patches_debug_19325 | rasdani/github-patches | git_diff | qtile__qtile-2290 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
WidgetBox does not fully re-expand on click
# Issue description
On first click, WidgetBox expands properly, but clicking a second time does not hide all widgets.
Specifically, in my configuration (below), the Wallpaper is shown/hidden properly, but the Systray does not hide again.
# Qtile version
qtile-0.17.1.dev82+geb8722ec
# Stack traces
no errors in qtile.log
# Configuration
```
widget.WidgetBox(widgets=[
widget.Wallpaper(directory='~/Downloads/wallpaper/', wallpaper_command=None, random_selection=True, label='\u0394'),
widget.Systray(padding=1, icon_size=22),
], close_button_location='right',
),
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/widgetbox.py`
Content:
```
1 # Copyright (c) 2020 elParaguayo
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a copy
4 # of this software and associated documentation files (the "Software"), to deal
5 # in the Software without restriction, including without limitation the rights
6 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
7 # copies of the Software, and to permit persons to whom the Software is
8 # furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
18 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
19 # SOFTWARE.
20
21 from collections import namedtuple
22
23 from libqtile import bar
24 from libqtile.log_utils import logger
25 from libqtile.widget import base
26
27 BoxedWidget = namedtuple("BoxedWidget", ["widget", "draw"])
28
29
30 def _no_draw(*args, **kwargs):
31 pass
32
33
34 class WidgetBox(base._Widget):
35 """A widget to declutter your bar.
36
37 WidgetBox is a widget that hides widgets by default but shows them when
38 the box is opened.
39
40 Widgets that are hidden will still update etc. as if they were on the main
41 bar.
42
43 Button clicks are passed to widgets when they are visible so callbacks will
44 work.
45
46 Widgets in the box also remain accessible via command interfaces.
47
48 Widgets can only be added to the box via the configuration file. The widget
49 is configured by adding widgets to the "widgets" parameter as follows::
50
51 widget.WidgetBox(widgets=[
52 widget.TextBox(text="This widget is in the box"),
53 widget.Memory()
54 ]
55 ),
56 """
57 orientations = base.ORIENTATION_HORIZONTAL
58 defaults = [
59 (
60 "font",
61 "sans",
62 "Text font"
63 ),
64 (
65 "fontsize",
66 None,
67 "Font pixel size. Calculated if None."
68 ),
69 (
70 "fontshadow",
71 None,
72 "font shadow color, default is None(no shadow)"
73 ),
74 (
75 "foreground",
76 "#ffffff",
77 "Foreground colour."
78 ),
79 (
80 "close_button_location",
81 "left",
82 "Location of close button when box open ('left' or 'right')"
83 ),
84 (
85 "text_closed",
86 "[<]",
87 "Text when box is closed"
88 ),
89 (
90 "text_open",
91 "[>]",
92 "Text when box is open"
93 ),
94 ]
95
96 def __init__(self, widgets=list(), **config):
97 base._Widget.__init__(self, bar.CALCULATED, **config)
98 self.add_defaults(WidgetBox.defaults)
99 self.box_is_open = False
100 self._widgets = widgets
101 self.add_callbacks({"Button1": self.cmd_toggle})
102
103 if self.close_button_location not in ["left", "right"]:
104 val = self.close_button_location
105 msg = "Invalid value for 'close_button_location': {}".format(val)
106 logger.warning(msg)
107 self.close_button_location = "left"
108
109 def _configure(self, qtile, bar):
110 base._Widget._configure(self, qtile, bar)
111
112 self.layout = self.drawer.textlayout(
113 self.text_closed,
114 self.foreground,
115 self.font,
116 self.fontsize,
117 self.fontshadow,
118 markup=False,
119 )
120
121 for idx, w in enumerate(self._widgets):
122 if w.configured:
123 w = w.create_mirror()
124 self._widgets[idx] = w
125 self.qtile.register_widget(w)
126 w._configure(self.qtile, self.bar)
127
128 # In case the widget is mirrored, we need to draw it once so the
129 # mirror can copy the surface but draw it off screen
130 w.offsetx = self.bar.width
131 self.qtile.call_soon(w.draw)
132
133 # We need to stop hidden widgets from drawing while hidden
134 # (e.g. draw could be triggered by a timer) so we take a reference to
135 # the widget's drawer.draw method
136 self.widgets = [BoxedWidget(w, w.drawer.draw) for w in self._widgets]
137
138 # # Overwrite the current drawer.draw method with a no-op
139 for w in self.widgets:
140 w.widget.drawer.draw = _no_draw
141
142 def calculate_length(self):
143 return self.layout.width
144
145 def set_box_label(self):
146 self.layout.text = (self.text_open if self.box_is_open
147 else self.text_closed)
148
149 def toggle_widgets(self):
150 for item in self.widgets:
151 try:
152 self.bar.widgets.remove(item.widget)
153 # Override drawer.drawer with a no-op
154 item.widget.drawer.draw = _no_draw
155 except ValueError:
156 continue
157
158 index = self.bar.widgets.index(self)
159
160 if self.close_button_location == "left":
161 index += 1
162
163 if self.box_is_open:
164
165 # Need to reverse list as widgets get added in front of eachother.
166 for item in self.widgets[::-1]:
167 # Restore the original drawer.draw method
168 item.widget.drawer.draw = item.draw
169 self.bar.widgets.insert(index, item.widget)
170
171 def draw(self):
172 self.drawer.clear(self.background or self.bar.background)
173
174 self.layout.draw(0,
175 int(self.bar.height / 2.0 -
176 self.layout.height / 2.0) + 1)
177
178 self.drawer.draw(offsetx=self.offsetx, width=self.width)
179
180 def button_press(self, x, y, button):
181 name = "Button{}".format(button)
182 if name in self.mouse_callbacks:
183 self.mouse_callbacks[name]()
184
185 def cmd_toggle(self):
186 """Toggle box state"""
187 self.box_is_open = not self.box_is_open
188 self.toggle_widgets()
189 self.set_box_label()
190 self.bar.draw()
191
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libqtile/widget/widgetbox.py b/libqtile/widget/widgetbox.py
--- a/libqtile/widget/widgetbox.py
+++ b/libqtile/widget/widgetbox.py
@@ -22,7 +22,7 @@
from libqtile import bar
from libqtile.log_utils import logger
-from libqtile.widget import base
+from libqtile.widget import Systray, base
BoxedWidget = namedtuple("BoxedWidget", ["widget", "draw"])
@@ -152,6 +152,14 @@
self.bar.widgets.remove(item.widget)
# Override drawer.drawer with a no-op
item.widget.drawer.draw = _no_draw
+
+ # Systray widget needs some additional steps to hide as the icons
+ # are separate _Window instances.
+ # Systray unhides icons when it draws so we only need to hide them.
+ if isinstance(item.widget, Systray):
+ for icon in item.widget.icons.values():
+ icon.hide()
+
except ValueError:
continue
| {"golden_diff": "diff --git a/libqtile/widget/widgetbox.py b/libqtile/widget/widgetbox.py\n--- a/libqtile/widget/widgetbox.py\n+++ b/libqtile/widget/widgetbox.py\n@@ -22,7 +22,7 @@\n \n from libqtile import bar\n from libqtile.log_utils import logger\n-from libqtile.widget import base\n+from libqtile.widget import Systray, base\n \n BoxedWidget = namedtuple(\"BoxedWidget\", [\"widget\", \"draw\"])\n \n@@ -152,6 +152,14 @@\n self.bar.widgets.remove(item.widget)\n # Override drawer.drawer with a no-op\n item.widget.drawer.draw = _no_draw\n+\n+ # Systray widget needs some additional steps to hide as the icons\n+ # are separate _Window instances.\n+ # Systray unhides icons when it draws so we only need to hide them.\n+ if isinstance(item.widget, Systray):\n+ for icon in item.widget.icons.values():\n+ icon.hide()\n+\n except ValueError:\n continue\n", "issue": "WidgetBox does not fully re-expand on click\n# Issue description\r\n\r\nOn first click, WidgetBox expands properly, but clicking a second time does not hide all widgets.\r\n\r\nSpecifically, in my configuration (below), the Wallpaper is shown/hidden properly, but the Systray does not hide again.\r\n\r\n# Qtile version\r\n\r\nqtile-0.17.1.dev82+geb8722ec\r\n\r\n# Stack traces\r\n\r\nno errors in qtile.log\r\n\r\n# Configuration\r\n\r\n```\r\n widget.WidgetBox(widgets=[\r\n widget.Wallpaper(directory='~/Downloads/wallpaper/', wallpaper_command=None, random_selection=True, label='\\u0394'),\r\n widget.Systray(padding=1, icon_size=22),\r\n ], close_button_location='right',\r\n ),\r\n```\r\n\r\n\n", "before_files": [{"content": "# Copyright (c) 2020 elParaguayo\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nfrom collections import namedtuple\n\nfrom libqtile import bar\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\nBoxedWidget = namedtuple(\"BoxedWidget\", [\"widget\", \"draw\"])\n\n\ndef _no_draw(*args, **kwargs):\n pass\n\n\nclass WidgetBox(base._Widget):\n \"\"\"A widget to declutter your bar.\n\n WidgetBox is a widget that hides widgets by default but shows them when\n the box is opened.\n\n Widgets that are hidden will still update etc. as if they were on the main\n bar.\n\n Button clicks are passed to widgets when they are visible so callbacks will\n work.\n\n Widgets in the box also remain accessible via command interfaces.\n\n Widgets can only be added to the box via the configuration file. 
The widget\n is configured by adding widgets to the \"widgets\" parameter as follows::\n\n widget.WidgetBox(widgets=[\n widget.TextBox(text=\"This widget is in the box\"),\n widget.Memory()\n ]\n ),\n \"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\n \"font\",\n \"sans\",\n \"Text font\"\n ),\n (\n \"fontsize\",\n None,\n \"Font pixel size. Calculated if None.\"\n ),\n (\n \"fontshadow\",\n None,\n \"font shadow color, default is None(no shadow)\"\n ),\n (\n \"foreground\",\n \"#ffffff\",\n \"Foreground colour.\"\n ),\n (\n \"close_button_location\",\n \"left\",\n \"Location of close button when box open ('left' or 'right')\"\n ),\n (\n \"text_closed\",\n \"[<]\",\n \"Text when box is closed\"\n ),\n (\n \"text_open\",\n \"[>]\",\n \"Text when box is open\"\n ),\n ]\n\n def __init__(self, widgets=list(), **config):\n base._Widget.__init__(self, bar.CALCULATED, **config)\n self.add_defaults(WidgetBox.defaults)\n self.box_is_open = False\n self._widgets = widgets\n self.add_callbacks({\"Button1\": self.cmd_toggle})\n\n if self.close_button_location not in [\"left\", \"right\"]:\n val = self.close_button_location\n msg = \"Invalid value for 'close_button_location': {}\".format(val)\n logger.warning(msg)\n self.close_button_location = \"left\"\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n\n self.layout = self.drawer.textlayout(\n self.text_closed,\n self.foreground,\n self.font,\n self.fontsize,\n self.fontshadow,\n markup=False,\n )\n\n for idx, w in enumerate(self._widgets):\n if w.configured:\n w = w.create_mirror()\n self._widgets[idx] = w\n self.qtile.register_widget(w)\n w._configure(self.qtile, self.bar)\n\n # In case the widget is mirrored, we need to draw it once so the\n # mirror can copy the surface but draw it off screen\n w.offsetx = self.bar.width\n self.qtile.call_soon(w.draw)\n\n # We need to stop hidden widgets from drawing while hidden\n # (e.g. 
draw could be triggered by a timer) so we take a reference to\n # the widget's drawer.draw method\n self.widgets = [BoxedWidget(w, w.drawer.draw) for w in self._widgets]\n\n # # Overwrite the current drawer.draw method with a no-op\n for w in self.widgets:\n w.widget.drawer.draw = _no_draw\n\n def calculate_length(self):\n return self.layout.width\n\n def set_box_label(self):\n self.layout.text = (self.text_open if self.box_is_open\n else self.text_closed)\n\n def toggle_widgets(self):\n for item in self.widgets:\n try:\n self.bar.widgets.remove(item.widget)\n # Override drawer.drawer with a no-op\n item.widget.drawer.draw = _no_draw\n except ValueError:\n continue\n\n index = self.bar.widgets.index(self)\n\n if self.close_button_location == \"left\":\n index += 1\n\n if self.box_is_open:\n\n # Need to reverse list as widgets get added in front of eachother.\n for item in self.widgets[::-1]:\n # Restore the original drawer.draw method\n item.widget.drawer.draw = item.draw\n self.bar.widgets.insert(index, item.widget)\n\n def draw(self):\n self.drawer.clear(self.background or self.bar.background)\n\n self.layout.draw(0,\n int(self.bar.height / 2.0 -\n self.layout.height / 2.0) + 1)\n\n self.drawer.draw(offsetx=self.offsetx, width=self.width)\n\n def button_press(self, x, y, button):\n name = \"Button{}\".format(button)\n if name in self.mouse_callbacks:\n self.mouse_callbacks[name]()\n\n def cmd_toggle(self):\n \"\"\"Toggle box state\"\"\"\n self.box_is_open = not self.box_is_open\n self.toggle_widgets()\n self.set_box_label()\n self.bar.draw()\n", "path": "libqtile/widget/widgetbox.py"}], "after_files": [{"content": "# Copyright (c) 2020 elParaguayo\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nfrom collections import namedtuple\n\nfrom libqtile import bar\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import Systray, base\n\nBoxedWidget = namedtuple(\"BoxedWidget\", [\"widget\", \"draw\"])\n\n\ndef _no_draw(*args, **kwargs):\n pass\n\n\nclass WidgetBox(base._Widget):\n \"\"\"A widget to declutter your bar.\n\n WidgetBox is a widget that hides widgets by default but shows them when\n the box is opened.\n\n Widgets that are hidden will still update etc. as if they were on the main\n bar.\n\n Button clicks are passed to widgets when they are visible so callbacks will\n work.\n\n Widgets in the box also remain accessible via command interfaces.\n\n Widgets can only be added to the box via the configuration file. 
The widget\n is configured by adding widgets to the \"widgets\" parameter as follows::\n\n widget.WidgetBox(widgets=[\n widget.TextBox(text=\"This widget is in the box\"),\n widget.Memory()\n ]\n ),\n \"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\n \"font\",\n \"sans\",\n \"Text font\"\n ),\n (\n \"fontsize\",\n None,\n \"Font pixel size. Calculated if None.\"\n ),\n (\n \"fontshadow\",\n None,\n \"font shadow color, default is None(no shadow)\"\n ),\n (\n \"foreground\",\n \"#ffffff\",\n \"Foreground colour.\"\n ),\n (\n \"close_button_location\",\n \"left\",\n \"Location of close button when box open ('left' or 'right')\"\n ),\n (\n \"text_closed\",\n \"[<]\",\n \"Text when box is closed\"\n ),\n (\n \"text_open\",\n \"[>]\",\n \"Text when box is open\"\n ),\n ]\n\n def __init__(self, widgets=list(), **config):\n base._Widget.__init__(self, bar.CALCULATED, **config)\n self.add_defaults(WidgetBox.defaults)\n self.box_is_open = False\n self._widgets = widgets\n self.add_callbacks({\"Button1\": self.cmd_toggle})\n\n if self.close_button_location not in [\"left\", \"right\"]:\n val = self.close_button_location\n msg = \"Invalid value for 'close_button_location': {}\".format(val)\n logger.warning(msg)\n self.close_button_location = \"left\"\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n\n self.layout = self.drawer.textlayout(\n self.text_closed,\n self.foreground,\n self.font,\n self.fontsize,\n self.fontshadow,\n markup=False,\n )\n\n for idx, w in enumerate(self._widgets):\n if w.configured:\n w = w.create_mirror()\n self._widgets[idx] = w\n self.qtile.register_widget(w)\n w._configure(self.qtile, self.bar)\n\n # In case the widget is mirrored, we need to draw it once so the\n # mirror can copy the surface but draw it off screen\n w.offsetx = self.bar.width\n self.qtile.call_soon(w.draw)\n\n # We need to stop hidden widgets from drawing while hidden\n # (e.g. 
draw could be triggered by a timer) so we take a reference to\n # the widget's drawer.draw method\n self.widgets = [BoxedWidget(w, w.drawer.draw) for w in self._widgets]\n\n # # Overwrite the current drawer.draw method with a no-op\n for w in self.widgets:\n w.widget.drawer.draw = _no_draw\n\n def calculate_length(self):\n return self.layout.width\n\n def set_box_label(self):\n self.layout.text = (self.text_open if self.box_is_open\n else self.text_closed)\n\n def toggle_widgets(self):\n for item in self.widgets:\n try:\n self.bar.widgets.remove(item.widget)\n # Override drawer.drawer with a no-op\n item.widget.drawer.draw = _no_draw\n\n # Systray widget needs some additional steps to hide as the icons\n # are separate _Window instances.\n # Systray unhides icons when it draws so we only need to hide them.\n if isinstance(item.widget, Systray):\n for icon in item.widget.icons.values():\n icon.hide()\n\n except ValueError:\n continue\n\n index = self.bar.widgets.index(self)\n\n if self.close_button_location == \"left\":\n index += 1\n\n if self.box_is_open:\n\n # Need to reverse list as widgets get added in front of eachother.\n for item in self.widgets[::-1]:\n # Restore the original drawer.draw method\n item.widget.drawer.draw = item.draw\n self.bar.widgets.insert(index, item.widget)\n\n def draw(self):\n self.drawer.clear(self.background or self.bar.background)\n\n self.layout.draw(0,\n int(self.bar.height / 2.0 -\n self.layout.height / 2.0) + 1)\n\n self.drawer.draw(offsetx=self.offsetx, width=self.width)\n\n def button_press(self, x, y, button):\n name = \"Button{}\".format(button)\n if name in self.mouse_callbacks:\n self.mouse_callbacks[name]()\n\n def cmd_toggle(self):\n \"\"\"Toggle box state\"\"\"\n self.box_is_open = not self.box_is_open\n self.toggle_widgets()\n self.set_box_label()\n self.bar.draw()\n", "path": "libqtile/widget/widgetbox.py"}]} | 2,232 | 231 |
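For readers skimming this record: the fix above works by explicitly hiding Systray icons, since tray icons are separate windows rather than surfaces drawn by the widget. A minimal sketch of a qtile bar configuration that exercises this code path is shown below; the widget choices and arguments are assumptions for illustration, not taken from the record.

```python
# Minimal sketch, assuming qtile is installed; values are illustrative only.
from libqtile import bar, widget

example_bar = bar.Bar(
    [
        widget.WidgetBox(
            # Tray icons live inside the box, so toggling it should also hide them.
            widgets=[widget.Systray(icon_size=22)],
            close_button_location="right",
        ),
        widget.Clock(),
    ],
    24,  # bar height in pixels
)
```

With the patch applied, toggling the box closed should remove both the mirrored widgets and the tray icons from view.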
gh_patches_debug_24345 | rasdani/github-patches | git_diff | cobbler__cobbler-3650 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Backport] Add support for cnames in dnsmasq
### Original feature issue
- Issue: #1643
- PR: #3640
### Target release
- [x] release33
- [ ] release32
- [ ] release30
### Reason
Stabilization of Cobbler 3.3.5
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cobbler/modules/managers/dnsmasq.py`
Content:
```
1 """
2 This is some of the code behind 'cobbler sync'.
3
4 Copyright 2006-2009, Red Hat, Inc and Others
5 Michael DeHaan <michael.dehaan AT gmail>
6 John Eckersberg <[email protected]>
7
8 This program is free software; you can redistribute it and/or modify
9 it under the terms of the GNU General Public License as published by
10 the Free Software Foundation; either version 2 of the License, or
11 (at your option) any later version.
12
13 This program is distributed in the hope that it will be useful,
14 but WITHOUT ANY WARRANTY; without even the implied warranty of
15 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 GNU General Public License for more details.
17
18 You should have received a copy of the GNU General Public License
19 along with this program; if not, write to the Free Software
20 Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
21 02110-1301 USA
22 """
23
24 import time
25
26 import cobbler.utils as utils
27 from cobbler.manager import ManagerModule
28
29 MANAGER = None
30
31
32 def register() -> str:
33 """
34 The mandatory Cobbler modules registration hook.
35
36 :return: Always "manage".
37 """
38 return "manage"
39
40
41 class _DnsmasqManager(ManagerModule):
42 """
43 Handles conversion of internal state to the tftpboot tree layout.
44 """
45
46 @staticmethod
47 def what() -> str:
48 """
49 This identifies the module.
50
51 :return: Will always return ``dnsmasq``.
52 """
53 return "dnsmasq"
54
55 def write_configs(self):
56 """
57 DHCP files are written when ``manage_dhcp`` is set in our settings.
58
59 :raises OSError
60 """
61
62 settings_file = "/etc/dnsmasq.conf"
63 template_file = "/etc/cobbler/dnsmasq.template"
64
65 try:
66 f2 = open(template_file, "r")
67 except Exception:
68 raise OSError("error writing template to file: %s" % template_file)
69 template_data = f2.read()
70 f2.close()
71
72 system_definitions = {}
73
74 # we used to just loop through each system, but now we must loop
75 # through each network interface of each system.
76
77 for system in self.systems:
78
79 if not system.is_management_supported(cidr_ok=False):
80 continue
81
82 profile = system.get_conceptual_parent()
83 distro = profile.get_conceptual_parent()
84 for interface in system.interfaces.values():
85
86 mac = interface.mac_address
87 ip = interface.ip_address
88 host = interface.dns_name
89 ipv6 = interface.ipv6_address
90
91 if not mac:
92 # can't write a DHCP entry for this system
93 continue
94
95 # In many reallife situations there is a need to control the IP address and hostname for a specific
96 # client when only the MAC address is available. In addition to that in some scenarios there is a need
97 # to explicitly label a host with the applicable architecture in order to correctly handle situations
98 # where we need something other than ``pxelinux.0``. So we always write a dhcp-host entry with as much
99 # info as possible to allow maximum control and flexibility within the dnsmasq config.
100
101 systxt = "dhcp-host=net:" + distro.arch.value.lower() + "," + mac
102
103 if host != "":
104 systxt += "," + host
105
106 if ip != "":
107 systxt += "," + ip
108 if ipv6 != "":
109 systxt += ",[%s]" % ipv6
110
111 systxt += "\n"
112
113 dhcp_tag = interface.dhcp_tag
114 if dhcp_tag == "":
115 dhcp_tag = "default"
116
117 if dhcp_tag not in system_definitions:
118 system_definitions[dhcp_tag] = ""
119 system_definitions[dhcp_tag] = system_definitions[dhcp_tag] + systxt
120
121 # We are now done with the looping through each interface of each system.
122
123 metadata = {
124 "insert_cobbler_system_definitions": system_definitions.get("default", ""),
125 "date": time.asctime(time.gmtime()),
126 "cobbler_server": self.settings.server,
127 "next_server_v4": self.settings.next_server_v4,
128 "next_server_v6": self.settings.next_server_v6,
129 }
130
131 # now add in other DHCP expansions that are not tagged with "default"
132 for x in list(system_definitions.keys()):
133 if x == "default":
134 continue
135 metadata["insert_cobbler_system_definitions_%s" % x] = system_definitions[x]
136
137 self.templar.render(template_data, metadata, settings_file)
138
139 def regen_ethers(self):
140 """
141 This function regenerates the ethers file. To get more information please read ``man ethers``, the format is
142 also in there described.
143 """
144 # dnsmasq knows how to read this database of MACs -> IPs, so we'll keep it up to date every time we add a
145 # system.
146 fh = open("/etc/ethers", "w+")
147 for system in self.systems:
148 if not system.is_management_supported(cidr_ok=False):
149 continue
150 for interface in system.interfaces.values():
151 mac = interface.mac_address
152 ip = interface.ip_address
153 if not mac:
154 # can't write this w/o a MAC address
155 continue
156 if ip is not None and ip != "":
157 fh.write(mac.upper() + "\t" + ip + "\n")
158 fh.close()
159
160 def regen_hosts(self):
161 """
162 This rewrites the hosts file and thus also rewrites the dns config.
163 """
164 # dnsmasq knows how to read this database for host info (other things may also make use of this later)
165 fh = open("/var/lib/cobbler/cobbler_hosts", "w+")
166 for system in self.systems:
167 if not system.is_management_supported(cidr_ok=False):
168 continue
169 for (_, interface) in system.interfaces.items():
170 mac = interface.mac_address
171 host = interface.dns_name
172 ip = interface.ip_address
173 ipv6 = interface.ipv6_address
174 if not mac:
175 continue
176 if host is not None and host != "" and ipv6 is not None and ipv6 != "":
177 fh.write(ipv6 + "\t" + host + "\n")
178 elif host is not None and host != "" and ip is not None and ip != "":
179 fh.write(ip + "\t" + host + "\n")
180 fh.close()
181
182 def restart_service(self):
183 """
184 This restarts the dhcp server and thus applied the newly written config files.
185 """
186 service_name = "dnsmasq"
187 if self.settings.restart_dhcp:
188 return_code_service_restart = utils.service_restart(service_name)
189 if return_code_service_restart != 0:
190 self.logger.error("%s service failed", service_name)
191 return return_code_service_restart
192
193
194 def get_manager(api):
195 """
196 Creates a manager object to manage a dnsmasq server.
197
198 :param api: The API to resolve all information with.
199 :return: The object generated from the class.
200 """
201 # Singleton used, therefore ignoring 'global'
202 global MANAGER # pylint: disable=global-statement
203
204 if not MANAGER:
205 MANAGER = _DnsmasqManager(api)
206 return MANAGER
207
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cobbler/modules/managers/dnsmasq.py b/cobbler/modules/managers/dnsmasq.py
--- a/cobbler/modules/managers/dnsmasq.py
+++ b/cobbler/modules/managers/dnsmasq.py
@@ -169,14 +169,21 @@
for (_, interface) in system.interfaces.items():
mac = interface.mac_address
host = interface.dns_name
+ cnames = " ".join(interface.cnames)
ip = interface.ip_address
ipv6 = interface.ipv6_address
if not mac:
continue
if host is not None and host != "" and ipv6 is not None and ipv6 != "":
- fh.write(ipv6 + "\t" + host + "\n")
+ if cnames:
+ fh.write(ipv6 + "\t" + host + ' ' + cnames + "\n")
+ else:
+ fh.write(ipv6 + "\t" + host + "\n")
elif host is not None and host != "" and ip is not None and ip != "":
- fh.write(ip + "\t" + host + "\n")
+ if cnames:
+ fh.write(ip + "\t" + host + ' ' + cnames + "\n")
+ else:
+ fh.write(ip + "\t" + host + "\n")
fh.close()
def restart_service(self):
| {"golden_diff": "diff --git a/cobbler/modules/managers/dnsmasq.py b/cobbler/modules/managers/dnsmasq.py\n--- a/cobbler/modules/managers/dnsmasq.py\n+++ b/cobbler/modules/managers/dnsmasq.py\n@@ -169,14 +169,21 @@\n for (_, interface) in system.interfaces.items():\n mac = interface.mac_address\n host = interface.dns_name\n+ cnames = \" \".join(interface.cnames)\n ip = interface.ip_address\n ipv6 = interface.ipv6_address\n if not mac:\n continue\n if host is not None and host != \"\" and ipv6 is not None and ipv6 != \"\":\n- fh.write(ipv6 + \"\\t\" + host + \"\\n\")\n+ if cnames:\n+ fh.write(ipv6 + \"\\t\" + host + ' ' + cnames + \"\\n\")\n+ else:\n+ fh.write(ipv6 + \"\\t\" + host + \"\\n\")\n elif host is not None and host != \"\" and ip is not None and ip != \"\":\n- fh.write(ip + \"\\t\" + host + \"\\n\")\n+ if cnames:\n+ fh.write(ip + \"\\t\" + host + ' ' + cnames + \"\\n\")\n+ else:\n+ fh.write(ip + \"\\t\" + host + \"\\n\")\n fh.close()\n \n def restart_service(self):\n", "issue": "[Backport] Add support for cnames in dnsmasq\n### Original feature issue\r\n\r\n- Issue: #1643\r\n- PR: #3640\r\n\r\n### Target release\r\n\r\n- [x] release33\r\n- [ ] release32\r\n- [ ] release30\r\n\r\n### Reason\r\n\r\nStabilization of Cobbler 3.3.5\r\n\n", "before_files": [{"content": "\"\"\"\nThis is some of the code behind 'cobbler sync'.\n\nCopyright 2006-2009, Red Hat, Inc and Others\nMichael DeHaan <michael.dehaan AT gmail>\nJohn Eckersberg <[email protected]>\n\nThis program is free software; you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation; either version 2 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program; if not, write to the Free Software\nFoundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA\n02110-1301 USA\n\"\"\"\n\nimport time\n\nimport cobbler.utils as utils\nfrom cobbler.manager import ManagerModule\n\nMANAGER = None\n\n\ndef register() -> str:\n \"\"\"\n The mandatory Cobbler modules registration hook.\n\n :return: Always \"manage\".\n \"\"\"\n return \"manage\"\n\n\nclass _DnsmasqManager(ManagerModule):\n \"\"\"\n Handles conversion of internal state to the tftpboot tree layout.\n \"\"\"\n\n @staticmethod\n def what() -> str:\n \"\"\"\n This identifies the module.\n\n :return: Will always return ``dnsmasq``.\n \"\"\"\n return \"dnsmasq\"\n\n def write_configs(self):\n \"\"\"\n DHCP files are written when ``manage_dhcp`` is set in our settings.\n\n :raises OSError\n \"\"\"\n\n settings_file = \"/etc/dnsmasq.conf\"\n template_file = \"/etc/cobbler/dnsmasq.template\"\n\n try:\n f2 = open(template_file, \"r\")\n except Exception:\n raise OSError(\"error writing template to file: %s\" % template_file)\n template_data = f2.read()\n f2.close()\n\n system_definitions = {}\n\n # we used to just loop through each system, but now we must loop\n # through each network interface of each system.\n\n for system in self.systems:\n\n if not system.is_management_supported(cidr_ok=False):\n continue\n\n profile = system.get_conceptual_parent()\n distro = profile.get_conceptual_parent()\n for interface in system.interfaces.values():\n\n mac = interface.mac_address\n ip = interface.ip_address\n host = interface.dns_name\n ipv6 = interface.ipv6_address\n\n if not mac:\n # can't write a DHCP entry for this system\n continue\n\n # In many reallife situations there is a need to control the IP address and hostname for a specific\n # client when only the MAC address is available. In addition to that in some scenarios there is a need\n # to explicitly label a host with the applicable architecture in order to correctly handle situations\n # where we need something other than ``pxelinux.0``. So we always write a dhcp-host entry with as much\n # info as possible to allow maximum control and flexibility within the dnsmasq config.\n\n systxt = \"dhcp-host=net:\" + distro.arch.value.lower() + \",\" + mac\n\n if host != \"\":\n systxt += \",\" + host\n\n if ip != \"\":\n systxt += \",\" + ip\n if ipv6 != \"\":\n systxt += \",[%s]\" % ipv6\n\n systxt += \"\\n\"\n\n dhcp_tag = interface.dhcp_tag\n if dhcp_tag == \"\":\n dhcp_tag = \"default\"\n\n if dhcp_tag not in system_definitions:\n system_definitions[dhcp_tag] = \"\"\n system_definitions[dhcp_tag] = system_definitions[dhcp_tag] + systxt\n\n # We are now done with the looping through each interface of each system.\n\n metadata = {\n \"insert_cobbler_system_definitions\": system_definitions.get(\"default\", \"\"),\n \"date\": time.asctime(time.gmtime()),\n \"cobbler_server\": self.settings.server,\n \"next_server_v4\": self.settings.next_server_v4,\n \"next_server_v6\": self.settings.next_server_v6,\n }\n\n # now add in other DHCP expansions that are not tagged with \"default\"\n for x in list(system_definitions.keys()):\n if x == \"default\":\n continue\n metadata[\"insert_cobbler_system_definitions_%s\" % x] = system_definitions[x]\n\n self.templar.render(template_data, metadata, settings_file)\n\n def regen_ethers(self):\n \"\"\"\n This function regenerates the ethers file. 
To get more information please read ``man ethers``, the format is\n also in there described.\n \"\"\"\n # dnsmasq knows how to read this database of MACs -> IPs, so we'll keep it up to date every time we add a\n # system.\n fh = open(\"/etc/ethers\", \"w+\")\n for system in self.systems:\n if not system.is_management_supported(cidr_ok=False):\n continue\n for interface in system.interfaces.values():\n mac = interface.mac_address\n ip = interface.ip_address\n if not mac:\n # can't write this w/o a MAC address\n continue\n if ip is not None and ip != \"\":\n fh.write(mac.upper() + \"\\t\" + ip + \"\\n\")\n fh.close()\n\n def regen_hosts(self):\n \"\"\"\n This rewrites the hosts file and thus also rewrites the dns config.\n \"\"\"\n # dnsmasq knows how to read this database for host info (other things may also make use of this later)\n fh = open(\"/var/lib/cobbler/cobbler_hosts\", \"w+\")\n for system in self.systems:\n if not system.is_management_supported(cidr_ok=False):\n continue\n for (_, interface) in system.interfaces.items():\n mac = interface.mac_address\n host = interface.dns_name\n ip = interface.ip_address\n ipv6 = interface.ipv6_address\n if not mac:\n continue\n if host is not None and host != \"\" and ipv6 is not None and ipv6 != \"\":\n fh.write(ipv6 + \"\\t\" + host + \"\\n\")\n elif host is not None and host != \"\" and ip is not None and ip != \"\":\n fh.write(ip + \"\\t\" + host + \"\\n\")\n fh.close()\n\n def restart_service(self):\n \"\"\"\n This restarts the dhcp server and thus applied the newly written config files.\n \"\"\"\n service_name = \"dnsmasq\"\n if self.settings.restart_dhcp:\n return_code_service_restart = utils.service_restart(service_name)\n if return_code_service_restart != 0:\n self.logger.error(\"%s service failed\", service_name)\n return return_code_service_restart\n\n\ndef get_manager(api):\n \"\"\"\n Creates a manager object to manage a dnsmasq server.\n\n :param api: The API to resolve all information with.\n :return: The object generated from the class.\n \"\"\"\n # Singleton used, therefore ignoring 'global'\n global MANAGER # pylint: disable=global-statement\n\n if not MANAGER:\n MANAGER = _DnsmasqManager(api)\n return MANAGER\n", "path": "cobbler/modules/managers/dnsmasq.py"}], "after_files": [{"content": "\"\"\"\nThis is some of the code behind 'cobbler sync'.\n\nCopyright 2006-2009, Red Hat, Inc and Others\nMichael DeHaan <michael.dehaan AT gmail>\nJohn Eckersberg <[email protected]>\n\nThis program is free software; you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation; either version 2 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program; if not, write to the Free Software\nFoundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA\n02110-1301 USA\n\"\"\"\n\nimport time\n\nimport cobbler.utils as utils\nfrom cobbler.manager import ManagerModule\n\nMANAGER = None\n\n\ndef register() -> str:\n \"\"\"\n The mandatory Cobbler modules registration hook.\n\n :return: Always \"manage\".\n \"\"\"\n return \"manage\"\n\n\nclass _DnsmasqManager(ManagerModule):\n \"\"\"\n Handles conversion of internal state to the tftpboot tree layout.\n \"\"\"\n\n @staticmethod\n def what() -> str:\n \"\"\"\n This identifies the module.\n\n :return: Will always return ``dnsmasq``.\n \"\"\"\n return \"dnsmasq\"\n\n def write_configs(self):\n \"\"\"\n DHCP files are written when ``manage_dhcp`` is set in our settings.\n\n :raises OSError\n \"\"\"\n\n settings_file = \"/etc/dnsmasq.conf\"\n template_file = \"/etc/cobbler/dnsmasq.template\"\n\n try:\n f2 = open(template_file, \"r\")\n except Exception:\n raise OSError(\"error writing template to file: %s\" % template_file)\n template_data = f2.read()\n f2.close()\n\n system_definitions = {}\n\n # we used to just loop through each system, but now we must loop\n # through each network interface of each system.\n\n for system in self.systems:\n\n if not system.is_management_supported(cidr_ok=False):\n continue\n\n profile = system.get_conceptual_parent()\n distro = profile.get_conceptual_parent()\n for interface in system.interfaces.values():\n\n mac = interface.mac_address\n ip = interface.ip_address\n host = interface.dns_name\n ipv6 = interface.ipv6_address\n\n if not mac:\n # can't write a DHCP entry for this system\n continue\n\n # In many reallife situations there is a need to control the IP address and hostname for a specific\n # client when only the MAC address is available. In addition to that in some scenarios there is a need\n # to explicitly label a host with the applicable architecture in order to correctly handle situations\n # where we need something other than ``pxelinux.0``. So we always write a dhcp-host entry with as much\n # info as possible to allow maximum control and flexibility within the dnsmasq config.\n\n systxt = \"dhcp-host=net:\" + distro.arch.value.lower() + \",\" + mac\n\n if host != \"\":\n systxt += \",\" + host\n\n if ip != \"\":\n systxt += \",\" + ip\n if ipv6 != \"\":\n systxt += \",[%s]\" % ipv6\n\n systxt += \"\\n\"\n\n dhcp_tag = interface.dhcp_tag\n if dhcp_tag == \"\":\n dhcp_tag = \"default\"\n\n if dhcp_tag not in system_definitions:\n system_definitions[dhcp_tag] = \"\"\n system_definitions[dhcp_tag] = system_definitions[dhcp_tag] + systxt\n\n # We are now done with the looping through each interface of each system.\n\n metadata = {\n \"insert_cobbler_system_definitions\": system_definitions.get(\"default\", \"\"),\n \"date\": time.asctime(time.gmtime()),\n \"cobbler_server\": self.settings.server,\n \"next_server_v4\": self.settings.next_server_v4,\n \"next_server_v6\": self.settings.next_server_v6,\n }\n\n # now add in other DHCP expansions that are not tagged with \"default\"\n for x in list(system_definitions.keys()):\n if x == \"default\":\n continue\n metadata[\"insert_cobbler_system_definitions_%s\" % x] = system_definitions[x]\n\n self.templar.render(template_data, metadata, settings_file)\n\n def regen_ethers(self):\n \"\"\"\n This function regenerates the ethers file. 
To get more information please read ``man ethers``, the format is\n also in there described.\n \"\"\"\n # dnsmasq knows how to read this database of MACs -> IPs, so we'll keep it up to date every time we add a\n # system.\n fh = open(\"/etc/ethers\", \"w+\")\n for system in self.systems:\n if not system.is_management_supported(cidr_ok=False):\n continue\n for interface in system.interfaces.values():\n mac = interface.mac_address\n ip = interface.ip_address\n if not mac:\n # can't write this w/o a MAC address\n continue\n if ip is not None and ip != \"\":\n fh.write(mac.upper() + \"\\t\" + ip + \"\\n\")\n fh.close()\n\n def regen_hosts(self):\n \"\"\"\n This rewrites the hosts file and thus also rewrites the dns config.\n \"\"\"\n # dnsmasq knows how to read this database for host info (other things may also make use of this later)\n fh = open(\"/var/lib/cobbler/cobbler_hosts\", \"w+\")\n for system in self.systems:\n if not system.is_management_supported(cidr_ok=False):\n continue\n for (_, interface) in system.interfaces.items():\n mac = interface.mac_address\n host = interface.dns_name\n cnames = \" \".join(interface.cnames)\n ip = interface.ip_address\n ipv6 = interface.ipv6_address\n if not mac:\n continue\n if host is not None and host != \"\" and ipv6 is not None and ipv6 != \"\":\n if cnames:\n fh.write(ipv6 + \"\\t\" + host + ' ' + cnames + \"\\n\")\n else:\n fh.write(ipv6 + \"\\t\" + host + \"\\n\")\n elif host is not None and host != \"\" and ip is not None and ip != \"\":\n if cnames:\n fh.write(ip + \"\\t\" + host + ' ' + cnames + \"\\n\")\n else:\n fh.write(ip + \"\\t\" + host + \"\\n\")\n fh.close()\n\n def restart_service(self):\n \"\"\"\n This restarts the dhcp server and thus applied the newly written config files.\n \"\"\"\n service_name = \"dnsmasq\"\n if self.settings.restart_dhcp:\n return_code_service_restart = utils.service_restart(service_name)\n if return_code_service_restart != 0:\n self.logger.error(\"%s service failed\", service_name)\n return return_code_service_restart\n\n\ndef get_manager(api):\n \"\"\"\n Creates a manager object to manage a dnsmasq server.\n\n :param api: The API to resolve all information with.\n :return: The object generated from the class.\n \"\"\"\n # Singleton used, therefore ignoring 'global'\n global MANAGER # pylint: disable=global-statement\n\n if not MANAGER:\n MANAGER = _DnsmasqManager(api)\n return MANAGER\n", "path": "cobbler/modules/managers/dnsmasq.py"}]} | 2,477 | 316 |
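To make the behaviour added by the dnsmasq patch above concrete: when an interface carries CNAMEs, the rewritten regen_hosts() joins them with spaces after the DNS name on the same cobbler_hosts line. A small sketch of that string construction, with hypothetical values, follows.

```python
# Minimal sketch of the hosts-file line the patched regen_hosts() emits;
# all values are hypothetical stand-ins for interface attributes.
ip = "192.168.1.10"                                         # interface.ip_address
host = "node1.example.com"                                  # interface.dns_name
cnames = " ".join(["www.example.com", "mail.example.com"])  # interface.cnames

if cnames:
    line = ip + "\t" + host + " " + cnames + "\n"
else:
    line = ip + "\t" + host + "\n"

# -> "192.168.1.10\tnode1.example.com www.example.com mail.example.com\n"
```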
gh_patches_debug_3450 | rasdani/github-patches | git_diff | astronomer__astro-sdk-176 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use standard AWS environment variables
**Context**
At the moment, Astro 0.6.x uses a custom environment variable `AIRFLOW__ASTRO__CONN_AWS_DEFAULT` to define AWS credentials. However, there are standard [AWS environment variables to define credentials](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables).
**Acceptance criteria**
* Replace any occurrence of `AIRFLOW__ASTRO__CONN_AWS_DEFAULT` with `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/astro/utils/cloud_storage_creds.py`
Content:
```
1 import json
2 import os
3 from urllib import parse
4
5 from airflow.hooks.base import BaseHook
6
7 from astro.utils.dependencies import (
8 AwsBaseHook,
9 BotoSession,
10 GCSClient,
11 GCSHook,
12 google_service_account,
13 )
14
15
16 def parse_s3_env_var():
17 raw_data = (
18 os.environ["AIRFLOW__ASTRO__CONN_AWS_DEFAULT"]
19 .replace("%2F", "/")
20 .replace("aws://", "")
21 .replace("@", "")
22 .split(":")
23 )
24 return [parse.unquote(r) for r in raw_data]
25
26
27 def s3fs_creds(conn_id=None):
28 """Structure s3fs credentials from Airflow connection.
29 s3fs enables pandas to write to s3
30 """
31 if conn_id:
32 # The following line raises a friendly exception
33 BaseHook.get_connection(conn_id)
34 aws_hook = AwsBaseHook(conn_id, client_type="S3")
35 session = aws_hook.get_session()
36 else:
37 key, secret = parse_s3_env_var()
38 session = BotoSession(
39 aws_access_key_id=key,
40 aws_secret_access_key=secret,
41 )
42 return dict(client=session.client("s3"))
43
44
45 def gcs_client(conn_id=None):
46 """
47 get GCS credentials for storage.
48 """
49 if conn_id:
50 gcs_hook = GCSHook(conn_id)
51 client = gcs_hook.get_conn()
52 else:
53 client = GCSClient()
54
55 return dict(client=client)
56
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/astro/utils/cloud_storage_creds.py b/src/astro/utils/cloud_storage_creds.py
--- a/src/astro/utils/cloud_storage_creds.py
+++ b/src/astro/utils/cloud_storage_creds.py
@@ -14,14 +14,7 @@
def parse_s3_env_var():
- raw_data = (
- os.environ["AIRFLOW__ASTRO__CONN_AWS_DEFAULT"]
- .replace("%2F", "/")
- .replace("aws://", "")
- .replace("@", "")
- .split(":")
- )
- return [parse.unquote(r) for r in raw_data]
+ return os.environ["AWS_ACCESS_KEY_ID"], os.environ["AWS_SECRET_ACCESS_KEY"]
def s3fs_creds(conn_id=None):
| {"golden_diff": "diff --git a/src/astro/utils/cloud_storage_creds.py b/src/astro/utils/cloud_storage_creds.py\n--- a/src/astro/utils/cloud_storage_creds.py\n+++ b/src/astro/utils/cloud_storage_creds.py\n@@ -14,14 +14,7 @@\n \n \n def parse_s3_env_var():\n- raw_data = (\n- os.environ[\"AIRFLOW__ASTRO__CONN_AWS_DEFAULT\"]\n- .replace(\"%2F\", \"/\")\n- .replace(\"aws://\", \"\")\n- .replace(\"@\", \"\")\n- .split(\":\")\n- )\n- return [parse.unquote(r) for r in raw_data]\n+ return os.environ[\"AWS_ACCESS_KEY_ID\"], os.environ[\"AWS_SECRET_ACCESS_KEY\"]\n \n \n def s3fs_creds(conn_id=None):\n", "issue": "Use standard AWS environment variables\n**Context**\r\nAt the moment, Astro 0.6.x uses a custom environment variable `AIRFLOW__ASTRO__CONN_AWS_DEFAULT` to define AWS credentials. However, there are standard [AWS environment variables to define credentials](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables).\r\n\r\n**Acceptance criteria**\r\n* Replace any occurrence of `AIRFLOW__ASTRO__CONN_AWS_DEFAULT` by `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`\n", "before_files": [{"content": "import json\nimport os\nfrom urllib import parse\n\nfrom airflow.hooks.base import BaseHook\n\nfrom astro.utils.dependencies import (\n AwsBaseHook,\n BotoSession,\n GCSClient,\n GCSHook,\n google_service_account,\n)\n\n\ndef parse_s3_env_var():\n raw_data = (\n os.environ[\"AIRFLOW__ASTRO__CONN_AWS_DEFAULT\"]\n .replace(\"%2F\", \"/\")\n .replace(\"aws://\", \"\")\n .replace(\"@\", \"\")\n .split(\":\")\n )\n return [parse.unquote(r) for r in raw_data]\n\n\ndef s3fs_creds(conn_id=None):\n \"\"\"Structure s3fs credentials from Airflow connection.\n s3fs enables pandas to write to s3\n \"\"\"\n if conn_id:\n # The following line raises a friendly exception\n BaseHook.get_connection(conn_id)\n aws_hook = AwsBaseHook(conn_id, client_type=\"S3\")\n session = aws_hook.get_session()\n else:\n key, secret = parse_s3_env_var()\n session = BotoSession(\n aws_access_key_id=key,\n aws_secret_access_key=secret,\n )\n return dict(client=session.client(\"s3\"))\n\n\ndef gcs_client(conn_id=None):\n \"\"\"\n get GCS credentials for storage.\n \"\"\"\n if conn_id:\n gcs_hook = GCSHook(conn_id)\n client = gcs_hook.get_conn()\n else:\n client = GCSClient()\n\n return dict(client=client)\n", "path": "src/astro/utils/cloud_storage_creds.py"}], "after_files": [{"content": "import json\nimport os\nfrom urllib import parse\n\nfrom airflow.hooks.base import BaseHook\n\nfrom astro.utils.dependencies import (\n AwsBaseHook,\n BotoSession,\n GCSClient,\n GCSHook,\n google_service_account,\n)\n\n\ndef parse_s3_env_var():\n return os.environ[\"AWS_ACCESS_KEY_ID\"], os.environ[\"AWS_SECRET_ACCESS_KEY\"]\n\n\ndef s3fs_creds(conn_id=None):\n \"\"\"Structure s3fs credentials from Airflow connection.\n s3fs enables pandas to write to s3\n \"\"\"\n if conn_id:\n # The following line raises a friendly exception\n BaseHook.get_connection(conn_id)\n aws_hook = AwsBaseHook(conn_id, client_type=\"S3\")\n session = aws_hook.get_session()\n else:\n key, secret = parse_s3_env_var()\n session = BotoSession(\n aws_access_key_id=key,\n aws_secret_access_key=secret,\n )\n return dict(client=session.client(\"s3\"))\n\n\ndef gcs_client(conn_id=None):\n \"\"\"\n get GCS credentials for storage.\n \"\"\"\n if conn_id:\n gcs_hook = GCSHook(conn_id)\n client = gcs_hook.get_conn()\n else:\n client = GCSClient()\n\n return dict(client=client)\n", "path": "src/astro/utils/cloud_storage_creds.py"}]} | 806 | 
171 |
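The astro-sdk patch above replaces the custom connection string with the two standard AWS credential variables. Assuming boto3 is available and both variables are exported, the simplified credential flow is roughly the following sketch, not a definitive implementation.

```python
# Minimal sketch, assuming boto3 is installed and both variables are set.
import os

import boto3


def parse_s3_env_var():
    # Standard variables named in the issue's acceptance criteria.
    return os.environ["AWS_ACCESS_KEY_ID"], os.environ["AWS_SECRET_ACCESS_KEY"]


key, secret = parse_s3_env_var()
session = boto3.Session(aws_access_key_id=key, aws_secret_access_key=secret)
s3_client = session.client("s3")
```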
gh_patches_debug_33725 | rasdani/github-patches | git_diff | modoboa__modoboa-1859 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Dashboard - server behind proxy
# Impacted versions
* Modoboa: 1.14.0
* Webserver: Nginx
# Steps to reproduce
The Modoboa server is behind a proxy, so it has no direct internet access.
Access the dashboard with an admin account.
# Current behavior
504 Gateway Time-out
# Expected behavior
no error
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `modoboa/core/views/dashboard.py`
Content:
```
1 """Core dashboard views."""
2
3 import feedparser
4 import requests
5 from dateutil import parser
6 from requests.exceptions import RequestException
7
8 from django.contrib.auth import mixins as auth_mixins
9 from django.views import generic
10
11 from .. import signals
12
13 MODOBOA_WEBSITE_URL = "https://modoboa.org/"
14
15
16 class DashboardView(auth_mixins.AccessMixin, generic.TemplateView):
17 """Dashboard view."""
18
19 template_name = "core/dashboard.html"
20
21 def dispatch(self, request, *args, **kwargs):
22 """Check if user can access dashboard."""
23 if not request.user.is_authenticated or not request.user.is_admin:
24 return self.handle_no_permission()
25 return super(DashboardView, self).dispatch(request, *args, **kwargs)
26
27 def get_context_data(self, **kwargs):
28 """Add context variables."""
29 context = super(DashboardView, self).get_context_data(**kwargs)
30 context.update({
31 "selection": "dashboard", "widgets": {"left": [], "right": []}
32 })
33 # Fetch latest news
34 if self.request.user.language == "fr":
35 lang = "fr"
36 else:
37 lang = "en"
38 context.update({"selection": "dashboard"})
39
40 feed_url = "{}{}/weblog/feeds/".format(MODOBOA_WEBSITE_URL, lang)
41 if self.request.user.role != "SuperAdmins":
42 custom_feed_url = (
43 self.request.localconfig.parameters.get_value("rss_feed_url"))
44 if custom_feed_url:
45 feed_url = custom_feed_url
46 posts = feedparser.parse(feed_url)
47 entries = []
48 for entry in posts["entries"][:5]:
49 entry["published"] = parser.parse(entry["published"])
50 entries.append(entry)
51 context["widgets"]["left"].append("core/_latest_news_widget.html")
52 context.update({"news": entries})
53
54 hide_features_widget = self.request.localconfig.parameters.get_value(
55 "hide_features_widget")
56 if self.request.user.is_superuser or not hide_features_widget:
57 url = "{}{}/api/projects/?featured=true".format(
58 MODOBOA_WEBSITE_URL, lang)
59 features = []
60 try:
61 response = requests.get(url)
62 except RequestException:
63 pass
64 else:
65 if response.status_code == 200:
66 features = response.json()
67 context["widgets"]["right"].append("core/_current_features.html")
68 context.update({"features": features})
69
70 # Extra widgets
71 result = signals.extra_admin_dashboard_widgets.send(
72 sender=self.__class__, user=self.request.user)
73 for _receiver, widgets in result:
74 for widget in widgets:
75 context["widgets"][widget["column"]].append(
76 widget["template"])
77 # FIXME: can raise conflicts...
78 context.update(widget["context"])
79
80 return context
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/modoboa/core/views/dashboard.py b/modoboa/core/views/dashboard.py
--- a/modoboa/core/views/dashboard.py
+++ b/modoboa/core/views/dashboard.py
@@ -8,6 +8,8 @@
from django.contrib.auth import mixins as auth_mixins
from django.views import generic
+from django.conf import settings
+
from .. import signals
MODOBOA_WEBSITE_URL = "https://modoboa.org/"
@@ -43,11 +45,12 @@
self.request.localconfig.parameters.get_value("rss_feed_url"))
if custom_feed_url:
feed_url = custom_feed_url
- posts = feedparser.parse(feed_url)
entries = []
- for entry in posts["entries"][:5]:
- entry["published"] = parser.parse(entry["published"])
- entries.append(entry)
+ if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:
+ posts = feedparser.parse(feed_url)
+ for entry in posts["entries"][:5]:
+ entry["published"] = parser.parse(entry["published"])
+ entries.append(entry)
context["widgets"]["left"].append("core/_latest_news_widget.html")
context.update({"news": entries})
@@ -57,13 +60,14 @@
url = "{}{}/api/projects/?featured=true".format(
MODOBOA_WEBSITE_URL, lang)
features = []
- try:
- response = requests.get(url)
- except RequestException:
- pass
- else:
- if response.status_code == 200:
- features = response.json()
+ if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:
+ try:
+ response = requests.get(url)
+ except RequestException:
+ pass
+ else:
+ if response.status_code == 200:
+ features = response.json()
context["widgets"]["right"].append("core/_current_features.html")
context.update({"features": features})
| {"golden_diff": "diff --git a/modoboa/core/views/dashboard.py b/modoboa/core/views/dashboard.py\n--- a/modoboa/core/views/dashboard.py\n+++ b/modoboa/core/views/dashboard.py\n@@ -8,6 +8,8 @@\n from django.contrib.auth import mixins as auth_mixins\n from django.views import generic\n \n+from django.conf import settings\n+\n from .. import signals\n \n MODOBOA_WEBSITE_URL = \"https://modoboa.org/\"\n@@ -43,11 +45,12 @@\n self.request.localconfig.parameters.get_value(\"rss_feed_url\"))\n if custom_feed_url:\n feed_url = custom_feed_url\n- posts = feedparser.parse(feed_url)\n entries = []\n- for entry in posts[\"entries\"][:5]:\n- entry[\"published\"] = parser.parse(entry[\"published\"])\n- entries.append(entry)\n+ if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:\n+ posts = feedparser.parse(feed_url)\n+ for entry in posts[\"entries\"][:5]:\n+ entry[\"published\"] = parser.parse(entry[\"published\"])\n+ entries.append(entry)\n context[\"widgets\"][\"left\"].append(\"core/_latest_news_widget.html\")\n context.update({\"news\": entries})\n \n@@ -57,13 +60,14 @@\n url = \"{}{}/api/projects/?featured=true\".format(\n MODOBOA_WEBSITE_URL, lang)\n features = []\n- try:\n- response = requests.get(url)\n- except RequestException:\n- pass\n- else:\n- if response.status_code == 200:\n- features = response.json()\n+ if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:\n+ try:\n+ response = requests.get(url)\n+ except RequestException:\n+ pass\n+ else:\n+ if response.status_code == 200:\n+ features = response.json()\n context[\"widgets\"][\"right\"].append(\"core/_current_features.html\")\n context.update({\"features\": features})\n", "issue": "Dashboard - server behind proxy\n# Impacted versions\r\n\r\n* Modoboa: 1.14.0\r\n* Webserver: Nginx\r\n\r\n# Steps to reproduce\r\nModoboa server is behind proxy, so no internet direct access\r\nAcces dashboard via admin account\r\n\r\n# Current behavior\r\n504 Gateway Time-out\r\n\r\n# Expected behavior\r\nno error\r\n\n", "before_files": [{"content": "\"\"\"Core dashboard views.\"\"\"\n\nimport feedparser\nimport requests\nfrom dateutil import parser\nfrom requests.exceptions import RequestException\n\nfrom django.contrib.auth import mixins as auth_mixins\nfrom django.views import generic\n\nfrom .. 
import signals\n\nMODOBOA_WEBSITE_URL = \"https://modoboa.org/\"\n\n\nclass DashboardView(auth_mixins.AccessMixin, generic.TemplateView):\n \"\"\"Dashboard view.\"\"\"\n\n template_name = \"core/dashboard.html\"\n\n def dispatch(self, request, *args, **kwargs):\n \"\"\"Check if user can access dashboard.\"\"\"\n if not request.user.is_authenticated or not request.user.is_admin:\n return self.handle_no_permission()\n return super(DashboardView, self).dispatch(request, *args, **kwargs)\n\n def get_context_data(self, **kwargs):\n \"\"\"Add context variables.\"\"\"\n context = super(DashboardView, self).get_context_data(**kwargs)\n context.update({\n \"selection\": \"dashboard\", \"widgets\": {\"left\": [], \"right\": []}\n })\n # Fetch latest news\n if self.request.user.language == \"fr\":\n lang = \"fr\"\n else:\n lang = \"en\"\n context.update({\"selection\": \"dashboard\"})\n\n feed_url = \"{}{}/weblog/feeds/\".format(MODOBOA_WEBSITE_URL, lang)\n if self.request.user.role != \"SuperAdmins\":\n custom_feed_url = (\n self.request.localconfig.parameters.get_value(\"rss_feed_url\"))\n if custom_feed_url:\n feed_url = custom_feed_url\n posts = feedparser.parse(feed_url)\n entries = []\n for entry in posts[\"entries\"][:5]:\n entry[\"published\"] = parser.parse(entry[\"published\"])\n entries.append(entry)\n context[\"widgets\"][\"left\"].append(\"core/_latest_news_widget.html\")\n context.update({\"news\": entries})\n\n hide_features_widget = self.request.localconfig.parameters.get_value(\n \"hide_features_widget\")\n if self.request.user.is_superuser or not hide_features_widget:\n url = \"{}{}/api/projects/?featured=true\".format(\n MODOBOA_WEBSITE_URL, lang)\n features = []\n try:\n response = requests.get(url)\n except RequestException:\n pass\n else:\n if response.status_code == 200:\n features = response.json()\n context[\"widgets\"][\"right\"].append(\"core/_current_features.html\")\n context.update({\"features\": features})\n\n # Extra widgets\n result = signals.extra_admin_dashboard_widgets.send(\n sender=self.__class__, user=self.request.user)\n for _receiver, widgets in result:\n for widget in widgets:\n context[\"widgets\"][widget[\"column\"]].append(\n widget[\"template\"])\n # FIXME: can raise conflicts...\n context.update(widget[\"context\"])\n\n return context\n", "path": "modoboa/core/views/dashboard.py"}], "after_files": [{"content": "\"\"\"Core dashboard views.\"\"\"\n\nimport feedparser\nimport requests\nfrom dateutil import parser\nfrom requests.exceptions import RequestException\n\nfrom django.contrib.auth import mixins as auth_mixins\nfrom django.views import generic\n\nfrom django.conf import settings\n\nfrom .. 
import signals\n\nMODOBOA_WEBSITE_URL = \"https://modoboa.org/\"\n\n\nclass DashboardView(auth_mixins.AccessMixin, generic.TemplateView):\n \"\"\"Dashboard view.\"\"\"\n\n template_name = \"core/dashboard.html\"\n\n def dispatch(self, request, *args, **kwargs):\n \"\"\"Check if user can access dashboard.\"\"\"\n if not request.user.is_authenticated or not request.user.is_admin:\n return self.handle_no_permission()\n return super(DashboardView, self).dispatch(request, *args, **kwargs)\n\n def get_context_data(self, **kwargs):\n \"\"\"Add context variables.\"\"\"\n context = super(DashboardView, self).get_context_data(**kwargs)\n context.update({\n \"selection\": \"dashboard\", \"widgets\": {\"left\": [], \"right\": []}\n })\n # Fetch latest news\n if self.request.user.language == \"fr\":\n lang = \"fr\"\n else:\n lang = \"en\"\n context.update({\"selection\": \"dashboard\"})\n\n feed_url = \"{}{}/weblog/feeds/\".format(MODOBOA_WEBSITE_URL, lang)\n if self.request.user.role != \"SuperAdmins\":\n custom_feed_url = (\n self.request.localconfig.parameters.get_value(\"rss_feed_url\"))\n if custom_feed_url:\n feed_url = custom_feed_url\n entries = []\n if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:\n posts = feedparser.parse(feed_url)\n for entry in posts[\"entries\"][:5]:\n entry[\"published\"] = parser.parse(entry[\"published\"])\n entries.append(entry)\n context[\"widgets\"][\"left\"].append(\"core/_latest_news_widget.html\")\n context.update({\"news\": entries})\n\n hide_features_widget = self.request.localconfig.parameters.get_value(\n \"hide_features_widget\")\n if self.request.user.is_superuser or not hide_features_widget:\n url = \"{}{}/api/projects/?featured=true\".format(\n MODOBOA_WEBSITE_URL, lang)\n features = []\n if not settings.DISABLE_DASHBOARD_EXTERNAL_QUERIES:\n try:\n response = requests.get(url)\n except RequestException:\n pass\n else:\n if response.status_code == 200:\n features = response.json()\n context[\"widgets\"][\"right\"].append(\"core/_current_features.html\")\n context.update({\"features\": features})\n\n # Extra widgets\n result = signals.extra_admin_dashboard_widgets.send(\n sender=self.__class__, user=self.request.user)\n for _receiver, widgets in result:\n for widget in widgets:\n context[\"widgets\"][widget[\"column\"]].append(\n widget[\"template\"])\n # FIXME: can raise conflicts...\n context.update(widget[\"context\"])\n\n return context\n", "path": "modoboa/core/views/dashboard.py"}]} | 1,085 | 432 |
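The modoboa patch above gates every outbound feed and feature request behind a Django setting. As a hedged sketch, a deployment behind a proxy with no outbound access might disable those queries in its settings module as shown below; only the setting name comes from the diff, its value and placement are assumptions.

```python
# settings.py (sketch): skip the RSS-feed and feature fetches on the dashboard.
DISABLE_DASHBOARD_EXTERNAL_QUERIES = True
```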
gh_patches_debug_15542 | rasdani/github-patches | git_diff | replicate__cog-553 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Dear friend, please tell me why I can't run it from the cog example.
I am a newbie.
I am running the code from the cog examples.
I can run "cog run python", but I can't run the following command.
input:
sudo cog predict -i @input.jpg
results:

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/cog/json.py`
Content:
```
1 from enum import Enum
2 import io
3 from typing import Any
4
5 from pydantic import BaseModel
6
7 from .types import Path
8
9 try:
10 import numpy as np # type: ignore
11
12 has_numpy = True
13 except ImportError:
14 has_numpy = False
15
16
17 def encode_json(obj: Any, upload_file) -> Any:
18 """
19 Returns a JSON-compatible version of the object. It will encode any Pydantic models and custom types.
20
21 When a file is encountered, it will be passed to upload_file. Any paths will be opened and converted to files.
22
23 Somewhat based on FastAPI's jsonable_encoder().
24 """
25 if isinstance(obj, BaseModel):
26 return encode_json(obj.dict(exclude_unset=True), upload_file)
27 if isinstance(obj, dict):
28 return {key: encode_json(value, upload_file) for key, value in obj.items()}
29 if isinstance(obj, list):
30 return [encode_json(value, upload_file) for value in obj]
31 if isinstance(obj, Enum):
32 return obj.value
33 if isinstance(obj, Path):
34 with obj.open("rb") as f:
35 return upload_file(f)
36 if isinstance(obj, io.IOBase):
37 return upload_file(obj)
38 if has_numpy:
39 if isinstance(obj, np.integer):
40 return int(obj)
41 if isinstance(obj, np.floating):
42 return float(obj)
43 if isinstance(obj, np.ndarray):
44 return obj.tolist()
45 return obj
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/cog/json.py b/python/cog/json.py
--- a/python/cog/json.py
+++ b/python/cog/json.py
@@ -1,5 +1,6 @@
from enum import Enum
import io
+from types import GeneratorType
from typing import Any
from pydantic import BaseModel
@@ -26,7 +27,7 @@
return encode_json(obj.dict(exclude_unset=True), upload_file)
if isinstance(obj, dict):
return {key: encode_json(value, upload_file) for key, value in obj.items()}
- if isinstance(obj, list):
+ if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):
return [encode_json(value, upload_file) for value in obj]
if isinstance(obj, Enum):
return obj.value
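As an illustrative aside, the patch above broadens the sequence check so that tuples, sets, and generators are materialised into JSON-friendly lists instead of falling through to `return obj`. A minimal, hypothetical sketch of that idea (standalone, not the real `encode_json`):
```python
# Hypothetical sketch mirroring the shape of the patch, independent of the cog codebase.
from types import GeneratorType

def encode(obj):
    if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):
        return [encode(v) for v in obj]   # materialise into a plain list
    return obj

gen = (x * x for x in range(3))
print(encode(gen))          # [0, 1, 4]; a raw generator is not JSON serialisable
print(encode((1, 2, 3)))    # [1, 2, 3]
```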
| {"golden_diff": "diff --git a/python/cog/json.py b/python/cog/json.py\n--- a/python/cog/json.py\n+++ b/python/cog/json.py\n@@ -1,5 +1,6 @@\n from enum import Enum\n import io\n+from types import GeneratorType\n from typing import Any\n \n from pydantic import BaseModel\n@@ -26,7 +27,7 @@\n return encode_json(obj.dict(exclude_unset=True), upload_file)\n if isinstance(obj, dict):\n return {key: encode_json(value, upload_file) for key, value in obj.items()}\n- if isinstance(obj, list):\n+ if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):\n return [encode_json(value, upload_file) for value in obj]\n if isinstance(obj, Enum):\n return obj.value\n", "issue": "Dear friend,please tell me why I can't run it from cog example.\nI am a newbie.\r\nI run the code from cog examples.\r\nI can run \"cog run python\",but I can't run following command.\r\ninput:\r\nsudo cog predict -i @input.jpg\r\nresluts:\r\n\r\n\n", "before_files": [{"content": "from enum import Enum\nimport io\nfrom typing import Any\n\nfrom pydantic import BaseModel\n\nfrom .types import Path\n\ntry:\n import numpy as np # type: ignore\n\n has_numpy = True\nexcept ImportError:\n has_numpy = False\n\n\ndef encode_json(obj: Any, upload_file) -> Any:\n \"\"\"\n Returns a JSON-compatible version of the object. It will encode any Pydantic models and custom types.\n\n When a file is encountered, it will be passed to upload_file. Any paths will be opened and converted to files.\n\n Somewhat based on FastAPI's jsonable_encoder().\n \"\"\"\n if isinstance(obj, BaseModel):\n return encode_json(obj.dict(exclude_unset=True), upload_file)\n if isinstance(obj, dict):\n return {key: encode_json(value, upload_file) for key, value in obj.items()}\n if isinstance(obj, list):\n return [encode_json(value, upload_file) for value in obj]\n if isinstance(obj, Enum):\n return obj.value\n if isinstance(obj, Path):\n with obj.open(\"rb\") as f:\n return upload_file(f)\n if isinstance(obj, io.IOBase):\n return upload_file(obj)\n if has_numpy:\n if isinstance(obj, np.integer):\n return int(obj)\n if isinstance(obj, np.floating):\n return float(obj)\n if isinstance(obj, np.ndarray):\n return obj.tolist()\n return obj\n", "path": "python/cog/json.py"}], "after_files": [{"content": "from enum import Enum\nimport io\nfrom types import GeneratorType\nfrom typing import Any\n\nfrom pydantic import BaseModel\n\nfrom .types import Path\n\ntry:\n import numpy as np # type: ignore\n\n has_numpy = True\nexcept ImportError:\n has_numpy = False\n\n\ndef encode_json(obj: Any, upload_file) -> Any:\n \"\"\"\n Returns a JSON-compatible version of the object. It will encode any Pydantic models and custom types.\n\n When a file is encountered, it will be passed to upload_file. 
Any paths will be opened and converted to files.\n\n Somewhat based on FastAPI's jsonable_encoder().\n \"\"\"\n if isinstance(obj, BaseModel):\n return encode_json(obj.dict(exclude_unset=True), upload_file)\n if isinstance(obj, dict):\n return {key: encode_json(value, upload_file) for key, value in obj.items()}\n if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):\n return [encode_json(value, upload_file) for value in obj]\n if isinstance(obj, Enum):\n return obj.value\n if isinstance(obj, Path):\n with obj.open(\"rb\") as f:\n return upload_file(f)\n if isinstance(obj, io.IOBase):\n return upload_file(obj)\n if has_numpy:\n if isinstance(obj, np.integer):\n return int(obj)\n if isinstance(obj, np.floating):\n return float(obj)\n if isinstance(obj, np.ndarray):\n return obj.tolist()\n return obj\n", "path": "python/cog/json.py"}]} | 775 | 177 |
gh_patches_debug_12833 | rasdani/github-patches | git_diff | mindee__doctr-219 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Demo app error when analyzing my first document
## 🐛 Bug
I tried to analyze a PNG and a PDF and got the same error. I tried changing the model, but that didn't change anything.
## To Reproduce
Steps to reproduce the behavior:
1. Upload a PNG
2. Click on analyze document
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
```
KeyError: 0
Traceback:
File "/Users/thibautmorla/opt/anaconda3/lib/python3.8/site-packages/streamlit/script_runner.py", line 337, in _run_script
exec(code, module.__dict__)
File "/Users/thibautmorla/Downloads/doctr/demo/app.py", line 93, in <module>
main()
File "/Users/thibautmorla/Downloads/doctr/demo/app.py", line 77, in main
seg_map = predictor.det_predictor.model(processed_batches[0])[0]
```
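As an illustrative aside, the `KeyError: 0` is consistent with the detection model returning a dict of named outputs rather than an indexable sequence, which matches the fix later in this record (reading `"proba_map"` instead of index `0`). A hypothetical sketch of the distinction, with made-up output shapes:
```python
# Hypothetical sketch -- illustrative shapes only, not the real doctr API.
out = {"proba_map": [[0.2, 0.8]]}   # model output: a dict keyed by output name

try:
    seg_map = out[0]                # indexing the dict by 0 -> KeyError: 0
except KeyError as exc:
    print("KeyError:", exc)

seg_map = out["proba_map"]          # access the named output instead
print(seg_map)
```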
## Additional context
First image upload
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `demo/app.py`
Content:
```
1 # Copyright (C) 2021, Mindee.
2
3 # This program is licensed under the Apache License version 2.
4 # See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.
5
6 import os
7 import streamlit as st
8 import matplotlib.pyplot as plt
9
10 os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
11
12 import tensorflow as tf
13 import cv2
14
15 gpu_devices = tf.config.experimental.list_physical_devices('GPU')
16 if any(gpu_devices):
17 tf.config.experimental.set_memory_growth(gpu_devices[0], True)
18
19 from doctr.documents import DocumentFile
20 from doctr.models import ocr_predictor
21 from doctr.utils.visualization import visualize_page
22
23 DET_ARCHS = ["db_resnet50"]
24 RECO_ARCHS = ["crnn_vgg16_bn", "crnn_resnet31", "sar_vgg16_bn", "sar_resnet31"]
25
26
27 def main():
28
29 # Wide mode
30 st.set_page_config(layout="wide")
31
32 # Designing the interface
33 st.title("DocTR: Document Text Recognition")
34 # For newline
35 st.write('\n')
36 # Set the columns
37 cols = st.beta_columns((1, 1, 1))
38 cols[0].header("Input document")
39 cols[1].header("Text segmentation")
40 cols[-1].header("OCR output")
41
42 # Sidebar
43 # File selection
44 st.sidebar.title("Document selection")
45 # Disabling warning
46 st.set_option('deprecation.showfileUploaderEncoding', False)
47 # Choose your own image
48 uploaded_file = st.sidebar.file_uploader("Upload files", type=['pdf', 'png', 'jpeg', 'jpg'])
49 if uploaded_file is not None:
50 if uploaded_file.name.endswith('.pdf'):
51 doc = DocumentFile.from_pdf(uploaded_file.read())
52 else:
53 doc = DocumentFile.from_images(uploaded_file.read())
54 cols[0].image(doc[0], "First page", use_column_width=True)
55
56 # Model selection
57 st.sidebar.title("Model selection")
58 det_arch = st.sidebar.selectbox("Text detection model", DET_ARCHS)
59 reco_arch = st.sidebar.selectbox("Text recognition model", RECO_ARCHS)
60
61 # For newline
62 st.sidebar.write('\n')
63
64 if st.sidebar.button("Analyze document"):
65
66 if uploaded_file is None:
67 st.sidebar.write("Please upload a document")
68
69 else:
70 with st.spinner('Loading model...'):
71 predictor = ocr_predictor(det_arch, reco_arch, pretrained=True)
72
73 with st.spinner('Analyzing...'):
74
75 # Forward the image to the model
76 processed_batches = predictor.det_predictor.pre_processor(doc)
77 seg_map = predictor.det_predictor.model(processed_batches[0])[0]
78 seg_map = cv2.resize(seg_map.numpy(), (doc[0].shape[1], doc[0].shape[0]),
79 interpolation=cv2.INTER_LINEAR)
80 # Plot the raw heatmap
81 fig, ax = plt.subplots()
82 ax.imshow(seg_map)
83 ax.axis('off')
84 cols[1].pyplot(fig)
85
86 # OCR
87 out = predictor(doc)
88 fig = visualize_page(out.pages[0].export(), doc[0], interactive=False)
89 cols[-1].pyplot(fig)
90
91
92 if __name__ == '__main__':
93 main()
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/demo/app.py b/demo/app.py
--- a/demo/app.py
+++ b/demo/app.py
@@ -74,7 +74,8 @@
# Forward the image to the model
processed_batches = predictor.det_predictor.pre_processor(doc)
- seg_map = predictor.det_predictor.model(processed_batches[0])[0]
+ seg_map = predictor.det_predictor.model(processed_batches[0])["proba_map"]
+ seg_map = tf.squeeze(seg_map, axis=[0, 3])
seg_map = cv2.resize(seg_map.numpy(), (doc[0].shape[1], doc[0].shape[0]),
interpolation=cv2.INTER_LINEAR)
# Plot the raw heatmap
| {"golden_diff": "diff --git a/demo/app.py b/demo/app.py\n--- a/demo/app.py\n+++ b/demo/app.py\n@@ -74,7 +74,8 @@\n \n # Forward the image to the model\n processed_batches = predictor.det_predictor.pre_processor(doc)\n- seg_map = predictor.det_predictor.model(processed_batches[0])[0]\n+ seg_map = predictor.det_predictor.model(processed_batches[0])[\"proba_map\"]\n+ seg_map = tf.squeeze(seg_map, axis=[0, 3])\n seg_map = cv2.resize(seg_map.numpy(), (doc[0].shape[1], doc[0].shape[0]),\n interpolation=cv2.INTER_LINEAR)\n # Plot the raw heatmap\n", "issue": "Demo app error when analyzing my first document\n## \ud83d\udc1b Bug\r\n\r\nI tried to analyze a PNG and a PDF, got the same error. I try to change the model, didn't change anything.\r\n\r\n## To Reproduce\r\n\r\nSteps to reproduce the behavior:\r\n\r\n1. Upload a PNG\r\n2. Click on analyze document\r\n\r\n\r\n<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->\r\n```\r\nKeyError: 0\r\nTraceback:\r\nFile \"/Users/thibautmorla/opt/anaconda3/lib/python3.8/site-packages/streamlit/script_runner.py\", line 337, in _run_script\r\n exec(code, module.__dict__)\r\nFile \"/Users/thibautmorla/Downloads/doctr/demo/app.py\", line 93, in <module>\r\n main()\r\nFile \"/Users/thibautmorla/Downloads/doctr/demo/app.py\", line 77, in main\r\n seg_map = predictor.det_predictor.model(processed_batches[0])[0]\r\n```\r\n\r\n\r\n## Additional context\r\n\r\nFirst image upload\n", "before_files": [{"content": "# Copyright (C) 2021, Mindee.\n\n# This program is licensed under the Apache License version 2.\n# See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.\n\nimport os\nimport streamlit as st\nimport matplotlib.pyplot as plt\n\nos.environ[\"TF_CPP_MIN_LOG_LEVEL\"] = \"2\"\n\nimport tensorflow as tf\nimport cv2\n\ngpu_devices = tf.config.experimental.list_physical_devices('GPU')\nif any(gpu_devices):\n tf.config.experimental.set_memory_growth(gpu_devices[0], True)\n\nfrom doctr.documents import DocumentFile\nfrom doctr.models import ocr_predictor\nfrom doctr.utils.visualization import visualize_page\n\nDET_ARCHS = [\"db_resnet50\"]\nRECO_ARCHS = [\"crnn_vgg16_bn\", \"crnn_resnet31\", \"sar_vgg16_bn\", \"sar_resnet31\"]\n\n\ndef main():\n\n # Wide mode\n st.set_page_config(layout=\"wide\")\n\n # Designing the interface\n st.title(\"DocTR: Document Text Recognition\")\n # For newline\n st.write('\\n')\n # Set the columns\n cols = st.beta_columns((1, 1, 1))\n cols[0].header(\"Input document\")\n cols[1].header(\"Text segmentation\")\n cols[-1].header(\"OCR output\")\n\n # Sidebar\n # File selection\n st.sidebar.title(\"Document selection\")\n # Disabling warning\n st.set_option('deprecation.showfileUploaderEncoding', False)\n # Choose your own image\n uploaded_file = st.sidebar.file_uploader(\"Upload files\", type=['pdf', 'png', 'jpeg', 'jpg'])\n if uploaded_file is not None:\n if uploaded_file.name.endswith('.pdf'):\n doc = DocumentFile.from_pdf(uploaded_file.read())\n else:\n doc = DocumentFile.from_images(uploaded_file.read())\n cols[0].image(doc[0], \"First page\", use_column_width=True)\n\n # Model selection\n st.sidebar.title(\"Model selection\")\n det_arch = st.sidebar.selectbox(\"Text detection model\", DET_ARCHS)\n reco_arch = st.sidebar.selectbox(\"Text recognition model\", RECO_ARCHS)\n\n # For newline\n st.sidebar.write('\\n')\n\n if st.sidebar.button(\"Analyze document\"):\n\n if uploaded_file is None:\n st.sidebar.write(\"Please upload a document\")\n\n else:\n 
with st.spinner('Loading model...'):\n predictor = ocr_predictor(det_arch, reco_arch, pretrained=True)\n\n with st.spinner('Analyzing...'):\n\n # Forward the image to the model\n processed_batches = predictor.det_predictor.pre_processor(doc)\n seg_map = predictor.det_predictor.model(processed_batches[0])[0]\n seg_map = cv2.resize(seg_map.numpy(), (doc[0].shape[1], doc[0].shape[0]),\n interpolation=cv2.INTER_LINEAR)\n # Plot the raw heatmap\n fig, ax = plt.subplots()\n ax.imshow(seg_map)\n ax.axis('off')\n cols[1].pyplot(fig)\n\n # OCR\n out = predictor(doc)\n fig = visualize_page(out.pages[0].export(), doc[0], interactive=False)\n cols[-1].pyplot(fig)\n\n\nif __name__ == '__main__':\n main()\n", "path": "demo/app.py"}], "after_files": [{"content": "# Copyright (C) 2021, Mindee.\n\n# This program is licensed under the Apache License version 2.\n# See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.\n\nimport os\nimport streamlit as st\nimport matplotlib.pyplot as plt\n\nos.environ[\"TF_CPP_MIN_LOG_LEVEL\"] = \"2\"\n\nimport tensorflow as tf\nimport cv2\n\ngpu_devices = tf.config.experimental.list_physical_devices('GPU')\nif any(gpu_devices):\n tf.config.experimental.set_memory_growth(gpu_devices[0], True)\n\nfrom doctr.documents import DocumentFile\nfrom doctr.models import ocr_predictor\nfrom doctr.utils.visualization import visualize_page\n\nDET_ARCHS = [\"db_resnet50\"]\nRECO_ARCHS = [\"crnn_vgg16_bn\", \"crnn_resnet31\", \"sar_vgg16_bn\", \"sar_resnet31\"]\n\n\ndef main():\n\n # Wide mode\n st.set_page_config(layout=\"wide\")\n\n # Designing the interface\n st.title(\"DocTR: Document Text Recognition\")\n # For newline\n st.write('\\n')\n # Set the columns\n cols = st.beta_columns((1, 1, 1))\n cols[0].header(\"Input document\")\n cols[1].header(\"Text segmentation\")\n cols[-1].header(\"OCR output\")\n\n # Sidebar\n # File selection\n st.sidebar.title(\"Document selection\")\n # Disabling warning\n st.set_option('deprecation.showfileUploaderEncoding', False)\n # Choose your own image\n uploaded_file = st.sidebar.file_uploader(\"Upload files\", type=['pdf', 'png', 'jpeg', 'jpg'])\n if uploaded_file is not None:\n if uploaded_file.name.endswith('.pdf'):\n doc = DocumentFile.from_pdf(uploaded_file.read())\n else:\n doc = DocumentFile.from_images(uploaded_file.read())\n cols[0].image(doc[0], \"First page\", use_column_width=True)\n\n # Model selection\n st.sidebar.title(\"Model selection\")\n det_arch = st.sidebar.selectbox(\"Text detection model\", DET_ARCHS)\n reco_arch = st.sidebar.selectbox(\"Text recognition model\", RECO_ARCHS)\n\n # For newline\n st.sidebar.write('\\n')\n\n if st.sidebar.button(\"Analyze document\"):\n\n if uploaded_file is None:\n st.sidebar.write(\"Please upload a document\")\n\n else:\n with st.spinner('Loading model...'):\n predictor = ocr_predictor(det_arch, reco_arch, pretrained=True)\n\n with st.spinner('Analyzing...'):\n\n # Forward the image to the model\n processed_batches = predictor.det_predictor.pre_processor(doc)\n seg_map = predictor.det_predictor.model(processed_batches[0])[\"proba_map\"]\n seg_map = tf.squeeze(seg_map, axis=[0, 3])\n seg_map = cv2.resize(seg_map.numpy(), (doc[0].shape[1], doc[0].shape[0]),\n interpolation=cv2.INTER_LINEAR)\n # Plot the raw heatmap\n fig, ax = plt.subplots()\n ax.imshow(seg_map)\n ax.axis('off')\n cols[1].pyplot(fig)\n\n # OCR\n out = predictor(doc)\n fig = visualize_page(out.pages[0].export(), doc[0], interactive=False)\n cols[-1].pyplot(fig)\n\n\nif __name__ == 
'__main__':\n main()\n", "path": "demo/app.py"}]} | 1,387 | 156 |
gh_patches_debug_4637 | rasdani/github-patches | git_diff | conda__conda-4585 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ERROR conda.core.link:_execute_actions(319): An error occurred while installing package 'defaults::qt-5.6.2-vc9_3'
```
Current conda install:
platform : win-64
conda version : 4.3.9
conda is private : False
conda-env version : 4.3.9
conda-build version : 2.1.3
python version : 2.7.13.final.0
requests version : 2.12.4
root environment : C:\Users\joelkim\Anaconda2 (writable)
default environment : C:\Users\joelkim\Anaconda2
envs directories : C:\Users\joelkim\Anaconda2\envs
package cache : C:\Users\joelkim\Anaconda2\pkgs
channel URLs : https://repo.continuum.io/pkgs/free/win-64
https://repo.continuum.io/pkgs/free/noarch
https://repo.continuum.io/pkgs/r/win-64
https://repo.continuum.io/pkgs/r/noarch
https://repo.continuum.io/pkgs/pro/win-64
https://repo.continuum.io/pkgs/pro/noarch
https://repo.continuum.io/pkgs/msys2/win-64
https://repo.continuum.io/pkgs/msys2/noarch
config file : None
offline mode : False
user-agent : conda/4.3.9 requests/2.12.4 CPython/2.7.13 Windows/10 Windows/10.0.14393
```
I got this error when I tried to install qt:
```
> conda create -n test qt
Fetching package metadata ...........
Solving package specifications: .
Package plan for installation in environment C:\Users\joelkim\Anaconda2\envs\test:
The following NEW packages will be INSTALLED:
icu: 57.1-vc9_0 [vc9]
jpeg: 9b-vc9_0 [vc9]
libpng: 1.6.27-vc9_0 [vc9]
openssl: 1.0.2k-vc9_0 [vc9]
pip: 9.0.1-py27_1
python: 2.7.13-0
qt: 5.6.2-vc9_3 [vc9]
setuptools: 27.2.0-py27_1
vs2008_runtime: 9.00.30729.5054-0
wheel: 0.29.0-py27_0
zlib: 1.2.8-vc9_3 [vc9]
Proceed ([y]/n)?
ERROR conda.core.link:_execute_actions(319): An error occurred while installing package 'defaults::qt-5.6.2-vc9_3'.
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda/common/compat.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Try to keep compat small because it's imported by everything
3 # What is compat, and what isn't?
4 # If a piece of code is "general" and used in multiple modules, it goes here.
5 # If it's only used in one module, keep it in that module, preferably near the top.
6 from __future__ import absolute_import, division, print_function, unicode_literals
7
8 from itertools import chain
9 from operator import methodcaller
10 from os import chmod, lstat
11 from os.path import islink
12 import sys
13
14 on_win = bool(sys.platform == "win32")
15
16 PY2 = sys.version_info[0] == 2
17 PY3 = sys.version_info[0] == 3
18
19
20 # #############################
21 # equivalent commands
22 # #############################
23
24 if PY3: # pragma: py2 no cover
25 string_types = str,
26 integer_types = int,
27 class_types = type,
28 text_type = str
29 binary_type = bytes
30 input = input
31 range = range
32
33 elif PY2: # pragma: py3 no cover
34 from types import ClassType
35 string_types = basestring,
36 integer_types = (int, long)
37 class_types = (type, ClassType)
38 text_type = unicode
39 binary_type = str
40 input = raw_input
41 range = xrange
42
43
44 # #############################
45 # equivalent imports
46 # #############################
47
48 if PY3: # pragma: py2 no cover
49 from io import StringIO
50 from itertools import zip_longest
51 elif PY2: # pragma: py3 no cover
52 from cStringIO import StringIO
53 from itertools import izip as zip, izip_longest as zip_longest
54
55 StringIO = StringIO
56 zip = zip
57 zip_longest = zip_longest
58
59
60 # #############################
61 # equivalent functions
62 # #############################
63
64 if PY3: # pragma: py2 no cover
65 def iterkeys(d, **kw):
66 return iter(d.keys(**kw))
67
68 def itervalues(d, **kw):
69 return iter(d.values(**kw))
70
71 def iteritems(d, **kw):
72 return iter(d.items(**kw))
73
74 viewkeys = methodcaller("keys")
75 viewvalues = methodcaller("values")
76 viewitems = methodcaller("items")
77
78 def lchmod(path, mode):
79 try:
80 chmod(path, mode, follow_symlinks=False)
81 except (TypeError, NotImplementedError, SystemError):
82 # On systems that don't allow permissions on symbolic links, skip
83 # links entirely.
84 if not islink(path):
85 chmod(path, mode)
86
87
88 from collections import Iterable
89 def isiterable(obj):
90 return not isinstance(obj, string_types) and isinstance(obj, Iterable)
91
92 elif PY2: # pragma: py3 no cover
93 def iterkeys(d, **kw):
94 return d.iterkeys(**kw)
95
96 def itervalues(d, **kw):
97 return d.itervalues(**kw)
98
99 def iteritems(d, **kw):
100 return d.iteritems(**kw)
101
102 viewkeys = methodcaller("viewkeys")
103 viewvalues = methodcaller("viewvalues")
104 viewitems = methodcaller("viewitems")
105
106 try:
107 from os import lchmod as os_lchmod
108 lchmod = os_lchmod
109 except ImportError:
110 def lchmod(path, mode):
111 # On systems that don't allow permissions on symbolic links, skip
112 # links entirely.
113 if not islink(path):
114 chmod(path, mode)
115
116 def isiterable(obj):
117 return (hasattr(obj, '__iter__')
118 and not isinstance(obj, string_types)
119 and type(obj) is not type)
120
121
122 # #############################
123 # other
124 # #############################
125
126 def with_metaclass(Type, skip_attrs=set(('__dict__', '__weakref__'))):
127 """Class decorator to set metaclass.
128
129 Works with both Python 2 and Python 3 and it does not add
130 an extra class in the lookup order like ``six.with_metaclass`` does
131 (that is -- it copies the original class instead of using inheritance).
132
133 """
134
135 def _clone_with_metaclass(Class):
136 attrs = dict((key, value) for key, value in iteritems(vars(Class))
137 if key not in skip_attrs)
138 return Type(Class.__name__, Class.__bases__, attrs)
139
140 return _clone_with_metaclass
141
142
143 from collections import OrderedDict as odict
144 odict = odict
145
146 NoneType = type(None)
147 primitive_types = tuple(chain(string_types, integer_types, (float, complex, bool, NoneType)))
148
149
150 def ensure_binary(value):
151 return value.encode('utf-8') if hasattr(value, 'encode') else value
152
153
154 def ensure_text_type(value):
155 return value.decode('utf-8') if hasattr(value, 'decode') else value
156
157
158 def ensure_unicode(value):
159 return value.decode('unicode_escape') if hasattr(value, 'decode') else value
160
161
162 # TODO: move this somewhere else
163 # work-around for python bug on Windows prior to python 3.2
164 # https://bugs.python.org/issue10027
165 # Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation
166 class CrossPlatformStLink(object):
167 _st_nlink = None
168
169 def __call__(self, path):
170 return self.st_nlink(path)
171
172 @classmethod
173 def st_nlink(cls, path):
174 if cls._st_nlink is None:
175 cls._initialize()
176 return cls._st_nlink(path)
177
178 @classmethod
179 def _standard_st_nlink(cls, path):
180 return lstat(path).st_nlink
181
182 @classmethod
183 def _windows_st_nlink(cls, path):
184 st_nlink = cls._standard_st_nlink(path)
185 if st_nlink != 0:
186 return st_nlink
187 else:
188 # cannot trust python on Windows when st_nlink == 0
189 # get value using windows libraries to be sure of its true value
190 # Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation
191 GENERIC_READ = 0x80000000
192 FILE_SHARE_READ = 0x00000001
193 OPEN_EXISTING = 3
194 hfile = cls.CreateFile(path, GENERIC_READ, FILE_SHARE_READ, None,
195 OPEN_EXISTING, 0, None)
196 if hfile is None:
197 from ctypes import WinError
198 raise WinError()
199 info = cls.BY_HANDLE_FILE_INFORMATION()
200 rv = cls.GetFileInformationByHandle(hfile, info)
201 cls.CloseHandle(hfile)
202 if rv == 0:
203 from ctypes import WinError
204 raise WinError()
205 return info.nNumberOfLinks
206
207 @classmethod
208 def _initialize(cls):
209 if not on_win:
210 cls._st_nlink = cls._standard_st_nlink
211 else:
212 # http://msdn.microsoft.com/en-us/library/windows/desktop/aa363858
213 import ctypes
214 from ctypes import POINTER
215 from ctypes.wintypes import DWORD, HANDLE, BOOL
216
217 cls.CreateFile = ctypes.windll.kernel32.CreateFileW
218 cls.CreateFile.argtypes = [ctypes.c_wchar_p, DWORD, DWORD, ctypes.c_void_p,
219 DWORD, DWORD, HANDLE]
220 cls.CreateFile.restype = HANDLE
221
222 # http://msdn.microsoft.com/en-us/library/windows/desktop/ms724211
223 cls.CloseHandle = ctypes.windll.kernel32.CloseHandle
224 cls.CloseHandle.argtypes = [HANDLE]
225 cls.CloseHandle.restype = BOOL
226
227 class FILETIME(ctypes.Structure):
228 _fields_ = [("dwLowDateTime", DWORD),
229 ("dwHighDateTime", DWORD)]
230
231 class BY_HANDLE_FILE_INFORMATION(ctypes.Structure):
232 _fields_ = [("dwFileAttributes", DWORD),
233 ("ftCreationTime", FILETIME),
234 ("ftLastAccessTime", FILETIME),
235 ("ftLastWriteTime", FILETIME),
236 ("dwVolumeSerialNumber", DWORD),
237 ("nFileSizeHigh", DWORD),
238 ("nFileSizeLow", DWORD),
239 ("nNumberOfLinks", DWORD),
240 ("nFileIndexHigh", DWORD),
241 ("nFileIndexLow", DWORD)]
242 cls.BY_HANDLE_FILE_INFORMATION = BY_HANDLE_FILE_INFORMATION
243
244 # http://msdn.microsoft.com/en-us/library/windows/desktop/aa364952
245 cls.GetFileInformationByHandle = ctypes.windll.kernel32.GetFileInformationByHandle
246 cls.GetFileInformationByHandle.argtypes = [HANDLE, POINTER(BY_HANDLE_FILE_INFORMATION)]
247 cls.GetFileInformationByHandle.restype = BOOL
248
249 cls._st_nlink = cls._windows_st_nlink
250
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda/common/compat.py b/conda/common/compat.py
--- a/conda/common/compat.py
+++ b/conda/common/compat.py
@@ -152,7 +152,15 @@
def ensure_text_type(value):
- return value.decode('utf-8') if hasattr(value, 'decode') else value
+ if hasattr(value, 'decode'):
+ try:
+ return value.decode('utf-8')
+ except UnicodeDecodeError:
+ from requests.packages.chardet import detect
+ encoding = detect(value).get('encoding') or 'utf-8'
+ return value.decode(encoding)
+ else:
+ return value
def ensure_unicode(value):
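As an illustrative aside, the patch falls back to charset detection when the bytes are not valid UTF-8, which is presumably what the qt package metadata triggered on this machine. A rough standalone sketch of the same idea, with a hard-coded fallback encoding standing in for chardet:
```python
# Hypothetical sketch -- simplified; the real patch asks chardet for the encoding.
def ensure_text_type(value, fallback="cp949"):
    if not hasattr(value, "decode"):
        return value
    try:
        return value.decode("utf-8")
    except UnicodeDecodeError:
        return value.decode(fallback, errors="replace")

print(ensure_text_type(b"plain ascii"))   # decoded as UTF-8
print(ensure_text_type(b"\xc1\xb6"))      # invalid UTF-8; decoded via the fallback
```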
| {"golden_diff": "diff --git a/conda/common/compat.py b/conda/common/compat.py\n--- a/conda/common/compat.py\n+++ b/conda/common/compat.py\n@@ -152,7 +152,15 @@\n \n \n def ensure_text_type(value):\n- return value.decode('utf-8') if hasattr(value, 'decode') else value\n+ if hasattr(value, 'decode'):\n+ try:\n+ return value.decode('utf-8')\n+ except UnicodeDecodeError:\n+ from requests.packages.chardet import detect\n+ encoding = detect(value).get('encoding') or 'utf-8'\n+ return value.decode(encoding)\n+ else:\n+ return value\n \n \n def ensure_unicode(value):\n", "issue": "ERROR conda.core.link:_execute_actions(319): An error occurred while installing package 'defaults::qt-5.6.2-vc9_3'\n```\r\nCurrent conda install:\r\n\r\n platform : win-64\r\n conda version : 4.3.9\r\n conda is private : False\r\n conda-env version : 4.3.9\r\n conda-build version : 2.1.3\r\n python version : 2.7.13.final.0\r\n requests version : 2.12.4\r\n root environment : C:\\Users\\joelkim\\Anaconda2 (writable)\r\n default environment : C:\\Users\\joelkim\\Anaconda2\r\n envs directories : C:\\Users\\joelkim\\Anaconda2\\envs\r\n package cache : C:\\Users\\joelkim\\Anaconda2\\pkgs\r\n channel URLs : https://repo.continuum.io/pkgs/free/win-64\r\n https://repo.continuum.io/pkgs/free/noarch\r\n https://repo.continuum.io/pkgs/r/win-64\r\n https://repo.continuum.io/pkgs/r/noarch\r\n https://repo.continuum.io/pkgs/pro/win-64\r\n https://repo.continuum.io/pkgs/pro/noarch\r\n https://repo.continuum.io/pkgs/msys2/win-64\r\n https://repo.continuum.io/pkgs/msys2/noarch\r\n config file : None\r\n offline mode : False\r\n user-agent : conda/4.3.9 requests/2.12.4 CPython/2.7.13 Windows/10 Windows/10.0.14393\r\n```\r\n\r\nI got this error when I tried to install qt:\r\n```\r\n> conda create -n test qt\r\nFetching package metadata ...........\r\nSolving package specifications: .\r\n\r\nPackage plan for installation in environment C:\\Users\\joelkim\\Anaconda2\\envs\\test:\r\n\r\nThe following NEW packages will be INSTALLED:\r\n\r\n icu: 57.1-vc9_0 [vc9]\r\n jpeg: 9b-vc9_0 [vc9]\r\n libpng: 1.6.27-vc9_0 [vc9]\r\n openssl: 1.0.2k-vc9_0 [vc9]\r\n pip: 9.0.1-py27_1\r\n python: 2.7.13-0\r\n qt: 5.6.2-vc9_3 [vc9]\r\n setuptools: 27.2.0-py27_1\r\n vs2008_runtime: 9.00.30729.5054-0\r\n wheel: 0.29.0-py27_0\r\n zlib: 1.2.8-vc9_3 [vc9]\r\n\r\nProceed ([y]/n)?\r\n\r\nERROR conda.core.link:_execute_actions(319): An error occurred while installing package 'defaults::qt-5.6.2-vc9_3'.\r\n```\r\n \n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Try to keep compat small because it's imported by everything\n# What is compat, and what isn't?\n# If a piece of code is \"general\" and used in multiple modules, it goes here.\n# If it's only used in one module, keep it in that module, preferably near the top.\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom itertools import chain\nfrom operator import methodcaller\nfrom os import chmod, lstat\nfrom os.path import islink\nimport sys\n\non_win = bool(sys.platform == \"win32\")\n\nPY2 = sys.version_info[0] == 2\nPY3 = sys.version_info[0] == 3\n\n\n# #############################\n# equivalent commands\n# #############################\n\nif PY3: # pragma: py2 no cover\n string_types = str,\n integer_types = int,\n class_types = type,\n text_type = str\n binary_type = bytes\n input = input\n range = range\n\nelif PY2: # pragma: py3 no cover\n from types import ClassType\n string_types = basestring,\n integer_types = (int, long)\n class_types = (type, 
ClassType)\n text_type = unicode\n binary_type = str\n input = raw_input\n range = xrange\n\n\n# #############################\n# equivalent imports\n# #############################\n\nif PY3: # pragma: py2 no cover\n from io import StringIO\n from itertools import zip_longest\nelif PY2: # pragma: py3 no cover\n from cStringIO import StringIO\n from itertools import izip as zip, izip_longest as zip_longest\n\nStringIO = StringIO\nzip = zip\nzip_longest = zip_longest\n\n\n# #############################\n# equivalent functions\n# #############################\n\nif PY3: # pragma: py2 no cover\n def iterkeys(d, **kw):\n return iter(d.keys(**kw))\n\n def itervalues(d, **kw):\n return iter(d.values(**kw))\n\n def iteritems(d, **kw):\n return iter(d.items(**kw))\n\n viewkeys = methodcaller(\"keys\")\n viewvalues = methodcaller(\"values\")\n viewitems = methodcaller(\"items\")\n\n def lchmod(path, mode):\n try:\n chmod(path, mode, follow_symlinks=False)\n except (TypeError, NotImplementedError, SystemError):\n # On systems that don't allow permissions on symbolic links, skip\n # links entirely.\n if not islink(path):\n chmod(path, mode)\n\n\n from collections import Iterable\n def isiterable(obj):\n return not isinstance(obj, string_types) and isinstance(obj, Iterable)\n\nelif PY2: # pragma: py3 no cover\n def iterkeys(d, **kw):\n return d.iterkeys(**kw)\n\n def itervalues(d, **kw):\n return d.itervalues(**kw)\n\n def iteritems(d, **kw):\n return d.iteritems(**kw)\n\n viewkeys = methodcaller(\"viewkeys\")\n viewvalues = methodcaller(\"viewvalues\")\n viewitems = methodcaller(\"viewitems\")\n\n try:\n from os import lchmod as os_lchmod\n lchmod = os_lchmod\n except ImportError:\n def lchmod(path, mode):\n # On systems that don't allow permissions on symbolic links, skip\n # links entirely.\n if not islink(path):\n chmod(path, mode)\n\n def isiterable(obj):\n return (hasattr(obj, '__iter__')\n and not isinstance(obj, string_types)\n and type(obj) is not type)\n\n\n# #############################\n# other\n# #############################\n\ndef with_metaclass(Type, skip_attrs=set(('__dict__', '__weakref__'))):\n \"\"\"Class decorator to set metaclass.\n\n Works with both Python 2 and Python 3 and it does not add\n an extra class in the lookup order like ``six.with_metaclass`` does\n (that is -- it copies the original class instead of using inheritance).\n\n \"\"\"\n\n def _clone_with_metaclass(Class):\n attrs = dict((key, value) for key, value in iteritems(vars(Class))\n if key not in skip_attrs)\n return Type(Class.__name__, Class.__bases__, attrs)\n\n return _clone_with_metaclass\n\n\nfrom collections import OrderedDict as odict\nodict = odict\n\nNoneType = type(None)\nprimitive_types = tuple(chain(string_types, integer_types, (float, complex, bool, NoneType)))\n\n\ndef ensure_binary(value):\n return value.encode('utf-8') if hasattr(value, 'encode') else value\n\n\ndef ensure_text_type(value):\n return value.decode('utf-8') if hasattr(value, 'decode') else value\n\n\ndef ensure_unicode(value):\n return value.decode('unicode_escape') if hasattr(value, 'decode') else value\n\n\n# TODO: move this somewhere else\n# work-around for python bug on Windows prior to python 3.2\n# https://bugs.python.org/issue10027\n# Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation\nclass CrossPlatformStLink(object):\n _st_nlink = None\n\n def __call__(self, path):\n return self.st_nlink(path)\n\n @classmethod\n def st_nlink(cls, path):\n if cls._st_nlink is None:\n cls._initialize()\n return 
cls._st_nlink(path)\n\n @classmethod\n def _standard_st_nlink(cls, path):\n return lstat(path).st_nlink\n\n @classmethod\n def _windows_st_nlink(cls, path):\n st_nlink = cls._standard_st_nlink(path)\n if st_nlink != 0:\n return st_nlink\n else:\n # cannot trust python on Windows when st_nlink == 0\n # get value using windows libraries to be sure of its true value\n # Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation\n GENERIC_READ = 0x80000000\n FILE_SHARE_READ = 0x00000001\n OPEN_EXISTING = 3\n hfile = cls.CreateFile(path, GENERIC_READ, FILE_SHARE_READ, None,\n OPEN_EXISTING, 0, None)\n if hfile is None:\n from ctypes import WinError\n raise WinError()\n info = cls.BY_HANDLE_FILE_INFORMATION()\n rv = cls.GetFileInformationByHandle(hfile, info)\n cls.CloseHandle(hfile)\n if rv == 0:\n from ctypes import WinError\n raise WinError()\n return info.nNumberOfLinks\n\n @classmethod\n def _initialize(cls):\n if not on_win:\n cls._st_nlink = cls._standard_st_nlink\n else:\n # http://msdn.microsoft.com/en-us/library/windows/desktop/aa363858\n import ctypes\n from ctypes import POINTER\n from ctypes.wintypes import DWORD, HANDLE, BOOL\n\n cls.CreateFile = ctypes.windll.kernel32.CreateFileW\n cls.CreateFile.argtypes = [ctypes.c_wchar_p, DWORD, DWORD, ctypes.c_void_p,\n DWORD, DWORD, HANDLE]\n cls.CreateFile.restype = HANDLE\n\n # http://msdn.microsoft.com/en-us/library/windows/desktop/ms724211\n cls.CloseHandle = ctypes.windll.kernel32.CloseHandle\n cls.CloseHandle.argtypes = [HANDLE]\n cls.CloseHandle.restype = BOOL\n\n class FILETIME(ctypes.Structure):\n _fields_ = [(\"dwLowDateTime\", DWORD),\n (\"dwHighDateTime\", DWORD)]\n\n class BY_HANDLE_FILE_INFORMATION(ctypes.Structure):\n _fields_ = [(\"dwFileAttributes\", DWORD),\n (\"ftCreationTime\", FILETIME),\n (\"ftLastAccessTime\", FILETIME),\n (\"ftLastWriteTime\", FILETIME),\n (\"dwVolumeSerialNumber\", DWORD),\n (\"nFileSizeHigh\", DWORD),\n (\"nFileSizeLow\", DWORD),\n (\"nNumberOfLinks\", DWORD),\n (\"nFileIndexHigh\", DWORD),\n (\"nFileIndexLow\", DWORD)]\n cls.BY_HANDLE_FILE_INFORMATION = BY_HANDLE_FILE_INFORMATION\n\n # http://msdn.microsoft.com/en-us/library/windows/desktop/aa364952\n cls.GetFileInformationByHandle = ctypes.windll.kernel32.GetFileInformationByHandle\n cls.GetFileInformationByHandle.argtypes = [HANDLE, POINTER(BY_HANDLE_FILE_INFORMATION)]\n cls.GetFileInformationByHandle.restype = BOOL\n\n cls._st_nlink = cls._windows_st_nlink\n", "path": "conda/common/compat.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Try to keep compat small because it's imported by everything\n# What is compat, and what isn't?\n# If a piece of code is \"general\" and used in multiple modules, it goes here.\n# If it's only used in one module, keep it in that module, preferably near the top.\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom itertools import chain\nfrom operator import methodcaller\nfrom os import chmod, lstat\nfrom os.path import islink\nimport sys\n\non_win = bool(sys.platform == \"win32\")\n\nPY2 = sys.version_info[0] == 2\nPY3 = sys.version_info[0] == 3\n\n\n# #############################\n# equivalent commands\n# #############################\n\nif PY3: # pragma: py2 no cover\n string_types = str,\n integer_types = int,\n class_types = type,\n text_type = str\n binary_type = bytes\n input = input\n range = range\n\nelif PY2: # pragma: py3 no cover\n from types import ClassType\n string_types = basestring,\n integer_types = (int, long)\n 
class_types = (type, ClassType)\n text_type = unicode\n binary_type = str\n input = raw_input\n range = xrange\n\n\n# #############################\n# equivalent imports\n# #############################\n\nif PY3: # pragma: py2 no cover\n from io import StringIO\n from itertools import zip_longest\nelif PY2: # pragma: py3 no cover\n from cStringIO import StringIO\n from itertools import izip as zip, izip_longest as zip_longest\n\nStringIO = StringIO\nzip = zip\nzip_longest = zip_longest\n\n\n# #############################\n# equivalent functions\n# #############################\n\nif PY3: # pragma: py2 no cover\n def iterkeys(d, **kw):\n return iter(d.keys(**kw))\n\n def itervalues(d, **kw):\n return iter(d.values(**kw))\n\n def iteritems(d, **kw):\n return iter(d.items(**kw))\n\n viewkeys = methodcaller(\"keys\")\n viewvalues = methodcaller(\"values\")\n viewitems = methodcaller(\"items\")\n\n def lchmod(path, mode):\n try:\n chmod(path, mode, follow_symlinks=False)\n except (TypeError, NotImplementedError, SystemError):\n # On systems that don't allow permissions on symbolic links, skip\n # links entirely.\n if not islink(path):\n chmod(path, mode)\n\n\n from collections import Iterable\n def isiterable(obj):\n return not isinstance(obj, string_types) and isinstance(obj, Iterable)\n\nelif PY2: # pragma: py3 no cover\n def iterkeys(d, **kw):\n return d.iterkeys(**kw)\n\n def itervalues(d, **kw):\n return d.itervalues(**kw)\n\n def iteritems(d, **kw):\n return d.iteritems(**kw)\n\n viewkeys = methodcaller(\"viewkeys\")\n viewvalues = methodcaller(\"viewvalues\")\n viewitems = methodcaller(\"viewitems\")\n\n try:\n from os import lchmod as os_lchmod\n lchmod = os_lchmod\n except ImportError:\n def lchmod(path, mode):\n # On systems that don't allow permissions on symbolic links, skip\n # links entirely.\n if not islink(path):\n chmod(path, mode)\n\n def isiterable(obj):\n return (hasattr(obj, '__iter__')\n and not isinstance(obj, string_types)\n and type(obj) is not type)\n\n\n# #############################\n# other\n# #############################\n\ndef with_metaclass(Type, skip_attrs=set(('__dict__', '__weakref__'))):\n \"\"\"Class decorator to set metaclass.\n\n Works with both Python 2 and Python 3 and it does not add\n an extra class in the lookup order like ``six.with_metaclass`` does\n (that is -- it copies the original class instead of using inheritance).\n\n \"\"\"\n\n def _clone_with_metaclass(Class):\n attrs = dict((key, value) for key, value in iteritems(vars(Class))\n if key not in skip_attrs)\n return Type(Class.__name__, Class.__bases__, attrs)\n\n return _clone_with_metaclass\n\n\nfrom collections import OrderedDict as odict\nodict = odict\n\nNoneType = type(None)\nprimitive_types = tuple(chain(string_types, integer_types, (float, complex, bool, NoneType)))\n\n\ndef ensure_binary(value):\n return value.encode('utf-8') if hasattr(value, 'encode') else value\n\n\ndef ensure_text_type(value):\n if hasattr(value, 'decode'):\n try:\n return value.decode('utf-8')\n except UnicodeDecodeError:\n from requests.packages.chardet import detect\n encoding = detect(value).get('encoding') or 'utf-8'\n return value.decode(encoding)\n else:\n return value\n\n\ndef ensure_unicode(value):\n return value.decode('unicode_escape') if hasattr(value, 'decode') else value\n\n\n# TODO: move this somewhere else\n# work-around for python bug on Windows prior to python 3.2\n# https://bugs.python.org/issue10027\n# Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation\nclass 
CrossPlatformStLink(object):\n _st_nlink = None\n\n def __call__(self, path):\n return self.st_nlink(path)\n\n @classmethod\n def st_nlink(cls, path):\n if cls._st_nlink is None:\n cls._initialize()\n return cls._st_nlink(path)\n\n @classmethod\n def _standard_st_nlink(cls, path):\n return lstat(path).st_nlink\n\n @classmethod\n def _windows_st_nlink(cls, path):\n st_nlink = cls._standard_st_nlink(path)\n if st_nlink != 0:\n return st_nlink\n else:\n # cannot trust python on Windows when st_nlink == 0\n # get value using windows libraries to be sure of its true value\n # Adapted from the ntfsutils package, Copyright (c) 2012, the Mozilla Foundation\n GENERIC_READ = 0x80000000\n FILE_SHARE_READ = 0x00000001\n OPEN_EXISTING = 3\n hfile = cls.CreateFile(path, GENERIC_READ, FILE_SHARE_READ, None,\n OPEN_EXISTING, 0, None)\n if hfile is None:\n from ctypes import WinError\n raise WinError()\n info = cls.BY_HANDLE_FILE_INFORMATION()\n rv = cls.GetFileInformationByHandle(hfile, info)\n cls.CloseHandle(hfile)\n if rv == 0:\n from ctypes import WinError\n raise WinError()\n return info.nNumberOfLinks\n\n @classmethod\n def _initialize(cls):\n if not on_win:\n cls._st_nlink = cls._standard_st_nlink\n else:\n # http://msdn.microsoft.com/en-us/library/windows/desktop/aa363858\n import ctypes\n from ctypes import POINTER\n from ctypes.wintypes import DWORD, HANDLE, BOOL\n\n cls.CreateFile = ctypes.windll.kernel32.CreateFileW\n cls.CreateFile.argtypes = [ctypes.c_wchar_p, DWORD, DWORD, ctypes.c_void_p,\n DWORD, DWORD, HANDLE]\n cls.CreateFile.restype = HANDLE\n\n # http://msdn.microsoft.com/en-us/library/windows/desktop/ms724211\n cls.CloseHandle = ctypes.windll.kernel32.CloseHandle\n cls.CloseHandle.argtypes = [HANDLE]\n cls.CloseHandle.restype = BOOL\n\n class FILETIME(ctypes.Structure):\n _fields_ = [(\"dwLowDateTime\", DWORD),\n (\"dwHighDateTime\", DWORD)]\n\n class BY_HANDLE_FILE_INFORMATION(ctypes.Structure):\n _fields_ = [(\"dwFileAttributes\", DWORD),\n (\"ftCreationTime\", FILETIME),\n (\"ftLastAccessTime\", FILETIME),\n (\"ftLastWriteTime\", FILETIME),\n (\"dwVolumeSerialNumber\", DWORD),\n (\"nFileSizeHigh\", DWORD),\n (\"nFileSizeLow\", DWORD),\n (\"nNumberOfLinks\", DWORD),\n (\"nFileIndexHigh\", DWORD),\n (\"nFileIndexLow\", DWORD)]\n cls.BY_HANDLE_FILE_INFORMATION = BY_HANDLE_FILE_INFORMATION\n\n # http://msdn.microsoft.com/en-us/library/windows/desktop/aa364952\n cls.GetFileInformationByHandle = ctypes.windll.kernel32.GetFileInformationByHandle\n cls.GetFileInformationByHandle.argtypes = [HANDLE, POINTER(BY_HANDLE_FILE_INFORMATION)]\n cls.GetFileInformationByHandle.restype = BOOL\n\n cls._st_nlink = cls._windows_st_nlink\n", "path": "conda/common/compat.py"}]} | 3,536 | 159 |
gh_patches_debug_3507 | rasdani/github-patches | git_diff | jazzband__pip-tools-1039 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
setup.py install_requires should have `"click>=7"` not `"click>=6"`
Thank you for all the work on this tool; it's very useful.
Issue:
As of 4.4.0 pip-tools now depends on version 7.0 of click, not 6.0.
The argument `show_envvar` is now being passed to `click.option()`
https://github.com/jazzband/pip-tools/compare/4.3.0...4.4.0#diff-c8673e93c598354ab4a9aa8dd090e913R183
That argument was added in click 7.0
https://click.palletsprojects.com/en/7.x/api/#click.Option
compared to
https://click.palletsprojects.com/en/6.x/api/#click.Option
Fix: setup.py install_requires should have `"click>=7"` not `"click>=6"`
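As an illustrative aside, the incompatibility is easy to reproduce because `show_envvar` is not an accepted `click.option()` keyword before 7.0. A minimal, hypothetical sketch (the option and environment-variable names are made up):
```python
# Works with click>=7; on click 6.x the same decorator raises
# TypeError: __init__() got an unexpected keyword argument 'show_envvar'.
import click

@click.command()
@click.option("--output", envvar="EXAMPLE_OUTPUT", show_envvar=True)  # needs click>=7
def cli(output):
    click.echo(output)

if __name__ == "__main__":
    cli()
```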
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """
2 pip-tools keeps your pinned dependencies fresh.
3 """
4 from os.path import abspath, dirname, join
5
6 from setuptools import find_packages, setup
7
8
9 def read_file(filename):
10 """Read the contents of a file located relative to setup.py"""
11 with open(join(abspath(dirname(__file__)), filename)) as thefile:
12 return thefile.read()
13
14
15 setup(
16 name="pip-tools",
17 use_scm_version=True,
18 url="https://github.com/jazzband/pip-tools/",
19 license="BSD",
20 author="Vincent Driessen",
21 author_email="[email protected]",
22 description=__doc__.strip(),
23 long_description=read_file("README.rst"),
24 long_description_content_type="text/x-rst",
25 packages=find_packages(exclude=["tests"]),
26 package_data={},
27 python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*",
28 setup_requires=["setuptools_scm"],
29 install_requires=["click>=6", "six"],
30 zip_safe=False,
31 entry_points={
32 "console_scripts": [
33 "pip-compile = piptools.scripts.compile:cli",
34 "pip-sync = piptools.scripts.sync:cli",
35 ]
36 },
37 platforms="any",
38 classifiers=[
39 "Development Status :: 5 - Production/Stable",
40 "Intended Audience :: Developers",
41 "Intended Audience :: System Administrators",
42 "License :: OSI Approved :: BSD License",
43 "Operating System :: OS Independent",
44 "Programming Language :: Python",
45 "Programming Language :: Python :: 2",
46 "Programming Language :: Python :: 2.7",
47 "Programming Language :: Python :: 3",
48 "Programming Language :: Python :: 3.5",
49 "Programming Language :: Python :: 3.6",
50 "Programming Language :: Python :: 3.7",
51 "Programming Language :: Python :: 3.8",
52 "Programming Language :: Python :: Implementation :: CPython",
53 "Programming Language :: Python :: Implementation :: PyPy",
54 "Topic :: System :: Systems Administration",
55 ],
56 )
57
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -26,7 +26,7 @@
package_data={},
python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*",
setup_requires=["setuptools_scm"],
- install_requires=["click>=6", "six"],
+ install_requires=["click>=7", "six"],
zip_safe=False,
entry_points={
"console_scripts": [
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -26,7 +26,7 @@\n package_data={},\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\",\n setup_requires=[\"setuptools_scm\"],\n- install_requires=[\"click>=6\", \"six\"],\n+ install_requires=[\"click>=7\", \"six\"],\n zip_safe=False,\n entry_points={\n \"console_scripts\": [\n", "issue": "setup.py install_requires should have `\"click>=7\"` not `\"click>=6\"`\nThank you for all the work on this tool, it's very useful.\r\n\r\nIssue:\r\nAs of 4.4.0 pip-tools now depends on version 7.0 of click, not 6.0.\r\n\r\nThe argument `show_envvar` is now being passed to `click.option()`\r\nhttps://github.com/jazzband/pip-tools/compare/4.3.0...4.4.0#diff-c8673e93c598354ab4a9aa8dd090e913R183\r\n\r\nThat argument was added in click 7.0\r\nhttps://click.palletsprojects.com/en/7.x/api/#click.Option\r\ncompared to \r\nhttps://click.palletsprojects.com/en/6.x/api/#click.Option\r\n\r\nFix: setup.py install_requires should have `\"click>=7\"` not `\"click>=6\"`\n", "before_files": [{"content": "\"\"\"\npip-tools keeps your pinned dependencies fresh.\n\"\"\"\nfrom os.path import abspath, dirname, join\n\nfrom setuptools import find_packages, setup\n\n\ndef read_file(filename):\n \"\"\"Read the contents of a file located relative to setup.py\"\"\"\n with open(join(abspath(dirname(__file__)), filename)) as thefile:\n return thefile.read()\n\n\nsetup(\n name=\"pip-tools\",\n use_scm_version=True,\n url=\"https://github.com/jazzband/pip-tools/\",\n license=\"BSD\",\n author=\"Vincent Driessen\",\n author_email=\"[email protected]\",\n description=__doc__.strip(),\n long_description=read_file(\"README.rst\"),\n long_description_content_type=\"text/x-rst\",\n packages=find_packages(exclude=[\"tests\"]),\n package_data={},\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\",\n setup_requires=[\"setuptools_scm\"],\n install_requires=[\"click>=6\", \"six\"],\n zip_safe=False,\n entry_points={\n \"console_scripts\": [\n \"pip-compile = piptools.scripts.compile:cli\",\n \"pip-sync = piptools.scripts.sync:cli\",\n ]\n },\n platforms=\"any\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: System :: Systems Administration\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"\npip-tools keeps your pinned dependencies fresh.\n\"\"\"\nfrom os.path import abspath, dirname, join\n\nfrom setuptools import find_packages, setup\n\n\ndef read_file(filename):\n \"\"\"Read the contents of a file located relative to setup.py\"\"\"\n with open(join(abspath(dirname(__file__)), filename)) as thefile:\n return thefile.read()\n\n\nsetup(\n name=\"pip-tools\",\n use_scm_version=True,\n url=\"https://github.com/jazzband/pip-tools/\",\n license=\"BSD\",\n author=\"Vincent Driessen\",\n author_email=\"[email protected]\",\n 
description=__doc__.strip(),\n long_description=read_file(\"README.rst\"),\n long_description_content_type=\"text/x-rst\",\n packages=find_packages(exclude=[\"tests\"]),\n package_data={},\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\",\n setup_requires=[\"setuptools_scm\"],\n install_requires=[\"click>=7\", \"six\"],\n zip_safe=False,\n entry_points={\n \"console_scripts\": [\n \"pip-compile = piptools.scripts.compile:cli\",\n \"pip-sync = piptools.scripts.sync:cli\",\n ]\n },\n platforms=\"any\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: System :: Systems Administration\",\n ],\n)\n", "path": "setup.py"}]} | 1,020 | 120 |
gh_patches_debug_5040 | rasdani/github-patches | git_diff | pymodbus-dev__pymodbus-1355 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pymodbus.simulator fails with no running event loop
### Versions
* Python: 3.9.2
* OS: Debian Bullseye
* Pymodbus: 3.1.3 latest dev branch
* Modbus Hardware (if used):
### Pymodbus Specific
* Server: tcp - sync/async
* Client: tcp - sync/async
### Description
Executing pymodbus.simulator from the commandline results in the following error:
```
$ pymodbus.simulator
10:39:28 INFO logging:74 Start simulator
Traceback (most recent call last):
File "/usr/local/bin/pymodbus.simulator", line 33, in <module>
sys.exit(load_entry_point('pymodbus===3.1.x', 'console_scripts', 'pymodbus.simulator')())
File "/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/main.py", line 112, in main
task = ModbusSimulatorServer(**cmd_args)
File "/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/http_server.py", line 134, in __init__
server["loop"] = asyncio.get_running_loop()
RuntimeError: no running event loop
```
NOTE: I am running this from the pymodbus/server/simulator/ folder, so it picks up the example [setup.json](https://github.com/pymodbus-dev/pymodbus/blob/dev/pymodbus/server/simulator/setup.json) file.
Manually specifying available options from the commandline results in the same error as well:
```
$ pymodbus.simulator \
--http_host 0.0.0.0 \
--http_port 8080 \
--modbus_server server \
--modbus_device device \
--json_file ~/git/pymodbus/pymodbus/server/simulator/setup.json
11:24:07 INFO logging:74 Start simulator
Traceback (most recent call last):
File "/usr/local/bin/pymodbus.simulator", line 33, in <module>
sys.exit(load_entry_point('pymodbus===3.1.x', 'console_scripts', 'pymodbus.simulator')())
File "/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/main.py", line 112, in main
task = ModbusSimulatorServer(**cmd_args)
File "/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/http_server.py", line 134, in __init__
server["loop"] = asyncio.get_running_loop()
RuntimeError: no running event loop
```
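
As an editorial aside (not part of the original report): `asyncio.get_running_loop()` only succeeds while an event loop is actually running, so constructing `ModbusSimulatorServer` before `asyncio.run()` has started cannot work. A minimal, self-contained sketch of that behaviour, unrelated to pymodbus internals:

```python
# Minimal sketch of the failure mode; names here are illustrative only.
import asyncio

async def build_inside_loop():
    # Inside asyncio.run() a loop is running, so this succeeds.
    return asyncio.get_running_loop()

try:
    asyncio.get_running_loop()  # called at CLI/startup time: no loop yet
except RuntimeError as exc:
    print(f"outside a loop: {exc}")  # "no running event loop"

print(f"inside a loop: {asyncio.run(build_inside_loop())}")
```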
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pymodbus/server/simulator/main.py`
Content:
```
1 #!/usr/bin/env python3
2 """HTTP server for modbus simulator.
3
4 The modbus simulator contain 3 distint parts:
5
6 - Datastore simulator, to define registers and their behaviour including actions: (simulator)(../../datastore/simulator.py)
7 - Modbus server: (server)(./http_server.py)
8 - HTTP server with REST API and web pages providing an online console in your browser
9
10 Multiple setups for different server types and/or devices are prepared in a (json file)(./setup.json), the detailed configuration is explained in (doc)(README.md)
11
12 The command line parameters are kept to a minimum:
13
14 usage: main.py [-h] [--modbus_server MODBUS_SERVER]
15 [--modbus_device MODBUS_DEVICE] [--http_host HTTP_HOST]
16 [--http_port HTTP_PORT]
17 [--log {critical,error,warning,info,debug}]
18 [--json_file JSON_FILE]
19 [--custom_actions_module CUSTOM_ACTIONS_MODULE]
20
21 Modbus server with REST-API and web server
22
23 options:
24 -h, --help show this help message and exit
25 --modbus_server MODBUS_SERVER
26 use <modbus_server> from server_list in json file
27 --modbus_device MODBUS_DEVICE
28 use <modbus_device> from device_list in json file
29 --http_host HTTP_HOST
30 use <http_host> as host to bind http listen
31 --http_port HTTP_PORT
32 use <http_port> as port to bind http listen
33 --log {critical,error,warning,info,debug}
34 set log level, default is info
35 --log_file LOG_FILE
36 name of server log file, default is "server.log"
37 --json_file JSON_FILE
38 name of json_file, default is "setup.json"
39 --custom_actions_module CUSTOM_ACTIONS_MODULE
40 python file with custom actions, default is none
41 """
42 import argparse
43 import asyncio
44
45 from pymodbus import pymodbus_apply_logging_config
46 from pymodbus.logging import Log
47 from pymodbus.server.simulator.http_server import ModbusSimulatorServer
48
49
50 def get_commandline():
51 """Get command line arguments."""
52 parser = argparse.ArgumentParser(
53 description="Modbus server with REST-API and web server"
54 )
55 parser.add_argument(
56 "--modbus_server",
57 help="use <modbus_server> from server_list in json file",
58 type=str,
59 )
60 parser.add_argument(
61 "--modbus_device",
62 help="use <modbus_device> from device_list in json file",
63 type=str,
64 )
65 parser.add_argument(
66 "--http_host",
67 help="use <http_host> as host to bind http listen",
68 type=str,
69 )
70 parser.add_argument(
71 "--http_port",
72 help="use <http_port> as port to bind http listen",
73 type=str,
74 )
75 parser.add_argument(
76 "--log",
77 choices=["critical", "error", "warning", "info", "debug"],
78 help="set log level, default is info",
79 default="info",
80 type=str,
81 )
82 parser.add_argument(
83 "--json_file",
84 help='name of json file, default is "setup.json"',
85 type=str,
86 )
87 parser.add_argument(
88 "--log_file",
89 help='name of server log file, default is "server.log"',
90 type=str,
91 )
92 parser.add_argument(
93 "--custom_actions_module",
94 help="python file with custom actions, default is none",
95 type=str,
96 )
97 args = parser.parse_args()
98 pymodbus_apply_logging_config(args.log.upper())
99 Log.info("Start simulator")
100 cmd_args = {}
101 for argument in args.__dict__:
102 if argument == "log":
103 continue
104 if args.__dict__[argument] is not None:
105 cmd_args[argument] = args.__dict__[argument]
106 return cmd_args
107
108
109 def main():
110 """Run server."""
111 cmd_args = get_commandline()
112 task = ModbusSimulatorServer(**cmd_args)
113 asyncio.run(task.run_forever(), debug=True)
114
115
116 if __name__ == "__main__":
117 main()
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pymodbus/server/simulator/main.py b/pymodbus/server/simulator/main.py
--- a/pymodbus/server/simulator/main.py
+++ b/pymodbus/server/simulator/main.py
@@ -106,11 +106,16 @@
return cmd_args
-def main():
- """Run server."""
+async def run_main():
+ """Run server async."""
cmd_args = get_commandline()
task = ModbusSimulatorServer(**cmd_args)
- asyncio.run(task.run_forever(), debug=True)
+ await task.run_forever()
+
+
+def main():
+ """Run server."""
+ asyncio.run(run_main(), debug=True)
if __name__ == "__main__":
| {"golden_diff": "diff --git a/pymodbus/server/simulator/main.py b/pymodbus/server/simulator/main.py\n--- a/pymodbus/server/simulator/main.py\n+++ b/pymodbus/server/simulator/main.py\n@@ -106,11 +106,16 @@\n return cmd_args\n \n \n-def main():\n- \"\"\"Run server.\"\"\"\n+async def run_main():\n+ \"\"\"Run server async.\"\"\"\n cmd_args = get_commandline()\n task = ModbusSimulatorServer(**cmd_args)\n- asyncio.run(task.run_forever(), debug=True)\n+ await task.run_forever()\n+\n+\n+def main():\n+ \"\"\"Run server.\"\"\"\n+ asyncio.run(run_main(), debug=True)\n \n \n if __name__ == \"__main__\":\n", "issue": "pymodbus.simulator fails with no running event loop\n### Versions\r\n\r\n* Python: 3.9.2\r\n* OS: Debian Bullseye\r\n* Pymodbus: 3.1.3 latest dev branch\r\n* Modbus Hardware (if used):\r\n\r\n### Pymodbus Specific\r\n* Server: tcp - sync/async\r\n* Client: tcp - sync/async\r\n\r\n### Description\r\n\r\nExecuting pymodbus.simulator from the commandline results in the following error:\r\n\r\n```\r\n$ pymodbus.simulator\r\n10:39:28 INFO logging:74 Start simulator\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/pymodbus.simulator\", line 33, in <module>\r\n sys.exit(load_entry_point('pymodbus===3.1.x', 'console_scripts', 'pymodbus.simulator')())\r\n File \"/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/main.py\", line 112, in main\r\n task = ModbusSimulatorServer(**cmd_args)\r\n File \"/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/http_server.py\", line 134, in __init__\r\n server[\"loop\"] = asyncio.get_running_loop()\r\nRuntimeError: no running event loop\r\n```\r\nNOTE: I am running this from the pymodbus/server/simulator/ folder, so it picks up the example [setup.json](https://github.com/pymodbus-dev/pymodbus/blob/dev/pymodbus/server/simulator/setup.json) file.\r\n\r\nManually specifying available options from the commandline results in the same error as well:\r\n```\r\n$ pymodbus.simulator \\\r\n --http_host 0.0.0.0 \\\r\n --http_port 8080 \\\r\n --modbus_server server \\\r\n --modbus_device device \\\r\n --json_file ~/git/pymodbus/pymodbus/server/simulator/setup.json\r\n\r\n11:24:07 INFO logging:74 Start simulator\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/pymodbus.simulator\", line 33, in <module>\r\n sys.exit(load_entry_point('pymodbus===3.1.x', 'console_scripts', 'pymodbus.simulator')())\r\n File \"/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/main.py\", line 112, in main\r\n task = ModbusSimulatorServer(**cmd_args)\r\n File \"/usr/local/lib/python3.9/dist-packages/pymodbus/server/simulator/http_server.py\", line 134, in __init__\r\n server[\"loop\"] = asyncio.get_running_loop()\r\nRuntimeError: no running event loop\r\n```\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\"\"\"HTTP server for modbus simulator.\n\nThe modbus simulator contain 3 distint parts:\n\n- Datastore simulator, to define registers and their behaviour including actions: (simulator)(../../datastore/simulator.py)\n- Modbus server: (server)(./http_server.py)\n- HTTP server with REST API and web pages providing an online console in your browser\n\nMultiple setups for different server types and/or devices are prepared in a (json file)(./setup.json), the detailed configuration is explained in (doc)(README.md)\n\nThe command line parameters are kept to a minimum:\n\nusage: main.py [-h] [--modbus_server MODBUS_SERVER]\n [--modbus_device MODBUS_DEVICE] [--http_host HTTP_HOST]\n [--http_port HTTP_PORT]\n [--log 
{critical,error,warning,info,debug}]\n [--json_file JSON_FILE]\n [--custom_actions_module CUSTOM_ACTIONS_MODULE]\n\nModbus server with REST-API and web server\n\noptions:\n -h, --help show this help message and exit\n --modbus_server MODBUS_SERVER\n use <modbus_server> from server_list in json file\n --modbus_device MODBUS_DEVICE\n use <modbus_device> from device_list in json file\n --http_host HTTP_HOST\n use <http_host> as host to bind http listen\n --http_port HTTP_PORT\n use <http_port> as port to bind http listen\n --log {critical,error,warning,info,debug}\n set log level, default is info\n --log_file LOG_FILE\n name of server log file, default is \"server.log\"\n --json_file JSON_FILE\n name of json_file, default is \"setup.json\"\n --custom_actions_module CUSTOM_ACTIONS_MODULE\n python file with custom actions, default is none\n\"\"\"\nimport argparse\nimport asyncio\n\nfrom pymodbus import pymodbus_apply_logging_config\nfrom pymodbus.logging import Log\nfrom pymodbus.server.simulator.http_server import ModbusSimulatorServer\n\n\ndef get_commandline():\n \"\"\"Get command line arguments.\"\"\"\n parser = argparse.ArgumentParser(\n description=\"Modbus server with REST-API and web server\"\n )\n parser.add_argument(\n \"--modbus_server\",\n help=\"use <modbus_server> from server_list in json file\",\n type=str,\n )\n parser.add_argument(\n \"--modbus_device\",\n help=\"use <modbus_device> from device_list in json file\",\n type=str,\n )\n parser.add_argument(\n \"--http_host\",\n help=\"use <http_host> as host to bind http listen\",\n type=str,\n )\n parser.add_argument(\n \"--http_port\",\n help=\"use <http_port> as port to bind http listen\",\n type=str,\n )\n parser.add_argument(\n \"--log\",\n choices=[\"critical\", \"error\", \"warning\", \"info\", \"debug\"],\n help=\"set log level, default is info\",\n default=\"info\",\n type=str,\n )\n parser.add_argument(\n \"--json_file\",\n help='name of json file, default is \"setup.json\"',\n type=str,\n )\n parser.add_argument(\n \"--log_file\",\n help='name of server log file, default is \"server.log\"',\n type=str,\n )\n parser.add_argument(\n \"--custom_actions_module\",\n help=\"python file with custom actions, default is none\",\n type=str,\n )\n args = parser.parse_args()\n pymodbus_apply_logging_config(args.log.upper())\n Log.info(\"Start simulator\")\n cmd_args = {}\n for argument in args.__dict__:\n if argument == \"log\":\n continue\n if args.__dict__[argument] is not None:\n cmd_args[argument] = args.__dict__[argument]\n return cmd_args\n\n\ndef main():\n \"\"\"Run server.\"\"\"\n cmd_args = get_commandline()\n task = ModbusSimulatorServer(**cmd_args)\n asyncio.run(task.run_forever(), debug=True)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "pymodbus/server/simulator/main.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\"\"\"HTTP server for modbus simulator.\n\nThe modbus simulator contain 3 distint parts:\n\n- Datastore simulator, to define registers and their behaviour including actions: (simulator)(../../datastore/simulator.py)\n- Modbus server: (server)(./http_server.py)\n- HTTP server with REST API and web pages providing an online console in your browser\n\nMultiple setups for different server types and/or devices are prepared in a (json file)(./setup.json), the detailed configuration is explained in (doc)(README.md)\n\nThe command line parameters are kept to a minimum:\n\nusage: main.py [-h] [--modbus_server MODBUS_SERVER]\n [--modbus_device MODBUS_DEVICE] [--http_host HTTP_HOST]\n [--http_port 
HTTP_PORT]\n [--log {critical,error,warning,info,debug}]\n [--json_file JSON_FILE]\n [--custom_actions_module CUSTOM_ACTIONS_MODULE]\n\nModbus server with REST-API and web server\n\noptions:\n -h, --help show this help message and exit\n --modbus_server MODBUS_SERVER\n use <modbus_server> from server_list in json file\n --modbus_device MODBUS_DEVICE\n use <modbus_device> from device_list in json file\n --http_host HTTP_HOST\n use <http_host> as host to bind http listen\n --http_port HTTP_PORT\n use <http_port> as port to bind http listen\n --log {critical,error,warning,info,debug}\n set log level, default is info\n --log_file LOG_FILE\n name of server log file, default is \"server.log\"\n --json_file JSON_FILE\n name of json_file, default is \"setup.json\"\n --custom_actions_module CUSTOM_ACTIONS_MODULE\n python file with custom actions, default is none\n\"\"\"\nimport argparse\nimport asyncio\n\nfrom pymodbus import pymodbus_apply_logging_config\nfrom pymodbus.logging import Log\nfrom pymodbus.server.simulator.http_server import ModbusSimulatorServer\n\n\ndef get_commandline():\n \"\"\"Get command line arguments.\"\"\"\n parser = argparse.ArgumentParser(\n description=\"Modbus server with REST-API and web server\"\n )\n parser.add_argument(\n \"--modbus_server\",\n help=\"use <modbus_server> from server_list in json file\",\n type=str,\n )\n parser.add_argument(\n \"--modbus_device\",\n help=\"use <modbus_device> from device_list in json file\",\n type=str,\n )\n parser.add_argument(\n \"--http_host\",\n help=\"use <http_host> as host to bind http listen\",\n type=str,\n )\n parser.add_argument(\n \"--http_port\",\n help=\"use <http_port> as port to bind http listen\",\n type=str,\n )\n parser.add_argument(\n \"--log\",\n choices=[\"critical\", \"error\", \"warning\", \"info\", \"debug\"],\n help=\"set log level, default is info\",\n default=\"info\",\n type=str,\n )\n parser.add_argument(\n \"--json_file\",\n help='name of json file, default is \"setup.json\"',\n type=str,\n )\n parser.add_argument(\n \"--log_file\",\n help='name of server log file, default is \"server.log\"',\n type=str,\n )\n parser.add_argument(\n \"--custom_actions_module\",\n help=\"python file with custom actions, default is none\",\n type=str,\n )\n args = parser.parse_args()\n pymodbus_apply_logging_config(args.log.upper())\n Log.info(\"Start simulator\")\n cmd_args = {}\n for argument in args.__dict__:\n if argument == \"log\":\n continue\n if args.__dict__[argument] is not None:\n cmd_args[argument] = args.__dict__[argument]\n return cmd_args\n\n\nasync def run_main():\n \"\"\"Run server async.\"\"\"\n cmd_args = get_commandline()\n task = ModbusSimulatorServer(**cmd_args)\n await task.run_forever()\n\n\ndef main():\n \"\"\"Run server.\"\"\"\n asyncio.run(run_main(), debug=True)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "pymodbus/server/simulator/main.py"}]} | 1,975 | 163 |
gh_patches_debug_16515 | rasdani/github-patches | git_diff | ansible__ansible-lint-2666 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Docs: Add details for using profiles
##### Summary
As content moves through the automation content creation pipeline, the linting rules that content is developed against may or may not apply, depending on where the content is being executed.
For example, in development a content creator should be able to create some rules and quickly validate them using ansible-lint locally. Then, in the first test run, an ansible-lint profile should allow the executor to ignore those rules so that the content itself runs as desired.
Update the documentation to explain profiles in a more progressive (step-by-step) way, with additional context that addresses the motivation for using them.
##### Issue Type
- Bug Report (Docs)
##### Ansible and Ansible Lint details
N/A
##### OS / ENVIRONMENT
Fedora 36
##### STEPS TO REPRODUCE
N/A
##### Desired Behavior
N/A
##### Actual Behavior
N/A
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/ansiblelint/generate_docs.py`
Content:
```
1 """Utils to generate rules documentation."""
2 import logging
3 from pathlib import Path
4 from typing import Iterable
5
6 from rich import box
7
8 # Remove this compatibility try-catch block once we drop support for rich < 10.7.0
9 try:
10 from rich.console import group
11 except ImportError:
12 from rich.console import render_group as group # type: ignore
13
14 from rich.markdown import Markdown
15 from rich.table import Table
16
17 from ansiblelint.config import PROFILES
18 from ansiblelint.constants import RULE_DOC_URL
19 from ansiblelint.rules import RulesCollection
20
21 DOC_HEADER = """
22 # Default Rules
23
24 (lint_default_rules)=
25
26 Below you can see the list of default rules Ansible Lint use to evaluate playbooks and roles:
27
28 """
29
30 _logger = logging.getLogger(__name__)
31
32
33 def rules_as_docs(rules: RulesCollection) -> str:
34 """Dump documentation files for all rules, returns only confirmation message.
35
36 That is internally used for building documentation and the API can change
37 at any time.
38 """
39 result = ""
40 dump_path = Path(".") / "docs" / "rules"
41 if not dump_path.exists():
42 raise RuntimeError(f"Failed to find {dump_path} folder for dumping rules.")
43
44 with open(dump_path / ".." / "profiles.md", "w", encoding="utf-8") as f:
45 f.write(profiles_as_md(header=True, docs_url="rules/"))
46
47 for rule in rules.alphabetical():
48 result = ""
49 with open(dump_path / f"{rule.id}.md", "w", encoding="utf-8") as f:
50 # because title == rule.id we get the desired labels for free
51 # and we do not have to insert `(target_header)=`
52 title = f"{rule.id}"
53
54 if rule.help:
55 if not rule.help.startswith(f"# {rule.id}"):
56 raise RuntimeError(
57 f"Rule {rule.__class__} markdown help does not start with `# {rule.id}` header.\n{rule.help}"
58 )
59 result = result[1:]
60 result += f"{rule.help}"
61 else:
62 description = rule.description
63 if rule.link:
64 description += f" [more]({rule.link})"
65
66 result += f"# {title}\n\n**{rule.shortdesc}**\n\n{description}"
67 f.write(result)
68
69 return "All markdown files for rules were dumped!"
70
71
72 def rules_as_str(rules: RulesCollection) -> str:
73 """Return rules as string."""
74 return "\n".join([str(rule) for rule in rules.alphabetical()])
75
76
77 def rules_as_md(rules: RulesCollection) -> str:
78 """Return md documentation for a list of rules."""
79 result = DOC_HEADER
80
81 for rule in rules.alphabetical():
82
83 # because title == rule.id we get the desired labels for free
84 # and we do not have to insert `(target_header)=`
85 title = f"{rule.id}"
86
87 if rule.help:
88 if not rule.help.startswith(f"# {rule.id}"):
89 raise RuntimeError(
90 f"Rule {rule.__class__} markdown help does not start with `# {rule.id}` header.\n{rule.help}"
91 )
92 result += f"\n\n{rule.help}"
93 else:
94 description = rule.description
95 if rule.link:
96 description += f" [more]({rule.link})"
97
98 result += f"\n\n## {title}\n\n**{rule.shortdesc}**\n\n{description}"
99
100 return result
101
102
103 @group()
104 def rules_as_rich(rules: RulesCollection) -> Iterable[Table]:
105 """Print documentation for a list of rules, returns empty string."""
106 width = max(16, *[len(rule.id) for rule in rules])
107 for rule in rules.alphabetical():
108 table = Table(show_header=True, header_style="title", box=box.MINIMAL)
109 table.add_column(rule.id, style="dim", width=width)
110 table.add_column(Markdown(rule.shortdesc))
111
112 description = rule.help or rule.description
113 if rule.link:
114 description += f" [(more)]({rule.link})"
115 table.add_row("description", Markdown(description))
116 if rule.version_added:
117 table.add_row("version_added", rule.version_added)
118 if rule.tags:
119 table.add_row("tags", ", ".join(rule.tags))
120 if rule.severity:
121 table.add_row("severity", rule.severity)
122 yield table
123
124
125 def profiles_as_md(header: bool = False, docs_url: str = RULE_DOC_URL) -> str:
126 """Return markdown representation of supported profiles."""
127 result = ""
128
129 if header:
130 result += """<!---
131 Do not manually edit, generated from generate_docs.py
132 -->
133 # Profiles
134
135 One of the best ways to run `ansible-lint` is by specifying which rule profile
136 you want to use. These profiles stack on top of each other, allowing you to
137 gradually raise the quality bar.
138
139 To run it with the most strict profile just type `ansible-lint --profile=production`.
140
141 If you want to consult the list of rules from each profile, type
142 `ansible-lint -P`. For your convenience, we also list the same output below.
143
144 The rules that have a `*` suffix, are not implemented yet but we documented
145 them with links to their issues.
146
147 ```{note}
148 Special rule tags such `opt-in` and `experimental` are automatically removed
149 when a rule is included in a profile, directly or indirectly. This means that
150 they will always execute once included.
151 ```
152
153 """
154
155 for name, profile in PROFILES.items():
156 extends = ""
157 if profile.get("extends", None):
158 extends = (
159 f" It extends [{profile['extends']}](#{profile['extends']}) profile."
160 )
161 result += f"## {name}\n\n{profile['description']}{extends}\n"
162 for rule, rule_data in profile["rules"].items():
163 if "[" in rule:
164 url = f"{docs_url}{rule.split('[')[0]}/"
165 else:
166 url = f"{docs_url}{rule}/"
167 if not rule_data:
168 result += f"- [{rule}]({url})\n"
169 else:
170 result += f"- [{rule}]({rule_data['url']})\n"
171
172 result += "\n"
173 return result
174
175
176 def profiles_as_rich() -> Markdown:
177 """Return rich representation of supported profiles."""
178 return Markdown(profiles_as_md())
179
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/ansiblelint/generate_docs.py b/src/ansiblelint/generate_docs.py
--- a/src/ansiblelint/generate_docs.py
+++ b/src/ansiblelint/generate_docs.py
@@ -132,22 +132,10 @@
-->
# Profiles
-One of the best ways to run `ansible-lint` is by specifying which rule profile
-you want to use. These profiles stack on top of each other, allowing you to
-gradually raise the quality bar.
-
-To run it with the most strict profile just type `ansible-lint --profile=production`.
-
-If you want to consult the list of rules from each profile, type
-`ansible-lint -P`. For your convenience, we also list the same output below.
-
-The rules that have a `*` suffix, are not implemented yet but we documented
-them with links to their issues.
+Ansible-lint profiles gradually increase the strictness of rules as your Ansible content lifecycle.
```{note}
-Special rule tags such `opt-in` and `experimental` are automatically removed
-when a rule is included in a profile, directly or indirectly. This means that
-they will always execute once included.
+Rules with `*` in the suffix are not yet implemented but are documented with linked GitHub issues.
```
"""
| {"golden_diff": "diff --git a/src/ansiblelint/generate_docs.py b/src/ansiblelint/generate_docs.py\n--- a/src/ansiblelint/generate_docs.py\n+++ b/src/ansiblelint/generate_docs.py\n@@ -132,22 +132,10 @@\n -->\n # Profiles\n \n-One of the best ways to run `ansible-lint` is by specifying which rule profile\n-you want to use. These profiles stack on top of each other, allowing you to\n-gradually raise the quality bar.\n-\n-To run it with the most strict profile just type `ansible-lint --profile=production`.\n-\n-If you want to consult the list of rules from each profile, type\n-`ansible-lint -P`. For your convenience, we also list the same output below.\n-\n-The rules that have a `*` suffix, are not implemented yet but we documented\n-them with links to their issues.\n+Ansible-lint profiles gradually increase the strictness of rules as your Ansible content lifecycle.\n \n ```{note}\n-Special rule tags such `opt-in` and `experimental` are automatically removed\n-when a rule is included in a profile, directly or indirectly. This means that\n-they will always execute once included.\n+Rules with `*` in the suffix are not yet implemented but are documented with linked GitHub issues.\n ```\n \n \"\"\"\n", "issue": "Docs: Add details for using profiles\n##### Summary\r\n\r\nAs content moves through the automation content creation pipeline, various linting rules that are developed against may or may not apply depending on where the content is being executed. \r\n\r\nFor example, in development, as a content creator, there should be the ability to create some rules and quickly validate them utilising ansible-lint locally. Then in the first test run, an ansible-lint profile should allow the executor from ignoring the rules to ensure that the content itself runs as desired.\r\n\r\nUpdate documentation to provide context for using profiles in a more progressive (step-by-step) way with additional context that addresses motivation.\r\n\r\n##### Issue Type\r\n\r\n- Bug Report (Docs)\r\n\r\n##### Ansible and Ansible Lint details\r\n\r\nN/A\r\n\r\n##### OS / ENVIRONMENT\r\n\r\nFedora 36\r\n\r\n##### STEPS TO REPRODUCE\r\n\r\nN/A\r\n\r\n##### Desired Behavior\r\n\r\nN/A\r\n\r\n##### Actual Behavior\r\n\r\nN/A\r\n\n", "before_files": [{"content": "\"\"\"Utils to generate rules documentation.\"\"\"\nimport logging\nfrom pathlib import Path\nfrom typing import Iterable\n\nfrom rich import box\n\n# Remove this compatibility try-catch block once we drop support for rich < 10.7.0\ntry:\n from rich.console import group\nexcept ImportError:\n from rich.console import render_group as group # type: ignore\n\nfrom rich.markdown import Markdown\nfrom rich.table import Table\n\nfrom ansiblelint.config import PROFILES\nfrom ansiblelint.constants import RULE_DOC_URL\nfrom ansiblelint.rules import RulesCollection\n\nDOC_HEADER = \"\"\"\n# Default Rules\n\n(lint_default_rules)=\n\nBelow you can see the list of default rules Ansible Lint use to evaluate playbooks and roles:\n\n\"\"\"\n\n_logger = logging.getLogger(__name__)\n\n\ndef rules_as_docs(rules: RulesCollection) -> str:\n \"\"\"Dump documentation files for all rules, returns only confirmation message.\n\n That is internally used for building documentation and the API can change\n at any time.\n \"\"\"\n result = \"\"\n dump_path = Path(\".\") / \"docs\" / \"rules\"\n if not dump_path.exists():\n raise RuntimeError(f\"Failed to find {dump_path} folder for dumping rules.\")\n\n with open(dump_path / \"..\" / \"profiles.md\", \"w\", encoding=\"utf-8\") as f:\n 
f.write(profiles_as_md(header=True, docs_url=\"rules/\"))\n\n for rule in rules.alphabetical():\n result = \"\"\n with open(dump_path / f\"{rule.id}.md\", \"w\", encoding=\"utf-8\") as f:\n # because title == rule.id we get the desired labels for free\n # and we do not have to insert `(target_header)=`\n title = f\"{rule.id}\"\n\n if rule.help:\n if not rule.help.startswith(f\"# {rule.id}\"):\n raise RuntimeError(\n f\"Rule {rule.__class__} markdown help does not start with `# {rule.id}` header.\\n{rule.help}\"\n )\n result = result[1:]\n result += f\"{rule.help}\"\n else:\n description = rule.description\n if rule.link:\n description += f\" [more]({rule.link})\"\n\n result += f\"# {title}\\n\\n**{rule.shortdesc}**\\n\\n{description}\"\n f.write(result)\n\n return \"All markdown files for rules were dumped!\"\n\n\ndef rules_as_str(rules: RulesCollection) -> str:\n \"\"\"Return rules as string.\"\"\"\n return \"\\n\".join([str(rule) for rule in rules.alphabetical()])\n\n\ndef rules_as_md(rules: RulesCollection) -> str:\n \"\"\"Return md documentation for a list of rules.\"\"\"\n result = DOC_HEADER\n\n for rule in rules.alphabetical():\n\n # because title == rule.id we get the desired labels for free\n # and we do not have to insert `(target_header)=`\n title = f\"{rule.id}\"\n\n if rule.help:\n if not rule.help.startswith(f\"# {rule.id}\"):\n raise RuntimeError(\n f\"Rule {rule.__class__} markdown help does not start with `# {rule.id}` header.\\n{rule.help}\"\n )\n result += f\"\\n\\n{rule.help}\"\n else:\n description = rule.description\n if rule.link:\n description += f\" [more]({rule.link})\"\n\n result += f\"\\n\\n## {title}\\n\\n**{rule.shortdesc}**\\n\\n{description}\"\n\n return result\n\n\n@group()\ndef rules_as_rich(rules: RulesCollection) -> Iterable[Table]:\n \"\"\"Print documentation for a list of rules, returns empty string.\"\"\"\n width = max(16, *[len(rule.id) for rule in rules])\n for rule in rules.alphabetical():\n table = Table(show_header=True, header_style=\"title\", box=box.MINIMAL)\n table.add_column(rule.id, style=\"dim\", width=width)\n table.add_column(Markdown(rule.shortdesc))\n\n description = rule.help or rule.description\n if rule.link:\n description += f\" [(more)]({rule.link})\"\n table.add_row(\"description\", Markdown(description))\n if rule.version_added:\n table.add_row(\"version_added\", rule.version_added)\n if rule.tags:\n table.add_row(\"tags\", \", \".join(rule.tags))\n if rule.severity:\n table.add_row(\"severity\", rule.severity)\n yield table\n\n\ndef profiles_as_md(header: bool = False, docs_url: str = RULE_DOC_URL) -> str:\n \"\"\"Return markdown representation of supported profiles.\"\"\"\n result = \"\"\n\n if header:\n result += \"\"\"<!---\nDo not manually edit, generated from generate_docs.py\n-->\n# Profiles\n\nOne of the best ways to run `ansible-lint` is by specifying which rule profile\nyou want to use. These profiles stack on top of each other, allowing you to\ngradually raise the quality bar.\n\nTo run it with the most strict profile just type `ansible-lint --profile=production`.\n\nIf you want to consult the list of rules from each profile, type\n`ansible-lint -P`. For your convenience, we also list the same output below.\n\nThe rules that have a `*` suffix, are not implemented yet but we documented\nthem with links to their issues.\n\n```{note}\nSpecial rule tags such `opt-in` and `experimental` are automatically removed\nwhen a rule is included in a profile, directly or indirectly. 
This means that\nthey will always execute once included.\n```\n\n\"\"\"\n\n for name, profile in PROFILES.items():\n extends = \"\"\n if profile.get(\"extends\", None):\n extends = (\n f\" It extends [{profile['extends']}](#{profile['extends']}) profile.\"\n )\n result += f\"## {name}\\n\\n{profile['description']}{extends}\\n\"\n for rule, rule_data in profile[\"rules\"].items():\n if \"[\" in rule:\n url = f\"{docs_url}{rule.split('[')[0]}/\"\n else:\n url = f\"{docs_url}{rule}/\"\n if not rule_data:\n result += f\"- [{rule}]({url})\\n\"\n else:\n result += f\"- [{rule}]({rule_data['url']})\\n\"\n\n result += \"\\n\"\n return result\n\n\ndef profiles_as_rich() -> Markdown:\n \"\"\"Return rich representation of supported profiles.\"\"\"\n return Markdown(profiles_as_md())\n", "path": "src/ansiblelint/generate_docs.py"}], "after_files": [{"content": "\"\"\"Utils to generate rules documentation.\"\"\"\nimport logging\nfrom pathlib import Path\nfrom typing import Iterable\n\nfrom rich import box\n\n# Remove this compatibility try-catch block once we drop support for rich < 10.7.0\ntry:\n from rich.console import group\nexcept ImportError:\n from rich.console import render_group as group # type: ignore\n\nfrom rich.markdown import Markdown\nfrom rich.table import Table\n\nfrom ansiblelint.config import PROFILES\nfrom ansiblelint.constants import RULE_DOC_URL\nfrom ansiblelint.rules import RulesCollection\n\nDOC_HEADER = \"\"\"\n# Default Rules\n\n(lint_default_rules)=\n\nBelow you can see the list of default rules Ansible Lint use to evaluate playbooks and roles:\n\n\"\"\"\n\n_logger = logging.getLogger(__name__)\n\n\ndef rules_as_docs(rules: RulesCollection) -> str:\n \"\"\"Dump documentation files for all rules, returns only confirmation message.\n\n That is internally used for building documentation and the API can change\n at any time.\n \"\"\"\n result = \"\"\n dump_path = Path(\".\") / \"docs\" / \"rules\"\n if not dump_path.exists():\n raise RuntimeError(f\"Failed to find {dump_path} folder for dumping rules.\")\n\n with open(dump_path / \"..\" / \"profiles.md\", \"w\", encoding=\"utf-8\") as f:\n f.write(profiles_as_md(header=True, docs_url=\"rules/\"))\n\n for rule in rules.alphabetical():\n result = \"\"\n with open(dump_path / f\"{rule.id}.md\", \"w\", encoding=\"utf-8\") as f:\n # because title == rule.id we get the desired labels for free\n # and we do not have to insert `(target_header)=`\n title = f\"{rule.id}\"\n\n if rule.help:\n if not rule.help.startswith(f\"# {rule.id}\"):\n raise RuntimeError(\n f\"Rule {rule.__class__} markdown help does not start with `# {rule.id}` header.\\n{rule.help}\"\n )\n result = result[1:]\n result += f\"{rule.help}\"\n else:\n description = rule.description\n if rule.link:\n description += f\" [more]({rule.link})\"\n\n result += f\"# {title}\\n\\n**{rule.shortdesc}**\\n\\n{description}\"\n f.write(result)\n\n return \"All markdown files for rules were dumped!\"\n\n\ndef rules_as_str(rules: RulesCollection) -> str:\n \"\"\"Return rules as string.\"\"\"\n return \"\\n\".join([str(rule) for rule in rules.alphabetical()])\n\n\ndef rules_as_md(rules: RulesCollection) -> str:\n \"\"\"Return md documentation for a list of rules.\"\"\"\n result = DOC_HEADER\n\n for rule in rules.alphabetical():\n\n # because title == rule.id we get the desired labels for free\n # and we do not have to insert `(target_header)=`\n title = f\"{rule.id}\"\n\n if rule.help:\n if not rule.help.startswith(f\"# {rule.id}\"):\n raise RuntimeError(\n f\"Rule {rule.__class__} 
markdown help does not start with `# {rule.id}` header.\\n{rule.help}\"\n )\n result += f\"\\n\\n{rule.help}\"\n else:\n description = rule.description\n if rule.link:\n description += f\" [more]({rule.link})\"\n\n result += f\"\\n\\n## {title}\\n\\n**{rule.shortdesc}**\\n\\n{description}\"\n\n return result\n\n\n@group()\ndef rules_as_rich(rules: RulesCollection) -> Iterable[Table]:\n \"\"\"Print documentation for a list of rules, returns empty string.\"\"\"\n width = max(16, *[len(rule.id) for rule in rules])\n for rule in rules.alphabetical():\n table = Table(show_header=True, header_style=\"title\", box=box.MINIMAL)\n table.add_column(rule.id, style=\"dim\", width=width)\n table.add_column(Markdown(rule.shortdesc))\n\n description = rule.help or rule.description\n if rule.link:\n description += f\" [(more)]({rule.link})\"\n table.add_row(\"description\", Markdown(description))\n if rule.version_added:\n table.add_row(\"version_added\", rule.version_added)\n if rule.tags:\n table.add_row(\"tags\", \", \".join(rule.tags))\n if rule.severity:\n table.add_row(\"severity\", rule.severity)\n yield table\n\n\ndef profiles_as_md(header: bool = False, docs_url: str = RULE_DOC_URL) -> str:\n \"\"\"Return markdown representation of supported profiles.\"\"\"\n result = \"\"\n\n if header:\n result += \"\"\"<!---\nDo not manually edit, generated from generate_docs.py\n-->\n# Profiles\n\nAnsible-lint profiles gradually increase the strictness of rules as your Ansible content lifecycle.\n\n```{note}\nRules with `*` in the suffix are not yet implemented but are documented with linked GitHub issues.\n```\n\n\"\"\"\n\n for name, profile in PROFILES.items():\n extends = \"\"\n if profile.get(\"extends\", None):\n extends = (\n f\" It extends [{profile['extends']}](#{profile['extends']}) profile.\"\n )\n result += f\"## {name}\\n\\n{profile['description']}{extends}\\n\"\n for rule, rule_data in profile[\"rules\"].items():\n if \"[\" in rule:\n url = f\"{docs_url}{rule.split('[')[0]}/\"\n else:\n url = f\"{docs_url}{rule}/\"\n if not rule_data:\n result += f\"- [{rule}]({url})\\n\"\n else:\n result += f\"- [{rule}]({rule_data['url']})\\n\"\n\n result += \"\\n\"\n return result\n\n\ndef profiles_as_rich() -> Markdown:\n \"\"\"Return rich representation of supported profiles.\"\"\"\n return Markdown(profiles_as_md())\n", "path": "src/ansiblelint/generate_docs.py"}]} | 2,275 | 285 |
gh_patches_debug_18433 | rasdani/github-patches | git_diff | holoviz__panel-5344 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Race Condition with threaded Periodic Callback can result in additional executions
#### ALL software version info
panel==1.2.1
#### Description of expected behavior and the observed behavior
It looks like, due to a race condition, a "heavy" periodic callback can be executed more times than the maximum count when Panel is left to handle the multithreading. This is because, I believe, the counter is incremented only after the callback has finished executing, so with multiple threads many additional instances can be scheduled while waiting for the count to be met.
In the example below, toggling on the periodic callback with a delay of 0 runs as expected, but increasing the delay beyond the period results in additional executions of the callback. The larger the delay, the worse the effect.
#### Complete, minimal, self-contained example code that reproduces the issue
```python
# app.py
import time
import panel as pn
pn.extension(nthreads=0) # or > 1
def heavy():
print(cb.counter)
time.sleep(delay.value) # simulate lots of work
delay = pn.widgets.IntInput(name='Delay', value=0, start=0, end=5, step=1)
cb = pn.state.add_periodic_callback(heavy, 200, count=5, start=False)
toggle = pn.widgets.Toggle(name='Run', value=False)
toggle.link(cb, bidirectional=True, value='running')
pn.Column(delay, toggle).servable()
```
Run with:
```shell
panel serve app.py
```
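
Editorial aside (not part of the original report): the race described above can be reproduced without Panel at all. If a scheduler keeps submitting work to a thread pool and only a finished worker increments the completion counter, several extra executions get queued before the limit is observed. A rough, self-contained sketch under that assumption; the names (`callback`, `count`, `period`) are illustrative and not Panel's actual implementation:

```python
# Standalone illustration of the race; not Panel code.
import threading
import time
from concurrent.futures import ThreadPoolExecutor

count = 5       # desired maximum number of executions
period = 0.2    # scheduling period in seconds
work = 0.5      # simulated callback duration, longer than the period
counter = 0
started = 0
lock = threading.Lock()

def callback():
    global counter, started
    with lock:
        started += 1
    time.sleep(work)          # heavy work
    with lock:
        counter += 1          # incremented only *after* the work finishes

with ThreadPoolExecutor() as pool:
    while True:
        with lock:
            if counter >= count:   # several extra callbacks are already queued by now
                break
        pool.submit(callback)
        time.sleep(period)

print(f"wanted {count} executions, started {started}")  # usually prints more than 5
```

Incrementing the counter before the heavy work runs, or checking and updating it under the same lock that gates scheduling, closes this window.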
#### Stack traceback and/or browser JavaScript console output
N/A
#### Screenshots or screencasts of the bug in action
N/A
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/io/callbacks.py`
Content:
```
1 """
2 Defines callbacks to be executed on a thread or by scheduling it
3 on a running bokeh server.
4 """
5 import asyncio
6 import inspect
7 import logging
8 import time
9
10 from functools import partial
11
12 import param
13
14 from ..util import edit_readonly, function_name
15 from .logging import LOG_PERIODIC_END, LOG_PERIODIC_START
16 from .state import curdoc_locked, state
17
18 log = logging.getLogger('panel.callbacks')
19 _periodic_logger = logging.getLogger(f'{__name__}.PeriodicCallback')
20
21 class PeriodicCallback(param.Parameterized):
22 """
23 Periodic encapsulates a periodic callback which will run both
24 in tornado based notebook environments and on bokeh server. By
25 default the callback will run until the stop method is called,
26 but count and timeout values can be set to limit the number of
27 executions or the maximum length of time for which the callback
28 will run. The callback may also be started and stopped by setting
29 the running parameter to True or False respectively.
30 """
31
32 callback = param.Callable(doc="""
33 The callback to execute periodically.""")
34
35 counter = param.Integer(default=0, doc="""
36 Counts the number of executions.""")
37
38 count = param.Integer(default=None, doc="""
39 Number of times the callback will be executed, by default
40 this is unlimited.""")
41
42 log = param.Boolean(default=True, doc="""
43 Whether the periodic callback should log its actions.""")
44
45 period = param.Integer(default=500, doc="""
46 Period in milliseconds at which the callback is executed.""")
47
48 timeout = param.Integer(default=None, doc="""
49 Timeout in milliseconds from the start time at which the callback
50 expires.""")
51
52 running = param.Boolean(default=False, doc="""
53 Toggles whether the periodic callback is currently running.""")
54
55 def __init__(self, **params):
56 self._background = params.pop('background', False)
57 super().__init__(**params)
58 self._start_time = None
59 self._cb = None
60 self._updating = False
61 self._doc = None
62
63 @param.depends('running', watch=True)
64 def _start(self):
65 if not self.running or self._updating:
66 return
67 self.start()
68
69 @param.depends('running', watch=True)
70 def _stop(self):
71 if self.running or self._updating:
72 return
73 self.stop()
74
75 @param.depends('period', watch=True)
76 def _update_period(self):
77 if self._cb:
78 self.stop()
79 self.start()
80
81 def _exec_callback(self, post=False):
82 from .state import set_curdoc
83 try:
84 with set_curdoc(self._doc):
85 cb = self.callback()
86 except Exception:
87 cb = None
88 if post:
89 self._post_callback()
90 return cb
91
92 def _post_callback(self):
93 cbname = function_name(self.callback)
94 if self._doc and self.log:
95 _periodic_logger.info(
96 LOG_PERIODIC_END, id(self._doc), cbname, self.counter
97 )
98 if not self._background:
99 with edit_readonly(state):
100 state._busy_counter -= 1
101 self.counter += 1
102 if self.timeout is not None:
103 dt = (time.time() - self._start_time) * 1000
104 if dt > self.timeout:
105 self.stop()
106 if self.counter == self.count:
107 self.stop()
108
109 async def _periodic_callback(self):
110 if not self._background:
111 with edit_readonly(state):
112 state._busy_counter += 1
113 cbname = function_name(self.callback)
114 if self._doc and self.log:
115 _periodic_logger.info(
116 LOG_PERIODIC_START, id(self._doc), cbname, self.counter
117 )
118 is_async = (
119 inspect.isasyncgenfunction(self.callback) or
120 inspect.iscoroutinefunction(self.callback)
121 )
122 if state._thread_pool and not is_async:
123 future = state._thread_pool.submit(self._exec_callback, True)
124 future.add_done_callback(partial(state._handle_future_exception, doc=self._doc))
125 return
126 try:
127 cb = self._exec_callback()
128 if inspect.isawaitable(cb):
129 await cb
130 except Exception:
131 log.exception('Periodic callback failed.')
132 raise
133 finally:
134 self._post_callback()
135
136 async def _async_repeat(self, func):
137 """
138 Run func every interval seconds.
139
140 If func has not finished before *interval*, will run again
141 immediately when the previous iteration finished.
142 """
143 while True:
144 start = time.monotonic()
145 await func()
146 timeout = (self.period/1000.) - (time.monotonic()-start)
147 if timeout > 0:
148 await asyncio.sleep(timeout)
149
150 def _cleanup(self, session_context):
151 self.stop()
152
153 def start(self):
154 """
155 Starts running the periodic callback.
156 """
157 if self._cb is not None:
158 raise RuntimeError('Periodic callback has already started.')
159 if not self.running:
160 try:
161 self._updating = True
162 self.running = True
163 finally:
164 self._updating = False
165 self._start_time = time.time()
166 if state._is_pyodide:
167 self._cb = asyncio.create_task(
168 self._async_repeat(self._periodic_callback)
169 )
170 elif state.curdoc and state.curdoc.session_context:
171 self._doc = state.curdoc
172 if state._unblocked(state.curdoc):
173 self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)
174 else:
175 self._doc.add_next_tick_callback(self.start)
176 else:
177 from tornado.ioloop import PeriodicCallback
178 self._cb = PeriodicCallback(lambda: asyncio.create_task(self._periodic_callback()), self.period)
179 self._cb.start()
180
181 def stop(self):
182 """
183 Stops running the periodic callback.
184 """
185 if self.running:
186 try:
187 self._updating = True
188 self.running = False
189 finally:
190 self._updating = False
191 with param.discard_events(self):
192 self.counter = 0
193 self._timeout = None
194 if state._is_pyodide:
195 self._cb.cancel()
196 elif self._doc:
197 if self._doc._session_context:
198 self._doc.callbacks.remove_session_callback(self._cb)
199 else:
200 self._doc.callbacks._session_callbacks.remove(self._cb)
201 elif self._cb:
202 self._cb.stop()
203 self._cb = None
204 doc = self._doc or curdoc_locked()
205 if doc:
206 doc.callbacks.session_destroyed_callbacks = {
207 cb for cb in doc.callbacks.session_destroyed_callbacks
208 if cb is not self._cleanup
209 }
210 self._doc = None
211
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/panel/io/callbacks.py b/panel/io/callbacks.py
--- a/panel/io/callbacks.py
+++ b/panel/io/callbacks.py
@@ -82,7 +82,11 @@
from .state import set_curdoc
try:
with set_curdoc(self._doc):
- cb = self.callback()
+ if self.running:
+ self.counter += 1
+ if self.counter > self.count:
+ self.stop()
+ cb = self.callback() if self.running else None
except Exception:
cb = None
if post:
@@ -98,7 +102,6 @@
if not self._background:
with edit_readonly(state):
state._busy_counter -= 1
- self.counter += 1
if self.timeout is not None:
dt = (time.time() - self._start_time) * 1000
if dt > self.timeout:
| {"golden_diff": "diff --git a/panel/io/callbacks.py b/panel/io/callbacks.py\n--- a/panel/io/callbacks.py\n+++ b/panel/io/callbacks.py\n@@ -82,7 +82,11 @@\n from .state import set_curdoc\n try:\n with set_curdoc(self._doc):\n- cb = self.callback()\n+ if self.running:\n+ self.counter += 1\n+ if self.counter > self.count:\n+ self.stop()\n+ cb = self.callback() if self.running else None\n except Exception:\n cb = None\n if post:\n@@ -98,7 +102,6 @@\n if not self._background:\n with edit_readonly(state):\n state._busy_counter -= 1\n- self.counter += 1\n if self.timeout is not None:\n dt = (time.time() - self._start_time) * 1000\n if dt > self.timeout:\n", "issue": "Race Condition with threaded Periodic Callback can result in additional executions\n#### ALL software version info\r\n\r\npanel==1.2.1\r\n\r\n#### Description of expected behavior and the observed behavior\r\n\r\nLooks like due to a race condition, by scheduling a \"heavy\" periodic callback, it can be executed more than the maximum count when letting panel handle the multithreading. This is because, I believe, the counter is incremented after the callback is finished executing, and with multiple threads many instances can be scheduled while waiting for the count to be met.\r\n\r\nIn the below example, when you toggle on the periodic with a 0 delay it runs as expected, but increasing the delay beyond the period will result in additional executions of the callback. The larger the delay the worse the effect.\r\n\r\n#### Complete, minimal, self-contained example code that reproduces the issue\r\n\r\n```python\r\n# app.py\r\nimport time\r\nimport panel as pn\r\npn.extension(nthreads=0) # or > 1\r\n\r\ndef heavy():\r\n print(cb.counter)\r\n time.sleep(delay.value) # simulate lots of work\r\n\r\ndelay = pn.widgets.IntInput(name='Delay', value=0, start=0, end=5, step=1)\r\n\r\ncb = pn.state.add_periodic_callback(heavy, 200, count=5, start=False)\r\ntoggle = pn.widgets.Toggle(name='Run', value=False)\r\ntoggle.link(cb, bidirectional=True, value='running')\r\n\r\npn.Column(delay, toggle).servable()\r\n```\r\n\r\nRun with:\r\n```shell\r\npanel serve app.py\r\n```\r\n\r\n#### Stack traceback and/or browser JavaScript console output\r\n\r\nN/A\r\n\r\n#### Screenshots or screencasts of the bug in action\r\n\r\nN/A\n", "before_files": [{"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nimport asyncio\nimport inspect\nimport logging\nimport time\n\nfrom functools import partial\n\nimport param\n\nfrom ..util import edit_readonly, function_name\nfrom .logging import LOG_PERIODIC_END, LOG_PERIODIC_START\nfrom .state import curdoc_locked, state\n\nlog = logging.getLogger('panel.callbacks')\n_periodic_logger = logging.getLogger(f'{__name__}.PeriodicCallback')\n\nclass PeriodicCallback(param.Parameterized):\n \"\"\"\n Periodic encapsulates a periodic callback which will run both\n in tornado based notebook environments and on bokeh server. By\n default the callback will run until the stop method is called,\n but count and timeout values can be set to limit the number of\n executions or the maximum length of time for which the callback\n will run. 
The callback may also be started and stopped by setting\n the running parameter to True or False respectively.\n \"\"\"\n\n callback = param.Callable(doc=\"\"\"\n The callback to execute periodically.\"\"\")\n\n counter = param.Integer(default=0, doc=\"\"\"\n Counts the number of executions.\"\"\")\n\n count = param.Integer(default=None, doc=\"\"\"\n Number of times the callback will be executed, by default\n this is unlimited.\"\"\")\n\n log = param.Boolean(default=True, doc=\"\"\"\n Whether the periodic callback should log its actions.\"\"\")\n\n period = param.Integer(default=500, doc=\"\"\"\n Period in milliseconds at which the callback is executed.\"\"\")\n\n timeout = param.Integer(default=None, doc=\"\"\"\n Timeout in milliseconds from the start time at which the callback\n expires.\"\"\")\n\n running = param.Boolean(default=False, doc=\"\"\"\n Toggles whether the periodic callback is currently running.\"\"\")\n\n def __init__(self, **params):\n self._background = params.pop('background', False)\n super().__init__(**params)\n self._start_time = None\n self._cb = None\n self._updating = False\n self._doc = None\n\n @param.depends('running', watch=True)\n def _start(self):\n if not self.running or self._updating:\n return\n self.start()\n\n @param.depends('running', watch=True)\n def _stop(self):\n if self.running or self._updating:\n return\n self.stop()\n\n @param.depends('period', watch=True)\n def _update_period(self):\n if self._cb:\n self.stop()\n self.start()\n\n def _exec_callback(self, post=False):\n from .state import set_curdoc\n try:\n with set_curdoc(self._doc):\n cb = self.callback()\n except Exception:\n cb = None\n if post:\n self._post_callback()\n return cb\n\n def _post_callback(self):\n cbname = function_name(self.callback)\n if self._doc and self.log:\n _periodic_logger.info(\n LOG_PERIODIC_END, id(self._doc), cbname, self.counter\n )\n if not self._background:\n with edit_readonly(state):\n state._busy_counter -= 1\n self.counter += 1\n if self.timeout is not None:\n dt = (time.time() - self._start_time) * 1000\n if dt > self.timeout:\n self.stop()\n if self.counter == self.count:\n self.stop()\n\n async def _periodic_callback(self):\n if not self._background:\n with edit_readonly(state):\n state._busy_counter += 1\n cbname = function_name(self.callback)\n if self._doc and self.log:\n _periodic_logger.info(\n LOG_PERIODIC_START, id(self._doc), cbname, self.counter\n )\n is_async = (\n inspect.isasyncgenfunction(self.callback) or\n inspect.iscoroutinefunction(self.callback)\n )\n if state._thread_pool and not is_async:\n future = state._thread_pool.submit(self._exec_callback, True)\n future.add_done_callback(partial(state._handle_future_exception, doc=self._doc))\n return\n try:\n cb = self._exec_callback()\n if inspect.isawaitable(cb):\n await cb\n except Exception:\n log.exception('Periodic callback failed.')\n raise\n finally:\n self._post_callback()\n\n async def _async_repeat(self, func):\n \"\"\"\n Run func every interval seconds.\n\n If func has not finished before *interval*, will run again\n immediately when the previous iteration finished.\n \"\"\"\n while True:\n start = time.monotonic()\n await func()\n timeout = (self.period/1000.) 
- (time.monotonic()-start)\n if timeout > 0:\n await asyncio.sleep(timeout)\n\n def _cleanup(self, session_context):\n self.stop()\n\n def start(self):\n \"\"\"\n Starts running the periodic callback.\n \"\"\"\n if self._cb is not None:\n raise RuntimeError('Periodic callback has already started.')\n if not self.running:\n try:\n self._updating = True\n self.running = True\n finally:\n self._updating = False\n self._start_time = time.time()\n if state._is_pyodide:\n self._cb = asyncio.create_task(\n self._async_repeat(self._periodic_callback)\n )\n elif state.curdoc and state.curdoc.session_context:\n self._doc = state.curdoc\n if state._unblocked(state.curdoc):\n self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)\n else:\n self._doc.add_next_tick_callback(self.start)\n else:\n from tornado.ioloop import PeriodicCallback\n self._cb = PeriodicCallback(lambda: asyncio.create_task(self._periodic_callback()), self.period)\n self._cb.start()\n\n def stop(self):\n \"\"\"\n Stops running the periodic callback.\n \"\"\"\n if self.running:\n try:\n self._updating = True\n self.running = False\n finally:\n self._updating = False\n with param.discard_events(self):\n self.counter = 0\n self._timeout = None\n if state._is_pyodide:\n self._cb.cancel()\n elif self._doc:\n if self._doc._session_context:\n self._doc.callbacks.remove_session_callback(self._cb)\n else:\n self._doc.callbacks._session_callbacks.remove(self._cb)\n elif self._cb:\n self._cb.stop()\n self._cb = None\n doc = self._doc or curdoc_locked()\n if doc:\n doc.callbacks.session_destroyed_callbacks = {\n cb for cb in doc.callbacks.session_destroyed_callbacks\n if cb is not self._cleanup\n }\n self._doc = None\n", "path": "panel/io/callbacks.py"}], "after_files": [{"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nimport asyncio\nimport inspect\nimport logging\nimport time\n\nfrom functools import partial\n\nimport param\n\nfrom ..util import edit_readonly, function_name\nfrom .logging import LOG_PERIODIC_END, LOG_PERIODIC_START\nfrom .state import curdoc_locked, state\n\nlog = logging.getLogger('panel.callbacks')\n_periodic_logger = logging.getLogger(f'{__name__}.PeriodicCallback')\n\nclass PeriodicCallback(param.Parameterized):\n \"\"\"\n Periodic encapsulates a periodic callback which will run both\n in tornado based notebook environments and on bokeh server. By\n default the callback will run until the stop method is called,\n but count and timeout values can be set to limit the number of\n executions or the maximum length of time for which the callback\n will run. 
The callback may also be started and stopped by setting\n the running parameter to True or False respectively.\n \"\"\"\n\n callback = param.Callable(doc=\"\"\"\n The callback to execute periodically.\"\"\")\n\n counter = param.Integer(default=0, doc=\"\"\"\n Counts the number of executions.\"\"\")\n\n count = param.Integer(default=None, doc=\"\"\"\n Number of times the callback will be executed, by default\n this is unlimited.\"\"\")\n\n log = param.Boolean(default=True, doc=\"\"\"\n Whether the periodic callback should log its actions.\"\"\")\n\n period = param.Integer(default=500, doc=\"\"\"\n Period in milliseconds at which the callback is executed.\"\"\")\n\n timeout = param.Integer(default=None, doc=\"\"\"\n Timeout in milliseconds from the start time at which the callback\n expires.\"\"\")\n\n running = param.Boolean(default=False, doc=\"\"\"\n Toggles whether the periodic callback is currently running.\"\"\")\n\n def __init__(self, **params):\n self._background = params.pop('background', False)\n super().__init__(**params)\n self._start_time = None\n self._cb = None\n self._updating = False\n self._doc = None\n\n @param.depends('running', watch=True)\n def _start(self):\n if not self.running or self._updating:\n return\n self.start()\n\n @param.depends('running', watch=True)\n def _stop(self):\n if self.running or self._updating:\n return\n self.stop()\n\n @param.depends('period', watch=True)\n def _update_period(self):\n if self._cb:\n self.stop()\n self.start()\n\n def _exec_callback(self, post=False):\n from .state import set_curdoc\n try:\n with set_curdoc(self._doc):\n if self.running:\n self.counter += 1\n if self.counter > self.count:\n self.stop()\n cb = self.callback() if self.running else None\n except Exception:\n cb = None\n if post:\n self._post_callback()\n return cb\n\n def _post_callback(self):\n cbname = function_name(self.callback)\n if self._doc and self.log:\n _periodic_logger.info(\n LOG_PERIODIC_END, id(self._doc), cbname, self.counter\n )\n if not self._background:\n with edit_readonly(state):\n state._busy_counter -= 1\n if self.timeout is not None:\n dt = (time.time() - self._start_time) * 1000\n if dt > self.timeout:\n self.stop()\n if self.counter == self.count:\n self.stop()\n\n async def _periodic_callback(self):\n if not self._background:\n with edit_readonly(state):\n state._busy_counter += 1\n cbname = function_name(self.callback)\n if self._doc and self.log:\n _periodic_logger.info(\n LOG_PERIODIC_START, id(self._doc), cbname, self.counter\n )\n is_async = (\n inspect.isasyncgenfunction(self.callback) or\n inspect.iscoroutinefunction(self.callback)\n )\n if state._thread_pool and not is_async:\n future = state._thread_pool.submit(self._exec_callback, True)\n future.add_done_callback(partial(state._handle_future_exception, doc=self._doc))\n return\n try:\n cb = self._exec_callback()\n if inspect.isawaitable(cb):\n await cb\n except Exception:\n log.exception('Periodic callback failed.')\n raise\n finally:\n self._post_callback()\n\n async def _async_repeat(self, func):\n \"\"\"\n Run func every interval seconds.\n\n If func has not finished before *interval*, will run again\n immediately when the previous iteration finished.\n \"\"\"\n while True:\n start = time.monotonic()\n await func()\n timeout = (self.period/1000.) 
- (time.monotonic()-start)\n if timeout > 0:\n await asyncio.sleep(timeout)\n\n def _cleanup(self, session_context):\n self.stop()\n\n def start(self):\n \"\"\"\n Starts running the periodic callback.\n \"\"\"\n if self._cb is not None:\n raise RuntimeError('Periodic callback has already started.')\n if not self.running:\n try:\n self._updating = True\n self.running = True\n finally:\n self._updating = False\n self._start_time = time.time()\n if state._is_pyodide:\n self._cb = asyncio.create_task(\n self._async_repeat(self._periodic_callback)\n )\n elif state.curdoc and state.curdoc.session_context:\n self._doc = state.curdoc\n if state._unblocked(state.curdoc):\n self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)\n else:\n self._doc.add_next_tick_callback(self.start)\n else:\n from tornado.ioloop import PeriodicCallback\n self._cb = PeriodicCallback(lambda: asyncio.create_task(self._periodic_callback()), self.period)\n self._cb.start()\n\n def stop(self):\n \"\"\"\n Stops running the periodic callback.\n \"\"\"\n if self.running:\n try:\n self._updating = True\n self.running = False\n finally:\n self._updating = False\n with param.discard_events(self):\n self.counter = 0\n self._timeout = None\n if state._is_pyodide:\n self._cb.cancel()\n elif self._doc:\n if self._doc._session_context:\n self._doc.callbacks.remove_session_callback(self._cb)\n else:\n self._doc.callbacks._session_callbacks.remove(self._cb)\n elif self._cb:\n self._cb.stop()\n self._cb = None\n doc = self._doc or curdoc_locked()\n if doc:\n doc.callbacks.session_destroyed_callbacks = {\n cb for cb in doc.callbacks.session_destroyed_callbacks\n if cb is not self._cleanup\n }\n self._doc = None\n", "path": "panel/io/callbacks.py"}]} | 2,596 | 210 |
gh_patches_debug_11929 | rasdani/github-patches | git_diff | open-telemetry__opentelemetry-python-contrib-1791 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Instantiating AsyncPGInstrumentor after .instrument has been called causes tracing to fail
**Describe your environment**
Python 3.11 on an M2 Mac.
**Steps to reproduce**
Run the following code (with `python -m asyncio` to allow top-level `async`/`await`):
```
import asyncpg
from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from opentelemetry.trace import set_tracer_provider
provider = TracerProvider()
processor = BatchSpanProcessor(ConsoleSpanExporter())
provider.add_span_processor(processor)
set_tracer_provider(provider)
dsn = "postgres://postgres:password@localhost:54320/postgres"
AsyncPGInstrumentor().instrument()
AsyncPGInstrumentor()
connection = await asyncpg.connect(dsn)
await connection.execute("SELECT 1")
```
**What is the expected behavior?**
The SQL query runs successfully and a span is exported to the console.
**What is the actual behavior?**
```
AttributeError: 'NoneType' object has no attribute 'start_as_current_span'
```
**Additional context**
Each instantiation of `AsyncPGInstrumentor` runs `__init__`, which sets `self._tracer` to `None`. However, `BaseInstrumentor` overrides `__new__` to implement the singleton pattern, so only one instance of `AsyncPGInstrumentor` is ever created. Instantiating `AsyncPGInstrumentor` after `instrument` has been called (which sets `self._tracer`) therefore sets `self._tracer` back to `None`, which is a state inconsistent with `_is_instrumented_by_opentelemetry` (which will still be `True`).
A simple solution is to remove the line `self._tracer = None`.
--- END ISSUE ---
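As a minimal, self-contained sketch of the failure mode described in this issue (the class names below are illustrative stand-ins, not the real `opentelemetry` classes):
```
class Singleton:
    _instance = None

    def __new__(cls, *args, **kwargs):
        # Mirror BaseInstrumentor: only one instance is ever created.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance


class Instrumentor(Singleton):
    def __init__(self):
        # Runs on every instantiation, wiping state set by instrument().
        self._tracer = None

    def instrument(self):
        self._tracer = "tracer"


first = Instrumentor()
first.instrument()
Instrumentor()        # __init__ runs again on the same singleton instance ...
print(first._tracer)  # ... so this prints None, matching the reported failure
```
Keeping the tracer as a class attribute (or guarding `__init__`) avoids wiping state on re-instantiation, which is the direction the accepted patch below takes.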
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py`
Content:
```
1 # Copyright The OpenTelemetry Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """
16 This library allows tracing PostgreSQL queries made by the
17 `asyncpg <https://magicstack.github.io/asyncpg/current/>`_ library.
18
19 Usage
20 -----
21
22 .. code-block:: python
23
24 import asyncpg
25 from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor
26
27 # You can optionally pass a custom TracerProvider to AsyncPGInstrumentor.instrument()
28 AsyncPGInstrumentor().instrument()
29 conn = await asyncpg.connect(user='user', password='password',
30 database='database', host='127.0.0.1')
31 values = await conn.fetch('''SELECT 42;''')
32
33 API
34 ---
35 """
36
37 import re
38 from typing import Collection
39
40 import asyncpg
41 import wrapt
42
43 from opentelemetry import trace
44 from opentelemetry.instrumentation.asyncpg.package import _instruments
45 from opentelemetry.instrumentation.asyncpg.version import __version__
46 from opentelemetry.instrumentation.instrumentor import BaseInstrumentor
47 from opentelemetry.instrumentation.utils import unwrap
48 from opentelemetry.semconv.trace import (
49 DbSystemValues,
50 NetTransportValues,
51 SpanAttributes,
52 )
53 from opentelemetry.trace import SpanKind
54 from opentelemetry.trace.status import Status, StatusCode
55
56
57 def _hydrate_span_from_args(connection, query, parameters) -> dict:
58 """Get network and database attributes from connection."""
59 span_attributes = {
60 SpanAttributes.DB_SYSTEM: DbSystemValues.POSTGRESQL.value
61 }
62
63 # connection contains _params attribute which is a namedtuple ConnectionParameters.
64 # https://github.com/MagicStack/asyncpg/blob/master/asyncpg/connection.py#L68
65
66 params = getattr(connection, "_params", None)
67 dbname = getattr(params, "database", None)
68 if dbname:
69 span_attributes[SpanAttributes.DB_NAME] = dbname
70 user = getattr(params, "user", None)
71 if user:
72 span_attributes[SpanAttributes.DB_USER] = user
73
74 # connection contains _addr attribute which is either a host/port tuple, or unix socket string
75 # https://magicstack.github.io/asyncpg/current/_modules/asyncpg/connection.html
76 addr = getattr(connection, "_addr", None)
77 if isinstance(addr, tuple):
78 span_attributes[SpanAttributes.NET_PEER_NAME] = addr[0]
79 span_attributes[SpanAttributes.NET_PEER_PORT] = addr[1]
80 span_attributes[
81 SpanAttributes.NET_TRANSPORT
82 ] = NetTransportValues.IP_TCP.value
83 elif isinstance(addr, str):
84 span_attributes[SpanAttributes.NET_PEER_NAME] = addr
85 span_attributes[
86 SpanAttributes.NET_TRANSPORT
87 ] = NetTransportValues.OTHER.value
88
89 if query is not None:
90 span_attributes[SpanAttributes.DB_STATEMENT] = query
91
92 if parameters is not None and len(parameters) > 0:
93 span_attributes["db.statement.parameters"] = str(parameters)
94
95 return span_attributes
96
97
98 class AsyncPGInstrumentor(BaseInstrumentor):
99 def __init__(self, capture_parameters=False):
100 super().__init__()
101 self.capture_parameters = capture_parameters
102 self._tracer = None
103 self._leading_comment_remover = re.compile(r"^/\*.*?\*/")
104
105 def instrumentation_dependencies(self) -> Collection[str]:
106 return _instruments
107
108 def _instrument(self, **kwargs):
109 tracer_provider = kwargs.get("tracer_provider")
110 self._tracer = trace.get_tracer(
111 __name__,
112 __version__,
113 tracer_provider,
114 schema_url="https://opentelemetry.io/schemas/1.11.0",
115 )
116
117 for method in [
118 "Connection.execute",
119 "Connection.executemany",
120 "Connection.fetch",
121 "Connection.fetchval",
122 "Connection.fetchrow",
123 ]:
124 wrapt.wrap_function_wrapper(
125 "asyncpg.connection", method, self._do_execute
126 )
127
128 def _uninstrument(self, **__):
129 for method in [
130 "execute",
131 "executemany",
132 "fetch",
133 "fetchval",
134 "fetchrow",
135 ]:
136 unwrap(asyncpg.Connection, method)
137
138 async def _do_execute(self, func, instance, args, kwargs):
139 exception = None
140 params = getattr(instance, "_params", {})
141 name = args[0] if args[0] else params.get("database", "postgresql")
142
143 try:
144 # Strip leading comments so we get the operation name.
145 name = self._leading_comment_remover.sub("", name).split()[0]
146 except IndexError:
147 name = ""
148
149 with self._tracer.start_as_current_span(
150 name, kind=SpanKind.CLIENT
151 ) as span:
152 if span.is_recording():
153 span_attributes = _hydrate_span_from_args(
154 instance,
155 args[0],
156 args[1:] if self.capture_parameters else None,
157 )
158 for attribute, value in span_attributes.items():
159 span.set_attribute(attribute, value)
160
161 try:
162 result = await func(*args, **kwargs)
163 except Exception as exc: # pylint: disable=W0703
164 exception = exc
165 raise
166 finally:
167 if span.is_recording() and exception is not None:
168 span.set_status(Status(StatusCode.ERROR))
169
170 return result
171
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py b/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py
--- a/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py
+++ b/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py
@@ -96,11 +96,13 @@
class AsyncPGInstrumentor(BaseInstrumentor):
+
+ _leading_comment_remover = re.compile(r"^/\*.*?\*/")
+ _tracer = None
+
def __init__(self, capture_parameters=False):
super().__init__()
self.capture_parameters = capture_parameters
- self._tracer = None
- self._leading_comment_remover = re.compile(r"^/\*.*?\*/")
def instrumentation_dependencies(self) -> Collection[str]:
return _instruments
| {"golden_diff": "diff --git a/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py b/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py\n--- a/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py\n+++ b/instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py\n@@ -96,11 +96,13 @@\n \n \n class AsyncPGInstrumentor(BaseInstrumentor):\n+\n+ _leading_comment_remover = re.compile(r\"^/\\*.*?\\*/\")\n+ _tracer = None\n+\n def __init__(self, capture_parameters=False):\n super().__init__()\n self.capture_parameters = capture_parameters\n- self._tracer = None\n- self._leading_comment_remover = re.compile(r\"^/\\*.*?\\*/\")\n \n def instrumentation_dependencies(self) -> Collection[str]:\n return _instruments\n", "issue": "Instantiating AsyncPGInstrumentor after .instrument has been called causes tracing to fail\n**Describe your environment**\r\n\r\nPython 3.11 on a M2 Mac.\r\n\r\n**Steps to reproduce**\r\n\r\nRun the following code (with `python -m asyncio` to allow top-level `async`/`await`):\r\n```\r\nimport asyncpg\r\n\r\nfrom opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor\r\nfrom opentelemetry.sdk.trace import TracerProvider\r\nfrom opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter\r\nfrom opentelemetry.trace import set_tracer_provider\r\n\r\nprovider = TracerProvider()\r\nprocessor = BatchSpanProcessor(ConsoleSpanExporter())\r\nprovider.add_span_processor(processor)\r\nset_tracer_provider(provider)\r\n\r\ndsn = \"postgres://postgres:password@localhost:54320/postgres\"\r\n\r\nAsyncPGInstrumentor().instrument()\r\nAsyncPGInstrumentor()\r\n\r\nconnection = await asyncpg.connect(dsn)\r\nawait connection.execute(\"SELECT 1\")\r\n```\r\n\r\n**What is the expected behavior?**\r\n\r\nThe SQL query runs successfully and a span is exported to the console.\r\n\r\n**What is the actual behavior?**\r\nWhat did you see instead?\r\n\r\n```\r\nAttributeError: 'NoneType' object has no attribute 'start_as_current_span'\r\n```\r\n\r\n**Additional context**\r\n\r\nEach instantiation of `AsyncPGInstrumentor` runs `__init__`, which sets `self._tracer` to `None`. However, `BaseInstrumentor` overrides `__new__` to implement the singleton pattern, so only one instance of `AsyncPGInstrumentor` is ever created. 
Instantiating `AsyncPGInstrumentor` after `instrument` has been called (which sets `self._tracer`) therefore sets `self._tracer` back to `None`, which is a state inconsistent with `_is_instrumented_by_opentelemetry` (which will still be `True`).\r\n\r\nA simple solution is to remove the line `self._tracer = None`.\n", "before_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nThis library allows tracing PostgreSQL queries made by the\n`asyncpg <https://magicstack.github.io/asyncpg/current/>`_ library.\n\nUsage\n-----\n\n.. code-block:: python\n\n import asyncpg\n from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor\n\n # You can optionally pass a custom TracerProvider to AsyncPGInstrumentor.instrument()\n AsyncPGInstrumentor().instrument()\n conn = await asyncpg.connect(user='user', password='password',\n database='database', host='127.0.0.1')\n values = await conn.fetch('''SELECT 42;''')\n\nAPI\n---\n\"\"\"\n\nimport re\nfrom typing import Collection\n\nimport asyncpg\nimport wrapt\n\nfrom opentelemetry import trace\nfrom opentelemetry.instrumentation.asyncpg.package import _instruments\nfrom opentelemetry.instrumentation.asyncpg.version import __version__\nfrom opentelemetry.instrumentation.instrumentor import BaseInstrumentor\nfrom opentelemetry.instrumentation.utils import unwrap\nfrom opentelemetry.semconv.trace import (\n DbSystemValues,\n NetTransportValues,\n SpanAttributes,\n)\nfrom opentelemetry.trace import SpanKind\nfrom opentelemetry.trace.status import Status, StatusCode\n\n\ndef _hydrate_span_from_args(connection, query, parameters) -> dict:\n \"\"\"Get network and database attributes from connection.\"\"\"\n span_attributes = {\n SpanAttributes.DB_SYSTEM: DbSystemValues.POSTGRESQL.value\n }\n\n # connection contains _params attribute which is a namedtuple ConnectionParameters.\n # https://github.com/MagicStack/asyncpg/blob/master/asyncpg/connection.py#L68\n\n params = getattr(connection, \"_params\", None)\n dbname = getattr(params, \"database\", None)\n if dbname:\n span_attributes[SpanAttributes.DB_NAME] = dbname\n user = getattr(params, \"user\", None)\n if user:\n span_attributes[SpanAttributes.DB_USER] = user\n\n # connection contains _addr attribute which is either a host/port tuple, or unix socket string\n # https://magicstack.github.io/asyncpg/current/_modules/asyncpg/connection.html\n addr = getattr(connection, \"_addr\", None)\n if isinstance(addr, tuple):\n span_attributes[SpanAttributes.NET_PEER_NAME] = addr[0]\n span_attributes[SpanAttributes.NET_PEER_PORT] = addr[1]\n span_attributes[\n SpanAttributes.NET_TRANSPORT\n ] = NetTransportValues.IP_TCP.value\n elif isinstance(addr, str):\n span_attributes[SpanAttributes.NET_PEER_NAME] = addr\n span_attributes[\n SpanAttributes.NET_TRANSPORT\n ] = NetTransportValues.OTHER.value\n\n if query is not None:\n span_attributes[SpanAttributes.DB_STATEMENT] = query\n\n if parameters is not None and len(parameters) > 0:\n 
span_attributes[\"db.statement.parameters\"] = str(parameters)\n\n return span_attributes\n\n\nclass AsyncPGInstrumentor(BaseInstrumentor):\n def __init__(self, capture_parameters=False):\n super().__init__()\n self.capture_parameters = capture_parameters\n self._tracer = None\n self._leading_comment_remover = re.compile(r\"^/\\*.*?\\*/\")\n\n def instrumentation_dependencies(self) -> Collection[str]:\n return _instruments\n\n def _instrument(self, **kwargs):\n tracer_provider = kwargs.get(\"tracer_provider\")\n self._tracer = trace.get_tracer(\n __name__,\n __version__,\n tracer_provider,\n schema_url=\"https://opentelemetry.io/schemas/1.11.0\",\n )\n\n for method in [\n \"Connection.execute\",\n \"Connection.executemany\",\n \"Connection.fetch\",\n \"Connection.fetchval\",\n \"Connection.fetchrow\",\n ]:\n wrapt.wrap_function_wrapper(\n \"asyncpg.connection\", method, self._do_execute\n )\n\n def _uninstrument(self, **__):\n for method in [\n \"execute\",\n \"executemany\",\n \"fetch\",\n \"fetchval\",\n \"fetchrow\",\n ]:\n unwrap(asyncpg.Connection, method)\n\n async def _do_execute(self, func, instance, args, kwargs):\n exception = None\n params = getattr(instance, \"_params\", {})\n name = args[0] if args[0] else params.get(\"database\", \"postgresql\")\n\n try:\n # Strip leading comments so we get the operation name.\n name = self._leading_comment_remover.sub(\"\", name).split()[0]\n except IndexError:\n name = \"\"\n\n with self._tracer.start_as_current_span(\n name, kind=SpanKind.CLIENT\n ) as span:\n if span.is_recording():\n span_attributes = _hydrate_span_from_args(\n instance,\n args[0],\n args[1:] if self.capture_parameters else None,\n )\n for attribute, value in span_attributes.items():\n span.set_attribute(attribute, value)\n\n try:\n result = await func(*args, **kwargs)\n except Exception as exc: # pylint: disable=W0703\n exception = exc\n raise\n finally:\n if span.is_recording() and exception is not None:\n span.set_status(Status(StatusCode.ERROR))\n\n return result\n", "path": "instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py"}], "after_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nThis library allows tracing PostgreSQL queries made by the\n`asyncpg <https://magicstack.github.io/asyncpg/current/>`_ library.\n\nUsage\n-----\n\n.. 
code-block:: python\n\n import asyncpg\n from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor\n\n # You can optionally pass a custom TracerProvider to AsyncPGInstrumentor.instrument()\n AsyncPGInstrumentor().instrument()\n conn = await asyncpg.connect(user='user', password='password',\n database='database', host='127.0.0.1')\n values = await conn.fetch('''SELECT 42;''')\n\nAPI\n---\n\"\"\"\n\nimport re\nfrom typing import Collection\n\nimport asyncpg\nimport wrapt\n\nfrom opentelemetry import trace\nfrom opentelemetry.instrumentation.asyncpg.package import _instruments\nfrom opentelemetry.instrumentation.asyncpg.version import __version__\nfrom opentelemetry.instrumentation.instrumentor import BaseInstrumentor\nfrom opentelemetry.instrumentation.utils import unwrap\nfrom opentelemetry.semconv.trace import (\n DbSystemValues,\n NetTransportValues,\n SpanAttributes,\n)\nfrom opentelemetry.trace import SpanKind\nfrom opentelemetry.trace.status import Status, StatusCode\n\n\ndef _hydrate_span_from_args(connection, query, parameters) -> dict:\n \"\"\"Get network and database attributes from connection.\"\"\"\n span_attributes = {\n SpanAttributes.DB_SYSTEM: DbSystemValues.POSTGRESQL.value\n }\n\n # connection contains _params attribute which is a namedtuple ConnectionParameters.\n # https://github.com/MagicStack/asyncpg/blob/master/asyncpg/connection.py#L68\n\n params = getattr(connection, \"_params\", None)\n dbname = getattr(params, \"database\", None)\n if dbname:\n span_attributes[SpanAttributes.DB_NAME] = dbname\n user = getattr(params, \"user\", None)\n if user:\n span_attributes[SpanAttributes.DB_USER] = user\n\n # connection contains _addr attribute which is either a host/port tuple, or unix socket string\n # https://magicstack.github.io/asyncpg/current/_modules/asyncpg/connection.html\n addr = getattr(connection, \"_addr\", None)\n if isinstance(addr, tuple):\n span_attributes[SpanAttributes.NET_PEER_NAME] = addr[0]\n span_attributes[SpanAttributes.NET_PEER_PORT] = addr[1]\n span_attributes[\n SpanAttributes.NET_TRANSPORT\n ] = NetTransportValues.IP_TCP.value\n elif isinstance(addr, str):\n span_attributes[SpanAttributes.NET_PEER_NAME] = addr\n span_attributes[\n SpanAttributes.NET_TRANSPORT\n ] = NetTransportValues.OTHER.value\n\n if query is not None:\n span_attributes[SpanAttributes.DB_STATEMENT] = query\n\n if parameters is not None and len(parameters) > 0:\n span_attributes[\"db.statement.parameters\"] = str(parameters)\n\n return span_attributes\n\n\nclass AsyncPGInstrumentor(BaseInstrumentor):\n\n _leading_comment_remover = re.compile(r\"^/\\*.*?\\*/\")\n _tracer = None\n\n def __init__(self, capture_parameters=False):\n super().__init__()\n self.capture_parameters = capture_parameters\n\n def instrumentation_dependencies(self) -> Collection[str]:\n return _instruments\n\n def _instrument(self, **kwargs):\n tracer_provider = kwargs.get(\"tracer_provider\")\n self._tracer = trace.get_tracer(\n __name__,\n __version__,\n tracer_provider,\n schema_url=\"https://opentelemetry.io/schemas/1.11.0\",\n )\n\n for method in [\n \"Connection.execute\",\n \"Connection.executemany\",\n \"Connection.fetch\",\n \"Connection.fetchval\",\n \"Connection.fetchrow\",\n ]:\n wrapt.wrap_function_wrapper(\n \"asyncpg.connection\", method, self._do_execute\n )\n\n def _uninstrument(self, **__):\n for method in [\n \"execute\",\n \"executemany\",\n \"fetch\",\n \"fetchval\",\n \"fetchrow\",\n ]:\n unwrap(asyncpg.Connection, method)\n\n async def _do_execute(self, func, instance, args, 
kwargs):\n exception = None\n params = getattr(instance, \"_params\", {})\n name = args[0] if args[0] else params.get(\"database\", \"postgresql\")\n\n try:\n # Strip leading comments so we get the operation name.\n name = self._leading_comment_remover.sub(\"\", name).split()[0]\n except IndexError:\n name = \"\"\n\n with self._tracer.start_as_current_span(\n name, kind=SpanKind.CLIENT\n ) as span:\n if span.is_recording():\n span_attributes = _hydrate_span_from_args(\n instance,\n args[0],\n args[1:] if self.capture_parameters else None,\n )\n for attribute, value in span_attributes.items():\n span.set_attribute(attribute, value)\n\n try:\n result = await func(*args, **kwargs)\n except Exception as exc: # pylint: disable=W0703\n exception = exc\n raise\n finally:\n if span.is_recording() and exception is not None:\n span.set_status(Status(StatusCode.ERROR))\n\n return result\n", "path": "instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py"}]} | 2,332 | 235 |
gh_patches_debug_11654 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-2919 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update socks mode access commandline + documentation (v3.0.2)
##### Steps to reproduce the problem:
1. mitmproxy --socks
"--socks is deprecated
Please use '--set socks=value' instead"
2. Check online documentation at:
https://mitmproxy.org/docs/latest/concepts-modes/#socks-proxy
3. Check mitmproxy --help
##### Any other comments? What have you tried so far?
1. The advice given here doesn't appear to work (no combinations I tried were accepted).
2. The online documentation stops at Socks Proxy (no content)
3. The --help text shows the correct method (--mode socks5)
##### System information
Mitmproxy: 3.0.2
Python: 3.5.5rc1
OpenSSL: OpenSSL 1.0.1f 6 Jan 2014
Platform: Linux-2.6.39.4-kat124-ga627d40-armv7l-with-debian-jessie-sid
(Android: KatKiss Marshmallow hosting Linux Deploy: Ubuntu Trusty [armhf] using pyenv)
--- END ISSUE ---
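As a rough sketch of the replacement-hint behaviour this report asks for (the table below is a hypothetical one-entry excerpt, not mitmproxy's full option table):
```
# Hypothetical, trimmed-down table: deprecated flag -> suggested replacement.
REPLACEMENTS = {"--socks": "--mode socks5"}


def check(argv):
    for option, replacement in REPLACEMENTS.items():
        if option in argv:
            print("{} is deprecated.\nPlease use `{}` instead.".format(option, replacement))


check(["mitmproxy", "--socks"])
# prints: --socks is deprecated.
#         Please use `--mode socks5` instead.
```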
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/utils/arg_check.py`
Content:
```
1 import sys
2
3 DEPRECATED = """
4 --cadir
5 -Z
6 --body-size-limit
7 --stream
8 --palette
9 --palette-transparent
10 --follow
11 --order
12 --no-mouse
13 --reverse
14 --socks
15 --http2-priority
16 --no-http2-priority
17 --no-websocket
18 --websocket
19 --spoof-source-address
20 --upstream-bind-address
21 --ciphers-client
22 --ciphers-server
23 --client-certs
24 --no-upstream-cert
25 --add-upstream-certs-to-client-chain
26 --upstream-trusted-cadir
27 --upstream-trusted-ca
28 --ssl-version-client
29 --ssl-version-server
30 --no-onboarding
31 --onboarding-host
32 --onboarding-port
33 --server-replay-use-header
34 --no-pop
35 --replay-ignore-content
36 --replay-ignore-payload-param
37 --replay-ignore-param
38 --replay-ignore-host
39 --replace-from-file
40 """
41
42 REPLACED = """
43 -t
44 -u
45 --wfile
46 -a
47 --afile
48 -z
49 -b
50 --bind-address
51 --port
52 -I
53 --ignore
54 --tcp
55 --cert
56 --insecure
57 -c
58 --replace
59 -i
60 -f
61 --filter
62 """
63
64 REPLACEMENTS = {
65 "--stream": "stream_large_bodies",
66 "--palette": "console_palette",
67 "--palette-transparent": "console_palette_transparent:",
68 "--follow": "console_focus_follow",
69 "--order": "view_order",
70 "--no-mouse": "console_mouse",
71 "--reverse": "view_order_reversed",
72 "--no-http2-priority": "http2_priority",
73 "--no-websocket": "websocket",
74 "--no-upstream-cert": "upstream_cert",
75 "--upstream-trusted-cadir": "ssl_verify_upstream_trusted_cadir",
76 "--upstream-trusted-ca": "ssl_verify_upstream_trusted_ca",
77 "--no-onboarding": "onboarding",
78 "--no-pop": "server_replay_nopop",
79 "--replay-ignore-content": "server_replay_ignore_content",
80 "--replay-ignore-payload-param": "server_replay_ignore_payload_params",
81 "--replay-ignore-param": "server_replay_ignore_params",
82 "--replay-ignore-host": "server_replay_ignore_host",
83 "--replace-from-file": "replacements (use @ to specify path)",
84 "-t": "--stickycookie",
85 "-u": "--stickyauth",
86 "--wfile": "--save-stream-file",
87 "-a": "-w Prefix path with + to append.",
88 "--afile": "-w Prefix path with + to append.",
89 "-z": "--anticomp",
90 "-b": "--listen-host",
91 "--bind-address": "--listen-host",
92 "--port": "--listen-port",
93 "-I": "--ignore-hosts",
94 "--ignore": "--ignore-hosts",
95 "--tcp": "--tcp-hosts",
96 "--cert": "--certs",
97 "--insecure": "--ssl-insecure",
98 "-c": "-C",
99 "--replace": "--replacements",
100 "-i": "--intercept",
101 "-f": "--view-filter",
102 "--filter": "--view-filter"
103 }
104
105
106 def check():
107 args = sys.argv[1:]
108 print()
109 if "-U" in args:
110 print("-U is deprecated, please use --mode upstream:SPEC instead")
111
112 if "-T" in args:
113 print("-T is deprecated, please use --mode transparent instead")
114
115 for option in ("-e", "--eventlog", "--norefresh"):
116 if option in args:
117 print("{} has been removed.".format(option))
118
119 for option in ("--nonanonymous", "--singleuser", "--htpasswd"):
120 if option in args:
121 print(
122 '{} is deprecated.\n'
123 'Please use `--proxyauth SPEC` instead.\n'
124 'SPEC Format: "username:pass", "any" to accept any user/pass combination,\n'
125 '"@path" to use an Apache htpasswd file, or\n'
126 '"ldap[s]:url_server_ldap:dn_auth:password:dn_subtree" '
127 'for LDAP authentication.'.format(option))
128
129 for option in REPLACED.splitlines():
130 if option in args:
131 print(
132 "{} is deprecated.\n"
133 "Please use `{}` instead.".format(
134 option,
135 REPLACEMENTS.get(option)
136 )
137 )
138
139 for option in DEPRECATED.splitlines():
140 if option in args:
141 print(
142 "{} is deprecated.\n"
143 "Please use `--set {}=value` instead.\n"
144 "To show all options and their default values use --options".format(
145 option,
146 REPLACEMENTS.get(option, None) or option.lstrip("-").replace("-", "_")
147 )
148 )
149
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mitmproxy/utils/arg_check.py b/mitmproxy/utils/arg_check.py
--- a/mitmproxy/utils/arg_check.py
+++ b/mitmproxy/utils/arg_check.py
@@ -11,7 +11,6 @@
--order
--no-mouse
--reverse
---socks
--http2-priority
--no-http2-priority
--no-websocket
@@ -59,6 +58,7 @@
-i
-f
--filter
+--socks
"""
REPLACEMENTS = {
@@ -99,7 +99,8 @@
"--replace": "--replacements",
"-i": "--intercept",
"-f": "--view-filter",
- "--filter": "--view-filter"
+ "--filter": "--view-filter",
+ "--socks": "--mode socks5"
}
| {"golden_diff": "diff --git a/mitmproxy/utils/arg_check.py b/mitmproxy/utils/arg_check.py\n--- a/mitmproxy/utils/arg_check.py\n+++ b/mitmproxy/utils/arg_check.py\n@@ -11,7 +11,6 @@\n --order\n --no-mouse\n --reverse\n---socks\n --http2-priority\n --no-http2-priority\n --no-websocket\n@@ -59,6 +58,7 @@\n -i\n -f\n --filter\n+--socks\n \"\"\"\n \n REPLACEMENTS = {\n@@ -99,7 +99,8 @@\n \"--replace\": \"--replacements\",\n \"-i\": \"--intercept\",\n \"-f\": \"--view-filter\",\n- \"--filter\": \"--view-filter\"\n+ \"--filter\": \"--view-filter\",\n+ \"--socks\": \"--mode socks5\"\n }\n", "issue": "Update socks mode access commandline + documentation (v3.0.2)\n##### Steps to reproduce the problem:\r\n1. mitmproxy --socks\r\n\"--socks is deprecated \r\nPlease use '--set socks=value' instead\"\r\n2. Check online documentation at:\r\nhttps://mitmproxy.org/docs/latest/concepts-modes/#socks-proxy\r\n3. Check mitmproxy --help\r\n\r\n##### Any other comments? What have you tried so far?\r\n1. The advice given here doesn't appear to work (no combinations I tried were accepted).\r\n2. The online documentation stops at Socks Proxy (no content)\r\n3. The --help text shows the correct method (--mode socks5)\r\n\r\n##### System information\r\nMitmproxy: 3.0.2\r\nPython: 3.5.5rc1\r\nOpenSSL: OpenSSL 1.0.1f 6 Jan 2014\r\nPlatform: Linux-2.6.39.4-kat124-ga627d40-armv7l-with-debian-jessie-sid\r\n\r\n(Android: KatKiss Marshmallow hosting Linux Deploy: Ubuntu Trusty [armhf] using pyenv)\r\n\r\n<!-- Please use the mitmproxy forums (https://discourse.mitmproxy.org/) for support/how-to questions. Thanks! :) -->\r\n\n", "before_files": [{"content": "import sys\n\nDEPRECATED = \"\"\"\n--cadir\n-Z\n--body-size-limit\n--stream\n--palette\n--palette-transparent\n--follow\n--order\n--no-mouse\n--reverse\n--socks\n--http2-priority\n--no-http2-priority\n--no-websocket\n--websocket\n--spoof-source-address\n--upstream-bind-address\n--ciphers-client\n--ciphers-server\n--client-certs\n--no-upstream-cert\n--add-upstream-certs-to-client-chain\n--upstream-trusted-cadir\n--upstream-trusted-ca\n--ssl-version-client\n--ssl-version-server\n--no-onboarding\n--onboarding-host\n--onboarding-port\n--server-replay-use-header\n--no-pop\n--replay-ignore-content\n--replay-ignore-payload-param\n--replay-ignore-param\n--replay-ignore-host\n--replace-from-file\n\"\"\"\n\nREPLACED = \"\"\"\n-t\n-u\n--wfile\n-a\n--afile\n-z\n-b\n--bind-address\n--port\n-I\n--ignore\n--tcp\n--cert\n--insecure\n-c\n--replace\n-i\n-f\n--filter\n\"\"\"\n\nREPLACEMENTS = {\n \"--stream\": \"stream_large_bodies\",\n \"--palette\": \"console_palette\",\n \"--palette-transparent\": \"console_palette_transparent:\",\n \"--follow\": \"console_focus_follow\",\n \"--order\": \"view_order\",\n \"--no-mouse\": \"console_mouse\",\n \"--reverse\": \"view_order_reversed\",\n \"--no-http2-priority\": \"http2_priority\",\n \"--no-websocket\": \"websocket\",\n \"--no-upstream-cert\": \"upstream_cert\",\n \"--upstream-trusted-cadir\": \"ssl_verify_upstream_trusted_cadir\",\n \"--upstream-trusted-ca\": \"ssl_verify_upstream_trusted_ca\",\n \"--no-onboarding\": \"onboarding\",\n \"--no-pop\": \"server_replay_nopop\",\n \"--replay-ignore-content\": \"server_replay_ignore_content\",\n \"--replay-ignore-payload-param\": \"server_replay_ignore_payload_params\",\n \"--replay-ignore-param\": \"server_replay_ignore_params\",\n \"--replay-ignore-host\": \"server_replay_ignore_host\",\n \"--replace-from-file\": \"replacements (use @ to specify path)\",\n \"-t\": \"--stickycookie\",\n \"-u\": 
\"--stickyauth\",\n \"--wfile\": \"--save-stream-file\",\n \"-a\": \"-w Prefix path with + to append.\",\n \"--afile\": \"-w Prefix path with + to append.\",\n \"-z\": \"--anticomp\",\n \"-b\": \"--listen-host\",\n \"--bind-address\": \"--listen-host\",\n \"--port\": \"--listen-port\",\n \"-I\": \"--ignore-hosts\",\n \"--ignore\": \"--ignore-hosts\",\n \"--tcp\": \"--tcp-hosts\",\n \"--cert\": \"--certs\",\n \"--insecure\": \"--ssl-insecure\",\n \"-c\": \"-C\",\n \"--replace\": \"--replacements\",\n \"-i\": \"--intercept\",\n \"-f\": \"--view-filter\",\n \"--filter\": \"--view-filter\"\n}\n\n\ndef check():\n args = sys.argv[1:]\n print()\n if \"-U\" in args:\n print(\"-U is deprecated, please use --mode upstream:SPEC instead\")\n\n if \"-T\" in args:\n print(\"-T is deprecated, please use --mode transparent instead\")\n\n for option in (\"-e\", \"--eventlog\", \"--norefresh\"):\n if option in args:\n print(\"{} has been removed.\".format(option))\n\n for option in (\"--nonanonymous\", \"--singleuser\", \"--htpasswd\"):\n if option in args:\n print(\n '{} is deprecated.\\n'\n 'Please use `--proxyauth SPEC` instead.\\n'\n 'SPEC Format: \"username:pass\", \"any\" to accept any user/pass combination,\\n'\n '\"@path\" to use an Apache htpasswd file, or\\n'\n '\"ldap[s]:url_server_ldap:dn_auth:password:dn_subtree\" '\n 'for LDAP authentication.'.format(option))\n\n for option in REPLACED.splitlines():\n if option in args:\n print(\n \"{} is deprecated.\\n\"\n \"Please use `{}` instead.\".format(\n option,\n REPLACEMENTS.get(option)\n )\n )\n\n for option in DEPRECATED.splitlines():\n if option in args:\n print(\n \"{} is deprecated.\\n\"\n \"Please use `--set {}=value` instead.\\n\"\n \"To show all options and their default values use --options\".format(\n option,\n REPLACEMENTS.get(option, None) or option.lstrip(\"-\").replace(\"-\", \"_\")\n )\n )\n", "path": "mitmproxy/utils/arg_check.py"}], "after_files": [{"content": "import sys\n\nDEPRECATED = \"\"\"\n--cadir\n-Z\n--body-size-limit\n--stream\n--palette\n--palette-transparent\n--follow\n--order\n--no-mouse\n--reverse\n--http2-priority\n--no-http2-priority\n--no-websocket\n--websocket\n--spoof-source-address\n--upstream-bind-address\n--ciphers-client\n--ciphers-server\n--client-certs\n--no-upstream-cert\n--add-upstream-certs-to-client-chain\n--upstream-trusted-cadir\n--upstream-trusted-ca\n--ssl-version-client\n--ssl-version-server\n--no-onboarding\n--onboarding-host\n--onboarding-port\n--server-replay-use-header\n--no-pop\n--replay-ignore-content\n--replay-ignore-payload-param\n--replay-ignore-param\n--replay-ignore-host\n--replace-from-file\n\"\"\"\n\nREPLACED = \"\"\"\n-t\n-u\n--wfile\n-a\n--afile\n-z\n-b\n--bind-address\n--port\n-I\n--ignore\n--tcp\n--cert\n--insecure\n-c\n--replace\n-i\n-f\n--filter\n--socks\n\"\"\"\n\nREPLACEMENTS = {\n \"--stream\": \"stream_large_bodies\",\n \"--palette\": \"console_palette\",\n \"--palette-transparent\": \"console_palette_transparent:\",\n \"--follow\": \"console_focus_follow\",\n \"--order\": \"view_order\",\n \"--no-mouse\": \"console_mouse\",\n \"--reverse\": \"view_order_reversed\",\n \"--no-http2-priority\": \"http2_priority\",\n \"--no-websocket\": \"websocket\",\n \"--no-upstream-cert\": \"upstream_cert\",\n \"--upstream-trusted-cadir\": \"ssl_verify_upstream_trusted_cadir\",\n \"--upstream-trusted-ca\": \"ssl_verify_upstream_trusted_ca\",\n \"--no-onboarding\": \"onboarding\",\n \"--no-pop\": \"server_replay_nopop\",\n \"--replay-ignore-content\": \"server_replay_ignore_content\",\n 
\"--replay-ignore-payload-param\": \"server_replay_ignore_payload_params\",\n \"--replay-ignore-param\": \"server_replay_ignore_params\",\n \"--replay-ignore-host\": \"server_replay_ignore_host\",\n \"--replace-from-file\": \"replacements (use @ to specify path)\",\n \"-t\": \"--stickycookie\",\n \"-u\": \"--stickyauth\",\n \"--wfile\": \"--save-stream-file\",\n \"-a\": \"-w Prefix path with + to append.\",\n \"--afile\": \"-w Prefix path with + to append.\",\n \"-z\": \"--anticomp\",\n \"-b\": \"--listen-host\",\n \"--bind-address\": \"--listen-host\",\n \"--port\": \"--listen-port\",\n \"-I\": \"--ignore-hosts\",\n \"--ignore\": \"--ignore-hosts\",\n \"--tcp\": \"--tcp-hosts\",\n \"--cert\": \"--certs\",\n \"--insecure\": \"--ssl-insecure\",\n \"-c\": \"-C\",\n \"--replace\": \"--replacements\",\n \"-i\": \"--intercept\",\n \"-f\": \"--view-filter\",\n \"--filter\": \"--view-filter\",\n \"--socks\": \"--mode socks5\"\n}\n\n\ndef check():\n args = sys.argv[1:]\n print()\n if \"-U\" in args:\n print(\"-U is deprecated, please use --mode upstream:SPEC instead\")\n\n if \"-T\" in args:\n print(\"-T is deprecated, please use --mode transparent instead\")\n\n for option in (\"-e\", \"--eventlog\", \"--norefresh\"):\n if option in args:\n print(\"{} has been removed.\".format(option))\n\n for option in (\"--nonanonymous\", \"--singleuser\", \"--htpasswd\"):\n if option in args:\n print(\n '{} is deprecated.\\n'\n 'Please use `--proxyauth SPEC` instead.\\n'\n 'SPEC Format: \"username:pass\", \"any\" to accept any user/pass combination,\\n'\n '\"@path\" to use an Apache htpasswd file, or\\n'\n '\"ldap[s]:url_server_ldap:dn_auth:password:dn_subtree\" '\n 'for LDAP authentication.'.format(option))\n\n for option in REPLACED.splitlines():\n if option in args:\n print(\n \"{} is deprecated.\\n\"\n \"Please use `{}` instead.\".format(\n option,\n REPLACEMENTS.get(option)\n )\n )\n\n for option in DEPRECATED.splitlines():\n if option in args:\n print(\n \"{} is deprecated.\\n\"\n \"Please use `--set {}=value` instead.\\n\"\n \"To show all options and their default values use --options\".format(\n option,\n REPLACEMENTS.get(option, None) or option.lstrip(\"-\").replace(\"-\", \"_\")\n )\n )\n", "path": "mitmproxy/utils/arg_check.py"}]} | 1,903 | 188 |
gh_patches_debug_39916 | rasdani/github-patches | git_diff | rasterio__rasterio-826 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Profile class to deprecate affine key
This is a common pattern in some rasterio code, ironically intended as a way to prepare for the future by ensuring `Affine` objects get passed as the `transform` kwarg.
```
with rasterio.open(src_path) as src:
    opts = src.profile.copy()
    opts['transform'] = opts['affine']
```
With the merger of the `drop-dep` branch in PR #763, we get a `KeyError: 'affine'`.
Adding `affine` to the profile dictionary would maintain backwards compatibility, but it would lead to unavoidable warnings when unpacking kwargs with `open(..., **profile)`.
**Proposed solution**: Modify the `rasterio.profiles.Profile` class to behave like a dict but deprecate the usage of the `affine` key (Warning on `__getitem__`, AttributeError on `__setitem__`). Then `src.profile` could return an instance of this class instead of a dict.
Continuation of the discussion started here: https://github.com/mapbox/rasterio/pull/763#issuecomment-229737519 but broken out into a separate issue.
--- END ISSUE ---
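A minimal sketch of the dict-with-deprecated-alias behaviour proposed above; the class here is illustrative only and is not the implementation rasterio ships:
```
import warnings


class AliasedProfile(dict):
    """Dict that treats 'affine' as a deprecated read-only alias for 'transform'."""

    def __getitem__(self, key):
        if key == "affine":
            warnings.warn("use 'transform' instead of 'affine'", DeprecationWarning)
            key = "transform"
        return super().__getitem__(key)

    def __setitem__(self, key, value):
        if key == "affine":
            raise TypeError("the 'affine' key is prohibited; set 'transform' instead")
        super().__setitem__(key, value)


profile = AliasedProfile(transform="<Affine matrix>", width=512, height=512)
profile["affine"]        # issues a DeprecationWarning, returns the 'transform' value
# profile["affine"] = 1  # would raise TypeError
```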
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rasterio/rio/info.py`
Content:
```
1 """$ rio info"""
2
3
4 import json
5
6 import click
7
8 import rasterio
9 import rasterio.crs
10 from rasterio.rio import options
11
12
13 @click.command(short_help="Print information about a data file.")
14 @options.file_in_arg
15 @click.option('--meta', 'aspect', flag_value='meta', default=True,
16 help="Show data file structure (default).")
17 @click.option('--tags', 'aspect', flag_value='tags',
18 help="Show data file tags.")
19 @click.option('--namespace', help="Select a tag namespace.")
20 @click.option('--indent', default=None, type=int,
21 help="Indentation level for pretty printed output")
22 # Options to pick out a single metadata item and print it as
23 # a string.
24 @click.option('--count', 'meta_member', flag_value='count',
25 help="Print the count of bands.")
26 @click.option('-t', '--dtype', 'meta_member', flag_value='dtype',
27 help="Print the dtype name.")
28 @click.option('--nodata', 'meta_member', flag_value='nodata',
29 help="Print the nodata value.")
30 @click.option('-f', '--format', '--driver', 'meta_member', flag_value='driver',
31 help="Print the format driver.")
32 @click.option('--shape', 'meta_member', flag_value='shape',
33 help="Print the (height, width) shape.")
34 @click.option('--height', 'meta_member', flag_value='height',
35 help="Print the height (number of rows).")
36 @click.option('--width', 'meta_member', flag_value='width',
37 help="Print the width (number of columns).")
38 @click.option('--crs', 'meta_member', flag_value='crs',
39 help="Print the CRS as a PROJ.4 string.")
40 @click.option('--bounds', 'meta_member', flag_value='bounds',
41 help="Print the boundary coordinates "
42 "(left, bottom, right, top).")
43 @click.option('-r', '--res', 'meta_member', flag_value='res',
44 help="Print pixel width and height.")
45 @click.option('--lnglat', 'meta_member', flag_value='lnglat',
46 help="Print longitude and latitude at center.")
47 @click.option('--stats', 'meta_member', flag_value='stats',
48 help="Print statistics (min, max, mean) of a single band "
49 "(use --bidx).")
50 @click.option('--checksum', 'meta_member', flag_value='checksum',
51 help="Print integer checksum of a single band "
52 "(use --bidx).")
53 @click.option('-v', '--tell-me-more', '--verbose', is_flag=True,
54 help="Output extra information.")
55 @options.bidx_opt
56 @options.masked_opt
57 @click.pass_context
58 def info(ctx, input, aspect, indent, namespace, meta_member, verbose, bidx,
59 masked):
60 """Print metadata about the dataset as JSON.
61
62 Optionally print a single metadata item as a string.
63 """
64 verbosity = ctx.obj.get('verbosity')
65 mode = 'r' if (verbose or meta_member == 'stats') else 'r-'
66 try:
67 with rasterio.Env(CPL_DEBUG=(verbosity > 2)):
68 with rasterio.open(input, mode) as src:
69 info = src.profile
70 info['shape'] = info['height'], info['width']
71 info['bounds'] = src.bounds
72 proj4 = src.crs.to_string()
73 if proj4.startswith('+init=epsg'):
74 proj4 = proj4.split('=')[1].upper()
75 info['crs'] = proj4
76 info['res'] = src.res
77 info['colorinterp'] = [src.colorinterp(i).name
78 for i in src.indexes]
79 if proj4 != '':
80 info['lnglat'] = src.lnglat()
81 if verbose:
82 stats = [{'min': float(b.min()),
83 'max': float(b.max()),
84 'mean': float(b.mean())
85 } for b in src.read(masked=masked)]
86 info['stats'] = stats
87 info['checksum'] = [src.checksum(i) for i in src.indexes]
88 if aspect == 'meta':
89 if meta_member == 'stats':
90 band = src.read(bidx, masked=masked)
91 click.echo('%f %f %f' % (
92 float(band.min()),
93 float(band.max()),
94 float(band.mean())))
95 elif meta_member == 'checksum':
96 click.echo(str(src.checksum(bidx)))
97 elif meta_member:
98 if isinstance(info[meta_member], (list, tuple)):
99 click.echo(" ".join(map(str, info[meta_member])))
100 else:
101 click.echo(info[meta_member])
102 else:
103 click.echo(json.dumps(info, indent=indent))
104 elif aspect == 'tags':
105 click.echo(
106 json.dumps(src.tags(ns=namespace), indent=indent))
107 except Exception:
108 raise click.Abort()
109
```
Path: `rasterio/profiles.py`
Content:
```
1 """Raster dataset profiles."""
2
3 from rasterio.dtypes import uint8
4
5
6 class Profile:
7 """Base class for Rasterio dataset profiles.
8
9 Subclasses will declare a format driver and driver-specific
10 creation options.
11 """
12
13 driver = None
14 defaults = {}
15
16 def __call__(self, **kwargs):
17 """Return a mapping of keyword args for writing a new datasets.
18
19 Example:
20
21 profile = SomeProfile()
22 with rasterio.open('foo.tif', 'w', **profile()) as dst:
23 # Write data ...
24 """
25 if kwargs.get('driver', self.driver) != self.driver:
26 raise ValueError(
27 "Overriding this profile's driver is not allowed.")
28 profile = self.defaults.copy()
29 profile.update(**kwargs)
30 profile['driver'] = self.driver
31 return profile
32
33
34 class DefaultGTiffProfile(Profile):
35 """A tiled, band-interleaved, LZW-compressed, 8-bit GTiff profile."""
36
37 driver = 'GTiff'
38 defaults = {
39 'interleave': 'band',
40 'tiled': True,
41 'blockxsize': 256,
42 'blockysize': 256,
43 'compress': 'lzw',
44 'nodata': 0,
45 'dtype': uint8
46 }
47
48
49 default_gtiff_profile = DefaultGTiffProfile()
50
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/rasterio/profiles.py b/rasterio/profiles.py
--- a/rasterio/profiles.py
+++ b/rasterio/profiles.py
@@ -1,41 +1,69 @@
"""Raster dataset profiles."""
+import warnings
+
+from rasterio.compat import UserDict
from rasterio.dtypes import uint8
-class Profile:
+class Profile(UserDict):
"""Base class for Rasterio dataset profiles.
- Subclasses will declare a format driver and driver-specific
- creation options.
+ Subclasses will declare driver-specific creation options.
"""
- driver = None
defaults = {}
- def __call__(self, **kwargs):
- """Return a mapping of keyword args for writing a new datasets.
+ def __init__(self, data={}, **kwds):
+ """Create a new profile based on the class defaults, which are
+ overlaid with items from the `data` dict and keyword arguments."""
+ UserDict.__init__(self)
+ initdata = self.defaults.copy()
+ initdata.update(data)
+ initdata.update(**kwds)
+
+ if 'affine' in initdata and 'transform' in initdata:
+ warnings.warn("affine item is deprecated, use transform only",
+ DeprecationWarning)
+ del initdata['affine']
+ elif 'affine' in initdata:
+ warnings.warn("affine item is deprecated, use transform instead",
+ DeprecationWarning)
+ initdata['transform'] = initdata.pop('affine')
+
+ self.data.update(initdata)
+
+ def __getitem__(self, key):
+ """Like normal item access but with affine alias."""
+ if key == 'affine':
+ key = 'transform'
+ warnings.warn("affine item is deprecated, use transform instead",
+ DeprecationWarning)
+ return self.data[key]
+
+ def __setitem__(self, key, val):
+ """Like normal item setter but forbidding affine item."""
+ if key == 'affine':
+ raise TypeError("affine key is prohibited")
+ self.data[key] = val
- Example:
+ def __call__(self, **kwds):
+ """Return a mapping of keyword args.
- profile = SomeProfile()
- with rasterio.open('foo.tif', 'w', **profile()) as dst:
- # Write data ...
+ DEPRECATED.
"""
- if kwargs.get('driver', self.driver) != self.driver:
- raise ValueError(
- "Overriding this profile's driver is not allowed.")
- profile = self.defaults.copy()
- profile.update(**kwargs)
- profile['driver'] = self.driver
+ warnings.warn("__call__() is deprecated, use mapping methods instead",
+ DeprecationWarning)
+ profile = self.data.copy()
+ profile.update(**kwds)
return profile
class DefaultGTiffProfile(Profile):
- """A tiled, band-interleaved, LZW-compressed, 8-bit GTiff profile."""
+ """Tiled, band-interleaved, LZW-compressed, 8-bit GTiff."""
- driver = 'GTiff'
defaults = {
+ 'driver': 'GTiff',
'interleave': 'band',
'tiled': True,
'blockxsize': 256,
diff --git a/rasterio/rio/info.py b/rasterio/rio/info.py
--- a/rasterio/rio/info.py
+++ b/rasterio/rio/info.py
@@ -66,7 +66,7 @@
try:
with rasterio.Env(CPL_DEBUG=(verbosity > 2)):
with rasterio.open(input, mode) as src:
- info = src.profile
+ info = dict(src.profile)
info['shape'] = info['height'], info['width']
info['bounds'] = src.bounds
proj4 = src.crs.to_string()
| {"golden_diff": "diff --git a/rasterio/profiles.py b/rasterio/profiles.py\n--- a/rasterio/profiles.py\n+++ b/rasterio/profiles.py\n@@ -1,41 +1,69 @@\n \"\"\"Raster dataset profiles.\"\"\"\n \n+import warnings\n+\n+from rasterio.compat import UserDict\n from rasterio.dtypes import uint8\n \n \n-class Profile:\n+class Profile(UserDict):\n \"\"\"Base class for Rasterio dataset profiles.\n \n- Subclasses will declare a format driver and driver-specific\n- creation options.\n+ Subclasses will declare driver-specific creation options.\n \"\"\"\n \n- driver = None\n defaults = {}\n \n- def __call__(self, **kwargs):\n- \"\"\"Return a mapping of keyword args for writing a new datasets.\n+ def __init__(self, data={}, **kwds):\n+ \"\"\"Create a new profile based on the class defaults, which are\n+ overlaid with items from the `data` dict and keyword arguments.\"\"\"\n+ UserDict.__init__(self)\n+ initdata = self.defaults.copy()\n+ initdata.update(data)\n+ initdata.update(**kwds)\n+\n+ if 'affine' in initdata and 'transform' in initdata:\n+ warnings.warn(\"affine item is deprecated, use transform only\",\n+ DeprecationWarning)\n+ del initdata['affine']\n+ elif 'affine' in initdata:\n+ warnings.warn(\"affine item is deprecated, use transform instead\",\n+ DeprecationWarning)\n+ initdata['transform'] = initdata.pop('affine')\n+\n+ self.data.update(initdata)\n+\n+ def __getitem__(self, key):\n+ \"\"\"Like normal item access but with affine alias.\"\"\"\n+ if key == 'affine':\n+ key = 'transform'\n+ warnings.warn(\"affine item is deprecated, use transform instead\",\n+ DeprecationWarning)\n+ return self.data[key]\n+\n+ def __setitem__(self, key, val):\n+ \"\"\"Like normal item setter but forbidding affine item.\"\"\"\n+ if key == 'affine':\n+ raise TypeError(\"affine key is prohibited\")\n+ self.data[key] = val\n \n- Example:\n+ def __call__(self, **kwds):\n+ \"\"\"Return a mapping of keyword args.\n \n- profile = SomeProfile()\n- with rasterio.open('foo.tif', 'w', **profile()) as dst:\n- # Write data ...\n+ DEPRECATED.\n \"\"\"\n- if kwargs.get('driver', self.driver) != self.driver:\n- raise ValueError(\n- \"Overriding this profile's driver is not allowed.\")\n- profile = self.defaults.copy()\n- profile.update(**kwargs)\n- profile['driver'] = self.driver\n+ warnings.warn(\"__call__() is deprecated, use mapping methods instead\",\n+ DeprecationWarning)\n+ profile = self.data.copy()\n+ profile.update(**kwds)\n return profile\n \n \n class DefaultGTiffProfile(Profile):\n- \"\"\"A tiled, band-interleaved, LZW-compressed, 8-bit GTiff profile.\"\"\"\n+ \"\"\"Tiled, band-interleaved, LZW-compressed, 8-bit GTiff.\"\"\"\n \n- driver = 'GTiff'\n defaults = {\n+ 'driver': 'GTiff',\n 'interleave': 'band',\n 'tiled': True,\n 'blockxsize': 256,\ndiff --git a/rasterio/rio/info.py b/rasterio/rio/info.py\n--- a/rasterio/rio/info.py\n+++ b/rasterio/rio/info.py\n@@ -66,7 +66,7 @@\n try:\n with rasterio.Env(CPL_DEBUG=(verbosity > 2)):\n with rasterio.open(input, mode) as src:\n- info = src.profile\n+ info = dict(src.profile)\n info['shape'] = info['height'], info['width']\n info['bounds'] = src.bounds\n proj4 = src.crs.to_string()\n", "issue": "Profile class to deprecate affine key\nThis is a common pattern in some rasterio code, ironically intended as a way to prepare for the future by ensuring `Affine` objects get passed as the `transform` kwarg. 
\n\n```\n with rasterio.open(src_path) as src:\n opts = src.profile.copy()\n\n opts['transform'] = opts['affine']\n```\n\nWith the merger of the `drop-dep` branch in PR #763, we get a `KeyError: 'affine'`.\n\nAdding `affine` to profile dictionary would maintain backwards compat but would lead to unavoidable warnings when kwarg unpacking with `open(..., **profile)`\n\n**Proposed solution**: Modify the `rasterio.profiles.Profile` class to behave like a dict but deprecate the usage of the `affine` key (Warning on `__getitem__`, AttributeError on `__setitem__`). Then `src.profile` could return an instance of this class instead of a dict.\n\nContinuation of the discussion started here: https://github.com/mapbox/rasterio/pull/763#issuecomment-229737519 but broken out into a separate issue.\n\n", "before_files": [{"content": "\"\"\"$ rio info\"\"\"\n\n\nimport json\n\nimport click\n\nimport rasterio\nimport rasterio.crs\nfrom rasterio.rio import options\n\n\[email protected](short_help=\"Print information about a data file.\")\[email protected]_in_arg\[email protected]('--meta', 'aspect', flag_value='meta', default=True,\n help=\"Show data file structure (default).\")\[email protected]('--tags', 'aspect', flag_value='tags',\n help=\"Show data file tags.\")\[email protected]('--namespace', help=\"Select a tag namespace.\")\[email protected]('--indent', default=None, type=int,\n help=\"Indentation level for pretty printed output\")\n# Options to pick out a single metadata item and print it as\n# a string.\[email protected]('--count', 'meta_member', flag_value='count',\n help=\"Print the count of bands.\")\[email protected]('-t', '--dtype', 'meta_member', flag_value='dtype',\n help=\"Print the dtype name.\")\[email protected]('--nodata', 'meta_member', flag_value='nodata',\n help=\"Print the nodata value.\")\[email protected]('-f', '--format', '--driver', 'meta_member', flag_value='driver',\n help=\"Print the format driver.\")\[email protected]('--shape', 'meta_member', flag_value='shape',\n help=\"Print the (height, width) shape.\")\[email protected]('--height', 'meta_member', flag_value='height',\n help=\"Print the height (number of rows).\")\[email protected]('--width', 'meta_member', flag_value='width',\n help=\"Print the width (number of columns).\")\[email protected]('--crs', 'meta_member', flag_value='crs',\n help=\"Print the CRS as a PROJ.4 string.\")\[email protected]('--bounds', 'meta_member', flag_value='bounds',\n help=\"Print the boundary coordinates \"\n \"(left, bottom, right, top).\")\[email protected]('-r', '--res', 'meta_member', flag_value='res',\n help=\"Print pixel width and height.\")\[email protected]('--lnglat', 'meta_member', flag_value='lnglat',\n help=\"Print longitude and latitude at center.\")\[email protected]('--stats', 'meta_member', flag_value='stats',\n help=\"Print statistics (min, max, mean) of a single band \"\n \"(use --bidx).\")\[email protected]('--checksum', 'meta_member', flag_value='checksum',\n help=\"Print integer checksum of a single band \"\n \"(use --bidx).\")\[email protected]('-v', '--tell-me-more', '--verbose', is_flag=True,\n help=\"Output extra information.\")\[email protected]_opt\[email protected]_opt\[email protected]_context\ndef info(ctx, input, aspect, indent, namespace, meta_member, verbose, bidx,\n masked):\n \"\"\"Print metadata about the dataset as JSON.\n\n Optionally print a single metadata item as a string.\n \"\"\"\n verbosity = ctx.obj.get('verbosity')\n mode = 'r' if (verbose or meta_member == 'stats') else 'r-'\n try:\n with 
rasterio.Env(CPL_DEBUG=(verbosity > 2)):\n with rasterio.open(input, mode) as src:\n info = src.profile\n info['shape'] = info['height'], info['width']\n info['bounds'] = src.bounds\n proj4 = src.crs.to_string()\n if proj4.startswith('+init=epsg'):\n proj4 = proj4.split('=')[1].upper()\n info['crs'] = proj4\n info['res'] = src.res\n info['colorinterp'] = [src.colorinterp(i).name\n for i in src.indexes]\n if proj4 != '':\n info['lnglat'] = src.lnglat()\n if verbose:\n stats = [{'min': float(b.min()),\n 'max': float(b.max()),\n 'mean': float(b.mean())\n } for b in src.read(masked=masked)]\n info['stats'] = stats\n info['checksum'] = [src.checksum(i) for i in src.indexes]\n if aspect == 'meta':\n if meta_member == 'stats':\n band = src.read(bidx, masked=masked)\n click.echo('%f %f %f' % (\n float(band.min()),\n float(band.max()),\n float(band.mean())))\n elif meta_member == 'checksum':\n click.echo(str(src.checksum(bidx)))\n elif meta_member:\n if isinstance(info[meta_member], (list, tuple)):\n click.echo(\" \".join(map(str, info[meta_member])))\n else:\n click.echo(info[meta_member])\n else:\n click.echo(json.dumps(info, indent=indent))\n elif aspect == 'tags':\n click.echo(\n json.dumps(src.tags(ns=namespace), indent=indent))\n except Exception:\n raise click.Abort()\n", "path": "rasterio/rio/info.py"}, {"content": "\"\"\"Raster dataset profiles.\"\"\"\n\nfrom rasterio.dtypes import uint8\n\n\nclass Profile:\n \"\"\"Base class for Rasterio dataset profiles.\n\n Subclasses will declare a format driver and driver-specific\n creation options.\n \"\"\"\n\n driver = None\n defaults = {}\n\n def __call__(self, **kwargs):\n \"\"\"Return a mapping of keyword args for writing a new datasets.\n\n Example:\n\n profile = SomeProfile()\n with rasterio.open('foo.tif', 'w', **profile()) as dst:\n # Write data ...\n \"\"\"\n if kwargs.get('driver', self.driver) != self.driver:\n raise ValueError(\n \"Overriding this profile's driver is not allowed.\")\n profile = self.defaults.copy()\n profile.update(**kwargs)\n profile['driver'] = self.driver\n return profile\n\n\nclass DefaultGTiffProfile(Profile):\n \"\"\"A tiled, band-interleaved, LZW-compressed, 8-bit GTiff profile.\"\"\"\n\n driver = 'GTiff'\n defaults = {\n 'interleave': 'band',\n 'tiled': True,\n 'blockxsize': 256,\n 'blockysize': 256,\n 'compress': 'lzw',\n 'nodata': 0,\n 'dtype': uint8\n }\n\n\ndefault_gtiff_profile = DefaultGTiffProfile()\n", "path": "rasterio/profiles.py"}], "after_files": [{"content": "\"\"\"$ rio info\"\"\"\n\n\nimport json\n\nimport click\n\nimport rasterio\nimport rasterio.crs\nfrom rasterio.rio import options\n\n\[email protected](short_help=\"Print information about a data file.\")\[email protected]_in_arg\[email protected]('--meta', 'aspect', flag_value='meta', default=True,\n help=\"Show data file structure (default).\")\[email protected]('--tags', 'aspect', flag_value='tags',\n help=\"Show data file tags.\")\[email protected]('--namespace', help=\"Select a tag namespace.\")\[email protected]('--indent', default=None, type=int,\n help=\"Indentation level for pretty printed output\")\n# Options to pick out a single metadata item and print it as\n# a string.\[email protected]('--count', 'meta_member', flag_value='count',\n help=\"Print the count of bands.\")\[email protected]('-t', '--dtype', 'meta_member', flag_value='dtype',\n help=\"Print the dtype name.\")\[email protected]('--nodata', 'meta_member', flag_value='nodata',\n help=\"Print the nodata value.\")\[email protected]('-f', '--format', '--driver', 'meta_member', 
flag_value='driver',\n help=\"Print the format driver.\")\[email protected]('--shape', 'meta_member', flag_value='shape',\n help=\"Print the (height, width) shape.\")\[email protected]('--height', 'meta_member', flag_value='height',\n help=\"Print the height (number of rows).\")\[email protected]('--width', 'meta_member', flag_value='width',\n help=\"Print the width (number of columns).\")\[email protected]('--crs', 'meta_member', flag_value='crs',\n help=\"Print the CRS as a PROJ.4 string.\")\[email protected]('--bounds', 'meta_member', flag_value='bounds',\n help=\"Print the boundary coordinates \"\n \"(left, bottom, right, top).\")\[email protected]('-r', '--res', 'meta_member', flag_value='res',\n help=\"Print pixel width and height.\")\[email protected]('--lnglat', 'meta_member', flag_value='lnglat',\n help=\"Print longitude and latitude at center.\")\[email protected]('--stats', 'meta_member', flag_value='stats',\n help=\"Print statistics (min, max, mean) of a single band \"\n \"(use --bidx).\")\[email protected]('--checksum', 'meta_member', flag_value='checksum',\n help=\"Print integer checksum of a single band \"\n \"(use --bidx).\")\[email protected]('-v', '--tell-me-more', '--verbose', is_flag=True,\n help=\"Output extra information.\")\[email protected]_opt\[email protected]_opt\[email protected]_context\ndef info(ctx, input, aspect, indent, namespace, meta_member, verbose, bidx,\n masked):\n \"\"\"Print metadata about the dataset as JSON.\n\n Optionally print a single metadata item as a string.\n \"\"\"\n verbosity = ctx.obj.get('verbosity')\n mode = 'r' if (verbose or meta_member == 'stats') else 'r-'\n try:\n with rasterio.Env(CPL_DEBUG=(verbosity > 2)):\n with rasterio.open(input, mode) as src:\n info = dict(src.profile)\n info['shape'] = info['height'], info['width']\n info['bounds'] = src.bounds\n proj4 = src.crs.to_string()\n if proj4.startswith('+init=epsg'):\n proj4 = proj4.split('=')[1].upper()\n info['crs'] = proj4\n info['res'] = src.res\n info['colorinterp'] = [src.colorinterp(i).name\n for i in src.indexes]\n if proj4 != '':\n info['lnglat'] = src.lnglat()\n if verbose:\n stats = [{'min': float(b.min()),\n 'max': float(b.max()),\n 'mean': float(b.mean())\n } for b in src.read(masked=masked)]\n info['stats'] = stats\n info['checksum'] = [src.checksum(i) for i in src.indexes]\n if aspect == 'meta':\n if meta_member == 'stats':\n band = src.read(bidx, masked=masked)\n click.echo('%f %f %f' % (\n float(band.min()),\n float(band.max()),\n float(band.mean())))\n elif meta_member == 'checksum':\n click.echo(str(src.checksum(bidx)))\n elif meta_member:\n if isinstance(info[meta_member], (list, tuple)):\n click.echo(\" \".join(map(str, info[meta_member])))\n else:\n click.echo(info[meta_member])\n else:\n click.echo(json.dumps(info, indent=indent))\n elif aspect == 'tags':\n click.echo(\n json.dumps(src.tags(ns=namespace), indent=indent))\n except Exception:\n raise click.Abort()\n", "path": "rasterio/rio/info.py"}, {"content": "\"\"\"Raster dataset profiles.\"\"\"\n\nimport warnings\n\nfrom rasterio.compat import UserDict\nfrom rasterio.dtypes import uint8\n\n\nclass Profile(UserDict):\n \"\"\"Base class for Rasterio dataset profiles.\n\n Subclasses will declare driver-specific creation options.\n \"\"\"\n\n defaults = {}\n\n def __init__(self, data={}, **kwds):\n \"\"\"Create a new profile based on the class defaults, which are\n overlaid with items from the `data` dict and keyword arguments.\"\"\"\n UserDict.__init__(self)\n initdata = self.defaults.copy()\n 
initdata.update(data)\n initdata.update(**kwds)\n\n if 'affine' in initdata and 'transform' in initdata:\n warnings.warn(\"affine item is deprecated, use transform only\",\n DeprecationWarning)\n del initdata['affine']\n elif 'affine' in initdata:\n warnings.warn(\"affine item is deprecated, use transform instead\",\n DeprecationWarning)\n initdata['transform'] = initdata.pop('affine')\n\n self.data.update(initdata)\n\n def __getitem__(self, key):\n \"\"\"Like normal item access but with affine alias.\"\"\"\n if key == 'affine':\n key = 'transform'\n warnings.warn(\"affine item is deprecated, use transform instead\",\n DeprecationWarning)\n return self.data[key]\n\n def __setitem__(self, key, val):\n \"\"\"Like normal item setter but forbidding affine item.\"\"\"\n if key == 'affine':\n raise TypeError(\"affine key is prohibited\")\n self.data[key] = val\n\n def __call__(self, **kwds):\n \"\"\"Return a mapping of keyword args.\n\n DEPRECATED.\n \"\"\"\n warnings.warn(\"__call__() is deprecated, use mapping methods instead\",\n DeprecationWarning)\n profile = self.data.copy()\n profile.update(**kwds)\n return profile\n\n\nclass DefaultGTiffProfile(Profile):\n \"\"\"Tiled, band-interleaved, LZW-compressed, 8-bit GTiff.\"\"\"\n\n defaults = {\n 'driver': 'GTiff',\n 'interleave': 'band',\n 'tiled': True,\n 'blockxsize': 256,\n 'blockysize': 256,\n 'compress': 'lzw',\n 'nodata': 0,\n 'dtype': uint8\n }\n\n\ndefault_gtiff_profile = DefaultGTiffProfile()\n", "path": "rasterio/profiles.py"}]} | 2,178 | 875 |
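Editorial note on the rasterio record above (not part of the dataset row): a minimal usage sketch of how the patched `rasterio.profiles.Profile` mapping is expected to behave once the golden diff is applied. It assumes a rasterio checkout containing that patch plus the `affine` package; the identity transform is only an illustrative value.

```python
# Sketch only: expected behaviour of the patched Profile/DefaultGTiffProfile.
import warnings

from affine import Affine
from rasterio.profiles import DefaultGTiffProfile

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    # Passing the old 'affine' item still works, but it warns and the value
    # is stored under 'transform' instead.
    profile = DefaultGTiffProfile(affine=Affine.identity())
    assert profile["transform"] == Affine.identity()

    # Reading the deprecated alias warns too and simply returns the transform.
    assert profile["affine"] == profile["transform"]
    assert any(issubclass(w.category, DeprecationWarning) for w in caught)

# Writing through the deprecated key is rejected outright by the patch.
try:
    profile["affine"] = Affine.identity()
except TypeError as exc:
    print("rejected as expected:", exc)
```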
gh_patches_debug_32372 | rasdani/github-patches | git_diff | ciudadanointeligente__votainteligente-portal-electoral-482 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
It does not notify in the full wizard that an e-mail arrives!!!
<!---
@huboard:{"order":470.04699999999997,"milestone_order":481,"custom_state":""}
-->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `popular_proposal/views.py`
Content:
```
1 from django.views.generic.edit import FormView, UpdateView
2 from popular_proposal.forms import (ProposalForm,
3 SubscriptionForm,
4 get_form_list,
5 AreaForm,
6 UpdateProposalForm,
7 ProposalFilterForm,
8 )
9 from django.core.urlresolvers import reverse
10 from django.shortcuts import get_object_or_404
11 from popolo.models import Area
12 from django.utils.decorators import method_decorator
13 from django.contrib.auth.decorators import login_required
14 from django.views.generic.base import TemplateView
15 from django.views.generic.detail import DetailView
16 from popular_proposal.models import (PopularProposal,
17 ProposalTemporaryData,
18 ProposalLike)
19 from django.shortcuts import render_to_response
20 from formtools.wizard.views import SessionWizardView
21 from collections import OrderedDict
22 from django.views.generic import View
23 from django.http import JsonResponse, HttpResponseNotFound
24 from django_filters.views import FilterView
25 from django.views.generic.list import ListView
26 from popular_proposal.forms import ProposalAreaFilterForm
27 from popular_proposal.filters import ProposalAreaFilter
28
29
30 class ProposalCreationView(FormView):
31 template_name = 'popular_proposal/create.html'
32 form_class = ProposalForm
33
34 @method_decorator(login_required)
35 def dispatch(self, *args, **kwargs):
36 self.area = get_object_or_404(Area, id=self.kwargs['slug'])
37 return super(ProposalCreationView, self).dispatch(*args, **kwargs)
38
39 def get_context_data(self, **kwargs):
40 kwargs = super(ProposalCreationView, self).get_context_data(**kwargs)
41 kwargs['area'] = self.area
42 return kwargs
43
44 def get_form_kwargs(self):
45 kwargs = super(ProposalCreationView, self).get_form_kwargs()
46 kwargs['proposer'] = self.request.user
47 kwargs['area'] = self.area
48 return kwargs
49
50 def form_valid(self, form):
51 form.save()
52 return super(ProposalCreationView, self).form_valid(form)
53
54 def get_success_url(self):
55 return reverse('popular_proposals:thanks', kwargs={'pk': self.area.id})
56
57
58 class ThanksForProposingView(TemplateView):
59 template_name = 'popular_proposal/thanks.html'
60
61 def dispatch(self, *args, **kwargs):
62 self.area = get_object_or_404(Area, id=self.kwargs['pk'])
63 return super(ThanksForProposingView, self).dispatch(*args, **kwargs)
64
65 def get_context_data(self, **kwargs):
66 kwargs = super(ThanksForProposingView, self).get_context_data(**kwargs)
67 kwargs['area'] = self.area
68 return kwargs
69
70
71 class SubscriptionView(FormView):
72 template_name = 'popular_proposal/new_subscription.html'
73 form_class = SubscriptionForm
74
75 @method_decorator(login_required)
76 def dispatch(self, *args, **kwargs):
77 self.proposal = get_object_or_404(PopularProposal, id=self.kwargs['pk'])
78 return super(SubscriptionView, self).dispatch(*args, **kwargs)
79
80 def get_form_kwargs(self):
81 kwargs = super(SubscriptionView, self).get_form_kwargs()
82 kwargs['user'] = self.request.user
83 kwargs['proposal'] = self.proposal
84 return kwargs
85
86 def get_context_data(self, **kwargs):
87 kwargs = super(SubscriptionView, self).get_context_data(**kwargs)
88 kwargs['proposal'] = self.proposal
89 return kwargs
90
91 def get_success_url(self):
92 return reverse('area', kwargs={'slug': self.proposal.area.id})
93
94 def form_valid(self, form):
95 form.subscribe()
96 return super(SubscriptionView, self).form_valid(form)
97
98
99 class HomeView(FilterView):
100 model = PopularProposal
101 template_name = 'popular_proposal/home.html'
102 layout = 'base.html'
103
104 def get_queryset(self):
105 qs = super(HomeView, self).get_queryset()
106 return qs
107
108 def get_context_data(self, **kwargs):
109 context = super(HomeView, self).get_context_data(**kwargs)
110 initial = self.request.GET
111 context['form'] = ProposalFilterForm(initial=initial)
112 context['layout'] = self.layout
113 return context
114
115 def get_context_object_name(self, object_list):
116 return 'popular_proposals'
117
118
119 class PopularProposalDetailView(DetailView):
120 model = PopularProposal
121 template_name = 'popular_proposal/detail.html'
122 context_object_name = 'popular_proposal'
123 layout = 'base.html'
124
125 def get_context_data(self, *args, **kwargs):
126 context = super(PopularProposalDetailView, self).get_context_data(*args, **kwargs)
127 context['layout'] = self.layout
128 return context
129
130
131 wizard_form_list = get_form_list()
132
133
134 class ProposalWizardBase(SessionWizardView):
135 form_list = wizard_form_list
136 template_name = 'popular_proposal/wizard/form_step.html'
137
138 def get_template_names(self):
139 form = self.get_form(step=self.steps.current)
140 template_name = getattr(form, 'template', self.template_name)
141 return template_name
142
143 def get_previous_forms(self):
144 return []
145
146 def get_form_list(self):
147 form_list = OrderedDict()
148 previous_forms = self.get_previous_forms()
149 my_list = previous_forms + get_form_list(user=self.request.user)
150 counter = 0
151 for form_class in my_list:
152 form_list[str(counter)] = form_class
153 counter += 1
154 self.form_list = form_list
155 return form_list
156
157
158 class ProposalWizard(ProposalWizardBase):
159 @method_decorator(login_required)
160 def dispatch(self, request, *args, **kwargs):
161 self.area = get_object_or_404(Area, id=self.kwargs['slug'])
162
163 return super(ProposalWizard, self).dispatch(request, *args, **kwargs)
164
165 def done(self, form_list, **kwargs):
166 data = {}
167 [data.update(form.cleaned_data) for form in form_list]
168 t_data = ProposalTemporaryData.objects.create(proposer=self.request.user,
169 area=self.area,
170 data=data)
171 t_data.notify_new()
172 return render_to_response('popular_proposal/wizard/done.html', {
173 'popular_proposal': t_data,
174 'area': self.area
175 })
176
177 def get_context_data(self, form, **kwargs):
178 context = super(ProposalWizard, self).get_context_data(form, **kwargs)
179 context['area'] = self.area
180 context['preview_data'] = self.get_all_cleaned_data()
181 return context
182
183
184 full_wizard_form_list = [AreaForm, ] + wizard_form_list
185
186
187 class ProposalWizardFull(ProposalWizardBase):
188 form_list = full_wizard_form_list
189 template_name = 'popular_proposal/wizard/form_step.html'
190
191 @method_decorator(login_required)
192 def dispatch(self, request, *args, **kwargs):
193 return super(ProposalWizardFull, self).dispatch(request,
194 *args,
195 **kwargs)
196
197 def get_previous_forms(self):
198 return [AreaForm, ]
199
200 def done(self, form_list, **kwargs):
201 data = {}
202 [data.update(form.cleaned_data) for form in form_list]
203 area = data['area']
204 temporary_data = ProposalTemporaryData.objects.create(proposer=self.request.user,
205 area=area,
206 data=data)
207 context = self.get_context_data(form=None)
208 context.update({'popular_proposal': temporary_data,
209 'area': area
210 })
211 return render_to_response('popular_proposal/wizard/done.html',
212 context)
213
214 def get_context_data(self, *args, **kwargs):
215 context = super(ProposalWizardFull, self).get_context_data(*args, **kwargs)
216 data = self.get_all_cleaned_data()
217 if 'area' in data:
218 context['area'] = data['area']
219 context['preview_data'] = self.get_all_cleaned_data()
220
221 return context
222
223
224 class PopularProposalUpdateView(UpdateView):
225 form_class = UpdateProposalForm
226 template_name = 'popular_proposal/update.html'
227 model = PopularProposal
228 context_object_name = 'popular_proposal'
229
230 @method_decorator(login_required)
231 def dispatch(self, request, *args, **kwargs):
232 return super(PopularProposalUpdateView, self).dispatch(request,
233 *args,
234 **kwargs)
235
236 def get_queryset(self):
237 qs = super(PopularProposalUpdateView, self).get_queryset()
238 qs = qs.filter(proposer=self.request.user)
239 return qs
240
241
242 class UnlikeProposalView(View):
243 def dispatch(self, request, *args, **kwargs):
244 if not self.request.user.is_authenticated():
245 return HttpResponseNotFound()
246 self.pk = self.kwargs.pop('pk')
247 self.like = get_object_or_404(ProposalLike,
248 pk=self.pk,
249 user=self.request.user)
250 return super(UnlikeProposalView, self).dispatch(request,
251 *args,
252 **kwargs)
253
254 def post(self, request, **kwargs):
255 self.like.delete()
256 return JsonResponse({'deleted_item': self.pk})
257
258
259 class ProposalsPerArea(ListView):
260 model = PopularProposal
261 layout = 'base.html'
262 template_name = 'popular_proposal/area.html'
263 context_object_name = 'popular_proposals'
264
265 def dispatch(self, request, *args, **kwargs):
266 self.area = get_object_or_404(Area, id=self.kwargs['slug'])
267 return super(ProposalsPerArea, self).dispatch(request, *args, **kwargs)
268
269 def get_context_data(self):
270 context = super(ProposalsPerArea, self).get_context_data()
271 context['layout'] = self.layout
272 initial = self.request.GET or None
273 context['form'] = ProposalAreaFilterForm(area=self.area,
274 initial=initial)
275 return context
276
277 def get_queryset(self):
278 kwargs = {'data': self.request.GET or None,
279 'area': self.area
280 }
281 filterset = ProposalAreaFilter(**kwargs)
282 return filterset
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/popular_proposal/views.py b/popular_proposal/views.py
--- a/popular_proposal/views.py
+++ b/popular_proposal/views.py
@@ -75,6 +75,10 @@
@method_decorator(login_required)
def dispatch(self, *args, **kwargs):
self.proposal = get_object_or_404(PopularProposal, id=self.kwargs['pk'])
+ if self.request.method == 'GET':
+ self.next_url = self.request.GET.get('next', None)
+ elif self.request.method == 'POST':
+ self.next_url = self.request.POST.get('next', None)
return super(SubscriptionView, self).dispatch(*args, **kwargs)
def get_form_kwargs(self):
@@ -86,9 +90,13 @@
def get_context_data(self, **kwargs):
kwargs = super(SubscriptionView, self).get_context_data(**kwargs)
kwargs['proposal'] = self.proposal
+ if self.next_url:
+ kwargs['next'] = self.next_url
return kwargs
def get_success_url(self):
+ if self.next_url:
+ return self.next_url
return reverse('area', kwargs={'slug': self.proposal.area.id})
def form_valid(self, form):
@@ -204,6 +212,7 @@
temporary_data = ProposalTemporaryData.objects.create(proposer=self.request.user,
area=area,
data=data)
+ temporary_data.notify_new()
context = self.get_context_data(form=None)
context.update({'popular_proposal': temporary_data,
'area': area
| {"golden_diff": "diff --git a/popular_proposal/views.py b/popular_proposal/views.py\n--- a/popular_proposal/views.py\n+++ b/popular_proposal/views.py\n@@ -75,6 +75,10 @@\n @method_decorator(login_required)\n def dispatch(self, *args, **kwargs):\n self.proposal = get_object_or_404(PopularProposal, id=self.kwargs['pk'])\n+ if self.request.method == 'GET':\n+ self.next_url = self.request.GET.get('next', None)\n+ elif self.request.method == 'POST':\n+ self.next_url = self.request.POST.get('next', None)\n return super(SubscriptionView, self).dispatch(*args, **kwargs)\n \n def get_form_kwargs(self):\n@@ -86,9 +90,13 @@\n def get_context_data(self, **kwargs):\n kwargs = super(SubscriptionView, self).get_context_data(**kwargs)\n kwargs['proposal'] = self.proposal\n+ if self.next_url:\n+ kwargs['next'] = self.next_url\n return kwargs\n \n def get_success_url(self):\n+ if self.next_url:\n+ return self.next_url\n return reverse('area', kwargs={'slug': self.proposal.area.id})\n \n def form_valid(self, form):\n@@ -204,6 +212,7 @@\n temporary_data = ProposalTemporaryData.objects.create(proposer=self.request.user,\n area=area,\n data=data)\n+ temporary_data.notify_new()\n context = self.get_context_data(form=None)\n context.update({'popular_proposal': temporary_data,\n 'area': area\n", "issue": "No notifica en wizard full que llega un mail!!!\n<!---\n@huboard:{\"order\":470.04699999999997,\"milestone_order\":481,\"custom_state\":\"\"}\n-->\n\n", "before_files": [{"content": "from django.views.generic.edit import FormView, UpdateView\nfrom popular_proposal.forms import (ProposalForm,\n SubscriptionForm,\n get_form_list,\n AreaForm,\n UpdateProposalForm,\n ProposalFilterForm,\n )\nfrom django.core.urlresolvers import reverse\nfrom django.shortcuts import get_object_or_404\nfrom popolo.models import Area\nfrom django.utils.decorators import method_decorator\nfrom django.contrib.auth.decorators import login_required\nfrom django.views.generic.base import TemplateView\nfrom django.views.generic.detail import DetailView\nfrom popular_proposal.models import (PopularProposal,\n ProposalTemporaryData,\n ProposalLike)\nfrom django.shortcuts import render_to_response\nfrom formtools.wizard.views import SessionWizardView\nfrom collections import OrderedDict\nfrom django.views.generic import View\nfrom django.http import JsonResponse, HttpResponseNotFound\nfrom django_filters.views import FilterView\nfrom django.views.generic.list import ListView\nfrom popular_proposal.forms import ProposalAreaFilterForm\nfrom popular_proposal.filters import ProposalAreaFilter\n\n\nclass ProposalCreationView(FormView):\n template_name = 'popular_proposal/create.html'\n form_class = ProposalForm\n\n @method_decorator(login_required)\n def dispatch(self, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n return super(ProposalCreationView, self).dispatch(*args, **kwargs)\n\n def get_context_data(self, **kwargs):\n kwargs = super(ProposalCreationView, self).get_context_data(**kwargs)\n kwargs['area'] = self.area\n return kwargs\n\n def get_form_kwargs(self):\n kwargs = super(ProposalCreationView, self).get_form_kwargs()\n kwargs['proposer'] = self.request.user\n kwargs['area'] = self.area\n return kwargs\n\n def form_valid(self, form):\n form.save()\n return super(ProposalCreationView, self).form_valid(form)\n\n def get_success_url(self):\n return reverse('popular_proposals:thanks', kwargs={'pk': self.area.id})\n\n\nclass ThanksForProposingView(TemplateView):\n template_name = 
'popular_proposal/thanks.html'\n\n def dispatch(self, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['pk'])\n return super(ThanksForProposingView, self).dispatch(*args, **kwargs)\n\n def get_context_data(self, **kwargs):\n kwargs = super(ThanksForProposingView, self).get_context_data(**kwargs)\n kwargs['area'] = self.area\n return kwargs\n\n\nclass SubscriptionView(FormView):\n template_name = 'popular_proposal/new_subscription.html'\n form_class = SubscriptionForm\n\n @method_decorator(login_required)\n def dispatch(self, *args, **kwargs):\n self.proposal = get_object_or_404(PopularProposal, id=self.kwargs['pk'])\n return super(SubscriptionView, self).dispatch(*args, **kwargs)\n\n def get_form_kwargs(self):\n kwargs = super(SubscriptionView, self).get_form_kwargs()\n kwargs['user'] = self.request.user\n kwargs['proposal'] = self.proposal\n return kwargs\n\n def get_context_data(self, **kwargs):\n kwargs = super(SubscriptionView, self).get_context_data(**kwargs)\n kwargs['proposal'] = self.proposal\n return kwargs\n\n def get_success_url(self):\n return reverse('area', kwargs={'slug': self.proposal.area.id})\n\n def form_valid(self, form):\n form.subscribe()\n return super(SubscriptionView, self).form_valid(form)\n\n\nclass HomeView(FilterView):\n model = PopularProposal\n template_name = 'popular_proposal/home.html'\n layout = 'base.html'\n\n def get_queryset(self):\n qs = super(HomeView, self).get_queryset()\n return qs\n\n def get_context_data(self, **kwargs):\n context = super(HomeView, self).get_context_data(**kwargs)\n initial = self.request.GET\n context['form'] = ProposalFilterForm(initial=initial)\n context['layout'] = self.layout\n return context\n\n def get_context_object_name(self, object_list):\n return 'popular_proposals'\n\n\nclass PopularProposalDetailView(DetailView):\n model = PopularProposal\n template_name = 'popular_proposal/detail.html'\n context_object_name = 'popular_proposal'\n layout = 'base.html'\n\n def get_context_data(self, *args, **kwargs):\n context = super(PopularProposalDetailView, self).get_context_data(*args, **kwargs)\n context['layout'] = self.layout\n return context\n\n\nwizard_form_list = get_form_list()\n\n\nclass ProposalWizardBase(SessionWizardView):\n form_list = wizard_form_list\n template_name = 'popular_proposal/wizard/form_step.html'\n\n def get_template_names(self):\n form = self.get_form(step=self.steps.current)\n template_name = getattr(form, 'template', self.template_name)\n return template_name\n\n def get_previous_forms(self):\n return []\n\n def get_form_list(self):\n form_list = OrderedDict()\n previous_forms = self.get_previous_forms()\n my_list = previous_forms + get_form_list(user=self.request.user)\n counter = 0\n for form_class in my_list:\n form_list[str(counter)] = form_class\n counter += 1\n self.form_list = form_list\n return form_list\n\n\nclass ProposalWizard(ProposalWizardBase):\n @method_decorator(login_required)\n def dispatch(self, request, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n\n return super(ProposalWizard, self).dispatch(request, *args, **kwargs)\n\n def done(self, form_list, **kwargs):\n data = {}\n [data.update(form.cleaned_data) for form in form_list]\n t_data = ProposalTemporaryData.objects.create(proposer=self.request.user,\n area=self.area,\n data=data)\n t_data.notify_new()\n return render_to_response('popular_proposal/wizard/done.html', {\n 'popular_proposal': t_data,\n 'area': self.area\n })\n\n def get_context_data(self, form, **kwargs):\n 
context = super(ProposalWizard, self).get_context_data(form, **kwargs)\n context['area'] = self.area\n context['preview_data'] = self.get_all_cleaned_data()\n return context\n\n\nfull_wizard_form_list = [AreaForm, ] + wizard_form_list\n\n\nclass ProposalWizardFull(ProposalWizardBase):\n form_list = full_wizard_form_list\n template_name = 'popular_proposal/wizard/form_step.html'\n\n @method_decorator(login_required)\n def dispatch(self, request, *args, **kwargs):\n return super(ProposalWizardFull, self).dispatch(request,\n *args,\n **kwargs)\n\n def get_previous_forms(self):\n return [AreaForm, ]\n\n def done(self, form_list, **kwargs):\n data = {}\n [data.update(form.cleaned_data) for form in form_list]\n area = data['area']\n temporary_data = ProposalTemporaryData.objects.create(proposer=self.request.user,\n area=area,\n data=data)\n context = self.get_context_data(form=None)\n context.update({'popular_proposal': temporary_data,\n 'area': area\n })\n return render_to_response('popular_proposal/wizard/done.html',\n context)\n\n def get_context_data(self, *args, **kwargs):\n context = super(ProposalWizardFull, self).get_context_data(*args, **kwargs)\n data = self.get_all_cleaned_data()\n if 'area' in data:\n context['area'] = data['area']\n context['preview_data'] = self.get_all_cleaned_data()\n \n return context\n\n\nclass PopularProposalUpdateView(UpdateView):\n form_class = UpdateProposalForm\n template_name = 'popular_proposal/update.html'\n model = PopularProposal\n context_object_name = 'popular_proposal'\n\n @method_decorator(login_required)\n def dispatch(self, request, *args, **kwargs):\n return super(PopularProposalUpdateView, self).dispatch(request,\n *args,\n **kwargs)\n\n def get_queryset(self):\n qs = super(PopularProposalUpdateView, self).get_queryset()\n qs = qs.filter(proposer=self.request.user)\n return qs\n\n\nclass UnlikeProposalView(View):\n def dispatch(self, request, *args, **kwargs):\n if not self.request.user.is_authenticated():\n return HttpResponseNotFound()\n self.pk = self.kwargs.pop('pk')\n self.like = get_object_or_404(ProposalLike,\n pk=self.pk,\n user=self.request.user)\n return super(UnlikeProposalView, self).dispatch(request,\n *args,\n **kwargs)\n\n def post(self, request, **kwargs):\n self.like.delete()\n return JsonResponse({'deleted_item': self.pk})\n\n\nclass ProposalsPerArea(ListView):\n model = PopularProposal\n layout = 'base.html'\n template_name = 'popular_proposal/area.html'\n context_object_name = 'popular_proposals'\n\n def dispatch(self, request, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n return super(ProposalsPerArea, self).dispatch(request, *args, **kwargs)\n\n def get_context_data(self):\n context = super(ProposalsPerArea, self).get_context_data()\n context['layout'] = self.layout\n initial = self.request.GET or None\n context['form'] = ProposalAreaFilterForm(area=self.area,\n initial=initial)\n return context\n\n def get_queryset(self):\n kwargs = {'data': self.request.GET or None,\n 'area': self.area\n }\n filterset = ProposalAreaFilter(**kwargs)\n return filterset", "path": "popular_proposal/views.py"}], "after_files": [{"content": "from django.views.generic.edit import FormView, UpdateView\nfrom popular_proposal.forms import (ProposalForm,\n SubscriptionForm,\n get_form_list,\n AreaForm,\n UpdateProposalForm,\n ProposalFilterForm,\n )\nfrom django.core.urlresolvers import reverse\nfrom django.shortcuts import get_object_or_404\nfrom popolo.models import Area\nfrom django.utils.decorators import 
method_decorator\nfrom django.contrib.auth.decorators import login_required\nfrom django.views.generic.base import TemplateView\nfrom django.views.generic.detail import DetailView\nfrom popular_proposal.models import (PopularProposal,\n ProposalTemporaryData,\n ProposalLike)\nfrom django.shortcuts import render_to_response\nfrom formtools.wizard.views import SessionWizardView\nfrom collections import OrderedDict\nfrom django.views.generic import View\nfrom django.http import JsonResponse, HttpResponseNotFound\nfrom django_filters.views import FilterView\nfrom django.views.generic.list import ListView\nfrom popular_proposal.forms import ProposalAreaFilterForm\nfrom popular_proposal.filters import ProposalAreaFilter\n\n\nclass ProposalCreationView(FormView):\n template_name = 'popular_proposal/create.html'\n form_class = ProposalForm\n\n @method_decorator(login_required)\n def dispatch(self, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n return super(ProposalCreationView, self).dispatch(*args, **kwargs)\n\n def get_context_data(self, **kwargs):\n kwargs = super(ProposalCreationView, self).get_context_data(**kwargs)\n kwargs['area'] = self.area\n return kwargs\n\n def get_form_kwargs(self):\n kwargs = super(ProposalCreationView, self).get_form_kwargs()\n kwargs['proposer'] = self.request.user\n kwargs['area'] = self.area\n return kwargs\n\n def form_valid(self, form):\n form.save()\n return super(ProposalCreationView, self).form_valid(form)\n\n def get_success_url(self):\n return reverse('popular_proposals:thanks', kwargs={'pk': self.area.id})\n\n\nclass ThanksForProposingView(TemplateView):\n template_name = 'popular_proposal/thanks.html'\n\n def dispatch(self, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['pk'])\n return super(ThanksForProposingView, self).dispatch(*args, **kwargs)\n\n def get_context_data(self, **kwargs):\n kwargs = super(ThanksForProposingView, self).get_context_data(**kwargs)\n kwargs['area'] = self.area\n return kwargs\n\n\nclass SubscriptionView(FormView):\n template_name = 'popular_proposal/new_subscription.html'\n form_class = SubscriptionForm\n\n @method_decorator(login_required)\n def dispatch(self, *args, **kwargs):\n self.proposal = get_object_or_404(PopularProposal, id=self.kwargs['pk'])\n if self.request.method == 'GET':\n self.next_url = self.request.GET.get('next', None)\n elif self.request.method == 'POST':\n self.next_url = self.request.POST.get('next', None)\n return super(SubscriptionView, self).dispatch(*args, **kwargs)\n\n def get_form_kwargs(self):\n kwargs = super(SubscriptionView, self).get_form_kwargs()\n kwargs['user'] = self.request.user\n kwargs['proposal'] = self.proposal\n return kwargs\n\n def get_context_data(self, **kwargs):\n kwargs = super(SubscriptionView, self).get_context_data(**kwargs)\n kwargs['proposal'] = self.proposal\n if self.next_url:\n kwargs['next'] = self.next_url\n return kwargs\n\n def get_success_url(self):\n if self.next_url:\n return self.next_url\n return reverse('area', kwargs={'slug': self.proposal.area.id})\n\n def form_valid(self, form):\n form.subscribe()\n return super(SubscriptionView, self).form_valid(form)\n\n\nclass HomeView(FilterView):\n model = PopularProposal\n template_name = 'popular_proposal/home.html'\n layout = 'base.html'\n\n def get_queryset(self):\n qs = super(HomeView, self).get_queryset()\n return qs\n\n def get_context_data(self, **kwargs):\n context = super(HomeView, self).get_context_data(**kwargs)\n initial = self.request.GET\n 
context['form'] = ProposalFilterForm(initial=initial)\n context['layout'] = self.layout\n return context\n\n def get_context_object_name(self, object_list):\n return 'popular_proposals'\n\n\nclass PopularProposalDetailView(DetailView):\n model = PopularProposal\n template_name = 'popular_proposal/detail.html'\n context_object_name = 'popular_proposal'\n layout = 'base.html'\n\n def get_context_data(self, *args, **kwargs):\n context = super(PopularProposalDetailView, self).get_context_data(*args, **kwargs)\n context['layout'] = self.layout\n return context\n\n\nwizard_form_list = get_form_list()\n\n\nclass ProposalWizardBase(SessionWizardView):\n form_list = wizard_form_list\n template_name = 'popular_proposal/wizard/form_step.html'\n\n def get_template_names(self):\n form = self.get_form(step=self.steps.current)\n template_name = getattr(form, 'template', self.template_name)\n return template_name\n\n def get_previous_forms(self):\n return []\n\n def get_form_list(self):\n form_list = OrderedDict()\n previous_forms = self.get_previous_forms()\n my_list = previous_forms + get_form_list(user=self.request.user)\n counter = 0\n for form_class in my_list:\n form_list[str(counter)] = form_class\n counter += 1\n self.form_list = form_list\n return form_list\n\n\nclass ProposalWizard(ProposalWizardBase):\n @method_decorator(login_required)\n def dispatch(self, request, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n\n return super(ProposalWizard, self).dispatch(request, *args, **kwargs)\n\n def done(self, form_list, **kwargs):\n data = {}\n [data.update(form.cleaned_data) for form in form_list]\n t_data = ProposalTemporaryData.objects.create(proposer=self.request.user,\n area=self.area,\n data=data)\n t_data.notify_new()\n return render_to_response('popular_proposal/wizard/done.html', {\n 'popular_proposal': t_data,\n 'area': self.area\n })\n\n def get_context_data(self, form, **kwargs):\n context = super(ProposalWizard, self).get_context_data(form, **kwargs)\n context['area'] = self.area\n context['preview_data'] = self.get_all_cleaned_data()\n return context\n\n\nfull_wizard_form_list = [AreaForm, ] + wizard_form_list\n\n\nclass ProposalWizardFull(ProposalWizardBase):\n form_list = full_wizard_form_list\n template_name = 'popular_proposal/wizard/form_step.html'\n\n @method_decorator(login_required)\n def dispatch(self, request, *args, **kwargs):\n return super(ProposalWizardFull, self).dispatch(request,\n *args,\n **kwargs)\n\n def get_previous_forms(self):\n return [AreaForm, ]\n\n def done(self, form_list, **kwargs):\n data = {}\n [data.update(form.cleaned_data) for form in form_list]\n area = data['area']\n temporary_data = ProposalTemporaryData.objects.create(proposer=self.request.user,\n area=area,\n data=data)\n temporary_data.notify_new()\n context = self.get_context_data(form=None)\n context.update({'popular_proposal': temporary_data,\n 'area': area\n })\n return render_to_response('popular_proposal/wizard/done.html',\n context)\n\n def get_context_data(self, *args, **kwargs):\n context = super(ProposalWizardFull, self).get_context_data(*args, **kwargs)\n data = self.get_all_cleaned_data()\n if 'area' in data:\n context['area'] = data['area']\n context['preview_data'] = self.get_all_cleaned_data()\n \n return context\n\n\nclass PopularProposalUpdateView(UpdateView):\n form_class = UpdateProposalForm\n template_name = 'popular_proposal/update.html'\n model = PopularProposal\n context_object_name = 'popular_proposal'\n\n @method_decorator(login_required)\n 
def dispatch(self, request, *args, **kwargs):\n return super(PopularProposalUpdateView, self).dispatch(request,\n *args,\n **kwargs)\n\n def get_queryset(self):\n qs = super(PopularProposalUpdateView, self).get_queryset()\n qs = qs.filter(proposer=self.request.user)\n return qs\n\n\nclass UnlikeProposalView(View):\n def dispatch(self, request, *args, **kwargs):\n if not self.request.user.is_authenticated():\n return HttpResponseNotFound()\n self.pk = self.kwargs.pop('pk')\n self.like = get_object_or_404(ProposalLike,\n pk=self.pk,\n user=self.request.user)\n return super(UnlikeProposalView, self).dispatch(request,\n *args,\n **kwargs)\n\n def post(self, request, **kwargs):\n self.like.delete()\n return JsonResponse({'deleted_item': self.pk})\n\n\nclass ProposalsPerArea(ListView):\n model = PopularProposal\n layout = 'base.html'\n template_name = 'popular_proposal/area.html'\n context_object_name = 'popular_proposals'\n\n def dispatch(self, request, *args, **kwargs):\n self.area = get_object_or_404(Area, id=self.kwargs['slug'])\n return super(ProposalsPerArea, self).dispatch(request, *args, **kwargs)\n\n def get_context_data(self):\n context = super(ProposalsPerArea, self).get_context_data()\n context['layout'] = self.layout\n initial = self.request.GET or None\n context['form'] = ProposalAreaFilterForm(area=self.area,\n initial=initial)\n return context\n\n def get_queryset(self):\n kwargs = {'data': self.request.GET or None,\n 'area': self.area\n }\n filterset = ProposalAreaFilter(**kwargs)\n return filterset", "path": "popular_proposal/views.py"}]} | 3,192 | 353 |
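Editorial note on the votainteligente record above (not part of the dataset row): a hedged sketch of the two behaviours the golden diff introduces, the `?next=` redirect in `SubscriptionView` and the missing notification call in the full wizard. The `proposal` argument is an assumed `PopularProposal` test fixture, and the snippet presumes the project's Django settings are already configured.

```python
# Sketch only; `proposal` is an assumed PopularProposal fixture, not real data.
from popular_proposal.views import SubscriptionView


def sketch_success_url(proposal):
    view = SubscriptionView()
    view.proposal = proposal            # normally set in dispatch()
    view.next_url = "/anywhere/else/"   # normally read from ?next= in dispatch()
    assert view.get_success_url() == "/anywhere/else/"

    view.next_url = None                # no ?next= parameter supplied
    # Falls back to reverse('area', kwargs={'slug': proposal.area.id}),
    # the same behaviour as before the patch.
    return view.get_success_url()


# The other half of the fix is a single line in ProposalWizardFull.done():
# temporary_data.notify_new() is now called, mirroring ProposalWizard.done(),
# which is what sends the notification e-mail the issue reports as missing.
```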
gh_patches_debug_406 | rasdani/github-patches | git_diff | nltk__nltk-3156 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Class 'CharTokenizer' is missing attribute '_string'
I think the class `CharTokenizer` is missing the attribute `_string=""`
https://github.com/nltk/nltk/blob/fc53edbf6f0763971afca5855386a2a382da37ac/nltk/tokenize/simple.py#L68-L77
Without this attribute, when trying to use the class, I get the following error:
`TypeError: Can't instantiate abstract class CharTokenizer with abstract method _string`
Example code:
```python
from nltk.tokenize.simple import CharTokenizer
tokenizer = CharTokenizer()
```
Error:
```bash
Traceback (most recent call last):
File "/home/francis/.local/share/virtualenvs/cafa-challenge-bUqSu2Tm/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3508, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-7-803c2e672729>", line 1, in <module>
tokenizer = CharTokenizer()
TypeError: Can't instantiate abstract class CharTokenizer with abstract method _string
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nltk/tokenize/simple.py`
Content:
```
1 # Natural Language Toolkit: Simple Tokenizers
2 #
3 # Copyright (C) 2001-2023 NLTK Project
4 # Author: Edward Loper <[email protected]>
5 # Steven Bird <[email protected]>
6 # URL: <https://www.nltk.org>
7 # For license information, see LICENSE.TXT
8
9 r"""
10 Simple Tokenizers
11
12 These tokenizers divide strings into substrings using the string
13 ``split()`` method.
14 When tokenizing using a particular delimiter string, use
15 the string ``split()`` method directly, as this is more efficient.
16
17 The simple tokenizers are *not* available as separate functions;
18 instead, you should just use the string ``split()`` method directly:
19
20 >>> s = "Good muffins cost $3.88\nin New York. Please buy me\ntwo of them.\n\nThanks."
21 >>> s.split() # doctest: +NORMALIZE_WHITESPACE
22 ['Good', 'muffins', 'cost', '$3.88', 'in', 'New', 'York.',
23 'Please', 'buy', 'me', 'two', 'of', 'them.', 'Thanks.']
24 >>> s.split(' ') # doctest: +NORMALIZE_WHITESPACE
25 ['Good', 'muffins', 'cost', '$3.88\nin', 'New', 'York.', '',
26 'Please', 'buy', 'me\ntwo', 'of', 'them.\n\nThanks.']
27 >>> s.split('\n') # doctest: +NORMALIZE_WHITESPACE
28 ['Good muffins cost $3.88', 'in New York. Please buy me',
29 'two of them.', '', 'Thanks.']
30
31 The simple tokenizers are mainly useful because they follow the
32 standard ``TokenizerI`` interface, and so can be used with any code
33 that expects a tokenizer. For example, these tokenizers can be used
34 to specify the tokenization conventions when building a `CorpusReader`.
35
36 """
37
38 from nltk.tokenize.api import StringTokenizer, TokenizerI
39 from nltk.tokenize.util import regexp_span_tokenize, string_span_tokenize
40
41
42 class SpaceTokenizer(StringTokenizer):
43 r"""Tokenize a string using the space character as a delimiter,
44 which is the same as ``s.split(' ')``.
45
46 >>> from nltk.tokenize import SpaceTokenizer
47 >>> s = "Good muffins cost $3.88\nin New York. Please buy me\ntwo of them.\n\nThanks."
48 >>> SpaceTokenizer().tokenize(s) # doctest: +NORMALIZE_WHITESPACE
49 ['Good', 'muffins', 'cost', '$3.88\nin', 'New', 'York.', '',
50 'Please', 'buy', 'me\ntwo', 'of', 'them.\n\nThanks.']
51 """
52
53 _string = " "
54
55
56 class TabTokenizer(StringTokenizer):
57 r"""Tokenize a string use the tab character as a delimiter,
58 the same as ``s.split('\t')``.
59
60 >>> from nltk.tokenize import TabTokenizer
61 >>> TabTokenizer().tokenize('a\tb c\n\t d')
62 ['a', 'b c\n', ' d']
63 """
64
65 _string = "\t"
66
67
68 class CharTokenizer(StringTokenizer):
69 """Tokenize a string into individual characters. If this functionality
70 is ever required directly, use ``for char in string``.
71 """
72
73 def tokenize(self, s):
74 return list(s)
75
76 def span_tokenize(self, s):
77 yield from enumerate(range(1, len(s) + 1))
78
79
80 class LineTokenizer(TokenizerI):
81 r"""Tokenize a string into its lines, optionally discarding blank lines.
82 This is similar to ``s.split('\n')``.
83
84 >>> from nltk.tokenize import LineTokenizer
85 >>> s = "Good muffins cost $3.88\nin New York. Please buy me\ntwo of them.\n\nThanks."
86 >>> LineTokenizer(blanklines='keep').tokenize(s) # doctest: +NORMALIZE_WHITESPACE
87 ['Good muffins cost $3.88', 'in New York. Please buy me',
88 'two of them.', '', 'Thanks.']
89 >>> # same as [l for l in s.split('\n') if l.strip()]:
90 >>> LineTokenizer(blanklines='discard').tokenize(s) # doctest: +NORMALIZE_WHITESPACE
91 ['Good muffins cost $3.88', 'in New York. Please buy me',
92 'two of them.', 'Thanks.']
93
94 :param blanklines: Indicates how blank lines should be handled. Valid values are:
95
96 - ``discard``: strip blank lines out of the token list before returning it.
97 A line is considered blank if it contains only whitespace characters.
98 - ``keep``: leave all blank lines in the token list.
99 - ``discard-eof``: if the string ends with a newline, then do not generate
100 a corresponding token ``''`` after that newline.
101 """
102
103 def __init__(self, blanklines="discard"):
104 valid_blanklines = ("discard", "keep", "discard-eof")
105 if blanklines not in valid_blanklines:
106 raise ValueError(
107 "Blank lines must be one of: %s" % " ".join(valid_blanklines)
108 )
109
110 self._blanklines = blanklines
111
112 def tokenize(self, s):
113 lines = s.splitlines()
114 # If requested, strip off blank lines.
115 if self._blanklines == "discard":
116 lines = [l for l in lines if l.rstrip()]
117 elif self._blanklines == "discard-eof":
118 if lines and not lines[-1].strip():
119 lines.pop()
120 return lines
121
122 # discard-eof not implemented
123 def span_tokenize(self, s):
124 if self._blanklines == "keep":
125 yield from string_span_tokenize(s, r"\n")
126 else:
127 yield from regexp_span_tokenize(s, r"\n(\s+\n)*")
128
129
130 ######################################################################
131 # { Tokenization Functions
132 ######################################################################
133 # XXX: it is stated in module docs that there is no function versions
134
135
136 def line_tokenize(text, blanklines="discard"):
137 return LineTokenizer(blanklines).tokenize(text)
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nltk/tokenize/simple.py b/nltk/tokenize/simple.py
--- a/nltk/tokenize/simple.py
+++ b/nltk/tokenize/simple.py
@@ -70,6 +70,8 @@
is ever required directly, use ``for char in string``.
"""
+ _string = None
+
def tokenize(self, s):
return list(s)
| {"golden_diff": "diff --git a/nltk/tokenize/simple.py b/nltk/tokenize/simple.py\n--- a/nltk/tokenize/simple.py\n+++ b/nltk/tokenize/simple.py\n@@ -70,6 +70,8 @@\n is ever required directly, use ``for char in string``.\n \"\"\"\n \n+ _string = None\n+\n def tokenize(self, s):\n return list(s)\n", "issue": "Class 'CharTokenizer' is missing attribute '_string'\nI think the class `CharTokenizer` is missing the attribute `_string=\"\"`\r\n\r\nhttps://github.com/nltk/nltk/blob/fc53edbf6f0763971afca5855386a2a382da37ac/nltk/tokenize/simple.py#L68-L77\r\n\r\nWithout this attribute, when trying to use the class, I get the following error:\r\n\r\n`TypeError: Can't instantiate abstract class CharTokenizer with abstract method _string`\r\n\r\nExample code:\r\n\r\n```python\r\nfrom nltk.tokenize.simple import CharTokenizer\r\n\r\n\r\ntokenizer = CharTokenizer()\r\n```\r\n\r\nError:\r\n\r\n```bash\r\nTraceback (most recent call last):\r\n File \"/home/francis/.local/share/virtualenvs/cafa-challenge-bUqSu2Tm/lib/python3.10/site-packages/IPython/core/interactiveshell.py\", line 3508, in run_code\r\n exec(code_obj, self.user_global_ns, self.user_ns)\r\n File \"<ipython-input-7-803c2e672729>\", line 1, in <module>\r\n tokenizer = CharTokenizer()\r\nTypeError: Can't instantiate abstract class CharTokenizer with abstract method _string\r\n```\n", "before_files": [{"content": "# Natural Language Toolkit: Simple Tokenizers\n#\n# Copyright (C) 2001-2023 NLTK Project\n# Author: Edward Loper <[email protected]>\n# Steven Bird <[email protected]>\n# URL: <https://www.nltk.org>\n# For license information, see LICENSE.TXT\n\nr\"\"\"\nSimple Tokenizers\n\nThese tokenizers divide strings into substrings using the string\n``split()`` method.\nWhen tokenizing using a particular delimiter string, use\nthe string ``split()`` method directly, as this is more efficient.\n\nThe simple tokenizers are *not* available as separate functions;\ninstead, you should just use the string ``split()`` method directly:\n\n >>> s = \"Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> s.split() # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88', 'in', 'New', 'York.',\n 'Please', 'buy', 'me', 'two', 'of', 'them.', 'Thanks.']\n >>> s.split(' ') # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88\\nin', 'New', 'York.', '',\n 'Please', 'buy', 'me\\ntwo', 'of', 'them.\\n\\nThanks.']\n >>> s.split('\\n') # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', '', 'Thanks.']\n\nThe simple tokenizers are mainly useful because they follow the\nstandard ``TokenizerI`` interface, and so can be used with any code\nthat expects a tokenizer. For example, these tokenizers can be used\nto specify the tokenization conventions when building a `CorpusReader`.\n\n\"\"\"\n\nfrom nltk.tokenize.api import StringTokenizer, TokenizerI\nfrom nltk.tokenize.util import regexp_span_tokenize, string_span_tokenize\n\n\nclass SpaceTokenizer(StringTokenizer):\n r\"\"\"Tokenize a string using the space character as a delimiter,\n which is the same as ``s.split(' ')``.\n\n >>> from nltk.tokenize import SpaceTokenizer\n >>> s = \"Good muffins cost $3.88\\nin New York. 
Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> SpaceTokenizer().tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88\\nin', 'New', 'York.', '',\n 'Please', 'buy', 'me\\ntwo', 'of', 'them.\\n\\nThanks.']\n \"\"\"\n\n _string = \" \"\n\n\nclass TabTokenizer(StringTokenizer):\n r\"\"\"Tokenize a string use the tab character as a delimiter,\n the same as ``s.split('\\t')``.\n\n >>> from nltk.tokenize import TabTokenizer\n >>> TabTokenizer().tokenize('a\\tb c\\n\\t d')\n ['a', 'b c\\n', ' d']\n \"\"\"\n\n _string = \"\\t\"\n\n\nclass CharTokenizer(StringTokenizer):\n \"\"\"Tokenize a string into individual characters. If this functionality\n is ever required directly, use ``for char in string``.\n \"\"\"\n\n def tokenize(self, s):\n return list(s)\n\n def span_tokenize(self, s):\n yield from enumerate(range(1, len(s) + 1))\n\n\nclass LineTokenizer(TokenizerI):\n r\"\"\"Tokenize a string into its lines, optionally discarding blank lines.\n This is similar to ``s.split('\\n')``.\n\n >>> from nltk.tokenize import LineTokenizer\n >>> s = \"Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> LineTokenizer(blanklines='keep').tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', '', 'Thanks.']\n >>> # same as [l for l in s.split('\\n') if l.strip()]:\n >>> LineTokenizer(blanklines='discard').tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', 'Thanks.']\n\n :param blanklines: Indicates how blank lines should be handled. Valid values are:\n\n - ``discard``: strip blank lines out of the token list before returning it.\n A line is considered blank if it contains only whitespace characters.\n - ``keep``: leave all blank lines in the token list.\n - ``discard-eof``: if the string ends with a newline, then do not generate\n a corresponding token ``''`` after that newline.\n \"\"\"\n\n def __init__(self, blanklines=\"discard\"):\n valid_blanklines = (\"discard\", \"keep\", \"discard-eof\")\n if blanklines not in valid_blanklines:\n raise ValueError(\n \"Blank lines must be one of: %s\" % \" \".join(valid_blanklines)\n )\n\n self._blanklines = blanklines\n\n def tokenize(self, s):\n lines = s.splitlines()\n # If requested, strip off blank lines.\n if self._blanklines == \"discard\":\n lines = [l for l in lines if l.rstrip()]\n elif self._blanklines == \"discard-eof\":\n if lines and not lines[-1].strip():\n lines.pop()\n return lines\n\n # discard-eof not implemented\n def span_tokenize(self, s):\n if self._blanklines == \"keep\":\n yield from string_span_tokenize(s, r\"\\n\")\n else:\n yield from regexp_span_tokenize(s, r\"\\n(\\s+\\n)*\")\n\n\n######################################################################\n# { Tokenization Functions\n######################################################################\n# XXX: it is stated in module docs that there is no function versions\n\n\ndef line_tokenize(text, blanklines=\"discard\"):\n return LineTokenizer(blanklines).tokenize(text)\n", "path": "nltk/tokenize/simple.py"}], "after_files": [{"content": "# Natural Language Toolkit: Simple Tokenizers\n#\n# Copyright (C) 2001-2023 NLTK Project\n# Author: Edward Loper <[email protected]>\n# Steven Bird <[email protected]>\n# URL: <https://www.nltk.org>\n# For license information, see LICENSE.TXT\n\nr\"\"\"\nSimple Tokenizers\n\nThese tokenizers divide strings into substrings using the string\n``split()`` 
method.\nWhen tokenizing using a particular delimiter string, use\nthe string ``split()`` method directly, as this is more efficient.\n\nThe simple tokenizers are *not* available as separate functions;\ninstead, you should just use the string ``split()`` method directly:\n\n >>> s = \"Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> s.split() # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88', 'in', 'New', 'York.',\n 'Please', 'buy', 'me', 'two', 'of', 'them.', 'Thanks.']\n >>> s.split(' ') # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88\\nin', 'New', 'York.', '',\n 'Please', 'buy', 'me\\ntwo', 'of', 'them.\\n\\nThanks.']\n >>> s.split('\\n') # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', '', 'Thanks.']\n\nThe simple tokenizers are mainly useful because they follow the\nstandard ``TokenizerI`` interface, and so can be used with any code\nthat expects a tokenizer. For example, these tokenizers can be used\nto specify the tokenization conventions when building a `CorpusReader`.\n\n\"\"\"\n\nfrom nltk.tokenize.api import StringTokenizer, TokenizerI\nfrom nltk.tokenize.util import regexp_span_tokenize, string_span_tokenize\n\n\nclass SpaceTokenizer(StringTokenizer):\n r\"\"\"Tokenize a string using the space character as a delimiter,\n which is the same as ``s.split(' ')``.\n\n >>> from nltk.tokenize import SpaceTokenizer\n >>> s = \"Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> SpaceTokenizer().tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good', 'muffins', 'cost', '$3.88\\nin', 'New', 'York.', '',\n 'Please', 'buy', 'me\\ntwo', 'of', 'them.\\n\\nThanks.']\n \"\"\"\n\n _string = \" \"\n\n\nclass TabTokenizer(StringTokenizer):\n r\"\"\"Tokenize a string use the tab character as a delimiter,\n the same as ``s.split('\\t')``.\n\n >>> from nltk.tokenize import TabTokenizer\n >>> TabTokenizer().tokenize('a\\tb c\\n\\t d')\n ['a', 'b c\\n', ' d']\n \"\"\"\n\n _string = \"\\t\"\n\n\nclass CharTokenizer(StringTokenizer):\n \"\"\"Tokenize a string into individual characters. If this functionality\n is ever required directly, use ``for char in string``.\n \"\"\"\n\n _string = None\n\n def tokenize(self, s):\n return list(s)\n\n def span_tokenize(self, s):\n yield from enumerate(range(1, len(s) + 1))\n\n\nclass LineTokenizer(TokenizerI):\n r\"\"\"Tokenize a string into its lines, optionally discarding blank lines.\n This is similar to ``s.split('\\n')``.\n\n >>> from nltk.tokenize import LineTokenizer\n >>> s = \"Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.\\n\\nThanks.\"\n >>> LineTokenizer(blanklines='keep').tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', '', 'Thanks.']\n >>> # same as [l for l in s.split('\\n') if l.strip()]:\n >>> LineTokenizer(blanklines='discard').tokenize(s) # doctest: +NORMALIZE_WHITESPACE\n ['Good muffins cost $3.88', 'in New York. Please buy me',\n 'two of them.', 'Thanks.']\n\n :param blanklines: Indicates how blank lines should be handled. 
Valid values are:\n\n - ``discard``: strip blank lines out of the token list before returning it.\n A line is considered blank if it contains only whitespace characters.\n - ``keep``: leave all blank lines in the token list.\n - ``discard-eof``: if the string ends with a newline, then do not generate\n a corresponding token ``''`` after that newline.\n \"\"\"\n\n def __init__(self, blanklines=\"discard\"):\n valid_blanklines = (\"discard\", \"keep\", \"discard-eof\")\n if blanklines not in valid_blanklines:\n raise ValueError(\n \"Blank lines must be one of: %s\" % \" \".join(valid_blanklines)\n )\n\n self._blanklines = blanklines\n\n def tokenize(self, s):\n lines = s.splitlines()\n # If requested, strip off blank lines.\n if self._blanklines == \"discard\":\n lines = [l for l in lines if l.rstrip()]\n elif self._blanklines == \"discard-eof\":\n if lines and not lines[-1].strip():\n lines.pop()\n return lines\n\n # discard-eof not implemented\n def span_tokenize(self, s):\n if self._blanklines == \"keep\":\n yield from string_span_tokenize(s, r\"\\n\")\n else:\n yield from regexp_span_tokenize(s, r\"\\n(\\s+\\n)*\")\n\n\n######################################################################\n# { Tokenization Functions\n######################################################################\n# XXX: it is stated in module docs that there is no function versions\n\n\ndef line_tokenize(text, blanklines=\"discard\"):\n return LineTokenizer(blanklines).tokenize(text)\n", "path": "nltk/tokenize/simple.py"}]} | 2,198 | 85 |
gh_patches_debug_5846 | rasdani/github-patches | git_diff | pwr-Solaar__Solaar-1447 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Too high PyYAML and psutil version requirements for RHEL/CentOS 8 base OS
**Information**
<!-- Please update to Solaar from this repository before asking for a new feature. -->
- Solaar version (`solaar --version` and `git describe --tags`): **1.1.1**
- Distribution: **RHEL 8.5**
- Kernel version (ex. `uname -srmo`): `Linux 4.18.0-348.7.1.el8_5.x86_64 x86_64 GNU/Linux`
- Output of `solaar show` for the target device (if applicable): **N/A**
**Is your feature request related to a problem? Please describe.**
[setup.py](https://github.com/pwr-Solaar/Solaar/blob/master/setup.py#L63) lists requirements for PyYAML >= 5.1 and psutil >= 5.6.0. Unfortunately, RHEL8 and derivatives carry PyYAML 3.12 and psutil 5.4.3, built for the default Python 3.6, in the base OS repository. Consequently, solaar is not installable without installing the `python38` or `python39` stack (called a module in RHEL/Fedora), which carries newer versions of these two. I've had a request from a RHEL8 user to provide a package for RHEL8 in Fedora EPEL, hence this issue.
**Describe the solution you'd like**
If there's no specific reason for the higher versions, I'd like the requirements to be lowered to the versions listed above. I will provide a patch if agreed. I made a build for RHEL8 with these dependencies patched to lower versions and a user tested it and confirmed it seems to be working fine for him.
**Describe alternatives you've considered**
Regular packages in Fedora/EPEL may not depend on modular packages, only on the base OS. I considered maintaining solaar as a modular package, but this seems very complicated and I don't really have time for it. On the other hand, building it as a regular package adds minimal overhead to my Fedora package maintenance.
--- END ISSUE ---
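As a side note for packagers reading this entry: the mismatch the reporter describes can be reproduced without touching Solaar at all, by comparing the base-OS versions against the declared floors. A quick, hypothetical check (the versions are the ones quoted in the issue; `pkg_resources` ships with setuptools):

```python
# Hypothetical check: do the RHEL 8 base-OS versions satisfy the current floors?
from pkg_resources import parse_version

base_os_versions = {"PyYAML": "3.12", "psutil": "5.4.3"}   # from the issue report
declared_floors = {"PyYAML": "5.1", "psutil": "5.6.0"}     # from setup.py below

for name, available in base_os_versions.items():
    satisfied = parse_version(available) >= parse_version(declared_floors[name])
    print(f"{name}: have {available}, need >= {declared_floors[name]} -> {satisfied}")
# Both lines print False, which is the conflict the reporter describes.
```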
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python3
2
3 from glob import glob as _glob
4
5 try:
6 from setuptools import setup
7 except ImportError:
8 from distutils.core import setup
9
10 # from solaar import NAME, __version__
11 __version__ = '1.1.1'
12 NAME = 'Solaar'
13
14
15 def _data_files():
16 from os.path import dirname as _dirname
17
18 yield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')
19 yield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')
20 yield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']
21
22 for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):
23 yield _dirname(mo), [mo]
24
25 yield 'share/applications', ['share/applications/solaar.desktop']
26 yield 'share/solaar/udev-rules.d', ['rules.d/42-logitech-unify-permissions.rules']
27 yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']
28
29 del _dirname
30
31
32 setup(
33 name=NAME.lower(),
34 version=__version__,
35 description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',
36 long_description='''
37 Solaar is a Linux device manager for many Logitech peripherals that connect through
38 Unifying and other receivers or via USB or Bluetooth.
39 Solaar is able to pair/unpair devices with receivers and show and modify some of the
40 modifiable features of devices.
41 For instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),
42 author='Daniel Pavel',
43 license='GPLv2',
44 url='http://pwr-solaar.github.io/Solaar/',
45 classifiers=[
46 'Development Status :: 4 - Beta',
47 'Environment :: X11 Applications :: GTK',
48 'Environment :: Console',
49 'Intended Audience :: End Users/Desktop',
50 'License :: DFSG approved',
51 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',
52 'Natural Language :: English',
53 'Programming Language :: Python :: 3 :: Only',
54 'Operating System :: POSIX :: Linux',
55 'Topic :: Utilities',
56 ],
57 platforms=['linux'],
58
59 # sudo apt install python-gi python3-gi \
60 # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1
61 # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],
62 python_requires='>=3.6',
63 install_requires=[
64 'pyudev (>= 0.13)',
65 'PyYAML (>= 5.1)',
66 'python-xlib (>= 0.27)',
67 'psutil (>= 5.6.0)',
68 ],
69 package_dir={'': 'lib'},
70 packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],
71 data_files=list(_data_files()),
72 scripts=_glob('bin/*'),
73 )
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -62,9 +62,9 @@
python_requires='>=3.6',
install_requires=[
'pyudev (>= 0.13)',
- 'PyYAML (>= 5.1)',
+ 'PyYAML (>= 3.12)',
'python-xlib (>= 0.27)',
- 'psutil (>= 5.6.0)',
+ 'psutil (>= 5.4.3)',
],
package_dir={'': 'lib'},
packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -62,9 +62,9 @@\n python_requires='>=3.6',\n install_requires=[\n 'pyudev (>= 0.13)',\n- 'PyYAML (>= 5.1)',\n+ 'PyYAML (>= 3.12)',\n 'python-xlib (>= 0.27)',\n- 'psutil (>= 5.6.0)',\n+ 'psutil (>= 5.4.3)',\n ],\n package_dir={'': 'lib'},\n packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n", "issue": "Too high PyYAML and psutil version requirements for RHEL/CentOS 8 base OS\n**Information**\r\n<!-- Please update to Solaar from this repository before asking for a new feature. -->\r\n- Solaar version (`solaar --version` and `git describe --tags`): **1.1.1**\r\n- Distribution: **RHEL 8.5**\r\n- Kernel version (ex. `uname -srmo`): `Linux 4.18.0-348.7.1.el8_5.x86_64 x86_64 GNU/Linux`\r\n- Output of `solaar show` for the target device (if applicable): **N/A**\r\n\r\n**Is your feature request related to a problem? Please describe.**\r\n[setup.py](https://github.com/pwr-Solaar/Solaar/blob/master/setup.py#L63) lists requirements for PyYAML >= 5.1 and psutil >= 5.6.0. Unfortunately, RHEL8 and derivatives carry PyYAML 3.12 and psutil 5.4.3 built for the default python 3.6 in the base OS repository. Consequently, solaar is not installable without installing `python38` or `python39` stack (called module in RHEL/Fedora) that carry newer versions of these two. I've had a request from RHEL8 user to provide a package for RHEL8 in Fedora EPEL, hence this issue.\r\n\r\n**Describe the solution you'd like**\r\nIf there's no specific reason for the higher versions, I'd like the requirements to be lowered to the versions listed above. I will provide a patch if agreed. I made a build for RHEL8 with these dependencies patched to lower versions and a user tested it and confirmed it seems to be working fine for him.\r\n\r\n**Describe alternatives you've considered**\r\nRegular packages in Fedora/EPEL may not depend on modular packages, only base OS. I considered maintaining solaar as a modular package, but this seems very complicated and I don't really have time for this. 
On the other hand, building it as a regular package adds minimal overhead to my Fedora package maintenance.\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n# from solaar import NAME, __version__\n__version__ = '1.1.1'\nNAME = 'Solaar'\n\n\ndef _data_files():\n from os.path import dirname as _dirname\n\n yield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')\n yield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')\n yield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']\n\n for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n yield _dirname(mo), [mo]\n\n yield 'share/applications', ['share/applications/solaar.desktop']\n yield 'share/solaar/udev-rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']\n\n del _dirname\n\n\nsetup(\n name=NAME.lower(),\n version=__version__,\n description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',\n long_description='''\nSolaar is a Linux device manager for many Logitech peripherals that connect through\nUnifying and other receivers or via USB or Bluetooth.\nSolaar is able to pair/unpair devices with receivers and show and modify some of the\nmodifiable features of devices.\nFor instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),\n author='Daniel Pavel',\n license='GPLv2',\n url='http://pwr-solaar.github.io/Solaar/',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Environment :: X11 Applications :: GTK',\n 'Environment :: Console',\n 'Intended Audience :: End Users/Desktop',\n 'License :: DFSG approved',\n 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 3 :: Only',\n 'Operating System :: POSIX :: Linux',\n 'Topic :: Utilities',\n ],\n platforms=['linux'],\n\n # sudo apt install python-gi python3-gi \\\n # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1\n # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n python_requires='>=3.6',\n install_requires=[\n 'pyudev (>= 0.13)',\n 'PyYAML (>= 5.1)',\n 'python-xlib (>= 0.27)',\n 'psutil (>= 5.6.0)',\n ],\n package_dir={'': 'lib'},\n packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n data_files=list(_data_files()),\n scripts=_glob('bin/*'),\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n# from solaar import NAME, __version__\n__version__ = '1.1.1'\nNAME = 'Solaar'\n\n\ndef _data_files():\n from os.path import dirname as _dirname\n\n yield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')\n yield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')\n yield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']\n\n for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n yield _dirname(mo), [mo]\n\n yield 'share/applications', ['share/applications/solaar.desktop']\n yield 'share/solaar/udev-rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']\n\n del _dirname\n\n\nsetup(\n name=NAME.lower(),\n 
version=__version__,\n description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',\n long_description='''\nSolaar is a Linux device manager for many Logitech peripherals that connect through\nUnifying and other receivers or via USB or Bluetooth.\nSolaar is able to pair/unpair devices with receivers and show and modify some of the\nmodifiable features of devices.\nFor instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),\n author='Daniel Pavel',\n license='GPLv2',\n url='http://pwr-solaar.github.io/Solaar/',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Environment :: X11 Applications :: GTK',\n 'Environment :: Console',\n 'Intended Audience :: End Users/Desktop',\n 'License :: DFSG approved',\n 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 3 :: Only',\n 'Operating System :: POSIX :: Linux',\n 'Topic :: Utilities',\n ],\n platforms=['linux'],\n\n # sudo apt install python-gi python3-gi \\\n # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1\n # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n python_requires='>=3.6',\n install_requires=[\n 'pyudev (>= 0.13)',\n 'PyYAML (>= 3.12)',\n 'python-xlib (>= 0.27)',\n 'psutil (>= 5.4.3)',\n ],\n package_dir={'': 'lib'},\n packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n data_files=list(_data_files()),\n scripts=_glob('bin/*'),\n)\n", "path": "setup.py"}]} | 1,568 | 163 |
gh_patches_debug_17312 | rasdani/github-patches | git_diff | googleapis__google-auth-library-python-619 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use extra for asyncio dependencies
Hello! The latest release of this library pulls in aiohttp and its dependencies unconditionally, which adds a non-trivial burden to projects that don't need it. Would you consider using a packaging extra so that people can opt in?
--- END ISSUE ---
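For readers unfamiliar with the mechanism being requested: setuptools supports optional dependency groups through `extras_require`, which users opt into explicitly at install time. A minimal, generic sketch (the project and extra names here are illustrative, not this library's actual configuration):

```python
from setuptools import setup

setup(
    name="example-lib",                      # illustrative name only
    install_requires=["six>=1.9.0"],         # always installed
    extras_require={
        # pulled in only via: pip install "example-lib[async]"
        "async": ["aiohttp >= 3.6.2, < 4.0.0dev; python_version >= '3.6'"],
    },
)
```

The trade-off is that async users must remember to request the extra, while everyone else stops paying for aiohttp's dependency tree.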
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright 2014 Google Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import io
16
17 from setuptools import find_packages
18 from setuptools import setup
19
20
21 DEPENDENCIES = (
22 "cachetools>=2.0.0,<5.0",
23 "pyasn1-modules>=0.2.1",
24 # rsa==4.5 is the last version to support 2.7
25 # https://github.com/sybrenstuvel/python-rsa/issues/152#issuecomment-643470233
26 'rsa<4.6; python_version < "3.5"',
27 'rsa>=3.1.4,<5; python_version >= "3.5"',
28 "setuptools>=40.3.0",
29 "six>=1.9.0",
30 'aiohttp >= 3.6.2, < 4.0.0dev; python_version>="3.6"',
31 )
32
33
34 with io.open("README.rst", "r") as fh:
35 long_description = fh.read()
36
37 version = "1.22.0"
38
39 setup(
40 name="google-auth",
41 version=version,
42 author="Google Cloud Platform",
43 author_email="[email protected]",
44 description="Google Authentication Library",
45 long_description=long_description,
46 url="https://github.com/googleapis/google-auth-library-python",
47 packages=find_packages(exclude=("tests*", "system_tests*")),
48 namespace_packages=("google",),
49 install_requires=DEPENDENCIES,
50 python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*",
51 license="Apache 2.0",
52 keywords="google auth oauth client",
53 classifiers=[
54 "Programming Language :: Python :: 2",
55 "Programming Language :: Python :: 2.7",
56 "Programming Language :: Python :: 3",
57 "Programming Language :: Python :: 3.5",
58 "Programming Language :: Python :: 3.6",
59 "Programming Language :: Python :: 3.7",
60 "Programming Language :: Python :: 3.8",
61 "Development Status :: 5 - Production/Stable",
62 "Intended Audience :: Developers",
63 "License :: OSI Approved :: Apache Software License",
64 "Operating System :: POSIX",
65 "Operating System :: Microsoft :: Windows",
66 "Operating System :: MacOS :: MacOS X",
67 "Operating System :: OS Independent",
68 "Topic :: Internet :: WWW/HTTP",
69 ],
70 )
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -27,9 +27,9 @@
'rsa>=3.1.4,<5; python_version >= "3.5"',
"setuptools>=40.3.0",
"six>=1.9.0",
- 'aiohttp >= 3.6.2, < 4.0.0dev; python_version>="3.6"',
)
+extras = {"aiohttp": "aiohttp >= 3.6.2, < 4.0.0dev; python_version>='3.6'"}
with io.open("README.rst", "r") as fh:
long_description = fh.read()
@@ -47,6 +47,7 @@
packages=find_packages(exclude=("tests*", "system_tests*")),
namespace_packages=("google",),
install_requires=DEPENDENCIES,
+ extras_require=extras,
python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*",
license="Apache 2.0",
keywords="google auth oauth client",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -27,9 +27,9 @@\n 'rsa>=3.1.4,<5; python_version >= \"3.5\"',\n \"setuptools>=40.3.0\",\n \"six>=1.9.0\",\n- 'aiohttp >= 3.6.2, < 4.0.0dev; python_version>=\"3.6\"',\n )\n \n+extras = {\"aiohttp\": \"aiohttp >= 3.6.2, < 4.0.0dev; python_version>='3.6'\"}\n \n with io.open(\"README.rst\", \"r\") as fh:\n long_description = fh.read()\n@@ -47,6 +47,7 @@\n packages=find_packages(exclude=(\"tests*\", \"system_tests*\")),\n namespace_packages=(\"google\",),\n install_requires=DEPENDENCIES,\n+ extras_require=extras,\n python_requires=\">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*\",\n license=\"Apache 2.0\",\n keywords=\"google auth oauth client\",\n", "issue": "Use extra for asyncio dependencies\nHello! The latest release for this library pulls in aiohttp and its dependencies unconditionally, which adds non-trivial burden to projects that don\u2019t need it. Would you consider using a packaging extra so that people can opt-in?\n", "before_files": [{"content": "# Copyright 2014 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\n\nfrom setuptools import find_packages\nfrom setuptools import setup\n\n\nDEPENDENCIES = (\n \"cachetools>=2.0.0,<5.0\",\n \"pyasn1-modules>=0.2.1\",\n # rsa==4.5 is the last version to support 2.7\n # https://github.com/sybrenstuvel/python-rsa/issues/152#issuecomment-643470233\n 'rsa<4.6; python_version < \"3.5\"',\n 'rsa>=3.1.4,<5; python_version >= \"3.5\"',\n \"setuptools>=40.3.0\",\n \"six>=1.9.0\",\n 'aiohttp >= 3.6.2, < 4.0.0dev; python_version>=\"3.6\"',\n)\n\n\nwith io.open(\"README.rst\", \"r\") as fh:\n long_description = fh.read()\n\nversion = \"1.22.0\"\n\nsetup(\n name=\"google-auth\",\n version=version,\n author=\"Google Cloud Platform\",\n author_email=\"[email protected]\",\n description=\"Google Authentication Library\",\n long_description=long_description,\n url=\"https://github.com/googleapis/google-auth-library-python\",\n packages=find_packages(exclude=(\"tests*\", \"system_tests*\")),\n namespace_packages=(\"google\",),\n install_requires=DEPENDENCIES,\n python_requires=\">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*\",\n license=\"Apache 2.0\",\n keywords=\"google auth oauth client\",\n classifiers=[\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: POSIX\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright 2014 Google Inc.\n#\n# Licensed 
under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\n\nfrom setuptools import find_packages\nfrom setuptools import setup\n\n\nDEPENDENCIES = (\n \"cachetools>=2.0.0,<5.0\",\n \"pyasn1-modules>=0.2.1\",\n # rsa==4.5 is the last version to support 2.7\n # https://github.com/sybrenstuvel/python-rsa/issues/152#issuecomment-643470233\n 'rsa<4.6; python_version < \"3.5\"',\n 'rsa>=3.1.4,<5; python_version >= \"3.5\"',\n \"setuptools>=40.3.0\",\n \"six>=1.9.0\",\n)\n\nextras = {\"aiohttp\": \"aiohttp >= 3.6.2, < 4.0.0dev; python_version>='3.6'\"}\n\nwith io.open(\"README.rst\", \"r\") as fh:\n long_description = fh.read()\n\nversion = \"1.22.0\"\n\nsetup(\n name=\"google-auth\",\n version=version,\n author=\"Google Cloud Platform\",\n author_email=\"[email protected]\",\n description=\"Google Authentication Library\",\n long_description=long_description,\n url=\"https://github.com/googleapis/google-auth-library-python\",\n packages=find_packages(exclude=(\"tests*\", \"system_tests*\")),\n namespace_packages=(\"google\",),\n install_requires=DEPENDENCIES,\n extras_require=extras,\n python_requires=\">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*\",\n license=\"Apache 2.0\",\n keywords=\"google auth oauth client\",\n classifiers=[\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: POSIX\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n ],\n)\n", "path": "setup.py"}]} | 1,096 | 261 |
gh_patches_debug_4554 | rasdani/github-patches | git_diff | pytorch__pytorch-131 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Port LBFGS from Lua optim
fixes #483
--- END ISSUE ---
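Some context on the optimizer being ported: unlike SGD-style optimizers, L-BFGS performs several function and gradient evaluations inside a single step, so its PyTorch interface is driven by a closure. The sketch below uses today's `torch.optim.LBFGS` API purely as an illustration; argument names and defaults follow current documentation rather than the code under review here.

```python
import torch

# Illustration of closure-driven L-BFGS (current torch.optim API, shown for context).
model = torch.nn.Linear(3, 1)
inputs = torch.randn(16, 3)
targets = torch.randn(16, 1)
optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1)

def closure():
    # Called multiple times per step() for the internal line search.
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    return loss

for _ in range(5):
    optimizer.step(closure)
```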
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torch/nn/functions/thnn/auto.py`
Content:
```
1 from itertools import repeat
2 from collections import defaultdict
3
4 import torch
5 from torch._thnn.utils import parse_header, THNN_H_PATH
6 from torch.autograd.function import Function, InplaceFunction
7 from torch._thnn import type2backend
8
9 from . import _all_functions
10
11
12 def _make_function_class_criterion(class_name, update_output, update_grad_input, acc_grad_parameters):
13 weight_arg_idx = -1
14 for i, arg in enumerate(update_output.arguments):
15 if arg.name.startswith('weight'):
16 weight_arg_idx = i
17 break
18
19 buffers_idx = []
20 additional_arg_idx = 0
21 for arg in update_output.arguments[4:]:
22 if not arg.name.startswith('weight') and arg.type == 'THTensor*':
23 buffers_idx.append(additional_arg_idx)
24 additional_arg_idx += 1
25
26 def __init__(self, *args, **kwargs):
27 Function.__init__(self)
28 self.weight = kwargs.get('weight')
29 self.additional_args = list(args)
30
31 def forward(self, input, target):
32 self._backend = type2backend[type(input)]
33 self.save_for_backward(input, target)
34 if weight_arg_idx >= 0:
35 insert_idx = weight_arg_idx - 4 # state, input, target, output
36 self.additional_args.insert(insert_idx, self.weight)
37 for idx in buffers_idx:
38 self.additional_args.insert(idx, input.new(1))
39 output = input.new(1)
40 getattr(self._backend, update_output.name)(self._backend.library_state, input, target,
41 output, *self.additional_args)
42 return output
43
44 def backward(self, grad_output):
45 input, target = self.saved_tensors
46 grad_input = grad_output.new().resize_as_(input).zero_()
47 getattr(self._backend, update_grad_input.name)(self._backend.library_state, input, target,
48 grad_input, *self.additional_args)
49 grad_output_expanded = grad_output.resize_(*repeat(1, grad_input.dim()))
50 grad_input.mul_(grad_output.expand_as(grad_input))
51 return grad_input, None
52
53 return type(class_name, (Function,), dict(__init__=__init__, forward=forward, backward=backward))
54
55
56 def _find_buffers(args, ignored_args):
57 additional_arg_idx = 0
58 buffers = []
59 for arg in args:
60 if arg.name in ignored_args:
61 continue
62 if arg.type == 'THTensor*':
63 buffers.append((additional_arg_idx, arg.name))
64 additional_arg_idx += 1
65 return buffers
66
67
68 def _make_function_class(class_name, update_output, update_grad_input, acc_grad_parameters):
69 def has_argument(fn, name):
70 for arg in fn.arguments:
71 if arg.name == name:
72 return True
73 return False
74 save_output = has_argument(update_grad_input, 'output')
75
76 param_args = {'weight', 'bias'}
77 ignored_args = {'weight', 'bias', 'gradWeight', 'gradBias', 'output'}
78 expected_params = [arg for arg in update_output.arguments[3:]
79 if arg.name in param_args]
80 buffers = {}
81 buffers['update_output'] = _find_buffers(update_output.arguments[3:],
82 ignored_args)
83 buffers['update_grad_input'] = _find_buffers(
84 update_grad_input.arguments[4:], ignored_args)
85 if acc_grad_parameters is not None:
86 buffers['acc_grad_parameters'] = _find_buffers(
87 acc_grad_parameters.arguments[3:], ignored_args)
88
89 # This and __init__ assume that only the last argument can be
90 # an inplace flag
91 is_inplace = update_output.arguments[-1].name == 'inplace'
92
93 def __init__(self, *args):
94 if is_inplace:
95 InplaceFunction.__init__(self, args[-1])
96 else:
97 Function.__init__(self)
98 self.additional_args = list(args)
99
100 def _initialize_buffers(self, fn_name):
101 additional_args = self.additional_args
102 for idx, name in buffers[fn_name]:
103 # TODO: some buffers are necessary only for update output and can be
104 # freed right afterwards
105 buffer = self.buffers[name]
106 additional_args = additional_args[:idx] + [buffer] + additional_args[idx:]
107 return tuple(additional_args)
108
109 def forward(self, input, *params):
110 self._backend = type2backend[type(input)]
111
112 for param in params:
113 if type(param) != type(input):
114 raise RuntimeError("input type ({}) doesn't match the type of "
115 "a parameter tensor ({})".format(torch.typename(input),
116 torch.typename(param)))
117
118 # Allocate temporary buffers and insert them into additional_args
119 self.buffers = defaultdict(type(input))
120 additional_args = self._initialize_buffers('update_output')
121
122 # Fill in optional params with None
123 args = params
124 for i in range(len(params), len(expected_params)):
125 param = expected_params[i]
126 if param.is_optional:
127 args += (None,)
128 else:
129 raise ValueError("missing required argument '%s'" % param.name)
130
131 args += tuple(additional_args)
132
133 # If the module is working in-place it's output will be set to the
134 # same storage as input, but it's variable won't be dirty.
135 if is_inplace and self.inplace:
136 self.mark_dirty(input)
137 output = input
138 self.save_for_backward(input, *params)
139 else:
140 output = input.new()
141 if save_output:
142 self.save_for_backward(input, output, *params)
143 else:
144 self.save_for_backward(input, *params)
145
146 getattr(self._backend, update_output.name)(self._backend.library_state, input, output, *args)
147 return output
148
149 def backward(self, grad_output):
150 t = self.saved_tensors
151 if save_output:
152 input, output, params = t[0], t[1], t[2:]
153 else:
154 input, params = t[0], t[1:]
155 grad_params = tuple(None for p in params)
156 grad_input_tuple = (None,)
157
158 if self.needs_input_grad[0]:
159 additional_args = self._initialize_buffers('update_grad_input')
160 if save_output:
161 additional_args = (output,) + additional_args
162
163 grad_input = input.new().resize_as_(input).zero_()
164 params_without_bias = params if len(params) < 2 else params[:1]
165 update_grad_input_fn = getattr(self._backend, update_grad_input.name)
166 gi_args = params_without_bias + additional_args
167 update_grad_input_fn(self._backend.library_state, input, grad_output, grad_input, *gi_args)
168 grad_input_tuple = (grad_input,)
169
170 if acc_grad_parameters and any(self.needs_input_grad[1:]):
171 additional_args = self._initialize_buffers('acc_grad_parameters')
172 grad_params = tuple(p.new().resize_as_(p).zero_() for p in params)
173 acc_grad_parameters_fn = getattr(self._backend, acc_grad_parameters.name)
174 param_args = grad_params + additional_args + (1,)
175 acc_grad_parameters_fn(self._backend.library_state, input, grad_output, *param_args)
176
177 return grad_input_tuple + grad_params
178
179 base_class = Function if not is_inplace else InplaceFunction
180 return type(class_name, (base_class,), dict(__init__=__init__, forward=forward, backward=backward, _initialize_buffers=_initialize_buffers))
181
182
183 def _generate_function_classes(scope_dict):
184 function_list = parse_header(THNN_H_PATH)
185 function_by_name = {fn.name: fn for fn in function_list}
186 classes_to_generate = {fn.name.partition('_')[0] for fn in function_list}
187 exceptions = {
188 'SpatialConvolutionMM',
189 'SparseLinear',
190 'TemporalConvolution',
191 'SpatialMaxPooling',
192 'SpatialDilatedMaxPooling',
193 'SpatialMaxUnpooling',
194 'VolumetricMaxPooling',
195 'VolumetricMaxUnpooling',
196 'VolumetricConvolution',
197 'VolumetricConvolutionMM',
198 'TemporalMaxPooling',
199 'BatchNormalization',
200 'LookupTable',
201 'PReLU',
202 'RReLU',
203 'unfolded',
204 }
205 name_remap = {
206 'TemporalConvolution': 'Conv1d',
207 'SpatialDilatedConvolution': 'DilatedConv2d',
208 'SpatialAveragePooling': 'AvgPool2d',
209 'SpatialMaxUnpooling': 'MaxUnpool2d',
210 'SpatialReflectionPadding': 'ReflectionPad2d',
211 'SpatialReplicationPadding': 'ReplicationPad2d',
212 'SpatialFullConvolution': 'ConvTranspose2d',
213 'VolumetricFullConvolution': 'ConvTranspose3d',
214 'VolumetricReplicationPadding': 'ReplicationPad3d',
215 'VolumetricMaxUnpooling': 'MaxUnpool3d',
216 'VolumetricAveragePooling': 'AvgPool3d',
217 'SoftMax': 'Softmax',
218 'LogSoftMax': 'LogSoftmax',
219 'HardTanh': 'Hardtanh',
220 'HardShrink': 'Hardshrink',
221 'SoftPlus': 'Softplus',
222 'SoftShrink': 'Softshrink',
223 'MSECriterion': 'MSELoss',
224 'AbsCriterion': 'L1Loss',
225 'BCECriterion': '_BCELoss', # TODO: move the glue code into THNN
226 'ClassNLLCriterion': 'NLLLoss',
227 'DistKLDivCriterion': 'KLDivLoss',
228 'SpatialClassNLLCriterion': 'NLLLoss2d',
229 'MultiLabelMarginCriterion': 'MultiLabelMarginLoss',
230 'MultiMarginCriterion': 'MultiMarginLoss',
231 'SmoothL1Criterion': 'SmoothL1Loss',
232 'SoftMarginCriterion': 'SoftMarginLoss',
233 }
234 classes_to_generate -= exceptions
235 for fn in classes_to_generate:
236 update_output = function_by_name[fn + '_updateOutput']
237 update_grad_input = function_by_name[fn + '_updateGradInput']
238 acc_grad_parameters = function_by_name.get(fn + '_accGradParameters')
239 class_name = name_remap.get(fn, fn)
240 # This has to call a function to retain correct references to functions
241 if 'Criterion' in fn:
242 cls = _make_function_class_criterion(class_name, update_output,
243 update_grad_input, acc_grad_parameters)
244 else:
245 cls = _make_function_class(class_name, update_output,
246 update_grad_input, acc_grad_parameters)
247 scope_dict[class_name] = cls
248 if not class_name.startswith('_'):
249 _all_functions.append(cls)
250
251
252 _generate_function_classes(locals())
253
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torch/nn/functions/thnn/auto.py b/torch/nn/functions/thnn/auto.py
--- a/torch/nn/functions/thnn/auto.py
+++ b/torch/nn/functions/thnn/auto.py
@@ -185,6 +185,7 @@
function_by_name = {fn.name: fn for fn in function_list}
classes_to_generate = {fn.name.partition('_')[0] for fn in function_list}
exceptions = {
+ 'Linear',
'SpatialConvolutionMM',
'SparseLinear',
'TemporalConvolution',
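A brief note on what the one-line fix above does mechanically: names listed in `exceptions` are subtracted from `classes_to_generate` near the bottom of `auto.py`, so after the patch no `Linear` Function class is auto-generated from the THNN header (presumably leaving that name free for a hand-written implementation). A minimal sketch that mirrors the relevant lines of the module:

```python
# Mirrors the `classes_to_generate -= exceptions` logic in auto.py (sketch only).
classes_to_generate = {"Linear", "Threshold", "Sigmoid"}   # illustrative subset
exceptions = {"Linear", "SpatialConvolutionMM"}
classes_to_generate -= exceptions
print(sorted(classes_to_generate))   # ['Sigmoid', 'Threshold'] - Linear is skipped
```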
| {"golden_diff": "diff --git a/torch/nn/functions/thnn/auto.py b/torch/nn/functions/thnn/auto.py\n--- a/torch/nn/functions/thnn/auto.py\n+++ b/torch/nn/functions/thnn/auto.py\n@@ -185,6 +185,7 @@\n function_by_name = {fn.name: fn for fn in function_list}\n classes_to_generate = {fn.name.partition('_')[0] for fn in function_list}\n exceptions = {\n+ 'Linear',\n 'SpatialConvolutionMM',\n 'SparseLinear',\n 'TemporalConvolution',\n", "issue": "Port LBFGS from Lua optim\nfixes #483 \n", "before_files": [{"content": "from itertools import repeat\nfrom collections import defaultdict\n\nimport torch\nfrom torch._thnn.utils import parse_header, THNN_H_PATH\nfrom torch.autograd.function import Function, InplaceFunction\nfrom torch._thnn import type2backend\n\nfrom . import _all_functions\n\n\ndef _make_function_class_criterion(class_name, update_output, update_grad_input, acc_grad_parameters):\n weight_arg_idx = -1\n for i, arg in enumerate(update_output.arguments):\n if arg.name.startswith('weight'):\n weight_arg_idx = i\n break\n\n buffers_idx = []\n additional_arg_idx = 0\n for arg in update_output.arguments[4:]:\n if not arg.name.startswith('weight') and arg.type == 'THTensor*':\n buffers_idx.append(additional_arg_idx)\n additional_arg_idx += 1\n\n def __init__(self, *args, **kwargs):\n Function.__init__(self)\n self.weight = kwargs.get('weight')\n self.additional_args = list(args)\n\n def forward(self, input, target):\n self._backend = type2backend[type(input)]\n self.save_for_backward(input, target)\n if weight_arg_idx >= 0:\n insert_idx = weight_arg_idx - 4 # state, input, target, output\n self.additional_args.insert(insert_idx, self.weight)\n for idx in buffers_idx:\n self.additional_args.insert(idx, input.new(1))\n output = input.new(1)\n getattr(self._backend, update_output.name)(self._backend.library_state, input, target,\n output, *self.additional_args)\n return output\n\n def backward(self, grad_output):\n input, target = self.saved_tensors\n grad_input = grad_output.new().resize_as_(input).zero_()\n getattr(self._backend, update_grad_input.name)(self._backend.library_state, input, target,\n grad_input, *self.additional_args)\n grad_output_expanded = grad_output.resize_(*repeat(1, grad_input.dim()))\n grad_input.mul_(grad_output.expand_as(grad_input))\n return grad_input, None\n\n return type(class_name, (Function,), dict(__init__=__init__, forward=forward, backward=backward))\n\n\ndef _find_buffers(args, ignored_args):\n additional_arg_idx = 0\n buffers = []\n for arg in args:\n if arg.name in ignored_args:\n continue\n if arg.type == 'THTensor*':\n buffers.append((additional_arg_idx, arg.name))\n additional_arg_idx += 1\n return buffers\n\n\ndef _make_function_class(class_name, update_output, update_grad_input, acc_grad_parameters):\n def has_argument(fn, name):\n for arg in fn.arguments:\n if arg.name == name:\n return True\n return False\n save_output = has_argument(update_grad_input, 'output')\n\n param_args = {'weight', 'bias'}\n ignored_args = {'weight', 'bias', 'gradWeight', 'gradBias', 'output'}\n expected_params = [arg for arg in update_output.arguments[3:]\n if arg.name in param_args]\n buffers = {}\n buffers['update_output'] = _find_buffers(update_output.arguments[3:],\n ignored_args)\n buffers['update_grad_input'] = _find_buffers(\n update_grad_input.arguments[4:], ignored_args)\n if acc_grad_parameters is not None:\n buffers['acc_grad_parameters'] = _find_buffers(\n acc_grad_parameters.arguments[3:], ignored_args)\n\n # This and __init__ assume that only the last argument 
can be\n # an inplace flag\n is_inplace = update_output.arguments[-1].name == 'inplace'\n\n def __init__(self, *args):\n if is_inplace:\n InplaceFunction.__init__(self, args[-1])\n else:\n Function.__init__(self)\n self.additional_args = list(args)\n\n def _initialize_buffers(self, fn_name):\n additional_args = self.additional_args\n for idx, name in buffers[fn_name]:\n # TODO: some buffers are necessary only for update output and can be\n # freed right afterwards\n buffer = self.buffers[name]\n additional_args = additional_args[:idx] + [buffer] + additional_args[idx:]\n return tuple(additional_args)\n\n def forward(self, input, *params):\n self._backend = type2backend[type(input)]\n\n for param in params:\n if type(param) != type(input):\n raise RuntimeError(\"input type ({}) doesn't match the type of \"\n \"a parameter tensor ({})\".format(torch.typename(input),\n torch.typename(param)))\n\n # Allocate temporary buffers and insert them into additional_args\n self.buffers = defaultdict(type(input))\n additional_args = self._initialize_buffers('update_output')\n\n # Fill in optional params with None\n args = params\n for i in range(len(params), len(expected_params)):\n param = expected_params[i]\n if param.is_optional:\n args += (None,)\n else:\n raise ValueError(\"missing required argument '%s'\" % param.name)\n\n args += tuple(additional_args)\n\n # If the module is working in-place it's output will be set to the\n # same storage as input, but it's variable won't be dirty.\n if is_inplace and self.inplace:\n self.mark_dirty(input)\n output = input\n self.save_for_backward(input, *params)\n else:\n output = input.new()\n if save_output:\n self.save_for_backward(input, output, *params)\n else:\n self.save_for_backward(input, *params)\n\n getattr(self._backend, update_output.name)(self._backend.library_state, input, output, *args)\n return output\n\n def backward(self, grad_output):\n t = self.saved_tensors\n if save_output:\n input, output, params = t[0], t[1], t[2:]\n else:\n input, params = t[0], t[1:]\n grad_params = tuple(None for p in params)\n grad_input_tuple = (None,)\n\n if self.needs_input_grad[0]:\n additional_args = self._initialize_buffers('update_grad_input')\n if save_output:\n additional_args = (output,) + additional_args\n\n grad_input = input.new().resize_as_(input).zero_()\n params_without_bias = params if len(params) < 2 else params[:1]\n update_grad_input_fn = getattr(self._backend, update_grad_input.name)\n gi_args = params_without_bias + additional_args\n update_grad_input_fn(self._backend.library_state, input, grad_output, grad_input, *gi_args)\n grad_input_tuple = (grad_input,)\n\n if acc_grad_parameters and any(self.needs_input_grad[1:]):\n additional_args = self._initialize_buffers('acc_grad_parameters')\n grad_params = tuple(p.new().resize_as_(p).zero_() for p in params)\n acc_grad_parameters_fn = getattr(self._backend, acc_grad_parameters.name)\n param_args = grad_params + additional_args + (1,)\n acc_grad_parameters_fn(self._backend.library_state, input, grad_output, *param_args)\n\n return grad_input_tuple + grad_params\n\n base_class = Function if not is_inplace else InplaceFunction\n return type(class_name, (base_class,), dict(__init__=__init__, forward=forward, backward=backward, _initialize_buffers=_initialize_buffers))\n\n\ndef _generate_function_classes(scope_dict):\n function_list = parse_header(THNN_H_PATH)\n function_by_name = {fn.name: fn for fn in function_list}\n classes_to_generate = {fn.name.partition('_')[0] for fn in function_list}\n 
exceptions = {\n 'SpatialConvolutionMM',\n 'SparseLinear',\n 'TemporalConvolution',\n 'SpatialMaxPooling',\n 'SpatialDilatedMaxPooling',\n 'SpatialMaxUnpooling',\n 'VolumetricMaxPooling',\n 'VolumetricMaxUnpooling',\n 'VolumetricConvolution',\n 'VolumetricConvolutionMM',\n 'TemporalMaxPooling',\n 'BatchNormalization',\n 'LookupTable',\n 'PReLU',\n 'RReLU',\n 'unfolded',\n }\n name_remap = {\n 'TemporalConvolution': 'Conv1d',\n 'SpatialDilatedConvolution': 'DilatedConv2d',\n 'SpatialAveragePooling': 'AvgPool2d',\n 'SpatialMaxUnpooling': 'MaxUnpool2d',\n 'SpatialReflectionPadding': 'ReflectionPad2d',\n 'SpatialReplicationPadding': 'ReplicationPad2d',\n 'SpatialFullConvolution': 'ConvTranspose2d',\n 'VolumetricFullConvolution': 'ConvTranspose3d',\n 'VolumetricReplicationPadding': 'ReplicationPad3d',\n 'VolumetricMaxUnpooling': 'MaxUnpool3d',\n 'VolumetricAveragePooling': 'AvgPool3d',\n 'SoftMax': 'Softmax',\n 'LogSoftMax': 'LogSoftmax',\n 'HardTanh': 'Hardtanh',\n 'HardShrink': 'Hardshrink',\n 'SoftPlus': 'Softplus',\n 'SoftShrink': 'Softshrink',\n 'MSECriterion': 'MSELoss',\n 'AbsCriterion': 'L1Loss',\n 'BCECriterion': '_BCELoss', # TODO: move the glue code into THNN\n 'ClassNLLCriterion': 'NLLLoss',\n 'DistKLDivCriterion': 'KLDivLoss',\n 'SpatialClassNLLCriterion': 'NLLLoss2d',\n 'MultiLabelMarginCriterion': 'MultiLabelMarginLoss',\n 'MultiMarginCriterion': 'MultiMarginLoss',\n 'SmoothL1Criterion': 'SmoothL1Loss',\n 'SoftMarginCriterion': 'SoftMarginLoss',\n }\n classes_to_generate -= exceptions\n for fn in classes_to_generate:\n update_output = function_by_name[fn + '_updateOutput']\n update_grad_input = function_by_name[fn + '_updateGradInput']\n acc_grad_parameters = function_by_name.get(fn + '_accGradParameters')\n class_name = name_remap.get(fn, fn)\n # This has to call a function to retain correct references to functions\n if 'Criterion' in fn:\n cls = _make_function_class_criterion(class_name, update_output,\n update_grad_input, acc_grad_parameters)\n else:\n cls = _make_function_class(class_name, update_output,\n update_grad_input, acc_grad_parameters)\n scope_dict[class_name] = cls\n if not class_name.startswith('_'):\n _all_functions.append(cls)\n\n\n_generate_function_classes(locals())\n", "path": "torch/nn/functions/thnn/auto.py"}], "after_files": [{"content": "from itertools import repeat\nfrom collections import defaultdict\n\nimport torch\nfrom torch._thnn.utils import parse_header, THNN_H_PATH\nfrom torch.autograd.function import Function, InplaceFunction\nfrom torch._thnn import type2backend\n\nfrom . 
import _all_functions\n\n\ndef _make_function_class_criterion(class_name, update_output, update_grad_input, acc_grad_parameters):\n weight_arg_idx = -1\n for i, arg in enumerate(update_output.arguments):\n if arg.name.startswith('weight'):\n weight_arg_idx = i\n break\n\n buffers_idx = []\n additional_arg_idx = 0\n for arg in update_output.arguments[4:]:\n if not arg.name.startswith('weight') and arg.type == 'THTensor*':\n buffers_idx.append(additional_arg_idx)\n additional_arg_idx += 1\n\n def __init__(self, *args, **kwargs):\n Function.__init__(self)\n self.weight = kwargs.get('weight')\n self.additional_args = list(args)\n\n def forward(self, input, target):\n self._backend = type2backend[type(input)]\n self.save_for_backward(input, target)\n if weight_arg_idx >= 0:\n insert_idx = weight_arg_idx - 4 # state, input, target, output\n self.additional_args.insert(insert_idx, self.weight)\n for idx in buffers_idx:\n self.additional_args.insert(idx, input.new(1))\n output = input.new(1)\n getattr(self._backend, update_output.name)(self._backend.library_state, input, target,\n output, *self.additional_args)\n return output\n\n def backward(self, grad_output):\n input, target = self.saved_tensors\n grad_input = grad_output.new().resize_as_(input).zero_()\n getattr(self._backend, update_grad_input.name)(self._backend.library_state, input, target,\n grad_input, *self.additional_args)\n grad_output_expanded = grad_output.resize_(*repeat(1, grad_input.dim()))\n grad_input.mul_(grad_output.expand_as(grad_input))\n return grad_input, None\n\n return type(class_name, (Function,), dict(__init__=__init__, forward=forward, backward=backward))\n\n\ndef _find_buffers(args, ignored_args):\n additional_arg_idx = 0\n buffers = []\n for arg in args:\n if arg.name in ignored_args:\n continue\n if arg.type == 'THTensor*':\n buffers.append((additional_arg_idx, arg.name))\n additional_arg_idx += 1\n return buffers\n\n\ndef _make_function_class(class_name, update_output, update_grad_input, acc_grad_parameters):\n def has_argument(fn, name):\n for arg in fn.arguments:\n if arg.name == name:\n return True\n return False\n save_output = has_argument(update_grad_input, 'output')\n\n param_args = {'weight', 'bias'}\n ignored_args = {'weight', 'bias', 'gradWeight', 'gradBias', 'output'}\n expected_params = [arg for arg in update_output.arguments[3:]\n if arg.name in param_args]\n buffers = {}\n buffers['update_output'] = _find_buffers(update_output.arguments[3:],\n ignored_args)\n buffers['update_grad_input'] = _find_buffers(\n update_grad_input.arguments[4:], ignored_args)\n if acc_grad_parameters is not None:\n buffers['acc_grad_parameters'] = _find_buffers(\n acc_grad_parameters.arguments[3:], ignored_args)\n\n # This and __init__ assume that only the last argument can be\n # an inplace flag\n is_inplace = update_output.arguments[-1].name == 'inplace'\n\n def __init__(self, *args):\n if is_inplace:\n InplaceFunction.__init__(self, args[-1])\n else:\n Function.__init__(self)\n self.additional_args = list(args)\n\n def _initialize_buffers(self, fn_name):\n additional_args = self.additional_args\n for idx, name in buffers[fn_name]:\n # TODO: some buffers are necessary only for update output and can be\n # freed right afterwards\n buffer = self.buffers[name]\n additional_args = additional_args[:idx] + [buffer] + additional_args[idx:]\n return tuple(additional_args)\n\n def forward(self, input, *params):\n self._backend = type2backend[type(input)]\n\n for param in params:\n if type(param) != type(input):\n raise 
RuntimeError(\"input type ({}) doesn't match the type of \"\n \"a parameter tensor ({})\".format(torch.typename(input),\n torch.typename(param)))\n\n # Allocate temporary buffers and insert them into additional_args\n self.buffers = defaultdict(type(input))\n additional_args = self._initialize_buffers('update_output')\n\n # Fill in optional params with None\n args = params\n for i in range(len(params), len(expected_params)):\n param = expected_params[i]\n if param.is_optional:\n args += (None,)\n else:\n raise ValueError(\"missing required argument '%s'\" % param.name)\n\n args += tuple(additional_args)\n\n # If the module is working in-place it's output will be set to the\n # same storage as input, but it's variable won't be dirty.\n if is_inplace and self.inplace:\n self.mark_dirty(input)\n output = input\n self.save_for_backward(input, *params)\n else:\n output = input.new()\n if save_output:\n self.save_for_backward(input, output, *params)\n else:\n self.save_for_backward(input, *params)\n\n getattr(self._backend, update_output.name)(self._backend.library_state, input, output, *args)\n return output\n\n def backward(self, grad_output):\n t = self.saved_tensors\n if save_output:\n input, output, params = t[0], t[1], t[2:]\n else:\n input, params = t[0], t[1:]\n grad_params = tuple(None for p in params)\n grad_input_tuple = (None,)\n\n if self.needs_input_grad[0]:\n additional_args = self._initialize_buffers('update_grad_input')\n if save_output:\n additional_args = (output,) + additional_args\n\n grad_input = input.new().resize_as_(input).zero_()\n params_without_bias = params if len(params) < 2 else params[:1]\n update_grad_input_fn = getattr(self._backend, update_grad_input.name)\n gi_args = params_without_bias + additional_args\n update_grad_input_fn(self._backend.library_state, input, grad_output, grad_input, *gi_args)\n grad_input_tuple = (grad_input,)\n\n if acc_grad_parameters and any(self.needs_input_grad[1:]):\n additional_args = self._initialize_buffers('acc_grad_parameters')\n grad_params = tuple(p.new().resize_as_(p).zero_() for p in params)\n acc_grad_parameters_fn = getattr(self._backend, acc_grad_parameters.name)\n param_args = grad_params + additional_args + (1,)\n acc_grad_parameters_fn(self._backend.library_state, input, grad_output, *param_args)\n\n return grad_input_tuple + grad_params\n\n base_class = Function if not is_inplace else InplaceFunction\n return type(class_name, (base_class,), dict(__init__=__init__, forward=forward, backward=backward, _initialize_buffers=_initialize_buffers))\n\n\ndef _generate_function_classes(scope_dict):\n function_list = parse_header(THNN_H_PATH)\n function_by_name = {fn.name: fn for fn in function_list}\n classes_to_generate = {fn.name.partition('_')[0] for fn in function_list}\n exceptions = {\n 'Linear',\n 'SpatialConvolutionMM',\n 'SparseLinear',\n 'TemporalConvolution',\n 'SpatialMaxPooling',\n 'SpatialDilatedMaxPooling',\n 'SpatialMaxUnpooling',\n 'VolumetricMaxPooling',\n 'VolumetricMaxUnpooling',\n 'VolumetricConvolution',\n 'VolumetricConvolutionMM',\n 'TemporalMaxPooling',\n 'BatchNormalization',\n 'LookupTable',\n 'PReLU',\n 'RReLU',\n 'unfolded',\n }\n name_remap = {\n 'TemporalConvolution': 'Conv1d',\n 'SpatialDilatedConvolution': 'DilatedConv2d',\n 'SpatialAveragePooling': 'AvgPool2d',\n 'SpatialMaxUnpooling': 'MaxUnpool2d',\n 'SpatialReflectionPadding': 'ReflectionPad2d',\n 'SpatialReplicationPadding': 'ReplicationPad2d',\n 'SpatialFullConvolution': 'ConvTranspose2d',\n 'VolumetricFullConvolution': 
'ConvTranspose3d',\n 'VolumetricReplicationPadding': 'ReplicationPad3d',\n 'VolumetricMaxUnpooling': 'MaxUnpool3d',\n 'VolumetricAveragePooling': 'AvgPool3d',\n 'SoftMax': 'Softmax',\n 'LogSoftMax': 'LogSoftmax',\n 'HardTanh': 'Hardtanh',\n 'HardShrink': 'Hardshrink',\n 'SoftPlus': 'Softplus',\n 'SoftShrink': 'Softshrink',\n 'MSECriterion': 'MSELoss',\n 'AbsCriterion': 'L1Loss',\n 'BCECriterion': '_BCELoss', # TODO: move the glue code into THNN\n 'ClassNLLCriterion': 'NLLLoss',\n 'DistKLDivCriterion': 'KLDivLoss',\n 'SpatialClassNLLCriterion': 'NLLLoss2d',\n 'MultiLabelMarginCriterion': 'MultiLabelMarginLoss',\n 'MultiMarginCriterion': 'MultiMarginLoss',\n 'SmoothL1Criterion': 'SmoothL1Loss',\n 'SoftMarginCriterion': 'SoftMarginLoss',\n }\n classes_to_generate -= exceptions\n for fn in classes_to_generate:\n update_output = function_by_name[fn + '_updateOutput']\n update_grad_input = function_by_name[fn + '_updateGradInput']\n acc_grad_parameters = function_by_name.get(fn + '_accGradParameters')\n class_name = name_remap.get(fn, fn)\n # This has to call a function to retain correct references to functions\n if 'Criterion' in fn:\n cls = _make_function_class_criterion(class_name, update_output,\n update_grad_input, acc_grad_parameters)\n else:\n cls = _make_function_class(class_name, update_output,\n update_grad_input, acc_grad_parameters)\n scope_dict[class_name] = cls\n if not class_name.startswith('_'):\n _all_functions.append(cls)\n\n\n_generate_function_classes(locals())\n", "path": "torch/nn/functions/thnn/auto.py"}]} | 3,226 | 124 |
gh_patches_debug_2883 | rasdani/github-patches | git_diff | zestedesavoir__zds-site-672 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Provide a reminder of Markdown basics next to the editing areas
The idea is to have a reminder of Markdown basics next to the editing areas (the main points plus a link to the tutorial).
I know we already have the buttons, but it is still useful for people who prefer not to reach for the mouse, and I think it will improve how people learn MD.
The most important thing to include is probably how line breaks / paragraphs are handled :)
--- END ISSUE ---
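To make the reporter's last point concrete: in standard Markdown a single newline does not start a new paragraph, only a blank line does, which is exactly the rule worth spelling out next to the editor. A small, hypothetical illustration with the reference `markdown` package (zds has its own rendering pipeline, so this only shows the general behaviour):

```python
import markdown  # pip install markdown; illustrative, not zds's actual renderer

src = "line one\nline two\n\nnew paragraph"
print(markdown.markdown(src))
# Roughly: <p>line one\nline two</p>\n<p>new paragraph</p>
# A single newline stays inside the same paragraph; a blank line starts a new one.
```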
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zds/settings.py`
Content:
```
1 # coding: utf-8
2
3 import locale
4 import os
5 import platform
6
7
8 # Python is platform-independent...or is it?
9 if platform.system() == "Windows":
10 locale.setlocale(locale.LC_TIME, 'fra')
11 else:
12 locale.setlocale(locale.LC_TIME, 'fr_FR.UTF-8')
13
14 DEBUG = True
15 TEMPLATE_DEBUG = DEBUG
16 # INTERNAL_IPS = ('127.0.0.1',) # debug toolbar
17
18
19 ADMINS = (
20 ('user', 'mail'),
21 )
22
23 MANAGERS = ADMINS
24
25 DATABASES = {
26 'default': {
27 'ENGINE': 'django.db.backends.sqlite3',
28 'NAME': 'base.db',
29 'USER': '',
30 'PASSWORD': '',
31 'HOST': '',
32 'PORT': '',
33 }
34 }
35
36 # Local time zone for this installation. Choices can be found here:
37 # http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
38 # although not all choices may be available on all operating systems.
39 # In a Windows environment this must be set to your system time zone.
40 TIME_ZONE = 'Europe/Paris'
41
42 # Language code for this installation. All choices can be found here:
43 # http://www.i18nguy.com/unicode/language-identifiers.html
44 LANGUAGE_CODE = 'fr-fr'
45
46 SITE_ID = 1
47
48 # If you set this to False, Django will make some optimizations so as not
49 # to load the internationalization machinery.
50 USE_I18N = True
51
52 # If you set this to False, Django will not format dates, numbers and
53 # calendars according to the current locale.
54 USE_L10N = True
55
56 # If you set this to False, Django will not use timezone-aware datetimes.
57 USE_TZ = False
58
59 SITE_ROOT = os.path.realpath(os.path.dirname(os.path.dirname(__file__)))
60
61 SITE_URL = 'http://127.0.0.1:8000'
62
63 # Absolute filesystem path to the directory that will hold user-uploaded files.
64 # Example: "/home/media/media.lawrence.com/media/"
65 MEDIA_ROOT = os.path.join(SITE_ROOT, 'media')
66
67 # URL that handles the media served from MEDIA_ROOT. Make sure to use a
68 # trailing slash.
69 # Examples: "http://media.lawrence.com/media/", "http://example.com/media/"
70 MEDIA_URL = '/media/'
71
72 # Absolute path to the directory static files should be collected to.
73 # Don't put anything in this directory yourself; store your static files
74 # in apps' "static/" subdirectories and in STATICFILES_DIRS.
75 # Example: "/home/media/media.lawrence.com/static/"
76 STATIC_ROOT = os.path.join(SITE_ROOT, 'static')
77
78 # URL prefix for static files.
79 # Example: "http://media.lawrence.com/static/"
80 STATIC_URL = '/static/'
81
82 # Additional locations of static files
83 STATICFILES_DIRS = (
84 # Put strings here, like "/home/html/static" or "C:/www/django/static".
85 # Always use forward slashes, even on Windows.
86 # Don't forget to use absolute paths, not relative paths.
87 os.path.join(SITE_ROOT, 'assets'),
88 )
89
90 # List of finder classes that know how to find static files in
91 # various locations.
92 STATICFILES_FINDERS = (
93 'django.contrib.staticfiles.finders.FileSystemFinder',
94 'django.contrib.staticfiles.finders.AppDirectoriesFinder',
95 # 'django.contrib.staticfiles.finders.DefaultStorageFinder',
96 )
97
98 STATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'
99
100 FIXTURE_DIRS = (os.path.join(SITE_ROOT, 'fixtures'))
101 # You will need yuglify to be installed
102 PIPELINE_JS = {
103 'modernizr' : {
104 'source_filenames': (
105 'js/vendor/custom.modernizr.js',
106 ),
107 'output_filename': 'js/modernizr.js'
108 },
109 'css3-mediaqueries' : {
110 'source_filenames': (
111 'js/vendor/css3-mediaqueries.js',
112 ),
113 'output_filename': 'js/css3-mediaqueries.js'
114 },
115 'main-js': {
116 'source_filenames': (
117 'js/vendor/jquery.js',
118 'js/vendor/jquery.tabbable.js',
119
120 'js/custom/editor.js',
121
122 'js/custom/mobile-menu.js',
123 'js/custom/accessibility-links.js',
124 'js/custom/dropdown-menu.js',
125 'js/custom/data-click.js',
126 'js/custom/accordeon.js',
127 'js/custom/modal.js',
128 'js/custom/close-alert-box.js',
129 'js/custom/keyboard-navigation.js',
130 'js/custom/message-hidden.js',
131 'js/custom/spoiler.js',
132 ),
133 'output_filename': 'js/main.js'
134 }
135 }
136
137 PIPELINE_CSS = {
138 'main-css': {
139 'source_filenames': (
140 'css/main.css',
141 ),
142 'output_filename': 'css/design.css'
143 }
144 }
145
146 # Make this unique, and don't share it with anybody.
147 SECRET_KEY = 'n!01nl+318#x75_%le8#s0=-*ysw&y49uc#t=*wvi(9hnyii0z'
148
149 # List of callables that know how to import templates from various sources.
150 TEMPLATE_LOADERS = (
151 'django.template.loaders.filesystem.Loader',
152 'django.template.loaders.app_directories.Loader',
153 # 'django.template.loaders.eggs.Loader',
154 )
155
156 FILE_UPLOAD_HANDLERS = (
157 "django.core.files.uploadhandler.MemoryFileUploadHandler",
158 "django.core.files.uploadhandler.TemporaryFileUploadHandler",
159 )
160
161 MIDDLEWARE_CLASSES = (
162 'django.middleware.common.CommonMiddleware',
163 'django.contrib.sessions.middleware.SessionMiddleware',
164 'django.middleware.csrf.CsrfViewMiddleware',
165 'django.contrib.auth.middleware.AuthenticationMiddleware',
166 'django.contrib.messages.middleware.MessageMiddleware',
167 # Uncomment the next line for simple clickjacking protection:
168 # 'django.middleware.clickjacking.XFrameOptionsMiddleware',
169 'zds.utils.ThreadLocals',
170 'zds.middlewares.SetLastVisitMiddleware.SetLastVisitMiddleware',
171 )
172
173 ROOT_URLCONF = 'zds.urls'
174
175 # Python dotted path to the WSGI application used by Django's runserver.
176 WSGI_APPLICATION = 'zds.wsgi.application'
177
178 TEMPLATE_DIRS = [
179 # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
180 # Always use forward slashes, even on Windows.
181 # Don't forget to use absolute paths, not relative paths.
182 os.path.join(SITE_ROOT, 'templates')
183 ]
184
185 TEMPLATE_CONTEXT_PROCESSORS = (
186 # Default context processors
187 'django.contrib.auth.context_processors.auth',
188 'django.core.context_processors.debug',
189 'django.core.context_processors.i18n',
190 'django.core.context_processors.media',
191 'django.core.context_processors.static',
192 'django.core.context_processors.request',
193 'django.core.context_processors.tz',
194 'django.contrib.messages.context_processors.messages'
195 )
196
197 CRISPY_TEMPLATE_PACK='bootstrap'
198
199 INSTALLED_APPS = (
200 'django.contrib.auth',
201 'django.contrib.contenttypes',
202 'django.contrib.sessions',
203 'django.contrib.sites',
204 'django.contrib.messages',
205 'django.contrib.staticfiles',
206 'django.contrib.sitemaps',
207 'django.contrib.humanize',
208
209 'south',
210 'crispy_forms',
211 'email_obfuscator',
212 'pipeline',
213 'haystack',
214 'munin',
215
216 # Apps DB tables are created in THIS order by default
217 # --> Order is CRITICAL to properly handle foreign keys
218 'zds.utils',
219 'zds.pages',
220 'zds.gallery',
221 'zds.mp',
222 'zds.newsletter',
223 'zds.article',
224 'zds.forum',
225 'zds.tutorial',
226 'zds.member',
227 # Uncomment the next line to enable the admin:
228 'django.contrib.admin',
229 # Uncomment the next line to enable admin documentation:
230 # 'django.contrib.admindocs',
231 )
232 if (DEBUG):
233 INSTALLED_APPS += (
234 'debug_toolbar',
235 )
236
237 # A sample logging configuration. The only tangible logging
238 # performed by this configuration is to send an email to
239 # the site admins on every HTTP 500 error when DEBUG=False.
240 # See http://docs.djangoproject.com/en/dev/topics/logging for
241 # more details on how to customize your logging configuration.
242 LOGGING = {
243 'version': 1,
244 'disable_existing_loggers': False,
245 'filters': {
246 'require_debug_false': {
247 '()': 'django.utils.log.RequireDebugFalse'
248 }
249 },
250 'handlers': {
251 'mail_admins': {
252 'level': 'ERROR',
253 'filters': ['require_debug_false'],
254 'class': 'django.utils.log.AdminEmailHandler'
255 }
256 },
257 'loggers': {
258 'django.request': {
259 'handlers': ['mail_admins'],
260 'level': 'ERROR',
261 'propagate': True,
262 },
263 }
264 }
265
266 CACHES = {
267 'default': {
268 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
269 'LOCATION': '127.0.0.1:11211',
270 }
271 }
272
273 SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"
274
275 AUTH_PROFILE_MODULE = 'member.Profile'
276 LOGIN_URL = '/membres/connexion'
277
278 ABSOLUTE_URL_OVERRIDES = {
279 'auth.user': lambda u: '/membres/voir/{0}/'.format(u.username.encode('utf-8'))
280 }
281
282
283 # Django fileserve settings (set to True for local dev version only)
284 SERVE = False
285
286 # Max size image upload (in bytes)
287 IMAGE_MAX_SIZE = 1024 * 1024 * 2
288
289 # git directory
290 REPO_PATH = os.path.join(SITE_ROOT, 'tutoriels-private')
291 REPO_PATH_PROD = os.path.join(SITE_ROOT, 'tutoriels-public')
292 REPO_ARTICLE_PATH = os.path.join(SITE_ROOT, 'articles-data')
293
294 # Constants for pagination
295 POSTS_PER_PAGE = 21
296 TOPICS_PER_PAGE = 21
297 MEMBERS_PER_PAGE = 36
298
299 # Constants to avoid spam
300 SPAM_LIMIT_SECONDS = 60 * 15
301 SPAM_LIMIT_PARTICIPANT = 2
302 FOLLOWED_TOPICS_PER_PAGE = 21
303
304 BOT_ACCOUNT = 'admin'
305
306 PANDOC_LOC = ''
307
308 HAYSTACK_CONNECTIONS = {
309 'default': {
310 'ENGINE': 'haystack.backends.solr_backend.SolrEngine',
311 'URL': 'http://127.0.0.1:8983/solr'
312 # ...or for multicore...
313 # 'URL': 'http://127.0.0.1:8983/solr/mysite',
314 },
315 }
316
317 GEOIP_PATH = os.path.join(SITE_ROOT, 'geodata')
318
319 from django.contrib.messages import constants as message_constants
320 MESSAGE_TAGS = {
321 message_constants.DEBUG: 'debug',
322 message_constants.INFO: 'info',
323 message_constants.SUCCESS: 'success',
324 message_constants.WARNING: 'warning',
325 message_constants.ERROR: 'alert',
326 }
327
328
329 MAX_POST_LENGTH = 1000000
330 SDZ_TUTO_DIR = ''
331
332 MAIL_CA_ASSO = '[email protected]'
333
334 # Load the production settings, overwrite the existing ones if needed
335 try:
336 from settings_prod import *
337 except ImportError:
338 pass
339
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/zds/settings.py b/zds/settings.py
--- a/zds/settings.py
+++ b/zds/settings.py
@@ -129,6 +129,8 @@
'js/custom/keyboard-navigation.js',
'js/custom/message-hidden.js',
'js/custom/spoiler.js',
+ 'js/custom/karma-ajax.js',
+ 'js/custom/markdown-help.js',
),
'output_filename': 'js/main.js'
}
| {"golden_diff": "diff --git a/zds/settings.py b/zds/settings.py\n--- a/zds/settings.py\n+++ b/zds/settings.py\n@@ -129,6 +129,8 @@\n 'js/custom/keyboard-navigation.js',\n 'js/custom/message-hidden.js',\n 'js/custom/spoiler.js',\n+ 'js/custom/karma-ajax.js',\n+ 'js/custom/markdown-help.js',\n ),\n 'output_filename': 'js/main.js'\n }\n", "issue": "Avoir un rappel des bases du Mardkown \u00e0 c\u00f4t\u00e9 des zones de r\u00e9daction\nL'id\u00e9e est d'avoir un rappel des bases du Mardkown \u00e0 c\u00f4t\u00e9 des zones de r\u00e9daction (les grandes lignes + un lien vers le tuto).\n\nJe sais qu'on a les boutons, mais c'est toujours utile pour ceux qui pr\u00e9f\u00e8rent \u00e9viter de jouer avec la souris, et je pense am\u00e9liorera l'apprentissage du MD.\n\nLe truc le plus important \u00e0 y mettre est sans doute la gestion des sauts de ligne / paragraphes :)\n\n", "before_files": [{"content": "# coding: utf-8\n\nimport locale\nimport os\nimport platform\n\n\n# Python is platform-independent...or is it?\nif platform.system() == \"Windows\":\n locale.setlocale(locale.LC_TIME, 'fra')\nelse:\n locale.setlocale(locale.LC_TIME, 'fr_FR.UTF-8')\n\nDEBUG = True\nTEMPLATE_DEBUG = DEBUG\n# INTERNAL_IPS = ('127.0.0.1',) # debug toolbar\n\n\nADMINS = (\n ('user', 'mail'),\n)\n\nMANAGERS = ADMINS\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.sqlite3',\n 'NAME': 'base.db',\n 'USER': '',\n 'PASSWORD': '',\n 'HOST': '',\n 'PORT': '',\n }\n}\n\n# Local time zone for this installation. Choices can be found here:\n# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name\n# although not all choices may be available on all operating systems.\n# In a Windows environment this must be set to your system time zone.\nTIME_ZONE = 'Europe/Paris'\n\n# Language code for this installation. All choices can be found here:\n# http://www.i18nguy.com/unicode/language-identifiers.html\nLANGUAGE_CODE = 'fr-fr'\n\nSITE_ID = 1\n\n# If you set this to False, Django will make some optimizations so as not\n# to load the internationalization machinery.\nUSE_I18N = True\n\n# If you set this to False, Django will not format dates, numbers and\n# calendars according to the current locale.\nUSE_L10N = True\n\n# If you set this to False, Django will not use timezone-aware datetimes.\nUSE_TZ = False\n\nSITE_ROOT = os.path.realpath(os.path.dirname(os.path.dirname(__file__)))\n\nSITE_URL = 'http://127.0.0.1:8000'\n\n# Absolute filesystem path to the directory that will hold user-uploaded files.\n# Example: \"/home/media/media.lawrence.com/media/\"\nMEDIA_ROOT = os.path.join(SITE_ROOT, 'media')\n\n# URL that handles the media served from MEDIA_ROOT. 
Make sure to use a\n# trailing slash.\n# Examples: \"http://media.lawrence.com/media/\", \"http://example.com/media/\"\nMEDIA_URL = '/media/'\n\n# Absolute path to the directory static files should be collected to.\n# Don't put anything in this directory yourself; store your static files\n# in apps' \"static/\" subdirectories and in STATICFILES_DIRS.\n# Example: \"/home/media/media.lawrence.com/static/\"\nSTATIC_ROOT = os.path.join(SITE_ROOT, 'static')\n\n# URL prefix for static files.\n# Example: \"http://media.lawrence.com/static/\"\nSTATIC_URL = '/static/'\n\n# Additional locations of static files\nSTATICFILES_DIRS = (\n # Put strings here, like \"/home/html/static\" or \"C:/www/django/static\".\n # Always use forward slashes, even on Windows.\n # Don't forget to use absolute paths, not relative paths.\n os.path.join(SITE_ROOT, 'assets'),\n)\n\n# List of finder classes that know how to find static files in\n# various locations.\nSTATICFILES_FINDERS = (\n 'django.contrib.staticfiles.finders.FileSystemFinder',\n 'django.contrib.staticfiles.finders.AppDirectoriesFinder',\n # 'django.contrib.staticfiles.finders.DefaultStorageFinder',\n)\n\nSTATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'\n\nFIXTURE_DIRS = (os.path.join(SITE_ROOT, 'fixtures'))\n# You will need yuglify to be installed\nPIPELINE_JS = {\n 'modernizr' : {\n 'source_filenames': (\n 'js/vendor/custom.modernizr.js',\n ),\n 'output_filename': 'js/modernizr.js'\n }, \n 'css3-mediaqueries' : {\n 'source_filenames': (\n 'js/vendor/css3-mediaqueries.js',\n ),\n 'output_filename': 'js/css3-mediaqueries.js'\n },\n 'main-js': {\n 'source_filenames': (\n 'js/vendor/jquery.js',\n 'js/vendor/jquery.tabbable.js',\n\n 'js/custom/editor.js',\n\n 'js/custom/mobile-menu.js',\n 'js/custom/accessibility-links.js',\n 'js/custom/dropdown-menu.js',\n 'js/custom/data-click.js',\n 'js/custom/accordeon.js',\n 'js/custom/modal.js',\n 'js/custom/close-alert-box.js',\n 'js/custom/keyboard-navigation.js',\n 'js/custom/message-hidden.js',\n 'js/custom/spoiler.js',\n ),\n 'output_filename': 'js/main.js'\n }\n}\n\nPIPELINE_CSS = {\n 'main-css': {\n 'source_filenames': (\n 'css/main.css',\n ),\n 'output_filename': 'css/design.css'\n }\n}\n\n# Make this unique, and don't share it with anybody.\nSECRET_KEY = 'n!01nl+318#x75_%le8#s0=-*ysw&y49uc#t=*wvi(9hnyii0z'\n\n# List of callables that know how to import templates from various sources.\nTEMPLATE_LOADERS = (\n 'django.template.loaders.filesystem.Loader',\n 'django.template.loaders.app_directories.Loader',\n # 'django.template.loaders.eggs.Loader',\n)\n\nFILE_UPLOAD_HANDLERS = (\n \"django.core.files.uploadhandler.MemoryFileUploadHandler\",\n \"django.core.files.uploadhandler.TemporaryFileUploadHandler\",\n)\n\nMIDDLEWARE_CLASSES = (\n 'django.middleware.common.CommonMiddleware',\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n # Uncomment the next line for simple clickjacking protection:\n # 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n 'zds.utils.ThreadLocals',\n 'zds.middlewares.SetLastVisitMiddleware.SetLastVisitMiddleware',\n)\n\nROOT_URLCONF = 'zds.urls'\n\n# Python dotted path to the WSGI application used by Django's runserver.\nWSGI_APPLICATION = 'zds.wsgi.application'\n\nTEMPLATE_DIRS = [\n # Put strings here, like \"/home/html/django_templates\" or \"C:/www/django/templates\".\n # Always use forward slashes, 
even on Windows.\n # Don't forget to use absolute paths, not relative paths.\n os.path.join(SITE_ROOT, 'templates')\n]\n\nTEMPLATE_CONTEXT_PROCESSORS = (\n # Default context processors\n 'django.contrib.auth.context_processors.auth',\n 'django.core.context_processors.debug',\n 'django.core.context_processors.i18n',\n 'django.core.context_processors.media',\n 'django.core.context_processors.static',\n 'django.core.context_processors.request',\n 'django.core.context_processors.tz',\n 'django.contrib.messages.context_processors.messages'\n)\n\nCRISPY_TEMPLATE_PACK='bootstrap'\n\nINSTALLED_APPS = (\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.sites',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django.contrib.sitemaps',\n 'django.contrib.humanize',\n\n 'south',\n 'crispy_forms',\n 'email_obfuscator',\n 'pipeline',\n 'haystack',\n 'munin',\n\n # Apps DB tables are created in THIS order by default\n # --> Order is CRITICAL to properly handle foreign keys\n 'zds.utils',\n 'zds.pages',\n 'zds.gallery',\n 'zds.mp',\n 'zds.newsletter',\n 'zds.article',\n 'zds.forum',\n 'zds.tutorial',\n 'zds.member',\n # Uncomment the next line to enable the admin:\n 'django.contrib.admin',\n # Uncomment the next line to enable admin documentation:\n # 'django.contrib.admindocs',\n)\nif (DEBUG):\n INSTALLED_APPS += (\n 'debug_toolbar',\n )\n\n# A sample logging configuration. The only tangible logging\n# performed by this configuration is to send an email to\n# the site admins on every HTTP 500 error when DEBUG=False.\n# See http://docs.djangoproject.com/en/dev/topics/logging for\n# more details on how to customize your logging configuration.\nLOGGING = {\n 'version': 1,\n 'disable_existing_loggers': False,\n 'filters': {\n 'require_debug_false': {\n '()': 'django.utils.log.RequireDebugFalse'\n }\n },\n 'handlers': {\n 'mail_admins': {\n 'level': 'ERROR',\n 'filters': ['require_debug_false'],\n 'class': 'django.utils.log.AdminEmailHandler'\n }\n },\n 'loggers': {\n 'django.request': {\n 'handlers': ['mail_admins'],\n 'level': 'ERROR',\n 'propagate': True,\n },\n }\n}\n\nCACHES = {\n 'default': {\n 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',\n 'LOCATION': '127.0.0.1:11211',\n }\n}\n\nSESSION_ENGINE = \"django.contrib.sessions.backends.cached_db\"\n\nAUTH_PROFILE_MODULE = 'member.Profile'\nLOGIN_URL = '/membres/connexion'\n\nABSOLUTE_URL_OVERRIDES = {\n 'auth.user': lambda u: '/membres/voir/{0}/'.format(u.username.encode('utf-8'))\n}\n\n\n# Django fileserve settings (set to True for local dev version only)\nSERVE = False\n\n# Max size image upload (in bytes)\nIMAGE_MAX_SIZE = 1024 * 1024 * 2\n\n# git directory\nREPO_PATH = os.path.join(SITE_ROOT, 'tutoriels-private')\nREPO_PATH_PROD = os.path.join(SITE_ROOT, 'tutoriels-public')\nREPO_ARTICLE_PATH = os.path.join(SITE_ROOT, 'articles-data')\n\n# Constants for pagination\nPOSTS_PER_PAGE = 21\nTOPICS_PER_PAGE = 21\nMEMBERS_PER_PAGE = 36\n\n# Constants to avoid spam\nSPAM_LIMIT_SECONDS = 60 * 15\nSPAM_LIMIT_PARTICIPANT = 2\nFOLLOWED_TOPICS_PER_PAGE = 21\n\nBOT_ACCOUNT = 'admin'\n\nPANDOC_LOC = ''\n\nHAYSTACK_CONNECTIONS = {\n 'default': {\n 'ENGINE': 'haystack.backends.solr_backend.SolrEngine',\n 'URL': 'http://127.0.0.1:8983/solr'\n # ...or for multicore...\n # 'URL': 'http://127.0.0.1:8983/solr/mysite',\n },\n}\n\nGEOIP_PATH = os.path.join(SITE_ROOT, 'geodata')\n\nfrom django.contrib.messages import constants as message_constants\nMESSAGE_TAGS = {\n 
message_constants.DEBUG: 'debug',\n message_constants.INFO: 'info',\n message_constants.SUCCESS: 'success',\n message_constants.WARNING: 'warning',\n message_constants.ERROR: 'alert',\n}\n\n\nMAX_POST_LENGTH = 1000000\nSDZ_TUTO_DIR = ''\n\nMAIL_CA_ASSO = '[email protected]'\n\n# Load the production settings, overwrite the existing ones if needed\ntry:\n from settings_prod import *\nexcept ImportError:\n pass\n", "path": "zds/settings.py"}], "after_files": [{"content": "# coding: utf-8\n\nimport locale\nimport os\nimport platform\n\n\n# Python is platform-independent...or is it?\nif platform.system() == \"Windows\":\n locale.setlocale(locale.LC_TIME, 'fra')\nelse:\n locale.setlocale(locale.LC_TIME, 'fr_FR.UTF-8')\n\nDEBUG = True\nTEMPLATE_DEBUG = DEBUG\n# INTERNAL_IPS = ('127.0.0.1',) # debug toolbar\n\n\nADMINS = (\n ('user', 'mail'),\n)\n\nMANAGERS = ADMINS\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.sqlite3',\n 'NAME': 'base.db',\n 'USER': '',\n 'PASSWORD': '',\n 'HOST': '',\n 'PORT': '',\n }\n}\n\n# Local time zone for this installation. Choices can be found here:\n# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name\n# although not all choices may be available on all operating systems.\n# In a Windows environment this must be set to your system time zone.\nTIME_ZONE = 'Europe/Paris'\n\n# Language code for this installation. All choices can be found here:\n# http://www.i18nguy.com/unicode/language-identifiers.html\nLANGUAGE_CODE = 'fr-fr'\n\nSITE_ID = 1\n\n# If you set this to False, Django will make some optimizations so as not\n# to load the internationalization machinery.\nUSE_I18N = True\n\n# If you set this to False, Django will not format dates, numbers and\n# calendars according to the current locale.\nUSE_L10N = True\n\n# If you set this to False, Django will not use timezone-aware datetimes.\nUSE_TZ = False\n\nSITE_ROOT = os.path.realpath(os.path.dirname(os.path.dirname(__file__)))\n\nSITE_URL = 'http://127.0.0.1:8000'\n\n# Absolute filesystem path to the directory that will hold user-uploaded files.\n# Example: \"/home/media/media.lawrence.com/media/\"\nMEDIA_ROOT = os.path.join(SITE_ROOT, 'media')\n\n# URL that handles the media served from MEDIA_ROOT. 
Make sure to use a\n# trailing slash.\n# Examples: \"http://media.lawrence.com/media/\", \"http://example.com/media/\"\nMEDIA_URL = '/media/'\n\n# Absolute path to the directory static files should be collected to.\n# Don't put anything in this directory yourself; store your static files\n# in apps' \"static/\" subdirectories and in STATICFILES_DIRS.\n# Example: \"/home/media/media.lawrence.com/static/\"\nSTATIC_ROOT = os.path.join(SITE_ROOT, 'static')\n\n# URL prefix for static files.\n# Example: \"http://media.lawrence.com/static/\"\nSTATIC_URL = '/static/'\n\n# Additional locations of static files\nSTATICFILES_DIRS = (\n # Put strings here, like \"/home/html/static\" or \"C:/www/django/static\".\n # Always use forward slashes, even on Windows.\n # Don't forget to use absolute paths, not relative paths.\n os.path.join(SITE_ROOT, 'assets'),\n)\n\n# List of finder classes that know how to find static files in\n# various locations.\nSTATICFILES_FINDERS = (\n 'django.contrib.staticfiles.finders.FileSystemFinder',\n 'django.contrib.staticfiles.finders.AppDirectoriesFinder',\n # 'django.contrib.staticfiles.finders.DefaultStorageFinder',\n)\n\nSTATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'\n\nFIXTURE_DIRS = (os.path.join(SITE_ROOT, 'fixtures'))\n# You will need yuglify to be installed\nPIPELINE_JS = {\n 'modernizr' : {\n 'source_filenames': (\n 'js/vendor/custom.modernizr.js',\n ),\n 'output_filename': 'js/modernizr.js'\n }, \n 'css3-mediaqueries' : {\n 'source_filenames': (\n 'js/vendor/css3-mediaqueries.js',\n ),\n 'output_filename': 'js/css3-mediaqueries.js'\n },\n 'main-js': {\n 'source_filenames': (\n 'js/vendor/jquery.js',\n 'js/vendor/jquery.tabbable.js',\n\n 'js/custom/editor.js',\n\n 'js/custom/mobile-menu.js',\n 'js/custom/accessibility-links.js',\n 'js/custom/dropdown-menu.js',\n 'js/custom/data-click.js',\n 'js/custom/accordeon.js',\n 'js/custom/modal.js',\n 'js/custom/close-alert-box.js',\n 'js/custom/keyboard-navigation.js',\n 'js/custom/message-hidden.js',\n 'js/custom/spoiler.js',\n 'js/custom/karma-ajax.js',\n 'js/custom/markdown-help.js',\n ),\n 'output_filename': 'js/main.js'\n }\n}\n\nPIPELINE_CSS = {\n 'main-css': {\n 'source_filenames': (\n 'css/main.css',\n ),\n 'output_filename': 'css/design.css'\n }\n}\n\n# Make this unique, and don't share it with anybody.\nSECRET_KEY = 'n!01nl+318#x75_%le8#s0=-*ysw&y49uc#t=*wvi(9hnyii0z'\n\n# List of callables that know how to import templates from various sources.\nTEMPLATE_LOADERS = (\n 'django.template.loaders.filesystem.Loader',\n 'django.template.loaders.app_directories.Loader',\n # 'django.template.loaders.eggs.Loader',\n)\n\nFILE_UPLOAD_HANDLERS = (\n \"django.core.files.uploadhandler.MemoryFileUploadHandler\",\n \"django.core.files.uploadhandler.TemporaryFileUploadHandler\",\n)\n\nMIDDLEWARE_CLASSES = (\n 'django.middleware.common.CommonMiddleware',\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n # Uncomment the next line for simple clickjacking protection:\n # 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n 'zds.utils.ThreadLocals',\n 'zds.middlewares.SetLastVisitMiddleware.SetLastVisitMiddleware',\n)\n\nROOT_URLCONF = 'zds.urls'\n\n# Python dotted path to the WSGI application used by Django's runserver.\nWSGI_APPLICATION = 'zds.wsgi.application'\n\nTEMPLATE_DIRS = [\n # Put strings here, like \"/home/html/django_templates\" or 
\"C:/www/django/templates\".\n # Always use forward slashes, even on Windows.\n # Don't forget to use absolute paths, not relative paths.\n os.path.join(SITE_ROOT, 'templates')\n]\n\nTEMPLATE_CONTEXT_PROCESSORS = (\n # Default context processors\n 'django.contrib.auth.context_processors.auth',\n 'django.core.context_processors.debug',\n 'django.core.context_processors.i18n',\n 'django.core.context_processors.media',\n 'django.core.context_processors.static',\n 'django.core.context_processors.request',\n 'django.core.context_processors.tz',\n 'django.contrib.messages.context_processors.messages'\n)\n\nCRISPY_TEMPLATE_PACK='bootstrap'\n\nINSTALLED_APPS = (\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.sites',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django.contrib.sitemaps',\n 'django.contrib.humanize',\n\n 'south',\n 'crispy_forms',\n 'email_obfuscator',\n 'pipeline',\n 'haystack',\n 'munin',\n\n # Apps DB tables are created in THIS order by default\n # --> Order is CRITICAL to properly handle foreign keys\n 'zds.utils',\n 'zds.pages',\n 'zds.gallery',\n 'zds.mp',\n 'zds.newsletter',\n 'zds.article',\n 'zds.forum',\n 'zds.tutorial',\n 'zds.member',\n # Uncomment the next line to enable the admin:\n 'django.contrib.admin',\n # Uncomment the next line to enable admin documentation:\n # 'django.contrib.admindocs',\n)\nif (DEBUG):\n INSTALLED_APPS += (\n 'debug_toolbar',\n )\n\n# A sample logging configuration. The only tangible logging\n# performed by this configuration is to send an email to\n# the site admins on every HTTP 500 error when DEBUG=False.\n# See http://docs.djangoproject.com/en/dev/topics/logging for\n# more details on how to customize your logging configuration.\nLOGGING = {\n 'version': 1,\n 'disable_existing_loggers': False,\n 'filters': {\n 'require_debug_false': {\n '()': 'django.utils.log.RequireDebugFalse'\n }\n },\n 'handlers': {\n 'mail_admins': {\n 'level': 'ERROR',\n 'filters': ['require_debug_false'],\n 'class': 'django.utils.log.AdminEmailHandler'\n }\n },\n 'loggers': {\n 'django.request': {\n 'handlers': ['mail_admins'],\n 'level': 'ERROR',\n 'propagate': True,\n },\n }\n}\n\nCACHES = {\n 'default': {\n 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',\n 'LOCATION': '127.0.0.1:11211',\n }\n}\n\nSESSION_ENGINE = \"django.contrib.sessions.backends.cached_db\"\n\nAUTH_PROFILE_MODULE = 'member.Profile'\nLOGIN_URL = '/membres/connexion'\n\nABSOLUTE_URL_OVERRIDES = {\n 'auth.user': lambda u: '/membres/voir/{0}/'.format(u.username.encode('utf-8'))\n}\n\n\n# Django fileserve settings (set to True for local dev version only)\nSERVE = False\n\n# Max size image upload (in bytes)\nIMAGE_MAX_SIZE = 1024 * 1024 * 2\n\n# git directory\nREPO_PATH = os.path.join(SITE_ROOT, 'tutoriels-private')\nREPO_PATH_PROD = os.path.join(SITE_ROOT, 'tutoriels-public')\nREPO_ARTICLE_PATH = os.path.join(SITE_ROOT, 'articles-data')\n\n# Constants for pagination\nPOSTS_PER_PAGE = 21\nTOPICS_PER_PAGE = 21\nMEMBERS_PER_PAGE = 36\n\n# Constants to avoid spam\nSPAM_LIMIT_SECONDS = 60 * 15\nSPAM_LIMIT_PARTICIPANT = 2\nFOLLOWED_TOPICS_PER_PAGE = 21\n\nBOT_ACCOUNT = 'admin'\n\nPANDOC_LOC = ''\n\nHAYSTACK_CONNECTIONS = {\n 'default': {\n 'ENGINE': 'haystack.backends.solr_backend.SolrEngine',\n 'URL': 'http://127.0.0.1:8983/solr'\n # ...or for multicore...\n # 'URL': 'http://127.0.0.1:8983/solr/mysite',\n },\n}\n\nGEOIP_PATH = os.path.join(SITE_ROOT, 'geodata')\n\nfrom django.contrib.messages import constants as 
message_constants\nMESSAGE_TAGS = {\n message_constants.DEBUG: 'debug',\n message_constants.INFO: 'info',\n message_constants.SUCCESS: 'success',\n message_constants.WARNING: 'warning',\n message_constants.ERROR: 'alert',\n}\n\n\nMAX_POST_LENGTH = 1000000\nSDZ_TUTO_DIR = ''\n\nMAIL_CA_ASSO = '[email protected]'\n\n# Load the production settings, overwrite the existing ones if needed\ntry:\n from settings_prod import *\nexcept ImportError:\n pass\n", "path": "zds/settings.py"}]} | 3,772 | 102 |
gh_patches_debug_17304 | rasdani/github-patches | git_diff | networkx__networkx-4339 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update plot_antigraph.py example to remove `_iter` in method name.
`def adjacency_iter(self)` should be `def adjacency(self)`
There may be other places (especially in the examples) where we've missed an ```_iter``` update.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/subclass/plot_antigraph.py`
Content:
```
1 """
2 =========
3 Antigraph
4 =========
5
6 Complement graph class for small footprint when working on dense graphs.
7
8 This class allows you to add the edges that *do not exist* in the dense
9 graph. However, when applying algorithms to this complement graph data
10 structure, it behaves as if it were the dense version. So it can be used
11 directly in several NetworkX algorithms.
12
13 This subclass has only been tested for k-core, connected_components,
14 and biconnected_components algorithms but might also work for other
15 algorithms.
16
17 """
18 import networkx as nx
19 from networkx.exception import NetworkXError
20 import matplotlib.pyplot as plt
21
22
23 class AntiGraph(nx.Graph):
24 """
25 Class for complement graphs.
26
27 The main goal is to be able to work with big and dense graphs with
28 a low memory footprint.
29
30 In this class you add the edges that *do not exist* in the dense graph,
31 the report methods of the class return the neighbors, the edges and
32 the degree as if it was the dense graph. Thus it's possible to use
33 an instance of this class with some of NetworkX functions.
34 """
35
36 all_edge_dict = {"weight": 1}
37
38 def single_edge_dict(self):
39 return self.all_edge_dict
40
41 edge_attr_dict_factory = single_edge_dict
42
43 def __getitem__(self, n):
44 """Return a dict of neighbors of node n in the dense graph.
45
46 Parameters
47 ----------
48 n : node
49 A node in the graph.
50
51 Returns
52 -------
53 adj_dict : dictionary
54 The adjacency dictionary for nodes connected to n.
55
56 """
57 return {
58 node: self.all_edge_dict for node in set(self.adj) - set(self.adj[n]) - {n}
59 }
60
61 def neighbors(self, n):
62 """Return an iterator over all neighbors of node n in the
63 dense graph.
64
65 """
66 try:
67 return iter(set(self.adj) - set(self.adj[n]) - {n})
68 except KeyError as e:
69 raise NetworkXError(f"The node {n} is not in the graph.") from e
70
71 def degree(self, nbunch=None, weight=None):
72 """Return an iterator for (node, degree) in the dense graph.
73
74 The node degree is the number of edges adjacent to the node.
75
76 Parameters
77 ----------
78 nbunch : iterable container, optional (default=all nodes)
79 A container of nodes. The container will be iterated
80 through once.
81
82 weight : string or None, optional (default=None)
83 The edge attribute that holds the numerical value used
84 as a weight. If None, then each edge has weight 1.
85 The degree is the sum of the edge weights adjacent to the node.
86
87 Returns
88 -------
89 nd_iter : iterator
90 The iterator returns two-tuples of (node, degree).
91
92 See Also
93 --------
94 degree
95
96 Examples
97 --------
98 >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
99 >>> list(G.degree(0)) # node 0 with degree 1
100 [(0, 1)]
101 >>> list(G.degree([0, 1]))
102 [(0, 1), (1, 2)]
103
104 """
105 if nbunch is None:
106 nodes_nbrs = (
107 (
108 n,
109 {
110 v: self.all_edge_dict
111 for v in set(self.adj) - set(self.adj[n]) - {n}
112 },
113 )
114 for n in self.nodes()
115 )
116 elif nbunch in self:
117 nbrs = set(self.nodes()) - set(self.adj[nbunch]) - {nbunch}
118 return len(nbrs)
119 else:
120 nodes_nbrs = (
121 (
122 n,
123 {
124 v: self.all_edge_dict
125 for v in set(self.nodes()) - set(self.adj[n]) - {n}
126 },
127 )
128 for n in self.nbunch_iter(nbunch)
129 )
130
131 if weight is None:
132 return ((n, len(nbrs)) for n, nbrs in nodes_nbrs)
133 else:
134 # AntiGraph is a ThinGraph so all edges have weight 1
135 return (
136 (n, sum((nbrs[nbr].get(weight, 1)) for nbr in nbrs))
137 for n, nbrs in nodes_nbrs
138 )
139
140 def adjacency_iter(self):
141 """Return an iterator of (node, adjacency set) tuples for all nodes
142 in the dense graph.
143
144 This is the fastest way to look at every edge.
145 For directed graphs, only outgoing adjacencies are included.
146
147 Returns
148 -------
149 adj_iter : iterator
150 An iterator of (node, adjacency set) for all nodes in
151 the graph.
152
153 """
154 for n in self.adj:
155 yield (n, set(self.adj) - set(self.adj[n]) - {n})
156
157
158 # Build several pairs of graphs, a regular graph
159 # and the AntiGraph of it's complement, which behaves
160 # as if it were the original graph.
161 Gnp = nx.gnp_random_graph(20, 0.8, seed=42)
162 Anp = AntiGraph(nx.complement(Gnp))
163 Gd = nx.davis_southern_women_graph()
164 Ad = AntiGraph(nx.complement(Gd))
165 Gk = nx.karate_club_graph()
166 Ak = AntiGraph(nx.complement(Gk))
167 pairs = [(Gnp, Anp), (Gd, Ad), (Gk, Ak)]
168 # test connected components
169 for G, A in pairs:
170 gc = [set(c) for c in nx.connected_components(G)]
171 ac = [set(c) for c in nx.connected_components(A)]
172 for comp in ac:
173 assert comp in gc
174 # test biconnected components
175 for G, A in pairs:
176 gc = [set(c) for c in nx.biconnected_components(G)]
177 ac = [set(c) for c in nx.biconnected_components(A)]
178 for comp in ac:
179 assert comp in gc
180 # test degree
181 for G, A in pairs:
182 node = list(G.nodes())[0]
183 nodes = list(G.nodes())[1:4]
184 assert G.degree(node) == A.degree(node)
185 assert sum(d for n, d in G.degree()) == sum(d for n, d in A.degree())
186 # AntiGraph is a ThinGraph, so all the weights are 1
187 assert sum(d for n, d in A.degree()) == sum(d for n, d in A.degree(weight="weight"))
188 assert sum(d for n, d in G.degree(nodes)) == sum(d for n, d in A.degree(nodes))
189
190 pos = nx.spring_layout(G, seed=268) # Seed for reproducible layout
191 nx.draw(Gnp, pos=pos)
192 plt.show()
193
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/subclass/plot_antigraph.py b/examples/subclass/plot_antigraph.py
--- a/examples/subclass/plot_antigraph.py
+++ b/examples/subclass/plot_antigraph.py
@@ -137,7 +137,7 @@
for n, nbrs in nodes_nbrs
)
- def adjacency_iter(self):
+ def adjacency(self):
"""Return an iterator of (node, adjacency set) tuples for all nodes
in the dense graph.
@@ -149,10 +149,10 @@
adj_iter : iterator
An iterator of (node, adjacency set) for all nodes in
the graph.
-
"""
- for n in self.adj:
- yield (n, set(self.adj) - set(self.adj[n]) - {n})
+ nodes = set(self.adj)
+ for n, nbrs in self.adj.items():
+ yield (n, nodes - set(nbrs) - {n})
# Build several pairs of graphs, a regular graph
| {"golden_diff": "diff --git a/examples/subclass/plot_antigraph.py b/examples/subclass/plot_antigraph.py\n--- a/examples/subclass/plot_antigraph.py\n+++ b/examples/subclass/plot_antigraph.py\n@@ -137,7 +137,7 @@\n for n, nbrs in nodes_nbrs\n )\n \n- def adjacency_iter(self):\n+ def adjacency(self):\n \"\"\"Return an iterator of (node, adjacency set) tuples for all nodes\n in the dense graph.\n \n@@ -149,10 +149,10 @@\n adj_iter : iterator\n An iterator of (node, adjacency set) for all nodes in\n the graph.\n-\n \"\"\"\n- for n in self.adj:\n- yield (n, set(self.adj) - set(self.adj[n]) - {n})\n+ nodes = set(self.adj)\n+ for n, nbrs in self.adj.items():\n+ yield (n, nodes - set(nbrs) - {n})\n \n \n # Build several pairs of graphs, a regular graph\n", "issue": "Update plot_antigraph.py example to remove `_iter` in method name.\n`def adjacency_iter(self)` should be `def adjacency(self)`\r\nThere may be other places (especially in the examples) where we've missed an ```_iter``` update.\n", "before_files": [{"content": "\"\"\"\n=========\nAntigraph\n=========\n\nComplement graph class for small footprint when working on dense graphs.\n\nThis class allows you to add the edges that *do not exist* in the dense\ngraph. However, when applying algorithms to this complement graph data\nstructure, it behaves as if it were the dense version. So it can be used\ndirectly in several NetworkX algorithms.\n\nThis subclass has only been tested for k-core, connected_components,\nand biconnected_components algorithms but might also work for other\nalgorithms.\n\n\"\"\"\nimport networkx as nx\nfrom networkx.exception import NetworkXError\nimport matplotlib.pyplot as plt\n\n\nclass AntiGraph(nx.Graph):\n \"\"\"\n Class for complement graphs.\n\n The main goal is to be able to work with big and dense graphs with\n a low memory footprint.\n\n In this class you add the edges that *do not exist* in the dense graph,\n the report methods of the class return the neighbors, the edges and\n the degree as if it was the dense graph. Thus it's possible to use\n an instance of this class with some of NetworkX functions.\n \"\"\"\n\n all_edge_dict = {\"weight\": 1}\n\n def single_edge_dict(self):\n return self.all_edge_dict\n\n edge_attr_dict_factory = single_edge_dict\n\n def __getitem__(self, n):\n \"\"\"Return a dict of neighbors of node n in the dense graph.\n\n Parameters\n ----------\n n : node\n A node in the graph.\n\n Returns\n -------\n adj_dict : dictionary\n The adjacency dictionary for nodes connected to n.\n\n \"\"\"\n return {\n node: self.all_edge_dict for node in set(self.adj) - set(self.adj[n]) - {n}\n }\n\n def neighbors(self, n):\n \"\"\"Return an iterator over all neighbors of node n in the\n dense graph.\n\n \"\"\"\n try:\n return iter(set(self.adj) - set(self.adj[n]) - {n})\n except KeyError as e:\n raise NetworkXError(f\"The node {n} is not in the graph.\") from e\n\n def degree(self, nbunch=None, weight=None):\n \"\"\"Return an iterator for (node, degree) in the dense graph.\n\n The node degree is the number of edges adjacent to the node.\n\n Parameters\n ----------\n nbunch : iterable container, optional (default=all nodes)\n A container of nodes. The container will be iterated\n through once.\n\n weight : string or None, optional (default=None)\n The edge attribute that holds the numerical value used\n as a weight. 
If None, then each edge has weight 1.\n The degree is the sum of the edge weights adjacent to the node.\n\n Returns\n -------\n nd_iter : iterator\n The iterator returns two-tuples of (node, degree).\n\n See Also\n --------\n degree\n\n Examples\n --------\n >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc\n >>> list(G.degree(0)) # node 0 with degree 1\n [(0, 1)]\n >>> list(G.degree([0, 1]))\n [(0, 1), (1, 2)]\n\n \"\"\"\n if nbunch is None:\n nodes_nbrs = (\n (\n n,\n {\n v: self.all_edge_dict\n for v in set(self.adj) - set(self.adj[n]) - {n}\n },\n )\n for n in self.nodes()\n )\n elif nbunch in self:\n nbrs = set(self.nodes()) - set(self.adj[nbunch]) - {nbunch}\n return len(nbrs)\n else:\n nodes_nbrs = (\n (\n n,\n {\n v: self.all_edge_dict\n for v in set(self.nodes()) - set(self.adj[n]) - {n}\n },\n )\n for n in self.nbunch_iter(nbunch)\n )\n\n if weight is None:\n return ((n, len(nbrs)) for n, nbrs in nodes_nbrs)\n else:\n # AntiGraph is a ThinGraph so all edges have weight 1\n return (\n (n, sum((nbrs[nbr].get(weight, 1)) for nbr in nbrs))\n for n, nbrs in nodes_nbrs\n )\n\n def adjacency_iter(self):\n \"\"\"Return an iterator of (node, adjacency set) tuples for all nodes\n in the dense graph.\n\n This is the fastest way to look at every edge.\n For directed graphs, only outgoing adjacencies are included.\n\n Returns\n -------\n adj_iter : iterator\n An iterator of (node, adjacency set) for all nodes in\n the graph.\n\n \"\"\"\n for n in self.adj:\n yield (n, set(self.adj) - set(self.adj[n]) - {n})\n\n\n# Build several pairs of graphs, a regular graph\n# and the AntiGraph of it's complement, which behaves\n# as if it were the original graph.\nGnp = nx.gnp_random_graph(20, 0.8, seed=42)\nAnp = AntiGraph(nx.complement(Gnp))\nGd = nx.davis_southern_women_graph()\nAd = AntiGraph(nx.complement(Gd))\nGk = nx.karate_club_graph()\nAk = AntiGraph(nx.complement(Gk))\npairs = [(Gnp, Anp), (Gd, Ad), (Gk, Ak)]\n# test connected components\nfor G, A in pairs:\n gc = [set(c) for c in nx.connected_components(G)]\n ac = [set(c) for c in nx.connected_components(A)]\n for comp in ac:\n assert comp in gc\n# test biconnected components\nfor G, A in pairs:\n gc = [set(c) for c in nx.biconnected_components(G)]\n ac = [set(c) for c in nx.biconnected_components(A)]\n for comp in ac:\n assert comp in gc\n# test degree\nfor G, A in pairs:\n node = list(G.nodes())[0]\n nodes = list(G.nodes())[1:4]\n assert G.degree(node) == A.degree(node)\n assert sum(d for n, d in G.degree()) == sum(d for n, d in A.degree())\n # AntiGraph is a ThinGraph, so all the weights are 1\n assert sum(d for n, d in A.degree()) == sum(d for n, d in A.degree(weight=\"weight\"))\n assert sum(d for n, d in G.degree(nodes)) == sum(d for n, d in A.degree(nodes))\n\npos = nx.spring_layout(G, seed=268) # Seed for reproducible layout\nnx.draw(Gnp, pos=pos)\nplt.show()\n", "path": "examples/subclass/plot_antigraph.py"}], "after_files": [{"content": "\"\"\"\n=========\nAntigraph\n=========\n\nComplement graph class for small footprint when working on dense graphs.\n\nThis class allows you to add the edges that *do not exist* in the dense\ngraph. However, when applying algorithms to this complement graph data\nstructure, it behaves as if it were the dense version. 
So it can be used\ndirectly in several NetworkX algorithms.\n\nThis subclass has only been tested for k-core, connected_components,\nand biconnected_components algorithms but might also work for other\nalgorithms.\n\n\"\"\"\nimport networkx as nx\nfrom networkx.exception import NetworkXError\nimport matplotlib.pyplot as plt\n\n\nclass AntiGraph(nx.Graph):\n \"\"\"\n Class for complement graphs.\n\n The main goal is to be able to work with big and dense graphs with\n a low memory footprint.\n\n In this class you add the edges that *do not exist* in the dense graph,\n the report methods of the class return the neighbors, the edges and\n the degree as if it was the dense graph. Thus it's possible to use\n an instance of this class with some of NetworkX functions.\n \"\"\"\n\n all_edge_dict = {\"weight\": 1}\n\n def single_edge_dict(self):\n return self.all_edge_dict\n\n edge_attr_dict_factory = single_edge_dict\n\n def __getitem__(self, n):\n \"\"\"Return a dict of neighbors of node n in the dense graph.\n\n Parameters\n ----------\n n : node\n A node in the graph.\n\n Returns\n -------\n adj_dict : dictionary\n The adjacency dictionary for nodes connected to n.\n\n \"\"\"\n return {\n node: self.all_edge_dict for node in set(self.adj) - set(self.adj[n]) - {n}\n }\n\n def neighbors(self, n):\n \"\"\"Return an iterator over all neighbors of node n in the\n dense graph.\n\n \"\"\"\n try:\n return iter(set(self.adj) - set(self.adj[n]) - {n})\n except KeyError as e:\n raise NetworkXError(f\"The node {n} is not in the graph.\") from e\n\n def degree(self, nbunch=None, weight=None):\n \"\"\"Return an iterator for (node, degree) in the dense graph.\n\n The node degree is the number of edges adjacent to the node.\n\n Parameters\n ----------\n nbunch : iterable container, optional (default=all nodes)\n A container of nodes. The container will be iterated\n through once.\n\n weight : string or None, optional (default=None)\n The edge attribute that holds the numerical value used\n as a weight. 
If None, then each edge has weight 1.\n The degree is the sum of the edge weights adjacent to the node.\n\n Returns\n -------\n nd_iter : iterator\n The iterator returns two-tuples of (node, degree).\n\n See Also\n --------\n degree\n\n Examples\n --------\n >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc\n >>> list(G.degree(0)) # node 0 with degree 1\n [(0, 1)]\n >>> list(G.degree([0, 1]))\n [(0, 1), (1, 2)]\n\n \"\"\"\n if nbunch is None:\n nodes_nbrs = (\n (\n n,\n {\n v: self.all_edge_dict\n for v in set(self.adj) - set(self.adj[n]) - {n}\n },\n )\n for n in self.nodes()\n )\n elif nbunch in self:\n nbrs = set(self.nodes()) - set(self.adj[nbunch]) - {nbunch}\n return len(nbrs)\n else:\n nodes_nbrs = (\n (\n n,\n {\n v: self.all_edge_dict\n for v in set(self.nodes()) - set(self.adj[n]) - {n}\n },\n )\n for n in self.nbunch_iter(nbunch)\n )\n\n if weight is None:\n return ((n, len(nbrs)) for n, nbrs in nodes_nbrs)\n else:\n # AntiGraph is a ThinGraph so all edges have weight 1\n return (\n (n, sum((nbrs[nbr].get(weight, 1)) for nbr in nbrs))\n for n, nbrs in nodes_nbrs\n )\n\n def adjacency(self):\n \"\"\"Return an iterator of (node, adjacency set) tuples for all nodes\n in the dense graph.\n\n This is the fastest way to look at every edge.\n For directed graphs, only outgoing adjacencies are included.\n\n Returns\n -------\n adj_iter : iterator\n An iterator of (node, adjacency set) for all nodes in\n the graph.\n \"\"\"\n nodes = set(self.adj)\n for n, nbrs in self.adj.items():\n yield (n, nodes - set(nbrs) - {n})\n\n\n# Build several pairs of graphs, a regular graph\n# and the AntiGraph of it's complement, which behaves\n# as if it were the original graph.\nGnp = nx.gnp_random_graph(20, 0.8, seed=42)\nAnp = AntiGraph(nx.complement(Gnp))\nGd = nx.davis_southern_women_graph()\nAd = AntiGraph(nx.complement(Gd))\nGk = nx.karate_club_graph()\nAk = AntiGraph(nx.complement(Gk))\npairs = [(Gnp, Anp), (Gd, Ad), (Gk, Ak)]\n# test connected components\nfor G, A in pairs:\n gc = [set(c) for c in nx.connected_components(G)]\n ac = [set(c) for c in nx.connected_components(A)]\n for comp in ac:\n assert comp in gc\n# test biconnected components\nfor G, A in pairs:\n gc = [set(c) for c in nx.biconnected_components(G)]\n ac = [set(c) for c in nx.biconnected_components(A)]\n for comp in ac:\n assert comp in gc\n# test degree\nfor G, A in pairs:\n node = list(G.nodes())[0]\n nodes = list(G.nodes())[1:4]\n assert G.degree(node) == A.degree(node)\n assert sum(d for n, d in G.degree()) == sum(d for n, d in A.degree())\n # AntiGraph is a ThinGraph, so all the weights are 1\n assert sum(d for n, d in A.degree()) == sum(d for n, d in A.degree(weight=\"weight\"))\n assert sum(d for n, d in G.degree(nodes)) == sum(d for n, d in A.degree(nodes))\n\npos = nx.spring_layout(G, seed=268) # Seed for reproducible layout\nnx.draw(Gnp, pos=pos)\nplt.show()\n", "path": "examples/subclass/plot_antigraph.py"}]} | 2,259 | 230 |
gh_patches_debug_28730 | rasdani/github-patches | git_diff | pytorch__ignite-2027 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Loss metric to use required_output_keys
## 🚀 Feature
Currently, if we have custom metrics that require data other than `y_pred` and `y`, [we suggest](https://discuss.pytorch.org/t/how-access-inputs-in-custom-ignite-metric/91221/6) doing the following:
```python
metrics = {
"Accuracy": Accuracy(),
"Loss": Loss(criterion, output_transform=lambda out_dict: (out_dict["y_pred"], out_dict["y"])),
"CustomMetric": CustomMetric()
}
evaluator = create_supervised_evaluator(
model,
metrics=metrics,
output_transform=lambda x, y, y_pred: {"x": x, "y": y, "y_pred": y_pred}
)
```
where `CustomMetric` is defined as
```python
class CustomMetric(Metric):
required_output_keys = ("y_pred", "y", "x")
```
The idea is to extend this so that the `Loss` metric also supports `required_output_keys`. The main difficulty with `Loss` is its optional `(prediction, target, kwargs)` input, where `kwargs` is a dict of extra arguments for the criterion function.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ignite/metrics/loss.py`
Content:
```
1 from typing import Callable, Dict, Sequence, Tuple, Union, cast
2
3 import torch
4
5 from ignite.exceptions import NotComputableError
6 from ignite.metrics.metric import Metric, reinit__is_reduced, sync_all_reduce
7
8 __all__ = ["Loss"]
9
10
11 class Loss(Metric):
12 """
13 Calculates the average loss according to the passed loss_fn.
14
15 Args:
16 loss_fn: a callable taking a prediction tensor, a target
17 tensor, optionally other arguments, and returns the average loss
18 over all observations in the batch.
19 output_transform: a callable that is used to transform the
20 :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the
21 form expected by the metric.
22 This can be useful if, for example, you have a multi-output model and
23 you want to compute the metric with respect to one of the outputs.
24 The output is expected to be a tuple `(prediction, target)` or
25 (prediction, target, kwargs) where kwargs is a dictionary of extra
26 keywords arguments. If extra keywords arguments are provided they are passed to `loss_fn`.
27 batch_size: a callable taking a target tensor that returns the
28 first dimension size (usually the batch size).
29 device: specifies which device updates are accumulated on. Setting the
30 metric's device to be the same as your ``update`` arguments ensures the ``update`` method is
31 non-blocking. By default, CPU.
32
33 """
34
35 required_output_keys = None
36
37 def __init__(
38 self,
39 loss_fn: Callable,
40 output_transform: Callable = lambda x: x,
41 batch_size: Callable = len,
42 device: Union[str, torch.device] = torch.device("cpu"),
43 ):
44 super(Loss, self).__init__(output_transform, device=device)
45 self._loss_fn = loss_fn
46 self._batch_size = batch_size
47
48 @reinit__is_reduced
49 def reset(self) -> None:
50 self._sum = torch.tensor(0.0, device=self._device)
51 self._num_examples = 0
52
53 @reinit__is_reduced
54 def update(self, output: Sequence[Union[torch.Tensor, Dict]]) -> None:
55 if len(output) == 2:
56 y_pred, y = cast(Tuple[torch.Tensor, torch.Tensor], output)
57 kwargs = {} # type: Dict
58 else:
59 y_pred, y, kwargs = cast(Tuple[torch.Tensor, torch.Tensor, Dict], output)
60 average_loss = self._loss_fn(y_pred, y, **kwargs).detach()
61
62 if len(average_loss.shape) != 0:
63 raise ValueError("loss_fn did not return the average loss.")
64
65 n = self._batch_size(y)
66 self._sum += average_loss.to(self._device) * n
67 self._num_examples += n
68
69 @sync_all_reduce("_sum", "_num_examples")
70 def compute(self) -> float:
71 if self._num_examples == 0:
72 raise NotComputableError("Loss must have at least one example before it can be computed.")
73 return self._sum.item() / self._num_examples
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ignite/metrics/loss.py b/ignite/metrics/loss.py
--- a/ignite/metrics/loss.py
+++ b/ignite/metrics/loss.py
@@ -30,9 +30,52 @@
metric's device to be the same as your ``update`` arguments ensures the ``update`` method is
non-blocking. By default, CPU.
+ Attributes:
+ required_output_keys: dictionary defines required keys to be found in ``engine.state.output`` if the
+ latter is a dictionary. Default, ``("y_pred", "y", "criterion_kwargs")``. This is useful when the
+ criterion function requires additional arguments, which can be passed using ``criterion_kwargs``.
+ See notes below for an example.
+
+ Note:
+
+ Let's implement a Loss metric that requires ``x``, ``y_pred``, ``y`` and ``criterion_kwargs`` as input
+ for ``criterion`` function. In the example below we show how to setup standard metric like Accuracy
+ and the Loss metric using an ``evaluator`` created with
+ :meth:`~ignite.engine.create_supervised_evaluator` method.
+
+ .. code-block:: python
+
+ import torch
+ import torch.nn as nn
+ from torch.nn.functional import nll_loss
+
+ from ignite.metrics import Accuracy, Loss
+ from ignite.engine import create_supervised_evaluator
+
+ model = ...
+
+ criterion = nll_loss
+
+ metrics = {
+ "Accuracy": Accuracy(),
+ "Loss": Loss(criterion)
+ }
+
+ # global criterion kwargs
+ criterion_kwargs = {...}
+
+ evaluator = create_supervised_evaluator(
+ model,
+ metrics=metrics,
+ output_transform=lambda x, y, y_pred: {
+ "x": x, "y": y, "y_pred": y_pred, "criterion_kwargs": criterion_kwargs}
+ )
+
+ res = evaluator.run(data)
+
"""
- required_output_keys = None
+ required_output_keys = ("y_pred", "y", "criterion_kwargs")
def __init__(
self,
| {"golden_diff": "diff --git a/ignite/metrics/loss.py b/ignite/metrics/loss.py\n--- a/ignite/metrics/loss.py\n+++ b/ignite/metrics/loss.py\n@@ -30,9 +30,52 @@\n metric's device to be the same as your ``update`` arguments ensures the ``update`` method is\n non-blocking. By default, CPU.\n \n+ Attributes:\n+ required_output_keys: dictionary defines required keys to be found in ``engine.state.output`` if the\n+ latter is a dictionary. Default, ``(\"y_pred\", \"y\", \"criterion_kwargs\")``. This is useful when the\n+ criterion function requires additional arguments, which can be passed using ``criterion_kwargs``.\n+ See notes below for an example.\n+\n+ Note:\n+\n+ Let's implement a Loss metric that requires ``x``, ``y_pred``, ``y`` and ``criterion_kwargs`` as input\n+ for ``criterion`` function. In the example below we show how to setup standard metric like Accuracy\n+ and the Loss metric using an ``evaluator`` created with\n+ :meth:`~ignite.engine.create_supervised_evaluator` method.\n+\n+ .. code-block:: python\n+\n+ import torch\n+ import torch.nn as nn\n+ from torch.nn.functional import nll_loss\n+\n+ from ignite.metrics import Accuracy, Loss\n+ from ignite.engine import create_supervised_evaluator\n+\n+ model = ...\n+\n+ criterion = nll_loss\n+\n+ metrics = {\n+ \"Accuracy\": Accuracy(),\n+ \"Loss\": Loss(criterion)\n+ }\n+\n+ # global criterion kwargs\n+ criterion_kwargs = {...}\n+\n+ evaluator = create_supervised_evaluator(\n+ model,\n+ metrics=metrics,\n+ output_transform=lambda x, y, y_pred: {\n+ \"x\": x, \"y\": y, \"y_pred\": y_pred, \"criterion_kwargs\": criterion_kwargs}\n+ )\n+\n+ res = evaluator.run(data)\n+\n \"\"\"\n \n- required_output_keys = None\n+ required_output_keys = (\"y_pred\", \"y\", \"criterion_kwargs\")\n \n def __init__(\n self,\n", "issue": "Loss metric to use required_output_keys\n## \ud83d\ude80 Feature\r\n\r\nCurrently, if we have custom metrics that require data other then `y_pred` and `y`, [we suggest](https://discuss.pytorch.org/t/how-access-inputs-in-custom-ignite-metric/91221/6) to do the following: \r\n```python\r\nmetrics = {\r\n \"Accuracy\": Accuracy(),\r\n \"Loss\": Loss(criterion, output_transform=lambda out_dict: (out_dict[\"y_pred\"], out_dict[\"y\"])),\r\n \"CustomMetric\": CustomMetric()\r\n}\r\n\r\nevaluator = create_supervised_evaluator(\r\n model, \r\n metrics=metrics, \r\n output_transform=lambda x, y, y_pred: {\"x\": x, \"y\": y, \"y_pred\": y_pred}\r\n)\r\n```\r\n\r\nwhere `CustomMetric` is defined as \r\n```python\r\nclass CustomMetric(Metric):\r\n\r\n required_output_keys = (\"y_pred\", \"y\", \"x\")\r\n```\r\n\r\nThe idea is to extend this for `Loss` metric to support `required_output_keys`. 
The main issue with `Loss` now is with `(prediction, target, kwargs)` optional input, where `kwargs` is a dict for extra args for criterion function.\n", "before_files": [{"content": "from typing import Callable, Dict, Sequence, Tuple, Union, cast\n\nimport torch\n\nfrom ignite.exceptions import NotComputableError\nfrom ignite.metrics.metric import Metric, reinit__is_reduced, sync_all_reduce\n\n__all__ = [\"Loss\"]\n\n\nclass Loss(Metric):\n \"\"\"\n Calculates the average loss according to the passed loss_fn.\n\n Args:\n loss_fn: a callable taking a prediction tensor, a target\n tensor, optionally other arguments, and returns the average loss\n over all observations in the batch.\n output_transform: a callable that is used to transform the\n :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the\n form expected by the metric.\n This can be useful if, for example, you have a multi-output model and\n you want to compute the metric with respect to one of the outputs.\n The output is expected to be a tuple `(prediction, target)` or\n (prediction, target, kwargs) where kwargs is a dictionary of extra\n keywords arguments. If extra keywords arguments are provided they are passed to `loss_fn`.\n batch_size: a callable taking a target tensor that returns the\n first dimension size (usually the batch size).\n device: specifies which device updates are accumulated on. Setting the\n metric's device to be the same as your ``update`` arguments ensures the ``update`` method is\n non-blocking. By default, CPU.\n\n \"\"\"\n\n required_output_keys = None\n\n def __init__(\n self,\n loss_fn: Callable,\n output_transform: Callable = lambda x: x,\n batch_size: Callable = len,\n device: Union[str, torch.device] = torch.device(\"cpu\"),\n ):\n super(Loss, self).__init__(output_transform, device=device)\n self._loss_fn = loss_fn\n self._batch_size = batch_size\n\n @reinit__is_reduced\n def reset(self) -> None:\n self._sum = torch.tensor(0.0, device=self._device)\n self._num_examples = 0\n\n @reinit__is_reduced\n def update(self, output: Sequence[Union[torch.Tensor, Dict]]) -> None:\n if len(output) == 2:\n y_pred, y = cast(Tuple[torch.Tensor, torch.Tensor], output)\n kwargs = {} # type: Dict\n else:\n y_pred, y, kwargs = cast(Tuple[torch.Tensor, torch.Tensor, Dict], output)\n average_loss = self._loss_fn(y_pred, y, **kwargs).detach()\n\n if len(average_loss.shape) != 0:\n raise ValueError(\"loss_fn did not return the average loss.\")\n\n n = self._batch_size(y)\n self._sum += average_loss.to(self._device) * n\n self._num_examples += n\n\n @sync_all_reduce(\"_sum\", \"_num_examples\")\n def compute(self) -> float:\n if self._num_examples == 0:\n raise NotComputableError(\"Loss must have at least one example before it can be computed.\")\n return self._sum.item() / self._num_examples\n", "path": "ignite/metrics/loss.py"}], "after_files": [{"content": "from typing import Callable, Dict, Sequence, Tuple, Union, cast\n\nimport torch\n\nfrom ignite.exceptions import NotComputableError\nfrom ignite.metrics.metric import Metric, reinit__is_reduced, sync_all_reduce\n\n__all__ = [\"Loss\"]\n\n\nclass Loss(Metric):\n \"\"\"\n Calculates the average loss according to the passed loss_fn.\n\n Args:\n loss_fn: a callable taking a prediction tensor, a target\n tensor, optionally other arguments, and returns the average loss\n over all observations in the batch.\n output_transform: a callable that is used to transform the\n :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the\n 
form expected by the metric.\n This can be useful if, for example, you have a multi-output model and\n you want to compute the metric with respect to one of the outputs.\n The output is expected to be a tuple `(prediction, target)` or\n (prediction, target, kwargs) where kwargs is a dictionary of extra\n keywords arguments. If extra keywords arguments are provided they are passed to `loss_fn`.\n batch_size: a callable taking a target tensor that returns the\n first dimension size (usually the batch size).\n device: specifies which device updates are accumulated on. Setting the\n metric's device to be the same as your ``update`` arguments ensures the ``update`` method is\n non-blocking. By default, CPU.\n\n Attributes:\n required_output_keys: dictionary defines required keys to be found in ``engine.state.output`` if the\n latter is a dictionary. Default, ``(\"y_pred\", \"y\", \"criterion_kwargs\")``. This is useful when the\n criterion function requires additional arguments, which can be passed using ``criterion_kwargs``.\n See notes below for an example.\n\n Note:\n\n Let's implement a Loss metric that requires ``x``, ``y_pred``, ``y`` and ``criterion_kwargs`` as input\n for ``criterion`` function. In the example below we show how to setup standard metric like Accuracy\n and the Loss metric using an ``evaluator`` created with\n :meth:`~ignite.engine.create_supervised_evaluator` method.\n\n .. code-block:: python\n\n import torch\n import torch.nn as nn\n from torch.nn.functional import nll_loss\n\n from ignite.metrics import Accuracy, Loss\n from ignite.engine import create_supervised_evaluator\n\n model = ...\n\n criterion = nll_loss\n\n metrics = {\n \"Accuracy\": Accuracy(),\n \"Loss\": Loss(criterion)\n }\n\n # global criterion kwargs\n criterion_kwargs = {...}\n\n evaluator = create_supervised_evaluator(\n model,\n metrics=metrics,\n output_transform=lambda x, y, y_pred: {\n \"x\": x, \"y\": y, \"y_pred\": y_pred, \"criterion_kwargs\": criterion_kwargs}\n )\n\n res = evaluator.run(data)\n\n \"\"\"\n\n required_output_keys = (\"y_pred\", \"y\", \"criterion_kwargs\")\n\n def __init__(\n self,\n loss_fn: Callable,\n output_transform: Callable = lambda x: x,\n batch_size: Callable = len,\n device: Union[str, torch.device] = torch.device(\"cpu\"),\n ):\n super(Loss, self).__init__(output_transform, device=device)\n self._loss_fn = loss_fn\n self._batch_size = batch_size\n\n @reinit__is_reduced\n def reset(self) -> None:\n self._sum = torch.tensor(0.0, device=self._device)\n self._num_examples = 0\n\n @reinit__is_reduced\n def update(self, output: Sequence[Union[torch.Tensor, Dict]]) -> None:\n if len(output) == 2:\n y_pred, y = cast(Tuple[torch.Tensor, torch.Tensor], output)\n kwargs = {} # type: Dict\n else:\n y_pred, y, kwargs = cast(Tuple[torch.Tensor, torch.Tensor, Dict], output)\n average_loss = self._loss_fn(y_pred, y, **kwargs).detach()\n\n if len(average_loss.shape) != 0:\n raise ValueError(\"loss_fn did not return the average loss.\")\n\n n = self._batch_size(y)\n self._sum += average_loss.to(self._device) * n\n self._num_examples += n\n\n @sync_all_reduce(\"_sum\", \"_num_examples\")\n def compute(self) -> float:\n if self._num_examples == 0:\n raise NotComputableError(\"Loss must have at least one example before it can be computed.\")\n return self._sum.item() / self._num_examples\n", "path": "ignite/metrics/loss.py"}]} | 1,334 | 481 |
gh_patches_debug_53495 | rasdani/github-patches | git_diff | cocotb__cocotb-2451 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
TOPLEVEL should be whitespace stripped
From here: https://github.com/cocotb/cocotb/blob/ecb43878a6af0605fdf08c2f1829cc401bfdeb36/cocotb/__init__.py#L219-L225
On Questa with the FLI, trailing whitespace causes the toplevel entity to not be found since it does a strict `!strcmp()`.
--- END ISSUE ---
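For illustration, the eventual patch (shown further below in this record) amounts to stripping the value once it is read from the environment; a minimal sketch of that read, with the extra `strip()` call marked:

```python
import os

# Strip stray whitespace from TOPLEVEL so simulators that do a strict string
# compare on the root name (e.g. Questa's FLI) still find the toplevel entity.
root_name = os.getenv("TOPLEVEL")
if root_name is not None:
    root_name = root_name.strip()  # the whitespace fix
    if root_name == "":
        root_name = None
    elif "." in root_name:
        # Skip any library component of the toplevel
        root_name = root_name.split(".", 1)[1]
```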
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cocotb/__init__.py`
Content:
```
1 # Copyright (c) 2013 Potential Ventures Ltd
2 # Copyright (c) 2013 SolarFlare Communications Inc
3 # All rights reserved.
4
5 # Redistribution and use in source and binary forms, with or without
6 # modification, are permitted provided that the following conditions are met:
7 # * Redistributions of source code must retain the above copyright
8 # notice, this list of conditions and the following disclaimer.
9 # * Redistributions in binary form must reproduce the above copyright
10 # notice, this list of conditions and the following disclaimer in the
11 # documentation and/or other materials provided with the distribution.
12 # * Neither the name of Potential Ventures Ltd,
13 # SolarFlare Communications Inc nor the
14 # names of its contributors may be used to endorse or promote products
15 # derived from this software without specific prior written permission.
16
17 # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
18 # ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
19 # WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
20 # DISCLAIMED. IN NO EVENT SHALL POTENTIAL VENTURES LTD BE LIABLE FOR ANY
21 # DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
22 # (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
23 # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
24 # ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
25 # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
26 # SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
27
28 """
29 Cocotb is a coroutine, cosimulation framework for writing testbenches in Python.
30
31 See https://docs.cocotb.org for full documentation
32 """
33 import os
34 import sys
35 import logging
36 import threading
37 import random
38 import time
39 import warnings
40 from typing import Dict, List, Optional, Union
41 from collections.abc import Coroutine
42
43 import cocotb.handle
44 import cocotb.log
45 from cocotb.scheduler import Scheduler
46 from cocotb.regression import RegressionManager
47 from cocotb.decorators import RunningTask
48
49 # Things we want in the cocotb namespace
50 from cocotb.decorators import test, coroutine, hook, function, external # noqa: F401
51
52 from ._version import __version__
53
54
55 def _setup_logging():
56 global log
57
58 def _reopen_stream_with_buffering(stream_name):
59 try:
60 if not getattr(sys, stream_name).isatty():
61 setattr(sys, stream_name, os.fdopen(getattr(sys, stream_name).fileno(), 'w', 1))
62 return True
63 return False
64 except Exception as e:
65 return e
66
67 # If stdout/stderr are not TTYs, Python may not have opened them with line
68 # buffering. In that case, try to reopen them with line buffering
69 # explicitly enabled. This ensures that prints such as stack traces always
70 # appear. Continue silently if this fails.
71 _stdout_buffer_result = _reopen_stream_with_buffering('stdout')
72 _stderr_buffer_result = _reopen_stream_with_buffering('stderr')
73
74 # Don't set the logging up until we've attempted to fix the standard IO,
75 # otherwise it will end up connected to the unfixed IO.
76 cocotb.log.default_config()
77 log = logging.getLogger(__name__)
78
79 # we can't log these things until the logging is set up!
80 if _stderr_buffer_result is True:
81 log.debug("Reopened stderr with line buffering")
82 if _stdout_buffer_result is True:
83 log.debug("Reopened stdout with line buffering")
84 if isinstance(_stdout_buffer_result, Exception) or isinstance(_stderr_buffer_result, Exception):
85 if isinstance(_stdout_buffer_result, Exception):
86 log.warning("Failed to ensure that stdout is line buffered", exc_info=_stdout_buffer_result)
87 if isinstance(_stderr_buffer_result, Exception):
88 log.warning("Failed to ensure that stderr is line buffered", exc_info=_stderr_buffer_result)
89 log.warning("Some stack traces may not appear because of this.")
90
91 del _stderr_buffer_result, _stdout_buffer_result
92
93
94 # Singleton scheduler instance
95 # NB this cheekily ensures a singleton since we're replacing the reference
96 # so that cocotb.scheduler gives you the singleton instance and not the
97 # scheduler package
98
99 scheduler: Optional[Scheduler] = None
100 """The global scheduler instance.
101
102 This is guaranteed to hold a value at test time.
103 """
104
105 regression_manager: Optional[RegressionManager] = None
106 """The global regression manager instance.
107
108 This is guaranteed to hold a value at test time.
109 """
110
111 argv: Optional[List[str]] = None
112 """The argument list as seen by the simulator.
113
114 This is guaranteed to hold a value at test time.
115 """
116
117 argc: Optional[int] = None
118 """The length of :data:`cocotb.argv`.
119
120 This is guaranteed to hold a value at test time.
121 """
122
123 plusargs: Optional[Dict[str, Union[bool, str]]] = None
124 """A dictionary of "plusargs" handed to the simulation.
125
126 See :make:var:`PLUSARGS` for details.
127 This is guaranteed to hold a value at test time.
128 """
129
130 LANGUAGE: Optional[str] = os.getenv("TOPLEVEL_LANG")
131 """The value of :make:var:`TOPLEVEL_LANG`.
132
133 This is guaranteed to hold a value at test time.
134 """
135
136 SIM_NAME: Optional[str] = None
137 """The running simulator product information.
138
139 ``None`` if :mod:`cocotb` was not loaded from a simulator.
140 """
141
142 SIM_VERSION: Optional[str] = None
143 """The version of the running simulator.
144
145 ``None`` if :mod:`cocotb` was not loaded from a simulator."""
146
147 RANDOM_SEED: Optional[int] = None
148 """
149 The value passed to the Python default random number generator.
150
151 See :envvar:`RANDOM_SEED` for details on how the value is computed.
152 This is guaranteed to hold a value at test time.
153 """
154
155 _library_coverage = None
156 """ used for cocotb library coverage """
157
158 top: Optional[cocotb.handle.SimHandleBase] = None
159 r"""
160 A handle to the :envvar:`TOPLEVEL` entity/module.
161
162 This is equivalent to the :term:`DUT` parameter given to cocotb tests, so it can be used wherever that variable can be used.
163 It is particularly useful for extracting information about the :term:`DUT` in module-level class and function definitions;
164 and in parameters to :class:`.TestFactory`\ s.
165 ``None`` if :mod:`cocotb` was not loaded from a simulator.
166 """
167
168
169 def fork(coro: Union[RunningTask, Coroutine]) -> RunningTask:
170 """ Schedule a coroutine to be run concurrently. See :ref:`coroutines` for details on its use. """
171 return scheduler.add(coro)
172
173
174 # FIXME is this really required?
175 _rlock = threading.RLock()
176
177
178 def mem_debug(port):
179 import cocotb.memdebug
180 cocotb.memdebug.start(port)
181
182
183 def _initialise_testbench(argv_): # pragma: no cover
184 """Initialize testbench.
185
186 This function is called after the simulator has elaborated all
187 entities and is ready to run the test.
188
189 The test must be defined by the environment variables
190 :envvar:`MODULE` and :envvar:`TESTCASE`.
191
192 The environment variable :envvar:`COCOTB_HOOKS`, if present, contains a
193 comma-separated list of modules to be executed before the first test.
194 """
195 with _rlock:
196
197 if "COCOTB_LIBRARY_COVERAGE" in os.environ:
198 import coverage
199
200 global _library_coverage
201 _library_coverage = coverage.coverage(
202 data_file=".coverage.cocotb",
203 branch=True,
204 include=["{}/*".format(os.path.dirname(__file__))])
205 _library_coverage.start()
206
207 return _initialise_testbench_(argv_)
208
209
210 def _initialise_testbench_(argv_):
211 # The body of this function is split in two because no coverage is collected on
212 # the function that starts the coverage. By splitting it in two we get coverage
213 # on most of the function.
214
215 global argc, argv
216 argv = argv_
217 argc = len(argv)
218
219 root_name = os.getenv("TOPLEVEL")
220 if root_name is not None:
221 if root_name == "":
222 root_name = None
223 elif '.' in root_name:
224 # Skip any library component of the toplevel
225 root_name = root_name.split(".", 1)[1]
226
227 # sys.path normally includes "" (the current directory), but does not appear to when python is embedded.
228 # Add it back because users expect to be able to import files in their test directory.
229 # TODO: move this to gpi_embed.cpp
230 sys.path.insert(0, "")
231
232 _setup_logging()
233
234 # From https://www.python.org/dev/peps/pep-0565/#recommended-filter-settings-for-test-runners
235 # If the user doesn't want to see these, they can always change the global
236 # warning settings in their test module.
237 if not sys.warnoptions:
238 warnings.simplefilter("default")
239
240 from cocotb import simulator
241
242 global SIM_NAME, SIM_VERSION
243 SIM_NAME = simulator.get_simulator_product().strip()
244 SIM_VERSION = simulator.get_simulator_version().strip()
245
246 cocotb.log.info("Running on {} version {}".format(SIM_NAME, SIM_VERSION))
247
248 memcheck_port = os.getenv('MEMCHECK')
249 if memcheck_port is not None:
250 mem_debug(int(memcheck_port))
251
252 log.info("Running tests with cocotb v%s from %s" %
253 (__version__, os.path.dirname(__file__)))
254
255 # Create the base handle type
256
257 process_plusargs()
258
259 global scheduler
260 scheduler = Scheduler()
261
262 # Seed the Python random number generator to make this repeatable
263 global RANDOM_SEED
264 RANDOM_SEED = os.getenv('RANDOM_SEED')
265
266 if RANDOM_SEED is None:
267 if 'ntb_random_seed' in plusargs:
268 RANDOM_SEED = eval(plusargs['ntb_random_seed'])
269 elif 'seed' in plusargs:
270 RANDOM_SEED = eval(plusargs['seed'])
271 else:
272 RANDOM_SEED = int(time.time())
273 log.info("Seeding Python random module with %d" % (RANDOM_SEED))
274 else:
275 RANDOM_SEED = int(RANDOM_SEED)
276 log.info("Seeding Python random module with supplied seed %d" % (RANDOM_SEED))
277 random.seed(RANDOM_SEED)
278
279 # Setup DUT object
280 from cocotb import simulator
281
282 handle = simulator.get_root_handle(root_name)
283 if not handle:
284 raise RuntimeError("Can not find root handle ({})".format(root_name))
285
286 global top
287 top = cocotb.handle.SimHandle(handle)
288
289 try:
290 import pytest
291 except ImportError:
292 log.warning("Pytest not found, assertion rewriting will not occur")
293 else:
294 try:
295 # Install the assertion rewriting hook, which must be done before we
296 # import the test modules.
297 from _pytest.config import Config
298 from _pytest.assertion import install_importhook
299 pytest_conf = Config.fromdictargs([], {})
300 install_importhook(pytest_conf)
301 except Exception:
302 log.exception(
303 "Configuring the assertion rewrite hook using pytest {} failed. "
304 "Please file a bug report!".format(pytest.__version__))
305
306 # start Regression Manager
307 global regression_manager
308 regression_manager = RegressionManager.from_discovery(top)
309 regression_manager.execute()
310
311 return True
312
313
314 def _sim_event(level, message):
315 """Function that can be called externally to signal an event."""
316 # SIM_INFO = 0
317 SIM_TEST_FAIL = 1
318 SIM_FAIL = 2
319 from cocotb.result import TestFailure, SimFailure
320
321 if level is SIM_TEST_FAIL:
322 scheduler.log.error("Failing test at simulator request")
323 scheduler._finish_test(TestFailure("Failure from external source: {}".format(message)))
324 elif level is SIM_FAIL:
325 # We simply return here as the simulator will exit
326 # so no cleanup is needed
327 msg = "Failing test at simulator request before test run completion: {}".format(message)
328 scheduler.log.error(msg)
329 scheduler._finish_scheduler(SimFailure(msg))
330 else:
331 scheduler.log.error("Unsupported sim event")
332
333 return True
334
335
336 def process_plusargs():
337
338 global plusargs
339
340 plusargs = {}
341
342 for option in cocotb.argv:
343 if option.startswith('+'):
344 if option.find('=') != -1:
345 (name, value) = option[1:].split('=')
346 plusargs[name] = value
347 else:
348 plusargs[option[1:]] = True
349
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cocotb/__init__.py b/cocotb/__init__.py
--- a/cocotb/__init__.py
+++ b/cocotb/__init__.py
@@ -218,6 +218,7 @@
root_name = os.getenv("TOPLEVEL")
if root_name is not None:
+ root_name = root_name.strip()
if root_name == "":
root_name = None
elif '.' in root_name:
| {"golden_diff": "diff --git a/cocotb/__init__.py b/cocotb/__init__.py\n--- a/cocotb/__init__.py\n+++ b/cocotb/__init__.py\n@@ -218,6 +218,7 @@\n \n root_name = os.getenv(\"TOPLEVEL\")\n if root_name is not None:\n+ root_name = root_name.strip()\n if root_name == \"\":\n root_name = None\n elif '.' in root_name:\n", "issue": "TOPLEVEL should be whitespace stripped\nFrom here: https://github.com/cocotb/cocotb/blob/ecb43878a6af0605fdf08c2f1829cc401bfdeb36/cocotb/__init__.py#L219-L225\r\n\r\nOn Questa with the FLI, trailing whitespace causes the toplevel entity to not be found since it does a strict `!strcmp()`.\n", "before_files": [{"content": "# Copyright (c) 2013 Potential Ventures Ltd\n# Copyright (c) 2013 SolarFlare Communications Inc\n# All rights reserved.\n\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n# * Redistributions of source code must retain the above copyright\n# notice, this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n# * Neither the name of Potential Ventures Ltd,\n# SolarFlare Communications Inc nor the\n# names of its contributors may be used to endorse or promote products\n# derived from this software without specific prior written permission.\n\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\n# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\n# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. IN NO EVENT SHALL POTENTIAL VENTURES LTD BE LIABLE FOR ANY\n# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND\n# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\n# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n\"\"\"\nCocotb is a coroutine, cosimulation framework for writing testbenches in Python.\n\nSee https://docs.cocotb.org for full documentation\n\"\"\"\nimport os\nimport sys\nimport logging\nimport threading\nimport random\nimport time\nimport warnings\nfrom typing import Dict, List, Optional, Union\nfrom collections.abc import Coroutine\n\nimport cocotb.handle\nimport cocotb.log\nfrom cocotb.scheduler import Scheduler\nfrom cocotb.regression import RegressionManager\nfrom cocotb.decorators import RunningTask\n\n# Things we want in the cocotb namespace\nfrom cocotb.decorators import test, coroutine, hook, function, external # noqa: F401\n\nfrom ._version import __version__\n\n\ndef _setup_logging():\n global log\n\n def _reopen_stream_with_buffering(stream_name):\n try:\n if not getattr(sys, stream_name).isatty():\n setattr(sys, stream_name, os.fdopen(getattr(sys, stream_name).fileno(), 'w', 1))\n return True\n return False\n except Exception as e:\n return e\n\n # If stdout/stderr are not TTYs, Python may not have opened them with line\n # buffering. In that case, try to reopen them with line buffering\n # explicitly enabled. This ensures that prints such as stack traces always\n # appear. 
Continue silently if this fails.\n _stdout_buffer_result = _reopen_stream_with_buffering('stdout')\n _stderr_buffer_result = _reopen_stream_with_buffering('stderr')\n\n # Don't set the logging up until we've attempted to fix the standard IO,\n # otherwise it will end up connected to the unfixed IO.\n cocotb.log.default_config()\n log = logging.getLogger(__name__)\n\n # we can't log these things until the logging is set up!\n if _stderr_buffer_result is True:\n log.debug(\"Reopened stderr with line buffering\")\n if _stdout_buffer_result is True:\n log.debug(\"Reopened stdout with line buffering\")\n if isinstance(_stdout_buffer_result, Exception) or isinstance(_stderr_buffer_result, Exception):\n if isinstance(_stdout_buffer_result, Exception):\n log.warning(\"Failed to ensure that stdout is line buffered\", exc_info=_stdout_buffer_result)\n if isinstance(_stderr_buffer_result, Exception):\n log.warning(\"Failed to ensure that stderr is line buffered\", exc_info=_stderr_buffer_result)\n log.warning(\"Some stack traces may not appear because of this.\")\n\n del _stderr_buffer_result, _stdout_buffer_result\n\n\n# Singleton scheduler instance\n# NB this cheekily ensures a singleton since we're replacing the reference\n# so that cocotb.scheduler gives you the singleton instance and not the\n# scheduler package\n\nscheduler: Optional[Scheduler] = None\n\"\"\"The global scheduler instance.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nregression_manager: Optional[RegressionManager] = None\n\"\"\"The global regression manager instance.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nargv: Optional[List[str]] = None\n\"\"\"The argument list as seen by the simulator.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nargc: Optional[int] = None\n\"\"\"The length of :data:`cocotb.argv`.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nplusargs: Optional[Dict[str, Union[bool, str]]] = None\n\"\"\"A dictionary of \"plusargs\" handed to the simulation.\n\nSee :make:var:`PLUSARGS` for details.\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nLANGUAGE: Optional[str] = os.getenv(\"TOPLEVEL_LANG\")\n\"\"\"The value of :make:var:`TOPLEVEL_LANG`.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nSIM_NAME: Optional[str] = None\n\"\"\"The running simulator product information.\n\n``None`` if :mod:`cocotb` was not loaded from a simulator.\n\"\"\"\n\nSIM_VERSION: Optional[str] = None\n\"\"\"The version of the running simulator.\n\n``None`` if :mod:`cocotb` was not loaded from a simulator.\"\"\"\n\nRANDOM_SEED: Optional[int] = None\n\"\"\"\nThe value passed to the Python default random number generator.\n\nSee :envvar:`RANDOM_SEED` for details on how the value is computed.\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\n_library_coverage = None\n\"\"\" used for cocotb library coverage \"\"\"\n\ntop: Optional[cocotb.handle.SimHandleBase] = None\nr\"\"\"\nA handle to the :envvar:`TOPLEVEL` entity/module.\n\nThis is equivalent to the :term:`DUT` parameter given to cocotb tests, so it can be used wherever that variable can be used.\nIt is particularly useful for extracting information about the :term:`DUT` in module-level class and function definitions;\nand in parameters to :class:`.TestFactory`\\ s.\n``None`` if :mod:`cocotb` was not loaded from a simulator.\n\"\"\"\n\n\ndef fork(coro: Union[RunningTask, Coroutine]) -> RunningTask:\n \"\"\" Schedule a coroutine to be run concurrently. 
See :ref:`coroutines` for details on its use. \"\"\"\n return scheduler.add(coro)\n\n\n# FIXME is this really required?\n_rlock = threading.RLock()\n\n\ndef mem_debug(port):\n import cocotb.memdebug\n cocotb.memdebug.start(port)\n\n\ndef _initialise_testbench(argv_): # pragma: no cover\n \"\"\"Initialize testbench.\n\n This function is called after the simulator has elaborated all\n entities and is ready to run the test.\n\n The test must be defined by the environment variables\n :envvar:`MODULE` and :envvar:`TESTCASE`.\n\n The environment variable :envvar:`COCOTB_HOOKS`, if present, contains a\n comma-separated list of modules to be executed before the first test.\n \"\"\"\n with _rlock:\n\n if \"COCOTB_LIBRARY_COVERAGE\" in os.environ:\n import coverage\n\n global _library_coverage\n _library_coverage = coverage.coverage(\n data_file=\".coverage.cocotb\",\n branch=True,\n include=[\"{}/*\".format(os.path.dirname(__file__))])\n _library_coverage.start()\n\n return _initialise_testbench_(argv_)\n\n\ndef _initialise_testbench_(argv_):\n # The body of this function is split in two because no coverage is collected on\n # the function that starts the coverage. By splitting it in two we get coverage\n # on most of the function.\n\n global argc, argv\n argv = argv_\n argc = len(argv)\n\n root_name = os.getenv(\"TOPLEVEL\")\n if root_name is not None:\n if root_name == \"\":\n root_name = None\n elif '.' in root_name:\n # Skip any library component of the toplevel\n root_name = root_name.split(\".\", 1)[1]\n\n # sys.path normally includes \"\" (the current directory), but does not appear to when python is embedded.\n # Add it back because users expect to be able to import files in their test directory.\n # TODO: move this to gpi_embed.cpp\n sys.path.insert(0, \"\")\n\n _setup_logging()\n\n # From https://www.python.org/dev/peps/pep-0565/#recommended-filter-settings-for-test-runners\n # If the user doesn't want to see these, they can always change the global\n # warning settings in their test module.\n if not sys.warnoptions:\n warnings.simplefilter(\"default\")\n\n from cocotb import simulator\n\n global SIM_NAME, SIM_VERSION\n SIM_NAME = simulator.get_simulator_product().strip()\n SIM_VERSION = simulator.get_simulator_version().strip()\n\n cocotb.log.info(\"Running on {} version {}\".format(SIM_NAME, SIM_VERSION))\n\n memcheck_port = os.getenv('MEMCHECK')\n if memcheck_port is not None:\n mem_debug(int(memcheck_port))\n\n log.info(\"Running tests with cocotb v%s from %s\" %\n (__version__, os.path.dirname(__file__)))\n\n # Create the base handle type\n\n process_plusargs()\n\n global scheduler\n scheduler = Scheduler()\n\n # Seed the Python random number generator to make this repeatable\n global RANDOM_SEED\n RANDOM_SEED = os.getenv('RANDOM_SEED')\n\n if RANDOM_SEED is None:\n if 'ntb_random_seed' in plusargs:\n RANDOM_SEED = eval(plusargs['ntb_random_seed'])\n elif 'seed' in plusargs:\n RANDOM_SEED = eval(plusargs['seed'])\n else:\n RANDOM_SEED = int(time.time())\n log.info(\"Seeding Python random module with %d\" % (RANDOM_SEED))\n else:\n RANDOM_SEED = int(RANDOM_SEED)\n log.info(\"Seeding Python random module with supplied seed %d\" % (RANDOM_SEED))\n random.seed(RANDOM_SEED)\n\n # Setup DUT object\n from cocotb import simulator\n\n handle = simulator.get_root_handle(root_name)\n if not handle:\n raise RuntimeError(\"Can not find root handle ({})\".format(root_name))\n\n global top\n top = cocotb.handle.SimHandle(handle)\n\n try:\n import pytest\n except ImportError:\n log.warning(\"Pytest 
not found, assertion rewriting will not occur\")\n else:\n try:\n # Install the assertion rewriting hook, which must be done before we\n # import the test modules.\n from _pytest.config import Config\n from _pytest.assertion import install_importhook\n pytest_conf = Config.fromdictargs([], {})\n install_importhook(pytest_conf)\n except Exception:\n log.exception(\n \"Configuring the assertion rewrite hook using pytest {} failed. \"\n \"Please file a bug report!\".format(pytest.__version__))\n\n # start Regression Manager\n global regression_manager\n regression_manager = RegressionManager.from_discovery(top)\n regression_manager.execute()\n\n return True\n\n\ndef _sim_event(level, message):\n \"\"\"Function that can be called externally to signal an event.\"\"\"\n # SIM_INFO = 0\n SIM_TEST_FAIL = 1\n SIM_FAIL = 2\n from cocotb.result import TestFailure, SimFailure\n\n if level is SIM_TEST_FAIL:\n scheduler.log.error(\"Failing test at simulator request\")\n scheduler._finish_test(TestFailure(\"Failure from external source: {}\".format(message)))\n elif level is SIM_FAIL:\n # We simply return here as the simulator will exit\n # so no cleanup is needed\n msg = \"Failing test at simulator request before test run completion: {}\".format(message)\n scheduler.log.error(msg)\n scheduler._finish_scheduler(SimFailure(msg))\n else:\n scheduler.log.error(\"Unsupported sim event\")\n\n return True\n\n\ndef process_plusargs():\n\n global plusargs\n\n plusargs = {}\n\n for option in cocotb.argv:\n if option.startswith('+'):\n if option.find('=') != -1:\n (name, value) = option[1:].split('=')\n plusargs[name] = value\n else:\n plusargs[option[1:]] = True\n", "path": "cocotb/__init__.py"}], "after_files": [{"content": "# Copyright (c) 2013 Potential Ventures Ltd\n# Copyright (c) 2013 SolarFlare Communications Inc\n# All rights reserved.\n\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n# * Redistributions of source code must retain the above copyright\n# notice, this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n# * Neither the name of Potential Ventures Ltd,\n# SolarFlare Communications Inc nor the\n# names of its contributors may be used to endorse or promote products\n# derived from this software without specific prior written permission.\n\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\n# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\n# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. 
IN NO EVENT SHALL POTENTIAL VENTURES LTD BE LIABLE FOR ANY\n# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND\n# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\n# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n\"\"\"\nCocotb is a coroutine, cosimulation framework for writing testbenches in Python.\n\nSee https://docs.cocotb.org for full documentation\n\"\"\"\nimport os\nimport sys\nimport logging\nimport threading\nimport random\nimport time\nimport warnings\nfrom typing import Dict, List, Optional, Union\nfrom collections.abc import Coroutine\n\nimport cocotb.handle\nimport cocotb.log\nfrom cocotb.scheduler import Scheduler\nfrom cocotb.regression import RegressionManager\nfrom cocotb.decorators import RunningTask\n\n# Things we want in the cocotb namespace\nfrom cocotb.decorators import test, coroutine, hook, function, external # noqa: F401\n\nfrom ._version import __version__\n\n\ndef _setup_logging():\n global log\n\n def _reopen_stream_with_buffering(stream_name):\n try:\n if not getattr(sys, stream_name).isatty():\n setattr(sys, stream_name, os.fdopen(getattr(sys, stream_name).fileno(), 'w', 1))\n return True\n return False\n except Exception as e:\n return e\n\n # If stdout/stderr are not TTYs, Python may not have opened them with line\n # buffering. In that case, try to reopen them with line buffering\n # explicitly enabled. This ensures that prints such as stack traces always\n # appear. Continue silently if this fails.\n _stdout_buffer_result = _reopen_stream_with_buffering('stdout')\n _stderr_buffer_result = _reopen_stream_with_buffering('stderr')\n\n # Don't set the logging up until we've attempted to fix the standard IO,\n # otherwise it will end up connected to the unfixed IO.\n cocotb.log.default_config()\n log = logging.getLogger(__name__)\n\n # we can't log these things until the logging is set up!\n if _stderr_buffer_result is True:\n log.debug(\"Reopened stderr with line buffering\")\n if _stdout_buffer_result is True:\n log.debug(\"Reopened stdout with line buffering\")\n if isinstance(_stdout_buffer_result, Exception) or isinstance(_stderr_buffer_result, Exception):\n if isinstance(_stdout_buffer_result, Exception):\n log.warning(\"Failed to ensure that stdout is line buffered\", exc_info=_stdout_buffer_result)\n if isinstance(_stderr_buffer_result, Exception):\n log.warning(\"Failed to ensure that stderr is line buffered\", exc_info=_stderr_buffer_result)\n log.warning(\"Some stack traces may not appear because of this.\")\n\n del _stderr_buffer_result, _stdout_buffer_result\n\n\n# Singleton scheduler instance\n# NB this cheekily ensures a singleton since we're replacing the reference\n# so that cocotb.scheduler gives you the singleton instance and not the\n# scheduler package\n\nscheduler: Optional[Scheduler] = None\n\"\"\"The global scheduler instance.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nregression_manager: Optional[RegressionManager] = None\n\"\"\"The global regression manager instance.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nargv: Optional[List[str]] = None\n\"\"\"The argument list as seen by the simulator.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nargc: Optional[int] = None\n\"\"\"The 
length of :data:`cocotb.argv`.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nplusargs: Optional[Dict[str, Union[bool, str]]] = None\n\"\"\"A dictionary of \"plusargs\" handed to the simulation.\n\nSee :make:var:`PLUSARGS` for details.\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nLANGUAGE: Optional[str] = os.getenv(\"TOPLEVEL_LANG\")\n\"\"\"The value of :make:var:`TOPLEVEL_LANG`.\n\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\nSIM_NAME: Optional[str] = None\n\"\"\"The running simulator product information.\n\n``None`` if :mod:`cocotb` was not loaded from a simulator.\n\"\"\"\n\nSIM_VERSION: Optional[str] = None\n\"\"\"The version of the running simulator.\n\n``None`` if :mod:`cocotb` was not loaded from a simulator.\"\"\"\n\nRANDOM_SEED: Optional[int] = None\n\"\"\"\nThe value passed to the Python default random number generator.\n\nSee :envvar:`RANDOM_SEED` for details on how the value is computed.\nThis is guaranteed to hold a value at test time.\n\"\"\"\n\n_library_coverage = None\n\"\"\" used for cocotb library coverage \"\"\"\n\ntop: Optional[cocotb.handle.SimHandleBase] = None\nr\"\"\"\nA handle to the :envvar:`TOPLEVEL` entity/module.\n\nThis is equivalent to the :term:`DUT` parameter given to cocotb tests, so it can be used wherever that variable can be used.\nIt is particularly useful for extracting information about the :term:`DUT` in module-level class and function definitions;\nand in parameters to :class:`.TestFactory`\\ s.\n``None`` if :mod:`cocotb` was not loaded from a simulator.\n\"\"\"\n\n\ndef fork(coro: Union[RunningTask, Coroutine]) -> RunningTask:\n \"\"\" Schedule a coroutine to be run concurrently. See :ref:`coroutines` for details on its use. \"\"\"\n return scheduler.add(coro)\n\n\n# FIXME is this really required?\n_rlock = threading.RLock()\n\n\ndef mem_debug(port):\n import cocotb.memdebug\n cocotb.memdebug.start(port)\n\n\ndef _initialise_testbench(argv_): # pragma: no cover\n \"\"\"Initialize testbench.\n\n This function is called after the simulator has elaborated all\n entities and is ready to run the test.\n\n The test must be defined by the environment variables\n :envvar:`MODULE` and :envvar:`TESTCASE`.\n\n The environment variable :envvar:`COCOTB_HOOKS`, if present, contains a\n comma-separated list of modules to be executed before the first test.\n \"\"\"\n with _rlock:\n\n if \"COCOTB_LIBRARY_COVERAGE\" in os.environ:\n import coverage\n\n global _library_coverage\n _library_coverage = coverage.coverage(\n data_file=\".coverage.cocotb\",\n branch=True,\n include=[\"{}/*\".format(os.path.dirname(__file__))])\n _library_coverage.start()\n\n return _initialise_testbench_(argv_)\n\n\ndef _initialise_testbench_(argv_):\n # The body of this function is split in two because no coverage is collected on\n # the function that starts the coverage. By splitting it in two we get coverage\n # on most of the function.\n\n global argc, argv\n argv = argv_\n argc = len(argv)\n\n root_name = os.getenv(\"TOPLEVEL\")\n if root_name is not None:\n root_name = root_name.strip()\n if root_name == \"\":\n root_name = None\n elif '.' 
in root_name:\n # Skip any library component of the toplevel\n root_name = root_name.split(\".\", 1)[1]\n\n # sys.path normally includes \"\" (the current directory), but does not appear to when python is embedded.\n # Add it back because users expect to be able to import files in their test directory.\n # TODO: move this to gpi_embed.cpp\n sys.path.insert(0, \"\")\n\n _setup_logging()\n\n # From https://www.python.org/dev/peps/pep-0565/#recommended-filter-settings-for-test-runners\n # If the user doesn't want to see these, they can always change the global\n # warning settings in their test module.\n if not sys.warnoptions:\n warnings.simplefilter(\"default\")\n\n from cocotb import simulator\n\n global SIM_NAME, SIM_VERSION\n SIM_NAME = simulator.get_simulator_product().strip()\n SIM_VERSION = simulator.get_simulator_version().strip()\n\n cocotb.log.info(\"Running on {} version {}\".format(SIM_NAME, SIM_VERSION))\n\n memcheck_port = os.getenv('MEMCHECK')\n if memcheck_port is not None:\n mem_debug(int(memcheck_port))\n\n log.info(\"Running tests with cocotb v%s from %s\" %\n (__version__, os.path.dirname(__file__)))\n\n # Create the base handle type\n\n process_plusargs()\n\n global scheduler\n scheduler = Scheduler()\n\n # Seed the Python random number generator to make this repeatable\n global RANDOM_SEED\n RANDOM_SEED = os.getenv('RANDOM_SEED')\n\n if RANDOM_SEED is None:\n if 'ntb_random_seed' in plusargs:\n RANDOM_SEED = eval(plusargs['ntb_random_seed'])\n elif 'seed' in plusargs:\n RANDOM_SEED = eval(plusargs['seed'])\n else:\n RANDOM_SEED = int(time.time())\n log.info(\"Seeding Python random module with %d\" % (RANDOM_SEED))\n else:\n RANDOM_SEED = int(RANDOM_SEED)\n log.info(\"Seeding Python random module with supplied seed %d\" % (RANDOM_SEED))\n random.seed(RANDOM_SEED)\n\n # Setup DUT object\n from cocotb import simulator\n\n handle = simulator.get_root_handle(root_name)\n if not handle:\n raise RuntimeError(\"Can not find root handle ({})\".format(root_name))\n\n global top\n top = cocotb.handle.SimHandle(handle)\n\n try:\n import pytest\n except ImportError:\n log.warning(\"Pytest not found, assertion rewriting will not occur\")\n else:\n try:\n # Install the assertion rewriting hook, which must be done before we\n # import the test modules.\n from _pytest.config import Config\n from _pytest.assertion import install_importhook\n pytest_conf = Config.fromdictargs([], {})\n install_importhook(pytest_conf)\n except Exception:\n log.exception(\n \"Configuring the assertion rewrite hook using pytest {} failed. 
\"\n \"Please file a bug report!\".format(pytest.__version__))\n\n # start Regression Manager\n global regression_manager\n regression_manager = RegressionManager.from_discovery(top)\n regression_manager.execute()\n\n return True\n\n\ndef _sim_event(level, message):\n \"\"\"Function that can be called externally to signal an event.\"\"\"\n # SIM_INFO = 0\n SIM_TEST_FAIL = 1\n SIM_FAIL = 2\n from cocotb.result import TestFailure, SimFailure\n\n if level is SIM_TEST_FAIL:\n scheduler.log.error(\"Failing test at simulator request\")\n scheduler._finish_test(TestFailure(\"Failure from external source: {}\".format(message)))\n elif level is SIM_FAIL:\n # We simply return here as the simulator will exit\n # so no cleanup is needed\n msg = \"Failing test at simulator request before test run completion: {}\".format(message)\n scheduler.log.error(msg)\n scheduler._finish_scheduler(SimFailure(msg))\n else:\n scheduler.log.error(\"Unsupported sim event\")\n\n return True\n\n\ndef process_plusargs():\n\n global plusargs\n\n plusargs = {}\n\n for option in cocotb.argv:\n if option.startswith('+'):\n if option.find('=') != -1:\n (name, value) = option[1:].split('=')\n plusargs[name] = value\n else:\n plusargs[option[1:]] = True\n", "path": "cocotb/__init__.py"}]} | 4,078 | 105 |
gh_patches_debug_3055 | rasdani/github-patches | git_diff | spyder-ide__spyder-3909 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Profiler error when no filename is passed to it
Hello!
* Spyder Version: Git version (4dev)
* Python Version: 3.5
* Operating system: Arch Linux
When I click on Run > Profile, it produces this error:
`Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/spyder_profiler/profiler.py", line 138, in run_profiler
self.analyze(self.main.editor.get_current_filename())
File "/usr/lib/python3.5/site-packages/spyder_profiler/profiler.py", line 155, in analyze
pythonpath=pythonpath)
TypeError: analyze() got multiple values for argument 'wdir'`
--- END ISSUE ---
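The traceback comes from the extra positional argument in `Profiler.analyze()`: passing `self` along with `filename` shifts `filename` into the `wdir` slot of `ProfilerWidget.analyze()`, which then collides with the explicit `wdir=` keyword. A small, self-contained reproduction of that binding (the `ProfilerWidget.analyze` signature here is hypothetical, inferred from the keyword arguments used at the call site):

```python
class ProfilerWidget:
    # Hypothetical signature, inferred from the call site in profiler.py.
    def analyze(self, filename, wdir=None, args=None, pythonpath=None):
        print(f"profiling {filename} (wdir={wdir})")


widget = ProfilerWidget()
plugin = object()  # stands in for the Profiler plugin instance

try:
    # Buggy call shape: the stray plugin/self argument pushes "script.py" into
    # the wdir parameter, which then clashes with the wdir=... keyword.
    widget.analyze(plugin, "script.py", wdir="/tmp", args=[], pythonpath=[])
except TypeError as exc:
    print(exc)  # analyze() got multiple values for argument 'wdir'

# Fixed call shape (dropping the stray first argument, as the patch does):
widget.analyze("script.py", wdir="/tmp", args=[], pythonpath=[])
```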
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `spyder_profiler/profiler.py`
Content:
```
1 # -*- coding:utf-8 -*-
2 #
3 # Copyright © Spyder Project Contributors
4 # based on p_pylint.py by Pierre Raybaut
5 #
6 # Licensed under the terms of the MIT License
7 # (see spyder/__init__.py for details)
8
9 """Profiler Plugin."""
10
11 # Standard library imports
12 import os.path as osp
13
14 # Third party imports
15 from qtpy.QtCore import Qt
16 from qtpy.QtWidgets import QGroupBox, QLabel, QVBoxLayout
17
18 # Local imports
19 from spyder.config.base import get_translation
20 from spyder.api.plugins import SpyderPluginWidget
21 from spyder.api.preferences import PluginConfigPage
22 from spyder.plugins.runconfig import get_run_configuration
23 from spyder.utils import icon_manager as ima
24 from spyder.utils.qthelpers import create_action
25 from .widgets.profilergui import (ProfilerWidget, is_profiler_installed)
26
27
28 _ = get_translation("profiler", "spyder_profiler")
29
30
31 class ProfilerConfigPage(PluginConfigPage):
32 def setup_page(self):
33 results_group = QGroupBox(_("Results"))
34 results_label1 = QLabel(_("Profiler plugin results "
35 "(the output of python's profile/cProfile)\n"
36 "are stored here:"))
37 results_label1.setWordWrap(True)
38
39 # Warning: do not try to regroup the following QLabel contents with
40 # widgets above -- this string was isolated here in a single QLabel
41 # on purpose: to fix Issue 863
42 results_label2 = QLabel(ProfilerWidget.DATAPATH)
43
44 results_label2.setTextInteractionFlags(Qt.TextSelectableByMouse)
45 results_label2.setWordWrap(True)
46
47 results_layout = QVBoxLayout()
48 results_layout.addWidget(results_label1)
49 results_layout.addWidget(results_label2)
50 results_group.setLayout(results_layout)
51
52 vlayout = QVBoxLayout()
53 vlayout.addWidget(results_group)
54 vlayout.addStretch(1)
55 self.setLayout(vlayout)
56
57
58 class Profiler(SpyderPluginWidget):
59 """Profiler (after python's profile and pstats)."""
60
61 CONF_SECTION = 'profiler'
62 CONFIGWIDGET_CLASS = ProfilerConfigPage
63
64 def __init__(self, parent=None):
65 SpyderPluginWidget.__init__(self, parent)
66
67 max_entries = self.get_option('max_entries', 50)
68 self.profiler = ProfilerWidget(self, max_entries)
69
70 layout = QVBoxLayout()
71 layout.addWidget(self.profiler)
72 self.setLayout(layout)
73
74 # Initialize plugin
75 self.initialize_plugin()
76
77 #------ SpyderPluginWidget API ---------------------------------------------
78 def get_plugin_title(self):
79 """Return widget title"""
80 return _("Profiler")
81
82 def get_plugin_icon(self):
83 """Return widget icon"""
84 path = osp.join(self.PLUGIN_PATH, self.IMG_PATH)
85 return ima.icon('profiler', icon_path=path)
86
87 def get_focus_widget(self):
88 """
89 Return the widget to give focus to when
90 this plugin's dockwidget is raised on top-level
91 """
92 return self.profiler.datatree
93
94 def get_plugin_actions(self):
95 """Return a list of actions related to plugin"""
96 return []
97
98 def on_first_registration(self):
99 """Action to be performed on first plugin registration"""
100 self.main.tabify_plugins(self.main.help, self)
101 self.dockwidget.hide()
102
103 def register_plugin(self):
104 """Register plugin in Spyder's main window"""
105 self.profiler.datatree.sig_edit_goto.connect(self.main.editor.load)
106 self.profiler.redirect_stdio.connect(
107 self.main.redirect_internalshell_stdio)
108 self.main.add_dockwidget(self)
109
110 profiler_act = create_action(self, _("Profile"),
111 icon=self.get_plugin_icon(),
112 triggered=self.run_profiler)
113 profiler_act.setEnabled(is_profiler_installed())
114 self.register_shortcut(profiler_act, context="Profiler",
115 name="Run profiler")
116
117 self.main.run_menu_actions += [profiler_act]
118 self.main.editor.pythonfile_dependent_actions += [profiler_act]
119
120 def refresh_plugin(self):
121 """Refresh profiler widget"""
122 #self.remove_obsolete_items() # FIXME: not implemented yet
123
124 def closing_plugin(self, cancelable=False):
125 """Perform actions before parent main window is closed"""
126 return True
127
128 def apply_plugin_settings(self, options):
129 """Apply configuration file's plugin settings"""
130 # The history depth option will be applied at
131 # next Spyder startup, which is soon enough
132 pass
133
134 #------ Public API ---------------------------------------------------------
135 def run_profiler(self):
136 """Run profiler"""
137 if self.main.editor.save():
138 self.analyze(self.main.editor.get_current_filename())
139
140 def analyze(self, filename):
141 """Reimplement analyze method"""
142 if self.dockwidget and not self.ismaximized:
143 self.dockwidget.setVisible(True)
144 self.dockwidget.setFocus()
145 self.dockwidget.raise_()
146 pythonpath = self.main.get_spyder_pythonpath()
147 runconf = get_run_configuration(filename)
148 wdir, args = None, []
149 if runconf is not None:
150 if runconf.wdir_enabled:
151 wdir = runconf.wdir
152 if runconf.args_enabled:
153 args = runconf.args
154 self.profiler.analyze(self, filename, wdir=wdir, args=args,
155 pythonpath=pythonpath)
156
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/spyder_profiler/profiler.py b/spyder_profiler/profiler.py
--- a/spyder_profiler/profiler.py
+++ b/spyder_profiler/profiler.py
@@ -151,5 +151,5 @@
wdir = runconf.wdir
if runconf.args_enabled:
args = runconf.args
- self.profiler.analyze(self, filename, wdir=wdir, args=args,
+ self.profiler.analyze(filename, wdir=wdir, args=args,
pythonpath=pythonpath)
| {"golden_diff": "diff --git a/spyder_profiler/profiler.py b/spyder_profiler/profiler.py\n--- a/spyder_profiler/profiler.py\n+++ b/spyder_profiler/profiler.py\n@@ -151,5 +151,5 @@\n wdir = runconf.wdir\n if runconf.args_enabled:\n args = runconf.args\n- self.profiler.analyze(self, filename, wdir=wdir, args=args,\n+ self.profiler.analyze(filename, wdir=wdir, args=args,\n pythonpath=pythonpath)\n", "issue": "Profiler error when no filename is passed to it\nHello!\r\n\r\n* Spyder Version: Git version (4dev)\r\n* Python Version: 3.5\r\n* Operating system: Arch Linux\r\n\r\nWhen I click on Run > Profile , It produces this error:\r\n`Traceback (most recent call last):\r\n File \"/usr/lib/python3.5/site-packages/spyder_profiler/profiler.py\", line 138, in run_profiler\r\n self.analyze(self.main.editor.get_current_filename())\r\n File \"/usr/lib/python3.5/site-packages/spyder_profiler/profiler.py\", line 155, in analyze\r\n pythonpath=pythonpath)\r\nTypeError: analyze() got multiple values for argument 'wdir'`\r\n\r\n\n", "before_files": [{"content": "# -*- coding:utf-8 -*-\n#\n# Copyright \u00a9 Spyder Project Contributors\n# based on p_pylint.py by Pierre Raybaut\n#\n# Licensed under the terms of the MIT License\n# (see spyder/__init__.py for details)\n\n\"\"\"Profiler Plugin.\"\"\"\n\n# Standard library imports\nimport os.path as osp\n\n# Third party imports\nfrom qtpy.QtCore import Qt\nfrom qtpy.QtWidgets import QGroupBox, QLabel, QVBoxLayout\n\n# Local imports\nfrom spyder.config.base import get_translation\nfrom spyder.api.plugins import SpyderPluginWidget\nfrom spyder.api.preferences import PluginConfigPage\nfrom spyder.plugins.runconfig import get_run_configuration\nfrom spyder.utils import icon_manager as ima\nfrom spyder.utils.qthelpers import create_action\nfrom .widgets.profilergui import (ProfilerWidget, is_profiler_installed)\n\n\n_ = get_translation(\"profiler\", \"spyder_profiler\")\n\n\nclass ProfilerConfigPage(PluginConfigPage):\n def setup_page(self):\n results_group = QGroupBox(_(\"Results\"))\n results_label1 = QLabel(_(\"Profiler plugin results \"\n \"(the output of python's profile/cProfile)\\n\"\n \"are stored here:\"))\n results_label1.setWordWrap(True)\n\n # Warning: do not try to regroup the following QLabel contents with\n # widgets above -- this string was isolated here in a single QLabel\n # on purpose: to fix Issue 863\n results_label2 = QLabel(ProfilerWidget.DATAPATH)\n\n results_label2.setTextInteractionFlags(Qt.TextSelectableByMouse)\n results_label2.setWordWrap(True)\n\n results_layout = QVBoxLayout()\n results_layout.addWidget(results_label1)\n results_layout.addWidget(results_label2)\n results_group.setLayout(results_layout)\n\n vlayout = QVBoxLayout()\n vlayout.addWidget(results_group)\n vlayout.addStretch(1)\n self.setLayout(vlayout)\n\n\nclass Profiler(SpyderPluginWidget):\n \"\"\"Profiler (after python's profile and pstats).\"\"\"\n\n CONF_SECTION = 'profiler'\n CONFIGWIDGET_CLASS = ProfilerConfigPage\n\n def __init__(self, parent=None):\n SpyderPluginWidget.__init__(self, parent)\n\n max_entries = self.get_option('max_entries', 50)\n self.profiler = ProfilerWidget(self, max_entries)\n\n layout = QVBoxLayout()\n layout.addWidget(self.profiler)\n self.setLayout(layout)\n\n # Initialize plugin\n self.initialize_plugin()\n \n #------ SpyderPluginWidget API --------------------------------------------- \n def get_plugin_title(self):\n \"\"\"Return widget title\"\"\"\n return _(\"Profiler\")\n\n def get_plugin_icon(self):\n \"\"\"Return widget icon\"\"\"\n path 
= osp.join(self.PLUGIN_PATH, self.IMG_PATH)\n return ima.icon('profiler', icon_path=path)\n\n def get_focus_widget(self):\n \"\"\"\n Return the widget to give focus to when\n this plugin's dockwidget is raised on top-level\n \"\"\"\n return self.profiler.datatree\n \n def get_plugin_actions(self):\n \"\"\"Return a list of actions related to plugin\"\"\"\n return []\n\n def on_first_registration(self):\n \"\"\"Action to be performed on first plugin registration\"\"\"\n self.main.tabify_plugins(self.main.help, self)\n self.dockwidget.hide()\n\n def register_plugin(self):\n \"\"\"Register plugin in Spyder's main window\"\"\"\n self.profiler.datatree.sig_edit_goto.connect(self.main.editor.load)\n self.profiler.redirect_stdio.connect(\n self.main.redirect_internalshell_stdio)\n self.main.add_dockwidget(self)\n\n profiler_act = create_action(self, _(\"Profile\"),\n icon=self.get_plugin_icon(),\n triggered=self.run_profiler)\n profiler_act.setEnabled(is_profiler_installed())\n self.register_shortcut(profiler_act, context=\"Profiler\",\n name=\"Run profiler\")\n \n self.main.run_menu_actions += [profiler_act]\n self.main.editor.pythonfile_dependent_actions += [profiler_act]\n\n def refresh_plugin(self):\n \"\"\"Refresh profiler widget\"\"\"\n #self.remove_obsolete_items() # FIXME: not implemented yet\n \n def closing_plugin(self, cancelable=False):\n \"\"\"Perform actions before parent main window is closed\"\"\"\n return True\n \n def apply_plugin_settings(self, options):\n \"\"\"Apply configuration file's plugin settings\"\"\"\n # The history depth option will be applied at \n # next Spyder startup, which is soon enough\n pass\n \n #------ Public API --------------------------------------------------------- \n def run_profiler(self):\n \"\"\"Run profiler\"\"\"\n if self.main.editor.save():\n self.analyze(self.main.editor.get_current_filename())\n\n def analyze(self, filename):\n \"\"\"Reimplement analyze method\"\"\"\n if self.dockwidget and not self.ismaximized:\n self.dockwidget.setVisible(True)\n self.dockwidget.setFocus()\n self.dockwidget.raise_()\n pythonpath = self.main.get_spyder_pythonpath()\n runconf = get_run_configuration(filename)\n wdir, args = None, []\n if runconf is not None:\n if runconf.wdir_enabled:\n wdir = runconf.wdir\n if runconf.args_enabled:\n args = runconf.args\n self.profiler.analyze(self, filename, wdir=wdir, args=args,\n pythonpath=pythonpath)\n", "path": "spyder_profiler/profiler.py"}], "after_files": [{"content": "# -*- coding:utf-8 -*-\n#\n# Copyright \u00a9 Spyder Project Contributors\n# based on p_pylint.py by Pierre Raybaut\n#\n# Licensed under the terms of the MIT License\n# (see spyder/__init__.py for details)\n\n\"\"\"Profiler Plugin.\"\"\"\n\n# Standard library imports\nimport os.path as osp\n\n# Third party imports\nfrom qtpy.QtCore import Qt\nfrom qtpy.QtWidgets import QGroupBox, QLabel, QVBoxLayout\n\n# Local imports\nfrom spyder.config.base import get_translation\nfrom spyder.api.plugins import SpyderPluginWidget\nfrom spyder.api.preferences import PluginConfigPage\nfrom spyder.plugins.runconfig import get_run_configuration\nfrom spyder.utils import icon_manager as ima\nfrom spyder.utils.qthelpers import create_action\nfrom .widgets.profilergui import (ProfilerWidget, is_profiler_installed)\n\n\n_ = get_translation(\"profiler\", \"spyder_profiler\")\n\n\nclass ProfilerConfigPage(PluginConfigPage):\n def setup_page(self):\n results_group = QGroupBox(_(\"Results\"))\n results_label1 = QLabel(_(\"Profiler plugin results \"\n \"(the output of python's 
profile/cProfile)\\n\"\n \"are stored here:\"))\n results_label1.setWordWrap(True)\n\n # Warning: do not try to regroup the following QLabel contents with\n # widgets above -- this string was isolated here in a single QLabel\n # on purpose: to fix Issue 863\n results_label2 = QLabel(ProfilerWidget.DATAPATH)\n\n results_label2.setTextInteractionFlags(Qt.TextSelectableByMouse)\n results_label2.setWordWrap(True)\n\n results_layout = QVBoxLayout()\n results_layout.addWidget(results_label1)\n results_layout.addWidget(results_label2)\n results_group.setLayout(results_layout)\n\n vlayout = QVBoxLayout()\n vlayout.addWidget(results_group)\n vlayout.addStretch(1)\n self.setLayout(vlayout)\n\n\nclass Profiler(SpyderPluginWidget):\n \"\"\"Profiler (after python's profile and pstats).\"\"\"\n\n CONF_SECTION = 'profiler'\n CONFIGWIDGET_CLASS = ProfilerConfigPage\n\n def __init__(self, parent=None):\n SpyderPluginWidget.__init__(self, parent)\n\n max_entries = self.get_option('max_entries', 50)\n self.profiler = ProfilerWidget(self, max_entries)\n\n layout = QVBoxLayout()\n layout.addWidget(self.profiler)\n self.setLayout(layout)\n\n # Initialize plugin\n self.initialize_plugin()\n \n #------ SpyderPluginWidget API --------------------------------------------- \n def get_plugin_title(self):\n \"\"\"Return widget title\"\"\"\n return _(\"Profiler\")\n\n def get_plugin_icon(self):\n \"\"\"Return widget icon\"\"\"\n path = osp.join(self.PLUGIN_PATH, self.IMG_PATH)\n return ima.icon('profiler', icon_path=path)\n\n def get_focus_widget(self):\n \"\"\"\n Return the widget to give focus to when\n this plugin's dockwidget is raised on top-level\n \"\"\"\n return self.profiler.datatree\n \n def get_plugin_actions(self):\n \"\"\"Return a list of actions related to plugin\"\"\"\n return []\n\n def on_first_registration(self):\n \"\"\"Action to be performed on first plugin registration\"\"\"\n self.main.tabify_plugins(self.main.help, self)\n self.dockwidget.hide()\n\n def register_plugin(self):\n \"\"\"Register plugin in Spyder's main window\"\"\"\n self.profiler.datatree.sig_edit_goto.connect(self.main.editor.load)\n self.profiler.redirect_stdio.connect(\n self.main.redirect_internalshell_stdio)\n self.main.add_dockwidget(self)\n\n profiler_act = create_action(self, _(\"Profile\"),\n icon=self.get_plugin_icon(),\n triggered=self.run_profiler)\n profiler_act.setEnabled(is_profiler_installed())\n self.register_shortcut(profiler_act, context=\"Profiler\",\n name=\"Run profiler\")\n \n self.main.run_menu_actions += [profiler_act]\n self.main.editor.pythonfile_dependent_actions += [profiler_act]\n\n def refresh_plugin(self):\n \"\"\"Refresh profiler widget\"\"\"\n #self.remove_obsolete_items() # FIXME: not implemented yet\n \n def closing_plugin(self, cancelable=False):\n \"\"\"Perform actions before parent main window is closed\"\"\"\n return True\n \n def apply_plugin_settings(self, options):\n \"\"\"Apply configuration file's plugin settings\"\"\"\n # The history depth option will be applied at \n # next Spyder startup, which is soon enough\n pass\n \n #------ Public API --------------------------------------------------------- \n def run_profiler(self):\n \"\"\"Run profiler\"\"\"\n if self.main.editor.save():\n self.analyze(self.main.editor.get_current_filename())\n\n def analyze(self, filename):\n \"\"\"Reimplement analyze method\"\"\"\n if self.dockwidget and not self.ismaximized:\n self.dockwidget.setVisible(True)\n self.dockwidget.setFocus()\n self.dockwidget.raise_()\n pythonpath = 
self.main.get_spyder_pythonpath()\n runconf = get_run_configuration(filename)\n wdir, args = None, []\n if runconf is not None:\n if runconf.wdir_enabled:\n wdir = runconf.wdir\n if runconf.args_enabled:\n args = runconf.args\n self.profiler.analyze(filename, wdir=wdir, args=args,\n pythonpath=pythonpath)\n", "path": "spyder_profiler/profiler.py"}]} | 1,907 | 124 |
gh_patches_debug_23931 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-4285 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
nationwide_gb spider returning closed branches
The nationwide_gb spider is currently returning a number of branches that are described as "Permanently Closed" on their web pages. For example:
* https://www.nationwide.co.uk/branches/glasgow/1635-great-western-road
* https://www.nationwide.co.uk/branches/bournemouth/10-southbourne-grove
* https://www.nationwide.co.uk/branches/london/129-balham-high-road
They all have "- permanently closed" appended to their names, which might be robust enough to use to detect them. If not, they have their opening times for each day set to "closed".
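A minimal sketch of how such items could be filtered out, assuming the scraped item's `name` field carries the "- permanently closed" suffix described above (the field name and suffix are taken from this issue text, not confirmed against the spider's output):

```python
def is_open_branch(item: dict) -> bool:
    # Closed branches still publish structured data, but the branch name
    # has "- permanently closed" appended to it.
    return "permanently closed" not in item.get("name", "").lower()


assert not is_open_branch({"name": "1635 Great Western Road - permanently closed"})
assert is_open_branch({"name": "18/19 Weston Favell Centre"})
```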
nationwide_gb spider missing branches (regex not general enough)
According to e.g. https://www.altfi.com/article/9347_nationwide-pledges-to-keep-its-625-branches-open-until-2024 Nationwide should have 625 UK branches. The current nationwide_gb spider is only returning 549.
One issue that is probably behind most (if not all) of the missing branches is that the regex `r"https:\/\/www\.nationwide\.co\.uk\/branches\/[-()\w]+\/[-\w]+$"` used to detect branch page URLs is not sufficiently general. In addition to word characters and hyphens, the final (branch) part of the URL can also contain a forward slash (used to denote house-number ranges).
For example: https://www.nationwide.co.uk/branches/northampton/18/19-weston-favell-centre
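A sketch of a loosened pattern that also accepts the extra path segment used for house-number ranges (one possible fix, not necessarily the rule the project should adopt):

```python
import re

# Allow one or more path segments after the town, so branch pages such as
# /branches/northampton/18/19-weston-favell-centre are matched too.
BRANCH_URL = re.compile(
    r"https://www\.nationwide\.co\.uk/branches/[-()\w]+(?:/[-\w]+)+$"
)

assert BRANCH_URL.match(
    "https://www.nationwide.co.uk/branches/northampton/18/19-weston-favell-centre"
)
assert BRANCH_URL.match(
    "https://www.nationwide.co.uk/branches/glasgow/1635-great-western-road"
)
```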
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/nationwide_gb.py`
Content:
```
1 from scrapy.linkextractors import LinkExtractor
2 from scrapy.spiders import CrawlSpider, Rule
3
4 from locations.structured_data_spider import StructuredDataSpider
5
6
7 class NationwideGB(CrawlSpider, StructuredDataSpider):
8 name = "nationwide_gb"
9 item_attributes = {"brand": "Nationwide", "brand_wikidata": "Q846735"}
10 start_urls = ["https://www.nationwide.co.uk/branches/index.html"]
11 rules = [
12 Rule(
13 LinkExtractor(
14 allow=r"https:\/\/www\.nationwide\.co\.uk\/branches\/[-()\w]+\/[-\w]+$"
15 ),
16 callback="parse_sd",
17 ),
18 Rule(
19 LinkExtractor(
20 allow=r"https:\/\/www\.nationwide\.co\.uk\/branches\/[-()\w]+$"
21 )
22 ),
23 ]
24 wanted_types = ["BankOrCreditUnion"]
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/nationwide_gb.py b/locations/spiders/nationwide_gb.py
--- a/locations/spiders/nationwide_gb.py
+++ b/locations/spiders/nationwide_gb.py
@@ -1,24 +1,20 @@
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule
+from locations.categories import Categories
from locations.structured_data_spider import StructuredDataSpider
class NationwideGB(CrawlSpider, StructuredDataSpider):
name = "nationwide_gb"
- item_attributes = {"brand": "Nationwide", "brand_wikidata": "Q846735"}
+ item_attributes = {
+ "brand": "Nationwide",
+ "brand_wikidata": "Q846735",
+ "extras": Categories.BANK.value,
+ }
start_urls = ["https://www.nationwide.co.uk/branches/index.html"]
- rules = [
- Rule(
- LinkExtractor(
- allow=r"https:\/\/www\.nationwide\.co\.uk\/branches\/[-()\w]+\/[-\w]+$"
- ),
- callback="parse_sd",
- ),
- Rule(
- LinkExtractor(
- allow=r"https:\/\/www\.nationwide\.co\.uk\/branches\/[-()\w]+$"
- )
- ),
- ]
- wanted_types = ["BankOrCreditUnion"]
+ rules = [Rule(LinkExtractor(allow=r"/branches/"), callback="parse_sd", follow=True)]
+
+ def post_process_item(self, item, response, ld_data, **kwargs):
+ if "permanently closed" not in item["name"].lower():
+ yield item
| {"golden_diff": "diff --git a/locations/spiders/nationwide_gb.py b/locations/spiders/nationwide_gb.py\n--- a/locations/spiders/nationwide_gb.py\n+++ b/locations/spiders/nationwide_gb.py\n@@ -1,24 +1,20 @@\n from scrapy.linkextractors import LinkExtractor\n from scrapy.spiders import CrawlSpider, Rule\n \n+from locations.categories import Categories\n from locations.structured_data_spider import StructuredDataSpider\n \n \n class NationwideGB(CrawlSpider, StructuredDataSpider):\n name = \"nationwide_gb\"\n- item_attributes = {\"brand\": \"Nationwide\", \"brand_wikidata\": \"Q846735\"}\n+ item_attributes = {\n+ \"brand\": \"Nationwide\",\n+ \"brand_wikidata\": \"Q846735\",\n+ \"extras\": Categories.BANK.value,\n+ }\n start_urls = [\"https://www.nationwide.co.uk/branches/index.html\"]\n- rules = [\n- Rule(\n- LinkExtractor(\n- allow=r\"https:\\/\\/www\\.nationwide\\.co\\.uk\\/branches\\/[-()\\w]+\\/[-\\w]+$\"\n- ),\n- callback=\"parse_sd\",\n- ),\n- Rule(\n- LinkExtractor(\n- allow=r\"https:\\/\\/www\\.nationwide\\.co\\.uk\\/branches\\/[-()\\w]+$\"\n- )\n- ),\n- ]\n- wanted_types = [\"BankOrCreditUnion\"]\n+ rules = [Rule(LinkExtractor(allow=r\"/branches/\"), callback=\"parse_sd\", follow=True)]\n+\n+ def post_process_item(self, item, response, ld_data, **kwargs):\n+ if \"permanently closed\" not in item[\"name\"].lower():\n+ yield item\n", "issue": "nationwide_gb spider returning closed branches\nThe nationwide_gb spide is currently returning a number of branches that are described as \"Permanently Closed\" on their web pages. For example:\r\n\r\n* https://www.nationwide.co.uk/branches/glasgow/1635-great-western-road\r\n* https://www.nationwide.co.uk/branches/bournemouth/10-southbourne-grove\r\n* https://www.nationwide.co.uk/branches/london/129-balham-high-road\r\n\r\nThey all have \"- permanently closed\" appended to their names, which might be robust enough to use to detect them. If not, they have their opening times for each day set to \"closed\".\nnationwide_gb spider missing branches (regex not general enough)\nAccording to e.g. https://www.altfi.com/article/9347_nationwide-pledges-to-keep-its-625-branches-open-until-2024 Nationwide should have 625 UK branches. The current nationwide_gb spider is only returning 549.\r\n\r\nOne issue that is probably behind most (if not all) of the missing branches is that the regex `r\"https:\\/\\/www\\.nationwide\\.co\\.uk\\/branches\\/[-()\\w]+\\/[-\\w]+$\"` used to detect branch page URLs is not sufficiently general. 
In addition to word characters and hypens the final (branch) part of the URL can also contain a forward slash (used to denote house-number ranges).\r\n\r\nFor example: https://www.nationwide.co.uk/branches/northampton/18/19-weston-favell-centre\n", "before_files": [{"content": "from scrapy.linkextractors import LinkExtractor\nfrom scrapy.spiders import CrawlSpider, Rule\n\nfrom locations.structured_data_spider import StructuredDataSpider\n\n\nclass NationwideGB(CrawlSpider, StructuredDataSpider):\n name = \"nationwide_gb\"\n item_attributes = {\"brand\": \"Nationwide\", \"brand_wikidata\": \"Q846735\"}\n start_urls = [\"https://www.nationwide.co.uk/branches/index.html\"]\n rules = [\n Rule(\n LinkExtractor(\n allow=r\"https:\\/\\/www\\.nationwide\\.co\\.uk\\/branches\\/[-()\\w]+\\/[-\\w]+$\"\n ),\n callback=\"parse_sd\",\n ),\n Rule(\n LinkExtractor(\n allow=r\"https:\\/\\/www\\.nationwide\\.co\\.uk\\/branches\\/[-()\\w]+$\"\n )\n ),\n ]\n wanted_types = [\"BankOrCreditUnion\"]\n", "path": "locations/spiders/nationwide_gb.py"}], "after_files": [{"content": "from scrapy.linkextractors import LinkExtractor\nfrom scrapy.spiders import CrawlSpider, Rule\n\nfrom locations.categories import Categories\nfrom locations.structured_data_spider import StructuredDataSpider\n\n\nclass NationwideGB(CrawlSpider, StructuredDataSpider):\n name = \"nationwide_gb\"\n item_attributes = {\n \"brand\": \"Nationwide\",\n \"brand_wikidata\": \"Q846735\",\n \"extras\": Categories.BANK.value,\n }\n start_urls = [\"https://www.nationwide.co.uk/branches/index.html\"]\n rules = [Rule(LinkExtractor(allow=r\"/branches/\"), callback=\"parse_sd\", follow=True)]\n\n def post_process_item(self, item, response, ld_data, **kwargs):\n if \"permanently closed\" not in item[\"name\"].lower():\n yield item\n", "path": "locations/spiders/nationwide_gb.py"}]} | 835 | 380 |
gh_patches_debug_35892 | rasdani/github-patches | git_diff | dotkom__onlineweb4-545 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error reports should not contain sensitive data
https://docs.djangoproject.com/en/dev/howto/error-reporting/#filtering-error-reports
Here is a guide on how to do it.
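A minimal sketch of what that guide recommends, using Django's stock decorators (the view and parameter names below are illustrative, not taken from this codebase):

```python
from django.http import HttpResponseRedirect
from django.views.decorators.debug import sensitive_post_parameters, sensitive_variables


@sensitive_variables("password")
@sensitive_post_parameters("password")
def login(request):
    password = request.POST.get("password", "")
    # ... authentication logic; if an exception escapes this view, Django's
    # error report shows ********** for the decorated POST parameter and
    # local variable instead of the submitted value.
    return HttpResponseRedirect("/")
```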
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/authentication/views.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 import uuid
4 import re
5
6 from django.contrib import auth
7 from django.contrib import messages
8 from django.core.mail import send_mail
9 from django.shortcuts import render, redirect, get_object_or_404
10 from django.http import HttpResponseRedirect
11 from django.utils.translation import ugettext as _
12
13 from django.conf import settings
14 from apps.authentication.forms import (LoginForm, RegisterForm,
15 RecoveryForm, ChangePasswordForm)
16 from apps.authentication.models import OnlineUser as User, RegisterToken, Email
17
18 def login(request):
19 redirect_url = request.REQUEST.get('next', '')
20 if request.method == 'POST':
21 form = LoginForm(request.POST)
22 if form.login(request):
23 messages.success(request, _(u'Du er nå logget inn.'))
24 if redirect_url:
25 return HttpResponseRedirect(redirect_url)
26 return HttpResponseRedirect('/')
27 else: form = LoginForm(request.POST, auto_id=True)
28 else:
29 form = LoginForm()
30
31 response_dict = { 'form' : form, 'next' : redirect_url}
32 return render(request, 'auth/login.html', response_dict)
33
34 def logout(request):
35 auth.logout(request)
36 messages.success(request, _(u'Du er nå logget ut.'))
37 return HttpResponseRedirect('/')
38
39 def register(request):
40 if request.user.is_authenticated():
41 messages.error(request, _(u'Registrering av ny konto krever at du er logget ut.'))
42 return HttpResponseRedirect('/')
43 else:
44 if request.method == 'POST':
45 form = RegisterForm(request.POST)
46 if form.is_valid():
47 cleaned = form.cleaned_data
48
49 # Create user
50 user = User(
51 username=cleaned['username'].lower(),
52 first_name=cleaned['first_name'].title(),
53 last_name=cleaned['last_name'].title(),
54 )
55 # Set remaining fields
56 user.phone_number=cleaned['phone']
57 user.address=cleaned['address'].title()
58 user.zip_code=cleaned['zip_code']
59 # Store password properly
60 user.set_password(cleaned['password'])
61 # Users need to be manually activated
62 user.is_active = False
63 user.save()
64
65 # Set email address
66 email = Email(
67 user=user,
68 email=cleaned['email'],
69 )
70 email.primary = True
71 email.save()
72
73 # Create the registration token
74 token = uuid.uuid4().hex
75 rt = RegisterToken(user=user, email=cleaned['email'], token=token)
76 rt.save()
77
78 email_message = _(u"""
79 En konto har blitt registrert på online.ntnu.no med denne epostadressen. Dersom du ikke
80 har utført denne handlingen ber vi deg se bort fra denne eposten.
81
82 For å bruke denne kontoen kreves det at du verifiserer epostadressen. Du kan gjøre
83 dette ved å besøke linken under.
84
85 http://%s/auth/verify/%s/
86
87 Denne lenken vil være gyldig i 24 timer. Dersom du behøver å få tilsendt en ny lenke
88 kan dette gjøres med funksjonen for å gjenopprette passord.
89 """) % (request.META['HTTP_HOST'], token)
90
91 send_mail(_(u'Verifiser din konto'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])
92
93 messages.success(request, _(u'Registreringen var vellykket. Se tilsendt epost for verifiseringsinstrukser.'))
94
95 return HttpResponseRedirect('/')
96 else:
97 form = RegisterForm(request.POST, auto_id=True)
98 else:
99 form = RegisterForm()
100
101 return render(request, 'auth/register.html', {'form': form, })
102
103 def verify(request, token):
104 rt = get_object_or_404(RegisterToken, token=token)
105
106 if rt.is_valid:
107 email = get_object_or_404(Email, email=rt.email)
108 email.verified = True
109 email.save()
110
111 user = getattr(rt, 'user')
112
113 # If it is a stud email, set the ntnu_username for user
114 if re.match(r'[^@][email protected]', rt.email):
115 user.ntnu_username = rt.email.split("@")[0]
116
117 user_activated = False
118 if not user.is_active:
119 user.is_active = True
120 user_activated = True
121
122 user.save()
123 rt.delete()
124
125 if user_activated:
126 messages.success(request, _(u'Bruker %s ble aktivert. Du kan nå logge inn.') % user.username)
127 return redirect('auth_login')
128 else:
129 messages.success(request, _(u'Eposten %s er nå verifisert.') % email)
130 return redirect('profiles')
131 else:
132 messages.error(request, _(u'Denne lenken er utløpt. Bruk gjenopprett passord for å få tilsendt en ny lenke.'))
133 return HttpResponseRedirect('/')
134
135
136 def recover(request):
137 if request.user.is_authenticated():
138 messages.error(request, _(u'Gjenoppretning av passord krever at du er logget ut.'))
139 return HttpResponseRedirect('/')
140 else:
141 if request.method == 'POST':
142 form = RecoveryForm(request.POST)
143 if form.is_valid():
144 email_string = form.cleaned_data['email']
145 emails = Email.objects.filter(email=email_string)
146
147 if len(emails) == 0:
148 messages.error(request, _(u'Denne eposten er ikke registrert i våre systemer.'))
149 return HttpResponseRedirect('/')
150
151 email = emails[0]
152
153 # Create the registration token
154 token = uuid.uuid4().hex
155 rt = RegisterToken(user=email.user.username, email=email.email, token=token)
156 rt.save()
157
158 email_message = _(u"""
159 Vi har mottat forespørsel om å gjenopprette passordet for kontoen bundet til %s.
160 Dersom du ikke har bedt om denne handlingen ber vi deg se bort fra denne eposten.
161
162 Brukernavn: %s
163
164 Hvis du ønsker å gjennomføre en gjenoppretning av passord, bruk lenken under.
165
166 http://%s/auth/set_password/%s/
167
168 Denne lenken vil være gyldig i 24 timer. Dersom du behøver å få tilsendt en ny lenke
169 kan dette gjøres med funksjonen for å gjenopprette passord.
170 """) % (email.email, email.user.username, request.META['HTTP_HOST'], token)
171
172 send_mail(_(u'Gjenoppretning av passord'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])
173
174 messages.success(request, _(u'En lenke for gjenoppretning har blitt sendt til %s.') % email.email)
175
176 return HttpResponseRedirect('/')
177 else:
178 form = RecoveryForm(request.POST, auto_id=True)
179 else:
180 form = RecoveryForm()
181
182 return render(request, 'auth/recover.html', {'form': form})
183
184 def set_password(request, token=None):
185 if request.user.is_authenticated():
186 return HttpResponseRedirect('/')
187 else:
188 rt = get_object_or_404(RegisterToken, token=token)
189
190 if rt.is_valid:
191 if request.method == 'POST':
192 form = ChangePasswordForm(request.POST, auto_id=True)
193 if form.is_valid():
194 user = getattr(rt, 'user')
195
196 user.is_active = True
197 user.set_password(form.cleaned_data['new_password'])
198 user.save()
199
200 rt.delete()
201
202 messages.success(request, _(u'Bruker %s har gjennomført vellykket gjenoppretning av passord. Du kan nå logge inn.') % user.username)
203
204 return HttpResponseRedirect('/')
205 else:
206
207 form = ChangePasswordForm()
208
209 messages.success(request, _(u'Lenken er akseptert. Vennligst skriv inn ønsket passord.'))
210
211 return render(request, 'auth/set_password.html', {'form': form, 'token': token})
212
213 else:
214 messages.error(request, _(u'Lenken er utløpt. Vennligst bruk gjenoppretning av passord for å få tilsendt en ny lenke.'))
215 return HttpResponseRedirect('/')
216
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/apps/authentication/views.py b/apps/authentication/views.py
--- a/apps/authentication/views.py
+++ b/apps/authentication/views.py
@@ -9,12 +9,15 @@
from django.shortcuts import render, redirect, get_object_or_404
from django.http import HttpResponseRedirect
from django.utils.translation import ugettext as _
+from django.views.decorators.debug import sensitive_post_parameters
from django.conf import settings
from apps.authentication.forms import (LoginForm, RegisterForm,
RecoveryForm, ChangePasswordForm)
from apps.authentication.models import OnlineUser as User, RegisterToken, Email
+
+@sensitive_post_parameters()
def login(request):
redirect_url = request.REQUEST.get('next', '')
if request.method == 'POST':
@@ -31,11 +34,14 @@
response_dict = { 'form' : form, 'next' : redirect_url}
return render(request, 'auth/login.html', response_dict)
+
def logout(request):
auth.logout(request)
messages.success(request, _(u'Du er nå logget ut.'))
return HttpResponseRedirect('/')
+
+@sensitive_post_parameters()
def register(request):
if request.user.is_authenticated():
messages.error(request, _(u'Registrering av ny konto krever at du er logget ut.'))
@@ -100,6 +106,7 @@
return render(request, 'auth/register.html', {'form': form, })
+
def verify(request, token):
rt = get_object_or_404(RegisterToken, token=token)
@@ -181,6 +188,8 @@
return render(request, 'auth/recover.html', {'form': form})
+
+@sensitive_post_parameters()
def set_password(request, token=None):
if request.user.is_authenticated():
return HttpResponseRedirect('/')
| {"golden_diff": "diff --git a/apps/authentication/views.py b/apps/authentication/views.py\n--- a/apps/authentication/views.py\n+++ b/apps/authentication/views.py\n@@ -9,12 +9,15 @@\n from django.shortcuts import render, redirect, get_object_or_404\n from django.http import HttpResponseRedirect\n from django.utils.translation import ugettext as _\n+from django.views.decorators.debug import sensitive_post_parameters\n \n from django.conf import settings\n from apps.authentication.forms import (LoginForm, RegisterForm, \n RecoveryForm, ChangePasswordForm)\n from apps.authentication.models import OnlineUser as User, RegisterToken, Email\n \n+\n+@sensitive_post_parameters()\n def login(request):\n redirect_url = request.REQUEST.get('next', '')\n if request.method == 'POST':\n@@ -31,11 +34,14 @@\n response_dict = { 'form' : form, 'next' : redirect_url}\n return render(request, 'auth/login.html', response_dict)\n \n+\n def logout(request):\n auth.logout(request)\n messages.success(request, _(u'Du er n\u00e5 logget ut.'))\n return HttpResponseRedirect('/')\n \n+\n+@sensitive_post_parameters()\n def register(request):\n if request.user.is_authenticated():\n messages.error(request, _(u'Registrering av ny konto krever at du er logget ut.'))\n@@ -100,6 +106,7 @@\n \n return render(request, 'auth/register.html', {'form': form, })\n \n+\n def verify(request, token):\n rt = get_object_or_404(RegisterToken, token=token)\n \n@@ -181,6 +188,8 @@\n \n return render(request, 'auth/recover.html', {'form': form})\n \n+\n+@sensitive_post_parameters()\n def set_password(request, token=None): \n if request.user.is_authenticated():\n return HttpResponseRedirect('/')\n", "issue": "Error reports should not contain sensitive data\nhttps://docs.djangoproject.com/en/dev/howto/error-reporting/#filtering-error-reports\n\nHere is a guide how to do it.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport uuid\nimport re\n\nfrom django.contrib import auth\nfrom django.contrib import messages\nfrom django.core.mail import send_mail\nfrom django.shortcuts import render, redirect, get_object_or_404\nfrom django.http import HttpResponseRedirect\nfrom django.utils.translation import ugettext as _\n\nfrom django.conf import settings\nfrom apps.authentication.forms import (LoginForm, RegisterForm, \n RecoveryForm, ChangePasswordForm)\nfrom apps.authentication.models import OnlineUser as User, RegisterToken, Email\n\ndef login(request):\n redirect_url = request.REQUEST.get('next', '')\n if request.method == 'POST':\n form = LoginForm(request.POST)\n if form.login(request):\n messages.success(request, _(u'Du er n\u00e5 logget inn.'))\n if redirect_url:\n return HttpResponseRedirect(redirect_url)\n return HttpResponseRedirect('/')\n else: form = LoginForm(request.POST, auto_id=True)\n else:\n form = LoginForm()\n\n response_dict = { 'form' : form, 'next' : redirect_url}\n return render(request, 'auth/login.html', response_dict)\n\ndef logout(request):\n auth.logout(request)\n messages.success(request, _(u'Du er n\u00e5 logget ut.'))\n return HttpResponseRedirect('/')\n\ndef register(request):\n if request.user.is_authenticated():\n messages.error(request, _(u'Registrering av ny konto krever at du er logget ut.'))\n return HttpResponseRedirect('/')\n else:\n if request.method == 'POST':\n form = RegisterForm(request.POST)\n if form.is_valid():\n cleaned = form.cleaned_data\n\n # Create user\n user = User(\n username=cleaned['username'].lower(), \n first_name=cleaned['first_name'].title(), \n 
last_name=cleaned['last_name'].title(),\n )\n # Set remaining fields\n user.phone_number=cleaned['phone']\n user.address=cleaned['address'].title()\n user.zip_code=cleaned['zip_code']\n # Store password properly\n user.set_password(cleaned['password'])\n # Users need to be manually activated\n user.is_active = False\n user.save()\n\n # Set email address\n email = Email(\n user=user,\n email=cleaned['email'],\n )\n email.primary = True\n email.save() \n\n # Create the registration token\n token = uuid.uuid4().hex\n rt = RegisterToken(user=user, email=cleaned['email'], token=token)\n rt.save()\n\n email_message = _(u\"\"\"\nEn konto har blitt registrert p\u00e5 online.ntnu.no med denne epostadressen. Dersom du ikke\nhar utf\u00f8rt denne handlingen ber vi deg se bort fra denne eposten.\n\nFor \u00e5 bruke denne kontoen kreves det at du verifiserer epostadressen. Du kan gj\u00f8re\ndette ved \u00e5 bes\u00f8ke linken under.\n\nhttp://%s/auth/verify/%s/\n\nDenne lenken vil v\u00e6re gyldig i 24 timer. Dersom du beh\u00f8ver \u00e5 f\u00e5 tilsendt en ny lenke\nkan dette gj\u00f8res med funksjonen for \u00e5 gjenopprette passord.\n\"\"\") % (request.META['HTTP_HOST'], token)\n\n send_mail(_(u'Verifiser din konto'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])\n\n messages.success(request, _(u'Registreringen var vellykket. Se tilsendt epost for verifiseringsinstrukser.'))\n\n return HttpResponseRedirect('/') \n else:\n form = RegisterForm(request.POST, auto_id=True)\n else:\n form = RegisterForm()\n\n return render(request, 'auth/register.html', {'form': form, })\n\ndef verify(request, token):\n rt = get_object_or_404(RegisterToken, token=token)\n \n if rt.is_valid:\n email = get_object_or_404(Email, email=rt.email)\n email.verified = True\n email.save()\n \n user = getattr(rt, 'user')\n\n # If it is a stud email, set the ntnu_username for user\n if re.match(r'[^@][email protected]', rt.email):\n user.ntnu_username = rt.email.split(\"@\")[0]\n\n user_activated = False\n if not user.is_active:\n user.is_active = True\n user_activated = True\n\n user.save()\n rt.delete()\n\n if user_activated:\n messages.success(request, _(u'Bruker %s ble aktivert. Du kan n\u00e5 logge inn.') % user.username)\n return redirect('auth_login')\n else:\n messages.success(request, _(u'Eposten %s er n\u00e5 verifisert.') % email)\n return redirect('profiles')\n else:\n messages.error(request, _(u'Denne lenken er utl\u00f8pt. 
Bruk gjenopprett passord for \u00e5 f\u00e5 tilsendt en ny lenke.'))\n return HttpResponseRedirect('/') \n \n\ndef recover(request):\n if request.user.is_authenticated():\n messages.error(request, _(u'Gjenoppretning av passord krever at du er logget ut.'))\n return HttpResponseRedirect('/')\n else:\n if request.method == 'POST':\n form = RecoveryForm(request.POST)\n if form.is_valid():\n email_string = form.cleaned_data['email']\n emails = Email.objects.filter(email=email_string)\n\n if len(emails) == 0:\n messages.error(request, _(u'Denne eposten er ikke registrert i v\u00e5re systemer.'))\n return HttpResponseRedirect('/') \n\n email = emails[0]\n \n # Create the registration token\n token = uuid.uuid4().hex\n rt = RegisterToken(user=email.user.username, email=email.email, token=token)\n rt.save()\n\n email_message = _(u\"\"\"\nVi har mottat foresp\u00f8rsel om \u00e5 gjenopprette passordet for kontoen bundet til %s.\nDersom du ikke har bedt om denne handlingen ber vi deg se bort fra denne eposten.\n\nBrukernavn: %s\n\nHvis du \u00f8nsker \u00e5 gjennomf\u00f8re en gjenoppretning av passord, bruk lenken under.\n\nhttp://%s/auth/set_password/%s/\n\nDenne lenken vil v\u00e6re gyldig i 24 timer. Dersom du beh\u00f8ver \u00e5 f\u00e5 tilsendt en ny lenke\nkan dette gj\u00f8res med funksjonen for \u00e5 gjenopprette passord.\n\"\"\") % (email.email, email.user.username, request.META['HTTP_HOST'], token)\n\n send_mail(_(u'Gjenoppretning av passord'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])\n\n messages.success(request, _(u'En lenke for gjenoppretning har blitt sendt til %s.') % email.email)\n\n return HttpResponseRedirect('/') \n else:\n form = RecoveryForm(request.POST, auto_id=True)\n else:\n form = RecoveryForm()\n\n return render(request, 'auth/recover.html', {'form': form})\n\ndef set_password(request, token=None): \n if request.user.is_authenticated():\n return HttpResponseRedirect('/')\n else:\n rt = get_object_or_404(RegisterToken, token=token)\n \n if rt.is_valid:\n if request.method == 'POST':\n form = ChangePasswordForm(request.POST, auto_id=True)\n if form.is_valid():\n user = getattr(rt, 'user')\n\n user.is_active = True\n user.set_password(form.cleaned_data['new_password'])\n user.save()\n \n rt.delete()\n\n messages.success(request, _(u'Bruker %s har gjennomf\u00f8rt vellykket gjenoppretning av passord. Du kan n\u00e5 logge inn.') % user.username)\n \n return HttpResponseRedirect('/') \n else:\n \n form = ChangePasswordForm()\n\n messages.success(request, _(u'Lenken er akseptert. Vennligst skriv inn \u00f8nsket passord.'))\n\n return render(request, 'auth/set_password.html', {'form': form, 'token': token})\n\n else:\n messages.error(request, _(u'Lenken er utl\u00f8pt. 
Vennligst bruk gjenoppretning av passord for \u00e5 f\u00e5 tilsendt en ny lenke.'))\n return HttpResponseRedirect('/') \n", "path": "apps/authentication/views.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport uuid\nimport re\n\nfrom django.contrib import auth\nfrom django.contrib import messages\nfrom django.core.mail import send_mail\nfrom django.shortcuts import render, redirect, get_object_or_404\nfrom django.http import HttpResponseRedirect\nfrom django.utils.translation import ugettext as _\nfrom django.views.decorators.debug import sensitive_post_parameters\n\nfrom django.conf import settings\nfrom apps.authentication.forms import (LoginForm, RegisterForm, \n RecoveryForm, ChangePasswordForm)\nfrom apps.authentication.models import OnlineUser as User, RegisterToken, Email\n\n\n@sensitive_post_parameters()\ndef login(request):\n redirect_url = request.REQUEST.get('next', '')\n if request.method == 'POST':\n form = LoginForm(request.POST)\n if form.login(request):\n messages.success(request, _(u'Du er n\u00e5 logget inn.'))\n if redirect_url:\n return HttpResponseRedirect(redirect_url)\n return HttpResponseRedirect('/')\n else: form = LoginForm(request.POST, auto_id=True)\n else:\n form = LoginForm()\n\n response_dict = { 'form' : form, 'next' : redirect_url}\n return render(request, 'auth/login.html', response_dict)\n\n\ndef logout(request):\n auth.logout(request)\n messages.success(request, _(u'Du er n\u00e5 logget ut.'))\n return HttpResponseRedirect('/')\n\n\n@sensitive_post_parameters()\ndef register(request):\n if request.user.is_authenticated():\n messages.error(request, _(u'Registrering av ny konto krever at du er logget ut.'))\n return HttpResponseRedirect('/')\n else:\n if request.method == 'POST':\n form = RegisterForm(request.POST)\n if form.is_valid():\n cleaned = form.cleaned_data\n\n # Create user\n user = User(\n username=cleaned['username'].lower(), \n first_name=cleaned['first_name'].title(), \n last_name=cleaned['last_name'].title(),\n )\n # Set remaining fields\n user.phone_number=cleaned['phone']\n user.address=cleaned['address'].title()\n user.zip_code=cleaned['zip_code']\n # Store password properly\n user.set_password(cleaned['password'])\n # Users need to be manually activated\n user.is_active = False\n user.save()\n\n # Set email address\n email = Email(\n user=user,\n email=cleaned['email'],\n )\n email.primary = True\n email.save() \n\n # Create the registration token\n token = uuid.uuid4().hex\n rt = RegisterToken(user=user, email=cleaned['email'], token=token)\n rt.save()\n\n email_message = _(u\"\"\"\nEn konto har blitt registrert p\u00e5 online.ntnu.no med denne epostadressen. Dersom du ikke\nhar utf\u00f8rt denne handlingen ber vi deg se bort fra denne eposten.\n\nFor \u00e5 bruke denne kontoen kreves det at du verifiserer epostadressen. Du kan gj\u00f8re\ndette ved \u00e5 bes\u00f8ke linken under.\n\nhttp://%s/auth/verify/%s/\n\nDenne lenken vil v\u00e6re gyldig i 24 timer. Dersom du beh\u00f8ver \u00e5 f\u00e5 tilsendt en ny lenke\nkan dette gj\u00f8res med funksjonen for \u00e5 gjenopprette passord.\n\"\"\") % (request.META['HTTP_HOST'], token)\n\n send_mail(_(u'Verifiser din konto'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])\n\n messages.success(request, _(u'Registreringen var vellykket. 
Se tilsendt epost for verifiseringsinstrukser.'))\n\n return HttpResponseRedirect('/') \n else:\n form = RegisterForm(request.POST, auto_id=True)\n else:\n form = RegisterForm()\n\n return render(request, 'auth/register.html', {'form': form, })\n\n\ndef verify(request, token):\n rt = get_object_or_404(RegisterToken, token=token)\n \n if rt.is_valid:\n email = get_object_or_404(Email, email=rt.email)\n email.verified = True\n email.save()\n \n user = getattr(rt, 'user')\n\n # If it is a stud email, set the ntnu_username for user\n if re.match(r'[^@][email protected]', rt.email):\n user.ntnu_username = rt.email.split(\"@\")[0]\n\n user_activated = False\n if not user.is_active:\n user.is_active = True\n user_activated = True\n\n user.save()\n rt.delete()\n\n if user_activated:\n messages.success(request, _(u'Bruker %s ble aktivert. Du kan n\u00e5 logge inn.') % user.username)\n return redirect('auth_login')\n else:\n messages.success(request, _(u'Eposten %s er n\u00e5 verifisert.') % email)\n return redirect('profiles')\n else:\n messages.error(request, _(u'Denne lenken er utl\u00f8pt. Bruk gjenopprett passord for \u00e5 f\u00e5 tilsendt en ny lenke.'))\n return HttpResponseRedirect('/') \n \n\ndef recover(request):\n if request.user.is_authenticated():\n messages.error(request, _(u'Gjenoppretning av passord krever at du er logget ut.'))\n return HttpResponseRedirect('/')\n else:\n if request.method == 'POST':\n form = RecoveryForm(request.POST)\n if form.is_valid():\n email_string = form.cleaned_data['email']\n emails = Email.objects.filter(email=email_string)\n\n if len(emails) == 0:\n messages.error(request, _(u'Denne eposten er ikke registrert i v\u00e5re systemer.'))\n return HttpResponseRedirect('/') \n\n email = emails[0]\n \n # Create the registration token\n token = uuid.uuid4().hex\n rt = RegisterToken(user=email.user.username, email=email.email, token=token)\n rt.save()\n\n email_message = _(u\"\"\"\nVi har mottat foresp\u00f8rsel om \u00e5 gjenopprette passordet for kontoen bundet til %s.\nDersom du ikke har bedt om denne handlingen ber vi deg se bort fra denne eposten.\n\nBrukernavn: %s\n\nHvis du \u00f8nsker \u00e5 gjennomf\u00f8re en gjenoppretning av passord, bruk lenken under.\n\nhttp://%s/auth/set_password/%s/\n\nDenne lenken vil v\u00e6re gyldig i 24 timer. Dersom du beh\u00f8ver \u00e5 f\u00e5 tilsendt en ny lenke\nkan dette gj\u00f8res med funksjonen for \u00e5 gjenopprette passord.\n\"\"\") % (email.email, email.user.username, request.META['HTTP_HOST'], token)\n\n send_mail(_(u'Gjenoppretning av passord'), email_message, settings.DEFAULT_FROM_EMAIL, [email.email,])\n\n messages.success(request, _(u'En lenke for gjenoppretning har blitt sendt til %s.') % email.email)\n\n return HttpResponseRedirect('/') \n else:\n form = RecoveryForm(request.POST, auto_id=True)\n else:\n form = RecoveryForm()\n\n return render(request, 'auth/recover.html', {'form': form})\n\n\n@sensitive_post_parameters()\ndef set_password(request, token=None): \n if request.user.is_authenticated():\n return HttpResponseRedirect('/')\n else:\n rt = get_object_or_404(RegisterToken, token=token)\n \n if rt.is_valid:\n if request.method == 'POST':\n form = ChangePasswordForm(request.POST, auto_id=True)\n if form.is_valid():\n user = getattr(rt, 'user')\n\n user.is_active = True\n user.set_password(form.cleaned_data['new_password'])\n user.save()\n \n rt.delete()\n\n messages.success(request, _(u'Bruker %s har gjennomf\u00f8rt vellykket gjenoppretning av passord. 
Du kan n\u00e5 logge inn.') % user.username)\n \n return HttpResponseRedirect('/') \n else:\n \n form = ChangePasswordForm()\n\n messages.success(request, _(u'Lenken er akseptert. Vennligst skriv inn \u00f8nsket passord.'))\n\n return render(request, 'auth/set_password.html', {'form': form, 'token': token})\n\n else:\n messages.error(request, _(u'Lenken er utl\u00f8pt. Vennligst bruk gjenoppretning av passord for \u00e5 f\u00e5 tilsendt en ny lenke.'))\n return HttpResponseRedirect('/') \n", "path": "apps/authentication/views.py"}]} | 2,658 | 392 |
gh_patches_debug_34402 | rasdani/github-patches | git_diff | HypothesisWorks__hypothesis-3807 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`builds` fails when used with `attrs` classes containing private attributes
Hi, I think I may have found a bug that occurs when using `builds` with automatic attribute inference from `attrs`-built classes that have required but "private" attributes. The bug is probably due to the fact that the name exposed via the constructor differs from the attribute name in these cases, since `attrs` by default strips the leading underscore.
Here is a minimal example:
```python
from attrs import define
from hypothesis import given, strategies as st
@define
class MyClass:
_x: int
@given(st.builds(MyClass))
def test_class(my_obj):
assert isinstance(my_obj, MyClass)
```
The error you get:
```python
AttributeError: 'MyClassAttributes' object has no attribute 'x'
```
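A small sketch of the underlying mismatch, relying on `attr.Attribute.alias` (available in attrs 22.2+, which is an assumption about the installed version):

```python
import attr
from attrs import define


@define
class MyClass:
    _x: int


fields = attr.fields(MyClass)
assert fields[0].name == "_x"   # class-level attribute name
assert fields[0].alias == "x"   # argument name in the generated __init__

# builds() infers keyword arguments from __init__, so it looks up "x";
# resolving the field by its alias instead of getattr(fields, "x") avoids
# the AttributeError above.
field = next(f for f in fields if f.alias == "x")
assert field is fields[0]
```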
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hypothesis-python/setup.py`
Content:
```
1 # This file is part of Hypothesis, which may be found at
2 # https://github.com/HypothesisWorks/hypothesis/
3 #
4 # Copyright the Hypothesis Authors.
5 # Individual contributors are listed in AUTHORS.rst and the git log.
6 #
7 # This Source Code Form is subject to the terms of the Mozilla Public License,
8 # v. 2.0. If a copy of the MPL was not distributed with this file, You can
9 # obtain one at https://mozilla.org/MPL/2.0/.
10
11 import sys
12 import warnings
13 from pathlib import Path
14
15 import setuptools
16
17 if sys.version_info[:2] < (3, 8): # noqa # "unreachable" sanity check
18 raise Exception(
19 "You are trying to install Hypothesis using Python "
20 f"{sys.version.split()[0]}, but it requires Python 3.8 or later."
21 "Update `pip` and `setuptools`, try again, and you will automatically "
22 "get the latest compatible version of Hypothesis instead. "
23 "See also https://python3statement.org/practicalities/"
24 )
25
26
27 def local_file(name):
28 return Path(__file__).absolute().parent.joinpath(name).relative_to(Path.cwd())
29
30
31 SOURCE = str(local_file("src"))
32
33 setuptools_version = tuple(map(int, setuptools.__version__.split(".")[:1]))
34
35 if setuptools_version < (42,):
36 # Warning only - very bad if uploading bdist but fine if installing sdist.
37 warnings.warn(
38 "This version of setuptools is too old to handle license_files "
39 "metadata key. For more info, see: "
40 "https://setuptools.pypa.io/en/latest/userguide/declarative_config.html#metadata",
41 stacklevel=1,
42 )
43
44
45 # Assignment to placate pyflakes. The actual version is from the exec that follows.
46 __version__ = None
47 exec(local_file("src/hypothesis/version.py").read_text(encoding="utf-8"))
48 assert __version__ is not None
49
50
51 extras = {
52 "cli": ["click>=7.0", "black>=19.10b0", "rich>=9.0.0"],
53 "codemods": ["libcst>=0.3.16"],
54 "ghostwriter": ["black>=19.10b0"],
55 "pytz": ["pytz>=2014.1"],
56 "dateutil": ["python-dateutil>=1.4"],
57 "lark": ["lark>=0.10.1"], # probably still works with old `lark-parser` too
58 "numpy": ["numpy>=1.17.3"], # oldest with wheels for non-EOL Python (for now)
59 "pandas": ["pandas>=1.1"],
60 "pytest": ["pytest>=4.6"],
61 "dpcontracts": ["dpcontracts>=0.4"],
62 "redis": ["redis>=3.0.0"],
63 # zoneinfo is an odd one: every dependency is conditional, because they're
64 # only necessary on old versions of Python or Windows systems.
65 "zoneinfo": [
66 "tzdata>=2023.3 ; sys_platform == 'win32'",
67 "backports.zoneinfo>=0.2.1 ; python_version<'3.9'",
68 ],
69 # We only support Django versions with upstream support - see
70 # https://www.djangoproject.com/download/#supported-versions
71 # We also leave the choice of timezone library to the user, since it
72 # might be zoneinfo or pytz depending on version and configuration.
73 "django": ["django>=3.2"],
74 }
75
76 extras["all"] = sorted(set(sum(extras.values(), [])))
77
78
79 setuptools.setup(
80 name="hypothesis",
81 version=__version__,
82 author="David R. MacIver and Zac Hatfield-Dodds",
83 author_email="[email protected]",
84 packages=setuptools.find_packages(SOURCE),
85 package_dir={"": SOURCE},
86 package_data={"hypothesis": ["py.typed", "vendor/tlds-alpha-by-domain.txt"]},
87 url="https://hypothesis.works",
88 project_urls={
89 "Source": "https://github.com/HypothesisWorks/hypothesis/tree/master/hypothesis-python",
90 "Changelog": "https://hypothesis.readthedocs.io/en/latest/changes.html",
91 "Documentation": "https://hypothesis.readthedocs.io",
92 "Issues": "https://github.com/HypothesisWorks/hypothesis/issues",
93 },
94 license="MPL-2.0",
95 description="A library for property-based testing",
96 zip_safe=False,
97 extras_require=extras,
98 install_requires=[
99 "attrs>=19.2.0",
100 "exceptiongroup>=1.0.0 ; python_version<'3.11'",
101 "sortedcontainers>=2.1.0,<3.0.0",
102 ],
103 python_requires=">=3.8",
104 classifiers=[
105 "Development Status :: 5 - Production/Stable",
106 "Framework :: Hypothesis",
107 "Framework :: Pytest",
108 "Intended Audience :: Developers",
109 "License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)",
110 "Operating System :: Unix",
111 "Operating System :: POSIX",
112 "Operating System :: Microsoft :: Windows",
113 "Programming Language :: Python",
114 "Programming Language :: Python :: 3",
115 "Programming Language :: Python :: 3 :: Only",
116 "Programming Language :: Python :: 3.8",
117 "Programming Language :: Python :: 3.9",
118 "Programming Language :: Python :: 3.10",
119 "Programming Language :: Python :: 3.11",
120 "Programming Language :: Python :: 3.12",
121 "Programming Language :: Python :: Implementation :: CPython",
122 "Programming Language :: Python :: Implementation :: PyPy",
123 "Topic :: Education :: Testing",
124 "Topic :: Software Development :: Testing",
125 "Typing :: Typed",
126 ],
127 py_modules=["_hypothesis_pytestplugin", "_hypothesis_ftz_detector"],
128 entry_points={
129 "pytest11": ["hypothesispytest = _hypothesis_pytestplugin"],
130 "console_scripts": ["hypothesis = hypothesis.extra.cli:main"],
131 },
132 long_description=local_file("README.rst").read_text(encoding="utf-8"),
133 long_description_content_type="text/x-rst",
134 keywords="python testing fuzzing property-based-testing",
135 )
136
```
Path: `hypothesis-python/src/hypothesis/strategies/_internal/attrs.py`
Content:
```
1 # This file is part of Hypothesis, which may be found at
2 # https://github.com/HypothesisWorks/hypothesis/
3 #
4 # Copyright the Hypothesis Authors.
5 # Individual contributors are listed in AUTHORS.rst and the git log.
6 #
7 # This Source Code Form is subject to the terms of the Mozilla Public License,
8 # v. 2.0. If a copy of the MPL was not distributed with this file, You can
9 # obtain one at https://mozilla.org/MPL/2.0/.
10
11 from functools import reduce
12 from itertools import chain
13
14 import attr
15
16 from hypothesis import strategies as st
17 from hypothesis.errors import ResolutionFailed
18 from hypothesis.internal.compat import get_type_hints
19 from hypothesis.strategies._internal.core import BuildsStrategy
20 from hypothesis.strategies._internal.types import is_a_type, type_sorting_key
21 from hypothesis.utils.conventions import infer
22
23
24 def from_attrs(target, args, kwargs, to_infer):
25 """An internal version of builds(), specialised for Attrs classes."""
26 fields = attr.fields(target)
27 kwargs = {k: v for k, v in kwargs.items() if v is not infer}
28 for name in to_infer:
29 kwargs[name] = from_attrs_attribute(getattr(fields, name), target)
30 # We might make this strategy more efficient if we added a layer here that
31 # retries drawing if validation fails, for improved composition.
32 # The treatment of timezones in datetimes() provides a precedent.
33 return BuildsStrategy(target, args, kwargs)
34
35
36 def from_attrs_attribute(attrib, target):
37 """Infer a strategy from the metadata on an attr.Attribute object."""
38 # Try inferring from the default argument. Note that this will only help if
39 # the user passed `...` to builds() for this attribute, but in that case
40 # we use it as the minimal example.
41 default = st.nothing()
42 if isinstance(attrib.default, attr.Factory):
43 if not attrib.default.takes_self:
44 default = st.builds(attrib.default.factory)
45 elif attrib.default is not attr.NOTHING:
46 default = st.just(attrib.default)
47
48 # Try inferring None, exact values, or type from attrs provided validators.
49 null = st.nothing() # updated to none() on seeing an OptionalValidator
50 in_collections = [] # list of in_ validator collections to sample from
51 validator_types = set() # type constraints to pass to types_to_strategy()
52 if attrib.validator is not None:
53 validator = attrib.validator
54 if isinstance(validator, attr.validators._OptionalValidator):
55 null = st.none()
56 validator = validator.validator
57 if isinstance(validator, attr.validators._AndValidator):
58 vs = validator._validators
59 else:
60 vs = [validator]
61 for v in vs:
62 if isinstance(v, attr.validators._InValidator):
63 if isinstance(v.options, str):
64 in_collections.append(list(all_substrings(v.options)))
65 else:
66 in_collections.append(v.options)
67 elif isinstance(v, attr.validators._InstanceOfValidator):
68 validator_types.add(v.type)
69
70 # This is the important line. We compose the final strategy from various
71 # parts. The default value, if any, is the minimal shrink, followed by
72 # None (again, if allowed). We then prefer to sample from values passed
73 # to an in_ validator if available, but infer from a type otherwise.
74 # Pick one because (sampled_from((1, 2)) | from_type(int)) would usually
75 # fail validation by generating e.g. zero!
76 if in_collections:
77 sample = st.sampled_from(list(ordered_intersection(in_collections)))
78 strat = default | null | sample
79 else:
80 strat = default | null | types_to_strategy(attrib, validator_types)
81
82 # Better to give a meaningful error here than an opaque "could not draw"
83 # when we try to get a value but have lost track of where this was created.
84 if strat.is_empty:
85 raise ResolutionFailed(
86 "Cannot infer a strategy from the default, validator, type, or "
87 f"converter for attribute={attrib!r} of class={target!r}"
88 )
89 return strat
90
91
92 def types_to_strategy(attrib, types):
93 """Find all the type metadata for this attribute, reconcile it, and infer a
94 strategy from the mess."""
95 # If we know types from the validator(s), that's sufficient.
96 if len(types) == 1:
97 (typ,) = types
98 if isinstance(typ, tuple):
99 return st.one_of(*map(st.from_type, typ))
100 return st.from_type(typ)
101 elif types:
102 # We have a list of tuples of types, and want to find a type
103 # (or tuple of types) that is a subclass of all of of them.
104 type_tuples = [k if isinstance(k, tuple) else (k,) for k in types]
105 # Flatten the list, filter types that would fail validation, and
106 # sort so that ordering is stable between runs and shrinks well.
107 allowed = [
108 t
109 for t in set(sum(type_tuples, ()))
110 if all(issubclass(t, tup) for tup in type_tuples)
111 ]
112 allowed.sort(key=type_sorting_key)
113 return st.one_of([st.from_type(t) for t in allowed])
114
115 # Otherwise, try the `type` attribute as a fallback, and finally try
116 # the type hints on a converter (desperate!) before giving up.
117 if is_a_type(getattr(attrib, "type", None)):
118 # The convoluted test is because variable annotations may be stored
119 # in string form; attrs doesn't evaluate them and we don't handle them.
120 # See PEP 526, PEP 563, and Hypothesis issue #1004 for details.
121 return st.from_type(attrib.type)
122
123 converter = getattr(attrib, "converter", None)
124 if isinstance(converter, type):
125 return st.from_type(converter)
126 elif callable(converter):
127 hints = get_type_hints(converter)
128 if "return" in hints:
129 return st.from_type(hints["return"])
130
131 return st.nothing()
132
133
134 def ordered_intersection(in_):
135 """Set union of n sequences, ordered for reproducibility across runs."""
136 intersection = reduce(set.intersection, in_, set(in_[0]))
137 for x in chain.from_iterable(in_):
138 if x in intersection:
139 yield x
140 intersection.remove(x)
141
142
143 def all_substrings(s):
144 """Generate all substrings of `s`, in order of length then occurrence.
145 Includes the empty string (first), and any duplicates that are present.
146
147 >>> list(all_substrings('010'))
148 ['', '0', '1', '0', '01', '10', '010']
149 """
150 yield s[:0]
151 for n, _ in enumerate(s):
152 for i in range(len(s) - n):
153 yield s[i : i + n + 1]
154
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hypothesis-python/setup.py b/hypothesis-python/setup.py
--- a/hypothesis-python/setup.py
+++ b/hypothesis-python/setup.py
@@ -96,7 +96,7 @@
zip_safe=False,
extras_require=extras,
install_requires=[
- "attrs>=19.2.0",
+ "attrs>=22.2.0",
"exceptiongroup>=1.0.0 ; python_version<'3.11'",
"sortedcontainers>=2.1.0,<3.0.0",
],
diff --git a/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py b/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py
--- a/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py
+++ b/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py
@@ -21,12 +21,38 @@
from hypothesis.utils.conventions import infer
+def get_attribute_by_alias(fields, alias, *, target=None):
+ """
+ Get an attrs attribute by its alias, rather than its name (compare
+ getattr(fields, name)).
+
+ ``target`` is used only to provide a nicer error message, and can be safely
+ omitted.
+ """
+ # attrs supports defining an alias for a field, which is the name used when
+ # defining __init__. The init args are what we pull from when determining
+ # what parameters we need to supply to the class, so it's what we need to
+ # match against as well, rather than the class-level attribute name.
+ matched_fields = [f for f in fields if f.alias == alias]
+ if not matched_fields:
+ raise TypeError(
+ f"Unexpected keyword argument {alias} for attrs class"
+ f"{f' {target}' if target else ''}. Expected one of "
+ f"{[f.name for f in fields]}"
+ )
+ # alias is used as an arg in __init__, so it is guaranteed to be unique, if
+ # it exists.
+ assert len(matched_fields) == 1
+ return matched_fields[0]
+
+
def from_attrs(target, args, kwargs, to_infer):
"""An internal version of builds(), specialised for Attrs classes."""
fields = attr.fields(target)
kwargs = {k: v for k, v in kwargs.items() if v is not infer}
for name in to_infer:
- kwargs[name] = from_attrs_attribute(getattr(fields, name), target)
+ attrib = get_attribute_by_alias(fields, name, target=target)
+ kwargs[name] = from_attrs_attribute(attrib, target)
# We might make this strategy more efficient if we added a layer here that
# retries drawing if validation fails, for improved composition.
# The treatment of timezones in datetimes() provides a precedent.
| {"golden_diff": "diff --git a/hypothesis-python/setup.py b/hypothesis-python/setup.py\n--- a/hypothesis-python/setup.py\n+++ b/hypothesis-python/setup.py\n@@ -96,7 +96,7 @@\n zip_safe=False,\n extras_require=extras,\n install_requires=[\n- \"attrs>=19.2.0\",\n+ \"attrs>=22.2.0\",\n \"exceptiongroup>=1.0.0 ; python_version<'3.11'\",\n \"sortedcontainers>=2.1.0,<3.0.0\",\n ],\ndiff --git a/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py b/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py\n--- a/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py\n+++ b/hypothesis-python/src/hypothesis/strategies/_internal/attrs.py\n@@ -21,12 +21,38 @@\n from hypothesis.utils.conventions import infer\n \n \n+def get_attribute_by_alias(fields, alias, *, target=None):\n+ \"\"\"\n+ Get an attrs attribute by its alias, rather than its name (compare\n+ getattr(fields, name)).\n+\n+ ``target`` is used only to provide a nicer error message, and can be safely\n+ omitted.\n+ \"\"\"\n+ # attrs supports defining an alias for a field, which is the name used when\n+ # defining __init__. The init args are what we pull from when determining\n+ # what parameters we need to supply to the class, so it's what we need to\n+ # match against as well, rather than the class-level attribute name.\n+ matched_fields = [f for f in fields if f.alias == alias]\n+ if not matched_fields:\n+ raise TypeError(\n+ f\"Unexpected keyword argument {alias} for attrs class\"\n+ f\"{f' {target}' if target else ''}. Expected one of \"\n+ f\"{[f.name for f in fields]}\"\n+ )\n+ # alias is used as an arg in __init__, so it is guaranteed to be unique, if\n+ # it exists.\n+ assert len(matched_fields) == 1\n+ return matched_fields[0]\n+\n+\n def from_attrs(target, args, kwargs, to_infer):\n \"\"\"An internal version of builds(), specialised for Attrs classes.\"\"\"\n fields = attr.fields(target)\n kwargs = {k: v for k, v in kwargs.items() if v is not infer}\n for name in to_infer:\n- kwargs[name] = from_attrs_attribute(getattr(fields, name), target)\n+ attrib = get_attribute_by_alias(fields, name, target=target)\n+ kwargs[name] = from_attrs_attribute(attrib, target)\n # We might make this strategy more efficient if we added a layer here that\n # retries drawing if validation fails, for improved composition.\n # The treatment of timezones in datetimes() provides a precedent.\n", "issue": "`builds` fails when used with `attrs` classes containing private attributes\nHi, I think I may have found a bug that occurs when using `builds` with automatic attribute inference from `attrs`-built classes that have required but \"private\" attributes. The bug is probably due to the fact that the name exposed via the constructor differs from the attribute name in these cases, since `attrs` by default strips the leading underscore. 
\r\n\r\nHere a minimal example:\r\n\r\n```python\r\nfrom attrs import define\r\nfrom hypothesis import given, strategies as st\r\n\r\n\r\n@define\r\nclass MyClass:\r\n _x: int\r\n\r\n\r\n@given(st.builds(MyClass))\r\ndef test_class(my_obj):\r\n assert isinstance(my_obj, MyClass)\r\n```\r\n\r\nThe error you get:\r\n\r\n```python\r\nAttributeError: 'MyClassAttributes' object has no attribute 'x'\r\n```\n", "before_files": [{"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Copyright the Hypothesis Authors.\n# Individual contributors are listed in AUTHORS.rst and the git log.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n\nimport sys\nimport warnings\nfrom pathlib import Path\n\nimport setuptools\n\nif sys.version_info[:2] < (3, 8): # noqa # \"unreachable\" sanity check\n raise Exception(\n \"You are trying to install Hypothesis using Python \"\n f\"{sys.version.split()[0]}, but it requires Python 3.8 or later.\"\n \"Update `pip` and `setuptools`, try again, and you will automatically \"\n \"get the latest compatible version of Hypothesis instead. \"\n \"See also https://python3statement.org/practicalities/\"\n )\n\n\ndef local_file(name):\n return Path(__file__).absolute().parent.joinpath(name).relative_to(Path.cwd())\n\n\nSOURCE = str(local_file(\"src\"))\n\nsetuptools_version = tuple(map(int, setuptools.__version__.split(\".\")[:1]))\n\nif setuptools_version < (42,):\n # Warning only - very bad if uploading bdist but fine if installing sdist.\n warnings.warn(\n \"This version of setuptools is too old to handle license_files \"\n \"metadata key. For more info, see: \"\n \"https://setuptools.pypa.io/en/latest/userguide/declarative_config.html#metadata\",\n stacklevel=1,\n )\n\n\n# Assignment to placate pyflakes. The actual version is from the exec that follows.\n__version__ = None\nexec(local_file(\"src/hypothesis/version.py\").read_text(encoding=\"utf-8\"))\nassert __version__ is not None\n\n\nextras = {\n \"cli\": [\"click>=7.0\", \"black>=19.10b0\", \"rich>=9.0.0\"],\n \"codemods\": [\"libcst>=0.3.16\"],\n \"ghostwriter\": [\"black>=19.10b0\"],\n \"pytz\": [\"pytz>=2014.1\"],\n \"dateutil\": [\"python-dateutil>=1.4\"],\n \"lark\": [\"lark>=0.10.1\"], # probably still works with old `lark-parser` too\n \"numpy\": [\"numpy>=1.17.3\"], # oldest with wheels for non-EOL Python (for now)\n \"pandas\": [\"pandas>=1.1\"],\n \"pytest\": [\"pytest>=4.6\"],\n \"dpcontracts\": [\"dpcontracts>=0.4\"],\n \"redis\": [\"redis>=3.0.0\"],\n # zoneinfo is an odd one: every dependency is conditional, because they're\n # only necessary on old versions of Python or Windows systems.\n \"zoneinfo\": [\n \"tzdata>=2023.3 ; sys_platform == 'win32'\",\n \"backports.zoneinfo>=0.2.1 ; python_version<'3.9'\",\n ],\n # We only support Django versions with upstream support - see\n # https://www.djangoproject.com/download/#supported-versions\n # We also leave the choice of timezone library to the user, since it\n # might be zoneinfo or pytz depending on version and configuration.\n \"django\": [\"django>=3.2\"],\n}\n\nextras[\"all\"] = sorted(set(sum(extras.values(), [])))\n\n\nsetuptools.setup(\n name=\"hypothesis\",\n version=__version__,\n author=\"David R. 
MacIver and Zac Hatfield-Dodds\",\n author_email=\"[email protected]\",\n packages=setuptools.find_packages(SOURCE),\n package_dir={\"\": SOURCE},\n package_data={\"hypothesis\": [\"py.typed\", \"vendor/tlds-alpha-by-domain.txt\"]},\n url=\"https://hypothesis.works\",\n project_urls={\n \"Source\": \"https://github.com/HypothesisWorks/hypothesis/tree/master/hypothesis-python\",\n \"Changelog\": \"https://hypothesis.readthedocs.io/en/latest/changes.html\",\n \"Documentation\": \"https://hypothesis.readthedocs.io\",\n \"Issues\": \"https://github.com/HypothesisWorks/hypothesis/issues\",\n },\n license=\"MPL-2.0\",\n description=\"A library for property-based testing\",\n zip_safe=False,\n extras_require=extras,\n install_requires=[\n \"attrs>=19.2.0\",\n \"exceptiongroup>=1.0.0 ; python_version<'3.11'\",\n \"sortedcontainers>=2.1.0,<3.0.0\",\n ],\n python_requires=\">=3.8\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Framework :: Hypothesis\",\n \"Framework :: Pytest\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n \"Operating System :: Unix\",\n \"Operating System :: POSIX\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n \"Programming Language :: Python :: 3.12\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: Education :: Testing\",\n \"Topic :: Software Development :: Testing\",\n \"Typing :: Typed\",\n ],\n py_modules=[\"_hypothesis_pytestplugin\", \"_hypothesis_ftz_detector\"],\n entry_points={\n \"pytest11\": [\"hypothesispytest = _hypothesis_pytestplugin\"],\n \"console_scripts\": [\"hypothesis = hypothesis.extra.cli:main\"],\n },\n long_description=local_file(\"README.rst\").read_text(encoding=\"utf-8\"),\n long_description_content_type=\"text/x-rst\",\n keywords=\"python testing fuzzing property-based-testing\",\n)\n", "path": "hypothesis-python/setup.py"}, {"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Copyright the Hypothesis Authors.\n# Individual contributors are listed in AUTHORS.rst and the git log.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n\nfrom functools import reduce\nfrom itertools import chain\n\nimport attr\n\nfrom hypothesis import strategies as st\nfrom hypothesis.errors import ResolutionFailed\nfrom hypothesis.internal.compat import get_type_hints\nfrom hypothesis.strategies._internal.core import BuildsStrategy\nfrom hypothesis.strategies._internal.types import is_a_type, type_sorting_key\nfrom hypothesis.utils.conventions import infer\n\n\ndef from_attrs(target, args, kwargs, to_infer):\n \"\"\"An internal version of builds(), specialised for Attrs classes.\"\"\"\n fields = attr.fields(target)\n kwargs = {k: v for k, v in kwargs.items() if v is not infer}\n for name in to_infer:\n kwargs[name] = from_attrs_attribute(getattr(fields, name), target)\n # We might make this strategy more efficient if we added a layer here that\n # retries drawing if validation fails, for improved composition.\n # The treatment of timezones in datetimes() provides a precedent.\n return BuildsStrategy(target, args, kwargs)\n\n\ndef from_attrs_attribute(attrib, target):\n \"\"\"Infer a strategy from the metadata on an attr.Attribute object.\"\"\"\n # Try inferring from the default argument. Note that this will only help if\n # the user passed `...` to builds() for this attribute, but in that case\n # we use it as the minimal example.\n default = st.nothing()\n if isinstance(attrib.default, attr.Factory):\n if not attrib.default.takes_self:\n default = st.builds(attrib.default.factory)\n elif attrib.default is not attr.NOTHING:\n default = st.just(attrib.default)\n\n # Try inferring None, exact values, or type from attrs provided validators.\n null = st.nothing() # updated to none() on seeing an OptionalValidator\n in_collections = [] # list of in_ validator collections to sample from\n validator_types = set() # type constraints to pass to types_to_strategy()\n if attrib.validator is not None:\n validator = attrib.validator\n if isinstance(validator, attr.validators._OptionalValidator):\n null = st.none()\n validator = validator.validator\n if isinstance(validator, attr.validators._AndValidator):\n vs = validator._validators\n else:\n vs = [validator]\n for v in vs:\n if isinstance(v, attr.validators._InValidator):\n if isinstance(v.options, str):\n in_collections.append(list(all_substrings(v.options)))\n else:\n in_collections.append(v.options)\n elif isinstance(v, attr.validators._InstanceOfValidator):\n validator_types.add(v.type)\n\n # This is the important line. We compose the final strategy from various\n # parts. The default value, if any, is the minimal shrink, followed by\n # None (again, if allowed). We then prefer to sample from values passed\n # to an in_ validator if available, but infer from a type otherwise.\n # Pick one because (sampled_from((1, 2)) | from_type(int)) would usually\n # fail validation by generating e.g. 
zero!\n if in_collections:\n sample = st.sampled_from(list(ordered_intersection(in_collections)))\n strat = default | null | sample\n else:\n strat = default | null | types_to_strategy(attrib, validator_types)\n\n # Better to give a meaningful error here than an opaque \"could not draw\"\n # when we try to get a value but have lost track of where this was created.\n if strat.is_empty:\n raise ResolutionFailed(\n \"Cannot infer a strategy from the default, validator, type, or \"\n f\"converter for attribute={attrib!r} of class={target!r}\"\n )\n return strat\n\n\ndef types_to_strategy(attrib, types):\n \"\"\"Find all the type metadata for this attribute, reconcile it, and infer a\n strategy from the mess.\"\"\"\n # If we know types from the validator(s), that's sufficient.\n if len(types) == 1:\n (typ,) = types\n if isinstance(typ, tuple):\n return st.one_of(*map(st.from_type, typ))\n return st.from_type(typ)\n elif types:\n # We have a list of tuples of types, and want to find a type\n # (or tuple of types) that is a subclass of all of of them.\n type_tuples = [k if isinstance(k, tuple) else (k,) for k in types]\n # Flatten the list, filter types that would fail validation, and\n # sort so that ordering is stable between runs and shrinks well.\n allowed = [\n t\n for t in set(sum(type_tuples, ()))\n if all(issubclass(t, tup) for tup in type_tuples)\n ]\n allowed.sort(key=type_sorting_key)\n return st.one_of([st.from_type(t) for t in allowed])\n\n # Otherwise, try the `type` attribute as a fallback, and finally try\n # the type hints on a converter (desperate!) before giving up.\n if is_a_type(getattr(attrib, \"type\", None)):\n # The convoluted test is because variable annotations may be stored\n # in string form; attrs doesn't evaluate them and we don't handle them.\n # See PEP 526, PEP 563, and Hypothesis issue #1004 for details.\n return st.from_type(attrib.type)\n\n converter = getattr(attrib, \"converter\", None)\n if isinstance(converter, type):\n return st.from_type(converter)\n elif callable(converter):\n hints = get_type_hints(converter)\n if \"return\" in hints:\n return st.from_type(hints[\"return\"])\n\n return st.nothing()\n\n\ndef ordered_intersection(in_):\n \"\"\"Set union of n sequences, ordered for reproducibility across runs.\"\"\"\n intersection = reduce(set.intersection, in_, set(in_[0]))\n for x in chain.from_iterable(in_):\n if x in intersection:\n yield x\n intersection.remove(x)\n\n\ndef all_substrings(s):\n \"\"\"Generate all substrings of `s`, in order of length then occurrence.\n Includes the empty string (first), and any duplicates that are present.\n\n >>> list(all_substrings('010'))\n ['', '0', '1', '0', '01', '10', '010']\n \"\"\"\n yield s[:0]\n for n, _ in enumerate(s):\n for i in range(len(s) - n):\n yield s[i : i + n + 1]\n", "path": "hypothesis-python/src/hypothesis/strategies/_internal/attrs.py"}], "after_files": [{"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Copyright the Hypothesis Authors.\n# Individual contributors are listed in AUTHORS.rst and the git log.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n\nimport sys\nimport warnings\nfrom pathlib import Path\n\nimport setuptools\n\nif sys.version_info[:2] < (3, 8): # noqa # \"unreachable\" sanity check\n raise Exception(\n \"You are trying to install Hypothesis using Python \"\n f\"{sys.version.split()[0]}, but it requires Python 3.8 or later.\"\n \"Update `pip` and `setuptools`, try again, and you will automatically \"\n \"get the latest compatible version of Hypothesis instead. \"\n \"See also https://python3statement.org/practicalities/\"\n )\n\n\ndef local_file(name):\n return Path(__file__).absolute().parent.joinpath(name).relative_to(Path.cwd())\n\n\nSOURCE = str(local_file(\"src\"))\n\nsetuptools_version = tuple(map(int, setuptools.__version__.split(\".\")[:1]))\n\nif setuptools_version < (42,):\n # Warning only - very bad if uploading bdist but fine if installing sdist.\n warnings.warn(\n \"This version of setuptools is too old to handle license_files \"\n \"metadata key. For more info, see: \"\n \"https://setuptools.pypa.io/en/latest/userguide/declarative_config.html#metadata\",\n stacklevel=1,\n )\n\n\n# Assignment to placate pyflakes. The actual version is from the exec that follows.\n__version__ = None\nexec(local_file(\"src/hypothesis/version.py\").read_text(encoding=\"utf-8\"))\nassert __version__ is not None\n\n\nextras = {\n \"cli\": [\"click>=7.0\", \"black>=19.10b0\", \"rich>=9.0.0\"],\n \"codemods\": [\"libcst>=0.3.16\"],\n \"ghostwriter\": [\"black>=19.10b0\"],\n \"pytz\": [\"pytz>=2014.1\"],\n \"dateutil\": [\"python-dateutil>=1.4\"],\n \"lark\": [\"lark>=0.10.1\"], # probably still works with old `lark-parser` too\n \"numpy\": [\"numpy>=1.17.3\"], # oldest with wheels for non-EOL Python (for now)\n \"pandas\": [\"pandas>=1.1\"],\n \"pytest\": [\"pytest>=4.6\"],\n \"dpcontracts\": [\"dpcontracts>=0.4\"],\n \"redis\": [\"redis>=3.0.0\"],\n # zoneinfo is an odd one: every dependency is conditional, because they're\n # only necessary on old versions of Python or Windows systems.\n \"zoneinfo\": [\n \"tzdata>=2023.3 ; sys_platform == 'win32'\",\n \"backports.zoneinfo>=0.2.1 ; python_version<'3.9'\",\n ],\n # We only support Django versions with upstream support - see\n # https://www.djangoproject.com/download/#supported-versions\n # We also leave the choice of timezone library to the user, since it\n # might be zoneinfo or pytz depending on version and configuration.\n \"django\": [\"django>=3.2\"],\n}\n\nextras[\"all\"] = sorted(set(sum(extras.values(), [])))\n\n\nsetuptools.setup(\n name=\"hypothesis\",\n version=__version__,\n author=\"David R. 
MacIver and Zac Hatfield-Dodds\",\n author_email=\"[email protected]\",\n packages=setuptools.find_packages(SOURCE),\n package_dir={\"\": SOURCE},\n package_data={\"hypothesis\": [\"py.typed\", \"vendor/tlds-alpha-by-domain.txt\"]},\n url=\"https://hypothesis.works\",\n project_urls={\n \"Source\": \"https://github.com/HypothesisWorks/hypothesis/tree/master/hypothesis-python\",\n \"Changelog\": \"https://hypothesis.readthedocs.io/en/latest/changes.html\",\n \"Documentation\": \"https://hypothesis.readthedocs.io\",\n \"Issues\": \"https://github.com/HypothesisWorks/hypothesis/issues\",\n },\n license=\"MPL-2.0\",\n description=\"A library for property-based testing\",\n zip_safe=False,\n extras_require=extras,\n install_requires=[\n \"attrs>=22.2.0\",\n \"exceptiongroup>=1.0.0 ; python_version<'3.11'\",\n \"sortedcontainers>=2.1.0,<3.0.0\",\n ],\n python_requires=\">=3.8\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Framework :: Hypothesis\",\n \"Framework :: Pytest\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n \"Operating System :: Unix\",\n \"Operating System :: POSIX\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n \"Programming Language :: Python :: 3.12\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: Education :: Testing\",\n \"Topic :: Software Development :: Testing\",\n \"Typing :: Typed\",\n ],\n py_modules=[\"_hypothesis_pytestplugin\", \"_hypothesis_ftz_detector\"],\n entry_points={\n \"pytest11\": [\"hypothesispytest = _hypothesis_pytestplugin\"],\n \"console_scripts\": [\"hypothesis = hypothesis.extra.cli:main\"],\n },\n long_description=local_file(\"README.rst\").read_text(encoding=\"utf-8\"),\n long_description_content_type=\"text/x-rst\",\n keywords=\"python testing fuzzing property-based-testing\",\n)\n", "path": "hypothesis-python/setup.py"}, {"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Copyright the Hypothesis Authors.\n# Individual contributors are listed in AUTHORS.rst and the git log.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n\nfrom functools import reduce\nfrom itertools import chain\n\nimport attr\n\nfrom hypothesis import strategies as st\nfrom hypothesis.errors import ResolutionFailed\nfrom hypothesis.internal.compat import get_type_hints\nfrom hypothesis.strategies._internal.core import BuildsStrategy\nfrom hypothesis.strategies._internal.types import is_a_type, type_sorting_key\nfrom hypothesis.utils.conventions import infer\n\n\ndef get_attribute_by_alias(fields, alias, *, target=None):\n \"\"\"\n Get an attrs attribute by its alias, rather than its name (compare\n getattr(fields, name)).\n\n ``target`` is used only to provide a nicer error message, and can be safely\n omitted.\n \"\"\"\n # attrs supports defining an alias for a field, which is the name used when\n # defining __init__. 
The init args are what we pull from when determining\n # what parameters we need to supply to the class, so it's what we need to\n # match against as well, rather than the class-level attribute name.\n matched_fields = [f for f in fields if f.alias == alias]\n if not matched_fields:\n raise TypeError(\n f\"Unexpected keyword argument {alias} for attrs class\"\n f\"{f' {target}' if target else ''}. Expected one of \"\n f\"{[f.name for f in fields]}\"\n )\n # alias is used as an arg in __init__, so it is guaranteed to be unique, if\n # it exists.\n assert len(matched_fields) == 1\n return matched_fields[0]\n\n\ndef from_attrs(target, args, kwargs, to_infer):\n \"\"\"An internal version of builds(), specialised for Attrs classes.\"\"\"\n fields = attr.fields(target)\n kwargs = {k: v for k, v in kwargs.items() if v is not infer}\n for name in to_infer:\n attrib = get_attribute_by_alias(fields, name, target=target)\n kwargs[name] = from_attrs_attribute(attrib, target)\n # We might make this strategy more efficient if we added a layer here that\n # retries drawing if validation fails, for improved composition.\n # The treatment of timezones in datetimes() provides a precedent.\n return BuildsStrategy(target, args, kwargs)\n\n\ndef from_attrs_attribute(attrib, target):\n \"\"\"Infer a strategy from the metadata on an attr.Attribute object.\"\"\"\n # Try inferring from the default argument. Note that this will only help if\n # the user passed `...` to builds() for this attribute, but in that case\n # we use it as the minimal example.\n default = st.nothing()\n if isinstance(attrib.default, attr.Factory):\n if not attrib.default.takes_self:\n default = st.builds(attrib.default.factory)\n elif attrib.default is not attr.NOTHING:\n default = st.just(attrib.default)\n\n # Try inferring None, exact values, or type from attrs provided validators.\n null = st.nothing() # updated to none() on seeing an OptionalValidator\n in_collections = [] # list of in_ validator collections to sample from\n validator_types = set() # type constraints to pass to types_to_strategy()\n if attrib.validator is not None:\n validator = attrib.validator\n if isinstance(validator, attr.validators._OptionalValidator):\n null = st.none()\n validator = validator.validator\n if isinstance(validator, attr.validators._AndValidator):\n vs = validator._validators\n else:\n vs = [validator]\n for v in vs:\n if isinstance(v, attr.validators._InValidator):\n if isinstance(v.options, str):\n in_collections.append(list(all_substrings(v.options)))\n else:\n in_collections.append(v.options)\n elif isinstance(v, attr.validators._InstanceOfValidator):\n validator_types.add(v.type)\n\n # This is the important line. We compose the final strategy from various\n # parts. The default value, if any, is the minimal shrink, followed by\n # None (again, if allowed). We then prefer to sample from values passed\n # to an in_ validator if available, but infer from a type otherwise.\n # Pick one because (sampled_from((1, 2)) | from_type(int)) would usually\n # fail validation by generating e.g. 
zero!\n if in_collections:\n sample = st.sampled_from(list(ordered_intersection(in_collections)))\n strat = default | null | sample\n else:\n strat = default | null | types_to_strategy(attrib, validator_types)\n\n # Better to give a meaningful error here than an opaque \"could not draw\"\n # when we try to get a value but have lost track of where this was created.\n if strat.is_empty:\n raise ResolutionFailed(\n \"Cannot infer a strategy from the default, validator, type, or \"\n f\"converter for attribute={attrib!r} of class={target!r}\"\n )\n return strat\n\n\ndef types_to_strategy(attrib, types):\n \"\"\"Find all the type metadata for this attribute, reconcile it, and infer a\n strategy from the mess.\"\"\"\n # If we know types from the validator(s), that's sufficient.\n if len(types) == 1:\n (typ,) = types\n if isinstance(typ, tuple):\n return st.one_of(*map(st.from_type, typ))\n return st.from_type(typ)\n elif types:\n # We have a list of tuples of types, and want to find a type\n # (or tuple of types) that is a subclass of all of of them.\n type_tuples = [k if isinstance(k, tuple) else (k,) for k in types]\n # Flatten the list, filter types that would fail validation, and\n # sort so that ordering is stable between runs and shrinks well.\n allowed = [\n t\n for t in set(sum(type_tuples, ()))\n if all(issubclass(t, tup) for tup in type_tuples)\n ]\n allowed.sort(key=type_sorting_key)\n return st.one_of([st.from_type(t) for t in allowed])\n\n # Otherwise, try the `type` attribute as a fallback, and finally try\n # the type hints on a converter (desperate!) before giving up.\n if is_a_type(getattr(attrib, \"type\", None)):\n # The convoluted test is because variable annotations may be stored\n # in string form; attrs doesn't evaluate them and we don't handle them.\n # See PEP 526, PEP 563, and Hypothesis issue #1004 for details.\n return st.from_type(attrib.type)\n\n converter = getattr(attrib, \"converter\", None)\n if isinstance(converter, type):\n return st.from_type(converter)\n elif callable(converter):\n hints = get_type_hints(converter)\n if \"return\" in hints:\n return st.from_type(hints[\"return\"])\n\n return st.nothing()\n\n\ndef ordered_intersection(in_):\n \"\"\"Set union of n sequences, ordered for reproducibility across runs.\"\"\"\n intersection = reduce(set.intersection, in_, set(in_[0]))\n for x in chain.from_iterable(in_):\n if x in intersection:\n yield x\n intersection.remove(x)\n\n\ndef all_substrings(s):\n \"\"\"Generate all substrings of `s`, in order of length then occurrence.\n Includes the empty string (first), and any duplicates that are present.\n\n >>> list(all_substrings('010'))\n ['', '0', '1', '0', '01', '10', '010']\n \"\"\"\n yield s[:0]\n for n, _ in enumerate(s):\n for i in range(len(s) - n):\n yield s[i : i + n + 1]\n", "path": "hypothesis-python/src/hypothesis/strategies/_internal/attrs.py"}]} | 4,032 | 647 |
gh_patches_debug_11573 | rasdani/github-patches | git_diff | getnikola__nikola-1133 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
utils.get_root_dir() should not be run for import_* and init
Here’s a fun thing you can try at home:
``` console
$ cd /tmp
$ nikola init site_1
[2014-03-14T15:39:05Z] INFO: init: Created empty site at site_1.
$ cd site_1
$ mkdir subdir
$ cd subdir
$ nikola init site_2
[2014-03-14T15:39:41Z] INFO: init: Created empty site at site_2.
$ ls -l .
total 0
$ ls -l ..
total 32
-rw-r--r-- 1 kwpolska kwpolska 27413 Mar 14 16:39 conf.py
-rw-r--r-- 1 kwpolska kwpolska 1923 Mar 14 16:39 conf.pyc
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 files/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 galleries/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 listings/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 posts/
drwxr-xr-x 7 kwpolska kwpolska 160 Mar 14 16:39 site_2/ ← wat
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 stories/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 subdir/
$
```
The same behavior applies to `import_*`:
``` console
$ pwd
/tmp/site_1/subdir
$ nikola import_wordpress ~/git/nikola/tests/wordpress_unicode_export.xml
[2014-03-14T15:41:24Z] INFO: init: Created empty site at new_site.
[2014-03-14T15:41:24Z] INFO: Nikola: Configuration will be written to: new_site/conf.py
$ ls -l .
total 0
$ ls -l ..
total 32
-rw-r--r-- 1 kwpolska kwpolska 27413 Mar 14 16:39 conf.py
-rw-r--r-- 1 kwpolska kwpolska 1923 Mar 14 16:39 conf.pyc
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 files/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 galleries/
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 listings/
drwxr-xr-x 7 kwpolska kwpolska 180 Mar 14 16:41 new_site/ ← wat
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 posts/
drwxr-xr-x 7 kwpolska kwpolska 160 Mar 14 16:39 site_2/ ← wat
drwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 stories/
drwxr-xr-x 3 kwpolska kwpolska 40 Mar 14 16:39 subdir/
$ mkdir nesting_is_fun
$ cd nesting_is_fun
$ nikola import_wordpress ~/git/nikola/tests/wordpress_unicode_export.xml
[2014-03-14T15:42:05Z] NOTICE: Nikola: The folder new_site already exists - assuming that this is a already existing Nikola site.
[2014-03-14T15:42:05Z] INFO: Nikola: Configuration will be written to: new_site/conf.py.import_wordpress-20140314_164205
$
```
This can be dangerous, especially because a file named `conf.py` anywhere higher in the directory tree will hijack these commands and ruin the user's attempts at setting up a Nikola site.
How could we fix it? We could just not run `utils.get_root_dir()` if the user wants to execute one of those commands (it’s executed in `__main__.py`).
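
A minimal sketch of that idea, assuming the subcommand name is the first CLI argument passed to `main()`; the helper name and the exact command list are illustrative only:

```python
import os

from nikola.utils import get_root_dir

# Commands that create a brand-new site and therefore must not inherit a
# root from a conf.py found higher up the directory tree (illustrative list).
SITE_CREATING_COMMANDS = {'init', 'import_wordpress', 'import_feed', 'import_blogger'}


def maybe_chdir_to_site_root(args):
    """chdir into an existing site root only for commands that need one."""
    command = args[0] if args else None
    if command in SITE_CREATING_COMMANDS:
        return
    root = get_root_dir()
    if root:
        os.chdir(root)
```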
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nikola/__main__.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright © 2012-2014 Roberto Alsina and others.
4
5 # Permission is hereby granted, free of charge, to any
6 # person obtaining a copy of this software and associated
7 # documentation files (the "Software"), to deal in the
8 # Software without restriction, including without limitation
9 # the rights to use, copy, modify, merge, publish,
10 # distribute, sublicense, and/or sell copies of the
11 # Software, and to permit persons to whom the Software is
12 # furnished to do so, subject to the following conditions:
13 #
14 # The above copyright notice and this permission notice
15 # shall be included in all copies or substantial portions of
16 # the Software.
17 #
18 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
19 # KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
20 # WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
21 # PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
22 # OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
23 # OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
24 # OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
25 # SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
26
27 from __future__ import print_function, unicode_literals
28 from operator import attrgetter
29 import os
30 import shutil
31 import sys
32 import traceback
33
34 from doit.loader import generate_tasks
35 from doit.cmd_base import TaskLoader
36 from doit.reporter import ExecutedOnlyReporter
37 from doit.doit_cmd import DoitMain
38 from doit.cmd_help import Help as DoitHelp
39 from doit.cmd_run import Run as DoitRun
40 from doit.cmd_clean import Clean as DoitClean
41 from doit.cmd_auto import Auto as DoitAuto
42 from logbook import NullHandler
43
44 from . import __version__
45 from .nikola import Nikola
46 from .utils import _reload, sys_decode, get_root_dir, LOGGER, STRICT_HANDLER
47
48
49 config = {}
50
51
52 def main(args):
53 quiet = False
54 if len(args) > 0 and args[0] == 'build' and '--strict' in args:
55 LOGGER.notice('Running in strict mode')
56 STRICT_HANDLER.push_application()
57 if len(args) > 0 and args[0] == 'build' and '-q' in args or '--quiet' in args:
58 nullhandler = NullHandler()
59 nullhandler.push_application()
60 quiet = True
61 global config
62
63 colorful = False
64 if sys.stderr.isatty():
65 colorful = True
66 try:
67 import colorama
68 colorama.init()
69 except ImportError:
70 if os.name == 'nt':
71 colorful = False
72
73 root = get_root_dir()
74 if root:
75 os.chdir(root)
76
77 sys.path.append('')
78 try:
79 import conf
80 _reload(conf)
81 config = conf.__dict__
82 except Exception:
83 if os.path.exists('conf.py'):
84 msg = traceback.format_exc(0).splitlines()[1]
85 LOGGER.error('In conf.py line {0}: {1}'.format(sys.exc_info()[2].tb_lineno, msg))
86 sys.exit(1)
87 config = {}
88
89 config.update({'__colorful__': colorful})
90
91 site = Nikola(**config)
92 return DoitNikola(site, quiet).run(args)
93
94
95 class Help(DoitHelp):
96 """show Nikola usage instead of doit """
97
98 @staticmethod
99 def print_usage(cmds):
100 """print nikola "usage" (basic help) instructions"""
101 print("Nikola is a tool to create static websites and blogs. For full documentation and more information, please visit http://getnikola.com/\n\n")
102 print("Available commands:")
103 for cmd in sorted(cmds.values(), key=attrgetter('name')):
104 print(" nikola %-*s %s" % (20, cmd.name, cmd.doc_purpose))
105 print("")
106 print(" nikola help show help / reference")
107 print(" nikola help <command> show command usage")
108 print(" nikola help <task-name> show task usage")
109
110
111 class Build(DoitRun):
112 """expose "run" command as "build" for backward compatibility"""
113 def __init__(self, *args, **kw):
114 opts = list(self.cmd_options)
115 opts.append(
116 {
117 'name': 'strict',
118 'long': 'strict',
119 'default': False,
120 'type': bool,
121 'help': "Fail on things that would normally be warnings.",
122 }
123 )
124 opts.append(
125 {
126 'name': 'quiet',
127 'long': 'quiet',
128 'short': 'q',
129 'default': False,
130 'type': bool,
131 'help': "Run quietly.",
132 }
133 )
134 self.cmd_options = tuple(opts)
135 super(Build, self).__init__(*args, **kw)
136
137
138 class Clean(DoitClean):
139 """A clean that removes cache/"""
140
141 def clean_tasks(self, tasks, dryrun):
142 if not dryrun and config:
143 cache_folder = config.get('CACHE_FOLDER', 'cache')
144 if os.path.exists(cache_folder):
145 shutil.rmtree(cache_folder)
146 return super(Clean, self).clean_tasks(tasks, dryrun)
147
148 # Nikola has its own "auto" commands that uses livereload.
149 # Expose original doit "auto" command as "doit_auto".
150 DoitAuto.name = 'doit_auto'
151
152
153 class NikolaTaskLoader(TaskLoader):
154 """custom task loader to get tasks from Nikola instead of dodo.py file"""
155 def __init__(self, nikola, quiet=False):
156 self.nikola = nikola
157 self.quiet = quiet
158
159 def load_tasks(self, cmd, opt_values, pos_args):
160 if self.quiet:
161 DOIT_CONFIG = {
162 'verbosity': 0,
163 'reporter': 'zero',
164 }
165 else:
166 DOIT_CONFIG = {
167 'reporter': ExecutedOnlyReporter,
168 }
169 DOIT_CONFIG['default_tasks'] = ['render_site', 'post_render']
170 tasks = generate_tasks(
171 'render_site',
172 self.nikola.gen_tasks('render_site', "Task", 'Group of tasks to render the site.'))
173 latetasks = generate_tasks(
174 'post_render',
175 self.nikola.gen_tasks('post_render', "LateTask", 'Group of tasks to be executes after site is rendered.'))
176 return tasks + latetasks, DOIT_CONFIG
177
178
179 class DoitNikola(DoitMain):
180 # overwite help command
181 DOIT_CMDS = list(DoitMain.DOIT_CMDS) + [Help, Build, Clean, DoitAuto]
182 TASK_LOADER = NikolaTaskLoader
183
184 def __init__(self, nikola, quiet=False):
185 self.nikola = nikola
186 self.task_loader = self.TASK_LOADER(nikola, quiet)
187
188 def get_commands(self):
189 # core doit commands
190 cmds = DoitMain.get_commands(self)
191 # load nikola commands
192 for name, cmd in self.nikola.commands.items():
193 cmds[name] = cmd
194 return cmds
195
196 def run(self, cmd_args):
197 sub_cmds = self.get_commands()
198 args = self.process_args(cmd_args)
199 args = [sys_decode(arg) for arg in args]
200
201 if len(args) == 0 or any(arg in ["--help", '-h'] for arg in args):
202 cmd_args = ['help']
203 args = ['help']
204 # Hide run because Nikola uses build
205 sub_cmds.pop('run')
206 if len(args) == 0 or any(arg in ["--version", '-V'] for arg in args):
207 cmd_args = ['version']
208 args = ['version']
209 if len(args) == 0 or args[0] not in sub_cmds.keys() or \
210 args[0] == 'build':
211 # Check for conf.py before launching run
212 if not self.nikola.configured:
213 LOGGER.error("This command needs to run inside an "
214 "existing Nikola site.")
215 return False
216 return super(DoitNikola, self).run(cmd_args)
217
218 @staticmethod
219 def print_version():
220 print("Nikola v" + __version__)
221
222 if __name__ == "__main__":
223 sys.exit(main(sys.argv[1:]))
224
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nikola/__main__.py b/nikola/__main__.py
--- a/nikola/__main__.py
+++ b/nikola/__main__.py
@@ -70,9 +70,16 @@
if os.name == 'nt':
colorful = False
- root = get_root_dir()
- if root:
- os.chdir(root)
+ # Those commands do not require a `conf.py`. (Issue #1132)
+ # Moreover, actually having one somewhere in the tree can be bad, putting
+ # the output of that command (the new site) in an unknown directory that is
+ # not the current working directory. (does not apply to `version`)
+ argname = args[0] if len(args) > 0 else None
+ if argname not in ['init', 'import_wordpress', 'import_feed',
+ 'import_blogger', 'version']:
+ root = get_root_dir()
+ if root:
+ os.chdir(root)
sys.path.append('')
try:
| {"golden_diff": "diff --git a/nikola/__main__.py b/nikola/__main__.py\n--- a/nikola/__main__.py\n+++ b/nikola/__main__.py\n@@ -70,9 +70,16 @@\n if os.name == 'nt':\n colorful = False\n \n- root = get_root_dir()\n- if root:\n- os.chdir(root)\n+ # Those commands do not require a `conf.py`. (Issue #1132)\n+ # Moreover, actually having one somewhere in the tree can be bad, putting\n+ # the output of that command (the new site) in an unknown directory that is\n+ # not the current working directory. (does not apply to `version`)\n+ argname = args[0] if len(args) > 0 else None\n+ if argname not in ['init', 'import_wordpress', 'import_feed',\n+ 'import_blogger', 'version']:\n+ root = get_root_dir()\n+ if root:\n+ os.chdir(root)\n \n sys.path.append('')\n try:\n", "issue": "utils.get_root_dir() should not be run for import_* and init\nHere\u2019s a fun thing you can try at home:\n\n``` console\n$ cd /tmp\n$ nikola init site_1\n[2014-03-14T15:39:05Z] INFO: init: Created empty site at site_1.\n$ cd site_1\n$ mkdir subdir\n$ cd subdir\n$ nikola init site_2\n[2014-03-14T15:39:41Z] INFO: init: Created empty site at site_2.\n$ ls -l .\ntotal 0\n$ ls -l ..\ntotal 32\n-rw-r--r-- 1 kwpolska kwpolska 27413 Mar 14 16:39 conf.py\n-rw-r--r-- 1 kwpolska kwpolska 1923 Mar 14 16:39 conf.pyc\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 files/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 galleries/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 listings/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 posts/\ndrwxr-xr-x 7 kwpolska kwpolska 160 Mar 14 16:39 site_2/ \u2190 wat\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 stories/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 subdir/\n$\n```\n\nThe same behavior applies to `import_*`:\n\n``` console\n$ pwd\n/tmp/site_1/subdir\n$ nikola import_wordpress ~/git/nikola/tests/wordpress_unicode_export.xml\n[2014-03-14T15:41:24Z] INFO: init: Created empty site at new_site.\n[2014-03-14T15:41:24Z] INFO: Nikola: Configuration will be written to: new_site/conf.py\n$ ls -l .\ntotal 0\n$ ls -l ..\ntotal 32\n-rw-r--r-- 1 kwpolska kwpolska 27413 Mar 14 16:39 conf.py\n-rw-r--r-- 1 kwpolska kwpolska 1923 Mar 14 16:39 conf.pyc\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 files/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 galleries/\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 listings/\ndrwxr-xr-x 7 kwpolska kwpolska 180 Mar 14 16:41 new_site/ \u2190 wat\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 posts/\ndrwxr-xr-x 7 kwpolska kwpolska 160 Mar 14 16:39 site_2/ \u2190 wat\ndrwxr-xr-x 2 kwpolska kwpolska 40 Mar 14 16:39 stories/\ndrwxr-xr-x 3 kwpolska kwpolska 40 Mar 14 16:39 subdir/\n$ mkdir nesting_is_fun\n$ cd nesting_is_fun\n$ nikola import_wordpress ~/git/nikola/tests/wordpress_unicode_export.xml\n[2014-03-14T15:42:05Z] NOTICE: Nikola: The folder new_site already exists - assuming that this is a already existing Nikola site.\n[2014-03-14T15:42:05Z] INFO: Nikola: Configuration will be written to: new_site/conf.py.import_wordpress-20140314_164205\n$\n```\n\nThis can be dangerous \u2014 especially because one can put a file named `conf.py` anywhere higher in the directory structure and destroy his Nikola attempts.\n\nHow could we fix it? 
We could just not run `utils.get_root_dir()` if the user wants to execute one of those commands (it\u2019s executed in `__main__.py`).\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2012-2014 Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\nfrom __future__ import print_function, unicode_literals\nfrom operator import attrgetter\nimport os\nimport shutil\nimport sys\nimport traceback\n\nfrom doit.loader import generate_tasks\nfrom doit.cmd_base import TaskLoader\nfrom doit.reporter import ExecutedOnlyReporter\nfrom doit.doit_cmd import DoitMain\nfrom doit.cmd_help import Help as DoitHelp\nfrom doit.cmd_run import Run as DoitRun\nfrom doit.cmd_clean import Clean as DoitClean\nfrom doit.cmd_auto import Auto as DoitAuto\nfrom logbook import NullHandler\n\nfrom . import __version__\nfrom .nikola import Nikola\nfrom .utils import _reload, sys_decode, get_root_dir, LOGGER, STRICT_HANDLER\n\n\nconfig = {}\n\n\ndef main(args):\n quiet = False\n if len(args) > 0 and args[0] == 'build' and '--strict' in args:\n LOGGER.notice('Running in strict mode')\n STRICT_HANDLER.push_application()\n if len(args) > 0 and args[0] == 'build' and '-q' in args or '--quiet' in args:\n nullhandler = NullHandler()\n nullhandler.push_application()\n quiet = True\n global config\n\n colorful = False\n if sys.stderr.isatty():\n colorful = True\n try:\n import colorama\n colorama.init()\n except ImportError:\n if os.name == 'nt':\n colorful = False\n\n root = get_root_dir()\n if root:\n os.chdir(root)\n\n sys.path.append('')\n try:\n import conf\n _reload(conf)\n config = conf.__dict__\n except Exception:\n if os.path.exists('conf.py'):\n msg = traceback.format_exc(0).splitlines()[1]\n LOGGER.error('In conf.py line {0}: {1}'.format(sys.exc_info()[2].tb_lineno, msg))\n sys.exit(1)\n config = {}\n\n config.update({'__colorful__': colorful})\n\n site = Nikola(**config)\n return DoitNikola(site, quiet).run(args)\n\n\nclass Help(DoitHelp):\n \"\"\"show Nikola usage instead of doit \"\"\"\n\n @staticmethod\n def print_usage(cmds):\n \"\"\"print nikola \"usage\" (basic help) instructions\"\"\"\n print(\"Nikola is a tool to create static websites and blogs. 
For full documentation and more information, please visit http://getnikola.com/\\n\\n\")\n print(\"Available commands:\")\n for cmd in sorted(cmds.values(), key=attrgetter('name')):\n print(\" nikola %-*s %s\" % (20, cmd.name, cmd.doc_purpose))\n print(\"\")\n print(\" nikola help show help / reference\")\n print(\" nikola help <command> show command usage\")\n print(\" nikola help <task-name> show task usage\")\n\n\nclass Build(DoitRun):\n \"\"\"expose \"run\" command as \"build\" for backward compatibility\"\"\"\n def __init__(self, *args, **kw):\n opts = list(self.cmd_options)\n opts.append(\n {\n 'name': 'strict',\n 'long': 'strict',\n 'default': False,\n 'type': bool,\n 'help': \"Fail on things that would normally be warnings.\",\n }\n )\n opts.append(\n {\n 'name': 'quiet',\n 'long': 'quiet',\n 'short': 'q',\n 'default': False,\n 'type': bool,\n 'help': \"Run quietly.\",\n }\n )\n self.cmd_options = tuple(opts)\n super(Build, self).__init__(*args, **kw)\n\n\nclass Clean(DoitClean):\n \"\"\"A clean that removes cache/\"\"\"\n\n def clean_tasks(self, tasks, dryrun):\n if not dryrun and config:\n cache_folder = config.get('CACHE_FOLDER', 'cache')\n if os.path.exists(cache_folder):\n shutil.rmtree(cache_folder)\n return super(Clean, self).clean_tasks(tasks, dryrun)\n\n# Nikola has its own \"auto\" commands that uses livereload.\n# Expose original doit \"auto\" command as \"doit_auto\".\nDoitAuto.name = 'doit_auto'\n\n\nclass NikolaTaskLoader(TaskLoader):\n \"\"\"custom task loader to get tasks from Nikola instead of dodo.py file\"\"\"\n def __init__(self, nikola, quiet=False):\n self.nikola = nikola\n self.quiet = quiet\n\n def load_tasks(self, cmd, opt_values, pos_args):\n if self.quiet:\n DOIT_CONFIG = {\n 'verbosity': 0,\n 'reporter': 'zero',\n }\n else:\n DOIT_CONFIG = {\n 'reporter': ExecutedOnlyReporter,\n }\n DOIT_CONFIG['default_tasks'] = ['render_site', 'post_render']\n tasks = generate_tasks(\n 'render_site',\n self.nikola.gen_tasks('render_site', \"Task\", 'Group of tasks to render the site.'))\n latetasks = generate_tasks(\n 'post_render',\n self.nikola.gen_tasks('post_render', \"LateTask\", 'Group of tasks to be executes after site is rendered.'))\n return tasks + latetasks, DOIT_CONFIG\n\n\nclass DoitNikola(DoitMain):\n # overwite help command\n DOIT_CMDS = list(DoitMain.DOIT_CMDS) + [Help, Build, Clean, DoitAuto]\n TASK_LOADER = NikolaTaskLoader\n\n def __init__(self, nikola, quiet=False):\n self.nikola = nikola\n self.task_loader = self.TASK_LOADER(nikola, quiet)\n\n def get_commands(self):\n # core doit commands\n cmds = DoitMain.get_commands(self)\n # load nikola commands\n for name, cmd in self.nikola.commands.items():\n cmds[name] = cmd\n return cmds\n\n def run(self, cmd_args):\n sub_cmds = self.get_commands()\n args = self.process_args(cmd_args)\n args = [sys_decode(arg) for arg in args]\n\n if len(args) == 0 or any(arg in [\"--help\", '-h'] for arg in args):\n cmd_args = ['help']\n args = ['help']\n # Hide run because Nikola uses build\n sub_cmds.pop('run')\n if len(args) == 0 or any(arg in [\"--version\", '-V'] for arg in args):\n cmd_args = ['version']\n args = ['version']\n if len(args) == 0 or args[0] not in sub_cmds.keys() or \\\n args[0] == 'build':\n # Check for conf.py before launching run\n if not self.nikola.configured:\n LOGGER.error(\"This command needs to run inside an \"\n \"existing Nikola site.\")\n return False\n return super(DoitNikola, self).run(cmd_args)\n\n @staticmethod\n def print_version():\n print(\"Nikola v\" + __version__)\n\nif __name__ 
== \"__main__\":\n sys.exit(main(sys.argv[1:]))\n", "path": "nikola/__main__.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2012-2014 Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\nfrom __future__ import print_function, unicode_literals\nfrom operator import attrgetter\nimport os\nimport shutil\nimport sys\nimport traceback\n\nfrom doit.loader import generate_tasks\nfrom doit.cmd_base import TaskLoader\nfrom doit.reporter import ExecutedOnlyReporter\nfrom doit.doit_cmd import DoitMain\nfrom doit.cmd_help import Help as DoitHelp\nfrom doit.cmd_run import Run as DoitRun\nfrom doit.cmd_clean import Clean as DoitClean\nfrom doit.cmd_auto import Auto as DoitAuto\nfrom logbook import NullHandler\n\nfrom . import __version__\nfrom .nikola import Nikola\nfrom .utils import _reload, sys_decode, get_root_dir, LOGGER, STRICT_HANDLER\n\n\nconfig = {}\n\n\ndef main(args):\n quiet = False\n if len(args) > 0 and args[0] == 'build' and '--strict' in args:\n LOGGER.notice('Running in strict mode')\n STRICT_HANDLER.push_application()\n if len(args) > 0 and args[0] == 'build' and '-q' in args or '--quiet' in args:\n nullhandler = NullHandler()\n nullhandler.push_application()\n quiet = True\n global config\n\n colorful = False\n if sys.stderr.isatty():\n colorful = True\n try:\n import colorama\n colorama.init()\n except ImportError:\n if os.name == 'nt':\n colorful = False\n\n # Those commands do not require a `conf.py`. (Issue #1132)\n # Moreover, actually having one somewhere in the tree can be bad, putting\n # the output of that command (the new site) in an unknown directory that is\n # not the current working directory. 
(does not apply to `version`)\n argname = args[0] if len(args) > 0 else None\n if argname not in ['init', 'import_wordpress', 'import_feed',\n 'import_blogger', 'version']:\n root = get_root_dir()\n if root:\n os.chdir(root)\n\n sys.path.append('')\n try:\n import conf\n _reload(conf)\n config = conf.__dict__\n except Exception:\n if os.path.exists('conf.py'):\n msg = traceback.format_exc(0).splitlines()[1]\n LOGGER.error('In conf.py line {0}: {1}'.format(sys.exc_info()[2].tb_lineno, msg))\n sys.exit(1)\n config = {}\n\n config.update({'__colorful__': colorful})\n\n site = Nikola(**config)\n return DoitNikola(site, quiet).run(args)\n\n\nclass Help(DoitHelp):\n \"\"\"show Nikola usage instead of doit \"\"\"\n\n @staticmethod\n def print_usage(cmds):\n \"\"\"print nikola \"usage\" (basic help) instructions\"\"\"\n print(\"Nikola is a tool to create static websites and blogs. For full documentation and more information, please visit http://getnikola.com/\\n\\n\")\n print(\"Available commands:\")\n for cmd in sorted(cmds.values(), key=attrgetter('name')):\n print(\" nikola %-*s %s\" % (20, cmd.name, cmd.doc_purpose))\n print(\"\")\n print(\" nikola help show help / reference\")\n print(\" nikola help <command> show command usage\")\n print(\" nikola help <task-name> show task usage\")\n\n\nclass Build(DoitRun):\n \"\"\"expose \"run\" command as \"build\" for backward compatibility\"\"\"\n def __init__(self, *args, **kw):\n opts = list(self.cmd_options)\n opts.append(\n {\n 'name': 'strict',\n 'long': 'strict',\n 'default': False,\n 'type': bool,\n 'help': \"Fail on things that would normally be warnings.\",\n }\n )\n opts.append(\n {\n 'name': 'quiet',\n 'long': 'quiet',\n 'short': 'q',\n 'default': False,\n 'type': bool,\n 'help': \"Run quietly.\",\n }\n )\n self.cmd_options = tuple(opts)\n super(Build, self).__init__(*args, **kw)\n\n\nclass Clean(DoitClean):\n \"\"\"A clean that removes cache/\"\"\"\n\n def clean_tasks(self, tasks, dryrun):\n if not dryrun and config:\n cache_folder = config.get('CACHE_FOLDER', 'cache')\n if os.path.exists(cache_folder):\n shutil.rmtree(cache_folder)\n return super(Clean, self).clean_tasks(tasks, dryrun)\n\n# Nikola has its own \"auto\" commands that uses livereload.\n# Expose original doit \"auto\" command as \"doit_auto\".\nDoitAuto.name = 'doit_auto'\n\n\nclass NikolaTaskLoader(TaskLoader):\n \"\"\"custom task loader to get tasks from Nikola instead of dodo.py file\"\"\"\n def __init__(self, nikola, quiet=False):\n self.nikola = nikola\n self.quiet = quiet\n\n def load_tasks(self, cmd, opt_values, pos_args):\n if self.quiet:\n DOIT_CONFIG = {\n 'verbosity': 0,\n 'reporter': 'zero',\n }\n else:\n DOIT_CONFIG = {\n 'reporter': ExecutedOnlyReporter,\n }\n DOIT_CONFIG['default_tasks'] = ['render_site', 'post_render']\n tasks = generate_tasks(\n 'render_site',\n self.nikola.gen_tasks('render_site', \"Task\", 'Group of tasks to render the site.'))\n latetasks = generate_tasks(\n 'post_render',\n self.nikola.gen_tasks('post_render', \"LateTask\", 'Group of tasks to be executes after site is rendered.'))\n return tasks + latetasks, DOIT_CONFIG\n\n\nclass DoitNikola(DoitMain):\n # overwite help command\n DOIT_CMDS = list(DoitMain.DOIT_CMDS) + [Help, Build, Clean, DoitAuto]\n TASK_LOADER = NikolaTaskLoader\n\n def __init__(self, nikola, quiet=False):\n self.nikola = nikola\n self.task_loader = self.TASK_LOADER(nikola, quiet)\n\n def get_commands(self):\n # core doit commands\n cmds = DoitMain.get_commands(self)\n # load nikola commands\n for name, cmd in 
self.nikola.commands.items():\n cmds[name] = cmd\n return cmds\n\n def run(self, cmd_args):\n sub_cmds = self.get_commands()\n args = self.process_args(cmd_args)\n args = [sys_decode(arg) for arg in args]\n\n if len(args) == 0 or any(arg in [\"--help\", '-h'] for arg in args):\n cmd_args = ['help']\n args = ['help']\n # Hide run because Nikola uses build\n sub_cmds.pop('run')\n if len(args) == 0 or any(arg in [\"--version\", '-V'] for arg in args):\n cmd_args = ['version']\n args = ['version']\n if len(args) == 0 or args[0] not in sub_cmds.keys() or \\\n args[0] == 'build':\n # Check for conf.py before launching run\n if not self.nikola.configured:\n LOGGER.error(\"This command needs to run inside an \"\n \"existing Nikola site.\")\n return False\n return super(DoitNikola, self).run(cmd_args)\n\n @staticmethod\n def print_version():\n print(\"Nikola v\" + __version__)\n\nif __name__ == \"__main__\":\n sys.exit(main(sys.argv[1:]))\n", "path": "nikola/__main__.py"}]} | 3,748 | 239 |
gh_patches_debug_35245 | rasdani/github-patches | git_diff | litestar-org__litestar-336 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pydantic 1.9.1 deepcopy breaks Redis caching, possibly other things
I'm open to the possibility that this should be treated as an issue in Pydantic because it's a breaking change there, and https://github.com/samuelcolvin/pydantic/issues/4184 appears to be a variation of this.
I have a Starlite application which configures its caching as follows (excerpt from my actual `app.py`):
```python
import redis
import starlite
from .config import settings
cache_config = (
starlite.CacheConfig(backend=redis.from_url(settings.CACHE_URL))
if settings.CACHE_URL is not None
else starlite.app.DEFAULT_CACHE_CONFIG
)
```
`config.settings` in this case is a `pydantic.BaseSettings` instance which reads the Redis URL from an environment variable, if set. The fallback logic is for local development where a Redis instance may not be running.
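
For context, a hypothetical shape of that settings object; the field name follows the excerpt above, everything else is made up:

```python
from typing import Optional

from pydantic import BaseSettings


class Settings(BaseSettings):
    # Read from the CACHE_URL environment variable when present,
    # e.g. "redis://localhost:6379/0"; None means no local Redis.
    CACHE_URL: Optional[str] = None


settings = Settings()
```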
I recently did an upgrade of my dependency tree which moved from Pydantic 1.9.0 to Pydantic 1.9.1, and my web worker processes now fail to boot. Reverting to Pydantic 1.9.0 resolves this.
The failure occurs at instantiating the `Starlite` application object:
```python
File /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:40, in pydantic.decorator.validate_arguments.validate.wrapper_function()
File /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:133, in pydantic.decorator.ValidatedFunction.call()
File /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:130, in pydantic.decorator.ValidatedFunction.init_model_instance()
File /opt/venv/lib/python3.10/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()
ValidationError: 1 validation error for Init
cache_config
cannot pickle '_thread.lock' object (type=type_error)
```
The root of the issue appears to be that in Pydantic 1.9.1, Pydantic defaults to performing a `copy.deepcopy()` of model members during validation, which in turn fails on encountering any non-pickle-able object (as in the above traceback where it attempts to `deepcopy()` a Redis client instance). Pydantic 1.9.0 performed only a shallow copy.
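
That error message comes from `copy.deepcopy` falling back to the pickle machinery for objects it cannot otherwise copy; anything holding a lock or socket, such as a Redis client's connection pool, fails the same way. A standalone illustration:

```python
import copy
import threading

try:
    copy.deepcopy(threading.Lock())
except TypeError as exc:
    print(exc)  # cannot pickle '_thread.lock' object
```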
This behavior can be disabled by setting the `copy_on_model_validation` option to a false-y value in the config for a Pydantic model class, or by passing it in the `config` dictionary of the `validate_arguments` decorator, and that may be the simplest workaround to apply for now while Pydantic decides what to do about this. It may also be the correct long-term choice since it's likely that non-pickle-able objects, such as caching clients, will be passed in arguments to the `Starlite` constructor from time to time.
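
A minimal sketch of that workaround under Pydantic 1.9.x (where `copy_on_model_validation` is still a boolean); `CacheSettings` and `create_app` are made-up names rather than Starlite's real classes:

```python
from typing import Any

from pydantic import BaseModel, validate_arguments


class CacheSettings(BaseModel):
    # Stand-in for a model that carries a non-pickle-able object,
    # e.g. a redis.Redis client instance.
    backend: Any = None

    class Config:
        # A false-y value disables the copy Pydantic 1.9.1 performs when an
        # instance of this model is validated as a field or argument.
        copy_on_model_validation = False


# The decorator-level variant mentioned above, applied to whatever callable
# receives the model as an argument:
@validate_arguments(config={"copy_on_model_validation": False})
def create_app(cache_config: CacheSettings) -> None:
    ...
```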
This issue appears to be present in any version of Starlite which decorates `Starlite.__init__()` with `validate_arguments`, when using Pydantic 1.9.1 (I've tested Starlite 1.7.3 and 1.3.3), and disappears on reverting to Pydantic 1.9.0.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlite/exceptions/utils.py`
Content:
```
1 from typing import Any, Dict, List, Optional, Union
2
3 from pydantic import BaseModel
4 from starlette.exceptions import HTTPException as StarletteHTTPException
5 from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR # noqa: TC002
6
7 from starlite.enums import MediaType
8 from starlite.response import Response
9
10 from .exceptions import HTTPException
11
12 __all__ = ["create_exception_response"]
13
14
15 class ExceptionResponseContent(BaseModel):
16 detail: Optional[str]
17 extra: Optional[Union[Dict[str, Any], List[Any]]]
18 status_code: int = HTTP_500_INTERNAL_SERVER_ERROR
19
20
21 def create_exception_response(exc: Exception) -> Response:
22 """
23 Constructs a response from an exception.
24
25 For instances of either `starlite.exceptions.HTTPException` or `starlette.exceptions.HTTPException` the response
26 status code is drawn from the exception, otherwise response status is `HTTP_500_INTERNAL_SERVER_ERROR`.
27
28 Args:
29 exc (Exception): Any exception.
30
31 Returns:
32 Response
33 """
34 if isinstance(exc, (HTTPException, StarletteHTTPException)):
35 content = ExceptionResponseContent(detail=exc.detail, status_code=exc.status_code)
36 if isinstance(exc, HTTPException):
37 content.extra = exc.extra
38 else:
39 content = ExceptionResponseContent(detail=repr(exc))
40 return Response(
41 media_type=MediaType.JSON,
42 content=content.dict(exclude_none=True),
43 status_code=content.status_code,
44 )
45
```
Path: `starlite/config.py`
Content:
```
1 from enum import Enum
2 from typing import (
3 TYPE_CHECKING,
4 Any,
5 Callable,
6 Dict,
7 List,
8 Optional,
9 Tuple,
10 Type,
11 Union,
12 )
13 from urllib.parse import urlencode
14
15 from pydantic import AnyUrl, BaseModel, DirectoryPath, constr, validator
16 from pydantic_openapi_schema.utils import construct_open_api_with_schema_class
17 from pydantic_openapi_schema.v3_1_0.contact import Contact
18 from pydantic_openapi_schema.v3_1_0.external_documentation import ExternalDocumentation
19 from pydantic_openapi_schema.v3_1_0.info import Info
20 from pydantic_openapi_schema.v3_1_0.license import License
21 from pydantic_openapi_schema.v3_1_0.open_api import OpenAPI
22 from pydantic_openapi_schema.v3_1_0.path_item import PathItem
23 from pydantic_openapi_schema.v3_1_0.reference import Reference
24 from pydantic_openapi_schema.v3_1_0.security_requirement import SecurityRequirement
25 from pydantic_openapi_schema.v3_1_0.server import Server
26 from pydantic_openapi_schema.v3_1_0.tag import Tag
27
28 from starlite.cache import CacheBackendProtocol, SimpleCacheBackend
29 from starlite.openapi.controller import OpenAPIController
30 from starlite.openapi.path_item import create_path_item
31 from starlite.routes import HTTPRoute
32 from starlite.template import TemplateEngineProtocol
33 from starlite.types import CacheKeyBuilder
34
35 if TYPE_CHECKING:
36 from starlite.app import Starlite
37 from starlite.connection import Request
38
39
40 class CORSConfig(BaseModel):
41 allow_origins: List[str] = ["*"]
42 allow_methods: List[str] = ["*"]
43 allow_headers: List[str] = ["*"]
44 allow_credentials: bool = False
45 allow_origin_regex: Optional[str] = None
46 expose_headers: List[str] = []
47 max_age: int = 600
48
49
50 class CompressionBackend(str, Enum):
51 """CompressionBackend is an enum that defines the available compression backends."""
52
53 GZIP = "gzip"
54 BROTLI = "brotli"
55
56
57 class BrotliMode(str, Enum):
58 """BrotliMode is an enum that defines the available brotli compression optimization modes."""
59
60 GENERIC = "generic"
61 TEXT = "text"
62 FONT = "font"
63
64
65 class CompressionConfig(BaseModel):
66 """Class containing the configuration for request compression."""
67
68 backend: Union[CompressionBackend, str]
69 minimum_size: int = 500
70 gzip_compress_level: int = 9
71 brotli_quality: int = 5
72 brotli_mode: Union[BrotliMode, str] = BrotliMode.TEXT
73 brotli_lgwin: int = 22
74 brotli_lgblock: int = 0
75 brotli_gzip_fallback: bool = True
76
77 @validator("backend", pre=True, always=True)
78 def backend_must_be_supported( # pylint: disable=no-self-argument
79 cls, v: Union[CompressionBackend, str]
80 ) -> CompressionBackend:
81 """Compression Backend Validation
82
83 Args:
84 v (CompressionBackend|str): Holds the selected compression backend
85
86 Raises:
87 ValueError: Value is not a valid compression backend
88
89 Returns:
90 _type_: CompressionBackend
91 """
92 if isinstance(v, str):
93 try:
94 v = CompressionBackend[v.upper()]
95 except KeyError as e:
96 raise ValueError(f"{v} is not a valid compression backend") from e
97 return v
98
99 @validator("brotli_mode", pre=True, always=True)
100 def brotli_mode_must_be_valid(cls, v: Union[BrotliMode, str]) -> BrotliMode: # pylint: disable=no-self-argument
101 """Compression Backend Validation
102
103 Args:
104 v (CompressionBackend|str): Holds the selected compression backend
105
106 Raises:
107 ValueError: Value is not a valid compression backend
108
109 Returns:
110 _type_: CompressionBackend
111 """
112 if isinstance(v, str):
113 try:
114 v = BrotliMode[v.upper()]
115 except KeyError as e:
116 raise ValueError(f"{v} is not a valid compression optimization mode") from e
117 return v
118
119 def dict(self, *args, **kwargs) -> Dict[str, Any]: # type: ignore[no-untyped-def]
120 """Returns a dictionary representation of the CompressionConfig.
121
122 Returns:
123 Dict[str, Any]: dictionary representation of the selected CompressionConfig. Only columns for the selected backend are included
124 """
125 brotli_keys = {
126 "minimum_size",
127 "brotli_quality",
128 "brotli_mode",
129 "brotli_lgwin",
130 "brotli_lgblock",
131 "brotli_gzip_fallback",
132 }
133 gzip_keys = {"minimum_size", "gzip_compress_level"}
134 if self.backend == CompressionBackend.GZIP:
135 kwargs["include"] = gzip_keys
136 elif self.backend == CompressionBackend.BROTLI:
137 kwargs["include"] = brotli_keys
138 else:
139 kwargs["include"] = brotli_keys.union(gzip_keys)
140
141 return super().dict(*args, **kwargs)
142
143
144 class OpenAPIConfig(BaseModel):
145 """Class containing Settings and Schema Properties"""
146
147 create_examples: bool = False
148 openapi_controller: Type[OpenAPIController] = OpenAPIController
149
150 title: str
151 version: str
152 contact: Optional[Contact] = None
153 description: Optional[str] = None
154 external_docs: Optional[ExternalDocumentation] = None
155 license: Optional[License] = None
156 security: Optional[List[SecurityRequirement]] = None
157 servers: List[Server] = [Server(url="/")]
158 summary: Optional[str] = None
159 tags: Optional[List[Tag]] = None
160 terms_of_service: Optional[AnyUrl] = None
161 use_handler_docstrings: bool = False
162 webhooks: Optional[Dict[str, Union[PathItem, Reference]]] = None
163
164 def to_openapi_schema(self) -> OpenAPI:
165 """Generates an OpenAPI model"""
166 return OpenAPI(
167 externalDocs=self.external_docs,
168 security=self.security,
169 servers=self.servers,
170 tags=self.tags,
171 webhooks=self.webhooks,
172 info=Info(
173 title=self.title,
174 version=self.version,
175 description=self.description,
176 contact=self.contact,
177 license=self.license,
178 summary=self.summary,
179 termsOfService=self.terms_of_service,
180 ),
181 )
182
183 def create_openapi_schema_model(self, app: "Starlite") -> OpenAPI:
184 """
185 Creates `OpenAPI` instance for the given `router`.
186
187 Args:
188 app (Starlite)
189
190 Returns:
191 OpenAPI
192 """
193 schema = self.to_openapi_schema()
194 schema.paths = {}
195 for route in app.routes:
196 if (
197 isinstance(route, HTTPRoute)
198 and any(route_handler.include_in_schema for route_handler, _ in route.route_handler_map.values())
199 and (route.path_format or "/") not in schema.paths
200 ):
201 schema.paths[route.path_format or "/"] = create_path_item(
202 route=route,
203 create_examples=self.create_examples,
204 plugins=app.plugins,
205 use_handler_docstrings=self.use_handler_docstrings,
206 )
207 return construct_open_api_with_schema_class(schema)
208
209
210 class StaticFilesConfig(BaseModel):
211 path: constr(min_length=1) # type: ignore
212 directories: List[DirectoryPath]
213 html_mode: bool = False
214
215
216 class TemplateConfig(BaseModel):
217 class Config:
218 arbitrary_types_allowed = True
219
220 directory: Union[DirectoryPath, List[DirectoryPath]]
221 engine: Type[TemplateEngineProtocol]
222 engine_callback: Optional[Callable[[Any], Any]]
223
224
225 def default_cache_key_builder(request: "Request") -> str:
226 """
227 Given a request object, returns a cache key by combining the path with the sorted query params
228 """
229 qp: List[Tuple[str, Any]] = list(request.query_params.items())
230 qp.sort(key=lambda x: x[0])
231 return request.url.path + urlencode(qp, doseq=True)
232
233
234 class CacheConfig(BaseModel):
235 class Config:
236 arbitrary_types_allowed = True
237
238 backend: CacheBackendProtocol = SimpleCacheBackend()
239 expiration: int = 60 # value in seconds
240 cache_key_builder: CacheKeyBuilder = default_cache_key_builder
241
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/starlite/config.py b/starlite/config.py
--- a/starlite/config.py
+++ b/starlite/config.py
@@ -12,7 +12,7 @@
)
from urllib.parse import urlencode
-from pydantic import AnyUrl, BaseModel, DirectoryPath, constr, validator
+from pydantic import AnyUrl, BaseConfig, BaseModel, DirectoryPath, constr, validator
from pydantic_openapi_schema.utils import construct_open_api_with_schema_class
from pydantic_openapi_schema.v3_1_0.contact import Contact
from pydantic_openapi_schema.v3_1_0.external_documentation import ExternalDocumentation
@@ -144,6 +144,9 @@
class OpenAPIConfig(BaseModel):
"""Class containing Settings and Schema Properties"""
+ class Config(BaseConfig):
+ copy_on_model_validation = False
+
create_examples: bool = False
openapi_controller: Type[OpenAPIController] = OpenAPIController
@@ -214,8 +217,9 @@
class TemplateConfig(BaseModel):
- class Config:
+ class Config(BaseConfig):
arbitrary_types_allowed = True
+ copy_on_model_validation = False
directory: Union[DirectoryPath, List[DirectoryPath]]
engine: Type[TemplateEngineProtocol]
@@ -232,8 +236,9 @@
class CacheConfig(BaseModel):
- class Config:
+ class Config(BaseConfig):
arbitrary_types_allowed = True
+ copy_on_model_validation = False
backend: CacheBackendProtocol = SimpleCacheBackend()
expiration: int = 60 # value in seconds
diff --git a/starlite/exceptions/utils.py b/starlite/exceptions/utils.py
--- a/starlite/exceptions/utils.py
+++ b/starlite/exceptions/utils.py
@@ -2,7 +2,7 @@
from pydantic import BaseModel
from starlette.exceptions import HTTPException as StarletteHTTPException
-from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR # noqa: TC002
+from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR
from starlite.enums import MediaType
from starlite.response import Response
| {"golden_diff": "diff --git a/starlite/config.py b/starlite/config.py\n--- a/starlite/config.py\n+++ b/starlite/config.py\n@@ -12,7 +12,7 @@\n )\n from urllib.parse import urlencode\n \n-from pydantic import AnyUrl, BaseModel, DirectoryPath, constr, validator\n+from pydantic import AnyUrl, BaseConfig, BaseModel, DirectoryPath, constr, validator\n from pydantic_openapi_schema.utils import construct_open_api_with_schema_class\n from pydantic_openapi_schema.v3_1_0.contact import Contact\n from pydantic_openapi_schema.v3_1_0.external_documentation import ExternalDocumentation\n@@ -144,6 +144,9 @@\n class OpenAPIConfig(BaseModel):\n \"\"\"Class containing Settings and Schema Properties\"\"\"\n \n+ class Config(BaseConfig):\n+ copy_on_model_validation = False\n+\n create_examples: bool = False\n openapi_controller: Type[OpenAPIController] = OpenAPIController\n \n@@ -214,8 +217,9 @@\n \n \n class TemplateConfig(BaseModel):\n- class Config:\n+ class Config(BaseConfig):\n arbitrary_types_allowed = True\n+ copy_on_model_validation = False\n \n directory: Union[DirectoryPath, List[DirectoryPath]]\n engine: Type[TemplateEngineProtocol]\n@@ -232,8 +236,9 @@\n \n \n class CacheConfig(BaseModel):\n- class Config:\n+ class Config(BaseConfig):\n arbitrary_types_allowed = True\n+ copy_on_model_validation = False\n \n backend: CacheBackendProtocol = SimpleCacheBackend()\n expiration: int = 60 # value in seconds\ndiff --git a/starlite/exceptions/utils.py b/starlite/exceptions/utils.py\n--- a/starlite/exceptions/utils.py\n+++ b/starlite/exceptions/utils.py\n@@ -2,7 +2,7 @@\n \n from pydantic import BaseModel\n from starlette.exceptions import HTTPException as StarletteHTTPException\n-from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR # noqa: TC002\n+from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR\n \n from starlite.enums import MediaType\n from starlite.response import Response\n", "issue": "Pydantic 1.9.1 deepcopy breaks Redis caching, possibly other things\nI'm open to the possibility that this should be treated as an issue in Pydantic because it's a breaking change there, and https://github.com/samuelcolvin/pydantic/issues/4184 appears to be a variation of this.\r\n\r\nI have a Starlite application which configures its caching as follows (excerpt from my actual `app.py`):\r\n\r\n```python\r\nimport redis\r\nimport starlite\r\n\r\nfrom .config import settings\r\n\r\ncache_config = (\r\n starlite.CacheConfig(backend=redis.from_url(settings.CACHE_URL))\r\n if settings.CACHE_URL is not None\r\n else starlite.app.DEFAULT_CACHE_CONFIG\r\n)\r\n```\r\n\r\n`config.settings` in this case is a `pydantic.BaseSettings` instance which reads the Redis URL from an environment variable, if set. The fallback logic is for local development where a Redis instance may not be running.\r\n\r\nI recently did an upgrade of my dependency tree which moved from Pydantic 1.9.0 to Pydantic 1.9.1, and my web worker processes now fail to boot. 
Reverting to Pydantic 1.9.0 resolves this.\r\n\r\nThe failure occurs at instantiating the `Starlite` application object:\r\n\r\n```python\r\nFile /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:40, in pydantic.decorator.validate_arguments.validate.wrapper_function()\r\n\r\nFile /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:133, in pydantic.decorator.ValidatedFunction.call()\r\n\r\nFile /opt/venv/lib/python3.10/site-packages/pydantic/decorator.py:130, in pydantic.decorator.ValidatedFunction.init_model_instance()\r\n\r\nFile /opt/venv/lib/python3.10/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()\r\n\r\nValidationError: 1 validation error for Init\r\ncache_config\r\n cannot pickle '_thread.lock' object (type=type_error)\r\n```\r\n\r\nThe root of the issue appears to be that in Pydantic 1.9.1, Pydantic defaults to performing a `copy.deepcopy()` of model members during validation, which in turn fails on encountering any non-pickle-able object (as in the above traceback where it attempts to `deepcopy()` a Redis client instance). Pydantic 1.9.0 performed only a shallow copy.\r\n\r\nThis behavior can be disabled by setting the `copy_on_model_validation` option to a false-y value in the config for a Pydantic model class, or by passing it in the `config` dictionary of the `validate_arguments` decorator, and that may be the simplest workaround to apply for now while Pydantic decides what to do about this. It may also be the correct long-term choice since it's likely that non-pickle-able objects, such as caching clients, will be passed in arguments to the `Starlite` constructor from time to time.\r\n\r\nThis issue appears to be present in any version of Starlite which decorates `Starlite.__init__()` with `validate_arguments`, when using Pydantic 1.9.1 (I've tested Starlite 1.7.3 and 1.3.3), and disappears on reverting to Pydantic 1.9.0.\n", "before_files": [{"content": "from typing import Any, Dict, List, Optional, Union\n\nfrom pydantic import BaseModel\nfrom starlette.exceptions import HTTPException as StarletteHTTPException\nfrom starlette.status import HTTP_500_INTERNAL_SERVER_ERROR # noqa: TC002\n\nfrom starlite.enums import MediaType\nfrom starlite.response import Response\n\nfrom .exceptions import HTTPException\n\n__all__ = [\"create_exception_response\"]\n\n\nclass ExceptionResponseContent(BaseModel):\n detail: Optional[str]\n extra: Optional[Union[Dict[str, Any], List[Any]]]\n status_code: int = HTTP_500_INTERNAL_SERVER_ERROR\n\n\ndef create_exception_response(exc: Exception) -> Response:\n \"\"\"\n Constructs a response from an exception.\n\n For instances of either `starlite.exceptions.HTTPException` or `starlette.exceptions.HTTPException` the response\n status code is drawn from the exception, otherwise response status is `HTTP_500_INTERNAL_SERVER_ERROR`.\n\n Args:\n exc (Exception): Any exception.\n\n Returns:\n Response\n \"\"\"\n if isinstance(exc, (HTTPException, StarletteHTTPException)):\n content = ExceptionResponseContent(detail=exc.detail, status_code=exc.status_code)\n if isinstance(exc, HTTPException):\n content.extra = exc.extra\n else:\n content = ExceptionResponseContent(detail=repr(exc))\n return Response(\n media_type=MediaType.JSON,\n content=content.dict(exclude_none=True),\n status_code=content.status_code,\n )\n", "path": "starlite/exceptions/utils.py"}, {"content": "from enum import Enum\nfrom typing import (\n TYPE_CHECKING,\n Any,\n Callable,\n Dict,\n List,\n Optional,\n Tuple,\n Type,\n Union,\n)\nfrom 
urllib.parse import urlencode\n\nfrom pydantic import AnyUrl, BaseModel, DirectoryPath, constr, validator\nfrom pydantic_openapi_schema.utils import construct_open_api_with_schema_class\nfrom pydantic_openapi_schema.v3_1_0.contact import Contact\nfrom pydantic_openapi_schema.v3_1_0.external_documentation import ExternalDocumentation\nfrom pydantic_openapi_schema.v3_1_0.info import Info\nfrom pydantic_openapi_schema.v3_1_0.license import License\nfrom pydantic_openapi_schema.v3_1_0.open_api import OpenAPI\nfrom pydantic_openapi_schema.v3_1_0.path_item import PathItem\nfrom pydantic_openapi_schema.v3_1_0.reference import Reference\nfrom pydantic_openapi_schema.v3_1_0.security_requirement import SecurityRequirement\nfrom pydantic_openapi_schema.v3_1_0.server import Server\nfrom pydantic_openapi_schema.v3_1_0.tag import Tag\n\nfrom starlite.cache import CacheBackendProtocol, SimpleCacheBackend\nfrom starlite.openapi.controller import OpenAPIController\nfrom starlite.openapi.path_item import create_path_item\nfrom starlite.routes import HTTPRoute\nfrom starlite.template import TemplateEngineProtocol\nfrom starlite.types import CacheKeyBuilder\n\nif TYPE_CHECKING:\n from starlite.app import Starlite\n from starlite.connection import Request\n\n\nclass CORSConfig(BaseModel):\n allow_origins: List[str] = [\"*\"]\n allow_methods: List[str] = [\"*\"]\n allow_headers: List[str] = [\"*\"]\n allow_credentials: bool = False\n allow_origin_regex: Optional[str] = None\n expose_headers: List[str] = []\n max_age: int = 600\n\n\nclass CompressionBackend(str, Enum):\n \"\"\"CompressionBackend is an enum that defines the available compression backends.\"\"\"\n\n GZIP = \"gzip\"\n BROTLI = \"brotli\"\n\n\nclass BrotliMode(str, Enum):\n \"\"\"BrotliMode is an enum that defines the available brotli compression optimization modes.\"\"\"\n\n GENERIC = \"generic\"\n TEXT = \"text\"\n FONT = \"font\"\n\n\nclass CompressionConfig(BaseModel):\n \"\"\"Class containing the configuration for request compression.\"\"\"\n\n backend: Union[CompressionBackend, str]\n minimum_size: int = 500\n gzip_compress_level: int = 9\n brotli_quality: int = 5\n brotli_mode: Union[BrotliMode, str] = BrotliMode.TEXT\n brotli_lgwin: int = 22\n brotli_lgblock: int = 0\n brotli_gzip_fallback: bool = True\n\n @validator(\"backend\", pre=True, always=True)\n def backend_must_be_supported( # pylint: disable=no-self-argument\n cls, v: Union[CompressionBackend, str]\n ) -> CompressionBackend:\n \"\"\"Compression Backend Validation\n\n Args:\n v (CompressionBackend|str): Holds the selected compression backend\n\n Raises:\n ValueError: Value is not a valid compression backend\n\n Returns:\n _type_: CompressionBackend\n \"\"\"\n if isinstance(v, str):\n try:\n v = CompressionBackend[v.upper()]\n except KeyError as e:\n raise ValueError(f\"{v} is not a valid compression backend\") from e\n return v\n\n @validator(\"brotli_mode\", pre=True, always=True)\n def brotli_mode_must_be_valid(cls, v: Union[BrotliMode, str]) -> BrotliMode: # pylint: disable=no-self-argument\n \"\"\"Compression Backend Validation\n\n Args:\n v (CompressionBackend|str): Holds the selected compression backend\n\n Raises:\n ValueError: Value is not a valid compression backend\n\n Returns:\n _type_: CompressionBackend\n \"\"\"\n if isinstance(v, str):\n try:\n v = BrotliMode[v.upper()]\n except KeyError as e:\n raise ValueError(f\"{v} is not a valid compression optimization mode\") from e\n return v\n\n def dict(self, *args, **kwargs) -> Dict[str, Any]: # type: 
ignore[no-untyped-def]\n \"\"\"Returns a dictionary representation of the CompressionConfig.\n\n Returns:\n Dict[str, Any]: dictionary representation of the selected CompressionConfig. Only columns for the selected backend are included\n \"\"\"\n brotli_keys = {\n \"minimum_size\",\n \"brotli_quality\",\n \"brotli_mode\",\n \"brotli_lgwin\",\n \"brotli_lgblock\",\n \"brotli_gzip_fallback\",\n }\n gzip_keys = {\"minimum_size\", \"gzip_compress_level\"}\n if self.backend == CompressionBackend.GZIP:\n kwargs[\"include\"] = gzip_keys\n elif self.backend == CompressionBackend.BROTLI:\n kwargs[\"include\"] = brotli_keys\n else:\n kwargs[\"include\"] = brotli_keys.union(gzip_keys)\n\n return super().dict(*args, **kwargs)\n\n\nclass OpenAPIConfig(BaseModel):\n \"\"\"Class containing Settings and Schema Properties\"\"\"\n\n create_examples: bool = False\n openapi_controller: Type[OpenAPIController] = OpenAPIController\n\n title: str\n version: str\n contact: Optional[Contact] = None\n description: Optional[str] = None\n external_docs: Optional[ExternalDocumentation] = None\n license: Optional[License] = None\n security: Optional[List[SecurityRequirement]] = None\n servers: List[Server] = [Server(url=\"/\")]\n summary: Optional[str] = None\n tags: Optional[List[Tag]] = None\n terms_of_service: Optional[AnyUrl] = None\n use_handler_docstrings: bool = False\n webhooks: Optional[Dict[str, Union[PathItem, Reference]]] = None\n\n def to_openapi_schema(self) -> OpenAPI:\n \"\"\"Generates an OpenAPI model\"\"\"\n return OpenAPI(\n externalDocs=self.external_docs,\n security=self.security,\n servers=self.servers,\n tags=self.tags,\n webhooks=self.webhooks,\n info=Info(\n title=self.title,\n version=self.version,\n description=self.description,\n contact=self.contact,\n license=self.license,\n summary=self.summary,\n termsOfService=self.terms_of_service,\n ),\n )\n\n def create_openapi_schema_model(self, app: \"Starlite\") -> OpenAPI:\n \"\"\"\n Creates `OpenAPI` instance for the given `router`.\n\n Args:\n app (Starlite)\n\n Returns:\n OpenAPI\n \"\"\"\n schema = self.to_openapi_schema()\n schema.paths = {}\n for route in app.routes:\n if (\n isinstance(route, HTTPRoute)\n and any(route_handler.include_in_schema for route_handler, _ in route.route_handler_map.values())\n and (route.path_format or \"/\") not in schema.paths\n ):\n schema.paths[route.path_format or \"/\"] = create_path_item(\n route=route,\n create_examples=self.create_examples,\n plugins=app.plugins,\n use_handler_docstrings=self.use_handler_docstrings,\n )\n return construct_open_api_with_schema_class(schema)\n\n\nclass StaticFilesConfig(BaseModel):\n path: constr(min_length=1) # type: ignore\n directories: List[DirectoryPath]\n html_mode: bool = False\n\n\nclass TemplateConfig(BaseModel):\n class Config:\n arbitrary_types_allowed = True\n\n directory: Union[DirectoryPath, List[DirectoryPath]]\n engine: Type[TemplateEngineProtocol]\n engine_callback: Optional[Callable[[Any], Any]]\n\n\ndef default_cache_key_builder(request: \"Request\") -> str:\n \"\"\"\n Given a request object, returns a cache key by combining the path with the sorted query params\n \"\"\"\n qp: List[Tuple[str, Any]] = list(request.query_params.items())\n qp.sort(key=lambda x: x[0])\n return request.url.path + urlencode(qp, doseq=True)\n\n\nclass CacheConfig(BaseModel):\n class Config:\n arbitrary_types_allowed = True\n\n backend: CacheBackendProtocol = SimpleCacheBackend()\n expiration: int = 60 # value in seconds\n cache_key_builder: CacheKeyBuilder = 
default_cache_key_builder\n", "path": "starlite/config.py"}], "after_files": [{"content": "from typing import Any, Dict, List, Optional, Union\n\nfrom pydantic import BaseModel\nfrom starlette.exceptions import HTTPException as StarletteHTTPException\nfrom starlette.status import HTTP_500_INTERNAL_SERVER_ERROR\n\nfrom starlite.enums import MediaType\nfrom starlite.response import Response\n\nfrom .exceptions import HTTPException\n\n__all__ = [\"create_exception_response\"]\n\n\nclass ExceptionResponseContent(BaseModel):\n detail: Optional[str]\n extra: Optional[Union[Dict[str, Any], List[Any]]]\n status_code: int = HTTP_500_INTERNAL_SERVER_ERROR\n\n\ndef create_exception_response(exc: Exception) -> Response:\n \"\"\"\n Constructs a response from an exception.\n\n For instances of either `starlite.exceptions.HTTPException` or `starlette.exceptions.HTTPException` the response\n status code is drawn from the exception, otherwise response status is `HTTP_500_INTERNAL_SERVER_ERROR`.\n\n Args:\n exc (Exception): Any exception.\n\n Returns:\n Response\n \"\"\"\n if isinstance(exc, (HTTPException, StarletteHTTPException)):\n content = ExceptionResponseContent(detail=exc.detail, status_code=exc.status_code)\n if isinstance(exc, HTTPException):\n content.extra = exc.extra\n else:\n content = ExceptionResponseContent(detail=repr(exc))\n return Response(\n media_type=MediaType.JSON,\n content=content.dict(exclude_none=True),\n status_code=content.status_code,\n )\n", "path": "starlite/exceptions/utils.py"}, {"content": "from enum import Enum\nfrom typing import (\n TYPE_CHECKING,\n Any,\n Callable,\n Dict,\n List,\n Optional,\n Tuple,\n Type,\n Union,\n)\nfrom urllib.parse import urlencode\n\nfrom pydantic import AnyUrl, BaseConfig, BaseModel, DirectoryPath, constr, validator\nfrom pydantic_openapi_schema.utils import construct_open_api_with_schema_class\nfrom pydantic_openapi_schema.v3_1_0.contact import Contact\nfrom pydantic_openapi_schema.v3_1_0.external_documentation import ExternalDocumentation\nfrom pydantic_openapi_schema.v3_1_0.info import Info\nfrom pydantic_openapi_schema.v3_1_0.license import License\nfrom pydantic_openapi_schema.v3_1_0.open_api import OpenAPI\nfrom pydantic_openapi_schema.v3_1_0.path_item import PathItem\nfrom pydantic_openapi_schema.v3_1_0.reference import Reference\nfrom pydantic_openapi_schema.v3_1_0.security_requirement import SecurityRequirement\nfrom pydantic_openapi_schema.v3_1_0.server import Server\nfrom pydantic_openapi_schema.v3_1_0.tag import Tag\n\nfrom starlite.cache import CacheBackendProtocol, SimpleCacheBackend\nfrom starlite.openapi.controller import OpenAPIController\nfrom starlite.openapi.path_item import create_path_item\nfrom starlite.routes import HTTPRoute\nfrom starlite.template import TemplateEngineProtocol\nfrom starlite.types import CacheKeyBuilder\n\nif TYPE_CHECKING:\n from starlite.app import Starlite\n from starlite.connection import Request\n\n\nclass CORSConfig(BaseModel):\n allow_origins: List[str] = [\"*\"]\n allow_methods: List[str] = [\"*\"]\n allow_headers: List[str] = [\"*\"]\n allow_credentials: bool = False\n allow_origin_regex: Optional[str] = None\n expose_headers: List[str] = []\n max_age: int = 600\n\n\nclass CompressionBackend(str, Enum):\n \"\"\"CompressionBackend is an enum that defines the available compression backends.\"\"\"\n\n GZIP = \"gzip\"\n BROTLI = \"brotli\"\n\n\nclass BrotliMode(str, Enum):\n \"\"\"BrotliMode is an enum that defines the available brotli compression optimization modes.\"\"\"\n\n GENERIC = 
\"generic\"\n TEXT = \"text\"\n FONT = \"font\"\n\n\nclass CompressionConfig(BaseModel):\n \"\"\"Class containing the configuration for request compression.\"\"\"\n\n backend: Union[CompressionBackend, str]\n minimum_size: int = 500\n gzip_compress_level: int = 9\n brotli_quality: int = 5\n brotli_mode: Union[BrotliMode, str] = BrotliMode.TEXT\n brotli_lgwin: int = 22\n brotli_lgblock: int = 0\n brotli_gzip_fallback: bool = True\n\n @validator(\"backend\", pre=True, always=True)\n def backend_must_be_supported( # pylint: disable=no-self-argument\n cls, v: Union[CompressionBackend, str]\n ) -> CompressionBackend:\n \"\"\"Compression Backend Validation\n\n Args:\n v (CompressionBackend|str): Holds the selected compression backend\n\n Raises:\n ValueError: Value is not a valid compression backend\n\n Returns:\n _type_: CompressionBackend\n \"\"\"\n if isinstance(v, str):\n try:\n v = CompressionBackend[v.upper()]\n except KeyError as e:\n raise ValueError(f\"{v} is not a valid compression backend\") from e\n return v\n\n @validator(\"brotli_mode\", pre=True, always=True)\n def brotli_mode_must_be_valid(cls, v: Union[BrotliMode, str]) -> BrotliMode: # pylint: disable=no-self-argument\n \"\"\"Compression Backend Validation\n\n Args:\n v (CompressionBackend|str): Holds the selected compression backend\n\n Raises:\n ValueError: Value is not a valid compression backend\n\n Returns:\n _type_: CompressionBackend\n \"\"\"\n if isinstance(v, str):\n try:\n v = BrotliMode[v.upper()]\n except KeyError as e:\n raise ValueError(f\"{v} is not a valid compression optimization mode\") from e\n return v\n\n def dict(self, *args, **kwargs) -> Dict[str, Any]: # type: ignore[no-untyped-def]\n \"\"\"Returns a dictionary representation of the CompressionConfig.\n\n Returns:\n Dict[str, Any]: dictionary representation of the selected CompressionConfig. 
Only columns for the selected backend are included\n \"\"\"\n brotli_keys = {\n \"minimum_size\",\n \"brotli_quality\",\n \"brotli_mode\",\n \"brotli_lgwin\",\n \"brotli_lgblock\",\n \"brotli_gzip_fallback\",\n }\n gzip_keys = {\"minimum_size\", \"gzip_compress_level\"}\n if self.backend == CompressionBackend.GZIP:\n kwargs[\"include\"] = gzip_keys\n elif self.backend == CompressionBackend.BROTLI:\n kwargs[\"include\"] = brotli_keys\n else:\n kwargs[\"include\"] = brotli_keys.union(gzip_keys)\n\n return super().dict(*args, **kwargs)\n\n\nclass OpenAPIConfig(BaseModel):\n \"\"\"Class containing Settings and Schema Properties\"\"\"\n\n class Config(BaseConfig):\n copy_on_model_validation = False\n\n create_examples: bool = False\n openapi_controller: Type[OpenAPIController] = OpenAPIController\n\n title: str\n version: str\n contact: Optional[Contact] = None\n description: Optional[str] = None\n external_docs: Optional[ExternalDocumentation] = None\n license: Optional[License] = None\n security: Optional[List[SecurityRequirement]] = None\n servers: List[Server] = [Server(url=\"/\")]\n summary: Optional[str] = None\n tags: Optional[List[Tag]] = None\n terms_of_service: Optional[AnyUrl] = None\n use_handler_docstrings: bool = False\n webhooks: Optional[Dict[str, Union[PathItem, Reference]]] = None\n\n def to_openapi_schema(self) -> OpenAPI:\n \"\"\"Generates an OpenAPI model\"\"\"\n return OpenAPI(\n externalDocs=self.external_docs,\n security=self.security,\n servers=self.servers,\n tags=self.tags,\n webhooks=self.webhooks,\n info=Info(\n title=self.title,\n version=self.version,\n description=self.description,\n contact=self.contact,\n license=self.license,\n summary=self.summary,\n termsOfService=self.terms_of_service,\n ),\n )\n\n def create_openapi_schema_model(self, app: \"Starlite\") -> OpenAPI:\n \"\"\"\n Creates `OpenAPI` instance for the given `router`.\n\n Args:\n app (Starlite)\n\n Returns:\n OpenAPI\n \"\"\"\n schema = self.to_openapi_schema()\n schema.paths = {}\n for route in app.routes:\n if (\n isinstance(route, HTTPRoute)\n and any(route_handler.include_in_schema for route_handler, _ in route.route_handler_map.values())\n and (route.path_format or \"/\") not in schema.paths\n ):\n schema.paths[route.path_format or \"/\"] = create_path_item(\n route=route,\n create_examples=self.create_examples,\n plugins=app.plugins,\n use_handler_docstrings=self.use_handler_docstrings,\n )\n return construct_open_api_with_schema_class(schema)\n\n\nclass StaticFilesConfig(BaseModel):\n path: constr(min_length=1) # type: ignore\n directories: List[DirectoryPath]\n html_mode: bool = False\n\n\nclass TemplateConfig(BaseModel):\n class Config(BaseConfig):\n arbitrary_types_allowed = True\n copy_on_model_validation = False\n\n directory: Union[DirectoryPath, List[DirectoryPath]]\n engine: Type[TemplateEngineProtocol]\n engine_callback: Optional[Callable[[Any], Any]]\n\n\ndef default_cache_key_builder(request: \"Request\") -> str:\n \"\"\"\n Given a request object, returns a cache key by combining the path with the sorted query params\n \"\"\"\n qp: List[Tuple[str, Any]] = list(request.query_params.items())\n qp.sort(key=lambda x: x[0])\n return request.url.path + urlencode(qp, doseq=True)\n\n\nclass CacheConfig(BaseModel):\n class Config(BaseConfig):\n arbitrary_types_allowed = True\n copy_on_model_validation = False\n\n backend: CacheBackendProtocol = SimpleCacheBackend()\n expiration: int = 60 # value in seconds\n cache_key_builder: CacheKeyBuilder = default_cache_key_builder\n", "path": 
"starlite/config.py"}]} | 3,859 | 473 |
gh_patches_debug_6365 | rasdani/github-patches | git_diff | lutris__lutris-1251 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Log window: lock to the bottom gets lost
Caused by https://github.com/lutris/lutris/pull/1179. From my experience, it happens when a lot of lines are output all at once:

--- END ISSUE ---
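The "lock to the bottom" behaviour described in the issue hinges on one invariant of the view's vertical Gtk.Adjustment: the view counts as pinned only while its scroll value equals the bottom position, `upper - page_size`. The sketch below mirrors the logic of the patched autoscroll handler shown in the diff further down; it is illustrative only, not the actual Lutris code:

```python
class TailFollower:
    """Keep a scrollable view glued to its last line unless the user scrolls away."""

    def __init__(self):
        self.scroll_max = 0  # bottom position recorded on the previous allocation

    def on_size_allocate(self, adj):
        # `adj` is the view's vertical Gtk.Adjustment.
        bottom = adj.get_upper() - adj.get_page_size()
        if adj.get_value() == self.scroll_max or self.scroll_max == 0:
            # The user was at the previous bottom, so follow the new one.
            adj.set_value(bottom)
            self.scroll_max = adj.get_value()
        else:
            # The user scrolled up; remember the new bottom but do not jump.
            self.scroll_max = bottom
```

The key point is that the stored bottom must be refreshed on every allocation, including the ones where the user has scrolled away; otherwise a burst of output moves the bottom several times and the stale value never matches again, which is exactly how the lock gets lost.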
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lutris/gui/logwindow.py`
Content:
```
1 from gi.repository import Gtk
2 from lutris.gui.widgets.dialogs import Dialog
3
4
5 class LogTextView(Gtk.TextView):
6 def __init__(self, buffer):
7 super(LogTextView, self).__init__()
8
9 self.set_buffer(buffer)
10 self.set_editable(False)
11 self.set_monospace(True)
12 self.set_left_margin(10)
13 self.scroll_max = 0
14 self.set_wrap_mode(Gtk.WrapMode.CHAR)
15 self.get_style_context().add_class('lutris-logview')
16 self.connect("size-allocate", self.autoscroll)
17
18 def autoscroll(self, *args):
19 adj = self.get_vadjustment()
20 if adj.get_value() == self.scroll_max or self.scroll_max == 0:
21 adj.set_value(adj.get_upper() - adj.get_page_size())
22 self.scroll_max = adj.get_upper() - adj.get_page_size()
23
24
25 class LogWindow(Dialog):
26 def __init__(self, title, buffer, parent):
27 super(LogWindow, self).__init__(title, parent, 0,
28 ('_OK', Gtk.ResponseType.OK))
29 self.set_size_request(640, 480)
30 self.grid = Gtk.Grid()
31 self.buffer = buffer
32 self.logtextview = LogTextView(self.buffer)
33
34 scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,
35 child=self.logtextview)
36 self.vbox.add(scrolledwindow)
37 self.show_all()
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lutris/gui/logwindow.py b/lutris/gui/logwindow.py
--- a/lutris/gui/logwindow.py
+++ b/lutris/gui/logwindow.py
@@ -19,7 +19,9 @@
adj = self.get_vadjustment()
if adj.get_value() == self.scroll_max or self.scroll_max == 0:
adj.set_value(adj.get_upper() - adj.get_page_size())
- self.scroll_max = adj.get_upper() - adj.get_page_size()
+ self.scroll_max = adj.get_value()
+ else:
+ self.scroll_max = adj.get_upper() - adj.get_page_size()
class LogWindow(Dialog):
| {"golden_diff": "diff --git a/lutris/gui/logwindow.py b/lutris/gui/logwindow.py\n--- a/lutris/gui/logwindow.py\n+++ b/lutris/gui/logwindow.py\n@@ -19,7 +19,9 @@\n adj = self.get_vadjustment()\n if adj.get_value() == self.scroll_max or self.scroll_max == 0:\n adj.set_value(adj.get_upper() - adj.get_page_size())\n- self.scroll_max = adj.get_upper() - adj.get_page_size()\n+ self.scroll_max = adj.get_value()\n+ else:\n+ self.scroll_max = adj.get_upper() - adj.get_page_size()\n \n \n class LogWindow(Dialog):\n", "issue": "Log window: lock to the bottom gets lost\nCaused by https://github.com/lutris/lutris/pull/1179. From my experience, it happens when there are a lot lines outputted all at once:\r\n\r\n\n", "before_files": [{"content": "from gi.repository import Gtk\nfrom lutris.gui.widgets.dialogs import Dialog\n\n\nclass LogTextView(Gtk.TextView):\n def __init__(self, buffer):\n super(LogTextView, self).__init__()\n\n self.set_buffer(buffer)\n self.set_editable(False)\n self.set_monospace(True)\n self.set_left_margin(10)\n self.scroll_max = 0\n self.set_wrap_mode(Gtk.WrapMode.CHAR)\n self.get_style_context().add_class('lutris-logview')\n self.connect(\"size-allocate\", self.autoscroll)\n\n def autoscroll(self, *args):\n adj = self.get_vadjustment()\n if adj.get_value() == self.scroll_max or self.scroll_max == 0:\n adj.set_value(adj.get_upper() - adj.get_page_size())\n self.scroll_max = adj.get_upper() - adj.get_page_size()\n\n\nclass LogWindow(Dialog):\n def __init__(self, title, buffer, parent):\n super(LogWindow, self).__init__(title, parent, 0,\n ('_OK', Gtk.ResponseType.OK))\n self.set_size_request(640, 480)\n self.grid = Gtk.Grid()\n self.buffer = buffer\n self.logtextview = LogTextView(self.buffer)\n\n scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,\n child=self.logtextview)\n self.vbox.add(scrolledwindow)\n self.show_all()\n", "path": "lutris/gui/logwindow.py"}], "after_files": [{"content": "from gi.repository import Gtk\nfrom lutris.gui.widgets.dialogs import Dialog\n\n\nclass LogTextView(Gtk.TextView):\n def __init__(self, buffer):\n super(LogTextView, self).__init__()\n\n self.set_buffer(buffer)\n self.set_editable(False)\n self.set_monospace(True)\n self.set_left_margin(10)\n self.scroll_max = 0\n self.set_wrap_mode(Gtk.WrapMode.CHAR)\n self.get_style_context().add_class('lutris-logview')\n self.connect(\"size-allocate\", self.autoscroll)\n\n def autoscroll(self, *args):\n adj = self.get_vadjustment()\n if adj.get_value() == self.scroll_max or self.scroll_max == 0:\n adj.set_value(adj.get_upper() - adj.get_page_size())\n self.scroll_max = adj.get_value()\n else:\n self.scroll_max = adj.get_upper() - adj.get_page_size()\n\n\nclass LogWindow(Dialog):\n def __init__(self, title, buffer, parent):\n super(LogWindow, self).__init__(title, parent, 0,\n ('_OK', Gtk.ResponseType.OK))\n self.set_size_request(640, 480)\n self.grid = Gtk.Grid()\n self.buffer = buffer\n self.logtextview = LogTextView(self.buffer)\n\n scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,\n child=self.logtextview)\n self.vbox.add(scrolledwindow)\n self.show_all()\n", "path": "lutris/gui/logwindow.py"}]} | 765 | 145 |
gh_patches_debug_29658 | rasdani/github-patches | git_diff | gratipay__gratipay.com-4565 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Broken URL links for npm packages with special characters?
From @calvinhp on [Slack](https://gratipay.slackarchive.io/gratipay/page-100/ts-1496325029088097):
> I just found a 404 :slightly_smiling_face:
> If I search for Plone, the first matching project is @plone/plone-react
> but it is a 404
[Search results](https://gratipay.com/search?q=Plone):

Broken Link: https://gratipay.com/on/npm/@plone/plone-react/
I tried URL encoding it to be https://gratipay.com/on/npm/%40plone%2Fplone-react/, didn't work. I've encountered similar problems with common routing libraries where a `/` can't be URL-encoded and placed in the URL, although the spec supports it - might be related.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gratipay/models/package/__init__.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 from postgres.orm import Model
5
6 from .emails import Emails
7 from .team import Team
8
9
10 NPM = 'npm' # We are starting with a single package manager. If we see
11 # traction we will expand.
12
13
14 class Package(Model, Emails, Team):
15 """Represent a gratipackage. :-)
16
17 Packages are entities on open source package managers; `npm
18 <https://www.npmjs.com/>`_ is the only one we support so far. Each package
19 on npm has a page on Gratipay with an URL of the form ``/on/npm/foo/``.
20 Packages can be claimed by Gratipay participants, at which point we create
21 a :py:class:`~gratipay.models.team.Team` for them under the hood so they
22 can start accepting payments.
23
24 """
25
26 typname = 'packages'
27
28 def __eq__(self, other):
29 if not isinstance(other, Package):
30 return False
31 return self.id == other.id
32
33 def __ne__(self, other):
34 if not isinstance(other, Package):
35 return True
36 return self.id != other.id
37
38
39 @property
40 def url_path(self):
41 """The path part of the URL for this package on Gratipay.
42 """
43 return '/on/{}/{}/'.format(self.package_manager, self.name)
44
45
46 @property
47 def remote_human_url(self):
48 """The URL for the main page for this package on its package manager.
49 """
50 if self.package_manager == NPM:
51 return 'https://www.npmjs.com/package/{}'.format(self.name)
52 raise NotImplementedError()
53
54
55 @property
56 def remote_api_url(self):
57 """The main API URL for this package on its package manager.
58 """
59 if self.package_manager == NPM:
60 return 'https://registry.npmjs.com/{}'.format(self.name)
61 raise NotImplementedError()
62
63
64 # Constructors
65 # ============
66
67 @classmethod
68 def from_id(cls, id, cursor=None):
69 """Return an existing package based on id.
70 """
71 cursor = cursor or cls.db
72 return cursor.one("SELECT packages.*::packages FROM packages WHERE id=%s", (id,))
73
74
75 @classmethod
76 def from_names(cls, package_manager, name, cursor=None):
77 """Return an existing package based on package manager and package names.
78 """
79 cursor = cursor or cls.db
80 return cursor.one( "SELECT packages.*::packages FROM packages "
81 "WHERE package_manager=%s and name=%s"
82 , (package_manager, name)
83 )
84
85
86 @classmethod
87 def upsert(cls, package_manager, **kw):
88 """Upsert a package. Required keyword arguments:
89
90 - ``name`` (string)
91 - ``description`` (string)
92 - ``emails`` (list of strings)
93
94 Optional keyword argument:
95
96 - ``cursor``
97
98 :return None:
99
100 """
101 cursor = kw.pop('cursor', cls.db)
102 cursor.run('''
103 INSERT INTO packages
104 (package_manager, name, description, emails)
105 VALUES ('npm', %(name)s, %(description)s, %(emails)s)
106
107 ON CONFLICT (package_manager, name) DO UPDATE
108 SET description=%(description)s, emails=%(emails)s
109 ''', kw)
110
111
112 def delete(self, cursor=None):
113 """Delete the package, unlinking any team (the team itself lives on)
114 and clearing any claim.
115 """
116 cursor = cursor or self.db
117 if self.load_team(cursor):
118 self.unlink_team(cursor)
119 cursor.run("DELETE FROM claims WHERE package_id=%s", (self.id,))
120 cursor.run( "DELETE FROM packages WHERE package_manager=%s AND name=%s"
121 , (self.package_manager, self.name)
122 )
123
```
Path: `gratipay/models/package/team.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 import uuid
5
6 from gratipay.models.team import Team as _Team
7
8
9 class Team(object):
10 """A :py:class:`~gratipay.models.package.Package` can have a
11 :py:class:`~gratipay.models.team.Team` associated with it.
12 """
13
14 @property
15 def team(self):
16 """A computed attribute, the :py:class:`~gratipay.models.team.Team`
17 linked to this package if there is one, otherwise ``None``. Makes a
18 database call.
19 """
20 return self.load_team(self.db)
21
22
23 def load_team(self, cursor):
24 """Given a database cursor, return a
25 :py:class:`~gratipay.models.team.Team` if there is one linked to this
26 package, or ``None`` if not.
27 """
28 return cursor.one( 'SELECT t.*::teams FROM teams t WHERE t.id='
29 '(SELECT team_id FROM teams_to_packages tp WHERE tp.package_id=%s)'
30 , (self.id,)
31 )
32
33
34 def get_or_create_linked_team(self, cursor, owner):
35 """Given a db cursor and a :py:class:`Participant`, return a
36 :py:class:`~gratipay.models.team.Team`.
37 """
38 team = self.load_team(cursor)
39 if team:
40 return team
41
42 def slug_options():
43 yield self.name
44 for i in range(1, 10):
45 yield '{}-{}'.format(self.name, i)
46 yield uuid.uuid4().hex
47
48 for slug in slug_options():
49 if cursor.one('SELECT count(*) FROM teams WHERE slug=%s', (slug,)) == 0:
50 break
51
52 team = _Team.insert( slug=slug
53 , slug_lower=slug.lower()
54 , name=slug
55 , homepage='https://www.npmjs.com/package/' + self.name
56 , product_or_service=self.description
57 , owner=owner
58 , _cursor=cursor
59 )
60 cursor.run('INSERT INTO teams_to_packages (team_id, package_id) '
61 'VALUES (%s, %s)', (team.id, self.id))
62 self.app.add_event( cursor
63 , 'package'
64 , dict(id=self.id, action='link', values=dict(team_id=team.id))
65 )
66 return team
67
68
69 def unlink_team(self, cursor):
70 """Given a db cursor, unlink the team associated with this package
71 (it's a bug if called with no team linked).
72 """
73 team = self.load_team(cursor)
74 assert team is not None # sanity check
75 cursor.run('DELETE FROM teams_to_packages WHERE package_id=%s', (self.id,))
76 self.app.add_event( cursor
77 , 'package'
78 , dict(id=self.id, action='unlink', values=dict(team_id=team.id))
79 )
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gratipay/models/package/__init__.py b/gratipay/models/package/__init__.py
--- a/gratipay/models/package/__init__.py
+++ b/gratipay/models/package/__init__.py
@@ -16,7 +16,7 @@
Packages are entities on open source package managers; `npm
<https://www.npmjs.com/>`_ is the only one we support so far. Each package
- on npm has a page on Gratipay with an URL of the form ``/on/npm/foo/``.
+ on npm has a page on Gratipay with an URL of the form ``/on/npm/foo``.
Packages can be claimed by Gratipay participants, at which point we create
a :py:class:`~gratipay.models.team.Team` for them under the hood so they
can start accepting payments.
@@ -40,7 +40,7 @@
def url_path(self):
"""The path part of the URL for this package on Gratipay.
"""
- return '/on/{}/{}/'.format(self.package_manager, self.name)
+ return '/on/{}/{}'.format(self.package_manager, self.name)
@property
diff --git a/gratipay/models/package/team.py b/gratipay/models/package/team.py
--- a/gratipay/models/package/team.py
+++ b/gratipay/models/package/team.py
@@ -40,9 +40,12 @@
return team
def slug_options():
- yield self.name
+ # Having analyzed existing names, we should never get `@` without
+ # `/`. Be conservative in what we accept! Oh, wait ...
+ base_name = self.name.split('/')[1] if self.name.startswith('@') else self.name
+ yield base_name
for i in range(1, 10):
- yield '{}-{}'.format(self.name, i)
+ yield '{}-{}'.format(base_name, i)
yield uuid.uuid4().hex
for slug in slug_options():
| {"golden_diff": "diff --git a/gratipay/models/package/__init__.py b/gratipay/models/package/__init__.py\n--- a/gratipay/models/package/__init__.py\n+++ b/gratipay/models/package/__init__.py\n@@ -16,7 +16,7 @@\n \n Packages are entities on open source package managers; `npm\n <https://www.npmjs.com/>`_ is the only one we support so far. Each package\n- on npm has a page on Gratipay with an URL of the form ``/on/npm/foo/``.\n+ on npm has a page on Gratipay with an URL of the form ``/on/npm/foo``.\n Packages can be claimed by Gratipay participants, at which point we create\n a :py:class:`~gratipay.models.team.Team` for them under the hood so they\n can start accepting payments.\n@@ -40,7 +40,7 @@\n def url_path(self):\n \"\"\"The path part of the URL for this package on Gratipay.\n \"\"\"\n- return '/on/{}/{}/'.format(self.package_manager, self.name)\n+ return '/on/{}/{}'.format(self.package_manager, self.name)\n \n \n @property\ndiff --git a/gratipay/models/package/team.py b/gratipay/models/package/team.py\n--- a/gratipay/models/package/team.py\n+++ b/gratipay/models/package/team.py\n@@ -40,9 +40,12 @@\n return team\n \n def slug_options():\n- yield self.name\n+ # Having analyzed existing names, we should never get `@` without\n+ # `/`. Be conservative in what we accept! Oh, wait ...\n+ base_name = self.name.split('/')[1] if self.name.startswith('@') else self.name\n+ yield base_name\n for i in range(1, 10):\n- yield '{}-{}'.format(self.name, i)\n+ yield '{}-{}'.format(base_name, i)\n yield uuid.uuid4().hex\n \n for slug in slug_options():\n", "issue": "Broken URL links for npm packages with special characters? \nFrom @calvinhp on [Slack](https://gratipay.slackarchive.io/gratipay/page-100/ts-1496325029088097): \r\n\r\n> I just found a 404 :slightly_smiling_face:\r\n> If I search for Plone, the first matching project is @plone/plone-react\r\n> but it is a 404\r\n\r\n[Search results](https://gratipay.com/search?q=Plone): \r\n\r\n\r\n\r\nBroken Link: https://gratipay.com/on/npm/@plone/plone-react/\r\n\r\nI tried URL encoding it to be https://gratipay.com/on/npm/%40plone%2Fplone-react/, didn't work. I've encountered similar problems with common routing libraries where a `/` can't be URL-encoded and placed in the URL, although the spec supports it - might be related. \n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom postgres.orm import Model\n\nfrom .emails import Emails\nfrom .team import Team\n\n\nNPM = 'npm' # We are starting with a single package manager. If we see\n # traction we will expand.\n\n\nclass Package(Model, Emails, Team):\n \"\"\"Represent a gratipackage. :-)\n\n Packages are entities on open source package managers; `npm\n <https://www.npmjs.com/>`_ is the only one we support so far. 
Each package\n on npm has a page on Gratipay with an URL of the form ``/on/npm/foo/``.\n Packages can be claimed by Gratipay participants, at which point we create\n a :py:class:`~gratipay.models.team.Team` for them under the hood so they\n can start accepting payments.\n\n \"\"\"\n\n typname = 'packages'\n\n def __eq__(self, other):\n if not isinstance(other, Package):\n return False\n return self.id == other.id\n\n def __ne__(self, other):\n if not isinstance(other, Package):\n return True\n return self.id != other.id\n\n\n @property\n def url_path(self):\n \"\"\"The path part of the URL for this package on Gratipay.\n \"\"\"\n return '/on/{}/{}/'.format(self.package_manager, self.name)\n\n\n @property\n def remote_human_url(self):\n \"\"\"The URL for the main page for this package on its package manager.\n \"\"\"\n if self.package_manager == NPM:\n return 'https://www.npmjs.com/package/{}'.format(self.name)\n raise NotImplementedError()\n\n\n @property\n def remote_api_url(self):\n \"\"\"The main API URL for this package on its package manager.\n \"\"\"\n if self.package_manager == NPM:\n return 'https://registry.npmjs.com/{}'.format(self.name)\n raise NotImplementedError()\n\n\n # Constructors\n # ============\n\n @classmethod\n def from_id(cls, id, cursor=None):\n \"\"\"Return an existing package based on id.\n \"\"\"\n cursor = cursor or cls.db\n return cursor.one(\"SELECT packages.*::packages FROM packages WHERE id=%s\", (id,))\n\n\n @classmethod\n def from_names(cls, package_manager, name, cursor=None):\n \"\"\"Return an existing package based on package manager and package names.\n \"\"\"\n cursor = cursor or cls.db\n return cursor.one( \"SELECT packages.*::packages FROM packages \"\n \"WHERE package_manager=%s and name=%s\"\n , (package_manager, name)\n )\n\n\n @classmethod\n def upsert(cls, package_manager, **kw):\n \"\"\"Upsert a package. Required keyword arguments:\n\n - ``name`` (string)\n - ``description`` (string)\n - ``emails`` (list of strings)\n\n Optional keyword argument:\n\n - ``cursor``\n\n :return None:\n\n \"\"\"\n cursor = kw.pop('cursor', cls.db)\n cursor.run('''\n INSERT INTO packages\n (package_manager, name, description, emails)\n VALUES ('npm', %(name)s, %(description)s, %(emails)s)\n\n ON CONFLICT (package_manager, name) DO UPDATE\n SET description=%(description)s, emails=%(emails)s\n ''', kw)\n\n\n def delete(self, cursor=None):\n \"\"\"Delete the package, unlinking any team (the team itself lives on)\n and clearing any claim.\n \"\"\"\n cursor = cursor or self.db\n if self.load_team(cursor):\n self.unlink_team(cursor)\n cursor.run(\"DELETE FROM claims WHERE package_id=%s\", (self.id,))\n cursor.run( \"DELETE FROM packages WHERE package_manager=%s AND name=%s\"\n , (self.package_manager, self.name)\n )\n", "path": "gratipay/models/package/__init__.py"}, {"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport uuid\n\nfrom gratipay.models.team import Team as _Team\n\n\nclass Team(object):\n \"\"\"A :py:class:`~gratipay.models.package.Package` can have a\n :py:class:`~gratipay.models.team.Team` associated with it.\n \"\"\"\n\n @property\n def team(self):\n \"\"\"A computed attribute, the :py:class:`~gratipay.models.team.Team`\n linked to this package if there is one, otherwise ``None``. 
Makes a\n database call.\n \"\"\"\n return self.load_team(self.db)\n\n\n def load_team(self, cursor):\n \"\"\"Given a database cursor, return a\n :py:class:`~gratipay.models.team.Team` if there is one linked to this\n package, or ``None`` if not.\n \"\"\"\n return cursor.one( 'SELECT t.*::teams FROM teams t WHERE t.id='\n '(SELECT team_id FROM teams_to_packages tp WHERE tp.package_id=%s)'\n , (self.id,)\n )\n\n\n def get_or_create_linked_team(self, cursor, owner):\n \"\"\"Given a db cursor and a :py:class:`Participant`, return a\n :py:class:`~gratipay.models.team.Team`.\n \"\"\"\n team = self.load_team(cursor)\n if team:\n return team\n\n def slug_options():\n yield self.name\n for i in range(1, 10):\n yield '{}-{}'.format(self.name, i)\n yield uuid.uuid4().hex\n\n for slug in slug_options():\n if cursor.one('SELECT count(*) FROM teams WHERE slug=%s', (slug,)) == 0:\n break\n\n team = _Team.insert( slug=slug\n , slug_lower=slug.lower()\n , name=slug\n , homepage='https://www.npmjs.com/package/' + self.name\n , product_or_service=self.description\n , owner=owner\n , _cursor=cursor\n )\n cursor.run('INSERT INTO teams_to_packages (team_id, package_id) '\n 'VALUES (%s, %s)', (team.id, self.id))\n self.app.add_event( cursor\n , 'package'\n , dict(id=self.id, action='link', values=dict(team_id=team.id))\n )\n return team\n\n\n def unlink_team(self, cursor):\n \"\"\"Given a db cursor, unlink the team associated with this package\n (it's a bug if called with no team linked).\n \"\"\"\n team = self.load_team(cursor)\n assert team is not None # sanity check\n cursor.run('DELETE FROM teams_to_packages WHERE package_id=%s', (self.id,))\n self.app.add_event( cursor\n , 'package'\n , dict(id=self.id, action='unlink', values=dict(team_id=team.id))\n )\n", "path": "gratipay/models/package/team.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom postgres.orm import Model\n\nfrom .emails import Emails\nfrom .team import Team\n\n\nNPM = 'npm' # We are starting with a single package manager. If we see\n # traction we will expand.\n\n\nclass Package(Model, Emails, Team):\n \"\"\"Represent a gratipackage. :-)\n\n Packages are entities on open source package managers; `npm\n <https://www.npmjs.com/>`_ is the only one we support so far. 
Each package\n on npm has a page on Gratipay with an URL of the form ``/on/npm/foo``.\n Packages can be claimed by Gratipay participants, at which point we create\n a :py:class:`~gratipay.models.team.Team` for them under the hood so they\n can start accepting payments.\n\n \"\"\"\n\n typname = 'packages'\n\n def __eq__(self, other):\n if not isinstance(other, Package):\n return False\n return self.id == other.id\n\n def __ne__(self, other):\n if not isinstance(other, Package):\n return True\n return self.id != other.id\n\n\n @property\n def url_path(self):\n \"\"\"The path part of the URL for this package on Gratipay.\n \"\"\"\n return '/on/{}/{}'.format(self.package_manager, self.name)\n\n\n @property\n def remote_human_url(self):\n \"\"\"The URL for the main page for this package on its package manager.\n \"\"\"\n if self.package_manager == NPM:\n return 'https://www.npmjs.com/package/{}'.format(self.name)\n raise NotImplementedError()\n\n\n @property\n def remote_api_url(self):\n \"\"\"The main API URL for this package on its package manager.\n \"\"\"\n if self.package_manager == NPM:\n return 'https://registry.npmjs.com/{}'.format(self.name)\n raise NotImplementedError()\n\n\n # Constructors\n # ============\n\n @classmethod\n def from_id(cls, id, cursor=None):\n \"\"\"Return an existing package based on id.\n \"\"\"\n cursor = cursor or cls.db\n return cursor.one(\"SELECT packages.*::packages FROM packages WHERE id=%s\", (id,))\n\n\n @classmethod\n def from_names(cls, package_manager, name, cursor=None):\n \"\"\"Return an existing package based on package manager and package names.\n \"\"\"\n cursor = cursor or cls.db\n return cursor.one( \"SELECT packages.*::packages FROM packages \"\n \"WHERE package_manager=%s and name=%s\"\n , (package_manager, name)\n )\n\n\n @classmethod\n def upsert(cls, package_manager, **kw):\n \"\"\"Upsert a package. Required keyword arguments:\n\n - ``name`` (string)\n - ``description`` (string)\n - ``emails`` (list of strings)\n\n Optional keyword argument:\n\n - ``cursor``\n\n :return None:\n\n \"\"\"\n cursor = kw.pop('cursor', cls.db)\n cursor.run('''\n INSERT INTO packages\n (package_manager, name, description, emails)\n VALUES ('npm', %(name)s, %(description)s, %(emails)s)\n\n ON CONFLICT (package_manager, name) DO UPDATE\n SET description=%(description)s, emails=%(emails)s\n ''', kw)\n\n\n def delete(self, cursor=None):\n \"\"\"Delete the package, unlinking any team (the team itself lives on)\n and clearing any claim.\n \"\"\"\n cursor = cursor or self.db\n if self.load_team(cursor):\n self.unlink_team(cursor)\n cursor.run(\"DELETE FROM claims WHERE package_id=%s\", (self.id,))\n cursor.run( \"DELETE FROM packages WHERE package_manager=%s AND name=%s\"\n , (self.package_manager, self.name)\n )\n", "path": "gratipay/models/package/__init__.py"}, {"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport uuid\n\nfrom gratipay.models.team import Team as _Team\n\n\nclass Team(object):\n \"\"\"A :py:class:`~gratipay.models.package.Package` can have a\n :py:class:`~gratipay.models.team.Team` associated with it.\n \"\"\"\n\n @property\n def team(self):\n \"\"\"A computed attribute, the :py:class:`~gratipay.models.team.Team`\n linked to this package if there is one, otherwise ``None``. 
Makes a\n database call.\n \"\"\"\n return self.load_team(self.db)\n\n\n def load_team(self, cursor):\n \"\"\"Given a database cursor, return a\n :py:class:`~gratipay.models.team.Team` if there is one linked to this\n package, or ``None`` if not.\n \"\"\"\n return cursor.one( 'SELECT t.*::teams FROM teams t WHERE t.id='\n '(SELECT team_id FROM teams_to_packages tp WHERE tp.package_id=%s)'\n , (self.id,)\n )\n\n\n def get_or_create_linked_team(self, cursor, owner):\n \"\"\"Given a db cursor and a :py:class:`Participant`, return a\n :py:class:`~gratipay.models.team.Team`.\n \"\"\"\n team = self.load_team(cursor)\n if team:\n return team\n\n def slug_options():\n # Having analyzed existing names, we should never get `@` without\n # `/`. Be conservative in what we accept! Oh, wait ...\n base_name = self.name.split('/')[1] if self.name.startswith('@') else self.name\n yield base_name\n for i in range(1, 10):\n yield '{}-{}'.format(base_name, i)\n yield uuid.uuid4().hex\n\n for slug in slug_options():\n if cursor.one('SELECT count(*) FROM teams WHERE slug=%s', (slug,)) == 0:\n break\n\n team = _Team.insert( slug=slug\n , slug_lower=slug.lower()\n , name=slug\n , homepage='https://www.npmjs.com/package/' + self.name\n , product_or_service=self.description\n , owner=owner\n , _cursor=cursor\n )\n cursor.run('INSERT INTO teams_to_packages (team_id, package_id) '\n 'VALUES (%s, %s)', (team.id, self.id))\n self.app.add_event( cursor\n , 'package'\n , dict(id=self.id, action='link', values=dict(team_id=team.id))\n )\n return team\n\n\n def unlink_team(self, cursor):\n \"\"\"Given a db cursor, unlink the team associated with this package\n (it's a bug if called with no team linked).\n \"\"\"\n team = self.load_team(cursor)\n assert team is not None # sanity check\n cursor.run('DELETE FROM teams_to_packages WHERE package_id=%s', (self.id,))\n self.app.add_event( cursor\n , 'package'\n , dict(id=self.id, action='unlink', values=dict(team_id=team.id))\n )\n", "path": "gratipay/models/package/team.py"}]} | 2,449 | 439 |
gh_patches_debug_611 | rasdani/github-patches | git_diff | pex-tool__pex-1251 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.31
On the docket:
+ [x] When Pex is run from a Pex PEX its isolation is broken. #1232
+ [x] The `--venv` mode `pex` script does not have a `__name__ == '__main__'` guard breaking multiprocessing. #1236
+ [x] The `--seed` mode for a `--venv` PEX is unsafe. #1239
+ [x] The venv `pex` script handles entrypoint functions differently from PEX. #1241
+ [x] Interpreter identification leaks an unconstrained `$PWD` entry into `sys.path`. #1231
+ [x] Support control of venv creation mode `--copies` vs. `--symlinks` #1230
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.30"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.30"
+__version__ = "2.1.31"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.30\"\n+__version__ = \"2.1.31\"\n", "issue": "Release 2.1.31\nOn the docket:\r\n+ [x] When Pex is run from a Pex PEX its isolation is broken. #1232\r\n+ [x] The `--venv` mode `pex` script does not have a `__name__ == '__main__'` guard breaking multiprocessing. #1236\r\n+ [x] The `--seed` mode for a `--venv` PEX is unsafe. #1239\r\n+ [x] The venv `pex` script handles entrypoint functions differently from PEX. #1241\r\n+ [x] Interpreter identification leaks an unconstrained `$PWD` entry into `sys.path`. #1231\r\n+ [x] Support control of venv creation mode `--copies` vs. `--symlinks` #1230\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.30\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.31\"\n", "path": "pex/version.py"}]} | 496 | 96 |
gh_patches_debug_11337 | rasdani/github-patches | git_diff | numba__numba-3402 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Negative indexing unsupported in `tuple` type.
- [x] I am using the latest released version of Numba (most recent is visible in
the change log (https://github.com/numba/numba/blob/master/CHANGE_LOG).
- [x] I have included below a minimal working reproducer (if you are unsure how
to write one see http://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports).
Negative indexing unsupported in `tuple` type.
From https://stackoverflow.com/questions/52777695/negative-dynamic-index-in-numba
```python
from numba import njit
@njit
def test_fn():
tup = (3,2,4,6,2)
total = 0
for idx in range(5):
total += tup[-idx]
return total
test_fn()
```
raises `IndexError: tuple index out of range`
--- END ISSUE ---
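As a hedged aside (my own sketch, not part of the report above): until negative tuple indices compile, the usual workaround is to fold the index into the non-negative range by hand, which the existing tuple getitem already supports.
```python
from numba import njit

@njit
def total_reversed():
    tup = (3, 2, 4, 6, 2)
    total = 0
    n = len(tup)
    for idx in range(n):
        # equivalent of tup[-idx], expressed as an always-non-negative index
        total += tup[(n - idx) % n]
    return total

print(total_reversed())  # 17
```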
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `numba/targets/tupleobj.py`
Content:
```
1 """
2 Implementation of tuple objects
3 """
4
5 from llvmlite import ir
6 import llvmlite.llvmpy.core as lc
7 import operator
8
9 from .imputils import (lower_builtin, lower_getattr_generic, lower_cast,
10 lower_constant,
11 iternext_impl, impl_ret_borrowed, impl_ret_untracked)
12 from .. import typing, types, cgutils
13 from ..extending import overload_method
14
15
16 @lower_builtin(types.NamedTupleClass, types.VarArg(types.Any))
17 def namedtuple_constructor(context, builder, sig, args):
18 # A namedtuple has the same representation as a regular tuple
19 res = context.make_tuple(builder, sig.return_type, args)
20 # The tuple's contents are borrowed
21 return impl_ret_borrowed(context, builder, sig.return_type, res)
22
23 @lower_builtin(operator.add, types.BaseTuple, types.BaseTuple)
24 def tuple_add(context, builder, sig, args):
25 left, right = [cgutils.unpack_tuple(builder, x) for x in args]
26 res = context.make_tuple(builder, sig.return_type, left + right)
27 # The tuple's contents are borrowed
28 return impl_ret_borrowed(context, builder, sig.return_type, res)
29
30 def tuple_cmp_ordered(context, builder, op, sig, args):
31 tu, tv = sig.args
32 u, v = args
33 res = cgutils.alloca_once_value(builder, cgutils.true_bit)
34 bbend = builder.append_basic_block("cmp_end")
35 for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):
36 a = builder.extract_value(u, i)
37 b = builder.extract_value(v, i)
38 not_equal = context.generic_compare(builder, operator.ne, (ta, tb), (a, b))
39 with builder.if_then(not_equal):
40 pred = context.generic_compare(builder, op, (ta, tb), (a, b))
41 builder.store(pred, res)
42 builder.branch(bbend)
43 # Everything matched equal => compare lengths
44 len_compare = op(len(tu.types), len(tv.types))
45 pred = context.get_constant(types.boolean, len_compare)
46 builder.store(pred, res)
47 builder.branch(bbend)
48 builder.position_at_end(bbend)
49 return builder.load(res)
50
51 @lower_builtin(operator.eq, types.BaseTuple, types.BaseTuple)
52 def tuple_eq(context, builder, sig, args):
53 tu, tv = sig.args
54 u, v = args
55 if len(tu.types) != len(tv.types):
56 res = context.get_constant(types.boolean, False)
57 return impl_ret_untracked(context, builder, sig.return_type, res)
58 res = context.get_constant(types.boolean, True)
59 for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):
60 a = builder.extract_value(u, i)
61 b = builder.extract_value(v, i)
62 pred = context.generic_compare(builder, operator.eq, (ta, tb), (a, b))
63 res = builder.and_(res, pred)
64 return impl_ret_untracked(context, builder, sig.return_type, res)
65
66 @lower_builtin(operator.ne, types.BaseTuple, types.BaseTuple)
67 def tuple_ne(context, builder, sig, args):
68 res = builder.not_(tuple_eq(context, builder, sig, args))
69 return impl_ret_untracked(context, builder, sig.return_type, res)
70
71 @lower_builtin(operator.lt, types.BaseTuple, types.BaseTuple)
72 def tuple_lt(context, builder, sig, args):
73 res = tuple_cmp_ordered(context, builder, operator.lt, sig, args)
74 return impl_ret_untracked(context, builder, sig.return_type, res)
75
76 @lower_builtin(operator.le, types.BaseTuple, types.BaseTuple)
77 def tuple_le(context, builder, sig, args):
78 res = tuple_cmp_ordered(context, builder, operator.le, sig, args)
79 return impl_ret_untracked(context, builder, sig.return_type, res)
80
81 @lower_builtin(operator.gt, types.BaseTuple, types.BaseTuple)
82 def tuple_gt(context, builder, sig, args):
83 res = tuple_cmp_ordered(context, builder, operator.gt, sig, args)
84 return impl_ret_untracked(context, builder, sig.return_type, res)
85
86 @lower_builtin(operator.ge, types.BaseTuple, types.BaseTuple)
87 def tuple_ge(context, builder, sig, args):
88 res = tuple_cmp_ordered(context, builder, operator.ge, sig, args)
89 return impl_ret_untracked(context, builder, sig.return_type, res)
90
91 @lower_builtin(hash, types.BaseTuple)
92 def hash_tuple(context, builder, sig, args):
93 tupty, = sig.args
94 tup, = args
95 lty = context.get_value_type(sig.return_type)
96
97 h = ir.Constant(lty, 0x345678)
98 mult = ir.Constant(lty, 1000003)
99 n = ir.Constant(lty, len(tupty))
100
101 for i, ty in enumerate(tupty.types):
102 # h = h * mult
103 h = builder.mul(h, mult)
104 val = builder.extract_value(tup, i)
105 hash_impl = context.get_function(hash,
106 typing.signature(sig.return_type, ty))
107 h_val = hash_impl(builder, (val,))
108 # h = h ^ hash(val)
109 h = builder.xor(h, h_val)
110 # Perturb: mult = mult + len(tup)
111 mult = builder.add(mult, n)
112
113 return h
114
115
116 @lower_getattr_generic(types.BaseNamedTuple)
117 def namedtuple_getattr(context, builder, typ, value, attr):
118 """
119 Fetch a namedtuple's field.
120 """
121 index = typ.fields.index(attr)
122 res = builder.extract_value(value, index)
123 return impl_ret_borrowed(context, builder, typ[index], res)
124
125
126 @lower_constant(types.UniTuple)
127 @lower_constant(types.NamedUniTuple)
128 def unituple_constant(context, builder, ty, pyval):
129 """
130 Create a homogeneous tuple constant.
131 """
132 consts = [context.get_constant_generic(builder, ty.dtype, v)
133 for v in pyval]
134 return ir.ArrayType(consts[0].type, len(consts))(consts)
135
136 @lower_constant(types.Tuple)
137 @lower_constant(types.NamedTuple)
138 def unituple_constant(context, builder, ty, pyval):
139 """
140 Create a heterogeneous tuple constant.
141 """
142 consts = [context.get_constant_generic(builder, ty.types[i], v)
143 for i, v in enumerate(pyval)]
144 return ir.Constant.literal_struct(consts)
145
146
147 #------------------------------------------------------------------------------
148 # Tuple iterators
149
150 @lower_builtin('getiter', types.UniTuple)
151 @lower_builtin('getiter', types.NamedUniTuple)
152 def getiter_unituple(context, builder, sig, args):
153 [tupty] = sig.args
154 [tup] = args
155
156 iterval = context.make_helper(builder, types.UniTupleIter(tupty))
157
158 index0 = context.get_constant(types.intp, 0)
159 indexptr = cgutils.alloca_once(builder, index0.type)
160 builder.store(index0, indexptr)
161
162 iterval.index = indexptr
163 iterval.tuple = tup
164
165 res = iterval._getvalue()
166 return impl_ret_borrowed(context, builder, sig.return_type, res)
167
168
169 @lower_builtin('iternext', types.UniTupleIter)
170 @iternext_impl
171 def iternext_unituple(context, builder, sig, args, result):
172 [tupiterty] = sig.args
173 [tupiter] = args
174
175 iterval = context.make_helper(builder, tupiterty, value=tupiter)
176
177 tup = iterval.tuple
178 idxptr = iterval.index
179 idx = builder.load(idxptr)
180 count = context.get_constant(types.intp, tupiterty.container.count)
181
182 is_valid = builder.icmp(lc.ICMP_SLT, idx, count)
183 result.set_valid(is_valid)
184
185 with builder.if_then(is_valid):
186 getitem_sig = typing.signature(tupiterty.container.dtype,
187 tupiterty.container,
188 types.intp)
189 getitem_out = getitem_unituple(context, builder, getitem_sig,
190 [tup, idx])
191 result.yield_(getitem_out)
192 nidx = builder.add(idx, context.get_constant(types.intp, 1))
193 builder.store(nidx, iterval.index)
194
195
196 @lower_builtin('getitem', types.UniTuple, types.intp)
197 @lower_builtin('getitem', types.NamedUniTuple, types.intp)
198 def getitem_unituple(context, builder, sig, args):
199 tupty, _ = sig.args
200 tup, idx = args
201
202 errmsg_oob = ("tuple index out of range",)
203
204 if len(tupty) == 0:
205 # Empty tuple.
206
207 # Always branch and raise IndexError
208 with builder.if_then(cgutils.true_bit):
209 context.call_conv.return_user_exc(builder, IndexError,
210 errmsg_oob)
211 # This is unreachable in runtime,
212 # but it exists to not terminate the current basicblock.
213 res = context.get_constant_null(sig.return_type)
214 return impl_ret_untracked(context, builder,
215 sig.return_type, res)
216 else:
217 # The tuple is not empty
218 bbelse = builder.append_basic_block("switch.else")
219 bbend = builder.append_basic_block("switch.end")
220 switch = builder.switch(idx, bbelse)
221
222 with builder.goto_block(bbelse):
223 context.call_conv.return_user_exc(builder, IndexError,
224 errmsg_oob)
225
226 lrtty = context.get_value_type(tupty.dtype)
227 with builder.goto_block(bbend):
228 phinode = builder.phi(lrtty)
229
230 for i in range(tupty.count):
231 ki = context.get_constant(types.intp, i)
232 bbi = builder.append_basic_block("switch.%d" % i)
233 switch.add_case(ki, bbi)
234 with builder.goto_block(bbi):
235 value = builder.extract_value(tup, i)
236 builder.branch(bbend)
237 phinode.add_incoming(value, bbi)
238
239 builder.position_at_end(bbend)
240 res = phinode
241 assert sig.return_type == tupty.dtype
242 return impl_ret_borrowed(context, builder, sig.return_type, res)
243
244
245 @lower_builtin('static_getitem', types.BaseTuple, types.Const)
246 def static_getitem_tuple(context, builder, sig, args):
247 tupty, _ = sig.args
248 tup, idx = args
249 if isinstance(idx, int):
250 if idx < 0:
251 idx += len(tupty)
252 if not 0 <= idx < len(tupty):
253 raise IndexError("cannot index at %d in %s" % (idx, tupty))
254 res = builder.extract_value(tup, idx)
255 elif isinstance(idx, slice):
256 items = cgutils.unpack_tuple(builder, tup)[idx]
257 res = context.make_tuple(builder, sig.return_type, items)
258 else:
259 raise NotImplementedError("unexpected index %r for %s"
260 % (idx, sig.args[0]))
261 return impl_ret_borrowed(context, builder, sig.return_type, res)
262
263
264 #------------------------------------------------------------------------------
265 # Implicit conversion
266
267 @lower_cast(types.BaseTuple, types.BaseTuple)
268 def tuple_to_tuple(context, builder, fromty, toty, val):
269 if (isinstance(fromty, types.BaseNamedTuple)
270 or isinstance(toty, types.BaseNamedTuple)):
271 # Disallowed by typing layer
272 raise NotImplementedError
273
274 if len(fromty) != len(toty):
275 # Disallowed by typing layer
276 raise NotImplementedError
277
278 olditems = cgutils.unpack_tuple(builder, val, len(fromty))
279 items = [context.cast(builder, v, f, t)
280 for v, f, t in zip(olditems, fromty, toty)]
281 return context.make_tuple(builder, toty, items)
282
283
284 #------------------------------------------------------------------------------
285 # Methods
286
287 @overload_method(types.BaseTuple, 'index')
288 def tuple_index(tup, value):
289
290 def tuple_index_impl(tup, value):
291 for i in range(len(tup)):
292 if tup[i] == value:
293 return i
294 raise ValueError("tuple.index(x): x not in tuple")
295
296 return tuple_index_impl
297
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/numba/targets/tupleobj.py b/numba/targets/tupleobj.py
--- a/numba/targets/tupleobj.py
+++ b/numba/targets/tupleobj.py
@@ -231,6 +231,10 @@
ki = context.get_constant(types.intp, i)
bbi = builder.append_basic_block("switch.%d" % i)
switch.add_case(ki, bbi)
+ # handle negative indexing, create case (-tuple.count + i) to
+ # reference same block as i
+ kin = context.get_constant(types.intp, -tupty.count + i)
+ switch.add_case(kin, bbi)
with builder.goto_block(bbi):
value = builder.extract_value(tup, i)
builder.branch(bbend)
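A rough model of what the extra switch case buys (my own illustration, not part of the patch): for a tuple of `count` elements, index `i - count` now dispatches to the same block as `i`, which matches Python's in-range negative indexing.
```python
# Conceptual mapping of switch cases to selected elements for count == 5 (illustrative only).
count = 5
cases = {}
for i in range(count):
    cases[i] = i          # pre-existing non-negative case
    cases[i - count] = i  # negative case introduced by the patch
assert cases[-1] == 4 and cases[-5] == 0
```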
| {"golden_diff": "diff --git a/numba/targets/tupleobj.py b/numba/targets/tupleobj.py\n--- a/numba/targets/tupleobj.py\n+++ b/numba/targets/tupleobj.py\n@@ -231,6 +231,10 @@\n ki = context.get_constant(types.intp, i)\n bbi = builder.append_basic_block(\"switch.%d\" % i)\n switch.add_case(ki, bbi)\n+ # handle negative indexing, create case (-tuple.count + i) to\n+ # reference same block as i\n+ kin = context.get_constant(types.intp, -tupty.count + i)\n+ switch.add_case(kin, bbi)\n with builder.goto_block(bbi):\n value = builder.extract_value(tup, i)\n builder.branch(bbend)\n", "issue": "Negative indexing unsupported in `tuple` type.\n- [x] I am using the latest released version of Numba (most recent is visible in\r\n the change log (https://github.com/numba/numba/blob/master/CHANGE_LOG).\r\n- [x] I have included below a minimal working reproducer (if you are unsure how\r\n to write one see http://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports).\r\n\r\nNegative indexing unsupported in `tuple` type.\r\n\r\nFrom https://stackoverflow.com/questions/52777695/negative-dynamic-index-in-numba\r\n\r\n```python\r\nfrom numba import njit\r\n@njit\r\ndef test_fn():\r\n tup = (3,2,4,6,2)\r\n total = 0\r\n for idx in range(5):\r\n total += tup[-idx]\r\n return total\r\n\r\ntest_fn()\r\n```\r\nraises `IndexError: tuple index out of range`\r\n\n", "before_files": [{"content": "\"\"\"\nImplementation of tuple objects\n\"\"\"\n\nfrom llvmlite import ir\nimport llvmlite.llvmpy.core as lc\nimport operator\n\nfrom .imputils import (lower_builtin, lower_getattr_generic, lower_cast,\n lower_constant,\n iternext_impl, impl_ret_borrowed, impl_ret_untracked)\nfrom .. import typing, types, cgutils\nfrom ..extending import overload_method\n\n\n@lower_builtin(types.NamedTupleClass, types.VarArg(types.Any))\ndef namedtuple_constructor(context, builder, sig, args):\n # A namedtuple has the same representation as a regular tuple\n res = context.make_tuple(builder, sig.return_type, args)\n # The tuple's contents are borrowed\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.add, types.BaseTuple, types.BaseTuple)\ndef tuple_add(context, builder, sig, args):\n left, right = [cgutils.unpack_tuple(builder, x) for x in args]\n res = context.make_tuple(builder, sig.return_type, left + right)\n # The tuple's contents are borrowed\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\ndef tuple_cmp_ordered(context, builder, op, sig, args):\n tu, tv = sig.args\n u, v = args\n res = cgutils.alloca_once_value(builder, cgutils.true_bit)\n bbend = builder.append_basic_block(\"cmp_end\")\n for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):\n a = builder.extract_value(u, i)\n b = builder.extract_value(v, i)\n not_equal = context.generic_compare(builder, operator.ne, (ta, tb), (a, b))\n with builder.if_then(not_equal):\n pred = context.generic_compare(builder, op, (ta, tb), (a, b))\n builder.store(pred, res)\n builder.branch(bbend)\n # Everything matched equal => compare lengths\n len_compare = op(len(tu.types), len(tv.types))\n pred = context.get_constant(types.boolean, len_compare)\n builder.store(pred, res)\n builder.branch(bbend)\n builder.position_at_end(bbend)\n return builder.load(res)\n\n@lower_builtin(operator.eq, types.BaseTuple, types.BaseTuple)\ndef tuple_eq(context, builder, sig, args):\n tu, tv = sig.args\n u, v = args\n if len(tu.types) != len(tv.types):\n res = context.get_constant(types.boolean, False)\n return impl_ret_untracked(context, builder, 
sig.return_type, res)\n res = context.get_constant(types.boolean, True)\n for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):\n a = builder.extract_value(u, i)\n b = builder.extract_value(v, i)\n pred = context.generic_compare(builder, operator.eq, (ta, tb), (a, b))\n res = builder.and_(res, pred)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.ne, types.BaseTuple, types.BaseTuple)\ndef tuple_ne(context, builder, sig, args):\n res = builder.not_(tuple_eq(context, builder, sig, args))\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.lt, types.BaseTuple, types.BaseTuple)\ndef tuple_lt(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.lt, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.le, types.BaseTuple, types.BaseTuple)\ndef tuple_le(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.le, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.gt, types.BaseTuple, types.BaseTuple)\ndef tuple_gt(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.gt, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.ge, types.BaseTuple, types.BaseTuple)\ndef tuple_ge(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.ge, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(hash, types.BaseTuple)\ndef hash_tuple(context, builder, sig, args):\n tupty, = sig.args\n tup, = args\n lty = context.get_value_type(sig.return_type)\n\n h = ir.Constant(lty, 0x345678)\n mult = ir.Constant(lty, 1000003)\n n = ir.Constant(lty, len(tupty))\n\n for i, ty in enumerate(tupty.types):\n # h = h * mult\n h = builder.mul(h, mult)\n val = builder.extract_value(tup, i)\n hash_impl = context.get_function(hash,\n typing.signature(sig.return_type, ty))\n h_val = hash_impl(builder, (val,))\n # h = h ^ hash(val)\n h = builder.xor(h, h_val)\n # Perturb: mult = mult + len(tup)\n mult = builder.add(mult, n)\n\n return h\n\n\n@lower_getattr_generic(types.BaseNamedTuple)\ndef namedtuple_getattr(context, builder, typ, value, attr):\n \"\"\"\n Fetch a namedtuple's field.\n \"\"\"\n index = typ.fields.index(attr)\n res = builder.extract_value(value, index)\n return impl_ret_borrowed(context, builder, typ[index], res)\n\n\n@lower_constant(types.UniTuple)\n@lower_constant(types.NamedUniTuple)\ndef unituple_constant(context, builder, ty, pyval):\n \"\"\"\n Create a homogeneous tuple constant.\n \"\"\"\n consts = [context.get_constant_generic(builder, ty.dtype, v)\n for v in pyval]\n return ir.ArrayType(consts[0].type, len(consts))(consts)\n\n@lower_constant(types.Tuple)\n@lower_constant(types.NamedTuple)\ndef unituple_constant(context, builder, ty, pyval):\n \"\"\"\n Create a heterogeneous tuple constant.\n \"\"\"\n consts = [context.get_constant_generic(builder, ty.types[i], v)\n for i, v in enumerate(pyval)]\n return ir.Constant.literal_struct(consts)\n\n\n#------------------------------------------------------------------------------\n# Tuple iterators\n\n@lower_builtin('getiter', types.UniTuple)\n@lower_builtin('getiter', types.NamedUniTuple)\ndef getiter_unituple(context, builder, sig, args):\n [tupty] = sig.args\n [tup] = args\n\n iterval = context.make_helper(builder, types.UniTupleIter(tupty))\n\n index0 = 
context.get_constant(types.intp, 0)\n indexptr = cgutils.alloca_once(builder, index0.type)\n builder.store(index0, indexptr)\n\n iterval.index = indexptr\n iterval.tuple = tup\n\n res = iterval._getvalue()\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n@lower_builtin('iternext', types.UniTupleIter)\n@iternext_impl\ndef iternext_unituple(context, builder, sig, args, result):\n [tupiterty] = sig.args\n [tupiter] = args\n\n iterval = context.make_helper(builder, tupiterty, value=tupiter)\n\n tup = iterval.tuple\n idxptr = iterval.index\n idx = builder.load(idxptr)\n count = context.get_constant(types.intp, tupiterty.container.count)\n\n is_valid = builder.icmp(lc.ICMP_SLT, idx, count)\n result.set_valid(is_valid)\n\n with builder.if_then(is_valid):\n getitem_sig = typing.signature(tupiterty.container.dtype,\n tupiterty.container,\n types.intp)\n getitem_out = getitem_unituple(context, builder, getitem_sig,\n [tup, idx])\n result.yield_(getitem_out)\n nidx = builder.add(idx, context.get_constant(types.intp, 1))\n builder.store(nidx, iterval.index)\n\n\n@lower_builtin('getitem', types.UniTuple, types.intp)\n@lower_builtin('getitem', types.NamedUniTuple, types.intp)\ndef getitem_unituple(context, builder, sig, args):\n tupty, _ = sig.args\n tup, idx = args\n\n errmsg_oob = (\"tuple index out of range\",)\n\n if len(tupty) == 0:\n # Empty tuple.\n\n # Always branch and raise IndexError\n with builder.if_then(cgutils.true_bit):\n context.call_conv.return_user_exc(builder, IndexError,\n errmsg_oob)\n # This is unreachable in runtime,\n # but it exists to not terminate the current basicblock.\n res = context.get_constant_null(sig.return_type)\n return impl_ret_untracked(context, builder,\n sig.return_type, res)\n else:\n # The tuple is not empty\n bbelse = builder.append_basic_block(\"switch.else\")\n bbend = builder.append_basic_block(\"switch.end\")\n switch = builder.switch(idx, bbelse)\n\n with builder.goto_block(bbelse):\n context.call_conv.return_user_exc(builder, IndexError,\n errmsg_oob)\n\n lrtty = context.get_value_type(tupty.dtype)\n with builder.goto_block(bbend):\n phinode = builder.phi(lrtty)\n\n for i in range(tupty.count):\n ki = context.get_constant(types.intp, i)\n bbi = builder.append_basic_block(\"switch.%d\" % i)\n switch.add_case(ki, bbi)\n with builder.goto_block(bbi):\n value = builder.extract_value(tup, i)\n builder.branch(bbend)\n phinode.add_incoming(value, bbi)\n\n builder.position_at_end(bbend)\n res = phinode\n assert sig.return_type == tupty.dtype\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n@lower_builtin('static_getitem', types.BaseTuple, types.Const)\ndef static_getitem_tuple(context, builder, sig, args):\n tupty, _ = sig.args\n tup, idx = args\n if isinstance(idx, int):\n if idx < 0:\n idx += len(tupty)\n if not 0 <= idx < len(tupty):\n raise IndexError(\"cannot index at %d in %s\" % (idx, tupty))\n res = builder.extract_value(tup, idx)\n elif isinstance(idx, slice):\n items = cgutils.unpack_tuple(builder, tup)[idx]\n res = context.make_tuple(builder, sig.return_type, items)\n else:\n raise NotImplementedError(\"unexpected index %r for %s\"\n % (idx, sig.args[0]))\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n#------------------------------------------------------------------------------\n# Implicit conversion\n\n@lower_cast(types.BaseTuple, types.BaseTuple)\ndef tuple_to_tuple(context, builder, fromty, toty, val):\n if (isinstance(fromty, types.BaseNamedTuple)\n or isinstance(toty, 
types.BaseNamedTuple)):\n # Disallowed by typing layer\n raise NotImplementedError\n\n if len(fromty) != len(toty):\n # Disallowed by typing layer\n raise NotImplementedError\n\n olditems = cgutils.unpack_tuple(builder, val, len(fromty))\n items = [context.cast(builder, v, f, t)\n for v, f, t in zip(olditems, fromty, toty)]\n return context.make_tuple(builder, toty, items)\n\n\n#------------------------------------------------------------------------------\n# Methods\n\n@overload_method(types.BaseTuple, 'index')\ndef tuple_index(tup, value):\n\n def tuple_index_impl(tup, value):\n for i in range(len(tup)):\n if tup[i] == value:\n return i\n raise ValueError(\"tuple.index(x): x not in tuple\")\n\n return tuple_index_impl\n", "path": "numba/targets/tupleobj.py"}], "after_files": [{"content": "\"\"\"\nImplementation of tuple objects\n\"\"\"\n\nfrom llvmlite import ir\nimport llvmlite.llvmpy.core as lc\nimport operator\n\nfrom .imputils import (lower_builtin, lower_getattr_generic, lower_cast,\n lower_constant,\n iternext_impl, impl_ret_borrowed, impl_ret_untracked)\nfrom .. import typing, types, cgutils\nfrom ..extending import overload_method\n\n\n@lower_builtin(types.NamedTupleClass, types.VarArg(types.Any))\ndef namedtuple_constructor(context, builder, sig, args):\n # A namedtuple has the same representation as a regular tuple\n res = context.make_tuple(builder, sig.return_type, args)\n # The tuple's contents are borrowed\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.add, types.BaseTuple, types.BaseTuple)\ndef tuple_add(context, builder, sig, args):\n left, right = [cgutils.unpack_tuple(builder, x) for x in args]\n res = context.make_tuple(builder, sig.return_type, left + right)\n # The tuple's contents are borrowed\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\ndef tuple_cmp_ordered(context, builder, op, sig, args):\n tu, tv = sig.args\n u, v = args\n res = cgutils.alloca_once_value(builder, cgutils.true_bit)\n bbend = builder.append_basic_block(\"cmp_end\")\n for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):\n a = builder.extract_value(u, i)\n b = builder.extract_value(v, i)\n not_equal = context.generic_compare(builder, operator.ne, (ta, tb), (a, b))\n with builder.if_then(not_equal):\n pred = context.generic_compare(builder, op, (ta, tb), (a, b))\n builder.store(pred, res)\n builder.branch(bbend)\n # Everything matched equal => compare lengths\n len_compare = op(len(tu.types), len(tv.types))\n pred = context.get_constant(types.boolean, len_compare)\n builder.store(pred, res)\n builder.branch(bbend)\n builder.position_at_end(bbend)\n return builder.load(res)\n\n@lower_builtin(operator.eq, types.BaseTuple, types.BaseTuple)\ndef tuple_eq(context, builder, sig, args):\n tu, tv = sig.args\n u, v = args\n if len(tu.types) != len(tv.types):\n res = context.get_constant(types.boolean, False)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n res = context.get_constant(types.boolean, True)\n for i, (ta, tb) in enumerate(zip(tu.types, tv.types)):\n a = builder.extract_value(u, i)\n b = builder.extract_value(v, i)\n pred = context.generic_compare(builder, operator.eq, (ta, tb), (a, b))\n res = builder.and_(res, pred)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.ne, types.BaseTuple, types.BaseTuple)\ndef tuple_ne(context, builder, sig, args):\n res = builder.not_(tuple_eq(context, builder, sig, args))\n return impl_ret_untracked(context, builder, 
sig.return_type, res)\n\n@lower_builtin(operator.lt, types.BaseTuple, types.BaseTuple)\ndef tuple_lt(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.lt, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.le, types.BaseTuple, types.BaseTuple)\ndef tuple_le(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.le, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.gt, types.BaseTuple, types.BaseTuple)\ndef tuple_gt(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.gt, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(operator.ge, types.BaseTuple, types.BaseTuple)\ndef tuple_ge(context, builder, sig, args):\n res = tuple_cmp_ordered(context, builder, operator.ge, sig, args)\n return impl_ret_untracked(context, builder, sig.return_type, res)\n\n@lower_builtin(hash, types.BaseTuple)\ndef hash_tuple(context, builder, sig, args):\n tupty, = sig.args\n tup, = args\n lty = context.get_value_type(sig.return_type)\n\n h = ir.Constant(lty, 0x345678)\n mult = ir.Constant(lty, 1000003)\n n = ir.Constant(lty, len(tupty))\n\n for i, ty in enumerate(tupty.types):\n # h = h * mult\n h = builder.mul(h, mult)\n val = builder.extract_value(tup, i)\n hash_impl = context.get_function(hash,\n typing.signature(sig.return_type, ty))\n h_val = hash_impl(builder, (val,))\n # h = h ^ hash(val)\n h = builder.xor(h, h_val)\n # Perturb: mult = mult + len(tup)\n mult = builder.add(mult, n)\n\n return h\n\n\n@lower_getattr_generic(types.BaseNamedTuple)\ndef namedtuple_getattr(context, builder, typ, value, attr):\n \"\"\"\n Fetch a namedtuple's field.\n \"\"\"\n index = typ.fields.index(attr)\n res = builder.extract_value(value, index)\n return impl_ret_borrowed(context, builder, typ[index], res)\n\n\n@lower_constant(types.UniTuple)\n@lower_constant(types.NamedUniTuple)\ndef unituple_constant(context, builder, ty, pyval):\n \"\"\"\n Create a homogeneous tuple constant.\n \"\"\"\n consts = [context.get_constant_generic(builder, ty.dtype, v)\n for v in pyval]\n return ir.ArrayType(consts[0].type, len(consts))(consts)\n\n@lower_constant(types.Tuple)\n@lower_constant(types.NamedTuple)\ndef unituple_constant(context, builder, ty, pyval):\n \"\"\"\n Create a heterogeneous tuple constant.\n \"\"\"\n consts = [context.get_constant_generic(builder, ty.types[i], v)\n for i, v in enumerate(pyval)]\n return ir.Constant.literal_struct(consts)\n\n\n#------------------------------------------------------------------------------\n# Tuple iterators\n\n@lower_builtin('getiter', types.UniTuple)\n@lower_builtin('getiter', types.NamedUniTuple)\ndef getiter_unituple(context, builder, sig, args):\n [tupty] = sig.args\n [tup] = args\n\n iterval = context.make_helper(builder, types.UniTupleIter(tupty))\n\n index0 = context.get_constant(types.intp, 0)\n indexptr = cgutils.alloca_once(builder, index0.type)\n builder.store(index0, indexptr)\n\n iterval.index = indexptr\n iterval.tuple = tup\n\n res = iterval._getvalue()\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n@lower_builtin('iternext', types.UniTupleIter)\n@iternext_impl\ndef iternext_unituple(context, builder, sig, args, result):\n [tupiterty] = sig.args\n [tupiter] = args\n\n iterval = context.make_helper(builder, tupiterty, value=tupiter)\n\n tup = iterval.tuple\n idxptr = iterval.index\n idx = builder.load(idxptr)\n 
count = context.get_constant(types.intp, tupiterty.container.count)\n\n is_valid = builder.icmp(lc.ICMP_SLT, idx, count)\n result.set_valid(is_valid)\n\n with builder.if_then(is_valid):\n getitem_sig = typing.signature(tupiterty.container.dtype,\n tupiterty.container,\n types.intp)\n getitem_out = getitem_unituple(context, builder, getitem_sig,\n [tup, idx])\n result.yield_(getitem_out)\n nidx = builder.add(idx, context.get_constant(types.intp, 1))\n builder.store(nidx, iterval.index)\n\n\n@lower_builtin('getitem', types.UniTuple, types.intp)\n@lower_builtin('getitem', types.NamedUniTuple, types.intp)\ndef getitem_unituple(context, builder, sig, args):\n tupty, _ = sig.args\n tup, idx = args\n\n errmsg_oob = (\"tuple index out of range\",)\n\n if len(tupty) == 0:\n # Empty tuple.\n\n # Always branch and raise IndexError\n with builder.if_then(cgutils.true_bit):\n context.call_conv.return_user_exc(builder, IndexError,\n errmsg_oob)\n # This is unreachable in runtime,\n # but it exists to not terminate the current basicblock.\n res = context.get_constant_null(sig.return_type)\n return impl_ret_untracked(context, builder,\n sig.return_type, res)\n else:\n # The tuple is not empty\n bbelse = builder.append_basic_block(\"switch.else\")\n bbend = builder.append_basic_block(\"switch.end\")\n switch = builder.switch(idx, bbelse)\n\n with builder.goto_block(bbelse):\n context.call_conv.return_user_exc(builder, IndexError,\n errmsg_oob)\n\n lrtty = context.get_value_type(tupty.dtype)\n with builder.goto_block(bbend):\n phinode = builder.phi(lrtty)\n\n for i in range(tupty.count):\n ki = context.get_constant(types.intp, i)\n bbi = builder.append_basic_block(\"switch.%d\" % i)\n switch.add_case(ki, bbi)\n # handle negative indexing, create case (-tuple.count + i) to\n # reference same block as i\n kin = context.get_constant(types.intp, -tupty.count + i)\n switch.add_case(kin, bbi)\n with builder.goto_block(bbi):\n value = builder.extract_value(tup, i)\n builder.branch(bbend)\n phinode.add_incoming(value, bbi)\n\n builder.position_at_end(bbend)\n res = phinode\n assert sig.return_type == tupty.dtype\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n@lower_builtin('static_getitem', types.BaseTuple, types.Const)\ndef static_getitem_tuple(context, builder, sig, args):\n tupty, _ = sig.args\n tup, idx = args\n if isinstance(idx, int):\n if idx < 0:\n idx += len(tupty)\n if not 0 <= idx < len(tupty):\n raise IndexError(\"cannot index at %d in %s\" % (idx, tupty))\n res = builder.extract_value(tup, idx)\n elif isinstance(idx, slice):\n items = cgutils.unpack_tuple(builder, tup)[idx]\n res = context.make_tuple(builder, sig.return_type, items)\n else:\n raise NotImplementedError(\"unexpected index %r for %s\"\n % (idx, sig.args[0]))\n return impl_ret_borrowed(context, builder, sig.return_type, res)\n\n\n#------------------------------------------------------------------------------\n# Implicit conversion\n\n@lower_cast(types.BaseTuple, types.BaseTuple)\ndef tuple_to_tuple(context, builder, fromty, toty, val):\n if (isinstance(fromty, types.BaseNamedTuple)\n or isinstance(toty, types.BaseNamedTuple)):\n # Disallowed by typing layer\n raise NotImplementedError\n\n if len(fromty) != len(toty):\n # Disallowed by typing layer\n raise NotImplementedError\n\n olditems = cgutils.unpack_tuple(builder, val, len(fromty))\n items = [context.cast(builder, v, f, t)\n for v, f, t in zip(olditems, fromty, toty)]\n return context.make_tuple(builder, toty, 
items)\n\n\n#------------------------------------------------------------------------------\n# Methods\n\n@overload_method(types.BaseTuple, 'index')\ndef tuple_index(tup, value):\n\n def tuple_index_impl(tup, value):\n for i in range(len(tup)):\n if tup[i] == value:\n return i\n raise ValueError(\"tuple.index(x): x not in tuple\")\n\n return tuple_index_impl\n", "path": "numba/targets/tupleobj.py"}]} | 3,887 | 181 |
gh_patches_debug_10697 | rasdani/github-patches | git_diff | freedomofpress__securedrop-4884 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade Ansible to the 2.7 series
## Description
SecureDrop currently uses the Ansible 2.6 series, which is approaching end-of-life. Ansible major versions are supported (i.e. receive security patches) for [three major releases](https://docs.ansible.com/ansible/latest/reference_appendices/release_and_maintenance.html).
We should move to Ansible 2.7 to ensure a smooth transition and to allow us to patch rapidly should the need arise.
## User Stories
As a developer, I want to make sure my dependencies are being maintained for security issues.
--- END ISSUE ---
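As a hedged aside (my own sketch, not from the issue): the project pins an exact Ansible series and asserts it at run time, so the upgrade boils down to bumping that pin and the matching check in lockstep, roughly:
```python
# Illustrative only; the exact 2.7.x patch release to pin is decided by the upgrade itself.
import ansible

REQUIRED_SERIES = "2.7."
assert ansible.__version__.startswith(REQUIRED_SERIES), (
    "only the Ansible %s* series is supported" % REQUIRED_SERIES
)
```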
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `install_files/ansible-base/callback_plugins/ansible_version_check.py`
Content:
```
1 # -*- encoding:utf-8 -*-
2 from __future__ import absolute_import, division, print_function, \
3 unicode_literals
4
5 import sys
6
7 import ansible
8
9 try:
10 # Version 2.0+
11 from ansible.plugins.callback import CallbackBase
12 except ImportError:
13 CallbackBase = object
14
15
16 def print_red_bold(text):
17 print('\x1b[31;1m' + text + '\x1b[0m')
18
19
20 class CallbackModule(CallbackBase):
21 def __init__(self):
22 # Can't use `on_X` because this isn't forwards compatible
23 # with Ansible 2.0+
24 required_version = '2.6.19' # Keep synchronized with requirements files
25 if not ansible.__version__.startswith(required_version):
26 print_red_bold(
27 "SecureDrop restriction: only Ansible {version}.*"
28 "is supported."
29 .format(version=required_version)
30 )
31 sys.exit(1)
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/install_files/ansible-base/callback_plugins/ansible_version_check.py b/install_files/ansible-base/callback_plugins/ansible_version_check.py
--- a/install_files/ansible-base/callback_plugins/ansible_version_check.py
+++ b/install_files/ansible-base/callback_plugins/ansible_version_check.py
@@ -21,7 +21,7 @@
def __init__(self):
# Can't use `on_X` because this isn't forwards compatible
# with Ansible 2.0+
- required_version = '2.6.19' # Keep synchronized with requirements files
+ required_version = '2.7.13' # Keep synchronized with requirements files
if not ansible.__version__.startswith(required_version):
print_red_bold(
"SecureDrop restriction: only Ansible {version}.*"
| {"golden_diff": "diff --git a/install_files/ansible-base/callback_plugins/ansible_version_check.py b/install_files/ansible-base/callback_plugins/ansible_version_check.py\n--- a/install_files/ansible-base/callback_plugins/ansible_version_check.py\n+++ b/install_files/ansible-base/callback_plugins/ansible_version_check.py\n@@ -21,7 +21,7 @@\n def __init__(self):\n # Can't use `on_X` because this isn't forwards compatible\n # with Ansible 2.0+\n- required_version = '2.6.19' # Keep synchronized with requirements files\n+ required_version = '2.7.13' # Keep synchronized with requirements files\n if not ansible.__version__.startswith(required_version):\n print_red_bold(\n \"SecureDrop restriction: only Ansible {version}.*\"\n", "issue": "Upgrade Ansble to 2.7 series\n## Description\r\n\r\nSecureDrop currently uses the Ansible 2.6 series which is approaching end-of-life. Ansible major versions are supported (receive security patches) for [three major releases](https://docs.ansible.com/ansible/latest/reference_appendices/release_and_maintenance.html).\r\n\r\nWe should move to Ansible 2.7 to ensure a smooth transition, which would allow us to rapidly patch should there be a need to quickly patch.\r\n\r\n## User Stories\r\n\r\nAs a developer, I want to make sure my dependencies are being maintained for security issues.\r\n\n", "before_files": [{"content": "# -*- encoding:utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, \\\n unicode_literals\n\nimport sys\n\nimport ansible\n\ntry:\n # Version 2.0+\n from ansible.plugins.callback import CallbackBase\nexcept ImportError:\n CallbackBase = object\n\n\ndef print_red_bold(text):\n print('\\x1b[31;1m' + text + '\\x1b[0m')\n\n\nclass CallbackModule(CallbackBase):\n def __init__(self):\n # Can't use `on_X` because this isn't forwards compatible\n # with Ansible 2.0+\n required_version = '2.6.19' # Keep synchronized with requirements files\n if not ansible.__version__.startswith(required_version):\n print_red_bold(\n \"SecureDrop restriction: only Ansible {version}.*\"\n \"is supported.\"\n .format(version=required_version)\n )\n sys.exit(1)\n", "path": "install_files/ansible-base/callback_plugins/ansible_version_check.py"}], "after_files": [{"content": "# -*- encoding:utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, \\\n unicode_literals\n\nimport sys\n\nimport ansible\n\ntry:\n # Version 2.0+\n from ansible.plugins.callback import CallbackBase\nexcept ImportError:\n CallbackBase = object\n\n\ndef print_red_bold(text):\n print('\\x1b[31;1m' + text + '\\x1b[0m')\n\n\nclass CallbackModule(CallbackBase):\n def __init__(self):\n # Can't use `on_X` because this isn't forwards compatible\n # with Ansible 2.0+\n required_version = '2.7.13' # Keep synchronized with requirements files\n if not ansible.__version__.startswith(required_version):\n print_red_bold(\n \"SecureDrop restriction: only Ansible {version}.*\"\n \"is supported.\"\n .format(version=required_version)\n )\n sys.exit(1)\n", "path": "install_files/ansible-base/callback_plugins/ansible_version_check.py"}]} | 652 | 177 |
gh_patches_debug_2651 | rasdani/github-patches | git_diff | cloud-custodian__cloud-custodian-2513 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
S3Output should grant bucket owner full control
Currently `S3Output` is missing the `ACL` option. In a cross-account setup it is desirable to give the bucket owner full control.
Would you change the code as follows?
```
diff --git a/c7n/output.py b/c7n/output.py
index c3839c2f..5fb06f59 100644
--- a/c7n/output.py
+++ b/c7n/output.py
@@ -268,6 +268,7 @@ class S3Output(FSOutput):
self.transfer.upload_file(
os.path.join(root, f), self.bucket, key,
extra_args={
+ 'ACL': 'bucket-owner-full-control',
'ServerSideEncryption': 'AES256'})
def use_s3(self):
```
--- END ISSUE ---
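For reference (my own sketch; the bucket and key names are made up): boto3's `S3Transfer.upload_file` accepts `ACL` alongside `ServerSideEncryption` in `extra_args`, so the requested change is a one-line addition to the existing call:
```python
import boto3
from boto3.s3.transfer import S3Transfer

transfer = S3Transfer(boto3.client("s3"))
transfer.upload_file(
    "custodian-run.log.gz", "example-bucket", "logs/custodian-run.log.gz",
    extra_args={
        "ACL": "bucket-owner-full-control",  # grant the bucket owner full control
        "ServerSideEncryption": "AES256",
    },
)
```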
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `c7n/output.py`
Content:
```
1 # Copyright 2015-2017 Capital One Services, LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """
15 Outputs metrics, logs, structured records across
16 a variety of sources.
17
18 See docs/usage/outputs.rst
19
20 """
21 from __future__ import absolute_import, division, print_function, unicode_literals
22
23 import datetime
24 import gzip
25 import logging
26 import shutil
27 import tempfile
28
29 import os
30
31 from boto3.s3.transfer import S3Transfer
32
33 from c7n.registry import PluginRegistry
34 from c7n.log import CloudWatchLogHandler
35 from c7n.utils import local_session, parse_s3, get_retry
36
37 DEFAULT_NAMESPACE = "CloudMaid"
38
39 log = logging.getLogger('custodian.output')
40
41
42 blob_outputs = PluginRegistry('c7n.blob-outputs')
43
44
45 class MetricsOutput(object):
46 """Send metrics data to cloudwatch
47 """
48
49 permissions = ("cloudWatch:PutMetricData",)
50
51 retry = staticmethod(get_retry(('Throttling',)))
52
53 @staticmethod
54 def select(metrics_enabled):
55 if metrics_enabled:
56 return MetricsOutput
57 return NullMetricsOutput
58
59 def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):
60 self.ctx = ctx
61 self.namespace = namespace
62 self.buf = []
63
64 def get_timestamp(self):
65 """
66 Now, if C7N_METRICS_TZ is set to TRUE, UTC timestamp will be used.
67 For backwards compatibility, if it is not set, UTC will be the default.
68         To disable this and use the system's time zone, C7N_METRICS_TZ should be set to FALSE.
69 """
70
71 if os.getenv("C7N_METRICS_TZ", '').upper() in ('TRUE', ''):
72 return datetime.datetime.utcnow()
73 else:
74 return datetime.datetime.now()
75
76 def flush(self):
77 if self.buf:
78 self._put_metrics(self.namespace, self.buf)
79 self.buf = []
80
81 def put_metric(self, key, value, unit, buffer=False, **dimensions):
82 d = {
83 "MetricName": key,
84 "Timestamp": self.get_timestamp(),
85 "Value": value,
86 "Unit": unit}
87 d["Dimensions"] = [
88 {"Name": "Policy", "Value": self.ctx.policy.name},
89 {"Name": "ResType", "Value": self.ctx.policy.resource_type}]
90 for k, v in dimensions.items():
91 d['Dimensions'].append({"Name": k, "Value": v})
92
93 if buffer:
94 self.buf.append(d)
95 # Max metrics in a single request
96 if len(self.buf) == 20:
97 self.flush()
98 else:
99 self._put_metrics(self.namespace, [d])
100
101 def _put_metrics(self, ns, metrics):
102 watch = local_session(self.ctx.session_factory).client('cloudwatch')
103 return self.retry(
104 watch.put_metric_data, Namespace=ns, MetricData=metrics)
105
106
107 class NullMetricsOutput(MetricsOutput):
108
109 permissions = ()
110
111 def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):
112 super(NullMetricsOutput, self).__init__(ctx, namespace)
113 self.data = []
114
115 def _put_metrics(self, ns, metrics):
116 self.data.append({'Namespace': ns, 'MetricData': metrics})
117 for m in metrics:
118 if m['MetricName'] not in ('ActionTime', 'ResourceTime'):
119 log.debug(self.format_metric(m))
120
121 def format_metric(self, m):
122 label = "metric:%s %s:%s" % (m['MetricName'], m['Unit'], m['Value'])
123 for d in m['Dimensions']:
124 label += " %s:%s" % (d['Name'].lower(), d['Value'].lower())
125 return label
126
127
128 class LogOutput(object):
129
130 log_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
131
132 def __init__(self, ctx):
133 self.ctx = ctx
134
135 def get_handler(self):
136 raise NotImplementedError()
137
138 def __enter__(self):
139 log.debug("Storing output with %s" % repr(self))
140 self.join_log()
141 return self
142
143 def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):
144 self.leave_log()
145 if exc_type is not None:
146 log.exception("Error while executing policy")
147
148 def join_log(self):
149 self.handler = self.get_handler()
150 self.handler.setLevel(logging.DEBUG)
151 self.handler.setFormatter(logging.Formatter(self.log_format))
152 mlog = logging.getLogger('custodian')
153 mlog.addHandler(self.handler)
154
155 def leave_log(self):
156 mlog = logging.getLogger('custodian')
157 mlog.removeHandler(self.handler)
158 self.handler.flush()
159 self.handler.close()
160
161
162 class CloudWatchLogOutput(LogOutput):
163
164 log_format = '%(asctime)s - %(levelname)s - %(name)s - %(message)s'
165
166 def get_handler(self):
167 return CloudWatchLogHandler(
168 log_group=self.ctx.options.log_group,
169 log_stream=self.ctx.policy.name,
170 session_factory=lambda x=None: self.ctx.session_factory(
171 assume=False))
172
173 def __repr__(self):
174 return "<%s to group:%s stream:%s>" % (
175 self.__class__.__name__,
176 self.ctx.options.log_group,
177 self.ctx.policy.name)
178
179
180 class FSOutput(LogOutput):
181
182 @staticmethod
183 def select(path):
184 for k in blob_outputs.keys():
185 if path.startswith('%s://' % k):
186 return blob_outputs[k]
187 # Fall back local disk
188 return blob_outputs['file']
189
190 @staticmethod
191 def join(*parts):
192 return os.path.join(*parts)
193
194 def __init__(self, ctx):
195 super(FSOutput, self).__init__(ctx)
196 self.root_dir = self.ctx.output_path or tempfile.mkdtemp()
197
198 def get_handler(self):
199 return logging.FileHandler(
200 os.path.join(self.root_dir, 'custodian-run.log'))
201
202 def compress(self):
203 # Compress files individually so thats easy to walk them, without
204 # downloading tar and extracting.
205 for root, dirs, files in os.walk(self.root_dir):
206 for f in files:
207 fp = os.path.join(root, f)
208 with gzip.open(fp + ".gz", "wb", compresslevel=7) as zfh:
209 with open(fp, "rb") as sfh:
210 shutil.copyfileobj(sfh, zfh, length=2**15)
211 os.remove(fp)
212
213
214 @blob_outputs.register('file')
215 class DirectoryOutput(FSOutput):
216
217 permissions = ()
218
219 def __init__(self, ctx):
220 super(DirectoryOutput, self).__init__(ctx)
221 if self.root_dir.startswith('file://'):
222 self.root_dir = self.root_dir[len('file://'):]
223 if self.ctx.output_path is not None:
224 if not os.path.exists(self.root_dir):
225 os.makedirs(self.root_dir)
226
227 def __repr__(self):
228 return "<%s to dir:%s>" % (self.__class__.__name__, self.root_dir)
229
230
231 @blob_outputs.register('s3')
232 class S3Output(FSOutput):
233 """
234 Usage:
235
236 .. code-block:: python
237
238 with S3Output(session_factory, 's3://bucket/prefix'):
239 log.info('xyz') # -> log messages sent to custodian-run.log.gz
240
241 """
242
243 permissions = ('S3:PutObject',)
244
245 def __init__(self, ctx):
246 super(S3Output, self).__init__(ctx)
247 self.date_path = datetime.datetime.now().strftime('%Y/%m/%d/%H')
248 self.s3_path, self.bucket, self.key_prefix = parse_s3(
249 self.ctx.output_path)
250 self.root_dir = tempfile.mkdtemp()
251 self.transfer = None
252
253 def __repr__(self):
254 return "<%s to bucket:%s prefix:%s>" % (
255 self.__class__.__name__,
256 self.bucket,
257 "%s/%s" % (self.key_prefix, self.date_path))
258
259 @staticmethod
260 def join(*parts):
261 return "/".join([s.strip('/') for s in parts])
262
263 def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):
264 if exc_type is not None:
265 log.exception("Error while executing policy")
266 log.debug("Uploading policy logs")
267 self.leave_log()
268 self.compress()
269 self.transfer = S3Transfer(
270 self.ctx.session_factory(assume=False).client('s3'))
271 self.upload()
272 shutil.rmtree(self.root_dir)
273 log.debug("Policy Logs uploaded")
274
275 def upload(self):
276 for root, dirs, files in os.walk(self.root_dir):
277 for f in files:
278 key = "%s/%s%s" % (
279 self.key_prefix,
280 self.date_path,
281 "%s/%s" % (
282 root[len(self.root_dir):], f))
283 key = key.strip('/')
284 self.transfer.upload_file(
285 os.path.join(root, f), self.bucket, key,
286 extra_args={
287 'ServerSideEncryption': 'AES256'})
288
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/c7n/output.py b/c7n/output.py
--- a/c7n/output.py
+++ b/c7n/output.py
@@ -284,4 +284,5 @@
self.transfer.upload_file(
os.path.join(root, f), self.bucket, key,
extra_args={
+ 'ACL': 'bucket-owner-full-control',
'ServerSideEncryption': 'AES256'})
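Background on why the ACL matters (my own illustration; the policy is a generic cross-account pattern and the bucket name is made up): destination buckets owned by another account often deny `PutObject` requests that do not grant the owner full control, which is exactly what the added `ACL` entry satisfies.
```python
# Sketch of the kind of bucket policy that motivates the change (illustrative only).
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "RequireOwnerFullControl",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::example-log-bucket/*",
        "Condition": {"StringNotEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}},
    }],
}
boto3.client("s3").put_bucket_policy(Bucket="example-log-bucket", Policy=json.dumps(policy))
```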
| {"golden_diff": "diff --git a/c7n/output.py b/c7n/output.py\n--- a/c7n/output.py\n+++ b/c7n/output.py\n@@ -284,4 +284,5 @@\n self.transfer.upload_file(\n os.path.join(root, f), self.bucket, key,\n extra_args={\n+ 'ACL': 'bucket-owner-full-control',\n 'ServerSideEncryption': 'AES256'})\n", "issue": "S3Output should grant bucket owner full control\nCurrently `S3Output` misses the `ACL` option. In a cross account setup it is desirable to give the bucket owner full control.\r\n\r\nWould you change the code like this?:\r\n```\r\ndiff --git a/c7n/output.py b/c7n/output.py\r\nindex c3839c2f..5fb06f59 100644\r\n--- a/c7n/output.py\r\n+++ b/c7n/output.py\r\n@@ -268,6 +268,7 @@ class S3Output(FSOutput):\r\n self.transfer.upload_file(\r\n os.path.join(root, f), self.bucket, key,\r\n extra_args={\r\n+ 'ACL': 'bucket-owner-full-control',\r\n 'ServerSideEncryption': 'AES256'})\r\n \r\n def use_s3(self):\r\n```\r\n\n", "before_files": [{"content": "# Copyright 2015-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nOutputs metrics, logs, structured records across\na variety of sources.\n\nSee docs/usage/outputs.rst\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport datetime\nimport gzip\nimport logging\nimport shutil\nimport tempfile\n\nimport os\n\nfrom boto3.s3.transfer import S3Transfer\n\nfrom c7n.registry import PluginRegistry\nfrom c7n.log import CloudWatchLogHandler\nfrom c7n.utils import local_session, parse_s3, get_retry\n\nDEFAULT_NAMESPACE = \"CloudMaid\"\n\nlog = logging.getLogger('custodian.output')\n\n\nblob_outputs = PluginRegistry('c7n.blob-outputs')\n\n\nclass MetricsOutput(object):\n \"\"\"Send metrics data to cloudwatch\n \"\"\"\n\n permissions = (\"cloudWatch:PutMetricData\",)\n\n retry = staticmethod(get_retry(('Throttling',)))\n\n @staticmethod\n def select(metrics_enabled):\n if metrics_enabled:\n return MetricsOutput\n return NullMetricsOutput\n\n def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):\n self.ctx = ctx\n self.namespace = namespace\n self.buf = []\n\n def get_timestamp(self):\n \"\"\"\n Now, if C7N_METRICS_TZ is set to TRUE, UTC timestamp will be used.\n For backwards compatibility, if it is not set, UTC will be the default.\n To disable this and use the system's time zone, C7N_METRICS_TZ shoule be set to FALSE.\n \"\"\"\n\n if os.getenv(\"C7N_METRICS_TZ\", '').upper() in ('TRUE', ''):\n return datetime.datetime.utcnow()\n else:\n return datetime.datetime.now()\n\n def flush(self):\n if self.buf:\n self._put_metrics(self.namespace, self.buf)\n self.buf = []\n\n def put_metric(self, key, value, unit, buffer=False, **dimensions):\n d = {\n \"MetricName\": key,\n \"Timestamp\": self.get_timestamp(),\n \"Value\": value,\n \"Unit\": unit}\n d[\"Dimensions\"] = [\n {\"Name\": \"Policy\", \"Value\": self.ctx.policy.name},\n {\"Name\": \"ResType\", \"Value\": self.ctx.policy.resource_type}]\n for k, v in dimensions.items():\n d['Dimensions'].append({\"Name\": k, \"Value\": v})\n\n if 
buffer:\n self.buf.append(d)\n # Max metrics in a single request\n if len(self.buf) == 20:\n self.flush()\n else:\n self._put_metrics(self.namespace, [d])\n\n def _put_metrics(self, ns, metrics):\n watch = local_session(self.ctx.session_factory).client('cloudwatch')\n return self.retry(\n watch.put_metric_data, Namespace=ns, MetricData=metrics)\n\n\nclass NullMetricsOutput(MetricsOutput):\n\n permissions = ()\n\n def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):\n super(NullMetricsOutput, self).__init__(ctx, namespace)\n self.data = []\n\n def _put_metrics(self, ns, metrics):\n self.data.append({'Namespace': ns, 'MetricData': metrics})\n for m in metrics:\n if m['MetricName'] not in ('ActionTime', 'ResourceTime'):\n log.debug(self.format_metric(m))\n\n def format_metric(self, m):\n label = \"metric:%s %s:%s\" % (m['MetricName'], m['Unit'], m['Value'])\n for d in m['Dimensions']:\n label += \" %s:%s\" % (d['Name'].lower(), d['Value'].lower())\n return label\n\n\nclass LogOutput(object):\n\n log_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'\n\n def __init__(self, ctx):\n self.ctx = ctx\n\n def get_handler(self):\n raise NotImplementedError()\n\n def __enter__(self):\n log.debug(\"Storing output with %s\" % repr(self))\n self.join_log()\n return self\n\n def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):\n self.leave_log()\n if exc_type is not None:\n log.exception(\"Error while executing policy\")\n\n def join_log(self):\n self.handler = self.get_handler()\n self.handler.setLevel(logging.DEBUG)\n self.handler.setFormatter(logging.Formatter(self.log_format))\n mlog = logging.getLogger('custodian')\n mlog.addHandler(self.handler)\n\n def leave_log(self):\n mlog = logging.getLogger('custodian')\n mlog.removeHandler(self.handler)\n self.handler.flush()\n self.handler.close()\n\n\nclass CloudWatchLogOutput(LogOutput):\n\n log_format = '%(asctime)s - %(levelname)s - %(name)s - %(message)s'\n\n def get_handler(self):\n return CloudWatchLogHandler(\n log_group=self.ctx.options.log_group,\n log_stream=self.ctx.policy.name,\n session_factory=lambda x=None: self.ctx.session_factory(\n assume=False))\n\n def __repr__(self):\n return \"<%s to group:%s stream:%s>\" % (\n self.__class__.__name__,\n self.ctx.options.log_group,\n self.ctx.policy.name)\n\n\nclass FSOutput(LogOutput):\n\n @staticmethod\n def select(path):\n for k in blob_outputs.keys():\n if path.startswith('%s://' % k):\n return blob_outputs[k]\n # Fall back local disk\n return blob_outputs['file']\n\n @staticmethod\n def join(*parts):\n return os.path.join(*parts)\n\n def __init__(self, ctx):\n super(FSOutput, self).__init__(ctx)\n self.root_dir = self.ctx.output_path or tempfile.mkdtemp()\n\n def get_handler(self):\n return logging.FileHandler(\n os.path.join(self.root_dir, 'custodian-run.log'))\n\n def compress(self):\n # Compress files individually so thats easy to walk them, without\n # downloading tar and extracting.\n for root, dirs, files in os.walk(self.root_dir):\n for f in files:\n fp = os.path.join(root, f)\n with gzip.open(fp + \".gz\", \"wb\", compresslevel=7) as zfh:\n with open(fp, \"rb\") as sfh:\n shutil.copyfileobj(sfh, zfh, length=2**15)\n os.remove(fp)\n\n\n@blob_outputs.register('file')\nclass DirectoryOutput(FSOutput):\n\n permissions = ()\n\n def __init__(self, ctx):\n super(DirectoryOutput, self).__init__(ctx)\n if self.root_dir.startswith('file://'):\n self.root_dir = self.root_dir[len('file://'):]\n if self.ctx.output_path is not None:\n if not 
os.path.exists(self.root_dir):\n os.makedirs(self.root_dir)\n\n def __repr__(self):\n return \"<%s to dir:%s>\" % (self.__class__.__name__, self.root_dir)\n\n\n@blob_outputs.register('s3')\nclass S3Output(FSOutput):\n \"\"\"\n Usage:\n\n .. code-block:: python\n\n with S3Output(session_factory, 's3://bucket/prefix'):\n log.info('xyz') # -> log messages sent to custodian-run.log.gz\n\n \"\"\"\n\n permissions = ('S3:PutObject',)\n\n def __init__(self, ctx):\n super(S3Output, self).__init__(ctx)\n self.date_path = datetime.datetime.now().strftime('%Y/%m/%d/%H')\n self.s3_path, self.bucket, self.key_prefix = parse_s3(\n self.ctx.output_path)\n self.root_dir = tempfile.mkdtemp()\n self.transfer = None\n\n def __repr__(self):\n return \"<%s to bucket:%s prefix:%s>\" % (\n self.__class__.__name__,\n self.bucket,\n \"%s/%s\" % (self.key_prefix, self.date_path))\n\n @staticmethod\n def join(*parts):\n return \"/\".join([s.strip('/') for s in parts])\n\n def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):\n if exc_type is not None:\n log.exception(\"Error while executing policy\")\n log.debug(\"Uploading policy logs\")\n self.leave_log()\n self.compress()\n self.transfer = S3Transfer(\n self.ctx.session_factory(assume=False).client('s3'))\n self.upload()\n shutil.rmtree(self.root_dir)\n log.debug(\"Policy Logs uploaded\")\n\n def upload(self):\n for root, dirs, files in os.walk(self.root_dir):\n for f in files:\n key = \"%s/%s%s\" % (\n self.key_prefix,\n self.date_path,\n \"%s/%s\" % (\n root[len(self.root_dir):], f))\n key = key.strip('/')\n self.transfer.upload_file(\n os.path.join(root, f), self.bucket, key,\n extra_args={\n 'ServerSideEncryption': 'AES256'})\n", "path": "c7n/output.py"}], "after_files": [{"content": "# Copyright 2015-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nOutputs metrics, logs, structured records across\na variety of sources.\n\nSee docs/usage/outputs.rst\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport datetime\nimport gzip\nimport logging\nimport shutil\nimport tempfile\n\nimport os\n\nfrom boto3.s3.transfer import S3Transfer\n\nfrom c7n.registry import PluginRegistry\nfrom c7n.log import CloudWatchLogHandler\nfrom c7n.utils import local_session, parse_s3, get_retry\n\nDEFAULT_NAMESPACE = \"CloudMaid\"\n\nlog = logging.getLogger('custodian.output')\n\n\nblob_outputs = PluginRegistry('c7n.blob-outputs')\n\n\nclass MetricsOutput(object):\n \"\"\"Send metrics data to cloudwatch\n \"\"\"\n\n permissions = (\"cloudWatch:PutMetricData\",)\n\n retry = staticmethod(get_retry(('Throttling',)))\n\n @staticmethod\n def select(metrics_enabled):\n if metrics_enabled:\n return MetricsOutput\n return NullMetricsOutput\n\n def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):\n self.ctx = ctx\n self.namespace = namespace\n self.buf = []\n\n def get_timestamp(self):\n \"\"\"\n Now, if C7N_METRICS_TZ is set to TRUE, UTC timestamp will be used.\n For backwards 
compatibility, if it is not set, UTC will be the default.\n To disable this and use the system's time zone, C7N_METRICS_TZ shoule be set to FALSE.\n \"\"\"\n\n if os.getenv(\"C7N_METRICS_TZ\", '').upper() in ('TRUE', ''):\n return datetime.datetime.utcnow()\n else:\n return datetime.datetime.now()\n\n def flush(self):\n if self.buf:\n self._put_metrics(self.namespace, self.buf)\n self.buf = []\n\n def put_metric(self, key, value, unit, buffer=False, **dimensions):\n d = {\n \"MetricName\": key,\n \"Timestamp\": self.get_timestamp(),\n \"Value\": value,\n \"Unit\": unit}\n d[\"Dimensions\"] = [\n {\"Name\": \"Policy\", \"Value\": self.ctx.policy.name},\n {\"Name\": \"ResType\", \"Value\": self.ctx.policy.resource_type}]\n for k, v in dimensions.items():\n d['Dimensions'].append({\"Name\": k, \"Value\": v})\n\n if buffer:\n self.buf.append(d)\n # Max metrics in a single request\n if len(self.buf) == 20:\n self.flush()\n else:\n self._put_metrics(self.namespace, [d])\n\n def _put_metrics(self, ns, metrics):\n watch = local_session(self.ctx.session_factory).client('cloudwatch')\n return self.retry(\n watch.put_metric_data, Namespace=ns, MetricData=metrics)\n\n\nclass NullMetricsOutput(MetricsOutput):\n\n permissions = ()\n\n def __init__(self, ctx, namespace=DEFAULT_NAMESPACE):\n super(NullMetricsOutput, self).__init__(ctx, namespace)\n self.data = []\n\n def _put_metrics(self, ns, metrics):\n self.data.append({'Namespace': ns, 'MetricData': metrics})\n for m in metrics:\n if m['MetricName'] not in ('ActionTime', 'ResourceTime'):\n log.debug(self.format_metric(m))\n\n def format_metric(self, m):\n label = \"metric:%s %s:%s\" % (m['MetricName'], m['Unit'], m['Value'])\n for d in m['Dimensions']:\n label += \" %s:%s\" % (d['Name'].lower(), d['Value'].lower())\n return label\n\n\nclass LogOutput(object):\n\n log_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'\n\n def __init__(self, ctx):\n self.ctx = ctx\n\n def get_handler(self):\n raise NotImplementedError()\n\n def __enter__(self):\n log.debug(\"Storing output with %s\" % repr(self))\n self.join_log()\n return self\n\n def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):\n self.leave_log()\n if exc_type is not None:\n log.exception(\"Error while executing policy\")\n\n def join_log(self):\n self.handler = self.get_handler()\n self.handler.setLevel(logging.DEBUG)\n self.handler.setFormatter(logging.Formatter(self.log_format))\n mlog = logging.getLogger('custodian')\n mlog.addHandler(self.handler)\n\n def leave_log(self):\n mlog = logging.getLogger('custodian')\n mlog.removeHandler(self.handler)\n self.handler.flush()\n self.handler.close()\n\n\nclass CloudWatchLogOutput(LogOutput):\n\n log_format = '%(asctime)s - %(levelname)s - %(name)s - %(message)s'\n\n def get_handler(self):\n return CloudWatchLogHandler(\n log_group=self.ctx.options.log_group,\n log_stream=self.ctx.policy.name,\n session_factory=lambda x=None: self.ctx.session_factory(\n assume=False))\n\n def __repr__(self):\n return \"<%s to group:%s stream:%s>\" % (\n self.__class__.__name__,\n self.ctx.options.log_group,\n self.ctx.policy.name)\n\n\nclass FSOutput(LogOutput):\n\n @staticmethod\n def select(path):\n for k in blob_outputs.keys():\n if path.startswith('%s://' % k):\n return blob_outputs[k]\n # Fall back local disk\n return blob_outputs['file']\n\n @staticmethod\n def join(*parts):\n return os.path.join(*parts)\n\n def __init__(self, ctx):\n super(FSOutput, self).__init__(ctx)\n self.root_dir = self.ctx.output_path or 
tempfile.mkdtemp()\n\n def get_handler(self):\n return logging.FileHandler(\n os.path.join(self.root_dir, 'custodian-run.log'))\n\n def compress(self):\n # Compress files individually so thats easy to walk them, without\n # downloading tar and extracting.\n for root, dirs, files in os.walk(self.root_dir):\n for f in files:\n fp = os.path.join(root, f)\n with gzip.open(fp + \".gz\", \"wb\", compresslevel=7) as zfh:\n with open(fp, \"rb\") as sfh:\n shutil.copyfileobj(sfh, zfh, length=2**15)\n os.remove(fp)\n\n\n@blob_outputs.register('file')\nclass DirectoryOutput(FSOutput):\n\n permissions = ()\n\n def __init__(self, ctx):\n super(DirectoryOutput, self).__init__(ctx)\n if self.root_dir.startswith('file://'):\n self.root_dir = self.root_dir[len('file://'):]\n if self.ctx.output_path is not None:\n if not os.path.exists(self.root_dir):\n os.makedirs(self.root_dir)\n\n def __repr__(self):\n return \"<%s to dir:%s>\" % (self.__class__.__name__, self.root_dir)\n\n\n@blob_outputs.register('s3')\nclass S3Output(FSOutput):\n \"\"\"\n Usage:\n\n .. code-block:: python\n\n with S3Output(session_factory, 's3://bucket/prefix'):\n log.info('xyz') # -> log messages sent to custodian-run.log.gz\n\n \"\"\"\n\n permissions = ('S3:PutObject',)\n\n def __init__(self, ctx):\n super(S3Output, self).__init__(ctx)\n self.date_path = datetime.datetime.now().strftime('%Y/%m/%d/%H')\n self.s3_path, self.bucket, self.key_prefix = parse_s3(\n self.ctx.output_path)\n self.root_dir = tempfile.mkdtemp()\n self.transfer = None\n\n def __repr__(self):\n return \"<%s to bucket:%s prefix:%s>\" % (\n self.__class__.__name__,\n self.bucket,\n \"%s/%s\" % (self.key_prefix, self.date_path))\n\n @staticmethod\n def join(*parts):\n return \"/\".join([s.strip('/') for s in parts])\n\n def __exit__(self, exc_type=None, exc_value=None, exc_traceback=None):\n if exc_type is not None:\n log.exception(\"Error while executing policy\")\n log.debug(\"Uploading policy logs\")\n self.leave_log()\n self.compress()\n self.transfer = S3Transfer(\n self.ctx.session_factory(assume=False).client('s3'))\n self.upload()\n shutil.rmtree(self.root_dir)\n log.debug(\"Policy Logs uploaded\")\n\n def upload(self):\n for root, dirs, files in os.walk(self.root_dir):\n for f in files:\n key = \"%s/%s%s\" % (\n self.key_prefix,\n self.date_path,\n \"%s/%s\" % (\n root[len(self.root_dir):], f))\n key = key.strip('/')\n self.transfer.upload_file(\n os.path.join(root, f), self.bucket, key,\n extra_args={\n 'ACL': 'bucket-owner-full-control',\n 'ServerSideEncryption': 'AES256'})\n", "path": "c7n/output.py"}]} | 3,275 | 92 |
gh_patches_debug_7275 | rasdani/github-patches | git_diff | facebookresearch__xformers-136 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[CI] InProj container not properly covered
# 🐛 Bug
Not a bug per se, but many `InProjContainer` options are not covered; see https://app.codecov.io/gh/facebookresearch/xformers/blob/main/xformers/components/in_proj_container.py
--- END ISSUE ---
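For context, "covered" here means the test suite actually imports and exercises the container; the eventual fix (shown in the golden diff further down) simply re-exports it from the package root. A minimal smoke-test sketch of what that re-export enables — the test body is an assumption, not code from the repository:
```python
# Sketch only: the import path mirrors the re-export added by the golden diff.
from xformers.components import InProjContainer, InProjParams


def test_in_proj_symbols_are_exported():
    # A first step toward coverage: assert the symbols are exposed at package level.
    # Real coverage would go on to construct the container with its various options.
    assert InProjContainer is not None
    assert InProjParams is not None
```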
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `xformers/components/__init__.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.
2 #
3 # This source code is licensed under the BSD license found in the
4 # LICENSE file in the root directory of this source tree.
5
6
7 from dataclasses import fields
8 from pathlib import Path
9 from typing import Any, Dict, Union
10
11 from xformers.utils import import_all_modules
12
13 from .activations import Activation, build_activation # noqa
14 from .attention import Attention, build_attention # noqa
15 from .multi_head_dispatch import MultiHeadDispatch, MultiHeadDispatchConfig # noqa
16 from .residual import LayerNormStyle, PostNorm, PreNorm, Residual # noqa
17
18 # automatically import any Python files in the directory
19 import_all_modules(str(Path(__file__).parent), "xformers.components")
20
21
22 def build_multi_head_attention(
23 multi_head_config: Union[MultiHeadDispatchConfig, Dict[str, Any]],
24 ):
25 """Builds a multihead attention from a config.
26
27 This assumes a 'name' key in the config which is used to determine what
28 attention class to instantiate. For instance, a config `{"name": "my_attention",
29 "foo": "bar"}` will find a class that was registered as "my_attention"
30 (see :func:`register_attention`) and call .from_config on it."""
31
32 if not isinstance(multi_head_config, MultiHeadDispatchConfig):
33 # Extract the required fields
34 field_names = list(map(lambda x: x.name, fields(MultiHeadDispatchConfig)))
35
36 # The missing fields get Noned
37 for k in field_names:
38 if k not in multi_head_config.keys():
39 multi_head_config[k] = None
40
41 # Could be that the attention needs to be instantiated
42 if not isinstance(multi_head_config["attention"], Attention):
43 # Convenience: fill in possible missing fields
44 if "num_heads" not in multi_head_config["attention"]:
45 multi_head_config["attention"]["num_heads"] = multi_head_config[
46 "num_heads"
47 ]
48
49 if (
50 "dim_features" not in multi_head_config["attention"]
51 or multi_head_config["attention"]["dim_features"] is None
52 ):
53 multi_head_config["attention"]["dim_features"] = (
54 multi_head_config["dim_model"] // multi_head_config["num_heads"]
55 )
56
57 multi_head_config["attention"] = build_attention(
58 multi_head_config["attention"]
59 )
60
61 multi_head_config = MultiHeadDispatchConfig(**multi_head_config)
62
63 return MultiHeadDispatch.from_config(multi_head_config)
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/xformers/components/__init__.py b/xformers/components/__init__.py
--- a/xformers/components/__init__.py
+++ b/xformers/components/__init__.py
@@ -12,6 +12,7 @@
from .activations import Activation, build_activation # noqa
from .attention import Attention, build_attention # noqa
+from .in_proj_container import InProjContainer, InProjParams # noqa
from .multi_head_dispatch import MultiHeadDispatch, MultiHeadDispatchConfig # noqa
from .residual import LayerNormStyle, PostNorm, PreNorm, Residual # noqa
| {"golden_diff": "diff --git a/xformers/components/__init__.py b/xformers/components/__init__.py\n--- a/xformers/components/__init__.py\n+++ b/xformers/components/__init__.py\n@@ -12,6 +12,7 @@\n \n from .activations import Activation, build_activation # noqa\n from .attention import Attention, build_attention # noqa\n+from .in_proj_container import InProjContainer, InProjParams # noqa\n from .multi_head_dispatch import MultiHeadDispatch, MultiHeadDispatchConfig # noqa\n from .residual import LayerNormStyle, PostNorm, PreNorm, Residual # noqa\n", "issue": "[CI] InProj countainer not properly covered\n# \ud83d\udc1b Bug\r\nNot a bug per say, but many inProjCountainer options are not covered, see https://app.codecov.io/gh/facebookresearch/xformers/blob/main/xformers/components/in_proj_container.py\r\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.\n#\n# This source code is licensed under the BSD license found in the\n# LICENSE file in the root directory of this source tree.\n\n\nfrom dataclasses import fields\nfrom pathlib import Path\nfrom typing import Any, Dict, Union\n\nfrom xformers.utils import import_all_modules\n\nfrom .activations import Activation, build_activation # noqa\nfrom .attention import Attention, build_attention # noqa\nfrom .multi_head_dispatch import MultiHeadDispatch, MultiHeadDispatchConfig # noqa\nfrom .residual import LayerNormStyle, PostNorm, PreNorm, Residual # noqa\n\n# automatically import any Python files in the directory\nimport_all_modules(str(Path(__file__).parent), \"xformers.components\")\n\n\ndef build_multi_head_attention(\n multi_head_config: Union[MultiHeadDispatchConfig, Dict[str, Any]],\n):\n \"\"\"Builds a multihead attention from a config.\n\n This assumes a 'name' key in the config which is used to determine what\n attention class to instantiate. For instance, a config `{\"name\": \"my_attention\",\n \"foo\": \"bar\"}` will find a class that was registered as \"my_attention\"\n (see :func:`register_attention`) and call .from_config on it.\"\"\"\n\n if not isinstance(multi_head_config, MultiHeadDispatchConfig):\n # Extract the required fields\n field_names = list(map(lambda x: x.name, fields(MultiHeadDispatchConfig)))\n\n # The missing fields get Noned\n for k in field_names:\n if k not in multi_head_config.keys():\n multi_head_config[k] = None\n\n # Could be that the attention needs to be instantiated\n if not isinstance(multi_head_config[\"attention\"], Attention):\n # Convenience: fill in possible missing fields\n if \"num_heads\" not in multi_head_config[\"attention\"]:\n multi_head_config[\"attention\"][\"num_heads\"] = multi_head_config[\n \"num_heads\"\n ]\n\n if (\n \"dim_features\" not in multi_head_config[\"attention\"]\n or multi_head_config[\"attention\"][\"dim_features\"] is None\n ):\n multi_head_config[\"attention\"][\"dim_features\"] = (\n multi_head_config[\"dim_model\"] // multi_head_config[\"num_heads\"]\n )\n\n multi_head_config[\"attention\"] = build_attention(\n multi_head_config[\"attention\"]\n )\n\n multi_head_config = MultiHeadDispatchConfig(**multi_head_config)\n\n return MultiHeadDispatch.from_config(multi_head_config)\n", "path": "xformers/components/__init__.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All rights reserved.\n#\n# This source code is licensed under the BSD license found in the\n# LICENSE file in the root directory of this source tree.\n\n\nfrom dataclasses import fields\nfrom pathlib import Path\nfrom typing import Any, Dict, Union\n\nfrom xformers.utils import import_all_modules\n\nfrom .activations import Activation, build_activation # noqa\nfrom .attention import Attention, build_attention # noqa\nfrom .in_proj_container import InProjContainer, InProjParams # noqa\nfrom .multi_head_dispatch import MultiHeadDispatch, MultiHeadDispatchConfig # noqa\nfrom .residual import LayerNormStyle, PostNorm, PreNorm, Residual # noqa\n\n# automatically import any Python files in the directory\nimport_all_modules(str(Path(__file__).parent), \"xformers.components\")\n\n\ndef build_multi_head_attention(\n multi_head_config: Union[MultiHeadDispatchConfig, Dict[str, Any]],\n):\n \"\"\"Builds a multihead attention from a config.\n\n This assumes a 'name' key in the config which is used to determine what\n attention class to instantiate. For instance, a config `{\"name\": \"my_attention\",\n \"foo\": \"bar\"}` will find a class that was registered as \"my_attention\"\n (see :func:`register_attention`) and call .from_config on it.\"\"\"\n\n if not isinstance(multi_head_config, MultiHeadDispatchConfig):\n # Extract the required fields\n field_names = list(map(lambda x: x.name, fields(MultiHeadDispatchConfig)))\n\n # The missing fields get Noned\n for k in field_names:\n if k not in multi_head_config.keys():\n multi_head_config[k] = None\n\n # Could be that the attention needs to be instantiated\n if not isinstance(multi_head_config[\"attention\"], Attention):\n # Convenience: fill in possible missing fields\n if \"num_heads\" not in multi_head_config[\"attention\"]:\n multi_head_config[\"attention\"][\"num_heads\"] = multi_head_config[\n \"num_heads\"\n ]\n\n if (\n \"dim_features\" not in multi_head_config[\"attention\"]\n or multi_head_config[\"attention\"][\"dim_features\"] is None\n ):\n multi_head_config[\"attention\"][\"dim_features\"] = (\n multi_head_config[\"dim_model\"] // multi_head_config[\"num_heads\"]\n )\n\n multi_head_config[\"attention\"] = build_attention(\n multi_head_config[\"attention\"]\n )\n\n multi_head_config = MultiHeadDispatchConfig(**multi_head_config)\n\n return MultiHeadDispatch.from_config(multi_head_config)\n", "path": "xformers/components/__init__.py"}]} | 984 | 139 |
gh_patches_debug_31196 | rasdani/github-patches | git_diff | Textualize__textual-4223 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Writing to a hidden RichLog breaks table alignment
I've got a `RichLog` that is often hidden and I write text to it. I'm using a `Rich.Table` to left/right justify two pieces of text on a single line.
This works as expected while the widget is shown.
If the widget has either `visible=False` or `display=False` set, then any text written during that time is collapsed together into a single string.
I'm not sure if this is a side effect of not having the widget displayed (no space is calculated for it so it doesn't know what sizes to use when writing the `Table`?) or not. If this is working as expected, might there be any way to write to the widget while it's hidden but preserve the appropriate sizing?
The following is an example of writing to the widget while it is shown, while it has `visible=False`, and while it has `display=False`.
*(animated GIF demonstrating the behaviour omitted)*
Example code (the same code that produced the GIF above).
```python
from __future__ import annotations
import random
from rich.align import Align
from rich.table import Table
from rich.text import Text
from textual.app import App, ComposeResult
from textual.widgets import RichLog
class Example(App):
BINDINGS = [
("w", "write_line", "Write Line"),
("h", "toggle_visible", "Toggle Visibility"),
("d", "toggle_display", "Toggle Display"),
("q", "quit", "Quit"),
]
DEFAULT_CSS = """
#one {
border: round white;
height: 1fr;
}
#two {
border: round white;
height: 1fr;
}
"""
def compose(self) -> ComposeResult:
log1 = RichLog(id="one")
log1.border_title = "Log 1"
log2 = RichLog(id="two")
log2.border_title = "Log 2"
yield log1
yield log2
def action_toggle_visible(self) -> None:
log1 = self.query_one("#one", RichLog)
log2 = self.query_one("#two", RichLog)
log2.visible = not log2.visible
log1.write(f"log2 visibility={log2.visible}")
def action_toggle_display(self) -> None:
log1 = self.query_one("#one", RichLog)
log2 = self.query_one("#two", RichLog)
log2.display = not log2.display
log1.write(f"log2 display={log2.display}")
def action_write_line(self) -> None:
"""Write a line to log 2"""
log1 = self.query_one("#one", RichLog)
log2 = self.query_one("#two", RichLog)
left = Text("something: text on the left")
right = Text("Stuff on the right")
table = Table.grid(expand=True)
table.add_row(left, Align.right(right))
log2.write(table, expand=True)
log1.write("Wrote to log2")
if __name__ == "__main__":
app = Example()
app.run()
```
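A possible interim workaround, based on the `width` parameter of `RichLog.write()` visible in the widget source below: passing an explicit width keeps the table layout even while the widget is hidden. This is a sketch only; the fixed width of 80 columns is an assumption, not something from the original report.
```python
    def action_write_line_fixed_width(self) -> None:
        """Variant of action_write_line that pins a width so hidden writes keep their layout."""
        log2 = self.query_one("#two", RichLog)
        left = Text("something: text on the left")
        right = Text("Stuff on the right")
        table = Table.grid(expand=True)
        table.add_row(left, Align.right(right))
        # With an explicit width, RichLog.write() no longer depends on the
        # (zero-sized) content region of a hidden widget.
        log2.write(table, width=80)
```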
# Textual Diagnostics
## Versions
| Name | Value |
|---------|--------|
| Textual | 0.46.0 |
| Rich | 13.7.0 |
## Python
| Name | Value |
|----------------|-----------------------------------------------------------|
| Version | 3.8.10 |
| Implementation | CPython |
| Compiler | Clang 14.0.3 (clang-1403.0.22.14.1) |
| Executable | /Users/geoff/Documents/projects/kafka-tui/venv/bin/python |
## Operating System
| Name | Value |
|---------|-------------------------------------------------------------------------------------------------------|
| System | Darwin |
| Release | 23.3.0 |
| Version | Darwin Kernel Version 23.3.0: Wed Dec 20 21:30:27 PST 2023; root:xnu-10002.81.5~7/RELEASE_ARM64_T8103 |
## Terminal
| Name | Value |
|----------------------|-----------------|
| Terminal Application | vscode (1.86.0) |
| TERM | xterm-256color |
| COLORTERM | truecolor |
| FORCE_COLOR | *Not set* |
| NO_COLOR | *Not set* |
## Rich Console options
| Name | Value |
|----------------|----------------------|
| size | width=115, height=40 |
| legacy_windows | False |
| min_width | 1 |
| max_width | 115 |
| is_terminal | True |
| encoding | utf-8 |
| max_height | 40 |
| justify | None |
| overflow | None |
| no_wrap | False |
| highlight | None |
| markup | None |
| height | None |
Feel free to add screenshots and / or videos. These can be very helpful!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/textual/widgets/_rich_log.py`
Content:
```
1 """Provides a scrollable text-logging widget."""
2
3 from __future__ import annotations
4
5 from typing import TYPE_CHECKING, Optional, cast
6
7 from rich.console import RenderableType
8 from rich.highlighter import ReprHighlighter
9 from rich.measure import measure_renderables
10 from rich.pretty import Pretty
11 from rich.protocol import is_renderable
12 from rich.segment import Segment
13 from rich.text import Text
14
15 from ..cache import LRUCache
16 from ..geometry import Region, Size
17 from ..reactive import var
18 from ..scroll_view import ScrollView
19 from ..strip import Strip
20
21 if TYPE_CHECKING:
22 from typing_extensions import Self
23
24
25 class RichLog(ScrollView, can_focus=True):
26 """A widget for logging text."""
27
28 DEFAULT_CSS = """
29 RichLog{
30 background: $surface;
31 color: $text;
32 overflow-y: scroll;
33 }
34 """
35
36 max_lines: var[int | None] = var[Optional[int]](None)
37 min_width: var[int] = var(78)
38 wrap: var[bool] = var(False)
39 highlight: var[bool] = var(False)
40 markup: var[bool] = var(False)
41 auto_scroll: var[bool] = var(True)
42
43 def __init__(
44 self,
45 *,
46 max_lines: int | None = None,
47 min_width: int = 78,
48 wrap: bool = False,
49 highlight: bool = False,
50 markup: bool = False,
51 auto_scroll: bool = True,
52 name: str | None = None,
53 id: str | None = None,
54 classes: str | None = None,
55 disabled: bool = False,
56 ) -> None:
57 """Create a RichLog widget.
58
59 Args:
60 max_lines: Maximum number of lines in the log or `None` for no maximum.
61 min_width: Minimum width of renderables.
62 wrap: Enable word wrapping (default is off).
63 highlight: Automatically highlight content.
64 markup: Apply Rich console markup.
65 auto_scroll: Enable automatic scrolling to end.
66 name: The name of the text log.
67 id: The ID of the text log in the DOM.
68 classes: The CSS classes of the text log.
69 disabled: Whether the text log is disabled or not.
70 """
71 super().__init__(name=name, id=id, classes=classes, disabled=disabled)
72 self.max_lines = max_lines
73 """Maximum number of lines in the log or `None` for no maximum."""
74 self._start_line: int = 0
75 self.lines: list[Strip] = []
76 self._line_cache: LRUCache[tuple[int, int, int, int], Strip]
77 self._line_cache = LRUCache(1024)
78 self.max_width: int = 0
79 self.min_width = min_width
80 """Minimum width of renderables."""
81 self.wrap = wrap
82 """Enable word wrapping."""
83 self.highlight = highlight
84 """Automatically highlight content."""
85 self.markup = markup
86 """Apply Rich console markup."""
87 self.auto_scroll = auto_scroll
88 """Automatically scroll to the end on write."""
89 self.highlighter = ReprHighlighter()
90
91 def notify_style_update(self) -> None:
92 self._line_cache.clear()
93
94 def _make_renderable(self, content: RenderableType | object) -> RenderableType:
95 """Make content renderable.
96
97 Args:
98 content: Content to render.
99
100 Returns:
101 A Rich renderable.
102 """
103 renderable: RenderableType
104 if not is_renderable(content):
105 renderable = Pretty(content)
106 else:
107 if isinstance(content, str):
108 if self.markup:
109 renderable = Text.from_markup(content)
110 else:
111 renderable = Text(content)
112 if self.highlight:
113 renderable = self.highlighter(renderable)
114 else:
115 renderable = cast(RenderableType, content)
116
117 if isinstance(renderable, Text):
118 renderable.expand_tabs()
119
120 return renderable
121
122 def write(
123 self,
124 content: RenderableType | object,
125 width: int | None = None,
126 expand: bool = False,
127 shrink: bool = True,
128 scroll_end: bool | None = None,
129 ) -> Self:
130 """Write text or a rich renderable.
131
132 Args:
133 content: Rich renderable (or text).
134 width: Width to render or `None` to use optimal width.
135 expand: Enable expand to widget width, or `False` to use `width`.
136 shrink: Enable shrinking of content to fit width.
137 scroll_end: Enable automatic scroll to end, or `None` to use `self.auto_scroll`.
138
139 Returns:
140 The `RichLog` instance.
141 """
142
143 auto_scroll = self.auto_scroll if scroll_end is None else scroll_end
144
145 console = self.app.console
146 render_options = console.options
147
148 renderable = self._make_renderable(content)
149
150 if isinstance(renderable, Text) and not self.wrap:
151 render_options = render_options.update(overflow="ignore", no_wrap=True)
152
153 render_width = measure_renderables(
154 console, render_options, [renderable]
155 ).maximum
156 container_width = (
157 self.scrollable_content_region.width if width is None else width
158 )
159 if container_width:
160 if expand and render_width < container_width:
161 render_width = container_width
162 if shrink and render_width > container_width:
163 render_width = container_width
164
165 segments = self.app.console.render(
166 renderable, render_options.update_width(render_width)
167 )
168 lines = list(Segment.split_lines(segments))
169 if not lines:
170 self.lines.append(Strip.blank(render_width))
171 else:
172 self.max_width = max(
173 self.max_width,
174 max(sum([segment.cell_length for segment in _line]) for _line in lines),
175 )
176 strips = Strip.from_lines(lines)
177 for strip in strips:
178 strip.adjust_cell_length(render_width)
179 self.lines.extend(strips)
180
181 if self.max_lines is not None and len(self.lines) > self.max_lines:
182 self._start_line += len(self.lines) - self.max_lines
183 self.refresh()
184 self.lines = self.lines[-self.max_lines :]
185 self.virtual_size = Size(self.max_width, len(self.lines))
186 if auto_scroll:
187 self.scroll_end(animate=False)
188
189 return self
190
191 def clear(self) -> Self:
192 """Clear the text log.
193
194 Returns:
195 The `RichLog` instance.
196 """
197 self.lines.clear()
198 self._line_cache.clear()
199 self._start_line = 0
200 self.max_width = 0
201 self.virtual_size = Size(self.max_width, len(self.lines))
202 self.refresh()
203 return self
204
205 def render_line(self, y: int) -> Strip:
206 scroll_x, scroll_y = self.scroll_offset
207 line = self._render_line(scroll_y + y, scroll_x, self.size.width)
208 strip = line.apply_style(self.rich_style)
209 return strip
210
211 def render_lines(self, crop: Region) -> list[Strip]:
212 """Render the widget in to lines.
213
214 Args:
215 crop: Region within visible area to.
216
217 Returns:
218 A list of list of segments.
219 """
220 lines = self._styles_cache.render_widget(self, crop)
221 return lines
222
223 def _render_line(self, y: int, scroll_x: int, width: int) -> Strip:
224 if y >= len(self.lines):
225 return Strip.blank(width, self.rich_style)
226
227 key = (y + self._start_line, scroll_x, width, self.max_width)
228 if key in self._line_cache:
229 return self._line_cache[key]
230
231 line = self.lines[y].crop_extend(scroll_x, scroll_x + width, self.rich_style)
232
233 self._line_cache[key] = line
234 return line
235
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/textual/widgets/_rich_log.py b/src/textual/widgets/_rich_log.py
--- a/src/textual/widgets/_rich_log.py
+++ b/src/textual/widgets/_rich_log.py
@@ -88,9 +88,15 @@
"""Automatically scroll to the end on write."""
self.highlighter = ReprHighlighter()
+ self._last_container_width: int = min_width
+ """Record the last width we rendered content at."""
+
def notify_style_update(self) -> None:
self._line_cache.clear()
+ def on_resize(self) -> None:
+ self._last_container_width = self.scrollable_content_region.width
+
def _make_renderable(self, content: RenderableType | object) -> RenderableType:
"""Make content renderable.
@@ -153,14 +159,22 @@
render_width = measure_renderables(
console, render_options, [renderable]
).maximum
+
container_width = (
self.scrollable_content_region.width if width is None else width
)
- if container_width:
- if expand and render_width < container_width:
- render_width = container_width
- if shrink and render_width > container_width:
- render_width = container_width
+
+ # Use the container_width if it's available, otherwise use the last available width.
+ container_width = (
+ container_width if container_width else self._last_container_width
+ )
+
+ if expand and render_width < container_width:
+ render_width = container_width
+ if shrink and render_width > container_width:
+ render_width = container_width
+
+ render_width = max(render_width, self.min_width)
segments = self.app.console.render(
renderable, render_options.update_width(render_width)
| {"golden_diff": "diff --git a/src/textual/widgets/_rich_log.py b/src/textual/widgets/_rich_log.py\n--- a/src/textual/widgets/_rich_log.py\n+++ b/src/textual/widgets/_rich_log.py\n@@ -88,9 +88,15 @@\n \"\"\"Automatically scroll to the end on write.\"\"\"\n self.highlighter = ReprHighlighter()\n \n+ self._last_container_width: int = min_width\n+ \"\"\"Record the last width we rendered content at.\"\"\"\n+\n def notify_style_update(self) -> None:\n self._line_cache.clear()\n \n+ def on_resize(self) -> None:\n+ self._last_container_width = self.scrollable_content_region.width\n+\n def _make_renderable(self, content: RenderableType | object) -> RenderableType:\n \"\"\"Make content renderable.\n \n@@ -153,14 +159,22 @@\n render_width = measure_renderables(\n console, render_options, [renderable]\n ).maximum\n+\n container_width = (\n self.scrollable_content_region.width if width is None else width\n )\n- if container_width:\n- if expand and render_width < container_width:\n- render_width = container_width\n- if shrink and render_width > container_width:\n- render_width = container_width\n+\n+ # Use the container_width if it's available, otherwise use the last available width.\n+ container_width = (\n+ container_width if container_width else self._last_container_width\n+ )\n+\n+ if expand and render_width < container_width:\n+ render_width = container_width\n+ if shrink and render_width > container_width:\n+ render_width = container_width\n+\n+ render_width = max(render_width, self.min_width)\n \n segments = self.app.console.render(\n renderable, render_options.update_width(render_width)\n", "issue": "Writing to a hidden RichLog breaks table alignment\nI've got a `RichLog` that is often hidden and I write text to it. I'm using a `Rich.Table` to left/right justify two pieces of text on a single line.\r\n\r\nThis works as expected while the widget is shown.\r\n\r\nIf the widget has either `visible=False` or `display=False` set, then any text written during that time is collapsed together into a single string.\r\n\r\nI'm not sure if this is a side effect of not having the widget displayed (no space is calculated for it so it doesn't know what sizes to use when writing the `Table`?) or not. 
If this is working as expected, might there be any way to write to the widget while it's hidden but preserve the appropriate sizing?\r\n\r\nThe following is an example of writing to the widget both while it is shown, has `visible=False` and `display=False`.\r\n\r\n\r\n\r\nExample Code (same that produced `gif` above).\r\n```python\r\nfrom __future__ import annotations\r\n\r\nimport random\r\n\r\nfrom rich.align import Align\r\nfrom rich.table import Table\r\nfrom rich.text import Text\r\nfrom textual.app import App, ComposeResult\r\nfrom textual.widgets import RichLog\r\n\r\n\r\nclass Example(App):\r\n BINDINGS = [\r\n (\"w\", \"write_line\", \"Write Line\"),\r\n (\"h\", \"toggle_visible\", \"Toggle Visibility\"),\r\n (\"d\", \"toggle_display\", \"Toggle Display\"),\r\n (\"q\", \"quit\", \"Quit\"),\r\n ]\r\n\r\n DEFAULT_CSS = \"\"\"\r\n #one {\r\n border: round white;\r\n height: 1fr;\r\n }\r\n #two {\r\n border: round white;\r\n height: 1fr;\r\n }\r\n \"\"\"\r\n\r\n def compose(self) -> ComposeResult:\r\n log1 = RichLog(id=\"one\")\r\n log1.border_title = \"Log 1\"\r\n log2 = RichLog(id=\"two\")\r\n log2.border_title = \"Log 2\"\r\n yield log1\r\n yield log2\r\n\r\n def action_toggle_visible(self) -> None:\r\n log1 = self.query_one(\"#one\", RichLog)\r\n log2 = self.query_one(\"#two\", RichLog)\r\n\r\n log2.visible = not log2.visible\r\n log1.write(f\"log2 visibility={log2.visible}\")\r\n\r\n def action_toggle_display(self) -> None:\r\n log1 = self.query_one(\"#one\", RichLog)\r\n log2 = self.query_one(\"#two\", RichLog)\r\n\r\n log2.display = not log2.display\r\n log1.write(f\"log2 display={log2.display}\")\r\n\r\n def action_write_line(self) -> None:\r\n \"\"\"Write a line to log 2\"\"\"\r\n log1 = self.query_one(\"#one\", RichLog)\r\n log2 = self.query_one(\"#two\", RichLog)\r\n\r\n left = Text(\"something: text on the left\")\r\n right = Text(\"Stuff on the right\")\r\n table = Table.grid(expand=True)\r\n table.add_row(left, Align.right(right))\r\n\r\n log2.write(table, expand=True)\r\n log1.write(\"Wrote to log2\")\r\n\r\n\r\nif __name__ == \"__main__\":\r\n app = Example()\r\n app.run()\r\n```\r\n\r\n# Textual Diagnostics\r\n\r\n## Versions\r\n\r\n| Name | Value |\r\n|---------|--------|\r\n| Textual | 0.46.0 |\r\n| Rich | 13.7.0 |\r\n\r\n## Python\r\n\r\n| Name | Value |\r\n|----------------|-----------------------------------------------------------|\r\n| Version | 3.8.10 |\r\n| Implementation | CPython |\r\n| Compiler | Clang 14.0.3 (clang-1403.0.22.14.1) |\r\n| Executable | /Users/geoff/Documents/projects/kafka-tui/venv/bin/python |\r\n\r\n## Operating System\r\n\r\n| Name | Value |\r\n|---------|-------------------------------------------------------------------------------------------------------|\r\n| System | Darwin |\r\n| Release | 23.3.0 |\r\n| Version | Darwin Kernel Version 23.3.0: Wed Dec 20 21:30:27 PST 2023; root:xnu-10002.81.5~7/RELEASE_ARM64_T8103 |\r\n\r\n## Terminal\r\n\r\n| Name | Value |\r\n|----------------------|-----------------|\r\n| Terminal Application | vscode (1.86.0) |\r\n| TERM | xterm-256color |\r\n| COLORTERM | truecolor |\r\n| FORCE_COLOR | *Not set* |\r\n| NO_COLOR | *Not set* |\r\n\r\n## Rich Console options\r\n\r\n| Name | Value |\r\n|----------------|----------------------|\r\n| size | width=115, height=40 |\r\n| legacy_windows | False |\r\n| min_width | 1 |\r\n| max_width | 115 |\r\n| is_terminal | True |\r\n| encoding | utf-8 |\r\n| max_height | 40 |\r\n| justify | None |\r\n| overflow | None |\r\n| no_wrap | False |\r\n| highlight | None 
|\r\n| markup | None |\r\n| height | None |\r\n\r\nFeel free to add screenshots and / or videos. These can be very helpful!\r\n\n", "before_files": [{"content": "\"\"\"Provides a scrollable text-logging widget.\"\"\"\n\nfrom __future__ import annotations\n\nfrom typing import TYPE_CHECKING, Optional, cast\n\nfrom rich.console import RenderableType\nfrom rich.highlighter import ReprHighlighter\nfrom rich.measure import measure_renderables\nfrom rich.pretty import Pretty\nfrom rich.protocol import is_renderable\nfrom rich.segment import Segment\nfrom rich.text import Text\n\nfrom ..cache import LRUCache\nfrom ..geometry import Region, Size\nfrom ..reactive import var\nfrom ..scroll_view import ScrollView\nfrom ..strip import Strip\n\nif TYPE_CHECKING:\n from typing_extensions import Self\n\n\nclass RichLog(ScrollView, can_focus=True):\n \"\"\"A widget for logging text.\"\"\"\n\n DEFAULT_CSS = \"\"\"\n RichLog{\n background: $surface;\n color: $text;\n overflow-y: scroll;\n }\n \"\"\"\n\n max_lines: var[int | None] = var[Optional[int]](None)\n min_width: var[int] = var(78)\n wrap: var[bool] = var(False)\n highlight: var[bool] = var(False)\n markup: var[bool] = var(False)\n auto_scroll: var[bool] = var(True)\n\n def __init__(\n self,\n *,\n max_lines: int | None = None,\n min_width: int = 78,\n wrap: bool = False,\n highlight: bool = False,\n markup: bool = False,\n auto_scroll: bool = True,\n name: str | None = None,\n id: str | None = None,\n classes: str | None = None,\n disabled: bool = False,\n ) -> None:\n \"\"\"Create a RichLog widget.\n\n Args:\n max_lines: Maximum number of lines in the log or `None` for no maximum.\n min_width: Minimum width of renderables.\n wrap: Enable word wrapping (default is off).\n highlight: Automatically highlight content.\n markup: Apply Rich console markup.\n auto_scroll: Enable automatic scrolling to end.\n name: The name of the text log.\n id: The ID of the text log in the DOM.\n classes: The CSS classes of the text log.\n disabled: Whether the text log is disabled or not.\n \"\"\"\n super().__init__(name=name, id=id, classes=classes, disabled=disabled)\n self.max_lines = max_lines\n \"\"\"Maximum number of lines in the log or `None` for no maximum.\"\"\"\n self._start_line: int = 0\n self.lines: list[Strip] = []\n self._line_cache: LRUCache[tuple[int, int, int, int], Strip]\n self._line_cache = LRUCache(1024)\n self.max_width: int = 0\n self.min_width = min_width\n \"\"\"Minimum width of renderables.\"\"\"\n self.wrap = wrap\n \"\"\"Enable word wrapping.\"\"\"\n self.highlight = highlight\n \"\"\"Automatically highlight content.\"\"\"\n self.markup = markup\n \"\"\"Apply Rich console markup.\"\"\"\n self.auto_scroll = auto_scroll\n \"\"\"Automatically scroll to the end on write.\"\"\"\n self.highlighter = ReprHighlighter()\n\n def notify_style_update(self) -> None:\n self._line_cache.clear()\n\n def _make_renderable(self, content: RenderableType | object) -> RenderableType:\n \"\"\"Make content renderable.\n\n Args:\n content: Content to render.\n\n Returns:\n A Rich renderable.\n \"\"\"\n renderable: RenderableType\n if not is_renderable(content):\n renderable = Pretty(content)\n else:\n if isinstance(content, str):\n if self.markup:\n renderable = Text.from_markup(content)\n else:\n renderable = Text(content)\n if self.highlight:\n renderable = self.highlighter(renderable)\n else:\n renderable = cast(RenderableType, content)\n\n if isinstance(renderable, Text):\n renderable.expand_tabs()\n\n return renderable\n\n def write(\n self,\n content: 
RenderableType | object,\n width: int | None = None,\n expand: bool = False,\n shrink: bool = True,\n scroll_end: bool | None = None,\n ) -> Self:\n \"\"\"Write text or a rich renderable.\n\n Args:\n content: Rich renderable (or text).\n width: Width to render or `None` to use optimal width.\n expand: Enable expand to widget width, or `False` to use `width`.\n shrink: Enable shrinking of content to fit width.\n scroll_end: Enable automatic scroll to end, or `None` to use `self.auto_scroll`.\n\n Returns:\n The `RichLog` instance.\n \"\"\"\n\n auto_scroll = self.auto_scroll if scroll_end is None else scroll_end\n\n console = self.app.console\n render_options = console.options\n\n renderable = self._make_renderable(content)\n\n if isinstance(renderable, Text) and not self.wrap:\n render_options = render_options.update(overflow=\"ignore\", no_wrap=True)\n\n render_width = measure_renderables(\n console, render_options, [renderable]\n ).maximum\n container_width = (\n self.scrollable_content_region.width if width is None else width\n )\n if container_width:\n if expand and render_width < container_width:\n render_width = container_width\n if shrink and render_width > container_width:\n render_width = container_width\n\n segments = self.app.console.render(\n renderable, render_options.update_width(render_width)\n )\n lines = list(Segment.split_lines(segments))\n if not lines:\n self.lines.append(Strip.blank(render_width))\n else:\n self.max_width = max(\n self.max_width,\n max(sum([segment.cell_length for segment in _line]) for _line in lines),\n )\n strips = Strip.from_lines(lines)\n for strip in strips:\n strip.adjust_cell_length(render_width)\n self.lines.extend(strips)\n\n if self.max_lines is not None and len(self.lines) > self.max_lines:\n self._start_line += len(self.lines) - self.max_lines\n self.refresh()\n self.lines = self.lines[-self.max_lines :]\n self.virtual_size = Size(self.max_width, len(self.lines))\n if auto_scroll:\n self.scroll_end(animate=False)\n\n return self\n\n def clear(self) -> Self:\n \"\"\"Clear the text log.\n\n Returns:\n The `RichLog` instance.\n \"\"\"\n self.lines.clear()\n self._line_cache.clear()\n self._start_line = 0\n self.max_width = 0\n self.virtual_size = Size(self.max_width, len(self.lines))\n self.refresh()\n return self\n\n def render_line(self, y: int) -> Strip:\n scroll_x, scroll_y = self.scroll_offset\n line = self._render_line(scroll_y + y, scroll_x, self.size.width)\n strip = line.apply_style(self.rich_style)\n return strip\n\n def render_lines(self, crop: Region) -> list[Strip]:\n \"\"\"Render the widget in to lines.\n\n Args:\n crop: Region within visible area to.\n\n Returns:\n A list of list of segments.\n \"\"\"\n lines = self._styles_cache.render_widget(self, crop)\n return lines\n\n def _render_line(self, y: int, scroll_x: int, width: int) -> Strip:\n if y >= len(self.lines):\n return Strip.blank(width, self.rich_style)\n\n key = (y + self._start_line, scroll_x, width, self.max_width)\n if key in self._line_cache:\n return self._line_cache[key]\n\n line = self.lines[y].crop_extend(scroll_x, scroll_x + width, self.rich_style)\n\n self._line_cache[key] = line\n return line\n", "path": "src/textual/widgets/_rich_log.py"}], "after_files": [{"content": "\"\"\"Provides a scrollable text-logging widget.\"\"\"\n\nfrom __future__ import annotations\n\nfrom typing import TYPE_CHECKING, Optional, cast\n\nfrom rich.console import RenderableType\nfrom rich.highlighter import ReprHighlighter\nfrom rich.measure import measure_renderables\nfrom 
rich.pretty import Pretty\nfrom rich.protocol import is_renderable\nfrom rich.segment import Segment\nfrom rich.text import Text\n\nfrom ..cache import LRUCache\nfrom ..geometry import Region, Size\nfrom ..reactive import var\nfrom ..scroll_view import ScrollView\nfrom ..strip import Strip\n\nif TYPE_CHECKING:\n from typing_extensions import Self\n\n\nclass RichLog(ScrollView, can_focus=True):\n \"\"\"A widget for logging text.\"\"\"\n\n DEFAULT_CSS = \"\"\"\n RichLog{\n background: $surface;\n color: $text;\n overflow-y: scroll;\n }\n \"\"\"\n\n max_lines: var[int | None] = var[Optional[int]](None)\n min_width: var[int] = var(78)\n wrap: var[bool] = var(False)\n highlight: var[bool] = var(False)\n markup: var[bool] = var(False)\n auto_scroll: var[bool] = var(True)\n\n def __init__(\n self,\n *,\n max_lines: int | None = None,\n min_width: int = 78,\n wrap: bool = False,\n highlight: bool = False,\n markup: bool = False,\n auto_scroll: bool = True,\n name: str | None = None,\n id: str | None = None,\n classes: str | None = None,\n disabled: bool = False,\n ) -> None:\n \"\"\"Create a RichLog widget.\n\n Args:\n max_lines: Maximum number of lines in the log or `None` for no maximum.\n min_width: Minimum width of renderables.\n wrap: Enable word wrapping (default is off).\n highlight: Automatically highlight content.\n markup: Apply Rich console markup.\n auto_scroll: Enable automatic scrolling to end.\n name: The name of the text log.\n id: The ID of the text log in the DOM.\n classes: The CSS classes of the text log.\n disabled: Whether the text log is disabled or not.\n \"\"\"\n super().__init__(name=name, id=id, classes=classes, disabled=disabled)\n self.max_lines = max_lines\n \"\"\"Maximum number of lines in the log or `None` for no maximum.\"\"\"\n self._start_line: int = 0\n self.lines: list[Strip] = []\n self._line_cache: LRUCache[tuple[int, int, int, int], Strip]\n self._line_cache = LRUCache(1024)\n self.max_width: int = 0\n self.min_width = min_width\n \"\"\"Minimum width of renderables.\"\"\"\n self.wrap = wrap\n \"\"\"Enable word wrapping.\"\"\"\n self.highlight = highlight\n \"\"\"Automatically highlight content.\"\"\"\n self.markup = markup\n \"\"\"Apply Rich console markup.\"\"\"\n self.auto_scroll = auto_scroll\n \"\"\"Automatically scroll to the end on write.\"\"\"\n self.highlighter = ReprHighlighter()\n\n self._last_container_width: int = min_width\n \"\"\"Record the last width we rendered content at.\"\"\"\n\n def notify_style_update(self) -> None:\n self._line_cache.clear()\n\n def on_resize(self) -> None:\n self._last_container_width = self.scrollable_content_region.width\n\n def _make_renderable(self, content: RenderableType | object) -> RenderableType:\n \"\"\"Make content renderable.\n\n Args:\n content: Content to render.\n\n Returns:\n A Rich renderable.\n \"\"\"\n renderable: RenderableType\n if not is_renderable(content):\n renderable = Pretty(content)\n else:\n if isinstance(content, str):\n if self.markup:\n renderable = Text.from_markup(content)\n else:\n renderable = Text(content)\n if self.highlight:\n renderable = self.highlighter(renderable)\n else:\n renderable = cast(RenderableType, content)\n\n if isinstance(renderable, Text):\n renderable.expand_tabs()\n\n return renderable\n\n def write(\n self,\n content: RenderableType | object,\n width: int | None = None,\n expand: bool = False,\n shrink: bool = True,\n scroll_end: bool | None = None,\n ) -> Self:\n \"\"\"Write text or a rich renderable.\n\n Args:\n content: Rich renderable (or text).\n width: 
Width to render or `None` to use optimal width.\n expand: Enable expand to widget width, or `False` to use `width`.\n shrink: Enable shrinking of content to fit width.\n scroll_end: Enable automatic scroll to end, or `None` to use `self.auto_scroll`.\n\n Returns:\n The `RichLog` instance.\n \"\"\"\n\n auto_scroll = self.auto_scroll if scroll_end is None else scroll_end\n\n console = self.app.console\n render_options = console.options\n\n renderable = self._make_renderable(content)\n\n if isinstance(renderable, Text) and not self.wrap:\n render_options = render_options.update(overflow=\"ignore\", no_wrap=True)\n\n render_width = measure_renderables(\n console, render_options, [renderable]\n ).maximum\n\n container_width = (\n self.scrollable_content_region.width if width is None else width\n )\n\n # Use the container_width if it's available, otherwise use the last available width.\n container_width = (\n container_width if container_width else self._last_container_width\n )\n\n if expand and render_width < container_width:\n render_width = container_width\n if shrink and render_width > container_width:\n render_width = container_width\n\n render_width = max(render_width, self.min_width)\n\n segments = self.app.console.render(\n renderable, render_options.update_width(render_width)\n )\n lines = list(Segment.split_lines(segments))\n if not lines:\n self.lines.append(Strip.blank(render_width))\n else:\n self.max_width = max(\n self.max_width,\n max(sum([segment.cell_length for segment in _line]) for _line in lines),\n )\n strips = Strip.from_lines(lines)\n for strip in strips:\n strip.adjust_cell_length(render_width)\n self.lines.extend(strips)\n\n if self.max_lines is not None and len(self.lines) > self.max_lines:\n self._start_line += len(self.lines) - self.max_lines\n self.refresh()\n self.lines = self.lines[-self.max_lines :]\n self.virtual_size = Size(self.max_width, len(self.lines))\n if auto_scroll:\n self.scroll_end(animate=False)\n\n return self\n\n def clear(self) -> Self:\n \"\"\"Clear the text log.\n\n Returns:\n The `RichLog` instance.\n \"\"\"\n self.lines.clear()\n self._line_cache.clear()\n self._start_line = 0\n self.max_width = 0\n self.virtual_size = Size(self.max_width, len(self.lines))\n self.refresh()\n return self\n\n def render_line(self, y: int) -> Strip:\n scroll_x, scroll_y = self.scroll_offset\n line = self._render_line(scroll_y + y, scroll_x, self.size.width)\n strip = line.apply_style(self.rich_style)\n return strip\n\n def render_lines(self, crop: Region) -> list[Strip]:\n \"\"\"Render the widget in to lines.\n\n Args:\n crop: Region within visible area to.\n\n Returns:\n A list of list of segments.\n \"\"\"\n lines = self._styles_cache.render_widget(self, crop)\n return lines\n\n def _render_line(self, y: int, scroll_x: int, width: int) -> Strip:\n if y >= len(self.lines):\n return Strip.blank(width, self.rich_style)\n\n key = (y + self._start_line, scroll_x, width, self.max_width)\n if key in self._line_cache:\n return self._line_cache[key]\n\n line = self.lines[y].crop_extend(scroll_x, scroll_x + width, self.rich_style)\n\n self._line_cache[key] = line\n return line\n", "path": "src/textual/widgets/_rich_log.py"}]} | 3,738 | 398 |
gh_patches_debug_1866 | rasdani/github-patches | git_diff | modin-project__modin-1782 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ClusterError class should implement its own version of __str__ method
<!--
General questions should be asked on the mailing list [email protected].
Before submitting an issue, please fill out the following form.
-->
### System information
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**:
- **Modin installed from (source or binary)**:
- **Modin version**:
- **Python version**:
- **Exact command to reproduce**:
<!--
You can obtain the Modin version with
python -c "import modin; print(modin.__version__)"
-->
### Describe the problem
<!-- Describe the problem clearly here. -->
`ClusterError` includes a `cause` field that is not included in the printed exception message. This makes it difficult to understand the underlying problem that caused the exception.
### Source code / logs
<!-- Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `modin/experimental/cloud/base.py`
Content:
```
1 # Licensed to Modin Development Team under one or more contributor license agreements.
2 # See the NOTICE file distributed with this work for additional information regarding
3 # copyright ownership. The Modin Development Team licenses this file to you under the
4 # Apache License, Version 2.0 (the "License"); you may not use this file except in
5 # compliance with the License. You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software distributed under
10 # the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
11 # ANY KIND, either express or implied. See the License for the specific language
12 # governing permissions and limitations under the License.
13
14 from typing import NamedTuple
15 import os
16 import sys
17
18
19 class ClusterError(Exception):
20 """
21 Generic cluster operating exception
22 """
23
24 def __init__(self, *args, cause: BaseException = None, traceback: str = None, **kw):
25 self.cause = cause
26 self.traceback = traceback
27 super().__init__(*args, **kw)
28
29
30 class CannotSpawnCluster(ClusterError):
31 """
32 Raised when cluster cannot be spawned in the cloud
33 """
34
35
36 class CannotDestroyCluster(ClusterError):
37 """
38 Raised when cluster cannot be destroyed in the cloud
39 """
40
41
42 class ConnectionDetails(NamedTuple):
43 user_name: str = "modin"
44 key_file: str = None
45 address: str = None
46 port: int = 22
47
48
49 _EXT = (".exe", ".com", ".cmd", ".bat", "") if sys.platform == "win32" else ("",)
50
51
52 def _which(prog):
53 for entry in os.environ["PATH"].split(os.pathsep):
54 for ext in _EXT:
55 path = os.path.join(entry, prog + ext)
56 if os.access(path, os.X_OK):
57 return path
58 return None
59
60
61 def _get_ssh_proxy_command():
62 socks_proxy = os.environ.get("MODIN_SOCKS_PROXY", None)
63 if socks_proxy is None:
64 return None
65 if _which("nc"):
66 return f"nc -x {socks_proxy} %h %p"
67 elif _which("connect"):
68 return f"connect -S {socks_proxy} %h %p"
69 raise ClusterError(
70 "SSH through proxy required but no supported proxying tools found"
71 )
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/modin/experimental/cloud/base.py b/modin/experimental/cloud/base.py
--- a/modin/experimental/cloud/base.py
+++ b/modin/experimental/cloud/base.py
@@ -26,6 +26,11 @@
self.traceback = traceback
super().__init__(*args, **kw)
+ def __str__(self):
+ if self.clause:
+ return f"clause: {self.cause}\n{super()}"
+ return str(super())
+
class CannotSpawnCluster(ClusterError):
"""
| {"golden_diff": "diff --git a/modin/experimental/cloud/base.py b/modin/experimental/cloud/base.py\n--- a/modin/experimental/cloud/base.py\n+++ b/modin/experimental/cloud/base.py\n@@ -26,6 +26,11 @@\n self.traceback = traceback\n super().__init__(*args, **kw)\n \n+ def __str__(self):\n+ if self.clause:\n+ return f\"clause: {self.cause}\\n{super()}\"\n+ return str(super())\n+\n \n class CannotSpawnCluster(ClusterError):\n \"\"\"\n", "issue": "ClusterError class should implement its own version of __str__ method\n<!--\r\nGeneral questions should be asked on the mailing list [email protected].\r\n\r\nBefore submitting an issue, please fill out the following form.\r\n-->\r\n\r\n### System information\r\n- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**:\r\n- **Modin installed from (source or binary)**:\r\n- **Modin version**:\r\n- **Python version**:\r\n- **Exact command to reproduce**:\r\n\r\n<!--\r\nYou can obtain the Modin version with\r\n\r\npython -c \"import modin; print(modin.__version__)\"\r\n-->\r\n\r\n### Describe the problem\r\n<!-- Describe the problem clearly here. -->\r\n`ClusterError` includes the `cause` field that is not printed. This makes it difficult to understand the problems that cause exceptions.\r\n\r\n### Source code / logs\r\n<!-- Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem. -->\r\n\n", "before_files": [{"content": "# Licensed to Modin Development Team under one or more contributor license agreements.\n# See the NOTICE file distributed with this work for additional information regarding\n# copyright ownership. The Modin Development Team licenses this file to you under the\n# Apache License, Version 2.0 (the \"License\"); you may not use this file except in\n# compliance with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software distributed under\n# the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. 
See the License for the specific language\n# governing permissions and limitations under the License.\n\nfrom typing import NamedTuple\nimport os\nimport sys\n\n\nclass ClusterError(Exception):\n \"\"\"\n Generic cluster operating exception\n \"\"\"\n\n def __init__(self, *args, cause: BaseException = None, traceback: str = None, **kw):\n self.cause = cause\n self.traceback = traceback\n super().__init__(*args, **kw)\n\n\nclass CannotSpawnCluster(ClusterError):\n \"\"\"\n Raised when cluster cannot be spawned in the cloud\n \"\"\"\n\n\nclass CannotDestroyCluster(ClusterError):\n \"\"\"\n Raised when cluster cannot be destroyed in the cloud\n \"\"\"\n\n\nclass ConnectionDetails(NamedTuple):\n user_name: str = \"modin\"\n key_file: str = None\n address: str = None\n port: int = 22\n\n\n_EXT = (\".exe\", \".com\", \".cmd\", \".bat\", \"\") if sys.platform == \"win32\" else (\"\",)\n\n\ndef _which(prog):\n for entry in os.environ[\"PATH\"].split(os.pathsep):\n for ext in _EXT:\n path = os.path.join(entry, prog + ext)\n if os.access(path, os.X_OK):\n return path\n return None\n\n\ndef _get_ssh_proxy_command():\n socks_proxy = os.environ.get(\"MODIN_SOCKS_PROXY\", None)\n if socks_proxy is None:\n return None\n if _which(\"nc\"):\n return f\"nc -x {socks_proxy} %h %p\"\n elif _which(\"connect\"):\n return f\"connect -S {socks_proxy} %h %p\"\n raise ClusterError(\n \"SSH through proxy required but no supported proxying tools found\"\n )\n", "path": "modin/experimental/cloud/base.py"}], "after_files": [{"content": "# Licensed to Modin Development Team under one or more contributor license agreements.\n# See the NOTICE file distributed with this work for additional information regarding\n# copyright ownership. The Modin Development Team licenses this file to you under the\n# Apache License, Version 2.0 (the \"License\"); you may not use this file except in\n# compliance with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software distributed under\n# the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. 
See the License for the specific language\n# governing permissions and limitations under the License.\n\nfrom typing import NamedTuple\nimport os\nimport sys\n\n\nclass ClusterError(Exception):\n \"\"\"\n Generic cluster operating exception\n \"\"\"\n\n def __init__(self, *args, cause: BaseException = None, traceback: str = None, **kw):\n self.cause = cause\n self.traceback = traceback\n super().__init__(*args, **kw)\n\n def __str__(self):\n if self.clause:\n return f\"clause: {self.cause}\\n{super()}\"\n return str(super())\n\n\nclass CannotSpawnCluster(ClusterError):\n \"\"\"\n Raised when cluster cannot be spawned in the cloud\n \"\"\"\n\n\nclass CannotDestroyCluster(ClusterError):\n \"\"\"\n Raised when cluster cannot be destroyed in the cloud\n \"\"\"\n\n\nclass ConnectionDetails(NamedTuple):\n user_name: str = \"modin\"\n key_file: str = None\n address: str = None\n port: int = 22\n\n\n_EXT = (\".exe\", \".com\", \".cmd\", \".bat\", \"\") if sys.platform == \"win32\" else (\"\",)\n\n\ndef _which(prog):\n for entry in os.environ[\"PATH\"].split(os.pathsep):\n for ext in _EXT:\n path = os.path.join(entry, prog + ext)\n if os.access(path, os.X_OK):\n return path\n return None\n\n\ndef _get_ssh_proxy_command():\n socks_proxy = os.environ.get(\"MODIN_SOCKS_PROXY\", None)\n if socks_proxy is None:\n return None\n if _which(\"nc\"):\n return f\"nc -x {socks_proxy} %h %p\"\n elif _which(\"connect\"):\n return f\"connect -S {socks_proxy} %h %p\"\n raise ClusterError(\n \"SSH through proxy required but no supported proxying tools found\"\n )\n", "path": "modin/experimental/cloud/base.py"}]} | 1,166 | 121 |
gh_patches_debug_7609 | rasdani/github-patches | git_diff | ansible__awx-11088 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Errors caused by Social Auth Organization and Team Mapping; Users unable to login
### Please confirm the following
- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.
### Summary
Attempting to configure Social Organization/Team Mapping prevents social-auth users from logging in. `awx-web` and postgres show errors. If ORGANIZATION_MAP and TEAM_MAP are reverted to default/empty, user is able to login.
### AWX version
19.2.2
### Installation method
kubernetes
### Modifications
yes
### Ansible version
_No response_
### Operating system
_No response_
### Web browser
Chrome
### Steps to reproduce
1. Fresh AWX 19.2.2 installation via awx-operator (0.12.0) with `create_preload_data: 'false'`
2. Log in as Admin; Configure Github Organization OAuth2 Social Auth ( do not configure org/team maps )
```json
{
"SOCIAL_AUTH_GITHUB_ORG_CALLBACK_URL": "https://my-awx-install.com/sso/complete/github-org/",
"SOCIAL_AUTH_GITHUB_ORG_KEY": "<redacted>",
"SOCIAL_AUTH_GITHUB_ORG_SECRET": "$encrypted$",
"SOCIAL_AUTH_GITHUB_ORG_ORGANIZATION_MAP": null,
"SOCIAL_AUTH_GITHUB_ORG_TEAM_MAP": null
}
```
3. Validate Step 2 configs by logging in via Github Org Social Auth
4. Log out.
5. Login as Admin; Configure Org/Team Maps (I tried this in both the Global and GitHub Org specific configs)
```json
{
"SOCIAL_AUTH_GITHUB_ORG_CALLBACK_URL": "https://my-awx-install.com/sso/complete/github-org/",
"SOCIAL_AUTH_GITHUB_ORG_KEY": "<redacted>",
"SOCIAL_AUTH_GITHUB_ORG_SECRET": "$encrypted$",
"SOCIAL_AUTH_GITHUB_ORG_NAME": "MyOrg",
"SOCIAL_AUTH_GITHUB_ORG_ORGANIZATION_MAP": {
"MyOrg": {
"users": [
"[email protected]"
],
"admins": [
"[email protected]"
],
"remove_users": true,
"remove_admins": true
}
},
"SOCIAL_AUTH_GITHUB_ORG_TEAM_MAP": {
"operations-team": {
"remove": true,
"organization": "MyOrg",
"users": [
"[email protected]"
]
}
}
}
```
6. Attempt to login as user; Get redirected to login page; Observe errors in the awx-web and postgres logs
### Expected results
Organizations specified by valid Social Auth Org map(s) are created automatically
Teams specified by valid Social Auth Team map(s) are created automatically
Users are able to login as specified by Social Auth Org/Team maps
### Actual results
User was not able to login following Org/Map configuration. Organization and Teams were not created on our behalf.
`postgres` errors
```bash
2021-08-13 21:39:42.438 UTC [2886] STATEMENT: INSERT INTO "main_organizationgalaxycredentialmembership" ("organization_id", "credential_id", "position") VALUES (1, NULL, NULL) RETURNING "main_organizationgalaxycredentialmembership"."id"
2021-08-13 21:39:42.438 UTC [2886] ERROR: null value in column "credential_id" violates not-null constraint
2021-08-13 21:39:42.438 UTC [2886] DETAIL: Failing row contains (1, null, null, 1).
```
`awx-web` errors
```
2021-08-13 21:39:42,440 ERROR [c6496ec03db747d3a7800f653acd316b] social null value in column "credential_id" violates not-null constraint
DETAIL: Failing row contains (1, null, null, 1).
.
```
### Additional information
* Installed via awx-operator with `create_preload_data: false`
* Prior to opening this issue, I've also attempted this against 19.3.0 without success.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awx/main/models/organization.py`
Content:
```
1 # Copyright (c) 2015 Ansible, Inc.
2 # All Rights Reserved.
3
4
5 # Django
6 from django.conf import settings
7 from django.db import models
8 from django.contrib.auth.models import User
9 from django.contrib.sessions.models import Session
10 from django.utils.timezone import now as tz_now
11 from django.utils.translation import ugettext_lazy as _
12
13
14 # AWX
15 from awx.api.versioning import reverse
16 from awx.main.fields import AutoOneToOneField, ImplicitRoleField, OrderedManyToManyField
17 from awx.main.models.base import BaseModel, CommonModel, CommonModelNameNotUnique, CreatedModifiedModel, NotificationFieldsModel
18 from awx.main.models.rbac import (
19 ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,
20 ROLE_SINGLETON_SYSTEM_AUDITOR,
21 )
22 from awx.main.models.unified_jobs import UnifiedJob
23 from awx.main.models.mixins import ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin
24
25 __all__ = ['Organization', 'Team', 'Profile', 'UserSessionMembership']
26
27
28 class Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin):
29 """
30 An organization is the basic unit of multi-tenancy divisions
31 """
32
33 class Meta:
34 app_label = 'main'
35 ordering = ('name',)
36
37 instance_groups = OrderedManyToManyField('InstanceGroup', blank=True, through='OrganizationInstanceGroupMembership')
38 galaxy_credentials = OrderedManyToManyField(
39 'Credential', blank=True, through='OrganizationGalaxyCredentialMembership', related_name='%(class)s_galaxy_credentials'
40 )
41 max_hosts = models.PositiveIntegerField(
42 blank=True,
43 default=0,
44 help_text=_('Maximum number of hosts allowed to be managed by this organization.'),
45 )
46 notification_templates_approvals = models.ManyToManyField("NotificationTemplate", blank=True, related_name='%(class)s_notification_templates_for_approvals')
47 default_environment = models.ForeignKey(
48 'ExecutionEnvironment',
49 null=True,
50 blank=True,
51 default=None,
52 on_delete=models.SET_NULL,
53 related_name='+',
54 help_text=_('The default execution environment for jobs run by this organization.'),
55 )
56
57 admin_role = ImplicitRoleField(
58 parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,
59 )
60 execute_role = ImplicitRoleField(
61 parent_role='admin_role',
62 )
63 project_admin_role = ImplicitRoleField(
64 parent_role='admin_role',
65 )
66 inventory_admin_role = ImplicitRoleField(
67 parent_role='admin_role',
68 )
69 credential_admin_role = ImplicitRoleField(
70 parent_role='admin_role',
71 )
72 workflow_admin_role = ImplicitRoleField(
73 parent_role='admin_role',
74 )
75 notification_admin_role = ImplicitRoleField(
76 parent_role='admin_role',
77 )
78 job_template_admin_role = ImplicitRoleField(
79 parent_role='admin_role',
80 )
81 execution_environment_admin_role = ImplicitRoleField(
82 parent_role='admin_role',
83 )
84 auditor_role = ImplicitRoleField(
85 parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_AUDITOR,
86 )
87 member_role = ImplicitRoleField(parent_role=['admin_role'])
88 read_role = ImplicitRoleField(
89 parent_role=[
90 'member_role',
91 'auditor_role',
92 'execute_role',
93 'project_admin_role',
94 'inventory_admin_role',
95 'workflow_admin_role',
96 'notification_admin_role',
97 'credential_admin_role',
98 'job_template_admin_role',
99 'approval_role',
100 'execution_environment_admin_role',
101 ],
102 )
103 approval_role = ImplicitRoleField(
104 parent_role='admin_role',
105 )
106
107 def get_absolute_url(self, request=None):
108 return reverse('api:organization_detail', kwargs={'pk': self.pk}, request=request)
109
110 '''
111 RelatedJobsMixin
112 '''
113
114 def _get_related_jobs(self):
115 return UnifiedJob.objects.non_polymorphic().filter(organization=self)
116
117 def create_default_galaxy_credential(self):
118 from awx.main.models import Credential
119
120 public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
121 if public_galaxy_credential not in self.galaxy_credentials.all():
122 self.galaxy_credentials.add(public_galaxy_credential)
123
124
125 class OrganizationGalaxyCredentialMembership(models.Model):
126
127 organization = models.ForeignKey('Organization', on_delete=models.CASCADE)
128 credential = models.ForeignKey('Credential', on_delete=models.CASCADE)
129 position = models.PositiveIntegerField(
130 null=True,
131 default=None,
132 db_index=True,
133 )
134
135
136 class Team(CommonModelNameNotUnique, ResourceMixin):
137 """
138 A team is a group of users that work on common projects.
139 """
140
141 class Meta:
142 app_label = 'main'
143 unique_together = [('organization', 'name')]
144 ordering = ('organization__name', 'name')
145
146 organization = models.ForeignKey(
147 'Organization',
148 blank=False,
149 null=False,
150 on_delete=models.CASCADE,
151 related_name='teams',
152 )
153 admin_role = ImplicitRoleField(
154 parent_role='organization.admin_role',
155 )
156 member_role = ImplicitRoleField(
157 parent_role='admin_role',
158 )
159 read_role = ImplicitRoleField(
160 parent_role=['organization.auditor_role', 'member_role'],
161 )
162
163 def get_absolute_url(self, request=None):
164 return reverse('api:team_detail', kwargs={'pk': self.pk}, request=request)
165
166
167 class Profile(CreatedModifiedModel):
168 """
169 Profile model related to User object. Currently stores LDAP DN for users
170 loaded from LDAP.
171 """
172
173 class Meta:
174 app_label = 'main'
175
176 user = AutoOneToOneField('auth.User', related_name='profile', editable=False, on_delete=models.CASCADE)
177 ldap_dn = models.CharField(
178 max_length=1024,
179 default='',
180 )
181
182
183 class UserSessionMembership(BaseModel):
184 """
185 A lookup table for API session membership given user. Note, there is a
186 different session created by channels for websockets using the same
187 underlying model.
188 """
189
190 class Meta:
191 app_label = 'main'
192
193 user = models.ForeignKey('auth.User', related_name='+', blank=False, null=False, on_delete=models.CASCADE)
194 session = models.OneToOneField(Session, related_name='+', blank=False, null=False, on_delete=models.CASCADE)
195 created = models.DateTimeField(default=None, editable=False)
196
197 @staticmethod
198 def get_memberships_over_limit(user_id, now=None):
199 if settings.SESSIONS_PER_USER == -1:
200 return []
201 if now is None:
202 now = tz_now()
203 query_set = UserSessionMembership.objects.select_related('session').filter(user_id=user_id).order_by('-created')
204 non_expire_memberships = [x for x in query_set if x.session.expire_date > now]
205 return non_expire_memberships[settings.SESSIONS_PER_USER :]
206
207
208 # Add get_absolute_url method to User model if not present.
209 if not hasattr(User, 'get_absolute_url'):
210
211 def user_get_absolute_url(user, request=None):
212 return reverse('api:user_detail', kwargs={'pk': user.pk}, request=request)
213
214 User.add_to_class('get_absolute_url', user_get_absolute_url)
215
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/awx/main/models/organization.py b/awx/main/models/organization.py
--- a/awx/main/models/organization.py
+++ b/awx/main/models/organization.py
@@ -118,7 +118,7 @@
from awx.main.models import Credential
public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
- if public_galaxy_credential not in self.galaxy_credentials.all():
+ if public_galaxy_credential is not None and public_galaxy_credential not in self.galaxy_credentials.all():
self.galaxy_credentials.add(public_galaxy_credential)
| {"golden_diff": "diff --git a/awx/main/models/organization.py b/awx/main/models/organization.py\n--- a/awx/main/models/organization.py\n+++ b/awx/main/models/organization.py\n@@ -118,7 +118,7 @@\n from awx.main.models import Credential\n \n public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()\n- if public_galaxy_credential not in self.galaxy_credentials.all():\n+ if public_galaxy_credential is not None and public_galaxy_credential not in self.galaxy_credentials.all():\n self.galaxy_credentials.add(public_galaxy_credential)\n", "issue": "Errors caused by Social Auth Organization and Team Mapping; Users unable to login\n### Please confirm the following\r\n\r\n- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).\r\n- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.\r\n- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.\r\n\r\n### Summary\r\n\r\nAttempting to configure Social Organization/Team Mapping prevents social-auth users from logging in. `awx-web` and postgres show errors. If ORGANIZATION_MAP and TEAM_MAP are reverted to default/empty, user is able to login.\r\n\r\n\r\n### AWX version\r\n\r\n19.2.2\r\n\r\n### Installation method\r\n\r\nkubernetes\r\n\r\n### Modifications\r\n\r\nyes\r\n\r\n### Ansible version\r\n\r\n_No response_\r\n\r\n### Operating system\r\n\r\n_No response_\r\n\r\n### Web browser\r\n\r\nChrome\r\n\r\n### Steps to reproduce\r\n\r\n1. Fresh AWX 19.2.2 installation via awx-operator (0.12.0) with `create_preload_data: 'false'`\r\n2. Log in as Admin; Configure Github Organization OAuth2 Social Auth ( do not configure org/team maps )\r\n```json\r\n{\r\n \"SOCIAL_AUTH_GITHUB_ORG_CALLBACK_URL\": \"https://my-awx-install.com/sso/complete/github-org/\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_KEY\": \"<redacted>\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_SECRET\": \"$encrypted$\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_ORGANIZATION_MAP\": null,\r\n \"SOCIAL_AUTH_GITHUB_ORG_TEAM_MAP\": null\r\n}\r\n```\r\n3. Validate Step 2 configs by logging in via Github Org Social Auth\r\n4. Log out.\r\n\r\n5. Login as Admin; Configure Org/Team Maps (I tried this in both the Global and GitHub Org specific configs)\r\n```json\r\n{\r\n \"SOCIAL_AUTH_GITHUB_ORG_CALLBACK_URL\": \"https://my-awx-install.com/sso/complete/github-org/\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_KEY\": \"<redacted>\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_SECRET\": \"$encrypted$\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_NAME\": \"MyOrg\",\r\n \"SOCIAL_AUTH_GITHUB_ORG_ORGANIZATION_MAP\": {\r\n \"MyOrg\": {\r\n \"users\": [\r\n \"[email protected]\"\r\n ],\r\n \"admins\": [\r\n \"[email protected]\"\r\n ],\r\n \"remove_users\": true,\r\n \"remove_admins\": true\r\n }\r\n },\r\n \"SOCIAL_AUTH_GITHUB_ORG_TEAM_MAP\": {\r\n \"operations-team\": {\r\n \"remove\": true,\r\n \"organization\": \"MyOrg\",\r\n \"users\": [\r\n \"[email protected]\"\r\n ]\r\n }\r\n }\r\n}\r\n```\r\n\r\n6. Attempt to login as user; Get redirected to login page; Observe error awx-web and postgres error logs\r\n\r\n### Expected results\r\n\r\nOrganizations specified by valid Social Auth Org map(s) are created automatically\r\nTeams specified by valid Social Auth Team map(s) are created automatically\r\nUsers are able to login as specified by Social Auth Org/Team maps\r\n\r\n### Actual results\r\n\r\nUser was not able to login following Org/Map configuration. 
Organization and Teams were not created on our behalf.\r\n\r\n`postgres` errors\r\n```bash\r\n2021-08-13 21:39:42.438 UTC [2886] STATEMENT: INSERT INTO \"main_organizationgalaxycredentialmembership\" (\"organization_id\", \"credential_id\", \"position\") VALUES (1, NULL, NULL) RETURNING \"main_organizationgalaxycredentialmembership\".\"id\"\r\n2021-08-13 21:39:42.438 UTC [2886] ERROR: null value in column \"credential_id\" violates not-null constraint\r\n2021-08-13 21:39:42.438 UTC [2886] DETAIL: Failing row contains (1, null, null, 1).\r\n```\r\n\r\n`awx-web` errors\r\n```\r\n2021-08-13 21:39:42,440 ERROR [c6496ec03db747d3a7800f653acd316b] social null value in column \"credential_id\" violates not-null constraint\r\nDETAIL: Failing row contains (1, null, null, 1).\r\n.\r\n```\r\n\r\n### Additional information\r\n\r\n* Installed via awx-operator with `create_preload_data: false`\r\n* Prior to opening this issue, I've also attempted this against 19.3.0 to no success.\n", "before_files": [{"content": "# Copyright (c) 2015 Ansible, Inc.\n# All Rights Reserved.\n\n\n# Django\nfrom django.conf import settings\nfrom django.db import models\nfrom django.contrib.auth.models import User\nfrom django.contrib.sessions.models import Session\nfrom django.utils.timezone import now as tz_now\nfrom django.utils.translation import ugettext_lazy as _\n\n\n# AWX\nfrom awx.api.versioning import reverse\nfrom awx.main.fields import AutoOneToOneField, ImplicitRoleField, OrderedManyToManyField\nfrom awx.main.models.base import BaseModel, CommonModel, CommonModelNameNotUnique, CreatedModifiedModel, NotificationFieldsModel\nfrom awx.main.models.rbac import (\n ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,\n ROLE_SINGLETON_SYSTEM_AUDITOR,\n)\nfrom awx.main.models.unified_jobs import UnifiedJob\nfrom awx.main.models.mixins import ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin\n\n__all__ = ['Organization', 'Team', 'Profile', 'UserSessionMembership']\n\n\nclass Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin):\n \"\"\"\n An organization is the basic unit of multi-tenancy divisions\n \"\"\"\n\n class Meta:\n app_label = 'main'\n ordering = ('name',)\n\n instance_groups = OrderedManyToManyField('InstanceGroup', blank=True, through='OrganizationInstanceGroupMembership')\n galaxy_credentials = OrderedManyToManyField(\n 'Credential', blank=True, through='OrganizationGalaxyCredentialMembership', related_name='%(class)s_galaxy_credentials'\n )\n max_hosts = models.PositiveIntegerField(\n blank=True,\n default=0,\n help_text=_('Maximum number of hosts allowed to be managed by this organization.'),\n )\n notification_templates_approvals = models.ManyToManyField(\"NotificationTemplate\", blank=True, related_name='%(class)s_notification_templates_for_approvals')\n default_environment = models.ForeignKey(\n 'ExecutionEnvironment',\n null=True,\n blank=True,\n default=None,\n on_delete=models.SET_NULL,\n related_name='+',\n help_text=_('The default execution environment for jobs run by this organization.'),\n )\n\n admin_role = ImplicitRoleField(\n parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,\n )\n execute_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n project_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n inventory_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n credential_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n workflow_admin_role = ImplicitRoleField(\n 
parent_role='admin_role',\n )\n notification_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n job_template_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n execution_environment_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n auditor_role = ImplicitRoleField(\n parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_AUDITOR,\n )\n member_role = ImplicitRoleField(parent_role=['admin_role'])\n read_role = ImplicitRoleField(\n parent_role=[\n 'member_role',\n 'auditor_role',\n 'execute_role',\n 'project_admin_role',\n 'inventory_admin_role',\n 'workflow_admin_role',\n 'notification_admin_role',\n 'credential_admin_role',\n 'job_template_admin_role',\n 'approval_role',\n 'execution_environment_admin_role',\n ],\n )\n approval_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n\n def get_absolute_url(self, request=None):\n return reverse('api:organization_detail', kwargs={'pk': self.pk}, request=request)\n\n '''\n RelatedJobsMixin\n '''\n\n def _get_related_jobs(self):\n return UnifiedJob.objects.non_polymorphic().filter(organization=self)\n\n def create_default_galaxy_credential(self):\n from awx.main.models import Credential\n\n public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()\n if public_galaxy_credential not in self.galaxy_credentials.all():\n self.galaxy_credentials.add(public_galaxy_credential)\n\n\nclass OrganizationGalaxyCredentialMembership(models.Model):\n\n organization = models.ForeignKey('Organization', on_delete=models.CASCADE)\n credential = models.ForeignKey('Credential', on_delete=models.CASCADE)\n position = models.PositiveIntegerField(\n null=True,\n default=None,\n db_index=True,\n )\n\n\nclass Team(CommonModelNameNotUnique, ResourceMixin):\n \"\"\"\n A team is a group of users that work on common projects.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n unique_together = [('organization', 'name')]\n ordering = ('organization__name', 'name')\n\n organization = models.ForeignKey(\n 'Organization',\n blank=False,\n null=False,\n on_delete=models.CASCADE,\n related_name='teams',\n )\n admin_role = ImplicitRoleField(\n parent_role='organization.admin_role',\n )\n member_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n read_role = ImplicitRoleField(\n parent_role=['organization.auditor_role', 'member_role'],\n )\n\n def get_absolute_url(self, request=None):\n return reverse('api:team_detail', kwargs={'pk': self.pk}, request=request)\n\n\nclass Profile(CreatedModifiedModel):\n \"\"\"\n Profile model related to User object. Currently stores LDAP DN for users\n loaded from LDAP.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n\n user = AutoOneToOneField('auth.User', related_name='profile', editable=False, on_delete=models.CASCADE)\n ldap_dn = models.CharField(\n max_length=1024,\n default='',\n )\n\n\nclass UserSessionMembership(BaseModel):\n \"\"\"\n A lookup table for API session membership given user. 
Note, there is a\n different session created by channels for websockets using the same\n underlying model.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n\n user = models.ForeignKey('auth.User', related_name='+', blank=False, null=False, on_delete=models.CASCADE)\n session = models.OneToOneField(Session, related_name='+', blank=False, null=False, on_delete=models.CASCADE)\n created = models.DateTimeField(default=None, editable=False)\n\n @staticmethod\n def get_memberships_over_limit(user_id, now=None):\n if settings.SESSIONS_PER_USER == -1:\n return []\n if now is None:\n now = tz_now()\n query_set = UserSessionMembership.objects.select_related('session').filter(user_id=user_id).order_by('-created')\n non_expire_memberships = [x for x in query_set if x.session.expire_date > now]\n return non_expire_memberships[settings.SESSIONS_PER_USER :]\n\n\n# Add get_absolute_url method to User model if not present.\nif not hasattr(User, 'get_absolute_url'):\n\n def user_get_absolute_url(user, request=None):\n return reverse('api:user_detail', kwargs={'pk': user.pk}, request=request)\n\n User.add_to_class('get_absolute_url', user_get_absolute_url)\n", "path": "awx/main/models/organization.py"}], "after_files": [{"content": "# Copyright (c) 2015 Ansible, Inc.\n# All Rights Reserved.\n\n\n# Django\nfrom django.conf import settings\nfrom django.db import models\nfrom django.contrib.auth.models import User\nfrom django.contrib.sessions.models import Session\nfrom django.utils.timezone import now as tz_now\nfrom django.utils.translation import ugettext_lazy as _\n\n\n# AWX\nfrom awx.api.versioning import reverse\nfrom awx.main.fields import AutoOneToOneField, ImplicitRoleField, OrderedManyToManyField\nfrom awx.main.models.base import BaseModel, CommonModel, CommonModelNameNotUnique, CreatedModifiedModel, NotificationFieldsModel\nfrom awx.main.models.rbac import (\n ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,\n ROLE_SINGLETON_SYSTEM_AUDITOR,\n)\nfrom awx.main.models.unified_jobs import UnifiedJob\nfrom awx.main.models.mixins import ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin\n\n__all__ = ['Organization', 'Team', 'Profile', 'UserSessionMembership']\n\n\nclass Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVirtualEnvMixin, RelatedJobsMixin):\n \"\"\"\n An organization is the basic unit of multi-tenancy divisions\n \"\"\"\n\n class Meta:\n app_label = 'main'\n ordering = ('name',)\n\n instance_groups = OrderedManyToManyField('InstanceGroup', blank=True, through='OrganizationInstanceGroupMembership')\n galaxy_credentials = OrderedManyToManyField(\n 'Credential', blank=True, through='OrganizationGalaxyCredentialMembership', related_name='%(class)s_galaxy_credentials'\n )\n max_hosts = models.PositiveIntegerField(\n blank=True,\n default=0,\n help_text=_('Maximum number of hosts allowed to be managed by this organization.'),\n )\n notification_templates_approvals = models.ManyToManyField(\"NotificationTemplate\", blank=True, related_name='%(class)s_notification_templates_for_approvals')\n default_environment = models.ForeignKey(\n 'ExecutionEnvironment',\n null=True,\n blank=True,\n default=None,\n on_delete=models.SET_NULL,\n related_name='+',\n help_text=_('The default execution environment for jobs run by this organization.'),\n )\n\n admin_role = ImplicitRoleField(\n parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,\n )\n execute_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n project_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n 
inventory_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n credential_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n workflow_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n notification_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n job_template_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n execution_environment_admin_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n auditor_role = ImplicitRoleField(\n parent_role='singleton:' + ROLE_SINGLETON_SYSTEM_AUDITOR,\n )\n member_role = ImplicitRoleField(parent_role=['admin_role'])\n read_role = ImplicitRoleField(\n parent_role=[\n 'member_role',\n 'auditor_role',\n 'execute_role',\n 'project_admin_role',\n 'inventory_admin_role',\n 'workflow_admin_role',\n 'notification_admin_role',\n 'credential_admin_role',\n 'job_template_admin_role',\n 'approval_role',\n 'execution_environment_admin_role',\n ],\n )\n approval_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n\n def get_absolute_url(self, request=None):\n return reverse('api:organization_detail', kwargs={'pk': self.pk}, request=request)\n\n '''\n RelatedJobsMixin\n '''\n\n def _get_related_jobs(self):\n return UnifiedJob.objects.non_polymorphic().filter(organization=self)\n\n def create_default_galaxy_credential(self):\n from awx.main.models import Credential\n\n public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()\n if public_galaxy_credential is not None and public_galaxy_credential not in self.galaxy_credentials.all():\n self.galaxy_credentials.add(public_galaxy_credential)\n\n\nclass OrganizationGalaxyCredentialMembership(models.Model):\n\n organization = models.ForeignKey('Organization', on_delete=models.CASCADE)\n credential = models.ForeignKey('Credential', on_delete=models.CASCADE)\n position = models.PositiveIntegerField(\n null=True,\n default=None,\n db_index=True,\n )\n\n\nclass Team(CommonModelNameNotUnique, ResourceMixin):\n \"\"\"\n A team is a group of users that work on common projects.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n unique_together = [('organization', 'name')]\n ordering = ('organization__name', 'name')\n\n organization = models.ForeignKey(\n 'Organization',\n blank=False,\n null=False,\n on_delete=models.CASCADE,\n related_name='teams',\n )\n admin_role = ImplicitRoleField(\n parent_role='organization.admin_role',\n )\n member_role = ImplicitRoleField(\n parent_role='admin_role',\n )\n read_role = ImplicitRoleField(\n parent_role=['organization.auditor_role', 'member_role'],\n )\n\n def get_absolute_url(self, request=None):\n return reverse('api:team_detail', kwargs={'pk': self.pk}, request=request)\n\n\nclass Profile(CreatedModifiedModel):\n \"\"\"\n Profile model related to User object. Currently stores LDAP DN for users\n loaded from LDAP.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n\n user = AutoOneToOneField('auth.User', related_name='profile', editable=False, on_delete=models.CASCADE)\n ldap_dn = models.CharField(\n max_length=1024,\n default='',\n )\n\n\nclass UserSessionMembership(BaseModel):\n \"\"\"\n A lookup table for API session membership given user. 
Note, there is a\n different session created by channels for websockets using the same\n underlying model.\n \"\"\"\n\n class Meta:\n app_label = 'main'\n\n user = models.ForeignKey('auth.User', related_name='+', blank=False, null=False, on_delete=models.CASCADE)\n session = models.OneToOneField(Session, related_name='+', blank=False, null=False, on_delete=models.CASCADE)\n created = models.DateTimeField(default=None, editable=False)\n\n @staticmethod\n def get_memberships_over_limit(user_id, now=None):\n if settings.SESSIONS_PER_USER == -1:\n return []\n if now is None:\n now = tz_now()\n query_set = UserSessionMembership.objects.select_related('session').filter(user_id=user_id).order_by('-created')\n non_expire_memberships = [x for x in query_set if x.session.expire_date > now]\n return non_expire_memberships[settings.SESSIONS_PER_USER :]\n\n\n# Add get_absolute_url method to User model if not present.\nif not hasattr(User, 'get_absolute_url'):\n\n def user_get_absolute_url(user, request=None):\n return reverse('api:user_detail', kwargs={'pk': user.pk}, request=request)\n\n User.add_to_class('get_absolute_url', user_get_absolute_url)\n", "path": "awx/main/models/organization.py"}]} | 3,353 | 150 |
gh_patches_debug_2605 | rasdani/github-patches | git_diff | mathesar-foundation__mathesar-940 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DB Types in column.valid_target_types are not in sync with the types returned in database types endpoint
## Description
* `valid_target_types` property of column returns "DOUBLE PRECISION"
- Endpoint: /api/v0/tables/14/columns/
* Types endpoint returns mathesar types where Number has the db type "DOUBLE_PRECISION"
- http://localhost:8000/api/v0/databases/1/types/
- Mathesar type: Number
Note that "DOUBLE PRECISION" and "DOUBLE_PRECISION" differ from each other.
## Expected behavior
Both endpoints should return values with same spelling.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `db/types/base.py`
Content:
```
1 from enum import Enum
2
3 from sqlalchemy import create_engine
4
5 from db import constants
6
7
8 CHAR = 'char'
9 STRING = 'string'
10 VARCHAR = 'varchar'
11
12
13 class PostgresType(Enum):
14 """
15 This only includes built-in Postgres types that SQLAlchemy supports.
16 SQLAlchemy doesn't support XML. See zzzeek's comment on:
17 https://stackoverflow.com/questions/16153512/using-postgresql-xml-data-type-with-sqlalchemy
18 The values are keys returned by get_available_types.
19 """
20 _ARRAY = '_array'
21 BIGINT = 'bigint'
22 BIT_VARYING = 'bit varying'
23 BIT = 'bit'
24 BOOLEAN = 'boolean'
25 BYTEA = 'bytea'
26 CHAR = '"char"'
27 CHARACTER_VARYING = 'character varying'
28 CHARACTER = 'character'
29 CIDR = 'cidr'
30 DATE = 'date'
31 DATERANGE = 'daterange'
32 DECIMAL = 'decimal'
33 DOUBLE_PRECISION = 'double precision'
34 FLOAT = 'float'
35 HSTORE = 'hstore'
36 INET = 'inet'
37 INT4RANGE = 'int4range'
38 INT8RANGE = 'int8range'
39 INTEGER = 'integer'
40 INTERVAL = 'interval'
41 JSON = 'json'
42 JSONB = 'jsonb'
43 MACADDR = 'macaddr'
44 MONEY = 'money'
45 NAME = 'name'
46 NUMERIC = 'numeric'
47 NUMRANGE = 'numrange'
48 OID = 'oid'
49 REAL = 'real'
50 REGCLASS = 'regclass'
51 SMALLINT = 'smallint'
52 TEXT = 'text'
53 TIME = 'time'
54 TIME_WITH_TIME_ZONE = 'time with time zone'
55 TIME_WITHOUT_TIME_ZONE = 'time without time zone'
56 TIMESTAMP = 'timestamp'
57 TIMESTAMP_WITH_TIMESTAMP_ZONE = 'timestamp with time zone'
58 TIMESTAMP_WITHOUT_TIMESTAMP_ZONE = 'timestamp without time zone'
59 TSRANGE = 'tsrange'
60 TSTZRANGE = 'tstzrange'
61 TSVECTOR = 'tsvector'
62 UUID = 'uuid'
63
64
65 class MathesarCustomType(Enum):
66 """
67 This is a list of custom Mathesar DB types.
68 Keys returned by get_available_types are of the format 'mathesar_types.VALUE'
69 """
70 EMAIL = 'email'
71 URI = 'uri'
72 MONEY = 'money'
73
74
75 SCHEMA = f"{constants.MATHESAR_PREFIX}types"
76 # Since we want to have our identifiers quoted appropriately for use in
77 # PostgreSQL, we want to use the postgres dialect preparer to set this up.
78 preparer = create_engine("postgresql://").dialect.identifier_preparer
79
80
81 def get_qualified_name(name):
82 return ".".join([preparer.quote_schema(SCHEMA), name])
83
84
85 def get_available_types(engine):
86 return engine.dialect.ischema_names
87
88
89 def get_db_type_name(sa_type, engine):
90 USER_DEFINED_STR = 'user_defined'
91 db_type = sa_type.__visit_name__
92 if db_type == USER_DEFINED_STR:
93 db_type = sa_type().compile(engine.dialect)
94 return db_type
95
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/db/types/base.py b/db/types/base.py
--- a/db/types/base.py
+++ b/db/types/base.py
@@ -87,8 +87,8 @@
def get_db_type_name(sa_type, engine):
- USER_DEFINED_STR = 'user_defined'
- db_type = sa_type.__visit_name__
- if db_type == USER_DEFINED_STR:
- db_type = sa_type().compile(engine.dialect)
+ try:
+ db_type = sa_type.compile(dialect=engine.dialect)
+ except TypeError:
+ db_type = sa_type().compile(dialect=engine.dialect)
return db_type
| {"golden_diff": "diff --git a/db/types/base.py b/db/types/base.py\n--- a/db/types/base.py\n+++ b/db/types/base.py\n@@ -87,8 +87,8 @@\n \n \n def get_db_type_name(sa_type, engine):\n- USER_DEFINED_STR = 'user_defined'\n- db_type = sa_type.__visit_name__\n- if db_type == USER_DEFINED_STR:\n- db_type = sa_type().compile(engine.dialect)\n+ try:\n+ db_type = sa_type.compile(dialect=engine.dialect)\n+ except TypeError:\n+ db_type = sa_type().compile(dialect=engine.dialect)\n return db_type\n", "issue": "DB Types in column.valid_target_types are not in sync with the types returned in database types endpoint\n## Description\r\n* `valid_target_types` property of column returns \"DOUBLE PRECISION\"\r\n - Endpoint: /api/v0/tables/14/columns/\r\n* Types endpoint returns mathesar types where Number has the db type \"DOUBLE_PRECISION\"\r\n - http://localhost:8000/api/v0/databases/1/types/\r\n - Mathesar type: Number\r\n\r\nNote that \"DOUBLE PRECISION\" and \"DOUBLE_PRECISION\" differ from each other.\r\n\r\n## Expected behavior\r\nBoth endpoints should return values with same spelling.\r\n\n", "before_files": [{"content": "from enum import Enum\n\nfrom sqlalchemy import create_engine\n\nfrom db import constants\n\n\nCHAR = 'char'\nSTRING = 'string'\nVARCHAR = 'varchar'\n\n\nclass PostgresType(Enum):\n \"\"\"\n This only includes built-in Postgres types that SQLAlchemy supports.\n SQLAlchemy doesn't support XML. See zzzeek's comment on:\n https://stackoverflow.com/questions/16153512/using-postgresql-xml-data-type-with-sqlalchemy\n The values are keys returned by get_available_types.\n \"\"\"\n _ARRAY = '_array'\n BIGINT = 'bigint'\n BIT_VARYING = 'bit varying'\n BIT = 'bit'\n BOOLEAN = 'boolean'\n BYTEA = 'bytea'\n CHAR = '\"char\"'\n CHARACTER_VARYING = 'character varying'\n CHARACTER = 'character'\n CIDR = 'cidr'\n DATE = 'date'\n DATERANGE = 'daterange'\n DECIMAL = 'decimal'\n DOUBLE_PRECISION = 'double precision'\n FLOAT = 'float'\n HSTORE = 'hstore'\n INET = 'inet'\n INT4RANGE = 'int4range'\n INT8RANGE = 'int8range'\n INTEGER = 'integer'\n INTERVAL = 'interval'\n JSON = 'json'\n JSONB = 'jsonb'\n MACADDR = 'macaddr'\n MONEY = 'money'\n NAME = 'name'\n NUMERIC = 'numeric'\n NUMRANGE = 'numrange'\n OID = 'oid'\n REAL = 'real'\n REGCLASS = 'regclass'\n SMALLINT = 'smallint'\n TEXT = 'text'\n TIME = 'time'\n TIME_WITH_TIME_ZONE = 'time with time zone'\n TIME_WITHOUT_TIME_ZONE = 'time without time zone'\n TIMESTAMP = 'timestamp'\n TIMESTAMP_WITH_TIMESTAMP_ZONE = 'timestamp with time zone'\n TIMESTAMP_WITHOUT_TIMESTAMP_ZONE = 'timestamp without time zone'\n TSRANGE = 'tsrange'\n TSTZRANGE = 'tstzrange'\n TSVECTOR = 'tsvector'\n UUID = 'uuid'\n\n\nclass MathesarCustomType(Enum):\n \"\"\"\n This is a list of custom Mathesar DB types.\n Keys returned by get_available_types are of the format 'mathesar_types.VALUE'\n \"\"\"\n EMAIL = 'email'\n URI = 'uri'\n MONEY = 'money'\n\n\nSCHEMA = f\"{constants.MATHESAR_PREFIX}types\"\n# Since we want to have our identifiers quoted appropriately for use in\n# PostgreSQL, we want to use the postgres dialect preparer to set this up.\npreparer = create_engine(\"postgresql://\").dialect.identifier_preparer\n\n\ndef get_qualified_name(name):\n return \".\".join([preparer.quote_schema(SCHEMA), name])\n\n\ndef get_available_types(engine):\n return engine.dialect.ischema_names\n\n\ndef get_db_type_name(sa_type, engine):\n USER_DEFINED_STR = 'user_defined'\n db_type = sa_type.__visit_name__\n if db_type == USER_DEFINED_STR:\n db_type = sa_type().compile(engine.dialect)\n 
return db_type\n", "path": "db/types/base.py"}], "after_files": [{"content": "from enum import Enum\n\nfrom sqlalchemy import create_engine\n\nfrom db import constants\n\n\nCHAR = 'char'\nSTRING = 'string'\nVARCHAR = 'varchar'\n\n\nclass PostgresType(Enum):\n \"\"\"\n This only includes built-in Postgres types that SQLAlchemy supports.\n SQLAlchemy doesn't support XML. See zzzeek's comment on:\n https://stackoverflow.com/questions/16153512/using-postgresql-xml-data-type-with-sqlalchemy\n The values are keys returned by get_available_types.\n \"\"\"\n _ARRAY = '_array'\n BIGINT = 'bigint'\n BIT_VARYING = 'bit varying'\n BIT = 'bit'\n BOOLEAN = 'boolean'\n BYTEA = 'bytea'\n CHAR = '\"char\"'\n CHARACTER_VARYING = 'character varying'\n CHARACTER = 'character'\n CIDR = 'cidr'\n DATE = 'date'\n DATERANGE = 'daterange'\n DECIMAL = 'decimal'\n DOUBLE_PRECISION = 'double precision'\n FLOAT = 'float'\n HSTORE = 'hstore'\n INET = 'inet'\n INT4RANGE = 'int4range'\n INT8RANGE = 'int8range'\n INTEGER = 'integer'\n INTERVAL = 'interval'\n JSON = 'json'\n JSONB = 'jsonb'\n MACADDR = 'macaddr'\n MONEY = 'money'\n NAME = 'name'\n NUMERIC = 'numeric'\n NUMRANGE = 'numrange'\n OID = 'oid'\n REAL = 'real'\n REGCLASS = 'regclass'\n SMALLINT = 'smallint'\n TEXT = 'text'\n TIME = 'time'\n TIME_WITH_TIME_ZONE = 'time with time zone'\n TIME_WITHOUT_TIME_ZONE = 'time without time zone'\n TIMESTAMP = 'timestamp'\n TIMESTAMP_WITH_TIMESTAMP_ZONE = 'timestamp with time zone'\n TIMESTAMP_WITHOUT_TIMESTAMP_ZONE = 'timestamp without time zone'\n TSRANGE = 'tsrange'\n TSTZRANGE = 'tstzrange'\n TSVECTOR = 'tsvector'\n UUID = 'uuid'\n\n\nclass MathesarCustomType(Enum):\n \"\"\"\n This is a list of custom Mathesar DB types.\n Keys returned by get_available_types are of the format 'mathesar_types.VALUE'\n \"\"\"\n EMAIL = 'email'\n URI = 'uri'\n MONEY = 'money'\n\n\nSCHEMA = f\"{constants.MATHESAR_PREFIX}types\"\n# Since we want to have our identifiers quoted appropriately for use in\n# PostgreSQL, we want to use the postgres dialect preparer to set this up.\npreparer = create_engine(\"postgresql://\").dialect.identifier_preparer\n\n\ndef get_qualified_name(name):\n return \".\".join([preparer.quote_schema(SCHEMA), name])\n\n\ndef get_available_types(engine):\n return engine.dialect.ischema_names\n\n\ndef get_db_type_name(sa_type, engine):\n try:\n db_type = sa_type.compile(dialect=engine.dialect)\n except TypeError:\n db_type = sa_type().compile(dialect=engine.dialect)\n return db_type\n", "path": "db/types/base.py"}]} | 1,245 | 140 |
gh_patches_debug_6965 | rasdani/github-patches | git_diff | qtile__qtile-3225 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
widget.box does not close
Hi,
I'm running qtile on arch and after the latest update the widget.box does not close. It was working fine before the update.
*(two screenshots attached to the original issue are omitted here)*
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/widgetbox.py`
Content:
```
1 # Copyright (c) 2020 elParaguayo
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a copy
4 # of this software and associated documentation files (the "Software"), to deal
5 # in the Software without restriction, including without limitation the rights
6 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
7 # copies of the Software, and to permit persons to whom the Software is
8 # furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
18 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
19 # SOFTWARE.
20 from libqtile import bar
21 from libqtile.log_utils import logger
22 from libqtile.widget import Systray, base
23
24
25 class WidgetBox(base._Widget):
26 """A widget to declutter your bar.
27
28 WidgetBox is a widget that hides widgets by default but shows them when
29 the box is opened.
30
31 Widgets that are hidden will still update etc. as if they were on the main
32 bar.
33
34 Button clicks are passed to widgets when they are visible so callbacks will
35 work.
36
37 Widgets in the box also remain accessible via command interfaces.
38
39 Widgets can only be added to the box via the configuration file. The widget
40 is configured by adding widgets to the "widgets" parameter as follows::
41
42 widget.WidgetBox(widgets=[
43 widget.TextBox(text="This widget is in the box"),
44 widget.Memory()
45 ]
46 ),
47 """
48
49 orientations = base.ORIENTATION_HORIZONTAL
50 defaults = [
51 ("font", "sans", "Text font"),
52 ("fontsize", None, "Font pixel size. Calculated if None."),
53 ("fontshadow", None, "font shadow color, default is None(no shadow)"),
54 ("foreground", "#ffffff", "Foreground colour."),
55 (
56 "close_button_location",
57 "left",
58 "Location of close button when box open ('left' or 'right')",
59 ),
60 ("text_closed", "[<]", "Text when box is closed"),
61 ("text_open", "[>]", "Text when box is open"),
62 ]
63
64 def __init__(self, widgets: list = None, **config):
65 base._Widget.__init__(self, bar.CALCULATED, **config)
66 self.add_defaults(WidgetBox.defaults)
67 self.box_is_open = False
68 self.widgets = widgets if widgets is not None else []
69 self.add_callbacks({"Button1": self.cmd_toggle})
70
71 self.close_button_location: str
72 if self.close_button_location not in ["left", "right"]:
73 val = self.close_button_location
74 msg = "Invalid value for 'close_button_location': {}".format(val)
75 logger.warning(msg)
76 self.close_button_location = "left"
77
78 def _configure(self, qtile, bar):
79 base._Widget._configure(self, qtile, bar)
80
81 self.layout = self.drawer.textlayout(
82 self.text_closed,
83 self.foreground,
84 self.font,
85 self.fontsize,
86 self.fontshadow,
87 markup=False,
88 )
89
90 for idx, w in enumerate(self.widgets):
91 if w.configured:
92 w = w.create_mirror()
93 self.widgets[idx] = w
94 self.qtile.register_widget(w)
95 w._configure(self.qtile, self.bar)
96
97 # In case the widget is mirrored, we need to draw it once so the
98 # mirror can copy the surface but draw it off screen
99 w.offsetx = self.bar.width
100 self.qtile.call_soon(w.draw)
101
102 # Disable drawing of the widget's contents
103 for w in self.widgets:
104 w.drawer.disable()
105
106 def calculate_length(self):
107 return self.layout.width
108
109 def set_box_label(self):
110 self.layout.text = self.text_open if self.box_is_open else self.text_closed
111
112 def toggle_widgets(self):
113 for widget in self.widgets:
114 try:
115 self.bar.widgets.remove(widget)
116 # Override drawer.drawer with a no-op
117 widget.drawer.disable()
118
119 # Systray widget needs some additional steps to hide as the icons
120 # are separate _Window instances.
121 # Systray unhides icons when it draws so we only need to hide them.
122 if isinstance(widget, Systray):
123 for icon in widget.icons.values():
124 icon.hide()
125
126 except ValueError:
127 continue
128
129 index = self.bar.widgets.index(self)
130
131 if self.close_button_location == "left":
132 index += 1
133
134 if self.box_is_open:
135
136 # Need to reverse list as widgets get added in front of eachother.
137 for widget in self.widgets[::-1]:
138 # enable drawing again
139 widget.drawer.enable()
140 self.bar.widgets.insert(index, widget)
141
142 def draw(self):
143 self.drawer.clear(self.background or self.bar.background)
144
145 self.layout.draw(0, int(self.bar.height / 2.0 - self.layout.height / 2.0) + 1)
146
147 self.drawer.draw(offsetx=self.offsetx, offsety=self.offsety, width=self.width)
148
149 def cmd_toggle(self):
150 """Toggle box state"""
151 self.box_is_open = not self.box_is_open
152 self.toggle_widgets()
153 self.set_box_label()
154 self.bar.draw()
155
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libqtile/widget/widgetbox.py b/libqtile/widget/widgetbox.py
--- a/libqtile/widget/widgetbox.py
+++ b/libqtile/widget/widgetbox.py
@@ -120,7 +120,7 @@
                 # are separate _Window instances.
                 # Systray unhides icons when it draws so we only need to hide them.
                 if isinstance(widget, Systray):
-                    for icon in widget.icons.values():
+                    for icon in widget.tray_icons:
                         icon.hide()
 
             except ValueError:
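The patch above is small but worth a note: in the `before` code the Systray branch iterates `widget.icons.values()` inside a `try/except ValueError`, so if a newer Systray no longer exposes an `icons` dict the loop raises `AttributeError`, `toggle_widgets` aborts, and the box never closes. The fix iterates the `tray_icons` attribute instead. A minimal compatibility sketch follows; the helper name and the fallback to the legacy `icons` dict are illustrative assumptions, not part of the patch:

```python
# Sketch only: yield a Systray widget's icon windows regardless of which
# attribute the installed qtile exposes. `tray_icons` (a list) is what the
# patch relies on; falling back to the legacy `icons` dict is an assumption
# made here for older releases.
def iter_systray_icons(systray):
    icons = getattr(systray, "tray_icons", None)
    if icons is None:
        icons = list(getattr(systray, "icons", {}).values())
    for icon in icons:
        yield icon
```

Hiding icons through such a helper would keep `toggle_widgets` working against either attribute, though the upstream patch simply targets the new one.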
| {"golden_diff": "diff --git a/libqtile/widget/widgetbox.py b/libqtile/widget/widgetbox.py\n--- a/libqtile/widget/widgetbox.py\n+++ b/libqtile/widget/widgetbox.py\n@@ -120,7 +120,7 @@\n # are separate _Window instances.\n # Systray unhides icons when it draws so we only need to hide them.\n if isinstance(widget, Systray):\n- for icon in widget.icons.values():\n+ for icon in widget.tray_icons:\n icon.hide()\n \n except ValueError:\n", "issue": "widget.box does not close\nHi,\r\n\r\n I'm running qtile on arch and after the latest update the widget.box does not close. It was working fine before the update. \r\n\r\n\r\n\r\n\r\n \n", "before_files": [{"content": "# Copyright (c) 2020 elParaguayo\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\nfrom libqtile import bar\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import Systray, base\n\n\nclass WidgetBox(base._Widget):\n \"\"\"A widget to declutter your bar.\n\n WidgetBox is a widget that hides widgets by default but shows them when\n the box is opened.\n\n Widgets that are hidden will still update etc. as if they were on the main\n bar.\n\n Button clicks are passed to widgets when they are visible so callbacks will\n work.\n\n Widgets in the box also remain accessible via command interfaces.\n\n Widgets can only be added to the box via the configuration file. The widget\n is configured by adding widgets to the \"widgets\" parameter as follows::\n\n widget.WidgetBox(widgets=[\n widget.TextBox(text=\"This widget is in the box\"),\n widget.Memory()\n ]\n ),\n \"\"\"\n\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"font\", \"sans\", \"Text font\"),\n (\"fontsize\", None, \"Font pixel size. 
Calculated if None.\"),\n (\"fontshadow\", None, \"font shadow color, default is None(no shadow)\"),\n (\"foreground\", \"#ffffff\", \"Foreground colour.\"),\n (\n \"close_button_location\",\n \"left\",\n \"Location of close button when box open ('left' or 'right')\",\n ),\n (\"text_closed\", \"[<]\", \"Text when box is closed\"),\n (\"text_open\", \"[>]\", \"Text when box is open\"),\n ]\n\n def __init__(self, widgets: list = None, **config):\n base._Widget.__init__(self, bar.CALCULATED, **config)\n self.add_defaults(WidgetBox.defaults)\n self.box_is_open = False\n self.widgets = widgets if widgets is not None else []\n self.add_callbacks({\"Button1\": self.cmd_toggle})\n\n self.close_button_location: str\n if self.close_button_location not in [\"left\", \"right\"]:\n val = self.close_button_location\n msg = \"Invalid value for 'close_button_location': {}\".format(val)\n logger.warning(msg)\n self.close_button_location = \"left\"\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n\n self.layout = self.drawer.textlayout(\n self.text_closed,\n self.foreground,\n self.font,\n self.fontsize,\n self.fontshadow,\n markup=False,\n )\n\n for idx, w in enumerate(self.widgets):\n if w.configured:\n w = w.create_mirror()\n self.widgets[idx] = w\n self.qtile.register_widget(w)\n w._configure(self.qtile, self.bar)\n\n # In case the widget is mirrored, we need to draw it once so the\n # mirror can copy the surface but draw it off screen\n w.offsetx = self.bar.width\n self.qtile.call_soon(w.draw)\n\n # Disable drawing of the widget's contents\n for w in self.widgets:\n w.drawer.disable()\n\n def calculate_length(self):\n return self.layout.width\n\n def set_box_label(self):\n self.layout.text = self.text_open if self.box_is_open else self.text_closed\n\n def toggle_widgets(self):\n for widget in self.widgets:\n try:\n self.bar.widgets.remove(widget)\n # Override drawer.drawer with a no-op\n widget.drawer.disable()\n\n # Systray widget needs some additional steps to hide as the icons\n # are separate _Window instances.\n # Systray unhides icons when it draws so we only need to hide them.\n if isinstance(widget, Systray):\n for icon in widget.icons.values():\n icon.hide()\n\n except ValueError:\n continue\n\n index = self.bar.widgets.index(self)\n\n if self.close_button_location == \"left\":\n index += 1\n\n if self.box_is_open:\n\n # Need to reverse list as widgets get added in front of eachother.\n for widget in self.widgets[::-1]:\n # enable drawing again\n widget.drawer.enable()\n self.bar.widgets.insert(index, widget)\n\n def draw(self):\n self.drawer.clear(self.background or self.bar.background)\n\n self.layout.draw(0, int(self.bar.height / 2.0 - self.layout.height / 2.0) + 1)\n\n self.drawer.draw(offsetx=self.offsetx, offsety=self.offsety, width=self.width)\n\n def cmd_toggle(self):\n \"\"\"Toggle box state\"\"\"\n self.box_is_open = not self.box_is_open\n self.toggle_widgets()\n self.set_box_label()\n self.bar.draw()\n", "path": "libqtile/widget/widgetbox.py"}], "after_files": [{"content": "# Copyright (c) 2020 elParaguayo\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following 
conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\nfrom libqtile import bar\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import Systray, base\n\n\nclass WidgetBox(base._Widget):\n \"\"\"A widget to declutter your bar.\n\n WidgetBox is a widget that hides widgets by default but shows them when\n the box is opened.\n\n Widgets that are hidden will still update etc. as if they were on the main\n bar.\n\n Button clicks are passed to widgets when they are visible so callbacks will\n work.\n\n Widgets in the box also remain accessible via command interfaces.\n\n Widgets can only be added to the box via the configuration file. The widget\n is configured by adding widgets to the \"widgets\" parameter as follows::\n\n widget.WidgetBox(widgets=[\n widget.TextBox(text=\"This widget is in the box\"),\n widget.Memory()\n ]\n ),\n \"\"\"\n\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"font\", \"sans\", \"Text font\"),\n (\"fontsize\", None, \"Font pixel size. Calculated if None.\"),\n (\"fontshadow\", None, \"font shadow color, default is None(no shadow)\"),\n (\"foreground\", \"#ffffff\", \"Foreground colour.\"),\n (\n \"close_button_location\",\n \"left\",\n \"Location of close button when box open ('left' or 'right')\",\n ),\n (\"text_closed\", \"[<]\", \"Text when box is closed\"),\n (\"text_open\", \"[>]\", \"Text when box is open\"),\n ]\n\n def __init__(self, widgets: list = None, **config):\n base._Widget.__init__(self, bar.CALCULATED, **config)\n self.add_defaults(WidgetBox.defaults)\n self.box_is_open = False\n self.widgets = widgets if widgets is not None else []\n self.add_callbacks({\"Button1\": self.cmd_toggle})\n\n self.close_button_location: str\n if self.close_button_location not in [\"left\", \"right\"]:\n val = self.close_button_location\n msg = \"Invalid value for 'close_button_location': {}\".format(val)\n logger.warning(msg)\n self.close_button_location = \"left\"\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n\n self.layout = self.drawer.textlayout(\n self.text_closed,\n self.foreground,\n self.font,\n self.fontsize,\n self.fontshadow,\n markup=False,\n )\n\n for idx, w in enumerate(self.widgets):\n if w.configured:\n w = w.create_mirror()\n self.widgets[idx] = w\n self.qtile.register_widget(w)\n w._configure(self.qtile, self.bar)\n\n # In case the widget is mirrored, we need to draw it once so the\n # mirror can copy the surface but draw it off screen\n w.offsetx = self.bar.width\n self.qtile.call_soon(w.draw)\n\n # Disable drawing of the widget's contents\n for w in self.widgets:\n w.drawer.disable()\n\n def calculate_length(self):\n return self.layout.width\n\n def set_box_label(self):\n self.layout.text = self.text_open if self.box_is_open else self.text_closed\n\n def toggle_widgets(self):\n for widget in self.widgets:\n try:\n self.bar.widgets.remove(widget)\n # Override drawer.drawer with a no-op\n widget.drawer.disable()\n\n # 
Systray widget needs some additional steps to hide as the icons\n # are separate _Window instances.\n # Systray unhides icons when it draws so we only need to hide them.\n if isinstance(widget, Systray):\n for icon in widget.tray_icons:\n icon.hide()\n\n except ValueError:\n continue\n\n index = self.bar.widgets.index(self)\n\n if self.close_button_location == \"left\":\n index += 1\n\n if self.box_is_open:\n\n # Need to reverse list as widgets get added in front of eachother.\n for widget in self.widgets[::-1]:\n # enable drawing again\n widget.drawer.enable()\n self.bar.widgets.insert(index, widget)\n\n def draw(self):\n self.drawer.clear(self.background or self.bar.background)\n\n self.layout.draw(0, int(self.bar.height / 2.0 - self.layout.height / 2.0) + 1)\n\n self.drawer.draw(offsetx=self.offsetx, offsety=self.offsety, width=self.width)\n\n def cmd_toggle(self):\n \"\"\"Toggle box state\"\"\"\n self.box_is_open = not self.box_is_open\n self.toggle_widgets()\n self.set_box_label()\n self.bar.draw()\n", "path": "libqtile/widget/widgetbox.py"}]} | 2,016 | 118 |