problem_id (string, 18-22 chars) | source (string, 1 distinct value) | task_type (string, 1 distinct value) | in_source_id (string, 13-58 chars) | prompt (string, 1.1k-10.2k chars) | golden_diff (string, 151-4.94k chars) | verification_info (string, 582-21k chars) | num_tokens (int64, 271-2.05k) | num_tokens_diff (int64, 47-1.02k)
---|---|---|---|---|---|---|---|---|
gh_patches_debug_39662 | rasdani/github-patches | git_diff | jupyterhub__zero-to-jupyterhub-k8s-531 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
culler is failing and exiting when culling users and servers are slow to stop
Two issues:
1. culler script seems to exit when the cull request fails. It's unclear why this happens, but we should confirm and fix this
2. the 'real' issue is that the culler is hitting 400 errors in the first place. The cause is servers that are slow to stop (DELETE /users/:name gives 400 if the user's server is running and cannot stop promptly). The previous request to stop the server will have returned 202 ACCEPTED instead of 204 DELETED in this case. If we delay deleting users if we get 202 ACCEPTED from the server deletion, we should be safe here.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `images/hub/cull_idle_servers.py`
Content:
```
1 #!/usr/bin/env python3
2 # Imported from https://github.com/jupyterhub/jupyterhub/blob/0.8.0rc1/examples/cull-idle/cull_idle_servers.py
3 """script to monitor and cull idle single-user servers
4
5 Caveats:
6
7 last_activity is not updated with high frequency,
8 so cull timeout should be greater than the sum of:
9
10 - single-user websocket ping interval (default: 30s)
11 - JupyterHub.last_activity_interval (default: 5 minutes)
12
13 You can run this as a service managed by JupyterHub with this in your config::
14
15
16 c.JupyterHub.services = [
17 {
18 'name': 'cull-idle',
19 'admin': True,
20 'command': 'python cull_idle_servers.py --timeout=3600'.split(),
21 }
22 ]
23
24 Or run it manually by generating an API token and storing it in `JUPYTERHUB_API_TOKEN`:
25
26 export JUPYTERHUB_API_TOKEN=`jupyterhub token`
27 python cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]
28 """
29
30 import datetime
31 import json
32 import os
33
34 from dateutil.parser import parse as parse_date
35
36 from tornado.gen import coroutine
37 from tornado.log import app_log
38 from tornado.httpclient import AsyncHTTPClient, HTTPRequest
39 from tornado.ioloop import IOLoop, PeriodicCallback
40 from tornado.options import define, options, parse_command_line
41
42
43 @coroutine
44 def cull_idle(url, api_token, timeout, cull_users=False):
45 """Shutdown idle single-user servers
46
47 If cull_users, inactive *users* will be deleted as well.
48 """
49 auth_header = {
50 'Authorization': 'token %s' % api_token
51 }
52 req = HTTPRequest(url=url + '/users',
53 headers=auth_header,
54 )
55 now = datetime.datetime.utcnow()
56 cull_limit = now - datetime.timedelta(seconds=timeout)
57 client = AsyncHTTPClient()
58 resp = yield client.fetch(req)
59 users = json.loads(resp.body.decode('utf8', 'replace'))
60 futures = []
61
62 @coroutine
63 def cull_one(user, last_activity):
64 """cull one user"""
65
66 # shutdown server first. Hub doesn't allow deleting users with running servers.
67 if user['server']:
68 app_log.info("Culling server for %s (inactive since %s)", user['name'], last_activity)
69 req = HTTPRequest(url=url + '/users/%s/server' % user['name'],
70 method='DELETE',
71 headers=auth_header,
72 )
73 yield client.fetch(req)
74 if cull_users:
75 app_log.info("Culling user %s (inactive since %s)", user['name'], last_activity)
76 req = HTTPRequest(url=url + '/users/%s' % user['name'],
77 method='DELETE',
78 headers=auth_header,
79 )
80 yield client.fetch(req)
81
82 for user in users:
83 if not user['server'] and not cull_users:
84 # server not running and not culling users, nothing to do
85 continue
86 last_activity = parse_date(user['last_activity'])
87 if last_activity < cull_limit:
88 futures.append((user['name'], cull_one(user, last_activity)))
89 else:
90 app_log.debug("Not culling %s (active since %s)", user['name'], last_activity)
91
92 for (name, f) in futures:
93 yield f
94 app_log.debug("Finished culling %s", name)
95
96
97 if __name__ == '__main__':
98 define('url', default=os.environ.get('JUPYTERHUB_API_URL'), help="The JupyterHub API URL")
99 define('timeout', default=600, help="The idle timeout (in seconds)")
100 define('cull_every', default=0, help="The interval (in seconds) for checking for idle servers to cull")
101 define('cull_users', default=False,
102 help="""Cull users in addition to servers.
103 This is for use in temporary-user cases such as tmpnb.""",
104 )
105
106 parse_command_line()
107 if not options.cull_every:
108 options.cull_every = options.timeout // 2
109 api_token = os.environ['JUPYTERHUB_API_TOKEN']
110
111 loop = IOLoop.current()
112 cull = lambda : cull_idle(options.url, api_token, options.timeout, options.cull_users)
113 # run once before scheduling periodic call
114 loop.run_sync(cull)
115 # schedule periodic cull
116 pc = PeriodicCallback(cull, 1e3 * options.cull_every)
117 pc.start()
118 try:
119 loop.start()
120 except KeyboardInterrupt:
121 pass
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/images/hub/cull_idle_servers.py b/images/hub/cull_idle_servers.py
--- a/images/hub/cull_idle_servers.py
+++ b/images/hub/cull_idle_servers.py
@@ -70,7 +70,15 @@
method='DELETE',
headers=auth_header,
)
- yield client.fetch(req)
+ resp = yield client.fetch(req)
+ if resp.code == 202:
+ msg = "Server for {} is slow to stop.".format(user['name'])
+ if cull_users:
+ app_log.warning(msg + " Not culling user yet.")
+ # return here so we don't continue to cull the user
+ # which will fail if the server is still trying to shutdown
+ return
+ app_log.warning(msg)
if cull_users:
app_log.info("Culling user %s (inactive since %s)", user['name'], last_activity)
req = HTTPRequest(url=url + '/users/%s' % user['name'],
@@ -83,12 +91,19 @@
if not user['server'] and not cull_users:
# server not running and not culling users, nothing to do
continue
+ if not user['last_activity']:
+ continue
last_activity = parse_date(user['last_activity'])
if last_activity < cull_limit:
+ # user might be in a transition (e.g. starting or stopping)
+ # don't try to cull if this is happening
+ if user['pending']:
+ app_log.warning("Not culling user %s with pending %s", user['name'], user['pending'])
+ continue
futures.append((user['name'], cull_one(user, last_activity)))
else:
app_log.debug("Not culling %s (active since %s)", user['name'], last_activity)
-
+
for (name, f) in futures:
yield f
app_log.debug("Finished culling %s", name)
@@ -102,16 +117,17 @@
help="""Cull users in addition to servers.
This is for use in temporary-user cases such as tmpnb.""",
)
-
+
parse_command_line()
if not options.cull_every:
options.cull_every = options.timeout // 2
api_token = os.environ['JUPYTERHUB_API_TOKEN']
-
+
loop = IOLoop.current()
cull = lambda : cull_idle(options.url, api_token, options.timeout, options.cull_users)
- # run once before scheduling periodic call
- loop.run_sync(cull)
+ # schedule first cull immediately
+ # because PeriodicCallback doesn't start until the end of the first interval
+ loop.add_callback(cull)
# schedule periodic cull
pc = PeriodicCallback(cull, 1e3 * options.cull_every)
pc.start()
| {"golden_diff": "diff --git a/images/hub/cull_idle_servers.py b/images/hub/cull_idle_servers.py\n--- a/images/hub/cull_idle_servers.py\n+++ b/images/hub/cull_idle_servers.py\n@@ -70,7 +70,15 @@\n method='DELETE',\n headers=auth_header,\n )\n- yield client.fetch(req)\n+ resp = yield client.fetch(req)\n+ if resp.code == 202:\n+ msg = \"Server for {} is slow to stop.\".format(user['name'])\n+ if cull_users:\n+ app_log.warning(msg + \" Not culling user yet.\")\n+ # return here so we don't continue to cull the user\n+ # which will fail if the server is still trying to shutdown\n+ return\n+ app_log.warning(msg)\n if cull_users:\n app_log.info(\"Culling user %s (inactive since %s)\", user['name'], last_activity)\n req = HTTPRequest(url=url + '/users/%s' % user['name'],\n@@ -83,12 +91,19 @@\n if not user['server'] and not cull_users:\n # server not running and not culling users, nothing to do\n continue\n+ if not user['last_activity']:\n+ continue\n last_activity = parse_date(user['last_activity'])\n if last_activity < cull_limit:\n+ # user might be in a transition (e.g. starting or stopping)\n+ # don't try to cull if this is happening\n+ if user['pending']:\n+ app_log.warning(\"Not culling user %s with pending %s\", user['name'], user['pending'])\n+ continue\n futures.append((user['name'], cull_one(user, last_activity)))\n else:\n app_log.debug(\"Not culling %s (active since %s)\", user['name'], last_activity)\n- \n+\n for (name, f) in futures:\n yield f\n app_log.debug(\"Finished culling %s\", name)\n@@ -102,16 +117,17 @@\n help=\"\"\"Cull users in addition to servers.\n This is for use in temporary-user cases such as tmpnb.\"\"\",\n )\n- \n+\n parse_command_line()\n if not options.cull_every:\n options.cull_every = options.timeout // 2\n api_token = os.environ['JUPYTERHUB_API_TOKEN']\n- \n+\n loop = IOLoop.current()\n cull = lambda : cull_idle(options.url, api_token, options.timeout, options.cull_users)\n- # run once before scheduling periodic call\n- loop.run_sync(cull)\n+ # schedule first cull immediately\n+ # because PeriodicCallback doesn't start until the end of the first interval\n+ loop.add_callback(cull)\n # schedule periodic cull\n pc = PeriodicCallback(cull, 1e3 * options.cull_every)\n pc.start()\n", "issue": "culler is failing and exiting when culling users and servers are slow to stop\nTwo issues:\r\n\r\n1. culler script seems to exit when the cull request fails. It's unclear why this happens, but we should confirm and fix this\r\n2. the 'real' issue is that the culler is hitting 400 errors in the first place. The cause is servers that are slow to stop (DELETE /users/:name gives 400 if the user's server is running and cannot stop promptly). The previous request to stop the server will have returned 202 ACCEPTED instead of 204 DELETED in this case. 
If we delay deleting users if we get 202 ACCEPTED from the server deletion, we should be safe here.\n", "before_files": [{"content": "#!/usr/bin/env python3\n# Imported from https://github.com/jupyterhub/jupyterhub/blob/0.8.0rc1/examples/cull-idle/cull_idle_servers.py\n\"\"\"script to monitor and cull idle single-user servers\n\nCaveats:\n\nlast_activity is not updated with high frequency,\nso cull timeout should be greater than the sum of:\n\n- single-user websocket ping interval (default: 30s)\n- JupyterHub.last_activity_interval (default: 5 minutes)\n\nYou can run this as a service managed by JupyterHub with this in your config::\n\n\n c.JupyterHub.services = [\n {\n 'name': 'cull-idle',\n 'admin': True,\n 'command': 'python cull_idle_servers.py --timeout=3600'.split(),\n }\n ]\n\nOr run it manually by generating an API token and storing it in `JUPYTERHUB_API_TOKEN`:\n\n export JUPYTERHUB_API_TOKEN=`jupyterhub token`\n python cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]\n\"\"\"\n\nimport datetime\nimport json\nimport os\n\nfrom dateutil.parser import parse as parse_date\n\nfrom tornado.gen import coroutine\nfrom tornado.log import app_log\nfrom tornado.httpclient import AsyncHTTPClient, HTTPRequest\nfrom tornado.ioloop import IOLoop, PeriodicCallback\nfrom tornado.options import define, options, parse_command_line\n\n\n@coroutine\ndef cull_idle(url, api_token, timeout, cull_users=False):\n \"\"\"Shutdown idle single-user servers\n\n If cull_users, inactive *users* will be deleted as well.\n \"\"\"\n auth_header = {\n 'Authorization': 'token %s' % api_token\n }\n req = HTTPRequest(url=url + '/users',\n headers=auth_header,\n )\n now = datetime.datetime.utcnow()\n cull_limit = now - datetime.timedelta(seconds=timeout)\n client = AsyncHTTPClient()\n resp = yield client.fetch(req)\n users = json.loads(resp.body.decode('utf8', 'replace'))\n futures = []\n\n @coroutine\n def cull_one(user, last_activity):\n \"\"\"cull one user\"\"\"\n\n # shutdown server first. 
Hub doesn't allow deleting users with running servers.\n if user['server']:\n app_log.info(\"Culling server for %s (inactive since %s)\", user['name'], last_activity)\n req = HTTPRequest(url=url + '/users/%s/server' % user['name'],\n method='DELETE',\n headers=auth_header,\n )\n yield client.fetch(req)\n if cull_users:\n app_log.info(\"Culling user %s (inactive since %s)\", user['name'], last_activity)\n req = HTTPRequest(url=url + '/users/%s' % user['name'],\n method='DELETE',\n headers=auth_header,\n )\n yield client.fetch(req)\n\n for user in users:\n if not user['server'] and not cull_users:\n # server not running and not culling users, nothing to do\n continue\n last_activity = parse_date(user['last_activity'])\n if last_activity < cull_limit:\n futures.append((user['name'], cull_one(user, last_activity)))\n else:\n app_log.debug(\"Not culling %s (active since %s)\", user['name'], last_activity)\n \n for (name, f) in futures:\n yield f\n app_log.debug(\"Finished culling %s\", name)\n\n\nif __name__ == '__main__':\n define('url', default=os.environ.get('JUPYTERHUB_API_URL'), help=\"The JupyterHub API URL\")\n define('timeout', default=600, help=\"The idle timeout (in seconds)\")\n define('cull_every', default=0, help=\"The interval (in seconds) for checking for idle servers to cull\")\n define('cull_users', default=False,\n help=\"\"\"Cull users in addition to servers.\n This is for use in temporary-user cases such as tmpnb.\"\"\",\n )\n \n parse_command_line()\n if not options.cull_every:\n options.cull_every = options.timeout // 2\n api_token = os.environ['JUPYTERHUB_API_TOKEN']\n \n loop = IOLoop.current()\n cull = lambda : cull_idle(options.url, api_token, options.timeout, options.cull_users)\n # run once before scheduling periodic call\n loop.run_sync(cull)\n # schedule periodic cull\n pc = PeriodicCallback(cull, 1e3 * options.cull_every)\n pc.start()\n try:\n loop.start()\n except KeyboardInterrupt:\n pass\n", "path": "images/hub/cull_idle_servers.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# Imported from https://github.com/jupyterhub/jupyterhub/blob/0.8.0rc1/examples/cull-idle/cull_idle_servers.py\n\"\"\"script to monitor and cull idle single-user servers\n\nCaveats:\n\nlast_activity is not updated with high frequency,\nso cull timeout should be greater than the sum of:\n\n- single-user websocket ping interval (default: 30s)\n- JupyterHub.last_activity_interval (default: 5 minutes)\n\nYou can run this as a service managed by JupyterHub with this in your config::\n\n\n c.JupyterHub.services = [\n {\n 'name': 'cull-idle',\n 'admin': True,\n 'command': 'python cull_idle_servers.py --timeout=3600'.split(),\n }\n ]\n\nOr run it manually by generating an API token and storing it in `JUPYTERHUB_API_TOKEN`:\n\n export JUPYTERHUB_API_TOKEN=`jupyterhub token`\n python cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]\n\"\"\"\n\nimport datetime\nimport json\nimport os\n\nfrom dateutil.parser import parse as parse_date\n\nfrom tornado.gen import coroutine\nfrom tornado.log import app_log\nfrom tornado.httpclient import AsyncHTTPClient, HTTPRequest\nfrom tornado.ioloop import IOLoop, PeriodicCallback\nfrom tornado.options import define, options, parse_command_line\n\n\n@coroutine\ndef cull_idle(url, api_token, timeout, cull_users=False):\n \"\"\"Shutdown idle single-user servers\n\n If cull_users, inactive *users* will be deleted as well.\n \"\"\"\n auth_header = {\n 'Authorization': 'token %s' % api_token\n }\n req = HTTPRequest(url=url + 
'/users',\n headers=auth_header,\n )\n now = datetime.datetime.utcnow()\n cull_limit = now - datetime.timedelta(seconds=timeout)\n client = AsyncHTTPClient()\n resp = yield client.fetch(req)\n users = json.loads(resp.body.decode('utf8', 'replace'))\n futures = []\n\n @coroutine\n def cull_one(user, last_activity):\n \"\"\"cull one user\"\"\"\n\n # shutdown server first. Hub doesn't allow deleting users with running servers.\n if user['server']:\n app_log.info(\"Culling server for %s (inactive since %s)\", user['name'], last_activity)\n req = HTTPRequest(url=url + '/users/%s/server' % user['name'],\n method='DELETE',\n headers=auth_header,\n )\n resp = yield client.fetch(req)\n if resp.code == 202:\n msg = \"Server for {} is slow to stop.\".format(user['name'])\n if cull_users:\n app_log.warning(msg + \" Not culling user yet.\")\n # return here so we don't continue to cull the user\n # which will fail if the server is still trying to shutdown\n return\n app_log.warning(msg)\n if cull_users:\n app_log.info(\"Culling user %s (inactive since %s)\", user['name'], last_activity)\n req = HTTPRequest(url=url + '/users/%s' % user['name'],\n method='DELETE',\n headers=auth_header,\n )\n yield client.fetch(req)\n\n for user in users:\n if not user['server'] and not cull_users:\n # server not running and not culling users, nothing to do\n continue\n if not user['last_activity']:\n continue\n last_activity = parse_date(user['last_activity'])\n if last_activity < cull_limit:\n # user might be in a transition (e.g. starting or stopping)\n # don't try to cull if this is happening\n if user['pending']:\n app_log.warning(\"Not culling user %s with pending %s\", user['name'], user['pending'])\n continue\n futures.append((user['name'], cull_one(user, last_activity)))\n else:\n app_log.debug(\"Not culling %s (active since %s)\", user['name'], last_activity)\n\n for (name, f) in futures:\n yield f\n app_log.debug(\"Finished culling %s\", name)\n\n\nif __name__ == '__main__':\n define('url', default=os.environ.get('JUPYTERHUB_API_URL'), help=\"The JupyterHub API URL\")\n define('timeout', default=600, help=\"The idle timeout (in seconds)\")\n define('cull_every', default=0, help=\"The interval (in seconds) for checking for idle servers to cull\")\n define('cull_users', default=False,\n help=\"\"\"Cull users in addition to servers.\n This is for use in temporary-user cases such as tmpnb.\"\"\",\n )\n\n parse_command_line()\n if not options.cull_every:\n options.cull_every = options.timeout // 2\n api_token = os.environ['JUPYTERHUB_API_TOKEN']\n\n loop = IOLoop.current()\n cull = lambda : cull_idle(options.url, api_token, options.timeout, options.cull_users)\n # schedule first cull immediately\n # because PeriodicCallback doesn't start until the end of the first interval\n loop.add_callback(cull)\n # schedule periodic cull\n pc = PeriodicCallback(cull, 1e3 * options.cull_every)\n pc.start()\n try:\n loop.start()\n except KeyboardInterrupt:\n pass\n", "path": "images/hub/cull_idle_servers.py"}]} | 1,716 | 647 |
gh_patches_debug_19230 | rasdani/github-patches | git_diff | google__clusterfuzz-863 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fuzzers page does not work well with large number of jobs
For libFuzzer, if we have a lot of existing jobs (>100) and want to add a new job and associate it. Submit button feels stuck, does not show updates, and take 1-2 min to finish. Can we show some update or something better to optimize this when only one job is updated.
@oliverchang as fyi.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/python/fuzzing/fuzzer_selection.py`
Content:
```
1 # Copyright 2019 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """Helper functions to update fuzzer-job mappings, and select fuzzers to run."""
15
16 import collections
17
18 from base import utils
19 from datastore import data_types
20 from datastore import fuzz_target_utils
21 from datastore import ndb
22 from datastore import ndb_utils
23 from metrics import logs
24 from system import environment
25
26 # Used to prepare targets to be passed to utils.random_weighted_choice.
27 WeightedTarget = collections.namedtuple('WeightedTarget', ['target', 'weight'])
28
29
30 def update_mappings_for_fuzzer(fuzzer, mappings=None):
31 """Clear existing mappings for a fuzzer, and replace them."""
32 if mappings is None:
33 mappings = fuzzer.jobs
34
35 query = data_types.FuzzerJob.query()
36 query = query.filter(data_types.FuzzerJob.fuzzer == fuzzer.name)
37 entities = ndb_utils.get_all_from_query(query)
38 old_mappings = {}
39 for entity in entities:
40 old_mappings[(entity.job, entity.platform)] = entity
41
42 new_mappings = []
43 for job_name in mappings:
44 job = data_types.Job.query(data_types.Job.name == job_name).get()
45 if not job:
46 logs.log_error('An unknown job %s was selected for fuzzer %s.' %
47 (job_name, fuzzer.name))
48 continue
49
50 mapping = old_mappings.pop((job_name, job.platform), None)
51 if mapping:
52 continue
53
54 mapping = data_types.FuzzerJob()
55 mapping.fuzzer = fuzzer.name
56 mapping.job = job_name
57 mapping.platform = job.platform
58 new_mappings.append(mapping)
59
60 ndb.put_multi(new_mappings)
61 ndb.delete_multi([m.key for m in list(old_mappings.values())])
62
63
64 def update_platform_for_job(job_name, new_platform):
65 """Update platform for all mappings for a particular job."""
66 query = data_types.FuzzerJob.query()
67 query = query.filter(data_types.FuzzerJob.job == job_name)
68 mappings = ndb_utils.get_all_from_query(query)
69 new_mappings = []
70 for mapping in mappings:
71 mapping.platform = new_platform
72 new_mappings.append(mapping)
73 ndb.put_multi(new_mappings)
74
75
76 def get_fuzz_task_payload(platform=None):
77 """Select a fuzzer that can run on this platform."""
78 if not platform:
79 queue_override = environment.get_value('QUEUE_OVERRIDE')
80 platform = queue_override if queue_override else environment.platform()
81
82 query = data_types.FuzzerJob.query()
83 query = query.filter(data_types.FuzzerJob.platform == platform)
84
85 mappings = list(ndb_utils.get_all_from_query(query))
86 if not mappings:
87 return None, None
88
89 selection = utils.random_weighted_choice(mappings)
90 return selection.fuzzer, selection.job
91
92
93 def select_fuzz_target(targets, target_weights):
94 """Select a fuzz target from a list of potential targets."""
95 assert targets
96
97 weighted_targets = []
98 for target in targets:
99 weight = target_weights.get(target, 1.0)
100 weighted_targets.append(WeightedTarget(target, weight))
101
102 return utils.random_weighted_choice(weighted_targets).target
103
104
105 def get_fuzz_target_weights():
106 """Get a list of fuzz target weights based on the current fuzzer."""
107 job_type = environment.get_value('JOB_NAME')
108
109 target_jobs = list(fuzz_target_utils.get_fuzz_target_jobs(job=job_type))
110 fuzz_targets = fuzz_target_utils.get_fuzz_targets_for_target_jobs(target_jobs)
111
112 weights = {}
113 for fuzz_target, target_job in zip(fuzz_targets, target_jobs):
114 if not fuzz_target:
115 logs.log_error('Skipping weight assignment for fuzz target %s.' %
116 target_job.fuzz_target_name)
117 continue
118
119 weights[fuzz_target.binary] = target_job.weight
120
121 return weights
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/python/fuzzing/fuzzer_selection.py b/src/python/fuzzing/fuzzer_selection.py
--- a/src/python/fuzzing/fuzzer_selection.py
+++ b/src/python/fuzzing/fuzzer_selection.py
@@ -37,20 +37,20 @@
entities = ndb_utils.get_all_from_query(query)
old_mappings = {}
for entity in entities:
- old_mappings[(entity.job, entity.platform)] = entity
+ old_mappings[entity.job] = entity
new_mappings = []
for job_name in mappings:
+ mapping = old_mappings.pop(job_name, None)
+ if mapping:
+ continue
+
job = data_types.Job.query(data_types.Job.name == job_name).get()
if not job:
logs.log_error('An unknown job %s was selected for fuzzer %s.' %
(job_name, fuzzer.name))
continue
- mapping = old_mappings.pop((job_name, job.platform), None)
- if mapping:
- continue
-
mapping = data_types.FuzzerJob()
mapping.fuzzer = fuzzer.name
mapping.job = job_name
| {"golden_diff": "diff --git a/src/python/fuzzing/fuzzer_selection.py b/src/python/fuzzing/fuzzer_selection.py\n--- a/src/python/fuzzing/fuzzer_selection.py\n+++ b/src/python/fuzzing/fuzzer_selection.py\n@@ -37,20 +37,20 @@\n entities = ndb_utils.get_all_from_query(query)\n old_mappings = {}\n for entity in entities:\n- old_mappings[(entity.job, entity.platform)] = entity\n+ old_mappings[entity.job] = entity\n \n new_mappings = []\n for job_name in mappings:\n+ mapping = old_mappings.pop(job_name, None)\n+ if mapping:\n+ continue\n+\n job = data_types.Job.query(data_types.Job.name == job_name).get()\n if not job:\n logs.log_error('An unknown job %s was selected for fuzzer %s.' %\n (job_name, fuzzer.name))\n continue\n \n- mapping = old_mappings.pop((job_name, job.platform), None)\n- if mapping:\n- continue\n-\n mapping = data_types.FuzzerJob()\n mapping.fuzzer = fuzzer.name\n mapping.job = job_name\n", "issue": "Fuzzers page does not work well with large number of jobs\nFor libFuzzer, if we have a lot of existing jobs (>100) and want to add a new job and associate it. Submit button feels stuck, does not show updates, and take 1-2 min to finish. Can we show some update or something better to optimize this when only one job is updated.\r\n\r\n@oliverchang as fyi.\n", "before_files": [{"content": "# Copyright 2019 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Helper functions to update fuzzer-job mappings, and select fuzzers to run.\"\"\"\n\nimport collections\n\nfrom base import utils\nfrom datastore import data_types\nfrom datastore import fuzz_target_utils\nfrom datastore import ndb\nfrom datastore import ndb_utils\nfrom metrics import logs\nfrom system import environment\n\n# Used to prepare targets to be passed to utils.random_weighted_choice.\nWeightedTarget = collections.namedtuple('WeightedTarget', ['target', 'weight'])\n\n\ndef update_mappings_for_fuzzer(fuzzer, mappings=None):\n \"\"\"Clear existing mappings for a fuzzer, and replace them.\"\"\"\n if mappings is None:\n mappings = fuzzer.jobs\n\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.fuzzer == fuzzer.name)\n entities = ndb_utils.get_all_from_query(query)\n old_mappings = {}\n for entity in entities:\n old_mappings[(entity.job, entity.platform)] = entity\n\n new_mappings = []\n for job_name in mappings:\n job = data_types.Job.query(data_types.Job.name == job_name).get()\n if not job:\n logs.log_error('An unknown job %s was selected for fuzzer %s.' 
%\n (job_name, fuzzer.name))\n continue\n\n mapping = old_mappings.pop((job_name, job.platform), None)\n if mapping:\n continue\n\n mapping = data_types.FuzzerJob()\n mapping.fuzzer = fuzzer.name\n mapping.job = job_name\n mapping.platform = job.platform\n new_mappings.append(mapping)\n\n ndb.put_multi(new_mappings)\n ndb.delete_multi([m.key for m in list(old_mappings.values())])\n\n\ndef update_platform_for_job(job_name, new_platform):\n \"\"\"Update platform for all mappings for a particular job.\"\"\"\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.job == job_name)\n mappings = ndb_utils.get_all_from_query(query)\n new_mappings = []\n for mapping in mappings:\n mapping.platform = new_platform\n new_mappings.append(mapping)\n ndb.put_multi(new_mappings)\n\n\ndef get_fuzz_task_payload(platform=None):\n \"\"\"Select a fuzzer that can run on this platform.\"\"\"\n if not platform:\n queue_override = environment.get_value('QUEUE_OVERRIDE')\n platform = queue_override if queue_override else environment.platform()\n\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.platform == platform)\n\n mappings = list(ndb_utils.get_all_from_query(query))\n if not mappings:\n return None, None\n\n selection = utils.random_weighted_choice(mappings)\n return selection.fuzzer, selection.job\n\n\ndef select_fuzz_target(targets, target_weights):\n \"\"\"Select a fuzz target from a list of potential targets.\"\"\"\n assert targets\n\n weighted_targets = []\n for target in targets:\n weight = target_weights.get(target, 1.0)\n weighted_targets.append(WeightedTarget(target, weight))\n\n return utils.random_weighted_choice(weighted_targets).target\n\n\ndef get_fuzz_target_weights():\n \"\"\"Get a list of fuzz target weights based on the current fuzzer.\"\"\"\n job_type = environment.get_value('JOB_NAME')\n\n target_jobs = list(fuzz_target_utils.get_fuzz_target_jobs(job=job_type))\n fuzz_targets = fuzz_target_utils.get_fuzz_targets_for_target_jobs(target_jobs)\n\n weights = {}\n for fuzz_target, target_job in zip(fuzz_targets, target_jobs):\n if not fuzz_target:\n logs.log_error('Skipping weight assignment for fuzz target %s.' 
%\n target_job.fuzz_target_name)\n continue\n\n weights[fuzz_target.binary] = target_job.weight\n\n return weights\n", "path": "src/python/fuzzing/fuzzer_selection.py"}], "after_files": [{"content": "# Copyright 2019 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Helper functions to update fuzzer-job mappings, and select fuzzers to run.\"\"\"\n\nimport collections\n\nfrom base import utils\nfrom datastore import data_types\nfrom datastore import fuzz_target_utils\nfrom datastore import ndb\nfrom datastore import ndb_utils\nfrom metrics import logs\nfrom system import environment\n\n# Used to prepare targets to be passed to utils.random_weighted_choice.\nWeightedTarget = collections.namedtuple('WeightedTarget', ['target', 'weight'])\n\n\ndef update_mappings_for_fuzzer(fuzzer, mappings=None):\n \"\"\"Clear existing mappings for a fuzzer, and replace them.\"\"\"\n if mappings is None:\n mappings = fuzzer.jobs\n\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.fuzzer == fuzzer.name)\n entities = ndb_utils.get_all_from_query(query)\n old_mappings = {}\n for entity in entities:\n old_mappings[entity.job] = entity\n\n new_mappings = []\n for job_name in mappings:\n mapping = old_mappings.pop(job_name, None)\n if mapping:\n continue\n\n job = data_types.Job.query(data_types.Job.name == job_name).get()\n if not job:\n logs.log_error('An unknown job %s was selected for fuzzer %s.' 
%\n (job_name, fuzzer.name))\n continue\n\n mapping = data_types.FuzzerJob()\n mapping.fuzzer = fuzzer.name\n mapping.job = job_name\n mapping.platform = job.platform\n new_mappings.append(mapping)\n\n ndb.put_multi(new_mappings)\n ndb.delete_multi([m.key for m in list(old_mappings.values())])\n\n\ndef update_platform_for_job(job_name, new_platform):\n \"\"\"Update platform for all mappings for a particular job.\"\"\"\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.job == job_name)\n mappings = ndb_utils.get_all_from_query(query)\n new_mappings = []\n for mapping in mappings:\n mapping.platform = new_platform\n new_mappings.append(mapping)\n ndb.put_multi(new_mappings)\n\n\ndef get_fuzz_task_payload(platform=None):\n \"\"\"Select a fuzzer that can run on this platform.\"\"\"\n if not platform:\n queue_override = environment.get_value('QUEUE_OVERRIDE')\n platform = queue_override if queue_override else environment.platform()\n\n query = data_types.FuzzerJob.query()\n query = query.filter(data_types.FuzzerJob.platform == platform)\n\n mappings = list(ndb_utils.get_all_from_query(query))\n if not mappings:\n return None, None\n\n selection = utils.random_weighted_choice(mappings)\n return selection.fuzzer, selection.job\n\n\ndef select_fuzz_target(targets, target_weights):\n \"\"\"Select a fuzz target from a list of potential targets.\"\"\"\n assert targets\n\n weighted_targets = []\n for target in targets:\n weight = target_weights.get(target, 1.0)\n weighted_targets.append(WeightedTarget(target, weight))\n\n return utils.random_weighted_choice(weighted_targets).target\n\n\ndef get_fuzz_target_weights():\n \"\"\"Get a list of fuzz target weights based on the current fuzzer.\"\"\"\n job_type = environment.get_value('JOB_NAME')\n\n target_jobs = list(fuzz_target_utils.get_fuzz_target_jobs(job=job_type))\n fuzz_targets = fuzz_target_utils.get_fuzz_targets_for_target_jobs(target_jobs)\n\n weights = {}\n for fuzz_target, target_job in zip(fuzz_targets, target_jobs):\n if not fuzz_target:\n logs.log_error('Skipping weight assignment for fuzz target %s.' %\n target_job.fuzz_target_name)\n continue\n\n weights[fuzz_target.binary] = target_job.weight\n\n return weights\n", "path": "src/python/fuzzing/fuzzer_selection.py"}]} | 1,537 | 248 |
gh_patches_debug_20320 | rasdani/github-patches | git_diff | praw-dev__praw-1104 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Replying to comments in quarantined subreddits returns an empty object
I have a bot that fetches comment ids from pushshift, then does some work and replies to the comment. If the comment is in a quarantined subreddit that I have not clicked through the quarantine warning for, I get the following error.
```
praw/models/reddit/mixins/replyable.py", line 26, in reply
return self._reddit.post(API_PATH["comment"], data=data)[0]
IndexError: list index out of range
```
The reply call succeeds, returning a valid http code and the following json
```
{'json': {'errors': [], 'data': {'things': []}}}
```
`objector.objectify` fails to parse this and returns an empty array. Importantly, the comment is successfully created.
I'm not really sure what should happen in this case, but I do think it should be a more clear error message. Happy to put together a pull request if anyone has any ideas.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `praw/models/reddit/mixins/replyable.py`
Content:
```
1 """Provide the ReplyableMixin class."""
2 from ....const import API_PATH
3
4
5 class ReplyableMixin:
6 """Interface for RedditBase classes that can be replied to."""
7
8 def reply(self, body):
9 """Reply to the object.
10
11 :param body: The markdown formatted content for a comment.
12 :returns: A :class:`~.Comment` object for the newly created comment.
13
14 Example usage:
15
16 .. code:: python
17
18 submission = reddit.submission(id='5or86n')
19 submission.reply('reply')
20
21 comment = reddit.comment(id='dxolpyc')
22 comment.reply('reply')
23
24 """
25 data = {"text": body, "thing_id": self.fullname}
26 return self._reddit.post(API_PATH["comment"], data=data)[0]
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/praw/models/reddit/mixins/replyable.py b/praw/models/reddit/mixins/replyable.py
--- a/praw/models/reddit/mixins/replyable.py
+++ b/praw/models/reddit/mixins/replyable.py
@@ -9,7 +9,14 @@
"""Reply to the object.
:param body: The markdown formatted content for a comment.
- :returns: A :class:`~.Comment` object for the newly created comment.
+ :returns: A :class:`~.Comment` object for the newly created
+ comment or ``None`` if Reddit doesn't provide one.
+
+ A ``None`` value can be returned if the target is a comment or
+ submission in a quarantined subreddit and the authenticated user
+ has not opt-ed in to viewing the content. When this happens the
+ comment will be sucessfully created on Reddit and can be retried
+ by drawing the comment from the user's comment history.
Example usage:
@@ -23,4 +30,8 @@
"""
data = {"text": body, "thing_id": self.fullname}
- return self._reddit.post(API_PATH["comment"], data=data)[0]
+ comments = self._reddit.post(API_PATH["comment"], data=data)
+ try:
+ return comments[0]
+ except IndexError:
+ return None
| {"golden_diff": "diff --git a/praw/models/reddit/mixins/replyable.py b/praw/models/reddit/mixins/replyable.py\n--- a/praw/models/reddit/mixins/replyable.py\n+++ b/praw/models/reddit/mixins/replyable.py\n@@ -9,7 +9,14 @@\n \"\"\"Reply to the object.\n \n :param body: The markdown formatted content for a comment.\n- :returns: A :class:`~.Comment` object for the newly created comment.\n+ :returns: A :class:`~.Comment` object for the newly created\n+ comment or ``None`` if Reddit doesn't provide one.\n+\n+ A ``None`` value can be returned if the target is a comment or\n+ submission in a quarantined subreddit and the authenticated user\n+ has not opt-ed in to viewing the content. When this happens the\n+ comment will be sucessfully created on Reddit and can be retried\n+ by drawing the comment from the user's comment history.\n \n Example usage:\n \n@@ -23,4 +30,8 @@\n \n \"\"\"\n data = {\"text\": body, \"thing_id\": self.fullname}\n- return self._reddit.post(API_PATH[\"comment\"], data=data)[0]\n+ comments = self._reddit.post(API_PATH[\"comment\"], data=data)\n+ try:\n+ return comments[0]\n+ except IndexError:\n+ return None\n", "issue": "Replying to comments in quarantined subreddits returns an empty object\nI have a bot that fetches comment ids from pushshift, then does some work and replies to the comment. If the comment is in a quarantined subreddit that I have not clicked through the quarantine warning for, I get the following error.\r\n\r\n```\r\npraw/models/reddit/mixins/replyable.py\", line 26, in reply\r\n return self._reddit.post(API_PATH[\"comment\"], data=data)[0]\r\nIndexError: list index out of range\r\n```\r\nThe reply call succeeds, returning a valid http code and the following json\r\n```\r\n{'json': {'errors': [], 'data': {'things': []}}}\r\n```\r\n`objector.objectify` fails to parse this and returns an empty array. Importantly, the comment is successfully created.\r\n\r\nI'm not really sure what should happen in this case, but I do think it should be a more clear error message. Happy to put together a pull request if anyone has any ideas.\n", "before_files": [{"content": "\"\"\"Provide the ReplyableMixin class.\"\"\"\nfrom ....const import API_PATH\n\n\nclass ReplyableMixin:\n \"\"\"Interface for RedditBase classes that can be replied to.\"\"\"\n\n def reply(self, body):\n \"\"\"Reply to the object.\n\n :param body: The markdown formatted content for a comment.\n :returns: A :class:`~.Comment` object for the newly created comment.\n\n Example usage:\n\n .. code:: python\n\n submission = reddit.submission(id='5or86n')\n submission.reply('reply')\n\n comment = reddit.comment(id='dxolpyc')\n comment.reply('reply')\n\n \"\"\"\n data = {\"text\": body, \"thing_id\": self.fullname}\n return self._reddit.post(API_PATH[\"comment\"], data=data)[0]\n", "path": "praw/models/reddit/mixins/replyable.py"}], "after_files": [{"content": "\"\"\"Provide the ReplyableMixin class.\"\"\"\nfrom ....const import API_PATH\n\n\nclass ReplyableMixin:\n \"\"\"Interface for RedditBase classes that can be replied to.\"\"\"\n\n def reply(self, body):\n \"\"\"Reply to the object.\n\n :param body: The markdown formatted content for a comment.\n :returns: A :class:`~.Comment` object for the newly created\n comment or ``None`` if Reddit doesn't provide one.\n\n A ``None`` value can be returned if the target is a comment or\n submission in a quarantined subreddit and the authenticated user\n has not opt-ed in to viewing the content. 
When this happens the\n comment will be sucessfully created on Reddit and can be retried\n by drawing the comment from the user's comment history.\n\n Example usage:\n\n .. code:: python\n\n submission = reddit.submission(id='5or86n')\n submission.reply('reply')\n\n comment = reddit.comment(id='dxolpyc')\n comment.reply('reply')\n\n \"\"\"\n data = {\"text\": body, \"thing_id\": self.fullname}\n comments = self._reddit.post(API_PATH[\"comment\"], data=data)\n try:\n return comments[0]\n except IndexError:\n return None\n", "path": "praw/models/reddit/mixins/replyable.py"}]} | 694 | 313 |
gh_patches_debug_22216 | rasdani/github-patches | git_diff | spacetelescope__jwql-483 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Simplify conda environments
Since `conda` will automatically determine which libraries are needed for dependencies (e.g. `numpy_base` is installed when installing `numpy`), we could probably use to trim down our conda environments to only those high-level packages that are used within our repo, and `conda` will figure out the rest. It might also be a good time to make sure the `conda` environment is consistent with the dependencies listed in `setup.py` and `requirements.txt`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import numpy as np
2 from setuptools import setup
3 from setuptools import find_packages
4
5 VERSION = '0.21.0'
6
7 AUTHORS = 'Matthew Bourque, Lauren Chambers, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '
8 AUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann'
9
10 DESCRIPTION = 'The James Webb Space Telescope Quicklook Project'
11
12 REQUIRES = [
13 'astropy>=3.2.1',
14 'astroquery>=0.3.9',
15 'authlib',
16 'bokeh>=1.0',
17 'django>=2.0',
18 'jinja2',
19 'jsonschema==2.6.0',
20 'jwedb',
21 'jwst',
22 'matplotlib',
23 'numpy',
24 'numpydoc',
25 'pandas',
26 'psycopg2',
27 'pysiaf',
28 'pytest',
29 'sphinx',
30 'sqlalchemy',
31 'stsci_rtd_theme'
32 ]
33
34 setup(
35 name='jwql',
36 version=VERSION,
37 description=DESCRIPTION,
38 url='https://github.com/spacetelescope/jwql.git',
39 author=AUTHORS,
40 author_email='[email protected]',
41 license='BSD',
42 keywords=['astronomy', 'python'],
43 classifiers=['Programming Language :: Python'],
44 packages=find_packages(),
45 install_requires=REQUIRES,
46 include_package_data=True,
47 include_dirs=[np.get_include()],
48 )
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -4,21 +4,26 @@
VERSION = '0.21.0'
-AUTHORS = 'Matthew Bourque, Lauren Chambers, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '
-AUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann'
+AUTHORS = 'Matthew Bourque, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '
+AUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann, Ben Sunnquist'
DESCRIPTION = 'The James Webb Space Telescope Quicklook Project'
REQUIRES = [
+ 'asdf>=2.3.3',
'astropy>=3.2.1',
'astroquery>=0.3.9',
'authlib',
'bokeh>=1.0',
+ 'codecov',
'django>=2.0',
+ 'flake8',
+ 'inflection',
+ 'ipython',
'jinja2',
'jsonschema==2.6.0',
- 'jwedb',
- 'jwst',
+ 'jwedb>=0.0.3',
+ 'jwst==0.13.0',
'matplotlib',
'numpy',
'numpydoc',
@@ -26,9 +31,12 @@
'psycopg2',
'pysiaf',
'pytest',
+ 'pytest-cov',
+ 'scipy',
'sphinx',
'sqlalchemy',
- 'stsci_rtd_theme'
+ 'stsci_rtd_theme',
+ 'twine'
]
setup(
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -4,21 +4,26 @@\n \n VERSION = '0.21.0'\n \n-AUTHORS = 'Matthew Bourque, Lauren Chambers, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '\n-AUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann'\n+AUTHORS = 'Matthew Bourque, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '\n+AUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann, Ben Sunnquist'\n \n DESCRIPTION = 'The James Webb Space Telescope Quicklook Project'\n \n REQUIRES = [\n+ 'asdf>=2.3.3',\n 'astropy>=3.2.1',\n 'astroquery>=0.3.9',\n 'authlib',\n 'bokeh>=1.0',\n+ 'codecov',\n 'django>=2.0',\n+ 'flake8',\n+ 'inflection',\n+ 'ipython',\n 'jinja2',\n 'jsonschema==2.6.0',\n- 'jwedb',\n- 'jwst',\n+ 'jwedb>=0.0.3',\n+ 'jwst==0.13.0',\n 'matplotlib',\n 'numpy',\n 'numpydoc',\n@@ -26,9 +31,12 @@\n 'psycopg2',\n 'pysiaf',\n 'pytest',\n+ 'pytest-cov',\n+ 'scipy',\n 'sphinx',\n 'sqlalchemy',\n- 'stsci_rtd_theme'\n+ 'stsci_rtd_theme',\n+ 'twine'\n ]\n \n setup(\n", "issue": "Simplify conda environments \nSince `conda` will automatically determine which libraries are needed for dependencies (e.g. `numpy_base` is installed when installing `numpy`), we could probably use to trim down our conda environments to only those high-level packages that are used within our repo, and `conda` will figure out the rest. It might also be a good time to make sure the `conda` environment is consistent with the dependencies listed in `setup.py` and `requirements.txt`\n", "before_files": [{"content": "import numpy as np\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nVERSION = '0.21.0'\n\nAUTHORS = 'Matthew Bourque, Lauren Chambers, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '\nAUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann'\n\nDESCRIPTION = 'The James Webb Space Telescope Quicklook Project'\n\nREQUIRES = [\n 'astropy>=3.2.1',\n 'astroquery>=0.3.9',\n 'authlib',\n 'bokeh>=1.0',\n 'django>=2.0',\n 'jinja2',\n 'jsonschema==2.6.0',\n 'jwedb',\n 'jwst',\n 'matplotlib',\n 'numpy',\n 'numpydoc',\n 'pandas',\n 'psycopg2',\n 'pysiaf',\n 'pytest',\n 'sphinx',\n 'sqlalchemy',\n 'stsci_rtd_theme'\n]\n\nsetup(\n name='jwql',\n version=VERSION,\n description=DESCRIPTION,\n url='https://github.com/spacetelescope/jwql.git',\n author=AUTHORS,\n author_email='[email protected]',\n license='BSD',\n keywords=['astronomy', 'python'],\n classifiers=['Programming Language :: Python'],\n packages=find_packages(),\n install_requires=REQUIRES,\n include_package_data=True,\n include_dirs=[np.get_include()],\n)\n", "path": "setup.py"}], "after_files": [{"content": "import numpy as np\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nVERSION = '0.21.0'\n\nAUTHORS = 'Matthew Bourque, Misty Cracraft, Joe Filippazzo, Bryan Hilbert, '\nAUTHORS += 'Graham Kanarek, Catherine Martlin, Johannes Sahlmann, Ben Sunnquist'\n\nDESCRIPTION = 'The James Webb Space Telescope Quicklook Project'\n\nREQUIRES = [\n 'asdf>=2.3.3',\n 'astropy>=3.2.1',\n 'astroquery>=0.3.9',\n 'authlib',\n 'bokeh>=1.0',\n 'codecov',\n 'django>=2.0',\n 'flake8',\n 'inflection',\n 'ipython',\n 'jinja2',\n 'jsonschema==2.6.0',\n 'jwedb>=0.0.3',\n 'jwst==0.13.0',\n 'matplotlib',\n 'numpy',\n 'numpydoc',\n 'pandas',\n 'psycopg2',\n 'pysiaf',\n 'pytest',\n 'pytest-cov',\n 'scipy',\n 'sphinx',\n 'sqlalchemy',\n 'stsci_rtd_theme',\n 'twine'\n]\n\nsetup(\n name='jwql',\n version=VERSION,\n description=DESCRIPTION,\n url='https://github.com/spacetelescope/jwql.git',\n 
author=AUTHORS,\n author_email='[email protected]',\n license='BSD',\n keywords=['astronomy', 'python'],\n classifiers=['Programming Language :: Python'],\n packages=find_packages(),\n install_requires=REQUIRES,\n include_package_data=True,\n include_dirs=[np.get_include()],\n)\n", "path": "setup.py"}]} | 766 | 391 |
gh_patches_debug_26912 | rasdani/github-patches | git_diff | digitalfabrik__integreat-cms-361 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Refactor technical feedback
The current implementation of technical feedback does not comply with the way it is used in the apps when submitting feedback. Instead of having one model for technical feedback, we should rather have a `technical` flag for each of the other feedback models. This flag should be set whenever the `category`-parameter contains "technisch" or "technical". In a future API version, we can convert this to a parameter, which can either be true or false in the feedback submission request.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cms/models/__init__.py`
Content:
```
1 from .config.configuration import Configuration
2
3 from .events.event import Event
4 from .events.event_translation import EventTranslation
5 from .events.recurrence_rule import RecurrenceRule
6
7 from .offers.offer import Offer
8 from .offers.offer_template import OfferTemplate
9
10 from .feedback.event_feedback import EventFeedback
11 from .feedback.offer_feedback import OfferFeedback
12 from .feedback.feedback import Feedback
13 from .feedback.page_feedback import PageFeedback
14 from .feedback.region_feedback import RegionFeedback
15 from .feedback.search_result_feedback import SearchResultFeedback
16 from .feedback.technical_feedback import TechnicalFeedback
17
18 from .languages.language import Language
19 from .languages.language_tree_node import LanguageTreeNode
20
21 from .media.document import Document
22
23 from .pages.page import Page
24 from .pages.page_translation import PageTranslation
25
26 from .pois.poi import POI
27 from .pois.poi_translation import POITranslation
28
29 from .push_notifications.push_notification import PushNotification
30 from .push_notifications.push_notification_translation import PushNotificationTranslation
31
32 from .regions.region import Region
33
34 from .users.organization import Organization
35 from .users.user_profile import UserProfile
36 from .users.user_mfa import UserMfa
37
```
Path: `src/cms/models/feedback/feedback.py`
Content:
```
1 """
2 Module for models storing feedback from front end users
3 """
4 from django.db import models
5
6
7 class Feedback(models.Model):
8 """
9 Base class for collecting feeedback from users.
10 """
11 EMOTION = (
12 ("Pos", "Positive"),
13 ("Neg", "Negative"),
14 ("NA", "Not Available"),
15 )
16 emotion = models.CharField(max_length=3, choices=EMOTION)
17 comment = models.CharField(max_length=1000)
18 readStatus = models.BooleanField(default=False)
19
20 created_date = models.DateTimeField(auto_now_add=True)
21 last_updated = models.DateTimeField(auto_now=True)
22
23 class Meta:
24 default_permissions = ()
25 permissions = (
26 ('view_feedback', 'Can view feedback'),
27 )
28
```
Path: `src/cms/models/feedback/technical_feedback.py`
Content:
```
1 """
2 Module for models storing feedback from front end users
3 """
4 from django.db import models
5
6 from .feedback import Feedback
7 from ..pages.page import Page
8
9
10 class TechnicalFeedback(Feedback):
11 """
12 Technical feedback on the end user app
13 """
14 page = models.ForeignKey(Page, on_delete=models.CASCADE)
15
16 class Meta:
17 default_permissions = ()
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cms/models/__init__.py b/src/cms/models/__init__.py
--- a/src/cms/models/__init__.py
+++ b/src/cms/models/__init__.py
@@ -13,7 +13,6 @@
from .feedback.page_feedback import PageFeedback
from .feedback.region_feedback import RegionFeedback
from .feedback.search_result_feedback import SearchResultFeedback
-from .feedback.technical_feedback import TechnicalFeedback
from .languages.language import Language
from .languages.language_tree_node import LanguageTreeNode
diff --git a/src/cms/models/feedback/feedback.py b/src/cms/models/feedback/feedback.py
--- a/src/cms/models/feedback/feedback.py
+++ b/src/cms/models/feedback/feedback.py
@@ -15,7 +15,8 @@
)
emotion = models.CharField(max_length=3, choices=EMOTION)
comment = models.CharField(max_length=1000)
- readStatus = models.BooleanField(default=False)
+ is_technical = models.BooleanField(default=False)
+ read_status = models.BooleanField(default=False)
created_date = models.DateTimeField(auto_now_add=True)
last_updated = models.DateTimeField(auto_now=True)
diff --git a/src/cms/models/feedback/technical_feedback.py b/src/cms/models/feedback/technical_feedback.py
deleted file mode 100644
--- a/src/cms/models/feedback/technical_feedback.py
+++ /dev/null
@@ -1,17 +0,0 @@
-"""
-Module for models storing feedback from front end users
-"""
-from django.db import models
-
-from .feedback import Feedback
-from ..pages.page import Page
-
-
-class TechnicalFeedback(Feedback):
- """
- Technical feedback on the end user app
- """
- page = models.ForeignKey(Page, on_delete=models.CASCADE)
-
- class Meta:
- default_permissions = ()
| {"golden_diff": "diff --git a/src/cms/models/__init__.py b/src/cms/models/__init__.py\n--- a/src/cms/models/__init__.py\n+++ b/src/cms/models/__init__.py\n@@ -13,7 +13,6 @@\n from .feedback.page_feedback import PageFeedback\n from .feedback.region_feedback import RegionFeedback\n from .feedback.search_result_feedback import SearchResultFeedback\n-from .feedback.technical_feedback import TechnicalFeedback\n \n from .languages.language import Language\n from .languages.language_tree_node import LanguageTreeNode\ndiff --git a/src/cms/models/feedback/feedback.py b/src/cms/models/feedback/feedback.py\n--- a/src/cms/models/feedback/feedback.py\n+++ b/src/cms/models/feedback/feedback.py\n@@ -15,7 +15,8 @@\n )\n emotion = models.CharField(max_length=3, choices=EMOTION)\n comment = models.CharField(max_length=1000)\n- readStatus = models.BooleanField(default=False)\n+ is_technical = models.BooleanField(default=False)\n+ read_status = models.BooleanField(default=False)\n \n created_date = models.DateTimeField(auto_now_add=True)\n last_updated = models.DateTimeField(auto_now=True)\ndiff --git a/src/cms/models/feedback/technical_feedback.py b/src/cms/models/feedback/technical_feedback.py\ndeleted file mode 100644\n--- a/src/cms/models/feedback/technical_feedback.py\n+++ /dev/null\n@@ -1,17 +0,0 @@\n-\"\"\"\n-Module for models storing feedback from front end users\n-\"\"\"\n-from django.db import models\n-\n-from .feedback import Feedback\n-from ..pages.page import Page\n-\n-\n-class TechnicalFeedback(Feedback):\n- \"\"\"\n- Technical feedback on the end user app\n- \"\"\"\n- page = models.ForeignKey(Page, on_delete=models.CASCADE)\n-\n- class Meta:\n- default_permissions = ()\n", "issue": "Refactor technical feedback\nThe current implementation of technical feedback does not comply with the way it is used in the apps when submitting feedback. Instead of having one model for technical feedback, we should rather have a `technical` flag for each of the other feedback models. This flag should be set whenever the `category`-parameter contains \"technisch\" or \"technical\". 
In a future API version, we can convert this to a parameter, which can either be true or false in the feedback submission request.\n", "before_files": [{"content": "from .config.configuration import Configuration\n\nfrom .events.event import Event\nfrom .events.event_translation import EventTranslation\nfrom .events.recurrence_rule import RecurrenceRule\n\nfrom .offers.offer import Offer\nfrom .offers.offer_template import OfferTemplate\n\nfrom .feedback.event_feedback import EventFeedback\nfrom .feedback.offer_feedback import OfferFeedback\nfrom .feedback.feedback import Feedback\nfrom .feedback.page_feedback import PageFeedback\nfrom .feedback.region_feedback import RegionFeedback\nfrom .feedback.search_result_feedback import SearchResultFeedback\nfrom .feedback.technical_feedback import TechnicalFeedback\n\nfrom .languages.language import Language\nfrom .languages.language_tree_node import LanguageTreeNode\n\nfrom .media.document import Document\n\nfrom .pages.page import Page\nfrom .pages.page_translation import PageTranslation\n\nfrom .pois.poi import POI\nfrom .pois.poi_translation import POITranslation\n\nfrom .push_notifications.push_notification import PushNotification\nfrom .push_notifications.push_notification_translation import PushNotificationTranslation\n\nfrom .regions.region import Region\n\nfrom .users.organization import Organization\nfrom .users.user_profile import UserProfile\nfrom .users.user_mfa import UserMfa\n", "path": "src/cms/models/__init__.py"}, {"content": "\"\"\"\nModule for models storing feedback from front end users\n\"\"\"\nfrom django.db import models\n\n\nclass Feedback(models.Model):\n \"\"\"\n Base class for collecting feeedback from users.\n \"\"\"\n EMOTION = (\n (\"Pos\", \"Positive\"),\n (\"Neg\", \"Negative\"),\n (\"NA\", \"Not Available\"),\n )\n emotion = models.CharField(max_length=3, choices=EMOTION)\n comment = models.CharField(max_length=1000)\n readStatus = models.BooleanField(default=False)\n\n created_date = models.DateTimeField(auto_now_add=True)\n last_updated = models.DateTimeField(auto_now=True)\n\n class Meta:\n default_permissions = ()\n permissions = (\n ('view_feedback', 'Can view feedback'),\n )\n", "path": "src/cms/models/feedback/feedback.py"}, {"content": "\"\"\"\nModule for models storing feedback from front end users\n\"\"\"\nfrom django.db import models\n\nfrom .feedback import Feedback\nfrom ..pages.page import Page\n\n\nclass TechnicalFeedback(Feedback):\n \"\"\"\n Technical feedback on the end user app\n \"\"\"\n page = models.ForeignKey(Page, on_delete=models.CASCADE)\n\n class Meta:\n default_permissions = ()\n", "path": "src/cms/models/feedback/technical_feedback.py"}], "after_files": [{"content": "from .config.configuration import Configuration\n\nfrom .events.event import Event\nfrom .events.event_translation import EventTranslation\nfrom .events.recurrence_rule import RecurrenceRule\n\nfrom .offers.offer import Offer\nfrom .offers.offer_template import OfferTemplate\n\nfrom .feedback.event_feedback import EventFeedback\nfrom .feedback.offer_feedback import OfferFeedback\nfrom .feedback.feedback import Feedback\nfrom .feedback.page_feedback import PageFeedback\nfrom .feedback.region_feedback import RegionFeedback\nfrom .feedback.search_result_feedback import SearchResultFeedback\n\nfrom .languages.language import Language\nfrom .languages.language_tree_node import LanguageTreeNode\n\nfrom .media.document import Document\n\nfrom .pages.page import Page\nfrom .pages.page_translation import PageTranslation\n\nfrom 
.pois.poi import POI\nfrom .pois.poi_translation import POITranslation\n\nfrom .push_notifications.push_notification import PushNotification\nfrom .push_notifications.push_notification_translation import PushNotificationTranslation\n\nfrom .regions.region import Region\n\nfrom .users.organization import Organization\nfrom .users.user_profile import UserProfile\nfrom .users.user_mfa import UserMfa\n", "path": "src/cms/models/__init__.py"}, {"content": "\"\"\"\nModule for models storing feedback from front end users\n\"\"\"\nfrom django.db import models\n\n\nclass Feedback(models.Model):\n \"\"\"\n Base class for collecting feeedback from users.\n \"\"\"\n EMOTION = (\n (\"Pos\", \"Positive\"),\n (\"Neg\", \"Negative\"),\n (\"NA\", \"Not Available\"),\n )\n emotion = models.CharField(max_length=3, choices=EMOTION)\n comment = models.CharField(max_length=1000)\n is_technical = models.BooleanField(default=False)\n read_status = models.BooleanField(default=False)\n\n created_date = models.DateTimeField(auto_now_add=True)\n last_updated = models.DateTimeField(auto_now=True)\n\n class Meta:\n default_permissions = ()\n permissions = (\n ('view_feedback', 'Can view feedback'),\n )\n", "path": "src/cms/models/feedback/feedback.py"}, {"content": null, "path": "src/cms/models/feedback/technical_feedback.py"}]} | 1,007 | 394 |
gh_patches_debug_7917 | rasdani/github-patches | git_diff | bokeh__bokeh-6159 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
For Interactive Legends, a note about `muted_color` would be helpful
For the [Interactive Legend guide](https://github.com/bokeh/bokeh/blob/d8fcffa1c472bf641517ef81698bb6d057cbd30f/sphinx/source/docs/user_guide/interaction/legends.rst#id7), in addition to the note about `muted_alpha`, it would be helpful to also demonstrate `muted_color`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sphinx/source/docs/user_guide/examples/interaction_legend_mute.py`
Content:
```
1 import pandas as pd
2
3 from bokeh.palettes import Spectral4
4 from bokeh.plotting import figure, output_file, show
5
6 p = figure(plot_width=800, plot_height=250, x_axis_type="datetime")
7 p.title.text = 'Click on legend entries to mute the corresponding lines'
8
9 for name, color in zip(['AAPL', 'IBM', 'MSFT', 'GOOG'], Spectral4):
10 df = pd.read_csv(
11 "http://ichart.yahoo.com/table.csv?s=%s&a=0&b=1&c=2005&d=0&e=1&f=2014" % name,
12 parse_dates=['Date']
13 )
14 p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8, muted_alpha=0.2, legend=name)
15
16 p.legend.location = "top_left"
17 p.legend.click_policy="mute"
18
19 output_file("interactive_legend.html", title="interactive_legend.py example")
20
21 show(p)
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py b/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py
--- a/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py
+++ b/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py
@@ -11,7 +11,8 @@
"http://ichart.yahoo.com/table.csv?s=%s&a=0&b=1&c=2005&d=0&e=1&f=2014" % name,
parse_dates=['Date']
)
- p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8, muted_alpha=0.2, legend=name)
+ p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8,
+ muted_color=color, muted_alpha=0.2, legend=name)
p.legend.location = "top_left"
p.legend.click_policy="mute"
| {"golden_diff": "diff --git a/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py b/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py\n--- a/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py\n+++ b/sphinx/source/docs/user_guide/examples/interaction_legend_mute.py\n@@ -11,7 +11,8 @@\n \"http://ichart.yahoo.com/table.csv?s=%s&a=0&b=1&c=2005&d=0&e=1&f=2014\" % name,\n parse_dates=['Date']\n )\n- p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8, muted_alpha=0.2, legend=name)\n+ p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8,\n+ muted_color=color, muted_alpha=0.2, legend=name)\n \n p.legend.location = \"top_left\"\n p.legend.click_policy=\"mute\"\n", "issue": "For Interactive Legends, a note about `muted_color` would be helpful\nFor the [Interactive Legend guide](https://github.com/bokeh/bokeh/blob/d8fcffa1c472bf641517ef81698bb6d057cbd30f/sphinx/source/docs/user_guide/interaction/legends.rst#id7), in addition to the note about `muted_alpha`, it would be helpful to also demonstrate `muted_color`.\n", "before_files": [{"content": "import pandas as pd\n\nfrom bokeh.palettes import Spectral4\nfrom bokeh.plotting import figure, output_file, show\n\np = figure(plot_width=800, plot_height=250, x_axis_type=\"datetime\")\np.title.text = 'Click on legend entries to mute the corresponding lines'\n\nfor name, color in zip(['AAPL', 'IBM', 'MSFT', 'GOOG'], Spectral4):\n df = pd.read_csv(\n \"http://ichart.yahoo.com/table.csv?s=%s&a=0&b=1&c=2005&d=0&e=1&f=2014\" % name,\n parse_dates=['Date']\n )\n p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8, muted_alpha=0.2, legend=name)\n\np.legend.location = \"top_left\"\np.legend.click_policy=\"mute\"\n\noutput_file(\"interactive_legend.html\", title=\"interactive_legend.py example\")\n\nshow(p)\n", "path": "sphinx/source/docs/user_guide/examples/interaction_legend_mute.py"}], "after_files": [{"content": "import pandas as pd\n\nfrom bokeh.palettes import Spectral4\nfrom bokeh.plotting import figure, output_file, show\n\np = figure(plot_width=800, plot_height=250, x_axis_type=\"datetime\")\np.title.text = 'Click on legend entries to mute the corresponding lines'\n\nfor name, color in zip(['AAPL', 'IBM', 'MSFT', 'GOOG'], Spectral4):\n df = pd.read_csv(\n \"http://ichart.yahoo.com/table.csv?s=%s&a=0&b=1&c=2005&d=0&e=1&f=2014\" % name,\n parse_dates=['Date']\n )\n p.line(df['Date'], df['Close'], line_width=2, color=color, alpha=0.8,\n muted_color=color, muted_alpha=0.2, legend=name)\n\np.legend.location = \"top_left\"\np.legend.click_policy=\"mute\"\n\noutput_file(\"interactive_legend.html\", title=\"interactive_legend.py example\")\n\nshow(p)\n", "path": "sphinx/source/docs/user_guide/examples/interaction_legend_mute.py"}]} | 632 | 223 |
gh_patches_debug_13930 | rasdani/github-patches | git_diff | microsoft__ptvsd-1425 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
python without debugging won't start if there is a breakpoint
@tmdag commented on [Thu May 02 2019](https://github.com/microsoft/vscode-python/issues/5540)
## Environment data
- VSCode Version: 1.33.1
- OS Version:Linux 5.0.7-200.fc29.x86_64
- Extension version (available under the Extensions sidebar): Python 2019.4.11987
- Anaconda Extension Pack 1.0.1
## Steps to reproduce:
Open Python code
create a breakpoint
run python without debugging (ctrl+F5)
Terminal output:
cd /home/user/dev/Pytool ; env PYTHONIOENCODING=UTF-8 PYTHONUNBUFFERED=1 /usr/bin/python3 /home/user/.vscode/extensions/ms-python.python-2019.4.11987/pythonFiles/ptvsd_launcher.py --default --nodebug --client --host localhost --port 36019 /home/user/dev/Pytool/mypytool.py
Terminated
Does this issue occur when all extensions are disabled?: Yes/No
Not sure - Python extension is required
## Enabled Extensions:
Bookmarks 10.4.3
C/C++ 0.221
Gist 3.0.3
Git History 0.4.6
GitLens - 9.6.3
markdownlint 0.26.0
Syncing 2.1.6
OpenCL 0.5.2
VEX 0.4.0
TAML 0.4.0
---
@jxramos commented on [Thu May 02 2019](https://github.com/microsoft/vscode-python/issues/5540#issuecomment-488807421)
I'm seeing this too on a MacOS, it immediately exits with `Terminated: 15`. This behavior persists even if the breakpoint is disabled/unchecked.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/ptvsd/runner.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License. See LICENSE in the project root
3 # for license information.
4
5 import pydevd
6 import threading
7
8 from ptvsd.daemon import DaemonBase
9 from ptvsd.session import DebugSession
10 from ptvsd.wrapper import VSCLifecycleMsgProcessor
11 from pydevd import init_stdout_redirect, init_stderr_redirect
12
13
14 HOSTNAME = 'localhost'
15
16
17 def run(address, filename, is_module, *args, **kwargs):
18 # TODO: docstring
19 # TODO: client/server -> address
20 daemon = Daemon()
21 if not daemon.wait_for_launch(address):
22 return
23
24 debugger = pydevd.PyDB()
25 # We do not want some internal methods to get executed in non-debug mode.
26 debugger.init_matplotlib_support = lambda *arg: None
27 debugger.run(
28 file=filename,
29 globals=None,
30 locals=None,
31 is_module=is_module,
32 set_trace=False)
33
34
35 class Daemon(DaemonBase):
36 """The process-level manager for the VSC protocol debug adapter."""
37
38 LAUNCH_TIMEOUT = 10000 # seconds
39
40 class SESSION(DebugSession):
41 class MESSAGE_PROCESSOR(VSCLifecycleMsgProcessor):
42 def on_invalid_request(self, request, args):
43 self.send_response(request, success=True)
44
45 def wait_for_launch(self, addr, timeout=LAUNCH_TIMEOUT):
46 # TODO: docstring
47 launched = threading.Event()
48 _, start_session = self.start_client(addr)
49 start_session(
50 notify_launch=launched.set,
51 )
52 return launched.wait(timeout)
53
54 def _start(self):
55 import weakref
56 weak_self = weakref.ref(self) # Avoid cyclic ref
57
58 def on_stdout(msg):
59 self = weak_self()
60 if self is not None:
61 self._send_output('stdout', msg)
62
63 def on_stderr(msg):
64 self = weak_self()
65 if self is not None:
66 self._send_output('stderr', msg)
67
68 init_stdout_redirect(on_stdout)
69 init_stderr_redirect(on_stderr)
70 return NoSocket()
71
72 def _close(self):
73 super(Daemon, self)._close()
74
75 def _send_output(self, category, output):
76 if self.session is None:
77 return
78 self.session._msgprocessor.send_event('output',
79 category=category,
80 output=output)
81
82
83 class NoSocket(object):
84 """A object with a noop socket lifecycle."""
85
86 def shutdown(self, *args, **kwargs):
87 pass
88
89 def close(self):
90 pass
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/ptvsd/runner.py b/src/ptvsd/runner.py
--- a/src/ptvsd/runner.py
+++ b/src/ptvsd/runner.py
@@ -39,6 +39,19 @@
class SESSION(DebugSession):
class MESSAGE_PROCESSOR(VSCLifecycleMsgProcessor):
+
+ def on_setBreakpoints(self, request, args):
+ # Note: breakpoints is required (vscode will terminate
+ # the debugger if that's not the case).
+ # See: https://github.com/microsoft/ptvsd/issues/1408
+ self.send_response(
+ request,
+ success=True,
+ breakpoints=(
+ [{'verified': False}] * len(args.get('breakpoints', ()))
+ )
+ )
+
def on_invalid_request(self, request, args):
self.send_response(request, success=True)
| {"golden_diff": "diff --git a/src/ptvsd/runner.py b/src/ptvsd/runner.py\n--- a/src/ptvsd/runner.py\n+++ b/src/ptvsd/runner.py\n@@ -39,6 +39,19 @@\n \n class SESSION(DebugSession):\n class MESSAGE_PROCESSOR(VSCLifecycleMsgProcessor):\n+\n+ def on_setBreakpoints(self, request, args):\n+ # Note: breakpoints is required (vscode will terminate\n+ # the debugger if that's not the case).\n+ # See: https://github.com/microsoft/ptvsd/issues/1408\n+ self.send_response(\n+ request,\n+ success=True,\n+ breakpoints=(\n+ [{'verified': False}] * len(args.get('breakpoints', ()))\n+ )\n+ )\n+\n def on_invalid_request(self, request, args):\n self.send_response(request, success=True)\n", "issue": "python without debugging won't start if there is a breakpoint\n@tmdag commented on [Thu May 02 2019](https://github.com/microsoft/vscode-python/issues/5540)\n\n## Environment data\r\n- VSCode Version: 1.33.1\r\n- OS Version:Linux 5.0.7-200.fc29.x86_64\r\n- Extension version (available under the Extensions sidebar): Python 2019.4.11987\r\n- Anaconda Extension Pack 1.0.1\r\n\r\n## Steps to reproduce:\r\nOpen Python code\r\ncreate a breakpoint\r\nrun python without debugging (ctrl+F5)\r\nTerminal output:\r\n\r\ncd /home/user/dev/Pytool ; env PYTHONIOENCODING=UTF-8 PYTHONUNBUFFERED=1 /usr/bin/python3 /home/user/.vscode/extensions/ms-python.python-2019.4.11987/pythonFiles/ptvsd_launcher.py --default --nodebug --client --host localhost --port 36019 /home/user/dev/Pytool/mypytool.py\r\nTerminated\r\nDoes this issue occur when all extensions are disabled?: Yes/No\r\nNot sure - Python extension is required\r\n\r\n## Enabled Extensions:\r\nBookmarks 10.4.3\r\nC/C++ 0.221\r\nGist 3.0.3\r\nGit History 0.4.6\r\nGitLens - 9.6.3\r\nmarkdownlint 0.26.0\r\n\r\nSyncing 2.1.6\r\nOpenCL 0.5.2\r\nVEX 0.4.0\r\nTAML 0.4.0\r\n\r\n\n\n---\n\n@jxramos commented on [Thu May 02 2019](https://github.com/microsoft/vscode-python/issues/5540#issuecomment-488807421)\n\nI'm seeing this too on a MacOS, it immediately exits with `Terminated: 15`. This behavior persists even if the breakpoint is disabled/unchecked.\n\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. 
See LICENSE in the project root\n# for license information.\n\nimport pydevd\nimport threading\n\nfrom ptvsd.daemon import DaemonBase\nfrom ptvsd.session import DebugSession\nfrom ptvsd.wrapper import VSCLifecycleMsgProcessor\nfrom pydevd import init_stdout_redirect, init_stderr_redirect\n\n\nHOSTNAME = 'localhost'\n\n\ndef run(address, filename, is_module, *args, **kwargs):\n # TODO: docstring\n # TODO: client/server -> address\n daemon = Daemon()\n if not daemon.wait_for_launch(address):\n return\n\n debugger = pydevd.PyDB()\n # We do not want some internal methods to get executed in non-debug mode.\n debugger.init_matplotlib_support = lambda *arg: None\n debugger.run(\n file=filename,\n globals=None,\n locals=None,\n is_module=is_module,\n set_trace=False)\n\n\nclass Daemon(DaemonBase):\n \"\"\"The process-level manager for the VSC protocol debug adapter.\"\"\"\n\n LAUNCH_TIMEOUT = 10000 # seconds\n\n class SESSION(DebugSession):\n class MESSAGE_PROCESSOR(VSCLifecycleMsgProcessor):\n def on_invalid_request(self, request, args):\n self.send_response(request, success=True)\n\n def wait_for_launch(self, addr, timeout=LAUNCH_TIMEOUT):\n # TODO: docstring\n launched = threading.Event()\n _, start_session = self.start_client(addr)\n start_session(\n notify_launch=launched.set,\n )\n return launched.wait(timeout)\n\n def _start(self):\n import weakref\n weak_self = weakref.ref(self) # Avoid cyclic ref\n\n def on_stdout(msg):\n self = weak_self()\n if self is not None:\n self._send_output('stdout', msg)\n\n def on_stderr(msg):\n self = weak_self()\n if self is not None:\n self._send_output('stderr', msg)\n\n init_stdout_redirect(on_stdout)\n init_stderr_redirect(on_stderr)\n return NoSocket()\n\n def _close(self):\n super(Daemon, self)._close()\n\n def _send_output(self, category, output):\n if self.session is None:\n return\n self.session._msgprocessor.send_event('output',\n category=category,\n output=output)\n\n\nclass NoSocket(object):\n \"\"\"A object with a noop socket lifecycle.\"\"\"\n\n def shutdown(self, *args, **kwargs):\n pass\n\n def close(self):\n pass\n", "path": "src/ptvsd/runner.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. 
See LICENSE in the project root\n# for license information.\n\nimport pydevd\nimport threading\n\nfrom ptvsd.daemon import DaemonBase\nfrom ptvsd.session import DebugSession\nfrom ptvsd.wrapper import VSCLifecycleMsgProcessor\nfrom pydevd import init_stdout_redirect, init_stderr_redirect\n\n\nHOSTNAME = 'localhost'\n\n\ndef run(address, filename, is_module, *args, **kwargs):\n # TODO: docstring\n # TODO: client/server -> address\n daemon = Daemon()\n if not daemon.wait_for_launch(address):\n return\n\n debugger = pydevd.PyDB()\n # We do not want some internal methods to get executed in non-debug mode.\n debugger.init_matplotlib_support = lambda *arg: None\n debugger.run(\n file=filename,\n globals=None,\n locals=None,\n is_module=is_module,\n set_trace=False)\n\n\nclass Daemon(DaemonBase):\n \"\"\"The process-level manager for the VSC protocol debug adapter.\"\"\"\n\n LAUNCH_TIMEOUT = 10000 # seconds\n\n class SESSION(DebugSession):\n class MESSAGE_PROCESSOR(VSCLifecycleMsgProcessor):\n\n def on_setBreakpoints(self, request, args):\n # Note: breakpoints is required (vscode will terminate\n # the debugger if that's not the case).\n # See: https://github.com/microsoft/ptvsd/issues/1408\n self.send_response(\n request,\n success=True,\n breakpoints=(\n [{'verified': False}] * len(args.get('breakpoints', ()))\n )\n )\n\n def on_invalid_request(self, request, args):\n self.send_response(request, success=True)\n\n def wait_for_launch(self, addr, timeout=LAUNCH_TIMEOUT):\n # TODO: docstring\n launched = threading.Event()\n _, start_session = self.start_client(addr)\n start_session(\n notify_launch=launched.set,\n )\n return launched.wait(timeout)\n\n def _start(self):\n import weakref\n weak_self = weakref.ref(self) # Avoid cyclic ref\n\n def on_stdout(msg):\n self = weak_self()\n if self is not None:\n self._send_output('stdout', msg)\n\n def on_stderr(msg):\n self = weak_self()\n if self is not None:\n self._send_output('stderr', msg)\n\n init_stdout_redirect(on_stdout)\n init_stderr_redirect(on_stderr)\n return NoSocket()\n\n def _close(self):\n super(Daemon, self)._close()\n\n def _send_output(self, category, output):\n if self.session is None:\n return\n self.session._msgprocessor.send_event('output',\n category=category,\n output=output)\n\n\nclass NoSocket(object):\n \"\"\"A object with a noop socket lifecycle.\"\"\"\n\n def shutdown(self, *args, **kwargs):\n pass\n\n def close(self):\n pass\n", "path": "src/ptvsd/runner.py"}]} | 1,440 | 201 |
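The ptvsd fix above hinges on a Debug Adapter Protocol detail: a `setBreakpoints` request must be answered with one breakpoint entry per breakpoint the client sent, otherwise VS Code treats the adapter as broken and tears the session down — which is why the no-debug launcher was dying with `Terminated` as soon as a breakpoint existed. A rough sketch of the reply the new handler amounts to (the helper name and dict shape below are illustrative, not ptvsd internals):

```python
def sketch_set_breakpoints_reply(args):
    # One unverified entry per breakpoint the client asked for; in --nodebug
    # mode nothing is actually set, but the client still expects the list back.
    requested = args.get('breakpoints', ())
    return {
        'success': True,
        'body': {'breakpoints': [{'verified': False} for _ in requested]},
    }
```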
gh_patches_debug_41696 | rasdani/github-patches | git_diff | carpentries__amy-475 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Admin commands broken
Please check all the admin commands (use `make` on its own to get a list):
```
$ make airports
/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/crispy_forms/utils.py:25: RemovedInDjango19Warning: memoize wrapper is deprecated and will be removed in Django 1.9. Use django.utils.lru_cache instead.
default_field_template = memoize(default_field_template, {}, 1)
Traceback (most recent call last):
File "manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py", line 338, in execute_from_command_line
utility.execute()
File "/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py", line 330, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py", line 190, in fetch_command
klass = load_command_class(app_name, subcommand)
File "/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py", line 40, in load_command_class
module = import_module('%s.management.commands.%s' % (app_name, name))
File "/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1582, in _gcd_import
File "<frozen importlib._bootstrap>", line 1563, in _find_and_load
File "<frozen importlib._bootstrap>", line 1530, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 584, in _check_name_wrapper
File "<frozen importlib._bootstrap>", line 1022, in load_module
File "<frozen importlib._bootstrap>", line 1003, in load_module
File "<frozen importlib._bootstrap>", line 560, in module_for_loader_wrapper
File "<frozen importlib._bootstrap>", line 868, in _load_module
File "<frozen importlib._bootstrap>", line 313, in _call_with_frames_removed
File "/Users/gvwilson/s/amy/workshops/management/commands/export_airports.py", line 3, in <module>
from workshops.views import _export_instructors
ImportError: cannot import name _export_instructors
make: *** [airports] Error 1
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `workshops/management/commands/export_airports.py`
Content:
```
1 import yaml
2 from django.core.management.base import BaseCommand, CommandError
3 from workshops.views import _export_instructors
4
5 class Command(BaseCommand):
6 args = 'no arguments'
7 help = 'Display YAML for airports.'
8
9 def handle(self, *args, **options):
10 print(yaml.dump(_export_instructors()).rstrip())
11
```
Path: `workshops/management/commands/export_badges.py`
Content:
```
1 import yaml
2 from django.core.management.base import BaseCommand, CommandError
3 from workshops.views import _export_badges
4
5 class Command(BaseCommand):
6 args = 'no arguments'
7 help = 'Display YAML for badges.'
8
9 def handle(self, *args, **options):
10 print(yaml.dump(_export_badges()).rstrip())
11
```
Path: `api/views.py`
Content:
```
1 from django.db.models import Q
2 from rest_framework.generics import ListAPIView
3 from rest_framework.permissions import IsAuthenticatedOrReadOnly
4 from rest_framework.response import Response
5 from rest_framework.reverse import reverse
6 from rest_framework.views import APIView
7
8 from workshops.models import Badge, Airport, Event
9
10 from .serializers import (
11 ExportBadgesSerializer,
12 ExportInstructorLocationsSerializer,
13 EventSerializer,
14 )
15
16
17 class ApiRoot(APIView):
18 def get(self, request, format=None):
19 return Response({
20 'export-badges': reverse('api:export-badges', request=request,
21 format=format),
22 'export-instructors': reverse('api:export-instructors',
23 request=request, format=format),
24 'events-published': reverse('api:events-published',
25 request=request, format=format),
26 })
27
28
29 class ExportBadgesView(APIView):
30 """List all badges and people who have them."""
31 permission_classes = (IsAuthenticatedOrReadOnly, )
32
33 def get(self, request, format=None):
34 badges = Badge.objects.prefetch_related('person_set')
35 serializer = ExportBadgesSerializer(badges, many=True)
36 return Response(serializer.data)
37
38
39 class ExportInstructorLocationsView(APIView):
40 """List all airports and instructors located near them."""
41 permission_classes = (IsAuthenticatedOrReadOnly, )
42
43 def get(self, request, format=None):
44 # TODO: return only people marked as instructors?
45 airports = Airport.objects.exclude(person=None) \
46 .prefetch_related('person_set')
47 serializer = ExportInstructorLocationsSerializer(airports, many=True)
48 return Response(serializer.data)
49
50
51 class ListEvents(APIView):
52 # I wanted to use ListAPIView, but it had problems with the way we test
53 # this code... Basically ListAPIView uses pagination, and pagination
54 # requires existing Request object - something we're faking in part of the
55 # tests (request = None).
56 serializer_class = EventSerializer
57 permission_classes = (IsAuthenticatedOrReadOnly, )
58 queryset = None # override this in the subclass
59
60 def get(self, request, format=None):
61 objects = self.queryset.all()
62 serializer = self.serializer_class(objects, many=True)
63 return Response(serializer.data)
64
65
66 class PublishedEvents(ListEvents):
67 # only events that have both a starting date and a URL
68 queryset = Event.objects.exclude(
69 Q(start__isnull=True) | Q(url__isnull=True)
70 ).order_by('-start')
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/api/views.py b/api/views.py
--- a/api/views.py
+++ b/api/views.py
@@ -26,45 +26,31 @@
})
-class ExportBadgesView(APIView):
+class ExportBadgesView(ListAPIView):
"""List all badges and people who have them."""
permission_classes = (IsAuthenticatedOrReadOnly, )
+ paginator = None # disable pagination
- def get(self, request, format=None):
- badges = Badge.objects.prefetch_related('person_set')
- serializer = ExportBadgesSerializer(badges, many=True)
- return Response(serializer.data)
+ queryset = Badge.objects.prefetch_related('person_set')
+ serializer_class = ExportBadgesSerializer
-class ExportInstructorLocationsView(APIView):
+class ExportInstructorLocationsView(ListAPIView):
"""List all airports and instructors located near them."""
permission_classes = (IsAuthenticatedOrReadOnly, )
+ paginator = None # disable pagination
- def get(self, request, format=None):
- # TODO: return only people marked as instructors?
- airports = Airport.objects.exclude(person=None) \
- .prefetch_related('person_set')
- serializer = ExportInstructorLocationsSerializer(airports, many=True)
- return Response(serializer.data)
+ queryset = Airport.objects.exclude(person=None) \
+ .prefetch_related('person_set')
+ serializer_class = ExportInstructorLocationsSerializer
-class ListEvents(APIView):
- # I wanted to use ListAPIView, but it had problems with the way we test
- # this code... Basically ListAPIView uses pagination, and pagination
- # requires existing Request object - something we're faking in part of the
- # tests (request = None).
- serializer_class = EventSerializer
+class PublishedEvents(ListAPIView):
+ # only events that have both a starting date and a URL
permission_classes = (IsAuthenticatedOrReadOnly, )
- queryset = None # override this in the subclass
+ paginator = None # disable pagination
- def get(self, request, format=None):
- objects = self.queryset.all()
- serializer = self.serializer_class(objects, many=True)
- return Response(serializer.data)
-
-
-class PublishedEvents(ListEvents):
- # only events that have both a starting date and a URL
+ serializer_class = EventSerializer
queryset = Event.objects.exclude(
Q(start__isnull=True) | Q(url__isnull=True)
).order_by('-start')
diff --git a/workshops/management/commands/export_airports.py b/workshops/management/commands/export_airports.py
--- a/workshops/management/commands/export_airports.py
+++ b/workshops/management/commands/export_airports.py
@@ -1,10 +1,14 @@
-import yaml
-from django.core.management.base import BaseCommand, CommandError
-from workshops.views import _export_instructors
+from django.core.management.base import BaseCommand
+from django.core.urlresolvers import reverse
+from rest_framework.test import APIClient
+
class Command(BaseCommand):
args = 'no arguments'
help = 'Display YAML for airports.'
def handle(self, *args, **options):
- print(yaml.dump(_export_instructors()).rstrip())
+ client = APIClient()
+ response = client.get(reverse('api:export-instructors'),
+ {'format': 'yaml'})
+ print(response.content.decode('utf-8'))
diff --git a/workshops/management/commands/export_badges.py b/workshops/management/commands/export_badges.py
--- a/workshops/management/commands/export_badges.py
+++ b/workshops/management/commands/export_badges.py
@@ -1,10 +1,14 @@
-import yaml
-from django.core.management.base import BaseCommand, CommandError
-from workshops.views import _export_badges
+from django.core.management.base import BaseCommand
+from django.core.urlresolvers import reverse
+from rest_framework.test import APIClient
+
class Command(BaseCommand):
args = 'no arguments'
help = 'Display YAML for badges.'
def handle(self, *args, **options):
- print(yaml.dump(_export_badges()).rstrip())
+ client = APIClient()
+ response = client.get(reverse('api:export-badges'),
+ {'format': 'yaml'})
+ print(response.content.decode('utf-8'))
| {"golden_diff": "diff --git a/api/views.py b/api/views.py\n--- a/api/views.py\n+++ b/api/views.py\n@@ -26,45 +26,31 @@\n })\n \n \n-class ExportBadgesView(APIView):\n+class ExportBadgesView(ListAPIView):\n \"\"\"List all badges and people who have them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n+ paginator = None # disable pagination\n \n- def get(self, request, format=None):\n- badges = Badge.objects.prefetch_related('person_set')\n- serializer = ExportBadgesSerializer(badges, many=True)\n- return Response(serializer.data)\n+ queryset = Badge.objects.prefetch_related('person_set')\n+ serializer_class = ExportBadgesSerializer\n \n \n-class ExportInstructorLocationsView(APIView):\n+class ExportInstructorLocationsView(ListAPIView):\n \"\"\"List all airports and instructors located near them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n+ paginator = None # disable pagination\n \n- def get(self, request, format=None):\n- # TODO: return only people marked as instructors?\n- airports = Airport.objects.exclude(person=None) \\\n- .prefetch_related('person_set')\n- serializer = ExportInstructorLocationsSerializer(airports, many=True)\n- return Response(serializer.data)\n+ queryset = Airport.objects.exclude(person=None) \\\n+ .prefetch_related('person_set')\n+ serializer_class = ExportInstructorLocationsSerializer\n \n \n-class ListEvents(APIView):\n- # I wanted to use ListAPIView, but it had problems with the way we test\n- # this code... Basically ListAPIView uses pagination, and pagination\n- # requires existing Request object - something we're faking in part of the\n- # tests (request = None).\n- serializer_class = EventSerializer\n+class PublishedEvents(ListAPIView):\n+ # only events that have both a starting date and a URL\n permission_classes = (IsAuthenticatedOrReadOnly, )\n- queryset = None # override this in the subclass\n+ paginator = None # disable pagination\n \n- def get(self, request, format=None):\n- objects = self.queryset.all()\n- serializer = self.serializer_class(objects, many=True)\n- return Response(serializer.data)\n-\n-\n-class PublishedEvents(ListEvents):\n- # only events that have both a starting date and a URL\n+ serializer_class = EventSerializer\n queryset = Event.objects.exclude(\n Q(start__isnull=True) | Q(url__isnull=True)\n ).order_by('-start')\ndiff --git a/workshops/management/commands/export_airports.py b/workshops/management/commands/export_airports.py\n--- a/workshops/management/commands/export_airports.py\n+++ b/workshops/management/commands/export_airports.py\n@@ -1,10 +1,14 @@\n-import yaml\n-from django.core.management.base import BaseCommand, CommandError\n-from workshops.views import _export_instructors\n+from django.core.management.base import BaseCommand\n+from django.core.urlresolvers import reverse\n+from rest_framework.test import APIClient\n+\n \n class Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for airports.'\n \n def handle(self, *args, **options):\n- print(yaml.dump(_export_instructors()).rstrip())\n+ client = APIClient()\n+ response = client.get(reverse('api:export-instructors'),\n+ {'format': 'yaml'})\n+ print(response.content.decode('utf-8'))\ndiff --git a/workshops/management/commands/export_badges.py b/workshops/management/commands/export_badges.py\n--- a/workshops/management/commands/export_badges.py\n+++ b/workshops/management/commands/export_badges.py\n@@ -1,10 +1,14 @@\n-import yaml\n-from django.core.management.base import BaseCommand, CommandError\n-from workshops.views 
import _export_badges\n+from django.core.management.base import BaseCommand\n+from django.core.urlresolvers import reverse\n+from rest_framework.test import APIClient\n+\n \n class Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for badges.'\n \n def handle(self, *args, **options):\n- print(yaml.dump(_export_badges()).rstrip())\n+ client = APIClient()\n+ response = client.get(reverse('api:export-badges'),\n+ {'format': 'yaml'})\n+ print(response.content.decode('utf-8'))\n", "issue": "Admin commands broken\nPlease check all the admin commands (use `make` on its own to get a list):\n\n```\n$ make airports\n\n/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/crispy_forms/utils.py:25: RemovedInDjango19Warning: memoize wrapper is deprecated and will be removed in Django 1.9. Use django.utils.lru_cache instead.\n default_field_template = memoize(default_field_template, {}, 1)\n\nTraceback (most recent call last):\n File \"manage.py\", line 10, in <module>\n execute_from_command_line(sys.argv)\n File \"/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py\", line 338, in execute_from_command_line\n utility.execute()\n File \"/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py\", line 330, in execute\n self.fetch_command(subcommand).run_from_argv(self.argv)\n File \"/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py\", line 190, in fetch_command\n klass = load_command_class(app_name, subcommand)\n File \"/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/site-packages/django/core/management/__init__.py\", line 40, in load_command_class\n module = import_module('%s.management.commands.%s' % (app_name, name))\n File \"/Users/gvwilson/anaconda/envs/py3k/lib/python3.3/importlib/__init__.py\", line 90, in import_module\n return _bootstrap._gcd_import(name[level:], package, level)\n File \"<frozen importlib._bootstrap>\", line 1582, in _gcd_import\n File \"<frozen importlib._bootstrap>\", line 1563, in _find_and_load\n File \"<frozen importlib._bootstrap>\", line 1530, in _find_and_load_unlocked\n File \"<frozen importlib._bootstrap>\", line 584, in _check_name_wrapper\n File \"<frozen importlib._bootstrap>\", line 1022, in load_module\n File \"<frozen importlib._bootstrap>\", line 1003, in load_module\n File \"<frozen importlib._bootstrap>\", line 560, in module_for_loader_wrapper\n File \"<frozen importlib._bootstrap>\", line 868, in _load_module\n File \"<frozen importlib._bootstrap>\", line 313, in _call_with_frames_removed\n File \"/Users/gvwilson/s/amy/workshops/management/commands/export_airports.py\", line 3, in <module>\n from workshops.views import _export_instructors\nImportError: cannot import name _export_instructors\nmake: *** [airports] Error 1\n```\n\n", "before_files": [{"content": "import yaml\nfrom django.core.management.base import BaseCommand, CommandError\nfrom workshops.views import _export_instructors\n\nclass Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for airports.'\n\n def handle(self, *args, **options):\n print(yaml.dump(_export_instructors()).rstrip())\n", "path": "workshops/management/commands/export_airports.py"}, {"content": "import yaml\nfrom django.core.management.base import BaseCommand, CommandError\nfrom workshops.views import _export_badges\n\nclass Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for badges.'\n\n def handle(self, *args, **options):\n 
print(yaml.dump(_export_badges()).rstrip())\n", "path": "workshops/management/commands/export_badges.py"}, {"content": "from django.db.models import Q\nfrom rest_framework.generics import ListAPIView\nfrom rest_framework.permissions import IsAuthenticatedOrReadOnly\nfrom rest_framework.response import Response\nfrom rest_framework.reverse import reverse\nfrom rest_framework.views import APIView\n\nfrom workshops.models import Badge, Airport, Event\n\nfrom .serializers import (\n ExportBadgesSerializer,\n ExportInstructorLocationsSerializer,\n EventSerializer,\n)\n\n\nclass ApiRoot(APIView):\n def get(self, request, format=None):\n return Response({\n 'export-badges': reverse('api:export-badges', request=request,\n format=format),\n 'export-instructors': reverse('api:export-instructors',\n request=request, format=format),\n 'events-published': reverse('api:events-published',\n request=request, format=format),\n })\n\n\nclass ExportBadgesView(APIView):\n \"\"\"List all badges and people who have them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n\n def get(self, request, format=None):\n badges = Badge.objects.prefetch_related('person_set')\n serializer = ExportBadgesSerializer(badges, many=True)\n return Response(serializer.data)\n\n\nclass ExportInstructorLocationsView(APIView):\n \"\"\"List all airports and instructors located near them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n\n def get(self, request, format=None):\n # TODO: return only people marked as instructors?\n airports = Airport.objects.exclude(person=None) \\\n .prefetch_related('person_set')\n serializer = ExportInstructorLocationsSerializer(airports, many=True)\n return Response(serializer.data)\n\n\nclass ListEvents(APIView):\n # I wanted to use ListAPIView, but it had problems with the way we test\n # this code... 
Basically ListAPIView uses pagination, and pagination\n # requires existing Request object - something we're faking in part of the\n # tests (request = None).\n serializer_class = EventSerializer\n permission_classes = (IsAuthenticatedOrReadOnly, )\n queryset = None # override this in the subclass\n\n def get(self, request, format=None):\n objects = self.queryset.all()\n serializer = self.serializer_class(objects, many=True)\n return Response(serializer.data)\n\n\nclass PublishedEvents(ListEvents):\n # only events that have both a starting date and a URL\n queryset = Event.objects.exclude(\n Q(start__isnull=True) | Q(url__isnull=True)\n ).order_by('-start')\n", "path": "api/views.py"}], "after_files": [{"content": "from django.core.management.base import BaseCommand\nfrom django.core.urlresolvers import reverse\nfrom rest_framework.test import APIClient\n\n\nclass Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for airports.'\n\n def handle(self, *args, **options):\n client = APIClient()\n response = client.get(reverse('api:export-instructors'),\n {'format': 'yaml'})\n print(response.content.decode('utf-8'))\n", "path": "workshops/management/commands/export_airports.py"}, {"content": "from django.core.management.base import BaseCommand\nfrom django.core.urlresolvers import reverse\nfrom rest_framework.test import APIClient\n\n\nclass Command(BaseCommand):\n args = 'no arguments'\n help = 'Display YAML for badges.'\n\n def handle(self, *args, **options):\n client = APIClient()\n response = client.get(reverse('api:export-badges'),\n {'format': 'yaml'})\n print(response.content.decode('utf-8'))\n", "path": "workshops/management/commands/export_badges.py"}, {"content": "from django.db.models import Q\nfrom rest_framework.generics import ListAPIView\nfrom rest_framework.permissions import IsAuthenticatedOrReadOnly\nfrom rest_framework.response import Response\nfrom rest_framework.reverse import reverse\nfrom rest_framework.views import APIView\n\nfrom workshops.models import Badge, Airport, Event\n\nfrom .serializers import (\n ExportBadgesSerializer,\n ExportInstructorLocationsSerializer,\n EventSerializer,\n)\n\n\nclass ApiRoot(APIView):\n def get(self, request, format=None):\n return Response({\n 'export-badges': reverse('api:export-badges', request=request,\n format=format),\n 'export-instructors': reverse('api:export-instructors',\n request=request, format=format),\n 'events-published': reverse('api:events-published',\n request=request, format=format),\n })\n\n\nclass ExportBadgesView(ListAPIView):\n \"\"\"List all badges and people who have them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n paginator = None # disable pagination\n\n queryset = Badge.objects.prefetch_related('person_set')\n serializer_class = ExportBadgesSerializer\n\n\nclass ExportInstructorLocationsView(ListAPIView):\n \"\"\"List all airports and instructors located near them.\"\"\"\n permission_classes = (IsAuthenticatedOrReadOnly, )\n paginator = None # disable pagination\n\n queryset = Airport.objects.exclude(person=None) \\\n .prefetch_related('person_set')\n serializer_class = ExportInstructorLocationsSerializer\n\n\nclass PublishedEvents(ListAPIView):\n # only events that have both a starting date and a URL\n permission_classes = (IsAuthenticatedOrReadOnly, )\n paginator = None # disable pagination\n\n serializer_class = EventSerializer\n queryset = Event.objects.exclude(\n Q(start__isnull=True) | Q(url__isnull=True)\n ).order_by('-start')\n", "path": "api/views.py"}]} | 1,798 | 
954 |
gh_patches_debug_13993 | rasdani/github-patches | git_diff | google__flax-1324 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AttributeError: module 'flax.linen' has no attribute 'merge_param'
[This guide](https://flax.readthedocs.io/en/latest/design_notes/arguments.html) suggests using `nn.merge_param` to combine arguments, but `merge_param` is only available through `nn.module.merge_param`. I believe it needs to be added to the import line [here](https://github.com/google/flax/blob/4ae9143f7ef46ffab6d9123ba1b2e4f3303e68d1/flax/linen/__init__.py#L28). I can open a PR if this is the case.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flax/linen/__init__.py`
Content:
```
1 # Copyright 2021 The Flax Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """The Flax Module system."""
16
17
18 # pylint: disable=g-multiple-import
19 # re-export commonly used modules and functions
20 from .activation import (celu, elu, gelu, glu, leaky_relu, log_sigmoid,
21 log_softmax, relu, sigmoid, soft_sign, softmax,
22 softplus, swish, silu, tanh)
23 from .attention import (MultiHeadDotProductAttention, SelfAttention,
24 dot_product_attention, make_attention_mask,
25 make_causal_mask, combine_masks)
26 from ..core import broadcast, DenyList
27 from .linear import Conv, ConvTranspose, Dense, DenseGeneral, Embed
28 from .module import Module, compact, enable_named_call, disable_named_call, Variable, init, init_with_output, apply
29 from .normalization import BatchNorm, GroupNorm, LayerNorm
30 from .pooling import avg_pool, max_pool
31 from .recurrent import GRUCell, LSTMCell, ConvLSTM, OptimizedLSTMCell
32 from .stochastic import Dropout
33 from .transforms import jit, named_call, remat, scan, vmap
34 from .initializers import zeros, ones
35
36 # pylint: enable=g-multiple-import
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flax/linen/__init__.py b/flax/linen/__init__.py
--- a/flax/linen/__init__.py
+++ b/flax/linen/__init__.py
@@ -25,7 +25,8 @@
make_causal_mask, combine_masks)
from ..core import broadcast, DenyList
from .linear import Conv, ConvTranspose, Dense, DenseGeneral, Embed
-from .module import Module, compact, enable_named_call, disable_named_call, Variable, init, init_with_output, apply
+from .module import (Module, compact, enable_named_call, disable_named_call,
+ Variable, init, init_with_output, apply, merge_param)
from .normalization import BatchNorm, GroupNorm, LayerNorm
from .pooling import avg_pool, max_pool
from .recurrent import GRUCell, LSTMCell, ConvLSTM, OptimizedLSTMCell
| {"golden_diff": "diff --git a/flax/linen/__init__.py b/flax/linen/__init__.py\n--- a/flax/linen/__init__.py\n+++ b/flax/linen/__init__.py\n@@ -25,7 +25,8 @@\n make_causal_mask, combine_masks)\n from ..core import broadcast, DenyList\n from .linear import Conv, ConvTranspose, Dense, DenseGeneral, Embed\n-from .module import Module, compact, enable_named_call, disable_named_call, Variable, init, init_with_output, apply\n+from .module import (Module, compact, enable_named_call, disable_named_call,\n+ Variable, init, init_with_output, apply, merge_param)\n from .normalization import BatchNorm, GroupNorm, LayerNorm\n from .pooling import avg_pool, max_pool\n from .recurrent import GRUCell, LSTMCell, ConvLSTM, OptimizedLSTMCell\n", "issue": "AttributeError: module 'flax.linen' has no attribute 'merge_param'\n[This guide](https://flax.readthedocs.io/en/latest/design_notes/arguments.html) suggests using `nn.merge_param` to combine arguments, but `merge_param` is only available through `nn.module.merge_param`. I believe it needs to be added to the import line [here](https://github.com/google/flax/blob/4ae9143f7ef46ffab6d9123ba1b2e4f3303e68d1/flax/linen/__init__.py#L28). I can open a PR if this is the case.\r\n\n", "before_files": [{"content": "# Copyright 2021 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"The Flax Module system.\"\"\"\n\n\n# pylint: disable=g-multiple-import\n# re-export commonly used modules and functions\nfrom .activation import (celu, elu, gelu, glu, leaky_relu, log_sigmoid,\n log_softmax, relu, sigmoid, soft_sign, softmax,\n softplus, swish, silu, tanh)\nfrom .attention import (MultiHeadDotProductAttention, SelfAttention,\n dot_product_attention, make_attention_mask,\n make_causal_mask, combine_masks)\nfrom ..core import broadcast, DenyList\nfrom .linear import Conv, ConvTranspose, Dense, DenseGeneral, Embed\nfrom .module import Module, compact, enable_named_call, disable_named_call, Variable, init, init_with_output, apply\nfrom .normalization import BatchNorm, GroupNorm, LayerNorm\nfrom .pooling import avg_pool, max_pool\nfrom .recurrent import GRUCell, LSTMCell, ConvLSTM, OptimizedLSTMCell\nfrom .stochastic import Dropout\nfrom .transforms import jit, named_call, remat, scan, vmap\nfrom .initializers import zeros, ones\n\n# pylint: enable=g-multiple-import\n", "path": "flax/linen/__init__.py"}], "after_files": [{"content": "# Copyright 2021 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"The Flax Module 
system.\"\"\"\n\n\n# pylint: disable=g-multiple-import\n# re-export commonly used modules and functions\nfrom .activation import (celu, elu, gelu, glu, leaky_relu, log_sigmoid,\n log_softmax, relu, sigmoid, soft_sign, softmax,\n softplus, swish, silu, tanh)\nfrom .attention import (MultiHeadDotProductAttention, SelfAttention,\n dot_product_attention, make_attention_mask,\n make_causal_mask, combine_masks)\nfrom ..core import broadcast, DenyList\nfrom .linear import Conv, ConvTranspose, Dense, DenseGeneral, Embed\nfrom .module import (Module, compact, enable_named_call, disable_named_call,\n Variable, init, init_with_output, apply, merge_param)\nfrom .normalization import BatchNorm, GroupNorm, LayerNorm\nfrom .pooling import avg_pool, max_pool\nfrom .recurrent import GRUCell, LSTMCell, ConvLSTM, OptimizedLSTMCell\nfrom .stochastic import Dropout\nfrom .transforms import jit, named_call, remat, scan, vmap\nfrom .initializers import zeros, ones\n\n# pylint: enable=g-multiple-import\n", "path": "flax/linen/__init__.py"}]} | 860 | 201 |
gh_patches_debug_28915 | rasdani/github-patches | git_diff | ckan__ckan-6797 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support for pdb and debuggers
### Proposed fixes:
It is now possible to debug ckan with pdb/ipdb/PyCharm debugger and others, both outside Docker and inside Docker.
I just exposed a `werkzeug` option to the CKAN CLI, called `passthrough_errors`. Enabling that, together with `--disable-reloader` (which should be the default in my opinion, like it was in the past), allow to run pdb without making other changes to the source code.
`threads` should not be enabled and `processes` should be set to 1. These are the defaults already.
> passthrough_errors (bool) – set this to True to disable the error catching. This means that the server will die on errors but it can be useful to hook debuggers in (pdb etc.)
-- https://werkzeug.palletsprojects.com/en/2.0.x/serving/
Example:
```
$ cd contrib/docker
$ docker-compose up --build -d
$ # wait...
$ docker-compose exec ckan bash
root@f6a71d0b7686:/# python3 -m pdb /usr/lib/ckan/venv/bin/ckan -c /etc/ckan/production.ini run --host 0.0.0.0 -E --disable-reloader
> /usr/lib/ckan/venv/bin/ckan(3)<module>()
-> import re
(Pdb) b ckan/views/api.py:215
Breakpoint 1 at /usr/lib/ckan/venv/src/ckan/ckan/views/api.py:215
(Pdb) c
2021-11-01 17:00:50,832 INFO [ckan.cli] Using configuration file /etc/ckan/production.ini
2021-11-01 17:00:50,832 INFO [ckan.config.environment] Loading static files from public
2021-11-01 17:00:50,954 INFO [ckan.config.environment] Loading templates from /usr/lib/ckan/venv/src/ckan/ckan/templates
2021-11-01 17:00:51,552 INFO [ckan.config.environment] Loading templates from /usr/lib/ckan/venv/src/ckan/ckan/templates
2021-11-01 17:00:52,173 INFO [ckan.cli.server] Running CKAN on http://0.0.0.0:5000
2021-11-01 17:00:52,174 WARNI [werkzeug] * Running on all addresses.
WARNING: This is a development server. Do not use it in a production deployment.
```
...then `http://localhost:5000/api/3/action/package_search` can be opened in the web browser to trigger the breakpoint:
```
> /usr/lib/ckan/venv/src/ckan/ckan/views/api.py(215)action()
-> try:
(Pdb)
```
### Features:
- [ ] includes tests covering changes
- [x] includes updated documentation
- [X] includes user-visible changes
- [ ] includes API changes
- [ ] includes bugfix for possible backport
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ckan/cli/server.py`
Content:
```
1 # encoding: utf-8
2
3 import logging
4
5 import click
6 from werkzeug.serving import run_simple
7
8 import ckan.plugins.toolkit as tk
9 from ckan.common import config
10
11 log = logging.getLogger(__name__)
12
13 DEFAULT_HOST = u"localhost"
14 DEFAULT_PORT = 5000
15
16
17 @click.command(u"run", short_help=u"Start development server")
18 @click.option(u"-H", u"--host", help=u"Host name")
19 @click.option(u"-p", u"--port", help=u"Port number")
20 @click.option(u"-r", u"--disable-reloader", is_flag=True,
21 help=u"Disable reloader")
22 @click.option(
23 u"-t", u"--threaded", is_flag=True,
24 help=u"Handle each request in a separate thread"
25 )
26 @click.option(
27 u"--processes", type=int, default=0,
28 help=u"Maximum number of concurrent processes"
29 )
30 @click.option(
31 u"-e", u"--extra-files", multiple=True,
32 help=u"Additional files that should be watched for server reloading"
33 " (you can provide multiple values)")
34 @click.option(
35 u"-C", u"--ssl-cert", default=None,
36 help=u"Certificate file to use to enable SSL. Passing 'adhoc' will "
37 " automatically generate a new one (on each server reload).")
38 @click.option(
39 u"-K", u"--ssl-key", default=None,
40 help=u"Key file to use to enable SSL. Passing 'adhoc' will "
41 " automatically generate a new one (on each server reload).")
42 @click.pass_context
43 def run(ctx, host, port, disable_reloader, threaded, extra_files, processes,
44 ssl_cert, ssl_key):
45 u"""Runs the Werkzeug development server"""
46
47 # Reloading
48 use_reloader = not disable_reloader
49 config_extra_files = tk.aslist(
50 config.get(u"ckan.devserver.watch_patterns")
51 )
52 extra_files = list(extra_files) + [
53 config[u"__file__"]
54 ] + config_extra_files
55
56 # Threads and processes
57 threaded = threaded or tk.asbool(config.get(u"ckan.devserver.threaded"))
58 processes = processes or tk.asint(
59 config.get(u"ckan.devserver.multiprocess", 1)
60 )
61 if threaded and processes > 1:
62 tk.error_shout(u"Cannot have a multithreaded and multi process server")
63 raise click.Abort()
64
65 # SSL
66 cert_file = ssl_cert or config.get(u"ckan.devserver.ssl_cert")
67 key_file = ssl_key or config.get(u"ckan.devserver.ssl_key")
68
69 if cert_file and key_file:
70 if cert_file == key_file == u"adhoc":
71 ssl_context = u"adhoc"
72 else:
73 ssl_context = (ssl_cert, ssl_key)
74 else:
75 ssl_context = None
76
77 host = host or config.get(u'ckan.devserver.host', DEFAULT_HOST)
78 port = port or config.get(u'ckan.devserver.port', DEFAULT_PORT)
79 try:
80 port = int(port)
81 except ValueError:
82 tk.error_shout(u"Server port must be an integer, not {}".format(port))
83 raise click.Abort()
84
85 log.info(u"Running CKAN on {scheme}://{host}:{port}".format(
86 scheme=u"https" if ssl_context else u"http", host=host, port=port))
87
88 run_simple(
89 host,
90 port,
91 ctx.obj.app,
92 use_reloader=use_reloader,
93 use_evalex=True,
94 threaded=threaded,
95 processes=processes,
96 extra_files=extra_files,
97 ssl_context=ssl_context,
98 )
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ckan/cli/server.py b/ckan/cli/server.py
--- a/ckan/cli/server.py
+++ b/ckan/cli/server.py
@@ -19,6 +19,8 @@
@click.option(u"-p", u"--port", help=u"Port number")
@click.option(u"-r", u"--disable-reloader", is_flag=True,
help=u"Disable reloader")
[email protected](u"-E", u"--passthrough-errors", is_flag=True,
+ help=u"Disable error caching (useful to hook debuggers)")
@click.option(
u"-t", u"--threaded", is_flag=True,
help=u"Handle each request in a separate thread"
@@ -40,10 +42,16 @@
help=u"Key file to use to enable SSL. Passing 'adhoc' will "
" automatically generate a new one (on each server reload).")
@click.pass_context
-def run(ctx, host, port, disable_reloader, threaded, extra_files, processes,
- ssl_cert, ssl_key):
+def run(ctx, host, port, disable_reloader, passthrough_errors, threaded,
+ extra_files, processes, ssl_cert, ssl_key):
u"""Runs the Werkzeug development server"""
+ # passthrough_errors overrides conflicting options
+ if passthrough_errors:
+ disable_reloader = True
+ threaded = False
+ processes = 1
+
# Reloading
use_reloader = not disable_reloader
config_extra_files = tk.aslist(
@@ -95,4 +103,5 @@
processes=processes,
extra_files=extra_files,
ssl_context=ssl_context,
+ passthrough_errors=passthrough_errors,
)
| {"golden_diff": "diff --git a/ckan/cli/server.py b/ckan/cli/server.py\n--- a/ckan/cli/server.py\n+++ b/ckan/cli/server.py\n@@ -19,6 +19,8 @@\n @click.option(u\"-p\", u\"--port\", help=u\"Port number\")\n @click.option(u\"-r\", u\"--disable-reloader\", is_flag=True,\n help=u\"Disable reloader\")\[email protected](u\"-E\", u\"--passthrough-errors\", is_flag=True,\n+ help=u\"Disable error caching (useful to hook debuggers)\")\n @click.option(\n u\"-t\", u\"--threaded\", is_flag=True,\n help=u\"Handle each request in a separate thread\"\n@@ -40,10 +42,16 @@\n help=u\"Key file to use to enable SSL. Passing 'adhoc' will \"\n \" automatically generate a new one (on each server reload).\")\n @click.pass_context\n-def run(ctx, host, port, disable_reloader, threaded, extra_files, processes,\n- ssl_cert, ssl_key):\n+def run(ctx, host, port, disable_reloader, passthrough_errors, threaded,\n+ extra_files, processes, ssl_cert, ssl_key):\n u\"\"\"Runs the Werkzeug development server\"\"\"\n \n+ # passthrough_errors overrides conflicting options\n+ if passthrough_errors:\n+ disable_reloader = True\n+ threaded = False\n+ processes = 1\n+\n # Reloading\n use_reloader = not disable_reloader\n config_extra_files = tk.aslist(\n@@ -95,4 +103,5 @@\n processes=processes,\n extra_files=extra_files,\n ssl_context=ssl_context,\n+ passthrough_errors=passthrough_errors,\n )\n", "issue": "Support for pdb and debuggers\n### Proposed fixes:\r\n\r\nIt is now possible to debug ckan with pdb/ipdb/PyCharm debugger and others, both outside Docker and inside Docker.\r\nI just exposed a `werkzeug` option to the CKAN CLI, called `passthrough_errors`. Enabling that, together with `--disable-reloader` (which should be the default in my opinion, like it was in the past), allow to run pdb without making other changes to the source code.\r\n`threads` should not be enabled and `processes` should be set to 1. These are the defaults already.\r\n\r\n> passthrough_errors (bool) \u2013 set this to True to disable the error catching. This means that the server will die on errors but it can be useful to hook debuggers in (pdb etc.)\r\n-- https://werkzeug.palletsprojects.com/en/2.0.x/serving/\r\n\r\nExample:\r\n```\r\n$ cd contrib/docker\r\n$ docker-compose up --build -d\r\n$ # wait...\r\n$ docker-compose exec ckan bash\r\nroot@f6a71d0b7686:/# python3 -m pdb /usr/lib/ckan/venv/bin/ckan -c /etc/ckan/production.ini run --host 0.0.0.0 -E --disable-reloader\r\n> /usr/lib/ckan/venv/bin/ckan(3)<module>()\r\n-> import re\r\n(Pdb) b ckan/views/api.py:215\r\nBreakpoint 1 at /usr/lib/ckan/venv/src/ckan/ckan/views/api.py:215\r\n(Pdb) c\r\n2021-11-01 17:00:50,832 INFO [ckan.cli] Using configuration file /etc/ckan/production.ini\r\n2021-11-01 17:00:50,832 INFO [ckan.config.environment] Loading static files from public\r\n2021-11-01 17:00:50,954 INFO [ckan.config.environment] Loading templates from /usr/lib/ckan/venv/src/ckan/ckan/templates\r\n2021-11-01 17:00:51,552 INFO [ckan.config.environment] Loading templates from /usr/lib/ckan/venv/src/ckan/ckan/templates\r\n2021-11-01 17:00:52,173 INFO [ckan.cli.server] Running CKAN on http://0.0.0.0:5000\r\n2021-11-01 17:00:52,174 WARNI [werkzeug] * Running on all addresses.\r\n WARNING: This is a development server. 
Do not use it in a production deployment.\r\n```\r\n\r\n...then `http://localhost:5000/api/3/action/package_search` can be opened in the web browser to trigger the breakpoint:\r\n\r\n```\r\n> /usr/lib/ckan/venv/src/ckan/ckan/views/api.py(215)action()\r\n-> try:\r\n(Pdb)\r\n```\r\n\r\n### Features:\r\n\r\n- [ ] includes tests covering changes\r\n- [x] includes updated documentation\r\n- [X] includes user-visible changes\r\n- [ ] includes API changes\r\n- [ ] includes bugfix for possible backport\n", "before_files": [{"content": "# encoding: utf-8\n\nimport logging\n\nimport click\nfrom werkzeug.serving import run_simple\n\nimport ckan.plugins.toolkit as tk\nfrom ckan.common import config\n\nlog = logging.getLogger(__name__)\n\nDEFAULT_HOST = u\"localhost\"\nDEFAULT_PORT = 5000\n\n\[email protected](u\"run\", short_help=u\"Start development server\")\[email protected](u\"-H\", u\"--host\", help=u\"Host name\")\[email protected](u\"-p\", u\"--port\", help=u\"Port number\")\[email protected](u\"-r\", u\"--disable-reloader\", is_flag=True,\n help=u\"Disable reloader\")\[email protected](\n u\"-t\", u\"--threaded\", is_flag=True,\n help=u\"Handle each request in a separate thread\"\n)\[email protected](\n u\"--processes\", type=int, default=0,\n help=u\"Maximum number of concurrent processes\"\n)\[email protected](\n u\"-e\", u\"--extra-files\", multiple=True,\n help=u\"Additional files that should be watched for server reloading\"\n \" (you can provide multiple values)\")\[email protected](\n u\"-C\", u\"--ssl-cert\", default=None,\n help=u\"Certificate file to use to enable SSL. Passing 'adhoc' will \"\n \" automatically generate a new one (on each server reload).\")\[email protected](\n u\"-K\", u\"--ssl-key\", default=None,\n help=u\"Key file to use to enable SSL. 
Passing 'adhoc' will \"\n \" automatically generate a new one (on each server reload).\")\[email protected]_context\ndef run(ctx, host, port, disable_reloader, threaded, extra_files, processes,\n ssl_cert, ssl_key):\n u\"\"\"Runs the Werkzeug development server\"\"\"\n\n # Reloading\n use_reloader = not disable_reloader\n config_extra_files = tk.aslist(\n config.get(u\"ckan.devserver.watch_patterns\")\n )\n extra_files = list(extra_files) + [\n config[u\"__file__\"]\n ] + config_extra_files\n\n # Threads and processes\n threaded = threaded or tk.asbool(config.get(u\"ckan.devserver.threaded\"))\n processes = processes or tk.asint(\n config.get(u\"ckan.devserver.multiprocess\", 1)\n )\n if threaded and processes > 1:\n tk.error_shout(u\"Cannot have a multithreaded and multi process server\")\n raise click.Abort()\n\n # SSL\n cert_file = ssl_cert or config.get(u\"ckan.devserver.ssl_cert\")\n key_file = ssl_key or config.get(u\"ckan.devserver.ssl_key\")\n\n if cert_file and key_file:\n if cert_file == key_file == u\"adhoc\":\n ssl_context = u\"adhoc\"\n else:\n ssl_context = (ssl_cert, ssl_key)\n else:\n ssl_context = None\n\n host = host or config.get(u'ckan.devserver.host', DEFAULT_HOST)\n port = port or config.get(u'ckan.devserver.port', DEFAULT_PORT)\n try:\n port = int(port)\n except ValueError:\n tk.error_shout(u\"Server port must be an integer, not {}\".format(port))\n raise click.Abort()\n\n log.info(u\"Running CKAN on {scheme}://{host}:{port}\".format(\n scheme=u\"https\" if ssl_context else u\"http\", host=host, port=port))\n\n run_simple(\n host,\n port,\n ctx.obj.app,\n use_reloader=use_reloader,\n use_evalex=True,\n threaded=threaded,\n processes=processes,\n extra_files=extra_files,\n ssl_context=ssl_context,\n )\n", "path": "ckan/cli/server.py"}], "after_files": [{"content": "# encoding: utf-8\n\nimport logging\n\nimport click\nfrom werkzeug.serving import run_simple\n\nimport ckan.plugins.toolkit as tk\nfrom ckan.common import config\n\nlog = logging.getLogger(__name__)\n\nDEFAULT_HOST = u\"localhost\"\nDEFAULT_PORT = 5000\n\n\[email protected](u\"run\", short_help=u\"Start development server\")\[email protected](u\"-H\", u\"--host\", help=u\"Host name\")\[email protected](u\"-p\", u\"--port\", help=u\"Port number\")\[email protected](u\"-r\", u\"--disable-reloader\", is_flag=True,\n help=u\"Disable reloader\")\[email protected](u\"-E\", u\"--passthrough-errors\", is_flag=True,\n help=u\"Disable error caching (useful to hook debuggers)\")\[email protected](\n u\"-t\", u\"--threaded\", is_flag=True,\n help=u\"Handle each request in a separate thread\"\n)\[email protected](\n u\"--processes\", type=int, default=0,\n help=u\"Maximum number of concurrent processes\"\n)\[email protected](\n u\"-e\", u\"--extra-files\", multiple=True,\n help=u\"Additional files that should be watched for server reloading\"\n \" (you can provide multiple values)\")\[email protected](\n u\"-C\", u\"--ssl-cert\", default=None,\n help=u\"Certificate file to use to enable SSL. Passing 'adhoc' will \"\n \" automatically generate a new one (on each server reload).\")\[email protected](\n u\"-K\", u\"--ssl-key\", default=None,\n help=u\"Key file to use to enable SSL. 
Passing 'adhoc' will \"\n \" automatically generate a new one (on each server reload).\")\[email protected]_context\ndef run(ctx, host, port, disable_reloader, passthrough_errors, threaded,\n extra_files, processes, ssl_cert, ssl_key):\n u\"\"\"Runs the Werkzeug development server\"\"\"\n\n # passthrough_errors overrides conflicting options\n if passthrough_errors:\n disable_reloader = True\n threaded = False\n processes = 1\n\n # Reloading\n use_reloader = not disable_reloader\n config_extra_files = tk.aslist(\n config.get(u\"ckan.devserver.watch_patterns\")\n )\n extra_files = list(extra_files) + [\n config[u\"__file__\"]\n ] + config_extra_files\n\n # Threads and processes\n threaded = threaded or tk.asbool(config.get(u\"ckan.devserver.threaded\"))\n processes = processes or tk.asint(\n config.get(u\"ckan.devserver.multiprocess\", 1)\n )\n if threaded and processes > 1:\n tk.error_shout(u\"Cannot have a multithreaded and multi process server\")\n raise click.Abort()\n\n # SSL\n cert_file = ssl_cert or config.get(u\"ckan.devserver.ssl_cert\")\n key_file = ssl_key or config.get(u\"ckan.devserver.ssl_key\")\n\n if cert_file and key_file:\n if cert_file == key_file == u\"adhoc\":\n ssl_context = u\"adhoc\"\n else:\n ssl_context = (ssl_cert, ssl_key)\n else:\n ssl_context = None\n\n host = host or config.get(u'ckan.devserver.host', DEFAULT_HOST)\n port = port or config.get(u'ckan.devserver.port', DEFAULT_PORT)\n try:\n port = int(port)\n except ValueError:\n tk.error_shout(u\"Server port must be an integer, not {}\".format(port))\n raise click.Abort()\n\n log.info(u\"Running CKAN on {scheme}://{host}:{port}\".format(\n scheme=u\"https\" if ssl_context else u\"http\", host=host, port=port))\n\n run_simple(\n host,\n port,\n ctx.obj.app,\n use_reloader=use_reloader,\n use_evalex=True,\n threaded=threaded,\n processes=processes,\n extra_files=extra_files,\n ssl_context=ssl_context,\n passthrough_errors=passthrough_errors,\n )\n", "path": "ckan/cli/server.py"}]} | 2,004 | 385 |
gh_patches_debug_19296 | rasdani/github-patches | git_diff | scikit-hep__pyhf-1673 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pass `Accept` header in `contrib.utils.download`
I'm copying a comment here that I made in the [HEPData Zulip chat](https://hepdata.zulipchat.com/#narrow/stream/226203-pyhf/topic/DOIs/near/213610306) on 16th October 2020.
Regarding the issue (HEPData/hepdata#162) to mint DOIs for all local resource files attached to a submission, if we do eventually get around to addressing it, we would probably redirect the DOI to a landing page for the resource file, rather than to the resource file itself (e.g. the pyhf tarball). This would follow the DataCite [Best Practices for DOI Landing Pages](https://support.datacite.org/docs/landing-pages), e.g. "[DOIs should resolve to a landing page, not directly to the content](https://support.datacite.org/docs/landing-pages#dois-should-resolve-to-a-landing-page-not-directly-to-the-content)", which I'm currently breaking for the two manually minted DOIs. In the issue (HEPdata/hepdata#162) I mentioned the possibility of using [DataCite Content Negotiation](https://support.datacite.org/docs/datacite-content-resolver) to redirect to the resource file itself, but the linked page now says "Custom content types are no longer supported since January 1st, 2020". I thought maybe content negotiation could be used to return the `.tar.gz` file directly, but the intended purpose is to retrieve DOI metadata in different formats, not to provide the content itself. In anticipation of possible future changes, I'd recommend that you use the URL directly rather than the DOI in pyhf download scripts and documentation (e.g. revert #1109).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pyhf/contrib/utils.py`
Content:
```
1 """Helper utilities for common tasks."""
2
3 from urllib.parse import urlparse
4 import tarfile
5 from io import BytesIO
6 import logging
7 from pyhf import exceptions
8
9 log = logging.getLogger(__name__)
10
11 __all__ = ["download"]
12
13
14 def __dir__():
15 return __all__
16
17
18 try:
19 import requests
20
21 def download(archive_url, output_directory, force=False, compress=False):
22 """
23 Download the patchset archive from the remote URL and extract it in a
24 directory at the path given.
25
26 Example:
27
28 >>> from pyhf.contrib.utils import download
29 >>> download("https://doi.org/10.17182/hepdata.90607.v3/r3", "1Lbb-likelihoods")
30 >>> import os
31 >>> sorted(os.listdir("1Lbb-likelihoods"))
32 ['BkgOnly.json', 'README.md', 'patchset.json']
33 >>> download("https://doi.org/10.17182/hepdata.90607.v3/r3", "1Lbb-likelihoods.tar.gz", compress=True)
34 >>> import glob
35 >>> glob.glob("1Lbb-likelihoods.tar.gz")
36 ['1Lbb-likelihoods.tar.gz']
37
38 Args:
39 archive_url (:obj:`str`): The URL of the :class:`~pyhf.patchset.PatchSet` archive to download.
40 output_directory (:obj:`str`): Name of the directory to unpack the archive into.
41 force (:obj:`bool`): Force download from non-approved host. Default is ``False``.
42 compress (:obj:`bool`): Keep the archive in a compressed ``tar.gz`` form. Default is ``False``.
43
44 Raises:
45 :class:`~pyhf.exceptions.InvalidArchiveHost`: if the provided archive host name is not known to be valid
46 """
47 if not force:
48 valid_hosts = ["www.hepdata.net", "doi.org"]
49 netloc = urlparse(archive_url).netloc
50 if netloc not in valid_hosts:
51 raise exceptions.InvalidArchiveHost(
52 f"{netloc} is not an approved archive host: {', '.join(str(host) for host in valid_hosts)}\n"
53 + "To download an archive from this host use the --force option."
54 )
55
56 with requests.get(archive_url) as response:
57 if compress:
58 with open(output_directory, "wb") as archive:
59 archive.write(response.content)
60 else:
61 with tarfile.open(
62 mode="r|gz", fileobj=BytesIO(response.content)
63 ) as archive:
64 archive.extractall(output_directory)
65
66
67 except ModuleNotFoundError:
68 log.error(
69 "\nInstallation of the contrib extra is required to use pyhf.contrib.utils.download"
70 + "\nPlease install with: python -m pip install pyhf[contrib]\n",
71 exc_info=True,
72 )
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pyhf/contrib/utils.py b/src/pyhf/contrib/utils.py
--- a/src/pyhf/contrib/utils.py
+++ b/src/pyhf/contrib/utils.py
@@ -53,7 +53,18 @@
+ "To download an archive from this host use the --force option."
)
- with requests.get(archive_url) as response:
+ # c.f. https://github.com/scikit-hep/pyhf/issues/1491
+ # > Use content negotiation at the landing page for the resource that
+ # > the DOI resolves to. DataCite content negotiation is forwarding all
+ # > requests with unknown content types to the URL registered in the
+ # > handle system.
+ # c.f. https://blog.datacite.org/changes-to-doi-content-negotiation/
+ # The HEPData landing page for the resource file can check if the Accept
+ # request HTTP header matches the content type of the resource file and
+ # return the content directly if so.
+ with requests.get(
+ archive_url, headers={"Accept": "application/x-tar"}
+ ) as response:
if compress:
with open(output_directory, "wb") as archive:
archive.write(response.content)
| {"golden_diff": "diff --git a/src/pyhf/contrib/utils.py b/src/pyhf/contrib/utils.py\n--- a/src/pyhf/contrib/utils.py\n+++ b/src/pyhf/contrib/utils.py\n@@ -53,7 +53,18 @@\n + \"To download an archive from this host use the --force option.\"\n )\n \n- with requests.get(archive_url) as response:\n+ # c.f. https://github.com/scikit-hep/pyhf/issues/1491\n+ # > Use content negotiation at the landing page for the resource that\n+ # > the DOI resolves to. DataCite content negotiation is forwarding all\n+ # > requests with unknown content types to the URL registered in the\n+ # > handle system.\n+ # c.f. https://blog.datacite.org/changes-to-doi-content-negotiation/\n+ # The HEPData landing page for the resource file can check if the Accept\n+ # request HTTP header matches the content type of the resource file and\n+ # return the content directly if so.\n+ with requests.get(\n+ archive_url, headers={\"Accept\": \"application/x-tar\"}\n+ ) as response:\n if compress:\n with open(output_directory, \"wb\") as archive:\n archive.write(response.content)\n", "issue": "Pass `Accept` header in `contrib.utils.download`\nI'm copying a comment here that I made in the [HEPData Zulip chat](https://hepdata.zulipchat.com/#narrow/stream/226203-pyhf/topic/DOIs/near/213610306) on 16th October 2020.\r\n\r\nRegarding the issue (HEPData/hepdata#162) to mint DOIs for all local resource files attached to a submission, if we do eventually get around to addressing it, we would probably redirect the DOI to a landing page for the resource file, rather than to the resource file itself (e.g. the pyhf tarball). This would follow the DataCite [Best Practices for DOI Landing Pages](https://support.datacite.org/docs/landing-pages), e.g. \"[DOIs should resolve to a landing page, not directly to the content](https://support.datacite.org/docs/landing-pages#dois-should-resolve-to-a-landing-page-not-directly-to-the-content)\", which I'm currently breaking for the two manually minted DOIs. In the issue (HEPdata/hepdata#162) I mentioned the possibility of using [DataCite Content Negotiation](https://support.datacite.org/docs/datacite-content-resolver) to redirect to the resource file itself, but the linked page now says \"Custom content types are no longer supported since January 1st, 2020\". I thought maybe content negotiation could be used to return the `.tar.gz` file directly, but the intended purpose is to retrieve DOI metadata in different formats, not to provide the content itself. In anticipation of possible future changes, I'd recommend that you use the URL directly rather than the DOI in pyhf download scripts and documentation (e.g. 
revert #1109).\n", "before_files": [{"content": "\"\"\"Helper utilities for common tasks.\"\"\"\n\nfrom urllib.parse import urlparse\nimport tarfile\nfrom io import BytesIO\nimport logging\nfrom pyhf import exceptions\n\nlog = logging.getLogger(__name__)\n\n__all__ = [\"download\"]\n\n\ndef __dir__():\n return __all__\n\n\ntry:\n import requests\n\n def download(archive_url, output_directory, force=False, compress=False):\n \"\"\"\n Download the patchset archive from the remote URL and extract it in a\n directory at the path given.\n\n Example:\n\n >>> from pyhf.contrib.utils import download\n >>> download(\"https://doi.org/10.17182/hepdata.90607.v3/r3\", \"1Lbb-likelihoods\")\n >>> import os\n >>> sorted(os.listdir(\"1Lbb-likelihoods\"))\n ['BkgOnly.json', 'README.md', 'patchset.json']\n >>> download(\"https://doi.org/10.17182/hepdata.90607.v3/r3\", \"1Lbb-likelihoods.tar.gz\", compress=True)\n >>> import glob\n >>> glob.glob(\"1Lbb-likelihoods.tar.gz\")\n ['1Lbb-likelihoods.tar.gz']\n\n Args:\n archive_url (:obj:`str`): The URL of the :class:`~pyhf.patchset.PatchSet` archive to download.\n output_directory (:obj:`str`): Name of the directory to unpack the archive into.\n force (:obj:`bool`): Force download from non-approved host. Default is ``False``.\n compress (:obj:`bool`): Keep the archive in a compressed ``tar.gz`` form. Default is ``False``.\n\n Raises:\n :class:`~pyhf.exceptions.InvalidArchiveHost`: if the provided archive host name is not known to be valid\n \"\"\"\n if not force:\n valid_hosts = [\"www.hepdata.net\", \"doi.org\"]\n netloc = urlparse(archive_url).netloc\n if netloc not in valid_hosts:\n raise exceptions.InvalidArchiveHost(\n f\"{netloc} is not an approved archive host: {', '.join(str(host) for host in valid_hosts)}\\n\"\n + \"To download an archive from this host use the --force option.\"\n )\n\n with requests.get(archive_url) as response:\n if compress:\n with open(output_directory, \"wb\") as archive:\n archive.write(response.content)\n else:\n with tarfile.open(\n mode=\"r|gz\", fileobj=BytesIO(response.content)\n ) as archive:\n archive.extractall(output_directory)\n\n\nexcept ModuleNotFoundError:\n log.error(\n \"\\nInstallation of the contrib extra is required to use pyhf.contrib.utils.download\"\n + \"\\nPlease install with: python -m pip install pyhf[contrib]\\n\",\n exc_info=True,\n )\n", "path": "src/pyhf/contrib/utils.py"}], "after_files": [{"content": "\"\"\"Helper utilities for common tasks.\"\"\"\n\nfrom urllib.parse import urlparse\nimport tarfile\nfrom io import BytesIO\nimport logging\nfrom pyhf import exceptions\n\nlog = logging.getLogger(__name__)\n\n__all__ = [\"download\"]\n\n\ndef __dir__():\n return __all__\n\n\ntry:\n import requests\n\n def download(archive_url, output_directory, force=False, compress=False):\n \"\"\"\n Download the patchset archive from the remote URL and extract it in a\n directory at the path given.\n\n Example:\n\n >>> from pyhf.contrib.utils import download\n >>> download(\"https://doi.org/10.17182/hepdata.90607.v3/r3\", \"1Lbb-likelihoods\")\n >>> import os\n >>> sorted(os.listdir(\"1Lbb-likelihoods\"))\n ['BkgOnly.json', 'README.md', 'patchset.json']\n >>> download(\"https://doi.org/10.17182/hepdata.90607.v3/r3\", \"1Lbb-likelihoods.tar.gz\", compress=True)\n >>> import glob\n >>> glob.glob(\"1Lbb-likelihoods.tar.gz\")\n ['1Lbb-likelihoods.tar.gz']\n\n Args:\n archive_url (:obj:`str`): The URL of the :class:`~pyhf.patchset.PatchSet` archive to download.\n output_directory (:obj:`str`): Name of the directory to 
unpack the archive into.\n force (:obj:`bool`): Force download from non-approved host. Default is ``False``.\n compress (:obj:`bool`): Keep the archive in a compressed ``tar.gz`` form. Default is ``False``.\n\n Raises:\n :class:`~pyhf.exceptions.InvalidArchiveHost`: if the provided archive host name is not known to be valid\n \"\"\"\n if not force:\n valid_hosts = [\"www.hepdata.net\", \"doi.org\"]\n netloc = urlparse(archive_url).netloc\n if netloc not in valid_hosts:\n raise exceptions.InvalidArchiveHost(\n f\"{netloc} is not an approved archive host: {', '.join(str(host) for host in valid_hosts)}\\n\"\n + \"To download an archive from this host use the --force option.\"\n )\n\n # c.f. https://github.com/scikit-hep/pyhf/issues/1491\n # > Use content negotiation at the landing page for the resource that\n # > the DOI resolves to. DataCite content negotiation is forwarding all\n # > requests with unknown content types to the URL registered in the\n # > handle system.\n # c.f. https://blog.datacite.org/changes-to-doi-content-negotiation/\n # The HEPData landing page for the resource file can check if the Accept\n # request HTTP header matches the content type of the resource file and\n # return the content directly if so.\n with requests.get(\n archive_url, headers={\"Accept\": \"application/x-tar\"}\n ) as response:\n if compress:\n with open(output_directory, \"wb\") as archive:\n archive.write(response.content)\n else:\n with tarfile.open(\n mode=\"r|gz\", fileobj=BytesIO(response.content)\n ) as archive:\n archive.extractall(output_directory)\n\n\nexcept ModuleNotFoundError:\n log.error(\n \"\\nInstallation of the contrib extra is required to use pyhf.contrib.utils.download\"\n + \"\\nPlease install with: python -m pip install pyhf[contrib]\\n\",\n exc_info=True,\n )\n", "path": "src/pyhf/contrib/utils.py"}]} | 1,412 | 280 |
gh_patches_debug_50213 | rasdani/github-patches | git_diff | pex-tool__pex-1590 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.64
On the docket:
+ [x] Pex does not support mac universal2 wheels #1587
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.63"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.63"
+__version__ = "2.1.64"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.63\"\n+__version__ = \"2.1.64\"\n", "issue": "Release 2.1.64\nOn the docket:\r\n+ [x] Pex does not support mac universal2 wheels #1587 \r\n\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.63\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.64\"\n", "path": "pex/version.py"}]} | 342 | 96 |
gh_patches_debug_8532 | rasdani/github-patches | git_diff | akvo__akvo-rsr-2994 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Custom reports removed from 'Reports' general view
Remove custom reports from 'Reports' general view, i.e. Plan Finland or EUTF reports are not visible for other partners.

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/report.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7 from django.db.models import Q
8 from rest_framework.decorators import api_view
9 from rest_framework.response import Response
10
11 from akvo.rsr.models import Report, ReportFormat
12 from ..serializers import ReportSerializer, ReportFormatSerializer
13
14
15 @api_view(['GET'])
16 def reports(request):
17 """
18 A view for displaying all report information, sorted by title.
19 """
20
21 user = request.user
22 is_admin = user.is_active and (user.is_superuser or user.is_admin)
23 reports = Report.objects.all()
24 if not is_admin:
25 # Show only those reports that the user is allowed to see
26 reports = reports.filter(
27 Q(organisations=None) | Q(organisations__in=user.approved_organisations())
28 ).distinct()
29
30 # FIXME: Use a viewset instead?
31 return Response({
32 'count': reports.count(),
33 'results': [ReportSerializer(r).data for r in reports.order_by('title')],
34 })
35
36
37 @api_view(['GET'])
38 def report_formats(request):
39 """
40 A view for displaying all report format information.
41 """
42 return Response({
43 'count': ReportFormat.objects.all().count(),
44 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],
45 })
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rest/views/report.py b/akvo/rest/views/report.py
--- a/akvo/rest/views/report.py
+++ b/akvo/rest/views/report.py
@@ -23,8 +23,9 @@
reports = Report.objects.all()
if not is_admin:
# Show only those reports that the user is allowed to see
+ approved_orgs = user.approved_organisations() if not user.is_anonymous() else []
reports = reports.filter(
- Q(organisations=None) | Q(organisations__in=user.approved_organisations())
+ Q(organisations=None) | Q(organisations__in=approved_orgs)
).distinct()
# FIXME: Use a viewset instead?
| {"golden_diff": "diff --git a/akvo/rest/views/report.py b/akvo/rest/views/report.py\n--- a/akvo/rest/views/report.py\n+++ b/akvo/rest/views/report.py\n@@ -23,8 +23,9 @@\n reports = Report.objects.all()\n if not is_admin:\n # Show only those reports that the user is allowed to see\n+ approved_orgs = user.approved_organisations() if not user.is_anonymous() else []\n reports = reports.filter(\n- Q(organisations=None) | Q(organisations__in=user.approved_organisations())\n+ Q(organisations=None) | Q(organisations__in=approved_orgs)\n ).distinct()\n \n # FIXME: Use a viewset instead?\n", "issue": "Custom reports removed from 'Reports' general view\nRemove custom reports from 'Reports' general view, i.e. Plan Finland or EUTF reports are not visible for other partners. \r\n\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom django.db.models import Q\nfrom rest_framework.decorators import api_view\nfrom rest_framework.response import Response\n\nfrom akvo.rsr.models import Report, ReportFormat\nfrom ..serializers import ReportSerializer, ReportFormatSerializer\n\n\n@api_view(['GET'])\ndef reports(request):\n \"\"\"\n A view for displaying all report information, sorted by title.\n \"\"\"\n\n user = request.user\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n reports = Report.objects.all()\n if not is_admin:\n # Show only those reports that the user is allowed to see\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=user.approved_organisations())\n ).distinct()\n\n # FIXME: Use a viewset instead?\n return Response({\n 'count': reports.count(),\n 'results': [ReportSerializer(r).data for r in reports.order_by('title')],\n })\n\n\n@api_view(['GET'])\ndef report_formats(request):\n \"\"\"\n A view for displaying all report format information.\n \"\"\"\n return Response({\n 'count': ReportFormat.objects.all().count(),\n 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],\n })\n", "path": "akvo/rest/views/report.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom django.db.models import Q\nfrom rest_framework.decorators import api_view\nfrom rest_framework.response import Response\n\nfrom akvo.rsr.models import Report, ReportFormat\nfrom ..serializers import ReportSerializer, ReportFormatSerializer\n\n\n@api_view(['GET'])\ndef reports(request):\n \"\"\"\n A view for displaying all report information, sorted by title.\n \"\"\"\n\n user = request.user\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n reports = Report.objects.all()\n if not is_admin:\n # Show only those reports that the user is allowed to see\n approved_orgs = user.approved_organisations() if not user.is_anonymous() else []\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=approved_orgs)\n ).distinct()\n\n # FIXME: Use a viewset instead?\n return Response({\n 'count': reports.count(),\n 'results': [ReportSerializer(r).data for r in reports.order_by('title')],\n 
})\n\n\n@api_view(['GET'])\ndef report_formats(request):\n \"\"\"\n A view for displaying all report format information.\n \"\"\"\n return Response({\n 'count': ReportFormat.objects.all().count(),\n 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],\n })\n", "path": "akvo/rest/views/report.py"}]} | 781 | 166 |
gh_patches_debug_11850 | rasdani/github-patches | git_diff | cobbler__cobbler-3397 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Backport] GHSA-mcg6-h362-cmq5
### Original feature issue
- Advisory: https://github.com/advisories/GHSA-mcg6-h362-cmq5
### Target release
- [ ] release33
- [x] release32
- [ ] release30
### Reason
Stabilization of Cobbler 3.2.x in the Fedora Ecosystem
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cobbler/modules/authentication/pam.py`
Content:
```
1 """
2 Authentication module that uses /etc/cobbler/auth.conf
3 Choice of authentication module is in /etc/cobbler/modules.conf
4
5 Copyright 2007-2009, Red Hat, Inc and Others
6 Michael DeHaan <michael.dehaan AT gmail>
7
8 This program is free software; you can redistribute it and/or modify
9 it under the terms of the GNU General Public License as published by
10 the Free Software Foundation; either version 2 of the License, or
11 (at your option) any later version.
12
13 This program is distributed in the hope that it will be useful,
14 but WITHOUT ANY WARRANTY; without even the implied warranty of
15 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 GNU General Public License for more details.
17
18 You should have received a copy of the GNU General Public License
19 along with this program; if not, write to the Free Software
20 Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
21 02110-1301 USA
22
23 PAM python code based on the pam_python code created by Chris AtLee:
24 http://atlee.ca/software/pam/
25
26 #-----------------------------------------------
27 pam_python (c) 2007 Chris AtLee <[email protected]>
28 Licensed under the MIT license:
29 http://www.opensource.org/licenses/mit-license.php
30
31 PAM module for python
32
33 Provides an authenticate function that will allow the caller to authenticate
34 a user against the Pluggable Authentication Modules (PAM) on the system.
35
36 Implemented using ctypes, so no compilation is necessary.
37 """
38
39 from ctypes import CDLL, POINTER, Structure, CFUNCTYPE, cast, pointer, sizeof
40 from ctypes import c_void_p, c_uint, c_char_p, c_char, c_int
41 from ctypes.util import find_library
42
43 LIBPAM = CDLL(find_library("pam"))
44 LIBC = CDLL(find_library("c"))
45
46 CALLOC = LIBC.calloc
47 CALLOC.restype = c_void_p
48 CALLOC.argtypes = [c_uint, c_uint]
49
50 STRDUP = LIBC.strdup
51 STRDUP.argstypes = [c_char_p]
52 STRDUP.restype = POINTER(c_char) # NOT c_char_p !!!!
53
54 # Various constants
55 PAM_PROMPT_ECHO_OFF = 1
56 PAM_PROMPT_ECHO_ON = 2
57 PAM_ERROR_MSG = 3
58 PAM_TEXT_INFO = 4
59
60
61 def register() -> str:
62 """
63 The mandatory Cobbler module registration hook.
64 """
65 return "authn"
66
67
68 class PamHandle(Structure):
69 """
70 wrapper class for pam_handle_t
71 """
72 _fields_ = [("handle", c_void_p)]
73
74 def __init__(self):
75 Structure.__init__(self)
76 self.handle = 0
77
78
79 class PamMessage(Structure):
80 """
81 wrapper class for pam_message structure
82 """
83 _fields_ = [("msg_style", c_int), ("msg", c_char_p)]
84
85 def __repr__(self):
86 return "<PamMessage %i '%s'>" % (self.msg_style, self.msg)
87
88
89 class PamResponse(Structure):
90 """
91 wrapper class for pam_response structure
92 """
93 _fields_ = [("resp", c_char_p), ("resp_retcode", c_int)]
94
95 def __repr__(self):
96 return "<PamResponse %i '%s'>" % (self.resp_retcode, self.resp)
97
98
99 CONV_FUNC = CFUNCTYPE(c_int, c_int, POINTER(POINTER(PamMessage)), POINTER(POINTER(PamResponse)), c_void_p)
100
101
102 class PamConv(Structure):
103 """
104 wrapper class for pam_conv structure
105 """
106 _fields_ = [("conv", CONV_FUNC), ("appdata_ptr", c_void_p)]
107
108
109 PAM_START = LIBPAM.pam_start
110 PAM_START.restype = c_int
111 PAM_START.argtypes = [c_char_p, c_char_p, POINTER(PamConv), POINTER(PamHandle)]
112
113 PAM_AUTHENTICATE = LIBPAM.pam_authenticate
114 PAM_AUTHENTICATE.restype = c_int
115 PAM_AUTHENTICATE.argtypes = [PamHandle, c_int]
116
117
118 def authenticate(api_handle, username: str, password: str) -> bool:
119 """
120 :param api_handle: Used for resolving the the pam service name and getting the Logger.
121 :param username:The username to log in with.
122 :param password: The password to log in with.
123 :returns: True if the given username and password authenticate for the given service. Otherwise False
124 """
125
126 @CONV_FUNC
127 def my_conv(n_messages, messages, p_response, app_data):
128 """
129 Simple conversation function that responds to any prompt where the echo is off with the supplied password
130 """
131 # Create an array of n_messages response objects
132 addr = CALLOC(n_messages, sizeof(PamResponse))
133 p_response[0] = cast(addr, POINTER(PamResponse))
134 for i in range(n_messages):
135 if messages[i].contents.msg_style == PAM_PROMPT_ECHO_OFF:
136 pw_copy = STRDUP(password.encode())
137 p_response.contents[i].resp = cast(pw_copy, c_char_p)
138 p_response.contents[i].resp_retcode = 0
139 return 0
140
141 try:
142 service = api_handle.settings().authn_pam_service
143 except:
144 service = 'login'
145
146 api_handle.logger.debug("authn_pam: PAM service is %s" % service)
147
148 handle = PamHandle()
149 conv = PamConv(my_conv, 0)
150 retval = PAM_START(service.encode(), username.encode(), pointer(conv), pointer(handle))
151
152 if retval != 0:
153 # TODO: This is not an authentication error, something has gone wrong starting up PAM
154 api_handle.logger.error("authn_pam: error initializing PAM library")
155 return False
156
157 retval = PAM_AUTHENTICATE(handle, 0)
158 return retval == 0
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cobbler/modules/authentication/pam.py b/cobbler/modules/authentication/pam.py
--- a/cobbler/modules/authentication/pam.py
+++ b/cobbler/modules/authentication/pam.py
@@ -114,6 +114,10 @@
PAM_AUTHENTICATE.restype = c_int
PAM_AUTHENTICATE.argtypes = [PamHandle, c_int]
+PAM_ACCT_MGMT = LIBPAM.pam_acct_mgmt
+PAM_ACCT_MGMT.restype = c_int
+PAM_ACCT_MGMT.argtypes = [PamHandle, c_int]
+
def authenticate(api_handle, username: str, password: str) -> bool:
"""
@@ -155,4 +159,8 @@
return False
retval = PAM_AUTHENTICATE(handle, 0)
+
+ if retval == 0:
+ retval = PAM_ACCT_MGMT(handle, 0)
+
return retval == 0
| {"golden_diff": "diff --git a/cobbler/modules/authentication/pam.py b/cobbler/modules/authentication/pam.py\n--- a/cobbler/modules/authentication/pam.py\n+++ b/cobbler/modules/authentication/pam.py\n@@ -114,6 +114,10 @@\n PAM_AUTHENTICATE.restype = c_int\n PAM_AUTHENTICATE.argtypes = [PamHandle, c_int]\n \n+PAM_ACCT_MGMT = LIBPAM.pam_acct_mgmt\n+PAM_ACCT_MGMT.restype = c_int\n+PAM_ACCT_MGMT.argtypes = [PamHandle, c_int]\n+\n \n def authenticate(api_handle, username: str, password: str) -> bool:\n \"\"\"\n@@ -155,4 +159,8 @@\n return False\n \n retval = PAM_AUTHENTICATE(handle, 0)\n+\n+ if retval == 0:\n+ retval = PAM_ACCT_MGMT(handle, 0)\n+\n return retval == 0\n", "issue": "[Backport] GHSA-mcg6-h362-cmq5\n### Original feature issue\r\n\r\n- Advisory: https://github.com/advisories/GHSA-mcg6-h362-cmq5\r\n\r\n### Target release\r\n\r\n- [ ] release33\r\n- [x] release32\r\n- [ ] release30\r\n\r\n### Reason\r\n\r\nStabilization of Cobbler 3.2.x in the Fedora Ecosystem\r\n\n", "before_files": [{"content": "\"\"\"\nAuthentication module that uses /etc/cobbler/auth.conf\nChoice of authentication module is in /etc/cobbler/modules.conf\n\nCopyright 2007-2009, Red Hat, Inc and Others\nMichael DeHaan <michael.dehaan AT gmail>\n\nThis program is free software; you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation; either version 2 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program; if not, write to the Free Software\nFoundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA\n02110-1301 USA\n\nPAM python code based on the pam_python code created by Chris AtLee:\nhttp://atlee.ca/software/pam/\n\n#-----------------------------------------------\npam_python (c) 2007 Chris AtLee <[email protected]>\nLicensed under the MIT license:\nhttp://www.opensource.org/licenses/mit-license.php\n\nPAM module for python\n\nProvides an authenticate function that will allow the caller to authenticate\na user against the Pluggable Authentication Modules (PAM) on the system.\n\nImplemented using ctypes, so no compilation is necessary.\n\"\"\"\n\nfrom ctypes import CDLL, POINTER, Structure, CFUNCTYPE, cast, pointer, sizeof\nfrom ctypes import c_void_p, c_uint, c_char_p, c_char, c_int\nfrom ctypes.util import find_library\n\nLIBPAM = CDLL(find_library(\"pam\"))\nLIBC = CDLL(find_library(\"c\"))\n\nCALLOC = LIBC.calloc\nCALLOC.restype = c_void_p\nCALLOC.argtypes = [c_uint, c_uint]\n\nSTRDUP = LIBC.strdup\nSTRDUP.argstypes = [c_char_p]\nSTRDUP.restype = POINTER(c_char) # NOT c_char_p !!!!\n\n# Various constants\nPAM_PROMPT_ECHO_OFF = 1\nPAM_PROMPT_ECHO_ON = 2\nPAM_ERROR_MSG = 3\nPAM_TEXT_INFO = 4\n\n\ndef register() -> str:\n \"\"\"\n The mandatory Cobbler module registration hook.\n \"\"\"\n return \"authn\"\n\n\nclass PamHandle(Structure):\n \"\"\"\n wrapper class for pam_handle_t\n \"\"\"\n _fields_ = [(\"handle\", c_void_p)]\n\n def __init__(self):\n Structure.__init__(self)\n self.handle = 0\n\n\nclass PamMessage(Structure):\n \"\"\"\n wrapper class for pam_message structure\n \"\"\"\n _fields_ = [(\"msg_style\", c_int), (\"msg\", c_char_p)]\n\n def __repr__(self):\n return \"<PamMessage %i '%s'>\" % 
(self.msg_style, self.msg)\n\n\nclass PamResponse(Structure):\n \"\"\"\n wrapper class for pam_response structure\n \"\"\"\n _fields_ = [(\"resp\", c_char_p), (\"resp_retcode\", c_int)]\n\n def __repr__(self):\n return \"<PamResponse %i '%s'>\" % (self.resp_retcode, self.resp)\n\n\nCONV_FUNC = CFUNCTYPE(c_int, c_int, POINTER(POINTER(PamMessage)), POINTER(POINTER(PamResponse)), c_void_p)\n\n\nclass PamConv(Structure):\n \"\"\"\n wrapper class for pam_conv structure\n \"\"\"\n _fields_ = [(\"conv\", CONV_FUNC), (\"appdata_ptr\", c_void_p)]\n\n\nPAM_START = LIBPAM.pam_start\nPAM_START.restype = c_int\nPAM_START.argtypes = [c_char_p, c_char_p, POINTER(PamConv), POINTER(PamHandle)]\n\nPAM_AUTHENTICATE = LIBPAM.pam_authenticate\nPAM_AUTHENTICATE.restype = c_int\nPAM_AUTHENTICATE.argtypes = [PamHandle, c_int]\n\n\ndef authenticate(api_handle, username: str, password: str) -> bool:\n \"\"\"\n :param api_handle: Used for resolving the the pam service name and getting the Logger.\n :param username:The username to log in with.\n :param password: The password to log in with.\n :returns: True if the given username and password authenticate for the given service. Otherwise False\n \"\"\"\n\n @CONV_FUNC\n def my_conv(n_messages, messages, p_response, app_data):\n \"\"\"\n Simple conversation function that responds to any prompt where the echo is off with the supplied password\n \"\"\"\n # Create an array of n_messages response objects\n addr = CALLOC(n_messages, sizeof(PamResponse))\n p_response[0] = cast(addr, POINTER(PamResponse))\n for i in range(n_messages):\n if messages[i].contents.msg_style == PAM_PROMPT_ECHO_OFF:\n pw_copy = STRDUP(password.encode())\n p_response.contents[i].resp = cast(pw_copy, c_char_p)\n p_response.contents[i].resp_retcode = 0\n return 0\n\n try:\n service = api_handle.settings().authn_pam_service\n except:\n service = 'login'\n\n api_handle.logger.debug(\"authn_pam: PAM service is %s\" % service)\n\n handle = PamHandle()\n conv = PamConv(my_conv, 0)\n retval = PAM_START(service.encode(), username.encode(), pointer(conv), pointer(handle))\n\n if retval != 0:\n # TODO: This is not an authentication error, something has gone wrong starting up PAM\n api_handle.logger.error(\"authn_pam: error initializing PAM library\")\n return False\n\n retval = PAM_AUTHENTICATE(handle, 0)\n return retval == 0\n", "path": "cobbler/modules/authentication/pam.py"}], "after_files": [{"content": "\"\"\"\nAuthentication module that uses /etc/cobbler/auth.conf\nChoice of authentication module is in /etc/cobbler/modules.conf\n\nCopyright 2007-2009, Red Hat, Inc and Others\nMichael DeHaan <michael.dehaan AT gmail>\n\nThis program is free software; you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation; either version 2 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program; if not, write to the Free Software\nFoundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA\n02110-1301 USA\n\nPAM python code based on the pam_python code created by Chris AtLee:\nhttp://atlee.ca/software/pam/\n\n#-----------------------------------------------\npam_python (c) 2007 Chris AtLee <[email protected]>\nLicensed under the MIT license:\nhttp://www.opensource.org/licenses/mit-license.php\n\nPAM module for python\n\nProvides an authenticate function that will allow the caller to authenticate\na user against the Pluggable Authentication Modules (PAM) on the system.\n\nImplemented using ctypes, so no compilation is necessary.\n\"\"\"\n\nfrom ctypes import CDLL, POINTER, Structure, CFUNCTYPE, cast, pointer, sizeof\nfrom ctypes import c_void_p, c_uint, c_char_p, c_char, c_int\nfrom ctypes.util import find_library\n\nLIBPAM = CDLL(find_library(\"pam\"))\nLIBC = CDLL(find_library(\"c\"))\n\nCALLOC = LIBC.calloc\nCALLOC.restype = c_void_p\nCALLOC.argtypes = [c_uint, c_uint]\n\nSTRDUP = LIBC.strdup\nSTRDUP.argstypes = [c_char_p]\nSTRDUP.restype = POINTER(c_char) # NOT c_char_p !!!!\n\n# Various constants\nPAM_PROMPT_ECHO_OFF = 1\nPAM_PROMPT_ECHO_ON = 2\nPAM_ERROR_MSG = 3\nPAM_TEXT_INFO = 4\n\n\ndef register() -> str:\n \"\"\"\n The mandatory Cobbler module registration hook.\n \"\"\"\n return \"authn\"\n\n\nclass PamHandle(Structure):\n \"\"\"\n wrapper class for pam_handle_t\n \"\"\"\n _fields_ = [(\"handle\", c_void_p)]\n\n def __init__(self):\n Structure.__init__(self)\n self.handle = 0\n\n\nclass PamMessage(Structure):\n \"\"\"\n wrapper class for pam_message structure\n \"\"\"\n _fields_ = [(\"msg_style\", c_int), (\"msg\", c_char_p)]\n\n def __repr__(self):\n return \"<PamMessage %i '%s'>\" % (self.msg_style, self.msg)\n\n\nclass PamResponse(Structure):\n \"\"\"\n wrapper class for pam_response structure\n \"\"\"\n _fields_ = [(\"resp\", c_char_p), (\"resp_retcode\", c_int)]\n\n def __repr__(self):\n return \"<PamResponse %i '%s'>\" % (self.resp_retcode, self.resp)\n\n\nCONV_FUNC = CFUNCTYPE(c_int, c_int, POINTER(POINTER(PamMessage)), POINTER(POINTER(PamResponse)), c_void_p)\n\n\nclass PamConv(Structure):\n \"\"\"\n wrapper class for pam_conv structure\n \"\"\"\n _fields_ = [(\"conv\", CONV_FUNC), (\"appdata_ptr\", c_void_p)]\n\n\nPAM_START = LIBPAM.pam_start\nPAM_START.restype = c_int\nPAM_START.argtypes = [c_char_p, c_char_p, POINTER(PamConv), POINTER(PamHandle)]\n\nPAM_AUTHENTICATE = LIBPAM.pam_authenticate\nPAM_AUTHENTICATE.restype = c_int\nPAM_AUTHENTICATE.argtypes = [PamHandle, c_int]\n\nPAM_ACCT_MGMT = LIBPAM.pam_acct_mgmt\nPAM_ACCT_MGMT.restype = c_int\nPAM_ACCT_MGMT.argtypes = [PamHandle, c_int]\n\n\ndef authenticate(api_handle, username: str, password: str) -> bool:\n \"\"\"\n :param api_handle: Used for resolving the the pam service name and getting the Logger.\n :param username:The username to log in with.\n :param password: The password to log in with.\n :returns: True if the given username and password authenticate for the given service. 
Otherwise False\n \"\"\"\n\n @CONV_FUNC\n def my_conv(n_messages, messages, p_response, app_data):\n \"\"\"\n Simple conversation function that responds to any prompt where the echo is off with the supplied password\n \"\"\"\n # Create an array of n_messages response objects\n addr = CALLOC(n_messages, sizeof(PamResponse))\n p_response[0] = cast(addr, POINTER(PamResponse))\n for i in range(n_messages):\n if messages[i].contents.msg_style == PAM_PROMPT_ECHO_OFF:\n pw_copy = STRDUP(password.encode())\n p_response.contents[i].resp = cast(pw_copy, c_char_p)\n p_response.contents[i].resp_retcode = 0\n return 0\n\n try:\n service = api_handle.settings().authn_pam_service\n except:\n service = 'login'\n\n api_handle.logger.debug(\"authn_pam: PAM service is %s\" % service)\n\n handle = PamHandle()\n conv = PamConv(my_conv, 0)\n retval = PAM_START(service.encode(), username.encode(), pointer(conv), pointer(handle))\n\n if retval != 0:\n # TODO: This is not an authentication error, something has gone wrong starting up PAM\n api_handle.logger.error(\"authn_pam: error initializing PAM library\")\n return False\n\n retval = PAM_AUTHENTICATE(handle, 0)\n\n if retval == 0:\n retval = PAM_ACCT_MGMT(handle, 0)\n\n return retval == 0\n", "path": "cobbler/modules/authentication/pam.py"}]} | 2,016 | 218 |
gh_patches_debug_44788 | rasdani/github-patches | git_diff | aws__aws-cli-2537 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Reading/writing EMR key_pair_file configuration options behaves oddly
Version:
```
$ aws --version
aws-cli/1.11.75 Python/2.7.10 Darwin/15.6.0 botocore/1.5.38
```
[It's suggested that one can set a default key_pair_file argument here](https://github.com/aws/aws-cli/blob/master/awscli/customizations/emr/ssh.py#L25) by running `aws configure set emr.key_pair_file <value>`
By that token, I would expect `aws configure get emr.key_pair_file` to retrieve this item and to exit with an exit code of 0.
```
$ aws configure set emr.key_pair_file /tmp/foo
$ cat config
[default]
emr =
key_pair_file = /tmp/foo
$ aws configure get emr.key_pair_file
$ echo $?
1
```
As you can see, setting this and then trying to retrieve it exits with a non-zero exit code, which makes it a pain to check whether this config item is set from shell scripts before running other EMR-based commands (such as create-cluster).
As an aside, trying to get the top level `emr` config item fails too;
```
$ aws configure get emr
expected a character buffer object
```
Additionally this item doesn't show up when `aws configure list` is run either;
```
$ aws configure list
Name Value Type Location
---- ----- ---- --------
profile <not set> None None
access_key REDACTED shared-credentials-file
secret_key REDACTED shared-credentials-file
region <not set> None None
```
--- END ISSUE ---
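The lookup that fails here has to walk the profile section by the dotted name. For illustration, a minimal, self-contained sketch of that idea (this is not the aws-cli implementation; the sample data simply mirrors the config file shown above):

```python
# Minimal sketch of resolving a dotted config name like "emr.key_pair_file"
# against a parsed profile section. Sample data mirrors the issue's config.
profiles = {"default": {"emr": {"key_pair_file": "/tmp/foo"}}}

def get_dotted(profiles, profile, varname):
    value = profiles.get(profile, {})
    for part in varname.split("."):
        if not isinstance(value, dict):
            return None
        value = value.get(part)
    return value

print(get_dotted(profiles, "default", "emr.key_pair_file"))  # /tmp/foo
print(get_dotted(profiles, "default", "emr"))                # {'key_pair_file': '/tmp/foo'}
```

A string result can be printed with exit code 0, while a dict means the name points at a whole section (like bare `emr`), which is better reported as an error than printed.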
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awscli/customizations/configure/get.py`
Content:
```
1 # Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License"). You
4 # may not use this file except in compliance with the License. A copy of
5 # the License is located at
6 #
7 # http://aws.amazon.com/apache2.0/
8 #
9 # or in the "license" file accompanying this file. This file is
10 # distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
11 # ANY KIND, either express or implied. See the License for the specific
12 # language governing permissions and limitations under the License.
13 import sys
14
15 from awscli.customizations.commands import BasicCommand
16
17 from . import PREDEFINED_SECTION_NAMES
18
19
20 class ConfigureGetCommand(BasicCommand):
21 NAME = 'get'
22 DESCRIPTION = BasicCommand.FROM_FILE('configure', 'get',
23 '_description.rst')
24 SYNOPSIS = ('aws configure get varname [--profile profile-name]')
25 EXAMPLES = BasicCommand.FROM_FILE('configure', 'get', '_examples.rst')
26 ARG_TABLE = [
27 {'name': 'varname',
28 'help_text': 'The name of the config value to retrieve.',
29 'action': 'store',
30 'cli_type_name': 'string', 'positional_arg': True},
31 ]
32
33 def __init__(self, session, stream=sys.stdout):
34 super(ConfigureGetCommand, self).__init__(session)
35 self._stream = stream
36
37 def _run_main(self, args, parsed_globals):
38 varname = args.varname
39 value = None
40 if '.' not in varname:
41 # get_scoped_config() returns the config variables in the config
42 # file (not the logical_var names), which is what we want.
43 config = self._session.get_scoped_config()
44 value = config.get(varname)
45 else:
46 value = self._get_dotted_config_value(varname)
47 if value is not None:
48 self._stream.write(value)
49 self._stream.write('\n')
50 return 0
51 else:
52 return 1
53
54 def _get_dotted_config_value(self, varname):
55 parts = varname.split('.')
56 num_dots = varname.count('.')
57 # Logic to deal with predefined sections like [preview], [plugin] and etc.
58 if num_dots == 1 and parts[0] in PREDEFINED_SECTION_NAMES:
59 full_config = self._session.full_config
60 section, config_name = varname.split('.')
61 value = full_config.get(section, {}).get(config_name)
62 if value is None:
63 # Try to retrieve it from the profile config.
64 value = full_config['profiles'].get(
65 section, {}).get(config_name)
66 return value
67 if parts[0] == 'profile':
68 profile_name = parts[1]
69 config_name = parts[2]
70 remaining = parts[3:]
71 # Check if varname starts with 'default' profile (e.g. default.emr-dev.emr.instance_profile)
72 # If not, go further to check if varname starts with a known profile name
73 elif parts[0] == 'default' or (parts[0] in self._session.full_config['profiles']):
74 profile_name = parts[0]
75 config_name = parts[1]
76 remaining = parts[2:]
77 else:
78 profile_name = self._session.get_config_variable('profile')
79 config_name = parts[0]
80 remaining = parts[1:]
81
82 value = self._session.full_config['profiles'].get(
83 profile_name, {}).get(config_name)
84 if len(remaining) == 1:
85 try:
86 value = value.get(remaining[-1])
87 except AttributeError:
88 value = None
89 return value
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/awscli/customizations/configure/get.py b/awscli/customizations/configure/get.py
--- a/awscli/customizations/configure/get.py
+++ b/awscli/customizations/configure/get.py
@@ -11,17 +11,21 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
import sys
+import logging
from awscli.customizations.commands import BasicCommand
+from awscli.compat import six
from . import PREDEFINED_SECTION_NAMES
+LOG = logging.getLogger(__name__)
+
class ConfigureGetCommand(BasicCommand):
NAME = 'get'
DESCRIPTION = BasicCommand.FROM_FILE('configure', 'get',
'_description.rst')
- SYNOPSIS = ('aws configure get varname [--profile profile-name]')
+ SYNOPSIS = 'aws configure get varname [--profile profile-name]'
EXAMPLES = BasicCommand.FROM_FILE('configure', 'get', '_examples.rst')
ARG_TABLE = [
{'name': 'varname',
@@ -30,13 +34,14 @@
'cli_type_name': 'string', 'positional_arg': True},
]
- def __init__(self, session, stream=sys.stdout):
+ def __init__(self, session, stream=sys.stdout, error_stream=sys.stderr):
super(ConfigureGetCommand, self).__init__(session)
self._stream = stream
+ self._error_stream = error_stream
def _run_main(self, args, parsed_globals):
varname = args.varname
- value = None
+
if '.' not in varname:
# get_scoped_config() returns the config variables in the config
# file (not the logical_var names), which is what we want.
@@ -44,17 +49,30 @@
value = config.get(varname)
else:
value = self._get_dotted_config_value(varname)
- if value is not None:
+
+ LOG.debug(u'Config value retrieved: %s' % value)
+
+ if isinstance(value, six.string_types):
self._stream.write(value)
self._stream.write('\n')
return 0
+ elif isinstance(value, dict):
+ # TODO: add support for this. We would need to print it off in
+ # the same format as the config file.
+ self._error_stream.write(
+ 'varname (%s) must reference a value, not a section or '
+ 'sub-section.' % varname
+ )
+ return 1
else:
return 1
def _get_dotted_config_value(self, varname):
parts = varname.split('.')
num_dots = varname.count('.')
- # Logic to deal with predefined sections like [preview], [plugin] and etc.
+
+ # Logic to deal with predefined sections like [preview], [plugin] and
+ # etc.
if num_dots == 1 and parts[0] in PREDEFINED_SECTION_NAMES:
full_config = self._session.full_config
section, config_name = varname.split('.')
@@ -64,18 +82,23 @@
value = full_config['profiles'].get(
section, {}).get(config_name)
return value
+
if parts[0] == 'profile':
profile_name = parts[1]
config_name = parts[2]
remaining = parts[3:]
- # Check if varname starts with 'default' profile (e.g. default.emr-dev.emr.instance_profile)
- # If not, go further to check if varname starts with a known profile name
- elif parts[0] == 'default' or (parts[0] in self._session.full_config['profiles']):
+ # Check if varname starts with 'default' profile (e.g.
+ # default.emr-dev.emr.instance_profile) If not, go further to check
+ # if varname starts with a known profile name
+ elif parts[0] == 'default' or (
+ parts[0] in self._session.full_config['profiles']):
profile_name = parts[0]
config_name = parts[1]
remaining = parts[2:]
else:
profile_name = self._session.get_config_variable('profile')
+ if profile_name is None:
+ profile_name = 'default'
config_name = parts[0]
remaining = parts[1:]
| {"golden_diff": "diff --git a/awscli/customizations/configure/get.py b/awscli/customizations/configure/get.py\n--- a/awscli/customizations/configure/get.py\n+++ b/awscli/customizations/configure/get.py\n@@ -11,17 +11,21 @@\n # ANY KIND, either express or implied. See the License for the specific\n # language governing permissions and limitations under the License.\n import sys\n+import logging\n \n from awscli.customizations.commands import BasicCommand\n+from awscli.compat import six\n \n from . import PREDEFINED_SECTION_NAMES\n \n+LOG = logging.getLogger(__name__)\n+\n \n class ConfigureGetCommand(BasicCommand):\n NAME = 'get'\n DESCRIPTION = BasicCommand.FROM_FILE('configure', 'get',\n '_description.rst')\n- SYNOPSIS = ('aws configure get varname [--profile profile-name]')\n+ SYNOPSIS = 'aws configure get varname [--profile profile-name]'\n EXAMPLES = BasicCommand.FROM_FILE('configure', 'get', '_examples.rst')\n ARG_TABLE = [\n {'name': 'varname',\n@@ -30,13 +34,14 @@\n 'cli_type_name': 'string', 'positional_arg': True},\n ]\n \n- def __init__(self, session, stream=sys.stdout):\n+ def __init__(self, session, stream=sys.stdout, error_stream=sys.stderr):\n super(ConfigureGetCommand, self).__init__(session)\n self._stream = stream\n+ self._error_stream = error_stream\n \n def _run_main(self, args, parsed_globals):\n varname = args.varname\n- value = None\n+\n if '.' not in varname:\n # get_scoped_config() returns the config variables in the config\n # file (not the logical_var names), which is what we want.\n@@ -44,17 +49,30 @@\n value = config.get(varname)\n else:\n value = self._get_dotted_config_value(varname)\n- if value is not None:\n+\n+ LOG.debug(u'Config value retrieved: %s' % value)\n+\n+ if isinstance(value, six.string_types):\n self._stream.write(value)\n self._stream.write('\\n')\n return 0\n+ elif isinstance(value, dict):\n+ # TODO: add support for this. We would need to print it off in\n+ # the same format as the config file.\n+ self._error_stream.write(\n+ 'varname (%s) must reference a value, not a section or '\n+ 'sub-section.' % varname\n+ )\n+ return 1\n else:\n return 1\n \n def _get_dotted_config_value(self, varname):\n parts = varname.split('.')\n num_dots = varname.count('.')\n- # Logic to deal with predefined sections like [preview], [plugin] and etc.\n+\n+ # Logic to deal with predefined sections like [preview], [plugin] and\n+ # etc.\n if num_dots == 1 and parts[0] in PREDEFINED_SECTION_NAMES:\n full_config = self._session.full_config\n section, config_name = varname.split('.')\n@@ -64,18 +82,23 @@\n value = full_config['profiles'].get(\n section, {}).get(config_name)\n return value\n+\n if parts[0] == 'profile':\n profile_name = parts[1]\n config_name = parts[2]\n remaining = parts[3:]\n- # Check if varname starts with 'default' profile (e.g. 
default.emr-dev.emr.instance_profile)\n- # If not, go further to check if varname starts with a known profile name\n- elif parts[0] == 'default' or (parts[0] in self._session.full_config['profiles']):\n+ # Check if varname starts with 'default' profile (e.g.\n+ # default.emr-dev.emr.instance_profile) If not, go further to check\n+ # if varname starts with a known profile name\n+ elif parts[0] == 'default' or (\n+ parts[0] in self._session.full_config['profiles']):\n profile_name = parts[0]\n config_name = parts[1]\n remaining = parts[2:]\n else:\n profile_name = self._session.get_config_variable('profile')\n+ if profile_name is None:\n+ profile_name = 'default'\n config_name = parts[0]\n remaining = parts[1:]\n", "issue": "Reading/writing EMR key_pair_file configuration options behaves oddly\nVersion:\r\n\r\n```\r\n$ aws --version\r\naws-cli/1.11.75 Python/2.7.10 Darwin/15.6.0 botocore/1.5.38\r\n```\r\n\r\n[It's suggested that one can set a default key_pair_file argument here](https://github.com/aws/aws-cli/blob/master/awscli/customizations/emr/ssh.py#L25) by running `aws configure set emr.key_pair_file <value>`\r\n\r\nBy that token, I would expect `aws configure get emr.key_pair_file` to retrieve this item and to exit with a exit code of 0.\r\n\r\n```\r\n$ aws configure set emr.key_pair_file /tmp/foo\r\n\r\n$ cat config\r\n[default]\r\nemr =\r\n key_pair_file = /tmp/foo\r\n\r\n$ aws configure get emr.key_pair_file\r\n\r\n$ echo $?\r\n1\r\n```\r\n\r\nAs you can see, setting this and trying to retrieve it exits with a non-zero exit code which makes it a pain to check for this config item being set as part of shell scripts prior to do other EMR-based commands (such as create-cluster).\r\n\r\nAs an aside, trying to get the top level `emr` config item fails too;\r\n\r\n```\r\n$ aws configure get emr\r\n\r\nexpected a character buffer object\r\n```\r\n\r\nAdditionally this item doesn't show up when `aws configure list` is run either;\r\n\r\n```\r\n$ aws configure list\r\n Name Value Type Location\r\n ---- ----- ---- --------\r\n profile <not set> None None\r\naccess_key REDACTED shared-credentials-file\r\nsecret_key REDACTED shared-credentials-file\r\n region <not set> None None\r\n\r\n```\r\n\n", "before_files": [{"content": "# Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. See the License for the specific\n# language governing permissions and limitations under the License.\nimport sys\n\nfrom awscli.customizations.commands import BasicCommand\n\nfrom . 
import PREDEFINED_SECTION_NAMES\n\n\nclass ConfigureGetCommand(BasicCommand):\n NAME = 'get'\n DESCRIPTION = BasicCommand.FROM_FILE('configure', 'get',\n '_description.rst')\n SYNOPSIS = ('aws configure get varname [--profile profile-name]')\n EXAMPLES = BasicCommand.FROM_FILE('configure', 'get', '_examples.rst')\n ARG_TABLE = [\n {'name': 'varname',\n 'help_text': 'The name of the config value to retrieve.',\n 'action': 'store',\n 'cli_type_name': 'string', 'positional_arg': True},\n ]\n\n def __init__(self, session, stream=sys.stdout):\n super(ConfigureGetCommand, self).__init__(session)\n self._stream = stream\n\n def _run_main(self, args, parsed_globals):\n varname = args.varname\n value = None\n if '.' not in varname:\n # get_scoped_config() returns the config variables in the config\n # file (not the logical_var names), which is what we want.\n config = self._session.get_scoped_config()\n value = config.get(varname)\n else:\n value = self._get_dotted_config_value(varname)\n if value is not None:\n self._stream.write(value)\n self._stream.write('\\n')\n return 0\n else:\n return 1\n\n def _get_dotted_config_value(self, varname):\n parts = varname.split('.')\n num_dots = varname.count('.')\n # Logic to deal with predefined sections like [preview], [plugin] and etc.\n if num_dots == 1 and parts[0] in PREDEFINED_SECTION_NAMES:\n full_config = self._session.full_config\n section, config_name = varname.split('.')\n value = full_config.get(section, {}).get(config_name)\n if value is None:\n # Try to retrieve it from the profile config.\n value = full_config['profiles'].get(\n section, {}).get(config_name)\n return value\n if parts[0] == 'profile':\n profile_name = parts[1]\n config_name = parts[2]\n remaining = parts[3:]\n # Check if varname starts with 'default' profile (e.g. default.emr-dev.emr.instance_profile)\n # If not, go further to check if varname starts with a known profile name\n elif parts[0] == 'default' or (parts[0] in self._session.full_config['profiles']):\n profile_name = parts[0]\n config_name = parts[1]\n remaining = parts[2:]\n else:\n profile_name = self._session.get_config_variable('profile')\n config_name = parts[0]\n remaining = parts[1:]\n\n value = self._session.full_config['profiles'].get(\n profile_name, {}).get(config_name)\n if len(remaining) == 1:\n try:\n value = value.get(remaining[-1])\n except AttributeError:\n value = None\n return value\n", "path": "awscli/customizations/configure/get.py"}], "after_files": [{"content": "# Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. See the License for the specific\n# language governing permissions and limitations under the License.\nimport sys\nimport logging\n\nfrom awscli.customizations.commands import BasicCommand\nfrom awscli.compat import six\n\nfrom . 
import PREDEFINED_SECTION_NAMES\n\nLOG = logging.getLogger(__name__)\n\n\nclass ConfigureGetCommand(BasicCommand):\n NAME = 'get'\n DESCRIPTION = BasicCommand.FROM_FILE('configure', 'get',\n '_description.rst')\n SYNOPSIS = 'aws configure get varname [--profile profile-name]'\n EXAMPLES = BasicCommand.FROM_FILE('configure', 'get', '_examples.rst')\n ARG_TABLE = [\n {'name': 'varname',\n 'help_text': 'The name of the config value to retrieve.',\n 'action': 'store',\n 'cli_type_name': 'string', 'positional_arg': True},\n ]\n\n def __init__(self, session, stream=sys.stdout, error_stream=sys.stderr):\n super(ConfigureGetCommand, self).__init__(session)\n self._stream = stream\n self._error_stream = error_stream\n\n def _run_main(self, args, parsed_globals):\n varname = args.varname\n\n if '.' not in varname:\n # get_scoped_config() returns the config variables in the config\n # file (not the logical_var names), which is what we want.\n config = self._session.get_scoped_config()\n value = config.get(varname)\n else:\n value = self._get_dotted_config_value(varname)\n\n LOG.debug(u'Config value retrieved: %s' % value)\n\n if isinstance(value, six.string_types):\n self._stream.write(value)\n self._stream.write('\\n')\n return 0\n elif isinstance(value, dict):\n # TODO: add support for this. We would need to print it off in\n # the same format as the config file.\n self._error_stream.write(\n 'varname (%s) must reference a value, not a section or '\n 'sub-section.' % varname\n )\n return 1\n else:\n return 1\n\n def _get_dotted_config_value(self, varname):\n parts = varname.split('.')\n num_dots = varname.count('.')\n\n # Logic to deal with predefined sections like [preview], [plugin] and\n # etc.\n if num_dots == 1 and parts[0] in PREDEFINED_SECTION_NAMES:\n full_config = self._session.full_config\n section, config_name = varname.split('.')\n value = full_config.get(section, {}).get(config_name)\n if value is None:\n # Try to retrieve it from the profile config.\n value = full_config['profiles'].get(\n section, {}).get(config_name)\n return value\n\n if parts[0] == 'profile':\n profile_name = parts[1]\n config_name = parts[2]\n remaining = parts[3:]\n # Check if varname starts with 'default' profile (e.g.\n # default.emr-dev.emr.instance_profile) If not, go further to check\n # if varname starts with a known profile name\n elif parts[0] == 'default' or (\n parts[0] in self._session.full_config['profiles']):\n profile_name = parts[0]\n config_name = parts[1]\n remaining = parts[2:]\n else:\n profile_name = self._session.get_config_variable('profile')\n if profile_name is None:\n profile_name = 'default'\n config_name = parts[0]\n remaining = parts[1:]\n\n value = self._session.full_config['profiles'].get(\n profile_name, {}).get(config_name)\n if len(remaining) == 1:\n try:\n value = value.get(remaining[-1])\n except AttributeError:\n value = None\n return value\n", "path": "awscli/customizations/configure/get.py"}]} | 1,619 | 974 |
gh_patches_debug_30466 | rasdani/github-patches | git_diff | vaexio__vaex-1150 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG-REPORT] TypeError: can't pickle vaex.superutils.ordered_set
**Description**
If I use `df.func.where` with `isin`, I can't pickle the resulting state.
This is for machine learning pipelines.
reproduce:
```
import vaex
import pickle
df = vaex.from_dict({'a':[1,2,3]})
df['b'] = df.func.where(df['a'].isin([1]),1,2)
pickle.dumps(df.state_get())
...
TypeError: can't pickle vaex.superutils.ordered_set_int64 objects
```
**Software information**
- vaex-core==4.0.0a11
- Vaex was installed via: pip
- OS: Mac
--- END ISSUE ---
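The `ordered_set_*` classes are compiled extension types, so pickle has no default reducer for them; registering one through `copyreg` is the usual way to make such objects picklable. Below is a sketch of that mechanism using a stand-in class (the real types live in `vaex.superutils`, and the attribute names here are assumptions for illustration):

```python
# Illustration of registering a pickle reducer via copyreg, with a plain
# Python stand-in for a compiled ordered-set-like container.
import copyreg
import pickle

class OrderedSetLike:
    def __init__(self, keys, count=0, nan_count=0, null_count=0):
        self.keys = list(keys)
        self.count = count
        self.nan_count = nan_count
        self.null_count = null_count

    def extract(self):
        return self.keys

def _reduce(x):
    # Rebuild the object from its extracted keys and counters on unpickle.
    return type(x), (x.extract(), x.count, x.nan_count, x.null_count)

copyreg.pickle(OrderedSetLike, _reduce)

restored = pickle.loads(pickle.dumps(OrderedSetLike([1, 2, 3])))
print(restored.keys)  # [1, 2, 3]
```

Whatever the actual fix looks like, it has to round-trip the set through plain picklable Python objects in a similar way.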
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `packages/vaex-core/vaex/hash.py`
Content:
```
1 import os
2
3
4 on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
5 if not on_rtd:
6 from .superutils import *
7 from . import superutils
8 ordered_set = tuple([cls for name, cls in vars(superutils).items() if name.startswith('ordered_set')])
9
10
11 def counter_type_from_dtype(dtype, transient=True):
12 from .array_types import is_string_type
13 if is_string_type(dtype):
14 if transient:
15 postfix = 'string'
16 else:
17 postfix = 'string' # view not support atm
18 else:
19 postfix = str(dtype)
20 if postfix == '>f8':
21 postfix = 'float64'
22 if postfix == 'double': # arrow
23 postfix = 'float64'
24 name = 'counter_' + postfix
25 return globals()[name]
26
27 def ordered_set_type_from_dtype(dtype, transient=True):
28 from .array_types import is_string_type
29 if is_string_type(dtype):
30 if transient:
31 postfix = 'string'
32 else:
33 postfix = 'string' # not support atm
34 else:
35 postfix = str(dtype)
36 if postfix == '>f8':
37 postfix = 'float64'
38 name = 'ordered_set_' + postfix
39 return globals()[name]
40
41 def index_type_from_dtype(dtype, transient=True, prime_growth=False):
42 from .array_types import is_string_type
43 if is_string_type(dtype):
44 if transient:
45 postfix = 'string'
46 else:
47 postfix = 'string' # not support atm
48 else:
49 postfix = str(dtype)
50 if postfix == '>f8':
51 postfix = 'float64'
52 name = 'index_hash_' + postfix
53 if prime_growth:
54 name += "_prime_growth"
55 return globals()[name]
56
57 # from numpy import *
58 # import IPython
59 # IPython.embed()
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/packages/vaex-core/vaex/hash.py b/packages/vaex-core/vaex/hash.py
--- a/packages/vaex-core/vaex/hash.py
+++ b/packages/vaex-core/vaex/hash.py
@@ -1,12 +1,19 @@
import os
+import copyreg
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd:
from .superutils import *
from . import superutils
+
ordered_set = tuple([cls for name, cls in vars(superutils).items() if name.startswith('ordered_set')])
+ def pickle(x):
+ return type(x), (x.extract(), x.count, x.nan_count, x.null_count)
+ for cls in ordered_set:
+ copyreg.pickle(cls, pickle)
+
def counter_type_from_dtype(dtype, transient=True):
from .array_types import is_string_type
@@ -24,6 +31,7 @@
name = 'counter_' + postfix
return globals()[name]
+
def ordered_set_type_from_dtype(dtype, transient=True):
from .array_types import is_string_type
if is_string_type(dtype):
@@ -38,6 +46,7 @@
name = 'ordered_set_' + postfix
return globals()[name]
+
def index_type_from_dtype(dtype, transient=True, prime_growth=False):
from .array_types import is_string_type
if is_string_type(dtype):
@@ -53,7 +62,3 @@
if prime_growth:
name += "_prime_growth"
return globals()[name]
-
-# from numpy import *
-# import IPython
-# IPython.embed()
\ No newline at end of file
| {"golden_diff": "diff --git a/packages/vaex-core/vaex/hash.py b/packages/vaex-core/vaex/hash.py\n--- a/packages/vaex-core/vaex/hash.py\n+++ b/packages/vaex-core/vaex/hash.py\n@@ -1,12 +1,19 @@\n import os\n+import copyreg\n \n \n on_rtd = os.environ.get('READTHEDOCS', None) == 'True'\n if not on_rtd:\n from .superutils import *\n from . import superutils\n+\n ordered_set = tuple([cls for name, cls in vars(superutils).items() if name.startswith('ordered_set')])\n \n+ def pickle(x):\n+ return type(x), (x.extract(), x.count, x.nan_count, x.null_count)\n+ for cls in ordered_set:\n+ copyreg.pickle(cls, pickle)\n+\n \n def counter_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n@@ -24,6 +31,7 @@\n name = 'counter_' + postfix\n return globals()[name]\n \n+\n def ordered_set_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n if is_string_type(dtype):\n@@ -38,6 +46,7 @@\n name = 'ordered_set_' + postfix\n return globals()[name]\n \n+\n def index_type_from_dtype(dtype, transient=True, prime_growth=False):\n from .array_types import is_string_type\n if is_string_type(dtype):\n@@ -53,7 +62,3 @@\n if prime_growth:\n name += \"_prime_growth\"\n return globals()[name]\n-\n-# from numpy import *\n-# import IPython\n-# IPython.embed()\n\\ No newline at end of file\n", "issue": "[BUG-REPORT] TypeError: can't pickle vaex.superutils.ordered_set\n**Description**\r\nIf I use `df.func.where` with `isin`, I can't pickle the resulted state.\r\nThis is for machine learning pipelines. \r\n\r\nreproduce:\r\n```\r\nimport vaex\r\nimport pickle\r\n\r\ndf = vaex.from_dict({'a':[1,2,3]})\r\ndf['b'] = df.func.where(df['a'].isin([1]),1,2)\r\npickle.dumps(df.state_get())\r\n...\r\nTypeError: can't pickle vaex.superutils.ordered_set_int64 objects\r\n```\r\n\r\n\r\n**Software information**\r\n - vaex-core==4.0.0a11\r\n - Vaex was installed via: pip\r\n - OS: Mac\r\n \n", "before_files": [{"content": "import os\n\n\non_rtd = os.environ.get('READTHEDOCS', None) == 'True'\nif not on_rtd:\n from .superutils import *\n from . 
import superutils\n ordered_set = tuple([cls for name, cls in vars(superutils).items() if name.startswith('ordered_set')])\n\n\ndef counter_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # view not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n if postfix == 'double': # arrow\n postfix = 'float64'\n name = 'counter_' + postfix\n return globals()[name]\n\ndef ordered_set_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n name = 'ordered_set_' + postfix\n return globals()[name]\n\ndef index_type_from_dtype(dtype, transient=True, prime_growth=False):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n name = 'index_hash_' + postfix\n if prime_growth:\n name += \"_prime_growth\"\n return globals()[name]\n\n# from numpy import *\n# import IPython\n# IPython.embed()", "path": "packages/vaex-core/vaex/hash.py"}], "after_files": [{"content": "import os\nimport copyreg\n\n\non_rtd = os.environ.get('READTHEDOCS', None) == 'True'\nif not on_rtd:\n from .superutils import *\n from . import superutils\n\n ordered_set = tuple([cls for name, cls in vars(superutils).items() if name.startswith('ordered_set')])\n\n def pickle(x):\n return type(x), (x.extract(), x.count, x.nan_count, x.null_count)\n for cls in ordered_set:\n copyreg.pickle(cls, pickle)\n\n\ndef counter_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # view not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n if postfix == 'double': # arrow\n postfix = 'float64'\n name = 'counter_' + postfix\n return globals()[name]\n\n\ndef ordered_set_type_from_dtype(dtype, transient=True):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n name = 'ordered_set_' + postfix\n return globals()[name]\n\n\ndef index_type_from_dtype(dtype, transient=True, prime_growth=False):\n from .array_types import is_string_type\n if is_string_type(dtype):\n if transient:\n postfix = 'string'\n else:\n postfix = 'string' # not support atm\n else:\n postfix = str(dtype)\n if postfix == '>f8':\n postfix = 'float64'\n name = 'index_hash_' + postfix\n if prime_growth:\n name += \"_prime_growth\"\n return globals()[name]\n", "path": "packages/vaex-core/vaex/hash.py"}]} | 932 | 381 |
gh_patches_debug_15466 | rasdani/github-patches | git_diff | google__turbinia-811 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DockerContainersEnumerationTask JSON decode failure
```
DockerContainersEnumerationTask Task failed with exception: [Expecting value: line 1 column 1 (char 0)]
Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/__init__.py", line 895, in run_wrapper
self.result = self.run(evidence, self.result)
File "/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/docker.py", line 107, in run
containers_info = self.GetContainers(evidence)
File "/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/docker.py", line 84, in GetContainers
containers_info = json.loads(json_string)
File "/usr/lib/python3.6/json/__init__.py", line 354, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.6/json/decoder.py", line 339, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.6/json/decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```
e2e test -> https://plaso-ci.log2timeline.net/jenkins/view/Turbinia/job/turbinia-e2e-googleclouddisk-dev/59/
--- END ISSUE ---
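The traceback shows `json.loads` raising outside the `try` block that was meant to guard it, so the `JSONDecodeError` handler never runs and the offending output is lost. A small generic sketch of the decode-inside-try pattern (the command below is a stand-in, not the real docker-explorer invocation):

```python
# Generic sketch: parse subprocess output inside the same try block so a
# JSONDecodeError is caught, and include the raw text in the error message.
import json
import subprocess

command = ["echo", "[]"]  # stand-in; imagine this printing docker-explorer's JSON
try:
    output = subprocess.check_output(command).decode("utf-8")
    parsed = json.loads(output)
except json.JSONDecodeError as exc:
    raise RuntimeError(
        "Error decoding JSON output: {0!s} {1!r}".format(exc, output)) from exc
except subprocess.CalledProcessError as exc:
    raise RuntimeError("Command returned an error: {0!s}".format(exc)) from exc

print(parsed)  # []
```

With the parse inside the `try`, an empty or non-JSON response from the tool produces a clear error that carries the raw output instead of an unhandled `JSONDecodeError`.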
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `turbinia/workers/docker.py`
Content:
```
1 # Copyright 2015 Google Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """Task for running docker-explorer."""
15
16 from __future__ import unicode_literals
17
18 import json
19 import logging
20 import subprocess
21
22 from turbinia import TurbiniaException
23 from turbinia.evidence import DockerContainer
24 from turbinia.evidence import EvidenceState as state
25 from turbinia.lib import utils
26 from turbinia.workers import Priority
27 from turbinia.workers import TurbiniaTask
28 from turbinia.lib.docker_manager import GetDockerPath
29 from turbinia import config
30
31 log = logging.getLogger('turbinia')
32
33
34 class DockerContainersEnumerationTask(TurbiniaTask):
35 """Enumerates Docker containers on Linux"""
36
37 REQUIRED_STATES = [state.ATTACHED, state.MOUNTED]
38
39 def GetContainers(self, evidence):
40 """Lists the containers from an input Evidence.
41
42 We use subprocess to run the DockerExplorer script, instead of using the
43 Python module, because we need to make sure all DockerExplorer code runs
44 as root.
45
46 Args:
47 evidence (Evidence): the input Evidence.
48
49 Returns:
50 a list(dict) containing information about the containers found.
51
52 Raises:
53 TurbiniaException: when the docker-explorer tool cannot be found or failed
54 to run.
55 """
56 config.LoadConfig()
57 docker_dir = GetDockerPath(evidence.mount_path)
58
59 containers_info = None
60
61 # TODO(rgayon): use docker-explorer exposed constant when
62 # https://github.com/google/docker-explorer/issues/80 is in.
63 de_binary = utils.get_exe_path('de.py')
64 if not de_binary:
65 raise TurbiniaException('Cannot find de.py in path')
66
67 docker_explorer_command = ['sudo', de_binary]
68
69 if config.DEBUG_TASKS or evidence.config.get('debug_tasks'):
70 docker_explorer_command.append('-d')
71
72 docker_explorer_command.extend(['-r', docker_dir, 'list', 'all_containers'])
73
74 log.info('Running {0:s}'.format(' '.join(docker_explorer_command)))
75 try:
76 json_string = subprocess.check_output(docker_explorer_command).decode(
77 'utf-8')
78 except json.JSONDecodeError as e:
79 raise TurbiniaException(
80 'Error decoding JSON output from de.py: {0!s}'.format(e))
81 except subprocess.CalledProcessError as e:
82 raise TurbiniaException('de.py returned an error: {0!s}'.format(e))
83
84 containers_info = json.loads(json_string)
85
86 return containers_info
87
88 def run(self, evidence, result):
89 """Run the docker-explorer tool to list containerss.
90
91 Args:
92 evidence (Evidence object): The evidence to process
93 result (TurbiniaTaskResult): The object to place task results into.
94
95 Returns:
96 TurbiniaTaskResult object.
97 """
98
99 status_report = ''
100 success = False
101
102 status_report = (
103 'Error enumerating Docker containers, evidence has no mounted '
104 'filesystem')
105 found_containers = []
106 try:
107 containers_info = self.GetContainers(evidence)
108 for container_info in containers_info:
109 container_id = container_info.get('container_id')
110 found_containers.append(container_id)
111 container_evidence = DockerContainer(container_id=container_id)
112 result.add_evidence(container_evidence, evidence.config)
113 success = True
114 status_report = 'Found {0!s} containers: {1:s}'.format(
115 len(found_containers), ' '.join(found_containers))
116 except TurbiniaException as e:
117 status_report = 'Error enumerating Docker containers: {0!s}'.format(e)
118
119 result.report_priority = Priority.LOW
120 result.report_data = status_report
121 result.close(self, success=success, status=status_report)
122 return result
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/turbinia/workers/docker.py b/turbinia/workers/docker.py
--- a/turbinia/workers/docker.py
+++ b/turbinia/workers/docker.py
@@ -75,14 +75,14 @@
try:
json_string = subprocess.check_output(docker_explorer_command).decode(
'utf-8')
+ containers_info = json.loads(json_string)
except json.JSONDecodeError as e:
raise TurbiniaException(
- 'Error decoding JSON output from de.py: {0!s}'.format(e))
+ 'Error decoding JSON output from de.py: {0!s} {1!s}'.format(
+ e, json_string))
except subprocess.CalledProcessError as e:
raise TurbiniaException('de.py returned an error: {0!s}'.format(e))
- containers_info = json.loads(json_string)
-
return containers_info
def run(self, evidence, result):
| {"golden_diff": "diff --git a/turbinia/workers/docker.py b/turbinia/workers/docker.py\n--- a/turbinia/workers/docker.py\n+++ b/turbinia/workers/docker.py\n@@ -75,14 +75,14 @@\n try:\n json_string = subprocess.check_output(docker_explorer_command).decode(\n 'utf-8')\n+ containers_info = json.loads(json_string)\n except json.JSONDecodeError as e:\n raise TurbiniaException(\n- 'Error decoding JSON output from de.py: {0!s}'.format(e))\n+ 'Error decoding JSON output from de.py: {0!s} {1!s}'.format(\n+ e, json_string))\n except subprocess.CalledProcessError as e:\n raise TurbiniaException('de.py returned an error: {0!s}'.format(e))\n \n- containers_info = json.loads(json_string)\n-\n return containers_info\n \n def run(self, evidence, result):\n", "issue": "DockerContainersEnumerationTask JSON decode failure\n```\r\nDockerContainersEnumerationTask Task failed with exception: [Expecting value: line 1 column 1 (char 0)]\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/__init__.py\", line 895, in run_wrapper\r\n self.result = self.run(evidence, self.result)\r\n File \"/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/docker.py\", line 107, in run\r\n containers_info = self.GetContainers(evidence)\r\n File \"/usr/local/lib/python3.6/dist-packages/turbinia-20210423.dev2+g1f137be-py3.6.egg/turbinia/workers/docker.py\", line 84, in GetContainers\r\n containers_info = json.loads(json_string)\r\n File \"/usr/lib/python3.6/json/__init__.py\", line 354, in loads\r\n return _default_decoder.decode(s)\r\n File \"/usr/lib/python3.6/json/decoder.py\", line 339, in decode\r\n obj, end = self.raw_decode(s, idx=_w(s, 0).end())\r\n File \"/usr/lib/python3.6/json/decoder.py\", line 357, in raw_decode\r\n raise JSONDecodeError(\"Expecting value\", s, err.value) from None\r\njson.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)\r\n```\r\n\r\ne2e test -> https://plaso-ci.log2timeline.net/jenkins/view/Turbinia/job/turbinia-e2e-googleclouddisk-dev/59/\n", "before_files": [{"content": "# Copyright 2015 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for running docker-explorer.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport json\nimport logging\nimport subprocess\n\nfrom turbinia import TurbiniaException\nfrom turbinia.evidence import DockerContainer\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.lib import utils\nfrom turbinia.workers import Priority\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.lib.docker_manager import GetDockerPath\nfrom turbinia import config\n\nlog = logging.getLogger('turbinia')\n\n\nclass DockerContainersEnumerationTask(TurbiniaTask):\n \"\"\"Enumerates Docker containers on Linux\"\"\"\n\n REQUIRED_STATES = [state.ATTACHED, state.MOUNTED]\n\n def GetContainers(self, evidence):\n \"\"\"Lists the containers from an input Evidence.\n\n We use subprocess to run the 
DockerExplorer script, instead of using the\n Python module, because we need to make sure all DockerExplorer code runs\n as root.\n\n Args:\n evidence (Evidence): the input Evidence.\n\n Returns:\n a list(dict) containing information about the containers found.\n\n Raises:\n TurbiniaException: when the docker-explorer tool cannot be found or failed\n to run.\n \"\"\"\n config.LoadConfig()\n docker_dir = GetDockerPath(evidence.mount_path)\n\n containers_info = None\n\n # TODO(rgayon): use docker-explorer exposed constant when\n # https://github.com/google/docker-explorer/issues/80 is in.\n de_binary = utils.get_exe_path('de.py')\n if not de_binary:\n raise TurbiniaException('Cannot find de.py in path')\n\n docker_explorer_command = ['sudo', de_binary]\n\n if config.DEBUG_TASKS or evidence.config.get('debug_tasks'):\n docker_explorer_command.append('-d')\n\n docker_explorer_command.extend(['-r', docker_dir, 'list', 'all_containers'])\n\n log.info('Running {0:s}'.format(' '.join(docker_explorer_command)))\n try:\n json_string = subprocess.check_output(docker_explorer_command).decode(\n 'utf-8')\n except json.JSONDecodeError as e:\n raise TurbiniaException(\n 'Error decoding JSON output from de.py: {0!s}'.format(e))\n except subprocess.CalledProcessError as e:\n raise TurbiniaException('de.py returned an error: {0!s}'.format(e))\n\n containers_info = json.loads(json_string)\n\n return containers_info\n\n def run(self, evidence, result):\n \"\"\"Run the docker-explorer tool to list containerss.\n\n Args:\n evidence (Evidence object): The evidence to process\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n\n status_report = ''\n success = False\n\n status_report = (\n 'Error enumerating Docker containers, evidence has no mounted '\n 'filesystem')\n found_containers = []\n try:\n containers_info = self.GetContainers(evidence)\n for container_info in containers_info:\n container_id = container_info.get('container_id')\n found_containers.append(container_id)\n container_evidence = DockerContainer(container_id=container_id)\n result.add_evidence(container_evidence, evidence.config)\n success = True\n status_report = 'Found {0!s} containers: {1:s}'.format(\n len(found_containers), ' '.join(found_containers))\n except TurbiniaException as e:\n status_report = 'Error enumerating Docker containers: {0!s}'.format(e)\n\n result.report_priority = Priority.LOW\n result.report_data = status_report\n result.close(self, success=success, status=status_report)\n return result\n", "path": "turbinia/workers/docker.py"}], "after_files": [{"content": "# Copyright 2015 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for running docker-explorer.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport json\nimport logging\nimport subprocess\n\nfrom turbinia import TurbiniaException\nfrom turbinia.evidence import DockerContainer\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.lib import utils\nfrom 
turbinia.workers import Priority\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.lib.docker_manager import GetDockerPath\nfrom turbinia import config\n\nlog = logging.getLogger('turbinia')\n\n\nclass DockerContainersEnumerationTask(TurbiniaTask):\n \"\"\"Enumerates Docker containers on Linux\"\"\"\n\n REQUIRED_STATES = [state.ATTACHED, state.MOUNTED]\n\n def GetContainers(self, evidence):\n \"\"\"Lists the containers from an input Evidence.\n\n We use subprocess to run the DockerExplorer script, instead of using the\n Python module, because we need to make sure all DockerExplorer code runs\n as root.\n\n Args:\n evidence (Evidence): the input Evidence.\n\n Returns:\n a list(dict) containing information about the containers found.\n\n Raises:\n TurbiniaException: when the docker-explorer tool cannot be found or failed\n to run.\n \"\"\"\n config.LoadConfig()\n docker_dir = GetDockerPath(evidence.mount_path)\n\n containers_info = None\n\n # TODO(rgayon): use docker-explorer exposed constant when\n # https://github.com/google/docker-explorer/issues/80 is in.\n de_binary = utils.get_exe_path('de.py')\n if not de_binary:\n raise TurbiniaException('Cannot find de.py in path')\n\n docker_explorer_command = ['sudo', de_binary]\n\n if config.DEBUG_TASKS or evidence.config.get('debug_tasks'):\n docker_explorer_command.append('-d')\n\n docker_explorer_command.extend(['-r', docker_dir, 'list', 'all_containers'])\n\n log.info('Running {0:s}'.format(' '.join(docker_explorer_command)))\n try:\n json_string = subprocess.check_output(docker_explorer_command).decode(\n 'utf-8')\n containers_info = json.loads(json_string)\n except json.JSONDecodeError as e:\n raise TurbiniaException(\n 'Error decoding JSON output from de.py: {0!s} {1!s}'.format(\n e, json_string))\n except subprocess.CalledProcessError as e:\n raise TurbiniaException('de.py returned an error: {0!s}'.format(e))\n\n return containers_info\n\n def run(self, evidence, result):\n \"\"\"Run the docker-explorer tool to list containerss.\n\n Args:\n evidence (Evidence object): The evidence to process\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n\n status_report = ''\n success = False\n\n status_report = (\n 'Error enumerating Docker containers, evidence has no mounted '\n 'filesystem')\n found_containers = []\n try:\n containers_info = self.GetContainers(evidence)\n for container_info in containers_info:\n container_id = container_info.get('container_id')\n found_containers.append(container_id)\n container_evidence = DockerContainer(container_id=container_id)\n result.add_evidence(container_evidence, evidence.config)\n success = True\n status_report = 'Found {0!s} containers: {1:s}'.format(\n len(found_containers), ' '.join(found_containers))\n except TurbiniaException as e:\n status_report = 'Error enumerating Docker containers: {0!s}'.format(e)\n\n result.report_priority = Priority.LOW\n result.report_data = status_report\n result.close(self, success=success, status=status_report)\n return result\n", "path": "turbinia/workers/docker.py"}]} | 1,910 | 214 |
gh_patches_debug_12726 | rasdani/github-patches | git_diff | enthought__chaco-678 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove chaco.chaco_version file
https://github.com/enthought/chaco/blob/fdd858aa6dbc76addb50d011fb81e879ce8e0355/chaco/chaco_version.py
We now create `chaco._version` file when installing the package (in `setup.py`) so we don't need this additional `chaco.chaco_version` file anymore.
--- END ISSUE ---
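For context, a `_version` module like the one mentioned here is typically produced by writing a tiny Python file at build time. A generic illustration of that pattern follows (the file name and contents are assumptions for illustration, not chaco's actual `setup.py` logic):

```python
# Illustrative only: emitting a version module at install time so the package
# no longer needs a hand-maintained chaco_version.py.
version = "2.0.9"  # matches major/minor/micro in the old chaco_version.py below

with open("_version_demo.py", "w") as fh:  # chaco writes chaco/_version.py; this path is a demo stand-in
    fh.write('version = "{0}"\n'.format(version))
```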
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chaco/chaco_version.py`
Content:
```
1 # ------------------------------------------------------------------------------
2 # Copyright (c) 2005, Enthought, Inc.
3 # All rights reserved.
4 #
5 # This software is provided without warranty under the terms of the BSD
6 # license included in LICENSE.txt and may be redistributed only
7 # under the conditions described in the aforementioned license. The license
8 # is also available online at http://www.enthought.com/licenses/BSD.txt
9 # Thanks for using Enthought open source!
10 #
11 # Author: Enthought, Inc.
12 # Description: <Enthought library component>
13 # ------------------------------------------------------------------------------
14 """ Defines version numbering for the Chaco package.
15 """
16 major = 2
17 minor = 0
18 micro = 9
19
20 version = "%s.%s.%s" % (major, minor, micro)
21
22
23 release_level = "beta"
24 branch = ""
25 revision = version
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/chaco/chaco_version.py b/chaco/chaco_version.py
deleted file mode 100644
--- a/chaco/chaco_version.py
+++ /dev/null
@@ -1,25 +0,0 @@
-# ------------------------------------------------------------------------------
-# Copyright (c) 2005, Enthought, Inc.
-# All rights reserved.
-#
-# This software is provided without warranty under the terms of the BSD
-# license included in LICENSE.txt and may be redistributed only
-# under the conditions described in the aforementioned license. The license
-# is also available online at http://www.enthought.com/licenses/BSD.txt
-# Thanks for using Enthought open source!
-#
-# Author: Enthought, Inc.
-# Description: <Enthought library component>
-# ------------------------------------------------------------------------------
-""" Defines version numbering for the Chaco package.
-"""
-major = 2
-minor = 0
-micro = 9
-
-version = "%s.%s.%s" % (major, minor, micro)
-
-
-release_level = "beta"
-branch = ""
-revision = version
| {"golden_diff": "diff --git a/chaco/chaco_version.py b/chaco/chaco_version.py\ndeleted file mode 100644\n--- a/chaco/chaco_version.py\n+++ /dev/null\n@@ -1,25 +0,0 @@\n-# ------------------------------------------------------------------------------\n-# Copyright (c) 2005, Enthought, Inc.\n-# All rights reserved.\n-#\n-# This software is provided without warranty under the terms of the BSD\n-# license included in LICENSE.txt and may be redistributed only\n-# under the conditions described in the aforementioned license. The license\n-# is also available online at http://www.enthought.com/licenses/BSD.txt\n-# Thanks for using Enthought open source!\n-#\n-# Author: Enthought, Inc.\n-# Description: <Enthought library component>\n-# ------------------------------------------------------------------------------\n-\"\"\" Defines version numbering for the Chaco package.\n-\"\"\"\n-major = 2\n-minor = 0\n-micro = 9\n-\n-version = \"%s.%s.%s\" % (major, minor, micro)\n-\n-\n-release_level = \"beta\"\n-branch = \"\"\n-revision = version\n", "issue": "Remove chaco.chaco_version file\nhttps://github.com/enthought/chaco/blob/fdd858aa6dbc76addb50d011fb81e879ce8e0355/chaco/chaco_version.py\r\n\r\nWe now create `chaco._version` file when installing the package (in `setup.py`) so we don't need this additional `chaco.chaco_version` file anymore.\n", "before_files": [{"content": "# ------------------------------------------------------------------------------\n# Copyright (c) 2005, Enthought, Inc.\n# All rights reserved.\n#\n# This software is provided without warranty under the terms of the BSD\n# license included in LICENSE.txt and may be redistributed only\n# under the conditions described in the aforementioned license. The license\n# is also available online at http://www.enthought.com/licenses/BSD.txt\n# Thanks for using Enthought open source!\n#\n# Author: Enthought, Inc.\n# Description: <Enthought library component>\n# ------------------------------------------------------------------------------\n\"\"\" Defines version numbering for the Chaco package.\n\"\"\"\nmajor = 2\nminor = 0\nmicro = 9\n\nversion = \"%s.%s.%s\" % (major, minor, micro)\n\n\nrelease_level = \"beta\"\nbranch = \"\"\nrevision = version\n", "path": "chaco/chaco_version.py"}], "after_files": [{"content": null, "path": "chaco/chaco_version.py"}]} | 569 | 245 |
gh_patches_debug_41891 | rasdani/github-patches | git_diff | streamlink__streamlink-2134 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Huomao plugin does not work
<!--
Thanks for reporting a plugin issue!
USE THE TEMPLATE. Otherwise your plugin issue may be rejected.
First, see the contribution guidelines:
https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink
Also check the list of open and closed plugin issues:
https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22
Please see the text preview to avoid unnecessary formatting errors.
-->
## Plugin Issue
<!-- Replace [ ] with [x] in order to check the box -->
- [x] This is a plugin issue and I have read the contribution guidelines.
### Description
I found that the huomao plugin does not seem to work: I can watch the stream in a browser, but streamlink says there is no playable stream.
<!-- Explain the plugin issue as thoroughly as you can. -->
### Reproduction steps / Explicit stream URLs to test
<!-- How can we reproduce this? Please note the exact steps below using the list format supplied. If you need more steps please add them. -->
1. https://www.huomao.com/9755
2. https://www.huomao.com/777777
3. https://www.huomao.com/888
### Log output
<!--
TEXT LOG OUTPUT IS REQUIRED for a plugin issue!
Use the `--loglevel debug` parameter and avoid using parameters which suppress log output.
https://streamlink.github.io/cli.html#cmdoption-l
Make sure to **remove usernames and passwords**
You can copy the output to https://gist.github.com/ or paste it below.
-->
```
[cli][info] Found matching plugin huomao for URL https://www.huomao.com/888
[plugin.huomao][error] Failed to extract stream_info.
error: No playable streams found on this URL: https://www.huomao.com/888
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/huomao.py`
Content:
```
1 """
2 NOTE: Since a documented API is nowhere to be found for Huomao; this plugin
3 simply extracts the videos stream_id, stream_url and stream_quality by
4 scraping the HTML and JS of one of Huomaos mobile webpages.
5
6 When viewing a stream on huomao.com, the base URL references a room_id. This
7 room_id is mapped one-to-one to a stream_id which references the actual .flv
8 video. Both stream_id, stream_url and stream_quality can be found in the
9 HTML and JS source of the mobile_page. Since one stream can occur in many
10 different qualities, we scrape all stream_url and stream_quality occurrences
11 and return each option to the user.
12 """
13
14 import re
15
16 from streamlink.plugin import Plugin
17 from streamlink.stream import HTTPStream
18
19 # URL pattern for recognizing inputed Huomao.tv / Huomao.com URL.
20 url_re = re.compile(r"""
21 (http(s)?://)?
22 (www\.)?
23 huomao
24 (\.tv|\.com)
25 /(?P<room_id>\d+)
26 """, re.VERBOSE)
27
28 # URL used to retrive the stream_id, stream_url and stream_quality based of
29 # a room_id.
30 mobile_url = "http://www.huomao.com/mobile/mob_live/{0}"
31
32 # Pattern for extracting the stream_id from the mobile_url HTML.
33 #
34 # Example from HTML:
35 # <input id="html_stream" value="efmrCH" type="hidden">
36 stream_id_pattern = re.compile(r'id=\"html_stream\" value=\"(?P<stream_id>\w+)\"')
37
38 # Pattern for extracting each stream_url, stream_quality_url and a prettified
39 # stream_quality_name used for quality naming.
40 #
41 # Example from HTML:
42 # "2: 'http://live-ws.huomaotv.cn/live/'+stream+'_720/playlist.m3u8'"
43 stream_info_pattern = re.compile(r"""
44 [1-9]:
45 \s+
46 '(?P<stream_url>(?:\w|\.|:|-|/)+)
47 '\+stream\+'
48 (?P<stream_quality_url>_?(?P<stream_quality_name>\d*))
49 /playlist.m3u8'
50 """, re.VERBOSE)
51
52
53 class Huomao(Plugin):
54 @classmethod
55 def can_handle_url(self, url):
56 return url_re.match(url)
57
58 def get_stream_id(self, html):
59 """Returns the stream_id contained in the HTML."""
60 stream_id = stream_id_pattern.search(html)
61
62 if not stream_id:
63 self.logger.error("Failed to extract stream_id.")
64
65 return stream_id.group("stream_id")
66
67 def get_stream_info(self, html):
68 """Returns a nested list of different stream options.
69
70 Each entry in the list will contain a stream_url, stream_quality_url
71 and stream_quality_name for each stream occurrence that was found in
72 the JS.
73 """
74 stream_info = stream_info_pattern.findall(html)
75
76 if not stream_info:
77 self.logger.error("Failed to extract stream_info.")
78
79 # Rename the "" quality to "source" by transforming the tuples to a
80 # list and reassigning.
81 stream_info_list = []
82 for info in stream_info:
83 if not info[2]:
84 stream_info_list.append([info[0], info[1], "source"])
85 else:
86 stream_info_list.append(list(info))
87
88 return stream_info_list
89
90 def _get_streams(self):
91 room_id = url_re.search(self.url).group("room_id")
92 html = self.session.http.get(mobile_url.format(room_id))
93 stream_id = self.get_stream_id(html.text)
94 stream_info = self.get_stream_info(html.text)
95
96 streams = {}
97 for info in stream_info:
98 streams[info[2]] = HTTPStream(self.session,
99 info[0] + stream_id + info[1] + ".flv")
100
101 return streams
102
103
104 __plugin__ = Huomao
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/huomao.py b/src/streamlink/plugins/huomao.py
--- a/src/streamlink/plugins/huomao.py
+++ b/src/streamlink/plugins/huomao.py
@@ -4,8 +4,8 @@
scraping the HTML and JS of one of Huomaos mobile webpages.
When viewing a stream on huomao.com, the base URL references a room_id. This
-room_id is mapped one-to-one to a stream_id which references the actual .flv
-video. Both stream_id, stream_url and stream_quality can be found in the
+room_id is mapped one-to-one to a stream_id which references the actual .m3u8
+file. Both stream_id, stream_url and stream_quality can be found in the
HTML and JS source of the mobile_page. Since one stream can occur in many
different qualities, we scrape all stream_url and stream_quality occurrences
and return each option to the user.
@@ -14,7 +14,7 @@
import re
from streamlink.plugin import Plugin
-from streamlink.stream import HTTPStream
+from streamlink.stream import HLSStream
# URL pattern for recognizing inputed Huomao.tv / Huomao.com URL.
url_re = re.compile(r"""
@@ -35,18 +35,15 @@
# <input id="html_stream" value="efmrCH" type="hidden">
stream_id_pattern = re.compile(r'id=\"html_stream\" value=\"(?P<stream_id>\w+)\"')
-# Pattern for extracting each stream_url, stream_quality_url and a prettified
+# Pattern for extracting each stream_url and
# stream_quality_name used for quality naming.
#
# Example from HTML:
-# "2: 'http://live-ws.huomaotv.cn/live/'+stream+'_720/playlist.m3u8'"
+# src="http://live-ws-hls.huomaotv.cn/live/<stream_id>_720/playlist.m3u8"
stream_info_pattern = re.compile(r"""
- [1-9]:
- \s+
- '(?P<stream_url>(?:\w|\.|:|-|/)+)
- '\+stream\+'
- (?P<stream_quality_url>_?(?P<stream_quality_name>\d*))
- /playlist.m3u8'
+ (?P<stream_url>(?:[\w\/\.\-:]+)
+ \/[^_\"]+(?:_(?P<stream_quality_name>\d+))
+ ?/playlist.m3u8)
""", re.VERBOSE)
@@ -65,11 +62,11 @@
return stream_id.group("stream_id")
def get_stream_info(self, html):
- """Returns a nested list of different stream options.
+ """
+ Returns a nested list of different stream options.
- Each entry in the list will contain a stream_url, stream_quality_url
- and stream_quality_name for each stream occurrence that was found in
- the JS.
+ Each entry in the list will contain a stream_url and stream_quality_name
+ for each stream occurrence that was found in the JS.
"""
stream_info = stream_info_pattern.findall(html)
@@ -80,8 +77,8 @@
# list and reassigning.
stream_info_list = []
for info in stream_info:
- if not info[2]:
- stream_info_list.append([info[0], info[1], "source"])
+ if not info[1]:
+ stream_info_list.append([info[0], "source"])
else:
stream_info_list.append(list(info))
@@ -95,8 +92,8 @@
streams = {}
for info in stream_info:
- streams[info[2]] = HTTPStream(self.session,
- info[0] + stream_id + info[1] + ".flv")
+ if stream_id in info[0]:
+ streams[info[1]] = HLSStream(self.session, info[0])
return streams
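
The rewritten pattern matches the HLS playlist URLs that now appear in the mobile page markup instead of assembling `.flv` URLs by hand. A quick way to check what the new regex captures, as a standalone sketch (the sample markup is hypothetical but follows the `src="..._720/playlist.m3u8"` shape quoted in the patched comment):

```python
import re

# Same expression as the patched stream_info_pattern, written on one line.
pattern = re.compile(
    r'(?P<stream_url>(?:[\w\/\.\-:]+)\/[^_"]+(?:_(?P<stream_quality_name>\d+))?/playlist.m3u8)'
)

sample = 'src="http://live-ws-hls.huomaotv.cn/live/efmrCH_720/playlist.m3u8"'
print(pattern.findall(sample))
# [('http://live-ws-hls.huomaotv.cn/live/efmrCH_720/playlist.m3u8', '720')]
# For the source-quality URL (no "_720" suffix) the quality group is empty,
# which is why the patched plugin maps an empty name to "source".
```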
| {"golden_diff": "diff --git a/src/streamlink/plugins/huomao.py b/src/streamlink/plugins/huomao.py\n--- a/src/streamlink/plugins/huomao.py\n+++ b/src/streamlink/plugins/huomao.py\n@@ -4,8 +4,8 @@\n scraping the HTML and JS of one of Huomaos mobile webpages.\n \n When viewing a stream on huomao.com, the base URL references a room_id. This\n-room_id is mapped one-to-one to a stream_id which references the actual .flv\n-video. Both stream_id, stream_url and stream_quality can be found in the\n+room_id is mapped one-to-one to a stream_id which references the actual .m3u8\n+file. Both stream_id, stream_url and stream_quality can be found in the\n HTML and JS source of the mobile_page. Since one stream can occur in many\n different qualities, we scrape all stream_url and stream_quality occurrences\n and return each option to the user.\n@@ -14,7 +14,7 @@\n import re\n \n from streamlink.plugin import Plugin\n-from streamlink.stream import HTTPStream\n+from streamlink.stream import HLSStream\n \n # URL pattern for recognizing inputed Huomao.tv / Huomao.com URL.\n url_re = re.compile(r\"\"\"\n@@ -35,18 +35,15 @@\n # <input id=\"html_stream\" value=\"efmrCH\" type=\"hidden\">\n stream_id_pattern = re.compile(r'id=\\\"html_stream\\\" value=\\\"(?P<stream_id>\\w+)\\\"')\n \n-# Pattern for extracting each stream_url, stream_quality_url and a prettified\n+# Pattern for extracting each stream_url and\n # stream_quality_name used for quality naming.\n #\n # Example from HTML:\n-# \"2: 'http://live-ws.huomaotv.cn/live/'+stream+'_720/playlist.m3u8'\"\n+# src=\"http://live-ws-hls.huomaotv.cn/live/<stream_id>_720/playlist.m3u8\"\n stream_info_pattern = re.compile(r\"\"\"\n- [1-9]:\n- \\s+\n- '(?P<stream_url>(?:\\w|\\.|:|-|/)+)\n- '\\+stream\\+'\n- (?P<stream_quality_url>_?(?P<stream_quality_name>\\d*))\n- /playlist.m3u8'\n+ (?P<stream_url>(?:[\\w\\/\\.\\-:]+)\n+ \\/[^_\\\"]+(?:_(?P<stream_quality_name>\\d+))\n+ ?/playlist.m3u8)\n \"\"\", re.VERBOSE)\n \n \n@@ -65,11 +62,11 @@\n return stream_id.group(\"stream_id\")\n \n def get_stream_info(self, html):\n- \"\"\"Returns a nested list of different stream options.\n+ \"\"\"\n+ Returns a nested list of different stream options.\n \n- Each entry in the list will contain a stream_url, stream_quality_url\n- and stream_quality_name for each stream occurrence that was found in\n- the JS.\n+ Each entry in the list will contain a stream_url and stream_quality_name\n+ for each stream occurrence that was found in the JS.\n \"\"\"\n stream_info = stream_info_pattern.findall(html)\n \n@@ -80,8 +77,8 @@\n # list and reassigning.\n stream_info_list = []\n for info in stream_info:\n- if not info[2]:\n- stream_info_list.append([info[0], info[1], \"source\"])\n+ if not info[1]:\n+ stream_info_list.append([info[0], \"source\"])\n else:\n stream_info_list.append(list(info))\n \n@@ -95,8 +92,8 @@\n \n streams = {}\n for info in stream_info:\n- streams[info[2]] = HTTPStream(self.session,\n- info[0] + stream_id + info[1] + \".flv\")\n+ if stream_id in info[0]:\n+ streams[info[1]] = HLSStream(self.session, info[0])\n \n return streams\n", "issue": "Huomao plugin not work\n<!--\r\nThanks for reporting a plugin issue!\r\nUSE THE TEMPLATE. 
Otherwise your plugin issue may be rejected.\r\n\r\nFirst, see the contribution guidelines:\r\nhttps://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink\r\n\r\nAlso check the list of open and closed plugin issues:\r\nhttps://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22\r\n\r\nPlease see the text preview to avoid unnecessary formatting errors.\r\n-->\r\n\r\n\r\n## Plugin Issue\r\n\r\n<!-- Replace [ ] with [x] in order to check the box -->\r\n- [x] This is a plugin issue and I have read the contribution guidelines.\r\n\r\n\r\n### Description\r\nI found huomao plugin seems not work, i can use browser to watch stream but\r\nstreamlink says no playable stream\r\n\r\n\r\n<!-- Explain the plugin issue as thoroughly as you can. -->\r\n\r\n\r\n### Reproduction steps / Explicit stream URLs to test\r\n\r\n<!-- How can we reproduce this? Please note the exact steps below using the list format supplied. If you need more steps please add them. -->\r\n\r\n1. https://www.huomao.com/9755\r\n2. https://www.huomao.com/777777\r\n3. https://www.huomao.com/888\r\n\r\n\r\n### Log output\r\n\r\n<!--\r\nTEXT LOG OUTPUT IS REQUIRED for a plugin issue!\r\nUse the `--loglevel debug` parameter and avoid using parameters which suppress log output.\r\nhttps://streamlink.github.io/cli.html#cmdoption-l\r\n\r\nMake sure to **remove usernames and passwords**\r\nYou can copy the output to https://gist.github.com/ or paste it below.\r\n-->\r\n\r\n```\r\n[cli][info] Found matching plugin huomao for URL https://www.huomao.com/888\r\n[plugin.huomao][error] Failed to extract stream_info.\r\nerror: No playable streams found on this URL: https://www.huomao.com/888\r\n```\r\n\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nNOTE: Since a documented API is nowhere to be found for Huomao; this plugin\nsimply extracts the videos stream_id, stream_url and stream_quality by\nscraping the HTML and JS of one of Huomaos mobile webpages.\n\nWhen viewing a stream on huomao.com, the base URL references a room_id. This\nroom_id is mapped one-to-one to a stream_id which references the actual .flv\nvideo. Both stream_id, stream_url and stream_quality can be found in the\nHTML and JS source of the mobile_page. 
Since one stream can occur in many\ndifferent qualities, we scrape all stream_url and stream_quality occurrences\nand return each option to the user.\n\"\"\"\n\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.stream import HTTPStream\n\n# URL pattern for recognizing inputed Huomao.tv / Huomao.com URL.\nurl_re = re.compile(r\"\"\"\n (http(s)?://)?\n (www\\.)?\n huomao\n (\\.tv|\\.com)\n /(?P<room_id>\\d+)\n\"\"\", re.VERBOSE)\n\n# URL used to retrive the stream_id, stream_url and stream_quality based of\n# a room_id.\nmobile_url = \"http://www.huomao.com/mobile/mob_live/{0}\"\n\n# Pattern for extracting the stream_id from the mobile_url HTML.\n#\n# Example from HTML:\n# <input id=\"html_stream\" value=\"efmrCH\" type=\"hidden\">\nstream_id_pattern = re.compile(r'id=\\\"html_stream\\\" value=\\\"(?P<stream_id>\\w+)\\\"')\n\n# Pattern for extracting each stream_url, stream_quality_url and a prettified\n# stream_quality_name used for quality naming.\n#\n# Example from HTML:\n# \"2: 'http://live-ws.huomaotv.cn/live/'+stream+'_720/playlist.m3u8'\"\nstream_info_pattern = re.compile(r\"\"\"\n [1-9]:\n \\s+\n '(?P<stream_url>(?:\\w|\\.|:|-|/)+)\n '\\+stream\\+'\n (?P<stream_quality_url>_?(?P<stream_quality_name>\\d*))\n /playlist.m3u8'\n\"\"\", re.VERBOSE)\n\n\nclass Huomao(Plugin):\n @classmethod\n def can_handle_url(self, url):\n return url_re.match(url)\n\n def get_stream_id(self, html):\n \"\"\"Returns the stream_id contained in the HTML.\"\"\"\n stream_id = stream_id_pattern.search(html)\n\n if not stream_id:\n self.logger.error(\"Failed to extract stream_id.\")\n\n return stream_id.group(\"stream_id\")\n\n def get_stream_info(self, html):\n \"\"\"Returns a nested list of different stream options.\n\n Each entry in the list will contain a stream_url, stream_quality_url\n and stream_quality_name for each stream occurrence that was found in\n the JS.\n \"\"\"\n stream_info = stream_info_pattern.findall(html)\n\n if not stream_info:\n self.logger.error(\"Failed to extract stream_info.\")\n\n # Rename the \"\" quality to \"source\" by transforming the tuples to a\n # list and reassigning.\n stream_info_list = []\n for info in stream_info:\n if not info[2]:\n stream_info_list.append([info[0], info[1], \"source\"])\n else:\n stream_info_list.append(list(info))\n\n return stream_info_list\n\n def _get_streams(self):\n room_id = url_re.search(self.url).group(\"room_id\")\n html = self.session.http.get(mobile_url.format(room_id))\n stream_id = self.get_stream_id(html.text)\n stream_info = self.get_stream_info(html.text)\n\n streams = {}\n for info in stream_info:\n streams[info[2]] = HTTPStream(self.session,\n info[0] + stream_id + info[1] + \".flv\")\n\n return streams\n\n\n__plugin__ = Huomao\n", "path": "src/streamlink/plugins/huomao.py"}], "after_files": [{"content": "\"\"\"\nNOTE: Since a documented API is nowhere to be found for Huomao; this plugin\nsimply extracts the videos stream_id, stream_url and stream_quality by\nscraping the HTML and JS of one of Huomaos mobile webpages.\n\nWhen viewing a stream on huomao.com, the base URL references a room_id. This\nroom_id is mapped one-to-one to a stream_id which references the actual .m3u8\nfile. Both stream_id, stream_url and stream_quality can be found in the\nHTML and JS source of the mobile_page. 
Since one stream can occur in many\ndifferent qualities, we scrape all stream_url and stream_quality occurrences\nand return each option to the user.\n\"\"\"\n\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.stream import HLSStream\n\n# URL pattern for recognizing inputed Huomao.tv / Huomao.com URL.\nurl_re = re.compile(r\"\"\"\n (http(s)?://)?\n (www\\.)?\n huomao\n (\\.tv|\\.com)\n /(?P<room_id>\\d+)\n\"\"\", re.VERBOSE)\n\n# URL used to retrive the stream_id, stream_url and stream_quality based of\n# a room_id.\nmobile_url = \"http://www.huomao.com/mobile/mob_live/{0}\"\n\n# Pattern for extracting the stream_id from the mobile_url HTML.\n#\n# Example from HTML:\n# <input id=\"html_stream\" value=\"efmrCH\" type=\"hidden\">\nstream_id_pattern = re.compile(r'id=\\\"html_stream\\\" value=\\\"(?P<stream_id>\\w+)\\\"')\n\n# Pattern for extracting each stream_url and\n# stream_quality_name used for quality naming.\n#\n# Example from HTML:\n# src=\"http://live-ws-hls.huomaotv.cn/live/<stream_id>_720/playlist.m3u8\"\nstream_info_pattern = re.compile(r\"\"\"\n (?P<stream_url>(?:[\\w\\/\\.\\-:]+)\n \\/[^_\\\"]+(?:_(?P<stream_quality_name>\\d+))\n ?/playlist.m3u8)\n\"\"\", re.VERBOSE)\n\n\nclass Huomao(Plugin):\n @classmethod\n def can_handle_url(self, url):\n return url_re.match(url)\n\n def get_stream_id(self, html):\n \"\"\"Returns the stream_id contained in the HTML.\"\"\"\n stream_id = stream_id_pattern.search(html)\n\n if not stream_id:\n self.logger.error(\"Failed to extract stream_id.\")\n\n return stream_id.group(\"stream_id\")\n\n def get_stream_info(self, html):\n \"\"\"\n Returns a nested list of different stream options.\n\n Each entry in the list will contain a stream_url and stream_quality_name\n for each stream occurrence that was found in the JS.\n \"\"\"\n stream_info = stream_info_pattern.findall(html)\n\n if not stream_info:\n self.logger.error(\"Failed to extract stream_info.\")\n\n # Rename the \"\" quality to \"source\" by transforming the tuples to a\n # list and reassigning.\n stream_info_list = []\n for info in stream_info:\n if not info[1]:\n stream_info_list.append([info[0], \"source\"])\n else:\n stream_info_list.append(list(info))\n\n return stream_info_list\n\n def _get_streams(self):\n room_id = url_re.search(self.url).group(\"room_id\")\n html = self.session.http.get(mobile_url.format(room_id))\n stream_id = self.get_stream_id(html.text)\n stream_info = self.get_stream_info(html.text)\n\n streams = {}\n for info in stream_info:\n if stream_id in info[0]:\n streams[info[1]] = HLSStream(self.session, info[0])\n\n return streams\n\n\n__plugin__ = Huomao\n", "path": "src/streamlink/plugins/huomao.py"}]} | 1,741 | 889 |
gh_patches_debug_6455 | rasdani/github-patches | git_diff | voicepaw__so-vits-svc-fork-354 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
UnpicklingError: Weights only load failed. Unpickler error: Unsupported class numpy.core.multiarray._reconstruct
**Describe the bug**
I tried to update, but I got this exception starting from version 3.6.0 during inference
```
UnpicklingError: Weights only load failed. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution.Do it only if you get the file from a trusted source. WeightsUnpickler error: Unsupported class numpy.core.multiarray._reconstruct
```
**To Reproduce**
Steps to reproduce the behavior:
- Update so-vits-svc-fork
- Run inference
**Additional context**
Initially I updated to version 3.8.0; because of the exception, I kept downgrading the version until I reached version 3.5.1, which solved the problem.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/so_vits_svc_fork/cluster/__init__.py`
Content:
```
1 from __future__ import annotations
2
3 from pathlib import Path
4 from typing import Any
5
6 import torch
7 from sklearn.cluster import KMeans
8
9
10 def get_cluster_model(ckpt_path: Path | str):
11 with Path(ckpt_path).open("rb") as f:
12 checkpoint = torch.load(f, map_location="cpu", weights_only=True)
13 kmeans_dict = {}
14 for spk, ckpt in checkpoint.items():
15 km = KMeans(ckpt["n_features_in_"])
16 km.__dict__["n_features_in_"] = ckpt["n_features_in_"]
17 km.__dict__["_n_threads"] = ckpt["_n_threads"]
18 km.__dict__["cluster_centers_"] = ckpt["cluster_centers_"]
19 kmeans_dict[spk] = km
20 return kmeans_dict
21
22
23 def check_speaker(model: Any, speaker: Any):
24 if speaker not in model:
25 raise ValueError(f"Speaker {speaker} not in {list(model.keys())}")
26
27
28 def get_cluster_result(model: Any, x: Any, speaker: Any):
29 """
30 x: np.array [t, 256]
31 return cluster class result
32 """
33 check_speaker(model, speaker)
34 return model[speaker].predict(x)
35
36
37 def get_cluster_center_result(model: Any, x: Any, speaker: Any):
38 """x: np.array [t, 256]"""
39 check_speaker(model, speaker)
40 predict = model[speaker].predict(x)
41 return model[speaker].cluster_centers_[predict]
42
43
44 def get_center(model: Any, x: Any, speaker: Any):
45 check_speaker(model, speaker)
46 return model[speaker].cluster_centers_[x]
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/so_vits_svc_fork/cluster/__init__.py b/src/so_vits_svc_fork/cluster/__init__.py
--- a/src/so_vits_svc_fork/cluster/__init__.py
+++ b/src/so_vits_svc_fork/cluster/__init__.py
@@ -9,7 +9,9 @@
def get_cluster_model(ckpt_path: Path | str):
with Path(ckpt_path).open("rb") as f:
- checkpoint = torch.load(f, map_location="cpu", weights_only=True)
+ checkpoint = torch.load(
+ f, map_location="cpu"
+ ) # Danger of arbitrary code execution
kmeans_dict = {}
for spk, ckpt in checkpoint.items():
km = KMeans(ckpt["n_features_in_"])
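
The underlying cause is that the cluster checkpoint is not a plain tensor state dict: it stores scikit-learn KMeans attributes (numpy arrays such as `cluster_centers_`), which the `weights_only` unpickler rejects, hence the `numpy.core.multiarray._reconstruct` error. A minimal reproduction sketch, with a made-up file name and dummy data:

```python
import numpy as np
import torch

# Save a checkpoint shaped like the cluster model file described in the issue.
torch.save({"speaker0": {"n_features_in_": 256,
                         "_n_threads": 4,
                         "cluster_centers_": np.zeros((10, 256))}}, "kmeans.pt")

# On the torch releases current at the time of this issue, this raises
# UnpicklingError: Unsupported class numpy.core.multiarray._reconstruct
# torch.load("kmeans.pt", weights_only=True)

# The patched code falls back to a full load, which is why it carries the
# "Danger of arbitrary code execution" comment for untrusted files:
checkpoint = torch.load("kmeans.pt", map_location="cpu")
```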
| {"golden_diff": "diff --git a/src/so_vits_svc_fork/cluster/__init__.py b/src/so_vits_svc_fork/cluster/__init__.py\n--- a/src/so_vits_svc_fork/cluster/__init__.py\n+++ b/src/so_vits_svc_fork/cluster/__init__.py\n@@ -9,7 +9,9 @@\n \n def get_cluster_model(ckpt_path: Path | str):\n with Path(ckpt_path).open(\"rb\") as f:\n- checkpoint = torch.load(f, map_location=\"cpu\", weights_only=True)\n+ checkpoint = torch.load(\n+ f, map_location=\"cpu\"\n+ ) # Danger of arbitrary code execution\n kmeans_dict = {}\n for spk, ckpt in checkpoint.items():\n km = KMeans(ckpt[\"n_features_in_\"])\n", "issue": "UnpicklingError: Weights only load failed. Unpickler error: Unsupported class numpy.core.multiarray._reconstruct\n**Describe the bug**\r\nI tried to update, but I got this exception start from version 3.6.0 during inference\r\n```\r\nUnpicklingError: Weights only load failed. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution.Do it only if you get the file from a trusted source. WeightsUnpickler error: Unsupported class numpy.core.multiarray._reconstruct\r\n```\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n- Update so-vits-svc-fork\r\n- Run inference\r\n\r\n**Additional context**\r\nInitially I updated to version 3.8.0, because of the exception I tried to solve by keep downgrading the version until I got to version 3.5.1 to solve the problem.\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom pathlib import Path\nfrom typing import Any\n\nimport torch\nfrom sklearn.cluster import KMeans\n\n\ndef get_cluster_model(ckpt_path: Path | str):\n with Path(ckpt_path).open(\"rb\") as f:\n checkpoint = torch.load(f, map_location=\"cpu\", weights_only=True)\n kmeans_dict = {}\n for spk, ckpt in checkpoint.items():\n km = KMeans(ckpt[\"n_features_in_\"])\n km.__dict__[\"n_features_in_\"] = ckpt[\"n_features_in_\"]\n km.__dict__[\"_n_threads\"] = ckpt[\"_n_threads\"]\n km.__dict__[\"cluster_centers_\"] = ckpt[\"cluster_centers_\"]\n kmeans_dict[spk] = km\n return kmeans_dict\n\n\ndef check_speaker(model: Any, speaker: Any):\n if speaker not in model:\n raise ValueError(f\"Speaker {speaker} not in {list(model.keys())}\")\n\n\ndef get_cluster_result(model: Any, x: Any, speaker: Any):\n \"\"\"\n x: np.array [t, 256]\n return cluster class result\n \"\"\"\n check_speaker(model, speaker)\n return model[speaker].predict(x)\n\n\ndef get_cluster_center_result(model: Any, x: Any, speaker: Any):\n \"\"\"x: np.array [t, 256]\"\"\"\n check_speaker(model, speaker)\n predict = model[speaker].predict(x)\n return model[speaker].cluster_centers_[predict]\n\n\ndef get_center(model: Any, x: Any, speaker: Any):\n check_speaker(model, speaker)\n return model[speaker].cluster_centers_[x]\n", "path": "src/so_vits_svc_fork/cluster/__init__.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom pathlib import Path\nfrom typing import Any\n\nimport torch\nfrom sklearn.cluster import KMeans\n\n\ndef get_cluster_model(ckpt_path: Path | str):\n with Path(ckpt_path).open(\"rb\") as f:\n checkpoint = torch.load(\n f, map_location=\"cpu\"\n ) # Danger of arbitrary code execution\n kmeans_dict = {}\n for spk, ckpt in checkpoint.items():\n km = KMeans(ckpt[\"n_features_in_\"])\n km.__dict__[\"n_features_in_\"] = ckpt[\"n_features_in_\"]\n km.__dict__[\"_n_threads\"] = ckpt[\"_n_threads\"]\n km.__dict__[\"cluster_centers_\"] = ckpt[\"cluster_centers_\"]\n kmeans_dict[spk] = km\n return 
kmeans_dict\n\n\ndef check_speaker(model: Any, speaker: Any):\n if speaker not in model:\n raise ValueError(f\"Speaker {speaker} not in {list(model.keys())}\")\n\n\ndef get_cluster_result(model: Any, x: Any, speaker: Any):\n \"\"\"\n x: np.array [t, 256]\n return cluster class result\n \"\"\"\n check_speaker(model, speaker)\n return model[speaker].predict(x)\n\n\ndef get_cluster_center_result(model: Any, x: Any, speaker: Any):\n \"\"\"x: np.array [t, 256]\"\"\"\n check_speaker(model, speaker)\n predict = model[speaker].predict(x)\n return model[speaker].cluster_centers_[predict]\n\n\ndef get_center(model: Any, x: Any, speaker: Any):\n check_speaker(model, speaker)\n return model[speaker].cluster_centers_[x]\n", "path": "src/so_vits_svc_fork/cluster/__init__.py"}]} | 919 | 179 |
gh_patches_debug_7476 | rasdani/github-patches | git_diff | Mailu__Mailu-1084 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Setup error for incorrect docker network subnet
When entering an incorrect docker subnet (e.g. 172.168.0.1/16), the setup throws a 500 error -- without any reason being given.
If you run locally, the error is clearly reported in an ungraceful way.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup/server.py`
Content:
```
1 import flask
2 import flask_bootstrap
3 import redis
4 import json
5 import os
6 import jinja2
7 import uuid
8 import string
9 import random
10 import ipaddress
11 import hashlib
12 import time
13
14
15 version = os.getenv("this_version", "master")
16 static_url_path = "/" + version + "/static"
17 app = flask.Flask(__name__, static_url_path=static_url_path)
18 flask_bootstrap.Bootstrap(app)
19 db = redis.StrictRedis(host='redis', port=6379, db=0)
20
21
22 def render_flavor(flavor, template, data):
23 return flask.render_template(
24 os.path.join(flavor, template),
25 **data
26 )
27
28
29 @app.add_template_global
30 def secret(length=16):
31 charset = string.ascii_uppercase + string.digits
32 return ''.join(
33 random.SystemRandom().choice(charset)
34 for _ in range(length)
35 )
36
37 #Original copied from https://github.com/andrewlkho/ulagen
38 def random_ipv6_subnet():
39 eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff
40 eui64_canon = "-".join([format(eui64, "02X")[i:i+2] for i in range(0, 18, 2)])
41
42 h = hashlib.sha1()
43 h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))
44 globalid = h.hexdigest()[0:10]
45
46 prefix = ":".join(("fd" + globalid[0:2], globalid[2:6], globalid[6:10]))
47 return prefix
48
49 def build_app(path):
50
51 app.jinja_env.trim_blocks = True
52 app.jinja_env.lstrip_blocks = True
53
54 @app.context_processor
55 def app_context():
56 return dict(versions=os.getenv("VERSIONS","master").split(','))
57
58 prefix_bp = flask.Blueprint(version, __name__)
59 prefix_bp.jinja_loader = jinja2.ChoiceLoader([
60 jinja2.FileSystemLoader(os.path.join(path, "templates")),
61 jinja2.FileSystemLoader(os.path.join(path, "flavors"))
62 ])
63
64 root_bp = flask.Blueprint("root", __name__)
65 root_bp.jinja_loader = jinja2.ChoiceLoader([
66 jinja2.FileSystemLoader(os.path.join(path, "templates")),
67 jinja2.FileSystemLoader(os.path.join(path, "flavors"))
68 ])
69
70 @prefix_bp.context_processor
71 @root_bp.context_processor
72 def bp_context(version=version):
73 return dict(version=version)
74
75 @prefix_bp.route("/")
76 @root_bp.route("/")
77 def wizard():
78 return flask.render_template('wizard.html')
79
80 @prefix_bp.route("/submit_flavor", methods=["POST"])
81 @root_bp.route("/submit_flavor", methods=["POST"])
82 def submit_flavor():
83 data = flask.request.form.copy()
84 subnet6 = random_ipv6_subnet()
85 steps = sorted(os.listdir(os.path.join(path, "templates", "steps", data["flavor"])))
86 return flask.render_template('wizard.html', flavor=data["flavor"], steps=steps, subnet6=subnet6)
87
88 @prefix_bp.route("/submit", methods=["POST"])
89 @root_bp.route("/submit", methods=["POST"])
90 def submit():
91 data = flask.request.form.copy()
92 data['uid'] = str(uuid.uuid4())
93 try:
94 data['dns'] = str(ipaddress.IPv4Network(data['subnet'])[-2])
95 except ValueError as err:
96 return "Error while generating files: " + str(err)
97 db.set(data['uid'], json.dumps(data))
98 return flask.redirect(flask.url_for('.setup', uid=data['uid']))
99
100 @prefix_bp.route("/setup/<uid>", methods=["GET"])
101 @root_bp.route("/setup/<uid>", methods=["GET"])
102 def setup(uid):
103 data = json.loads(db.get(uid))
104 flavor = data.get("flavor", "compose")
105 rendered = render_flavor(flavor, "setup.html", data)
106 return flask.render_template("setup.html", contents=rendered)
107
108 @prefix_bp.route("/file/<uid>/<filepath>", methods=["GET"])
109 @root_bp.route("/file/<uid>/<filepath>", methods=["GET"])
110 def file(uid, filepath):
111 data = json.loads(db.get(uid))
112 flavor = data.get("flavor", "compose")
113 return flask.Response(
114 render_flavor(flavor, filepath, data),
115 mimetype="application/text"
116 )
117
118 app.register_blueprint(prefix_bp, url_prefix="/{}".format(version))
119 app.register_blueprint(root_bp)
120
121
122 if __name__ == "__main__":
123 build_app("/tmp/mailutest")
124 app.run(debug=True)
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup/server.py b/setup/server.py
--- a/setup/server.py
+++ b/setup/server.py
@@ -91,7 +91,7 @@
data = flask.request.form.copy()
data['uid'] = str(uuid.uuid4())
try:
- data['dns'] = str(ipaddress.IPv4Network(data['subnet'])[-2])
+ data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])
except ValueError as err:
return "Error while generating files: " + str(err)
db.set(data['uid'], json.dumps(data))
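
For reference, the 500 error originates in the standard library: `ipaddress.IPv4Network` is strict by default and raises `ValueError` for an address with host bits set, such as 172.168.0.1/16. The patch relies on `strict=False`, which masks the host bits so the wizard can still derive the DNS address; a small sketch:

```python
import ipaddress

# Default strict=True: ValueError: 172.168.0.1/16 has host bits set
# ipaddress.IPv4Network("172.168.0.1/16")

net = ipaddress.IPv4Network("172.168.0.1/16", strict=False)
print(net)      # 172.168.0.0/16
print(net[-2])  # 172.168.255.254 -- the value stored as data['dns']
```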
| {"golden_diff": "diff --git a/setup/server.py b/setup/server.py\n--- a/setup/server.py\n+++ b/setup/server.py\n@@ -91,7 +91,7 @@\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n try:\n- data['dns'] = str(ipaddress.IPv4Network(data['subnet'])[-2])\n+ data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])\n except ValueError as err:\n return \"Error while generating files: \" + str(err)\n db.set(data['uid'], json.dumps(data))\n", "issue": "Setup error for incorrect docker network subnet\nWhen entering an incorrect docker subnet (e.g. 172.168.0.1/16) the setup throws a 500 error -- without any reasons being given. \r\n\r\nIf you run locally, the error is clearly reported in an ungraceful way.\n", "before_files": [{"content": "import flask\nimport flask_bootstrap\nimport redis\nimport json\nimport os\nimport jinja2\nimport uuid\nimport string\nimport random\nimport ipaddress\nimport hashlib\nimport time\n\n\nversion = os.getenv(\"this_version\", \"master\")\nstatic_url_path = \"/\" + version + \"/static\"\napp = flask.Flask(__name__, static_url_path=static_url_path)\nflask_bootstrap.Bootstrap(app)\ndb = redis.StrictRedis(host='redis', port=6379, db=0)\n\n\ndef render_flavor(flavor, template, data):\n return flask.render_template(\n os.path.join(flavor, template),\n **data\n )\n\n\[email protected]_template_global\ndef secret(length=16):\n charset = string.ascii_uppercase + string.digits\n return ''.join(\n random.SystemRandom().choice(charset)\n for _ in range(length)\n )\n\n#Original copied from https://github.com/andrewlkho/ulagen\ndef random_ipv6_subnet():\n eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff\n eui64_canon = \"-\".join([format(eui64, \"02X\")[i:i+2] for i in range(0, 18, 2)])\n\n h = hashlib.sha1()\n h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))\n globalid = h.hexdigest()[0:10]\n\n prefix = \":\".join((\"fd\" + globalid[0:2], globalid[2:6], globalid[6:10]))\n return prefix\n\ndef build_app(path):\n\n app.jinja_env.trim_blocks = True\n app.jinja_env.lstrip_blocks = True\n\n @app.context_processor\n def app_context():\n return dict(versions=os.getenv(\"VERSIONS\",\"master\").split(','))\n\n prefix_bp = flask.Blueprint(version, __name__)\n prefix_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n root_bp = flask.Blueprint(\"root\", __name__)\n root_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n @prefix_bp.context_processor\n @root_bp.context_processor\n def bp_context(version=version):\n return dict(version=version)\n\n @prefix_bp.route(\"/\")\n @root_bp.route(\"/\")\n def wizard():\n return flask.render_template('wizard.html')\n\n @prefix_bp.route(\"/submit_flavor\", methods=[\"POST\"])\n @root_bp.route(\"/submit_flavor\", methods=[\"POST\"])\n def submit_flavor():\n data = flask.request.form.copy()\n subnet6 = random_ipv6_subnet()\n steps = sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", data[\"flavor\"])))\n return flask.render_template('wizard.html', flavor=data[\"flavor\"], steps=steps, subnet6=subnet6)\n\n @prefix_bp.route(\"/submit\", methods=[\"POST\"])\n @root_bp.route(\"/submit\", methods=[\"POST\"])\n def submit():\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n 
try:\n data['dns'] = str(ipaddress.IPv4Network(data['subnet'])[-2])\n except ValueError as err:\n return \"Error while generating files: \" + str(err)\n db.set(data['uid'], json.dumps(data))\n return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n\n @prefix_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n @root_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n def setup(uid):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n rendered = render_flavor(flavor, \"setup.html\", data)\n return flask.render_template(\"setup.html\", contents=rendered)\n\n @prefix_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n @root_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n def file(uid, filepath):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n return flask.Response(\n render_flavor(flavor, filepath, data),\n mimetype=\"application/text\"\n )\n\n app.register_blueprint(prefix_bp, url_prefix=\"/{}\".format(version))\n app.register_blueprint(root_bp)\n\n\nif __name__ == \"__main__\":\n build_app(\"/tmp/mailutest\")\n app.run(debug=True)\n", "path": "setup/server.py"}], "after_files": [{"content": "import flask\nimport flask_bootstrap\nimport redis\nimport json\nimport os\nimport jinja2\nimport uuid\nimport string\nimport random\nimport ipaddress\nimport hashlib\nimport time\n\n\nversion = os.getenv(\"this_version\", \"master\")\nstatic_url_path = \"/\" + version + \"/static\"\napp = flask.Flask(__name__, static_url_path=static_url_path)\nflask_bootstrap.Bootstrap(app)\ndb = redis.StrictRedis(host='redis', port=6379, db=0)\n\n\ndef render_flavor(flavor, template, data):\n return flask.render_template(\n os.path.join(flavor, template),\n **data\n )\n\n\[email protected]_template_global\ndef secret(length=16):\n charset = string.ascii_uppercase + string.digits\n return ''.join(\n random.SystemRandom().choice(charset)\n for _ in range(length)\n )\n\n#Original copied from https://github.com/andrewlkho/ulagen\ndef random_ipv6_subnet():\n eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff\n eui64_canon = \"-\".join([format(eui64, \"02X\")[i:i+2] for i in range(0, 18, 2)])\n\n h = hashlib.sha1()\n h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))\n globalid = h.hexdigest()[0:10]\n\n prefix = \":\".join((\"fd\" + globalid[0:2], globalid[2:6], globalid[6:10]))\n return prefix\n\ndef build_app(path):\n\n app.jinja_env.trim_blocks = True\n app.jinja_env.lstrip_blocks = True\n\n @app.context_processor\n def app_context():\n return dict(versions=os.getenv(\"VERSIONS\",\"master\").split(','))\n\n prefix_bp = flask.Blueprint(version, __name__)\n prefix_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n root_bp = flask.Blueprint(\"root\", __name__)\n root_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n @prefix_bp.context_processor\n @root_bp.context_processor\n def bp_context(version=version):\n return dict(version=version)\n\n @prefix_bp.route(\"/\")\n @root_bp.route(\"/\")\n def wizard():\n return flask.render_template('wizard.html')\n\n @prefix_bp.route(\"/submit_flavor\", methods=[\"POST\"])\n @root_bp.route(\"/submit_flavor\", methods=[\"POST\"])\n def submit_flavor():\n data = flask.request.form.copy()\n subnet6 = 
random_ipv6_subnet()\n steps = sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", data[\"flavor\"])))\n return flask.render_template('wizard.html', flavor=data[\"flavor\"], steps=steps, subnet6=subnet6)\n\n @prefix_bp.route(\"/submit\", methods=[\"POST\"])\n @root_bp.route(\"/submit\", methods=[\"POST\"])\n def submit():\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n try:\n data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])\n except ValueError as err:\n return \"Error while generating files: \" + str(err)\n db.set(data['uid'], json.dumps(data))\n return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n\n @prefix_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n @root_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n def setup(uid):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n rendered = render_flavor(flavor, \"setup.html\", data)\n return flask.render_template(\"setup.html\", contents=rendered)\n\n @prefix_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n @root_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n def file(uid, filepath):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n return flask.Response(\n render_flavor(flavor, filepath, data),\n mimetype=\"application/text\"\n )\n\n app.register_blueprint(prefix_bp, url_prefix=\"/{}\".format(version))\n app.register_blueprint(root_bp)\n\n\nif __name__ == \"__main__\":\n build_app(\"/tmp/mailutest\")\n app.run(debug=True)\n", "path": "setup/server.py"}]} | 1,664 | 132 |
gh_patches_debug_23312 | rasdani/github-patches | git_diff | ephios-dev__ephios-338 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Format event description
As a planner, I want to be able to format the event description. This should at least mean that links will be formatted so that they are clickable. Maybe this should mean that the description will accept Markdown.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ephios/extra/templatetags/rich_text.py`
Content:
```
1 import bleach
2 import markdown
3 from django import template
4 from django.utils.safestring import mark_safe
5
6 register = template.Library()
7
8 ALLOWED_TAGS = [
9 "a",
10 "abbr",
11 "acronym",
12 "b",
13 "blockquote",
14 "br",
15 "code",
16 "div",
17 "em",
18 "h1",
19 "h2",
20 "h3",
21 "h4",
22 "h5",
23 "h6",
24 "hr",
25 "i",
26 "li",
27 "ol",
28 "p",
29 "pre",
30 "span",
31 "strong",
32 "table",
33 "tbody",
34 "td",
35 "th",
36 "thead",
37 "tr",
38 "ul",
39 ]
40
41
42 ALLOWED_ATTRIBUTES = {
43 "a": ["href", "title", "class"],
44 "abbr": ["title"],
45 "acronym": ["title"],
46 "table": ["width"],
47 "td": ["width", "align"],
48 "div": ["class"],
49 "p": ["class"],
50 "span": ["class", "title"],
51 }
52
53 ALLOWED_PROTOCOLS = ["http", "https", "mailto", "tel"]
54
55
56 def markdown_compile(source):
57 extensions = ["markdown.extensions.sane_lists", "markdown.extensions.nl2br"]
58 return bleach.clean(
59 markdown.markdown(source, extensions=extensions),
60 tags=ALLOWED_TAGS,
61 attributes=ALLOWED_ATTRIBUTES,
62 protocols=ALLOWED_PROTOCOLS,
63 )
64
65
66 @register.filter
67 def rich_text(text: str, **kwargs):
68 """
69 Processes markdown and cleans HTML in a text input.
70 """
71 text = str(text)
72 linker = bleach.Linker(parse_email=True)
73 body_md = linker.linkify(markdown_compile(text))
74 return mark_safe(body_md)
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ephios/extra/templatetags/rich_text.py b/ephios/extra/templatetags/rich_text.py
--- a/ephios/extra/templatetags/rich_text.py
+++ b/ephios/extra/templatetags/rich_text.py
@@ -53,22 +53,28 @@
ALLOWED_PROTOCOLS = ["http", "https", "mailto", "tel"]
-def markdown_compile(source):
+def markdown_compile(source, excluded_tags=""):
extensions = ["markdown.extensions.sane_lists", "markdown.extensions.nl2br"]
+ tags = ALLOWED_TAGS.copy()
+ for tag in excluded_tags.split(","):
+ try:
+ tags.remove(tag)
+ except ValueError:
+ pass
return bleach.clean(
markdown.markdown(source, extensions=extensions),
- tags=ALLOWED_TAGS,
+ tags=tags,
attributes=ALLOWED_ATTRIBUTES,
protocols=ALLOWED_PROTOCOLS,
)
@register.filter
-def rich_text(text: str, **kwargs):
+def rich_text(text: str, excluded_tags=""):
"""
Processes markdown and cleans HTML in a text input.
"""
text = str(text)
linker = bleach.Linker(parse_email=True)
- body_md = linker.linkify(markdown_compile(text))
+ body_md = linker.linkify(markdown_compile(text, excluded_tags=excluded_tags))
return mark_safe(body_md)
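
With the new `excluded_tags` argument, a caller can thin out the allow list per invocation by passing a comma-separated tag list, which makes it possible to accept Markdown in event descriptions while still suppressing, say, headings. A small usage sketch (the input text and excluded tags are illustrative; note that with bleach's default `strip=False`, disallowed tags are escaped rather than removed):

```python
from ephios.extra.templatetags.rich_text import markdown_compile

html = markdown_compile("# Title\n\nVisit <https://example.org>", excluded_tags="h1,h2")
# The autolink is rendered as an <a> tag, while the <h1> markup is escaped by
# bleach because "h1" was removed from the allowed tag list for this call.
```

A template could then pass the same argument to the filter, e.g. `{{ event.description|rich_text:"h1,h2" }}`.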
| {"golden_diff": "diff --git a/ephios/extra/templatetags/rich_text.py b/ephios/extra/templatetags/rich_text.py\n--- a/ephios/extra/templatetags/rich_text.py\n+++ b/ephios/extra/templatetags/rich_text.py\n@@ -53,22 +53,28 @@\n ALLOWED_PROTOCOLS = [\"http\", \"https\", \"mailto\", \"tel\"]\n \n \n-def markdown_compile(source):\n+def markdown_compile(source, excluded_tags=\"\"):\n extensions = [\"markdown.extensions.sane_lists\", \"markdown.extensions.nl2br\"]\n+ tags = ALLOWED_TAGS.copy()\n+ for tag in excluded_tags.split(\",\"):\n+ try:\n+ tags.remove(tag)\n+ except ValueError:\n+ pass\n return bleach.clean(\n markdown.markdown(source, extensions=extensions),\n- tags=ALLOWED_TAGS,\n+ tags=tags,\n attributes=ALLOWED_ATTRIBUTES,\n protocols=ALLOWED_PROTOCOLS,\n )\n \n \n @register.filter\n-def rich_text(text: str, **kwargs):\n+def rich_text(text: str, excluded_tags=\"\"):\n \"\"\"\n Processes markdown and cleans HTML in a text input.\n \"\"\"\n text = str(text)\n linker = bleach.Linker(parse_email=True)\n- body_md = linker.linkify(markdown_compile(text))\n+ body_md = linker.linkify(markdown_compile(text, excluded_tags=excluded_tags))\n return mark_safe(body_md)\n", "issue": "Format event description\nAs a planner, I want to be able to format the event description. This should at least mean that links will be formatted so that they are clickable. Maybe this should mean that the description will accept Markdown.\n", "before_files": [{"content": "import bleach\nimport markdown\nfrom django import template\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\nALLOWED_TAGS = [\n \"a\",\n \"abbr\",\n \"acronym\",\n \"b\",\n \"blockquote\",\n \"br\",\n \"code\",\n \"div\",\n \"em\",\n \"h1\",\n \"h2\",\n \"h3\",\n \"h4\",\n \"h5\",\n \"h6\",\n \"hr\",\n \"i\",\n \"li\",\n \"ol\",\n \"p\",\n \"pre\",\n \"span\",\n \"strong\",\n \"table\",\n \"tbody\",\n \"td\",\n \"th\",\n \"thead\",\n \"tr\",\n \"ul\",\n]\n\n\nALLOWED_ATTRIBUTES = {\n \"a\": [\"href\", \"title\", \"class\"],\n \"abbr\": [\"title\"],\n \"acronym\": [\"title\"],\n \"table\": [\"width\"],\n \"td\": [\"width\", \"align\"],\n \"div\": [\"class\"],\n \"p\": [\"class\"],\n \"span\": [\"class\", \"title\"],\n}\n\nALLOWED_PROTOCOLS = [\"http\", \"https\", \"mailto\", \"tel\"]\n\n\ndef markdown_compile(source):\n extensions = [\"markdown.extensions.sane_lists\", \"markdown.extensions.nl2br\"]\n return bleach.clean(\n markdown.markdown(source, extensions=extensions),\n tags=ALLOWED_TAGS,\n attributes=ALLOWED_ATTRIBUTES,\n protocols=ALLOWED_PROTOCOLS,\n )\n\n\[email protected]\ndef rich_text(text: str, **kwargs):\n \"\"\"\n Processes markdown and cleans HTML in a text input.\n \"\"\"\n text = str(text)\n linker = bleach.Linker(parse_email=True)\n body_md = linker.linkify(markdown_compile(text))\n return mark_safe(body_md)\n", "path": "ephios/extra/templatetags/rich_text.py"}], "after_files": [{"content": "import bleach\nimport markdown\nfrom django import template\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\nALLOWED_TAGS = [\n \"a\",\n \"abbr\",\n \"acronym\",\n \"b\",\n \"blockquote\",\n \"br\",\n \"code\",\n \"div\",\n \"em\",\n \"h1\",\n \"h2\",\n \"h3\",\n \"h4\",\n \"h5\",\n \"h6\",\n \"hr\",\n \"i\",\n \"li\",\n \"ol\",\n \"p\",\n \"pre\",\n \"span\",\n \"strong\",\n \"table\",\n \"tbody\",\n \"td\",\n \"th\",\n \"thead\",\n \"tr\",\n \"ul\",\n]\n\n\nALLOWED_ATTRIBUTES = {\n \"a\": [\"href\", \"title\", \"class\"],\n \"abbr\": [\"title\"],\n \"acronym\": [\"title\"],\n 
\"table\": [\"width\"],\n \"td\": [\"width\", \"align\"],\n \"div\": [\"class\"],\n \"p\": [\"class\"],\n \"span\": [\"class\", \"title\"],\n}\n\nALLOWED_PROTOCOLS = [\"http\", \"https\", \"mailto\", \"tel\"]\n\n\ndef markdown_compile(source, excluded_tags=\"\"):\n extensions = [\"markdown.extensions.sane_lists\", \"markdown.extensions.nl2br\"]\n tags = ALLOWED_TAGS.copy()\n for tag in excluded_tags.split(\",\"):\n try:\n tags.remove(tag)\n except ValueError:\n pass\n return bleach.clean(\n markdown.markdown(source, extensions=extensions),\n tags=tags,\n attributes=ALLOWED_ATTRIBUTES,\n protocols=ALLOWED_PROTOCOLS,\n )\n\n\[email protected]\ndef rich_text(text: str, excluded_tags=\"\"):\n \"\"\"\n Processes markdown and cleans HTML in a text input.\n \"\"\"\n text = str(text)\n linker = bleach.Linker(parse_email=True)\n body_md = linker.linkify(markdown_compile(text, excluded_tags=excluded_tags))\n return mark_safe(body_md)\n", "path": "ephios/extra/templatetags/rich_text.py"}]} | 839 | 314 |
gh_patches_debug_30302 | rasdani/github-patches | git_diff | wagtail__wagtail-10860 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Please add a progress bar to "wagtail_update_image_renditions" management command
I love this new management command to regenerate all image renditions, `wagtail_update_image_renditions`. But often the websites I had to use it on had many thousands of images, and this command would need some hours to complete. I would love to see some kind of **progress feedback** in the terminal to get an idea of the time this task might need to complete. This could be text-based or even a bar, like we know from packages like "tqdm".
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/images/management/commands/wagtail_update_image_renditions.py`
Content:
```
1 import logging
2
3 from django.core.management.base import BaseCommand
4 from django.db import transaction
5
6 from wagtail.images import get_image_model
7
8 logger = logging.getLogger(__name__)
9
10
11 class Command(BaseCommand):
12 """Command to create missing image renditions with the option to remove (purge) any existing ones."""
13
14 help = "This command will generate all image renditions, with an option to purge existing renditions first."
15
16 def add_arguments(self, parser):
17 parser.add_argument(
18 "--purge-only",
19 action="store_true",
20 help="Purge all image renditions without regenerating them",
21 )
22 parser.add_argument(
23 "--chunk-size",
24 type=int,
25 default=50,
26 help="Operate in x size chunks (default: %(default)s)",
27 )
28
29 def handle(self, *args, **options):
30 Rendition = get_image_model().get_rendition_model()
31
32 renditions = Rendition.objects.all()
33
34 purge_only = options["purge_only"]
35
36 if not renditions.exists():
37 self.stdout.write(self.style.WARNING("No image renditions found."))
38 return
39
40 rendition_ids = list(renditions.values_list("id", flat=True))
41 num_renditions = len(rendition_ids)
42
43 if purge_only:
44 self.stdout.write(
45 self.style.HTTP_INFO(f"Purging {num_renditions} rendition(s)")
46 )
47 else:
48 self.stdout.write(
49 self.style.HTTP_INFO(f"Regenerating {num_renditions} rendition(s)")
50 )
51
52 for rendition in (
53 # Pre-calculate the ids of the renditions to change,
54 # otherwise `.iterator` never ends.
55 renditions.filter(id__in=rendition_ids)
56 .select_related("image")
57 .iterator(chunk_size=options["chunk_size"])
58 ):
59 try:
60 with transaction.atomic():
61 rendition_filter = rendition.filter
62 rendition_image = rendition.image
63
64 # Delete the existing rendition
65 rendition.delete()
66
67 if not purge_only:
68 # Create a new one
69 rendition_image.get_rendition(rendition_filter)
70 except: # noqa:E722
71 logger.exception("Error operating on rendition %d", rendition.id)
72 self.stderr.write(
73 self.style.ERROR(f"Failed to operate on rendition {rendition.id}")
74 )
75 num_renditions -= 1
76
77 if num_renditions:
78 self.stdout.write(
79 self.style.SUCCESS(
80 f"Successfully processed {num_renditions} rendition(s)"
81 )
82 )
83 else:
84 self.stdout.write(self.style.WARNING("Could not process any renditions."))
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/images/management/commands/wagtail_update_image_renditions.py b/wagtail/images/management/commands/wagtail_update_image_renditions.py
--- a/wagtail/images/management/commands/wagtail_update_image_renditions.py
+++ b/wagtail/images/management/commands/wagtail_update_image_renditions.py
@@ -8,6 +8,17 @@
logger = logging.getLogger(__name__)
+def progress_bar(current, total, bar_length=50):
+ fraction = current / total
+
+ arrow = int(fraction * bar_length - 1) * "-" + ">"
+ padding = int(bar_length - len(arrow)) * " "
+
+ ending = "\n" if current == total else "\r"
+
+ return (f"Progress: [{arrow}{padding}] {int(fraction*100)}%", ending)
+
+
class Command(BaseCommand):
"""Command to create missing image renditions with the option to remove (purge) any existing ones."""
@@ -49,6 +60,7 @@
self.style.HTTP_INFO(f"Regenerating {num_renditions} rendition(s)")
)
+ progress_bar_current = 1
for rendition in (
# Pre-calculate the ids of the renditions to change,
# otherwise `.iterator` never ends.
@@ -64,6 +76,10 @@
# Delete the existing rendition
rendition.delete()
+ _progress_bar = progress_bar(progress_bar_current, num_renditions)
+ self.stdout.write(_progress_bar[0], ending=_progress_bar[1])
+ progress_bar_current = progress_bar_current + 1
+
if not purge_only:
# Create a new one
rendition_image.get_rendition(rendition_filter)
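
The added `progress_bar` helper is self-contained, so its terminal behaviour can be checked outside the management command; the `"\r"` ending makes each write overwrite the previous line until the final step prints a newline. A standalone sketch with simulated work:

```python
import time

def progress_bar(current, total, bar_length=50):
    fraction = current / total
    arrow = int(fraction * bar_length - 1) * "-" + ">"
    padding = int(bar_length - len(arrow)) * " "
    ending = "\n" if current == total else "\r"
    return (f"Progress: [{arrow}{padding}] {int(fraction*100)}%", ending)

for step in range(1, 101):
    text, ending = progress_bar(step, 100)
    print(text, end=ending)
    time.sleep(0.01)  # stand-in for regenerating one rendition
```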
| {"golden_diff": "diff --git a/wagtail/images/management/commands/wagtail_update_image_renditions.py b/wagtail/images/management/commands/wagtail_update_image_renditions.py\n--- a/wagtail/images/management/commands/wagtail_update_image_renditions.py\n+++ b/wagtail/images/management/commands/wagtail_update_image_renditions.py\n@@ -8,6 +8,17 @@\n logger = logging.getLogger(__name__)\n \n \n+def progress_bar(current, total, bar_length=50):\n+ fraction = current / total\n+\n+ arrow = int(fraction * bar_length - 1) * \"-\" + \">\"\n+ padding = int(bar_length - len(arrow)) * \" \"\n+\n+ ending = \"\\n\" if current == total else \"\\r\"\n+\n+ return (f\"Progress: [{arrow}{padding}] {int(fraction*100)}%\", ending)\n+\n+\n class Command(BaseCommand):\n \"\"\"Command to create missing image renditions with the option to remove (purge) any existing ones.\"\"\"\n \n@@ -49,6 +60,7 @@\n self.style.HTTP_INFO(f\"Regenerating {num_renditions} rendition(s)\")\n )\n \n+ progress_bar_current = 1\n for rendition in (\n # Pre-calculate the ids of the renditions to change,\n # otherwise `.iterator` never ends.\n@@ -64,6 +76,10 @@\n # Delete the existing rendition\n rendition.delete()\n \n+ _progress_bar = progress_bar(progress_bar_current, num_renditions)\n+ self.stdout.write(_progress_bar[0], ending=_progress_bar[1])\n+ progress_bar_current = progress_bar_current + 1\n+\n if not purge_only:\n # Create a new one\n rendition_image.get_rendition(rendition_filter)\n", "issue": "Please add a progress bar to \"wagtail_update_image_renditions\" management command\nI love this new management command to regenerate all image renditions `wagtail_update_image_renditions`. But often the websites I had to use it on had many thousands of images and this command would need some hours to complete. I would love to see some kind of **progress feedback** in the terminal to get an idea of the time this task might need to complete. 
This could be text based of even a bar, like we know from packages like \"tqdm\".\n", "before_files": [{"content": "import logging\n\nfrom django.core.management.base import BaseCommand\nfrom django.db import transaction\n\nfrom wagtail.images import get_image_model\n\nlogger = logging.getLogger(__name__)\n\n\nclass Command(BaseCommand):\n \"\"\"Command to create missing image renditions with the option to remove (purge) any existing ones.\"\"\"\n\n help = \"This command will generate all image renditions, with an option to purge existing renditions first.\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n \"--purge-only\",\n action=\"store_true\",\n help=\"Purge all image renditions without regenerating them\",\n )\n parser.add_argument(\n \"--chunk-size\",\n type=int,\n default=50,\n help=\"Operate in x size chunks (default: %(default)s)\",\n )\n\n def handle(self, *args, **options):\n Rendition = get_image_model().get_rendition_model()\n\n renditions = Rendition.objects.all()\n\n purge_only = options[\"purge_only\"]\n\n if not renditions.exists():\n self.stdout.write(self.style.WARNING(\"No image renditions found.\"))\n return\n\n rendition_ids = list(renditions.values_list(\"id\", flat=True))\n num_renditions = len(rendition_ids)\n\n if purge_only:\n self.stdout.write(\n self.style.HTTP_INFO(f\"Purging {num_renditions} rendition(s)\")\n )\n else:\n self.stdout.write(\n self.style.HTTP_INFO(f\"Regenerating {num_renditions} rendition(s)\")\n )\n\n for rendition in (\n # Pre-calculate the ids of the renditions to change,\n # otherwise `.iterator` never ends.\n renditions.filter(id__in=rendition_ids)\n .select_related(\"image\")\n .iterator(chunk_size=options[\"chunk_size\"])\n ):\n try:\n with transaction.atomic():\n rendition_filter = rendition.filter\n rendition_image = rendition.image\n\n # Delete the existing rendition\n rendition.delete()\n\n if not purge_only:\n # Create a new one\n rendition_image.get_rendition(rendition_filter)\n except: # noqa:E722\n logger.exception(\"Error operating on rendition %d\", rendition.id)\n self.stderr.write(\n self.style.ERROR(f\"Failed to operate on rendition {rendition.id}\")\n )\n num_renditions -= 1\n\n if num_renditions:\n self.stdout.write(\n self.style.SUCCESS(\n f\"Successfully processed {num_renditions} rendition(s)\"\n )\n )\n else:\n self.stdout.write(self.style.WARNING(\"Could not process any renditions.\"))\n", "path": "wagtail/images/management/commands/wagtail_update_image_renditions.py"}], "after_files": [{"content": "import logging\n\nfrom django.core.management.base import BaseCommand\nfrom django.db import transaction\n\nfrom wagtail.images import get_image_model\n\nlogger = logging.getLogger(__name__)\n\n\ndef progress_bar(current, total, bar_length=50):\n fraction = current / total\n\n arrow = int(fraction * bar_length - 1) * \"-\" + \">\"\n padding = int(bar_length - len(arrow)) * \" \"\n\n ending = \"\\n\" if current == total else \"\\r\"\n\n return (f\"Progress: [{arrow}{padding}] {int(fraction*100)}%\", ending)\n\n\nclass Command(BaseCommand):\n \"\"\"Command to create missing image renditions with the option to remove (purge) any existing ones.\"\"\"\n\n help = \"This command will generate all image renditions, with an option to purge existing renditions first.\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n \"--purge-only\",\n action=\"store_true\",\n help=\"Purge all image renditions without regenerating them\",\n )\n parser.add_argument(\n \"--chunk-size\",\n type=int,\n default=50,\n 
help=\"Operate in x size chunks (default: %(default)s)\",\n )\n\n def handle(self, *args, **options):\n Rendition = get_image_model().get_rendition_model()\n\n renditions = Rendition.objects.all()\n\n purge_only = options[\"purge_only\"]\n\n if not renditions.exists():\n self.stdout.write(self.style.WARNING(\"No image renditions found.\"))\n return\n\n rendition_ids = list(renditions.values_list(\"id\", flat=True))\n num_renditions = len(rendition_ids)\n\n if purge_only:\n self.stdout.write(\n self.style.HTTP_INFO(f\"Purging {num_renditions} rendition(s)\")\n )\n else:\n self.stdout.write(\n self.style.HTTP_INFO(f\"Regenerating {num_renditions} rendition(s)\")\n )\n\n progress_bar_current = 1\n for rendition in (\n # Pre-calculate the ids of the renditions to change,\n # otherwise `.iterator` never ends.\n renditions.filter(id__in=rendition_ids)\n .select_related(\"image\")\n .iterator(chunk_size=options[\"chunk_size\"])\n ):\n try:\n with transaction.atomic():\n rendition_filter = rendition.filter\n rendition_image = rendition.image\n\n # Delete the existing rendition\n rendition.delete()\n\n _progress_bar = progress_bar(progress_bar_current, num_renditions)\n self.stdout.write(_progress_bar[0], ending=_progress_bar[1])\n progress_bar_current = progress_bar_current + 1\n\n if not purge_only:\n # Create a new one\n rendition_image.get_rendition(rendition_filter)\n except: # noqa:E722\n logger.exception(\"Error operating on rendition %d\", rendition.id)\n self.stderr.write(\n self.style.ERROR(f\"Failed to operate on rendition {rendition.id}\")\n )\n num_renditions -= 1\n\n if num_renditions:\n self.stdout.write(\n self.style.SUCCESS(\n f\"Successfully processed {num_renditions} rendition(s)\"\n )\n )\n else:\n self.stdout.write(self.style.WARNING(\"Could not process any renditions.\"))\n", "path": "wagtail/images/management/commands/wagtail_update_image_renditions.py"}]} | 1,113 | 397 |
gh_patches_debug_19081 | rasdani/github-patches | git_diff | nvaccess__nvda-11609 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Visual Studio IntelliSense overly verbose
To reproduce this issue:
1. I created a console application
2. In the main method I started typing "Console" and it reads ok
3. When I press "." to access "WriteLine" or another method, it reads the whole line, which gets very confusing and annoying when lines get longer. Imagine pressing the up or down arrows: for each item in the IntelliSense it repeats the whole line of code and only then reads the item in the IntelliSense.
For example, if the line is following:
string name = string.format("Hello {0}", textField.text);
When I get to the point where I type "textField" and then "." it will repeat the whole line and then read the "text" property, and it repeats if I move up or down using the arrow keys.
It is worth mentioning that this issue is not present when using Freedom Scientific JAWS, but NVDA is a much better choice for blind developers.
I also reported this issue to the Microsoft: https://developercommunity.visualstudio.com/content/problem/164719/nvda-screen-reader-is-not-working-properly-with-in.html
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/NVDAObjects/UIA/VisualStudio.py`
Content:
```
1 # This file is covered by the GNU General Public License.
2 # See the file COPYING for more details.
3 # Copyright (C) 2020 NV Access Limited, Leonard de Ruijter
4
5 """
6 Object overlay classes for Visual Studio components
7 available in Visual Studio and SQL Server Management Studio.
8 """
9
10 from . import UIA
11 import speech
12 import braille
13 import api
14
15
16 class IntelliSenseItem(UIA):
17
18 def _get_name(self):
19 return self.UIAElement.cachedAutomationID
20
21 def event_UIA_elementSelected(self):
22 # Cancel speech to have speech announce the selection as soon as possible.
23 # This is needed because L{reportFocus} does not cancel speech.
24 # Therefore, if speech wouldn't be cancelled,
25 # selection announcements would queue up when changing selection rapidly.
26 speech.cancelSpeech()
27 api.setNavigatorObject(self, isFocus=True)
28 self.reportFocus()
29 # Display results as flash messages.
30 braille.handler.message(braille.getPropertiesBraille(
31 name=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description
32 ))
33
34
35 class IntelliSenseList(UIA):
36 ...
37
38
39 def findExtraOverlayClasses(obj, clsList):
40 if obj.UIAElement.cachedAutomationId == "listBoxCompletions":
41 clsList.insert(0, IntelliSenseList)
42 elif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == "IntellisenseMenuItem":
43 clsList.insert(0, IntelliSenseItem)
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/source/NVDAObjects/UIA/VisualStudio.py b/source/NVDAObjects/UIA/VisualStudio.py
--- a/source/NVDAObjects/UIA/VisualStudio.py
+++ b/source/NVDAObjects/UIA/VisualStudio.py
@@ -36,8 +36,31 @@
...
+class IntelliSenseLiveRegion(UIA):
+ """
+ Visual Studio uses both Intellisense menu item objects and a live region
+ to communicate Intellisense selections.
+ NVDA uses the menu item approach and therefore the live region provides doubled information
+ and is disabled.
+ """
+
+ _shouldAllowUIALiveRegionChangeEvent = False
+
+
+_INTELLISENSE_LIST_AUTOMATION_IDS = {
+ "listBoxCompletions",
+ "CompletionList"
+}
+
+
def findExtraOverlayClasses(obj, clsList):
- if obj.UIAElement.cachedAutomationId == "listBoxCompletions":
+ if obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:
clsList.insert(0, IntelliSenseList)
elif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == "IntellisenseMenuItem":
clsList.insert(0, IntelliSenseItem)
+ elif (
+ obj.UIAElement.cachedClassName == "LiveTextBlock"
+ and obj.previous
+ and isinstance(obj.previous.previous, IntelliSenseList)
+ ):
+ clsList.insert(0, IntelliSenseLiveRegion)
| {"golden_diff": "diff --git a/source/NVDAObjects/UIA/VisualStudio.py b/source/NVDAObjects/UIA/VisualStudio.py\n--- a/source/NVDAObjects/UIA/VisualStudio.py\n+++ b/source/NVDAObjects/UIA/VisualStudio.py\n@@ -36,8 +36,31 @@\n \t...\n \n \n+class IntelliSenseLiveRegion(UIA):\n+\t\"\"\"\n+\tVisual Studio uses both Intellisense menu item objects and a live region\n+\tto communicate Intellisense selections.\n+\tNVDA uses the menu item approach and therefore the live region provides doubled information\n+\tand is disabled.\n+\t\"\"\"\n+\n+\t_shouldAllowUIALiveRegionChangeEvent = False\n+\n+\n+_INTELLISENSE_LIST_AUTOMATION_IDS = {\n+\t\"listBoxCompletions\",\n+\t\"CompletionList\"\n+}\n+\n+\n def findExtraOverlayClasses(obj, clsList):\n-\tif obj.UIAElement.cachedAutomationId == \"listBoxCompletions\":\n+\tif obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:\n \t\tclsList.insert(0, IntelliSenseList)\n \telif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == \"IntellisenseMenuItem\":\n \t\tclsList.insert(0, IntelliSenseItem)\n+\telif (\n+\t\tobj.UIAElement.cachedClassName == \"LiveTextBlock\"\n+\t\tand obj.previous\n+\t\tand isinstance(obj.previous.previous, IntelliSenseList)\n+\t):\n+\t\tclsList.insert(0, IntelliSenseLiveRegion)\n", "issue": "Visual Studio IntelliSense overly verbose\nTo reproduce this issue:\r\n1. I created a console application\r\n2. In the main method I started typing \"Console\" and it reads ok\r\n3. When I press \".\" to access \"WriteLine\" or other method it reads the whole line. Which gets very confusing and anoying when lines get longer. Imagine pressing up or down arrows and for each item in the Intelli sense it repeats the whole line of code and lastly reads the item in the IntelliSense.\r\nFor example, if the line is following:\r\nstring name = string.format(\"Hello {0}\", textField.text);\r\n\r\nWhen I got to the point when I type \"textField\" and than \".\" it will repeat the whole line and than read the \"text\" property, and it repeats if I move up or down using arrow keys.\r\n\r\nIt is worth mentioning that this issue is not present when using Freedom scientific JAWS, but NVDA is much better choice for blind developers.\r\n\r\nI also reported this issue to the Microsoft: https://developercommunity.visualstudio.com/content/problem/164719/nvda-screen-reader-is-not-working-properly-with-in.html\n", "before_files": [{"content": "# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n# Copyright (C) 2020 NV Access Limited, Leonard de Ruijter\n\n\"\"\"\nObject overlay classes for Visual Studio components\navailable in Visual Studio and SQL Server Management Studio.\n\"\"\"\n\nfrom . 
import UIA\nimport speech\nimport braille\nimport api\n\n\nclass IntelliSenseItem(UIA):\n\n\tdef _get_name(self):\n\t\treturn self.UIAElement.cachedAutomationID\n\n\tdef event_UIA_elementSelected(self):\n\t\t# Cancel speech to have speech announce the selection as soon as possible.\n\t\t# This is needed because L{reportFocus} does not cancel speech.\n\t\t# Therefore, if speech wouldn't be cancelled,\n\t\t# selection announcements would queue up when changing selection rapidly.\n\t\tspeech.cancelSpeech()\n\t\tapi.setNavigatorObject(self, isFocus=True)\n\t\tself.reportFocus()\n\t\t# Display results as flash messages.\n\t\tbraille.handler.message(braille.getPropertiesBraille(\n\t\t\tname=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description\n\t\t))\n\n\nclass IntelliSenseList(UIA):\n\t...\n\n\ndef findExtraOverlayClasses(obj, clsList):\n\tif obj.UIAElement.cachedAutomationId == \"listBoxCompletions\":\n\t\tclsList.insert(0, IntelliSenseList)\n\telif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == \"IntellisenseMenuItem\":\n\t\tclsList.insert(0, IntelliSenseItem)\n", "path": "source/NVDAObjects/UIA/VisualStudio.py"}], "after_files": [{"content": "# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n# Copyright (C) 2020 NV Access Limited, Leonard de Ruijter\n\n\"\"\"\nObject overlay classes for Visual Studio components\navailable in Visual Studio and SQL Server Management Studio.\n\"\"\"\n\nfrom . import UIA\nimport speech\nimport braille\nimport api\n\n\nclass IntelliSenseItem(UIA):\n\n\tdef _get_name(self):\n\t\treturn self.UIAElement.cachedAutomationID\n\n\tdef event_UIA_elementSelected(self):\n\t\t# Cancel speech to have speech announce the selection as soon as possible.\n\t\t# This is needed because L{reportFocus} does not cancel speech.\n\t\t# Therefore, if speech wouldn't be cancelled,\n\t\t# selection announcements would queue up when changing selection rapidly.\n\t\tspeech.cancelSpeech()\n\t\tapi.setNavigatorObject(self, isFocus=True)\n\t\tself.reportFocus()\n\t\t# Display results as flash messages.\n\t\tbraille.handler.message(braille.getPropertiesBraille(\n\t\t\tname=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description\n\t\t))\n\n\nclass IntelliSenseList(UIA):\n\t...\n\n\nclass IntelliSenseLiveRegion(UIA):\n\t\"\"\"\n\tVisual Studio uses both Intellisense menu item objects and a live region\n\tto communicate Intellisense selections.\n\tNVDA uses the menu item approach and therefore the live region provides doubled information\n\tand is disabled.\n\t\"\"\"\n\n\t_shouldAllowUIALiveRegionChangeEvent = False\n\n\n_INTELLISENSE_LIST_AUTOMATION_IDS = {\n\t\"listBoxCompletions\",\n\t\"CompletionList\"\n}\n\n\ndef findExtraOverlayClasses(obj, clsList):\n\tif obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:\n\t\tclsList.insert(0, IntelliSenseList)\n\telif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == \"IntellisenseMenuItem\":\n\t\tclsList.insert(0, IntelliSenseItem)\n\telif (\n\t\tobj.UIAElement.cachedClassName == \"LiveTextBlock\"\n\t\tand obj.previous\n\t\tand isinstance(obj.previous.previous, IntelliSenseList)\n\t):\n\t\tclsList.insert(0, IntelliSenseLiveRegion)\n", "path": "source/NVDAObjects/UIA/VisualStudio.py"}]} | 912 | 336 |
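Two things in the patch above do the work: the completion list is now matched against a set of automation IDs (so SQL Server Management Studio's `CompletionList` is covered as well as Visual Studio's `listBoxCompletions`), and a new `IntelliSenseLiveRegion` overlay class sets `_shouldAllowUIALiveRegionChangeEvent = False` so the live region no longer repeats what the menu-item announcement already conveys. A small sketch of the classifier logic with dummy objects (the real `UIAAutomationId` values come from NVDA's UIA layer; nothing here touches the actual NVDA API):

```python
class UIA: ...
class IntelliSenseList(UIA): ...

_INTELLISENSE_LIST_AUTOMATION_IDS = {"listBoxCompletions", "CompletionList"}

class FakeObj:
    """Dummy stand-in for an NVDA UIA object; only the attribute we need."""
    def __init__(self, automation_id):
        self.UIAAutomationId = automation_id

def findExtraOverlayClasses(obj, clsList):
    # Most specific class goes first, mirroring the patched classifier.
    if obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:
        clsList.insert(0, IntelliSenseList)

for automation_id in ("listBoxCompletions", "CompletionList", "somethingElse"):
    cls_list = [UIA]
    findExtraOverlayClasses(FakeObj(automation_id), cls_list)
    print(automation_id, [c.__name__ for c in cls_list])
# listBoxCompletions ['IntelliSenseList', 'UIA']
# CompletionList ['IntelliSenseList', 'UIA']
# somethingElse ['UIA']
```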
gh_patches_debug_47517 | rasdani/github-patches | git_diff | conda-forge__conda-smithy-1727 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python 3 regression: Undefined Jinja2 variables get rendered as empty string in linting
### Solution to issue cannot be found in the documentation.
- [X] I checked the documentation.
### Issue
For linting, undefined Jinja2 variables get rendered by `conda_smithy.utils.NullUndefined`. That class contains a `__unicode__` method that returns the name of the variable. This is useful to put a clear placeholder where variables will be filled by variants from `conda_build_config.yaml` during the actual build. However, `NullUndefined` doesn't override the `__str__` method of Jinja's own `Undefined`, which returns an empty string.
In effect, linting in a Python 2 environment renders, e.g. `- {{ libjpeg }}` as `- libjpeg`, but in a Python 3 environment, we get `- ` which becomes `None` in the `requirements_section` dictionary.
### Installed packages
```shell
-
```
### Environment info
```shell
-
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda_smithy/utils.py`
Content:
```
1 import shutil
2 import tempfile
3 import io
4 import jinja2
5 import datetime
6 import time
7 import os
8 import sys
9 from pathlib import Path
10 from collections import defaultdict
11 from contextlib import contextmanager
12
13 import ruamel.yaml
14
15
16 def get_feedstock_name_from_meta(meta):
17 """Resolve the feedtstock name from the parsed meta.yaml."""
18 if "feedstock-name" in meta.meta["extra"]:
19 return meta.meta["extra"]["feedstock-name"]
20 elif "parent_recipe" in meta.meta["extra"]:
21 return meta.meta["extra"]["parent_recipe"]["name"]
22 else:
23 return meta.name()
24
25
26 def get_feedstock_about_from_meta(meta) -> dict:
27 """Fetch the feedtstock about from the parsed meta.yaml."""
28 # it turns out that conda_build would not preserve the feedstock about:
29 # - if a subpackage does not have about, it uses the feedstock's
30 # - if a subpackage has about, it's used as is
31 # therefore we need to parse the yaml again just to get the about section...
32 if "parent_recipe" in meta.meta["extra"]:
33 recipe_meta = os.path.join(
34 meta.meta["extra"]["parent_recipe"]["path"], "meta.yaml"
35 )
36 with io.open(recipe_meta, "rt") as fh:
37 content = render_meta_yaml("".join(fh))
38 meta = get_yaml().load(content)
39 return dict(meta["about"])
40 else:
41 # no parent recipe for any reason, use self's about
42 return dict(meta.meta["about"])
43
44
45 def get_yaml():
46 # define global yaml API
47 # roundrip-loader and allowing duplicate keys
48 # for handling # [filter] / # [not filter]
49 # Don't use a global variable for this as a global
50 # variable will make conda-smithy thread unsafe.
51 yaml = ruamel.yaml.YAML(typ="rt")
52 yaml.allow_duplicate_keys = True
53 return yaml
54
55
56 @contextmanager
57 def tmp_directory():
58 tmp_dir = tempfile.mkdtemp("_recipe")
59 yield tmp_dir
60 shutil.rmtree(tmp_dir)
61
62
63 class NullUndefined(jinja2.Undefined):
64 def __unicode__(self):
65 return self._undefined_name
66
67 def __getattr__(self, name):
68 return "{}.{}".format(self, name)
69
70 def __getitem__(self, name):
71 return '{}["{}"]'.format(self, name)
72
73
74 class MockOS(dict):
75 def __init__(self):
76 self.environ = defaultdict(lambda: "")
77 self.sep = "/"
78
79
80 def stub_compatible_pin(*args, **kwargs):
81 return f"compatible_pin {args[0]}"
82
83
84 def stub_subpackage_pin(*args, **kwargs):
85 return f"subpackage_pin {args[0]}"
86
87
88 def render_meta_yaml(text):
89 env = jinja2.Environment(undefined=NullUndefined)
90
91 # stub out cb3 jinja2 functions - they are not important for linting
92 # if we don't stub them out, the ruamel.yaml load fails to interpret them
93 # we can't just use conda-build's api.render functionality, because it would apply selectors
94 env.globals.update(
95 dict(
96 compiler=lambda x: x + "_compiler_stub",
97 pin_subpackage=stub_subpackage_pin,
98 pin_compatible=stub_compatible_pin,
99 cdt=lambda *args, **kwargs: "cdt_stub",
100 load_file_regex=lambda *args, **kwargs: defaultdict(lambda: ""),
101 datetime=datetime,
102 time=time,
103 target_platform="linux-64",
104 mpi="mpi",
105 )
106 )
107 mockos = MockOS()
108 py_ver = "3.7"
109 context = {"os": mockos, "environ": mockos.environ, "PY_VER": py_ver}
110 content = env.from_string(text).render(context)
111 return content
112
113
114 @contextmanager
115 def update_conda_forge_config(forge_yaml):
116 """Utility method used to update conda forge configuration files
117
118 Uage:
119 >>> with update_conda_forge_config(somepath) as cfg:
120 ... cfg['foo'] = 'bar'
121 """
122 if os.path.exists(forge_yaml):
123 with open(forge_yaml, "r") as fh:
124 code = get_yaml().load(fh)
125 else:
126 code = {}
127
128 # Code could come in as an empty list.
129 if not code:
130 code = {}
131
132 yield code
133
134 get_yaml().dump(code, Path(forge_yaml))
135
136
137 def merge_dict(src, dest):
138 """Recursive merge dictionary"""
139 for key, value in src.items():
140 if isinstance(value, dict):
141 # get node or create one
142 node = dest.setdefault(key, {})
143 merge_dict(value, node)
144 else:
145 dest[key] = value
146
147 return dest
148
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda_smithy/utils.py b/conda_smithy/utils.py
--- a/conda_smithy/utils.py
+++ b/conda_smithy/utils.py
@@ -61,7 +61,7 @@
class NullUndefined(jinja2.Undefined):
- def __unicode__(self):
+ def __str__(self):
return self._undefined_name
def __getattr__(self, name):
| {"golden_diff": "diff --git a/conda_smithy/utils.py b/conda_smithy/utils.py\n--- a/conda_smithy/utils.py\n+++ b/conda_smithy/utils.py\n@@ -61,7 +61,7 @@\n \n \n class NullUndefined(jinja2.Undefined):\n- def __unicode__(self):\n+ def __str__(self):\n return self._undefined_name\n \n def __getattr__(self, name):\n", "issue": "Python 3 regression: Undefined Jinja2 variables get rendered as empty string in linting\n### Solution to issue cannot be found in the documentation.\n\n- [X] I checked the documentation.\n\n### Issue\n\nFor linting, undefined Jinja2 variables get rendered by `conda_smithy.utils.NullUndefined`. That class contains a `__unicode__` method that returns the name of the variable. This is useful to put a clear placeholder where variables will be filled by variants from `conda_build_config.yaml` during the actual build. However, `NullUndefined` doesn't overwrite the `__str__` method of Jinja's own `Undefined`, which returns an empty string.\r\n\r\nIn effect, linting in a Python 2 environment renders, e.g. `- {{ libjpeg }}` as `- libjpeg`, but in a Python 3 environment, we get `- ` which becomes `None` in the `requirements_section` dictionary.\n\n### Installed packages\n\n```shell\n-\n```\n\n\n### Environment info\n\n```shell\n-\n```\n\n", "before_files": [{"content": "import shutil\nimport tempfile\nimport io\nimport jinja2\nimport datetime\nimport time\nimport os\nimport sys\nfrom pathlib import Path\nfrom collections import defaultdict\nfrom contextlib import contextmanager\n\nimport ruamel.yaml\n\n\ndef get_feedstock_name_from_meta(meta):\n \"\"\"Resolve the feedtstock name from the parsed meta.yaml.\"\"\"\n if \"feedstock-name\" in meta.meta[\"extra\"]:\n return meta.meta[\"extra\"][\"feedstock-name\"]\n elif \"parent_recipe\" in meta.meta[\"extra\"]:\n return meta.meta[\"extra\"][\"parent_recipe\"][\"name\"]\n else:\n return meta.name()\n\n\ndef get_feedstock_about_from_meta(meta) -> dict:\n \"\"\"Fetch the feedtstock about from the parsed meta.yaml.\"\"\"\n # it turns out that conda_build would not preserve the feedstock about:\n # - if a subpackage does not have about, it uses the feedstock's\n # - if a subpackage has about, it's used as is\n # therefore we need to parse the yaml again just to get the about section...\n if \"parent_recipe\" in meta.meta[\"extra\"]:\n recipe_meta = os.path.join(\n meta.meta[\"extra\"][\"parent_recipe\"][\"path\"], \"meta.yaml\"\n )\n with io.open(recipe_meta, \"rt\") as fh:\n content = render_meta_yaml(\"\".join(fh))\n meta = get_yaml().load(content)\n return dict(meta[\"about\"])\n else:\n # no parent recipe for any reason, use self's about\n return dict(meta.meta[\"about\"])\n\n\ndef get_yaml():\n # define global yaml API\n # roundrip-loader and allowing duplicate keys\n # for handling # [filter] / # [not filter]\n # Don't use a global variable for this as a global\n # variable will make conda-smithy thread unsafe.\n yaml = ruamel.yaml.YAML(typ=\"rt\")\n yaml.allow_duplicate_keys = True\n return yaml\n\n\n@contextmanager\ndef tmp_directory():\n tmp_dir = tempfile.mkdtemp(\"_recipe\")\n yield tmp_dir\n shutil.rmtree(tmp_dir)\n\n\nclass NullUndefined(jinja2.Undefined):\n def __unicode__(self):\n return self._undefined_name\n\n def __getattr__(self, name):\n return \"{}.{}\".format(self, name)\n\n def __getitem__(self, name):\n return '{}[\"{}\"]'.format(self, name)\n\n\nclass MockOS(dict):\n def __init__(self):\n self.environ = defaultdict(lambda: \"\")\n self.sep = \"/\"\n\n\ndef stub_compatible_pin(*args, **kwargs):\n return 
f\"compatible_pin {args[0]}\"\n\n\ndef stub_subpackage_pin(*args, **kwargs):\n return f\"subpackage_pin {args[0]}\"\n\n\ndef render_meta_yaml(text):\n env = jinja2.Environment(undefined=NullUndefined)\n\n # stub out cb3 jinja2 functions - they are not important for linting\n # if we don't stub them out, the ruamel.yaml load fails to interpret them\n # we can't just use conda-build's api.render functionality, because it would apply selectors\n env.globals.update(\n dict(\n compiler=lambda x: x + \"_compiler_stub\",\n pin_subpackage=stub_subpackage_pin,\n pin_compatible=stub_compatible_pin,\n cdt=lambda *args, **kwargs: \"cdt_stub\",\n load_file_regex=lambda *args, **kwargs: defaultdict(lambda: \"\"),\n datetime=datetime,\n time=time,\n target_platform=\"linux-64\",\n mpi=\"mpi\",\n )\n )\n mockos = MockOS()\n py_ver = \"3.7\"\n context = {\"os\": mockos, \"environ\": mockos.environ, \"PY_VER\": py_ver}\n content = env.from_string(text).render(context)\n return content\n\n\n@contextmanager\ndef update_conda_forge_config(forge_yaml):\n \"\"\"Utility method used to update conda forge configuration files\n\n Uage:\n >>> with update_conda_forge_config(somepath) as cfg:\n ... cfg['foo'] = 'bar'\n \"\"\"\n if os.path.exists(forge_yaml):\n with open(forge_yaml, \"r\") as fh:\n code = get_yaml().load(fh)\n else:\n code = {}\n\n # Code could come in as an empty list.\n if not code:\n code = {}\n\n yield code\n\n get_yaml().dump(code, Path(forge_yaml))\n\n\ndef merge_dict(src, dest):\n \"\"\"Recursive merge dictionary\"\"\"\n for key, value in src.items():\n if isinstance(value, dict):\n # get node or create one\n node = dest.setdefault(key, {})\n merge_dict(value, node)\n else:\n dest[key] = value\n\n return dest\n", "path": "conda_smithy/utils.py"}], "after_files": [{"content": "import shutil\nimport tempfile\nimport io\nimport jinja2\nimport datetime\nimport time\nimport os\nimport sys\nfrom pathlib import Path\nfrom collections import defaultdict\nfrom contextlib import contextmanager\n\nimport ruamel.yaml\n\n\ndef get_feedstock_name_from_meta(meta):\n \"\"\"Resolve the feedtstock name from the parsed meta.yaml.\"\"\"\n if \"feedstock-name\" in meta.meta[\"extra\"]:\n return meta.meta[\"extra\"][\"feedstock-name\"]\n elif \"parent_recipe\" in meta.meta[\"extra\"]:\n return meta.meta[\"extra\"][\"parent_recipe\"][\"name\"]\n else:\n return meta.name()\n\n\ndef get_feedstock_about_from_meta(meta) -> dict:\n \"\"\"Fetch the feedtstock about from the parsed meta.yaml.\"\"\"\n # it turns out that conda_build would not preserve the feedstock about:\n # - if a subpackage does not have about, it uses the feedstock's\n # - if a subpackage has about, it's used as is\n # therefore we need to parse the yaml again just to get the about section...\n if \"parent_recipe\" in meta.meta[\"extra\"]:\n recipe_meta = os.path.join(\n meta.meta[\"extra\"][\"parent_recipe\"][\"path\"], \"meta.yaml\"\n )\n with io.open(recipe_meta, \"rt\") as fh:\n content = render_meta_yaml(\"\".join(fh))\n meta = get_yaml().load(content)\n return dict(meta[\"about\"])\n else:\n # no parent recipe for any reason, use self's about\n return dict(meta.meta[\"about\"])\n\n\ndef get_yaml():\n # define global yaml API\n # roundrip-loader and allowing duplicate keys\n # for handling # [filter] / # [not filter]\n # Don't use a global variable for this as a global\n # variable will make conda-smithy thread unsafe.\n yaml = ruamel.yaml.YAML(typ=\"rt\")\n yaml.allow_duplicate_keys = True\n return yaml\n\n\n@contextmanager\ndef tmp_directory():\n 
tmp_dir = tempfile.mkdtemp(\"_recipe\")\n yield tmp_dir\n shutil.rmtree(tmp_dir)\n\n\nclass NullUndefined(jinja2.Undefined):\n def __str__(self):\n return self._undefined_name\n\n def __getattr__(self, name):\n return \"{}.{}\".format(self, name)\n\n def __getitem__(self, name):\n return '{}[\"{}\"]'.format(self, name)\n\n\nclass MockOS(dict):\n def __init__(self):\n self.environ = defaultdict(lambda: \"\")\n self.sep = \"/\"\n\n\ndef stub_compatible_pin(*args, **kwargs):\n return f\"compatible_pin {args[0]}\"\n\n\ndef stub_subpackage_pin(*args, **kwargs):\n return f\"subpackage_pin {args[0]}\"\n\n\ndef render_meta_yaml(text):\n env = jinja2.Environment(undefined=NullUndefined)\n\n # stub out cb3 jinja2 functions - they are not important for linting\n # if we don't stub them out, the ruamel.yaml load fails to interpret them\n # we can't just use conda-build's api.render functionality, because it would apply selectors\n env.globals.update(\n dict(\n compiler=lambda x: x + \"_compiler_stub\",\n pin_subpackage=stub_subpackage_pin,\n pin_compatible=stub_compatible_pin,\n cdt=lambda *args, **kwargs: \"cdt_stub\",\n load_file_regex=lambda *args, **kwargs: defaultdict(lambda: \"\"),\n datetime=datetime,\n time=time,\n target_platform=\"linux-64\",\n mpi=\"mpi\",\n )\n )\n mockos = MockOS()\n py_ver = \"3.7\"\n context = {\"os\": mockos, \"environ\": mockos.environ, \"PY_VER\": py_ver}\n content = env.from_string(text).render(context)\n return content\n\n\n@contextmanager\ndef update_conda_forge_config(forge_yaml):\n \"\"\"Utility method used to update conda forge configuration files\n\n Uage:\n >>> with update_conda_forge_config(somepath) as cfg:\n ... cfg['foo'] = 'bar'\n \"\"\"\n if os.path.exists(forge_yaml):\n with open(forge_yaml, \"r\") as fh:\n code = get_yaml().load(fh)\n else:\n code = {}\n\n # Code could come in as an empty list.\n if not code:\n code = {}\n\n yield code\n\n get_yaml().dump(code, Path(forge_yaml))\n\n\ndef merge_dict(src, dest):\n \"\"\"Recursive merge dictionary\"\"\"\n for key, value in src.items():\n if isinstance(value, dict):\n # get node or create one\n node = dest.setdefault(key, {})\n merge_dict(value, node)\n else:\n dest[key] = value\n\n return dest\n", "path": "conda_smithy/utils.py"}]} | 1,844 | 95 |
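The regression comes down to which dunder Python 3 consults when Jinja2 stringifies an undefined value: `jinja2.Undefined.__str__` returns an empty string, and a subclass that only defines `__unicode__` (a Python 2 convention) is never asked for the variable name. A minimal reproduction, assuming any recent Jinja2 on Python 3:

```python
import jinja2

class OnlyUnicode(jinja2.Undefined):
    def __unicode__(self):              # ignored by Python 3's str()
        return self._undefined_name

class WithStr(jinja2.Undefined):
    def __str__(self):                  # what rendering actually calls
        return self._undefined_name

template = "- {{ libjpeg }}"
print(jinja2.Environment(undefined=OnlyUnicode).from_string(template).render())  # "- "
print(jinja2.Environment(undefined=WithStr).from_string(template).render())      # "- libjpeg"
```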
gh_patches_debug_566 | rasdani/github-patches | git_diff | pex-tool__pex-797 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.0.1
On the docket:
+ [x] pex --index-url=... fails in 2.0.0 #794
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = '2.0.0'
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.0.0'
+__version__ = '2.0.1'
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = '2.0.0'\n+__version__ = '2.0.1'\n", "issue": "Release 2.0.1\nOn the docket:\r\n\r\n+ [x] pex --index-url=... fails in 2.0.0 #794\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.0'\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.1'\n", "path": "pex/version.py"}]} | 343 | 94 |
gh_patches_debug_56452 | rasdani/github-patches | git_diff | netket__netket-111 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python bindings for Jastrow machines randomly failing
I realized in #91 that once in a while the Python tests for the `Jastrow` machines fail. This issue seems related to some memory problem, but I still don't understand whether it is on the C++ side or the Python side.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import os
2 import re
3 import sys
4 import platform
5 import subprocess
6
7 from setuptools import setup, Extension
8 from setuptools.command.build_ext import build_ext
9 from distutils.version import LooseVersion
10
11
12 class CMakeExtension(Extension):
13 def __init__(self, name, sourcedir=''):
14 Extension.__init__(self, name, sources=[])
15 self.sourcedir = os.path.abspath(sourcedir)
16
17
18 class CMakeBuild(build_ext):
19 def run(self):
20 try:
21 out = subprocess.check_output(['cmake', '--version'])
22 except OSError:
23 raise RuntimeError("CMake must be installed to build the following extensions: " +
24 ", ".join(e.name for e in self.extensions))
25
26 if platform.system() == "Windows":
27 cmake_version = LooseVersion(re.search(r'version\s*([\d.]+)', out.decode()).group(1))
28 if cmake_version < '3.1.0':
29 raise RuntimeError("CMake >= 3.1.0 is required on Windows")
30
31 for ext in self.extensions:
32 self.build_extension(ext)
33
34 def build_extension(self, ext):
35 extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
36 cmake_args = ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=' + extdir,
37 '-DPYTHON_EXECUTABLE=' + sys.executable]
38
39 cfg = 'Debug' if self.debug else 'Release'
40 build_args = ['--config', cfg]
41
42 if platform.system() == "Windows":
43 cmake_args += ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_{}={}'.format(cfg.upper(), extdir)]
44 if sys.maxsize > 2**32:
45 cmake_args += ['-A', 'x64']
46 build_args += ['--', '/m']
47 else:
48 cmake_args += ['-DCMAKE_BUILD_TYPE=' + cfg]
49 build_args += ['--', '-j2']
50
51 env = os.environ.copy()
52 env['CXXFLAGS'] = '{} -DVERSION_INFO=\\"{}\\"'.format(env.get('CXXFLAGS', ''),
53 self.distribution.get_version())
54 if not os.path.exists(self.build_temp):
55 os.makedirs(self.build_temp)
56 subprocess.check_call(['cmake', ext.sourcedir] + cmake_args, cwd=self.build_temp, env=env)
57 subprocess.check_call(['cmake', '--build', '.'] + build_args, cwd=self.build_temp)
58
59 setup(
60 name='netket',
61 version='0.1',
62 author='Giuseppe Carleo et al.',
63 description='NetKet',
64 url='http://github.com/netket/netket',
65 author_email='[email protected]',
66 license='Apache',
67 ext_modules=[CMakeExtension('netket')],
68 cmdclass=dict(build_ext=CMakeBuild),
69 zip_safe=False,
70 )
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -58,7 +58,7 @@
setup(
name='netket',
- version='0.1',
+ version='2.0',
author='Giuseppe Carleo et al.',
description='NetKet',
url='http://github.com/netket/netket',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -58,7 +58,7 @@\n \n setup(\n name='netket',\n- version='0.1',\n+ version='2.0',\n author='Giuseppe Carleo et al.',\n description='NetKet',\n url='http://github.com/netket/netket',\n", "issue": "Python bindings for Jastrow machines randomly failing\nI realized in #91 that once in a while the python tests for the `Jastrow` machines fail. This issue seems related to some memory problem, but I still don't understand if it is on the c++ side or python \n", "before_files": [{"content": "import os\nimport re\nimport sys\nimport platform\nimport subprocess\n\nfrom setuptools import setup, Extension\nfrom setuptools.command.build_ext import build_ext\nfrom distutils.version import LooseVersion\n\n\nclass CMakeExtension(Extension):\n def __init__(self, name, sourcedir=''):\n Extension.__init__(self, name, sources=[])\n self.sourcedir = os.path.abspath(sourcedir)\n\n\nclass CMakeBuild(build_ext):\n def run(self):\n try:\n out = subprocess.check_output(['cmake', '--version'])\n except OSError:\n raise RuntimeError(\"CMake must be installed to build the following extensions: \" +\n \", \".join(e.name for e in self.extensions))\n\n if platform.system() == \"Windows\":\n cmake_version = LooseVersion(re.search(r'version\\s*([\\d.]+)', out.decode()).group(1))\n if cmake_version < '3.1.0':\n raise RuntimeError(\"CMake >= 3.1.0 is required on Windows\")\n\n for ext in self.extensions:\n self.build_extension(ext)\n\n def build_extension(self, ext):\n extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))\n cmake_args = ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=' + extdir,\n '-DPYTHON_EXECUTABLE=' + sys.executable]\n\n cfg = 'Debug' if self.debug else 'Release'\n build_args = ['--config', cfg]\n\n if platform.system() == \"Windows\":\n cmake_args += ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_{}={}'.format(cfg.upper(), extdir)]\n if sys.maxsize > 2**32:\n cmake_args += ['-A', 'x64']\n build_args += ['--', '/m']\n else:\n cmake_args += ['-DCMAKE_BUILD_TYPE=' + cfg]\n build_args += ['--', '-j2']\n\n env = os.environ.copy()\n env['CXXFLAGS'] = '{} -DVERSION_INFO=\\\\\"{}\\\\\"'.format(env.get('CXXFLAGS', ''),\n self.distribution.get_version())\n if not os.path.exists(self.build_temp):\n os.makedirs(self.build_temp)\n subprocess.check_call(['cmake', ext.sourcedir] + cmake_args, cwd=self.build_temp, env=env)\n subprocess.check_call(['cmake', '--build', '.'] + build_args, cwd=self.build_temp)\n\nsetup(\n name='netket',\n version='0.1',\n author='Giuseppe Carleo et al.',\n description='NetKet',\n url='http://github.com/netket/netket',\n author_email='[email protected]',\n license='Apache',\n ext_modules=[CMakeExtension('netket')],\n cmdclass=dict(build_ext=CMakeBuild),\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "import os\nimport re\nimport sys\nimport platform\nimport subprocess\n\nfrom setuptools import setup, Extension\nfrom setuptools.command.build_ext import build_ext\nfrom distutils.version import LooseVersion\n\n\nclass CMakeExtension(Extension):\n def __init__(self, name, sourcedir=''):\n Extension.__init__(self, name, sources=[])\n self.sourcedir = os.path.abspath(sourcedir)\n\n\nclass CMakeBuild(build_ext):\n def run(self):\n try:\n out = subprocess.check_output(['cmake', '--version'])\n except OSError:\n raise RuntimeError(\"CMake must be installed to build the following extensions: \" +\n \", \".join(e.name for e in self.extensions))\n\n if platform.system() == \"Windows\":\n 
cmake_version = LooseVersion(re.search(r'version\\s*([\\d.]+)', out.decode()).group(1))\n if cmake_version < '3.1.0':\n raise RuntimeError(\"CMake >= 3.1.0 is required on Windows\")\n\n for ext in self.extensions:\n self.build_extension(ext)\n\n def build_extension(self, ext):\n extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))\n cmake_args = ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=' + extdir,\n '-DPYTHON_EXECUTABLE=' + sys.executable]\n\n cfg = 'Debug' if self.debug else 'Release'\n build_args = ['--config', cfg]\n\n if platform.system() == \"Windows\":\n cmake_args += ['-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_{}={}'.format(cfg.upper(), extdir)]\n if sys.maxsize > 2**32:\n cmake_args += ['-A', 'x64']\n build_args += ['--', '/m']\n else:\n cmake_args += ['-DCMAKE_BUILD_TYPE=' + cfg]\n build_args += ['--', '-j2']\n\n env = os.environ.copy()\n env['CXXFLAGS'] = '{} -DVERSION_INFO=\\\\\"{}\\\\\"'.format(env.get('CXXFLAGS', ''),\n self.distribution.get_version())\n if not os.path.exists(self.build_temp):\n os.makedirs(self.build_temp)\n subprocess.check_call(['cmake', ext.sourcedir] + cmake_args, cwd=self.build_temp, env=env)\n subprocess.check_call(['cmake', '--build', '.'] + build_args, cwd=self.build_temp)\n\nsetup(\n name='netket',\n version='2.0',\n author='Giuseppe Carleo et al.',\n description='NetKet',\n url='http://github.com/netket/netket',\n author_email='[email protected]',\n license='Apache',\n ext_modules=[CMakeExtension('netket')],\n cmdclass=dict(build_ext=CMakeBuild),\n zip_safe=False,\n)\n", "path": "setup.py"}]} | 1,045 | 86 |
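The diff above is only the version bump for the release; the CMake build machinery in `setup.py` stays the same. For orientation, `CMakeBuild.build_extension` reduces to two CMake invocations. A sketch with illustrative paths (the real values are derived from `self.get_ext_fullpath()` and `self.build_temp` at build time):

```python
# Illustrative values only; setuptools supplies the real ones during `pip install .`.
sourcedir = "/home/user/netket"          # CMakeExtension.sourcedir
extdir = "/home/user/netket/build/lib"   # directory that must receive the built module
cmake_args = [
    f"-DCMAKE_LIBRARY_OUTPUT_DIRECTORY={extdir}",
    "-DPYTHON_EXECUTABLE=/usr/bin/python3",
    "-DCMAKE_BUILD_TYPE=Release",        # non-Windows branch
]
build_args = ["--config", "Release", "--", "-j2"]

# The two commands build_extension ends up running (with cwd=self.build_temp):
print(" ".join(["cmake", sourcedir] + cmake_args))
print(" ".join(["cmake", "--build", "."] + build_args))
```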
gh_patches_debug_20708 | rasdani/github-patches | git_diff | cfpb__consumerfinance.gov-457 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
little typo on The Bureau page
http://beta.consumerfinance.gov/the-bureau/
"Organizaitonal structure and leadership >"
should be Organizational
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `_lib/wordpress_post_processor.py`
Content:
```
1 import sys
2 import json
3 import os.path
4 import requests
5 from string import Template
6
7 import dateutil.parser
8
9 def posts_at_url(url):
10
11 current_page = 1
12 max_page = sys.maxint
13
14 while current_page <= max_page:
15
16 url = os.path.expandvars(url)
17 resp = requests.get(url, params={'page':current_page, 'count': '-1'})
18 results = json.loads(resp.content)
19 current_page += 1
20 max_page = results['pages']
21 total = 0
22 for p in results['posts']:
23 total += 1
24 yield p
25
26 def documents(name, url, **kwargs):
27
28 for post in posts_at_url(url):
29 yield process_post(post)
30
31
32 def process_post(post, newsroom = False):
33 del post['comments']
34 post['_id'] = post['slug']
35 # remove fields we're not interested in
36 if post['type'] == 'cfpb_newsroom':
37 post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_cfpb_newsroom_cat_taxonomy']]
38 elif post['type'] == 'post':
39 post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_fj_category']]
40 if post['type'] == 'watchroom':
41 post['author'] = [post['author']['name']]
42 # convert watchroom_data_x into a proper list
43 links = []
44 for x in xrange(0,10):
45 custom_fields = post['custom_fields']
46 key = 'watchroom_data_%s_link' % x
47 if key in custom_fields:
48 links.append(custom_fields[key])
49 post['links'] = links
50 else:
51 post['tags'] = [tag['title'] for tag in post['taxonomy_fj_tag']]
52 post['author'] = [author['title'] for author in
53 post['taxonomy_fj_author'] if 'Press Release' not in
54 post['category']]
55 if newsroom and post['type'] == 'post':
56 post['category'][0] = "Blog"
57 author_template = Template("$first_name $last_name")
58 dt = dateutil.parser.parse(post['date'])
59 dt_string = dt.strftime('%Y-%m-%dT%H:%M:%SZ')
60 post['date'] = dt_string
61 if 'twtr_text' in post['custom_fields']:
62 post['twtr_text'] = post['custom_fields']['twtr_text'][0]
63 if 'twtr_lang' in post['custom_fields']:
64 post['twtr_lang'] = post['custom_fields']['twtr_lang'][0]
65 if 'twtr_rel' in post['custom_fields']:
66 post['twtr_rel'] = post['custom_fields']['twtr_rel'][0]
67 if 'twtr_hash' in post['custom_fields']:
68 post['twtr_hash'] = post['custom_fields']['twtr_hash'][0]
69 return post
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/_lib/wordpress_post_processor.py b/_lib/wordpress_post_processor.py
--- a/_lib/wordpress_post_processor.py
+++ b/_lib/wordpress_post_processor.py
@@ -37,13 +37,13 @@
post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_cfpb_newsroom_cat_taxonomy']]
elif post['type'] == 'post':
post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_fj_category']]
- if post['type'] == 'watchroom':
+ if post['type'] == 'featured_topic':
post['author'] = [post['author']['name']]
- # convert watchroom_data_x into a proper list
+ # convert featured_topic_data_x into a proper list
links = []
for x in xrange(0,10):
custom_fields = post['custom_fields']
- key = 'watchroom_data_%s_link' % x
+ key = 'featured_topic_data_%s_link' % x
if key in custom_fields:
links.append(custom_fields[key])
post['links'] = links
| {"golden_diff": "diff --git a/_lib/wordpress_post_processor.py b/_lib/wordpress_post_processor.py\n--- a/_lib/wordpress_post_processor.py\n+++ b/_lib/wordpress_post_processor.py\n@@ -37,13 +37,13 @@\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_cfpb_newsroom_cat_taxonomy']]\n elif post['type'] == 'post':\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_fj_category']]\n- if post['type'] == 'watchroom':\n+ if post['type'] == 'featured_topic':\n post['author'] = [post['author']['name']]\n- # convert watchroom_data_x into a proper list\n+ # convert featured_topic_data_x into a proper list\n links = []\n for x in xrange(0,10):\n custom_fields = post['custom_fields']\n- key = 'watchroom_data_%s_link' % x\n+ key = 'featured_topic_data_%s_link' % x\n if key in custom_fields:\n links.append(custom_fields[key])\n post['links'] = links\n", "issue": "little typo on The Bureau page\nhttp://beta.consumerfinance.gov/the-bureau/\n\n\"Organizaitonal structure and leadership >\"\n\nshould be Organizational\n\n", "before_files": [{"content": "import sys\nimport json\nimport os.path\nimport requests\nfrom string import Template\n\nimport dateutil.parser\n\ndef posts_at_url(url):\n \n current_page = 1\n max_page = sys.maxint\n\n while current_page <= max_page:\n\n url = os.path.expandvars(url)\n resp = requests.get(url, params={'page':current_page, 'count': '-1'})\n results = json.loads(resp.content) \n current_page += 1\n max_page = results['pages']\n total = 0\n for p in results['posts']:\n total += 1\n yield p\n\ndef documents(name, url, **kwargs):\n \n for post in posts_at_url(url):\n yield process_post(post)\n\n\ndef process_post(post, newsroom = False):\n del post['comments']\n post['_id'] = post['slug']\n # remove fields we're not interested in\n if post['type'] == 'cfpb_newsroom':\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_cfpb_newsroom_cat_taxonomy']]\n elif post['type'] == 'post':\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_fj_category']]\n if post['type'] == 'watchroom':\n post['author'] = [post['author']['name']]\n # convert watchroom_data_x into a proper list\n links = []\n for x in xrange(0,10):\n custom_fields = post['custom_fields']\n key = 'watchroom_data_%s_link' % x\n if key in custom_fields:\n links.append(custom_fields[key])\n post['links'] = links\n else:\n post['tags'] = [tag['title'] for tag in post['taxonomy_fj_tag']]\n post['author'] = [author['title'] for author in\n post['taxonomy_fj_author'] if 'Press Release' not in\n post['category']]\n if newsroom and post['type'] == 'post':\n post['category'][0] = \"Blog\"\n author_template = Template(\"$first_name $last_name\")\n dt = dateutil.parser.parse(post['date'])\n dt_string = dt.strftime('%Y-%m-%dT%H:%M:%SZ')\n post['date'] = dt_string\n if 'twtr_text' in post['custom_fields']:\n post['twtr_text'] = post['custom_fields']['twtr_text'][0]\n if 'twtr_lang' in post['custom_fields']:\n post['twtr_lang'] = post['custom_fields']['twtr_lang'][0]\n if 'twtr_rel' in post['custom_fields']:\n post['twtr_rel'] = post['custom_fields']['twtr_rel'][0]\n if 'twtr_hash' in post['custom_fields']:\n post['twtr_hash'] = post['custom_fields']['twtr_hash'][0]\n return post\n", "path": "_lib/wordpress_post_processor.py"}], "after_files": [{"content": "import sys\nimport json\nimport os.path\nimport requests\nfrom string import Template\n\nimport dateutil.parser\n\ndef posts_at_url(url):\n \n current_page = 1\n max_page = 
sys.maxint\n\n while current_page <= max_page:\n\n url = os.path.expandvars(url)\n resp = requests.get(url, params={'page':current_page, 'count': '-1'})\n results = json.loads(resp.content) \n current_page += 1\n max_page = results['pages']\n total = 0\n for p in results['posts']:\n total += 1\n yield p\n\ndef documents(name, url, **kwargs):\n \n for post in posts_at_url(url):\n yield process_post(post)\n\n\ndef process_post(post, newsroom = False):\n del post['comments']\n post['_id'] = post['slug']\n # remove fields we're not interested in\n if post['type'] == 'cfpb_newsroom':\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_cfpb_newsroom_cat_taxonomy']]\n elif post['type'] == 'post':\n post['category'] = [cat['title'].replace('&', '&') for cat in post['taxonomy_fj_category']]\n if post['type'] == 'featured_topic':\n post['author'] = [post['author']['name']]\n # convert featured_topic_data_x into a proper list\n links = []\n for x in xrange(0,10):\n custom_fields = post['custom_fields']\n key = 'featured_topic_data_%s_link' % x\n if key in custom_fields:\n links.append(custom_fields[key])\n post['links'] = links\n else:\n post['tags'] = [tag['title'] for tag in post['taxonomy_fj_tag']]\n post['author'] = [author['title'] for author in\n post['taxonomy_fj_author'] if 'Press Release' not in\n post['category']]\n if newsroom and post['type'] == 'post':\n post['category'][0] = \"Blog\"\n author_template = Template(\"$first_name $last_name\")\n dt = dateutil.parser.parse(post['date'])\n dt_string = dt.strftime('%Y-%m-%dT%H:%M:%SZ')\n post['date'] = dt_string\n if 'twtr_text' in post['custom_fields']:\n post['twtr_text'] = post['custom_fields']['twtr_text'][0]\n if 'twtr_lang' in post['custom_fields']:\n post['twtr_lang'] = post['custom_fields']['twtr_lang'][0]\n if 'twtr_rel' in post['custom_fields']:\n post['twtr_rel'] = post['custom_fields']['twtr_rel'][0]\n if 'twtr_hash' in post['custom_fields']:\n post['twtr_hash'] = post['custom_fields']['twtr_hash'][0]\n return post\n", "path": "_lib/wordpress_post_processor.py"}]} | 1,064 | 261 |
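The rename in the patch suggests the WordPress feed now exposes the post type and custom-field keys as `featured_topic...` rather than `watchroom...`; with the old key pattern the loop would collect no links. A self-contained sketch of the extraction loop after the fix, run against a hypothetical post fragment (`range` is used here in place of the Python 2 `xrange` in the original):

```python
post = {
    "type": "featured_topic",
    "custom_fields": {
        "featured_topic_data_0_link": "https://example.org/first",
        "featured_topic_data_1_link": "https://example.org/second",
        "featured_topic_data_5_link": "https://example.org/sixth",  # gaps are simply skipped
    },
}

links = []
for x in range(0, 10):
    key = "featured_topic_data_%s_link" % x
    if key in post["custom_fields"]:
        links.append(post["custom_fields"][key])

print(links)
# ['https://example.org/first', 'https://example.org/second', 'https://example.org/sixth']
```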
gh_patches_debug_15914 | rasdani/github-patches | git_diff | keras-team__keras-8240 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
fix device names for multi_gpu_model
This PR will try to fix #8213.
`DeviceSpec.from_string(device).to_string()` is used by tensorflow. ([Ref](https://github.com/tensorflow/tensorflow/blob/40c475b48c091a70ad8061c1508dff6ded2d2af6/tensorflow/python/framework/device.py#L251))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `keras/utils/training_utils.py`
Content:
```
1 from ..layers.merge import concatenate
2 from .. import backend as K
3 from ..layers.core import Lambda
4 from ..engine.training import Model
5
6
7 def _get_available_devices():
8 from tensorflow.python.client import device_lib
9 local_device_protos = device_lib.list_local_devices()
10 return [x.name for x in local_device_protos]
11
12
13 def multi_gpu_model(model, gpus):
14 """Replicates a model on different GPUs.
15
16 Specifically, this function implements single-machine
17 multi-GPU data parallelism. It works in the following way:
18
19 - Divide the model's input(s) into multiple sub-batches.
20 - Apply a model copy on each sub-batch. Every model copy
21 is executed on a dedicated GPU.
22 - Concatenate the results (on CPU) into one big batch.
23
24 E.g. if your `batch_size` is 64 and you use `gpus=2`,
25 then we will divide the input into 2 sub-batches of 32 samples,
26 process each sub-batch on one GPU, then return the full
27 batch of 64 processed samples.
28
29 This induces quasi-linear speedup on up to 8 GPUs.
30
31 This function is only available with the TensorFlow backend
32 for the time being.
33
34 # Arguments
35 model: A Keras model instance. To avoid OOM errors,
36 this model could have been built on CPU, for instance
37 (see usage example below).
38 gpus: Integer >= 2, number of on GPUs on which to create
39 model replicas.
40
41 # Returns
42 A Keras `Model` instance which can be used just like the initial
43 `model` argument, but which distributes its workload on multiple GPUs.
44
45 # Example
46
47 ```python
48 import tensorflow as tf
49 from keras.applications import Xception
50 from keras.utils import multi_gpu_model
51 import numpy as np
52
53 num_samples = 1000
54 height = 224
55 width = 224
56 num_classes = 1000
57
58 # Instantiate the base model
59 # (here, we do it on CPU, which is optional).
60 with tf.device('/cpu:0'):
61 model = Xception(weights=None,
62 input_shape=(height, width, 3),
63 classes=num_classes)
64
65 # Replicates the model on 8 GPUs.
66 # This assumes that your machine has 8 available GPUs.
67 parallel_model = multi_gpu_model(model, gpus=8)
68 parallel_model.compile(loss='categorical_crossentropy',
69 optimizer='rmsprop')
70
71 # Generate dummy data.
72 x = np.random.random((num_samples, height, width, 3))
73 y = np.random.random((num_samples, num_classes))
74
75 # This `fit` call will be distributed on 8 GPUs.
76 # Since the batch size is 256, each GPU will process 32 samples.
77 parallel_model.fit(x, y, epochs=20, batch_size=256)
78 ```
79 """
80 if K.backend() != 'tensorflow':
81 raise ValueError('`multi_gpu_model` is only available '
82 'with the TensorFlow backend.')
83 if gpus <= 1:
84 raise ValueError('For multi-gpu usage to be effective, '
85 'call `multi_gpu_model` with `gpus >= 2`. '
86 'Received: `gpus=%d`' % gpus)
87
88 import tensorflow as tf
89
90 target_devices = ['/cpu:0'] + ['/gpu:%d' % i for i in range(gpus)]
91 available_devices = _get_available_devices()
92 for device in target_devices:
93 if device not in available_devices:
94 raise ValueError(
95 'To call `multi_gpu_model` with `gpus=%d`, '
96 'we expect the following devices to be available: %s. '
97 'However this machine only has: %s. '
98 'Try reducing `gpus`.' % (gpus,
99 target_devices,
100 available_devices))
101
102 def get_slice(data, i, parts):
103 shape = tf.shape(data)
104 batch_size = shape[:1]
105 input_shape = shape[1:]
106 step = batch_size // parts
107 if i == gpus - 1:
108 size = batch_size - step * i
109 else:
110 size = step
111 size = tf.concat([size, input_shape], axis=0)
112 stride = tf.concat([step, input_shape * 0], axis=0)
113 start = stride * i
114 return tf.slice(data, start, size)
115
116 all_outputs = []
117 for i in range(len(model.outputs)):
118 all_outputs.append([])
119
120 # Place a copy of the model on each GPU,
121 # each getting a slice of the inputs.
122 for i in range(gpus):
123 with tf.device('/gpu:%d' % i):
124 with tf.name_scope('replica_%d' % i):
125 inputs = []
126 # Retrieve a slice of the input.
127 for x in model.inputs:
128 input_shape = tuple(x.get_shape().as_list())[1:]
129 slice_i = Lambda(get_slice,
130 output_shape=input_shape,
131 arguments={'i': i,
132 'parts': gpus})(x)
133 inputs.append(slice_i)
134
135 # Apply model on slice
136 # (creating a model replica on the target device).
137 outputs = model(inputs)
138 if not isinstance(outputs, list):
139 outputs = [outputs]
140
141 # Save the outputs for merging back together later.
142 for o in range(len(outputs)):
143 all_outputs[o].append(outputs[o])
144
145 # Merge outputs on CPU.
146 with tf.device('/cpu:0'):
147 merged = []
148 for outputs in all_outputs:
149 merged.append(concatenate(outputs,
150 axis=0))
151 return Model(model.inputs, merged)
152
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/keras/utils/training_utils.py b/keras/utils/training_utils.py
--- a/keras/utils/training_utils.py
+++ b/keras/utils/training_utils.py
@@ -10,6 +10,11 @@
return [x.name for x in local_device_protos]
+def _normalize_device_name(name):
+ name = name.lower().replace('device:', '')
+ return name
+
+
def multi_gpu_model(model, gpus):
"""Replicates a model on different GPUs.
@@ -89,6 +94,7 @@
target_devices = ['/cpu:0'] + ['/gpu:%d' % i for i in range(gpus)]
available_devices = _get_available_devices()
+ available_devices = [_normalize_device_name(name) for name in available_devices]
for device in target_devices:
if device not in available_devices:
raise ValueError(
| {"golden_diff": "diff --git a/keras/utils/training_utils.py b/keras/utils/training_utils.py\n--- a/keras/utils/training_utils.py\n+++ b/keras/utils/training_utils.py\n@@ -10,6 +10,11 @@\n return [x.name for x in local_device_protos]\n \n \n+def _normalize_device_name(name):\n+ name = name.lower().replace('device:', '')\n+ return name\n+\n+\n def multi_gpu_model(model, gpus):\n \"\"\"Replicates a model on different GPUs.\n \n@@ -89,6 +94,7 @@\n \n target_devices = ['/cpu:0'] + ['/gpu:%d' % i for i in range(gpus)]\n available_devices = _get_available_devices()\n+ available_devices = [_normalize_device_name(name) for name in available_devices]\n for device in target_devices:\n if device not in available_devices:\n raise ValueError(\n", "issue": "fix device names for multi_gpu_model\nThis PR will try to fix #8213.\r\n\r\n`DeviceSpec.from_string(device).to_string()` is used by tensorflow. ([Ref](https://github.com/tensorflow/tensorflow/blob/40c475b48c091a70ad8061c1508dff6ded2d2af6/tensorflow/python/framework/device.py#L251))\n", "before_files": [{"content": "from ..layers.merge import concatenate\nfrom .. import backend as K\nfrom ..layers.core import Lambda\nfrom ..engine.training import Model\n\n\ndef _get_available_devices():\n from tensorflow.python.client import device_lib\n local_device_protos = device_lib.list_local_devices()\n return [x.name for x in local_device_protos]\n\n\ndef multi_gpu_model(model, gpus):\n \"\"\"Replicates a model on different GPUs.\n\n Specifically, this function implements single-machine\n multi-GPU data parallelism. It works in the following way:\n\n - Divide the model's input(s) into multiple sub-batches.\n - Apply a model copy on each sub-batch. Every model copy\n is executed on a dedicated GPU.\n - Concatenate the results (on CPU) into one big batch.\n\n E.g. if your `batch_size` is 64 and you use `gpus=2`,\n then we will divide the input into 2 sub-batches of 32 samples,\n process each sub-batch on one GPU, then return the full\n batch of 64 processed samples.\n\n This induces quasi-linear speedup on up to 8 GPUs.\n\n This function is only available with the TensorFlow backend\n for the time being.\n\n # Arguments\n model: A Keras model instance. 
To avoid OOM errors,\n this model could have been built on CPU, for instance\n (see usage example below).\n gpus: Integer >= 2, number of on GPUs on which to create\n model replicas.\n\n # Returns\n A Keras `Model` instance which can be used just like the initial\n `model` argument, but which distributes its workload on multiple GPUs.\n\n # Example\n\n ```python\n import tensorflow as tf\n from keras.applications import Xception\n from keras.utils import multi_gpu_model\n import numpy as np\n\n num_samples = 1000\n height = 224\n width = 224\n num_classes = 1000\n\n # Instantiate the base model\n # (here, we do it on CPU, which is optional).\n with tf.device('/cpu:0'):\n model = Xception(weights=None,\n input_shape=(height, width, 3),\n classes=num_classes)\n\n # Replicates the model on 8 GPUs.\n # This assumes that your machine has 8 available GPUs.\n parallel_model = multi_gpu_model(model, gpus=8)\n parallel_model.compile(loss='categorical_crossentropy',\n optimizer='rmsprop')\n\n # Generate dummy data.\n x = np.random.random((num_samples, height, width, 3))\n y = np.random.random((num_samples, num_classes))\n\n # This `fit` call will be distributed on 8 GPUs.\n # Since the batch size is 256, each GPU will process 32 samples.\n parallel_model.fit(x, y, epochs=20, batch_size=256)\n ```\n \"\"\"\n if K.backend() != 'tensorflow':\n raise ValueError('`multi_gpu_model` is only available '\n 'with the TensorFlow backend.')\n if gpus <= 1:\n raise ValueError('For multi-gpu usage to be effective, '\n 'call `multi_gpu_model` with `gpus >= 2`. '\n 'Received: `gpus=%d`' % gpus)\n\n import tensorflow as tf\n\n target_devices = ['/cpu:0'] + ['/gpu:%d' % i for i in range(gpus)]\n available_devices = _get_available_devices()\n for device in target_devices:\n if device not in available_devices:\n raise ValueError(\n 'To call `multi_gpu_model` with `gpus=%d`, '\n 'we expect the following devices to be available: %s. '\n 'However this machine only has: %s. '\n 'Try reducing `gpus`.' % (gpus,\n target_devices,\n available_devices))\n\n def get_slice(data, i, parts):\n shape = tf.shape(data)\n batch_size = shape[:1]\n input_shape = shape[1:]\n step = batch_size // parts\n if i == gpus - 1:\n size = batch_size - step * i\n else:\n size = step\n size = tf.concat([size, input_shape], axis=0)\n stride = tf.concat([step, input_shape * 0], axis=0)\n start = stride * i\n return tf.slice(data, start, size)\n\n all_outputs = []\n for i in range(len(model.outputs)):\n all_outputs.append([])\n\n # Place a copy of the model on each GPU,\n # each getting a slice of the inputs.\n for i in range(gpus):\n with tf.device('/gpu:%d' % i):\n with tf.name_scope('replica_%d' % i):\n inputs = []\n # Retrieve a slice of the input.\n for x in model.inputs:\n input_shape = tuple(x.get_shape().as_list())[1:]\n slice_i = Lambda(get_slice,\n output_shape=input_shape,\n arguments={'i': i,\n 'parts': gpus})(x)\n inputs.append(slice_i)\n\n # Apply model on slice\n # (creating a model replica on the target device).\n outputs = model(inputs)\n if not isinstance(outputs, list):\n outputs = [outputs]\n\n # Save the outputs for merging back together later.\n for o in range(len(outputs)):\n all_outputs[o].append(outputs[o])\n\n # Merge outputs on CPU.\n with tf.device('/cpu:0'):\n merged = []\n for outputs in all_outputs:\n merged.append(concatenate(outputs,\n axis=0))\n return Model(model.inputs, merged)\n", "path": "keras/utils/training_utils.py"}], "after_files": [{"content": "from ..layers.merge import concatenate\nfrom .. 
import backend as K\nfrom ..layers.core import Lambda\nfrom ..engine.training import Model\n\n\ndef _get_available_devices():\n from tensorflow.python.client import device_lib\n local_device_protos = device_lib.list_local_devices()\n return [x.name for x in local_device_protos]\n\n\ndef _normalize_device_name(name):\n name = name.lower().replace('device:', '')\n return name\n\n\ndef multi_gpu_model(model, gpus):\n \"\"\"Replicates a model on different GPUs.\n\n Specifically, this function implements single-machine\n multi-GPU data parallelism. It works in the following way:\n\n - Divide the model's input(s) into multiple sub-batches.\n - Apply a model copy on each sub-batch. Every model copy\n is executed on a dedicated GPU.\n - Concatenate the results (on CPU) into one big batch.\n\n E.g. if your `batch_size` is 64 and you use `gpus=2`,\n then we will divide the input into 2 sub-batches of 32 samples,\n process each sub-batch on one GPU, then return the full\n batch of 64 processed samples.\n\n This induces quasi-linear speedup on up to 8 GPUs.\n\n This function is only available with the TensorFlow backend\n for the time being.\n\n # Arguments\n model: A Keras model instance. To avoid OOM errors,\n this model could have been built on CPU, for instance\n (see usage example below).\n gpus: Integer >= 2, number of on GPUs on which to create\n model replicas.\n\n # Returns\n A Keras `Model` instance which can be used just like the initial\n `model` argument, but which distributes its workload on multiple GPUs.\n\n # Example\n\n ```python\n import tensorflow as tf\n from keras.applications import Xception\n from keras.utils import multi_gpu_model\n import numpy as np\n\n num_samples = 1000\n height = 224\n width = 224\n num_classes = 1000\n\n # Instantiate the base model\n # (here, we do it on CPU, which is optional).\n with tf.device('/cpu:0'):\n model = Xception(weights=None,\n input_shape=(height, width, 3),\n classes=num_classes)\n\n # Replicates the model on 8 GPUs.\n # This assumes that your machine has 8 available GPUs.\n parallel_model = multi_gpu_model(model, gpus=8)\n parallel_model.compile(loss='categorical_crossentropy',\n optimizer='rmsprop')\n\n # Generate dummy data.\n x = np.random.random((num_samples, height, width, 3))\n y = np.random.random((num_samples, num_classes))\n\n # This `fit` call will be distributed on 8 GPUs.\n # Since the batch size is 256, each GPU will process 32 samples.\n parallel_model.fit(x, y, epochs=20, batch_size=256)\n ```\n \"\"\"\n if K.backend() != 'tensorflow':\n raise ValueError('`multi_gpu_model` is only available '\n 'with the TensorFlow backend.')\n if gpus <= 1:\n raise ValueError('For multi-gpu usage to be effective, '\n 'call `multi_gpu_model` with `gpus >= 2`. '\n 'Received: `gpus=%d`' % gpus)\n\n import tensorflow as tf\n\n target_devices = ['/cpu:0'] + ['/gpu:%d' % i for i in range(gpus)]\n available_devices = _get_available_devices()\n available_devices = [_normalize_device_name(name) for name in available_devices]\n for device in target_devices:\n if device not in available_devices:\n raise ValueError(\n 'To call `multi_gpu_model` with `gpus=%d`, '\n 'we expect the following devices to be available: %s. '\n 'However this machine only has: %s. '\n 'Try reducing `gpus`.' 
% (gpus,\n target_devices,\n available_devices))\n\n def get_slice(data, i, parts):\n shape = tf.shape(data)\n batch_size = shape[:1]\n input_shape = shape[1:]\n step = batch_size // parts\n if i == gpus - 1:\n size = batch_size - step * i\n else:\n size = step\n size = tf.concat([size, input_shape], axis=0)\n stride = tf.concat([step, input_shape * 0], axis=0)\n start = stride * i\n return tf.slice(data, start, size)\n\n all_outputs = []\n for i in range(len(model.outputs)):\n all_outputs.append([])\n\n # Place a copy of the model on each GPU,\n # each getting a slice of the inputs.\n for i in range(gpus):\n with tf.device('/gpu:%d' % i):\n with tf.name_scope('replica_%d' % i):\n inputs = []\n # Retrieve a slice of the input.\n for x in model.inputs:\n input_shape = tuple(x.get_shape().as_list())[1:]\n slice_i = Lambda(get_slice,\n output_shape=input_shape,\n arguments={'i': i,\n 'parts': gpus})(x)\n inputs.append(slice_i)\n\n # Apply model on slice\n # (creating a model replica on the target device).\n outputs = model(inputs)\n if not isinstance(outputs, list):\n outputs = [outputs]\n\n # Save the outputs for merging back together later.\n for o in range(len(outputs)):\n all_outputs[o].append(outputs[o])\n\n # Merge outputs on CPU.\n with tf.device('/cpu:0'):\n merged = []\n for outputs in all_outputs:\n merged.append(concatenate(outputs,\n axis=0))\n return Model(model.inputs, merged)\n", "path": "keras/utils/training_utils.py"}]} | 1,957 | 201 |
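
The fix in the row above hinges on a naming mismatch: `device_lib.list_local_devices()` typically reports names such as `/device:GPU:0`, while `multi_gpu_model` builds target strings such as `/gpu:0`, so the membership check fails even when the GPUs exist. Below is a minimal, self-contained sketch of the normalization step; the example device strings are assumptions chosen for illustration, not values taken from the row.

```python
# Minimal sketch of the normalization added in the keras patch above.
# The device strings below are illustrative assumptions, not dataset values.

def _normalize_device_name(name):
    # Lower-case and drop the "device:" prefix so that "/device:GPU:0"
    # compares equal to the "/gpu:0"-style target strings.
    return name.lower().replace('device:', '')

for raw in ['/device:CPU:0', '/device:GPU:0', '/device:GPU:1']:
    print(raw, '->', _normalize_device_name(raw))
# /device:CPU:0 -> /cpu:0
# /device:GPU:0 -> /gpu:0
# /device:GPU:1 -> /gpu:1
```

Note that the patch applies this normalization only to the detected devices; the hand-built target strings (`/cpu:0`, `/gpu:%d`) are already lowercase.
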
gh_patches_debug_12524 | rasdani/github-patches | git_diff | python-telegram-bot__python-telegram-bot-1734 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python supported versions need to be updated in docs
Not really a bug, but for example it still says Python 3.4 is supported in readme and setup explanations.
Copy-pasting from https://pypi.org/project/python-telegram-bot/ :
> This library provides a pure Python interface for the Telegram Bot API. It’s compatible with Python versions 2.7, 3.3+ and PyPy.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 """The setup and build script for the python-telegram-bot library."""
3
4 import codecs
5 import os
6 from setuptools import setup, find_packages
7
8
9 def requirements():
10 """Build the requirements list for this project"""
11 requirements_list = []
12
13 with open('requirements.txt') as requirements:
14 for install in requirements:
15 requirements_list.append(install.strip())
16
17 return requirements_list
18
19
20 packages = find_packages(exclude=['tests*'])
21
22 with codecs.open('README.rst', 'r', 'utf-8') as fd:
23 fn = os.path.join('telegram', 'version.py')
24 with open(fn) as fh:
25 code = compile(fh.read(), fn, 'exec')
26 exec(code)
27
28 setup(name='python-telegram-bot',
29 version=__version__,
30 author='Leandro Toledo',
31 author_email='[email protected]',
32 license='LGPLv3',
33 url='https://python-telegram-bot.org/',
34 keywords='python telegram bot api wrapper',
35 description="We have made you a wrapper you can't refuse",
36 long_description=fd.read(),
37 packages=packages,
38 install_requires=requirements(),
39 extras_require={
40 'json': 'ujson',
41 'socks': 'PySocks'
42 },
43 include_package_data=True,
44 classifiers=[
45 'Development Status :: 5 - Production/Stable',
46 'Intended Audience :: Developers',
47 'License :: OSI Approved :: GNU Lesser General Public License v3 (LGPLv3)',
48 'Operating System :: OS Independent',
49 'Topic :: Software Development :: Libraries :: Python Modules',
50 'Topic :: Communications :: Chat',
51 'Topic :: Internet',
52 'Programming Language :: Python',
53 'Programming Language :: Python :: 2',
54 'Programming Language :: Python :: 2.7',
55 'Programming Language :: Python :: 3',
56 'Programming Language :: Python :: 3.4',
57 'Programming Language :: Python :: 3.5',
58 'Programming Language :: Python :: 3.6',
59 'Programming Language :: Python :: 3.7'
60 ],)
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -50,10 +50,7 @@
'Topic :: Communications :: Chat',
'Topic :: Internet',
'Programming Language :: Python',
- 'Programming Language :: Python :: 2',
- 'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
- 'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7'
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -50,10 +50,7 @@\n 'Topic :: Communications :: Chat',\n 'Topic :: Internet',\n 'Programming Language :: Python',\n- 'Programming Language :: Python :: 2',\n- 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n- 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7'\n", "issue": "Python supported versions need to be updated in docs\nNot really a bug, but for example it still says Python 3.4 is supported in readme and setup explanations.\r\n\r\nCopy-pasting from https://pypi.org/project/python-telegram-bot/ :\r\n\r\n> This library provides a pure Python interface for the Telegram Bot API. It\u2019s compatible with Python versions 2.7, 3.3+ and PyPy.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n\"\"\"The setup and build script for the python-telegram-bot library.\"\"\"\n\nimport codecs\nimport os\nfrom setuptools import setup, find_packages\n\n\ndef requirements():\n \"\"\"Build the requirements list for this project\"\"\"\n requirements_list = []\n\n with open('requirements.txt') as requirements:\n for install in requirements:\n requirements_list.append(install.strip())\n\n return requirements_list\n\n\npackages = find_packages(exclude=['tests*'])\n\nwith codecs.open('README.rst', 'r', 'utf-8') as fd:\n fn = os.path.join('telegram', 'version.py')\n with open(fn) as fh:\n code = compile(fh.read(), fn, 'exec')\n exec(code)\n\n setup(name='python-telegram-bot',\n version=__version__,\n author='Leandro Toledo',\n author_email='[email protected]',\n license='LGPLv3',\n url='https://python-telegram-bot.org/',\n keywords='python telegram bot api wrapper',\n description=\"We have made you a wrapper you can't refuse\",\n long_description=fd.read(),\n packages=packages,\n install_requires=requirements(),\n extras_require={\n 'json': 'ujson',\n 'socks': 'PySocks'\n },\n include_package_data=True,\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU Lesser General Public License v3 (LGPLv3)',\n 'Operating System :: OS Independent',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n 'Topic :: Communications :: Chat',\n 'Topic :: Internet',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7'\n ],)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\"\"\"The setup and build script for the python-telegram-bot library.\"\"\"\n\nimport codecs\nimport os\nfrom setuptools import setup, find_packages\n\n\ndef requirements():\n \"\"\"Build the requirements list for this project\"\"\"\n requirements_list = []\n\n with open('requirements.txt') as requirements:\n for install in requirements:\n requirements_list.append(install.strip())\n\n return requirements_list\n\n\npackages = find_packages(exclude=['tests*'])\n\nwith codecs.open('README.rst', 'r', 'utf-8') as fd:\n fn = os.path.join('telegram', 'version.py')\n with open(fn) as fh:\n code = compile(fh.read(), fn, 'exec')\n exec(code)\n\n setup(name='python-telegram-bot',\n version=__version__,\n author='Leandro 
Toledo',\n author_email='[email protected]',\n license='LGPLv3',\n url='https://python-telegram-bot.org/',\n keywords='python telegram bot api wrapper',\n description=\"We have made you a wrapper you can't refuse\",\n long_description=fd.read(),\n packages=packages,\n install_requires=requirements(),\n extras_require={\n 'json': 'ujson',\n 'socks': 'PySocks'\n },\n include_package_data=True,\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU Lesser General Public License v3 (LGPLv3)',\n 'Operating System :: OS Independent',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n 'Topic :: Communications :: Chat',\n 'Topic :: Internet',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7'\n ],)\n", "path": "setup.py"}]} | 904 | 137 |
gh_patches_debug_17838 | rasdani/github-patches | git_diff | voxel51__fiftyone-1283 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[FR] update opencv-python-headless
### Proposal Summary
Currently this repo requires opencv-python-headless<=4.4.0.46. To cut a long story short there are no wheels available for python3.9 and I am unable to install fiftyone (I am using docker `image: jupyter/scipy-notebook:latest`). However version `4.5.3.56` is available for install without issue, and I propose updating the requirement for this dependency.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 """
3 Installs FiftyOne.
4
5 | Copyright 2017-2021, Voxel51, Inc.
6 | `voxel51.com <https://voxel51.com/>`_
7 |
8 """
9 import os
10 from setuptools import setup, find_packages
11 from wheel.bdist_wheel import bdist_wheel
12
13
14 class BdistWheelCustom(bdist_wheel):
15 def finalize_options(self):
16 bdist_wheel.finalize_options(self)
17 # make just the wheel require these packages, since they aren't needed
18 # for a development installation
19 self.distribution.install_requires += [
20 "fiftyone-brain>=0.7,<0.8",
21 "fiftyone-db>=0.3,<0.4",
22 ]
23
24
25 VERSION = "0.13.2"
26
27
28 def get_version():
29 if "RELEASE_VERSION" in os.environ:
30 version = os.environ["RELEASE_VERSION"]
31 if not version.startswith(VERSION):
32 raise ValueError(
33 "Release version does not match version: %s and %s"
34 % (version, VERSION)
35 )
36 return version
37
38 return VERSION
39
40
41 EXTRAS_REQUIREMENTS = {"desktop": ["fiftyone-desktop>=0.16,<0.17"]}
42
43
44 with open("README.md", "r") as fh:
45 long_description = fh.read()
46
47
48 setup(
49 name="fiftyone",
50 version=get_version(),
51 description=(
52 "FiftyOne: the open-source tool for building high-quality datasets "
53 "and computer vision models"
54 ),
55 author="Voxel51, Inc.",
56 author_email="[email protected]",
57 url="https://github.com/voxel51/fiftyone",
58 extras_require=EXTRAS_REQUIREMENTS,
59 license="Apache",
60 long_description=long_description,
61 long_description_content_type="text/markdown",
62 packages=find_packages() + ["fiftyone.recipes", "fiftyone.tutorials"],
63 package_dir={
64 "fiftyone.recipes": "docs/source/recipes",
65 "fiftyone.tutorials": "docs/source/tutorials",
66 },
67 include_package_data=True,
68 install_requires=[
69 # third-party packages
70 "argcomplete",
71 "boto3",
72 "Deprecated",
73 "eventlet",
74 "future",
75 "Jinja2",
76 "kaleido",
77 "matplotlib",
78 "mongoengine==0.20.0",
79 "motor>=2.3,<3",
80 "numpy",
81 "packaging",
82 "pandas",
83 "Pillow>=6.2",
84 "plotly>=4.14,<5",
85 "pprintpp",
86 "psutil",
87 "pymongo>=3.11,<4",
88 "PyYAML",
89 "retrying",
90 "scikit-learn",
91 "scikit-image",
92 "setuptools",
93 "tabulate",
94 "tornado>=5.1.1,<7",
95 "xmltodict",
96 "universal-analytics-python3>=1.0.1,<2",
97 # internal packages
98 "voxel51-eta>=0.5.2,<0.6",
99 # ETA dependency - restricted to a maximum version known to provide
100 # wheels here because it tends to publish sdists several hours before
101 # wheels. When users install FiftyOne in this window, they will need to
102 # compile OpenCV from source, leading to either errors or a
103 # time-consuming installation.
104 "opencv-python-headless<=4.4.0.46",
105 ],
106 classifiers=[
107 "Development Status :: 4 - Beta",
108 "Intended Audience :: Developers",
109 "Intended Audience :: Science/Research",
110 "License :: OSI Approved :: Apache Software License",
111 "Topic :: Scientific/Engineering :: Artificial Intelligence",
112 "Topic :: Scientific/Engineering :: Image Processing",
113 "Topic :: Scientific/Engineering :: Image Recognition",
114 "Topic :: Scientific/Engineering :: Information Analysis",
115 "Topic :: Scientific/Engineering :: Visualization",
116 "Operating System :: MacOS :: MacOS X",
117 "Operating System :: POSIX :: Linux",
118 "Operating System :: Microsoft :: Windows",
119 "Programming Language :: Python :: 3",
120 "Programming Language :: Python :: 3.6",
121 "Programming Language :: Python :: 3.7",
122 "Programming Language :: Python :: 3.8",
123 "Programming Language :: Python :: 3.9",
124 ],
125 entry_points={"console_scripts": ["fiftyone=fiftyone.core.cli:main"]},
126 python_requires=">=3.6",
127 cmdclass={"bdist_wheel": BdistWheelCustom},
128 )
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -78,6 +78,7 @@
"mongoengine==0.20.0",
"motor>=2.3,<3",
"numpy",
+ "opencv-python-headless",
"packaging",
"pandas",
"Pillow>=6.2",
@@ -96,12 +97,6 @@
"universal-analytics-python3>=1.0.1,<2",
# internal packages
"voxel51-eta>=0.5.2,<0.6",
- # ETA dependency - restricted to a maximum version known to provide
- # wheels here because it tends to publish sdists several hours before
- # wheels. When users install FiftyOne in this window, they will need to
- # compile OpenCV from source, leading to either errors or a
- # time-consuming installation.
- "opencv-python-headless<=4.4.0.46",
],
classifiers=[
"Development Status :: 4 - Beta",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -78,6 +78,7 @@\n \"mongoengine==0.20.0\",\n \"motor>=2.3,<3\",\n \"numpy\",\n+ \"opencv-python-headless\",\n \"packaging\",\n \"pandas\",\n \"Pillow>=6.2\",\n@@ -96,12 +97,6 @@\n \"universal-analytics-python3>=1.0.1,<2\",\n # internal packages\n \"voxel51-eta>=0.5.2,<0.6\",\n- # ETA dependency - restricted to a maximum version known to provide\n- # wheels here because it tends to publish sdists several hours before\n- # wheels. When users install FiftyOne in this window, they will need to\n- # compile OpenCV from source, leading to either errors or a\n- # time-consuming installation.\n- \"opencv-python-headless<=4.4.0.46\",\n ],\n classifiers=[\n \"Development Status :: 4 - Beta\",\n", "issue": "[FR] update opencv-python-headless\n### Proposal Summary\r\nCurrently this repo requires opencv-python-headless<=4.4.0.46. To cut a long story short there are no wheels available for python3.9 and I am unable to install fiftyone (I am using docker `image: jupyter/scipy-notebook:latest`). However version `4.5.3.56` is available for install without issue, and I propose updating the requirement for this dependency.\n", "before_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nInstalls FiftyOne.\n\n| Copyright 2017-2021, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nimport os\nfrom setuptools import setup, find_packages\nfrom wheel.bdist_wheel import bdist_wheel\n\n\nclass BdistWheelCustom(bdist_wheel):\n def finalize_options(self):\n bdist_wheel.finalize_options(self)\n # make just the wheel require these packages, since they aren't needed\n # for a development installation\n self.distribution.install_requires += [\n \"fiftyone-brain>=0.7,<0.8\",\n \"fiftyone-db>=0.3,<0.4\",\n ]\n\n\nVERSION = \"0.13.2\"\n\n\ndef get_version():\n if \"RELEASE_VERSION\" in os.environ:\n version = os.environ[\"RELEASE_VERSION\"]\n if not version.startswith(VERSION):\n raise ValueError(\n \"Release version does not match version: %s and %s\"\n % (version, VERSION)\n )\n return version\n\n return VERSION\n\n\nEXTRAS_REQUIREMENTS = {\"desktop\": [\"fiftyone-desktop>=0.16,<0.17\"]}\n\n\nwith open(\"README.md\", \"r\") as fh:\n long_description = fh.read()\n\n\nsetup(\n name=\"fiftyone\",\n version=get_version(),\n description=(\n \"FiftyOne: the open-source tool for building high-quality datasets \"\n \"and computer vision models\"\n ),\n author=\"Voxel51, Inc.\",\n author_email=\"[email protected]\",\n url=\"https://github.com/voxel51/fiftyone\",\n extras_require=EXTRAS_REQUIREMENTS,\n license=\"Apache\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n packages=find_packages() + [\"fiftyone.recipes\", \"fiftyone.tutorials\"],\n package_dir={\n \"fiftyone.recipes\": \"docs/source/recipes\",\n \"fiftyone.tutorials\": \"docs/source/tutorials\",\n },\n include_package_data=True,\n install_requires=[\n # third-party packages\n \"argcomplete\",\n \"boto3\",\n \"Deprecated\",\n \"eventlet\",\n \"future\",\n \"Jinja2\",\n \"kaleido\",\n \"matplotlib\",\n \"mongoengine==0.20.0\",\n \"motor>=2.3,<3\",\n \"numpy\",\n \"packaging\",\n \"pandas\",\n \"Pillow>=6.2\",\n \"plotly>=4.14,<5\",\n \"pprintpp\",\n \"psutil\",\n \"pymongo>=3.11,<4\",\n \"PyYAML\",\n \"retrying\",\n \"scikit-learn\",\n \"scikit-image\",\n \"setuptools\",\n \"tabulate\",\n \"tornado>=5.1.1,<7\",\n \"xmltodict\",\n \"universal-analytics-python3>=1.0.1,<2\",\n # internal packages\n \"voxel51-eta>=0.5.2,<0.6\",\n # 
ETA dependency - restricted to a maximum version known to provide\n # wheels here because it tends to publish sdists several hours before\n # wheels. When users install FiftyOne in this window, they will need to\n # compile OpenCV from source, leading to either errors or a\n # time-consuming installation.\n \"opencv-python-headless<=4.4.0.46\",\n ],\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n \"Topic :: Scientific/Engineering :: Image Processing\",\n \"Topic :: Scientific/Engineering :: Image Recognition\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n \"Topic :: Scientific/Engineering :: Visualization\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n entry_points={\"console_scripts\": [\"fiftyone=fiftyone.core.cli:main\"]},\n python_requires=\">=3.6\",\n cmdclass={\"bdist_wheel\": BdistWheelCustom},\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nInstalls FiftyOne.\n\n| Copyright 2017-2021, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nimport os\nfrom setuptools import setup, find_packages\nfrom wheel.bdist_wheel import bdist_wheel\n\n\nclass BdistWheelCustom(bdist_wheel):\n def finalize_options(self):\n bdist_wheel.finalize_options(self)\n # make just the wheel require these packages, since they aren't needed\n # for a development installation\n self.distribution.install_requires += [\n \"fiftyone-brain>=0.7,<0.8\",\n \"fiftyone-db>=0.3,<0.4\",\n ]\n\n\nVERSION = \"0.13.2\"\n\n\ndef get_version():\n if \"RELEASE_VERSION\" in os.environ:\n version = os.environ[\"RELEASE_VERSION\"]\n if not version.startswith(VERSION):\n raise ValueError(\n \"Release version does not match version: %s and %s\"\n % (version, VERSION)\n )\n return version\n\n return VERSION\n\n\nEXTRAS_REQUIREMENTS = {\"desktop\": [\"fiftyone-desktop>=0.16,<0.17\"]}\n\n\nwith open(\"README.md\", \"r\") as fh:\n long_description = fh.read()\n\n\nsetup(\n name=\"fiftyone\",\n version=get_version(),\n description=(\n \"FiftyOne: the open-source tool for building high-quality datasets \"\n \"and computer vision models\"\n ),\n author=\"Voxel51, Inc.\",\n author_email=\"[email protected]\",\n url=\"https://github.com/voxel51/fiftyone\",\n extras_require=EXTRAS_REQUIREMENTS,\n license=\"Apache\",\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n packages=find_packages() + [\"fiftyone.recipes\", \"fiftyone.tutorials\"],\n package_dir={\n \"fiftyone.recipes\": \"docs/source/recipes\",\n \"fiftyone.tutorials\": \"docs/source/tutorials\",\n },\n include_package_data=True,\n install_requires=[\n # third-party packages\n \"argcomplete\",\n \"boto3\",\n \"Deprecated\",\n \"eventlet\",\n \"future\",\n \"Jinja2\",\n \"kaleido\",\n \"matplotlib\",\n \"mongoengine==0.20.0\",\n \"motor>=2.3,<3\",\n \"numpy\",\n \"opencv-python-headless\",\n \"packaging\",\n \"pandas\",\n \"Pillow>=6.2\",\n \"plotly>=4.14,<5\",\n \"pprintpp\",\n \"psutil\",\n \"pymongo>=3.11,<4\",\n \"PyYAML\",\n \"retrying\",\n 
\"scikit-learn\",\n \"scikit-image\",\n \"setuptools\",\n \"tabulate\",\n \"tornado>=5.1.1,<7\",\n \"xmltodict\",\n \"universal-analytics-python3>=1.0.1,<2\",\n # internal packages\n \"voxel51-eta>=0.5.2,<0.6\",\n ],\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n \"Topic :: Scientific/Engineering :: Image Processing\",\n \"Topic :: Scientific/Engineering :: Image Recognition\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n \"Topic :: Scientific/Engineering :: Visualization\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n entry_points={\"console_scripts\": [\"fiftyone=fiftyone.core.cli:main\"]},\n python_requires=\">=3.6\",\n cmdclass={\"bdist_wheel\": BdistWheelCustom},\n)\n", "path": "setup.py"}]} | 1,643 | 242 |
gh_patches_debug_7459 | rasdani/github-patches | git_diff | rlworkgroup__garage-1558 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Backport #1554
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/garage/tf/plotter/plotter.py`
Content:
```
1 import atexit
2 from collections import namedtuple
3 from enum import Enum
4 import platform
5 from queue import Queue
6 from threading import Thread
7
8 import numpy as np
9 import tensorflow as tf
10
11 from garage.sampler.utils import rollout as default_rollout
12
13 __all__ = ['Plotter']
14
15
16 class Op(Enum):
17 STOP = 0
18 UPDATE = 1
19 DEMO = 2
20
21
22 Message = namedtuple('Message', ['op', 'args', 'kwargs'])
23
24
25 class Plotter:
26
27 # Static variable used to disable the plotter
28 enable = True
29 # List containing all plotters instantiated in the process
30 __plotters = []
31
32 def __init__(self,
33 env,
34 policy,
35 sess=None,
36 graph=None,
37 rollout=default_rollout):
38 Plotter.__plotters.append(self)
39 self.env = env
40 self.sess = tf.compat.v1.get_default_session(
41 ) if sess is None else sess
42 self.graph = tf.compat.v1.get_default_graph(
43 ) if graph is None else graph
44 with self.sess.as_default(), self.graph.as_default():
45 self.policy = policy.clone('plotter_policy')
46 self.rollout = rollout
47 self.worker_thread = Thread(target=self._start_worker, daemon=True)
48 self.queue = Queue()
49
50 # Needed in order to draw glfw window on the main thread
51 if ('Darwin' in platform.platform()):
52 self.rollout(env,
53 policy,
54 max_path_length=np.inf,
55 animated=True,
56 speedup=5)
57
58 def _start_worker(self):
59 env = None
60 policy = None
61 max_length = None
62 initial_rollout = True
63 try:
64 with self.sess.as_default(), self.sess.graph.as_default():
65 # Each iteration will process ALL messages currently in the
66 # queue
67 while True:
68 msgs = {}
69 # If true, block and yield processor
70 if initial_rollout:
71 msg = self.queue.get()
72 msgs[msg.op] = msg
73 # Only fetch the last message of each type
74 while not self.queue.empty():
75 msg = self.queue.get()
76 msgs[msg.op] = msg
77 else:
78 # Only fetch the last message of each type
79 while not self.queue.empty():
80 msg = self.queue.get_nowait()
81 msgs[msg.op] = msg
82
83 if Op.STOP in msgs:
84 self.queue.task_done()
85 break
86 if Op.UPDATE in msgs:
87 env, policy = msgs[Op.UPDATE].args
88 self.queue.task_done()
89 if Op.DEMO in msgs:
90 param_values, max_length = msgs[Op.DEMO].args
91 policy.set_param_values(param_values)
92 initial_rollout = False
93 self.rollout(env,
94 policy,
95 max_path_length=max_length,
96 animated=True,
97 speedup=5)
98 self.queue.task_done()
99 else:
100 if max_length:
101 self.rollout(env,
102 policy,
103 max_path_length=max_length,
104 animated=True,
105 speedup=5)
106 except KeyboardInterrupt:
107 pass
108
109 def close(self):
110 if self.worker_thread.is_alive():
111 while not self.queue.empty():
112 self.queue.get()
113 self.queue.task_done()
114 self.queue.put(Message(op=Op.STOP, args=None, kwargs=None))
115 self.queue.join()
116 self.worker_thread.join()
117
118 @staticmethod
119 def disable():
120 """Disable all instances of the Plotter class."""
121 Plotter.enable = False
122
123 @staticmethod
124 def get_plotters():
125 return Plotter.__plotters
126
127 def start(self):
128 if not Plotter.enable:
129 return
130 if not self.worker_thread.is_alive():
131 tf.compat.v1.get_variable_scope().reuse_variables()
132 self.worker_thread.start()
133 self.queue.put(
134 Message(op=Op.UPDATE,
135 args=(self.env, self.policy),
136 kwargs=None))
137 atexit.register(self.close)
138
139 def update_plot(self, policy, max_length=np.inf):
140 if not Plotter.enable:
141 return
142 if self.worker_thread.is_alive():
143 self.queue.put(
144 Message(op=Op.DEMO,
145 args=(policy.get_param_values(), max_length),
146 kwargs=None))
147
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/garage/tf/plotter/plotter.py b/src/garage/tf/plotter/plotter.py
--- a/src/garage/tf/plotter/plotter.py
+++ b/src/garage/tf/plotter/plotter.py
@@ -43,6 +43,7 @@
) if graph is None else graph
with self.sess.as_default(), self.graph.as_default():
self.policy = policy.clone('plotter_policy')
+ self.policy.build(policy.model.input)
self.rollout = rollout
self.worker_thread = Thread(target=self._start_worker, daemon=True)
self.queue = Queue()
| {"golden_diff": "diff --git a/src/garage/tf/plotter/plotter.py b/src/garage/tf/plotter/plotter.py\n--- a/src/garage/tf/plotter/plotter.py\n+++ b/src/garage/tf/plotter/plotter.py\n@@ -43,6 +43,7 @@\n ) if graph is None else graph\n with self.sess.as_default(), self.graph.as_default():\n self.policy = policy.clone('plotter_policy')\n+ self.policy.build(policy.model.input)\n self.rollout = rollout\n self.worker_thread = Thread(target=self._start_worker, daemon=True)\n self.queue = Queue()\n", "issue": "Backport #1554\n\n", "before_files": [{"content": "import atexit\nfrom collections import namedtuple\nfrom enum import Enum\nimport platform\nfrom queue import Queue\nfrom threading import Thread\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom garage.sampler.utils import rollout as default_rollout\n\n__all__ = ['Plotter']\n\n\nclass Op(Enum):\n STOP = 0\n UPDATE = 1\n DEMO = 2\n\n\nMessage = namedtuple('Message', ['op', 'args', 'kwargs'])\n\n\nclass Plotter:\n\n # Static variable used to disable the plotter\n enable = True\n # List containing all plotters instantiated in the process\n __plotters = []\n\n def __init__(self,\n env,\n policy,\n sess=None,\n graph=None,\n rollout=default_rollout):\n Plotter.__plotters.append(self)\n self.env = env\n self.sess = tf.compat.v1.get_default_session(\n ) if sess is None else sess\n self.graph = tf.compat.v1.get_default_graph(\n ) if graph is None else graph\n with self.sess.as_default(), self.graph.as_default():\n self.policy = policy.clone('plotter_policy')\n self.rollout = rollout\n self.worker_thread = Thread(target=self._start_worker, daemon=True)\n self.queue = Queue()\n\n # Needed in order to draw glfw window on the main thread\n if ('Darwin' in platform.platform()):\n self.rollout(env,\n policy,\n max_path_length=np.inf,\n animated=True,\n speedup=5)\n\n def _start_worker(self):\n env = None\n policy = None\n max_length = None\n initial_rollout = True\n try:\n with self.sess.as_default(), self.sess.graph.as_default():\n # Each iteration will process ALL messages currently in the\n # queue\n while True:\n msgs = {}\n # If true, block and yield processor\n if initial_rollout:\n msg = self.queue.get()\n msgs[msg.op] = msg\n # Only fetch the last message of each type\n while not self.queue.empty():\n msg = self.queue.get()\n msgs[msg.op] = msg\n else:\n # Only fetch the last message of each type\n while not self.queue.empty():\n msg = self.queue.get_nowait()\n msgs[msg.op] = msg\n\n if Op.STOP in msgs:\n self.queue.task_done()\n break\n if Op.UPDATE in msgs:\n env, policy = msgs[Op.UPDATE].args\n self.queue.task_done()\n if Op.DEMO in msgs:\n param_values, max_length = msgs[Op.DEMO].args\n policy.set_param_values(param_values)\n initial_rollout = False\n self.rollout(env,\n policy,\n max_path_length=max_length,\n animated=True,\n speedup=5)\n self.queue.task_done()\n else:\n if max_length:\n self.rollout(env,\n policy,\n max_path_length=max_length,\n animated=True,\n speedup=5)\n except KeyboardInterrupt:\n pass\n\n def close(self):\n if self.worker_thread.is_alive():\n while not self.queue.empty():\n self.queue.get()\n self.queue.task_done()\n self.queue.put(Message(op=Op.STOP, args=None, kwargs=None))\n self.queue.join()\n self.worker_thread.join()\n\n @staticmethod\n def disable():\n \"\"\"Disable all instances of the Plotter class.\"\"\"\n Plotter.enable = False\n\n @staticmethod\n def get_plotters():\n return Plotter.__plotters\n\n def start(self):\n if not Plotter.enable:\n return\n if not self.worker_thread.is_alive():\n 
tf.compat.v1.get_variable_scope().reuse_variables()\n self.worker_thread.start()\n self.queue.put(\n Message(op=Op.UPDATE,\n args=(self.env, self.policy),\n kwargs=None))\n atexit.register(self.close)\n\n def update_plot(self, policy, max_length=np.inf):\n if not Plotter.enable:\n return\n if self.worker_thread.is_alive():\n self.queue.put(\n Message(op=Op.DEMO,\n args=(policy.get_param_values(), max_length),\n kwargs=None))\n", "path": "src/garage/tf/plotter/plotter.py"}], "after_files": [{"content": "import atexit\nfrom collections import namedtuple\nfrom enum import Enum\nimport platform\nfrom queue import Queue\nfrom threading import Thread\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom garage.sampler.utils import rollout as default_rollout\n\n__all__ = ['Plotter']\n\n\nclass Op(Enum):\n STOP = 0\n UPDATE = 1\n DEMO = 2\n\n\nMessage = namedtuple('Message', ['op', 'args', 'kwargs'])\n\n\nclass Plotter:\n\n # Static variable used to disable the plotter\n enable = True\n # List containing all plotters instantiated in the process\n __plotters = []\n\n def __init__(self,\n env,\n policy,\n sess=None,\n graph=None,\n rollout=default_rollout):\n Plotter.__plotters.append(self)\n self.env = env\n self.sess = tf.compat.v1.get_default_session(\n ) if sess is None else sess\n self.graph = tf.compat.v1.get_default_graph(\n ) if graph is None else graph\n with self.sess.as_default(), self.graph.as_default():\n self.policy = policy.clone('plotter_policy')\n self.policy.build(policy.model.input)\n self.rollout = rollout\n self.worker_thread = Thread(target=self._start_worker, daemon=True)\n self.queue = Queue()\n\n # Needed in order to draw glfw window on the main thread\n if ('Darwin' in platform.platform()):\n self.rollout(env,\n policy,\n max_path_length=np.inf,\n animated=True,\n speedup=5)\n\n def _start_worker(self):\n env = None\n policy = None\n max_length = None\n initial_rollout = True\n try:\n with self.sess.as_default(), self.sess.graph.as_default():\n # Each iteration will process ALL messages currently in the\n # queue\n while True:\n msgs = {}\n # If true, block and yield processor\n if initial_rollout:\n msg = self.queue.get()\n msgs[msg.op] = msg\n # Only fetch the last message of each type\n while not self.queue.empty():\n msg = self.queue.get()\n msgs[msg.op] = msg\n else:\n # Only fetch the last message of each type\n while not self.queue.empty():\n msg = self.queue.get_nowait()\n msgs[msg.op] = msg\n\n if Op.STOP in msgs:\n self.queue.task_done()\n break\n if Op.UPDATE in msgs:\n env, policy = msgs[Op.UPDATE].args\n self.queue.task_done()\n if Op.DEMO in msgs:\n param_values, max_length = msgs[Op.DEMO].args\n policy.set_param_values(param_values)\n initial_rollout = False\n self.rollout(env,\n policy,\n max_path_length=max_length,\n animated=True,\n speedup=5)\n self.queue.task_done()\n else:\n if max_length:\n self.rollout(env,\n policy,\n max_path_length=max_length,\n animated=True,\n speedup=5)\n except KeyboardInterrupt:\n pass\n\n def close(self):\n if self.worker_thread.is_alive():\n while not self.queue.empty():\n self.queue.get()\n self.queue.task_done()\n self.queue.put(Message(op=Op.STOP, args=None, kwargs=None))\n self.queue.join()\n self.worker_thread.join()\n\n @staticmethod\n def disable():\n \"\"\"Disable all instances of the Plotter class.\"\"\"\n Plotter.enable = False\n\n @staticmethod\n def get_plotters():\n return Plotter.__plotters\n\n def start(self):\n if not Plotter.enable:\n return\n if not self.worker_thread.is_alive():\n 
tf.compat.v1.get_variable_scope().reuse_variables()\n self.worker_thread.start()\n self.queue.put(\n Message(op=Op.UPDATE,\n args=(self.env, self.policy),\n kwargs=None))\n atexit.register(self.close)\n\n def update_plot(self, policy, max_length=np.inf):\n if not Plotter.enable:\n return\n if self.worker_thread.is_alive():\n self.queue.put(\n Message(op=Op.DEMO,\n args=(policy.get_param_values(), max_length),\n kwargs=None))\n", "path": "src/garage/tf/plotter/plotter.py"}]} | 1,514 | 145 |
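
For the garage row above, the one-line fix adds a `build` call right after the policy is cloned, presumably because a freshly cloned policy has not yet built its TensorFlow graph and the plotter thread would otherwise have nothing to run rollouts with. The relevant excerpt of the patched `Plotter.__init__` (taken from the row's after_files; it is a fragment, not a standalone script):

```python
# Excerpt of the patched garage Plotter.__init__ (from the after_files above);
# self.sess, self.graph and policy belong to the surrounding class/constructor.
with self.sess.as_default(), self.graph.as_default():
    self.policy = policy.clone('plotter_policy')
    # New line from the fix: build the cloned network against the original
    # policy's model input so the plotter thread can actually run it.
    self.policy.build(policy.model.input)
    self.rollout = rollout
```
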
gh_patches_debug_1848 | rasdani/github-patches | git_diff | kivy__python-for-android-1163 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Openssl recipe crashes on x86 arch
p4a branch: stable
buildozer: 0.33
bootstrap: sdl2
kivy: 1.10.0
Error message i get:
```
arm_arch.h:46:6: error: #error "unsupported ARM architecture"
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pythonforandroid/recipes/openssl/__init__.py`
Content:
```
1 from functools import partial
2
3 from pythonforandroid.toolchain import Recipe, shprint, current_directory
4 import sh
5
6
7 class OpenSSLRecipe(Recipe):
8 version = '1.0.2h'
9 url = 'https://www.openssl.org/source/openssl-{version}.tar.gz'
10
11 def should_build(self, arch):
12 return not self.has_libs(arch, 'libssl' + self.version + '.so',
13 'libcrypto' + self.version + '.so')
14
15 def check_symbol(self, env, sofile, symbol):
16 nm = env.get('NM', 'nm')
17 syms = sh.sh('-c', "{} -gp {} | cut -d' ' -f3".format(
18 nm, sofile), _env=env).splitlines()
19 if symbol in syms:
20 return True
21 print('{} missing symbol {}; rebuilding'.format(sofile, symbol))
22 return False
23
24 def get_recipe_env(self, arch=None):
25 env = super(OpenSSLRecipe, self).get_recipe_env(arch)
26 env['OPENSSL_VERSION'] = self.version
27 env['CFLAGS'] += ' ' + env['LDFLAGS']
28 env['CC'] += ' ' + env['LDFLAGS']
29 return env
30
31 def select_build_arch(self, arch):
32 aname = arch.arch
33 if 'arm64' in aname:
34 return 'linux-aarch64'
35 if 'v7a' in aname:
36 return 'android-armv7'
37 if 'arm' in aname:
38 return 'android'
39 return 'linux-armv4'
40
41 def build_arch(self, arch):
42 env = self.get_recipe_env(arch)
43 with current_directory(self.get_build_dir(arch.arch)):
44 # sh fails with code 255 trying to execute ./Configure
45 # so instead we manually run perl passing in Configure
46 perl = sh.Command('perl')
47 buildarch = self.select_build_arch(arch)
48 shprint(perl, 'Configure', 'shared', 'no-dso', 'no-krb5', buildarch, _env=env)
49 self.apply_patch('disable-sover.patch', arch.arch)
50 self.apply_patch('rename-shared-lib.patch', arch.arch)
51
52 # check_ssl = partial(self.check_symbol, env, 'libssl' + self.version + '.so')
53 check_crypto = partial(self.check_symbol, env, 'libcrypto' + self.version + '.so')
54 while True:
55 shprint(sh.make, 'build_libs', _env=env)
56 if all(map(check_crypto, ('SSLeay', 'MD5_Transform', 'MD4_Init'))):
57 break
58 shprint(sh.make, 'clean', _env=env)
59
60 self.install_libs(arch, 'libssl' + self.version + '.so',
61 'libcrypto' + self.version + '.so')
62
63 recipe = OpenSSLRecipe()
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pythonforandroid/recipes/openssl/__init__.py b/pythonforandroid/recipes/openssl/__init__.py
--- a/pythonforandroid/recipes/openssl/__init__.py
+++ b/pythonforandroid/recipes/openssl/__init__.py
@@ -36,6 +36,8 @@
return 'android-armv7'
if 'arm' in aname:
return 'android'
+ if 'x86' in aname:
+ return 'android-x86'
return 'linux-armv4'
def build_arch(self, arch):
| {"golden_diff": "diff --git a/pythonforandroid/recipes/openssl/__init__.py b/pythonforandroid/recipes/openssl/__init__.py\n--- a/pythonforandroid/recipes/openssl/__init__.py\n+++ b/pythonforandroid/recipes/openssl/__init__.py\n@@ -36,6 +36,8 @@\n return 'android-armv7'\n if 'arm' in aname:\n return 'android'\n+ if 'x86' in aname:\n+ return 'android-x86'\n return 'linux-armv4'\n \n def build_arch(self, arch):\n", "issue": "Openssl recipe crashes on x86 arch\np4a branch: stable\r\nbuildozer: 0.33\r\nbootstrap: sdl2\r\nkivy: 1.10.0\r\n\r\nError message i get:\r\n```\r\narm_arch.h:46:6: error: #error \"unsupported ARM architecture\"\r\n```\n", "before_files": [{"content": "from functools import partial\n\nfrom pythonforandroid.toolchain import Recipe, shprint, current_directory\nimport sh\n\n\nclass OpenSSLRecipe(Recipe):\n version = '1.0.2h'\n url = 'https://www.openssl.org/source/openssl-{version}.tar.gz'\n\n def should_build(self, arch):\n return not self.has_libs(arch, 'libssl' + self.version + '.so',\n 'libcrypto' + self.version + '.so')\n\n def check_symbol(self, env, sofile, symbol):\n nm = env.get('NM', 'nm')\n syms = sh.sh('-c', \"{} -gp {} | cut -d' ' -f3\".format(\n nm, sofile), _env=env).splitlines()\n if symbol in syms:\n return True\n print('{} missing symbol {}; rebuilding'.format(sofile, symbol))\n return False\n\n def get_recipe_env(self, arch=None):\n env = super(OpenSSLRecipe, self).get_recipe_env(arch)\n env['OPENSSL_VERSION'] = self.version\n env['CFLAGS'] += ' ' + env['LDFLAGS']\n env['CC'] += ' ' + env['LDFLAGS']\n return env\n\n def select_build_arch(self, arch):\n aname = arch.arch\n if 'arm64' in aname:\n return 'linux-aarch64'\n if 'v7a' in aname:\n return 'android-armv7'\n if 'arm' in aname:\n return 'android'\n return 'linux-armv4'\n\n def build_arch(self, arch):\n env = self.get_recipe_env(arch)\n with current_directory(self.get_build_dir(arch.arch)):\n # sh fails with code 255 trying to execute ./Configure\n # so instead we manually run perl passing in Configure\n perl = sh.Command('perl')\n buildarch = self.select_build_arch(arch)\n shprint(perl, 'Configure', 'shared', 'no-dso', 'no-krb5', buildarch, _env=env)\n self.apply_patch('disable-sover.patch', arch.arch)\n self.apply_patch('rename-shared-lib.patch', arch.arch)\n\n # check_ssl = partial(self.check_symbol, env, 'libssl' + self.version + '.so')\n check_crypto = partial(self.check_symbol, env, 'libcrypto' + self.version + '.so')\n while True:\n shprint(sh.make, 'build_libs', _env=env)\n if all(map(check_crypto, ('SSLeay', 'MD5_Transform', 'MD4_Init'))):\n break\n shprint(sh.make, 'clean', _env=env)\n\n self.install_libs(arch, 'libssl' + self.version + '.so',\n 'libcrypto' + self.version + '.so')\n\nrecipe = OpenSSLRecipe()\n", "path": "pythonforandroid/recipes/openssl/__init__.py"}], "after_files": [{"content": "from functools import partial\n\nfrom pythonforandroid.toolchain import Recipe, shprint, current_directory\nimport sh\n\n\nclass OpenSSLRecipe(Recipe):\n version = '1.0.2h'\n url = 'https://www.openssl.org/source/openssl-{version}.tar.gz'\n\n def should_build(self, arch):\n return not self.has_libs(arch, 'libssl' + self.version + '.so',\n 'libcrypto' + self.version + '.so')\n\n def check_symbol(self, env, sofile, symbol):\n nm = env.get('NM', 'nm')\n syms = sh.sh('-c', \"{} -gp {} | cut -d' ' -f3\".format(\n nm, sofile), _env=env).splitlines()\n if symbol in syms:\n return True\n print('{} missing symbol {}; rebuilding'.format(sofile, symbol))\n return False\n\n def get_recipe_env(self, 
arch=None):\n env = super(OpenSSLRecipe, self).get_recipe_env(arch)\n env['OPENSSL_VERSION'] = self.version\n env['CFLAGS'] += ' ' + env['LDFLAGS']\n env['CC'] += ' ' + env['LDFLAGS']\n return env\n\n def select_build_arch(self, arch):\n aname = arch.arch\n if 'arm64' in aname:\n return 'linux-aarch64'\n if 'v7a' in aname:\n return 'android-armv7'\n if 'arm' in aname:\n return 'android'\n if 'x86' in aname:\n return 'android-x86'\n return 'linux-armv4'\n\n def build_arch(self, arch):\n env = self.get_recipe_env(arch)\n with current_directory(self.get_build_dir(arch.arch)):\n # sh fails with code 255 trying to execute ./Configure\n # so instead we manually run perl passing in Configure\n perl = sh.Command('perl')\n buildarch = self.select_build_arch(arch)\n shprint(perl, 'Configure', 'shared', 'no-dso', 'no-krb5', buildarch, _env=env)\n self.apply_patch('disable-sover.patch', arch.arch)\n self.apply_patch('rename-shared-lib.patch', arch.arch)\n\n # check_ssl = partial(self.check_symbol, env, 'libssl' + self.version + '.so')\n check_crypto = partial(self.check_symbol, env, 'libcrypto' + self.version + '.so')\n while True:\n shprint(sh.make, 'build_libs', _env=env)\n if all(map(check_crypto, ('SSLeay', 'MD5_Transform', 'MD4_Init'))):\n break\n shprint(sh.make, 'clean', _env=env)\n\n self.install_libs(arch, 'libssl' + self.version + '.so',\n 'libcrypto' + self.version + '.so')\n\nrecipe = OpenSSLRecipe()\n", "path": "pythonforandroid/recipes/openssl/__init__.py"}]} | 1,086 | 128 |
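
The openssl row above crashes on x86 because `select_build_arch` falls through to `linux-armv4` for any non-ARM architecture, which drags OpenSSL's `arm_arch.h` into an x86 build and triggers the "unsupported ARM architecture" error quoted in the issue. A small, runnable sketch of the patched mapping follows; the example arch strings are assumptions about typical python-for-android ABI names, not values from the row.

```python
# Runnable sketch of the patched arch-to-OpenSSL-target mapping.
# The arch names fed in below are assumptions about common p4a ABIs.

def select_build_arch(aname):
    if 'arm64' in aname:
        return 'linux-aarch64'
    if 'v7a' in aname:
        return 'android-armv7'
    if 'arm' in aname:
        return 'android'
    if 'x86' in aname:          # added by the fix; previously fell through
        return 'android-x86'
    return 'linux-armv4'

for arch in ['armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64']:
    print(arch, '->', select_build_arch(arch))
# armeabi-v7a -> android-armv7
# arm64-v8a -> linux-aarch64
# x86 -> android-x86
# x86_64 -> android-x86
```
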
gh_patches_debug_66775 | rasdani/github-patches | git_diff | apache__airflow-1296 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Inconsistent 'owner' field in examples
Dear Airflow Maintainers,
### Environment
- Version of Airflow (e.g. a release version, running your own fork, running off master -- provide a git log snippet): **1.7.0**
- Screen shots of your DAG's graph and tree views:

- Operating System: (Windows Version or `$ uname -a`) **Ubuntu 14.04**
- Python Version: `$ python --version` **2.7**
### Description of Issue
- What did you expect to happen? **All of the examples have a consistent owner, probably 'airflow'**
- What happened instead? **[Some](https://github.com/airbnb/airflow/blob/master/airflow/example_dags/example_python_operator.py) examples have `airflow`, [some](https://github.com/airbnb/airflow/blob/master/airflow/example_dags/example_passing_params_via_test_command.py) have `me`**
### Reproduction Steps
1. install airflow 1.7.0 via pip
2. start the webserver
3. look at the web UI, probably http://localhost:8080
**Inconsistent hard-coding in the examples will likely lead to confusion for new users.**
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `airflow/example_dags/example_trigger_controller_dag.py`
Content:
```
1
2 """This example illustrates the use of the TriggerDagRunOperator. There are 2
3 entities at work in this scenario:
4 1. The Controller DAG - the DAG that conditionally executes the trigger
5 2. The Target DAG - DAG being triggered (in example_trigger_target_dag.py)
6
7 This example illustrates the following features :
8 1. A TriggerDagRunOperator that takes:
9 a. A python callable that decides whether or not to trigger the Target DAG
10 b. An optional params dict passed to the python callable to help in
11 evaluating whether or not to trigger the Target DAG
12 c. The id (name) of the Target DAG
13 d. The python callable can add contextual info to the DagRun created by
14 way of adding a Pickleable payload (e.g. dictionary of primitives). This
15 state is then made available to the TargetDag
16 2. A Target DAG : c.f. example_trigger_target_dag.py
17 """
18
19 from airflow import DAG
20 from airflow.operators import TriggerDagRunOperator
21 from datetime import datetime
22
23 import pprint
24
25 pp = pprint.PrettyPrinter(indent=4)
26
27
28 def conditionally_trigger(context, dag_run_obj):
29 """This function decides whether or not to Trigger the remote DAG"""
30 c_p =context['params']['condition_param']
31 print("Controller DAG : conditionally_trigger = {}".format(c_p))
32 if context['params']['condition_param']:
33 dag_run_obj.payload = {'message': context['params']['message']}
34 pp.pprint(dag_run_obj.payload)
35 return dag_run_obj
36
37
38 # Define the DAG
39 dag = DAG(dag_id='example_trigger_controller_dag',
40 default_args={"owner": "me",
41 "start_date": datetime.now()},
42 schedule_interval='@once')
43
44
45 # Define the single task in this controller example DAG
46 trigger = TriggerDagRunOperator(task_id='test_trigger_dagrun',
47 trigger_dag_id="example_trigger_target_dag",
48 python_callable=conditionally_trigger,
49 params={'condition_param': True,
50 'message': 'Hello World'},
51 dag=dag)
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/airflow/example_dags/example_trigger_controller_dag.py b/airflow/example_dags/example_trigger_controller_dag.py
--- a/airflow/example_dags/example_trigger_controller_dag.py
+++ b/airflow/example_dags/example_trigger_controller_dag.py
@@ -37,7 +37,7 @@
# Define the DAG
dag = DAG(dag_id='example_trigger_controller_dag',
- default_args={"owner": "me",
+ default_args={"owner": "airflow",
"start_date": datetime.now()},
schedule_interval='@once')
| {"golden_diff": "diff --git a/airflow/example_dags/example_trigger_controller_dag.py b/airflow/example_dags/example_trigger_controller_dag.py\n--- a/airflow/example_dags/example_trigger_controller_dag.py\n+++ b/airflow/example_dags/example_trigger_controller_dag.py\n@@ -37,7 +37,7 @@\n \n # Define the DAG\n dag = DAG(dag_id='example_trigger_controller_dag',\n- default_args={\"owner\": \"me\",\n+ default_args={\"owner\": \"airflow\",\n \"start_date\": datetime.now()},\n schedule_interval='@once')\n", "issue": "Inconsistent 'owner' field in examples\nDear Airflow Maintainers,\n### Environment\n- Version of Airflow (e.g. a release version, running your own fork, running off master -- provide a git log snippet): **1.7.0**\n- Screen shots of your DAG's graph and tree views:\n \n- Operating System: (Windows Version or `$ uname -a`) **Ubuntu 14.04**\n- Python Version: `$ python --version` **2.7**\n### Description of Issue\n- What did you expect to happen? **All of the examples have a consistent owner, probably 'airflow'**\n- What happened instead? **[Some](https://github.com/airbnb/airflow/blob/master/airflow/example_dags/example_python_operator.py) examples have `airflow`, [some](https://github.com/airbnb/airflow/blob/master/airflow/example_dags/example_passing_params_via_test_command.py) have `me`**\n### Reproduction Steps\n1. install airflow 1.7.0 via pip\n2. start the webserver\n3. look at the web UI, probably http://localhost:8080\n\n**Inconsistent hard-coding in the examples will likely lead to confusion for new users.**\n\n", "before_files": [{"content": "\n\"\"\"This example illustrates the use of the TriggerDagRunOperator. There are 2\nentities at work in this scenario:\n1. The Controller DAG - the DAG that conditionally executes the trigger\n2. The Target DAG - DAG being triggered (in example_trigger_target_dag.py)\n\nThis example illustrates the following features :\n1. A TriggerDagRunOperator that takes:\n a. A python callable that decides whether or not to trigger the Target DAG\n b. An optional params dict passed to the python callable to help in\n evaluating whether or not to trigger the Target DAG\n c. The id (name) of the Target DAG\n d. The python callable can add contextual info to the DagRun created by\n way of adding a Pickleable payload (e.g. dictionary of primitives). This\n state is then made available to the TargetDag\n2. A Target DAG : c.f. 
example_trigger_target_dag.py\n\"\"\"\n\nfrom airflow import DAG\nfrom airflow.operators import TriggerDagRunOperator\nfrom datetime import datetime\n\nimport pprint\n\npp = pprint.PrettyPrinter(indent=4)\n\n\ndef conditionally_trigger(context, dag_run_obj):\n \"\"\"This function decides whether or not to Trigger the remote DAG\"\"\"\n c_p =context['params']['condition_param']\n print(\"Controller DAG : conditionally_trigger = {}\".format(c_p))\n if context['params']['condition_param']:\n dag_run_obj.payload = {'message': context['params']['message']}\n pp.pprint(dag_run_obj.payload)\n return dag_run_obj\n\n\n# Define the DAG\ndag = DAG(dag_id='example_trigger_controller_dag',\n default_args={\"owner\": \"me\",\n \"start_date\": datetime.now()},\n schedule_interval='@once')\n\n\n# Define the single task in this controller example DAG\ntrigger = TriggerDagRunOperator(task_id='test_trigger_dagrun',\n trigger_dag_id=\"example_trigger_target_dag\",\n python_callable=conditionally_trigger,\n params={'condition_param': True,\n 'message': 'Hello World'},\n dag=dag)\n", "path": "airflow/example_dags/example_trigger_controller_dag.py"}], "after_files": [{"content": "\n\"\"\"This example illustrates the use of the TriggerDagRunOperator. There are 2\nentities at work in this scenario:\n1. The Controller DAG - the DAG that conditionally executes the trigger\n2. The Target DAG - DAG being triggered (in example_trigger_target_dag.py)\n\nThis example illustrates the following features :\n1. A TriggerDagRunOperator that takes:\n a. A python callable that decides whether or not to trigger the Target DAG\n b. An optional params dict passed to the python callable to help in\n evaluating whether or not to trigger the Target DAG\n c. The id (name) of the Target DAG\n d. The python callable can add contextual info to the DagRun created by\n way of adding a Pickleable payload (e.g. dictionary of primitives). This\n state is then made available to the TargetDag\n2. A Target DAG : c.f. example_trigger_target_dag.py\n\"\"\"\n\nfrom airflow import DAG\nfrom airflow.operators import TriggerDagRunOperator\nfrom datetime import datetime\n\nimport pprint\n\npp = pprint.PrettyPrinter(indent=4)\n\n\ndef conditionally_trigger(context, dag_run_obj):\n \"\"\"This function decides whether or not to Trigger the remote DAG\"\"\"\n c_p =context['params']['condition_param']\n print(\"Controller DAG : conditionally_trigger = {}\".format(c_p))\n if context['params']['condition_param']:\n dag_run_obj.payload = {'message': context['params']['message']}\n pp.pprint(dag_run_obj.payload)\n return dag_run_obj\n\n\n# Define the DAG\ndag = DAG(dag_id='example_trigger_controller_dag',\n default_args={\"owner\": \"airflow\",\n \"start_date\": datetime.now()},\n schedule_interval='@once')\n\n\n# Define the single task in this controller example DAG\ntrigger = TriggerDagRunOperator(task_id='test_trigger_dagrun',\n trigger_dag_id=\"example_trigger_target_dag\",\n python_callable=conditionally_trigger,\n params={'condition_param': True,\n 'message': 'Hello World'},\n dag=dag)\n", "path": "airflow/example_dags/example_trigger_controller_dag.py"}]} | 1,091 | 127 |
gh_patches_debug_37830 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-2969 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG]: Running the ChatGPT inference example throws an error
### 🐛 Describe the bug
Running inference.py in (https://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT)/examples/ throws an OSError:

### Environment
_No response_
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `applications/ChatGPT/examples/inference.py`
Content:
```
1 import argparse
2 import torch
3
4 from chatgpt.nn import BLOOMActor, GPTActor, OPTActor
5 from transformers import AutoTokenizer
6 from transformers.models.gpt2.tokenization_gpt2 import GPT2Tokenizer
7
8
9 def eval(args):
10 # configure model
11 if args.model == 'gpt2':
12 model = GPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())
13 elif args.model == 'bloom':
14 model = BLOOMActor(pretrained=args.pretrain).to(torch.cuda.current_device())
15 elif args.model == 'opt':
16 model = OPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())
17 else:
18 raise ValueError(f'Unsupported model "{args.model}"')
19
20 # configure tokenizer
21 if args.model == 'gpt2':
22 tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
23 tokenizer.pad_token = tokenizer.eos_token
24 elif args.model == 'bloom':
25 tokenizer = AutoTokenizer.from_pretrained(args.pretrain)
26 tokenizer.pad_token = tokenizer.eos_token
27 elif args.model == 'opt':
28 tokenizer = AutoTokenizer.from_pretrained("facebook/opt-350m")
29 else:
30 raise ValueError(f'Unsupported model "{args.model}"')
31
32 model.eval()
33 input = args.input
34 input_ids = tokenizer.encode(input, return_tensors='pt').to(torch.cuda.current_device())
35 outputs = model.generate(input_ids,
36 max_length=args.max_length,
37 do_sample=True,
38 top_k=50,
39 top_p=0.95,
40 num_return_sequences=1)
41 output = tokenizer.batch_decode(outputs[0], skip_special_tokens=True)
42 print(output)
43
44
45 if __name__ == '__main__':
46 parser = argparse.ArgumentParser()
47 parser.add_argument('--model', default='gpt2', choices=['gpt2', 'bloom', 'opt'])
48 parser.add_argument('--pretrain', type=str, default=None)
49 parser.add_argument('--input', type=str, default='Q: How are you ? A:')
50 parser.add_argument('--max_length', type=int, default=100)
51 args = parser.parse_args()
52 eval(args)
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/applications/ChatGPT/examples/inference.py b/applications/ChatGPT/examples/inference.py
--- a/applications/ChatGPT/examples/inference.py
+++ b/applications/ChatGPT/examples/inference.py
@@ -9,30 +9,34 @@
def eval(args):
# configure model
if args.model == 'gpt2':
- model = GPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())
+ actor = GPTActor().to(torch.cuda.current_device())
elif args.model == 'bloom':
- model = BLOOMActor(pretrained=args.pretrain).to(torch.cuda.current_device())
+ actor = BLOOMActor().to(torch.cuda.current_device())
elif args.model == 'opt':
- model = OPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())
+ actor = OPTActor().to(torch.cuda.current_device())
else:
raise ValueError(f'Unsupported model "{args.model}"')
+ state_dict = torch.load(args.pretrain)
+ actor.model.load_state_dict(state_dict)
+
+
# configure tokenizer
if args.model == 'gpt2':
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
tokenizer.pad_token = tokenizer.eos_token
elif args.model == 'bloom':
- tokenizer = AutoTokenizer.from_pretrained(args.pretrain)
+ tokenizer = AutoTokenizer.from_pretrained('bigscience/bloom-560m')
tokenizer.pad_token = tokenizer.eos_token
elif args.model == 'opt':
- tokenizer = AutoTokenizer.from_pretrained("facebook/opt-350m")
+ tokenizer = AutoTokenizer.from_pretrained('facebook/opt-350m')
else:
raise ValueError(f'Unsupported model "{args.model}"')
- model.eval()
+ actor.eval()
input = args.input
input_ids = tokenizer.encode(input, return_tensors='pt').to(torch.cuda.current_device())
- outputs = model.generate(input_ids,
+ outputs = actor.generate(input_ids,
max_length=args.max_length,
do_sample=True,
top_k=50,
@@ -46,7 +50,7 @@
parser = argparse.ArgumentParser()
parser.add_argument('--model', default='gpt2', choices=['gpt2', 'bloom', 'opt'])
parser.add_argument('--pretrain', type=str, default=None)
- parser.add_argument('--input', type=str, default='Q: How are you ? A:')
+ parser.add_argument('--input', type=str, default='Question: How are you ? Answer:')
parser.add_argument('--max_length', type=int, default=100)
args = parser.parse_args()
eval(args)
| {"golden_diff": "diff --git a/applications/ChatGPT/examples/inference.py b/applications/ChatGPT/examples/inference.py\n--- a/applications/ChatGPT/examples/inference.py\n+++ b/applications/ChatGPT/examples/inference.py\n@@ -9,30 +9,34 @@\n def eval(args):\n # configure model\n if args.model == 'gpt2':\n- model = GPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n+ actor = GPTActor().to(torch.cuda.current_device())\n elif args.model == 'bloom':\n- model = BLOOMActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n+ actor = BLOOMActor().to(torch.cuda.current_device())\n elif args.model == 'opt':\n- model = OPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n+ actor = OPTActor().to(torch.cuda.current_device())\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n \n+ state_dict = torch.load(args.pretrain)\n+ actor.model.load_state_dict(state_dict)\n+ \n+ \n # configure tokenizer\n if args.model == 'gpt2':\n tokenizer = GPT2Tokenizer.from_pretrained('gpt2')\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'bloom':\n- tokenizer = AutoTokenizer.from_pretrained(args.pretrain)\n+ tokenizer = AutoTokenizer.from_pretrained('bigscience/bloom-560m')\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'opt':\n- tokenizer = AutoTokenizer.from_pretrained(\"facebook/opt-350m\")\n+ tokenizer = AutoTokenizer.from_pretrained('facebook/opt-350m')\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n \n- model.eval()\n+ actor.eval()\n input = args.input\n input_ids = tokenizer.encode(input, return_tensors='pt').to(torch.cuda.current_device())\n- outputs = model.generate(input_ids,\n+ outputs = actor.generate(input_ids,\n max_length=args.max_length,\n do_sample=True,\n top_k=50,\n@@ -46,7 +50,7 @@\n parser = argparse.ArgumentParser()\n parser.add_argument('--model', default='gpt2', choices=['gpt2', 'bloom', 'opt'])\n parser.add_argument('--pretrain', type=str, default=None)\n- parser.add_argument('--input', type=str, default='Q: How are you ? A:')\n+ parser.add_argument('--input', type=str, default='Question: How are you ? 
Answer:')\n parser.add_argument('--max_length', type=int, default=100)\n args = parser.parse_args()\n eval(args)\n", "issue": "[BUG]: \u8fd0\u884cchatgpt\u63a8\u7406\u793a\u4f8b\u62a5\u9519\n### \ud83d\udc1b Describe the bug\n\n(https://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT)/examples/ \u8fd0\u884cinference.py \u629b\u51faOSError:\r\n\r\n\n\n### Environment\n\n_No response_\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "import argparse\nimport torch\n\nfrom chatgpt.nn import BLOOMActor, GPTActor, OPTActor\nfrom transformers import AutoTokenizer\nfrom transformers.models.gpt2.tokenization_gpt2 import GPT2Tokenizer\n\n\ndef eval(args):\n # configure model\n if args.model == 'gpt2':\n model = GPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n elif args.model == 'bloom':\n model = BLOOMActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n elif args.model == 'opt':\n model = OPTActor(pretrained=args.pretrain).to(torch.cuda.current_device())\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n\n # configure tokenizer\n if args.model == 'gpt2':\n tokenizer = GPT2Tokenizer.from_pretrained('gpt2')\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'bloom':\n tokenizer = AutoTokenizer.from_pretrained(args.pretrain)\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'opt':\n tokenizer = AutoTokenizer.from_pretrained(\"facebook/opt-350m\")\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n\n model.eval()\n input = args.input\n input_ids = tokenizer.encode(input, return_tensors='pt').to(torch.cuda.current_device())\n outputs = model.generate(input_ids,\n max_length=args.max_length,\n do_sample=True,\n top_k=50,\n top_p=0.95,\n num_return_sequences=1)\n output = tokenizer.batch_decode(outputs[0], skip_special_tokens=True)\n print(output)\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser()\n parser.add_argument('--model', default='gpt2', choices=['gpt2', 'bloom', 'opt'])\n parser.add_argument('--pretrain', type=str, default=None)\n parser.add_argument('--input', type=str, default='Q: How are you ? 
A:')\n parser.add_argument('--max_length', type=int, default=100)\n args = parser.parse_args()\n eval(args)\n", "path": "applications/ChatGPT/examples/inference.py"}], "after_files": [{"content": "import argparse\nimport torch\n\nfrom chatgpt.nn import BLOOMActor, GPTActor, OPTActor\nfrom transformers import AutoTokenizer\nfrom transformers.models.gpt2.tokenization_gpt2 import GPT2Tokenizer\n\n\ndef eval(args):\n # configure model\n if args.model == 'gpt2':\n actor = GPTActor().to(torch.cuda.current_device())\n elif args.model == 'bloom':\n actor = BLOOMActor().to(torch.cuda.current_device())\n elif args.model == 'opt':\n actor = OPTActor().to(torch.cuda.current_device())\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n\n state_dict = torch.load(args.pretrain)\n actor.model.load_state_dict(state_dict)\n \n \n # configure tokenizer\n if args.model == 'gpt2':\n tokenizer = GPT2Tokenizer.from_pretrained('gpt2')\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'bloom':\n tokenizer = AutoTokenizer.from_pretrained('bigscience/bloom-560m')\n tokenizer.pad_token = tokenizer.eos_token\n elif args.model == 'opt':\n tokenizer = AutoTokenizer.from_pretrained('facebook/opt-350m')\n else:\n raise ValueError(f'Unsupported model \"{args.model}\"')\n\n actor.eval()\n input = args.input\n input_ids = tokenizer.encode(input, return_tensors='pt').to(torch.cuda.current_device())\n outputs = actor.generate(input_ids,\n max_length=args.max_length,\n do_sample=True,\n top_k=50,\n top_p=0.95,\n num_return_sequences=1)\n output = tokenizer.batch_decode(outputs[0], skip_special_tokens=True)\n print(output)\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser()\n parser.add_argument('--model', default='gpt2', choices=['gpt2', 'bloom', 'opt'])\n parser.add_argument('--pretrain', type=str, default=None)\n parser.add_argument('--input', type=str, default='Question: How are you ? Answer:')\n parser.add_argument('--max_length', type=int, default=100)\n args = parser.parse_args()\n eval(args)\n", "path": "applications/ChatGPT/examples/inference.py"}]} | 970 | 592 |
gh_patches_debug_15935 | rasdani/github-patches | git_diff | vispy__vispy-305 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The first emitted Timer event has `None` as `dt` property
``` python
def on_timer(self, event):
print event.dt
```
displays `None` the first time, and the correct dt then (a float). The first dt should probably be `0.0`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `vispy/app/timer.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright (c) 2014, Vispy Development Team.
3 # Distributed under the (new) BSD License. See LICENSE.txt for more info.
4
5 from __future__ import division
6
7 from ..util.event import Event, EmitterGroup
8 from ..util.ptime import time as precision_time
9 from ..ext.six import string_types
10 from .base import BaseTimerBackend as TimerBackend # noqa
11 from . import use_app, Application
12
13
14 class Timer(object):
15
16 """Timer used to schedule events in the future or on a repeating schedule
17
18 Parameters
19 ----------
20 interval : float
21 Time between events.
22 connect : function | None
23 The function to call.
24 iterations : int
25 Number of iterations. Can be -1 for infinite.
26 start : bool
27 Whether to start the timer.
28 app : instance of vispy.app.Application
29 The application to attach the timer to.
30 """
31
32 def __init__(self, interval=0.0, connect=None, iterations=-1, start=False,
33 app=None):
34 self.events = EmitterGroup(source=self,
35 start=Event,
36 stop=Event,
37 timeout=Event)
38 #self.connect = self.events.timeout.connect
39 #self.disconnect = self.events.timeout.disconnect
40
41 # Get app instance
42 if app is None:
43 self._app = use_app()
44 elif isinstance(app, Application):
45 self._app = app
46 elif isinstance(app, string_types):
47 self._app = Application(app)
48 else:
49 raise ValueError('Invalid value for app %r' % app)
50
51 # Ensure app has backend app object
52 self._app.native
53
54 # Instantiate the backed with the right class
55 self._backend = self._app.backend_module.TimerBackend(self)
56
57 self._interval = interval
58 self._running = False
59 self._last_emit_time = None
60 self.iter_count = 0
61 self.max_iterations = iterations
62 if connect is not None:
63 self.connect(connect)
64 if start:
65 self.start()
66
67 @property
68 def app(self):
69 """ The vispy Application instance on which this Timer is based.
70 """
71 return self._app
72
73 @property
74 def interval(self):
75 return self._interval
76
77 @interval.setter
78 def interval(self, val):
79 self._interval = val
80 if self.running:
81 self.stop()
82 self.start()
83
84 @property
85 def running(self):
86 return self._running
87
88 def start(self, interval=None, iterations=None):
89 """Start the timer.
90
91 A timeout event will be generated every *interval* seconds.
92 If *interval* is None, then self.interval will be used.
93
94 If *iterations* is specified, the timer will stop after
95 emitting that number of events. If unspecified, then
96 the previous value of self.iterations will be used. If the value is
97 negative, then the timer will continue running until stop() is called.
98 """
99 self.iter_count = 0
100 if interval is not None:
101 self.interval = interval
102 if iterations is not None:
103 self.max_iterations = iterations
104 self._backend._vispy_start(self.interval)
105 self._running = True
106 self._last_emit_time = None
107 self.events.start(type='timer_start')
108
109 def stop(self):
110 """Stop the timer."""
111 self._backend._vispy_stop()
112 self._running = False
113 self.events.stop(type='timer_stop')
114
115 # use timer.app.run() and .quit() instead.
116 # def run_event_loop(self):
117 #"""Execute the event loop for this Timer's backend.
118 #"""
119 # return self._backend._vispy_run()
120
121 # def quit_event_loop(self):
122 #"""Exit the event loop for this Timer's backend.
123 #"""
124 # return self._backend._vispy_quit()
125
126 @property
127 def native(self):
128 """ The native timer on which this Timer is based.
129 """
130 return self._backend._vispy_get_native_timer()
131
132 def _timeout(self, *args):
133 # called when the backend timer has triggered.
134 if not self.running:
135 return
136 if self.max_iterations >= 0 and self.iter_count >= self.max_iterations:
137 self.stop()
138 return
139
140 # compute dt since last event
141 now = precision_time()
142 if self._last_emit_time is None:
143 dt = None
144 else:
145 dt = now - self._last_emit_time
146 self._last_emit_time = now
147
148 self.events.timeout(
149 type='timer_timeout',
150 iteration=self.iter_count,
151 dt=dt)
152 self.iter_count += 1
153
154 def connect(self, callback):
155 """ Alias for self.events.timeout.connect() """
156 return self.events.timeout.connect(callback)
157
158 def disconnect(self, callback=None):
159 """ Alias for self.events.timeout.disconnect() """
160 return self.events.timeout.disconnect(callback)
161
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/vispy/app/timer.py b/vispy/app/timer.py
--- a/vispy/app/timer.py
+++ b/vispy/app/timer.py
@@ -103,7 +103,7 @@
self.max_iterations = iterations
self._backend._vispy_start(self.interval)
self._running = True
- self._last_emit_time = None
+ self._last_emit_time = precision_time()
self.events.start(type='timer_start')
def stop(self):
@@ -139,10 +139,7 @@
# compute dt since last event
now = precision_time()
- if self._last_emit_time is None:
- dt = None
- else:
- dt = now - self._last_emit_time
+ dt = now - self._last_emit_time
self._last_emit_time = now
self.events.timeout(
| {"golden_diff": "diff --git a/vispy/app/timer.py b/vispy/app/timer.py\n--- a/vispy/app/timer.py\n+++ b/vispy/app/timer.py\n@@ -103,7 +103,7 @@\n self.max_iterations = iterations\n self._backend._vispy_start(self.interval)\n self._running = True\n- self._last_emit_time = None\n+ self._last_emit_time = precision_time()\n self.events.start(type='timer_start')\n \n def stop(self):\n@@ -139,10 +139,7 @@\n \n # compute dt since last event\n now = precision_time()\n- if self._last_emit_time is None:\n- dt = None\n- else:\n- dt = now - self._last_emit_time\n+ dt = now - self._last_emit_time\n self._last_emit_time = now\n \n self.events.timeout(\n", "issue": "The first emitted Timer event has `None` as `dt` property\n``` python\ndef on_timer(self, event):\n print event.dt\n```\n\ndisplays `None` the first time, and the correct dt then (a float). The first dt should probably be `0.0`.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright (c) 2014, Vispy Development Team.\n# Distributed under the (new) BSD License. See LICENSE.txt for more info.\n\nfrom __future__ import division\n\nfrom ..util.event import Event, EmitterGroup\nfrom ..util.ptime import time as precision_time\nfrom ..ext.six import string_types\nfrom .base import BaseTimerBackend as TimerBackend # noqa\nfrom . import use_app, Application\n\n\nclass Timer(object):\n\n \"\"\"Timer used to schedule events in the future or on a repeating schedule\n\n Parameters\n ----------\n interval : float\n Time between events.\n connect : function | None\n The function to call.\n iterations : int\n Number of iterations. Can be -1 for infinite.\n start : bool\n Whether to start the timer.\n app : instance of vispy.app.Application\n The application to attach the timer to.\n \"\"\"\n\n def __init__(self, interval=0.0, connect=None, iterations=-1, start=False,\n app=None):\n self.events = EmitterGroup(source=self,\n start=Event,\n stop=Event,\n timeout=Event)\n #self.connect = self.events.timeout.connect\n #self.disconnect = self.events.timeout.disconnect\n\n # Get app instance\n if app is None:\n self._app = use_app()\n elif isinstance(app, Application):\n self._app = app\n elif isinstance(app, string_types):\n self._app = Application(app)\n else:\n raise ValueError('Invalid value for app %r' % app)\n \n # Ensure app has backend app object\n self._app.native\n \n # Instantiate the backed with the right class\n self._backend = self._app.backend_module.TimerBackend(self)\n\n self._interval = interval\n self._running = False\n self._last_emit_time = None\n self.iter_count = 0\n self.max_iterations = iterations\n if connect is not None:\n self.connect(connect)\n if start:\n self.start()\n\n @property\n def app(self):\n \"\"\" The vispy Application instance on which this Timer is based.\n \"\"\"\n return self._app\n\n @property\n def interval(self):\n return self._interval\n\n @interval.setter\n def interval(self, val):\n self._interval = val\n if self.running:\n self.stop()\n self.start()\n\n @property\n def running(self):\n return self._running\n\n def start(self, interval=None, iterations=None):\n \"\"\"Start the timer.\n\n A timeout event will be generated every *interval* seconds.\n If *interval* is None, then self.interval will be used.\n\n If *iterations* is specified, the timer will stop after\n emitting that number of events. If unspecified, then\n the previous value of self.iterations will be used. 
If the value is\n negative, then the timer will continue running until stop() is called.\n \"\"\"\n self.iter_count = 0\n if interval is not None:\n self.interval = interval\n if iterations is not None:\n self.max_iterations = iterations\n self._backend._vispy_start(self.interval)\n self._running = True\n self._last_emit_time = None\n self.events.start(type='timer_start')\n\n def stop(self):\n \"\"\"Stop the timer.\"\"\"\n self._backend._vispy_stop()\n self._running = False\n self.events.stop(type='timer_stop')\n\n # use timer.app.run() and .quit() instead.\n # def run_event_loop(self):\n #\"\"\"Execute the event loop for this Timer's backend.\n #\"\"\"\n # return self._backend._vispy_run()\n\n # def quit_event_loop(self):\n #\"\"\"Exit the event loop for this Timer's backend.\n #\"\"\"\n # return self._backend._vispy_quit()\n\n @property\n def native(self):\n \"\"\" The native timer on which this Timer is based.\n \"\"\"\n return self._backend._vispy_get_native_timer()\n\n def _timeout(self, *args):\n # called when the backend timer has triggered.\n if not self.running:\n return\n if self.max_iterations >= 0 and self.iter_count >= self.max_iterations:\n self.stop()\n return\n\n # compute dt since last event\n now = precision_time()\n if self._last_emit_time is None:\n dt = None\n else:\n dt = now - self._last_emit_time\n self._last_emit_time = now\n\n self.events.timeout(\n type='timer_timeout',\n iteration=self.iter_count,\n dt=dt)\n self.iter_count += 1\n\n def connect(self, callback):\n \"\"\" Alias for self.events.timeout.connect() \"\"\"\n return self.events.timeout.connect(callback)\n\n def disconnect(self, callback=None):\n \"\"\" Alias for self.events.timeout.disconnect() \"\"\"\n return self.events.timeout.disconnect(callback)\n", "path": "vispy/app/timer.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright (c) 2014, Vispy Development Team.\n# Distributed under the (new) BSD License. See LICENSE.txt for more info.\n\nfrom __future__ import division\n\nfrom ..util.event import Event, EmitterGroup\nfrom ..util.ptime import time as precision_time\nfrom ..ext.six import string_types\nfrom .base import BaseTimerBackend as TimerBackend # noqa\nfrom . import use_app, Application\n\n\nclass Timer(object):\n\n \"\"\"Timer used to schedule events in the future or on a repeating schedule\n\n Parameters\n ----------\n interval : float\n Time between events.\n connect : function | None\n The function to call.\n iterations : int\n Number of iterations. 
Can be -1 for infinite.\n start : bool\n Whether to start the timer.\n app : instance of vispy.app.Application\n The application to attach the timer to.\n \"\"\"\n\n def __init__(self, interval=0.0, connect=None, iterations=-1, start=False,\n app=None):\n self.events = EmitterGroup(source=self,\n start=Event,\n stop=Event,\n timeout=Event)\n #self.connect = self.events.timeout.connect\n #self.disconnect = self.events.timeout.disconnect\n\n # Get app instance\n if app is None:\n self._app = use_app()\n elif isinstance(app, Application):\n self._app = app\n elif isinstance(app, string_types):\n self._app = Application(app)\n else:\n raise ValueError('Invalid value for app %r' % app)\n \n # Ensure app has backend app object\n self._app.native\n \n # Instantiate the backed with the right class\n self._backend = self._app.backend_module.TimerBackend(self)\n\n self._interval = interval\n self._running = False\n self._last_emit_time = None\n self.iter_count = 0\n self.max_iterations = iterations\n if connect is not None:\n self.connect(connect)\n if start:\n self.start()\n\n @property\n def app(self):\n \"\"\" The vispy Application instance on which this Timer is based.\n \"\"\"\n return self._app\n\n @property\n def interval(self):\n return self._interval\n\n @interval.setter\n def interval(self, val):\n self._interval = val\n if self.running:\n self.stop()\n self.start()\n\n @property\n def running(self):\n return self._running\n\n def start(self, interval=None, iterations=None):\n \"\"\"Start the timer.\n\n A timeout event will be generated every *interval* seconds.\n If *interval* is None, then self.interval will be used.\n\n If *iterations* is specified, the timer will stop after\n emitting that number of events. If unspecified, then\n the previous value of self.iterations will be used. 
If the value is\n negative, then the timer will continue running until stop() is called.\n \"\"\"\n self.iter_count = 0\n if interval is not None:\n self.interval = interval\n if iterations is not None:\n self.max_iterations = iterations\n self._backend._vispy_start(self.interval)\n self._running = True\n self._last_emit_time = precision_time()\n self.events.start(type='timer_start')\n\n def stop(self):\n \"\"\"Stop the timer.\"\"\"\n self._backend._vispy_stop()\n self._running = False\n self.events.stop(type='timer_stop')\n\n # use timer.app.run() and .quit() instead.\n # def run_event_loop(self):\n #\"\"\"Execute the event loop for this Timer's backend.\n #\"\"\"\n # return self._backend._vispy_run()\n\n # def quit_event_loop(self):\n #\"\"\"Exit the event loop for this Timer's backend.\n #\"\"\"\n # return self._backend._vispy_quit()\n\n @property\n def native(self):\n \"\"\" The native timer on which this Timer is based.\n \"\"\"\n return self._backend._vispy_get_native_timer()\n\n def _timeout(self, *args):\n # called when the backend timer has triggered.\n if not self.running:\n return\n if self.max_iterations >= 0 and self.iter_count >= self.max_iterations:\n self.stop()\n return\n\n # compute dt since last event\n now = precision_time()\n dt = now - self._last_emit_time\n self._last_emit_time = now\n\n self.events.timeout(\n type='timer_timeout',\n iteration=self.iter_count,\n dt=dt)\n self.iter_count += 1\n\n def connect(self, callback):\n \"\"\" Alias for self.events.timeout.connect() \"\"\"\n return self.events.timeout.connect(callback)\n\n def disconnect(self, callback=None):\n \"\"\" Alias for self.events.timeout.disconnect() \"\"\"\n return self.events.timeout.disconnect(callback)\n", "path": "vispy/app/timer.py"}]} | 1,767 | 204 |
gh_patches_debug_24108 | rasdani/github-patches | git_diff | pypa__pip-11264 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Checking out Bazaar branch makes full clone
When checking out a Bazaar branch, pip currently makes a full clone of the branch history. This is unnecessary and much slower than just fetching the latest revision:
For example, performance on my system for 'bzr co --lightweight lp:bzr':
0.60s user 0.11s system 5% cpu 12.234 total
Performance on my system for 'bzr branch lp:bzr':
65.41s user 1.48s system 39% cpu 2:47.91 total
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pip/_internal/vcs/bazaar.py`
Content:
```
1 import logging
2 from typing import List, Optional, Tuple
3
4 from pip._internal.utils.misc import HiddenText, display_path
5 from pip._internal.utils.subprocess import make_command
6 from pip._internal.utils.urls import path_to_url
7 from pip._internal.vcs.versioncontrol import (
8 AuthInfo,
9 RemoteNotFoundError,
10 RevOptions,
11 VersionControl,
12 vcs,
13 )
14
15 logger = logging.getLogger(__name__)
16
17
18 class Bazaar(VersionControl):
19 name = "bzr"
20 dirname = ".bzr"
21 repo_name = "branch"
22 schemes = (
23 "bzr+http",
24 "bzr+https",
25 "bzr+ssh",
26 "bzr+sftp",
27 "bzr+ftp",
28 "bzr+lp",
29 "bzr+file",
30 )
31
32 @staticmethod
33 def get_base_rev_args(rev: str) -> List[str]:
34 return ["-r", rev]
35
36 def fetch_new(
37 self, dest: str, url: HiddenText, rev_options: RevOptions, verbosity: int
38 ) -> None:
39 rev_display = rev_options.to_display()
40 logger.info(
41 "Checking out %s%s to %s",
42 url,
43 rev_display,
44 display_path(dest),
45 )
46 if verbosity <= 0:
47 flag = "--quiet"
48 elif verbosity == 1:
49 flag = ""
50 else:
51 flag = f"-{'v'*verbosity}"
52 cmd_args = make_command("branch", flag, rev_options.to_args(), url, dest)
53 self.run_command(cmd_args)
54
55 def switch(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:
56 self.run_command(make_command("switch", url), cwd=dest)
57
58 def update(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:
59 cmd_args = make_command("pull", "-q", rev_options.to_args())
60 self.run_command(cmd_args, cwd=dest)
61
62 @classmethod
63 def get_url_rev_and_auth(cls, url: str) -> Tuple[str, Optional[str], AuthInfo]:
64 # hotfix the URL scheme after removing bzr+ from bzr+ssh:// readd it
65 url, rev, user_pass = super().get_url_rev_and_auth(url)
66 if url.startswith("ssh://"):
67 url = "bzr+" + url
68 return url, rev, user_pass
69
70 @classmethod
71 def get_remote_url(cls, location: str) -> str:
72 urls = cls.run_command(
73 ["info"], show_stdout=False, stdout_only=True, cwd=location
74 )
75 for line in urls.splitlines():
76 line = line.strip()
77 for x in ("checkout of branch: ", "parent branch: "):
78 if line.startswith(x):
79 repo = line.split(x)[1]
80 if cls._is_local_repository(repo):
81 return path_to_url(repo)
82 return repo
83 raise RemoteNotFoundError
84
85 @classmethod
86 def get_revision(cls, location: str) -> str:
87 revision = cls.run_command(
88 ["revno"],
89 show_stdout=False,
90 stdout_only=True,
91 cwd=location,
92 )
93 return revision.splitlines()[-1]
94
95 @classmethod
96 def is_commit_id_equal(cls, dest: str, name: Optional[str]) -> bool:
97 """Always assume the versions don't match"""
98 return False
99
100
101 vcs.register(Bazaar)
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pip/_internal/vcs/bazaar.py b/src/pip/_internal/vcs/bazaar.py
--- a/src/pip/_internal/vcs/bazaar.py
+++ b/src/pip/_internal/vcs/bazaar.py
@@ -49,14 +49,25 @@
flag = ""
else:
flag = f"-{'v'*verbosity}"
- cmd_args = make_command("branch", flag, rev_options.to_args(), url, dest)
+ cmd_args = make_command(
+ "checkout", "--lightweight", flag, rev_options.to_args(), url, dest
+ )
self.run_command(cmd_args)
def switch(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:
self.run_command(make_command("switch", url), cwd=dest)
def update(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:
- cmd_args = make_command("pull", "-q", rev_options.to_args())
+ output = self.run_command(
+ make_command("info"), show_stdout=False, stdout_only=True, cwd=dest
+ )
+ if output.startswith("Standalone "):
+ # Older versions of pip used to create standalone branches.
+ # Convert the standalone branch to a checkout by calling "bzr bind".
+ cmd_args = make_command("bind", "-q", url)
+ self.run_command(cmd_args, cwd=dest)
+
+ cmd_args = make_command("update", "-q", rev_options.to_args())
self.run_command(cmd_args, cwd=dest)
@classmethod
| {"golden_diff": "diff --git a/src/pip/_internal/vcs/bazaar.py b/src/pip/_internal/vcs/bazaar.py\n--- a/src/pip/_internal/vcs/bazaar.py\n+++ b/src/pip/_internal/vcs/bazaar.py\n@@ -49,14 +49,25 @@\n flag = \"\"\n else:\n flag = f\"-{'v'*verbosity}\"\n- cmd_args = make_command(\"branch\", flag, rev_options.to_args(), url, dest)\n+ cmd_args = make_command(\n+ \"checkout\", \"--lightweight\", flag, rev_options.to_args(), url, dest\n+ )\n self.run_command(cmd_args)\n \n def switch(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n self.run_command(make_command(\"switch\", url), cwd=dest)\n \n def update(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n- cmd_args = make_command(\"pull\", \"-q\", rev_options.to_args())\n+ output = self.run_command(\n+ make_command(\"info\"), show_stdout=False, stdout_only=True, cwd=dest\n+ )\n+ if output.startswith(\"Standalone \"):\n+ # Older versions of pip used to create standalone branches.\n+ # Convert the standalone branch to a checkout by calling \"bzr bind\".\n+ cmd_args = make_command(\"bind\", \"-q\", url)\n+ self.run_command(cmd_args, cwd=dest)\n+\n+ cmd_args = make_command(\"update\", \"-q\", rev_options.to_args())\n self.run_command(cmd_args, cwd=dest)\n \n @classmethod\n", "issue": "Checking out Bazaar branch makes full clone\nWhen checking out a Bazaar branch, pip currently makes a full clone of the branch history. This is unnecessary and much slower than just fetching the latest revision:\r\n\r\nFor example, performance on my system for 'bzr co --lightweight lp:bzr':\r\n\r\n0.60s user 0.11s system 5% cpu 12.234 total\r\n\r\nPerformance on my system for 'bzr branch lp:bzr':\r\n\r\n65.41s user 1.48s system 39% cpu 2:47.91 total\r\n\n", "before_files": [{"content": "import logging\nfrom typing import List, Optional, Tuple\n\nfrom pip._internal.utils.misc import HiddenText, display_path\nfrom pip._internal.utils.subprocess import make_command\nfrom pip._internal.utils.urls import path_to_url\nfrom pip._internal.vcs.versioncontrol import (\n AuthInfo,\n RemoteNotFoundError,\n RevOptions,\n VersionControl,\n vcs,\n)\n\nlogger = logging.getLogger(__name__)\n\n\nclass Bazaar(VersionControl):\n name = \"bzr\"\n dirname = \".bzr\"\n repo_name = \"branch\"\n schemes = (\n \"bzr+http\",\n \"bzr+https\",\n \"bzr+ssh\",\n \"bzr+sftp\",\n \"bzr+ftp\",\n \"bzr+lp\",\n \"bzr+file\",\n )\n\n @staticmethod\n def get_base_rev_args(rev: str) -> List[str]:\n return [\"-r\", rev]\n\n def fetch_new(\n self, dest: str, url: HiddenText, rev_options: RevOptions, verbosity: int\n ) -> None:\n rev_display = rev_options.to_display()\n logger.info(\n \"Checking out %s%s to %s\",\n url,\n rev_display,\n display_path(dest),\n )\n if verbosity <= 0:\n flag = \"--quiet\"\n elif verbosity == 1:\n flag = \"\"\n else:\n flag = f\"-{'v'*verbosity}\"\n cmd_args = make_command(\"branch\", flag, rev_options.to_args(), url, dest)\n self.run_command(cmd_args)\n\n def switch(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n self.run_command(make_command(\"switch\", url), cwd=dest)\n\n def update(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n cmd_args = make_command(\"pull\", \"-q\", rev_options.to_args())\n self.run_command(cmd_args, cwd=dest)\n\n @classmethod\n def get_url_rev_and_auth(cls, url: str) -> Tuple[str, Optional[str], AuthInfo]:\n # hotfix the URL scheme after removing bzr+ from bzr+ssh:// readd it\n url, rev, user_pass = super().get_url_rev_and_auth(url)\n if url.startswith(\"ssh://\"):\n url = 
\"bzr+\" + url\n return url, rev, user_pass\n\n @classmethod\n def get_remote_url(cls, location: str) -> str:\n urls = cls.run_command(\n [\"info\"], show_stdout=False, stdout_only=True, cwd=location\n )\n for line in urls.splitlines():\n line = line.strip()\n for x in (\"checkout of branch: \", \"parent branch: \"):\n if line.startswith(x):\n repo = line.split(x)[1]\n if cls._is_local_repository(repo):\n return path_to_url(repo)\n return repo\n raise RemoteNotFoundError\n\n @classmethod\n def get_revision(cls, location: str) -> str:\n revision = cls.run_command(\n [\"revno\"],\n show_stdout=False,\n stdout_only=True,\n cwd=location,\n )\n return revision.splitlines()[-1]\n\n @classmethod\n def is_commit_id_equal(cls, dest: str, name: Optional[str]) -> bool:\n \"\"\"Always assume the versions don't match\"\"\"\n return False\n\n\nvcs.register(Bazaar)\n", "path": "src/pip/_internal/vcs/bazaar.py"}], "after_files": [{"content": "import logging\nfrom typing import List, Optional, Tuple\n\nfrom pip._internal.utils.misc import HiddenText, display_path\nfrom pip._internal.utils.subprocess import make_command\nfrom pip._internal.utils.urls import path_to_url\nfrom pip._internal.vcs.versioncontrol import (\n AuthInfo,\n RemoteNotFoundError,\n RevOptions,\n VersionControl,\n vcs,\n)\n\nlogger = logging.getLogger(__name__)\n\n\nclass Bazaar(VersionControl):\n name = \"bzr\"\n dirname = \".bzr\"\n repo_name = \"branch\"\n schemes = (\n \"bzr+http\",\n \"bzr+https\",\n \"bzr+ssh\",\n \"bzr+sftp\",\n \"bzr+ftp\",\n \"bzr+lp\",\n \"bzr+file\",\n )\n\n @staticmethod\n def get_base_rev_args(rev: str) -> List[str]:\n return [\"-r\", rev]\n\n def fetch_new(\n self, dest: str, url: HiddenText, rev_options: RevOptions, verbosity: int\n ) -> None:\n rev_display = rev_options.to_display()\n logger.info(\n \"Checking out %s%s to %s\",\n url,\n rev_display,\n display_path(dest),\n )\n if verbosity <= 0:\n flag = \"--quiet\"\n elif verbosity == 1:\n flag = \"\"\n else:\n flag = f\"-{'v'*verbosity}\"\n cmd_args = make_command(\n \"checkout\", \"--lightweight\", flag, rev_options.to_args(), url, dest\n )\n self.run_command(cmd_args)\n\n def switch(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n self.run_command(make_command(\"switch\", url), cwd=dest)\n\n def update(self, dest: str, url: HiddenText, rev_options: RevOptions) -> None:\n output = self.run_command(\n make_command(\"info\"), show_stdout=False, stdout_only=True, cwd=dest\n )\n if output.startswith(\"Standalone \"):\n # Older versions of pip used to create standalone branches.\n # Convert the standalone branch to a checkout by calling \"bzr bind\".\n cmd_args = make_command(\"bind\", \"-q\", url)\n self.run_command(cmd_args, cwd=dest)\n\n cmd_args = make_command(\"update\", \"-q\", rev_options.to_args())\n self.run_command(cmd_args, cwd=dest)\n\n @classmethod\n def get_url_rev_and_auth(cls, url: str) -> Tuple[str, Optional[str], AuthInfo]:\n # hotfix the URL scheme after removing bzr+ from bzr+ssh:// readd it\n url, rev, user_pass = super().get_url_rev_and_auth(url)\n if url.startswith(\"ssh://\"):\n url = \"bzr+\" + url\n return url, rev, user_pass\n\n @classmethod\n def get_remote_url(cls, location: str) -> str:\n urls = cls.run_command(\n [\"info\"], show_stdout=False, stdout_only=True, cwd=location\n )\n for line in urls.splitlines():\n line = line.strip()\n for x in (\"checkout of branch: \", \"parent branch: \"):\n if line.startswith(x):\n repo = line.split(x)[1]\n if cls._is_local_repository(repo):\n return path_to_url(repo)\n 
return repo\n raise RemoteNotFoundError\n\n @classmethod\n def get_revision(cls, location: str) -> str:\n revision = cls.run_command(\n [\"revno\"],\n show_stdout=False,\n stdout_only=True,\n cwd=location,\n )\n return revision.splitlines()[-1]\n\n @classmethod\n def is_commit_id_equal(cls, dest: str, name: Optional[str]) -> bool:\n \"\"\"Always assume the versions don't match\"\"\"\n return False\n\n\nvcs.register(Bazaar)\n", "path": "src/pip/_internal/vcs/bazaar.py"}]} | 1,332 | 355 |
gh_patches_debug_5737 | rasdani/github-patches | git_diff | spesmilo__electrum-1738 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Feature request: Label colour - History Window
The transaction amount of outgoing transactions, when viewed in the history window, is coloured red.
It would be nice if the label of these transactions could be coloured the same red colour so that they stand out more against the other incoming transactions. 
Not a big issue but would be a 'nice to have'.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gui/qt/history_widget.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # Electrum - lightweight Bitcoin client
4 # Copyright (C) 2015 Thomas Voegtlin
5 #
6 # Permission is hereby granted, free of charge, to any person
7 # obtaining a copy of this software and associated documentation files
8 # (the "Software"), to deal in the Software without restriction,
9 # including without limitation the rights to use, copy, modify, merge,
10 # publish, distribute, sublicense, and/or sell copies of the Software,
11 # and to permit persons to whom the Software is furnished to do so,
12 # subject to the following conditions:
13 #
14 # The above copyright notice and this permission notice shall be
15 # included in all copies or substantial portions of the Software.
16 #
17 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
18 # EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
19 # MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
20 # NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
21 # BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
22 # ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
23 # CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
24 # SOFTWARE.
25
26
27 import webbrowser
28
29 from util import *
30 from electrum.i18n import _
31 from electrum.util import block_explorer_URL, format_satoshis, format_time
32 from electrum.plugins import run_hook
33
34
35 class HistoryWidget(MyTreeWidget):
36
37 def __init__(self, parent=None):
38 MyTreeWidget.__init__(self, parent, self.create_menu, [], 3)
39 self.refresh_headers()
40 self.setColumnHidden(1, True)
41 self.config = self.parent.config
42
43 def refresh_headers(self):
44 headers = ['', '', _('Date'), _('Description') , _('Amount'),
45 _('Balance')]
46 run_hook('history_tab_headers', headers)
47 self.update_headers(headers)
48
49 def get_icon(self, conf, timestamp):
50 time_str = _("unknown")
51 if conf > 0:
52 time_str = format_time(timestamp)
53 if conf == -1:
54 time_str = _('Not Verified')
55 icon = QIcon(":icons/unconfirmed.png")
56 elif conf == 0:
57 time_str = _('Unconfirmed')
58 icon = QIcon(":icons/unconfirmed.png")
59 elif conf < 6:
60 icon = QIcon(":icons/clock%d.png"%conf)
61 else:
62 icon = QIcon(":icons/confirmed.png")
63 return icon, time_str
64
65 def get_domain(self):
66 '''Replaced in address_dialog.py'''
67 return self.wallet.get_account_addresses(self.parent.current_account)
68
69 def on_update(self):
70 self.wallet = self.parent.wallet
71 h = self.wallet.get_history(self.get_domain())
72
73 item = self.currentItem()
74 current_tx = item.data(0, Qt.UserRole).toString() if item else None
75 self.clear()
76 run_hook('history_tab_update_begin')
77 for tx in h:
78 tx_hash, conf, value, timestamp, balance = tx
79 if conf is None and timestamp is None:
80 continue # skip history in offline mode
81 icon, time_str = self.get_icon(conf, timestamp)
82 v_str = self.parent.format_amount(value, True, whitespaces=True)
83 balance_str = self.parent.format_amount(balance, whitespaces=True)
84 label = self.wallet.get_label(tx_hash)
85 entry = ['', tx_hash, time_str, label, v_str, balance_str]
86 run_hook('history_tab_update', tx, entry)
87 item = QTreeWidgetItem(entry)
88 item.setIcon(0, icon)
89 for i in range(len(entry)):
90 if i>3:
91 item.setTextAlignment(i, Qt.AlignRight)
92 if i!=2:
93 item.setFont(i, QFont(MONOSPACE_FONT))
94 if value < 0:
95 item.setForeground(4, QBrush(QColor("#BC1E1E")))
96 if tx_hash:
97 item.setData(0, Qt.UserRole, tx_hash)
98 self.insertTopLevelItem(0, item)
99 if current_tx == tx_hash:
100 self.setCurrentItem(item)
101
102 def update_item(self, tx_hash, conf, timestamp):
103 icon, time_str = self.get_icon(conf, timestamp)
104 items = self.findItems(tx_hash, Qt.UserRole|Qt.MatchContains|Qt.MatchRecursive, column=1)
105 if items:
106 item = items[0]
107 item.setIcon(0, icon)
108 item.setText(2, time_str)
109
110 def create_menu(self, position):
111 self.selectedIndexes()
112 item = self.currentItem()
113 if not item:
114 return
115 tx_hash = str(item.data(0, Qt.UserRole).toString())
116 if not tx_hash:
117 return
118 tx_URL = block_explorer_URL(self.config, 'tx', tx_hash)
119 if not tx_URL:
120 return
121 menu = QMenu()
122 menu.addAction(_("Copy ID to Clipboard"), lambda: self.parent.app.clipboard().setText(tx_hash))
123 menu.addAction(_("Details"), lambda: self.parent.show_transaction(self.wallet.transactions.get(tx_hash)))
124 menu.addAction(_("Edit description"), lambda: self.editItem(item, self.editable_columns[0]))
125 menu.addAction(_("View on block explorer"), lambda: webbrowser.open(tx_URL))
126 menu.exec_(self.viewport().mapToGlobal(position))
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gui/qt/history_widget.py b/gui/qt/history_widget.py
--- a/gui/qt/history_widget.py
+++ b/gui/qt/history_widget.py
@@ -92,6 +92,7 @@
if i!=2:
item.setFont(i, QFont(MONOSPACE_FONT))
if value < 0:
+ item.setForeground(3, QBrush(QColor("#BC1E1E")))
item.setForeground(4, QBrush(QColor("#BC1E1E")))
if tx_hash:
item.setData(0, Qt.UserRole, tx_hash)
| {"golden_diff": "diff --git a/gui/qt/history_widget.py b/gui/qt/history_widget.py\n--- a/gui/qt/history_widget.py\n+++ b/gui/qt/history_widget.py\n@@ -92,6 +92,7 @@\n if i!=2:\n item.setFont(i, QFont(MONOSPACE_FONT))\n if value < 0:\n+ item.setForeground(3, QBrush(QColor(\"#BC1E1E\")))\n item.setForeground(4, QBrush(QColor(\"#BC1E1E\")))\n if tx_hash:\n item.setData(0, Qt.UserRole, tx_hash)\n", "issue": "Feature request: Label colour - History Window\nThe transaction amount of outgoing transactions, when viewed in the history window, is coloured red. \n\nIt would be a nice if the label of these transactions could be coloured the same red colour so that they stand out more against the other incoming transactions. \n\nNot a big issue but would be a 'nice to have'.\n\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# Electrum - lightweight Bitcoin client\n# Copyright (C) 2015 Thomas Voegtlin\n#\n# Permission is hereby granted, free of charge, to any person\n# obtaining a copy of this software and associated documentation files\n# (the \"Software\"), to deal in the Software without restriction,\n# including without limitation the rights to use, copy, modify, merge,\n# publish, distribute, sublicense, and/or sell copies of the Software,\n# and to permit persons to whom the Software is furnished to do so,\n# subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be\n# included in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\n# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\n# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\n# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS\n# BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN\n# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\n# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\n\nimport webbrowser\n\nfrom util import *\nfrom electrum.i18n import _\nfrom electrum.util import block_explorer_URL, format_satoshis, format_time\nfrom electrum.plugins import run_hook\n\n\nclass HistoryWidget(MyTreeWidget):\n\n def __init__(self, parent=None):\n MyTreeWidget.__init__(self, parent, self.create_menu, [], 3)\n self.refresh_headers()\n self.setColumnHidden(1, True)\n self.config = self.parent.config\n\n def refresh_headers(self):\n headers = ['', '', _('Date'), _('Description') , _('Amount'),\n _('Balance')]\n run_hook('history_tab_headers', headers)\n self.update_headers(headers)\n\n def get_icon(self, conf, timestamp):\n time_str = _(\"unknown\")\n if conf > 0:\n time_str = format_time(timestamp)\n if conf == -1:\n time_str = _('Not Verified')\n icon = QIcon(\":icons/unconfirmed.png\")\n elif conf == 0:\n time_str = _('Unconfirmed')\n icon = QIcon(\":icons/unconfirmed.png\")\n elif conf < 6:\n icon = QIcon(\":icons/clock%d.png\"%conf)\n else:\n icon = QIcon(\":icons/confirmed.png\")\n return icon, time_str\n\n def get_domain(self):\n '''Replaced in address_dialog.py'''\n return self.wallet.get_account_addresses(self.parent.current_account)\n\n def on_update(self):\n self.wallet = self.parent.wallet\n h = self.wallet.get_history(self.get_domain())\n\n item = self.currentItem()\n current_tx = item.data(0, Qt.UserRole).toString() if item else None\n self.clear()\n run_hook('history_tab_update_begin')\n for tx in h:\n tx_hash, conf, value, timestamp, balance = tx\n if conf is None and timestamp is None:\n 
continue # skip history in offline mode\n icon, time_str = self.get_icon(conf, timestamp)\n v_str = self.parent.format_amount(value, True, whitespaces=True)\n balance_str = self.parent.format_amount(balance, whitespaces=True)\n label = self.wallet.get_label(tx_hash)\n entry = ['', tx_hash, time_str, label, v_str, balance_str]\n run_hook('history_tab_update', tx, entry)\n item = QTreeWidgetItem(entry)\n item.setIcon(0, icon)\n for i in range(len(entry)):\n if i>3:\n item.setTextAlignment(i, Qt.AlignRight)\n if i!=2:\n item.setFont(i, QFont(MONOSPACE_FONT))\n if value < 0:\n item.setForeground(4, QBrush(QColor(\"#BC1E1E\")))\n if tx_hash:\n item.setData(0, Qt.UserRole, tx_hash)\n self.insertTopLevelItem(0, item)\n if current_tx == tx_hash:\n self.setCurrentItem(item)\n\n def update_item(self, tx_hash, conf, timestamp):\n icon, time_str = self.get_icon(conf, timestamp)\n items = self.findItems(tx_hash, Qt.UserRole|Qt.MatchContains|Qt.MatchRecursive, column=1)\n if items:\n item = items[0]\n item.setIcon(0, icon)\n item.setText(2, time_str)\n\n def create_menu(self, position):\n self.selectedIndexes()\n item = self.currentItem()\n if not item:\n return\n tx_hash = str(item.data(0, Qt.UserRole).toString())\n if not tx_hash:\n return\n tx_URL = block_explorer_URL(self.config, 'tx', tx_hash)\n if not tx_URL:\n return\n menu = QMenu()\n menu.addAction(_(\"Copy ID to Clipboard\"), lambda: self.parent.app.clipboard().setText(tx_hash))\n menu.addAction(_(\"Details\"), lambda: self.parent.show_transaction(self.wallet.transactions.get(tx_hash)))\n menu.addAction(_(\"Edit description\"), lambda: self.editItem(item, self.editable_columns[0]))\n menu.addAction(_(\"View on block explorer\"), lambda: webbrowser.open(tx_URL))\n menu.exec_(self.viewport().mapToGlobal(position))\n", "path": "gui/qt/history_widget.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# Electrum - lightweight Bitcoin client\n# Copyright (C) 2015 Thomas Voegtlin\n#\n# Permission is hereby granted, free of charge, to any person\n# obtaining a copy of this software and associated documentation files\n# (the \"Software\"), to deal in the Software without restriction,\n# including without limitation the rights to use, copy, modify, merge,\n# publish, distribute, sublicense, and/or sell copies of the Software,\n# and to permit persons to whom the Software is furnished to do so,\n# subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be\n# included in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\n# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\n# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\n# NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS\n# BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN\n# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\n# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\n\nimport webbrowser\n\nfrom util import *\nfrom electrum.i18n import _\nfrom electrum.util import block_explorer_URL, format_satoshis, format_time\nfrom electrum.plugins import run_hook\n\n\nclass HistoryWidget(MyTreeWidget):\n\n def __init__(self, parent=None):\n MyTreeWidget.__init__(self, parent, self.create_menu, [], 3)\n self.refresh_headers()\n self.setColumnHidden(1, True)\n self.config = self.parent.config\n\n def refresh_headers(self):\n headers = ['', '', _('Date'), _('Description') , _('Amount'),\n _('Balance')]\n run_hook('history_tab_headers', headers)\n self.update_headers(headers)\n\n def get_icon(self, conf, timestamp):\n time_str = _(\"unknown\")\n if conf > 0:\n time_str = format_time(timestamp)\n if conf == -1:\n time_str = _('Not Verified')\n icon = QIcon(\":icons/unconfirmed.png\")\n elif conf == 0:\n time_str = _('Unconfirmed')\n icon = QIcon(\":icons/unconfirmed.png\")\n elif conf < 6:\n icon = QIcon(\":icons/clock%d.png\"%conf)\n else:\n icon = QIcon(\":icons/confirmed.png\")\n return icon, time_str\n\n def get_domain(self):\n '''Replaced in address_dialog.py'''\n return self.wallet.get_account_addresses(self.parent.current_account)\n\n def on_update(self):\n self.wallet = self.parent.wallet\n h = self.wallet.get_history(self.get_domain())\n\n item = self.currentItem()\n current_tx = item.data(0, Qt.UserRole).toString() if item else None\n self.clear()\n run_hook('history_tab_update_begin')\n for tx in h:\n tx_hash, conf, value, timestamp, balance = tx\n if conf is None and timestamp is None:\n continue # skip history in offline mode\n icon, time_str = self.get_icon(conf, timestamp)\n v_str = self.parent.format_amount(value, True, whitespaces=True)\n balance_str = self.parent.format_amount(balance, whitespaces=True)\n label = self.wallet.get_label(tx_hash)\n entry = ['', tx_hash, time_str, label, v_str, balance_str]\n run_hook('history_tab_update', tx, entry)\n item = QTreeWidgetItem(entry)\n item.setIcon(0, icon)\n for i in range(len(entry)):\n if i>3:\n item.setTextAlignment(i, Qt.AlignRight)\n if i!=2:\n item.setFont(i, QFont(MONOSPACE_FONT))\n if value < 0:\n item.setForeground(3, QBrush(QColor(\"#BC1E1E\")))\n item.setForeground(4, QBrush(QColor(\"#BC1E1E\")))\n if tx_hash:\n item.setData(0, Qt.UserRole, tx_hash)\n self.insertTopLevelItem(0, item)\n if current_tx == tx_hash:\n self.setCurrentItem(item)\n\n def update_item(self, tx_hash, conf, timestamp):\n icon, time_str = self.get_icon(conf, timestamp)\n items = self.findItems(tx_hash, Qt.UserRole|Qt.MatchContains|Qt.MatchRecursive, column=1)\n if items:\n item = items[0]\n item.setIcon(0, icon)\n item.setText(2, time_str)\n\n def create_menu(self, position):\n self.selectedIndexes()\n item = self.currentItem()\n if not item:\n return\n tx_hash = str(item.data(0, Qt.UserRole).toString())\n if not tx_hash:\n return\n tx_URL = block_explorer_URL(self.config, 'tx', tx_hash)\n if not tx_URL:\n return\n menu = QMenu()\n menu.addAction(_(\"Copy ID to Clipboard\"), lambda: self.parent.app.clipboard().setText(tx_hash))\n menu.addAction(_(\"Details\"), lambda: self.parent.show_transaction(self.wallet.transactions.get(tx_hash)))\n menu.addAction(_(\"Edit description\"), lambda: self.editItem(item, self.editable_columns[0]))\n menu.addAction(_(\"View 
on block explorer\"), lambda: webbrowser.open(tx_URL))\n menu.exec_(self.viewport().mapToGlobal(position))\n", "path": "gui/qt/history_widget.py"}]} | 1,727 | 122 |
gh_patches_debug_627 | rasdani/github-patches | git_diff | bridgecrewio__checkov-3151 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Terraform parsing error string with escaped backslash at the end
**Describe the issue**
Checkov crashes if it encounters an escaped backslash (`"\\"`) at the end of a string.
**Examples**
Minimal example to reproduce the error:
```terraform
variable "slash" {
default = "\\"
}
output "slash" {
value = var.slash
}
```
`terraform validate` sees this configuration as valid, but checkov fails with a parsing error.
This only happens when the last character of the string is the escaped backslash, as the parser assumes the closing quotation mark is escaped. Adding any normal character at the end of the string doesn't trigger this error.
```terraform
variable "slash" {
default = "\\"
}
```
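For illustration only (this is not the bc-python-hcl2 source, just a sketch of the suspected failure mode): a quote counter that treats any quote preceded by a backslash as escaped will flag the valid line as unclosed, because the backslash before the closing quote is itself escaped.

```python
def has_unclosed_quote(line: str) -> bool:
    # Naive scan: a '"' preceded by '\' is assumed to be escaped.
    # That assumption is wrong for "\\", where the backslash before the
    # closing quote is itself escaped and the quote really does close the string.
    open_quote = False
    for i, ch in enumerate(line):
        if ch == '"' and (i == 0 or line[i - 1] != "\\"):
            open_quote = not open_quote
    return open_quote

print(has_unclosed_quote('  default = "\\\\"'))   # True  -> spurious "unclosed quote marks" error
print(has_unclosed_quote('  default = "\\\\x"'))  # False -> adding a trailing character hides the bug
```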
**Exception Trace**
Relevant traceback
```sh
> LOG_LEVEL=DEBUG checkov -d .
[...]
[MainThread ] [DEBUG] failed while parsing file /workdir/main.tf
Traceback (most recent call last):
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/checkov/terraform/parser.py", line 726, in _load_or_die_quietly
raw_data = hcl2.load(f)
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 12, in load
return loads(file.read())
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 80, in loads
raise ValueError(f"Line has unclosed quote marks: {line}")
ValueError: Line has unclosed quote marks: default = "\\"
[...]
```
**Desktop (please complete the following information):**
- OS: MacOS 12.3.1 (Intel)
- Checkov Version: 2.0.1230
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 import logging
3 import os
4 from importlib import util
5 from os import path
6
7 import setuptools
8 from setuptools import setup
9
10 # read the contents of your README file
11 this_directory = path.abspath(path.dirname(__file__))
12 with open(path.join(this_directory, "README.md"), encoding="utf-8") as f:
13 long_description = f.read()
14
15 logger = logging.getLogger(__name__)
16 spec = util.spec_from_file_location(
17 "checkov.version", os.path.join("checkov", "version.py")
18 )
19 # noinspection PyUnresolvedReferences
20 mod = util.module_from_spec(spec)
21 spec.loader.exec_module(mod) # type: ignore
22 version = mod.version # type: ignore
23
24 setup(
25 extras_require={
26 "dev": [
27 "pytest==5.3.1",
28 "coverage==5.5",
29 "coverage-badge",
30 "GitPython==3.1.7",
31 "bandit",
32 "jsonschema",
33 ]
34 },
35 install_requires=[
36 "bc-python-hcl2==0.3.42",
37 "cloudsplaining>=0.4.1",
38 "deep_merge",
39 "tabulate",
40 "colorama",
41 "termcolor",
42 "junit-xml>=1.9",
43 "dpath>=1.5.0,<2",
44 "pyyaml>=5.4.1",
45 "boto3>=1.17",
46 "GitPython",
47 "jmespath",
48 "tqdm",
49 "update_checker",
50 "semantic_version",
51 "packaging",
52 "networkx",
53 "dockerfile-parse",
54 "docker",
55 "configargparse",
56 "argcomplete",
57 "detect-secrets",
58 "policyuniverse",
59 "typing-extensions>=4.1.0",
60 "cachetools",
61 "cyclonedx-python-lib>=2.4.0",
62 "click>=8.0.0",
63 "aiohttp",
64 "aiodns",
65 "aiomultiprocess",
66 "jsonpath_ng",
67 "jsonschema~=3.0",
68 "prettytable>=3.0.0",
69 "pycep-parser==0.3.7",
70 "charset-normalizer",
71 ],
72 license="Apache License 2.0",
73 name="checkov",
74 version=version,
75 python_requires=">=3.7",
76 description="Infrastructure as code static analysis",
77 author="bridgecrew",
78 author_email="[email protected]",
79 url="https://github.com/bridgecrewio/checkov",
80 packages=setuptools.find_packages(exclude=["tests*", "integration_tests*"]),
81 include_package_data=True,
82 package_dir={
83 "checkov.bicep.checks.graph_checks": "checkov/bicep/checks/graph_checks",
84 "checkov.terraform.checks.graph_checks": "checkov/terraform/checks/graph_checks",
85 },
86 package_data={
87 "checkov": ["py.typed"],
88 "checkov.bicep.checks.graph_checks": ["*.yaml"],
89 "checkov.common.util.templates": ["*.jinja2"],
90 "checkov.terraform.checks.graph_checks": [
91 "aws/*.yaml",
92 "gcp/*.yaml",
93 "azure/*.yaml",
94 ],
95 },
96 scripts=["bin/checkov", "bin/checkov.cmd"],
97 long_description=long_description,
98 long_description_content_type="text/markdown",
99 classifiers=[
100 "Environment :: Console",
101 "Intended Audience :: Developers",
102 "Intended Audience :: System Administrators",
103 "License :: OSI Approved :: Apache Software License",
104 "Programming Language :: Python :: 3 :: Only",
105 "Programming Language :: Python :: 3.7",
106 "Programming Language :: Python :: 3.8",
107 "Programming Language :: Python :: 3.9",
108 "Programming Language :: Python :: 3.10",
109 "Topic :: Security",
110 "Topic :: Software Development :: Build Tools",
111 "Typing :: Typed",
112 ],
113 )
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -33,7 +33,7 @@
]
},
install_requires=[
- "bc-python-hcl2==0.3.42",
+ "bc-python-hcl2==0.3.44",
"cloudsplaining>=0.4.1",
"deep_merge",
"tabulate",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -33,7 +33,7 @@\n ]\n },\n install_requires=[\n- \"bc-python-hcl2==0.3.42\",\n+ \"bc-python-hcl2==0.3.44\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n", "issue": "Terraform parsing error string with escaped backslash at the end\n**Describe the issue**\r\nCheckov crashes if it encounters an escaped backslash (`\"\\\\\"`) at the end of a string.\r\n\r\n**Examples**\r\nMinimal example to reproduce the error:\r\n```terraform\r\nvariable \"slash\" {\r\n default = \"\\\\\"\r\n}\r\n\r\noutput \"slash\" {\r\n value = var.slash\r\n}\r\n```\r\n`terraform validate` sees this configuration as valid, but checkov fails with a parsing error.\r\n\r\nThis only happens when the last character of the string is the escaped backslash, as the parser assumes the closing quotation mark is escaped. Adding any normal character at the end of the string doesn't trigger this error.\r\n```terraform\r\nvariable \"slash\" {\r\n default = \"\\\\\"\r\n}\r\n```\r\n\r\n**Exception Trace**\r\nRelevant traceback\r\n```sh\r\n> LOG_LEVEL=DEBUG checkov -d .\r\n[...]\r\n[MainThread ] [DEBUG] failed while parsing file /workdir/main.tf\r\nTraceback (most recent call last):\r\n File \"/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/checkov/terraform/parser.py\", line 726, in _load_or_die_quietly\r\n raw_data = hcl2.load(f)\r\n File \"/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py\", line 12, in load\r\n return loads(file.read())\r\n File \"/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py\", line 80, in loads\r\n raise ValueError(f\"Line has unclosed quote marks: {line}\")\r\nValueError: Line has unclosed quote marks: default = \"\\\\\"\r\n[...]\r\n```\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: MacOS 12.3.1 (Intel)\r\n - Checkov Version: 2.0.1230\n", "before_files": [{"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.42\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions>=4.1.0\",\n \"cachetools\",\n \"cyclonedx-python-lib>=2.4.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.7\",\n 
\"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov\": [\"py.typed\"],\n \"checkov.bicep.checks.graph_checks\": [\"*.yaml\"],\n \"checkov.common.util.templates\": [\"*.jinja2\"],\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n \"Typing :: Typed\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nlogger = logging.getLogger(__name__)\nspec = util.spec_from_file_location(\n \"checkov.version\", os.path.join(\"checkov\", \"version.py\")\n)\n# noinspection PyUnresolvedReferences\nmod = util.module_from_spec(spec)\nspec.loader.exec_module(mod) # type: ignore\nversion = mod.version # type: ignore\n\nsetup(\n extras_require={\n \"dev\": [\n \"pytest==5.3.1\",\n \"coverage==5.5\",\n \"coverage-badge\",\n \"GitPython==3.1.7\",\n \"bandit\",\n \"jsonschema\",\n ]\n },\n install_requires=[\n \"bc-python-hcl2==0.3.44\",\n \"cloudsplaining>=0.4.1\",\n \"deep_merge\",\n \"tabulate\",\n \"colorama\",\n \"termcolor\",\n \"junit-xml>=1.9\",\n \"dpath>=1.5.0,<2\",\n \"pyyaml>=5.4.1\",\n \"boto3>=1.17\",\n \"GitPython\",\n \"jmespath\",\n \"tqdm\",\n \"update_checker\",\n \"semantic_version\",\n \"packaging\",\n \"networkx\",\n \"dockerfile-parse\",\n \"docker\",\n \"configargparse\",\n \"argcomplete\",\n \"detect-secrets\",\n \"policyuniverse\",\n \"typing-extensions>=4.1.0\",\n \"cachetools\",\n \"cyclonedx-python-lib>=2.4.0\",\n \"click>=8.0.0\",\n \"aiohttp\",\n \"aiodns\",\n \"aiomultiprocess\",\n \"jsonpath_ng\",\n \"jsonschema~=3.0\",\n \"prettytable>=3.0.0\",\n \"pycep-parser==0.3.7\",\n \"charset-normalizer\",\n ],\n license=\"Apache License 2.0\",\n name=\"checkov\",\n version=version,\n python_requires=\">=3.7\",\n description=\"Infrastructure as code static analysis\",\n author=\"bridgecrew\",\n author_email=\"[email protected]\",\n url=\"https://github.com/bridgecrewio/checkov\",\n packages=setuptools.find_packages(exclude=[\"tests*\", \"integration_tests*\"]),\n 
include_package_data=True,\n package_dir={\n \"checkov.bicep.checks.graph_checks\": \"checkov/bicep/checks/graph_checks\",\n \"checkov.terraform.checks.graph_checks\": \"checkov/terraform/checks/graph_checks\",\n },\n package_data={\n \"checkov\": [\"py.typed\"],\n \"checkov.bicep.checks.graph_checks\": [\"*.yaml\"],\n \"checkov.common.util.templates\": [\"*.jinja2\"],\n \"checkov.terraform.checks.graph_checks\": [\n \"aws/*.yaml\",\n \"gcp/*.yaml\",\n \"azure/*.yaml\",\n ],\n },\n scripts=[\"bin/checkov\", \"bin/checkov.cmd\"],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n classifiers=[\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Security\",\n \"Topic :: Software Development :: Build Tools\",\n \"Typing :: Typed\",\n ],\n)\n", "path": "setup.py"}]} | 1,765 | 94 |
gh_patches_debug_25963 | rasdani/github-patches | git_diff | pymedusa__Medusa-3131 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Login error
```
Error
unhashable type: 'list'
Traceback
Traceback (most recent call last):
File "/home/pi/Medusa/ext/tornado/web.py", line 1509, in _execute result = method(*self.path_args, **self.path_kwargs)
File "/home/pi/Medusa/medusa/server/web/core/authentication.py", line 73, in post notifiers.notify_login(self.request.remote_ip)
File "/home/pi/Medusa/medusa/notifiers/__init__.py", line 127, in notify_login n.notify_login(ipaddress)
File "/home/pi/Medusa/medusa/notifiers/nma.py", line 44, in notify_login self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text.format(ipaddress))
File "/home/pi/Medusa/medusa/notifiers/nma.py", line 74, in _sendNMA if not response[nma_api][u'code'] == u'200':
TypeError: unhashable type: 'list'
Request Info
body: username=supergonkas&password=&remember_me=1&submit=Login
files: {}
protocol: http
connection:
body_arguments: {'username': ['supergonkas'], 'remember_me': ['1'], 'password': [''], 'submit': ['Login']}
uri: /login/?next=%2Fhistory%2F
query_arguments: {'next': ['/history/']}
_start_time: 1505997382.06
headers: Origin: http://:8081 Save-Data: on Content-Length: 61 Accept-Language: pt-PT,pt;q=0.8,en-US;q=0.6,en;q=0.4,es;q=0.2 Accept-Encoding: gzip, deflate Connection: keep-alive Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8 Upgrade-Insecure-Requests: 1 Dnt: 1 Host::8081 Referer: http://:8081/login/?next=%2Fhistory%2F Cache-Control: max-age=0 User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Safari/537.36 Content-Type: application/x-www-form-urlencoded
host: :8081
version: HTTP/1.1
server_connection:
host_name: s
_finish_time: None
query: next=%2Fhistory%2F
arguments: {'username': ['supergonkas'], 'remember_me': ['1'], 'password': [''], 'submit': ['Login'], 'next': ['/history/']}
path: /login/
method: POST
remote_ip: 194.210.190.12
View Log(Errors)
```
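The root cause in `_sendNMA` is that `nma_api` is still a plain list when it comes from `app.NMA_API`, and a list cannot be used as a dictionary key. A minimal sketch of the failure (the response shape below is assumed for illustration, not taken from pynma):

```python
# Assumed, simplified shape of a pynma push() response: keyed by the API key string.
response = {"abc123": {"code": "200"}}

nma_api = ["abc123"]             # list form, as in the failing code path
try:
    response[nma_api]["code"]    # lists are unhashable, so this raises TypeError
except TypeError as err:
    print(err)                   # unhashable type: 'list'

print(response[",".join(nma_api)]["code"])  # joining the keys into a string works
```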
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `medusa/notifiers/nma.py`
Content:
```
1 # coding=utf-8
2
3 import logging
4
5 from medusa import app, common
6 from medusa.logger.adapters.style import BraceAdapter
7
8 from pynma import pynma
9
10 log = BraceAdapter(logging.getLogger(__name__))
11 log.logger.addHandler(logging.NullHandler())
12
13
14 class Notifier(object):
15 def test_notify(self, nma_api, nma_priority):
16 return self._sendNMA(nma_api, nma_priority, event='Test', message='Testing NMA settings from Medusa',
17 force=True)
18
19 def notify_snatch(self, ep_name, is_proper):
20 if app.NMA_NOTIFY_ONSNATCH:
21 self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[(common.NOTIFY_SNATCH, common.NOTIFY_SNATCH_PROPER)[is_proper]],
22 message=ep_name)
23
24 def notify_download(self, ep_name):
25 if app.NMA_NOTIFY_ONDOWNLOAD:
26 self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_DOWNLOAD],
27 message=ep_name)
28
29 def notify_subtitle_download(self, ep_name, lang):
30 if app.NMA_NOTIFY_ONSUBTITLEDOWNLOAD:
31 self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_SUBTITLE_DOWNLOAD],
32 message=ep_name + ': ' + lang)
33
34 def notify_git_update(self, new_version='??'):
35 if app.USE_NMA:
36 update_text = common.notifyStrings[common.NOTIFY_GIT_UPDATE_TEXT]
37 title = common.notifyStrings[common.NOTIFY_GIT_UPDATE]
38 self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text + new_version)
39
40 def notify_login(self, ipaddress=''):
41 if app.USE_NMA:
42 update_text = common.notifyStrings[common.NOTIFY_LOGIN_TEXT]
43 title = common.notifyStrings[common.NOTIFY_LOGIN]
44 self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text.format(ipaddress))
45
46 def _sendNMA(self, nma_api=None, nma_priority=None, event=None, message=None, force=False):
47
48 title = 'Medusa'
49
50 if not app.USE_NMA and not force:
51 return False
52
53 if nma_api is None:
54 nma_api = app.NMA_API
55 else:
56 nma_api = nma_api.split(',')
57
58 if nma_priority is None:
59 nma_priority = app.NMA_PRIORITY
60
61 batch = False
62
63 p = pynma.PyNMA()
64 keys = nma_api
65 p.addkey(keys)
66
67 if len(keys) > 1:
68 batch = True
69
70 log.debug(u'NMA: Sending notice with details: event="{0}, message="{1}", priority={2}, batch={3}',
71 event, message, nma_priority, batch)
72 response = p.push(application=title, event=event, description=message, priority=nma_priority, batch_mode=batch)
73
74 if not response[nma_api][u'code'] == u'200':
75 log.error(u'Could not send notification to NotifyMyAndroid')
76 return False
77 else:
78 log.info(u'NMA: Notification sent to NotifyMyAndroid')
79 return True
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/medusa/notifiers/nma.py b/medusa/notifiers/nma.py
--- a/medusa/notifiers/nma.py
+++ b/medusa/notifiers/nma.py
@@ -4,8 +4,8 @@
from medusa import app, common
from medusa.logger.adapters.style import BraceAdapter
-
from pynma import pynma
+from six import text_type
log = BraceAdapter(logging.getLogger(__name__))
log.logger.addHandler(logging.NullHandler())
@@ -52,8 +52,8 @@
if nma_api is None:
nma_api = app.NMA_API
- else:
- nma_api = nma_api.split(',')
+ elif isinstance(nma_api, text_type):
+ nma_api = [nma_api]
if nma_priority is None:
nma_priority = app.NMA_PRIORITY
@@ -71,7 +71,7 @@
event, message, nma_priority, batch)
response = p.push(application=title, event=event, description=message, priority=nma_priority, batch_mode=batch)
- if not response[nma_api][u'code'] == u'200':
+ if not response[','.join(nma_api)][u'code'] == u'200':
log.error(u'Could not send notification to NotifyMyAndroid')
return False
else:
| {"golden_diff": "diff --git a/medusa/notifiers/nma.py b/medusa/notifiers/nma.py\n--- a/medusa/notifiers/nma.py\n+++ b/medusa/notifiers/nma.py\n@@ -4,8 +4,8 @@\n \n from medusa import app, common\n from medusa.logger.adapters.style import BraceAdapter\n-\n from pynma import pynma\n+from six import text_type\n \n log = BraceAdapter(logging.getLogger(__name__))\n log.logger.addHandler(logging.NullHandler())\n@@ -52,8 +52,8 @@\n \n if nma_api is None:\n nma_api = app.NMA_API\n- else:\n- nma_api = nma_api.split(',')\n+ elif isinstance(nma_api, text_type):\n+ nma_api = [nma_api]\n \n if nma_priority is None:\n nma_priority = app.NMA_PRIORITY\n@@ -71,7 +71,7 @@\n event, message, nma_priority, batch)\n response = p.push(application=title, event=event, description=message, priority=nma_priority, batch_mode=batch)\n \n- if not response[nma_api][u'code'] == u'200':\n+ if not response[','.join(nma_api)][u'code'] == u'200':\n log.error(u'Could not send notification to NotifyMyAndroid')\n return False\n else:\n", "issue": "Login error\n```\r\nError\r\nunhashable type: 'list'\r\nTraceback\r\nTraceback (most recent call last): \r\nFile \"/home/pi/Medusa/ext/tornado/web.py\", line 1509, in _execute result = method(*self.path_args, **self.path_kwargs) \r\nFile \"/home/pi/Medusa/medusa/server/web/core/authentication.py\", line 73, in post notifiers.notify_login(self.request.remote_ip) \r\nFile \"/home/pi/Medusa/medusa/notifiers/__init__.py\", line 127, in notify_login n.notify_login(ipaddress)\r\nFile \"/home/pi/Medusa/medusa/notifiers/nma.py\", line 44, in notify_login self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text.format(ipaddress)) \r\nFile \"/home/pi/Medusa/medusa/notifiers/nma.py\", line 74, in _sendNMA if not response[nma_api][u'code'] == u'200': \r\nTypeError: unhashable type: 'list' \r\nRequest Info\r\nbody: username=supergonkas&password=&remember_me=1&submit=Login\r\nfiles: {}\r\nprotocol: http\r\nconnection: \r\nbody_arguments: {'username': ['supergonkas'], 'remember_me': ['1'], 'password': [''], 'submit': ['Login']}\r\nuri: /login/?next=%2Fhistory%2F\r\nquery_arguments: {'next': ['/history/']}\r\n_start_time: 1505997382.06\r\nheaders: Origin: http://:8081 Save-Data: on Content-Length: 61 Accept-Language: pt-PT,pt;q=0.8,en-US;q=0.6,en;q=0.4,es;q=0.2 Accept-Encoding: gzip, deflate Connection: keep-alive Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8 Upgrade-Insecure-Requests: 1 Dnt: 1 Host::8081 Referer: http://:8081/login/?next=%2Fhistory%2F Cache-Control: max-age=0 User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Safari/537.36 Content-Type: application/x-www-form-urlencoded \r\nhost: :8081\r\nversion: HTTP/1.1\r\nserver_connection: \r\nhost_name: s\r\n_finish_time: None\r\nquery: next=%2Fhistory%2F\r\narguments: {'username': ['supergonkas'], 'remember_me': ['1'], 'password': [''], 'submit': ['Login'], 'next': ['/history/']}\r\npath: /login/\r\nmethod: POST\r\nremote_ip: 194.210.190.12\r\nView Log(Errors)\r\n```\n", "before_files": [{"content": "# coding=utf-8\n\nimport logging\n\nfrom medusa import app, common\nfrom medusa.logger.adapters.style import BraceAdapter\n\nfrom pynma import pynma\n\nlog = BraceAdapter(logging.getLogger(__name__))\nlog.logger.addHandler(logging.NullHandler())\n\n\nclass Notifier(object):\n def test_notify(self, nma_api, nma_priority):\n return self._sendNMA(nma_api, nma_priority, event='Test', message='Testing NMA settings from 
Medusa',\n force=True)\n\n def notify_snatch(self, ep_name, is_proper):\n if app.NMA_NOTIFY_ONSNATCH:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[(common.NOTIFY_SNATCH, common.NOTIFY_SNATCH_PROPER)[is_proper]],\n message=ep_name)\n\n def notify_download(self, ep_name):\n if app.NMA_NOTIFY_ONDOWNLOAD:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_DOWNLOAD],\n message=ep_name)\n\n def notify_subtitle_download(self, ep_name, lang):\n if app.NMA_NOTIFY_ONSUBTITLEDOWNLOAD:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_SUBTITLE_DOWNLOAD],\n message=ep_name + ': ' + lang)\n\n def notify_git_update(self, new_version='??'):\n if app.USE_NMA:\n update_text = common.notifyStrings[common.NOTIFY_GIT_UPDATE_TEXT]\n title = common.notifyStrings[common.NOTIFY_GIT_UPDATE]\n self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text + new_version)\n\n def notify_login(self, ipaddress=''):\n if app.USE_NMA:\n update_text = common.notifyStrings[common.NOTIFY_LOGIN_TEXT]\n title = common.notifyStrings[common.NOTIFY_LOGIN]\n self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text.format(ipaddress))\n\n def _sendNMA(self, nma_api=None, nma_priority=None, event=None, message=None, force=False):\n\n title = 'Medusa'\n\n if not app.USE_NMA and not force:\n return False\n\n if nma_api is None:\n nma_api = app.NMA_API\n else:\n nma_api = nma_api.split(',')\n\n if nma_priority is None:\n nma_priority = app.NMA_PRIORITY\n\n batch = False\n\n p = pynma.PyNMA()\n keys = nma_api\n p.addkey(keys)\n\n if len(keys) > 1:\n batch = True\n\n log.debug(u'NMA: Sending notice with details: event=\"{0}, message=\"{1}\", priority={2}, batch={3}',\n event, message, nma_priority, batch)\n response = p.push(application=title, event=event, description=message, priority=nma_priority, batch_mode=batch)\n\n if not response[nma_api][u'code'] == u'200':\n log.error(u'Could not send notification to NotifyMyAndroid')\n return False\n else:\n log.info(u'NMA: Notification sent to NotifyMyAndroid')\n return True\n", "path": "medusa/notifiers/nma.py"}], "after_files": [{"content": "# coding=utf-8\n\nimport logging\n\nfrom medusa import app, common\nfrom medusa.logger.adapters.style import BraceAdapter\nfrom pynma import pynma\nfrom six import text_type\n\nlog = BraceAdapter(logging.getLogger(__name__))\nlog.logger.addHandler(logging.NullHandler())\n\n\nclass Notifier(object):\n def test_notify(self, nma_api, nma_priority):\n return self._sendNMA(nma_api, nma_priority, event='Test', message='Testing NMA settings from Medusa',\n force=True)\n\n def notify_snatch(self, ep_name, is_proper):\n if app.NMA_NOTIFY_ONSNATCH:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[(common.NOTIFY_SNATCH, common.NOTIFY_SNATCH_PROPER)[is_proper]],\n message=ep_name)\n\n def notify_download(self, ep_name):\n if app.NMA_NOTIFY_ONDOWNLOAD:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_DOWNLOAD],\n message=ep_name)\n\n def notify_subtitle_download(self, ep_name, lang):\n if app.NMA_NOTIFY_ONSUBTITLEDOWNLOAD:\n self._sendNMA(nma_api=None, nma_priority=None, event=common.notifyStrings[common.NOTIFY_SUBTITLE_DOWNLOAD],\n message=ep_name + ': ' + lang)\n\n def notify_git_update(self, new_version='??'):\n if app.USE_NMA:\n update_text = common.notifyStrings[common.NOTIFY_GIT_UPDATE_TEXT]\n title = 
common.notifyStrings[common.NOTIFY_GIT_UPDATE]\n self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text + new_version)\n\n def notify_login(self, ipaddress=''):\n if app.USE_NMA:\n update_text = common.notifyStrings[common.NOTIFY_LOGIN_TEXT]\n title = common.notifyStrings[common.NOTIFY_LOGIN]\n self._sendNMA(nma_api=None, nma_priority=None, event=title, message=update_text.format(ipaddress))\n\n def _sendNMA(self, nma_api=None, nma_priority=None, event=None, message=None, force=False):\n\n title = 'Medusa'\n\n if not app.USE_NMA and not force:\n return False\n\n if nma_api is None:\n nma_api = app.NMA_API\n elif isinstance(nma_api, text_type):\n nma_api = [nma_api]\n\n if nma_priority is None:\n nma_priority = app.NMA_PRIORITY\n\n batch = False\n\n p = pynma.PyNMA()\n keys = nma_api\n p.addkey(keys)\n\n if len(keys) > 1:\n batch = True\n\n log.debug(u'NMA: Sending notice with details: event=\"{0}, message=\"{1}\", priority={2}, batch={3}',\n event, message, nma_priority, batch)\n response = p.push(application=title, event=event, description=message, priority=nma_priority, batch_mode=batch)\n\n if not response[','.join(nma_api)][u'code'] == u'200':\n log.error(u'Could not send notification to NotifyMyAndroid')\n return False\n else:\n log.info(u'NMA: Notification sent to NotifyMyAndroid')\n return True\n", "path": "medusa/notifiers/nma.py"}]} | 1,798 | 304 |
gh_patches_debug_16829 | rasdani/github-patches | git_diff | pyload__pyload-1412 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Smoozed
Hello,
Smoozed is currently free for 10GB a day, but the hook says that it is not a premium account.

It would be great if somebody could have a look at it.
Thanks
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `module/plugins/accounts/SmoozedCom.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 import hashlib
4 import time
5
6 try:
7 from beaker.crypto.pbkdf2 import PBKDF2
8
9 except ImportError:
10 from beaker.crypto.pbkdf2 import pbkdf2
11 from binascii import b2a_hex
12
13 class PBKDF2(object):
14 def __init__(self, passphrase, salt, iterations=1000):
15 self.passphrase = passphrase
16 self.salt = salt
17 self.iterations = iterations
18
19 def hexread(self, octets):
20 return b2a_hex(pbkdf2(self.passphrase, self.salt, self.iterations, octets))
21
22 from module.common.json_layer import json_loads
23 from module.plugins.Account import Account
24
25
26 class SmoozedCom(Account):
27 __name__ = "SmoozedCom"
28 __type__ = "account"
29 __version__ = "0.04"
30
31 __description__ = """Smoozed.com account plugin"""
32 __license__ = "GPLv3"
33 __authors__ = [("", "")]
34
35
36 def loadAccountInfo(self, user, req):
37 # Get user data from premiumize.me
38 status = self.getAccountStatus(user, req)
39
40 self.logDebug(status)
41
42 if status['state'] != 'ok':
43 info = {'validuntil' : None,
44 'trafficleft': None,
45 'premium' : False}
46 else:
47 # Parse account info
48 info = {'validuntil' : float(status["data"]["user"]["user_premium"]),
49 'trafficleft': max(0, status["data"]["traffic"][1] - status["data"]["traffic"][0]),
50 'session' : status["data"]["session_key"],
51 'hosters' : [hoster["name"] for hoster in status["data"]["hoster"]]}
52
53 if info['validuntil'] < time.time():
54 info['premium'] = False
55 else:
56 info['premium'] = True
57
58 return info
59
60
61 def login(self, user, data, req):
62 # Get user data from premiumize.me
63 status = self.getAccountStatus(user, req)
64
65 # Check if user and password are valid
66 if status['state'] != 'ok':
67 self.wrongPassword()
68
69
70 def getAccountStatus(self, user, req):
71 password = self.getAccountData(user)['password']
72 salt = hashlib.sha256(password).hexdigest()
73 encrypted = PBKDF2(password, salt, iterations=1000).hexread(32)
74
75 return json_loads(req.load("http://www2.smoozed.com/api/login",
76 get={'auth': user, 'password': encrypted}))
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/module/plugins/accounts/SmoozedCom.py b/module/plugins/accounts/SmoozedCom.py
--- a/module/plugins/accounts/SmoozedCom.py
+++ b/module/plugins/accounts/SmoozedCom.py
@@ -34,7 +34,6 @@
def loadAccountInfo(self, user, req):
- # Get user data from premiumize.me
status = self.getAccountStatus(user, req)
self.logDebug(status)
@@ -51,7 +50,10 @@
'hosters' : [hoster["name"] for hoster in status["data"]["hoster"]]}
if info['validuntil'] < time.time():
- info['premium'] = False
+ if float(status["data"]["user"].get("user_trial", 0)) > time.time():
+ info['premium'] = True
+ else:
+ info['premium'] = False
else:
info['premium'] = True
| {"golden_diff": "diff --git a/module/plugins/accounts/SmoozedCom.py b/module/plugins/accounts/SmoozedCom.py\n--- a/module/plugins/accounts/SmoozedCom.py\n+++ b/module/plugins/accounts/SmoozedCom.py\n@@ -34,7 +34,6 @@\n \n \n def loadAccountInfo(self, user, req):\n- # Get user data from premiumize.me\n status = self.getAccountStatus(user, req)\n \n self.logDebug(status)\n@@ -51,7 +50,10 @@\n 'hosters' : [hoster[\"name\"] for hoster in status[\"data\"][\"hoster\"]]}\n \n if info['validuntil'] < time.time():\n- info['premium'] = False\n+ if float(status[\"data\"][\"user\"].get(\"user_trial\", 0)) > time.time():\n+ info['premium'] = True\n+ else:\n+ info['premium'] = False\n else:\n info['premium'] = True\n", "issue": "Smoozed\nHello,\n\nSmooth is free in the moment for 10GB a day but the hooks said that it is not a premium account.\n\nWould be great if somebody can have a look on it.\n\nThanks\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport hashlib\nimport time\n\ntry:\n from beaker.crypto.pbkdf2 import PBKDF2\n\nexcept ImportError:\n from beaker.crypto.pbkdf2 import pbkdf2\n from binascii import b2a_hex\n\n class PBKDF2(object):\n def __init__(self, passphrase, salt, iterations=1000):\n self.passphrase = passphrase\n self.salt = salt\n self.iterations = iterations\n\n def hexread(self, octets):\n return b2a_hex(pbkdf2(self.passphrase, self.salt, self.iterations, octets))\n\nfrom module.common.json_layer import json_loads\nfrom module.plugins.Account import Account\n\n\nclass SmoozedCom(Account):\n __name__ = \"SmoozedCom\"\n __type__ = \"account\"\n __version__ = \"0.04\"\n\n __description__ = \"\"\"Smoozed.com account plugin\"\"\"\n __license__ = \"GPLv3\"\n __authors__ = [(\"\", \"\")]\n\n\n def loadAccountInfo(self, user, req):\n # Get user data from premiumize.me\n status = self.getAccountStatus(user, req)\n\n self.logDebug(status)\n\n if status['state'] != 'ok':\n info = {'validuntil' : None,\n 'trafficleft': None,\n 'premium' : False}\n else:\n # Parse account info\n info = {'validuntil' : float(status[\"data\"][\"user\"][\"user_premium\"]),\n 'trafficleft': max(0, status[\"data\"][\"traffic\"][1] - status[\"data\"][\"traffic\"][0]),\n 'session' : status[\"data\"][\"session_key\"],\n 'hosters' : [hoster[\"name\"] for hoster in status[\"data\"][\"hoster\"]]}\n\n if info['validuntil'] < time.time():\n info['premium'] = False\n else:\n info['premium'] = True\n\n return info\n\n\n def login(self, user, data, req):\n # Get user data from premiumize.me\n status = self.getAccountStatus(user, req)\n\n # Check if user and password are valid\n if status['state'] != 'ok':\n self.wrongPassword()\n\n\n def getAccountStatus(self, user, req):\n password = self.getAccountData(user)['password']\n salt = hashlib.sha256(password).hexdigest()\n encrypted = PBKDF2(password, salt, iterations=1000).hexread(32)\n\n return json_loads(req.load(\"http://www2.smoozed.com/api/login\",\n get={'auth': user, 'password': encrypted}))\n", "path": "module/plugins/accounts/SmoozedCom.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport hashlib\nimport time\n\ntry:\n from beaker.crypto.pbkdf2 import PBKDF2\n\nexcept ImportError:\n from beaker.crypto.pbkdf2 import pbkdf2\n from binascii import b2a_hex\n\n class PBKDF2(object):\n def __init__(self, passphrase, salt, iterations=1000):\n self.passphrase = passphrase\n self.salt = salt\n self.iterations = iterations\n\n def hexread(self, octets):\n return b2a_hex(pbkdf2(self.passphrase, self.salt, self.iterations, octets))\n\nfrom 
module.common.json_layer import json_loads\nfrom module.plugins.Account import Account\n\n\nclass SmoozedCom(Account):\n __name__ = \"SmoozedCom\"\n __type__ = \"account\"\n __version__ = \"0.04\"\n\n __description__ = \"\"\"Smoozed.com account plugin\"\"\"\n __license__ = \"GPLv3\"\n __authors__ = [(\"\", \"\")]\n\n\n def loadAccountInfo(self, user, req):\n status = self.getAccountStatus(user, req)\n\n self.logDebug(status)\n\n if status['state'] != 'ok':\n info = {'validuntil' : None,\n 'trafficleft': None,\n 'premium' : False}\n else:\n # Parse account info\n info = {'validuntil' : float(status[\"data\"][\"user\"][\"user_premium\"]),\n 'trafficleft': max(0, status[\"data\"][\"traffic\"][1] - status[\"data\"][\"traffic\"][0]),\n 'session' : status[\"data\"][\"session_key\"],\n 'hosters' : [hoster[\"name\"] for hoster in status[\"data\"][\"hoster\"]]}\n\n if info['validuntil'] < time.time():\n if float(status[\"data\"][\"user\"].get(\"user_trial\", 0)) > time.time():\n info['premium'] = True\n else:\n info['premium'] = False\n else:\n info['premium'] = True\n\n return info\n\n\n def login(self, user, data, req):\n # Get user data from premiumize.me\n status = self.getAccountStatus(user, req)\n\n # Check if user and password are valid\n if status['state'] != 'ok':\n self.wrongPassword()\n\n\n def getAccountStatus(self, user, req):\n password = self.getAccountData(user)['password']\n salt = hashlib.sha256(password).hexdigest()\n encrypted = PBKDF2(password, salt, iterations=1000).hexread(32)\n\n return json_loads(req.load(\"http://www2.smoozed.com/api/login\",\n get={'auth': user, 'password': encrypted}))\n", "path": "module/plugins/accounts/SmoozedCom.py"}]} | 1,053 | 214 |
gh_patches_debug_6986 | rasdani/github-patches | git_diff | Lightning-AI__torchmetrics-1210 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
MAE can't handle MPS backend
## 🐛 Bug
Currently, the mean absolute error can't handle the MPS backend. This is a simple fix and just requires casting to `.float()`
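As a stop-gap on the caller side (sketch, assuming the same MPS-enabled setup as in the examples below), casting the integer tensors to floating point before `update()` avoids the failing `abs()` on `int64`:

```python
import torch
import torchmetrics

device = torch.device("mps")
a = torch.tensor([1, 2, 3], device=device)

mae = torchmetrics.MeanAbsoluteError().to(device)
mae.update(a.float(), a.float())  # float tensors keep the MPS backend happy
print(mae.compute())              # 0.0
```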
### To Reproduce
This works:
```python
import torchmetrics
import torch
a = torch.tensor([1, 2, 3]).to(torch.device("mps"))
acc = torchmetrics.Accuracy().to(torch.device("mps"))
acc.update(a, a)
acc.compute()
```
this also works:
```python
a = torch.tensor([1, 2, 3])
mae = torchmetrics.MeanAbsoluteError()
mae.update(a, a)
mae.compute()
```
but this crashes
```python
a = torch.tensor([1, 2, 3]).to(torch.device("mps"))
mae = torchmetrics.MeanAbsoluteError().to(torch.device("mps"))
mae.update(a, a)
mae.compute()
```
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Input In [12], in <module>
1 a = torch.tensor([1, 2, 3]).to(torch.device("mps"))
3 acc = torchmetrics.MeanAbsoluteError().to(torch.device("mps"))
----> 4 acc.update(a, a)
5 acc.compute()
File ~/miniforge3/lib/python3.9/site-packages/torchmetrics/metric.py:391, in Metric._wrap_update.<locals>.wrapped_func(*args, **kwargs)
389 with torch.set_grad_enabled(self._enable_grad):
390 try:
--> 391 update(*args, **kwargs)
392 except RuntimeError as err:
393 if "Expected all tensors to be on" in str(err):
File ~/miniforge3/lib/python3.9/site-packages/torchmetrics/regression/mae.py:63, in MeanAbsoluteError.update(self, preds, target)
56 def update(self, preds: Tensor, target: Tensor) -> None: # type: ignore
57 """Update state with predictions and targets.
58
59 Args:
60 preds: Predictions from model
61 target: Ground truth values
62 """
---> 63 sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)
65 self.sum_abs_error += sum_abs_error
66 self.total += n_obs
File ~/miniforge3/lib/python3.9/site-packages/torchmetrics/functional/regression/mae.py:33, in _mean_absolute_error_update(preds, target)
23 """Updates and returns variables required to compute Mean Absolute Error.
24
25 Checks for same shape of input tensors.
(...)
29 target: Ground truth tensor
30 """
32 _check_same_shape(preds, target)
---> 33 sum_abs_error = torch.sum(torch.abs(preds - target))
34 n_obs = target.numel()
35 return sum_abs_error, n_obs
TypeError: Operation 'abs_out_mps()' does not support input type 'int64' in MPS backend.
```
### Environment
```
torch : 1.12.1
lightning : 2022.9.8
torchmetrics: 0.9.3
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/torchmetrics/functional/regression/mae.py`
Content:
```
1 # Copyright The PyTorch Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import Tuple
15
16 import torch
17 from torch import Tensor
18
19 from torchmetrics.utilities.checks import _check_same_shape
20
21
22 def _mean_absolute_error_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, int]:
23 """Updates and returns variables required to compute Mean Absolute Error.
24
25 Checks for same shape of input tensors.
26
27 Args:
28 preds: Predicted tensor
29 target: Ground truth tensor
30 """
31
32 _check_same_shape(preds, target)
33 sum_abs_error = torch.sum(torch.abs(preds - target))
34 n_obs = target.numel()
35 return sum_abs_error, n_obs
36
37
38 def _mean_absolute_error_compute(sum_abs_error: Tensor, n_obs: int) -> Tensor:
39 """Computes Mean Absolute Error.
40
41 Args:
42 sum_abs_error: Sum of absolute value of errors over all observations
43 n_obs: Number of predictions or observations
44
45 Example:
46 >>> preds = torch.tensor([0., 1, 2, 3])
47 >>> target = torch.tensor([0., 1, 2, 2])
48 >>> sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)
49 >>> _mean_absolute_error_compute(sum_abs_error, n_obs)
50 tensor(0.2500)
51 """
52
53 return sum_abs_error / n_obs
54
55
56 def mean_absolute_error(preds: Tensor, target: Tensor) -> Tensor:
57 """Computes mean absolute error.
58
59 Args:
60 preds: estimated labels
61 target: ground truth labels
62
63 Return:
64 Tensor with MAE
65
66 Example:
67 >>> from torchmetrics.functional import mean_absolute_error
68 >>> x = torch.tensor([0., 1, 2, 3])
69 >>> y = torch.tensor([0., 1, 2, 2])
70 >>> mean_absolute_error(x, y)
71 tensor(0.2500)
72 """
73 sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)
74 return _mean_absolute_error_compute(sum_abs_error, n_obs)
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/torchmetrics/functional/regression/mae.py b/src/torchmetrics/functional/regression/mae.py
--- a/src/torchmetrics/functional/regression/mae.py
+++ b/src/torchmetrics/functional/regression/mae.py
@@ -28,8 +28,9 @@
preds: Predicted tensor
target: Ground truth tensor
"""
-
_check_same_shape(preds, target)
+ preds = preds if preds.is_floating_point else preds.float()
+ target = target if target.is_floating_point else target.float()
sum_abs_error = torch.sum(torch.abs(preds - target))
n_obs = target.numel()
return sum_abs_error, n_obs
| {"golden_diff": "diff --git a/src/torchmetrics/functional/regression/mae.py b/src/torchmetrics/functional/regression/mae.py\n--- a/src/torchmetrics/functional/regression/mae.py\n+++ b/src/torchmetrics/functional/regression/mae.py\n@@ -28,8 +28,9 @@\n preds: Predicted tensor\n target: Ground truth tensor\n \"\"\"\n-\n _check_same_shape(preds, target)\n+ preds = preds if preds.is_floating_point else preds.float()\n+ target = target if target.is_floating_point else target.float()\n sum_abs_error = torch.sum(torch.abs(preds - target))\n n_obs = target.numel()\n return sum_abs_error, n_obs\n", "issue": "MAE can't handle MPS backend\n## \ud83d\udc1b Bug\r\n\r\nThe currently, the mean absolute error can't handle the MPS backend. This is a simple fix and just requires casting to .float()\r\n\r\n### To Reproduce\r\n\r\nThis works:\r\n\r\n```python\r\nimport torchmetrics\r\nimport torch\r\n\r\na = torch.tensor([1, 2, 3]).to(torch.device(\"mps\"))\r\n\r\nacc = torchmetrics.Accuracy().to(torch.device(\"mps\"))\r\nacc.update(a, a)\r\nacc.compute()\r\n```\r\n\r\nthis also works:\r\n\r\n```\r\na = torch.tensor([1, 2, 3])\r\n\r\nmae = torchmetrics.MeanAbsoluteError()\r\nmae.update(a, a)\r\nmae.compute()\r\n\r\n```\r\n\r\nbut this crashes\r\n\r\n```python\r\na = torch.tensor([1, 2, 3]).to(torch.device(\"mps\"))\r\n\r\nmae = torchmetrics.MeanAbsoluteError().to(torch.device(\"mps\"))\r\nmae.update(a, a)\r\nmae.compute()\r\n```\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\nInput In [12], in <module>\r\n 1 a = torch.tensor([1, 2, 3]).to(torch.device(\"mps\"))\r\n 3 acc = torchmetrics.MeanAbsoluteError().to(torch.device(\"mps\"))\r\n----> 4 acc.update(a, a)\r\n 5 acc.compute()\r\n\r\nFile ~/miniforge3/lib/python3.9/site-packages/torchmetrics/metric.py:391, in Metric._wrap_update.<locals>.wrapped_func(*args, **kwargs)\r\n 389 with torch.set_grad_enabled(self._enable_grad):\r\n 390 try:\r\n--> 391 update(*args, **kwargs)\r\n 392 except RuntimeError as err:\r\n 393 if \"Expected all tensors to be on\" in str(err):\r\n\r\nFile ~/miniforge3/lib/python3.9/site-packages/torchmetrics/regression/mae.py:63, in MeanAbsoluteError.update(self, preds, target)\r\n 56 def update(self, preds: Tensor, target: Tensor) -> None: # type: ignore\r\n 57 \"\"\"Update state with predictions and targets.\r\n 58 \r\n 59 Args:\r\n 60 preds: Predictions from model\r\n 61 target: Ground truth values\r\n 62 \"\"\"\r\n---> 63 sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)\r\n 65 self.sum_abs_error += sum_abs_error\r\n 66 self.total += n_obs\r\n\r\nFile ~/miniforge3/lib/python3.9/site-packages/torchmetrics/functional/regression/mae.py:33, in _mean_absolute_error_update(preds, target)\r\n 23 \"\"\"Updates and returns variables required to compute Mean Absolute Error.\r\n 24 \r\n 25 Checks for same shape of input tensors.\r\n (...)\r\n 29 target: Ground truth tensor\r\n 30 \"\"\"\r\n 32 _check_same_shape(preds, target)\r\n---> 33 sum_abs_error = torch.sum(torch.abs(preds - target))\r\n 34 n_obs = target.numel()\r\n 35 return sum_abs_error, n_obs\r\n\r\nTypeError: Operation 'abs_out_mps()' does not support input type 'int64' in MPS backend.\r\n```\r\n\r\n\r\n### Environment\r\n\r\n```\r\ntorch : 1.12.1\r\nlightning : 2022.9.8\r\ntorchmetrics: 0.9.3\r\n```\r\n\n", "before_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file 
except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.utilities.checks import _check_same_shape\n\n\ndef _mean_absolute_error_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, int]:\n \"\"\"Updates and returns variables required to compute Mean Absolute Error.\n\n Checks for same shape of input tensors.\n\n Args:\n preds: Predicted tensor\n target: Ground truth tensor\n \"\"\"\n\n _check_same_shape(preds, target)\n sum_abs_error = torch.sum(torch.abs(preds - target))\n n_obs = target.numel()\n return sum_abs_error, n_obs\n\n\ndef _mean_absolute_error_compute(sum_abs_error: Tensor, n_obs: int) -> Tensor:\n \"\"\"Computes Mean Absolute Error.\n\n Args:\n sum_abs_error: Sum of absolute value of errors over all observations\n n_obs: Number of predictions or observations\n\n Example:\n >>> preds = torch.tensor([0., 1, 2, 3])\n >>> target = torch.tensor([0., 1, 2, 2])\n >>> sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)\n >>> _mean_absolute_error_compute(sum_abs_error, n_obs)\n tensor(0.2500)\n \"\"\"\n\n return sum_abs_error / n_obs\n\n\ndef mean_absolute_error(preds: Tensor, target: Tensor) -> Tensor:\n \"\"\"Computes mean absolute error.\n\n Args:\n preds: estimated labels\n target: ground truth labels\n\n Return:\n Tensor with MAE\n\n Example:\n >>> from torchmetrics.functional import mean_absolute_error\n >>> x = torch.tensor([0., 1, 2, 3])\n >>> y = torch.tensor([0., 1, 2, 2])\n >>> mean_absolute_error(x, y)\n tensor(0.2500)\n \"\"\"\n sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)\n return _mean_absolute_error_compute(sum_abs_error, n_obs)\n", "path": "src/torchmetrics/functional/regression/mae.py"}], "after_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.utilities.checks import _check_same_shape\n\n\ndef _mean_absolute_error_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, int]:\n \"\"\"Updates and returns variables required to compute Mean Absolute Error.\n\n Checks for same shape of input tensors.\n\n Args:\n preds: Predicted tensor\n target: Ground truth tensor\n \"\"\"\n _check_same_shape(preds, target)\n preds = preds if preds.is_floating_point else preds.float()\n target = target if target.is_floating_point else target.float()\n sum_abs_error = torch.sum(torch.abs(preds - target))\n n_obs = target.numel()\n return sum_abs_error, n_obs\n\n\ndef _mean_absolute_error_compute(sum_abs_error: Tensor, n_obs: 
int) -> Tensor:\n \"\"\"Computes Mean Absolute Error.\n\n Args:\n sum_abs_error: Sum of absolute value of errors over all observations\n n_obs: Number of predictions or observations\n\n Example:\n >>> preds = torch.tensor([0., 1, 2, 3])\n >>> target = torch.tensor([0., 1, 2, 2])\n >>> sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)\n >>> _mean_absolute_error_compute(sum_abs_error, n_obs)\n tensor(0.2500)\n \"\"\"\n\n return sum_abs_error / n_obs\n\n\ndef mean_absolute_error(preds: Tensor, target: Tensor) -> Tensor:\n \"\"\"Computes mean absolute error.\n\n Args:\n preds: estimated labels\n target: ground truth labels\n\n Return:\n Tensor with MAE\n\n Example:\n >>> from torchmetrics.functional import mean_absolute_error\n >>> x = torch.tensor([0., 1, 2, 3])\n >>> y = torch.tensor([0., 1, 2, 2])\n >>> mean_absolute_error(x, y)\n tensor(0.2500)\n \"\"\"\n sum_abs_error, n_obs = _mean_absolute_error_update(preds, target)\n return _mean_absolute_error_compute(sum_abs_error, n_obs)\n", "path": "src/torchmetrics/functional/regression/mae.py"}]} | 1,747 | 160 |
gh_patches_debug_1424 | rasdani/github-patches | git_diff | cookiecutter__cookiecutter-1891 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
2.2.0 CLI reports version 2.1.2dev0
* Cookiecutter version: 2.2.0 (or 2.1.2dev0, depending on who you ask)
* Template project url: n/a
* Python version: 3.11
* Operating System: linux
### Description:
Get the accurate version of cookiecutter from the CLI
### What I've run:
```bash
cookiecutter --version
Cookiecutter 2.1.2.dev0 from $PREFIX/lib/python3.11/site-packages (Python 3.11.4 | packaged by conda-forge | (main, Jun 10 2023, 18:08:17) [GCC 12.2.0])
```
Would be a one-line fix, but ideally would always be sourced from exactly one place:
- `setup.py` and `importlib_metadata`
- `__init__.py`
- a `VERSION` file
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """cookiecutter distutils configuration."""
2 from setuptools import setup
3
4 version = "2.2.2.dev0"
5
6 with open('README.md', encoding='utf-8') as readme_file:
7 readme = readme_file.read()
8
9 requirements = [
10 'binaryornot>=0.4.4',
11 'Jinja2>=2.7,<4.0.0',
12 'click>=7.0,<9.0.0',
13 'pyyaml>=5.3.1',
14 'python-slugify>=4.0.0',
15 'requests>=2.23.0',
16 'arrow',
17 ]
18
19 setup(
20 name='cookiecutter',
21 version=version,
22 description=(
23 'A command-line utility that creates projects from project '
24 'templates, e.g. creating a Python package project from a '
25 'Python package project template.'
26 ),
27 long_description=readme,
28 long_description_content_type='text/markdown',
29 author='Audrey Feldroy',
30 author_email='[email protected]',
31 url='https://github.com/cookiecutter/cookiecutter',
32 project_urls={
33 "Documentation": "https://cookiecutter.readthedocs.io",
34 "Issues": "https://github.com/cookiecutter/cookiecutter/issues",
35 "Discord": "https://discord.gg/9BrxzPKuEW",
36 },
37 packages=['cookiecutter'],
38 package_dir={'cookiecutter': 'cookiecutter'},
39 entry_points={'console_scripts': ['cookiecutter = cookiecutter.__main__:main']},
40 include_package_data=True,
41 python_requires='>=3.7',
42 install_requires=requirements,
43 license='BSD',
44 zip_safe=False,
45 classifiers=[
46 "Development Status :: 5 - Production/Stable",
47 "Environment :: Console",
48 "Intended Audience :: Developers",
49 "Natural Language :: English",
50 "License :: OSI Approved :: BSD License",
51 "Programming Language :: Python :: 3 :: Only",
52 "Programming Language :: Python :: 3",
53 "Programming Language :: Python :: 3.7",
54 "Programming Language :: Python :: 3.8",
55 "Programming Language :: Python :: 3.9",
56 "Programming Language :: Python :: 3.10",
57 "Programming Language :: Python :: 3.11",
58 "Programming Language :: Python :: Implementation :: CPython",
59 "Programming Language :: Python :: Implementation :: PyPy",
60 "Programming Language :: Python",
61 "Topic :: Software Development",
62 ],
63 keywords=[
64 "cookiecutter",
65 "Python",
66 "projects",
67 "project templates",
68 "Jinja2",
69 "skeleton",
70 "scaffolding",
71 "project directory",
72 "package",
73 "packaging",
74 ],
75 )
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -1,7 +1,7 @@
"""cookiecutter distutils configuration."""
from setuptools import setup
-version = "2.2.2.dev0"
+version = "2.2.2"
with open('README.md', encoding='utf-8') as readme_file:
readme = readme_file.read()
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -1,7 +1,7 @@\n \"\"\"cookiecutter distutils configuration.\"\"\"\n from setuptools import setup\n \n-version = \"2.2.2.dev0\"\n+version = \"2.2.2\"\n \n with open('README.md', encoding='utf-8') as readme_file:\n readme = readme_file.read()\n", "issue": "2.2.0 CLI reports version 2.1.2dev0 \n* Cookiecutter version: 2.2.0 (or 2.1.2dev0, depending on who you ask)\r\n* Template project url: n/a\r\n* Python version: 3.11\r\n* Operating System: linux\r\n\r\n### Description:\r\n\r\nGet the accurate version of cookiecutter from the CLI\r\n\r\n### What I've run:\r\n\r\n```bash\r\ncookiecutter --version\r\nCookiecutter 2.1.2.dev0 from $PREFIX/lib/python3.11/site-packages (Python 3.11.4 | packaged by conda-forge | (main, Jun 10 2023, 18:08:17) [GCC 12.2.0])\r\n```\r\n\r\nWould be a one-line fix, but ideally would be always be sourced from exactly one place:\r\n- `setup.py` and `importlib_metadata`\r\n- `__init__.py`\r\n- a `VERSION` file\n", "before_files": [{"content": "\"\"\"cookiecutter distutils configuration.\"\"\"\nfrom setuptools import setup\n\nversion = \"2.2.2.dev0\"\n\nwith open('README.md', encoding='utf-8') as readme_file:\n readme = readme_file.read()\n\nrequirements = [\n 'binaryornot>=0.4.4',\n 'Jinja2>=2.7,<4.0.0',\n 'click>=7.0,<9.0.0',\n 'pyyaml>=5.3.1',\n 'python-slugify>=4.0.0',\n 'requests>=2.23.0',\n 'arrow',\n]\n\nsetup(\n name='cookiecutter',\n version=version,\n description=(\n 'A command-line utility that creates projects from project '\n 'templates, e.g. creating a Python package project from a '\n 'Python package project template.'\n ),\n long_description=readme,\n long_description_content_type='text/markdown',\n author='Audrey Feldroy',\n author_email='[email protected]',\n url='https://github.com/cookiecutter/cookiecutter',\n project_urls={\n \"Documentation\": \"https://cookiecutter.readthedocs.io\",\n \"Issues\": \"https://github.com/cookiecutter/cookiecutter/issues\",\n \"Discord\": \"https://discord.gg/9BrxzPKuEW\",\n },\n packages=['cookiecutter'],\n package_dir={'cookiecutter': 'cookiecutter'},\n entry_points={'console_scripts': ['cookiecutter = cookiecutter.__main__:main']},\n include_package_data=True,\n python_requires='>=3.7',\n install_requires=requirements,\n license='BSD',\n zip_safe=False,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Natural Language :: English\",\n \"License :: OSI Approved :: BSD License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Programming Language :: Python\",\n \"Topic :: Software Development\",\n ],\n keywords=[\n \"cookiecutter\",\n \"Python\",\n \"projects\",\n \"project templates\",\n \"Jinja2\",\n \"skeleton\",\n \"scaffolding\",\n \"project directory\",\n \"package\",\n \"packaging\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"cookiecutter distutils configuration.\"\"\"\nfrom setuptools import setup\n\nversion = \"2.2.2\"\n\nwith open('README.md', encoding='utf-8') as readme_file:\n readme = 
readme_file.read()\n\nrequirements = [\n 'binaryornot>=0.4.4',\n 'Jinja2>=2.7,<4.0.0',\n 'click>=7.0,<9.0.0',\n 'pyyaml>=5.3.1',\n 'python-slugify>=4.0.0',\n 'requests>=2.23.0',\n 'arrow',\n]\n\nsetup(\n name='cookiecutter',\n version=version,\n description=(\n 'A command-line utility that creates projects from project '\n 'templates, e.g. creating a Python package project from a '\n 'Python package project template.'\n ),\n long_description=readme,\n long_description_content_type='text/markdown',\n author='Audrey Feldroy',\n author_email='[email protected]',\n url='https://github.com/cookiecutter/cookiecutter',\n project_urls={\n \"Documentation\": \"https://cookiecutter.readthedocs.io\",\n \"Issues\": \"https://github.com/cookiecutter/cookiecutter/issues\",\n \"Discord\": \"https://discord.gg/9BrxzPKuEW\",\n },\n packages=['cookiecutter'],\n package_dir={'cookiecutter': 'cookiecutter'},\n entry_points={'console_scripts': ['cookiecutter = cookiecutter.__main__:main']},\n include_package_data=True,\n python_requires='>=3.7',\n install_requires=requirements,\n license='BSD',\n zip_safe=False,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Natural Language :: English\",\n \"License :: OSI Approved :: BSD License\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Programming Language :: Python\",\n \"Topic :: Software Development\",\n ],\n keywords=[\n \"cookiecutter\",\n \"Python\",\n \"projects\",\n \"project templates\",\n \"Jinja2\",\n \"skeleton\",\n \"scaffolding\",\n \"project directory\",\n \"package\",\n \"packaging\",\n ],\n)\n", "path": "setup.py"}]} | 1,211 | 88 |
gh_patches_debug_33 | rasdani/github-patches | git_diff | googleapis__google-api-python-client-1864 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Stop using external 'mock' dependency
As of Python 3.4, 'mock' is included in the standard library under the unittest module, and since the lowest supported version of Python is greater than that, we can remove the external dependency.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `noxfile.py`
Content:
```
1 # Copyright 2020 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import os
16 import shutil
17
18 import nox
19
20 BLACK_VERSION = "black==22.3.0"
21 ISORT_VERSION = "isort==5.10.1"
22 BLACK_PATHS = [
23 "apiclient",
24 "googleapiclient",
25 "scripts",
26 "tests",
27 "describe.py",
28 "expandsymlinks.py",
29 "noxfile.py",
30 "owlbot.py",
31 "setup.py",
32 ]
33
34 test_dependencies = [
35 "django>=2.0.0",
36 "google-auth",
37 "google-auth-httplib2",
38 "mox",
39 "parameterized",
40 "pyopenssl",
41 "pytest",
42 "pytest-cov",
43 "webtest",
44 "coverage",
45 "mock",
46 ]
47
48
49 @nox.session(python=["3.7"])
50 def lint(session):
51 session.install("flake8")
52 session.run(
53 "flake8",
54 "googleapiclient",
55 "tests",
56 "--count",
57 "--select=E9,F63,F7,F82",
58 "--show-source",
59 "--statistics",
60 )
61
62
63 @nox.session(python="3.8")
64 def format(session):
65 """
66 Run isort to sort imports. Then run black
67 to format code to uniform standard.
68 """
69 session.install(BLACK_VERSION, ISORT_VERSION)
70 # Use the --fss option to sort imports using strict alphabetical order.
71 # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections
72 session.run(
73 "isort",
74 "--fss",
75 *BLACK_PATHS,
76 )
77 session.run(
78 "black",
79 *BLACK_PATHS,
80 )
81
82
83 @nox.session(python=["3.7", "3.8", "3.9", "3.10"])
84 @nox.parametrize(
85 "oauth2client",
86 [
87 "oauth2client<2dev",
88 "oauth2client>=2,<=3dev",
89 "oauth2client>=3,<=4dev",
90 "oauth2client>=4,<=5dev",
91 ],
92 )
93 def unit(session, oauth2client):
94 # Clean up dist and build folders
95 shutil.rmtree("dist", ignore_errors=True)
96 shutil.rmtree("build", ignore_errors=True)
97
98 session.install(*test_dependencies)
99 session.install(oauth2client)
100
101 # Create and install wheels
102 session.run("python3", "setup.py", "bdist_wheel")
103 session.install(os.path.join("dist", os.listdir("dist").pop()))
104
105 # Run tests from a different directory to test the package artifacts
106 root_dir = os.path.dirname(os.path.realpath(__file__))
107 temp_dir = session.create_tmp()
108 session.chdir(temp_dir)
109 shutil.copytree(os.path.join(root_dir, "tests"), "tests")
110
111 # Run py.test against the unit tests.
112 session.run(
113 "py.test",
114 "--quiet",
115 "--cov=googleapiclient",
116 "--cov=tests",
117 "--cov-append",
118 "--cov-config=.coveragerc",
119 "--cov-report=",
120 "--cov-fail-under=85",
121 "tests",
122 *session.posargs,
123 )
124
125
126 @nox.session(python=["3.9"])
127 def scripts(session):
128 session.install(*test_dependencies)
129 session.install("-e", ".")
130 session.install("-r", "scripts/requirements.txt")
131
132 # Run py.test against the unit tests.
133 session.run(
134 "py.test",
135 "--quiet",
136 "--cov=scripts",
137 "--cov-config=.coveragerc",
138 "--cov-report=",
139 "--cov-fail-under=91",
140 "scripts",
141 *session.posargs,
142 )
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/noxfile.py b/noxfile.py
--- a/noxfile.py
+++ b/noxfile.py
@@ -42,7 +42,6 @@
"pytest-cov",
"webtest",
"coverage",
- "mock",
]
| {"golden_diff": "diff --git a/noxfile.py b/noxfile.py\n--- a/noxfile.py\n+++ b/noxfile.py\n@@ -42,7 +42,6 @@\n \"pytest-cov\",\n \"webtest\",\n \"coverage\",\n- \"mock\",\n ]\n", "issue": "Stop using external 'mock' dependency\nAs of Python 3.4, 'mock' is included in the standard library under the unittest module, and since the lowest supported version of Python is greater than that, we can remove the external dependency.\n", "before_files": [{"content": "# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport shutil\n\nimport nox\n\nBLACK_VERSION = \"black==22.3.0\"\nISORT_VERSION = \"isort==5.10.1\"\nBLACK_PATHS = [\n \"apiclient\",\n \"googleapiclient\",\n \"scripts\",\n \"tests\",\n \"describe.py\",\n \"expandsymlinks.py\",\n \"noxfile.py\",\n \"owlbot.py\",\n \"setup.py\",\n]\n\ntest_dependencies = [\n \"django>=2.0.0\",\n \"google-auth\",\n \"google-auth-httplib2\",\n \"mox\",\n \"parameterized\",\n \"pyopenssl\",\n \"pytest\",\n \"pytest-cov\",\n \"webtest\",\n \"coverage\",\n \"mock\",\n]\n\n\[email protected](python=[\"3.7\"])\ndef lint(session):\n session.install(\"flake8\")\n session.run(\n \"flake8\",\n \"googleapiclient\",\n \"tests\",\n \"--count\",\n \"--select=E9,F63,F7,F82\",\n \"--show-source\",\n \"--statistics\",\n )\n\n\[email protected](python=\"3.8\")\ndef format(session):\n \"\"\"\n Run isort to sort imports. 
Then run black\n to format code to uniform standard.\n \"\"\"\n session.install(BLACK_VERSION, ISORT_VERSION)\n # Use the --fss option to sort imports using strict alphabetical order.\n # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections\n session.run(\n \"isort\",\n \"--fss\",\n *BLACK_PATHS,\n )\n session.run(\n \"black\",\n *BLACK_PATHS,\n )\n\n\[email protected](python=[\"3.7\", \"3.8\", \"3.9\", \"3.10\"])\[email protected](\n \"oauth2client\",\n [\n \"oauth2client<2dev\",\n \"oauth2client>=2,<=3dev\",\n \"oauth2client>=3,<=4dev\",\n \"oauth2client>=4,<=5dev\",\n ],\n)\ndef unit(session, oauth2client):\n # Clean up dist and build folders\n shutil.rmtree(\"dist\", ignore_errors=True)\n shutil.rmtree(\"build\", ignore_errors=True)\n\n session.install(*test_dependencies)\n session.install(oauth2client)\n\n # Create and install wheels\n session.run(\"python3\", \"setup.py\", \"bdist_wheel\")\n session.install(os.path.join(\"dist\", os.listdir(\"dist\").pop()))\n\n # Run tests from a different directory to test the package artifacts\n root_dir = os.path.dirname(os.path.realpath(__file__))\n temp_dir = session.create_tmp()\n session.chdir(temp_dir)\n shutil.copytree(os.path.join(root_dir, \"tests\"), \"tests\")\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=googleapiclient\",\n \"--cov=tests\",\n \"--cov-append\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=85\",\n \"tests\",\n *session.posargs,\n )\n\n\[email protected](python=[\"3.9\"])\ndef scripts(session):\n session.install(*test_dependencies)\n session.install(\"-e\", \".\")\n session.install(\"-r\", \"scripts/requirements.txt\")\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=scripts\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=91\",\n \"scripts\",\n *session.posargs,\n )\n", "path": "noxfile.py"}], "after_files": [{"content": "# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport shutil\n\nimport nox\n\nBLACK_VERSION = \"black==22.3.0\"\nISORT_VERSION = \"isort==5.10.1\"\nBLACK_PATHS = [\n \"apiclient\",\n \"googleapiclient\",\n \"scripts\",\n \"tests\",\n \"describe.py\",\n \"expandsymlinks.py\",\n \"noxfile.py\",\n \"owlbot.py\",\n \"setup.py\",\n]\n\ntest_dependencies = [\n \"django>=2.0.0\",\n \"google-auth\",\n \"google-auth-httplib2\",\n \"mox\",\n \"parameterized\",\n \"pyopenssl\",\n \"pytest\",\n \"pytest-cov\",\n \"webtest\",\n \"coverage\",\n]\n\n\[email protected](python=[\"3.7\"])\ndef lint(session):\n session.install(\"flake8\")\n session.run(\n \"flake8\",\n \"googleapiclient\",\n \"tests\",\n \"--count\",\n \"--select=E9,F63,F7,F82\",\n \"--show-source\",\n \"--statistics\",\n )\n\n\[email protected](python=\"3.8\")\ndef format(session):\n \"\"\"\n Run isort to sort imports. 
Then run black\n to format code to uniform standard.\n \"\"\"\n session.install(BLACK_VERSION, ISORT_VERSION)\n # Use the --fss option to sort imports using strict alphabetical order.\n # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections\n session.run(\n \"isort\",\n \"--fss\",\n *BLACK_PATHS,\n )\n session.run(\n \"black\",\n *BLACK_PATHS,\n )\n\n\[email protected](python=[\"3.7\", \"3.8\", \"3.9\", \"3.10\"])\[email protected](\n \"oauth2client\",\n [\n \"oauth2client<2dev\",\n \"oauth2client>=2,<=3dev\",\n \"oauth2client>=3,<=4dev\",\n \"oauth2client>=4,<=5dev\",\n ],\n)\ndef unit(session, oauth2client):\n # Clean up dist and build folders\n shutil.rmtree(\"dist\", ignore_errors=True)\n shutil.rmtree(\"build\", ignore_errors=True)\n\n session.install(*test_dependencies)\n session.install(oauth2client)\n\n # Create and install wheels\n session.run(\"python3\", \"setup.py\", \"bdist_wheel\")\n session.install(os.path.join(\"dist\", os.listdir(\"dist\").pop()))\n\n # Run tests from a different directory to test the package artifacts\n root_dir = os.path.dirname(os.path.realpath(__file__))\n temp_dir = session.create_tmp()\n session.chdir(temp_dir)\n shutil.copytree(os.path.join(root_dir, \"tests\"), \"tests\")\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=googleapiclient\",\n \"--cov=tests\",\n \"--cov-append\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=85\",\n \"tests\",\n *session.posargs,\n )\n\n\[email protected](python=[\"3.9\"])\ndef scripts(session):\n session.install(*test_dependencies)\n session.install(\"-e\", \".\")\n session.install(\"-r\", \"scripts/requirements.txt\")\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=scripts\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=91\",\n \"scripts\",\n *session.posargs,\n )\n", "path": "noxfile.py"}]} | 1,557 | 61 |
gh_patches_debug_7786 | rasdani/github-patches | git_diff | Lightning-AI__pytorch-lightning-619 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Comet PAPI Depreciated
Use of the Comet API logger reports an unnecessary deprecation warning relating to the use of comet_ml.papi, rather than the newer comet_ml.api.
Example:
`COMET WARNING: You have imported comet_ml.papi; this interface is deprecated. Please use comet_ml.api instead. For more information, see: https://www.comet.ml/docs/python-sdk/releases/#release-300`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pytorch_lightning/logging/comet.py`
Content:
```
1 """
2 Log using `comet <https://www.comet.ml>`_
3
4 Comet logger can be used in either online or offline mode.
5 To log in online mode, CometLogger requries an API key:
6
7 .. code-block:: python
8
9 from pytorch_lightning.logging import CometLogger
10 # arguments made to CometLogger are passed on to the comet_ml.Experiment class
11 comet_logger = CometLogger(
12 api_key=os.environ["COMET_KEY"],
13 workspace=os.environ["COMET_WORKSPACE"], # Optional
14 project_name="default_project", # Optional
15 rest_api_key=os.environ["COMET_REST_KEY"], # Optional
16 experiment_name="default" # Optional
17 )
18 trainer = Trainer(logger=comet_logger)
19
20 To log in offline mode, CometLogger requires a path to a local directory:
21
22 .. code-block:: python
23
24 from pytorch_lightning.logging import CometLogger
25 # arguments made to CometLogger are passed on to the comet_ml.Experiment class
26 comet_logger = CometLogger(
27 save_dir=".",
28 workspace=os.environ["COMET_WORKSPACE"], # Optional
29 project_name="default_project", # Optional
30 rest_api_key=os.environ["COMET_REST_KEY"], # Optional
31 experiment_name="default" # Optional
32 )
33 trainer = Trainer(logger=comet_logger)
34
35
36 Use the logger anywhere in you LightningModule as follows:
37
38 .. code-block:: python
39
40 def train_step(...):
41 # example
42 self.logger.experiment.whatever_comet_ml_supports(...)
43
44 def any_lightning_module_function_or_hook(...):
45 self.logger.experiment.whatever_comet_ml_supports(...)
46
47
48 """
49
50 from logging import getLogger
51
52 try:
53 from comet_ml import Experiment as CometExperiment
54 from comet_ml import OfflineExperiment as CometOfflineExperiment
55 from comet_ml.papi import API
56 except ImportError:
57 raise ImportError('Missing comet_ml package.')
58
59 from torch import is_tensor
60
61 from .base import LightningLoggerBase, rank_zero_only
62 from ..utilities.debugging import MisconfigurationException
63
64 logger = getLogger(__name__)
65
66
67 class CometLogger(LightningLoggerBase):
68 def __init__(self, api_key=None, save_dir=None, workspace=None,
69 rest_api_key=None, project_name=None, experiment_name=None, **kwargs):
70 """Initialize a Comet.ml logger.
71 Requires either an API Key (online mode) or a local directory path (offline mode)
72
73 :param str api_key: Required in online mode. API key, found on Comet.ml
74 :param str save_dir: Required in offline mode. The path for the directory to save local comet logs
75 :param str workspace: Optional. Name of workspace for this user
76 :param str project_name: Optional. Send your experiment to a specific project.
77 Otherwise will be sent to Uncategorized Experiments.
78 If project name does not already exists Comet.ml will create a new project.
79 :param str rest_api_key: Optional. Rest API key found in Comet.ml settings.
80 This is used to determine version number
81 :param str experiment_name: Optional. String representing the name for this particular experiment on Comet.ml
82 """
83 super().__init__()
84 self._experiment = None
85
86 # Determine online or offline mode based on which arguments were passed to CometLogger
87 if save_dir is not None and api_key is not None:
88 # If arguments are passed for both save_dir and api_key, preference is given to online mode
89 self.mode = "online"
90 self.api_key = api_key
91 elif api_key is not None:
92 self.mode = "online"
93 self.api_key = api_key
94 elif save_dir is not None:
95 self.mode = "offline"
96 self.save_dir = save_dir
97 else:
98 # If neither api_key nor save_dir are passed as arguments, raise an exception
99 raise MisconfigurationException("CometLogger requires either api_key or save_dir during initialization.")
100
101 logger.info(f"CometLogger will be initialized in {self.mode} mode")
102
103 self.workspace = workspace
104 self.project_name = project_name
105 self._kwargs = kwargs
106
107 if rest_api_key is not None:
108 # Comet.ml rest API, used to determine version number
109 self.rest_api_key = rest_api_key
110 self.comet_api = API(self.rest_api_key)
111 else:
112 self.rest_api_key = None
113 self.comet_api = None
114
115 if experiment_name:
116 try:
117 self.name = experiment_name
118 except TypeError as e:
119 logger.exception("Failed to set experiment name for comet.ml logger")
120
121 @property
122 def experiment(self):
123 if self._experiment is not None:
124 return self._experiment
125
126 if self.mode == "online":
127 self._experiment = CometExperiment(
128 api_key=self.api_key,
129 workspace=self.workspace,
130 project_name=self.project_name,
131 **self._kwargs
132 )
133 else:
134 self._experiment = CometOfflineExperiment(
135 offline_directory=self.save_dir,
136 workspace=self.workspace,
137 project_name=self.project_name,
138 **self._kwargs
139 )
140
141 return self._experiment
142
143 @rank_zero_only
144 def log_hyperparams(self, params):
145 self.experiment.log_parameters(vars(params))
146
147 @rank_zero_only
148 def log_metrics(self, metrics, step=None):
149 # Comet.ml expects metrics to be a dictionary of detached tensors on CPU
150 for key, val in metrics.items():
151 if is_tensor(val):
152 metrics[key] = val.cpu().detach()
153
154 self.experiment.log_metrics(metrics, step=step)
155
156 @rank_zero_only
157 def finalize(self, status):
158 self.experiment.end()
159
160 @property
161 def name(self):
162 return self.experiment.project_name
163
164 @name.setter
165 def name(self, value):
166 self.experiment.set_name(value)
167
168 @property
169 def version(self):
170 return self.experiment.id
171
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pytorch_lightning/logging/comet.py b/pytorch_lightning/logging/comet.py
--- a/pytorch_lightning/logging/comet.py
+++ b/pytorch_lightning/logging/comet.py
@@ -52,7 +52,11 @@
try:
from comet_ml import Experiment as CometExperiment
from comet_ml import OfflineExperiment as CometOfflineExperiment
- from comet_ml.papi import API
+ try:
+ from comet_ml.api import API
+ except ImportError:
+ # For more information, see: https://www.comet.ml/docs/python-sdk/releases/#release-300
+ from comet_ml.papi import API
except ImportError:
raise ImportError('Missing comet_ml package.')
| {"golden_diff": "diff --git a/pytorch_lightning/logging/comet.py b/pytorch_lightning/logging/comet.py\n--- a/pytorch_lightning/logging/comet.py\n+++ b/pytorch_lightning/logging/comet.py\n@@ -52,7 +52,11 @@\n try:\n from comet_ml import Experiment as CometExperiment\n from comet_ml import OfflineExperiment as CometOfflineExperiment\n- from comet_ml.papi import API\n+ try:\n+ from comet_ml.api import API\n+ except ImportError:\n+ # For more information, see: https://www.comet.ml/docs/python-sdk/releases/#release-300\n+ from comet_ml.papi import API\n except ImportError:\n raise ImportError('Missing comet_ml package.')\n", "issue": "Comet PAPI Depreciated\nUse of the Comet API logger reports an unecessary depreciation warning relating to the use of comet_ml.papi, rather than the newer comet_ml.api.\r\n\r\nExample:\r\n`COMET WARNING: You have imported comet_ml.papi; this interface is deprecated. Please use comet_ml.api instead. For more information, see: https://www.comet.ml/docs/python-sdk/releases/#release-300`\n", "before_files": [{"content": "\"\"\"\nLog using `comet <https://www.comet.ml>`_\n\nComet logger can be used in either online or offline mode.\nTo log in online mode, CometLogger requries an API key:\n\n.. code-block:: python\n\n from pytorch_lightning.logging import CometLogger\n # arguments made to CometLogger are passed on to the comet_ml.Experiment class\n comet_logger = CometLogger(\n api_key=os.environ[\"COMET_KEY\"],\n workspace=os.environ[\"COMET_WORKSPACE\"], # Optional\n project_name=\"default_project\", # Optional\n rest_api_key=os.environ[\"COMET_REST_KEY\"], # Optional\n experiment_name=\"default\" # Optional\n )\n trainer = Trainer(logger=comet_logger)\n\nTo log in offline mode, CometLogger requires a path to a local directory:\n\n.. code-block:: python\n\n from pytorch_lightning.logging import CometLogger\n # arguments made to CometLogger are passed on to the comet_ml.Experiment class\n comet_logger = CometLogger(\n save_dir=\".\",\n workspace=os.environ[\"COMET_WORKSPACE\"], # Optional\n project_name=\"default_project\", # Optional\n rest_api_key=os.environ[\"COMET_REST_KEY\"], # Optional\n experiment_name=\"default\" # Optional\n )\n trainer = Trainer(logger=comet_logger)\n\n\nUse the logger anywhere in you LightningModule as follows:\n\n.. code-block:: python\n\n def train_step(...):\n # example\n self.logger.experiment.whatever_comet_ml_supports(...)\n\n def any_lightning_module_function_or_hook(...):\n self.logger.experiment.whatever_comet_ml_supports(...)\n\n\n\"\"\"\n\nfrom logging import getLogger\n\ntry:\n from comet_ml import Experiment as CometExperiment\n from comet_ml import OfflineExperiment as CometOfflineExperiment\n from comet_ml.papi import API\nexcept ImportError:\n raise ImportError('Missing comet_ml package.')\n\nfrom torch import is_tensor\n\nfrom .base import LightningLoggerBase, rank_zero_only\nfrom ..utilities.debugging import MisconfigurationException\n\nlogger = getLogger(__name__)\n\n\nclass CometLogger(LightningLoggerBase):\n def __init__(self, api_key=None, save_dir=None, workspace=None,\n rest_api_key=None, project_name=None, experiment_name=None, **kwargs):\n \"\"\"Initialize a Comet.ml logger.\n Requires either an API Key (online mode) or a local directory path (offline mode)\n\n :param str api_key: Required in online mode. API key, found on Comet.ml\n :param str save_dir: Required in offline mode. The path for the directory to save local comet logs\n :param str workspace: Optional. 
Name of workspace for this user\n :param str project_name: Optional. Send your experiment to a specific project.\n Otherwise will be sent to Uncategorized Experiments.\n If project name does not already exists Comet.ml will create a new project.\n :param str rest_api_key: Optional. Rest API key found in Comet.ml settings.\n This is used to determine version number\n :param str experiment_name: Optional. String representing the name for this particular experiment on Comet.ml\n \"\"\"\n super().__init__()\n self._experiment = None\n\n # Determine online or offline mode based on which arguments were passed to CometLogger\n if save_dir is not None and api_key is not None:\n # If arguments are passed for both save_dir and api_key, preference is given to online mode\n self.mode = \"online\"\n self.api_key = api_key\n elif api_key is not None:\n self.mode = \"online\"\n self.api_key = api_key\n elif save_dir is not None:\n self.mode = \"offline\"\n self.save_dir = save_dir\n else:\n # If neither api_key nor save_dir are passed as arguments, raise an exception\n raise MisconfigurationException(\"CometLogger requires either api_key or save_dir during initialization.\")\n\n logger.info(f\"CometLogger will be initialized in {self.mode} mode\")\n\n self.workspace = workspace\n self.project_name = project_name\n self._kwargs = kwargs\n\n if rest_api_key is not None:\n # Comet.ml rest API, used to determine version number\n self.rest_api_key = rest_api_key\n self.comet_api = API(self.rest_api_key)\n else:\n self.rest_api_key = None\n self.comet_api = None\n\n if experiment_name:\n try:\n self.name = experiment_name\n except TypeError as e:\n logger.exception(\"Failed to set experiment name for comet.ml logger\")\n\n @property\n def experiment(self):\n if self._experiment is not None:\n return self._experiment\n\n if self.mode == \"online\":\n self._experiment = CometExperiment(\n api_key=self.api_key,\n workspace=self.workspace,\n project_name=self.project_name,\n **self._kwargs\n )\n else:\n self._experiment = CometOfflineExperiment(\n offline_directory=self.save_dir,\n workspace=self.workspace,\n project_name=self.project_name,\n **self._kwargs\n )\n\n return self._experiment\n\n @rank_zero_only\n def log_hyperparams(self, params):\n self.experiment.log_parameters(vars(params))\n\n @rank_zero_only\n def log_metrics(self, metrics, step=None):\n # Comet.ml expects metrics to be a dictionary of detached tensors on CPU\n for key, val in metrics.items():\n if is_tensor(val):\n metrics[key] = val.cpu().detach()\n\n self.experiment.log_metrics(metrics, step=step)\n\n @rank_zero_only\n def finalize(self, status):\n self.experiment.end()\n\n @property\n def name(self):\n return self.experiment.project_name\n\n @name.setter\n def name(self, value):\n self.experiment.set_name(value)\n\n @property\n def version(self):\n return self.experiment.id\n", "path": "pytorch_lightning/logging/comet.py"}], "after_files": [{"content": "\"\"\"\nLog using `comet <https://www.comet.ml>`_\n\nComet logger can be used in either online or offline mode.\nTo log in online mode, CometLogger requries an API key:\n\n.. 
code-block:: python\n\n from pytorch_lightning.logging import CometLogger\n # arguments made to CometLogger are passed on to the comet_ml.Experiment class\n comet_logger = CometLogger(\n api_key=os.environ[\"COMET_KEY\"],\n workspace=os.environ[\"COMET_WORKSPACE\"], # Optional\n project_name=\"default_project\", # Optional\n rest_api_key=os.environ[\"COMET_REST_KEY\"], # Optional\n experiment_name=\"default\" # Optional\n )\n trainer = Trainer(logger=comet_logger)\n\nTo log in offline mode, CometLogger requires a path to a local directory:\n\n.. code-block:: python\n\n from pytorch_lightning.logging import CometLogger\n # arguments made to CometLogger are passed on to the comet_ml.Experiment class\n comet_logger = CometLogger(\n save_dir=\".\",\n workspace=os.environ[\"COMET_WORKSPACE\"], # Optional\n project_name=\"default_project\", # Optional\n rest_api_key=os.environ[\"COMET_REST_KEY\"], # Optional\n experiment_name=\"default\" # Optional\n )\n trainer = Trainer(logger=comet_logger)\n\n\nUse the logger anywhere in you LightningModule as follows:\n\n.. code-block:: python\n\n def train_step(...):\n # example\n self.logger.experiment.whatever_comet_ml_supports(...)\n\n def any_lightning_module_function_or_hook(...):\n self.logger.experiment.whatever_comet_ml_supports(...)\n\n\n\"\"\"\n\nfrom logging import getLogger\n\ntry:\n from comet_ml import Experiment as CometExperiment\n from comet_ml import OfflineExperiment as CometOfflineExperiment\n try:\n from comet_ml.api import API\n except ImportError:\n # For more information, see: https://www.comet.ml/docs/python-sdk/releases/#release-300\n from comet_ml.papi import API\nexcept ImportError:\n raise ImportError('Missing comet_ml package.')\n\nfrom torch import is_tensor\n\nfrom .base import LightningLoggerBase, rank_zero_only\nfrom ..utilities.debugging import MisconfigurationException\n\nlogger = getLogger(__name__)\n\n\nclass CometLogger(LightningLoggerBase):\n def __init__(self, api_key=None, save_dir=None, workspace=None,\n rest_api_key=None, project_name=None, experiment_name=None, **kwargs):\n \"\"\"Initialize a Comet.ml logger.\n Requires either an API Key (online mode) or a local directory path (offline mode)\n\n :param str api_key: Required in online mode. API key, found on Comet.ml\n :param str save_dir: Required in offline mode. The path for the directory to save local comet logs\n :param str workspace: Optional. Name of workspace for this user\n :param str project_name: Optional. Send your experiment to a specific project.\n Otherwise will be sent to Uncategorized Experiments.\n If project name does not already exists Comet.ml will create a new project.\n :param str rest_api_key: Optional. Rest API key found in Comet.ml settings.\n This is used to determine version number\n :param str experiment_name: Optional. 
String representing the name for this particular experiment on Comet.ml\n \"\"\"\n super().__init__()\n self._experiment = None\n\n # Determine online or offline mode based on which arguments were passed to CometLogger\n if save_dir is not None and api_key is not None:\n # If arguments are passed for both save_dir and api_key, preference is given to online mode\n self.mode = \"online\"\n self.api_key = api_key\n elif api_key is not None:\n self.mode = \"online\"\n self.api_key = api_key\n elif save_dir is not None:\n self.mode = \"offline\"\n self.save_dir = save_dir\n else:\n # If neither api_key nor save_dir are passed as arguments, raise an exception\n raise MisconfigurationException(\"CometLogger requires either api_key or save_dir during initialization.\")\n\n logger.info(f\"CometLogger will be initialized in {self.mode} mode\")\n\n self.workspace = workspace\n self.project_name = project_name\n self._kwargs = kwargs\n\n if rest_api_key is not None:\n # Comet.ml rest API, used to determine version number\n self.rest_api_key = rest_api_key\n self.comet_api = API(self.rest_api_key)\n else:\n self.rest_api_key = None\n self.comet_api = None\n\n if experiment_name:\n try:\n self.name = experiment_name\n except TypeError as e:\n logger.exception(\"Failed to set experiment name for comet.ml logger\")\n\n @property\n def experiment(self):\n if self._experiment is not None:\n return self._experiment\n\n if self.mode == \"online\":\n self._experiment = CometExperiment(\n api_key=self.api_key,\n workspace=self.workspace,\n project_name=self.project_name,\n **self._kwargs\n )\n else:\n self._experiment = CometOfflineExperiment(\n offline_directory=self.save_dir,\n workspace=self.workspace,\n project_name=self.project_name,\n **self._kwargs\n )\n\n return self._experiment\n\n @rank_zero_only\n def log_hyperparams(self, params):\n self.experiment.log_parameters(vars(params))\n\n @rank_zero_only\n def log_metrics(self, metrics, step=None):\n # Comet.ml expects metrics to be a dictionary of detached tensors on CPU\n for key, val in metrics.items():\n if is_tensor(val):\n metrics[key] = val.cpu().detach()\n\n self.experiment.log_metrics(metrics, step=step)\n\n @rank_zero_only\n def finalize(self, status):\n self.experiment.end()\n\n @property\n def name(self):\n return self.experiment.project_name\n\n @name.setter\n def name(self, value):\n self.experiment.set_name(value)\n\n @property\n def version(self):\n return self.experiment.id\n", "path": "pytorch_lightning/logging/comet.py"}]} | 2,015 | 157 |
gh_patches_debug_28626 | rasdani/github-patches | git_diff | pymodbus-dev__pymodbus-1053 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Updating Server Example does not work
<!--
Please use the Pymodbus gitter channel at https://gitter.im/pymodbus_dev/Lobby or Stack Overflow(tag [pymodbus](https://stackoverflow.com/questions/tagged/pymodbus) for
support questions.
Before opening a new issue, make sure you do the following:
* check that your issue isn't already filed: https://github.com/riptideio/pymodbus/issues
* check the discussions forum https://github.com/riptideio/pymodbus/discussions
* prepare a short, runnable example that reproduce the issue with the latest development version of Pymodbus
-->
### Versions
* Python: 3.7, 3.8, 3.10
* OS: ubuntu bionic
* Pymodbus: 2.5.3
* Modbus Hardware (if used):
### Pymodbus Specific
* Server: tcp - sync/async
### Description
I try to run example, first error is
`TypeError: ModbusDeviceIdentification.__init__() got an unexpected keyword argument 'info_name'`
when i remove that info, it ends on
```
modbusupdater.py", line 81, in run_updating_server
loop.start(time, now=False) # initially delay by time
AttributeError: '_UnixSelectorEventLoop' object has no attribute 'start'
```
async/sync server have same error with info_name, but run when removed. But i am unable to run updating server example.
Thank you
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/common/updating_server.py`
Content:
```
1 #!/usr/bin/env python3
2 # pylint: disable=missing-any-param-doc,differing-param-doc
3 """Pymodbus Server With Updating Thread.
4
5 This is an example of having a background thread updating the
6 context while the server is operating. This can also be done with
7 a python thread::
8
9 from threading import Thread
10 Thread(target=updating_writer, args=(context,)).start()
11 """
12 import logging
13 import asyncio
14
15 from pymodbus.datastore import (
16 ModbusSequentialDataBlock,
17 ModbusServerContext,
18 ModbusSlaveContext,
19 )
20 from pymodbus.device import ModbusDeviceIdentification
21 from pymodbus.server.async_io import StartTcpServer
22 from pymodbus.version import version
23
24 # --------------------------------------------------------------------------- #
25 # configure the service logging
26 # --------------------------------------------------------------------------- #
27 log = logging.getLogger()
28 log.setLevel(logging.DEBUG)
29
30 # --------------------------------------------------------------------------- #
31 # define your callback process
32 # --------------------------------------------------------------------------- #
33
34
35 def updating_writer(extra):
36 """Run every so often,
37
38 and updates live values of the context. It should be noted
39 that there is a race condition for the update.
40
41 :param arguments: The input arguments to the call
42 """
43 log.debug("updating the context")
44 context = extra[0]
45 register = 3
46 slave_id = 0x00
47 address = 0x10
48 values = context[slave_id].getValues(register, address, count=5)
49 values = [v + 1 for v in values]
50 txt = f"new values: {str(values)}"
51 log.debug(txt)
52 context[slave_id].setValues(register, address, values)
53
54
55 def run_updating_server():
56 """Run updating server."""
57 # ----------------------------------------------------------------------- #
58 # initialize your data store
59 # ----------------------------------------------------------------------- #
60
61 store = ModbusSlaveContext(
62 di=ModbusSequentialDataBlock(0, [17] * 100),
63 co=ModbusSequentialDataBlock(0, [17] * 100),
64 hr=ModbusSequentialDataBlock(0, [17] * 100),
65 ir=ModbusSequentialDataBlock(0, [17] * 100),
66 )
67 context = ModbusServerContext(slaves=store, single=True)
68
69 # ----------------------------------------------------------------------- #
70 # initialize the server information
71 # ----------------------------------------------------------------------- #
72 identity = ModbusDeviceIdentification(
73 info_name={
74 "VendorName": "pymodbus",
75 "ProductCode": "PM",
76 "VendorUrl": "https://github.com/riptideio/pymodbus/",
77 "ProductName": "pymodbus Server",
78 "ModelName": "pymodbus Server",
79 "MajorMinorRevision": version.short(),
80 }
81 )
82
83 # ----------------------------------------------------------------------- #
84 # run the server you want
85 # ----------------------------------------------------------------------- #
86 time = 5 # 5 seconds delay
87 loop = asyncio.get_event_loop()
88 loop.start(time, now=False) # initially delay by time
89 StartTcpServer(context, identity=identity, address=("localhost", 5020))
90
91
92 if __name__ == "__main__":
93 run_updating_server()
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/common/updating_server.py b/examples/common/updating_server.py
--- a/examples/common/updating_server.py
+++ b/examples/common/updating_server.py
@@ -36,7 +36,7 @@
"""Run every so often,
and updates live values of the context. It should be noted
- that there is a race condition for the update.
+ that there is a lrace condition for the update.
:param arguments: The input arguments to the call
"""
@@ -52,7 +52,7 @@
context[slave_id].setValues(register, address, values)
-def run_updating_server():
+async def run_updating_server():
"""Run updating server."""
# ----------------------------------------------------------------------- #
# initialize your data store
@@ -83,11 +83,15 @@
# ----------------------------------------------------------------------- #
# run the server you want
# ----------------------------------------------------------------------- #
- time = 5 # 5 seconds delay
- loop = asyncio.get_event_loop()
- loop.start(time, now=False) # initially delay by time
- StartTcpServer(context, identity=identity, address=("localhost", 5020))
+ log.debug("Start server")
+ await StartTcpServer(
+ context,
+ identity=identity,
+ address=("localhost", 5020),
+ defer_start=False
+ )
+ log.debug("Done")
if __name__ == "__main__":
- run_updating_server()
+ asyncio.run(run_updating_server())
| {"golden_diff": "diff --git a/examples/common/updating_server.py b/examples/common/updating_server.py\n--- a/examples/common/updating_server.py\n+++ b/examples/common/updating_server.py\n@@ -36,7 +36,7 @@\n \"\"\"Run every so often,\n \n and updates live values of the context. It should be noted\n- that there is a race condition for the update.\n+ that there is a lrace condition for the update.\n \n :param arguments: The input arguments to the call\n \"\"\"\n@@ -52,7 +52,7 @@\n context[slave_id].setValues(register, address, values)\n \n \n-def run_updating_server():\n+async def run_updating_server():\n \"\"\"Run updating server.\"\"\"\n # ----------------------------------------------------------------------- #\n # initialize your data store\n@@ -83,11 +83,15 @@\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n- time = 5 # 5 seconds delay\n- loop = asyncio.get_event_loop()\n- loop.start(time, now=False) # initially delay by time\n- StartTcpServer(context, identity=identity, address=(\"localhost\", 5020))\n+ log.debug(\"Start server\")\n+ await StartTcpServer(\n+ context,\n+ identity=identity,\n+ address=(\"localhost\", 5020),\n+ defer_start=False\n+ )\n+ log.debug(\"Done\")\n \n \n if __name__ == \"__main__\":\n- run_updating_server()\n+ asyncio.run(run_updating_server())\n", "issue": "Updating Server Example does not work\n<!--\r\nPlease use the Pymodbus gitter channel at https://gitter.im/pymodbus_dev/Lobby or Stack Overflow(tag [pymodbus](https://stackoverflow.com/questions/tagged/pymodbus) for\r\nsupport questions.\r\n\r\nBefore opening a new issue, make sure you do the following:\r\n * check that your issue isn't already filed: https://github.com/riptideio/pymodbus/issues\r\n * check the discussions forum https://github.com/riptideio/pymodbus/discussions\r\n * prepare a short, runnable example that reproduce the issue with the latest development version of Pymodbus\r\n-->\r\n\r\n### Versions\r\n\r\n* Python: 3.7, 3.8, 3,10\r\n* OS: ubuntu bionic\r\n* Pymodbus: 2.5.3\r\n* Modbus Hardware (if used): \r\n\r\n### Pymodbus Specific\r\n* Server: tcp - sync/async\r\n\r\n\r\n### Description\r\n\r\nI try to run example, first error is\r\n`TypeError: ModbusDeviceIdentification.__init__() got an unexpected keyword argument 'info_name'`\r\n\r\nwhen i remove that info, it ends on\r\n\r\n```\r\nmodbusupdater.py\", line 81, in run_updating_server\r\n loop.start(time, now=False) # initially delay by time\r\nAttributeError: '_UnixSelectorEventLoop' object has no attribute 'start'\r\n```\r\n\r\nasync/sync server have same error with info_name, but run when removed. But i am unable to run updating server example.\r\n\r\nThank you\n", "before_files": [{"content": "#!/usr/bin/env python3\n# pylint: disable=missing-any-param-doc,differing-param-doc\n\"\"\"Pymodbus Server With Updating Thread.\n\nThis is an example of having a background thread updating the\ncontext while the server is operating. 
This can also be done with\na python thread::\n\n from threading import Thread\n Thread(target=updating_writer, args=(context,)).start()\n\"\"\"\nimport logging\nimport asyncio\n\nfrom pymodbus.datastore import (\n ModbusSequentialDataBlock,\n ModbusServerContext,\n ModbusSlaveContext,\n)\nfrom pymodbus.device import ModbusDeviceIdentification\nfrom pymodbus.server.async_io import StartTcpServer\nfrom pymodbus.version import version\n\n# --------------------------------------------------------------------------- #\n# configure the service logging\n# --------------------------------------------------------------------------- #\nlog = logging.getLogger()\nlog.setLevel(logging.DEBUG)\n\n# --------------------------------------------------------------------------- #\n# define your callback process\n# --------------------------------------------------------------------------- #\n\n\ndef updating_writer(extra):\n \"\"\"Run every so often,\n\n and updates live values of the context. It should be noted\n that there is a race condition for the update.\n\n :param arguments: The input arguments to the call\n \"\"\"\n log.debug(\"updating the context\")\n context = extra[0]\n register = 3\n slave_id = 0x00\n address = 0x10\n values = context[slave_id].getValues(register, address, count=5)\n values = [v + 1 for v in values]\n txt = f\"new values: {str(values)}\"\n log.debug(txt)\n context[slave_id].setValues(register, address, values)\n\n\ndef run_updating_server():\n \"\"\"Run updating server.\"\"\"\n # ----------------------------------------------------------------------- #\n # initialize your data store\n # ----------------------------------------------------------------------- #\n\n store = ModbusSlaveContext(\n di=ModbusSequentialDataBlock(0, [17] * 100),\n co=ModbusSequentialDataBlock(0, [17] * 100),\n hr=ModbusSequentialDataBlock(0, [17] * 100),\n ir=ModbusSequentialDataBlock(0, [17] * 100),\n )\n context = ModbusServerContext(slaves=store, single=True)\n\n # ----------------------------------------------------------------------- #\n # initialize the server information\n # ----------------------------------------------------------------------- #\n identity = ModbusDeviceIdentification(\n info_name={\n \"VendorName\": \"pymodbus\",\n \"ProductCode\": \"PM\",\n \"VendorUrl\": \"https://github.com/riptideio/pymodbus/\",\n \"ProductName\": \"pymodbus Server\",\n \"ModelName\": \"pymodbus Server\",\n \"MajorMinorRevision\": version.short(),\n }\n )\n\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n time = 5 # 5 seconds delay\n loop = asyncio.get_event_loop()\n loop.start(time, now=False) # initially delay by time\n StartTcpServer(context, identity=identity, address=(\"localhost\", 5020))\n\n\nif __name__ == \"__main__\":\n run_updating_server()\n", "path": "examples/common/updating_server.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# pylint: disable=missing-any-param-doc,differing-param-doc\n\"\"\"Pymodbus Server With Updating Thread.\n\nThis is an example of having a background thread updating the\ncontext while the server is operating. 
This can also be done with\na python thread::\n\n from threading import Thread\n Thread(target=updating_writer, args=(context,)).start()\n\"\"\"\nimport logging\nimport asyncio\n\nfrom pymodbus.datastore import (\n ModbusSequentialDataBlock,\n ModbusServerContext,\n ModbusSlaveContext,\n)\nfrom pymodbus.device import ModbusDeviceIdentification\nfrom pymodbus.server.async_io import StartTcpServer\nfrom pymodbus.version import version\n\n# --------------------------------------------------------------------------- #\n# configure the service logging\n# --------------------------------------------------------------------------- #\nlog = logging.getLogger()\nlog.setLevel(logging.DEBUG)\n\n# --------------------------------------------------------------------------- #\n# define your callback process\n# --------------------------------------------------------------------------- #\n\n\ndef updating_writer(extra):\n \"\"\"Run every so often,\n\n and updates live values of the context. It should be noted\n that there is a lrace condition for the update.\n\n :param arguments: The input arguments to the call\n \"\"\"\n log.debug(\"updating the context\")\n context = extra[0]\n register = 3\n slave_id = 0x00\n address = 0x10\n values = context[slave_id].getValues(register, address, count=5)\n values = [v + 1 for v in values]\n txt = f\"new values: {str(values)}\"\n log.debug(txt)\n context[slave_id].setValues(register, address, values)\n\n\nasync def run_updating_server():\n \"\"\"Run updating server.\"\"\"\n # ----------------------------------------------------------------------- #\n # initialize your data store\n # ----------------------------------------------------------------------- #\n\n store = ModbusSlaveContext(\n di=ModbusSequentialDataBlock(0, [17] * 100),\n co=ModbusSequentialDataBlock(0, [17] * 100),\n hr=ModbusSequentialDataBlock(0, [17] * 100),\n ir=ModbusSequentialDataBlock(0, [17] * 100),\n )\n context = ModbusServerContext(slaves=store, single=True)\n\n # ----------------------------------------------------------------------- #\n # initialize the server information\n # ----------------------------------------------------------------------- #\n identity = ModbusDeviceIdentification(\n info_name={\n \"VendorName\": \"pymodbus\",\n \"ProductCode\": \"PM\",\n \"VendorUrl\": \"https://github.com/riptideio/pymodbus/\",\n \"ProductName\": \"pymodbus Server\",\n \"ModelName\": \"pymodbus Server\",\n \"MajorMinorRevision\": version.short(),\n }\n )\n\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n log.debug(\"Start server\")\n await StartTcpServer(\n context,\n identity=identity,\n address=(\"localhost\", 5020),\n defer_start=False\n )\n log.debug(\"Done\")\n\n\nif __name__ == \"__main__\":\n asyncio.run(run_updating_server())\n", "path": "examples/common/updating_server.py"}]} | 1,458 | 338 |
gh_patches_debug_3003 | rasdani/github-patches | git_diff | lightly-ai__lightly-656 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Incorrect inputsize for BarlowTwins Lightning Example Code
Should the input_size in [1] be `32` instead of `224`?
In [2], we use `input_size=32`.
[1] https://github.com/lightly-ai/lightly/blob/master/examples/pytorch_lightning/barlowtwins.py#L44
[2] https://github.com/lightly-ai/lightly/blob/master/examples/pytorch/barlowtwins.py#L35
--- END ISSUE ---
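For context, CIFAR-10 images are 32×32 pixels, so the collate function should crop to 32 rather than the ImageNet-style 224. A minimal sketch of the corrected call, using only names already shown in the linked examples:

```python
from lightly.data import ImageCollateFunction

# CIFAR-10 images are 32x32, so the SSL augmentations should crop/resize to 32,
# not the 224 typically used for ImageNet-sized inputs.
collate_fn = ImageCollateFunction(input_size=32)
```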
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/pytorch_lightning/barlowtwins.py`
Content:
```
1 import torch
2 from torch import nn
3 import torchvision
4 import pytorch_lightning as pl
5
6 from lightly.data import LightlyDataset
7 from lightly.data import ImageCollateFunction
8 from lightly.loss import BarlowTwinsLoss
9 from lightly.models.modules import BarlowTwinsProjectionHead
10
11
12 class BarlowTwins(pl.LightningModule):
13 def __init__(self):
14 super().__init__()
15 resnet = torchvision.models.resnet18()
16 self.backbone = nn.Sequential(*list(resnet.children())[:-1])
17 self.projection_head = BarlowTwinsProjectionHead(512, 2048, 2048)
18 self.criterion = BarlowTwinsLoss()
19
20 def forward(self, x):
21 x = self.backbone(x).flatten(start_dim=1)
22 z = self.projection_head(x)
23 return z
24
25 def training_step(self, batch, batch_index):
26 (x0, x1), _, _ = batch
27 z0 = self.forward(x0)
28 z1 = self.forward(x1)
29 loss = self.criterion(z0, z1)
30 return loss
31
32 def configure_optimizers(self):
33 optim = torch.optim.SGD(self.parameters(), lr=0.06)
34 return optim
35
36
37 model = BarlowTwins()
38
39 cifar10 = torchvision.datasets.CIFAR10("datasets/cifar10", download=True)
40 dataset = LightlyDataset.from_torch_dataset(cifar10)
41 # or create a dataset from a folder containing images or videos:
42 # dataset = LightlyDataset("path/to/folder")
43
44 collate_fn = ImageCollateFunction(input_size=224)
45
46 dataloader = torch.utils.data.DataLoader(
47 dataset,
48 batch_size=256,
49 collate_fn=collate_fn,
50 shuffle=True,
51 drop_last=True,
52 num_workers=8,
53 )
54
55 gpus = 1 if torch.cuda.is_available() else 0
56
57 trainer = pl.Trainer(max_epochs=10, gpus=gpus)
58 trainer.fit(model=model, train_dataloaders=dataloader)
59
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/pytorch_lightning/barlowtwins.py b/examples/pytorch_lightning/barlowtwins.py
--- a/examples/pytorch_lightning/barlowtwins.py
+++ b/examples/pytorch_lightning/barlowtwins.py
@@ -41,7 +41,7 @@
# or create a dataset from a folder containing images or videos:
# dataset = LightlyDataset("path/to/folder")
-collate_fn = ImageCollateFunction(input_size=224)
+collate_fn = ImageCollateFunction(input_size=32)
dataloader = torch.utils.data.DataLoader(
dataset,
| {"golden_diff": "diff --git a/examples/pytorch_lightning/barlowtwins.py b/examples/pytorch_lightning/barlowtwins.py\n--- a/examples/pytorch_lightning/barlowtwins.py\n+++ b/examples/pytorch_lightning/barlowtwins.py\n@@ -41,7 +41,7 @@\n # or create a dataset from a folder containing images or videos:\n # dataset = LightlyDataset(\"path/to/folder\")\n \n-collate_fn = ImageCollateFunction(input_size=224)\n+collate_fn = ImageCollateFunction(input_size=32)\n \n dataloader = torch.utils.data.DataLoader(\n dataset,\n", "issue": "Incorrect inputsize for BarlowTwins Lightning Example Code\nShould the input_size in [1] be `32` instead of `224`?\r\nIn [2], we use `input_size=32`.\r\n\r\n[1] https://github.com/lightly-ai/lightly/blob/master/examples/pytorch_lightning/barlowtwins.py#L44\r\n[2] https://github.com/lightly-ai/lightly/blob/master/examples/pytorch/barlowtwins.py#L35\r\n\r\n\n", "before_files": [{"content": "import torch\nfrom torch import nn\nimport torchvision\nimport pytorch_lightning as pl\n\nfrom lightly.data import LightlyDataset\nfrom lightly.data import ImageCollateFunction\nfrom lightly.loss import BarlowTwinsLoss\nfrom lightly.models.modules import BarlowTwinsProjectionHead\n\n\nclass BarlowTwins(pl.LightningModule):\n def __init__(self):\n super().__init__()\n resnet = torchvision.models.resnet18()\n self.backbone = nn.Sequential(*list(resnet.children())[:-1])\n self.projection_head = BarlowTwinsProjectionHead(512, 2048, 2048)\n self.criterion = BarlowTwinsLoss()\n\n def forward(self, x):\n x = self.backbone(x).flatten(start_dim=1)\n z = self.projection_head(x)\n return z\n\n def training_step(self, batch, batch_index):\n (x0, x1), _, _ = batch\n z0 = self.forward(x0)\n z1 = self.forward(x1)\n loss = self.criterion(z0, z1)\n return loss\n\n def configure_optimizers(self):\n optim = torch.optim.SGD(self.parameters(), lr=0.06)\n return optim\n\n\nmodel = BarlowTwins()\n\ncifar10 = torchvision.datasets.CIFAR10(\"datasets/cifar10\", download=True)\ndataset = LightlyDataset.from_torch_dataset(cifar10)\n# or create a dataset from a folder containing images or videos:\n# dataset = LightlyDataset(\"path/to/folder\")\n\ncollate_fn = ImageCollateFunction(input_size=224)\n\ndataloader = torch.utils.data.DataLoader(\n dataset,\n batch_size=256,\n collate_fn=collate_fn,\n shuffle=True,\n drop_last=True,\n num_workers=8,\n)\n\ngpus = 1 if torch.cuda.is_available() else 0\n\ntrainer = pl.Trainer(max_epochs=10, gpus=gpus)\ntrainer.fit(model=model, train_dataloaders=dataloader)\n", "path": "examples/pytorch_lightning/barlowtwins.py"}], "after_files": [{"content": "import torch\nfrom torch import nn\nimport torchvision\nimport pytorch_lightning as pl\n\nfrom lightly.data import LightlyDataset\nfrom lightly.data import ImageCollateFunction\nfrom lightly.loss import BarlowTwinsLoss\nfrom lightly.models.modules import BarlowTwinsProjectionHead\n\n\nclass BarlowTwins(pl.LightningModule):\n def __init__(self):\n super().__init__()\n resnet = torchvision.models.resnet18()\n self.backbone = nn.Sequential(*list(resnet.children())[:-1])\n self.projection_head = BarlowTwinsProjectionHead(512, 2048, 2048)\n self.criterion = BarlowTwinsLoss()\n\n def forward(self, x):\n x = self.backbone(x).flatten(start_dim=1)\n z = self.projection_head(x)\n return z\n\n def training_step(self, batch, batch_index):\n (x0, x1), _, _ = batch\n z0 = self.forward(x0)\n z1 = self.forward(x1)\n loss = self.criterion(z0, z1)\n return loss\n\n def configure_optimizers(self):\n optim = torch.optim.SGD(self.parameters(), lr=0.06)\n 
return optim\n\n\nmodel = BarlowTwins()\n\ncifar10 = torchvision.datasets.CIFAR10(\"datasets/cifar10\", download=True)\ndataset = LightlyDataset.from_torch_dataset(cifar10)\n# or create a dataset from a folder containing images or videos:\n# dataset = LightlyDataset(\"path/to/folder\")\n\ncollate_fn = ImageCollateFunction(input_size=32)\n\ndataloader = torch.utils.data.DataLoader(\n dataset,\n batch_size=256,\n collate_fn=collate_fn,\n shuffle=True,\n drop_last=True,\n num_workers=8,\n)\n\ngpus = 1 if torch.cuda.is_available() else 0\n\ntrainer = pl.Trainer(max_epochs=10, gpus=gpus)\ntrainer.fit(model=model, train_dataloaders=dataloader)\n", "path": "examples/pytorch_lightning/barlowtwins.py"}]} | 934 | 135 |
gh_patches_debug_924 | rasdani/github-patches | git_diff | joke2k__faker-993 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
text-unidecode is released under the Artistic license
`text-unidecode` is released under the Artistic license v1.0, which is considered non-free by the FSF (and therefore not compatible with the GPL). I believe this clause is also of concern to commercial users of faker too:
> 5. You may charge a reasonable copying fee for any distribution of this Package. You may charge any fee you choose for support of this Package. You may not charge a fee for this Package itself. However, you may distribute this Package in aggregate with other (possibly commercial) programs as part of a larger (possibly commercial) software distribution provided that you do not advertise this Package as a product of your own.
Not being able to charge a fee for the software is problematic for those of us who are contractors, for example.
I realise there aren't really any good alternatives (`unidecode` is GPL licensed as pointed out in #628 , `isounidecode` doesn't support Python 3), so would a patch making `text-unidecode` an optional dependency be acceptable?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # coding=utf-8
3
4 import io
5 import os
6
7 from setuptools import find_packages, setup
8
9 here = os.path.abspath(os.path.dirname(__file__))
10 with io.open(os.path.join(here, 'README.rst'), encoding='utf-8') as fp:
11 README = fp.read()
12
13 with io.open(os.path.join(here, 'VERSION')) as version_file:
14 VERSION = version_file.read().strip()
15
16
17 # this module can be zip-safe if the zipimporter implements iter_modules or if
18 # pkgutil.iter_importer_modules has registered a dispatch for the zipimporter.
19 try:
20 import pkgutil
21 import zipimport
22 zip_safe = hasattr(zipimport.zipimporter, "iter_modules") or \
23 zipimport.zipimporter in pkgutil.iter_importer_modules.registry.keys()
24 except (ImportError, AttributeError):
25 zip_safe = False
26
27 setup(
28 name='Faker',
29 version=VERSION,
30 description="Faker is a Python package that generates fake data for you.",
31 long_description=README,
32 entry_points={
33 'console_scripts': ['faker=faker.cli:execute_from_command_line'],
34 },
35 classifiers=[
36 # See https://pypi.org/pypi?%3Aaction=list_classifiers
37 'Development Status :: 5 - Production/Stable',
38 'Environment :: Console',
39 'Intended Audience :: Developers',
40 'Programming Language :: Python',
41 'Programming Language :: Python :: 2',
42 'Programming Language :: Python :: 2.7',
43 'Programming Language :: Python :: 3',
44 'Programming Language :: Python :: 3.4',
45 'Programming Language :: Python :: 3.5',
46 'Programming Language :: Python :: 3.6',
47 'Programming Language :: Python :: 3.7',
48 'Programming Language :: Python :: Implementation :: CPython',
49 'Programming Language :: Python :: Implementation :: PyPy',
50 'Topic :: Software Development :: Libraries :: Python Modules',
51 'Topic :: Software Development :: Testing',
52 'Topic :: Utilities',
53 'License :: OSI Approved :: MIT License',
54 ],
55 keywords='faker fixtures data test mock generator',
56 author='joke2k',
57 author_email='[email protected]',
58 url='https://github.com/joke2k/faker',
59 license='MIT License',
60 packages=find_packages(exclude=["docs", "tests", "tests.*"]),
61 platforms=["any"],
62 test_suite='tests',
63 zip_safe=zip_safe,
64 python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*",
65 setup_requires=["pytest-runner"],
66 install_requires=[
67 "python-dateutil>=2.4",
68 "six>=1.10",
69 "text-unidecode==1.2",
70 ],
71 tests_require=[
72 "validators>=0.13.0",
73 "ukpostcodeparser>=1.1.1",
74 "mock ; python_version < '3.3'",
75 "pytest>=3.8.0,<3.9",
76 "more-itertools<6.0.0 ; python_version < '3.0'",
77 # restricted because they may drop python2 support in future versions
78 # https://github.com/joke2k/faker/issues/970
79 "random2<1.1",
80 "freezegun<0.4",
81 ],
82 extras_require={
83 ':python_version<"3.3"': [
84 'ipaddress',
85 ],
86 },
87 )
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -66,7 +66,7 @@
install_requires=[
"python-dateutil>=2.4",
"six>=1.10",
- "text-unidecode==1.2",
+ "text-unidecode==1.3",
],
tests_require=[
"validators>=0.13.0",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -66,7 +66,7 @@\n install_requires=[\n \"python-dateutil>=2.4\",\n \"six>=1.10\",\n- \"text-unidecode==1.2\",\n+ \"text-unidecode==1.3\",\n ],\n tests_require=[\n \"validators>=0.13.0\",\n", "issue": "text-unidecode is released under the Artistic license\n`text-unidecode` is released under the Artistic license v1.0, which is considered non-free by the FSF (and therefore not compatible with the GPL). I believe this clause is also of concern to commercial users of faker too:\r\n\r\n> 5. You may charge a reasonable copying fee for any distribution of this Package. You may charge any fee you choose for support of this Package. You may not charge a fee for this Package itself. However, you may distribute this Package in aggregate with other (possibly commercial) programs as part of a larger (possibly commercial) software distribution provided that you do not advertise this Package as a product of your own.\r\n\r\nNot being able to charge a fee for the software is problematic for those of us who are contractors, for example.\r\n\r\nI realise there aren't really any good alternatives (`unidecode` is GPL licensed as pointed out in #628 , `isounidecode` doesn't support Python 3), so would a patch making `text-unidecode` an optional dependency be acceptable?\n", "before_files": [{"content": "#!/usr/bin/env python\n# coding=utf-8\n\nimport io\nimport os\n\nfrom setuptools import find_packages, setup\n\nhere = os.path.abspath(os.path.dirname(__file__))\nwith io.open(os.path.join(here, 'README.rst'), encoding='utf-8') as fp:\n README = fp.read()\n\nwith io.open(os.path.join(here, 'VERSION')) as version_file:\n VERSION = version_file.read().strip()\n\n\n# this module can be zip-safe if the zipimporter implements iter_modules or if\n# pkgutil.iter_importer_modules has registered a dispatch for the zipimporter.\ntry:\n import pkgutil\n import zipimport\n zip_safe = hasattr(zipimport.zipimporter, \"iter_modules\") or \\\n zipimport.zipimporter in pkgutil.iter_importer_modules.registry.keys()\nexcept (ImportError, AttributeError):\n zip_safe = False\n\nsetup(\n name='Faker',\n version=VERSION,\n description=\"Faker is a Python package that generates fake data for you.\",\n long_description=README,\n entry_points={\n 'console_scripts': ['faker=faker.cli:execute_from_command_line'],\n },\n classifiers=[\n # See https://pypi.org/pypi?%3Aaction=list_classifiers\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n 'Topic :: Software Development :: Testing',\n 'Topic :: Utilities',\n 'License :: OSI Approved :: MIT License',\n ],\n keywords='faker fixtures data test mock generator',\n author='joke2k',\n author_email='[email protected]',\n url='https://github.com/joke2k/faker',\n license='MIT License',\n packages=find_packages(exclude=[\"docs\", \"tests\", \"tests.*\"]),\n platforms=[\"any\"],\n test_suite='tests',\n zip_safe=zip_safe,\n 
python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*\",\n setup_requires=[\"pytest-runner\"],\n install_requires=[\n \"python-dateutil>=2.4\",\n \"six>=1.10\",\n \"text-unidecode==1.2\",\n ],\n tests_require=[\n \"validators>=0.13.0\",\n \"ukpostcodeparser>=1.1.1\",\n \"mock ; python_version < '3.3'\",\n \"pytest>=3.8.0,<3.9\",\n \"more-itertools<6.0.0 ; python_version < '3.0'\",\n # restricted because they may drop python2 support in future versions\n # https://github.com/joke2k/faker/issues/970\n \"random2<1.1\",\n \"freezegun<0.4\",\n ],\n extras_require={\n ':python_version<\"3.3\"': [\n 'ipaddress',\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# coding=utf-8\n\nimport io\nimport os\n\nfrom setuptools import find_packages, setup\n\nhere = os.path.abspath(os.path.dirname(__file__))\nwith io.open(os.path.join(here, 'README.rst'), encoding='utf-8') as fp:\n README = fp.read()\n\nwith io.open(os.path.join(here, 'VERSION')) as version_file:\n VERSION = version_file.read().strip()\n\n\n# this module can be zip-safe if the zipimporter implements iter_modules or if\n# pkgutil.iter_importer_modules has registered a dispatch for the zipimporter.\ntry:\n import pkgutil\n import zipimport\n zip_safe = hasattr(zipimport.zipimporter, \"iter_modules\") or \\\n zipimport.zipimporter in pkgutil.iter_importer_modules.registry.keys()\nexcept (ImportError, AttributeError):\n zip_safe = False\n\nsetup(\n name='Faker',\n version=VERSION,\n description=\"Faker is a Python package that generates fake data for you.\",\n long_description=README,\n entry_points={\n 'console_scripts': ['faker=faker.cli:execute_from_command_line'],\n },\n classifiers=[\n # See https://pypi.org/pypi?%3Aaction=list_classifiers\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n 'Topic :: Software Development :: Testing',\n 'Topic :: Utilities',\n 'License :: OSI Approved :: MIT License',\n ],\n keywords='faker fixtures data test mock generator',\n author='joke2k',\n author_email='[email protected]',\n url='https://github.com/joke2k/faker',\n license='MIT License',\n packages=find_packages(exclude=[\"docs\", \"tests\", \"tests.*\"]),\n platforms=[\"any\"],\n test_suite='tests',\n zip_safe=zip_safe,\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*\",\n setup_requires=[\"pytest-runner\"],\n install_requires=[\n \"python-dateutil>=2.4\",\n \"six>=1.10\",\n \"text-unidecode==1.3\",\n ],\n tests_require=[\n \"validators>=0.13.0\",\n \"ukpostcodeparser>=1.1.1\",\n \"mock ; python_version < '3.3'\",\n \"pytest>=3.8.0,<3.9\",\n \"more-itertools<6.0.0 ; python_version < '3.0'\",\n # restricted because they may drop python2 support in future versions\n # https://github.com/joke2k/faker/issues/970\n \"random2<1.1\",\n \"freezegun<0.4\",\n ],\n extras_require={\n ':python_version<\"3.3\"': [\n 'ipaddress',\n ],\n },\n)\n", "path": "setup.py"}]} | 1,404 | 96 |
gh_patches_debug_22236 | rasdani/github-patches | git_diff | onnx__onnx-4386 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make C++ and Python checker API consistent
Python checker API supports `full_check` arg:
https://github.com/onnx/onnx/blob/fa6f8cfdce3d86346e8a7494f3062b98416c85fb/onnx/checker.py#L94
C++ does not.
It'd be nice for them to be consistent.
--- END ISSUE ---
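For reference, the Python-side behaviour that the C++ checker is being asked to match looks roughly like the snippet below; the model path is a placeholder, not taken from the issue:

```python
import onnx
from onnx import checker

# full_check=True additionally runs strict shape inference on top of the
# structural checks; this is the option the C++ API currently lacks.
model = onnx.load("model.onnx")  # placeholder path
checker.check_model(model, full_check=True)
```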
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `onnx/checker.py`
Content:
```
1 # SPDX-License-Identifier: Apache-2.0
2 """onnx checker
3
4 This implements graphalities that allows us to check whether a serialized
5 proto is legal.
6 """
7
8 import functools
9
10 from onnx import (ValueInfoProto,
11 AttributeProto,
12 TensorProto,
13 SparseTensorProto,
14 NodeProto,
15 ModelProto,
16 GraphProto,
17 IR_VERSION)
18 import onnx.onnx_cpp2py_export.checker as C
19 import onnx.defs
20 from google.protobuf.message import Message
21 from typing import TypeVar, Callable, Any, Type, cast, Union
22 import onnx.shape_inference
23 import sys
24
25
26 # Limitation of single protobuf file is 2GB
27 MAXIMUM_PROTOBUF = 2000000000
28
29 # TODO: This thing where we reserialize the protobuf back into the
30 # string, only to deserialize it at the call site, is really goofy.
31 # Stop doing that.
32
33
34 # NB: Please don't edit this context!
35 DEFAULT_CONTEXT = C.CheckerContext()
36 DEFAULT_CONTEXT.ir_version = IR_VERSION
37 # TODO: Maybe ONNX-ML should also be defaulted?
38 DEFAULT_CONTEXT.opset_imports = {'': onnx.defs.onnx_opset_version()}
39
40
41 FuncType = TypeVar('FuncType', bound=Callable[..., Any])
42
43
44 # TODO: This really doesn't seem worth the metaprogramming...
45 def _create_checker(proto_type: Type[Message]) -> Callable[[FuncType], FuncType]:
46 def decorator(py_func: FuncType) -> FuncType:
47 @functools.wraps(py_func)
48 def checker(proto: Message, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> Any:
49 if not isinstance(proto, proto_type):
50 raise RuntimeError(
51 'You cannot pass an object that is not of type {}'.format(
52 proto_type.__name__))
53 return getattr(C, py_func.__name__)(
54 proto.SerializeToString(), ctx)
55 return cast(FuncType, checker)
56 return decorator
57
58
59 @_create_checker(ValueInfoProto)
60 def check_value_info(value_info: ValueInfoProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
61 pass
62
63
64 @_create_checker(TensorProto)
65 def check_tensor(tensor: TensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
66 pass
67
68
69 @_create_checker(AttributeProto)
70 def check_attribute(attr: AttributeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
71 pass
72
73
74 @_create_checker(NodeProto)
75 def check_node(node: NodeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
76 pass
77
78
79 @_create_checker(GraphProto)
80 def check_graph(graph: GraphProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
81 pass
82
83
84 def check_sparse_tensor(sparse: SparseTensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:
85 C.check_sparse_tensor(sparse.SerializeToString(), ctx)
86
87
88 def check_model(model: Union[ModelProto, str, bytes], full_check: bool = False) -> None:
89 """Check the consistency of a model. An exception is raised if the test fails.
90
91 Arguments:
92 model (ModelProto): model to check
93 full_check (bool): if True, the function checks shapes can be inferred
94 """
95 # If model is a path instead of ModelProto
96 if isinstance(model, str):
97 C.check_model_path(model)
98 if full_check:
99 onnx.shape_inference.infer_shapes_path(model, check_type=True, strict_mode=True)
100 else:
101 protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()
102 # If the protobuf is larger than 2GB,
103 # remind users should use the model path to check
104 if sys.getsizeof(protobuf_string) > MAXIMUM_PROTOBUF:
105 raise ValueError('This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.')
106 C.check_model(protobuf_string)
107 if full_check:
108 onnx.shape_inference.infer_shapes(model, check_type=True, strict_mode=True)
109
110
111 ValidationError = C.ValidationError
112
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/onnx/checker.py b/onnx/checker.py
--- a/onnx/checker.py
+++ b/onnx/checker.py
@@ -94,18 +94,14 @@
"""
# If model is a path instead of ModelProto
if isinstance(model, str):
- C.check_model_path(model)
- if full_check:
- onnx.shape_inference.infer_shapes_path(model, check_type=True, strict_mode=True)
+ C.check_model_path(model, full_check)
else:
protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()
# If the protobuf is larger than 2GB,
# remind users should use the model path to check
if sys.getsizeof(protobuf_string) > MAXIMUM_PROTOBUF:
raise ValueError('This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.')
- C.check_model(protobuf_string)
- if full_check:
- onnx.shape_inference.infer_shapes(model, check_type=True, strict_mode=True)
+ C.check_model(protobuf_string, full_check)
ValidationError = C.ValidationError
| {"golden_diff": "diff --git a/onnx/checker.py b/onnx/checker.py\n--- a/onnx/checker.py\n+++ b/onnx/checker.py\n@@ -94,18 +94,14 @@\n \"\"\"\n # If model is a path instead of ModelProto\n if isinstance(model, str):\n- C.check_model_path(model)\n- if full_check:\n- onnx.shape_inference.infer_shapes_path(model, check_type=True, strict_mode=True)\n+ C.check_model_path(model, full_check)\n else:\n protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()\n # If the protobuf is larger than 2GB,\n # remind users should use the model path to check\n if sys.getsizeof(protobuf_string) > MAXIMUM_PROTOBUF:\n raise ValueError('This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.')\n- C.check_model(protobuf_string)\n- if full_check:\n- onnx.shape_inference.infer_shapes(model, check_type=True, strict_mode=True)\n+ C.check_model(protobuf_string, full_check)\n \n \n ValidationError = C.ValidationError\n", "issue": "Make C++ and Python checker API consistent\nPython checker API supports `full_check` arg:\r\nhttps://github.com/onnx/onnx/blob/fa6f8cfdce3d86346e8a7494f3062b98416c85fb/onnx/checker.py#L94\r\n\r\nC++ does not.\r\nIt'd be nice for them to be consistent.\n", "before_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n\"\"\"onnx checker\n\nThis implements graphalities that allows us to check whether a serialized\nproto is legal.\n\"\"\"\n\nimport functools\n\nfrom onnx import (ValueInfoProto,\n AttributeProto,\n TensorProto,\n SparseTensorProto,\n NodeProto,\n ModelProto,\n GraphProto,\n IR_VERSION)\nimport onnx.onnx_cpp2py_export.checker as C\nimport onnx.defs\nfrom google.protobuf.message import Message\nfrom typing import TypeVar, Callable, Any, Type, cast, Union\nimport onnx.shape_inference\nimport sys\n\n\n# Limitation of single protobuf file is 2GB\nMAXIMUM_PROTOBUF = 2000000000\n\n# TODO: This thing where we reserialize the protobuf back into the\n# string, only to deserialize it at the call site, is really goofy.\n# Stop doing that.\n\n\n# NB: Please don't edit this context!\nDEFAULT_CONTEXT = C.CheckerContext()\nDEFAULT_CONTEXT.ir_version = IR_VERSION\n# TODO: Maybe ONNX-ML should also be defaulted?\nDEFAULT_CONTEXT.opset_imports = {'': onnx.defs.onnx_opset_version()}\n\n\nFuncType = TypeVar('FuncType', bound=Callable[..., Any])\n\n\n# TODO: This really doesn't seem worth the metaprogramming...\ndef _create_checker(proto_type: Type[Message]) -> Callable[[FuncType], FuncType]:\n def decorator(py_func: FuncType) -> FuncType:\n @functools.wraps(py_func)\n def checker(proto: Message, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> Any:\n if not isinstance(proto, proto_type):\n raise RuntimeError(\n 'You cannot pass an object that is not of type {}'.format(\n proto_type.__name__))\n return getattr(C, py_func.__name__)(\n proto.SerializeToString(), ctx)\n return cast(FuncType, checker)\n return decorator\n\n\n@_create_checker(ValueInfoProto)\ndef check_value_info(value_info: ValueInfoProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(TensorProto)\ndef check_tensor(tensor: TensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(AttributeProto)\ndef check_attribute(attr: AttributeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(NodeProto)\ndef check_node(node: NodeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(GraphProto)\ndef check_graph(graph: GraphProto, ctx: C.CheckerContext = 
DEFAULT_CONTEXT) -> None:\n pass\n\n\ndef check_sparse_tensor(sparse: SparseTensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n C.check_sparse_tensor(sparse.SerializeToString(), ctx)\n\n\ndef check_model(model: Union[ModelProto, str, bytes], full_check: bool = False) -> None:\n \"\"\"Check the consistency of a model. An exception is raised if the test fails.\n\n Arguments:\n model (ModelProto): model to check\n full_check (bool): if True, the function checks shapes can be inferred\n \"\"\"\n # If model is a path instead of ModelProto\n if isinstance(model, str):\n C.check_model_path(model)\n if full_check:\n onnx.shape_inference.infer_shapes_path(model, check_type=True, strict_mode=True)\n else:\n protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()\n # If the protobuf is larger than 2GB,\n # remind users should use the model path to check\n if sys.getsizeof(protobuf_string) > MAXIMUM_PROTOBUF:\n raise ValueError('This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.')\n C.check_model(protobuf_string)\n if full_check:\n onnx.shape_inference.infer_shapes(model, check_type=True, strict_mode=True)\n\n\nValidationError = C.ValidationError\n", "path": "onnx/checker.py"}], "after_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n\"\"\"onnx checker\n\nThis implements graphalities that allows us to check whether a serialized\nproto is legal.\n\"\"\"\n\nimport functools\n\nfrom onnx import (ValueInfoProto,\n AttributeProto,\n TensorProto,\n SparseTensorProto,\n NodeProto,\n ModelProto,\n GraphProto,\n IR_VERSION)\nimport onnx.onnx_cpp2py_export.checker as C\nimport onnx.defs\nfrom google.protobuf.message import Message\nfrom typing import TypeVar, Callable, Any, Type, cast, Union\nimport onnx.shape_inference\nimport sys\n\n\n# Limitation of single protobuf file is 2GB\nMAXIMUM_PROTOBUF = 2000000000\n\n# TODO: This thing where we reserialize the protobuf back into the\n# string, only to deserialize it at the call site, is really goofy.\n# Stop doing that.\n\n\n# NB: Please don't edit this context!\nDEFAULT_CONTEXT = C.CheckerContext()\nDEFAULT_CONTEXT.ir_version = IR_VERSION\n# TODO: Maybe ONNX-ML should also be defaulted?\nDEFAULT_CONTEXT.opset_imports = {'': onnx.defs.onnx_opset_version()}\n\n\nFuncType = TypeVar('FuncType', bound=Callable[..., Any])\n\n\n# TODO: This really doesn't seem worth the metaprogramming...\ndef _create_checker(proto_type: Type[Message]) -> Callable[[FuncType], FuncType]:\n def decorator(py_func: FuncType) -> FuncType:\n @functools.wraps(py_func)\n def checker(proto: Message, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> Any:\n if not isinstance(proto, proto_type):\n raise RuntimeError(\n 'You cannot pass an object that is not of type {}'.format(\n proto_type.__name__))\n return getattr(C, py_func.__name__)(\n proto.SerializeToString(), ctx)\n return cast(FuncType, checker)\n return decorator\n\n\n@_create_checker(ValueInfoProto)\ndef check_value_info(value_info: ValueInfoProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(TensorProto)\ndef check_tensor(tensor: TensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(AttributeProto)\ndef check_attribute(attr: AttributeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(NodeProto)\ndef check_node(node: NodeProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\n@_create_checker(GraphProto)\ndef check_graph(graph: 
GraphProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n pass\n\n\ndef check_sparse_tensor(sparse: SparseTensorProto, ctx: C.CheckerContext = DEFAULT_CONTEXT) -> None:\n C.check_sparse_tensor(sparse.SerializeToString(), ctx)\n\n\ndef check_model(model: Union[ModelProto, str, bytes], full_check: bool = False) -> None:\n \"\"\"Check the consistency of a model. An exception is raised if the test fails.\n\n Arguments:\n model (ModelProto): model to check\n full_check (bool): if True, the function checks shapes can be inferred\n \"\"\"\n # If model is a path instead of ModelProto\n if isinstance(model, str):\n C.check_model_path(model, full_check)\n else:\n protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()\n # If the protobuf is larger than 2GB,\n # remind users should use the model path to check\n if sys.getsizeof(protobuf_string) > MAXIMUM_PROTOBUF:\n raise ValueError('This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.')\n C.check_model(protobuf_string, full_check)\n\n\nValidationError = C.ValidationError\n", "path": "onnx/checker.py"}]} | 1,450 | 249 |
gh_patches_debug_37440 | rasdani/github-patches | git_diff | arviz-devs__arviz-636 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Allow normalization in plot_parallel
It would be great if plot_parallel had a keyword arg `normalize` (or standardize), that centers and rescales the variables before plotting. That would make it easier to see things if some posteriors are much more tight than others:

--- END ISSUE ---
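A minimal sketch of the per-variable standardization being requested, written in plain NumPy against the `(n_vars, n_draws)` array that `plot_parallel` assembles internally; the function name is illustrative, not an ArviZ API:

```python
import numpy as np

def standardize(posterior: np.ndarray) -> np.ndarray:
    """Center and rescale each variable (row) to zero mean and unit std."""
    mean = posterior.mean(axis=1, keepdims=True)
    std = posterior.std(axis=1, keepdims=True)
    return (posterior - mean) / std
```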
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `arviz/plots/parallelplot.py`
Content:
```
1 """Parallel coordinates plot showing posterior points with and without divergences marked."""
2 import matplotlib.pyplot as plt
3 import numpy as np
4
5 from ..data import convert_to_dataset
6 from .plot_utils import _scale_fig_size, xarray_to_ndarray, get_coords
7 from ..utils import _var_names
8
9
10 def plot_parallel(
11 data,
12 var_names=None,
13 coords=None,
14 figsize=None,
15 textsize=None,
16 legend=True,
17 colornd="k",
18 colord="C1",
19 shadend=0.025,
20 ax=None,
21 ):
22 """
23 Plot parallel coordinates plot showing posterior points with and without divergences.
24
25 Described by https://arxiv.org/abs/1709.01449, suggested by Ari Hartikainen
26
27 Parameters
28 ----------
29 data : obj
30 Any object that can be converted to an az.InferenceData object
31 Refer to documentation of az.convert_to_dataset for details
32 var_names : list of variable names
33 Variables to be plotted, if None all variable are plotted. Can be used to change the order
34 of the plotted variables
35 coords : mapping, optional
36 Coordinates of var_names to be plotted. Passed to `Dataset.sel`
37 figsize : tuple
38 Figure size. If None it will be defined automatically.
39 textsize: float
40 Text size scaling factor for labels, titles and lines. If None it will be autoscaled based
41 on figsize.
42 legend : bool
43 Flag for plotting legend (defaults to True)
44 colornd : valid matplotlib color
45 color for non-divergent points. Defaults to 'k'
46 colord : valid matplotlib color
47 color for divergent points. Defaults to 'C1'
48 shadend : float
49 Alpha blending value for non-divergent points, between 0 (invisible) and 1 (opaque).
50 Defaults to .025
51 ax : axes
52 Matplotlib axes.
53
54 Returns
55 -------
56 ax : matplotlib axes
57 """
58 if coords is None:
59 coords = {}
60
61 # Get diverging draws and combine chains
62 divergent_data = convert_to_dataset(data, group="sample_stats")
63 _, diverging_mask = xarray_to_ndarray(divergent_data, var_names=("diverging",), combined=True)
64 diverging_mask = np.squeeze(diverging_mask)
65
66 # Get posterior draws and combine chains
67 posterior_data = convert_to_dataset(data, group="posterior")
68 var_names = _var_names(var_names, posterior_data)
69 var_names, _posterior = xarray_to_ndarray(
70 get_coords(posterior_data, coords), var_names=var_names, combined=True
71 )
72
73 if len(var_names) < 2:
74 raise ValueError("This plot needs at least two variables")
75
76 figsize, _, _, xt_labelsize, _, _ = _scale_fig_size(figsize, textsize, 1, 1)
77
78 if ax is None:
79 _, ax = plt.subplots(figsize=figsize, constrained_layout=True)
80
81 ax.plot(_posterior[:, ~diverging_mask], color=colornd, alpha=shadend)
82
83 if np.any(diverging_mask):
84 ax.plot(_posterior[:, diverging_mask], color=colord, lw=1)
85
86 ax.tick_params(labelsize=textsize)
87 ax.set_xticks(range(len(var_names)))
88 ax.set_xticklabels(var_names)
89
90 if legend:
91 ax.plot([], color=colornd, label="non-divergent")
92 if np.any(diverging_mask):
93 ax.plot([], color=colord, label="divergent")
94 ax.legend(fontsize=xt_labelsize)
95
96 return ax
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/arviz/plots/parallelplot.py b/arviz/plots/parallelplot.py
--- a/arviz/plots/parallelplot.py
+++ b/arviz/plots/parallelplot.py
@@ -2,6 +2,7 @@
import matplotlib.pyplot as plt
import numpy as np
+from scipy.stats.mstats import rankdata
from ..data import convert_to_dataset
from .plot_utils import _scale_fig_size, xarray_to_ndarray, get_coords
from ..utils import _var_names
@@ -18,6 +19,7 @@
colord="C1",
shadend=0.025,
ax=None,
+ norm_method=None,
):
"""
Plot parallel coordinates plot showing posterior points with and without divergences.
@@ -50,10 +52,33 @@
Defaults to .025
ax : axes
Matplotlib axes.
+ norm_method : str
+ Method for normalizing the data. Methods include normal, minmax and rank.
+ Defaults to none.
Returns
-------
ax : matplotlib axes
+
+ Examples
+ --------
+ Plot default parallel plot
+
+ .. plot::
+ :context: close-figs
+
+ >>> import arviz as az
+ >>> data = az.load_arviz_data('centered_eight')
+ >>> az.plot_parallel(data, var_names=["mu", "tau"])
+
+
+ Plot parallel plot with normalization
+
+ .. plot::
+ :context: close-figs
+
+ >>> az.plot_parallel(data, var_names=["mu", "tau"], norm_method='normal')
+
"""
if coords is None:
coords = {}
@@ -69,9 +94,23 @@
var_names, _posterior = xarray_to_ndarray(
get_coords(posterior_data, coords), var_names=var_names, combined=True
)
-
if len(var_names) < 2:
raise ValueError("This plot needs at least two variables")
+ if norm_method is not None:
+ if norm_method == "normal":
+ mean = np.mean(_posterior, axis=1)
+ standard_deviation = np.std(_posterior, axis=1)
+ for i in range(0, np.shape(mean)[0]):
+ _posterior[i, :] = (_posterior[i, :] - mean[i]) / standard_deviation[i]
+ elif norm_method == "minmax":
+ min_elem = np.min(_posterior, axis=1)
+ max_elem = np.max(_posterior, axis=1)
+ for i in range(0, np.shape(min_elem)[0]):
+ _posterior[i, :] = ((_posterior[i, :]) - min_elem[i]) / (max_elem[i] - min_elem[i])
+ elif norm_method == "rank":
+ _posterior = rankdata(_posterior, axis=1)
+ else:
+ raise ValueError("{} is not supported. Use normal, minmax or rank.".format(norm_method))
figsize, _, _, xt_labelsize, _, _ = _scale_fig_size(figsize, textsize, 1, 1)
| {"golden_diff": "diff --git a/arviz/plots/parallelplot.py b/arviz/plots/parallelplot.py\n--- a/arviz/plots/parallelplot.py\n+++ b/arviz/plots/parallelplot.py\n@@ -2,6 +2,7 @@\n import matplotlib.pyplot as plt\n import numpy as np\n \n+from scipy.stats.mstats import rankdata\n from ..data import convert_to_dataset\n from .plot_utils import _scale_fig_size, xarray_to_ndarray, get_coords\n from ..utils import _var_names\n@@ -18,6 +19,7 @@\n colord=\"C1\",\n shadend=0.025,\n ax=None,\n+ norm_method=None,\n ):\n \"\"\"\n Plot parallel coordinates plot showing posterior points with and without divergences.\n@@ -50,10 +52,33 @@\n Defaults to .025\n ax : axes\n Matplotlib axes.\n+ norm_method : str\n+ Method for normalizing the data. Methods include normal, minmax and rank.\n+ Defaults to none.\n \n Returns\n -------\n ax : matplotlib axes\n+\n+ Examples\n+ --------\n+ Plot default parallel plot\n+\n+ .. plot::\n+ :context: close-figs\n+\n+ >>> import arviz as az\n+ >>> data = az.load_arviz_data('centered_eight')\n+ >>> az.plot_parallel(data, var_names=[\"mu\", \"tau\"])\n+\n+\n+ Plot parallel plot with normalization\n+\n+ .. plot::\n+ :context: close-figs\n+\n+ >>> az.plot_parallel(data, var_names=[\"mu\", \"tau\"], norm_method='normal')\n+\n \"\"\"\n if coords is None:\n coords = {}\n@@ -69,9 +94,23 @@\n var_names, _posterior = xarray_to_ndarray(\n get_coords(posterior_data, coords), var_names=var_names, combined=True\n )\n-\n if len(var_names) < 2:\n raise ValueError(\"This plot needs at least two variables\")\n+ if norm_method is not None:\n+ if norm_method == \"normal\":\n+ mean = np.mean(_posterior, axis=1)\n+ standard_deviation = np.std(_posterior, axis=1)\n+ for i in range(0, np.shape(mean)[0]):\n+ _posterior[i, :] = (_posterior[i, :] - mean[i]) / standard_deviation[i]\n+ elif norm_method == \"minmax\":\n+ min_elem = np.min(_posterior, axis=1)\n+ max_elem = np.max(_posterior, axis=1)\n+ for i in range(0, np.shape(min_elem)[0]):\n+ _posterior[i, :] = ((_posterior[i, :]) - min_elem[i]) / (max_elem[i] - min_elem[i])\n+ elif norm_method == \"rank\":\n+ _posterior = rankdata(_posterior, axis=1)\n+ else:\n+ raise ValueError(\"{} is not supported. Use normal, minmax or rank.\".format(norm_method))\n \n figsize, _, _, xt_labelsize, _, _ = _scale_fig_size(figsize, textsize, 1, 1)\n", "issue": "Allow normalization in plot_parallel\nIt would be great if plot_parallel had a keyword arg `normalize` (or standardize), that centers and rescales the variables before plotting. 
That would make it easier to see things if some posteriors are much more tight than others:\r\n\n", "before_files": [{"content": "\"\"\"Parallel coordinates plot showing posterior points with and without divergences marked.\"\"\"\nimport matplotlib.pyplot as plt\nimport numpy as np\n\nfrom ..data import convert_to_dataset\nfrom .plot_utils import _scale_fig_size, xarray_to_ndarray, get_coords\nfrom ..utils import _var_names\n\n\ndef plot_parallel(\n data,\n var_names=None,\n coords=None,\n figsize=None,\n textsize=None,\n legend=True,\n colornd=\"k\",\n colord=\"C1\",\n shadend=0.025,\n ax=None,\n):\n \"\"\"\n Plot parallel coordinates plot showing posterior points with and without divergences.\n\n Described by https://arxiv.org/abs/1709.01449, suggested by Ari Hartikainen\n\n Parameters\n ----------\n data : obj\n Any object that can be converted to an az.InferenceData object\n Refer to documentation of az.convert_to_dataset for details\n var_names : list of variable names\n Variables to be plotted, if None all variable are plotted. Can be used to change the order\n of the plotted variables\n coords : mapping, optional\n Coordinates of var_names to be plotted. Passed to `Dataset.sel`\n figsize : tuple\n Figure size. If None it will be defined automatically.\n textsize: float\n Text size scaling factor for labels, titles and lines. If None it will be autoscaled based\n on figsize.\n legend : bool\n Flag for plotting legend (defaults to True)\n colornd : valid matplotlib color\n color for non-divergent points. Defaults to 'k'\n colord : valid matplotlib color\n color for divergent points. Defaults to 'C1'\n shadend : float\n Alpha blending value for non-divergent points, between 0 (invisible) and 1 (opaque).\n Defaults to .025\n ax : axes\n Matplotlib axes.\n\n Returns\n -------\n ax : matplotlib axes\n \"\"\"\n if coords is None:\n coords = {}\n\n # Get diverging draws and combine chains\n divergent_data = convert_to_dataset(data, group=\"sample_stats\")\n _, diverging_mask = xarray_to_ndarray(divergent_data, var_names=(\"diverging\",), combined=True)\n diverging_mask = np.squeeze(diverging_mask)\n\n # Get posterior draws and combine chains\n posterior_data = convert_to_dataset(data, group=\"posterior\")\n var_names = _var_names(var_names, posterior_data)\n var_names, _posterior = xarray_to_ndarray(\n get_coords(posterior_data, coords), var_names=var_names, combined=True\n )\n\n if len(var_names) < 2:\n raise ValueError(\"This plot needs at least two variables\")\n\n figsize, _, _, xt_labelsize, _, _ = _scale_fig_size(figsize, textsize, 1, 1)\n\n if ax is None:\n _, ax = plt.subplots(figsize=figsize, constrained_layout=True)\n\n ax.plot(_posterior[:, ~diverging_mask], color=colornd, alpha=shadend)\n\n if np.any(diverging_mask):\n ax.plot(_posterior[:, diverging_mask], color=colord, lw=1)\n\n ax.tick_params(labelsize=textsize)\n ax.set_xticks(range(len(var_names)))\n ax.set_xticklabels(var_names)\n\n if legend:\n ax.plot([], color=colornd, label=\"non-divergent\")\n if np.any(diverging_mask):\n ax.plot([], color=colord, label=\"divergent\")\n ax.legend(fontsize=xt_labelsize)\n\n return ax\n", "path": "arviz/plots/parallelplot.py"}], "after_files": [{"content": "\"\"\"Parallel coordinates plot showing posterior points with and without divergences marked.\"\"\"\nimport matplotlib.pyplot as plt\nimport numpy as np\n\nfrom scipy.stats.mstats import rankdata\nfrom ..data import convert_to_dataset\nfrom .plot_utils import _scale_fig_size, xarray_to_ndarray, get_coords\nfrom ..utils import 
_var_names\n\n\ndef plot_parallel(\n data,\n var_names=None,\n coords=None,\n figsize=None,\n textsize=None,\n legend=True,\n colornd=\"k\",\n colord=\"C1\",\n shadend=0.025,\n ax=None,\n norm_method=None,\n):\n \"\"\"\n Plot parallel coordinates plot showing posterior points with and without divergences.\n\n Described by https://arxiv.org/abs/1709.01449, suggested by Ari Hartikainen\n\n Parameters\n ----------\n data : obj\n Any object that can be converted to an az.InferenceData object\n Refer to documentation of az.convert_to_dataset for details\n var_names : list of variable names\n Variables to be plotted, if None all variable are plotted. Can be used to change the order\n of the plotted variables\n coords : mapping, optional\n Coordinates of var_names to be plotted. Passed to `Dataset.sel`\n figsize : tuple\n Figure size. If None it will be defined automatically.\n textsize: float\n Text size scaling factor for labels, titles and lines. If None it will be autoscaled based\n on figsize.\n legend : bool\n Flag for plotting legend (defaults to True)\n colornd : valid matplotlib color\n color for non-divergent points. Defaults to 'k'\n colord : valid matplotlib color\n color for divergent points. Defaults to 'C1'\n shadend : float\n Alpha blending value for non-divergent points, between 0 (invisible) and 1 (opaque).\n Defaults to .025\n ax : axes\n Matplotlib axes.\n norm_method : str\n Method for normalizing the data. Methods include normal, minmax and rank.\n Defaults to none.\n\n Returns\n -------\n ax : matplotlib axes\n\n Examples\n --------\n Plot default parallel plot\n\n .. plot::\n :context: close-figs\n\n >>> import arviz as az\n >>> data = az.load_arviz_data('centered_eight')\n >>> az.plot_parallel(data, var_names=[\"mu\", \"tau\"])\n\n\n Plot parallel plot with normalization\n\n .. plot::\n :context: close-figs\n\n >>> az.plot_parallel(data, var_names=[\"mu\", \"tau\"], norm_method='normal')\n\n \"\"\"\n if coords is None:\n coords = {}\n\n # Get diverging draws and combine chains\n divergent_data = convert_to_dataset(data, group=\"sample_stats\")\n _, diverging_mask = xarray_to_ndarray(divergent_data, var_names=(\"diverging\",), combined=True)\n diverging_mask = np.squeeze(diverging_mask)\n\n # Get posterior draws and combine chains\n posterior_data = convert_to_dataset(data, group=\"posterior\")\n var_names = _var_names(var_names, posterior_data)\n var_names, _posterior = xarray_to_ndarray(\n get_coords(posterior_data, coords), var_names=var_names, combined=True\n )\n if len(var_names) < 2:\n raise ValueError(\"This plot needs at least two variables\")\n if norm_method is not None:\n if norm_method == \"normal\":\n mean = np.mean(_posterior, axis=1)\n standard_deviation = np.std(_posterior, axis=1)\n for i in range(0, np.shape(mean)[0]):\n _posterior[i, :] = (_posterior[i, :] - mean[i]) / standard_deviation[i]\n elif norm_method == \"minmax\":\n min_elem = np.min(_posterior, axis=1)\n max_elem = np.max(_posterior, axis=1)\n for i in range(0, np.shape(min_elem)[0]):\n _posterior[i, :] = ((_posterior[i, :]) - min_elem[i]) / (max_elem[i] - min_elem[i])\n elif norm_method == \"rank\":\n _posterior = rankdata(_posterior, axis=1)\n else:\n raise ValueError(\"{} is not supported. 
Use normal, minmax or rank.\".format(norm_method))\n\n figsize, _, _, xt_labelsize, _, _ = _scale_fig_size(figsize, textsize, 1, 1)\n\n if ax is None:\n _, ax = plt.subplots(figsize=figsize, constrained_layout=True)\n\n ax.plot(_posterior[:, ~diverging_mask], color=colornd, alpha=shadend)\n\n if np.any(diverging_mask):\n ax.plot(_posterior[:, diverging_mask], color=colord, lw=1)\n\n ax.tick_params(labelsize=textsize)\n ax.set_xticks(range(len(var_names)))\n ax.set_xticklabels(var_names)\n\n if legend:\n ax.plot([], color=colornd, label=\"non-divergent\")\n if np.any(diverging_mask):\n ax.plot([], color=colord, label=\"divergent\")\n ax.legend(fontsize=xt_labelsize)\n\n return ax\n", "path": "arviz/plots/parallelplot.py"}]} | 1,352 | 702 |
gh_patches_debug_30482 | rasdani/github-patches | git_diff | SeldonIO__MLServer-605 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add docker build option to not use cache
To ensure fresh environment and while potentially developing a new model version users will not want to use cached layers in the docker build for their image.
In docker this is the --no-cache option, I would make it a boolean option within the cli and also add the --rm option (although default is true might be worth making sure) to remove any intermediate containers after success to again ensure a clean environment in CI/CD as well as locally.
--- END ISSUE ---
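A rough sketch (not necessarily the project's actual implementation) of threading a `no_cache` flag through to the `docker build` invocation; `--rm` is Docker's default behaviour but is written out explicitly, as the issue suggests:

```python
import os
import subprocess

def build_image(folder: str, dockerfile_path: str, image_tag: str,
                no_cache: bool = False) -> str:
    # --no-cache forces every layer to be rebuilt; --rm removes intermediate
    # containers after a successful build (the default, stated for clarity).
    cache_flag = "--no-cache " if no_cache else ""
    build_cmd = (
        f"docker build --rm {cache_flag}{folder} "
        f"-f {dockerfile_path} -t {image_tag}"
    )
    build_env = os.environ.copy()
    build_env["DOCKER_BUILDKIT"] = "1"
    subprocess.run(build_cmd, check=True, shell=True, env=build_env)
    return image_tag
```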
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mlserver/cli/build.py`
Content:
```
1 import subprocess
2 import os
3
4 from tempfile import TemporaryDirectory
5
6 from .. import __version__
7 from ..logging import logger
8
9 from .constants import (
10 DockerfileName,
11 DockerfileTemplate,
12 DockerignoreName,
13 Dockerignore,
14 )
15
16
17 def generate_dockerfile() -> str:
18 return DockerfileTemplate.format(version=__version__)
19
20
21 def write_dockerfile(
22 folder: str, dockerfile: str, include_dockerignore: bool = True
23 ) -> str:
24 dockerfile_path = os.path.join(folder, DockerfileName)
25 with open(dockerfile_path, "w") as dockerfile_handler:
26 logger.info(f"Writing Dockerfile in {dockerfile_path}")
27 dockerfile_handler.write(dockerfile)
28
29 if include_dockerignore:
30 # Point to our own .dockerignore
31 # https://docs.docker.com/engine/reference/commandline/build/#use-a-dockerignore-file
32 dockerignore_path = dockerfile_path + DockerignoreName
33 with open(dockerignore_path, "w") as dockerignore_handler:
34 logger.info(f"Writing .dockerignore in {dockerignore_path}")
35 dockerignore_handler.write(Dockerignore)
36
37 return dockerfile_path
38
39
40 def build_image(folder: str, dockerfile: str, image_tag: str) -> str:
41 logger.info(f"Building Docker image with tag {image_tag}")
42 with TemporaryDirectory() as tmp_dir:
43 dockerfile_path = write_dockerfile(tmp_dir, dockerfile)
44
45 build_cmd = f"docker build {folder} -f {dockerfile_path} -t {image_tag}"
46 build_env = os.environ.copy()
47 build_env["DOCKER_BUILDKIT"] = "1"
48 subprocess.run(build_cmd, check=True, shell=True, env=build_env)
49
50 return image_tag
51
```
Path: `mlserver/cli/main.py`
Content:
```
1 """
2 Command-line interface to manage MLServer models.
3 """
4 import click
5 import asyncio
6
7 from functools import wraps
8
9 from ..server import MLServer
10 from ..logging import logger, configure_logger
11 from ..utils import install_uvloop_event_loop
12
13 from .build import generate_dockerfile, build_image, write_dockerfile
14 from .serve import load_settings
15
16
17 def click_async(f):
18 @wraps(f)
19 def wrapper(*args, **kwargs):
20 return asyncio.run(f(*args, **kwargs))
21
22 return wrapper
23
24
25 @click.group()
26 @click.version_option()
27 def root():
28 """
29 Command-line interface to manage MLServer models.
30 """
31 pass
32
33
34 @root.command("start")
35 @click.argument("folder", nargs=1)
36 @click_async
37 async def start(folder: str):
38 """
39 Start serving a machine learning model with MLServer.
40 """
41 settings, models_settings = await load_settings(folder)
42
43 server = MLServer(settings)
44 await server.start(models_settings)
45
46
47 @root.command("build")
48 @click.argument("folder", nargs=1)
49 @click.option("-t", "--tag", type=str)
50 @click_async
51 async def build(folder: str, tag: str):
52 """
53 Build a Docker image for a custom MLServer runtime.
54 """
55 dockerfile = generate_dockerfile()
56 build_image(folder, dockerfile, tag)
57 logger.info(f"Successfully built custom Docker image with tag {tag}")
58
59
60 @root.command("dockerfile")
61 @click.argument("folder", nargs=1)
62 @click.option("-i", "--include-dockerignore", is_flag=True)
63 @click_async
64 async def dockerfile(folder: str, include_dockerignore: bool):
65 """
66 Generate a Dockerfile
67 """
68 dockerfile = generate_dockerfile()
69 dockerfile_path = write_dockerfile(
70 folder, dockerfile, include_dockerignore=include_dockerignore
71 )
72 logger.info(f"Successfully written Dockerfile in {dockerfile_path}")
73
74
75 def main():
76 configure_logger()
77 install_uvloop_event_loop()
78 root()
79
80
81 if __name__ == "__main__":
82 main()
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mlserver/cli/build.py b/mlserver/cli/build.py
--- a/mlserver/cli/build.py
+++ b/mlserver/cli/build.py
@@ -37,12 +37,18 @@
return dockerfile_path
-def build_image(folder: str, dockerfile: str, image_tag: str) -> str:
+def build_image(
+ folder: str, dockerfile: str, image_tag: str, no_cache: bool = False
+) -> str:
logger.info(f"Building Docker image with tag {image_tag}")
+ _docker_command_prefix = "docker build --rm "
with TemporaryDirectory() as tmp_dir:
dockerfile_path = write_dockerfile(tmp_dir, dockerfile)
-
- build_cmd = f"docker build {folder} -f {dockerfile_path} -t {image_tag}"
+ _docker_command_suffix = f"{folder} -f {dockerfile_path} -t {image_tag}"
+ if no_cache:
+ build_cmd = _docker_command_prefix + "--no-cache " + _docker_command_suffix
+ else:
+ build_cmd = _docker_command_prefix + _docker_command_suffix
build_env = os.environ.copy()
build_env["DOCKER_BUILDKIT"] = "1"
subprocess.run(build_cmd, check=True, shell=True, env=build_env)
diff --git a/mlserver/cli/main.py b/mlserver/cli/main.py
--- a/mlserver/cli/main.py
+++ b/mlserver/cli/main.py
@@ -47,13 +47,14 @@
@root.command("build")
@click.argument("folder", nargs=1)
@click.option("-t", "--tag", type=str)
[email protected]("--no-cache", default=False, is_flag=True)
@click_async
-async def build(folder: str, tag: str):
+async def build(folder: str, tag: str, no_cache: bool = False):
"""
Build a Docker image for a custom MLServer runtime.
"""
dockerfile = generate_dockerfile()
- build_image(folder, dockerfile, tag)
+ build_image(folder, dockerfile, tag, no_cache=no_cache)
logger.info(f"Successfully built custom Docker image with tag {tag}")
| {"golden_diff": "diff --git a/mlserver/cli/build.py b/mlserver/cli/build.py\n--- a/mlserver/cli/build.py\n+++ b/mlserver/cli/build.py\n@@ -37,12 +37,18 @@\n return dockerfile_path\n \n \n-def build_image(folder: str, dockerfile: str, image_tag: str) -> str:\n+def build_image(\n+ folder: str, dockerfile: str, image_tag: str, no_cache: bool = False\n+) -> str:\n logger.info(f\"Building Docker image with tag {image_tag}\")\n+ _docker_command_prefix = \"docker build --rm \"\n with TemporaryDirectory() as tmp_dir:\n dockerfile_path = write_dockerfile(tmp_dir, dockerfile)\n-\n- build_cmd = f\"docker build {folder} -f {dockerfile_path} -t {image_tag}\"\n+ _docker_command_suffix = f\"{folder} -f {dockerfile_path} -t {image_tag}\"\n+ if no_cache:\n+ build_cmd = _docker_command_prefix + \"--no-cache \" + _docker_command_suffix\n+ else:\n+ build_cmd = _docker_command_prefix + _docker_command_suffix\n build_env = os.environ.copy()\n build_env[\"DOCKER_BUILDKIT\"] = \"1\"\n subprocess.run(build_cmd, check=True, shell=True, env=build_env)\ndiff --git a/mlserver/cli/main.py b/mlserver/cli/main.py\n--- a/mlserver/cli/main.py\n+++ b/mlserver/cli/main.py\n@@ -47,13 +47,14 @@\n @root.command(\"build\")\n @click.argument(\"folder\", nargs=1)\n @click.option(\"-t\", \"--tag\", type=str)\[email protected](\"--no-cache\", default=False, is_flag=True)\n @click_async\n-async def build(folder: str, tag: str):\n+async def build(folder: str, tag: str, no_cache: bool = False):\n \"\"\"\n Build a Docker image for a custom MLServer runtime.\n \"\"\"\n dockerfile = generate_dockerfile()\n- build_image(folder, dockerfile, tag)\n+ build_image(folder, dockerfile, tag, no_cache=no_cache)\n logger.info(f\"Successfully built custom Docker image with tag {tag}\")\n", "issue": "Add docker build option to not use cache\nTo ensure fresh environment and while potentially developing a new model version users will not want to use cached layers in the docker build for their image.\r\n\r\nIn docker this is the --no-cache option, I would make it a boolean option within the cli and also add the --rm option (although default is true might be worth making sure) to remove any intermediate containers after success to again ensure a clean environment in CI/CD as well as locally. \n", "before_files": [{"content": "import subprocess\nimport os\n\nfrom tempfile import TemporaryDirectory\n\nfrom .. 
import __version__\nfrom ..logging import logger\n\nfrom .constants import (\n DockerfileName,\n DockerfileTemplate,\n DockerignoreName,\n Dockerignore,\n)\n\n\ndef generate_dockerfile() -> str:\n return DockerfileTemplate.format(version=__version__)\n\n\ndef write_dockerfile(\n folder: str, dockerfile: str, include_dockerignore: bool = True\n) -> str:\n dockerfile_path = os.path.join(folder, DockerfileName)\n with open(dockerfile_path, \"w\") as dockerfile_handler:\n logger.info(f\"Writing Dockerfile in {dockerfile_path}\")\n dockerfile_handler.write(dockerfile)\n\n if include_dockerignore:\n # Point to our own .dockerignore\n # https://docs.docker.com/engine/reference/commandline/build/#use-a-dockerignore-file\n dockerignore_path = dockerfile_path + DockerignoreName\n with open(dockerignore_path, \"w\") as dockerignore_handler:\n logger.info(f\"Writing .dockerignore in {dockerignore_path}\")\n dockerignore_handler.write(Dockerignore)\n\n return dockerfile_path\n\n\ndef build_image(folder: str, dockerfile: str, image_tag: str) -> str:\n logger.info(f\"Building Docker image with tag {image_tag}\")\n with TemporaryDirectory() as tmp_dir:\n dockerfile_path = write_dockerfile(tmp_dir, dockerfile)\n\n build_cmd = f\"docker build {folder} -f {dockerfile_path} -t {image_tag}\"\n build_env = os.environ.copy()\n build_env[\"DOCKER_BUILDKIT\"] = \"1\"\n subprocess.run(build_cmd, check=True, shell=True, env=build_env)\n\n return image_tag\n", "path": "mlserver/cli/build.py"}, {"content": "\"\"\"\nCommand-line interface to manage MLServer models.\n\"\"\"\nimport click\nimport asyncio\n\nfrom functools import wraps\n\nfrom ..server import MLServer\nfrom ..logging import logger, configure_logger\nfrom ..utils import install_uvloop_event_loop\n\nfrom .build import generate_dockerfile, build_image, write_dockerfile\nfrom .serve import load_settings\n\n\ndef click_async(f):\n @wraps(f)\n def wrapper(*args, **kwargs):\n return asyncio.run(f(*args, **kwargs))\n\n return wrapper\n\n\[email protected]()\[email protected]_option()\ndef root():\n \"\"\"\n Command-line interface to manage MLServer models.\n \"\"\"\n pass\n\n\[email protected](\"start\")\[email protected](\"folder\", nargs=1)\n@click_async\nasync def start(folder: str):\n \"\"\"\n Start serving a machine learning model with MLServer.\n \"\"\"\n settings, models_settings = await load_settings(folder)\n\n server = MLServer(settings)\n await server.start(models_settings)\n\n\[email protected](\"build\")\[email protected](\"folder\", nargs=1)\[email protected](\"-t\", \"--tag\", type=str)\n@click_async\nasync def build(folder: str, tag: str):\n \"\"\"\n Build a Docker image for a custom MLServer runtime.\n \"\"\"\n dockerfile = generate_dockerfile()\n build_image(folder, dockerfile, tag)\n logger.info(f\"Successfully built custom Docker image with tag {tag}\")\n\n\[email protected](\"dockerfile\")\[email protected](\"folder\", nargs=1)\[email protected](\"-i\", \"--include-dockerignore\", is_flag=True)\n@click_async\nasync def dockerfile(folder: str, include_dockerignore: bool):\n \"\"\"\n Generate a Dockerfile\n \"\"\"\n dockerfile = generate_dockerfile()\n dockerfile_path = write_dockerfile(\n folder, dockerfile, include_dockerignore=include_dockerignore\n )\n logger.info(f\"Successfully written Dockerfile in {dockerfile_path}\")\n\n\ndef main():\n configure_logger()\n install_uvloop_event_loop()\n root()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "mlserver/cli/main.py"}], "after_files": [{"content": "import subprocess\nimport 
os\n\nfrom tempfile import TemporaryDirectory\n\nfrom .. import __version__\nfrom ..logging import logger\n\nfrom .constants import (\n DockerfileName,\n DockerfileTemplate,\n DockerignoreName,\n Dockerignore,\n)\n\n\ndef generate_dockerfile() -> str:\n return DockerfileTemplate.format(version=__version__)\n\n\ndef write_dockerfile(\n folder: str, dockerfile: str, include_dockerignore: bool = True\n) -> str:\n dockerfile_path = os.path.join(folder, DockerfileName)\n with open(dockerfile_path, \"w\") as dockerfile_handler:\n logger.info(f\"Writing Dockerfile in {dockerfile_path}\")\n dockerfile_handler.write(dockerfile)\n\n if include_dockerignore:\n # Point to our own .dockerignore\n # https://docs.docker.com/engine/reference/commandline/build/#use-a-dockerignore-file\n dockerignore_path = dockerfile_path + DockerignoreName\n with open(dockerignore_path, \"w\") as dockerignore_handler:\n logger.info(f\"Writing .dockerignore in {dockerignore_path}\")\n dockerignore_handler.write(Dockerignore)\n\n return dockerfile_path\n\n\ndef build_image(\n folder: str, dockerfile: str, image_tag: str, no_cache: bool = False\n) -> str:\n logger.info(f\"Building Docker image with tag {image_tag}\")\n _docker_command_prefix = \"docker build --rm \"\n with TemporaryDirectory() as tmp_dir:\n dockerfile_path = write_dockerfile(tmp_dir, dockerfile)\n _docker_command_suffix = f\"{folder} -f {dockerfile_path} -t {image_tag}\"\n if no_cache:\n build_cmd = _docker_command_prefix + \"--no-cache \" + _docker_command_suffix\n else:\n build_cmd = _docker_command_prefix + _docker_command_suffix\n build_env = os.environ.copy()\n build_env[\"DOCKER_BUILDKIT\"] = \"1\"\n subprocess.run(build_cmd, check=True, shell=True, env=build_env)\n\n return image_tag\n", "path": "mlserver/cli/build.py"}, {"content": "\"\"\"\nCommand-line interface to manage MLServer models.\n\"\"\"\nimport click\nimport asyncio\n\nfrom functools import wraps\n\nfrom ..server import MLServer\nfrom ..logging import logger, configure_logger\nfrom ..utils import install_uvloop_event_loop\n\nfrom .build import generate_dockerfile, build_image, write_dockerfile\nfrom .serve import load_settings\n\n\ndef click_async(f):\n @wraps(f)\n def wrapper(*args, **kwargs):\n return asyncio.run(f(*args, **kwargs))\n\n return wrapper\n\n\[email protected]()\[email protected]_option()\ndef root():\n \"\"\"\n Command-line interface to manage MLServer models.\n \"\"\"\n pass\n\n\[email protected](\"start\")\[email protected](\"folder\", nargs=1)\n@click_async\nasync def start(folder: str):\n \"\"\"\n Start serving a machine learning model with MLServer.\n \"\"\"\n settings, models_settings = await load_settings(folder)\n\n server = MLServer(settings)\n await server.start(models_settings)\n\n\[email protected](\"build\")\[email protected](\"folder\", nargs=1)\[email protected](\"-t\", \"--tag\", type=str)\[email protected](\"--no-cache\", default=False, is_flag=True)\n@click_async\nasync def build(folder: str, tag: str, no_cache: bool = False):\n \"\"\"\n Build a Docker image for a custom MLServer runtime.\n \"\"\"\n dockerfile = generate_dockerfile()\n build_image(folder, dockerfile, tag, no_cache=no_cache)\n logger.info(f\"Successfully built custom Docker image with tag {tag}\")\n\n\[email protected](\"dockerfile\")\[email protected](\"folder\", nargs=1)\[email protected](\"-i\", \"--include-dockerignore\", is_flag=True)\n@click_async\nasync def dockerfile(folder: str, include_dockerignore: bool):\n \"\"\"\n Generate a Dockerfile\n \"\"\"\n dockerfile = 
generate_dockerfile()\n dockerfile_path = write_dockerfile(\n folder, dockerfile, include_dockerignore=include_dockerignore\n )\n logger.info(f\"Successfully written Dockerfile in {dockerfile_path}\")\n\n\ndef main():\n configure_logger()\n install_uvloop_event_loop()\n root()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "mlserver/cli/main.py"}]} | 1,461 | 479 |
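The effect of the new `--no-cache` flag in the patch above comes down to how the docker command string is assembled. Below is a stripped-down sketch of that assembly; the folder, Dockerfile path and image tag are made-up placeholders, not values from the MLServer code base.

```python
# Standalone sketch of the command assembly introduced by the patch above.
def build_cmd(folder: str, dockerfile_path: str, image_tag: str, no_cache: bool = False) -> str:
    prefix = "docker build --rm "
    suffix = f"{folder} -f {dockerfile_path} -t {image_tag}"
    # --no-cache is spliced in between the fixed prefix and the per-build suffix
    return prefix + ("--no-cache " if no_cache else "") + suffix

print(build_cmd("./my-model", "/tmp/Dockerfile", "my-runtime:0.1"))
# docker build --rm ./my-model -f /tmp/Dockerfile -t my-runtime:0.1
print(build_cmd("./my-model", "/tmp/Dockerfile", "my-runtime:0.1", no_cache=True))
# docker build --rm --no-cache ./my-model -f /tmp/Dockerfile -t my-runtime:0.1
```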
gh_patches_debug_30513 | rasdani/github-patches | git_diff | great-expectations__great_expectations-1292 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
QueryBatchKwargsGenerator._get_raw_query produces incorrect filepath
I am using the `QueryBatchKwargsGenerator` to generate `BATCH_KWARGS` using a SQL file I have stored on disk. My `great_expectations.yml` looks like:
```yaml
datasources:
domi:
class_name: SqlAlchemyDatasource
module_name:
data_asset_type:
class_name: SqlAlchemyDataset
credentials:
drivername: postgres
host: ${HOST}
port: ${PORT}
username: ${USER}
password: ${PASSWORD}
database: ${DBNAME}
generators:
apartments-generator:
class_name: QueryBatchKwargsGenerator
name: apartments
```
I am running the following:
```python
>>> import great_expectations as ge
>>> context = ge.data_context.DataContext()
>>> context.get_available_data_asset_names()
{'domi': {'apartments-generator': {'names': [('listings', 'query')]}}}
>>> context.build_batch_kwargs(
datasource='domi', generator='apartments-generator', name='listings'
)
```
Which raises this error:

This happens because `_get_raw_query` does `tuple('listings')`, which returns `('l', 'i', 's', 't', 'i', 'n', 'g', 's')`:
```python
def _get_raw_query(self, generator_asset):
return self._store_backend.get(tuple(generator_asset))
```
I believe this should just be replaced with:
```python
def _get_raw_query(self, generator_asset):
return self._store_backend.get((generator_asset,))
```
Currently I can get around this by providing a tuple instead of a string, i.e.
```python
context.build_batch_kwargs(
datasource='domi',
generator='apartments-generator',
name=('listings', )
)
```
But that goes against what the docs are saying [here](https://docs.greatexpectations.io/en/latest/module_docs/data_context_module.html#great_expectations.data_context.BaseDataContext.build_batch_kwargs):

--- END ISSUE ---
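The pitfall described above is easy to reproduce in isolation: `tuple()` applied to a string iterates over its characters, while a trailing comma builds the intended single-element tuple. A minimal standalone illustration (the stored query text is made up):

```python
generator_asset = "listings"

print(tuple(generator_asset))  # ('l', 'i', 's', 't', 'i', 'n', 'g', 's')
print((generator_asset,))      # ('listings',)

# A backend keyed on single-element tuples only matches the second form.
store = {("listings",): "SELECT * FROM listings;"}
print(store.get(tuple(generator_asset)))  # None -- the lookup misses
print(store.get((generator_asset,)))      # SELECT * FROM listings;
```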
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `great_expectations/datasource/generator/query_generator.py`
Content:
```
1 import os
2 import logging
3
4 from .batch_kwargs_generator import BatchKwargsGenerator
5 from great_expectations.datasource.types import SqlAlchemyDatasourceQueryBatchKwargs
6 from great_expectations.exceptions import (
7 BatchKwargsError,
8 ClassInstantiationError,
9 )
10 from ...data_context.util import instantiate_class_from_config
11
12 logger = logging.getLogger(__name__)
13
14 try:
15 import sqlalchemy
16 from sqlalchemy import create_engine
17 from sqlalchemy.engine import reflection
18 except ImportError:
19 sqlalchemy = None
20 create_engine = None
21 reflection = None
22 logger.debug("Unable to import sqlalchemy.")
23
24
25 class QueryBatchKwargsGenerator(BatchKwargsGenerator):
26 """Produce query-style batch_kwargs from sql files stored on disk
27 """
28 recognized_batch_parameters = {'query_parameters', 'partition_id'}
29
30 def __init__(self, name="default", datasource=None, query_store_backend=None, queries=None):
31 super(QueryBatchKwargsGenerator, self).__init__(name=name, datasource=datasource)
32 root_directory = None
33 if query_store_backend is None:
34 # We will choose a Tuple store if there is a configured DataContext with a root_directory,
35 # and an InMemoryStore otherwise
36 if datasource and datasource.data_context and datasource.data_context.root_directory:
37 query_store_backend = {
38 "class_name": "TupleFilesystemStoreBackend",
39 "base_directory": os.path.join(datasource.data_context.root_directory, "datasources",
40 datasource.name, "generators", name),
41 "filepath_suffix": ".sql"
42 }
43 root_directory = datasource.data_context.root_directory
44 else:
45 query_store_backend = {
46 "class_name": "InMemoryStoreBackend"
47 }
48 module_name = 'great_expectations.data_context.store'
49 self._store_backend = instantiate_class_from_config(
50 config=query_store_backend,
51 runtime_environment={
52 "root_directory": root_directory
53 },
54 config_defaults={
55 "module_name": module_name
56 }
57 )
58 if not self._store_backend:
59 raise ClassInstantiationError(
60 module_name=module_name,
61 package_name=None,
62 class_name=query_store_backend['class_name']
63 )
64 if queries is not None:
65 for query_name, query in queries.items():
66 self.add_query(query_name, query)
67
68 def _get_raw_query(self, generator_asset):
69 return self._store_backend.get(tuple(generator_asset))
70
71 def _get_iterator(self, generator_asset, query_parameters=None):
72 raw_query = self._get_raw_query(generator_asset)
73 if raw_query is None:
74 logger.warning("No query defined for generator asset: %s" % generator_asset)
75 # There is no valid query path or temp query storage defined with the generator_asset
76 return None
77
78 if query_parameters is None:
79 iter_ = iter([
80 SqlAlchemyDatasourceQueryBatchKwargs(
81 query=raw_query
82 )])
83 else:
84 iter_= iter([
85 SqlAlchemyDatasourceQueryBatchKwargs(
86 query=raw_query,
87 query_parameters=query_parameters
88 )])
89
90 return iter_
91
92 def add_query(self, generator_asset, query):
93 # Backends must have a tuple key; we use only a single-element tuple
94 self._store_backend.set(tuple(generator_asset), query)
95
96 def get_available_data_asset_names(self):
97 defined_queries = self._store_backend.list_keys()
98 # Backends must have a tuple key; we use only a single-element tuple
99 return {"names": [(query_key_tuple[0], "query") for query_key_tuple in defined_queries]}
100
101 def _build_batch_kwargs(self, batch_parameters):
102 """Build batch kwargs from a partition id."""
103 generator_asset = batch_parameters.pop("name")
104 raw_query = self._get_raw_query(generator_asset)
105 partition_id = batch_parameters.pop("partition_id", None)
106 batch_kwargs = self._datasource.process_batch_parameters(**batch_parameters)
107 batch_kwargs["query"] = raw_query
108
109 if partition_id:
110 if not batch_kwargs["query_parameters"]:
111 batch_kwargs["query_parameters"] = {}
112 batch_kwargs["query_parameters"]["partition_id"] = partition_id
113
114 return SqlAlchemyDatasourceQueryBatchKwargs(batch_kwargs)
115
116 def get_available_partition_ids(self, generator_asset):
117 raise BatchKwargsError("QueryBatchKwargsGenerator cannot identify partitions.", {})
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/great_expectations/datasource/generator/query_generator.py b/great_expectations/datasource/generator/query_generator.py
--- a/great_expectations/datasource/generator/query_generator.py
+++ b/great_expectations/datasource/generator/query_generator.py
@@ -25,7 +25,7 @@
class QueryBatchKwargsGenerator(BatchKwargsGenerator):
"""Produce query-style batch_kwargs from sql files stored on disk
"""
- recognized_batch_parameters = {'query_parameters', 'partition_id'}
+ recognized_batch_parameters = {'query_parameters', 'partition_id', 'name'}
def __init__(self, name="default", datasource=None, query_store_backend=None, queries=None):
super(QueryBatchKwargsGenerator, self).__init__(name=name, datasource=datasource)
@@ -66,7 +66,7 @@
self.add_query(query_name, query)
def _get_raw_query(self, generator_asset):
- return self._store_backend.get(tuple(generator_asset))
+ return self._store_backend.get((generator_asset,))
def _get_iterator(self, generator_asset, query_parameters=None):
raw_query = self._get_raw_query(generator_asset)
@@ -91,7 +91,7 @@
def add_query(self, generator_asset, query):
# Backends must have a tuple key; we use only a single-element tuple
- self._store_backend.set(tuple(generator_asset), query)
+ self._store_backend.set((generator_asset,), query)
def get_available_data_asset_names(self):
defined_queries = self._store_backend.list_keys()
| {"golden_diff": "diff --git a/great_expectations/datasource/generator/query_generator.py b/great_expectations/datasource/generator/query_generator.py\n--- a/great_expectations/datasource/generator/query_generator.py\n+++ b/great_expectations/datasource/generator/query_generator.py\n@@ -25,7 +25,7 @@\n class QueryBatchKwargsGenerator(BatchKwargsGenerator):\n \"\"\"Produce query-style batch_kwargs from sql files stored on disk\n \"\"\"\n- recognized_batch_parameters = {'query_parameters', 'partition_id'}\n+ recognized_batch_parameters = {'query_parameters', 'partition_id', 'name'}\n \n def __init__(self, name=\"default\", datasource=None, query_store_backend=None, queries=None):\n super(QueryBatchKwargsGenerator, self).__init__(name=name, datasource=datasource)\n@@ -66,7 +66,7 @@\n self.add_query(query_name, query)\n \n def _get_raw_query(self, generator_asset):\n- return self._store_backend.get(tuple(generator_asset))\n+ return self._store_backend.get((generator_asset,))\n \n def _get_iterator(self, generator_asset, query_parameters=None):\n raw_query = self._get_raw_query(generator_asset)\n@@ -91,7 +91,7 @@\n \n def add_query(self, generator_asset, query):\n # Backends must have a tuple key; we use only a single-element tuple\n- self._store_backend.set(tuple(generator_asset), query)\n+ self._store_backend.set((generator_asset,), query)\n \n def get_available_data_asset_names(self):\n defined_queries = self._store_backend.list_keys()\n", "issue": "QueryBatchKwargsGenerator._get_raw_query produces incorrect filepath\nI am using the `QueryBatchKwargsGenerator` to generate `BATCH_KWARGS` using a SQL file I have stored on disk. My `great_expectations.yml` looks like:\r\n\r\n```yaml\r\ndatasources:\r\n domi:\r\n class_name: SqlAlchemyDatasource\r\n module_name:\r\n data_asset_type:\r\n class_name: SqlAlchemyDataset\r\n credentials:\r\n drivername: postgres\r\n host: ${HOST}\r\n port: ${PORT}\r\n username: ${USER}\r\n password: ${PASSWORD}\r\n database: ${DBNAME}\r\n generators:\r\n apartments-generator:\r\n class_name: QueryBatchKwargsGenerator\r\n name: apartments\r\n```\r\n\r\nI am running the following:\r\n\r\n```python\r\n>>> import great_expectations as get\r\n\r\n>>> context = ge.data_context.DataContext()\r\n>>> context.get_available_data_asset_names()\r\n{'domi': {'apartments-generator': {'names': [('listings', 'query')]}}}\r\n\r\n>>> context.build_batch_kwargs(\r\n datasource='domi', generator='apartments-generator', name='listings'\r\n)\r\n```\r\n\r\nWhich raises this error:\r\n\r\n\r\n\r\nThis happens because `_get_raw_query` is doing `tuple('listings')` which returns: `('l', 'i', 's', 't', 'i', 'n', 'g', 's')`)\r\n\r\n```python\r\n def _get_raw_query(self, generator_asset):\r\n return self._store_backend.get(tuple(generator_asset))\r\n```\r\n\r\nI believe this should just be replaced with:\r\n\r\n```python\r\n def _get_raw_query(self, generator_asset):\r\n return self._store_backend.get((generator_asset,))\r\n```\r\n\r\nCurrently I can get around this by providing a tuple instead of a string, i.e.\r\n\r\n```python\r\ncontext.build_batch_kwargs(\r\n datasource='domi',\r\n generator='apartments-generator',\r\n name=('listings', )\r\n)\r\n```\r\n\r\nBut that goes against what the docs are saying [here](https://docs.greatexpectations.io/en/latest/module_docs/data_context_module.html#great_expectations.data_context.BaseDataContext.build_batch_kwargs):\r\n\r\n\r\n\n", "before_files": [{"content": "import os\nimport logging\n\nfrom .batch_kwargs_generator import BatchKwargsGenerator\nfrom 
great_expectations.datasource.types import SqlAlchemyDatasourceQueryBatchKwargs\nfrom great_expectations.exceptions import (\n BatchKwargsError,\n ClassInstantiationError,\n)\nfrom ...data_context.util import instantiate_class_from_config\n\nlogger = logging.getLogger(__name__)\n\ntry:\n import sqlalchemy\n from sqlalchemy import create_engine\n from sqlalchemy.engine import reflection\nexcept ImportError:\n sqlalchemy = None\n create_engine = None\n reflection = None\n logger.debug(\"Unable to import sqlalchemy.\")\n\n\nclass QueryBatchKwargsGenerator(BatchKwargsGenerator):\n \"\"\"Produce query-style batch_kwargs from sql files stored on disk\n \"\"\"\n recognized_batch_parameters = {'query_parameters', 'partition_id'}\n\n def __init__(self, name=\"default\", datasource=None, query_store_backend=None, queries=None):\n super(QueryBatchKwargsGenerator, self).__init__(name=name, datasource=datasource)\n root_directory = None\n if query_store_backend is None:\n # We will choose a Tuple store if there is a configured DataContext with a root_directory,\n # and an InMemoryStore otherwise\n if datasource and datasource.data_context and datasource.data_context.root_directory:\n query_store_backend = {\n \"class_name\": \"TupleFilesystemStoreBackend\",\n \"base_directory\": os.path.join(datasource.data_context.root_directory, \"datasources\",\n datasource.name, \"generators\", name),\n \"filepath_suffix\": \".sql\"\n }\n root_directory = datasource.data_context.root_directory\n else:\n query_store_backend = {\n \"class_name\": \"InMemoryStoreBackend\"\n }\n module_name = 'great_expectations.data_context.store'\n self._store_backend = instantiate_class_from_config(\n config=query_store_backend,\n runtime_environment={\n \"root_directory\": root_directory\n },\n config_defaults={\n \"module_name\": module_name\n }\n )\n if not self._store_backend:\n raise ClassInstantiationError(\n module_name=module_name,\n package_name=None,\n class_name=query_store_backend['class_name']\n )\n if queries is not None:\n for query_name, query in queries.items():\n self.add_query(query_name, query)\n\n def _get_raw_query(self, generator_asset):\n return self._store_backend.get(tuple(generator_asset))\n\n def _get_iterator(self, generator_asset, query_parameters=None):\n raw_query = self._get_raw_query(generator_asset)\n if raw_query is None:\n logger.warning(\"No query defined for generator asset: %s\" % generator_asset)\n # There is no valid query path or temp query storage defined with the generator_asset\n return None\n\n if query_parameters is None:\n iter_ = iter([\n SqlAlchemyDatasourceQueryBatchKwargs(\n query=raw_query\n )])\n else:\n iter_= iter([\n SqlAlchemyDatasourceQueryBatchKwargs(\n query=raw_query,\n query_parameters=query_parameters\n )])\n\n return iter_\n\n def add_query(self, generator_asset, query):\n # Backends must have a tuple key; we use only a single-element tuple\n self._store_backend.set(tuple(generator_asset), query)\n\n def get_available_data_asset_names(self):\n defined_queries = self._store_backend.list_keys()\n # Backends must have a tuple key; we use only a single-element tuple\n return {\"names\": [(query_key_tuple[0], \"query\") for query_key_tuple in defined_queries]}\n\n def _build_batch_kwargs(self, batch_parameters):\n \"\"\"Build batch kwargs from a partition id.\"\"\"\n generator_asset = batch_parameters.pop(\"name\")\n raw_query = self._get_raw_query(generator_asset)\n partition_id = batch_parameters.pop(\"partition_id\", None)\n batch_kwargs = 
self._datasource.process_batch_parameters(**batch_parameters)\n batch_kwargs[\"query\"] = raw_query\n\n if partition_id:\n if not batch_kwargs[\"query_parameters\"]:\n batch_kwargs[\"query_parameters\"] = {}\n batch_kwargs[\"query_parameters\"][\"partition_id\"] = partition_id\n\n return SqlAlchemyDatasourceQueryBatchKwargs(batch_kwargs)\n\n def get_available_partition_ids(self, generator_asset):\n raise BatchKwargsError(\"QueryBatchKwargsGenerator cannot identify partitions.\", {})\n", "path": "great_expectations/datasource/generator/query_generator.py"}], "after_files": [{"content": "import os\nimport logging\n\nfrom .batch_kwargs_generator import BatchKwargsGenerator\nfrom great_expectations.datasource.types import SqlAlchemyDatasourceQueryBatchKwargs\nfrom great_expectations.exceptions import (\n BatchKwargsError,\n ClassInstantiationError,\n)\nfrom ...data_context.util import instantiate_class_from_config\n\nlogger = logging.getLogger(__name__)\n\ntry:\n import sqlalchemy\n from sqlalchemy import create_engine\n from sqlalchemy.engine import reflection\nexcept ImportError:\n sqlalchemy = None\n create_engine = None\n reflection = None\n logger.debug(\"Unable to import sqlalchemy.\")\n\n\nclass QueryBatchKwargsGenerator(BatchKwargsGenerator):\n \"\"\"Produce query-style batch_kwargs from sql files stored on disk\n \"\"\"\n recognized_batch_parameters = {'query_parameters', 'partition_id', 'name'}\n\n def __init__(self, name=\"default\", datasource=None, query_store_backend=None, queries=None):\n super(QueryBatchKwargsGenerator, self).__init__(name=name, datasource=datasource)\n root_directory = None\n if query_store_backend is None:\n # We will choose a Tuple store if there is a configured DataContext with a root_directory,\n # and an InMemoryStore otherwise\n if datasource and datasource.data_context and datasource.data_context.root_directory:\n query_store_backend = {\n \"class_name\": \"TupleFilesystemStoreBackend\",\n \"base_directory\": os.path.join(datasource.data_context.root_directory, \"datasources\",\n datasource.name, \"generators\", name),\n \"filepath_suffix\": \".sql\"\n }\n root_directory = datasource.data_context.root_directory\n else:\n query_store_backend = {\n \"class_name\": \"InMemoryStoreBackend\"\n }\n module_name = 'great_expectations.data_context.store'\n self._store_backend = instantiate_class_from_config(\n config=query_store_backend,\n runtime_environment={\n \"root_directory\": root_directory\n },\n config_defaults={\n \"module_name\": module_name\n }\n )\n if not self._store_backend:\n raise ClassInstantiationError(\n module_name=module_name,\n package_name=None,\n class_name=query_store_backend['class_name']\n )\n if queries is not None:\n for query_name, query in queries.items():\n self.add_query(query_name, query)\n\n def _get_raw_query(self, generator_asset):\n return self._store_backend.get((generator_asset,))\n\n def _get_iterator(self, generator_asset, query_parameters=None):\n raw_query = self._get_raw_query(generator_asset)\n if raw_query is None:\n logger.warning(\"No query defined for generator asset: %s\" % generator_asset)\n # There is no valid query path or temp query storage defined with the generator_asset\n return None\n\n if query_parameters is None:\n iter_ = iter([\n SqlAlchemyDatasourceQueryBatchKwargs(\n query=raw_query\n )])\n else:\n iter_= iter([\n SqlAlchemyDatasourceQueryBatchKwargs(\n query=raw_query,\n query_parameters=query_parameters\n )])\n\n return iter_\n\n def add_query(self, generator_asset, query):\n # Backends must 
have a tuple key; we use only a single-element tuple\n self._store_backend.set((generator_asset,), query)\n\n def get_available_data_asset_names(self):\n defined_queries = self._store_backend.list_keys()\n # Backends must have a tuple key; we use only a single-element tuple\n return {\"names\": [(query_key_tuple[0], \"query\") for query_key_tuple in defined_queries]}\n\n def _build_batch_kwargs(self, batch_parameters):\n \"\"\"Build batch kwargs from a partition id.\"\"\"\n generator_asset = batch_parameters.pop(\"name\")\n raw_query = self._get_raw_query(generator_asset)\n partition_id = batch_parameters.pop(\"partition_id\", None)\n batch_kwargs = self._datasource.process_batch_parameters(**batch_parameters)\n batch_kwargs[\"query\"] = raw_query\n\n if partition_id:\n if not batch_kwargs[\"query_parameters\"]:\n batch_kwargs[\"query_parameters\"] = {}\n batch_kwargs[\"query_parameters\"][\"partition_id\"] = partition_id\n\n return SqlAlchemyDatasourceQueryBatchKwargs(batch_kwargs)\n\n def get_available_partition_ids(self, generator_asset):\n raise BatchKwargsError(\"QueryBatchKwargsGenerator cannot identify partitions.\", {})\n", "path": "great_expectations/datasource/generator/query_generator.py"}]} | 1,992 | 348 |
gh_patches_debug_10612 | rasdani/github-patches | git_diff | fedora-infra__bodhi-1450 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
There are multiple alembic heads on the develop branch
The migrations can't be applied on the develop branch because there are multiple alembic heads:
```
[vagrant@bodhi-dev bodhi]$ alembic upgrade head
/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib64/python2.7/site-packages/zope': missing __init__.py
file, filename, etc = imp.find_module(subname, path)
/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/zope': missing __init__.py
file, filename, etc = imp.find_module(subname, path)
/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/paste': missing __init__.py
file, filename, etc = imp.find_module(subname, path)
/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/moksha': missing __init__.py
file, filename, etc = imp.find_module(subname, path)
/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/repoze': missing __init__.py
file, filename, etc = imp.find_module(subname, path)
/home/vagrant/bodhi/bodhi/server/__init__.py:26: DeprecationWarning: unauthenticated_userid: As of Pyramid 1.5 the "pyramid.security.unauthenticated_userid" API is now deprecated. It will be removed in Pyramd 1.8. Use the "unauthenticated_userid" attribute of the Pyramid request instead.
from pyramid.security import unauthenticated_userid
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
ERROR [alembic.util.messaging] Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>@head' to narrow to a specific head, or 'heads' for all heads
FAILED: Multiple head revisions are present for given argument 'head'; please specify a specific target revision,
'<branchname>@head' to narrow to a specific head, or 'heads' for all heads
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py`
Content:
```
1 """Convert the builds table to be polymorphic.
2
3 Revision ID: 9241378c92ab
4 Revises: fc6b0169c596
5 Create Date: 2017-04-06 20:37:24.766366
6 """
7 from alembic import op
8 import sqlalchemy as sa
9
10
11 # revision identifiers, used by Alembic.
12 revision = '9241378c92ab'
13 down_revision = 'fc6b0169c596'
14
15
16 def upgrade():
17 """Add the type column to the builds table."""
18 # The default of ``1`` is the RPM Build type.
19 op.add_column('builds', sa.Column('type', sa.Integer(), nullable=False, server_default=u'1'))
20 op.alter_column('builds', 'type', server_default=None)
21
22
23 def downgrade():
24 """Remove the type column from the builds table."""
25 op.drop_column('builds', 'type')
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py b/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py
--- a/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py
+++ b/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py
@@ -1,7 +1,7 @@
"""Convert the builds table to be polymorphic.
Revision ID: 9241378c92ab
-Revises: fc6b0169c596
+Revises: 12d3e8695f90
Create Date: 2017-04-06 20:37:24.766366
"""
from alembic import op
@@ -10,7 +10,7 @@
# revision identifiers, used by Alembic.
revision = '9241378c92ab'
-down_revision = 'fc6b0169c596'
+down_revision = '12d3e8695f90'
def upgrade():
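An Alembic head is any revision that no other revision references as its `down_revision`, so two branches hanging off a common ancestor produce two heads. Below is a standalone sketch of that rule, reusing the revision ids visible in the patch above; the parent link of `12d3e8695f90` is simplified for illustration and is an assumption, not taken from the repository.

```python
# A head is a revision that nothing else lists as its down_revision.
def heads(revisions: dict) -> set:
    referenced = {down for down in revisions.values() if down is not None}
    return set(revisions) - referenced

# Before the fix: both branches claim fc6b0169c596 as their parent -> two heads.
before = {
    "fc6b0169c596": None,            # ancestry above this point omitted
    "12d3e8695f90": "fc6b0169c596",  # assumed parent link, for illustration only
    "9241378c92ab": "fc6b0169c596",
}
print(heads(before))  # two heads -> "Multiple head revisions" error

# After repointing 9241378c92ab onto 12d3e8695f90, a single head remains.
after = dict(before, **{"9241378c92ab": "12d3e8695f90"})
print(heads(after))   # {'9241378c92ab'}
```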
| {"golden_diff": "diff --git a/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py b/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py\n--- a/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py\n+++ b/alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py\n@@ -1,7 +1,7 @@\n \"\"\"Convert the builds table to be polymorphic.\n \n Revision ID: 9241378c92ab\n-Revises: fc6b0169c596\n+Revises: 12d3e8695f90\n Create Date: 2017-04-06 20:37:24.766366\n \"\"\"\n from alembic import op\n@@ -10,7 +10,7 @@\n \n # revision identifiers, used by Alembic.\n revision = '9241378c92ab'\n-down_revision = 'fc6b0169c596'\n+down_revision = '12d3e8695f90'\n \n \n def upgrade():\n", "issue": "There are multiple alembic heads on the develop branch\nThe migrations can't be applied on the develop branch because there are multiple alembic heads:\r\n\r\n```\r\n[vagrant@bodhi-dev bodhi]$ alembic upgrade head\r\n/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib64/python2.7/site-packages/zope': missing __init__.py\r\n file, filename, etc = imp.find_module(subname, path)\r\n/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/zope': missing __init__.py\r\n file, filename, etc = imp.find_module(subname, path)\r\n/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/paste': missing __init__.py\r\n file, filename, etc = imp.find_module(subname, path)\r\n/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/moksha': missing __init__.py\r\n file, filename, etc = imp.find_module(subname, path)\r\n/usr/lib64/python2.7/pkgutil.py:186: ImportWarning: Not importing directory '/usr/lib/python2.7/site-packages/repoze': missing __init__.py\r\n file, filename, etc = imp.find_module(subname, path)\r\n/home/vagrant/bodhi/bodhi/server/__init__.py:26: DeprecationWarning: unauthenticated_userid: As of Pyramid 1.5 the \"pyramid.security.unauthenticated_userid\" API is now deprecated. It will be removed in Pyramd 1.8. 
Use the \"unauthenticated_userid\" attribute of the Pyramid request instead.\r\n from pyramid.security import unauthenticated_userid\r\nINFO [alembic.runtime.migration] Context impl PostgresqlImpl.\r\nINFO [alembic.runtime.migration] Will assume transactional DDL.\r\nERROR [alembic.util.messaging] Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>@head' to narrow to a specific head, or 'heads' for all heads\r\n FAILED: Multiple head revisions are present for given argument 'head'; please specify a specific target revision,\r\n '<branchname>@head' to narrow to a specific head, or 'heads' for all heads\r\n```\n", "before_files": [{"content": "\"\"\"Convert the builds table to be polymorphic.\n\nRevision ID: 9241378c92ab\nRevises: fc6b0169c596\nCreate Date: 2017-04-06 20:37:24.766366\n\"\"\"\nfrom alembic import op\nimport sqlalchemy as sa\n\n\n# revision identifiers, used by Alembic.\nrevision = '9241378c92ab'\ndown_revision = 'fc6b0169c596'\n\n\ndef upgrade():\n \"\"\"Add the type column to the builds table.\"\"\"\n # The default of ``1`` is the RPM Build type.\n op.add_column('builds', sa.Column('type', sa.Integer(), nullable=False, server_default=u'1'))\n op.alter_column('builds', 'type', server_default=None)\n\n\ndef downgrade():\n \"\"\"Remove the type column from the builds table.\"\"\"\n op.drop_column('builds', 'type')\n", "path": "alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py"}], "after_files": [{"content": "\"\"\"Convert the builds table to be polymorphic.\n\nRevision ID: 9241378c92ab\nRevises: 12d3e8695f90\nCreate Date: 2017-04-06 20:37:24.766366\n\"\"\"\nfrom alembic import op\nimport sqlalchemy as sa\n\n\n# revision identifiers, used by Alembic.\nrevision = '9241378c92ab'\ndown_revision = '12d3e8695f90'\n\n\ndef upgrade():\n \"\"\"Add the type column to the builds table.\"\"\"\n # The default of ``1`` is the RPM Build type.\n op.add_column('builds', sa.Column('type', sa.Integer(), nullable=False, server_default=u'1'))\n op.alter_column('builds', 'type', server_default=None)\n\n\ndef downgrade():\n \"\"\"Remove the type column from the builds table.\"\"\"\n op.drop_column('builds', 'type')\n", "path": "alembic/versions/9241378c92ab_convert_the_builds_table_to_be_.py"}]} | 1,090 | 307 |
gh_patches_debug_57587 | rasdani/github-patches | git_diff | joke2k__faker-262 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
en_US SSN provider generates invalid SSNs
There's a few limitations on United States' SSNs that prevent it from being completely random.
- No group can be all 0s
- The SSN cannot start with 666
- The SSN cannot start with a number >= 900
See http://www.ssa.gov/employer/randomization.html
Could you modify the SSN generator to avoid these issues?
--- END ISSUE ---
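The three constraints translate directly into code. The sketch below is a minimal, library-independent illustration built on the standard `random` module with rejection of the 666 area; it shows the rules themselves, not faker's actual implementation.

```python
import random

def random_valid_ssn() -> str:
    """Illustrative only: draw until the area/group/serial constraints are met."""
    while True:
        area = random.randint(1, 899)     # excludes 000 and 900-999
        group = random.randint(1, 99)     # excludes 00
        serial = random.randint(1, 9999)  # excludes 0000
        if area != 666:                   # 666 is never issued
            return f"{area:03d}-{group:02d}-{serial:04d}"

print(random_valid_ssn())  # e.g. '123-45-6789'
```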
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/ssn/en_US/__init__.py`
Content:
```
1 # coding=utf-8
2 from __future__ import unicode_literals
3 from .. import Provider as BaseProvider
4
5
6 class Provider(BaseProvider):
7 pass
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/faker/providers/ssn/en_US/__init__.py b/faker/providers/ssn/en_US/__init__.py
--- a/faker/providers/ssn/en_US/__init__.py
+++ b/faker/providers/ssn/en_US/__init__.py
@@ -4,4 +4,17 @@
class Provider(BaseProvider):
- pass
+
+ @classmethod
+ def ssn(cls):
+ # Certain numbers are invalid for U.S. SSNs. The area (first 3 digits)
+ # cannot be 666 or 900-999. The group number (middle digits) cannot be
+ # 00. The serial (last 4 digits) cannot be 0000
+ area = BaseProvider.random_int(min=1, max=899)
+ if area == 666:
+ area += 1
+ group = BaseProvider.random_int(1, 99)
+ serial = BaseProvider.random_int(1, 9999)
+
+ ssn = "{0:03d}-{1:02d}-{2:04d}".format(area, group, serial)
+ return ssn
| {"golden_diff": "diff --git a/faker/providers/ssn/en_US/__init__.py b/faker/providers/ssn/en_US/__init__.py\n--- a/faker/providers/ssn/en_US/__init__.py\n+++ b/faker/providers/ssn/en_US/__init__.py\n@@ -4,4 +4,17 @@\n \n \n class Provider(BaseProvider):\n- pass\n+\n+ @classmethod\n+ def ssn(cls):\n+ # Certain numbers are invalid for U.S. SSNs. The area (first 3 digits)\n+ # cannot be 666 or 900-999. The group number (middle digits) cannot be\n+ # 00. The serial (last 4 digits) cannot be 0000\n+ area = BaseProvider.random_int(min=1, max=899)\n+ if area == 666:\n+ area += 1\n+ group = BaseProvider.random_int(1, 99)\n+ serial = BaseProvider.random_int(1, 9999)\n+\n+ ssn = \"{0:03d}-{1:02d}-{2:04d}\".format(area, group, serial)\n+ return ssn\n", "issue": "en_US SSN provider generates invalid SSNs\nThere's a few limitations on United States' SSNs that prevent it from being completely random.\n- No group can be all 0s\n- The SSN cannot start with 666\n- The SSN cannot start with a number >= 900\n\nSee http://www.ssa.gov/employer/randomization.html\n\nCould you modify the SSN generator to avoid these issues?\n\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom .. import Provider as BaseProvider\n\n\nclass Provider(BaseProvider):\n pass\n", "path": "faker/providers/ssn/en_US/__init__.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom .. import Provider as BaseProvider\n\n\nclass Provider(BaseProvider):\n\n @classmethod\n def ssn(cls):\n # Certain numbers are invalid for U.S. SSNs. The area (first 3 digits)\n # cannot be 666 or 900-999. The group number (middle digits) cannot be\n # 00. The serial (last 4 digits) cannot be 0000\n area = BaseProvider.random_int(min=1, max=899)\n if area == 666:\n area += 1\n group = BaseProvider.random_int(1, 99)\n serial = BaseProvider.random_int(1, 9999)\n\n ssn = \"{0:03d}-{1:02d}-{2:04d}\".format(area, group, serial)\n return ssn\n", "path": "faker/providers/ssn/en_US/__init__.py"}]} | 395 | 269 |
gh_patches_debug_57082 | rasdani/github-patches | git_diff | SeldonIO__MLServer-1171 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add OS constraint in PyPI
Mention MLServer's OS constraints as metadata in `setup.py`, so that they become visible on pypi.org. 
```
setup(...,
classifiers=[
'Operating System :: POSIX',
],
)
```
_Originally posted by @HugoMVale in https://github.com/SeldonIO/MLServer/issues/1022#issuecomment-1456788132_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import os
2
3 from typing import Dict
4 from setuptools import setup, find_packages
5
6 ROOT_PATH = os.path.dirname(__file__)
7 PKG_NAME = "mlserver"
8 PKG_PATH = os.path.join(ROOT_PATH, PKG_NAME)
9
10
11 def _load_version() -> str:
12 version = ""
13 version_path = os.path.join(PKG_PATH, "version.py")
14 with open(version_path) as fp:
15 version_module: Dict[str, str] = {}
16 exec(fp.read(), version_module)
17 version = version_module["__version__"]
18
19 return version
20
21
22 def _load_description() -> str:
23 readme_path = os.path.join(ROOT_PATH, "README.md")
24 with open(readme_path) as fp:
25 return fp.read()
26
27
28 env_marker_cpython = (
29 "sys_platform != 'win32'"
30 " and (sys_platform != 'cygwin'"
31 " and platform_python_implementation != 'PyPy')"
32 )
33
34 setup(
35 name=PKG_NAME,
36 version=_load_version(),
37 url="https://github.com/SeldonIO/MLServer.git",
38 author="Seldon Technologies Ltd.",
39 author_email="[email protected]",
40 description="ML server",
41 include_package_data=True,
42 packages=find_packages(exclude=["tests", "tests.*"]),
43 install_requires=[
44 "click",
45 # 0.89.0: https://github.com/tiangolo/fastapi/issues/5861
46 "fastapi >=0.88.0, <=0.89.1, !=0.89.0",
47 "python-dotenv",
48 "grpcio",
49 # The importlib-resources backport is required to use some
50 # functionality added in Python 3.10
51 # https://setuptools.pypa.io/en/latest/userguide/datafiles.html#accessing-data-files-at-runtime
52 "importlib-resources",
53 "numpy",
54 "pandas",
55 "protobuf",
56 "uvicorn",
57 "starlette_exporter",
58 "py-grpc-prometheus",
59 "uvloop;" + env_marker_cpython,
60 "aiokafka",
61 "tritonclient[http]>=2.24",
62 "aiofiles",
63 "orjson",
64 ],
65 entry_points={"console_scripts": ["mlserver=mlserver.cli:main"]},
66 long_description=_load_description(),
67 long_description_content_type="text/markdown",
68 license="Apache 2.0",
69 )
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -37,6 +37,7 @@
url="https://github.com/SeldonIO/MLServer.git",
author="Seldon Technologies Ltd.",
author_email="[email protected]",
+ classifiers=["Operating System :: POSIX", "Operating System :: MacOS"],
description="ML server",
include_package_data=True,
packages=find_packages(exclude=["tests", "tests.*"]),
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -37,6 +37,7 @@\n url=\"https://github.com/SeldonIO/MLServer.git\",\n author=\"Seldon Technologies Ltd.\",\n author_email=\"[email protected]\",\n+ classifiers=[\"Operating System :: POSIX\", \"Operating System :: MacOS\"],\n description=\"ML server\",\n include_package_data=True,\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n", "issue": "Add OS constraint in PyPI\nMention MLServer's OS constraints as metadata in `setup.py`, so that it becomes visible in pypi.org. \r\n\r\n```\r\nsetup(...,\r\n classifiers=[\r\n 'Operating System :: POSIX',\r\n ],\r\n )\r\n```\r\n\r\n_Originally posted by @HugoMVale in https://github.com/SeldonIO/MLServer/issues/1022#issuecomment-1456788132_\r\n \n", "before_files": [{"content": "import os\n\nfrom typing import Dict\nfrom setuptools import setup, find_packages\n\nROOT_PATH = os.path.dirname(__file__)\nPKG_NAME = \"mlserver\"\nPKG_PATH = os.path.join(ROOT_PATH, PKG_NAME)\n\n\ndef _load_version() -> str:\n version = \"\"\n version_path = os.path.join(PKG_PATH, \"version.py\")\n with open(version_path) as fp:\n version_module: Dict[str, str] = {}\n exec(fp.read(), version_module)\n version = version_module[\"__version__\"]\n\n return version\n\n\ndef _load_description() -> str:\n readme_path = os.path.join(ROOT_PATH, \"README.md\")\n with open(readme_path) as fp:\n return fp.read()\n\n\nenv_marker_cpython = (\n \"sys_platform != 'win32'\"\n \" and (sys_platform != 'cygwin'\"\n \" and platform_python_implementation != 'PyPy')\"\n)\n\nsetup(\n name=PKG_NAME,\n version=_load_version(),\n url=\"https://github.com/SeldonIO/MLServer.git\",\n author=\"Seldon Technologies Ltd.\",\n author_email=\"[email protected]\",\n description=\"ML server\",\n include_package_data=True,\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n install_requires=[\n \"click\",\n # 0.89.0: https://github.com/tiangolo/fastapi/issues/5861\n \"fastapi >=0.88.0, <=0.89.1, !=0.89.0\",\n \"python-dotenv\",\n \"grpcio\",\n # The importlib-resources backport is required to use some\n # functionality added in Python 3.10\n # https://setuptools.pypa.io/en/latest/userguide/datafiles.html#accessing-data-files-at-runtime\n \"importlib-resources\",\n \"numpy\",\n \"pandas\",\n \"protobuf\",\n \"uvicorn\",\n \"starlette_exporter\",\n \"py-grpc-prometheus\",\n \"uvloop;\" + env_marker_cpython,\n \"aiokafka\",\n \"tritonclient[http]>=2.24\",\n \"aiofiles\",\n \"orjson\",\n ],\n entry_points={\"console_scripts\": [\"mlserver=mlserver.cli:main\"]},\n long_description=_load_description(),\n long_description_content_type=\"text/markdown\",\n license=\"Apache 2.0\",\n)\n", "path": "setup.py"}], "after_files": [{"content": "import os\n\nfrom typing import Dict\nfrom setuptools import setup, find_packages\n\nROOT_PATH = os.path.dirname(__file__)\nPKG_NAME = \"mlserver\"\nPKG_PATH = os.path.join(ROOT_PATH, PKG_NAME)\n\n\ndef _load_version() -> str:\n version = \"\"\n version_path = os.path.join(PKG_PATH, \"version.py\")\n with open(version_path) as fp:\n version_module: Dict[str, str] = {}\n exec(fp.read(), version_module)\n version = version_module[\"__version__\"]\n\n return version\n\n\ndef _load_description() -> str:\n readme_path = os.path.join(ROOT_PATH, \"README.md\")\n with open(readme_path) as fp:\n return fp.read()\n\n\nenv_marker_cpython = (\n \"sys_platform != 'win32'\"\n \" and (sys_platform != 'cygwin'\"\n \" and platform_python_implementation != 'PyPy')\"\n)\n\nsetup(\n name=PKG_NAME,\n 
version=_load_version(),\n url=\"https://github.com/SeldonIO/MLServer.git\",\n author=\"Seldon Technologies Ltd.\",\n author_email=\"[email protected]\",\n classifiers=[\"Operating System :: POSIX\", \"Operating System :: MacOS\"],\n description=\"ML server\",\n include_package_data=True,\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n install_requires=[\n \"click\",\n # 0.89.0: https://github.com/tiangolo/fastapi/issues/5861\n \"fastapi >=0.88.0, <=0.89.1, !=0.89.0\",\n \"python-dotenv\",\n \"grpcio\",\n # The importlib-resources backport is required to use some\n # functionality added in Python 3.10\n # https://setuptools.pypa.io/en/latest/userguide/datafiles.html#accessing-data-files-at-runtime\n \"importlib-resources\",\n \"numpy\",\n \"pandas\",\n \"protobuf\",\n \"uvicorn\",\n \"starlette_exporter\",\n \"py-grpc-prometheus\",\n \"uvloop;\" + env_marker_cpython,\n \"aiokafka\",\n \"tritonclient[http]>=2.24\",\n \"aiofiles\",\n \"orjson\",\n ],\n entry_points={\"console_scripts\": [\"mlserver=mlserver.cli:main\"]},\n long_description=_load_description(),\n long_description_content_type=\"text/markdown\",\n license=\"Apache 2.0\",\n)\n", "path": "setup.py"}]} | 1,009 | 104 |
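Classifiers declared in `setup.py` end up in the distribution's core metadata, where they can be read back at runtime. The snippet below assumes the package of interest (here `mlserver`) is already installed in the current environment.

```python
# Read Trove classifiers back from an installed package's metadata.
from importlib.metadata import metadata

md = metadata("mlserver")  # raises PackageNotFoundError if the package is not installed
for classifier in md.get_all("Classifier") or []:
    print(classifier)
# With the patch above applied, the output would include:
#   Operating System :: POSIX
#   Operating System :: MacOS
```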
gh_patches_debug_41531 | rasdani/github-patches | git_diff | deepset-ai__haystack-7247 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Docstrings - `haystack.components.caching`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `haystack/components/caching/cache_checker.py`
Content:
```
1 from typing import List, Dict, Any
2
3 import importlib
4
5 import logging
6
7 from haystack import component, Document, default_from_dict, default_to_dict, DeserializationError
8 from haystack.document_stores.types import DocumentStore
9
10
11 logger = logging.getLogger(__name__)
12
13
14 @component
15 class CacheChecker:
16 """
17 CacheChecker is a component that checks for the presence of documents in a Document Store based on a specified
18 cache field.
19 """
20
21 def __init__(self, document_store: DocumentStore, cache_field: str):
22 """
23 Create a UrlCacheChecker component.
24 """
25 self.document_store = document_store
26 self.cache_field = cache_field
27
28 def to_dict(self) -> Dict[str, Any]:
29 """
30 Serialize this component to a dictionary.
31 """
32 return default_to_dict(self, document_store=self.document_store.to_dict(), cache_field=self.cache_field)
33
34 @classmethod
35 def from_dict(cls, data: Dict[str, Any]) -> "CacheChecker":
36 """
37 Deserialize this component from a dictionary.
38 """
39 init_params = data.get("init_parameters", {})
40 if "document_store" not in init_params:
41 raise DeserializationError("Missing 'document_store' in serialization data")
42 if "type" not in init_params["document_store"]:
43 raise DeserializationError("Missing 'type' in document store's serialization data")
44
45 try:
46 module_name, type_ = init_params["document_store"]["type"].rsplit(".", 1)
47 logger.debug("Trying to import %s", module_name)
48 module = importlib.import_module(module_name)
49 except (ImportError, DeserializationError) as e:
50 raise DeserializationError(
51 f"DocumentStore of type '{init_params['document_store']['type']}' not correctly imported"
52 ) from e
53
54 docstore_class = getattr(module, type_)
55 docstore = docstore_class.from_dict(init_params["document_store"])
56
57 data["init_parameters"]["document_store"] = docstore
58 return default_from_dict(cls, data)
59
60 @component.output_types(hits=List[Document], misses=List)
61 def run(self, items: List[Any]):
62 """
63 Checks if any document associated with the specified field is already present in the store. If matching documents
64 are found, they are returned as hits. If not, the items are returned as misses, indicating they are not in the cache.
65
66 :param items: A list of values associated with the cache_field to be checked against the cache.
67 :return: A dictionary with two keys: "hits" and "misses". The values are lists of documents that were found in
68 the cache and items that were not, respectively.
69 """
70 found_documents = []
71 misses = []
72
73 for item in items:
74 filters = {self.cache_field: item}
75 found = self.document_store.filter_documents(filters=filters)
76 if found:
77 found_documents.extend(found)
78 else:
79 misses.append(item)
80 return {"hits": found_documents, "misses": misses}
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/haystack/components/caching/cache_checker.py b/haystack/components/caching/cache_checker.py
--- a/haystack/components/caching/cache_checker.py
+++ b/haystack/components/caching/cache_checker.py
@@ -14,27 +14,63 @@
@component
class CacheChecker:
"""
- CacheChecker is a component that checks for the presence of documents in a Document Store based on a specified
- cache field.
+ Checks for the presence of documents in a Document Store based on a specified
+ field in each document's metadata.
+
+ If matching documents are found, they are returned as hits. If not, the items
+ are returned as misses, indicating they are not in the cache.
+
+ Usage example:
+ ```python
+ from haystack import Document
+ from haystack.document_stores.in_memory import InMemoryDocumentStore
+ from haystack.components.caching.cache_checker import CacheChecker
+
+ docstore = InMemoryDocumentStore()
+ documents = [
+ Document(content="doc1", meta={"url": "https://example.com/1"}),
+ Document(content="doc2", meta={"url": "https://example.com/2"}),
+ Document(content="doc3", meta={"url": "https://example.com/1"}),
+ Document(content="doc4", meta={"url": "https://example.com/2"}),
+ ]
+ docstore.write_documents(documents)
+ checker = CacheChecker(docstore, cache_field="url")
+ results = checker.run(items=["https://example.com/1", "https://example.com/5"])
+ assert results == {"hits": [documents[0], documents[2]], "misses": ["https://example.com/5"]}
+ ```
"""
def __init__(self, document_store: DocumentStore, cache_field: str):
"""
- Create a UrlCacheChecker component.
+ Create a CacheChecker component.
+
+ :param document_store:
+ Document store to check.
+ :param cache_field:
+ Name of the Document metadata field
+ to check for cache hits.
"""
self.document_store = document_store
self.cache_field = cache_field
def to_dict(self) -> Dict[str, Any]:
"""
- Serialize this component to a dictionary.
+ Serializes the component to a dictionary.
+
+ :returns:
+ Dictionary with serialized data.
"""
return default_to_dict(self, document_store=self.document_store.to_dict(), cache_field=self.cache_field)
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "CacheChecker":
"""
- Deserialize this component from a dictionary.
+ Deserializes the component from a dictionary.
+
+ :param data:
+ Dictionary to deserialize from.
+ :returns:
+ Deserialized component.
"""
init_params = data.get("init_parameters", {})
if "document_store" not in init_params:
@@ -60,12 +96,15 @@
@component.output_types(hits=List[Document], misses=List)
def run(self, items: List[Any]):
"""
- Checks if any document associated with the specified field is already present in the store. If matching documents
- are found, they are returned as hits. If not, the items are returned as misses, indicating they are not in the cache.
-
- :param items: A list of values associated with the cache_field to be checked against the cache.
- :return: A dictionary with two keys: "hits" and "misses". The values are lists of documents that were found in
- the cache and items that were not, respectively.
+ Checks if any document associated with the specified cache field
+ is already present in the store.
+
+ :param items:
+ Values to be checked against the cache field.
+ :return:
+ A dictionary with two keys:
+ - `hits` - Documents that matched with any of the items.
+ - `misses` - Items that were not present in any documents.
"""
found_documents = []
misses = []
| {"golden_diff": "diff --git a/haystack/components/caching/cache_checker.py b/haystack/components/caching/cache_checker.py\n--- a/haystack/components/caching/cache_checker.py\n+++ b/haystack/components/caching/cache_checker.py\n@@ -14,27 +14,63 @@\n @component\n class CacheChecker:\n \"\"\"\n- CacheChecker is a component that checks for the presence of documents in a Document Store based on a specified\n- cache field.\n+ Checks for the presence of documents in a Document Store based on a specified\n+ field in each document's metadata.\n+\n+ If matching documents are found, they are returned as hits. If not, the items\n+ are returned as misses, indicating they are not in the cache.\n+\n+ Usage example:\n+ ```python\n+ from haystack import Document\n+ from haystack.document_stores.in_memory import InMemoryDocumentStore\n+ from haystack.components.caching.cache_checker import CacheChecker\n+\n+ docstore = InMemoryDocumentStore()\n+ documents = [\n+ Document(content=\"doc1\", meta={\"url\": \"https://example.com/1\"}),\n+ Document(content=\"doc2\", meta={\"url\": \"https://example.com/2\"}),\n+ Document(content=\"doc3\", meta={\"url\": \"https://example.com/1\"}),\n+ Document(content=\"doc4\", meta={\"url\": \"https://example.com/2\"}),\n+ ]\n+ docstore.write_documents(documents)\n+ checker = CacheChecker(docstore, cache_field=\"url\")\n+ results = checker.run(items=[\"https://example.com/1\", \"https://example.com/5\"])\n+ assert results == {\"hits\": [documents[0], documents[2]], \"misses\": [\"https://example.com/5\"]}\n+ ```\n \"\"\"\n \n def __init__(self, document_store: DocumentStore, cache_field: str):\n \"\"\"\n- Create a UrlCacheChecker component.\n+ Create a CacheChecker component.\n+\n+ :param document_store:\n+ Document store to check.\n+ :param cache_field:\n+ Name of the Document metadata field\n+ to check for cache hits.\n \"\"\"\n self.document_store = document_store\n self.cache_field = cache_field\n \n def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n- Serialize this component to a dictionary.\n+ Serializes the component to a dictionary.\n+\n+ :returns:\n+ Dictionary with serialized data.\n \"\"\"\n return default_to_dict(self, document_store=self.document_store.to_dict(), cache_field=self.cache_field)\n \n @classmethod\n def from_dict(cls, data: Dict[str, Any]) -> \"CacheChecker\":\n \"\"\"\n- Deserialize this component from a dictionary.\n+ Deserializes the component from a dictionary.\n+\n+ :param data:\n+ Dictionary to deserialize from.\n+ :returns:\n+ Deserialized component.\n \"\"\"\n init_params = data.get(\"init_parameters\", {})\n if \"document_store\" not in init_params:\n@@ -60,12 +96,15 @@\n @component.output_types(hits=List[Document], misses=List)\n def run(self, items: List[Any]):\n \"\"\"\n- Checks if any document associated with the specified field is already present in the store. If matching documents\n- are found, they are returned as hits. If not, the items are returned as misses, indicating they are not in the cache.\n-\n- :param items: A list of values associated with the cache_field to be checked against the cache.\n- :return: A dictionary with two keys: \"hits\" and \"misses\". 
The values are lists of documents that were found in\n- the cache and items that were not, respectively.\n+ Checks if any document associated with the specified cache field\n+ is already present in the store.\n+\n+ :param items:\n+ Values to be checked against the cache field.\n+ :return:\n+ A dictionary with two keys:\n+ - `hits` - Documents that matched with any of the items.\n+ - `misses` - Items that were not present in any documents.\n \"\"\"\n found_documents = []\n misses = []\n", "issue": "Docstrings - `haystack.components.caching`\n\n", "before_files": [{"content": "from typing import List, Dict, Any\n\nimport importlib\n\nimport logging\n\nfrom haystack import component, Document, default_from_dict, default_to_dict, DeserializationError\nfrom haystack.document_stores.types import DocumentStore\n\n\nlogger = logging.getLogger(__name__)\n\n\n@component\nclass CacheChecker:\n \"\"\"\n CacheChecker is a component that checks for the presence of documents in a Document Store based on a specified\n cache field.\n \"\"\"\n\n def __init__(self, document_store: DocumentStore, cache_field: str):\n \"\"\"\n Create a UrlCacheChecker component.\n \"\"\"\n self.document_store = document_store\n self.cache_field = cache_field\n\n def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n \"\"\"\n return default_to_dict(self, document_store=self.document_store.to_dict(), cache_field=self.cache_field)\n\n @classmethod\n def from_dict(cls, data: Dict[str, Any]) -> \"CacheChecker\":\n \"\"\"\n Deserialize this component from a dictionary.\n \"\"\"\n init_params = data.get(\"init_parameters\", {})\n if \"document_store\" not in init_params:\n raise DeserializationError(\"Missing 'document_store' in serialization data\")\n if \"type\" not in init_params[\"document_store\"]:\n raise DeserializationError(\"Missing 'type' in document store's serialization data\")\n\n try:\n module_name, type_ = init_params[\"document_store\"][\"type\"].rsplit(\".\", 1)\n logger.debug(\"Trying to import %s\", module_name)\n module = importlib.import_module(module_name)\n except (ImportError, DeserializationError) as e:\n raise DeserializationError(\n f\"DocumentStore of type '{init_params['document_store']['type']}' not correctly imported\"\n ) from e\n\n docstore_class = getattr(module, type_)\n docstore = docstore_class.from_dict(init_params[\"document_store\"])\n\n data[\"init_parameters\"][\"document_store\"] = docstore\n return default_from_dict(cls, data)\n\n @component.output_types(hits=List[Document], misses=List)\n def run(self, items: List[Any]):\n \"\"\"\n Checks if any document associated with the specified field is already present in the store. If matching documents\n are found, they are returned as hits. If not, the items are returned as misses, indicating they are not in the cache.\n\n :param items: A list of values associated with the cache_field to be checked against the cache.\n :return: A dictionary with two keys: \"hits\" and \"misses\". 
The values are lists of documents that were found in\n the cache and items that were not, respectively.\n \"\"\"\n found_documents = []\n misses = []\n\n for item in items:\n filters = {self.cache_field: item}\n found = self.document_store.filter_documents(filters=filters)\n if found:\n found_documents.extend(found)\n else:\n misses.append(item)\n return {\"hits\": found_documents, \"misses\": misses}\n", "path": "haystack/components/caching/cache_checker.py"}], "after_files": [{"content": "from typing import List, Dict, Any\n\nimport importlib\n\nimport logging\n\nfrom haystack import component, Document, default_from_dict, default_to_dict, DeserializationError\nfrom haystack.document_stores.types import DocumentStore\n\n\nlogger = logging.getLogger(__name__)\n\n\n@component\nclass CacheChecker:\n \"\"\"\n Checks for the presence of documents in a Document Store based on a specified\n field in each document's metadata.\n\n If matching documents are found, they are returned as hits. If not, the items\n are returned as misses, indicating they are not in the cache.\n\n Usage example:\n ```python\n from haystack import Document\n from haystack.document_stores.in_memory import InMemoryDocumentStore\n from haystack.components.caching.cache_checker import CacheChecker\n\n docstore = InMemoryDocumentStore()\n documents = [\n Document(content=\"doc1\", meta={\"url\": \"https://example.com/1\"}),\n Document(content=\"doc2\", meta={\"url\": \"https://example.com/2\"}),\n Document(content=\"doc3\", meta={\"url\": \"https://example.com/1\"}),\n Document(content=\"doc4\", meta={\"url\": \"https://example.com/2\"}),\n ]\n docstore.write_documents(documents)\n checker = CacheChecker(docstore, cache_field=\"url\")\n results = checker.run(items=[\"https://example.com/1\", \"https://example.com/5\"])\n assert results == {\"hits\": [documents[0], documents[2]], \"misses\": [\"https://example.com/5\"]}\n ```\n \"\"\"\n\n def __init__(self, document_store: DocumentStore, cache_field: str):\n \"\"\"\n Create a CacheChecker component.\n\n :param document_store:\n Document store to check.\n :param cache_field:\n Name of the Document metadata field\n to check for cache hits.\n \"\"\"\n self.document_store = document_store\n self.cache_field = cache_field\n\n def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n return default_to_dict(self, document_store=self.document_store.to_dict(), cache_field=self.cache_field)\n\n @classmethod\n def from_dict(cls, data: Dict[str, Any]) -> \"CacheChecker\":\n \"\"\"\n Deserializes the component from a dictionary.\n\n :param data:\n Dictionary to deserialize from.\n :returns:\n Deserialized component.\n \"\"\"\n init_params = data.get(\"init_parameters\", {})\n if \"document_store\" not in init_params:\n raise DeserializationError(\"Missing 'document_store' in serialization data\")\n if \"type\" not in init_params[\"document_store\"]:\n raise DeserializationError(\"Missing 'type' in document store's serialization data\")\n\n try:\n module_name, type_ = init_params[\"document_store\"][\"type\"].rsplit(\".\", 1)\n logger.debug(\"Trying to import %s\", module_name)\n module = importlib.import_module(module_name)\n except (ImportError, DeserializationError) as e:\n raise DeserializationError(\n f\"DocumentStore of type '{init_params['document_store']['type']}' not correctly imported\"\n ) from e\n\n docstore_class = getattr(module, type_)\n docstore = 
docstore_class.from_dict(init_params[\"document_store\"])\n\n data[\"init_parameters\"][\"document_store\"] = docstore\n return default_from_dict(cls, data)\n\n @component.output_types(hits=List[Document], misses=List)\n def run(self, items: List[Any]):\n \"\"\"\n Checks if any document associated with the specified cache field\n is already present in the store.\n\n :param items:\n Values to be checked against the cache field.\n :return:\n A dictionary with two keys:\n - `hits` - Documents that matched with any of the items.\n - `misses` - Items that were not present in any documents.\n \"\"\"\n found_documents = []\n misses = []\n\n for item in items:\n filters = {self.cache_field: item}\n found = self.document_store.filter_documents(filters=filters)\n if found:\n found_documents.extend(found)\n else:\n misses.append(item)\n return {\"hits\": found_documents, \"misses\": misses}\n", "path": "haystack/components/caching/cache_checker.py"}]} | 1,070 | 899 |
gh_patches_debug_19709 | rasdani/github-patches | git_diff | fossasia__open-event-server-5615 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Unable to login to the Admin Panel.
**Description:**
When we try to login to admin panel, even if the credentials belong to super-admin, it returns "Credential incorrect"
**Steps to reproduce the behavior:**
1. Go to 127.0.0.1:5000/admin
2. Enter the admin credentials
3. Click on login button
4. See error "Credential incorrect"
**Expected Behaviour:**
It should login the user to the admin panel if credentials are correct and the user is an admin.
**Screenshots**

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/views/__init__.py`
Content:
```
1 import flask_login as login
2 import requests
3 from flask import url_for, redirect, Blueprint, request, make_response
4 from flask_admin import Admin, AdminIndexView, expose, helpers as admin_helpers
5 from flask_admin.contrib.sqla import ModelView
6 from flask_scrypt import generate_password_hash
7 from wtforms import form, fields, validators
8
9 from app.models import db
10 from app.models.user import User
11
12
13 class AdminModelView(ModelView):
14 def is_accessible(self):
15 return login.current_user.is_authenticated
16
17 def inaccessible_callback(self, name, **kwargs):
18 # redirect to login page if user doesn't have access
19 return redirect(url_for('admin.index', next=request.url))
20
21
22 class LoginForm(form.Form):
23 login = fields.TextField(validators=[validators.required(), validators.email()], render_kw={"placeholder": "[email protected]"})
24 password = fields.PasswordField(validators=[validators.required()], render_kw={"placeholder": "xyzzy"})
25
26 def validate_login(self, field):
27 """
28 validate login
29 :param field:
30 :return:
31 """
32 user = self.get_user()
33
34 if user is None:
35 raise validators.ValidationError('User does not exist.')
36
37 if user.password != generate_password_hash(self.password.data, user.salt):
38 raise validators.ValidationError('Credentials incorrect.')
39
40 if not user.is_admin and not user.is_super_admin:
41 raise validators.ValidationError('Access Forbidden. Admin Rights Required')
42
43 def get_user(self):
44 return User.query.filter_by(email=self.login.data).first()
45
46
47 class MyAdminIndexView(AdminIndexView):
48 @expose('/')
49 def index(self):
50 """
51 /admin
52 :return:
53 """
54 if not login.current_user.is_authenticated:
55 return redirect(url_for('.login_view'))
56 return super(MyAdminIndexView, self).index()
57
58 @expose('/login/', methods=('GET', 'POST'))
59 def login_view(self):
60 """
61 login view for flask-admin
62 :return:
63 """
64 # handle user login
65 form = LoginForm(request.form)
66 if admin_helpers.validate_form_on_submit(form):
67 user = form.get_user()
68 login.login_user(user)
69
70 if login.current_user.is_authenticated:
71 return redirect(url_for('.index'))
72 self._template_args['form'] = form
73 return super(MyAdminIndexView, self).index()
74
75 @expose('/logout/')
76 def logout_view(self):
77 login.logout_user()
78 return redirect(url_for('.index'))
79
80
81 home_routes = Blueprint('home', __name__)
82
83
84 # Flask views
85 @home_routes.route('/')
86 def index():
87 """
88 Index route
89 :return:
90 """
91 r = requests.get('https://raw.githubusercontent.com/fossasia/open-event-server/gh-pages/api/v1/index.html')
92 response = make_response(r.content)
93 response.headers["Content-Type"] = "text/html"
94 return response
95
96
97 class BlueprintsManager:
98 def __init__(self):
99 pass
100
101 @staticmethod
102 def register(app):
103 """
104 Register blueprints
105 :param app: a flask app instance
106 :return:
107 """
108 app.register_blueprint(home_routes)
109 admin = Admin(app, name='Open Event API', template_mode='bootstrap3', index_view=MyAdminIndexView(),
110 base_template='admin_base.html')
111
112 # Get all the models in the db, all models should have a explicit __tablename__
113 classes, models, table_names = [], [], []
114 # noinspection PyProtectedMember
115 for class_ in list(db.Model._decl_class_registry.values()):
116 try:
117 table_names.append(class_.__tablename__)
118 classes.append(class_)
119 except:
120 pass
121 for table in list(db.metadata.tables.items()):
122 if table[0] in table_names:
123 models.append(classes[table_names.index(table[0])])
124
125 for model in models:
126 admin.add_view(AdminModelView(model, db.session))
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/views/__init__.py b/app/views/__init__.py
--- a/app/views/__init__.py
+++ b/app/views/__init__.py
@@ -3,7 +3,6 @@
from flask import url_for, redirect, Blueprint, request, make_response
from flask_admin import Admin, AdminIndexView, expose, helpers as admin_helpers
from flask_admin.contrib.sqla import ModelView
-from flask_scrypt import generate_password_hash
from wtforms import form, fields, validators
from app.models import db
@@ -34,7 +33,7 @@
if user is None:
raise validators.ValidationError('User does not exist.')
- if user.password != generate_password_hash(self.password.data, user.salt):
+ if not user.is_correct_password(self.password.data):
raise validators.ValidationError('Credentials incorrect.')
if not user.is_admin and not user.is_super_admin:
| {"golden_diff": "diff --git a/app/views/__init__.py b/app/views/__init__.py\n--- a/app/views/__init__.py\n+++ b/app/views/__init__.py\n@@ -3,7 +3,6 @@\n from flask import url_for, redirect, Blueprint, request, make_response\n from flask_admin import Admin, AdminIndexView, expose, helpers as admin_helpers\n from flask_admin.contrib.sqla import ModelView\n-from flask_scrypt import generate_password_hash\n from wtforms import form, fields, validators\n \n from app.models import db\n@@ -34,7 +33,7 @@\n if user is None:\n raise validators.ValidationError('User does not exist.')\n \n- if user.password != generate_password_hash(self.password.data, user.salt):\n+ if not user.is_correct_password(self.password.data):\n raise validators.ValidationError('Credentials incorrect.')\n \n if not user.is_admin and not user.is_super_admin:\n", "issue": "Unable to login to the Admin Panel.\n**Description:**\r\nWhen we try to login to admin panel, even if the credentials belong to super-admin, it returns \"Credential incorrect\"\r\n\r\n**Steps to reproduce the behavior:**\r\n1. Go to 127.0.0.1:5000/admin\r\n2. Enter the admin credentials\r\n3. Click on login button\r\n4. See error \"Credential incorrect\"\r\n\r\n**Expected Behaviour:**\r\nIt should login the user to the admin panel if credentials are correct and the user is an admin.\r\n\r\n**Screenshots**\r\n\r\n\n", "before_files": [{"content": "import flask_login as login\nimport requests\nfrom flask import url_for, redirect, Blueprint, request, make_response\nfrom flask_admin import Admin, AdminIndexView, expose, helpers as admin_helpers\nfrom flask_admin.contrib.sqla import ModelView\nfrom flask_scrypt import generate_password_hash\nfrom wtforms import form, fields, validators\n\nfrom app.models import db\nfrom app.models.user import User\n\n\nclass AdminModelView(ModelView):\n def is_accessible(self):\n return login.current_user.is_authenticated\n\n def inaccessible_callback(self, name, **kwargs):\n # redirect to login page if user doesn't have access\n return redirect(url_for('admin.index', next=request.url))\n\n\nclass LoginForm(form.Form):\n login = fields.TextField(validators=[validators.required(), validators.email()], render_kw={\"placeholder\": \"[email protected]\"})\n password = fields.PasswordField(validators=[validators.required()], render_kw={\"placeholder\": \"xyzzy\"})\n\n def validate_login(self, field):\n \"\"\"\n validate login\n :param field:\n :return:\n \"\"\"\n user = self.get_user()\n\n if user is None:\n raise validators.ValidationError('User does not exist.')\n\n if user.password != generate_password_hash(self.password.data, user.salt):\n raise validators.ValidationError('Credentials incorrect.')\n\n if not user.is_admin and not user.is_super_admin:\n raise validators.ValidationError('Access Forbidden. 
Admin Rights Required')\n\n def get_user(self):\n return User.query.filter_by(email=self.login.data).first()\n\n\nclass MyAdminIndexView(AdminIndexView):\n @expose('/')\n def index(self):\n \"\"\"\n /admin\n :return:\n \"\"\"\n if not login.current_user.is_authenticated:\n return redirect(url_for('.login_view'))\n return super(MyAdminIndexView, self).index()\n\n @expose('/login/', methods=('GET', 'POST'))\n def login_view(self):\n \"\"\"\n login view for flask-admin\n :return:\n \"\"\"\n # handle user login\n form = LoginForm(request.form)\n if admin_helpers.validate_form_on_submit(form):\n user = form.get_user()\n login.login_user(user)\n\n if login.current_user.is_authenticated:\n return redirect(url_for('.index'))\n self._template_args['form'] = form\n return super(MyAdminIndexView, self).index()\n\n @expose('/logout/')\n def logout_view(self):\n login.logout_user()\n return redirect(url_for('.index'))\n\n\nhome_routes = Blueprint('home', __name__)\n\n\n# Flask views\n@home_routes.route('/')\ndef index():\n \"\"\"\n Index route\n :return:\n \"\"\"\n r = requests.get('https://raw.githubusercontent.com/fossasia/open-event-server/gh-pages/api/v1/index.html')\n response = make_response(r.content)\n response.headers[\"Content-Type\"] = \"text/html\"\n return response\n\n\nclass BlueprintsManager:\n def __init__(self):\n pass\n\n @staticmethod\n def register(app):\n \"\"\"\n Register blueprints\n :param app: a flask app instance\n :return:\n \"\"\"\n app.register_blueprint(home_routes)\n admin = Admin(app, name='Open Event API', template_mode='bootstrap3', index_view=MyAdminIndexView(),\n base_template='admin_base.html')\n\n # Get all the models in the db, all models should have a explicit __tablename__\n classes, models, table_names = [], [], []\n # noinspection PyProtectedMember\n for class_ in list(db.Model._decl_class_registry.values()):\n try:\n table_names.append(class_.__tablename__)\n classes.append(class_)\n except:\n pass\n for table in list(db.metadata.tables.items()):\n if table[0] in table_names:\n models.append(classes[table_names.index(table[0])])\n\n for model in models:\n admin.add_view(AdminModelView(model, db.session))\n", "path": "app/views/__init__.py"}], "after_files": [{"content": "import flask_login as login\nimport requests\nfrom flask import url_for, redirect, Blueprint, request, make_response\nfrom flask_admin import Admin, AdminIndexView, expose, helpers as admin_helpers\nfrom flask_admin.contrib.sqla import ModelView\nfrom wtforms import form, fields, validators\n\nfrom app.models import db\nfrom app.models.user import User\n\n\nclass AdminModelView(ModelView):\n def is_accessible(self):\n return login.current_user.is_authenticated\n\n def inaccessible_callback(self, name, **kwargs):\n # redirect to login page if user doesn't have access\n return redirect(url_for('admin.index', next=request.url))\n\n\nclass LoginForm(form.Form):\n login = fields.TextField(validators=[validators.required(), validators.email()], render_kw={\"placeholder\": \"[email protected]\"})\n password = fields.PasswordField(validators=[validators.required()], render_kw={\"placeholder\": \"xyzzy\"})\n\n def validate_login(self, field):\n \"\"\"\n validate login\n :param field:\n :return:\n \"\"\"\n user = self.get_user()\n\n if user is None:\n raise validators.ValidationError('User does not exist.')\n\n if not user.is_correct_password(self.password.data):\n raise validators.ValidationError('Credentials incorrect.')\n\n if not user.is_admin and not user.is_super_admin:\n raise 
validators.ValidationError('Access Forbidden. Admin Rights Required')\n\n def get_user(self):\n return User.query.filter_by(email=self.login.data).first()\n\n\nclass MyAdminIndexView(AdminIndexView):\n @expose('/')\n def index(self):\n \"\"\"\n /admin\n :return:\n \"\"\"\n if not login.current_user.is_authenticated:\n return redirect(url_for('.login_view'))\n return super(MyAdminIndexView, self).index()\n\n @expose('/login/', methods=('GET', 'POST'))\n def login_view(self):\n \"\"\"\n login view for flask-admin\n :return:\n \"\"\"\n # handle user login\n form = LoginForm(request.form)\n if admin_helpers.validate_form_on_submit(form):\n user = form.get_user()\n login.login_user(user)\n\n if login.current_user.is_authenticated:\n return redirect(url_for('.index'))\n self._template_args['form'] = form\n return super(MyAdminIndexView, self).index()\n\n @expose('/logout/')\n def logout_view(self):\n login.logout_user()\n return redirect(url_for('.index'))\n\n\nhome_routes = Blueprint('home', __name__)\n\n\n# Flask views\n@home_routes.route('/')\ndef index():\n \"\"\"\n Index route\n :return:\n \"\"\"\n r = requests.get('https://raw.githubusercontent.com/fossasia/open-event-server/gh-pages/api/v1/index.html')\n response = make_response(r.content)\n response.headers[\"Content-Type\"] = \"text/html\"\n return response\n\n\nclass BlueprintsManager:\n def __init__(self):\n pass\n\n @staticmethod\n def register(app):\n \"\"\"\n Register blueprints\n :param app: a flask app instance\n :return:\n \"\"\"\n app.register_blueprint(home_routes)\n admin = Admin(app, name='Open Event API', template_mode='bootstrap3', index_view=MyAdminIndexView(),\n base_template='admin_base.html')\n\n # Get all the models in the db, all models should have a explicit __tablename__\n classes, models, table_names = [], [], []\n # noinspection PyProtectedMember\n for class_ in list(db.Model._decl_class_registry.values()):\n try:\n table_names.append(class_.__tablename__)\n classes.append(class_)\n except:\n pass\n for table in list(db.metadata.tables.items()):\n if table[0] in table_names:\n models.append(classes[table_names.index(table[0])])\n\n for model in models:\n admin.add_view(AdminModelView(model, db.session))\n", "path": "app/views/__init__.py"}]} | 1,546 | 193 |
gh_patches_debug_10505 | rasdani/github-patches | git_diff | cornellius-gp__gpytorch-1371 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ZeroMean for Batch Independent Multioutput GP
I'm following the Batch Independent Multioutput GP example, but instead of using a constant mean, I would like each dimension to use a zero mean (maybe this is a bad idea?).
```
class ZeroMeanIndependentMultitaskGPModel(gpytorch.models.ExactGP):
def __init__(self, train_x, train_y, likelihood, nx):
super().__init__(train_x, train_y, likelihood)
self.n = nx #output dimension
#self.mean_module = gpytorch.means.MultitaskMean([gpytorch.means.ZeroMean()]*self.n,
# num_tasks=self.n)
#self.mean_module = gpytorch.means.ConstantMean(batch_shape=torch.Size([self.n]))
self.mean_module = gpytorch.means.ZeroMean()
self.covar_module = gpytorch.kernels.ScaleKernel(
gpytorch.kernels.RBFKernel(batch_shape=torch.Size([self.n])),
batch_shape=torch.Size([self.n])
)
def forward(self, x):
mean_x = self.mean_module(x) # is this needed for ZeroMean?
covar_x = self.covar_module(x)
return gpytorch.distributions.MultitaskMultivariateNormal.from_batch_mvn(
gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
)
```
When training with this, I get the error `RuntimeError: mean should be a matrix or a batch matrix (batch mode)`. It works as intended with constant mean. As you can see, I've tried a couple different things, but they don't seem to work either. I can't seem to find other people with the same issue online. Is it possible to do this with ZeroMean?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gpytorch/means/zero_mean.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import torch
4
5 from .mean import Mean
6
7
8 class ZeroMean(Mean):
9 def forward(self, input):
10 return torch.zeros(input.shape[:-1], dtype=input.dtype, device=input.device)
11
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gpytorch/means/zero_mean.py b/gpytorch/means/zero_mean.py
--- a/gpytorch/means/zero_mean.py
+++ b/gpytorch/means/zero_mean.py
@@ -2,9 +2,18 @@
import torch
+from ..utils.broadcasting import _mul_broadcast_shape
from .mean import Mean
class ZeroMean(Mean):
+ def __init__(self, batch_shape=torch.Size(), **kwargs):
+ super(ZeroMean, self).__init__()
+ self.batch_shape = batch_shape
+
def forward(self, input):
- return torch.zeros(input.shape[:-1], dtype=input.dtype, device=input.device)
+ mean = torch.zeros(*self.batch_shape, 1, dtype=input.dtype, device=input.device)
+ if input.shape[:-2] == self.batch_shape:
+ return mean.expand(input.shape[:-1])
+ else:
+ return mean.expand(_mul_broadcast_shape(input.shape[:-1], mean.shape))
| {"golden_diff": "diff --git a/gpytorch/means/zero_mean.py b/gpytorch/means/zero_mean.py\n--- a/gpytorch/means/zero_mean.py\n+++ b/gpytorch/means/zero_mean.py\n@@ -2,9 +2,18 @@\n \n import torch\n \n+from ..utils.broadcasting import _mul_broadcast_shape\n from .mean import Mean\n \n \n class ZeroMean(Mean):\n+ def __init__(self, batch_shape=torch.Size(), **kwargs):\n+ super(ZeroMean, self).__init__()\n+ self.batch_shape = batch_shape\n+\n def forward(self, input):\n- return torch.zeros(input.shape[:-1], dtype=input.dtype, device=input.device)\n+ mean = torch.zeros(*self.batch_shape, 1, dtype=input.dtype, device=input.device)\n+ if input.shape[:-2] == self.batch_shape:\n+ return mean.expand(input.shape[:-1])\n+ else:\n+ return mean.expand(_mul_broadcast_shape(input.shape[:-1], mean.shape))\n", "issue": "ZeroMean for Batch Independent Multioutput GP\nI'm following the Batch Independent Multioutput GP example, but instead of using a constant mean, I would like each dimension to use a zero mean (maybe this is a bad idea?).\r\n\r\n```\r\nclass ZeroMeanIndependentMultitaskGPModel(gpytorch.models.ExactGP):\r\n def __init__(self, train_x, train_y, likelihood, nx):\r\n super().__init__(train_x, train_y, likelihood)\r\n self.n = nx #output dimension\r\n #self.mean_module = gpytorch.means.MultitaskMean([gpytorch.means.ZeroMean()]*self.n,\r\n # num_tasks=self.n)\r\n #self.mean_module = gpytorch.means.ConstantMean(batch_shape=torch.Size([self.n]))\r\n self.mean_module = gpytorch.means.ZeroMean()\r\n self.covar_module = gpytorch.kernels.ScaleKernel(\r\n gpytorch.kernels.RBFKernel(batch_shape=torch.Size([self.n])),\r\n batch_shape=torch.Size([self.n])\r\n )\r\n\r\n def forward(self, x):\r\n mean_x = self.mean_module(x) # is this needed for ZeroMean?\r\n covar_x = self.covar_module(x)\r\n return gpytorch.distributions.MultitaskMultivariateNormal.from_batch_mvn(\r\n gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\r\n )\r\n```\r\n\r\n\r\nWhen training with this, I get the error `RuntimeError: mean should be a matrix or a batch matrix (batch mode)`. It works as intended with constant mean. As you can see, I've tried a couple different things, but they don't seem to work either. I can't seem to find other people with the same issue online. Is it possible to do this with ZeroMean?\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport torch\n\nfrom .mean import Mean\n\n\nclass ZeroMean(Mean):\n def forward(self, input):\n return torch.zeros(input.shape[:-1], dtype=input.dtype, device=input.device)\n", "path": "gpytorch/means/zero_mean.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport torch\n\nfrom ..utils.broadcasting import _mul_broadcast_shape\nfrom .mean import Mean\n\n\nclass ZeroMean(Mean):\n def __init__(self, batch_shape=torch.Size(), **kwargs):\n super(ZeroMean, self).__init__()\n self.batch_shape = batch_shape\n\n def forward(self, input):\n mean = torch.zeros(*self.batch_shape, 1, dtype=input.dtype, device=input.device)\n if input.shape[:-2] == self.batch_shape:\n return mean.expand(input.shape[:-1])\n else:\n return mean.expand(_mul_broadcast_shape(input.shape[:-1], mean.shape))\n", "path": "gpytorch/means/zero_mean.py"}]} | 704 | 216 |
gh_patches_debug_1971 | rasdani/github-patches | git_diff | strawberry-graphql__strawberry-1994 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Postponed annotation evaluation causes `Annotated` to break
When using postponed annotation evaluation, annotating resolver arguments no longer works:
```python
from __future__ import annotations
import random
from typing import Annotated
import strawberry
@strawberry.type
class Query:
@strawberry.field
def dice_roll(
self,
sides: Annotated[
int,
strawberry.argument(description="Number of sides the die should have."),
] = 6,
) -> int:
return random.randint(1, sides)
strawberry.Schema(query=Query)
```
The example above raises this TypeError:
```
TypeError: Query fields cannot be resolved. Unexpected type 'typing.Annotated[int, <strawberry.arguments.StrawberryArgumentAnnotation object at 0x7fd12e130d00>]'
```
When the first line (`from __future__ import annotations`) is left out, everything works as intended. This will probably also break once Python 3.11 lands, since the behavior will become mandatory then. #1586 refers to a somewhat related issue.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `strawberry/auto.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import Any, Optional, Union, cast
4
5 from typing_extensions import Annotated, get_args, get_origin
6
7 from strawberry.type import StrawberryType
8
9 from .annotation import StrawberryAnnotation
10
11
12 class StrawberryAutoMeta(type):
13 """Metaclass for StrawberryAuto.
14
15 This is used to make sure StrawberryAuto is a singleton and also to
16 override the behavior of `isinstance` so that it consider the following
17 cases:
18
19 >> isinstance(StrawberryAuto(), StrawberryAuto)
20 True
21 >> isinstance(StrawberryAnnotation(StrawberryAuto()), StrawberryAuto)
22 True
23 >> isinstance(Annotated[StrawberryAuto(), object()), StrawberryAuto)
24 True
25
26 """
27
28 def __init__(self, *args, **kwargs):
29 self._instance: Optional[StrawberryAuto] = None
30 super().__init__(*args, **kwargs)
31
32 def __call__(cls, *args, **kwargs):
33 if cls._instance is None:
34 cls._instance = super().__call__(*args, **kwargs)
35
36 return cls._instance
37
38 def __instancecheck__(
39 self,
40 instance: Union[StrawberryAuto, StrawberryAnnotation, StrawberryType, type],
41 ):
42 if isinstance(instance, StrawberryAnnotation):
43 resolved = instance.annotation
44 if isinstance(resolved, str):
45 namespace = instance.namespace
46 resolved = namespace and namespace.get(resolved)
47
48 if resolved is not None:
49 instance = cast(type, resolved)
50
51 if instance is auto:
52 return True
53
54 # Support uses of Annotated[auto, something()]
55 if get_origin(instance) is Annotated:
56 args = get_args(instance)
57 if args[0] is Any:
58 return any(isinstance(arg, StrawberryAuto) for arg in args[1:])
59
60 return False
61
62
63 class StrawberryAuto(metaclass=StrawberryAutoMeta):
64 def __str__(self):
65 return "auto"
66
67 def __repr__(self):
68 return "<auto>"
69
70
71 auto = Annotated[Any, StrawberryAuto()]
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/strawberry/auto.py b/strawberry/auto.py
--- a/strawberry/auto.py
+++ b/strawberry/auto.py
@@ -57,7 +57,7 @@
if args[0] is Any:
return any(isinstance(arg, StrawberryAuto) for arg in args[1:])
- return False
+ return instance == "strawberry.auto"
class StrawberryAuto(metaclass=StrawberryAutoMeta):
| {"golden_diff": "diff --git a/strawberry/auto.py b/strawberry/auto.py\n--- a/strawberry/auto.py\n+++ b/strawberry/auto.py\n@@ -57,7 +57,7 @@\n if args[0] is Any:\n return any(isinstance(arg, StrawberryAuto) for arg in args[1:])\n \n- return False\n+ return instance == \"strawberry.auto\"\n \n \n class StrawberryAuto(metaclass=StrawberryAutoMeta):\n", "issue": "Postponed annotation evaluation causes `Annotated` to break\nWhen using postponed annotation evaluation, annotating resolver arguments no longer works:\r\n\r\n```python\r\nfrom __future__ import annotations\r\n\r\nimport random\r\nfrom typing import Annotated\r\n\r\nimport strawberry\r\n\r\n\r\[email protected]\r\nclass Query:\r\n @strawberry.field\r\n def dice_roll(\r\n self,\r\n sides: Annotated[\r\n int,\r\n strawberry.argument(description=\"Number of sides the die should have.\"),\r\n ] = 6,\r\n ) -> int:\r\n return random.randint(1, sides)\r\n\r\n\r\nstrawberry.Schema(query=Query)\r\n```\r\n\r\nThe example above raises this TypeError:\r\n\r\n```\r\nTypeError: Query fields cannot be resolved. Unexpected type 'typing.Annotated[int, <strawberry.arguments.StrawberryArgumentAnnotation object at 0x7fd12e130d00>]'\r\n```\r\n\r\nWhen the first line (`from __future__ import annotations`) is left out, everything works as intended. This will probably also break once Python 3.11 lands, since the behavior will become mandatory then. #1586 refers to a somewhat related issue.\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any, Optional, Union, cast\n\nfrom typing_extensions import Annotated, get_args, get_origin\n\nfrom strawberry.type import StrawberryType\n\nfrom .annotation import StrawberryAnnotation\n\n\nclass StrawberryAutoMeta(type):\n \"\"\"Metaclass for StrawberryAuto.\n\n This is used to make sure StrawberryAuto is a singleton and also to\n override the behavior of `isinstance` so that it consider the following\n cases:\n\n >> isinstance(StrawberryAuto(), StrawberryAuto)\n True\n >> isinstance(StrawberryAnnotation(StrawberryAuto()), StrawberryAuto)\n True\n >> isinstance(Annotated[StrawberryAuto(), object()), StrawberryAuto)\n True\n\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n self._instance: Optional[StrawberryAuto] = None\n super().__init__(*args, **kwargs)\n\n def __call__(cls, *args, **kwargs):\n if cls._instance is None:\n cls._instance = super().__call__(*args, **kwargs)\n\n return cls._instance\n\n def __instancecheck__(\n self,\n instance: Union[StrawberryAuto, StrawberryAnnotation, StrawberryType, type],\n ):\n if isinstance(instance, StrawberryAnnotation):\n resolved = instance.annotation\n if isinstance(resolved, str):\n namespace = instance.namespace\n resolved = namespace and namespace.get(resolved)\n\n if resolved is not None:\n instance = cast(type, resolved)\n\n if instance is auto:\n return True\n\n # Support uses of Annotated[auto, something()]\n if get_origin(instance) is Annotated:\n args = get_args(instance)\n if args[0] is Any:\n return any(isinstance(arg, StrawberryAuto) for arg in args[1:])\n\n return False\n\n\nclass StrawberryAuto(metaclass=StrawberryAutoMeta):\n def __str__(self):\n return \"auto\"\n\n def __repr__(self):\n return \"<auto>\"\n\n\nauto = Annotated[Any, StrawberryAuto()]\n", "path": "strawberry/auto.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any, Optional, Union, cast\n\nfrom typing_extensions import Annotated, get_args, get_origin\n\nfrom strawberry.type import StrawberryType\n\nfrom 
.annotation import StrawberryAnnotation\n\n\nclass StrawberryAutoMeta(type):\n \"\"\"Metaclass for StrawberryAuto.\n\n This is used to make sure StrawberryAuto is a singleton and also to\n override the behavior of `isinstance` so that it consider the following\n cases:\n\n >> isinstance(StrawberryAuto(), StrawberryAuto)\n True\n >> isinstance(StrawberryAnnotation(StrawberryAuto()), StrawberryAuto)\n True\n >> isinstance(Annotated[StrawberryAuto(), object()), StrawberryAuto)\n True\n\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n self._instance: Optional[StrawberryAuto] = None\n super().__init__(*args, **kwargs)\n\n def __call__(cls, *args, **kwargs):\n if cls._instance is None:\n cls._instance = super().__call__(*args, **kwargs)\n\n return cls._instance\n\n def __instancecheck__(\n self,\n instance: Union[StrawberryAuto, StrawberryAnnotation, StrawberryType, type],\n ):\n if isinstance(instance, StrawberryAnnotation):\n resolved = instance.annotation\n if isinstance(resolved, str):\n namespace = instance.namespace\n resolved = namespace and namespace.get(resolved)\n\n if resolved is not None:\n instance = cast(type, resolved)\n\n if instance is auto:\n return True\n\n # Support uses of Annotated[auto, something()]\n if get_origin(instance) is Annotated:\n args = get_args(instance)\n if args[0] is Any:\n return any(isinstance(arg, StrawberryAuto) for arg in args[1:])\n\n return instance == \"strawberry.auto\"\n\n\nclass StrawberryAuto(metaclass=StrawberryAutoMeta):\n def __str__(self):\n return \"auto\"\n\n def __repr__(self):\n return \"<auto>\"\n\n\nauto = Annotated[Any, StrawberryAuto()]\n", "path": "strawberry/auto.py"}]} | 1,076 | 104 |
gh_patches_debug_10295 | rasdani/github-patches | git_diff | holoviz__panel-723 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Scipy now required dependency?
The latest dev build of panel (panel-0.7.0a15-py_0) looks like it is trying to import scipy, but scipy is not among the required dependencies that are automatically installed through conda.
```
$ conda create -n panel -c pyviz/label/dev panel
...
$ conda activate panel
(panel) $ python
Python 3.7.4 (default, Aug 13 2019, 15:17:50)
[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import panel
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/__init__.py", line 6, in <module>
from . import links # noqa
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/links.py", line 12, in <module>
from .pane.holoviews import HoloViews, generate_panel_bokeh_map, is_bokeh_element_plot
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/pane/__init__.py", line 13, in <module>
from .holoviews import HoloViews # noqa
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/pane/holoviews.py", line 20, in <module>
from ..widgets import Player
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/widgets/__init__.py", line 12, in <module>
from .misc import Audio, VideoStream # noqa
File "/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/widgets/misc.py", line 14, in <module>
from scipy.io import wavfile
ModuleNotFoundError: No module named 'scipy'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/widgets/misc.py`
Content:
```
1 """
2 Miscellaneous widgets which do not fit into the other main categories.
3 """
4 from __future__ import absolute_import, division, unicode_literals
5
6 import os
7
8 from io import BytesIO
9 from base64 import b64encode
10 from six import string_types
11
12 import param
13 import numpy as np
14 from scipy.io import wavfile
15
16 from ..io.notebook import push
17 from ..io.state import state
18 from ..models import (Audio as _BkAudio,
19 VideoStream as _BkVideoStream)
20 from .base import Widget
21
22
23 class Audio(Widget):
24
25 loop = param.Boolean(default=False, doc="""
26 Whether the audio should loop""")
27
28 time = param.Number(default=0, doc="""
29 The current timestamp""")
30
31 throttle = param.Integer(default=250, doc="""
32 How frequently to sample the current playback time in milliseconds""")
33
34 paused = param.Boolean(default=True, doc="""
35 Whether the audio is currently paused""")
36
37 sample_rate = param.Integer(default=44100, doc="""
38 The sample_rate of the audio when given a NumPy array.""")
39
40 value = param.ClassSelector(default='', class_=(string_types + (np.ndarray,)), doc="""
41 The audio file either local or remote.""")
42
43 volume = param.Number(default=None, bounds=(0, 100), doc="""
44 The volume of the audio player.""")
45
46 _widget_type = _BkAudio
47
48 _rename = {'name': None, 'sample_rate': None}
49
50 def _from_numpy(self, data):
51 buffer = BytesIO()
52 wavfile.write(buffer, self.sample_rate, data)
53 return buffer
54
55 def _process_param_change(self, msg):
56 msg = super(Audio, self)._process_param_change(msg)
57
58 if 'value' in msg:
59 value = msg['value']
60 if isinstance(value, np.ndarray):
61 fmt = 'wav'
62 buffer = self._from_numpy(value)
63 data = b64encode(buffer.getvalue())
64 elif os.path.isfile(value):
65 fmt = value.split('.')[-1]
66 with open(value, 'rb') as f:
67 data = f.read()
68 data = b64encode(data)
69 elif value.lower().startswith('http'):
70 return msg
71 elif not value:
72 data, fmt = b'', 'wav'
73 else:
74 raise ValueError('Value should be either path to a sound file or numpy array')
75 template = 'data:audio/{mime};base64,{data}'
76 msg['value'] = template.format(data=data.decode('utf-8'),
77 mime=fmt)
78 return msg
79
80
81
82 class VideoStream(Widget):
83
84 format = param.ObjectSelector(default='png', objects=['png', 'jpeg'],
85 doc="""
86 The file format as which the video is returned.""")
87
88 paused = param.Boolean(default=False, doc="""
89 Whether the video is currently paused""")
90
91 timeout = param.Number(default=None, doc="""
92 Interval between snapshots in millisecons""")
93
94 value = param.String(default='', doc="""
95 A base64 representation of the video stream snapshot.""")
96
97 _widget_type = _BkVideoStream
98
99 _rename = {'name': None}
100
101 def snapshot(self):
102 """
103 Triggers a snapshot of the current VideoStream state to sync
104 the widget value.
105 """
106 for ref, (m, _) in self._models.items():
107 m.snapshot = not m.snapshot
108 (self, root, doc, comm) = state._views[ref]
109 if comm and 'embedded' not in root.tags:
110 push(doc, comm)
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/panel/widgets/misc.py b/panel/widgets/misc.py
--- a/panel/widgets/misc.py
+++ b/panel/widgets/misc.py
@@ -11,7 +11,6 @@
import param
import numpy as np
-from scipy.io import wavfile
from ..io.notebook import push
from ..io.state import state
@@ -48,6 +47,7 @@
_rename = {'name': None, 'sample_rate': None}
def _from_numpy(self, data):
+ from scipy.io import wavfile
buffer = BytesIO()
wavfile.write(buffer, self.sample_rate, data)
return buffer
| {"golden_diff": "diff --git a/panel/widgets/misc.py b/panel/widgets/misc.py\n--- a/panel/widgets/misc.py\n+++ b/panel/widgets/misc.py\n@@ -11,7 +11,6 @@\n \n import param\n import numpy as np\n-from scipy.io import wavfile\n \n from ..io.notebook import push\n from ..io.state import state\n@@ -48,6 +47,7 @@\n _rename = {'name': None, 'sample_rate': None}\n \n def _from_numpy(self, data):\n+ from scipy.io import wavfile\n buffer = BytesIO()\n wavfile.write(buffer, self.sample_rate, data)\n return buffer\n", "issue": "Scipy now required dependency?\nThe latest dev build of panel (panel-0.7.0a15-py_0) looks like it is trying to import scipy, but scipy is not among the required dependencies that are automatically installed through conda.\r\n\r\n```\r\n$ conda create -n panel -c pyviz/label/dev panel\r\n...\r\n$ conda activate panel\r\n(panel) $ python\r\nPython 3.7.4 (default, Aug 13 2019, 15:17:50) \r\n[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import panel\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/__init__.py\", line 6, in <module>\r\n from . import links # noqa\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/links.py\", line 12, in <module>\r\n from .pane.holoviews import HoloViews, generate_panel_bokeh_map, is_bokeh_element_plot\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/pane/__init__.py\", line 13, in <module>\r\n from .holoviews import HoloViews # noqa\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/pane/holoviews.py\", line 20, in <module>\r\n from ..widgets import Player\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/widgets/__init__.py\", line 12, in <module>\r\n from .misc import Audio, VideoStream # noqa\r\n File \"/Users/rditlsc9/miniconda/envs/panel/lib/python3.7/site-packages/panel/widgets/misc.py\", line 14, in <module>\r\n from scipy.io import wavfile\r\nModuleNotFoundError: No module named 'scipy'\r\n```\n", "before_files": [{"content": "\"\"\"\nMiscellaneous widgets which do not fit into the other main categories.\n\"\"\"\nfrom __future__ import absolute_import, division, unicode_literals\n\nimport os\n\nfrom io import BytesIO\nfrom base64 import b64encode\nfrom six import string_types\n\nimport param\nimport numpy as np\nfrom scipy.io import wavfile\n\nfrom ..io.notebook import push\nfrom ..io.state import state\nfrom ..models import (Audio as _BkAudio,\n VideoStream as _BkVideoStream)\nfrom .base import Widget\n\n\nclass Audio(Widget):\n\n loop = param.Boolean(default=False, doc=\"\"\"\n Whether the audio should loop\"\"\")\n\n time = param.Number(default=0, doc=\"\"\"\n The current timestamp\"\"\")\n\n throttle = param.Integer(default=250, doc=\"\"\"\n How frequently to sample the current playback time in milliseconds\"\"\")\n\n paused = param.Boolean(default=True, doc=\"\"\"\n Whether the audio is currently paused\"\"\")\n\n sample_rate = param.Integer(default=44100, doc=\"\"\"\n The sample_rate of the audio when given a NumPy array.\"\"\")\n\n value = param.ClassSelector(default='', class_=(string_types + (np.ndarray,)), doc=\"\"\"\n The audio file either local or remote.\"\"\")\n\n volume = param.Number(default=None, bounds=(0, 100), doc=\"\"\"\n The volume of the audio player.\"\"\")\n\n 
_widget_type = _BkAudio\n\n _rename = {'name': None, 'sample_rate': None}\n\n def _from_numpy(self, data):\n buffer = BytesIO()\n wavfile.write(buffer, self.sample_rate, data)\n return buffer\n\n def _process_param_change(self, msg):\n msg = super(Audio, self)._process_param_change(msg)\n\n if 'value' in msg:\n value = msg['value']\n if isinstance(value, np.ndarray):\n fmt = 'wav'\n buffer = self._from_numpy(value)\n data = b64encode(buffer.getvalue())\n elif os.path.isfile(value):\n fmt = value.split('.')[-1]\n with open(value, 'rb') as f:\n data = f.read()\n data = b64encode(data)\n elif value.lower().startswith('http'):\n return msg\n elif not value:\n data, fmt = b'', 'wav'\n else:\n raise ValueError('Value should be either path to a sound file or numpy array')\n template = 'data:audio/{mime};base64,{data}'\n msg['value'] = template.format(data=data.decode('utf-8'),\n mime=fmt)\n return msg\n\n\n\nclass VideoStream(Widget):\n\n format = param.ObjectSelector(default='png', objects=['png', 'jpeg'],\n doc=\"\"\"\n The file format as which the video is returned.\"\"\")\n\n paused = param.Boolean(default=False, doc=\"\"\"\n Whether the video is currently paused\"\"\")\n\n timeout = param.Number(default=None, doc=\"\"\"\n Interval between snapshots in millisecons\"\"\")\n\n value = param.String(default='', doc=\"\"\"\n A base64 representation of the video stream snapshot.\"\"\")\n\n _widget_type = _BkVideoStream\n\n _rename = {'name': None}\n\n def snapshot(self):\n \"\"\"\n Triggers a snapshot of the current VideoStream state to sync\n the widget value.\n \"\"\"\n for ref, (m, _) in self._models.items():\n m.snapshot = not m.snapshot\n (self, root, doc, comm) = state._views[ref]\n if comm and 'embedded' not in root.tags:\n push(doc, comm)\n", "path": "panel/widgets/misc.py"}], "after_files": [{"content": "\"\"\"\nMiscellaneous widgets which do not fit into the other main categories.\n\"\"\"\nfrom __future__ import absolute_import, division, unicode_literals\n\nimport os\n\nfrom io import BytesIO\nfrom base64 import b64encode\nfrom six import string_types\n\nimport param\nimport numpy as np\n\nfrom ..io.notebook import push\nfrom ..io.state import state\nfrom ..models import (Audio as _BkAudio,\n VideoStream as _BkVideoStream)\nfrom .base import Widget\n\n\nclass Audio(Widget):\n\n loop = param.Boolean(default=False, doc=\"\"\"\n Whether the audio should loop\"\"\")\n\n time = param.Number(default=0, doc=\"\"\"\n The current timestamp\"\"\")\n\n throttle = param.Integer(default=250, doc=\"\"\"\n How frequently to sample the current playback time in milliseconds\"\"\")\n\n paused = param.Boolean(default=True, doc=\"\"\"\n Whether the audio is currently paused\"\"\")\n\n sample_rate = param.Integer(default=44100, doc=\"\"\"\n The sample_rate of the audio when given a NumPy array.\"\"\")\n\n value = param.ClassSelector(default='', class_=(string_types + (np.ndarray,)), doc=\"\"\"\n The audio file either local or remote.\"\"\")\n\n volume = param.Number(default=None, bounds=(0, 100), doc=\"\"\"\n The volume of the audio player.\"\"\")\n\n _widget_type = _BkAudio\n\n _rename = {'name': None, 'sample_rate': None}\n\n def _from_numpy(self, data):\n from scipy.io import wavfile\n buffer = BytesIO()\n wavfile.write(buffer, self.sample_rate, data)\n return buffer\n\n def _process_param_change(self, msg):\n msg = super(Audio, self)._process_param_change(msg)\n\n if 'value' in msg:\n value = msg['value']\n if isinstance(value, np.ndarray):\n fmt = 'wav'\n buffer = self._from_numpy(value)\n data = 
b64encode(buffer.getvalue())\n elif os.path.isfile(value):\n fmt = value.split('.')[-1]\n with open(value, 'rb') as f:\n data = f.read()\n data = b64encode(data)\n elif value.lower().startswith('http'):\n return msg\n elif not value:\n data, fmt = b'', 'wav'\n else:\n raise ValueError('Value should be either path to a sound file or numpy array')\n template = 'data:audio/{mime};base64,{data}'\n msg['value'] = template.format(data=data.decode('utf-8'),\n mime=fmt)\n return msg\n\n\n\nclass VideoStream(Widget):\n\n format = param.ObjectSelector(default='png', objects=['png', 'jpeg'],\n doc=\"\"\"\n The file format as which the video is returned.\"\"\")\n\n paused = param.Boolean(default=False, doc=\"\"\"\n Whether the video is currently paused\"\"\")\n\n timeout = param.Number(default=None, doc=\"\"\"\n Interval between snapshots in millisecons\"\"\")\n\n value = param.String(default='', doc=\"\"\"\n A base64 representation of the video stream snapshot.\"\"\")\n\n _widget_type = _BkVideoStream\n\n _rename = {'name': None}\n\n def snapshot(self):\n \"\"\"\n Triggers a snapshot of the current VideoStream state to sync\n the widget value.\n \"\"\"\n for ref, (m, _) in self._models.items():\n m.snapshot = not m.snapshot\n (self, root, doc, comm) = state._views[ref]\n if comm and 'embedded' not in root.tags:\n push(doc, comm)\n", "path": "panel/widgets/misc.py"}]} | 1,782 | 143 |
gh_patches_debug_41072 | rasdani/github-patches | git_diff | PaddlePaddle__PaddleSeg-1746 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
init_weight is never executed in paddleseg/models/hrnet_contrast.py
__init__() in paddleseg/models/hrnet_contrast.py does not call init_weight, so hrnet_w48_contrast cannot load the complete model.
--- END ISSUE ---
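Editorial note (not part of the original issue): the fix pattern used elsewhere in PaddleSeg is to store `pretrained` and then call `init_weight()` at the end of the constructor. A minimal, hedged sketch of that pattern, using names from the file listed below; the trimmed-down constructor and the `Sketch` class name are assumptions for illustration only:

```python
import paddle.nn as nn

from paddleseg.utils import utils


class HRNetW48ContrastSketch(nn.Layer):
    """Illustration only: where the missing init_weight() call belongs."""

    def __init__(self, pretrained=None):
        super().__init__()
        # ... cls_head / proj_head would be built here, as in the real model ...
        self.pretrained = pretrained
        self.init_weight()  # the real class never makes this call, so weights stay unloaded

    def init_weight(self):
        if self.pretrained is not None:
            utils.load_entire_model(self, self.pretrained)
```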
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `paddleseg/models/hrnet_contrast.py`
Content:
```
1 # Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import paddle
16 import paddle.nn as nn
17 import paddle.nn.functional as F
18
19 from paddleseg.cvlibs import manager
20 from paddleseg.models import layers
21 from paddleseg.utils import utils
22
23
24 @manager.MODELS.add_component
25 class HRNetW48Contrast(nn.Layer):
26 """
27 The HRNetW48Contrast implementation based on PaddlePaddle.
28
29 The original article refers to
30 Wenguan Wang, Tianfei Zhou, et al. "Exploring Cross-Image Pixel Contrast for Semantic Segmentation"
31 (https://arxiv.org/abs/2101.11939).
32
33 Args:
34 in_channels (int): The output dimensions of backbone.
35 num_classes (int): The unique number of target classes.
36 backbone (Paddle.nn.Layer): Backbone network, currently support HRNet_W48.
37 drop_prob (float): The probability of dropout.
38 proj_dim (int): The projection dimensions.
39 align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,
40 e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.
41 pretrained (str, optional): The path or url of pretrained model. Default: None.
42 """
43 def __init__(self,
44 in_channels,
45 num_classes,
46 backbone,
47 drop_prob,
48 proj_dim,
49 align_corners=False,
50 pretrained=None):
51 super().__init__()
52 self.in_channels = in_channels
53 self.backbone = backbone
54 self.num_classes = num_classes
55 self.proj_dim = proj_dim
56 self.align_corners = align_corners
57 self.pretrained = pretrained
58
59 self.cls_head = nn.Sequential(
60 layers.ConvBNReLU(in_channels,
61 in_channels,
62 kernel_size=3,
63 stride=1,
64 padding=1),
65 nn.Dropout2D(drop_prob),
66 nn.Conv2D(in_channels,
67 num_classes,
68 kernel_size=1,
69 stride=1,
70 bias_attr=False),
71 )
72 self.proj_head = ProjectionHead(dim_in=in_channels,
73 proj_dim=self.proj_dim)
74
75 def init_weight(self):
76 if self.pretrained is not None:
77 utils.load_entire_model(self, self.pretrained)
78
79 def forward(self, x):
80 feats = self.backbone(x)[0]
81 out = self.cls_head(feats)
82 logit_list = []
83 if self.training:
84 emb = self.proj_head(feats)
85 logit_list.append(
86 F.interpolate(out,
87 paddle.shape(x)[2:],
88 mode='bilinear',
89 align_corners=self.align_corners))
90 logit_list.append({'seg': out, 'embed': emb})
91 else:
92 logit_list.append(
93 F.interpolate(out,
94 paddle.shape(x)[2:],
95 mode='bilinear',
96 align_corners=self.align_corners))
97 return logit_list
98
99
100 class ProjectionHead(nn.Layer):
101 """
102 The projection head used by contrast learning.
103 Args:
104 dim_in (int): The dimensions of input features.
105 proj_dim (int, optional): The output dimensions of projection head. Default: 256.
106 proj (str, optional): The type of projection head, only support 'linear' and 'convmlp'. Default: 'convmlp'.
107 """
108 def __init__(self, dim_in, proj_dim=256, proj='convmlp'):
109 super(ProjectionHead, self).__init__()
110 if proj == 'linear':
111 self.proj = nn.Conv2D(dim_in, proj_dim, kernel_size=1)
112 elif proj == 'convmlp':
113 self.proj = nn.Sequential(
114 layers.ConvBNReLU(dim_in, dim_in, kernel_size=1),
115 nn.Conv2D(dim_in, proj_dim, kernel_size=1),
116 )
117 else:
118 raise ValueError(
119 "The type of project head only support 'linear' and 'convmlp', but got {}."
120 .format(proj))
121
122 def forward(self, x):
123 return F.normalize(self.proj(x), p=2, axis=1)
124
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/paddleseg/models/hrnet_contrast.py b/paddleseg/models/hrnet_contrast.py
--- a/paddleseg/models/hrnet_contrast.py
+++ b/paddleseg/models/hrnet_contrast.py
@@ -40,6 +40,7 @@
e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.
pretrained (str, optional): The path or url of pretrained model. Default: None.
"""
+
def __init__(self,
in_channels,
num_classes,
@@ -54,23 +55,23 @@
self.num_classes = num_classes
self.proj_dim = proj_dim
self.align_corners = align_corners
- self.pretrained = pretrained
self.cls_head = nn.Sequential(
- layers.ConvBNReLU(in_channels,
- in_channels,
- kernel_size=3,
- stride=1,
- padding=1),
+ layers.ConvBNReLU(
+ in_channels, in_channels, kernel_size=3, stride=1, padding=1),
nn.Dropout2D(drop_prob),
- nn.Conv2D(in_channels,
- num_classes,
- kernel_size=1,
- stride=1,
- bias_attr=False),
+ nn.Conv2D(
+ in_channels,
+ num_classes,
+ kernel_size=1,
+ stride=1,
+ bias_attr=False),
)
- self.proj_head = ProjectionHead(dim_in=in_channels,
- proj_dim=self.proj_dim)
+ self.proj_head = ProjectionHead(
+ dim_in=in_channels, proj_dim=self.proj_dim)
+
+ self.pretrained = pretrained
+ self.init_weight()
def init_weight(self):
if self.pretrained is not None:
@@ -83,17 +84,19 @@
if self.training:
emb = self.proj_head(feats)
logit_list.append(
- F.interpolate(out,
- paddle.shape(x)[2:],
- mode='bilinear',
- align_corners=self.align_corners))
+ F.interpolate(
+ out,
+ paddle.shape(x)[2:],
+ mode='bilinear',
+ align_corners=self.align_corners))
logit_list.append({'seg': out, 'embed': emb})
else:
logit_list.append(
- F.interpolate(out,
- paddle.shape(x)[2:],
- mode='bilinear',
- align_corners=self.align_corners))
+ F.interpolate(
+ out,
+ paddle.shape(x)[2:],
+ mode='bilinear',
+ align_corners=self.align_corners))
return logit_list
@@ -105,6 +108,7 @@
proj_dim (int, optional): The output dimensions of projection head. Default: 256.
proj (str, optional): The type of projection head, only support 'linear' and 'convmlp'. Default: 'convmlp'.
"""
+
def __init__(self, dim_in, proj_dim=256, proj='convmlp'):
super(ProjectionHead, self).__init__()
if proj == 'linear':
| {"golden_diff": "diff --git a/paddleseg/models/hrnet_contrast.py b/paddleseg/models/hrnet_contrast.py\n--- a/paddleseg/models/hrnet_contrast.py\n+++ b/paddleseg/models/hrnet_contrast.py\n@@ -40,6 +40,7 @@\n e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n pretrained (str, optional): The path or url of pretrained model. Default: None.\n \"\"\"\n+\n def __init__(self,\n in_channels,\n num_classes,\n@@ -54,23 +55,23 @@\n self.num_classes = num_classes\n self.proj_dim = proj_dim\n self.align_corners = align_corners\n- self.pretrained = pretrained\n \n self.cls_head = nn.Sequential(\n- layers.ConvBNReLU(in_channels,\n- in_channels,\n- kernel_size=3,\n- stride=1,\n- padding=1),\n+ layers.ConvBNReLU(\n+ in_channels, in_channels, kernel_size=3, stride=1, padding=1),\n nn.Dropout2D(drop_prob),\n- nn.Conv2D(in_channels,\n- num_classes,\n- kernel_size=1,\n- stride=1,\n- bias_attr=False),\n+ nn.Conv2D(\n+ in_channels,\n+ num_classes,\n+ kernel_size=1,\n+ stride=1,\n+ bias_attr=False),\n )\n- self.proj_head = ProjectionHead(dim_in=in_channels,\n- proj_dim=self.proj_dim)\n+ self.proj_head = ProjectionHead(\n+ dim_in=in_channels, proj_dim=self.proj_dim)\n+\n+ self.pretrained = pretrained\n+ self.init_weight()\n \n def init_weight(self):\n if self.pretrained is not None:\n@@ -83,17 +84,19 @@\n if self.training:\n emb = self.proj_head(feats)\n logit_list.append(\n- F.interpolate(out,\n- paddle.shape(x)[2:],\n- mode='bilinear',\n- align_corners=self.align_corners))\n+ F.interpolate(\n+ out,\n+ paddle.shape(x)[2:],\n+ mode='bilinear',\n+ align_corners=self.align_corners))\n logit_list.append({'seg': out, 'embed': emb})\n else:\n logit_list.append(\n- F.interpolate(out,\n- paddle.shape(x)[2:],\n- mode='bilinear',\n- align_corners=self.align_corners))\n+ F.interpolate(\n+ out,\n+ paddle.shape(x)[2:],\n+ mode='bilinear',\n+ align_corners=self.align_corners))\n return logit_list\n \n \n@@ -105,6 +108,7 @@\n proj_dim (int, optional): The output dimensions of projection head. Default: 256.\n proj (str, optional): The type of projection head, only support 'linear' and 'convmlp'. Default: 'convmlp'.\n \"\"\"\n+\n def __init__(self, dim_in, proj_dim=256, proj='convmlp'):\n super(ProjectionHead, self).__init__()\n if proj == 'linear':\n", "issue": "paddleseg/models/hrnet_contrast.py \u4e2d\u6ca1\u6709\u6267\u884c init_weight\npaddleseg/models/hrnet_contrast.py \u4e2d__init__()\u6ca1\u6709\u6267\u884c init_weight\uff0c\u5bfc\u81f4hrnet_w48_contrast \u6ca1\u6cd5\u52a0\u8f7d\u5b8c\u6574\u7684\u6a21\u578b\n", "before_files": [{"content": "# Copyright (c) 2021 PaddlePaddle Authors. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddleseg.cvlibs import manager\nfrom paddleseg.models import layers\nfrom paddleseg.utils import utils\n\n\[email protected]_component\nclass HRNetW48Contrast(nn.Layer):\n \"\"\"\n The HRNetW48Contrast implementation based on PaddlePaddle.\n\n The original article refers to\n Wenguan Wang, Tianfei Zhou, et al. \"Exploring Cross-Image Pixel Contrast for Semantic Segmentation\"\n (https://arxiv.org/abs/2101.11939).\n\n Args:\n in_channels (int): The output dimensions of backbone.\n num_classes (int): The unique number of target classes.\n backbone (Paddle.nn.Layer): Backbone network, currently support HRNet_W48.\n drop_prob (float): The probability of dropout.\n proj_dim (int): The projection dimensions.\n align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n pretrained (str, optional): The path or url of pretrained model. Default: None.\n \"\"\"\n def __init__(self,\n in_channels,\n num_classes,\n backbone,\n drop_prob,\n proj_dim,\n align_corners=False,\n pretrained=None):\n super().__init__()\n self.in_channels = in_channels\n self.backbone = backbone\n self.num_classes = num_classes\n self.proj_dim = proj_dim\n self.align_corners = align_corners\n self.pretrained = pretrained\n\n self.cls_head = nn.Sequential(\n layers.ConvBNReLU(in_channels,\n in_channels,\n kernel_size=3,\n stride=1,\n padding=1),\n nn.Dropout2D(drop_prob),\n nn.Conv2D(in_channels,\n num_classes,\n kernel_size=1,\n stride=1,\n bias_attr=False),\n )\n self.proj_head = ProjectionHead(dim_in=in_channels,\n proj_dim=self.proj_dim)\n\n def init_weight(self):\n if self.pretrained is not None:\n utils.load_entire_model(self, self.pretrained)\n\n def forward(self, x):\n feats = self.backbone(x)[0]\n out = self.cls_head(feats)\n logit_list = []\n if self.training:\n emb = self.proj_head(feats)\n logit_list.append(\n F.interpolate(out,\n paddle.shape(x)[2:],\n mode='bilinear',\n align_corners=self.align_corners))\n logit_list.append({'seg': out, 'embed': emb})\n else:\n logit_list.append(\n F.interpolate(out,\n paddle.shape(x)[2:],\n mode='bilinear',\n align_corners=self.align_corners))\n return logit_list\n\n\nclass ProjectionHead(nn.Layer):\n \"\"\"\n The projection head used by contrast learning.\n Args:\n dim_in (int): The dimensions of input features.\n proj_dim (int, optional): The output dimensions of projection head. Default: 256.\n proj (str, optional): The type of projection head, only support 'linear' and 'convmlp'. 
Default: 'convmlp'.\n \"\"\"\n def __init__(self, dim_in, proj_dim=256, proj='convmlp'):\n super(ProjectionHead, self).__init__()\n if proj == 'linear':\n self.proj = nn.Conv2D(dim_in, proj_dim, kernel_size=1)\n elif proj == 'convmlp':\n self.proj = nn.Sequential(\n layers.ConvBNReLU(dim_in, dim_in, kernel_size=1),\n nn.Conv2D(dim_in, proj_dim, kernel_size=1),\n )\n else:\n raise ValueError(\n \"The type of project head only support 'linear' and 'convmlp', but got {}.\"\n .format(proj))\n\n def forward(self, x):\n return F.normalize(self.proj(x), p=2, axis=1)\n", "path": "paddleseg/models/hrnet_contrast.py"}], "after_files": [{"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddleseg.cvlibs import manager\nfrom paddleseg.models import layers\nfrom paddleseg.utils import utils\n\n\[email protected]_component\nclass HRNetW48Contrast(nn.Layer):\n \"\"\"\n The HRNetW48Contrast implementation based on PaddlePaddle.\n\n The original article refers to\n Wenguan Wang, Tianfei Zhou, et al. \"Exploring Cross-Image Pixel Contrast for Semantic Segmentation\"\n (https://arxiv.org/abs/2101.11939).\n\n Args:\n in_channels (int): The output dimensions of backbone.\n num_classes (int): The unique number of target classes.\n backbone (Paddle.nn.Layer): Backbone network, currently support HRNet_W48.\n drop_prob (float): The probability of dropout.\n proj_dim (int): The projection dimensions.\n align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n pretrained (str, optional): The path or url of pretrained model. 
Default: None.\n \"\"\"\n\n def __init__(self,\n in_channels,\n num_classes,\n backbone,\n drop_prob,\n proj_dim,\n align_corners=False,\n pretrained=None):\n super().__init__()\n self.in_channels = in_channels\n self.backbone = backbone\n self.num_classes = num_classes\n self.proj_dim = proj_dim\n self.align_corners = align_corners\n\n self.cls_head = nn.Sequential(\n layers.ConvBNReLU(\n in_channels, in_channels, kernel_size=3, stride=1, padding=1),\n nn.Dropout2D(drop_prob),\n nn.Conv2D(\n in_channels,\n num_classes,\n kernel_size=1,\n stride=1,\n bias_attr=False),\n )\n self.proj_head = ProjectionHead(\n dim_in=in_channels, proj_dim=self.proj_dim)\n\n self.pretrained = pretrained\n self.init_weight()\n\n def init_weight(self):\n if self.pretrained is not None:\n utils.load_entire_model(self, self.pretrained)\n\n def forward(self, x):\n feats = self.backbone(x)[0]\n out = self.cls_head(feats)\n logit_list = []\n if self.training:\n emb = self.proj_head(feats)\n logit_list.append(\n F.interpolate(\n out,\n paddle.shape(x)[2:],\n mode='bilinear',\n align_corners=self.align_corners))\n logit_list.append({'seg': out, 'embed': emb})\n else:\n logit_list.append(\n F.interpolate(\n out,\n paddle.shape(x)[2:],\n mode='bilinear',\n align_corners=self.align_corners))\n return logit_list\n\n\nclass ProjectionHead(nn.Layer):\n \"\"\"\n The projection head used by contrast learning.\n Args:\n dim_in (int): The dimensions of input features.\n proj_dim (int, optional): The output dimensions of projection head. Default: 256.\n proj (str, optional): The type of projection head, only support 'linear' and 'convmlp'. Default: 'convmlp'.\n \"\"\"\n\n def __init__(self, dim_in, proj_dim=256, proj='convmlp'):\n super(ProjectionHead, self).__init__()\n if proj == 'linear':\n self.proj = nn.Conv2D(dim_in, proj_dim, kernel_size=1)\n elif proj == 'convmlp':\n self.proj = nn.Sequential(\n layers.ConvBNReLU(dim_in, dim_in, kernel_size=1),\n nn.Conv2D(dim_in, proj_dim, kernel_size=1),\n )\n else:\n raise ValueError(\n \"The type of project head only support 'linear' and 'convmlp', but got {}.\"\n .format(proj))\n\n def forward(self, x):\n return F.normalize(self.proj(x), p=2, axis=1)\n", "path": "paddleseg/models/hrnet_contrast.py"}]} | 1,602 | 701 |
gh_patches_debug_20653 | rasdani/github-patches | git_diff | ipython__ipython-6338 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Implement check_pid for .NET
This is a reminder - I added an `IPython.utils.process.check_pid` function to check if a process exists with a given PID. However, we have a `_process_cli` module containing .NET implementations of these things, and I didn't write the check_pid implementation for .NET.
--- END ISSUE ---
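Editorial note (not part of the original issue): a hedged sketch of what a CLI/.NET `check_pid` might look like under IronPython, using `System.Diagnostics.Process.GetProcessById`, which raises `ArgumentException` when no process with the given id is running; the exact set of exceptions worth catching is an assumption:

```python
import System


def check_pid(pid):
    """Return True if a process with the given PID appears to exist."""
    try:
        System.Diagnostics.Process.GetProcessById(pid)
        return True
    except System.ArgumentException:
        # raised when no running process has this id
        return False
```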
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/utils/process.py`
Content:
```
1 # encoding: utf-8
2 """
3 Utilities for working with external processes.
4 """
5
6 #-----------------------------------------------------------------------------
7 # Copyright (C) 2008-2011 The IPython Development Team
8 #
9 # Distributed under the terms of the BSD License. The full license is in
10 # the file COPYING, distributed as part of this software.
11 #-----------------------------------------------------------------------------
12
13 #-----------------------------------------------------------------------------
14 # Imports
15 #-----------------------------------------------------------------------------
16 from __future__ import print_function
17
18 # Stdlib
19 import os
20 import sys
21
22 # Our own
23 if sys.platform == 'win32':
24 from ._process_win32 import _find_cmd, system, getoutput, arg_split, check_pid
25 elif sys.platform == 'cli':
26 from ._process_cli import _find_cmd, system, getoutput, arg_split
27 else:
28 from ._process_posix import _find_cmd, system, getoutput, arg_split, check_pid
29
30 from ._process_common import getoutputerror, get_output_error_code, process_handler
31 from . import py3compat
32
33 #-----------------------------------------------------------------------------
34 # Code
35 #-----------------------------------------------------------------------------
36
37
38 class FindCmdError(Exception):
39 pass
40
41
42 def find_cmd(cmd):
43 """Find absolute path to executable cmd in a cross platform manner.
44
45 This function tries to determine the full path to a command line program
46 using `which` on Unix/Linux/OS X and `win32api` on Windows. Most of the
47 time it will use the version that is first on the users `PATH`.
48
49 Warning, don't use this to find IPython command line programs as there
50 is a risk you will find the wrong one. Instead find those using the
51 following code and looking for the application itself::
52
53 from IPython.utils.path import get_ipython_module_path
54 from IPython.utils.process import pycmd2argv
55 argv = pycmd2argv(get_ipython_module_path('IPython.terminal.ipapp'))
56
57 Parameters
58 ----------
59 cmd : str
60 The command line program to look for.
61 """
62 try:
63 path = _find_cmd(cmd).rstrip()
64 except OSError:
65 raise FindCmdError('command could not be found: %s' % cmd)
66 # which returns empty if not found
67 if path == '':
68 raise FindCmdError('command could not be found: %s' % cmd)
69 return os.path.abspath(path)
70
71
72 def is_cmd_found(cmd):
73 """Check whether executable `cmd` exists or not and return a bool."""
74 try:
75 find_cmd(cmd)
76 return True
77 except FindCmdError:
78 return False
79
80
81 def pycmd2argv(cmd):
82 r"""Take the path of a python command and return a list (argv-style).
83
84 This only works on Python based command line programs and will find the
85 location of the ``python`` executable using ``sys.executable`` to make
86 sure the right version is used.
87
88 For a given path ``cmd``, this returns [cmd] if cmd's extension is .exe,
89 .com or .bat, and [, cmd] otherwise.
90
91 Parameters
92 ----------
93 cmd : string
94 The path of the command.
95
96 Returns
97 -------
98 argv-style list.
99 """
100 ext = os.path.splitext(cmd)[1]
101 if ext in ['.exe', '.com', '.bat']:
102 return [cmd]
103 else:
104 return [sys.executable, cmd]
105
106
107 def abbrev_cwd():
108 """ Return abbreviated version of cwd, e.g. d:mydir """
109 cwd = py3compat.getcwd().replace('\\','/')
110 drivepart = ''
111 tail = cwd
112 if sys.platform == 'win32':
113 if len(cwd) < 4:
114 return cwd
115 drivepart,tail = os.path.splitdrive(cwd)
116
117
118 parts = tail.split('/')
119 if len(parts) > 2:
120 tail = '/'.join(parts[-2:])
121
122 return (drivepart + (
123 cwd == '/' and '/' or tail))
124
```
Path: `IPython/utils/_process_cli.py`
Content:
```
1 """cli-specific implementation of process utilities.
2
3 cli - Common Language Infrastructure for IronPython. Code
4 can run on any operating system. Check os.name for os-
5 specific settings.
6
7 This file is only meant to be imported by process.py, not by end-users.
8
9 This file is largely untested. To become a full drop-in process
10 interface for IronPython will probably require you to help fill
11 in the details.
12 """
13
14 # Import cli libraries:
15 import clr
16 import System
17
18 # Import Python libraries:
19 import os
20
21 # Import IPython libraries:
22 from IPython.utils import py3compat
23 from ._process_common import arg_split
24
25 def _find_cmd(cmd):
26 """Find the full path to a command using which."""
27 paths = System.Environment.GetEnvironmentVariable("PATH").Split(os.pathsep)
28 for path in paths:
29 filename = os.path.join(path, cmd)
30 if System.IO.File.Exists(filename):
31 return py3compat.bytes_to_str(filename)
32 raise OSError("command %r not found" % cmd)
33
34 def system(cmd):
35 """
36 system(cmd) should work in a cli environment on Mac OSX, Linux,
37 and Windows
38 """
39 psi = System.Diagnostics.ProcessStartInfo(cmd)
40 psi.RedirectStandardOutput = True
41 psi.RedirectStandardError = True
42 psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal
43 psi.UseShellExecute = False
44 # Start up process:
45 reg = System.Diagnostics.Process.Start(psi)
46
47 def getoutput(cmd):
48 """
49 getoutput(cmd) should work in a cli environment on Mac OSX, Linux,
50 and Windows
51 """
52 psi = System.Diagnostics.ProcessStartInfo(cmd)
53 psi.RedirectStandardOutput = True
54 psi.RedirectStandardError = True
55 psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal
56 psi.UseShellExecute = False
57 # Start up process:
58 reg = System.Diagnostics.Process.Start(psi)
59 myOutput = reg.StandardOutput
60 output = myOutput.ReadToEnd()
61 myError = reg.StandardError
62 error = myError.ReadToEnd()
63 return output
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/utils/_process_cli.py b/IPython/utils/_process_cli.py
--- a/IPython/utils/_process_cli.py
+++ b/IPython/utils/_process_cli.py
@@ -61,3 +61,18 @@
myError = reg.StandardError
error = myError.ReadToEnd()
return output
+
+def check_pid(pid):
+ """
+ Check if a process with the given PID (pid) exists
+ """
+ try:
+ System.Diagnostics.Process.GetProcessById(pid)
+ # process with given pid is running
+ return True
+ except System.InvalidOperationException:
+ # process wasn't started by this object (but is running)
+ return True
+ except System.ArgumentException:
+ # process with given pid isn't running
+ return False
diff --git a/IPython/utils/process.py b/IPython/utils/process.py
--- a/IPython/utils/process.py
+++ b/IPython/utils/process.py
@@ -23,7 +23,7 @@
if sys.platform == 'win32':
from ._process_win32 import _find_cmd, system, getoutput, arg_split, check_pid
elif sys.platform == 'cli':
- from ._process_cli import _find_cmd, system, getoutput, arg_split
+ from ._process_cli import _find_cmd, system, getoutput, arg_split, check_pid
else:
from ._process_posix import _find_cmd, system, getoutput, arg_split, check_pid
| {"golden_diff": "diff --git a/IPython/utils/_process_cli.py b/IPython/utils/_process_cli.py\n--- a/IPython/utils/_process_cli.py\n+++ b/IPython/utils/_process_cli.py\n@@ -61,3 +61,18 @@\n myError = reg.StandardError\n error = myError.ReadToEnd()\n return output\n+\n+def check_pid(pid):\n+ \"\"\"\n+ Check if a process with the given PID (pid) exists\n+ \"\"\"\n+ try:\n+ System.Diagnostics.Process.GetProcessById(pid)\n+ # process with given pid is running\n+ return True\n+ except System.InvalidOperationException:\n+ # process wasn't started by this object (but is running)\n+ return True\n+ except System.ArgumentException:\n+ # process with given pid isn't running\n+ return False \ndiff --git a/IPython/utils/process.py b/IPython/utils/process.py\n--- a/IPython/utils/process.py\n+++ b/IPython/utils/process.py\n@@ -23,7 +23,7 @@\n if sys.platform == 'win32':\n from ._process_win32 import _find_cmd, system, getoutput, arg_split, check_pid\n elif sys.platform == 'cli':\n- from ._process_cli import _find_cmd, system, getoutput, arg_split\n+ from ._process_cli import _find_cmd, system, getoutput, arg_split, check_pid\n else:\n from ._process_posix import _find_cmd, system, getoutput, arg_split, check_pid\n", "issue": "Implement check_pid for .NET\nThis is a reminder - I added an `IPython.utils.process.check_pid` function to check if a process exists with a given PID. However, we have a `_process_cli` module containing .NET implementations of these things, and I didn't write the check_pid implementation for .NET.\n\n", "before_files": [{"content": "# encoding: utf-8\n\"\"\"\nUtilities for working with external processes.\n\"\"\"\n\n#-----------------------------------------------------------------------------\n# Copyright (C) 2008-2011 The IPython Development Team\n#\n# Distributed under the terms of the BSD License. The full license is in\n# the file COPYING, distributed as part of this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\nfrom __future__ import print_function\n\n# Stdlib\nimport os\nimport sys\n\n# Our own\nif sys.platform == 'win32':\n from ._process_win32 import _find_cmd, system, getoutput, arg_split, check_pid\nelif sys.platform == 'cli':\n from ._process_cli import _find_cmd, system, getoutput, arg_split\nelse:\n from ._process_posix import _find_cmd, system, getoutput, arg_split, check_pid\n\nfrom ._process_common import getoutputerror, get_output_error_code, process_handler\nfrom . import py3compat\n\n#-----------------------------------------------------------------------------\n# Code\n#-----------------------------------------------------------------------------\n\n\nclass FindCmdError(Exception):\n pass\n\n\ndef find_cmd(cmd):\n \"\"\"Find absolute path to executable cmd in a cross platform manner.\n\n This function tries to determine the full path to a command line program\n using `which` on Unix/Linux/OS X and `win32api` on Windows. Most of the\n time it will use the version that is first on the users `PATH`.\n\n Warning, don't use this to find IPython command line programs as there\n is a risk you will find the wrong one. 
Instead find those using the\n following code and looking for the application itself::\n\n from IPython.utils.path import get_ipython_module_path\n from IPython.utils.process import pycmd2argv\n argv = pycmd2argv(get_ipython_module_path('IPython.terminal.ipapp'))\n\n Parameters\n ----------\n cmd : str\n The command line program to look for.\n \"\"\"\n try:\n path = _find_cmd(cmd).rstrip()\n except OSError:\n raise FindCmdError('command could not be found: %s' % cmd)\n # which returns empty if not found\n if path == '':\n raise FindCmdError('command could not be found: %s' % cmd)\n return os.path.abspath(path)\n\n\ndef is_cmd_found(cmd):\n \"\"\"Check whether executable `cmd` exists or not and return a bool.\"\"\"\n try:\n find_cmd(cmd)\n return True\n except FindCmdError:\n return False\n\n\ndef pycmd2argv(cmd):\n r\"\"\"Take the path of a python command and return a list (argv-style).\n\n This only works on Python based command line programs and will find the\n location of the ``python`` executable using ``sys.executable`` to make\n sure the right version is used.\n\n For a given path ``cmd``, this returns [cmd] if cmd's extension is .exe,\n .com or .bat, and [, cmd] otherwise.\n\n Parameters\n ----------\n cmd : string\n The path of the command.\n\n Returns\n -------\n argv-style list.\n \"\"\"\n ext = os.path.splitext(cmd)[1]\n if ext in ['.exe', '.com', '.bat']:\n return [cmd]\n else:\n return [sys.executable, cmd]\n\n\ndef abbrev_cwd():\n \"\"\" Return abbreviated version of cwd, e.g. d:mydir \"\"\"\n cwd = py3compat.getcwd().replace('\\\\','/')\n drivepart = ''\n tail = cwd\n if sys.platform == 'win32':\n if len(cwd) < 4:\n return cwd\n drivepart,tail = os.path.splitdrive(cwd)\n\n\n parts = tail.split('/')\n if len(parts) > 2:\n tail = '/'.join(parts[-2:])\n\n return (drivepart + (\n cwd == '/' and '/' or tail))\n", "path": "IPython/utils/process.py"}, {"content": "\"\"\"cli-specific implementation of process utilities.\n\ncli - Common Language Infrastructure for IronPython. Code\n can run on any operating system. Check os.name for os-\n specific settings.\n\nThis file is only meant to be imported by process.py, not by end-users.\n\nThis file is largely untested. To become a full drop-in process\ninterface for IronPython will probably require you to help fill\nin the details. 
\n\"\"\"\n\n# Import cli libraries:\nimport clr\nimport System\n\n# Import Python libraries:\nimport os\n\n# Import IPython libraries:\nfrom IPython.utils import py3compat\nfrom ._process_common import arg_split\n\ndef _find_cmd(cmd):\n \"\"\"Find the full path to a command using which.\"\"\"\n paths = System.Environment.GetEnvironmentVariable(\"PATH\").Split(os.pathsep)\n for path in paths:\n filename = os.path.join(path, cmd)\n if System.IO.File.Exists(filename):\n return py3compat.bytes_to_str(filename)\n raise OSError(\"command %r not found\" % cmd)\n\ndef system(cmd):\n \"\"\"\n system(cmd) should work in a cli environment on Mac OSX, Linux,\n and Windows\n \"\"\"\n psi = System.Diagnostics.ProcessStartInfo(cmd)\n psi.RedirectStandardOutput = True\n psi.RedirectStandardError = True\n psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal\n psi.UseShellExecute = False\n # Start up process:\n reg = System.Diagnostics.Process.Start(psi)\n\ndef getoutput(cmd):\n \"\"\"\n getoutput(cmd) should work in a cli environment on Mac OSX, Linux,\n and Windows\n \"\"\"\n psi = System.Diagnostics.ProcessStartInfo(cmd)\n psi.RedirectStandardOutput = True\n psi.RedirectStandardError = True\n psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal\n psi.UseShellExecute = False\n # Start up process:\n reg = System.Diagnostics.Process.Start(psi)\n myOutput = reg.StandardOutput\n output = myOutput.ReadToEnd()\n myError = reg.StandardError\n error = myError.ReadToEnd()\n return output\n", "path": "IPython/utils/_process_cli.py"}], "after_files": [{"content": "# encoding: utf-8\n\"\"\"\nUtilities for working with external processes.\n\"\"\"\n\n#-----------------------------------------------------------------------------\n# Copyright (C) 2008-2011 The IPython Development Team\n#\n# Distributed under the terms of the BSD License. The full license is in\n# the file COPYING, distributed as part of this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\nfrom __future__ import print_function\n\n# Stdlib\nimport os\nimport sys\n\n# Our own\nif sys.platform == 'win32':\n from ._process_win32 import _find_cmd, system, getoutput, arg_split, check_pid\nelif sys.platform == 'cli':\n from ._process_cli import _find_cmd, system, getoutput, arg_split, check_pid\nelse:\n from ._process_posix import _find_cmd, system, getoutput, arg_split, check_pid\n\nfrom ._process_common import getoutputerror, get_output_error_code, process_handler\nfrom . import py3compat\n\n#-----------------------------------------------------------------------------\n# Code\n#-----------------------------------------------------------------------------\n\n\nclass FindCmdError(Exception):\n pass\n\n\ndef find_cmd(cmd):\n \"\"\"Find absolute path to executable cmd in a cross platform manner.\n\n This function tries to determine the full path to a command line program\n using `which` on Unix/Linux/OS X and `win32api` on Windows. Most of the\n time it will use the version that is first on the users `PATH`.\n\n Warning, don't use this to find IPython command line programs as there\n is a risk you will find the wrong one. 
Instead find those using the\n following code and looking for the application itself::\n\n from IPython.utils.path import get_ipython_module_path\n from IPython.utils.process import pycmd2argv\n argv = pycmd2argv(get_ipython_module_path('IPython.terminal.ipapp'))\n\n Parameters\n ----------\n cmd : str\n The command line program to look for.\n \"\"\"\n try:\n path = _find_cmd(cmd).rstrip()\n except OSError:\n raise FindCmdError('command could not be found: %s' % cmd)\n # which returns empty if not found\n if path == '':\n raise FindCmdError('command could not be found: %s' % cmd)\n return os.path.abspath(path)\n\n\ndef is_cmd_found(cmd):\n \"\"\"Check whether executable `cmd` exists or not and return a bool.\"\"\"\n try:\n find_cmd(cmd)\n return True\n except FindCmdError:\n return False\n\n\ndef pycmd2argv(cmd):\n r\"\"\"Take the path of a python command and return a list (argv-style).\n\n This only works on Python based command line programs and will find the\n location of the ``python`` executable using ``sys.executable`` to make\n sure the right version is used.\n\n For a given path ``cmd``, this returns [cmd] if cmd's extension is .exe,\n .com or .bat, and [, cmd] otherwise.\n\n Parameters\n ----------\n cmd : string\n The path of the command.\n\n Returns\n -------\n argv-style list.\n \"\"\"\n ext = os.path.splitext(cmd)[1]\n if ext in ['.exe', '.com', '.bat']:\n return [cmd]\n else:\n return [sys.executable, cmd]\n\n\ndef abbrev_cwd():\n \"\"\" Return abbreviated version of cwd, e.g. d:mydir \"\"\"\n cwd = py3compat.getcwd().replace('\\\\','/')\n drivepart = ''\n tail = cwd\n if sys.platform == 'win32':\n if len(cwd) < 4:\n return cwd\n drivepart,tail = os.path.splitdrive(cwd)\n\n\n parts = tail.split('/')\n if len(parts) > 2:\n tail = '/'.join(parts[-2:])\n\n return (drivepart + (\n cwd == '/' and '/' or tail))\n", "path": "IPython/utils/process.py"}, {"content": "\"\"\"cli-specific implementation of process utilities.\n\ncli - Common Language Infrastructure for IronPython. Code\n can run on any operating system. Check os.name for os-\n specific settings.\n\nThis file is only meant to be imported by process.py, not by end-users.\n\nThis file is largely untested. To become a full drop-in process\ninterface for IronPython will probably require you to help fill\nin the details. 
\n\"\"\"\n\n# Import cli libraries:\nimport clr\nimport System\n\n# Import Python libraries:\nimport os\n\n# Import IPython libraries:\nfrom IPython.utils import py3compat\nfrom ._process_common import arg_split\n\ndef _find_cmd(cmd):\n \"\"\"Find the full path to a command using which.\"\"\"\n paths = System.Environment.GetEnvironmentVariable(\"PATH\").Split(os.pathsep)\n for path in paths:\n filename = os.path.join(path, cmd)\n if System.IO.File.Exists(filename):\n return py3compat.bytes_to_str(filename)\n raise OSError(\"command %r not found\" % cmd)\n\ndef system(cmd):\n \"\"\"\n system(cmd) should work in a cli environment on Mac OSX, Linux,\n and Windows\n \"\"\"\n psi = System.Diagnostics.ProcessStartInfo(cmd)\n psi.RedirectStandardOutput = True\n psi.RedirectStandardError = True\n psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal\n psi.UseShellExecute = False\n # Start up process:\n reg = System.Diagnostics.Process.Start(psi)\n\ndef getoutput(cmd):\n \"\"\"\n getoutput(cmd) should work in a cli environment on Mac OSX, Linux,\n and Windows\n \"\"\"\n psi = System.Diagnostics.ProcessStartInfo(cmd)\n psi.RedirectStandardOutput = True\n psi.RedirectStandardError = True\n psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Normal\n psi.UseShellExecute = False\n # Start up process:\n reg = System.Diagnostics.Process.Start(psi)\n myOutput = reg.StandardOutput\n output = myOutput.ReadToEnd()\n myError = reg.StandardError\n error = myError.ReadToEnd()\n return output\n\ndef check_pid(pid):\n \"\"\"\n Check if a process with the given PID (pid) exists\n \"\"\"\n try:\n System.Diagnostics.Process.GetProcessById(pid)\n # process with given pid is running\n return True\n except System.InvalidOperationException:\n # process wasn't started by this object (but is running)\n return True\n except System.ArgumentException:\n # process with given pid isn't running\n return False \n", "path": "IPython/utils/_process_cli.py"}]} | 2,016 | 327 |
gh_patches_debug_1640 | rasdani/github-patches | git_diff | scikit-image__scikit-image-1430 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
measure.label is documented under morphology.label
In the [measure API reference](http://scikit-image.org/docs/stable/api/skimage.measure.html) label is not documented, but it is [documented under the morphology module](http://scikit-image.org/docs/stable/api/skimage.morphology.html#label) (which is deprecated).
--- END ISSUE ---
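Editorial note (not part of the original issue): the function itself already lives in `skimage.measure`; only its documentation sits on the deprecated morphology page. A minimal usage sketch (the input array is arbitrary):

```python
import numpy as np
from skimage import measure

image = np.array([[0, 1, 1],
                  [0, 0, 1],
                  [1, 0, 0]])
labels = measure.label(image)  # connected-component labelling via the measure namespace
```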
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skimage/measure/__init__.py`
Content:
```
1 from ._find_contours import find_contours
2 from ._marching_cubes import (marching_cubes, mesh_surface_area,
3 correct_mesh_orientation)
4 from ._regionprops import regionprops, perimeter
5 from ._structural_similarity import structural_similarity
6 from ._polygon import approximate_polygon, subdivide_polygon
7 from ._pnpoly import points_in_poly, grid_points_in_poly
8 from ._moments import moments, moments_central, moments_normalized, moments_hu
9 from .profile import profile_line
10 from .fit import LineModel, CircleModel, EllipseModel, ransac
11 from .block import block_reduce
12 from ._ccomp import label
13
14
15 __all__ = ['find_contours',
16 'regionprops',
17 'perimeter',
18 'structural_similarity',
19 'approximate_polygon',
20 'subdivide_polygon',
21 'LineModel',
22 'CircleModel',
23 'EllipseModel',
24 'ransac',
25 'block_reduce',
26 'moments',
27 'moments_central',
28 'moments_normalized',
29 'moments_hu',
30 'marching_cubes',
31 'mesh_surface_area',
32 'correct_mesh_orientation',
33 'profile_line',
34 'label',
35 'points_in_poly',
36 'grid_points_in_poly']
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/skimage/measure/__init__.py b/skimage/measure/__init__.py
--- a/skimage/measure/__init__.py
+++ b/skimage/measure/__init__.py
@@ -9,7 +9,7 @@
from .profile import profile_line
from .fit import LineModel, CircleModel, EllipseModel, ransac
from .block import block_reduce
-from ._ccomp import label
+from ._label import label
__all__ = ['find_contours',
| {"golden_diff": "diff --git a/skimage/measure/__init__.py b/skimage/measure/__init__.py\n--- a/skimage/measure/__init__.py\n+++ b/skimage/measure/__init__.py\n@@ -9,7 +9,7 @@\n from .profile import profile_line\n from .fit import LineModel, CircleModel, EllipseModel, ransac\n from .block import block_reduce\n-from ._ccomp import label\n+from ._label import label\n \n \n __all__ = ['find_contours',\n", "issue": "measure.label is documented under morphology.label\nIn the [measure API reference](http://scikit-image.org/docs/stable/api/skimage.measure.html) label is not documented, but it is [documented under morphology module](http://scikit-image.org/docs/stable/api/skimage.morphology.html#label) (which is depreciated).\n\n", "before_files": [{"content": "from ._find_contours import find_contours\nfrom ._marching_cubes import (marching_cubes, mesh_surface_area,\n correct_mesh_orientation)\nfrom ._regionprops import regionprops, perimeter\nfrom ._structural_similarity import structural_similarity\nfrom ._polygon import approximate_polygon, subdivide_polygon\nfrom ._pnpoly import points_in_poly, grid_points_in_poly\nfrom ._moments import moments, moments_central, moments_normalized, moments_hu\nfrom .profile import profile_line\nfrom .fit import LineModel, CircleModel, EllipseModel, ransac\nfrom .block import block_reduce\nfrom ._ccomp import label\n\n\n__all__ = ['find_contours',\n 'regionprops',\n 'perimeter',\n 'structural_similarity',\n 'approximate_polygon',\n 'subdivide_polygon',\n 'LineModel',\n 'CircleModel',\n 'EllipseModel',\n 'ransac',\n 'block_reduce',\n 'moments',\n 'moments_central',\n 'moments_normalized',\n 'moments_hu',\n 'marching_cubes',\n 'mesh_surface_area',\n 'correct_mesh_orientation',\n 'profile_line',\n 'label',\n 'points_in_poly',\n 'grid_points_in_poly']\n", "path": "skimage/measure/__init__.py"}], "after_files": [{"content": "from ._find_contours import find_contours\nfrom ._marching_cubes import (marching_cubes, mesh_surface_area,\n correct_mesh_orientation)\nfrom ._regionprops import regionprops, perimeter\nfrom ._structural_similarity import structural_similarity\nfrom ._polygon import approximate_polygon, subdivide_polygon\nfrom ._pnpoly import points_in_poly, grid_points_in_poly\nfrom ._moments import moments, moments_central, moments_normalized, moments_hu\nfrom .profile import profile_line\nfrom .fit import LineModel, CircleModel, EllipseModel, ransac\nfrom .block import block_reduce\nfrom ._label import label\n\n\n__all__ = ['find_contours',\n 'regionprops',\n 'perimeter',\n 'structural_similarity',\n 'approximate_polygon',\n 'subdivide_polygon',\n 'LineModel',\n 'CircleModel',\n 'EllipseModel',\n 'ransac',\n 'block_reduce',\n 'moments',\n 'moments_central',\n 'moments_normalized',\n 'moments_hu',\n 'marching_cubes',\n 'mesh_surface_area',\n 'correct_mesh_orientation',\n 'profile_line',\n 'label',\n 'points_in_poly',\n 'grid_points_in_poly']\n", "path": "skimage/measure/__init__.py"}]} | 661 | 114 |
gh_patches_debug_128 | rasdani/github-patches | git_diff | opsdroid__opsdroid-28 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Regex case sensitive
The regex match is currently case insensitive. It shouldn't be.
https://github.com/opsdroid/opsdroid/blob/master/opsdroid/helper.py#L30
--- END ISSUE ---
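To make the reported behaviour concrete (illustration only, not part of the original issue): the `re.I` flag makes the pattern match regardless of case, which is what the helper currently does.

```python
import re

print(bool(re.match("hello", "HELLO world", re.M | re.I)))  # True, current behaviour
print(bool(re.match("hello", "HELLO world")))               # False, case-sensitive as requested
```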
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/helper.py`
Content:
```
1 """Helper functions to use within OpsDroid."""
2
3 import logging
4 import re
5
6
7 def set_logging_level(logging_level):
8 """Set the logger level based on the user configuration."""
9 logger = logging.getLogger()
10 if logging_level == 'critical':
11 logger.setLevel(logging.CRITICAL)
12 elif logging_level == 'error':
13 logger.setLevel(logging.ERROR)
14 elif logging_level == 'warning':
15 logger.setLevel(logging.WARNING)
16 elif logging_level == 'info':
17 logger.setLevel(logging.INFO)
18 elif logging_level == 'debug':
19 logger.setLevel(logging.DEBUG)
20 # No need to log the others as they'll never be seen
21 logging.debug("Set log level to debug")
22 else:
23 logger.setLevel(logging.INFO)
24 logging.warning("Log level '" + logging_level +
25 "' unknown, defaulting to 'info'")
26
27
28 def match(regex, message):
29 """Regex match a string."""
30 return re.match(regex, message, re.M | re.I)
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/opsdroid/helper.py b/opsdroid/helper.py
--- a/opsdroid/helper.py
+++ b/opsdroid/helper.py
@@ -27,4 +27,4 @@
def match(regex, message):
"""Regex match a string."""
- return re.match(regex, message, re.M | re.I)
+ return re.match(regex, message)
| {"golden_diff": "diff --git a/opsdroid/helper.py b/opsdroid/helper.py\n--- a/opsdroid/helper.py\n+++ b/opsdroid/helper.py\n@@ -27,4 +27,4 @@\n \n def match(regex, message):\n \"\"\"Regex match a string.\"\"\"\n- return re.match(regex, message, re.M | re.I)\n+ return re.match(regex, message)\n", "issue": "Regex case sensitive\nThe regex match is currently case insensitive. It shouldn't be.\n\nhttps://github.com/opsdroid/opsdroid/blob/master/opsdroid/helper.py#L30\n\n", "before_files": [{"content": "\"\"\"Helper functions to use within OpsDroid.\"\"\"\n\nimport logging\nimport re\n\n\ndef set_logging_level(logging_level):\n \"\"\"Set the logger level based on the user configuration.\"\"\"\n logger = logging.getLogger()\n if logging_level == 'critical':\n logger.setLevel(logging.CRITICAL)\n elif logging_level == 'error':\n logger.setLevel(logging.ERROR)\n elif logging_level == 'warning':\n logger.setLevel(logging.WARNING)\n elif logging_level == 'info':\n logger.setLevel(logging.INFO)\n elif logging_level == 'debug':\n logger.setLevel(logging.DEBUG)\n # No need to log the others as they'll never be seen\n logging.debug(\"Set log level to debug\")\n else:\n logger.setLevel(logging.INFO)\n logging.warning(\"Log level '\" + logging_level +\n \"' unknown, defaulting to 'info'\")\n\n\ndef match(regex, message):\n \"\"\"Regex match a string.\"\"\"\n return re.match(regex, message, re.M | re.I)\n", "path": "opsdroid/helper.py"}], "after_files": [{"content": "\"\"\"Helper functions to use within OpsDroid.\"\"\"\n\nimport logging\nimport re\n\n\ndef set_logging_level(logging_level):\n \"\"\"Set the logger level based on the user configuration.\"\"\"\n logger = logging.getLogger()\n if logging_level == 'critical':\n logger.setLevel(logging.CRITICAL)\n elif logging_level == 'error':\n logger.setLevel(logging.ERROR)\n elif logging_level == 'warning':\n logger.setLevel(logging.WARNING)\n elif logging_level == 'info':\n logger.setLevel(logging.INFO)\n elif logging_level == 'debug':\n logger.setLevel(logging.DEBUG)\n # No need to log the others as they'll never be seen\n logging.debug(\"Set log level to debug\")\n else:\n logger.setLevel(logging.INFO)\n logging.warning(\"Log level '\" + logging_level +\n \"' unknown, defaulting to 'info'\")\n\n\ndef match(regex, message):\n \"\"\"Regex match a string.\"\"\"\n return re.match(regex, message)\n", "path": "opsdroid/helper.py"}]} | 553 | 86 |
gh_patches_debug_24940 | rasdani/github-patches | git_diff | HypothesisWorks__hypothesis-1548 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Log message hiding interferes with pytest log capture fixture.
The log message hiding merged in #1535 interferes with pytest's log capturing. 
In the example below I am using the [caplog fixture](https://docs.pytest.org/en/latest/logging.html#caplog-fixture) to capture the log messages and test that they are correct.
In this simple case it just asserts the number of messages. With 3.69.11 this results in an empty record list on the first call, but it works as expected with 3.69.10.
```python
import logging
from hypothesis import given, strategies
@given(x=strategies.floats(min_value=-3, max_value=3))
def test_logcapture(x, caplog):
with caplog.at_level(logging.CRITICAL, logger='root'):
caplog.clear()
logging.critical(f"x is {x}")
assert len(caplog.records) == 1
```
--- END ISSUE ---
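Editorial note (not part of the original issue): `BuildContext.__enter__` in the file listed below calls `logging.disable(logging.CRITICAL)`, which suppresses records at CRITICAL and below, so the handler installed by pytest's `caplog` never receives them. A minimal standalone illustration:

```python
import logging

logging.disable(logging.CRITICAL)  # mutes every standard level, CRITICAL included
logging.critical("invisible")      # no handler, including caplog's, will see this record
logging.disable(logging.NOTSET)    # restore normal logging
```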
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hypothesis-python/src/hypothesis/control.py`
Content:
```
1 # coding=utf-8
2 #
3 # This file is part of Hypothesis, which may be found at
4 # https://github.com/HypothesisWorks/hypothesis-python
5 #
6 # Most of this work is copyright (C) 2013-2018 David R. MacIver
7 # ([email protected]), but it contains contributions by others. See
8 # CONTRIBUTING.rst for a full list of people who may hold copyright, and
9 # consult the git log if you need to determine who owns an individual
10 # contribution.
11 #
12 # This Source Code Form is subject to the terms of the Mozilla Public License,
13 # v. 2.0. If a copy of the MPL was not distributed with this file, You can
14 # obtain one at http://mozilla.org/MPL/2.0/.
15 #
16 # END HEADER
17
18 from __future__ import division, print_function, absolute_import
19
20 import logging
21 import traceback
22
23 from hypothesis import Verbosity, settings
24 from hypothesis.errors import CleanupFailed, InvalidArgument, \
25 UnsatisfiedAssumption
26 from hypothesis.reporting import report
27 from hypothesis.utils.dynamicvariables import DynamicVariable
28
29 if False:
30 from typing import Any, AnyStr # noqa
31
32
33 def reject():
34 raise UnsatisfiedAssumption()
35
36
37 def assume(condition):
38 # type: (Any) -> bool
39 """Calling ``assume`` is like an :ref:`assert <python:assert>` that marks
40 the example as bad, rather than failing the test.
41
42 This allows you to specify properties that you *assume* will be
43 true, and let Hypothesis try to avoid similar examples in future.
44 """
45 if not condition:
46 raise UnsatisfiedAssumption()
47 return True
48
49
50 _current_build_context = DynamicVariable(None)
51
52
53 def current_build_context():
54 context = _current_build_context.value
55 if context is None:
56 raise InvalidArgument(
57 u'No build context registered')
58 return context
59
60
61 class BuildContext(object):
62
63 def __init__(self, data, is_final=False, close_on_capture=True):
64 self.data = data
65 self.tasks = []
66 self.is_final = is_final
67 self.close_on_capture = close_on_capture
68 self.close_on_del = False
69 self.notes = []
70 self.original_logging_disable = logging.NOTSET
71
72 def __enter__(self):
73 if not self.is_final:
74 self.original_logging_disable = logging.root.manager.disable
75 logging.disable(logging.CRITICAL)
76 self.assign_variable = _current_build_context.with_value(self)
77 self.assign_variable.__enter__()
78 return self
79
80 def __exit__(self, exc_type, exc_value, tb):
81 self.assign_variable.__exit__(exc_type, exc_value, tb)
82 if not self.is_final:
83 logging.disable(self.original_logging_disable)
84 if self.close() and exc_type is None:
85 raise CleanupFailed()
86
87 def local(self):
88 return _current_build_context.with_value(self)
89
90 def close(self):
91 any_failed = False
92 for task in self.tasks:
93 try:
94 task()
95 except BaseException:
96 any_failed = True
97 report(traceback.format_exc())
98 return any_failed
99
100
101 def cleanup(teardown):
102 """Register a function to be called when the current test has finished
103 executing. Any exceptions thrown in teardown will be printed but not
104 rethrown.
105
106 Inside a test this isn't very interesting, because you can just use
107 a finally block, but note that you can use this inside map, flatmap,
108 etc. in order to e.g. insist that a value is closed at the end.
109 """
110 context = _current_build_context.value
111 if context is None:
112 raise InvalidArgument(
113 u'Cannot register cleanup outside of build context')
114 context.tasks.append(teardown)
115
116
117 def note(value):
118 # type: (AnyStr) -> None
119 """Report this value in the final execution."""
120 context = _current_build_context.value
121 if context is None:
122 raise InvalidArgument(
123 'Cannot make notes outside of a test')
124 context.notes.append(value)
125 if context.is_final or settings.default.verbosity >= Verbosity.verbose:
126 report(value)
127
128
129 def event(value):
130 # type: (AnyStr) -> None
131 """Record an event that occurred this test. Statistics on number of test
132 runs with each event will be reported at the end if you run Hypothesis in
133 statistics reporting mode.
134
135 Events should be strings or convertible to them.
136 """
137 context = _current_build_context.value
138 if context is None:
139 raise InvalidArgument(
140 'Cannot make record events outside of a test')
141
142 if context.data is not None:
143 context.data.note_event(value)
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hypothesis-python/src/hypothesis/control.py b/hypothesis-python/src/hypothesis/control.py
--- a/hypothesis-python/src/hypothesis/control.py
+++ b/hypothesis-python/src/hypothesis/control.py
@@ -17,7 +17,6 @@
from __future__ import division, print_function, absolute_import
-import logging
import traceback
from hypothesis import Verbosity, settings
@@ -67,20 +66,14 @@
self.close_on_capture = close_on_capture
self.close_on_del = False
self.notes = []
- self.original_logging_disable = logging.NOTSET
def __enter__(self):
- if not self.is_final:
- self.original_logging_disable = logging.root.manager.disable
- logging.disable(logging.CRITICAL)
self.assign_variable = _current_build_context.with_value(self)
self.assign_variable.__enter__()
return self
def __exit__(self, exc_type, exc_value, tb):
self.assign_variable.__exit__(exc_type, exc_value, tb)
- if not self.is_final:
- logging.disable(self.original_logging_disable)
if self.close() and exc_type is None:
raise CleanupFailed()
| {"golden_diff": "diff --git a/hypothesis-python/src/hypothesis/control.py b/hypothesis-python/src/hypothesis/control.py\n--- a/hypothesis-python/src/hypothesis/control.py\n+++ b/hypothesis-python/src/hypothesis/control.py\n@@ -17,7 +17,6 @@\n \n from __future__ import division, print_function, absolute_import\n \n-import logging\n import traceback\n \n from hypothesis import Verbosity, settings\n@@ -67,20 +66,14 @@\n self.close_on_capture = close_on_capture\n self.close_on_del = False\n self.notes = []\n- self.original_logging_disable = logging.NOTSET\n \n def __enter__(self):\n- if not self.is_final:\n- self.original_logging_disable = logging.root.manager.disable\n- logging.disable(logging.CRITICAL)\n self.assign_variable = _current_build_context.with_value(self)\n self.assign_variable.__enter__()\n return self\n \n def __exit__(self, exc_type, exc_value, tb):\n self.assign_variable.__exit__(exc_type, exc_value, tb)\n- if not self.is_final:\n- logging.disable(self.original_logging_disable)\n if self.close() and exc_type is None:\n raise CleanupFailed()\n", "issue": "Log message hiding interfers with pytest log caputure fixture.\nThe log message hiding merged in #1535 interferes with pytests log capturing. \r\n\r\nIn the example below I am using the [caplog fixture](https://docs.pytest.org/en/latest/logging.html#caplog-fixture) to capture the log message and test that they are correct.\r\nIn this simple case just asserting the number of messages. This results in an empty record list in the first call with 3.69.11 but works as expected with 3.69.10\r\n\r\n```python\r\nimport logging\r\nfrom hypothesis import given, strategies\r\n\r\n@given(x=strategies.floats(min_value=-3, max_value=3))\r\ndef test_logcapture(x, caplog):\r\n\r\n with caplog.at_level(logging.CRITICAL, logger='root'):\r\n caplog.clear()\r\n logging.critical(f\"x is {x}\")\r\n assert len(caplog.records) == 1 \r\n```\n", "before_files": [{"content": "# coding=utf-8\n#\n# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis-python\n#\n# Most of this work is copyright (C) 2013-2018 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at http://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom __future__ import division, print_function, absolute_import\n\nimport logging\nimport traceback\n\nfrom hypothesis import Verbosity, settings\nfrom hypothesis.errors import CleanupFailed, InvalidArgument, \\\n UnsatisfiedAssumption\nfrom hypothesis.reporting import report\nfrom hypothesis.utils.dynamicvariables import DynamicVariable\n\nif False:\n from typing import Any, AnyStr # noqa\n\n\ndef reject():\n raise UnsatisfiedAssumption()\n\n\ndef assume(condition):\n # type: (Any) -> bool\n \"\"\"Calling ``assume`` is like an :ref:`assert <python:assert>` that marks\n the example as bad, rather than failing the test.\n\n This allows you to specify properties that you *assume* will be\n true, and let Hypothesis try to avoid similar examples in future.\n \"\"\"\n if not condition:\n raise UnsatisfiedAssumption()\n return True\n\n\n_current_build_context = DynamicVariable(None)\n\n\ndef current_build_context():\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n u'No build context registered')\n return context\n\n\nclass BuildContext(object):\n\n def __init__(self, data, is_final=False, close_on_capture=True):\n self.data = data\n self.tasks = []\n self.is_final = is_final\n self.close_on_capture = close_on_capture\n self.close_on_del = False\n self.notes = []\n self.original_logging_disable = logging.NOTSET\n\n def __enter__(self):\n if not self.is_final:\n self.original_logging_disable = logging.root.manager.disable\n logging.disable(logging.CRITICAL)\n self.assign_variable = _current_build_context.with_value(self)\n self.assign_variable.__enter__()\n return self\n\n def __exit__(self, exc_type, exc_value, tb):\n self.assign_variable.__exit__(exc_type, exc_value, tb)\n if not self.is_final:\n logging.disable(self.original_logging_disable)\n if self.close() and exc_type is None:\n raise CleanupFailed()\n\n def local(self):\n return _current_build_context.with_value(self)\n\n def close(self):\n any_failed = False\n for task in self.tasks:\n try:\n task()\n except BaseException:\n any_failed = True\n report(traceback.format_exc())\n return any_failed\n\n\ndef cleanup(teardown):\n \"\"\"Register a function to be called when the current test has finished\n executing. Any exceptions thrown in teardown will be printed but not\n rethrown.\n\n Inside a test this isn't very interesting, because you can just use\n a finally block, but note that you can use this inside map, flatmap,\n etc. in order to e.g. insist that a value is closed at the end.\n \"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n u'Cannot register cleanup outside of build context')\n context.tasks.append(teardown)\n\n\ndef note(value):\n # type: (AnyStr) -> None\n \"\"\"Report this value in the final execution.\"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n 'Cannot make notes outside of a test')\n context.notes.append(value)\n if context.is_final or settings.default.verbosity >= Verbosity.verbose:\n report(value)\n\n\ndef event(value):\n # type: (AnyStr) -> None\n \"\"\"Record an event that occurred this test. 
Statistics on number of test\n runs with each event will be reported at the end if you run Hypothesis in\n statistics reporting mode.\n\n Events should be strings or convertible to them.\n \"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n 'Cannot make record events outside of a test')\n\n if context.data is not None:\n context.data.note_event(value)\n", "path": "hypothesis-python/src/hypothesis/control.py"}], "after_files": [{"content": "# coding=utf-8\n#\n# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis-python\n#\n# Most of this work is copyright (C) 2013-2018 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. If a copy of the MPL was not distributed with this file, You can\n# obtain one at http://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom __future__ import division, print_function, absolute_import\n\nimport traceback\n\nfrom hypothesis import Verbosity, settings\nfrom hypothesis.errors import CleanupFailed, InvalidArgument, \\\n UnsatisfiedAssumption\nfrom hypothesis.reporting import report\nfrom hypothesis.utils.dynamicvariables import DynamicVariable\n\nif False:\n from typing import Any, AnyStr # noqa\n\n\ndef reject():\n raise UnsatisfiedAssumption()\n\n\ndef assume(condition):\n # type: (Any) -> bool\n \"\"\"Calling ``assume`` is like an :ref:`assert <python:assert>` that marks\n the example as bad, rather than failing the test.\n\n This allows you to specify properties that you *assume* will be\n true, and let Hypothesis try to avoid similar examples in future.\n \"\"\"\n if not condition:\n raise UnsatisfiedAssumption()\n return True\n\n\n_current_build_context = DynamicVariable(None)\n\n\ndef current_build_context():\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n u'No build context registered')\n return context\n\n\nclass BuildContext(object):\n\n def __init__(self, data, is_final=False, close_on_capture=True):\n self.data = data\n self.tasks = []\n self.is_final = is_final\n self.close_on_capture = close_on_capture\n self.close_on_del = False\n self.notes = []\n\n def __enter__(self):\n self.assign_variable = _current_build_context.with_value(self)\n self.assign_variable.__enter__()\n return self\n\n def __exit__(self, exc_type, exc_value, tb):\n self.assign_variable.__exit__(exc_type, exc_value, tb)\n if self.close() and exc_type is None:\n raise CleanupFailed()\n\n def local(self):\n return _current_build_context.with_value(self)\n\n def close(self):\n any_failed = False\n for task in self.tasks:\n try:\n task()\n except BaseException:\n any_failed = True\n report(traceback.format_exc())\n return any_failed\n\n\ndef cleanup(teardown):\n \"\"\"Register a function to be called when the current test has finished\n executing. Any exceptions thrown in teardown will be printed but not\n rethrown.\n\n Inside a test this isn't very interesting, because you can just use\n a finally block, but note that you can use this inside map, flatmap,\n etc. in order to e.g. 
insist that a value is closed at the end.\n \"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n u'Cannot register cleanup outside of build context')\n context.tasks.append(teardown)\n\n\ndef note(value):\n # type: (AnyStr) -> None\n \"\"\"Report this value in the final execution.\"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n 'Cannot make notes outside of a test')\n context.notes.append(value)\n if context.is_final or settings.default.verbosity >= Verbosity.verbose:\n report(value)\n\n\ndef event(value):\n # type: (AnyStr) -> None\n \"\"\"Record an event that occurred this test. Statistics on number of test\n runs with each event will be reported at the end if you run Hypothesis in\n statistics reporting mode.\n\n Events should be strings or convertible to them.\n \"\"\"\n context = _current_build_context.value\n if context is None:\n raise InvalidArgument(\n 'Cannot make record events outside of a test')\n\n if context.data is not None:\n context.data.note_event(value)\n", "path": "hypothesis-python/src/hypothesis/control.py"}]} | 1,801 | 262 |
gh_patches_debug_10841 | rasdani/github-patches | git_diff | digitalfabrik__integreat-cms-435 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Manual creation of Languages results in Bugsplash at page view.
### Describe the Bug
If a system is set up manually without the test data, the manual language setup results in an error when opening the list view of the pages.

### Steps to Reproduce
1. Create Languages (Like German with code DE_de)
2. Create Region (Like Berlin)
3. Add Language Tree model like German -> English
4. Click on Pages and see error
### Expected Behavior
The usual list should be displayed.
### Actual Behavior
Error message from Django.
### Additional Information
I guess this has something to do with manually setting the language code and this can't be matched by django.
Manual creation of Languages results in Bugsplash at page view.
### Describe the Bug
If a system is set up manually without the test data, the manual language setup results in an error when opening the list view of the pages.

### Steps to Reproduce
1. Create Languages (Like German with code DE_de)
2. Create Region (Like Berlin)
3. Add Language Tree model like German -> English
4. Click on Pages and see error
### Expected Behavior
The usual list should be displayed.
### Actual Behavior
Error message from Django.
### Additional Information
I guess this has something to do with manually setting the language code and this can't be matched by django.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cms/templatetags/content_filters.py`
Content:
```
1 import logging
2
3 from django import template
4
5 from ..models import Language
6
7 logger = logging.getLogger(__name__)
8 register = template.Library()
9
10
11 @register.simple_tag
12 def get_translation(instance, language_code):
13 return instance.translations.filter(language__code=language_code).first()
14
15
16 @register.simple_tag
17 def translated_language_name(language_code):
18 return Language.objects.get(code=language_code).translated_name
19
20 @register.simple_tag
21 def get_language(language_code):
22 return Language.objects.get(code=language_code)
23
24 # Unify the language codes of backend and content languages
25 @register.simple_tag
26 def unify_language_code(language_code):
27 if language_code == 'en-gb':
28 return 'en-us'
29 return language_code
30
31
32 @register.filter
33 def get_int_list(data, list_name):
34 return [int(item) for item in data.getlist(list_name)]
35
36
37 @register.filter
38 def is_empty(iterable):
39 return not bool(iterable)
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cms/templatetags/content_filters.py b/src/cms/templatetags/content_filters.py
--- a/src/cms/templatetags/content_filters.py
+++ b/src/cms/templatetags/content_filters.py
@@ -15,11 +15,14 @@
@register.simple_tag
def translated_language_name(language_code):
- return Language.objects.get(code=language_code).translated_name
+ language = Language.objects.filter(code=language_code)
+ if language.exists():
+ return language.first().translated_name
+ return ''
@register.simple_tag
def get_language(language_code):
- return Language.objects.get(code=language_code)
+ return Language.objects.filter(code=language_code).first()
# Unify the language codes of backend and content languages
@register.simple_tag
| {"golden_diff": "diff --git a/src/cms/templatetags/content_filters.py b/src/cms/templatetags/content_filters.py\n--- a/src/cms/templatetags/content_filters.py\n+++ b/src/cms/templatetags/content_filters.py\n@@ -15,11 +15,14 @@\n \n @register.simple_tag\n def translated_language_name(language_code):\n- return Language.objects.get(code=language_code).translated_name\n+ language = Language.objects.filter(code=language_code)\n+ if language.exists():\n+ return language.first().translated_name\n+ return ''\n \n @register.simple_tag\n def get_language(language_code):\n- return Language.objects.get(code=language_code)\n+ return Language.objects.filter(code=language_code).first()\n \n # Unify the language codes of backend and content languages\n @register.simple_tag\n", "issue": "Manual creation of Languages results in Bugsplash at page view.\n### Describe the Bug\r\nIf a system is set up manually without the test data. The manual language setup results in a error when opening the list view of the pages.\r\n\r\n\r\n\r\n\r\n### Steps to Reproduce\r\n\r\n1. Create Languages (Like German with code DE_de)\r\n2. Create Region (Like Berlin)\r\n3. Add Language Tree model like German -> English\r\n4. Click on Pages and see error\r\n\r\n### Expected Behavior\r\nThe usual list should be displayed. \r\n\r\n\r\n### Actual Behavior\r\nError message from Django.\r\n\r\n\r\n### Additional Information\r\nI guess this has something to do with manually setting the language code and this can't be matched by django.\r\n\r\n\nManual creation of Languages results in Bugsplash at page view.\n### Describe the Bug\r\nIf a system is set up manually without the test data. The manual language setup results in a error when opening the list view of the pages.\r\n\r\n\r\n\r\n\r\n### Steps to Reproduce\r\n\r\n1. Create Languages (Like German with code DE_de)\r\n2. Create Region (Like Berlin)\r\n3. Add Language Tree model like German -> English\r\n4. Click on Pages and see error\r\n\r\n### Expected Behavior\r\nThe usual list should be displayed. 
\r\n\r\n\r\n### Actual Behavior\r\nError message from Django.\r\n\r\n\r\n### Additional Information\r\nI guess this has something to do with manually setting the language code and this can't be matched by django.\r\n\r\n\n", "before_files": [{"content": "import logging\n\nfrom django import template\n\nfrom ..models import Language\n\nlogger = logging.getLogger(__name__)\nregister = template.Library()\n\n\[email protected]_tag\ndef get_translation(instance, language_code):\n return instance.translations.filter(language__code=language_code).first()\n\n\[email protected]_tag\ndef translated_language_name(language_code):\n return Language.objects.get(code=language_code).translated_name\n\[email protected]_tag\ndef get_language(language_code):\n return Language.objects.get(code=language_code)\n\n# Unify the language codes of backend and content languages\[email protected]_tag\ndef unify_language_code(language_code):\n if language_code == 'en-gb':\n return 'en-us'\n return language_code\n\n\[email protected]\ndef get_int_list(data, list_name):\n return [int(item) for item in data.getlist(list_name)]\n\n\[email protected]\ndef is_empty(iterable):\n return not bool(iterable)\n", "path": "src/cms/templatetags/content_filters.py"}], "after_files": [{"content": "import logging\n\nfrom django import template\n\nfrom ..models import Language\n\nlogger = logging.getLogger(__name__)\nregister = template.Library()\n\n\[email protected]_tag\ndef get_translation(instance, language_code):\n return instance.translations.filter(language__code=language_code).first()\n\n\[email protected]_tag\ndef translated_language_name(language_code):\n language = Language.objects.filter(code=language_code)\n if language.exists():\n return language.first().translated_name\n return ''\n\[email protected]_tag\ndef get_language(language_code):\n return Language.objects.filter(code=language_code).first()\n\n# Unify the language codes of backend and content languages\[email protected]_tag\ndef unify_language_code(language_code):\n if language_code == 'en-gb':\n return 'en-us'\n return language_code\n\n\[email protected]\ndef get_int_list(data, list_name):\n return [int(item) for item in data.getlist(list_name)]\n\n\[email protected]\ndef is_empty(iterable):\n return not bool(iterable)\n", "path": "src/cms/templatetags/content_filters.py"}]} | 949 | 178 |
gh_patches_debug_229 | rasdani/github-patches | git_diff | facebookresearch__hydra-1808 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] hydra-optuna-sweeper 1.1.0 requires numpy<1.20.0
# 🐛 Bug
## Description
<!-- A clear and concise description of what the bug is. -->
I used the guide from
https://hydra.cc/docs/plugins/optuna_sweeper/
And install hydra-optuna-sweeper:
```bash
pip install hydra-optuna-sweeper --upgrade
```
But it seems this plugin requires numpy<1.20.0:

**Edit:**
I searched for optuna's requirements, found this:
https://github.com/optuna/optuna/blob/cbae80476c15b6d39e1d8851dc6a501c63c3ca92/setup.py#L35
Why does hydra-optuna-sweeper need to use numpy<1.20.0?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2
3 __version__ = "1.1.0"
4
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py b/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py
--- a/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py
+++ b/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py
@@ -1,3 +1,3 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-__version__ = "1.1.0"
+__version__ = "1.1.1"
| {"golden_diff": "diff --git a/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py b/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py\n--- a/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py\n+++ b/plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py\n@@ -1,3 +1,3 @@\n # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n \n-__version__ = \"1.1.0\"\n+__version__ = \"1.1.1\"\n", "issue": "[Bug] hydra-optuna-sweeper 1.1.0 requires numpy<1.20.0\n# \ud83d\udc1b Bug\r\n## Description\r\n<!-- A clear and concise description of what the bug is. -->\r\n\r\nI used the guide from\r\nhttps://hydra.cc/docs/plugins/optuna_sweeper/\r\n\r\nAnd install hydra-optuna-sweeper:\r\n\r\n```bash \r\npip install hydra-optuna-sweeper --upgrade\r\n```\r\n\r\nBut it seems this plugin requires numpy<1.20.0:\r\n\r\n\r\n\r\n**Edit:**\r\n\r\nI searched for optuna's requirements, found this:\r\n\r\nhttps://github.com/optuna/optuna/blob/cbae80476c15b6d39e1d8851dc6a501c63c3ca92/setup.py#L35\r\n\r\nWhy hydra-optuna-sweeper need to use numpy<1.20.0?\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\n__version__ = \"1.1.0\"\n", "path": "plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\n__version__ = \"1.1.1\"\n", "path": "plugins/hydra_optuna_sweeper/hydra_plugins/hydra_optuna_sweeper/__init__.py"}]} | 577 | 163 |
gh_patches_debug_43516 | rasdani/github-patches | git_diff | ResonantGeoData__ResonantGeoData-577 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Client: API token saving bug on MacOS
@banesullivan was experiencing the following issue:
When calling `create_rgd_client`, if there isn't already a token stored locally (in `$HOME/.rgd/token`), the client is supposed to make a request to the server to create it, and save it in that file. On MacOS, it seems this might not be occurring. The file doesn't appear to ever be created (notably though, the `.rgd` folder _is_ present). Furthermore, if you try to manually populate that file with your token, it will correctly read it, but the file will then be gone afterwards.
This doesn't actually affect authorization, as it still just fetches the token from the API and stores it in memory, but the storage issue needs to be looked into.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django-rgd/client/rgd_client/client.py`
Content:
```
1 import getpass
2 import os
3 from typing import List, Optional, Type
4
5 import requests
6
7 from .plugin import CorePlugin
8 from .session import RgdClientSession, clone_session
9 from .utils import API_KEY_DIR_PATH, API_KEY_FILE_NAME, DEFAULT_RGD_API
10
11
12 class RgdClient:
13 def __init__(
14 self,
15 api_url: str = DEFAULT_RGD_API,
16 username: Optional[str] = None,
17 password: Optional[str] = None,
18 save: Optional[bool] = True,
19 ) -> None:
20 """
21 Initialize the base RGD Client.
22
23 Args:
24 api_url: The base url of the RGD API instance.
25 username: The username to authenticate to the instance with, if any.
26 password: The password associated with the provided username. If None, a prompt will be provided.
27 save: Whether or not to save the logged-in user's API key to disk for future use.
28
29 Returns:
30 A base RgdClient instance.
31 """
32 # Look for an API key in the environment. If it's not there, check username/password
33 api_key = _read_api_key()
34 if api_key is None:
35 if username is not None and password is None:
36 password = getpass.getpass()
37
38 # Get an API key for this user and save it to disk
39 if username and password:
40 api_key = _get_api_key(api_url, username, password, save)
41
42 auth_header = f'Token {api_key}'
43
44 self.session = RgdClientSession(base_url=api_url, auth_header=auth_header)
45 self.rgd = CorePlugin(clone_session(self.session))
46
47 def clear_token(self):
48 """Delete a locally-stored API key."""
49 (API_KEY_DIR_PATH / API_KEY_FILE_NAME).unlink(missing_ok=True)
50
51
52 def _get_api_key(api_url: str, username: str, password: str, save: bool) -> str:
53 """Get an RGD API Key for the given user from the server, and save it if requested."""
54 resp = requests.post(f'{api_url}/api-token-auth', {'username': username, 'password': password})
55 resp.raise_for_status()
56 token = resp.json()['token']
57 if save:
58 API_KEY_DIR_PATH.mkdir(parents=True, exist_ok=True)
59 with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'w') as fd:
60 fd.write(token)
61 return token
62
63
64 def _read_api_key() -> Optional[str]:
65 """
66 Retrieve an RGD API Key from the users environment.
67
68 This function checks for an environment variable named RGD_API_TOKEN and returns it if it exists.
69 If it does not exist, it looks for a file located at ~/.rgd/token and returns its contents.
70 """
71 token = os.getenv('RGD_API_TOKEN', None)
72 if token is not None:
73 return token
74
75 try:
76 # read the first line of the text file at ~/.rgd/token
77 with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'r') as fd:
78 return fd.readline().strip()
79 except FileNotFoundError:
80 return None
81
82
83 def create_rgd_client(
84 api_url: str = DEFAULT_RGD_API,
85 username: Optional[str] = None,
86 password: Optional[str] = None,
87 save: Optional[bool] = True,
88 extra_plugins: Optional[List[Type]] = None,
89 ):
90 # Avoid circular import
91 from ._plugin_utils import _inject_plugin_deps, _plugin_classes, _plugin_instances
92
93 # Create initial client
94 client = RgdClient(api_url, username, password, save)
95
96 # Perform plugin initialization
97 plugin_classes = _plugin_classes(extra_plugins=extra_plugins)
98 plugin_instances = _plugin_instances(client, plugin_classes)
99 _inject_plugin_deps(plugin_instances)
100
101 return client
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django-rgd/client/rgd_client/client.py b/django-rgd/client/rgd_client/client.py
--- a/django-rgd/client/rgd_client/client.py
+++ b/django-rgd/client/rgd_client/client.py
@@ -1,4 +1,5 @@
import getpass
+import logging
import os
from typing import List, Optional, Type
@@ -8,6 +9,8 @@
from .session import RgdClientSession, clone_session
from .utils import API_KEY_DIR_PATH, API_KEY_FILE_NAME, DEFAULT_RGD_API
+logger = logging.getLogger(__name__)
+
class RgdClient:
def __init__(
@@ -30,7 +33,7 @@
A base RgdClient instance.
"""
# Look for an API key in the environment. If it's not there, check username/password
- api_key = _read_api_key()
+ api_key = _read_api_key(api_url=api_url, username=username, password=password)
if api_key is None:
if username is not None and password is None:
password = getpass.getpass()
@@ -38,6 +41,10 @@
# Get an API key for this user and save it to disk
if username and password:
api_key = _get_api_key(api_url, username, password, save)
+ if api_key is None:
+ logger.error(
+ 'Failed to retrieve API key; are your username and password correct?'
+ )
auth_header = f'Token {api_key}'
@@ -49,11 +56,12 @@
(API_KEY_DIR_PATH / API_KEY_FILE_NAME).unlink(missing_ok=True)
-def _get_api_key(api_url: str, username: str, password: str, save: bool) -> str:
+def _get_api_key(api_url: str, username: str, password: str, save: bool) -> Optional[str]:
"""Get an RGD API Key for the given user from the server, and save it if requested."""
resp = requests.post(f'{api_url}/api-token-auth', {'username': username, 'password': password})
- resp.raise_for_status()
- token = resp.json()['token']
+ token = resp.json().get('token')
+ if token is None:
+ return None
if save:
API_KEY_DIR_PATH.mkdir(parents=True, exist_ok=True)
with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'w') as fd:
@@ -61,7 +69,7 @@
return token
-def _read_api_key() -> Optional[str]:
+def _read_api_key(api_url: str, username: str = None, password: str = None) -> Optional[str]:
"""
Retrieve an RGD API Key from the users environment.
@@ -75,10 +83,29 @@
try:
# read the first line of the text file at ~/.rgd/token
with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'r') as fd:
- return fd.readline().strip()
+ api_key = fd.readline().strip()
except FileNotFoundError:
return None
+ # Make sure API key works by hitting a protected endpoint
+ resp = requests.get(f'{api_url}/rgd/collection', headers={'Authorization': f'Token {api_key}'})
+
+ # If it doesn't, try to get a new one and save it to ~/.rgd/token, as the current one is corrupted
+ if resp.status_code == 401:
+ logger.error('API key is invalid.')
+ # If username + password were provided, try to get a new API key with them
+ if username is not None and password is not None:
+ logger.warning('Attempting to fetch a new API key...')
+ api_key = _get_api_key(api_url, username, password, save=True)
+ if api_key is not None:
+ logger.warning('Succeeded.')
+ return api_key
+ else:
+ logger.error('Provide your username and password next time to fetch a new one.')
+ return None
+
+ return api_key
+
def create_rgd_client(
api_url: str = DEFAULT_RGD_API,
| {"golden_diff": "diff --git a/django-rgd/client/rgd_client/client.py b/django-rgd/client/rgd_client/client.py\n--- a/django-rgd/client/rgd_client/client.py\n+++ b/django-rgd/client/rgd_client/client.py\n@@ -1,4 +1,5 @@\n import getpass\n+import logging\n import os\n from typing import List, Optional, Type\n \n@@ -8,6 +9,8 @@\n from .session import RgdClientSession, clone_session\n from .utils import API_KEY_DIR_PATH, API_KEY_FILE_NAME, DEFAULT_RGD_API\n \n+logger = logging.getLogger(__name__)\n+\n \n class RgdClient:\n def __init__(\n@@ -30,7 +33,7 @@\n A base RgdClient instance.\n \"\"\"\n # Look for an API key in the environment. If it's not there, check username/password\n- api_key = _read_api_key()\n+ api_key = _read_api_key(api_url=api_url, username=username, password=password)\n if api_key is None:\n if username is not None and password is None:\n password = getpass.getpass()\n@@ -38,6 +41,10 @@\n # Get an API key for this user and save it to disk\n if username and password:\n api_key = _get_api_key(api_url, username, password, save)\n+ if api_key is None:\n+ logger.error(\n+ 'Failed to retrieve API key; are your username and password correct?'\n+ )\n \n auth_header = f'Token {api_key}'\n \n@@ -49,11 +56,12 @@\n (API_KEY_DIR_PATH / API_KEY_FILE_NAME).unlink(missing_ok=True)\n \n \n-def _get_api_key(api_url: str, username: str, password: str, save: bool) -> str:\n+def _get_api_key(api_url: str, username: str, password: str, save: bool) -> Optional[str]:\n \"\"\"Get an RGD API Key for the given user from the server, and save it if requested.\"\"\"\n resp = requests.post(f'{api_url}/api-token-auth', {'username': username, 'password': password})\n- resp.raise_for_status()\n- token = resp.json()['token']\n+ token = resp.json().get('token')\n+ if token is None:\n+ return None\n if save:\n API_KEY_DIR_PATH.mkdir(parents=True, exist_ok=True)\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'w') as fd:\n@@ -61,7 +69,7 @@\n return token\n \n \n-def _read_api_key() -> Optional[str]:\n+def _read_api_key(api_url: str, username: str = None, password: str = None) -> Optional[str]:\n \"\"\"\n Retrieve an RGD API Key from the users environment.\n \n@@ -75,10 +83,29 @@\n try:\n # read the first line of the text file at ~/.rgd/token\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'r') as fd:\n- return fd.readline().strip()\n+ api_key = fd.readline().strip()\n except FileNotFoundError:\n return None\n \n+ # Make sure API key works by hitting a protected endpoint\n+ resp = requests.get(f'{api_url}/rgd/collection', headers={'Authorization': f'Token {api_key}'})\n+\n+ # If it doesn't, try to get a new one and save it to ~/.rgd/token, as the current one is corrupted\n+ if resp.status_code == 401:\n+ logger.error('API key is invalid.')\n+ # If username + password were provided, try to get a new API key with them\n+ if username is not None and password is not None:\n+ logger.warning('Attempting to fetch a new API key...')\n+ api_key = _get_api_key(api_url, username, password, save=True)\n+ if api_key is not None:\n+ logger.warning('Succeeded.')\n+ return api_key\n+ else:\n+ logger.error('Provide your username and password next time to fetch a new one.')\n+ return None\n+\n+ return api_key\n+\n \n def create_rgd_client(\n api_url: str = DEFAULT_RGD_API,\n", "issue": "Client: API token saving bug on MacOS\n@banesullivan was experiencing the following issue:\r\n\r\nWhen calling `create_rgd_client`, if there isn't already a token stored locally (in `$HOME/.rgd/token`), the client is supposed to make a 
request to the server to create it, and save it in that file. On MacOS, it seems this might not be occurring. The file doesn't appear to ever be created (notably though, the `.rgd` folder _is_ present). Furthermore, if you try to manually populate that file with your token, it will correctly read it, but the file will then be gone afterwards.\r\n\r\nThis doesn't actually affect authorization, as it still just fetches the token from the API and stores it in memory, but the storage issue needs to be looked into.\n", "before_files": [{"content": "import getpass\nimport os\nfrom typing import List, Optional, Type\n\nimport requests\n\nfrom .plugin import CorePlugin\nfrom .session import RgdClientSession, clone_session\nfrom .utils import API_KEY_DIR_PATH, API_KEY_FILE_NAME, DEFAULT_RGD_API\n\n\nclass RgdClient:\n def __init__(\n self,\n api_url: str = DEFAULT_RGD_API,\n username: Optional[str] = None,\n password: Optional[str] = None,\n save: Optional[bool] = True,\n ) -> None:\n \"\"\"\n Initialize the base RGD Client.\n\n Args:\n api_url: The base url of the RGD API instance.\n username: The username to authenticate to the instance with, if any.\n password: The password associated with the provided username. If None, a prompt will be provided.\n save: Whether or not to save the logged-in user's API key to disk for future use.\n\n Returns:\n A base RgdClient instance.\n \"\"\"\n # Look for an API key in the environment. If it's not there, check username/password\n api_key = _read_api_key()\n if api_key is None:\n if username is not None and password is None:\n password = getpass.getpass()\n\n # Get an API key for this user and save it to disk\n if username and password:\n api_key = _get_api_key(api_url, username, password, save)\n\n auth_header = f'Token {api_key}'\n\n self.session = RgdClientSession(base_url=api_url, auth_header=auth_header)\n self.rgd = CorePlugin(clone_session(self.session))\n\n def clear_token(self):\n \"\"\"Delete a locally-stored API key.\"\"\"\n (API_KEY_DIR_PATH / API_KEY_FILE_NAME).unlink(missing_ok=True)\n\n\ndef _get_api_key(api_url: str, username: str, password: str, save: bool) -> str:\n \"\"\"Get an RGD API Key for the given user from the server, and save it if requested.\"\"\"\n resp = requests.post(f'{api_url}/api-token-auth', {'username': username, 'password': password})\n resp.raise_for_status()\n token = resp.json()['token']\n if save:\n API_KEY_DIR_PATH.mkdir(parents=True, exist_ok=True)\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'w') as fd:\n fd.write(token)\n return token\n\n\ndef _read_api_key() -> Optional[str]:\n \"\"\"\n Retrieve an RGD API Key from the users environment.\n\n This function checks for an environment variable named RGD_API_TOKEN and returns it if it exists.\n If it does not exist, it looks for a file located at ~/.rgd/token and returns its contents.\n \"\"\"\n token = os.getenv('RGD_API_TOKEN', None)\n if token is not None:\n return token\n\n try:\n # read the first line of the text file at ~/.rgd/token\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'r') as fd:\n return fd.readline().strip()\n except FileNotFoundError:\n return None\n\n\ndef create_rgd_client(\n api_url: str = DEFAULT_RGD_API,\n username: Optional[str] = None,\n password: Optional[str] = None,\n save: Optional[bool] = True,\n extra_plugins: Optional[List[Type]] = None,\n):\n # Avoid circular import\n from ._plugin_utils import _inject_plugin_deps, _plugin_classes, _plugin_instances\n\n # Create initial client\n client = RgdClient(api_url, username, password, 
save)\n\n # Perform plugin initialization\n plugin_classes = _plugin_classes(extra_plugins=extra_plugins)\n plugin_instances = _plugin_instances(client, plugin_classes)\n _inject_plugin_deps(plugin_instances)\n\n return client\n", "path": "django-rgd/client/rgd_client/client.py"}], "after_files": [{"content": "import getpass\nimport logging\nimport os\nfrom typing import List, Optional, Type\n\nimport requests\n\nfrom .plugin import CorePlugin\nfrom .session import RgdClientSession, clone_session\nfrom .utils import API_KEY_DIR_PATH, API_KEY_FILE_NAME, DEFAULT_RGD_API\n\nlogger = logging.getLogger(__name__)\n\n\nclass RgdClient:\n def __init__(\n self,\n api_url: str = DEFAULT_RGD_API,\n username: Optional[str] = None,\n password: Optional[str] = None,\n save: Optional[bool] = True,\n ) -> None:\n \"\"\"\n Initialize the base RGD Client.\n\n Args:\n api_url: The base url of the RGD API instance.\n username: The username to authenticate to the instance with, if any.\n password: The password associated with the provided username. If None, a prompt will be provided.\n save: Whether or not to save the logged-in user's API key to disk for future use.\n\n Returns:\n A base RgdClient instance.\n \"\"\"\n # Look for an API key in the environment. If it's not there, check username/password\n api_key = _read_api_key(api_url=api_url, username=username, password=password)\n if api_key is None:\n if username is not None and password is None:\n password = getpass.getpass()\n\n # Get an API key for this user and save it to disk\n if username and password:\n api_key = _get_api_key(api_url, username, password, save)\n if api_key is None:\n logger.error(\n 'Failed to retrieve API key; are your username and password correct?'\n )\n\n auth_header = f'Token {api_key}'\n\n self.session = RgdClientSession(base_url=api_url, auth_header=auth_header)\n self.rgd = CorePlugin(clone_session(self.session))\n\n def clear_token(self):\n \"\"\"Delete a locally-stored API key.\"\"\"\n (API_KEY_DIR_PATH / API_KEY_FILE_NAME).unlink(missing_ok=True)\n\n\ndef _get_api_key(api_url: str, username: str, password: str, save: bool) -> Optional[str]:\n \"\"\"Get an RGD API Key for the given user from the server, and save it if requested.\"\"\"\n resp = requests.post(f'{api_url}/api-token-auth', {'username': username, 'password': password})\n token = resp.json().get('token')\n if token is None:\n return None\n if save:\n API_KEY_DIR_PATH.mkdir(parents=True, exist_ok=True)\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'w') as fd:\n fd.write(token)\n return token\n\n\ndef _read_api_key(api_url: str, username: str = None, password: str = None) -> Optional[str]:\n \"\"\"\n Retrieve an RGD API Key from the users environment.\n\n This function checks for an environment variable named RGD_API_TOKEN and returns it if it exists.\n If it does not exist, it looks for a file located at ~/.rgd/token and returns its contents.\n \"\"\"\n token = os.getenv('RGD_API_TOKEN', None)\n if token is not None:\n return token\n\n try:\n # read the first line of the text file at ~/.rgd/token\n with open(API_KEY_DIR_PATH / API_KEY_FILE_NAME, 'r') as fd:\n api_key = fd.readline().strip()\n except FileNotFoundError:\n return None\n\n # Make sure API key works by hitting a protected endpoint\n resp = requests.get(f'{api_url}/rgd/collection', headers={'Authorization': f'Token {api_key}'})\n\n # If it doesn't, try to get a new one and save it to ~/.rgd/token, as the current one is corrupted\n if resp.status_code == 401:\n logger.error('API key is invalid.')\n # 
If username + password were provided, try to get a new API key with them\n if username is not None and password is not None:\n logger.warning('Attempting to fetch a new API key...')\n api_key = _get_api_key(api_url, username, password, save=True)\n if api_key is not None:\n logger.warning('Succeeded.')\n return api_key\n else:\n logger.error('Provide your username and password next time to fetch a new one.')\n return None\n\n return api_key\n\n\ndef create_rgd_client(\n api_url: str = DEFAULT_RGD_API,\n username: Optional[str] = None,\n password: Optional[str] = None,\n save: Optional[bool] = True,\n extra_plugins: Optional[List[Type]] = None,\n):\n # Avoid circular import\n from ._plugin_utils import _inject_plugin_deps, _plugin_classes, _plugin_instances\n\n # Create initial client\n client = RgdClient(api_url, username, password, save)\n\n # Perform plugin initialization\n plugin_classes = _plugin_classes(extra_plugins=extra_plugins)\n plugin_instances = _plugin_instances(client, plugin_classes)\n _inject_plugin_deps(plugin_instances)\n\n return client\n", "path": "django-rgd/client/rgd_client/client.py"}]} | 1,456 | 938 |
gh_patches_debug_36852 | rasdani/github-patches | git_diff | googleapis__google-api-python-client-961 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python == 2.7
The readme states that
># Deprecated Python Versions
>Python == 2.7
It does not appear to work with Python == 2.7. (link to question on [SO](https://stackoverflow.com/q/56731694/1841839))
Should this note be removed from the [readme](https://github.com/googleapis/google-api-python-client/edit/master/README.md)? To me, deprecation means that it should still work, just be removed soon.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `noxfile.py`
Content:
```
1
2 # Copyright 2020 Google LLC
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 import sys
17
18 import nox
19
20 test_dependencies = [
21 "google-auth",
22 "google-auth-httplib2",
23 "mox",
24 "parameterized",
25 "pyopenssl",
26 "pytest",
27 "pytest-cov",
28 "webtest",
29 "coverage",
30 "unittest2",
31 "mock",
32 ]
33
34
35 @nox.session(python=["3.7"])
36 def lint(session):
37 session.install("flake8")
38 session.run(
39 "flake8",
40 "googleapiclient",
41 "tests",
42 "--count",
43 "--select=E9,F63,F7,F82",
44 "--show-source",
45 "--statistics",
46 )
47
48
49 @nox.session(python=["2.7", "3.5", "3.6", "3.7", "3.8"])
50 @nox.parametrize(
51 "oauth2client",
52 [
53 "oauth2client<2dev",
54 "oauth2client>=2,<=3dev",
55 "oauth2client>=3,<=4dev",
56 "oauth2client>=4,<=5dev",
57 ],
58 )
59 def unit(session, oauth2client):
60 session.install(*test_dependencies)
61 session.install(oauth2client)
62 if session.python < "3.0":
63 session.install("django<2.0.0")
64 else:
65 session.install("django>=2.0.0")
66
67 session.install('.')
68
69 # Run py.test against the unit tests.
70 session.run(
71 "py.test",
72 "--quiet",
73 "--cov=googleapiclient",
74 "--cov=tests",
75 "--cov-append",
76 "--cov-config=.coveragerc",
77 "--cov-report=",
78 "--cov-fail-under=85",
79 "tests",
80 *session.posargs,
81 )
82
83
84 @nox.session(python="3.6")
85 def docs(session):
86 session.install('.')
87 session.run("python", "describe.py")
```
Path: `setup.py`
Content:
```
1 # Copyright 2014 Google Inc. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Setup script for Google API Python client.
16
17 Also installs included versions of third party libraries, if those libraries
18 are not already installed.
19 """
20 from __future__ import print_function
21
22 import sys
23
24 if sys.version_info < (2, 7):
25 print("google-api-python-client requires python version >= 2.7.", file=sys.stderr)
26 sys.exit(1)
27 if (3, 1) <= sys.version_info < (3, 4):
28 print("google-api-python-client requires python3 version >= 3.4.", file=sys.stderr)
29 sys.exit(1)
30
31 import io
32 import os
33 from setuptools import setup
34
35 packages = ["apiclient", "googleapiclient", "googleapiclient/discovery_cache"]
36
37 install_requires = [
38 "httplib2>=0.15.0,<1dev",
39 "google-auth>=1.16.0",
40 "google-auth-httplib2>=0.0.3",
41 "google-api-core>=1.21.0,<2dev",
42 "six>=1.13.0,<2dev",
43 "uritemplate>=3.0.0,<4dev",
44 ]
45
46 package_root = os.path.abspath(os.path.dirname(__file__))
47
48 readme_filename = os.path.join(package_root, "README.md")
49 with io.open(readme_filename, encoding="utf-8") as readme_file:
50 readme = readme_file.read()
51
52 version = "1.12.8"
53
54 setup(
55 name="google-api-python-client",
56 version=version,
57 description="Google API Client Library for Python",
58 long_description=readme,
59 long_description_content_type='text/markdown',
60 author="Google LLC",
61 author_email="[email protected]",
62 url="https://github.com/googleapis/google-api-python-client/",
63 install_requires=install_requires,
64 python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*",
65 packages=packages,
66 package_data={},
67 license="Apache 2.0",
68 keywords="google api client",
69 classifiers=[
70 "Programming Language :: Python :: 2",
71 "Programming Language :: Python :: 2.7",
72 "Programming Language :: Python :: 3",
73 "Programming Language :: Python :: 3.5",
74 "Programming Language :: Python :: 3.6",
75 "Programming Language :: Python :: 3.7",
76 "Development Status :: 5 - Production/Stable",
77 "Intended Audience :: Developers",
78 "License :: OSI Approved :: Apache Software License",
79 "Operating System :: OS Independent",
80 "Topic :: Internet :: WWW/HTTP",
81 ],
82 )
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/noxfile.py b/noxfile.py
--- a/noxfile.py
+++ b/noxfile.py
@@ -18,6 +18,7 @@
import nox
test_dependencies = [
+ "django>=2.0.0",
"google-auth",
"google-auth-httplib2",
"mox",
@@ -46,7 +47,7 @@
)
[email protected](python=["2.7", "3.5", "3.6", "3.7", "3.8"])
[email protected](python=["3.6", "3.7", "3.8", "3.9"])
@nox.parametrize(
"oauth2client",
[
@@ -59,11 +60,6 @@
def unit(session, oauth2client):
session.install(*test_dependencies)
session.install(oauth2client)
- if session.python < "3.0":
- session.install("django<2.0.0")
- else:
- session.install("django>=2.0.0")
-
session.install('.')
# Run py.test against the unit tests.
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -21,11 +21,8 @@
import sys
-if sys.version_info < (2, 7):
- print("google-api-python-client requires python version >= 2.7.", file=sys.stderr)
- sys.exit(1)
-if (3, 1) <= sys.version_info < (3, 4):
- print("google-api-python-client requires python3 version >= 3.4.", file=sys.stderr)
+if sys.version_info < (3, 6):
+ print("google-api-python-client requires python3 version >= 3.6.", file=sys.stderr)
sys.exit(1)
import io
@@ -61,18 +58,17 @@
author_email="[email protected]",
url="https://github.com/googleapis/google-api-python-client/",
install_requires=install_requires,
- python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*",
+ python_requires=">=3.6",
packages=packages,
package_data={},
license="Apache 2.0",
keywords="google api client",
classifiers=[
- "Programming Language :: Python :: 2",
- "Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
| {"golden_diff": "diff --git a/noxfile.py b/noxfile.py\n--- a/noxfile.py\n+++ b/noxfile.py\n@@ -18,6 +18,7 @@\n import nox\n \n test_dependencies = [\n+ \"django>=2.0.0\",\n \"google-auth\",\n \"google-auth-httplib2\",\n \"mox\",\n@@ -46,7 +47,7 @@\n )\n \n \[email protected](python=[\"2.7\", \"3.5\", \"3.6\", \"3.7\", \"3.8\"])\[email protected](python=[\"3.6\", \"3.7\", \"3.8\", \"3.9\"])\n @nox.parametrize(\n \"oauth2client\",\n [\n@@ -59,11 +60,6 @@\n def unit(session, oauth2client):\n session.install(*test_dependencies)\n session.install(oauth2client)\n- if session.python < \"3.0\":\n- session.install(\"django<2.0.0\")\n- else:\n- session.install(\"django>=2.0.0\")\n-\n session.install('.')\n \n # Run py.test against the unit tests.\ndiff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -21,11 +21,8 @@\n \n import sys\n \n-if sys.version_info < (2, 7):\n- print(\"google-api-python-client requires python version >= 2.7.\", file=sys.stderr)\n- sys.exit(1)\n-if (3, 1) <= sys.version_info < (3, 4):\n- print(\"google-api-python-client requires python3 version >= 3.4.\", file=sys.stderr)\n+if sys.version_info < (3, 6):\n+ print(\"google-api-python-client requires python3 version >= 3.6.\", file=sys.stderr)\n sys.exit(1)\n \n import io\n@@ -61,18 +58,17 @@\n author_email=\"[email protected]\",\n url=\"https://github.com/googleapis/google-api-python-client/\",\n install_requires=install_requires,\n- python_requires=\">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*\",\n+ python_requires=\">=3.6\",\n packages=packages,\n package_data={},\n license=\"Apache 2.0\",\n keywords=\"google api client\",\n classifiers=[\n- \"Programming Language :: Python :: 2\",\n- \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n- \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n+ \"Programming Language :: Python :: 3.8\",\n+ \"Programming Language :: Python :: 3.9\",\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n", "issue": "Python == 2.7\nThe read me states that\r\n\r\n># Deprecated Python Versions\r\n>Python == 2.7\r\n\r\nIt does not appear to work with Python == 2.7. (link to question on [SO](https://stackoverflow.com/q/56731694/1841839))\r\n\r\nShould this note be removed from the [readme](https://github.com/googleapis/google-api-python-client/edit/master/README.md)? 
to me deprecation means that it should still work just be removed soon.\n", "before_files": [{"content": "\n# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport sys\n\nimport nox\n\ntest_dependencies = [\n \"google-auth\",\n \"google-auth-httplib2\",\n \"mox\",\n \"parameterized\",\n \"pyopenssl\",\n \"pytest\",\n \"pytest-cov\",\n \"webtest\",\n \"coverage\",\n \"unittest2\",\n \"mock\",\n]\n\n\[email protected](python=[\"3.7\"])\ndef lint(session):\n session.install(\"flake8\")\n session.run(\n \"flake8\",\n \"googleapiclient\",\n \"tests\",\n \"--count\",\n \"--select=E9,F63,F7,F82\",\n \"--show-source\",\n \"--statistics\",\n )\n\n\[email protected](python=[\"2.7\", \"3.5\", \"3.6\", \"3.7\", \"3.8\"])\[email protected](\n \"oauth2client\",\n [\n \"oauth2client<2dev\",\n \"oauth2client>=2,<=3dev\",\n \"oauth2client>=3,<=4dev\",\n \"oauth2client>=4,<=5dev\",\n ],\n)\ndef unit(session, oauth2client):\n session.install(*test_dependencies)\n session.install(oauth2client)\n if session.python < \"3.0\":\n session.install(\"django<2.0.0\")\n else:\n session.install(\"django>=2.0.0\")\n\n session.install('.')\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=googleapiclient\",\n \"--cov=tests\",\n \"--cov-append\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=85\",\n \"tests\",\n *session.posargs,\n )\n\n\[email protected](python=\"3.6\")\ndef docs(session):\n session.install('.')\n session.run(\"python\", \"describe.py\")", "path": "noxfile.py"}, {"content": "# Copyright 2014 Google Inc. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Setup script for Google API Python client.\n\nAlso installs included versions of third party libraries, if those libraries\nare not already installed.\n\"\"\"\nfrom __future__ import print_function\n\nimport sys\n\nif sys.version_info < (2, 7):\n print(\"google-api-python-client requires python version >= 2.7.\", file=sys.stderr)\n sys.exit(1)\nif (3, 1) <= sys.version_info < (3, 4):\n print(\"google-api-python-client requires python3 version >= 3.4.\", file=sys.stderr)\n sys.exit(1)\n\nimport io\nimport os\nfrom setuptools import setup\n\npackages = [\"apiclient\", \"googleapiclient\", \"googleapiclient/discovery_cache\"]\n\ninstall_requires = [\n \"httplib2>=0.15.0,<1dev\",\n \"google-auth>=1.16.0\",\n \"google-auth-httplib2>=0.0.3\",\n \"google-api-core>=1.21.0,<2dev\",\n \"six>=1.13.0,<2dev\",\n \"uritemplate>=3.0.0,<4dev\",\n]\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, \"README.md\")\nwith io.open(readme_filename, encoding=\"utf-8\") as readme_file:\n readme = readme_file.read()\n\nversion = \"1.12.8\"\n\nsetup(\n name=\"google-api-python-client\",\n version=version,\n description=\"Google API Client Library for Python\",\n long_description=readme,\n long_description_content_type='text/markdown',\n author=\"Google LLC\",\n author_email=\"[email protected]\",\n url=\"https://github.com/googleapis/google-api-python-client/\",\n install_requires=install_requires,\n python_requires=\">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*\",\n packages=packages,\n package_data={},\n license=\"Apache 2.0\",\n keywords=\"google api client\",\n classifiers=[\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\n# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport sys\n\nimport nox\n\ntest_dependencies = [\n \"django>=2.0.0\",\n \"google-auth\",\n \"google-auth-httplib2\",\n \"mox\",\n \"parameterized\",\n \"pyopenssl\",\n \"pytest\",\n 
\"pytest-cov\",\n \"webtest\",\n \"coverage\",\n \"unittest2\",\n \"mock\",\n]\n\n\[email protected](python=[\"3.7\"])\ndef lint(session):\n session.install(\"flake8\")\n session.run(\n \"flake8\",\n \"googleapiclient\",\n \"tests\",\n \"--count\",\n \"--select=E9,F63,F7,F82\",\n \"--show-source\",\n \"--statistics\",\n )\n\n\[email protected](python=[\"3.6\", \"3.7\", \"3.8\", \"3.9\"])\[email protected](\n \"oauth2client\",\n [\n \"oauth2client<2dev\",\n \"oauth2client>=2,<=3dev\",\n \"oauth2client>=3,<=4dev\",\n \"oauth2client>=4,<=5dev\",\n ],\n)\ndef unit(session, oauth2client):\n session.install(*test_dependencies)\n session.install(oauth2client)\n session.install('.')\n\n # Run py.test against the unit tests.\n session.run(\n \"py.test\",\n \"--quiet\",\n \"--cov=googleapiclient\",\n \"--cov=tests\",\n \"--cov-append\",\n \"--cov-config=.coveragerc\",\n \"--cov-report=\",\n \"--cov-fail-under=85\",\n \"tests\",\n *session.posargs,\n )\n\n\[email protected](python=\"3.6\")\ndef docs(session):\n session.install('.')\n session.run(\"python\", \"describe.py\")", "path": "noxfile.py"}, {"content": "# Copyright 2014 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Setup script for Google API Python client.\n\nAlso installs included versions of third party libraries, if those libraries\nare not already installed.\n\"\"\"\nfrom __future__ import print_function\n\nimport sys\n\nif sys.version_info < (3, 6):\n print(\"google-api-python-client requires python3 version >= 3.6.\", file=sys.stderr)\n sys.exit(1)\n\nimport io\nimport os\nfrom setuptools import setup\n\npackages = [\"apiclient\", \"googleapiclient\", \"googleapiclient/discovery_cache\"]\n\ninstall_requires = [\n \"httplib2>=0.15.0,<1dev\",\n \"google-auth>=1.16.0\",\n \"google-auth-httplib2>=0.0.3\",\n \"google-api-core>=1.21.0,<2dev\",\n \"six>=1.13.0,<2dev\",\n \"uritemplate>=3.0.0,<4dev\",\n]\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, \"README.md\")\nwith io.open(readme_filename, encoding=\"utf-8\") as readme_file:\n readme = readme_file.read()\n\nversion = \"1.12.8\"\n\nsetup(\n name=\"google-api-python-client\",\n version=version,\n description=\"Google API Client Library for Python\",\n long_description=readme,\n long_description_content_type='text/markdown',\n author=\"Google LLC\",\n author_email=\"[email protected]\",\n url=\"https://github.com/googleapis/google-api-python-client/\",\n install_requires=install_requires,\n python_requires=\">=3.6\",\n packages=packages,\n package_data={},\n license=\"Apache 2.0\",\n keywords=\"google api client\",\n classifiers=[\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software 
License\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet :: WWW/HTTP\",\n ],\n)\n", "path": "setup.py"}]} | 1,958 | 659 |
gh_patches_debug_18713 | rasdani/github-patches | git_diff | pypi__warehouse-3396 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Missing Purges
Noticed this while setting up a new mirror: we don't seem to be purging the `project/<normalized_name>` key when projects are deleted.
This leads bandersnatch to get confused and fall behind until the key is finally purged and the JSON API returns a 404.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `warehouse/packaging/__init__.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License");
2 # you may not use this file except in compliance with the License.
3 # You may obtain a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS,
9 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
10 # See the License for the specific language governing permissions and
11 # limitations under the License.
12
13 from celery.schedules import crontab
14 from sqlalchemy.orm.base import NO_VALUE
15
16 from warehouse import db
17 from warehouse.accounts.models import User, Email
18 from warehouse.cache.origin import key_factory, receive_set
19 from warehouse.packaging.interfaces import IFileStorage
20 from warehouse.packaging.models import Project, Release
21 from warehouse.packaging.tasks import compute_trending
22
23
24 @db.listens_for(User.name, 'set')
25 def user_name_receive_set(config, target, value, oldvalue, initiator):
26 if oldvalue is not NO_VALUE:
27 receive_set(User.name, config, target)
28
29
30 @db.listens_for(Email.primary, 'set')
31 def email_primary_receive_set(config, target, value, oldvalue, initiator):
32 if oldvalue is not NO_VALUE:
33 receive_set(Email.primary, config, target)
34
35
36 def includeme(config):
37 # Register whatever file storage backend has been configured for storing
38 # our package files.
39 storage_class = config.maybe_dotted(
40 config.registry.settings["files.backend"],
41 )
42 config.register_service_factory(storage_class.create_service, IFileStorage)
43
44 # Register our origin cache keys
45 config.register_origin_cache_keys(
46 Project,
47 cache_keys=["project/{obj.normalized_name}"],
48 purge_keys=[
49 key_factory("project/{obj.normalized_name}"),
50 key_factory("user/{itr.username}", iterate_on='users'),
51 key_factory("all-projects"),
52 ],
53 )
54 config.register_origin_cache_keys(
55 Release,
56 cache_keys=["project/{obj.project.normalized_name}"],
57 purge_keys=[
58 key_factory("project/{obj.project.normalized_name}"),
59 key_factory("user/{itr.username}", iterate_on='project.users'),
60 key_factory("all-projects"),
61 ],
62 )
63 config.register_origin_cache_keys(
64 User,
65 cache_keys=["user/{obj.username}"],
66 )
67 config.register_origin_cache_keys(
68 User.name,
69 purge_keys=[
70 key_factory("user/{obj.username}"),
71 key_factory("project/{itr.normalized_name}", iterate_on='projects')
72 ],
73 )
74 config.register_origin_cache_keys(
75 Email.primary,
76 purge_keys=[
77 key_factory("user/{obj.user.username}"),
78 key_factory(
79 "project/{itr.normalized_name}",
80 iterate_on='user.projects',
81 )
82 ],
83 )
84
85 # Add a periodic task to compute trending once a day, assuming we have
86 # been configured to be able to access BigQuery.
87 if config.get_settings().get("warehouse.trending_table"):
88 config.add_periodic_task(crontab(minute=0, hour=3), compute_trending)
89
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/warehouse/packaging/__init__.py b/warehouse/packaging/__init__.py
--- a/warehouse/packaging/__init__.py
+++ b/warehouse/packaging/__init__.py
@@ -17,7 +17,7 @@
from warehouse.accounts.models import User, Email
from warehouse.cache.origin import key_factory, receive_set
from warehouse.packaging.interfaces import IFileStorage
-from warehouse.packaging.models import Project, Release
+from warehouse.packaging.models import Project, Release, Role
from warehouse.packaging.tasks import compute_trending
@@ -60,6 +60,13 @@
key_factory("all-projects"),
],
)
+ config.register_origin_cache_keys(
+ Role,
+ purge_keys=[
+ key_factory("user/{obj.user.username}"),
+ key_factory("project/{obj.project.normalized_name}")
+ ],
+ )
config.register_origin_cache_keys(
User,
cache_keys=["user/{obj.username}"],
| {"golden_diff": "diff --git a/warehouse/packaging/__init__.py b/warehouse/packaging/__init__.py\n--- a/warehouse/packaging/__init__.py\n+++ b/warehouse/packaging/__init__.py\n@@ -17,7 +17,7 @@\n from warehouse.accounts.models import User, Email\n from warehouse.cache.origin import key_factory, receive_set\n from warehouse.packaging.interfaces import IFileStorage\n-from warehouse.packaging.models import Project, Release\n+from warehouse.packaging.models import Project, Release, Role\n from warehouse.packaging.tasks import compute_trending\n \n \n@@ -60,6 +60,13 @@\n key_factory(\"all-projects\"),\n ],\n )\n+ config.register_origin_cache_keys(\n+ Role,\n+ purge_keys=[\n+ key_factory(\"user/{obj.user.username}\"),\n+ key_factory(\"project/{obj.project.normalized_name}\")\n+ ],\n+ )\n config.register_origin_cache_keys(\n User,\n cache_keys=[\"user/{obj.username}\"],\n", "issue": "Missing Purges\nNoticed this while setting up new mirror. We don't seem to be purging `project/<normalized_name>` key when projects are deleted.\r\n\r\nThis leads bandersnatch to get confused and fall behind until the key is purged so the JSON api returns a 404\n", "before_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom celery.schedules import crontab\nfrom sqlalchemy.orm.base import NO_VALUE\n\nfrom warehouse import db\nfrom warehouse.accounts.models import User, Email\nfrom warehouse.cache.origin import key_factory, receive_set\nfrom warehouse.packaging.interfaces import IFileStorage\nfrom warehouse.packaging.models import Project, Release\nfrom warehouse.packaging.tasks import compute_trending\n\n\[email protected]_for(User.name, 'set')\ndef user_name_receive_set(config, target, value, oldvalue, initiator):\n if oldvalue is not NO_VALUE:\n receive_set(User.name, config, target)\n\n\[email protected]_for(Email.primary, 'set')\ndef email_primary_receive_set(config, target, value, oldvalue, initiator):\n if oldvalue is not NO_VALUE:\n receive_set(Email.primary, config, target)\n\n\ndef includeme(config):\n # Register whatever file storage backend has been configured for storing\n # our package files.\n storage_class = config.maybe_dotted(\n config.registry.settings[\"files.backend\"],\n )\n config.register_service_factory(storage_class.create_service, IFileStorage)\n\n # Register our origin cache keys\n config.register_origin_cache_keys(\n Project,\n cache_keys=[\"project/{obj.normalized_name}\"],\n purge_keys=[\n key_factory(\"project/{obj.normalized_name}\"),\n key_factory(\"user/{itr.username}\", iterate_on='users'),\n key_factory(\"all-projects\"),\n ],\n )\n config.register_origin_cache_keys(\n Release,\n cache_keys=[\"project/{obj.project.normalized_name}\"],\n purge_keys=[\n key_factory(\"project/{obj.project.normalized_name}\"),\n key_factory(\"user/{itr.username}\", iterate_on='project.users'),\n key_factory(\"all-projects\"),\n ],\n )\n config.register_origin_cache_keys(\n User,\n cache_keys=[\"user/{obj.username}\"],\n )\n config.register_origin_cache_keys(\n User.name,\n 
purge_keys=[\n key_factory(\"user/{obj.username}\"),\n key_factory(\"project/{itr.normalized_name}\", iterate_on='projects')\n ],\n )\n config.register_origin_cache_keys(\n Email.primary,\n purge_keys=[\n key_factory(\"user/{obj.user.username}\"),\n key_factory(\n \"project/{itr.normalized_name}\",\n iterate_on='user.projects',\n )\n ],\n )\n\n # Add a periodic task to compute trending once a day, assuming we have\n # been configured to be able to access BigQuery.\n if config.get_settings().get(\"warehouse.trending_table\"):\n config.add_periodic_task(crontab(minute=0, hour=3), compute_trending)\n", "path": "warehouse/packaging/__init__.py"}], "after_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom celery.schedules import crontab\nfrom sqlalchemy.orm.base import NO_VALUE\n\nfrom warehouse import db\nfrom warehouse.accounts.models import User, Email\nfrom warehouse.cache.origin import key_factory, receive_set\nfrom warehouse.packaging.interfaces import IFileStorage\nfrom warehouse.packaging.models import Project, Release, Role\nfrom warehouse.packaging.tasks import compute_trending\n\n\[email protected]_for(User.name, 'set')\ndef user_name_receive_set(config, target, value, oldvalue, initiator):\n if oldvalue is not NO_VALUE:\n receive_set(User.name, config, target)\n\n\[email protected]_for(Email.primary, 'set')\ndef email_primary_receive_set(config, target, value, oldvalue, initiator):\n if oldvalue is not NO_VALUE:\n receive_set(Email.primary, config, target)\n\n\ndef includeme(config):\n # Register whatever file storage backend has been configured for storing\n # our package files.\n storage_class = config.maybe_dotted(\n config.registry.settings[\"files.backend\"],\n )\n config.register_service_factory(storage_class.create_service, IFileStorage)\n\n # Register our origin cache keys\n config.register_origin_cache_keys(\n Project,\n cache_keys=[\"project/{obj.normalized_name}\"],\n purge_keys=[\n key_factory(\"project/{obj.normalized_name}\"),\n key_factory(\"user/{itr.username}\", iterate_on='users'),\n key_factory(\"all-projects\"),\n ],\n )\n config.register_origin_cache_keys(\n Release,\n cache_keys=[\"project/{obj.project.normalized_name}\"],\n purge_keys=[\n key_factory(\"project/{obj.project.normalized_name}\"),\n key_factory(\"user/{itr.username}\", iterate_on='project.users'),\n key_factory(\"all-projects\"),\n ],\n )\n config.register_origin_cache_keys(\n Role,\n purge_keys=[\n key_factory(\"user/{obj.user.username}\"),\n key_factory(\"project/{obj.project.normalized_name}\")\n ],\n )\n config.register_origin_cache_keys(\n User,\n cache_keys=[\"user/{obj.username}\"],\n )\n config.register_origin_cache_keys(\n User.name,\n purge_keys=[\n key_factory(\"user/{obj.username}\"),\n key_factory(\"project/{itr.normalized_name}\", iterate_on='projects')\n ],\n )\n config.register_origin_cache_keys(\n Email.primary,\n purge_keys=[\n key_factory(\"user/{obj.user.username}\"),\n key_factory(\n \"project/{itr.normalized_name}\",\n iterate_on='user.projects',\n )\n 
],\n )\n\n # Add a periodic task to compute trending once a day, assuming we have\n # been configured to be able to access BigQuery.\n if config.get_settings().get(\"warehouse.trending_table\"):\n config.add_periodic_task(crontab(minute=0, hour=3), compute_trending)\n", "path": "warehouse/packaging/__init__.py"}]} | 1,161 | 216 |
gh_patches_debug_21636 | rasdani/github-patches | git_diff | cloudtools__troposphere-1775 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add AdvancedSecurityOptions property to AWS ElasticSearch Domain
**Description:**
AWS Elasticsearch now supports fine-grained access control via CloudFormation. We need to add AdvancedSecurityOptions and MasterUserOptions to the AWS::Elasticsearch::Domain object to enable this new functionality.
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticsearch-domain.html#cfn-elasticsearch-domain-advancedsecurityoptions
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/elasticsearch.py`
Content:
```
1 # Copyright (c) 2012-2015, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6 from . import AWSProperty, AWSObject, Tags
7 from .compat import policytypes
8 from .validators import boolean, integer, integer_range, positive_integer
9
10 VALID_VOLUME_TYPES = ('standard', 'gp2', 'io1')
11
12
13 def validate_volume_type(volume_type):
14 """Validate VolumeType for ElasticsearchDomain"""
15 if volume_type not in VALID_VOLUME_TYPES:
16 raise ValueError("Elasticsearch Domain VolumeType must be one of: %s" %
17 ", ".join(VALID_VOLUME_TYPES))
18 return volume_type
19
20
21 class CognitoOptions(AWSProperty):
22 props = {
23 'Enabled': (boolean, False),
24 'IdentityPoolId': (basestring, False),
25 'RoleArn': (basestring, False),
26 'UserPoolId': (basestring, False),
27 }
28
29
30 class EBSOptions(AWSProperty):
31 props = {
32 'EBSEnabled': (boolean, False),
33 'Iops': (positive_integer, False),
34 'VolumeSize': (integer, False),
35 'VolumeType': (validate_volume_type, False)
36 }
37
38 def validate(self):
39 volume_type = self.properties.get('VolumeType')
40 iops = self.properties.get('Iops')
41 if volume_type == 'io1' and not iops:
42 raise ValueError("Must specify Iops if VolumeType is 'io1'.")
43
44
45 class ZoneAwarenessConfig(AWSProperty):
46 props = {
47 'AvailabilityZoneCount': (integer, False),
48 }
49
50
51 class ElasticsearchClusterConfig(AWSProperty):
52 props = {
53 'DedicatedMasterCount': (integer, False),
54 'DedicatedMasterEnabled': (boolean, False),
55 'DedicatedMasterType': (basestring, False),
56 'InstanceCount': (integer, False),
57 'InstanceType': (basestring, False),
58 'ZoneAwarenessConfig': (ZoneAwarenessConfig, False),
59 'ZoneAwarenessEnabled': (boolean, False)
60 }
61
62
63 class EncryptionAtRestOptions(AWSProperty):
64 props = {
65 'Enabled': (boolean, False),
66 'KmsKeyId': (basestring, False),
67 }
68
69
70 class NodeToNodeEncryptionOptions(AWSProperty):
71 props = {
72 'Enabled': (boolean, False),
73 }
74
75
76 class SnapshotOptions(AWSProperty):
77 props = {
78 'AutomatedSnapshotStartHour': (integer_range(0, 23), False)
79 }
80
81
82 class VPCOptions(AWSProperty):
83 props = {
84 "SecurityGroupIds": ([basestring], False),
85 "SubnetIds": ([basestring], False)
86 }
87
88
89 class Domain(AWSObject):
90 resource_type = "AWS::Elasticsearch::Domain"
91
92 props = {
93 'AccessPolicies': (policytypes, False),
94 'AdvancedOptions': (dict, False),
95 'CognitoOptions': (CognitoOptions, False),
96 'DomainName': (basestring, False),
97 'EBSOptions': (EBSOptions, False),
98 'ElasticsearchClusterConfig': (ElasticsearchClusterConfig, False),
99 'ElasticsearchVersion': (basestring, False),
100 'EncryptionAtRestOptions': (EncryptionAtRestOptions, False),
101 'LogPublishingOptions': (dict, False),
102 'NodeToNodeEncryptionOptions': (NodeToNodeEncryptionOptions, False),
103 'SnapshotOptions': (SnapshotOptions, False),
104 'Tags': ((Tags, list), False),
105 'VPCOptions': (VPCOptions, False),
106 }
107
108
109 # Backward compatibility
110 ElasticsearchDomain = Domain
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/elasticsearch.py b/troposphere/elasticsearch.py
--- a/troposphere/elasticsearch.py
+++ b/troposphere/elasticsearch.py
@@ -81,8 +81,24 @@
class VPCOptions(AWSProperty):
props = {
- "SecurityGroupIds": ([basestring], False),
- "SubnetIds": ([basestring], False)
+ 'SecurityGroupIds': ([basestring], False),
+ 'SubnetIds': ([basestring], False)
+ }
+
+
+class MasterUserOptions(AWSProperty):
+ props = {
+ 'MasterUserARN': (basestring, False),
+ 'MasterUserName': (basestring, False),
+ 'MasterUserPassword': (basestring, False),
+ }
+
+
+class AdvancedSecurityOptionsInput(AWSProperty):
+ props = {
+ 'Enabled': (boolean, False),
+ 'InternalUserDatabaseEnabled': (boolean, False),
+ 'MasterUserOptions': (MasterUserOptions, False),
}
@@ -92,6 +108,7 @@
props = {
'AccessPolicies': (policytypes, False),
'AdvancedOptions': (dict, False),
+ 'AdvancedSecurityOptions': (AdvancedSecurityOptionsInput, False),
'CognitoOptions': (CognitoOptions, False),
'DomainName': (basestring, False),
'EBSOptions': (EBSOptions, False),
| {"golden_diff": "diff --git a/troposphere/elasticsearch.py b/troposphere/elasticsearch.py\n--- a/troposphere/elasticsearch.py\n+++ b/troposphere/elasticsearch.py\n@@ -81,8 +81,24 @@\n \n class VPCOptions(AWSProperty):\n props = {\n- \"SecurityGroupIds\": ([basestring], False),\n- \"SubnetIds\": ([basestring], False)\n+ 'SecurityGroupIds': ([basestring], False),\n+ 'SubnetIds': ([basestring], False)\n+ }\n+\n+\n+class MasterUserOptions(AWSProperty):\n+ props = {\n+ 'MasterUserARN': (basestring, False),\n+ 'MasterUserName': (basestring, False),\n+ 'MasterUserPassword': (basestring, False),\n+ }\n+\n+\n+class AdvancedSecurityOptionsInput(AWSProperty):\n+ props = {\n+ 'Enabled': (boolean, False),\n+ 'InternalUserDatabaseEnabled': (boolean, False),\n+ 'MasterUserOptions': (MasterUserOptions, False),\n }\n \n \n@@ -92,6 +108,7 @@\n props = {\n 'AccessPolicies': (policytypes, False),\n 'AdvancedOptions': (dict, False),\n+ 'AdvancedSecurityOptions': (AdvancedSecurityOptionsInput, False),\n 'CognitoOptions': (CognitoOptions, False),\n 'DomainName': (basestring, False),\n 'EBSOptions': (EBSOptions, False),\n", "issue": "Add AdvancedSecurityOptions property to AWS ElasticSearch Domain\n**Description:**\r\n\r\nAWS Elasticsearch now supports fine-grained access control with Cloudformation. Need to add AdvancedSecurityOptions and MasterUserOptions to AWS::Elasticsearch::Domain object to enable this new functionality. \r\n\r\nhttps://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-elasticsearch-domain.html#cfn-elasticsearch-domain-advancedsecurityoptions\n", "before_files": [{"content": "# Copyright (c) 2012-2015, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSProperty, AWSObject, Tags\nfrom .compat import policytypes\nfrom .validators import boolean, integer, integer_range, positive_integer\n\nVALID_VOLUME_TYPES = ('standard', 'gp2', 'io1')\n\n\ndef validate_volume_type(volume_type):\n \"\"\"Validate VolumeType for ElasticsearchDomain\"\"\"\n if volume_type not in VALID_VOLUME_TYPES:\n raise ValueError(\"Elasticsearch Domain VolumeType must be one of: %s\" %\n \", \".join(VALID_VOLUME_TYPES))\n return volume_type\n\n\nclass CognitoOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'IdentityPoolId': (basestring, False),\n 'RoleArn': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass EBSOptions(AWSProperty):\n props = {\n 'EBSEnabled': (boolean, False),\n 'Iops': (positive_integer, False),\n 'VolumeSize': (integer, False),\n 'VolumeType': (validate_volume_type, False)\n }\n\n def validate(self):\n volume_type = self.properties.get('VolumeType')\n iops = self.properties.get('Iops')\n if volume_type == 'io1' and not iops:\n raise ValueError(\"Must specify Iops if VolumeType is 'io1'.\")\n\n\nclass ZoneAwarenessConfig(AWSProperty):\n props = {\n 'AvailabilityZoneCount': (integer, False),\n }\n\n\nclass ElasticsearchClusterConfig(AWSProperty):\n props = {\n 'DedicatedMasterCount': (integer, False),\n 'DedicatedMasterEnabled': (boolean, False),\n 'DedicatedMasterType': (basestring, False),\n 'InstanceCount': (integer, False),\n 'InstanceType': (basestring, False),\n 'ZoneAwarenessConfig': (ZoneAwarenessConfig, False),\n 'ZoneAwarenessEnabled': (boolean, False)\n }\n\n\nclass EncryptionAtRestOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'KmsKeyId': (basestring, False),\n }\n\n\nclass NodeToNodeEncryptionOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, 
False),\n }\n\n\nclass SnapshotOptions(AWSProperty):\n props = {\n 'AutomatedSnapshotStartHour': (integer_range(0, 23), False)\n }\n\n\nclass VPCOptions(AWSProperty):\n props = {\n \"SecurityGroupIds\": ([basestring], False),\n \"SubnetIds\": ([basestring], False)\n }\n\n\nclass Domain(AWSObject):\n resource_type = \"AWS::Elasticsearch::Domain\"\n\n props = {\n 'AccessPolicies': (policytypes, False),\n 'AdvancedOptions': (dict, False),\n 'CognitoOptions': (CognitoOptions, False),\n 'DomainName': (basestring, False),\n 'EBSOptions': (EBSOptions, False),\n 'ElasticsearchClusterConfig': (ElasticsearchClusterConfig, False),\n 'ElasticsearchVersion': (basestring, False),\n 'EncryptionAtRestOptions': (EncryptionAtRestOptions, False),\n 'LogPublishingOptions': (dict, False),\n 'NodeToNodeEncryptionOptions': (NodeToNodeEncryptionOptions, False),\n 'SnapshotOptions': (SnapshotOptions, False),\n 'Tags': ((Tags, list), False),\n 'VPCOptions': (VPCOptions, False),\n }\n\n\n# Backward compatibility\nElasticsearchDomain = Domain\n", "path": "troposphere/elasticsearch.py"}], "after_files": [{"content": "# Copyright (c) 2012-2015, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSProperty, AWSObject, Tags\nfrom .compat import policytypes\nfrom .validators import boolean, integer, integer_range, positive_integer\n\nVALID_VOLUME_TYPES = ('standard', 'gp2', 'io1')\n\n\ndef validate_volume_type(volume_type):\n \"\"\"Validate VolumeType for ElasticsearchDomain\"\"\"\n if volume_type not in VALID_VOLUME_TYPES:\n raise ValueError(\"Elasticsearch Domain VolumeType must be one of: %s\" %\n \", \".join(VALID_VOLUME_TYPES))\n return volume_type\n\n\nclass CognitoOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'IdentityPoolId': (basestring, False),\n 'RoleArn': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass EBSOptions(AWSProperty):\n props = {\n 'EBSEnabled': (boolean, False),\n 'Iops': (positive_integer, False),\n 'VolumeSize': (integer, False),\n 'VolumeType': (validate_volume_type, False)\n }\n\n def validate(self):\n volume_type = self.properties.get('VolumeType')\n iops = self.properties.get('Iops')\n if volume_type == 'io1' and not iops:\n raise ValueError(\"Must specify Iops if VolumeType is 'io1'.\")\n\n\nclass ZoneAwarenessConfig(AWSProperty):\n props = {\n 'AvailabilityZoneCount': (integer, False),\n }\n\n\nclass ElasticsearchClusterConfig(AWSProperty):\n props = {\n 'DedicatedMasterCount': (integer, False),\n 'DedicatedMasterEnabled': (boolean, False),\n 'DedicatedMasterType': (basestring, False),\n 'InstanceCount': (integer, False),\n 'InstanceType': (basestring, False),\n 'ZoneAwarenessConfig': (ZoneAwarenessConfig, False),\n 'ZoneAwarenessEnabled': (boolean, False)\n }\n\n\nclass EncryptionAtRestOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'KmsKeyId': (basestring, False),\n }\n\n\nclass NodeToNodeEncryptionOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n }\n\n\nclass SnapshotOptions(AWSProperty):\n props = {\n 'AutomatedSnapshotStartHour': (integer_range(0, 23), False)\n }\n\n\nclass VPCOptions(AWSProperty):\n props = {\n 'SecurityGroupIds': ([basestring], False),\n 'SubnetIds': ([basestring], False)\n }\n\n\nclass MasterUserOptions(AWSProperty):\n props = {\n 'MasterUserARN': (basestring, False),\n 'MasterUserName': (basestring, False),\n 'MasterUserPassword': (basestring, False),\n }\n\n\nclass AdvancedSecurityOptionsInput(AWSProperty):\n props = {\n 
'Enabled': (boolean, False),\n 'InternalUserDatabaseEnabled': (boolean, False),\n 'MasterUserOptions': (MasterUserOptions, False),\n }\n\n\nclass Domain(AWSObject):\n resource_type = \"AWS::Elasticsearch::Domain\"\n\n props = {\n 'AccessPolicies': (policytypes, False),\n 'AdvancedOptions': (dict, False),\n 'AdvancedSecurityOptions': (AdvancedSecurityOptionsInput, False),\n 'CognitoOptions': (CognitoOptions, False),\n 'DomainName': (basestring, False),\n 'EBSOptions': (EBSOptions, False),\n 'ElasticsearchClusterConfig': (ElasticsearchClusterConfig, False),\n 'ElasticsearchVersion': (basestring, False),\n 'EncryptionAtRestOptions': (EncryptionAtRestOptions, False),\n 'LogPublishingOptions': (dict, False),\n 'NodeToNodeEncryptionOptions': (NodeToNodeEncryptionOptions, False),\n 'SnapshotOptions': (SnapshotOptions, False),\n 'Tags': ((Tags, list), False),\n 'VPCOptions': (VPCOptions, False),\n }\n\n\n# Backward compatibility\nElasticsearchDomain = Domain\n", "path": "troposphere/elasticsearch.py"}]} | 1,376 | 322 |
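For context on how the classes introduced in the patch above would be consumed, here is a minimal usage sketch; the class and property names come from the diff in that row, while the domain name and credentials are placeholder values:

```python
from troposphere.elasticsearch import (
    AdvancedSecurityOptionsInput,
    Domain,
    MasterUserOptions,
)

# Hypothetical domain with fine-grained access control enabled.
domain = Domain(
    "SearchDomain",
    DomainName="example-domain",
    ElasticsearchVersion="7.1",
    AdvancedSecurityOptions=AdvancedSecurityOptionsInput(
        Enabled=True,
        InternalUserDatabaseEnabled=True,
        MasterUserOptions=MasterUserOptions(
            MasterUserName="admin",          # placeholder credential
            MasterUserPassword="change-me",  # placeholder credential
        ),
    ),
)
```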
gh_patches_debug_39486 | rasdani/github-patches | git_diff | PlasmaPy__PlasmaPy-491 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve project description on PyPI
The [project description](https://pypi.org/project/plasmapy/#description) is currently bland. We should detail some functionality and the problems the project is trying to address, and mention the OpenAstronomy affiliation.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # Licensed under a 3-clause BSD style license - see LICENSE.rst
3
4 import glob
5 import os
6 import sys
7
8 # Enforce Python version check - this is the same check as in __init__.py but
9 # this one has to happen before importing ah_bootstrap.
10 if sys.version_info < tuple((int(val) for val in "3.6".split('.'))):
11 sys.stderr.write("ERROR: plasmapy requires Python {} or later\n".format(3.6))
12 sys.exit(1)
13
14 import ah_bootstrap
15 from setuptools import setup
16
17 # A dirty hack to get around some early import/configurations ambiguities
18 if sys.version_info[0] >= 3:
19 import builtins
20 else:
21 import __builtin__ as builtins
22 builtins._ASTROPY_SETUP_ = True
23
24 from astropy_helpers.setup_helpers import (register_commands, get_debug_option,
25 get_package_info)
26 from astropy_helpers.git_helpers import get_git_devstr
27 from astropy_helpers.version_helpers import generate_version_py
28
29 # Get some values from the setup.cfg
30 try:
31 from ConfigParser import ConfigParser
32 except ImportError:
33 from configparser import ConfigParser
34
35 conf = ConfigParser()
36 conf.read(['setup.cfg'])
37 metadata = dict(conf.items('metadata'))
38
39 PACKAGENAME = metadata.get('package_name', 'plasmapy')
40 DESCRIPTION = metadata.get('description', 'plasmapy')
41 AUTHOR = metadata.get('author', 'PlasmaPy Developers')
42 AUTHOR_EMAIL = metadata.get('author_email', '')
43 LICENSE = metadata.get('license', 'unknown')
44 URL = metadata.get('url', 'http://plasmapy.org')
45
46 # order of priority for long_description:
47 # (1) set in setup.cfg,
48 # (2) load LONG_DESCRIPTION.rst,
49 # (3) load README.rst,
50 # (4) package docstring
51 readme_glob = 'README*'
52 _cfg_long_description = metadata.get('long_description', '')
53 if _cfg_long_description:
54 LONG_DESCRIPTION = _cfg_long_description
55
56 elif os.path.exists('LONG_DESCRIPTION.rst'):
57 with open('LONG_DESCRIPTION.rst') as f:
58 LONG_DESCRIPTION = f.read()
59
60 elif len(glob.glob(readme_glob)) > 0:
61 with open(glob.glob(readme_glob)[0]) as f:
62 LONG_DESCRIPTION = f.read()
63
64 else:
65 # Get the long description from the package's docstring
66 __import__(PACKAGENAME)
67 package = sys.modules[PACKAGENAME]
68 LONG_DESCRIPTION = package.__doc__
69
70 # Store the package name in a built-in variable so it's easy
71 # to get from other parts of the setup infrastructure
72 builtins._ASTROPY_PACKAGE_NAME_ = PACKAGENAME
73
74 # VERSION should be PEP440 compatible (http://www.python.org/dev/peps/pep-0440)
75 VERSION = metadata.get('version', '0.0.dev0')
76
77 # Indicates if this version is a release version
78 RELEASE = 'dev' not in VERSION
79
80 if not RELEASE:
81 VERSION += get_git_devstr(False)
82
83 # Populate the dict of setup command overrides; this should be done before
84 # invoking any other functionality from distutils since it can potentially
85 # modify distutils' behavior.
86 cmdclassd = register_commands(PACKAGENAME, VERSION, RELEASE)
87
88 # Freeze build information in version.py
89 generate_version_py(PACKAGENAME, VERSION, RELEASE,
90 get_debug_option(PACKAGENAME))
91
92 # Treat everything in scripts except README* as a script to be installed
93 scripts = [fname for fname in glob.glob(os.path.join('scripts', '*'))
94 if not os.path.basename(fname).startswith('README')]
95
96
97 # Get configuration information from all of the various subpackages.
98 # See the docstring for setup_helpers.update_package_files for more
99 # details.
100 package_info = get_package_info()
101
102 # Add the project-global data
103 package_info['package_data'].setdefault(PACKAGENAME, [])
104 package_info['package_data'][PACKAGENAME].append('data/*')
105
106 # Define entry points for command-line scripts
107 entry_points = {'console_scripts': []}
108
109 if conf.has_section('entry_points'):
110 entry_point_list = conf.items('entry_points')
111 for entry_point in entry_point_list:
112 entry_points['console_scripts'].append('{0} = {1}'.format(
113 entry_point[0], entry_point[1]))
114
115 # Include all .c files, recursively, including those generated by
116 # Cython, since we can not do this in MANIFEST.in with a "dynamic"
117 # directory name.
118 c_files = []
119 for root, dirs, files in os.walk(PACKAGENAME):
120 for filename in files:
121 if filename.endswith('.c'):
122 c_files.append(
123 os.path.join(
124 os.path.relpath(root, PACKAGENAME), filename))
125 package_info['package_data'][PACKAGENAME].extend(c_files)
126
127 # Note that requires and provides should not be included in the call to
128 # ``setup``, since these are now deprecated. See this link for more details:
129 # https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM
130
131 setup(name=PACKAGENAME,
132 version=VERSION,
133 description=DESCRIPTION,
134 scripts=scripts,
135 setup_requires=metadata.get("setup_requires", None),
136 install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],
137 author=AUTHOR,
138 author_email=AUTHOR_EMAIL,
139 license=LICENSE,
140 url=URL,
141 long_description=LONG_DESCRIPTION,
142 cmdclass=cmdclassd,
143 zip_safe=False,
144 use_2to3=False,
145 include_package_data=True,
146 entry_points=entry_points,
147 python_requires='>={}'.format("3.6"),
148 **package_info
149 )
150
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -23,6 +23,7 @@
from astropy_helpers.setup_helpers import (register_commands, get_debug_option,
get_package_info)
+from astropy_helpers.distutils_helpers import is_distutils_display_option
from astropy_helpers.git_helpers import get_git_devstr
from astropy_helpers.version_helpers import generate_version_py
@@ -124,6 +125,18 @@
os.path.relpath(root, PACKAGENAME), filename))
package_info['package_data'][PACKAGENAME].extend(c_files)
+setup_requires = ['numpy']
+
+# Make sure to have the packages needed for building PlasmaPy, but do not require them
+# when installing from an sdist as the c files are included there.
+if not os.path.exists(os.path.join(os.path.dirname(__file__), 'PKG-INFO')):
+ setup_requires.extend(['cython>=0.27.2'])
+
+# Avoid installing setup_requires dependencies if the user just
+# queries for information
+if is_distutils_display_option():
+ setup_requires = []
+
# Note that requires and provides should not be included in the call to
# ``setup``, since these are now deprecated. See this link for more details:
# https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM
@@ -132,18 +145,33 @@
version=VERSION,
description=DESCRIPTION,
scripts=scripts,
- setup_requires=metadata.get("setup_requires", None),
+ setup_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],
install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],
author=AUTHOR,
author_email=AUTHOR_EMAIL,
license=LICENSE,
url=URL,
long_description=LONG_DESCRIPTION,
+ keywords=['plasma', 'physics', 'transport', 'collisions', 'science',
+ 'atomic', 'particle', 'simulation', 'langmuir', 'tokamak',
+ 'instability', 'modeling'],
+ classifiers=[
+ 'Intended Audience :: Science/Research',
+ 'License :: OSI Approved :: BSD-2-Clause-Patent',
+ 'Operating System :: OS Independent',
+ 'Programming Language :: C',
+ 'Programming Language :: Cython',
+ 'Programming Language :: Python :: 3',
+ 'Programming Language :: Python :: Implementation :: CPython',
+ 'Topic :: Scientific/Engineering :: Astronomy',
+ 'Topic :: Scientific/Engineering :: Physics'
+ ],
cmdclass=cmdclassd,
zip_safe=False,
use_2to3=False,
include_package_data=True,
entry_points=entry_points,
python_requires='>={}'.format("3.6"),
+ tests_require=["pytest", "pytest-astropy"],
**package_info
)
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -23,6 +23,7 @@\n \n from astropy_helpers.setup_helpers import (register_commands, get_debug_option,\n get_package_info)\n+from astropy_helpers.distutils_helpers import is_distutils_display_option\n from astropy_helpers.git_helpers import get_git_devstr\n from astropy_helpers.version_helpers import generate_version_py\n \n@@ -124,6 +125,18 @@\n os.path.relpath(root, PACKAGENAME), filename))\n package_info['package_data'][PACKAGENAME].extend(c_files)\n \n+setup_requires = ['numpy']\n+\n+# Make sure to have the packages needed for building PlasmaPy, but do not require them\n+# when installing from an sdist as the c files are included there.\n+if not os.path.exists(os.path.join(os.path.dirname(__file__), 'PKG-INFO')):\n+ setup_requires.extend(['cython>=0.27.2'])\n+\n+# Avoid installing setup_requires dependencies if the user just\n+# queries for information\n+if is_distutils_display_option():\n+ setup_requires = []\n+\n # Note that requires and provides should not be included in the call to\n # ``setup``, since these are now deprecated. See this link for more details:\n # https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM\n@@ -132,18 +145,33 @@\n version=VERSION,\n description=DESCRIPTION,\n scripts=scripts,\n- setup_requires=metadata.get(\"setup_requires\", None),\n+ setup_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],\n install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],\n author=AUTHOR,\n author_email=AUTHOR_EMAIL,\n license=LICENSE,\n url=URL,\n long_description=LONG_DESCRIPTION,\n+ keywords=['plasma', 'physics', 'transport', 'collisions', 'science',\n+ 'atomic', 'particle', 'simulation', 'langmuir', 'tokamak',\n+ 'instability', 'modeling'],\n+ classifiers=[\n+ 'Intended Audience :: Science/Research',\n+ 'License :: OSI Approved :: BSD-2-Clause-Patent',\n+ 'Operating System :: OS Independent',\n+ 'Programming Language :: C',\n+ 'Programming Language :: Cython',\n+ 'Programming Language :: Python :: 3',\n+ 'Programming Language :: Python :: Implementation :: CPython',\n+ 'Topic :: Scientific/Engineering :: Astronomy',\n+ 'Topic :: Scientific/Engineering :: Physics'\n+ ],\n cmdclass=cmdclassd,\n zip_safe=False,\n use_2to3=False,\n include_package_data=True,\n entry_points=entry_points,\n python_requires='>={}'.format(\"3.6\"),\n+ tests_require=[\"pytest\", \"pytest-astropy\"],\n **package_info\n )\n", "issue": "Improve project description on PyPI\n[Project description](https://pypi.org/project/plasmapy/#description) is currently bland. 
We should detail some functionality, problems that the project is trying to address, mention openAstronomy affiliation, etc.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# Licensed under a 3-clause BSD style license - see LICENSE.rst\n\nimport glob\nimport os\nimport sys\n\n# Enforce Python version check - this is the same check as in __init__.py but\n# this one has to happen before importing ah_bootstrap.\nif sys.version_info < tuple((int(val) for val in \"3.6\".split('.'))):\n sys.stderr.write(\"ERROR: plasmapy requires Python {} or later\\n\".format(3.6))\n sys.exit(1)\n\nimport ah_bootstrap\nfrom setuptools import setup\n\n# A dirty hack to get around some early import/configurations ambiguities\nif sys.version_info[0] >= 3:\n import builtins\nelse:\n import __builtin__ as builtins\nbuiltins._ASTROPY_SETUP_ = True\n\nfrom astropy_helpers.setup_helpers import (register_commands, get_debug_option,\n get_package_info)\nfrom astropy_helpers.git_helpers import get_git_devstr\nfrom astropy_helpers.version_helpers import generate_version_py\n\n# Get some values from the setup.cfg\ntry:\n from ConfigParser import ConfigParser\nexcept ImportError:\n from configparser import ConfigParser\n\nconf = ConfigParser()\nconf.read(['setup.cfg'])\nmetadata = dict(conf.items('metadata'))\n\nPACKAGENAME = metadata.get('package_name', 'plasmapy')\nDESCRIPTION = metadata.get('description', 'plasmapy')\nAUTHOR = metadata.get('author', 'PlasmaPy Developers')\nAUTHOR_EMAIL = metadata.get('author_email', '')\nLICENSE = metadata.get('license', 'unknown')\nURL = metadata.get('url', 'http://plasmapy.org')\n\n# order of priority for long_description:\n# (1) set in setup.cfg,\n# (2) load LONG_DESCRIPTION.rst,\n# (3) load README.rst,\n# (4) package docstring\nreadme_glob = 'README*'\n_cfg_long_description = metadata.get('long_description', '')\nif _cfg_long_description:\n LONG_DESCRIPTION = _cfg_long_description\n\nelif os.path.exists('LONG_DESCRIPTION.rst'):\n with open('LONG_DESCRIPTION.rst') as f:\n LONG_DESCRIPTION = f.read()\n\nelif len(glob.glob(readme_glob)) > 0:\n with open(glob.glob(readme_glob)[0]) as f:\n LONG_DESCRIPTION = f.read()\n\nelse:\n # Get the long description from the package's docstring\n __import__(PACKAGENAME)\n package = sys.modules[PACKAGENAME]\n LONG_DESCRIPTION = package.__doc__\n\n# Store the package name in a built-in variable so it's easy\n# to get from other parts of the setup infrastructure\nbuiltins._ASTROPY_PACKAGE_NAME_ = PACKAGENAME\n\n# VERSION should be PEP440 compatible (http://www.python.org/dev/peps/pep-0440)\nVERSION = metadata.get('version', '0.0.dev0')\n\n# Indicates if this version is a release version\nRELEASE = 'dev' not in VERSION\n\nif not RELEASE:\n VERSION += get_git_devstr(False)\n\n# Populate the dict of setup command overrides; this should be done before\n# invoking any other functionality from distutils since it can potentially\n# modify distutils' behavior.\ncmdclassd = register_commands(PACKAGENAME, VERSION, RELEASE)\n\n# Freeze build information in version.py\ngenerate_version_py(PACKAGENAME, VERSION, RELEASE,\n get_debug_option(PACKAGENAME))\n\n# Treat everything in scripts except README* as a script to be installed\nscripts = [fname for fname in glob.glob(os.path.join('scripts', '*'))\n if not os.path.basename(fname).startswith('README')]\n\n\n# Get configuration information from all of the various subpackages.\n# See the docstring for setup_helpers.update_package_files for more\n# details.\npackage_info = get_package_info()\n\n# Add the 
project-global data\npackage_info['package_data'].setdefault(PACKAGENAME, [])\npackage_info['package_data'][PACKAGENAME].append('data/*')\n\n# Define entry points for command-line scripts\nentry_points = {'console_scripts': []}\n\nif conf.has_section('entry_points'):\n entry_point_list = conf.items('entry_points')\n for entry_point in entry_point_list:\n entry_points['console_scripts'].append('{0} = {1}'.format(\n entry_point[0], entry_point[1]))\n\n# Include all .c files, recursively, including those generated by\n# Cython, since we can not do this in MANIFEST.in with a \"dynamic\"\n# directory name.\nc_files = []\nfor root, dirs, files in os.walk(PACKAGENAME):\n for filename in files:\n if filename.endswith('.c'):\n c_files.append(\n os.path.join(\n os.path.relpath(root, PACKAGENAME), filename))\npackage_info['package_data'][PACKAGENAME].extend(c_files)\n\n# Note that requires and provides should not be included in the call to\n# ``setup``, since these are now deprecated. See this link for more details:\n# https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM\n\nsetup(name=PACKAGENAME,\n version=VERSION,\n description=DESCRIPTION,\n scripts=scripts,\n setup_requires=metadata.get(\"setup_requires\", None),\n install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],\n author=AUTHOR,\n author_email=AUTHOR_EMAIL,\n license=LICENSE,\n url=URL,\n long_description=LONG_DESCRIPTION,\n cmdclass=cmdclassd,\n zip_safe=False,\n use_2to3=False,\n include_package_data=True,\n entry_points=entry_points,\n python_requires='>={}'.format(\"3.6\"),\n **package_info\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# Licensed under a 3-clause BSD style license - see LICENSE.rst\n\nimport glob\nimport os\nimport sys\n\n# Enforce Python version check - this is the same check as in __init__.py but\n# this one has to happen before importing ah_bootstrap.\nif sys.version_info < tuple((int(val) for val in \"3.6\".split('.'))):\n sys.stderr.write(\"ERROR: plasmapy requires Python {} or later\\n\".format(3.6))\n sys.exit(1)\n\nimport ah_bootstrap\nfrom setuptools import setup\n\n# A dirty hack to get around some early import/configurations ambiguities\nif sys.version_info[0] >= 3:\n import builtins\nelse:\n import __builtin__ as builtins\nbuiltins._ASTROPY_SETUP_ = True\n\nfrom astropy_helpers.setup_helpers import (register_commands, get_debug_option,\n get_package_info)\nfrom astropy_helpers.distutils_helpers import is_distutils_display_option\nfrom astropy_helpers.git_helpers import get_git_devstr\nfrom astropy_helpers.version_helpers import generate_version_py\n\n# Get some values from the setup.cfg\ntry:\n from ConfigParser import ConfigParser\nexcept ImportError:\n from configparser import ConfigParser\n\nconf = ConfigParser()\nconf.read(['setup.cfg'])\nmetadata = dict(conf.items('metadata'))\n\nPACKAGENAME = metadata.get('package_name', 'plasmapy')\nDESCRIPTION = metadata.get('description', 'plasmapy')\nAUTHOR = metadata.get('author', 'PlasmaPy Developers')\nAUTHOR_EMAIL = metadata.get('author_email', '')\nLICENSE = metadata.get('license', 'unknown')\nURL = metadata.get('url', 'http://plasmapy.org')\n\n# order of priority for long_description:\n# (1) set in setup.cfg,\n# (2) load LONG_DESCRIPTION.rst,\n# (3) load README.rst,\n# (4) package docstring\nreadme_glob = 'README*'\n_cfg_long_description = metadata.get('long_description', '')\nif _cfg_long_description:\n LONG_DESCRIPTION = _cfg_long_description\n\nelif 
os.path.exists('LONG_DESCRIPTION.rst'):\n with open('LONG_DESCRIPTION.rst') as f:\n LONG_DESCRIPTION = f.read()\n\nelif len(glob.glob(readme_glob)) > 0:\n with open(glob.glob(readme_glob)[0]) as f:\n LONG_DESCRIPTION = f.read()\n\nelse:\n # Get the long description from the package's docstring\n __import__(PACKAGENAME)\n package = sys.modules[PACKAGENAME]\n LONG_DESCRIPTION = package.__doc__\n\n# Store the package name in a built-in variable so it's easy\n# to get from other parts of the setup infrastructure\nbuiltins._ASTROPY_PACKAGE_NAME_ = PACKAGENAME\n\n# VERSION should be PEP440 compatible (http://www.python.org/dev/peps/pep-0440)\nVERSION = metadata.get('version', '0.0.dev0')\n\n# Indicates if this version is a release version\nRELEASE = 'dev' not in VERSION\n\nif not RELEASE:\n VERSION += get_git_devstr(False)\n\n# Populate the dict of setup command overrides; this should be done before\n# invoking any other functionality from distutils since it can potentially\n# modify distutils' behavior.\ncmdclassd = register_commands(PACKAGENAME, VERSION, RELEASE)\n\n# Freeze build information in version.py\ngenerate_version_py(PACKAGENAME, VERSION, RELEASE,\n get_debug_option(PACKAGENAME))\n\n# Treat everything in scripts except README* as a script to be installed\nscripts = [fname for fname in glob.glob(os.path.join('scripts', '*'))\n if not os.path.basename(fname).startswith('README')]\n\n\n# Get configuration information from all of the various subpackages.\n# See the docstring for setup_helpers.update_package_files for more\n# details.\npackage_info = get_package_info()\n\n# Add the project-global data\npackage_info['package_data'].setdefault(PACKAGENAME, [])\npackage_info['package_data'][PACKAGENAME].append('data/*')\n\n# Define entry points for command-line scripts\nentry_points = {'console_scripts': []}\n\nif conf.has_section('entry_points'):\n entry_point_list = conf.items('entry_points')\n for entry_point in entry_point_list:\n entry_points['console_scripts'].append('{0} = {1}'.format(\n entry_point[0], entry_point[1]))\n\n# Include all .c files, recursively, including those generated by\n# Cython, since we can not do this in MANIFEST.in with a \"dynamic\"\n# directory name.\nc_files = []\nfor root, dirs, files in os.walk(PACKAGENAME):\n for filename in files:\n if filename.endswith('.c'):\n c_files.append(\n os.path.join(\n os.path.relpath(root, PACKAGENAME), filename))\npackage_info['package_data'][PACKAGENAME].extend(c_files)\n\nsetup_requires = ['numpy']\n\n# Make sure to have the packages needed for building PlasmaPy, but do not require them\n# when installing from an sdist as the c files are included there.\nif not os.path.exists(os.path.join(os.path.dirname(__file__), 'PKG-INFO')):\n setup_requires.extend(['cython>=0.27.2'])\n\n# Avoid installing setup_requires dependencies if the user just\n# queries for information\nif is_distutils_display_option():\n setup_requires = []\n\n# Note that requires and provides should not be included in the call to\n# ``setup``, since these are now deprecated. 
See this link for more details:\n# https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM\n\nsetup(name=PACKAGENAME,\n version=VERSION,\n description=DESCRIPTION,\n scripts=scripts,\n setup_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],\n install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')],\n author=AUTHOR,\n author_email=AUTHOR_EMAIL,\n license=LICENSE,\n url=URL,\n long_description=LONG_DESCRIPTION,\n keywords=['plasma', 'physics', 'transport', 'collisions', 'science',\n 'atomic', 'particle', 'simulation', 'langmuir', 'tokamak',\n 'instability', 'modeling'],\n classifiers=[\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: BSD-2-Clause-Patent',\n 'Operating System :: OS Independent',\n 'Programming Language :: C',\n 'Programming Language :: Cython',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Topic :: Scientific/Engineering :: Astronomy',\n 'Topic :: Scientific/Engineering :: Physics'\n ],\n cmdclass=cmdclassd,\n zip_safe=False,\n use_2to3=False,\n include_package_data=True,\n entry_points=entry_points,\n python_requires='>={}'.format(\"3.6\"),\n tests_require=[\"pytest\", \"pytest-astropy\"],\n **package_info\n)\n", "path": "setup.py"}]} | 1,899 | 645 |
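The most reusable piece of the patch above is the conditional `setup_requires` logic; a condensed sketch of just that part (it assumes `astropy_helpers` is importable at setup time, as in the patched file):

```python
import os

from astropy_helpers.distutils_helpers import is_distutils_display_option

setup_requires = ["numpy"]

# Require Cython only when building from a source checkout; an sdist already
# ships the generated .c files and is marked by the presence of PKG-INFO.
if not os.path.exists(os.path.join(os.path.dirname(__file__), "PKG-INFO")):
    setup_requires.append("cython>=0.27.2")

# Informational commands such as ``python setup.py --version`` should not
# trigger installation of build dependencies.
if is_distutils_display_option():
    setup_requires = []
```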
gh_patches_debug_25159 | rasdani/github-patches | git_diff | mlflow__mlflow-9258 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Typo fix
https://github.com/mlflow/mlflow/blob/9724c83bd8f0100c465e68e30651a9727de42ce0/dev/show_package_release_dates.py#L49
`package_legnth` -> `package_length`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dev/show_package_release_dates.py`
Content:
```
1 import os
2 import json
3 import sys
4 import subprocess
5 import requests
6 from concurrent.futures import ThreadPoolExecutor
7 import traceback
8
9
10 def get_distributions():
11 res = subprocess.check_output(
12 [sys.executable, "-m", "pip", "list", "--format", "json"], text=True
13 )
14 return [(pkg["name"], pkg["version"]) for pkg in json.loads(res)]
15
16
17 def get_release_date(package, version):
18 resp = requests.get(f"https://pypi.python.org/pypi/{package}/json", timeout=10)
19 if not resp.ok:
20 return ""
21
22 matched = [dist_files for ver, dist_files in resp.json()["releases"].items() if ver == version]
23 if (not matched) or (not matched[0]):
24 return ""
25
26 upload_time = matched[0][0]["upload_time"]
27 return upload_time.split("T")[0] # return year-month-day
28
29
30 def get_longest_string_length(array):
31 return len(max(array, key=len))
32
33
34 def safe_result(future, if_error=""):
35 try:
36 return future.result()
37 except Exception:
38 traceback.print_exc()
39 return if_error
40
41
42 def main():
43 distributions = get_distributions()
44 with ThreadPoolExecutor(max_workers=min(32, os.cpu_count() + 4)) as executor:
45 futures = [executor.submit(get_release_date, pkg, ver) for pkg, ver in distributions]
46 release_dates = [safe_result(f) for f in futures]
47
48 packages, versions = list(zip(*distributions))
49 package_legnth = get_longest_string_length(packages)
50 version_length = get_longest_string_length(versions)
51 release_date_length = len("Release Date")
52 print("Package".ljust(package_legnth), "Version".ljust(version_length), "Release Date")
53 print("-" * (package_legnth + version_length + release_date_length + 2))
54 for package, version, release_date in sorted(
55 zip(packages, versions, release_dates),
56 # Sort by release date in descending order
57 key=lambda x: x[2],
58 reverse=True,
59 ):
60 print(
61 package.ljust(package_legnth),
62 version.ljust(version_length),
63 release_date.ljust(release_date_length),
64 )
65
66
67 if __name__ == "__main__":
68 main()
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dev/show_package_release_dates.py b/dev/show_package_release_dates.py
--- a/dev/show_package_release_dates.py
+++ b/dev/show_package_release_dates.py
@@ -46,11 +46,11 @@
release_dates = [safe_result(f) for f in futures]
packages, versions = list(zip(*distributions))
- package_legnth = get_longest_string_length(packages)
+ package_length = get_longest_string_length(packages)
version_length = get_longest_string_length(versions)
release_date_length = len("Release Date")
- print("Package".ljust(package_legnth), "Version".ljust(version_length), "Release Date")
- print("-" * (package_legnth + version_length + release_date_length + 2))
+ print("Package".ljust(package_length), "Version".ljust(version_length), "Release Date")
+ print("-" * (package_length + version_length + release_date_length + 2))
for package, version, release_date in sorted(
zip(packages, versions, release_dates),
# Sort by release date in descending order
@@ -58,7 +58,7 @@
reverse=True,
):
print(
- package.ljust(package_legnth),
+ package.ljust(package_length),
version.ljust(version_length),
release_date.ljust(release_date_length),
)
| {"golden_diff": "diff --git a/dev/show_package_release_dates.py b/dev/show_package_release_dates.py\n--- a/dev/show_package_release_dates.py\n+++ b/dev/show_package_release_dates.py\n@@ -46,11 +46,11 @@\n release_dates = [safe_result(f) for f in futures]\n \n packages, versions = list(zip(*distributions))\n- package_legnth = get_longest_string_length(packages)\n+ package_length = get_longest_string_length(packages)\n version_length = get_longest_string_length(versions)\n release_date_length = len(\"Release Date\")\n- print(\"Package\".ljust(package_legnth), \"Version\".ljust(version_length), \"Release Date\")\n- print(\"-\" * (package_legnth + version_length + release_date_length + 2))\n+ print(\"Package\".ljust(package_length), \"Version\".ljust(version_length), \"Release Date\")\n+ print(\"-\" * (package_length + version_length + release_date_length + 2))\n for package, version, release_date in sorted(\n zip(packages, versions, release_dates),\n # Sort by release date in descending order\n@@ -58,7 +58,7 @@\n reverse=True,\n ):\n print(\n- package.ljust(package_legnth),\n+ package.ljust(package_length),\n version.ljust(version_length),\n release_date.ljust(release_date_length),\n )\n", "issue": "Typo fix\nhttps://github.com/mlflow/mlflow/blob/9724c83bd8f0100c465e68e30651a9727de42ce0/dev/show_package_release_dates.py#L49\r\n\r\n`package_legnth` -> `package_length`\n", "before_files": [{"content": "import os\nimport json\nimport sys\nimport subprocess\nimport requests\nfrom concurrent.futures import ThreadPoolExecutor\nimport traceback\n\n\ndef get_distributions():\n res = subprocess.check_output(\n [sys.executable, \"-m\", \"pip\", \"list\", \"--format\", \"json\"], text=True\n )\n return [(pkg[\"name\"], pkg[\"version\"]) for pkg in json.loads(res)]\n\n\ndef get_release_date(package, version):\n resp = requests.get(f\"https://pypi.python.org/pypi/{package}/json\", timeout=10)\n if not resp.ok:\n return \"\"\n\n matched = [dist_files for ver, dist_files in resp.json()[\"releases\"].items() if ver == version]\n if (not matched) or (not matched[0]):\n return \"\"\n\n upload_time = matched[0][0][\"upload_time\"]\n return upload_time.split(\"T\")[0] # return year-month-day\n\n\ndef get_longest_string_length(array):\n return len(max(array, key=len))\n\n\ndef safe_result(future, if_error=\"\"):\n try:\n return future.result()\n except Exception:\n traceback.print_exc()\n return if_error\n\n\ndef main():\n distributions = get_distributions()\n with ThreadPoolExecutor(max_workers=min(32, os.cpu_count() + 4)) as executor:\n futures = [executor.submit(get_release_date, pkg, ver) for pkg, ver in distributions]\n release_dates = [safe_result(f) for f in futures]\n\n packages, versions = list(zip(*distributions))\n package_legnth = get_longest_string_length(packages)\n version_length = get_longest_string_length(versions)\n release_date_length = len(\"Release Date\")\n print(\"Package\".ljust(package_legnth), \"Version\".ljust(version_length), \"Release Date\")\n print(\"-\" * (package_legnth + version_length + release_date_length + 2))\n for package, version, release_date in sorted(\n zip(packages, versions, release_dates),\n # Sort by release date in descending order\n key=lambda x: x[2],\n reverse=True,\n ):\n print(\n package.ljust(package_legnth),\n version.ljust(version_length),\n release_date.ljust(release_date_length),\n )\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "dev/show_package_release_dates.py"}], "after_files": [{"content": "import os\nimport json\nimport sys\nimport 
subprocess\nimport requests\nfrom concurrent.futures import ThreadPoolExecutor\nimport traceback\n\n\ndef get_distributions():\n res = subprocess.check_output(\n [sys.executable, \"-m\", \"pip\", \"list\", \"--format\", \"json\"], text=True\n )\n return [(pkg[\"name\"], pkg[\"version\"]) for pkg in json.loads(res)]\n\n\ndef get_release_date(package, version):\n resp = requests.get(f\"https://pypi.python.org/pypi/{package}/json\", timeout=10)\n if not resp.ok:\n return \"\"\n\n matched = [dist_files for ver, dist_files in resp.json()[\"releases\"].items() if ver == version]\n if (not matched) or (not matched[0]):\n return \"\"\n\n upload_time = matched[0][0][\"upload_time\"]\n return upload_time.split(\"T\")[0] # return year-month-day\n\n\ndef get_longest_string_length(array):\n return len(max(array, key=len))\n\n\ndef safe_result(future, if_error=\"\"):\n try:\n return future.result()\n except Exception:\n traceback.print_exc()\n return if_error\n\n\ndef main():\n distributions = get_distributions()\n with ThreadPoolExecutor(max_workers=min(32, os.cpu_count() + 4)) as executor:\n futures = [executor.submit(get_release_date, pkg, ver) for pkg, ver in distributions]\n release_dates = [safe_result(f) for f in futures]\n\n packages, versions = list(zip(*distributions))\n package_length = get_longest_string_length(packages)\n version_length = get_longest_string_length(versions)\n release_date_length = len(\"Release Date\")\n print(\"Package\".ljust(package_length), \"Version\".ljust(version_length), \"Release Date\")\n print(\"-\" * (package_length + version_length + release_date_length + 2))\n for package, version, release_date in sorted(\n zip(packages, versions, release_dates),\n # Sort by release date in descending order\n key=lambda x: x[2],\n reverse=True,\n ):\n print(\n package.ljust(package_length),\n version.ljust(version_length),\n release_date.ljust(release_date_length),\n )\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "dev/show_package_release_dates.py"}]} | 962 | 301 |
gh_patches_debug_27386 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-8360 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Problems with Spiders built on YextSpider
I've noticed a few of the spiders whose results I was using in my OSM tool have recently stopped returning any results, including five_guys_de_es_fr_gb.py, matalan_gb.py, and three_gb.py.
A common feature of these spiders is that they use the YextSpider class. Checking some other spiders that are also based on this class reveals they've also stopped working. The spider stats suggest 404 and 403 errors are preventing the data from being retrieved.
If this is a general problem affecting multiple spiders, would someone be able to take a look and see if it can be fixed?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/independent_financial_us.py`
Content:
```
1 from locations.categories import Categories, apply_category
2 from locations.storefinders.yext import YextSpider
3
4
5 class IndependentFinancialUSSpider(YextSpider):
6 name = "independent_financial_us"
7 item_attributes = {"brand": "Independent Financial", "brand_wikidata": "Q6016398"}
8 api_key = "ee4600854cf5501c53831bf944472e57"
9 wanted_types = ["location", "atm"]
10
11 def parse_item(self, item, location):
12 if location["meta"]["entityType"] == "location":
13 apply_category(Categories.BANK, item)
14 item["ref"] = location.get("c_branchCode", location["meta"].get("id"))
15 item["name"] = " ".join(filter(None, [location.get("name"), location.get("geomodifier")]))
16 elif location["meta"]["entityType"] == "atm":
17 apply_category(Categories.ATM, item)
18 item["name"] = location.get("geomodifier")
19 item["website"] = location.get("c_pagesURL")
20 item.pop("email", None)
21 item["extras"].pop("contact:instagram", None)
22 item.pop("twitter", None)
23 item.pop("facebook", None)
24 yield item
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/independent_financial_us.py b/locations/spiders/independent_financial_us.py
--- a/locations/spiders/independent_financial_us.py
+++ b/locations/spiders/independent_financial_us.py
@@ -1,24 +1,18 @@
from locations.categories import Categories, apply_category
-from locations.storefinders.yext import YextSpider
+from locations.storefinders.yext_answers import YextAnswersSpider
-class IndependentFinancialUSSpider(YextSpider):
+class IndependentFinancialUSSpider(YextAnswersSpider):
name = "independent_financial_us"
item_attributes = {"brand": "Independent Financial", "brand_wikidata": "Q6016398"}
api_key = "ee4600854cf5501c53831bf944472e57"
- wanted_types = ["location", "atm"]
+ experience_key = "independent-financial-search"
- def parse_item(self, item, location):
- if location["meta"]["entityType"] == "location":
- apply_category(Categories.BANK, item)
- item["ref"] = location.get("c_branchCode", location["meta"].get("id"))
- item["name"] = " ".join(filter(None, [location.get("name"), location.get("geomodifier")]))
- elif location["meta"]["entityType"] == "atm":
+ def parse_item(self, location, item):
+ if location["data"]["type"] == "atm":
apply_category(Categories.ATM, item)
- item["name"] = location.get("geomodifier")
- item["website"] = location.get("c_pagesURL")
- item.pop("email", None)
- item["extras"].pop("contact:instagram", None)
- item.pop("twitter", None)
- item.pop("facebook", None)
+ elif location["data"]["type"] == "location":
+ apply_category(Categories.BANK, item)
+ else:
+ self.logger.error("Unknown location type: {}".format(location["data"]["type"]))
yield item
| {"golden_diff": "diff --git a/locations/spiders/independent_financial_us.py b/locations/spiders/independent_financial_us.py\n--- a/locations/spiders/independent_financial_us.py\n+++ b/locations/spiders/independent_financial_us.py\n@@ -1,24 +1,18 @@\n from locations.categories import Categories, apply_category\n-from locations.storefinders.yext import YextSpider\n+from locations.storefinders.yext_answers import YextAnswersSpider\n \n \n-class IndependentFinancialUSSpider(YextSpider):\n+class IndependentFinancialUSSpider(YextAnswersSpider):\n name = \"independent_financial_us\"\n item_attributes = {\"brand\": \"Independent Financial\", \"brand_wikidata\": \"Q6016398\"}\n api_key = \"ee4600854cf5501c53831bf944472e57\"\n- wanted_types = [\"location\", \"atm\"]\n+ experience_key = \"independent-financial-search\"\n \n- def parse_item(self, item, location):\n- if location[\"meta\"][\"entityType\"] == \"location\":\n- apply_category(Categories.BANK, item)\n- item[\"ref\"] = location.get(\"c_branchCode\", location[\"meta\"].get(\"id\"))\n- item[\"name\"] = \" \".join(filter(None, [location.get(\"name\"), location.get(\"geomodifier\")]))\n- elif location[\"meta\"][\"entityType\"] == \"atm\":\n+ def parse_item(self, location, item):\n+ if location[\"data\"][\"type\"] == \"atm\":\n apply_category(Categories.ATM, item)\n- item[\"name\"] = location.get(\"geomodifier\")\n- item[\"website\"] = location.get(\"c_pagesURL\")\n- item.pop(\"email\", None)\n- item[\"extras\"].pop(\"contact:instagram\", None)\n- item.pop(\"twitter\", None)\n- item.pop(\"facebook\", None)\n+ elif location[\"data\"][\"type\"] == \"location\":\n+ apply_category(Categories.BANK, item)\n+ else:\n+ self.logger.error(\"Unknown location type: {}\".format(location[\"data\"][\"type\"]))\n yield item\n", "issue": "Problems with Spiders built on YextSpider\nI've noticed a few of the spiders whose results I was using in my OSM tool have recently stopped returning any results, including five_guys_de_es_fr_gb.py , matalan_gb.py , and three_gb.py .\r\n\r\nA common feature of these spiders is that they use the YextSpider class. Checking some other spiders that are also based on this class, reveals they've also stopped working. 
The spider stats suggest 404 and 403 errors are preventing the data being retrieved.\r\n\r\nIf this is a general problem affecting multiple spiders, would someone be able to take a look and see if it can be fixed?\n", "before_files": [{"content": "from locations.categories import Categories, apply_category\nfrom locations.storefinders.yext import YextSpider\n\n\nclass IndependentFinancialUSSpider(YextSpider):\n name = \"independent_financial_us\"\n item_attributes = {\"brand\": \"Independent Financial\", \"brand_wikidata\": \"Q6016398\"}\n api_key = \"ee4600854cf5501c53831bf944472e57\"\n wanted_types = [\"location\", \"atm\"]\n\n def parse_item(self, item, location):\n if location[\"meta\"][\"entityType\"] == \"location\":\n apply_category(Categories.BANK, item)\n item[\"ref\"] = location.get(\"c_branchCode\", location[\"meta\"].get(\"id\"))\n item[\"name\"] = \" \".join(filter(None, [location.get(\"name\"), location.get(\"geomodifier\")]))\n elif location[\"meta\"][\"entityType\"] == \"atm\":\n apply_category(Categories.ATM, item)\n item[\"name\"] = location.get(\"geomodifier\")\n item[\"website\"] = location.get(\"c_pagesURL\")\n item.pop(\"email\", None)\n item[\"extras\"].pop(\"contact:instagram\", None)\n item.pop(\"twitter\", None)\n item.pop(\"facebook\", None)\n yield item\n", "path": "locations/spiders/independent_financial_us.py"}], "after_files": [{"content": "from locations.categories import Categories, apply_category\nfrom locations.storefinders.yext_answers import YextAnswersSpider\n\n\nclass IndependentFinancialUSSpider(YextAnswersSpider):\n name = \"independent_financial_us\"\n item_attributes = {\"brand\": \"Independent Financial\", \"brand_wikidata\": \"Q6016398\"}\n api_key = \"ee4600854cf5501c53831bf944472e57\"\n experience_key = \"independent-financial-search\"\n\n def parse_item(self, location, item):\n if location[\"data\"][\"type\"] == \"atm\":\n apply_category(Categories.ATM, item)\n elif location[\"data\"][\"type\"] == \"location\":\n apply_category(Categories.BANK, item)\n else:\n self.logger.error(\"Unknown location type: {}\".format(location[\"data\"][\"type\"]))\n yield item\n", "path": "locations/spiders/independent_financial_us.py"}]} | 735 | 477 |
gh_patches_debug_950 | rasdani/github-patches | git_diff | OpenNMT__OpenNMT-py-2204 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
torch.div() (1.6.0) does not have 'rounding_mode' parameter
According to the torch 1.6.0 documentation here: https://pytorch.org/docs/1.6.0/generated/torch.div.html?highlight=torch%20div#torch.div
there is no 'rounding_mode' parameter.
But in translator:
https://github.com/OpenNMT/OpenNMT-py/blob/0f411ce11a83b18c0223ac94ccc11a35403763df/onmt/translate/beam_search.py#L282
That's why I receive this error:
```
onmt_translate -model ./../output/test/nmt/f0/run/model_step_100.pt -src ./../output/test/nmt/f0/src-test.txt -output ./../output/test/nmt/f0/test.epoch100.pred.csv -gpu 0 --min_length 2 -verbose
[2022-09-15 20:32:19,980 INFO] Translating shard 0.
Traceback (most recent call last):
File "c:\programdata\anaconda3\envs\nuecg\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\programdata\anaconda3\envs\nuecg\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\ProgramData\Anaconda3\envs\nuecg\Scripts\onmt_translate.exe\__main__.py", line 7, in <module>
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\bin\translate.py", line 54, in main
translate(opt)
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\bin\translate.py", line 38, in translate
align_debug=opt.align_debug
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\translate\translator.py", line 440, in translate
phrase_table=phrase_table)
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\translate\translator.py", line 487, in _translate
batch, data.src_vocabs, attn_debug
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\translate\translator.py", line 861, in translate_batch
batch, src_vocabs, decode_strategy
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\translate\translator.py", line 947, in _translate_batch_with_strategy
decode_strategy.advance(log_probs, attn)
File "c:\programdata\anaconda3\envs\nuecg\lib\site-packages\onmt\translate\beam_search.py", line 283, in advance
rounding_mode='trunc')
TypeError: div() got an unexpected keyword argument 'rounding_mode'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 from setuptools import setup, find_packages
3 from os import path
4
5 this_directory = path.abspath(path.dirname(__file__))
6 with open(path.join(this_directory, 'README.md'), encoding='utf-8') as f:
7 long_description = f.read()
8
9 setup(
10 name='OpenNMT-py',
11 description='A python implementation of OpenNMT',
12 long_description=long_description,
13 long_description_content_type='text/markdown',
14 version='2.3.0',
15 packages=find_packages(),
16 project_urls={
17 "Documentation": "http://opennmt.net/OpenNMT-py/",
18 "Forum": "http://forum.opennmt.net/",
19 "Gitter": "https://gitter.im/OpenNMT/OpenNMT-py",
20 "Source": "https://github.com/OpenNMT/OpenNMT-py/"
21 },
22 python_requires=">=3.5",
23 install_requires=[
24 "torch>=1.6.0",
25 "torchtext==0.5.0",
26 "configargparse",
27 "tensorboard>=2.3",
28 "flask",
29 "waitress",
30 "pyonmttok>=1.23,<2",
31 "pyyaml",
32 "sacrebleu"
33 ],
34 entry_points={
35 "console_scripts": [
36 "onmt_server=onmt.bin.server:main",
37 "onmt_train=onmt.bin.train:main",
38 "onmt_translate=onmt.bin.translate:main",
39 "onmt_translate_dynamic=onmt.bin.translate_dynamic:main",
40 "onmt_release_model=onmt.bin.release_model:main",
41 "onmt_average_models=onmt.bin.average_models:main",
42 "onmt_build_vocab=onmt.bin.build_vocab:main"
43 ],
44 }
45 )
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -21,7 +21,7 @@
},
python_requires=">=3.5",
install_requires=[
- "torch>=1.6.0",
+ "torch>=1.9.0",
"torchtext==0.5.0",
"configargparse",
"tensorboard>=2.3",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -21,7 +21,7 @@\n },\n python_requires=\">=3.5\",\n install_requires=[\n- \"torch>=1.6.0\",\n+ \"torch>=1.9.0\",\n \"torchtext==0.5.0\",\n \"configargparse\",\n \"tensorboard>=2.3\",\n", "issue": "torch.div() (1.6.0) does not have 'rounding_mode' parameter\nAccording to the torch 1.6.0 here: https://pytorch.org/docs/1.6.0/generated/torch.div.html?highlight=torch%20div#torch.div\r\nthere is no 'rounding_mode' parameter. \r\n\r\nBut in translator:\r\nhttps://github.com/OpenNMT/OpenNMT-py/blob/0f411ce11a83b18c0223ac94ccc11a35403763df/onmt/translate/beam_search.py#L282\r\n\r\nThat's why I receive this error:\r\n```\r\nonmt_translate -model ./../output/test/nmt/f0/run/model_step_100.pt -src ./../output/test/nmt/f0/src-test.txt -output ./../output/test/nmt/f0/test.epoch100.pred.csv -gpu 0 --min_length 2 -verbose \r\n[2022-09-15 20:32:19,980 INFO] Translating shard 0.\r\nTraceback (most recent call last):\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\runpy.py\", line 193, in _run_module_as_main\r\n \"__main__\", mod_spec)\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"C:\\ProgramData\\Anaconda3\\envs\\nuecg\\Scripts\\onmt_translate.exe\\__main__.py\", line 7, in <module>\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\bin\\translate.py\", line 54, in main\r\n translate(opt)\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\bin\\translate.py\", line 38, in translate\r\n align_debug=opt.align_debug\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\translate\\translator.py\", line 440, in translate\r\n phrase_table=phrase_table)\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\translate\\translator.py\", line 487, in _translate\r\n batch, data.src_vocabs, attn_debug\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\translate\\translator.py\", line 861, in translate_batch\r\n batch, src_vocabs, decode_strategy\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\translate\\translator.py\", line 947, in _translate_batch_with_strategy\r\n decode_strategy.advance(log_probs, attn)\r\n File \"c:\\programdata\\anaconda3\\envs\\nuecg\\lib\\site-packages\\onmt\\translate\\beam_search.py\", line 283, in advance\r\n rounding_mode='trunc')\r\nTypeError: div() got an unexpected keyword argument 'rounding_mode'\r\n```\n", "before_files": [{"content": "#!/usr/bin/env python\nfrom setuptools import setup, find_packages\nfrom os import path\n\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, 'README.md'), encoding='utf-8') as f:\n long_description = f.read()\n\nsetup(\n name='OpenNMT-py',\n description='A python implementation of OpenNMT',\n long_description=long_description,\n long_description_content_type='text/markdown',\n version='2.3.0',\n packages=find_packages(),\n project_urls={\n \"Documentation\": \"http://opennmt.net/OpenNMT-py/\",\n \"Forum\": \"http://forum.opennmt.net/\",\n \"Gitter\": \"https://gitter.im/OpenNMT/OpenNMT-py\",\n \"Source\": \"https://github.com/OpenNMT/OpenNMT-py/\"\n },\n python_requires=\">=3.5\",\n install_requires=[\n \"torch>=1.6.0\",\n \"torchtext==0.5.0\",\n \"configargparse\",\n \"tensorboard>=2.3\",\n \"flask\",\n \"waitress\",\n \"pyonmttok>=1.23,<2\",\n \"pyyaml\",\n \"sacrebleu\"\n ],\n 
entry_points={\n \"console_scripts\": [\n \"onmt_server=onmt.bin.server:main\",\n \"onmt_train=onmt.bin.train:main\",\n \"onmt_translate=onmt.bin.translate:main\",\n \"onmt_translate_dynamic=onmt.bin.translate_dynamic:main\",\n \"onmt_release_model=onmt.bin.release_model:main\",\n \"onmt_average_models=onmt.bin.average_models:main\",\n \"onmt_build_vocab=onmt.bin.build_vocab:main\"\n ],\n }\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nfrom setuptools import setup, find_packages\nfrom os import path\n\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, 'README.md'), encoding='utf-8') as f:\n long_description = f.read()\n\nsetup(\n name='OpenNMT-py',\n description='A python implementation of OpenNMT',\n long_description=long_description,\n long_description_content_type='text/markdown',\n version='2.3.0',\n packages=find_packages(),\n project_urls={\n \"Documentation\": \"http://opennmt.net/OpenNMT-py/\",\n \"Forum\": \"http://forum.opennmt.net/\",\n \"Gitter\": \"https://gitter.im/OpenNMT/OpenNMT-py\",\n \"Source\": \"https://github.com/OpenNMT/OpenNMT-py/\"\n },\n python_requires=\">=3.5\",\n install_requires=[\n \"torch>=1.9.0\",\n \"torchtext==0.5.0\",\n \"configargparse\",\n \"tensorboard>=2.3\",\n \"flask\",\n \"waitress\",\n \"pyonmttok>=1.23,<2\",\n \"pyyaml\",\n \"sacrebleu\"\n ],\n entry_points={\n \"console_scripts\": [\n \"onmt_server=onmt.bin.server:main\",\n \"onmt_train=onmt.bin.train:main\",\n \"onmt_translate=onmt.bin.translate:main\",\n \"onmt_translate_dynamic=onmt.bin.translate_dynamic:main\",\n \"onmt_release_model=onmt.bin.release_model:main\",\n \"onmt_average_models=onmt.bin.average_models:main\",\n \"onmt_build_vocab=onmt.bin.build_vocab:main\"\n ],\n }\n)\n", "path": "setup.py"}]} | 1,461 | 95 |
gh_patches_debug_13922 | rasdani/github-patches | git_diff | huggingface__accelerate-445 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`psutil` required by utils/modeling.py but it isn't declared as a dependency
### System Info
```Shell
Accelerate `0.10.0.dev0` on Debian Bullseye running Python 3.10.5.
File "/opt/venv/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 276, in get_max_memory
import psutil
ModuleNotFoundError: No module named 'psutil'
```
I'm not sure if you have any minimum version you need to satisfy for `psutil` as a runtime dependency, but I see that there are no constraints on it as a `test` dependency in setup.py.
If you don't have any requirements, I'm happy to just add it myself and open a patch PR.
Thanks!
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] One of the scripts in the examples/ folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)
- [ ] My own task or dataset (give details below)
### Reproduction
Use `device_map="auto"` when loading any model that supports it.
### Expected behavior
```Shell
I expect that `psutil` is declared as a runtime dependency of the `accelerate` package instead of having to install it myself.
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright 2021 The HuggingFace Team. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from setuptools import setup
16 from setuptools import find_packages
17
18 extras = {}
19 extras["quality"] = ["black ~= 22.0", "isort >= 5.5.4", "flake8 >= 3.8.3"]
20 extras["docs"] = []
21 extras["test"] = [
22 "psutil",
23 "pytest",
24 "pytest-xdist",
25 "pytest-subtests",
26 "datasets",
27 "evaluate",
28 "transformers",
29 "scipy",
30 "sklearn",
31 "parameterized",
32 "deepspeed",
33 ]
34
35 extras["test_trackers"] = ["wandb", "comet-ml", "tensorboard"]
36 extras["dev"] = extras["quality"] + extras["test"]
37
38 extras["sagemaker"] = [
39 "sagemaker", # boto3 is a required package in sagemaker
40 ]
41
42 setup(
43 name="accelerate",
44 version="0.10.0.dev0",
45 description="Accelerate",
46 long_description=open("README.md", "r", encoding="utf-8").read(),
47 long_description_content_type="text/markdown",
48 keywords="deep learning",
49 license="Apache",
50 author="The HuggingFace team",
51 author_email="[email protected]",
52 url="https://github.com/huggingface/accelerate",
53 package_dir={"": "src"},
54 packages=find_packages("src"),
55 entry_points={
56 "console_scripts": [
57 "accelerate=accelerate.commands.accelerate_cli:main",
58 "accelerate-config=accelerate.commands.config:main",
59 "accelerate-launch=accelerate.commands.launch:main",
60 ]
61 },
62 python_requires=">=3.7.0",
63 install_requires=["numpy>=1.17", "packaging>=20.0", "pyyaml", "torch>=1.4.0"],
64 extras_require=extras,
65 classifiers=[
66 "Development Status :: 5 - Production/Stable",
67 "Intended Audience :: Developers",
68 "Intended Audience :: Education",
69 "Intended Audience :: Science/Research",
70 "License :: OSI Approved :: Apache Software License",
71 "Operating System :: OS Independent",
72 "Programming Language :: Python :: 3",
73 "Programming Language :: Python :: 3.7",
74 "Topic :: Scientific/Engineering :: Artificial Intelligence",
75 ],
76 )
77
78 # Release checklist
79 # 1. Change the version in __init__.py and setup.py.
80 # 2. Commit these changes with the message: "Release: VERSION"
81 # 3. Add a tag in git to mark the release: "git tag VERSION -m 'Adds tag VERSION for pypi' "
82 # Push the tag to git: git push --tags origin main
83 # 4. Run the following commands in the top-level directory:
84 # python setup.py bdist_wheel
85 # python setup.py sdist
86 # 5. Upload the package to the pypi test server first:
87 # twine upload dist/* -r pypitest
88 # twine upload dist/* -r pypitest --repository-url=https://test.pypi.org/legacy/
89 # 6. Check that you can install it in a virtualenv by running:
90 # pip install -i https://testpypi.python.org/pypi accelerate
91 # accelerate env
92 # accelerate test
93 # 7. Upload the final version to actual pypi:
94 # twine upload dist/* -r pypi
95 # 8. Add release notes to the tag in github once everything is looking hunky-dory.
96 # 9. Update the version in __init__.py, setup.py to the new version "-dev" and push to master
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -19,7 +19,6 @@
extras["quality"] = ["black ~= 22.0", "isort >= 5.5.4", "flake8 >= 3.8.3"]
extras["docs"] = []
extras["test"] = [
- "psutil",
"pytest",
"pytest-xdist",
"pytest-subtests",
@@ -60,7 +59,7 @@
]
},
python_requires=">=3.7.0",
- install_requires=["numpy>=1.17", "packaging>=20.0", "pyyaml", "torch>=1.4.0"],
+ install_requires=["numpy>=1.17", "packaging>=20.0", "psutil", "pyyaml", "torch>=1.4.0"],
extras_require=extras,
classifiers=[
"Development Status :: 5 - Production/Stable",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -19,7 +19,6 @@\n extras[\"quality\"] = [\"black ~= 22.0\", \"isort >= 5.5.4\", \"flake8 >= 3.8.3\"]\n extras[\"docs\"] = []\n extras[\"test\"] = [\n- \"psutil\",\n \"pytest\",\n \"pytest-xdist\",\n \"pytest-subtests\",\n@@ -60,7 +59,7 @@\n ]\n },\n python_requires=\">=3.7.0\",\n- install_requires=[\"numpy>=1.17\", \"packaging>=20.0\", \"pyyaml\", \"torch>=1.4.0\"],\n+ install_requires=[\"numpy>=1.17\", \"packaging>=20.0\", \"psutil\", \"pyyaml\", \"torch>=1.4.0\"],\n extras_require=extras,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n", "issue": "`psutil` required by utils/modeling.py but it isn't declared as a dependency\n### System Info\n\n```Shell\nAccelerate `0.10.0.dev0` on Debian Bullseye running Python 3.10.5.\r\n\r\n\r\nFile \"/opt/venv/lib/python3.10/site-packages/accelerate/utils/modeling.py\", line 276, in get_max_memory\r\n import psutil\r\nModuleNotFoundError: No module named 'psutil'\r\n```\r\n\r\nI'm not sure if you have any minimum version you need to satisfy for `psutil` as a runtime dependency but I see that there are no constraints on it as a `test` dependency in setup.py.\r\n\r\nIf you don't have any requirements, I'm happy to just add it myself and open a patch PR.\r\n\r\nThanks!\n```\n\n\n### Information\n\n- [ ] The official example scripts\n- [ ] My own modified scripts\n\n### Tasks\n\n- [ ] One of the scripts in the examples/ folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)\n- [ ] My own task or dataset (give details below)\n\n### Reproduction\n\nUse `device_map=\"auto\"` when loading any model that supports it.\n\n### Expected behavior\n\n```Shell\nI expect that `psutil` is declared as a runtime dependency of the `accelerate` package instead of having to install it myself.\n```\n\n", "before_files": [{"content": "# Copyright 2021 The HuggingFace Team. 
All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nextras = {}\nextras[\"quality\"] = [\"black ~= 22.0\", \"isort >= 5.5.4\", \"flake8 >= 3.8.3\"]\nextras[\"docs\"] = []\nextras[\"test\"] = [\n \"psutil\",\n \"pytest\",\n \"pytest-xdist\",\n \"pytest-subtests\",\n \"datasets\",\n \"evaluate\",\n \"transformers\",\n \"scipy\",\n \"sklearn\",\n \"parameterized\",\n \"deepspeed\",\n]\n\nextras[\"test_trackers\"] = [\"wandb\", \"comet-ml\", \"tensorboard\"]\nextras[\"dev\"] = extras[\"quality\"] + extras[\"test\"]\n\nextras[\"sagemaker\"] = [\n \"sagemaker\", # boto3 is a required package in sagemaker\n]\n\nsetup(\n name=\"accelerate\",\n version=\"0.10.0.dev0\",\n description=\"Accelerate\",\n long_description=open(\"README.md\", \"r\", encoding=\"utf-8\").read(),\n long_description_content_type=\"text/markdown\",\n keywords=\"deep learning\",\n license=\"Apache\",\n author=\"The HuggingFace team\",\n author_email=\"[email protected]\",\n url=\"https://github.com/huggingface/accelerate\",\n package_dir={\"\": \"src\"},\n packages=find_packages(\"src\"),\n entry_points={\n \"console_scripts\": [\n \"accelerate=accelerate.commands.accelerate_cli:main\",\n \"accelerate-config=accelerate.commands.config:main\",\n \"accelerate-launch=accelerate.commands.launch:main\",\n ]\n },\n python_requires=\">=3.7.0\",\n install_requires=[\"numpy>=1.17\", \"packaging>=20.0\", \"pyyaml\", \"torch>=1.4.0\"],\n extras_require=extras,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Education\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n ],\n)\n\n# Release checklist\n# 1. Change the version in __init__.py and setup.py.\n# 2. Commit these changes with the message: \"Release: VERSION\"\n# 3. Add a tag in git to mark the release: \"git tag VERSION -m 'Adds tag VERSION for pypi' \"\n# Push the tag to git: git push --tags origin main\n# 4. Run the following commands in the top-level directory:\n# python setup.py bdist_wheel\n# python setup.py sdist\n# 5. Upload the package to the pypi test server first:\n# twine upload dist/* -r pypitest\n# twine upload dist/* -r pypitest --repository-url=https://test.pypi.org/legacy/\n# 6. Check that you can install it in a virtualenv by running:\n# pip install -i https://testpypi.python.org/pypi accelerate\n# accelerate env\n# accelerate test\n# 7. Upload the final version to actual pypi:\n# twine upload dist/* -r pypi\n# 8. Add release notes to the tag in github once everything is looking hunky-dory.\n# 9. 
Update the version in __init__.py, setup.py to the new version \"-dev\" and push to master\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright 2021 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nextras = {}\nextras[\"quality\"] = [\"black ~= 22.0\", \"isort >= 5.5.4\", \"flake8 >= 3.8.3\"]\nextras[\"docs\"] = []\nextras[\"test\"] = [\n \"pytest\",\n \"pytest-xdist\",\n \"pytest-subtests\",\n \"datasets\",\n \"evaluate\",\n \"transformers\",\n \"scipy\",\n \"sklearn\",\n \"parameterized\",\n \"deepspeed\",\n]\n\nextras[\"test_trackers\"] = [\"wandb\", \"comet-ml\", \"tensorboard\"]\nextras[\"dev\"] = extras[\"quality\"] + extras[\"test\"]\n\nextras[\"sagemaker\"] = [\n \"sagemaker\", # boto3 is a required package in sagemaker\n]\n\nsetup(\n name=\"accelerate\",\n version=\"0.10.0.dev0\",\n description=\"Accelerate\",\n long_description=open(\"README.md\", \"r\", encoding=\"utf-8\").read(),\n long_description_content_type=\"text/markdown\",\n keywords=\"deep learning\",\n license=\"Apache\",\n author=\"The HuggingFace team\",\n author_email=\"[email protected]\",\n url=\"https://github.com/huggingface/accelerate\",\n package_dir={\"\": \"src\"},\n packages=find_packages(\"src\"),\n entry_points={\n \"console_scripts\": [\n \"accelerate=accelerate.commands.accelerate_cli:main\",\n \"accelerate-config=accelerate.commands.config:main\",\n \"accelerate-launch=accelerate.commands.launch:main\",\n ]\n },\n python_requires=\">=3.7.0\",\n install_requires=[\"numpy>=1.17\", \"packaging>=20.0\", \"psutil\", \"pyyaml\", \"torch>=1.4.0\"],\n extras_require=extras,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Education\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n ],\n)\n\n# Release checklist\n# 1. Change the version in __init__.py and setup.py.\n# 2. Commit these changes with the message: \"Release: VERSION\"\n# 3. Add a tag in git to mark the release: \"git tag VERSION -m 'Adds tag VERSION for pypi' \"\n# Push the tag to git: git push --tags origin main\n# 4. Run the following commands in the top-level directory:\n# python setup.py bdist_wheel\n# python setup.py sdist\n# 5. Upload the package to the pypi test server first:\n# twine upload dist/* -r pypitest\n# twine upload dist/* -r pypitest --repository-url=https://test.pypi.org/legacy/\n# 6. Check that you can install it in a virtualenv by running:\n# pip install -i https://testpypi.python.org/pypi accelerate\n# accelerate env\n# accelerate test\n# 7. Upload the final version to actual pypi:\n# twine upload dist/* -r pypi\n# 8. 
Add release notes to the tag in github once everything is looking hunky-dory.\n# 9. Update the version in __init__.py, setup.py to the new version \"-dev\" and push to master\n", "path": "setup.py"}]} | 1,676 | 222 |
gh_patches_debug_9144 | rasdani/github-patches | git_diff | Cloud-CV__EvalAI-1305 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add challenge filtering using featured parameter.
We need to add filtering in the challenge model on the basis of the `featured` parameter.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/challenges/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from base.admin import ImportExportTimeStampedAdmin
4
5 from .models import (Challenge,
6 ChallengeConfiguration,
7 ChallengePhase,
8 ChallengePhaseSplit,
9 DatasetSplit,
10 Leaderboard,
11 LeaderboardData,
12 StarChallenge,)
13
14
15 @admin.register(Challenge)
16 class ChallengeAdmin(ImportExportTimeStampedAdmin):
17 list_display = ("title", "start_date", "end_date", "creator", "published", "enable_forum", "anonymous_leaderboard")
18 list_filter = ("creator", "published", "enable_forum", "anonymous_leaderboard")
19 search_fields = ("title", "creator")
20
21
22 @admin.register(DatasetSplit)
23 class DatasetSplitAdmin(ImportExportTimeStampedAdmin):
24 list_display = ("name", "codename")
25 list_filter = ("name", "codename")
26 search_fields = ("name", "codename")
27
28
29 @admin.register(ChallengePhase)
30 class ChallengePhaseAdmin(ImportExportTimeStampedAdmin):
31 list_display = ("name", "challenge", "start_date", "end_date", "test_annotation", "is_public", "leaderboard_public")
32 list_filter = ("leaderboard_public", "challenge")
33 search_fields = ("name",)
34
35
36 @admin.register(Leaderboard)
37 class LeaderboardAdmin(ImportExportTimeStampedAdmin):
38 list_display = ("id", "schema")
39 search_fields = ("id",)
40
41
42 @admin.register(ChallengePhaseSplit)
43 class ChallengePhaseSplitAdmin(ImportExportTimeStampedAdmin):
44 list_display = ("id", "challenge_phase", "dataset_split", "leaderboard", "visibility")
45 list_filter = ("challenge_phase", "dataset_split", "leaderboard", "visibility")
46 search_fields = ("challenge_phase", "dataset_split", "leaderboard")
47
48
49 @admin.register(LeaderboardData)
50 class LeaderboardDataAdmin(ImportExportTimeStampedAdmin):
51 list_display = ("challenge_phase_split", "submission", "leaderboard", "result")
52 list_filter = ("challenge_phase_split", "leaderboard",)
53 search_fields = ("challenge_phase_split", "submission", "leaderboard", "result")
54
55
56 @admin.register(ChallengeConfiguration)
57 class ChallengeConfigurationAdmin(ImportExportTimeStampedAdmin):
58 list_display = ('user', 'challenge', 'is_created', 'zip_configuration',)
59 list_filter = ('user', 'is_created',)
60 search_fields = ('user', 'challenge',)
61
62
63 @admin.register(StarChallenge)
64 class StarChallengeAdmin(ImportExportTimeStampedAdmin):
65 list_display = ('user', 'challenge', 'is_starred')
66 search_fields = ('user', 'challenge',)
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/apps/challenges/admin.py b/apps/challenges/admin.py
--- a/apps/challenges/admin.py
+++ b/apps/challenges/admin.py
@@ -14,8 +14,9 @@
@admin.register(Challenge)
class ChallengeAdmin(ImportExportTimeStampedAdmin):
- list_display = ("title", "start_date", "end_date", "creator", "published", "enable_forum", "anonymous_leaderboard")
- list_filter = ("creator", "published", "enable_forum", "anonymous_leaderboard")
+ list_display = ("title", "start_date", "end_date", "creator", "published", "enable_forum", "anonymous_leaderboard",
+ "featured")
+ list_filter = ("creator", "published", "enable_forum", "anonymous_leaderboard", "featured")
search_fields = ("title", "creator")
| {"golden_diff": "diff --git a/apps/challenges/admin.py b/apps/challenges/admin.py\n--- a/apps/challenges/admin.py\n+++ b/apps/challenges/admin.py\n@@ -14,8 +14,9 @@\n \n @admin.register(Challenge)\n class ChallengeAdmin(ImportExportTimeStampedAdmin):\n- list_display = (\"title\", \"start_date\", \"end_date\", \"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\")\n- list_filter = (\"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\")\n+ list_display = (\"title\", \"start_date\", \"end_date\", \"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\",\n+ \"featured\")\n+ list_filter = (\"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\", \"featured\")\n search_fields = (\"title\", \"creator\")\n", "issue": "Add challenge filtering using featured parameter.\nWe need to add filtering in challenge model on the basis of `featured` parameter.\nAdd challenge filtering using featured parameter.\nWe need to add filtering in challenge model on the basis of `featured` parameter.\n", "before_files": [{"content": "from django.contrib import admin\n\nfrom base.admin import ImportExportTimeStampedAdmin\n\nfrom .models import (Challenge,\n ChallengeConfiguration,\n ChallengePhase,\n ChallengePhaseSplit,\n DatasetSplit,\n Leaderboard,\n LeaderboardData,\n StarChallenge,)\n\n\[email protected](Challenge)\nclass ChallengeAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"title\", \"start_date\", \"end_date\", \"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\")\n list_filter = (\"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\")\n search_fields = (\"title\", \"creator\")\n\n\[email protected](DatasetSplit)\nclass DatasetSplitAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"name\", \"codename\")\n list_filter = (\"name\", \"codename\")\n search_fields = (\"name\", \"codename\")\n\n\[email protected](ChallengePhase)\nclass ChallengePhaseAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"name\", \"challenge\", \"start_date\", \"end_date\", \"test_annotation\", \"is_public\", \"leaderboard_public\")\n list_filter = (\"leaderboard_public\", \"challenge\")\n search_fields = (\"name\",)\n\n\[email protected](Leaderboard)\nclass LeaderboardAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"id\", \"schema\")\n search_fields = (\"id\",)\n\n\[email protected](ChallengePhaseSplit)\nclass ChallengePhaseSplitAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"id\", \"challenge_phase\", \"dataset_split\", \"leaderboard\", \"visibility\")\n list_filter = (\"challenge_phase\", \"dataset_split\", \"leaderboard\", \"visibility\")\n search_fields = (\"challenge_phase\", \"dataset_split\", \"leaderboard\")\n\n\[email protected](LeaderboardData)\nclass LeaderboardDataAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"challenge_phase_split\", \"submission\", \"leaderboard\", \"result\")\n list_filter = (\"challenge_phase_split\", \"leaderboard\",)\n search_fields = (\"challenge_phase_split\", \"submission\", \"leaderboard\", \"result\")\n\n\[email protected](ChallengeConfiguration)\nclass ChallengeConfigurationAdmin(ImportExportTimeStampedAdmin):\n list_display = ('user', 'challenge', 'is_created', 'zip_configuration',)\n list_filter = ('user', 'is_created',)\n search_fields = ('user', 'challenge',)\n\n\[email protected](StarChallenge)\nclass StarChallengeAdmin(ImportExportTimeStampedAdmin):\n list_display = ('user', 'challenge', 'is_starred')\n search_fields = ('user', 
'challenge',)\n", "path": "apps/challenges/admin.py"}], "after_files": [{"content": "from django.contrib import admin\n\nfrom base.admin import ImportExportTimeStampedAdmin\n\nfrom .models import (Challenge,\n ChallengeConfiguration,\n ChallengePhase,\n ChallengePhaseSplit,\n DatasetSplit,\n Leaderboard,\n LeaderboardData,\n StarChallenge,)\n\n\[email protected](Challenge)\nclass ChallengeAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"title\", \"start_date\", \"end_date\", \"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\",\n \"featured\")\n list_filter = (\"creator\", \"published\", \"enable_forum\", \"anonymous_leaderboard\", \"featured\")\n search_fields = (\"title\", \"creator\")\n\n\[email protected](DatasetSplit)\nclass DatasetSplitAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"name\", \"codename\")\n list_filter = (\"name\", \"codename\")\n search_fields = (\"name\", \"codename\")\n\n\[email protected](ChallengePhase)\nclass ChallengePhaseAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"name\", \"challenge\", \"start_date\", \"end_date\", \"test_annotation\", \"is_public\", \"leaderboard_public\")\n list_filter = (\"leaderboard_public\", \"challenge\")\n search_fields = (\"name\",)\n\n\[email protected](Leaderboard)\nclass LeaderboardAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"id\", \"schema\")\n search_fields = (\"id\",)\n\n\[email protected](ChallengePhaseSplit)\nclass ChallengePhaseSplitAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"id\", \"challenge_phase\", \"dataset_split\", \"leaderboard\", \"visibility\")\n list_filter = (\"challenge_phase\", \"dataset_split\", \"leaderboard\", \"visibility\")\n search_fields = (\"challenge_phase\", \"dataset_split\", \"leaderboard\")\n\n\[email protected](LeaderboardData)\nclass LeaderboardDataAdmin(ImportExportTimeStampedAdmin):\n list_display = (\"challenge_phase_split\", \"submission\", \"leaderboard\", \"result\")\n list_filter = (\"challenge_phase_split\", \"leaderboard\",)\n search_fields = (\"challenge_phase_split\", \"submission\", \"leaderboard\", \"result\")\n\n\[email protected](ChallengeConfiguration)\nclass ChallengeConfigurationAdmin(ImportExportTimeStampedAdmin):\n list_display = ('user', 'challenge', 'is_created', 'zip_configuration',)\n list_filter = ('user', 'is_created',)\n search_fields = ('user', 'challenge',)\n\n\[email protected](StarChallenge)\nclass StarChallengeAdmin(ImportExportTimeStampedAdmin):\n list_display = ('user', 'challenge', 'is_starred')\n search_fields = ('user', 'challenge',)\n", "path": "apps/challenges/admin.py"}]} | 983 | 181 |
gh_patches_debug_5820 | rasdani/github-patches | git_diff | aws__aws-cli-761 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
datapipeline query-object --query argument is shadowed
The top-level `--query` option shadows the `--query` from `datapipeline query-object`. This can be addressed in the argrename customization model. We can also take this opportunity to remove the `cli_name` from the `.extra.json` files in botocore.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awscli/customizations/argrename.py`
Content:
```
1 # Copyright 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License"). You
4 # may not use this file except in compliance with the License. A copy of
5 # the License is located at
6 #
7 # http://aws.amazon.com/apache2.0/
8 #
9 # or in the "license" file accompanying this file. This file is
10 # distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
11 # ANY KIND, either express or implied. See the License for the specific
12 # language governing permissions and limitations under the License.
13 """
14 """
15
16 from awscli.customizations import utils
17
18
19 ARGUMENT_RENAMES = {
20 # Mapping of original arg to renamed arg.
21 # The key is <service>.<operation>.argname
22 # The first part of the key is used for event registration
23 # so if you wanted to rename something for an entire service you
24 # could say 'ec2.*.dry-run': 'renamed-arg-name', or if you wanted
25 # to rename across all services you could say '*.*.dry-run': 'new-name'.
26 'ec2.create-image.no-no-reboot': 'reboot',
27 'ec2.*.no-egress': 'ingress',
28 'ec2.*.no-disable-api-termination': 'enable-api-termination',
29 }
30
31
32 def register_arg_renames(cli):
33 for original, new_name in ARGUMENT_RENAMES.items():
34 event_portion, original_arg_name = original.rsplit('.', 1)
35 cli.register('building-argument-table.%s' % event_portion,
36 rename_arg(original_arg_name, new_name))
37
38
39 def rename_arg(original_arg_name, new_name):
40 def _rename_arg(argument_table, **kwargs):
41 if original_arg_name in argument_table:
42 utils.rename_argument(argument_table, original_arg_name, new_name)
43 return _rename_arg
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/awscli/customizations/argrename.py b/awscli/customizations/argrename.py
--- a/awscli/customizations/argrename.py
+++ b/awscli/customizations/argrename.py
@@ -26,6 +26,11 @@
'ec2.create-image.no-no-reboot': 'reboot',
'ec2.*.no-egress': 'ingress',
'ec2.*.no-disable-api-termination': 'enable-api-termination',
+ 'opsworks.*.region': 'stack-region',
+ 'elastictranscoder.*.output': 'job-output',
+ 'swf.register-activity-type.version': 'activity-version',
+ 'swf.register-workflow-type.version': 'workflow-version',
+ 'datapipeline.*.query': 'objects-query',
}
| {"golden_diff": "diff --git a/awscli/customizations/argrename.py b/awscli/customizations/argrename.py\n--- a/awscli/customizations/argrename.py\n+++ b/awscli/customizations/argrename.py\n@@ -26,6 +26,11 @@\n 'ec2.create-image.no-no-reboot': 'reboot',\n 'ec2.*.no-egress': 'ingress',\n 'ec2.*.no-disable-api-termination': 'enable-api-termination',\n+ 'opsworks.*.region': 'stack-region',\n+ 'elastictranscoder.*.output': 'job-output',\n+ 'swf.register-activity-type.version': 'activity-version',\n+ 'swf.register-workflow-type.version': 'workflow-version',\n+ 'datapipeline.*.query': 'objects-query',\n }\n", "issue": "datapipeline query-object --query argument is shadowed\nThe top level `--query` option shadows the `--query` from datapipeline query-object. This can be addressed in the argrename customization model. We can also take this opportunity to remove the `cli_name` from the `.extra.json` files in botocore.\n\n", "before_files": [{"content": "# Copyright 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. See the License for the specific\n# language governing permissions and limitations under the License.\n\"\"\"\n\"\"\"\n\nfrom awscli.customizations import utils\n\n\nARGUMENT_RENAMES = {\n # Mapping of original arg to renamed arg.\n # The key is <service>.<operation>.argname\n # The first part of the key is used for event registration\n # so if you wanted to rename something for an entire service you\n # could say 'ec2.*.dry-run': 'renamed-arg-name', or if you wanted\n # to rename across all services you could say '*.*.dry-run': 'new-name'.\n 'ec2.create-image.no-no-reboot': 'reboot',\n 'ec2.*.no-egress': 'ingress',\n 'ec2.*.no-disable-api-termination': 'enable-api-termination',\n}\n\n\ndef register_arg_renames(cli):\n for original, new_name in ARGUMENT_RENAMES.items():\n event_portion, original_arg_name = original.rsplit('.', 1)\n cli.register('building-argument-table.%s' % event_portion,\n rename_arg(original_arg_name, new_name))\n\n\ndef rename_arg(original_arg_name, new_name):\n def _rename_arg(argument_table, **kwargs):\n if original_arg_name in argument_table:\n utils.rename_argument(argument_table, original_arg_name, new_name)\n return _rename_arg\n", "path": "awscli/customizations/argrename.py"}], "after_files": [{"content": "# Copyright 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. 
See the License for the specific\n# language governing permissions and limitations under the License.\n\"\"\"\n\"\"\"\n\nfrom awscli.customizations import utils\n\n\nARGUMENT_RENAMES = {\n # Mapping of original arg to renamed arg.\n # The key is <service>.<operation>.argname\n # The first part of the key is used for event registration\n # so if you wanted to rename something for an entire service you\n # could say 'ec2.*.dry-run': 'renamed-arg-name', or if you wanted\n # to rename across all services you could say '*.*.dry-run': 'new-name'.\n 'ec2.create-image.no-no-reboot': 'reboot',\n 'ec2.*.no-egress': 'ingress',\n 'ec2.*.no-disable-api-termination': 'enable-api-termination',\n 'opsworks.*.region': 'stack-region',\n 'elastictranscoder.*.output': 'job-output',\n 'swf.register-activity-type.version': 'activity-version',\n 'swf.register-workflow-type.version': 'workflow-version',\n 'datapipeline.*.query': 'objects-query',\n}\n\n\ndef register_arg_renames(cli):\n for original, new_name in ARGUMENT_RENAMES.items():\n event_portion, original_arg_name = original.rsplit('.', 1)\n cli.register('building-argument-table.%s' % event_portion,\n rename_arg(original_arg_name, new_name))\n\n\ndef rename_arg(original_arg_name, new_name):\n def _rename_arg(argument_table, **kwargs):\n if original_arg_name in argument_table:\n utils.rename_argument(argument_table, original_arg_name, new_name)\n return _rename_arg\n", "path": "awscli/customizations/argrename.py"}]} | 829 | 176 |
gh_patches_debug_17055 | rasdani/github-patches | git_diff | svthalia__concrexit-1680 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Show non-current FoodEvents in API v2.
### Describe the bug
The `api/v2/food/events/` and `api/v2/food/events/<pk>/` endpoints currently do not return FoodEvents that are not current.
I think to change that we'd only need to replace some `FoodEvent.current_objects.all()`s with `FoodEvent.objects.all()`.
--- END ISSUE ---
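As a sketch of the change suggested above (assuming `FoodEvent`'s default `objects` manager returns all events rather than only current ones), the two read-only views would simply swap their querysets; the imports are copied from the file shown below:

```python
from rest_framework.generics import ListAPIView, RetrieveAPIView

from pizzas.api.v2.serializers.food_event import FoodEventSerializer
from pizzas.models import FoodEvent


class FoodEventListView(ListAPIView):
    serializer_class = FoodEventSerializer
    queryset = FoodEvent.objects.all()  # was: FoodEvent.current_objects.all()


class FoodEventDetailView(RetrieveAPIView):
    serializer_class = FoodEventSerializer
    queryset = FoodEvent.objects.all()  # was: FoodEvent.current_objects.all()
```

Filter backends, permissions, and required scopes are omitted here for brevity; only the queryset lines change.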
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/pizzas/api/v2/views.py`
Content:
```
1 from oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope
2 from rest_framework.generics import (
3 ListAPIView,
4 RetrieveAPIView,
5 get_object_or_404,
6 CreateAPIView,
7 DestroyAPIView,
8 UpdateAPIView,
9 )
10
11 from rest_framework import filters as framework_filters, status
12 from rest_framework.permissions import DjangoModelPermissionsOrAnonReadOnly
13 from rest_framework.response import Response
14
15 from pizzas.api.v2 import filters
16 from pizzas.api.v2.serializers import (
17 ProductSerializer,
18 FoodOrderSerializer,
19 FoodOrderUpdateSerializer,
20 FoodOrderCreateSerializer,
21 )
22 from pizzas.api.v2.serializers.food_event import FoodEventSerializer
23 from pizzas.models import FoodEvent, Product, FoodOrder
24 from thaliawebsite.api.v2.permissions import IsAuthenticatedOrTokenHasScopeForMethod
25
26
27 class FoodEventListView(ListAPIView):
28 """Returns an overview of all food events."""
29
30 serializer_class = FoodEventSerializer
31 queryset = FoodEvent.current_objects.all()
32 filter_backends = (
33 framework_filters.OrderingFilter,
34 filters.FoodEventDateFilterBackend,
35 )
36 ordering_fields = ("start", "end")
37 permission_classes = [
38 IsAuthenticatedOrTokenHasScope,
39 DjangoModelPermissionsOrAnonReadOnly,
40 ]
41 required_scopes = ["food:read"]
42
43
44 class FoodEventDetailView(RetrieveAPIView):
45 """Returns one single food event."""
46
47 serializer_class = FoodEventSerializer
48 queryset = FoodEvent.current_objects.all()
49 permission_classes = [
50 IsAuthenticatedOrTokenHasScope,
51 DjangoModelPermissionsOrAnonReadOnly,
52 ]
53 required_scopes = ["food:read"]
54
55
56 class FoodEventProductsListView(ListAPIView):
57 """Returns an overview of all products."""
58
59 serializer_class = ProductSerializer
60 queryset = Product.available_products.all()
61 filter_backends = (framework_filters.SearchFilter,)
62 search_fields = ("name",)
63 permission_classes = [
64 IsAuthenticatedOrTokenHasScope,
65 DjangoModelPermissionsOrAnonReadOnly,
66 ]
67 required_scopes = ["food:read"]
68
69
70 class FoodEventOrderDetailView(
71 RetrieveAPIView, CreateAPIView, UpdateAPIView, DestroyAPIView
72 ):
73 """Returns details of a food order."""
74
75 permission_classes = [
76 IsAuthenticatedOrTokenHasScopeForMethod,
77 DjangoModelPermissionsOrAnonReadOnly,
78 ]
79 required_scopes_per_method = {
80 "GET": ["food:read"],
81 "POST": ["food:order"],
82 "PUT": ["food:order"],
83 "PATCH": ["food:order"],
84 "DELETE": ["food:order"],
85 }
86
87 def get_serializer_class(self):
88 if self.request.method.lower() == "get":
89 return FoodOrderSerializer
90 if self.request.method.lower() == "post":
91 return FoodOrderCreateSerializer
92 return FoodOrderUpdateSerializer
93
94 def get_queryset(self):
95 return FoodOrder.objects.filter(food_event=self.food_event)
96
97 def get_object(self):
98 queryset = self.filter_queryset(self.get_queryset())
99 obj = get_object_or_404(queryset, member=self.request.member)
100
101 # May raise a permission denied
102 self.check_object_permissions(self.request, obj)
103
104 return obj
105
106 def dispatch(self, request, *args, **kwargs):
107 self.food_event = get_object_or_404(FoodEvent, pk=self.kwargs.get("pk"))
108 return super().dispatch(request, *args, **kwargs)
109
110 def update(self, request, *args, **kwargs):
111 super().update(request, *args, **kwargs)
112 instance = self.get_object()
113 return Response(
114 FoodOrderSerializer(instance, context=self.get_serializer_context()).data
115 )
116
117 def create(self, request, *args, **kwargs):
118 serializer = self.get_serializer(data=request.data)
119 serializer.is_valid(raise_exception=True)
120 instance = serializer.save(food_event=self.food_event)
121 return Response(
122 FoodOrderSerializer(instance, context=self.get_serializer_context()).data,
123 status=status.HTTP_201_CREATED,
124 )
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/pizzas/api/v2/views.py b/website/pizzas/api/v2/views.py
--- a/website/pizzas/api/v2/views.py
+++ b/website/pizzas/api/v2/views.py
@@ -28,7 +28,7 @@
"""Returns an overview of all food events."""
serializer_class = FoodEventSerializer
- queryset = FoodEvent.current_objects.all()
+ queryset = FoodEvent.objects.all()
filter_backends = (
framework_filters.OrderingFilter,
filters.FoodEventDateFilterBackend,
@@ -45,7 +45,7 @@
"""Returns one single food event."""
serializer_class = FoodEventSerializer
- queryset = FoodEvent.current_objects.all()
+ queryset = FoodEvent.objects.all()
permission_classes = [
IsAuthenticatedOrTokenHasScope,
DjangoModelPermissionsOrAnonReadOnly,
| {"golden_diff": "diff --git a/website/pizzas/api/v2/views.py b/website/pizzas/api/v2/views.py\n--- a/website/pizzas/api/v2/views.py\n+++ b/website/pizzas/api/v2/views.py\n@@ -28,7 +28,7 @@\n \"\"\"Returns an overview of all food events.\"\"\"\n \n serializer_class = FoodEventSerializer\n- queryset = FoodEvent.current_objects.all()\n+ queryset = FoodEvent.objects.all()\n filter_backends = (\n framework_filters.OrderingFilter,\n filters.FoodEventDateFilterBackend,\n@@ -45,7 +45,7 @@\n \"\"\"Returns one single food event.\"\"\"\n \n serializer_class = FoodEventSerializer\n- queryset = FoodEvent.current_objects.all()\n+ queryset = FoodEvent.objects.all()\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n", "issue": "Show non-current FoodEvents in API v2.\n### Describe the bug\r\nThe `api/v2/food/events/` and `api/v2/food/events/<pk>/` endpoints currently do not return FoodEvents that are not current. \r\n\r\nI think to change that we\u2019d only need to replace some `FoodEvent.current_objects.all()`s with `FoodEvent.objects.all()`.\r\n\n", "before_files": [{"content": "from oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope\nfrom rest_framework.generics import (\n ListAPIView,\n RetrieveAPIView,\n get_object_or_404,\n CreateAPIView,\n DestroyAPIView,\n UpdateAPIView,\n)\n\nfrom rest_framework import filters as framework_filters, status\nfrom rest_framework.permissions import DjangoModelPermissionsOrAnonReadOnly\nfrom rest_framework.response import Response\n\nfrom pizzas.api.v2 import filters\nfrom pizzas.api.v2.serializers import (\n ProductSerializer,\n FoodOrderSerializer,\n FoodOrderUpdateSerializer,\n FoodOrderCreateSerializer,\n)\nfrom pizzas.api.v2.serializers.food_event import FoodEventSerializer\nfrom pizzas.models import FoodEvent, Product, FoodOrder\nfrom thaliawebsite.api.v2.permissions import IsAuthenticatedOrTokenHasScopeForMethod\n\n\nclass FoodEventListView(ListAPIView):\n \"\"\"Returns an overview of all food events.\"\"\"\n\n serializer_class = FoodEventSerializer\n queryset = FoodEvent.current_objects.all()\n filter_backends = (\n framework_filters.OrderingFilter,\n filters.FoodEventDateFilterBackend,\n )\n ordering_fields = (\"start\", \"end\")\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventDetailView(RetrieveAPIView):\n \"\"\"Returns one single food event.\"\"\"\n\n serializer_class = FoodEventSerializer\n queryset = FoodEvent.current_objects.all()\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventProductsListView(ListAPIView):\n \"\"\"Returns an overview of all products.\"\"\"\n\n serializer_class = ProductSerializer\n queryset = Product.available_products.all()\n filter_backends = (framework_filters.SearchFilter,)\n search_fields = (\"name\",)\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventOrderDetailView(\n RetrieveAPIView, CreateAPIView, UpdateAPIView, DestroyAPIView\n):\n \"\"\"Returns details of a food order.\"\"\"\n\n permission_classes = [\n IsAuthenticatedOrTokenHasScopeForMethod,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes_per_method = {\n \"GET\": [\"food:read\"],\n \"POST\": [\"food:order\"],\n \"PUT\": [\"food:order\"],\n \"PATCH\": 
[\"food:order\"],\n \"DELETE\": [\"food:order\"],\n }\n\n def get_serializer_class(self):\n if self.request.method.lower() == \"get\":\n return FoodOrderSerializer\n if self.request.method.lower() == \"post\":\n return FoodOrderCreateSerializer\n return FoodOrderUpdateSerializer\n\n def get_queryset(self):\n return FoodOrder.objects.filter(food_event=self.food_event)\n\n def get_object(self):\n queryset = self.filter_queryset(self.get_queryset())\n obj = get_object_or_404(queryset, member=self.request.member)\n\n # May raise a permission denied\n self.check_object_permissions(self.request, obj)\n\n return obj\n\n def dispatch(self, request, *args, **kwargs):\n self.food_event = get_object_or_404(FoodEvent, pk=self.kwargs.get(\"pk\"))\n return super().dispatch(request, *args, **kwargs)\n\n def update(self, request, *args, **kwargs):\n super().update(request, *args, **kwargs)\n instance = self.get_object()\n return Response(\n FoodOrderSerializer(instance, context=self.get_serializer_context()).data\n )\n\n def create(self, request, *args, **kwargs):\n serializer = self.get_serializer(data=request.data)\n serializer.is_valid(raise_exception=True)\n instance = serializer.save(food_event=self.food_event)\n return Response(\n FoodOrderSerializer(instance, context=self.get_serializer_context()).data,\n status=status.HTTP_201_CREATED,\n )\n", "path": "website/pizzas/api/v2/views.py"}], "after_files": [{"content": "from oauth2_provider.contrib.rest_framework import IsAuthenticatedOrTokenHasScope\nfrom rest_framework.generics import (\n ListAPIView,\n RetrieveAPIView,\n get_object_or_404,\n CreateAPIView,\n DestroyAPIView,\n UpdateAPIView,\n)\n\nfrom rest_framework import filters as framework_filters, status\nfrom rest_framework.permissions import DjangoModelPermissionsOrAnonReadOnly\nfrom rest_framework.response import Response\n\nfrom pizzas.api.v2 import filters\nfrom pizzas.api.v2.serializers import (\n ProductSerializer,\n FoodOrderSerializer,\n FoodOrderUpdateSerializer,\n FoodOrderCreateSerializer,\n)\nfrom pizzas.api.v2.serializers.food_event import FoodEventSerializer\nfrom pizzas.models import FoodEvent, Product, FoodOrder\nfrom thaliawebsite.api.v2.permissions import IsAuthenticatedOrTokenHasScopeForMethod\n\n\nclass FoodEventListView(ListAPIView):\n \"\"\"Returns an overview of all food events.\"\"\"\n\n serializer_class = FoodEventSerializer\n queryset = FoodEvent.objects.all()\n filter_backends = (\n framework_filters.OrderingFilter,\n filters.FoodEventDateFilterBackend,\n )\n ordering_fields = (\"start\", \"end\")\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventDetailView(RetrieveAPIView):\n \"\"\"Returns one single food event.\"\"\"\n\n serializer_class = FoodEventSerializer\n queryset = FoodEvent.objects.all()\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventProductsListView(ListAPIView):\n \"\"\"Returns an overview of all products.\"\"\"\n\n serializer_class = ProductSerializer\n queryset = Product.available_products.all()\n filter_backends = (framework_filters.SearchFilter,)\n search_fields = (\"name\",)\n permission_classes = [\n IsAuthenticatedOrTokenHasScope,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes = [\"food:read\"]\n\n\nclass FoodEventOrderDetailView(\n RetrieveAPIView, CreateAPIView, UpdateAPIView, DestroyAPIView\n):\n \"\"\"Returns 
details of a food order.\"\"\"\n\n permission_classes = [\n IsAuthenticatedOrTokenHasScopeForMethod,\n DjangoModelPermissionsOrAnonReadOnly,\n ]\n required_scopes_per_method = {\n \"GET\": [\"food:read\"],\n \"POST\": [\"food:order\"],\n \"PUT\": [\"food:order\"],\n \"PATCH\": [\"food:order\"],\n \"DELETE\": [\"food:order\"],\n }\n\n def get_serializer_class(self):\n if self.request.method.lower() == \"get\":\n return FoodOrderSerializer\n if self.request.method.lower() == \"post\":\n return FoodOrderCreateSerializer\n return FoodOrderUpdateSerializer\n\n def get_queryset(self):\n return FoodOrder.objects.filter(food_event=self.food_event)\n\n def get_object(self):\n queryset = self.filter_queryset(self.get_queryset())\n obj = get_object_or_404(queryset, member=self.request.member)\n\n # May raise a permission denied\n self.check_object_permissions(self.request, obj)\n\n return obj\n\n def dispatch(self, request, *args, **kwargs):\n self.food_event = get_object_or_404(FoodEvent, pk=self.kwargs.get(\"pk\"))\n return super().dispatch(request, *args, **kwargs)\n\n def update(self, request, *args, **kwargs):\n super().update(request, *args, **kwargs)\n instance = self.get_object()\n return Response(\n FoodOrderSerializer(instance, context=self.get_serializer_context()).data\n )\n\n def create(self, request, *args, **kwargs):\n serializer = self.get_serializer(data=request.data)\n serializer.is_valid(raise_exception=True)\n instance = serializer.save(food_event=self.food_event)\n return Response(\n FoodOrderSerializer(instance, context=self.get_serializer_context()).data,\n status=status.HTTP_201_CREATED,\n )\n", "path": "website/pizzas/api/v2/views.py"}]} | 1,460 | 190 |
gh_patches_debug_30501 | rasdani/github-patches | git_diff | mampfes__hacs_waste_collection_schedule-306 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calendar used as a trigger for automations never fires.
So I created an automation that uses the new calendar triggers on the generated calendar to notify me when I have to put the bins out.
However, the automation never runs because the trigger never fires.
I debugged this a bit and found the following issue:
HA asks the integration [here](https://github.com/home-assistant/core/blob/dev/homeassistant/components/calendar/trigger.py#L98) for all applicable events. However, the returned list is not quite correct. The timestamps are datetimes and the integration checks only the date component [here](https://github.com/mampfes/hacs_waste_collection_schedule/blob/master/custom_components/waste_collection_schedule/calendar.py#L53).
In my case, my local timezone is Europe/Berlin, which is currently UTC+2. HA gives UTC timestamps to the integration, so they are shifted by two hours "into the past" (not really, but you know what I mean). This means that the date check is wrong, as it misses the events for the day.
I changed the following and it worked in my testing, but maybe you have a better idea of how to fix it:
```python
async def async_get_events(self, hass, start_datetime, end_datetime):
"""Return all events within specified time span."""
collections = []
for a in self._scraper.get_upcoming(include_today=True):
event = self._convert(a)
if event.start_datetime_local >= start_datetime and event.end_datetime_local <= end_datetime:
collections.append(event)
return collections
def _convert(self, collection):
"""Convert an collection into a Home Assistant calendar event."""
return CalendarEvent(
summary=collection.type,
start=collection.date,
end=collection.date,
)
```
Essentially, I convert to a HA calendar event first and then let HA convert the start/end times of the event to local time to compare them against the given start/end times, which are still in UTC. But both are now proper datetime objects with timezone information, so comparing them works fine.
--- END ISSUE ---
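The key mechanism behind the reporter's fix is that timezone-aware `datetime` objects compare by absolute instant, so HA's UTC window can safely be compared against the event's local start time, whereas comparing bare `date()` values drops the offset. A small self-contained illustration (the specific timestamps are arbitrary):

```python
from datetime import datetime, timedelta, timezone

berlin = timezone(timedelta(hours=2))  # Europe/Berlin during DST

utc_start = datetime(2022, 9, 1, 22, 0, tzinfo=timezone.utc)
local_event_start = datetime(2022, 9, 2, 0, 0, tzinfo=berlin)

assert utc_start == local_event_start                # same instant, so aware comparison matches
assert utc_start.date() != local_event_start.date()  # plain dates differ, so the buggy check misses the event
```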
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `custom_components/waste_collection_schedule/calendar.py`
Content:
```
1 """Calendar platform support for Waste Collection Schedule."""
2
3 import logging
4 from datetime import timedelta
5
6 from homeassistant.components.calendar import CalendarEntity, CalendarEvent
7
8 from custom_components.waste_collection_schedule.waste_collection_schedule.scraper import (
9 Scraper,
10 )
11
12 _LOGGER = logging.getLogger(__name__)
13
14
15 async def async_setup_platform(hass, config, async_add_entities, discovery_info=None):
16 """Set up calendar platform."""
17 # We only want this platform to be set up via discovery.
18 if discovery_info is None:
19 return
20
21 entities = []
22
23 api = discovery_info["api"]
24
25 for scraper in api.scrapers:
26 dedicated_calendar_types = scraper.get_dedicated_calendar_types()
27 global_calendar_types = scraper.get_global_calendar_types()
28
29 if dedicated_calendar_types is not None:
30 for type in dedicated_calendar_types:
31 unique_id = calc_unique_calendar_id(scraper, type)
32
33 entities.append(
34 WasteCollectionCalendar(
35 api,
36 scraper,
37 scraper.get_calendar_title_for_type(type),
38 [scraper.get_collection_type(type)],
39 unique_id,
40 )
41 )
42
43 if global_calendar_types is not None or dedicated_calendar_types is None:
44 unique_id = calc_unique_calendar_id(scraper)
45 entities.append(
46 WasteCollectionCalendar(
47 api,
48 scraper,
49 scraper.calendar_title,
50 [
51 scraper.get_collection_type(type)
52 for type in global_calendar_types
53 ]
54 if global_calendar_types is not None
55 else None,
56 unique_id,
57 )
58 )
59
60 async_add_entities(entities)
61
62
63 class WasteCollectionCalendar(CalendarEntity):
64 """Calendar entity class."""
65
66 def __init__(self, api, scraper, name, types, unique_id: str):
67 self._api = api
68 self._scraper = scraper
69 self._name = name
70 self._types = types
71 self._unique_id = unique_id
72 self._attr_unique_id = unique_id
73
74 @property
75 def name(self):
76 """Return entity name."""
77 return self._name
78
79 @property
80 def event(self):
81 """Return next collection event."""
82 collections = self._scraper.get_upcoming(
83 count=1, include_today=True, types=self._types
84 )
85
86 if len(collections) == 0:
87 return None
88 else:
89 return self._convert(collections[0])
90
91 async def async_get_events(self, hass, start_date, end_date):
92 """Return all events within specified time span."""
93 collections = []
94 for a in self._scraper.get_upcoming(include_today=True, types=self._types):
95 if a.date >= start_date.date() and a.date <= end_date.date():
96 collections.append(self._convert(a))
97 return collections
98
99 def _convert(self, collection):
100 """Convert an collection into a Home Assistant calendar event."""
101 return CalendarEvent(
102 summary=collection.type,
103 start=collection.date,
104 end=collection.date + timedelta(days=1),
105 )
106
107
108 def calc_unique_calendar_id(scraper: Scraper, type: str | None = None):
109 return scraper.unique_id + ("_" + type if type is not None else "") + "_calendar"
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/custom_components/waste_collection_schedule/calendar.py b/custom_components/waste_collection_schedule/calendar.py
--- a/custom_components/waste_collection_schedule/calendar.py
+++ b/custom_components/waste_collection_schedule/calendar.py
@@ -1,9 +1,11 @@
"""Calendar platform support for Waste Collection Schedule."""
import logging
-from datetime import timedelta
+from datetime import timedelta, timezone, datetime
from homeassistant.components.calendar import CalendarEntity, CalendarEvent
+from homeassistant.core import HomeAssistant
+from homeassistant.util.dt import DEFAULT_TIME_ZONE
from custom_components.waste_collection_schedule.waste_collection_schedule.scraper import (
Scraper,
@@ -88,15 +90,23 @@
else:
return self._convert(collections[0])
- async def async_get_events(self, hass, start_date, end_date):
+ async def async_get_events(
+ self, hass: HomeAssistant, start_date: datetime, end_date: datetime
+ ):
"""Return all events within specified time span."""
- collections = []
- for a in self._scraper.get_upcoming(include_today=True, types=self._types):
- if a.date >= start_date.date() and a.date <= end_date.date():
- collections.append(self._convert(a))
- return collections
+ events = []
- def _convert(self, collection):
+ for collection in self._scraper.get_upcoming(
+ include_today=True, types=self._types
+ ):
+ event = self._convert(collection)
+
+ if start_date <= event.start_datetime_local <= end_date:
+ events.append(event)
+
+ return events
+
+ def _convert(self, collection) -> CalendarEvent:
"""Convert an collection into a Home Assistant calendar event."""
return CalendarEvent(
summary=collection.type,
| {"golden_diff": "diff --git a/custom_components/waste_collection_schedule/calendar.py b/custom_components/waste_collection_schedule/calendar.py\n--- a/custom_components/waste_collection_schedule/calendar.py\n+++ b/custom_components/waste_collection_schedule/calendar.py\n@@ -1,9 +1,11 @@\n \"\"\"Calendar platform support for Waste Collection Schedule.\"\"\"\n \n import logging\n-from datetime import timedelta\n+from datetime import timedelta, timezone, datetime\n \n from homeassistant.components.calendar import CalendarEntity, CalendarEvent\n+from homeassistant.core import HomeAssistant\n+from homeassistant.util.dt import DEFAULT_TIME_ZONE\n \n from custom_components.waste_collection_schedule.waste_collection_schedule.scraper import (\n Scraper,\n@@ -88,15 +90,23 @@\n else:\n return self._convert(collections[0])\n \n- async def async_get_events(self, hass, start_date, end_date):\n+ async def async_get_events(\n+ self, hass: HomeAssistant, start_date: datetime, end_date: datetime\n+ ):\n \"\"\"Return all events within specified time span.\"\"\"\n- collections = []\n- for a in self._scraper.get_upcoming(include_today=True, types=self._types):\n- if a.date >= start_date.date() and a.date <= end_date.date():\n- collections.append(self._convert(a))\n- return collections\n+ events = []\n \n- def _convert(self, collection):\n+ for collection in self._scraper.get_upcoming(\n+ include_today=True, types=self._types\n+ ):\n+ event = self._convert(collection)\n+\n+ if start_date <= event.start_datetime_local <= end_date:\n+ events.append(event)\n+\n+ return events\n+\n+ def _convert(self, collection) -> CalendarEvent:\n \"\"\"Convert an collection into a Home Assistant calendar event.\"\"\"\n return CalendarEvent(\n summary=collection.type,\n", "issue": "Calendar used as a trigger for automations never fire.\nSo I created an automation that uses the generated calendar to notify me when I have to put the bins out using the new calendar triggers.\r\n\r\nHowever, the automation never runs because the trigger never fires.\r\n\r\nI debugged this a bit and found tha following issue:\r\n\r\nHA asks the integration [here](https://github.com/home-assistant/core/blob/dev/homeassistant/components/calendar/trigger.py#L98) for all applicable events. However, the returned list is not quite correct. The timestamps are datetimes and the integration checks only the date component [here](https://github.com/mampfes/hacs_waste_collection_schedule/blob/master/custom_components/waste_collection_schedule/calendar.py#L53).\r\n\r\nIn my case, my local timezone is Europe/Berlin, which is currently UTC+2. HA gives UTC timestamps to the integration, so they are shifted by two hours \"into the past\" (not really, but you know what I mean). 
This means that the date check is wrong, as it misses the events for the day.\r\n\r\nI changed the following and it worked in my testing but maybe you have a better idea on how to fix that:\r\n\r\n```python\r\n async def async_get_events(self, hass, start_datetime, end_datetime):\r\n \"\"\"Return all events within specified time span.\"\"\"\r\n collections = []\r\n for a in self._scraper.get_upcoming(include_today=True):\r\n event = self._convert(a)\r\n if event.start_datetime_local >= start_datetime and event.end_datetime_local <= end_datetime:\r\n collections.append(event)\r\n return collections\r\n\r\n def _convert(self, collection):\r\n \"\"\"Convert an collection into a Home Assistant calendar event.\"\"\"\r\n return CalendarEvent(\r\n summary=collection.type,\r\n start=collection.date,\r\n end=collection.date,\r\n )\r\n```\r\n\r\nEssentially, I convert to a HA calender event first and then let HA convert the start/end times of the event to local time to compare them against the given start/end times which are still in UTC. But both are now proper datetime objects with timezone information so comparing them works fine.\n", "before_files": [{"content": "\"\"\"Calendar platform support for Waste Collection Schedule.\"\"\"\n\nimport logging\nfrom datetime import timedelta\n\nfrom homeassistant.components.calendar import CalendarEntity, CalendarEvent\n\nfrom custom_components.waste_collection_schedule.waste_collection_schedule.scraper import (\n Scraper,\n)\n\n_LOGGER = logging.getLogger(__name__)\n\n\nasync def async_setup_platform(hass, config, async_add_entities, discovery_info=None):\n \"\"\"Set up calendar platform.\"\"\"\n # We only want this platform to be set up via discovery.\n if discovery_info is None:\n return\n\n entities = []\n\n api = discovery_info[\"api\"]\n\n for scraper in api.scrapers:\n dedicated_calendar_types = scraper.get_dedicated_calendar_types()\n global_calendar_types = scraper.get_global_calendar_types()\n\n if dedicated_calendar_types is not None:\n for type in dedicated_calendar_types:\n unique_id = calc_unique_calendar_id(scraper, type)\n\n entities.append(\n WasteCollectionCalendar(\n api,\n scraper,\n scraper.get_calendar_title_for_type(type),\n [scraper.get_collection_type(type)],\n unique_id,\n )\n )\n\n if global_calendar_types is not None or dedicated_calendar_types is None:\n unique_id = calc_unique_calendar_id(scraper)\n entities.append(\n WasteCollectionCalendar(\n api,\n scraper,\n scraper.calendar_title,\n [\n scraper.get_collection_type(type)\n for type in global_calendar_types\n ]\n if global_calendar_types is not None\n else None,\n unique_id,\n )\n )\n\n async_add_entities(entities)\n\n\nclass WasteCollectionCalendar(CalendarEntity):\n \"\"\"Calendar entity class.\"\"\"\n\n def __init__(self, api, scraper, name, types, unique_id: str):\n self._api = api\n self._scraper = scraper\n self._name = name\n self._types = types\n self._unique_id = unique_id\n self._attr_unique_id = unique_id\n\n @property\n def name(self):\n \"\"\"Return entity name.\"\"\"\n return self._name\n\n @property\n def event(self):\n \"\"\"Return next collection event.\"\"\"\n collections = self._scraper.get_upcoming(\n count=1, include_today=True, types=self._types\n )\n\n if len(collections) == 0:\n return None\n else:\n return self._convert(collections[0])\n\n async def async_get_events(self, hass, start_date, end_date):\n \"\"\"Return all events within specified time span.\"\"\"\n collections = []\n for a in self._scraper.get_upcoming(include_today=True, 
types=self._types):\n if a.date >= start_date.date() and a.date <= end_date.date():\n collections.append(self._convert(a))\n return collections\n\n def _convert(self, collection):\n \"\"\"Convert an collection into a Home Assistant calendar event.\"\"\"\n return CalendarEvent(\n summary=collection.type,\n start=collection.date,\n end=collection.date + timedelta(days=1),\n )\n\n\ndef calc_unique_calendar_id(scraper: Scraper, type: str | None = None):\n return scraper.unique_id + (\"_\" + type if type is not None else \"\") + \"_calendar\"\n", "path": "custom_components/waste_collection_schedule/calendar.py"}], "after_files": [{"content": "\"\"\"Calendar platform support for Waste Collection Schedule.\"\"\"\n\nimport logging\nfrom datetime import timedelta, timezone, datetime\n\nfrom homeassistant.components.calendar import CalendarEntity, CalendarEvent\nfrom homeassistant.core import HomeAssistant\nfrom homeassistant.util.dt import DEFAULT_TIME_ZONE\n\nfrom custom_components.waste_collection_schedule.waste_collection_schedule.scraper import (\n Scraper,\n)\n\n_LOGGER = logging.getLogger(__name__)\n\n\nasync def async_setup_platform(hass, config, async_add_entities, discovery_info=None):\n \"\"\"Set up calendar platform.\"\"\"\n # We only want this platform to be set up via discovery.\n if discovery_info is None:\n return\n\n entities = []\n\n api = discovery_info[\"api\"]\n\n for scraper in api.scrapers:\n dedicated_calendar_types = scraper.get_dedicated_calendar_types()\n global_calendar_types = scraper.get_global_calendar_types()\n\n if dedicated_calendar_types is not None:\n for type in dedicated_calendar_types:\n unique_id = calc_unique_calendar_id(scraper, type)\n\n entities.append(\n WasteCollectionCalendar(\n api,\n scraper,\n scraper.get_calendar_title_for_type(type),\n [scraper.get_collection_type(type)],\n unique_id,\n )\n )\n\n if global_calendar_types is not None or dedicated_calendar_types is None:\n unique_id = calc_unique_calendar_id(scraper)\n entities.append(\n WasteCollectionCalendar(\n api,\n scraper,\n scraper.calendar_title,\n [\n scraper.get_collection_type(type)\n for type in global_calendar_types\n ]\n if global_calendar_types is not None\n else None,\n unique_id,\n )\n )\n\n async_add_entities(entities)\n\n\nclass WasteCollectionCalendar(CalendarEntity):\n \"\"\"Calendar entity class.\"\"\"\n\n def __init__(self, api, scraper, name, types, unique_id: str):\n self._api = api\n self._scraper = scraper\n self._name = name\n self._types = types\n self._unique_id = unique_id\n self._attr_unique_id = unique_id\n\n @property\n def name(self):\n \"\"\"Return entity name.\"\"\"\n return self._name\n\n @property\n def event(self):\n \"\"\"Return next collection event.\"\"\"\n collections = self._scraper.get_upcoming(\n count=1, include_today=True, types=self._types\n )\n\n if len(collections) == 0:\n return None\n else:\n return self._convert(collections[0])\n\n async def async_get_events(\n self, hass: HomeAssistant, start_date: datetime, end_date: datetime\n ):\n \"\"\"Return all events within specified time span.\"\"\"\n events = []\n\n for collection in self._scraper.get_upcoming(\n include_today=True, types=self._types\n ):\n event = self._convert(collection)\n\n if start_date <= event.start_datetime_local <= end_date:\n events.append(event)\n\n return events\n\n def _convert(self, collection) -> CalendarEvent:\n \"\"\"Convert an collection into a Home Assistant calendar event.\"\"\"\n return CalendarEvent(\n summary=collection.type,\n start=collection.date,\n 
end=collection.date + timedelta(days=1),\n )\n\n\ndef calc_unique_calendar_id(scraper: Scraper, type: str | None = None):\n return scraper.unique_id + (\"_\" + type if type is not None else \"\") + \"_calendar\"\n", "path": "custom_components/waste_collection_schedule/calendar.py"}]} | 1,596 | 392 |
gh_patches_debug_36612 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-2633 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider officedepot is broken
During the global build at 2021-08-18-14-42-26, spider **officedepot** failed with **0 features** and **0 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/logs/officedepot.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/officedepot.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/officedepot.geojson))
--- END ISSUE ---
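The failure suggests the HTML store-locator pages this spider scraped are no longer served; the patch later in this entry switches to Office Depot's where2getit store-locator JSON API instead. A simplified sketch of that pattern is below; the endpoint, headers, and payload fields are trimmed and partly assumed, so see the actual diff for the values used in the real spider:

```python
import json

import scrapy


class LocatorApiSpider(scrapy.Spider):
    """Illustrative sketch of querying a where2getit-style locator API."""

    name = "locator_api_example"

    def start_requests(self):
        # Payload shape mirrors the patch below; the app key is a placeholder.
        payload = {"request": {"appkey": "<APP-KEY>", "formdata": {"objectname": "Locator::Store"}}}
        yield scrapy.http.FormRequest(
            url="https://locations.where2getit.com/officedepot/rest/getlist",
            method="POST",
            headers={"Content-Type": "application/json"},
            body=json.dumps(payload),
            callback=self.parse,
        )

    def parse(self, response):
        data = json.loads(response.text)
        for store in data["response"]["collection"]:
            yield {
                "ref": store["clientkey"],
                "lat": store["latitude"],
                "lon": store["longitude"],
            }
```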
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/officedepot.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import scrapy
3
4 from locations.items import GeojsonPointItem
5 from locations.hours import OpeningHours
6
7
8 class OfficedepotSpider(scrapy.Spider):
9 name = 'officedepot'
10 allowed_domains = ['www.officedepot.com']
11 start_urls = ['https://www.officedepot.com/storelocator/states/']
12
13 def parse_store(self, response):
14 o = OpeningHours()
15 for d in response.xpath('//time[@itemprop="openingHours"]/@datetime').extract():
16 day, times = d.split(' ', 1)
17 s, f = times.split('-')
18
19 # They seem to have a bug where they put down 24:00 when they mean noon
20 if s == '24:00': s = '12:00'
21
22 o.add_range(day, s, f)
23
24
25 store_number_results = response.xpath('//dt[@class="lsp_number"]/text()')
26 if store_number_results:
27 ref = store_number_results[-1].extract().strip()
28
29 yield GeojsonPointItem(
30 lat=response.xpath('//meta[@itemprop="latitude"]/@content').extract_first(),
31 lon=response.xpath('//meta[@itemprop="longitude"]/@content').extract_first(),
32 phone=response.xpath('//p[@itemprop="telephone"]/text()').extract_first(),
33 addr_full=response.xpath('//p[@itemprop="streetAddress"]/text()').extract_first(),
34 city=response.xpath('//p[@itemprop="addressLocality"]/text()').extract_first(),
35 state=response.xpath('//p[@itemprop="addressRegion"]/text()').extract_first(),
36 postcode=response.xpath('//p[@itemprop="postalCode"]/text()').extract_first(),
37 website=response.url,
38 ref=ref,
39 opening_hours=o.as_opening_hours(),
40 )
41
42 def parse(self, response):
43 for state in response.xpath('//div[@style="float: left; width: 200px;"]/a/@href').extract():
44 yield scrapy.Request(
45 response.urljoin(state),
46 callback=self.parse,
47 )
48
49 for store in response.xpath('//div[@style="float: left; width: 300px; padding-top: 10px;"]/a/@href').extract():
50 yield scrapy.Request(
51 response.urljoin(store),
52 callback=self.parse_store,
53 )
54
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/officedepot.py b/locations/spiders/officedepot.py
--- a/locations/spiders/officedepot.py
+++ b/locations/spiders/officedepot.py
@@ -1,4 +1,5 @@
# -*- coding: utf-8 -*-
+import json
import scrapy
from locations.items import GeojsonPointItem
@@ -7,8 +8,41 @@
class OfficedepotSpider(scrapy.Spider):
name = 'officedepot'
- allowed_domains = ['www.officedepot.com']
- start_urls = ['https://www.officedepot.com/storelocator/states/']
+ allowed_domains = ["where2getit.com"]
+
+ def start_requests(self):
+ url = 'https://locations.where2getit.com/officedepot/rest/getlist?like=0.9145201524205426&lang=en_US'
+
+ headers = {
+ 'Accept-Language': 'en-US,en;q=0.9',
+ 'Origin': 'https://hosted.where2getit.com',
+ 'Accept-Encoding': 'gzip, deflate, br',
+ 'Accept': 'application/json, text/javascript, */*; q=0.01',
+ 'Referer': 'https://hosted.where2getit.com/officedepot/2015/index1.html',
+ 'Connection': 'keep-alive',
+ 'Content-Type': 'application/json',
+ }
+
+ form_data = {
+ "request": {
+ "appkey": "592778B0-A13B-11EB-B3DB-84030D516365",
+ "formdata": {
+ "order": "city",
+ "objectname": "Locator::Store",
+ "softmatch": "1",
+ "where": {
+ }
+ }
+ }
+ }
+
+ yield scrapy.http.FormRequest(
+ url=url,
+ method='POST',
+ body=json.dumps(form_data),
+ headers=headers,
+ callback=self.parse,
+ )
def parse_store(self, response):
o = OpeningHours()
@@ -40,14 +74,20 @@
)
def parse(self, response):
- for state in response.xpath('//div[@style="float: left; width: 200px;"]/a/@href').extract():
- yield scrapy.Request(
- response.urljoin(state),
- callback=self.parse,
- )
-
- for store in response.xpath('//div[@style="float: left; width: 300px; padding-top: 10px;"]/a/@href').extract():
- yield scrapy.Request(
- response.urljoin(store),
- callback=self.parse_store,
- )
+ data = json.loads(response.body_as_unicode())
+
+ for store in data["response"]["collection"]:
+ properties = {
+ 'ref': store["clientkey"],
+ 'name': store.get("name"),
+ 'addr_full': store["address1"],
+ 'city': store["city"],
+ 'state': store["state"],
+ 'postcode': store["postalcode"],
+ 'country': store["country"],
+ 'lat': store["latitude"],
+ 'lon': store["longitude"],
+ 'phone': store["phone"],
+ }
+
+ yield GeojsonPointItem(**properties)
| {"golden_diff": "diff --git a/locations/spiders/officedepot.py b/locations/spiders/officedepot.py\n--- a/locations/spiders/officedepot.py\n+++ b/locations/spiders/officedepot.py\n@@ -1,4 +1,5 @@\n # -*- coding: utf-8 -*-\n+import json\n import scrapy\n \n from locations.items import GeojsonPointItem\n@@ -7,8 +8,41 @@\n \n class OfficedepotSpider(scrapy.Spider):\n name = 'officedepot'\n- allowed_domains = ['www.officedepot.com']\n- start_urls = ['https://www.officedepot.com/storelocator/states/']\n+ allowed_domains = [\"where2getit.com\"]\n+\n+ def start_requests(self):\n+ url = 'https://locations.where2getit.com/officedepot/rest/getlist?like=0.9145201524205426&lang=en_US'\n+\n+ headers = {\n+ 'Accept-Language': 'en-US,en;q=0.9',\n+ 'Origin': 'https://hosted.where2getit.com',\n+ 'Accept-Encoding': 'gzip, deflate, br',\n+ 'Accept': 'application/json, text/javascript, */*; q=0.01',\n+ 'Referer': 'https://hosted.where2getit.com/officedepot/2015/index1.html',\n+ 'Connection': 'keep-alive',\n+ 'Content-Type': 'application/json',\n+ }\n+\n+ form_data = {\n+ \"request\": {\n+ \"appkey\": \"592778B0-A13B-11EB-B3DB-84030D516365\",\n+ \"formdata\": {\n+ \"order\": \"city\",\n+ \"objectname\": \"Locator::Store\",\n+ \"softmatch\": \"1\",\n+ \"where\": {\n+ }\n+ }\n+ }\n+ }\n+\n+ yield scrapy.http.FormRequest(\n+ url=url,\n+ method='POST',\n+ body=json.dumps(form_data),\n+ headers=headers,\n+ callback=self.parse,\n+ )\n \n def parse_store(self, response):\n o = OpeningHours()\n@@ -40,14 +74,20 @@\n )\n \n def parse(self, response):\n- for state in response.xpath('//div[@style=\"float: left; width: 200px;\"]/a/@href').extract():\n- yield scrapy.Request(\n- response.urljoin(state),\n- callback=self.parse,\n- )\n-\n- for store in response.xpath('//div[@style=\"float: left; width: 300px; padding-top: 10px;\"]/a/@href').extract():\n- yield scrapy.Request(\n- response.urljoin(store),\n- callback=self.parse_store,\n- )\n+ data = json.loads(response.body_as_unicode())\n+\n+ for store in data[\"response\"][\"collection\"]:\n+ properties = {\n+ 'ref': store[\"clientkey\"],\n+ 'name': store.get(\"name\"),\n+ 'addr_full': store[\"address1\"],\n+ 'city': store[\"city\"],\n+ 'state': store[\"state\"],\n+ 'postcode': store[\"postalcode\"],\n+ 'country': store[\"country\"],\n+ 'lat': store[\"latitude\"],\n+ 'lon': store[\"longitude\"],\n+ 'phone': store[\"phone\"],\n+ }\n+\n+ yield GeojsonPointItem(**properties)\n", "issue": "Spider officedepot is broken\nDuring the global build at 2021-08-18-14-42-26, spider **officedepot** failed with **0 features** and **0 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/logs/officedepot.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/officedepot.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/officedepot.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\n\nclass OfficedepotSpider(scrapy.Spider):\n name = 'officedepot'\n allowed_domains = ['www.officedepot.com']\n start_urls = ['https://www.officedepot.com/storelocator/states/']\n\n def parse_store(self, response):\n o = OpeningHours()\n for d in response.xpath('//time[@itemprop=\"openingHours\"]/@datetime').extract():\n day, times = d.split(' ', 1)\n s, f = times.split('-')\n\n # They seem to have a bug where they put down 24:00 when they mean noon\n if s == 
'24:00': s = '12:00'\n\n o.add_range(day, s, f)\n\n\n store_number_results = response.xpath('//dt[@class=\"lsp_number\"]/text()')\n if store_number_results:\n ref = store_number_results[-1].extract().strip()\n\n yield GeojsonPointItem(\n lat=response.xpath('//meta[@itemprop=\"latitude\"]/@content').extract_first(),\n lon=response.xpath('//meta[@itemprop=\"longitude\"]/@content').extract_first(),\n phone=response.xpath('//p[@itemprop=\"telephone\"]/text()').extract_first(),\n addr_full=response.xpath('//p[@itemprop=\"streetAddress\"]/text()').extract_first(),\n city=response.xpath('//p[@itemprop=\"addressLocality\"]/text()').extract_first(),\n state=response.xpath('//p[@itemprop=\"addressRegion\"]/text()').extract_first(),\n postcode=response.xpath('//p[@itemprop=\"postalCode\"]/text()').extract_first(),\n website=response.url,\n ref=ref,\n opening_hours=o.as_opening_hours(),\n )\n\n def parse(self, response):\n for state in response.xpath('//div[@style=\"float: left; width: 200px;\"]/a/@href').extract():\n yield scrapy.Request(\n response.urljoin(state),\n callback=self.parse,\n )\n\n for store in response.xpath('//div[@style=\"float: left; width: 300px; padding-top: 10px;\"]/a/@href').extract():\n yield scrapy.Request(\n response.urljoin(store),\n callback=self.parse_store,\n )\n", "path": "locations/spiders/officedepot.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport json\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\n\nclass OfficedepotSpider(scrapy.Spider):\n name = 'officedepot'\n allowed_domains = [\"where2getit.com\"]\n\n def start_requests(self):\n url = 'https://locations.where2getit.com/officedepot/rest/getlist?like=0.9145201524205426&lang=en_US'\n\n headers = {\n 'Accept-Language': 'en-US,en;q=0.9',\n 'Origin': 'https://hosted.where2getit.com',\n 'Accept-Encoding': 'gzip, deflate, br',\n 'Accept': 'application/json, text/javascript, */*; q=0.01',\n 'Referer': 'https://hosted.where2getit.com/officedepot/2015/index1.html',\n 'Connection': 'keep-alive',\n 'Content-Type': 'application/json',\n }\n\n form_data = {\n \"request\": {\n \"appkey\": \"592778B0-A13B-11EB-B3DB-84030D516365\",\n \"formdata\": {\n \"order\": \"city\",\n \"objectname\": \"Locator::Store\",\n \"softmatch\": \"1\",\n \"where\": {\n }\n }\n }\n }\n\n yield scrapy.http.FormRequest(\n url=url,\n method='POST',\n body=json.dumps(form_data),\n headers=headers,\n callback=self.parse,\n )\n\n def parse_store(self, response):\n o = OpeningHours()\n for d in response.xpath('//time[@itemprop=\"openingHours\"]/@datetime').extract():\n day, times = d.split(' ', 1)\n s, f = times.split('-')\n\n # They seem to have a bug where they put down 24:00 when they mean noon\n if s == '24:00': s = '12:00'\n\n o.add_range(day, s, f)\n\n\n store_number_results = response.xpath('//dt[@class=\"lsp_number\"]/text()')\n if store_number_results:\n ref = store_number_results[-1].extract().strip()\n\n yield GeojsonPointItem(\n lat=response.xpath('//meta[@itemprop=\"latitude\"]/@content').extract_first(),\n lon=response.xpath('//meta[@itemprop=\"longitude\"]/@content').extract_first(),\n phone=response.xpath('//p[@itemprop=\"telephone\"]/text()').extract_first(),\n addr_full=response.xpath('//p[@itemprop=\"streetAddress\"]/text()').extract_first(),\n city=response.xpath('//p[@itemprop=\"addressLocality\"]/text()').extract_first(),\n state=response.xpath('//p[@itemprop=\"addressRegion\"]/text()').extract_first(),\n 
postcode=response.xpath('//p[@itemprop=\"postalCode\"]/text()').extract_first(),\n website=response.url,\n ref=ref,\n opening_hours=o.as_opening_hours(),\n )\n\n def parse(self, response):\n data = json.loads(response.body_as_unicode())\n\n for store in data[\"response\"][\"collection\"]:\n properties = {\n 'ref': store[\"clientkey\"],\n 'name': store.get(\"name\"),\n 'addr_full': store[\"address1\"],\n 'city': store[\"city\"],\n 'state': store[\"state\"],\n 'postcode': store[\"postalcode\"],\n 'country': store[\"country\"],\n 'lat': store[\"latitude\"],\n 'lon': store[\"longitude\"],\n 'phone': store[\"phone\"],\n }\n\n yield GeojsonPointItem(**properties)\n", "path": "locations/spiders/officedepot.py"}]} | 1,058 | 777 |
gh_patches_debug_29567 | rasdani/github-patches | git_diff | GPflow__GPflow-1350 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tf2.2-rc1 gast requirement
Hi,
gpflow 2.0.0-rc1 pins gast to 0.2.2.
TensorFlow requires gast 0.3.3 as of 2.2-rc1, which is incompatible with gpflow's pin.
Best Regards,
Marco
--- END ISSUE ---
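The direction the patch below takes is to make the gast pin conditional on which TensorFlow release will end up installed, rather than hard-pinning `gast==0.2.2`. A minimal sketch (version boundaries taken from the report above):

```python
from pkg_resources import parse_version


def gast_requirement_for(tf_version: str) -> str:
    # TF < 2.2 works with the old gast pin; TF >= 2.2 needs gast >= 0.3.3.
    if parse_version(tf_version) < parse_version("2.2"):
        return "gast>=0.2.2,<0.3"
    return "gast>=0.3.3"


assert gast_requirement_for("2.1.0") == "gast>=0.2.2,<0.3"
assert gast_requirement_for("2.2.0") == "gast>=0.3.3"
```

Note that a pre-release string such as "2.2.0rc1" parses as *less than* "2.2", so real logic (as in the eventual patch) resolves the latest stable TensorFlow release from PyPI before applying this comparison.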
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3
4 # pylint: skip-file
5
6 import os
7 import sys
8 from pathlib import Path
9
10 from pkg_resources import parse_version
11 from setuptools import find_packages, setup
12
13 is_py37 = sys.version_info.major == 3 and sys.version_info.minor == 7
14 on_rtd = os.environ.get('READTHEDOCS', None) == 'True' # copied from the docs
15
16 # Dependencies of GPflow
17 requirements = [
18 'numpy>=1.10.0',
19 'scipy>=0.18.0',
20 'multipledispatch>=0.4.9',
21 'tabulate',
22 'gast==0.2.2',
23 ]
24
25 if not is_py37:
26 requirements.append("dataclasses")
27
28 if not on_rtd:
29 requirements.append("tensorflow-probability>=0.9")
30
31 min_tf_version = '2.1.0'
32 tf_cpu = 'tensorflow'
33 tf_gpu = 'tensorflow-gpu'
34
35 # Only detect TF if not installed or outdated. If not, do not do not list as
36 # requirement to avoid installing over e.g. tensorflow-gpu
37 # To avoid this, rely on importing rather than the package name (like pip).
38
39 try:
40 # If tf not installed, import raises ImportError
41 import tensorflow as tf
42 if parse_version(tf.__version__) < parse_version(min_tf_version):
43 # TF pre-installed, but below the minimum required version
44 raise DeprecationWarning("TensorFlow version below minimum requirement")
45 except (ImportError, DeprecationWarning):
46 # Add TensorFlow to dependencies to trigger installation/update
47 if not on_rtd:
48 # Do not add TF if we are installing GPflow on readthedocs
49 requirements.append(tf_cpu)
50
51 with open(str(Path(".", "VERSION").absolute())) as version_file:
52 version = version_file.read().strip()
53
54 packages = find_packages('.', exclude=["tests"])
55
56 setup(name='gpflow',
57 version=version,
58 author="James Hensman, Alex Matthews",
59 author_email="[email protected]",
60 description="Gaussian process methods in TensorFlow",
61 license="Apache License 2.0",
62 keywords="machine-learning gaussian-processes kernels tensorflow",
63 url="http://github.com/GPflow/GPflow",
64 packages=packages,
65 include_package_data=True,
66 install_requires=requirements,
67 extras_require={'Tensorflow with GPU': [tf_gpu]},
68 python_requires=">=3.6",
69 classifiers=[
70 'License :: OSI Approved :: Apache Software License',
71 'Natural Language :: English',
72 'Operating System :: MacOS :: MacOS X',
73 'Operating System :: Microsoft :: Windows',
74 'Operating System :: POSIX :: Linux',
75 'Programming Language :: Python :: 3.6',
76 'Topic :: Scientific/Engineering :: Artificial Intelligence'
77 ])
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -18,8 +18,7 @@
'numpy>=1.10.0',
'scipy>=0.18.0',
'multipledispatch>=0.4.9',
- 'tabulate',
- 'gast==0.2.2',
+ 'tabulate'
]
if not is_py37:
@@ -32,6 +31,22 @@
tf_cpu = 'tensorflow'
tf_gpu = 'tensorflow-gpu'
+
+# for latest_version() [see https://github.com/GPflow/GPflow/issues/1348]:
+def latest_version(package_name):
+ import json
+ from urllib import request
+ import re
+
+ url = f"https://pypi.python.org/pypi/{package_name}/json"
+ data = json.load(request.urlopen(url))
+ # filter out rc and beta releases and, more generally, any releases that
+ # do not contain exclusively numbers and dots.
+ versions = [parse_version(v) for v in data["releases"].keys() if re.match("^[0-9.]+$", v)]
+ versions.sort()
+ return versions[-1] # return latest version
+
+
# Only detect TF if not installed or outdated. If not, do not do not list as
# requirement to avoid installing over e.g. tensorflow-gpu
# To avoid this, rely on importing rather than the package name (like pip).
@@ -47,6 +62,9 @@
if not on_rtd:
# Do not add TF if we are installing GPflow on readthedocs
requirements.append(tf_cpu)
+ gast_requirement = 'gast>=0.2.2,<0.3' if latest_version('tensorflow') < parse_version('2.2') else 'gast>=0.3.3'
+ requirements.append(gast_requirement)
+
with open(str(Path(".", "VERSION").absolute())) as version_file:
version = version_file.read().strip()
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -18,8 +18,7 @@\n 'numpy>=1.10.0',\n 'scipy>=0.18.0',\n 'multipledispatch>=0.4.9',\n- 'tabulate',\n- 'gast==0.2.2',\n+ 'tabulate'\n ]\n \n if not is_py37:\n@@ -32,6 +31,22 @@\n tf_cpu = 'tensorflow'\n tf_gpu = 'tensorflow-gpu'\n \n+\n+# for latest_version() [see https://github.com/GPflow/GPflow/issues/1348]:\n+def latest_version(package_name):\n+ import json\n+ from urllib import request\n+ import re\n+\n+ url = f\"https://pypi.python.org/pypi/{package_name}/json\"\n+ data = json.load(request.urlopen(url))\n+ # filter out rc and beta releases and, more generally, any releases that\n+ # do not contain exclusively numbers and dots.\n+ versions = [parse_version(v) for v in data[\"releases\"].keys() if re.match(\"^[0-9.]+$\", v)] \n+ versions.sort()\n+ return versions[-1] # return latest version\n+\n+\n # Only detect TF if not installed or outdated. If not, do not do not list as\n # requirement to avoid installing over e.g. tensorflow-gpu\n # To avoid this, rely on importing rather than the package name (like pip).\n@@ -47,6 +62,9 @@\n if not on_rtd:\n # Do not add TF if we are installing GPflow on readthedocs\n requirements.append(tf_cpu)\n+ gast_requirement = 'gast>=0.2.2,<0.3' if latest_version('tensorflow') < parse_version('2.2') else 'gast>=0.3.3'\n+ requirements.append(gast_requirement)\n+ \n \n with open(str(Path(\".\", \"VERSION\").absolute())) as version_file:\n version = version_file.read().strip()\n", "issue": "tf2.2-rc1 gast requirement\nHi,\r\n\r\ngpflow 2.0.0-rc1 has gast requirement 0.2.2.\r\n\r\nTensorFlow has gast requirement 0.3.3 from 2.2-rc1, which is incompatible with gpflow requirement.\r\n\r\nBest Regards,\r\n\r\nMarco\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n# pylint: skip-file\n\nimport os\nimport sys\nfrom pathlib import Path\n\nfrom pkg_resources import parse_version\nfrom setuptools import find_packages, setup\n\nis_py37 = sys.version_info.major == 3 and sys.version_info.minor == 7\non_rtd = os.environ.get('READTHEDOCS', None) == 'True' # copied from the docs\n\n# Dependencies of GPflow\nrequirements = [\n 'numpy>=1.10.0',\n 'scipy>=0.18.0',\n 'multipledispatch>=0.4.9',\n 'tabulate',\n 'gast==0.2.2',\n]\n\nif not is_py37:\n requirements.append(\"dataclasses\")\n\nif not on_rtd:\n requirements.append(\"tensorflow-probability>=0.9\")\n\nmin_tf_version = '2.1.0'\ntf_cpu = 'tensorflow'\ntf_gpu = 'tensorflow-gpu'\n\n# Only detect TF if not installed or outdated. If not, do not do not list as\n# requirement to avoid installing over e.g. 
tensorflow-gpu\n# To avoid this, rely on importing rather than the package name (like pip).\n\ntry:\n # If tf not installed, import raises ImportError\n import tensorflow as tf\n if parse_version(tf.__version__) < parse_version(min_tf_version):\n # TF pre-installed, but below the minimum required version\n raise DeprecationWarning(\"TensorFlow version below minimum requirement\")\nexcept (ImportError, DeprecationWarning):\n # Add TensorFlow to dependencies to trigger installation/update\n if not on_rtd:\n # Do not add TF if we are installing GPflow on readthedocs\n requirements.append(tf_cpu)\n\nwith open(str(Path(\".\", \"VERSION\").absolute())) as version_file:\n version = version_file.read().strip()\n\npackages = find_packages('.', exclude=[\"tests\"])\n\nsetup(name='gpflow',\n version=version,\n author=\"James Hensman, Alex Matthews\",\n author_email=\"[email protected]\",\n description=\"Gaussian process methods in TensorFlow\",\n license=\"Apache License 2.0\",\n keywords=\"machine-learning gaussian-processes kernels tensorflow\",\n url=\"http://github.com/GPflow/GPflow\",\n packages=packages,\n include_package_data=True,\n install_requires=requirements,\n extras_require={'Tensorflow with GPU': [tf_gpu]},\n python_requires=\">=3.6\",\n classifiers=[\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence'\n ])\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n# pylint: skip-file\n\nimport os\nimport sys\nfrom pathlib import Path\n\nfrom pkg_resources import parse_version\nfrom setuptools import find_packages, setup\n\nis_py37 = sys.version_info.major == 3 and sys.version_info.minor == 7\non_rtd = os.environ.get('READTHEDOCS', None) == 'True' # copied from the docs\n\n# Dependencies of GPflow\nrequirements = [\n 'numpy>=1.10.0',\n 'scipy>=0.18.0',\n 'multipledispatch>=0.4.9',\n 'tabulate'\n]\n\nif not is_py37:\n requirements.append(\"dataclasses\")\n\nif not on_rtd:\n requirements.append(\"tensorflow-probability>=0.9\")\n\nmin_tf_version = '2.1.0'\ntf_cpu = 'tensorflow'\ntf_gpu = 'tensorflow-gpu'\n\n\n# for latest_version() [see https://github.com/GPflow/GPflow/issues/1348]:\ndef latest_version(package_name):\n import json\n from urllib import request\n import re\n\n url = f\"https://pypi.python.org/pypi/{package_name}/json\"\n data = json.load(request.urlopen(url))\n # filter out rc and beta releases and, more generally, any releases that\n # do not contain exclusively numbers and dots.\n versions = [parse_version(v) for v in data[\"releases\"].keys() if re.match(\"^[0-9.]+$\", v)] \n versions.sort()\n return versions[-1] # return latest version\n\n\n# Only detect TF if not installed or outdated. If not, do not do not list as\n# requirement to avoid installing over e.g. 
tensorflow-gpu\n# To avoid this, rely on importing rather than the package name (like pip).\n\ntry:\n # If tf not installed, import raises ImportError\n import tensorflow as tf\n if parse_version(tf.__version__) < parse_version(min_tf_version):\n # TF pre-installed, but below the minimum required version\n raise DeprecationWarning(\"TensorFlow version below minimum requirement\")\nexcept (ImportError, DeprecationWarning):\n # Add TensorFlow to dependencies to trigger installation/update\n if not on_rtd:\n # Do not add TF if we are installing GPflow on readthedocs\n requirements.append(tf_cpu)\n gast_requirement = 'gast>=0.2.2,<0.3' if latest_version('tensorflow') < parse_version('2.2') else 'gast>=0.3.3'\n requirements.append(gast_requirement)\n \n\nwith open(str(Path(\".\", \"VERSION\").absolute())) as version_file:\n version = version_file.read().strip()\n\npackages = find_packages('.', exclude=[\"tests\"])\n\nsetup(name='gpflow',\n version=version,\n author=\"James Hensman, Alex Matthews\",\n author_email=\"[email protected]\",\n description=\"Gaussian process methods in TensorFlow\",\n license=\"Apache License 2.0\",\n keywords=\"machine-learning gaussian-processes kernels tensorflow\",\n url=\"http://github.com/GPflow/GPflow\",\n packages=packages,\n include_package_data=True,\n install_requires=requirements,\n extras_require={'Tensorflow with GPU': [tf_gpu]},\n python_requires=\">=3.6\",\n classifiers=[\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence'\n ])\n", "path": "setup.py"}]} | 1,085 | 462 |
gh_patches_debug_18286 | rasdani/github-patches | git_diff | spotify__luigi-908 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Exception when a command-line parametter contains a non-ascii character
Hey,
We've just upgraded to Luigi 1.1.2, and now we get an error when launching Tasks that have utf-8 characters on the parameters. (using python 2.7.3)
Here's the stacktrace we get :
```
Traceback (most recent call last):
File "examples/top_artists.py", line 283, in <module>
luigi.run()
File "/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/interface.py", line 434, in run
return interface.run(tasks, worker_scheduler_factory, override_defaults=override_defaults)
File "/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/interface.py", line 165, in run
not(lock.acquire_for(env_params.lock_pid_dir, env_params.lock_size))):
File "/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/lock.py", line 59, in acquire_for
my_pid, my_cmd, pid_file = get_info(pid_dir)
File "/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/lock.py", line 44, in get_info
pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 97: ordinal not in range(128)
```
Here I've just added a `plop = luigi.Parameter()` to `AggregateArtists`, and called it with
`python examples/top_artists.py AggregateArtists --local-scheduler --date-interval 2012-06 --plop à`
The error seems to come from the [Python3 refactoring](https://github.com/spotify/luigi/pull/745). Removing `.encode('utf8')` on [this line](https://github.com/spotify/luigi/blob/master/luigi/lock.py#L44) seems to solve the problem, but I guess it was added for a reason ^^.
Any ideas of how we could solve this on our side, other than removing the encode? (I'll try to add a test for it later)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `luigi/lock.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright 2012-2015 Spotify AB
4 #
5 # Licensed under the Apache License, Version 2.0 (the "License");
6 # you may not use this file except in compliance with the License.
7 # You may obtain a copy of the License at
8 #
9 # http://www.apache.org/licenses/LICENSE-2.0
10 #
11 # Unless required by applicable law or agreed to in writing, software
12 # distributed under the License is distributed on an "AS IS" BASIS,
13 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 # See the License for the specific language governing permissions and
15 # limitations under the License.
16 #
17 """
18 Locking functionality when launching things from the command line.
19 Uses a pidfile.
20 This prevents multiple identical workflows to be launched simultaneously.
21 """
22 from __future__ import print_function
23
24 import hashlib
25 import os
26
27
28 def getpcmd(pid):
29 """
30 Returns command of process.
31
32 :param pid:
33 """
34 cmd = 'ps -p %s -o command=' % (pid,)
35 p = os.popen(cmd, 'r')
36 return p.readline().strip()
37
38
39 def get_info(pid_dir):
40 # Check the name and pid of this process
41 my_pid = os.getpid()
42 my_cmd = getpcmd(my_pid)
43
44 pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'
45
46 return my_pid, my_cmd, pid_file
47
48
49 def acquire_for(pid_dir, num_available=1):
50 """
51 Makes sure the process is only run once at the same time with the same name.
52
53 Notice that we since we check the process name, different parameters to the same
54 command can spawn multiple processes at the same time, i.e. running
55 "/usr/bin/my_process" does not prevent anyone from launching
56 "/usr/bin/my_process --foo bar".
57 """
58
59 my_pid, my_cmd, pid_file = get_info(pid_dir)
60
61 # Check if there is a pid file corresponding to this name
62 if not os.path.exists(pid_dir):
63 os.mkdir(pid_dir)
64 os.chmod(pid_dir, 0o777)
65
66 pids = set()
67 pid_cmds = {}
68 if os.path.exists(pid_file):
69 # There is such a file - read the pid and look up its process name
70 pids.update(filter(None, map(str.strip, open(pid_file))))
71 pid_cmds = dict((pid, getpcmd(pid)) for pid in pids)
72 matching_pids = list(filter(lambda pid: pid_cmds[pid] == my_cmd, pids))
73
74 if len(matching_pids) >= num_available:
75 # We are already running under a different pid
76 print('Pid(s)', ', '.join(matching_pids), 'already running')
77 return False
78 else:
79 # The pid belongs to something else, we could
80 pass
81 pid_cmds[str(my_pid)] = my_cmd
82
83 # Write pids
84 pids.add(str(my_pid))
85 with open(pid_file, 'w') as f:
86 f.writelines('%s\n' % (pid, ) for pid in filter(pid_cmds.__getitem__, pids))
87
88 # Make the file writable by all
89 if os.name == 'nt':
90 pass
91 else:
92 s = os.stat(pid_file)
93 if os.getuid() == s.st_uid:
94 os.chmod(pid_file, s.st_mode | 0o777)
95
96 return True
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/luigi/lock.py b/luigi/lock.py
--- a/luigi/lock.py
+++ b/luigi/lock.py
@@ -24,6 +24,8 @@
import hashlib
import os
+from luigi import six
+
def getpcmd(pid):
"""
@@ -32,16 +34,23 @@
:param pid:
"""
cmd = 'ps -p %s -o command=' % (pid,)
- p = os.popen(cmd, 'r')
- return p.readline().strip()
+ with os.popen(cmd, 'r') as p:
+ return p.readline().strip()
-def get_info(pid_dir):
+def get_info(pid_dir, my_pid=None):
# Check the name and pid of this process
- my_pid = os.getpid()
+ if my_pid is None:
+ my_pid = os.getpid()
+
my_cmd = getpcmd(my_pid)
- pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'
+ if six.PY3:
+ cmd_hash = my_cmd.encode('utf8')
+ else:
+ cmd_hash = my_cmd
+
+ pid_file = os.path.join(pid_dir, hashlib.md5(cmd_hash).hexdigest()) + '.pid'
return my_pid, my_cmd, pid_file
| {"golden_diff": "diff --git a/luigi/lock.py b/luigi/lock.py\n--- a/luigi/lock.py\n+++ b/luigi/lock.py\n@@ -24,6 +24,8 @@\n import hashlib\n import os\n \n+from luigi import six\n+\n \n def getpcmd(pid):\n \"\"\"\n@@ -32,16 +34,23 @@\n :param pid:\n \"\"\"\n cmd = 'ps -p %s -o command=' % (pid,)\n- p = os.popen(cmd, 'r')\n- return p.readline().strip()\n+ with os.popen(cmd, 'r') as p:\n+ return p.readline().strip()\n \n \n-def get_info(pid_dir):\n+def get_info(pid_dir, my_pid=None):\n # Check the name and pid of this process\n- my_pid = os.getpid()\n+ if my_pid is None:\n+ my_pid = os.getpid()\n+\n my_cmd = getpcmd(my_pid)\n \n- pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'\n+ if six.PY3:\n+ cmd_hash = my_cmd.encode('utf8')\n+ else:\n+ cmd_hash = my_cmd\n+\n+ pid_file = os.path.join(pid_dir, hashlib.md5(cmd_hash).hexdigest()) + '.pid'\n \n return my_pid, my_cmd, pid_file\n", "issue": "Exception when a command-line parametter contains a non-ascii character\nHey,\n\nWe've just upgraded to Luigi 1.1.2, and now we get an error when launching Tasks that have utf-8 characters on the parameters. (using python 2.7.3)\n\nHere's the stacktrace we get : \n\n```\nTraceback (most recent call last):\n File \"examples/top_artists.py\", line 283, in <module>\n luigi.run()\n File \"/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/interface.py\", line 434, in run\n return interface.run(tasks, worker_scheduler_factory, override_defaults=override_defaults)\n File \"/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/interface.py\", line 165, in run\n not(lock.acquire_for(env_params.lock_pid_dir, env_params.lock_size))):\n File \"/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/lock.py\", line 59, in acquire_for\n my_pid, my_cmd, pid_file = get_info(pid_dir)\n File \"/home/jonas/.virtualenvs/website/local/lib/python2.7/site-packages/luigi/lock.py\", line 44, in get_info\n pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'\nUnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 97: ordinal not in range(128)\n```\n\nHere I've just added a `plop = luigi.Parameter()` to `AggregateArtists`, and called it with \n`python examples/top_artists.py AggregateArtists --local-scheduler --date-interval 2012-06 --plop \u00e0`\n\nThe error seems to come from the [Python3 refactoring](https://github.com/spotify/luigi/pull/745). Removing `.encode('utf8')` on [this line](https://github.com/spotify/luigi/blob/master/luigi/lock.py#L44) seems to solve the problem, but I guess it was added for a reason ^^.\n\nAny ideas of how we could solve this on our side, other than removing the encode? 
(I'll try to add a test for it later)\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2012-2015 Spotify AB\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n#\n\"\"\"\nLocking functionality when launching things from the command line.\nUses a pidfile.\nThis prevents multiple identical workflows to be launched simultaneously.\n\"\"\"\nfrom __future__ import print_function\n\nimport hashlib\nimport os\n\n\ndef getpcmd(pid):\n \"\"\"\n Returns command of process.\n\n :param pid:\n \"\"\"\n cmd = 'ps -p %s -o command=' % (pid,)\n p = os.popen(cmd, 'r')\n return p.readline().strip()\n\n\ndef get_info(pid_dir):\n # Check the name and pid of this process\n my_pid = os.getpid()\n my_cmd = getpcmd(my_pid)\n\n pid_file = os.path.join(pid_dir, hashlib.md5(my_cmd.encode('utf8')).hexdigest()) + '.pid'\n\n return my_pid, my_cmd, pid_file\n\n\ndef acquire_for(pid_dir, num_available=1):\n \"\"\"\n Makes sure the process is only run once at the same time with the same name.\n\n Notice that we since we check the process name, different parameters to the same\n command can spawn multiple processes at the same time, i.e. running\n \"/usr/bin/my_process\" does not prevent anyone from launching\n \"/usr/bin/my_process --foo bar\".\n \"\"\"\n\n my_pid, my_cmd, pid_file = get_info(pid_dir)\n\n # Check if there is a pid file corresponding to this name\n if not os.path.exists(pid_dir):\n os.mkdir(pid_dir)\n os.chmod(pid_dir, 0o777)\n\n pids = set()\n pid_cmds = {}\n if os.path.exists(pid_file):\n # There is such a file - read the pid and look up its process name\n pids.update(filter(None, map(str.strip, open(pid_file))))\n pid_cmds = dict((pid, getpcmd(pid)) for pid in pids)\n matching_pids = list(filter(lambda pid: pid_cmds[pid] == my_cmd, pids))\n\n if len(matching_pids) >= num_available:\n # We are already running under a different pid\n print('Pid(s)', ', '.join(matching_pids), 'already running')\n return False\n else:\n # The pid belongs to something else, we could\n pass\n pid_cmds[str(my_pid)] = my_cmd\n\n # Write pids\n pids.add(str(my_pid))\n with open(pid_file, 'w') as f:\n f.writelines('%s\\n' % (pid, ) for pid in filter(pid_cmds.__getitem__, pids))\n\n # Make the file writable by all\n if os.name == 'nt':\n pass\n else:\n s = os.stat(pid_file)\n if os.getuid() == s.st_uid:\n os.chmod(pid_file, s.st_mode | 0o777)\n\n return True\n", "path": "luigi/lock.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2012-2015 Spotify AB\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations 
under the License.\n#\n\"\"\"\nLocking functionality when launching things from the command line.\nUses a pidfile.\nThis prevents multiple identical workflows to be launched simultaneously.\n\"\"\"\nfrom __future__ import print_function\n\nimport hashlib\nimport os\n\nfrom luigi import six\n\n\ndef getpcmd(pid):\n \"\"\"\n Returns command of process.\n\n :param pid:\n \"\"\"\n cmd = 'ps -p %s -o command=' % (pid,)\n with os.popen(cmd, 'r') as p:\n return p.readline().strip()\n\n\ndef get_info(pid_dir, my_pid=None):\n # Check the name and pid of this process\n if my_pid is None:\n my_pid = os.getpid()\n\n my_cmd = getpcmd(my_pid)\n\n if six.PY3:\n cmd_hash = my_cmd.encode('utf8')\n else:\n cmd_hash = my_cmd\n\n pid_file = os.path.join(pid_dir, hashlib.md5(cmd_hash).hexdigest()) + '.pid'\n\n return my_pid, my_cmd, pid_file\n\n\ndef acquire_for(pid_dir, num_available=1):\n \"\"\"\n Makes sure the process is only run once at the same time with the same name.\n\n Notice that we since we check the process name, different parameters to the same\n command can spawn multiple processes at the same time, i.e. running\n \"/usr/bin/my_process\" does not prevent anyone from launching\n \"/usr/bin/my_process --foo bar\".\n \"\"\"\n\n my_pid, my_cmd, pid_file = get_info(pid_dir)\n\n # Check if there is a pid file corresponding to this name\n if not os.path.exists(pid_dir):\n os.mkdir(pid_dir)\n os.chmod(pid_dir, 0o777)\n\n pids = set()\n pid_cmds = {}\n if os.path.exists(pid_file):\n # There is such a file - read the pid and look up its process name\n pids.update(filter(None, map(str.strip, open(pid_file))))\n pid_cmds = dict((pid, getpcmd(pid)) for pid in pids)\n matching_pids = list(filter(lambda pid: pid_cmds[pid] == my_cmd, pids))\n\n if len(matching_pids) >= num_available:\n # We are already running under a different pid\n print('Pid(s)', ', '.join(matching_pids), 'already running')\n return False\n else:\n # The pid belongs to something else, we could\n pass\n pid_cmds[str(my_pid)] = my_cmd\n\n # Write pids\n pids.add(str(my_pid))\n with open(pid_file, 'w') as f:\n f.writelines('%s\\n' % (pid, ) for pid in filter(pid_cmds.__getitem__, pids))\n\n # Make the file writable by all\n if os.name == 'nt':\n pass\n else:\n s = os.stat(pid_file)\n if os.getuid() == s.st_uid:\n os.chmod(pid_file, s.st_mode | 0o777)\n\n return True\n", "path": "luigi/lock.py"}]} | 1,724 | 313 |
gh_patches_debug_22058 | rasdani/github-patches | git_diff | pex-tool__pex-258 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update PyPI page
It would be nice if the `README.rst` were included in the `setup.py` `long_description` in addition to the `CHANGES.rst` so that users browsing PyPI could read the README without having to travel to GitHub.
Would also be nice if the trove classifiers in `setup.py` reflected which versions of Python were officially supported (e.g. `'Programming Language :: Python :: 3.5'`).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 import os
5
6 from setuptools import setup
7
8
9 with open(os.path.join(os.path.dirname(__file__), 'CHANGES.rst')) as fp:
10 LONG_DESCRIPTION = fp.read()
11
12
13 # This seems to be a fairly standard version file pattern.
14 #
15 # Populates the following variables:
16 # __version__
17 # __setuptools_requirement
18 # __wheel_requirement
19 __version__ = ''
20 version_py_file = os.path.join(os.path.dirname(__file__), 'pex', 'version.py')
21 with open(version_py_file) as version_py:
22 exec(compile(version_py.read(), version_py_file, 'exec'))
23
24
25 setup(
26 name = 'pex',
27 version = __version__,
28 description = "The PEX packaging toolchain.",
29 long_description = LONG_DESCRIPTION,
30 url = 'https://github.com/pantsbuild/pex',
31 license = 'Apache License, Version 2.0',
32 zip_safe = True,
33 classifiers = [
34 'Intended Audience :: Developers',
35 'License :: OSI Approved :: Apache Software License',
36 'Operating System :: OS Independent',
37 'Programming Language :: Python',
38 ],
39 packages = [
40 'pex',
41 'pex.bin',
42 'pex.commands',
43 ],
44 install_requires = [
45 SETUPTOOLS_REQUIREMENT,
46 ],
47 tests_require = [
48 'mock',
49 'twitter.common.contextutil>=0.3.1,<0.4.0',
50 'twitter.common.lang>=0.3.1,<0.4.0',
51 'twitter.common.testing>=0.3.1,<0.4.0',
52 'twitter.common.dirutil>=0.3.1,<0.4.0',
53 'pytest',
54 ],
55 entry_points = {
56 'distutils.commands': [
57 'bdist_pex = pex.commands.bdist_pex:bdist_pex',
58 ],
59 'console_scripts': [
60 'pex = pex.bin.pex:main',
61 ],
62 },
63 )
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -5,9 +5,11 @@
from setuptools import setup
+with open(os.path.join(os.path.dirname(__file__), 'README.rst')) as fp:
+ LONG_DESCRIPTION = fp.read() + '\n'
with open(os.path.join(os.path.dirname(__file__), 'CHANGES.rst')) as fp:
- LONG_DESCRIPTION = fp.read()
+ LONG_DESCRIPTION += fp.read()
# This seems to be a fairly standard version file pattern.
@@ -33,8 +35,17 @@
classifiers = [
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
- 'Operating System :: OS Independent',
+ 'Operating System :: Unix',
+ 'Operating System :: POSIX :: Linux',
+ 'Operating System :: MacOS :: MacOS X',
'Programming Language :: Python',
+ 'Programming Language :: Python :: 2',
+ 'Programming Language :: Python :: 2.6',
+ 'Programming Language :: Python :: 2.7',
+ 'Programming Language :: Python :: 3',
+ 'Programming Language :: Python :: 3.3',
+ 'Programming Language :: Python :: 3.4',
+ 'Programming Language :: Python :: 3.5',
],
packages = [
'pex',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -5,9 +5,11 @@\n \n from setuptools import setup\n \n+with open(os.path.join(os.path.dirname(__file__), 'README.rst')) as fp:\n+ LONG_DESCRIPTION = fp.read() + '\\n'\n \n with open(os.path.join(os.path.dirname(__file__), 'CHANGES.rst')) as fp:\n- LONG_DESCRIPTION = fp.read()\n+ LONG_DESCRIPTION += fp.read()\n \n \n # This seems to be a fairly standard version file pattern.\n@@ -33,8 +35,17 @@\n classifiers = [\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: Apache Software License',\n- 'Operating System :: OS Independent',\n+ 'Operating System :: Unix',\n+ 'Operating System :: POSIX :: Linux',\n+ 'Operating System :: MacOS :: MacOS X',\n 'Programming Language :: Python',\n+ 'Programming Language :: Python :: 2',\n+ 'Programming Language :: Python :: 2.6',\n+ 'Programming Language :: Python :: 2.7',\n+ 'Programming Language :: Python :: 3',\n+ 'Programming Language :: Python :: 3.3',\n+ 'Programming Language :: Python :: 3.4',\n+ 'Programming Language :: Python :: 3.5',\n ],\n packages = [\n 'pex',\n", "issue": "Update PyPI page\nIt would be nice if the `README.rst` were included in the `setup.py` `long_description` in addition to the `CHANGES.rst` so that users browsing PyPI could read the README without having to travel to GitHub.\n\nWould also be nice if the trove classifiers in `setup.py` reflected which versions of Python were officially supported (e.g. `'Programming Language :: Python :: 3.5'`).\n\n", "before_files": [{"content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nimport os\n\nfrom setuptools import setup\n\n\nwith open(os.path.join(os.path.dirname(__file__), 'CHANGES.rst')) as fp:\n LONG_DESCRIPTION = fp.read()\n\n\n# This seems to be a fairly standard version file pattern.\n#\n# Populates the following variables:\n# __version__\n# __setuptools_requirement\n# __wheel_requirement\n__version__ = ''\nversion_py_file = os.path.join(os.path.dirname(__file__), 'pex', 'version.py')\nwith open(version_py_file) as version_py:\n exec(compile(version_py.read(), version_py_file, 'exec'))\n\n\nsetup(\n name = 'pex',\n version = __version__,\n description = \"The PEX packaging toolchain.\",\n long_description = LONG_DESCRIPTION,\n url = 'https://github.com/pantsbuild/pex',\n license = 'Apache License, Version 2.0',\n zip_safe = True,\n classifiers = [\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: OS Independent',\n 'Programming Language :: Python',\n ],\n packages = [\n 'pex',\n 'pex.bin',\n 'pex.commands',\n ],\n install_requires = [\n SETUPTOOLS_REQUIREMENT,\n ],\n tests_require = [\n 'mock',\n 'twitter.common.contextutil>=0.3.1,<0.4.0',\n 'twitter.common.lang>=0.3.1,<0.4.0',\n 'twitter.common.testing>=0.3.1,<0.4.0',\n 'twitter.common.dirutil>=0.3.1,<0.4.0',\n 'pytest',\n ],\n entry_points = {\n 'distutils.commands': [\n 'bdist_pex = pex.commands.bdist_pex:bdist_pex',\n ],\n 'console_scripts': [\n 'pex = pex.bin.pex:main',\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nimport os\n\nfrom setuptools import setup\n\nwith open(os.path.join(os.path.dirname(__file__), 'README.rst')) as fp:\n LONG_DESCRIPTION = fp.read() + '\\n'\n\nwith 
open(os.path.join(os.path.dirname(__file__), 'CHANGES.rst')) as fp:\n LONG_DESCRIPTION += fp.read()\n\n\n# This seems to be a fairly standard version file pattern.\n#\n# Populates the following variables:\n# __version__\n# __setuptools_requirement\n# __wheel_requirement\n__version__ = ''\nversion_py_file = os.path.join(os.path.dirname(__file__), 'pex', 'version.py')\nwith open(version_py_file) as version_py:\n exec(compile(version_py.read(), version_py_file, 'exec'))\n\n\nsetup(\n name = 'pex',\n version = __version__,\n description = \"The PEX packaging toolchain.\",\n long_description = LONG_DESCRIPTION,\n url = 'https://github.com/pantsbuild/pex',\n license = 'Apache License, Version 2.0',\n zip_safe = True,\n classifiers = [\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: Unix',\n 'Operating System :: POSIX :: Linux',\n 'Operating System :: MacOS :: MacOS X',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.6',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n ],\n packages = [\n 'pex',\n 'pex.bin',\n 'pex.commands',\n ],\n install_requires = [\n SETUPTOOLS_REQUIREMENT,\n ],\n tests_require = [\n 'mock',\n 'twitter.common.contextutil>=0.3.1,<0.4.0',\n 'twitter.common.lang>=0.3.1,<0.4.0',\n 'twitter.common.testing>=0.3.1,<0.4.0',\n 'twitter.common.dirutil>=0.3.1,<0.4.0',\n 'pytest',\n ],\n entry_points = {\n 'distutils.commands': [\n 'bdist_pex = pex.commands.bdist_pex:bdist_pex',\n ],\n 'console_scripts': [\n 'pex = pex.bin.pex:main',\n ],\n },\n)\n", "path": "setup.py"}]} | 926 | 298 |
gh_patches_debug_19082 | rasdani/github-patches | git_diff | mkdocs__mkdocs-1322 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Version to 0.17.0
See discussion in #1166.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mkdocs/contrib/legacy_search/__init__.py`
Content:
```
1 # coding: utf-8
2
3 from __future__ import absolute_import, unicode_literals
4
5 import os
6 import logging
7 from mkdocs import utils
8 from mkdocs.plugins import BasePlugin
9 from mkdocs.contrib.legacy_search.search_index import SearchIndex
10
11
12 log = logging.getLogger(__name__)
13
14
15 class SearchPlugin(BasePlugin):
16 """ Add a search feature to MkDocs. """
17
18 def on_config(self, config, **kwargs):
19 "Add plugin templates and scripts to config."
20 path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')
21 config['theme'].dirs.append(path)
22 config['theme'].static_templates.add('search.html')
23 config['extra_javascript'].append('search/require.js')
24 config['extra_javascript'].append('search/search.js')
25 return config
26
27 def on_pre_build(self, config, **kwargs):
28 "Create search index instance for later use."
29 self.search_index = SearchIndex()
30
31 def on_page_context(self, context, **kwargs):
32 "Add page to search index."
33 self.search_index.add_entry_from_context(context['page'])
34
35 def on_post_build(self, config, **kwargs):
36 "Build search index."
37 search_index = self.search_index.generate_search_index()
38 json_output_path = os.path.join(config['site_dir'], 'search', 'search_index.json')
39 utils.write_file(search_index.encode('utf-8'), json_output_path)
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mkdocs/contrib/legacy_search/__init__.py b/mkdocs/contrib/legacy_search/__init__.py
--- a/mkdocs/contrib/legacy_search/__init__.py
+++ b/mkdocs/contrib/legacy_search/__init__.py
@@ -17,11 +17,13 @@
def on_config(self, config, **kwargs):
"Add plugin templates and scripts to config."
- path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')
- config['theme'].dirs.append(path)
- config['theme'].static_templates.add('search.html')
- config['extra_javascript'].append('search/require.js')
- config['extra_javascript'].append('search/search.js')
+ if 'include_search_page' in config['theme'] and config['theme']['include_search_page']:
+ config['theme'].static_templates.add('search.html')
+ if not ('search_index_only' in config['theme'] and config['theme']['search_index_only']):
+ path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')
+ config['theme'].dirs.append(path)
+ config['extra_javascript'].append('search/require.js')
+ config['extra_javascript'].append('search/search.js')
return config
def on_pre_build(self, config, **kwargs):
| {"golden_diff": "diff --git a/mkdocs/contrib/legacy_search/__init__.py b/mkdocs/contrib/legacy_search/__init__.py\n--- a/mkdocs/contrib/legacy_search/__init__.py\n+++ b/mkdocs/contrib/legacy_search/__init__.py\n@@ -17,11 +17,13 @@\n \n def on_config(self, config, **kwargs):\n \"Add plugin templates and scripts to config.\"\n- path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')\n- config['theme'].dirs.append(path)\n- config['theme'].static_templates.add('search.html')\n- config['extra_javascript'].append('search/require.js')\n- config['extra_javascript'].append('search/search.js')\n+ if 'include_search_page' in config['theme'] and config['theme']['include_search_page']:\n+ config['theme'].static_templates.add('search.html')\n+ if not ('search_index_only' in config['theme'] and config['theme']['search_index_only']):\n+ path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')\n+ config['theme'].dirs.append(path)\n+ config['extra_javascript'].append('search/require.js')\n+ config['extra_javascript'].append('search/search.js')\n return config\n \n def on_pre_build(self, config, **kwargs):\n", "issue": "Version to 0.17.0\nSee discussion in #1166.\n", "before_files": [{"content": "# coding: utf-8\n\nfrom __future__ import absolute_import, unicode_literals\n\nimport os\nimport logging\nfrom mkdocs import utils\nfrom mkdocs.plugins import BasePlugin\nfrom mkdocs.contrib.legacy_search.search_index import SearchIndex\n\n\nlog = logging.getLogger(__name__)\n\n\nclass SearchPlugin(BasePlugin):\n \"\"\" Add a search feature to MkDocs. \"\"\"\n\n def on_config(self, config, **kwargs):\n \"Add plugin templates and scripts to config.\"\n path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')\n config['theme'].dirs.append(path)\n config['theme'].static_templates.add('search.html')\n config['extra_javascript'].append('search/require.js')\n config['extra_javascript'].append('search/search.js')\n return config\n\n def on_pre_build(self, config, **kwargs):\n \"Create search index instance for later use.\"\n self.search_index = SearchIndex()\n\n def on_page_context(self, context, **kwargs):\n \"Add page to search index.\"\n self.search_index.add_entry_from_context(context['page'])\n\n def on_post_build(self, config, **kwargs):\n \"Build search index.\"\n search_index = self.search_index.generate_search_index()\n json_output_path = os.path.join(config['site_dir'], 'search', 'search_index.json')\n utils.write_file(search_index.encode('utf-8'), json_output_path)\n", "path": "mkdocs/contrib/legacy_search/__init__.py"}], "after_files": [{"content": "# coding: utf-8\n\nfrom __future__ import absolute_import, unicode_literals\n\nimport os\nimport logging\nfrom mkdocs import utils\nfrom mkdocs.plugins import BasePlugin\nfrom mkdocs.contrib.legacy_search.search_index import SearchIndex\n\n\nlog = logging.getLogger(__name__)\n\n\nclass SearchPlugin(BasePlugin):\n \"\"\" Add a search feature to MkDocs. 
\"\"\"\n\n def on_config(self, config, **kwargs):\n \"Add plugin templates and scripts to config.\"\n if 'include_search_page' in config['theme'] and config['theme']['include_search_page']:\n config['theme'].static_templates.add('search.html')\n if not ('search_index_only' in config['theme'] and config['theme']['search_index_only']):\n path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates')\n config['theme'].dirs.append(path)\n config['extra_javascript'].append('search/require.js')\n config['extra_javascript'].append('search/search.js')\n return config\n\n def on_pre_build(self, config, **kwargs):\n \"Create search index instance for later use.\"\n self.search_index = SearchIndex()\n\n def on_page_context(self, context, **kwargs):\n \"Add page to search index.\"\n self.search_index.add_entry_from_context(context['page'])\n\n def on_post_build(self, config, **kwargs):\n \"Build search index.\"\n search_index = self.search_index.generate_search_index()\n json_output_path = os.path.join(config['site_dir'], 'search', 'search_index.json')\n utils.write_file(search_index.encode('utf-8'), json_output_path)\n", "path": "mkdocs/contrib/legacy_search/__init__.py"}]} | 660 | 301 |
gh_patches_debug_79 | rasdani/github-patches | git_diff | flairNLP__flair-447 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
__version__ attribute?
I'm always frustrated when flair doesn't have a __version__attribute... :-)
Please, add a __version__attribute to the module.
Thank you!
DC
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flair/__init__.py`
Content:
```
1 import torch
2
3 from . import data
4 from . import models
5 from . import visual
6 from . import trainers
7
8 import logging.config
9
10
11 logging.config.dictConfig({
12 'version': 1,
13 'disable_existing_loggers': False,
14 'formatters': {
15 'standard': {
16 'format': '%(asctime)-15s %(message)s'
17 },
18 },
19 'handlers': {
20 'console': {
21 'level': 'INFO',
22 'class': 'logging.StreamHandler',
23 'formatter': 'standard',
24 'stream': 'ext://sys.stdout'
25 },
26 },
27 'loggers': {
28 'flair': {
29 'handlers': ['console'],
30 'level': 'INFO',
31 'propagate': False
32 }
33 },
34 'root': {
35 'handlers': ['console'],
36 'level': 'WARNING'
37 }
38 })
39
40 logger = logging.getLogger('flair')
41
42
43 device = None
44 if torch.cuda.is_available():
45 device = torch.device('cuda:0')
46 else:
47 device = torch.device('cpu')
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flair/__init__.py b/flair/__init__.py
--- a/flair/__init__.py
+++ b/flair/__init__.py
@@ -7,6 +7,7 @@
import logging.config
+__version__ = "0.4.1"
logging.config.dictConfig({
'version': 1,
| {"golden_diff": "diff --git a/flair/__init__.py b/flair/__init__.py\n--- a/flair/__init__.py\n+++ b/flair/__init__.py\n@@ -7,6 +7,7 @@\n \n import logging.config\n \n+__version__ = \"0.4.1\"\n \n logging.config.dictConfig({\n 'version': 1,\n", "issue": "__version__ attribute?\nI'm always frustrated when flair doesn't have a __version__attribute... :-)\r\n\r\nPlease, add a __version__attribute to the module.\r\n\r\nThank you!\r\nDC\r\n\n", "before_files": [{"content": "import torch\n\nfrom . import data\nfrom . import models\nfrom . import visual\nfrom . import trainers\n\nimport logging.config\n\n\nlogging.config.dictConfig({\n 'version': 1,\n 'disable_existing_loggers': False,\n 'formatters': {\n 'standard': {\n 'format': '%(asctime)-15s %(message)s'\n },\n },\n 'handlers': {\n 'console': {\n 'level': 'INFO',\n 'class': 'logging.StreamHandler',\n 'formatter': 'standard',\n 'stream': 'ext://sys.stdout'\n },\n },\n 'loggers': {\n 'flair': {\n 'handlers': ['console'],\n 'level': 'INFO',\n 'propagate': False\n }\n },\n 'root': {\n 'handlers': ['console'],\n 'level': 'WARNING'\n }\n})\n\nlogger = logging.getLogger('flair')\n\n\ndevice = None\nif torch.cuda.is_available():\n device = torch.device('cuda:0')\nelse:\n device = torch.device('cpu')\n", "path": "flair/__init__.py"}], "after_files": [{"content": "import torch\n\nfrom . import data\nfrom . import models\nfrom . import visual\nfrom . import trainers\n\nimport logging.config\n\n__version__ = \"0.4.1\"\n\nlogging.config.dictConfig({\n 'version': 1,\n 'disable_existing_loggers': False,\n 'formatters': {\n 'standard': {\n 'format': '%(asctime)-15s %(message)s'\n },\n },\n 'handlers': {\n 'console': {\n 'level': 'INFO',\n 'class': 'logging.StreamHandler',\n 'formatter': 'standard',\n 'stream': 'ext://sys.stdout'\n },\n },\n 'loggers': {\n 'flair': {\n 'handlers': ['console'],\n 'level': 'INFO',\n 'propagate': False\n }\n },\n 'root': {\n 'handlers': ['console'],\n 'level': 'WARNING'\n }\n})\n\nlogger = logging.getLogger('flair')\n\n\ndevice = None\nif torch.cuda.is_available():\n device = torch.device('cuda:0')\nelse:\n device = torch.device('cpu')\n", "path": "flair/__init__.py"}]} | 616 | 76 |
gh_patches_debug_29935 | rasdani/github-patches | git_diff | google__jax-2481 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add multivariate normal pdf evalutation to jax.scipy
It would be great to have a Multivariate gaussian pdf/logpdf implementation, similar to the univariate version in [jax.scipy.stats.norm](https://jax.readthedocs.io/en/latest/_modules/jax/scipy/stats/norm.html#logpdf). I am currently working with this hacky function:
```
@jit
def multi_gauss_logpdf(x, mean, cov):
""" Calculate the probability density of a
sample from the multivariate normal. """
D = mean.shape[0]
(sign, logdet) = np.linalg.slogdet(cov)
p1 = D*np.log(2*np.pi) + logdet
p2 = (x-mean).T @ np.linalg.inv(cov) @ (x-mean)
return -1./2 * (p1 + p2)
batch_logpdf = vmap(multi_gauss_logpdf, in_axes=(0, None, None))
```
My `lax`/primitive knowledge is still fairly limited but I will try to put together a pr. Any recommendations how to speed things up?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `jax/scipy/stats/multivariate_normal.py`
Content:
```
1 # Copyright 2018 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15
16 import numpy as np
17 import scipy.stats as osp_stats
18
19 from ... import lax
20 from ...numpy.lax_numpy import _promote_dtypes_inexact, _constant_like, _wraps
21 from ...numpy.lax_numpy import dot, subtract, einsum
22 from ...numpy.linalg import det, inv
23
24
25 @_wraps(osp_stats.multivariate_normal.logpdf, update_doc=False)
26 def logpdf(x, mean, cov):
27 x, mean, cov = _promote_dtypes_inexact(x, mean, cov)
28 two = _constant_like(x, 2)
29 dim = _constant_like(x, mean.shape[0])
30 det_sig = det(cov).astype(cov.dtype)
31 log_normalizer = lax.log(lax.mul(lax.pow(_constant_like(x, 2 * np.pi), dim),
32 det_sig))
33 x_shape = x.shape[:-1]
34 if x_shape:
35 x_2d = x.reshape((-1, mean.shape[0]))
36 quadratic = einsum("ij,jk,ik->i", subtract(x_2d, mean), inv(cov),
37 subtract(x_2d, mean)).reshape(x_shape).astype(cov.dtype)
38 else:
39 quadratic = dot(dot(subtract(x, mean), inv(cov)), subtract(x, mean).T).astype(cov.dtype)
40 return lax.div(lax.neg(lax.add(log_normalizer, quadratic)), two)
41
42 @_wraps(osp_stats.multivariate_normal.pdf, update_doc=False)
43 def pdf(x, mean, cov):
44 return lax.exp(logpdf(x, mean, cov))
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/jax/scipy/stats/multivariate_normal.py b/jax/scipy/stats/multivariate_normal.py
--- a/jax/scipy/stats/multivariate_normal.py
+++ b/jax/scipy/stats/multivariate_normal.py
@@ -17,27 +17,29 @@
import scipy.stats as osp_stats
from ... import lax
+from ...lax_linalg import cholesky, triangular_solve
+from ... import numpy as jnp
from ...numpy.lax_numpy import _promote_dtypes_inexact, _constant_like, _wraps
-from ...numpy.lax_numpy import dot, subtract, einsum
-from ...numpy.linalg import det, inv
@_wraps(osp_stats.multivariate_normal.logpdf, update_doc=False)
def logpdf(x, mean, cov):
x, mean, cov = _promote_dtypes_inexact(x, mean, cov)
- two = _constant_like(x, 2)
- dim = _constant_like(x, mean.shape[0])
- det_sig = det(cov).astype(cov.dtype)
- log_normalizer = lax.log(lax.mul(lax.pow(_constant_like(x, 2 * np.pi), dim),
- det_sig))
- x_shape = x.shape[:-1]
- if x_shape:
- x_2d = x.reshape((-1, mean.shape[0]))
- quadratic = einsum("ij,jk,ik->i", subtract(x_2d, mean), inv(cov),
- subtract(x_2d, mean)).reshape(x_shape).astype(cov.dtype)
+ if not mean.shape:
+ return -1/2 * (x - mean) ** 2 / cov - 1/2 * (np.log(2*np.pi) + jnp.log(cov))
else:
- quadratic = dot(dot(subtract(x, mean), inv(cov)), subtract(x, mean).T).astype(cov.dtype)
- return lax.div(lax.neg(lax.add(log_normalizer, quadratic)), two)
+ n = mean.shape[-1]
+ if not np.shape(cov):
+ y = x - mean
+ return (-1/2 * jnp.einsum('...i,...i->...', y, y) / cov
+ - n/2 * (np.log(2*np.pi) + jnp.log(cov)))
+ else:
+ if cov.ndim < 2 or cov.shape[-2:] != (n, n):
+ raise ValueError("multivariate_normal.logpdf got incompatible shapes")
+ L = cholesky(cov)
+ y = triangular_solve(L, x - mean, lower=True, transpose_a=True)
+ return (-1/2 * jnp.einsum('...i,...i->...', y, y) - n/2*np.log(2*np.pi)
+ - jnp.log(L.diagonal()).sum())
@_wraps(osp_stats.multivariate_normal.pdf, update_doc=False)
def pdf(x, mean, cov):
| {"golden_diff": "diff --git a/jax/scipy/stats/multivariate_normal.py b/jax/scipy/stats/multivariate_normal.py\n--- a/jax/scipy/stats/multivariate_normal.py\n+++ b/jax/scipy/stats/multivariate_normal.py\n@@ -17,27 +17,29 @@\n import scipy.stats as osp_stats\n \n from ... import lax\n+from ...lax_linalg import cholesky, triangular_solve\n+from ... import numpy as jnp\n from ...numpy.lax_numpy import _promote_dtypes_inexact, _constant_like, _wraps\n-from ...numpy.lax_numpy import dot, subtract, einsum\n-from ...numpy.linalg import det, inv\n \n \n @_wraps(osp_stats.multivariate_normal.logpdf, update_doc=False)\n def logpdf(x, mean, cov):\n x, mean, cov = _promote_dtypes_inexact(x, mean, cov)\n- two = _constant_like(x, 2)\n- dim = _constant_like(x, mean.shape[0])\n- det_sig = det(cov).astype(cov.dtype)\n- log_normalizer = lax.log(lax.mul(lax.pow(_constant_like(x, 2 * np.pi), dim),\n- det_sig))\n- x_shape = x.shape[:-1]\n- if x_shape:\n- x_2d = x.reshape((-1, mean.shape[0]))\n- quadratic = einsum(\"ij,jk,ik->i\", subtract(x_2d, mean), inv(cov), \n- subtract(x_2d, mean)).reshape(x_shape).astype(cov.dtype)\n+ if not mean.shape:\n+ return -1/2 * (x - mean) ** 2 / cov - 1/2 * (np.log(2*np.pi) + jnp.log(cov))\n else:\n- quadratic = dot(dot(subtract(x, mean), inv(cov)), subtract(x, mean).T).astype(cov.dtype)\n- return lax.div(lax.neg(lax.add(log_normalizer, quadratic)), two)\n+ n = mean.shape[-1]\n+ if not np.shape(cov):\n+ y = x - mean\n+ return (-1/2 * jnp.einsum('...i,...i->...', y, y) / cov\n+ - n/2 * (np.log(2*np.pi) + jnp.log(cov)))\n+ else:\n+ if cov.ndim < 2 or cov.shape[-2:] != (n, n):\n+ raise ValueError(\"multivariate_normal.logpdf got incompatible shapes\")\n+ L = cholesky(cov)\n+ y = triangular_solve(L, x - mean, lower=True, transpose_a=True)\n+ return (-1/2 * jnp.einsum('...i,...i->...', y, y) - n/2*np.log(2*np.pi)\n+ - jnp.log(L.diagonal()).sum())\n \n @_wraps(osp_stats.multivariate_normal.pdf, update_doc=False)\n def pdf(x, mean, cov):\n", "issue": "Add multivariate normal pdf evalutation to jax.scipy\nIt would be great to have a Multivariate gaussian pdf/logpdf implementation, similar to the univariate version in [jax.scipy.stats.norm](https://jax.readthedocs.io/en/latest/_modules/jax/scipy/stats/norm.html#logpdf). I am currently working with this hacky function: \r\n\r\n```\r\n@jit\r\ndef multi_gauss_logpdf(x, mean, cov):\r\n \"\"\" Calculate the probability density of a\r\n sample from the multivariate normal. \"\"\"\r\n D = mean.shape[0]\r\n (sign, logdet) = np.linalg.slogdet(cov)\r\n p1 = D*np.log(2*np.pi) + logdet\r\n p2 = (x-mean).T @ np.linalg.inv(cov) @ (x-mean)\r\n return -1./2 * (p1 + p2)\r\n\r\nbatch_logpdf = vmap(multi_gauss_logpdf, in_axes=(0, None, None))\r\n```\r\n\r\nMy `lax`/primitive knowledge is still fairly limited but I will try to put together a pr. 
Any recommendations how to speed things up?\n", "before_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\nimport numpy as np\nimport scipy.stats as osp_stats\n\nfrom ... import lax\nfrom ...numpy.lax_numpy import _promote_dtypes_inexact, _constant_like, _wraps\nfrom ...numpy.lax_numpy import dot, subtract, einsum\nfrom ...numpy.linalg import det, inv\n\n\n@_wraps(osp_stats.multivariate_normal.logpdf, update_doc=False)\ndef logpdf(x, mean, cov):\n x, mean, cov = _promote_dtypes_inexact(x, mean, cov)\n two = _constant_like(x, 2)\n dim = _constant_like(x, mean.shape[0])\n det_sig = det(cov).astype(cov.dtype)\n log_normalizer = lax.log(lax.mul(lax.pow(_constant_like(x, 2 * np.pi), dim),\n det_sig))\n x_shape = x.shape[:-1]\n if x_shape:\n x_2d = x.reshape((-1, mean.shape[0]))\n quadratic = einsum(\"ij,jk,ik->i\", subtract(x_2d, mean), inv(cov), \n subtract(x_2d, mean)).reshape(x_shape).astype(cov.dtype)\n else:\n quadratic = dot(dot(subtract(x, mean), inv(cov)), subtract(x, mean).T).astype(cov.dtype)\n return lax.div(lax.neg(lax.add(log_normalizer, quadratic)), two)\n\n@_wraps(osp_stats.multivariate_normal.pdf, update_doc=False)\ndef pdf(x, mean, cov):\n return lax.exp(logpdf(x, mean, cov))\n", "path": "jax/scipy/stats/multivariate_normal.py"}], "after_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\nimport numpy as np\nimport scipy.stats as osp_stats\n\nfrom ... import lax\nfrom ...lax_linalg import cholesky, triangular_solve\nfrom ... 
import numpy as jnp\nfrom ...numpy.lax_numpy import _promote_dtypes_inexact, _constant_like, _wraps\n\n\n@_wraps(osp_stats.multivariate_normal.logpdf, update_doc=False)\ndef logpdf(x, mean, cov):\n x, mean, cov = _promote_dtypes_inexact(x, mean, cov)\n if not mean.shape:\n return -1/2 * (x - mean) ** 2 / cov - 1/2 * (np.log(2*np.pi) + jnp.log(cov))\n else:\n n = mean.shape[-1]\n if not np.shape(cov):\n y = x - mean\n return (-1/2 * jnp.einsum('...i,...i->...', y, y) / cov\n - n/2 * (np.log(2*np.pi) + jnp.log(cov)))\n else:\n if cov.ndim < 2 or cov.shape[-2:] != (n, n):\n raise ValueError(\"multivariate_normal.logpdf got incompatible shapes\")\n L = cholesky(cov)\n y = triangular_solve(L, x - mean, lower=True, transpose_a=True)\n return (-1/2 * jnp.einsum('...i,...i->...', y, y) - n/2*np.log(2*np.pi)\n - jnp.log(L.diagonal()).sum())\n\n@_wraps(osp_stats.multivariate_normal.pdf, update_doc=False)\ndef pdf(x, mean, cov):\n return lax.exp(logpdf(x, mean, cov))\n", "path": "jax/scipy/stats/multivariate_normal.py"}]} | 1,053 | 655 |
gh_patches_debug_17452 | rasdani/github-patches | git_diff | streamlink__streamlink-5908 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
plugins.vkplay: vkplay.live has moved to another domain (live.vkplay.ru)
### Checklist
- [X] This is a [plugin issue](https://streamlink.github.io/plugins.html) and not [a different kind of issue](https://github.com/streamlink/streamlink/issues/new/choose)
- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)
- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)
- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)
### Streamlink version
streamlink 6.7.2
### Description
https://vk.com/wall-212496568_91026
Yesterday, VK Play Live changed its domain. If you specify the old domain in the stream link, everything still works, but the site itself now links to the new domain, which the existing plugin does not recognize.
I just tried changing the relevant part of the domain in the plugin code (vkplay.live -> live.vkplay.ru), and everything seems to work well. It's a bit difficult for me to create a pull request, but here is the corrected plugin as a gist:
https://gist.github.com/oexlkinq/eef0a260dddad473c5febafd91b980d9
The old domain is also listed in the documentation (https://streamlink.github.io/plugins.html#vkplay).
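For reference, a minimal sketch (assumed names, not the project's final patch) of the kind of change described above — accepting the new domain while keeping the old one as an alias:

```python
import re

# Accept both the new and the old domain for channel URLs.
URL_PATTERN = re.compile(
    r"https?://(?:live\.vkplay\.ru|vkplay\.live)/(?P<channel_name>\w+)/?$"
)

assert URL_PATTERN.match("https://live.vkplay.ru/ruwarface")
assert URL_PATTERN.match("https://vkplay.live/ruwarface")
```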
### Debug log
```text
streamlink https://live.vkplay.ru/ruwarface 720p --loglevel=debug
[cli][debug] OS: Linux-6.8.1-arch1-1-x86_64-with-glibc2.39
[cli][debug] Python: 3.11.8
[cli][debug] OpenSSL: OpenSSL 3.2.1 30 Jan 2024
[cli][debug] Streamlink: 6.7.2
[cli][debug] Dependencies:
[cli][debug] certifi: 2024.2.2
[cli][debug] exceptiongroup: 1.2.0
[cli][debug] isodate: 0.6.1
[cli][debug] lxml: 5.1.0
[cli][debug] pycountry: 23.12.11
[cli][debug] pycryptodome: 3.20.0
[cli][debug] PySocks: 1.7.1
[cli][debug] requests: 2.31.0
[cli][debug] trio: 0.25.0
[cli][debug] trio-websocket: 0.11.1
[cli][debug] typing-extensions: 4.10.0
[cli][debug] urllib3: 1.26.18
[cli][debug] websocket-client: 1.7.0
[cli][debug] Arguments:
[cli][debug] url=https://live.vkplay.ru/ruwarface
[cli][debug] stream=['720p']
[cli][debug] --loglevel=debug
error: No plugin can handle URL: https://live.vkplay.ru/ruwarface
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/vkplay.py`
Content:
```
1 """
2 $description Russian live-streaming platform for gaming and esports, owned by VKontakte.
3 $url vkplay.live
4 $type live
5 $metadata id
6 $metadata author
7 $metadata category
8 $metadata title
9 """
10
11 import logging
12 import re
13
14 from streamlink.plugin import Plugin, pluginmatcher
15 from streamlink.plugin.api import validate
16 from streamlink.stream.hls import HLSStream
17
18
19 log = logging.getLogger(__name__)
20
21
22 @pluginmatcher(re.compile(
23 r"https?://vkplay\.live/(?P<channel_name>\w+)/?$",
24 ))
25 class VKplay(Plugin):
26 API_URL = "https://api.vkplay.live/v1"
27
28 def _get_streams(self):
29 self.author = self.match.group("channel_name")
30 log.debug(f"Channel name: {self.author}")
31
32 data = self.session.http.get(
33 f"{self.API_URL}/blog/{self.author}/public_video_stream",
34 headers={"Referer": self.url},
35 acceptable_status=(200, 404),
36 schema=validate.Schema(
37 validate.parse_json(),
38 validate.any(
39 validate.all(
40 {"error": str, "error_description": str},
41 validate.get("error_description"),
42 ),
43 validate.all(
44 {
45 validate.optional("category"): validate.all(
46 {
47 "title": str,
48 },
49 validate.get("title"),
50 ),
51 "title": str,
52 "data": validate.any(
53 [
54 validate.all(
55 {
56 "vid": str,
57 "playerUrls": [
58 validate.all(
59 {
60 "type": str,
61 "url": validate.any("", validate.url()),
62 },
63 validate.union_get("type", "url"),
64 ),
65 ],
66 },
67 validate.union_get("vid", "playerUrls"),
68 ),
69 ],
70 [],
71 ),
72 },
73 validate.union_get(
74 "category",
75 "title",
76 ("data", 0),
77 ),
78 ),
79 ),
80 ),
81 )
82 if isinstance(data, str):
83 log.error(data)
84 return
85
86 self.category, self.title, streamdata = data
87 if not streamdata:
88 return
89
90 self.id, streams = streamdata
91
92 for streamtype, streamurl in streams:
93 if streamurl and streamtype == "live_hls":
94 return HLSStream.parse_variant_playlist(self.session, streamurl)
95
96
97 __plugin__ = VKplay
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/vkplay.py b/src/streamlink/plugins/vkplay.py
--- a/src/streamlink/plugins/vkplay.py
+++ b/src/streamlink/plugins/vkplay.py
@@ -1,6 +1,6 @@
"""
$description Russian live-streaming platform for gaming and esports, owned by VKontakte.
-$url vkplay.live
+$url live.vkplay.ru
$type live
$metadata id
$metadata author
@@ -20,13 +20,13 @@
@pluginmatcher(re.compile(
- r"https?://vkplay\.live/(?P<channel_name>\w+)/?$",
+ r"https?://(?:live\.vkplay\.ru|vkplay\.live)/(?P<channel_name>\w+)/?$",
))
class VKplay(Plugin):
- API_URL = "https://api.vkplay.live/v1"
+ API_URL = "https://api.live.vkplay.ru/v1"
def _get_streams(self):
- self.author = self.match.group("channel_name")
+ self.author = self.match["channel_name"]
log.debug(f"Channel name: {self.author}")
data = self.session.http.get(
| {"golden_diff": "diff --git a/src/streamlink/plugins/vkplay.py b/src/streamlink/plugins/vkplay.py\n--- a/src/streamlink/plugins/vkplay.py\n+++ b/src/streamlink/plugins/vkplay.py\n@@ -1,6 +1,6 @@\n \"\"\"\n $description Russian live-streaming platform for gaming and esports, owned by VKontakte.\n-$url vkplay.live\n+$url live.vkplay.ru\n $type live\n $metadata id\n $metadata author\n@@ -20,13 +20,13 @@\n \n \n @pluginmatcher(re.compile(\n- r\"https?://vkplay\\.live/(?P<channel_name>\\w+)/?$\",\n+ r\"https?://(?:live\\.vkplay\\.ru|vkplay\\.live)/(?P<channel_name>\\w+)/?$\",\n ))\n class VKplay(Plugin):\n- API_URL = \"https://api.vkplay.live/v1\"\n+ API_URL = \"https://api.live.vkplay.ru/v1\"\n \n def _get_streams(self):\n- self.author = self.match.group(\"channel_name\")\n+ self.author = self.match[\"channel_name\"]\n log.debug(f\"Channel name: {self.author}\")\n \n data = self.session.http.get(\n", "issue": "plugins.vkplay: vkplay.live has moved to another domain (live.vkplay.ru)\n### Checklist\n\n- [X] This is a [plugin issue](https://streamlink.github.io/plugins.html) and not [a different kind of issue](https://github.com/streamlink/streamlink/issues/new/choose)\n- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)\n- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)\n- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)\n\n### Streamlink version\n\nstreamlink 6.7.2\n\n### Description\n\nhttps://vk.com/wall-212496568_91026\r\n\r\nyesterday, vkplay live changed its domain. if you specify the old domain in the link to the stream, then everything still works, but on the site itself there are links to a new domain, to which the existing plugin does not respond.\r\n\r\nI just tried to change the updated part of the domain in the plugin code (vkplay.live -> live.vkplay.ru ), and everything seems to be working well. 
It's a bit difficult for me to create a pull request, but here's the corrected plugin on gist:\r\nhttps://gist.github.com/oexlkinq/eef0a260dddad473c5febafd91b980d9\r\n\r\nthe old domain is also listed in the documentation (https://streamlink.github.io/plugins.html#vkplay)\n\n### Debug log\n\n```text\nstreamlink https://live.vkplay.ru/ruwarface 720p --loglevel=debug\r\n[cli][debug] OS: Linux-6.8.1-arch1-1-x86_64-with-glibc2.39\r\n[cli][debug] Python: 3.11.8\r\n[cli][debug] OpenSSL: OpenSSL 3.2.1 30 Jan 2024\r\n[cli][debug] Streamlink: 6.7.2\r\n[cli][debug] Dependencies:\r\n[cli][debug] certifi: 2024.2.2\r\n[cli][debug] exceptiongroup: 1.2.0\r\n[cli][debug] isodate: 0.6.1\r\n[cli][debug] lxml: 5.1.0\r\n[cli][debug] pycountry: 23.12.11\r\n[cli][debug] pycryptodome: 3.20.0\r\n[cli][debug] PySocks: 1.7.1\r\n[cli][debug] requests: 2.31.0\r\n[cli][debug] trio: 0.25.0\r\n[cli][debug] trio-websocket: 0.11.1\r\n[cli][debug] typing-extensions: 4.10.0\r\n[cli][debug] urllib3: 1.26.18\r\n[cli][debug] websocket-client: 1.7.0\r\n[cli][debug] Arguments:\r\n[cli][debug] url=https://live.vkplay.ru/ruwarface\r\n[cli][debug] stream=['720p']\r\n[cli][debug] --loglevel=debug\r\nerror: No plugin can handle URL: https://live.vkplay.ru/ruwarface\n```\n\n", "before_files": [{"content": "\"\"\"\n$description Russian live-streaming platform for gaming and esports, owned by VKontakte.\n$url vkplay.live\n$type live\n$metadata id\n$metadata author\n$metadata category\n$metadata title\n\"\"\"\n\nimport logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream.hls import HLSStream\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://vkplay\\.live/(?P<channel_name>\\w+)/?$\",\n))\nclass VKplay(Plugin):\n API_URL = \"https://api.vkplay.live/v1\"\n\n def _get_streams(self):\n self.author = self.match.group(\"channel_name\")\n log.debug(f\"Channel name: {self.author}\")\n\n data = self.session.http.get(\n f\"{self.API_URL}/blog/{self.author}/public_video_stream\",\n headers={\"Referer\": self.url},\n acceptable_status=(200, 404),\n schema=validate.Schema(\n validate.parse_json(),\n validate.any(\n validate.all(\n {\"error\": str, \"error_description\": str},\n validate.get(\"error_description\"),\n ),\n validate.all(\n {\n validate.optional(\"category\"): validate.all(\n {\n \"title\": str,\n },\n validate.get(\"title\"),\n ),\n \"title\": str,\n \"data\": validate.any(\n [\n validate.all(\n {\n \"vid\": str,\n \"playerUrls\": [\n validate.all(\n {\n \"type\": str,\n \"url\": validate.any(\"\", validate.url()),\n },\n validate.union_get(\"type\", \"url\"),\n ),\n ],\n },\n validate.union_get(\"vid\", \"playerUrls\"),\n ),\n ],\n [],\n ),\n },\n validate.union_get(\n \"category\",\n \"title\",\n (\"data\", 0),\n ),\n ),\n ),\n ),\n )\n if isinstance(data, str):\n log.error(data)\n return\n\n self.category, self.title, streamdata = data\n if not streamdata:\n return\n\n self.id, streams = streamdata\n\n for streamtype, streamurl in streams:\n if streamurl and streamtype == \"live_hls\":\n return HLSStream.parse_variant_playlist(self.session, streamurl)\n\n\n__plugin__ = VKplay\n", "path": "src/streamlink/plugins/vkplay.py"}], "after_files": [{"content": "\"\"\"\n$description Russian live-streaming platform for gaming and esports, owned by VKontakte.\n$url live.vkplay.ru\n$type live\n$metadata id\n$metadata author\n$metadata category\n$metadata title\n\"\"\"\n\nimport logging\nimport re\n\nfrom streamlink.plugin 
import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream.hls import HLSStream\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://(?:live\\.vkplay\\.ru|vkplay\\.live)/(?P<channel_name>\\w+)/?$\",\n))\nclass VKplay(Plugin):\n API_URL = \"https://api.live.vkplay.ru/v1\"\n\n def _get_streams(self):\n self.author = self.match[\"channel_name\"]\n log.debug(f\"Channel name: {self.author}\")\n\n data = self.session.http.get(\n f\"{self.API_URL}/blog/{self.author}/public_video_stream\",\n headers={\"Referer\": self.url},\n acceptable_status=(200, 404),\n schema=validate.Schema(\n validate.parse_json(),\n validate.any(\n validate.all(\n {\"error\": str, \"error_description\": str},\n validate.get(\"error_description\"),\n ),\n validate.all(\n {\n validate.optional(\"category\"): validate.all(\n {\n \"title\": str,\n },\n validate.get(\"title\"),\n ),\n \"title\": str,\n \"data\": validate.any(\n [\n validate.all(\n {\n \"vid\": str,\n \"playerUrls\": [\n validate.all(\n {\n \"type\": str,\n \"url\": validate.any(\"\", validate.url()),\n },\n validate.union_get(\"type\", \"url\"),\n ),\n ],\n },\n validate.union_get(\"vid\", \"playerUrls\"),\n ),\n ],\n [],\n ),\n },\n validate.union_get(\n \"category\",\n \"title\",\n (\"data\", 0),\n ),\n ),\n ),\n ),\n )\n if isinstance(data, str):\n log.error(data)\n return\n\n self.category, self.title, streamdata = data\n if not streamdata:\n return\n\n self.id, streams = streamdata\n\n for streamtype, streamurl in streams:\n if streamurl and streamtype == \"live_hls\":\n return HLSStream.parse_variant_playlist(self.session, streamurl)\n\n\n__plugin__ = VKplay\n", "path": "src/streamlink/plugins/vkplay.py"}]} | 1,762 | 261 |
gh_patches_debug_958 | rasdani/github-patches | git_diff | nvaccess__nvda-10921 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
VS Code: Reduce the number of times one has to use NVDA+Space to switch modes.
I just filed [this issue](https://github.com/microsoft/vscode/issues/93087) against VS Code, where I suggest using a targeted role="document" in the places that produce HTML output for consumption, so that NVDA switches in and out of browse mode in a smart, automated manner, reducing the number of times one has to use NVDA+Space to toggle modes. Examples I found while using the 1.44 VS Code Insider builds were:
* The Welcome page
* The details page for an extension
* The ReadMe file that may be displayed after an extension has been installed.
@leonardder suggested that, once this lands in stable, a modification might be needed for the VS Code app module. So filing this issue here.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/appModules/code.py`
Content:
```
1 #appModules/code.py
2 #A part of NonVisual Desktop Access (NVDA)
3 #Copyright (C) 2019 NV Access Limited, Babbage B.V.
4 #This file is covered by the GNU General Public License.
5 #See the file COPYING for more details.
6
7 import appModuleHandler
8
9 class AppModule(appModuleHandler.AppModule):
10 disableBrowseModeByDefault = True
11
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/source/appModules/code.py b/source/appModules/code.py
deleted file mode 100644
--- a/source/appModules/code.py
+++ /dev/null
@@ -1,10 +0,0 @@
-#appModules/code.py
-#A part of NonVisual Desktop Access (NVDA)
-#Copyright (C) 2019 NV Access Limited, Babbage B.V.
-#This file is covered by the GNU General Public License.
-#See the file COPYING for more details.
-
-import appModuleHandler
-
-class AppModule(appModuleHandler.AppModule):
- disableBrowseModeByDefault = True
| {"golden_diff": "diff --git a/source/appModules/code.py b/source/appModules/code.py\ndeleted file mode 100644\n--- a/source/appModules/code.py\n+++ /dev/null\n@@ -1,10 +0,0 @@\n-#appModules/code.py\n-#A part of NonVisual Desktop Access (NVDA)\n-#Copyright (C) 2019 NV Access Limited, Babbage B.V.\n-#This file is covered by the GNU General Public License.\n-#See the file COPYING for more details.\n-\n-import appModuleHandler\n-\n-class AppModule(appModuleHandler.AppModule):\n-\tdisableBrowseModeByDefault = True\n", "issue": "VS Code: Reduce the number of times one has to use NVDA+Space to switch modes.\nI just filed [this issue](https://github.com/microsoft/vscode/issues/93087) against VS Code where I suggest to use targeted role=\"document\" in those places that produce HTML output for consumption, to make NVDA switch in and out of browse mode in a smart, automated, manner, reducing the number of times one has to use NVDA+Space to toggle modes. Examples I found while using the 1.44 VS Code Insider builds were:\r\n\r\n* The Welcome page\r\n* The details page for an extension\r\n* The ReadMe file that may be displayed after an extension has been installed.\r\n\r\n@leonardder suggested that, once this lands in stable, a modification might be needed for the VS Code app module. So filing this issue here.\n", "before_files": [{"content": "#appModules/code.py\n#A part of NonVisual Desktop Access (NVDA)\n#Copyright (C) 2019 NV Access Limited, Babbage B.V.\n#This file is covered by the GNU General Public License.\n#See the file COPYING for more details.\n\nimport appModuleHandler\n\nclass AppModule(appModuleHandler.AppModule):\n\tdisableBrowseModeByDefault = True\n", "path": "source/appModules/code.py"}], "after_files": [{"content": null, "path": "source/appModules/code.py"}]} | 535 | 139 |
gh_patches_debug_27836 | rasdani/github-patches | git_diff | encode__starlette-151 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Wildcard domains and TrustedhostMiddleware
Add support for wildcard domains (e.g. `*.example.com`) in `TrustedHostMiddleware`'s list of allowed hosts.
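A minimal sketch of the matching rule such support implies (illustrative only; the actual patch appears in the diff below):

```python
def host_matches(host: str, pattern: str) -> bool:
    # "*" allows any host; "*.example.com" matches any subdomain of example.com.
    if pattern == "*":
        return True
    if pattern.startswith("*."):
        return host.endswith(pattern[1:])
    return host == pattern

assert host_matches("api.example.com", "*.example.com")
assert not host_matches("evil.com", "*.example.com")
```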
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/middleware/trustedhost.py`
Content:
```
1 from starlette.datastructures import Headers
2 from starlette.responses import PlainTextResponse
3 from starlette.types import ASGIApp, ASGIInstance, Scope
4 import typing
5
6
7 class TrustedHostMiddleware:
8 def __init__(
9 self, app: ASGIApp, allowed_hosts: typing.Sequence[str] = ["*"]
10 ) -> None:
11 self.app = app
12 self.allowed_hosts = allowed_hosts
13 self.allow_any = "*" in allowed_hosts
14
15 def __call__(self, scope: Scope) -> ASGIInstance:
16 if scope["type"] in ("http", "websocket") and not self.allow_any:
17 headers = Headers(scope=scope)
18 host = headers.get("host")
19 if host not in self.allowed_hosts:
20 return PlainTextResponse("Invalid host header", status_code=400)
21
22 return self.app(scope)
23
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/starlette/middleware/trustedhost.py b/starlette/middleware/trustedhost.py
--- a/starlette/middleware/trustedhost.py
+++ b/starlette/middleware/trustedhost.py
@@ -4,10 +4,17 @@
import typing
+ENFORCE_DOMAIN_WILDCARD = "Domain wildcard patterns must be like '*.example.com'."
+
+
class TrustedHostMiddleware:
def __init__(
self, app: ASGIApp, allowed_hosts: typing.Sequence[str] = ["*"]
) -> None:
+ for pattern in allowed_hosts:
+ assert "*" not in pattern[1:], ENFORCE_DOMAIN_WILDCARD
+ if pattern.startswith("*") and pattern != "*":
+ assert pattern.startswith("*."), ENFORCE_DOMAIN_WILDCARD
self.app = app
self.allowed_hosts = allowed_hosts
self.allow_any = "*" in allowed_hosts
@@ -15,8 +22,15 @@
def __call__(self, scope: Scope) -> ASGIInstance:
if scope["type"] in ("http", "websocket") and not self.allow_any:
headers = Headers(scope=scope)
- host = headers.get("host")
- if host not in self.allowed_hosts:
+ host = headers.get("host", "").split(":")[0]
+ for pattern in self.allowed_hosts:
+ if (
+ host == pattern
+ or pattern.startswith("*")
+ and host.endswith(pattern[1:])
+ ):
+ break
+ else:
return PlainTextResponse("Invalid host header", status_code=400)
return self.app(scope)
| {"golden_diff": "diff --git a/starlette/middleware/trustedhost.py b/starlette/middleware/trustedhost.py\n--- a/starlette/middleware/trustedhost.py\n+++ b/starlette/middleware/trustedhost.py\n@@ -4,10 +4,17 @@\n import typing\n \n \n+ENFORCE_DOMAIN_WILDCARD = \"Domain wildcard patterns must be like '*.example.com'.\"\n+\n+\n class TrustedHostMiddleware:\n def __init__(\n self, app: ASGIApp, allowed_hosts: typing.Sequence[str] = [\"*\"]\n ) -> None:\n+ for pattern in allowed_hosts:\n+ assert \"*\" not in pattern[1:], ENFORCE_DOMAIN_WILDCARD\n+ if pattern.startswith(\"*\") and pattern != \"*\":\n+ assert pattern.startswith(\"*.\"), ENFORCE_DOMAIN_WILDCARD\n self.app = app\n self.allowed_hosts = allowed_hosts\n self.allow_any = \"*\" in allowed_hosts\n@@ -15,8 +22,15 @@\n def __call__(self, scope: Scope) -> ASGIInstance:\n if scope[\"type\"] in (\"http\", \"websocket\") and not self.allow_any:\n headers = Headers(scope=scope)\n- host = headers.get(\"host\")\n- if host not in self.allowed_hosts:\n+ host = headers.get(\"host\", \"\").split(\":\")[0]\n+ for pattern in self.allowed_hosts:\n+ if (\n+ host == pattern\n+ or pattern.startswith(\"*\")\n+ and host.endswith(pattern[1:])\n+ ):\n+ break\n+ else:\n return PlainTextResponse(\"Invalid host header\", status_code=400)\n \n return self.app(scope)\n", "issue": "Wildcard domains and TrustedhostMiddleware\nSupport for wildcard domains\n", "before_files": [{"content": "from starlette.datastructures import Headers\nfrom starlette.responses import PlainTextResponse\nfrom starlette.types import ASGIApp, ASGIInstance, Scope\nimport typing\n\n\nclass TrustedHostMiddleware:\n def __init__(\n self, app: ASGIApp, allowed_hosts: typing.Sequence[str] = [\"*\"]\n ) -> None:\n self.app = app\n self.allowed_hosts = allowed_hosts\n self.allow_any = \"*\" in allowed_hosts\n\n def __call__(self, scope: Scope) -> ASGIInstance:\n if scope[\"type\"] in (\"http\", \"websocket\") and not self.allow_any:\n headers = Headers(scope=scope)\n host = headers.get(\"host\")\n if host not in self.allowed_hosts:\n return PlainTextResponse(\"Invalid host header\", status_code=400)\n\n return self.app(scope)\n", "path": "starlette/middleware/trustedhost.py"}], "after_files": [{"content": "from starlette.datastructures import Headers\nfrom starlette.responses import PlainTextResponse\nfrom starlette.types import ASGIApp, ASGIInstance, Scope\nimport typing\n\n\nENFORCE_DOMAIN_WILDCARD = \"Domain wildcard patterns must be like '*.example.com'.\"\n\n\nclass TrustedHostMiddleware:\n def __init__(\n self, app: ASGIApp, allowed_hosts: typing.Sequence[str] = [\"*\"]\n ) -> None:\n for pattern in allowed_hosts:\n assert \"*\" not in pattern[1:], ENFORCE_DOMAIN_WILDCARD\n if pattern.startswith(\"*\") and pattern != \"*\":\n assert pattern.startswith(\"*.\"), ENFORCE_DOMAIN_WILDCARD\n self.app = app\n self.allowed_hosts = allowed_hosts\n self.allow_any = \"*\" in allowed_hosts\n\n def __call__(self, scope: Scope) -> ASGIInstance:\n if scope[\"type\"] in (\"http\", \"websocket\") and not self.allow_any:\n headers = Headers(scope=scope)\n host = headers.get(\"host\", \"\").split(\":\")[0]\n for pattern in self.allowed_hosts:\n if (\n host == pattern\n or pattern.startswith(\"*\")\n and host.endswith(pattern[1:])\n ):\n break\n else:\n return PlainTextResponse(\"Invalid host header\", status_code=400)\n\n return self.app(scope)\n", "path": "starlette/middleware/trustedhost.py"}]} | 492 | 355 |
gh_patches_debug_30801 | rasdani/github-patches | git_diff | numba__numba-1719 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
LinkedList jitclass example is broken
```
Internal error:
TypeError: Invalid store of %"deferred.4329823704.value" to {i8*, {i32, {%"deferred.4329823704.data", i8}}*} in <numba.datamodel.models.OptionalModel object at 0x106713278>
File "linkedlist.py", line 53
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `numba/targets/optional.py`
Content:
```
1 from __future__ import print_function, absolute_import, division
2
3 from numba import types, cgutils
4
5 from .imputils import lower_cast
6
7
8 def make_optional(valtype):
9 """
10 Return the Structure representation of a optional value
11 """
12 return cgutils.create_struct_proxy(types.Optional(valtype))
13
14
15 def always_return_true_impl(context, builder, sig, args):
16 return cgutils.true_bit
17
18
19 def always_return_false_impl(context, builder, sig, args):
20 return cgutils.false_bit
21
22
23 @lower_cast(types.Any, types.Optional)
24 def any_to_optional(context, builder, fromty, toty, val):
25 if fromty == types.none:
26 return context.make_optional_none(builder, toty.type)
27 else:
28 val = context.cast(builder, val, fromty, toty.type)
29 return context.make_optional_value(builder, toty.type, val)
30
31 @lower_cast(types.Optional, types.Any)
32 def optional_to_any(context, builder, fromty, toty, val):
33 optty = context.make_optional(fromty)
34 optval = optty(context, builder, value=val)
35 validbit = cgutils.as_bool_bit(builder, optval.valid)
36 with builder.if_then(builder.not_(validbit), likely=False):
37 msg = "expected %s, got None" % (fromty.type,)
38 context.call_conv.return_user_exc(builder, TypeError, (msg,))
39
40 return optval.data
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/numba/targets/optional.py b/numba/targets/optional.py
--- a/numba/targets/optional.py
+++ b/numba/targets/optional.py
@@ -20,6 +20,39 @@
return cgutils.false_bit
+@lower_cast(types.Optional, types.Optional)
+def optional_to_optional(context, builder, fromty, toty, val):
+ """
+ The handling of optional->optional cast must be special cased for
+ correct propagation of None value. Given type T and U. casting of
+ T? to U? (? denotes optional) should always succeed. If the from-value
+ is None, the None value the casted value (U?) should be None; otherwise,
+ the from-value is casted to U. This is different from casting T? to U,
+ which requires the from-value must not be None.
+ """
+ optty = context.make_optional(fromty)
+ optval = optty(context, builder, value=val)
+ validbit = cgutils.as_bool_bit(builder, optval.valid)
+ # Create uninitialized optional value
+ outoptty = context.make_optional(toty)
+ outoptval = outoptty(context, builder)
+
+ with builder.if_else(validbit) as (is_valid, is_not_valid):
+ with is_valid:
+ # Cast internal value
+ outoptval.valid = cgutils.true_bit
+ outoptval.data = context.cast(builder, optval.data,
+ fromty.type, toty.type)
+
+ with is_not_valid:
+ # Store None to result
+ outoptval.valid = cgutils.false_bit
+ outoptval.data = cgutils.get_null_value(
+ outoptval.data.type)
+
+ return outoptval._getvalue()
+
+
@lower_cast(types.Any, types.Optional)
def any_to_optional(context, builder, fromty, toty, val):
if fromty == types.none:
@@ -28,6 +61,7 @@
val = context.cast(builder, val, fromty, toty.type)
return context.make_optional_value(builder, toty.type, val)
+
@lower_cast(types.Optional, types.Any)
def optional_to_any(context, builder, fromty, toty, val):
optty = context.make_optional(fromty)
| {"golden_diff": "diff --git a/numba/targets/optional.py b/numba/targets/optional.py\n--- a/numba/targets/optional.py\n+++ b/numba/targets/optional.py\n@@ -20,6 +20,39 @@\n return cgutils.false_bit\n \n \n+@lower_cast(types.Optional, types.Optional)\n+def optional_to_optional(context, builder, fromty, toty, val):\n+ \"\"\"\n+ The handling of optional->optional cast must be special cased for\n+ correct propagation of None value. Given type T and U. casting of\n+ T? to U? (? denotes optional) should always succeed. If the from-value\n+ is None, the None value the casted value (U?) should be None; otherwise,\n+ the from-value is casted to U. This is different from casting T? to U,\n+ which requires the from-value must not be None.\n+ \"\"\"\n+ optty = context.make_optional(fromty)\n+ optval = optty(context, builder, value=val)\n+ validbit = cgutils.as_bool_bit(builder, optval.valid)\n+ # Create uninitialized optional value\n+ outoptty = context.make_optional(toty)\n+ outoptval = outoptty(context, builder)\n+\n+ with builder.if_else(validbit) as (is_valid, is_not_valid):\n+ with is_valid:\n+ # Cast internal value\n+ outoptval.valid = cgutils.true_bit\n+ outoptval.data = context.cast(builder, optval.data,\n+ fromty.type, toty.type)\n+\n+ with is_not_valid:\n+ # Store None to result\n+ outoptval.valid = cgutils.false_bit\n+ outoptval.data = cgutils.get_null_value(\n+ outoptval.data.type)\n+\n+ return outoptval._getvalue()\n+\n+\n @lower_cast(types.Any, types.Optional)\n def any_to_optional(context, builder, fromty, toty, val):\n if fromty == types.none:\n@@ -28,6 +61,7 @@\n val = context.cast(builder, val, fromty, toty.type)\n return context.make_optional_value(builder, toty.type, val)\n \n+\n @lower_cast(types.Optional, types.Any)\n def optional_to_any(context, builder, fromty, toty, val):\n optty = context.make_optional(fromty)\n", "issue": "LinkedList jitclass example is broken\n```\nInternal error:\nTypeError: Invalid store of %\"deferred.4329823704.value\" to {i8*, {i32, {%\"deferred.4329823704.data\", i8}}*} in <numba.datamodel.models.OptionalModel object at 0x106713278>\nFile \"linkedlist.py\", line 53\n```\n\n", "before_files": [{"content": "from __future__ import print_function, absolute_import, division\n\nfrom numba import types, cgutils\n\nfrom .imputils import lower_cast\n\n\ndef make_optional(valtype):\n \"\"\"\n Return the Structure representation of a optional value\n \"\"\"\n return cgutils.create_struct_proxy(types.Optional(valtype))\n\n\ndef always_return_true_impl(context, builder, sig, args):\n return cgutils.true_bit\n\n\ndef always_return_false_impl(context, builder, sig, args):\n return cgutils.false_bit\n\n\n@lower_cast(types.Any, types.Optional)\ndef any_to_optional(context, builder, fromty, toty, val):\n if fromty == types.none:\n return context.make_optional_none(builder, toty.type)\n else:\n val = context.cast(builder, val, fromty, toty.type)\n return context.make_optional_value(builder, toty.type, val)\n\n@lower_cast(types.Optional, types.Any)\ndef optional_to_any(context, builder, fromty, toty, val):\n optty = context.make_optional(fromty)\n optval = optty(context, builder, value=val)\n validbit = cgutils.as_bool_bit(builder, optval.valid)\n with builder.if_then(builder.not_(validbit), likely=False):\n msg = \"expected %s, got None\" % (fromty.type,)\n context.call_conv.return_user_exc(builder, TypeError, (msg,))\n\n return optval.data\n", "path": "numba/targets/optional.py"}], "after_files": [{"content": "from __future__ import print_function, absolute_import, 
division\n\nfrom numba import types, cgutils\n\nfrom .imputils import lower_cast\n\n\ndef make_optional(valtype):\n \"\"\"\n Return the Structure representation of a optional value\n \"\"\"\n return cgutils.create_struct_proxy(types.Optional(valtype))\n\n\ndef always_return_true_impl(context, builder, sig, args):\n return cgutils.true_bit\n\n\ndef always_return_false_impl(context, builder, sig, args):\n return cgutils.false_bit\n\n\n@lower_cast(types.Optional, types.Optional)\ndef optional_to_optional(context, builder, fromty, toty, val):\n \"\"\"\n The handling of optional->optional cast must be special cased for\n correct propagation of None value. Given type T and U. casting of\n T? to U? (? denotes optional) should always succeed. If the from-value\n is None, the None value the casted value (U?) should be None; otherwise,\n the from-value is casted to U. This is different from casting T? to U,\n which requires the from-value must not be None.\n \"\"\"\n optty = context.make_optional(fromty)\n optval = optty(context, builder, value=val)\n validbit = cgutils.as_bool_bit(builder, optval.valid)\n # Create uninitialized optional value\n outoptty = context.make_optional(toty)\n outoptval = outoptty(context, builder)\n\n with builder.if_else(validbit) as (is_valid, is_not_valid):\n with is_valid:\n # Cast internal value\n outoptval.valid = cgutils.true_bit\n outoptval.data = context.cast(builder, optval.data,\n fromty.type, toty.type)\n\n with is_not_valid:\n # Store None to result\n outoptval.valid = cgutils.false_bit\n outoptval.data = cgutils.get_null_value(\n outoptval.data.type)\n\n return outoptval._getvalue()\n\n\n@lower_cast(types.Any, types.Optional)\ndef any_to_optional(context, builder, fromty, toty, val):\n if fromty == types.none:\n return context.make_optional_none(builder, toty.type)\n else:\n val = context.cast(builder, val, fromty, toty.type)\n return context.make_optional_value(builder, toty.type, val)\n\n\n@lower_cast(types.Optional, types.Any)\ndef optional_to_any(context, builder, fromty, toty, val):\n optty = context.make_optional(fromty)\n optval = optty(context, builder, value=val)\n validbit = cgutils.as_bool_bit(builder, optval.valid)\n with builder.if_then(builder.not_(validbit), likely=False):\n msg = \"expected %s, got None\" % (fromty.type,)\n context.call_conv.return_user_exc(builder, TypeError, (msg,))\n\n return optval.data\n", "path": "numba/targets/optional.py"}]} | 743 | 525 |
gh_patches_debug_14145 | rasdani/github-patches | git_diff | facebookresearch__nevergrad-11 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Hello-world install question
## Steps to reproduce
Install nevergrad
```
pip3 install -e [email protected]:facebookresearch/nevergrad@master#egg=nevergrad
Obtaining nevergrad from [email protected]:facebookresearch/nevergrad@master#egg=nevergrad
Cloning [email protected]:facebookresearch/nevergrad (to revision master) to ./src/nevergrad
Warning: Permanently added the RSA host key for IP address '192.30.255.112' to the list of known hosts.
[email protected]: Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Command "git clone -q [email protected]:facebookresearch/nevergrad /Users/ME/Documents/workspace/temp/src/nevergrad" failed with error code 128 in None
```
OK, trying the git clone and `python3 setup.py install` method instead. That seems to work.
run the sample program:
python3 mynevergrad.py
```
from nevergrad.optimization import optimizerlib
def square(x):
return (x - .5)**2
optimizer = optimizerlib.OnePlusOne(dimension=1, budget=100, num_workers=5)
recommendation = optimizer.optimize(square, executor=None, batch_mode=True)
```
## Observed Results
```
Traceback (most recent call last):
File "mynevergrad.py", line 6, in <module>
from nevergrad.optimization import optimizerlib
ModuleNotFoundError: No module named 'nevergrad.optimization'
```
## Expected Results
It should run the sample
## Relevant Code
```
import pkg_resources
for d in pkg_resources.working_set:
print(d)
```
DOES include `nevergrad 0.1.0`
This is very likely an install, python3, homebrew "installed in user directory", or paths issue, but given that `nevergrad 0.1.0` shows up in the list, it is odd...
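One way to avoid hand-listing every subpackage (shown only as a sketch of an alternative, not necessarily what the project chose) is setuptools' find_packages():

```python
from setuptools import find_packages, setup

setup(
    name="nevergrad",
    version="0.1.0",
    description="Gradient-free optimization toolbox",
    author="Facebook AI Research",
    # Collect nevergrad and all of its subpackages automatically.
    packages=find_packages(include=["nevergrad", "nevergrad.*"]),
)
```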
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
3 #
4 # This source code is licensed under the MIT license found in the
5 # LICENSE file in the root directory of this source tree.
6
7 # from distutils.core import setup
8 from setuptools import setup
9
10
11 with open('requirements.txt') as f:
12 requirements = f.read().splitlines()
13
14
15 setup(name='nevergrad',
16 version='0.1.0',
17 description='Gradient-free optimization toolbox',
18 author='Facebook AI Research',
19 packages=['nevergrad'],
20 install_requires=requirements,)
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -7,14 +7,22 @@
# from distutils.core import setup
from setuptools import setup
-
with open('requirements.txt') as f:
requirements = f.read().splitlines()
-
-setup(name='nevergrad',
- version='0.1.0',
- description='Gradient-free optimization toolbox',
- author='Facebook AI Research',
- packages=['nevergrad'],
- install_requires=requirements,)
+setup(
+ name='nevergrad',
+ version='0.1.0',
+ description='Gradient-free optimization toolbox',
+ author='Facebook AI Research',
+ packages=[
+ 'nevergrad',
+ 'nevergrad.benchmark',
+ 'nevergrad.benchmark.additional',
+ 'nevergrad.common',
+ 'nevergrad.functions',
+ 'nevergrad.instrumentation',
+ 'nevergrad.optimization',
+ ],
+ install_requires=requirements,
+)
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -7,14 +7,22 @@\n # from distutils.core import setup\n from setuptools import setup\n \n-\n with open('requirements.txt') as f:\n requirements = f.read().splitlines()\n \n-\n-setup(name='nevergrad',\n- version='0.1.0',\n- description='Gradient-free optimization toolbox',\n- author='Facebook AI Research',\n- packages=['nevergrad'],\n- install_requires=requirements,)\n+setup(\n+ name='nevergrad',\n+ version='0.1.0',\n+ description='Gradient-free optimization toolbox',\n+ author='Facebook AI Research',\n+ packages=[\n+ 'nevergrad',\n+ 'nevergrad.benchmark',\n+ 'nevergrad.benchmark.additional',\n+ 'nevergrad.common',\n+ 'nevergrad.functions',\n+ 'nevergrad.instrumentation',\n+ 'nevergrad.optimization',\n+ ],\n+ install_requires=requirements,\n+)\n", "issue": "Hello-world install question\n## Steps to reproduce\r\n\r\nInstall nevergrad\r\n\r\n```\r\npip3 install -e [email protected]:facebookresearch/nevergrad@master#egg=nevergrad\r\nObtaining nevergrad from [email protected]:facebookresearch/nevergrad@master#egg=nevergrad\r\n Cloning [email protected]:facebookresearch/nevergrad (to revision master) to ./src/nevergrad\r\nWarning: Permanently added the RSA host key for IP address '192.30.255.112' to the list of known hosts.\r\[email protected]: Permission denied (publickey).\r\nfatal: Could not read from remote repository.\r\n\r\nPlease make sure you have the correct access rights\r\nand the repository exists.\r\nCommand \"git clone -q [email protected]:facebookresearch/nevergrad /Users/ME/Documents/workspace/temp/src/nevergrad\" failed with error code 128 in None\r\n```\r\n\r\nok. trying git clone and `python3 setup.py install` method. That seems to work.\r\n\r\nrun the sample program:\r\n\r\npython3 mynevergrad.py\r\n\r\n```\r\nfrom nevergrad.optimization import optimizerlib\r\n\r\ndef square(x):\r\n return (x - .5)**2\r\n\r\noptimizer = optimizerlib.OnePlusOne(dimension=1, budget=100, num_workers=5)\r\nrecommendation = optimizer.optimize(square, executor=None, batch_mode=True)\r\n```\r\n\r\n\r\n## Observed Results\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"mynevergrad.py\", line 6, in <module>\r\n from nevergrad.optimization import optimizerlib\r\nModuleNotFoundError: No module named 'nevergrad.optimization'\r\n```\r\n\r\n\r\n## Expected Results\r\n\r\nIt should run the sample\r\n\r\n## Relevant Code\r\n\r\n```\r\nimport pkg_resources\r\nfor d in pkg_resources.working_set:\r\n\tprint(d)\r\n```\r\nDOES include `nevergrad 0.1.0`\r\n\r\n\r\nThis is very likely an install, python3, homebrew \"installed in user directory\", or paths issue, but given that `nevergrad 0.1.0` shows up in the list, it is odd...\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\n# from distutils.core import setup\nfrom setuptools import setup\n\n\nwith open('requirements.txt') as f:\n requirements = f.read().splitlines()\n\n\nsetup(name='nevergrad',\n version='0.1.0',\n description='Gradient-free optimization toolbox',\n author='Facebook AI Research',\n packages=['nevergrad'],\n install_requires=requirements,)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\n# from distutils.core import setup\nfrom setuptools import setup\n\nwith open('requirements.txt') as f:\n requirements = f.read().splitlines()\n\nsetup(\n name='nevergrad',\n version='0.1.0',\n description='Gradient-free optimization toolbox',\n author='Facebook AI Research',\n packages=[\n 'nevergrad',\n 'nevergrad.benchmark',\n 'nevergrad.benchmark.additional',\n 'nevergrad.common',\n 'nevergrad.functions',\n 'nevergrad.instrumentation',\n 'nevergrad.optimization',\n ],\n install_requires=requirements,\n)\n", "path": "setup.py"}]} | 870 | 219 |
gh_patches_debug_33030 | rasdani/github-patches | git_diff | pypa__cibuildwheel-1613 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
setup.py setup() not detected in __name__ == '__main__' block
### Description
My setup.py setup() includes:
python_requires=">=3.8"
However, cibuildwheel still tries, and fails, to compile under Python 3.6.
I understand there is [CIBW_BUILD / CIBW_SKIP](https://cibuildwheel.readthedocs.io/en/stable/options/#build-skip) but that is then duplicating the python requires information.
I can add a \[project\] section to pyproject.toml but that leads to a lot of problems because it ends up fighting with setup() parameters and they **really** don't like it.
I believe cibuildwheel should determine Python version support automatically, whether it comes from setuptools or pyproject.toml, and not try to build on unsupported versions. My [pyproject.toml](https://github.com/rogerbinns/apsw/blob/master/pyproject.toml) is:
````
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
````
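For reference, a minimal sketch (not the reporter's actual setup.py) of the pattern named in the title — setup() called inside an `if __name__ == "__main__":` block, which is where the python_requires value goes undetected:

```python
from setuptools import setup

if __name__ == "__main__":
    # python_requires is only seen inside the __main__ guard here.
    setup(
        name="example",
        python_requires=">=3.8",
    )
```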
### Build log
https://github.com/rogerbinns/apsw/actions/runs/6175182758/job/16761477543
### CI config
https://github.com/rogerbinns/apsw/actions/runs/6175182758/workflow
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cibuildwheel/projectfiles.py`
Content:
```
1 from __future__ import annotations
2
3 import ast
4 import configparser
5 import contextlib
6 from pathlib import Path
7
8 from ._compat import tomllib
9
10
11 class Analyzer(ast.NodeVisitor):
12 def __init__(self) -> None:
13 self.requires_python: str | None = None
14
15 def visit(self, node: ast.AST) -> None:
16 for inner_node in ast.walk(node):
17 for child in ast.iter_child_nodes(inner_node):
18 child.parent = inner_node # type: ignore[attr-defined]
19 super().visit(node)
20
21 def visit_keyword(self, node: ast.keyword) -> None:
22 self.generic_visit(node)
23 # Must not be nested in an if or other structure
24 # This will be Module -> Expr -> Call -> keyword
25 if (
26 node.arg == "python_requires"
27 and not hasattr(node.parent.parent.parent, "parent") # type: ignore[attr-defined]
28 and isinstance(node.value, ast.Constant)
29 ):
30 self.requires_python = node.value.value
31
32
33 def setup_py_python_requires(content: str) -> str | None:
34 try:
35 tree = ast.parse(content)
36 analyzer = Analyzer()
37 analyzer.visit(tree)
38 return analyzer.requires_python or None
39 except Exception: # pylint: disable=broad-except
40 return None
41
42
43 def get_requires_python_str(package_dir: Path) -> str | None:
44 """Return the python requires string from the most canonical source available, or None"""
45
46 # Read in from pyproject.toml:project.requires-python
47 with contextlib.suppress(FileNotFoundError):
48 with (package_dir / "pyproject.toml").open("rb") as f1:
49 info = tomllib.load(f1)
50 with contextlib.suppress(KeyError, IndexError, TypeError):
51 return str(info["project"]["requires-python"])
52
53 # Read in from setup.cfg:options.python_requires
54 config = configparser.ConfigParser()
55 with contextlib.suppress(FileNotFoundError):
56 config.read(package_dir / "setup.cfg")
57 with contextlib.suppress(KeyError, IndexError, TypeError):
58 return str(config["options"]["python_requires"])
59
60 setup_py = package_dir / "setup.py"
61 with contextlib.suppress(FileNotFoundError), setup_py.open(encoding="utf8") as f2:
62 return setup_py_python_requires(f2.read())
63
64 return None
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cibuildwheel/projectfiles.py b/cibuildwheel/projectfiles.py
--- a/cibuildwheel/projectfiles.py
+++ b/cibuildwheel/projectfiles.py
@@ -8,6 +8,43 @@
from ._compat import tomllib
+def get_parent(node: ast.AST | None, depth: int = 1) -> ast.AST | None:
+ for _ in range(depth):
+ node = getattr(node, "parent", None)
+ return node
+
+
+def is_main(parent: ast.AST | None) -> bool:
+ if parent is None:
+ return False
+
+ # This would be much nicer with 3.10's pattern matching!
+ if not isinstance(parent, ast.If):
+ return False
+ if not isinstance(parent.test, ast.Compare):
+ return False
+
+ try:
+ (op,) = parent.test.ops
+ (comp,) = parent.test.comparators
+ except ValueError:
+ return False
+
+ if not isinstance(op, ast.Eq):
+ return False
+
+ values = {comp, parent.test.left}
+
+ mains = {x for x in values if isinstance(x, ast.Constant) and x.value == "__main__"}
+ if len(mains) != 1:
+ return False
+ consts = {x for x in values if isinstance(x, ast.Name) and x.id == "__name__"}
+ if len(consts) != 1:
+ return False
+
+ return True
+
+
class Analyzer(ast.NodeVisitor):
def __init__(self) -> None:
self.requires_python: str | None = None
@@ -19,13 +56,22 @@
super().visit(node)
def visit_keyword(self, node: ast.keyword) -> None:
+ # Must not be nested except for if __name__ == "__main__"
+
self.generic_visit(node)
- # Must not be nested in an if or other structure
# This will be Module -> Expr -> Call -> keyword
+ parent = get_parent(node, 4)
+ unnested = parent is None
+
+ # This will be Module -> If -> Expr -> Call -> keyword
+ name_main_unnested = (
+ parent is not None and get_parent(parent) is None and is_main(get_parent(node, 3))
+ )
+
if (
node.arg == "python_requires"
- and not hasattr(node.parent.parent.parent, "parent") # type: ignore[attr-defined]
and isinstance(node.value, ast.Constant)
+ and (unnested or name_main_unnested)
):
self.requires_python = node.value.value
| {"golden_diff": "diff --git a/cibuildwheel/projectfiles.py b/cibuildwheel/projectfiles.py\n--- a/cibuildwheel/projectfiles.py\n+++ b/cibuildwheel/projectfiles.py\n@@ -8,6 +8,43 @@\n from ._compat import tomllib\n \n \n+def get_parent(node: ast.AST | None, depth: int = 1) -> ast.AST | None:\n+ for _ in range(depth):\n+ node = getattr(node, \"parent\", None)\n+ return node\n+\n+\n+def is_main(parent: ast.AST | None) -> bool:\n+ if parent is None:\n+ return False\n+\n+ # This would be much nicer with 3.10's pattern matching!\n+ if not isinstance(parent, ast.If):\n+ return False\n+ if not isinstance(parent.test, ast.Compare):\n+ return False\n+\n+ try:\n+ (op,) = parent.test.ops\n+ (comp,) = parent.test.comparators\n+ except ValueError:\n+ return False\n+\n+ if not isinstance(op, ast.Eq):\n+ return False\n+\n+ values = {comp, parent.test.left}\n+\n+ mains = {x for x in values if isinstance(x, ast.Constant) and x.value == \"__main__\"}\n+ if len(mains) != 1:\n+ return False\n+ consts = {x for x in values if isinstance(x, ast.Name) and x.id == \"__name__\"}\n+ if len(consts) != 1:\n+ return False\n+\n+ return True\n+\n+\n class Analyzer(ast.NodeVisitor):\n def __init__(self) -> None:\n self.requires_python: str | None = None\n@@ -19,13 +56,22 @@\n super().visit(node)\n \n def visit_keyword(self, node: ast.keyword) -> None:\n+ # Must not be nested except for if __name__ == \"__main__\"\n+\n self.generic_visit(node)\n- # Must not be nested in an if or other structure\n # This will be Module -> Expr -> Call -> keyword\n+ parent = get_parent(node, 4)\n+ unnested = parent is None\n+\n+ # This will be Module -> If -> Expr -> Call -> keyword\n+ name_main_unnested = (\n+ parent is not None and get_parent(parent) is None and is_main(get_parent(node, 3))\n+ )\n+\n if (\n node.arg == \"python_requires\"\n- and not hasattr(node.parent.parent.parent, \"parent\") # type: ignore[attr-defined]\n and isinstance(node.value, ast.Constant)\n+ and (unnested or name_main_unnested)\n ):\n self.requires_python = node.value.value\n", "issue": "setup.py setup() not detected in __name__ == '__main__' block\n### Description\n\nMy setup.py setup() includes:\r\n\r\n python_requires=\">=3.8\"\r\n\r\nHowever cibuildwheel still tries and fails to compile under Python 3.6.\r\n\r\nI understand there is [CIBW_BUILD / CIBW_SKIP](https://cibuildwheel.readthedocs.io/en/stable/options/#build-skip) but that is then duplicating the python requires information.\r\n\r\nI can add a \\[project\\] section to pyproject.toml but that leads to a lot of problems because it ends up fighting with setup() parameters and they **really** don't like it.\r\n\r\nI believe cibuildwheel should establish the Python version support automatically whether it comes from setuptools or pyproject.toml, and not try to build on unsupported versions. 
My [pyproject.toml](https://github.com/rogerbinns/apsw/blob/master/pyproject.toml) is:\r\n\r\n````\r\n[build-system]\r\nrequires = [\"setuptools\"]\r\nbuild-backend = \"setuptools.build_meta\"\r\n````\r\n\n\n### Build log\n\nhttps://github.com/rogerbinns/apsw/actions/runs/6175182758/job/16761477543\n\n### CI config\n\nhttps://github.com/rogerbinns/apsw/actions/runs/6175182758/workflow\n", "before_files": [{"content": "from __future__ import annotations\n\nimport ast\nimport configparser\nimport contextlib\nfrom pathlib import Path\n\nfrom ._compat import tomllib\n\n\nclass Analyzer(ast.NodeVisitor):\n def __init__(self) -> None:\n self.requires_python: str | None = None\n\n def visit(self, node: ast.AST) -> None:\n for inner_node in ast.walk(node):\n for child in ast.iter_child_nodes(inner_node):\n child.parent = inner_node # type: ignore[attr-defined]\n super().visit(node)\n\n def visit_keyword(self, node: ast.keyword) -> None:\n self.generic_visit(node)\n # Must not be nested in an if or other structure\n # This will be Module -> Expr -> Call -> keyword\n if (\n node.arg == \"python_requires\"\n and not hasattr(node.parent.parent.parent, \"parent\") # type: ignore[attr-defined]\n and isinstance(node.value, ast.Constant)\n ):\n self.requires_python = node.value.value\n\n\ndef setup_py_python_requires(content: str) -> str | None:\n try:\n tree = ast.parse(content)\n analyzer = Analyzer()\n analyzer.visit(tree)\n return analyzer.requires_python or None\n except Exception: # pylint: disable=broad-except\n return None\n\n\ndef get_requires_python_str(package_dir: Path) -> str | None:\n \"\"\"Return the python requires string from the most canonical source available, or None\"\"\"\n\n # Read in from pyproject.toml:project.requires-python\n with contextlib.suppress(FileNotFoundError):\n with (package_dir / \"pyproject.toml\").open(\"rb\") as f1:\n info = tomllib.load(f1)\n with contextlib.suppress(KeyError, IndexError, TypeError):\n return str(info[\"project\"][\"requires-python\"])\n\n # Read in from setup.cfg:options.python_requires\n config = configparser.ConfigParser()\n with contextlib.suppress(FileNotFoundError):\n config.read(package_dir / \"setup.cfg\")\n with contextlib.suppress(KeyError, IndexError, TypeError):\n return str(config[\"options\"][\"python_requires\"])\n\n setup_py = package_dir / \"setup.py\"\n with contextlib.suppress(FileNotFoundError), setup_py.open(encoding=\"utf8\") as f2:\n return setup_py_python_requires(f2.read())\n\n return None\n", "path": "cibuildwheel/projectfiles.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport ast\nimport configparser\nimport contextlib\nfrom pathlib import Path\n\nfrom ._compat import tomllib\n\n\ndef get_parent(node: ast.AST | None, depth: int = 1) -> ast.AST | None:\n for _ in range(depth):\n node = getattr(node, \"parent\", None)\n return node\n\n\ndef is_main(parent: ast.AST | None) -> bool:\n if parent is None:\n return False\n\n # This would be much nicer with 3.10's pattern matching!\n if not isinstance(parent, ast.If):\n return False\n if not isinstance(parent.test, ast.Compare):\n return False\n\n try:\n (op,) = parent.test.ops\n (comp,) = parent.test.comparators\n except ValueError:\n return False\n\n if not isinstance(op, ast.Eq):\n return False\n\n values = {comp, parent.test.left}\n\n mains = {x for x in values if isinstance(x, ast.Constant) and x.value == \"__main__\"}\n if len(mains) != 1:\n return False\n consts = {x for x in values if isinstance(x, ast.Name) and x.id == \"__name__\"}\n if 
len(consts) != 1:\n return False\n\n return True\n\n\nclass Analyzer(ast.NodeVisitor):\n def __init__(self) -> None:\n self.requires_python: str | None = None\n\n def visit(self, node: ast.AST) -> None:\n for inner_node in ast.walk(node):\n for child in ast.iter_child_nodes(inner_node):\n child.parent = inner_node # type: ignore[attr-defined]\n super().visit(node)\n\n def visit_keyword(self, node: ast.keyword) -> None:\n # Must not be nested except for if __name__ == \"__main__\"\n\n self.generic_visit(node)\n # This will be Module -> Expr -> Call -> keyword\n parent = get_parent(node, 4)\n unnested = parent is None\n\n # This will be Module -> If -> Expr -> Call -> keyword\n name_main_unnested = (\n parent is not None and get_parent(parent) is None and is_main(get_parent(node, 3))\n )\n\n if (\n node.arg == \"python_requires\"\n and isinstance(node.value, ast.Constant)\n and (unnested or name_main_unnested)\n ):\n self.requires_python = node.value.value\n\n\ndef setup_py_python_requires(content: str) -> str | None:\n try:\n tree = ast.parse(content)\n analyzer = Analyzer()\n analyzer.visit(tree)\n return analyzer.requires_python or None\n except Exception: # pylint: disable=broad-except\n return None\n\n\ndef get_requires_python_str(package_dir: Path) -> str | None:\n \"\"\"Return the python requires string from the most canonical source available, or None\"\"\"\n\n # Read in from pyproject.toml:project.requires-python\n with contextlib.suppress(FileNotFoundError):\n with (package_dir / \"pyproject.toml\").open(\"rb\") as f1:\n info = tomllib.load(f1)\n with contextlib.suppress(KeyError, IndexError, TypeError):\n return str(info[\"project\"][\"requires-python\"])\n\n # Read in from setup.cfg:options.python_requires\n config = configparser.ConfigParser()\n with contextlib.suppress(FileNotFoundError):\n config.read(package_dir / \"setup.cfg\")\n with contextlib.suppress(KeyError, IndexError, TypeError):\n return str(config[\"options\"][\"python_requires\"])\n\n setup_py = package_dir / \"setup.py\"\n with contextlib.suppress(FileNotFoundError), setup_py.open(encoding=\"utf8\") as f2:\n return setup_py_python_requires(f2.read())\n\n return None\n", "path": "cibuildwheel/projectfiles.py"}]} | 1,180 | 596 |
gh_patches_debug_13024 | rasdani/github-patches | git_diff | vega__altair-2570 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tooltip doesn't support "image" key
My understanding is that to render tooltips, Altair uses the `vega-tooltip` plugin. Per that project's [README](https://github.com/vega/vega-tooltip), one awesome feature that library has is the ability to render images as part of the tooltip with the image key. From the docs:
> Supports special keys title (becomes the title of the tooltip) and image (used as the url for an embedded image)
Using the tooltip without the `image` key is fine:
```
mydata = pd.DataFrame.from_records([{'a': 1, 'b': 1}, {'a': 2, 'b': 2}])
chart = alt.Chart(mydata).mark_circle().encode(
x='a',
y='b',
tooltip=alt.Tooltip(['a'], title='My Cool Tooltip')
).interactive()
display(chart)
```
However, when I add the `image` key, it stops working:
```
chart = alt.Chart(mydata).mark_circle().encode(
x='a',
y='b',
tooltip=alt.Tooltip(['a'], title='My Cool Tooltip', image='https://picsum.photos/200')
).interactive()
display(chart)
```
```
SchemaValidationError: Invalid specification
altair.vegalite.v4.schema.channels.Tooltip, validating 'additionalProperties'
Additional properties are not allowed ('image' was unexpected)
```
Maybe this feature is already supported but the schema is out of date? Thanks.
---
Please follow these steps to make it more efficient to solve your issue:
- [N/A] Since Altair is a Python wrapper around the Vega-Lite visualization grammar, [most bugs should be reported directly to Vega-Lite](https://github.com/vega/vega-lite/issues). You can click the Action Button of your Altair chart and "Open in Vega Editor" to create a reproducible Vega-Lite example and see if you get the same error in the Vega Editor.
- [x] Search for duplicate issues.
- [x] Use the latest version of Altair.
- [x] Describe how to reproduce the bug and include the full code and data to reproduce it, ideally using a sample data set from `vega_datasets`.
--- END ISSUE ---
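The schema error above is expected: `alt.Tooltip` validates its keyword arguments against the Vega-Lite schema, and `image` is not an encoding property. The working pattern — the one the example file below documents — is to store the URL in a data column literally named `image` and pass that column in the tooltip list. A minimal sketch, assuming pandas and a recent Altair are installed:

```python
import altair as alt
import pandas as pd

# The image URL lives in a column that must be called "image";
# vega-tooltip picks it up automatically when rendering the tooltip.
source = pd.DataFrame.from_records([
    {"a": 1, "b": 1, "image": "https://picsum.photos/200"},
    {"a": 2, "b": 2, "image": "https://picsum.photos/201"},
])

chart = alt.Chart(source).mark_circle(size=200).encode(
    x="a",
    y="b",
    tooltip=["image"],  # a list containing the "image" field, not alt.Tooltip(image=...)
)
```

vega-tooltip then embeds the `image` field at render time, so no extra argument to `alt.Tooltip` is needed.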
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `altair/examples/image_tooltip.py`
Content:
```
1 """
2 Image tooltip
3 -------------
4 This example shows how to render images in tooltips.
5 Either URLs or local file paths can be used to reference
6 the images.
7 """
8 # category: other charts
9
10 import altair as alt
11 import pandas as pd
12
13 source = pd.DataFrame.from_records(
14 [{'a': 1, 'b': 1, 'image': 'https://altair-viz.github.io/_static/altair-logo-light.png'},
15 {'a': 2, 'b': 2, 'image': 'https://avatars.githubusercontent.com/u/11796929?s=200&v=4'}]
16 )
17 alt.Chart(source).mark_circle(size=200).encode(
18 x='a',
19 y='b',
20 tooltip=['image'] # Must be a list for the image to render
21 )
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/altair/examples/image_tooltip.py b/altair/examples/image_tooltip.py
--- a/altair/examples/image_tooltip.py
+++ b/altair/examples/image_tooltip.py
@@ -3,7 +3,9 @@
-------------
This example shows how to render images in tooltips.
Either URLs or local file paths can be used to reference
-the images.
+the images. To render the image, you must use the special
+column name "image" in your data and pass it as a list to
+the tooltip encoding.
"""
# category: other charts
@@ -17,5 +19,5 @@
alt.Chart(source).mark_circle(size=200).encode(
x='a',
y='b',
- tooltip=['image'] # Must be a list for the image to render
+ tooltip=['image'] # Must be a list containing a field called "image"
)
| {"golden_diff": "diff --git a/altair/examples/image_tooltip.py b/altair/examples/image_tooltip.py\n--- a/altair/examples/image_tooltip.py\n+++ b/altair/examples/image_tooltip.py\n@@ -3,7 +3,9 @@\n -------------\n This example shows how to render images in tooltips.\n Either URLs or local file paths can be used to reference\n-the images.\n+the images. To render the image, you must use the special\n+column name \"image\" in your data and pass it as a list to\n+the tooltip encoding.\n \"\"\"\n # category: other charts\n \n@@ -17,5 +19,5 @@\n alt.Chart(source).mark_circle(size=200).encode(\n x='a',\n y='b',\n- tooltip=['image'] # Must be a list for the image to render\n+ tooltip=['image'] # Must be a list containing a field called \"image\"\n )\n", "issue": "Tooltip doesn't support \"image\" key\nMy understanding is that to render tooltips, Altair uses the `vega-tooltip` plugin. Per that project's [README](https://github.com/vega/vega-tooltip), one awesome feature that library has is the ability to render images as part of the tooltip with the image key. From the docs:\r\n\r\n> Supports special keys title (becomes the title of the tooltip) and image (used as the url for an embedded image)\r\n\r\nUsing the tooltip without the `image` key is fine:\r\n\r\n```\r\nmydata = pd.DataFrame.from_records([{'a': 1, 'b': 1}, {'a': 2, 'b': 2}])\r\nchart = alt.Chart(mydata).mark_circle().encode(\r\n x='a',\r\n y='b',\r\n tooltip=alt.Tooltip(['a'], title='My Cool Tooltip')\r\n).interactive()\r\n\r\ndisplay(chart)\r\n```\r\n\r\nHowever, when I add the `image` key, it stops working:\r\n\r\n```\r\nchart = alt.Chart(mydata).mark_circle().encode(\r\n x='a',\r\n y='b',\r\n tooltip=alt.Tooltip(['a'], title='My Cool Tooltip', image='https://picsum.photos/200')\r\n).interactive()\r\n\r\ndisplay(chart)\r\n```\r\n\r\n```\r\nSchemaValidationError: Invalid specification\r\n\r\n altair.vegalite.v4.schema.channels.Tooltip, validating 'additionalProperties'\r\n\r\n Additional properties are not allowed ('image' was unexpected)\r\n```\r\n\r\nMaybe this feature is already supported but the schema is out of date? Thanks.\r\n\r\n---\r\n\r\nPlease follow these steps to make it more efficient to solve your issue:\r\n\r\n- [N/A] Since Altair is a Python wrapper around the Vega-Lite visualization grammar, [most bugs should be reported directly to Vega-Lite](https://github.com/vega/vega-lite/issues). 
You can click the Action Button of your Altair chart and \"Open in Vega Editor\" to create a reproducible Vega-Lite example and see if you get the same error in the Vega Editor.\r\n- [x] Search for duplicate issues.\r\n- [x] Use the latest version of Altair.\r\n- [x] Describe how to reproduce the bug and include the full code and data to reproduce it, ideally using a sample data set from `vega_datasets`.\r\n\n", "before_files": [{"content": "\"\"\"\nImage tooltip\n-------------\nThis example shows how to render images in tooltips.\nEither URLs or local file paths can be used to reference\nthe images.\n\"\"\"\n# category: other charts\n\nimport altair as alt\nimport pandas as pd\n\nsource = pd.DataFrame.from_records(\n [{'a': 1, 'b': 1, 'image': 'https://altair-viz.github.io/_static/altair-logo-light.png'},\n {'a': 2, 'b': 2, 'image': 'https://avatars.githubusercontent.com/u/11796929?s=200&v=4'}]\n)\nalt.Chart(source).mark_circle(size=200).encode(\n x='a',\n y='b',\n tooltip=['image'] # Must be a list for the image to render\n)\n", "path": "altair/examples/image_tooltip.py"}], "after_files": [{"content": "\"\"\"\nImage tooltip\n-------------\nThis example shows how to render images in tooltips.\nEither URLs or local file paths can be used to reference\nthe images. To render the image, you must use the special\ncolumn name \"image\" in your data and pass it as a list to\nthe tooltip encoding.\n\"\"\"\n# category: other charts\n\nimport altair as alt\nimport pandas as pd\n\nsource = pd.DataFrame.from_records(\n [{'a': 1, 'b': 1, 'image': 'https://altair-viz.github.io/_static/altair-logo-light.png'},\n {'a': 2, 'b': 2, 'image': 'https://avatars.githubusercontent.com/u/11796929?s=200&v=4'}]\n)\nalt.Chart(source).mark_circle(size=200).encode(\n x='a',\n y='b',\n tooltip=['image'] # Must be a list containing a field called \"image\"\n)\n", "path": "altair/examples/image_tooltip.py"}]} | 952 | 195 |
gh_patches_debug_2489 | rasdani/github-patches | git_diff | rucio__rucio-2776 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Account and Scope new types
Motivation
----------
For multi-vo the internal representation of scope and account will need to be different from the external representation. The translations for these should be done in a consistent way and this can be prepared beforehand.
Modification
------------
Create a new type for each of scope and account.
Convert incoming account names and scopes to these types in the API directory so that multi-vo changes are more straightforward.
If normal strings are used in core, raise an error.
--- END ISSUE ---
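One way to read the requested modification is a pair of thin wrapper types that only the API layer constructs, with core code refusing plain strings. The names and translation logic below are illustrative assumptions, not whatever Rucio ultimately implemented:

```python
class InternalType:
    """Base wrapper distinguishing internal names from plain strings."""

    def __init__(self, external_name):
        self.external = external_name                    # what clients send/see
        self.internal = self._translate(external_name)   # what core/DB code uses

    def _translate(self, name):
        # Placeholder translation; a multi-VO setup would add a VO marker here.
        return name

    def __str__(self):
        return self.internal


class InternalAccount(InternalType):
    pass


class InternalScope(InternalType):
    pass


def core_add_rule(account, scope):
    # Core refuses plain strings so untranslated names cannot leak in.
    if not isinstance(account, InternalAccount) or not isinstance(scope, InternalScope):
        raise TypeError("core functions expect InternalAccount/InternalScope, not str")
    ...
```

Keeping the external name on the object also gives a single place to hang per-VO translation later.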
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/rucio/vcsversion.py`
Content:
```
1
2 '''
3 This file is automatically generated; Do not edit it. :)
4 '''
5 VERSION_INFO = {
6 'final': True,
7 'version': '1.20.3',
8 'branch_nick': 'patch-0-Release__Rucio_1_20_3_preparation',
9 'revision_id': 'f05e019f7178590718bf3f1eee415cc46cb59159',
10 'revno': 8410
11 }
12
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/rucio/vcsversion.py b/lib/rucio/vcsversion.py
--- a/lib/rucio/vcsversion.py
+++ b/lib/rucio/vcsversion.py
@@ -4,8 +4,8 @@
'''
VERSION_INFO = {
'final': True,
- 'version': '1.20.3',
- 'branch_nick': 'patch-0-Release__Rucio_1_20_3_preparation',
- 'revision_id': 'f05e019f7178590718bf3f1eee415cc46cb59159',
- 'revno': 8410
+ 'version': '1.20.4rc1',
+ 'branch_nick': 'patch-0-Release__1_20_4rc1_preparation',
+ 'revision_id': '525812b8f83f1069d38ab78aebedb732f21e77ec',
+ 'revno': 8418
}
| {"golden_diff": "diff --git a/lib/rucio/vcsversion.py b/lib/rucio/vcsversion.py\n--- a/lib/rucio/vcsversion.py\n+++ b/lib/rucio/vcsversion.py\n@@ -4,8 +4,8 @@\n '''\n VERSION_INFO = {\n 'final': True,\n- 'version': '1.20.3',\n- 'branch_nick': 'patch-0-Release__Rucio_1_20_3_preparation',\n- 'revision_id': 'f05e019f7178590718bf3f1eee415cc46cb59159',\n- 'revno': 8410\n+ 'version': '1.20.4rc1',\n+ 'branch_nick': 'patch-0-Release__1_20_4rc1_preparation',\n+ 'revision_id': '525812b8f83f1069d38ab78aebedb732f21e77ec',\n+ 'revno': 8418\n }\n", "issue": "Account and Scope new types\nMotivation\r\n----------\r\nFor multi-vo the internal representation of scope and account will need to be different from the external representation. The translations for these should be done in a consistent way and this can be prepared beforehand.\r\n\r\n\r\nModification\r\n------------\r\nCreate a new type for each of scope and account. \r\nConvert incoming account names and scopes to these types in the API directory so that multi-vo changes are more straight forward.\r\nIf normal strings are used in core, raise an error.\r\n\n", "before_files": [{"content": "\n'''\nThis file is automatically generated; Do not edit it. :)\n'''\nVERSION_INFO = {\n 'final': True,\n 'version': '1.20.3',\n 'branch_nick': 'patch-0-Release__Rucio_1_20_3_preparation',\n 'revision_id': 'f05e019f7178590718bf3f1eee415cc46cb59159',\n 'revno': 8410\n}\n", "path": "lib/rucio/vcsversion.py"}], "after_files": [{"content": "\n'''\nThis file is automatically generated; Do not edit it. :)\n'''\nVERSION_INFO = {\n 'final': True,\n 'version': '1.20.4rc1',\n 'branch_nick': 'patch-0-Release__1_20_4rc1_preparation',\n 'revision_id': '525812b8f83f1069d38ab78aebedb732f21e77ec',\n 'revno': 8418\n}\n", "path": "lib/rucio/vcsversion.py"}]} | 497 | 254 |
gh_patches_debug_1579 | rasdani/github-patches | git_diff | elastic__apm-agent-python-1758 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
KeyError: 'db' when capturing elasticsearch span context
**Describe the bug**: ...
**To Reproduce**
1. Instrument via capture serverless
2. Perform an async_scan:
```python
async with AsyncElasticsearch(**elastic_context) as client:
async for doc in async_scan(
client=client,
index='XXX',
query=query,
_source=['field']
):
```

**Environment (please complete the following information)**
- OS: [e.g. Linux] debian:bullseye-slim (python:3.9-slim container image) x86
- Python version: 3.9
- Framework and version [e.g. Django 2.1]: N/A AWS Lambda Function
- APM Server version: 8.6.1
- Agent version: 6.14.0
**Additional context**
Add any other context about the problem here.
- Agent config options <!-- be careful not to post sensitive information -->
<details>
<summary>Click to expand</summary>
```
ELASTIC_APM_LOG_LEVEL: warning
ELASTIC_APM_LAMBDA_APM_SERVER: https://<elastic-cloud-hostname>:443
ELASTIC_APM_SECRET_TOKEN: ${env:ELASTIC_APM_SECRET_TOKEN}
ELASTIC_APM_ENVIRONMENT: ${self:provider.stage}
ELASTIC_APM_SPAN_COMPRESSION_SAME_KIND_MAX_DURATION: 50ms
```
</details>
- `requirements.txt`:
<details>
<summary>Click to expand</summary>
```toml
[tool.poetry.dependencies]
python = "^3.9"
elasticsearch = "^8"
elastic-apm = "^6.14"
tomlkit = "^0.11.1"
uvloop = "^0.17.0"
PyYAML = "^6.0"
aiobotocore = "^2.3"
urllib3 = "^1.26.10"
python-dateutil = "^2.8.2"
pyjarm = "^0.0.5"
aiohttp = "^3.8"
multidict = "^6"
```
</details>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticapm/instrumentation/packages/asyncio/elasticsearch.py`
Content:
```
1 # BSD 3-Clause License
2 #
3 # Copyright (c) 2019, Elasticsearch BV
4 # All rights reserved.
5 #
6 # Redistribution and use in source and binary forms, with or without
7 # modification, are permitted provided that the following conditions are met:
8 #
9 # * Redistributions of source code must retain the above copyright notice, this
10 # list of conditions and the following disclaimer.
11 #
12 # * Redistributions in binary form must reproduce the above copyright notice,
13 # this list of conditions and the following disclaimer in the documentation
14 # and/or other materials provided with the distribution.
15 #
16 # * Neither the name of the copyright holder nor the names of its
17 # contributors may be used to endorse or promote products derived from
18 # this software without specific prior written permission.
19 #
20 # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
21 # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
22 # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
23 # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
24 # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
25 # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
26 # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
27 # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
28 # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29 # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30
31 import elasticapm
32 from elasticapm.instrumentation.packages.asyncio.base import AsyncAbstractInstrumentedModule
33 from elasticapm.instrumentation.packages.elasticsearch import (
34 ElasticsearchConnectionInstrumentation,
35 ElasticsearchTransportInstrumentation,
36 )
37 from elasticapm.traces import DroppedSpan, execution_context
38
39
40 class ElasticSearchAsyncConnection(ElasticsearchConnectionInstrumentation, AsyncAbstractInstrumentedModule):
41 name = "elasticsearch_connection"
42
43 def get_instrument_list(self):
44 try:
45 import elastic_transport # noqa: F401
46
47 return [
48 ("elastic_transport._node._http_aiohttp", "AiohttpHttpNode.perform_request"),
49 ]
50 except ImportError:
51 return [
52 ("elasticsearch_async.connection", "AIOHttpConnection.perform_request"),
53 ("elasticsearch._async.http_aiohttp", "AIOHttpConnection.perform_request"),
54 ]
55
56 async def call(self, module, method, wrapped, instance, args, kwargs):
57 span = execution_context.get_span()
58 if not span or isinstance(span, DroppedSpan):
59 return await wrapped(*args, **kwargs)
60
61 self._update_context_by_request_data(span.context, instance, args, kwargs)
62
63 result = await wrapped(*args, **kwargs)
64 if hasattr(result, "meta"): # elasticsearch-py 8.x+
65 status_code = result.meta.status
66 else:
67 status_code = result[0]
68
69 span.context["http"] = {"status_code": status_code}
70
71 return result
72
73
74 class ElasticsearchAsyncTransportInstrumentation(
75 ElasticsearchTransportInstrumentation, AsyncAbstractInstrumentedModule
76 ):
77 name = "elasticsearch_connection"
78
79 instrument_list = [
80 ("elasticsearch._async.transport", "AsyncTransport.perform_request"),
81 ]
82
83 def get_instrument_list(self):
84 try:
85 import elastic_transport # noqa: F401
86
87 return [
88 ("elastic_transport", "AsyncTransport.perform_request"),
89 ]
90 except ImportError:
91 return [
92 ("elasticsearch._async.transport", "AsyncTransport.perform_request"),
93 ]
94
95 async def call(self, module, method, wrapped, instance, args, kwargs):
96 async with elasticapm.async_capture_span(
97 self._get_signature(args, kwargs),
98 span_type="db",
99 span_subtype="elasticsearch",
100 span_action="query",
101 extra={},
102 skip_frames=2,
103 leaf=True,
104 ) as span:
105 result_data = await wrapped(*args, **kwargs)
106
107 hits = self._get_hits(result_data)
108 if hits:
109 span.context["db"]["rows_affected"] = hits
110
111 return result_data
112
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/elasticapm/instrumentation/packages/asyncio/elasticsearch.py b/elasticapm/instrumentation/packages/asyncio/elasticsearch.py
--- a/elasticapm/instrumentation/packages/asyncio/elasticsearch.py
+++ b/elasticapm/instrumentation/packages/asyncio/elasticsearch.py
@@ -106,6 +106,6 @@
hits = self._get_hits(result_data)
if hits:
- span.context["db"]["rows_affected"] = hits
+ span.update_context("db", {"rows_affected": hits})
return result_data
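The patched call goes through `span.update_context`, so the agent no longer indexes into a `"db"` entry that was never created for this span — the assumption that raised the `KeyError` in the report. A plain-dict sketch of the difference (standalone, not the agent's internals):

```python
context = {}   # span.context starts without a "db" entry for this span
hits = 42

# Old pattern: KeyError, because "db" was never created before indexing into it.
try:
    context["db"]["rows_affected"] = hits
except KeyError:
    pass

# Create-or-merge instead: the defensive equivalent of update_context.
context.setdefault("db", {})["rows_affected"] = hits
assert context == {"db": {"rows_affected": 42}}
```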
| {"golden_diff": "diff --git a/elasticapm/instrumentation/packages/asyncio/elasticsearch.py b/elasticapm/instrumentation/packages/asyncio/elasticsearch.py\n--- a/elasticapm/instrumentation/packages/asyncio/elasticsearch.py\n+++ b/elasticapm/instrumentation/packages/asyncio/elasticsearch.py\n@@ -106,6 +106,6 @@\n \n hits = self._get_hits(result_data)\n if hits:\n- span.context[\"db\"][\"rows_affected\"] = hits\n+ span.update_context(\"db\", {\"rows_affected\": hits})\n \n return result_data\n", "issue": "KeyError:'db' when capturing elasticsearch span context\n**Describe the bug**: ...\r\n\r\n**To Reproduce**\r\n\r\n1. Instrument via capture serverless\r\n2. Perform an async_scan:\r\n```python\r\nasync with AsyncElasticsearch(**elastic_context) as client:\r\n async for doc in async_scan(\r\n client=client,\r\n index='XXX',\r\n query=query,\r\n _source=['field']\r\n ):\r\n```\r\n\r\n\r\n\r\n**Environment (please complete the following information)**\r\n- OS: [e.g. Linux] debian:bullseye-slim (python:3.9-slim container image) x86\r\n- Python version: 3.9\r\n- Framework and version [e.g. Django 2.1]: N/A AWS Lambda Function\r\n- APM Server version: 8.6.1\r\n- Agent version: 6.14.0\r\n\r\n\r\n**Additional context**\r\n\r\nAdd any other context about the problem here.\r\n\r\n- Agent config options <!-- be careful not to post sensitive information -->\r\n <details>\r\n <summary>Click to expand</summary>\r\n\r\n ```\r\n ELASTIC_APM_LOG_LEVEL: warning\r\n ELASTIC_APM_LAMBDA_APM_SERVER: https://<elastic-cloud-hostname>:443\r\n ELASTIC_APM_SECRET_TOKEN: ${env:ELASTIC_APM_SECRET_TOKEN}\r\n ELASTIC_APM_ENVIRONMENT: ${self:provider.stage}\r\n ELASTIC_APM_SPAN_COMPRESSION_SAME_KIND_MAX_DURATION: 50ms\r\n ```\r\n </details>\r\n- `requirements.txt`:\r\n <details>\r\n <summary>Click to expand</summary>\r\n\r\n ```toml\r\n [tool.poetry.dependencies]\r\n python = \"^3.9\"\r\n elasticsearch = \"^8\"\r\n elastic-apm = \"^6.14\"\r\n tomlkit = \"^0.11.1\"\r\n uvloop = \"^0.17.0\"\r\n PyYAML = \"^6.0\"\r\n aiobotocore = \"^2.3\"\r\n urllib3 = \"^1.26.10\"\r\n python-dateutil = \"^2.8.2\"\r\n pyjarm = \"^0.0.5\"\r\n aiohttp = \"^3.8\"\r\n multidict = \"^6\"\r\n ```\r\n </details>\r\n\n", "before_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\nimport elasticapm\nfrom elasticapm.instrumentation.packages.asyncio.base import AsyncAbstractInstrumentedModule\nfrom elasticapm.instrumentation.packages.elasticsearch import (\n ElasticsearchConnectionInstrumentation,\n ElasticsearchTransportInstrumentation,\n)\nfrom elasticapm.traces import DroppedSpan, execution_context\n\n\nclass ElasticSearchAsyncConnection(ElasticsearchConnectionInstrumentation, AsyncAbstractInstrumentedModule):\n name = \"elasticsearch_connection\"\n\n def get_instrument_list(self):\n try:\n import elastic_transport # noqa: F401\n\n return [\n (\"elastic_transport._node._http_aiohttp\", \"AiohttpHttpNode.perform_request\"),\n ]\n except ImportError:\n return [\n (\"elasticsearch_async.connection\", \"AIOHttpConnection.perform_request\"),\n (\"elasticsearch._async.http_aiohttp\", \"AIOHttpConnection.perform_request\"),\n ]\n\n async def call(self, module, method, wrapped, instance, args, kwargs):\n span = execution_context.get_span()\n if not span or isinstance(span, DroppedSpan):\n return await wrapped(*args, **kwargs)\n\n self._update_context_by_request_data(span.context, instance, args, kwargs)\n\n result = await wrapped(*args, **kwargs)\n if hasattr(result, \"meta\"): # elasticsearch-py 8.x+\n status_code = result.meta.status\n else:\n status_code = result[0]\n\n span.context[\"http\"] = {\"status_code\": status_code}\n\n return result\n\n\nclass ElasticsearchAsyncTransportInstrumentation(\n ElasticsearchTransportInstrumentation, AsyncAbstractInstrumentedModule\n):\n name = \"elasticsearch_connection\"\n\n instrument_list = [\n (\"elasticsearch._async.transport\", \"AsyncTransport.perform_request\"),\n ]\n\n def get_instrument_list(self):\n try:\n import elastic_transport # noqa: F401\n\n return [\n (\"elastic_transport\", \"AsyncTransport.perform_request\"),\n ]\n except ImportError:\n return [\n (\"elasticsearch._async.transport\", \"AsyncTransport.perform_request\"),\n ]\n\n async def call(self, module, method, wrapped, instance, args, kwargs):\n async with elasticapm.async_capture_span(\n self._get_signature(args, kwargs),\n span_type=\"db\",\n span_subtype=\"elasticsearch\",\n span_action=\"query\",\n extra={},\n skip_frames=2,\n leaf=True,\n ) as span:\n result_data = await wrapped(*args, **kwargs)\n\n hits = self._get_hits(result_data)\n if hits:\n span.context[\"db\"][\"rows_affected\"] = hits\n\n return result_data\n", "path": "elasticapm/instrumentation/packages/asyncio/elasticsearch.py"}], "after_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in 
the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\nimport elasticapm\nfrom elasticapm.instrumentation.packages.asyncio.base import AsyncAbstractInstrumentedModule\nfrom elasticapm.instrumentation.packages.elasticsearch import (\n ElasticsearchConnectionInstrumentation,\n ElasticsearchTransportInstrumentation,\n)\nfrom elasticapm.traces import DroppedSpan, execution_context\n\n\nclass ElasticSearchAsyncConnection(ElasticsearchConnectionInstrumentation, AsyncAbstractInstrumentedModule):\n name = \"elasticsearch_connection\"\n\n def get_instrument_list(self):\n try:\n import elastic_transport # noqa: F401\n\n return [\n (\"elastic_transport._node._http_aiohttp\", \"AiohttpHttpNode.perform_request\"),\n ]\n except ImportError:\n return [\n (\"elasticsearch_async.connection\", \"AIOHttpConnection.perform_request\"),\n (\"elasticsearch._async.http_aiohttp\", \"AIOHttpConnection.perform_request\"),\n ]\n\n async def call(self, module, method, wrapped, instance, args, kwargs):\n span = execution_context.get_span()\n if not span or isinstance(span, DroppedSpan):\n return await wrapped(*args, **kwargs)\n\n self._update_context_by_request_data(span.context, instance, args, kwargs)\n\n result = await wrapped(*args, **kwargs)\n if hasattr(result, \"meta\"): # elasticsearch-py 8.x+\n status_code = result.meta.status\n else:\n status_code = result[0]\n\n span.context[\"http\"] = {\"status_code\": status_code}\n\n return result\n\n\nclass ElasticsearchAsyncTransportInstrumentation(\n ElasticsearchTransportInstrumentation, AsyncAbstractInstrumentedModule\n):\n name = \"elasticsearch_connection\"\n\n instrument_list = [\n (\"elasticsearch._async.transport\", \"AsyncTransport.perform_request\"),\n ]\n\n def get_instrument_list(self):\n try:\n import elastic_transport # noqa: F401\n\n return [\n (\"elastic_transport\", \"AsyncTransport.perform_request\"),\n ]\n except ImportError:\n return [\n (\"elasticsearch._async.transport\", \"AsyncTransport.perform_request\"),\n ]\n\n async def call(self, module, method, wrapped, instance, args, kwargs):\n async with elasticapm.async_capture_span(\n self._get_signature(args, kwargs),\n span_type=\"db\",\n span_subtype=\"elasticsearch\",\n span_action=\"query\",\n extra={},\n skip_frames=2,\n leaf=True,\n ) as span:\n result_data = await wrapped(*args, **kwargs)\n\n hits = self._get_hits(result_data)\n if hits:\n span.update_context(\"db\", {\"rows_affected\": hits})\n\n return result_data\n", "path": "elasticapm/instrumentation/packages/asyncio/elasticsearch.py"}]} | 
1,947 | 132 |
gh_patches_debug_6983 | rasdani/github-patches | git_diff | frappe__frappe-6179 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
"StartTLS is not supported"-Bug in LDAP integrations
`self` is used in the module-level function `authenticate_ldap_user`, which results in the Python exception:
```
NameError: global name 'self' is not defined
```
https://github.com/frappe/frappe/blob/6811f0345e35ea521942db4b29425dd71e403ab8/frappe/integrations/doctype/ldap_settings/ldap_settings.py#L89
Since all exceptions are caught regardless of their type, a misleading "StartTLS is not supported" error is thrown:
https://github.com/frappe/frappe/blob/6811f0345e35ea521942db4b29425dd71e403ab8/frappe/integrations/doctype/ldap_settings/ldap_settings.py#L95
--- END ISSUE ---
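The misleading message is the combination of the two linked lines: the stray `self` raises `NameError` at runtime, and the bare `except:` around it swallows that and substitutes the generic TLS error. A minimal, LDAP-free reproduction of the masking effect:

```python
def authenticate():
    try:
        # Module-level function: there is no `self` here, so this line raises
        # NameError before any TLS work is even attempted.
        if self.ssl_tls_mode == "StartTLS":  # noqa: F821 - deliberate bug
            pass
    except:  # bare except, as in the original code
        # The real NameError is lost; the caller only ever sees the TLS message.
        raise RuntimeError("StartTLS is not supported")


try:
    authenticate()
except RuntimeError as exc:
    print(exc)  # "StartTLS is not supported" -- the NameError never surfaces
```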
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `frappe/integrations/doctype/ldap_settings/ldap_settings.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright (c) 2015, Frappe Technologies and contributors
3 # For license information, please see license.txt
4
5 from __future__ import unicode_literals
6 import frappe
7 from frappe import _
8 from frappe.utils import cstr
9 from frappe.model.document import Document
10
11 class LDAPSettings(Document):
12 def validate(self):
13 if not self.flags.ignore_mandatory:
14 self.validate_ldap_credentails()
15
16 def validate_ldap_credentails(self):
17 try:
18 import ldap
19 conn = ldap.initialize(self.ldap_server_url)
20 try:
21 if self.ssl_tls_mode == 'StartTLS':
22 conn.set_option(ldap.OPT_X_TLS_DEMAND, True)
23 if self.require_trusted_certificate == 'Yes':
24 conn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)
25 conn.start_tls_s()
26 except:
27 frappe.throw(_("StartTLS is not supported"))
28
29 conn.simple_bind_s(self.base_dn, self.get_password(raise_exception=False))
30 except ImportError:
31 msg = """
32 <div>
33 {{_("Seems ldap is not installed on system.<br>Guidelines to install ldap dependancies and python package")}},
34 <a href="https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841" target="_blank">{{_("Click here")}}</a>,
35 </div>
36 """
37 frappe.throw(msg, title=_("LDAP Not Installed"))
38
39 except ldap.LDAPError:
40 conn.unbind_s()
41 frappe.throw(_("Incorrect UserId or Password"))
42
43 def get_ldap_settings():
44 try:
45 settings = frappe.get_doc("LDAP Settings")
46
47 settings.update({
48 "method": "frappe.integrations.doctype.ldap_settings.ldap_settings.login"
49 })
50 return settings
51 except Exception:
52 # this will return blank settings
53 return frappe._dict()
54
55 @frappe.whitelist(allow_guest=True)
56 def login():
57 #### LDAP LOGIN LOGIC #####
58 args = frappe.form_dict
59 user = authenticate_ldap_user(frappe.as_unicode(args.usr), frappe.as_unicode(args.pwd))
60
61 frappe.local.login_manager.user = user.name
62 frappe.local.login_manager.post_login()
63
64 # because of a GET request!
65 frappe.db.commit()
66
67 def authenticate_ldap_user(user=None, password=None):
68 dn = None
69 params = {}
70 settings = get_ldap_settings()
71
72 try:
73 import ldap
74 except:
75 msg = """
76 <div>
77 {{_("Seems ldap is not installed on system.")}}<br>
78 <a href"https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841">{{_("Click here")}}</a>,
79 {{_("Guidelines to install ldap dependancies and python")}}
80 </div>
81 """
82 frappe.throw(msg, title=_("LDAP Not Installed"))
83
84 conn = ldap.initialize(settings.ldap_server_url)
85
86 try:
87 try:
88 # set TLS settings for secure connection
89 if self.ssl_tls_mode == 'StartTLS':
90 conn.set_option(ldap.OPT_X_TLS_DEMAND, True)
91 if self.require_trusted_certificate == 'Yes':
92 conn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)
93 conn.start_tls_s()
94 except:
95 frappe.throw(_("StartTLS is not supported"))
96
97 # simple_bind_s is synchronous binding to server, it takes two param DN and password
98 conn.simple_bind_s(settings.base_dn, settings.get_password(raise_exception=False))
99
100 #search for surnames beginning with a
101 #available options for how deep a search you want.
102 #LDAP_SCOPE_BASE, LDAP_SCOPE_ONELEVEL,LDAP_SCOPE_SUBTREE,
103 result = conn.search_s(settings.organizational_unit, ldap.SCOPE_SUBTREE,
104 settings.ldap_search_string.format(user))
105
106 for dn, r in result:
107 dn = cstr(dn)
108 params["email"] = cstr(r[settings.ldap_email_field][0])
109 params["username"] = cstr(r[settings.ldap_username_field][0])
110 params["first_name"] = cstr(r[settings.ldap_first_name_field][0])
111
112 if dn:
113 conn.simple_bind_s(dn, frappe.as_unicode(password))
114 return create_user(params)
115 else:
116 frappe.throw(_("Not a valid LDAP user"))
117
118 except ldap.LDAPError:
119 conn.unbind_s()
120 frappe.throw(_("Incorrect UserId or Password"))
121
122 def create_user(params):
123 if frappe.db.exists("User", params["email"]):
124 return frappe.get_doc("User", params["email"])
125
126 else:
127 params.update({
128 "doctype": "User",
129 "send_welcome_email": 0,
130 "language": "",
131 "user_type": "System User",
132 "roles": [{
133 "role": _("Blogger")
134 }]
135 })
136
137 user = frappe.get_doc(params).insert(ignore_permissions=True)
138 frappe.db.commit()
139
140 return user
141
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/frappe/integrations/doctype/ldap_settings/ldap_settings.py b/frappe/integrations/doctype/ldap_settings/ldap_settings.py
--- a/frappe/integrations/doctype/ldap_settings/ldap_settings.py
+++ b/frappe/integrations/doctype/ldap_settings/ldap_settings.py
@@ -86,7 +86,7 @@
try:
try:
# set TLS settings for secure connection
- if self.ssl_tls_mode == 'StartTLS':
+ if settings.ssl_tls_mode == 'StartTLS':
conn.set_option(ldap.OPT_X_TLS_DEMAND, True)
if self.require_trusted_certificate == 'Yes':
conn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)
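Reading the diff closely, only the `ssl_tls_mode` lookup moves from `self` to `settings`; the `self.require_trusted_certificate` comparison on the unchanged context line still references `self` inside the module-level function. A hypothetical helper sketching the fully consistent version of that block, with every option read from the settings document, might look roughly like:

```python
def _configure_start_tls(conn, settings, ldap):
    # Illustrative only: mirrors the patched block, but reads *all* options
    # from the LDAP Settings document instead of a non-existent `self`.
    if settings.ssl_tls_mode == 'StartTLS':
        conn.set_option(ldap.OPT_X_TLS_DEMAND, True)
        if settings.require_trusted_certificate == 'Yes':
            conn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)
        conn.start_tls_s()
```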
| {"golden_diff": "diff --git a/frappe/integrations/doctype/ldap_settings/ldap_settings.py b/frappe/integrations/doctype/ldap_settings/ldap_settings.py\n--- a/frappe/integrations/doctype/ldap_settings/ldap_settings.py\n+++ b/frappe/integrations/doctype/ldap_settings/ldap_settings.py\n@@ -86,7 +86,7 @@\n \ttry:\n \t\ttry:\n \t\t\t# set TLS settings for secure connection\n-\t\t\tif self.ssl_tls_mode == 'StartTLS':\n+\t\t\tif settings.ssl_tls_mode == 'StartTLS':\n \t\t\t\tconn.set_option(ldap.OPT_X_TLS_DEMAND, True)\n \t\t\t\tif self.require_trusted_certificate == 'Yes':\n \t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)\n", "issue": "\"StartTLS is not supported\"-Bug in LDAP integrations\n`self` is used in the method `authenticate_ldap_user` which results in the Python exception:\r\n```\r\nNameError: global name 'self' is not defined\r\n```\r\nhttps://github.com/frappe/frappe/blob/6811f0345e35ea521942db4b29425dd71e403ab8/frappe/integrations/doctype/ldap_settings/ldap_settings.py#L89\r\n\r\nSince all exceptions are caught regardless their type, a misleading \"StartTLS is not supported\" error is thrown:\r\nhttps://github.com/frappe/frappe/blob/6811f0345e35ea521942db4b29425dd71e403ab8/frappe/integrations/doctype/ldap_settings/ldap_settings.py#L95\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright (c) 2015, Frappe Technologies and contributors\n# For license information, please see license.txt\n\nfrom __future__ import unicode_literals\nimport frappe\nfrom frappe import _\nfrom frappe.utils import cstr\nfrom frappe.model.document import Document\n\nclass LDAPSettings(Document):\n\tdef validate(self):\n\t\tif not self.flags.ignore_mandatory:\n\t\t\tself.validate_ldap_credentails()\n\n\tdef validate_ldap_credentails(self):\n\t\ttry:\n\t\t\timport ldap\n\t\t\tconn = ldap.initialize(self.ldap_server_url)\n\t\t\ttry:\n\t\t\t\tif self.ssl_tls_mode == 'StartTLS':\n\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_DEMAND, True)\n\t\t\t\t\tif self.require_trusted_certificate == 'Yes':\n\t\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)\n\t\t\t\t\tconn.start_tls_s()\n\t\t\texcept:\n\t\t\t\tfrappe.throw(_(\"StartTLS is not supported\"))\n\n\t\t\tconn.simple_bind_s(self.base_dn, self.get_password(raise_exception=False))\n\t\texcept ImportError:\n\t\t\tmsg = \"\"\"\n\t\t\t\t<div>\n\t\t\t\t\t{{_(\"Seems ldap is not installed on system.<br>Guidelines to install ldap dependancies and python package\")}},\n\t\t\t\t\t<a href=\"https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841\" target=\"_blank\">{{_(\"Click here\")}}</a>,\n\t\t\t\t</div>\n\t\t\t\"\"\"\n\t\t\tfrappe.throw(msg, title=_(\"LDAP Not Installed\"))\n\n\t\texcept ldap.LDAPError:\n\t\t\tconn.unbind_s()\n\t\t\tfrappe.throw(_(\"Incorrect UserId or Password\"))\n\ndef get_ldap_settings():\n\ttry:\n\t\tsettings = frappe.get_doc(\"LDAP Settings\")\n\n\t\tsettings.update({\n\t\t\t\"method\": \"frappe.integrations.doctype.ldap_settings.ldap_settings.login\"\n\t\t})\n\t\treturn settings\n\texcept Exception:\n\t\t# this will return blank settings\n\t\treturn frappe._dict()\n\[email protected](allow_guest=True)\ndef login():\n\t#### LDAP LOGIN LOGIC #####\n\targs = frappe.form_dict\n\tuser = authenticate_ldap_user(frappe.as_unicode(args.usr), frappe.as_unicode(args.pwd))\n\n\tfrappe.local.login_manager.user = user.name\n\tfrappe.local.login_manager.post_login()\n\n\t# because of a GET request!\n\tfrappe.db.commit()\n\ndef authenticate_ldap_user(user=None, 
password=None):\n\tdn = None\n\tparams = {}\n\tsettings = get_ldap_settings()\n\n\ttry:\n\t\timport ldap\n\texcept:\n\t\tmsg = \"\"\"\n\t\t\t<div>\n\t\t\t\t{{_(\"Seems ldap is not installed on system.\")}}<br>\n\t\t\t\t<a href\"https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841\">{{_(\"Click here\")}}</a>,\n\t\t\t\t\t{{_(\"Guidelines to install ldap dependancies and python\")}}\n\t\t\t</div>\n\t\t\"\"\"\n\t\tfrappe.throw(msg, title=_(\"LDAP Not Installed\"))\n\n\tconn = ldap.initialize(settings.ldap_server_url)\n\n\ttry:\n\t\ttry:\n\t\t\t# set TLS settings for secure connection\n\t\t\tif self.ssl_tls_mode == 'StartTLS':\n\t\t\t\tconn.set_option(ldap.OPT_X_TLS_DEMAND, True)\n\t\t\t\tif self.require_trusted_certificate == 'Yes':\n\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)\n\t\t\t\tconn.start_tls_s()\n\t\texcept:\n\t\t\tfrappe.throw(_(\"StartTLS is not supported\"))\n\n\t\t# simple_bind_s is synchronous binding to server, it takes two param DN and password\n\t\tconn.simple_bind_s(settings.base_dn, settings.get_password(raise_exception=False))\n\n\t\t#search for surnames beginning with a\n\t\t#available options for how deep a search you want.\n\t\t#LDAP_SCOPE_BASE, LDAP_SCOPE_ONELEVEL,LDAP_SCOPE_SUBTREE,\n\t\tresult = conn.search_s(settings.organizational_unit, ldap.SCOPE_SUBTREE,\n\t\t\tsettings.ldap_search_string.format(user))\n\n\t\tfor dn, r in result:\n\t\t\tdn = cstr(dn)\n\t\t\tparams[\"email\"] = cstr(r[settings.ldap_email_field][0])\n\t\t\tparams[\"username\"] = cstr(r[settings.ldap_username_field][0])\n\t\t\tparams[\"first_name\"] = cstr(r[settings.ldap_first_name_field][0])\n\n\t\tif dn:\n\t\t\tconn.simple_bind_s(dn, frappe.as_unicode(password))\n\t\t\treturn create_user(params)\n\t\telse:\n\t\t\tfrappe.throw(_(\"Not a valid LDAP user\"))\n\n\texcept ldap.LDAPError:\n\t\tconn.unbind_s()\n\t\tfrappe.throw(_(\"Incorrect UserId or Password\"))\n\ndef create_user(params):\n\tif frappe.db.exists(\"User\", params[\"email\"]):\n\t\treturn frappe.get_doc(\"User\", params[\"email\"])\n\n\telse:\n\t\tparams.update({\n\t\t\t\"doctype\": \"User\",\n\t\t\t\"send_welcome_email\": 0,\n\t\t\t\"language\": \"\",\n\t\t\t\"user_type\": \"System User\",\n\t\t\t\"roles\": [{\n\t\t\t\t\"role\": _(\"Blogger\")\n\t\t\t}]\n\t\t})\n\n\t\tuser = frappe.get_doc(params).insert(ignore_permissions=True)\n\t\tfrappe.db.commit()\n\n\t\treturn user\n", "path": "frappe/integrations/doctype/ldap_settings/ldap_settings.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright (c) 2015, Frappe Technologies and contributors\n# For license information, please see license.txt\n\nfrom __future__ import unicode_literals\nimport frappe\nfrom frappe import _\nfrom frappe.utils import cstr\nfrom frappe.model.document import Document\n\nclass LDAPSettings(Document):\n\tdef validate(self):\n\t\tif not self.flags.ignore_mandatory:\n\t\t\tself.validate_ldap_credentails()\n\n\tdef validate_ldap_credentails(self):\n\t\ttry:\n\t\t\timport ldap\n\t\t\tconn = ldap.initialize(self.ldap_server_url)\n\t\t\ttry:\n\t\t\t\tif self.ssl_tls_mode == 'StartTLS':\n\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_DEMAND, True)\n\t\t\t\t\tif self.require_trusted_certificate == 'Yes':\n\t\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)\n\t\t\t\t\tconn.start_tls_s()\n\t\t\texcept:\n\t\t\t\tfrappe.throw(_(\"StartTLS is not supported\"))\n\n\t\t\tconn.simple_bind_s(self.base_dn, self.get_password(raise_exception=False))\n\t\texcept ImportError:\n\t\t\tmsg = 
\"\"\"\n\t\t\t\t<div>\n\t\t\t\t\t{{_(\"Seems ldap is not installed on system.<br>Guidelines to install ldap dependancies and python package\")}},\n\t\t\t\t\t<a href=\"https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841\" target=\"_blank\">{{_(\"Click here\")}}</a>,\n\t\t\t\t</div>\n\t\t\t\"\"\"\n\t\t\tfrappe.throw(msg, title=_(\"LDAP Not Installed\"))\n\n\t\texcept ldap.LDAPError:\n\t\t\tconn.unbind_s()\n\t\t\tfrappe.throw(_(\"Incorrect UserId or Password\"))\n\ndef get_ldap_settings():\n\ttry:\n\t\tsettings = frappe.get_doc(\"LDAP Settings\")\n\n\t\tsettings.update({\n\t\t\t\"method\": \"frappe.integrations.doctype.ldap_settings.ldap_settings.login\"\n\t\t})\n\t\treturn settings\n\texcept Exception:\n\t\t# this will return blank settings\n\t\treturn frappe._dict()\n\[email protected](allow_guest=True)\ndef login():\n\t#### LDAP LOGIN LOGIC #####\n\targs = frappe.form_dict\n\tuser = authenticate_ldap_user(frappe.as_unicode(args.usr), frappe.as_unicode(args.pwd))\n\n\tfrappe.local.login_manager.user = user.name\n\tfrappe.local.login_manager.post_login()\n\n\t# because of a GET request!\n\tfrappe.db.commit()\n\ndef authenticate_ldap_user(user=None, password=None):\n\tdn = None\n\tparams = {}\n\tsettings = get_ldap_settings()\n\n\ttry:\n\t\timport ldap\n\texcept:\n\t\tmsg = \"\"\"\n\t\t\t<div>\n\t\t\t\t{{_(\"Seems ldap is not installed on system.\")}}<br>\n\t\t\t\t<a href\"https://discuss.erpnext.com/t/frappe-v-7-1-beta-ldap-dependancies/15841\">{{_(\"Click here\")}}</a>,\n\t\t\t\t\t{{_(\"Guidelines to install ldap dependancies and python\")}}\n\t\t\t</div>\n\t\t\"\"\"\n\t\tfrappe.throw(msg, title=_(\"LDAP Not Installed\"))\n\n\tconn = ldap.initialize(settings.ldap_server_url)\n\n\ttry:\n\t\ttry:\n\t\t\t# set TLS settings for secure connection\n\t\t\tif settings.ssl_tls_mode == 'StartTLS':\n\t\t\t\tconn.set_option(ldap.OPT_X_TLS_DEMAND, True)\n\t\t\t\tif self.require_trusted_certificate == 'Yes':\n\t\t\t\t\tconn.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_DEMAND)\n\t\t\t\tconn.start_tls_s()\n\t\texcept:\n\t\t\tfrappe.throw(_(\"StartTLS is not supported\"))\n\n\t\t# simple_bind_s is synchronous binding to server, it takes two param DN and password\n\t\tconn.simple_bind_s(settings.base_dn, settings.get_password(raise_exception=False))\n\n\t\t#search for surnames beginning with a\n\t\t#available options for how deep a search you want.\n\t\t#LDAP_SCOPE_BASE, LDAP_SCOPE_ONELEVEL,LDAP_SCOPE_SUBTREE,\n\t\tresult = conn.search_s(settings.organizational_unit, ldap.SCOPE_SUBTREE,\n\t\t\tsettings.ldap_search_string.format(user))\n\n\t\tfor dn, r in result:\n\t\t\tdn = cstr(dn)\n\t\t\tparams[\"email\"] = cstr(r[settings.ldap_email_field][0])\n\t\t\tparams[\"username\"] = cstr(r[settings.ldap_username_field][0])\n\t\t\tparams[\"first_name\"] = cstr(r[settings.ldap_first_name_field][0])\n\n\t\tif dn:\n\t\t\tconn.simple_bind_s(dn, frappe.as_unicode(password))\n\t\t\treturn create_user(params)\n\t\telse:\n\t\t\tfrappe.throw(_(\"Not a valid LDAP user\"))\n\n\texcept ldap.LDAPError:\n\t\tconn.unbind_s()\n\t\tfrappe.throw(_(\"Incorrect UserId or Password\"))\n\ndef create_user(params):\n\tif frappe.db.exists(\"User\", params[\"email\"]):\n\t\treturn frappe.get_doc(\"User\", params[\"email\"])\n\n\telse:\n\t\tparams.update({\n\t\t\t\"doctype\": \"User\",\n\t\t\t\"send_welcome_email\": 0,\n\t\t\t\"language\": \"\",\n\t\t\t\"user_type\": \"System User\",\n\t\t\t\"roles\": [{\n\t\t\t\t\"role\": _(\"Blogger\")\n\t\t\t}]\n\t\t})\n\n\t\tuser = 
frappe.get_doc(params).insert(ignore_permissions=True)\n\t\tfrappe.db.commit()\n\n\t\treturn user\n", "path": "frappe/integrations/doctype/ldap_settings/ldap_settings.py"}]} | 1,929 | 167 |
gh_patches_debug_9901 | rasdani/github-patches | git_diff | bridgecrewio__checkov-1062 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CKV_AZURE_12: Retention policy of 0 is not supported
**Describe the bug**
When setting the `retention_policy.days` value on `azurerm_network_watcher_flow_log` to `0`, `CKV_AZURE_12` still fails. `0` is the value to signify indefinite/forever retention.
**To Reproduce**
Steps to reproduce the behavior:
1. Create an `azurerm_network_watcher_flow_log` resource
2. Set
```
retention_policy {
enabled = true
days = 0
}
```
3. Run `checkov`
4. Receive failure for `CKV_AZURE_12`
**Expected behavior**
`0` is an accepted value as documented at https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-nsg-flow-logging-overview#how-logging-works
**Desktop (please complete the following information):**
- OS: Linux
- Checkov Version: 2.0.26
**Additional context**
At https://github.com/bridgecrewio/checkov/blob/master/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py#L17 we likely need to adjust the innermost `if` statement to something like:
```python
if retention_in_days is not None and (retention_in_days == 0 or retention_in_days >= 90):
```
Happy to open a pull request if this is an acceptable solution.
--- END ISSUE ---
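Underneath this is an ordinary truthiness pitfall: `0` is falsy in Python, so `if retention_in_days and ...` treats "retain forever" the same as a missing value. A small standalone sketch of the old check against the suggested one:

```python
def old_check(retention_in_days):
    # Falls through to FAILED when the value is 0, even though 0 means "forever".
    return bool(retention_in_days and retention_in_days >= 90)


def new_check(retention_in_days):
    # Explicit None test keeps 0 (indefinite retention) as a passing value.
    return retention_in_days is not None and (retention_in_days == 0 or retention_in_days >= 90)


assert old_check(0) is False   # the reported bug
assert new_check(0) is True    # 0 now passes
assert new_check(90) is True
assert new_check(30) is False
assert new_check(None) is False
```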
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py`
Content:
```
1 from checkov.common.models.enums import CheckResult, CheckCategories
2 from checkov.common.util.type_forcers import force_int
3 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck
4
5
6 class NetworkWatcherFlowLogPeriod(BaseResourceCheck):
7 def __init__(self):
8 name = "Ensure that Network Security Group Flow Log retention period is 'greater than 90 days'"
9 id = "CKV_AZURE_12"
10 supported_resources = ['azurerm_network_watcher_flow_log']
11 categories = [CheckCategories.LOGGING]
12 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
13
14 def scan_resource_conf(self, conf):
15 if 'enabled' in conf and conf['enabled'][0]:
16 retention_block = conf['retention_policy'][0]
17 if retention_block['enabled'][0]:
18 retention_in_days = force_int(retention_block['days'][0])
19 if retention_in_days and retention_in_days >= 90:
20 return CheckResult.PASSED
21 return CheckResult.FAILED
22
23
24 check = NetworkWatcherFlowLogPeriod()
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py b/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py
--- a/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py
+++ b/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py
@@ -16,7 +16,7 @@
retention_block = conf['retention_policy'][0]
if retention_block['enabled'][0]:
retention_in_days = force_int(retention_block['days'][0])
- if retention_in_days and retention_in_days >= 90:
+ if retention_in_days is not None and (retention_in_days == 0 or retention_in_days >= 90):
return CheckResult.PASSED
return CheckResult.FAILED
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py b/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py\n--- a/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py\n+++ b/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py\n@@ -16,7 +16,7 @@\n retention_block = conf['retention_policy'][0]\n if retention_block['enabled'][0]:\n retention_in_days = force_int(retention_block['days'][0])\n- if retention_in_days and retention_in_days >= 90:\n+ if retention_in_days is not None and (retention_in_days == 0 or retention_in_days >= 90):\n return CheckResult.PASSED\n return CheckResult.FAILED\n", "issue": "CKV_AZURE_12: Retention policy of 0 is not supported\n**Describe the bug**\r\nWhen setting the `retention_period.days` value on `azurerm_network_watcher_flow_log` to `0`, `CKV_AZURE_12` still fails. `0` is the value to signify indefinite/forever retention.\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Create an `azurerm_network_watcher_flow_log` resource\r\n2. Set\r\n```\r\n retention_policy {\r\n enabled = true\r\n days = 0\r\n }\r\n```\r\n3. Run `checkov`\r\n4. Receive failure for `CKV_AZURE_12`\r\n\r\n**Expected behavior**\r\n`0` is an accepted value as documented at https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-nsg-flow-logging-overview#how-logging-works\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Linux\r\n - Checkov Version: 2.0.26\r\n\r\n\r\n**Additional context**\r\nAt https://github.com/bridgecrewio/checkov/blob/master/checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py#L17 likely need to adjust the innermost `if` statement to something like:\r\n\r\n```python\r\nif retention_in_days is not None and (retention_days == 0 or retention_days >= 90):\r\n```\r\n\r\nHappy to open a pull request if this is an acceptable solution.\n", "before_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.common.util.type_forcers import force_int\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck\n\n\nclass NetworkWatcherFlowLogPeriod(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure that Network Security Group Flow Log retention period is 'greater than 90 days'\"\n id = \"CKV_AZURE_12\"\n supported_resources = ['azurerm_network_watcher_flow_log']\n categories = [CheckCategories.LOGGING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if 'enabled' in conf and conf['enabled'][0]:\n retention_block = conf['retention_policy'][0]\n if retention_block['enabled'][0]:\n retention_in_days = force_int(retention_block['days'][0])\n if retention_in_days and retention_in_days >= 90:\n return CheckResult.PASSED\n return CheckResult.FAILED\n\n\ncheck = NetworkWatcherFlowLogPeriod()\n", "path": "checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py"}], "after_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.common.util.type_forcers import force_int\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck\n\n\nclass NetworkWatcherFlowLogPeriod(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure that Network Security Group Flow Log retention period is 'greater than 90 days'\"\n id = \"CKV_AZURE_12\"\n supported_resources = 
['azurerm_network_watcher_flow_log']\n categories = [CheckCategories.LOGGING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if 'enabled' in conf and conf['enabled'][0]:\n retention_block = conf['retention_policy'][0]\n if retention_block['enabled'][0]:\n retention_in_days = force_int(retention_block['days'][0])\n if retention_in_days is not None and (retention_in_days == 0 or retention_in_days >= 90):\n return CheckResult.PASSED\n return CheckResult.FAILED\n\n\ncheck = NetworkWatcherFlowLogPeriod()\n", "path": "checkov/terraform/checks/resource/azure/NetworkWatcherFlowLogPeriod.py"}]} | 867 | 182 |
gh_patches_debug_30688 | rasdani/github-patches | git_diff | joke2k__faker-592 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Reseed code broke in 0.8 unnecessarily
Commit ea4f189bbff1925d7a7e2d7cbc6e91e2e8a9a3f3 changed the name of the `random.Random()` object in `faker.generator` from `random` to `mod_random`, re-exposing the `random` module as the name `faker.generator.random`. This broke code like:
```py
from faker.generator import random
random.seed(1)
```
which is basically what my `pytest-randomly` plugin does. At first I thought this was #586 but then I saw that every run was broken. The failure is silent because `random` is still a name in `faker.generator`, it just now points at the global `random` module 😱
I suggest just doing some renaming in the module to fix this so old code continues to work.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/utils/distribution.py`
Content:
```
1 # coding=utf-8
2
3 import bisect
4 from faker.generator import mod_random
5
6 def random_sample(random=None):
7 if random is None:
8 random = mod_random
9 return random.uniform(0.0, 1.0)
10
11
12 def cumsum(it):
13 total = 0
14 for x in it:
15 total += x
16 yield total
17
18
19 def choice_distribution(a, p, random=None):
20 if random is None:
21 random = mod_random
22
23 assert len(a) == len(p)
24
25 if hasattr(random, 'choices'):
26 return random.choices(a, weights=p)[0]
27 else:
28 cdf = list(cumsum(p))
29 normal = cdf[-1]
30 cdf2 = [float(i) / float(normal) for i in cdf]
31 uniform_sample = random_sample(random=random)
32 idx = bisect.bisect_right(cdf2, uniform_sample)
33 return a[idx]
34
```
Path: `faker/generator.py`
Content:
```
1 # coding=utf-8
2
3 from __future__ import unicode_literals
4
5 import re
6 import random
7
8
9 _re_token = re.compile(r'\{\{(\s?)(\w+)(\s?)\}\}')
10 mod_random = random.Random()
11
12
13 class Generator(object):
14
15 __config = {}
16
17 def __init__(self, **config):
18 self.providers = []
19 self.__config = dict(
20 list(self.__config.items()) + list(config.items()))
21 self.__random = mod_random
22
23 def add_provider(self, provider):
24
25 if type(provider) is type:
26 provider = provider(self)
27
28 self.providers.insert(0, provider)
29
30 for method_name in dir(provider):
31 # skip 'private' method
32 if method_name.startswith('_'):
33 continue
34
35 faker_function = getattr(provider, method_name)
36
37 if hasattr(faker_function, '__call__') or \
38 isinstance(faker_function, (classmethod, staticmethod)):
39 # add all faker method to generator
40 self.set_formatter(method_name, faker_function)
41
42 def provider(self, name):
43 try:
44 lst = [p for p in self.get_providers()
45 if p.__provider__ == name.lower()]
46 return lst[0]
47 except IndexError:
48 return None
49
50 def get_providers(self):
51 """Returns added providers."""
52 return self.providers
53
54 @property
55 def random(self):
56 return self.__random
57
58 def seed_instance(self, seed=None):
59 """Calls random.seed"""
60 if self.__random == mod_random:
61 # create per-instance random obj when first time seed_instance() is called
62 self.__random = random.Random()
63 self.__random.seed(seed)
64
65 @classmethod
66 def seed(cls, seed=None):
67 mod_random.seed(seed)
68
69 def format(self, formatter, *args, **kwargs):
70 """
71 This is a secure way to make a fake from another Provider.
72 """
73 # TODO: data export?
74 return self.get_formatter(formatter)(*args, **kwargs)
75
76 def get_formatter(self, formatter):
77 try:
78 return getattr(self, formatter)
79 except AttributeError:
80 if 'locale' in self.__config:
81 msg = 'Unknown formatter "{0}" with locale "{1}"'.format(
82 formatter, self.__config['locale']
83 )
84 else:
85 raise AttributeError('Unknown formatter "{0}"'.format(
86 formatter
87 ))
88 raise AttributeError(msg)
89
90 def set_formatter(self, name, method):
91 """
92 This method adds a provider method to generator.
93 Override this method to add some decoration or logging stuff.
94 """
95 setattr(self, name, method)
96
97 def parse(self, text):
98 """
99 Replaces tokens (like '{{ tokenName }}' or '{{tokenName}}')
100 with the result from the token method call.
101 """
102 return _re_token.sub(self.__format_token, text)
103
104 def __format_token(self, matches):
105 formatter = list(matches.groups())
106 formatter[1] = self.format(formatter[1])
107 return ''.join(formatter)
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/faker/generator.py b/faker/generator.py
--- a/faker/generator.py
+++ b/faker/generator.py
@@ -3,11 +3,12 @@
from __future__ import unicode_literals
import re
-import random
+import random as random_module
_re_token = re.compile(r'\{\{(\s?)(\w+)(\s?)\}\}')
-mod_random = random.Random()
+random = random_module.Random()
+mod_random = random # compat with name released in 0.8
class Generator(object):
@@ -18,7 +19,7 @@
self.providers = []
self.__config = dict(
list(self.__config.items()) + list(config.items()))
- self.__random = mod_random
+ self.__random = random
def add_provider(self, provider):
@@ -57,14 +58,14 @@
def seed_instance(self, seed=None):
"""Calls random.seed"""
- if self.__random == mod_random:
+ if self.__random == random:
# create per-instance random obj when first time seed_instance() is called
- self.__random = random.Random()
+ self.__random = random_module.Random()
self.__random.seed(seed)
@classmethod
def seed(cls, seed=None):
- mod_random.seed(seed)
+ random.seed(seed)
def format(self, formatter, *args, **kwargs):
"""
diff --git a/faker/utils/distribution.py b/faker/utils/distribution.py
--- a/faker/utils/distribution.py
+++ b/faker/utils/distribution.py
@@ -1,7 +1,8 @@
# coding=utf-8
import bisect
-from faker.generator import mod_random
+from faker.generator import random as mod_random
+
def random_sample(random=None):
if random is None:
| {"golden_diff": "diff --git a/faker/generator.py b/faker/generator.py\n--- a/faker/generator.py\n+++ b/faker/generator.py\n@@ -3,11 +3,12 @@\n from __future__ import unicode_literals\n \n import re\n-import random\n+import random as random_module\n \n \n _re_token = re.compile(r'\\{\\{(\\s?)(\\w+)(\\s?)\\}\\}')\n-mod_random = random.Random()\n+random = random_module.Random()\n+mod_random = random # compat with name released in 0.8\n \n \n class Generator(object):\n@@ -18,7 +19,7 @@\n self.providers = []\n self.__config = dict(\n list(self.__config.items()) + list(config.items()))\n- self.__random = mod_random\n+ self.__random = random\n \n def add_provider(self, provider):\n \n@@ -57,14 +58,14 @@\n \n def seed_instance(self, seed=None):\n \"\"\"Calls random.seed\"\"\"\n- if self.__random == mod_random:\n+ if self.__random == random:\n # create per-instance random obj when first time seed_instance() is called\n- self.__random = random.Random()\n+ self.__random = random_module.Random()\n self.__random.seed(seed)\n \n @classmethod\n def seed(cls, seed=None):\n- mod_random.seed(seed)\n+ random.seed(seed)\n \n def format(self, formatter, *args, **kwargs):\n \"\"\"\ndiff --git a/faker/utils/distribution.py b/faker/utils/distribution.py\n--- a/faker/utils/distribution.py\n+++ b/faker/utils/distribution.py\n@@ -1,7 +1,8 @@\n # coding=utf-8\n \n import bisect\n-from faker.generator import mod_random\n+from faker.generator import random as mod_random\n+\n \n def random_sample(random=None):\n if random is None:\n", "issue": "Reseed code broke in 0.8 unnecessarily\nCommit ea4f189bbff1925d7a7e2d7cbc6e91e2e8a9a3f3 changed the name of the `random.Random()` object in `faker.generator` from `random` to `mod_random`, re-exposing the `random` module as the name `faker.generator.random`. This broke code like:\r\n\r\n```py\r\nfrom faker.generator import random\r\nrandom.seed(1)\r\n``` \r\n\r\nwhich is basically what my `pytest-randomly` plugin does. At first I thought this was #586 but then I saw that every run was broken. 
The failure is silent because `random` is still a name in `faker.generator`, it just now points at the global `random` module \ud83d\ude31\r\n\r\nI suggest just doing some renaming in the module to fix this so old code continues to work.\n", "before_files": [{"content": "# coding=utf-8\n\nimport bisect\nfrom faker.generator import mod_random\n\ndef random_sample(random=None):\n if random is None:\n random = mod_random\n return random.uniform(0.0, 1.0)\n\n\ndef cumsum(it):\n total = 0\n for x in it:\n total += x\n yield total\n\n\ndef choice_distribution(a, p, random=None):\n if random is None:\n random = mod_random\n\n assert len(a) == len(p)\n\n if hasattr(random, 'choices'):\n return random.choices(a, weights=p)[0]\n else:\n cdf = list(cumsum(p))\n normal = cdf[-1]\n cdf2 = [float(i) / float(normal) for i in cdf]\n uniform_sample = random_sample(random=random)\n idx = bisect.bisect_right(cdf2, uniform_sample)\n return a[idx]\n", "path": "faker/utils/distribution.py"}, {"content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\n\nimport re\nimport random\n\n\n_re_token = re.compile(r'\\{\\{(\\s?)(\\w+)(\\s?)\\}\\}')\nmod_random = random.Random()\n\n\nclass Generator(object):\n\n __config = {}\n\n def __init__(self, **config):\n self.providers = []\n self.__config = dict(\n list(self.__config.items()) + list(config.items()))\n self.__random = mod_random\n\n def add_provider(self, provider):\n\n if type(provider) is type:\n provider = provider(self)\n\n self.providers.insert(0, provider)\n\n for method_name in dir(provider):\n # skip 'private' method\n if method_name.startswith('_'):\n continue\n\n faker_function = getattr(provider, method_name)\n\n if hasattr(faker_function, '__call__') or \\\n isinstance(faker_function, (classmethod, staticmethod)):\n # add all faker method to generator\n self.set_formatter(method_name, faker_function)\n\n def provider(self, name):\n try:\n lst = [p for p in self.get_providers()\n if p.__provider__ == name.lower()]\n return lst[0]\n except IndexError:\n return None\n\n def get_providers(self):\n \"\"\"Returns added providers.\"\"\"\n return self.providers\n\n @property\n def random(self):\n return self.__random\n\n def seed_instance(self, seed=None):\n \"\"\"Calls random.seed\"\"\"\n if self.__random == mod_random:\n # create per-instance random obj when first time seed_instance() is called\n self.__random = random.Random()\n self.__random.seed(seed)\n\n @classmethod\n def seed(cls, seed=None):\n mod_random.seed(seed)\n\n def format(self, formatter, *args, **kwargs):\n \"\"\"\n This is a secure way to make a fake from another Provider.\n \"\"\"\n # TODO: data export?\n return self.get_formatter(formatter)(*args, **kwargs)\n\n def get_formatter(self, formatter):\n try:\n return getattr(self, formatter)\n except AttributeError:\n if 'locale' in self.__config:\n msg = 'Unknown formatter \"{0}\" with locale \"{1}\"'.format(\n formatter, self.__config['locale']\n )\n else:\n raise AttributeError('Unknown formatter \"{0}\"'.format(\n formatter\n ))\n raise AttributeError(msg)\n\n def set_formatter(self, name, method):\n \"\"\"\n This method adds a provider method to generator.\n Override this method to add some decoration or logging stuff.\n \"\"\"\n setattr(self, name, method)\n\n def parse(self, text):\n \"\"\"\n Replaces tokens (like '{{ tokenName }}' or '{{tokenName}}')\n with the result from the token method call.\n \"\"\"\n return _re_token.sub(self.__format_token, text)\n\n def __format_token(self, matches):\n formatter = 
list(matches.groups())\n formatter[1] = self.format(formatter[1])\n return ''.join(formatter)\n", "path": "faker/generator.py"}], "after_files": [{"content": "# coding=utf-8\n\nimport bisect\nfrom faker.generator import random as mod_random\n\n\ndef random_sample(random=None):\n if random is None:\n random = mod_random\n return random.uniform(0.0, 1.0)\n\n\ndef cumsum(it):\n total = 0\n for x in it:\n total += x\n yield total\n\n\ndef choice_distribution(a, p, random=None):\n if random is None:\n random = mod_random\n\n assert len(a) == len(p)\n\n if hasattr(random, 'choices'):\n return random.choices(a, weights=p)[0]\n else:\n cdf = list(cumsum(p))\n normal = cdf[-1]\n cdf2 = [float(i) / float(normal) for i in cdf]\n uniform_sample = random_sample(random=random)\n idx = bisect.bisect_right(cdf2, uniform_sample)\n return a[idx]\n", "path": "faker/utils/distribution.py"}, {"content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\n\nimport re\nimport random as random_module\n\n\n_re_token = re.compile(r'\\{\\{(\\s?)(\\w+)(\\s?)\\}\\}')\nrandom = random_module.Random()\nmod_random = random # compat with name released in 0.8\n\n\nclass Generator(object):\n\n __config = {}\n\n def __init__(self, **config):\n self.providers = []\n self.__config = dict(\n list(self.__config.items()) + list(config.items()))\n self.__random = random\n\n def add_provider(self, provider):\n\n if type(provider) is type:\n provider = provider(self)\n\n self.providers.insert(0, provider)\n\n for method_name in dir(provider):\n # skip 'private' method\n if method_name.startswith('_'):\n continue\n\n faker_function = getattr(provider, method_name)\n\n if hasattr(faker_function, '__call__') or \\\n isinstance(faker_function, (classmethod, staticmethod)):\n # add all faker method to generator\n self.set_formatter(method_name, faker_function)\n\n def provider(self, name):\n try:\n lst = [p for p in self.get_providers()\n if p.__provider__ == name.lower()]\n return lst[0]\n except IndexError:\n return None\n\n def get_providers(self):\n \"\"\"Returns added providers.\"\"\"\n return self.providers\n\n @property\n def random(self):\n return self.__random\n\n def seed_instance(self, seed=None):\n \"\"\"Calls random.seed\"\"\"\n if self.__random == random:\n # create per-instance random obj when first time seed_instance() is called\n self.__random = random_module.Random()\n self.__random.seed(seed)\n\n @classmethod\n def seed(cls, seed=None):\n random.seed(seed)\n\n def format(self, formatter, *args, **kwargs):\n \"\"\"\n This is a secure way to make a fake from another Provider.\n \"\"\"\n # TODO: data export?\n return self.get_formatter(formatter)(*args, **kwargs)\n\n def get_formatter(self, formatter):\n try:\n return getattr(self, formatter)\n except AttributeError:\n if 'locale' in self.__config:\n msg = 'Unknown formatter \"{0}\" with locale \"{1}\"'.format(\n formatter, self.__config['locale']\n )\n else:\n raise AttributeError('Unknown formatter \"{0}\"'.format(\n formatter\n ))\n raise AttributeError(msg)\n\n def set_formatter(self, name, method):\n \"\"\"\n This method adds a provider method to generator.\n Override this method to add some decoration or logging stuff.\n \"\"\"\n setattr(self, name, method)\n\n def parse(self, text):\n \"\"\"\n Replaces tokens (like '{{ tokenName }}' or '{{tokenName}}')\n with the result from the token method call.\n \"\"\"\n return _re_token.sub(self.__format_token, text)\n\n def __format_token(self, matches):\n formatter = list(matches.groups())\n formatter[1] = 
self.format(formatter[1])\n return ''.join(formatter)\n", "path": "faker/generator.py"}]} | 1,596 | 407 |
gh_patches_debug_4387 | rasdani/github-patches | git_diff | falconry__falcon-1593 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Official support for CPython 3.8
CPython 3.8.0 was [released Oct 14, 2019](https://www.python.org/downloads/release/python-380/).
Make Falcon officially support CPython 3.8:
- [ ] README
- [ ] Docker images
- [ ] Docs
- [ ] Tox
- [ ] Travis
- [ ] setup.py (see the sketch after this list)
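
For the setup.py item the change is essentially one more trove classifier (a sketch against the current classifiers list; the other entries stay as they are):

```python
classifiers=[
    # ... existing entries ...
    'Programming Language :: Python :: 3.5',
    'Programming Language :: Python :: 3.6',
    'Programming Language :: Python :: 3.7',
    'Programming Language :: Python :: 3.8',  # new
],
```

Tox, Travis and the Docker/docs updates would presumably just add a matching py38 / 3.8 entry to their version matrices.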
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import glob
2 import imp
3 import io
4 import os
5 from os import path
6 import re
7 import sys
8
9 from setuptools import Extension, find_packages, setup
10
11 MYDIR = path.abspath(os.path.dirname(__file__))
12
13 VERSION = imp.load_source('version', path.join('.', 'falcon', 'version.py'))
14 VERSION = VERSION.__version__
15
16 REQUIRES = []
17
18 try:
19 sys.pypy_version_info
20 PYPY = True
21 except AttributeError:
22 PYPY = False
23
24 if PYPY:
25 CYTHON = False
26 else:
27 try:
28 from Cython.Distutils import build_ext
29 CYTHON = True
30 except ImportError:
31 # TODO(kgriffs): pip now ignores all output, so the user
32 # may not see this message. See also:
33 #
34 # https://github.com/pypa/pip/issues/2732
35 #
36 print('\nNOTE: Cython not installed. '
37 'Falcon will still work fine, but may run '
38 'a bit slower.\n')
39 CYTHON = False
40
41 if CYTHON:
42 def list_modules(dirname):
43 filenames = glob.glob(path.join(dirname, '*.py'))
44
45 module_names = []
46 for name in filenames:
47 module, ext = path.splitext(path.basename(name))
48 if module != '__init__':
49 module_names.append(module)
50
51 return module_names
52
53 package_names = [
54 'falcon',
55 'falcon.media',
56 'falcon.routing',
57 'falcon.util',
58 'falcon.vendor.mimeparse',
59 ]
60 ext_modules = [
61 Extension(
62 package + '.' + module,
63 [path.join(*(package.split('.') + [module + '.py']))]
64 )
65 for package in package_names
66 for module in list_modules(path.join(MYDIR, *package.split('.')))
67 ]
68
69 cmdclass = {'build_ext': build_ext}
70
71 else:
72 cmdclass = {}
73 ext_modules = []
74
75
76 def load_description():
77 in_patron_list = False
78 in_patron_replacement = False
79 in_raw = False
80
81 description_lines = []
82
83 # NOTE(kgriffs): PyPI does not support the raw directive
84 for readme_line in io.open('README.rst', 'r', encoding='utf-8'):
85
86 # NOTE(vytas): The patron list largely builds upon raw sections
87 if readme_line.startswith('.. Patron list starts'):
88 in_patron_list = True
89 in_patron_replacement = True
90 continue
91 elif in_patron_list:
92 if not readme_line.strip():
93 in_patron_replacement = False
94 elif in_patron_replacement:
95 description_lines.append(readme_line.lstrip())
96 if readme_line.startswith('.. Patron list ends'):
97 in_patron_list = False
98 continue
99 elif readme_line.startswith('.. raw::'):
100 in_raw = True
101 elif in_raw:
102 if readme_line and not re.match(r'\s', readme_line):
103 in_raw = False
104
105 if not in_raw:
106 description_lines.append(readme_line)
107
108 return ''.join(description_lines)
109
110
111 setup(
112 name='falcon',
113 version=VERSION,
114 description='An unladen web framework for building APIs and app backends.',
115 long_description=load_description(),
116 long_description_content_type='text/x-rst',
117 classifiers=[
118 'Development Status :: 5 - Production/Stable',
119 'Environment :: Web Environment',
120 'Natural Language :: English',
121 'Intended Audience :: Developers',
122 'Intended Audience :: System Administrators',
123 'License :: OSI Approved :: Apache Software License',
124 'Operating System :: MacOS :: MacOS X',
125 'Operating System :: Microsoft :: Windows',
126 'Operating System :: POSIX',
127 'Topic :: Internet :: WWW/HTTP :: WSGI',
128 'Topic :: Software Development :: Libraries :: Application Frameworks',
129 'Programming Language :: Python',
130 'Programming Language :: Python :: Implementation :: CPython',
131 'Programming Language :: Python :: Implementation :: PyPy',
132 'Programming Language :: Python :: 3',
133 'Programming Language :: Python :: 3.5',
134 'Programming Language :: Python :: 3.6',
135 'Programming Language :: Python :: 3.7',
136 ],
137 keywords='wsgi web api framework rest http cloud',
138 author='Kurt Griffiths',
139 author_email='[email protected]',
140 url='https://falconframework.org',
141 license='Apache 2.0',
142 packages=find_packages(exclude=['tests']),
143 include_package_data=True,
144 zip_safe=False,
145 python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
146 install_requires=REQUIRES,
147 cmdclass=cmdclass,
148 ext_modules=ext_modules,
149 tests_require=['testtools', 'requests', 'pyyaml', 'pytest', 'pytest-runner'],
150 entry_points={
151 'console_scripts': [
152 'falcon-bench = falcon.cmd.bench:main',
153 'falcon-print-routes = falcon.cmd.print_routes:main'
154 ]
155 }
156 )
157
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -133,6 +133,7 @@
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
+ 'Programming Language :: Python :: 3.8',
],
keywords='wsgi web api framework rest http cloud',
author='Kurt Griffiths',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -133,6 +133,7 @@\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n+ 'Programming Language :: Python :: 3.8',\n ],\n keywords='wsgi web api framework rest http cloud',\n author='Kurt Griffiths',\n", "issue": "Official support for CPython 3.8\nCPython 3.8.0 was [released Oct 14, 2019](https://www.python.org/downloads/release/python-380/).\r\n\r\nMake Falcon offically support CPython 3.8:\r\n- [ ] README\r\n- [ ] Docker images\r\n- [ ] Docs\r\n- [ ] Tox\r\n- [ ] Travis\r\n- [ ] setup.py\n", "before_files": [{"content": "import glob\nimport imp\nimport io\nimport os\nfrom os import path\nimport re\nimport sys\n\nfrom setuptools import Extension, find_packages, setup\n\nMYDIR = path.abspath(os.path.dirname(__file__))\n\nVERSION = imp.load_source('version', path.join('.', 'falcon', 'version.py'))\nVERSION = VERSION.__version__\n\nREQUIRES = []\n\ntry:\n sys.pypy_version_info\n PYPY = True\nexcept AttributeError:\n PYPY = False\n\nif PYPY:\n CYTHON = False\nelse:\n try:\n from Cython.Distutils import build_ext\n CYTHON = True\n except ImportError:\n # TODO(kgriffs): pip now ignores all output, so the user\n # may not see this message. See also:\n #\n # https://github.com/pypa/pip/issues/2732\n #\n print('\\nNOTE: Cython not installed. '\n 'Falcon will still work fine, but may run '\n 'a bit slower.\\n')\n CYTHON = False\n\nif CYTHON:\n def list_modules(dirname):\n filenames = glob.glob(path.join(dirname, '*.py'))\n\n module_names = []\n for name in filenames:\n module, ext = path.splitext(path.basename(name))\n if module != '__init__':\n module_names.append(module)\n\n return module_names\n\n package_names = [\n 'falcon',\n 'falcon.media',\n 'falcon.routing',\n 'falcon.util',\n 'falcon.vendor.mimeparse',\n ]\n ext_modules = [\n Extension(\n package + '.' + module,\n [path.join(*(package.split('.') + [module + '.py']))]\n )\n for package in package_names\n for module in list_modules(path.join(MYDIR, *package.split('.')))\n ]\n\n cmdclass = {'build_ext': build_ext}\n\nelse:\n cmdclass = {}\n ext_modules = []\n\n\ndef load_description():\n in_patron_list = False\n in_patron_replacement = False\n in_raw = False\n\n description_lines = []\n\n # NOTE(kgriffs): PyPI does not support the raw directive\n for readme_line in io.open('README.rst', 'r', encoding='utf-8'):\n\n # NOTE(vytas): The patron list largely builds upon raw sections\n if readme_line.startswith('.. Patron list starts'):\n in_patron_list = True\n in_patron_replacement = True\n continue\n elif in_patron_list:\n if not readme_line.strip():\n in_patron_replacement = False\n elif in_patron_replacement:\n description_lines.append(readme_line.lstrip())\n if readme_line.startswith('.. Patron list ends'):\n in_patron_list = False\n continue\n elif readme_line.startswith('.. 
raw::'):\n in_raw = True\n elif in_raw:\n if readme_line and not re.match(r'\\s', readme_line):\n in_raw = False\n\n if not in_raw:\n description_lines.append(readme_line)\n\n return ''.join(description_lines)\n\n\nsetup(\n name='falcon',\n version=VERSION,\n description='An unladen web framework for building APIs and app backends.',\n long_description=load_description(),\n long_description_content_type='text/x-rst',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Web Environment',\n 'Natural Language :: English',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Topic :: Internet :: WWW/HTTP :: WSGI',\n 'Topic :: Software Development :: Libraries :: Application Frameworks',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n ],\n keywords='wsgi web api framework rest http cloud',\n author='Kurt Griffiths',\n author_email='[email protected]',\n url='https://falconframework.org',\n license='Apache 2.0',\n packages=find_packages(exclude=['tests']),\n include_package_data=True,\n zip_safe=False,\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',\n install_requires=REQUIRES,\n cmdclass=cmdclass,\n ext_modules=ext_modules,\n tests_require=['testtools', 'requests', 'pyyaml', 'pytest', 'pytest-runner'],\n entry_points={\n 'console_scripts': [\n 'falcon-bench = falcon.cmd.bench:main',\n 'falcon-print-routes = falcon.cmd.print_routes:main'\n ]\n }\n)\n", "path": "setup.py"}], "after_files": [{"content": "import glob\nimport imp\nimport io\nimport os\nfrom os import path\nimport re\nimport sys\n\nfrom setuptools import Extension, find_packages, setup\n\nMYDIR = path.abspath(os.path.dirname(__file__))\n\nVERSION = imp.load_source('version', path.join('.', 'falcon', 'version.py'))\nVERSION = VERSION.__version__\n\nREQUIRES = []\n\ntry:\n sys.pypy_version_info\n PYPY = True\nexcept AttributeError:\n PYPY = False\n\nif PYPY:\n CYTHON = False\nelse:\n try:\n from Cython.Distutils import build_ext\n CYTHON = True\n except ImportError:\n # TODO(kgriffs): pip now ignores all output, so the user\n # may not see this message. See also:\n #\n # https://github.com/pypa/pip/issues/2732\n #\n print('\\nNOTE: Cython not installed. '\n 'Falcon will still work fine, but may run '\n 'a bit slower.\\n')\n CYTHON = False\n\nif CYTHON:\n def list_modules(dirname):\n filenames = glob.glob(path.join(dirname, '*.py'))\n\n module_names = []\n for name in filenames:\n module, ext = path.splitext(path.basename(name))\n if module != '__init__':\n module_names.append(module)\n\n return module_names\n\n package_names = [\n 'falcon',\n 'falcon.media',\n 'falcon.routing',\n 'falcon.util',\n 'falcon.vendor.mimeparse',\n ]\n ext_modules = [\n Extension(\n package + '.' 
+ module,\n [path.join(*(package.split('.') + [module + '.py']))]\n )\n for package in package_names\n for module in list_modules(path.join(MYDIR, *package.split('.')))\n ]\n\n cmdclass = {'build_ext': build_ext}\n\nelse:\n cmdclass = {}\n ext_modules = []\n\n\ndef load_description():\n in_patron_list = False\n in_patron_replacement = False\n in_raw = False\n\n description_lines = []\n\n # NOTE(kgriffs): PyPI does not support the raw directive\n for readme_line in io.open('README.rst', 'r', encoding='utf-8'):\n\n # NOTE(vytas): The patron list largely builds upon raw sections\n if readme_line.startswith('.. Patron list starts'):\n in_patron_list = True\n in_patron_replacement = True\n continue\n elif in_patron_list:\n if not readme_line.strip():\n in_patron_replacement = False\n elif in_patron_replacement:\n description_lines.append(readme_line.lstrip())\n if readme_line.startswith('.. Patron list ends'):\n in_patron_list = False\n continue\n elif readme_line.startswith('.. raw::'):\n in_raw = True\n elif in_raw:\n if readme_line and not re.match(r'\\s', readme_line):\n in_raw = False\n\n if not in_raw:\n description_lines.append(readme_line)\n\n return ''.join(description_lines)\n\n\nsetup(\n name='falcon',\n version=VERSION,\n description='An unladen web framework for building APIs and app backends.',\n long_description=load_description(),\n long_description_content_type='text/x-rst',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Web Environment',\n 'Natural Language :: English',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Topic :: Internet :: WWW/HTTP :: WSGI',\n 'Topic :: Software Development :: Libraries :: Application Frameworks',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n ],\n keywords='wsgi web api framework rest http cloud',\n author='Kurt Griffiths',\n author_email='[email protected]',\n url='https://falconframework.org',\n license='Apache 2.0',\n packages=find_packages(exclude=['tests']),\n include_package_data=True,\n zip_safe=False,\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',\n install_requires=REQUIRES,\n cmdclass=cmdclass,\n ext_modules=ext_modules,\n tests_require=['testtools', 'requests', 'pyyaml', 'pytest', 'pytest-runner'],\n entry_points={\n 'console_scripts': [\n 'falcon-bench = falcon.cmd.bench:main',\n 'falcon-print-routes = falcon.cmd.print_routes:main'\n ]\n }\n)\n", "path": "setup.py"}]} | 1,812 | 105 |