Column schema (name: type, observed range):

url: stringlengths 50-53
repository_url: stringclasses (1 value)
labels_url: stringlengths 64-67
comments_url: stringlengths 59-62
events_url: stringlengths 57-60
html_url: stringlengths 38-43
id: int64 (597k-2.65B)
node_id: stringlengths 18-32
number: int64 (1-6.83k)
title: stringlengths 1-296
user: dict
labels: listlengths 0-5
state: stringclasses (2 values)
locked: bool (2 classes)
assignee: dict
assignees: listlengths 0-4
milestone: dict
comments: int64 (0-211)
created_at: stringlengths 20-20
updated_at: stringlengths 20-20
closed_at: stringlengths 20-20
author_association: stringclasses (3 values)
active_lock_reason: stringclasses (4 values)
body: stringlengths 0-65.6k
closed_by: dict
reactions: dict
timeline_url: stringlengths 59-62
performed_via_github_app: null
state_reason: stringclasses (3 values)
draft: bool (2 classes)
pull_request: dict
is_pull_request: bool (2 classes)
issue_comments: listlengths 0-30
https://api.github.com/repos/psf/requests/issues/2530
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2530/labels{/name}
https://api.github.com/repos/psf/requests/issues/2530/comments
https://api.github.com/repos/psf/requests/issues/2530/events
https://github.com/psf/requests/issues/2530
66,483,378
MDU6SXNzdWU2NjQ4MzM3OA==
2,530
__build__ out of sync? Still at 0x020503, looks like 2.5.3 rather than 2.6.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/1577132?v=4", "events_url": "https://api.github.com/users/hartwork/events{/privacy}", "followers_url": "https://api.github.com/users/hartwork/followers", "following_url": "https://api.github.com/users/hartwork/following{/other_user}", "gists_url": "https://api.github.com/users/hartwork/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hartwork", "id": 1577132, "login": "hartwork", "node_id": "MDQ6VXNlcjE1NzcxMzI=", "organizations_url": "https://api.github.com/users/hartwork/orgs", "received_events_url": "https://api.github.com/users/hartwork/received_events", "repos_url": "https://api.github.com/users/hartwork/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hartwork/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hartwork/subscriptions", "type": "User", "url": "https://api.github.com/users/hartwork", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
1
2015-04-05T21:21:27Z
2021-09-08T23:05:47Z
2015-04-06T21:54:22Z
NONE
resolved
Could it be that ``` __build__ = 0x020503 ``` at https://github.com/kennethreitz/requests/blob/master/requests/__init__.py#L46 went out of sync? It looks like 2.5.3 rather than 2.6.0. Best, Sebastian
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2530/reactions" }
https://api.github.com/repos/psf/requests/issues/2530/timeline
null
completed
null
null
false
[ "Yup, this looks wrong.\n" ]
https://api.github.com/repos/psf/requests/issues/2529
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2529/labels{/name}
https://api.github.com/repos/psf/requests/issues/2529/comments
https://api.github.com/repos/psf/requests/issues/2529/events
https://github.com/psf/requests/issues/2529
66,482,975
MDU6SXNzdWU2NjQ4Mjk3NQ==
2,529
"import requests" takes a full second?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1577132?v=4", "events_url": "https://api.github.com/users/hartwork/events{/privacy}", "followers_url": "https://api.github.com/users/hartwork/followers", "following_url": "https://api.github.com/users/hartwork/following{/other_user}", "gists_url": "https://api.github.com/users/hartwork/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hartwork", "id": 1577132, "login": "hartwork", "node_id": "MDQ6VXNlcjE1NzcxMzI=", "organizations_url": "https://api.github.com/users/hartwork/orgs", "received_events_url": "https://api.github.com/users/hartwork/received_events", "repos_url": "https://api.github.com/users/hartwork/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hartwork/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hartwork/subscriptions", "type": "User", "url": "https://api.github.com/users/hartwork", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2015-04-05T21:16:20Z
2021-09-08T12:01:07Z
2015-04-06T11:20:26Z
NONE
resolved
Hi! I noticed that the `--help` output of a tool of mine was noticably slower than that of my other tools. It took a full second to show up. I traced the lagging down to the call to `import requests` (and [worked around it](https://github.com/hartwork/backup-my-hub/commit/509b30e53cccefd39c68311e456bba813e0be455) by importing requests only after processing command line options). `import requests` takes at least a second on both the SSDed Debian and HDDed Gentoo, with version 2.5.3 as well as 2.6.0. Compared to the 30 milliseconds that other modules take to import, that's a long time: ``` # time python -c 'import requests; print requests.__file__' /usr/lib64/python2.7/site-packages/requests/__init__.pyc real 0m1.099s user 0m1.055s sys 0m0.039s # time python -c 'import argparse; print argparse.__file__' /usr/lib64/python2.7/argparse.pyc real 0m0.030s user 0m0.018s sys 0m0.010s ``` It would be great if you could work on reducing import time. Many thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2529/reactions" }
https://api.github.com/repos/psf/requests/issues/2529/timeline
null
completed
null
null
false
[ "I believe the slow part of importing requests is importing pyopenssl. In theory you could delay the import until the first time a request is made, though I'm pretty sure the team is not in favor of that approach.\n", "> I believe the slow part of importing requests is importing pyopenssl. \n\nThat makes perfect sense:\n\n```\n# time python -c 'import OpenSSL.SSL; print OpenSSL.SSL.__file__'\n/usr/lib64/python2.7/site-packages/OpenSSL/SSL.pyc\n\nreal 0m1.014s\nuser 0m0.991s\nsys 0m0.018s\n```\n", "@hartwork Are you using a new virtualenv for each test? Mostly importing pyopenssl takes a while because cffi does an implicit C compile at import time in some cases, but that should only happen the first time.\n", "No virtualenv over here, no.\n", "```\n# for i in {1..10} ; do sh -c \"time python -c 'import OpenSSL.SSL'\" |& sed -n '2,2p' ; done\nreal 0m1.004s\nreal 0m0.993s\nreal 0m0.996s\nreal 0m0.994s\nreal 0m0.997s\nreal 0m0.999s\nreal 0m0.998s\nreal 0m0.991s\nreal 0m0.999s\nreal 0m0.997s\n```\n", "Interesting.\n\nFundamentally, this is a `cryptography` bug, probably pyca/cryptography#1764. I'm afraid until that's resolved we cannot do anything about this.\n", "On my little ARM system from NextThing this takes SEVEN SECONDS.", "@afcady Please report the issue to the appropriate upstream." ]
https://api.github.com/repos/psf/requests/issues/2528
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2528/labels{/name}
https://api.github.com/repos/psf/requests/issues/2528/comments
https://api.github.com/repos/psf/requests/issues/2528/events
https://github.com/psf/requests/issues/2528
66,482,196
MDU6SXNzdWU2NjQ4MjE5Ng==
2,528
2.5.1 labeled latest release at GitHub (rather than 2.6.0)?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1577132?v=4", "events_url": "https://api.github.com/users/hartwork/events{/privacy}", "followers_url": "https://api.github.com/users/hartwork/followers", "following_url": "https://api.github.com/users/hartwork/following{/other_user}", "gists_url": "https://api.github.com/users/hartwork/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hartwork", "id": 1577132, "login": "hartwork", "node_id": "MDQ6VXNlcjE1NzcxMzI=", "organizations_url": "https://api.github.com/users/hartwork/orgs", "received_events_url": "https://api.github.com/users/hartwork/received_events", "repos_url": "https://api.github.com/users/hartwork/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hartwork/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hartwork/subscriptions", "type": "User", "url": "https://api.github.com/users/hartwork", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-04-05T21:05:46Z
2021-09-08T23:05:49Z
2015-04-05T22:32:01Z
NONE
resolved
Hi! Looking at https://github.com/kennethreitz/requests/releases I need to click on "Show 3 newer tags" to see 2.6.0 uncovered. 2.5.1 is marked the latest release. Is that intended? It's 2.6.0 on PyPI https://pypi.python.org/pypi/requests/ . Best, Sebastian
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2528/reactions" }
https://api.github.com/repos/psf/requests/issues/2528/timeline
null
completed
null
null
false
[ "We don't really have a consistent policy of updating GitHub's releases. For our release notes we strongly recommend looking at PyPI or our documentation.\n", "So... no fix? 2.5.1 remains the latest on GitHub?\n", "@hartwork as @Lukasa has already kindly explained, PyPI is the source of truth. Not GitHub. You might notice that there are _several_ releases missing notes on GitHub.\n", "My point are not the releases notes. My point is the fix potential user confusion. The release page up here raises more questions than it answers. If it's trouble to keep in sync, okay. If not, my vote to keep it synced, even just versions without release notes. That's all.\n", "@hartwork it is not trivial.\n", "Okay :)\n" ]
https://api.github.com/repos/psf/requests/issues/2527
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2527/labels{/name}
https://api.github.com/repos/psf/requests/issues/2527/comments
https://api.github.com/repos/psf/requests/issues/2527/events
https://github.com/psf/requests/issues/2527
66,447,848
MDU6SXNzdWU2NjQ0Nzg0OA==
2,527
AttributeError- LWPCookieJar instance has no attribute 'copy'
{ "avatar_url": "https://avatars.githubusercontent.com/u/4114154?v=4", "events_url": "https://api.github.com/users/zhaoguixu/events{/privacy}", "followers_url": "https://api.github.com/users/zhaoguixu/followers", "following_url": "https://api.github.com/users/zhaoguixu/following{/other_user}", "gists_url": "https://api.github.com/users/zhaoguixu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zhaoguixu", "id": 4114154, "login": "zhaoguixu", "node_id": "MDQ6VXNlcjQxMTQxNTQ=", "organizations_url": "https://api.github.com/users/zhaoguixu/orgs", "received_events_url": "https://api.github.com/users/zhaoguixu/received_events", "repos_url": "https://api.github.com/users/zhaoguixu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zhaoguixu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhaoguixu/subscriptions", "type": "User", "url": "https://api.github.com/users/zhaoguixu", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-04-05T15:05:40Z
2021-09-08T23:05:48Z
2015-04-06T11:15:55Z
NONE
resolved
Hi, In version 2.x: <b>[test-case]</b> ``` python import requests from cookielib import LWPCookieJar cookiejar = LWPCookieJar() req = requests.Request('GET', 'http://httpbin.org/get', cookies=cookiejar) r = req.prepare() r.copy() ``` <b>[traceback]</b> ... AttributeError: LWPCookieJar instance has no attribute 'copy' <b>[comments]</b> It seems that not all instances of cookiejar have the copy method, but in the copy method of PreparedRequest, it call the copy method of cookies without any checks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2527/reactions" }
https://api.github.com/repos/psf/requests/issues/2527/timeline
null
completed
null
null
false
[ "What is your proposed solution?\n", "Hi @sigmavirus24,\nThanks for quick response. This situation is a little tricky that the base CookieJar class does not supply the 'copy' method but the RequestsCookieJar does. However, in the method 'prepare_cookies' of PreparedRequests, you permit all the instances of the subclass of CookieJar preserved as its original form(which may not contain 'copy' method). As I suppose, one uniformed way is to remove the copy method of RequestsCookieJar and use the python copy function when copy is needed(This may need changes of other places). The other is more conservative, that is, restrict the self._cookies of PreparedRequest to be only the instance of RequestsCookieJar or else check whether having copy method before calling this method of a CookieJar instance.\n\n``` python\ndef prepare_cookies(self, cookies):\n \"\"\"Prepares the given HTTP cookie data.\"\"\"\n\n if isinstance(cookies, cookielib.CookieJar): #<=====\n self._cookies = cookies\n else:\n self._cookies = cookiejar_from_dict(cookies)\n ...\n```\n", "@zhaoguixu check out #2534 \n", "As mentioned in #2077, self._cookies may be None\n\n``` python\np._cookies = _copy_cookie_jar(self._cookies) # need check whether it is None\n```\n", "Thanks for catching that @zhaoguixu. I updated the PR to account for that.\n" ]
https://api.github.com/repos/psf/requests/issues/2526
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2526/labels{/name}
https://api.github.com/repos/psf/requests/issues/2526/comments
https://api.github.com/repos/psf/requests/issues/2526/events
https://github.com/psf/requests/issues/2526
66,282,887
MDU6SXNzdWU2NjI4Mjg4Nw==
2,526
Add support for digest authentication with an HTTP proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/705937?v=4", "events_url": "https://api.github.com/users/spectrumjade/events{/privacy}", "followers_url": "https://api.github.com/users/spectrumjade/followers", "following_url": "https://api.github.com/users/spectrumjade/following{/other_user}", "gists_url": "https://api.github.com/users/spectrumjade/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/spectrumjade", "id": 705937, "login": "spectrumjade", "node_id": "MDQ6VXNlcjcwNTkzNw==", "organizations_url": "https://api.github.com/users/spectrumjade/orgs", "received_events_url": "https://api.github.com/users/spectrumjade/received_events", "repos_url": "https://api.github.com/users/spectrumjade/repos", "site_admin": false, "starred_url": "https://api.github.com/users/spectrumjade/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/spectrumjade/subscriptions", "type": "User", "url": "https://api.github.com/users/spectrumjade", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-04-04T06:25:15Z
2021-09-08T23:05:50Z
2015-04-04T06:50:25Z
NONE
resolved
Currently, requests only supports HTTP basic authentication to a proxy. It would be very useful to support digest authentication with a proxy as well. Additionally, there should be some way to signal that requests should not attempt to pass proxy credentials in plaintext before receiving the digest nonce from the proxy.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2526/reactions" }
https://api.github.com/repos/psf/requests/issues/2526/timeline
null
completed
null
null
false
[ "@justintime32 Thanks for the feature request!\n\nIt should be entirely possible to write a short authentication handler that does exactly what you need: a quick Google showed [this](http://stackoverflow.com/questions/13506455/how-to-pass-proxy-authentication-requires-digest-auth-by-using-python-requests) as the top result. Because of the ease of adding such a thing yourself, and because it's relatively infrequently used, we don't believe there's much advantage in bringing Proxy Digest Auth into the core library.\n", "Hi @Lukasa, I actually started by writing an auth handler. The main issue I ran into was that it only works for non-SSL requests. SSL requests through a proxy are made through a CONNECT tunnel, and the CONNECT request must be authenticated. The auth handler appears to be unable to hook into the proxy tunnel creation step, and therefore SSL requests will always fail.\n\nAre there other ways to add this authentication without modifying the core? Maybe some more hooks around the proxy tunnel creation?\n", "Unfortunately, what you need is not possible with `httplib`. To do the CONNECT tunnel in httplib, you end up in the `_tunnel` method, which has the following code:\n\n``` python\n def _tunnel(self):\n connect_str = \"CONNECT %s:%d HTTP/1.0\\r\\n\" % (self._tunnel_host,\n self._tunnel_port)\n connect_bytes = connect_str.encode(\"ascii\")\n self.send(connect_bytes)\n for header, value in self._tunnel_headers.items():\n header_str = \"%s: %s\\r\\n\" % (header, value)\n header_bytes = header_str.encode(\"latin-1\")\n self.send(header_bytes)\n self.send(b'\\r\\n')\n\n response = self.response_class(self.sock, method=self._method)\n (version, code, message) = response._read_status()\n\n if code != 200:\n self.close()\n raise OSError(\"Tunnel connection failed: %d %s\" % (code,\n message.strip()))\n```\n\nAs you can see, there is no way to hook into a 407 response here. We can only do this by overriding the way the HTTP connection functions, which is something we do in [urllib3](https://github.com/shazow/urllib3): I recommend opening a feature request there.\n" ]
https://api.github.com/repos/psf/requests/issues/2525
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2525/labels{/name}
https://api.github.com/repos/psf/requests/issues/2525/comments
https://api.github.com/repos/psf/requests/issues/2525/events
https://github.com/psf/requests/pull/2525
66,162,705
MDExOlB1bGxSZXF1ZXN0MzI1OTQxOTE=
2,525
Issue 2062: Added custom header precedence info
{ "avatar_url": "https://avatars.githubusercontent.com/u/4780134?v=4", "events_url": "https://api.github.com/users/benjaminran/events{/privacy}", "followers_url": "https://api.github.com/users/benjaminran/followers", "following_url": "https://api.github.com/users/benjaminran/following{/other_user}", "gists_url": "https://api.github.com/users/benjaminran/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/benjaminran", "id": 4780134, "login": "benjaminran", "node_id": "MDQ6VXNlcjQ3ODAxMzQ=", "organizations_url": "https://api.github.com/users/benjaminran/orgs", "received_events_url": "https://api.github.com/users/benjaminran/received_events", "repos_url": "https://api.github.com/users/benjaminran/repos", "site_admin": false, "starred_url": "https://api.github.com/users/benjaminran/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/benjaminran/subscriptions", "type": "User", "url": "https://api.github.com/users/benjaminran", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-04-03T15:06:08Z
2021-09-08T08:00:55Z
2015-04-03T15:57:54Z
CONTRIBUTOR
resolved
Let me know if anything should be adjusted.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2525/reactions" }
https://api.github.com/repos/psf/requests/issues/2525/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2525.diff", "html_url": "https://github.com/psf/requests/pull/2525", "merged_at": "2015-04-03T15:57:54Z", "patch_url": "https://github.com/psf/requests/pull/2525.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2525" }
true
[ "\\o/ This is beautiful @benjaminran! Thanks so much! :cake: :sparkles:\n", "This is a fantastic pull request. Thanks so much!\n" ]
https://api.github.com/repos/psf/requests/issues/2524
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2524/labels{/name}
https://api.github.com/repos/psf/requests/issues/2524/comments
https://api.github.com/repos/psf/requests/issues/2524/events
https://github.com/psf/requests/issues/2524
65,993,903
MDU6SXNzdWU2NTk5MzkwMw==
2,524
TypeError: __str__ returned non-string (type Error)
{ "avatar_url": "https://avatars.githubusercontent.com/u/204508?v=4", "events_url": "https://api.github.com/users/mktums/events{/privacy}", "followers_url": "https://api.github.com/users/mktums/followers", "following_url": "https://api.github.com/users/mktums/following{/other_user}", "gists_url": "https://api.github.com/users/mktums/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mktums", "id": 204508, "login": "mktums", "node_id": "MDQ6VXNlcjIwNDUwOA==", "organizations_url": "https://api.github.com/users/mktums/orgs", "received_events_url": "https://api.github.com/users/mktums/received_events", "repos_url": "https://api.github.com/users/mktums/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mktums/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mktums/subscriptions", "type": "User", "url": "https://api.github.com/users/mktums", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2015-04-02T19:21:20Z
2021-09-08T07:00:23Z
2015-04-02T19:28:43Z
NONE
resolved
OS X 10.9 Python 2.7.9 ``` $ pip freeze cffi==0.9.2 cryptography==0.8.1 enum34==1.0.4 gitdb==0.6.4 GitPython==0.3.6 ndg-httpsclient==0.3.3 pyasn1==0.1.7 pycparser==2.10 pyOpenSSL==0.14 requests==2.6.0 requests-toolbelt==0.3.1 six==1.9.0 smmap==0.9.0 ``` On request to website with TLS 1.0 `https://codeforge.lbl.gov/frs/download.php/409/fastbit-ibis1.3.8.tar.gz` I'm getting: ``` python Traceback (most recent call last): File "./main.py", line 118, in <module> resp = downloader.run() File "/Users/mktums/Projects/brew404/downloaders.py", line 64, in run resp = self.fetch() File "/Users/mktums/Projects/brew404/downloaders.py", line 59, in fetch resp = s.send(req, verify=verify_ssl, allow_redirects=True, timeout=20) File "/Users/mktums/.virtualenvs/brew/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/Users/mktums/.virtualenvs/brew/lib/python2.7/site-packages/requests/adapters.py", line 370, in send timeout=timeout File "/Users/mktums/.virtualenvs/brew/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen body=body, headers=headers) File "/Users/mktums/.virtualenvs/brew/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 344, in _make_request self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) File "/Users/mktums/.virtualenvs/brew/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 314, in _raise_timeout if 'timed out' in str(err) or 'did not complete (read)' in str(err): # Python 2.6 TypeError: __str__ returned non-string (type Error) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2524/reactions" }
https://api.github.com/repos/psf/requests/issues/2524/timeline
null
completed
null
null
false
[ "Thanks for reporting this! This is currently tracked under shazow/urllib3#556, so I'm going to close this issue to centralise discussion there.\n", "@Lukasa Thanks for fast reply!\n", "how to fix this bug? any ideas?\n", "@gdonzy Please read the above issue, which links to a related tracking issue.\n", "Thanks a lot. When I use \"requests.get(\"xxxx\", verify=False)\", the bug is fixed.\n", "Can I re open this? I am getting this even for a signed certificate. I cannot use verify = False\r\n", "No, this was fixed a long time ago. You should update instead. " ]
https://api.github.com/repos/psf/requests/issues/2523
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2523/labels{/name}
https://api.github.com/repos/psf/requests/issues/2523/comments
https://api.github.com/repos/psf/requests/issues/2523/events
https://github.com/psf/requests/pull/2523
65,926,630
MDExOlB1bGxSZXF1ZXN0MzI1MTE3MTI=
2,523
HTTPDigestAuth - Making it thread-safe
{ "avatar_url": "https://avatars.githubusercontent.com/u/7661068?v=4", "events_url": "https://api.github.com/users/exvito/events{/privacy}", "followers_url": "https://api.github.com/users/exvito/followers", "following_url": "https://api.github.com/users/exvito/following{/other_user}", "gists_url": "https://api.github.com/users/exvito/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/exvito", "id": 7661068, "login": "exvito", "node_id": "MDQ6VXNlcjc2NjEwNjg=", "organizations_url": "https://api.github.com/users/exvito/orgs", "received_events_url": "https://api.github.com/users/exvito/received_events", "repos_url": "https://api.github.com/users/exvito/repos", "site_admin": false, "starred_url": "https://api.github.com/users/exvito/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/exvito/subscriptions", "type": "User", "url": "https://api.github.com/users/exvito", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2015-04-02T13:36:09Z
2021-09-08T06:00:57Z
2015-10-05T14:09:45Z
CONTRIBUTOR
resolved
The existing code counts the number of 401 responses in the num_401_calls authenticator attribute. This is in place so as to ensure the necessary auth header is sent, while avoiding infinite 401 loops (issue #547). This commit makes num_401_calls an instance of threading.local() (previously an integer), using num_401_calls.value as the counter. It ensures that concurrent authentication requests get each their own counter and behave as expected (otherwise every other concurrent request would have its authentication fail).
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2523/reactions" }
https://api.github.com/repos/psf/requests/issues/2523/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2523.diff", "html_url": "https://github.com/psf/requests/pull/2523", "merged_at": "2015-10-05T14:09:45Z", "patch_url": "https://github.com/psf/requests/pull/2523.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2523" }
true
[ "@Lukasa , @kennethreitz - Per your suggestion, I just updated the PR with commit e65360d.\nThanks for your input.\n", "I think the code is racy in nearly all the areas, not just the 401 counter.\n- build_digest_header is updating a number of internal variables including nonce lastnonce, etc.\n- if there are several unauthenticated request at the same time, you will get several challenges, but stored in the same variables for subsequent request.\n\nI think a more stable approach would be to store the whole state in a per thread variable. This has the drawback of doing one auth per thread, instead of trying to reuse the chalenge data from other threads.\n", "@tardyp : You are correct. This needs further improvement, just as you point out there are other attributes currently shared accross threads that will lead to auth failures in certain conditions.\n\nI'll extend the per-thread approach of num_401_calls to the remaining state-tracking attributes and update the PR for review.\n\nRegarding the drawback you refer to I think it comes as a reasonable compromise:\n- This approach won't hurt the currently working single-thread case.\n- It guarantees correct authentication in a multi-threaded case (which requests claims to support).\n- Simple and easy to understand code change.\n", "@exvito : @vincentxb and I are on the same team, and have story on our sprint to resolve this issue. I was working on it this morning, and am at the point of creating a unit test.\n", "Updated with commit e8d9bc5. Highlights:\n- All state now in thread local storage.\n- Factored out state initialization to `init_per_thread_state()` which must be called from `__init__()` for the regular, single-threaded case and, eventually, from `__call__()` in the cases where threads that did not create the auth handler invoke it.\n- Maybe the `self.tl` name can be better: I opted for short, project leaders may prefer `self.thread_local` or some variation.\n- Maybe the test for thread local state initialization in `__call__()` can take a different approach.\n\nHere's a contribution. Thanks for your feedback.\nPS: @tardyp does that mean you'll work from this PR or you'll take a different approach and suggest I drop this PR?\n", "This looks entirely reasonable to me. It would be ideal to have some tests for this if at all possible.\n", "@exvito, we worked on the same approach at the same time. \n\nI opted for the name thd for the local variable, and did the same ini_per_thread call. I think it is not necessery in **init** however, only in _call_\n\n :+1: \n\nI have a unit test on a separate commit:\n\nhttps://github.com/tardyp/requests/commit/e15ed2c46823183b73250503fc44146f979eb891\n\nPlease cherry-pick it!\n", "> I think it is not necessery in `__init__` however, only in `__call__`.\n\nYou are correct. Additionally, after taking a peek at your code, I opted for your approach of ensuring per-thread state is initialized once and only once based on the `self.tl.init` attribute.\n\nUpdating with two commits:\n- Your cherry picked test case (confirmed to work nicely, thanks).\n- A simpler approach at initializing per-thread state.\n", "I still like this idea, but I have some notes in the diff. =)\n", "Is this code spawning a new thread?\n", "@kennethreitz The unit test for it is, but the product code doesn't. It's just to make it safe when it's called from multiple threads.\n", ":+1:\n" ]
https://api.github.com/repos/psf/requests/issues/2522
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2522/labels{/name}
https://api.github.com/repos/psf/requests/issues/2522/comments
https://api.github.com/repos/psf/requests/issues/2522/events
https://github.com/psf/requests/issues/2522
65,835,263
MDU6SXNzdWU2NTgzNTI2Mw==
2,522
Save and load session
{ "avatar_url": "https://avatars.githubusercontent.com/u/1593287?v=4", "events_url": "https://api.github.com/users/JimHokanson/events{/privacy}", "followers_url": "https://api.github.com/users/JimHokanson/followers", "following_url": "https://api.github.com/users/JimHokanson/following{/other_user}", "gists_url": "https://api.github.com/users/JimHokanson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/JimHokanson", "id": 1593287, "login": "JimHokanson", "node_id": "MDQ6VXNlcjE1OTMyODc=", "organizations_url": "https://api.github.com/users/JimHokanson/orgs", "received_events_url": "https://api.github.com/users/JimHokanson/received_events", "repos_url": "https://api.github.com/users/JimHokanson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/JimHokanson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/JimHokanson/subscriptions", "type": "User", "url": "https://api.github.com/users/JimHokanson", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-04-02T04:35:04Z
2020-10-08T18:13:20Z
2015-04-02T06:07:52Z
NONE
resolved
I'd like to be able to save and load a session to disk. I think this means just saving and loading the cookies, but perhaps there are other nuances that would need to be included. I'm interested in using Robobrowser (https://github.com/jmcarp/robobrowser) which wraps requests. I'd like to be able to persist the browser between running my application, which means being able to save and load the session to disk. There is a SO post regarding how to do this. I think tellingly, the most popular answer is wrong. http://stackoverflow.com/questions/13030095/how-to-save-requests-python-cookies-to-a-file Thanks, Jim
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2522/reactions" }
https://api.github.com/repos/psf/requests/issues/2522/timeline
null
completed
null
null
false
[ "The easiest thing to do is just to pickle the whole `session` object:\n\n``` python\nimport requests, requests.utils, pickle\nsession = requests.session()\n# Make some calls\nwith open('somefile', 'w') as f:\n pickle.dump(session, f)\n```\n\n``` python\nwith open('somefile') as f:\n session = pickle.load(f)\n```\n", "As an addendum: looks like Python3 expects the 'b' flag during write and read:\r\n\r\n```python\r\nwith open('somefile', 'wb') as f:\r\n pickle.dump(session, f)\r\n```\r\n```python\r\nwith open('somefile', 'rb') as f:\r\n session = pickle.load(f)\r\n```\r\n", "I tried using this approach, however I could not get TLS session resumption to work after reloading the pickled session. Does anyone know how to achieve this behavior? " ]
https://api.github.com/repos/psf/requests/issues/2521
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2521/labels{/name}
https://api.github.com/repos/psf/requests/issues/2521/comments
https://api.github.com/repos/psf/requests/issues/2521/events
https://github.com/psf/requests/issues/2521
65,644,529
MDU6SXNzdWU2NTY0NDUyOQ==
2,521
ConnectionError and socket.gaierror occurred on IPv6-enabled system
{ "avatar_url": "https://avatars.githubusercontent.com/u/5781687?v=4", "events_url": "https://api.github.com/users/shichao-an/events{/privacy}", "followers_url": "https://api.github.com/users/shichao-an/followers", "following_url": "https://api.github.com/users/shichao-an/following{/other_user}", "gists_url": "https://api.github.com/users/shichao-an/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/shichao-an", "id": 5781687, "login": "shichao-an", "node_id": "MDQ6VXNlcjU3ODE2ODc=", "organizations_url": "https://api.github.com/users/shichao-an/orgs", "received_events_url": "https://api.github.com/users/shichao-an/received_events", "repos_url": "https://api.github.com/users/shichao-an/repos", "site_admin": false, "starred_url": "https://api.github.com/users/shichao-an/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/shichao-an/subscriptions", "type": "User", "url": "https://api.github.com/users/shichao-an", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-04-01T09:32:58Z
2021-09-08T23:05:50Z
2015-04-01T19:34:44Z
NONE
resolved
I tried to make a request with a tool that uses python-requests on a Ubuntu 14.04 server with IPv6 enabled (with inet6 interface configured at `/etc/network/interfaces`) and encountered the following errors: ``` bash $ 115down Traceback (most recent call last): File "/usr/local/bin/115down", line 130, in <module> main() File "/usr/local/bin/115down", line 126, in main args.sub_num, args.count, args.sub_count, args.tasks) File "/usr/local/bin/115down", line 64, in get_entries api.login(username, password, section) File "/usr/local/lib/python2.7/dist-packages/u115/api.py", line 248, in login if self.has_logged_in: File "/usr/local/lib/python2.7/dist-packages/u115/api.py", line 297, in has_logged_in r = self.http.get(CHECKPOINT_URL) File "/usr/local/lib/python2.7/dist-packages/u115/api.py", line 56, in get r = self.session.get(url, params=params) File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 467, in get return self.request('GET', url, **kwargs) File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 455, in request resp = self.send(prep, **send_kwargs) File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 558, in send r = adapter.send(request, **kwargs) File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPConnectionPool(host='passport.115.com', port=80): Max retries exceeded with url: /?ct=ajax&ac=ajax_check_point (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known) Exception requests.exceptions.ConnectionError: ConnectionError(MaxRetryError("HTTPConnectionPool(host='passport.115.com', port=80): Max retries exceeded with url: /?ct=ajax&ac=ajax_check_point (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)",),) in <bound method API.__del__ of <u115.api.API object at 0x7fe060092210>> ignored ``` When disabled IPv6 by commenting relevant lines in `/etc/network/interfaces` and reboot, everything works. Anyone have any ideas?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2521/reactions" }
https://api.github.com/repos/psf/requests/issues/2521/timeline
null
completed
null
null
false
[ "`socket.gaierror` seems to suggest that you're failing a DNS lookup. Try this:\n\n``` python\nimport socket\nsocket.getaddrinfo('passport.115.com', 80)\n```\n\nin both IPv6 and IPv4 mode, and see if the problem reproduces.\n", "To reproduce,\n\n```\n>>> import socket\n>>> socket.getaddrinfo('passport.115.com', 80)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nsocket.gaierror: [Errno -2] Name or service not known\n```\n\nBut I think I got the cause, when I tried to disable IPv6 editing the default `/etc/network/interfaces`, I found DNS is set only for IPv6 of the eth0 interface but not for IPv4:\n\n```\ndns-nameservers 2001:4860:4860::8844 2001:4860:4860::8888 8.8.8.8\n```\n\nSo I manually added 8.8.8.8 and 8.8.4.4 to IPv4 and it worked.\n\nIt seems the default system is not \"properly\" configured for this scenario\n", "I'm glad you were able to diagnose the problem! \n" ]
https://api.github.com/repos/psf/requests/issues/2520
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2520/labels{/name}
https://api.github.com/repos/psf/requests/issues/2520/comments
https://api.github.com/repos/psf/requests/issues/2520/events
https://github.com/psf/requests/issues/2520
65,415,631
MDU6SXNzdWU2NTQxNTYzMQ==
2,520
chardet.detect() call in models.py fails when using IronPython.
{ "avatar_url": "https://avatars.githubusercontent.com/u/11410671?v=4", "events_url": "https://api.github.com/users/dxtodorovic/events{/privacy}", "followers_url": "https://api.github.com/users/dxtodorovic/followers", "following_url": "https://api.github.com/users/dxtodorovic/following{/other_user}", "gists_url": "https://api.github.com/users/dxtodorovic/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dxtodorovic", "id": 11410671, "login": "dxtodorovic", "node_id": "MDQ6VXNlcjExNDEwNjcx", "organizations_url": "https://api.github.com/users/dxtodorovic/orgs", "received_events_url": "https://api.github.com/users/dxtodorovic/received_events", "repos_url": "https://api.github.com/users/dxtodorovic/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dxtodorovic/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dxtodorovic/subscriptions", "type": "User", "url": "https://api.github.com/users/dxtodorovic", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-03-31T10:10:13Z
2021-09-08T23:05:50Z
2015-03-31T10:50:41Z
NONE
resolved
In models.py, chardet.detect(self.content)['encoding'] fails to work with IronPython with an exception: 'Expected a bytes object, not a unicode object'. It is due to a check in chardet's __init__ ( isinstance(aBuf, unicode) ) and IronPython's way of handling strings as unicode type. Excerpt from chardet's **init**: ``` def detect(aBuf): if ((version_info < (3, 0) and isinstance(aBuf, unicode)) or (version_info >= (3, 0) and not isinstance(aBuf, bytes))): raise ValueError('Expected a bytes object, not a unicode object') ``` I tried passing bytes(self.content) to chardet instead but that solution fails short because in compat.py, we are redefining bytes to str, which under IronPython is unicode. Can we perhaps add some way of handling IronPython properly or is it something that should be fixed in chardet instead?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2520/reactions" }
https://api.github.com/repos/psf/requests/issues/2520/timeline
null
completed
null
null
false
[ "I think this needs to be fixed in chardet itself, sadly. Happily, it's now under active development, over at [chardet/chardet](https://github.com/chardet/chardet), so opening the issue over there should get it promptly looked at.\n", "Actually this looks like a problem with IronPython that voids requests expectations. `Response.content` should always be a `bytes` object. That's what we expect to get back from the socket via httplib. If we aren't getting bytes back, that sounds like a bug in the way IronPython has copied in the standard library. We have no way of encoding it back to bytes intelligently and IronPython should really really really not be doing that.\n" ]
https://api.github.com/repos/psf/requests/issues/2519
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2519/labels{/name}
https://api.github.com/repos/psf/requests/issues/2519/comments
https://api.github.com/repos/psf/requests/issues/2519/events
https://github.com/psf/requests/issues/2519
65,243,921
MDU6SXNzdWU2NTI0MzkyMQ==
2,519
Client Certificates w/Passphrases?
{ "avatar_url": "https://avatars.githubusercontent.com/u/6197517?v=4", "events_url": "https://api.github.com/users/tdussa/events{/privacy}", "followers_url": "https://api.github.com/users/tdussa/followers", "following_url": "https://api.github.com/users/tdussa/following{/other_user}", "gists_url": "https://api.github.com/users/tdussa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tdussa", "id": 6197517, "login": "tdussa", "node_id": "MDQ6VXNlcjYxOTc1MTc=", "organizations_url": "https://api.github.com/users/tdussa/orgs", "received_events_url": "https://api.github.com/users/tdussa/received_events", "repos_url": "https://api.github.com/users/tdussa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tdussa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tdussa/subscriptions", "type": "User", "url": "https://api.github.com/users/tdussa", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" } ]
closed
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2024-05-19T18:29:04Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/34", "id": 11073254, "labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels", "node_id": "MI_kwDOABTKOs4AqPbm", "number": 34, "open_issues": 0, "state": "open", "title": "Bankruptcy", "updated_at": "2024-05-20T14:37:16Z", "url": "https://api.github.com/repos/psf/requests/milestones/34" }
33
2015-03-30T15:55:43Z
2024-05-20T14:36:24Z
2024-05-20T14:36:23Z
NONE
null
Hi, client certificates can be specified with the "cert" parameter. If I pass an encrypted client certificate, the underlying OpenSSL call will query for the corresponding passphrase, but that is not really a feasible way of handling this for a larger session with multiple calls, because the password is obviously not cached in any way. Is there any way to pass the passphrase to a connection with API calls so it is then passed on to OpenSSL? Cheers, Toby.
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 3, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 3, "url": "https://api.github.com/repos/psf/requests/issues/2519/reactions" }
https://api.github.com/repos/psf/requests/issues/2519/timeline
null
completed
null
null
false
[ "Hi @tdussa,\n\nIn the future, please ask _questions_ on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). Quoting from [the docs](http://docs.python-requests.org/en/latest/api/?highlight=cert).\n\n> cert – (optional) if String, path to ssl client cert file (.pem). If Tuple, (‘cert’, ‘key’) pair.\n\nSo if you have the key file, you can use that. @t-8ch can correct me if I'm wrong, but I don't believe we (or urllib3) support sending a passphrase. I think the lack of support is specifically a limitation of the way the SSL module [loads verification data](https://docs.python.org/3/library/ssl.html?highlight=ssl#ssl.SSLContext.load_verify_locations).\n", "Hi,\n\nTHX for your quick response. Admittedly, I was hoping that this would just be a question, but I am afraid that it will actually be a feature request if I read you correctly.\n\nSo I have tried to read through the relevant portions of the request and urllib3 code, and I had hoped that I had overlooked some obvious way of passing a passphrase to pyOpenSSL.\n\nActually, the underlying SSL code DOES support encrypted (password-protected) client certificates; the relevant function that is called by urllib3 is not [load_verify_locations](https://docs.python.org/3/library/ssl.html?highlight=ssl#ssl.SSLContext.load_verify_locations) but [load_cert_chain](https://docs.python.org/3/library/ssl.html?highlight=ssl#ssl.SSLContext.load_cert_chain) (which I also think is badly named). So pyOpenSSL does support what I'm looking for, but I haven't been able to figure out a way of actually passing a password argument through requests/urllib3 to pyOpenSSL. I have started thinking about how to best patch this into both requests and urllib3, but the actual code path is not that obvious to me right now.\n\nTHX & Cheers,\nToby.\n", "So the appropriate way to do this, would be for me to pick work back up on https://github.com/shazow/urllib3/pull/507 since it would provide the API you're looking for. You would create a SSLContext object and call `load_cert_chain` yourself with the password and give that to us to use.\n", "Sounds about right, yeah. So if that would be possible, that'd be awesome.\n\nTHX & Cheers,\nToby.\n", "Is there any estimate as to when this feature might be added?\n", "@tdussa Not at this time I'm afraid. =(\n", "Bummer. :(\nLooking forward to it. ;-)\n", "Waiting for this feature as well. Any estimates?\n", "Nope. There's a big chunk of work to be done for this.\n", "Is it not even in the next milestone?\n", "Not currently, no. We have a finite amount of resources to spend, and the development team is stretched pretty thin across a wide number of projects.\n", "any advice on how to get to urllib3's `context.load_cert_chain` from `requests.Session` or prepared request ? I'm looking for a workaround\n", "@traut There isn't a good one, really. You can attempt to use the TransportAdapter's `init_poolmanager` method to pass objects into urllib3.\n", "any progress on this issue?\n", "@traut Yes. We're a few releases away from users being able to use TransportAdapters to provide SSLContext objects to urllib3, which will resolve this issue.\n", "nice! thanks for the update, @Lukasa \n", "Looks like this is now possible. Here's an example of how I made this work:\r\n\r\nhttps://gist.github.com/aiguofer/1eb881ccf199d4aaa2097d87f93ace6a", "My question is very related, but not quite the same. Is there a way to pass the unencrypted certificates and key as raw_bytes or as file objects, rather than _file paths_? 
Since for security reasons, one may not want to store the certificates on disk.", "@amiralia Right now the answer is \"not easily\". The Python standard library `ssl` module exposes no way to load client certs/keys from bytes: only from files. This is despite the fact that OpenSSL itself does provide these tools.\n\nOne way around this is to ensure you're using Requests PyOpenSSL support. If you do that, you can get an `SSLContext` object that uses PyOpenSSL instead by calling `requests.packages.urllib3.util.ssl_.create_urllib3_context()`. That will let you call PyOpenSSL methods like [`use_certificate`](https://pyopenssl.readthedocs.io/en/stable/api/ssl.html#OpenSSL.SSL.Context.use_certificate), which will work with files loaded from memory.\n\nUnfortunately, until the stdlib changes its API, that is the only option.", "I tried that, but even with using the following code for injection, and then creating the context as described, the SSL_Context object didn't have `use_certificate` or `use_privatekey` methods.\r\n`import urllib3.contrib.pyopenssl\r\nurllib3.contrib.pyopenssl.inject_into_urllib3()`", "The object itself does not, but it has a `ctx` object on it that does. You'll need to look at the actual code in that module to see how it works.", "Ok, I got it to work, but there was a very weird bug where the injection overwrote the variables under `requests.packages.urllib3.util`, but not `requests.packages.urllib3.util.ssl_`, including the important `requests.packages.urllib3.util.ssl_.SSLContext`. I resolved it by doing a manual monkey patch in my code of the remaining variables. ", "Thanks again everyone!", "Here's an alternative. Re-encode the cert to not require a passphrase. \r\n\r\nRun this script passing the .p12 as an argument. It'll prompt for the passphrase and generate a .pem that doesn't need one.\r\nThen it asks for a new passphrase (twice, for validation) that's used to build a new .p12. You can leave it blank and the result is a .p12 and .pem that don't require passphrases.\r\n\r\n```\r\n#!/bin/bash -e\r\nnp_pem=${1/.p12/_np.pem}\r\nnp_p12=${np_pem/pem/p12}\r\nopenssl pkcs12 -in $1 -nodes -out $np_pem\r\necho \"Press <CR> twice here for empty passphrase\"\r\nopenssl pkcs12 -export -in $np_pem -out $np_p12\r\n```", "Thanks bedge! That's good to know. But for our security requirements, we're mandated by the company to keep passphrases. ", "workaround idea: maybe it's possible to re-encode into a temp file that never hits the disk, yet can be accessed as named file through `/dev/fd/N`? `tempfile.TemporaryFile()`, unix only. \r\nSecurity depends on [O_TMPFILE support](http://man7.org/linux/man-pages/man2/open.2.html) or unpredictability of file name, and on non-tmpfs filesystems on question whether contents for an unnamed inode might still be flushed out to disk (though not accessible from any directory).\r\nSee also https://unix.stackexchange.com/questions/74497/single-process-accessible-temporary-file\r\n\r\nEDIT: note that other processes by same user can see _and_ read the temp file via a path like `/proc/20782/fd/3`. (The symlink reads as broken `3 -> '/tmp/#345829 (deleted)'` but opening does work. Rescuing deleted but still open files is a deliberate use case of /proc/NN/fd...) \r\nThere are of course other attack vectors from processes by same user, e.g. 
`ptrace`...", "Here is a solution that I found by taking inspiration from this article: \r\n[Creating a Python requests session using a passphrase protected Client side Cert](https://gist.github.com/aiguofer/1eb881ccf199d4aaa2097d87f93ace6a)\r\n\r\n~~~~\r\nimport ssl\r\nimport requests\r\nfrom requests.adapters import HTTPAdapter\r\n\r\nclass SSLAdapter(HTTPAdapter):\r\n\r\n def __init__(self, certfile, password):\r\n self.context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)\r\n self.context.load_cert_chain(certfile=certfile, password=password)\r\n super().__init__()\r\n\r\n def init_poolmanager(self, *args, **kwargs):\r\n kwargs['ssl_context'] = self.context\r\n return super().init_poolmanager(*args, **kwargs)\r\n\r\nsession = requests.session()\r\nsession.mount('https://my_protected_site.com', SSLAdapter('my_certificate.crt', 'my_passphrase'))\r\n~~~~\r\n", "Yes, that's what I ended up using as well those years ago. It should be\ndoable for requests to wrap that logic in a function call for\nget_secure_session?\nAmirali Abdullah\n\nSoftware Engineering\n\n\n\n<http://www.facebook.com/Qualtrics> <http://www.twitter.com/Qualtrics>\n<http://www.instagram.com/qualtrics>\n<http://www.linkedin.com/company/qualtrics>\n\n\nOn Sun, May 19, 2019 at 8:31 AM qfayet <[email protected]> wrote:\n\n> Here is a solution that I found by taking inspiration from this article:\n> https://gist.github.com/aiguofer/1eb881ccf199d4aaa2097d87f93ace6a\n>\n> import ssl\n> import requests\n> from requests.adapters import HTTPAdapter\n>\n> class SSLAdapter(HTTPAdapter):\n>\n> def init_poolmanager(self, *args, **kwargs):\n> context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)\n> context.load_cert_chain(certfile='my_certificate.crt',\n> password='my_password')\n> kwargs['ssl_context'] = context\n> return super().init_poolmanager(*args, **kwargs)\n>\n> session = requests.session()\n> session.mount('https://my_protected_site.com', SSLAdapter())\n>\n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/kennethreitz/requests/issues/2519?email_source=notifications&email_token=AE4W6UK3AZ7QPKBDWM6M7QDPWFQC7A5CNFSM4A64TK5KYY3PNVWWK3TUL52HS4DFVREXG43VMVBW63LNMVXHJKTDN5WW2ZLOORPWSZGODVXDMJQ#issuecomment-493762086>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/AE4W6UNZRK5TI7Z2PI46DRDPWFQC7ANCNFSM4A64TK5A>\n> .\n>\n", "I believe this is a duplicate of #1573?", "AFAIK, it's still not supported today (at least according to documentation). Looks like it has been requested for a long time. Is there any plan for this ?" ]
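The thread above converges on building an `ssl.SSLContext` with `load_cert_chain` (which, unlike the plain `cert=` argument, accepts a key passphrase) and handing it to a transport adapter. A minimal sketch of that pattern follows; the hostname, certificate file name, and passphrase are placeholders, and a reasonably recent requests/urllib3 is assumed.

```python
import ssl

import requests
from requests.adapters import HTTPAdapter


class ClientCertAdapter(HTTPAdapter):
    """Transport adapter that supplies a passphrase-protected client certificate."""

    def __init__(self, certfile, password, **kwargs):
        # Build the context before HTTPAdapter.__init__, which calls init_poolmanager.
        self._context = ssl.create_default_context()
        self._context.load_cert_chain(certfile=certfile, password=password)
        super().__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        # urllib3's PoolManager accepts a ready-made SSLContext.
        kwargs["ssl_context"] = self._context
        return super().init_poolmanager(*args, **kwargs)


session = requests.Session()
session.mount("https://example.com", ClientCertAdapter("client.pem", "passphrase"))
# response = session.get("https://example.com/protected")
```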
https://api.github.com/repos/psf/requests/issues/2518
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2518/labels{/name}
https://api.github.com/repos/psf/requests/issues/2518/comments
https://api.github.com/repos/psf/requests/issues/2518/events
https://github.com/psf/requests/pull/2518
64,325,068
MDExOlB1bGxSZXF1ZXN0MzE5NDUwMjA=
2,518
shorter and faster version extraction from __init__.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/439279?v=4", "events_url": "https://api.github.com/users/deronnax/events{/privacy}", "followers_url": "https://api.github.com/users/deronnax/followers", "following_url": "https://api.github.com/users/deronnax/following{/other_user}", "gists_url": "https://api.github.com/users/deronnax/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/deronnax", "id": 439279, "login": "deronnax", "node_id": "MDQ6VXNlcjQzOTI3OQ==", "organizations_url": "https://api.github.com/users/deronnax/orgs", "received_events_url": "https://api.github.com/users/deronnax/received_events", "repos_url": "https://api.github.com/users/deronnax/repos", "site_admin": false, "starred_url": "https://api.github.com/users/deronnax/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/deronnax/subscriptions", "type": "User", "url": "https://api.github.com/users/deronnax", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2015-03-25T17:20:49Z
2021-09-08T08:00:56Z
2015-03-27T12:49:29Z
CONTRIBUTOR
resolved
@sigmavirus24 I thought it could be improved a bit. Tell me what you think.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2518/reactions" }
https://api.github.com/repos/psf/requests/issues/2518/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2518.diff", "html_url": "https://github.com/psf/requests/pull/2518", "merged_at": "2015-03-27T12:49:29Z", "patch_url": "https://github.com/psf/requests/pull/2518.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2518" }
true
[ "if you think readability has been tampered too much, I'll make a pull request with this less-pure-but-softer version:\n\n```\n- for line in fd:\n- m = reg.match(line)\n- if m:\n- version = m.group(1)\n- break\n+ version = reg.search(fd.read())\n```\n", "No objections from me. :+1:\n", "Do you have benchmarks? Also, why are we optimizing something that is run at most once by a user (during pip install)?\n", "It's not optimizing. It's making it simpler. And thus that it's simpler also makes faster.\nWhat you did was opening the file, buffering it, crawling it line by line (triggering a search for line terminators whereas we don't care about) and applying a regex on each line.\nWhat I do is opening and reading the whole file and applying the regex on the content.\nThis uninteresting [microbenchmark](https://gist.github.com/deronnax/984499ee7ef62b9b81f4) shows us it's 3 time faster, but that's not my point. It's simpler, that's what matter :)\n", "Either way, this change is incomplete in its current form. I've left feedback on what is necessary to fix it.\n", "Yeah, sorry for the cardet, I thought about it and forgot.\nI think using the dollar and thus requiring the \"version\" line ends with a newline is a bad idea : it might end with a whitespace or a comment, it's valid.\nI think the cardet suffices. And it thus behaves the same as your version did ;)\n", "> I think using the dollar and thus requiring the \"version\" line ends with a newline is a bad idea : it might end with a whitespace or a comment, it's valid.\n\nNo one asked for you to add anything other than the `^`.\n\nThanks for fixing this up. Any further feedback @Lukasa?\n", "Nope. Fine by me. =)\n", "The\n\n> Please fix the expression to check for line beginnings and **endings**\n\n misled me :)\n", "I misspoke twice then! Sorry for the confusion @deronnax. Thanks for the contribution! :cake: \n", "You're welcome :)\n" ]
https://api.github.com/repos/psf/requests/issues/2517
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2517/labels{/name}
https://api.github.com/repos/psf/requests/issues/2517/comments
https://api.github.com/repos/psf/requests/issues/2517/events
https://github.com/psf/requests/issues/2517
64,016,384
MDU6SXNzdWU2NDAxNjM4NA==
2,517
ZeroReturnError exception on GET request to facebook.com
{ "avatar_url": "https://avatars.githubusercontent.com/u/6208933?v=4", "events_url": "https://api.github.com/users/chrj/events{/privacy}", "followers_url": "https://api.github.com/users/chrj/followers", "following_url": "https://api.github.com/users/chrj/following{/other_user}", "gists_url": "https://api.github.com/users/chrj/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/chrj", "id": 6208933, "login": "chrj", "node_id": "MDQ6VXNlcjYyMDg5MzM=", "organizations_url": "https://api.github.com/users/chrj/orgs", "received_events_url": "https://api.github.com/users/chrj/received_events", "repos_url": "https://api.github.com/users/chrj/repos", "site_admin": false, "starred_url": "https://api.github.com/users/chrj/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chrj/subscriptions", "type": "User", "url": "https://api.github.com/users/chrj", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-03-24T14:51:19Z
2021-09-08T23:05:51Z
2015-03-24T15:43:04Z
NONE
resolved
Running on Python 2.7.8 with: - `pyOpenSSL==0.14` - `ndg-httpsclient==0.3.3` - `pyasn1==0.1.7` - `requests==2.5.1` When I run this: ``` python requests.get("http://www.facebook.com/", headers={"Connection" : "close"}) ``` I get the following exception: ``` Traceback (most recent call last): File "<stdin>", line 1, in <module> File "[venv]/local/lib/python2.7/site-packages/requests/api.py", line 65, in get return request('get', url, **kwargs) File "[venv]/local/lib/python2.7/site-packages/requests/api.py", line 49, in request response = session.request(method=method, url=url, **kwargs) File "[venv]/local/lib/python2.7/site-packages/requests/sessions.py", line 461, in request resp = self.send(prep, **send_kwargs) File "[venv]/local/lib/python2.7/site-packages/requests/sessions.py", line 599, in send history = [resp for resp in gen] if allow_redirects else [] File "[venv]/local/lib/python2.7/site-packages/requests/sessions.py", line 192, in resolve_redirects allow_redirects=False, File "[venv]/local/lib/python2.7/site-packages/requests/sessions.py", line 610, in send r.content File "[venv]/local/lib/python2.7/site-packages/requests/models.py", line 730, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "[venv]/local/lib/python2.7/site-packages/requests/models.py", line 655, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "[venv]/local/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 256, in stream data = self.read(amt=amt, decode_content=decode_content) File "[venv]/local/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 186, in read data = self._fp.read(amt) File "/usr/lib/python2.7/httplib.py", line 567, in read s = self.fp.read(amt) File "/usr/lib/python2.7/socket.py", line 380, in read data = self._sock.recv(left) File "[venv]/local/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv data = self.connection.recv(*args, **kwargs) File "[venv]/local/lib/python2.7/site-packages/OpenSSL/SSL.py", line 995, in recv self._raise_ssl_error(self._ssl, result) File "[venv]/local/lib/python2.7/site-packages/OpenSSL/SSL.py", line 851, in _raise_ssl_error raise ZeroReturnError() OpenSSL.SSL.ZeroReturnError ``` According to the pyOpenSSL docs on `ZeroReturnError` this is an indication that remote closed the connection cleanly: http://pyopenssl.readthedocs.org/en/latest/api/ssl.html#OpenSSL.SSL.ZeroReturnError I'm guessing this exception should probably be handled somewhere along the call path instead of propagating to the caller. Running under Python 3.4.2 the request goes through.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2517/reactions" }
https://api.github.com/repos/psf/requests/issues/2517/timeline
null
completed
null
null
false
[ "This is better filed over on [urllib3](/shazow/urllib3).\n", "Also, ping @shazow \n", "This should be fixed in 2.5.2 or later but you should upgrade to 2.6.0 either way.\n", "No need to report this to urllib3 either.\n", "Confirmed. \n\nI could swear I checked for an updated version before reporting :)\n\nThanks.\n", "No worries @chrj. Hopefully now that there's a bug here, people will see it and not open a new one. =D It's still appreciated. :D :+1: \n" ]
https://api.github.com/repos/psf/requests/issues/2516
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2516/labels{/name}
https://api.github.com/repos/psf/requests/issues/2516/comments
https://api.github.com/repos/psf/requests/issues/2516/events
https://github.com/psf/requests/pull/2516
63,991,614
MDExOlB1bGxSZXF1ZXN0MzE4MjQ3MDk=
2,516
Import the builtin json first.
{ "avatar_url": "https://avatars.githubusercontent.com/u/129501?v=4", "events_url": "https://api.github.com/users/ionelmc/events{/privacy}", "followers_url": "https://api.github.com/users/ionelmc/followers", "following_url": "https://api.github.com/users/ionelmc/following{/other_user}", "gists_url": "https://api.github.com/users/ionelmc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ionelmc", "id": 129501, "login": "ionelmc", "node_id": "MDQ6VXNlcjEyOTUwMQ==", "organizations_url": "https://api.github.com/users/ionelmc/orgs", "received_events_url": "https://api.github.com/users/ionelmc/received_events", "repos_url": "https://api.github.com/users/ionelmc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ionelmc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ionelmc/subscriptions", "type": "User", "url": "https://api.github.com/users/ionelmc", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 44501305, "name": "Not Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTMwNQ==", "url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge" } ]
closed
true
null
[]
null
15
2015-03-24T13:16:12Z
2021-09-08T07:00:46Z
2015-04-02T18:41:52Z
CONTRIBUTOR
resolved
`simplejson` segfaults for me ([details](https://github.com/simplejson/simplejson/issues/114)) and it looks like it's not maintained. It should only be used as a last resort.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2516/reactions" }
https://api.github.com/repos/psf/requests/issues/2516/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2516.diff", "html_url": "https://github.com/psf/requests/pull/2516", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2516.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2516" }
true
[ "This is a very deliberate import situation. requests does not support a version of Python that lacks the standard library version. simplejson is imported first specifically because there are people who would rather it be used than the standard library's json module. As such I'm strongly -1 on including this change, but I'd still like to hear @Lukasa's thoughts.\n", "@sigmavirus24 Which people would rather use simplejson? For what reason?\n", "@ionelmc it was an oft-requested feature. simplejson supposedly has better performance. As for \"which people\" my job is not to catalogue all users of requests relying on _X_ feature. The people using this are likely easy to find in the issue history of requests. Searching \"simplejson\" should probably get you 98% of the way there.\n", "It was merged via https://github.com/kennethreitz/requests/pull/710 - doesn't have any performance tests. IMO not using unmaintained libraries is reason enough to revert that change..\n", "> IMO not using unmaintained libraries is reason enough to revert that change..\n\nYou keep calling simplejson unmaintained. Looking at the issue you linked, your issue received a nearly immediate response. Granted the author hasn't responded since the last time, but it's a very specific problem on a very specific version of python that they don't have immediate access to. I wouldn't call that unmaintained. If it were unmaintained, you would never have received a response whatsoever.\n", "Also, even if we do want to accept this, we can't do it before 3.0.0 because it's a breaking API change (people are expecting this to use simplejson in certain cases and removing it is breaking those expectations).\n", "So I looked into the bug on simplejson and I can't reproduce it. Also, I forgot to mention earlier that this change is functionally equivalent to only ever importing the standard library's `json` module (for interested parties).\n\nI'd still like input from @Lukasa although I'm not rather convinced that this isn't a bug requests needs to be concerned about.\n", "@sigmavirus24 Seems your simplejson tests were incomplete, see https://github.com/simplejson/simplejson/issues/114 for details.\n\nVictor Stinner [seemed to be able to reproduce the issue](https://gist.github.com/ionelmc/811bbf8ceed66ca83a2a). But alas, cpython devs aren't interested in fixing simplejson. Neither am I.\n", "My position here is that the short-term fix is simple: if simplejson is segfaulting, don't use it.\n\nIn the longer term, I don't really understand why we're bothering. If people really want to use simplejson they can, it's not that hard to monkeypatch. So for 3.0.0 I'd like to remove the conditional import.\n", "Regardless, we certainly shouldn't merge this, because there's no point having the fallback: if we're going to import the stdlib we should just only import the stdlib. That import will _never_ fail.\n", "Python 2.5 is not supported right? (`json` was added in 2.6)\n", "Correct.\n\n> On 27 Mar 2015, at 11:04, Ionel Cristian Mărieș [email protected] wrote:\n> \n> Python 2.5 is not supported right?\n> \n> —\n> Reply to this email directly or view it on GitHub.\n", "`json` and `simplejson` are the same library. In Python 2.6, where it is called `json`, it was not shipped with the speed-boosting c extensions enabled. So, in Python 2.6, it could be considered best practice to use `simplejson` if it is available. \n\nThis is no longer the case with Python 2.7, however. Oh well. It's still a nice design consideration, in my opinion. 
\n", "@kennethreitz they are in fact slightly different, the builtin json receiving serveral bugfixes simplejson didn't.\n", "There shouldn't be any bugfixes in json that are not also in simplejson, if you know of any and you report them I can make sure they get applied.\n" ]
https://api.github.com/repos/psf/requests/issues/2515
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2515/labels{/name}
https://api.github.com/repos/psf/requests/issues/2515/comments
https://api.github.com/repos/psf/requests/issues/2515/events
https://github.com/psf/requests/issues/2515
63,795,326
MDU6SXNzdWU2Mzc5NTMyNg==
2,515
Can't copy-paste international URL
{ "avatar_url": "https://avatars.githubusercontent.com/u/56778?v=4", "events_url": "https://api.github.com/users/cool-RR/events{/privacy}", "followers_url": "https://api.github.com/users/cool-RR/followers", "following_url": "https://api.github.com/users/cool-RR/following{/other_user}", "gists_url": "https://api.github.com/users/cool-RR/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cool-RR", "id": 56778, "login": "cool-RR", "node_id": "MDQ6VXNlcjU2Nzc4", "organizations_url": "https://api.github.com/users/cool-RR/orgs", "received_events_url": "https://api.github.com/users/cool-RR/received_events", "repos_url": "https://api.github.com/users/cool-RR/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cool-RR/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cool-RR/subscriptions", "type": "User", "url": "https://api.github.com/users/cool-RR", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-23T18:21:47Z
2021-09-08T23:05:51Z
2015-03-23T18:29:35Z
NONE
resolved
Observe this link to an article: http://alaxon.co.il/article/%D7%9C%D7%95-%D7%99%D7%9B%D7%9C%D7%95-%D7%97%D7%AA%D7%95%D7%9C%D7%99%D7%9D-%D7%9C%D7%93%D7%91%D7%A8/ If you click it, you'll see Hebrew letters in your address bar. (At least in Chrome.) The article works. Now, try accessing it in requests: ``` >>> import requests >>> requests.get('http://alaxon.co.il/article/%D7%9C%D7%95-%D7%99%D7%9B%D7%9C%D7%95-%D7%97%D7%AA%D7%95%D7%9C%D7%99%D7%9D-%D7%9C%D7%93%D7%91%D7%A8/') Starting new HTTP connection (1): alaxon.co.il <Response [403]> ``` Why is that?
{ "avatar_url": "https://avatars.githubusercontent.com/u/56778?v=4", "events_url": "https://api.github.com/users/cool-RR/events{/privacy}", "followers_url": "https://api.github.com/users/cool-RR/followers", "following_url": "https://api.github.com/users/cool-RR/following{/other_user}", "gists_url": "https://api.github.com/users/cool-RR/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cool-RR", "id": 56778, "login": "cool-RR", "node_id": "MDQ6VXNlcjU2Nzc4", "organizations_url": "https://api.github.com/users/cool-RR/orgs", "received_events_url": "https://api.github.com/users/cool-RR/received_events", "repos_url": "https://api.github.com/users/cool-RR/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cool-RR/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cool-RR/subscriptions", "type": "User", "url": "https://api.github.com/users/cool-RR", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2515/reactions" }
https://api.github.com/repos/psf/requests/issues/2515/timeline
null
completed
null
null
false
[ "Note that I get the same 403 error response when I put the Hebrew URL in `requests.get` without the percentage escape sequences. If I put this Hebrew URL in the browser, it works fine.\n", "My mistake, this was a completely unrelated problem with the site.\n", "Heh, I was just diving in. Thanks for solving it yourself!\n" ]
https://api.github.com/repos/psf/requests/issues/2514
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2514/labels{/name}
https://api.github.com/repos/psf/requests/issues/2514/comments
https://api.github.com/repos/psf/requests/issues/2514/events
https://github.com/psf/requests/pull/2514
63,747,071
MDExOlB1bGxSZXF1ZXN0MzE3NDA5Mjk=
2,514
Revert "Minor Patch TypeError thrown"
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
2
2015-03-23T15:11:32Z
2021-09-08T08:00:57Z
2015-03-23T15:13:14Z
CONTRIBUTOR
resolved
Reverts kennethreitz/requests#2513. The change was made to the vendored copy of urllib3, for which pull requests are not accepted.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2514/reactions" }
https://api.github.com/repos/psf/requests/issues/2514/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2514.diff", "html_url": "https://github.com/psf/requests/pull/2514", "merged_at": "2015-03-23T15:13:14Z", "patch_url": "https://github.com/psf/requests/pull/2514.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2514" }
true
[ "Further, that breaks the API of urllib3 by raising an exception that is not a urllib3 exception and would introduce a breaking change to requests.\n", "Yeah, this is straight up my bad.\n" ]
https://api.github.com/repos/psf/requests/issues/2513
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2513/labels{/name}
https://api.github.com/repos/psf/requests/issues/2513/comments
https://api.github.com/repos/psf/requests/issues/2513/events
https://github.com/psf/requests/pull/2513
63,627,017
MDExOlB1bGxSZXF1ZXN0MzE3MDUzNDg=
2,513
Minor Patch TypeError thrown
{ "avatar_url": "https://avatars.githubusercontent.com/u/1904543?v=4", "events_url": "https://api.github.com/users/ropwareJB/events{/privacy}", "followers_url": "https://api.github.com/users/ropwareJB/followers", "following_url": "https://api.github.com/users/ropwareJB/following{/other_user}", "gists_url": "https://api.github.com/users/ropwareJB/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ropwareJB", "id": 1904543, "login": "ropwareJB", "node_id": "MDQ6VXNlcjE5MDQ1NDM=", "organizations_url": "https://api.github.com/users/ropwareJB/orgs", "received_events_url": "https://api.github.com/users/ropwareJB/received_events", "repos_url": "https://api.github.com/users/ropwareJB/repos", "site_admin": true, "starred_url": "https://api.github.com/users/ropwareJB/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ropwareJB/subscriptions", "type": "User", "url": "https://api.github.com/users/ropwareJB", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-03-23T05:03:25Z
2021-09-08T08:00:57Z
2015-03-23T06:18:48Z
CONTRIBUTOR
resolved
Python 2.7: str(err) was throwing a TypeError because the exception doesn't override the 'toString' method `__str__`. Wrapped the offending block in a quick try block, catching only TypeError exceptions.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2513/reactions" }
https://api.github.com/repos/psf/requests/issues/2513/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2513.diff", "html_url": "https://github.com/psf/requests/pull/2513", "merged_at": "2015-03-23T06:18:48Z", "patch_url": "https://github.com/psf/requests/pull/2513.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2513" }
true
[ "Thanks! :cake:\n", "@Montycarlo please see #2514. This should have been opened at [urllib3](/shazow/urllib3) and the change needs to be much better since this will introduce a breaking change to urllib3.\n", "Sorry @Montycarlo! I should have caught that earlier. =(\n", "Sorry for wasting your time! Keep up the awesome work :)\n" ]
https://api.github.com/repos/psf/requests/issues/2512
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2512/labels{/name}
https://api.github.com/repos/psf/requests/issues/2512/comments
https://api.github.com/repos/psf/requests/issues/2512/events
https://github.com/psf/requests/pull/2512
63,424,725
MDExOlB1bGxSZXF1ZXN0MzE2NzM3NTk=
2,512
Expand on what 'elapsed' means.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-03-21T16:27:44Z
2021-09-08T08:00:58Z
2015-03-22T19:51:17Z
MEMBER
resolved
Resolves #2511.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2512/reactions" }
https://api.github.com/repos/psf/requests/issues/2512/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2512.diff", "html_url": "https://github.com/psf/requests/pull/2512", "merged_at": "2015-03-22T19:51:17Z", "patch_url": "https://github.com/psf/requests/pull/2512.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2512" }
true
[]
https://api.github.com/repos/psf/requests/issues/2511
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2511/labels{/name}
https://api.github.com/repos/psf/requests/issues/2511/comments
https://api.github.com/repos/psf/requests/issues/2511/events
https://github.com/psf/requests/issues/2511
63,412,190
MDU6SXNzdWU2MzQxMjE5MA==
2,511
Documentation unclear on Response.elapsed
{ "avatar_url": "https://avatars.githubusercontent.com/u/350846?v=4", "events_url": "https://api.github.com/users/jribbens/events{/privacy}", "followers_url": "https://api.github.com/users/jribbens/followers", "following_url": "https://api.github.com/users/jribbens/following{/other_user}", "gists_url": "https://api.github.com/users/jribbens/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jribbens", "id": 350846, "login": "jribbens", "node_id": "MDQ6VXNlcjM1MDg0Ng==", "organizations_url": "https://api.github.com/users/jribbens/orgs", "received_events_url": "https://api.github.com/users/jribbens/received_events", "repos_url": "https://api.github.com/users/jribbens/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jribbens/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jribbens/subscriptions", "type": "User", "url": "https://api.github.com/users/jribbens", "user_view_type": "public" }
[]
closed
true
null
[]
null
15
2015-03-21T14:36:36Z
2021-09-08T23:05:51Z
2015-03-22T19:51:17Z
NONE
resolved
The documentation says that `Response.elapsed` is "the amount of time elapsed between sending the request and the arrival of the response". However it is very unclear what this actually means - the time to the first byte of the headers, the time to the first byte of the body, the time to the end of the body, etc? What value does it, and the other corresponding values in the `Response.history` list's objects, have if the response involved one or more redirects? Does it change if you access `Response.content`? Does it make a difference if `stream` is set on the request? A great deal more clarification on what this value actually tells you would be very helpful.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2511/reactions" }
https://api.github.com/repos/psf/requests/issues/2511/timeline
null
completed
null
null
false
[ "Good point!\n\nThe most accurate answer is that it's the time taken between passing the request to `httplib` and getting the response from it. This means it's time to the end of the _headers_ and the headers are finished parsing. Therefore, each element in `Response.history` measures its own `elapsed` value. It does not change if you access `Response.content` or set `stream=True`.\n\nWe should certainly write this in the documentation. =)\n", "See #2512.\n", "Actually @Lukasa that's not exactly correct. See https://github.com/kennethreitz/requests/pull/2209 for more details.\n", "To be clear, it seems to me that `elapsed` serves almost no purpose at all if it does not work as @Lukasa describes - measuring the total time taken is trivial to do in the application, but measuring the \"time to first byte\" is impossible if Requests does not provide that information itself.\n", "Maybe I'm going mad, but my reading of #2209 suggests that my description of what happens is accurate, but that it may not be what we _intend_ to happen. Is that correct?\n", "If I remember #2209 correctly, there's a difference in behaviour between behaviour with `stream=True` and `stream=False`. In the former it is as you described, in the latter it's the time until the download finished.\n", "Not according to the code. According to the code the `stream` kwarg has no effect on the `HTTPAdapter`.\n", "This sort of confusion is why I thought it important that the expected behaviour be documented ;-)\nTL;DR for the below: it looks like the behaviour changed between versions of Requests and @sigmavirus24 used to be correct but now @Lukasa is correct...\n\nI've set up two URLs. `redirect.php`:\n\n``` php\n<? sleep(1); header(\"Location: /delay.php\"); flush(); sleep(2); ?>\n<p>Foo</p>\n```\n\n`delay.php`:\n\n``` php\n<p>Hello</p>\n<? flush(); sleep(5); ?>\n<p>world!</p>\n```\n\nWith Requests 2.2.1, this is what happens:\n\n``` python\n>>> n=datetime.datetime.now(); r=requests.get(\"http://localhost/redirect.php\"); print(r.history[0].elapsed, r.elapsed, datetime.datetime.now()-n);\n0:00:03.023007 0:00:05.010399 0:00:08.049038\n>>> n=datetime.datetime.now(); r=requests.get(\"http://localhost/redirect.php\", stream=True); print(r.history[0].elapsed, r.elapsed, datetime.datetime.now()-n);\n0:00:01.026764 0:00:00.012600 0:00:03.054618\n```\n\nWith Requests 2.6.0, this happens:\n\n``` python\n>>> n=datetime.datetime.now(); r=requests.get(\"http://localhost/redirect.php\"); print(r.history[0].elapsed, r.elapsed, datetime.datetime.now()-n);\n0:00:01.020450 0:00:00.009139 0:00:08.033785\n>>> n=datetime.datetime.now(); r=requests.get(\"http://localhost/redirect.php\", stream=True); print(r.history[0].elapsed, r.elapsed, datetime.datetime.now()-n);\n0:00:01.029155 0:00:00.013359 0:00:03.044638\n```\n\nSo with `stream=True` the behaviour is consistent between versions, and matches what @Lukasa describes. With `stream=False`, in 2.2.1 `elapsed` changes from \"time to first byte\" to \"time to last byte\", but in Requests 2.6.0 there is no difference.\n\nCan I humbly suggest that this change be regarded as a \"bug fix\" rather than a \"bug to fix\"? The new behaviour is more logical, consistent and useful, and to change it would be to remove a feature (i.e. the ability to get TTFB with `stream=False`).\n", "I'm actually curious when and how this changed because it's a change that should have been documented. 
At the least, we should now close #2209, or provide another attribute to replace the old behaviour to get the time to last byte when `stream=False`.\n", "I guess part of the problem would be, with `stream=True`, Requests simply cannot know the \"time to last byte\", until the entire body has been read (and in the case of automatically-handled redirects, it may never be available). Add to this the fact that finding \"time to last byte\" in the application is utterly trivial (as shown in my one-liners above), it might do more harm than good to add a \"total_time\" value that will only sometimes be available.\n", "> in the case of automatically-handled redirects, it may never be available\n\nTo anyone else reading, this is inherently incorrect. `elapsed` is available on every response object and will be TTFB. There attribute will only be correct for that request/response pair though. So a total time with redirects will simply be equivalent to:\n\n``` python\nsum((r.elapsed for r in response.history), response.elapsed)\n```\n\nWe also consume the entirety of each response's body on redirects to prevent a socket leak.\n", "@sigmavirus24 you misread my comment. I said that time to _last_ byte may not be available, not that `elapsed` (time to _first_ byte) may not be available. And yes, that `sum` expression is exactly what I am using in my application to determine TTFB :-)\n", "Final question on this - @Lukasa's documentation change specifies that the DNS-lookup time and socket connect() time are excluded from `elapsed`, which seems peculiar if true. Is it true?\n", "It _may_ be true. It is TTLB for that response. If to get that byte we need to create a socket, then yes, it'll include DNS lookup. If we are able to re-use one, it will not.\n", "That's great, that's what I expected (assuming 'TTLB' was a typo for 'TTFB'). The documentation change is still wrong then I'm afraid :-( But the code behaviour is what I need :-)\n" ]
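The recipe given in the thread for totalling `elapsed` across a redirect chain, alongside the trivial wall-clock measurement it is contrasted with (the URL is a placeholder):

```python
import datetime

import requests

start = datetime.datetime.now()
response = requests.get("http://example.com/redirect")
wall_clock = datetime.datetime.now() - start

# Each response's `elapsed` covers only its own request/response pair (time to
# the end of the headers), so a redirect chain has to be summed explicitly.
total_elapsed = sum((r.elapsed for r in response.history), response.elapsed)

print(wall_clock, total_elapsed)
```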
https://api.github.com/repos/psf/requests/issues/2510
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2510/labels{/name}
https://api.github.com/repos/psf/requests/issues/2510/comments
https://api.github.com/repos/psf/requests/issues/2510/events
https://github.com/psf/requests/issues/2510
63,408,132
MDU6SXNzdWU2MzQwODEzMg==
2,510
TypeError: getresponse() got an unexpected keyword argument 'buffering'
{ "avatar_url": "https://avatars.githubusercontent.com/u/350846?v=4", "events_url": "https://api.github.com/users/jribbens/events{/privacy}", "followers_url": "https://api.github.com/users/jribbens/followers", "following_url": "https://api.github.com/users/jribbens/following{/other_user}", "gists_url": "https://api.github.com/users/jribbens/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jribbens", "id": 350846, "login": "jribbens", "node_id": "MDQ6VXNlcjM1MDg0Ng==", "organizations_url": "https://api.github.com/users/jribbens/orgs", "received_events_url": "https://api.github.com/users/jribbens/received_events", "repos_url": "https://api.github.com/users/jribbens/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jribbens/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jribbens/subscriptions", "type": "User", "url": "https://api.github.com/users/jribbens", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2015-03-21T14:07:08Z
2021-09-08T21:00:47Z
2015-03-21T16:21:22Z
NONE
resolved
Firstly, please note I have already read issues #1289 and #1915. Secondly, please note that I am fully aware that the problem is that the underlying URL fetch failed due to a problem that is outside Requests' control, i.e. a network problem or failure of the remote web server. My problem is not that an exception is raised - my problem is that the _wrong_ exception is raised. According to the Requests documentation, in this circumstance a `ReadTimeout` exception should be raised, and in any event the exception should be a subclass of `RequestException`. Instead, a `TypeError` is being raised. This is fatal for my application, which needs to know _why_ the request failed and to be sure that it was indeed a network failure and not a bug in the application or Requests itself. As far as I can see, this is indeed a bug in Requests. I am using Requests 2.5.3 on Python 3.4.0 on Ubuntu 14.04.2. ``` Traceback (most recent call last): File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 372, in _make_request httplib_response = conn.getresponse(buffering=True) TypeError: getresponse() got an unexpected keyword argument 'buffering' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 374, in _make_request httplib_response = conn.getresponse() File "/usr/lib/python3.4/http/client.py", line 1147, in getresponse response.begin() File "/usr/lib/python3.4/http/client.py", line 351, in begin version, status, reason = self._read_status() File "/usr/lib/python3.4/http/client.py", line 313, in _read_status line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") File "/usr/lib/python3.4/socket.py", line 371, in readinto return self._sock.recv_into(b) socket.timeout: timed out During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/adapters.py", line 370, in send timeout=timeout File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 597, in urlopen _stacktrace=sys.exc_info()[2]) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 245, in increment raise six.reraise(type(error), error, _stacktrace) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/packages/six.py", line 310, in reraise raise value File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen body=body, headers=headers) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 376, in _make_request self._raise_timeout(err=e, url=url, timeout_value=read_timeout) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 304, in _raise_timeout raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value) requests.packages.urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='www.libraryofbirmingham.com', port=80): Read timed out. 
(read timeout=30) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "./bitatawa-client.py", line 231, in check timeout=settings.getint("check-timeout", 30)) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/api.py", line 65, in get return request('get', url, **kwargs) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/api.py", line 49, in request response = session.request(method=method, url=url, **kwargs) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/sessions.py", line 461, in request resp = self.send(prep, **send_kwargs) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/sessions.py", line 573, in send r = adapter.send(request, **kwargs) File "/home/jribbens/src/bitatawa/lib/python3.4/site-packages/requests/adapters.py", line 433, in send raise ReadTimeout(e, request=request) requests.exceptions.ReadTimeout: HTTPConnectionPool(host='www.libraryofbirmingham.com', port=80): Read timed out. (read timeout=30) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2510/reactions" }
https://api.github.com/repos/psf/requests/issues/2510/timeline
null
completed
null
null
false
[ "Sorry @jribbens, but you didn't read #1289 or #1915 clearly enough. I will reproduce my statements from those issues again, in the hopes that this will be the last time I have to write this down. =)\n\nThis is an unforeseen problem to do with how exception tracebacks are being reported in Python 3. PEP 3134 introduced this 'chaining exceptions' reporting that you can see in the traceback. The purpose of this error reporting is to highlight that some exceptions occur in except blocks, and to work out what chain of exceptions was hit. This is potentially very useful: for instance, you can hit an exception after destroying a resource and then attempt to use that resource in the except block, which hits another exception. It's helpful to be able to see both exceptions at once.\n\nThe key is that the `TypeError` raised as the first exception is **unrelated** to the subsequent ones. In fact, that's the standard control flow in urllib3. This means that the real exception that's being raised here is the request.exceptions.ReadTimeout exception that wraps the urllib3.exceptions.ReadTimeoutError exception being raised in urllib3.\n\nThe exception that actually bubbled up to the user code is the **last** one, not the first one. The way you read that traceback is as follows:\n- I hit a `TypeError` in this method.\n- In the `except` block that caught the `TypeError`, I raised a `socket.timeout` (in practice, this is well down the call stack, in a totally standard control flow)\n- In the `except` block that caught the `socket.timeout`, I raised a `requests.packages.urllib3.exceptions.ReadTimeoutError`\n- In the `except` block that caught the `requests.packages.urllib3.exceptions.ReadTimeoutError`, I raised a `requests.exceptions.ReadTimeout`.\n- This exception was never caught.\n\nTo be clear, I have stated quite publicly that I think Python 3 screwed users on this 'feature'. I think it's a misfeature that discourages the Easier to Ask Forgiveness than Permission convention in the Python community because it leads to repeated bug reports like this one, even by people who honestly believe that they have read and understood the previous bug reports on this topic.\n\nI promise you, the `ReadTimeout` is exactly what was raised here.\n", "Ah, I see. Thank you for your response. I had wrapped the requests call in a `try: except requests.RequestException` which is why I thought it was not raising `ReadTimeout`, but yet another exception in that block was confusing the issue even further.\n\nPython 3's stack backtrace here does seem to be completely insane. I don't see the point of tracking back past a `raise` unless it was `raise ... from ...`. But this is, of course, not your fault ;-)\n", "> Python 3's stack backtrace here does seem to be completely insane. I don't see the point of tracking back past a raise unless it was raise ... from .... But this is, of course, not your fault ;-)\n\nI'm entirely agreed with you here, this has cost the requests team more time than I'd like. Such is life. =)\n", "Could you not just break the chain with `raise <E> from None`?\n\n```\nPython 3.4.1 (v3.4.1:c0e311e010fc, May 18 2014, 00:54:21)\n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> try:\n... r = 5/0\n... except:\n... 
raise FileNotFoundError\n...\nTraceback (most recent call last):\n File \"<stdin>\", line 2, in <module>\nZeroDivisionError: division by zero\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 4, in <module>\nFileNotFoundError\n>>> try:\n... r = 5/0\n... except:\n... raise FileNotFoundError from None\n...\nTraceback (most recent call last):\n File \"<stdin>\", line 4, in <module>\nFileNotFoundError\n```\n", "Nope: we have a common codebase for Python 2 and Python 3, and that syntax isn't valid in Python 2.\n", "Surely a test for legacy python would be worth avoiding all these bug reports?\n", "@Grazfather No, it's a syntax error: Python 2 cannot even parse the file with that in it\n", "A shame. Thanks.\n", "``` python\nhttplib_response = conn.getresponse(buffering=True) # This is almost certainly not the error you are looking for!\n```\n", "(sorry) \n" ]
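The takeaway from the discussion above is that, despite the chained Python 3 traceback, the exception that actually reaches user code is `requests.exceptions.ReadTimeout`. A minimal sketch of catching it, using a placeholder URL and timeout values:

``` python
import requests

try:
    # Placeholder endpoint; the point is only which exception type to catch.
    requests.get("https://example.com/slow", timeout=(3.05, 1))
except requests.exceptions.ConnectTimeout:
    print("connect timed out")
except requests.exceptions.ReadTimeout:
    # This is what bubbles up, even though the traceback also shows the
    # TypeError and socket.timeout raised further down the stack.
    print("read timed out")
```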
https://api.github.com/repos/psf/requests/issues/2509
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2509/labels{/name}
https://api.github.com/repos/psf/requests/issues/2509/comments
https://api.github.com/repos/psf/requests/issues/2509/events
https://github.com/psf/requests/issues/2509
63,366,723
MDU6SXNzdWU2MzM2NjcyMw==
2,509
Not sure if this is an issue, but how do I select a port
{ "avatar_url": "https://avatars.githubusercontent.com/u/9436380?v=4", "events_url": "https://api.github.com/users/SethDusek/events{/privacy}", "followers_url": "https://api.github.com/users/SethDusek/followers", "following_url": "https://api.github.com/users/SethDusek/following{/other_user}", "gists_url": "https://api.github.com/users/SethDusek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/SethDusek", "id": 9436380, "login": "SethDusek", "node_id": "MDQ6VXNlcjk0MzYzODA=", "organizations_url": "https://api.github.com/users/SethDusek/orgs", "received_events_url": "https://api.github.com/users/SethDusek/received_events", "repos_url": "https://api.github.com/users/SethDusek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/SethDusek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SethDusek/subscriptions", "type": "User", "url": "https://api.github.com/users/SethDusek", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-21T06:01:46Z
2021-09-08T23:05:52Z
2015-03-21T06:42:12Z
NONE
resolved
I want to connect on port 1079, but I can't find any documentation on choosing a port
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2509/reactions" }
https://api.github.com/repos/psf/requests/issues/2509/timeline
null
completed
null
null
false
[ "`r = requests.get('http://http2bin.org:8080')`\n" ]
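As the answer to issue 2509 shows, the port is simply part of the URL, so no extra parameter is needed. A minimal sketch using a placeholder host and the port from the question:

``` python
import requests

# The port is given directly in the URL.
r = requests.get("http://example.com:1079/")
print(r.status_code)
```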
https://api.github.com/repos/psf/requests/issues/2508
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2508/labels{/name}
https://api.github.com/repos/psf/requests/issues/2508/comments
https://api.github.com/repos/psf/requests/issues/2508/events
https://github.com/psf/requests/issues/2508
63,320,912
MDU6SXNzdWU2MzMyMDkxMg==
2,508
Session timeout doesn't accept tuple?
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2015-03-20T22:31:35Z
2021-09-08T13:05:32Z
2015-03-21T02:01:20Z
NONE
resolved
Hi. Following the [documentation](http://docs.python-requests.org/en/latest/user/advanced/#timeouts), I've set a tuple with connect and read timeouts: ``` data = self.session.post('https://...', data=params, headers=headers, timeout=(4.5, 10), verify=False).json() ``` But what I've got is this: ``` Timeout value connect was (4.5, 10), but it must be an int or float. ``` As stated in the [developer API](http://docs.python-requests.org/en/latest/api/#request-sessions) under request method, timeout should accept float or tuple, right? Thanks
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2508/reactions" }
https://api.github.com/repos/psf/requests/issues/2508/timeline
null
completed
null
null
false
[ "Taking a look now\n\nOn Friday, March 20, 2015, Mateus Dalto Piveta [email protected]\nwrote:\n\n> Hi.\n> Following the documentation\n> http://docs.python-requests.org/en/latest/user/advanced/#timeouts, I've\n> set a tuple with connect and read timeouts:\n> \n> data = self.session.post('https://...,\n> data=params,\n> headers=headers,\n> timeout=(4.5, 10),\n> verify=False).json()\n> \n> But what I've got is this:\n> \n> Timeout value connect was (4.5, 10), but it must be an int or float.\n> \n> As stated in the developer API\n> http://docs.python-requests.org/en/latest/api/#request-sessions under\n> request method, timeout should accept float or tuple, right?\n> Thanks\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2508.\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "What version of requests are you using?\n", "The reason I ask is that with 2.6.0, I cannot reproduce this bug.\n", "Requests 2.6.0\nPython 3.4.0\n\nAlso, same error without using session\n", "Once again, I do not reproduce this on requests 2.6.0 on Python 3.4.3 on OS X.\n", "![screen shot 2015-03-20 at 20 20 19](https://cloud.githubusercontent.com/assets/6289699/6762468/92405c54-cf3f-11e4-9ce9-66b619e9c49c.png)\n\nUbuntu 14.04.2\n=/\n", "what is the urllib3 version? a single `util.py` file is fairly old..\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Fri, Mar 20, 2015 at 4:31 PM, Mateus Dalto Piveta <\[email protected]> wrote:\n\n> [image: screen shot 2015-03-20 at 20 20 19]\n> https://cloud.githubusercontent.com/assets/6289699/6762468/92405c54-cf3f-11e4-9ce9-66b619e9c49c.png\n> \n> Ubuntu 14.04.2\n> =/\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2508#issuecomment-84194385\n> .\n", "sorry - what I am trying to suggest is that your requests version is 2.6.0,\nbut the attached urllib3 version is part of your system packages and may be\nearlier, in a compatibility-breaking way.\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Fri, Mar 20, 2015 at 4:33 PM, Kevin Burke [email protected] wrote:\n\n> what is the urllib3 version? a single `util.py` file is fairly old..\n> \n> ## \n> \n> Kevin Burke\n> phone: 925.271.7005 | twentymilliseconds.com\n> \n> On Fri, Mar 20, 2015 at 4:31 PM, Mateus Dalto Piveta <\n> [email protected]> wrote:\n> \n> > [image: screen shot 2015-03-20 at 20 20 19]\n> > https://cloud.githubusercontent.com/assets/6289699/6762468/92405c54-cf3f-11e4-9ce9-66b619e9c49c.png\n> > \n> > Ubuntu 14.04.2\n> > =/\n> > \n> > —\n> > Reply to this email directly or view it on GitHub\n> > https://github.com/kennethreitz/requests/issues/2508#issuecomment-84194385\n> > .\n", "I've uninstalled the python3-urllib3 that comes with default installation (v1.7.1) and installed via pip v1.10.2, and now everything works :grin:\r\nThank you guys\r\n" ]
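The thread above traces the tuple-timeout error to an old system-packaged urllib3 shadowing the one requests expects. A quick way to sanity-check both versions and exercise the tuple timeout, sketched under the assumption of requests 2.x with its bundled urllib3 and a placeholder endpoint:

``` python
import requests

print(requests.__version__)
# The urllib3 that requests is actually using:
print(requests.packages.urllib3.__version__)

# (connect timeout, read timeout) — works when that urllib3 is recent enough.
r = requests.post("https://httpbin.org/post", data={"k": "v"}, timeout=(4.5, 10))
print(r.status_code)
```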
https://api.github.com/repos/psf/requests/issues/2507
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2507/labels{/name}
https://api.github.com/repos/psf/requests/issues/2507/comments
https://api.github.com/repos/psf/requests/issues/2507/events
https://github.com/psf/requests/issues/2507
63,276,678
MDU6SXNzdWU2MzI3NjY3OA==
2,507
Request works from command line, but not from within script.
{ "avatar_url": "https://avatars.githubusercontent.com/u/248430?v=4", "events_url": "https://api.github.com/users/egmhansen/events{/privacy}", "followers_url": "https://api.github.com/users/egmhansen/followers", "following_url": "https://api.github.com/users/egmhansen/following{/other_user}", "gists_url": "https://api.github.com/users/egmhansen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/egmhansen", "id": 248430, "login": "egmhansen", "node_id": "MDQ6VXNlcjI0ODQzMA==", "organizations_url": "https://api.github.com/users/egmhansen/orgs", "received_events_url": "https://api.github.com/users/egmhansen/received_events", "repos_url": "https://api.github.com/users/egmhansen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/egmhansen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/egmhansen/subscriptions", "type": "User", "url": "https://api.github.com/users/egmhansen", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-03-20T18:12:58Z
2021-09-08T23:05:52Z
2015-03-20T18:25:45Z
NONE
resolved
If I run the following from the command line, it works fine. If I try to run it from a script I get the AttributeError below. Why does that happen? import requests r = requests.head('https://api.github.com/user') AttributeError: 'module' object has no attribute 'head'
{ "avatar_url": "https://avatars.githubusercontent.com/u/248430?v=4", "events_url": "https://api.github.com/users/egmhansen/events{/privacy}", "followers_url": "https://api.github.com/users/egmhansen/followers", "following_url": "https://api.github.com/users/egmhansen/following{/other_user}", "gists_url": "https://api.github.com/users/egmhansen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/egmhansen", "id": 248430, "login": "egmhansen", "node_id": "MDQ6VXNlcjI0ODQzMA==", "organizations_url": "https://api.github.com/users/egmhansen/orgs", "received_events_url": "https://api.github.com/users/egmhansen/received_events", "repos_url": "https://api.github.com/users/egmhansen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/egmhansen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/egmhansen/subscriptions", "type": "User", "url": "https://api.github.com/users/egmhansen", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2507/reactions" }
https://api.github.com/repos/psf/requests/issues/2507/timeline
null
completed
null
null
false
[ "Is there a file named 'requests.py'? Python would try to import that file, not the module, in that case.\n", "Nailed it! thanks Kevin!\n" ]
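The fix in issue 2507 was a local file named requests.py shadowing the installed package. A small diagnostic sketch:

``` python
import requests

# If this prints a path inside your project (e.g. ./requests.py) rather than
# site-packages, a local file is shadowing the real library — rename it and
# remove any stale requests.pyc next to it.
print(requests.__file__)
print(hasattr(requests, "head"))
```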
https://api.github.com/repos/psf/requests/issues/2506
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2506/labels{/name}
https://api.github.com/repos/psf/requests/issues/2506/comments
https://api.github.com/repos/psf/requests/issues/2506/events
https://github.com/psf/requests/pull/2506
62,806,978
MDExOlB1bGxSZXF1ZXN0MzE0NzQ1NDY=
2,506
Expand on chunked handling.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-03-18T21:32:41Z
2021-09-08T07:00:49Z
2015-07-18T15:44:31Z
MEMBER
resolved
Once we merge in the new urllib3, this will update the docs to reflect the new best way to handle chunked transfer encoding.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2506/reactions" }
https://api.github.com/repos/psf/requests/issues/2506/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2506.diff", "html_url": "https://github.com/psf/requests/pull/2506", "merged_at": "2015-07-18T15:44:31Z", "patch_url": "https://github.com/psf/requests/pull/2506.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2506" }
true
[ "looks fine to me\n\nhow about specifying the size of a chunk? will it be possible?\n", "Yup, passing the chunk size parameter to `iter_content` will do it. What you _can't_ do is change that size from chunk to chunk.\n", "You are right! Should have checked the code.\n\nThe reason I asked because I was slightly confused by this sentence. I would probably update it to:\n\n``` diff\n-iterate chunk-by-chunk by calling ``iter_content`` with a chunk\n-size parameter of ``None``.\n+iterate chunk-by-chunk by calling ``iter_content`` (it is ``None``\n+by default, but you can set chunk size yourself by providing an ``int``).\n```\n\nBut you're the native speaker here. So I'm assuming that's just me not getting it.\n", "Agree that it needs rewording, though that wording isn't quite right (it's not `None` by default, it's `1`).\n", "pong\n", "Always be updating PRs.\n" ]
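The review comments above settle on `iter_content`: the chunk size defaults to 1, an int gives fixed-size chunks, and `None` yields data chunk-by-chunk as it arrives (with the newer urllib3). A minimal sketch using a placeholder streaming endpoint:

``` python
import requests

r = requests.get("https://httpbin.org/stream/20", stream=True)
# chunk_size=None yields whatever chunks the server sends; pass an int such
# as 8192 for fixed-size chunks. The size cannot vary from chunk to chunk.
for chunk in r.iter_content(chunk_size=None):
    print(len(chunk))
r.close()
```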
https://api.github.com/repos/psf/requests/issues/2505
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2505/labels{/name}
https://api.github.com/repos/psf/requests/issues/2505/comments
https://api.github.com/repos/psf/requests/issues/2505/events
https://github.com/psf/requests/issues/2505
62,794,103
MDU6SXNzdWU2Mjc5NDEwMw==
2,505
request.files is empty after POSTing a file
{ "avatar_url": "https://avatars.githubusercontent.com/u/1389463?v=4", "events_url": "https://api.github.com/users/bepetersn/events{/privacy}", "followers_url": "https://api.github.com/users/bepetersn/followers", "following_url": "https://api.github.com/users/bepetersn/following{/other_user}", "gists_url": "https://api.github.com/users/bepetersn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/bepetersn", "id": 1389463, "login": "bepetersn", "node_id": "MDQ6VXNlcjEzODk0NjM=", "organizations_url": "https://api.github.com/users/bepetersn/orgs", "received_events_url": "https://api.github.com/users/bepetersn/received_events", "repos_url": "https://api.github.com/users/bepetersn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/bepetersn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bepetersn/subscriptions", "type": "User", "url": "https://api.github.com/users/bepetersn", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-03-18T20:41:34Z
2021-09-08T23:05:52Z
2015-03-18T21:54:34Z
NONE
resolved
Hey guys, I don't know if this is the right place to make this bug report, but I could sure use some help, I have been banging my head against a wall for a bit. See this repo where I'm reproducing my error: https://github.com/bepetersn/special-repo. Using Flask, I'm seeing weird behavior around its `request` object. After uploading a file with the requests library, e.g. `requests.post(uri, files=<my_files>)`, by the time the request propagated to my view function, `request.files` was empty. Oddly, I observed that the contents of the file itself were available under `request.form[None]`. After quite a lot of debugging, I saw that Werkzeug received roughly the following as the form/multipart-encoded data, and proceeded to try to parse it: > Content-Disposition: form-data; name*=utf-8\'\'Spirit%20Airlines%20-%20cheap%20tickets%2C%0A %20cheap%20flights%2C%20discount%20airfare%2C%20cheap%20hotels %2C%20cheap%20car%20rentals%2C%20cheap%20travel.pdf; filename*=utf- 8\'\'Spirit%20Airlines%20-%20cheap%20tickets%2C%0A%20cheap%20flights %2C%20discount%20airfare%2C%20cheap%20hotels%2C%20cheap%20car%20rentals %2C%20cheap%20travel.pdf\r\n Do you notice the little "*" characters just after the "name" and "filename" header definitions? From my sense of things, Werkzeug looks for exactly "name" and "filename", doesn't know how to parse these in their place, fails to find an attachment name, and thus doesn't create a `werkzeug.datastructures.FileStorage`. For this reason, `request.files` is empty. What does anyone think? Is requests responsible for adding these extra characters in?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 2, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/2505/reactions" }
https://api.github.com/repos/psf/requests/issues/2505/timeline
null
completed
null
null
false
[ "I see the line responsible here: https://github.com/kennethreitz/requests/blob/35d083e1665beff39aabe47a79cd1f867b897b0c/requests/packages/urllib3/fields.py#L45. However, it's pretty clear that it's intentional. I'm curious if anyone can point me to what purpose it has (maybe something to do with RFC 2231?), -- and whether it's really correct, if I'm somehow giving requests bad data, or even if werkzeug isn't handling this stuff correctly..\n", "This problem is almost certainly because you're passing a unicode string into requests 'files' parameter. Make sure your strings are all bytestrings first. =)\n", "I will try that. :) Beats reading about 4 RFCs to check if I should actually be wanting to submit a pull request to werkzeug.\n", "Well, I don't fully get the logic of it, but what I actually needed to do was get rid of the newline that was in the filename I was passing to the `files` parameter of `requests.post`.\n", "Ah, yes, that'll do it to.\n\nThe problem here is that the multipart/form-encoded spec doesn't ordinarily allow newlines in the form fields, so you need to do some awkward work to get around them. That involves the encoding we've used, which is sadly relatively poorly supported by many servers. =(\n", "Thanks for your help, Corey.\nOn Mar 18, 2015 4:54 PM, \"Cory Benfield\" [email protected] wrote:\n\n> Closed #2505 https://github.com/kennethreitz/requests/issues/2505.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2505#event-258690327.\n" ]
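The root cause in issue 2505 was a newline in the filename, which pushes urllib3 into the RFC 2231 `filename*=` encoding that many servers (Werkzeug included, at the time) do not parse. A sketch of sanitising the name before upload, with a hypothetical filename, payload and URL:

``` python
import io
import requests

original_name = "Spirit Airlines -\n cheap tickets.pdf"  # name containing a newline

# Strip newlines and collapse whitespace so urllib3 can emit a plain
# filename="..." parameter instead of the filename*= form.
safe_name = " ".join(original_name.split())

files = {"file": (safe_name, io.BytesIO(b"%PDF-1.4 ..."), "application/pdf")}
r = requests.post("https://example.com/upload", files=files)
print(r.status_code)
```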
https://api.github.com/repos/psf/requests/issues/2504
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2504/labels{/name}
https://api.github.com/repos/psf/requests/issues/2504/comments
https://api.github.com/repos/psf/requests/issues/2504/events
https://github.com/psf/requests/pull/2504
62,668,489
MDExOlB1bGxSZXF1ZXN0MzE0MjA0MjI=
2,504
fix resolve redirect to pass all original arguments
{ "avatar_url": "https://avatars.githubusercontent.com/u/391586?v=4", "events_url": "https://api.github.com/users/pvanderlinden/events{/privacy}", "followers_url": "https://api.github.com/users/pvanderlinden/followers", "following_url": "https://api.github.com/users/pvanderlinden/following{/other_user}", "gists_url": "https://api.github.com/users/pvanderlinden/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pvanderlinden", "id": 391586, "login": "pvanderlinden", "node_id": "MDQ6VXNlcjM5MTU4Ng==", "organizations_url": "https://api.github.com/users/pvanderlinden/orgs", "received_events_url": "https://api.github.com/users/pvanderlinden/received_events", "repos_url": "https://api.github.com/users/pvanderlinden/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pvanderlinden/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pvanderlinden/subscriptions", "type": "User", "url": "https://api.github.com/users/pvanderlinden", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2015-03-18T11:33:15Z
2021-09-08T08:00:50Z
2015-04-16T05:57:56Z
CONTRIBUTOR
resolved
Fix for https://github.com/kennethreitz/requests/issues/2503. It is still a bit inconsistent, but, as mentioned in the issue, going further would break the current public API. This makes it possible for non-urllib3 adapters to accept extra arguments.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2504/reactions" }
https://api.github.com/repos/psf/requests/issues/2504/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2504.diff", "html_url": "https://github.com/psf/requests/pull/2504", "merged_at": "2015-04-16T05:57:56Z", "patch_url": "https://github.com/psf/requests/pull/2504.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2504" }
true
[ "I think I'd like the `kwargs` parameter in `resolve_redirects` to have a different name: `adapter_kwargs`, maybe?\n", "I have changed the parameter.\n", "LGTM. The one thing I'm thinking about is whether we want to allow for forward compatibility by actually storing these kwargs in a dictionary for resolve_redirects, but I haven't decided yet. I'll let @sigmavirus24 make a call here.\n", "> The one thing I'm thinking about is whether we want to allow for forward compatibility by actually storing these kwargs in a dictionary for resolve_redirects\n\nCould you expand on that @Lukasa? I'm not quite certain what you mean.\n", "Any thoughts? I would like to get this merged as it is preventing me from adding a feature to my adapter (making it open source soon).\n", "Sorry, what I meant was I'm wondering whether we should, rather than using `**kwargs` the whole way through, save the initial `**kwargs` in a dictionary which is then passed _as a dictionary_. That way, if we ever feel the need to add `**kwargs` to any of these functions later we could.\n\nI don't think it's a big deal though.\n", "I'm okay with this if you are @Lukasa.\n", "Sure, why not. =)\n", "Thanks!\n" ]
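The change in PR 2504 forwards the extra keyword arguments given to `Session.send` on to `resolve_redirects`, so a custom transport adapter sees them on redirected requests as well. A sketch of the kind of adapter that benefits, with a hypothetical extra `tag` parameter:

``` python
import requests
from requests.adapters import HTTPAdapter

class TaggingAdapter(HTTPAdapter):
    """Illustrative adapter that accepts an extra keyword argument."""

    def send(self, request, tag=None, **kwargs):
        # 'tag' is a made-up argument; after this PR it is also forwarded
        # when the session follows redirects.
        if tag is not None:
            request.headers["X-Example-Tag"] = tag
        return super(TaggingAdapter, self).send(request, **kwargs)

s = requests.Session()
s.mount("https://", TaggingAdapter())
prepared = s.prepare_request(requests.Request("GET", "https://example.com/"))
resp = s.send(prepared, tag="demo")  # extra kwargs reach the adapter
```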
https://api.github.com/repos/psf/requests/issues/2503
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2503/labels{/name}
https://api.github.com/repos/psf/requests/issues/2503/comments
https://api.github.com/repos/psf/requests/issues/2503/events
https://github.com/psf/requests/issues/2503
62,400,729
MDU6SXNzdWU2MjQwMDcyOQ==
2,503
Redirect doesn't pass all parameters
{ "avatar_url": "https://avatars.githubusercontent.com/u/391586?v=4", "events_url": "https://api.github.com/users/pvanderlinden/events{/privacy}", "followers_url": "https://api.github.com/users/pvanderlinden/followers", "following_url": "https://api.github.com/users/pvanderlinden/following{/other_user}", "gists_url": "https://api.github.com/users/pvanderlinden/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pvanderlinden", "id": 391586, "login": "pvanderlinden", "node_id": "MDQ6VXNlcjM5MTU4Ng==", "organizations_url": "https://api.github.com/users/pvanderlinden/orgs", "received_events_url": "https://api.github.com/users/pvanderlinden/received_events", "repos_url": "https://api.github.com/users/pvanderlinden/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pvanderlinden/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pvanderlinden/subscriptions", "type": "User", "url": "https://api.github.com/users/pvanderlinden", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
4
2015-03-17T13:41:37Z
2021-09-08T23:04:52Z
2015-05-03T15:06:00Z
CONTRIBUTOR
resolved
When you write your own adapter that takes parameters other than those of the default adapter, resolve_redirects only passes in the parameters that the default adapter takes. This is different from the first request, which passes all parameters to the send function.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2503/reactions" }
https://api.github.com/repos/psf/requests/issues/2503/timeline
null
completed
null
null
false
[ "Good spot, we should aim to fix that. Not sure where we'll store them though.\n", "Requests currently keeps all those parameters seperatly (just in a local variables I think), instead of getting them from **kwargs invidually pass them whole kwargs to the redirect resolver? I might raise a PR for this tomorrow if I come around it.\n", "I think having a PR to talk around would be good. I'm kinda scattered today, so it'll be nice to have something concrete to look at. =)\n", "So I'm mildly -0.5 on this. In practice we've seen that 99.9% of the users are satisfied by how `resolve_redirects` passes parameters. So `resolve_redirects` takes exactly the parameters it needs to know about.\n\nHaving a PR is still nice to have, but I'll caution you against an immediate no-go: Do not replace the existing optional parameters to `resolve_redirects` with `**kwargs` because that breaks a public API that previously accepted positional arguments.\n" ]
https://api.github.com/repos/psf/requests/issues/2502
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2502/labels{/name}
https://api.github.com/repos/psf/requests/issues/2502/comments
https://api.github.com/repos/psf/requests/issues/2502/events
https://github.com/psf/requests/issues/2502
62,343,597
MDU6SXNzdWU2MjM0MzU5Nw==
2,502
Kerberos auth doesn't work when the host is an alias
{ "avatar_url": "https://avatars.githubusercontent.com/u/3543138?v=4", "events_url": "https://api.github.com/users/AndreaGiardini/events{/privacy}", "followers_url": "https://api.github.com/users/AndreaGiardini/followers", "following_url": "https://api.github.com/users/AndreaGiardini/following{/other_user}", "gists_url": "https://api.github.com/users/AndreaGiardini/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AndreaGiardini", "id": 3543138, "login": "AndreaGiardini", "node_id": "MDQ6VXNlcjM1NDMxMzg=", "organizations_url": "https://api.github.com/users/AndreaGiardini/orgs", "received_events_url": "https://api.github.com/users/AndreaGiardini/received_events", "repos_url": "https://api.github.com/users/AndreaGiardini/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AndreaGiardini/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AndreaGiardini/subscriptions", "type": "User", "url": "https://api.github.com/users/AndreaGiardini", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2015-03-17T09:29:52Z
2021-09-08T21:00:47Z
2015-11-05T18:04:23Z
NONE
resolved
Hi, I'm using the requests module in combination with requests-kerberos. Since the server I'm querying is behind a load balancer, it often happens that the authentication is done against host1 and the query against host2, leading to a 401 Unauthenticated error. I managed to work around this problem this way: ``` host = socket.gethostbyaddr(random.choice(socket.gethostbyname_ex(host)[2]))[0] ``` I don't know if I should open this issue in `requests-kerberos` as well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2502/reactions" }
https://api.github.com/repos/psf/requests/issues/2502/timeline
null
completed
null
null
false
[ "requests-kerberos should authenticate every connection so long as you attach it as an auth handler to the _session_, not to individual requests. Ideally that should avoid this problem.\n", "Do you mean something like this?\n\n```\ns = requests.Session()\ns.auth = HTTPKerberosAuth()\ns.get('http://my.beautiful.alias')\n```\n\nBecause this example (on their project page) doesn't work in my case:\n\n```\nrequests.get(\"http://my.beautiful.alias\", auth=HTTPKerberosAuth())\n```\n", "That is exactly what I meant. =) Does that work?\n", "In my case it doesn't...\nIf it connects to the same server behind the load balancer it works, if it connects to a different one then i have authentication failed.\n", "Interesting. Do all the endpoints share the same hostname?\n", "The reason I ask is that Kerberos with loadbalancers is...[tricky](http://serverfault.com/questions/365065/kerberos-authentication-through-load-balancer).\n", "That's my case:\n\n```\n➜ host judy.my.com\njudy.my.com is an alias for foremanlb.my.com.\nforemanlb.my.com has address X\nforemanlb.my.com has address Y\n```\n", "It would be interesting to see if any other kerberos client implementation functions here. I think Chrome has a Kerberos implementation...\n", "So HTTPKerberosAuth requires the same IP each time for authentication to continue working? I'm not sure that's the problem of requests and I'm very sure we don't want to randomly choose a value to give the authenticator if in fact that's a potential solution.\n", "I didn't investigate deeply tbh...\nAlso requests need to pick up a random value and not the alias, otherwise it doesn't work.\nThis is the only way afaik:\n\n```\nhost = socket.gethostbyaddr(random.choice(socket.gethostbyname_ex(host)[2]))[0]\nrequests.get(host, auth=HTTPKerberosAuth())\n```\n\nThe thing is that both requests and HTTPKerberosAuth need to make their calls to the same server behind the alias.\n", "not sure if this might help, but I had a similar problem and it seems to have gone after setting SPNs for the alias names. In my case, I had a webserver bound to an Active Directory domain with a specific URL asking for Kerberos auth. Without SPNs for the alias name in place, auth didn't work. I ran this with DomainAdmin privileges to register the SPNs:\n\nsetspn /A host/myaliashostname realhostname\nsetspn /A http/myaliashostname realhostname\n\nIn my case, I also had to reboot the (Windows) client machine on which I'm running the python/requests client (had problems before reboot).\n", "Why this has been closed? It's still an issue for me\n", "This issue is not really appropriate on this repository, it should be on requests-kerberos. =)\n" ]
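The workaround in issue 2502 pins both the Kerberos negotiation and the actual request to a single backend behind the alias, and attaches the auth handler to a Session. A sketch combining the two ideas; requests-kerberos is a third-party package and the hostnames are placeholders:

``` python
import random
import socket

import requests
from requests_kerberos import HTTPKerberosAuth  # third-party package

alias = "judy.my.com"  # placeholder alias from the report

# Resolve the alias to one concrete backend, then reverse-resolve it so the
# auth handshake and the request hit the same server.
ip = random.choice(socket.gethostbyname_ex(alias)[2])
host = socket.gethostbyaddr(ip)[0]

s = requests.Session()
s.auth = HTTPKerberosAuth()
r = s.get("https://%s/api/" % host)
print(r.status_code)
```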
https://api.github.com/repos/psf/requests/issues/2501
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2501/labels{/name}
https://api.github.com/repos/psf/requests/issues/2501/comments
https://api.github.com/repos/psf/requests/issues/2501/events
https://github.com/psf/requests/pull/2501
62,201,326
MDExOlB1bGxSZXF1ZXN0MzEyODEzNjA=
2,501
Added myself to authors.rst
{ "avatar_url": "https://avatars.githubusercontent.com/u/3696393?v=4", "events_url": "https://api.github.com/users/yasoob/events{/privacy}", "followers_url": "https://api.github.com/users/yasoob/followers", "following_url": "https://api.github.com/users/yasoob/following{/other_user}", "gists_url": "https://api.github.com/users/yasoob/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yasoob", "id": 3696393, "login": "yasoob", "node_id": "MDQ6VXNlcjM2OTYzOTM=", "organizations_url": "https://api.github.com/users/yasoob/orgs", "received_events_url": "https://api.github.com/users/yasoob/received_events", "repos_url": "https://api.github.com/users/yasoob/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yasoob/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yasoob/subscriptions", "type": "User", "url": "https://api.github.com/users/yasoob", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-16T20:30:21Z
2021-09-08T08:00:58Z
2015-03-16T20:31:09Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2501/reactions" }
https://api.github.com/repos/psf/requests/issues/2501/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2501.diff", "html_url": "https://github.com/psf/requests/pull/2501", "merged_at": "2015-03-16T20:31:09Z", "patch_url": "https://github.com/psf/requests/pull/2501.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2501" }
true
[ ":sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/2500
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2500/labels{/name}
https://api.github.com/repos/psf/requests/issues/2500/comments
https://api.github.com/repos/psf/requests/issues/2500/events
https://github.com/psf/requests/issues/2500
62,085,366
MDU6SXNzdWU2MjA4NTM2Ng==
2,500
Add config option to use system CA bundle
{ "avatar_url": "https://avatars.githubusercontent.com/u/4709746?v=4", "events_url": "https://api.github.com/users/wrycu/events{/privacy}", "followers_url": "https://api.github.com/users/wrycu/followers", "following_url": "https://api.github.com/users/wrycu/following{/other_user}", "gists_url": "https://api.github.com/users/wrycu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wrycu", "id": 4709746, "login": "wrycu", "node_id": "MDQ6VXNlcjQ3MDk3NDY=", "organizations_url": "https://api.github.com/users/wrycu/orgs", "received_events_url": "https://api.github.com/users/wrycu/received_events", "repos_url": "https://api.github.com/users/wrycu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wrycu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wrycu/subscriptions", "type": "User", "url": "https://api.github.com/users/wrycu", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
2
2015-03-16T13:43:51Z
2021-09-08T23:05:53Z
2015-03-16T15:47:27Z
NONE
resolved
I'm using requests with an internal CA which is trusted on my machines. I recently discovered that you can modify the code in certs.py to return the system CA bundle (thus allowing verified HTTPS requests), but I think this should really be an option, not a modification to the module. And since we all use venvs for development, simply adding the internal CA to the Requests bundle is not a very elegant solution.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2500/reactions" }
https://api.github.com/repos/psf/requests/issues/2500/timeline
null
completed
null
null
false
[ "If you know the location, you can use the verify parameter on a session to set the location or you can use `REQUESTS_CA_BUNDLE` and `CURL_CA_BUNDLE`.\n\nWe used to have a way to auto-discover system bundles but there were far too many corner cases.\n", "This is another thing that can be solved by shazow/urllib3#507, if it gets merged. Until that time, I agree that this is more complicated than simply passing your own certificate bundle to requests.\n\nFor anyone who is interested, you can build this bundle by running:\n\n``` bash\n$ cat `python -c 'import requests; print requests.certs.where()'` <your PEM file>\n```\n" ]
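As the replies to issue 2500 note, any bundle can be supplied through the `verify` parameter (per session or per request) or through the `REQUESTS_CA_BUNDLE` environment variable. A sketch with placeholder paths and hosts:

``` python
import os
import requests

bundle = "/etc/ssl/certs/ca-certificates.crt"  # placeholder system bundle path

# Option 1: set it on a session so every request uses it.
s = requests.Session()
s.verify = bundle
s.get("https://internal.example.com/")

# Option 2: set it once for the whole process via the environment.
os.environ["REQUESTS_CA_BUNDLE"] = bundle
requests.get("https://internal.example.com/")
```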
https://api.github.com/repos/psf/requests/issues/2499
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2499/labels{/name}
https://api.github.com/repos/psf/requests/issues/2499/comments
https://api.github.com/repos/psf/requests/issues/2499/events
https://github.com/psf/requests/issues/2499
62,045,659
MDU6SXNzdWU2MjA0NTY1OQ==
2,499
Passing multiple certs for CA validation
{ "avatar_url": "https://avatars.githubusercontent.com/u/348449?v=4", "events_url": "https://api.github.com/users/nanonyme/events{/privacy}", "followers_url": "https://api.github.com/users/nanonyme/followers", "following_url": "https://api.github.com/users/nanonyme/following{/other_user}", "gists_url": "https://api.github.com/users/nanonyme/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nanonyme", "id": 348449, "login": "nanonyme", "node_id": "MDQ6VXNlcjM0ODQ0OQ==", "organizations_url": "https://api.github.com/users/nanonyme/orgs", "received_events_url": "https://api.github.com/users/nanonyme/received_events", "repos_url": "https://api.github.com/users/nanonyme/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nanonyme/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nanonyme/subscriptions", "type": "User", "url": "https://api.github.com/users/nanonyme", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-16T11:04:42Z
2021-09-08T23:05:53Z
2015-03-16T15:45:03Z
CONTRIBUTOR
resolved
It would be nice if requests allowed passing multiple certs for CA validation. It might, for one reason or another, be unfeasible to merge CA certs into one bundle. In comparison, with stdlib ssl you can call load_verify_locations an arbitrary number of times to load more certificate locations.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2499/reactions" }
https://api.github.com/repos/psf/requests/issues/2499/timeline
null
completed
null
null
false
[ "Requests absolutely does allow that, you just need to concatenate your various certs together into one file that you then pass to `verify`.\n", "Yeah, I was meaning without concatenating your certs together in one file. That's specifically what load_verify_locations called multiple times avoids. That's especially useful on Windows since the system certificate store is not a file but a binary storage accessed through an API.\n", "@nanonyme Currently we can't achieve this: we just don't have the primitives. At the very least we would need shazow/urllib3#507 to get merged before this is even possible.\n\nAdditionally, the stdlib SSL module only allows this for some versions: specifically, those with SSLContext objects. That limits it to 2.7.9 and later, which is problematic.\n\nFinally, of course, this doesn't solve the Windows problem since Windows users can't point OpenSSL at a file _period_, they need another solution.\n\nAltogether, I think this is just too niche a request to satisfy at the moment. It seems like it would make life easier for a tiny minority of users, but that minority would be just as well served by a merge of sshazow/urllib3#507 and then playing with SSLContext objects.\n\nThanks for the suggestion, though! Please keep them coming. =)\n" ]
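Concatenating several PEM files into the single bundle that `verify` expects, as suggested above, can also be done from Python. A sketch with a hypothetical extra CA path:

``` python
import requests

pem_files = [
    requests.certs.where(),      # the bundle shipped with requests
    "/path/to/internal-ca.pem",  # hypothetical additional CA
]

bundle_path = "/tmp/combined-ca-bundle.pem"
with open(bundle_path, "wb") as out:
    for pem in pem_files:
        with open(pem, "rb") as f:
            out.write(f.read())
        out.write(b"\n")

requests.get("https://internal.example.com/", verify=bundle_path)
```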
https://api.github.com/repos/psf/requests/issues/2498
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2498/labels{/name}
https://api.github.com/repos/psf/requests/issues/2498/comments
https://api.github.com/repos/psf/requests/issues/2498/events
https://github.com/psf/requests/issues/2498
61,913,986
MDU6SXNzdWU2MTkxMzk4Ng==
2,498
Cookies + proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/10043592?v=4", "events_url": "https://api.github.com/users/vtenq/events{/privacy}", "followers_url": "https://api.github.com/users/vtenq/followers", "following_url": "https://api.github.com/users/vtenq/following{/other_user}", "gists_url": "https://api.github.com/users/vtenq/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vtenq", "id": 10043592, "login": "vtenq", "node_id": "MDQ6VXNlcjEwMDQzNTky", "organizations_url": "https://api.github.com/users/vtenq/orgs", "received_events_url": "https://api.github.com/users/vtenq/received_events", "repos_url": "https://api.github.com/users/vtenq/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vtenq/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vtenq/subscriptions", "type": "User", "url": "https://api.github.com/users/vtenq", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-15T22:44:13Z
2021-09-08T17:05:40Z
2015-03-15T22:50:39Z
NONE
resolved
I'm sending requests to a site through proxy servers, using a different proxy for each request. I'm not a pro in this sphere, so I'm interested in such things as: - are cookies generated each time I send a request? - can the site (to which I'm sending requests) identify my computer based on cookies or another factor, and find out that this request and the other ones were made specifically from my computer?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2498/reactions" }
https://api.github.com/repos/psf/requests/issues/2498/timeline
null
completed
null
null
false
[ "1. It depends. If you aren't using a [Session object](http://docs.python-requests.org/en/latest/user/advanced/#session-objects), then yes, cookies are generated each time because we have nowhere to store them. If you _are_ using a Session object, then you should re-use the cookies, because cookies are tied to the origin server, not the individual connection.\n2. Yes. That's exactly what cookies are for.\n", "@ttomoday By the way, while I did answer your questions, please note that this issue tracker is a place for reporting bugs and discussing improvements, not for general questions about library usage or about HTTP more widely. I recommend using Stack Overflow to ask further questions of that nature. =)\n", "okay, thanks for your answers\n" ]
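Tying the answer to issue 2498 together: a Session stores cookies and re-sends them on later requests, regardless of which proxy each request goes through. A sketch with placeholder proxy addresses and URL:

``` python
import requests

proxies = [
    {"https": "http://proxy-a.example.com:3128"},
    {"https": "http://proxy-b.example.com:3128"},
]

s = requests.Session()
for proxy in proxies:
    # Cookies set by the server are kept on the session and sent again on
    # the next request, even though the proxy changes each time.
    r = s.get("https://example.com/", proxies=proxy)
    print(r.status_code, dict(s.cookies))
```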
https://api.github.com/repos/psf/requests/issues/2497
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2497/labels{/name}
https://api.github.com/repos/psf/requests/issues/2497/comments
https://api.github.com/repos/psf/requests/issues/2497/events
https://github.com/psf/requests/pull/2497
61,872,088
MDExOlB1bGxSZXF1ZXN0MzEzNzI3NTg=
2,497
Requests security release policy
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" }, { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" }, { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" } ]
closed
true
null
[]
null
14
2015-03-15T17:28:25Z
2021-09-08T08:00:56Z
2015-03-27T11:03:41Z
MEMBER
resolved
In the wake of CVE-2015-2296 we should take another look at our policy for patching, releasing and publicising security vulnerabilities like this one. Although we did a great job of responding quickly and pushing out a new release, we can definitely improve. I've received feedback from a couple of places that wanted to point out ways we could improve, and I'd like to solicit public feedback from anyone who has an interest to ensure that we're doing the best we possibly can with this sort of thing. The goal here is for me to write up a document that explains, step-by-step, our policy with security issues. This will be posted publicly and we'll use it as a reference for any future events. Below is the list of points people have raised: - We probably shouldn't announce these vulnerabilities on weekends. Several people have products and projects that require them to evaluate vulnerabilities as soon as they become aware of them, and forcing those people to work on weekends does not engender positive feelings towards us. - We released and announced before we had a CVE number. This might not have been the best thing for us to do, particularly as it will have made it a bit more difficult for people to keep track of what was going on. If you have any other suggestions, please post them below. I'm going to ping some specific interested parties to ensure they see this: @sigmavirus24 @dstufft @eriol @ralphbean
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2497/reactions" }
https://api.github.com/repos/psf/requests/issues/2497/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2497.diff", "html_url": "https://github.com/psf/requests/pull/2497", "merged_at": "2015-03-27T11:03:41Z", "patch_url": "https://github.com/psf/requests/pull/2497.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2497" }
true
[ "https://alexgaynor.net/2013/oct/19/security-process-open-source-projects/\n", "Some lead time on the changes in 3bd8afbff would have been nice, yes. @alex's recommendations look quite good to me as a 'maximal effort'. If even portions of those suggestions could be adopted, it would be an improvement. Thanks for the effort on this!\n", "I agree with @ralphbean and I too would like to thank your effort on this!\n", "I'm entirely happy to adopt almost all of @alex's suggestions. I'll take a look at writing up a procedure at some point shortly and pass it around for consideration.\n", "FWIW, if you need those in a more prose version, you're welcome to any text\nfrom http://cryptography.readthedocs.org/en/latest/security/\n\nOn Mon, Mar 16, 2015 at 4:12 PM, Cory Benfield [email protected]\nwrote:\n\n> I'm entirely happy to adopt almost all of @alex https://github.com/alex's\n> suggestions. I'll take a look at writing up a procedure at some point\n> shortly and pass it around for consideration.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2497#issuecomment-81910438\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n", "Alright, thanks for your feedback all! Can you please take a look at the diff attached to this issue and confirm that it's to your liking, and that you're happy with this commitment? All feedback valued! \n\nOne particular note: I'm not sure about having a public list of who we notify downstream, but at the same time I worry about losing track of it if we don't record it somewhere. Thoughts there? @sigmavirus24?\n", "It looks good. Here's one thing to consider:\n- Can you privately disclose the patches to your downstream re-distributors before you make it public? That way we can have the time to ensure that the patch applies to X, Y, and Z older releases before hand. This requires agreement and trust that we won't prematurely reveal the patch by pushing it to a public build system or anything like that.\n", "@ralphbean Ah, yes, that was something I intended but clearly failed to make explicit in the document. I'll extend it to include it.\n", "Alright, I've made that update. =)\n", ":+1: \n", "If there are no objections, I'm going to merge this tonight or tomorrow.\n", "I'm not sure it belongs in the docs (or maybe it does, who knows), but we should probably create a template for our announcements as well as a template for requesting CVEs. (Making it easy for any of us to handle this.)\n", "Agreed. Probably best to put it in our docs (it's the open source way).\n", "So more food for thought surrounding vendoring:\n1. When requests itself has a vulnerability found, we should definitely **not** upgrade urllib3 before cutting a release. The release would, ideally, be very minimal.\n2. Given the fact that we vendor urllib3 and chardet and the fact that either could potentially produce a vulnerability as well, we need a way to announce a release that updates the vendored copy and identifies which versions of requests are affected.\n" ]
https://api.github.com/repos/psf/requests/issues/2496
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2496/labels{/name}
https://api.github.com/repos/psf/requests/issues/2496/comments
https://api.github.com/repos/psf/requests/issues/2496/events
https://github.com/psf/requests/pull/2496
61,801,159
MDExOlB1bGxSZXF1ZXN0MzExOTkxNzI=
2,496
Update 2.6.0 changelog with CVE number CVE-2015-2296
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-03-15T11:50:03Z
2021-09-08T08:00:59Z
2015-03-15T11:50:52Z
MEMBER
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2496/reactions" }
https://api.github.com/repos/psf/requests/issues/2496/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2496.diff", "html_url": "https://github.com/psf/requests/pull/2496", "merged_at": "2015-03-15T11:50:52Z", "patch_url": "https://github.com/psf/requests/pull/2496.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2496" }
true
[]
https://api.github.com/repos/psf/requests/issues/2495
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2495/labels{/name}
https://api.github.com/repos/psf/requests/issues/2495/comments
https://api.github.com/repos/psf/requests/issues/2495/events
https://github.com/psf/requests/issues/2495
61,781,459
MDU6SXNzdWU2MTc4MTQ1OQ==
2,495
A true SSLContext object is not available warning in 2.6.0 for python < 2.7.9
{ "avatar_url": "https://avatars.githubusercontent.com/u/484306?v=4", "events_url": "https://api.github.com/users/davecoutts/events{/privacy}", "followers_url": "https://api.github.com/users/davecoutts/followers", "following_url": "https://api.github.com/users/davecoutts/following{/other_user}", "gists_url": "https://api.github.com/users/davecoutts/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davecoutts", "id": 484306, "login": "davecoutts", "node_id": "MDQ6VXNlcjQ4NDMwNg==", "organizations_url": "https://api.github.com/users/davecoutts/orgs", "received_events_url": "https://api.github.com/users/davecoutts/received_events", "repos_url": "https://api.github.com/users/davecoutts/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davecoutts/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davecoutts/subscriptions", "type": "User", "url": "https://api.github.com/users/davecoutts", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-03-15T09:19:56Z
2021-09-08T23:04:41Z
2015-03-15T16:19:40Z
NONE
resolved
Since upgrading from requests 2.5.3 to 2.6.0 I now see 'A true SSLContext object is not available' warnings. The test environment is Python 2.7.6 on Ubuntu 14.04.2 LTS ``` python mkvirtualenv requests260 -i requests==2.6.0 .... (requests260)dave@xxps:~$ python Python 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> resp = requests.get('https://github.com') /home/dave/.virtualenvs/requests260/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:79: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning. InsecurePlatformWarning >>> quit() ``` The [urllib3](https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning) documentation recommends the use of [PyOpenSSL](https://urllib3.readthedocs.org/en/latest/security.html#openssl-pyopenssl) for python versions earlier than 2.7.9. The **pyopenssl.inject_into_urllib3()** call is being made in [requests/**init**.py](https://github.com/kennethreitz/requests/blob/fa338da717124188b21d48df55f2fa9aca459d12/requests/__init__.py#L54) Is the requests 2.6.0 SSLContext warning expected behaviour on python versions earlier than 2.7.9, considering that PyOpenSSL is being used?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2495/reactions" }
https://api.github.com/repos/psf/requests/issues/2495/timeline
null
completed
null
null
false
[ "That warning should not be raised unless we're not using pyOpenSSL. Can you confirm that `pip freeze` lists pyOpenSSL, ndg-httpsclient, and pyasn1?\n", "Oooops! I had convinced myself that pyOpenSSL was installed, but it wasn't.\nIf I install **ndg-httpsclient** along with **requests** I no longer see the warnings.\n\n``` python\nmkvirtualenv requests260_ndghttpsclient -i requests==2.6.0 -i ndg-httpsclient\n...\n(requests260_ndghttpsclient)dave@xxps:~$ pip freeze\ncffi==0.9.2\ncryptography==0.8\nenum34==1.0.4\nndg-httpsclient==0.3.3\npyasn1==0.1.7\npycparser==2.10\npyOpenSSL==0.14\nrequests==2.6.0\nsix==1.9.0\n```\n\nDoing a very crude timing test it appears that the addition of pyOpenSSL adds about half a second to the import process. Which pretty much lines up with this [ticket](https://github.com/pyca/pyopenssl/issues/137) for pyOpenSSL\n\nThe timing below is run on distinct virtualenvs with the following modules installed requests==2.5.3, requests==2.6.0, requests==2.6.0 + ndg-httpsclient==0.3.3.\n\n``` python\ndave@xxps:~$ time /home/dave/.virtualenvs/requests253/bin/python -c \"import requests; resp = requests.get('https://github.com')\"\n\nreal 0m0.516s\nuser 0m0.089s\nsys 0m0.022s\n\ndave@xxps:~$ time /home/dave/.virtualenvs/requests260/bin/python -c \"import requests; requests.packages.urllib3.disable_warnings(); resp = requests.get('https://github.com')\"\n\nreal 0m0.515s\nuser 0m0.106s\nsys 0m0.023s\n\ndave@xxps:~$ time /home/dave/.virtualenvs/requests260_ndghttpsclient/bin/python -c \"import requests; resp = requests.get('https://github.com')\"\n\nreal 0m1.134s\nuser 0m0.722s\nsys 0m0.037s\n```\n\nFrom what I have read there aren't currently any plans to backport python 2.7.9 to the current Ubuntu Long Term Support version, Ubuntu 14.04.\n\nI guess Ubuntu 14.04 users have the following choices,\n- Install **ndg-httpsclient** and its dependencies along with **requests**, and take the import performance hit.\n- Disable all urllib3 warnings with **requests.packages.urllib3.disable_warnings()**\n- Stick with requests<=2.5.3\n\nWould it be beneficial to state a recommendation to install ndg-httpsclient for python versions less than 2.7.9 in the [install](http://docs.python-requests.org/en/latest/user/install/#install) section of the documentation ? Or have the install process issue a warning that ndg-httpsclient should be installed when python < 2.7.9 is detected ?\n", "I'm not sure. Our hope is that this warning is enough (and it did lead you down the right track!), but it's possible that we'll still find this to be a problem.\n\nFor the moment, I'm inclined to say that we're fine as we are, but that we should remain aware of the possibility that we need to be even more explicit.\n", "Thanks for the clarifications on import performance. Since this is a likely landing spot for people searching for terms like InsecurePlatformWarning requests, perhaps someone could link to or describe the security implications of ignoring the warnings by not using PyOpenSSL, ndg-httpsclient etc on pre-python 2.7.9 systems, so we have a better sense for how to evaluate the options. Of course I know it varies by what people are using requests for, but even a worst-case scenario would be helpful.\n" ]
https://api.github.com/repos/psf/requests/issues/2494
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2494/labels{/name}
https://api.github.com/repos/psf/requests/issues/2494/comments
https://api.github.com/repos/psf/requests/issues/2494/events
https://github.com/psf/requests/pull/2494
61,762,998
MDExOlB1bGxSZXF1ZXN0MzExOTYzNDk=
2,494
"A CVE identifier"
{ "avatar_url": "https://avatars.githubusercontent.com/u/1937160?v=4", "events_url": "https://api.github.com/users/Ayrx/events{/privacy}", "followers_url": "https://api.github.com/users/Ayrx/followers", "following_url": "https://api.github.com/users/Ayrx/following{/other_user}", "gists_url": "https://api.github.com/users/Ayrx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Ayrx", "id": 1937160, "login": "Ayrx", "node_id": "MDQ6VXNlcjE5MzcxNjA=", "organizations_url": "https://api.github.com/users/Ayrx/orgs", "received_events_url": "https://api.github.com/users/Ayrx/received_events", "repos_url": "https://api.github.com/users/Ayrx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Ayrx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Ayrx/subscriptions", "type": "User", "url": "https://api.github.com/users/Ayrx", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-15T06:47:28Z
2021-09-08T08:00:59Z
2015-03-15T11:50:47Z
NONE
resolved
I believe this is the more correct grammar?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2494/reactions" }
https://api.github.com/repos/psf/requests/issues/2494/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2494.diff", "html_url": "https://github.com/psf/requests/pull/2494", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2494.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2494" }
true
[ "I believe it is too, however a CVE has now been assigned for this, so I'm going to update the changelog to refer to it instead. Thanks for spotting it though!\n" ]
https://api.github.com/repos/psf/requests/issues/2493
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2493/labels{/name}
https://api.github.com/repos/psf/requests/issues/2493/comments
https://api.github.com/repos/psf/requests/issues/2493/events
https://github.com/psf/requests/pull/2493
61,687,095
MDExOlB1bGxSZXF1ZXN0MzExODkxOTg=
2,493
made the quickstart more reader friendly
{ "avatar_url": "https://avatars.githubusercontent.com/u/3696393?v=4", "events_url": "https://api.github.com/users/yasoob/events{/privacy}", "followers_url": "https://api.github.com/users/yasoob/followers", "following_url": "https://api.github.com/users/yasoob/following{/other_user}", "gists_url": "https://api.github.com/users/yasoob/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yasoob", "id": 3696393, "login": "yasoob", "node_id": "MDQ6VXNlcjM2OTYzOTM=", "organizations_url": "https://api.github.com/users/yasoob/orgs", "received_events_url": "https://api.github.com/users/yasoob/received_events", "repos_url": "https://api.github.com/users/yasoob/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yasoob/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yasoob/subscriptions", "type": "User", "url": "https://api.github.com/users/yasoob", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-03-14T20:02:40Z
2021-09-08T08:00:59Z
2015-03-15T18:39:54Z
CONTRIBUTOR
resolved
Closes #2418
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2493/reactions" }
https://api.github.com/repos/psf/requests/issues/2493/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2493.diff", "html_url": "https://github.com/psf/requests/pull/2493", "merged_at": "2015-03-15T18:39:54Z", "patch_url": "https://github.com/psf/requests/pull/2493.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2493" }
true
[ "Can we choose a better header to set? Or a better value? I can't see how a long user-agent string will be reader friendly. Some people will now start asking why all that information is necessary and what it means and why it's important. Why not use `headers={'Connection': 'close'}`?\n", "Or just shorten the user-agent. `User-Agent: my-app/0.0.1` should do just fine.\n", "Made the string shorter.\n", "Hurrah! :cake:\n", "Proud of my first contribution to requests :+1: Here's a :cake: for everyone :smile: \n", "@yasoob Feel free to open a pull request adding yourself to AUTHORS.rst!\n" ]
https://api.github.com/repos/psf/requests/issues/2492
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2492/labels{/name}
https://api.github.com/repos/psf/requests/issues/2492/comments
https://api.github.com/repos/psf/requests/issues/2492/events
https://github.com/psf/requests/pull/2492
61,641,136
MDExOlB1bGxSZXF1ZXN0MzExODYwNjQ=
2,492
Bump version and add release notes for 2.6.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-03-14T16:42:50Z
2021-09-08T08:01:00Z
2015-03-14T16:43:32Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2492/reactions" }
https://api.github.com/repos/psf/requests/issues/2492/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2492.diff", "html_url": "https://github.com/psf/requests/pull/2492", "merged_at": "2015-03-14T16:43:32Z", "patch_url": "https://github.com/psf/requests/pull/2492.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2492" }
true
[]
https://api.github.com/repos/psf/requests/issues/2491
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2491/labels{/name}
https://api.github.com/repos/psf/requests/issues/2491/comments
https://api.github.com/repos/psf/requests/issues/2491/events
https://github.com/psf/requests/issues/2491
61,610,032
MDU6SXNzdWU2MTYxMDAzMg==
2,491
Allow some way to implement monitor callbacks for files in POST requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/1429103?v=4", "events_url": "https://api.github.com/users/cryzed/events{/privacy}", "followers_url": "https://api.github.com/users/cryzed/followers", "following_url": "https://api.github.com/users/cryzed/following{/other_user}", "gists_url": "https://api.github.com/users/cryzed/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cryzed", "id": 1429103, "login": "cryzed", "node_id": "MDQ6VXNlcjE0MjkxMDM=", "organizations_url": "https://api.github.com/users/cryzed/orgs", "received_events_url": "https://api.github.com/users/cryzed/received_events", "repos_url": "https://api.github.com/users/cryzed/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cryzed/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cryzed/subscriptions", "type": "User", "url": "https://api.github.com/users/cryzed", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-14T13:42:53Z
2021-09-08T23:05:54Z
2015-03-14T13:56:07Z
NONE
resolved
I am currently developing a small tool to download and upload files. I am using the requests and progressbar libraries, among others. During downloading and uploading of files I want to display the progress and speed of the upload. This works very well with the download, I simply set stream=True in my request and then use iter_content on the Response object with some chunk_size and report the total loaded bytes back to the progressbar. As far as I can tell this is impossible with files sent via POST requests. I am trying to upload a single file by passing a dict as "files" to the POST request and would like to somehow implement a monitor callback that would either report how many bytes of that specific file have so far been uploaded, or how many bytes of the entire request have been uploaded -- either way works. I checked the requests internals, and the magic seems to happen in: request.models.RequestEncodingMixin._encode_files. The problem is that in the last few lines of that function, the data always seems to be read into memory all at once, making it impossible to decorate or implement the read method of the file-like object in such a way that would allow monitoring the current upload further down the line. Is there some kind of way to do what I am trying to do, and if there isn't yet, would you consider adding it? Maybe this would fit neatly into the hooks-system which doesn't seem to have that many features as of yet... EDIT: The [requests-toolbelt](https://toolbelt.readthedocs.org/en/0.3.0/user.html#monitoring-your-streaming-upload) seems to have the solution.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1429103?v=4", "events_url": "https://api.github.com/users/cryzed/events{/privacy}", "followers_url": "https://api.github.com/users/cryzed/followers", "following_url": "https://api.github.com/users/cryzed/following{/other_user}", "gists_url": "https://api.github.com/users/cryzed/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cryzed", "id": 1429103, "login": "cryzed", "node_id": "MDQ6VXNlcjE0MjkxMDM=", "organizations_url": "https://api.github.com/users/cryzed/orgs", "received_events_url": "https://api.github.com/users/cryzed/received_events", "repos_url": "https://api.github.com/users/cryzed/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cryzed/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cryzed/subscriptions", "type": "User", "url": "https://api.github.com/users/cryzed", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2491/reactions" }
https://api.github.com/repos/psf/requests/issues/2491/timeline
null
completed
null
null
false
[ "For others looking at this, we discussed this on IRC and worked out the correct solution using the [requests-toolbelt](/sigmavirus24/requests-toolbelt).\n" ]
https://api.github.com/repos/psf/requests/issues/2490
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2490/labels{/name}
https://api.github.com/repos/psf/requests/issues/2490/comments
https://api.github.com/repos/psf/requests/issues/2490/events
https://github.com/psf/requests/issues/2490
61,575,936
MDU6SXNzdWU2MTU3NTkzNg==
2,490
411 response results in exception in Session.send
{ "avatar_url": "https://avatars.githubusercontent.com/u/3715287?v=4", "events_url": "https://api.github.com/users/ben-crowhurst/events{/privacy}", "followers_url": "https://api.github.com/users/ben-crowhurst/followers", "following_url": "https://api.github.com/users/ben-crowhurst/following{/other_user}", "gists_url": "https://api.github.com/users/ben-crowhurst/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ben-crowhurst", "id": 3715287, "login": "ben-crowhurst", "node_id": "MDQ6VXNlcjM3MTUyODc=", "organizations_url": "https://api.github.com/users/ben-crowhurst/orgs", "received_events_url": "https://api.github.com/users/ben-crowhurst/received_events", "repos_url": "https://api.github.com/users/ben-crowhurst/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ben-crowhurst/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ben-crowhurst/subscriptions", "type": "User", "url": "https://api.github.com/users/ben-crowhurst", "user_view_type": "public" }
[]
closed
true
null
[]
null
14
2015-03-14T11:59:39Z
2022-02-26T04:00:39Z
2021-11-28T03:02:39Z
NONE
resolved
Full details can be found over at [StackOverflow](http://stackoverflow.com/questions/29030398/411-response-results-in-exception-in-session-send)
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2490/reactions" }
https://api.github.com/repos/psf/requests/issues/2490/timeline
null
completed
null
null
false
[ "So this is probably a stupid question, but does that server actually work? As you discussed with @mjpieters on the question, this doesn't happen with our typical test server. Is there a server we can actually reproduce this with?\n", "The application server works with Httplib2, cURL command line tool and browsers (Safari/Chrome).\n", "I can see this happening in a number of cases. What I think we'd want to test is whether we can attempt to read from the socket _after_ it threw an exception on the send. Put another way, if we call `HTTPConnection.send()` and it throws a socket exception, will `HTTPConnection.get_response()` work well (i.e. no blocking?).\n", "> The application server works with Httplib2, cURL command line tool and browsers (Safari/Chrome).\n\nI understand that it doesn't work with requests. Could you share (even privately, since our emails are publicly accessible) what server it is that you're experiencing problems with so we can attempt to debug this further?\n", "Sorry, I'm afraid the client won't allow that to happen :(. I can inform you that placing a sleep(1) after asio::write( response ) within the server code allows Requests to do its job accurately.\n\n**Server**\n1. receive request\n2. parse http headers\n3. search for content-length\n 3.1 if not found return 411\n4. parse http body\n", "So let's be clear, the problem is that the `socket.send()` throws an exception because the socket is no longer open for writing. What matters is whether we can still coerce httplib into providing us the response in that case without blocking (in case no such response was made).\n", "With that detail, we can definitely write a socket-level test. Thank you @ben-crowhurst \n", "@sigmavirus24 FYI, if you're planning to take this, I think the fix will actually go into urllib3 (or maybe even httplib, if it turns out httplib can't handle it).\n", "Yeah. \"socket-level test\" was my hint at you that this would be in urllib3 ;)\n", "So some quick local tests suggest that for EPIPE we're ok. Running this 'server':\n\n``` python\nimport socket\n\ns = socket.socket()\ns.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)\ns.bind(('', 8080))\ns.listen(0)\n\nwhile True:\n c = s.accept()[0]\n c.send('HTTP/1.1 411 Length Needed\\r\\nContent-Length: 0\\r\\n\\r\\n')\n c.close()\n```\n\nWe can achieve the following in `httplib`:\n\n``` python\n>>> import httplib\n>>> c = httplib.HTTPConnection('localhost', 8080)\n>>> c.putrequest('GET', '/'); c.endheaders(); c.send('some data');\n>>> c.send('some data')\n>>> c.send('some data')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 826, in send\n self.sock.sendall(data)\n File \"/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py\", line 224, in meth\n return getattr(self._sock,name)(*args)\nsocket.error: [Errno 32] Broken pipe\n>>> r = c.getresponse()\n>>> r.status\n411\n>>> r.read()\n''\n```\n\nSo an EPIPE has the potential to give us a response when there is one.\n\nIf there _isn't_ a response, it seems that instead we get a `BadStatusLine` error, which is exactly what I'd suspect. 
So I think we can bully httplib into doing what we need here.\n", "Having looked at the HTTP Adapter I'm suggesting the following alteration?\n\n```\n except (ProtocolError, socket.error) as err:\n raise ConnectionError(err, request=request, response=response)\n```\n\nThis would stop making a behavioural change to the API but allow those interested to respond accordingly.\n", "I don't think HTTP Adapter is low enough down. I think urllib3 needs to know what's going on here.\n", "@Lukasa you're correct.\n", "Closing this as it was never a concern for Requests." ]
https://api.github.com/repos/psf/requests/issues/2489
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2489/labels{/name}
https://api.github.com/repos/psf/requests/issues/2489/comments
https://api.github.com/repos/psf/requests/issues/2489/events
https://github.com/psf/requests/pull/2489
61,569,592
MDExOlB1bGxSZXF1ZXN0MzExODIxODQ=
2,489
Don't label cookies for the target domain.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-03-14T11:31:11Z
2021-09-08T08:01:01Z
2015-03-14T12:05:48Z
MEMBER
resolved
Per a discussion @sigmavirus24 and I had on the mailing list, this change ensures that we correctly record cookie properties based on the original request, rather than the mutated version.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2489/reactions" }
https://api.github.com/repos/psf/requests/issues/2489/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2489.diff", "html_url": "https://github.com/psf/requests/pull/2489", "merged_at": "2015-03-14T12:05:48Z", "patch_url": "https://github.com/psf/requests/pull/2489.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2489" }
true
[]
https://api.github.com/repos/psf/requests/issues/2488
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2488/labels{/name}
https://api.github.com/repos/psf/requests/issues/2488/comments
https://api.github.com/repos/psf/requests/issues/2488/events
https://github.com/psf/requests/issues/2488
60,943,619
MDU6SXNzdWU2MDk0MzYxOQ==
2,488
Need documentation about "Cookies set on individual Requests through a `Session` are not persisted to the session"
{ "avatar_url": "https://avatars.githubusercontent.com/u/1499555?v=4", "events_url": "https://api.github.com/users/colinfang/events{/privacy}", "followers_url": "https://api.github.com/users/colinfang/followers", "following_url": "https://api.github.com/users/colinfang/following{/other_user}", "gists_url": "https://api.github.com/users/colinfang/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/colinfang", "id": 1499555, "login": "colinfang", "node_id": "MDQ6VXNlcjE0OTk1NTU=", "organizations_url": "https://api.github.com/users/colinfang/orgs", "received_events_url": "https://api.github.com/users/colinfang/received_events", "repos_url": "https://api.github.com/users/colinfang/repos", "site_admin": false, "starred_url": "https://api.github.com/users/colinfang/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/colinfang/subscriptions", "type": "User", "url": "https://api.github.com/users/colinfang", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" }, { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" } ]
closed
true
null
[]
null
2
2015-03-13T00:16:32Z
2021-09-08T23:00:42Z
2015-08-08T01:18:21Z
NONE
resolved
It took me a while to figure it out. I thought it might be a bug initially, then I came across #1791 & #1728. Since this feature is by design quite confusing, I feel it deserves some more attention beyond simply mentioning it in the changelog (0b68037). Can we have it as a tip/hint somewhere in the documentation (or quick start), and suggest the proper way to prepare persistent cookies?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2488/reactions" }
https://api.github.com/repos/psf/requests/issues/2488/timeline
null
completed
null
null
false
[ "Sorry, it's late here and I'm tired. Could you give an example of what you mean?\n", "So I think this is a good idea. It's not entirely clear that we don't send the custom cookies in the second request here:\n\n``` python\ns = Session()\ns.get('http://myhost.com/page1', cookies={'from-my': 'browser'})\ns.get('http://myhost.com/page2')\n```\n\nI'll happily merge any pull request that adds documentation for this. =)\n" ]
https://api.github.com/repos/psf/requests/issues/2487
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2487/labels{/name}
https://api.github.com/repos/psf/requests/issues/2487/comments
https://api.github.com/repos/psf/requests/issues/2487/events
https://github.com/psf/requests/pull/2487
60,814,487
MDExOlB1bGxSZXF1ZXN0MzEwMzI4NTQ=
2,487
Add return type field to entry-point docstrings
{ "avatar_url": "https://avatars.githubusercontent.com/u/55078?v=4", "events_url": "https://api.github.com/users/ulope/events{/privacy}", "followers_url": "https://api.github.com/users/ulope/followers", "following_url": "https://api.github.com/users/ulope/following{/other_user}", "gists_url": "https://api.github.com/users/ulope/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ulope", "id": 55078, "login": "ulope", "node_id": "MDQ6VXNlcjU1MDc4", "organizations_url": "https://api.github.com/users/ulope/orgs", "received_events_url": "https://api.github.com/users/ulope/received_events", "repos_url": "https://api.github.com/users/ulope/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ulope/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ulope/subscriptions", "type": "User", "url": "https://api.github.com/users/ulope", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-12T12:56:10Z
2021-09-08T08:01:02Z
2015-03-14T11:19:12Z
CONTRIBUTOR
resolved
Fixes: #2483
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2487/reactions" }
https://api.github.com/repos/psf/requests/issues/2487/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2487.diff", "html_url": "https://github.com/psf/requests/pull/2487", "merged_at": "2015-03-14T11:19:12Z", "patch_url": "https://github.com/psf/requests/pull/2487.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2487" }
true
[ ":cake: :star: :cookie: Thanks for this @ulope! Want to add yourself to AUTHORS.rst while we're here?\n", "Oh, yeah, sure. I'll push a new version momentarily.\n", "Beautiful! :cookie: :sparkles: :cake:\n\nThanks so much @ulope!\n" ]
https://api.github.com/repos/psf/requests/issues/2486
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2486/labels{/name}
https://api.github.com/repos/psf/requests/issues/2486/comments
https://api.github.com/repos/psf/requests/issues/2486/events
https://github.com/psf/requests/pull/2486
60,759,840
MDExOlB1bGxSZXF1ZXN0MzEwMDI2MzI=
2,486
Extract version from requests/__init__.py instead of importing requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
17
2015-03-12T02:02:12Z
2021-09-08T08:01:00Z
2015-03-14T11:20:33Z
CONTRIBUTOR
resolved
This works around a bug in the way setuptools' `test` command works when initially run. Closes #2462
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2486/reactions" }
https://api.github.com/repos/psf/requests/issues/2486/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2486.diff", "html_url": "https://github.com/psf/requests/pull/2486", "merged_at": "2015-03-14T11:20:33Z", "patch_url": "https://github.com/psf/requests/pull/2486.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2486" }
true
[ "Do we believe this only affects sys?\n", "## It's the only import in init and I'm hoping @wardi can help make sure this is correct.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "looks like it would work, but reproducing the error is a pain. I don't yet know why it only failed on travis on certain python versions. Does some package need to be importable for it to get to this code? What special conditions are required?\n\nI don't like magic importing and monkeypatching code in utility libraries, especially when it makes me chase phantom bugs like this. (I'm not even using SSL let alone SNI!)\n\nMaybe I can reproduce by referencing this commit id in my install_requires and poking travis.\n", "Seems I can't reference a commit id in install_requires, so this is going to be tricky to test unless you push it up to pypi under a different name, or help me figure out what environment will trigger the bug. https://github.com/ckan/ckanapi/pull/39\n", "Maybe @p-p-m can share more details about how to reproduce this.\n", "How to reproduce bug:\n1. Clone nodeconductor(https://github.com/opennode/nodeconductor).\n2. In `setup.py` change `requests<=2.5.1` to `requests==2.5.3`\n3. Run python `setup.py` install\n", "@p-p-m right, so the only way to reproduce is during install from an install_requires. I've got that too, I'll just have to push this commit up to pypi under a new project name and test that way.\n", "@wardi with this info I have an idea how to test this without making a new package on PyPI. Give me a couple hours to experiment. Thanks.\n", "So here's something that's interesting. I followed @p-p-m's instructions and had 0 problems. (Granted I've done it once so this may be non-deterministic behaviour and may require repetition.)\n\nThat said, I tried it on setuptools 3.6. I would hope that Travis is on an even more recent version.\n", "That gave me an idea. I added a pip-freeze to my travis run just before the install https://travis-ci.org/wardi/ckanapi/builds/54313804\n\nI think this rules out versions of installed python packages, unless there is something behaving differently based on some other things installed on the system. Still don't know how to reproduce this except in travis...\n\nPython 2.6 (passing, probably because I've pre-installed requests in this case):\n\n```\nargparse==1.3.0\nckanapi==3.5.dev0\ndocopt==0.6.2\nlinecache2==1.0.0\nmock==1.0.1\nnose==1.3.4\nnumpy==1.9.1\npy==1.4.26\npytest==2.6.4\nrequests==2.5.3\nsimplejson==3.6.5\nsix==1.9.0\ntraceback2==1.4.0\nunittest2==1.0.1\nwheel==0.24.0\n```\n\npython2.7 (failing)\n\n```\nmock==1.0.1\nnose==1.3.4\nnumpy==1.9.1\npy==1.4.26\npytest==2.6.4\nwheel==0.24.0\n```\n\npython 3.2 (failing)\n\n```\nmock==1.0.1\nnose==1.3.4\nnumpy==1.9.1\npy==1.4.26\npytest==2.6.4\nwheel==0.24.0\n```\n\npython 3.3 (passing, maybe a change in behavior in 3.3+ that avoids the issue?)\n\n```\nmock==1.0.1\nnose==1.3.4\nnumpy==1.9.1\npy==1.4.26\npytest==2.6.4\nwheel==0.24.0\n```\n\npypy (passing)\n\n```\ncffi==0.8.6\ngreenlet==0.4.5\nmock==1.0.1\nnose==1.3.4\npy==1.4.26\npytest==2.6.4\nreadline==6.2.4.1\nwheel==0.24.0\n```\n", "So using the ckanapi project, I'm able to reproduce this reliably. \n\nIf I do\n\n```\npython setup.py test\n```\n\nAfter a fresh clone, I see the behaviour described in the bug. Interestingly, if I rerun `python setup.py test` again, the tests run and pass. 
If I remove `.eggs` and run `python setup.py test` then it fails again in the same way.\n", "So if we look at the traceback we see:\n\n```\n self.with_project_on_sys_path(self.run_tests)\n```\n\nWhich contains [this code](https://bitbucket.org/pypa/setuptools/src/18eceee63709bc70c06b51a609f3dbd2eab07ef4/setuptools/command/test.py?at=default#cl-117). Which also messes with `sys.path` and `sys.modules`. Oh and on Py3, setuptools will [delete modules](https://bitbucket.org/pypa/setuptools/src/18eceee63709bc70c06b51a609f3dbd2eab07ef4/setuptools/command/test.py?at=default#cl-152) when running tests. (Which I don't think is at fault here, just amusing.)\n", "So after using the version of requests that would result from this PR, we get the following with `python setup.py test`:\n\n```\n requests/packages/__init__.py\", line 74, in load_module\nTypeError: isinstance() arg 2 must be a class, type, or tuple of classes and types\n```\n\nThe line in question is\n\n``` py\nif not isinstance(m, VendorAlias)\n```\n\nInside the list comprehension where we redefine `sys.metapath`. In otherwords, `VendorAlias` is now `None` as well. What's baffling to me is why this works the second time we run it.\n\nTo note, I put a pdb trace into setuptools' test command at line 161. The first time through sys.metapath has our `VendorAlias` instance on it. The second time through, it's empty. So what I think is happening is the following:\n\nWhen it's run the first time, requests is imported by setuptools as part of the install. And I think the bug is actually in our `setup.py`.\n\nIn our `setup.py` we import `requests` so that we can find the version number. I'm going to try something else real quick, but I think we've found the real problem.\n", "Bingo. That's it. I'll push up a new fix ASAP.\n", "LGTM. :cake:\n", "Ah, this looks good. I was hunting down the root cause for this bug - thanks for the fix!\nWhen can we expect to see 2.5.4 on pypi?\n", "@johnnykv 2.6.0 was released earlier today (2015 March 14)\n" ]
https://api.github.com/repos/psf/requests/issues/2485
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2485/labels{/name}
https://api.github.com/repos/psf/requests/issues/2485/comments
https://api.github.com/repos/psf/requests/issues/2485/events
https://github.com/psf/requests/pull/2485
60,759,374
MDExOlB1bGxSZXF1ZXN0MzEwMDIzNzQ=
2,485
Import urllib3's Retry location from the right place
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
2
2015-03-12T01:55:12Z
2021-09-08T08:01:02Z
2015-03-12T06:50:33Z
CONTRIBUTOR
resolved
Importing from urllib3's top-level location causes the namespace to be urllib3.util.retry.Retry instead of requests.packages.urllib3.util.retry.Retry. Without this fix, anyone using requests with an un-vendored version of urllib3 will break when urllib3's retry handling kicks in. Closes shazow/urllib3#567
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2485/reactions" }
https://api.github.com/repos/psf/requests/issues/2485/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2485.diff", "html_url": "https://github.com/psf/requests/pull/2485", "merged_at": "2015-03-12T06:50:33Z", "patch_url": "https://github.com/psf/requests/pull/2485.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2485" }
true
[ "@Lukasa feel free to merge this so we can release 2.5.5 tomorrow.\n", ":cake:\n" ]
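For context on why the vendored import path in #2485 matters to users, here is a hedged sketch of configuring urllib3's `Retry` on a requests session through that path; the retry counts, backoff factor, and example URL are illustrative assumptions, not part of the original pull request.

``` python
# Sketch: configure retries via the vendored Retry class (example values assumed).
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry

session = requests.Session()
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])

# Mount adapters whose retry policy is the vendored Retry instance.
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

response = session.get('https://httpbin.org/status/200')
print(response.status_code)
```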
https://api.github.com/repos/psf/requests/issues/2484
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2484/labels{/name}
https://api.github.com/repos/psf/requests/issues/2484/comments
https://api.github.com/repos/psf/requests/issues/2484/events
https://github.com/psf/requests/pull/2484
60,654,228
MDExOlB1bGxSZXF1ZXN0MzA5NDAzMTk=
2,484
Sanitise verb string. Now we can use unicode strings in http methods.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1284569?v=4", "events_url": "https://api.github.com/users/pawlyk/events{/privacy}", "followers_url": "https://api.github.com/users/pawlyk/followers", "following_url": "https://api.github.com/users/pawlyk/following{/other_user}", "gists_url": "https://api.github.com/users/pawlyk/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pawlyk", "id": 1284569, "login": "pawlyk", "node_id": "MDQ6VXNlcjEyODQ1Njk=", "organizations_url": "https://api.github.com/users/pawlyk/orgs", "received_events_url": "https://api.github.com/users/pawlyk/received_events", "repos_url": "https://api.github.com/users/pawlyk/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pawlyk/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pawlyk/subscriptions", "type": "User", "url": "https://api.github.com/users/pawlyk", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
null
[]
null
3
2015-03-11T12:58:46Z
2021-09-08T08:01:04Z
2015-03-11T14:25:05Z
NONE
resolved
There is an error (`UnicodeDecodeError: 'ascii' codec can't decode...`) when using a unicode string as the HTTP verb in a Request instance.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2484/reactions" }
https://api.github.com/repos/psf/requests/issues/2484/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2484.diff", "html_url": "https://github.com/psf/requests/pull/2484", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2484.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2484" }
true
[ "Interesting.\n\nWhy do you want to do this? RFC 7230 is pretty unambiguous about what's allowed in a HTTP METHOD: namely, `token`. The ABNF defines `token` as:\n\n```\n token = 1*tchar\n\n tchar = \"!\" / \"#\" / \"$\" / \"%\" / \"&\" / \"'\" / \"*\"\n / \"+\" / \"-\" / \".\" / \"^\" / \"_\" / \"`\" / \"|\" / \"~\"\n / DIGIT / ALPHA\n ; any VCHAR, except delimiters\n\n DIGIT = %x30-39\n ; 0-9\n\n ALPHA = %x41-5A / %x61-7A ; A-Z / a-z\n\n VCHAR = %x21-7E\n ; visible (printing) characters\n```\n\nThat would seem to guarantee that any valid method is necessarily representable in ASCII. So what's the rationale for allowing non-ASCII methods?\n", "I agree with you about RFC 7230. But documentation does't have mention that http verbs must be in ASCII, there are only list of allowed verbs. Maybe better clarify this part.\n", "That seems like a reasonable decision to me. I'll happily accept any pull request that updates the documentation (probably by updating the docstrings in `requests.request`, `requests.models.Request`, and `session.request`).\n" ]
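To illustrate the RFC 7230 `token` rule quoted in the discussion above, here is a hedged sketch that checks whether a candidate HTTP method is a valid token; the regex is my own transcription of the ABNF and is not code from requests.

``` python
# Sketch: validate an HTTP method against RFC 7230's token rule (tchar / DIGIT / ALPHA).
import re

# tchar = "!" / "#" / "$" / "%" / "&" / "'" / "*" / "+" / "-" / "." /
#         "^" / "_" / "`" / "|" / "~" / DIGIT / ALPHA
TOKEN_RE = re.compile(r"^[!#$%&'*+\-.^_`|~0-9A-Za-z]+$")

def is_valid_method(method):
    """Return True if *method* is a syntactically valid HTTP method token."""
    return bool(TOKEN_RE.match(method))

for candidate in ("GET", "PATCH", "M-SEARCH", u"G\u00c9T", "GET POST"):
    print("{0!r}: {1}".format(candidate, is_valid_method(candidate)))
```

Any method that is a valid token is necessarily representable in ASCII, which is the point made in the review above.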
https://api.github.com/repos/psf/requests/issues/2483
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2483/labels{/name}
https://api.github.com/repos/psf/requests/issues/2483/comments
https://api.github.com/repos/psf/requests/issues/2483/events
https://github.com/psf/requests/issues/2483
60,638,350
MDU6SXNzdWU2MDYzODM1MA==
2,483
Consider adding return type documentation in docstrings
{ "avatar_url": "https://avatars.githubusercontent.com/u/55078?v=4", "events_url": "https://api.github.com/users/ulope/events{/privacy}", "followers_url": "https://api.github.com/users/ulope/followers", "following_url": "https://api.github.com/users/ulope/following{/other_user}", "gists_url": "https://api.github.com/users/ulope/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ulope", "id": 55078, "login": "ulope", "node_id": "MDQ6VXNlcjU1MDc4", "organizations_url": "https://api.github.com/users/ulope/orgs", "received_events_url": "https://api.github.com/users/ulope/received_events", "repos_url": "https://api.github.com/users/ulope/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ulope/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ulope/subscriptions", "type": "User", "url": "https://api.github.com/users/ulope", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" }, { "color": "5319e7", "default": false, "description": null, "id": 67760318, "name": "Fixed", "node_id": "MDU6TGFiZWw2Nzc2MDMxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Fixed" }, { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" } ]
closed
true
null
[]
null
4
2015-03-11T10:32:21Z
2021-09-08T23:05:49Z
2015-04-06T02:42:35Z
CONTRIBUTOR
resolved
It would be great if the docstrings of (at least) the top level entry points (`request`, `get`, `post`, etc.) could define the return type in a sphinx compatible way (`:rtype:` http://sphinx-doc.org/domains.html#info-field-lists). This would make it easier to work with requests in various IDEs and code inspection tools.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2483/reactions" }
https://api.github.com/repos/psf/requests/issues/2483/timeline
null
completed
null
null
false
[ "Sounds good to me!\n\nDo you want to open a pull request containing this change?\n", "Sure. I'll try to fit it in later in the day.\n", ":+1: \n", "Looks like this was fixed by #2487. Closing as fixed.\n" ]
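As an illustration of the Sphinx field lists requested in #2483, here is a hedged sketch of a hypothetical wrapper function whose docstring declares parameter and return types with `:param:` and `:rtype:`; only the documentation convention mirrors the request, the function itself is not from requests.

``` python
# Sketch: a hypothetical helper documented with Sphinx :param:/:rtype: fields.
import requests

def fetch(url, timeout=10):
    """Fetch *url* and return the HTTP response.

    :param str url: Absolute URL to request.
    :param float timeout: Seconds to wait before giving up.
    :rtype: requests.Response
    """
    return requests.get(url, timeout=timeout)

response = fetch('https://httpbin.org/get')
print(type(response))
```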
https://api.github.com/repos/psf/requests/issues/2482
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2482/labels{/name}
https://api.github.com/repos/psf/requests/issues/2482/comments
https://api.github.com/repos/psf/requests/issues/2482/events
https://github.com/psf/requests/pull/2482
60,592,489
MDExOlB1bGxSZXF1ZXN0MzA5MDY1MDM=
2,482
Update urllib3 to 43b5b2b452e4344374de7d08ececcca495079b8d
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-11T00:34:50Z
2021-09-08T08:01:03Z
2015-03-11T14:52:11Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2482/reactions" }
https://api.github.com/repos/psf/requests/issues/2482/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2482.diff", "html_url": "https://github.com/psf/requests/pull/2482", "merged_at": "2015-03-11T14:52:11Z", "patch_url": "https://github.com/psf/requests/pull/2482.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2482" }
true
[ "```\n(requests)~/s/requests git:update-urllib3 ❯❯❯ py.test ✭\n==================================================================================================================== test session starts =====================================================================================================================\nplatform darwin -- Python 2.7.6 -- pytest-2.3.4\ncollected 156 items\n\ntest_requests.py ............................................................................................................................................................\n\n================================================================================================================ 156 passed in 24.73 seconds =================================================================================================================\n(requests)~/s/requests git:update-urllib3 ❯❯❯ python ✭\nPython 2.7.6 (default, Sep 9 2014, 15:04:36)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> r = requests.get('https://httpbin.org/response-headers', params=[('WWW-Authenticate', 'digest'), ('WWW-Authenticate', 'basic')])\nrequests/packages/urllib3/util/ssl_.py:79: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r.json()\n{u'Content-Length': u'126', u'Content-Type': u'application/json', u'WWW-Authenticate': [u'digest', u'basic']}\n>>> r.headers\n{'content-length': '126', 'server': 'nginx', 'connection': 'keep-alive', 'access-control-allow-credentials': 'true', 'date': 'Wed, 11 Mar 2015 00:38:36 GMT', 'access-control-allow-origin': '*', 'content-type': 'application/json', 'www-authenticate': 'digest, basic'}\n```\n", ":+1:\n", "I'm going to tag 2.5.5 as a quick fix. Then we'll start working on 2.6.0\n" ]
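The manual test in the comments above checks that repeated `WWW-Authenticate` response headers are exposed as one comma-joined value after the urllib3 update. A hedged sketch of the same check, assuming httpbin.org is reachable:

``` python
# Sketch: repeated response headers are exposed as a single comma-joined value.
import requests

params = [('WWW-Authenticate', 'digest'), ('WWW-Authenticate', 'basic')]
r = requests.get('https://httpbin.org/response-headers', params=params)

# requests' case-insensitive header dict joins the repeated header values.
print(r.headers.get('WWW-Authenticate'))   # expected: 'digest, basic'
```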
https://api.github.com/repos/psf/requests/issues/2481
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2481/labels{/name}
https://api.github.com/repos/psf/requests/issues/2481/comments
https://api.github.com/repos/psf/requests/issues/2481/events
https://github.com/psf/requests/issues/2481
60,481,079
MDU6SXNzdWU2MDQ4MTA3OQ==
2,481
"407 Proxy Authentication Required" after redirect with load balanced authenticated proxies
{ "avatar_url": "https://avatars.githubusercontent.com/u/291289?v=4", "events_url": "https://api.github.com/users/gsakkis/events{/privacy}", "followers_url": "https://api.github.com/users/gsakkis/followers", "following_url": "https://api.github.com/users/gsakkis/following{/other_user}", "gists_url": "https://api.github.com/users/gsakkis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gsakkis", "id": 291289, "login": "gsakkis", "node_id": "MDQ6VXNlcjI5MTI4OQ==", "organizations_url": "https://api.github.com/users/gsakkis/orgs", "received_events_url": "https://api.github.com/users/gsakkis/received_events", "repos_url": "https://api.github.com/users/gsakkis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gsakkis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gsakkis/subscriptions", "type": "User", "url": "https://api.github.com/users/gsakkis", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
14
2015-03-10T10:49:14Z
2021-09-08T23:05:55Z
2015-03-10T14:36:47Z
NONE
resolved
I'm having a problem I traced down to requests as the equivalent curl command works. I am running HAProxy on `localhost:5555` to load balance requests to a pool of authenticated backend http proxies as explained [here](http://serverfault.com/questions/386431/authenticated-proxies-on-haproxy-load-balancer). It works fine for fetching urls without redirection, as shown below using [httpie](https://github.com/jakubroztocil/httpie) (same behaviour observed using requests directly from the shell): ``` $ http 'http://httpbin.org/get' --proxy http:http://localhost:5555 --headers --debug HTTPie 0.9.2 HTTPie data: /home/gsakkis/.httpie Requests 2.5.3 Pygments 2.0.2 Python 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] linux2 >>> requests.request({'allow_redirects': False, 'auth': None, 'cert': None, 'data': OrderedDict(), 'files': DataDict(), 'headers': {'User-Agent': 'HTTPie/0.9.2'}, 'method': 'get', 'params': ParamsDict(), 'proxies': {u'http': u'http://localhost:5555'}, 'stream': True, 'timeout': 30, 'url': u'http://httpbin.org/get', 'verify': True}) HTTP/1.0 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Connection: keep-alive Content-Length: 269 Content-Type: application/json Date: Tue, 10 Mar 2015 10:38:37 GMT Server: nginx Via: 1.0 fl291 (squid/3.1.10) X-Cache: MISS from fl291 X-Cache-Lookup: MISS from fl291:80 ``` If I fetch a url that redirects without actually redirecting it still works, returning 302: ``` $ http 'http://httpbin.org/redirect-to?url=http://example.com/' --proxy http:http://localhost:5555 --headers --debug HTTPie 0.9.2 HTTPie data: /home/gsakkis/.httpie Requests 2.5.3 Pygments 2.0.2 Python 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] linux2 >>> requests.request({'allow_redirects': False, 'auth': None, 'cert': None, 'data': OrderedDict(), 'files': DataDict(), 'headers': {'User-Agent': 'HTTPie/0.9.2'}, 'method': 'get', 'params': ParamsDict(), 'proxies': {u'http': u'http://localhost:5555'}, 'stream': True, 'timeout': 30, 'url': u'http://httpbin.org/redirect-to?url=http://example.com/', 'verify': True}) HTTP/1.0 302 Moved Temporarily Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Connection: keep-alive Content-Length: 0 Content-Type: text/html; charset=utf-8 Date: Tue, 10 Mar 2015 10:41:00 GMT Location: http://example.com/ Server: nginx Via: 1.0 mf5 (squid/3.1.10) X-Cache: MISS from mf5 X-Cache-Lookup: MISS from mf5:80 ``` However attempting to follow the redirect fails with 407: ``` $ http 'http://httpbin.org/redirect-to?url=http://example.com/' --proxy http:http://localhost:5555 --headers --follow --debug HTTPie 0.9.2 HTTPie data: /home/gsakkis/.httpie Requests 2.5.3 Pygments 2.0.2 Python 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] linux2 >>> requests.request({'allow_redirects': True, 'auth': None, 'cert': None, 'data': OrderedDict(), 'files': DataDict(), 'headers': {'User-Agent': 'HTTPie/0.9.2'}, 'method': 'get', 'params': ParamsDict(), 'proxies': {u'http': u'http://localhost:5555'}, 'stream': True, 'timeout': 30, 'url': u'http://httpbin.org/redirect-to?url=http://example.com/', 'verify': True}) HTTP/1.0 407 Proxy Authentication Required Connection: keep-alive Content-Length: 4 Content-Type: text/html Date: Tue, 10 Mar 2015 10:42:02 GMT Mime-Version: 1.0 Proxy-Authenticate: Basic realm="login" Server: squid/3.1.10 Via: 1.0 mf5 (squid/3.1.10) X-Cache: MISS from mf5 X-Cache-Lookup: NONE from mf5:80 X-Squid-Error: ERR_CACHE_ACCESS_DENIED 0 ``` The equivalent curl command seems to work fine though: ``` $ curl 
'http://httpbin.org/redirect-to?url=http://example.com/' -s -L -x localhost:5555 -v -o /dev/null * Hostname was NOT found in DNS cache * Trying 127.0.0.1... * Connected to localhost (127.0.0.1) port 5555 (#0) > GET http://httpbin.org/redirect-to?url=http://example.com/ HTTP/1.1 > User-Agent: curl/7.35.0 > Host: httpbin.org > Accept: */* > Proxy-Connection: Keep-Alive > * HTTP 1.0, assume close after body < HTTP/1.0 302 Moved Temporarily < Server: nginx < Date: Tue, 10 Mar 2015 10:44:56 GMT < Content-Type: text/html; charset=utf-8 < Content-Length: 0 < Location: http://example.com/ < Access-Control-Allow-Origin: * < Access-Control-Allow-Credentials: true < X-Cache: MISS from fl291 < X-Cache-Lookup: MISS from fl291:80 < Via: 1.0 fl291 (squid/3.1.10) * HTTP/1.0 connection set to keep alive! < Connection: keep-alive < * Connection #0 to host localhost left intact * Issue another request to this URL: 'http://example.com/' * Hostname was found in DNS cache * Trying 127.0.0.1... * Connected to localhost (127.0.0.1) port 5555 (#1) > GET http://example.com/ HTTP/1.0 > User-Agent: curl/7.35.0 > Host: example.com > Accept: */* > Proxy-Connection: Keep-Alive > * HTTP 1.0, assume close after body < HTTP/1.0 200 OK < Accept-Ranges: bytes < Cache-Control: max-age=604800 < Content-Type: text/html < Date: Tue, 10 Mar 2015 10:44:57 GMT < ETag: "359670651" < Expires: Tue, 17 Mar 2015 10:44:57 GMT < Last-Modified: Fri, 09 Aug 2013 23:54:35 GMT < Server: ECS (cpm/F9D5) < X-Cache: HIT < x-ec-custom-error: 1 < Content-Length: 1270 < X-Cache: MISS from bv400 < X-Cache-Lookup: MISS from bv400:80 < Via: 1.0 bv400 (squid/3.1.10) * HTTP/1.0 connection set to keep alive! < Connection: keep-alive < { [data not shown] * Connection #1 to host localhost left intact ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2481/reactions" }
https://api.github.com/repos/psf/requests/issues/2481/timeline
null
completed
null
null
false
[ "Can you use tcpdump to show the HTTP packets flowing between your HAProxy instance and requests? Is this something you're familiar with doing?\n", "I have played with it occasionally but I'm far from an expert. Any particular command line I could try?\n", "Try `tcpdump -nnvvXSs 1514 -i lo port 5555` will be a good start, assuming you have a `lo` interface.\n", "Thanks; there you go: http://pastebin.com/raw.php?i=dRiFnf2c\n", "So that leaves us with the following message flow:\n\n```\nGET http://httpbin.org/redirect-to?url=http://example.com/ HTTP/1.1\nHost: httpbin.org\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: HTTPie/0.9.2\n\n\nHTTP/1.0 302 Moved Temporarily\nServer: nginx\nDate: Tue, 10 Mar 2015 11:40:51 GMT\nContent-Type: text/html; charset=utf-8\nContent-Length: 0\nLocation: http://example.com/\nAccess-Control-Allow-Origin: *\nAccess-Control-Allow-Credentials: true\nX-Cache: MISS from hv453\nX-Cache-Lookup: MISS from hv453:80\nVia: 1.0 hv453 (squid/3.1.10)\nConnection: keep-alive\n\n\nGET http://example.com/ HTTP/1.1\nHost: example.com\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: HTTPie/0.9.2\n\n\nHTTP/1.0 407 Proxy Authentication Required\nServer: squid/3.1.10\nMime-Version: 1.0\nDate: Tue, 10 Mar 2015 04:43:35 GMT\nContent-Type: text/html\nContent-Length: 4\nX-Squid-Error: ERR_CACHE_ACCESS_DENIED 0\nProxy-Authenticate: Basic realm=\"login\"\nX-Cache: MISS from hv453\nX-Cache-Lookup: NONE from hv453:80\nVia: 1.0 hv453 (squid/3.1.10)\nConnection: keep-alive\n\n407.\n```\n\nCan you get me tcpdump of cURL following that redirect as well? I have a theory.\n", "Sure, pasted at http://pastebin.com/raw.php?i=T4npM2XM\n", "We have a winner! HAProxy is at fault.\n\nIn your HAProxy configuration you add the Proxy-Authorization header automatically so that you don't have to manually authorize your requests. That's all well and good, however it seems that HAProxy is only adding that header to the first request on a connection. That's why cURL works: it opens a new TCP connection for the second request.\n\nThis is problematic, because HAProxy is explicitly telling us we can keep the connection alive, as you can see by the `Connection: keep-alive` header that requests sees. As RFC 7230 says:\n\n> A proxy or gateway MUST parse a received Connection header field before a message is forwarded [...] and then remove the Connection header field itself (or replace it with the intermediary's own connection options for the forwarded message).\n\n_NB: It's an interesting aside to note that cURL is sending the Proxy-Connection header despite an explicit admonition [not to do so from RFC 7230 section A.1.2](https://tools.ietf.org/html/rfc7230#appendix-A.1.2)._\n\nIn the short term, you can try adding `Connection: close` to your headers and see if that resolves the problem. Longer term, we should investigate why HAProxy doesn't add the header on subsequent requests.\n", "Initial reading suggests that HAProxy is doing this on purpose. From the docs:\n\n> By default HAProxy operates in a tunnel-like mode with regards to persistent connections: for each connection it processes the first request and forwards everything else (including additional requests) to selected server.\n", "Even more compelling (emphasis mine):\n\n> In HTTP mode, it is possible to rewrite, add or delete some of the request and response headers based on regular expressions. [..] 
But there is a limitation to this : **since HAProxy's HTTP engine does not support keep-alive, only headers passed during the first request of a TCP session will be seen**. All subsequent headers will be considered data only and not analyzed.\n", "@gsakkis Another option is to try passing the username and password to the proxy URL, and hope that HAProxy's HTTP engine is too stupid to strip the Proxy-Authorization header off. That is, change your command line to:\n\n``` bash\n$ http 'http://httpbin.org/redirect-to?url=http://example.com/' --proxy http:http://username:password@localhost:5555 --headers --follow --debug\nHTTPie 0.9.2\n```\n", "The fact that curl closes the connection between the requests is a regression and a mistake... I'll take that.\n\nThe Proxy-Connection: header we use was added once upon the time because it was necessary and I have not found a good reason to remove it again, even if RFC7230 now has arrived saying so...!\n", "@bagder Thanks for looking in. =) I suspected that was going to be the reasoning for Proxy-Connection, but did just want to clarify it. I doubt we'll add it.\n\nGlad we were able to help find a regression, even if that regression made it look like you were working and we weren't! ;)\n\nGiven that feedback, I'm closing this issue because it's not a requests bug, it's a problem with @gsakkis' HAProxy usage/configuration. @gsakkis, please note that it's definitely possible to configure HAProxy to do what you want it to do, you just need to make some configuration changes or change the way you make the requests.\n", "@Lukasa passing credentials to the proxy url does work indeed! I'd never figure it out on my own, thank you so much for the investigation!\n\nPS: I linked to this ticket on the selected ServerFault answer in case someone stumbles on it too.\n", "@gsakkis Happy to help, I'm glad we were able to resolve the situation. =)\n" ]
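The workaround that resolved the thread above was to put the proxy credentials into the proxy URL itself, so that every request on the connection carries `Proxy-Authorization` regardless of HAProxy's keep-alive behaviour. A hedged sketch with placeholder credentials:

``` python
# Sketch: authenticate against the proxy on every request by embedding
# credentials in the proxy URL (username/password/port are placeholders).
import requests

proxies = {'http': 'http://username:password@localhost:5555'}

r = requests.get(
    'http://httpbin.org/redirect-to?url=http://example.com/',
    proxies=proxies,
    allow_redirects=True,
)
print(r.status_code, r.url)
```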
https://api.github.com/repos/psf/requests/issues/2480
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2480/labels{/name}
https://api.github.com/repos/psf/requests/issues/2480/comments
https://api.github.com/repos/psf/requests/issues/2480/events
https://github.com/psf/requests/issues/2480
60,340,376
MDU6SXNzdWU2MDM0MDM3Ng==
2,480
Radio streaming download: ICY causes BadStatusLine
{ "avatar_url": "https://avatars.githubusercontent.com/u/4907670?v=4", "events_url": "https://api.github.com/users/rubentorresbonet/events{/privacy}", "followers_url": "https://api.github.com/users/rubentorresbonet/followers", "following_url": "https://api.github.com/users/rubentorresbonet/following{/other_user}", "gists_url": "https://api.github.com/users/rubentorresbonet/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rubentorresbonet", "id": 4907670, "login": "rubentorresbonet", "node_id": "MDQ6VXNlcjQ5MDc2NzA=", "organizations_url": "https://api.github.com/users/rubentorresbonet/orgs", "received_events_url": "https://api.github.com/users/rubentorresbonet/received_events", "repos_url": "https://api.github.com/users/rubentorresbonet/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rubentorresbonet/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rubentorresbonet/subscriptions", "type": "User", "url": "https://api.github.com/users/rubentorresbonet", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-09T13:08:33Z
2021-09-08T23:05:55Z
2015-03-09T13:42:59Z
NONE
resolved
Hello, Even if the ICY protocol is broken, I would still need to download the streaming data from a Shoutcast server. When attempting to, I get this issue: File "/usr/lib/python3/dist-packages/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/usr/lib/python3/dist-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/usr/lib/python3/dist-packages/requests/sessions.py", line 455, in request resp = self.send(prep, **send_kwargs) File "/usr/lib/python3/dist-packages/requests/sessions.py", line 558, in send r = adapter.send(request, **kwargs) File "/usr/lib/python3/dist-packages/requests/adapters.py", line 378, in send raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPConnectionPool(host='streaming3.radiocat.net', port=80): Max retries exceeded with url: /;listen.mp3 (Caused by ProtocolError('Connection aborted.', BadStatusLine('ICY 200 OK\r\n',))) Any workaround or something I could do? Thanks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2480/reactions" }
https://api.github.com/repos/psf/requests/issues/2480/timeline
null
completed
null
null
false
[ "Yes. =) Don't use a HTTP library.\n\nThat answer was a bit glib, but it's pretty accurate. Let me show you what RFC 7230 says a HTTP status line should look like:\n\n`status-line = HTTP-version SP status-code SP reason-phrase CRLF`\n\nThat first set of characters needs to be a HTTP version. If it doesn't specify a HTTP version, a HTTP library cannot assume it has the faintest idea what's going on. For that reason, `httplib` sees a HTTP version called 'ICY', doesn't recognise it, and bails out (quite rightly).\n\nUnfortunately for you, `httplib` then closes the socket connection and loses its reference to the socket object.\n\nWhat you really need is something that can fake out a HTTP request using a socket. This shouldn't be very hard: create a socket connection directly to your target and then just write some basic HTTP headers to it.\n\nIf you find that helpful, you could write an ICY transport adapter for requests and use it.\n\nSadly, however, this is outside the scope of requests to help you. =)\n", "Lukasa, thank you for your answer.\n\nHow bad would it be to modify the original library like this?\nhttps://mail.python.org/pipermail/web-sig/2005-August/001641.html\n\nIt sounds like I'd save a lot of work just by adding two lines. Still, I know it's not a great solution. What do you personally think about it?\n\nDoing everything with sockets will be a lot of pain.\n", "@rubentorresbonet if you choose to modify httplib like that, that's your decision. It's not up to our discretion nor is it our place to recommend it.\n\nI think modifying a standard library is a poor way of handling this, but if that's what you choose to do, there's nothing for anyone to do about it.\n\nAlso @Lukasa didn't tell you to do _everything_ with sockets, only portions dealing with ICY.\n" ]
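Following the suggestion in the thread above to "fake out a HTTP request using a socket", here is a hedged sketch of reading the start of an ICY/Shoutcast stream with a raw socket; the host, port, and path are placeholders, not the reporter's actual stream.

``` python
# Sketch: talk to an ICY (Shoutcast) server over a raw socket, since the
# 'ICY 200 OK' status line is rejected by httplib. Host/path are placeholders.
import socket

host, port, path = 'streaming.example.net', 80, '/;listen.mp3'

sock = socket.create_connection((host, port), timeout=10)
request = (
    'GET {0} HTTP/1.0\r\n'
    'Host: {1}\r\n'
    'Icy-MetaData: 0\r\n'
    'Connection: close\r\n'
    '\r\n'
).format(path, host)
sock.sendall(request.encode('ascii'))

# Read the ICY status line and headers, then a first chunk of audio data.
raw = sock.recv(4096)
headers, _, body = raw.partition(b'\r\n\r\n')
print(headers.decode('latin-1'))
print('received', len(body), 'bytes of stream data')
sock.close()
```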
https://api.github.com/repos/psf/requests/issues/2479
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2479/labels{/name}
https://api.github.com/repos/psf/requests/issues/2479/comments
https://api.github.com/repos/psf/requests/issues/2479/events
https://github.com/psf/requests/issues/2479
60,266,197
MDU6SXNzdWU2MDI2NjE5Nw==
2,479
Returning json from SocketServer: ConnectionError BadStatusLine
{ "avatar_url": "https://avatars.githubusercontent.com/u/2013357?v=4", "events_url": "https://api.github.com/users/manojgudi/events{/privacy}", "followers_url": "https://api.github.com/users/manojgudi/followers", "following_url": "https://api.github.com/users/manojgudi/following{/other_user}", "gists_url": "https://api.github.com/users/manojgudi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/manojgudi", "id": 2013357, "login": "manojgudi", "node_id": "MDQ6VXNlcjIwMTMzNTc=", "organizations_url": "https://api.github.com/users/manojgudi/orgs", "received_events_url": "https://api.github.com/users/manojgudi/received_events", "repos_url": "https://api.github.com/users/manojgudi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/manojgudi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/manojgudi/subscriptions", "type": "User", "url": "https://api.github.com/users/manojgudi", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-03-08T17:17:08Z
2021-09-08T23:05:56Z
2015-03-08T18:50:07Z
NONE
resolved
This worked in requests version 2.2.1, but breaks in 2.5.3. Environment: Ubuntu 12.04/x86, Python 2.7.3
{ "avatar_url": "https://avatars.githubusercontent.com/u/2013357?v=4", "events_url": "https://api.github.com/users/manojgudi/events{/privacy}", "followers_url": "https://api.github.com/users/manojgudi/followers", "following_url": "https://api.github.com/users/manojgudi/following{/other_user}", "gists_url": "https://api.github.com/users/manojgudi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/manojgudi", "id": 2013357, "login": "manojgudi", "node_id": "MDQ6VXNlcjIwMTMzNTc=", "organizations_url": "https://api.github.com/users/manojgudi/orgs", "received_events_url": "https://api.github.com/users/manojgudi/received_events", "repos_url": "https://api.github.com/users/manojgudi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/manojgudi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/manojgudi/subscriptions", "type": "User", "url": "https://api.github.com/users/manojgudi", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2479/reactions" }
https://api.github.com/repos/psf/requests/issues/2479/timeline
null
completed
null
null
false
[ "@manojgudi can you give us more information. We have absolutely nothing to debug this with otherwise and will be forced to close the issue.\n", "@sigmavirus24 Thanks for prompt reply! Lemme write a script to reproduce this issue.. Gimme a day or so..\n\nThanks\n", "Turns out I didnt attach http headers when returning a json response from my custom SocketServer(which was causing this issue) \nsorry for the noise..\n" ]
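The reporter's own resolution above was that the custom SocketServer was returning JSON without any HTTP headers. A hedged sketch of a minimal handler that sends a proper status line and headers before the body, using Python 2 standard-library module names to match the environment in the report; the port and payload are assumptions.

``` python
# Sketch: return JSON with a proper status line and headers so clients such
# as requests can parse the response (Python 2 module names, per the report).
import json
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer

class JSONHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        payload = json.dumps({'status': 'ok'})
        self.send_response(200)                      # emits 'HTTP/1.0 200 OK'
        self.send_header('Content-Type', 'application/json')
        self.send_header('Content-Length', str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == '__main__':
    HTTPServer(('127.0.0.1', 8080), JSONHandler).serve_forever()
```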
https://api.github.com/repos/psf/requests/issues/2478
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2478/labels{/name}
https://api.github.com/repos/psf/requests/issues/2478/comments
https://api.github.com/repos/psf/requests/issues/2478/events
https://github.com/psf/requests/pull/2478
60,198,633
MDExOlB1bGxSZXF1ZXN0MzA3MDAxNDc=
2,478
Document Response.iter_lines() deficiencies
{ "avatar_url": "https://avatars.githubusercontent.com/u/91895?v=4", "events_url": "https://api.github.com/users/plaes/events{/privacy}", "followers_url": "https://api.github.com/users/plaes/followers", "following_url": "https://api.github.com/users/plaes/following{/other_user}", "gists_url": "https://api.github.com/users/plaes/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/plaes", "id": 91895, "login": "plaes", "node_id": "MDQ6VXNlcjkxODk1", "organizations_url": "https://api.github.com/users/plaes/orgs", "received_events_url": "https://api.github.com/users/plaes/received_events", "repos_url": "https://api.github.com/users/plaes/repos", "site_admin": false, "starred_url": "https://api.github.com/users/plaes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/plaes/subscriptions", "type": "User", "url": "https://api.github.com/users/plaes", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-03-07T09:08:03Z
2021-09-08T08:01:04Z
2015-03-07T10:03:42Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2478/reactions" }
https://api.github.com/repos/psf/requests/issues/2478/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2478.diff", "html_url": "https://github.com/psf/requests/pull/2478", "merged_at": "2015-03-07T10:03:42Z", "patch_url": "https://github.com/psf/requests/pull/2478.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2478" }
true
[ "@plaes This is a great PR. One small note however, we don't use `unittest`, we use `pytest`. This is why your test suite failed: `unittest.expectedFailure` doesn't exist in Python 2.6. Instead, use [`pytest.mark.xfail`](http://pytest.org/latest/skipping.html#mark-a-test-function-as-expected-to-fail).\n", "Oops.. I push forced the fix.\n", "Hurrah! Thanks so much for this @plaes!\n", ":cake: :star:\n", "Thanks @plaes!\n" ]
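The review comments above point to `pytest.mark.xfail` rather than `unittest.expectedFailure` for documenting the `iter_lines()` deficiency. A hedged sketch of what such a marked test could look like; the test body is illustrative and not the one merged in this pull request.

``` python
# Sketch: mark a known iter_lines() deficiency as an expected failure with pytest.
import json
import pytest
import requests

@pytest.mark.xfail(reason="iter_lines() drops buffered data between calls")
def test_iter_lines_is_reentrant():
    r = requests.get('http://httpbin.org/stream/4', stream=True)
    next(r.iter_lines())                    # first call buffers a whole chunk
    for line in r.iter_lines():             # second call starts a new buffer
        json.loads(line.decode('utf-8'))    # partial leftover lines won't parse
```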
https://api.github.com/repos/psf/requests/issues/2477
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2477/labels{/name}
https://api.github.com/repos/psf/requests/issues/2477/comments
https://api.github.com/repos/psf/requests/issues/2477/events
https://github.com/psf/requests/pull/2477
60,142,170
MDExOlB1bGxSZXF1ZXN0MzA2NjkyODA=
2,477
Make Response.iter_lines() reentrant safe
{ "avatar_url": "https://avatars.githubusercontent.com/u/91895?v=4", "events_url": "https://api.github.com/users/plaes/events{/privacy}", "followers_url": "https://api.github.com/users/plaes/followers", "following_url": "https://api.github.com/users/plaes/following{/other_user}", "gists_url": "https://api.github.com/users/plaes/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/plaes", "id": 91895, "login": "plaes", "node_id": "MDQ6VXNlcjkxODk1", "organizations_url": "https://api.github.com/users/plaes/orgs", "received_events_url": "https://api.github.com/users/plaes/received_events", "repos_url": "https://api.github.com/users/plaes/repos", "site_admin": false, "starred_url": "https://api.github.com/users/plaes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/plaes/subscriptions", "type": "User", "url": "https://api.github.com/users/plaes", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-03-06T18:50:20Z
2021-09-08T08:01:05Z
2015-03-07T09:09:01Z
CONTRIBUTOR
resolved
With this change, calling r.iter_lines() multiple times no longer loses pending data between calls to this method.
{ "avatar_url": "https://avatars.githubusercontent.com/u/91895?v=4", "events_url": "https://api.github.com/users/plaes/events{/privacy}", "followers_url": "https://api.github.com/users/plaes/followers", "following_url": "https://api.github.com/users/plaes/following{/other_user}", "gists_url": "https://api.github.com/users/plaes/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/plaes", "id": 91895, "login": "plaes", "node_id": "MDQ6VXNlcjkxODk1", "organizations_url": "https://api.github.com/users/plaes/orgs", "received_events_url": "https://api.github.com/users/plaes/received_events", "repos_url": "https://api.github.com/users/plaes/repos", "site_admin": false, "starred_url": "https://api.github.com/users/plaes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/plaes/subscriptions", "type": "User", "url": "https://api.github.com/users/plaes", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2477/reactions" }
https://api.github.com/repos/psf/requests/issues/2477/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2477.diff", "html_url": "https://github.com/psf/requests/pull/2477", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2477.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2477" }
true
[ "This is fix for issue #2476. So either this, or documentation should mention that Response.iter_lines() is not reentrant safe.\n", "Eh, fine by me. @sigmavirus24?\n", "I feel likewise \"Eh\". This is just going to be broken if there aren't any tests for it. It also breaks the thread-safety of a response I think. (I would hope someone's not consuming a response from multiple threads, but this makes the iterator less threadsafe.)\n", "Yup, this approach is broken. I'll see whether I can find a way to fix it.\n", "I'm closing this for now.. and submitted documentation about this issue and testcases in another pull requests.\n" ]
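For context on the `self._pending` idea floated at the end of the thread above, here is a hedged, simplified sketch of what storing the leftover buffer on the object could look like; it is not requests' code, and, as the reviewers note, keeping iteration state on the instance makes concurrent use of one response even less thread-safe.

``` python
# Sketch of the considered-and-rejected idea: keep iter_lines() leftovers on the
# instance so a second call can resume where the first stopped. Simplified.
class StreamedBody(object):
    def __init__(self, chunks):
        self._chunks = iter(chunks)
        self._pending = None                 # leftover partial line between calls

    def iter_lines(self, delimiter='\n'):
        for chunk in self._chunks:
            if self._pending is not None:
                chunk = self._pending + chunk
            lines = chunk.split(delimiter)
            self._pending = lines.pop()      # may be a partial line; keep it
            for line in lines:
                yield line
        if self._pending:
            yield self._pending
            self._pending = None

body = StreamedBody(['alpha\nbr', 'avo\nch', 'arlie\n'])
print(next(body.iter_lines()))               # 'alpha'
print(list(body.iter_lines()))               # ['bravo', 'charlie'] -- nothing lost
```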
https://api.github.com/repos/psf/requests/issues/2476
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2476/labels{/name}
https://api.github.com/repos/psf/requests/issues/2476/comments
https://api.github.com/repos/psf/requests/issues/2476/events
https://github.com/psf/requests/issues/2476
60,108,994
MDU6SXNzdWU2MDEwODk5NA==
2,476
iter_lines: Skipping first line using next(..)
{ "avatar_url": "https://avatars.githubusercontent.com/u/91895?v=4", "events_url": "https://api.github.com/users/plaes/events{/privacy}", "followers_url": "https://api.github.com/users/plaes/followers", "following_url": "https://api.github.com/users/plaes/following{/other_user}", "gists_url": "https://api.github.com/users/plaes/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/plaes", "id": 91895, "login": "plaes", "node_id": "MDQ6VXNlcjkxODk1", "organizations_url": "https://api.github.com/users/plaes/orgs", "received_events_url": "https://api.github.com/users/plaes/received_events", "repos_url": "https://api.github.com/users/plaes/repos", "site_admin": false, "starred_url": "https://api.github.com/users/plaes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/plaes/subscriptions", "type": "User", "url": "https://api.github.com/users/plaes", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-03-06T14:35:00Z
2021-09-08T23:05:56Z
2015-03-06T18:09:28Z
CONTRIBUTOR
resolved
When trying to skip first line of `response.iter_lines()` using `next()`, some extra data from following lines is also consumed. Test script: ``` python import requests from contextlib import closing with closing(requests.get('http://httpbin.org/stream/5', stream=True)) as r: skip = next(r.iter_lines()) print 'SKIP: ', skip for n, l in enumerate(r.iter_lines(), start=1): print 'PROCESS: {}'.format(n), l ``` With following output (line 1 totally missing, and line 2 being partial): ``` SKIP: {"url": "http://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "python-requests/2.4.3 CPython/2.7.9 Linux/4.0.0-rc2+"}, "args": {}, "id": 0, "origin": "xxx"} PROCESS: 1 n.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "python-requests/2.4.3 CPython/2.7.9 Linux/4.0.0-rc2+"}, "args": {}, "id": 2, "origin": "xxx"} PROCESS: 2 {"url": "http://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "python-requests/2.4.3 CPython/2.7.9 Linux/4.0.0-rc2+"}, "args": {}, "id": 3, "origin": "xxx"} PROCESS: 3 {"url": "http://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "python-requests/2.4.3 CPython/2.7.9 Linux/4.0.0-rc2+"}, "args": {}, "id": 4, "origin": "xxx"} ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2476/reactions" }
https://api.github.com/repos/psf/requests/issues/2476/timeline
null
completed
null
null
false
[ "This is a tricky interaction with iterators. Most iterators don't quite behave the same way. What you need to do is the following\n\n``` py\nwith closing(requests.get('https://httpbin.org/stream/5', stream=True)) as r:\n iterator = r.iter_lines()\n print 'SKIP: ', next(iterator)\n for n, l in enumerate(iterator, start=1):\n print 'PROCESS: {}'.format(n), l\n```\n\nThat works fine for me. I'm not confident this is a bug because `iter_lines` is allowed to make some assumptions about how it's used. I'll wait to see if @Lukasa agrees with me.\n", "Thanks, this was unexpected :)\n", "Agreed.\n\n`iter_lines` and `iter_content` should _really_ only be used in a single place, or they risk having unexpected effects. It might be worth adding a `..warning` or `..admonition` block to the docstrings for those methods to reinforce that message.\n", "So I took a quick look at the implementation of [`Response.iter_lines`](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L693)\n\nWe read a chunk from the response, and then find the line within that chunk and return it. We store the remainder locally in the function and not on a the response object itself. I think an update to the docs like @Lukasa described is warranted. \n", "Would it make sense to actually store the pending data as `self._pending`?\n" ]
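The resolution above boils down to creating the line iterator once and reusing it, rather than calling `iter_lines()` a second time. A hedged sketch of that pattern against the same httpbin endpoint used in the report:

``` python
# Sketch: create the iter_lines() generator once, then skip and consume from it,
# instead of calling r.iter_lines() twice and losing buffered data.
from contextlib import closing
import requests

with closing(requests.get('http://httpbin.org/stream/5', stream=True)) as r:
    lines = r.iter_lines()              # one generator for the whole response
    print('SKIP:', next(lines))         # first line
    for n, line in enumerate(lines, start=1):
        print('PROCESS: {0}'.format(n), line)
```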
https://api.github.com/repos/psf/requests/issues/2475
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2475/labels{/name}
https://api.github.com/repos/psf/requests/issues/2475/comments
https://api.github.com/repos/psf/requests/issues/2475/events
https://github.com/psf/requests/issues/2475
60,052,903
MDU6SXNzdWU2MDA1MjkwMw==
2,475
Using Requests in PyCharm raises an ImportError
{ "avatar_url": "https://avatars.githubusercontent.com/u/1762036?v=4", "events_url": "https://api.github.com/users/viweei/events{/privacy}", "followers_url": "https://api.github.com/users/viweei/followers", "following_url": "https://api.github.com/users/viweei/following{/other_user}", "gists_url": "https://api.github.com/users/viweei/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/viweei", "id": 1762036, "login": "viweei", "node_id": "MDQ6VXNlcjE3NjIwMzY=", "organizations_url": "https://api.github.com/users/viweei/orgs", "received_events_url": "https://api.github.com/users/viweei/received_events", "repos_url": "https://api.github.com/users/viweei/repos", "site_admin": false, "starred_url": "https://api.github.com/users/viweei/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/viweei/subscriptions", "type": "User", "url": "https://api.github.com/users/viweei", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-03-06T03:08:50Z
2021-09-08T23:05:49Z
2015-04-06T02:40:55Z
NONE
resolved
### Error: AttributeError: 'module' object has no attribute 'get'

When I use requests in PyCharm, I get the error: "AttributeError: 'module' object has no attribute 'get'"

I modified the file "requests\packages\urllib3\connection.py" at lines 8-11 from:

``` python
try:  # Python 3
    from http.client import HTTPConnection as _HTTPConnection, HTTPException
except ImportError:
    from httplib import HTTPConnection as _HTTPConnection, HTTPException
```

To:

``` python
py = sys.version_info
if py >= (3, 0, 0):
    from http.client import HTTPConnection as _HTTPConnection, HTTPException
else:
    from httplib import HTTPConnection as _HTTPConnection, HTTPException
```

It worked.

### Environment:
Pycharm 4.04
Python 2.75
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2475/reactions" }
https://api.github.com/repos/psf/requests/issues/2475/timeline
null
completed
null
null
false
[ "What is PyCharm doing here? Because it seems to be totally confused.\n\nThis is a really common pattern, and it's considered good style: it's better to test for features than to do strict version checks wherever possible. The follow-on is: there's never access of an attribute called 'get' from the module you changed. So again, it feels like PyCharm is getting this wrong.\n", "No response in almost a month. Closing as inactive.\n" ]
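The maintainer's reply above defends the try/except import in the report as the idiomatic feature-detection pattern, preferred over explicit version checks. A hedged standalone sketch of the same pattern outside urllib3:

``` python
# Sketch: prefer feature detection over version checks when importing
# modules that moved between Python 2 and 3.
try:                                    # Python 3
    from http.client import HTTPConnection, HTTPException
except ImportError:                     # Python 2
    from httplib import HTTPConnection, HTTPException

print(HTTPConnection, HTTPException)
```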
https://api.github.com/repos/psf/requests/issues/2474
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2474/labels{/name}
https://api.github.com/repos/psf/requests/issues/2474/comments
https://api.github.com/repos/psf/requests/issues/2474/events
https://github.com/psf/requests/issues/2474
59,943,083
MDU6SXNzdWU1OTk0MzA4Mw==
2,474
requests 2.5.4 mishandles WWW-Authenticate headers
{ "avatar_url": "https://avatars.githubusercontent.com/u/220082?v=4", "events_url": "https://api.github.com/users/dieterv/events{/privacy}", "followers_url": "https://api.github.com/users/dieterv/followers", "following_url": "https://api.github.com/users/dieterv/following{/other_user}", "gists_url": "https://api.github.com/users/dieterv/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dieterv", "id": 220082, "login": "dieterv", "node_id": "MDQ6VXNlcjIyMDA4Mg==", "organizations_url": "https://api.github.com/users/dieterv/orgs", "received_events_url": "https://api.github.com/users/dieterv/received_events", "repos_url": "https://api.github.com/users/dieterv/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dieterv/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dieterv/subscriptions", "type": "User", "url": "https://api.github.com/users/dieterv", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-03-05T12:07:32Z
2021-09-08T23:05:57Z
2015-03-05T12:27:29Z
CONTRIBUTOR
resolved
Executing a get() with requests 2.5.4, I now stumble over an unexpected 401 error. Looking into it, I see that response.headers contains: {'server': 'Microsoft-IIS/7.5', 'content-type': 'text/html', 'date': 'Thu, 05 Mar 2015 11:56:25 GMT', 'content-length': '1293', 'x-powered-by': 'ASP.NET', 'www-authenticate': 'Negotiate'} and that the HttpNtlmAuth handler installed on my session object has not kicked in. I have a feeling the "WWW-Authenticate: NTLM" header, which is also in the response, disappeared somewhere: ![headers](https://cloud.githubusercontent.com/assets/220082/6504440/b22c5680-c337-11e4-89bb-6f2da348c9fd.png)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2474/reactions" }
https://api.github.com/repos/psf/requests/issues/2474/timeline
null
completed
null
null
false
[ "Note the above works as expected after downgrading to requests 2.5.1.\n", "Thanks @dieterv, this is a known issue, see shazow/urllib3#561. We aim to push out a fix for this very shortly.\n" ]
https://api.github.com/repos/psf/requests/issues/2473
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2473/labels{/name}
https://api.github.com/repos/psf/requests/issues/2473/comments
https://api.github.com/repos/psf/requests/issues/2473/events
https://github.com/psf/requests/issues/2473
59,907,161
MDU6SXNzdWU1OTkwNzE2MQ==
2,473
requests can't detect the right charset
{ "avatar_url": "https://avatars.githubusercontent.com/u/3759816?v=4", "events_url": "https://api.github.com/users/gengjiawen/events{/privacy}", "followers_url": "https://api.github.com/users/gengjiawen/followers", "following_url": "https://api.github.com/users/gengjiawen/following{/other_user}", "gists_url": "https://api.github.com/users/gengjiawen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gengjiawen", "id": 3759816, "login": "gengjiawen", "node_id": "MDQ6VXNlcjM3NTk4MTY=", "organizations_url": "https://api.github.com/users/gengjiawen/orgs", "received_events_url": "https://api.github.com/users/gengjiawen/received_events", "repos_url": "https://api.github.com/users/gengjiawen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gengjiawen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gengjiawen/subscriptions", "type": "User", "url": "https://api.github.com/users/gengjiawen", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-03-05T05:25:22Z
2021-09-08T23:05:57Z
2015-03-05T07:01:38Z
NONE
resolved
I wrote some code in Python 3: ``` url = "http://www.kanunu8.com/book/3981/42775.html" r = requests.get(url) print(r.encoding) print(r.text) ``` The HTML source code clearly states that its charset is gbk, but r.encoding is "ISO-8859-1". BTW, I can get the right HTML from urllib.request.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2473/reactions" }
https://api.github.com/repos/psf/requests/issues/2473/timeline
null
completed
null
null
false
[ "Hi there!\n\nThis is a well known requests behaviour that is clearly documented. I answered this in a previous GitHub issue [here](https://github.com/kennethreitz/requests/issues/2122#issuecomment-48448778): please consult that answer.\n", "But the function`requests.utils.get_encodings_from_content` raise a exception\n\n```\nTraceback (most recent call last):\n File \"D:/Developer/Python/PythonEx/Http/Crawler/requests_with_coding.py\", line 5, in <module>\n codings = get_encodings_from_content(r.content)\n File \"C:\\Python34\\lib\\site-packages\\requests\\utils.py\", line 304, in get_encodings_from_content\n pragma_re.findall(content) +\nTypeError: can't use a string pattern on a bytes-like object\n```\n", "We plan to remove `get_encodings_from_content` in requests 3.0, because we don't think we should have any code to support reading HTML at all. We are _not_ an HTML library, we're a HTTP library, and we don't want to introspect request bodies at all if we can help it.\n\nThe code for it is extremely simple, you can easily bring it into your own project. Find it [here](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L288).\n", "I got it, thanks a lot :)\n", "For anyone may see this, you can use Beautifulsoup to get the desired html:\njust use: `soup = BeautifulSoup(r.content)`\n" ]
https://api.github.com/repos/psf/requests/issues/2472
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2472/labels{/name}
https://api.github.com/repos/psf/requests/issues/2472/comments
https://api.github.com/repos/psf/requests/issues/2472/events
https://github.com/psf/requests/pull/2472
59,829,323
MDExOlB1bGxSZXF1ZXN0MzA0ODQ1NzU=
2,472
Avoid data duplication when creating a Request with str/bytes/bytearray input
{ "avatar_url": "https://avatars.githubusercontent.com/u/138657?v=4", "events_url": "https://api.github.com/users/scholer/events{/privacy}", "followers_url": "https://api.github.com/users/scholer/followers", "following_url": "https://api.github.com/users/scholer/following{/other_user}", "gists_url": "https://api.github.com/users/scholer/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/scholer", "id": 138657, "login": "scholer", "node_id": "MDQ6VXNlcjEzODY1Nw==", "organizations_url": "https://api.github.com/users/scholer/orgs", "received_events_url": "https://api.github.com/users/scholer/received_events", "repos_url": "https://api.github.com/users/scholer/repos", "site_admin": false, "starred_url": "https://api.github.com/users/scholer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/scholer/subscriptions", "type": "User", "url": "https://api.github.com/users/scholer", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
{ "closed_at": "2015-04-06T01:57:33Z", "closed_issues": 4, "created_at": "2015-01-18T20:07:00Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/23", "id": 940764, "labels_url": "https://api.github.com/repos/psf/requests/milestones/23/labels", "node_id": "MDk6TWlsZXN0b25lOTQwNzY0", "number": 23, "open_issues": 0, "state": "closed", "title": "2.6.0", "updated_at": "2015-04-06T01:57:33Z", "url": "https://api.github.com/repos/psf/requests/milestones/23" }
2
2015-03-04T17:13:10Z
2021-09-08T08:01:01Z
2015-03-14T12:43:18Z
CONTRIBUTOR
resolved
I've worked on the idea discussed in #2468 regarding avoiding duplicating the file data when creating a Request with files input that is str/bytes/bytearray and not a stream. The code currently creates a BytesIO or StringIO stream using the str/bytes/bytearray as initial input. This is then read out with fp.read() when creating a RequestField. The problem with this is that BytesIO and StringIO create a copy of the input. Thus, if you have created a 100 MB file in memory using a bytearray which is used to create a Request, three identical copies are produced: _(1)_ the original bytearray, _(2)_ the BytesIO/StringIO object, which has its own copy, and _(3)_ the copy that is read out with fp.read() and stored in the RequestField object. If, instead of creating BytesIO/StringIO objects, we just pass the str/bytes/bytearray on to RequestField, we avoid making the two duplicates of the data. This yields a **30-40% performance improvement** for files larger than 5 MB when creating a Request. The end result is, as far as I've been able to tell, the same. I have created a gist with my code for checking that the proposed change yields the same body as the old one. It also has the code for evaluating performance and the results of the three ways I've compared the performance differences. https://gist.github.com/scholer/57fced2b63d1bb3130eb The tests in test_requests.py complete without failure on Python 2.6 and 3.4. I don't believe any new tests are required, but let me know. **Question:** Is this really relevant? Why would you ever build a large file in memory? Large files are always read from disk or handled as some other stream. Answer: Some cloud storage services, e.g. Amazon AWS, rely on adding a "prefix" and "suffix" to the file content, which includes e.g. upload authorization tokens. You can build the file data using e.g. a BytesIO stream or a bytearray. The bytearray can be used through a memoryview, which supports Python's buffer protocol. This means that the file data can be inserted directly into the bytearray without extra memory overhead: ``` fp.readinto(mv[offset:offset+length]) ``` To my knowledge, it is not easy to read the content of one stream directly into another. For instance, appending file content to a BytesIO using fp.readinto(bytestream) will fail, since bytestream does not support the buffer interface. Note: Although my tests say that the final body is the same with the new code as with the old, I would still recommend that someone with more experience takes a look at this to make sure that it does not have any adverse effects. Thanks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2472/reactions" }
https://api.github.com/repos/psf/requests/issues/2472/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2472.diff", "html_url": "https://github.com/psf/requests/pull/2472", "merged_at": "2015-03-14T12:43:17Z", "patch_url": "https://github.com/psf/requests/pull/2472.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2472" }
true
[ "Thanks for this!\n\nI can see no particular reason to be concerned about this patch, it seems totally reasonable to me. I'll obviously wait for @sigmavirus24 to hop in. It would also be extremely helpful to get someone who is deploying requests in anger to weigh in here.\n", "This looks fine but I'd like to test it a bit. Sorry for the delay. Too many things to do\n" ]
https://api.github.com/repos/psf/requests/issues/2471
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2471/labels{/name}
https://api.github.com/repos/psf/requests/issues/2471/comments
https://api.github.com/repos/psf/requests/issues/2471/events
https://github.com/psf/requests/issues/2471
59,803,770
MDU6SXNzdWU1OTgwMzc3MA==
2,471
duplicated headers are lost
{ "avatar_url": "https://avatars.githubusercontent.com/u/3345886?v=4", "events_url": "https://api.github.com/users/wesnm/events{/privacy}", "followers_url": "https://api.github.com/users/wesnm/followers", "following_url": "https://api.github.com/users/wesnm/following{/other_user}", "gists_url": "https://api.github.com/users/wesnm/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wesnm", "id": 3345886, "login": "wesnm", "node_id": "MDQ6VXNlcjMzNDU4ODY=", "organizations_url": "https://api.github.com/users/wesnm/orgs", "received_events_url": "https://api.github.com/users/wesnm/received_events", "repos_url": "https://api.github.com/users/wesnm/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wesnm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wesnm/subscriptions", "type": "User", "url": "https://api.github.com/users/wesnm", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-04T14:22:12Z
2021-09-08T23:05:57Z
2015-03-04T14:23:07Z
NONE
resolved
With the upgrade to 2.5.3, my application suddenly broke. I've tracked it down to a loss of headers. The response includes two "WWW-Authenticate" headers, but only one is returned in the header list. This appears to be a urllib3 bug.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2471/reactions" }
https://api.github.com/repos/psf/requests/issues/2471/timeline
null
completed
null
null
false
[ "Please file this against [urllib3](/shazow/urllib3) then.\n" ]
https://api.github.com/repos/psf/requests/issues/2470
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2470/labels{/name}
https://api.github.com/repos/psf/requests/issues/2470/comments
https://api.github.com/repos/psf/requests/issues/2470/events
https://github.com/psf/requests/issues/2470
59,646,646
MDU6SXNzdWU1OTY0NjY0Ng==
2,470
Unable to import requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/417478?v=4", "events_url": "https://api.github.com/users/xZise/events{/privacy}", "followers_url": "https://api.github.com/users/xZise/followers", "following_url": "https://api.github.com/users/xZise/following{/other_user}", "gists_url": "https://api.github.com/users/xZise/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/xZise", "id": 417478, "login": "xZise", "node_id": "MDQ6VXNlcjQxNzQ3OA==", "organizations_url": "https://api.github.com/users/xZise/orgs", "received_events_url": "https://api.github.com/users/xZise/received_events", "repos_url": "https://api.github.com/users/xZise/repos", "site_admin": false, "starred_url": "https://api.github.com/users/xZise/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xZise/subscriptions", "type": "User", "url": "https://api.github.com/users/xZise", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-03-03T14:12:47Z
2021-09-08T23:05:58Z
2015-03-04T13:49:12Z
NONE
resolved
Hi, on [Travis](https://travis-ci.org/wikimedia/pywikibot-core/jobs/52812789#L742) our bot is unable to import requests 2.5.3 properly on Python 2.7 and 2.6: ``` File "/home/travis/build/wikimedia/pywikibot-core/.eggs/flickrapi-2.0-py2.7.egg/flickrapi/__init__.py", line 52, in <module> File "/home/travis/build/wikimedia/pywikibot-core/.eggs/flickrapi-2.0-py2.7.egg/flickrapi/core.py", line 13, in <module> File "/home/travis/build/wikimedia/pywikibot-core/.eggs/flickrapi-2.0-py2.7.egg/flickrapi/tokencache.py", line 10, in <module> File "/home/travis/build/wikimedia/pywikibot-core/.eggs/flickrapi-2.0-py2.7.egg/flickrapi/auth.py", line 23, in <module> File "build/bdist.linux-x86_64/egg/requests_toolbelt/__init__.py", line 19, in <module> File "build/bdist.linux-x86_64/egg/requests_toolbelt/adapters/__init__.py", line 12, in <module> File "build/bdist.linux-x86_64/egg/requests_toolbelt/adapters/ssl.py", line 12, in <module> SSLError -- exception raised for I/O errors File "/home/travis/build/wikimedia/pywikibot-core/.eggs/requests-2.5.3-py2.7.egg/requests/__init__.py", line 53, in <module> from .packages.urllib3.contrib import pyopenssl File "/tmp/easy_install-uogHkP/requests-2.5.3/requests/packages/__init__.py", line 49, in load_module ``` It seems that this only happened recently. [This run](https://travis-ci.org/wikimedia/pywikibot-core/jobs/51751467#L478) worked fine with 2.5.1. Locally I could install flickrapi (2.0) and requests (2.5.3) and import them without problems on Python 2.7.8.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2470/reactions" }
https://api.github.com/repos/psf/requests/issues/2470/timeline
null
completed
null
null
false
[ "Thanks for the report!\n\nThis is a known regression in 2.5.2 and 2.5.3, with a fix being tracked under #2466.\n" ]
https://api.github.com/repos/psf/requests/issues/2469
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2469/labels{/name}
https://api.github.com/repos/psf/requests/issues/2469/comments
https://api.github.com/repos/psf/requests/issues/2469/events
https://github.com/psf/requests/issues/2469
59,631,918
MDU6SXNzdWU1OTYzMTkxOA==
2,469
RuntimeWarning when using eventlet for patching requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/5242977?v=4", "events_url": "https://api.github.com/users/imazor/events{/privacy}", "followers_url": "https://api.github.com/users/imazor/followers", "following_url": "https://api.github.com/users/imazor/following{/other_user}", "gists_url": "https://api.github.com/users/imazor/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/imazor", "id": 5242977, "login": "imazor", "node_id": "MDQ6VXNlcjUyNDI5Nzc=", "organizations_url": "https://api.github.com/users/imazor/orgs", "received_events_url": "https://api.github.com/users/imazor/received_events", "repos_url": "https://api.github.com/users/imazor/repos", "site_admin": false, "starred_url": "https://api.github.com/users/imazor/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/imazor/subscriptions", "type": "User", "url": "https://api.github.com/users/imazor", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-03-03T11:58:19Z
2018-04-23T14:41:55Z
2015-03-03T15:32:27Z
NONE
resolved
Hi, I am getting the following warning while using requests with eventlet: /usr/local/lib/python2.7/site-packages/requests/utils.py:74: RuntimeWarning: Parent module 'requests' not found while handling absolute import from netrc import netrc, NetrcParseError Looking at the issue below, which has a similar warning: https://github.com/kennethreitz/requests/issues/2014 could it be related to the fact (if I am not wrong) that during the eventlet patching, requests is removed from sys.path? From the eventlet docs: http://web.archive.org/web/20110723123938/http://www.eventlet.net/doc/patching.html#monkeypatching-the-standard-library eventlet.patcher.import_patched(module_name, _additional_modules, *_kw_additional_modules) Imports a module in a greened manner, so that the module’s use of networking libraries like socket will use Eventlet’s green versions instead. The only required argument is the name of the module to be imported: import eventlet httplib2 = eventlet.import_patched('httplib2') Under the hood, it works by temporarily swapping out the “normal” versions of the libraries in sys.modules for an eventlet.green equivalent. When the import of the to-be-patched module completes, the state of sys.modules is restored. Therefore, if the patched module contains the statement ‘import socket’, import_patched will have it reference eventlet.green.socket. One weakness of this approach is that it doesn’t work for late binding (i.e. imports that happen during runtime). Late binding of imports is fortunately rarely done (it’s slow and against PEP-8), so in most cases import_patched will work just fine. So I just want to verify that the warning is OK (because it looks like the patched version of requests works quite fine).
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2469/reactions" }
https://api.github.com/repos/psf/requests/issues/2469/timeline
null
completed
null
null
false
[ "eventlet shouldn't be affecting the import of netrc. I'm also not certain if eventlet has special case handling for requests. All eventlet should be patching is socket or httplib (or both) as far as I know, so this seems like a bug for eventlet, not us. I don't use eventlet with requests, nor does @Lukasa so neither of us can tell you that this isn't going to break anything (i.e., that the warning is okay). So if you want assurances, you'll need to search elsewhere (e.g., ask a question on StackOverflow and tag it with eventlet so that someone more familiar with eventlet will answer your question).\n", "As mentioned above, the problem is in the `eventlet` library.\r\n\r\nHere is an original tip (by @temoto) how to solve the problem: https://github.com/eventlet/eventlet/issues/208#issuecomment-76978478\r\n\r\nAnd this is my workaround:\r\n\r\n```python\r\nfrom eventlet import import_patched\r\nrequests = import_patched('requests.__init__')\r\n```\r\n\r\n## see also\r\n\r\n* https://github.com/requests/requests/issues/2469\r\n* https://github.com/requests/requests/issues/2014" ]
https://api.github.com/repos/psf/requests/issues/2468
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2468/labels{/name}
https://api.github.com/repos/psf/requests/issues/2468/comments
https://api.github.com/repos/psf/requests/issues/2468/events
https://github.com/psf/requests/pull/2468
59,588,521
MDExOlB1bGxSZXF1ZXN0MzAzNDMwNDI=
2,468
Support for bytearray when creating Request with files argument, just like str and bytes.
{ "avatar_url": "https://avatars.githubusercontent.com/u/138657?v=4", "events_url": "https://api.github.com/users/scholer/events{/privacy}", "followers_url": "https://api.github.com/users/scholer/followers", "following_url": "https://api.github.com/users/scholer/following{/other_user}", "gists_url": "https://api.github.com/users/scholer/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/scholer", "id": 138657, "login": "scholer", "node_id": "MDQ6VXNlcjEzODY1Nw==", "organizations_url": "https://api.github.com/users/scholer/orgs", "received_events_url": "https://api.github.com/users/scholer/received_events", "repos_url": "https://api.github.com/users/scholer/repos", "site_admin": false, "starred_url": "https://api.github.com/users/scholer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/scholer/subscriptions", "type": "User", "url": "https://api.github.com/users/scholer", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-03-03T03:07:04Z
2021-09-08T08:01:07Z
2015-03-03T20:01:31Z
CONTRIBUTOR
resolved
When creating a Request with a files argument, the "fp" part of a file entry can currently be either (a) a str, (b) a bytes, or (c) any other object that has a .read() method that returns data. For str or bytes, StringIO or BytesIO wrapper objects are created. When building data in memory, a bytearray is often more convenient and memory-efficient compared to immutable bytes. This simple change allows the models.RequestEncodingMixin._encode_files method to use a bytearray as fp in addition to the currently supported bytes and str input. This means that users do not have to worry about creating a BytesIO object. I've seen requests users creating bytes objects from bytearrays before passing them to requests. This, of course, doubles the memory consumption and is completely unnecessary. Yes, the correct approach from the user's side would be to create a BytesIO object (instead of bytes). But not all users seem to be aware of this. Letting requests take bytearrays means that it would "just work" when the user passes these to requests, and thus avoids the potential pitfall where a user just says "requests chokes on bytearrays, so convert to str." (Actual comment from pyzotero, a principal python client module to consume Zotero's REST API). I'm not sure if this simple change requires its own test, but I've added one just in case. It also tests for bytes objects, which was not covered by the existing test, test_can_send_nonstring_objects_with_files. Perhaps I should just expand test_can_send_nonstring_objects_with_files to also cover bytes and bytearray? It currently only covers str (which are, of course, also bytes in Python 2). The test suite currently does fail in TestMorselToCookieExpires.test_expires_valid_str; however, this is unrelated to this change and was also failing before, in commit 461b740d. Compatibility: bytearray and the byte literal (used in the test) are both available in Python 2.6 and forward. Let me know if there is anything else that you would need in order to accept this pull request.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2468/reactions" }
https://api.github.com/repos/psf/requests/issues/2468/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2468.diff", "html_url": "https://github.com/psf/requests/pull/2468", "merged_at": "2015-03-03T20:01:31Z", "patch_url": "https://github.com/psf/requests/pull/2468.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2468" }
true
[ "This looks great, I see no reason not to merge this. :cake:\n\n@sigmavirus24, I'll let you pull the trigger.\n", "After further thought, I'm thinking maybe it would be better to just use the str/bytes/bytearray object that the user provided directly? This would cut the number of in-memory file data copies from 3 to 1. E.g.\n\n```\nif isinstance(fp, (str, bytes, bytearray)):\n data = fp\nelse:\n data = fp.read()\n```\n\nFor reference, the existing code with annotation of the created copies:\n\n```\n# We already have one copy of the file data, provided by the user\nif isinstance(fp, str):\n fp = StringIO(fp)\nif isinstance(fp, (bytes, bytearray)):\n fp = BytesIO(fp) # 2nd copy\n\nrf = RequestField(name=k, data=fp.read(), # calling fp.read() creates a 3rd copy.\n filename=fn, headers=fh)\n```\n\nThe data is not modified down stream, only used in `encode_multipart_formdata` as:\n\n```\nbody.write(data)\n```\n\nSo it should be ok to use the data copy that the user provided.\n", "Hi, \nI've updated the test case to use string input rather than float. The test case is just a modified version of test_can_send_nonstring_objects_with_files, which had (and still has) float input, data = {'a': 0.0}.\n", "Regarding my previous note on reducing the number of copies created by the use of BytesIO and StringIO: This is apparently going to be addressed in Python 3.5, where BytesIO will only create a copy of the initial data when it is changed: http://bugs.python.org/issue22003.\nI'm not really sure whether fp.read() will then also just return the original data, or if it will return a copy.\n", "Thanks @scholer! :cake: \n", "Thank you for accepting my pull request, @sigmavirus24 :)\n" ]
https://api.github.com/repos/psf/requests/issues/2467
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2467/labels{/name}
https://api.github.com/repos/psf/requests/issues/2467/comments
https://api.github.com/repos/psf/requests/issues/2467/events
https://github.com/psf/requests/issues/2467
59,567,494
MDU6SXNzdWU1OTU2NzQ5NA==
2,467
timeout issues for chunked responses
{ "avatar_url": "https://avatars.githubusercontent.com/u/939977?v=4", "events_url": "https://api.github.com/users/zsalzbank/events{/privacy}", "followers_url": "https://api.github.com/users/zsalzbank/followers", "following_url": "https://api.github.com/users/zsalzbank/following{/other_user}", "gists_url": "https://api.github.com/users/zsalzbank/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zsalzbank", "id": 939977, "login": "zsalzbank", "node_id": "MDQ6VXNlcjkzOTk3Nw==", "organizations_url": "https://api.github.com/users/zsalzbank/orgs", "received_events_url": "https://api.github.com/users/zsalzbank/received_events", "repos_url": "https://api.github.com/users/zsalzbank/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zsalzbank/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zsalzbank/subscriptions", "type": "User", "url": "https://api.github.com/users/zsalzbank", "user_view_type": "public" }
[]
closed
true
null
[]
null
41
2015-03-02T23:11:27Z
2021-09-08T23:05:58Z
2015-03-03T18:12:39Z
NONE
resolved
When I set a timeout value for my simple GET request, the `requests.get` call takes a really long time to complete when I don't iterate over the returned content. If I iterate over the content, the `requests.get` call comes back very quickly. I've delved pretty deep into the requests source code and can't figure out what the issue is. Here is a test case to illustrate my issue: https://gist.github.com/zsalzbank/90ebcaaf94f6c34e9559 What I would expect to happen is that the non-streamed and non-iterated responses would return just as quickly as all the other tests, but instead, they take 25x longer. When the timeout is removed, the responses come back quickly as well. From my understanding of the timeout parameter, it is only timing out for the initial data response from the server. I know that some data comes back quickly because I modified the requests library and printed out the data as soon as it comes back (added a `print` after this line: https://github.com/kennethreitz/requests/blob/master/requests/models.py#L655). What is going on here? Am I doing something wrong on the server side? I know that no `Content-Length` comes back from the server, but `Connection: close` does come back. I'm not sure if this is related to #1041 (it sounds similar).
{ "avatar_url": "https://avatars.githubusercontent.com/u/939977?v=4", "events_url": "https://api.github.com/users/zsalzbank/events{/privacy}", "followers_url": "https://api.github.com/users/zsalzbank/followers", "following_url": "https://api.github.com/users/zsalzbank/following{/other_user}", "gists_url": "https://api.github.com/users/zsalzbank/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zsalzbank", "id": 939977, "login": "zsalzbank", "node_id": "MDQ6VXNlcjkzOTk3Nw==", "organizations_url": "https://api.github.com/users/zsalzbank/orgs", "received_events_url": "https://api.github.com/users/zsalzbank/received_events", "repos_url": "https://api.github.com/users/zsalzbank/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zsalzbank/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zsalzbank/subscriptions", "type": "User", "url": "https://api.github.com/users/zsalzbank", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2467/reactions" }
https://api.github.com/repos/psf/requests/issues/2467/timeline
null
completed
null
null
false
[ "Can you post your values? Because I don't see this at all:\n\nT=10, stream iter took: 0.515799999237\nT=10, stream all took: 0.428910970688\nT=10, no stream took: 0.423730134964\nT=None, stream iter took: 0.422012805939\nT=None, stream all took: 0.445449113846\nT=None, no stream took: 0.433798074722\n", "Oh _hey_, that's because I ran on Python 2. If I run on Python 3, I definitely see this effect:\n\nT=10, stream iter took: 0.4896872043609619\nT=10, stream all took: 4.6765148639678955\nT=10, no stream took: 5.007489919662476\nT=None, stream iter took: 0.4465291500091553\nT=None, stream all took: 0.43279504776000977\nT=None, no stream took: 0.4505140781402588\n", "Switching to HTTP solves this problem:\n\nT=10, stream iter took: 0.28220510482788086\nT=10, stream all took: 0.24484610557556152\nT=10, no stream took: 0.20595622062683105\nT=None, stream iter took: 0.2057638168334961\nT=None, stream all took: 0.2119450569152832\nT=None, no stream took: 0.21066498756408691\n", "So, that makes this a fun bug in the interaction between SSL, timeouts, and streaming on Python 3. That is scarily specific.\n", "So, I just ran the following timings on my box:\n\nT=10, stream iter took: 0.8549139499664307\nT=10, stream all took: 4.98350191116333\nT=10, no stream took: 4.616538047790527\nT=None, stream iter took: 0.3969600200653076\nT=None, stream all took: 0.5639438629150391\nT=None, no stream took: 0.3719618320465088\n\nAt the same time, I took a Wireshark trace of the packet flow. The key difference is in the slow cases there's a very long wait between the ACK of the response data (which all comes in a single TCP segment) and the connection being torn down. In these cases, the connection gets torn down by the server, whereas in the other cases we tear it down ourselves.\n", "So the problem seems to be that `Connection: close` is being ignored when using `response.content` on Python 3 over HTTPS.\n\nWat?\n", "Oh thank god, I'm not going mad. By changing the gist to use a chunk size of `10*1024` in the iter_content stanza (just like response.content does), I get a new set of timings:\n\nT=10, stream iter took: 5.412540912628174\nT=10, stream all took: 5.003500938415527\nT=10, no stream took: 4.998499155044556\nT=None, stream iter took: 0.4110419750213623\nT=None, stream all took: 0.3670358657836914\nT=None, no stream took: 0.3610358238220215\n\nThis means the problem is actually the chunk size, on Python 3, over HTTPS, with timeouts set.\n", "Profiling the code with just the three slow runs shows that almost all the time is spent in the `SSLSocket`'s `read` method: 14 seconds, or 90% of program execution time. This means that socket timeouts are likely the problem here.\n", "If socket timeouts are the problem, that means we're reading on a socket that has no further data available, waiting for more. Someone in the SSL stack is getting this wrong. Here's the call stack:\n\n```\n13. ~:0(<method 'read' of '_ssl._SSLSocket' objects>)\n12. C:\\Python34\\Lib\\ssl.py:609(read)\n11. C:\\Python34\\Lib\\ssl.py:735(recv_into)\n10. C:\\Python34\\Lib\\socket.py:357(readinto)\n9. ~:0(<method 'readinto' of '_io.BufferedReader' objects>)\n8. C:\\Python34\\Lib\\http\\client.py:520(readinto)\n7. ~:0(<function HTTPResponse.read at 0x02BB97C8>)\n6. C:\\Python34\\Lib\\http\\client.py:489(read)\n5. C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\response.py:143(read)\n4. C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\response.py:239(stream)\n3. 
C:\\Python34\\lib\\site-packages\\requests\\models.py:651(generate)\n2. test.py:7(test)\n1. test.py:1(<module>)\n```\n\nThe obvious risks are everything from 8 to 13, but specifically 11 to 13 look risky.\n", "So, I don't think it's `SocketIO.readinto`, because if a timeout is raised there it gets thrown up the stack, which _should_ abort the `BufferedReader.readinto` method early (though I can't guarantee it because it's a C extension and I have limited experience writing them, so I don't know what their exception handling logic is like.\n\nI think the problem here is way lower. If I change the read chunk size to 7 the read also fails, so any read size that isn't a nice multiple of the total data being transferred seems to be good enough here.\n", "Oh hey! Here's something that's likely to be relevant: the server doesn't return a `content-length` _or_ send in chunked encoding. Pretty sure that's a spec violation right there!\n", "Don't know why I didn't process this earlier. That totally solves it.\n\nHere's a minimal reproduction of the problem, involving only the standard library:\n\n``` python\nimport time\nimport http.client\nimport socket\n\ndef test(timeout):\n start = time.time()\n c = http.client.HTTPSConnection('sleepy-reaches-6892.herokuapp.com', 443)\n c.connect()\n if timeout:\n c.sock.settimeout(timeout)\n c.request('GET', '/test')\n r = c.getresponse()\n while True:\n data = r.read(7)\n if not data:\n break\n print(\"T={}, raw connection took: {}\".format(timeout, time.time() - start))\n\ntest(10)\ntest(0)\n```\n", "Thanks for digging into this more. I did as much as I could and thought it might have something to do with that, but wasn't sure.\n\nWhy does it go quickly when there is no timeout - shouldn't that just hang forever? Seems inconsistent to me.\n\nDo you think that returning chunked or a content length will solve the problem? Is there a way to work around it using the library?\n\nObviously, the script above is just using a test heroku app that I made to illustrate the problem, but it seems like heroku might be modifying my response. According to heroku (https://devcenter.heroku.com/articles/http-routing#http-validation-and-restrictions):\n\n> Additionally, while HTTP/1.1 requests and responses are expected to be keep-alive by default, if the initial request had an explicit connection: close header from the router to the dyno, the dyno can send a response delimited by the connection termination, without a specific content-encoding nor an explicit content-length.\n\nNot sure if the above has anything to do with this (and I understand this might not be the place to discuss it), but it seems related. I imagine this is a pretty common case (using heroku to serve an API and requests to consume it).\n", "> Why does it go quickly when there is no timeout - shouldn't that just hang forever? Seems inconsistent to me.\n\nIt's a good question. The problem seems to be the interaction with OpenSSL, where we can get stuck in a call that would otherwise be detected as blocking. I _think_ that the specific problem is that we make another call to `socket.socket.recv_into` when we should be able to tell that the communication is over. In the HTTP case we correctly detect that and abort early, so OpenSSL is hiding some useful socket information from us.\n", "Something vitally different happens with recv_into in the OpenSSL and non-OpenSSL case. 
I think stepping through with PDB in both cases might be necessary.\n", "> Do you think that returning chunked or a content length will solve the problem?\n\nI would say, yes. Well-behaved servers get good behaviour out of requests. It's impossible for us to behave better otherwise in every possible scenario an application developer can think of. Further, it is entirely plausible that even if we can find out what OpenSSL is hiding from us, there will not be a way for us to fix it (work around OpenSSL). The faster solution, almost certainly, will be for your application behave as has been defined by RFCs that have been around for ages. Either:\n1. properly chunk the response (most wsgi-based frameworks support this easily and with good APIs)\n2. add a content-length\n\nI'm confident that either will fix this problem. It'll also fix it far faster than the fix would take for us to release (since we'll have to dig pretty deep on this and there's a lot that can mislead us, as you've already seen).\n", "@sigmavirus24 is entirely right. Testing against http2bin.org (which correctly returns a content-length) does not reveal this bug.\n\nFocusing further on this behaviour, the problem is very, very deep. So far I've tracked it as low as line 371 of socket.py, in the `SocketIO.readinto` method, which looks like this:\n\n``` python\n def readinto(self, b):\n \"\"\"Read up to len(b) bytes into the writable buffer *b* and return\n the number of bytes read. If the socket is non-blocking and no bytes\n are available, None is returned.\n\n If *b* is non-empty, a 0 return value indicates that the connection\n was shutdown at the other end.\n \"\"\"\n self._checkClosed()\n self._checkReadable()\n if self._timeout_occurred:\n raise OSError(\"cannot read from timed out object\")\n while True:\n try:\n return self._sock.recv_into(b)\n except timeout:\n self._timeout_occurred = True\n raise\n except InterruptedError:\n continue\n except error as e:\n if e.args[0] in _blocking_errnos:\n return None\n raise\n```\n\nThis method calls `socket.recv_into` for non-SSL methods, and `SSLSocket.recv_into` for SSL. This is where the problem seems to lie. That recv_into method seems to block for quite a long time in the SSL case, presumably waiting on an underlying socket timeout.\n", "I have found it.\n\nDeep, deep down in the C code for the SSL module there is a socket read function, which is what is used by `recv_into`. This function begins [here](https://github.com/python/cpython/blob/26f18b1c18ae390c29a697d561767f205859ef1d/Modules/_ssl.c#L1774).\n\nEarly on in that function it checks whether there's any data pending on that socket. If there isn't (as in this final case), it calls another function, called [`check_socket_and_wait_for_timeout`](https://github.com/python/cpython/blob/26f18b1c18ae390c29a697d561767f205859ef1d/Modules/_ssl.c#L1832). That function, [defined here](https://github.com/python/cpython/blob/26f18b1c18ae390c29a697d561767f205859ef1d/Modules/_ssl.c#L1623), calls a poll with a duration equal to the socket timeout length. This will block until either the socket becomes readable or the timeout expires. 
Of course, the remote end closes the socket _before_ that timeout, which marks it readable, which causes a safe return up the stack and the empty read that tells the client library that this is probably done.\n\nI have no idea if this is expected behaviour or not, but the behaviour is far inside the standard library.\n", "And here's why we don't see it in blocking mode: [in blocking mode the function returns early](https://github.com/python/cpython/blob/26f18b1c18ae390c29a697d561767f205859ef1d/Modules/_ssl.c#L1604).\n", "But here's the thing: that behaviour isn't wrong in _the general case_. In the general case, recv_into should absolutely block until the timeout, and we do see that with non-SSL sockets. So why does this go so wrong in the HTTPS case?\n", "So at this point, it would seen this is a stdlib/language-level bug, if I'm following your research correctly @Lukasa. In that case, we should continue this on bugs.python.org. With that in mind, @zsalzbank you absolutely will not be better served by waiting for requests to cut a release because there's nothing the library can do (I don't think) that will allow us to work around this for you given the level it seems to be at. If you'd like to wait longer, you can but I think you'll see better results in general by writing RFC compliant servers.\n", "I'm happy to take this to the stdlib. I'll open a bug.\n", "Understood - but I'm a little confused by the RFC. It seems to me from [the spec](http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.4) that neither `Content-Length` or `Transfer-Encoding` are required (see point number 5).\n\nI realize this is my issue, but I'm having lots of trouble figuring out why my server isn't adding the header to my data. I even explicitly added `Rack::ContentLength` to another test rails app (https://limitless-dawn-3533.herokuapp.com/test) and get the same result.\n", "possible that Heroku's routing layer is doing something naughty with your\nheaders?\n\nOn Tuesday, March 3, 2015, Zachary Salzbank [email protected]\nwrote:\n\n> Understood - but I'm a little confused by the RFC. It seems to me from the\n> spec http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.4 that\n> neither Content-Length or Transfer-Encoding are required.\n> \n> I realize this is my issue, but I'm having lots of trouble figuring out\n> why my server isn't adding the header to my data. I even explicitly added\n> Rack::ContentLength to another test rails app (\n> https://limitless-dawn-3533.herokuapp.com/test) and get the same result.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2467#issuecomment-76983823\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "I suppose - I'm going to reach out to them - I guess I should make sure the app sends out some kind of header (either `Content-Length` or `Transfer-Encoding`) without their routing first.\n", "@zsalzbank Sadly, you're looking at the wrong RFC. RFC 2616 got superseded by RFCs 732[0-5]. From [RFC 7320](https://tools.ietf.org/html/rfc7230) section 3.3:\n\n> The presence of a message body in a request is signaled by a Content-Length or Transfer-Encoding header field. Request message framing is independent of method semantics, even if the method does not define any use for a message body.\n\nThe follow-on logic is in section 3.3.3, that says:\n\n> `7. 
Otherwise, this is a response message without a declared message\n> body length, so the message body length is determined by the\n> number of octets received prior to the server closing the\n> connection.`\n\nGiven that the server is _not_ closing the connection in this HTTPS case, the server is still at fault here.\n\nRequests is really doing the best it can with the information it has.\n", "By the by, this is now tracked upstream as issue [23576](http://bugs.python.org/issue23576).\n", "So there _is_ an issue here (we think), just not with requests. Just trying to understand - the `Connection: close` header should indicate that the data is finished, but that is not being interpreted correctly?\n", "Also, I do see this on Python2 as well.\n", "`Connection: close` is totally unhelpful if the server hasn't given the client any way to know when the data is over. How can the client possibly tell?\n" ]
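The gist referenced in the issue body is not reproduced in the thread; the sketch below approximates its timing comparison under the conditions identified above (HTTPS, a socket timeout, and a response with neither Content-Length nor chunked encoding). The URL is the throwaway Heroku test app mentioned in the comments and may no longer exist, so treat the script and its numbers as illustrative only.

```python
# Approximate reconstruction of the timing comparison (assumptions: the test URL
# is the Heroku app from the discussion and may be gone; timings are illustrative).
import time
import requests

URL = "https://sleepy-reaches-6892.herokuapp.com/test"

def timed(label, fetch):
    start = time.time()
    fetch()
    print("%s took %.2fs" % (label, time.time() - start))

for timeout in (10, None):
    timed("T=%s, stream iter" % timeout, lambda: b"".join(
        requests.get(URL, stream=True, timeout=timeout).iter_content(chunk_size=7)))
    timed("T=%s, no stream" % timeout, lambda: requests.get(URL, timeout=timeout).content)
```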
https://api.github.com/repos/psf/requests/issues/2466
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2466/labels{/name}
https://api.github.com/repos/psf/requests/issues/2466/comments
https://api.github.com/repos/psf/requests/issues/2466/events
https://github.com/psf/requests/pull/2466
59,348,333
MDExOlB1bGxSZXF1ZXN0MzAyMjY4NjI=
2,466
Only add VendorAliases for vendored dependencies
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
9
2015-02-28T16:53:27Z
2021-09-08T08:01:06Z
2015-03-04T13:49:12Z
CONTRIBUTOR
resolved
Closes #2465 Closes #2470 /cc @dstufft Let me know if I should send this to pip as well
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2466/reactions" }
https://api.github.com/repos/psf/requests/issues/2466/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2466.diff", "html_url": "https://github.com/psf/requests/pull/2466", "merged_at": "2015-03-04T13:49:12Z", "patch_url": "https://github.com/psf/requests/pull/2466.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2466" }
true
[ "I was doing some other work and a few corner cases popped into my head that this breaks. Previously the following would break\n\n``` py\nfrom requests.packages import sys\nfrom requests.packages.chardet import os\n```\n\nNow they'll happily succeed. \n", "I would be so much more comfortable if we weren't relying just on pip's tests, but take a look at [this run](https://travis-ci.org/pypa/pip/builds/52597911) (see also: https://github.com/pypa/pip/pull/2465) to see that this works the same way it did before, but now with the added benefit of not allowing imports like `from requests.packages.chardet import sys`.\n\nThe import error message will vary though depending on whether or not `urllib3` or `chardet` are vendored though. Perhaps we care, perhaps we don't.\n", "Also many thanks to @dstufft for helping with the final solution.\n", "@ralphbean @eriol if you two wouldn't mind testing this on Fedora and Debian respectively, I'd greatly appreciate it. (Alternatively, if y'all have scripts that I could test this with on the respective distros, I can spin up cloud images to test with.)\n", "@Lukasa should I just merge this myself to push 2.5.4 out the door?\n", "Yes, do it.\n", "Looks good to me. Sorry for the late reply.\n", "It's look good also to me. Sorry for my late reply I was a little busy.\n", "No worries. I think I'll release 2.5.5 today to include this patch for Python 2.x users\n" ]
https://api.github.com/repos/psf/requests/issues/2465
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2465/labels{/name}
https://api.github.com/repos/psf/requests/issues/2465/comments
https://api.github.com/repos/psf/requests/issues/2465/events
https://github.com/psf/requests/issues/2465
59,286,371
MDU6SXNzdWU1OTI4NjM3MQ==
2,465
Issue with pyinstaller
{ "avatar_url": "https://avatars.githubusercontent.com/u/6761036?v=4", "events_url": "https://api.github.com/users/gRanger777/events{/privacy}", "followers_url": "https://api.github.com/users/gRanger777/followers", "following_url": "https://api.github.com/users/gRanger777/following{/other_user}", "gists_url": "https://api.github.com/users/gRanger777/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gRanger777", "id": 6761036, "login": "gRanger777", "node_id": "MDQ6VXNlcjY3NjEwMzY=", "organizations_url": "https://api.github.com/users/gRanger777/orgs", "received_events_url": "https://api.github.com/users/gRanger777/received_events", "repos_url": "https://api.github.com/users/gRanger777/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gRanger777/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gRanger777/subscriptions", "type": "User", "url": "https://api.github.com/users/gRanger777", "user_view_type": "public" }
[]
closed
true
null
[]
null
20
2015-02-27T20:01:10Z
2021-09-08T15:00:39Z
2015-04-06T11:19:15Z
NONE
resolved
Can't seem to wrap my head around this. having a problem compiling a windows exe in python 2.7.9 that uses the requests library and can't find anything on google about the specific error. My script runs fine from the interpreter but when i use pyinstaller, i get :ImportError: No module named 'requests.packages.chardet.sys' I can also compile windows executables that don't use requests just fine. ``` ###Sample Script ----------------Begin #!/usr/bin/python import requests r = requests.get('https://google.com') print(r.text) ----------------End ###command run to compile into windows exe ---------------Begin pyinstaller --onefile test.py OR pyinstaller test.py ---------------End ###Output ---------------Begin C:\Python27>pyinstaller test.py 76 INFO: wrote C:\Python27\test.spec 102 INFO: Testing for ability to set icons, version resources... 125 INFO: ... resource update available 129 INFO: UPX is not available. 164 INFO: Processing hook hook-os 409 INFO: Processing hook hook-time 417 INFO: Processing hook hook-cPickle 536 INFO: Processing hook hook-_sre 773 INFO: Processing hook hook-cStringIO 944 INFO: Processing hook hook-encodings 965 INFO: Processing hook hook-codecs 1687 INFO: Extending PYTHONPATH with C:\Python27 1687 INFO: checking Analysis 1687 INFO: Building Analysis because out00-Analysis.toc non existent 1688 INFO: running Analysis out00-Analysis.toc 1690 INFO: Adding Microsoft.VC90.CRT to dependent assemblies of final executable 1781 INFO: Searching for assembly x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21 022.8_none ... 1782 WARNING: Assembly not found 1782 ERROR: Assembly x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_none no t found 1954 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\python.exe 2039 INFO: Searching for assembly x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21 022.8_none ... 2040 WARNING: Assembly not found 2042 ERROR: Assembly x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_none no t found 2263 WARNING: lib not found: MSVCR90.dll dependency of C:\Windows\system32\pytho n27.dll 2266 INFO: Analyzing C:\Python27\lib\site-packages\pyinstaller-2.1.1.dev0-py2.7. egg\PyInstaller\loader\_pyi_bootstrap.py 2284 INFO: Processing hook hook-os 2309 INFO: Processing hook hook-site 2339 INFO: Processing hook hook-encodings 2582 INFO: Processing hook hook-time 2590 INFO: Processing hook hook-cPickle 2715 INFO: Processing hook hook-_sre 2975 INFO: Processing hook hook-cStringIO 3164 INFO: Processing hook hook-codecs 3907 INFO: Processing hook hook-pydoc 4185 INFO: Processing hook hook-email 4309 INFO: Processing hook hook-httplib 4368 INFO: Processing hook hook-email.message 4517 INFO: Analyzing C:\Python27\lib\site-packages\pyinstaller-2.1.1.dev0-py2.7. egg\PyInstaller\loader\pyi_importers.py 4690 INFO: Analyzing C:\Python27\lib\site-packages\pyinstaller-2.1.1.dev0-py2.7. egg\PyInstaller\loader\pyi_archive.py 4865 INFO: Analyzing C:\Python27\lib\site-packages\pyinstaller-2.1.1.dev0-py2.7. egg\PyInstaller\loader\pyi_carchive.py 5040 INFO: Analyzing C:\Python27\lib\site-packages\pyinstaller-2.1.1.dev0-py2.7. 
egg\PyInstaller\loader\pyi_os_path.py 5069 INFO: Analyzing test.py 6014 INFO: Processing hook hook-requests 7263 INFO: Processing hook hook-xml 7445 INFO: Processing hook hook-xml.sax 7516 INFO: Processing hook hook-pyexpat 7646 INFO: Hidden import 'codecs' has been found otherwise 7648 INFO: Hidden import 'encodings' has been found otherwise 7648 INFO: Looking for run-time hooks 7830 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\lib\site-pack ages\win32\win32pipe.pyd 7987 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\select.p yd 8144 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\unicoded ata.pyd 8319 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\lib\site-pack ages\win32\win32wnet.pyd 8501 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\_hashlib .pyd 8671 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\bz2.pyd 8859 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\_ssl.pyd 9052 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\_ctypes. pyd 9223 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\pyexpat. pyd 9460 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\lib\site-pack ages\win32\win32api.pyd 9632 WARNING: lib not found: MSVCR90.dll dependency of C:\Python27\DLLs\_socket. pyd 9828 WARNING: lib not found: MSVCR90.dll dependency of C:\Windows\system32\pywin types27.dll 9848 INFO: Using Python library C:\Windows\system32\python27.dll 10016 INFO: Warnings written to C:\Python27\build\test\warntest.txt 10023 INFO: checking PYZ 10023 INFO: Rebuilding out00-PYZ.toc because out00-PYZ.pyz is missing 10024 INFO: Building PYZ (ZlibArchive) out00-PYZ.toc 12259 INFO: checking PKG 12261 INFO: Rebuilding out00-PKG.toc because out00-PKG.pkg is missing 12261 INFO: Building PKG (CArchive) out00-PKG.pkg 12286 INFO: checking EXE 12287 INFO: Rebuilding out00-EXE.toc because test.exe missing 12289 INFO: Building EXE from out00-EXE.toc 12292 INFO: Appending archive to EXE C:\Python27\build\test\test.exe 12296 INFO: checking COLLECT 12296 INFO: Building COLLECT out00-COLLECT.toc ---------------End ###What happens when running the executable ---------------Begin C:\Users\gRanger\Desktop\dist\test>test.exe Traceback (most recent call last): File "<string>", line 3, in <module> File "c:\python27\lib\site-packages\PyInstaller-2.1.1.dev0-py2.7.egg\PyInstall er\loader\pyi_importers.py", line 276, in load_module exec(bytecode, module.__dict__) File "C:\Users\gRanger\Desktop\build\test\out00-PYZ.pyz\requests", line 58, in <module> File "c:\python27\lib\site-packages\PyInstaller-2.1.1.dev0-py2.7.egg\PyInstall er\loader\pyi_importers.py", line 276, in load_module exec(bytecode, module.__dict__) File "C:\Users\gRanger\Desktop\build\test\out00-PYZ.pyz\requests.utils", line 26, in <module> File "c:\python27\lib\site-packages\PyInstaller-2.1.1.dev0-py2.7.egg\PyInstall er\loader\pyi_importers.py", line 276, in load_module exec(bytecode, module.__dict__) File "C:\Users\gRanger\Desktop\build\test\out00-PYZ.pyz\requests.compat", line 7, in <module> File "c:\python27\lib\site-packages\PyInstaller-2.1.1.dev0-py2.7.egg\PyInstall er\loader\pyi_importers.py", line 276, in load_module exec(bytecode, module.__dict__) File "C:\Users\gRanger\Desktop\build\test\out00-PYZ.pyz\requests.packages.char det", line 19, in <module> File "C:\Users\gRanger\Desktop\build\test\out00-PYZ.pyz\requests.packages", li ne 83, in load_module ImportError: No module named 
'requests.packages.chardet.sys' ---------------End ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2465/reactions" }
https://api.github.com/repos/psf/requests/issues/2465/timeline
null
completed
null
null
false
[ "So it looks like pyinstaller isn't detecting an import of the standard library's `sys` module even though it finds `httplib`, `os` and others. You should see if anyone on [StackOverflow](https://stackoverflow.com) can help you with this (or if this is a bug in pyinstaller). It's certainly not a bug in requests though\n", "So this is sort of a bug with that import machinery and sort of a bug with python 2 and sort of a bug with chardet. On the bright side, there's a fix for it that can be performed in chardet and then vendored into requests, etc. On the not-so-bright side, this _can_ affect urllib3 as well.\n\nI was trying to debug https://github.com/jakubroztocil/httpie/issues/315 with pdb to figure out why I was seeing a different error and ran into an issue with this import logic trying to import `requests.packages.urllib3.pdb` because on Py2, `import pdb` is treated as a implicit relative import first and then a non-relative import second. (Woo, thanks Python 2.) The temporary work-around was to add `from __future__ import absolute_import` to the top of the file I was trying to debug in. This, of course, could be applied to urllib3 and chardet both. I think the better option is to attempt to fix the import machinery stolen wholesale from pip.\n\n@dstufft definitely understands this code better than I do, but as I understand it now: we stop trying to import it at L83 if the `__import__(name)` (which in these cases are `chardet.sys` and `urllib3.pdb`) fails. Instead, I think we need to figure out how to try one last case to actually mimic the regular import machinery. I can imagine more complex import failures, like seeing something like `chardet.os.path` fail, so something like\n\n``` py\nimport_name = name\nwhile import_name:\n (_, import_name) = import_name.split('.', 1)\n try:\n __import__(import_name)\n module = sys.modules(import_name)\n except ImportError:\n pass\n\nif not module: # or if module is None:\n raise ImportError(...)\n\nsys.modules[name] = module\nreturn module\n```\n\nDoes that make sense?\n", "Hey, so I finally registered on stackoverflow and posted my question (and this will probably provide more detail for you to debug the issue) but I think something did break in requests. Someone told me to force version 2.5.1 and it worked flawlessly! So it appears whatever issue is plauging pyinstaller/requests has been present since version 2.5.2\n\nhttp://stackoverflow.com/questions/28775276/pyinstaller-error-importerror-no-module-named-requests-packages-chardet-sys\n\nI'm not blaming requests fully here either =) I realize that it could be an issue on pyinstaller's part still, but I just wanted to report and let you know that I was able to uninstall requests with pip and force version 2.5.1 and now i'm not having any compiling errors or execution errors.\n", "@gRanger915 We agree, there's a candidate fix in #2466.\n", "I'm afraid I'm still seeing this with requests 2.6.0 and pyinstaller at both version 2.1 and the current master ([67610f](https://github.com/pyinstaller/pyinstaller/commit/67610f2ddadf378c90bf3c8872f3b38baefcb215)). 
I've created a minimal failing case at [aanand/requests-pyinstaller-bug](https://github.com/aanand/requests-pyinstaller-bug):\n\n```\n$ make\nscript/build-linux\n>> Building Docker image\n>> Running bin/test\nImporting requests\nImported requests 2.6.0\n>> Building dist/test\n>> Running dist/test\nImporting requests\nTraceback (most recent call last):\n File \"<string>\", line 3, in <module>\n File \"/usr/local/lib/python2.7/dist-packages/PyInstaller/loader/pyi_importers.py\", line 276, in load_module\n exec(bytecode, module.__dict__)\n File \"/code/build/test/out00-PYZ.pyz/requests\", line 58, in <module>\n\n File \"/usr/local/lib/python2.7/dist-packages/PyInstaller/loader/pyi_importers.py\", line 276, in load_module\n exec(bytecode, module.__dict__)\n File \"/code/build/test/out00-PYZ.pyz/requests.utils\", line 26, in <module>\n File \"/usr/local/lib/python2.7/dist-packages/PyInstaller/loader/pyi_importers.py\", line 276, in load_module\n exec(bytecode, module.__dict__)\n File \"/code/build/test/out00-PYZ.pyz/requests.compat\", line 7, in <module>\n File \"/usr/local/lib/python2.7/dist-packages/PyInstaller/loader/pyi_importers.py\", line 276, in load_module\n exec(bytecode, module.__dict__)\n File \"/code/build/test/out00-PYZ.pyz/requests.packages.chardet\", line 19, in <module>\n File \"/code/build/test/out00-PYZ.pyz/requests.packages\", line 95, in load_module\nImportError: No module named 'requests.packages.chardet.sys'\nmake: *** [default] Error 255\n```\n", "+1\n", "+1\n", "@sigmavirus24 Seems that we're still having some trouble here with our new vendoring logic. Care to take a look?\n", "Yep it's on my list.\n", "@aanand if you can remove as many variables from your reproduction case as possible that'd be appreciated (e.g., just PyInstaller and Requests, no docker or anything else)\n", "Alternatively, if all the people taking time to reply \"+1\" can instead provide their reproduction steps, that'd be far more useful.\n", "Sorry, meant to reopen this after @aanand commented but never did.\n\nSo here's the minimal number of steps to reproduce this:\n1. Create a virtualenv\n2. `pip install pyinstaller requests`\n3. Create a file (e.g., `foo.py`) that imports requests (and optionally uses it)\n4. Run `pyinstaller -F <filename>` (e.g., `pyinstaller -F foo.py`)\n5. Run the created single-file executable (e.g., `./dist/foo`)\n\nHere's the slightly more entertaining thing. 
In a plain 2.7 environment, if I do this:\n\n```\n ❯❯❯ pip install requests==2.5.0\nCollecting requests==2.5.0\n Downloading requests-2.5.0-py2.py3-none-any.whl (464kB)\n 100% |################################| 466kB 561kB/s\nInstalling collected packages: requests\n\nSuccessfully installed requests-2.5.0\n❯❯❯ python\nPython 2.7.9 (default, Dec 13 2014, 22:20:22)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.56)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.__version__\n'2.5.0'\n>>> import sys\n>>> modules = [m for m in sys.modules.keys() if m.startswith('requests.')]\n>>> modules\n['requests.packages.urllib3.util.select', 'requests.packages.urllib3.codecs', 'requests.packages.urllib3.socket', 'requests.packages.urllib3.datetime', 'requests.time', 'requests.io', 'requests.packages.urllib3.util.errno', 'requests.packages.urllib3.packages.six.moves', 'requests.urllib2', 'requests.packages.urllib3.packages.StringIO', 'requests.utils', 'requests.packages.urllib3.fields', 'requests.adapters', 'requests.hooks', 'requests.packages.urllib3.util.logging', 'requests.packages.urllib3.packages.types', 'requests.packages.urllib3.util.socket', 'requests.packages.urllib3', 'requests.packages.urllib3.util.request', 'requests.exceptions', 'requests.packages.urllib3.response', 'requests.urlparse', 'requests.os', 'requests.packages.urllib3.util.collections', 'requests.models', 'requests.socket', 'requests.packages.urllib3.poolmanager', 'requests.packages.urllib3.Queue', 'requests.packages.urllib3.sys', 'requests.logging', 'requests.hashlib', 'requests.threading', 'requests.packages.urllib3.httplib', 'requests.sessions', 'requests.codecs', 'requests.packages.urllib3.packages.ordered_dict', 'requests.packages.urllib3.contrib', 'requests.status_codes', 'requests.auth', 'requests.re', 'requests.datetime', 'requests.packages.urllib3.mimetypes', 'requests.collections', 'requests.api', 'requests.struct', 'requests.packages.chardet', 'requests.urllib', 'requests.packages.urllib3.packages.sys', 'requests.packages.urllib3.exceptions', 'requests.packages.urllib3.util.retry', 'requests.packages.urllib3.connection', 'requests.packages.urllib3.ssl', 'requests.StringIO', 'requests.certs', 'requests.platform', 'requests.packages.urllib3.packages.ssl_match_hostname', 'requests.packages.urllib3.util.url', 'requests.packages.urllib3.util.binascii', 'requests.warnings', 'requests.packages.urllib3.util', 'requests.packages.urllib3.util.base64', 'requests.cookies', 'requests.packages.urllib3._collections', 'requests.packages.urllib3.connectionpool', 'requests.packages.urllib3.threading', 'requests.packages.urllib3.util.connection', 'requests.compat', 'requests.packages.urllib3.collections', 'requests.packages.urllib3.util.hashlib', 'requests.packages.urllib3.util.ssl_', 'requests.packages', 'requests.packages.urllib3.zlib', 'requests.cgi', 'requests.base64', 'requests.packages.urllib3.util.ssl', 'requests.packages.urllib3.filepost', 'requests.cookielib', 'requests.json', 'requests.Cookie', 'requests.packages.urllib3.packages.operator', 'requests.packages.urllib3.packages.ssl_match_hostname.ssl', 'requests.packages.chardet.sys', 'requests.packages.urllib3.uuid', 'requests.packages.urllib3.packages._abcoll', 'requests.packages.urllib3.util.response', 'requests.packages.urllib3.util.timeout', 'requests.structures', 'requests.packages.urllib3.packages.thread', 'requests.packages.urllib3.errno', 'requests.packages.urllib3.urllib', 
'requests.packages.urllib3.request', 'requests.sys', 'requests.packages.urllib3.urlparse', 'requests.packages.urllib3.packages', 'requests.packages.urllib3.email', 'requests.packages.urllib3.util.time', 'requests.packages.urllib3.io', 'requests.packages.urllib3.logging', 'requests.packages.urllib3.warnings', 'requests.packages.urllib3.packages.six']\n>>> 'requests.packages.chardet.sys' in modules\nTrue\n>>> print sys.modules['requests.packages.chardet.sys']\nNone\n```\n\nSo this is, in a large part, a wart in how the import machinery in Python 2.x works. If PyInstaller weren't experimental on Python 3, I'd spend time seeing if I could reproduce it there, but I suspect it wouldn't happen. I have a hunch for how to fix this though (and it doesn't involve needing to toy with the custom import machinery any further).\n", "Further intrigue, pyinstaller places meta-path modifiers into `sys.meta_path` so debugging into `load_module` shows:\n\n```\n[<pyi_importers.BuiltinImporter object at 0x105a32990>, <pyi_importers.FrozenImporter object at 0x105a32a10>, <pyi_importers.CExtensionImporter object at 0x105a32b90>, <requests.packages.VendorAlias object at 0x105b0eb90>]\n```\n\nI think the existence of those Importer objects is preventing the default Python behaviour from taking over. (In other words, Python is attempting to do a module local import before doing an absolute import and those meta_path modifiers are preventing it from working as expected.)\n\nSo there are two options here:\n1. Update requests, chardet, and urllib3 to add `from __future__ import absolute_import` to the top of each submodule (annoying but in general just a good idea and will help with Py2/Py3 import behaviour compatibility).\n2. Start down the rabbit hole I had started chasing the first time. - Develop an algorithm that continuously tries to import an algorithm. Right now we do the following:\n - Given a name, e.g., `requests.packages.chardet.sys`, try to import that\n - If that fails, strip `requests.packages.` from the name and try to import that\n - If that fails, give up\n \n We could modify that to instead then attempt to strip `requests.packages.chardet.` or `requests.packages.urllib3` from the name but then something like `requests.packages.urllib3.contrib.pyopenssl.ssl`.\n\nI have some more experimentation that I'd like to play with, before acting on either of these, but I strongly prefer option 1.\n", "So the simplest solution, although not one I like best, is to insert the `VendorAlias` into the first position of `sys.meta_path` (e.g., `sys.meta_path.insert(0, VendorAlias(...))`) and the resulting script behaves the same with (this slightly modified) 2.6.0 as with 2.5.0.\n\nThat said, I'm still in favor of 1. although it seems far less necessary.\n", "So I read a bit of the PEP that explains the part `sys.meta_path` plays and talked to @dstufft a bit on IRC. Provided the way PyInstaller registers it's import hooks on the meta path you can see that it registers it's \"BuiltinImporter\" first. Since the first hook to return a module wins, we probably want to register our hook first. This allows Python 2's behaviour of try an implicit relative import then try an absolute import to kind of just work.\n\nAs a quick fix, I'm going to change our behaviour from appending to the meta path to inserting at the start of it. We kind of want that behaviour anyway.\n", "Nice. #2533 fixes it for me.\n\nIs there an ETA for this fix landing in a stable version?\n", "Not really. 
We'll probably discuss if we want to release 2.6.1 during PyCon or if we want to wait and get a bit more in for a 2.7.0\n", "Just checking in. Any chance of this making it into a release soon? I'm eager to get it downstream into Docker Compose.\n", "Still getting this with Python 3.5, pyinstaller 3.2 and requests 2.7.0. Any update?\n\nEdit: Nevermind, I seem to have had something nasty in my `./build`, once removed and rebuilt it seems to work. \n", "I still have this issue, and my fix was to do sth following in the spec file:\n\n```\nimport requests\nrequests_base_dir = os.path.dirname(requests.__file__)\nrequests_tree = Tree(requests_base_dir, prefix='requests')\n....\ncoll = COLLECT(exe,\n a.binaries,\n a.zipfiles,\n a.datas,\n icon_tree,\n data_tree,\n astropy_tree,requests_tree,\n strip=None,\n upx=True,\n name=vaex.__build_name__)\n```\n\nThis was the source is copied, and found, this also works for difficult packages, such as astropy, hope this helps people in the future.\n" ]
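The fix described near the end of the thread above hinges on the order of `sys.meta_path`: the first finder to answer for a name wins, so moving the vendor alias from the end of the list to position 0 lets it answer before PyInstaller's importers. A toy sketch (not requests' or PyInstaller's code) that makes the ordering visible:

```python
import sys


class NoisyFinder:
    """Toy finder that reports which import names it is asked about and
    then declines, so the normal import machinery still does the work."""

    def __init__(self, label):
        self.label = label

    def find_spec(self, fullname, path=None, target=None):
        print("%s finder asked about %r" % (self.label, fullname))
        return None


# Finders are consulted in list order: inserting at position 0 (the fix
# described above) means this hook is asked before any other finder;
# appending (the old behaviour) means it is only reached if every other
# finder has already declined.
sys.meta_path.insert(0, NoisyFinder("first"))
sys.meta_path.append(NoisyFinder("last"))

import colorsys  # noqa: E402  -- only the "first" finder gets to see this,
                 # because the standard path finder satisfies the import
                 # before the appended one is ever consulted.
```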
https://api.github.com/repos/psf/requests/issues/2464
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2464/labels{/name}
https://api.github.com/repos/psf/requests/issues/2464/comments
https://api.github.com/repos/psf/requests/issues/2464/events
https://github.com/psf/requests/issues/2464
59,223,589
MDU6SXNzdWU1OTIyMzU4OQ==
2,464
Multiple requests suggested way
{ "avatar_url": "https://avatars.githubusercontent.com/u/4975761?v=4", "events_url": "https://api.github.com/users/nsorros/events{/privacy}", "followers_url": "https://api.github.com/users/nsorros/followers", "following_url": "https://api.github.com/users/nsorros/following{/other_user}", "gists_url": "https://api.github.com/users/nsorros/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nsorros", "id": 4975761, "login": "nsorros", "node_id": "MDQ6VXNlcjQ5NzU3NjE=", "organizations_url": "https://api.github.com/users/nsorros/orgs", "received_events_url": "https://api.github.com/users/nsorros/received_events", "repos_url": "https://api.github.com/users/nsorros/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nsorros/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nsorros/subscriptions", "type": "User", "url": "https://api.github.com/users/nsorros", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-27T11:20:25Z
2021-09-08T23:05:58Z
2015-02-27T12:58:17Z
NONE
resolved
Hi, I am trying to do multiple requests without having to wait for each one to finish. I was using grequests but I am wondering if there is a better way to handle it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2464/reactions" }
https://api.github.com/repos/psf/requests/issues/2464/timeline
null
completed
null
null
false
[ "grequests is the best way to handle it at the moment if you like using gevent. Your other options are [trequests](https://github.com/1stvamp/trequests), [treq](https://github.com/dreid/treq), [async-requests](https://github.com/inglesp/async-requests), and there are likely some others floating around that I don't know about.\n" ]
https://api.github.com/repos/psf/requests/issues/2463
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2463/labels{/name}
https://api.github.com/repos/psf/requests/issues/2463/comments
https://api.github.com/repos/psf/requests/issues/2463/events
https://github.com/psf/requests/issues/2463
59,156,605
MDU6SXNzdWU1OTE1NjYwNQ==
2,463
Ampersand (&) added for no reason?
{ "avatar_url": "https://avatars.githubusercontent.com/u/100194?v=4", "events_url": "https://api.github.com/users/GAZ082/events{/privacy}", "followers_url": "https://api.github.com/users/GAZ082/followers", "following_url": "https://api.github.com/users/GAZ082/following{/other_user}", "gists_url": "https://api.github.com/users/GAZ082/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/GAZ082", "id": 100194, "login": "GAZ082", "node_id": "MDQ6VXNlcjEwMDE5NA==", "organizations_url": "https://api.github.com/users/GAZ082/orgs", "received_events_url": "https://api.github.com/users/GAZ082/received_events", "repos_url": "https://api.github.com/users/GAZ082/repos", "site_admin": false, "starred_url": "https://api.github.com/users/GAZ082/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GAZ082/subscriptions", "type": "User", "url": "https://api.github.com/users/GAZ082", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-02-26T22:41:58Z
2021-09-08T23:05:59Z
2015-02-27T02:01:04Z
NONE
resolved
Hi guys, could you confirm if this is a bug or works as intended? ``` python import requests import json url = "http://server:8080/jsonrpc?request="; headers = {'content-type': 'application/json'}; payload = {blah}; r = requests.get(url, params=json.dumps(payload), headers=headers); print(r.url); ``` It outputs the encoded string BUT for no reason adds an ampersand as below: http://server:8080/jsonrpc?request=**&**{blah} This provokes parse errors when querying XBMC/KODI server. Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/100194?v=4", "events_url": "https://api.github.com/users/GAZ082/events{/privacy}", "followers_url": "https://api.github.com/users/GAZ082/followers", "following_url": "https://api.github.com/users/GAZ082/following{/other_user}", "gists_url": "https://api.github.com/users/GAZ082/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/GAZ082", "id": 100194, "login": "GAZ082", "node_id": "MDQ6VXNlcjEwMDE5NA==", "organizations_url": "https://api.github.com/users/GAZ082/orgs", "received_events_url": "https://api.github.com/users/GAZ082/received_events", "repos_url": "https://api.github.com/users/GAZ082/repos", "site_admin": false, "starred_url": "https://api.github.com/users/GAZ082/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GAZ082/subscriptions", "type": "User", "url": "https://api.github.com/users/GAZ082", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2463/reactions" }
https://api.github.com/repos/psf/requests/issues/2463/timeline
null
completed
null
null
false
[ "It is indeed working as intended.\n\nThe problem is this: we have no real way of knowing what you mean when you provide part of the query string as part of the URL and part in the 'params' section. You're doing something particularly unexpected by passing parameters as a string rather than a mapping (though I suspect that's for the sake of this limited example and not your actual code).\n\nThe position is generally that if you want us to build your query string you should let us build the whole thing, or at least build whole name-value pairs. Don't ask us to build a part of a name-value pair, because it's really hard to see when that's intended behaviour.\n\nI don't really consider this a bug, more a sharp edge in the API.\n", "Mmmm. So basically, i messed it up when put **?request=**\n", "@GAZ082 the following should work:\n\n``` py\nimport requests\nimport json\n\nurl = \"http://server:8080/jsonrpc\";\nheaders = {'content-type': 'application/json'};\npayload = {'foo': 'bar'}\nparams = {'request': json.dumps(payload)}\n\nr = requests.get(url, params=params, headers=headers)\n\nprint(r.url)\n```\n\nIn the future though, questions like this will be answered quickly on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests)\n", "Thank you sigma.\n" ]
https://api.github.com/repos/psf/requests/issues/2462
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2462/labels{/name}
https://api.github.com/repos/psf/requests/issues/2462/comments
https://api.github.com/repos/psf/requests/issues/2462/events
https://github.com/psf/requests/issues/2462
59,059,558
MDU6SXNzdWU1OTA1OTU1OA==
2,462
Error in `requests.packages.__init__` (version > 2.5.1)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1850488?v=4", "events_url": "https://api.github.com/users/p-p-m/events{/privacy}", "followers_url": "https://api.github.com/users/p-p-m/followers", "following_url": "https://api.github.com/users/p-p-m/following{/other_user}", "gists_url": "https://api.github.com/users/p-p-m/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/p-p-m", "id": 1850488, "login": "p-p-m", "node_id": "MDQ6VXNlcjE4NTA0ODg=", "organizations_url": "https://api.github.com/users/p-p-m/orgs", "received_events_url": "https://api.github.com/users/p-p-m/received_events", "repos_url": "https://api.github.com/users/p-p-m/repos", "site_admin": false, "starred_url": "https://api.github.com/users/p-p-m/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/p-p-m/subscriptions", "type": "User", "url": "https://api.github.com/users/p-p-m", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
8
2015-02-26T11:36:49Z
2021-09-08T23:05:54Z
2015-03-14T11:20:33Z
NONE
resolved
After update to requests 2.5.3 in my project I got such error. ``` File "/home/pavel/Python/projects/myproject/requests-2.5.3-py2.7.egg/requests/__init__.py", line 53, in <module> from .packages.urllib3.contrib import pyopenssl File "/tmp/easy_install-NUp7Qp/requests-2.5.3/requests/packages/__init__.py", line 49, in load_module AttributeError: 'NoneType' object has no attribute 'modules' Error in atexit._run_exitfuncs: Traceback (most recent call last): File "/usr/lib/python2.7/atexit.py", line 24, in _run_exitfuncs func(*targs, **kargs) File "/usr/lib/python2.7/multiprocessing/util.py", line 284, in _exit_function info('process shutting down') TypeError: 'NoneType' object is not callable Error in sys.exitfunc: Traceback (most recent call last): File "/usr/lib/python2.7/atexit.py", line 24, in _run_exitfuncs func(*targs, **kargs) File "/usr/lib/python2.7/multiprocessing/util.py", line 284, in _exit_function info('process shutting down') TypeError: 'NoneType' object is not callable ``` As I understand it occurs because of adding copy of pip's machinery (https://github.com/kennethreitz/requests/commit/d61540551943df57aa0dece5e44e130309dcafec)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2462/reactions" }
https://api.github.com/repos/psf/requests/issues/2462/timeline
null
completed
null
null
false
[ "This isn't a bug in requests. [`sys.modules`](https://github.com/kennethreitz/requests/blob/461b740db6ae3d3ab1c5d975b657307f5c630fcb/requests/packages/__init__.py#L49) is something we should be able to expect to exist reasonably. The problem is that you're using multiprocessing and it is doing something bizarre to the `sys` module. Not only is it affecting `requests`, it's also affecting `atexit` which you'll not by:\n\n``` py\nError in atexit._run_exitfuncs:\nTraceback (most recent call last):\n File \"/usr/lib/python2.7/atexit.py\", line 24, in _run_exitfuncs\n func(*targs, **kargs)\n File \"/usr/lib/python2.7/multiprocessing/util.py\", line 284, in _exit_function\n info('process shutting down')\nTypeError: 'NoneType' object is not callable\n```\n\nand \n\n``` py\nError in sys.exitfunc:\nTraceback (most recent call last):\n File \"/usr/lib/python2.7/atexit.py\", line 24, in _run_exitfuncs\n func(*targs, **kargs)\n File \"/usr/lib/python2.7/multiprocessing/util.py\", line 284, in _exit_function\n info('process shutting down')\nTypeError: 'NoneType' object is not callable\n```\n\nSo something is causing `sys` to be undefined in multiple contexts (my guess is the garbage collector) and not only is that negatively affecting requests, it's negatively affecting the standard library's `atexit`. pip's import machinery is not causing `sys` to be undefined.\n", "I came across this issue as well when running integration tests in Python 2.7. I was able to reproduce it with a simple test case:\nhttps://github.com/sheppard/requests-test\n\nIt seems like it happens when setuptools downloads an egg file during tests, if requests isn't already installed. Running the tests a second time usually works. For Travis CI I was able to work around this by preinstalling requests before running the tests.\n", "@sigmavirus24 What's the odds this is related to our import machinery?\n", "I've seen this issue with both setuputils (via dependencies listed in tests_require=['requests']) and tox (using deps = ... in tox.ini)\n\n```\n18:06:49 import requests\n18:06:49 .eggs/requests-2.5.2-py2.7.egg/requests/__init__.py:53: in <module>\n18:06:49 from .packages.urllib3.contrib import pyopenssl\n18:06:49 /tmp/easy_install-HlEuTN/requests-2.5.2/requests/packages/__init__.py:49: in load_module\n18:06:49 ???\n18:06:49 E AttributeError: 'NoneType' object has no attribute 'modules'\n```\n\nPinning requests to 2.5.0 fixed the issue with both tox and setuputils.\n", "Seeing this too https://travis-ci.org/ckan/ckanapi/jobs/53942761#L177\n\nLooks like the `packages/__init__.py` module is being deallocated (which can happen when something is messing with sys.modules). That causes all module-level globals to be set to None. A possible workaround is to import sys inside the class to avoid the None-clobbering\n", "@sigmavirus24 You've got more experience in this area than me. Thoughts?\n", "/cc @dstufft \n", "@wardi did you test that solution or would you be willing to test it if I put it on a branch?\n" ]
https://api.github.com/repos/psf/requests/issues/2461
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2461/labels{/name}
https://api.github.com/repos/psf/requests/issues/2461/comments
https://api.github.com/repos/psf/requests/issues/2461/events
https://github.com/psf/requests/issues/2461
59,025,589
MDU6SXNzdWU1OTAyNTU4OQ==
2,461
Import Error: Cannot import name certs - Second time onwards!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1830169?v=4", "events_url": "https://api.github.com/users/rtindru/events{/privacy}", "followers_url": "https://api.github.com/users/rtindru/followers", "following_url": "https://api.github.com/users/rtindru/following{/other_user}", "gists_url": "https://api.github.com/users/rtindru/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rtindru", "id": 1830169, "login": "rtindru", "node_id": "MDQ6VXNlcjE4MzAxNjk=", "organizations_url": "https://api.github.com/users/rtindru/orgs", "received_events_url": "https://api.github.com/users/rtindru/received_events", "repos_url": "https://api.github.com/users/rtindru/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rtindru/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rtindru/subscriptions", "type": "User", "url": "https://api.github.com/users/rtindru", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-26T06:16:36Z
2021-09-08T23:05:59Z
2015-02-26T15:25:27Z
NONE
resolved
``` In [1]: import requests --------------------------------------------------------------------------- ImportError Traceback (most recent call last) /usr/local/python2.7/lib/python2.7/site-packages/Django-1.3.4-py2.7.egg/django/core/management/commands/shell.pyc in <module>() ----> 1 import requests /usr/local/python2.7/lib/python2.7/site-packages/requests/__init__.py in <module>() 56 pass 57 ---> 58 from . import utils 59 from .models import Request, Response, PreparedRequest 60 from .api import request, get, head, post, patch, put, delete, options /usr/local/python2.7/lib/python2.7/site-packages/requests/utils.py in <module>() 23 from . import __version__ 24 from . import certs ---> 25 from .compat import parse_http_list as _parse_list_header 26 from .compat import (quote, urlparse, bytes, str, OrderedDict, unquote, is_py2, 27 builtin_str, getproxies, proxy_bypass) /usr/local/python2.7/lib/python2.7/site-packages/requests/compat.py in <module>() 5 """ 6 ----> 7 from .packages import chardet 8 9 import sys /usr/local/python2.7/lib/python2.7/site-packages/requests/packages/__init__.py in <module>() 1 from __future__ import absolute_import 2 ----> 3 from . import urllib3 /usr/local/python2.7/lib/python2.7/site-packages/requests/packages/urllib3/__init__.py in <module>() 14 15 ---> 16 from .connectionpool import ( 17 HTTPConnectionPool, 18 HTTPSConnectionPool, /usr/local/python2.7/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py in <module>() 14 from queue import LifoQueue, Empty, Full 15 except ImportError: ---> 16 from Queue import LifoQueue, Empty, Full 17 import Queue as _ # Platform-specific: Windows 18 ImportError: cannot import name LifoQueue In [2]: import requests --------------------------------------------------------------------------- ImportError Traceback (most recent call last) /usr/local/python2.7/lib/python2.7/site-packages/Django-1.3.4-py2.7.egg/django/core/management/commands/shell.pyc in <module>() ----> 1 import requests /usr/local/python2.7/lib/python2.7/site-packages/requests/__init__.py in <module>() 56 pass 57 ---> 58 from . import utils 59 from .models import Request, Response, PreparedRequest 60 from .api import request, get, head, post, patch, put, delete, options /usr/local/python2.7/lib/python2.7/site-packages/requests/utils.py in <module>() 22 23 from . import __version__ ---> 24 from . import certs 25 from .compat import parse_http_list as _parse_list_header 26 from .compat import (quote, urlparse, bytes, str, OrderedDict, unquote, is_py2, ImportError: cannot import name certs ``` I have a module Queue that mirrors python's default Queue. This is why the LifoQueue import fails. But having failed there the first time, every successive import of requests fails at 'Cannot import certs' which I find bizarre! Also, I have not fixed the Queue conflict. Just want to know why the import error is different for the first time and the successive times are all some cert error! ![error_rep](https://cloud.githubusercontent.com/assets/1830169/6387200/16bdd6fe-bdad-11e4-96a7-d197769fd10c.png)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2461/reactions" }
https://api.github.com/repos/psf/requests/issues/2461/timeline
null
completed
null
null
false
[ "This isn't a bug in requests. If you want to know why the import machinery is working this way, you should as on python-list or StackOverflow\n" ]
https://api.github.com/repos/psf/requests/issues/2460
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2460/labels{/name}
https://api.github.com/repos/psf/requests/issues/2460/comments
https://api.github.com/repos/psf/requests/issues/2460/events
https://github.com/psf/requests/issues/2460
58,811,630
MDU6SXNzdWU1ODgxMTYzMA==
2,460
Import requests library troubleshooting
{ "avatar_url": "https://avatars.githubusercontent.com/u/3380261?v=4", "events_url": "https://api.github.com/users/theasder/events{/privacy}", "followers_url": "https://api.github.com/users/theasder/followers", "following_url": "https://api.github.com/users/theasder/following{/other_user}", "gists_url": "https://api.github.com/users/theasder/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/theasder", "id": 3380261, "login": "theasder", "node_id": "MDQ6VXNlcjMzODAyNjE=", "organizations_url": "https://api.github.com/users/theasder/orgs", "received_events_url": "https://api.github.com/users/theasder/received_events", "repos_url": "https://api.github.com/users/theasder/repos", "site_admin": false, "starred_url": "https://api.github.com/users/theasder/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/theasder/subscriptions", "type": "User", "url": "https://api.github.com/users/theasder", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2015-02-24T21:53:07Z
2021-09-08T23:05:59Z
2015-02-25T19:43:08Z
NONE
resolved
Requests was installed via python3 -m pip install requests. Importing library was used to work well. But suddenly, it started to show me an error: ``` $ python3 Python 3.4.2 (v3.4.2:ab2c023a9432, Oct 5 2014, 20:42:22) [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import requests Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.5.3-py3.4.egg/requests/__init__.py", line 58, in <module> from . import utils File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.5.3-py3.4.egg/requests/utils.py", line 12, in <module> import cgi File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/cgi.py", line 44, in <module> import tempfile File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/tempfile.py", line 34, in <module> import shutil as _shutil File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/shutil.py", line 14, in <module> import tarfile File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/tarfile.py", line 48, in <module> import copy File "/Users/artemdremov/Documents/grabber/copy.py", line 2, in <module> import urllib.request File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 102, in <module> from urllib.error import URLError, HTTPError, ContentTooShortError File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/error.py", line 14, in <module> import urllib.response File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/response.py", line 14, in <module> class addbase(tempfile._TemporaryFileWrapper): AttributeError: 'module' object has no attribute '_TemporaryFileWrapper' ``` I work under OS X 10.10.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2460/reactions" }
https://api.github.com/repos/psf/requests/issues/2460/timeline
null
completed
null
null
false
[ "That traceback seems to pass through some of your own code, specifically a module called `copy`. Is that new?\n", "To be clear, @Lukasa is talking about \n\n```\n File \"/Users/artemdremov/Documents/grabber/copy.py\", line 2, in <module>\n```\n\nI'm rather surprised though that that would even be used. I thought that absolute imports on Python 3 would be preferring the stdlib's copy module.\n", "@Lukasa Yes, this was a second python script, that used request module. Importing request library worked well in first file.\n", "What I'm concerned about is that this new module leads to urllib being on the import path, which is also where the error is coming from. Indeed, this looks like a circular import, as your import of requests causes an import of tempfile, which ends up importing urllib which then imports tempfile again.\n", "Specifically the import of tempfile never completes, so when urllib tries to import it, we go back through that look. @theasder can you try importing requests on its own from a different directory or renaming `grabber/copy.py` to `grabber/_copy.py`?\n", "After renaming grabber/copy.py to grabber/_copy.py\n\n```\n$ python3\nPython 3.4.2 (v3.4.2:ab2c023a9432, Oct 5 2014, 20:42:22) \n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.5.3-py3.4.egg/requests/__init__.py\", line 58, in <module>\n from . import utils\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.5.3-py3.4.egg/requests/utils.py\", line 12, in <module>\n import cgi\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/cgi.py\", line 44, in <module>\n import tempfile\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/tempfile.py\", line 34, in <module>\n import shutil as _shutil\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/shutil.py\", line 14, in <module>\n import tarfile\n File \"/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/tarfile.py\", line 48, in <module>\n import copy\nImportError: bad magic number in 'copy': b'\\x03\\xf3\\r\\n'\n```\n", "You'll also need to remove the `__pycache__` directory or at least `__pycache__/copy.pyc`.\n", "@sigmavirus24 hooray, it works, thank you!\n", "Excellent! Thanks for the report @theasder.\n" ]
https://api.github.com/repos/psf/requests/issues/2459
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2459/labels{/name}
https://api.github.com/repos/psf/requests/issues/2459/comments
https://api.github.com/repos/psf/requests/issues/2459/events
https://github.com/psf/requests/pull/2459
58,763,798
MDExOlB1bGxSZXF1ZXN0Mjk5MTI1NDE=
2,459
Update HISTORY and release version for 2.5.3
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
0
2015-02-24T16:31:49Z
2021-09-08T08:01:07Z
2015-02-24T16:32:15Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2459/reactions" }
https://api.github.com/repos/psf/requests/issues/2459/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2459.diff", "html_url": "https://github.com/psf/requests/pull/2459", "merged_at": "2015-02-24T16:32:15Z", "patch_url": "https://github.com/psf/requests/pull/2459.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2459" }
true
[]
https://api.github.com/repos/psf/requests/issues/2458
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2458/labels{/name}
https://api.github.com/repos/psf/requests/issues/2458/comments
https://api.github.com/repos/psf/requests/issues/2458/events
https://github.com/psf/requests/pull/2458
58,756,824
MDExOlB1bGxSZXF1ZXN0Mjk5MDgwMTE=
2,458
Revert "Update certificate bundle."
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
4
2015-02-24T15:44:57Z
2021-09-08T08:01:07Z
2015-02-24T16:18:16Z
CONTRIBUTOR
resolved
Reverts kennethreitz/requests#2442
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2458/reactions" }
https://api.github.com/repos/psf/requests/issues/2458/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2458.diff", "html_url": "https://github.com/psf/requests/pull/2458", "merged_at": "2015-02-24T16:18:16Z", "patch_url": "https://github.com/psf/requests/pull/2458.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2458" }
true
[ "We may need to merge this to temporarily resolve the CA problems added in 2.5.2.\n", "And then we'll cut 2.5.3\n", ":sob: \n", "Yeah. This is the worst\n" ]
https://api.github.com/repos/psf/requests/issues/2457
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2457/labels{/name}
https://api.github.com/repos/psf/requests/issues/2457/comments
https://api.github.com/repos/psf/requests/issues/2457/events
https://github.com/psf/requests/issues/2457
58,685,916
MDU6SXNzdWU1ODY4NTkxNg==
2,457
ImportError: cannot import name is_windows
{ "avatar_url": "https://avatars.githubusercontent.com/u/244702?v=4", "events_url": "https://api.github.com/users/papachoco/events{/privacy}", "followers_url": "https://api.github.com/users/papachoco/followers", "following_url": "https://api.github.com/users/papachoco/following{/other_user}", "gists_url": "https://api.github.com/users/papachoco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/papachoco", "id": 244702, "login": "papachoco", "node_id": "MDQ6VXNlcjI0NDcwMg==", "organizations_url": "https://api.github.com/users/papachoco/orgs", "received_events_url": "https://api.github.com/users/papachoco/received_events", "repos_url": "https://api.github.com/users/papachoco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/papachoco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/papachoco/subscriptions", "type": "User", "url": "https://api.github.com/users/papachoco", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-02-24T02:45:29Z
2021-09-08T09:00:39Z
2015-02-24T02:54:06Z
NONE
resolved
I am not sure if this should be a bug for requests (2.5.2). httpie (the http command) no longer works with requests 2.5.2, because the method is_windows was removed from requests.compat: Traceback (most recent call last): File "/Users/csanchez/Projects/nti.dataserver-buildout/bin/http", line 413, in <module> import httpie.__main__ File "/Users/csanchez/Documents/workspace/nti.dataserver-buildout/eggs/httpie-0.9.1-py2.7.egg/httpie/__main__.py", line 6, in <module> from .core import main File "/Users/csanchez/Documents/workspace/nti.dataserver-buildout/eggs/httpie-0.9.1-py2.7.egg/httpie/core.py", line 21, in <module> from httpie.compat import str, bytes, is_py3 File "/Users/csanchez/Documents/workspace/nti.dataserver-buildout/eggs/httpie-0.9.1-py2.7.egg/httpie/compat.py", line 7, in <module> from requests.compat import is_windows, bytes, str, is_py3, is_py26 ImportError: cannot import name is_windows
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2457/reactions" }
https://api.github.com/repos/psf/requests/issues/2457/timeline
null
completed
null
null
false
[ "This is a bug in httpie. We cleaned up old symbols we weren't using. `requests.compat` isn't a publicly documented or exported API.\n", "I figured ..\nThanks a lot\n\nCarlos\n", "@papachoco can you please share how you solved this error?\r\n", "@ahsentaqikazmi this is an issue with httpie rather than Requests. You should be able to find solutions in the related issue for that project (jakubroztocil/httpie#314)." ]
https://api.github.com/repos/psf/requests/issues/2456
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2456/labels{/name}
https://api.github.com/repos/psf/requests/issues/2456/comments
https://api.github.com/repos/psf/requests/issues/2456/events
https://github.com/psf/requests/issues/2456
58,684,598
MDU6SXNzdWU1ODY4NDU5OA==
2,456
Thawte (erroneously?) removed from certificate bundle
{ "avatar_url": "https://avatars.githubusercontent.com/u/565385?v=4", "events_url": "https://api.github.com/users/jimrollenhagen/events{/privacy}", "followers_url": "https://api.github.com/users/jimrollenhagen/followers", "following_url": "https://api.github.com/users/jimrollenhagen/following{/other_user}", "gists_url": "https://api.github.com/users/jimrollenhagen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jimrollenhagen", "id": 565385, "login": "jimrollenhagen", "node_id": "MDQ6VXNlcjU2NTM4NQ==", "organizations_url": "https://api.github.com/users/jimrollenhagen/orgs", "received_events_url": "https://api.github.com/users/jimrollenhagen/received_events", "repos_url": "https://api.github.com/users/jimrollenhagen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jimrollenhagen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jimrollenhagen/subscriptions", "type": "User", "url": "https://api.github.com/users/jimrollenhagen", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
6
2015-02-24T02:26:30Z
2021-09-08T23:06:00Z
2015-02-24T16:00:59Z
NONE
resolved
#2442 updated the certificate bundle and appears to have dropped Thawte from the bundle. Thawte still shows up for me both in the Mozilla bundle and the mkcert.org standard bundle. Mozilla appears to have been unchanged recently: https://hg.mozilla.org/releases/mozilla-release/log/a2ffa9047bf4/security/nss/lib/ckfw/builtins/certdata.txt So I'm not sure how this happened, but it's broken using requests for Rackspace's cloud API, and I'm sure other sites as well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2456/reactions" }
https://api.github.com/repos/psf/requests/issues/2456/timeline
null
completed
null
null
false
[ "@Lukasa generated the certificate with mkcert.org if I recall correctly but I'm not sure the list of certificates he used. It seems we just need to add thawte\n", "Actually looks like it's [here](https://github.com/kennethreitz/requests/blob/master/requests/cacert.pem#L1710) but I see that there seem to be other certificate roots [here](https://github.com/Lukasa/requests/commit/b86330d1f15f04e77093800efb8c6bd6909e8c8e#diff-2560f13d524dedb5044eb83d0622174aL28) and [here](https://github.com/Lukasa/requests/commit/b86330d1f15f04e77093800efb8c6bd6909e8c8e#diff-2560f13d524dedb5044eb83d0622174aL55)\n\n```\n~/s/requests git:master ❯❯❯ curl -s https://mkcert.org/labels/ | python -m json.tool | ag 'thawte'\n \"thawte Primary Root CA\",\n \"thawte Primary Root CA - G2\",\n \"thawte Primary Root CA - G3\",\n```\n\nShows that mkcert only seems to know about the Thawte root that we are including.\n", "Yeah, I'm seeing three Thawte certificates in our bundle and three Thawte certificates in Mozilla's bundle. Which certificate exactly are you expecting to see, @jimrollenhagen?\n", "Hmm, I just saw Thawte certs being removed in b86330d1f15f04e77093800efb8c6bd6909e8c8e, I missed that we still have some in the bundle. Let me poke some more and loop back around.\n", "So in that commit the following certificates were removed (by label/sha256 fingerprint):\n- GTE CyberTrust Global Root / `a5:31:25:18:8d:21:10:aa:96:4b:02:c7:b7:c6:da:32:03:17:08:94:e5:fb:71:ff:fb:66:67:d5:e6:81:0a:36`\n- Thawte Server CA / `b4:41:0b:73:e2:e6:ea:ca:47:fb:c4:2f:8f:a4:01:8a:f4:38:1d:c5:4c:fa:a8:44:50:46:1e:ed:09:45:4d:e9`\n- Thawte Premium Server CA / `ab:70:36:36:5c:71:54:aa:29:c2:c2:9f:5d:41:91:16:3b:16:2a:22:25:01:13:57:d5:6d:07:ff:a7:bc:1f:72`\n- Verisign Class 3 Public Primary Certification Authority / `e7:68:56:34:ef:ac:f6:9a:ce:93:9a:6b:25:5b:7b:4f:ab:ef:42:93:5b:50:a2:65:ac:b5:cb:60:27:e4:4e:70`\n- Verisign Class 3 Public Primary Certification Authority - G2 / `83:ce:3c:12:29:68:8a:59:3d:48:5f:81:97:3c:0f:91:95:43:1e:da:37:cc:5e:36:43:0e:79:c7:a8:88:63:8b`\n- ValiCert Class 1 VA / `f4:c1:49:55:1a:30:13:a3:5b:c7:bf:fe:17:a7:f3:44:9b:c1:ab:5b:5a:0a:e7:4b:06:c2:3b:90:00:4c:01:04`\n- ValiCert Class 2 VA / `58:d0:17:27:9c:d4:dc:63:ab:dd:b1:96:a6:c9:90:6c:30:c4:e0:87:83:ea:e8:c1:60:99:54:d6:93:55:59:6b`\n- RSA Root Certificate 1 / `bc:23:f9:8a:31:3c:b9:2d:e3:bb:fc:3a:5a:9f:44:61:ac:39:49:4c:4a:e1:5a:9e:9d:f1:31:e9:9b:73:01:9a`\n- Entrust.net Secure Server CA / `62:f2:40:27:8c:56:4c:4d:d8:bf:7d:9d:4f:6f:36:6e:a8:94:d2:2f:5f:34:d9:89:a9:83:ac:ec:2f:ff:ed:50`\n- Equifax Secure eBusiness CA 1 / `cf:56:ff:46:a4:a1:86:10:9d:d9:65:84:b5:ee:b5:8a:51:0c:42:75:b0:e5:f9:4f:40:bb:ae:86:5e:19:f6:73`\n- America Online Root Certification Authority 1 / `77:40:73:12:c6:3a:15:3d:5b:c0:0b:4e:51:75:9c:df:da:c2:37:dc:2a:33:b6:79:46:e9:8e:9b:fa:68:0a:e3`\n- America Online Root Certification Authority 2 / `7d:3b:46:5a:60:14:e5:26:c0:af:fc:ee:21:27:d2:31:17:27:ad:81:1c:26:84:2d:00:6a:f3:73:06:cc:80:bd`\n- TDC Internet Root CA / `48:98:c6:88:8c:0c:ff:b0:d3:e3:1a:ca:8a:37:d4:e3:51:5f:f7:46:d0:26:35:d8:66:46:cf:a0:a3:18:5a:e7`\n- NetLock Business (Class B) Root / `39:df:7b:68:2b:7b:93:8f:84:71:54:81:cc:de:8d:60:d8:f2:2e:c5:98:87:7d:0a:aa:c1:2b:59:18:2b:03:12`\n- NetLock Express (Class C) Root / `0b:5e:ed:4e:84:64:03:cf:55:e0:65:84:84:40:ed:2a:82:75:8b:f5:b9:aa:1f:25:3d:46:13:cf:a0:80:ff:3f`\n- Firmaprofesional Root CA / `c1:cf:0b:52:09:64:35:e3:f1:b7:1d:aa:ec:45:5a:23:11:c8:40:4f:55:83:a9:e2:13:c6:9d:85:7d:94:33:05`\n- AC Ra\\xC3\\xADz 
Certic\\xC3\\xA1mara S.A. / `a6:c5:1e:0d:a5:ca:0a:93:09:d2:e4:c0:e4:0c:2a:f9:10:7a:ae:82:03:85:7f:e1:98:e3:e7:69:e3:43:08:5c`\n- Verisign Class 3 Public Primary Certification Authority / `a4:b6:b3:99:6f:c2:f3:06:b3:fd:86:81:bd:63:41:3d:8c:50:09:cc:4f:a3:29:c2:cc:f0:e2:fa:1b:14:03:05`\n\nThere were others that were added too. Proposal: We should revert (at least some of) the deletions that took place in b86330d after confirming that none of them were intentionally removed/blacklisted from Mozilla's bundle (we don't want to add back compromised CAs).\n", "I now believe this is a manifestation of #2455: closing to focus the discussion there.\n" ]
https://api.github.com/repos/psf/requests/issues/2455
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2455/labels{/name}
https://api.github.com/repos/psf/requests/issues/2455/comments
https://api.github.com/repos/psf/requests/issues/2455/events
https://github.com/psf/requests/issues/2455
58,678,665
MDU6SXNzdWU1ODY3ODY2NQ==
2,455
CERTIFICATE_VERIFY_FAILED with 2.5.2, works with 2.5.1
{ "avatar_url": "https://avatars.githubusercontent.com/u/641278?v=4", "events_url": "https://api.github.com/users/rmcgibbo/events{/privacy}", "followers_url": "https://api.github.com/users/rmcgibbo/followers", "following_url": "https://api.github.com/users/rmcgibbo/following{/other_user}", "gists_url": "https://api.github.com/users/rmcgibbo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rmcgibbo", "id": 641278, "login": "rmcgibbo", "node_id": "MDQ6VXNlcjY0MTI3OA==", "organizations_url": "https://api.github.com/users/rmcgibbo/orgs", "received_events_url": "https://api.github.com/users/rmcgibbo/received_events", "repos_url": "https://api.github.com/users/rmcgibbo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rmcgibbo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rmcgibbo/subscriptions", "type": "User", "url": "https://api.github.com/users/rmcgibbo", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" } ]
closed
true
null
[]
null
111
2015-02-24T01:11:01Z
2021-09-08T20:00:52Z
2015-05-31T12:57:28Z
NONE
resolved
I'm having a somewhat odd issue. A get() request seems to work fine with requests-2.5.1, but after upgrading to requests 2.5.2, the same URL leads to `CERTIFICATE_VERIFY_FAILED`. cc @mpharrigan ``` $ pip install requests==2.5.1 [ ... snip, installs just fine ...] $ python -c 'import requests; print(requests.get("http://conda.binstar.org/omnia/linux-64/fftw3f-3.3.3-1.tar.bz2"))' <Response [200]> $ pip install requests==2.5.2 [ ... snip, installs just fine ...] $ python -c 'import requests; print(requests.get("http://conda.binstar.org/omnia/linux-64/fftw3f-3.3.3-1.tar.bz2"))' Traceback (most recent call last): File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen body=body, headers=headers) File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 341, in _make_request self._validate_conn(conn) File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 762, in _validate_conn conn.connect() File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 238, in connect ssl_version=resolved_ssl_version) File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/site-packages/requests/packages/urllib3/util/ssl_.py", line 256, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/ssl.py", line 364, in wrap_socket _context=self) File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/ssl.py", line 578, in __init__ self.do_handshake() File "/Users/rmcgibbo/miniconda/envs/3.4.2/lib/python3.4/ssl.py", line 805, in do_handshake self._sslobj.do_handshake() ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600) During handling of the above exception, another exception occurred: [... snip ...] ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2455/reactions" }
https://api.github.com/repos/psf/requests/issues/2455/timeline
null
completed
null
null
false
[ "FWIW, this is with python 3.4.2 and OS X 10.10.\n", "@t-8ch seems related to the urllib3 patch to disable built-in hostname verification possibly.\n", "Would it be helpful for me to run a git-bisect or something?\n", "Not really. The change is upstream of us. Thanks for the bug though.\n", "@sigmavirus24 I would rather blame the new ciphersuite or cert bundle (not tested)\n", "Could easily be the new cert bundle.\n", "I'm assuming that you hand-typed that code @rmcgibbo? The reason I ask is because you're accessing HTTP urls there.\n", "I cannot reproduce this on my copy of 2.5.2.\n", "That suggests it's not the cert bundle.\n", "@Lukasa I can now reproduce it, the URLs are right, there seems to be an redirect\n", "So there is. I still can't reproduce though.\n", "@Lukasa Are you running from a git checkout? It works for me there to, but not from a wheel, will try a sdist\n", "Nope, from a wheel.\n\n``` python\nPython 2.7.9 (default, Dec 19 2014, 06:00:59) \n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.56)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.__version__\n'2.5.2'\n>>> requests.__file__\n'/usr/local/lib/python2.7/site-packages/requests/__init__.pyc'\n>>> r = requests.get(\"http://conda.binstar.org/omnia/linux-64/fftw3f-3.3.3-1.tar.bz2\")\n>>> r.status_code\n200\n>>> r.url\nu'https://binstar-cio-packages-prod.s3.amazonaws.com/531fd36fe1dad154c8ea89c5/54b0d1045e76836c22735094?Signature=JRp8nPqRc6wXijqGg5JjDnIytz8%3D&Expires=1424763499&AWSAccessKeyId=ASIAIIYL6XQN5BMSPEFA&response-content-disposition=attachment%3B%20filename%3Dfftw3f-3.3.3-1.tar.bz2&response-content-type=application/x-tar&x-amz-security-token=AQoDYXdzEM///////////wEa4APi0P1wChyd8cFWFddkWZVgQYQQPAli9WBPWotWMA77n%2BqFSO92c2mb607CMadC7Jdo21sJVmtuKu4gM2vQhzX8Uy8NXWHtrBpkAgulTshstH1oFEcrkfV2wVtcxiHPrWTrEw7UTHAqDBQppLNHIHTQjs%2Bks0d/yQ//i6iCJTulVjNUWsXv%2BGre%2BMX2s%2BetTldscc2vbHsrzLG%2BvlOGrKbgUaDzLHWa7RgA6z4DxJdu0inmSPGIoV02fFonVRrwjI3qUrw50dmIFIW5SFxN9XYnDnXf1DiJ2b3vKnrj9I%2BYEY%2ByP5pHpSWvF%2BIRrmBDX6KHf4QKF5%2Bo01Y2Oeqq%2BhQncFFdEt6QBU6Eqa%2B3kb6iwdJdPAXr7HOgd5TugixLDNnKaWuF%2B1grizrD8Drz6nYhb0sgZjazDmjNSFpNXwCyYDWQNNIL7o13Sh6q2fB8sWD5oCEJu22HB4CWBxq3AC/APi86cPSZ/5RiRnv2pz%2BBB7HGNwTZveJaSfSX9ZR2pq6oZOT5kpP2aJrj4wwvE7BC0kf%2B2au4B%2BGH5uHNZvm42%2BwWophhx9NcIjdX/F9g%2B8cnkrx9nooNcB%2BVzxgXZG%2BCzPuhdFbcRFzWaqz0W/T5eJWVpmCIcKpGlIibqcwAjGAg16%2BwpwU%3D'\n```\n", "Potentially also relevant:\n\n``` python\n>>> import ssl\n>>> ssl.OPENSSL_VERSION\n'OpenSSL 1.0.2 22 Jan 2015'\n```\n", "It seems to be the bundle.\n", "@t-8ch What makes you say that?\n", "It works with the old one and does not with the new one, also it works with my system bundle.\n(requests and curl)\n", "Aha, yep, can now reproduce. Don't know what I was doing differently before. =(\n", "So the cert chain is:\n\n```\n 0 s:/C=US/ST=Washington/L=Seattle/O=Amazon.com Inc./CN=*.s3.amazonaws.com\n i:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3\n 1 s:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3\n i:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=(c) 2006 VeriSign, Inc. - For authorized use only/CN=VeriSign Class 3 Public Primary Certification Authority - G5\n 2 s:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=(c) 2006 VeriSign, Inc. 
- For authorized use only/CN=VeriSign Class 3 Public Primary Certification Authority - G5\n i:/C=US/O=VeriSign, Inc./OU=Class 3 Public Primary Certification Authority\n```\n", "That Class 3 Public Primary Certification Authority - G5 certificate is in the bundle. The one Mozilla uses as its trust root when I access the page in Firefox is definitely 100% in that bundle. Serial numbers match and everything.\n", "Weirdly though, if I pass that certificate to curl to use to validate the connection, it doesn't like it.\n", "A warning that so far I've not been able to reproduce this _in requests_ on either Windows or OS X, presumably because their built in cert-stores are saving the day.\n", "Also, running `openssl s_client` seems to have no problems with that bundle:\n\n```\nubuntu@cb2ubuntu:~/garpd$ cat /usr/local/lib/python2.7/dist-packages/requests/__init__.py | grep version\n__version__ = '2.5.2'\n```\n\n```\nubuntu@cb2ubuntu:~/garpd$ openssl s_client -CApath /usr/local/lib/python2.7/dist-packages/requests/cacert.pem -connect binstar-cio-packages-prod.s3.amazonaws.com:443 [8/8]\nCONNECTED(00000003)\ndepth=3 C = US, O = \"VeriSign, Inc.\", OU = Class 3 Public Primary Certification Authority\nverify return:1\ndepth=2 C = US, O = \"VeriSign, Inc.\", OU = VeriSign Trust Network, OU = \"(c) 2006 VeriSign, Inc. - For authorized use only\", CN = VeriSign Class 3 Public Primary Certification Authority - G5\nverify return:1\ndepth=1 C = US, O = \"VeriSign, Inc.\", OU = VeriSign Trust Network, OU = Terms of use at https://www.verisign.com/rpa (c)10, CN = VeriSign Class 3 Secure Server CA - G3\nverify return:1\ndepth=0 C = US, ST = Washington, L = Seattle, O = Amazon.com Inc., CN = *.s3.amazonaws.com\nverify return:1\n---\nCertificate chain\n 0 s:/C=US/ST=Washington/L=Seattle/O=Amazon.com Inc./CN=*.s3.amazonaws.com\n i:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3\n 1 s:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3\n i:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=(c) 2006 VeriSign, Inc. - For authorized use only/CN=VeriSign Class 3 Public Primary Certification Authority - G5\n 2 s:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=(c) 2006 VeriSign, Inc. 
- For authorized use only/CN=VeriSign Class 3 Public Primary Certification Authority - G5\n i:/C=US/O=VeriSign, Inc./OU=Class 3 Public Primary Certification Authority\n---\nServer certificate\n-----BEGIN CERTIFICATE-----\nMIIFQTCCBCmgAwIBAgIQGHBX7tZDXzmvfSkeROrx7DANBgkqhkiG9w0BAQUFADCB\ntTELMAkGA1UEBhMCVVMxFzAVBgNVBAoTDlZlcmlTaWduLCBJbmMuMR8wHQYDVQQL\nExZWZXJpU2lnbiBUcnVzdCBOZXR3b3JrMTswOQYDVQQLEzJUZXJtcyBvZiB1c2Ug\nYXQgaHR0cHM6Ly93d3cudmVyaXNpZ24uY29tL3JwYSAoYykxMDEvMC0GA1UEAxMm\nVmVyaVNpZ24gQ2xhc3MgMyBTZWN1cmUgU2VydmVyIENBIC0gRzMwHhcNMTQwNDA5\nMDAwMDAwWhcNMTUwNDA5MjM1OTU5WjBrMQswCQYDVQQGEwJVUzETMBEGA1UECBMK\nV2FzaGluZ3RvbjEQMA4GA1UEBxQHU2VhdHRsZTEYMBYGA1UEChQPQW1hem9uLmNv\nbSBJbmMuMRswGQYDVQQDFBIqLnMzLmFtYXpvbmF3cy5jb20wggEiMA0GCSqGSIb3\nDQEBAQUAA4IBDwAwggEKAoIBAQCyIdaCeebmUg7oowAEkJOGAkE9KA7f/Kpsbexn\nsD0v/W2Hbq7Kmys8LD9bs6RX4YNIr/Cx0i4gQlymmVXy/OhgrvSpl/lbmHzFXF30\nUF2/L6NWkbkca2QbmolYBjYHngblx/gRQw6XGSui2Ql8q6W5IOz1EyHUZOhcr5W8\nx76JtY4r5/uav+2WO9pgtGEL4aROQfE7R/399OvkUCabcTvaG9N0TMBLTdB/mWyD\nGlnHSwWl67lH1HPr429iz/2cPP7l3eq1V1PNq25w5JCV2kySmq5d0XKt4cy5mMh/\nOg2vcwyj31u8B4fzyGWxQAXLs10wWF9xdVNHrJwoBD9jeiWDAgMBAAGjggGUMIIB\nkDAJBgNVHRMEAjAAMEMGA1UdIAQ8MDowOAYKYIZIAYb4RQEHNjAqMCgGCCsGAQUF\nBwIBFhxodHRwczovL3d3dy52ZXJpc2lnbi5jb20vY3BzMEUGA1UdHwQ+MDwwOqA4\noDaGNGh0dHA6Ly9TVlJTZWN1cmUtRzMtY3JsLnZlcmlzaWduLmNvbS9TVlJTZWN1\ncmVHMy5jcmwwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUFBwMCMB8GA1UdIwQY\nMBaAFA1EXBZTRMGCfh0gqyX0AWPYvnmlMHYGCCsGAQUFBwEBBGowaDAkBggrBgEF\nBQcwAYYYaHR0cDovL29jc3AudmVyaXNpZ24uY29tMEAGCCsGAQUFBzAChjRodHRw\nOi8vU1ZSU2VjdXJlLUczLWFpYS52ZXJpc2lnbi5jb20vU1ZSU2VjdXJlRzMuY2Vy\nMA4GA1UdDwEB/wQEAwIFoDAvBgNVHREEKDAmghIqLnMzLmFtYXpvbmF3cy5jb22C\nEHMzLmFtYXpvbmF3cy5jb20wDQYJKoZIhvcNAQEFBQADggEBAD2yDlI/JHDW9LNT\nrsvy1lnS8H0IT8Zc+z9Imd5zEEqBs2G1beCtM9U4o/MDEao95DWfRck3Gx428fPv\nbsabSwJHtSpGLQiWi/UwnxN0p5Lz6tQVaglBqlsvm4ZGHdS94hSaYwd4nUZ+Wpo8\nhhCk44lVjwD0hTqr4G08XQiS/mlOY2422zo6+ULw+YG6ocMtVTe+VsL3V7dLRYgN\nwV15Z5GLL4f50hbUHQAjdFHMtDkIQTWu0l7SJB6ueQBxoBNJoHC89IZMom0Oy9WL\n1UNYgBTsad76ql/K3feTPJodalB1RXbEwSgc4pAC1/rtlfoZewZvNqANMxYc7k7G\nufhUTyk=\n-----END CERTIFICATE-----\nsubject=/C=US/ST=Washington/L=Seattle/O=Amazon.com Inc./CN=*.s3.amazonaws.com\nissuer=/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3\n---\nNo client certificate CA names sent\n---\nSSL handshake has read 4632 bytes and written 455 bytes\n---\nNew, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES128-SHA\nServer public key is 2048 bit\nSecure Renegotiation IS supported\nCompression: NONE\nExpansion: NONE\nSSL-Session:\n Protocol : TLSv1.2\n Cipher : ECDHE-RSA-AES128-SHA\n Session-ID: 65EDD443247002AE9E1DCBE70147AA14DF818C026DD0DC3A07575DEEEFA01895\n Session-ID-ctx:\n Master-Key: 4814563FB3450B10F25DB657DC158FFF4CE9B1601057F3CB552BC5B881D40FD2FA97F8B8A0BCB86B6E1D830C3D647299\n Key-Arg : None\n PSK identity: None\n PSK identity hint: None\n SRP username: None\n Start Time: 1424769860\n Timeout : 300 (sec)\n Verify return code: 0 (ok)\n---\n```\n", "But cert verification definitely fails with requests. 
So that's perplexing.\n", "At least we get the same leaf cert with s_client and requests.\n", "@t-8ch To verify: does your bundle succeed with `s_client`?\n", "Now I am _really_ confused:\n", "Installed from wheel:\n\npy2 + 2.5.1: works\npy3.3 + 2.5.1: works\npy3.4 + 2.5.1: works\n\npy2 + 2.5.2: breaks\npy3.3 + 2.5.2: breaks\npy3.4 + 2.5.2: breaks\n\nGit checkout:\n\npy2 + 2.5.1: works\npy3 + 2.5.1: works\n\npy2 + 2.5.2: breaks\npy3.3 + 2.5.2: breaks\npy3.4 + 2.5.2: works\n", "All on linux with 3.4.2 and 2.7.9\n", "Well that's extremely perplexing. Can you diff the cacert.pem files in each case?\n" ]
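A minimal sketch of the same debugging steps from Python: print which cacert.pem the installed requests is actually using, then retry the reported URL against the bundled store and against an explicit bundle via `verify=`. The URL is the one from the report; the alternate bundle path is an assumption for illustration:

```python
# Sketch: check which CA bundle requests uses and retry with an explicit one.
import requests
import requests.certs

print(requests.__version__)
print(requests.certs.where())   # path of the cacert.pem requests verifies against

url = 'http://conda.binstar.org/omnia/linux-64/fftw3f-3.3.3-1.tar.bz2'

try:
    print(requests.get(url).status_code)        # bundled cacert.pem
except requests.exceptions.SSLError as exc:
    print('bundled CA store failed:', exc)

# Hypothetical path to a known-good system store (e.g. Debian/Ubuntu):
print(requests.get(url, verify='/etc/ssl/certs/ca-certificates.crt').status_code)
```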
https://api.github.com/repos/psf/requests/issues/2454
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2454/labels{/name}
https://api.github.com/repos/psf/requests/issues/2454/comments
https://api.github.com/repos/psf/requests/issues/2454/events
https://github.com/psf/requests/issues/2454
58,501,309
MDU6SXNzdWU1ODUwMTMwOQ==
2,454
SSL certificate caching
{ "avatar_url": "https://avatars.githubusercontent.com/u/38136?v=4", "events_url": "https://api.github.com/users/RuudBurger/events{/privacy}", "followers_url": "https://api.github.com/users/RuudBurger/followers", "following_url": "https://api.github.com/users/RuudBurger/following{/other_user}", "gists_url": "https://api.github.com/users/RuudBurger/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RuudBurger", "id": 38136, "login": "RuudBurger", "node_id": "MDQ6VXNlcjM4MTM2", "organizations_url": "https://api.github.com/users/RuudBurger/orgs", "received_events_url": "https://api.github.com/users/RuudBurger/received_events", "repos_url": "https://api.github.com/users/RuudBurger/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RuudBurger/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RuudBurger/subscriptions", "type": "User", "url": "https://api.github.com/users/RuudBurger", "user_view_type": "public" }
[]
closed
true
null
[]
null
14
2015-02-22T12:36:34Z
2021-09-08T23:06:00Z
2015-02-22T13:10:01Z
NONE
resolved
I have a question regarding caching of SSL certificates. I switched over to a SHA-2 certificate which has 2 intermediate certificates. This all works fine and requests is working as expected. The problem is the size of the certificates. It was a 2KB certificate before, but now with the 2 intermediate certificates in there it's at 7KB. This doesn't seem like much, but the server I'm running it on handles over 25 million requests each day, with the content sent per request well below 10KB, so the new certificate practically doubles my traffic. I tried searching for how the certificates are used. Is it a one-time thing that is then cached on the client side? Is the certificate sent with each request? etc. But I keep finding only the basics of how certificates work and what they do. Could you tell me how `requests` uses the certificate and if there is a way of somehow caching it, in the session object for example, making sure it doesn't get requested with each call. Maybe I'm asking something weird here, so please let me know if it's just "how HTTP works" ;)
{ "avatar_url": "https://avatars.githubusercontent.com/u/38136?v=4", "events_url": "https://api.github.com/users/RuudBurger/events{/privacy}", "followers_url": "https://api.github.com/users/RuudBurger/followers", "following_url": "https://api.github.com/users/RuudBurger/following{/other_user}", "gists_url": "https://api.github.com/users/RuudBurger/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RuudBurger", "id": 38136, "login": "RuudBurger", "node_id": "MDQ6VXNlcjM4MTM2", "organizations_url": "https://api.github.com/users/RuudBurger/orgs", "received_events_url": "https://api.github.com/users/RuudBurger/received_events", "repos_url": "https://api.github.com/users/RuudBurger/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RuudBurger/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RuudBurger/subscriptions", "type": "User", "url": "https://api.github.com/users/RuudBurger", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2454/reactions" }
https://api.github.com/repos/psf/requests/issues/2454/timeline
null
completed
null
null
false
[ "@RuudBurger The certificate chain is sent with each SSL handshake, which is performed for each new connection. If you use a `requests` Session it will use a connectionpool, which reuses connections for each host.\nAs you are talking to the same host only one handshake will be performed.\n", "It's not \"how HTTP works\", it's \"how SSL/TLS works\". =)\n\nTLS/SSL authenticates a single _TCP connection_. The handshake is done at the beginning, including certificate transfer, and then never again. However, it's very important that the certificate chain be sent at the start of each connection. Thus, for connections that don't trust your intermediate certificates (but only trust the root), you _must_ send the whole cert chain on each connection.\n\nThus, the general way to improve this is to make sure connections last a while. Make sure your server doesn't send `Connection: close` on its responses, and that wherever possible you re-use connections. Requests will do this if you use a `Session` object. If you aren't already, try that: it should minimise some of your cost.\n\nIf requests is your only client (that is, you aren't worried about whether third parties have errors verifying the trust chain), you _can_ improve the situation further. The way to do it is to tell requests to trust the intermediate certificate directly. That means putting your intermediate certificate into a single file in `pem` format, and then passing the path to that file as the argument to `verify` on your requests calls. Then your server only needs to provide its certificate: requests will verify that it trusts the certificate that issued it, and all should be well.\n", "@Lukasa Instead of trusting the intermediate, would it not be easier then to use a single self-signed cert and thrust it directly or just pin the fingerprint?\n", "That's absolutely an alternative option. Pinning the fingerprint is harder, I don't know how you'd do it with requests' API, but the self-signed cert would work. Similarly, trusting the cert directly is also an option.\n", "@RuudBurger I assume the 7kb you measured are the PEM-encoded files. This will be less when transmitted over the wire.\n", "Wow thanks for the quick response guys.\nIs there a way to test and see if a connection is re-used in requests session? so I can find the best timeout from the server side to close the connection. I do have a single session object for all outgoing requests so this should already be the case, just need to find out it is.\n\nWith so many people connecting I don't know if I can keep a lot of connections open though, need to figure out how nginx handles this.\n\nYeah the 7kb is the combined size of the certificates. A regular response body of my api is 1.7KB gzipped. I don't think the handshake is compressed though, read somewhere there were issues with that so nginx disabled it by default.\nMy traffic went from 5GB/hour to over 10GB/hour, this is why I noticed.\n", "The easiest way to check whether requests is re-using the connection is to turn on DEBUG level logging. urllib3 will log out its connection pool logic. Post the logs here and I can tell you how to read them. 
=)\n", "Don't know if these are it, seems a bit low on the info ;)\n\n```\n02-22 13:59:30 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt1262416/, data: []\n02-22 13:59:30 INFO Starting new HTTPS connection (1): api.couchpota.to\n02-22 13:59:30 DEBUG \"GET /info/tt1262416/ HTTP/1.1\" 200 None\n02-22 13:59:30 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt0800369/, data: []\n02-22 13:59:30 INFO Resetting dropped connection: api.couchpota.to\n02-22 13:59:30 DEBUG \"GET /info/tt0800369/ HTTP/1.1\" 200 None\n02-22 13:59:30 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt2494280/, data: []\n02-22 13:59:30 INFO Resetting dropped connection: api.couchpota.to\n02-22 13:59:30 DEBUG \"GET /info/tt2494280/ HTTP/1.1\" 200 None\n```\n\nwithout connection: close\n\n```\n02-22 14:00:36 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt1262416/, data: []\n02-22 14:00:36 INFO Starting new HTTPS connection (1): api.couchpota.to\n02-22 14:00:36 DEBUG \"GET /info/tt1262416/ HTTP/1.1\" 200 None\n02-22 14:00:36 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt0800369/, data: []\n02-22 14:00:36 DEBUG \"GET /info/tt0800369/ HTTP/1.1\" 200 None\n02-22 14:00:37 INFO [hpotato.core.plugins.base] Opening url: get https://api.couchpota.to/info/tt2494280/, data: []\n02-22 14:00:37 DEBUG \"GET /info/tt2494280/ HTTP/1.1\" 200 None\n```\n", "That's what we needed. =)\n\nThe relevant messages are 'Starting new HTTPS connection' and 'Resetting dropped connection'.\n\nEach time you see one of them it means we're doing the handshake again. 'Resetting dropped connection' is an indication that we still have the socket object, but that we believe it to be closed. What you want is to see as few 'Starting new HTTPS connection' messages as possible, and ideally no 'resetting dropped connection' messages.\n\nWhat's in the headers on the request and response?\n\nIf you want, you can correlate these with tcpdump showing connection closures to verify what I'm saying.\n", "https://github.com/RuudBurger/CouchPotatoServer/blob/master/couchpotato/core/plugins/base.py#L195-L200 these are the default headers for the request so I need to change the \"connection close\" part.\n\nThis is the response:\n\n```\nContent-Encoding gzip\nContent-Type application/json; charset=utf-8\nDate Sun, 22 Feb 2015 13:06:33 GMT\nServer nginx\nStrict-Transport-Security max-age=31536000; includeSubdomains;\nVary Accept-Encoding\nX-Powered-By CouchPotato (3007)\n```\n\nThanks for the help, I'll do some testing and see if I can keep the connections open a bit longer. Offload the traffic a bit. \n", "Correct, `Connection: close` won't help you here. =)\n", "I can't believe I kept the `connection: close` in there for so long. I think this is code from before I used requests and the session object. Not re-using the connections..\nOnly enabling this took a normal \"api use scenario\" from 6.5s to 1.8s.. \n\nI'll keep an eye out if the fixes helped with the traffic. But because the propagation of the update in the app takes a while I don't know until a couple of weeks when all versions have the fix. Will let it know here, for other people coming by.\n", "Did and update to all users, most active users are using the `keep-alive` fix now. Traffic is normal and server load is way down. Thanks for the help!\n", "No problem, glad I could help!\n" ]
https://api.github.com/repos/psf/requests/issues/2453
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2453/labels{/name}
https://api.github.com/repos/psf/requests/issues/2453/comments
https://api.github.com/repos/psf/requests/issues/2453/events
https://github.com/psf/requests/pull/2453
58,391,249
MDExOlB1bGxSZXF1ZXN0Mjk3MTg1MjE=
2,453
document combination of repeated response headers
{ "avatar_url": "https://avatars.githubusercontent.com/u/6235698?v=4", "events_url": "https://api.github.com/users/requiredfield/events{/privacy}", "followers_url": "https://api.github.com/users/requiredfield/followers", "following_url": "https://api.github.com/users/requiredfield/following{/other_user}", "gists_url": "https://api.github.com/users/requiredfield/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/requiredfield", "id": 6235698, "login": "requiredfield", "node_id": "MDQ6VXNlcjYyMzU2OTg=", "organizations_url": "https://api.github.com/users/requiredfield/orgs", "received_events_url": "https://api.github.com/users/requiredfield/received_events", "repos_url": "https://api.github.com/users/requiredfield/repos", "site_admin": false, "starred_url": "https://api.github.com/users/requiredfield/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/requiredfield/subscriptions", "type": "User", "url": "https://api.github.com/users/requiredfield", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-02-20T18:25:58Z
2021-09-08T08:01:08Z
2015-02-21T10:20:57Z
CONTRIBUTOR
resolved
as explained in https://github.com/kennethreitz/requests/issues/1155#issuecomment-53229867. motivation: requests uses a dict for response headers. In a given response, the same header may appear multiple times with different values, but dicts of course allow each key to map to only one value. requests [handles this intelligently](https://github.com/kennethreitz/requests/issues/1155#issuecomment-53229867), but users will not expect this given it's not documented in any of these places: - http://docs.python-requests.org/en/latest/user/quickstart/#response-headers - http://docs.python-requests.org/en/latest/api/?highlight=headers#requests.request - http://docs.python-requests.org/en/latest/api/#requests.Response.headers (Also, they may (likely) be unfamiliar with this part of http://tools.ietf.org/html/rfc7230#section-3.2, while only actually being familiar with werkzeug and other libraries that use a multidict instead to handle repeated headers.) I think there's a perfect spot to document this behavior (right after the docs explain how the dict is already special in that it's case-insensitive), and given that #1155 will probably be closed, I figured it's a good time to document this and increase users' future understanding. Thanks for the great work on requests.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2453/reactions" }
https://api.github.com/repos/psf/requests/issues/2453/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2453.diff", "html_url": "https://github.com/psf/requests/pull/2453", "merged_at": "2015-02-21T10:20:57Z", "patch_url": "https://github.com/psf/requests/pull/2453.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2453" }
true
[ ":cake: Beautiful. =)\n", "Thanks for merging. =)\n" ]
https://api.github.com/repos/psf/requests/issues/2452
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2452/labels{/name}
https://api.github.com/repos/psf/requests/issues/2452/comments
https://api.github.com/repos/psf/requests/issues/2452/events
https://github.com/psf/requests/issues/2452
58,387,684
MDU6SXNzdWU1ODM4NzY4NA==
2,452
multiple cookie headers should be joined with semicolons
{ "avatar_url": "https://avatars.githubusercontent.com/u/6235698?v=4", "events_url": "https://api.github.com/users/requiredfield/events{/privacy}", "followers_url": "https://api.github.com/users/requiredfield/followers", "following_url": "https://api.github.com/users/requiredfield/following{/other_user}", "gists_url": "https://api.github.com/users/requiredfield/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/requiredfield", "id": 6235698, "login": "requiredfield", "node_id": "MDQ6VXNlcjYyMzU2OTg=", "organizations_url": "https://api.github.com/users/requiredfield/orgs", "received_events_url": "https://api.github.com/users/requiredfield/received_events", "repos_url": "https://api.github.com/users/requiredfield/repos", "site_admin": false, "starred_url": "https://api.github.com/users/requiredfield/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/requiredfield/subscriptions", "type": "User", "url": "https://api.github.com/users/requiredfield", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-02-20T17:56:22Z
2021-09-08T23:05:56Z
2015-03-05T14:18:27Z
CONTRIBUTOR
resolved
please see https://github.com/kennethreitz/requests/issues/1155#issuecomment-55176733
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2452/reactions" }
https://api.github.com/repos/psf/requests/issues/2452/timeline
null
completed
null
null
false
[ "This should be fixed by https://github.com/shazow/urllib3/pull/534 after we cut a new release of requests.\n", "I apologise, my original statement in #1155 is wrong. Multiple cookie headers shouldn't be joined at all: there is simply no safe way to do that.\n" ]
https://api.github.com/repos/psf/requests/issues/2451
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2451/labels{/name}
https://api.github.com/repos/psf/requests/issues/2451/comments
https://api.github.com/repos/psf/requests/issues/2451/events
https://github.com/psf/requests/issues/2451
58,264,845
MDU6SXNzdWU1ODI2NDg0NQ==
2,451
Add HTTP2 support
{ "avatar_url": "https://avatars.githubusercontent.com/u/6730980?v=4", "events_url": "https://api.github.com/users/AkshatM/events{/privacy}", "followers_url": "https://api.github.com/users/AkshatM/followers", "following_url": "https://api.github.com/users/AkshatM/following{/other_user}", "gists_url": "https://api.github.com/users/AkshatM/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AkshatM", "id": 6730980, "login": "AkshatM", "node_id": "MDQ6VXNlcjY3MzA5ODA=", "organizations_url": "https://api.github.com/users/AkshatM/orgs", "received_events_url": "https://api.github.com/users/AkshatM/received_events", "repos_url": "https://api.github.com/users/AkshatM/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AkshatM/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AkshatM/subscriptions", "type": "User", "url": "https://api.github.com/users/AkshatM", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-19T20:27:11Z
2021-09-08T23:06:00Z
2015-02-19T20:33:30Z
NONE
resolved
The [draft specification of HTTP 2.0](https://tools.ietf.org/html/draft-ietf-httpbis-http2-17#page-7) is out. As a long-term goal, there should be a way to make **requests** compatible with it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2451/reactions" }
https://api.github.com/repos/psf/requests/issues/2451/timeline
null
completed
null
null
false
[ "Please check old issues: #2082.\n" ]
https://api.github.com/repos/psf/requests/issues/2450
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2450/labels{/name}
https://api.github.com/repos/psf/requests/issues/2450/comments
https://api.github.com/repos/psf/requests/issues/2450/events
https://github.com/psf/requests/pull/2450
58,254,164
MDExOlB1bGxSZXF1ZXN0Mjk2Mzk3MDk=
2,450
Update README to use Shields badges
{ "avatar_url": "https://avatars.githubusercontent.com/u/2434728?v=4", "events_url": "https://api.github.com/users/iKevinY/events{/privacy}", "followers_url": "https://api.github.com/users/iKevinY/followers", "following_url": "https://api.github.com/users/iKevinY/following{/other_user}", "gists_url": "https://api.github.com/users/iKevinY/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/iKevinY", "id": 2434728, "login": "iKevinY", "node_id": "MDQ6VXNlcjI0MzQ3Mjg=", "organizations_url": "https://api.github.com/users/iKevinY/orgs", "received_events_url": "https://api.github.com/users/iKevinY/received_events", "repos_url": "https://api.github.com/users/iKevinY/repos", "site_admin": false, "starred_url": "https://api.github.com/users/iKevinY/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iKevinY/subscriptions", "type": "User", "url": "https://api.github.com/users/iKevinY", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-02-19T19:14:06Z
2021-09-08T09:00:55Z
2015-02-20T02:01:55Z
CONTRIBUTOR
resolved
The SVG badges served by [Shields](http://shields.io) are friendlier to retina displays and have a standardized design. ![Shields Comparison](https://cloud.githubusercontent.com/assets/2434728/6273906/0fc39ff8-b828-11e4-95e8-b7b5f80ef9e9.png)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2450/reactions" }
https://api.github.com/repos/psf/requests/issues/2450/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2450.diff", "html_url": "https://github.com/psf/requests/pull/2450", "merged_at": "2015-02-20T02:01:55Z", "patch_url": "https://github.com/psf/requests/pull/2450.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2450" }
true
[ "No objection from me. @sigmavirus24?\n", "One thing I noticed is that there's less precision in the download count – not sure if this problematic.\n", "Meh. Download counts are working exactly correctly with PyPI at the moment anyway.\n", "Thanks @iKevinY !\n", "No worries!\n" ]
https://api.github.com/repos/psf/requests/issues/2449
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2449/labels{/name}
https://api.github.com/repos/psf/requests/issues/2449/comments
https://api.github.com/repos/psf/requests/issues/2449/events
https://github.com/psf/requests/issues/2449
58,058,431
MDU6SXNzdWU1ODA1ODQzMQ==
2,449
[rfe] requests should process chunked responses transparently
{ "avatar_url": "https://avatars.githubusercontent.com/u/1662493?v=4", "events_url": "https://api.github.com/users/TomasTomecek/events{/privacy}", "followers_url": "https://api.github.com/users/TomasTomecek/followers", "following_url": "https://api.github.com/users/TomasTomecek/following{/other_user}", "gists_url": "https://api.github.com/users/TomasTomecek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/TomasTomecek", "id": 1662493, "login": "TomasTomecek", "node_id": "MDQ6VXNlcjE2NjI0OTM=", "organizations_url": "https://api.github.com/users/TomasTomecek/orgs", "received_events_url": "https://api.github.com/users/TomasTomecek/received_events", "repos_url": "https://api.github.com/users/TomasTomecek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/TomasTomecek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/TomasTomecek/subscriptions", "type": "User", "url": "https://api.github.com/users/TomasTomecek", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2015-02-18T11:51:57Z
2021-09-08T23:05:52Z
2015-02-18T11:54:11Z
NONE
resolved
When a server responds with `Transfer-Encoding: chunked`, it would be nice if `requests` processed the chunked response. Right now, if I use `response.iter_lines()` I get access to the raw chunk parts, where the length of each chunk is encoded along with an optional chunk extension. It would be awesome if `requests` provided something like `iter_chunks` that gave access to the chunk data only. More info: http://en.wikipedia.org/wiki/Chunked_transfer_encoding http://mihai.ibanescu.net/chunked-encoding-and-python-requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2449/reactions" }
https://api.github.com/repos/psf/requests/issues/2449/timeline
null
completed
null
null
false
[ "As mentioned in the blog post you link to, `httplib` does not provide us with access to those chunks. To get them we'd have to go around `httplib` and read directly from the socket itself. That's do-able, but it's a pretty aggressive layering violation. The first place to ask for that would be in our underlying layer, urllib3. I suggest opening an issue there. =)\n", "Oh, should have read the blog most carefully.\n\nOkay, will migrate this issue.\n", "That's ok, it's an easy mistake to make. =)\n", "@Lukasa Since it's in urllib3 now, how we can take advantage of it?\n", "That's actually an interesting point.\n\nBy default, `iter_content` will automatically use `stream`, as you can see [here](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L656-L657). This _should_ mean that `iter_content` will now transparently work with chunked transfer encoding (interestingly, so will `iter_lines`).\n\nHowever, it might be a good idea to document how best to handle chunked transfer encoding (namely, calling `iter_content` with a chunk size of `None`).\n", "I probably won't have time to write that docs. But am really looking forward to try the new stuff in docker-py once urllib3 is rebased in requests and released.\n", "See #2506 for those docs.\n" ]
https://api.github.com/repos/psf/requests/issues/2448
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2448/labels{/name}
https://api.github.com/repos/psf/requests/issues/2448/comments
https://api.github.com/repos/psf/requests/issues/2448/events
https://github.com/psf/requests/issues/2448
57,998,689
MDU6SXNzdWU1Nzk5ODY4OQ==
2,448
Use of Session for repeated requests subject to TCP timing problem
{ "avatar_url": "https://avatars.githubusercontent.com/u/1697414?v=4", "events_url": "https://api.github.com/users/pddenhar/events{/privacy}", "followers_url": "https://api.github.com/users/pddenhar/followers", "following_url": "https://api.github.com/users/pddenhar/following{/other_user}", "gists_url": "https://api.github.com/users/pddenhar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pddenhar", "id": 1697414, "login": "pddenhar", "node_id": "MDQ6VXNlcjE2OTc0MTQ=", "organizations_url": "https://api.github.com/users/pddenhar/orgs", "received_events_url": "https://api.github.com/users/pddenhar/received_events", "repos_url": "https://api.github.com/users/pddenhar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pddenhar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pddenhar/subscriptions", "type": "User", "url": "https://api.github.com/users/pddenhar", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
12
2015-02-17T23:20:28Z
2021-09-07T00:06:12Z
2015-04-06T11:13:39Z
NONE
resolved
This is a bug that I originally experienced using the InfluxDB-python client and reported [here](https://github.com/influxdb/influxdb-python/issues/103). Upon further investigation, I've been able to reproduce it using only the requests library (which InfluxDB-client uses). Running a very simple script making repeated requests to a remote URL will result in a failed request if a certain timing condition is met, resulting in a thrown `requests.exceptions.ConnectionError`. Here is the script I used to reproduce the issue: ``` import requests import time session = requests.Session() while 1: try: response = session.request( method='GET', url="http://mediaqueri.es/", timeout=15 ) print response.status_code, response.text[0:30] except Exception as e: print e, type(e) time.sleep(5) ``` The site I'm GETting is just a publicly available site that happens to be running on the Flask framework. About one in ten requests to the site will fail, and the following line will be printed by the exception handler shown above: `HTTPConnectionPool(host='mediaqueri.es', port=80): Max retries exceeded with url: / (Caused by <class 'httplib.BadStatusLine'>: '') <class 'requests.exceptions.ConnectionError'>` This exception message confused me initially, because the BadStatusLine is something that would usually indicate a HTTP status being returned that is not considered valid, which would be unusual for such a basic request to a web server. After inspecting the actual TCP traffic with Wireshark, I've found that something else is happening. Here is the packet sequence for two requests being made five seconds apart, with the second one failing: ![screenshot from 2015-02-17 17 00 15](https://cloud.githubusercontent.com/assets/1697414/6239135/87e61a8c-b6c6-11e4-9b54-1776fbaab130.png) The first request proceeds as you would expect, with a TCP connection being opened with a SYN, and then a GET and a HTTP response being received. The use of connection pooling means that the connection is kept open between the first and second and after five seconds the second GET is sent out. The issue is that the Flask web server also seems to close inactive TCP connections after five seconds, so right after that GET is sent out and probably while it's still in flight, a FIN is received from the server and duly ACKed, which closes that TCP connection without receiving the HTTP response. At that point, some RST packets arrive (indicating that a packet arrived on a socket after a FIN was sent?) and they seem to somehow end up causing a BadStatusLine HTTP error to be thrown. I understand that is a fairly involved series of events, so thank you if you've made it this far. It might even be possible that this bug is at an even lower level than the requests library, somewhere in urllib3 perhaps. In any case though, having Session requests fail because they happen to be in sync with the rate at which TCP connections close is a pretty obscure problem for an end user to handle. At the least I think throwing an exception that lists BadStatusLine as the cause is not the correct behavior. The client side is ACKing the FIN it receives from the server, which means it should not expect to receive an HTTP response on that connection after that point.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2448/reactions" }
https://api.github.com/repos/psf/requests/issues/2448/timeline
null
completed
null
null
false
[ "So I think the fundamental problem here is that neither requests nor urllib3 have any way of knowing what's happening with the actually underlying TCP connection. urllib3 at its best wraps sockets to enable TLS but doesn't really directly manage sockets beyond that very thoroughly.\n\nIt isn't as if urllib3 or requests is acknowledging the FIN that it receives. That's generally how the socket behaves without us having to do that. I think a way around this is to enable TCP Keep-Alive but I'm not entirely convinced this is a solution that needs to live in requests proper.\n\nI think @Lukasa is on vacation or just generally taking a break so I'll have to look into this more tonight. Thanks for the very very detailed issue @pddenhar!\n", "BTW, @pddenhar I'm also in MSN. =D\n", "Hello from the UW CS dept! I'm about to leave for the day but I'll certainly be interested to see where the issue goes. It's been a hard thing to track down because it's not really anyone's \"fault\"; TCP connections closing down are a totally normal part of the TCP state machine. I think somewhere things are just getting a little strange as the closed connection still results in a malformed HTTP response getting \"handled\". \n\nI think my fix for the time being will be not using sessions on high latency devices where the FIN and HTTP GET on pooled connections are likely to end up in flight at the same time. \n", "@pddenhar I wonder if the `TCPKeepAliveAdapter` in [this PR](https://github.com/sigmavirus24/requests-toolbelt/pull/59) helps any with your problem.\n", "> The client side is ACKing the FIN it receives from the server, which means it should not expect to receive an HTTP response on that connection after that point.\n\nSadly that's not really how this works. The flow of events is this:\n1. Second HTTP request is emitted by your machine and traverses the network.\n2. The data is received, but not yet processed by Flask.\n3. Flask closes the connection.\n4. The TCP connection is torn down, either forcefully or abruptly.\n\nWe know that step 2 happened because the packet is FIN+ACK, not just FIN. This ACK applies to a previously received data-packet, and the only un-ACKed packet that can be is the one that transmitted the second request. This is a bit irrelevant though.\n\nWhat matters is the sequence of socket calls. They go:\n1. connect (optional, not done in this case)\n2. send, send, send, send...\n3. recv, recv, recv, recv...\n4. GOTO 2\n\nHere's the thing. When a TCP connection is closed by the remote end, the visible effect of this is that the recv call returns 'short' (with whatever data was received before the FIN), and subsequent recv calls on the same FD throw exceptions. In this case, as no data was received the recv call will return a length-zero read. httplib parses this as an early connection close and raises `BadStatusLine`. The RST packets are a red herring here: they don't cause this behaviour. The FIN by itself would do it.\n\nUnfortunately, above the socket layer we cannot tell the difference between the server pulling down a connection before the application has processed the request and the server pulling it down after. 
`httplib` has no way of knowing that this closure means the request wasn't processed at all, versus the case where the request is processed and the response to it is to close the socket.\n\nI just don't think there's anything we can do here.\n", "I'm not sure there's much we can do to fix this, so I'm proposing we close it.\n", "I am facing this issue without using sessions. I am calling `requests.{get,post,put,delete}` on a single `host:port` and after a few requests I get `BadStatusLine` exception. I am connecting over `https` with `verify=False` as the server is sending a self-signed cert. ", "I believe that the underlying libraries still re-use TCP connections even when one isn't using a single session object for making requests. Otherwise, I wouldn't have faced this issue.", "@narendraj9 that is only true for extremely old versions of Requests. You've provided no information to indicate this is still a problem. Even so, you should file a new issue if you can show that to be the case.", "The version of python-requests that I am facing the issue is `2.11.1` on python `2.7.5`.", "Right. So 2.11.1 was released in August of 2016 (per [PyPI](https://pypi.org/project/requests/#history)). That's over 2 years old at this point. That is unsupported from our point of view.", "Okay. Thanks for the information. I tried looking at the code to see if it\nis re-using TCP connections but couldn't find that it is. If it is possible\nfor you to tell me when it stopped re-using connections for a non-session\ncall on requests obejct, it will be really helpful.\n\nNarendra Joshi\n\nOn Mon, 8 Oct 2018 17:02 Ian Stapleton Cordasco, <[email protected]>\nwrote:\n\n> Right. So 2.11.1 was released in August of 2016 (per PyPI\n> <https://pypi.org/project/requests/#history>). That's over 2 years old at\n> this point. That is unsupported from our point of view.\n>\n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/requests/requests/issues/2448#issuecomment-427800208>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/ADbemgJMuYnv9qqZGCtHSuFNFJ_JIBAwks5uize0gaJpZM4DhzA->\n> .\n>\n" ]
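The thread above does not propose a code-level fix, but one common mitigation (an assumption on my part, not something the maintainers endorse here) is to mount an `HTTPAdapter` with `max_retries`, so that an idempotent GET which lands on a connection the server has just closed may be retried on a fresh socket. The URL and retry count below are illustrative only.

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()

# Retrying idempotent requests papers over the race where the server tears
# down a pooled connection just as the client tries to reuse it.
adapter = HTTPAdapter(max_retries=3)
session.mount('http://', adapter)
session.mount('https://', adapter)

response = session.get('http://mediaqueri.es/', timeout=15)
print(response.status_code)
```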
https://api.github.com/repos/psf/requests/issues/2447
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2447/labels{/name}
https://api.github.com/repos/psf/requests/issues/2447/comments
https://api.github.com/repos/psf/requests/issues/2447/events
https://github.com/psf/requests/issues/2447
57,680,795
MDU6SXNzdWU1NzY4MDc5NQ==
2,447
[Suggestion] allow TCP options to be set more easily
{ "avatar_url": "https://avatars.githubusercontent.com/u/348398?v=4", "events_url": "https://api.github.com/users/hoodja/events{/privacy}", "followers_url": "https://api.github.com/users/hoodja/followers", "following_url": "https://api.github.com/users/hoodja/following{/other_user}", "gists_url": "https://api.github.com/users/hoodja/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hoodja", "id": 348398, "login": "hoodja", "node_id": "MDQ6VXNlcjM0ODM5OA==", "organizations_url": "https://api.github.com/users/hoodja/orgs", "received_events_url": "https://api.github.com/users/hoodja/received_events", "repos_url": "https://api.github.com/users/hoodja/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hoodja/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hoodja/subscriptions", "type": "User", "url": "https://api.github.com/users/hoodja", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-02-14T04:32:17Z
2021-09-08T23:06:01Z
2015-02-15T15:53:00Z
NONE
resolved
First, thanks for all the work on this library - I've been using it for about a year and it is always a joy to work with! I've run into a situation where a request I'm making is running through a NAT that is not in my control, and the request takes >10m to complete. On a Linux machine I am able to retrieve the response using cURL, but it never completed using requests in python. I suspect this is because cURL sends TCP-level keepalive requests while the default initialization of requests does not. In researching how I would be able to enable keepalive on the underlying sockets, it seems like I would just need to be able to provide the `socket_options` list to `requests.adapters.HTTPAdapter` at construction, and have HTTPAdapter include that in `pool_kwargs`. I imagine this could be accomplished by adding a `**kwargs` argument to `HTTPAdapter` and passing it through / merging it into `pool_kwargs`. This issue is meant to be a discussion on the appropriateness of that solution.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2447/reactions" }
https://api.github.com/repos/psf/requests/issues/2447/timeline
null
completed
null
null
false
[ "For added context, below is a workaround that seems to work for me on Linux (but not OS X):\n\n``` python\ns = requests.Session()\n\nfrom requests.packages.urllib3.connection import HTTPConnection\nmy_socket_options = HTTPConnection.default_socket_options + [\n (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)\n, (socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60)\n, (socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 20)\n, (socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)\n]\n\nadapter = requests.adapters.HTTPAdapter()\nadapter.init_poolmanager(10, 10, block=False, socket_options=my_socket_options)\ns.mount('https://', adapter)\n```\n\nI feel dirty calling `adapter.init_pooolmanager()` after constructing the `HTTPAdapter` but wasn't willing to write my own Adapter just to make that connection.\n", "Hey @hoodja I'm glad you like the library.\n\nWe've had several (suggestions|feature requests) to add various different (and usually of little popularity) keyword arguments to `HTTPAdapter.__init__`. We've been consistently resistant because the ideology of the project author and its maintainers is to serve the 95% (or greater) use case. I personally added the ability to configure socket options to urllib3 and you're the third person I've ever found who has needed to set socket options through requests. So we're certainly not likely to add a socket options keyword argument to `HTTPAdapter.__init__`. The [toolbelt](/sigmavirus24/requests-toolbelt) already has one or two adapters for less common but non-trivial use-cases and we can discuss there whether a `CustomSocketOptionsAdapter` would fit your needs there and what that API would look like.\n\nCheers!\nIan\n", "Hey @hoodja I added both a socket options adapter to the toolbelt and a TCP Keep-Alive adapter: https://github.com/sigmavirus24/requests-toolbelt/pull/59\n", "Thanks Ian! I'll definitely give that a spin.\n\nOn Tue, Feb 17, 2015 at 9:35 PM, Ian Cordasco [email protected]\nwrote:\n\n> Hey @hoodja https://github.com/hoodja I added both a socket options\n> adapter to the toolbelt and a TCP Keep-Alive adapter:\n> sigmavirus24/requests-toolbelt#59\n> https://github.com/sigmavirus24/requests-toolbelt/pull/59\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2447#issuecomment-74805834\n> .\n" ]
https://api.github.com/repos/psf/requests/issues/2446
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2446/labels{/name}
https://api.github.com/repos/psf/requests/issues/2446/comments
https://api.github.com/repos/psf/requests/issues/2446/events
https://github.com/psf/requests/issues/2446
57,653,096
MDU6SXNzdWU1NzY1MzA5Ng==
2,446
gzip response is not decoded
{ "avatar_url": "https://avatars.githubusercontent.com/u/5251753?v=4", "events_url": "https://api.github.com/users/barab-a/events{/privacy}", "followers_url": "https://api.github.com/users/barab-a/followers", "following_url": "https://api.github.com/users/barab-a/following{/other_user}", "gists_url": "https://api.github.com/users/barab-a/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/barab-a", "id": 5251753, "login": "barab-a", "node_id": "MDQ6VXNlcjUyNTE3NTM=", "organizations_url": "https://api.github.com/users/barab-a/orgs", "received_events_url": "https://api.github.com/users/barab-a/received_events", "repos_url": "https://api.github.com/users/barab-a/repos", "site_admin": false, "starred_url": "https://api.github.com/users/barab-a/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/barab-a/subscriptions", "type": "User", "url": "https://api.github.com/users/barab-a", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2015-02-13T20:58:10Z
2019-08-09T15:40:34Z
2015-02-13T22:46:14Z
NONE
resolved
It says in the FAQ: > Requests automatically decompresses gzip-encoded responses, and does its best to decode response content to unicode when possible. But the following code still returns the compressed data: ``` import requests response = requests.get("http://www.shopcade.com/sitemaps/sitemap_products_index.xml.gz") print response.text ``` So is this an issue with the requests library, or is there something wrong with the response from that site? When I download the archive via a browser it decompresses fine and the XML seems to be valid. Also, the response headers contain the following key-value pair: ``` 'content-type': 'application/x-gzip' ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2446/reactions" }
https://api.github.com/repos/psf/requests/issues/2446/timeline
null
completed
null
null
false
[ "Ah, yes. This is an easy misunderstanding to make.\n\nWhen we say gzip-encoded responses, we mean responses that are sent with `Transfer-Encoding: gzip`. That means that the body has a known type but has been compressed with gzip for transport. That does not include things whose actual content is gzip, as in the example above. Those are transmitted exactly as originally served.\n", "Seems like a good thing to properly document for users so this confusion doesn't happen again.\n", "@Lukasa:\nI'm wondering if it's possible to use Requests to decode a gzip-encoded file (e.g. not-for-transport), or would that not be in the spirit of the library (file provenance and all)?\n\nI've got a large number of gzipped text file to download then parse (that would be amazing to just stream), and if it was the server doing the encoding, it seems like Requests could handle it no problem, but if the file's already in that form, would it work?\n", "@riordan we will not decode something that does not have a gzip (or compress) Content-Encoding. The gzip module in Python should do this for you though and it might be possible to create that with `response.raw` as a file object to sort of stream i.\n", "Alternatively, you can set up a fairly simple generator-based pipeline:\n\n``` python\nimport zlib\n\ndef decompress_stream(stream):\n o = zlib.decompressobj(16 + zlib.MAX_WBITS)\n\n for chunk in stream:\n yield o.decompress(chunk)\n\n yield o.flush()\n\n\nr = requests.get(some_url, stream=True)\nparseable_data = decompress_stream(r.iter_content(1024))\n```\n", "@sigmavirus24 @Lukasa This is _awesome_ thank you. That's exactly what I wound up doing.\n", "#### my code:\r\n```\r\ndef decompress_stream(stream):\r\n o = zlib.decompressobj(16 + zlib.MAX_WBITS)\r\n for chunk in stream:\r\n yield o.decompress(chunk)\r\n yield o.flush()\r\n\r\nrsp = requests.post(url,stream=bUseStream, data=szCmd, verify=False, headers=loadCookie(url,headers),timeout=None)\r\n \r\n rsp.raise_for_status()\r\n f9 = open(szFNms1,\"wb\")\r\n for chunk in decompress_stream(rsp.iter_content(chunk_size=81920)): \r\n if chunk: # filter out keep-alive new chunks\r\n f9.write(chunk)\r\n f9.flush()\r\n print(\"+\",sep=\"\",end=\"\",flush=True)\r\n f9.close()\r\n```\r\nis error\r\n```\r\n('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check'))\r\n```\r\n@Lukasa " ]
https://api.github.com/repos/psf/requests/issues/2445
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2445/labels{/name}
https://api.github.com/repos/psf/requests/issues/2445/comments
https://api.github.com/repos/psf/requests/issues/2445/events
https://github.com/psf/requests/issues/2445
57,504,627
MDU6SXNzdWU1NzUwNDYyNw==
2,445
Allow a "url_prefix" to be set on Session
{ "avatar_url": "https://avatars.githubusercontent.com/u/1653275?v=4", "events_url": "https://api.github.com/users/fiatjaf/events{/privacy}", "followers_url": "https://api.github.com/users/fiatjaf/followers", "following_url": "https://api.github.com/users/fiatjaf/following{/other_user}", "gists_url": "https://api.github.com/users/fiatjaf/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/fiatjaf", "id": 1653275, "login": "fiatjaf", "node_id": "MDQ6VXNlcjE2NTMyNzU=", "organizations_url": "https://api.github.com/users/fiatjaf/orgs", "received_events_url": "https://api.github.com/users/fiatjaf/received_events", "repos_url": "https://api.github.com/users/fiatjaf/repos", "site_admin": false, "starred_url": "https://api.github.com/users/fiatjaf/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fiatjaf/subscriptions", "type": "User", "url": "https://api.github.com/users/fiatjaf", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-02-12T19:08:05Z
2021-09-08T23:06:01Z
2015-02-12T19:13:25Z
NONE
resolved
So that we can do something like this: ``` python COUCHDB_URL = 'https://mybigusername:[email protected]/databasename/' s = requests.Session(url_prefix=COUCHDB_URL) r = s.get('/_all_docs', params=dict(include_docs=True)) r = s.get('/some_document_id') ``` The example uses the CouchDB API, and it would almost entirely remove the need for API wrappers like [this](https://github.com/adamlofts/couchdb-requests) (see it storing the URL prefix [right here](https://github.com/adamlofts/couchdb-requests/blob/93e83d5e8cf7b0894de33747572789777dee845e/couchdbreq/resource.py#L80)). But extensive usage of any REST API would benefit from this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 2, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/2445/reactions" }
https://api.github.com/repos/psf/requests/issues/2445/timeline
null
completed
null
null
false
[ "Duplicate of #133.\n\nIn short, we're not adding new features to the API and we're especially not going to attempt to build URLs for you when you're using a session. There are lots of other libraries that allow for similar functionality but that is not something inherently necessary in the transmission or handling of HTTP. It's important to keep in mind that `requests` is \"HTTP for Humans\", not \"HTTP and a bunch of super nice to have features for Humans\". The main focus of this library is HTTP.\n", "Well, thank you.\n" ]
https://api.github.com/repos/psf/requests/issues/2444
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2444/labels{/name}
https://api.github.com/repos/psf/requests/issues/2444/comments
https://api.github.com/repos/psf/requests/issues/2444/events
https://github.com/psf/requests/pull/2444
57,355,512
MDExOlB1bGxSZXF1ZXN0MjkxMTUyNjg=
2,444
Upgrade urllib3 to 490d3a227fadb626cd54a240b9d0922f849914b4
{ "avatar_url": "https://avatars.githubusercontent.com/u/48383?v=4", "events_url": "https://api.github.com/users/Yasumoto/events{/privacy}", "followers_url": "https://api.github.com/users/Yasumoto/followers", "following_url": "https://api.github.com/users/Yasumoto/following{/other_user}", "gists_url": "https://api.github.com/users/Yasumoto/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Yasumoto", "id": 48383, "login": "Yasumoto", "node_id": "MDQ6VXNlcjQ4Mzgz", "organizations_url": "https://api.github.com/users/Yasumoto/orgs", "received_events_url": "https://api.github.com/users/Yasumoto/received_events", "repos_url": "https://api.github.com/users/Yasumoto/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Yasumoto/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Yasumoto/subscriptions", "type": "User", "url": "https://api.github.com/users/Yasumoto", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2015-02-11T18:23:53Z
2021-09-08T08:01:08Z
2015-02-23T22:01:06Z
CONTRIBUTOR
resolved
This is the release tagged at 85dfc16817df1e3604c238ad5d64f3b229e0598b
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2444/reactions" }
https://api.github.com/repos/psf/requests/issues/2444/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2444.diff", "html_url": "https://github.com/psf/requests/pull/2444", "merged_at": "2015-02-23T22:01:06Z", "patch_url": "https://github.com/psf/requests/pull/2444.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2444" }
true
[ "Tested on OS X 10.10:\n\n```\n[tw-mbp-jsmith requests (urllib_upgrade)]$ make test\n# This runs all of the tests. To run an individual test, run py.test with\n# the -k flag, like \"py.test -k test_path_is_not_double_encoded\"\npy.test test_requests.py\n============================================================================== test session starts ===============================================================================\nplatform darwin -- Python 2.7.9 -- pytest-2.3.4\nplugins: cov\ncollected 154 items \n\ntest_requests.py ..........................................................................................................................................................\n\n========================================================================== 154 passed in 26.34 seconds ===========================================================================\n```\n\nThis is to get https://github.com/shazow/urllib3/pull/526 merged in https://github.com/pypa/pip/issues/2395 can take advantage of the fix for older SSL library versions. This was previously tracked at https://github.com/kennethreitz/requests/issues/2435.\n", "We actually will pull in tip when we do a release. Speaking of which @Lukasa when do you think we should do a release? There's extra header awesomeness that's unreleased in urllib3 (merged last night) that I'd like to grab before doing a release.\n", "@sigmavirus24 sounds good- I updated to HEAD, but feel free to drop this PR in favor of another one if this is in flight elsewhere.\n", "It'd be great to get a bug-fix release soon (for issue #2435), so that we can use requests on python 2.7.9...\n", "Howdy all, anything I can do to help get this over the finish line? Happy to pitch in on testing or updating docs for 'upgrading urllib3' if I missed something too.\n\nThanks!!\n", "This is a pretty big blocker for me right now. Any chance we can get this pull request merged soon?\n", "Thanks @sigmavirus24 !\n" ]
https://api.github.com/repos/psf/requests/issues/2443
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2443/labels{/name}
https://api.github.com/repos/psf/requests/issues/2443/comments
https://api.github.com/repos/psf/requests/issues/2443/events
https://github.com/psf/requests/issues/2443
57,228,826
MDU6SXNzdWU1NzIyODgyNg==
2,443
Scaffold an API console sample from a model
{ "avatar_url": "https://avatars.githubusercontent.com/u/109167?v=4", "events_url": "https://api.github.com/users/s-celles/events{/privacy}", "followers_url": "https://api.github.com/users/s-celles/followers", "following_url": "https://api.github.com/users/s-celles/following{/other_user}", "gists_url": "https://api.github.com/users/s-celles/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/s-celles", "id": 109167, "login": "s-celles", "node_id": "MDQ6VXNlcjEwOTE2Nw==", "organizations_url": "https://api.github.com/users/s-celles/orgs", "received_events_url": "https://api.github.com/users/s-celles/received_events", "repos_url": "https://api.github.com/users/s-celles/repos", "site_admin": false, "starred_url": "https://api.github.com/users/s-celles/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/s-celles/subscriptions", "type": "User", "url": "https://api.github.com/users/s-celles", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-10T20:05:31Z
2021-09-08T23:06:02Z
2015-02-11T03:46:44Z
NONE
resolved
Hello, I wonder whether RAML http://www.sitepoint.com/raml-restful-api-modeling-language/ http://raml.org/ could be used to create (scaffold) an API console sample. This idea is probably out of the scope of requests, but we could talk about such an idea here. Same idea on the server side: https://github.com/flask-restful/flask-restful/issues/394 Kind regards PS: I'm writing this about RAML, but maybe API Blueprint https://apiblueprint.org/ or Swagger http://swagger.io/ are also good candidates.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2443/reactions" }
https://api.github.com/repos/psf/requests/issues/2443/timeline
null
completed
null
null
false
[ "> This idea is probably out of the scope of requests but we could talk about such idea here.\n\nIt is definitely out of the scope of requests. Since it is out of scope this is not the place to discuss it.\n" ]
https://api.github.com/repos/psf/requests/issues/2442
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2442/labels{/name}
https://api.github.com/repos/psf/requests/issues/2442/comments
https://api.github.com/repos/psf/requests/issues/2442/events
https://github.com/psf/requests/pull/2442
57,219,946
MDExOlB1bGxSZXF1ZXN0MjkwMzQ2NzE=
2,442
Update certificate bundle.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-02-10T19:05:03Z
2021-09-08T09:00:55Z
2015-02-10T19:05:56Z
MEMBER
resolved
This updates the cert bundle to what was produced by mkcert.org at Tue 10 Feb 2015 19:04:55 GMT.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2442/reactions" }
https://api.github.com/repos/psf/requests/issues/2442/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2442.diff", "html_url": "https://github.com/psf/requests/pull/2442", "merged_at": "2015-02-10T19:05:56Z", "patch_url": "https://github.com/psf/requests/pull/2442.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2442" }
true
[]
https://api.github.com/repos/psf/requests/issues/2441
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2441/labels{/name}
https://api.github.com/repos/psf/requests/issues/2441/comments
https://api.github.com/repos/psf/requests/issues/2441/events
https://github.com/psf/requests/pull/2441
56,954,028
MDExOlB1bGxSZXF1ZXN0Mjg4ODEyNjg=
2,441
Raise explicitly when exclusive parameters are provided
{ "avatar_url": "https://avatars.githubusercontent.com/u/259691?v=4", "events_url": "https://api.github.com/users/andreif/events{/privacy}", "followers_url": "https://api.github.com/users/andreif/followers", "following_url": "https://api.github.com/users/andreif/following{/other_user}", "gists_url": "https://api.github.com/users/andreif/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/andreif", "id": 259691, "login": "andreif", "node_id": "MDQ6VXNlcjI1OTY5MQ==", "organizations_url": "https://api.github.com/users/andreif/orgs", "received_events_url": "https://api.github.com/users/andreif/received_events", "repos_url": "https://api.github.com/users/andreif/repos", "site_admin": false, "starred_url": "https://api.github.com/users/andreif/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/andreif/subscriptions", "type": "User", "url": "https://api.github.com/users/andreif", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 44501305, "name": "Not Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTMwNQ==", "url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
9
2015-02-08T14:52:56Z
2021-09-08T05:01:02Z
2016-01-30T03:53:32Z
NONE
resolved
It's a common case to send a multipart request together with JSON metadata. Currently, it is not clear that the `json` argument is silently ignored when `files` is also provided. The same happens when both `data` and `json` are provided.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2441/reactions" }
https://api.github.com/repos/psf/requests/issues/2441/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2441.diff", "html_url": "https://github.com/psf/requests/pull/2441", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2441.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2441" }
true
[ "Thanks for the PR @andreif!\n\nI'm mostly -0.5 on this. I think the right approach is to fix the documentation, since that's clearly what's lacking here (in fact, I could find no examples of using the json parameter actually, and I take full blame for that).\n\nThat said, the API for multipart requests seems confusing enough that I think some really good documentation around the `data`, `json`, and `files` parameters is really necessary.\n\nFurther, I think if we are going to raise an exception here, it should be a `ValueError`. `NotImplementedError` is usually used for abstract methods that need to be overridden by a subclass. That's not what we expect people to do here. What actually happens here, the problem is that we technically received to many values and can't possibly know what the users want us to do.\n", "@sigmavirus24 Well, I wonder if we could take both files and json. It's a common case to upload files with json meta data, so maybe it makes sense to have a nice api for this.\n", "It would be nice to write\n\n``` py\nr = requests.post('https://www.googleapis.com/upload/drive/v2/files',\n params={'uploadType': 'multipart'},\n headers={'Authorization': 'Bearer ' + access_token},\n json={'title': 'test.txt'},\n files={'file': open('test.txt', 'rb')})\n```\n\ninstead of\n\n``` py\nr = requests.post('https://www.googleapis.com/upload/drive/v2/files',\n params={'uploadType': 'multipart'},\n headers={'Authorization': 'Bearer ' + access_token},\n files={'file': open('test.txt', 'rb'),\n None: (None, json.dumps({'title': 'test.txt'}),\n 'application/json; charset=UTF-8')})\n```\n", "I could look into it if you are fine with accepting json argument together with files\n", "@andreif That's a good idea, but sadly the API doesn't render out well. For instance, what happens if instead of uploading one file you're uploading two? And how do you associate them together?\n\nSadly, JSON in mutlipart/form-data messages is not standardised in any sense, so there's no way we can extend our API for that to be consistently useful. In that situation I'd want to resist the temptation to guess.\n\nHowever, I'm +0 on throwing exceptions when mutually exclusive parameters are passed, because I don't really think we should fail silently. That said, it's backwards incompatible, and so we need to hold it for at least a minor release.\n", "@Lukasa Alright. Should I change to `ValueError` in this case?\n", "@andreif please do change it to a ValueError\n", "@sigmavirus24 I took existing raise as an example. Should I change all of them to `ValueError` or should I keep the original as it is?\n\n``` py\nraise NotImplementedError('Streamed bodies and files are mutually exclusive.')\n```\n", "This should be solved with documentation.\n" ]
https://api.github.com/repos/psf/requests/issues/2440
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2440/labels{/name}
https://api.github.com/repos/psf/requests/issues/2440/comments
https://api.github.com/repos/psf/requests/issues/2440/events
https://github.com/psf/requests/pull/2440
56,813,369
MDExOlB1bGxSZXF1ZXN0Mjg4MDkzMTM=
2,440
Update to use readthedocs.org instead of rtfd.org
{ "avatar_url": "https://avatars.githubusercontent.com/u/201155?v=4", "events_url": "https://api.github.com/users/aquarion/events{/privacy}", "followers_url": "https://api.github.com/users/aquarion/followers", "following_url": "https://api.github.com/users/aquarion/following{/other_user}", "gists_url": "https://api.github.com/users/aquarion/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/aquarion", "id": 201155, "login": "aquarion", "node_id": "MDQ6VXNlcjIwMTE1NQ==", "organizations_url": "https://api.github.com/users/aquarion/orgs", "received_events_url": "https://api.github.com/users/aquarion/received_events", "repos_url": "https://api.github.com/users/aquarion/repos", "site_admin": false, "starred_url": "https://api.github.com/users/aquarion/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/aquarion/subscriptions", "type": "User", "url": "https://api.github.com/users/aquarion", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-06T13:47:04Z
2021-09-08T09:00:56Z
2015-02-06T14:02:58Z
CONTRIBUTOR
resolved
https://toolbelt.rtfd.org currently causes an SSL error because the wildcard SSL cert is for *.readthedocs.org.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2440/reactions" }
https://api.github.com/repos/psf/requests/issues/2440/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2440.diff", "html_url": "https://github.com/psf/requests/pull/2440", "merged_at": "2015-02-06T14:02:58Z", "patch_url": "https://github.com/psf/requests/pull/2440.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2440" }
true
[ "Who added this link?\n\nThanks for fixing this @acquarion!\n" ]
https://api.github.com/repos/psf/requests/issues/2439
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2439/labels{/name}
https://api.github.com/repos/psf/requests/issues/2439/comments
https://api.github.com/repos/psf/requests/issues/2439/events
https://github.com/psf/requests/pull/2439
56,597,006
MDExOlB1bGxSZXF1ZXN0Mjg2ODA5OTA=
2,439
Suppress the warning if verify=False
{ "avatar_url": "https://avatars.githubusercontent.com/u/227426?v=4", "events_url": "https://api.github.com/users/zaitcev/events{/privacy}", "followers_url": "https://api.github.com/users/zaitcev/followers", "following_url": "https://api.github.com/users/zaitcev/following{/other_user}", "gists_url": "https://api.github.com/users/zaitcev/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zaitcev", "id": 227426, "login": "zaitcev", "node_id": "MDQ6VXNlcjIyNzQyNg==", "organizations_url": "https://api.github.com/users/zaitcev/orgs", "received_events_url": "https://api.github.com/users/zaitcev/received_events", "repos_url": "https://api.github.com/users/zaitcev/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zaitcev/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zaitcev/subscriptions", "type": "User", "url": "https://api.github.com/users/zaitcev", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
null
[]
null
5
2015-02-04T22:22:16Z
2021-09-08T09:00:57Z
2015-02-05T04:27:33Z
NONE
resolved
Setting verify to False differs from verify=None in that it is explicitly set, which makes the warning undesirable. See this proposed alternative, which involves importing implementation details of Requests into the application: https://github.com/kennethreitz/requests/issues/2214#issuecomment-72941896 This fix avoids deep imports and keeps knowledge of Requests internals out of the application. Fixes #2214
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2439/reactions" }
https://api.github.com/repos/psf/requests/issues/2439/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2439.diff", "html_url": "https://github.com/psf/requests/pull/2439", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2439.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2439" }
true
[ "Some notes:\n1. urllib3 is a separate project that we vendor in its entirety. This PR affects it, and so requires a separate change in that library.\n2. 'Explicit' is dependent on who you are. This is 'explicit' to the person who calls requests, but not to the person above them, if any. The warning is not just for the person writing against requests, but for anyone who uses an application that uses requests.\n\nI'm -1 on this change.\n", "`CERT_EXPLICITLY_NONE` means nothing and doesn't add much value here. Given that this is mostly in urllib3 and would rely on acceptance there, I'm closing this until progress has been made there.\n", "Sorry, but the closing statement is false. CERT_EXPLICITLY_NONE means CERT_NONE set explicitly, and its value is making the distinction between such case and CERT_NONE set explicitly, purely by the lack of certificates.\n", "@zaitcev The closing statement was: \"Given that this is mostly in urllib3 and would rely on acceptance there, **I'm closing this until progress has been made there**.\" (Emphasis mine.)\n", "@zaitcev `CERT_NONE`, `CERT_REQUIRED` and `CERT_OPTIONAL` are all defined on the `ssl` module and have a real meaning in the context of performing validation. `CERT_EXPLICITLY_NONE` means absolutely nothing in the same context.\n" ]
https://api.github.com/repos/psf/requests/issues/2438
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2438/labels{/name}
https://api.github.com/repos/psf/requests/issues/2438/comments
https://api.github.com/repos/psf/requests/issues/2438/events
https://github.com/psf/requests/pull/2438
56,470,544
MDExOlB1bGxSZXF1ZXN0Mjg2MDgxOTQ=
2,438
add a timeout value to connection_error_invalid_port test to accelerate failure
{ "avatar_url": "https://avatars.githubusercontent.com/u/739280?v=4", "events_url": "https://api.github.com/users/colindickson/events{/privacy}", "followers_url": "https://api.github.com/users/colindickson/followers", "following_url": "https://api.github.com/users/colindickson/following{/other_user}", "gists_url": "https://api.github.com/users/colindickson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/colindickson", "id": 739280, "login": "colindickson", "node_id": "MDQ6VXNlcjczOTI4MA==", "organizations_url": "https://api.github.com/users/colindickson/orgs", "received_events_url": "https://api.github.com/users/colindickson/received_events", "repos_url": "https://api.github.com/users/colindickson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/colindickson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/colindickson/subscriptions", "type": "User", "url": "https://api.github.com/users/colindickson", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-04T02:17:49Z
2021-09-08T09:00:57Z
2015-02-04T07:01:10Z
CONTRIBUTOR
resolved
As mentioned in https://github.com/kennethreitz/requests/pull/2436, one of the tests seemed to be hanging. In fact, the test was just (correctly) waiting to hit its timeout value. We now give it a shorter timeout value to accelerate the testing process.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2438/reactions" }
https://api.github.com/repos/psf/requests/issues/2438/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2438.diff", "html_url": "https://github.com/psf/requests/pull/2438", "merged_at": "2015-02-04T07:01:10Z", "patch_url": "https://github.com/psf/requests/pull/2438.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2438" }
true
[ "Yeah, this feels sensible to me. :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/2437
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2437/labels{/name}
https://api.github.com/repos/psf/requests/issues/2437/comments
https://api.github.com/repos/psf/requests/issues/2437/events
https://github.com/psf/requests/pull/2437
56,352,390
MDExOlB1bGxSZXF1ZXN0Mjg1MzY5Mzg=
2,437
quickstart: using a list as a value in query params
{ "avatar_url": "https://avatars.githubusercontent.com/u/912257?v=4", "events_url": "https://api.github.com/users/tomscytale/events{/privacy}", "followers_url": "https://api.github.com/users/tomscytale/followers", "following_url": "https://api.github.com/users/tomscytale/following{/other_user}", "gists_url": "https://api.github.com/users/tomscytale/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tomscytale", "id": 912257, "login": "tomscytale", "node_id": "MDQ6VXNlcjkxMjI1Nw==", "organizations_url": "https://api.github.com/users/tomscytale/orgs", "received_events_url": "https://api.github.com/users/tomscytale/received_events", "repos_url": "https://api.github.com/users/tomscytale/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tomscytale/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tomscytale/subscriptions", "type": "User", "url": "https://api.github.com/users/tomscytale", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-02-03T09:21:06Z
2021-09-08T09:00:58Z
2015-02-03T10:34:07Z
CONTRIBUTOR
resolved
Update docs - how to pass a list of values in a query
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2437/reactions" }
https://api.github.com/repos/psf/requests/issues/2437/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2437.diff", "html_url": "https://github.com/psf/requests/pull/2437", "merged_at": "2015-02-03T10:34:07Z", "patch_url": "https://github.com/psf/requests/pull/2437.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2437" }
true
[ ":cake: :star: Thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2436
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2436/labels{/name}
https://api.github.com/repos/psf/requests/issues/2436/comments
https://api.github.com/repos/psf/requests/issues/2436/events
https://github.com/psf/requests/pull/2436
56,325,178
MDExOlB1bGxSZXF1ZXN0Mjg1MjE5Mzc=
2,436
Fix for failing test: test_connection_error
{ "avatar_url": "https://avatars.githubusercontent.com/u/739280?v=4", "events_url": "https://api.github.com/users/colindickson/events{/privacy}", "followers_url": "https://api.github.com/users/colindickson/followers", "following_url": "https://api.github.com/users/colindickson/following{/other_user}", "gists_url": "https://api.github.com/users/colindickson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/colindickson", "id": 739280, "login": "colindickson", "node_id": "MDQ6VXNlcjczOTI4MA==", "organizations_url": "https://api.github.com/users/colindickson/orgs", "received_events_url": "https://api.github.com/users/colindickson/received_events", "repos_url": "https://api.github.com/users/colindickson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/colindickson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/colindickson/subscriptions", "type": "User", "url": "https://api.github.com/users/colindickson", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-02-03T02:28:37Z
2021-09-08T09:00:57Z
2015-02-03T06:50:11Z
CONTRIBUTOR
resolved
http://fooobarbangbazbing.httpbin.org does not seem to be an unknown domain anymore. I've also split up this unit test into two tests since they test slightly different things.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2436/reactions" }
https://api.github.com/repos/psf/requests/issues/2436/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2436.diff", "html_url": "https://github.com/psf/requests/pull/2436", "merged_at": "2015-02-03T06:50:11Z", "patch_url": "https://github.com/psf/requests/pull/2436.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2436" }
true
[ "LGTM\n", "Thanks! :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/2435
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2435/labels{/name}
https://api.github.com/repos/psf/requests/issues/2435/comments
https://api.github.com/repos/psf/requests/issues/2435/events
https://github.com/psf/requests/issues/2435
56,308,913
MDU6SXNzdWU1NjMwODkxMw==
2,435
ValueError: check_hostname requires server_hostname when making GET requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/1057616?v=4", "events_url": "https://api.github.com/users/jhuang314/events{/privacy}", "followers_url": "https://api.github.com/users/jhuang314/followers", "following_url": "https://api.github.com/users/jhuang314/following{/other_user}", "gists_url": "https://api.github.com/users/jhuang314/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jhuang314", "id": 1057616, "login": "jhuang314", "node_id": "MDQ6VXNlcjEwNTc2MTY=", "organizations_url": "https://api.github.com/users/jhuang314/orgs", "received_events_url": "https://api.github.com/users/jhuang314/received_events", "repos_url": "https://api.github.com/users/jhuang314/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jhuang314/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jhuang314/subscriptions", "type": "User", "url": "https://api.github.com/users/jhuang314", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-02-02T23:15:23Z
2021-08-27T00:08:25Z
2015-02-03T06:52:41Z
NONE
resolved
I just built python 2.7.9 from source for RHEL5.7 and installed requests using easy_install. I get the following error when running

```
r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
```

```
Python 2.7.9 (default, Feb 2 2015, 17:03:23)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-52)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/api.py", line 65, in get
    return request('get', url, **kwargs)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/api.py", line 49, in request
    response = session.request(method=method, url=url, **kwargs)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/sessions.py", line 461, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/sessions.py", line 573, in send
    r = adapter.send(request, **kwargs)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/adapters.py", line 370, in send
    timeout=timeout
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 518, in urlopen
    body=body, headers=headers)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 322, in _make_request
    self._validate_conn(conn)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 727, in _validate_conn
    conn.connect()
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/packages/urllib3/connection.py", line 238, in connect
    ssl_version=resolved_ssl_version)
  File "/opt/python2.7/lib/python2.7/site-packages/requests-2.5.1-py2.7.egg/requests/packages/urllib3/util/ssl_.py", line 254, in ssl_wrap_socket
    return context.wrap_socket(sock)
  File "/opt/python2.7/lib/python2.7/ssl.py", line 350, in wrap_socket
    _context=self)
  File "/opt/python2.7/lib/python2.7/ssl.py", line 537, in __init__
    raise ValueError("check_hostname requires server_hostname")
ValueError: check_hostname requires server_hostname
>>>
```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2435/reactions" }
https://api.github.com/repos/psf/requests/issues/2435/timeline
null
completed
null
null
false
[ "Ok, so this happens in one of two cases: either `server_hostname` is `None`, or `ssl.HAS_SNI` is `False`. Can you print the result of `import ssl; print ssl.HAS_SNI` please?\n", "I get `False`\n", "Right, so the problem is that in Python 2.7.9 the assumption we make about `HAS_SNI` and hostname verification is wrong. This is actually a bug in [urllib3](https://github.com/shazow/urllib3), so we need to open it over there.\n", "Forgive my ignorance, I'm trying to follow what's going on with this issue and the linked issues since I'm experiencing this problem right now. Is there a workaround in the meantime before this change makes it into released versions? Currently I have requests-2.5.0 installed with pip and virtualenv, which seems to pull in its own copy of urllib3. Is there a different compatible version of requests I can install to workaround?\n" ]
https://api.github.com/repos/psf/requests/issues/2434
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2434/labels{/name}
https://api.github.com/repos/psf/requests/issues/2434/comments
https://api.github.com/repos/psf/requests/issues/2434/events
https://github.com/psf/requests/pull/2434
56,264,162
MDExOlB1bGxSZXF1ZXN0Mjg0ODUwNDQ=
2,434
Remove urllib3-specific section of iter_chunks
{ "avatar_url": "https://avatars.githubusercontent.com/u/82622?v=4", "events_url": "https://api.github.com/users/larsks/events{/privacy}", "followers_url": "https://api.github.com/users/larsks/followers", "following_url": "https://api.github.com/users/larsks/following{/other_user}", "gists_url": "https://api.github.com/users/larsks/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/larsks", "id": 82622, "login": "larsks", "node_id": "MDQ6VXNlcjgyNjIy", "organizations_url": "https://api.github.com/users/larsks/orgs", "received_events_url": "https://api.github.com/users/larsks/received_events", "repos_url": "https://api.github.com/users/larsks/repos", "site_admin": false, "starred_url": "https://api.github.com/users/larsks/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/larsks/subscriptions", "type": "User", "url": "https://api.github.com/users/larsks", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2015-02-02T17:12:59Z
2021-09-08T08:00:56Z
2015-04-02T18:44:13Z
NONE
resolved
This commit resolves #2433 by using os.read() on self.raw.fileno() in all cases, and removes the use of the self.raw.stream() iterator, which would block rather than return short reads.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2434/reactions" }
https://api.github.com/repos/psf/requests/issues/2434/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2434.diff", "html_url": "https://github.com/psf/requests/pull/2434", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2434.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2434" }
true
[ "I wonder if this shouldn't go into urllib3 instead, thereby giving the benefit there too. @shazow? @sigmavirus24?\n", "I'd like that.\n", "The existing documentation for `urllib3.response.stream` says:\n\n> A generator wrapper for the read() method. A call will block until\n> `amt` bytes have been read from the connection or until the\n> connection is closed.\n\nSo from that perspective things are behaving as expected (it's supposed to block). So...would the fix be a new method, or a boolean (e.g., `allow_short_reads`), or something else?\n", "Hm, actually are we sure that doing something like .fileno() will work on AppEngine etc?\n", "@shazow Maybe not? Looking at the code in `urllib3.response`, `self._fp` is set from `body`, which could be a string type instead of a file pointer.\n\n...oh wait, that's a lie, it couldn't be a string type in that case, but it only checks for a `read` method so it might not have `fileno()`, I guess.\n", "Mmm, it's not clear that we can safely do this. A potential optimisation, though?\n", "I'm open to have it use this method if it's of the right type, if that helps. :)\n", "This breaks decoding of content too, no?\n" ]
https://api.github.com/repos/psf/requests/issues/2433
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2433/labels{/name}
https://api.github.com/repos/psf/requests/issues/2433/comments
https://api.github.com/repos/psf/requests/issues/2433/events
https://github.com/psf/requests/issues/2433
56,194,102
MDU6SXNzdWU1NjE5NDEwMg==
2,433
iter_lines method will always hold the last response in the server in a buffer
{ "avatar_url": "https://avatars.githubusercontent.com/u/82622?v=4", "events_url": "https://api.github.com/users/larsks/events{/privacy}", "followers_url": "https://api.github.com/users/larsks/followers", "following_url": "https://api.github.com/users/larsks/following{/other_user}", "gists_url": "https://api.github.com/users/larsks/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/larsks", "id": 82622, "login": "larsks", "node_id": "MDQ6VXNlcjgyNjIy", "organizations_url": "https://api.github.com/users/larsks/orgs", "received_events_url": "https://api.github.com/users/larsks/received_events", "repos_url": "https://api.github.com/users/larsks/repos", "site_admin": false, "starred_url": "https://api.github.com/users/larsks/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/larsks/subscriptions", "type": "User", "url": "https://api.github.com/users/larsks", "user_view_type": "public" }
[]
open
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
10
2015-02-02T04:09:56Z
2019-12-11T22:42:18Z
null
NONE
null
The implementation of the `iter_lines` and `iter_content` methods in `requests` means that when receiving line-by-line data from a server in "push" mode, the latest line received from the server will almost invariably be smaller than the `chunk_size` parameter, causing the final read operation to block.

A good example of this is the Kubernetes `watch` api, which produces one line of JSON output per event, like this:

```
{"type":"ADDED","object":{"kind":"Service","id":"kubernetes-ro","uid":"5718d954-a31a-11e4-8c74-20cf30467e62","creationTimestamp":"2015-01-23T11:10:25-05:00","selfLink":"/api/v1beta1/services/kubernetes-ro","resourceVersion":4,"apiVersion":"v1beta1","namespace":"default","port":80,"protocol":"TCP","labels":{"component":"apiserver","provider":"kubernetes"},"selector":null,"containerPort":0,"portalIP":"10.254.6.100"}}
```

If you compare the output of:

```
import requests

r = requests.get('http://localhost:8080/api/v1beta1/watch/services', stream=True)
for line in r.iter_lines():
    print line
```

with the output of `curl` running against the same URL, you will see that the output from the Python code lags behind the output seen by `curl` by one line.

I was able to work around this behavior by writing my own `iter_lines` method, which looks like this:

```
import os

def iter_lines(fd, chunk_size=1024):
    '''Iterates over the content of a file-like object line-by-line.'''
    pending = None
    while True:
        chunk = os.read(fd.fileno(), chunk_size)
        if not chunk:
            break
        if pending is not None:
            chunk = pending + chunk
            pending = None
        lines = chunk.splitlines()
        if lines and lines[-1]:
            pending = lines.pop()
        for line in lines:
            yield line
    if pending:
        yield(pending)
```

This works around the problem partly by calling `os.read`, which will happily return fewer bytes than requested in `chunk_size`. With the above routine available, the following code behaves correctly:

```
import requests

r = requests.get('http://localhost:8080/api/v1beta1/watch/services', stream=True)
for line in iter_lines(r.raw):
    print line
```

This code will always print out the most recent reply from the server when it is received.
null
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/2433/reactions" }
https://api.github.com/repos/psf/requests/issues/2433/timeline
null
null
null
null
false
[ "Generally speaking I'd be in favour of changing this behaviour. It's been stupid for a long time now.\n", "You also have my support. The trick is doing this in a way that's backwards compatible so we can help you out before 3.0\n", "The above change works for me with python 2.7.8 and 3.4.1 (both with `urllib3` available).\n", "I've just encountered this unfortunate behavior trying to consume a feed=continuous changes feed from couchdb which has much the same semantics. Any chance of this going in?\n", "note that this doesn't seem to work if you don't have urllib3 installed and using r.raw means requests emits the raw chunks of the chunked transfer mode.\n", "> note that this doesn't seem to work if you don't have urllib3 installed and using r.raw means requests emits the raw chunks of the chunked transfer mode.\n\nI'm sorry. I don't understand that at all. Could you help me understand? If you're using requests from PyPI, you always have urllib3 installed as `requests.packages.urllib3`. Are you using requests from one of the distribution packages without urllib3 installed? If so, how is requests even working?\n", "an excellent question but likely off-topic (I only noticed that 'pip install urllib3' installed the library, and then I uninstalled it, but of course I probably have another copy somewhere else).\n\nThe bug in iter_lines is real and affects at least two use cases, so great to see it destined for 3.0, thanks :)\n", "I am pretty sure we've seen another instance of this bug [in the wild](http://stackoverflow.com/questions/39662596/python-requests-get-ignores-the-last-record).\n", "Is this really still a bug? At the very least this should be well documented -- I would imagine most people would just not use `iter_lines` if they knew about this\r\n\r\n> in a way that's backwards compatible\r\n\r\n@sigmavirus24 I'm having trouble understanding that. It's a bug, right? It's not intended behavior that's being broken, it's fixing it to work as intended.", "@eschwartz I'm no longer involved in this project. Please don't mention me on this or other issues." ]
https://api.github.com/repos/psf/requests/issues/2432
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2432/labels{/name}
https://api.github.com/repos/psf/requests/issues/2432/comments
https://api.github.com/repos/psf/requests/issues/2432/events
https://github.com/psf/requests/issues/2432
56,178,137
MDU6SXNzdWU1NjE3ODEzNw==
2,432
Implement limits for requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/23648?v=4", "events_url": "https://api.github.com/users/skorokithakis/events{/privacy}", "followers_url": "https://api.github.com/users/skorokithakis/followers", "following_url": "https://api.github.com/users/skorokithakis/following{/other_user}", "gists_url": "https://api.github.com/users/skorokithakis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/skorokithakis", "id": 23648, "login": "skorokithakis", "node_id": "MDQ6VXNlcjIzNjQ4", "organizations_url": "https://api.github.com/users/skorokithakis/orgs", "received_events_url": "https://api.github.com/users/skorokithakis/received_events", "repos_url": "https://api.github.com/users/skorokithakis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/skorokithakis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/skorokithakis/subscriptions", "type": "User", "url": "https://api.github.com/users/skorokithakis", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" }, { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
3
2015-02-01T21:03:40Z
2021-09-08T23:05:48Z
2015-04-06T08:41:29Z
NONE
resolved
Would anyone be interested in a `limit` parameter that aborts a request after X seconds elapsed or Y bytes received, as a DoS countermeasure? I want to implement one, but I'm not sure whether you would accept the PR here or whether I should make it a third-party library. It feels like it would be more fitting in requests itself, though.
{ "avatar_url": "https://avatars.githubusercontent.com/u/23648?v=4", "events_url": "https://api.github.com/users/skorokithakis/events{/privacy}", "followers_url": "https://api.github.com/users/skorokithakis/followers", "following_url": "https://api.github.com/users/skorokithakis/following{/other_user}", "gists_url": "https://api.github.com/users/skorokithakis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/skorokithakis", "id": 23648, "login": "skorokithakis", "node_id": "MDQ6VXNlcjIzNjQ4", "organizations_url": "https://api.github.com/users/skorokithakis/orgs", "received_events_url": "https://api.github.com/users/skorokithakis/received_events", "repos_url": "https://api.github.com/users/skorokithakis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/skorokithakis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/skorokithakis/subscriptions", "type": "User", "url": "https://api.github.com/users/skorokithakis", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2432/reactions" }
https://api.github.com/repos/psf/requests/issues/2432/timeline
null
completed
null
null
false
[ "We've been in a perpetual feature freeze for some time now (see https://github.com/kennethreitz/requests/issues/1165) meaning that this would not be accepted as yet another keyword parameter. Additions like this are being added to the [requests-toolbelt](/sigmavirus24/requests-toolbelt).\n", "I'm proposing we close this since there's been no response in over 2 months and there seems to be no one else interested in this. Further, there's an appropriate place for a feature like this (the toolbelt).\n", "Sorry, yes, do close it. I already made it into a library:\n\nhttps://pypi.python.org/pypi/requests-guard\n" ]
https://api.github.com/repos/psf/requests/issues/2431
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2431/labels{/name}
https://api.github.com/repos/psf/requests/issues/2431/comments
https://api.github.com/repos/psf/requests/issues/2431/events
https://github.com/psf/requests/pull/2431
56,110,634
MDExOlB1bGxSZXF1ZXN0Mjg0MDg4Mjc=
2,431
Fixes problems with iter_lines when the delimiter is split
{ "avatar_url": "https://avatars.githubusercontent.com/u/903537?v=4", "events_url": "https://api.github.com/users/ianepperson/events{/privacy}", "followers_url": "https://api.github.com/users/ianepperson/followers", "following_url": "https://api.github.com/users/ianepperson/following{/other_user}", "gists_url": "https://api.github.com/users/ianepperson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ianepperson", "id": 903537, "login": "ianepperson", "node_id": "MDQ6VXNlcjkwMzUzNw==", "organizations_url": "https://api.github.com/users/ianepperson/orgs", "received_events_url": "https://api.github.com/users/ianepperson/received_events", "repos_url": "https://api.github.com/users/ianepperson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ianepperson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ianepperson/subscriptions", "type": "User", "url": "https://api.github.com/users/ianepperson", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
8
2015-01-31T03:29:19Z
2021-09-08T01:21:36Z
2016-12-02T20:09:32Z
NONE
resolved
I ran into this problem today where my data would randomly contain additional null strings from iter_lines. I added a test to show the issue, then fixed it. The only weirdness might be when no data comes back: how should iter_lines behave? Should there be a single yield of a null string, or should it just return without yielding anything? This edit implements the latter.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2431/reactions" }
https://api.github.com/repos/psf/requests/issues/2431/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2431.diff", "html_url": "https://github.com/psf/requests/pull/2431", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2431.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2431" }
true
[ "The specific issue happens when attempting to retrieve lines with a multi-byte delimiter. Very occasionally, the delimiter ends up split between data chunks. This results in the yielding of an additional null string. For reading an MJPEG data stream with `iter_lines('\\r\\n')` it was causing a garbled frame every 10 seconds or so.\n", "I see you've marked it with the Breaking API Change. I suppose that's correct, as it was previously inconsistent. If a null-string chunk were received then the port closed, `iter_lines()` would not yield anything and exit, but `iter_lines(any_delimiter)` would yield a null-string then exit. With this change they both consistently yield nothing and exit. (Which I think is the more historically expected behavior.)\n", "> Which I think is the more historically expected behavior.\n\nPlease define historically. In the context of the project, empty strings are historically the normal, intentionally or not.\n\nFor what it's worth, this can still be merged before 3.0 but I wanted to ensure no one merged this accidentally before we had a chance to discuss it.\n", "> Please define historically.\n\nSure. Prior to the change that added the [ability to define the delimiter](https://github.com/kennethreitz/requests/commit/f5ff05be1ed04bcc50e33af28fb54382466a32e8), empty strings would not have been emitted as that only came about as a result of using `split(delimiter=x)` in place of `splitlines()` - they have different behavior. `split` will always provide at list of at least length 1, whereas `splitlines` will occasionally provide a zero length list. Because of this, when the delimiter is split across chunks, nulls are occasionally emitted. Also, when the socket (or iter_content) emits a zero length string and nothing more, the behavior is as follows:\n\nBehavior prior to October 2014:\n- `iter_lines()` - returns without yielding anything.\n\nAfter October 2014:\n- `iter_lines()` - returns without yielding anything.\n- `iter_lines(delimiter)` - yields a zero-length string, then returns.\n\nSide-effect of proposed fix:\n- `iter_lines()` - returns without yielding anything.\n- `iter_lines(delimiter)` - returns without yielding anything.\n\nAll this is an aside from my proposed fix, which essentially yielding if and only if there's a full line of content. If I'm expecting a string that looks like this: `\"--boundary\\r\\nContent-Type: image/jpg\\r\\nContent-Length: 65535\\r\\n\\r\\n...\\r\\n\"` I'd expect `iter_lines(delimiter='\\r\\n')` to yield once per `\\r\\n`. I expect it to behave exactly as if I iterated over:\n\n```\n[\n '--boundary',\n 'Content-Type: image/jpg',\n 'Content-Length: 65535',\n '',\n '...',\n]\n```\n\nBut once in a while, the delimiter is split across chunks internally, and without any change to the source content, I occasionally get an extra `''` at seemingly random intervals:\n\n```\n[\n '--boundary',\n 'Content-Type: image/jpg',\n '',\n 'Content-Length: 65535',\n '',\n '...',\n]\n```\n\nThe proposed change fixes the routine so that it behaves more consistently with null-string input and does not yield when there's not a line of data*\\* in the input stream.\n\n**Okay, there's one additional issue that's been in there since at least October 2014 that I didn't address: if using `iter_lines()` (without the delimiter option) AND the delimiters are `\\r\\n` AND the delimiter gets split across chunks, a spurious `''` will be emitted. That is, usually the string `'a line\\r\\n'` will end up as `['a line']`, but once in a while it'll be `['a line', '']`. 
This could be rectified by providing the actual expected delimiter - using `iter_lines(delimiter=\\r\\n')` instead.\n", "Here's a breakdown as I understand it.\n\nPrior to October 2014:\n\n| Input | Process | Result |\n| --- | --- | --- |\n| `''` | `iter_lines()` | `[]` |\n| `'line'` | `iter_lines()` | `['line']` |\n| `'line\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` |\n| `'line\\r\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` and very rarely `['line', '', '']` |\n\nAfter October 2014:\n\n| Input | Process | Result |\n| --- | --- | --- |\n| `''` | `iter_lines()` | `[]` |\n| | `iter_lines(delimiter='\\r\\n')` | `['']` |\n| `'line\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` |\n| | `'iter_lines(delimiter='\\r\\n')` | `['line\\n']` |\n| `'line\\r\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` and very rarely `['line', '', '']` |\n| | `'iter_lines(delimiter='\\r\\n')` | `['line', '']` but occasionally `['line', '', '']` |\n\nProposed Change:\n\n| Input | Process | Result |\n| --- | --- | --- |\n| `''` | `iter_lines()` | `[]` |\n| | `iter_lines(delimiter='\\r\\n')` | `[]` |\n| `'line\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` |\n| | `'iter_lines(delimiter='\\r\\n')` | `['line\\n']` |\n| `'line\\r\\n'` | `iter_lines()` | `['line']` but occasionally `['line', '']` and very rarely `['line', '', '']` |\n| | `'iter_lines(delimiter='\\r\\n')` | `['line', '']` always |\n\nSince `'iter_lines()` is and always has been inconsistent, it's unreliable and introduces subtle bugs. `iter_lines(delimiter=x)`, after October 2014 has the same problem. My proposed change makes `iter_lines(delimiter=x)` consistent, but leaves the inconsistency of the original version.\n", "This PR hasn't seen any love in a long time. @ianepperson, would mind performing a rebase to make this once-again mergable? \n\n@sigmavirus24 @Lukasa +1? -1? I want to get this merged or closed out. \n", "I'm generally +1 on this, though it needs to be merged into 3.0.0 rather than the master branch.\n", "Closing in favour of #3745." ]
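A standalone sketch of the buffering this fix is after: carry the incomplete tail of each chunk forward so a multi-byte delimiter split across chunk boundaries never yields a spurious empty string (the function name is illustrative):

```
def iter_delimited(chunks, delimiter=b'\r\n'):
    """Yield complete delimiter-terminated records from an iterable of chunks."""
    pending = b''
    for chunk in chunks:
        pending += chunk
        parts = pending.split(delimiter)
        # The last element is either b'' (the chunk ended on a delimiter) or an
        # incomplete record; hold it back until more data arrives.
        pending = parts.pop()
        for part in parts:
            yield part
    if pending:
        yield pending


# The delimiter split across two chunks still produces exactly
# [b'--boundary', b'Content-Type: image/jpg'] with no empty string.
print(list(iter_delimited([b'--boundary\r', b'\nContent-Type: image/jpg\r\n'])))
```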