| field | type | range / values |
| --- | --- | --- |
| url | stringlengths | 50-53 |
| repository_url | stringclasses | 1 value |
| labels_url | stringlengths | 64-67 |
| comments_url | stringlengths | 59-62 |
| events_url | stringlengths | 57-60 |
| html_url | stringlengths | 38-43 |
| id | int64 | 597k-2.65B |
| node_id | stringlengths | 18-32 |
| number | int64 | 1-6.83k |
| title | stringlengths | 1-296 |
| user | dict | |
| labels | listlengths | 0-5 |
| state | stringclasses | 2 values |
| locked | bool | 2 classes |
| assignee | dict | |
| assignees | listlengths | 0-4 |
| milestone | dict | |
| comments | int64 | 0-211 |
| created_at | stringlengths | 20-20 |
| updated_at | stringlengths | 20-20 |
| closed_at | stringlengths | 20-20 |
| author_association | stringclasses | 3 values |
| active_lock_reason | stringclasses | 4 values |
| body | stringlengths | 0-65.6k |
| closed_by | dict | |
| reactions | dict | |
| timeline_url | stringlengths | 59-62 |
| performed_via_github_app | null | |
| state_reason | stringclasses | 3 values |
| draft | bool | 2 classes |
| pull_request | dict | |
| is_pull_request | bool | 2 classes |
| issue_comments | listlengths | 0-30 |
https://api.github.com/repos/psf/requests/issues/3392
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3392/labels{/name}
https://api.github.com/repos/psf/requests/issues/3392/comments
https://api.github.com/repos/psf/requests/issues/3392/events
https://github.com/psf/requests/issues/3392
164,704,240
MDU6SXNzdWUxNjQ3MDQyNDA=
3,392
Automatically executes script in working folder if its name "cgi"
{ "avatar_url": "https://avatars.githubusercontent.com/u/9381116?v=4", "events_url": "https://api.github.com/users/dan-win/events{/privacy}", "followers_url": "https://api.github.com/users/dan-win/followers", "following_url": "https://api.github.com/users/dan-win/following{/other_user}", "gists_url": "https://api.github.com/users/dan-win/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dan-win", "id": 9381116, "login": "dan-win", "node_id": "MDQ6VXNlcjkzODExMTY=", "organizations_url": "https://api.github.com/users/dan-win/orgs", "received_events_url": "https://api.github.com/users/dan-win/received_events", "repos_url": "https://api.github.com/users/dan-win/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dan-win/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dan-win/subscriptions", "type": "User", "url": "https://api.github.com/users/dan-win", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-07-10T07:41:50Z
2021-09-08T17:05:32Z
2016-07-10T08:00:52Z
NONE
resolved
Requests version - 2.10.0. As soon as the working folder contains a script named "cgi.py" (it was part of my application), your package automatically executes it. In my case I fixed it myself by renaming my script, but for other users the probability of hitting this is not zero (it cost me about an hour of fighting with strange behavior in my application). To reproduce it, create 2 simple scripts, e.g.: **test.py**: `import requests` / `print "Hello from test"` and **cgi.py**: `print "Hello from CGI!"`. Possible solutions: reflect this behavior in the documentation, or fix it. P.S. Very nice package, it has saved me much more time than this lost hour!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3392/reactions" }
https://api.github.com/repos/psf/requests/issues/3392/timeline
null
completed
null
null
false
[ "This is not a bug: this is unfortunately the way Python works. Python puts the current working directory at the top of the import namespace, which means that when we try to import standard library modules, if you have a script in the working directory with the same name, we will accidentally get that instead. \n\nTo prevent your script from executing in these circumstances, you will want to use the [`if __name__ == '__main__':`](http://stackoverflow.com/questions/419163/what-does-if-name-main-do) pattern. That would change this error to either an ImportError or an AttributeError, which are better failure modes. \n\nOtherwise, try to avoid naming your scripts after stdlib modules. (You'd have similar problems with, for example, naming a script 'urllib.py'.)\n" ]
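The shadowing and the suggested guard can be demonstrated with a minimal sketch of the reporter's `cgi.py` (the file name and message come from the report above; the `main` helper is an illustrative addition):

```python
# Hypothetical cgi.py sitting in the working directory. Because Python puts
# the working directory at the front of sys.path, `import cgi` performed by
# requests' dependencies resolves to this file instead of the stdlib module,
# and without the guard below its top-level code would run as a side effect.

def main():
    return "Hello from CGI!"

if __name__ == "__main__":
    # Only runs when the script is executed directly (python cgi.py),
    # not when it is imported, so an accidental import stays a no-op.
    print(main())
```

With the guard, an accidental stdlib shadowing still breaks the import (ImportError/AttributeError), but at least the script's code no longer executes silently.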
https://api.github.com/repos/psf/requests/issues/3391
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3391/labels{/name}
https://api.github.com/repos/psf/requests/issues/3391/comments
https://api.github.com/repos/psf/requests/issues/3391/events
https://github.com/psf/requests/issues/3391
164,685,870
MDU6SXNzdWUxNjQ2ODU4NzA=
3,391
SSLError: EOF occurred in violation of protocol (_ssl.c:590)
{ "avatar_url": "https://avatars.githubusercontent.com/u/13929583?v=4", "events_url": "https://api.github.com/users/vandernath/events{/privacy}", "followers_url": "https://api.github.com/users/vandernath/followers", "following_url": "https://api.github.com/users/vandernath/following{/other_user}", "gists_url": "https://api.github.com/users/vandernath/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vandernath", "id": 13929583, "login": "vandernath", "node_id": "MDQ6VXNlcjEzOTI5NTgz", "organizations_url": "https://api.github.com/users/vandernath/orgs", "received_events_url": "https://api.github.com/users/vandernath/received_events", "repos_url": "https://api.github.com/users/vandernath/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vandernath/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vandernath/subscriptions", "type": "User", "url": "https://api.github.com/users/vandernath", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-07-09T21:10:37Z
2021-09-08T01:21:14Z
2016-07-14T13:01:51Z
NONE
resolved
Please bear with me as I'm quite new with Python and github in general. I have been using requests to scrape data from the Play Store. I need to make a large amount of requests (about 20k). It works great for about 3000-4000 requests but gets stuck after that (SSL Error). I am not familiar with SSL and requests, so I don't know what causes this. Error: ``` (SSLError Traceback (most recent call last) <ipython-input-23-1da544640d89> in <module>() 53 time.sleep(0.1) 54 ---> 55 r = requests.get('https://play.google.com' + link + '&hl=en') 56 link_tree = html.fromstring(r.content) 57 description = link_tree.xpath('//div[@jsname="C4s9Ed"]/text()') + link_tree.xpath('//div[@jsname="C4s9Ed"]/p/text()') C:\Users\Nathan\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\api.pyc in get(url, params, **kwargs) 65 66 kwargs.setdefault('allow_redirects', True) ---> 67 return request('get', url, params=params, **kwargs) 68 69 C:\Users\Nathan\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\api.pyc in request(method, url, **kwargs) 51 # cases, and look like a memory leak in others. 
52 with sessions.Session() as session: ---> 53 return session.request(method=method, url=url, **kwargs) 54 55 C:\Users\Nathan\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json) 466 } 467 send_kwargs.update(settings) --> 468 resp = self.send(prep, **send_kwargs) 469 470 return resp C:\Users\Nathan\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.pyc in send(self, request, **kwargs) 574 575 # Send the request --> 576 r = adapter.send(request, **kwargs) 577 578 # Total elapsed time of the request (approximately) C:\Users\Nathan\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies) 445 except (_SSLError, _HTTPError) as e: 446 if isinstance(e, _SSLError): --> 447 raise SSLError(e, request=request) 448 elif isinstance(e, ReadTimeoutError): 449 raise ReadTimeout(e, request=request) SSLError: EOF occurred in violation of protocol (_ssl.c:590) ) ``` The-efi, on this github, seemed to have the same problem on this thread: https://github.com/kennethreitz/requests/issues/3006 (see below, he was not the OP) but I was not able to find the thread he opened for further assistance. I use Python 2.7 as well. I have been stuck on this for quite a while now and I can't find any answer here or on StackOverflow (the answers were probably right under my nose, but I've had trouble understanding them because of my lack of knowledge of SSL & requests). Thank you in advance for your help, and sorry if something is unclear -- please let me know.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3391/reactions" }
https://api.github.com/repos/psf/requests/issues/3391/timeline
null
completed
null
null
false
[ "When you say you get stuck, is it just that the exception fires? Or do follow-up requests not work? I ask because transient network errors _do_ occur, and if you're making large numbers of web requests you should consider implementing some kind of retry logic in the face of them. \n", "The exception fires. Follow-up requests seem to work, but I have not yet tried to implement retry. I was afraid I was being against the rules of making too many requests to a server or something, I guess.\n\nI will definitely try that and update this thread. Thanks!\n", "Well for what it's worth, because you're using `requests.*` you're putting yourself at more risk of overloading the network resources between you and the server. You should try using [a session](http://docs.python-requests.org/en/master/user/advanced/#session-objects).\n", "For anyone with this problem:\n\nI've fixed it by following @Lukasa 's suggestions and added this just after importing requests:\n\n```\nimport requests\nsess = requests.Session()\nadapter = requests.adapters.HTTPAdapter(max_retries = 20)\nsess.mount('http://', adapter)\n```\n\nThen, where I was using `requests.get()` before, I used `sess.get()`.\n\nHopefully this helps, and thanks for your help @Lukasa !\n", "I had exactly the same error message, the problem was I didn't have ndg-httpsclient installed\n\nhttps://github.com/kennethreitz/requests/issues/3605\n", "@variable I installed ndg-httpsclient but the same error:urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:748)>" ]
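One detail worth noting about the fix quoted in the thread above: it mounts the retrying adapter only on `http://`, while the failing requests target an `https://` URL, so the retries would not actually apply there. A minimal sketch mounting on both schemes (the retry count of 5 is an arbitrary choice for illustration):

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# Retry transient connection failures a few times before giving up.
adapter = HTTPAdapter(max_retries=5)
# Mount on both URL prefixes so https:// requests get retries too.
session.mount("http://", adapter)
session.mount("https://", adapter)

# Then use session.get(...) in place of requests.get(...); the session
# also reuses connections instead of opening a fresh one per request.
```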
https://api.github.com/repos/psf/requests/issues/3390
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3390/labels{/name}
https://api.github.com/repos/psf/requests/issues/3390/comments
https://api.github.com/repos/psf/requests/issues/3390/events
https://github.com/psf/requests/pull/3390
164,333,769
MDExOlB1bGxSZXF1ZXN0NzY2MTI4OTU=
3,390
Defining header value type requirements and tests
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-07-07T15:13:22Z
2021-09-08T02:10:18Z
2016-07-07T16:20:03Z
MEMBER
resolved
Currently a non-string/bytes value will cause `pat.match(value)` to raise a TypeError from `re`. I'm proposing catching this exception and raising it as a more descriptive `InvalidHeader` instead, so that it's clear we're intending this to happen.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3390/reactions" }
https://api.github.com/repos/psf/requests/issues/3390/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3390.diff", "html_url": "https://github.com/psf/requests/pull/3390", "merged_at": "2016-07-07T16:20:03Z", "patch_url": "https://github.com/psf/requests/pull/3390.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3390" }
true
[ "👍 from me. @Lukasa ?\n", "LGTM, let's do it! :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/3389
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3389/labels{/name}
https://api.github.com/repos/psf/requests/issues/3389/comments
https://api.github.com/repos/psf/requests/issues/3389/events
https://github.com/psf/requests/issues/3389
164,275,047
MDU6SXNzdWUxNjQyNzUwNDc=
3,389
I question the distribution of LGPL code with Requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/619444?v=4", "events_url": "https://api.github.com/users/gozdal/events{/privacy}", "followers_url": "https://api.github.com/users/gozdal/followers", "following_url": "https://api.github.com/users/gozdal/following{/other_user}", "gists_url": "https://api.github.com/users/gozdal/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gozdal", "id": 619444, "login": "gozdal", "node_id": "MDQ6VXNlcjYxOTQ0NA==", "organizations_url": "https://api.github.com/users/gozdal/orgs", "received_events_url": "https://api.github.com/users/gozdal/received_events", "repos_url": "https://api.github.com/users/gozdal/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gozdal/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gozdal/subscriptions", "type": "User", "url": "https://api.github.com/users/gozdal", "user_view_type": "public" }
[ { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
14
2016-07-07T10:11:12Z
2018-06-12T16:16:52Z
2016-07-07T23:13:05Z
NONE
resolved
I'm not sure if this is the right place to discuss licensing issues. If not, please point me to the right place. Requests itself is under an Apache license, while the bundled chardet library is under the LGPL. I believe those two are incompatible, and people who include requests expect it to be 100% Apache, while they actually get a portion of it under the LGPL. Any thoughts on how this could be resolved?
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3389/reactions" }
https://api.github.com/repos/psf/requests/issues/3389/timeline
null
completed
null
null
false
[ "Yeah, this makes me somewhat uncomfortable. My _suspicion_ is that the LGPL doesn't allow what we're doing here, but IANAL. Fundamentally the decision here is with @kennethreitz. If it were me, though, I'd chat with Van Lindberg and see what he says, and then potentially take action to make chardet an optional dependency (like PyOpenSSL), rather than a bundled one.\n", "I'm not a lawyer either. I have raised this before and it has been cleared by certain people though. That said, we do include https://github.com/kennethreitz/requests/blob/master/NOTICE. It's clear to some that requests' source code is Apache, while it includes other non-Apache packages.\n", "LGPL allows re-distribution, which is what we are doing here. The license for the CA Bundle falls under similar territory. The NOTICES file is important (just as important as LICENSE), as it is in any project that has one. \n", "> LGPL allows re-distribution, which is what we are doing here. \n\nRight. It's important to understand that things that requests vendors are not modified when vendored. I work on chardet upstream to ensure it's suitable for our vendoring. Likewise @Lukasa and I work on urllib3.\n\nWe don't vendor & modify, we simply vendor to redistribute.\n", "And I'm very grateful that it does allow re-distribution. I was previously unaware of this, and was going to attempt to commission a copyleft chardet-like library be written, because this functionality is so core and important. \n", "As somebody installing the module only via `pip` I was surprised to learn that it bundles some other code under different license than Apache.\n\nA solution for me is to have a private fork of `requests` with `chardet` removed but I found it misleading as https://pypi.python.org/pypi/requests described the license as \"Apache\". I only found the `NOTICE` file after this discussion.\n", "> I was surprised to learn that it bundles some other code under different license than Apache.\n\nI'm sorry you were surprised. 
Most people do thorough research beyond what the license field says on PyPI. NOTICE files are common practice for projects that package other OSS code into their project.\n\nYou should also consult a lawyer before haphazardly breaking your dependencies. Projects like OpenStack (that have large corporate adoption) rely on requests without worrying about this.\n", "I've re-titled this since the licenses are _not_ incompatible.\n", "P.S. if you're using a non-ancient version of pip, then you already have Requests (including these vendored packages) installed on your system. \n", "I don't think the issue is with LGPL but with APL.\nhttps://www.apache.org/legal/resolved.html\n", "The problem we're hitting with chardet is that we bundle our code and dependencies (incl requests) using PyInstaller as a self-contained executable, a.k.a. static linking, and the user cannot \"use their own copy of\" chardet. This is a problem with LGPL. ", "Commenting although this is closed. APL cannot pull in LGPL. Is there any movement to remove the LGPL dependency from chardet (or have chardet update their license?)", "Hey @pauljamescleary, we haven't had chardet bundled with Requests for a little over a year now. I don't believe there's any intent to change license types on chardet, but that would be something to follow up with their team. We don't have any immediate plans to make further changes in Requests regarding this.\r\n\r\nAs stated above, none of the maintainers are lawyers, but our usages have been approved by legal teams well-versed in software licensing. I *believe* the specific clause that is worth noting is [LGPL2.1 § 5](https://opensource.org/licenses/LGPL-2.1):\r\n\r\n> A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a \"work that uses the Library\". 
Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License.", "This topic has been discussed over and over again with the exact same outcome. The topic is resolved." ]
https://api.github.com/repos/psf/requests/issues/3388
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3388/labels{/name}
https://api.github.com/repos/psf/requests/issues/3388/comments
https://api.github.com/repos/psf/requests/issues/3388/events
https://github.com/psf/requests/pull/3388
164,176,058
MDExOlB1bGxSZXF1ZXN0NzY1MDMyMzM=
3,388
updating documentation to reflect decision of #3386
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-07-06T21:15:46Z
2021-09-08T02:10:15Z
2016-07-07T14:56:08Z
MEMBER
resolved
These are a couple of minor doc tweaks to document the decision to enforce strings in header values. Since requests previously operated with non-string header values, I figured it was best to make note that an exception will now be thrown in the event of a non-string value. There may be some formatting and wording changes needed, this iteration is mainly to get the conversation going.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3388/reactions" }
https://api.github.com/repos/psf/requests/issues/3388/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3388.diff", "html_url": "https://github.com/psf/requests/pull/3388", "merged_at": "2016-07-07T14:56:08Z", "patch_url": "https://github.com/psf/requests/pull/3388.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3388" }
true
[ "Cool, LGTM! :sparkles: @sigmavirus24?\n", "While you're here @sigmavirus24 and @Lukasa, I threw together [another commit](https://github.com/nateprewitt/requests/commit/be31a90906deb5553c2e703fb05cf6964ee23ed5) related to this to encapsulate the type error thrown by `re` when non-strings are passed to `check_header_validity`. This is to make it clear that the behaviour is to be expected, but may be overkill with this documentation now. Any thoughts on if this would be worth opening? \n", "Thanks @nateprewitt 🍮 🍰 ☕ \n" ]
https://api.github.com/repos/psf/requests/issues/3387
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3387/labels{/name}
https://api.github.com/repos/psf/requests/issues/3387/comments
https://api.github.com/repos/psf/requests/issues/3387/events
https://github.com/psf/requests/pull/3387
163,887,442
MDExOlB1bGxSZXF1ZXN0NzYzMDAzOTg=
3,387
fixing #3366 for non-str header values
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-07-05T16:06:41Z
2021-09-08T03:00:59Z
2016-07-05T16:41:43Z
MEMBER
resolved
This will fix issue #3386 with non-string/buffer header values, but there may be some more discussion to be had here. This is a passthrough for non-string/buffers which means all other datatypes will skip the regex check. I wasn't able to create the header split issue with most of the other standard datatypes I checked (lists, sets, dicts), so this _should_ be safe with the passthrough. I think an alternative, and possibly better solution is to cast everything that isn't a byte string as a `str`, whether that be standard, or `encode('latin-1')`. This will ensure the check still runs and doesn't modify the value in the actual `Requests.headers` value. This is probably closer to what @sigmavirus24 already did [here](https://github.com/kennethreitz/requests/pull/866/files#diff-5956087d5835a57d9ef6fff974f6fd9bR273).
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3387/reactions" }
https://api.github.com/repos/psf/requests/issues/3387/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3387.diff", "html_url": "https://github.com/psf/requests/pull/3387", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3387.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3387" }
true
[ "@nateprewitt that pull request was _rejected_ precisely because integers are not defined behaviour within requests and haven't been for _years_.\n", "Sorry, @sigmavirus24, I was suggesting something along this line for the check since it wouldn't be transforming the value in this case. I was trying to use this as a template example. I realize the documentation says that headers must be strings, and that was the assumption I was operating off of when I wrote #3366. I submitted the passthrough as the initial pull request here because it avoids dealing with any defining behaviour by ignoring non-strings.\n\nI just wanted to make sure both options were at least briefly discussed.\n", "Yeah, so the question boils down to whether we handle non-string headers. Clearly we've oscillated around for a while: they didn't work, and then they did, and now they don't again.\n\nHowever, what I failed to note is Kenneth's original response in #865. With that in mind, I'm inclined to want to continue to respect his wishes and say that non-string headers are not acceptable.\n\nSorry for having you do this work @nateprewitt!\n", "No problem, glad to have the question resolved.\n" ]
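The check discussed in this pull request can be sketched as follows. The pattern, function name, and exception class here are simplified stand-ins for the real ones in `requests.utils` and `requests.exceptions`, and this sketch only validates `str` values (bytes would need a separate bytes pattern):

```python
import re

# Header values must not be empty-after-whitespace or contain CR/LF,
# which would enable header injection.
_HEADER_VALUE_OK = re.compile(r"^\S[^\r\n]*$|^$")

class InvalidHeader(ValueError):
    """Raised for header values requests cannot safely send."""

def check_header_value(name, value):
    try:
        if not _HEADER_VALUE_OK.match(value):
            raise InvalidHeader(
                "Invalid return character or leading space in header: %s" % name)
    except TypeError:
        # re raises TypeError for non-string inputs (e.g. int); re-raise it
        # as the more descriptive error proposed in this thread.
        raise InvalidHeader(
            "Value for header {%s: %s} must be of type str, not %s"
            % (name, value, type(value)))
```

This keeps the regex check for strings while turning the opaque `TypeError` from `re` into an error that names the offending header.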
https://api.github.com/repos/psf/requests/issues/3386
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3386/labels{/name}
https://api.github.com/repos/psf/requests/issues/3386/comments
https://api.github.com/repos/psf/requests/issues/3386/events
https://github.com/psf/requests/issues/3386
163,870,374
MDU6SXNzdWUxNjM4NzAzNzQ=
3,386
The keywords headers must be string or buffer in new version 2.10.0?
{ "avatar_url": "https://avatars.githubusercontent.com/u/12270732?v=4", "events_url": "https://api.github.com/users/wut0n9/events{/privacy}", "followers_url": "https://api.github.com/users/wut0n9/followers", "following_url": "https://api.github.com/users/wut0n9/following{/other_user}", "gists_url": "https://api.github.com/users/wut0n9/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wut0n9", "id": 12270732, "login": "wut0n9", "node_id": "MDQ6VXNlcjEyMjcwNzMy", "organizations_url": "https://api.github.com/users/wut0n9/orgs", "received_events_url": "https://api.github.com/users/wut0n9/received_events", "repos_url": "https://api.github.com/users/wut0n9/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wut0n9/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wut0n9/subscriptions", "type": "User", "url": "https://api.github.com/users/wut0n9", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-07-05T14:53:08Z
2021-09-08T17:05:33Z
2016-07-05T16:42:22Z
NONE
resolved
The keywords `headers` value must be string or buffer? If the keywords `dnt` and `upgrade-insecure-requests` value is **string** or **buffer**, it's right, but if the value is int, it's wrong. Why is that? Right: ``` 'dnt': '1', 'upgrade-insecure-requests': '1', ``` Wrong: ``` 'dnt': 1, 'upgrade-insecure-requests': 1, ``` Why?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3386/reactions" }
https://api.github.com/repos/psf/requests/issues/3386/timeline
null
completed
null
null
false
[ "@wut0n9 This behavioural change is not in v2.10.0, it's in the current master branch.\n\nHowever, that's a real bug: #3366 has regressed this. @nateprewitt, are you interested in trying to update with a fix for this?\n", "Headers should always have been string. I don't think this is a significant regression if a regression at all. We've always documented that header values should be strings.\n", "Regardless of what we documented, this used to work and now it doesn't. We'll break a _lot_ of code if we don't change this back.\n", "Yeah, I'll get right on this @Lukasa.\n", "See also https://github.com/kennethreitz/requests/issues/865 and https://github.com/kennethreitz/requests/pull/866\n\nI don't know how much a \"lot\" actually is in this case.\n", "@sigmavirus24 has linked to the relevant earlier opinions, which suggest that actually we don't allow non-string header values. So that means this is not a bug: it was us making a revision that is within the scope of the API definition.\n", "@Lukasa right, I'm frankly surprised this didn't break earlier. The meaning of non-bytes/str as a header value is undefined as far as I'm concerned.\n" ]
https://api.github.com/repos/psf/requests/issues/3385
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3385/labels{/name}
https://api.github.com/repos/psf/requests/issues/3385/comments
https://api.github.com/repos/psf/requests/issues/3385/events
https://github.com/psf/requests/pull/3385
163,801,538
MDExOlB1bGxSZXF1ZXN0NzYyMzgzNTY=
3,385
Support responses like `HTTP/1.1 404 öööööööö`
{ "avatar_url": "https://avatars.githubusercontent.com/u/75169?v=4", "events_url": "https://api.github.com/users/gugu/events{/privacy}", "followers_url": "https://api.github.com/users/gugu/followers", "following_url": "https://api.github.com/users/gugu/following{/other_user}", "gists_url": "https://api.github.com/users/gugu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gugu", "id": 75169, "login": "gugu", "node_id": "MDQ6VXNlcjc1MTY5", "organizations_url": "https://api.github.com/users/gugu/orgs", "received_events_url": "https://api.github.com/users/gugu/received_events", "repos_url": "https://api.github.com/users/gugu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gugu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gugu/subscriptions", "type": "User", "url": "https://api.github.com/users/gugu", "user_view_type": "public" }
[]
closed
true
null
[]
null
18
2016-07-05T08:54:17Z
2021-09-08T03:00:51Z
2016-07-05T14:01:19Z
CONTRIBUTOR
resolved
Currently this code raises UnicodeDecodeError: `requests.get('http://www.voice.fi//lampaan-kiveksia-ja-coca-cola-hulluutta-5-asiaa-joita-et-tiennyt-islannista-122015?utm_source=feed&utm_medium=feed&utm_content=article_link&utm_campaign=feed').ok` Here is their status: `HTTP/1.1 404 Komponenttia ei löydy` This pull request allows the module to handle URLs like this correctly
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3385/reactions" }
https://api.github.com/repos/psf/requests/issues/3385/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3385.diff", "html_url": "https://github.com/psf/requests/pull/3385", "merged_at": "2016-07-05T14:01:19Z", "patch_url": "https://github.com/psf/requests/pull/3385.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3385" }
true
[ "Generally speaking I'm not too interested in supporting this kind of logic: servers are really not supposed to be sending non-ASCII-bytes in the reason phrase, and choosing to assume that UTF-8 is being used is really fairly optimistic.\n\nHowever, in the case of `raise_for_status`, we're exploding in a situation that we don't need to. RFC 7230 says we should ignore the reason phrase, and we're choosing not to do that, so the least we should do is _not_ explode if it's weird. For that reason, I'm in favour of a patch like this, though I'd like us to change the decode logic to make sure that we will _always_ successfully decode the reason phrase, even if we decode it to gibberish.\n", "@Lukasa is `reason.decode('utf-8', 'ignore')` ok?\n", "`'ignore'` is fine.\n", "@Lukasa done, updated PR\n", "@Lukasa done\n", "Cool, I'm happy with this as-is. I'd like @sigmavirus24 to ACK as well, if possible.\n", "I have a question or two inline. Otherwise, I agree we shouldn't have this exception raised, even if the server is in the wrong.\n", "@gugu just use `u''`. Further, stop importing six. It's not a dependency of requests.\n", "six is a _test_ dependency of requests though, it's acceptable to use there.\n", "> six is a test dependency of requests though, it's acceptable to use there.\n\nSure but there's little sense to using `six.u` in the tests when we already use `u''` in the tests.\n", "@sigmavirus24 @lucasa done, updated PR\n", "Looks good to me. @Lukasa, the changes are minimal from when you gave your LGTM, so I'm merging. :)\n", "Thanks @gugu! :sparkles: :cake: :sparkles: :fireworks: \n", "y'all emoji-chain thieves!\n", "Is there any plan to make a new Pypi release?\nWe would rather bump `requests`' version in our dependencies instead of referencing a GitHub commit.\n\n(We're affected because of the [`raven` library calling `response.ok` on all our `requests` calls](https://github.com/getsentry/raven-python/blob/6b5044ab72040aa5678f9c522dacb7099cbe8829/raven/breadcrumbs.py#L275))\n", "We're currently tentatively planning a release for next week. =)\n", "Great\n", "Thank you all for v2.11!\n" ]
https://api.github.com/repos/psf/requests/issues/3384
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3384/labels{/name}
https://api.github.com/repos/psf/requests/issues/3384/comments
https://api.github.com/repos/psf/requests/issues/3384/events
https://github.com/psf/requests/issues/3384
163,502,741
MDU6SXNzdWUxNjM1MDI3NDE=
3,384
Will requests bundle with future version of python?
{ "avatar_url": "https://avatars.githubusercontent.com/u/7894040?v=4", "events_url": "https://api.github.com/users/KiYugadgeter/events{/privacy}", "followers_url": "https://api.github.com/users/KiYugadgeter/followers", "following_url": "https://api.github.com/users/KiYugadgeter/following{/other_user}", "gists_url": "https://api.github.com/users/KiYugadgeter/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/KiYugadgeter", "id": 7894040, "login": "KiYugadgeter", "node_id": "MDQ6VXNlcjc4OTQwNDA=", "organizations_url": "https://api.github.com/users/KiYugadgeter/orgs", "received_events_url": "https://api.github.com/users/KiYugadgeter/received_events", "repos_url": "https://api.github.com/users/KiYugadgeter/repos", "site_admin": false, "starred_url": "https://api.github.com/users/KiYugadgeter/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KiYugadgeter/subscriptions", "type": "User", "url": "https://api.github.com/users/KiYugadgeter", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-07-02T05:22:48Z
2021-09-08T17:05:33Z
2016-07-02T07:00:18Z
NONE
resolved
Will requests bundle with future version of python like pathlib? I think urllib is not intuitive. I hope that requests replaces urllib as an intuitive high-level http request module.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3384/reactions" }
https://api.github.com/repos/psf/requests/issues/3384/timeline
null
completed
null
null
false
[ "Thanks for this question! The answer I'm afraid is \"no\". Requests is security-critical for most users, which means we need to be able to react quickly to security vulnerabilities and ensure that they reach all users. That's not possible if we're in the standard library because of its requirements on backward compatibility and long release cycles.\n\nAdditionally, it's never been easier to get a copy of Requests for your Python. Modern Pythons ship with pip, which means that Requests is a simple `pip install requests` away: one small command line. This means that the convenience advantage of bundling with the standard library is pretty minimal.\n\nFor this reason, we will not be part of the standard library: we can do better work by being outside it.\n", "https://speakerdeck.com/kennethreitz/python-requests-and-the-standard-library\n", "The official decision was made for the official Python documentation to recommend Requests, instead. It's now prominent in the urllib documentation. \n" ]
https://api.github.com/repos/psf/requests/issues/3383
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3383/labels{/name}
https://api.github.com/repos/psf/requests/issues/3383/comments
https://api.github.com/repos/psf/requests/issues/3383/events
https://github.com/psf/requests/pull/3383
163,490,531
MDExOlB1bGxSZXF1ZXN0NzYwNDg3NDI=
3,383
Change exception and variable names so that tests will run.
{ "avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4", "events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}", "followers_url": "https://api.github.com/users/davidsoncasey/followers", "following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}", "gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidsoncasey", "id": 3794108, "login": "davidsoncasey", "node_id": "MDQ6VXNlcjM3OTQxMDg=", "organizations_url": "https://api.github.com/users/davidsoncasey/orgs", "received_events_url": "https://api.github.com/users/davidsoncasey/received_events", "repos_url": "https://api.github.com/users/davidsoncasey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions", "type": "User", "url": "https://api.github.com/users/davidsoncasey", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-07-01T23:42:41Z
2021-09-08T03:01:00Z
2016-07-02T07:33:31Z
NONE
resolved
@Lukasa @sigmavirus24 I tried running the tests on the proposed/3.0.0 branch, and found that they did not run. So I changed a couple of variable names to fix the obvious errors. If you prefer InvalidSchema and MissingSchema I can change those back, I changed the imports to match the names of the exceptions as they were defined. After these changes, there is still one failing test. I'll try to take a look and see if I can debug it. But this was low hanging fruit.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3383/reactions" }
https://api.github.com/repos/psf/requests/issues/3383/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3383.diff", "html_url": "https://github.com/psf/requests/pull/3383", "merged_at": "2016-07-02T07:33:31Z", "patch_url": "https://github.com/psf/requests/pull/3383.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3383" }
true
[ "Looks good to me, thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/3382
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3382/labels{/name}
https://api.github.com/repos/psf/requests/issues/3382/comments
https://api.github.com/repos/psf/requests/issues/3382/events
https://github.com/psf/requests/pull/3382
163,467,405
MDExOlB1bGxSZXF1ZXN0NzYwMzE3MDA=
3,382
updating iter_content docstring to match functionality
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-07-01T20:18:53Z
2021-09-08T03:00:51Z
2016-07-02T18:10:20Z
MEMBER
resolved
Sorry to be bombarding you guys with PRs. This one is pretty minor but explains what I would say was non-obvious behaviour of the function before PR #3368 and PR #3365.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3382/reactions" }
https://api.github.com/repos/psf/requests/issues/3382/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3382.diff", "html_url": "https://github.com/psf/requests/pull/3382", "merged_at": "2016-07-02T18:10:20Z", "patch_url": "https://github.com/psf/requests/pull/3382.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3382" }
true
[ "So this isn't quite right. In particular, `iter_content` with `stream=True` and a chunk encoded response will iterate in chunk size. So it would be better for this to say that it iterates in chunks of undefined size. \n", "Looks like I failed to set stream=True in my test case. I was misinterpreting the call to `r.content` as the stream read. Thanks @Lukasa, string updated.\n", "Cool, I'm happy with this! Thanks @nateprewitt! :sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/3371
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3371/labels{/name}
https://api.github.com/repos/psf/requests/issues/3371/comments
https://api.github.com/repos/psf/requests/issues/3371/events
https://github.com/psf/requests/pull/3371
163,427,961
MDExOlB1bGxSZXF1ZXN0NzYwMDMyMjA=
3,371
Fix minor typo in README
{ "avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4", "events_url": "https://api.github.com/users/jeremycline/events{/privacy}", "followers_url": "https://api.github.com/users/jeremycline/followers", "following_url": "https://api.github.com/users/jeremycline/following{/other_user}", "gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jeremycline", "id": 1977525, "login": "jeremycline", "node_id": "MDQ6VXNlcjE5Nzc1MjU=", "organizations_url": "https://api.github.com/users/jeremycline/orgs", "received_events_url": "https://api.github.com/users/jeremycline/received_events", "repos_url": "https://api.github.com/users/jeremycline/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions", "type": "User", "url": "https://api.github.com/users/jeremycline", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-07-01T16:12:55Z
2021-09-08T03:01:01Z
2016-07-01T16:15:16Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3371/reactions" }
https://api.github.com/repos/psf/requests/issues/3371/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3371.diff", "html_url": "https://github.com/psf/requests/pull/3371", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3371.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3371" }
true
[ "That's not a typo. =) The sentence is \"One of the most downloaded Python packages of all time\".\n", "Oooops. :coffee: \n" ]
https://api.github.com/repos/psf/requests/issues/3370
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3370/labels{/name}
https://api.github.com/repos/psf/requests/issues/3370/comments
https://api.github.com/repos/psf/requests/issues/3370/events
https://github.com/psf/requests/pull/3370
163,314,827
MDExOlB1bGxSZXF1ZXN0NzU5MjQyMTg=
3,370
adding in slice_length fix and test for chunk_size=None
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-07-01T03:55:18Z
2021-09-08T02:10:13Z
2016-07-02T20:56:20Z
MEMBER
resolved
This fixes the issue discussed in #3369 but may not be the best way. It adds a certain amount of additional complexity to iter_slices that may be better solved by using a try/except. If we throw an exception requiring an int for slice_length, we could pass `chunk_size or len(self._content)` [here](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L691) and solve the issue that way as well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3370/reactions" }
https://api.github.com/repos/psf/requests/issues/3370/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3370.diff", "html_url": "https://github.com/psf/requests/pull/3370", "merged_at": "2016-07-02T20:56:20Z", "patch_url": "https://github.com/psf/requests/pull/3370.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3370" }
true
[ "I'm happy with this, but would like @sigmavirus24 to take a look as well.\n", "Actually one quick note on this, it looks like we actually should have an integer > 0 as a requirement. slice_length <= 0 produces an infinite generator. I can modify the check if that's something you think should be caught (I do).\n", "Sure thing =)\n", "Ok, I'm still happy, but would still like @sigmavirus24 to weigh in. \n", ":+1: Looks fine to me. Thanks @nateprewitt!\n" ]
https://api.github.com/repos/psf/requests/issues/3369
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3369/labels{/name}
https://api.github.com/repos/psf/requests/issues/3369/comments
https://api.github.com/repos/psf/requests/issues/3369/events
https://github.com/psf/requests/issues/3369
163,314,564
MDU6SXNzdWUxNjMzMTQ1NjQ=
3,369
TypeError in iter_slices() when slice_length=None
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-07-01T03:52:10Z
2021-09-08T17:05:33Z
2016-07-03T03:29:16Z
MEMBER
resolved
An unintended side effect of using `chunk_size=None` (as referenced in PR #3368) is a failure in `iter_slices()`. The two code snippets below will both trigger [`reused_chunks` being chosen](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L695) as the return value of `iter_content`. The generator is produced by `iter_slices` which currently requires the chunk_size to be an integer, otherwise it raises a TypeError when [trying to add an int and NoneType](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L388). Perhaps these are obscure enough edge cases, that it's not an issue. I just wanted to make a quick note of it. #### Examples ``` def test_stream(): s = requests.Session() req = requests.Request('GET', 'https://httpbin.org/get').prepare() resp = s.send(req, stream=False) next(resp.iter_content(None)) def test_decode(): r = requests.Response() r.raw = io.BytesIO(b'the content') all_content = r.content decoded_chunk = next(r.iter_content(None, True)) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3369/reactions" }
https://api.github.com/repos/psf/requests/issues/3369/timeline
null
completed
null
null
false
[ "Why does `iter_slices` with `slice_length = None` not trigger a `TypeError`?\n", "Sorry, I'm not sure I understand the question. I'm currently seeing the following error returned with both python2.7 and python3.5.\n\n`TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'`\n\nA more direct code snippet for reproduction would be:\n\n`next(requests.utils.iter_slices('my string', None))`\n", "Ok, that's good, I just didn't read the report clearly enough I guess. =)\n" ]
https://api.github.com/repos/psf/requests/issues/3368
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3368/labels{/name}
https://api.github.com/repos/psf/requests/issues/3368/comments
https://api.github.com/repos/psf/requests/issues/3368/events
https://github.com/psf/requests/pull/3368
163,293,541
MDExOlB1bGxSZXF1ZXN0NzU5MDk3ODI=
3,368
Allow None value for chunk_size again
{ "avatar_url": "https://avatars.githubusercontent.com/u/1304895?v=4", "events_url": "https://api.github.com/users/joyzheng/events{/privacy}", "followers_url": "https://api.github.com/users/joyzheng/followers", "following_url": "https://api.github.com/users/joyzheng/following{/other_user}", "gists_url": "https://api.github.com/users/joyzheng/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/joyzheng", "id": 1304895, "login": "joyzheng", "node_id": "MDQ6VXNlcjEzMDQ4OTU=", "organizations_url": "https://api.github.com/users/joyzheng/orgs", "received_events_url": "https://api.github.com/users/joyzheng/received_events", "repos_url": "https://api.github.com/users/joyzheng/repos", "site_admin": false, "starred_url": "https://api.github.com/users/joyzheng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/joyzheng/subscriptions", "type": "User", "url": "https://api.github.com/users/joyzheng", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-07-01T00:06:42Z
2021-09-08T03:01:00Z
2016-07-01T00:11:01Z
CONTRIBUTOR
resolved
Passing in `chunk_size=None` used to be fine, but started raising a `TypeError` following the addition of the type check (#3365)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3368/reactions" }
https://api.github.com/repos/psf/requests/issues/3368/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3368.diff", "html_url": "https://github.com/psf/requests/pull/3368", "merged_at": "2016-07-01T00:11:01Z", "patch_url": "https://github.com/psf/requests/pull/3368.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3368" }
true
[ "Thanks @joyzheng! :cake: \n", "It may also be worth updating the `iter_content` docstring to reflect the \"readall\" behaviour with `iter_content(None)`.\n", "Yeah, good point. I'm not actually sure about all the behavior cases for `iter_content(None)`, since it doesn't always result in reading the entire input; the primary one I've used it for is chunked encoding with `stream=True`, where setting `chunk_size=None` iterates chunk-by-chunk.\n" ]
https://api.github.com/repos/psf/requests/issues/3367
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3367/labels{/name}
https://api.github.com/repos/psf/requests/issues/3367/comments
https://api.github.com/repos/psf/requests/issues/3367/events
https://github.com/psf/requests/issues/3367
163,098,513
MDU6SXNzdWUxNjMwOTg1MTM=
3,367
Special IP proxy service timeout problems
{ "avatar_url": "https://avatars.githubusercontent.com/u/1249205?v=4", "events_url": "https://api.github.com/users/toadzhou/events{/privacy}", "followers_url": "https://api.github.com/users/toadzhou/followers", "following_url": "https://api.github.com/users/toadzhou/following{/other_user}", "gists_url": "https://api.github.com/users/toadzhou/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/toadzhou", "id": 1249205, "login": "toadzhou", "node_id": "MDQ6VXNlcjEyNDkyMDU=", "organizations_url": "https://api.github.com/users/toadzhou/orgs", "received_events_url": "https://api.github.com/users/toadzhou/received_events", "repos_url": "https://api.github.com/users/toadzhou/repos", "site_admin": false, "starred_url": "https://api.github.com/users/toadzhou/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/toadzhou/subscriptions", "type": "User", "url": "https://api.github.com/users/toadzhou", "user_view_type": "public" }
[]
closed
true
null
[]
null
19
2016-06-30T07:26:06Z
2021-09-08T15:00:54Z
2016-09-06T00:01:54Z
NONE
resolved
```
#cat checkip.py
#To streamline the code
def check(ip, port, timeout=30):
    proxies = {
        "http": "http://%s:%s" % (ip, str(port)),
        "https": "http://%s:%s" % (ip, str(port)),
    }
    requests.get(check_url, proxies=proxies, timeout=timeout, verify=False)
```

```
#python2.7
>>> import requests
>>> requests.__version__
'2.10.0'
>>> from checkip import check
>>> check("164.132.52.143",3128,timeout=1)
False
>>> check("164.132.52.143",3128,timeout=10)
Here is the infinite waiting for...
```

```
#ss
#The state is ESTAB
```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3367/reactions" }
https://api.github.com/repos/psf/requests/issues/3367/timeline
null
completed
null
null
false
[ "@toadzhou Does this actually have anything to do with the timeout? If you change the second timeout to `1`, does the error still occur?\n", "Of time is very short is no problem, the problem may appear in he has established a connection!\n", "To be clear, if you're calling `requests.get` then there is no state being shared: there is no way for two calls to `requests.get` to interact with each other on your machine. The proxy might be screwing it up, but Requests cannot. The only way to have that behaviour is if you're using a `Session`, which this code is not.\n", "I am a single independent connections. \nTry the requests.Session() still won't do\nWhat method can solve the problem\n", "@toadzhou Right now it's still not really clear what's happening in your connection. Can you run the script above with something like Wireshark and send me the packet capture?\n", "@Lukasa \n\n#tcpdump -i eth0 -vnn 164.132.52.143\n16:54:31.440888 IP (tos 0x0, ttl 64, id 52147, offset 0, flags [DF], proto TCP (6), length 44)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [S], cksum 0x4aa1 (correct), seq 1499401970, win 5840, options [mss 1460], length 0\n16:54:31.749509 IP (tos 0x0, ttl 52, id 0, offset 0, flags [DF], proto TCP (6), length 44)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [S.], cksum 0x203b (correct), seq 3958367012, ack 1499401971, win 29200, options [mss 1460], length 0\n16:54:31.749549 IP (tos 0x0, ttl 64, id 52148, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x9338 (correct), ack 1, win 5840, length 0\n16:54:31.749663 IP (tos 0x0, ttl 64, id 52149, offset 0, flags [DF], proto TCP (6), length 80)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [P.], cksum 0xd039 (incorrect -> 0x5679), seq 1:41, ack 1, win 5840, length 40\n16:54:32.057972 IP (tos 0x0, ttl 52, id 42614, offset 0, flags [DF], proto TCP (6), length 40)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [.], cksum 0x37d0 (correct), ack 41, win 29200, 
length 0\n16:54:32.058005 IP (tos 0x0, ttl 64, id 52150, offset 0, flags [DF], proto TCP (6), length 42)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [P.], cksum 0xd013 (incorrect -> 0x85fc), seq 41:43, ack 1, win 5840, length 2\n16:54:32.366473 IP (tos 0x0, ttl 52, id 42615, offset 0, flags [DF], proto TCP (6), length 40)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [.], cksum 0x37ce (correct), ack 43, win 29200, length 0\n16:54:32.488958 IP (tos 0x0, ttl 52, id 42616, offset 0, flags [DF], proto TCP (6), length 79)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0x2de6 (correct), seq 1:40, ack 43, win 29200, length 39\n16:54:32.488986 IP (tos 0x0, ttl 64, id 52151, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x92e7 (correct), ack 40, win 5840, length 0\n16:54:32.490157 IP (tos 0x0, ttl 64, id 52152, offset 0, flags [DF], proto TCP (6), length 337)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [P.], cksum 0xd13a (incorrect -> 0xf979), seq 43:340, ack 40, win 5840, length 297\n16:54:32.798673 IP (tos 0x0, ttl 52, id 42617, offset 0, flags [DF], proto TCP (6), length 40)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [.], cksum 0x334e (correct), ack 340, win 30016, length 0\n16:54:35.254479 IP (tos 0x0, ttl 52, id 42618, offset 0, flags [DF], proto TCP (6), length 41)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0x1d45 (correct), seq 40:41, ack 340, win 30016, length 1\n16:54:35.295080 IP (tos 0x0, ttl 64, id 52153, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x91bd (correct), ack 41, win 5840, length 0\n16:54:38.045490 IP (tos 0x0, ttl 52, id 42619, offset 0, flags [DF], proto TCP (6), length 41)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0x3044 (correct), seq 41:42, ack 340, win 30016, length 1\n16:54:38.045518 IP (tos 0x0, ttl 64, id 52154, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 
164.132.52.143.3128: Flags [.], cksum 0x91bc (correct), ack 42, win 5840, length 0\n16:54:40.817095 IP (tos 0x0, ttl 52, id 42620, offset 0, flags [DF], proto TCP (6), length 41)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0x3043 (correct), seq 42:43, ack 340, win 30016, length 1\n16:54:40.817125 IP (tos 0x0, ttl 64, id 52155, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x91bb (correct), ack 43, win 5840, length 0\n16:54:43.574452 IP (tos 0x0, ttl 52, id 42621, offset 0, flags [DF], proto TCP (6), length 41)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0x2242 (correct), seq 43:44, ack 340, win 30016, length 1\n16:54:43.574483 IP (tos 0x0, ttl 64, id 52156, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x91ba (correct), ack 44, win 5840, length 0\n16:54:46.349464 IP (tos 0x0, ttl 52, id 42622, offset 0, flags [DF], proto TCP (6), length 41)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [P.], cksum 0xc140 (correct), seq 44:45, ack 340, win 30016, length 1\n16:54:46.349495 IP (tos 0x0, ttl 64, id 52157, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x91b9 (correct), ack 45, win 5840, length 0\n16:54:48.247303 IP (tos 0x0, ttl 64, id 52158, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [F.], cksum 0x91b8 (correct), seq 340, ack 45, win 5840, length 0\n16:54:48.556321 IP (tos 0x0, ttl 52, id 42623, offset 0, flags [DF], proto TCP (6), length 40)\n 164.132.52.143.3128 > x.x.x.x.62004: Flags [F.], cksum 0x3347 (correct), seq 45, ack 341, win 30016, length 0\n16:54:48.556358 IP (tos 0x0, ttl 64, id 52159, offset 0, flags [DF], proto TCP (6), length 40)\n x.x.x.x.62004 > 164.132.52.143.3128: Flags [.], cksum 0x91b7 (correct), ack 46, win 5840, length 0\n", "My code is to check all kinds of proxy IP is available, the timeout of 
which is directly to give up. But because the timeout is not effective, result in the process of my pool into a zombie process\n", "Sorry, can you run that tcpdump with body capture turned on? `tcpdump -nnvvXSs 1514 -i eth0 164.132.52.143` should do it.\n", "@toadzhou Is that log _with_ or _without_ Sessions?\n", "requests.get()\nr = requests.Session()\nr.get()\n\nThe result is the same\n", "Or you can try with the local agent 164.132.52.143:3128\n", "@toadzhou The _result_ is the same, but what happens is not. To be clear, I'd like a packet capture where you're using `requests.get`, not `session.get`.\n", "I grab bag is running on this code\n\n``` python\n#!/usr/bin/env python\nimport requests\nimport socket\n#import eventlet\n#eventlet.monkey_patch()\ncheck_url = 'https://xxx.com/'\ncheck_key = 'test_key.'\n\nfrom requests.packages import urllib3\nurllib3.disable_warnings()\n\ndef check(ip, port, timeout=30):\n proxies = {\n \"http\": \"http://%s:%s\" % (ip, str(port)),\n \"https\": \"http://%s:%s\" % (ip, str(port)),\n \"Accept\": \"text/html, application/xhtml+xml, */*\",\n \"Accept-Language\": \"zh-CN\",\n \"User-Agent\": \"Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko\",\n \"Accept-Encoding\": \"gzip, deflate\",\n \"Connection\": \"Keep-Alive\"\n }\n\n try:\n r = requests.get(check_url, proxies=proxies, timeout=timeout, verify=False)\n #import pdb;pdb.set_trace()\n if r.ok and check_key in r.content:\n return True\n else:\n return False\n except:\n return False\n\n\ncheck(\"164.132.52.143\",3128,timeout=10) \n```\n", "Ok, and what _exactly_ is the result of running that code? Does it hang forever, raise exceptions, or what?\n", "Relative to 164.132.52.143:3128 agent timeout does not work \n", "Right, but to be clear, can you break down into a) what you _expect_ this code to do, and b) what it _actually_ does? 
Try to avoid giving a high level explanation and be as literal as possible.\n", "I'm sorry!Also please forgive me \nVerbal ability Determine the need to strengthen\n", "@toadzhou You don't need to apologise, I appreciate that English isn't your first language. =) I'm just trying to find a way to explain what I need from you as clearly as possible.\n\nIn this case, I think I need to understand exactly what happens when you run that code, and what you think should happen.\n", "This has been inactive for well over a month. I'm going to close this for now @toadzhou since we're unable to determine what the problem is. If you can gather more information and provide it, we can reopen this if we determine there's a bug in requests.\n\nThanks for filing this bug!\n" ]
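A note on the behaviour debugged in the thread above: requests' `timeout` bounds the connect phase and each individual socket read, not total wall-clock time. The tcpdump shows the proxy delivering a 1-byte packet roughly every 3 seconds, so a 10-second read timeout never fires even though the request as a whole hangs. If a true wall-clock cap is needed, it has to be enforced by the caller; the sketch below shows one way, with an illustrative helper name (this is not a requests API).

```python
import time


def read_with_deadline(chunk_iter, deadline, clock=time.monotonic):
    """Collect chunks from an iterator (e.g. Response.iter_content with
    stream=True), aborting once `deadline` seconds of wall-clock time
    have elapsed -- regardless of how often small packets trickle in."""
    start = clock()
    chunks = []
    for chunk in chunk_iter:
        if clock() - start > deadline:
            raise TimeoutError("wall-clock deadline exceeded")
        chunks.append(chunk)
    return b''.join(chunks)
```

Usage would be something like `read_with_deadline(resp.iter_content(8192), deadline=10)` after `resp = requests.get(url, stream=True, timeout=(5, 5))`, where the tuple still bounds connect and per-read time separately.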
https://api.github.com/repos/psf/requests/issues/3366
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3366/labels{/name}
https://api.github.com/repos/psf/requests/issues/3366/comments
https://api.github.com/repos/psf/requests/issues/3366/events
https://github.com/psf/requests/pull/3366
162,988,175
MDExOlB1bGxSZXF1ZXN0NzU2OTMwMTY=
3,366
check for headers containing return characters
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
30
2016-06-29T17:57:41Z
2021-09-08T03:00:49Z
2016-07-02T19:32:17Z
MEMBER
resolved
Addresses issue raised in #2947. This likely needs some tweaking. I added the check in `adapters.py` because it was the lowest level I could find header modification being done before sending. I'm not sure it actually needs to be this low, though, since it doesn't seem easily reachable with a non-`PreparedRequest` object. Alternatively, we could move the check to the [`prepare_headers`](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L406) method in `PreparedRequest`.

I also ignored `Accept` headers in the check because the requirements in [RFC7230](https://tools.ietf.org/html/rfc7230#page-24) seem a bit vague:

> This specification deprecates such line folding except within the message/http media type (Section 8.3.1). A sender MUST NOT generate a message that includes line folding (i.e., that has any field-value that contains a match to the obs-fold rule) unless the message is intended for packaging within the message/http media type.

This may be a poor interpretation on my part though.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3366/reactions" }
https://api.github.com/repos/psf/requests/issues/3366/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3366.diff", "html_url": "https://github.com/psf/requests/pull/3366", "merged_at": "2016-07-02T19:32:17Z", "patch_url": "https://github.com/psf/requests/pull/3366.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3366" }
true
[ "We should forbid the newline in Accept headers too: that is a misreading of RFC 7230. =) Otherwise, this is basically the right approach. Some notes:\n- Can we move the logic into a underscore-prefixed method on the HTTPAdapter? That will let people override the method if they need to.\n- Can we also search for line feed characters (`'\\r'`)?\n- Can we also also confirm the header field does not begin with linear whitespace (e.g. space, tab)?\n\nThis should handle most of the header smuggling risks. At this point we probably just want to match a regex against the header.\n", "I understand the reasoning behind putting it in adapters, but I do feel like it should be in the prepare_headers section instead, if possible. PreparedRequests should allow users to do stupid things, like this. No need to have saftey tape at that level, imo. \n\nI won't be surprised if @Lukasa disagrees :)\n", "If we're treating this as a security issue and trying to prevent injection on dynamically generated headers, I think @Lukasa's internal method allows flexibility while maintaining the smallest surface area for issues. If we're just looking to prevent unintentional foot shooting, prepare_headers seems \"cleaner\" to me.\n", "Agreed. I feel like it's the latter. Am I incorrect?\n", "No, I was just checking I understood the intent of the patch :)\n\nWorking on moving things up to `prepare_headers` with Lukasa's proposed changes now.\n", "Much to @kennethreitz's surprise, I agree with him. =)\n\nI'm worried that users who are unsuspectingly using our high-level APIs aren't subject to risk, but if you're going to drop down to monkeying with PreparedRequests then you should be able to do more-or-less what you like. =) \n", "I've got some small notes about code structure, but this is looking really good!\n", "Thanks, I think all of the changes you asked for should be in place. If you've got any additional notes, let me know. Always open to feedback :)\n", "Sorry, some more smaller notes. 
=) This should be the last review though!\n", "To be clear, you need to remove `to_native_string(val)` when using the bytes pattern.\n", "Hrgh, actually, this is a bit trickier than that. We need two regexes, one for unicode strings and one for bytes strings. And at this point, we probably want to turn this into a full-blown utility function.\n", "Hmm, yeah. So you're suggesting something along the lines of this?\n\n```\n\ndef validate_header(val):\n if isinstance(val, str):\n regex = str_reg\n else:\n regex = byte_reg\n if regex.match(val):\n raise InvalidHeader()\n return val\n\n---\n\ndef prepare_headers(self, headers) \n self.headers = CaseInsensitiveDict()\n if headers:\n for name, val in headers.items():\n self.headers[to_native_string(name)] = validate_header(val)\n\n```\n", "Yup, something like that, yeah.\n", "@Lukasa Alright, there's rev 1 for a utility function.\n", "Ok, one tiny string fix and I'll finally be done complaining. =D\n", "Cool, LGTM. Rebase away. =D\n", "Squashed and passing. Thanks for taking the time to work through this with me @Lukasa :)\n", "Hooray! I am happy. =)\n", "Thanks @nateprewitt! 
:sparkles: :cake: :sparkles:\n", "@Lukasa, stealing my emoji chain hahaha\n", "Just to let you know that we bumped into this recently here: https://github.com/jjjake/internetarchive/issues/151 - we were just sending integer 1 or 0 expecting it to be coerced to \"1\" or \"0\".\nThis is was simple to fix but I think this should be marked as a breaking change.\n\nRegarding the implementation: using `isinstance()` checks is kind of [un-pythonic](http://canonical.org/~kragen/isinstance/), it forbids the caller to ship instances of objects that should behave like strings or return rendered strings when asked.\n", "> Regarding the implementation: using isinstance() checks is kind of un-pythonic, it forbids the caller to ship instances of objects that should behave like strings or return rendered strings when asked.\n\nThere is no way in Python for something to behave \"like a string\" without being a string. You can ask something to be a string via `str()`, but because`str()` falls back to `repr()` it cannot be used as a sensible screening mechanism.\n\nWhat would be your proposed alternative?\n", "That probably means we have no choice but to depend on `str()` ? The original bug was about embedded newlines only I think.\n", "We _cannot_ depend on `str()`, its behaviours are consistently and completely unacceptable. The worst of them being the fact that, on Python 3, `str(b'hello') == \"b'hello'\"`, meaning that anyone passing a bytestring header value on Python 3 would get their header value totally mangled, despite the fact that passing us a bytestring is actually the _simplest_ case from the perspective of our logic: we pass it through, after checking that no ASCII bytes for newlines or leading/trailing whitespace are present.\n\nUnconditionally casting data to strings does not behave well in Python, and the biggest problem is that it leads to subtle, difficult-to-discover bugs. The bug you hit, @saper, fails loudly and clearly. 
It is easy to understand what the problem is. However, if we adjust to your proposed behaviour, the failure mode is that someone has passed us something like a `datetime`, leading to Requests sending a header that reads `Date: datetime(days=52, hours=23, minutes=12, seconds=0)`, which is a _clearly_ incorrect header value. That failure mode is much harder to debug: all that you see in your code is a 400 Client Error that needs to be chased down, requiring you to examine everything about your request.\n\nSubtle failure modes are almost always bad. I appreciate that this is an inconvenience for you, but we never intended your use-case to work, and historically it did not. It was by accident that it became acceptable several releases ago, and we have now taken action to formally prevent that use-case from being allowed again.\n", "Question: Did implicit stringification work correctly in both Python releases? (Before we needed an explicit one to cut away newlines?). Except for the `repr()` case of course?\n\nI've seen [{k: str(v) for k, v in headers.items()}](https://github.com/jjjake/internetarchive/pull/150/files#diff-562526d0e51ec94fea4ddda8c54853e5R155) proposed as a fix and I understand from what you are saying that explicit stringification is no good.\n", "That depends on your definition of \"correctly\", but basically: no. All my complaints about the weirder datatypes remain in place.\n", "Thank you, that makes sense. But wouldn't be good to mark it as a breaking change? A relatively innocent implicit `int` conversion just stopped to work. Not sure if there were people implementing `__str__()` to have their HTTP headers formatted nicely.\n", "That's a reasonable question, and we discussed it. The conclusion was that, given that we had documented that header were strings, that we were covered by that restriction. I felt less strongly on that point than the other core maintainers, but the call was made. 
=)\n", "Well, let it be then, thanks!\n", "Thanks for the understanding @saper! =)\n" ]
https://api.github.com/repos/psf/requests/issues/3365
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3365/labels{/name}
https://api.github.com/repos/psf/requests/issues/3365/comments
https://api.github.com/repos/psf/requests/issues/3365/events
https://github.com/psf/requests/pull/3365
162,566,828
MDExOlB1bGxSZXF1ZXN0NzUzOTM1MTE=
3,365
added in type check for chunk_size
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-27T22:55:25Z
2021-09-08T02:10:19Z
2016-06-28T07:33:29Z
MEMBER
resolved
This should raise a `TypeError` if a non-integer chunk_size is passed. I'm assuming we're not concerned about a chunk_size of type long, but that may be wrong. Resolves issue #3364.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3365/reactions" }
https://api.github.com/repos/psf/requests/issues/3365/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3365.diff", "html_url": "https://github.com/psf/requests/pull/3365", "merged_at": "2016-06-28T07:33:29Z", "patch_url": "https://github.com/psf/requests/pull/3365.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3365" }
true
[ "Very nice @nateprewitt! Thanks! :sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/3364
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3364/labels{/name}
https://api.github.com/repos/psf/requests/issues/3364/comments
https://api.github.com/repos/psf/requests/issues/3364/events
https://github.com/psf/requests/issues/3364
162,499,663
MDU6SXNzdWUxNjI0OTk2NjM=
3,364
iter_content not chunking?
{ "avatar_url": "https://avatars.githubusercontent.com/u/541830?v=4", "events_url": "https://api.github.com/users/coyotlinden/events{/privacy}", "followers_url": "https://api.github.com/users/coyotlinden/followers", "following_url": "https://api.github.com/users/coyotlinden/following{/other_user}", "gists_url": "https://api.github.com/users/coyotlinden/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/coyotlinden", "id": 541830, "login": "coyotlinden", "node_id": "MDQ6VXNlcjU0MTgzMA==", "organizations_url": "https://api.github.com/users/coyotlinden/orgs", "received_events_url": "https://api.github.com/users/coyotlinden/received_events", "repos_url": "https://api.github.com/users/coyotlinden/repos", "site_admin": false, "starred_url": "https://api.github.com/users/coyotlinden/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/coyotlinden/subscriptions", "type": "User", "url": "https://api.github.com/users/coyotlinden", "user_view_type": "public" }
[]
closed
true
null
[]
null
21
2016-06-27T16:59:26Z
2021-09-08T17:05:34Z
2016-06-28T07:33:39Z
NONE
resolved
I have code which is practically identical to the quickstart code:

```
with open(self.filename, 'wb') as fd:
    for chunk in self.req.iter_content(self.chunk_size):
        fd.write(chunk)
        if self.progressbar:
            self.in_queue.put(len(chunk))
```

where the queue is read in another thread by the GUI object and updates the progress bar. The problem I have noticed is that, despite the chunk_size being 1024, this code seems to wait until the download is done and then puts the entire file size in the queue, defeating the point of providing the user with continuous progress feedback.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3364/reactions" }
https://api.github.com/repos/psf/requests/issues/3364/timeline
null
completed
null
null
false
[ "@coyotlinden Do you make the original requests call with `stream=True`?\n", "Yes.\n\n```\nreq = requests.get(url, stream=True)\n```\n", "@coyotlinden And that `req` is the same as the one that is called `self.req` elsewhere? How big is the file? Is the URL publicly available?\n", "- Same `req`? Yes. The `req` is passed to a subclass of `threading.Thread` where it is referred to as `self.req` .\n- How big is the file? 90,728,799 bytes\n- Is the URL public? Yes. http://download.cloud.secondlife.com/Viewer_4/Second_Life_4_0_5_315117_i386.dmg \n", "Are you absolutely sure that `self.chunk_size` is 1024? Testing that locally I get a series of 1024-byte chunks, until a final 351-byte chunk. Can you throw a print or a logging statement in there to confirm what `self.chunk_size` is?\n", "I was using the Wing IDE debugger to check the value, but will put in a print statement after lunch.\n", "Given:\n\n```\n def run(self):\n with open(self.filename, 'wb') as fd:\n for chunk in self.req.iter_content(self.chunk_size):\n fd.write(chunk)\n print \"chunk_size: %s chunk length: %s\" % (self.chunk_size, len(chunk))\n if self.progressbar:\n self.in_queue.put(len(chunk))\n```\n\nI get:\n\n`chunk_size: 1024 chunk length: 90728799`\n", "Are you sending custom headers or anything?\n", "Nope.\n", "Ok, when I run this I see repeatedly 1024-byte sized chunks:\n\n``` python\nimport requests\n\n\nCHUNK_SIZE = 1024\n\n\ndef run():\n req = requests.get('http://download.cloud.secondlife.com/Viewer_4/Second_Life_4_0_5_315117_i386.dmg', stream=True)\n with open('out.bin', 'wb') as fd:\n for chunk in req.iter_content(CHUNK_SIZE):\n fd.write(chunk)\n print \"chunk_size: %s chunk length: %s\" % (CHUNK_SIZE, len(chunk))\n\n\nrun()\n```\n\nCan you try running that and see what your output is?\n", "Given:\n\n```\n \"\"\"\"\n with open(self.filename, 'wb') as fd:\n for chunk in self.req.iter_content(self.chunk_size):\n fd.write(chunk)\n print \"chunk_size: %s chunk length: %s\" % (self.chunk_size, len(chunk))\n if self.progressbar:\n self.in_queue.put(len(chunk))\"\"\"\n CHUNK_SIZE = 1024\n req = requests.get('http://download.cloud.secondlife.com/Viewer_4/Second_Life_4_0_5_315117_i386.dmg', stream=True)\n with open('out.bin', 'wb') as fd:\n for chunk in req.iter_content(CHUNK_SIZE):\n fd.write(chunk)\n print \"chunk_size: %s chunk length: %s\" % (CHUNK_SIZE, len(chunk)) \n```\n\nI _do_ see the correct output. I'll try and do some incremental swapping back to my previous and see if I can't isolate the problem.\n", "This is very strange. Just changing the chunk size variable from a local constant to the passed in value breaks it:\n\n```\n#CHUNK_SIZE = 1024\n req = requests.get('http://download.cloud.secondlife.com/Viewer_4/Second_Life_4_0_5_315117_i386.dmg', stream=True)\n with open('out.bin', 'wb') as fd:\n for chunk in req.iter_content(self.chunk_size):\n fd.write(chunk)\n print \"chunk_size: %s chunk length: %s\" % (self.chunk_size, len(chunk))\n```\n\nGets `chunk_size: 1024 chunk length: 90728799`\n", "@coyotlinden What is `type(self.chunk_size)`?\n", "Ah ha! `chunk type: <type 'str'> chunk_size: 1024 chunk length: 90728799`\n", "Fixed with `self.chunk_size = int(chunk_size)`\n", "=D There we go!\n", "I'd be inclined to say that we should maybe `TypeError` on stupid inputs there.\n", "I'd +1 that change!\n\nThis now works _perfectly_:\n\n```\n def run(self):\n with open(self.filename, 'wb') as fd:\n for chunk in self.req.iter_content(self.chunk_size):\n fd.write(chunk)\n if self.progressbar:\n self.in_queue.put(len(chunk)) \n self.in_queue.put(-1)\n```\n\nMany thanks!\n", "> I'd be inclined to say that we should maybe TypeError on stupid inputs there.\n\nI'd be inclined to agree.\n", "I threw together a quick check and a couple tests for non-int chunk_sizes in PR #3365\n", "Resolved by #3365.\n" ]
https://api.github.com/repos/psf/requests/issues/3363
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3363/labels{/name}
https://api.github.com/repos/psf/requests/issues/3363/comments
https://api.github.com/repos/psf/requests/issues/3363/events
https://github.com/psf/requests/issues/3363
162,237,796
MDU6SXNzdWUxNjIyMzc3OTY=
3,363
Add let's encrypt ca certificates to trusted certs
{ "avatar_url": "https://avatars.githubusercontent.com/u/5488440?v=4", "events_url": "https://api.github.com/users/maxnoe/events{/privacy}", "followers_url": "https://api.github.com/users/maxnoe/followers", "following_url": "https://api.github.com/users/maxnoe/following{/other_user}", "gists_url": "https://api.github.com/users/maxnoe/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/maxnoe", "id": 5488440, "login": "maxnoe", "node_id": "MDQ6VXNlcjU0ODg0NDA=", "organizations_url": "https://api.github.com/users/maxnoe/orgs", "received_events_url": "https://api.github.com/users/maxnoe/received_events", "repos_url": "https://api.github.com/users/maxnoe/repos", "site_admin": false, "starred_url": "https://api.github.com/users/maxnoe/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/maxnoe/subscriptions", "type": "User", "url": "https://api.github.com/users/maxnoe", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2016-06-24T21:53:37Z
2021-09-08T17:05:35Z
2016-06-25T06:27:04Z
NONE
resolved
IMO requests should trust let's encrypt issued certs: www.letsencrypt.org
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3363/reactions" }
https://api.github.com/repos/psf/requests/issues/3363/timeline
null
completed
null
null
false
[ "Do you have evidence that we don't?\n\n## If so, can you share the URL and version of requests, Python, and openssl?\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "Requests does trust LE: LE uses a cross signed root that we have chained up to for years. \n", "MMh, then why do I get an SSL error when I request https://auxdb.app.tu-dortmund.de? It is using a letsencrypt certificate and all browsers I tried trust this certificate.\n\n`requests.get('https://auxdb.app.tu-dortmund.de')`\n\nI added the 4 intermediate certificates of letsencrypt to my systems ca-certificates:\nhttps://letsencrypt.org/certificates/\n\nThis works:\n\n```\nrequests.get('https://auxdb.app.tu-dortmund.de', verify='/etc/ssl/certs/ca-certificates.crt')\n```\n", "What version of requests do you have installed, what packages on your system do you have installed, what OS are you on, and what OpenSSL version do you have?\n", "Are you saying that you can request the given url without ssl error? \n", "@MaxNoe The server does not server the intermediate certificate(s) like it should (browser cache them from other sites, so it may work)\nCompare:\n\n``` sh\n$ openssl s_client -connect auxdb.app.tu-dortmund.de:443 -CAfile /etc/ssl/certs/ca-certificates.crt\n...\nCertificate chain\n 0 s:/CN=git.e5.physik.tu-dortmund.de\n i:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\n...\nVerify return code: 21 (unable to verify the first certificate)\n```\n\n``` sh\n$ openssl s_client -connect t-8ch.de:443 -CAfile /etc/ssl/certs/ca-certificates.crt\n...\nCertificate chain\n 0 s:/CN=t-8ch.de\n i:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\n 1 s:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\n i:/O=Digital Signature Trust Co./CN=DST Root CA X3\n...\nVerify return code: 0 (ok)\n```\n\nCertbot saves the chain to `chain.pem` and `fullchain.pem` (including the leaf cert).\n", "Mhh, ok. So it's a problem on the other hand. Sorry to bother.\n", "Ok, got it. I use a python script to upload the certificates obtained by letsencrypt into the rancher api. I got the keyword for the chain cert wrong. Everything works now.\n", "Glad to hear that Requests is of a level of quality that it accidentally led you to fix your improperly configured server.\n\nWe should start selling a security audit edition license for corporations :)\n" ]
https://api.github.com/repos/psf/requests/issues/3362
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3362/labels{/name}
https://api.github.com/repos/psf/requests/issues/3362/comments
https://api.github.com/repos/psf/requests/issues/3362/events
https://github.com/psf/requests/pull/3362
162,147,196
MDExOlB1bGxSZXF1ZXN0NzUxMTQyNDc=
3,362
adding asserted_encoding check on None type encoding to match text() …
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-06-24T13:31:31Z
2021-09-08T02:10:20Z
2016-06-28T19:22:00Z
MEMBER
resolved
…behavior. Resolves #3359
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3362/reactions" }
https://api.github.com/repos/psf/requests/issues/3362/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3362.diff", "html_url": "https://github.com/psf/requests/pull/3362", "merged_at": "2016-06-28T19:22:00Z", "patch_url": "https://github.com/psf/requests/pull/3362.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3362" }
true
[ "@Lukasa this looks good to me. What are your thoughts?\n", "I think this looks good to me. @sigmavirus24 are you happy to merge this directly? I think I'd call the old behaviour a bug so I'd be fine with merging this for the next release.\n", "Me too. Merging!\n" ]
https://api.github.com/repos/psf/requests/issues/3361
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3361/labels{/name}
https://api.github.com/repos/psf/requests/issues/3361/comments
https://api.github.com/repos/psf/requests/issues/3361/events
https://github.com/psf/requests/pull/3361
161,837,576
MDExOlB1bGxSZXF1ZXN0NzQ4OTQzODg=
3,361
Allow sending an unprepared request via session.
{ "avatar_url": "https://avatars.githubusercontent.com/u/416057?v=4", "events_url": "https://api.github.com/users/jamielennox/events{/privacy}", "followers_url": "https://api.github.com/users/jamielennox/followers", "following_url": "https://api.github.com/users/jamielennox/following{/other_user}", "gists_url": "https://api.github.com/users/jamielennox/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jamielennox", "id": 416057, "login": "jamielennox", "node_id": "MDQ6VXNlcjQxNjA1Nw==", "organizations_url": "https://api.github.com/users/jamielennox/orgs", "received_events_url": "https://api.github.com/users/jamielennox/received_events", "repos_url": "https://api.github.com/users/jamielennox/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jamielennox/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jamielennox/subscriptions", "type": "User", "url": "https://api.github.com/users/jamielennox", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-23T04:25:28Z
2021-09-08T03:01:01Z
2016-06-23T06:06:56Z
NONE
resolved
This can be considered a request in the form of a PR because it was easier to make the code change than explain it. I have multiple places where i am passing around method, url, json etc before it makes it to the requests level. Now these can all be bundled up into a single request object which is much easier to pass around, but then i have to manually call prep = session.prepare_request(req) settings = session.merge_environment_settings(....) settings['timeout'] = timeout session.send(prep, **settings) Now it's not super hard, but it's something that happens by default doing request() and it's certainly not intuitive. If we add a send_request() which does exactly the same as request() just on an existing request object it makes it much easier to correctly use Request objects. Note: If you like the idea but not the implementation and have something equivalent - please just do it your way. I don't care if this particular PR gets accepted or not if the idea gets up. I proposed this against 3.0 thinking it was going to be a more breaking change, but it seems pretty easy if you'd prefer i propose against master. Thanks
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3361/reactions" }
https://api.github.com/repos/psf/requests/issues/3361/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3361.diff", "html_url": "https://github.com/psf/requests/pull/3361", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3361.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3361" }
true
[ "PreparedRequests should be suitable for accomplishing what you're trying to accomplish.\n\nI see where you're coming from, and thank you for your contribution. However, in short, no. :)\n\n@sigmavirus24 & @Lukasa are welcome to add additional thoughts, if desired.\n", "I once thought about this too. Then I realized that you're introducing yet another way of doing things when there already exist more than one way to do it. You can do:\n\n``` py\n#: 1\nrequests.get(...)\n#: 2\nrequests.request('GET', ...)\n#: 3\nmy_session.get(...)\n#: 4\nmy_session.request('GET', ...)\n#: 5\nrequest = requests.Request(method='GET', ...)\nprepared = my_session.prepare_request(request)\nsettings = my_session.merge_environment_settings(....) \n# modify settings\nmy_session.send(prepared, **settings)\n```\n\nWe really don't need yet another way of doing roughly the same thing. That's too much API surface area for very little extra benefit. If you're using the PreparedRequest flow, your proposal is only saving a tiny fraction of that workflow and diverging the typical code-paths further. Frankly, I don't think it's worth the extra surface area. In my experience, people who think the PreparedRequest flow is too complicated or kludgy, don't actually need it. \n" ]
https://api.github.com/repos/psf/requests/issues/3360
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3360/labels{/name}
https://api.github.com/repos/psf/requests/issues/3360/comments
https://api.github.com/repos/psf/requests/issues/3360/events
https://github.com/psf/requests/issues/3360
161,817,854
MDU6SXNzdWUxNjE4MTc4NTQ=
3,360
Custom ResponseClass
{ "avatar_url": "https://avatars.githubusercontent.com/u/416057?v=4", "events_url": "https://api.github.com/users/jamielennox/events{/privacy}", "followers_url": "https://api.github.com/users/jamielennox/followers", "following_url": "https://api.github.com/users/jamielennox/following{/other_user}", "gists_url": "https://api.github.com/users/jamielennox/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jamielennox", "id": 416057, "login": "jamielennox", "node_id": "MDQ6VXNlcjQxNjA1Nw==", "organizations_url": "https://api.github.com/users/jamielennox/orgs", "received_events_url": "https://api.github.com/users/jamielennox/received_events", "repos_url": "https://api.github.com/users/jamielennox/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jamielennox/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jamielennox/subscriptions", "type": "User", "url": "https://api.github.com/users/jamielennox", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-23T00:48:43Z
2021-09-08T17:05:36Z
2016-06-23T06:09:16Z
NONE
resolved
A common pattern in webob is allowing applications to define their own request and response class. This makes it easier to work with and expose attributes and methods that you will commonly use in your application. For example you can add properties for headers that are commonly used in your application. Theoretically you could do the request side of this today subclassing requests.Request, however the more useful side here would be being able to specify a response class (it can be asserted that the custom response class is a subclass of requests.Response).
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 1, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/3360/reactions" }
https://api.github.com/repos/psf/requests/issues/3360/timeline
null
completed
null
null
false
[ "Thank you for the idea, but again, in short brevity: no. :)\n", "> For example you can add properties for headers that are commonly used in your application.\n\nYou could also do this with a separate response class (that takes a requests.Response) as a parameter and uses descriptors for that. OpenStack already has a lot of \"session\" classes that use requests.Session instances under the covers. I don't understand why they couldn't have their own Response object that uses descriptors as intended to achieve a similar goal.\n" ]
https://api.github.com/repos/psf/requests/issues/3359
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3359/labels{/name}
https://api.github.com/repos/psf/requests/issues/3359/comments
https://api.github.com/repos/psf/requests/issues/3359/events
https://github.com/psf/requests/issues/3359
161,568,785
MDU6SXNzdWUxNjE1Njg3ODU=
3,359
Uncertain about content/text vs iter_content(decode_unicode=True/False)
{ "avatar_url": "https://avatars.githubusercontent.com/u/48527?v=4", "events_url": "https://api.github.com/users/mikepelley/events{/privacy}", "followers_url": "https://api.github.com/users/mikepelley/followers", "following_url": "https://api.github.com/users/mikepelley/following{/other_user}", "gists_url": "https://api.github.com/users/mikepelley/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mikepelley", "id": 48527, "login": "mikepelley", "node_id": "MDQ6VXNlcjQ4NTI3", "organizations_url": "https://api.github.com/users/mikepelley/orgs", "received_events_url": "https://api.github.com/users/mikepelley/received_events", "repos_url": "https://api.github.com/users/mikepelley/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mikepelley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mikepelley/subscriptions", "type": "User", "url": "https://api.github.com/users/mikepelley", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
15
2016-06-22T00:19:56Z
2021-09-08T14:00:38Z
2016-09-24T17:01:11Z
NONE
resolved
When requesting an application/json document, I'm seeing `next(r.iter_content(16*1024, decode_unicode=True))` returning bytes, whereas `r.text` returns unicode. My understanding was that both should return a unicode object. In essence, I thought "iter_content" was equivalent to "iter_text" when decode_unicode was True. Have I misunderstood something? I can provide an example if needed. For reference, I'm using python 3.5.1 and requests 2.10.0. Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3359/reactions" }
https://api.github.com/repos/psf/requests/issues/3359/timeline
null
completed
null
null
false
[ "what does (your response object).encoding return?\n", "There's at least one key difference: `decode_unicode=True` doesn't fall back to `apparent_encoding`, which means it'll never autodetect the encoding. This means if `response.encoding` is None it is a no-op: in fact, it's a no-op that yields bytes.\n\nThat behaviour seems genuinely bad to me, so I think we should consider it a bug. I'd rather we had the same logic as in `text` for this.\n", "`r.encoding` returns `None`.\n\nOn a related note, `iter_text` might be clearer/more consistent than `iter_content(decode_unicode=True)` if there's room for change in the APIs future (and `iter_content_lines` and `iter_text_lines` I guess), assuming you don't see that as bloat.\n", "@mikepelley The API is presently frozen so I don't think we'll be adding those three methods. Besides, `iter_text` likely wouldn't provide much extra value outside of calling `iter_content(decode_unicode=True)`.\n", "I added a pull request #3362 that adds the same check used in `text()` to set the encoding type if there is none. It seems to solve the raw bytes issue. Let me know if there's any needed tweaking.\n", "Glad we fixed this, this was definitely an oversight. \n", "I was reading through this and the related issues and PRs and I'd love to help out. However, I'm not sure what direction to head with it (or if it's something more suitable for the next version of `requests`).\n\nJust so I understand correctly, we want to get it so `iter_content` and `text` both return Unicode strings. The original PR solved this by grabbing the `apparent_encoding`, which caused a problem when using streaming responses (`chardet.detect` consumes the entire stream).\n\nI'm assuming [detecting the encoding incrementally](http://chardet.readthedocs.io/en/latest/usage.html#example-detecting-encoding-incrementally) will still lead to problems, as part of the stream will be consumed. Is this something that may be best solved via documentation, suggesting encoding is set before using `iter_content`? It seems like we'd need some sort of knowledge about the response, which may be hard to do without consuming it (or part of it).\n\nCould we find some middle ground? For streaming responses, documentation calls out that encoding should be set. If it isn't, return bytes and let the users decode to their heart's desire (seemed like it worked well as a workaround). While nonstreaming responses can use the `apparent_encoding`, as originally suggested.\n\nI know its a bit cold, but thoughts @Lukasa, @sigmavirus24?\n", "I'm inclined to suggest that we just use `encoding` and, if that is not set, that we throw an exception that requires that users set it, rather than subtly returning an incompatible type. \n", "I've been playing around with this since we reverted and I don't think that there's a nice way to avoid the exception. I'm +1 on that but I'm also thinking that an encoding param for `iter_content` that @Lukasa suggested in #3481 would be useful. It allows the user to avoid having to check and set `Response.encoding` on every Response that might be streamed.\n", "I know @sigmavirus24 brought up a concern about raising an exception if `encoding` isn't specified for the current major version. Would the suggested fix go against the 3.0 branch? @Lukasa, when you say `encoding`, I'm assuming you're referring to the response's encoding rather than the additional parameter on `iter_content`?\n", "Any change to the response's `encoding` logic (e.g. throwing exceptions if it's not present but `decode_unicode` is) is definitely breaking, and so would have to go against the 3.0.0 branch. I was not planning to add an extra parameter to `iter_content`.\n", "I think @shellhead's work in #3574 should have this addressed in 3.0.0.\n", "I'm pulling this comment out of #3675 and putting it here to avoid derailing the other issue.\n\n#3675 suggests to me again that there's a use case for a user supplied encoding fallback. Right now, you get unexpected bytes with `decode_unicode=True`. Even with the 3.0.0 fix, anytime `decode_unicode` is used, it must be wrapped in a try/except or conditional block to guarantee the request will be useful. This isn't the end of the world, but the fact that `iter_*` will fail on (semi?) common APIs like Amazon S3, Pinterest, and Stripe seems like a gap worth addressing. Why not provide the interface instead of forcing the users to program around it every time?\n", "Users are encouraged to set `response.encoding`: the docs point out that you can and should do it. I think that means we already provide the interface.\n", "> Users are encouraged to set response.encoding: the docs point out that you can and should do it.\n\nI don't know if I would go so far as to assert the docs say you \"should do it\", in fact I think they give a false sense of assurance about decoding.\n\nThe encoding section only discusses how `Response.text` will either use the header encoding/chardet to decode the body, or you can get raw bytes from `Response.content`. If you want to change the default encoding you can supply a new one. They don't really touch on that default being a no-op, or in the future halting entirely.\n\nIf `if r.encoding is None: r.encoding = 'my-chosen-fallback-1'` is a requirement to reliably use the streaming interface because `decode_unicode=True` isn't sufficient, then that should definitely be detailed. I guess I'm beating a dead horse on integrating this into `iter_*` though.\n" ]
https://api.github.com/repos/psf/requests/issues/3358
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3358/labels{/name}
https://api.github.com/repos/psf/requests/issues/3358/comments
https://api.github.com/repos/psf/requests/issues/3358/events
https://github.com/psf/requests/pull/3358
161,512,593
MDExOlB1bGxSZXF1ZXN0NzQ2NjU2OTQ=
3,358
Update list of supported Python versions in todo.rst
{ "avatar_url": "https://avatars.githubusercontent.com/u/1192314?v=4", "events_url": "https://api.github.com/users/petedmarsh/events{/privacy}", "followers_url": "https://api.github.com/users/petedmarsh/followers", "following_url": "https://api.github.com/users/petedmarsh/following{/other_user}", "gists_url": "https://api.github.com/users/petedmarsh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/petedmarsh", "id": 1192314, "login": "petedmarsh", "node_id": "MDQ6VXNlcjExOTIzMTQ=", "organizations_url": "https://api.github.com/users/petedmarsh/orgs", "received_events_url": "https://api.github.com/users/petedmarsh/received_events", "repos_url": "https://api.github.com/users/petedmarsh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/petedmarsh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/petedmarsh/subscriptions", "type": "User", "url": "https://api.github.com/users/petedmarsh", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-21T18:52:24Z
2021-09-08T03:01:02Z
2016-06-21T18:53:19Z
CONTRIBUTOR
resolved
Extremely minor change; I happened to notice that the list of supported Python versions in todo.rst was out-of-sync with the Python version classifiers in setup.py, this just puts it back in sync.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3358/reactions" }
https://api.github.com/repos/psf/requests/issues/3358/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3358.diff", "html_url": "https://github.com/psf/requests/pull/3358", "merged_at": "2016-06-21T18:53:19Z", "patch_url": "https://github.com/psf/requests/pull/3358.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3358" }
true
[ "Thanks @petedmarsh! :sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/3357
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3357/labels{/name}
https://api.github.com/repos/psf/requests/issues/3357/comments
https://api.github.com/repos/psf/requests/issues/3357/events
https://github.com/psf/requests/issues/3357
161,480,241
MDU6SXNzdWUxNjE0ODAyNDE=
3,357
Adding DRY-ness in auth.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/3393794?v=4", "events_url": "https://api.github.com/users/ueg1990/events{/privacy}", "followers_url": "https://api.github.com/users/ueg1990/followers", "following_url": "https://api.github.com/users/ueg1990/following{/other_user}", "gists_url": "https://api.github.com/users/ueg1990/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ueg1990", "id": 3393794, "login": "ueg1990", "node_id": "MDQ6VXNlcjMzOTM3OTQ=", "organizations_url": "https://api.github.com/users/ueg1990/orgs", "received_events_url": "https://api.github.com/users/ueg1990/received_events", "repos_url": "https://api.github.com/users/ueg1990/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ueg1990/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ueg1990/subscriptions", "type": "User", "url": "https://api.github.com/users/ueg1990", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-21T16:19:04Z
2021-09-08T17:05:36Z
2016-06-21T16:52:58Z
CONTRIBUTOR
resolved
@sigmavirus24 mentioned that in auth.py, the magic method **ne** is technically not needed since != is implied from definition of **eq**. Would this be considered a useful update?
{ "avatar_url": "https://avatars.githubusercontent.com/u/3393794?v=4", "events_url": "https://api.github.com/users/ueg1990/events{/privacy}", "followers_url": "https://api.github.com/users/ueg1990/followers", "following_url": "https://api.github.com/users/ueg1990/following{/other_user}", "gists_url": "https://api.github.com/users/ueg1990/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ueg1990", "id": 3393794, "login": "ueg1990", "node_id": "MDQ6VXNlcjMzOTM3OTQ=", "organizations_url": "https://api.github.com/users/ueg1990/orgs", "received_events_url": "https://api.github.com/users/ueg1990/received_events", "repos_url": "https://api.github.com/users/ueg1990/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ueg1990/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ueg1990/subscriptions", "type": "User", "url": "https://api.github.com/users/ueg1990", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3357/reactions" }
https://api.github.com/repos/psf/requests/issues/3357/timeline
null
completed
null
null
false
[ "Have you checked that it's not necessary on Python 2.6? It might be necessary there.\n", "You are right. It is needed in Python 2.6. \n" ]
https://api.github.com/repos/psf/requests/issues/3356
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3356/labels{/name}
https://api.github.com/repos/psf/requests/issues/3356/comments
https://api.github.com/repos/psf/requests/issues/3356/events
https://github.com/psf/requests/pull/3356
161,476,877
MDExOlB1bGxSZXF1ZXN0NzQ2Mzk3MjU=
3,356
Do not attempt to decode to unicode twice
{ "avatar_url": "https://avatars.githubusercontent.com/u/555959?v=4", "events_url": "https://api.github.com/users/zachmullen/events{/privacy}", "followers_url": "https://api.github.com/users/zachmullen/followers", "following_url": "https://api.github.com/users/zachmullen/following{/other_user}", "gists_url": "https://api.github.com/users/zachmullen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zachmullen", "id": 555959, "login": "zachmullen", "node_id": "MDQ6VXNlcjU1NTk1OQ==", "organizations_url": "https://api.github.com/users/zachmullen/orgs", "received_events_url": "https://api.github.com/users/zachmullen/received_events", "repos_url": "https://api.github.com/users/zachmullen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zachmullen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zachmullen/subscriptions", "type": "User", "url": "https://api.github.com/users/zachmullen", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-21T16:04:23Z
2021-09-08T03:01:02Z
2016-06-21T16:05:40Z
NONE
resolved
It's possible I'm missing something here, but it seems like we should not be trying to decode via the underlying urllib3 method since `iter_content` is responsible for the decision of whether or not to decode.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3356/reactions" }
https://api.github.com/repos/psf/requests/issues/3356/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3356.diff", "html_url": "https://github.com/psf/requests/pull/3356", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3356.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3356" }
true
[ "Thanks for this!\n\nThe urllib3 decoding is about decoding compressed content-encodings like gzip and deflate. We do always want that on. =) iter_content's decoding is about decoding to unicode, an unrelated concern.\n", "Ah, thank you for the explanation :)\n" ]
https://api.github.com/repos/psf/requests/issues/3355
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3355/labels{/name}
https://api.github.com/repos/psf/requests/issues/3355/comments
https://api.github.com/repos/psf/requests/issues/3355/events
https://github.com/psf/requests/issues/3355
161,128,559
MDU6SXNzdWUxNjExMjg1NTk=
3,355
'True' payload in get cause problem
{ "avatar_url": "https://avatars.githubusercontent.com/u/1731864?v=4", "events_url": "https://api.github.com/users/lufeihaidao/events{/privacy}", "followers_url": "https://api.github.com/users/lufeihaidao/followers", "following_url": "https://api.github.com/users/lufeihaidao/following{/other_user}", "gists_url": "https://api.github.com/users/lufeihaidao/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lufeihaidao", "id": 1731864, "login": "lufeihaidao", "node_id": "MDQ6VXNlcjE3MzE4NjQ=", "organizations_url": "https://api.github.com/users/lufeihaidao/orgs", "received_events_url": "https://api.github.com/users/lufeihaidao/received_events", "repos_url": "https://api.github.com/users/lufeihaidao/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lufeihaidao/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lufeihaidao/subscriptions", "type": "User", "url": "https://api.github.com/users/lufeihaidao", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-06-20T06:58:42Z
2021-09-08T17:05:37Z
2016-06-20T07:48:33Z
NONE
resolved
``` python payload = {'key1': 'value1', 'key2': True} r = requests.get('http://httpbin.org/get', params=payload) print(r.url) # 'http://httpbin.org/get?key1=value1&key2=True' ``` With `key2` I wanna get a boolean param, but requests gives me 'True'. 'True' will be considered as a string in many frameworks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3355/reactions" }
https://api.github.com/repos/psf/requests/issues/3355/timeline
null
completed
null
null
false
[ "Is it a bug, or designed to be so?\n", "@lufeihaidao HTTP query parameters are not typed: they are _all_ strings. There is no such thing as a boolean parameter in a query string. What do you think a boolean parameter is, in this context?\n", "@Lukasa Sorry, in my app, `key2` is supposed to be a boolean field. And many frameworks' boolean validations only check whether the `key2` string is '1' or 'true',rather than 'True'. So when the client assembles params with 'True', it will not pass the server's boolean validator.\n", "In this case, I need to modify the server's boolean validator to support 'True'?\n", "@lufeihaidao You could do that, or you could just pass the string `true` rather than `True`.\n\n``` python\npayload = {'key1': 'value1', 'key2': 'true'}\n```\n", "@Lukasa OK, thanks.\n", "No problem.\n" ]
https://api.github.com/repos/psf/requests/issues/3354
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3354/labels{/name}
https://api.github.com/repos/psf/requests/issues/3354/comments
https://api.github.com/repos/psf/requests/issues/3354/events
https://github.com/psf/requests/issues/3354
161,112,202
MDU6SXNzdWUxNjExMTIyMDI=
3,354
Failed to establish a new connection: [Errno 65] No route to host
{ "avatar_url": "https://avatars.githubusercontent.com/u/4336119?v=4", "events_url": "https://api.github.com/users/ljdawn/events{/privacy}", "followers_url": "https://api.github.com/users/ljdawn/followers", "following_url": "https://api.github.com/users/ljdawn/following{/other_user}", "gists_url": "https://api.github.com/users/ljdawn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ljdawn", "id": 4336119, "login": "ljdawn", "node_id": "MDQ6VXNlcjQzMzYxMTk=", "organizations_url": "https://api.github.com/users/ljdawn/orgs", "received_events_url": "https://api.github.com/users/ljdawn/received_events", "repos_url": "https://api.github.com/users/ljdawn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ljdawn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ljdawn/subscriptions", "type": "User", "url": "https://api.github.com/users/ljdawn", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-20T03:46:43Z
2021-09-08T17:05:22Z
2016-06-28T18:40:47Z
NONE
resolved
hi, I'm trying to ``` python import requests req = requests.Session() headers = {'Referer':'https://archive.org/','User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:47.0) Gecko/20100101 Firefox/47.0'} req.headers.update(headers) res = req.get('https://archive.org/', headers=headers, timeout=20, proxies={'http': '52.192.0.225:8080'}, verify=False) ``` Then got --------------------------------------------------------------------------- ConnectionError Traceback (most recent call last) <ipython-input-23-1ddf77e531ff> in <module>() 3 headers = {'Referer':'https://archive.org/','User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:47.0) Gecko/20100101 Firefox/47.0'} 4 req.headers.update(headers) ----> 5 res = req.get('https://archive.org/', headers=headers, timeout=20, proxies={'http': '52.192.0.225:8080'}, verify=False) /Users/test/env/lib/python2.7/site-packages/requests/sessions.pyc in get(self, url, *_kwargs) 485 486 kwargs.setdefault('allow_redirects', True) --> 487 return self.request('GET', url, *_kwargs) 488 489 def options(self, url, **kwargs): /Users/test/env/lib/python2.7/site-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json) 473 } 474 send_kwargs.update(settings) --> 475 resp = self.send(prep, **send_kwargs) 476 477 return resp /Users/test/env/lib/python2.7/site-packages/requests/sessions.pyc in send(self, request, *_kwargs) 583 584 # Send the request --> 585 r = adapter.send(request, *_kwargs) 586 587 # Total elapsed time of the request (approximately) /Users/test/env/lib/python2.7/site-packages/requests/adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies) 465 raise ProxyError(e, request=request) 466 --> 467 raise ConnectionError(e, request=request) 468 469 except ClosedPoolError as e: ConnectionError: HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: / 
(Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x105ea5fd0>: Failed to establish a new connection: [Errno 65] No route to host',)) I've tried to use proxies, set down the verify, but that does not work. Thank you
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3354/reactions" }
https://api.github.com/repos/psf/requests/issues/3354/timeline
null
completed
null
null
false
[ "Your proxy configuration appears to be wrong: you've set up a proxy for only HTTP schemed addresses. Do you want to try changing your `req.get` request to be:\n\n``` python\nres = req.get('https://archive.org/', headers=headers, timeout=20, proxies={'http': 'http://52.192.0.225:8080', 'https': 'http://52.192.0.225:8080'})\n```\n", "@Lukasa OK. thank you .\n" ]
https://api.github.com/repos/psf/requests/issues/3353
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3353/labels{/name}
https://api.github.com/repos/psf/requests/issues/3353/comments
https://api.github.com/repos/psf/requests/issues/3353/events
https://github.com/psf/requests/issues/3353
161,044,323
MDU6SXNzdWUxNjEwNDQzMjM=
3,353
get request hangs "in readinto return self._sock.recv_into(b)" without raising exception
{ "avatar_url": "https://avatars.githubusercontent.com/u/8951994?v=4", "events_url": "https://api.github.com/users/FlxVctr/events{/privacy}", "followers_url": "https://api.github.com/users/FlxVctr/followers", "following_url": "https://api.github.com/users/FlxVctr/following{/other_user}", "gists_url": "https://api.github.com/users/FlxVctr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FlxVctr", "id": 8951994, "login": "FlxVctr", "node_id": "MDQ6VXNlcjg5NTE5OTQ=", "organizations_url": "https://api.github.com/users/FlxVctr/orgs", "received_events_url": "https://api.github.com/users/FlxVctr/received_events", "repos_url": "https://api.github.com/users/FlxVctr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FlxVctr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FlxVctr/subscriptions", "type": "User", "url": "https://api.github.com/users/FlxVctr", "user_view_type": "public" }
[]
closed
true
null
[]
null
20
2016-06-18T21:30:49Z
2021-08-29T00:06:38Z
2016-06-19T11:00:05Z
NONE
resolved
Hi, I'm pretty desperately trying to nail down the reason that an url unshortener script is freezing from time to time. I'm using requests with several head and get-requests, such as ``` with requests.Session as s: r = s.get(url, allow_redirects=True, timeout=self.timeout, headers=self.headers, verify=False) ``` where self.timeout are 10 seconds and the headers just include a user-agent. The request is called by a worker in a multiprocessing Pool with 10 workers that get's constantly feeded via Pool.imap() with a list of urls. When keyboard interrupting I get from some of the workers the following Traceback: ``` Traceback (most recent call last): File "---/dataprocessor.py", line 98, in try_unshorten_expanded_urls result = unshorten_expanded_urls(tweet_dataframe_tuple) File "---/dataprocessor.py", line 84, in unshorten_expanded_urls 'unshortened_url': u.unshorten(expanded_urls[i]), File "---/unshorten.py", line 162, in unshorten verify=False) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/sessions.py", line 480, in get return self.request('GET', url, **kwargs) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/sessions.py", line 608, in send r.content File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/models.py", line 737, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/models.py", line 660, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 344, in stream data = self.read(amt=amt, decode_content=decode_content) File 
"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 301, in read data = self._fp.read(amt) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/http/client.py", line 433, in read n = self.readinto(b) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/http/client.py", line 473, in readinto n = self.fp.readinto(b) File "/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/socket.py", line 575, in readinto return self._sock.recv_into(b) KeyboardInterrupt ``` If the function would raise an exception, I've made sure that it would be catched within the function so that it would not block the pool. But as it seems, `readinto` is just hanging here (for hours). I don't know whether this is related to #3066, as I am not familiar enough with the protocols or requests itself, so please excuse a duplicate if this should be one or if the error is my bad. The freeze happened several times with the same batch of urls now. I tried to find the exact url causing this freeze but did not succeed as the batch contains several thousand, though I've already excluded everything that exceeds a certain content length or does not have text as content type via a head request before the get. I am running requests 2.10.0 with Python 3.5.1 in a miniconda environment installed via pip on an Ubuntu 14.04. However, great library, hope this helps to make it better. Cheers! 
UPDATE: Here the relevant code for the traceback (line 162 in unshorten.py): ``` python def check_size_and_type(request_object, max_content_length): try: length = int(request_object.headers['content-length']) except: length = None try: type = str(request_object.headers['content-type']) except: type = '' if ((length is None or length < max_content_length) and (type.startswith('text/html') or type.startswith('application/xhtml') or type.startswith('text/xml'))): return True else: return False with requests.Session() as s: r = s.head(url, allow_redirects=True, timeout=self.timeout, headers=self.headers, verify=False) if check_size_and_type(r, self.max_content_length): r = s.get(url, allow_redirects=True, timeout=self.timeout, headers=self.headers, verify=False) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/8951994?v=4", "events_url": "https://api.github.com/users/FlxVctr/events{/privacy}", "followers_url": "https://api.github.com/users/FlxVctr/followers", "following_url": "https://api.github.com/users/FlxVctr/following{/other_user}", "gists_url": "https://api.github.com/users/FlxVctr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FlxVctr", "id": 8951994, "login": "FlxVctr", "node_id": "MDQ6VXNlcjg5NTE5OTQ=", "organizations_url": "https://api.github.com/users/FlxVctr/orgs", "received_events_url": "https://api.github.com/users/FlxVctr/received_events", "repos_url": "https://api.github.com/users/FlxVctr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FlxVctr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FlxVctr/subscriptions", "type": "User", "url": "https://api.github.com/users/FlxVctr", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3353/reactions" }
https://api.github.com/repos/psf/requests/issues/3353/timeline
null
completed
null
null
false
[ "Thanks for this report!\n\nWhat worries me is that we're getting stuck even with a timeout. The timeout should prevent any socket call lasting longer than that number of seconds. This suggests two possibilities to me that _aren't_ bugs, as well as the possibility of a bug (though it's unlikely).\n\nFirstly, can you use a print statement to confirm your timeout value at the call site?\n\nSecondly, it's possible that data _is_ slowly being fed to the socket so that each individual socket call takes less than 10s. Can you try waiting to see if this problem resolves itself?\n\nThirdly is the risk of bugs. For that I ideally need a reproducible script I can run in my own environment. Do you think you can produce one?\n", "Thanks for the quick answer!\n\nI can confirm via PyCharm Debugger and print statements that the timeout is 10 seconds for every call.\n\nIt could be that data is slowly being fed, but it hanged for more than 8 hours the last time and `max_content_length` is 500kB. Waiting for such a long time (especially if it happens over the weekend) leads to quite a bunch of data piling up to be processed afterwards and is not available for live analysis. \n\nThis happened already for several datasets we're collecting.\n\nI will try to write something, but to make it reproducible I'd have to narrow down a smaller batch of urls causing it. This could take a while.\n", "Yeah, that's fine, there's no rush, but I'm unlikely to be able to discover this without a good repro scenario: it's likely that this is a weird interaction around sockets, forking, and the standard library.\n", "Ok, I think I found the url. But it's no problem with requests then. 
Rather a misconfigured webserver ...\n\n``` python\n>>> r = requests.head('http://bluepoint-radio.de:9080/')\n>>> r.headers\n{'Content-Type': 'text/html'}\n```\n\n😡 \n\nThanks for your help though.\n", "Yes, reproduces the error:\n\n``` python\n>>> r = requests.get('http://bluepoint-radio.de:9080/', timeout=5)\n^CTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/api.py\", line 67, in get\n :rtype: requests.Response\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/api.py\", line 53, in request\n # By using the 'with' statement we are sure the session is closed, thus we\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/sessions.py\", line 468, in request\n\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/sessions.py\", line 608, in send\n # Shuffle things around if there's history.\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/models.py\", line 737, in content\n\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/models.py\", line 660, in generate\n def generate():\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 344, in stream\n :param decode_content:\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 301, in read\n data = None\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/http/client.py\", line 433, in read\n n = self.readinto(b)\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/http/client.py\", line 473, in readinto\n n = self.fp.readinto(b)\n File \"/home/ubuntu/miniconda3/envs/TQ/lib/python3.5/socket.py\", line 575, in readinto\n return self._sock.recv_into(b)\nKeyboardInterrupt\n>>> \n```\n", "Just for reference if somebody is looking this up:\n\nUse [body content 
workflow](http://docs.python-requests.org/en/master/user/advanced/#body-content-workflow) instead of head requests to make content retrieval conditional. Headers from head requests seem to be unreliable.\n", "I have a very similar problem (if not the same): an application is randomly hanging (every ~10 to 20 days, roughly) and I connected `gdb` to a hung session today. `py-list` shows\n\n```\n(gdb) py-list\n 366 self._checkReadable()\n 367 if self._timeout_occurred:\n 368 raise OSError(\"cannot read from timed out object\")\n 369 while True:\n 370 try:\n>371 return self._sock.recv_into(b)\n 372 except timeout:\n 373 self._timeout_occurred = True\n 374 raise\n 375 except InterruptedError:\n 376 continue\n```\n\nit just hangs here, apparently (I have no experience with `gdb` but when I disconnect and reconnect, `py-list` brings me to the same place above).\n\n@FlxVctr : thanks for the tip but I do not use head requests (nor body content workflow)\n", "That hang point seems to be in the standard library. Do you have a full Python traceback?\n", "@Lukasa : no, my program hung (without any traceback) and I was wondering what could have caused the hanging. I just did a `gdb python3 <pid of my script>` and the only valuable thing I got was the pitput of `py-list`. I just rebooted my machine (it is a RPi) because I needed the script to run but if there is something i can do with the code to make it more talkative please let me know.\n", "gdb should be capable of printing a Python traceback.\n", "Ah, I still have what I tried on the terminal. 
Yes, there is indeed a traceback:\n\n```\n(gdb) py-bt\nTraceback (most recent call first):\n File \"/usr/lib/python3.4/socket.py\", line 371, in readinto\n return self._sock.recv_into(b)\n File \"/usr/lib/python3.4/wsgiref/simple_server.py\", line 118, in handle\n self.raw_requestline = self.rfile.readline(65537)\n File \"/usr/lib/python3.4/socketserver.py\", line 669, in __init__\n self.handle()\n File \"/usr/lib/python3.4/socketserver.py\", line 344, in finish_request\n self.RequestHandlerClass(request, client_address, self)\n File \"/usr/lib/python3.4/socketserver.py\", line 331, in process_request\n self.finish_request(request, client_address)\n File \"/usr/lib/python3.4/socketserver.py\", line 305, in _handle_request_noblock\n self.process_request(request, client_address)\n File \"/usr/lib/python3.4/socketserver.py\", line 238, in serve_forever\n self._handle_request_noblock()\n File \"/usr/lib/python3/dist-packages/bottle.py\", line 2769, in run\n srv.serve_forever()\n File \"/usr/lib/python3/dist-packages/bottle.py\", line 3114, in run\n server.run(app)\n File \"webserver.py\", line 510, in __init__\n File \"webserver.py\", line 547, in <module>\n```\n", "@wsw70 I should note that requests is not in that stack. 😉 \n", "Yes, I googled the question and saw references to `urllib*`, `socket` etc. - all of them were quite cryptic. I just mentioned in this thread that I have the same issue and that it is has indeed something to do with the standard lib.\n\nDid I mention that **requests is awesome**? (and I really mean it, it is an opportunity to voice this out since the core dev is listening :) )\n", "Yup. =P So that hang seems to be related to a TCP connection falling over. You'll have to take that up with the bottle devs. =D And I'm glad you like Requests!\n", "Having the same issue here. with timeout without it doesn't make any difference. 
It is hanging in the `socket` module `self._sock.recv_into(b)` \r\n\r\n python3 \"/home/devilfromir/code/src/utilities/deckdl/ddtokenizer.py\"\r\n^CTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/packages/urllib3/connectionpool.py\r\n\", line 379, in _make_request\r\n httplib_response = conn.getresponse(buffering=True)\r\nTypeError: getresponse() got an unexpected keyword argument 'buffering'\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/home/devilfromir/code/src/utilities/deckdl/ddtokenizer.py\", line 63, in <module>\r\n ip = anon.IP()\r\n File \"/home/devilfromir/code/src/utilities/deckdl/anon.py\", line 26, in __init__\r\n self.check(withTor=True)\r\n File \"/home/devilfromir/code/src/utilities/deckdl/anon.py\", line 37, in check\r\n res = req.get('http://www.icanhazip.com', headers=headerGen(''))\r\n File \"/home/devilfromir/code/src/utilities/deckdl/anon.py\", line 87, in torReq\r\n r = getattr(requests, attrName)(self.url, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/api.py\", line 70, in get\r\n return request('get', url, params=params, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/api.py\", line 56, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/sessions.py\", line 609, in send r = adapter.send(request, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/packages/urllib3/connectionpool.py\r\n\", line 600, in urlopen\r\n chunked=chunked)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests/packages/urllib3/connectionpool.py\r\n\", line 
382, in _make_request\r\n httplib_response = conn.getresponse()\r\n File \"/usr/lib/python3.6/http/client.py\", line 1331, in getresponse\r\n response.begin()\r\n File \"/usr/lib/python3.6/http/client.py\", line 297, in begin\r\n version, status, reason = self._read_status()\r\n File \"/usr/lib/python3.6/http/client.py\", line 258, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File \"/usr/lib/python3.6/socket.py\", line 586, in readinto\r\n return self._sock.recv_into(b)\r\nKeyboardInterrupt\r\n\r\nThis happens when i tunnel through tor using privoxy as http proxy", "遇到了同样的问题,进程在return self._sock.recv_into(b)这里卡住了", "I had the same problem in windows,hangs in `self._sock.recv_into(b)` .Maybe once in a dozen days.I use multiprocessing and requests.But when I switch to threading, It don't hangs in the socket.This may be a complicated problem.", "I'm having a similar issue.\r\nMy application hangs in `self._sock.recv_into(b)` without throwing any exception. I have configured `pool_manager.connection_pool_kw[\"timeout\"]` with a small timeout.\r\n\r\nI tried to trace the problem by including the `hanging_threads` package in the module where the hang happens. The hang happens in a client stub generated by swagger codegen from an openapi schema.\r\n\r\nThis is how I managed to get the traceback:\r\n\r\n`from hanging_threads import start_monitoring\r\nmonitoring_thread = start_monitoring()`\r\n\r\nThe issue seems related to the fact that I was issuing an http request to the same resource from whom I was called, in the same thread and before issuing a response to the caller.\r\nMoving the call to another thread solves the problem. 
It probably has something to do with the underlying socket but I need to investigate further.\r\n\r\n\r\nThe following is the traceback of the hang\r\n\r\n--------Thread 139917216446208 \"Thread-12\" hangs ----------\r\nFile \"/p37/threading.py\", line 890, in _bootstrap\r\n self._bootstrap_inner()\r\nFile \"/p37/threading.py\", line 926, in _bootstrap_inner\r\n self.run()\r\nFile \"/p37/threading.py\", line 870, in run\r\n self._target(*self._args, **self._kwargs)\r\nFile \"/p37/socketserver.py\", line 650, in process_request_thread\r\n self.finish_request(request, client_address)\r\nFile \"/p37/socketserver.py\", line 360, in finish_request\r\n self.RequestHandlerClass(request, client_address, self)\r\nFile \"/p37/socketserver.py\", line 720, in __init__\r\n self.handle()\r\nFile \"/p37/site-packages/werkzeug/serving.py\", line 345, in handle\r\n BaseHTTPRequestHandler.handle(self)\r\nFile \"/p37/http/server.py\", line 426, in handle\r\n self.handle_one_request()\r\nFile \"/p37/site-packages/werkzeug/serving.py\", line 379, in handle_one_request\r\n return self.run_wsgi()\r\nFile \"/p37/site-packages/werkzeug/serving.py\", line 323, in run_wsgi\r\n execute(self.server.app)\r\nFile \"/p37/site-packages/werkzeug/serving.py\", line 312, in execute\r\n application_iter = app(environ, start_response)\r\nFile \"/p37/site-packages/flask/cli.py\", line 337, in __call__\r\n return self._app(environ, start_response)\r\nFile \"/p37/site-packages/flask/app.py\", line 2464, in __call__\r\n return self.wsgi_app(environ, start_response)\r\nFile \"/p37/site-packages/flask/app.py\", line 2447, in wsgi_app\r\n response = self.full_dispatch_request()\r\nFile \"/p37/site-packages/flask/app.py\", line 1950, in full_dispatch_request\r\n rv = self.dispatch_request()\r\nFile \"/p37/site-packages/flask/app.py\", line 1936, in dispatch_request\r\n return self.view_functions[rule.endpoint](**req.view_args)\r\nFile \"/p37/site-packages/my_app/__init__.py\", line 49, in ast\r\n 
STORYBOARD_FSM.ast_request()\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 123, in ast_request\r\n self._state.ast()\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 231, in ast\r\n self.context.transition_to(Reading())\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 116, in transition_to\r\n self._state.do()\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 184, in do\r\n self.context.transition_to(RunningLookAt())\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 116, in transition_to\r\n self._state.do()\r\nFile \"/p37/site-packages/my_app/my_app.py\", line 245, in do\r\n embs.call_look_at_api(point)\r\nFile \"/p37/site-packages/my_app/my_service.py\", line 62, in call_look_at_api\r\n api_instance.look_at_post(body)\r\nFile \"/swagger_client/api/idle_api.py\", line 148, in look_at_post\r\n (data) = self.look_at_post_with_http_info(body, **kwargs) # noqa: E501\r\nFile \"/swagger_client/api/idle_api.py\", line 221, in look_at_post_with_http_info\r\n collection_formats=collection_formats)\r\nFile \"/swagger_client/api_client.py\", line 321, in call_api\r\n _preload_content, _request_timeout)\r\nFile \"/swagger_client/api_client.py\", line 152, in __call_api\r\n _request_timeout=_request_timeout)\r\nFile \"/swagger_client/api_client.py\", line 364, in request\r\n body=body)\r\nFile \"/swagger_client/rest.py\", line 274, in POST\r\n body=body)\r\nFile \"/swagger_client/rest.py\", line 166, in request\r\n headers=headers)\r\nFile \"/p37/site-packages/urllib3/request.py\", line 80, in request\r\n method, url, fields=fields, headers=headers, **urlopen_kw\r\nFile \"/p37/site-packages/urllib3/request.py\", line 171, in request_encode_body\r\n return self.urlopen(method, url, **extra_kw)\r\nFile \"/p37/site-packages/urllib3/poolmanager.py\", line 336, in urlopen\r\n response = conn.urlopen(method, u.request_uri, **kw)\r\nFile \"/p37/site-packages/urllib3/connectionpool.py\", line 677, in urlopen\r\n chunked=chunked,\r\nFile 
\"/p37/site-packages/urllib3/connectionpool.py\", line 421, in _make_request\r\n httplib_response = conn.getresponse()\r\nFile \"/p37/http/client.py\", line 1344, in getresponse\r\n response.begin()\r\nFile \"/p37/http/client.py\", line 306, in begin\r\n version, status, reason = self._read_status()\r\nFile \"/p37/http/client.py\", line 267, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\nFile \"/p37/socket.py\", line 589, in readinto\r\n return self._sock.recv_into(b)\r\n--------Thread 139917208053504 \"Thread-11\" died ----------\r\n\r\n", "I am also facing similar issue, using Python3.7\r\n\r\nCode:\r\nimport speech_recognition as sr\r\n\r\nr = sr.Recognizer()\r\nwith sr.Microphone() as source:\r\n print(\"Hello World 1\")\r\n audio = r.listen(source)\r\n voice_data = r.recognize_google(audio)\r\n print(voice_data)\r\n\r\nError:- \r\n voice_data = r.recognize_google(audio)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\site-packages\\speech_recognition\\__init__.py\", line 840, in recognize_google\r\n response = urlopen(request, timeout=self.operation_timeout)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 222, in urlopen\r\n return opener.open(url, data, timeout)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 525, in open\r\n response = self._open(req, data)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 543, in _open\r\n '_open', req)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 503, in _call_chain\r\n result = func(*args)\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 1345, in http_open\r\n return self.do_open(http.client.HTTPConnection, req)\r\n File 
\"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\urllib\\request.py\", line 1320, in do_open\r\n r = h.getresponse()\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\http\\client.py\", line 1336, in getresponse\r\n response.begin()\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\http\\client.py\", line 306, in begin\r\n version, status, reason = self._read_status()\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\http\\client.py\", line 267, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File \"C:\\Program Files (x86)\\Microsoft Visual Studio\\Shared\\Python37_64\\lib\\socket.py\", line 589, in readinto\r\n return self._sock.recv_into(b)\r\nConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host", "I'm having the same issue and it turns out that it is the socket getting disconnected. 
It works fine once the TCP keep alive options are set explicitly:\r\n\r\n```\r\n# Set TCP keep alive options to avoid HTTP requests hanging issue\r\n# Reference: https://stackoverflow.com/a/14855726/2360527\r\nimport platform\r\nimport socket\r\nimport urllib3.connection\r\n\r\nplatform_name = platform.system()\r\norig_connect = urllib3.connection.HTTPConnection.connect\r\ndef patch_connect(self):\r\n orig_connect(self)\r\n if platform_name == \"Linux\" or platform_name == \"Windows\":\r\n self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),\r\n self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 1),\r\n self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 3),\r\n self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5),\r\n elif platform_name == \"Darwin\":\r\n TCP_KEEPALIVE = 0x10\r\n self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)\r\n self.sock.setsockopt(socket.IPPROTO_TCP, TCP_KEEPALIVE, 3)\r\nurllib3.connection.HTTPConnection.connect = patch_connect\r\n\r\nimport requests\r\nrequests.get(...)\r\n```\r\n\r\nRequests version: `requests==2.24.0`\r\n\r\nUseful links:\r\n- https://stackoverflow.com/a/14855726/2360527\r\n- https://stackoverflow.com/a/15148308/2360527" ]
https://api.github.com/repos/psf/requests/issues/3342
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3342/labels{/name}
https://api.github.com/repos/psf/requests/issues/3342/comments
https://api.github.com/repos/psf/requests/issues/3342/events
https://github.com/psf/requests/pull/3342
160,953,189
MDExOlB1bGxSZXF1ZXN0NzQyODkxNjQ=
3,342
fix PUT of empty (0 byte) stream
{ "avatar_url": "https://avatars.githubusercontent.com/u/423176?v=4", "events_url": "https://api.github.com/users/nlevitt/events{/privacy}", "followers_url": "https://api.github.com/users/nlevitt/followers", "following_url": "https://api.github.com/users/nlevitt/following{/other_user}", "gists_url": "https://api.github.com/users/nlevitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nlevitt", "id": 423176, "login": "nlevitt", "node_id": "MDQ6VXNlcjQyMzE3Ng==", "organizations_url": "https://api.github.com/users/nlevitt/orgs", "received_events_url": "https://api.github.com/users/nlevitt/received_events", "repos_url": "https://api.github.com/users/nlevitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nlevitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nlevitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nlevitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-06-17T18:56:16Z
2021-09-08T03:01:03Z
2016-06-17T19:41:30Z
NONE
resolved
I found #3051 but I'm offering this anyway. It's a simple fix for what appears to be a simple oversight in the code.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3342/reactions" }
https://api.github.com/repos/psf/requests/issues/3342/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3342.diff", "html_url": "https://github.com/psf/requests/pull/3342", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3342.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3342" }
true
[ "Thanks for this!\n\nThis is not an oversight, it's entirely deliberate. We do this to handle the case that we have a body with indeterminate length (e.g. stdin) which reports its length as zero or throws errors when we try to work out how long it is. In this case, defaulting to transfer encoding chunked is sensible.\n\nAll servers should be able to tolerate this. Have you encountered an actual problem with this?\n", "Thanks for the quick response. I was still looking into the test failures when I saw it.\n\nIt seems like the length should be `None` rather than zero in case of a body with indeterminate length, don't you think? :) \n\nI do have a real world problem with this. The archive.org s3-like api doesn't support chunked uploads and responds to a PUT without content-length with a \"411 Length Required\". \nhttps://archive.org/help/abouts3.txt\n\nFor example:\n\n```\n$ curl -LsSv --request PUT --upload-file /dev/null --header x-archive-meta-collection:test_collection --header \"authorization: LOW xxx:xxx\" https://s3.us.archive.org/nlevitt-test-2016-06-17/\n* Trying 207.241.224.50...\n* Connected to s3.us.archive.org (207.241.224.50) port 443 (#0)\n* TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256\n* Server certificate: *.s3.us.archive.org\n* Server certificate: Go Daddy Secure Certificate Authority - G2\n* Server certificate: Go Daddy Root Certificate Authority - G2\n> PUT /nlevitt-test-2016-06-17/null HTTP/1.1\n> Host: s3.us.archive.org\n> User-Agent: curl/7.43.0\n> Accept: */*\n> Transfer-Encoding: chunked\n> x-archive-meta-collection:test_collection\n> authorization: LOW xxx:xxx\n> Expect: 100-continue\n> \n< HTTP/1.1 411 Length Required\n< Date: Fri, 17 Jun 2016 19:34:15 GMT\n< Server: Apache/2.4.7 (Ubuntu)\n< Content-Length: 320\n< Connection: close\n< Content-Type: text/html; charset=iso-8859-1\n< \n<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n<html><head>\n<title>411 Length Required</title>\n</head><body>\n<h1>Length 
Required</h1>\n<p>A request of the requested method PUT requires a valid Content-length.<br />\n</p>\n<hr>\n<address>Apache/2.4.7 (Ubuntu) Server at s3.us.archive.org Port 80</address>\n</body></html>\n* Closing connection 0\n```\n", "The reason we fall back to zero is a side effect of the way lengths are calculated. If you dive back into bug reports you'll see that we had bugs where we sent bodies with Content-Length: 0: this behaviour is much nicer. =D\n\nArchive.org in this case is wildly violating the HTTP RFCs, but you can force the correct behaviour by using the prepared request flow and setting up the headers and body as you need them. \n", "Ok. Thanks.\n", "The archive.org s3-like API is extremely important, despite the fact that it is poorly implemented. Personally, I have utilized it successfully with Boto (which uses Requests internally, for some things). You may find the use of that library more appropriate. \n\nPreparedRequest is indeed the solution, if not, though.\n" ]
https://api.github.com/repos/psf/requests/issues/3341
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3341/labels{/name}
https://api.github.com/repos/psf/requests/issues/3341/comments
https://api.github.com/repos/psf/requests/issues/3341/events
https://github.com/psf/requests/issues/3341
160,872,751
MDU6SXNzdWUxNjA4NzI3NTE=
3,341
Setting session.timeout doesn't do anything
{ "avatar_url": "https://avatars.githubusercontent.com/u/921251?v=4", "events_url": "https://api.github.com/users/ikus060/events{/privacy}", "followers_url": "https://api.github.com/users/ikus060/followers", "following_url": "https://api.github.com/users/ikus060/following{/other_user}", "gists_url": "https://api.github.com/users/ikus060/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ikus060", "id": 921251, "login": "ikus060", "node_id": "MDQ6VXNlcjkyMTI1MQ==", "organizations_url": "https://api.github.com/users/ikus060/orgs", "received_events_url": "https://api.github.com/users/ikus060/received_events", "repos_url": "https://api.github.com/users/ikus060/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ikus060/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ikus060/subscriptions", "type": "User", "url": "https://api.github.com/users/ikus060", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2016-06-17T11:59:21Z
2018-07-02T14:53:32Z
2016-06-17T12:10:50Z
NONE
resolved
I was using an old version of requests (I don't know the exact version). When settings the `session.timeout` it was affecting the timeout value during a call to `session.get()`. With recent version of requests, this feature doesn't seams to work. I'm currently using version 2.10.0 Basically, I'm expecting this: ``` session = requests.session() session.timeout = 0.0001 session.get('http://localhost/') ``` To have the same behavior as: ``` requests.get('http://localhost/', timeout=0.0001) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3341/reactions" }
https://api.github.com/repos/psf/requests/issues/3341/timeline
null
completed
null
null
false
[ "This behaviour was deliberately changed: see #2856, and #2011 which have discussion on them.\n", "And to be clear, that version must have been prior to 1.0 when this behaviour was removed.\n", "After reading both issues, I still don't understand why it's not supported. What is the rational behind this decision ?\n", "Also, whats it the alternative to avoid setting the timeout value for each and every get() request ?\n", "@ikus The rationale is that timeouts are not of-a-kind with the things that are set on `Sessions`. They represent a property of the transport layer and don't semantically apply to a _session_: how does one \"time out\" a session? In general they are a per-request property, and defaulting them is a property of the transport layer, which is where [Transport Adapters](http://docs.python-requests.org/en/master/user/advanced/#transport-adapters) are used to store configuration.\n\n[This comment](https://github.com/kennethreitz/requests/issues/2011#issuecomment-64440818) provides an option for defaulting the value.\n", "I will not argue to much about this decision. But I think it goes against the \"Python HTTP Requests for Humans\". In order to make sure every GET requests does have a good timeout: (a) I need to review all the code to add `timeout` as an argument to each call of `get()` or (b) I need to monkey patch the default `HTTPAdapter` and provide a default timeout of my own. 
\n\nDefining the timeout value at the session level would be more human even if the property is not strictly related to the session it self.\n", "@ikus060 You do not need to monkeypatch the default HTTPAdapter: just install the correct adapters in at the point where you construct the session.\n", "If we are talking of this code:\n\n``` python\nclass MyHTTPAdapter(requests.adapters.HTTPAdapter):\n def __init__(self, timeout=None, *args, **kwargs):\n self.timeout = timeout\n super(MyHTTPAdapter, self).__init__(*args, **kwargs)\n\n def send(self, *args, **kwargs):\n kwargs['timeout'] = self.timeout\n return super(MyHTTPAdapter, self).send(*args, **kwargs)\n```\n\nIt look like we are monkeypatching the `HTTPAdapter.send()` method to provide a default timeout value. At least, would be nice to have an HTTPAdapter constructor accepting a `timeout` value to avoid defining our own class.\n\ne.g.:\n\n``` python\ns = requests.Session()\ns.mount(\"http://\", requests.adapters.HTTPAdapter(timeout=10))\n```\n", "@ikus060 You're not monkeypatching, you're subclassing. This is the entirely expected way to interact with Transport Adapters, and it's done throughout the requests community, as you can [see here](https://github.com/sigmavirus24/requests-toolbelt/tree/master/requests_toolbelt/adapters).\n\nIt's not clear to me that a huge amount is gained here by defaulting this. A simpler override, if you're always using the same timeout, is just:\n\n``` python\nclass MyHTTPAdapter(requests.adapters.HTTPAdapter):\n def send(self, *args, **kwargs):\n kwargs['timeout'] = 10\n return super(MyHTTPAdapter, self).send(*args, **kwargs)\n```\n\nAt this point we're arguing about saving you 5 LoC. 
I'm _open_ to having the `HTTPAdapter` have a timeout default on the object, certainly, but it's not an immediately obvious win.\n", "I would totally agree with @ikus060 that:\r\n\r\n``` python\r\ns = requests.Session()\r\ns.mount(\"http://\", requests.adapters.HTTPAdapter(timeout=10))\r\ns.mount(\"https://\", requests.adapters.HTTPAdapter(timeout=10))\r\n```\r\n\r\nWould be a vastly simpler way for humans to set a timeout, and should be supported. It doesn't require people to understand subclassing and understand the HTTPAdapter.send method to override it.\r\n\r\nSince it's easy to support, could it be supported?\r\n\r\nEven nicer would be something like:\r\n\r\n``` python\r\ns = requests.Session(\r\n transport_adapters={\r\n ('http://', 'https://'): HTTPAdapter(timeout=10)\r\n }\r\n)\r\n```\r\n\r\nBut I guess that's a longer discussion.\r\n\r\n(It would also be nice if you didn't *always* have to mount *both* `http://` and `https://`, but that's a different topic)." ]
https://api.github.com/repos/psf/requests/issues/3340
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3340/labels{/name}
https://api.github.com/repos/psf/requests/issues/3340/comments
https://api.github.com/repos/psf/requests/issues/3340/events
https://github.com/psf/requests/issues/3340
160,807,249
MDU6SXNzdWUxNjA4MDcyNDk=
3,340
use the paypal in mac os x has the problem about sslv3
{ "avatar_url": "https://avatars.githubusercontent.com/u/13331694?v=4", "events_url": "https://api.github.com/users/iLaus/events{/privacy}", "followers_url": "https://api.github.com/users/iLaus/followers", "following_url": "https://api.github.com/users/iLaus/following{/other_user}", "gists_url": "https://api.github.com/users/iLaus/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/iLaus", "id": 13331694, "login": "iLaus", "node_id": "MDQ6VXNlcjEzMzMxNjk0", "organizations_url": "https://api.github.com/users/iLaus/orgs", "received_events_url": "https://api.github.com/users/iLaus/received_events", "repos_url": "https://api.github.com/users/iLaus/repos", "site_admin": false, "starred_url": "https://api.github.com/users/iLaus/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iLaus/subscriptions", "type": "User", "url": "https://api.github.com/users/iLaus", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-06-17T03:53:00Z
2021-09-08T17:05:37Z
2016-06-17T07:38:00Z
NONE
resolved
requests.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:600)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3340/reactions" }
https://api.github.com/repos/psf/requests/issues/3340/timeline
null
completed
null
null
false
[ "I have change the terminal's openssl version,\n\nkingwangdeMini:icampuslist_web kingwang$ openssl version\nOpenSSL 1.0.2d 9 Jul 2015\n\nbut my python's openssl version is still old version:\n\nkingwangdeMini:icampuslist_web kingwang$ python -c \"import ssl; print(ssl.OPENSSL_VERSION)\"\nOpenSSL 0.9.8zg 14 July 2015\n\nI know that [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] 's bug is because of the version error.\n\nI want to know that how to change my python's openssl version~\nthanks!\n", "Your Python's OpenSSL version is what it is compiled against. In this case, for Mac OS X, you'll need to compile yourself a new Python. The easiest way to do that is to use [Homebrew](http://brew.sh/): `brew install python` will provide you with a new Python that has a more modern OpenSSL.\n\nYou can also fix the bug for yourself by running `pip install pyasn1 ndg-httpsclient pyopenssl`, which will also give you a newer OpenSSL that only Requests can use.\n", "This is my first use github to solve issue.\n@Lukasa \nThank you! And my bug solved now.\n\nI use the pyenv to run the python. and I uninstall the python, then install again. It's working! So happy~~\n\nBy the way, I tried the second method firstly, but It not working. Then I tried the first one.I didn't know the reason. Maybe I had already installed that packages. \n\nFinally, Thanks very much!\n" ]
https://api.github.com/repos/psf/requests/issues/3339
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3339/labels{/name}
https://api.github.com/repos/psf/requests/issues/3339/comments
https://api.github.com/repos/psf/requests/issues/3339/events
https://github.com/psf/requests/pull/3339
160,747,845
MDExOlB1bGxSZXF1ZXN0NzQxNDI1NjQ=
3,339
Use seek from end rather than getvalue
{ "avatar_url": "https://avatars.githubusercontent.com/u/296164?v=4", "events_url": "https://api.github.com/users/jseabold/events{/privacy}", "followers_url": "https://api.github.com/users/jseabold/followers", "following_url": "https://api.github.com/users/jseabold/following{/other_user}", "gists_url": "https://api.github.com/users/jseabold/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jseabold", "id": 296164, "login": "jseabold", "node_id": "MDQ6VXNlcjI5NjE2NA==", "organizations_url": "https://api.github.com/users/jseabold/orgs", "received_events_url": "https://api.github.com/users/jseabold/received_events", "repos_url": "https://api.github.com/users/jseabold/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jseabold/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jseabold/subscriptions", "type": "User", "url": "https://api.github.com/users/jseabold", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2016-06-16T20:00:58Z
2021-09-08T03:00:45Z
2016-08-24T18:02:00Z
NONE
resolved
Avoids having the (potentially large) object in memory twice.
{ "avatar_url": "https://avatars.githubusercontent.com/u/296164?v=4", "events_url": "https://api.github.com/users/jseabold/events{/privacy}", "followers_url": "https://api.github.com/users/jseabold/followers", "following_url": "https://api.github.com/users/jseabold/following{/other_user}", "gists_url": "https://api.github.com/users/jseabold/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jseabold", "id": 296164, "login": "jseabold", "node_id": "MDQ6VXNlcjI5NjE2NA==", "organizations_url": "https://api.github.com/users/jseabold/orgs", "received_events_url": "https://api.github.com/users/jseabold/received_events", "repos_url": "https://api.github.com/users/jseabold/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jseabold/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jseabold/subscriptions", "type": "User", "url": "https://api.github.com/users/jseabold", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3339/reactions" }
https://api.github.com/repos/psf/requests/issues/3339/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3339.diff", "html_url": "https://github.com/psf/requests/pull/3339", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3339.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3339" }
true
[ "Thanks for this patch!\n\nAs composed, however, this patch doesn't make sense. The code block tests for `getvalue()` but then uses two unrelated methods.\n\nIt might be better to have a separate block that tests for `seek()` and `tell()` and then use those in that block, considering that a higher priority than `getvalue()`. Does that sound sensible?\n", "`seek` and `tell` are necessary but not sufficient conditions for it being a BytesIO object, no? That would also capture the file-objects that are the last thing checked for and use the filesystem.\n\nThe whole thing strikes me as a bit funny, but I didn't have enough context here to make wholesale changes. Why not just use `isinstance` be explicit about what's what? Python 2/3 issues? `getvalue` is a sufficient condition for BytesIO though.\n\nThoughts?\n", "The main reason not to use `isinstance` is because we want to support duck-typed objects: things that behave _like_ a BytesIO object but are not BytesIO objects, for example. This is particularly important with file-like objects, as there is _technically_ a protocol to which they conform but _in practice_ many file-like objects support only part of the API.\n\nIn this case, we may be able to resolve the problem by changing the logic. Rather than have one gigantic `if...elif` chain, we could change it to be more like this:\n\n``` python\ntotal_length = None\nif x:\n total_length = whatever()\n\nif total_length is None and y:\n total_length = whatever2()\n\nif total_length is None and z:\n total_length = whatever3()\n```\n\nThat would allow us to re-order things to do the `fileno` block _first_ and then, if we drop out of the `fileno` block, fall back on seek() and tell(). How does that sound?\n", "Makes sense. Let me give it another go.\n", "See what you think about this. 
Added some code comments about why it's implemented this way.\n", "I suppose we could run into a case where `current_position` becomes None in the 'tell' block, but I'm not sure under what circumstances.\n", "Well, looks like that happens on Python 2.\n", "🍝 \n", "I'm not going to have time to see this through. Closing in favor of other intrepid souls. Happy to have a look at subsequent work.\n", "Thanks for that @jseabold, it's helpful when people come back to tidy up when they no longer have the time to dedicate to the patch. All the best with whatever you're spending your time on, and I hope you hang around!\n", "Thanks for contributing to Requests @jseabold!\n" ]
https://api.github.com/repos/psf/requests/issues/3338
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3338/labels{/name}
https://api.github.com/repos/psf/requests/issues/3338/comments
https://api.github.com/repos/psf/requests/issues/3338/events
https://github.com/psf/requests/pull/3338
160,706,362
MDExOlB1bGxSZXF1ZXN0NzQxMTMwNTc=
3,338
Refactor prepare body
{ "avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4", "events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}", "followers_url": "https://api.github.com/users/davidsoncasey/followers", "following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}", "gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidsoncasey", "id": 3794108, "login": "davidsoncasey", "node_id": "MDQ6VXNlcjM3OTQxMDg=", "organizations_url": "https://api.github.com/users/davidsoncasey/orgs", "received_events_url": "https://api.github.com/users/davidsoncasey/received_events", "repos_url": "https://api.github.com/users/davidsoncasey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions", "type": "User", "url": "https://api.github.com/users/davidsoncasey", "user_view_type": "public" }
[ { "color": "d4c5f9", "default": false, "description": null, "id": 536793543, "name": "needs rebase", "node_id": "MDU6TGFiZWw1MzY3OTM1NDM=", "url": "https://api.github.com/repos/psf/requests/labels/needs%20rebase" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
9
2016-06-16T16:38:33Z
2021-09-07T00:06:30Z
2017-03-01T17:25:30Z
NONE
resolved
PR for proposed/3.0 with refactoring of `prepare_body` method. For discussion see #3184.
{ "avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4", "events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}", "followers_url": "https://api.github.com/users/davidsoncasey/followers", "following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}", "gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidsoncasey", "id": 3794108, "login": "davidsoncasey", "node_id": "MDQ6VXNlcjM3OTQxMDg=", "organizations_url": "https://api.github.com/users/davidsoncasey/orgs", "received_events_url": "https://api.github.com/users/davidsoncasey/received_events", "repos_url": "https://api.github.com/users/davidsoncasey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions", "type": "User", "url": "https://api.github.com/users/davidsoncasey", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3338/reactions" }
https://api.github.com/repos/psf/requests/issues/3338/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3338.diff", "html_url": "https://github.com/psf/requests/pull/3338", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3338.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3338" }
true
[ "I can take a look tonight and see if I can fix the conflict.\n", "@Lukasa @nateprewitt I sort of forgot that this PR has been hanging out - is this still something we'd like to merge in? I can see about getting the conflicts fixed. Sorry I sort of lost momentum working on requests, I switched jobs and don't get time at work for open source projects any more.\n", "@davidsoncasey, no worries, thanks for taking the time to check back in! 😀\n\n @Lukasa may have more on this but I think moving [these tests](https://github.com/davidsoncasey/requests/blob/60e0349ce265378b127ae51e1853533b57eaac38/tests/test_requests.py#L1241-L1270) into a separate PR against master will be the most immediate benefit. This will show that the issue is currently fixed on master and allow us to close out #3066.\n\nThe work you did for `prepare_body` and `prepare_content_length` simplifies a lot of the logic and would be great for proposed/3.0.0. We'll need to merge master into the proposed/3.0.0 branch in this repo and then have you rebase your commits on top of it. I did some local testing and got this patch merged and working relatively pain free.\n\nI'm also happy to help with any of the reshuffling of your commits if needed, just let us know.\n", "Hey @davidsoncasey, just checking back in. If you don't have any qualms with this, I'd like to open a PR rebasing 1a01007 and lines 1264-1270 in tests/test_requests.py (e5f0993) onto master next week. This will verify the work for #3066 is finished there and then we can revisit the `prepare_body` consolidation here whenever you're ready.", "@davidsoncasey, checking in again. It looks like @kennethreitz may want to start cleaning some of these older PRs up. There's still a lot of useful stuff in here that didn't make it into master but should be in 3.0.0. Would you be interested in bringing this branch up to date and fixing a few things? 
If not, I'll tidy up your commits and wrap this up.", "needs a rebase", "@nateprewitt feel free to go ahead and tidy up and get this into master. I haven't had a chance to work on this project recently, and it would take me a little to get back up to speed on it.", "Thanks for checking back in @davidsoncasey, I'll move forward with rebasing this into a new PR then. All the best in whatever you're currently working on :)", "Closing in favor of #3897 " ]
https://api.github.com/repos/psf/requests/issues/3337
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3337/labels{/name}
https://api.github.com/repos/psf/requests/issues/3337/comments
https://api.github.com/repos/psf/requests/issues/3337/events
https://github.com/psf/requests/issues/3337
160,415,540
MDU6SXNzdWUxNjA0MTU1NDA=
3,337
SSLError: [Errno bad handshake] (-1, 'Unexpected EOF')
{ "avatar_url": "https://avatars.githubusercontent.com/u/15658015?v=4", "events_url": "https://api.github.com/users/PabloLefort/events{/privacy}", "followers_url": "https://api.github.com/users/PabloLefort/followers", "following_url": "https://api.github.com/users/PabloLefort/following{/other_user}", "gists_url": "https://api.github.com/users/PabloLefort/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/PabloLefort", "id": 15658015, "login": "PabloLefort", "node_id": "MDQ6VXNlcjE1NjU4MDE1", "organizations_url": "https://api.github.com/users/PabloLefort/orgs", "received_events_url": "https://api.github.com/users/PabloLefort/received_events", "repos_url": "https://api.github.com/users/PabloLefort/repos", "site_admin": false, "starred_url": "https://api.github.com/users/PabloLefort/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/PabloLefort/subscriptions", "type": "User", "url": "https://api.github.com/users/PabloLefort", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2016-06-15T12:58:03Z
2021-09-08T08:00:40Z
2016-06-27T14:42:02Z
NONE
resolved
Hi, i have a service for send email through SMTP with [Mandrill](https://mandrillapp.com). ``` python # Service from django.core.mail import EmailMessage e = EmailMessage('test subject', 'test message', '[email protected]', ['[email protected]']) e.send() ``` Throws this exception trace: ``` python Traceback (most recent call last): File "<input>", line 1, in <module> File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/django/core/mail/message.py", line 286, in send return self.get_connection(fail_silently).send_messages([self]) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/djrill/mail/backends/djrill.py", line 68, in send_messages sent = self._send(message) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/djrill/mail/backends/djrill.py", line 107, in _send response = requests.post(api_url, data=json.dumps(api_params, cls=JSONDateUTCEncoder)) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/requests/api.py", line 109, in post return request('post', url, data=data, json=json, **kwargs) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/requests/api.py", line 50, in request response = session.request(method=method, url=url, **kwargs) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/requests/sessions.py", line 465, in request resp = self.send(prep, **send_kwargs) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/requests/sessions.py", line 573, in send r = adapter.send(request, **kwargs) File "/opt/apps/project/virtualenv/project/lib/python2.7/site-packages/requests/adapters.py", line 431, in send raise SSLError(e, request=request) SSLError: [Errno bad handshake] (-1, 'Unexpected EOF') ``` Env: Red Hat 4.8.2-13, Python: 2.7.5 Libs: ``` Django==1.7.4 requests==2.7.0 djrill==1.3.0 pyOpenSSL==0.14 ``` Any suggestions? Going with [this](http://stackoverflow.com/questions/29504955/send-requests-to-twitter-with-requests-requests-exceptions-sslerror-errno) answer did not change anything. Thanks in advance.
{ "avatar_url": "https://avatars.githubusercontent.com/u/15658015?v=4", "events_url": "https://api.github.com/users/PabloLefort/events{/privacy}", "followers_url": "https://api.github.com/users/PabloLefort/followers", "following_url": "https://api.github.com/users/PabloLefort/following{/other_user}", "gists_url": "https://api.github.com/users/PabloLefort/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/PabloLefort", "id": 15658015, "login": "PabloLefort", "node_id": "MDQ6VXNlcjE1NjU4MDE1", "organizations_url": "https://api.github.com/users/PabloLefort/orgs", "received_events_url": "https://api.github.com/users/PabloLefort/received_events", "repos_url": "https://api.github.com/users/PabloLefort/repos", "site_admin": false, "starred_url": "https://api.github.com/users/PabloLefort/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/PabloLefort/subscriptions", "type": "User", "url": "https://api.github.com/users/PabloLefort", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3337/reactions" }
https://api.github.com/repos/psf/requests/issues/3337/timeline
null
completed
null
null
false
[ "Unexpected EOF in this instance almost certainly means that there is a problem with your TLS handshake, and the server just tore the connection down rather than establish the TLS connection.\n\nFirstly, you should install pyasn1 and ndg-httpsclient in addition to your current libraries: that will resolve your possible SNI issue. Then, if that didn't help, you should let me know what version of OpenSSL is installed in your system.\n", "@Lukasa Already installed those libs, alike, running with --upgrade tells `Requirement already up-to-date`\n", "@PabloLefort And your OpenSSL version, please?\n", "@Lukasa OpenSSl version: `OpenSSL 1.0.1e-fips 11 Feb 2013`\n", "Right, so there's no immediately obvious set of problems there. Are you familiar with Wireshark? If so, it'd be very convenient if you could grab a packet capture of the failing connection.\n", "I'm not familiar but i will try to grab the packet and post the response.\nThanks.\n", "@Lukasa So, after some investigation, installed `tcpdump` to see whats happening with the packets.\nOn my local env the connection was through `TLS`, but in the server first try to connect with `TLS` and fallback to `SSL`. This raise `EOF Exception`.\nGoing foward, there was some firewall closing every connection. I changed it and it works like a charm!\n\nThanks for all.\n", "I would like to note, that problems may be relating on the proxy settings. Unset https_proxy and http_proxy.", "Thanks! @VladislavMesh 🚀 " ]
https://api.github.com/repos/psf/requests/issues/3336
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3336/labels{/name}
https://api.github.com/repos/psf/requests/issues/3336/comments
https://api.github.com/repos/psf/requests/issues/3336/events
https://github.com/psf/requests/pull/3336
160,391,829
MDExOlB1bGxSZXF1ZXN0NzM4ODkzMDY=
3,336
Merge remote-tracking branch 'refs/remotes/kennethreitz/master'
{ "avatar_url": "https://avatars.githubusercontent.com/u/1407528?v=4", "events_url": "https://api.github.com/users/hyokosdeveloper/events{/privacy}", "followers_url": "https://api.github.com/users/hyokosdeveloper/followers", "following_url": "https://api.github.com/users/hyokosdeveloper/following{/other_user}", "gists_url": "https://api.github.com/users/hyokosdeveloper/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hyokosdeveloper", "id": 1407528, "login": "hyokosdeveloper", "node_id": "MDQ6VXNlcjE0MDc1Mjg=", "organizations_url": "https://api.github.com/users/hyokosdeveloper/orgs", "received_events_url": "https://api.github.com/users/hyokosdeveloper/received_events", "repos_url": "https://api.github.com/users/hyokosdeveloper/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hyokosdeveloper/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hyokosdeveloper/subscriptions", "type": "User", "url": "https://api.github.com/users/hyokosdeveloper", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-06-15T10:44:03Z
2021-09-08T03:01:04Z
2016-06-15T10:44:25Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1407528?v=4", "events_url": "https://api.github.com/users/hyokosdeveloper/events{/privacy}", "followers_url": "https://api.github.com/users/hyokosdeveloper/followers", "following_url": "https://api.github.com/users/hyokosdeveloper/following{/other_user}", "gists_url": "https://api.github.com/users/hyokosdeveloper/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hyokosdeveloper", "id": 1407528, "login": "hyokosdeveloper", "node_id": "MDQ6VXNlcjE0MDc1Mjg=", "organizations_url": "https://api.github.com/users/hyokosdeveloper/orgs", "received_events_url": "https://api.github.com/users/hyokosdeveloper/received_events", "repos_url": "https://api.github.com/users/hyokosdeveloper/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hyokosdeveloper/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hyokosdeveloper/subscriptions", "type": "User", "url": "https://api.github.com/users/hyokosdeveloper", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3336/reactions" }
https://api.github.com/repos/psf/requests/issues/3336/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3336.diff", "html_url": "https://github.com/psf/requests/pull/3336", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3336.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3336" }
true
[]
https://api.github.com/repos/psf/requests/issues/3335
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3335/labels{/name}
https://api.github.com/repos/psf/requests/issues/3335/comments
https://api.github.com/repos/psf/requests/issues/3335/events
https://github.com/psf/requests/pull/3335
160,390,960
MDExOlB1bGxSZXF1ZXN0NzM4ODg2NDg=
3,335
Merge remote-tracking branch 'refs/remotes/kennethreitz/master'
{ "avatar_url": "https://avatars.githubusercontent.com/u/1407528?v=4", "events_url": "https://api.github.com/users/hyokosdeveloper/events{/privacy}", "followers_url": "https://api.github.com/users/hyokosdeveloper/followers", "following_url": "https://api.github.com/users/hyokosdeveloper/following{/other_user}", "gists_url": "https://api.github.com/users/hyokosdeveloper/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hyokosdeveloper", "id": 1407528, "login": "hyokosdeveloper", "node_id": "MDQ6VXNlcjE0MDc1Mjg=", "organizations_url": "https://api.github.com/users/hyokosdeveloper/orgs", "received_events_url": "https://api.github.com/users/hyokosdeveloper/received_events", "repos_url": "https://api.github.com/users/hyokosdeveloper/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hyokosdeveloper/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hyokosdeveloper/subscriptions", "type": "User", "url": "https://api.github.com/users/hyokosdeveloper", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-15T10:39:04Z
2016-06-15T10:43:55Z
2016-06-15T10:41:03Z
NONE
null
null
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3335/reactions" }
https://api.github.com/repos/psf/requests/issues/3335/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3335.diff", "html_url": "https://github.com/psf/requests/pull/3335", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3335.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3335" }
true
[ "h\n" ]
https://api.github.com/repos/psf/requests/issues/3334
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3334/labels{/name}
https://api.github.com/repos/psf/requests/issues/3334/comments
https://api.github.com/repos/psf/requests/issues/3334/events
https://github.com/psf/requests/issues/3334
160,260,450
MDU6SXNzdWUxNjAyNjA0NTA=
3,334
500 error when using request.get()
{ "avatar_url": "https://avatars.githubusercontent.com/u/1102722?v=4", "events_url": "https://api.github.com/users/pramud/events{/privacy}", "followers_url": "https://api.github.com/users/pramud/followers", "following_url": "https://api.github.com/users/pramud/following{/other_user}", "gists_url": "https://api.github.com/users/pramud/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pramud", "id": 1102722, "login": "pramud", "node_id": "MDQ6VXNlcjExMDI3MjI=", "organizations_url": "https://api.github.com/users/pramud/orgs", "received_events_url": "https://api.github.com/users/pramud/received_events", "repos_url": "https://api.github.com/users/pramud/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pramud/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pramud/subscriptions", "type": "User", "url": "https://api.github.com/users/pramud", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-06-14T19:10:50Z
2016-06-14T19:27:40Z
2016-06-14T19:10:57Z
NONE
null
I am new to requests and python I'm trying to run the following code but getting 500 Server Error. import requests url = 'https://developers.zomato.com/api/v2.1/categories' r = requests.get(url,headers={'user-key':'apikey'}) print(r) print(r.content) print(r.headers)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1102722?v=4", "events_url": "https://api.github.com/users/pramud/events{/privacy}", "followers_url": "https://api.github.com/users/pramud/followers", "following_url": "https://api.github.com/users/pramud/following{/other_user}", "gists_url": "https://api.github.com/users/pramud/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pramud", "id": 1102722, "login": "pramud", "node_id": "MDQ6VXNlcjExMDI3MjI=", "organizations_url": "https://api.github.com/users/pramud/orgs", "received_events_url": "https://api.github.com/users/pramud/received_events", "repos_url": "https://api.github.com/users/pramud/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pramud/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pramud/subscriptions", "type": "User", "url": "https://api.github.com/users/pramud", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3334/reactions" }
https://api.github.com/repos/psf/requests/issues/3334/timeline
null
completed
null
null
false
[]
https://api.github.com/repos/psf/requests/issues/3333
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3333/labels{/name}
https://api.github.com/repos/psf/requests/issues/3333/comments
https://api.github.com/repos/psf/requests/issues/3333/events
https://github.com/psf/requests/issues/3333
160,257,771
MDU6SXNzdWUxNjAyNTc3NzE=
3,333
500 error when using requests.get()
{ "avatar_url": "https://avatars.githubusercontent.com/u/1102722?v=4", "events_url": "https://api.github.com/users/pramud/events{/privacy}", "followers_url": "https://api.github.com/users/pramud/followers", "following_url": "https://api.github.com/users/pramud/following{/other_user}", "gists_url": "https://api.github.com/users/pramud/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pramud", "id": 1102722, "login": "pramud", "node_id": "MDQ6VXNlcjExMDI3MjI=", "organizations_url": "https://api.github.com/users/pramud/orgs", "received_events_url": "https://api.github.com/users/pramud/received_events", "repos_url": "https://api.github.com/users/pramud/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pramud/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pramud/subscriptions", "type": "User", "url": "https://api.github.com/users/pramud", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-14T18:57:36Z
2016-06-14T19:27:48Z
2016-06-14T19:08:58Z
NONE
null
I am new to requests and python I'm trying to run the following code but getting 500 Server Error. import requests url = 'https://developers.zomato.com/api/v2.1/categories' r = requests.get(url,headers={'user-key':'apikey'}) print(r) print(r.content) print(r.headers)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1102722?v=4", "events_url": "https://api.github.com/users/pramud/events{/privacy}", "followers_url": "https://api.github.com/users/pramud/followers", "following_url": "https://api.github.com/users/pramud/following{/other_user}", "gists_url": "https://api.github.com/users/pramud/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pramud", "id": 1102722, "login": "pramud", "node_id": "MDQ6VXNlcjExMDI3MjI=", "organizations_url": "https://api.github.com/users/pramud/orgs", "received_events_url": "https://api.github.com/users/pramud/received_events", "repos_url": "https://api.github.com/users/pramud/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pramud/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pramud/subscriptions", "type": "User", "url": "https://api.github.com/users/pramud", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3333/reactions" }
https://api.github.com/repos/psf/requests/issues/3333/timeline
null
completed
null
null
false
[ "The issue is with zomato api. Adding the user agent made it work.\n" ]
https://api.github.com/repos/psf/requests/issues/3332
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3332/labels{/name}
https://api.github.com/repos/psf/requests/issues/3332/comments
https://api.github.com/repos/psf/requests/issues/3332/events
https://github.com/psf/requests/issues/3332
159,975,149
MDU6SXNzdWUxNTk5NzUxNDk=
3,332
Downloading gzipped files causes ChunkEncodingErrors, ProtocolErrors, and ConnectionResetErrors on Windows 10
{ "avatar_url": "https://avatars.githubusercontent.com/u/7537841?v=4", "events_url": "https://api.github.com/users/siennathesane/events{/privacy}", "followers_url": "https://api.github.com/users/siennathesane/followers", "following_url": "https://api.github.com/users/siennathesane/following{/other_user}", "gists_url": "https://api.github.com/users/siennathesane/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/siennathesane", "id": 7537841, "login": "siennathesane", "node_id": "MDQ6VXNlcjc1Mzc4NDE=", "organizations_url": "https://api.github.com/users/siennathesane/orgs", "received_events_url": "https://api.github.com/users/siennathesane/received_events", "repos_url": "https://api.github.com/users/siennathesane/repos", "site_admin": false, "starred_url": "https://api.github.com/users/siennathesane/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/siennathesane/subscriptions", "type": "User", "url": "https://api.github.com/users/siennathesane", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-13T15:15:42Z
2021-09-04T00:06:30Z
2016-08-05T07:46:28Z
NONE
resolved
I was trying to download a remote gzipped JSON file from my Win10 box using Python 3.5.1 and 2.7.11, and it kept erroring out. The remote content-encoding header is gzip. I tried using requests 2.9.2 and 2.10.0 on both versions of Python, installed in a virtualenv with pip. I tried the same code on Linux, it doesn't error out at all. I'm not sure if it's requests or Python which causes the problem, so I wanted to start here, since the problem definitely exists specifically on Windows. With this code (chunk_size does not matter): ``` python data = requests.get(url, stream=True) with open(url.rsplit("/")[-1], "wb") as fh: for chunk in data.iter_content(chunk_size=1024): fh.write(chunk) ``` I would inconsistently get this trace back: ``` python Traceback (most recent call last): File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 228, in _error_catcher yield File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 501, in read_chunked chunk = self._handle_chunk(amt) File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 461, in _handle_chunk value = self._fp._safe_read(amt) File "C:\Users\Mike\AppData\Local\Programs\Python\Python35\Lib\http\client.py", line 592, in _safe_read chunk = self.fp.read(min(amt, MAXAMOUNT)) File "C:\Users\Mike\AppData\Local\Programs\Python\Python35\Lib\socket.py", line 575, in readinto return self._sock.recv_into(b) ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host During handling of the above exception, another exception occurred: Traceback (most recent call last): File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\models.py", line 664, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 349, in stream for line in self.read_chunked(amt, decode_content=decode_content): File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 526, in read_chunked self._original_response.close() File "C:\Users\Mike\AppData\Local\Programs\Python\Python35\Lib\contextlib.py", line 77, in __exit__ self.gen.throw(type, value, traceback) File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\packages\urllib3\response.py", line 246, in _error_catcher raise ProtocolError('Connection broken: %r' % e, e) requests.packages.urllib3.exceptions.ProtocolError: ("Connection broken: ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)", ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "H:/Programming/Python/warehouse/main.py", line 55, in <module> compile_auctions(slugs) File "H:/Programming/Python/warehouse/main.py", line 44, in compile_auctions for chunk in data.iter_content(chunk_size=1024): File "H:\Programming\Python\virtualenvs\warehouse\lib\site-packages\requests\models.py", line 667, in generate raise ChunkedEncodingError(e) requests.exceptions.ChunkedEncodingError: ("Connection broken: ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)", ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3332/reactions" }
https://api.github.com/repos/psf/requests/issues/3332/timeline
null
completed
null
null
false
[ "That traceback just indicates that the remote peer decided to stop sending data for no reason that it could communicate to us: we were expecting more data, and we didn't get any. In this case, it's right in the middle of a content chunk.\n\nThat doesn't really give us much to go with. The server could really be doing anything it wants. Do you happen to have access to server logs for this server?\n", "No, sorry, I don't. I'm filing the bug report because this issue doesn't happen on Linux, using the exact same code. It very specifically happens on Windows. I don't know if Requests handles Windows differently than Linux, so I'd like to make the determination of whether this is a bug with Requests or a bug with Python on Windows.\n", "Requests does _very little_ differently on Windows, at least directly. However, there are a number of things that may accidentally differ between Windows and Linux. For example, OpenSSL versions are different, as are zlib versions. Even Python versions are usually different. This makes it extremely difficult to work out, at a surface level, what's going wrong.\n\nAre you using HTTPS?\n", "Also, are you working through proxies on the Windows machine?\n", "Closing for inactivity.\n", "I'm having a similar issue while using request when I run using python3.6 after few mins I get a response saying connection broken. if I run with python2.7 I get the expected output. I was able to get the expected output using postman, using curl but not with python 3.6 or 3.7. \r\n\r\nYes, it's an HTTPS server.\r\n\r\nAny help is appreciated." ]
https://api.github.com/repos/psf/requests/issues/3331
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3331/labels{/name}
https://api.github.com/repos/psf/requests/issues/3331/comments
https://api.github.com/repos/psf/requests/issues/3331/events
https://github.com/psf/requests/issues/3331
159,931,691
MDU6SXNzdWUxNTk5MzE2OTE=
3,331
Scrape issue
{ "avatar_url": "https://avatars.githubusercontent.com/u/399798?v=4", "events_url": "https://api.github.com/users/itranga/events{/privacy}", "followers_url": "https://api.github.com/users/itranga/followers", "following_url": "https://api.github.com/users/itranga/following{/other_user}", "gists_url": "https://api.github.com/users/itranga/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/itranga", "id": 399798, "login": "itranga", "node_id": "MDQ6VXNlcjM5OTc5OA==", "organizations_url": "https://api.github.com/users/itranga/orgs", "received_events_url": "https://api.github.com/users/itranga/received_events", "repos_url": "https://api.github.com/users/itranga/repos", "site_admin": false, "starred_url": "https://api.github.com/users/itranga/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/itranga/subscriptions", "type": "User", "url": "https://api.github.com/users/itranga", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-06-13T11:46:52Z
2021-09-08T17:05:38Z
2016-06-13T11:53:25Z
NONE
resolved
I tried to scrape http://www.nbcnews.com/health/health-news/brain-study-helps-explain-some-veterans-agony-n589916 .Then i got the below issue. 404 Client Error: Not Found for url: http://www.nbcnews.com/health/health-news/brain-study-helps-explain-some-veterans-agony-n589916%20
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3331/reactions" }
https://api.github.com/repos/psf/requests/issues/3331/timeline
null
completed
null
null
false
[ "You left a space at the end of your URL. You need to remove it: requests will not automatically do that for you.\n", "Thanks for your reply. But i did't add any spaces to the url, i also saw that '%20' end of the url. how is it happened?\n", "That %20 is the urlencoding for a space, which is how I determined what happened. Where did your URL come from?\n", "This discussion belongs on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n" ]
https://api.github.com/repos/psf/requests/issues/3300
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3300/labels{/name}
https://api.github.com/repos/psf/requests/issues/3300/comments
https://api.github.com/repos/psf/requests/issues/3300/events
https://github.com/psf/requests/issues/3300
159,759,291
MDU6SXNzdWUxNTk3NTkyOTE=
3,300
Upload stream
{ "avatar_url": "https://avatars.githubusercontent.com/u/9058818?v=4", "events_url": "https://api.github.com/users/ggcatu/events{/privacy}", "followers_url": "https://api.github.com/users/ggcatu/followers", "following_url": "https://api.github.com/users/ggcatu/following{/other_user}", "gists_url": "https://api.github.com/users/ggcatu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ggcatu", "id": 9058818, "login": "ggcatu", "node_id": "MDQ6VXNlcjkwNTg4MTg=", "organizations_url": "https://api.github.com/users/ggcatu/orgs", "received_events_url": "https://api.github.com/users/ggcatu/received_events", "repos_url": "https://api.github.com/users/ggcatu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ggcatu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ggcatu/subscriptions", "type": "User", "url": "https://api.github.com/users/ggcatu", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-11T06:40:53Z
2021-09-08T17:05:38Z
2016-06-14T07:19:53Z
NONE
resolved
Hello guys, I'm on python3 - requests 2.10.0. Im trying to reproduce this: http://docs.python-requests.org/en/master/user/advanced/#chunk-encoding ``` def gen(): yield 'hi' yield 'there' requests.post('http://some.url/chunked', data=gen()) ``` But I get `File "C:\Users\Gabriel\AppData\Local\Programs\Python\Python35\lib\site-packages\requests\models.py", line 152, in _encode_files fdata = fp.read() AttributeError: 'generator' object has no attribute 'read'` Basically I wanna to be able to upload a file, and be able to trace the upload speed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3300/reactions" }
https://api.github.com/repos/psf/requests/issues/3300/timeline
null
completed
null
null
false
[ "If I run the demo code in python3.5, I also get this Exception:\n\n```\nTypeError: a bytes-like object is required, not 'str'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"requests_test.py\", line 10, in <module>\n r = requests.post('http://httpbin.org/post', data=gen())\n File \"/Users/pegasus/Program/py3.5_test/lib/python3.5/site-packages/requests/api.py\", line 111, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"/Users/pegasus/Program/py3.5_test/lib/python3.5/site-packages/requests/api.py\", line 57, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/Users/pegasus/Program/py3.5_test/lib/python3.5/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/pegasus/Program/py3.5_test/lib/python3.5/site-packages/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/pegasus/Program/py3.5_test/lib/python3.5/site-packages/requests/adapters.py\", line 426, in send\n low_conn.send(i)\n File \"/Users/pegasus/.pyenv/versions/3.5.0/lib/python3.5/http/client.py\", line 889, in send\n self.sock.sendall(d)\nTypeError: a bytes-like object is required, not 'str'.\n```\n\nis a bug? Python3 should encode str to bytes.\n", "@PegasusWang Python 3 should _not_ encode `str` to `bytes`. We cannot guess what encoding you might want, and guessing just leads to _really_ subtle bugs where you send a request you think is valid, get weird 400 errors, and then spend hours of your time and ours trying to work out why and it turns out it's because Requests defaulted to using UTF-8 and your server was expecting Latin-1. You, the user, are responsible for emitting bytes from your generator.\n\n@maxibabyx Your sample code simply cannot be right, I'm afraid: or at least, it cannot be triggering the error you've provided. 
The branch of code you're in can only be entered in the following situation:\n1. The value of your `data` argument does not provide the `__iter__` property, or it _does_ but is either a string, list, tuple, or dict from the perspective of `isinstance`.\n2. The `files` keyword argument was also passed.\n\nYour sample code does neither of these things, meaning that when I test it on my machine I do not encounter the bug you have encountered.\n\nCan you confirm for me please that your sample code actually does reproduce your bug, ideally by providing a URL that is publicly reachable for me to test against myself?\n", "You're rigth.\nhttp://image.prntscr.com/image/8f573b147ab14755918cc0f5a3b3ba02.png\n\nI get the error when I use the files keyword.\nHow can I get around ?\n", "We do not support using a generator with the `files` keyword argument. This is in part because there is no value in it: the `files` keyword argument requires reading _all_ the data in to format it appropriately. \n\nAre you sure you need to send multipart/form-encoded data here? If you really do, you'll have to change it to `b''.join(gen())`.\n", "I want to be able to upload in chunks, so my script doesn't get blocked until upload has finished.\n", "If you want to do that then you can't use what Requests does by default. You'll need to use the [streaming multipart data encoder](https://toolbelt.readthedocs.io/en/latest/uploading-data.html#streaming-multipart-data-encoder) from the Requests toolbelt.\n" ]
https://api.github.com/repos/psf/requests/issues/3299
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3299/labels{/name}
https://api.github.com/repos/psf/requests/issues/3299/comments
https://api.github.com/repos/psf/requests/issues/3299/events
https://github.com/psf/requests/issues/3299
159,450,843
MDU6SXNzdWUxNTk0NTA4NDM=
3,299
413 Client Error: Request Entity Too Large on SSL site where curl works
{ "avatar_url": "https://avatars.githubusercontent.com/u/212279?v=4", "events_url": "https://api.github.com/users/eriol/events{/privacy}", "followers_url": "https://api.github.com/users/eriol/followers", "following_url": "https://api.github.com/users/eriol/following{/other_user}", "gists_url": "https://api.github.com/users/eriol/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/eriol", "id": 212279, "login": "eriol", "node_id": "MDQ6VXNlcjIxMjI3OQ==", "organizations_url": "https://api.github.com/users/eriol/orgs", "received_events_url": "https://api.github.com/users/eriol/received_events", "repos_url": "https://api.github.com/users/eriol/repos", "site_admin": false, "starred_url": "https://api.github.com/users/eriol/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eriol/subscriptions", "type": "User", "url": "https://api.github.com/users/eriol", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-09T16:23:45Z
2021-09-08T17:05:39Z
2016-06-09T17:48:16Z
CONTRIBUTOR
resolved
Hello, I would like to bring this issue to your attention: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=801506 The problem shows in requests 2.10.0. This is my test script: ``` python import requests URL = 'https://contributors.debian.org/contributors/test_post' FILE = 'test.json.xz' # FILE = 'test-small.json.xz' # This works if __name__ == '__main__': payload = {'source': 'bugs.debian.org', 'data_compression': 'xz'} with open(FILE, 'rb') as f: r = requests.post(URL, files={'data': f}, data=payload) print(r.text) ``` `test-small.json.xz` is the same of `test.json.xz` only smaller: in my test I used only 10 objs. curl is able to post correctly: ``` curl https://contributors.debian.org/contributors/test_post -F source=bugs.debian.org -F [email protected] -F data_compression=xz ``` github doesn't allow to upload a `.xz` so please take it from Debian BTS: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=801506;filename=test.json.xz;att=1;msg=15 Any suggestions?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3299/reactions" }
https://api.github.com/repos/psf/requests/issues/3299/timeline
null
completed
null
null
false
[ "So after digging around I have checked some things. Requests does seem to be in agreement with itself, at least: the length of the request body it sends does match the length of the body it _actually_ sends, at least according to Python. So that's a good start.\n\nHowever, an interesting problem occurs. If I use mitmproxy to try to spy on the upload, curl starts to get a 413! It weirdly seems like if anything touches Python code it explodes, which seems just _so_ unlikely. On top of that, mitmproxy doesn't seem to be able to load the request/response: it just hangs. That's extremely perplexing.\n", "Yup, even mitmdump sees this problem. What. The. Hell.\n", "Got it.\n\ncurl by default sends the `Expect: 100-Continue` header. Requests does not send this header. That appears to be affecting Apache's decision-making logic here: for large bodies it clearly wants that to be set so that it can validate that the request is actually wanted.\n\nIf you prevent curl from sending that header by using the command `curl https://contributors.debian.org/contributors/test_post -F source=bugs.debian.org -F [email protected] -F data_compression=xz -H \"Expect:\"`, that causes curl to see the 413 as well.\n\nRequests cannot, in its current form, support the 100-Continue response, so there is nothing we can do about this: if you'd like to use requests here you'll have to adjust your Apache configuration appropriately.\n", "@Lukasa many thanks for the investigation!\n", "For the records, [this](https://bz.apache.org/bugzilla/show_bug.cgi?id=39243) is relevant, and they have a rationale and a work-around:\n\n> But you should really design your site to ensure that the first request to a\n> client-cert-protected area is not a POST request with a large body; make it a\n> GET or something. 
Any request body has to be buffered into RAM to handle this\n> case, so represents an opportunity to DoS the server.\n", "I confirm that the workaround works, as long as the get and post happen on the same session:\n\n``` py\ns = requests.Session()\nres = s.get(\"https://example.org\")\nres.raise_for_status()\nres = s.post(\"https://example.org\", **args)\nres.raise_for_status()\n```\n" ]
https://api.github.com/repos/psf/requests/issues/3298
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3298/labels{/name}
https://api.github.com/repos/psf/requests/issues/3298/comments
https://api.github.com/repos/psf/requests/issues/3298/events
https://github.com/psf/requests/pull/3298
159,316,861
MDExOlB1bGxSZXF1ZXN0NzMxNDk4MTE=
3,298
Update a note on AppEngine
{ "avatar_url": "https://avatars.githubusercontent.com/u/185043?v=4", "events_url": "https://api.github.com/users/davidfischer/events{/privacy}", "followers_url": "https://api.github.com/users/davidfischer/followers", "following_url": "https://api.github.com/users/davidfischer/following{/other_user}", "gists_url": "https://api.github.com/users/davidfischer/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidfischer", "id": 185043, "login": "davidfischer", "node_id": "MDQ6VXNlcjE4NTA0Mw==", "organizations_url": "https://api.github.com/users/davidfischer/orgs", "received_events_url": "https://api.github.com/users/davidfischer/received_events", "repos_url": "https://api.github.com/users/davidfischer/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidfischer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidfischer/subscriptions", "type": "User", "url": "https://api.github.com/users/davidfischer", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-09T02:32:39Z
2021-09-08T03:01:05Z
2016-06-09T02:49:06Z
CONTRIBUTOR
resolved
This is simply an update on AppEngine support to point people in the right direction.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3298/reactions" }
https://api.github.com/repos/psf/requests/issues/3298/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3298.diff", "html_url": "https://github.com/psf/requests/pull/3298", "merged_at": "2016-06-09T02:49:06Z", "patch_url": "https://github.com/psf/requests/pull/3298.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3298" }
true
[ "thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/3297
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3297/labels{/name}
https://api.github.com/repos/psf/requests/issues/3297/comments
https://api.github.com/repos/psf/requests/issues/3297/events
https://github.com/psf/requests/pull/3297
159,310,872
MDExOlB1bGxSZXF1ZXN0NzMxNDU3Mjg=
3,297
Note how HTTPErrors are raised
{ "avatar_url": "https://avatars.githubusercontent.com/u/185043?v=4", "events_url": "https://api.github.com/users/davidfischer/events{/privacy}", "followers_url": "https://api.github.com/users/davidfischer/followers", "following_url": "https://api.github.com/users/davidfischer/following{/other_user}", "gists_url": "https://api.github.com/users/davidfischer/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidfischer", "id": 185043, "login": "davidfischer", "node_id": "MDQ6VXNlcjE4NTA0Mw==", "organizations_url": "https://api.github.com/users/davidfischer/orgs", "received_events_url": "https://api.github.com/users/davidfischer/received_events", "repos_url": "https://api.github.com/users/davidfischer/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidfischer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidfischer/subscriptions", "type": "User", "url": "https://api.github.com/users/davidfischer", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-06-09T01:21:28Z
2021-09-08T03:01:05Z
2016-06-09T06:57:44Z
CONTRIBUTOR
resolved
After looking through the code line, `requests.exceptions.HTTPError`s are only raised by `raise_for_status` but the docs made it seem like that error could happen through other means. This PR makes the docs explicit.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3297/reactions" }
https://api.github.com/repos/psf/requests/issues/3297/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3297.diff", "html_url": "https://github.com/psf/requests/pull/3297", "merged_at": "2016-06-09T06:57:44Z", "patch_url": "https://github.com/psf/requests/pull/3297.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3297" }
true
[ "I believe the original text is indeed correct, that error can occur when an HTTPError (not related to status code) is raised. I'm not sure, off the top of my head, when that could be anymore. It's possible that's a remnant from when requests ran on urllib2. \n\nRegardless, the _additions_ to the text are most certainly welcome.\n\nThis PR may be perfect as-is, but will need double checking, given the notes in my first paragraph.\n", "Urllib3 has an exception with the same name, but it is not actually related to `requests.exceptions.HTTPError`. I [searched this repository](https://github.com/kennethreitz/requests/search?utf8=%E2%9C%93&q=httperror) (well, I used grep) for HTTPError and the only place I can see where requests itself uses it is with `raise_for_status`.\n\nI had a discussion with @Lukasa briefly on IRC about this. As far as I can tell, `requests.exceptions.HTTPError` is only used for this purpose. However, if you think you'll use it for other purposes, another alternative is to have something like an `HTTPStatusCodeError` (a subclass of HTTPError) that is specifically for `raise_for_status`.\n", "Okay then, that is, in fact, a remnant from running on urllib2, and your improved wording is welcome. Thanks for the due diligence :)\n", "`HTTPStatusCodeError` would be a better name for this, but it is too late to change it now. Behavior was a bit different when this was originally crafted :)\n" ]
https://api.github.com/repos/psf/requests/issues/3296
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3296/labels{/name}
https://api.github.com/repos/psf/requests/issues/3296/comments
https://api.github.com/repos/psf/requests/issues/3296/events
https://github.com/psf/requests/issues/3296
159,287,397
MDU6SXNzdWUxNTkyODczOTc=
3,296
no_proxy env ignored on 302 redirect
{ "avatar_url": "https://avatars.githubusercontent.com/u/13124500?v=4", "events_url": "https://api.github.com/users/Gaasmann/events{/privacy}", "followers_url": "https://api.github.com/users/Gaasmann/followers", "following_url": "https://api.github.com/users/Gaasmann/following{/other_user}", "gists_url": "https://api.github.com/users/Gaasmann/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Gaasmann", "id": 13124500, "login": "Gaasmann", "node_id": "MDQ6VXNlcjEzMTI0NTAw", "organizations_url": "https://api.github.com/users/Gaasmann/orgs", "received_events_url": "https://api.github.com/users/Gaasmann/received_events", "repos_url": "https://api.github.com/users/Gaasmann/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Gaasmann/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Gaasmann/subscriptions", "type": "User", "url": "https://api.github.com/users/Gaasmann", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
open
false
null
[]
null
7
2016-06-08T22:17:20Z
2016-09-06T00:00:34Z
null
NONE
null
Hello, I have a weird problem using Requests and proxies. I have https_proxy and no_proxy env variable for a domain and it seems the no_proxy is ignored on a HTTP 302 redirect target. What I do is basically: ``` s = requests.Session() r = s.get("https://use-the-proxy.com") ``` The response is 302 Found (Location: https://do-not-use-the-proxy.com) and Requests tries to follow. Here the problem: even if do-not-use-the-proxy.com is present in no_proxy, Requests still use the proxy for the request following a 302(the 302 "target"). If I directly request https://do-not-use-the-proxy.com, the proxy is not used as expected. Thanks, ## Nico
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3296/reactions" }
https://api.github.com/repos/psf/requests/issues/3296/timeline
null
null
null
null
false
[ "What version of requests are you using?\n", "Hello,\nRequests version is 2.7.0.\nThanks\n", "Cool, that's a good spot.\n\nYes, there is a minor bug in `resolve_redirects`. Specifically, while `resolve_redirects` attempts to remove proxy information, it cannot actually tell the difference between a proxy that was passed in via the command-line API or from the session and one that was extracted from the environment. This is because `resolve_redirects` is passed the _computed_ `proxies` argument, not the _user's_ `proxies` argument.\n\nWith the way the code in requests is structured, this is a very difficult problem to solve. One option is to hang the original `proxies` kwarg off the `Request` object: this will allow `rebuild_proxies` to essentially re-calculate the proxies argument. Another option is to suggest that the `NO_PROXY` environment variable overrides the user proxies argument for redirects: this is out of line with what we normally do, so I'm inclined to not do this. A third option is to try to do something wacky with storing the original `proxies` kwarg value and passing it to `Session.send` so that we can pass it to `resolve_redirects`, but that seems kind of nutty.\n\nDoes anyone else have an opinion on how to go about doing this that doesn't suck as much as the three I have just mentioned? @sigmavirus24?\n", "For what it's worth, I'd argue that this is a good indication that `proxies` (and probably `verify` and `cert`) are being handled at the wrong level of abstraction. 
Arguably, the core logic about deciding which proxy to use (and, by analogy, how to work the TLS) belongs more on Transport Adapters than on Sessions: it's a property of the connection, not a property of the HTTP layer.\n\nWe can fix that up, but it requires a breaking change in 3.0.0, which is...less than ideal.\n", "What if Request/PreparedRequest objects had hidden state about session level settings and per-request level settings (which I think you're referring to as \"user\" settings) so that we could distinguish the two? Storing there, means allowing the TransportAdapter to resolve things would be (maybe?) a simpler change.\n", "I don't entirely know, to be honest. I'm uncomfortable with shoving stuff onto the `Request`/`PreparedRequest` object just because that's the convenient thing to do. It feels like fundamentally the wrong abstraction layer. =(\n", "I know. Unfortunately, we've exposed too much of the internals to users to be able to fix this another way. If we had a better way of wrapping all of this up in some sort of context object for a request that is passed to a transport adapter, that would be much better. Sadly, that might break things for people with custom transport adapters. \n" ]
https://api.github.com/repos/psf/requests/issues/3295
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3295/labels{/name}
https://api.github.com/repos/psf/requests/issues/3295/comments
https://api.github.com/repos/psf/requests/issues/3295/events
https://github.com/psf/requests/pull/3295
159,193,157
MDExOlB1bGxSZXF1ZXN0NzMwNjc3ODM=
3,295
Document header ordering caveats.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
2
2016-06-08T15:26:13Z
2021-09-02T00:07:36Z
2016-06-08T16:44:33Z
MEMBER
resolved
Resolves #3096.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3295/reactions" }
https://api.github.com/repos/psf/requests/issues/3295/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3295.diff", "html_url": "https://github.com/psf/requests/pull/3295", "merged_at": "2016-06-08T16:44:33Z", "patch_url": "https://github.com/psf/requests/pull/3295.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3295" }
true
[ "This looks great. Thanks @Lukasa \n", "@0x3c3e, please consult the documentation for Session. The code you’ve supplied is modifying the Session class, not an instance of Session, which is likely your problem. Further questions about usage should be directed towards StackOverflow as per our project policies. Thanks!" ]
https://api.github.com/repos/psf/requests/issues/3294
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3294/labels{/name}
https://api.github.com/repos/psf/requests/issues/3294/comments
https://api.github.com/repos/psf/requests/issues/3294/events
https://github.com/psf/requests/issues/3294
159,121,233
MDU6SXNzdWUxNTkxMjEyMzM=
3,294
Performance issues while working behind a proxy where no proxy is needed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/5635462?v=4", "events_url": "https://api.github.com/users/Tset-Noitamotua/events{/privacy}", "followers_url": "https://api.github.com/users/Tset-Noitamotua/followers", "following_url": "https://api.github.com/users/Tset-Noitamotua/following{/other_user}", "gists_url": "https://api.github.com/users/Tset-Noitamotua/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Tset-Noitamotua", "id": 5635462, "login": "Tset-Noitamotua", "node_id": "MDQ6VXNlcjU2MzU0NjI=", "organizations_url": "https://api.github.com/users/Tset-Noitamotua/orgs", "received_events_url": "https://api.github.com/users/Tset-Noitamotua/received_events", "repos_url": "https://api.github.com/users/Tset-Noitamotua/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Tset-Noitamotua/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Tset-Noitamotua/subscriptions", "type": "User", "url": "https://api.github.com/users/Tset-Noitamotua", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-06-08T09:41:09Z
2021-09-08T17:05:40Z
2016-06-08T09:42:15Z
NONE
resolved
Hi, please have a look at this [issue #116](https://github.com/bulkan/robotframework-requests/issues/116) with the [robotframework-requests](https://github.com/bulkan/robotframework-requests) library, taking particular notice of my [second comment](https://github.com/bulkan/robotframework-requests/issues/116#issuecomment-224537034); it might be related to 'requests'. Cheers Tset
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3294/reactions" }
https://api.github.com/repos/psf/requests/issues/3294/timeline
null
completed
null
null
false
[ "This is a known issue: it's a duplicate of #2988.\n" ]
https://api.github.com/repos/psf/requests/issues/3293
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3293/labels{/name}
https://api.github.com/repos/psf/requests/issues/3293/comments
https://api.github.com/repos/psf/requests/issues/3293/events
https://github.com/psf/requests/pull/3293
158,931,313
MDExOlB1bGxSZXF1ZXN0NzI4ODA5MTY=
3,293
idea add the gitignore
{ "avatar_url": "https://avatars.githubusercontent.com/u/7835577?v=4", "events_url": "https://api.github.com/users/GaussDing/events{/privacy}", "followers_url": "https://api.github.com/users/GaussDing/followers", "following_url": "https://api.github.com/users/GaussDing/following{/other_user}", "gists_url": "https://api.github.com/users/GaussDing/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/GaussDing", "id": 7835577, "login": "GaussDing", "node_id": "MDQ6VXNlcjc4MzU1Nzc=", "organizations_url": "https://api.github.com/users/GaussDing/orgs", "received_events_url": "https://api.github.com/users/GaussDing/received_events", "repos_url": "https://api.github.com/users/GaussDing/repos", "site_admin": false, "starred_url": "https://api.github.com/users/GaussDing/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GaussDing/subscriptions", "type": "User", "url": "https://api.github.com/users/GaussDing", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-06-07T14:03:24Z
2021-09-08T03:01:06Z
2016-06-07T14:04:30Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/7835577?v=4", "events_url": "https://api.github.com/users/GaussDing/events{/privacy}", "followers_url": "https://api.github.com/users/GaussDing/followers", "following_url": "https://api.github.com/users/GaussDing/following{/other_user}", "gists_url": "https://api.github.com/users/GaussDing/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/GaussDing", "id": 7835577, "login": "GaussDing", "node_id": "MDQ6VXNlcjc4MzU1Nzc=", "organizations_url": "https://api.github.com/users/GaussDing/orgs", "received_events_url": "https://api.github.com/users/GaussDing/received_events", "repos_url": "https://api.github.com/users/GaussDing/repos", "site_admin": false, "starred_url": "https://api.github.com/users/GaussDing/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GaussDing/subscriptions", "type": "User", "url": "https://api.github.com/users/GaussDing", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3293/reactions" }
https://api.github.com/repos/psf/requests/issues/3293/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3293.diff", "html_url": "https://github.com/psf/requests/pull/3293", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3293.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3293" }
true
[]
https://api.github.com/repos/psf/requests/issues/3292
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3292/labels{/name}
https://api.github.com/repos/psf/requests/issues/3292/comments
https://api.github.com/repos/psf/requests/issues/3292/events
https://github.com/psf/requests/issues/3292
158,894,079
MDU6SXNzdWUxNTg4OTQwNzk=
3,292
I can not use Requests in Jupyter
{ "avatar_url": "https://avatars.githubusercontent.com/u/19483465?v=4", "events_url": "https://api.github.com/users/hedgeliu/events{/privacy}", "followers_url": "https://api.github.com/users/hedgeliu/followers", "following_url": "https://api.github.com/users/hedgeliu/following{/other_user}", "gists_url": "https://api.github.com/users/hedgeliu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hedgeliu", "id": 19483465, "login": "hedgeliu", "node_id": "MDQ6VXNlcjE5NDgzNDY1", "organizations_url": "https://api.github.com/users/hedgeliu/orgs", "received_events_url": "https://api.github.com/users/hedgeliu/received_events", "repos_url": "https://api.github.com/users/hedgeliu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hedgeliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hedgeliu/subscriptions", "type": "User", "url": "https://api.github.com/users/hedgeliu", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-07T10:53:21Z
2019-01-03T14:19:39Z
2016-06-07T14:19:12Z
NONE
resolved
Hi, the error messages appear all the time while I follow the instructions in Jupyter, but it is fine in other places. It is really strange, because I saw lots of resources that show Jupyter can run "requests". Did I get something wrong before importing requests, or do I need to write some more code, like with matplotlib in Jupyter? ``` import requests r = requests.get('https://www.python.org') r.status_code ``` > --- > > gaierror Traceback (most recent call last) > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connection.py in _new_conn(self) > 136 conn = connection.create_connection( > --> 137 (self.host, self.port), self.timeout, **extra_kw) > 138 > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options) > 66 err = None > ---> 67 for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM): > 68 af, socktype, proto, canonname, sa = res > > /opt/conda/lib/python3.5/socket.py in getaddrinfo(host, port, family, type, proto, flags) > 731 addrlist = [] > --> 732 for res in _socket.getaddrinfo(host, port, family, type, proto, flags): > 733 af, socktype, proto, canonname, sa = res > > gaierror: [Errno -2] Name or service not known > > During handling of the above exception, another exception occurred: > > NewConnectionError Traceback (most recent call last) > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw) > 558 timeout=timeout_obj, > --> 559 body=body, headers=headers) > 560 > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw) > 352 # urllib3.request. It also calls makefile (recv) on the socket. 
> --> 353 conn.request(method, url, **httplib_request_kw) > 354 > > /opt/conda/lib/python3.5/http/client.py in request(self, method, url, body, headers) > 1082 """Send a complete request to the server.""" > -> 1083 self._send_request(method, url, body, headers) > 1084 > > /opt/conda/lib/python3.5/http/client.py in _send_request(self, method, url, body, headers) > 1127 body = body.encode('iso-8859-1') > -> 1128 self.endheaders(body) > 1129 > > /opt/conda/lib/python3.5/http/client.py in endheaders(self, message_body) > 1078 raise CannotSendHeader() > -> 1079 self._send_output(message_body) > 1080 > > /opt/conda/lib/python3.5/http/client.py in _send_output(self, message_body) > 910 > --> 911 self.send(msg) > 912 if message_body is not None: > > /opt/conda/lib/python3.5/http/client.py in send(self, data) > 853 if self.auto_open: > --> 854 self.connect() > 855 else: > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connection.py in connect(self) > 161 def connect(self): > --> 162 conn = self._new_conn() > 163 self._prepare_conn(conn) > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connection.py in _new_conn(self) > 145 raise NewConnectionError( > --> 146 self, "Failed to establish a new connection: %s" % e) > 147 > > NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7fee369df7f0>: Failed to establish a new connection: [Errno -2] Name or service not known > > During handling of the above exception, another exception occurred: > > MaxRetryError Traceback (most recent call last) > /opt/conda/lib/python3.5/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies) > 375 retries=self.max_retries, > --> 376 timeout=timeout > 377 ) > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw) > 608 retries = 
retries.increment(method, url, error=e, _pool=self, > --> 609 _stacktrace=sys.exc_info()[2]) > 610 retries.sleep() > > /opt/conda/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py in increment(self, method, url, response, error, _pool, _stacktrace) > 272 if new_retry.is_exhausted(): > --> 273 raise MaxRetryError(_pool, url, error or ResponseError(cause)) > 274 > > MaxRetryError: HTTPConnectionPool(host='www.ichangtou.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fee369df7f0>: Failed to establish a new connection: [Errno -2] Name or service not known',)) > > During handling of the above exception, another exception occurred: > > ConnectionError Traceback (most recent call last) > <ipython-input-7-6f872ca7f8cc> in <module>() > 4 headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'} > 5 > ----> 6 response = requests.get(url, headers=headers) > 7 print(response.content) > > /opt/conda/lib/python3.5/site-packages/requests/api.py in get(url, params, **kwargs) > 65 > 66 kwargs.setdefault('allow_redirects', True) > ---> 67 return request('get', url, params=params, **kwargs) > 68 > 69 > > /opt/conda/lib/python3.5/site-packages/requests/api.py in request(method, url, **kwargs) > 51 # cases, and look like a memory leak in others. 
> 52 with sessions.Session() as session: > ---> 53 return session.request(method=method, url=url, **kwargs) > 54 > 55 > > /opt/conda/lib/python3.5/site-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json) > 466 } > 467 send_kwargs.update(settings) > --> 468 resp = self.send(prep, **send_kwargs) > 469 > 470 return resp > > /opt/conda/lib/python3.5/site-packages/requests/sessions.py in send(self, request, **kwargs) > 574 > 575 # Send the request > --> 576 r = adapter.send(request, **kwargs) > 577 > 578 # Total elapsed time of the request (approximately) > > /opt/conda/lib/python3.5/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies) > 435 raise RetryError(e, request=request) > 436 > --> 437 raise ConnectionError(e, request=request) > 438 > 439 except ClosedPoolError as e: > > ConnectionError: HTTPConnectionPool(host='www.ichangtou.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fee369df7f0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3292/reactions" }
https://api.github.com/repos/psf/requests/issues/3292/timeline
null
completed
null
null
false
[ "The problem here is that the machine that is running the Jupyter Python kernel is not able to DNS resolve the hostname `www.ichangtou.com`. Do you have physical access to the machine running the kernel?\n", "Hi Lukasa, I don't think I have physical access. I just open the website and write the code. Any other ideas that can help?\n", "@hedgeliu I'm afraid not. You can try connecting via IP address directly, but if you don't have access to the machine it's highly likely that your hosting provider doesn't allow you to perform network access at all. You'd have to run the notebook locally.\n", "Sorry to raise this again, Lukasa. I have tried on different laptops in different places between Mainland China and Hong Kong lots of times, but it still does not work. Thus I think it is not a machine problem. @Lukasa \n", "@hedgeliu Did you try running the notebook server on lots of different laptops, or just running the notebook inside the web page? Because the notebook server is actually where the code executes, not the web page.\n", "I have the same issue, not only with requests, but with any other library trying to make a connection to any REST API. The issue doesn't manifest itself when I use the same code on the same machine outside of Jupyter or using curl. In my case it's not a DNS issue. I'm going to run some packet captures and see if that can tell me anything. Unfortunately I don't have a resolution, but just wanted to call out that I am seeing the same behavior and have ruled out DNS resolution on my machine as a root cause." ]
https://api.github.com/repos/psf/requests/issues/3291
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3291/labels{/name}
https://api.github.com/repos/psf/requests/issues/3291/comments
https://api.github.com/repos/psf/requests/issues/3291/events
https://github.com/psf/requests/issues/3291
158,764,582
MDU6SXNzdWUxNTg3NjQ1ODI=
3,291
session ignores POST data when caching redirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/10416599?v=4", "events_url": "https://api.github.com/users/panda-34/events{/privacy}", "followers_url": "https://api.github.com/users/panda-34/followers", "following_url": "https://api.github.com/users/panda-34/following{/other_user}", "gists_url": "https://api.github.com/users/panda-34/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/panda-34", "id": 10416599, "login": "panda-34", "node_id": "MDQ6VXNlcjEwNDE2NTk5", "organizations_url": "https://api.github.com/users/panda-34/orgs", "received_events_url": "https://api.github.com/users/panda-34/received_events", "repos_url": "https://api.github.com/users/panda-34/repos", "site_admin": false, "starred_url": "https://api.github.com/users/panda-34/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/panda-34/subscriptions", "type": "User", "url": "https://api.github.com/users/panda-34", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-06T20:15:22Z
2021-09-08T17:05:41Z
2016-06-06T20:36:17Z
NONE
resolved
I was scraping a site where the search form makes a POST request which is redirected with a 301 code. That code being permanent, the redirect is cached in session.redirect_cache by the POST URL alone, so for all further searches the session gets the same page. I suggest that redirect_cache should use the POST body as part of the key as well; it's quite typical for sites to redirect POSTs to different URLs based on the data.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3291/reactions" }
https://api.github.com/repos/psf/requests/issues/3291/timeline
null
completed
null
null
false
[ "Hey @panda-34,\n\nThanks for pointing this out. This was something I hadn't seen before (or considered). The redirect cache is meant to be very simplistic. The default cache can be disabled as described [here](https://github.com/kennethreitz/requests/pull/2095#issuecomment-45977320). You can then use a real cache system, like [cachecontrol](/ionrock/cachecontrol), to do this kind of work.\n", "POST requests aren't really supposed to be cacheable, redirects or not. Bypassing the POST and jumping immediately to the result (even with correct URL) might have some very nasty consequences.\n", "Just on the topic of disabling the cache - Is it possible to disable the\ncache for specific calls, or does it need to be done globally?\n\nOn Tue, Jun 7, 2016 at 6:51 AM panda-34 [email protected] wrote:\n\n> POST requests aren't really supposed to be cacheable, redirects or not.\n> Bypassing the POST and jumping immediately to the result (even with correct\n> URL) might have some very nasty consequences.\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3291#issuecomment-224084122,\n> or mute the thread\n> https://github.com/notifications/unsubscribe/AGk_7dwxUktz7CDneIGjxlnQ4Edgsty8ks5qJIhEgaJpZM4IvRXW\n> .\n", "Once again, I would vastly prefer that rather than us making the 301 cache increasingly more complex we simply _remove_ it. It frequently behaves incorrectly and leads to subtle confusion. I was in opposition to its merging back when it was proposed and I remain opposed to its function now.\n", "@Lukasa I'm in agreement. Let's do that in 3.0\n", "+1, I don't like it. \n" ]
https://api.github.com/repos/psf/requests/issues/3290
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3290/labels{/name}
https://api.github.com/repos/psf/requests/issues/3290/comments
https://api.github.com/repos/psf/requests/issues/3290/events
https://github.com/psf/requests/issues/3290
158,745,955
MDU6SXNzdWUxNTg3NDU5NTU=
3,290
Content-length not calculated correctly
{ "avatar_url": "https://avatars.githubusercontent.com/u/13381584?v=4", "events_url": "https://api.github.com/users/therealjb86/events{/privacy}", "followers_url": "https://api.github.com/users/therealjb86/followers", "following_url": "https://api.github.com/users/therealjb86/following{/other_user}", "gists_url": "https://api.github.com/users/therealjb86/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/therealjb86", "id": 13381584, "login": "therealjb86", "node_id": "MDQ6VXNlcjEzMzgxNTg0", "organizations_url": "https://api.github.com/users/therealjb86/orgs", "received_events_url": "https://api.github.com/users/therealjb86/received_events", "repos_url": "https://api.github.com/users/therealjb86/repos", "site_admin": false, "starred_url": "https://api.github.com/users/therealjb86/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/therealjb86/subscriptions", "type": "User", "url": "https://api.github.com/users/therealjb86", "user_view_type": "public" }
[ { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" }, { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" }, { "color": "fef2c0", "default": false, "description": null, "id": 298537994, "name": "Needs More Information", "node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information" } ]
closed
true
null
[]
null
7
2016-06-06T18:42:51Z
2021-09-08T17:05:41Z
2016-06-07T08:54:40Z
NONE
resolved
A PUT request using a previous connection's .content generates a Content-Length of 1. Please can you help me determine why this is happening? Code: r = requests.get('http://192.168.55.5/api/?client=wget&file-name=dummy&type=vmware/vmware/2.0/si/serviceprofile/serviceprofile-1/containerset', data={r1.content}, headers={'content-type':'application/xml'}, verify=False, auth=('admin', 'paloalto')) Generated request: PUT /api/?client=wget&file-name=dummy&type=vmware/vmware/2.0/si/serviceprofile/serviceprofile-1/containerset HTTP/1.1 Host: 192.168.55.5 Connection: keep-alive User-Agent: python-requests/2.10.0 Accept-Encoding: gzip, deflate Accept: */* Content-Length: 1 content-type: application/xml Authorization: Basic YWRtaW46cGFsb2FsdG8= <?xml version="1.0" encoding="UTF-8"?> <containerSet><container><id>securitygroup-22</id><name>Ten1SubtA-sec-group</name><description></description><revision>3</revision><type>IP</type><address>192.168.111.89</address><address>192.168.111.88</address></container><container><id>securitygroup-27</id><name>test-sec</name><description></description><revision>7</revision><type>IP</type></container></containerSet>
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3290/reactions" }
https://api.github.com/repos/psf/requests/issues/3290/timeline
null
completed
null
null
false
[ "You can see the body is clearly more than 1 byte so unless I'm missing something this looks like a bug. Let me know what additional information you would like and I will add it.\n", "It's actually not quite complete. You haven't told us what operating system you're using or shown us how to get `r1` for the post request you make above. The way in which you made that request is actually worth knowing.\n\nYou also don't explain where you retrieved this generated request from. Is that from the server or from the request attribute on the response.\n\nThere is, in fact, very little information here and at this point belongs on StackOverflow. There's little evidence of any actual bug in requests.\n", "Python 3.6.0a1 (v3.6.0a1:5896da372fb0, May 16 2016, 15:20:48) \n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n\n> > > import requests\n> > > requests.**version**\n> > > '2.10.0'\n\nThe generated request is a capture (tcpdump) of r on the server, before it was cut off it clearly showed over 400 bytes of content, yet the Content-Length has been calculated as '1' by requests. \n\nr1 = requests.get('https://10.193.87.73/api/2.0/si/serviceprofile/serviceprofile-16/containerset', verify=False, auth=('root', 'paloalto'))\n\nThe response to GET request r1 contains the data I am using in the PUT request r\n\ndata:\n\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<containerSet><container><id>securitygroup-22</id><name>Ten1SubtA-sec-group</name><description></description><revision>3</revision><type>IP</type><address>192.168.111.89</address><address>192.168.111.88</address></container><container><id>securitygroup-27</id><name>test-sec</name><description></description><revision>7</revision><type>IP</type></container></containerSet>\n", "Uh...why is your `data=` argument a set?\n", "uh... hmm.. maybe I chopped and changed my code too much and ended up with a typo... Thanks for pointing me at my mistake :) I will now go and slap myself in the face for the next 10 minutes :(\n", "Heh, no need to self-flagellate too much, it happens to all of us. =)\n", "@therealjb86 I should have caught that last night. :) Sorry for not being able to be helpful sooner.\n" ]
https://api.github.com/repos/psf/requests/issues/3289
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3289/labels{/name}
https://api.github.com/repos/psf/requests/issues/3289/comments
https://api.github.com/repos/psf/requests/issues/3289/events
https://github.com/psf/requests/pull/3289
158,694,498
MDExOlB1bGxSZXF1ZXN0NzI3MjA2MzY=
3,289
Test verify parameter
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2016-06-06T14:41:02Z
2021-09-08T03:01:03Z
2016-06-17T13:00:05Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3289/reactions" }
https://api.github.com/repos/psf/requests/issues/3289/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3289.diff", "html_url": "https://github.com/psf/requests/pull/3289", "merged_at": "2016-06-17T13:00:05Z", "patch_url": "https://github.com/psf/requests/pull/3289.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3289" }
true
[ "I've removed the `SSLError` tests, and session tests due to what you've explained in #3288, and focused on the `SubjectAltNameWarning` which should always occur with `pytest-httpbin`.\n\nThis is actually better at showing the problem in #3287 , as it needs to occur in all combinations of dependencies and Python versions.\n", "This looks reasonable to me aside from my one note about warning spam. @sigmavirus24, what do you think?\n", "This is a good start. I think we can mitigate the spam.\n", "There are also other bugs in pytests handling of warnings (see https://github.com/pytest-dev/pytest/issues/840)\nAn alternative approach to address the warnings problem is to redirect them through the logging layer for the test suite, using `captureWarnings` and backports or some pytest extension which provides something similar. py.test itself doesnt [appear](https://pytest.org/latest/recwarn.html) to have anything particularly fancy for managing warnings in the global test context. There are pytest extensions for the logging layer. Let me know if you want me to do a little analysis down that path.\n", "Any objections to the warnings init moving into `tests.__init__` like so...\nhttps://github.com/jayvdb/requests/blob/unbundling_walkthrough-v2/tests/__init__.py ? It makes the results more stable.\n", "I'm ok with it.\n", "Sorry for the delay; been recovering from hard disk loss\n", "Ok I'm good with this, leaving it up to @sigmavirus24 to merge when he's happy.\n", "@sigmavirus24 , ping ... ?\n", "Sorry. Was distracted with other things. This looks fine to me.\n" ]
https://api.github.com/repos/psf/requests/issues/3288
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3288/labels{/name}
https://api.github.com/repos/psf/requests/issues/3288/comments
https://api.github.com/repos/psf/requests/issues/3288/events
https://github.com/psf/requests/issues/3288
158,692,705
MDU6SXNzdWUxNTg2OTI3MDU=
3,288
Minor session object documentation issues
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-06-06T14:33:31Z
2021-09-08T17:05:42Z
2016-06-06T14:51:37Z
CONTRIBUTOR
resolved
I suspect this is a documentation issue, for expected and desired behaviour, rather than an unintended bug... http://docs.python-requests.org/en/master/user/advanced/#session-objects states that > The Session object ... will use urllib3's connection pooling. So if you're making several requests to the same host, the underlying TCP connection will be reused, which can result in a significant performance increase (see HTTP persistent connection). > ... > Note, however, that method-level parameters will not be persisted across requests, even if using a session. This example will only send the cookies with the first request, but not the second: ..." There is a minor conflict between those, in that exceptions to the rule include the `verify` parameter and likely other connection related parameters (`cert`, `timeout`, ...?), which do in fact persist across the requests. This is strongly implied by the opening paragraph, which says that the connection is reused. Also "several requests to the same host" is not quite correct, I believe, as different protocols will result in different connections, naturally.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3288/reactions" }
https://api.github.com/repos/psf/requests/issues/3288/timeline
null
completed
null
null
false
[ "Note #3289 shows the `verify` parameter being persistent between requests in the same session.\n", "What would be interesting to test is whether a connection reset of some sort causes the `verify` parameter in the session to the same protocol/host to be lost. Maybe there is a test in `urllib3` that already covers that scenario.\n", "This is a known issue (see #2863) and will be fixed in an upcoming release, having had the underlying urllib3 issue fixed (shazow/urllib3#830).\n" ]
https://api.github.com/repos/psf/requests/issues/3287
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3287/labels{/name}
https://api.github.com/repos/psf/requests/issues/3287/comments
https://api.github.com/repos/psf/requests/issues/3287/events
https://github.com/psf/requests/issues/3287
158,589,042
MDU6SXNzdWUxNTg1ODkwNDI=
3,287
requests.packages monkey patching of unbundled urllib3 does not work
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
[]
closed
true
null
[]
null
25
2016-06-06T02:10:11Z
2021-09-08T07:00:26Z
2017-07-30T14:01:27Z
CONTRIBUTOR
resolved
In requests 2.5.2 `requests.packages.__init__` added code to allow `requests.packages.urllib3` to not exist, and it would fall back to using `urllib3`. No doubt this works in some cases, as the people on #2375 were happy with it, but it doesnt work with pyopenssl, resulting in no SNI/etc. `requests.packages.__init__` only injects `urllib3` as `requests.packages.urllib3`. However all of the other modules within urlllib3 will have different copies in the namespace `requests.packages.urllib3`. `requests.packages.urllib3.contrib.pyopenssl` then writes to a separate copy of `requests.packages.urllib3.util` and `requests.packages.urllib3.connection`, and these changes are never seen by `urllib3`. I encountered this on debian stretch & sid's `python-requests` 2.10.0-1 with `python-urllib3` 1.15.1 when running on Python 2.7.6 (Ubuntu Trusty), and haven't retested yet on a later Python 2.7. @eriol #3286 fixes the problem for me. I note that Fedora's latest `python-requests` no longer unbundles urllib3.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/3287/reactions" }
https://api.github.com/repos/psf/requests/issues/3287/timeline
null
completed
null
null
false
[ "Here is a set of tests which demonstrate some of the problem: https://travis-ci.org/jayvdb/requests/builds/135716555\n\nMy apologies that it doesnt show a problem in `test_unbundling`, which is what I had hoped would show the problem more clearly.\n\nInstead, see the errors in `test_verify` have:\n\n```\n('HAS_MODERN_SSL', False)\n('HAS_PYOPENSSL', True)\n('HAS_SNI', True)\n```\n\nand yet they have warnings indicating that those requests are not being processed securely.\n", "@jayvdb thanks for testing but there is something I don't get. Tests for `DIST=trusty+sid` seem to use Python 2.7.6, but Stretch has Python 2.7.11. Even the stable release, Jessie, has 2.7.9 with several backports (for example cPython >= 2.7.9 has ssl features backported from Python3).\n", "Hi @eriol, the `DIST=trusty+sid` job in that build is mostly Travis' trusty, with selected packages added from sid. Travis trusty reports that it is Python 2.7.6, whereas trusty ships with 2.7.5 as far as I can see, so I am a bit confused about that, but it isnt a job I am particularly focused on, as the same problems appear in the other jobs that have a cleaner virtual env at the beginning of the test sequence.\n\nAs a consequence of building the additional testing on Travis, I feel more confident that any distro version shipping Python 2.7.9+ is fine with the current `requests.packages` unbundling code. I still have some more test scenarios to create, though.\n\nSo the only 'problem' may be that the requests 2.10 package in stretch / sid states it `Depends: python:any (>= 2.7.5-5~)` , which is why it can be installed onto these Travis trusty environments, and probably any other debian derivatives which are still on Python 2.7.5 - 2.7.8 (are there any?).\n\nArch Linux also uses the unbundling, but it appears to be only providing Python 3.5.\nContrary to what I said earlier, Fedora isnt using the unbundling fallbacks in `requests.packaging.__init__`, but they are still [using symlinks](https://pkgs.fedoraproject.org/cgit/rpms/python-requests.git/tree/python-requests.spec#n115), however they are Python 2.7.10+ also, so should be safe anyway.\n", "@eriol, I've been able to provide a better test case for this, and it shows that the problem exists even in 2.7.11 and 3.3.\n\nhttps://travis-ci.org/jayvdb/requests/builds/135867398 is just #3289 with a `.travis.yml` that shows various combinations all work well with the bundled version of `urllib3`.\n\nhttps://travis-ci.org/jayvdb/requests/builds/135870520 is a [very simple](https://github.com/jayvdb/requests/commit/09ae4ad78e28087fd5e041f5ce0a7bc603cd6a04) change to the `.travis.yml` that emulates what the Debian package looks like, and shows that `SubjectAltNameWarning` stops occurring on all environments that are using pyopenssl. As explained in the opening issue, under the covers it is more than just the warning that isnt happening. A second copy of the modules are being created and configured for pyopenssl mode, and the actual `urllib3` doesnt get into pyopenssl mode.\n\nFinally, https://travis-ci.org/jayvdb/requests/builds/135873388 is #3286 , which fixes the problem.\nThat patch isnt intended to be the final solution; it is a WIP until we figure out what should be merged (I was told in #2670 to PR early), intended mostly to show what does work. I have very quickly looked at the pip approach, and it is doing roughly the same thing so it should work. It does require closer coupling between pip/requests and urllib3, which my patch avoided, for better and for worse. I am not pushing to have my patch, or any other similar patch, merged, pushing `requests` further down this 'support unbundling' rabbit hole further, if the maintainers don't feel it is appropriate. The patch is there to prove the bug exists.\n\nMy next step is going to be to check what happens with symlinks like what Fedora is doing, to see if that is a way to beat the import machinery. (I'd be surprised if it didnt work, but this problem is full of surprises).\n", "symlink results are [in](https://travis-ci.org/jayvdb/requests/builds/135914574), with failures up until 2.7.8 only, and the failures are the opposite -- there are too many warnings, rather than too few.\n", "I've put together a little playground to show what is happening under the covers, as I found it difficult to do this under pytest, in order to better see the scope of the problem.\nhttps://github.com/jayvdb/requests_issue_3287\n\nWorth noting that 2.7.9+ seems to mostly [do the right thing](https://travis-ci.org/jayvdb/requests_issue_3287/builds/136099105), but I havent yet added any demos using site using/abusing Subject Alternative Name.\n", "@warsaw , as I saw you're working on a similar package and de-vendoring ( https://github.com/pypa/pip/issues/3802 ), you might like to be aware of, or assist with, this.\n", "@jayvdb Thanks for the heads-up. That was a quick fix which seemed to work for Debian. I'll take a look at this issue in more detail when I get a chance. \n", "Hi, i did non forgot this. Only busy during this week.\n\n@warsaw my was plan is to use same approach of pip here. What do you think about?\n", "@eriol +1 I think pip has done the best job of making devendorizing easier for downstream redistributors. My deltas are very small and now that it's been in place for a while, I haven't encountered any issues with it. It's possible I'm missing something, so pinging @dstufft for additional thoughts.\n", "Once #3289 is merged, we'll have a unit test to check the de-vendored solution.\n", "@jayvdb many thanks for #3289! I'm going to work on this in a few hours or at max tomorrow. Feel free to ping me again if you did not get a report from me tomorrow afternoon. But I hope to be able to work on this today.\n", "We've gone through many iterations of the debundling with pip. I think we're pretty happy with the current incantation which has the following properties:\n- It's a no-op when you're using upstream (it's gated behind a conditional that always evaluates to false unless downstream patches the conditional).\n- Inside the code, everything still just uses from `pip._vendor import whatever` without having to sprinkle `try .. except ImportError` everywhere.\n- It prefers to use the bundled copy rather than an already installed copy, so you can conditionally leave one thing bundled if for some reason that's needed.\n- It still uses bundled error messages (e.g. import error messages will still be of the form `can't import pip._vendor.six`).\n- When using as a library, it doesn't pollute `sys.modules` except to alias `six` inside of `pip._vendor.six`.\n\nThe downsides that we've discovered so far:\n- Testing is a bit of a pain since enabling it requires manual patching, we end up using `sed` to rewrite the conditional in our tests.\n- You have to enumerate all the packages, including sub packages and modules, in the bundled library or you will not be able to import them when debundled.\n- In pip's particular case we pollute `sys.path` with wheels when debundled, but that shouldn't apply to requests because debundled requests doesn't have the same \"must never break, even when `pip upgrade'ing` cases as pip does.\n", "@dstuff, I very quickly tested pip, and it inherits this problem from requests. pip de-vendoring may well be working wonderfully, but the de-vendored requests isnt initalising urlllib3's pyopenssl properly. See https://travis-ci.org/jayvdb/requests_issue_3287/builds/138597348 , which shows only additional warnings being emitted for pip's requests, but please see the rest of this issue to see why those warnings are occurring and the impact of that. Due to pip mostly being used to talk to one set of servers, the fallout is smaller. My guess is that this isnt a security vulnerability , as it will cause connections to fail when they should succeed, rather than connect when it shouldnt. But I have very little experience in this area, and not much time to understand the corner cases in these packages, so I am sure others here will know the real potential for this to be abused or not.\n\nThe pip travis testing framework for de-vendored packages is nice, but note that https://github.com/pypa/pip/blob/master/pip/_vendor/vendor.txt isnt including `requests[security]` , so that isnt accurately representing distros. Not that it matters a great deal, as there doesnt seem to be any pip tests covering problematic sites needing better ssl support (not surprising; pip expects requests to do that).\nAnd, fyi, the pip tests suite [are failing](https://travis-ci.org/jayvdb/pip/builds/138602638) atm.\n", "@jayvdb fast report, working on unbundling stuff right now. Testing with your https://github.com/jayvdb/requests_issue_3287.\n", "@dstufft many thanks for the detailed explanation!\n\nI have just uploaded `requests 2.10.0-2` to unstable (still in upload queue, it'll show up shortly). I cherry picked the unbundling stuff from pip, [this is the patch](https://anonscm.debian.org/cgit/python-modules/packages/requests.git/commit/?h=patched/2.10.0-2&id=3311851f0cffea52cd779d01a6bf31cd8d34a37f) landed on Debian.\n\n@jayvdb can you test again when `requests 2.10.0-2` will be in the archive? Thanks!\nAlso, what about renaming this bug in a more specific way? _Does not work_ seems to broad to me: we are addressing a problem using Python2 and related to SSL.\n\nOne more thing, I can bump the Python dependency to ensure a cPython2 with SSL feature from Python3, but only after we fix this. I was not aware of `trusty+sid` combo, I can understand the use, but mixing packages from different release seems dangerous to me.\n", "I have limited capacity to test atm, in a bit of a rush to airport, but should be good in ~6 hrs.\n\nRegarding issue title, this problem also occurs only py3.\nOn py34 it doesnt have the same impact, as the native ssl is better, but the pyopenssl monkey patching is still not working properly.\nSee here on py34\nhttps://travis-ci.org/jayvdb/requests_issue_3287/jobs/136099126#L390\n", "Also there are other packages that requests vendors the same way, so obscure problems can probably be found there also. It is the monkey patching that is wrong, not an ssl issue. It is just the ssl side effect is the most concerning of course.\n", "I've done some basic testing. The warnings are fixed. It appears that Debian's 2.10.0-2 `requests.packages.urllib3` is now a functioning monkey patched copy of urllib3.\n\nHowever `requests.packages.urllib3` is not `urllib3`! Any code which relies on Debian's requests enabling the top level urllib3 package pyopenssl mode , or expects requests' urllib3 to have the same state as the top level urllib3 package, will still have that expectation broken.\nIm not sure if this is a problem, but I note that `requests/packages/__init__.py` stresses that ``requests.packages.urllib3 is urllib3` was an objective (and that it had achieved that goal, which is true, but not terribly useful when many of the modules in the two namespaces had different copies with different state ..).\n\nYou can see Debian's requests 2.10.0-2 has different state to urllib3 at https://travis-ci.org/jayvdb/requests_issue_3287/builds/138669983#L289 , but that is because requests on Debian now copies urllib3 and doesnt enable pyopenssl in the original top level urllib3 package which is left in its default state as my test script doesnt attempt to enable pyopenssl in urllib3.\n", "Hi, distro user here, and today I've met exactly the same issue. Can someone explain to me why requests bundles so many packages? For easy-of-use? Users need only one big package in this way. But pip has done a good job on installing dependencies for these users. So it is for pinning known-compatible versions? Or something else I've totally missed?", "@lilydjwg pip does a good job on installing dependencies if there are no conflicts. Consider the following case:\r\n\r\nThere's a version of Requests, let's call it `vR.0.0` that requires a specific version of urllib3, let's call that `vU.0.0` and anything prior to `vU.0.0` will break something in Requests. Let's also say that the user uses the python elasticsearch package and pin urllib3 to `vT.0.0` which comes before `vU.0.0`. If their requirements.txt file looks like this:\r\n\r\n```\r\nrequests==R.0.0\r\nelasticsearch-py==E.0.0\r\nurllib3==T.0.0\r\n```\r\n\r\nThen they *will* have a broken Requests installation. Vendoring the dependencies makes it easy to know that Requests will always work when installed from PyPI (we cannot ever make the same guarantee for a distro user, that guarantee is up to the packagers to make). \r\n\r\nBoth Requests and urllib3 are *widely* used and popular packages. Requests will sometimes want to stick to an older version or use a newer version and this allows us complete control over that.\r\n\r\nKeep in mind, however, that pip has GSOC student who is working on resolving the above problem. That will be one less issue for Requests to worry about then once that lands. Unfortunately, people don't often upgrade pip. Namely, the people who *rarely* update pip are users such as yourself who rely on their distro to provide them python packages. In your case, you will likely run into this issue unless you're using a alpha/beta version of your distro or something like Arch Linux which does it's best to stay current.", "@sigmavirus24 thanks for the explanation!\r\n\r\nI'm on Arch Linux. However companies tend to use stable distros with terribly-old packages, so I install pip and upgrade it to latest every time :-)", "Closing since Requests unvendored everything recently.", "data['created_by'] = request.headers.get(header.GRASS_HEADER_USER_ID, '')\r\nhow to mock it using monkeypatch any idea ?\r\n", "@sakshi094 that's not what this issue is about. I suggest you ask your question on StackOverflow." ]
https://api.github.com/repos/psf/requests/issues/3286
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3286/labels{/name}
https://api.github.com/repos/psf/requests/issues/3286/comments
https://api.github.com/repos/psf/requests/issues/3286/events
https://github.com/psf/requests/pull/3286
158,588,989
MDExOlB1bGxSZXF1ZXN0NzI2NDg3MjY=
3,286
Load all unbundled urllib3 modules
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-06-06T02:09:25Z
2021-09-08T03:00:54Z
2016-08-08T12:04:22Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3286/reactions" }
https://api.github.com/repos/psf/requests/issues/3286/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3286.diff", "html_url": "https://github.com/psf/requests/pull/3286", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3286.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3286" }
true
[ "I am _extremely_ nervous about applying further upstream patches to handle downstream unbundling behaviours. In particular, it's not clear to me what exact problem you're experiencing in #3287.\n\nFor the moment however I'm inclined to call this a Debian bug unless told otherwise. \n", "The problem is that pyopenssl injection doesnt work, despite having all of the necessary dependencies installed, and despite the pyopenssl injection function appearing to be successful (i.e. no `ImportError` is raised). i.e. Debian's requests 2.10 doesnt have OpenSSL & SNI support, at least in the combinations I've tested using.\n\nIt looks like there is no tests covering this unbundling support code, so I'll set up unit tests to verify this isnt a Debian only problem.\n\nfwiw, I have no objection to the unbundling support code being removed from `requests.packages.__init__`, but it is currently there, and doesn't work, but has lots of documentation in the module that gives packagers the false expectation that it works.\n", "So pip does [something similar](https://github.com/pypa/pip/blob/master/pip/_vendor/__init__.py#L79) but in a far more intelligent way than what is happening here.\n\nThat said, I don't think we have evidence that what pip is doing works either.\n\nI also don't agree with your all or nothing mentality @jayvdb. It's not productive.\n", "Hello,\nsorry for the late reply, I was on trip and once returned I had to work on the backport of betamax for the stable release due the sheduled upload on OpenStack (I don't remember which one).\n\nI definitively agree with @Lukasa here: the problem is on the Debian side.\nSo, as soon I complete the backport of betamax I will work on this.\nThis is my plan:\n1. use the same pip's `vendored` function to patch requests.packages.**init**;\n2. add some tests on Debian CI infrastructure over this specific issue;\n3. upload this new revision to experimental suite;\n4. 
make a call for test;\n\n@jayvdb can you share your tests about OpenSSL & SNI support on Debian? Thanks!\n", "@jayvdb never mind, you were talking about https://github.com/kennethreitz/requests/issues/3287.\n", "I agree with @Lukasa — this part of the codebase is absolutely abhorrent and should not even exist — however, we have chosen to do so to improve the user experience of our very unfortunate distro-installed users. \n\nWe've already done more than I think we should have. Doing even more, without extremely strong beyond-a-reasonable-doubt purpose, is out of the question.\n" ]
https://api.github.com/repos/psf/requests/issues/3285
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3285/labels{/name}
https://api.github.com/repos/psf/requests/issues/3285/comments
https://api.github.com/repos/psf/requests/issues/3285/events
https://github.com/psf/requests/issues/3285
158,548,823
MDU6SXNzdWUxNTg1NDg4MjM=
3,285
install
{ "avatar_url": "https://avatars.githubusercontent.com/u/9263473?v=4", "events_url": "https://api.github.com/users/zdrjson/events{/privacy}", "followers_url": "https://api.github.com/users/zdrjson/followers", "following_url": "https://api.github.com/users/zdrjson/following{/other_user}", "gists_url": "https://api.github.com/users/zdrjson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zdrjson", "id": 9263473, "login": "zdrjson", "node_id": "MDQ6VXNlcjkyNjM0NzM=", "organizations_url": "https://api.github.com/users/zdrjson/orgs", "received_events_url": "https://api.github.com/users/zdrjson/received_events", "repos_url": "https://api.github.com/users/zdrjson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zdrjson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zdrjson/subscriptions", "type": "User", "url": "https://api.github.com/users/zdrjson", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-05T10:50:55Z
2021-09-08T18:00:38Z
2016-06-05T14:59:40Z
NONE
resolved
homebrew support?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3285/reactions" }
https://api.github.com/repos/psf/requests/issues/3285/timeline
null
completed
null
null
false
[ "Why? What value would that bring that pip does not?\n\nBasically, anyone is welcome to downstream repackage us if they'd like to, but pip is the deployment mechanism we support. \n", "Homebrew is for Python itself, and perhaps some difficult to install modules, not general Python packages. \n" ]
https://api.github.com/repos/psf/requests/issues/3274
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3274/labels{/name}
https://api.github.com/repos/psf/requests/issues/3274/comments
https://api.github.com/repos/psf/requests/issues/3274/events
https://github.com/psf/requests/pull/3274
158,540,971
MDExOlB1bGxSZXF1ZXN0NzI2MjIyOTU=
3,274
Try to allow the socks patch to use http_no_tunnel
{ "avatar_url": "https://avatars.githubusercontent.com/u/4951803?v=4", "events_url": "https://api.github.com/users/birm/events{/privacy}", "followers_url": "https://api.github.com/users/birm/followers", "following_url": "https://api.github.com/users/birm/following{/other_user}", "gists_url": "https://api.github.com/users/birm/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/birm", "id": 4951803, "login": "birm", "node_id": "MDQ6VXNlcjQ5NTE4MDM=", "organizations_url": "https://api.github.com/users/birm/orgs", "received_events_url": "https://api.github.com/users/birm/received_events", "repos_url": "https://api.github.com/users/birm/repos", "site_admin": false, "starred_url": "https://api.github.com/users/birm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/birm/subscriptions", "type": "User", "url": "https://api.github.com/users/birm", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-05T06:52:05Z
2021-09-08T03:01:07Z
2016-06-05T07:54:53Z
NONE
resolved
re #3238
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3274/reactions" }
https://api.github.com/repos/psf/requests/issues/3274/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3274.diff", "html_url": "https://github.com/psf/requests/pull/3274", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3274.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3274" }
true
[ "Thanks for this. Unfortunately, we can't merge this patch: it is entirely in urllib3 (a third-party library that we don't merge patches for). It's also not correct: it uses a constant that doesn't exist and handles a weirdly scoped URL rather than looking for environment variables. \n", "Thank you. I'll see if I can find another way to add this functionality. \n" ]
https://api.github.com/repos/psf/requests/issues/3263
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3263/labels{/name}
https://api.github.com/repos/psf/requests/issues/3263/comments
https://api.github.com/repos/psf/requests/issues/3263/events
https://github.com/psf/requests/issues/3263
158,492,736
MDU6SXNzdWUxNTg0OTI3MzY=
3,263
BadStatusLine
{ "avatar_url": "https://avatars.githubusercontent.com/u/12016537?v=4", "events_url": "https://api.github.com/users/liampauling/events{/privacy}", "followers_url": "https://api.github.com/users/liampauling/followers", "following_url": "https://api.github.com/users/liampauling/following{/other_user}", "gists_url": "https://api.github.com/users/liampauling/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/liampauling", "id": 12016537, "login": "liampauling", "node_id": "MDQ6VXNlcjEyMDE2NTM3", "organizations_url": "https://api.github.com/users/liampauling/orgs", "received_events_url": "https://api.github.com/users/liampauling/received_events", "repos_url": "https://api.github.com/users/liampauling/repos", "site_admin": false, "starred_url": "https://api.github.com/users/liampauling/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/liampauling/subscriptions", "type": "User", "url": "https://api.github.com/users/liampauling", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-06-04T07:33:40Z
2021-09-08T18:00:39Z
2016-06-04T14:53:31Z
NONE
resolved
Hi, python version: 3.4.4 requests version: 2.10.0 I am trying to connect to what has been described as 'ssl socket (normal) with a CRLF json protocol', this means the raw response looks like this: > reply: '{"op":"connection","connectionId":"002-040616070341-5494"}\r\n' However requests is expecting it to look like this: > reply: 'HTTP/1.1 200 OK\r\n' Code used: ``` url = 'https://stream-api.betfair.com:443/api' session = requests.session() req = requests.Request('GET', url) pre = req.prepare() login_req = session.send(pre, stream=True) ``` I can see that the error is occurring in _read_status, does anyone know how I can bypass this? Or should I be connecting to this socket differently? ``` Traceback (most recent call last): File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 385, in _make_request httplib_response = conn.getresponse(buffering=True) TypeError: getresponse() got an unexpected keyword argument 'buffering' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 387, in _make_request httplib_response = conn.getresponse() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 1227, in getresponse response.begin() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 386, in begin version, status, reason = self._read_status() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 368, in _read_status raise BadStatusLine(line) http.client.BadStatusLine: {"op":"connection","connectionId":"002-040616071750-5507"} During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/adapters.py", line 403, in send timeout=timeout File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 255, in increment raise six.reraise(type(error), error, _stacktrace) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/packages/six.py", line 309, in reraise raise value.with_traceback(tb) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 387, in _make_request httplib_response = conn.getresponse() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 1227, in getresponse response.begin() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 386, in begin version, status, reason = self._read_status() File "/Users/temp/anaconda/lib/python3.4/http/client.py", line 368, in _read_status raise BadStatusLine(line) requests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', BadStatusLine('{"op":"connection","connectionId":"002-040616071750-5507"}\r\n',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "marketstreaming/test.py", line 52, in <module> login_req = session.send(pre, stream=True) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File "/Users/temp/anaconda/lib/python3.4/site-packages/requests/adapters.py", line 453, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine('{"op":"connection","connectionId":"002-040616071750-5507"}\r\n',)) ``` Thanks for any help
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3263/reactions" }
https://api.github.com/repos/psf/requests/issues/3263/timeline
null
completed
null
null
false
[ "This API does not speak HTTP. You would have to create a custom transport adapter. (Personally I think shoehorning this into requests will be quite painful, better create a custom library).\nThey should not reuse the well known port for HTTP.\n", "Requests is a HTTP library: it is not expected to be able to handle arbitrary protocols. In this case, most of the things requests gives you (easy access to HTTP headers and status codes, cookie support, redirect following, etc.) literally don't apply here because there is no HTTP. To do this you'll need to learn the socket library.\n" ]
https://api.github.com/repos/psf/requests/issues/3252
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3252/labels{/name}
https://api.github.com/repos/psf/requests/issues/3252/comments
https://api.github.com/repos/psf/requests/issues/3252/events
https://github.com/psf/requests/issues/3252
158,042,697
MDU6SXNzdWUxNTgwNDI2OTc=
3,252
Setting Proxy-Authorization Header Manually
{ "avatar_url": "https://avatars.githubusercontent.com/u/1574365?v=4", "events_url": "https://api.github.com/users/mmedal/events{/privacy}", "followers_url": "https://api.github.com/users/mmedal/followers", "following_url": "https://api.github.com/users/mmedal/following{/other_user}", "gists_url": "https://api.github.com/users/mmedal/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mmedal", "id": 1574365, "login": "mmedal", "node_id": "MDQ6VXNlcjE1NzQzNjU=", "organizations_url": "https://api.github.com/users/mmedal/orgs", "received_events_url": "https://api.github.com/users/mmedal/received_events", "repos_url": "https://api.github.com/users/mmedal/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mmedal/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mmedal/subscriptions", "type": "User", "url": "https://api.github.com/users/mmedal", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-06-02T01:41:20Z
2021-09-08T18:00:39Z
2016-06-02T16:35:32Z
CONTRIBUTOR
resolved
I know you guys have addressed a variety of concerns relating to configuring proxies and that proxying is functionally complex, but I'd like to raise one issue I was having today. We have a new proxy in our environment that requires auth with a yubikey rather than a password. We can simply set an environment variable to a token returned from a yubikey auth endpoint that expires every 12 hours and pass it along with the Proxy-Authorization header in tools like curl, wget, etc... However, it gets a little trickier with requests, as the default api seems to only support including the proxy-user:proxy-pass in the proxy-url. In our case, we can't do this, as a yubikey token is only good once. As a temporary workaround, I've subclassed HTTPAdapter and modified the proxy_headers method: ``` class CustomHTTPAdapter(HTTPAdapter): def proxy_headers(self, proxy): return { 'Proxy-Authorization': 'Basic ' + environ.get('proxy_token', str()) } ``` and then mount it on a Session: ``` import requests from mypkgs.adapters import CustomHTTPAdapter s = requests.Session() s.mount('https://', CustomHTTPAdapter()) s.get('https://someurl') ``` It would be incredibly useful to simply be able to do something like `requests.get(uri, proxy_headers=proxy_headers)`. Am I barking up the wrong tree here or do you have any ideas or feedback?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3252/reactions" }
https://api.github.com/repos/psf/requests/issues/3252/timeline
null
completed
null
null
false
[ "Generally speaking we're disinclined to add a separate `proxy_headers` kwarg. The vast majority of users don't need it, and those that do are relatively well-served by the `proxy_headers` override of the HTTPAdapter.\n\nRequests is extremely reluctant to add new kwargs to the top-level API if it can possibly be avoided, and in this case I think it can.\n\nSorry! =(\n", "Thanks for the feedback! In that case, is there an easy way to mount a custom HTTPAdapter on the main requests api rather than on a session? I didn't note any in the documentation, but maybe I missed it. \n", "Nope: the requests API manufactures a new Session on each request to avoid having any implicit global state that can lead to all kinds of weird bugs.\n" ]
https://api.github.com/repos/psf/requests/issues/3251
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3251/labels{/name}
https://api.github.com/repos/psf/requests/issues/3251/comments
https://api.github.com/repos/psf/requests/issues/3251/events
https://github.com/psf/requests/pull/3251
157,944,941
MDExOlB1bGxSZXF1ZXN0NzIyMDc1MTY=
3,251
Update documentation of Session.max_redirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
0
2016-06-01T16:01:06Z
2021-09-08T04:00:56Z
2016-06-01T18:08:39Z
CONTRIBUTOR
resolved
Fixes #3250
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3251/reactions" }
https://api.github.com/repos/psf/requests/issues/3251/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3251.diff", "html_url": "https://github.com/psf/requests/pull/3251", "merged_at": "2016-06-01T18:08:39Z", "patch_url": "https://github.com/psf/requests/pull/3251.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3251" }
true
[]
https://api.github.com/repos/psf/requests/issues/3250
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3250/labels{/name}
https://api.github.com/repos/psf/requests/issues/3250/comments
https://api.github.com/repos/psf/requests/issues/3250/events
https://github.com/psf/requests/issues/3250
157,771,177
MDU6SXNzdWUxNTc3NzExNzc=
3,250
docs don't show correct max_redirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4", "events_url": "https://api.github.com/users/jvanasco/events{/privacy}", "followers_url": "https://api.github.com/users/jvanasco/followers", "following_url": "https://api.github.com/users/jvanasco/following{/other_user}", "gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvanasco", "id": 204779, "login": "jvanasco", "node_id": "MDQ6VXNlcjIwNDc3OQ==", "organizations_url": "https://api.github.com/users/jvanasco/orgs", "received_events_url": "https://api.github.com/users/jvanasco/received_events", "repos_url": "https://api.github.com/users/jvanasco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions", "type": "User", "url": "https://api.github.com/users/jvanasco", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2016-05-31T21:08:46Z
2021-09-08T17:05:39Z
2016-06-01T18:08:39Z
CONTRIBUTOR
resolved
In the docs, `max_redirects = None` (http://docs.python-requests.org/en/master/api/?highlight=max_redirects#requests.Session.max_redirects) however it actually is set to `DEFAULT_REDIRECT_LIMIT` from `.models`... https://github.com/kennethreitz/requests/blob/cd4e6b9aef4b5d3224a1aae1b3a2cef1b09710a4/requests/sessions.py#L330-L332 ... which is defined as `30` https://github.com/kennethreitz/requests/blob/cd4e6b9aef4b5d3224a1aae1b3a2cef1b09710a4/requests/models.py#L47
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3250/reactions" }
https://api.github.com/repos/psf/requests/issues/3250/timeline
null
completed
null
null
false
[ "Hrm, this seems to be a problem with the RTD docs build. @sigmavirus24, any ideas?\n", "So, sphinx is the source of this bug and I suspect it might be fixable but with something that I'm not entirely certain we want to do.\n\nThe following triggers the \"bug\" (a.k.a., limitation in sphinx-doc):\n\n``` py\nclass Session(...):\n def __init__(self):\n self.max_redirects = 30\n```\n\nAnd this \"fixes\" it:\n\n``` py\nclass Session(...):\n max_redirects = 30\n def __init__(self):\n pass\n```\n\nWe update our attributes to do this for documentation purposes, but this _can_ have negative effects if we over-do it. Thoughts?\n", "I'd rather not do it, frankly: I think it'd end up looking gross. =(\n", "The class (or possibly method) do string can declare it as an instance variable, and then note what package it's imported from. It's not a perfect solution in that one needs to look at source for the real value, but it stops the wrong info from appearing in the API docs. (Fwiw, I naively trusted the docs and hit the limit... Making me go crazy for 15 minutes reinstalling requests and trying to figure out why No Limit was triggering an exception)\n\n> On May 31, 2016, at 8:18 PM, Cory Benfield [email protected] wrote:\n> \n> I'd rather not do it, frankly: I think it'd end up looking gross. =(\n> \n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub, or mute the thread.\n", "@Lukasa to be clear we'd still be able to use the constant we import from `requests.models` that also appropriately helps sphinx determine what is happening.\n\nI am not particularly opinionated about this I guess.\n", "Can we just change the doc string to say \"Defaults to `30`\"?\n", "@Lukasa that works for me.\n", "@Lukasa that was going to be my suggestion. Utilizing class-instance variables just for the doc build is a huge red flag for me. 
\n", "It's important to keep in mind that a _lot_ of people read through the Requests codebase is a way to learn how to write and organize Python code well. \n", "> Utilizing class-instance variables just for the doc build is a huge red flag for me. \n\nSo class variable definitions have a lot of other benefits. One very common one is if someone uses mock to create an autospec'd Session. The mock will reflect the attributes that users expect to be there _only_ if the attributes were defined at the class level. (In other words, we make people's lives easier when they mock out the Session as a collaborator object in their testing.)\n" ]
https://api.github.com/repos/psf/requests/issues/3239
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3239/labels{/name}
https://api.github.com/repos/psf/requests/issues/3239/comments
https://api.github.com/repos/psf/requests/issues/3239/events
https://github.com/psf/requests/issues/3239
157,648,668
MDU6SXNzdWUxNTc2NDg2Njg=
3,239
Make RequestEncodingMixin._encode_params() public
{ "avatar_url": "https://avatars.githubusercontent.com/u/15640868?v=4", "events_url": "https://api.github.com/users/arjennienhuis/events{/privacy}", "followers_url": "https://api.github.com/users/arjennienhuis/followers", "following_url": "https://api.github.com/users/arjennienhuis/following{/other_user}", "gists_url": "https://api.github.com/users/arjennienhuis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/arjennienhuis", "id": 15640868, "login": "arjennienhuis", "node_id": "MDQ6VXNlcjE1NjQwODY4", "organizations_url": "https://api.github.com/users/arjennienhuis/orgs", "received_events_url": "https://api.github.com/users/arjennienhuis/received_events", "repos_url": "https://api.github.com/users/arjennienhuis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/arjennienhuis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/arjennienhuis/subscriptions", "type": "User", "url": "https://api.github.com/users/arjennienhuis", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-05-31T11:24:56Z
2021-09-08T17:05:35Z
2016-06-27T19:52:35Z
NONE
resolved
I need something that does exactly what `RequestEncodingMixin._encode_params()` does. This works in both python2 and 3: ``` py >>> RequestEncodingMixin._encode_params({u'a': b'b'}) 'a=b' ``` Can it be made part of the public API? (Or is it already?) I am aware that I can just use it as is. But that seems "wrong".
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3239/reactions" }
https://api.github.com/repos/psf/requests/issues/3239/timeline
null
completed
null
null
false
[ "I don't have any strong feelings on it. I actually am uncertain why we're using a Mixin there when those are static methods and don't need to be declared on a class.\n\nIf we're going to make them public, I'd rather move them to a well documented submodule that does encoding of data and such for requests. That way people can simply do:\n\n``` py\nfrom requests import encodingutils\n\nencodingutils.encode_params(...)\n```\n", "Do we really need to expose them? This is a 16-line function that mostly just translates different inputs to something useful to pass to `urlencode`. Do we need to really expose this, or is it safer just for those who want it to copy the code out?\n", "> Do we really need to expose them?\n\nI don't know. I do have questions about using a Mixin here but that's tangential to this issue. I'm not sure we want to make these publicly available though. I lean towards not doing it because it's not what requests is for, but I'm also not sure if not opening it up means people won't use it anyway.\n", "Don't change something just because you can. \n" ]
https://api.github.com/repos/psf/requests/issues/3238
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3238/labels{/name}
https://api.github.com/repos/psf/requests/issues/3238/comments
https://api.github.com/repos/psf/requests/issues/3238/events
https://github.com/psf/requests/issues/3238
157,638,068
MDU6SXNzdWUxNTc2MzgwNjg=
3,238
Support proxy protocol http_no_tunnel
{ "avatar_url": "https://avatars.githubusercontent.com/u/3806117?v=4", "events_url": "https://api.github.com/users/liketic/events{/privacy}", "followers_url": "https://api.github.com/users/liketic/followers", "following_url": "https://api.github.com/users/liketic/following{/other_user}", "gists_url": "https://api.github.com/users/liketic/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/liketic", "id": 3806117, "login": "liketic", "node_id": "MDQ6VXNlcjM4MDYxMTc=", "organizations_url": "https://api.github.com/users/liketic/orgs", "received_events_url": "https://api.github.com/users/liketic/received_events", "repos_url": "https://api.github.com/users/liketic/repos", "site_admin": false, "starred_url": "https://api.github.com/users/liketic/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/liketic/subscriptions", "type": "User", "url": "https://api.github.com/users/liketic", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-31T10:21:31Z
2021-09-08T16:00:37Z
2016-08-05T07:47:01Z
NONE
resolved
Hi, Is there any way to support the 'http_no_tunnel' protocol in a proxy? I want to move from httplib2 to requests, but I didn't find any way to achieve it. Thanks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3238/reactions" }
https://api.github.com/repos/psf/requests/issues/3238/timeline
null
completed
null
null
false
[ "what happens when you preface your hostname with http:// and run? Any difference?\n" ]
https://api.github.com/repos/psf/requests/issues/3237
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3237/labels{/name}
https://api.github.com/repos/psf/requests/issues/3237/comments
https://api.github.com/repos/psf/requests/issues/3237/events
https://github.com/psf/requests/issues/3237
157,529,412
MDU6SXNzdWUxNTc1Mjk0MTI=
3,237
Session cookies handled incorrectly on redirect
{ "avatar_url": "https://avatars.githubusercontent.com/u/1091971?v=4", "events_url": "https://api.github.com/users/Markus-Goetz/events{/privacy}", "followers_url": "https://api.github.com/users/Markus-Goetz/followers", "following_url": "https://api.github.com/users/Markus-Goetz/following{/other_user}", "gists_url": "https://api.github.com/users/Markus-Goetz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Markus-Goetz", "id": 1091971, "login": "Markus-Goetz", "node_id": "MDQ6VXNlcjEwOTE5NzE=", "organizations_url": "https://api.github.com/users/Markus-Goetz/orgs", "received_events_url": "https://api.github.com/users/Markus-Goetz/received_events", "repos_url": "https://api.github.com/users/Markus-Goetz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Markus-Goetz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Markus-Goetz/subscriptions", "type": "User", "url": "https://api.github.com/users/Markus-Goetz", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-30T16:15:13Z
2021-09-08T18:00:40Z
2016-05-30T17:32:21Z
NONE
resolved
Hi everyone, the session cookies are handled incorrectly when being redirected from an https URL to an http resource. The initial cookies have been acquired via an https request to <domain>. Therefore, the secure attribute of the respective cookie is set to True. In a later request to the same domain via https, I am 307 redirected to the very same URL except via http. When requests tries to resolve the redirect it will not copy the cookie over because the `return_ok_secure` check in `cookielib.py` will obviously fail (mismatch between request.type == 'http' and checked type 'https'). Is that a wrong implementation by the service provider or python-requests? Below is a pseudo example: ``` import requests print requests.__version__ >>> '2.10.0' s = requests.Session() s.post('https://domain.com/login', data={'login': login, 'pw': pw}) print s.cookies >>> <RequestsCookieJar[<Cookie token=... for domain.com/>]> res = s.get('https://domain.com/resource') # redirects internally with 307 ``` Unfortunately, I cannot provide a minimal working example since the script would include sensitive information.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3237/reactions" }
https://api.github.com/repos/psf/requests/issues/3237/timeline
null
completed
null
null
false
[ "The `secure` attribute of a cookie should only be set to `True` if the service provider sets it manually. If they _do_ set it manually, that is an instruction that the cookie must not be sent over an insecure (e.g. `http`) channel. Requests is correctly refusing to send that cookie: it should only be sent when TLS is used.\n" ]
https://api.github.com/repos/psf/requests/issues/3236
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3236/labels{/name}
https://api.github.com/repos/psf/requests/issues/3236/comments
https://api.github.com/repos/psf/requests/issues/3236/events
https://github.com/psf/requests/pull/3236
157,400,129
MDExOlB1bGxSZXF1ZXN0NzE4MzQ2NDQ=
3,236
Use xfail marker for a test expected to fail without Internet connection
{ "avatar_url": "https://avatars.githubusercontent.com/u/212279?v=4", "events_url": "https://api.github.com/users/eriol/events{/privacy}", "followers_url": "https://api.github.com/users/eriol/followers", "following_url": "https://api.github.com/users/eriol/following{/other_user}", "gists_url": "https://api.github.com/users/eriol/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/eriol", "id": 212279, "login": "eriol", "node_id": "MDQ6VXNlcjIxMjI3OQ==", "organizations_url": "https://api.github.com/users/eriol/orgs", "received_events_url": "https://api.github.com/users/eriol/received_events", "repos_url": "https://api.github.com/users/eriol/repos", "site_admin": false, "starred_url": "https://api.github.com/users/eriol/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eriol/subscriptions", "type": "User", "url": "https://api.github.com/users/eriol", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-05-29T18:57:33Z
2021-09-08T03:00:59Z
2016-06-21T02:17:24Z
CONTRIBUTOR
resolved
This is only a minor improvement on the great work of https://github.com/kennethreitz/requests/pull/2859 that makes it possible to run tests without failures on hosts that have no Internet connection.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3236/reactions" }
https://api.github.com/repos/psf/requests/issues/3236/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3236.diff", "html_url": "https://github.com/psf/requests/pull/3236", "merged_at": "2016-06-21T02:17:24Z", "patch_url": "https://github.com/psf/requests/pull/3236.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3236" }
true
[ "Not sure why no-one's gotten back to you on this yet — my apologies. \n\nThis looks like a great improvement. Thank you. \n", "@eriol I'm so sorry. I think I lost track of this while I was travelling to PyCon. I expect the same happened to @Lukasa \n", "Sorry for my late reply... No need to apologize! I was aware of the PyCon during those days and I was sure this would not be lost... so just thanks for merging it! :)\n" ]
https://api.github.com/repos/psf/requests/issues/3225
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3225/labels{/name}
https://api.github.com/repos/psf/requests/issues/3225/comments
https://api.github.com/repos/psf/requests/issues/3225/events
https://github.com/psf/requests/pull/3225
157,382,518
MDExOlB1bGxSZXF1ZXN0NzE4MjUyMzQ=
3,225
Convert readthedocs link for their .org -> .io migration for hosted projects
{ "avatar_url": "https://avatars.githubusercontent.com/u/857609?v=4", "events_url": "https://api.github.com/users/adamchainz/events{/privacy}", "followers_url": "https://api.github.com/users/adamchainz/followers", "following_url": "https://api.github.com/users/adamchainz/following{/other_user}", "gists_url": "https://api.github.com/users/adamchainz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/adamchainz", "id": 857609, "login": "adamchainz", "node_id": "MDQ6VXNlcjg1NzYwOQ==", "organizations_url": "https://api.github.com/users/adamchainz/orgs", "received_events_url": "https://api.github.com/users/adamchainz/received_events", "repos_url": "https://api.github.com/users/adamchainz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/adamchainz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/adamchainz/subscriptions", "type": "User", "url": "https://api.github.com/users/adamchainz", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-29T11:48:56Z
2021-09-08T04:00:57Z
2016-05-29T13:09:48Z
NONE
resolved
As per their email ‘Changes to project subdomains’: > Starting today, Read the Docs will start hosting projects from subdomains on the domain readthedocs.io, instead of on readthedocs.org. This change addresses some security concerns around site cookies while hosting user generated data on the same domain as our dashboard. Test Plan: Manually visited all the links I’ve modified.
{ "avatar_url": "https://avatars.githubusercontent.com/u/857609?v=4", "events_url": "https://api.github.com/users/adamchainz/events{/privacy}", "followers_url": "https://api.github.com/users/adamchainz/followers", "following_url": "https://api.github.com/users/adamchainz/following{/other_user}", "gists_url": "https://api.github.com/users/adamchainz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/adamchainz", "id": 857609, "login": "adamchainz", "node_id": "MDQ6VXNlcjg1NzYwOQ==", "organizations_url": "https://api.github.com/users/adamchainz/orgs", "received_events_url": "https://api.github.com/users/adamchainz/received_events", "repos_url": "https://api.github.com/users/adamchainz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/adamchainz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/adamchainz/subscriptions", "type": "User", "url": "https://api.github.com/users/adamchainz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3225/reactions" }
https://api.github.com/repos/psf/requests/issues/3225/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3225.diff", "html_url": "https://github.com/psf/requests/pull/3225", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3225.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3225" }
true
[ "Just realized these are all in `urllib3`, replaced with shazow/urllib3#882\n" ]
https://api.github.com/repos/psf/requests/issues/3224
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3224/labels{/name}
https://api.github.com/repos/psf/requests/issues/3224/comments
https://api.github.com/repos/psf/requests/issues/3224/events
https://github.com/psf/requests/issues/3224
157,332,409
MDU6SXNzdWUxNTczMzI0MDk=
3,224
Global verify options
{ "avatar_url": "https://avatars.githubusercontent.com/u/9717944?v=4", "events_url": "https://api.github.com/users/abelmokadem/events{/privacy}", "followers_url": "https://api.github.com/users/abelmokadem/followers", "following_url": "https://api.github.com/users/abelmokadem/following{/other_user}", "gists_url": "https://api.github.com/users/abelmokadem/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/abelmokadem", "id": 9717944, "login": "abelmokadem", "node_id": "MDQ6VXNlcjk3MTc5NDQ=", "organizations_url": "https://api.github.com/users/abelmokadem/orgs", "received_events_url": "https://api.github.com/users/abelmokadem/received_events", "repos_url": "https://api.github.com/users/abelmokadem/repos", "site_admin": false, "starred_url": "https://api.github.com/users/abelmokadem/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/abelmokadem/subscriptions", "type": "User", "url": "https://api.github.com/users/abelmokadem", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-05-28T10:35:52Z
2021-09-08T18:00:40Z
2016-05-28T14:42:03Z
NONE
resolved
I have noticed that it is possible to specify `verify=False` in order to avoid `CERTIFICATE_VERIFY_FAILED`. Is it possible to globally set this option?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3224/reactions" }
https://api.github.com/repos/psf/requests/issues/3224/timeline
null
completed
null
null
false
[ "It is not, no.\n\nSetting `verify=False` is an extremely insecure thing to do, and should be avoided if at all possible. Having a hook that enables you to _globally_ disable TLS certificate verification is extremely insecure, and is also where we draw the line.\n", "Further, we request that all questions be asked on [StackOverflow](https://stackoverflow.com), not on the defect tracker.\n" ]
https://api.github.com/repos/psf/requests/issues/3223
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3223/labels{/name}
https://api.github.com/repos/psf/requests/issues/3223/comments
https://api.github.com/repos/psf/requests/issues/3223/events
https://github.com/psf/requests/issues/3223
157,219,651
MDU6SXNzdWUxNTcyMTk2NTE=
3,223
Handling no network at startup in Requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/12682170?v=4", "events_url": "https://api.github.com/users/shreepads/events{/privacy}", "followers_url": "https://api.github.com/users/shreepads/followers", "following_url": "https://api.github.com/users/shreepads/following{/other_user}", "gists_url": "https://api.github.com/users/shreepads/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/shreepads", "id": 12682170, "login": "shreepads", "node_id": "MDQ6VXNlcjEyNjgyMTcw", "organizations_url": "https://api.github.com/users/shreepads/orgs", "received_events_url": "https://api.github.com/users/shreepads/received_events", "repos_url": "https://api.github.com/users/shreepads/repos", "site_admin": false, "starred_url": "https://api.github.com/users/shreepads/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/shreepads/subscriptions", "type": "User", "url": "https://api.github.com/users/shreepads", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-05-27T14:31:17Z
2021-09-08T16:00:37Z
2016-08-05T07:47:29Z
NONE
resolved
To be honest I don't know if I'm doing something wrong, but I am unable to find a solution so here goes. This is running on my laptop connected over wifi - no proxies. I'm on Fedora 23, running requests 2.10.0 in a Python 3.4.3 virtual env. I have an infinite loop running requests.get()s on an http URL (say 'http://info.cern.ch/hypertext/WWW/TheProject.html') and sleeping. If I start the program with wifi ON, it works well. The gets succeed. If I turn OFF wifi on my laptop (program still running), the gets start throwing exceptions which I catch and handle. Then when I turn wifi ON again (program still running), the gets start working again. But if I turn wifi OFF and then start the program, it doesn't work. The gets throw exceptions which I catch and handle. But then if I turn ON the wifi (program still running) the gets continue to throw exceptions. At this point I can access the url from my browser but the requests.get()s continue to fail throwing `RequestException: HTTPConnectionPool(host='info.cern.ch', port=80): Max retries exceeded with url: /hypertext/WWW/TheProject.html (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fe105b12cf8>: Failed to establish a new connection: [Errno -5] No address associated with hostname',))` Best I can make out is that the HTTPConnectionPool created at startup - with wifi OFF - somehow sticks around (probably tied to the eth interface?) and continues to fail even though wifi is now working. I have tried creating a Session in the while loop and explicitly close()ing it at the end. I've tried creating a Session outside the while loop and close() and re-creating it inside the get() exception handling. I've even tried `import requests` inside the while loop (yes I know that's a bad idea) but even that fails. I can reproduce this behaviour from the Python REPL environment as follows: 1. Turn OFF wifi 2. 
Start REPL, import requests and get() ``` $ python Python 3.4.3 (default, Mar 31 2016, 20:42:37) [GCC 5.3.1 20151207 (Red Hat 5.3.1-2)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> >>> >>> requests.get('http://google.com') Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 142, in _new_conn (self.host, self.port), self.timeout, **extra_kw) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 67, in create_connection for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM): File "/usr/lib64/python3.4/socket.py", line 533, in getaddrinfo for res in _socket.getaddrinfo(host, port, family, type, proto, flags): socket.gaierror: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request conn.request(method, url, **httplib_request_kw) File "/usr/lib64/python3.4/http/client.py", line 1088, in request self._send_request(method, url, body, headers) File "/usr/lib64/python3.4/http/client.py", line 1126, in _send_request self.endheaders(body) File "/usr/lib64/python3.4/http/client.py", line 1084, in endheaders self._send_output(message_body) File "/usr/lib64/python3.4/http/client.py", line 922, in _send_output self.send(msg) File "/usr/lib64/python3.4/http/client.py", line 857, in send self.connect() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 167, in connect conn = self._new_conn() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 151, in _new_conn self, "Failed to establish a new 
connection: %s" % e) requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6dcb00>: Failed to establish a new connection: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 403, in send timeout=timeout File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 281, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6dcb00>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "~/venv/lib64/python3.4/site-packages/requests/api.py", line 71, in get return request('get', url, params=params, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/api.py", line 57, in request return session.request(method=method, url=url, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 467, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='google.com', port=80): Max retries exceeded with url: / (Caused by 
NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6dcb00>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) >>> ``` 1. Turn ON wifi, google.com accessible from browser 2. Back to REPL get() errors ``` >>> requests.get('http://google.com') Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 142, in _new_conn (self.host, self.port), self.timeout, **extra_kw) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 67, in create_connection for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM): File "/usr/lib64/python3.4/socket.py", line 533, in getaddrinfo for res in _socket.getaddrinfo(host, port, family, type, proto, flags): socket.gaierror: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request conn.request(method, url, **httplib_request_kw) File "/usr/lib64/python3.4/http/client.py", line 1088, in request self._send_request(method, url, body, headers) File "/usr/lib64/python3.4/http/client.py", line 1126, in _send_request self.endheaders(body) File "/usr/lib64/python3.4/http/client.py", line 1084, in endheaders self._send_output(message_body) File "/usr/lib64/python3.4/http/client.py", line 922, in _send_output self.send(msg) File "/usr/lib64/python3.4/http/client.py", line 857, in send self.connect() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 167, in connect conn = self._new_conn() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 151, in 
_new_conn self, "Failed to establish a new connection: %s" % e) requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6f4470>: Failed to establish a new connection: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 403, in send timeout=timeout File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 281, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6f4470>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "~/venv/lib64/python3.4/site-packages/requests/api.py", line 71, in get return request('get', url, params=params, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/api.py", line 57, in request return session.request(method=method, url=url, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 467, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='google.com', port=80): Max 
retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb41c6f4470>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) >>> ``` I expect the second get() to work. Particularly if I've used a Session explicitly. Wifi OFF ``` $ python Python 3.4.3 (default, Mar 31 2016, 20:42:37) [GCC 5.3.1 20151207 (Red Hat 5.3.1-2)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> sess = requests.session() >>> sess.get('http://www.google.com') Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 142, in _new_conn (self.host, self.port), self.timeout, **extra_kw) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 67, in create_connection for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM): File "/usr/lib64/python3.4/socket.py", line 533, in getaddrinfo for res in _socket.getaddrinfo(host, port, family, type, proto, flags): socket.gaierror: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request conn.request(method, url, **httplib_request_kw) File "/usr/lib64/python3.4/http/client.py", line 1088, in request self._send_request(method, url, body, headers) File "/usr/lib64/python3.4/http/client.py", line 1126, in _send_request self.endheaders(body) File "/usr/lib64/python3.4/http/client.py", line 1084, in endheaders self._send_output(message_body) File "/usr/lib64/python3.4/http/client.py", line 922, in _send_output self.send(msg) File 
"/usr/lib64/python3.4/http/client.py", line 857, in send self.connect() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 167, in connect conn = self._new_conn() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 151, in _new_conn self, "Failed to establish a new connection: %s" % e) requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e0ab00>: Failed to establish a new connection: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 403, in send timeout=timeout File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 281, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='www.google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e0ab00>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 487, in get return self.request('GET', url, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File 
"~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 467, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='www.google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e0ab00>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) >>> sess.close() >>> del sess ``` wifi ON, google.com reachable from browser ``` >>> sess = requests.session() >>> sess.get('http://www.google.com') Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 142, in _new_conn (self.host, self.port), self.timeout, **extra_kw) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 67, in create_connection for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM): File "/usr/lib64/python3.4/socket.py", line 533, in getaddrinfo for res in _socket.getaddrinfo(host, port, family, type, proto, flags): socket.gaierror: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen chunked=chunked) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request conn.request(method, url, **httplib_request_kw) File "/usr/lib64/python3.4/http/client.py", line 1088, in request self._send_request(method, url, body, headers) File "/usr/lib64/python3.4/http/client.py", line 1126, in _send_request self.endheaders(body) File "/usr/lib64/python3.4/http/client.py", line 1084, in endheaders self._send_output(message_body) File "/usr/lib64/python3.4/http/client.py", line 922, in _send_output self.send(msg) File 
"/usr/lib64/python3.4/http/client.py", line 857, in send self.connect() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 167, in connect conn = self._new_conn() File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connection.py", line 151, in _new_conn self, "Failed to establish a new connection: %s" % e) requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e234e0>: Failed to establish a new connection: [Errno -5] No address associated with hostname During handling of the above exception, another exception occurred: Traceback (most recent call last): File "~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 403, in send timeout=timeout File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "~/venv/lib64/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 281, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='www.google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e234e0>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 487, in get return self.request('GET', url, **kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "~/venv/lib64/python3.4/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File 
"~/venv/lib64/python3.4/site-packages/requests/adapters.py", line 467, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='www.google.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e83e234e0>: Failed to establish a new connection: [Errno -5] No address associated with hostname',)) >>> ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3223/reactions" }
https://api.github.com/repos/psf/requests/issues/3223/timeline
null
completed
null
null
false
[ "Hrm. This is what I'd call \"somewhat puzzling\". I'm travelling at the moment so I don't have time to dive into this right this second, but I'll aim to take a look in the next couple of days and see if I can repro this.\n", "Ok, so here's the thing.\n\nRequests has no implicit global state. That means that two subsequent `requests.get` calls use totally brand new state. The fact that you're seeing this problem suggests that the state at either the OS or Python level are causing problems.\n\nI also can't reproduce this on OS X, which further suggests that this is a problem with your networking stack in some form.\n", "So the fact that these are all `getaddrinfo` failures makes me think that this might be some sort of DNS caching. I don't think Python 3.4 does that, so I'm guessing Fedora is doing it. I also don't think there's a way that Requests can work around this. And I don't know Fedora very well but @ralphbean does. Maybe he can help us out here?\n", "Debian 8\nPython 3.4.2 and 2.7.9\nRequests 2.4.3\nCouldn't reproduce the behavior.\n", "@AlexPHorta I appreciate that but @shreepads is using 2.10.0 on Fedora 23\n", "Closing for inactivity.\n" ]
https://api.github.com/repos/psf/requests/issues/3222
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3222/labels{/name}
https://api.github.com/repos/psf/requests/issues/3222/comments
https://api.github.com/repos/psf/requests/issues/3222/events
https://github.com/psf/requests/issues/3222
157,042,171
MDU6SXNzdWUxNTcwNDIxNzE=
3,222
Session.request() overrides socket.setdefaulttimeout()
{ "avatar_url": "https://avatars.githubusercontent.com/u/3270755?v=4", "events_url": "https://api.github.com/users/adeverteuil/events{/privacy}", "followers_url": "https://api.github.com/users/adeverteuil/followers", "following_url": "https://api.github.com/users/adeverteuil/following{/other_user}", "gists_url": "https://api.github.com/users/adeverteuil/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/adeverteuil", "id": 3270755, "login": "adeverteuil", "node_id": "MDQ6VXNlcjMyNzA3NTU=", "organizations_url": "https://api.github.com/users/adeverteuil/orgs", "received_events_url": "https://api.github.com/users/adeverteuil/received_events", "repos_url": "https://api.github.com/users/adeverteuil/repos", "site_admin": false, "starred_url": "https://api.github.com/users/adeverteuil/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/adeverteuil/subscriptions", "type": "User", "url": "https://api.github.com/users/adeverteuil", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-26T17:56:54Z
2021-09-08T16:00:36Z
2016-08-05T07:47:58Z
NONE
resolved
Hello, I am using requests 2.10.0 and Python 2.7.11. Because `timeout=None` is the default parameter value in the `sessions.Session.request()` method, the value `None` is propagated all the way down to the low-level `socket` module, overriding whatever value was set using `socket.setdefaulttimeout(timeout)`. If the user doesn't assign a timeout value, the expected result is that `socket.getdefaulttimeout()` is used. To get this behavior, the value of `packages.urllib3.util.timeout._default` should be used as the default instead of `None`, since `None` has a special meaning. This can be done by importing `packages.urllib3.util.Timeout` as `TimeoutSauce` (as is done in the `adapters` module) and assigning `timeout=TimeoutSauce.DEFAULT_TIMEOUT`. Here is a unit test suite that shows that using `socket` directly respects `socket.getdefaulttimeout()`, and another test using `requests` that blocks indefinitely unless the above modification is applied.

```
# usage: python -m unittest discover
import socket
import unittest

import requests


class SocketTimeoutTestCase(unittest.TestCase):

    def setUp(self):
        self.original_timeout = socket.getdefaulttimeout()
        socket.setdefaulttimeout(0.1)
        self.host = '127.0.0.1'
        self.port = 50000
        self.server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.server.bind((self.host, self.port))
        self.server.listen(1)

    def tearDown(self):
        socket.setdefaulttimeout(self.original_timeout)
        self.server.close()

    def test_default_timeout_with_socket(self):
        client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        with self.assertRaises(socket.timeout):
            client.connect((self.host, self.port))
            conn, addr = self.server.accept()
            client.recv(1024)
            conn.close()

    def test_default_timeout_with_requests(self):
        with self.assertRaises(requests.Timeout):
            r = requests.get('http://{}:{}/'.format(self.host, self.port))
```

May I submit a pull request? I need to read the contributing guidelines to prepare it correctly, as this would be my first pull request. As far as I know (and I am not familiar with the code base), only the `sessions` module requires modification, but you may know better.
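Until something like that lands in requests itself, a commonly suggested workaround is a transport adapter that substitutes a default when the caller passes no timeout. This is a sketch, not part of the requests API; `TimeoutAdapter` and `default_timeout` are hypothetical names:

```python
import socket

import requests
from requests.adapters import HTTPAdapter


class TimeoutAdapter(HTTPAdapter):
    """Hypothetical transport adapter: falls back to a default timeout
    whenever the caller leaves timeout=None."""

    def __init__(self, *args, **kwargs):
        # Fall back to the socket module's global default (may itself be None).
        self.default_timeout = kwargs.pop("default_timeout",
                                          socket.getdefaulttimeout())
        super(TimeoutAdapter, self).__init__(*args, **kwargs)

    def send(self, request, **kwargs):
        if kwargs.get("timeout") is None:
            kwargs["timeout"] = self.default_timeout
        return super(TimeoutAdapter, self).send(request, **kwargs)


session = requests.Session()
adapter = TimeoutAdapter(default_timeout=0.1)
session.mount("http://", adapter)
session.mount("https://", adapter)
```

Mounting the adapter on both schemes makes every request through the session inherit the 0.1 s timeout unless one is passed explicitly on the call.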
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3222/reactions" }
https://api.github.com/repos/psf/requests/issues/3222/timeline
null
completed
null
null
false
[ "This is related to \n- https://github.com/shazow/urllib3/issues/655\n\nWhich has been fixed for 3.0.0 because it's otherwise not backwards compatible.\n" ]
https://api.github.com/repos/psf/requests/issues/3221
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3221/labels{/name}
https://api.github.com/repos/psf/requests/issues/3221/comments
https://api.github.com/repos/psf/requests/issues/3221/events
https://github.com/psf/requests/issues/3221
156,979,920
MDU6SXNzdWUxNTY5Nzk5MjA=
3,221
did it update doc?
{ "avatar_url": "https://avatars.githubusercontent.com/u/4942395?v=4", "events_url": "https://api.github.com/users/yangbeom/events{/privacy}", "followers_url": "https://api.github.com/users/yangbeom/followers", "following_url": "https://api.github.com/users/yangbeom/following{/other_user}", "gists_url": "https://api.github.com/users/yangbeom/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yangbeom", "id": 4942395, "login": "yangbeom", "node_id": "MDQ6VXNlcjQ5NDIzOTU=", "organizations_url": "https://api.github.com/users/yangbeom/orgs", "received_events_url": "https://api.github.com/users/yangbeom/received_events", "repos_url": "https://api.github.com/users/yangbeom/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yangbeom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yangbeom/subscriptions", "type": "User", "url": "https://api.github.com/users/yangbeom", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-05-26T13:10:15Z
2021-09-08T18:00:41Z
2016-05-26T13:27:34Z
NONE
resolved
I am translating the docs into Korean (Doc-KR). In todo.rst there is: > As of v1.0.0, Requests has now entered a feature freeze. Should this sentence be translated, or deleted?
{ "avatar_url": "https://avatars.githubusercontent.com/u/4942395?v=4", "events_url": "https://api.github.com/users/yangbeom/events{/privacy}", "followers_url": "https://api.github.com/users/yangbeom/followers", "following_url": "https://api.github.com/users/yangbeom/following{/other_user}", "gists_url": "https://api.github.com/users/yangbeom/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yangbeom", "id": 4942395, "login": "yangbeom", "node_id": "MDQ6VXNlcjQ5NDIzOTU=", "organizations_url": "https://api.github.com/users/yangbeom/orgs", "received_events_url": "https://api.github.com/users/yangbeom/received_events", "repos_url": "https://api.github.com/users/yangbeom/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yangbeom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yangbeom/subscriptions", "type": "User", "url": "https://api.github.com/users/yangbeom", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3221/reactions" }
https://api.github.com/repos/psf/requests/issues/3221/timeline
null
completed
null
null
false
[ "That sentence is true, but it's still fine to continue with translations. =)\n", "@lukasa Thanks :) i think that sentence is unnecessary in v2.10.0, so seek your advice.\n", "It's still necessary: we're still in a nominal feature freeze. =)\n", "@Lukasa make sense!! thanks =)\n" ]
https://api.github.com/repos/psf/requests/issues/3220
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3220/labels{/name}
https://api.github.com/repos/psf/requests/issues/3220/comments
https://api.github.com/repos/psf/requests/issues/3220/events
https://github.com/psf/requests/issues/3220
156,878,829
MDU6SXNzdWUxNTY4Nzg4Mjk=
3,220
Location Value Error when trying to post through a session
{ "avatar_url": "https://avatars.githubusercontent.com/u/8885119?v=4", "events_url": "https://api.github.com/users/hudcoley/events{/privacy}", "followers_url": "https://api.github.com/users/hudcoley/followers", "following_url": "https://api.github.com/users/hudcoley/following{/other_user}", "gists_url": "https://api.github.com/users/hudcoley/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hudcoley", "id": 8885119, "login": "hudcoley", "node_id": "MDQ6VXNlcjg4ODUxMTk=", "organizations_url": "https://api.github.com/users/hudcoley/orgs", "received_events_url": "https://api.github.com/users/hudcoley/received_events", "repos_url": "https://api.github.com/users/hudcoley/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hudcoley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hudcoley/subscriptions", "type": "User", "url": "https://api.github.com/users/hudcoley", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2016-05-26T00:25:05Z
2021-09-08T16:00:34Z
2016-08-05T07:55:18Z
NONE
resolved
I am using a 3rd party library that uses requests, and when I try to post using a session I get this error:

```
  File "", line 1, in
  File "/usr/local/lib/python2.7/dist-packages/pacer_lib/scraper.py", line 42, in __init__
    self.refresh_login()
  File "/usr/local/lib/python2.7/dist-packages/pacer_lib/scraper.py", line 74, in refresh_login
    response = self.br.post(login_url, data=payload)
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 500, in post
    return self.request('POST', url, data=data, json=json, **kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 595, in send
    history = [resp for resp in gen] if allow_redirects else []
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 189, in resolve_redirects
    allow_redirects=False,
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 329, in send
    conn = self.get_connection(request.url, proxies)
  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 243, in get_connection
    conn = self.poolmanager.connection_from_url(url)
  File "/usr/lib/python2.7/dist-packages/urllib3/poolmanager.py", line 131, in connection_from_url
    return self.connection_from_host(u.host, port=u.port, scheme=u.scheme)
  File "/usr/lib/python2.7/dist-packages/urllib3/poolmanager.py", line 102, in connection_from_host
    raise LocationValueError("No host specified.")
urllib3.exceptions.LocationValueError: No host specified.
```

Here is the relevant source from pacer_lib:

```
def refresh_login(self):
    """
    Logs in to the PACER system using the login and password provided at
    the initialization of `search_agent()`. This will create a Requests
    session that will allow you to query the PACER system.
    If auto_login=False, `refresh_login()` must be called before you can
    query the case_locator. This function will raise an error if you
    supply an invalid login or password.

    Returns nothing.
    """
    # SETTINGS (determined from the form from PACER's '/login.pl')
    payload = {'loginid': self.username, 'passwd': self.password}
    login_url = 'https://pacer.login.uscourts.gov/cgi-bin/check-pacer-passwd.pl'
    self.br = requests.Session()
    response = self.br.post(login_url, data=payload)
```

I have tried just posting without a session and it works (at least I get a 200), but every time I try to use a session I get that same error. Any help would be greatly appreciated.
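One way to debug a failing redirect like this is to pass `allow_redirects=False`, inspect the `Location` header the server returned, and repair it by hand. The sketch below assumes the server meant to redirect back to the host it was contacted on; `fix_location` is a hypothetical helper, not part of requests:

```python
from urllib.parse import urlsplit, urlunsplit  # Python 3; urlparse on Python 2


def fix_location(request_url, location):
    """Fill in a missing host in a redirect Location header, reusing the
    host of the request that produced it (a workaround for a server
    emitting an under-specified header like 'http:///cgi-bin/links.pl')."""
    base, target = urlsplit(request_url), urlsplit(location)
    if not target.netloc:  # the header carried no host at all
        target = target._replace(netloc=base.netloc)
    return urlunsplit(target)


# With automatic redirects disabled, the broken header can be inspected:
#   resp = session.post(login_url, data=payload, allow_redirects=False)
#   next_url = fix_location(resp.request.url, resp.headers['Location'])
```

The cookies set by the login response are still stored on the session, so the repaired URL can then be fetched normally.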
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3220/reactions" }
https://api.github.com/repos/psf/requests/issues/3220/timeline
null
completed
null
null
false
[ "In this instance the problem seems to be related to a redirect occurring. Are you sure that nothing is being set on the session here, anywhere in the code?\n", "It doesn't look like it is, as far as I can tell. This method that is creating the session is invoked in the constructor of a class, and the session 'br' is being initialized as an empty string. Is it possible that the library contains a bad link that doesn't work the way it was intended anymore?\n", "I sincerely doubt it. Requests uses a `Session` internally when you use a `requests.*` method, and that includes following redirects. There is essentially no difference between `requests.post(x)` and `s = requests.Session(); s.post(x)`.\n\nBecause I don't have a PACER login I can't validate the problem, of course, but one option would be to set `allow_redirects=False` and then manually check the Location header in the response to see what it says.\n", "So the location I get is \"http:///cgi-bin/links.pl\", which I tried to use directly in a post and got that it is an invalid URL (obvious, but still figured I'd try). It looks like the pacer.login.uscourts.gov part is being taken out\n", "Hrm...yeah, that URL is invalid. Is that the `Location` header value directly?\n", "Yeah. 
Here's the code I'm running\n'''\ns = requests.Session()\nprint s.headers\nr = s.post('https://pacer.login.uscourts.gov/cgi-bin/check-pacer-passwd.pl', data = payload)\nprint r.headers\n'''\nand the print headers prints this:\n{'Content-Length': '208', 'X-Content-Type-Options': 'nosniff', 'Set-Cookie': 'PacerSession=\"janfFdEMIJ14lGFtPtR0S5itQ0LhmKdvtcpM3zde72I2wDcyVyNktJvvSZU92AK7OOUFKkpLzHLKoQIA9fvzaH0UY3m3VFtTIvBJoIM26XaMV6SJSFo4JFRpJEZQ7TIs\"; path=/; domain=.uscourts.gov; secure, PacerClientCode=\"\"; path=/; domain=.uscourts.gov; secure, PacerPref=\"receipt=Y\"; path=/; domain=.uscourts.gov; secure', 'Keep-Alive': 'timeout=15, max=100', 'Server': 'Apache', 'Connection': 'Keep-Alive', 'Location': 'http:///cgi-bin/links.pl', 'Date': 'Wed, 01 Jun 2016 00:15:43 GMT', 'X-Frame-Options': 'SAMEORIGIN', 'Content-Type': 'text/html; charset=iso-8859-1'}\n", "Wow. Ok, so that means that PACER is producing the invalid header, presumably by having something in their server stack screw up the location header they _wanted_ to produce (which is probably `/cgi-bin/links.pl`). I don't know that we can really do much about this: the URL is severely under-specified.\n", "Fair enough. Could I try passing the location header myself, or is that not how it works? I don't really understand http (which is why I appreciate so much what you guys do). Is this the kind of thing I could raise with pacer's IT team?\n", "Sadly you can't pass the header yourself, it's returned from the server. I think it's exactly the kind of thing you could raise with pacer's IT team. =)\n\nWhat you can do is set `allow_redirects=False` and then fixup the URL yourself, manually following the redirects until they terminate. That might not be a good idea if you're not confident with HTTP though.\n", "Awesome, thank you so much for the help. 
One final question before I get out of your hair, if I were able to login via another url using a Session, I should still be able to access the various pages within pacer that I want to, correct?\n", "Absolutely!\n\nYou may also find that you can use pacer immediately after the exception: the login still worked and the cookies should still be stored. \n" ]
https://api.github.com/repos/psf/requests/issues/3219
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3219/labels{/name}
https://api.github.com/repos/psf/requests/issues/3219/comments
https://api.github.com/repos/psf/requests/issues/3219/events
https://github.com/psf/requests/issues/3219
156,497,454
MDU6SXNzdWUxNTY0OTc0NTQ=
3,219
requests does not provide a way to send full chain-of-trust when server makes a CertificateRequest
{ "avatar_url": "https://avatars.githubusercontent.com/u/6081083?v=4", "events_url": "https://api.github.com/users/jaloren/events{/privacy}", "followers_url": "https://api.github.com/users/jaloren/followers", "following_url": "https://api.github.com/users/jaloren/following{/other_user}", "gists_url": "https://api.github.com/users/jaloren/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaloren", "id": 6081083, "login": "jaloren", "node_id": "MDQ6VXNlcjYwODEwODM=", "organizations_url": "https://api.github.com/users/jaloren/orgs", "received_events_url": "https://api.github.com/users/jaloren/received_events", "repos_url": "https://api.github.com/users/jaloren/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaloren/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaloren/subscriptions", "type": "User", "url": "https://api.github.com/users/jaloren", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-05-24T12:43:04Z
2021-09-08T18:00:42Z
2016-05-24T12:46:55Z
NONE
resolved
I am on Python 2.7 installed on CentOS 6, with OpenSSL 1.0.1e and requests version 2.9.1. The HTTP server is performing SSL client authentication. The client's chain has a verification depth of 2, which means there is a single intermediate CA certificate in the client chain. I am not sure if this is a bug or a feature request. Based on the public API, I sent a request like so (note that I have explicitly chosen not to verify the server's chain of trust; I know that's not secure, but this isn't being used in production). The cert.pem contains the client's PEM-encoded certificate, the intermediate CA cert, and the root cert; the final entry in cert.pem is the client's private key (which is NOT encrypted).

```
resp = requests.get(url, verify=False, cert='cert.pem')
```

The following exception is thrown:

```
requests.exceptions.SSLError: [Errno bad handshake] (-1,)
```

Looking at the server's SSL logs, I can see that the server terminated the SSL connection because it was unable to construct the client's chain of trust. It could not do this because the requests library sent only the client cert. I verified this with openssl s_server as well. I could not find anything in the official documentation that would allow me to specify a set of certs to be included in the client's chain of trust. Looking at the pyOpenSSL wrapper used by requests, I can see the Context.use_certificate method, which sets the leaf cert; I suspect that sets only a single certificate. For this to work, the requests external API would need to provide an interface for passing in a list of PEM-encoded certs, which would then be set via the Context.add_extra_chain_cert method.
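One thing worth checking first is the ordering of the combined bundle: OpenSSL's chain-file loading expects the leaf certificate first, then the intermediates, with the private key last. A small sketch for inspecting a bundle before handing it to the `cert` parameter (`split_pem` is a hypothetical helper, not part of requests):

```python
def split_pem(pem_text):
    """Split a combined PEM bundle into its individual BEGIN/END blocks
    so the ordering (leaf cert, intermediate(s), root, private key) can
    be verified before passing the file to requests' cert parameter."""
    blocks, current = [], []
    for line in pem_text.splitlines():
        if line.startswith("-----BEGIN "):
            current = [line]          # start of a new PEM block
        elif line.startswith("-----END "):
            current.append(line)      # close and collect the block
            blocks.append("\n".join(current))
            current = []
        elif current:
            current.append(line)      # base64 payload lines
    return blocks
```

With the stdlib ssl backend, `ssl.SSLContext.load_cert_chain()` does send every certificate found in such a file (leaf first); the pyOpenSSL path would need something like the `add_extra_chain_cert` call described above.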
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3219/reactions" }
https://api.github.com/repos/psf/requests/issues/3219/timeline
null
completed
null
null
false
[ "Hi, thanks for this report!\n\nThe PyOpenSSL wrapper here is actually part of [urllib3](https://github.com/shazow/urllib3), and requests uses urllib3 exactly as provided in that project. Do you mind opening this issue over there instead? (It'll seem weird because it'll still be me responding to you there, but trust me, this is organisationally cleaner for us.)\n", "Sure!! Not a problem.\n" ]
https://api.github.com/repos/psf/requests/issues/3218
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3218/labels{/name}
https://api.github.com/repos/psf/requests/issues/3218/comments
https://api.github.com/repos/psf/requests/issues/3218/events
https://github.com/psf/requests/issues/3218
156,493,607
MDU6SXNzdWUxNTY0OTM2MDc=
3,218
uncaught possible exception in urllib3/connectionpool.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/772003?v=4", "events_url": "https://api.github.com/users/sposs/events{/privacy}", "followers_url": "https://api.github.com/users/sposs/followers", "following_url": "https://api.github.com/users/sposs/following{/other_user}", "gists_url": "https://api.github.com/users/sposs/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sposs", "id": 772003, "login": "sposs", "node_id": "MDQ6VXNlcjc3MjAwMw==", "organizations_url": "https://api.github.com/users/sposs/orgs", "received_events_url": "https://api.github.com/users/sposs/received_events", "repos_url": "https://api.github.com/users/sposs/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sposs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sposs/subscriptions", "type": "User", "url": "https://api.github.com/users/sposs", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-05-24T12:23:45Z
2021-09-08T18:00:42Z
2016-05-24T12:32:09Z
NONE
resolved
Hi, I'm seeing the following exception ``` File "/opt/virtualenvs/myapp/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 314, in _raise_timeout if 'timed out' in str(err) or 'did not complete (read)' in str(err): # Python 2.6 TypeError: __str__ returned non-string (type SysCallError) ``` I'm using version 2.7.0 of requests. The URL I try to access is an https one. I don't really know how to debug this...
{ "avatar_url": "https://avatars.githubusercontent.com/u/772003?v=4", "events_url": "https://api.github.com/users/sposs/events{/privacy}", "followers_url": "https://api.github.com/users/sposs/followers", "following_url": "https://api.github.com/users/sposs/following{/other_user}", "gists_url": "https://api.github.com/users/sposs/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sposs", "id": 772003, "login": "sposs", "node_id": "MDQ6VXNlcjc3MjAwMw==", "organizations_url": "https://api.github.com/users/sposs/orgs", "received_events_url": "https://api.github.com/users/sposs/received_events", "repos_url": "https://api.github.com/users/sposs/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sposs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sposs/subscriptions", "type": "User", "url": "https://api.github.com/users/sposs", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3218/reactions" }
https://api.github.com/repos/psf/requests/issues/3218/timeline
null
completed
null
null
false
[ "Please update your version of requests. This was fixed several releases ago.\n", "Thanks. Updated now. Will reopen if needed.\n" ]
https://api.github.com/repos/psf/requests/issues/3217
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3217/labels{/name}
https://api.github.com/repos/psf/requests/issues/3217/comments
https://api.github.com/repos/psf/requests/issues/3217/events
https://github.com/psf/requests/issues/3217
156,365,023
MDU6SXNzdWUxNTYzNjUwMjM=
3,217
unhandled exception on timeout
{ "avatar_url": "https://avatars.githubusercontent.com/u/522344?v=4", "events_url": "https://api.github.com/users/alanhamlett/events{/privacy}", "followers_url": "https://api.github.com/users/alanhamlett/followers", "following_url": "https://api.github.com/users/alanhamlett/following{/other_user}", "gists_url": "https://api.github.com/users/alanhamlett/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alanhamlett", "id": 522344, "login": "alanhamlett", "node_id": "MDQ6VXNlcjUyMjM0NA==", "organizations_url": "https://api.github.com/users/alanhamlett/orgs", "received_events_url": "https://api.github.com/users/alanhamlett/received_events", "repos_url": "https://api.github.com/users/alanhamlett/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alanhamlett/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alanhamlett/subscriptions", "type": "User", "url": "https://api.github.com/users/alanhamlett", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-23T20:56:26Z
2021-09-08T18:00:42Z
2016-05-24T07:00:54Z
CONTRIBUTOR
resolved
``` python Traceback (most recent call last): File "requests/packages/urllib3/connectionpool.py", line 623, in urlopen _stacktrace=sys.exc_info()[2]) File "requests/packages/urllib3/util/retry.py", line 285, in increment _observed_errors=_observed_errors) File "requests/packages/urllib3/util/retry.py", line 151, in new raise_on_status=self.raise_on_status, AttributeError: 'Retry' object has no attribute 'raise_on_status' ``` You can reproduce this by changing line 140 of `requests/packages/urllib3/connection.py` to: ``` 140 try: 141 raise SocketTimeout('') 142 conn = connection.create_connection( 143 (self.host, self.port), self.timeout, **extra_kw) 144 145 except SocketTimeout as e: ``` This forces a `SocketTimeout` and causes the unhandled exception when calling `Retry.new`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3217/reactions" }
https://api.github.com/repos/psf/requests/issues/3217/timeline
null
completed
null
null
false
[ "Hey @alanhamlett, thanks for reporting this!\n\nThis will be fixed when we release the next minor version of requests, which will bring in an updated version of urllib3.\n" ]
https://api.github.com/repos/psf/requests/issues/3216
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3216/labels{/name}
https://api.github.com/repos/psf/requests/issues/3216/comments
https://api.github.com/repos/psf/requests/issues/3216/events
https://github.com/psf/requests/pull/3216
156,363,982
MDExOlB1bGxSZXF1ZXN0NzExMTg2MDQ=
3,216
Make BaseAdapter describe the mandatory adapter interface
{ "avatar_url": "https://avatars.githubusercontent.com/u/348449?v=4", "events_url": "https://api.github.com/users/nanonyme/events{/privacy}", "followers_url": "https://api.github.com/users/nanonyme/followers", "following_url": "https://api.github.com/users/nanonyme/following{/other_user}", "gists_url": "https://api.github.com/users/nanonyme/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nanonyme", "id": 348449, "login": "nanonyme", "node_id": "MDQ6VXNlcjM0ODQ0OQ==", "organizations_url": "https://api.github.com/users/nanonyme/orgs", "received_events_url": "https://api.github.com/users/nanonyme/received_events", "repos_url": "https://api.github.com/users/nanonyme/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nanonyme/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nanonyme/subscriptions", "type": "User", "url": "https://api.github.com/users/nanonyme", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-23T20:51:04Z
2021-09-08T04:00:57Z
2016-05-24T18:23:16Z
CONTRIBUTOR
resolved
BaseAdapter should document the mandatory interfaces for implementing your own adapter from scratch for a different HTTP library. Currently requests requires all the parameters from the HTTPAdapter implementation of send to be defined, so this copies them, and the relevant documentation, over to the base adapter. It also adds the relevant parts of the documentation on what close does to the base class.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3216/reactions" }
https://api.github.com/repos/psf/requests/issues/3216/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3216.diff", "html_url": "https://github.com/psf/requests/pull/3216", "merged_at": "2016-05-24T18:23:16Z", "patch_url": "https://github.com/psf/requests/pull/3216.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3216" }
true
[ "There's one small note from @sigmavirus24 but I'd be happy to merge this when that's resolved.\n" ]
https://api.github.com/repos/psf/requests/issues/3215
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3215/labels{/name}
https://api.github.com/repos/psf/requests/issues/3215/comments
https://api.github.com/repos/psf/requests/issues/3215/events
https://github.com/psf/requests/issues/3215
156,284,305
MDU6SXNzdWUxNTYyODQzMDU=
3,215
requests and pylab SSL conflict
{ "avatar_url": "https://avatars.githubusercontent.com/u/649929?v=4", "events_url": "https://api.github.com/users/CrimsonGlory/events{/privacy}", "followers_url": "https://api.github.com/users/CrimsonGlory/followers", "following_url": "https://api.github.com/users/CrimsonGlory/following{/other_user}", "gists_url": "https://api.github.com/users/CrimsonGlory/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/CrimsonGlory", "id": 649929, "login": "CrimsonGlory", "node_id": "MDQ6VXNlcjY0OTkyOQ==", "organizations_url": "https://api.github.com/users/CrimsonGlory/orgs", "received_events_url": "https://api.github.com/users/CrimsonGlory/received_events", "repos_url": "https://api.github.com/users/CrimsonGlory/repos", "site_admin": false, "starred_url": "https://api.github.com/users/CrimsonGlory/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CrimsonGlory/subscriptions", "type": "User", "url": "https://api.github.com/users/CrimsonGlory", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-05-23T13:45:11Z
2021-09-08T18:00:41Z
2016-05-24T22:34:45Z
NONE
resolved
Installing pylab after requests makes requests fail SSL certificate verification. Wasn't sure if I should report this here or in pylab repo. This was tested with Docker. $ ls `Dockerfile test.py ` $ cat Dockerfile ``` FROM python:2.7 ADD test.py . RUN apt-get install openssl libssl-dev RUN pip install requests CMD ["python","test.py"] ``` $ cat test.py ``` import requests print requests.get("https://google.com") ``` $ sudo su # docker build --tag bugtest1 . && docker run --rm bugtest1 ``` Sending build context to Docker daemon 3.072 kB Step 1 : FROM python:2.7 ---> 8f7cafb9cc63 Step 2 : ADD test.py . ---> Using cache ---> c5e7e76e36c6 Step 3 : RUN apt-get install openssl libssl-dev ---> Using cache ---> cecc8e87d049 Step 4 : RUN pip install requests ---> Using cache ---> 6f309b214a07 Step 5 : CMD python test.py ---> Using cache ---> 161f61e62da3 Successfully built 161f61e62da3 <Response [200]> ``` # vim Dockerfile # add pylab line # cat Dockerfile ``` FROM python:2.7 ADD test.py . RUN apt-get install openssl libssl-dev RUN pip install requests RUN pip install pylab CMD ["python","test.py"] ``` # docker build --tag bugtest1 . && docker run --rm bugtest1 ``` Sending build context to Docker daemon 3.072 kB Step 1 : FROM python:2.7 ---> 8f7cafb9cc63 Step 2 : ADD test.py . 
---> Using cache ---> c5e7e76e36c6 Step 3 : RUN apt-get install openssl libssl-dev ---> Using cache ---> cecc8e87d049 Step 4 : RUN pip install requests ---> Using cache ---> 6f309b214a07 Step 5 : RUN pip install pylab ---> Running in dc7af4eba40b Collecting pylab Downloading pylab-0.1.3-py2.py3-none-any.whl Collecting networkx (from pylab) Downloading networkx-1.11-py2.py3-none-any.whl (1.3MB) Collecting pytz (from pylab) Downloading pytz-2016.4-py2.py3-none-any.whl (480kB) Collecting tornado (from pylab) Downloading tornado-4.3.tar.gz (450kB) Collecting pandas (from pylab) Downloading pandas-0.18.1.tar.gz (7.3MB) Collecting seaborn (from pylab) Downloading seaborn-0.7.0.tar.gz (154kB) Collecting numpy (from pylab) Downloading numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl (15.3MB) Collecting matplotlib (from pylab) Downloading matplotlib-1.5.1.tar.gz (54.0MB) Collecting jsonschema (from pylab) Downloading jsonschema-2.5.1-py2.py3-none-any.whl Collecting scipy (from pylab) Downloading scipy-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB) Collecting jinja2 (from pylab) Downloading Jinja2-2.8-py2.py3-none-any.whl (263kB) Collecting sympy (from pylab) Downloading sympy-1.0.tar.gz (4.3MB) Collecting ipython (from pylab) Downloading ipython-4.2.0-py2-none-any.whl (736kB) Collecting scikit-image (from pylab) Downloading scikit_image-0.12.3-cp27-cp27mu-manylinux1_x86_64.whl (28.0MB) Collecting scikit-learn (from pylab) Downloading scikit_learn-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (17.6MB) Collecting pyzmq (from pylab) Downloading pyzmq-15.2.0.zip (1.5MB) Collecting decorator>=3.4.0 (from networkx->pylab) Downloading decorator-4.0.9-py2.py3-none-any.whl Collecting backports.ssl-match-hostname (from tornado->pylab) Downloading backports.ssl_match_hostname-3.5.0.1.tar.gz Collecting singledispatch (from tornado->pylab) Downloading singledispatch-3.4.0.3-py2.py3-none-any.whl Collecting certifi (from tornado->pylab) Downloading certifi-2016.2.28-py2.py3-none-any.whl 
(366kB) Collecting backports-abc>=0.4 (from tornado->pylab) Downloading backports_abc-0.4-py2.py3-none-any.whl Collecting python-dateutil (from pandas->pylab) Downloading python_dateutil-2.5.3-py2.py3-none-any.whl (201kB) Collecting cycler (from matplotlib->pylab) Downloading cycler-0.10.0-py2.py3-none-any.whl Collecting pyparsing!=2.0.0,!=2.0.4,>=1.5.6 (from matplotlib->pylab) Downloading pyparsing-2.1.4-py2.py3-none-any.whl (40kB) Collecting functools32 (from jsonschema->pylab) Downloading functools32-3.2.3-2.zip Collecting MarkupSafe (from jinja2->pylab) Downloading MarkupSafe-0.23.tar.gz Collecting mpmath>=0.19 (from sympy->pylab) Downloading mpmath-0.19.tar.gz (498kB) Collecting traitlets (from ipython->pylab) Downloading traitlets-4.2.1-py2.py3-none-any.whl (67kB) Collecting pickleshare (from ipython->pylab) Downloading pickleshare-0.7.2-py2.py3-none-any.whl Collecting simplegeneric>0.8 (from ipython->pylab) Downloading simplegeneric-0.8.1.zip Collecting backports.shutil-get-terminal-size (from ipython->pylab) Downloading backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl Requirement already satisfied (use --upgrade to upgrade): setuptools>=18.5 in /usr/local/lib/python2.7/site-packages (from ipython->pylab) Collecting pexpect (from ipython->pylab) Downloading pexpect-4.0.1.tar.gz (143kB) Collecting dask[array]>=0.5.0 (from scikit-image->pylab) Downloading dask-0.9.0-py2.py3-none-any.whl (203kB) Collecting six>=1.7.3 (from scikit-image->pylab) Downloading six-1.10.0-py2.py3-none-any.whl Collecting pillow>=2.1.0 (from scikit-image->pylab) Downloading Pillow-3.2.0.zip (10.5MB) Collecting ipython-genutils (from traitlets->ipython->pylab) Downloading ipython_genutils-0.1.0-py2.py3-none-any.whl Collecting pathlib2 (from pickleshare->ipython->pylab) Downloading pathlib2-2.1.0-py2.py3-none-any.whl Collecting ptyprocess>=0.5 (from pexpect->ipython->pylab) Downloading ptyprocess-0.5.1-py2.py3-none-any.whl Collecting toolz>=0.7.2 (from 
dask[array]>=0.5.0->scikit-image->pylab) Downloading toolz-0.7.4.tar.gz Building wheels for collected packages: tornado, pandas, seaborn, matplotlib, sympy, pyzmq, backports.ssl-match-hostname, functools32, MarkupSafe, mpmath, simplegeneric, pexpect, pillow, toolz Running setup.py bdist_wheel for tornado: started Running setup.py bdist_wheel for tornado: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/df/20/c7/38911d3d7ac9ae3c6c1b73f01bc61d8fdb46c7fad1a720d394 Running setup.py bdist_wheel for pandas: started Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: still running... Running setup.py bdist_wheel for pandas: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/9a/8c/95/ceb8f988caf19dd90c4c587eea0ee1665c3bb6af73b3ca8264 Running setup.py bdist_wheel for seaborn: started Running setup.py bdist_wheel for seaborn: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/e0/94/01/731d82dc437e1c8b65130956028f3fe693943c3e499a676dc9 Running setup.py bdist_wheel for matplotlib: started Running setup.py bdist_wheel for matplotlib: still running... Running setup.py bdist_wheel for matplotlib: still running... Running setup.py bdist_wheel for matplotlib: still running... 
Running setup.py bdist_wheel for matplotlib: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/d8/38/3c/a388e11fd09f9b23f5e4cd74594197394d9fd65f91f64c4aa7 Running setup.py bdist_wheel for sympy: started Running setup.py bdist_wheel for sympy: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/05/93/22/2d0f59d842347b1f38df0d3f7a3870586df60568d2a49d94c5 Running setup.py bdist_wheel for pyzmq: started Running setup.py bdist_wheel for pyzmq: still running... Running setup.py bdist_wheel for pyzmq: still running... Running setup.py bdist_wheel for pyzmq: still running... Running setup.py bdist_wheel for pyzmq: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/a1/2f/b7/8d30910d93816f084b3d29ec38e38ae7cc9f8431a0b44a5864 Running setup.py bdist_wheel for backports.ssl-match-hostname: started Running setup.py bdist_wheel for backports.ssl-match-hostname: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/5d/72/36/b2a31507b613967b728edc33378a5ff2ada0f62855b93c5ae1 Running setup.py bdist_wheel for functools32: started Running setup.py bdist_wheel for functools32: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/3c/d0/09/cd78d0ff4d6cfecfbd730782a7815a4571cd2cd4d2ed6e69d9 Running setup.py bdist_wheel for MarkupSafe: started Running setup.py bdist_wheel for MarkupSafe: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/a3/fa/dc/0198eed9ad95489b8a4f45d14dd5d2aee3f8984e46862c5748 Running setup.py bdist_wheel for mpmath: started Running setup.py bdist_wheel for mpmath: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/02/2b/99/cd867d5da48d951118a8020e86c0c12a65022702426582d4b8 Running setup.py bdist_wheel for simplegeneric: started Running setup.py bdist_wheel for simplegeneric: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/7b/31/08/c85e74c84188cbec6a6827beec4d640f2bd78ae003dc1ec09d Running setup.py 
bdist_wheel for pexpect: started Running setup.py bdist_wheel for pexpect: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/f2/65/89/09578bcd0efeabc7e2b0079cd62d3955c1477f2e55aa5031a4 Running setup.py bdist_wheel for pillow: started Running setup.py bdist_wheel for pillow: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/88/2d/ce/3ff4ae4e2b8600d1bde1cbde5dfcc6d8770222c38348fe9139 Running setup.py bdist_wheel for toolz: started Running setup.py bdist_wheel for toolz: finished with status 'done' Stored in directory: /root/.cache/pip/wheels/3e/e9/72/b9e24c6b4c0347670b9a20afeba6b2534655f5dc714b30cb4e Successfully built tornado pandas seaborn matplotlib sympy pyzmq backports.ssl-match-hostname functools32 MarkupSafe mpmath simplegeneric pexpect pillow toolz Installing collected packages: decorator, networkx, pytz, backports.ssl-match-hostname, six, singledispatch, certifi, backports-abc, tornado, python-dateutil, numpy, pandas, scipy, cycler, pyparsing, matplotlib, seaborn, functools32, jsonschema, MarkupSafe, jinja2, mpmath, sympy, ipython-genutils, traitlets, pathlib2, pickleshare, simplegeneric, backports.shutil-get-terminal-size, ptyprocess, pexpect, ipython, toolz, dask, pillow, scikit-image, scikit-learn, pyzmq, pylab Successfully installed MarkupSafe-0.23 backports-abc-0.4 backports.shutil-get-terminal-size-1.0.0 backports.ssl-match-hostname-3.5.0.1 certifi-2016.2.28 cycler-0.10.0 dask-0.9.0 decorator-4.0.9 functools32-3.2.3.post2 ipython-4.2.0 ipython-genutils-0.1.0 jinja2-2.8 jsonschema-2.5.1 matplotlib-1.5.1 mpmath-0.19 networkx-1.11 numpy-1.11.0 pandas-0.18.1 pathlib2-2.1.0 pexpect-4.0.1 pickleshare-0.7.2 pillow-3.2.0 ptyprocess-0.5.1 pylab-0.1.3 pyparsing-2.1.4 python-dateutil-2.5.3 pytz-2016.4 pyzmq-15.2.0 scikit-image-0.12.3 scikit-learn-0.17.1 scipy-0.17.1 seaborn-0.7.0 simplegeneric-0.8.1 singledispatch-3.4.0.3 six-1.10.0 sympy-1.0 toolz-0.7.4 tornado-4.3 traitlets-4.2.1 You are using pip version 8.1.1, 
however version 8.1.2 is available. You should consider upgrading via the 'pip install --upgrade pip' command. ---> 3b53dc9ef60f Removing intermediate container dc7af4eba40b Step 6 : CMD python test.py ---> Running in 243dd6bfbc15 ---> c977d7831c40 Removing intermediate container 243dd6bfbc15 Successfully built c977d7831c40 Traceback (most recent call last): File "test.py", line 2, in <module> print requests.get("https://google.com") File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 71, in get return request('get', url, params=params, **kwargs) File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 57, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 585, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 477, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590) ``` # docker exec -ti containername bash root(inside docker container):/# python --version ``` Python 2.7.11 ``` root(inside docker container):/# pip show requests ``` --- Metadata-Version: 2.0 Name: requests Version: 2.10.0 Summary: Python HTTP for Humans. 
Home-page: http://python-requests.org Author: Kenneth Reitz Author-email: [email protected] Installer: pip License: Apache 2.0 Location: /usr/local/lib/python2.7/site-packages Requires: Classifiers: Development Status :: 5 - Production/Stable Intended Audience :: Developers Natural Language :: English License :: OSI Approved :: Apache Software License Programming Language :: Python Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Programming Language :: Python :: 3.3 Programming Language :: Python :: 3.4 Programming Language :: Python :: 3.5 Programming Language :: Python :: Implementation :: CPython Programming Language :: Python :: Implementation :: PyPy You are using pip version 8.1.1, however version 8.1.2 is available. You should consider upgrading via the 'pip install --upgrade pip' command. ``` root(inside docker container):/# pip show pylab ``` --- Metadata-Version: 2.0 Name: pylab Version: 0.1.3 Summary: Data science meta-package Home-page: https://github.com/javipalanca/pylab Author: Javi Palanca Author-email: [email protected] Installer: pip License: BSD Location: /usr/local/lib/python2.7/site-packages Requires: networkx, pytz, tornado, pandas, seaborn, numpy, matplotlib, jsonschema, scipy, jinja2, sympy, ipython, scikit-image, scikit-learn, pyzmq Classifiers: Development Status :: 2 - Pre-Alpha Intended Audience :: Developers License :: OSI Approved :: BSD License Natural Language :: English Programming Language :: Python :: 2 Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Programming Language :: Python :: 3.3 Programming Language :: Python :: 3.4 You are using pip version 8.1.1, however version 8.1.2 is available. You should consider upgrading via the 'pip install --upgrade pip' command. ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/649929?v=4", "events_url": "https://api.github.com/users/CrimsonGlory/events{/privacy}", "followers_url": "https://api.github.com/users/CrimsonGlory/followers", "following_url": "https://api.github.com/users/CrimsonGlory/following{/other_user}", "gists_url": "https://api.github.com/users/CrimsonGlory/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/CrimsonGlory", "id": 649929, "login": "CrimsonGlory", "node_id": "MDQ6VXNlcjY0OTkyOQ==", "organizations_url": "https://api.github.com/users/CrimsonGlory/orgs", "received_events_url": "https://api.github.com/users/CrimsonGlory/received_events", "repos_url": "https://api.github.com/users/CrimsonGlory/repos", "site_admin": false, "starred_url": "https://api.github.com/users/CrimsonGlory/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CrimsonGlory/subscriptions", "type": "User", "url": "https://api.github.com/users/CrimsonGlory", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3215/reactions" }
https://api.github.com/repos/psf/requests/issues/3215/timeline
null
completed
null
null
false
[ "Can you please show us what version of openssl is linked against Python in your container? `python -c \"import ssl; print ssl.OPENSSL_VERSION\"` should do the trick.\n", "both setups (installing pylab+requests or only requests), print the same openssl version:\nOpenSSL 1.0.1k 8 Jan 2015\n", "Yup, so the problem here is that pylab brings in certifi. Certifi has removed the deprecated and weak 1024-bit trust roots that requests normally ships with, but requests will use it if available. That's fine except that OpenSSL earlier than 1.0.2 will fail to build the cert chain if the site uses a cross-signed root cert. \n\nTry setting the environment variable `REQUESTS_CA_BUNDLE` to the string returned by `python -c \"import certifi; print certifi.old_where\"` and see if that helps. \n", "Thanks! That fixed the issue.\n\n```\nENV REQUESTS_CA_BUNDLE \"/usr/local/lib/python2.7/site-packages/certifi/weak.pem\"\n```\n" ]
https://api.github.com/repos/psf/requests/issues/3214
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3214/labels{/name}
https://api.github.com/repos/psf/requests/issues/3214/comments
https://api.github.com/repos/psf/requests/issues/3214/events
https://github.com/psf/requests/issues/3214
156,135,767
MDU6SXNzdWUxNTYxMzU3Njc=
3,214
Using specific SSL version TLSv1.2
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-05-22T05:25:24Z
2021-09-04T00:06:15Z
2016-05-22T11:57:58Z
CONTRIBUTOR
resolved
I ran into a problem with a TLSv1.2 site using the Travis Ubuntu precise containers, which is documented at https://github.com/travis-ci/travis-ci/issues/4757 I was able to work around the problem using: ``` python import ssl, urllib3 pm = urllib3.PoolManager(ssl_version=ssl.PROTOCOL_TLSv1_2) r = pm.request('GET', 'http://proofwiki.org/wiki/Main_Page') ``` I'd like to work around it using `requests` if possible, and it seems like it should be possible according to http://docs.python-requests.org/en/master/user/advanced/#example-specific-ssl-version However when I try to follow that recipe (below) on Python 2.7.9+ / 3.4+, `requests.exceptions.SSLError: ("bad handshake: SysCallError(104, 'ECONNRESET')",)` is raised. ``` python import ssl import sys import requests from requests.adapters import HTTPAdapter from requests.packages.urllib3.poolmanager import PoolManager class Tls12HttpAdapter(HTTPAdapter): """Transport adapter that forces use of TLSv1.2.""" def init_poolmanager(self, connections, maxsize, block=False): """Create and initialize the urllib3 PoolManager.""" self.poolmanager = PoolManager( num_pools=connections, maxsize=maxsize, block=block, ssl_version=ssl.PROTOCOL_TLSv1_2) url = sys.argv[1] s = requests.Session() s.mount(url, Tls12HttpAdapter()) r = s.get(url) print(r.status_code) ``` https://github.com/jayvdb/my-ci-test/blob/tls12/requests_explicit.py Full backtrace at https://travis-ci.org/jayvdb/my-ci-test/jobs/132022039
{ "avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4", "events_url": "https://api.github.com/users/jayvdb/events{/privacy}", "followers_url": "https://api.github.com/users/jayvdb/followers", "following_url": "https://api.github.com/users/jayvdb/following{/other_user}", "gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayvdb", "id": 15092, "login": "jayvdb", "node_id": "MDQ6VXNlcjE1MDky", "organizations_url": "https://api.github.com/users/jayvdb/orgs", "received_events_url": "https://api.github.com/users/jayvdb/received_events", "repos_url": "https://api.github.com/users/jayvdb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions", "type": "User", "url": "https://api.github.com/users/jayvdb", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3214/reactions" }
https://api.github.com/repos/psf/requests/issues/3214/timeline
null
completed
null
null
false
[ "So, adopters are selected based on a prefix match against a lowercased normalised form of the URL. That means it's rarely a good idea to add an adapter that is mounted against the _whole_ URL: scheme and host is better. \n\nTry mounting the adapter using a more restrictive URL. \n", "Well it works if I mount the eventual target scheme and hostname (i.e. https://proofwiki.org) , but if fails if I mount the intermediary scheme and hostname (i.e. https://www.proofwiki.org) and as already mentioned it fails if I mount the http scheme and hostname that my program receives as input, which is http://proofwiki.org .\n", "So we use a different adapter for each request, so if you use the http URL to mount the adapter it definitely won't work: you'll need to use the URL for the _actual_ host that requires TLSv1.2. If you want you can mount unconditionally for `https://`, assuming all your hosts support TLSv1.2\n", "Ok, thanks, mounting unconditionally for https:// worked for my scenario.\n", "I'm glad we found something that works for you! Thanks for letting us know!\n", "Hello,\r\nI tried mounting my adapter separately but I get the same error. 
\r\n\r\n```\r\nclass Myadapter(HTTPAdapter):\r\n def init_poolmanager(self, connections, maxsize, block=False):\r\n \"\"\"Create and initialize the urllib3 PoolManager.\"\"\"\r\n self.poolmanager = PoolManager(\r\n num_pools=connections, maxsize=maxsize,\r\n block=block, ssl_version=ssl.PROTOCOL_TLSv1_2)\r\n\r\n\r\ndef requests_session(session=None):\r\n try:\r\n adapter = Myadapter()\r\n session.mount('http://', adapter)\r\n session.mount('https://', adapter)\r\n return session\r\n except Exception:\r\n ('requests_session - Error Api calls')\r\n\r\ndata = {\r\n 'username' : \r\n 'password' :\r\n}\r\nheaders = {\r\n 'Content-Type' : 'application/json'\r\n}\r\nsession_id = requests.Session()\r\nrequests_r= requests_session(session=session_id)\r\nresp= requests_r.post(url, data=data, headers=headers).json()\r\n```\r\nThe Error that I get is:\r\n```\r\nMax retries exceeded with url: some_random_api (Caused by SSLError(SSLError(\"bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)\",),))\r\n```\r\n\r\n\r\n", "You should add `session.verify = \"/path/to/ca_cert.pem\"` @ankitpatnaik :\r\n\r\n```python\r\nsession_id = requests.Session()\r\nsession_id.verify = \"/path/to/ca_cert.pem\"\r\nrequests_r = requests_session(session=session_id)\r\n```\r\n\r\nor add `verify=False` in your requests, but I don't suggest that in production:\r\n\r\n```python\r\nresp = requests_r.post(url, data=data, headers=headers, verify=False).json()\r\n```" ]
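The prefix-matching behaviour discussed in the comments above — mount the transport adapter against the scheme (and optionally host) prefix rather than the whole URL, so it is still selected after redirects — can be sketched as follows. This is a minimal illustration assuming a modern requests/urllib3 install; `Tls12HttpAdapter` is the adapter from the issue body, and the target host is only an example.

``` python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager


class Tls12HttpAdapter(HTTPAdapter):
    """Transport adapter that forces use of TLSv1.2 (as in the issue above)."""

    def init_poolmanager(self, connections, maxsize, block=False):
        # Build the urllib3 PoolManager with a pinned TLS version.
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize, block=block,
            ssl_version=ssl.PROTOCOL_TLSv1_2)


session = requests.Session()
# Mount against the 'https://' prefix, not the full request URL: adapters are
# chosen by longest-prefix match on the lowercased URL, so mounting the scheme
# covers every HTTPS host, including redirect targets.
session.mount('https://', Tls12HttpAdapter())
```

`session.get_adapter(url)` can be used to confirm which adapter a given URL resolves to before sending any traffic.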
https://api.github.com/repos/psf/requests/issues/3213
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3213/labels{/name}
https://api.github.com/repos/psf/requests/issues/3213/comments
https://api.github.com/repos/psf/requests/issues/3213/events
https://github.com/psf/requests/issues/3213
156,099,503
MDU6SXNzdWUxNTYwOTk1MDM=
3,213
Importing requests is fairly slow
{ "avatar_url": "https://avatars.githubusercontent.com/u/25111?v=4", "events_url": "https://api.github.com/users/cournape/events{/privacy}", "followers_url": "https://api.github.com/users/cournape/followers", "following_url": "https://api.github.com/users/cournape/following{/other_user}", "gists_url": "https://api.github.com/users/cournape/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cournape", "id": 25111, "login": "cournape", "node_id": "MDQ6VXNlcjI1MTEx", "organizations_url": "https://api.github.com/users/cournape/orgs", "received_events_url": "https://api.github.com/users/cournape/received_events", "repos_url": "https://api.github.com/users/cournape/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cournape/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cournape/subscriptions", "type": "User", "url": "https://api.github.com/users/cournape", "user_view_type": "public" }
[]
closed
true
null
[]
null
44
2016-05-21T12:23:08Z
2021-09-08T07:00:40Z
2017-07-30T13:57:49Z
NONE
resolved
We are using requests in our projects, and it is working great. Unfortunately, for our CLI tools, using requests is an issue because it is slow to import. E.g. on my 2014 macbook: ``` python -c "import requests" ``` takes close to 90 ms. Is optimizing import time worthy of consideration for the project?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3213/reactions" }
https://api.github.com/repos/psf/requests/issues/3213/timeline
null
completed
null
null
false
[ "This is almost certainly the result of CFFI. Can you print what libraries you have installed in your environment (`pip freeze`)?\n", "This is not linked to cffi (I don't have it installed). I did the test w/ a quasi empty virtualenv:\n\n```\n$ pip list\npip (8.1.2)\nrequests (2.10.0)\nsetuptools (21.1.0)\nwheel (0.24.0)\n$ time python -c \"import requests\"\n\nreal 0m0.090s\nuser 0m0.061s\nsys 0m0.026s\n```\n", "I have a 2012 Macbook pro:, using a slightly different way of measuring the import time:\n\n```\n$ pip list\npip (8.0.2)\nrequests (2.10.0)\nsetuptools (19.6.2)\nwheel (0.26.0)\n$ python -m timeit -n 10000000 'import requests'\n10000000 loops, best of 3: 0.571 usec per loop\n```\n\nI also upgraded pip and setuptools to match your environment:\n\n```\n$ pip list\npip (8.1.2)\nrequests (2.10.0)\nsetuptools (21.1.0)\nwheel (0.26.0)\n$ python -m timeit -n 10000000 'import requests'\n10000000 loops, best of 3: 0.59 usec per loop\n```\n\nFurther\n\n```\n$ time python -c 'import requests'\npython -c 'import requests' 0.07s user 0.05s system 61% cpu 0.200 total\n```\n\nWhich is still not ideal, but is probably higher than the simple import of requests due to Python's initialization.\n", "That said, if you have information about specific things in our `__init__.py` (and elsewhere) that are causing slowness, I think we can fix them so long as they don't break backwards compatibility or our functional API.\n", "`timeit` is not informative for imports, you have to create a new process every time because of python import caching.\n\nThe main cost is importing urllib3, so I suspect that should be addressed there.\n", "@sigmavirus24 as @cournape mentions you can't time it like that; try using `time` on the command line. 
On my machine (ran each ~6-10 times to try to get a reliable average) for user+sys times (including sys because if there's any strange calls made into the kernel as a result of the import, that should be counted):\n- `time python -c \"\"` (CPy 2.7.11) = 130 ms (time for Python to start up)\n- `time python -c \"import requests\"` (2.10.0) = 240 ms (above + importing requests)\n- `time python -c \"import urllib3\"` (1.15.1) = 210 ms (installed separately)\n\nSo `urllib3` takes about 80 ms, then about 30 ms more for requests stuff.\n", "Neither of these numbers seem that high to me. \n", "Yeah. Especially as your average request is going to take a few hundred milliseconds or so. If you _reeeeally_ want to shave off milliseconds, could always back-translate to use the stdlib. 😜 \n\nOr you could shorten the name of your command line tool by 1-2 characters.\n", "Ah, I didn't realize you were working on a CLI tool. Well, it hasn't seemed to be an issue for httpie.\n", "It is true that as soon as you do even one request on a remote server, the delay will not matter much. But the problem in python is that you pay the import cost whether you actually do a request or not: think doing `tool --version`, `tool --help`, etc... Imagine a tool like git or hg that only occasionally do http IO.\n\n80 ms is pretty bad in this case: the rule of thumb for a CLI tool to have noticeable delay is ~100-150 ms. Try using e.g. git with a 200 ms delay, it is a bit annoying.\n\nNow, I completely understand that it may not be a focus of the library, especially if it is not that easy to improve.\n", "I personally think you're using the wrong tool for the job — given that, on the metrics given above, Python itself is taking 120ms alone. Httpie and mercuial are well-loved tools that do their jobs very well, but they are not, nor ever will they be, curl or git. \n", "Makes sense; you could code up some lazy loading bits. 
Found this though, which claims to have been excerpted from `hg`: https://github.com/bwesterb/py-demandimport (haven't ever used it, no guarantees). \n\nApparent origin: https://selenic.com/hg/file/tip/mercurial/demandimport.py\n", "@kennethreitz python does not take 120 ms to start, unless you are on a seriously broken environment. It is much closer to 20 ms on decently modern hw (< 5 years), i.e. importing requests means 3x the cost of starting python.\n\nFWIW, on my 2011 Desktop PC (Debian):\n\n```\n$ time python -c \"\"\nreal 0m0.020s\nuser 0m0.016s\nsys 0m0.000s\n$ time python -c \"import requests\"\nreal 0m0.107s\nuser 0m0.088s\nsys 0m0.012s\n```\n\nOn my 2014 macbook (OS X)\n\n```\n$ time python -c \"\"\n\nreal 0m0.022s\nuser 0m0.010s\nsys 0m0.009s\n$ time python -c \"import requests\"\nreal 0m0.096s\nuser 0m0.060s\nsys 0m0.032s\n```\n\nA simple `hg` (for help) on my macbook takes ~ 100 ms\n", "Oh, oops, yeah...my environment is a bit wonky because of the `pyenv` shim (adds about 100 ms because it's a bash script that leads to python with `exec`). Bypassing that, the incremental times are still the same and Python itself loads in 10 ms or so.\n", "haha, I thought that number looked high :)\n", "The \"profimp\" module shows where the time goes:\r\n\r\nhttps://pypi.python.org/pypi/profimp\r\n\r\npkg_resources takes up a lot of time in general.\r\n\r\nAnother time suck is the automatic loading of `.packages.urllib3.contrib.pyopenssl`. It would be great if we could opt-out, as there is core SNI support in the appropriate versions of Python.\r\n\r\nThere doesn't appear to be any way of doing this today, if PyOpenSSL is importable. Would a PR allowing an opt-out be likely to be merged?", "@dsully How would you propose to expose the API for an opt-out?", "@Lukasa Unless I'm mistaken, the `.contrib.pyopenssl` ssl wrapper & context is not needed on Python 2.7.9+ and 3.4+. 
Given that, `requests` is always using pyopenssl when it is installed, even when the core Python libs support SNI, etc. So, changing the import to be:\r\n\r\n```\r\n# Attempt to enable urllib3's SNI support, if possible & needed.\r\ntry:\r\n from ssl import SSLContext # Python 2.7.9+, 3.4+\r\nexcept ImportError:\r\n try:\r\n from .packages.urllib3.contrib import pyopenssl\r\n pyopenssl.inject_into_urllib3()\r\n except ImportError:\r\n pass\r\n```\r\n\r\nLess of an opt-out, more of a not-needed.\r\n\r\nNow, if there are cases where the pyopenssl SSLContext wrapper is desired for some reason (?) even when the core libraries are sufficient, then I'll make another pass. What do you think?", "@dsully you said:\r\n\r\n> pkg_resources takes up a lot of time in general.\r\n\r\nWhere is Requests using pkg_resources? I don't believe either Requests or Urllib3 uses it.\r\n\r\n> Now, if there are cases where the pyopenssl SSLContext wrapper is desired for some reason (?) even when the core libraries are sufficient\r\n\r\nIf I remember correctly, only the more recent versions of 2.7 are actually appropriate. I think there were some minor issues with 2.7.9 and 2.7.10.", "> Unless I'm mistaken, the .contrib.pyopenssl ssl wrapper & context is not needed on Python 2.7.9+ and 3.4+.\r\n\r\nUnfortunately, that's not true. The best example is on the system Python on macOS, which provides an SSLContext but which ships OpenSSL 0.9.8zh. This is an almost impossible to use version which is missing a number of vital features, including TLSv1.2 support. This is resolved by PyOpenSSL, which uses the OpenSSL 1.0.2 statically linked inside the cryptography module. 
This is *vital* to secure systems.\r\n\r\n*More generally*, using PyOpenSSL allows for users to link in a version of OpenSSL that is not constrained by the OpenSSL that their base Python is linked against.", "@sigmavirus24 Indirectly via `cryptography`:\r\n\r\n```\r\nsrc/cryptography/hazmat/backends/__init__.py\r\n7:import pkg_resources\r\n27: for ep in pkg_resources.iter_entry_points(\r\n```\r\n\r\nImported via:\r\n\r\n```\r\npackages/urllib3/contrib/pyopenssl.py\r\n49:from cryptography.hazmat.backends.openssl import backend as openssl_backend\r\n50:from cryptography.hazmat.backends.openssl.x509 import _Certificate\r\n```", "@Lukasa Right.. forgot about that. I don't have that particular issue on macOS, but most people do.\r\n\r\nI'll look at coming up with a way to explicitly opt-out then.", "It should be noted that the *easiest* way to opt-out is to simply not have PyOpenSSL in your environment. If it's present and you need it, you're presumably paying that import cost somewhere anyway.", "Also worth noting that `pkg_resources` takes longer to iterate over entry-points on systems with very large numbers of packages installed. It's also sadly the best solution to finding plugins defined by packages.", "@Lukasa - yes and no. Our build environment has real dependency management for modules (https://engineering.linkedin.com/blog/2016/08/introducing--py-gradle--an-open-source-python-plugin-for-gradle), which means just because someone included PyOpenSSL in their dependency tree, doesn't mean that the code you are importing for your upstream uses it. 
Lots of code gets installed transitively, but not imported.", "@reaperhulk BTW, is slow import of cryptography still an ongoing issue?", "@Lukasa - without moving the location of the pyopenssl loader, currently in `requests/__init__.py`, would an environment variable be acceptable?", "I'm *tentatively* open to that, yes.", "> Lots of code gets installed transitively, but not imported.\r\n\r\nThat sounds like the real root cause of this. Well that and the possibility that cryptography is still slow to import. Which is an issue for pyca/cryptography.\r\n\r\n> would an environment variable be acceptable?\r\n\r\nI don't like that. Let's say someone is using HTTPie and notice it's slow. They're also using requests in a development project. If they export the environment variable to speed it up without understanding the ramifications *and* they're on an LTS distribution with a *terrible* `ssl` module and *terrible* OpenSSL, then they'll start seeing SSLErrors in their project if they were relying on any of the functionality provided by PyOpenSSL. But those SSLErrors weren't there yesterday and they've already forgotten that they exported that variable. Now we have yet one more avenue of complexity to help them debug. I abhor that.", "@sigmavirus24 I hear you there. If environment changes aren't ok, an explicit call? I'd have to move the current injection, since it happens in `__init__.py`" ]
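The measurement point made in the comments above — `timeit` re-imports from a warm `sys.modules` cache, so import cost has to be measured with a fresh interpreter per run — can be sketched like this. The helper name is made up for illustration, and `json` stands in for any package whose import cost you want to isolate.

``` python
import subprocess
import sys
import time


def import_time(module, runs=3):
    """Best-of-N wall time for `python -c "import <module>"` in a fresh
    interpreter each run, since a single process would cache the import."""
    best = float('inf')
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.check_call([sys.executable, '-c', 'import ' + module])
        best = min(best, time.perf_counter() - start)
    return best


baseline = import_time('sys')   # roughly interpreter start-up cost alone
total = import_time('json')     # start-up plus importing the module
print('extra import cost: ~%.0f ms' % (max(total - baseline, 0) * 1000))
```

Subtracting the empty-interpreter baseline mirrors the `time python -c ""` vs `time python -c "import requests"` comparison used in the thread.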
https://api.github.com/repos/psf/requests/issues/3212
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3212/labels{/name}
https://api.github.com/repos/psf/requests/issues/3212/comments
https://api.github.com/repos/psf/requests/issues/3212/events
https://github.com/psf/requests/issues/3212
156,011,291
MDU6SXNzdWUxNTYwMTEyOTE=
3,212
SSL Error: bad handshake
{ "avatar_url": "https://avatars.githubusercontent.com/u/1202193?v=4", "events_url": "https://api.github.com/users/pensnarik/events{/privacy}", "followers_url": "https://api.github.com/users/pensnarik/followers", "following_url": "https://api.github.com/users/pensnarik/following{/other_user}", "gists_url": "https://api.github.com/users/pensnarik/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pensnarik", "id": 1202193, "login": "pensnarik", "node_id": "MDQ6VXNlcjEyMDIxOTM=", "organizations_url": "https://api.github.com/users/pensnarik/orgs", "received_events_url": "https://api.github.com/users/pensnarik/received_events", "repos_url": "https://api.github.com/users/pensnarik/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pensnarik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pensnarik/subscriptions", "type": "User", "url": "https://api.github.com/users/pensnarik", "user_view_type": "public" }
[ { "color": "fef2c0", "default": false, "description": null, "id": 298537994, "name": "Needs More Information", "node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information" } ]
closed
true
null
[]
null
33
2016-05-20T17:20:55Z
2021-09-08T03:00:36Z
2016-09-06T00:04:02Z
NONE
resolved
I could not use your lib in CentOS 7 with Python 2.7.5. I got this error: ``` File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'SSL3_GET_SERVER_CERTIFICATE', 'certificate verify failed')],)",) ``` Updating Python or any SSL libs didn't help. I get this error on CentOS and Ubuntu; on Arch Linux everything works well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3212/reactions" }
https://api.github.com/repos/psf/requests/issues/3212/timeline
null
completed
null
null
false
[ "Could you please tell us how you've installed requests and what version you've installed?\n", "This is almost certainly an SNI problem, so depending on how the library was installed we may need to add some optional dependencies. \n", "requests was installed as dependency for smartsheet-python-sdk via pip:\n\n```\npip show requests\n---\nMetadata-Version: 2.0\nName: requests\nVersion: 2.10.0\nSummary: Python HTTP for Humans.\nHome-page: http://python-requests.org\nAuthor: Kenneth Reitz\nAuthor-email: [email protected]\nInstaller: pip\nLicense: Apache 2.0\nLocation: /usr/lib/python3.4/site-packages\nRequires: \nClassifiers:\n Development Status :: 5 - Production/Stable\n Intended Audience :: Developers\n Natural Language :: English\n License :: OSI Approved :: Apache Software License\n Programming Language :: Python\n Programming Language :: Python :: 2.6\n Programming Language :: Python :: 2.7\n Programming Language :: Python :: 3\n Programming Language :: Python :: 3.3\n Programming Language :: Python :: 3.4\n Programming Language :: Python :: 3.5\n Programming Language :: Python :: Implementation :: CPython\n Programming Language :: Python :: Implementation :: PyPy\n\npip show smartsheet-python-sdk\n---\nMetadata-Version: 2.0\nName: smartsheet-python-sdk\nVersion: 1.0.1\nSummary: Library that uses Python to connect to Smartsheet services (using API 2.0).\nHome-page: http://smartsheet-platform.github.io/api-docs/\nAuthor: Smartsheet\nAuthor-email: [email protected]\nInstaller: pip\nLicense: Apache-2.0\nLocation: /usr/lib/python3.4/site-packages\nRequires: certifi, requests, six, python-dateutil, requests-toolbelt\nClassifiers:\n Development Status :: 5 - Production/Stable\n Intended Audience :: Developers\n Natural Language :: English\n Operating System :: OS Independent\n License :: OSI Approved :: Apache Software License\n Programming Language :: Python\n Programming Language :: Python :: 2.7\n Programming Language :: Python :: 3.3\n Programming Language :: 
Python :: 3.4\n Programming Language :: Python :: 3.5\n Programming Language :: Python :: Implementation :: PyPy\n Programming Language :: Python :: Implementation :: CPython\n Topic :: Software Development :: Libraries :: Python Modules\n Topic :: Office/Business :: Financial :: Spreadsheet\n```\n", "Removed package in pip and installing from yum won't help...\n", "@pensnarik Can you try running `pip install -U requests[security]` in your environment and then try again?\n", "Thak you, Lukasa, for your advice, but it did not help...\n\n``` bash\n[mutex@unica1 parser]$ sudo pip install -U requests[security]\n[sudo] password for mutex: \nCollecting requests[security]\n Using cached requests-2.10.0-py2.py3-none-any.whl\nCollecting pyOpenSSL>=0.13 (from requests[security])\n Using cached pyOpenSSL-16.0.0-py2.py3-none-any.whl\nCollecting ndg-httpsclient (from requests[security])\nRequirement already up-to-date: pyasn1 in /usr/lib/python2.7/site-packages (from requests[security])\nCollecting cryptography>=1.3 (from pyOpenSSL>=0.13->requests[security])\n Using cached cryptography-1.3.2.tar.gz\nRequirement already up-to-date: six>=1.5.2 in /usr/lib/python2.7/site-packages (from pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: idna>=2.0 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: setuptools>=11.3 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: enum34 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: ipaddress in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: cffi>=1.4.1 in /usr/lib64/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nRequirement already up-to-date: pycparser in 
/usr/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL>=0.13->requests[security])\nBuilding wheels for collected packages: cryptography\n Running setup.py bdist_wheel for cryptography ... done\n Stored in directory: /root/.cache/pip/wheels/14/df/02/611097a49d7739151deb68d0172dff5ae7cba01b82769e56ef\nSuccessfully built cryptography\nInstalling collected packages: cryptography, pyOpenSSL, ndg-httpsclient, requests\n Found existing installation: requests 2.6.0\n DEPRECATION: Uninstalling a distutils installed project (requests) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.\n Uninstalling requests-2.6.0:\n Successfully uninstalled requests-2.6.0\nSuccessfully installed cryptography-1.3.2 ndg-httpsclient-0.4.0 pyOpenSSL-16.0.0 requests-2.10.0\nYou are using pip version 8.1.0, however version 8.1.2 is available.\nYou should consider upgrading via the 'pip install --upgrade pip' command.\n[mutex@unica1 parser]$ ./update_region_price.py \nTraceback (most recent call last):\n File \"./update_region_price.py\", line 129, in <module>\n sys.exit(app.run(sys.argv))\n File \"./update_region_price.py\", line 122, in run\n self.update_basic()\n File \"./update_region_price.py\", line 65, in update_basic\n sheet = self.sm.Sheets.get_sheet(self.sheet_id, page_size=5000)\n File \"/usr/lib/python2.7/site-packages/smartsheet/sheets.py\", line 460, in get_sheet\n response = self._base.request(prepped_request, expected, _op)\n File \"/usr/lib/python2.7/site-packages/smartsheet/smartsheet.py\", line 178, in request\n res = self.request_with_retry(prepped_request, operation)\n File \"/usr/lib/python2.7/site-packages/smartsheet/smartsheet.py\", line 242, in request_with_retry\n return self._request(prepped_request, operation)\n File \"/usr/lib/python2.7/site-packages/smartsheet/smartsheet.py\", line 208, in _request\n res = 
self._session.send(prepped_request, stream=stream)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 477, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: (\"bad handshake: Error([('SSL routines', 'SSL3_GET_SERVER_CERTIFICATE', 'certificate verify failed')],)\",)\n```\n", "Are you able to show me what host you're connecting to?\n", "Lukasa, of course: https://api.smartsheet.com/2.0, I'm using python wrapper for smartsheet API (https://github.com/smartsheet-platform/smartsheet-python-sdk). \n", "And do you happen to have `certifi` installed in your environment?\n", "Yes, I have:\n\n``` bash\n[mutex@unica1 parser]$ pip show certifi\n---\nMetadata-Version: 2.0\nName: certifi\nVersion: 2016.2.28\nSummary: Python package for providing Mozilla's CA Bundle.\nHome-page: http://certifi.io/\nAuthor: Kenneth Reitz\nAuthor-email: [email protected]\nInstaller: pip\nLicense: ISC\nLocation: /usr/lib/python2.7/site-packages\nRequires: \nClassifiers:\n Development Status :: 5 - Production/Stable\n Intended Audience :: Developers\n Natural Language :: English\n Programming Language :: Python\n Programming Language :: Python :: 2.5\n Programming Language :: Python :: 2.6\n Programming Language :: Python :: 2.7\n Programming Language :: Python :: 3.0\n Programming Language :: Python :: 3.1\n Programming Language :: Python :: 3.2\n Programming Language :: Python :: 3.3\n Programming Language :: Python :: 3.4\n```\n", "Aha, ok, we got there.\n\n`api.smartsheet.com` serves its TLS using what's known as a \"cross-signed certificate\". This was used because Verisign, the CA for `api.smartsheet.com`, originally used a 1024-bit root certificate. 
These were deprecated and replaced by stronger root certificates, but some older browsers and systems may not have received updates, so sites like `api.smartsheet.com` serve a root certificate that is signed by the 1024-bit root.\n\nThat's not normally a problem, _except_:\n- `certifi` removed the weak 1024-bit roots\n- OpenSSL older than 1.0.2 sucks at building cert chains, and so fails to correctly validate the cross-signed root.\n\nYou can solve this in two ways. The first, better but more drastic way, is to upgrade your OpenSSL to 1.0.2 or later. This is hard to do on Centos, I'm afraid. The less good but more effective way is to get the output of running `python -c \"import certifi; print certifi.old_where()\"` and then set the `REQUESTS_CA_BUNDLE` environment variable to the printed path.\n", "Argh, the smartsheet SDK overrides that environment variable by explicitly using certifi.\n\nNew plan. Right at the start after you import smartsheet, before you do _anything_ else, can you add the following lines?\n\n``` python\nimport smartsheet.session\nimport certifi\nsmartsheet.session._TRUSTED_CERT_FILE = certifi.old_where()\n```\n\nThat should hopefully help. You shouldn't need the environment variable with this approach.\n", "Unfortunately, it does not work even if I change this line in /usr/lib/python2.7/site-packages/smartsheet/session.py:\n\n``` python\n_TRUSTED_CERT_FILE = certifi.where()\n```\n\nto\n\n``` python\n_TRUSTED_CERT_FILE = certifi.old_where()\n```\n\nWhen I change this var from my code, It does not helps too...\n", "At this point, I'm pretty confident this is not a bug in requests and is in fact a problem with the smartsheet API wrapper. Have you tried reporting this to them?\n", "Hrm. Are you confident that that code is being executed? 
Can you use print statements to confirm that it's being used?\n", "Downgrading certifi from 2015.11.20.1 to 2015.4.28 fixes the problem!\n", "ubuntu@ip-172-31-58-148:~/requests$ clear\nubuntu@ip-172-31-58-148:~/requests$ python\nPython 2.7.6 (default, Jun 22 2015, 17:58:13)\n[GCC 4.8.2] on linux2\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n\n> > > import requests\n> > > requests.get('https://logo.clearbit.com/beutifi.com')\n> > > requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.\n> > > SNIMissingWarning\n> > > requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n> > > InsecurePlatformWarning\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > File \"requests/api.py\", line 71, in get\n> > > return request('get', url, params=params, *_kwargs)\n> > > File \"requests/api.py\", line 57, in request\n> > > return session.request(method=method, url=url, *_kwargs)\n> > > File \"requests/sessions.py\", line 477, in request\n> > > resp = self.send(prep, *_send_kwargs)\n> > > File \"requests/sessions.py\", line 587, in send\n> > > r = adapter.send(request, *_kwargs)\n> > > File \"requests/adapters.py\", line 491, in send\n> > > raise SSLError(e, request=request)\n> > > requests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure\n\nThis is my input to this problem. It doe snot happen with every https request.\n", "@sProject You would need to resolve both of those warnings before we can determine if you have a new problem. If you got requests from pip, run `pip install pyopenssl pyasn1 ndg-httpsclient`. If you got Requests from your system provider, you should install those three packages from there.\n", "This bug appears to have been resolved over two months ago. If I'm incorrect in this conclusion and there is still a bug to be found in requests, I'll be happy to reopen this. 
\n\nThanks everyone for collaborating on requests!\n", "The example from [pygodaddy](https://pypi.python.org/pypi/pygodaddy) raises the same exception\n\n```\nfrom pygodaddy import GoDaddyClient\nclient = GoDaddyClient()\nif client.login(login[0],login[1]):\n print client.find_domains()\nelse: print 'falhou'\n\n\n$ ./dynip.py \nTraceback (most recent call last):\n File \"./dynip.py\", line 8, in <module>\n if client.login(login[0],login[1]):\n File \"/usr/lib/python2.7/site-packages/pygodaddy/client.py\", line 99, in login\n r = self.session.get(self.default_url)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 501, in get\n return self.request('GET', url, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 488, in request\n resp = self.send(prep, **send_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 609, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 497, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: (\"bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)\",)\n```\n\nDowngrading certifi as suggested by @pensnarik does not fix it. 
This is Fedora 24, Python 2.7.12.\n\n```\n# pip list\nBabel (2.3.4)\nbeautifulsoup4 (4.5.1)\nblinker (1.4)\ncertifi (2016.9.26)\ncffi (1.9.1)\nchardet (2.3.0)\nclick (6.6)\ncryptography (1.5.3)\ncssselect (0.9.2)\ndecorator (4.0.10)\ndjango-htmlmin (0.9.1)\ndnspython (1.14.0)\nenum34 (1.1.6)\nextras (1.0.0)\nfirst (2.0.1)\nfixtures (3.0.0)\nFlask (0.11.1)\nFlask-Babel (0.11.1)\nFlask-Gravatar (0.4.2)\nFlask-Login (0.3.2)\nFlask-Mail (0.9.1)\nFlask-Principal (0.4.0)\nFlask-Security (1.7.5)\nFlask-SQLAlchemy (2.1)\nFlask-WTF (0.12)\ngps (3.16)\ngssapi (1.2.0)\nhtml5lib (1.0b3)\nidna (2.1)\nimportlib (1.0.4)\niniparse (0.4)\nipaclient (4.3.2)\nipaddress (1.0.17)\nipalib (4.3.2)\nipaplatform (4.3.2)\nipapython (4.3.2)\nitsdangerous (0.24)\nJinja2 (2.8)\njwcrypto (0.3.2)\nkitchen (1.2.4)\nlinecache2 (1.0.0)\nlockfile (0.12.2)\nlxml (3.6.4)\nM2Crypto (0.25.1)\nMarkupSafe (0.23)\nmunch (2.0.4)\nndg-httpsclient (0.4.2)\nnetaddr (0.7.18)\nnose (1.3.7)\nnumpy (1.11.1)\npasslib (1.6.5)\npbr (1.10.0)\npif (0.7.3)\nPillow (3.3.1)\npip (8.1.2)\npip-tools (1.7.0)\nply (3.9)\npsutil (4.3.0)\npsycopg2 (2.6.2)\npwquality (1.3.0)\npyasn1 (0.1.9)\npyasn1-modules (0.0.8)\npycparser (2.17)\npycrypto (2.6.1)\npycurl (7.43.0)\npygobject (3.20.1)\npygodaddy (0.2.2)\npygpgme (0.3)\npyliblzma (0.5.3)\npyOpenSSL (16.2.0)\npyrsistent (0.11.13)\nPySocks (1.5.7)\npython-dateutil (2.5.3)\npython-fedora (0.8.0)\npython-ldap (2.4.27)\npython-mimeparse (1.5.2)\npython-nss (1.0.0)\npython-yubico (1.3.2)\npytz (2016.6.1)\npyusb (1.0.0)\npyxattr (0.5.5)\nqrcode (5.3)\nreportlab (3.3.0)\nrequests (2.12.1)\nrequests-file (1.4.1)\nrpm-python (4.13.0)\nscdate (1.10.9)\nsetuptools (28.8.0)\nsimplejson (3.8.2)\nsix (1.10.0)\nslip (0.6.4)\nspeaklater (1.3)\nSQLAlchemy (1.0.14)\nsqlparse (0.2.1)\nSSSDConfig (1.14.2)\nTerminator (0.98)\ntestscenarios (0.5.0)\ntesttools (2.2.0)\ntldextract (2.0.2)\ntraceback2 (1.4.0)\ntyping (3.5.2.2)\nuniconvertor (2.0)\nunittest2 (1.1.0)\nurlgrabber (3.10.1)\nurllib3 
(1.16)\nWerkzeug (0.11.11)\nWTForms (2.1)\nyum-metadata-parser (1.1.4)\n```\n", "GoDaddy is serving an incomplete certificate chain to you. That means we're missing one of the intermediate certificates and can't build up a trust chain. Either you'll need to add the missing intermediary certificate to the certifi trust store or you'll need to contact GoDaddy and tell them to sort their mess out.\n", "@Lukasa how can one check if the problem is with requests or certificate issuer?\r\nI have one cert issued by Entrust and my browsers are quite ok with it when I browse to the URL.\r\nBut when I try to get to that URL via requests I have `[SSL: CERTIFICATE_VERIFY_FAILED]`\r\n\r\n**Full traceback**\r\n```\r\n$ python main.py\r\nTraceback (most recent call last):\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 594, in urlopen\r\n chunked=chunked)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 350, in _make_request\r\n self._validate_conn(conn)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 835, in _validate_conn\r\n conn.connect()\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/connection.py\", line 323, in connect\r\n ssl_context=context)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py\", line 324, in ssl_wrap_socket\r\n return context.wrap_socket(sock, server_hostname=server_hostname)\r\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 376, in wrap_socket\r\n _context=self)\r\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 747, in __init__\r\n self.do_handshake()\r\n File 
\"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 983, in do_handshake\r\n self._sslobj.do_handshake()\r\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 628, in do_handshake\r\n self._sslobj.do_handshake()\r\nssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 624, in urlopen\r\n raise SSLError(e)\r\nrequests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"main.py\", line 64, in <module>\r\n r = s.get('https://infoproducts.alcatel-lucent.com/cgi-bin/get_doc_list.pl?entry_id=1-0000000000662&srch_how=&srch_str=&release=&model=&category=&contype=&format=&sortby=&how=all_prod',\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/sessions.py\", line 501, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/Users/romandodin/venvs/nokdok/lib/python3.5/site-packages/requests/adapters.py\", line 497, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify 
failed (_ssl.c:645)\r\n```\r\n\r\n**installed packages**\r\n```\r\n$ pip list\r\ncertifi (2016.9.26)\r\npip (9.0.1)\r\nrequests (2.12.4)\r\nsetuptools (32.3.1)\r\nwheel (0.29.0)\r\n```\r\n\r\n**Cert details**\r\n```\r\nSubject infoproducts.alcatel-lucent.com\r\nSAN infoproducts.alcatel-lucent.com\r\ndocumentation.alcatel-lucent.com\r\nValid From Thu, 01 Dec 2016 13:15:08 GMT\r\nValid Until Sat, 01 Dec 2018 13:45:07 GMT\r\nIssuer Entrust Certification Authority - L1K\r\n```", "The problem is almost certainly not with the issuer, but with your server. Can you show me the output of running `openssl s_client -connect \"<your-host>:<your-port>\" -showcerts -servername \"<your-host>\"` against your server please? ", "I am not sure that I used the command right, but that's someting:\r\n```\r\n$ openssl s_client -connect \"infoproducts.alcatel-lucent.com:443\" -showcerts -servername \"infoproducts.alcatel-lucent.com\"\r\nCONNECTED(00000003)\r\ndepth=0 /C=US/ST=Illinois/L=Naperville/O=Alcatel-Lucent USA Inc./CN=infoproducts.alcatel-lucent.com\r\nverify error:num=20:unable to get local issuer certificate\r\nverify return:1\r\ndepth=0 /C=US/ST=Illinois/L=Naperville/O=Alcatel-Lucent USA Inc./CN=infoproducts.alcatel-lucent.com\r\nverify error:num=27:certificate not trusted\r\nverify return:1\r\ndepth=0 /C=US/ST=Illinois/L=Naperville/O=Alcatel-Lucent USA Inc./CN=infoproducts.alcatel-lucent.com\r\nverify error:num=21:unable to verify the first certificate\r\nverify return:1\r\n---\r\nCertificate chain\r\n 0 s:/C=US/ST=Illinois/L=Naperville/O=Alcatel-Lucent USA Inc./CN=infoproducts.alcatel-lucent.com\r\n i:/C=US/O=Entrust, Inc./OU=See www.entrust.net/legal-terms/OU=(c) 2012 Entrust, Inc. 
- for authorized use only/CN=Entrust Certification Authority - L1K\r\n-----BEGIN CERTIFICATE-----\r\nMIIFeDCCBGCgAwIBAgIRAO3KTvH+nKgrAAAAAFDanDYwDQYJKoZIhvcNAQELBQAw\r\ngboxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1FbnRydXN0LCBJbmMuMSgwJgYDVQQL\r\nEx9TZWUgd3d3LmVudHJ1c3QubmV0L2xlZ2FsLXRlcm1zMTkwNwYDVQQLEzAoYykg\r\nMjAxMiBFbnRydXN0LCBJbmMuIC0gZm9yIGF1dGhvcml6ZWQgdXNlIG9ubHkxLjAs\r\nBgNVBAMTJUVudHJ1c3QgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkgLSBMMUswHhcN\r\nMTYxMjAxMTMxNTA4WhcNMTgxMjAxMTM0NTA3WjCBgTELMAkGA1UEBhMCVVMxETAP\r\nBgNVBAgTCElsbGlub2lzMRMwEQYDVQQHEwpOYXBlcnZpbGxlMSAwHgYDVQQKExdB\r\nbGNhdGVsLUx1Y2VudCBVU0EgSW5jLjEoMCYGA1UEAxMfaW5mb3Byb2R1Y3RzLmFs\r\nY2F0ZWwtbHVjZW50LmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEB\r\nAK23xiJ/GCWNJa8Oa6SAYwT7QTmIJOXGrfLLZM9ZWDK81SVLm5xUAVRYaYmb3t/U\r\nKpzeUoL+cJIa2xJGdbj50ehUTbB3SOXW7dxr15fuWahSChqaNkI/NClNAxy2Vho5\r\nHxEtsjmuoJ0cNRcZLHZndLtWi27js3ivGxFxUcl7O3rGGj9yb+XXwJvEOsITPfZ/\r\ngpURnHfAurZrw1+xpsArQVOF6+K6KkPnjGoCj0XCyU3LnWc6akcwwwV+HXcW8H5G\r\n/CPvcm3DpI+45v/P2vVJ9+LiEJVHngVYK2QQ8fMDrvS2vs563I4PLnnRds0eOJph\r\nLfHmn87oSW5jCP3RnsW0czECAwEAAaOCAa4wggGqMA4GA1UdDwEB/wQEAwIFoDAT\r\nBgNVHSUEDDAKBggrBgEFBQcDATAzBgNVHR8ELDAqMCigJqAkhiJodHRwOi8vY3Js\r\nLmVudHJ1c3QubmV0L2xldmVsMWsuY3JsMEsGA1UdIAREMEIwNgYKYIZIAYb6bAoB\r\nBTAoMCYGCCsGAQUFBwIBFhpodHRwOi8vd3d3LmVudHJ1c3QubmV0L3JwYTAIBgZn\r\ngQwBAgIwaAYIKwYBBQUHAQEEXDBaMCMGCCsGAQUFBzABhhdodHRwOi8vb2NzcC5l\r\nbnRydXN0Lm5ldDAzBggrBgEFBQcwAoYnaHR0cDovL2FpYS5lbnRydXN0Lm5ldC9s\r\nMWstY2hhaW4yNTYuY2VyMEwGA1UdEQRFMEOCH2luZm9wcm9kdWN0cy5hbGNhdGVs\r\nLWx1Y2VudC5jb22CIGRvY3VtZW50YXRpb24uYWxjYXRlbC1sdWNlbnQuY29tMB8G\r\nA1UdIwQYMBaAFIKicHTdvFM/z3vU981/p2DGCky/MB0GA1UdDgQWBBT9w+iH+fpG\r\nUJ0IMw1vVpn+91hk6jAJBgNVHRMEAjAAMA0GCSqGSIb3DQEBCwUAA4IBAQAgOcDq\r\n0YsihBWAxA4WFIGkdB/JdOKLRQ0Hvc7KStgtdkB1AqkUv7oBnDCHesazw4vyWbjv\r\nw7dBTtB95jq1ZHX6eHyUDUo2xBmjnaeb2lSAdVV6j10sl7lMfRQh7ba1Im0ibhfe\r\n7gvBn0tUlmdGqqvDqokzV4lVX74Z9nKKKr5D9e3vJsb5AvbDC/eYguBK9Oy8EDa2\r\nZcuPve3mB68lVy5UDg21RVZE072qC0FlYhNasZlMVUUg7tgDMlynQeeoxHe7Rci
c\r\npHANQxJqtN8/bsE2mO/ryRZALyC7mWeDvG522ZXMaKslwTUr+jokpyUF7tS786Pi\r\nn4zJ/KNZK2suVwcK\r\n-----END CERTIFICATE-----\r\n---\r\nServer certificate\r\nsubject=/C=US/ST=Illinois/L=Naperville/O=Alcatel-Lucent USA Inc./CN=infoproducts.alcatel-lucent.com\r\nissuer=/C=US/O=Entrust, Inc./OU=See www.entrust.net/legal-terms/OU=(c) 2012 Entrust, Inc. - for authorized use only/CN=Entrust Certification Authority - L1K\r\n---\r\nNo client certificate CA names sent\r\n---\r\nSSL handshake has read 1556 bytes and written 466 bytes\r\n---\r\nNew, TLSv1/SSLv3, Cipher is AES256-SHA\r\nServer public key is 2048 bit\r\nSecure Renegotiation IS NOT supported\r\nCompression: NONE\r\nExpansion: NONE\r\nSSL-Session:\r\n Protocol : TLSv1\r\n Cipher : AES256-SHA\r\n Session-ID: 48D3C9611E37C997182C4F123E21BCFD909D8D4BE887B44355866B80D44E1163\r\n Session-ID-ctx:\r\n Master-Key: 0A4A9E2A10E33EAA248541C39AC71F2F879310BAE141A4EA26F425D56C27099FB6D6572C2CCFC763FC743ECF99DABB5B\r\n Key-Arg : None\r\n Start Time: 1483448892\r\n Timeout : 300 (sec)\r\n Verify return code: 21 (unable to verify the first certificate)\r\n---\r\nclosed\r\n```", "Yup, so the server is wrong.\r\n\r\nThis is a really common problem with TLS. A TLS certificate validation is performed by first building a *certificate chain* that takes us from a certificate the server can prove it has the private key for (the \"leaf\") to a certificate we trust (the \"root\"). For OpenSSL there is an additional rule: the root must be self-signed.\r\n\r\nIn most cases, this chain is more than two steps long: that is, the leaf is not signed by the root, but is instead signed by a so-called *intermediate* certificate. That intermediate may itself be signed by the root, but may also be signed by *other* intermediates, until we eventually reach an intermediate that was signed by a root.\r\n\r\nFor example, for https://python-hyper.org, the chain is as follows:\r\n\r\n1. python-hyper.org, the \"leaf\", which I applied for and had issued\r\n2. 
Let's Encrypt Authority X3, an intermediate certificate used by the Let's Encrypt project\r\n3. DST Root CA X3, the root certificate that my web browser (and Requests) trusts\r\n\r\nThe correct chain for the site you're accessing *should be*:\r\n\r\n1. infoproducts.alcatel-lucent.com, the leaf\r\n2. Entrust Certification Authority - L1K, the intermediate certificate which issued the leaf\r\n3. Entrust Root Certification Authority - G2, the root certificate that issued the intermediate\r\n\r\nFor your use case, Requests ships with certificate number 3 in its trust database as a trusted root. However, your server is only sending certificate number 1. Requests cannot go from certificate 1 to certificate 3 without having certificate 2, and the server isn't sending it. Most browsers ship with commonly-used intermediate certs, or can maintain caches of them, but Requests cannot do that, so it has no way of getting hold of certificate 2. Without certificate 2, validation must fail.\r\n\r\nProperly-configured TLS servers send the leaf certificate *and all of the necessary intermediaries*. This is to ensure that clients that have never seen the intermediaries and that cannot dynamically fetch them are still able to validate the chain. 
As a comparison, check out the output of a command like yours run against `python-hyper.org`:\r\n\r\n```\r\n% openssl s_client -connect python-hyper.org:443 -showcerts -servername \"python-hyper.org\"\r\nCONNECTED(00000003)\r\ndepth=1 /C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\r\nverify error:num=20:unable to get local issuer certificate\r\nverify return:0\r\n---\r\nCertificate chain\r\n 0 s:/CN=python-hyper.org\r\n i:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\r\n-----BEGIN CERTIFICATE-----\r\nMIIFBDCCA+ygAwIBAgISAwji03rFoFMPZnEC0rLc7jg3MA0GCSqGSIb3DQEBCwUA\r\nMEoxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MSMwIQYDVQQD\r\nExpMZXQncyBFbmNyeXB0IEF1dGhvcml0eSBYMzAeFw0xNzAxMDExMjA4MDBaFw0x\r\nNzA0MDExMjA4MDBaMBsxGTAXBgNVBAMTEHB5dGhvbi1oeXBlci5vcmcwggEiMA0G\r\nCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCjF6VMEOe2qHsdbAnYstunDCW5/fBx\r\nyhzNAxSqZKA5qfATdvhDmiPFnHJkQgUkUeGDwbBwemxuUFUaGZKTJDRhlrymLSkN\r\nhkcouBzs/mxDjNlKacokJBm3hpu+oxYohhxPIZBs8NM4olUPDSG68r6sd1EwR+Ia\r\nLw//nRsMpcrGNmYy+howiBvV3CbuYsbgB59bJ+5y6G2ZqeHMwzFZ+No1oQmBck9T\r\nKAvCh5TteQphzcM9s9NZiB6Z9C0s+vBPKOc1uLssCI3hfr29Af203CX2xgyBjXH7\r\nM4WS17o1zLgs3Q2V03gQ7AmhtgjuJ+2NRf/d6Bsmk8ZhGnyyf+qbY+G/AgMBAAGj\r\nggIRMIICDTAOBgNVHQ8BAf8EBAMCBaAwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsG\r\nAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFMD50AC8wul1eNKkGVxaAvXb\r\nDSkDMB8GA1UdIwQYMBaAFKhKamMEfd265tE5t6ZFZe/zqOyhMHAGCCsGAQUFBwEB\r\nBGQwYjAvBggrBgEFBQcwAYYjaHR0cDovL29jc3AuaW50LXgzLmxldHNlbmNyeXB0\r\nLm9yZy8wLwYIKwYBBQUHMAKGI2h0dHA6Ly9jZXJ0LmludC14My5sZXRzZW5jcnlw\r\ndC5vcmcvMBsGA1UdEQQUMBKCEHB5dGhvbi1oeXBlci5vcmcwgf4GA1UdIASB9jCB\r\n8zAIBgZngQwBAgEwgeYGCysGAQQBgt8TAQEBMIHWMCYGCCsGAQUFBwIBFhpodHRw\r\nOi8vY3BzLmxldHNlbmNyeXB0Lm9yZzCBqwYIKwYBBQUHAgIwgZ4MgZtUaGlzIENl\r\ncnRpZmljYXRlIG1heSBvbmx5IGJlIHJlbGllZCB1cG9uIGJ5IFJlbHlpbmcgUGFy\r\ndGllcyBhbmQgb25seSBpbiBhY2NvcmRhbmNlIHdpdGggdGhlIENlcnRpZmljYXRl\r\nIFBvbGljeSBmb3VuZCBhdCBodHRwczovL2xldHNlbmNyeXB0Lm9yZy9yZXBvc2l0\r\nb3J5LzANBgkqhkiG9w0BAQsFAAOCAQEAgX00tBHyz9GiDQjw+Id7sbT
1lrtHrmtR\r\nDB+kofnq9pkwIExDXT0bAZ14EnU6atiVqhF3j3KxvxIfbNvmSr7emmhPwt+KqOf7\r\n/1m+gxg3ode9LIg6oLtVOfulecxkS4/Wj990O40vuRNdy4XT4PSNze8iuJtGALoS\r\nU9kP8G/V6VnrdbTYhSIIUW9nm0XQcOUbvupFWtiwE8vZw4t0pQloeECdMuALgVO/\r\n1xSu0kqZgidLOkFwei/xqItx7foREzVvq3kUHD1OAuPI1azHQjErQC3N12OmxmBU\r\n3KDOsaJC2Uu8fqI/y1YOkO97hpsgFZX4BiQNaNhtyy7sscD/teVhWQ==\r\n-----END CERTIFICATE-----\r\n 1 s:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\r\n i:/O=Digital Signature Trust Co./CN=DST Root CA X3\r\n-----BEGIN CERTIFICATE-----\r\nMIIEkjCCA3qgAwIBAgIQCgFBQgAAAVOFc2oLheynCDANBgkqhkiG9w0BAQsFADA/\r\nMSQwIgYDVQQKExtEaWdpdGFsIFNpZ25hdHVyZSBUcnVzdCBDby4xFzAVBgNVBAMT\r\nDkRTVCBSb290IENBIFgzMB4XDTE2MDMxNzE2NDA0NloXDTIxMDMxNzE2NDA0Nlow\r\nSjELMAkGA1UEBhMCVVMxFjAUBgNVBAoTDUxldCdzIEVuY3J5cHQxIzAhBgNVBAMT\r\nGkxldCdzIEVuY3J5cHQgQXV0aG9yaXR5IFgzMIIBIjANBgkqhkiG9w0BAQEFAAOC\r\nAQ8AMIIBCgKCAQEAnNMM8FrlLke3cl03g7NoYzDq1zUmGSXhvb418XCSL7e4S0EF\r\nq6meNQhY7LEqxGiHC6PjdeTm86dicbp5gWAf15Gan/PQeGdxyGkOlZHP/uaZ6WA8\r\nSMx+yk13EiSdRxta67nsHjcAHJyse6cF6s5K671B5TaYucv9bTyWaN8jKkKQDIZ0\r\nZ8h/pZq4UmEUEz9l6YKHy9v6Dlb2honzhT+Xhq+w3Brvaw2VFn3EK6BlspkENnWA\r\na6xK8xuQSXgvopZPKiAlKQTGdMDQMc2PMTiVFrqoM7hD8bEfwzB/onkxEz0tNvjj\r\n/PIzark5McWvxI0NHWQWM6r6hCm21AvA2H3DkwIDAQABo4IBfTCCAXkwEgYDVR0T\r\nAQH/BAgwBgEB/wIBADAOBgNVHQ8BAf8EBAMCAYYwfwYIKwYBBQUHAQEEczBxMDIG\r\nCCsGAQUFBzABhiZodHRwOi8vaXNyZy50cnVzdGlkLm9jc3AuaWRlbnRydXN0LmNv\r\nbTA7BggrBgEFBQcwAoYvaHR0cDovL2FwcHMuaWRlbnRydXN0LmNvbS9yb290cy9k\r\nc3Ryb290Y2F4My5wN2MwHwYDVR0jBBgwFoAUxKexpHsscfrb4UuQdf/EFWCFiRAw\r\nVAYDVR0gBE0wSzAIBgZngQwBAgEwPwYLKwYBBAGC3xMBAQEwMDAuBggrBgEFBQcC\r\nARYiaHR0cDovL2Nwcy5yb290LXgxLmxldHNlbmNyeXB0Lm9yZzA8BgNVHR8ENTAz\r\nMDGgL6AthitodHRwOi8vY3JsLmlkZW50cnVzdC5jb20vRFNUUk9PVENBWDNDUkwu\r\nY3JsMB0GA1UdDgQWBBSoSmpjBH3duubRObemRWXv86jsoTANBgkqhkiG9w0BAQsF\r\nAAOCAQEA3TPXEfNjWDjdGBX7CVW+dla5cEilaUcne8IkCJLxWh9KEik3JHRRHGJo\r\nuM2VcGfl96S8TihRzZvoroed6ti6WqEBmtzw3Wodatg+VyOeph4EYpr/1wXKtx8/\r\nwApIvJSwtmVi4MFU5aMqrSDE6ea73Mj2tcMyo5jMd6jmeWUHK8so
/joWUoHOUgwu\r\nX4Po1QYz+3dszkDqMp4fklxBwXRsW10KXzPMTZ+sOPAveyxindmjkW8lGy+QsRlG\r\nPfZ+G6Z6h7mjem0Y+iWlkYcV4PIWL1iwBi8saCbGS5jN2p8M+X+Q7UNKEkROb3N6\r\nKOqkqm57TH2H3eDJAkSnh6/DNFu0Qg==\r\n-----END CERTIFICATE-----\r\n---\r\nServer certificate\r\nsubject=/CN=python-hyper.org\r\nissuer=/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\r\n---\r\nNo client certificate CA names sent\r\n---\r\nSSL handshake has read 2638 bytes and written 451 bytes\r\n---\r\nNew, TLSv1/SSLv3, Cipher is AES256-SHA\r\nServer public key is 2048 bit\r\nSecure Renegotiation IS supported\r\nCompression: NONE\r\nExpansion: NONE\r\nSSL-Session:\r\n Protocol : TLSv1\r\n Cipher : AES256-SHA\r\n Session-ID: BBDF2CA65B2142C1F0C416B28C4FDF8FD5118ED4D5FF8802789D0B54C6309469\r\n Session-ID-ctx: \r\n Master-Key: BD9B8B97EC439248CC5849EBF4B2ED18E0A73472EB8D86FE6F038C4E6944AFD8A163D80D47A3A3B28A6237A44C1B31CE\r\n Key-Arg : None\r\n Start Time: 1483447980\r\n Timeout : 300 (sec)\r\n Verify return code: 0 (ok)\r\n---\r\n```\r\n\r\nNote that the server sent *two* certificates in my case, whereas in yours it only sent one.\r\n\r\nThis is the error. Please reconfigure your server to send all the relevant intermediaries as well as the leaf.\r\n\r\nWhile I'm here, I should note that running Qualys SSL Labs against your server would also have told you about this problem. =)", "Thanks @Lukasa for this educational reply you gave! Really appreciate and sorry for troubling with this, it was my TLS-knowledge-gap =) Mostly I went to issues because in was all good in browsers and didn't work in requests. But now its clear to me what is the root cause.\r\n\r\nI believe there is no workaround (download intermediate cert upfront) for this case, but to reconfigure the server? For the time being I will skip verification checks", "@hellt No need to apologise: this is a thing that is extremely non-obvious and it's a *really* common error to make, so you're in good company here. =)\r\n\r\nThere is a workaround, in fact. 
You can pass a certificate bundle to `verify` to override the use of certifi's bundle. This bundle can, in addition to root certs, contain intermediary certificates that become available to OpenSSL to use to build the chain. As a result, what you can do is take the certifi cert chain (which is in a file you can find by running `python -c 'import certifi; print(certifi.where())'`), copy that out to somewhere else on your filesystem, and then add the intermediate certificate to the end of that file. The intermediate has to be in PEM format. If you then pass the path to the new cert bundle to the `verify` kwarg, you'll find everything starts working.", "Luka,\r\n What if I get this type of error;\r\npip install certifi\r\nCollecting certifi\r\n Could not fetch URL https://pypi.python.org/simple/certifi/: There was a problem confirming the ssl certificate: [Errno 2] No such file or directory - skipping\r\n Could not find a version that satisfies the requirement certifi (from versions: )\r\nNo matching distribution found for certifi\r\n\r\nQ", "Seems like you are having trouble validating the TLS certificates from PyPI. However, that error message is a bit surprising. I don't entirely know where it is coming from, but it suggests a problem either with the way your pip is configured or with your Python. Do you encounter similar errors with other python packages?", "Hello All,\r\nI am trying to do the kerberos authentication using Python within the intranet of my organisation but getting this error.\r\n\r\nimport requests \r\nfrom requests_kerberos import HTTPKerberosAuth \r\nr = requests.get(\"https:*****************\", auth=HTTPKerberosAuth())\r\n\r\nSSLError: (\"bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)\",)\r\n\r\nDoes anyone know how to add certificates for Python? Will that help to get this resolved?" ]
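Editorial aside: the bundle-augmentation workaround described in the thread above can be sketched with nothing but stdlib file handling. The paths and PEM contents below are stand-ins, not real certificates; in actual use the source bundle would come from `python -c 'import certifi; print(certifi.where())'` and the combined file would be passed to requests via the `verify` kwarg.

```python
import tempfile
from pathlib import Path

def build_custom_bundle(ca_bundle: Path, intermediate_pem: Path, out: Path) -> Path:
    """Append an intermediate certificate (PEM) to a copy of a CA bundle."""
    out.write_bytes(ca_bundle.read_bytes() + b"\n" + intermediate_pem.read_bytes())
    return out

# Stand-in PEM-shaped files instead of real certificates:
tmp = Path(tempfile.mkdtemp())
bundle = tmp / "cacert.pem"             # would be a copy of certifi's bundle
intermediate = tmp / "intermediate.pem" # hypothetical intermediate in PEM form
bundle.write_text("-----BEGIN CERTIFICATE-----\nROOT\n-----END CERTIFICATE-----\n")
intermediate.write_text("-----BEGIN CERTIFICATE-----\nINTERMEDIATE\n-----END CERTIFICATE-----\n")

combined = build_custom_bundle(bundle, intermediate, tmp / "combined.pem")
# requests.get(url, verify=str(combined)) would then let OpenSSL use the
# extra intermediate when building the trust chain.
print(combined.read_text().count("BEGIN CERTIFICATE"))  # 2
```

The combined file simply holds both PEM blocks back to back, which is the format OpenSSL expects for a bundle.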
https://api.github.com/repos/psf/requests/issues/3201
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3201/labels{/name}
https://api.github.com/repos/psf/requests/issues/3201/comments
https://api.github.com/repos/psf/requests/issues/3201/events
https://github.com/psf/requests/issues/3201
155,859,310
MDU6SXNzdWUxNTU4NTkzMTA=
3,201
Request's documentation is self-contradictory about whether connections are pooled automatically
{ "avatar_url": "https://avatars.githubusercontent.com/u/90853?v=4", "events_url": "https://api.github.com/users/thanatos/events{/privacy}", "followers_url": "https://api.github.com/users/thanatos/followers", "following_url": "https://api.github.com/users/thanatos/following{/other_user}", "gists_url": "https://api.github.com/users/thanatos/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/thanatos", "id": 90853, "login": "thanatos", "node_id": "MDQ6VXNlcjkwODUz", "organizations_url": "https://api.github.com/users/thanatos/orgs", "received_events_url": "https://api.github.com/users/thanatos/received_events", "repos_url": "https://api.github.com/users/thanatos/repos", "site_admin": false, "starred_url": "https://api.github.com/users/thanatos/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thanatos/subscriptions", "type": "User", "url": "https://api.github.com/users/thanatos", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" }, { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" }, { "color": "fef2c0", "default": false, "description": null, "id": 298537994, "name": "Needs More Information", "node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information" } ]
closed
true
null
[]
null
2
2016-05-19T23:42:28Z
2021-09-08T17:05:39Z
2016-06-09T21:44:26Z
NONE
resolved
The introduction to requests claims, > Keep-alive and HTTP connection pooling are 100% automatic The page on sessions contradicts this, and heavily implies that I need to create a `Session` object to get pooling: > if you're making several requests to the same host, the underlying TCP connection will be reused, which can result in a significant performance increase (see HTTP persistent connection). If I need to create a Session in order to achieve connection pooling… that doesn't seem "100% automatic" if I must manually ask for it. Use of the library seems to indicate that `Session` is the way to go, because I'm getting spammed by "Starting new HTTPS connection" logs.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3201/reactions" }
https://api.github.com/repos/psf/requests/issues/3201/timeline
null
completed
null
null
false
[ "Thanks for this!\n\nI don't know that I'd call the documentation contradictory. At least when I see \"100% automatic\" I see a phrase that means \"no user intervention is required\", which is true. However, you do need an object to put state on to.\n\nRegardless, what wording change would you recommend to clear this up?\n", "Every porcelain-level API call (e.g. `requests.get`) has a session baked into it, which will be used across all requests within that function call.\n\nAs documented, if you intend to make multiple calls, you must create a session.\n\nDocumentation is correct. \n" ]
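Editorial aside: a minimal illustration of the distinction drawn in the thread above, assuming `requests` is installed (no network traffic is made here). Each bare `requests.get()` builds a throwaway `Session` internally, so its connection pool dies with the call; keeping one `Session` alive is what lets connections be reused.

```python
import requests

# A Session owns the transport adapters (and their urllib3 connection
# pools); reusing one Session across calls is what enables keep-alive
# connection reuse across requests.
session = requests.Session()
print(sorted(session.adapters))  # ['http://', 'https://'] -- one pooled adapter per scheme
session.close()
```

Typical usage would be `with requests.Session() as s:` followed by several `s.get(...)` calls to the same host, rather than repeated module-level `requests.get(...)` calls.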
https://api.github.com/repos/psf/requests/issues/3200
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3200/labels{/name}
https://api.github.com/repos/psf/requests/issues/3200/comments
https://api.github.com/repos/psf/requests/issues/3200/events
https://github.com/psf/requests/pull/3200
155,837,127
MDExOlB1bGxSZXF1ZXN0NzA3NjY2ODU=
3,200
Document the cert pooling bug
{ "avatar_url": "https://avatars.githubusercontent.com/u/236086?v=4", "events_url": "https://api.github.com/users/zestyping/events{/privacy}", "followers_url": "https://api.github.com/users/zestyping/followers", "following_url": "https://api.github.com/users/zestyping/following{/other_user}", "gists_url": "https://api.github.com/users/zestyping/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zestyping", "id": 236086, "login": "zestyping", "node_id": "MDQ6VXNlcjIzNjA4Ng==", "organizations_url": "https://api.github.com/users/zestyping/orgs", "received_events_url": "https://api.github.com/users/zestyping/received_events", "repos_url": "https://api.github.com/users/zestyping/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zestyping/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zestyping/subscriptions", "type": "User", "url": "https://api.github.com/users/zestyping", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-05-19T21:16:32Z
2021-09-08T04:00:58Z
2016-05-19T21:56:04Z
NONE
resolved
Hi there. I just ran into https://github.com/kennethreitz/requests/issues/2863 and spent a lot of time trying to figure out what was wrong. It would be really nice for future people who might encounter this bug to be informed up front, so here is a note to add to the User Guide and the API documentation. I understand that people are working on fixing this bug, but it's not fixed yet and it's been open for about 6 months. I think it's valuable to make this quick change to the docs in the meantime, and then when the bug is fixed we can remove these warnings. Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3200/reactions" }
https://api.github.com/repos/psf/requests/issues/3200/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3200.diff", "html_url": "https://github.com/psf/requests/pull/3200", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3200.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3200" }
true
[ "Thanks for this suggestion! However, I don't think the proposed documentation change is good: in fact, I think it is dangerously misleading. The bug occurs regardless of how `cert` is set: so long as it has been set and a session is being used it is possible for it to be effectively incorrectly set. \n\nHowever, I also think there is no urgency to document this. A fix has been merged into urllib3 (shazow/urllib3#830), and a new urllib3 release will be out in the next few weeks. I doubt this will remain unfixed for more than another few weeks. \n", "Thanks, I didn't know a fix was imminent. I'm looking forward to it!\n\nI don't understand what's wrong with the advice to set `cert` immediately on creation and never changing it—only asking for clarification in case it means I don't understand what the bug is. If you set `cert` before making any requests and never change it, then how can you encounter the bug?\n", "Nothing's wrong with that advice: it's just not what your documentation says. =) Your documentation change says _don't use it at all_, which is not necessarily right.\n\nYou're right though: if you set `cert` before using the Session and never want to change or override it, you won't hit the bug.\n" ]
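Editorial aside: the advice agreed in the thread above — set `cert` once, immediately after creating the `Session`, and never reassign it — can be sketched as follows. The certificate paths are hypothetical placeholders and no request is actually sent.

```python
import requests

# Hypothetical client certificate paths -- placeholders only.
CLIENT_CERT = ("/etc/ssl/private/client.pem", "/etc/ssl/private/client.key")

session = requests.Session()
session.cert = CLIENT_CERT  # configure once, before the first request
# From here on, make all requests through this session and never reassign
# session.cert; that avoids the stale-connection-pool behaviour discussed
# in issue #2863, where a changed cert may not apply to pooled connections.
print(session.cert == CLIENT_CERT)  # True
session.close()
```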
https://api.github.com/repos/psf/requests/issues/3199
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3199/labels{/name}
https://api.github.com/repos/psf/requests/issues/3199/comments
https://api.github.com/repos/psf/requests/issues/3199/events
https://github.com/psf/requests/issues/3199
155,795,525
MDU6SXNzdWUxNTU3OTU1MjU=
3,199
Not possible to receive server response before all of request body is sent?
{ "avatar_url": "https://avatars.githubusercontent.com/u/555959?v=4", "events_url": "https://api.github.com/users/zachmullen/events{/privacy}", "followers_url": "https://api.github.com/users/zachmullen/followers", "following_url": "https://api.github.com/users/zachmullen/following{/other_user}", "gists_url": "https://api.github.com/users/zachmullen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zachmullen", "id": 555959, "login": "zachmullen", "node_id": "MDQ6VXNlcjU1NTk1OQ==", "organizations_url": "https://api.github.com/users/zachmullen/orgs", "received_events_url": "https://api.github.com/users/zachmullen/received_events", "repos_url": "https://api.github.com/users/zachmullen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zachmullen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zachmullen/subscriptions", "type": "User", "url": "https://api.github.com/users/zachmullen", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-05-19T17:47:22Z
2021-09-08T18:00:44Z
2016-05-19T19:46:16Z
NONE
resolved
I'm not sure if this is actually the case, but from looking at the code it appears to be. To motivate my use case, I am using chunked transfer-encoding mode for the request since I want to support streaming from an underlying service, so the content length is unknown and could be very long. It appears that if the server sends a response prior to the request body being completely sent, we wouldn't know about it, which would be quite bad for something like a redirect or an error condition that was detected very early in the stream. In particular, in a redirect case we certainly don't want to re-send the entire stream. Is there a good way around this?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3199/reactions" }
https://api.github.com/repos/psf/requests/issues/3199/timeline
null
completed
null
null
false
[ "@zachmullen short of rewriting our HTTP stack to not use httplib, no. And none of us presently have the time to do that. :-(\n", "Well, we could work around it if we adjusted `send` to always check whether the socket is readable first, but it's a dirty hack and I don't think it's worth it: it'll just be too brittle I suspect. That said, if someone wants to prototype a change to do that in urllib3 it'd be an interested experiment.\n\nRegardless, this would first be a urllib3 enhancement so I'm closing this.\n", "Ok, thanks\n", "(Closing for @Lukasa since he said he was going to and didn't. ;))\n" ]
https://api.github.com/repos/psf/requests/issues/3198
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3198/labels{/name}
https://api.github.com/repos/psf/requests/issues/3198/comments
https://api.github.com/repos/psf/requests/issues/3198/events
https://github.com/psf/requests/pull/3198
155,128,791
MDExOlB1bGxSZXF1ZXN0NzAyNjY0NjI=
3,198
Issue #2645 - Change behavior of Response.ok
{ "avatar_url": "https://avatars.githubusercontent.com/u/15115673?v=4", "events_url": "https://api.github.com/users/AlexPHorta/events{/privacy}", "followers_url": "https://api.github.com/users/AlexPHorta/followers", "following_url": "https://api.github.com/users/AlexPHorta/following{/other_user}", "gists_url": "https://api.github.com/users/AlexPHorta/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AlexPHorta", "id": 15115673, "login": "AlexPHorta", "node_id": "MDQ6VXNlcjE1MTE1Njcz", "organizations_url": "https://api.github.com/users/AlexPHorta/orgs", "received_events_url": "https://api.github.com/users/AlexPHorta/received_events", "repos_url": "https://api.github.com/users/AlexPHorta/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AlexPHorta/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AlexPHorta/subscriptions", "type": "User", "url": "https://api.github.com/users/AlexPHorta", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
5
2016-05-16T21:41:53Z
2021-09-07T00:06:36Z
2017-02-10T17:18:50Z
NONE
resolved
As discussed on issue #2645, I added a success attribute to requests.codes. Now, Response.ok checks if the response's status code is listed on requests.codes.success.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3198/reactions" }
https://api.github.com/repos/psf/requests/issues/3198/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3198.diff", "html_url": "https://github.com/psf/requests/pull/3198", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3198.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3198" }
true
[ "I get it, it's just that I thought the agreed solution was to add the success attribute to requests.codes. I'll update the commit right now.\n", "This LGTM. We can't merge until we're ready to do 3.0.0, though.\n", "An additional check should be added for entire 2xx range after the registered codes are checked (or perhaps the '-1 index +1 :' slice). Both for future-proofing and a server might be using an esoteric number honoring the numbering scheme. \n", "Also, I've shared some thoughts about this on https://github.com/kennethreitz/requests/issues/2645.\n\nBasically, no behavioral modifications to `ok`. A new attribute can be added for the 200-only functionality, if this is still desired. \n\nI will need to think for a bit about the naming. feel free to use `is_successful` as a placeholder for now.\n\n---\n\nMaintainers: **do not merge** without my approval.\n", "P.S. everybody gets cake :sparkles: :cake: ✨ \n" ]
https://api.github.com/repos/psf/requests/issues/3197
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3197/labels{/name}
https://api.github.com/repos/psf/requests/issues/3197/comments
https://api.github.com/repos/psf/requests/issues/3197/events
https://github.com/psf/requests/issues/3197
155,076,971
MDU6SXNzdWUxNTUwNzY5NzE=
3,197
session proxies not honored during request
{ "avatar_url": "https://avatars.githubusercontent.com/u/1371585?v=4", "events_url": "https://api.github.com/users/mbondfusion/events{/privacy}", "followers_url": "https://api.github.com/users/mbondfusion/followers", "following_url": "https://api.github.com/users/mbondfusion/following{/other_user}", "gists_url": "https://api.github.com/users/mbondfusion/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mbondfusion", "id": 1371585, "login": "mbondfusion", "node_id": "MDQ6VXNlcjEzNzE1ODU=", "organizations_url": "https://api.github.com/users/mbondfusion/orgs", "received_events_url": "https://api.github.com/users/mbondfusion/received_events", "repos_url": "https://api.github.com/users/mbondfusion/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mbondfusion/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mbondfusion/subscriptions", "type": "User", "url": "https://api.github.com/users/mbondfusion", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-16T17:21:01Z
2021-09-08T18:00:45Z
2016-05-16T17:23:07Z
NONE
resolved
Using a session object and setting the proxies dictionary does not affect the proxies used during a request. ex: Still tries to use system proxy. ``` s = requests.Session() s.proxies = {'http': None, 'https': None} s.post(url='http://10.0.1.1', json={'test': 'data'}) ``` Using proxies during each request works, but it would be great to set them at the session level. Specifically if you potentially have dozens of ".post()" or ".get()" calls throughout your script. Any thoughts?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3197/reactions" }
https://api.github.com/repos/psf/requests/issues/3197/timeline
null
completed
null
null
false
[ "Thanks for this! This is a duplicate of #2018.\n" ]
https://api.github.com/repos/psf/requests/issues/3196
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3196/labels{/name}
https://api.github.com/repos/psf/requests/issues/3196/comments
https://api.github.com/repos/psf/requests/issues/3196/events
https://github.com/psf/requests/issues/3196
155,073,824
MDU6SXNzdWUxNTUwNzM4MjQ=
3,196
redirect with odd (but correct) Location url fails via proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/2027885?v=4", "events_url": "https://api.github.com/users/mgdelmonte/events{/privacy}", "followers_url": "https://api.github.com/users/mgdelmonte/followers", "following_url": "https://api.github.com/users/mgdelmonte/following{/other_user}", "gists_url": "https://api.github.com/users/mgdelmonte/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mgdelmonte", "id": 2027885, "login": "mgdelmonte", "node_id": "MDQ6VXNlcjIwMjc4ODU=", "organizations_url": "https://api.github.com/users/mgdelmonte/orgs", "received_events_url": "https://api.github.com/users/mgdelmonte/received_events", "repos_url": "https://api.github.com/users/mgdelmonte/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mgdelmonte/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mgdelmonte/subscriptions", "type": "User", "url": "https://api.github.com/users/mgdelmonte", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-05-16T17:03:52Z
2021-09-08T16:00:36Z
2016-08-05T07:49:07Z
NONE
resolved
Using v2.10.0 (but tested and get same behavior with 2.9.1), if we get a 302 redirect with a header like this (note the missing trailing slash after the domain): Location: http://captrust.netxinvestor.com?custId=.... AND we're using a proxy (proxy in question is tinyproxy, but tested with other proxies and get same behavior) AND allow_redirects=True the proxy will fail with "cannot connect" because the domain in the Location url is improperly parsed. Firefox and Chrome both properly handle the redirect. To be honest I haven't checked whether the Location URL does or does not violate the spec, but if the browsers handle it and it appears to be unambiguous I'd vote for `requests` handling it also. Seems like an easy patch, though I must also confess that I've stepped through the code and can't find where to handle it. Oddly, if that exact URL is passed to requests as a GET, even via a proxy, it works fine -- so my guess is that it happens somewhere in the SessionRedirectMixin object. Workaround right now is to request with allow_redirects=False and handle the redirect manually.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3196/reactions" }
https://api.github.com/repos/psf/requests/issues/3196/timeline
null
completed
null
null
false
[ "Are you sure this is a URL parsing issue? Are you getting an exception or an error from the proxy? If the former, can I see it?\n\nIf not, which is what I suspect, I think your proxy may be the one that's struggling. Essentially, the major difference in the two code paths is that requests won't add the `'/'` path to a `Location` header. Now, probably it should, but right now it doesn't. That means that in the case when you pass it in to requests yourself, requests would emit an extra `/` to the proxy. It seems that your _proxy_ is getting confused here, more than requests.\n", "I can't be sure, though I'll give it another test to see if I can watch where it submits to the proxy. But without question it works to request the Location URL, but it fails with a 500 error if I request a URL that redirects to the Location URL. Since the proxy is the same both ways, my best guess is that requests is asking something different of the proxy.\n", "Yeah, that's definitely the case, but I think the two URLs are semantically identical.\n\nRegardless, I _also_ think that we have a question to ask ourselves (we being @sigmavirus24 and I), which is: why does `resolve_redirects` not end up calling `prepare_url`?\n" ]
https://api.github.com/repos/psf/requests/issues/3195
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3195/labels{/name}
https://api.github.com/repos/psf/requests/issues/3195/comments
https://api.github.com/repos/psf/requests/issues/3195/events
https://github.com/psf/requests/pull/3195
154,971,353
MDExOlB1bGxSZXF1ZXN0NzAxNTY3ODc=
3,195
Fix to check for Plain ip notations in no_proxy settings if not CIDR
{ "avatar_url": "https://avatars.githubusercontent.com/u/3090124?v=4", "events_url": "https://api.github.com/users/kumarvaradarajulu/events{/privacy}", "followers_url": "https://api.github.com/users/kumarvaradarajulu/followers", "following_url": "https://api.github.com/users/kumarvaradarajulu/following{/other_user}", "gists_url": "https://api.github.com/users/kumarvaradarajulu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kumarvaradarajulu", "id": 3090124, "login": "kumarvaradarajulu", "node_id": "MDQ6VXNlcjMwOTAxMjQ=", "organizations_url": "https://api.github.com/users/kumarvaradarajulu/orgs", "received_events_url": "https://api.github.com/users/kumarvaradarajulu/received_events", "repos_url": "https://api.github.com/users/kumarvaradarajulu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kumarvaradarajulu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kumarvaradarajulu/subscriptions", "type": "User", "url": "https://api.github.com/users/kumarvaradarajulu", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-05-16T06:50:12Z
2021-09-08T04:01:00Z
2016-05-16T12:47:33Z
NONE
resolved
For ipv4 addresses no_proxy is not being honored. [This line ](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L539) checks for `cidr notation` but `plain ip` notation is not considered, due to which the request is always routed to proxy server. This results in error ReadTimeoutError Sample `proxy` configuration ``` no_proxy=192.168.1.1,192.168.1.3,*.example.com http_proxy=http://192.168.1.100:8080 https_proxy=http://192.168.1.100:8080 ``` > When request is made to `192.168.1.1`, the request should not be made to proxy server, without this fix it is being made to proxy server `192.168.1.100`
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3195/reactions" }
https://api.github.com/repos/psf/requests/issues/3195/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3195.diff", "html_url": "https://github.com/psf/requests/pull/3195", "merged_at": "2016-05-16T12:47:33Z", "patch_url": "https://github.com/psf/requests/pull/3195.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3195" }
true
[ "This seems good to me. Can we have a test to validate that this function works as expected?\n", "@Lukasa added tests, please check and merge.\n", "@Lukasa addressed comments, pls check\n", "Cool, I'm :+1: on this for now. I'd like another :+1: from @sigmavirus24 before we merge, but I think this is good.\n", "Looks good to me. Thanks @kumarvaradarajulu \n" ]
https://api.github.com/repos/psf/requests/issues/3194
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3194/labels{/name}
https://api.github.com/repos/psf/requests/issues/3194/comments
https://api.github.com/repos/psf/requests/issues/3194/events
https://github.com/psf/requests/issues/3194
154,915,473
MDU6SXNzdWUxNTQ5MTU0NzM=
3,194
Add support for in-memory multipart upload
{ "avatar_url": "https://avatars.githubusercontent.com/u/2421362?v=4", "events_url": "https://api.github.com/users/wujek-srujek/events{/privacy}", "followers_url": "https://api.github.com/users/wujek-srujek/followers", "following_url": "https://api.github.com/users/wujek-srujek/following{/other_user}", "gists_url": "https://api.github.com/users/wujek-srujek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wujek-srujek", "id": 2421362, "login": "wujek-srujek", "node_id": "MDQ6VXNlcjI0MjEzNjI=", "organizations_url": "https://api.github.com/users/wujek-srujek/orgs", "received_events_url": "https://api.github.com/users/wujek-srujek/received_events", "repos_url": "https://api.github.com/users/wujek-srujek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wujek-srujek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wujek-srujek/subscriptions", "type": "User", "url": "https://api.github.com/users/wujek-srujek", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-05-15T15:55:51Z
2021-09-08T18:00:46Z
2016-05-15T16:15:29Z
NONE
resolved
As far as I can understand from the docs, it is possible to upload files using the 'files' keyword param, but it requires a file object. I have a case where I have the data in memory, and it would require me to store to a temp file, which I would like to prevent if possible. Would it be possible for requests to support in-memory data upload for multipart?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3194/reactions" }
https://api.github.com/repos/psf/requests/issues/3194/timeline
null
completed
null
null
false
[ "It already does. =) Note the description for the files kwarg in [this bit of documentation](http://docs.python-requests.org/en/master/api/#requests.request).\n", "Also [this section of the narrative documentation](http://docs.python-requests.org/en/master/user/quickstart/#post-a-multipart-encoded-file) mentions it.\n", "But it requires a 'file-like object', doesn't it? What I meant is I have a byte buffer in memory (an instance of the bytes type in Python 3), not a file. For this interface to work I have to write to a temp file and upload this (at least that's what I am doing now).\n", "> But it requires a 'file-like object', doesn't it?\n\nNo. Let me quote [the narrative documentation](http://docs.python-requests.org/en/master/user/quickstart/#post-a-multipart-encoded-file) here:\n\n> If you want, you can send strings to be received as files:\n> \n> ```\n> >>> url = 'http://httpbin.org/post'\n> >>> files = {'file': ('report.csv', 'some,data,to,send\\nanother,row,to,send\\n')}\n> \n> >>> r = requests.post(url, files=files)\n> >>> r.text\n> {\n> ...\n> \"files\": {\n> \"file\": \"some,data,to,send\\\\nanother,row,to,send\\\\n\"\n> },\n> ...\n> }\n> ```\n", "Right, sorry, I was a bit mislead by the 'string' wording in the example. For me, it didn't seem like an instance of bytes could also be passed. I guess the documentation could be improved on this - the 'files' keyword description should probably say it is not really necessary for it to be a fileobj, and the narrative docs should say it can also be a bytes instance, not only a string. (Although it rings a bell that in Python 2.x things were calles 'byte strings', maybe that's where it comes from?)\n\nBut thank you anyways, I tested it and it works, and I prefer this way than mess around with temporary files.\n", "> (Although it rings a bell that in Python 2.x things were calles 'byte strings', maybe that's where it comes from?)\n\nYup. In Python 2, strings were by default byte strings, which is very similar to the Python 3 `bytes` type. This convention has stuck around for those of us who write Python 2/3 compatible code.\n" ]
https://api.github.com/repos/psf/requests/issues/3193
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3193/labels{/name}
https://api.github.com/repos/psf/requests/issues/3193/comments
https://api.github.com/repos/psf/requests/issues/3193/events
https://github.com/psf/requests/pull/3193
154,880,265
MDExOlB1bGxSZXF1ZXN0NzAxMDY2MjE=
3,193
Something went wrong with this. Ignore it, I will correct the PR.
{ "avatar_url": "https://avatars.githubusercontent.com/u/15115673?v=4", "events_url": "https://api.github.com/users/AlexPHorta/events{/privacy}", "followers_url": "https://api.github.com/users/AlexPHorta/followers", "following_url": "https://api.github.com/users/AlexPHorta/following{/other_user}", "gists_url": "https://api.github.com/users/AlexPHorta/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AlexPHorta", "id": 15115673, "login": "AlexPHorta", "node_id": "MDQ6VXNlcjE1MTE1Njcz", "organizations_url": "https://api.github.com/users/AlexPHorta/orgs", "received_events_url": "https://api.github.com/users/AlexPHorta/received_events", "repos_url": "https://api.github.com/users/AlexPHorta/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AlexPHorta/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AlexPHorta/subscriptions", "type": "User", "url": "https://api.github.com/users/AlexPHorta", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-05-14T23:07:41Z
2021-09-08T04:01:00Z
2016-05-14T23:12:54Z
NONE
resolved
I was scanning the issues and found this one, proposed for 3.0. I believe it's alright, at least the tests are all passing. Basically, I created a success attribute in requests.codes and changed the ok property at Response. I hope everything is alright, thanks.
{ "avatar_url": "https://avatars.githubusercontent.com/u/15115673?v=4", "events_url": "https://api.github.com/users/AlexPHorta/events{/privacy}", "followers_url": "https://api.github.com/users/AlexPHorta/followers", "following_url": "https://api.github.com/users/AlexPHorta/following{/other_user}", "gists_url": "https://api.github.com/users/AlexPHorta/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AlexPHorta", "id": 15115673, "login": "AlexPHorta", "node_id": "MDQ6VXNlcjE1MTE1Njcz", "organizations_url": "https://api.github.com/users/AlexPHorta/orgs", "received_events_url": "https://api.github.com/users/AlexPHorta/received_events", "repos_url": "https://api.github.com/users/AlexPHorta/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AlexPHorta/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AlexPHorta/subscriptions", "type": "User", "url": "https://api.github.com/users/AlexPHorta", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3193/reactions" }
https://api.github.com/repos/psf/requests/issues/3193/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3193.diff", "html_url": "https://github.com/psf/requests/pull/3193", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3193.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3193" }
true
[]
https://api.github.com/repos/psf/requests/issues/3192
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3192/labels{/name}
https://api.github.com/repos/psf/requests/issues/3192/comments
https://api.github.com/repos/psf/requests/issues/3192/events
https://github.com/psf/requests/pull/3192
154,824,720
MDExOlB1bGxSZXF1ZXN0NzAwNzYyMzM=
3,192
Allow graceful interruption of testserver.Server
{ "avatar_url": "https://avatars.githubusercontent.com/u/91550?v=4", "events_url": "https://api.github.com/users/brettdh/events{/privacy}", "followers_url": "https://api.github.com/users/brettdh/followers", "following_url": "https://api.github.com/users/brettdh/following{/other_user}", "gists_url": "https://api.github.com/users/brettdh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/brettdh", "id": 91550, "login": "brettdh", "node_id": "MDQ6VXNlcjkxNTUw", "organizations_url": "https://api.github.com/users/brettdh/orgs", "received_events_url": "https://api.github.com/users/brettdh/received_events", "repos_url": "https://api.github.com/users/brettdh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/brettdh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/brettdh/subscriptions", "type": "User", "url": "https://api.github.com/users/brettdh", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-05-14T00:46:52Z
2021-09-08T04:00:59Z
2016-05-17T15:45:12Z
CONTRIBUTOR
resolved
So that failing tests don't cause the server thread to hang indefinitely, waiting for connections that will never come. Rationale for suppressing error/traceback from interrupted _accept_connection in testserver.Server: https://gist.github.com/brettdh/b6e741227b2297f19d2118077f14dfa5
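The interruption pattern the PR describes can be sketched with a toy accept loop (a hypothetical stdlib-only illustration of the idea, not the actual `testserver.Server` code): the server socket gets a short timeout so the thread periodically wakes up and checks a stop flag instead of blocking forever in `accept()`.

```python
import socket
import threading

class StoppableServer(threading.Thread):
    """Toy accept loop that can be interrupted gracefully (hypothetical sketch)."""

    def __init__(self):
        super().__init__()
        self._stop_event = threading.Event()
        self.sock = socket.socket()
        self.sock.settimeout(0.1)  # wake up periodically to re-check the flag
        self.sock.bind(('127.0.0.1', 0))
        self.sock.listen(1)

    def run(self):
        while not self._stop_event.is_set():
            try:
                conn, _ = self.sock.accept()
                conn.close()
            except socket.timeout:
                continue  # no client yet; loop again and re-check the flag
        self.sock.close()

    def stop(self):
        self._stop_event.set()

server = StoppableServer()
server.start()
server.stop()           # a failing test can now unblock the thread…
server.join(timeout=2)  # …instead of hanging on accept() forever
print(server.is_alive())  # → False
```

Without the timeout (or some equivalent wake-up mechanism), `accept()` blocks indefinitely, which is exactly the hang the PR set out to avoid.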
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3192/reactions" }
https://api.github.com/repos/psf/requests/issues/3192/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3192.diff", "html_url": "https://github.com/psf/requests/pull/3192", "merged_at": "2016-05-17T15:45:12Z", "patch_url": "https://github.com/psf/requests/pull/3192.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3192" }
true
[ "ping @Lukasa @sigmavirus24 \n\nI think I addressed the comments from #3185.\n\nOddly, I cannot make the test suite hang indefinitely when I run it locally, but that's what's happening in jenkins. :confused: Maybe you can spot something I'm missing?\n", "Ok, cool, I'm happy with this. Assuming @sigmavirus24 is as well, he may merge.\n", "I'm going to squash merge because the title of 5cfb691 is not very helpful and I don't want it in our commit history if we go to bisect things.\n", "Thanks @brettdh!\n" ]
https://api.github.com/repos/psf/requests/issues/3191
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3191/labels{/name}
https://api.github.com/repos/psf/requests/issues/3191/comments
https://api.github.com/repos/psf/requests/issues/3191/events
https://github.com/psf/requests/issues/3191
154,682,307
MDU6SXNzdWUxNTQ2ODIzMDc=
3,191
Why is urllib3 a bundled dependency and not external?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1063219?v=4", "events_url": "https://api.github.com/users/kuraga/events{/privacy}", "followers_url": "https://api.github.com/users/kuraga/followers", "following_url": "https://api.github.com/users/kuraga/following{/other_user}", "gists_url": "https://api.github.com/users/kuraga/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kuraga", "id": 1063219, "login": "kuraga", "node_id": "MDQ6VXNlcjEwNjMyMTk=", "organizations_url": "https://api.github.com/users/kuraga/orgs", "received_events_url": "https://api.github.com/users/kuraga/received_events", "repos_url": "https://api.github.com/users/kuraga/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kuraga/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kuraga/subscriptions", "type": "User", "url": "https://api.github.com/users/kuraga", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-05-13T10:32:52Z
2021-09-08T18:00:46Z
2016-05-13T10:35:23Z
NONE
resolved
Good day! Why is urllib3 a bundled dependency rather than an external one? (Sorry, I couldn't find something like a mailing list for this question.)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3191/reactions" }
https://api.github.com/repos/psf/requests/issues/3191/timeline
null
completed
null
null
false
[ "Hi, thanks for the question! Please see the discussion on #1812 and #1811.\n" ]
https://api.github.com/repos/psf/requests/issues/3190
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3190/labels{/name}
https://api.github.com/repos/psf/requests/issues/3190/comments
https://api.github.com/repos/psf/requests/issues/3190/events
https://github.com/psf/requests/issues/3190
154,575,595
MDU6SXNzdWUxNTQ1NzU1OTU=
3,190
Requests Session doesn't use HTTP_PROXY settings when sending PreparedRequests?
{ "avatar_url": "https://avatars.githubusercontent.com/u/2421362?v=4", "events_url": "https://api.github.com/users/wujek-srujek/events{/privacy}", "followers_url": "https://api.github.com/users/wujek-srujek/followers", "following_url": "https://api.github.com/users/wujek-srujek/following{/other_user}", "gists_url": "https://api.github.com/users/wujek-srujek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wujek-srujek", "id": 2421362, "login": "wujek-srujek", "node_id": "MDQ6VXNlcjI0MjEzNjI=", "organizations_url": "https://api.github.com/users/wujek-srujek/orgs", "received_events_url": "https://api.github.com/users/wujek-srujek/received_events", "repos_url": "https://api.github.com/users/wujek-srujek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wujek-srujek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wujek-srujek/subscriptions", "type": "User", "url": "https://api.github.com/users/wujek-srujek", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-05-12T20:37:10Z
2021-09-08T17:05:35Z
2016-05-12T20:41:50Z
NONE
resolved
I am using mitmproxy for development of a RESTful API client as I want to see what's sent. I noticed that sometimes I don't see all requests in my proxy logs. I narrowed down the code to a reproducible test case. How to reproduce: start some proxy locally which allows you to see the requests, set the HTTP(S)_PROXY variables and execute the following code: ``` #!/usr/bin/env python3 import requests session = requests.Session() def retry_hook(res, *args, **kwargs): req = res.request print(req) if 'X-Retried-Request' not in req.headers: print('(Retrying)') req.headers['X-Retried-Request'] = True return session.send(req) else: print('(Not retrying retried request)') session.get('https://www.google.de', hooks={'response': retry_hook}, verify=False) ``` - I'm using google.de as I am in Germany and google.com results in redirects, which I don't want to see in my log. Depending on where you test this, use a different domain. What the code does: it sends a request and, using a hook, when the response is ready, it re-sends the original request. The first time the request is re-sent it sets a custom header, so that the next time the request is not re-sent, as the header is discovered. The first time the request is immediately re-sent; the second time it is not. Every time the hook is invoked the request is printed out. I see the following output (ignoring a warning about unverified HTTPS requests): ``` <PreparedRequest [GET]> (Retrying) <PreparedRequest [GET]> (Not retrying retried request) ``` However, in my proxy window, I only see the first request; the second one is not seen (the custom header is missing from the request, that's how I know it is the second one that's missing). Now, this is a contrived example, let me explain a real-world scenario. We have a RESTful service which requires OAuth2 tokens for authorization. The tokens we have are valid for 3 minutes. 
I am writing a tool in Python which allows us to call the service, and it handles authentication transparently: it gets a token initially and, as long as it is valid, uses it. At some point, a 401 response will be returned; that's when the hook comes in - it recognizes the 401, fetches another token, sets it on the session, and retries the request once. The result of this retried request is the result of the hook, meaning that an unauthorized first request is 'forgotten' and the retried one is returned. Any other result than 401, and retried requests resulting in 401, get standard treatment. The issue is that such retried requests always fail when a proxy is in place, but as they correctly fetched the token and set it on the Session, subsequent requests don't have this problem.
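A minimal sketch of the fix that the maintainers suggest in the comments on this issue: `Session.send` does not consult the environment on its own, so before re-sending a `PreparedRequest`, re-merge the environment settings (which is what picks up `HTTP_PROXY`/`HTTPS_PROXY`). No request is actually sent here; the snippet only shows which settings `merge_environment_settings` produces.

```python
import requests

session = requests.Session()
req = requests.Request('GET', 'https://www.google.de',
                       headers={'X-Retried-Request': 'true'})
prepared = session.prepare_request(req)

# merge_environment_settings() is the step that reads HTTP(S)_PROXY and
# friends from the environment; Session.send() alone skips it.
settings = session.merge_environment_settings(
    prepared.url, {}, None, None, None)
print(sorted(settings))  # → ['cert', 'proxies', 'stream', 'verify']

# To actually (re)send through the configured proxy one would then call:
# response = session.send(prepared, **settings)
```

Passing the merged settings as keyword arguments to `session.send` is what makes the retried request go through the proxy like the original did.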
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3190/reactions" }
https://api.github.com/repos/psf/requests/issues/3190/timeline
null
completed
null
null
false
[ "Yup, sorry, this is an expected behaviour. See #2807 which also discusses the expected solution.\n", "It seems I need code like this right before I re-send the request:\n\n```\nsettings = session.merge_environment_settings(req.url, {}, False, False, None)\nreturn session.send(req, **settings)\n```\n\nI have a few questions which I couldn't find answers for in the linked ticket, I hope you don't mind\n- It seems a such a PreparedRequest doesn't keep all the information, like the proxies, whether it is streamed, etc. the original request was sent with. What would then be the best, most robust way to just re-send a request with a single change to a header value?\n- Right now in the hook I would somehow need to determine what stream, verify etc. settings were set to in the original, which sometimes will be difficult as these settings are not available. I think the above code will break for requests for which I do want streaming - the re-sent request will not be streaming, as I explicitly configured it to be. How can I get the necessary information in the hook to pass it to the merge_xxx method? Note that the same hook is used for different types of requests, some of which are streamed and some are not.\n- The called merge_environment_settings method takes explicit certs and stream parameters, which seem to never be read from env settings - is this correct? Should then they be in a method named like this?\n", "> It seems a such a PreparedRequest doesn't keep all the information, like the proxies, whether it is streamed, etc. the original request was sent with. What would then be the best, most robust way to just re-send a request with a single change to a header value?\n\nWe're treading a bit into semantics.\n\nThe `PreparedRequest` keeps all of the information about the _request_ itself: that is, the HTTP parts of the protocol like headers and body format. 
It doesn't keep the information about _how_ the request was sent and what was done at the socket layer: proxies etc fall into that category. Think of the `PreparedRequest` as an idealised form of a request that could be sent in lots of different ways.\n\nIf you want to resend a specific request _exactly as before_ but with changes only to that request, I recommend swapping to the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) for the _entire_ request/response cycle: that is, build a `Request` object, call `Session.prepare_request` to get your `PreparedRequest`. Then, call `Session.merge_environment_settings` to get the settings, and `Session.send` with those settings. Those last two steps can be repeated as needed, so you can catch any errors and retry those in a loop.\n\n> How can I get the necessary information in the hook to pass it to the merge_xxx method? Note that the same hook is used for different types of requests, some of which are streamed and some are not.\n\nI recommend not using a hook at all. It's not entirely clear to me why you're using the hook instead of handling it manually, to be honest.\n\n> The called merge_environment_settings method takes explicit certs and stream parameters, which seem to never be read from env settings - is this correct? Should then they be in a method named like this?\n\nYeah, this is somewhat inconsistent, I agree, but it's the result of some historical layout of code. The real reason they're there is because those two kwargs can be set _on the Session itself_, and that method is responsible for grabbing them.\n", "Ok, thanks, I now understand the difference between the request data and how the request is sent in, thank you for the clarification. It does makes sense to separate this, of course.\n\nI will look into the proposed flow. One more question, though - it seems I would need to write a loop myself for re-trying the request, is this correct? 
That was one of the reasons for the hook, as it is automatic. I think I could actually still implement it using hooks - create a method which uses the prepared request workflow, and in the hook, just call it again. What would be your suggested solution?\n\nAs for why I'm using a hook - I'm a novice in both Python and requests, I didn't know any better and by reading the docs I thought it would be the easiest thing possible. What made me wade deeper and deeper into the hook solution was the fact that it did in fact work most of the time; the times when it didn't I didn't see it or I basically put the fault on my code.\n", "> I think I could actually still implement it using hooks - create a method which uses the prepared request workflow, and in the hook, just call it again. What would be your suggested solution?\n\nYup, this will certainly work as well. You could actually do it by providing that hook as a closure which closes over the relevant variables.\n\nThe hook is fine, but depending on what you want to do it may simply be easier to let the request run to completion and then handle the response separately. Entirely up to you though: the closure method will work just fine.\n", "I wonder if we need to document the PreparedRequest workflow better. \n" ]
https://api.github.com/repos/psf/requests/issues/3189
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3189/labels{/name}
https://api.github.com/repos/psf/requests/issues/3189/comments
https://api.github.com/repos/psf/requests/issues/3189/events
https://github.com/psf/requests/issues/3189
154,557,093
MDU6SXNzdWUxNTQ1NTcwOTM=
3,189
error 54, 'Connection reset by peer'
{ "avatar_url": "https://avatars.githubusercontent.com/u/156872?v=4", "events_url": "https://api.github.com/users/degroat/events{/privacy}", "followers_url": "https://api.github.com/users/degroat/followers", "following_url": "https://api.github.com/users/degroat/following{/other_user}", "gists_url": "https://api.github.com/users/degroat/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/degroat", "id": 156872, "login": "degroat", "node_id": "MDQ6VXNlcjE1Njg3Mg==", "organizations_url": "https://api.github.com/users/degroat/orgs", "received_events_url": "https://api.github.com/users/degroat/received_events", "repos_url": "https://api.github.com/users/degroat/repos", "site_admin": false, "starred_url": "https://api.github.com/users/degroat/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/degroat/subscriptions", "type": "User", "url": "https://api.github.com/users/degroat", "user_view_type": "public" }
[]
closed
true
null
[]
null
16
2016-05-12T19:02:16Z
2018-02-22T00:43:30Z
2016-05-12T20:33:26Z
NONE
off-topic
I'm getting the dreaded Error 54, Connection reset by peer when trying to do a POST to a URL with a payment provider I'm using. Here is the basic code I'm running: ``` python import requests, base64 url = "https://sandbox.api.mxmerchant.com/checkout/v3/auth/token/282059461" headers = { 'Authorization': 'Basic ' + base64.b64encode('myusername:mypassword') , } r = requests.post(url, headers=headers) print r.text ``` If I run this exact code on an Ubuntu server it works fine (meaning it returns a json message stating that the username and password are incorrect). If I run it on OSX 10.11.4, I get the connection reset error. After reading a ton of issues on here and various postings on StackOverflow, everyone seems to think it's related to SNI. Unfortunately, none of the recommendations that I've come across have fixed the issue. My Ubuntu server has Python 2.7.6, OpenSSL 1.0.1f and requests 2.8.1. My OSX has Python 2.7.10, OpenSSL 1.0.2g and requests 2.10.0. I also have ndg-httpsclient and pyopenssl both installed per #1347 Any idea what it is that I'm missing?
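The version mismatch that turns out to be the culprit here (see the comments) can be checked with the standard library alone; the parallel PyOpenSSL check from the comments needs pyOpenSSL installed, so only the stdlib variant is shown:

```python
import ssl

# The stdlib ssl module reports the OpenSSL it was compiled against.
# Apple's system Python historically shipped OpenSSL 0.9.8, which lacks
# modern TLS features and can surface as "Connection reset by peer".
print(ssl.OPENSSL_VERSION)        # e.g. "OpenSSL 1.0.2g  1 Mar 2016"
print(ssl.OPENSSL_VERSION_INFO)   # 5-tuple, e.g. (1, 0, 2, 7, 15)
```

If this prints an 0.9.8-era version, the interpreter (not requests) is the thing to upgrade, e.g. by installing a Python linked against a newer OpenSSL.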
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 3, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 3, "url": "https://api.github.com/repos/psf/requests/issues/3189/reactions" }
https://api.github.com/repos/psf/requests/issues/3189/timeline
null
completed
null
null
false
[ "Can you confirm for me what the output of these two commands is on your OS X machine?\n\n```\npython -c \"import ssl; print ssl.OPENSSL_VERSION\"\npython -c \"from OpenSSL.SSL import SSLeay_version, SSLEAY_VERSION; print SSLeay_version(SSLEAY_VERSION)\"\n```\n", "```\n>> python -c \"import ssl; print ssl.OPENSSL_VERSION\"\n---\nOpenSSL 0.9.8zh 14 Jan 2016\n```\n\nSo... obviously that's an issue there. But this looks even worse...\n\n```\n>> python -c \"from OpenSSL.SSL import SSLeay_version, SSLEAY_VERSION; print SSLeay_version(SSLEAY_VERSION)\"\n---\nTraceback (most recent call last):\n File \"<string>\", line 1, in <module>\n File \"/Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/OpenSSL/__init__.py\", line 8, in <module>\n from OpenSSL import rand, crypto, SSL\n File \"/Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/OpenSSL/rand.py\", line 12, in <module>\n from OpenSSL._util import (\n File \"/Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/OpenSSL/_util.py\", line 6, in <module>\n from cryptography.hazmat.bindings.openssl.binding import Binding\n File \"/Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/cryptography/hazmat/bindings/openssl/binding.py\", line 15, in <module>\n from cryptography.hazmat.bindings._openssl import ffi, lib\nImportError: dlopen(/Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/cryptography/hazmat/bindings/_openssl.so, 2): Symbol not found: _BIO_new_CMS\n Referenced from: /Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/cryptography/hazmat/bindings/_openssl.so\n Expected in: flat namespace\n in /Users/degroat/.virtualenvs/cb/lib/python2.7/site-packages/cryptography/hazmat/bindings/_openssl.so\n```\n", "Alright! Good, we're getting somewhere.\n\nSo, the `ImportError` from `PyOpenSSL` is causing requests to fallback to the stdlib, and that ancient OpenSSL is almost certainly not compatible with your website. 
How did you install Python 2.7.10, or did it ship with the OS?\n", "I'm 99% sure I went with the OSX baked in python last time I rebuilt my dev environment. When I'm not in a virtualenv, which python points to '/usr/bin/python' and that file is not a symlink to the Cellar directory.\n", "Ok, so that there is going to be the issue. The system Python uses the ancient OpenSSL that OS X ships.\n\nI'd like to tag @reaperhulk in here to try to understand why PyOpenSSL isn't correctly installed, because that _should_ have resolved the problem.\n", "I went ahead and installed a new version of python using homebrew (using brewed OpenSSL) and that fixed it for me.\n", "@degroat Yup, that'll do it too. =)\n", "I met the same issue. Is there a solution to solve this without upgrade python?\n", "@goalong It's possible, but it depends on the specifics of your situation. In this instance, all we know is that the server you're connecting to doesn't like your TLS handshake. That can happen for lots of reasons, but without specifics it's hard to know what would fix the problem. \n", "For the sake of others like me coming to this issue by googling this exception:\n\nI resolved this issue for myself by updating my system `virtualenvwrapper` package to latest (`sudo pip install -U virtualenvwrapper`), which lifted my `virtualenv` version from 13.1.2 to 15.1.0. I then recreated the virtualenv for my project. I then executed the command that was giving me the error in the first place, and it completed successfully.\n", "while researching online another solution was `pip install pyopenssl ndg-httpsclient pyasn1` as per http://stackoverflow.com/a/38854398/1349938", "@Bashar That specific list is now out of date. You should make sure you use `pip install requests[security]` instead: that will always install the correct dependencies.", "Hey guys, I've tried the solution given above but still I'm facing the same issue. Could you please help. 
TIA\r\nVJ:audit vijay.dugini$ python -c \"import ssl; print ssl.OPENSSL_VERSION\"\r\nOpenSSL 1.0.2l 25 May 2017\r\nVJ:audit vijay.dugini$ python -c \"from OpenSSL.SSL import SSLeay_version, SSLEAY_VERSION; print SSLeay_version(SSLEAY_VERSION)\"\r\nOpenSSL 1.1.0f 25 May 2017", "Tried pip install pyopenssl ndg-httpsclient pyasn1 and also installed requests but still no result", "Ignore my post mates. I had multiple version of pythons in my machine due to which it wasn't able to pick the right one and was throwing error. Posting this thinking it may be helpful for someone.", "Was facing the same issue: python ConnectionError: ('Connection aborted.', error(54, 'Connection reset by peer'))\r\n\r\n`pip install requests[security]` resolved it." ]
https://api.github.com/repos/psf/requests/issues/3188
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3188/labels{/name}
https://api.github.com/repos/psf/requests/issues/3188/comments
https://api.github.com/repos/psf/requests/issues/3188/events
https://github.com/psf/requests/pull/3188
154,443,240
MDExOlB1bGxSZXF1ZXN0Njk4MDkxNTQ=
3,188
Replace tab with appropriate spaces.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-05-12T09:58:34Z
2021-09-08T04:01:01Z
2016-05-12T13:11:19Z
MEMBER
resolved
Looks like this crept in. It shouldn't break anything, as it's in a multiline string, but still, let's get rid of it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3188/reactions" }
https://api.github.com/repos/psf/requests/issues/3188/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3188.diff", "html_url": "https://github.com/psf/requests/pull/3188", "merged_at": "2016-05-12T13:11:19Z", "patch_url": "https://github.com/psf/requests/pull/3188.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3188" }
true
[]
https://api.github.com/repos/psf/requests/issues/3187
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3187/labels{/name}
https://api.github.com/repos/psf/requests/issues/3187/comments
https://api.github.com/repos/psf/requests/issues/3187/events
https://github.com/psf/requests/issues/3187
154,426,728
MDU6SXNzdWUxNTQ0MjY3Mjg=
3,187
GAE and ChunkedEncodingError, working patch.
{ "avatar_url": "https://avatars.githubusercontent.com/u/317142?v=4", "events_url": "https://api.github.com/users/cnicodeme/events{/privacy}", "followers_url": "https://api.github.com/users/cnicodeme/followers", "following_url": "https://api.github.com/users/cnicodeme/following{/other_user}", "gists_url": "https://api.github.com/users/cnicodeme/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cnicodeme", "id": 317142, "login": "cnicodeme", "node_id": "MDQ6VXNlcjMxNzE0Mg==", "organizations_url": "https://api.github.com/users/cnicodeme/orgs", "received_events_url": "https://api.github.com/users/cnicodeme/received_events", "repos_url": "https://api.github.com/users/cnicodeme/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cnicodeme/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cnicodeme/subscriptions", "type": "User", "url": "https://api.github.com/users/cnicodeme", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 615414998, "name": "GAE Support", "node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=", "url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support" } ]
closed
true
null
[]
null
3
2016-05-12T08:36:47Z
2021-09-08T09:00:45Z
2016-05-12T08:38:59Z
NONE
resolved
Hey everyone, sorry for my intervention here, "out of the blue". I tried to find out whether this issue was already discussed (but couldn't find anything), so I come here with the situation, trying to understand the solution. The problem is the following (from what I understood): Google App Engine replaces the urlfetch implementation with its own, which contains some changes that break urllib3 and subsequently Requests. @agfor [made a fix](https://github.com/agfor/requests/commit/da863cc) that seems to work (I haven't had any ChunkedEncodingError since I put the patch in place) and, from my understanding, doesn't break other users, those that don't use GAE. I didn't see his patch submitted here (I'm wondering why) and I don't know why this hasn't been taken into consideration. (Please note these are real questions, not criticism; my intentions are good :) ). Hence my post :)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3187/reactions" }
https://api.github.com/repos/psf/requests/issues/3187/timeline
null
completed
null
null
false
[ "@cnicodeme The requests project does not consider GAE formally supported. Put another way, we don't have any special-case code _in requests_ to use GAE properly. The recommended method of using GoogleAppEngine with requests is to use the [requests-toolbelt `AppEngineAdapter`](http://toolbelt.readthedocs.io/en/latest/adapters.html#appengineadapter).\n", "Ok that explains all my questions, thank you !\n\n(and wow, you're fast!)\n", "My pleasure!\n" ]
https://api.github.com/repos/psf/requests/issues/3186
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3186/labels{/name}
https://api.github.com/repos/psf/requests/issues/3186/comments
https://api.github.com/repos/psf/requests/issues/3186/events
https://github.com/psf/requests/issues/3186
154,319,859
MDU6SXNzdWUxNTQzMTk4NTk=
3,186
Response.content iterates in needlessly small chunks
{ "avatar_url": "https://avatars.githubusercontent.com/u/300211?v=4", "events_url": "https://api.github.com/users/vfaronov/events{/privacy}", "followers_url": "https://api.github.com/users/vfaronov/followers", "following_url": "https://api.github.com/users/vfaronov/following{/other_user}", "gists_url": "https://api.github.com/users/vfaronov/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vfaronov", "id": 300211, "login": "vfaronov", "node_id": "MDQ6VXNlcjMwMDIxMQ==", "organizations_url": "https://api.github.com/users/vfaronov/orgs", "received_events_url": "https://api.github.com/users/vfaronov/received_events", "repos_url": "https://api.github.com/users/vfaronov/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vfaronov/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vfaronov/subscriptions", "type": "User", "url": "https://api.github.com/users/vfaronov", "user_view_type": "public" }
[]
open
false
null
[]
null
5
2016-05-11T19:12:34Z
2024-07-08T10:55:05Z
null
NONE
null
`Response.content` [iterates over the response data in chunks of 10240 bytes](https://github.com/kennethreitz/requests/blob/87704105af65b382b86f168f6a54192eab91faf2/requests/models.py#L741). The number 10240 was set in commit [`62d2ea8`](https://github.com/kennethreitz/requests/commit/62d2ea8). After tracing the source code of urllib3 and httplib, I can’t see a reason for this behavior. It all ultimately goes through httplib’s [`HTTPResponse.readinto`](https://hg.python.org/cpython/file/87130512ef34/Lib/http/client.py#l469), which automatically limits the read size according to `Content-Length` or the `chunked` framing. Therefore, it seems that, if you simply set `CONTENT_CHUNK_SIZE` to a much larger number (like 10240000), nothing should change, except `Response.content` will become more efficient on large responses. **Update:** it seems like httplib allocates a buffer of the requested size (to be read into), so simply setting `CONTENT_CHUNK_SIZE` to a large value will cause large chunks of memory to be allocated, which is probably a bad idea. This is not a problem for me and I have not researched it thoroughly. I’m filing this issue after investigating [a Stack Overflow question](http://stackoverflow.com/questions/37135880/python-3-urllib-vs-requests-performance) where this caused an unexpected slowdown for the poster, and a subsequent [IRC exchange](https://botbot.me/freenode/python-requests/2016-05-11/?msg=65874287&page=1) with @Lukasa. Feel free to do (or not do) whatever you think is right here.
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3186/reactions" }
https://api.github.com/repos/psf/requests/issues/3186/timeline
null
null
null
null
false
[ "It's good to know that httplib allocates a buffer of that size. I think we can probably stretch to double that buffer though: 20kb of buffer is unlikely to be the end of the world.\n\nAt the very least, though, we should understand how this works so that we can write documentation to explain this.\n", "Originally, I iterated over a chunk size of `1` :)\n", "While we're on the topic, we have 4 different default chunk_sizes between all of our iterator functions in Requests. Some I can find reasoning for ([`CONTENT_CHUNK_SIZE` vs.\n`ITER_CHUNK_SIZE`](https://github.com/kennethreitz/requests/pull/1122)), but others like `__iter__` and the default for `iter_content` aren't entirely clear.\n\nI'm not saying these are wrong, just curious why `__iter__` is declared as 128 instead of `ITER_CHUNK_SIZE` and if there's a reason for still having such a default of 1 on `iter_content`. Is it related to blocking or file-objects not returning without a full read?\n", "There is a long, long issue to look at in the backlog. Anyone wanting to make progress on this needs to read _and understand_ #844. Safe to say this is not a good choice for someone who doesn't want to find a really tough slog of a job.\n", "just a ping back from the pip project on this 12 years old bug. :)\r\n\r\nthe iter_content() was set to 10240 bytes 12 years ago in requests. it's a needlessly small size and incur a lot of overhead.\r\nlinked bug ticket: real bug in pip, 30% of the time taken by pip to download packages, was just overhead because of using this default chunk size.\r\n\r\nI'm quite curious if there is any reason that prevents from updating `CONTENT_CHUNK_SIZE` to something more reasonable nowadays? \r\n64k-128k-256k would be reasonable values for I/O.\r\n\r\nOn Linux, the network read buffer was increased to 64k in kernel v4.20, year 2018, the read and write buffer were 16k historically before that.\r\n(they're resized dynamically with the TCP window up to 4MB write 6M read, but let's not get into TCP window sizing, see sysctl_tcp_rmem sysctl_tcp_wmem)\r\nlinux code: https://github.com/torvalds/linux/blame/master/net/ipv4/tcp.c#L4775\r\ncommit Sep 2018: https://github.com/torvalds/linux/commit/a337531b942bd8a03e7052444d7e36972aac2d92\r\n" ]
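The per-chunk overhead discussed in this thread is easy to see without touching the network. The stdlib-only sketch below (the `iter_chunks` helper is a stand-in for the loop behind `Response.iter_content`, not requests' actual code) counts how many read calls a 1 MiB body takes at the default 10240-byte chunk size versus a 256 KiB one:

```python
import io

def iter_chunks(stream, chunk_size):
    """Yield successive chunks from a file-like object.

    Illustrative stand-in for the loop behind Response.iter_content.
    """
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk

payload = b"x" * (1024 * 1024)  # a 1 MiB response body

# CONTENT_CHUNK_SIZE (10240) vs. a 256 KiB chunk size:
small = sum(1 for _ in iter_chunks(io.BytesIO(payload), 10240))
large = sum(1 for _ in iter_chunks(io.BytesIO(payload), 256 * 1024))

print(small, large)  # 103 reads vs 4 reads for the same body
```

Each read call carries fixed overhead down through urllib3 and httplib, so roughly 26x fewer calls for the same data is where the speedup reported above comes from; the trade-off, as noted in the issue body, is that httplib allocates a buffer of the requested size per read.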
https://api.github.com/repos/psf/requests/issues/3185
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3185/labels{/name}
https://api.github.com/repos/psf/requests/issues/3185/comments
https://api.github.com/repos/psf/requests/issues/3185/events
https://github.com/psf/requests/pull/3185
154,239,373
MDExOlB1bGxSZXF1ZXN0Njk2Njg3NzU=
3,185
Support ALL_PROXY environment variable
{ "avatar_url": "https://avatars.githubusercontent.com/u/91550?v=4", "events_url": "https://api.github.com/users/brettdh/events{/privacy}", "followers_url": "https://api.github.com/users/brettdh/followers", "following_url": "https://api.github.com/users/brettdh/following{/other_user}", "gists_url": "https://api.github.com/users/brettdh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/brettdh", "id": 91550, "login": "brettdh", "node_id": "MDQ6VXNlcjkxNTUw", "organizations_url": "https://api.github.com/users/brettdh/orgs", "received_events_url": "https://api.github.com/users/brettdh/received_events", "repos_url": "https://api.github.com/users/brettdh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/brettdh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/brettdh/subscriptions", "type": "User", "url": "https://api.github.com/users/brettdh", "user_view_type": "public" }
[]
closed
true
null
[]
null
14
2016-05-11T13:09:02Z
2021-09-08T04:00:59Z
2016-05-17T15:42:32Z
CONTRIBUTOR
resolved
Closes #3183.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3185/reactions" }
https://api.github.com/repos/psf/requests/issues/3185/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3185.diff", "html_url": "https://github.com/psf/requests/pull/3185", "merged_at": "2016-05-17T15:42:32Z", "patch_url": "https://github.com/psf/requests/pull/3185.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3185" }
true
[ "ping @Lukasa \n", "Agh, Python3 (iteritems). Pushing fix with six shortly.\n\nOn a related note - has there ever been talk of moving to [tox](http://tox.readthedocs.io/en/latest/) for testing?\n", "Found that this doesn't actually work yet, as evidenced by the new, failing functional test. Will investigate further (and fix the other broken test) tomorrow.\n\nAlso, the jenkins setup doesn't seem to have PySocks installed. What's the best way to make sure that's present in the CI environment? Just add it to the requirements.txt?\n", "@brettdh Yeah, I think that would be fine. =)\n", "Alrighty, I think it's good now.\n", "It's rather difficult to review this with the amount of refactoring of the testserver. It's distracting from the actual feature and reviewing that.\n", "@sigmavirus24 The test server refactoring was mainly done as I added a failing functional test for the feature, in order to prevent that test failure from hanging the server (because it was waiting for a connection that wasn't coming).\n\nNow that the test is passing, I'm happy to separate the test server refactoring into a separate PR if that helps.\n", "Ok, I split off all but the minimum changes needed for the new test. I'll make a new PR for the test server refactoring soon.\n", "Cool, @sigmavirus24 do you want to re-review?\n", "Eh I realize you just trimmed that whitespace, but you drew my attention to it ;). Otherwise, this looks okay to me.\n", "@sigmavirus24 Yeah, sorry; I forgot that I had vim configured to always trim trailing whitespace on save. Thought I had reverted those before committing, but clearly I missed one. :-)\n", "@brettdh no need to put trailing whitespace back, just change that line to use iter instead =D\n", "Ok, :+1: from me. @sigmavirus24?\n", "Looks good to me. Thanks @brettdh!\n" ]
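The environment-variable side of this feature can be sketched with the standard library alone: urllib lowercases `*_PROXY` variable names and strips the suffix, so `ALL_PROXY` surfaces under the `'all'` key that this PR teaches sessions.py to consult (the proxy URL below is hypothetical):

```python
import os
from urllib.request import getproxies_environment

# Hypothetical SOCKS proxy; ALL_PROXY is meant to apply to every scheme.
os.environ["ALL_PROXY"] = "socks5://proxy.example.com:1080"

proxies = getproxies_environment()
# The *_PROXY variable names are lowercased and the '_proxy' suffix is
# stripped, so ALL_PROXY shows up under the key 'all'.
print(proxies["all"])
```

With this PR merged, a proxies mapping containing only an `'all'` entry is enough for requests to route every scheme through that proxy.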
https://api.github.com/repos/psf/requests/issues/3184
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3184/labels{/name}
https://api.github.com/repos/psf/requests/issues/3184/comments
https://api.github.com/repos/psf/requests/issues/3184/events
https://github.com/psf/requests/pull/3184
154,155,484
MDExOlB1bGxSZXF1ZXN0Njk2MTIxMDc=
3,184
Refactor prepare_body
{ "avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4", "events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}", "followers_url": "https://api.github.com/users/davidsoncasey/followers", "following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}", "gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/davidsoncasey", "id": 3794108, "login": "davidsoncasey", "node_id": "MDQ6VXNlcjM3OTQxMDg=", "organizations_url": "https://api.github.com/users/davidsoncasey/orgs", "received_events_url": "https://api.github.com/users/davidsoncasey/received_events", "repos_url": "https://api.github.com/users/davidsoncasey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions", "type": "User", "url": "https://api.github.com/users/davidsoncasey", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" }, { "color": "d4c5f9", "default": false, "description": null, "id": 536793543, "name": "needs rebase", "node_id": "MDU6TGFiZWw1MzY3OTM1NDM=", "url": "https://api.github.com/repos/psf/requests/labels/needs%20rebase" } ]
closed
true
null
[]
null
26
2016-05-11T04:01:14Z
2021-09-07T00:06:37Z
2017-02-10T17:18:18Z
NONE
resolved
@Lukasa Here's an alternative fix to #3066 (discussed in PR #3181), where I refactored `PreparedRequest.prepare_body` and `PreparedRequest.prepare_content_length` so that `prepare_content_length` is always called. My only question is in the case when the body is a stream (which it is in the case that brought up this issue) that the current position will always be 0 when the length is calculated. Previously, in cases where `prepare_auth` was not called, the content length was being calculated with `super_len`, now it would be calculated using `self.headers['Content-Length'] = builtin_str(max(0, end_pos - curr_pos))`. As far as I can tell, this will give the same result (as long as the current position is 0), and this is how it's been computed in cases where authorization is being used. But I'm curious if you have thoughts about this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3184/reactions" }
https://api.github.com/repos/psf/requests/issues/3184/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3184.diff", "html_url": "https://github.com/psf/requests/pull/3184", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3184.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3184" }
true
[ "So, I think in general I like this better. However, your point about `super_len` is a real one.\n\nI think the solution here is to rewrite `prepare_content_length` to have it always call `super_len`. `super_len` does use `tell` to do a little dance here, so I think it's probably going to be good enough: @sigmavirus24, does that seem right to you?\n", "@Lukasa in that case, could we get rid of the first elif statement and just go to the case where `body is not None`? Looking at `super_len`, I think it would calculate length in the same way. So in what I'm proposing, prepare content length would look like this:\n\n``` python\n def prepare_content_length(self, body):\n if 'Transfer-Encoding' in self.headers:\n return\n elif body is not None:\n l = super_len(body)\n if l:\n self.headers['Content-Length'] = builtin_str(l)\n elif (self.method not in ('GET', 'HEAD')) and (self.headers.get('Content-Length') is None):\n self.headers['Content-Length'] = '0'\n```\n\nAll tests pass with this change.\n", "Ok, I think I'm happy with this, though I'd like @sigmavirus24 to review before we merge. =)\n", "I'm not sure of the right way to do this, but this will allow someone to do:\n\n``` py\nr = requests.post(url, headers={'Transfer-Encoding': 'chunked'}, data='foo')\n```\n\nWhich will generate an invalid request to the server. I would much rather forcibly strip the TE header from something where we can determine the length than ignore adding the Content-Length header if a TE is present.\n", "@sigmavirus24 Arg, good spot. Hrm. We need to have some extra logic here to get this right.\n", "@Lukasa @sigmavirus24 it looks to me like the try/except of using `super_len` to calculate the length of the body could be moved to `prepare_content_length` - and then both the `Content-Length` and `Transfer-Encoding` headers would be set there. If content length can be calculated, then we strip the TE header, and otherwise, we strip the CL header. I think this would be the most clear, and would ensure that the headers are mutually exclusive, regardless of what a user enters.\n", "@davidsoncasey I think we want to keep the `try...except` in `super_len`: that method gets used outside requests quite frequently. It won't hurt to do the check twice.\n", "@Lukasa sounds good. I'll see if I can put something together for that. Thanks for helping work through this.\n", "@Lukasa @sigmavirus24 I updated this PR to make the Content-Length and Transfer-Encoding headers mutually exclusive, regardless of whether a user manually provides a value, as @sigmavirus24 showed in his comment. While working on this, I stumbled upon #1648. I hadn't realized that this had been a subject of so much discussion, and that people have differing opinions about how this should work. This is obviously related to that issue, so I understand if you decide that this isn't the fix that you're looking for. It could be refactored to raise an error if a user has provided a CL header when TE is set, and vice versa, instead of silently removing the unneeded header. Let me know what you think.\n", "I'm not sure which behaviour I want (if we **must** merge something before 3.0). I think I'd rather raise an exception though.\n", "I'd be inclined to want to raise an exception, but I also think we should consider discussing this at PyCon.\n", "@Lukasa @sigmavirus24 I'll go ahead and alter it to raise an exception for now (perhaps `InvalidHeadersError`? Or is there an existing exception that would make sense to raise?). And then we can leave this PR open until it's decided what the best behavior is.\n\nAlso, are you planning on coming to Portland for PyCon? I didn't get a ticket in time, but I live in Portland and it would be great to meet up and discuss the project.\n", "Yup, we'll be in Portland: we're planning to have a \"state of Requests\" chat at some point that'll be able to tackle things like these.\n", "Fantastic, I'd love to participate.\n", "@Lukasa @sigmavirus24 I made a few more changes and a bit more refactoring of `prepare_body` and `prepare_content_length`. These include:\n- Moving the logic to set the `Transfer-Encoding` header to the `prepare_content_length` method. This makes it more explicit that the headers should be mutually exclusive, and does not rely on implicitly connected logic in two methods.\n- Raises an `InvalidHeaderError` if a user manually sets either header when it should not be set (let me know if you think that there's a preexisting exception that could be raised instead).\n- Added tests to check different cases of when each header should be set.\n\nI understand we may need to wait to decide if this is the desired behavior, so no need to review or merge this right away. We can leave this PR open until you've had a chance to discuss and agree upon the desired behavior.\n", "I've got a note inline, but I don't mind this change: it's definitely backward-incompatible though, so we'll need to release this as part of 3.0.0.\n", "Once we've cleaned this up, can we squash some of these commits together and send this to proposed/3.0.0?\n", "@sigmavirus24 @Lukasa sounds good. I'll make those small cleanup changes and squash extraneous commits.\n", "I squashed some commits together and it looks like there's a conflict now. I'll get a fresh pull of the master branch and get that resolved.\n", "@Lukasa @sigmavirus24 alright, it took me a bit to get back to this, but I've cleaned it up a bit and it should be ready to merge into the proposed branch. The conflict is in AUTHORS.rst. Let me know if there's anything else here you'd like to see changed, otherwise I'll try to find another issue to get started on.\n", "Thanks @davidsoncasey, I've left a few more notes.\n", "Ok, good by me. @sigmavirus24?\n", "So this looks great, but:\n1. Should this be against proposed/3.0?\n2. Can we resolve the merge conflicts?\n", "@sigmavirus24 I'll open a new PR into proposed/3.0. As far as the conflict, that was only in the contributors.rst, and I was thinking it would be a little cleaner if one of you resolved it on merge, instead of back merging master into this branch and fixing it there. I'm happy to do that if you like though.\n", "needs a rebase", "closing due to inactivity (not sure if that's our fault or not). please re-open if you're still interested in getting this in! " ]
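The mutually-exclusive framing-header behavior debated in this thread can be sketched as follows. This is illustrative only: the helper below and the exception name (following the `InvalidHeaderError`/`InvalidHeadersError` naming floated in the comments) are assumptions, not requests' actual implementation.

```python
class InvalidHeaderError(ValueError):
    """Raised when a user-supplied header conflicts with the body framing."""

def prepare_content_length(headers, body_length):
    """Set exactly one of Content-Length / Transfer-Encoding.

    body_length is the body size when it is known, or None for a stream
    of unknown length. Raises rather than silently dropping a
    user-supplied header that conflicts with the body.
    """
    if body_length is not None:
        if "Transfer-Encoding" in headers:
            raise InvalidHeaderError(
                "Transfer-Encoding set on a body of known length")
        headers["Content-Length"] = str(body_length)
    else:
        if "Content-Length" in headers:
            raise InvalidHeaderError(
                "Content-Length set on a body of unknown length")
        headers["Transfer-Encoding"] = "chunked"
    return headers

print(prepare_content_length({}, 3))     # known length -> Content-Length
print(prepare_content_length({}, None))  # unknown length -> chunked
```

This captures @sigmavirus24's example above: `headers={'Transfer-Encoding': 'chunked'}` with a fixed-length body like `data='foo'` would raise instead of producing an invalid request.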
https://api.github.com/repos/psf/requests/issues/3183
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3183/labels{/name}
https://api.github.com/repos/psf/requests/issues/3183/comments
https://api.github.com/repos/psf/requests/issues/3183/events
https://github.com/psf/requests/issues/3183
154,007,895
MDU6SXNzdWUxNTQwMDc4OTU=
3,183
Feature request: support ALL_PROXY environment variable
{ "avatar_url": "https://avatars.githubusercontent.com/u/91550?v=4", "events_url": "https://api.github.com/users/brettdh/events{/privacy}", "followers_url": "https://api.github.com/users/brettdh/followers", "following_url": "https://api.github.com/users/brettdh/following{/other_user}", "gists_url": "https://api.github.com/users/brettdh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/brettdh", "id": 91550, "login": "brettdh", "node_id": "MDQ6VXNlcjkxNTUw", "organizations_url": "https://api.github.com/users/brettdh/orgs", "received_events_url": "https://api.github.com/users/brettdh/received_events", "repos_url": "https://api.github.com/users/brettdh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/brettdh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/brettdh/subscriptions", "type": "User", "url": "https://api.github.com/users/brettdh", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
2
2016-05-10T13:35:35Z
2021-09-08T18:00:44Z
2016-05-17T15:42:32Z
CONTRIBUTOR
resolved
As of 2.10.0, I can set `HTTP_PROXY=socks5://example.com:port`, and then requests will use that SOCKS proxy. However, other tools (e.g. Docker) currently require the use of `ALL_PROXY` instead, and get confused if you set `HTTP_PROXY` to a URL that begins with `socks5://`. I think Docker should support the use of `HTTP_PROXY`, but requests should also support the use of `ALL_PROXY`, especially with the recent (thank you thank you thank you) addition of SOCKS support. I thought that this might need to go through urllib3, but it seems like it would be a relatively simple change to [sessions.py](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L212), to look at the `'all'` key before checking the scheme. But, then I read your contributor's guide and thought it best to post an issue before just doing it. :smile: Please let me know what you think.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3183/reactions" }
https://api.github.com/repos/psf/requests/issues/3183/timeline
null
completed
null
null
false
[ "Thanks for the suggestion!\n\nI don't think we need to go through urllib3: your analysis is right, sessions.py can DTRT here. I think this is a reasonable enhancment, so we'd welcome a pull request for this.\n", "Great, thanks for the super-quick answer! I'm happy to pick this up and will start looking into it shortly.\n" ]
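A minimal sketch of the proxy lookup this issue asks for, with `'all'` as the catch-all fallback. This simplifies the behavior of `requests.utils.select_proxy` (the real function also supports host-specific keys such as `'all://hostname'`); the URLs are hypothetical:

```python
from urllib.parse import urlparse

def select_proxy(url, proxies):
    """Pick a proxy for url: a scheme-specific entry wins, otherwise
    fall back to the catch-all 'all' entry. Simplified sketch."""
    proxies = proxies or {}
    scheme = urlparse(url).scheme
    return proxies.get(scheme, proxies.get("all"))

env = {"all": "socks5://proxy.example.com:1080"}   # e.g. populated from ALL_PROXY
print(select_proxy("http://example.com/", env))    # falls back to 'all'

env["http"] = "http://proxy.example.com:3128"      # scheme-specific beats 'all'
print(select_proxy("http://example.com/", env))
```

This matches the behavior other tools expect from `ALL_PROXY`: it covers every scheme unless a more specific `HTTP_PROXY`/`HTTPS_PROXY` entry overrides it.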