url
stringlengths
50
53
repository_url
stringclasses
1 value
labels_url
stringlengths
64
67
comments_url
stringlengths
59
62
events_url
stringlengths
57
60
html_url
stringlengths
38
43
id
int64
597k
2.65B
node_id
stringlengths
18
32
number
int64
1
6.83k
title
stringlengths
1
296
user
dict
labels
listlengths
0
5
state
stringclasses
2 values
locked
bool
2 classes
assignee
dict
assignees
listlengths
0
4
milestone
dict
comments
int64
0
211
created_at
stringlengths
20
20
updated_at
stringlengths
20
20
closed_at
stringlengths
20
20
author_association
stringclasses
3 values
active_lock_reason
stringclasses
4 values
body
stringlengths
0
65.6k
closed_by
dict
reactions
dict
timeline_url
stringlengths
59
62
performed_via_github_app
null
state_reason
stringclasses
3 values
draft
bool
2 classes
pull_request
dict
is_pull_request
bool
2 classes
issue_comments
listlengths
0
30
https://api.github.com/repos/psf/requests/issues/3860
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3860/labels{/name}
https://api.github.com/repos/psf/requests/issues/3860/comments
https://api.github.com/repos/psf/requests/issues/3860/events
https://github.com/psf/requests/issues/3860
206,740,910
MDU6SXNzdWUyMDY3NDA5MTA=
3,860
File names stripped of leading dots
{ "avatar_url": "https://avatars.githubusercontent.com/u/26496?v=4", "events_url": "https://api.github.com/users/p3lim/events{/privacy}", "followers_url": "https://api.github.com/users/p3lim/followers", "following_url": "https://api.github.com/users/p3lim/following{/other_user}", "gists_url": "https://api.github.com/users/p3lim/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/p3lim", "id": 26496, "login": "p3lim", "node_id": "MDQ6VXNlcjI2NDk2", "organizations_url": "https://api.github.com/users/p3lim/orgs", "received_events_url": "https://api.github.com/users/p3lim/received_events", "repos_url": "https://api.github.com/users/p3lim/repos", "site_admin": false, "starred_url": "https://api.github.com/users/p3lim/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/p3lim/subscriptions", "type": "User", "url": "https://api.github.com/users/p3lim", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-10T08:57:05Z
2021-09-08T12:00:59Z
2017-02-10T09:06:40Z
NONE
resolved
Currently using the following to upload a file to a webserver:

```
files = {'files': ('.png', open(file_path, 'rb'))}
res = requests.post(url, files=files)
```

The reason I rename the file to just ".png" is that with an empty base name the server will randomly assign the file a name. However (and I'm not sure whether requests is to blame here) the file ends up being uploaded with the name "png" instead, as if the leading "." was stripped from the name. Is this something requests is intentionally doing, or should I look elsewhere for the cause?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3860/reactions" }
https://api.github.com/repos/psf/requests/issues/3860/timeline
null
completed
null
null
false
[ "This is not something requests is doing. A quick Wireshark dump of your sample code against `http://httpbin.org/post` shows that the request looks like this:\r\n\r\n```\r\nPOST /post HTTP/1.1\r\nHost: httpbin.org\r\nConnection: keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nUser-Agent: python-requests/2.13.0\r\nContent-Length: 655\r\nContent-Type: multipart/form-data; boundary=746c7fc2a98549969068cf0f6860f3d7\r\n\r\n--746c7fc2a98549969068cf0f6860f3d7\r\nContent-Disposition: form-data; name=\"files\"; filename=\".png\"\r\n<...file data ...>\r\n```\r\n\r\nNote that the filename still has the leading dot. =) So this isn't Requests doing something, it has to be elsewhere.", "Alright, thanks for checking 👍 " ]
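The Wireshark capture in the reply above can be reproduced without any network traffic: preparing the request (rather than sending it) exposes the encoded multipart body, and the leading dot survives. A minimal sketch — the host and file contents below are placeholders:

```python
from requests import Request

# Prepare (but do not send) a multipart upload whose filename is just ".png".
# "example.invalid" is a placeholder host; nothing goes over the wire.
prepared = Request(
    'POST', 'http://example.invalid/upload',
    files={'files': ('.png', b'fake image bytes')},
).prepare()

# The encoded multipart body carries the filename verbatim, dot included:
#   Content-Disposition: form-data; name="files"; filename=".png"
```

Inspecting `prepared.body` this way is a quick substitute for a packet capture when debugging what Requests actually puts on the wire.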
https://api.github.com/repos/psf/requests/issues/3859
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3859/labels{/name}
https://api.github.com/repos/psf/requests/issues/3859/comments
https://api.github.com/repos/psf/requests/issues/3859/events
https://github.com/psf/requests/issues/3859
206,570,028
MDU6SXNzdWUyMDY1NzAwMjg=
3,859
Certificate failure with Python 2.7.13 and current Requests on virtualized server
{ "avatar_url": "https://avatars.githubusercontent.com/u/25668513?v=4", "events_url": "https://api.github.com/users/PatrickDChampion/events{/privacy}", "followers_url": "https://api.github.com/users/PatrickDChampion/followers", "following_url": "https://api.github.com/users/PatrickDChampion/following{/other_user}", "gists_url": "https://api.github.com/users/PatrickDChampion/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/PatrickDChampion", "id": 25668513, "login": "PatrickDChampion", "node_id": "MDQ6VXNlcjI1NjY4NTEz", "organizations_url": "https://api.github.com/users/PatrickDChampion/orgs", "received_events_url": "https://api.github.com/users/PatrickDChampion/received_events", "repos_url": "https://api.github.com/users/PatrickDChampion/repos", "site_admin": false, "starred_url": "https://api.github.com/users/PatrickDChampion/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/PatrickDChampion/subscriptions", "type": "User", "url": "https://api.github.com/users/PatrickDChampion", "user_view_type": "public" }
[]
closed
true
null
[]
null
24
2017-02-09T17:17:17Z
2021-09-08T05:00:50Z
2017-06-27T06:01:57Z
NONE
resolved
I used to use Requests with Python 2.7.6 to access a server with a REST interface. After our organization updated their servers, I got the "InsecurePlatformWarning: A true SSLContext object is not available" message and then a certificate failure at the bottom of the call stack. I have since upgraded to Python 2.7.13 and tried again. The InsecurePlatformWarning went away, but I still get the certificate failure at the end. I then used pip to upgrade Requests from version 2.7.0 to the current 2.13.0, and I still get the same certificate failure (without the insecure warning). Below is the call stack. If I force the get() to ignore verification with `r = s.get(url, params=addr, verify=False)`, the transaction works fine, but I would like to maintain security and use the certificates. I am using Windows 7 Enterprise Service Pack 1 (64-bit) and 64-bit Python. Any idea what is going wrong? Any suggestions?

```
Starting Session
Sending request
Traceback (most recent call last):
  File "S:\FY2015\ChoiceCard\DailyQ\checkSoftAuthOld.py", line 20, in <module>
    r = s.get(url,params=addr)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 501, in get
    return self.request('GET', url, **kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 488, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 630, in send
    history = [resp for resp in gen] if allow_redirects else []
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 190, in resolve_redirects
    **adapter_kwargs
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 609, in send
    r = adapter.send(request, **kwargs)
  File "C:\Python27\lib\site-packages\requests\adapters.py", line 497, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)
```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3859/reactions" }
https://api.github.com/repos/psf/requests/issues/3859/timeline
null
completed
null
null
false
[ "Assuming the server used to work and now does not, it's most likely that they're serving an invalid certificate or certificate chain. Is the server publicly accessible? If so, can you tell me the host name?", "Thank you for responding. A closely related link has been working for another application. The server is borderline public. You may or may not have access to it. Can you see if the certificate or chain is valid without having the full REST api? I would prefer to not broadcast to everyone the url with subdirectory path. The top level url is https://www.va.gov but the VA has dozens of servers hanging off that front door. Can I email you the path?", "@PatrickDChampion @Lukasa and I both have our emails on our profiles so you can do just that. =)", "Yup. If you're really worried we both also have our GPG keys associated with keybase accounts, in case that's helpful.", "Just sent the full URL.", "Sorry. Clicked wrong button.", "Hi,\r\n\r\nI'm seeing a similar issue in ansible/azure sdk with python 2.7.5 and requests 2.13.0\r\n\r\nI've raised an issue in the azure sdk for it here but then found this which sounds similar.\r\n\r\nhttps://github.com/Azure/azure-sdk-for-python/issues/1089\r\n\r\n", "So this a pretty bad issue – I am unable to complete any OAuth requests because very call `requests` makes fails to verify SSL. I have updated my `ca-certificates` on boxes and I am using latest `certifi`\r\n\r\nI am running Python 2.7.13, I am also suspecting that it works haphazardly, but will need to see it work again to confirm. \r\n\r\nI also can't just disable checks, unless you have a global way to disable it in a Django project since packages I am using actually make the lower level calls.", "I vote for allowing an option to validate an SSL cert with SHA1 root signature. This is the root of the problem here, right? 
@kennethreitz any ideas?", "@punkrokk I am totally unsure, what is causing this issue, but verification needs to fail either quietly notifying the user, not blocking the call which is just counter productive to making network calls like so:\r\n\r\n`Authentication failed: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)`", "Maybe I should open a separate issue, but I thought I followed this because I am having the issue with many APIs, where they have rootCA cert signed 1-2 years ago with SHA1, but requests doesn't like that. It was pretty hard to track down, and I can't seem to get requests to honor a cert bundle built by myself. So - in practice - what devs will do is set verify=False, just to get their https request to even work. ", "It's currently failing to verify a Google domain. ", "Here's an example cert that is failing for me: https://www.ssllabs.com/ssltest/analyze.html?d=tap-api-v2.proofpoint.com&latest (and you can see why in the report)\r\n", "@myusuf3 @punkrokk Hey folks, it would have been better if the two of you had filed a new issue for this. However, yes, this is a 1024-bit root problem. In this case the fix is to change your requests call to `requests.get(url, verify=certifi.old_where())`. As to the idea of disabling cert validation, this is already possible by passing `verify=False`, as a quick look at the docs would have told you. \r\n\r\nRequests will honor a cert bundle you build if you pass a path to the `verify` parameter. If this isn't working for you, please file a bug rather than suffer in silence. ", "Ok now that I'm at a computer let me answer you more directly and thoroughly.\n\n> I vote for allowing an option to validate an SSL cert with SHA1 root signature. This is the root of the problem here, right? @kennethreitz any ideas?\n\n@punkrokk Requests by default *already* trusts the 1024-bit roots (this isn't a SHA1 concern, it's a 1024-bit concern). 
However, when it finds certifi in the environment it automatically updates to use the strong trust bundle present in certifi, which does not trust those roots.\n\n> verification needs to fail either quietly notifying the user, not blocking the call which is just counter productive to making network calls like so:\n\nThat's nonsensical. If the certificate verification fails, one possibility is that you are under *active man-in-the-middle attack*. Soft-failure of certificate validation is unacceptable: it's just not an option.\n\n> Maybe I should open a separate issue, but I thought I followed this because I am having the issue with many APIs, where they have rootCA cert signed 1-2 years ago with SHA1, but requests doesn't like that.\n\nRequests does not police the signature algorithms used in a certificate chain.\n\n> It's currently failing to verify a Google domain.\n\nThis is because the Google certificate has two possible certificate chains it could build. One leads to a root certificate present in the `certifi` bundle that has a 2048-bit RSA key. The other leads to a root certificate *not* present in the `certifi` bundle that has a 1024-bit RSA key. These 1024-bit RSA keys are *desperately* unsafe, and so by default when Requests can find `certifi` it will not choose to trust them.\n\nYou can avoid this problem either by moving to a newer OpenSSL or by pointing Requests at `certifi.old_where()`. In the first case, this resolves OpenSSL's cert chain building problem to allow it to find the 2048-bit root that is actually present. In the second case, this will point to a version of `certifi` that contains the old, weak roots. I *strongly* recommend doing the first, rather than the second.\n\n> Here's an example cert that is failing for me: https://www.ssllabs.com/ssltest/analyze.html?d=tap-api-v2.proofpoint.com&latest (and you can see why in the report)\n\nThis validates just fine for me. 
Your issue isn't that there is a SHA-1 signature, though that is bad, but because like Google this site uses a cross-signed root certificate.\n\nAs an explainer of what this is, it occurred because Thawte had issued certificates for a long time that chained up to their [Thawte Premium Server CA](https://crt.sh/?caid=13) root certificate. This root certificate has a 1024-bit RSA key, which is extremely vulnerable to brute-force attacks to recover the private key. If the private key is recovered, it will be possible to create new certificates that appear to have been issued by the Thawte Premium Server CA. For this reason browsers have removed all 1024-bit RSA root certificates from their trust stores and CAs minted new 2048-bit-or-larger roots.\n\nHowever, some older browsers may not have the ability to update their cert stores, and so may not know about the new 2048-bit roots. For this reason, CA's \"cross-signed\" them. That is, they created two versions of the certificate: one is self-signed, and distributed to trust stores; the other is signed by the old 1024-bit root. This is true of the root CA here, which is [thawte Primary Root CA](https://crt.sh/?caid=14).\n\nIf you look back in your ssllabs log you'll see the remote server is sending a copy of this certificate over that is signed by Thawte Premium Server CA. However, if you expand the \"certification paths\" section of the report you'll see that the thawte Primary Root CA is being treated as a root, despite not being self-signed. What gives?\n\nWell, SSLLabs has a cert chain builder that looks for roots in its trust store *before* it chains up to a cert sent by the remote server. This means that when it looks for a cert called \"thawte Primary Root CA\" it looks in its trust database (e.g. like `certifi`) for that root cert, and finds it. Thus, it has built a trusted chain.\n\nOlder OpenSSLs do not do this. They always build a cert chain *first* using the certs sent by the remote server. 
For this reason, OpenSSL will use the cross-signed root sent by the server, and so will look for Thawte Premium Server CA to root the chain. `certifi` doesn't have this cert (because it's vulnerable to attack), so OpenSSL gives up. Newer versions of OpenSSL are smarter about this, and so they have no trouble validating a cert chain like this (which is why I don't encounter the problem).\n\nYour advice is the same as what I give to @myusuf3: either upgrade OpenSSL, or use the weak cert bundle. I *strongly* recommend the former.", "By the way, you'll see that `certifi` contains [thawte Primary Root CA](https://github.com/certifi/python-certifi/blob/master/certifi/cacert.pem#L1424-L1455). It also has [Thawte Premium Server CA](https://github.com/certifi/python-certifi/blob/master/certifi/old_root.pem#L241-L266) in a file labelled \"old_root.pem\", which is where the 1024-bit roots are stored in case they're needed.", "@Lukasa Cory - thanks so much for this explanation. Very helpful. I agree with you. If I have problems with the cert bundle I will open a new ticket. But I suspect the internet will appreciate your explanation in the future. I couldn't figure this out easily. What version of OpenSSL do you recommend? ", "@Lukasa I have updated `OpenSSL` and has fixed my issue. Thanks. Documentation around these errors would be great. Your _effort_ is appreciated. \r\n\r\nAs for the suggestion to pass `verified=False` to the request call, that assumes I own all the code that uses `requests` a way to knowingly turn off that functionality would be appreciated. \r\n\r\nThanks again.", "> a way to knowingly turn off that functionality would be appreciated.\r\n\r\nRequests very deliberately provides no such global switch: turning off certificate validation for all programs that use Requests very easily is a bad idea, and opens up real problems with securing a system. 
We provide environment variables that allow you to point us at custom cert bundles, which is the recommended solution: point us at your centralised CAs and we'll do the right thing.", "@Lukasa this is turning its ugly head again. I have the latest update for OpenSSL. What could be the issue now? I also have the most recent `certifi` installed as well. ", "What OpenSSL are you using?", "@Lukasa `OpenSSL 1.0.1f 6 Jan 2014` I just upgraded to most recent and I am still getting the same issue. do I need to reinstall `requests`, or `certifi`", "Neither should be required. OpenSSL 1.0.1f may still have chain building problems, depending on how your vendor provided it to you. Are you using `certifi.old_where()`?", "I had to rebuild Python with the new OpenSSL that built from scratch. Software man. thanks @Lukasa " ]
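The non-global approach recommended in the last comment above — pointing Requests at an explicit CA bundle instead of disabling verification — can be sketched with a Session-level `verify` setting. This is an illustrative sketch, not the project's documented recipe; note that the `certifi.old_where()` helper discussed in this thread belonged to certifi releases of that era and may not exist in current certifi versions:

```python
import os

import certifi
import requests

# Configure one Session to verify every request against an explicit
# CA bundle. certifi.where() returns the filesystem path to certifi's
# current PEM bundle; a path to your organization's own bundle works
# the same way.
session = requests.Session()
session.verify = certifi.where()

# session.verify is just a path string; all requests made through this
# session (session.get(...), session.post(...)) will use that bundle.
assert os.path.isfile(session.verify)
```

Per-call overrides remain possible via `requests.get(url, verify="/path/to/bundle.pem")`, and the `REQUESTS_CA_BUNDLE` environment variable mentioned above achieves the same thing without code changes.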
https://api.github.com/repos/psf/requests/issues/3858
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3858/labels{/name}
https://api.github.com/repos/psf/requests/issues/3858/comments
https://api.github.com/repos/psf/requests/issues/3858/events
https://github.com/psf/requests/issues/3858
206,444,445
MDU6SXNzdWUyMDY0NDQ0NDU=
3,858
why I
{ "avatar_url": "https://avatars.githubusercontent.com/u/4971925?v=4", "events_url": "https://api.github.com/users/xsren/events{/privacy}", "followers_url": "https://api.github.com/users/xsren/followers", "following_url": "https://api.github.com/users/xsren/following{/other_user}", "gists_url": "https://api.github.com/users/xsren/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/xsren", "id": 4971925, "login": "xsren", "node_id": "MDQ6VXNlcjQ5NzE5MjU=", "organizations_url": "https://api.github.com/users/xsren/orgs", "received_events_url": "https://api.github.com/users/xsren/received_events", "repos_url": "https://api.github.com/users/xsren/repos", "site_admin": false, "starred_url": "https://api.github.com/users/xsren/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xsren/subscriptions", "type": "User", "url": "https://api.github.com/users/xsren", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-09T09:10:22Z
2021-09-08T12:01:00Z
2017-02-09T09:23:06Z
NONE
resolved
Hi, excuse me for troubling you! I want to use requests with gevent to make a concurrent spider. The official docs say:

> If you are concerned about the use of blocking IO, there are lots of projects out there that combine Requests with one of Python's asynchronicity frameworks. Two excellent examples are grequests and requests-futures.

But after reading [this question on Stack Overflow](http://stackoverflow.com/questions/9501663/how-enable-requests-async-mode), I am confused, because when I run the code on my machine, requests seems to do better than urllib2 (Python 2.7.10, gevent 1.1.2, requests 2.11.1).

The code:

```
import sys
import gevent
from gevent import monkey
monkey.patch_all()
import requests
import urllib2

def worker(url, use_urllib2=False):
    if use_urllib2:
        content = urllib2.urlopen(url).read().lower()
    else:
        content = requests.get(url).content.lower()
    title = content.split('<title>')[1].split('</title>')[0].strip()

urls = ['http://www.baidu.com']*15

def by_requests():
    jobs = [gevent.spawn(worker, url) for url in urls]
    gevent.joinall(jobs)

def by_urllib2():
    jobs = [gevent.spawn(worker, url, True) for url in urls]
    gevent.joinall(jobs)

if __name__=='__main__':
    from timeit import Timer
    t = Timer(stmt="by_requests()", setup="from __main__ import by_requests")
    print 'by requests: %s seconds'%t.timeit(number=3)
    t = Timer(stmt="by_urllib2()", setup="from __main__ import by_urllib2")
    print 'by urllib2: %s seconds'%t.timeit(number=3)
    sys.exit(0)
```

The result:

```
➜  test_requests python test04.py
by requests: 0.178946971893 seconds
by urllib2: 1.94786500931 seconds
➜  test_requests python test04.py
by requests: 0.233242034912 seconds
by urllib2: 1.55849504471 seconds
➜  test_requests python test04.py
by requests: 0.52639913559 seconds
by urllib2: 3.95958900452 seconds
➜  test_requests python test04.py
by requests: 0.56934094429 seconds
by urllib2: 4.00116705894 seconds
```

My questions:

1. The results suggest that gevent + requests is already asynchronous, but many people on the web say I should use grequests. Is that right?
2. Why are my results with plain requests the same as with grequests?

Hope you can answer my questions. Thanks very much.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3858/reactions" }
https://api.github.com/repos/psf/requests/issues/3858/timeline
null
completed
null
null
false
[ "Requests can work with gevent directly without any trouble: this is the nature of gevent. grequests provides some specific additional utilities for managing Requests with gevent. It's a nice addition, but not required.", "OK!Thank you very much!" ]
https://api.github.com/repos/psf/requests/issues/3857
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3857/labels{/name}
https://api.github.com/repos/psf/requests/issues/3857/comments
https://api.github.com/repos/psf/requests/issues/3857/events
https://github.com/psf/requests/pull/3857
206,177,352
MDExOlB1bGxSZXF1ZXN0MTA1MjAyMDk1
3,857
Say that we use a dictionary of strings.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-02-08T11:55:27Z
2021-09-07T00:06:38Z
2017-02-08T12:05:51Z
MEMBER
resolved
Resolves #3856.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3857/reactions" }
https://api.github.com/repos/psf/requests/issues/3857/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3857.diff", "html_url": "https://github.com/psf/requests/pull/3857", "merged_at": "2017-02-08T12:05:51Z", "patch_url": "https://github.com/psf/requests/pull/3857.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3857" }
true
[ "# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3857?src=pr&el=h1) Report\n> Merging [#3857](https://codecov.io/gh/kennethreitz/requests/pull/3857?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/5c68b8f2279be82a672b908d0017d60a8ce6efcd?src=pr&el=desc) will **not impact** coverage.\n\n\n```diff\n@@ Coverage Diff @@\n## master #3857 +/- ##\n=======================================\n Coverage 89.05% 89.05% \n=======================================\n Files 15 15 \n Lines 1873 1873 \n=======================================\n Hits 1668 1668 \n Misses 205 205\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3857?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3857?src=pr&el=footer). Last update [5c68b8f...08ce652](https://codecov.io/gh/kennethreitz/requests/compare/5c68b8f2279be82a672b908d0017d60a8ce6efcd...08ce652b8b42c64ad904133ceb3e4f71df90c03c?el=footer&src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments)." ]
https://api.github.com/repos/psf/requests/issues/3856
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3856/labels{/name}
https://api.github.com/repos/psf/requests/issues/3856/comments
https://api.github.com/repos/psf/requests/issues/3856/events
https://github.com/psf/requests/issues/3856
206,161,151
MDU6SXNzdWUyMDYxNjExNTE=
3,856
Can querystring params be non-strings?
{ "avatar_url": "https://avatars.githubusercontent.com/u/3839472?v=4", "events_url": "https://api.github.com/users/sharmaeklavya2/events{/privacy}", "followers_url": "https://api.github.com/users/sharmaeklavya2/followers", "following_url": "https://api.github.com/users/sharmaeklavya2/following{/other_user}", "gists_url": "https://api.github.com/users/sharmaeklavya2/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sharmaeklavya2", "id": 3839472, "login": "sharmaeklavya2", "node_id": "MDQ6VXNlcjM4Mzk0NzI=", "organizations_url": "https://api.github.com/users/sharmaeklavya2/orgs", "received_events_url": "https://api.github.com/users/sharmaeklavya2/received_events", "repos_url": "https://api.github.com/users/sharmaeklavya2/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sharmaeklavya2/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sharmaeklavya2/subscriptions", "type": "User", "url": "https://api.github.com/users/sharmaeklavya2", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-08T10:50:37Z
2021-09-08T12:01:01Z
2017-02-08T11:58:29Z
NONE
resolved
Many functions in requests take a parameter named `params`, where we are supposed to pass a mapping which will be used to construct the querystring. Here is the documentation I found about it:

http://docs.python-requests.org/en/master/user/quickstart/#passing-parameters-in-urls
http://docs.python-requests.org/en/master/api/#requests.request

Neither of these places mentions whether it's allowed to pass non-string objects. This example shows that even `int`s and `bool`s are accepted:

```
>>> Request(url='http://localhost', params={1: True}).prepare().url
'http://localhost/?1=True'
```

After some experimentation, it seems that requests can accept anything which can be converted to a string. I looked at the source code of [`requests.models._encode_params`](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L82), so I know why it behaves like this. But is this the intended behavior? Also, comments in the source code of `requests.models._encode_params` say that a list of 2-tuples is also accepted; even this isn't mentioned in the documentation links above.

So my point is that I want to know the intended way of using the `params` parameter in requests. What can and cannot be passed in as `params`? Is there some existing documentation about this which I haven't been able to find? If not, this information needs to be documented somewhere.

(I'm actually trying to improve [typeshed](https://github.com/python/typeshed) stubs for requests. Whether non-string objects are allowed in `params` will determine how to write the stubs.)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3856/reactions" }
https://api.github.com/repos/psf/requests/issues/3856/timeline
null
completed
null
null
false
[ "Oh boy, you are going to get @Lukasa screaming after you soon :) From this email https://lwn.net/Articles/643399/ it is intentional that requests take all sort of types for the \"param\" value. If something needs a fix, that probably would be Requests documentation.\r\n\r\nAlso, if you have not already, read #3855 ", "So the fact that Requests can accept anything that can be converted to a string is accidental, and not something we commit to supporting. If you read `encode_params` more closely you'll note that in fact it deals with the *entire* `params` argument (it also deals with `data`, incidentally). It can thus be broken down as follows:\r\n\r\n1. If `params` is a literal string object, pass it through unchanged. This allows sending already urlencoded query strings, or literal data, or JSON.\r\n2. Otherwise, if it has a `read` method, pass it through unchanged. This makes no sense for query string parameters, and in fact will fail later on, but this method is also used for the `data` argument where it *does* make sense, so the code allows it for now.\r\n3. Otherwise, it's iterable. We assume here that the iterable is of the form of an iterable of two-tuples or something that can be converted to that (e.g. a dict). This drops into a further decision tree for each entry in the iterable, which is split as a two-tuple into key and value:\r\n 1. If the value is not an iterable (or it's a string), we put it into a one-element list. This is to allow lists of elements for query string parameters.\r\n 2. Then, for each element in the value, if it's not `None`, we add a two-tuple of (key, value), optionally encoding the key and value in UTF-8 if they're unicode strings.\r\n 3. We then urlencode this iterable of two-tuples and return it.\r\n\r\nSo, let's be clear about the kinds of non-string entries that are allowed. 
`params` can be:\r\n\r\n- A string\r\n- A bytestring\r\n- An object with the read method (though this later fails)\r\n- Anything that conforms to `collections.Mapping`.\r\n- Anything that is an Iterable of two-tuples, including generators.\r\n\r\nFor the last two cases, those two-tuples need to be of things that urlencode will accept, so the type signature there is whatever urlencode's type signature is.\r\n\r\nAll of this is to say: the reason Requests' documentation doesn't go into this level of detail is because it's so absurdly far beyond the pale of what is reasonable to document. The *reality* here is that we *promise* to work only with strings, mappings of string to string, and iterables of two-tuples of strings. Non-string objects are *allowed* in the sense that they have worked for a long time and will likely continue to work for the remainder of the 2.0 release cycle due to the ossification of that API, but they are not formally condoned by the Requests documentation or team and we won't be advertising them as a feature because they rely on an implementation detail of the Python standard library that could change at any time.\r\n\r\nI'm of the opinion that the fact that the Requests documentation only ever uses strings for those arguments is a sufficient indicator that we don't support anything else. However, to resolve this for future discussion, I've opened #3857.\r\n\r\nIn the longer term, I'd like to suggest that you don't ask the Requests team to write the stub files for you. :wink: There's a reason we haven't done it, and this kind of ambiguity is one part of it. There are large chunks of the API where what we can accept is limited by the tools we use, and by the Python standard library, and where we will often be able to work with things that we don't technically want to support. Pursuing entirely accurate stub files for Requests is likely to be a frustrating endeavour." ]
https://api.github.com/repos/psf/requests/issues/3855
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3855/labels{/name}
https://api.github.com/repos/psf/requests/issues/3855/comments
https://api.github.com/repos/psf/requests/issues/3855/events
https://github.com/psf/requests/issues/3855
205,744,841
MDU6SXNzdWUyMDU3NDQ4NDE=
3,855
PEP 484 type annotations for Requests
{ "avatar_url": "https://avatars.githubusercontent.com/u/169930?v=4", "events_url": "https://api.github.com/users/ygingras/events{/privacy}", "followers_url": "https://api.github.com/users/ygingras/followers", "following_url": "https://api.github.com/users/ygingras/following{/other_user}", "gists_url": "https://api.github.com/users/ygingras/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ygingras", "id": 169930, "login": "ygingras", "node_id": "MDQ6VXNlcjE2OTkzMA==", "organizations_url": "https://api.github.com/users/ygingras/orgs", "received_events_url": "https://api.github.com/users/ygingras/received_events", "repos_url": "https://api.github.com/users/ygingras/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ygingras/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ygingras/subscriptions", "type": "User", "url": "https://api.github.com/users/ygingras", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2017-02-07T00:00:17Z
2018-07-04T13:31:54Z
2017-02-07T08:26:25Z
NONE
off-topic
I am considering adding type annotations to Requests to help my team do static analysis of code that depends on it. Would a pull request containing type annotations be merge back? More info on type annotations: * PEP 484: https://www.python.org/dev/peps/pep-0484/ * mypy, the type-checker of choice these days: http://mypy-lang.org/ Given the need to support Python 2.x, the preferred format for annotations would be as Type Comments, but I would be open to do .pyi stubs if that makes it easier for everyone.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3855/reactions" }
https://api.github.com/repos/psf/requests/issues/3855/timeline
null
completed
null
null
false
[ "Thanks for suggesting this!\n\nNo, Requests has no interest in maintaining type annotations in-tree. This has been our policy since PEP 484 was originally discussed on python-dev. PEP 484's type annotations are a poor fit for Requests due to the substantial complexity and flexibility in our API, and so writing a set that are sufficiently complete to be useful without being misleading is a tricky thing to do.\n\nI should note that the typeshed has partial type stubs for Requests: you may find it helpful to extend theirs rather than start from scratch.", "> type annotations are a poor fit for Requests due to the substantial complexity and flexibility in our API\r\n\r\nOn the contrary, I find Requests source code clear and relatively straightforward. And having used the `typing` module, it is also flexible.\r\n\r\nIt would be great for the Python community if we could try annotating Requests, and give feedback about typing + mypy to the people in charge. I mean, if no one is trying, how can we expect these projects to succeed and meet the demand ?\r\n\r\nI am really getting used to having method signatures annotated, it helps me a lot when I do code reviews. And the IDE autocompletion is just awesome.\r\n\r\nBut yeah, improving the stubs in `typeshed` would be a nice start, if upstream is not ready yet.", "> On the contrary, I find Requests source code clear and relatively straightforward. And having used the typing module, it is also flexible.\n\nThe clarity of the source code is not the issue here: the issue here is the flexibility of inputs to the API.\n\nI noted this originally in [mailing list posts](https://lwn.net/Articles/643399/) when PEP 484 was discussed, and as far as I am aware nothing has changed since.\n\n> It would be great for the Python community if we could try annotating Requests, and give feedback about typing + mypy to the people in charge. 
I mean, if no one is trying, how can we expect these projects to succeed and meet the demand ?\n\nThird parties like yourself are entirely free to do whatever you like. You are welcome to attempt to annotate the source and to provide the appropriate feedback upstream.\n\nHowever, the *Requests project* and maintainers are not researchers on behalf of PEP 484. It is not incumbent upon us to make PEP 484 a success. Adding type hints to the Requests repository makes us responsible for keeping them up-to-date, which is not something any of us feel like is worth doing at this time.\n\nHowever, I highly recommend you read through the link above where I originally explained my concerns with adding type hints to the Requests API. Until those concerns are addressed, and until we can validate the correctness of type hints in our CI, I'm going to be opposed to adding first-party support for them.", "Further there are already some flawed and unofficial hints in https://github.com/python/typeshed. We won't endorse those, but they already exist, so there's less work for you to do.\r\n\r\nAlso, in the spirit of being explicit, I wholeheartedly agree with @Lukasa.", "Fair enough, Cory and Ian. I will start by adding to typeshed and see where that leads us. Cheers!", "@Lukasa \r\nThanks for the mailing list post. I've been fooling around with `mypy` for a bit and you articulate my main doubt about this PEP's usefulness, namely that the most Pythonic code is the most difficult to annotate. Sufficiently Pythonic code is currently impossible to annotate.\r\n\r\nIt sort of follows that for this PEP to be successful the definition of \"Pythonic\" would have to change. It's hard to see members of the community working to make that happen.\r\n" ]
https://api.github.com/repos/psf/requests/issues/3854
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3854/labels{/name}
https://api.github.com/repos/psf/requests/issues/3854/comments
https://api.github.com/repos/psf/requests/issues/3854/events
https://github.com/psf/requests/issues/3854
205,649,184
MDU6SXNzdWUyMDU2NDkxODQ=
3,854
Support urllib3 Retries at Individual Request Level
{ "avatar_url": "https://avatars.githubusercontent.com/u/1214204?v=4", "events_url": "https://api.github.com/users/TimOrme/events{/privacy}", "followers_url": "https://api.github.com/users/TimOrme/followers", "following_url": "https://api.github.com/users/TimOrme/following{/other_user}", "gists_url": "https://api.github.com/users/TimOrme/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/TimOrme", "id": 1214204, "login": "TimOrme", "node_id": "MDQ6VXNlcjEyMTQyMDQ=", "organizations_url": "https://api.github.com/users/TimOrme/orgs", "received_events_url": "https://api.github.com/users/TimOrme/received_events", "repos_url": "https://api.github.com/users/TimOrme/repos", "site_admin": false, "starred_url": "https://api.github.com/users/TimOrme/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/TimOrme/subscriptions", "type": "User", "url": "https://api.github.com/users/TimOrme", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-02-06T17:08:49Z
2021-09-08T12:01:01Z
2017-02-06T17:14:19Z
NONE
resolved
Right now, requests supports urllib3 retries at the pool level, allowing retries to occur fairly globally. It seems as if urllib3 supports retries at a more granular level as well, for individual requests: ``` >>> r = http.request( ... 'GET', ... 'http://httpbin.org/redirect/3', ... retries=urllib3.Retries( ... redirect=2, raise_on_redirect=False)) >>> r.status 302 ``` It would be nice if requests had support for this level of granularity, as some of our cases require retries for only specific endpoints on a single API. Apologies if this is the wrong place for this, wasn't sure where to place feature requests. Thanks! Tim
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3854/reactions" }
https://api.github.com/repos/psf/requests/issues/3854/timeline
null
completed
null
null
false
[ "Thanks for this request!\n\nUnfortunately, we're unlikely to support this: it's fundamentally an additional complexity that we don't think is justified in the API. At this point our top-level API is largely frozen, and we're very unlikely to widen it or add new features.\n\nHowever, it's worth noting that the Transport Adapter in use for a given request is chosen on a longest-prefix-match on the *entire* URL. That means that you can register different adapters for different paths, as well as different hosts, which should give you the flexibility you need.\n\nI hope that helps!" ]
https://api.github.com/repos/psf/requests/issues/3853
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3853/labels{/name}
https://api.github.com/repos/psf/requests/issues/3853/comments
https://api.github.com/repos/psf/requests/issues/3853/events
https://github.com/psf/requests/pull/3853
205,637,955
MDExOlB1bGxSZXF1ZXN0MTA0ODMyNTA2
3,853
Do not convert /o\ into /O\
{ "avatar_url": "https://avatars.githubusercontent.com/u/167327?v=4", "events_url": "https://api.github.com/users/StyXman/events{/privacy}", "followers_url": "https://api.github.com/users/StyXman/followers", "following_url": "https://api.github.com/users/StyXman/following{/other_user}", "gists_url": "https://api.github.com/users/StyXman/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/StyXman", "id": 167327, "login": "StyXman", "node_id": "MDQ6VXNlcjE2NzMyNw==", "organizations_url": "https://api.github.com/users/StyXman/orgs", "received_events_url": "https://api.github.com/users/StyXman/received_events", "repos_url": "https://api.github.com/users/StyXman/repos", "site_admin": false, "starred_url": "https://api.github.com/users/StyXman/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/StyXman/subscriptions", "type": "User", "url": "https://api.github.com/users/StyXman", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-02-06T16:31:22Z
2021-09-07T00:06:39Z
2017-02-07T16:04:25Z
CONTRIBUTOR
resolved
There is code that already check that a guy raising (at least) his right arm does not get big-headed (a.k.a uppercased). Just add the same for the other, poor guy complaining about an error.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3853/reactions" }
https://api.github.com/repos/psf/requests/issues/3853/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3853.diff", "html_url": "https://github.com/psf/requests/pull/3853", "merged_at": "2017-02-07T16:04:25Z", "patch_url": "https://github.com/psf/requests/pull/3853.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3853" }
true
[ "Yes, sorry, I just (w)hacked it in 3 minutes without proper testing. Shame on me. Let me see...", "Cool, looks good now, thanks! :sparkles:", "\\o/" ]
https://api.github.com/repos/psf/requests/issues/3852
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3852/labels{/name}
https://api.github.com/repos/psf/requests/issues/3852/comments
https://api.github.com/repos/psf/requests/issues/3852/events
https://github.com/psf/requests/issues/3852
205,373,183
MDU6SXNzdWUyMDUzNzMxODM=
3,852
Missing response headers when malformed header is part of the response
{ "avatar_url": "https://avatars.githubusercontent.com/u/370329?v=4", "events_url": "https://api.github.com/users/gboudreau/events{/privacy}", "followers_url": "https://api.github.com/users/gboudreau/followers", "following_url": "https://api.github.com/users/gboudreau/following{/other_user}", "gists_url": "https://api.github.com/users/gboudreau/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gboudreau", "id": 370329, "login": "gboudreau", "node_id": "MDQ6VXNlcjM3MDMyOQ==", "organizations_url": "https://api.github.com/users/gboudreau/orgs", "received_events_url": "https://api.github.com/users/gboudreau/received_events", "repos_url": "https://api.github.com/users/gboudreau/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gboudreau/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gboudreau/subscriptions", "type": "User", "url": "https://api.github.com/users/gboudreau", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-04T19:58:37Z
2021-09-08T12:01:02Z
2017-02-04T21:12:25Z
NONE
resolved
Using Python 3.5, requests 2.13.0 ``` >>> import requests >>> r = requests.get('https://online.chasecanada.ca/ChaseCanada_Consumer/Login.do') >>> print(r) <Response [200]> >>> print(r.headers) {'Date': 'Sat, 04 Feb 2017 19:52:14 GMT'} >>> ``` I'm pretty sure the problem is caused by an invalid HTTP header returned by the server: HTTP/1.1 200 OK Date: Sat, 04 Feb 2017 19:16:34 GMT My Param: None [...] It directly follows the `Date` response header, which is returned fine, but since no other response headers is returned, I think this broken header is breaking the HTTP response headers parser.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3852/reactions" }
https://api.github.com/repos/psf/requests/issues/3852/timeline
null
completed
null
null
false
[ "Hum. Not sure this is a `requests` issue, since the `headers` dict in the `urllib3.response.HTTPResponse` object also only has the `Date` header...\r\nMaybe there is a workaround somehow..? ", "As I answered on the urllib3 issue you opened, there is no workaround until urllib3 is no longer reliant on `http.client`. Cheers!" ]
https://api.github.com/repos/psf/requests/issues/3851
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3851/labels{/name}
https://api.github.com/repos/psf/requests/issues/3851/comments
https://api.github.com/repos/psf/requests/issues/3851/events
https://github.com/psf/requests/pull/3851
205,216,395
MDExOlB1bGxSZXF1ZXN0MTA0NTc0OTAw
3,851
remove pin
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2017-02-03T17:14:31Z
2021-09-07T00:06:39Z
2017-02-03T17:21:14Z
MEMBER
resolved
@kennethreitz, it looks like the Pipfile was rebuilt yesterday with the latest version of `pipenv`. We'll need to remove the version we had pinned now that kennethreitz/pipenv#90 is resolved.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3851/reactions" }
https://api.github.com/repos/psf/requests/issues/3851/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3851.diff", "html_url": "https://github.com/psf/requests/pull/3851", "merged_at": "2017-02-03T17:21:14Z", "patch_url": "https://github.com/psf/requests/pull/3851.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3851" }
true
[]
https://api.github.com/repos/psf/requests/issues/3850
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3850/labels{/name}
https://api.github.com/repos/psf/requests/issues/3850/comments
https://api.github.com/repos/psf/requests/issues/3850/events
https://github.com/psf/requests/issues/3850
205,180,611
MDU6SXNzdWUyMDUxODA2MTE=
3,850
Unsupported authorization method
{ "avatar_url": "https://avatars.githubusercontent.com/u/4481867?v=4", "events_url": "https://api.github.com/users/thewebsitedev/events{/privacy}", "followers_url": "https://api.github.com/users/thewebsitedev/followers", "following_url": "https://api.github.com/users/thewebsitedev/following{/other_user}", "gists_url": "https://api.github.com/users/thewebsitedev/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/thewebsitedev", "id": 4481867, "login": "thewebsitedev", "node_id": "MDQ6VXNlcjQ0ODE4Njc=", "organizations_url": "https://api.github.com/users/thewebsitedev/orgs", "received_events_url": "https://api.github.com/users/thewebsitedev/received_events", "repos_url": "https://api.github.com/users/thewebsitedev/repos", "site_admin": false, "starred_url": "https://api.github.com/users/thewebsitedev/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thewebsitedev/subscriptions", "type": "User", "url": "https://api.github.com/users/thewebsitedev", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-03T15:03:16Z
2021-09-08T12:01:03Z
2017-02-03T15:06:32Z
NONE
resolved
Hello, My code: ``` headers = {'Authorization': 'Bearer '+access_token} payload = {'grant_type':'refresh_token','refresh_token':refresh_token,'client_id': client_id, 'client_secret': client_secret} resp = requests.post(url,headers=headers,data=payload).json() ``` I am trying to get refresh token as mentioned in the [docs](https://build.envato.com/api/#oauth) for Envato API. When I run this code I get the following response: **{'error': 'invalid_request', 'error_description': 'Unsupported authorization method: '}** Can anyone point out as to what I am missing here? As per the docs I need to make a POST request to get the refresh token. ``` POST https://api.envato.com/token grant_type=refresh_token& refresh_token=[REFRESH_TOKEN]& client_id=[CLIENT_ID]& client_secret=[CLIENT_SECRET] ``` Requests: v2.13.0 Python: v3.5.3
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3850/reactions" }
https://api.github.com/repos/psf/requests/issues/3850/timeline
null
completed
null
null
false
[ "So, this is not really a Requests problem: the problem is with the specific data you need to send in the OAuth flow.\r\n\r\nHowever, can I suggest you try removing the `Authorization` header? This uses the `token` endpoint, which some of the other requests do, and none of the others provide an `Authorization` header. I suggest trying that.", "I totally agree. Thank you so much!" ]
https://api.github.com/repos/psf/requests/issues/3849
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3849/labels{/name}
https://api.github.com/repos/psf/requests/issues/3849/comments
https://api.github.com/repos/psf/requests/issues/3849/events
https://github.com/psf/requests/issues/3849
205,100,797
MDU6SXNzdWUyMDUxMDA3OTc=
3,849
Received response with content-encoding: gzip, but failed to decode it
{ "avatar_url": "https://avatars.githubusercontent.com/u/11490531?v=4", "events_url": "https://api.github.com/users/wavenator/events{/privacy}", "followers_url": "https://api.github.com/users/wavenator/followers", "following_url": "https://api.github.com/users/wavenator/following{/other_user}", "gists_url": "https://api.github.com/users/wavenator/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wavenator", "id": 11490531, "login": "wavenator", "node_id": "MDQ6VXNlcjExNDkwNTMx", "organizations_url": "https://api.github.com/users/wavenator/orgs", "received_events_url": "https://api.github.com/users/wavenator/received_events", "repos_url": "https://api.github.com/users/wavenator/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wavenator/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wavenator/subscriptions", "type": "User", "url": "https://api.github.com/users/wavenator", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-03T08:46:29Z
2021-09-05T00:07:03Z
2017-02-03T08:58:17Z
NONE
resolved
```python import requests requests.get('http://gett.bike/') ``` This code raises the following exception: ```python ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect data check',)) ``` Arch linux x64 requests==2.13.0 python=3.6.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3849/reactions" }
https://api.github.com/repos/psf/requests/issues/3849/timeline
null
completed
null
null
false
[ "This seems like the server is returning data it claims is gzipped, but isn't, or is invalid somehow. curl agrees with this assessment, as you can see by doing `curl -L --compressed http://gett.bike`, which leads to:\r\n\r\n```curl: (23) Error while processing content unencoding: invalid block type```\r\n\r\nYou can work around this for now by setting the `Accept-Encoding` header to `identity`:\r\n\r\n```python\r\nimport requests\r\n\r\nrequests.get('http://gett.bike/', headers={'Accept-Encoding': 'identity'})\r\n```\r\n\r\nBut I strongly recommend you reach out to the server operator and get them to fix their server.", "I'm now receiving that same error. I've even switched hosting providers. Here's my output. Any ideas?\r\n\r\n```Environment:\r\n\r\n\r\nRequest Method: GET\r\nRequest URL: https://sm8attachments.com/authenticate/callback/?code=1ace2d4df855a2b7e5fb6b1add0581102127cd5b&state=\r\n\r\nDjango Version: 2.1\r\nPython Version: 3.6.6\r\nInstalled Applications:\r\n['django.contrib.admin',\r\n 'django.contrib.auth',\r\n 'django.contrib.contenttypes',\r\n 'django.contrib.sessions',\r\n 'django.contrib.messages',\r\n 'django.contrib.staticfiles',\r\n 'django_celery_results',\r\n 'django.contrib.sites',\r\n 'user',\r\n 'account',\r\n 'attachment',\r\n 'webhook',\r\n 'authenticate',\r\n 'subscribe',\r\n 'home']\r\nInstalled Middleware:\r\n('whitenoise.middleware.WhiteNoiseMiddleware',\r\n 'django.middleware.security.SecurityMiddleware',\r\n 'django.contrib.sessions.middleware.SessionMiddleware',\r\n 'django.middleware.common.CommonMiddleware',\r\n 'django.middleware.csrf.CsrfViewMiddleware',\r\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\r\n 'django.contrib.messages.middleware.MessageMiddleware',\r\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\r\n 'whitenoise.middleware.WhiteNoiseMiddleware')\r\n\r\n\r\n\r\nTraceback:\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/urllib3/response.py\" in _decode\r\n 295. 
data = self._decoder.decompress(data)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/urllib3/response.py\" in decompress\r\n 77. ret += self._obj.decompress(data)\r\n\r\nDuring handling of the above exception (Error -3 while decompressing data: incorrect data check), another exception occurred:\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/models.py\" in generate\r\n 749. for chunk in self.raw.stream(chunk_size, decode_content=True):\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/urllib3/response.py\" in stream\r\n 465. data = self.read(amt=amt, decode_content=decode_content)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/urllib3/response.py\" in read\r\n 437. data = self._decode(data, decode_content, flush_decoder)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/urllib3/response.py\" in _decode\r\n 300. \"failed to decode it.\" % content_encoding, e)\r\n\r\nDuring handling of the above exception (('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect data check',))), another exception occurred:\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/django/core/handlers/exception.py\" in inner\r\n 34. response = get_response(request)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/django/core/handlers/base.py\" in _get_response\r\n 126. response = self.process_exception_by_middleware(e, request)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/django/core/handlers/base.py\" in _get_response\r\n 124. response = wrapped_callback(request, *callback_args, **callback_kwargs)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/django/views/generic/base.py\" in view\r\n 68. return self.dispatch(request, *args, **kwargs)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/django/views/generic/base.py\" in dispatch\r\n 88. 
return handler(request, *args, **kwargs)\r\n\r\nFile \"/app/authenticate/views.py\" in get\r\n 58. for vendor in get_vendors(r['access_token']):\r\n\r\nFile \"/app/authenticate/views.py\" in get_vendors\r\n 34. vendors = requests.get(url, headers=headers).json()\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/api.py\" in get\r\n 72. return request('get', url, params=params, **kwargs)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/api.py\" in request\r\n 58. return session.request(method=method, url=url, **kwargs)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/sessions.py\" in request\r\n 512. resp = self.send(prep, **send_kwargs)\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/sessions.py\" in send\r\n 662. r.content\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/models.py\" in content\r\n 827. self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''\r\n\r\nFile \"/app/.heroku/python/lib/python3.6/site-packages/requests/models.py\" in generate\r\n 754. raise ContentDecodingError(e)\r\n\r\nException Type: ContentDecodingError at /authenticate/callback/\r\nException Value: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect data check',))```" ]
https://api.github.com/repos/psf/requests/issues/3848
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3848/labels{/name}
https://api.github.com/repos/psf/requests/issues/3848/comments
https://api.github.com/repos/psf/requests/issues/3848/events
https://github.com/psf/requests/issues/3848
205,099,377
MDU6SXNzdWUyMDUwOTkzNzc=
3,848
A little suggestion about 'LookupDict' in 'requests/requests/structures.py'
{ "avatar_url": "https://avatars.githubusercontent.com/u/22973136?v=4", "events_url": "https://api.github.com/users/jiazhuamh/events{/privacy}", "followers_url": "https://api.github.com/users/jiazhuamh/followers", "following_url": "https://api.github.com/users/jiazhuamh/following{/other_user}", "gists_url": "https://api.github.com/users/jiazhuamh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jiazhuamh", "id": 22973136, "login": "jiazhuamh", "node_id": "MDQ6VXNlcjIyOTczMTM2", "organizations_url": "https://api.github.com/users/jiazhuamh/orgs", "received_events_url": "https://api.github.com/users/jiazhuamh/received_events", "repos_url": "https://api.github.com/users/jiazhuamh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jiazhuamh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiazhuamh/subscriptions", "type": "User", "url": "https://api.github.com/users/jiazhuamh", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-02-03T08:37:43Z
2021-09-08T12:01:03Z
2017-02-03T09:02:48Z
NONE
resolved
Dear Sir, I like your 'requests' Python library very much and I'm reading the source code these days. I found a tricky problem. The 'LookupDict' class definition is: ``` class LookupDict(dict): def __init__(self, name=None): self.name = name super(LookupDict, self).__init__() def __repr__(self): return '<lookup \'%s\'>' % (self.name) def __getitem__(self, key): # We allow fall-through here, so values default to None return self.__dict__.get(key, None) def get(self, key, default=None): return self.__dict__.get(key, default) ``` It inherits the 'dict' object but does not use any of the 'dict' methods. It only uses the class's \__dict__, so it seems OK to replace 'dict' with 'object'.
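A short sketch (reconstructing the class exactly as quoted above, with illustrative attribute names) showing the one behavior the override adds — missing keys fall through to `None` — and that the inherited `dict` methods such as `__setitem__` still work, which is the maintainer's reason for keeping the `dict` base:

```python
class LookupDict(dict):
    """Dictionary lookup object (as in requests/structures.py)."""

    def __init__(self, name=None):
        self.name = name
        super(LookupDict, self).__init__()

    def __repr__(self):
        return "<lookup '%s'>" % (self.name)

    def __getitem__(self, key):
        # Fall-through: missing keys yield None instead of raising KeyError
        return self.__dict__.get(key, None)

    def get(self, key, default=None):
        return self.__dict__.get(key, default)


codes = LookupDict(name='status_codes')
codes.ok = 200           # attribute -> instance __dict__, visible to __getitem__
print(codes['ok'])       # 200
print(codes['missing'])  # None -- a plain dict would raise KeyError here

# Inherited dict methods still work: __setitem__ writes to the *dict* storage,
# which the overridden __getitem__ never consults.
codes['teapot'] = 418
print(codes['teapot'])                    # None
print(dict.__getitem__(codes, 'teapot'))  # 418
```

So swapping the base to `object` would silently drop `__setitem__` and the rest of the mapping protocol, even though requests itself does not rely on them.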
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3848/reactions" }
https://api.github.com/repos/psf/requests/issues/3848/timeline
null
completed
null
null
false
[ "`LookupDict` absolutely includes all of the dict methods. In particular, it allows the use of `__setitem__` and other things from the dict class. It specifically does only one thing, which is override `__getitem__` to allow returning `None`.\r\n\r\nSo it's not ok to replace the dict with `object` in this case, even though the requests codebase doesn't use that functionality.\r\n\r\nThanks for reporting this issue though, and I hope you keep reading through the codebase!", "Thanks a lot for your reply. I'll read the code with your clue." ]
https://api.github.com/repos/psf/requests/issues/3847
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3847/labels{/name}
https://api.github.com/repos/psf/requests/issues/3847/comments
https://api.github.com/repos/psf/requests/issues/3847/events
https://github.com/psf/requests/issues/3847
205,083,957
MDU6SXNzdWUyMDUwODM5NTc=
3,847
requests messes up a ZIP stream (?)
{ "avatar_url": "https://avatars.githubusercontent.com/u/3466341?v=4", "events_url": "https://api.github.com/users/ResidentMario/events{/privacy}", "followers_url": "https://api.github.com/users/ResidentMario/followers", "following_url": "https://api.github.com/users/ResidentMario/following{/other_user}", "gists_url": "https://api.github.com/users/ResidentMario/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ResidentMario", "id": 3466341, "login": "ResidentMario", "node_id": "MDQ6VXNlcjM0NjYzNDE=", "organizations_url": "https://api.github.com/users/ResidentMario/orgs", "received_events_url": "https://api.github.com/users/ResidentMario/received_events", "repos_url": "https://api.github.com/users/ResidentMario/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ResidentMario/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ResidentMario/subscriptions", "type": "User", "url": "https://api.github.com/users/ResidentMario", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-02-03T06:43:04Z
2021-09-08T12:01:02Z
2017-02-03T10:11:29Z
NONE
resolved
There's an interesting bug that I've run into involving reading [a zipfile](https://data.cityofnewyork.us/dataset/Broadband-Data-Dig-Datasets/ft4n-yqee) off the web. The details as in the comments to [this answer on StackOverflow](http://stackoverflow.com/a/42016924/1993206): This fails with a `BadZipFileError`: ``` z = zipfile.ZipFile(io.BytesIO(requests.get("https://data.cityofnewyork.us/dataset/Broadband-Data-Dig-Datasets/ft4n-yqee").content)) ``` But writing `requests.get("https://data.cityofnewyork.us/dataset/Broadband-Data-Dig-Datasets/ft4n-yqee").content` to a file and *then* opening it with `ZipFile` works fine. So this works: ``` with open("temp.zip", "wb") as fp: fp.write(requests.get("https://data.cityofnewyork.us/dataset/Broadband-Data-Dig-Datasets/ft4n-yqee").content) with open("temp.zip", "rb") as fp: zipcontent = zipfile.ZipFile(fp) ``` This is on Python 3.5.2 and Requests 2.12.4.
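The `ZipFile(io.BytesIO(response.content))` pattern itself is sound — per the maintainer's reply, the posted URL returns an HTML page rather than the archive. A self-contained sketch of the same pattern using an in-memory archive instead of a network fetch (filename and contents are illustrative):

```python
import io
import zipfile

# Build a small ZIP entirely in memory...
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as z:
    z.writestr('hello.txt', 'hello world')

# ...then reopen it from raw bytes, exactly as one would with
# zipfile.ZipFile(io.BytesIO(requests.get(url).content)).
raw = buf.getvalue()  # stand-in for response.content
with zipfile.ZipFile(io.BytesIO(raw)) as z:
    names = z.namelist()
    text = z.read('hello.txt').decode()

print(names)  # ['hello.txt']
print(text)   # hello world
```

If `ZipFile` raises `BadZipFile` on real response bytes, inspecting the first bytes (a ZIP starts with `PK`) usually reveals an HTML error page was downloaded instead.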
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3847/reactions" }
https://api.github.com/repos/psf/requests/issues/3847/timeline
null
completed
null
null
false
[ "That doesn't work for me at all. =)\r\n\r\nAn FYI: the response I get from that website is HTML, always. By my reckoning the download link is `https://data.cityofnewyork.us/download/ft4n-yqee/application%2Fzip`, and that link does appear to work correctly in the code sample. =)", "Yes, my bad for posting the wrong link! What platform are you on?", "Mac. =) Python 3.5.2." ]
https://api.github.com/repos/psf/requests/issues/3846
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3846/labels{/name}
https://api.github.com/repos/psf/requests/issues/3846/comments
https://api.github.com/repos/psf/requests/issues/3846/events
https://github.com/psf/requests/pull/3846
204,761,219
MDExOlB1bGxSZXF1ZXN0MTA0MjYxNDg0
3,846
initial attempt at `get_redirect_target`
{ "avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4", "events_url": "https://api.github.com/users/jvanasco/events{/privacy}", "followers_url": "https://api.github.com/users/jvanasco/followers", "following_url": "https://api.github.com/users/jvanasco/following{/other_user}", "gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvanasco", "id": 204779, "login": "jvanasco", "node_id": "MDQ6VXNlcjIwNDc3OQ==", "organizations_url": "https://api.github.com/users/jvanasco/orgs", "received_events_url": "https://api.github.com/users/jvanasco/received_events", "repos_url": "https://api.github.com/users/jvanasco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions", "type": "User", "url": "https://api.github.com/users/jvanasco", "user_view_type": "public" }
[ { "color": "d4c5f9", "default": false, "description": null, "id": 536793543, "name": "needs rebase", "node_id": "MDU6TGFiZWw1MzY3OTM1NDM=", "url": "https://api.github.com/repos/psf/requests/labels/needs%20rebase" } ]
closed
true
null
[]
null
14
2017-02-02T01:52:21Z
2021-09-07T00:06:35Z
2017-02-10T21:46:14Z
CONTRIBUTOR
resolved
This is the first attempt at a PR to address my proposal #3837 to support pluggable redirect handling. Feedback is welcome. This does not break existing tests and adds a new test for handling malformed 200+location responses using a custom session mixin. I've been testing it for a day and it seems to be fine. Thank you for simply considering this, and even more thanks for all the suggestions and advice in the related thread. It has been a long week of little sleep trying to pin down some issues.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3846/reactions" }
https://api.github.com/repos/psf/requests/issues/3846/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3846.diff", "html_url": "https://github.com/psf/requests/pull/3846", "merged_at": "2017-02-10T21:46:14Z", "patch_url": "https://github.com/psf/requests/pull/3846.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3846" }
true
[ "# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3846?src=pr&el=h1) Report\n> Merging [#3846](https://codecov.io/gh/kennethreitz/requests/pull/3846?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/ad65b0cb19124b5ae4dd01bf19d82c16ffb2485d?src=pr&el=desc) will **increase** coverage by `<.01%`.\n\n\n```diff\n@@ Coverage Diff @@\n## master #3846 +/- ##\n==========================================\n+ Coverage 89.05% 89.06% +<.01% \n==========================================\n Files 15 15 \n Lines 1873 1874 +1 \n==========================================\n+ Hits 1668 1669 +1 \n Misses 205 205\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3846?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/compare/ad65b0cb19124b5ae4dd01bf19d82c16ffb2485d...70f31a3166c1f9470b5cfad888f828357c1daadd?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `92.83% <100%> (+0.02%)` | :white_check_mark: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3846?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3846?src=pr&el=footer). Last update [ad65b0c...70f31a3](https://codecov.io/gh/kennethreitz/requests/compare/ad65b0cb19124b5ae4dd01bf19d82c16ffb2485d...70f31a3166c1f9470b5cfad888f828357c1daadd?el=footer&src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).", "I originally hoped to change the while-loop. I found it a difficult to decipher what is going on, as it's doing a lot of copying/resetting of vars and prepping for the next iteration. 
I admittedly missed some obvious behaviors in it.\r\n\r\nIn terms of maintenance and readability, I agree a generator or closure would likely be better. Tracking the iteration with `i` could be replaced by just referencing `len(history)`too.", "💫✨🍰✨💫", "this is great. ", "here's a quick edit to get rid of that `i`:\r\n\r\nhttps://github.com/kennethreitz/requests/commit/6c40707b93f98fac9a7982cc5f50521079b155b8\r\n\r\nIMHO this is a bit cleaner to understand. it not only removes the `i` (i found it annoying), but instead of having to handle the weird iteration conditions and copying the history via`list`, just uses an array slice to create a new list excluding the first element.", "Yup, that does seem like a positive step.", "Should I merge that change into this PR?", "Let's give it a try and see how it looks.", "needs a rebase!", "Rebased, but a conflict on the `AUTHORS.txt` left an ugly merge commit :(\r\n\r\nI pulled in the clarification to the while-loop too.\r\n\r\nI *really dislike* how the while-loop handles the history, but I'm still trying to clean it up. ", "So FWIW, it may be easier to cherry-pick the commits you need to a new branch and force-push it up to this PR. You dragged a whole bunch of other history into the PR which makes reviewing it really hard.", "Actually, I'm going to manually patch and then force-push. That'll give you one commit. Sorry about this. Git is often not my friend.", "Cleaned up. Sorry for the mess.", "💫✨🍰✨💫" ]
https://api.github.com/repos/psf/requests/issues/3845
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3845/labels{/name}
https://api.github.com/repos/psf/requests/issues/3845/comments
https://api.github.com/repos/psf/requests/issues/3845/events
https://github.com/psf/requests/issues/3845
204,752,845
MDU6SXNzdWUyMDQ3NTI4NDU=
3,845
Various Intermittent Request Errors
{ "avatar_url": "https://avatars.githubusercontent.com/u/7819023?v=4", "events_url": "https://api.github.com/users/lukas-gitl/events{/privacy}", "followers_url": "https://api.github.com/users/lukas-gitl/followers", "following_url": "https://api.github.com/users/lukas-gitl/following{/other_user}", "gists_url": "https://api.github.com/users/lukas-gitl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lukas-gitl", "id": 7819023, "login": "lukas-gitl", "node_id": "MDQ6VXNlcjc4MTkwMjM=", "organizations_url": "https://api.github.com/users/lukas-gitl/orgs", "received_events_url": "https://api.github.com/users/lukas-gitl/received_events", "repos_url": "https://api.github.com/users/lukas-gitl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lukas-gitl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lukas-gitl/subscriptions", "type": "User", "url": "https://api.github.com/users/lukas-gitl", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2017-02-02T00:57:28Z
2021-09-08T04:00:37Z
2017-02-02T08:34:42Z
NONE
resolved
When using requests.post (have not tried other methods) I get the following errors **intermittently**. **Errors** - ('Connection aborted.', BadStatusLine("''",)) - ('Connection aborted.', error(104, 'Connection reset by peer')) - ("bad handshake: SysCallError(104, 'ECONNRESET')",) - EOF occurred in violation of protocol (_ssl.c:661) - HTTPSConnectionPool(host='...', port=443): Max retries exceeded with url: ... (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x...>: Failed to establish a new connection: [Errno 110] Operation timed out',)) - Sometimes I just get a 502 or 504 back, but that might be a remote issue. How often this happens varies, but it is at least a few times every hour (we are sending a few thousand logs per hour). I wasn't able to reproduce these errors locally or reliably. I have experienced this with various target hosts, namely the APIs provided by loggly.com, logentries.com, and logz.io. Since they all have the issue, I don't think it is a server configuration problem. This issue first appeared about a year ago when we set this up. However, it seems to have gotten more problematic since we are sending more requests. **Setup** Running Python on Elastic Beanstalk inside a [Single Container Docker Configuration](http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_docker_image.html) behind an Elastic Load Balancer. The outgoing requests happen in various threads (not just in the main thread).
More information that might be useful: - EC2 Instance OS (`cat /etc/*-release`) ``` NAME="Amazon Linux AMI" VERSION="2016.09" ID="amzn" ID_LIKE="rhel fedora" VERSION_ID="2016.09" PRETTY_NAME="Amazon Linux AMI 2016.09" ANSI_COLOR="0;33" CPE_NAME="cpe:/o:amazon:linux:2016.09:ga" HOME_URL="http://aws.amazon.com/amazon-linux-ami/" Amazon Linux AMI release 2016.09 ``` - Docker Version (`docker --version`) ``` Docker version 1.12.6, build 7392c3b/1.12.6 ``` - Docker Container OS (`cat /etc/*-release`) ``` 3.4.6 NAME="Alpine Linux" ID=alpine VERSION_ID=3.4.6 PRETTY_NAME="Alpine Linux v3.4" HOME_URL="http://alpinelinux.org" BUG_REPORT_URL="http://bugs.alpinelinux.org" ``` - Python 2.7.13 - `pip freeze --local` results: _alembic==0.8.10 aniso8601==1.2.0 apns==2.0.1 awsebcli==3.9.0 backports.ssl-match-hostname==3.5.0.1 blessed==1.9.5 boto==2.45.0 botocore==1.5.7 cement==2.8.2 cffi==1.9.1 click==6.7 colorama==0.3.7 contextlib2==0.5.4 coverage==4.3.4 cryptography==1.7.2 docker-py==1.7.2 dockerpty==0.4.1 docopt==0.4.0 docutils==0.13.1 enum34==1.1.6 Faker==0.7.7 Flask==0.12 Flask-Compress==1.4.0 Flask-Limiter==0.9.3 Flask-Migrate==2.0.3 Flask-RESTful==0.3.5 -e flask-restful-swagger==0.15.5 Flask-Script==2.0.5 Flask-SQLAlchemy==2.1 Flask-Testing==0.6.1 funcsigs==1.0.2 GeoAlchemy2==0.4.0 idna==2.2 ipaddress==1.0.18 itsdangerous==0.24 Jinja2==2.9.5 jmespath==0.9.1 limits==1.2.1 Mako==1.0.6 mandrill==1.0.57 MarkupSafe==0.23 mock==2.0.0 nose==1.3.7 pathspec==0.5.0 pbr==1.10.0 pep8==1.7.0 pexpect==4.2.1 psycopg2==2.6.2 ptyprocess==0.5.1 pyasn1==0.1.9 pybars3==0.9.2 pycparser==2.17 pycrypto==2.6.1 pymemcache==1.4.0 PyMeta3==0.5.1 pyOpenSSL==16.2.0 python-dateutil==2.6.0 python-editor==1.0.3 python-gcm==0.1.5 pytz==2016.10 PyYAML==3.12 requests==2.13.0 rollbar==0.13.10 semantic-version==2.5.0 six==1.10.0 SQLAlchemy==1.1.5 tabulate==0.7.5 uWSGI==2.0.14 vcrpy==1.10.5 wcwidth==0.1.7 websocket-client==0.40.0 Werkzeug==0.11.15 wrapt==1.10.8_ **Not working Solutions** I have found similar 
issues here, but was not able to fix it with any of the described solutions. I have tried a lot of different approaches, none of which seem to work. I would much appreciate advice on what I should try next. - Added a small sleep, since I thought this might be related to this [GIL issue](http://stackoverflow.com/questions/383738/104-connection-reset-by-peer-socket-error-or-when-does-closing-a-socket-resu) - Tried running requests in separate processes instead of a thread, since the [GIL issue](http://stackoverflow.com/questions/383738/104-connection-reset-by-peer-socket-error-or-when-does-closing-a-socket-resu) implied this might help - Tried to use sessions per thread so as not to open too many TCP connections, but saw no difference in error rate - Tried installing various libraries (pyopenssl, pyasn1, and ndg-httpsclient) as recommended [here](https://github.com/kennethreitz/requests/issues/3391) and [here](https://github.com/kennethreitz/requests/issues/3006) - Tried injecting `from requests.packages.urllib3.contrib import pyopenssl; pyopenssl.inject_into_urllib3()` as described [here](https://github.com/kennethreitz/requests/issues/3006) - Switched from an Ubuntu to an Alpine Docker container - Updated everything to the latest versions (including EC2 and Docker) - Some other things that I can't remember at this point What else should I try? Any idea what might be causing this? What tools should I use to debug this? I'm kind of at a loss here.
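The maintainer's answer in the comments is to enable retries; since POST is not idempotent, urllib3 skips it by default and it must be opted in via a `Retry` object. A sketch of that configuration (parameter values are illustrative; `allowed_methods` was called `method_whitelist` before urllib3 1.26, and retrying POST is only safe if the endpoint tolerates duplicates):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient connection failures and 5xx responses, explicitly
# including POST (unsafe unless your POSTs are idempotent).
retries = Retry(
    total=3,
    backoff_factor=0.1,
    status_forcelist=[500, 502, 503, 504],
    allowed_methods=frozenset({'GET', 'POST'}),
)

session = requests.Session()
adapter = HTTPAdapter(max_retries=retries)
session.mount('http://', adapter)
session.mount('https://', adapter)
```

Requests made through `session` then transparently retry up to three times with a short exponential backoff.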
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3845/reactions" }
https://api.github.com/repos/psf/requests/issues/3845/timeline
null
completed
null
null
false
[ "So the first four errors all amount to \"the connection was closed by the remote peer at a time we didn't expect\". The fifth one is a standard connection timeout.\r\n\r\nWhile it's possible that some of these reflect underlying problems (for example, a connection closed during a TLS handshake usually indicates that your TLS configuration is not supported by the server), in general all of these can be lumped together as what I'd call \"network noise\". Essentially, when using keep-alive connections it is possible for all kinds of wacky stuff to go on at the connection management layer.\r\n\r\nThe simplest way to handle this, and something I highly recommend, is to turn on *retries*. This can be done at the transport adapter level like this:\r\n\r\n```python\r\nimport requests\r\nfrom requests.adapters import HTTPAdapter\r\n\r\ns = requests.Session()\r\ns.mount('http', HTTPAdapter(max_retries=3))\r\ns.mount('https', HTTPAdapter(max_retries=3))\r\n```\r\n\r\nIf you want even more control you can pass a urllib3 [`Retry` object as shown in urllib3's documentation](https://urllib3.readthedocs.io/en/latest/user-guide.html#retrying-requests) instead of the integer value.\r\n\r\nHowever, without more specificity we can't help you with the individual problems. If you can get down to single reproducing cases that'd be fab.", "<strike>@Lukasa `If you want even more control you can pass a urllib3 Retry object as shown in urllib3's documentation instead of the integer value.` - This does not seem to work. Is this broken?\r\n\r\nThe code doesn't seem to handle this correctly. 
Reference: https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L110\r\n\r\nI'll try to set it on the adapter after initialisation with:\r\n```python\r\nretries = Retry(\r\n total=gitl.app.config['MAX_REQUEST_SESSION_RETRIES'],\r\n backoff_factor=0.1,\r\n status_forcelist=[500, 502, 503, 504],\r\n)\r\nadapter = HTTPAdapter()\r\nadapter.max_retries = retries\r\nself.session.mount('http', adapter)\r\nself.session.mount('https', adapter)\r\n```\r\nand see if that works...</strike>\r\n\r\nIt seems to be passes through correctly. However I still get the errors. Very odd...\r\n\r\nEdit: Turns out `POST` requests are not automatically retried...", "Indeed, this is because POST requests are not idempotent so it is not safe to automatically retry all POST requests. If you know POST requests are safe in this case, you can use urllib3's detailed Retry configuration object to allow retries to POST requests.", "Yes, that makes perfect sense. This now seems to be working fine. I'm just wondering how I would urllib3 retry to also handle `(\"bad handshake: SysCallError(104, 'ECONNRESET')\",)`. It seems to not currently retry on that error. Any idea?", "Hrm, do you have the full trace back? 
I suspect that's a bug.", "Kk, I'll try to get more information the next time it happens!", "Here is the full stack trace:\r\n```\r\nTraceback (most recent call last):\r\n File \"...\", line ..., in ...\r\n r = self.session.post(..., data=...)\r\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 535, in post\r\n return self.request('POST', url, data=data, json=json, **kwargs)\r\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/usr/local/lib/python2.7/site-packages/requests/adapters.py\", line 497, in send\r\n raise SSLError(e, request=request)\r\nSSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)\r\n```", "This is one of those moments where I'd have liked to see more data out of the stack trace. I swear, Python 3's chained exceptions only show up in my life when they aren't helpful, and never when they would be. :wink:\n\nHowever, I *think* this is a real bug. Looking at urllib3's code, it simply does not consider SSL errors to be worth retrying on. I think that instinct is wrong: SSL errors are almost certainly worth a retry attempt. I recommend you raise this as a bug over on the urllib3 repository, where we'll deal with it.\n\nFor when you raise the bug, please attach the following repro code. 
Specifically, if the following server is run:\n\n```python\nimport time\nimport socket\n\ns = socket.socket()\ns.bind(('', 4433))\ns.listen(10)\n\nwhile True:\n news, _ = s.accept()\n time.sleep(0.5)\n news.close()\n```\n\nThen this code will exhibit the issue:\n\n```python\nimport urllib3\nhttp = urllib3.PoolManager()\nhttp.request('GET', 'https://localhost:4433/', retries=5)\n```\n\nThat code should fire off a `MaxRetryError`, but instead gets an `SSLError`.", "For anyone reading this: Forking python processes doesn't seem to play nicely with requests. Ideally you have a single worker thread that is shared among all processes. This seems to have resolved all our errors here.", "I've noticed a pattern of receiving ```requests.exceptions.ConnectionError: ('Connection aborted.', OSError(\"(104, 'ECONNRESET')\",))``` when just beyond 5 minutes since request initiated and I've verified that the server I'm talking to (IIS 8.5) has server side connection timeout set to 5 minutes. I have not been able to convince admin of this server to increase the IIS server's connection timeout from 5 to 10 or 15 minutes despite one of their REST APIs can return quite a large payload back which sometimes goes beyond the timeout threshold and when it does, I get this error. Here is full stack trace. Any ideas how I can resolve this in my case? 
Let me know if this seems different enough from this issue's thread and I'll open another issue:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 601, in urlopen\r\n chunked=chunked)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 387, in _make_request\r\n six.raise_from(e, None)\r\n File \"<string>\", line 2, in raise_from\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 383, in _make_request\r\n httplib_response = conn.getresponse()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 1331, in getresponse\r\n response.begin()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 297, in begin\r\n version, status, reason = self._read_status()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 258, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/socket.py\", line 586, in readinto\r\n return self._sock.recv_into(b)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/contrib/pyopenssl.py\", line 285, in recv_into\r\n raise SocketError(str(e))\r\nOSError: (104, 'ECONNRESET')\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/requests/adapters.py\", line 440, in send\r\n timeout=timeout\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 639, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/util/retry.py\", line 357, in increment\r\n raise six.reraise(type(error), error, _stacktrace)\r\n File 
\"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/packages/six.py\", line 685, in reraise\r\n raise value.with_traceback(tb)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 601, in urlopen\r\n chunked=chunked)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 387, in _make_request\r\n six.raise_from(e, None)\r\n File \"<string>\", line 2, in raise_from\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 383, in _make_request\r\n httplib_response = conn.getresponse()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 1331, in getresponse\r\n response.begin()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 297, in begin\r\n version, status, reason = self._read_status()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/http/client.py\", line 258, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/socket.py\", line 586, in readinto\r\n return self._sock.recv_into(b)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/urllib3/contrib/pyopenssl.py\", line 285, in recv_into\r\n raise SocketError(str(e))\r\nurllib3.exceptions.ProtocolError: ('Connection aborted.', OSError(\"(104, 'ECONNRESET')\",))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"apollo/waypoint/etl/mri/cli.py\", line 150, in <module>\r\n cli()\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/click/core.py\", line 722, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/click/core.py\", line 697, in main\r\n rv = self.invoke(ctx)\r\n File 
\"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/click/core.py\", line 1066, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/click/core.py\", line 895, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/click/core.py\", line 535, in invoke\r\n return callback(*args, **kwargs)\r\n File \"apollo/waypoint/etl/mri/cli.py\", line 54, in etl\r\n etl.etl(route_set=route_set, routes=route, routes_params=route_params)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/waypoint/etl/mri/api_etl.py\", line 95, in etl\r\n self.extract(route_set=route_set, routes=routes, routes_params=routes_params)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/waypoint/etl/mri/api_etl.py\", line 171, in extract\r\n responses[extractor.name] = extractor.extract(params=params)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/waypoint/etl/mri/api_extractor.py\", line 113, in extract\r\n responses = self.route.get_all(params=params)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/accounting/__init__.py\", line 15, in wrapper\r\n result = f(self, *arg, **kw)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/accounting/mri/mri_api.py\", line 432, in get_all\r\n response = self.get(params=params, stream=stream)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/accounting/__init__.py\", line 15, in wrapper\r\n result = f(self, *arg, **kw)\r\n File \"/home/jdfagan/Repositories/Waypoint/apollo/apollo/accounting/mri/mri_api.py\", line 480, in get\r\n response = self.session.get(self.url, params=params, headers=self.headers, stream=stream)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/requests/sessions.py\", line 521, in get\r\n return self.request('GET', url, **kwargs)\r\n File 
\"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/requests/sessions.py\", line 508, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/requests/sessions.py\", line 618, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/home/jdfagan/.miniconda/apollo/lib/python3.6/site-packages/requests/adapters.py\", line 490, in send\r\n raise ConnectionError(err, request=request)\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', OSError(\"(104, 'ECONNRESET')\",))\r\n```", "@JDFagan \r\n\r\n1. Reviving issues closed for over a year because the exception looks vaguely similar isn't that helpful to anyone subscribed to the issue or the issue tracker\r\n\r\n1. You have a completely different issue and there's literally nothing you can do to avoid `ECONNRESET` based on what you said. Basically you have to either convince the server to not take 5 minutes or get it to return data that returns in 5 minutes\r\n\r\n1. Finally, you're asking for help on a defect tracker. Requests for help with requests belong on [StackOverflow](https://stackoverflow.com) and this is well documented." ]
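An aside on the retry discussion in the comments above: urllib3's `Retry` skips POST by default because it only retries methods considered idempotent, and opting POST in means widening that allowlist. A minimal pure-Python sketch of the allowlist idea follows — the set and function names here are illustrative, not urllib3's actual API:

```python
# Sketch of how a retry policy can decide which HTTP methods are safe to
# retry. Mirrors the spirit of urllib3's default method allowlist, but the
# names below are invented for illustration.

# RFC 7231 idempotent methods: repeating them causes no extra side effects.
IDEMPOTENT_METHODS = frozenset({"GET", "HEAD", "PUT", "DELETE", "OPTIONS", "TRACE"})

def is_retry_safe(method, allowed_methods=IDEMPOTENT_METHODS):
    """Return True if `method` may be automatically retried."""
    return method.upper() in allowed_methods
```

Allowing POST retries, as the comment suggests for known-safe endpoints, amounts to passing a wider set: `is_retry_safe("POST", IDEMPOTENT_METHODS | {"POST"})` is true.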
https://api.github.com/repos/psf/requests/issues/3844
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3844/labels{/name}
https://api.github.com/repos/psf/requests/issues/3844/comments
https://api.github.com/repos/psf/requests/issues/3844/events
https://github.com/psf/requests/issues/3844
204,655,822
MDU6SXNzdWUyMDQ2NTU4MjI=
3,844
HTTPS proxy connection not initialized if first request is chunked
{ "avatar_url": "https://avatars.githubusercontent.com/u/3267443?v=4", "events_url": "https://api.github.com/users/bpitman/events{/privacy}", "followers_url": "https://api.github.com/users/bpitman/followers", "following_url": "https://api.github.com/users/bpitman/following{/other_user}", "gists_url": "https://api.github.com/users/bpitman/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/bpitman", "id": 3267443, "login": "bpitman", "node_id": "MDQ6VXNlcjMyNjc0NDM=", "organizations_url": "https://api.github.com/users/bpitman/orgs", "received_events_url": "https://api.github.com/users/bpitman/received_events", "repos_url": "https://api.github.com/users/bpitman/repos", "site_admin": false, "starred_url": "https://api.github.com/users/bpitman/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bpitman/subscriptions", "type": "User", "url": "https://api.github.com/users/bpitman", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2017-02-01T17:50:04Z
2024-04-23T00:03:48Z
2023-04-22T23:33:06Z
NONE
resolved
Using code below, will produce 403 error without https_proxy: ``` $ ./test.py <Response [403]> ``` But ssl error with https_proxy (because CONNECT request is never sent): ``` $ https_proxy=http://10.32.2.100:1080/ ./test.py Traceback (most recent call last): File "./test.py", line 11, in <module> res = session.put("https://github.com/kennethreitz", data=StringIO.StringIO("")) File "/opt/pepperdata/native/lib/python2.7/site-packages/requests/sessions.py", line 533, in put return self.request('PUT', url, data=data, **kwargs) File "/opt/pepperdata/native/lib/python2.7/site-packages/requests/sessions.py", line 475, in request resp = self.send(prep, **send_kwargs) File "/opt/pepperdata/native/lib/python2.7/site-packages/requests/sessions.py", line 596, in send r = adapter.send(request, **kwargs) File "/opt/pepperdata/native/lib/python2.7/site-packages/requests/adapters.py", line 492, in send raise ConnectionError(err, request=request) ConnectionError: [SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:590) ``` For now, I'm patching my local requests/adaptor.py to init proxy when first conn use is with chunked request: ``` > if hasattr(conn, 'proxy'): > if conn.proxy is not None and not getattr(low_conn, 'sock', None): > conn._prepare_proxy(low_conn) ``` ``` ## test.py ######################################################## import requests import traceback import StringIO import sys try: session = requests.session() res = session.put("https://github.com/kennethreitz", data=StringIO.StringIO("")) sys.stdout.write(str(res) + "\n") except Exception: traceback.print_exc() session.close() ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3844/reactions" }
https://api.github.com/repos/psf/requests/issues/3844/timeline
null
completed
null
null
false
[ "Can you provide some more information please? What requests version are you using. What proxy? What Python version?", "I work with Brent. I've seen this issue on both Linux and OSX. On Linux I've seen it with python 2.7.11 and requests 2.11.1. On OSX (my laptop) I've seen it with python 2.7.11+requests 2.11.1, python 2.7.11+requests 2.13.0 and python 3.5.1+requests 2.13.0. I believe Brent used Apache as a proxy, but I'm not positive.\r\n\r\nI think, though, that given the patch, (btw, it's around like 432 of requests/adapters.py), it's probably not relevant what the proxy software was.", "I'm using apache2. Config below (don't do this unless server is secure). Like Sean said, though, I don't think it's relevant. Packet captures clearly show that CONNECT request isn't sent.\r\n\r\n# add to ports.conf\r\nListen 1080\r\n\r\n# add to a file in sites-enabled\r\n<VirtualHost *:1080>\r\n ProxyRequests On\r\n ProxyVia On\r\n</VirtualHost>", "Yeah, I suspect this is a real problem. I think this can be resolved by writing a patch to use the `urlopen` method for chunked uploads with the parameter `chunked=True` in `HTTPAdapter.send`. I'd welcome a patch from anyone who wants to make it.", "Found this thread when searching for the \"unknonw protocol\" problem.\r\nIn fact, there's another bug in the \"chunked\" branch, the timeout parameter is not used in this branch, turns out the request will hang forever if it's a \"chunked\" request.\r\nAnother similar bug was reported (https://github.com/requests/requests/issues/2336) in 2014 but seems only fixed in the branch that uses urlopen().\r\nIt's not a good idea to call the private method of another library(urllib3), maybe it's better to try use urlopen() in both branches.", "(as commented on #4179)\r\n\r\nI was experiencing the same issue (chunked encoding through an HTTPS proxy tunnel resulting in an SSL unknown protocol error) with requests 2.22.0.\r\n\r\nI upgraded to a9ee0ee and the issue disappeared :+1: . 
After spending some time analyzing the issue I'm confident that the fix came from #5128.\r\n\r\nSince as of Jan 10, 2020 the fix is not released I used this command to upgrade requests to head of master branch: `pip install git+https://github.com/psf/requests.git --upgrade`" ]
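A hedged aside on the chunked-proxy record above: before the upstream fix landed, one commonly discussed workaround was to avoid chunked transfer-encoding entirely by buffering the body into bytes so an explicit Content-Length can be sent. A stdlib-only sketch of that idea — the helper name is made up for illustration, and it is only sensible for bodies that fit in memory:

```python
import io

def buffer_body(body):
    """Read a file-like body fully into memory and return (bytes, headers)
    with an explicit Content-Length, so a client never needs to fall back
    to Transfer-Encoding: chunked for this request."""
    data = body.read() if hasattr(body, "read") else body
    return data, {"Content-Length": str(len(data))}
```

For the reproduction script in the record, this would mean sending `data, headers = buffer_body(io.BytesIO(b""))` instead of the raw StringIO object.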
https://api.github.com/repos/psf/requests/issues/3843
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3843/labels{/name}
https://api.github.com/repos/psf/requests/issues/3843/comments
https://api.github.com/repos/psf/requests/issues/3843/events
https://github.com/psf/requests/issues/3843
204,565,507
MDU6SXNzdWUyMDQ1NjU1MDc=
3,843
`except` without exception class specified makes IDNA-encoding non-threadsafe.
{ "avatar_url": "https://avatars.githubusercontent.com/u/179961?v=4", "events_url": "https://api.github.com/users/Crazy-Owl/events{/privacy}", "followers_url": "https://api.github.com/users/Crazy-Owl/followers", "following_url": "https://api.github.com/users/Crazy-Owl/following{/other_user}", "gists_url": "https://api.github.com/users/Crazy-Owl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Crazy-Owl", "id": 179961, "login": "Crazy-Owl", "node_id": "MDQ6VXNlcjE3OTk2MQ==", "organizations_url": "https://api.github.com/users/Crazy-Owl/orgs", "received_events_url": "https://api.github.com/users/Crazy-Owl/received_events", "repos_url": "https://api.github.com/users/Crazy-Owl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Crazy-Owl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Crazy-Owl/subscriptions", "type": "User", "url": "https://api.github.com/users/Crazy-Owl", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-02-01T11:59:56Z
2021-09-08T12:00:51Z
2017-02-01T12:05:28Z
NONE
resolved
Hello, We have a python program with resident process running in separate thread. We make requests in said thread and when master thread spots a timeout, a custom exception is dispatched to child thread via OS signal. We use Requests `2.12.4`. However, that exception is caught not by our handler (in child thread main loop), but by `except:` directive in `alabel` function deep in `packages/idna/core.py`: https://github.com/kennethreitz/requests/blob/master/requests/packages/idna/core.py#L264 because that handler is an unconditional "catch-all" `except:` directive. That leads to very cryptic behaviour like child thread not stopping and trying to access urls with unnecessary IDNA-encoding (all our URLs are in plain ascii). Expected behaviour for that handler is not to catch everything, but to expect a specific exception class(es) and work just with them.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3843/reactions" }
https://api.github.com/repos/psf/requests/issues/3843/timeline
null
completed
null
null
false
[ "Yup, that looks wrong, thanks for reporting it! The `idna` code is vendored, without edits, from [this repo](https://github.com/kjd/idna). I recommend you open an issue there, which when fixed will make its way into Requests.", "The changes to `idna` repo landed about two weeks ago: https://github.com/kjd/idna/pull/39\r\n\r\nWhen can we expect the chages to be reflected in `requests`?", "The changes will land when we prepare our next minor release: those releases are when we update our vendored dependencies." ]
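Illustrating the hazard described in the record above: a bare `except:` catches everything, including an exception delivered from another thread via a signal handler, whereas catching a specific class lets unrelated exceptions propagate to the caller's own handler. A minimal sketch (names invented for illustration; this is not the vendored idna code itself):

```python
def narrow_decode(raw):
    """Decode ASCII bytes, swallowing only UnicodeError.  Unlike a bare
    `except:`, any unrelated exception (e.g. a custom timeout exception
    injected asynchronously) still propagates to the caller."""
    try:
        return raw.decode("ascii")
    except UnicodeError:
        return None
```

With a bare `except:` in place of `except UnicodeError:`, the injected exception would be silently absorbed here — the cryptic behaviour the reporter observed.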
https://api.github.com/repos/psf/requests/issues/3842
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3842/labels{/name}
https://api.github.com/repos/psf/requests/issues/3842/comments
https://api.github.com/repos/psf/requests/issues/3842/events
https://github.com/psf/requests/issues/3842
204,344,671
MDU6SXNzdWUyMDQzNDQ2NzE=
3,842
PUT (requests.put) specifying the source file location for zip file error 10054
{ "avatar_url": "https://avatars.githubusercontent.com/u/3253973?v=4", "events_url": "https://api.github.com/users/Twoflower2/events{/privacy}", "followers_url": "https://api.github.com/users/Twoflower2/followers", "following_url": "https://api.github.com/users/Twoflower2/following{/other_user}", "gists_url": "https://api.github.com/users/Twoflower2/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Twoflower2", "id": 3253973, "login": "Twoflower2", "node_id": "MDQ6VXNlcjMyNTM5NzM=", "organizations_url": "https://api.github.com/users/Twoflower2/orgs", "received_events_url": "https://api.github.com/users/Twoflower2/received_events", "repos_url": "https://api.github.com/users/Twoflower2/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Twoflower2/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Twoflower2/subscriptions", "type": "User", "url": "https://api.github.com/users/Twoflower2", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2017-01-31T16:14:49Z
2021-09-07T00:06:13Z
2017-01-31T16:40:24Z
NONE
resolved
With using trying an zip file I get the following, any clues: (requests-2.13.0) ``` with open('default.zip', 'rb') as data: requests.put(url, data=data) ``` Output: ``` C:\svn\libraries\cpp>python req_put.py default.zip Traceback (most recent call last): File "req_put.py", line 102, in <module> resp = requests.put(uri, headers=headers, data=data) # data=data , params=payload) File "C:\Python27\lib\site-packages\requests\api.py", line 124, in put return request('put', url, data=data, **kwargs) File "C:\Python27\lib\site-packages\requests\api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "C:\Python27\lib\site-packages\requests\sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "C:\Python27\lib\site-packages\requests\sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "C:\Python27\lib\site-packages\requests\adapters.py", line 473, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: ('Connection aborted.', error(10054, 'An existing connection was forcibly closed by the remote host')) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3842/reactions" }
https://api.github.com/repos/psf/requests/issues/3842/timeline
null
completed
null
null
false
[ "The server will have sent a 4XX response, which means it doesn't like your upload. Unfortunately Requests can't handle this yet, because httplib doesn't make it easy. We're working on a substantial refactor to fix this, but it'll be a long time coming. In the meantime, I suggest you try to investigate what format the server is actually expecting. ", "@Lukasa Many thanks. Seeing that is really seems you know what is going on here 3 qs:\r\n\r\n1. What other Python cURL lib can I use here that you might know of that works?\r\n2. Why would requests.put() work perfectly not specifying the data param, hence not specifying the source location of file?\r\n3. Not being sure how cURL does this, is it not possible to also add the cURL `-T, --upload-file <file>` like option in the requests library?", "1. pycurl\r\n2. HTTP is a very general protocol. PUT does not inherently have anything to do with uploading files. In fact, it's perfectly allowed to send a PUT without any body. This is why Requests allows it.\r\n3. `curl -T` does exactly what you just did. The only difference between Requests and curl is in one case: if you do `curl -T <filename> http://host` (that is, with no path), then curl will send a PUT to `/filename`. To do the same thing with Requests, you'd need to do:\r\n\r\n ```python\r\n url = \"http://example.com/\"\r\n with open('default.zip', 'rb') as data:\r\n requests.put(url + \"default.zip\", data=data)\r\n ```", "what should I do for the tar.gz file in a PUT request using pycurl module @Lukasa ", "I figure out the solution." ]
https://api.github.com/repos/psf/requests/issues/3841
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3841/labels{/name}
https://api.github.com/repos/psf/requests/issues/3841/comments
https://api.github.com/repos/psf/requests/issues/3841/events
https://github.com/psf/requests/pull/3841
204,193,292
MDExOlB1bGxSZXF1ZXN0MTAzODY0Nzk3
3,841
Update Copyright in LICENSE to 2017
{ "avatar_url": "https://avatars.githubusercontent.com/u/6897645?v=4", "events_url": "https://api.github.com/users/StewPoll/events{/privacy}", "followers_url": "https://api.github.com/users/StewPoll/followers", "following_url": "https://api.github.com/users/StewPoll/following{/other_user}", "gists_url": "https://api.github.com/users/StewPoll/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/StewPoll", "id": 6897645, "login": "StewPoll", "node_id": "MDQ6VXNlcjY4OTc2NDU=", "organizations_url": "https://api.github.com/users/StewPoll/orgs", "received_events_url": "https://api.github.com/users/StewPoll/received_events", "repos_url": "https://api.github.com/users/StewPoll/repos", "site_admin": false, "starred_url": "https://api.github.com/users/StewPoll/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/StewPoll/subscriptions", "type": "User", "url": "https://api.github.com/users/StewPoll", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-31T03:04:22Z
2021-09-07T00:06:39Z
2017-01-31T08:54:28Z
CONTRIBUTOR
resolved
Branch name was meant to be 'Happy New Year' but apparently typing the word 'Year' isn't my strongest skill.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3841/reactions" }
https://api.github.com/repos/psf/requests/issues/3841/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3841.diff", "html_url": "https://github.com/psf/requests/pull/3841", "merged_at": "2017-01-31T08:54:28Z", "patch_url": "https://github.com/psf/requests/pull/3841.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3841" }
true
[ "# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3841?src=pr&el=h1) Report\n> Merging [#3841](https://codecov.io/gh/kennethreitz/requests/pull/3841?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/e84086ac86075321aa479a24f6c83dd3ead40212?src=pr&el=desc) will **not impact** coverage.\n\n\n```diff\n@@ Coverage Diff @@\n## master #3841 +/- ##\n=======================================\n Coverage 89.05% 89.05% \n=======================================\n Files 15 15 \n Lines 1873 1873 \n=======================================\n Hits 1668 1668 \n Misses 205 205\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3841?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3841?src=pr&el=footer). Last update [e84086a...1a4ba6c](https://codecov.io/gh/kennethreitz/requests/compare/e84086ac86075321aa479a24f6c83dd3ead40212...1a4ba6c833f16283103d4eac488bd46640280bdf?el=footer&src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).", "Thanks @TetraEtc!" ]
https://api.github.com/repos/psf/requests/issues/3840
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3840/labels{/name}
https://api.github.com/repos/psf/requests/issues/3840/comments
https://api.github.com/repos/psf/requests/issues/3840/events
https://github.com/psf/requests/issues/3840
204,136,883
MDU6SXNzdWUyMDQxMzY4ODM=
3,840
should a malformed non-gzip yet marked gzip response be handled in requests?
{ "avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4", "events_url": "https://api.github.com/users/jvanasco/events{/privacy}", "followers_url": "https://api.github.com/users/jvanasco/followers", "following_url": "https://api.github.com/users/jvanasco/following{/other_user}", "gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvanasco", "id": 204779, "login": "jvanasco", "node_id": "MDQ6VXNlcjIwNDc3OQ==", "organizations_url": "https://api.github.com/users/jvanasco/orgs", "received_events_url": "https://api.github.com/users/jvanasco/received_events", "repos_url": "https://api.github.com/users/jvanasco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions", "type": "User", "url": "https://api.github.com/users/jvanasco", "user_view_type": "public" }
[]
closed
true
null
[]
null
18
2017-01-30T21:34:50Z
2021-09-08T12:00:57Z
2017-02-09T21:37:02Z
CONTRIBUTOR
resolved
Working on a PR, I discovered that a decently sized CDN will *usually* block the `requests` library via user-agent, but does so with a malformed response that raises an error. (*usually* means hitting the CDN from an ip address using a non-blocked user-string seems to whitelist the IP for 120 seconds -- which is why this took forever to figure out). Their 403 response is malformed, as the header indicates a gzipped encoding {'Content-Length': '345', 'Content-Encoding': 'gzip', 'Content-Type': '*/*'} however the payload is uncompressed plain-text: <?xml version="1.0" encoding="iso-8859-1"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"> <head> <title>403 - Forbidden</title> </head> <body> <h1>403 - Forbidden</h1> </body> </html> This raises a DecodeError/ContentDecodingError in `Response.iter_content`. I can provide a test case. I just don't want to name the CDN or a client as some project maintainers have (un)official policies on stuff like that. This is definitely a "bad server". Aside from sending the malformed response, they don't respect 'Accept-Encoding' either. With this particular CDN, the payload is not chunked and reading the stream with `decode_content=False` will work. I'm not sure how/if this should be handled. It might be nice to have a fallback where a failure to read compressed data will attempt to read it uncompressed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3840/reactions" }
https://api.github.com/repos/psf/requests/issues/3840/timeline
null
completed
null
null
false
[ "I don't think this should be handled directly. This is strictly an error and the CDN needs to sort out their mess. I have no problems naming names on the Requests issue tracker, but if you'd rather not you should reach out to me directly via email and let me know who it is: the odds are good that either I know someone who works there or that I know someone who knows someone. Bonus points for a repro script!", "Yeah, stacking workarounds on bad server behaviors doesn't seem the good approach. The maintenance burden will bit eventually. ", "This appears to effect several domains on Verizon's \"Enterprise Content Delivery\" network -- which was EdgeCast before acquisition. The response headers for their servers are `ecd ({release_id})`, and are across a decently sized IP block.\r\n\r\nI was testing a PR candidate for redirects against the first short link on my twitter feed, and this issue came up. I dug in and found several domains this happens on, but a good example is the first one I found -- the huffington post.\r\n\r\nHere's a streamlined reproduction that should work:\r\n\r\n```\r\nimport requests\r\n\r\n# this will raise a ContentDecodeError\r\nurl = \"http://www.huffingtonpost.com/entry/senate-democrats-boycott_us_5890ae26e4b0c90eff001c3c?v35c9pbty5f5vobt9\"\r\nr = requests.get(url)\r\nprint r\r\nprint r.content\r\n```\r\n\r\nThere is a lot of extended behavior going on. I currently think the bit of the querystring I left intact above is some sort of auth payload, and I wasn't dealing with IP whitelisting but cache expiration. It appears that if you can load a url into their CDN cache (which depends on User-Agent and/or the URL's querystrings; \"base\" urls without querystrings always pass), you then enable most representations of that URL to all clients in the region for 2 minutes. I wrote about 30 lines of text and some code illustrating that, but I'll save you from the boredom ;)\r\n\r\n", "Yeah, so this is a bit weird: I can't reproduce that here. Presumably this is somewhat specific to certain caches. Want to try using a few different source address blocks to see if it is consistently reproducible?", "FWIW, the reproducer worked for me.; I did get a `requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing: incorrect header check',))`", "Can you two share your global regions? I tested this from 3 servers in the USA: NYC, New Jersey and Los Angeles.", "I'm London based, so almost certainly using different DCs.", "I am in Paris. From work (in Paris too, but with a different network provider) the above code snipped raised an exception. From home, it return a properly formatted 403 error...", "seems to be a popular method to block non-browsers\r\n`\r\nrequests.get('http://www.bcast.site/stream.php')`", "Hello,\r\nI get these errors with some urls \r\n\r\nexamples : \r\n\r\n```\r\n requests.get('http://superuser.com/robots.txt')\r\n requests.get('http://askubuntu.com/robots.txt')\r\n requests.get('http://stackoverflow.com/robots.txt')\r\n```\r\n\r\nbut fetching doesn't fail with urllib3 only : \r\n```\r\n>>> import urllib3\r\n>>> http = urllib3.PoolManager()\r\n>>> r = http.request('GET','http://superuser.com/robots.txt')\r\n>>> r.status\r\n200\r\n>>> r.data\r\nb\"User-Agent: *\\r\\nDisallow: ....\r\n```", "Ok folks, let's please try to restrict this issue. User agent spoofing is usually enough to solve this problem. ", "I'm sorry but agent spoofing does not solve this problem. You can try any headers you want, it will lead to the same error. Also **urllib3** (on which **requests** is build on) can fetch data without any issue. Hence this is a bug _a priori_. If it is not, can you explain why please ?", "@azotlikid Well, let's be clear, it's not *a priori* a bug because it works fine for me using the current Requests build.\r\n\r\n*However*, if your problem really is the same as the one discussed in this issue then there is a bug, but it is *server side*: it is sending invalid gzipped data. The server is at fault, and if the data is invalid then there is little scope for us to avoid barfing on it.", "FWIW, @azotlikid's examples work fine for me now too. It did not work the other day from this same location. I had looked up the IP blocks the other day, and they were hosted on the Fastly CDN -- and still are. They may have corrected things on their system. I'll reach out to someone at StackExchange I know through a project to see if I can find out anything.\r\n\r\nIt would still be nice if a future branch had some sort of design detail where bad-servers could be more cleanly caught and handled by developers or the errors could bubble-up a bit better.\r\n\r\nHaving gone though the code recently to address the \"gzip header when not really gzip\", it would honestly create more problems trying to address this stuff -- so I don't think think it would be worth trying to handle. \r\n\r\nHowever... it would be *swell* if the raised exceptions could be more constructive for developers, allowing them to do something with the bad request.\r\n\r\nGoing back to my example, the exception raised is `requests.exceptions.ContentDecodingError`. IIRC, it bubbles up from this block: https://github.com/kennethreitz/requests/blob/master/requests/models.py#L715-L733\r\n\r\nPerhaps these exceptions could be extended to include the `self` response object. That would allow a developer to examine the instant `Response` and then act upon the `.stream` attribute. Ultimately, that would allow someone to inspect the exception and determine how/why it is a \"bad server\" -- instead of using a hunt&peck method of what might or might not work.\r\n\r\n", "> Perhaps these exceptions could be extended to include the `self` response object.\r\n\r\n@jvansco That shouldn't really be necessary. If the user sets `stream=True` that exception will only fire when accessing the body content, at which point the response object will already be in their hands. It's just not very hard to arrange a situation where any problems processing the body occur at a controlled and defined time.", "Thanks for the reply,\r\n@jvanasco Inexplicably this error doesn't happen again since yesterday... Obviously it was a bug in the space-time continuum... And I still don't understand why there was no problem with urllib3 but this will remain a mystery.\r\n@Lukasa the problem is that I use a library which use `requests`, and I can't catch this exception without a global catch.", "> at which point the response object will already be in their hands. \r\n\r\n@Lukasa you are 100% correct. I've been staring at the inner workings too long. my apologies!", ";) no need for apologies @jvanasco, this is why we work in groups: it's easy for any one of us to miss the wood for the trees." ]
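The fallback floated in the original report -- try the advertised gzip decoding, and fall back to the raw bytes when the server lied in its `Content-Encoding` header -- can be sketched outside of requests with only the standard library. The `decode_body` helper below is illustrative, not requests or urllib3 API:

```python
import gzip
import zlib


def decode_body(raw: bytes) -> bytes:
    """Try the gzip framing the Content-Encoding header promised.

    If the payload was never actually compressed (the "bad server"
    case discussed above), zlib raises an error -- e.g. the
    "incorrect header check" seen in the traceback -- and we fall
    back to returning the raw bytes unchanged.
    """
    try:
        # wbits = MAX_WBITS | 16 selects gzip-only framing.
        return zlib.decompress(raw, wbits=zlib.MAX_WBITS | 16)
    except zlib.error:
        return raw


plain = b"<html><body><h1>403 - Forbidden</h1></body></html>"
assert decode_body(gzip.compress(plain)) == plain  # honest server
assert decode_body(plain) == plain                 # lying Content-Encoding
```

As the maintainers note, requests deliberately does not do this; a consumer that needs it can read the stream with `decode_content=False` and apply such a fallback itself.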
https://api.github.com/repos/psf/requests/issues/3839
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3839/labels{/name}
https://api.github.com/repos/psf/requests/issues/3839/comments
https://api.github.com/repos/psf/requests/issues/3839/events
https://github.com/psf/requests/issues/3839
204,121,739
MDU6SXNzdWUyMDQxMjE3Mzk=
3,839
multipart/form-data not encoding correctly on recent versions
{ "avatar_url": "https://avatars.githubusercontent.com/u/6018782?v=4", "events_url": "https://api.github.com/users/cmanallen/events{/privacy}", "followers_url": "https://api.github.com/users/cmanallen/followers", "following_url": "https://api.github.com/users/cmanallen/following{/other_user}", "gists_url": "https://api.github.com/users/cmanallen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cmanallen", "id": 6018782, "login": "cmanallen", "node_id": "MDQ6VXNlcjYwMTg3ODI=", "organizations_url": "https://api.github.com/users/cmanallen/orgs", "received_events_url": "https://api.github.com/users/cmanallen/received_events", "repos_url": "https://api.github.com/users/cmanallen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cmanallen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cmanallen/subscriptions", "type": "User", "url": "https://api.github.com/users/cmanallen", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-30T20:33:23Z
2021-09-08T12:01:05Z
2017-01-31T09:45:32Z
NONE
resolved
I have a request that works in older versions of the `requests` library but not in more recent versions. The trouble point is the `model` key in the files dictionary. In version `2.11.1`, the model key is correctly recognized by the remote endpoint. In newer versions, the `model` key appears to be omitted or malformed in some way. I don't believe the API endpoint is the issue. I can execute a `curl` command and it will work as expected. A sample (functioning) curl request: ```bash curl -F data_file=@my_file.mp3 -F model=en-US "https://api.speechmatics.com/v1.0/user/1/jobs/?auth_token=ABC" ``` ### Request ```python import requests files = { 'data_file': open(filename, 'rb'), 'model': ('', 'en-US'), } response = requests.post( 'https://api.speechmatics.com/v1.0/user/1/jobs/?auth_token=ABC', files=files) print(response.content) # b'{\n "code": 400, \n "error": "No language selected"\n}' ``` ### Versions tested: *All versions installed through `pip` and are being used with `python 3.5`.* `2.11.1` - Works. `2.12.4` - **Fails.** `2.13.0` - **Fails.**
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3839/reactions" }
https://api.github.com/repos/psf/requests/issues/3839/timeline
null
completed
null
null
false
[ "Yeah, this behaviour changed. In particular, the lower-level urllib3 library wanted to allow the possibility of setting an *empty* filename (as opposed to not providing the filename at all). You can get the correct behaviour on all platforms by changing `'model': ('', 'en-US'),` to `'model': (None, 'en-US'),`", "@Lukasa Thanks!" ]
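The behaviour change behind the fix -- `''` now means an explicit *empty* filename, while `None` means no filename parameter at all -- can be illustrated with a small stand-in for the multipart field header. This helper is a simplified sketch of the idea, not the actual urllib3 code:

```python
def content_disposition(name, filename):
    """Approximate the Content-Disposition header for one multipart field.

    - filename is None -> the filename parameter is omitted entirely
      (a plain form field, which is what this API expects)
    - filename == ''   -> an explicit filename="" is emitted, which many
      servers interpret as a file upload rather than a form value
    """
    header = 'form-data; name="%s"' % name
    if filename is not None:
        header += '; filename="%s"' % filename
    return header


# 'model': (None, 'en-US') -> plain form field, recognized by the endpoint.
assert content_disposition("model", None) == 'form-data; name="model"'

# 'model': ('', 'en-US') -> explicit empty filename, rejected as
# "No language selected" by the endpoint in this issue.
assert content_disposition("model", "") == 'form-data; name="model"; filename=""'
```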
https://api.github.com/repos/psf/requests/issues/3838
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3838/labels{/name}
https://api.github.com/repos/psf/requests/issues/3838/comments
https://api.github.com/repos/psf/requests/issues/3838/events
https://github.com/psf/requests/issues/3838
203,864,011
MDU6SXNzdWUyMDM4NjQwMTE=
3,838
ignoring MTU and MSS in TCP handshake
{ "avatar_url": "https://avatars.githubusercontent.com/u/17862972?v=4", "events_url": "https://api.github.com/users/boogardgodig/events{/privacy}", "followers_url": "https://api.github.com/users/boogardgodig/followers", "following_url": "https://api.github.com/users/boogardgodig/following{/other_user}", "gists_url": "https://api.github.com/users/boogardgodig/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/boogardgodig", "id": 17862972, "login": "boogardgodig", "node_id": "MDQ6VXNlcjE3ODYyOTcy", "organizations_url": "https://api.github.com/users/boogardgodig/orgs", "received_events_url": "https://api.github.com/users/boogardgodig/received_events", "repos_url": "https://api.github.com/users/boogardgodig/repos", "site_admin": false, "starred_url": "https://api.github.com/users/boogardgodig/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/boogardgodig/subscriptions", "type": "User", "url": "https://api.github.com/users/boogardgodig", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-29T11:17:05Z
2021-09-08T12:01:06Z
2017-01-29T16:01:58Z
NONE
resolved
Hi, I was using requests flawlessly until yesterday, when I realised I cannot send large packets to an https server, getting the following error: `requests/adapters.py", line 473, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine("''",)) ` Turns out my ISP started filtering ICMP fragmentation-needed packets and drops large packets. Otherwise the packets would have been fragmented and retransmitted. Shouldn't requests fragment packets with respect to MTU or MSS to prevent this from happening? Is there a workaround to this problem?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3838/reactions" }
https://api.github.com/repos/psf/requests/issues/3838/timeline
null
completed
null
null
false
[ "Requests does not control the way packets are emitted. The socket API for TCP does not expose or allow us to control packet boundaries: the TCP layer in the kernel manages all of this. Any problem you're seeing would be encountered by other programs on your system. \r\n\r\nYou'll need to investigate your system's configuration. " ]
https://api.github.com/repos/psf/requests/issues/3837
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3837/labels{/name}
https://api.github.com/repos/psf/requests/issues/3837/comments
https://api.github.com/repos/psf/requests/issues/3837/events
https://github.com/psf/requests/issues/3837
203,821,379
MDU6SXNzdWUyMDM4MjEzNzk=
3,837
Proposal: potential way to follow bad url shorteners
{ "avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4", "events_url": "https://api.github.com/users/jvanasco/events{/privacy}", "followers_url": "https://api.github.com/users/jvanasco/followers", "following_url": "https://api.github.com/users/jvanasco/following{/other_user}", "gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvanasco", "id": 204779, "login": "jvanasco", "node_id": "MDQ6VXNlcjIwNDc3OQ==", "organizations_url": "https://api.github.com/users/jvanasco/orgs", "received_events_url": "https://api.github.com/users/jvanasco/received_events", "repos_url": "https://api.github.com/users/jvanasco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions", "type": "User", "url": "https://api.github.com/users/jvanasco", "user_view_type": "public" }
[]
closed
true
null
[]
null
16
2017-01-28T18:03:54Z
2021-09-08T12:00:58Z
2017-02-10T22:10:22Z
CONTRIBUTOR
resolved
I've run into an issue that runs tangent to issue #441. We have a content indexer that is powered by `requests`. Some of the larger URL shorteners exhibit some weird behaviors depending on the user agent string. * (variant of #441) depending on the user-agent header, a particular shortener will either send a proper 301 response OR an HTTP 200 containing a meta-refresh value of 0. <head><meta name="referrer" content="always"><noscript><META http-equiv="refresh" content="0;URL={URL}"></noscript><title>{URL}</title></head><script>window.opener = null; location.replace("{URL}")</script> * (new?) a handful of url shorteners may send an HTTP 200 response with a `location` header to the redirect. (yes I know, it breaks spec and makes no sense.) To handle both of these (and several other scenarios) I have a novel suggestion. The callback hooks could be used to allow developers to catch and follow these types of redirect oddities within a single configured request. A developer would be able to handle edge cases like the above with a hook: def bad_shortener_callback(r): r_location = r.headers.get('location') if r.status_code == 200 and r_location != r.url: r.is_redirect = True r.redirect_location_override = r_location return r In order to make this work, a slight change would be needed: The redirect url is pulled via this line: https://github.com/kennethreitz/requests/blob/f72684e13c5074a671506d29c1b5638156680ea7/requests/sessions.py#L116 url = resp.headers['location'] I propose this change to sessions.py -url = resp.headers['location'] +url = resp.redirect_location and then extending models.py accordingly + redirect_location_override = None @property def redirect_location(self): if self.redirect_location_override is not None: return self.redirect_location_override return self.headers['location'] if this is acceptable, I would be happy to issue a PR that includes tests.
{ "avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4", "events_url": "https://api.github.com/users/jvanasco/events{/privacy}", "followers_url": "https://api.github.com/users/jvanasco/followers", "following_url": "https://api.github.com/users/jvanasco/following{/other_user}", "gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvanasco", "id": 204779, "login": "jvanasco", "node_id": "MDQ6VXNlcjIwNDc3OQ==", "organizations_url": "https://api.github.com/users/jvanasco/orgs", "received_events_url": "https://api.github.com/users/jvanasco/received_events", "repos_url": "https://api.github.com/users/jvanasco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions", "type": "User", "url": "https://api.github.com/users/jvanasco", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3837/reactions" }
https://api.github.com/repos/psf/requests/issues/3837/timeline
null
completed
null
null
false
[ "Thanks for the suggestion!\r\n\r\nSo, I think I don't fully understand why this needs to be a Requests patch. What advantages would you or other users get from having this integrated into Requests?", "I'm not sure how to interpret your question, so I'm going to answer it a few ways:\r\n\r\nAnd I'll refer to this URL flow, where the 301s are a proper redirect and the 200 is standing in for a malformed response (meta-refresh or 200+location).\r\n\r\n\tA -> [301] -> B -> [200] -> C -> [301] -> D\r\n\r\n1. What is the utility in general?\r\n\r\nIn terms of shorteners, it's incredibly common to encounter somewhat broken servers or a <meta-refresh> tag. Being able to follow this stuff is often desired. \r\n\r\nRegarding the two examples I mentioned above, one is major social network and the other is one of the largest commercial url shortening services that handles hundreds of domains. \r\n\r\n\r\n2. A patch streamlines work.\r\n\r\nWith a patch in place, consumer code to handle bad servers would look something like the following. \r\n\r\n    def handle_response(r):\r\n        ...\r\n        r.location_override = 'url2'\r\n        return r\r\n    \r\n    r = requests.get('url', max_redirects=4, hooks={'response': handle_response} )\r\n    print(r)\r\n    > <Response [200]>\r\n    print(r.history)\r\n    > [<Response [301]>, <Response [200]>, <Response [301]>]\r\n\r\nI know what you're thinking -- \"so elegant! much requests!\" . Right? Right!\r\n\r\nWithout a patch in place, it's quite messy/ugly to keep reprocessing the urls while adhering to an intended max_redirects - otherwise you have an endless recursion. This is just a rough try at rewriting the above:\r\n\r\n    def needs_redirect(r):\r\n        ...\r\n        return alt_location\r\n\r\n\tmax_redirects = 4\r\n\ts = requests.Session(max_redirects=max_redirects)  # use a session so we don't have to create two objects with the same configuration kwargs)\r\n\tr = s.get(url)\r\n\tr_history = r.history  # we're going to overwrite r, so ugh.\r\n\twhile True:\r\n\t\tredirect_location = needs_redirect(r)\r\n\t\tif not redirect_location:\r\n\t\t\tbreak\r\n\t\t_max_redirects = max_redirects - len(r_history)  # compute this as part of the redirects\r\n\t\tif _max_redirects <= 0:\r\n\t\t\traise requests.exceptions.TooManyRedirects('Exceeded %s redirects.' % max_redirects, response=r)  # we'd raise an error within requests if a patch existed\r\n\t\tr = s.get(url, max_redirects=_max_redirects)\r\n\t\tr_history.extend(r.history)\r\n    print(r)\r\n    > <Response [200]>\r\n    print(r_history)\r\n    > [<Response [301]>, <Response [200]>, <Response [301]>]\r\n\r\n\r\n3. Abusing the object's API\r\n\r\nA hook could potentially overwrite the existing response data without the need for a patch, but that would make dealing with the response history very messy while debugging/developing and could break:\r\n\r\n    def handle_response(r):\r\n        ...\r\n        r.headers['requests-original-location'] = r.headers.get('location')\r\n        r.original_status_code = r.status_code\r\n        r.headers['location'] = 'url2'\r\n        r.status_code = 301\r\n        return r\r\n    \r\n    r = requests.get('url', max_redirects=4, hooks={'response': handle_response} )\r\n    print(r)\r\n    > <Response [200]>\r\n    print(r.history)\r\n    > [<Response [301]>, <Response [301]>, <Response [301]>]\r\n\r\nNote that this functions like the patched version, but r.history will show the overwritten value for the second item -- it can be hard to see what's going on. This could be replaced with another object that inherits from Response, but that gets into even more code.\r\n\r\nSo the TLDR is that the proposed functionality would allow users to elegantly support some common scenarios while also adhering to limits set by max_redirects:\r\n\r\n• follow bad shorteners (location but HTTP_OK)\r\n• act upon html meta-refresh tags, or perhaps follow a canonical?\r\n• see a clear and concise url history (don't break/fake the history)\r\n", "So, I'm inclined to call this a \"niche\" issue: while I'm sure it's common in some uses, in general it is very uncommon.\r\n\r\nI think we could consider solving the wacky \"200 with Location header\" as part of requests' redirect handling code. But I'm disinclined to give arbitrary hooks the option to trigger further redirects. It seems like additional code that doesn't meet the requirements of broad utility. In your second case, writing the code external to Requests, that code doesn't seem too unreasonable.\r\n\r\nSo I think I'm a tentative -0.5 on the idea of the bulk API change, but maybe -0 on the 200-with-Location if we can get some citations about why the hell browsers allow that and whether there is a threat model we need to consider there.", "So, we've talked several times of refactoring our redirect handling into more discrete parts. If we strike the right balance, the user could override the logic that determines whether or not there's a redirect to follow. I don't think we need to start adding headers as response attributes. We've already shot those proposals down several times.", "regarding the comment from @lukasa on general utility (not the suggested implementation): i disagree about broad utility of following non-location redirects. i know this means allowing users to operate on \"html\", but please consider that many consumers will eventually consume the HTML redirect (which could be a meta-refresh, rel=\"canonical\", type=\"og:url\" or several others). it's not a niche use, but a common one.\r\n\r\nregarding the implementation details comment from @sigmavirus24: I do agree. stashing headers was a way to not suggest a larger patch.\r\n\r\ni was a bit surprised the redirect handling used a `while` loop to create a generator that is immediately consumed, and then just overrides the request. i had looked through the commits and tickets, and it seems like approach was dictated by earlier api behaviors that no longer exist (or I haven't seen).", "> but please consider that many consumers will eventually consume the HTML redirect (which could be a meta-refresh, rel=\"canonical\", type=\"og:url\" or several others). it's not a niche use, but a common one.\r\n\r\nBut Requests does *nothing* with HTML nor will it ever. Dealing with the HTML should be the user's choice. If something like this is *that* common, the [toolbelt](/sigmavirus24/requests-toolbelt) could grow some functionality for it, but I doubt that would be without extra issues.\r\n\r\n> stashing headers was a way to not suggest a larger patch.\r\n\r\nI appreciate that. Sometimes smaller patches are significantly worse in impact than larger ones. :)\r\n\r\n> i was a bit surprised the redirect handling used a while loop to create a generator that is immediately consumed, and then just overrides the request. i had looked through the commits and tickets, and it seems like approach was dictated by earlier api behaviors that no longer exist (or I haven't seen).\r\n\r\nIt is an API that was solidified when Kenneth rewrote Requests overnight and released v1.0. We stick to that API so as to not break users currently relying on it. We can expand that particular bit though and have done so in the past where it makes sense.", "I have no problem with achieving pluggability via a refactor of the redirect following logic. For example, it could be enough to update `SessionRedirectMixin` with a method `get_redirect_target` which we use to determine if a response needs us to redirect, and then call that.", "I can try to do a PR for that.", "I'd definitely be interested in seeing one. ", "Just to be clear - I'm not suggesting that `requests` act on HTML data or natively support edge cases with redirects. Dealing with that stuff is opening Pandora's Box -- new issues can (and will) keep popping up and needing support, and many use cases won't care. Personally, I have one application that requires strict HTTP compliance and another one that doesn't care how malformed a response is, and aggressively wants/needs to consume it. I wouldn't want to slow down one of my applications with the overhead of code needed for the other. That stuff doesn't belong in the core library.\r\n\r\nMy intent is just to make it easier for users to be able to find and handle issues as they arise. The existing \"hooks\" processing seemed like a natural choice, as it would allow users to write custom solutions as edge cases are discovered and immediately deploy and share them.", "Still working out details, but how does this general approach look:\r\n\r\n    class SessionRedirectMixin(object):\r\n    +    def get_redirect_target(self, resp):\r\n    +        \"\"\"Receives a Response. Returns a redirect URI or `None`\"\"\"\r\n    +        if resp.is_redirect:\r\n    +            return resp.headers['location']\r\n    +        return None\r\n\r\n        def resolve_redirects(self, resp, req, stream=False, timeout=None,\r\n        ...\r\n        -    while resp.is_redirect:\r\n        +    while self.get_redirect_target(resp):\r\n        ...\r\n        -        url = resp.headers['location']\r\n        +        url = self.get_redirect_target(resp)\r\n\r\nThat would enable pluggability via:\r\n\r\n    class CustomSessionRedirectMixin(object):\r\n\r\n        def get_redirect_target(self, resp):\r\n            \"custom override\"\r\n            pass\r\n\r\n    class CustomSession(SessionRedirectMixinCustom, requests.Session):\r\n        pass\r\n\r\n    r = s.get('http://example.com')\r\n\r\n", "So *broadly* I think this is fine. I'd be inclined to want to avoid calling `get_redirect_target` multiple times, but yes, I can get behind that general API shape.", "I don't like processing `get_redirect_target` twice either. The result needs to be consulted twice: once to determine if a redirect condition exists, and again to grab the result. Caching the result in the Session is too messy - a history cleanup would be needed. The best place to stash the calculated value would be on the `Response` object. (I thought about setting `Response._redirect_uri`) but I wanted to avoid touching that object.\r\n\r\nThis is one of those rare situations where I actually miss Perl. (it supports assignments like `while url = self.get_redirect_target(resp):`)\r\n\r\nI'll spend a few more days thinking over this. ", "It shouldn't need to be cached: it just needs to be obtained at the bottom of the while loop (and of course once at the start of the generator).", "oh, that is so obvious. how did i miss that! thanks.", "Closing this, it was merged." ]
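The shape discussed in this thread -- a single `get_redirect_target` hook that both decides whether a redirect exists and supplies the target URL -- makes overrides like the broken "200 with Location" shorteners a one-method subclass. A minimal self-contained sketch follows; the `Response` stand-in and `LenientRedirectMixin` are illustrative, not actual requests classes:

```python
class Response:
    # Tiny stand-in for requests.Response, just enough for the demo.
    def __init__(self, status_code, headers):
        self.status_code = status_code
        self.headers = headers

    @property
    def is_redirect(self):
        return ("location" in self.headers
                and self.status_code in (301, 302, 303, 307, 308))


class SessionRedirectMixin:
    def get_redirect_target(self, resp):
        """Single decision point: return a URL to follow, or None to stop."""
        if resp.is_redirect:
            return resp.headers["location"]
        return None


class LenientRedirectMixin(SessionRedirectMixin):
    # Override for the misbehaving shorteners above: also follow a 200
    # response that (against spec) carries a Location header.
    def get_redirect_target(self, resp):
        target = super().get_redirect_target(resp)
        if target is None and resp.status_code == 200:
            target = resp.headers.get("location")
        return target


strict, lenient = SessionRedirectMixin(), LenientRedirectMixin()
bad = Response(200, {"location": "http://example.com/final"})
assert strict.get_redirect_target(bad) is None
assert lenient.get_redirect_target(bad) == "http://example.com/final"
assert lenient.get_redirect_target(Response(301, {"location": "/next"})) == "/next"
```

Because the redirect-following loop consults only this one method, the override never touches response history or fakes status codes, which is exactly the pluggability the thread converges on.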
https://api.github.com/repos/psf/requests/issues/3836
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3836/labels{/name}
https://api.github.com/repos/psf/requests/issues/3836/comments
https://api.github.com/repos/psf/requests/issues/3836/events
https://github.com/psf/requests/pull/3836
203,736,052
MDExOlB1bGxSZXF1ZXN0MTAzNTYzNTE0
3,836
pin pipenv until kennethreitz/pipenv#90 is resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-27T20:45:28Z
2021-09-07T00:06:40Z
2017-01-27T20:55:07Z
MEMBER
resolved
Fixing build for now until we can properly parse all dependencies in pipenv.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3836/reactions" }
https://api.github.com/repos/psf/requests/issues/3836/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3836.diff", "html_url": "https://github.com/psf/requests/pull/3836", "merged_at": "2017-01-27T20:55:07Z", "patch_url": "https://github.com/psf/requests/pull/3836.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3836" }
true
[ "## [Current coverage](https://codecov.io/gh/kennethreitz/requests/pull/3836?src=pr) is 89.09% (diff: 100%)\n> Merging [#3836](https://codecov.io/gh/kennethreitz/requests/pull/3836?src=pr) into [master](https://codecov.io/gh/kennethreitz/requests/branch/master?src=pr) will increase coverage by **0.05%**\n\n```diff\n@@ master #3836 diff @@\n==========================================\n Files 15 15 \n Lines 1870 1870 \n Methods 0 0 \n Messages 0 0 \n Branches 0 0 \n==========================================\n+ Hits 1665 1666 +1 \n+ Misses 205 204 -1 \n Partials 0 0 \n```\n\n> Powered by [Codecov](https://codecov.io?src=pr). Last update [23057db...808e4c6](https://codecov.io/gh/kennethreitz/requests/compare/23057dbc22d08ee831198f287d11d750f200e092...808e4c62dc54ec7a10197038d56fb70f738b19cf?src=pr)", "Thanks @nateprewitt!" ]
https://api.github.com/repos/psf/requests/issues/3835
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3835/labels{/name}
https://api.github.com/repos/psf/requests/issues/3835/comments
https://api.github.com/repos/psf/requests/issues/3835/events
https://github.com/psf/requests/pull/3835
203489835
MDExOlB1bGxSZXF1ZXN0MTAzMzkwMDY3
3835
Only send HTTPDigestAuth on 4xx challenges
{ "avatar_url": "https://avatars.githubusercontent.com/u/1574365?v=4", "events_url": "https://api.github.com/users/mmedal/events{/privacy}", "followers_url": "https://api.github.com/users/mmedal/followers", "following_url": "https://api.github.com/users/mmedal/following{/other_user}", "gists_url": "https://api.github.com/users/mmedal/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mmedal", "id": 1574365, "login": "mmedal", "node_id": "MDQ6VXNlcjE1NzQzNjU=", "organizations_url": "https://api.github.com/users/mmedal/orgs", "received_events_url": "https://api.github.com/users/mmedal/received_events", "repos_url": "https://api.github.com/users/mmedal/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mmedal/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mmedal/subscriptions", "type": "User", "url": "https://api.github.com/users/mmedal", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2017-01-26T21:10:55Z
2021-09-07T00:06:40Z
2017-01-29T08:15:36Z
CONTRIBUTOR
resolved
Resolves: #3772
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3835/reactions" }
https://api.github.com/repos/psf/requests/issues/3835/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3835.diff", "html_url": "https://github.com/psf/requests/pull/3835", "merged_at": "2017-01-29T08:15:36Z", "patch_url": "https://github.com/psf/requests/pull/3835.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3835" }
true
[ "@Lukasa I believe there were issues with ONLY doing 401 as noted in #3772. The related RFCs allow for the WWW-Authenticate header in a non-401 response which the client SHOULD respond to if received. I think the conclusion we came to is to support 4XX response codes.", "I believe @nateprewitt is correct. It seemed to me that the conclusion of the discussion was to support 4XX response codes.", "Done!", "Might I also suggest renaming the hook to something like handle_4xx_auth_challenge?", "> Might I also suggest renaming the hook to something like handle_4xx_auth_challenge?\r\n\r\nI think I'd rather not. Most auth plugins use the same method name for consistency. But I'm not firmly set in that opinion.", "Hrm. @mmedal, can you rebase on top of the current master? We seem to be having some build problems and I want to check whether they're ones we've seen before and fixed, or new ones.", "I triggered a new build (my branch is up to date), but it looks like you're still having build problems.", "@Lukasa this is a bug in pipenv (kennethreitz/pipenv#90) that may not have an immediate solution. It may be best to pin Requests' dependency for a bit. `pipenv==3.1.9` should work for now.", "@nateprewitt I'll merge a PR that does that. 😁", "On it!", "Retriggering build.", "## [Current coverage](https://codecov.io/gh/kennethreitz/requests/pull/3835?src=pr) is 89.05% (diff: 100%)\n> Merging [#3835](https://codecov.io/gh/kennethreitz/requests/pull/3835?src=pr) into [master](https://codecov.io/gh/kennethreitz/requests/branch/master?src=pr) will increase coverage by **0.01%**\n\n```diff\n@@ master #3835 diff @@\n==========================================\n Files 15 15 \n Lines 1870 1873 +3 \n Methods 0 0 \n Messages 0 0 \n Branches 0 0 \n==========================================\n+ Hits 1665 1668 +3 \n Misses 205 205 \n Partials 0 0 \n```\n\n> Powered by [Codecov](https://codecov.io?src=pr). 
Last update [d3c97dd...8a58427](https://codecov.io/gh/kennethreitz/requests/compare/d3c97dda1101cc579e4300212929abb9cb43b331...8a58427d8aa73d6af81b6e5f6fd934b3386ef3de?src=pr)" ]
https://api.github.com/repos/psf/requests/issues/3834
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3834/labels{/name}
https://api.github.com/repos/psf/requests/issues/3834/comments
https://api.github.com/repos/psf/requests/issues/3834/events
https://github.com/psf/requests/issues/3834
203343953
MDU6SXNzdWUyMDMzNDM5NTM=
3834
Strict-Transport-Security header not completely submitted
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-26T11:01:53Z
2021-09-08T12:01:08Z
2017-01-26T11:12:14Z
NONE
resolved
I'm trying to get the strict-transport-security header in python3 with requests. My code looks like the following: `import requests r = requests.get("https://%s" % domain) print(r.headers)` My problem is that the strict-transport-security header does not contain all data. The real header looks like ```max-age=31536010; includeSubDomains; preload```, but the header by r.headers['strict-transport-security'] only contains ```max-age=31536010; includeSubDomains```, not the preload which is needed by my application. Tested on FreeBSD 11.0 and Mint Rosa and Arch 4.8.13
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3834/reactions" }
https://api.github.com/repos/psf/requests/issues/3834/timeline
null
completed
null
null
false
[ "Requests does not amend the headers in any way, so you get what we received. Are there any proxies or other intermediaries which may be amending headers?" ]
https://api.github.com/repos/psf/requests/issues/3833
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3833/labels{/name}
https://api.github.com/repos/psf/requests/issues/3833/comments
https://api.github.com/repos/psf/requests/issues/3833/events
https://github.com/psf/requests/issues/3833
203325633
MDU6SXNzdWUyMDMzMjU2MzM=
3833
bad handshake - requests.exceptions.SSLError
{ "avatar_url": "https://avatars.githubusercontent.com/u/4057835?v=4", "events_url": "https://api.github.com/users/filipposantovito/events{/privacy}", "followers_url": "https://api.github.com/users/filipposantovito/followers", "following_url": "https://api.github.com/users/filipposantovito/following{/other_user}", "gists_url": "https://api.github.com/users/filipposantovito/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/filipposantovito", "id": 4057835, "login": "filipposantovito", "node_id": "MDQ6VXNlcjQwNTc4MzU=", "organizations_url": "https://api.github.com/users/filipposantovito/orgs", "received_events_url": "https://api.github.com/users/filipposantovito/received_events", "repos_url": "https://api.github.com/users/filipposantovito/repos", "site_admin": false, "starred_url": "https://api.github.com/users/filipposantovito/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/filipposantovito/subscriptions", "type": "User", "url": "https://api.github.com/users/filipposantovito", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2017-01-26T09:29:29Z
2021-09-08T12:01:07Z
2017-01-26T09:40:36Z
NONE
resolved
Hi, I read and tried what's in other threads without success. This is the error: ```bash python -c "import requests; requests.get('https://acciseonline.agenziadogane.it')" ``` ```python Traceback (most recent call last): File "<string>", line 1, in <module> File "/home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages/requests/api.py", line 70, in get return request('get', url, params=params, **kwargs) File "/home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages/requests/api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "/home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages/requests/adapters.py", line 497, in send raise SSLError(e, request=request) requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'SSL23_GET_SERVER_HELLO', 'sslv3 alert handshake failure')],)",) ``` it runs inside a virtualenv ```bash (test_ssl_sito_dogana) filippo@savage:~/src/venvs/test_ssl_sito_dogana$ which python /home/filippo/src/venvs/test_ssl_sito_dogana/bin/python (test_ssl_sito_dogana) filippo@savage:~/src/venvs/test_ssl_sito_dogana$ pip --version pip 9.0.1 from /home/filippo/src/venvs/test_ssl_sito_dogana/local/lib/python2.7/site-packages (python 2.7) ``` here the pip freeze output ```bash (test_ssl_sito_dogana) filippo@savage:~/src/venvs/test_ssl_sito_dogana$ pip freeze | sort appdirs==1.4.0 backports.shutil-get-terminal-size==1.0.0 cffi==1.9.1 configparser==3.5.0 cryptography==1.7.1 decorator==4.0.11 elpy==1.13.0 enum34==1.1.6 flake8==3.2.1 idna==2.2 importmagic==0.1.7 ipaddress==1.0.18 ipdb==0.10.2 ipython==5.1.0 
ipython-genutils==0.1.0 jedi==0.9.0 mccabe==0.5.3 ndg-httpsclient==0.4.2 nose==1.3.7 packaging==16.8 pathlib2==2.2.1 pep8==1.7.0 pexpect==4.2.1 pickleshare==0.7.4 pkg-resources==0.0.0 prompt-toolkit==1.0.9 ptyprocess==0.5.1 pyasn1==0.1.9 pycodestyle==2.2.0 pycparser==2.17 pyflakes==1.5.0 Pygments==2.2.0 pyOpenSSL==16.2.0 pyparsing==2.1.10 requests==2.13.0 scandir==1.4 simplegeneric==0.8.1 six==1.10.0 traitlets==4.3.1 wcwidth==0.1.7 ``` and here some openssl info: ```bash (test_ssl_sito_dogana) filippo@savage:~/src/venvs/test_ssl_sito_dogana$ python -c "import ssl; print ssl.OPENSSL_VERSION" OpenSSL 1.0.2g 1 Mar 2016 ``` ```bash (test_ssl_sito_dogana) filippo@savage:~/src/venvs/test_ssl_sito_dogana$ openssl s_client -connect acciseonline.agenziadogane.it:443 CONNECTED(00000003) depth=3 C = SE, O = AddTrust AB, OU = AddTrust External TTP Network, CN = AddTrust External CA Root verify return:1 depth=2 C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Certification Authority verify return:1 depth=1 C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Domain Validation Secure Server CA verify return:1 depth=0 OU = Domain Control Validated, OU = SSL, CN = acciseonline.agenziadogane.it verify return:1 --- Certificate chain 0 s:/OU=Domain Control Validated/OU=SSL/CN=acciseonline.agenziadogane.it i:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Domain Validation Secure Server CA 1 s:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Domain Validation Secure Server CA i:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority 2 s:/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority i:/C=SE/O=AddTrust AB/OU=AddTrust External TTP Network/CN=AddTrust External CA Root 3 s:/C=SE/O=AddTrust AB/OU=AddTrust External TTP Network/CN=AddTrust External CA Root i:/C=SE/O=AddTrust AB/OU=AddTrust 
External TTP Network/CN=AddTrust External CA Root --- Server certificate -----BEGIN CERTIFICATE----- MIIFczCCBFugAwIBAgIQc2nmdmmY4REjS+9DUxt9NjANBgkqhkiG9w0BAQsFADCB kDELMAkGA1UEBhMCR0IxGzAZBgNVBAgTEkdyZWF0ZXIgTWFuY2hlc3RlcjEQMA4G [...] BFDOspLDTFNZ4/WmvfSS/VGZhbqewlpyD+GCYvdiiOrrhxX7cOXkQrBR0ckEypX2 qWl7dMhwlA== -----END CERTIFICATE----- subject=/OU=Domain Control Validated/OU=SSL/CN=acciseonline.agenziadogane.it issuer=/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Domain Validation Secure Server CA --- No client certificate CA names sent --- SSL handshake has read 5585 bytes and written 619 bytes --- New, TLSv1/SSLv3, Cipher is RC4-SHA Server public key is 2048 bit Secure Renegotiation IS supported Compression: NONE Expansion: NONE No ALPN negotiated SSL-Session: Protocol : TLSv1 Cipher : RC4-SHA Session-ID: FE200000E6B4995A513CC8625648B6F13F32666D5858585812C08958DF770000 Session-ID-ctx: Master-Key: 35260147D2EEB4B465892E426BBD8504190DDC895FFA894DE0079A40C50247AF1080C1FF792D8E255AF2CD6F2A04087E Key-Arg : None PSK identity: None PSK identity hint: None SRP username: None Start Time: 1485422610 Timeout : 300 (sec) Verify return code: 0 (ok) --- ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3833/reactions" }
https://api.github.com/repos/psf/requests/issues/3833/timeline
null
completed
null
null
false
[ "Hi there!\r\n\r\nThis is not the first time this website has been reported to us as a problem. However, I should note for the future that [this website has quite possibly the worst TLS configuration I have ever seen](https://www.ssllabs.com/ssltest/analyze.html?d=acciseonline.agenziadogane.it&hideResults=on).\r\n\r\nRequests refuses to talk to this server in the default configuration because it cannot establish a secure connection: we don't support any of the ciphers that this server does, because all of them are weak. You can get this to work by [enabling support for 3DES as shown in this answer](https://github.com/kennethreitz/requests/issues/3774#issuecomment-267871876), but I *strongly recommend* you don't do that, and that instead you put pressure on your governmental representatives to fix this terrible TLS configuration.", "thank you for your response. I'll try to enable 3DES.\r\n\r\n> but I strongly recommend you don't do that, and that instead you put pressure on your governmental representatives to fix this terrible TLS configuration.\r\n\r\nhave you noticed the '.it' at the end of the url? Our government/parliament isn't able to produce a electoral law: it's impossible to 'put pressure' on anyone.... Right now no one cares about IT services... To be honest no one cares about the people, too... it's a sad story....\r\n", "I have, but it's not really my place to make judgements about the Italian electoral system. The best I can really do is point out which organisation needs to take action. =(", "the following code works. Thank you for your support. 
\r\n\r\n```python\r\nimport requests\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\n\r\n# This is the 2.11 Requests cipher string.\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, *args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n return super(DESAdapter, self).init_poolmanager(*args, **kwargs)\r\n\r\n\r\nif __name__ == '__main__':\r\n s = requests.Session()\r\n s.mount('https://acciseonline.agenziadogane.it', DESAdapter())\r\n print s.get('https://acciseonline.agenziadogane.it')\r\n```", "anyway how did you spot that 3des was missing and necessary? I'm asking just to understand the topic and avoid to re-ask for similar problems.", "So, I have a moderate advantage in that I do maintenance of the TLS configuration in Requests.\r\n\r\nHowever, the Requests cipher string lives at `requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS`. 
You can check what that turns out to be on your system in most cases by running\r\n\r\n```\r\nopenssl ciphers `python -c \"import requests; print(requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS)\"`\r\n```\r\n\r\nOf course, that produces a lot of output, so the compressed form used in Requests may be more helpful, which is:\r\n\r\n```\r\n'ECDH+AESGCM:ECDH+CHACHA20:DH+AESGCM:DH+CHACHA20:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:RSA+AESGCM:RSA+AES:!aNULL:!eNULL:!MD5'\r\n```\r\n\r\nThis translates to, in order of preference:\r\n\r\n- (ECDH+AESGCM) Any cipher suite using Elliptic-Curve Diffie-Hellman key exchange and AES in GCM mode; or\r\n- (ECDH+CHACHA20) Any cipher suite using Elliptic-Curve Diffie-Hellman key exchange and ChaCha20 (not widely available in OpenSSL yet); or\r\n- (DH+AESGCM) Any cipher suite using regular Diffie-Hellman key exchange and AES in GCM mode; or\r\n- (DH+CHACHA20) Any cipher suite using regular Diffie-Hellman key exchange and ChaCha20 (same caveat as before); or\r\n- (ECDH+AES256) Any cipher suite using Elliptic-Curve Diffie-Hellman key exchange and 256-bit AES in any mode; or\r\n- (DH+AES256) Any cipher suite using regular Diffie-Hellman key exchange and 256-bit AES in any mode; or\r\n- (ECDH+AES128) Any cipher suite using Elliptic-Curve Diffie-Hellman key exchange and 128-bit AES in any mode; or\r\n- (DH+AES) Any cipher suite using regular Diffie-Hellman key exchange and any-strength AES in any mode (this basically covers the same as DH+AES128, it's just set this way for completion); or\r\n- (RSA+AESGCM) Any cipher suite using RSA key exchange and AES in GCM mode; or\r\n- (RSA+AES) Any cipher suite using RSA key exchange and AES in any mode;\r\n- (!aNULL) **Also** remove any cipher suites that may have made it into that list that have no authentication (there shouldn't be any, but just to be safe)\r\n- (!eNULL) **Also** remove any cipher suites that may have made it into that list that have no encryption (there shouldn't be any, but 
just to be safe)\r\n- (!MD5) **Also** remove any cipher suites that may have made it into that list that use MD5 for their MAC.\r\n\r\nWhat's noticeable about this list? Well, it contains only two stream ciphers: AES in its appropriate stream modes, and ChaCha20. That's because we removed 3DES in the wake of the news that it's insecure for bulk transfer, as we do not control whether or not our users perform bulk transfer so it should be removed. Additionally, 3DES is very slow compared to AES and ChaCha20, so it was usually a sub-optimal choice.", "clear. But let me elaborate on the theme.\r\n\r\nFrom https://www.ssllabs.com/ssltest/analyze.html?d=acciseonline.agenziadogane.it&hideResults=on you got TLS_RSA_WITH_3DES_EDE_CBC_SHA is a valid cipher for the site. Now it's obvious that we need to instruct requests/openssl to use that cipher. \r\n\r\nThe way is to add to the ciphers string the following:\r\n\r\n- ECDH+3DES\r\n- DH+3DES\r\n- RSA+3DES\r\n\r\nwhere are these strings listed and documented? I've found a partial answer here: https://wiki.openssl.org/index.php/Manual:Ciphers(1) \r\n\r\nlooking for TLS_RSA_WITH_3DES_EDE_CBC_SHA in that page you get:\r\n\r\n> TLS_RSA_WITH_3DES_EDE_CBC_SHA **DES-CBC3-SHA**\r\n\r\nunder **TLS v1.0 cipher suites**.\r\n\r\nbut how do you know that **ECDH+3DES** or **DH+3DES** or **RSA+3DES** is the correct string to add in order to solve my problem?", "You don't need to add all three of those: that's just what used to be the Requests cipher string. To add *literally just that cipher suite* you could add just `DES-CBC3-SHA`.\r\n\r\nHowever, the others are *groups* of cipher suites. Again, they use the weird OpenSSL grouping language. \"ECDH+3DES\" means any cipher suite that uses Elliptic-Curve Diffie-Hellman key exchange and 3DES as the stream cipher. \"DH+3DES\", same deal: regular Diffie-Hellman and 3DES. 
Finally, \"RSA+3DES\" is RSA key exchange and 3DES.\r\n\r\nIn this case, you can just add the specific string you need (DES-CBC3-SHA). The string you added is a bit more liberal, because it includes forward-secret versions of 3DES that this server doesn't use. That's probably ok, and it's written that way to be a bit more defensive in case other users hit 3DES issues: it basically reverts to the Requests behaviour of 2.11, before we removed 3DES from our supported cipher list.\r\n\r\nUnfortunately, the only way to understand what cipher strings do is to understand how a TLS cipher suite is constructed, and to understand the way OpenSSL thinks about them. That's a somewhat complex issue, but let me break it down a bit. BTW, some of this has changed in the upcoming TLSv1.3, so bear that in mind.\r\n\r\nIn TLSv1.2 and earlier, all cipher suites have got formal IANA-registered names like TLS_RSA_WITH_3DES_EDE_CBC_SHA. These names uniquely identify all the constituent parts of a cipher suite. A cipher suite is made up of the following parts:\r\n\r\n- A key exchange mechanism.\r\n- An authentication/signing mechanism to validate the certificate. This is not actually a free-choice: the server has to use one that matches the type of private key it has. For example, GitHub.com's certificate has an RSA private key, so it must always use a cipher suite with an RSA signature part.\r\n- A stream cipher\r\n- A key size\r\n- A MAC\r\n\r\nThe name encodes all of this. For example, consider the following names and their breakdowns:\r\n\r\n1. TLS_RSA_WITH_3DES_EDE_CBC_SHA:\r\n - Key Exchange: RSA\r\n - Signing: RSA (this means this must be an RSA certificate). Note that RSA is a unique case in that if it is used as a key exchange mechanism it is also used as the signing mechanism, so only needs to be specified once)\r\n - Stream Cipher: 3DES in EDE-CBC mode\r\n - Key Size: Not specified, so it's the default for 3DES, which is 168 bits (3x56)\r\n - MAC: SHA1\r\n2. 
TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384\r\n - Key Exchange: Ephemeral Elliptic-Curve Diffie-Hellman\r\n - Signing: Elliptic-Curve DSA (not common at the moment)\r\n - Stream Cipher: AES in GCM mode (the best currently-deployed AES mode)\r\n - Key Size: 256-bits\r\n - MAC: Actually, doesn't have one.\r\n\r\n You see, I lied a little bit. Starting in TLSv1.2 we got stream ciphers with AEAD modes. These modes don't require a MAC (or, more properly, the MAC functionality is integrated with the cipher mode). However, these modes *do* change the way the pseudo-random numbers are generated in TLS, so the part of the name that would normally have the MAC instead have the PRF algorithm. So...\r\n\r\n - PRF: SHA384 (For non-AEAD cipher suites, the PRF is always SHA1 and MD5).\r\n3. TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256\r\n - Key Exchange: Ephemeral Elliptic-Curve Diffie-Hellman\r\n - Signing: RSA\r\n - Stream Cipher: ChaCha20 using Poly1305 authentication\r\n - Key Size: 256-bits\r\n - MAC: None, again, ChaCha20/Poly1305 is an AEAD cipher.\r\n - PRF: SHA256.\r\n\r\nAs you can see, this is not a totally simple idea. However, the argument goes that you can construct \"groups\" of cipher suites based on their common functionality. For example, if you like ChaCha20 you could construct a group of cipher suites all based on ChaCha20. This would be any suite that uses ChaCha20 as its stream cipher. In this case, as of TLSv1.2, that list is:\r\n\r\n- TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_PSK_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_DHE_PSK_WITH_CHACHA20_POLY1305_SHA256\r\n- TLS_RSA_PSK_WITH_CHACHA20_POLY1305_SHA256\r\n\r\nOpenSSL lets you specify these either one-by-one using their full OpenSSL name (e.g. the first one is ECDHE-ECDSA-CHACHA20-POLY1305), or as a group by just using \"CHACHA20\". 
If you wanted only the ones that use ECDHE for key exchange, you can join that requirement with a +: \"ECDH+CHACHA20\". Maybe you want to exclude the PSK ones (often a good idea): in that case, you can say \"CHACHA20:!PSK\".\r\n\r\nThe final wrinkle in here is that some of OpenSSL's full names seem a lot shorter than the IANA name, and are missing parts. Consider again our friend TLS_RSA_WITH_3DES_EDE_CBC_SHA. As you noted, OpenSSL's name for this is DES-CBC3-SHA. This only specified the stream cipher, the mode, and the MAC. Why is it missing this other stuff?\r\n\r\nThe answer is that OpenSSL has *defaults*: stuff that is assumed. If you don't name the key exchange mechanism, OpenSSL assumes RSA. That means that if you tried to see what OpenSSL uses for \"ECDH+3DES\", you'll find that DES-CBC3-SHA isn't in there. It's only present for \"RSA+3DES\".\r\n\r\nEssentially the OpenSSL cipher string language is an extremely complex query system. It is moderately difficult to understand, and I recommend playing about with the `openssl ciphers` command to get a feel for what's going on.", "wow. things are a lot clearer now. I think this answer should live on requests' main site.\r\n\r\nthank you very much!", "Thanks!\r\n\r\nWhile I'm glad you liked the answer, however, I don't think Requests should document it. Touching this area of the code is very much like opening the emergency exit on an aircraft in flight. Sure, there *might* be a good reason to do it, and it should be possible, but the average passenger should never even realise that it can be done, let alone know how.\r\n\r\nAs much as possible we don't want users to have to touch this code at all. That's why changing this code is tricky (and undocumented). The term I like to use is \"attractive nuisance\": people who encounter errors when running their code will often start flicking random switches in an attempt to get it to work. `verify=False` is our one concession to this case in the public API. 
But we very much don't want the cipher suite list to be a part of this because there are too many bad cipher suites, and the list gets longer over time.\r\n\r\nThe Requests and urllib3 cipher suite list is maintained by a small number of people who keep a very careful eye on the crypto research and keep it up-to-date. We expand to use new believed-safe ciphers, and we proactively remove ones that are discovered to be bad. That is generally less-likely to happen if users are overriding our choices, so we don't want to make it too easy for users to do that unless they really, really need to." ]
https://api.github.com/repos/psf/requests/issues/3832
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3832/labels{/name}
https://api.github.com/repos/psf/requests/issues/3832/comments
https://api.github.com/repos/psf/requests/issues/3832/events
https://github.com/psf/requests/pull/3832
203135045
MDExOlB1bGxSZXF1ZXN0MTAzMTQxODAx
3832
fixing codecov with pipenv
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-25T15:32:06Z
2021-09-07T00:06:41Z
2017-01-25T15:40:30Z
MEMBER
resolved
codecov broke with the introduction of pipenv because the dependency isn't installed outside of the virtualenv. Running it inside of pipenv fixes this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3832/reactions" }
https://api.github.com/repos/psf/requests/issues/3832/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3832.diff", "html_url": "https://github.com/psf/requests/pull/3832", "merged_at": "2017-01-25T15:40:30Z", "patch_url": "https://github.com/psf/requests/pull/3832.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3832" }
true
[ "## [Current coverage](https://codecov.io/gh/kennethreitz/requests/pull/3832?src=pr) is 89.03% (diff: 100%)\n\n\n> No coverage report found for **master** at b92058b.\n\n> Powered by [Codecov](https://codecov.io?src=pr). Last update [b92058b...795106c](https://codecov.io/gh/kennethreitz/requests/compare/b92058b6fe68a3b114b872d1dba072c115af413b...795106c1885f19c84cefe037914a12d37ddd6248?src=pr)", "LGTM, thanks @nateprewitt!" ]
https://api.github.com/repos/psf/requests/issues/3831
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3831/labels{/name}
https://api.github.com/repos/psf/requests/issues/3831/comments
https://api.github.com/repos/psf/requests/issues/3831/events
https://github.com/psf/requests/issues/3831
203,021,037
MDU6SXNzdWUyMDMwMjEwMzc=
3,831
AttributeError after gevent.monkey.patch_select()
{ "avatar_url": "https://avatars.githubusercontent.com/u/19982?v=4", "events_url": "https://api.github.com/users/sublee/events{/privacy}", "followers_url": "https://api.github.com/users/sublee/followers", "following_url": "https://api.github.com/users/sublee/following{/other_user}", "gists_url": "https://api.github.com/users/sublee/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sublee", "id": 19982, "login": "sublee", "node_id": "MDQ6VXNlcjE5OTgy", "organizations_url": "https://api.github.com/users/sublee/orgs", "received_events_url": "https://api.github.com/users/sublee/received_events", "repos_url": "https://api.github.com/users/sublee/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sublee/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sublee/subscriptions", "type": "User", "url": "https://api.github.com/users/sublee", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2017-01-25T06:12:34Z
2021-09-08T12:01:04Z
2017-01-25T10:04:41Z
NONE
resolved
Requests-2.13 fails with `AttributeError` because there's no `select.epoll()` when `gevent` monkey-patched `select`. This issue is actually a bug from urllib3-1.20. It depends on `select.epoll()` and it doesn't handle `AttributeError`. But `gevent.monkey.patch_select()` removes `epoll`, `kqueue`, `kevent`, and `devpoll` to make some modules non-blocking. ``` Traceback (most recent call last): ... File "/home/sub/env/local/lib/python2.7/site-packages/requests/sessions.py", line 501, in get return self.request('GET', url, **kwargs) File "/home/sub/env/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/home/sub/env/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/home/sub/env/local/lib/python2.7/site-packages/requests/adapters.py", line 423, in send timeout=timeout File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen conn = self._get_conn(timeout=pool_timeout) File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 241, in _get_conn if conn and is_connection_dropped(conn): File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/util/connection.py", line 27, in is_connection_dropped return bool(wait_for_read(sock, timeout=0.0)) File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 33, in wait_for_read return _wait_for_io_events(socks, EVENT_READ, timeout) File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 22, in _wait_for_io_events with DefaultSelector() as selector: File "/home/sub/env/local/lib/python2.7/site-packages/requests/packages/urllib3/util/selectors.py", line 364, in __init__ self._epoll = select.epoll() AttributeError: 'module' object has no attribute 'epoll' ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/19982?v=4", "events_url": "https://api.github.com/users/sublee/events{/privacy}", "followers_url": "https://api.github.com/users/sublee/followers", "following_url": "https://api.github.com/users/sublee/following{/other_user}", "gists_url": "https://api.github.com/users/sublee/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sublee", "id": 19982, "login": "sublee", "node_id": "MDQ6VXNlcjE5OTgy", "organizations_url": "https://api.github.com/users/sublee/orgs", "received_events_url": "https://api.github.com/users/sublee/received_events", "repos_url": "https://api.github.com/users/sublee/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sublee/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sublee/subscriptions", "type": "User", "url": "https://api.github.com/users/sublee", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3831/reactions" }
https://api.github.com/repos/psf/requests/issues/3831/timeline
null
completed
null
null
false
[ "When are you patching with gevent? urllib3 should absolutely tolerate not having an epoll implementation. The epoll implementation is only chosen when `hasattr(select, 'epoll')` is `True`. I suspect you are importing requests before you patch with gevent. When working with gevent you *must* monkey patch before you import other modules, especially if the monkey patch is going to delete items.\r\n\r\nThere is no really good way for urllib3 to tolerate the possibility that a selector may disappear sometime between import time and class creation time.", "As a more general note, I don't believe that library authors are required to make their libraries resilient in the face of monkey patching: I believe it is incumbent on monkey patch authors and users to make their patches resilient in the face of what libraries do.\r\n\r\nIn this case, urllib3 makes the reasonable decision to check for implementations at import time, because urllib3 rightly judges that it is absurd that we could have an epoll implementation at time *x* but not at time *x + n ∀ n > 0*. epoll implementations don't vanish on systems that aren't being messed with. :wink:\r\n\r\nIn this case, as noted above, the simple fix is to simply monkey patch before you import other modules. This is good practice with gevent as many other modules do import-time feature detection.", "I agree to you. There was something which imports `requests` before monkey-patching. This should not be fixed by Requests own. I close this issue.", "I agree that monkey patching has to be done before everything. I have a django app running with uwsgi+supervisord config. And `gevent.monkey.patch_all()` was the first line of wsgi.py (which is the entry point to the app). But still this exception was thrown. So either adding `--gevent-early-monkey-patch` in the uwsgi config or `gevent.monkey.patch_all(select=False)` in the wsgi.py fixed the error. \r\nI'm not sure that this is relevant to the requests or urllib3 libs but just for someone's information." ]
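The import-order point in the thread above can be simulated without gevent or urllib3 installed: a library that does feature detection once, at import time, keeps its cached choice even if a later monkey patch deletes the attribute it detected. A minimal stdlib-only sketch (the selector names and the `detect_selector` helper are illustrative stand-ins, not urllib3's actual classes):

```python
import types

# Stand-in for the `select` module; on Linux the real one exposes `epoll`.
fake_select = types.SimpleNamespace(epoll=lambda: object())

def detect_selector(mod):
    # Import-time feature detection, simplified: pick a backend once.
    return "epoll" if hasattr(mod, "epoll") else "select"

chosen = detect_selector(fake_select)   # library imported first: sees epoll

# Later, a monkey patch deletes the attribute, as
# gevent.monkey.patch_select() does for epoll/kqueue/devpoll.
del fake_select.epoll

# The cached choice still says "epoll", but using it now fails --
# the same shape as the AttributeError in the traceback above.
try:
    getattr(fake_select, chosen)()
    failed = False
except AttributeError:
    failed = True

# Patching *before* the library inspects the module avoids the problem:
patched_choice = detect_selector(fake_select)

print(chosen, failed, patched_choice)  # epoll True select
```

This is why gevent's documentation insists on calling `monkey.patch_all()` before any other imports.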
https://api.github.com/repos/psf/requests/issues/3830
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3830/labels{/name}
https://api.github.com/repos/psf/requests/issues/3830/comments
https://api.github.com/repos/psf/requests/issues/3830/events
https://github.com/psf/requests/pull/3830
202,968,271
MDExOlB1bGxSZXF1ZXN0MTAzMDI0MjMz
3,830
Fixing Pipfile source
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-24T23:13:07Z
2021-09-07T00:06:41Z
2017-01-25T01:58:00Z
MEMBER
resolved
This fixes the issue that was encountered this morning during the 2.13.0 release. The source specified in the original Pipfile was for an incorrect pypi endpoint. This didn't matter until pipenv 2.6 was released, specifically [cb22a12](https://github.com/kennethreitz/pipenv/commit/cb22a129eae8c8e800e603c38bf1fe04d420fbde), which started using the provided `source` value. @kennethreitz I used `pipenv lock` with pipenv 3.0.0 to generate the new lock file which I'm assuming is what we want here.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3830/reactions" }
https://api.github.com/repos/psf/requests/issues/3830/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3830.diff", "html_url": "https://github.com/psf/requests/pull/3830", "merged_at": "2017-01-25T01:58:00Z", "patch_url": "https://github.com/psf/requests/pull/3830.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3830" }
true
[ "While we're here, it looks like this value is still used in pipenv [here](https://github.com/kennethreitz/pipenv/blob/3fa204b734f67522d7858ab47b1d10263a76dc90/pipenv/_pipfile/api.py#L73) , but I'm not sure if that code is actively being used. I can open a quick PR updating it over there too if needed.", "ah yes, this is the issue. " ]
https://api.github.com/repos/psf/requests/issues/3829
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3829/labels{/name}
https://api.github.com/repos/psf/requests/issues/3829/comments
https://api.github.com/repos/psf/requests/issues/3829/events
https://github.com/psf/requests/issues/3829
202,820,492
MDU6SXNzdWUyMDI4MjA0OTI=
3,829
Session.verify=False ignored when REQUESTS_CA_BUNDLE environment variable is set
{ "avatar_url": "https://avatars.githubusercontent.com/u/137616?v=4", "events_url": "https://api.github.com/users/intgr/events{/privacy}", "followers_url": "https://api.github.com/users/intgr/followers", "following_url": "https://api.github.com/users/intgr/following{/other_user}", "gists_url": "https://api.github.com/users/intgr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/intgr", "id": 137616, "login": "intgr", "node_id": "MDQ6VXNlcjEzNzYxNg==", "organizations_url": "https://api.github.com/users/intgr/orgs", "received_events_url": "https://api.github.com/users/intgr/received_events", "repos_url": "https://api.github.com/users/intgr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/intgr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/intgr/subscriptions", "type": "User", "url": "https://api.github.com/users/intgr", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" } ]
open
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
14
2017-01-24T13:40:18Z
2024-02-12T14:04:41Z
null
NONE
null
One would expect that when the caller explicitly asks to make unverified requests, then the `REQUESTS_CA_BUNDLE` environment variable doesn't affect it. The reality is different, however. ```python import os import requests os.environ['REQUESTS_CA_BUNDLE'] = 'asd.pem' # Must be an existing file r = requests.get('https://self-signed.badssl.com/', verify=False) print(r) # Prints: <Response [200]> session = requests.Session() session.verify = False r = session.get('https://self-signed.badssl.com/') print(r) # Fails: requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749) ```
null
{ "+1": 26, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 26, "url": "https://api.github.com/repos/psf/requests/issues/3829/reactions" }
https://api.github.com/repos/psf/requests/issues/3829/timeline
null
null
null
null
false
[ "Thanks for raising this issue! This is a related issue to #2018: specifically, we prefer the environment to the `Session` value assuming the environment specifies an auth value. The fix is easy, but we've unfortunately ossified around this value, so we can't fix it until v3.\r\n\r\nGiven that we fail-closed here (that is, it's not possible to use this arrangement to force us not to verify when we should), this isn't a security vulnerability, so there is no way we can justify bringing it forward to before v3.\r\n", "OK, thanks for the quick response. Is 3.0.0 coming some time soon or is it just a plan for now?\r\n", "\"soon\" is a bit strong. However, there's a branch that code can be committed to, and there is active work being done on urllib3 v2, which once done will be the catalyst for us to actually ship requests v3.", "For whoever else is struggling with this problem, I created a wrapper class as workaround:\r\n\r\n```python\r\nclass WrappedSession(requests.Session):\r\n    \"\"\"A wrapper for requests.Session to override 'verify' property, ignoring REQUESTS_CA_BUNDLE environment variable.\r\n\r\n    This is a workaround for https://github.com/kennethreitz/requests/issues/3829 (will be fixed in requests 3.0.0)\r\n    \"\"\"\r\n    def merge_environment_settings(self, url, proxies, stream, verify, *args, **kwargs):\r\n        if self.verify is False:\r\n            verify = False\r\n\r\n        return super(WrappedSession, self).merge_environment_settings(url, proxies, stream, verify, *args, **kwargs)\r\n```\r\n", "This issue hit me today. I had to debug a good amount of code to track it down.\r\n\r\nWhere are we now relative to when it was identified back in January?", "No change.", "as my test in win10 python 3.6.4, requests (2.18.3) (not until v3 ??) \r\nurllib3 (1.22) the \r\n```\r\nsession = requests.Session()\r\nsession.verify = False\r\n```\r\nworked, and will prompt a warning message:\r\n```\r\n\\appdata\\local\\programs\\python\\python36\\lib\\site-packages\\urllib3\\connectionpool.py:858: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings\r\n  InsecureRequestWarning)\r\n```\r\nto disable the warning message, just:\r\n```\r\nfrom requests.packages.urllib3.exceptions import InsecureRequestWarning\r\nrequests.packages.urllib3.disable_warnings(InsecureRequestWarning)\r\n```", "The 'fail close, so no security issue' argument is only correct if the verify is set to `False`.\r\n\r\nIf the verify is set to a subset of the global CAs for access to some systems (as a poor mans form of certificate pinning) this fails open. So when used in a library that allows specifiying certificate authorities to allow for e.g. authentication backends that use sessions to allow easier cookie flows, the globally set environments (e.g. to allow access to internet sites in other parts of the program) invalidates this unless the library takes extra precautions.\r\n\r\nFor example:\r\nI allow the usual CAs via the curl/mozilla ca bundle for normal internet sites.\r\nAuthentication is done via OAuth2 against Azure AD and that should be limited to only allow the Baltimore Cybertrust CA used by Azure. \r\nNow i cannot use session.verify for this and have to propagate the flag to all individual requests calls (or use a wrapper around Session as seen above).", "It would be nice if requests could at least throw a warning about this. I just spend 6 hours trying to figure out what is going on, then found out that `REQUESTS_CA_BUNDLE` is set by the daemon executing my test script and to then only find this issue. \r\n:disappointed: ", "It's funny that this \"fix\" is delayed to v3 because it's considered \"breaking change\", yet a huge number of people keep burning hours pinning down this behavior.\r\nI'd say that calls it a bug, if no one expects it, and bug fixing would not be breaking changes...", "We spent 3 hours on this, third party library was set env `REQUESTS_CA_BUNDLE `... It really would be nice to have a warning.", "Since this is fixed in requests 3.* version, I added a mechanism to warn the user via #5816. Please provide feedback on the PR.", "Hello, Is this documented ? [Official documentation](https://requests.readthedocs.io/en/latest/api/#requests.Session.verify) did not mention this behaviour. I lost 2 hours..\r\n" ]
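The precedence bug in this issue can be modeled in a few lines. The sketch below is a simplified stand-in for `Session.merge_environment_settings` in requests 2.x, not the actual library code: the environment bundle is only consulted when the per-request `verify` is `True` or `None`, and the subsequent merge prefers the (now env-populated) request-level value over `Session.verify` — which is why an explicit `verify=False` per call works while `session.verify = False` is silently overridden.

```python
def merge_setting(request_setting, session_setting):
    # The request-level value wins whenever it is explicitly set.
    return session_setting if request_setting is None else request_setting

def effective_verify(request_verify, session_verify, env):
    # Simplified model of the merge order described in the thread:
    # env vars are folded into the *request* slot before the merge.
    if request_verify is True or request_verify is None:
        env_bundle = env.get("REQUESTS_CA_BUNDLE") or env.get("CURL_CA_BUNDLE")
        if env_bundle:
            request_verify = env_bundle
    return merge_setting(request_verify, session_verify)

env = {"REQUESTS_CA_BUNDLE": "asd.pem"}
print(effective_verify(False, None, env))  # False -- explicit kwarg sticks
print(effective_verify(None, False, env))  # asd.pem -- session value is lost
```

The `WrappedSession` workaround in the comments above attacks exactly this ordering by short-circuiting the merge when `self.verify is False`.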
https://api.github.com/repos/psf/requests/issues/3828
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3828/labels{/name}
https://api.github.com/repos/psf/requests/issues/3828/comments
https://api.github.com/repos/psf/requests/issues/3828/events
https://github.com/psf/requests/pull/3828
202,810,858
MDExOlB1bGxSZXF1ZXN0MTAyOTEyODI0
3,828
Blacklist known bad version of pipenv
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
3
2017-01-24T12:56:38Z
2021-09-07T00:06:41Z
2017-01-24T13:13:52Z
CONTRIBUTOR
resolved
We pinned pipenv to release v2.13.0 but in reality, we could have just blacklisted the known bad version. For future us, we now blacklist it.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3828/reactions" }
https://api.github.com/repos/psf/requests/issues/3828/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3828.diff", "html_url": "https://github.com/psf/requests/pull/3828", "merged_at": "2017-01-24T13:13:52Z", "patch_url": "https://github.com/psf/requests/pull/3828.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3828" }
true
[ "I think it's more than just 2.8.0, I can't get a working build on travis after 2.5.0. It looks like something happened there that's breaking the build, checking it out now.", "Odd, 0.2.7 worked just fine locally", "We can blacklist everything between 0.2.5 and 0.2.8 though" ]
https://api.github.com/repos/psf/requests/issues/3827
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3827/labels{/name}
https://api.github.com/repos/psf/requests/issues/3827/comments
https://api.github.com/repos/psf/requests/issues/3827/events
https://github.com/psf/requests/pull/3827
202,806,189
MDExOlB1bGxSZXF1ZXN0MTAyOTA5NDY1
3,827
Release v2.13.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2017-01-24T12:34:11Z
2021-09-07T00:06:42Z
2017-01-24T12:52:23Z
CONTRIBUTOR
resolved
This includes updates to: - when we load idna - urllib3 (updated to version 1.20) - idna (updated to version 2.2) - release publishing machinery
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3827/reactions" }
https://api.github.com/repos/psf/requests/issues/3827/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3827.diff", "html_url": "https://github.com/psf/requests/pull/3827", "merged_at": "2017-01-24T12:52:23Z", "patch_url": "https://github.com/psf/requests/pull/3827.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3827" }
true
[ "It looks like `init` in the Makefile may need `pipenv shell` run to initialize the virtualenv?", "Maybe? pipenv 0.1.14 worked, 0.2.9 doesn't. The changelog is not immediately forthcoming as to what the specific problem is.", "For now we'll just pin it so the builds keep working: @kennethreitz should feel free to remove the pin and investigate when he has some time.", "Just tested with pipenv 0.2.7 locally and that seems to work, so I'm going to say it's just 0.2.8 that's busted. We can blacklist it after this is merged and 2.13.0 is released\r\n", "BAM." ]
https://api.github.com/repos/psf/requests/issues/3826
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3826/labels{/name}
https://api.github.com/repos/psf/requests/issues/3826/comments
https://api.github.com/repos/psf/requests/issues/3826/events
https://github.com/psf/requests/issues/3826
202,292,208
MDU6SXNzdWUyMDIyOTIyMDg=
3,826
Access to the server certificate
{ "avatar_url": "https://avatars.githubusercontent.com/u/15989628?v=4", "events_url": "https://api.github.com/users/Te-k/events{/privacy}", "followers_url": "https://api.github.com/users/Te-k/followers", "following_url": "https://api.github.com/users/Te-k/following{/other_user}", "gists_url": "https://api.github.com/users/Te-k/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Te-k", "id": 15989628, "login": "Te-k", "node_id": "MDQ6VXNlcjE1OTg5NjI4", "organizations_url": "https://api.github.com/users/Te-k/orgs", "received_events_url": "https://api.github.com/users/Te-k/received_events", "repos_url": "https://api.github.com/users/Te-k/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Te-k/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Te-k/subscriptions", "type": "User", "url": "https://api.github.com/users/Te-k", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-21T04:49:00Z
2021-09-08T13:05:26Z
2017-01-21T10:46:42Z
NONE
resolved
Hi, I checked documentation and the stackoverflow questions (like [this](https://stackoverflow.com/questions/16903528/how-to-get-response-ssl-certificate-from-requests-in-python)) and from what I see, it is not possible to get the server certificate information with requests. * Is there any hack to get this info? * Is there any plan to add this info to the answer? If no, consider this as a feature request :+1:
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3826/reactions" }
https://api.github.com/repos/psf/requests/issues/3826/timeline
null
completed
null
null
false
[ "> Is there any hack to get this info?\r\n\r\nYes. If you stream the response (that is, set `stream=True`), then when you get the response back you can go digging for the socket. Depending on the Python version it'll be in slightly different places: on Python 2 it'll be at `response.raw._fp.fp._sock`, on Python 3 it'll be `response.raw._fp.fp.raw._sock`. At this point, for SSL connections, you can call `getpeercert()`.\r\n\r\n> Is there any plan to add this info to the answer?\r\n\r\nThere is not. =)\r\n\r\nUltimately, this is a feature that's of minimal utility to most people. It comes too late to make doing any kind of security validation worthwhile, and if the goal is simply to determine the certificate for a remote peer it is almost always going to be better to use the ssl and socket modules directly, or even the OpenSSL command line tool.\r\n\r\nSo the number of people for whom this feature would provide a benefit is ultimately too low to justify adding it. Sorry!", "Thanks for your quick answer.\r\n\r\nAs I could not find any answer to that question, some other people may find this hack interesting. Is there a place where I could add this for documentation? (Like a \"ugly hack that works\" in the documentation?)\r\n\r\nI noticed that the cert is not available if verify is False, is there any way of still accessing this information even without verifying the certificate?", "I would vastly prefer not to add this hack to the documentation, primarily because we don't promise it'll continue to work.\r\n\r\n> I noticed that the cert is not available if verify is False, is there any way of still accessing this information even without verifying the certificate?\r\n\r\nI'm afraid not. This is a product of the way OpenSSL works: `verify=False` turns into `CERT_NONE`, which causes OpenSSL to not store the certificate if the remote peer presents it. " ]
https://api.github.com/repos/psf/requests/issues/3825
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3825/labels{/name}
https://api.github.com/repos/psf/requests/issues/3825/comments
https://api.github.com/repos/psf/requests/issues/3825/events
https://github.com/psf/requests/issues/3825
202,131,592
MDU6SXNzdWUyMDIxMzE1OTI=
3,825
urllib3 connection pool block is not properly implemented
{ "avatar_url": "https://avatars.githubusercontent.com/u/6129283?v=4", "events_url": "https://api.github.com/users/csala/events{/privacy}", "followers_url": "https://api.github.com/users/csala/followers", "following_url": "https://api.github.com/users/csala/following{/other_user}", "gists_url": "https://api.github.com/users/csala/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/csala", "id": 6129283, "login": "csala", "node_id": "MDQ6VXNlcjYxMjkyODM=", "organizations_url": "https://api.github.com/users/csala/orgs", "received_events_url": "https://api.github.com/users/csala/received_events", "repos_url": "https://api.github.com/users/csala/repos", "site_admin": false, "starred_url": "https://api.github.com/users/csala/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/csala/subscriptions", "type": "User", "url": "https://api.github.com/users/csala", "user_view_type": "public" }
[]
closed
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2024-05-19T18:29:04Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/34", "id": 11073254, "labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels", "node_id": "MI_kwDOABTKOs4AqPbm", "number": 34, "open_issues": 0, "state": "open", "title": "Bankruptcy", "updated_at": "2024-05-20T14:37:16Z", "url": "https://api.github.com/repos/psf/requests/milestones/34" }
12
2017-01-20T12:42:36Z
2024-05-20T14:36:40Z
2024-05-20T14:36:39Z
NONE
null
Current connection pooling in the urllib3 package is not making a proper use of the `block` parameter, and is not using `pool_timeout` for anything. In the current implementation of the [`_get_conn` method](https://github.com/kennethreitz/requests/blob/362da46e9a46da6e86e1907f03014384ab210151/requests/packages/urllib3/connectionpool.py#L219), if `block` is `False` and the pool raises an `Empty` indicating that there are no connections left because all are being used, a new connection is created anyway an later on discarded. Instead of this, the connection attempt should be cancelled with an Exception being raised. Or, at least, there should be the option to indicate that this is the expected behaviour. Otherwise, there is no way to make use of the connection pooling to effectively limit the amount of concurrent outgoing connections without getting the additional ones stuck waiting. Also, pool_timeout is currently getting the default `None` value and there is no way to change it. It should be possible to define it when the adapter is created, like the rest of the parameters, to indicate how long a connection attempt can be left waiting in the queue.
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3825/reactions" }
https://api.github.com/repos/psf/requests/issues/3825/timeline
null
completed
null
null
false
[ "If the suggested changes are discussed and accepted I can implement them.", "> Instead of this, the connection attempt should be cancelled with an Exception being raised.\r\n> \r\n> Otherwise, connection pooling provides not benefit.\r\n\r\nThat is not true.\r\n\r\nThe connection pooling strategy requests uses is what is known as a \"leaky bucket\". Essentially, by default, Requests will store as many connections as it can in the pool. However, if the user is making more concurrent requests than the size of the pool, they will not by default be forced to wait for a new connection to show up: they are instead given a brand new one. If and when all connections are returned to the pool, the excess connections used are thrown away.\r\n\r\nThis means that in the default case Requests provides a maximum threshold on how much state it will maintain for the sake of connection pooling, whilst still allowing users to burst over that threshold temporarily. This is a very common connection pooling strategy, which is why it is the default for both Requests and urllib3.\r\n\r\nConnection pooling still provides a substantial benefit, because we still keep many connections around. In a single-threaded case, for example, where each response is entirely read before making a new request, only one connection per host will be opened. This is exactly as expected.\r\n\r\n> Also, pool_timeout is currently getting the default `None` value and there is no way to change it.\r\n\r\nI agree that this is a limitation that should be addressed. I would accept a PR that makes this a property on the HTTPAdapter that is automatically passed to `urlopen`/`get_conn` as appropriate.", "Thanks for the quick response @Lukasa \r\n\r\nPlease have a look again at the first comment, which I edited shortly after publishing.\r\n\r\nBasically: Even if I we discussed and agreed on the inconvenience of the 'leaky bucket' strategy (which will probably not happen), it's clear that this would be a major functionality change and therefore it would make no sense to force its introduction.\r\n\r\nHowever, I think that a parameter could be added to provide such a functionality. Otherwise, as I said, there is no way to actually limit the amount of outgoing connections without taking care of it in a higher level.", "A parameter exists already: you can set `block` on the HTTPAdapter.", "@Lukasa No, no, what I mean is something else.\r\n\r\nI mean another parameter, call it `fail_if_empy` to indicate that, if `block` is `False` and the connection pool is empty, an Exception will be raised instead a new connection created.\r\n\r\nFunctionally, what I would like to achieve is being able to use the connection pool to make sure that the number of concurrent outgoing connections will never exceed a given number ( `pool_maxsize`)", "@csala Right, but that is what the `block` parameter does. If you look at the code you'll see that `block`, if set to true, causes an `EmptyPoolError` exception to be raised rather than having a new connection created.", "However, it should be noted that requests/urllib3 do not provide any limit on the *total* number of outgoing connections. They only provide tools to do *per-host* limits on the number of outgoing connections.", "@Lukasa Well, that's not entirely true. Basically because that part of the code can never be reached because of how the queue works and because `pool_timeout` is not being used.\r\n\r\nAt the moment, if `block` is set to `True`, the `pool.get` call will never raise an exception because it will just block the `get` attempt indefinitely until a connection is available.\r\n\r\nHowever, it's true that if the `pool_timeout` is indicated and reached, it would work as I expect. And it could even be set to 0 to make it fail immediately.\r\n\r\nBut, on the other side, without such a `fail_if_empty` parameter, you cannot have it following the 'leaky_bucket' strategy along with a timeout!\r\nIf there was this parameter, you could make it wait for some time and, if no connection was available after the given timeout, go on and create a new one, which currently is not possible.", "@csala I'm in agreement that we should expose the pool timeout. I'm just not in agreement that we should add anything else. All of the relevant behaviours can be controlled by varying `block` and `pool_timeout`.", "@Lukasa Alright, I'll go for that bit.", "Fab, I'll be happy to review a PR adding that functionality. =)", "In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated." ]
https://api.github.com/repos/psf/requests/issues/3824
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3824/labels{/name}
https://api.github.com/repos/psf/requests/issues/3824/comments
https://api.github.com/repos/psf/requests/issues/3824/events
https://github.com/psf/requests/issues/3824
201,630,562
MDU6SXNzdWUyMDE2MzA1NjI=
3,824
With SSL connection: TypeError: data must be a memoryview, buffer or byte string
{ "avatar_url": "https://avatars.githubusercontent.com/u/10797139?v=4", "events_url": "https://api.github.com/users/Arno0x/events{/privacy}", "followers_url": "https://api.github.com/users/Arno0x/followers", "following_url": "https://api.github.com/users/Arno0x/following{/other_user}", "gists_url": "https://api.github.com/users/Arno0x/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Arno0x", "id": 10797139, "login": "Arno0x", "node_id": "MDQ6VXNlcjEwNzk3MTM5", "organizations_url": "https://api.github.com/users/Arno0x/orgs", "received_events_url": "https://api.github.com/users/Arno0x/received_events", "repos_url": "https://api.github.com/users/Arno0x/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Arno0x/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Arno0x/subscriptions", "type": "User", "url": "https://api.github.com/users/Arno0x", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2017-01-18T16:53:53Z
2021-09-08T13:05:26Z
2017-01-19T09:34:40Z
NONE
resolved
Hi, I'm having an exception with my program, probably because of an update in one of the python libs recently, but I don't know how to figure this out. OS: Linux kali 4.6.0-kali1-amd64 #1 SMP Debian 4.6.4-1kali1 (2016-07-21) x86_64 GNU/Linux Python: 2.7.13 Requests version: ``` # pip show requests Name: requests Version: 2.12.4 Summary: Python HTTP for Humans. Home-page: http://python-requests.org Author: Kenneth Reitz Author-email: [email protected] License: Apache 2.0 Location: /usr/local/lib/python2.7/dist-packages Requires: ``` Traceback: ``` [...] File "/root/Tools/DBC2/lib/dropboxHandler.py", line 32, in sendRequest r = requests.post(url,headers=headers,data=data) File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 110, in post return request('post', url, data=data, json=json, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 423, in send timeout=timeout File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 594, in urlopen chunked=chunked) File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 361, in _make_request conn.request(method, url, **httplib_request_kw) File "/usr/lib/python2.7/httplib.py", line 1042, in request self._send_request(method, url, body, headers) File "/usr/lib/python2.7/httplib.py", line 1082, in _send_request self.endheaders(body) File "/usr/lib/python2.7/httplib.py", line 1038, in endheaders self._send_output(message_body) File "/usr/lib/python2.7/httplib.py", line 886, in _send_output self.send(message_body) File "/usr/lib/python2.7/httplib.py", line 858, in send self.sock.sendall(data) File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 292, in sendall sent = self._send_until_done(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE]) File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 281, in _send_until_done return self.connection.send(data) File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 1251, in send raise TypeError("data must be a memoryview, buffer or byte string") TypeError: data must be a memoryview, buffer or byte string ``` Other info: The same exact version of my program (pulled from my git repo) works on MacOSX and on Alpine Linux (Docker image). Any idea what the issue is ? Thanks, Arno The exact
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3824/reactions" }
https://api.github.com/repos/psf/requests/issues/3824/timeline
null
completed
null
null
false
[ "Yes, almost certainly you're sending a unicode string in your `data` keyword argument. You only see it on some platforms because most of the time it can be tolerated. What is your `data` keyword argument?", "What the app is sending as a `data`argument is always a byte array of AES256 encrypted data, so not really Unicode :-(\r\n\r\nUnless the \"data\" in the context of SSL.py also includes the HTTP headers I'm providing to \"requests\". So I checked all my headers value and name with the following code:\r\n```\r\nfor headerName, headerValue in headers.items():\r\n\t\t\t\ttry:\r\n\t\t\t\t\theaderValue.decode('ascii')\r\n\t\t\t\t\theaderName.decode('ascii')\r\n\t\t\t\texcept UnicodeDecodeError:\r\n\t\t\t\t print \"Header {} is NOT a ascii-encoded unicode string\".format(headerValue)\r\n\t\t\t\telse:\r\n\t\t\t\t print \"Header {} may have been an ascii-encoded unicode string\".format(headerValue)\r\n```\r\nAnd running this code seems to prove that everything is properly ASCII encoded.\r\n\r\nThe point is, the exact same app worked fine just a few days before, I made stricly no changes to my code but in the meantime I installed Impacket which probably comes with a bunch of python libs install/update which I'm suspecting causes the issue since this is the only thing I can see may have changed...", "@Arno0x To be clear, the appearance of the problem was caused by you installing PyOpenSSL. This is because we preferentially use PyOpenSSL if it's available, but PyOpenSSL is very strict about the data that is handed to it.\r\n\r\nThe fact that you can safely call `decode` on the headers is not sufficient for a test. The headers must be native strings on Python: that means you should manually encode them if they are unicode strings.", "@Lukasa thank you so much for helping me on this issue.\r\nOk, so I understand that I'm now using an installed PyOpenSSL lib on this system, rather than the PyOpenSSL that comes embedded with \"requests\", which explains the difference with my other systems.\r\n\r\nIf I get it right (provided I'm using Python2): all strings in my code are of type 'str' by default and I guess that's what you call native string (is it?). So it looks like all my headers (name or value) are already native python string. Still, just in case, I tried to change my code to add `.encode('ascii')`on all my header strings => no luck, still the same error.\r\n\r\nSo I've reduced my testing scenario down to the basic following test in python command line:\r\n```\r\nroot@kali:~/Tools/DBC2# python\r\nPython 2.7.12+ (default, Sep 1 2016, 20:27:38) \r\n[GCC 6.2.0 20160822] on linux2\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import requests\r\n>>> from lib.crypto import Crypto\r\n>>>\r\n>>> headers={\"Authorization\": \"Bearer xxxxxMyDropBoxTokenxxxxx\", \"Content-Type\": \"application/octet-stream\", \"Dropbox-API-Arg\":'{\"path\":\"/test\",\"mode\":\"overwrite\"}'}\r\n>>>\r\n>>> requests.post(\"https://content.dropboxapi.com/2/files/upload\", headers=headers, data='test')\r\n<Response [200]>\r\n>>>\r\n>>> requests.post(\"https://content.dropboxapi.com/2/files/upload\", headers=headers, data=Crypto.xor(bytearray('test'),'whateverkey'))\r\n[...]\r\nFile \"/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py\", line 292, in sendall\r\n sent = self._send_until_done(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE])\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py\", line 281, in _send_until_done\r\n return self.connection.send(data)\r\n File \"/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py\", line 1251, in send\r\n raise TypeError(\"data must be a memoryview, buffer or byte string\")\r\nTypeError: data must be a memoryview, buffer or byte string\r\n```\r\nSo it turns out not to be a header issue, but rather the data itself returned from my `Crypto.xor` function, which is very basic:\r\n```\r\ndef xor(cls, data, key):\r\n\t\tl = len(key)\r\n\t\tkeyAsInt = map(ord, key)\r\n\t\treturn bytearray((\r\n\t\t (data[i] ^ keyAsInt[i % l]) for i in range(0,len(data))\r\n\t\t))\r\n```\r\n\r\nI hope you'll get to see what's wrong in this piece of code that PyOpenSSL doesn't like... thanks again for your help.", "So it looks like PyOpenSSL doesn't accept bytearrays: only buffers or bytes. You can work around this in the short term by calling `bytes` on the bytearray.\r\n\r\nHowever, I'd call this a PyOpenSSL bug: there's no reason not to accept bytearrays here. So I recommend opening a bug report on the PyOpenSSL repository as well. =)", "SPOTTED ! :-)\r\nIt works !\r\n\r\nThank you so much for your help. I'll certainly open an issue on the PyOpenSSL as well.\r\n\r\nCheers.", "No problem. =)" ]
https://api.github.com/repos/psf/requests/issues/3823
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3823/labels{/name}
https://api.github.com/repos/psf/requests/issues/3823/comments
https://api.github.com/repos/psf/requests/issues/3823/events
https://github.com/psf/requests/pull/3823
201,560,087
MDExOlB1bGxSZXF1ZXN0MTAyMDU1MjM5
3,823
Prepare changelog for 2.12.5
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-18T12:23:34Z
2021-09-07T00:06:43Z
2017-01-18T12:41:05Z
MEMBER
resolved
Getting ready to ship the last patch release in the somewhat accursed 2.12 release series.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3823/reactions" }
https://api.github.com/repos/psf/requests/issues/3823/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3823.diff", "html_url": "https://github.com/psf/requests/pull/3823", "merged_at": "2017-01-18T12:41:05Z", "patch_url": "https://github.com/psf/requests/pull/3823.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3823" }
true
[ "## [Current coverage](https://codecov.io/gh/kennethreitz/requests/pull/3823?src=pr) is 89.12% (diff: 100%)\n> Merging [#3823](https://codecov.io/gh/kennethreitz/requests/pull/3823?src=pr) into [master](https://codecov.io/gh/kennethreitz/requests/branch/master?src=pr) will decrease coverage by **0.05%**\n\n```diff\n@@ master #3823 diff @@\n==========================================\n Files 15 15 \n Lines 1857 1857 \n Methods 0 0 \n Messages 0 0 \n Branches 0 0 \n==========================================\n- Hits 1656 1655 -1 \n- Misses 201 202 +1 \n Partials 0 0 \n```\n\n> Powered by [Codecov](https://codecov.io?src=pr). Last update [160d364...219d9a7](https://codecov.io/gh/kennethreitz/requests/compare/160d36450350cbb8948657a60aa4f0cbe18c601e...219d9a7297efc5673a8c77400204130d66bca311?src=pr)" ]
https://api.github.com/repos/psf/requests/issues/3822
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3822/labels{/name}
https://api.github.com/repos/psf/requests/issues/3822/comments
https://api.github.com/repos/psf/requests/issues/3822/events
https://github.com/psf/requests/issues/3822
200,831,349
MDU6SXNzdWUyMDA4MzEzNDk=
3,822
"import requests" breaks OpenSSL when used in mod_python
{ "avatar_url": "https://avatars.githubusercontent.com/u/2719629?v=4", "events_url": "https://api.github.com/users/LarsMichelsen/events{/privacy}", "followers_url": "https://api.github.com/users/LarsMichelsen/followers", "following_url": "https://api.github.com/users/LarsMichelsen/following{/other_user}", "gists_url": "https://api.github.com/users/LarsMichelsen/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/LarsMichelsen", "id": 2719629, "login": "LarsMichelsen", "node_id": "MDQ6VXNlcjI3MTk2Mjk=", "organizations_url": "https://api.github.com/users/LarsMichelsen/orgs", "received_events_url": "https://api.github.com/users/LarsMichelsen/received_events", "repos_url": "https://api.github.com/users/LarsMichelsen/repos", "site_admin": false, "starred_url": "https://api.github.com/users/LarsMichelsen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LarsMichelsen/subscriptions", "type": "User", "url": "https://api.github.com/users/LarsMichelsen", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-14T21:35:55Z
2021-09-08T13:05:27Z
2017-01-14T21:47:01Z
NONE
resolved
When creating a mod_python (Python 2.7, mod_python 3.3.1) page handler that just imports the requests module (version 2.12.3), it breaks pyOpenSSL (pyOpenSSL-16.2.0, cryptography-1.5.3, idna-2.1 in my case). For example creating certificates like this is not working anymore: ``` from OpenSSL import crypto pkey = crypto.PKey() pkey.generate_key(crypto.TYPE_RSA, 2048) ``` An exception like this is raised: Error ([('rsa routines', 'RSA_BUILTIN_KEYGEN', 'BN lib')]) ``` File "key_mgmt.py", line 224, in _generate_key pkey.generate_key(crypto.TYPE_RSA, 2048) File "lib/python/pyOpenSSL-16.2.0-py2.7.egg/OpenSSL/crypto.py", line 258, in generate_key _openssl_assert(result == 1) File "lib/python/pyOpenSSL-16.2.0-py2.7.egg/OpenSSL/_util.py", line 61, in openssl_assert exception_from_error_queue(error) File "lib/python/pyOpenSSL-16.2.0-py2.7.egg/OpenSSL/_util.py", line 48, in exception_from_error_queue raise exception_type(errors) ``` When removing this from requests __init__.py, it works again: ``` try: from .packages.urllib3.contrib import pyopenssl as pyopenssl pyopenssl.inject_into_urllib3() except ImportError: pass ``` I tried to remove only the `pyopenssl.inject_into_urllib3()` call, but that did not solve the issue. I had to remove the import which then fixed the issue. So it seems that it is somehow related to that module directly or some dependency. The same code works when running as dedicated python script. Maybe it's related to the mod_python importer or something. Any ideas?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3822/reactions" }
https://api.github.com/repos/psf/requests/issues/3822/timeline
null
completed
null
null
false
[ "This is definitely related to mod_python. The cryptography library which PyOpenSSL uses has had problems with mod_python in the past due to mod_python's unusual method of running Python. I suggest you open this issue on the pyca/cryptography repository: they should be best placed to fix this issue." ]
https://api.github.com/repos/psf/requests/issues/3821
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3821/labels{/name}
https://api.github.com/repos/psf/requests/issues/3821/comments
https://api.github.com/repos/psf/requests/issues/3821/events
https://github.com/psf/requests/issues/3821
200,796,456
MDU6SXNzdWUyMDA3OTY0NTY=
3,821
problem with the proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/8215563?v=4", "events_url": "https://api.github.com/users/feng-1985/events{/privacy}", "followers_url": "https://api.github.com/users/feng-1985/followers", "following_url": "https://api.github.com/users/feng-1985/following{/other_user}", "gists_url": "https://api.github.com/users/feng-1985/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/feng-1985", "id": 8215563, "login": "feng-1985", "node_id": "MDQ6VXNlcjgyMTU1NjM=", "organizations_url": "https://api.github.com/users/feng-1985/orgs", "received_events_url": "https://api.github.com/users/feng-1985/received_events", "repos_url": "https://api.github.com/users/feng-1985/repos", "site_admin": false, "starred_url": "https://api.github.com/users/feng-1985/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/feng-1985/subscriptions", "type": "User", "url": "https://api.github.com/users/feng-1985", "user_view_type": "public" }
[ { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" } ]
closed
true
null
[]
null
7
2017-01-14T10:54:58Z
2021-09-08T12:01:10Z
2017-01-23T16:01:58Z
NONE
resolved
Python 3 code:

```
session = requests.session()
r = session.get(url="http://www.google.com", headers=headers, timeout=30, proxies=proxies)
```

The headers are `{"User-Agent": random.choice(USER_AGENTS)}` and the proxies look like this: `{'http': 'http://117.68.236.155:808', 'https': 'http://117.68.236.155:808'}`.

I got the following error message:

```
requests.exceptions.ProxyError: HTTPConnectionPool(host='117.68.236.155', port=808): Max retries exceeded with url: http://www.google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x0000007C82F8EBE0>: Failed to establish a new connection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接尝试失败。',)))
```

(The WinError 10060 message translates to: "The connection attempt failed because the connected party did not properly respond after a period of time, or the connected host has failed to respond.")

Requests version 2.12.4, Python 3.

I have tested the following code, but the return is empty:

```
import requests
print(requests.utils.get_environ_proxies('http://www.google.com'))
```

Is there any solution? I'm looking forward to your reply!
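The `proxies` dict shown above is keyed by URL scheme. The snippet below is a loose, self-contained imitation of how a proxy is picked per request (not requests' actual `select_proxy`, which also supports `scheme://host` keys); it only illustrates why both `'http'` and `'https'` entries are needed for full coverage.

```python
from urllib.parse import urlparse

def select_proxy(url, proxies):
    # Loose imitation: match the URL's scheme against the proxies mapping,
    # falling back to an 'all' key if one is present.
    scheme = urlparse(url).scheme
    return proxies.get(scheme) or proxies.get('all')

proxies = {'http': 'http://117.68.236.155:808',
           'https': 'http://117.68.236.155:808'}
```

With this mapping, an `http://` URL and an `https://` URL both route through the same proxy, while a scheme with no entry (and no `'all'` key) goes direct.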
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3821/reactions" }
https://api.github.com/repos/psf/requests/issues/3821/timeline
null
completed
null
null
false
[ "import requests\r\nsession = requests.session()\r\nr = session.get(url=\"http://www.google.com\")\r\ncookie = session.cookies.get_dict()\r\nprint(cookie)\r\n\r\nget the following error:\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, '远程主机强迫关闭了一个现有的连接。', None, 10054, None))\r\n", "We appear to be failing to connect to the proxy. Are you sure it is running at 117.68.236.155:808? Can you tell me what Requests version you're using please?", "@Lukasa hidden in the original message is:\r\n\r\n```\r\nRequests verison 2.12.4, Python3\r\n```", "@bifeng in your second example, you're not using proxies. Does this mean that you cannot connect to the internet **at all**?", "@sigmavirus24 \r\nI am surfing the internet, and run this code , so connect is not the problem.\r\nI am still getting the error:\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, '远程主机强迫关闭了一个现有的连接。', None, 10054, None))\r\nThat's weird! ", "The complete error message:\r\nTraceback (most recent call last):\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 594, in urlopen\r\n chunked=chunked)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 391, in _make_request\r\n six.raise_from(e, None)\r\n File \"<string>\", line 2, in raise_from\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 387, in _make_request\r\n httplib_response = conn.getresponse()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 1197, in getresponse\r\n response.begin()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 297, in begin\r\n version, status, reason = self._read_status()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 258, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File 
\"C:\\Users\\q\\Anaconda3\\lib\\socket.py\", line 575, in readinto\r\n return self._sock.recv_into(b)\r\nConnectionResetError: [WinError 10054] 远程主机强迫关闭了一个现有的连接。\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 643, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\util\\retry.py\", line 334, in increment\r\n raise six.reraise(type(error), error, _stacktrace)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\packages\\six.py\", line 685, in reraise\r\n raise value.with_traceback(tb)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 594, in urlopen\r\n chunked=chunked)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 391, in _make_request\r\n six.raise_from(e, None)\r\n File \"<string>\", line 2, in raise_from\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 387, in _make_request\r\n httplib_response = conn.getresponse()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 1197, in getresponse\r\n response.begin()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 297, in begin\r\n version, status, reason = self._read_status()\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\http\\client.py\", line 258, in _read_status\r\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\socket.py\", line 575, in readinto\r\n return self._sock.recv_into(b)\r\nrequests.packages.urllib3.exceptions.ProtocolError: ('Connection 
aborted.', ConnectionResetError(10054, '远程主机强迫关闭了一个现有的连接。', None, 10054, None))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"C:/Users/q/Desktop/review0.2/tmp2.py\", line 13, in <module>\r\n r = session.get(url=\"http://www.google.com\")\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\sessions.py\", line 501, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"C:\\Users\\q\\Anaconda3\\lib\\site-packages\\requests\\adapters.py\", line 473, in send\r\n raise ConnectionError(err, request=request)\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, '远程主机强迫关闭了一个现有的连接。', None, 10054, None))\r\n", "So it looks very much like your machine is unable to actually make a network connection to the proxy you're trying to connect to. Without debugging that failure we can't really do much more." ]
https://api.github.com/repos/psf/requests/issues/3820
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3820/labels{/name}
https://api.github.com/repos/psf/requests/issues/3820/comments
https://api.github.com/repos/psf/requests/issues/3820/events
https://github.com/psf/requests/issues/3820
200,791,420
MDU6SXNzdWUyMDA3OTE0MjA=
3,820
requests and urllib2 get different headers when connecting to the same host
{ "avatar_url": "https://avatars.githubusercontent.com/u/440101?v=4", "events_url": "https://api.github.com/users/dofine/events{/privacy}", "followers_url": "https://api.github.com/users/dofine/followers", "following_url": "https://api.github.com/users/dofine/following{/other_user}", "gists_url": "https://api.github.com/users/dofine/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dofine", "id": 440101, "login": "dofine", "node_id": "MDQ6VXNlcjQ0MDEwMQ==", "organizations_url": "https://api.github.com/users/dofine/orgs", "received_events_url": "https://api.github.com/users/dofine/received_events", "repos_url": "https://api.github.com/users/dofine/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dofine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dofine/subscriptions", "type": "User", "url": "https://api.github.com/users/dofine", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-14T08:54:29Z
2021-09-08T07:00:34Z
2017-01-14T09:51:01Z
NONE
resolved
Requests version: 2.11.1, Python: 2.7.12

We've got a server providing .txt files, basically some log files growing over time. When I use `urllib2` to send a `GET` to the server (`r = urllib2.urlopen('http://example.com')`), the headers of the response are:

```
Date: XXX
Server: Apache
Last-Modified: XXX
Accept-Ranges: bytes
Content-Length: 12345678
Vary: Accept-Encoding
Connection: close
Content-Type: text/plain
```

While if `r = requests.get('http://example.com')`:

```
Content-Encoding: gzip
Accept-Ranges: bytes
Vary: Accept-Encoding
Keep-alive: timeout=5, max=128
Last-Modified: XXX
Connection: Keep-Alive
ETag: xxxxxxxxx
Content-Type: text/plain
```

The second response is the same as what I get using the Chrome developer tools. So why are the two different? I need the `Content-Length` header to determine how many bytes I need to download each time, because the file can grow really big.

Using `httpbin.org/get` to test, the urllib2 response is:

```
{u'args': {}, u'headers': {u'Accept-Encoding': u'identity', u'Host': u'httpbin.org', u'User-Agent': u'Python-urllib/2.7'}, u'origin': u'ip', u'url': u'http://httpbin.org/get'}
```

with response headers:

```
Server: nginx
Date: Sat, 14 Jan 2017 07:41:16 GMT
Content-Type: application/json
Content-Length: 207
Connection: close
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
```

while the requests response is:

```
{u'args': {}, u'headers': {u'Accept': u'*/*', u'Accept-Encoding': u'gzip, deflate', u'Host': u'httpbin.org', u'User-Agent': u'python-requests/2.11.1'}, u'origin': u'ip', u'url': u'http://httpbin.org/get'}
```

with response headers:

```
Server: nginx
Date: Sat, 14 Jan 2017 07:42:39 GMT
Content-Type: application/json
Content-Length: 239
Connection: keep-alive
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
```
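The httpbin output above already contains the answer: the two clients advertise different `Accept-Encoding` values, and a server that gzips on the fly cannot know the compressed length up front, so it drops `Content-Length`. A tiny self-contained illustration of that decision (a toy stand-in for the server's check, not any real server code):

```python
# What each client sends by default, per the captured httpbin responses.
urllib2_headers = {'Accept-Encoding': 'identity'}
requests_headers = {'Accept-Encoding': 'gzip, deflate'}

def server_gzips(request_headers):
    # A dynamic-gzip server compresses only when the client says it can
    # decode gzip; compressed responses are then sent chunked, without a
    # predictable Content-Length.
    return 'gzip' in request_headers.get('Accept-Encoding', '')
```

So to keep `Content-Length` with requests, pass `headers={'Accept-Encoding': 'identity'}` on the request, matching urllib2's default.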
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3820/reactions" }
https://api.github.com/repos/psf/requests/issues/3820/timeline
null
completed
null
null
false
[ "This is the code:\r\n\r\n```\r\nimport urllib2 \r\nimport requests \r\n\r\nurl = 'exmaple.com' \r\nheaders = { \r\n\"Authorization\": \"Basic xxxx\", \r\n\"Range\": \"bytes=0-\" \r\n} \r\nreq = urllib2.Request(url, headers=headers) \r\nresp = urllib2.urlopen(req) \r\nprint resp.info() \r\n\r\nr = requests.get(url, headers=headers) \r\nprint r.headers \r\nassert resp.info()['ETag'] == r.headers['ETag'] \r\n```\r\n\r\nand output:\r\n\r\n```\r\nDate: Sat, 14 Jan 2017 09:39:50 GMT\r\nServer: Apache\r\nLast-Modified: Sat, 14 Jan 2017 09:39:49 GMT\r\nETag: \"e91103-10e04f7-5460abb4743a3\"\r\nAccept-Ranges: bytes\r\nContent-Length: 17695991\r\nVary: Accept-Encoding\r\nContent-Range: bytes 0-17695990/17695991\r\nConnection: close\r\nContent-Type: text/plain\r\n```\r\n```\r\n{'Content-Encoding': 'gzip', 'Transfer-Encoding': 'chunked', 'Accept-Ranges': 'bytes', 'Vary': 'Accept-Encoding', 'Keep-Alive': 'timeout=5, max=128', 'Server': 'Apache', 'Last-Modified': 'Sat, 14 Jan 2017 09:39:49 GMT', 'Connection': 'Keep-Alive', 'ETag': '\"e91103-10e04f7-5460abb4743a3\"', 'Date': 'Sat, 14 Jan 2017 09:39:50 GMT', 'Content-Type': 'text/plain'}\r\n```", "The response is different because requests indicates that it supports gzip-encoded bodies, by sending an `Accept-Encoding: gzip, deflate` header field. urllib2 does not. You'll find if you added that header to your urllib2 request that you get the new behaviour.\r\n\r\nClearly, in this case, the server is dynamically gzipping the responses. This means it doesn't know how long the response will be, so it is sending using chunked transfer encoding.\r\n\r\nIf you really must get the Content-Length header, then you should add the following headers to your Requests request: `{'Accept-Encoding': 'identity'}`.", "Never mind. 
:) Forgot `stream=True`.\r\n Hi, when I use \r\n\r\n```python\r\nfor line in response.iter_lines():\r\n print line\r\n```\r\non this server, it would print nothing until all the body content is downloaded, which is very slow for a large response. How should I change the behavior to be the same as `urllib2` in Python 2?" ]
https://api.github.com/repos/psf/requests/issues/3819
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3819/labels{/name}
https://api.github.com/repos/psf/requests/issues/3819/comments
https://api.github.com/repos/psf/requests/issues/3819/events
https://github.com/psf/requests/pull/3819
200,779,099
MDExOlB1bGxSZXF1ZXN0MTAxNTQ5NTQw
3,819
Fix #3818: avoid doing an import inside a method
{ "avatar_url": "https://avatars.githubusercontent.com/u/568181?v=4", "events_url": "https://api.github.com/users/llazzaro/events{/privacy}", "followers_url": "https://api.github.com/users/llazzaro/followers", "following_url": "https://api.github.com/users/llazzaro/following{/other_user}", "gists_url": "https://api.github.com/users/llazzaro/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/llazzaro", "id": 568181, "login": "llazzaro", "node_id": "MDQ6VXNlcjU2ODE4MQ==", "organizations_url": "https://api.github.com/users/llazzaro/orgs", "received_events_url": "https://api.github.com/users/llazzaro/received_events", "repos_url": "https://api.github.com/users/llazzaro/repos", "site_admin": false, "starred_url": "https://api.github.com/users/llazzaro/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/llazzaro/subscriptions", "type": "User", "url": "https://api.github.com/users/llazzaro", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-14T04:13:04Z
2021-09-08T01:21:25Z
2017-01-14T10:01:54Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3819/reactions" }
https://api.github.com/repos/psf/requests/issues/3819/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3819.diff", "html_url": "https://github.com/psf/requests/pull/3819", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3819.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3819" }
true
[ "Thanks for this!\r\n\r\nThe idna module is vendored without changes from [here](https://github.com/kjd/idna). We do not carry patches to vendored modules in our own source tree. Please propose this pull request upstream, and we will pull in the changes when a new release is cut.", "@Lukasa thanks for the commet! I already propose the changes on that repository.\r\n\r\n" ]
https://api.github.com/repos/psf/requests/issues/3818
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3818/labels{/name}
https://api.github.com/repos/psf/requests/issues/3818/comments
https://api.github.com/repos/psf/requests/issues/3818/events
https://github.com/psf/requests/issues/3818
200,778,661
MDU6SXNzdWUyMDA3Nzg2NjE=
3,818
Import error with idna version
{ "avatar_url": "https://avatars.githubusercontent.com/u/568181?v=4", "events_url": "https://api.github.com/users/llazzaro/events{/privacy}", "followers_url": "https://api.github.com/users/llazzaro/followers", "following_url": "https://api.github.com/users/llazzaro/following{/other_user}", "gists_url": "https://api.github.com/users/llazzaro/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/llazzaro", "id": 568181, "login": "llazzaro", "node_id": "MDQ6VXNlcjU2ODE4MQ==", "organizations_url": "https://api.github.com/users/llazzaro/orgs", "received_events_url": "https://api.github.com/users/llazzaro/received_events", "repos_url": "https://api.github.com/users/llazzaro/repos", "site_admin": false, "starred_url": "https://api.github.com/users/llazzaro/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/llazzaro/subscriptions", "type": "User", "url": "https://api.github.com/users/llazzaro", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-14T04:03:47Z
2021-09-08T13:05:28Z
2017-01-14T10:02:06Z
NONE
resolved
Hello, I recently upgraded to 2.12 and started to see this import error:

> (Pdb) request('get', url, params=params, **kwargs)
> *** ImportError: No module named uts46data

Looking at the changes, I saw that idna was recently added. My projects use import hooks, and inside the hook they use requests. The idna code does an import inside the `uts46_remap` method, which breaks my code. Moving the import out of `uts46_remap` fixes the issue.
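A hedged workaround sketch for this situation, independent of patching idna itself: force-load the module that idna imports lazily (`uts46data`) before any custom import hooks are installed, so the in-method import never has to go through the hooked loader. Both import paths below are attempted because the vendored location (`requests.packages.idna`) only existed in some requests releases; the function simply reports which one, if any, was loadable.

```python
def preload_uts46data():
    """Eagerly import idna's lazily-loaded table module, if available.

    Returns the module path that was successfully imported, or None when
    neither the vendored nor the standalone idna package is present.
    """
    for name in ('requests.packages.idna.uts46data', 'idna.uts46data'):
        try:
            __import__(name)  # side effect is all we need: fill sys.modules
            return name
        except ImportError:
            continue
    return None
```

Call this once at startup, before installing import hooks; afterwards `uts46_remap`'s internal import is satisfied from `sys.modules`.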
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3818/reactions" }
https://api.github.com/repos/psf/requests/issues/3818/timeline
null
completed
null
null
false
[ "Closed for reasons discussed in #3819." ]
https://api.github.com/repos/psf/requests/issues/3817
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3817/labels{/name}
https://api.github.com/repos/psf/requests/issues/3817/comments
https://api.github.com/repos/psf/requests/issues/3817/events
https://github.com/psf/requests/pull/3817
200,727,873
MDExOlB1bGxSZXF1ZXN0MTAxNTE1MTM2
3,817
Only load .packages.urllib3.contrib.pyopenssl if we have an old version of OpenSSL.
{ "avatar_url": "https://avatars.githubusercontent.com/u/22371?v=4", "events_url": "https://api.github.com/users/dsully/events{/privacy}", "followers_url": "https://api.github.com/users/dsully/followers", "following_url": "https://api.github.com/users/dsully/following{/other_user}", "gists_url": "https://api.github.com/users/dsully/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dsully", "id": 22371, "login": "dsully", "node_id": "MDQ6VXNlcjIyMzcx", "organizations_url": "https://api.github.com/users/dsully/orgs", "received_events_url": "https://api.github.com/users/dsully/received_events", "repos_url": "https://api.github.com/users/dsully/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dsully/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dsully/subscriptions", "type": "User", "url": "https://api.github.com/users/dsully", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2017-01-13T20:51:40Z
2021-09-08T01:21:24Z
2017-01-16T20:30:14Z
NONE
resolved
See issue #3213
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3817/reactions" }
https://api.github.com/repos/psf/requests/issues/3817/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3817.diff", "html_url": "https://github.com/psf/requests/pull/3817", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3817.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3817" }
true
[ "Ok, @alex and @reaperhulk really need to sign off on this before I consider merging it. ;)", "Ok, all commenters, *please* read some of the discussion on #3213 before making further comments here. You'll find it helps you avoid retreading some ground. ;)\r\n\r\n@reaperhulk, my main concern here is that we are fundamentally looking at adding this patch because \"importing PyOpenSSL is slow\". My question is: do you consider that a bug?", "It would help to quantify slow. In the long long ago (pre-cryptography 1.0) importing pyOpenSSL was slow (1+ seconds), but on my 4 year old laptop I just tested and it took about 50ms to import pyOpenSSL 16.2 + cryptography 1.7. `import requests` with no pyOpenSSL installed takes ~63ms and ~108ms for requests+pyOpenSSL. That doesn't seem particularly problematic.", "It'd also help if I took @Lukasa's advice and actually read the whole issue that spawned this.\r\n\r\nI don't consider the current import performance to be a bug, but if someone wants to try to improve it on cryptography (which is probably the dominant source of import time for pyopenssl) I'm happy to review. The biggest single source of time is likely when it loops over the lib object to build a new conditional lib object. That could be optimized if cffi supported a means of conditional binding, but we're likely talking only ~10ms?", "So step 1 is: are we talking about the same level of \"slow\" here?", "@Lukasa @reaperhulk so it's worth noting that `slow` will change based on how many things are installed + how many possible entry-points exist. `pkg_resources`, if I remember correctly, will scan all of `site_packages` for entry-points. 
So if you're testing `pyOpenSSL` in a fresh virtual environment, you're import speed (given that cryptography apparently scans entry-points at import) is going to be faster than @dsully since they seem to be installing a lot of things in one `site_packages` directory.\r\n\r\nGranted, this is a fundamental flaw of how `pkg_resources` works, but I think it's still a legitimate problem. I haven't looked at what cryptography uses `pkg_resources` to find at import, but there would only be \"import time\" benefits to avoiding that scan rather than any real performance benefit to not doing it when cryptography is imported.\r\n\r\nI'd also like a better understanding of `slow` from @dsully + maybe a better description of how much is being installed into their site-packages directory.", "Also, to set everyone's expectations appropriately, this appears to be a problem @dsully is encountering at work, so I completely expect them to *not* respond until Monday during work hours.", "Okay, this is getting a bit more interesting. `pkg_resources` is invoked to determine what backends are available to cryptography. This is done in `cryptography.hazmat.backends.__init__`. requests doesn't need that check since it directly imports the backend it requires. However, if you look in that you can see that we **are** doing it lazily. To summarize a wasted hour here, `import pkg_resources` is actually where we pay the speed penalty. I haven't looked into what's going on inside `pkg_resources`, but it seems likely that a side effect of importing is an immediate scan of the packages?\r\n\r\nAnyway, this points us to a potential significant performance improvement: just `import pkg_resources` inside the function call so it is lazily imported as well. 
On my machine with a test virtualenv containing 100 packages this drops the import time by 150ms.", "@reaperhulk If the cryptography team is okay with moving the import inside the function that seems suitable and should eliminate the need for this PR based on your findings.", "@reaperhulk Yes - removing the pkg_resources import until it's needed will certainly help. I'll create a PR over there to move the import.\r\n\r\n@sigmavirus24 nails it - we have a lot installed in site-packages, and are extremely sensitive about CLI tools start up time. Every millisecond counts in the eyes of our users. I do agree this is a fundamental flaw in pkg_resources. (My weekend was pretty packed, so I wasn't able to reply until now).\r\n\r\nI do feel that requests automatically trying to use pyopenssl if it's installed without any way to opt-out is surprise functionality. I do have a work around for now, so this PR can be dropped.", "> I do feel that requests automatically trying to use pyopenssl if it's installed without any way to opt-out is surprise functionality.\r\n\r\nI mean, you can opt-out *after* import with no difficulty at all. ;)" ]
https://api.github.com/repos/psf/requests/issues/3816
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3816/labels{/name}
https://api.github.com/repos/psf/requests/issues/3816/comments
https://api.github.com/repos/psf/requests/issues/3816/events
https://github.com/psf/requests/issues/3816
200,630,475
MDU6SXNzdWUyMDA2MzA0NzU=
3,816
Proxy and SSL: the CONNECT request fails on some proxies (cntlm) as the request is HTTP/1.0 instead of 1.1
{ "avatar_url": "https://avatars.githubusercontent.com/u/504748?v=4", "events_url": "https://api.github.com/users/LaurentChardin/events{/privacy}", "followers_url": "https://api.github.com/users/LaurentChardin/followers", "following_url": "https://api.github.com/users/LaurentChardin/following{/other_user}", "gists_url": "https://api.github.com/users/LaurentChardin/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/LaurentChardin", "id": 504748, "login": "LaurentChardin", "node_id": "MDQ6VXNlcjUwNDc0OA==", "organizations_url": "https://api.github.com/users/LaurentChardin/orgs", "received_events_url": "https://api.github.com/users/LaurentChardin/received_events", "repos_url": "https://api.github.com/users/LaurentChardin/repos", "site_admin": false, "starred_url": "https://api.github.com/users/LaurentChardin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LaurentChardin/subscriptions", "type": "User", "url": "https://api.github.com/users/LaurentChardin", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-13T13:29:13Z
2021-09-08T13:05:27Z
2017-01-13T13:34:08Z
NONE
resolved
It looks like requests (or urllib3 ?) is creating a CONNECT request that fails with some proxies when trying to initiate the SSL tunnel: INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (1): api.hipchat.com send: 'CONNECT api.hipchat.com:443 HTTP/1.0\r\n' send: '\r\n' as we can, it generates an HTTP/1.0 header which seems to be inaccurate. I could not find any definition of the CONNECT command in the HTTP/1.0 RFC, and seems to have been really defined in HTTP 1.1 RFC. By manually telneting to my cntlm, i could make the CONNECT work by replacing 1.0 by 1.1. This makes the requests library fail to work on my side where i have to use CNTLM because our usual proxy is using Kerberos/CNTLM for authentication. Is there any way to have this reviewed ? best,
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3816/reactions" }
https://api.github.com/repos/psf/requests/issues/3816/timeline
null
completed
null
null
false
[ "Unfortunately, this is not something we can easily change in urllib3 at the moment, because this CONNECT is actually created by httplib. Simply changing to HTTP/1.1 is not sufficient because that CONNECT request is then not HTTP/1.1 compliant (it has no `Host` header field). We'd need to change both.\r\n\r\nUltimately, the upcoming urllib3 v2 release will resolve this issue by removing our dependence on httplib. Until then, the only way to fix this is to monkeypatch or override httplib's `tunnel()` logic.", "Found some additional information here : http://stackoverflow.com/questions/1841730/how-can-urllib2-httplib-talk-http-1-1-for-https-connections-via-a-squid-proxy/1841740#1841740" ]
https://api.github.com/repos/psf/requests/issues/3815
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3815/labels{/name}
https://api.github.com/repos/psf/requests/issues/3815/comments
https://api.github.com/repos/psf/requests/issues/3815/events
https://github.com/psf/requests/issues/3815
200,610,645
MDU6SXNzdWUyMDA2MTA2NDU=
3,815
Add validation of HTTP URL
{ "avatar_url": "https://avatars.githubusercontent.com/u/4943600?v=4", "events_url": "https://api.github.com/users/Th30n/events{/privacy}", "followers_url": "https://api.github.com/users/Th30n/followers", "following_url": "https://api.github.com/users/Th30n/following{/other_user}", "gists_url": "https://api.github.com/users/Th30n/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Th30n", "id": 4943600, "login": "Th30n", "node_id": "MDQ6VXNlcjQ5NDM2MDA=", "organizations_url": "https://api.github.com/users/Th30n/orgs", "received_events_url": "https://api.github.com/users/Th30n/received_events", "repos_url": "https://api.github.com/users/Th30n/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Th30n/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Th30n/subscriptions", "type": "User", "url": "https://api.github.com/users/Th30n", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-13T11:40:37Z
2021-09-08T13:05:29Z
2017-01-13T11:41:55Z
NONE
resolved
I'd like to verify that a URL will be valid for a HTTP request. The main motivation is telling the user in the GUI that the typed URL cannot be used. Currently, I'm using `requests.models.parse_url`, but this is obviously not enough because `PreparedRequest.prepare_url` performs additional checks (e.g. `host.encode('idna')`). It would be great if the same checks were exposed as a public API.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3815/reactions" }
https://api.github.com/repos/psf/requests/issues/3815/timeline
null
completed
null
null
false
[ "The simplest thing to do is to use the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests): specifically the second example, which goes via a Session.\r\n\r\nYou can catch exceptions raised in this process and use those to determine the validity of the URL.", "You could also use rfc3986 which has some validation code built-in.", "Thanks @Lukasa." ]
https://api.github.com/repos/psf/requests/issues/3814
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3814/labels{/name}
https://api.github.com/repos/psf/requests/issues/3814/comments
https://api.github.com/repos/psf/requests/issues/3814/events
https://github.com/psf/requests/pull/3814
200,557,373
MDExOlB1bGxSZXF1ZXN0MTAxMzk1NzMy
3,814
Add ability to load ca bundle from data
{ "avatar_url": "https://avatars.githubusercontent.com/u/88809?v=4", "events_url": "https://api.github.com/users/Kentzo/events{/privacy}", "followers_url": "https://api.github.com/users/Kentzo/followers", "following_url": "https://api.github.com/users/Kentzo/following{/other_user}", "gists_url": "https://api.github.com/users/Kentzo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Kentzo", "id": 88809, "login": "Kentzo", "node_id": "MDQ6VXNlcjg4ODA5", "organizations_url": "https://api.github.com/users/Kentzo/orgs", "received_events_url": "https://api.github.com/users/Kentzo/received_events", "repos_url": "https://api.github.com/users/Kentzo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Kentzo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Kentzo/subscriptions", "type": "User", "url": "https://api.github.com/users/Kentzo", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2017-01-13T06:29:28Z
2021-09-07T00:06:43Z
2017-01-13T07:19:47Z
NONE
resolved
SSLContext.load_verify_locations allows to specify cadata with 2.7.9+, 3.4+
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3814/reactions" }
https://api.github.com/repos/psf/requests/issues/3814/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3814.diff", "html_url": "https://github.com/psf/requests/pull/3814", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3814.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3814" }
true
[ "Thanks for this! Some notes:\r\n\r\n- Firstly, this patch contains changes to the urllib3 project. This is a sub-project vendored in its entirety unchanged into Requests. That means any changes to urllib3 need to be made on that project, and they will then make their way into Requests over time.\r\n- Secondly, I am not sure how important this patch is. In the current Requests release it is possible to pass a SSLContext in to a transport adapter, as discussed in a few other issues (e.g. [here](https://github.com/kennethreitz/requests/issues/3774#issuecomment-267871876)). This means it's already possible to use cadata, which drastically reduces the urgency of this patch.\r\n- Thirdly, I suspect in v3 we'll want to refactor this so that verify can accept an `SSLContext` directly, rather than continuing to overload it with more stringly-typed inputs.\r\n\r\nBecause of the first reason I cannot merge this patch directly: you'd need to open a PR on urllib3 first. However, you may want to consider whether it's worth pursuing the \"pass an SSLContext\" approach instead.", "Agree, passing custom SSLContext solves the issue.\r\n\r\nCould you update documentation to better reflect that, with an example of passing custom SSLContext that loads its certificates from memory?", "@Lukasa One question: won't implementation of `ssl_wrap_socket` override custom CA certificates by calling load_verify_locations inside?", "@Lukasa That's how I did it so far:\r\n\r\n```python\r\nfrom requests.adapters import HTTPAdapter\r\n\r\n\r\nclass CustomSSLContextAdapter(HTTPAdapter):\r\n def __init__(self, *args, ssl_context=None, **kwargs):\r\n super().__init__(*args, **kwargs)\r\n self._ssl_context = ssl_context\r\n\r\n def init_poolmanager(self, *args, **kwargs):\r\n kwargs['ssl_context'] = self._ssl_context\r\n return super().init_poolmanager(*args, **kwargs)\r\n```", "It's certainly possible. You may need to make further adapter overrides so you can set `ca_certs` and `ca_cert_dir` to `None`.", "@Lukasa Something like this:\r\n\r\n```python\r\nclass CustomSSLContextAdapter(HTTPAdapter):\r\n \"\"\"\r\n HTTP adapter with custom SSLContext.\r\n\r\n >>> from requests.sessions import Session\r\n >>>\r\n >>> with Session() as session:\r\n >>> session.mount('https://', CustomSSLContextAdapter(ssl_context=...))\r\n >>> session(method=..., url=..., ...)\r\n \"\"\"\r\n def __init__(self, *args, ssl_context=None, **kwargs):\r\n \"\"\"\r\n @param ssl_context: Custom SSL context.\r\n @type ssl_context: ssl.SSLContext\r\n \"\"\"\r\n self._ssl_context = ssl_context\r\n super().__init__(*args, **kwargs)\r\n\r\n def init_poolmanager(self, *args, **kwargs):\r\n kwargs['ssl_context'] = self._ssl_context\r\n return super().init_poolmanager(*args, **kwargs)\r\n\r\n def cert_verify(self, conn, *args, **kwargs):\r\n super().cert_verify(conn, *args, **kwargs)\r\n conn.ca_certs = None\r\n conn.ca_cert_dir = None\r\n```\r\n?", "I think that should be fine." ]
https://api.github.com/repos/psf/requests/issues/3813
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3813/labels{/name}
https://api.github.com/repos/psf/requests/issues/3813/comments
https://api.github.com/repos/psf/requests/issues/3813/events
https://github.com/psf/requests/issues/3813
200,430,233
MDU6SXNzdWUyMDA0MzAyMzM=
3,813
update
{ "avatar_url": "https://avatars.githubusercontent.com/u/1249913?v=4", "events_url": "https://api.github.com/users/noelmas/events{/privacy}", "followers_url": "https://api.github.com/users/noelmas/followers", "following_url": "https://api.github.com/users/noelmas/following{/other_user}", "gists_url": "https://api.github.com/users/noelmas/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/noelmas", "id": 1249913, "login": "noelmas", "node_id": "MDQ6VXNlcjEyNDk5MTM=", "organizations_url": "https://api.github.com/users/noelmas/orgs", "received_events_url": "https://api.github.com/users/noelmas/received_events", "repos_url": "https://api.github.com/users/noelmas/repos", "site_admin": false, "starred_url": "https://api.github.com/users/noelmas/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/noelmas/subscriptions", "type": "User", "url": "https://api.github.com/users/noelmas", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2017-01-12T17:27:56Z
2021-09-08T13:05:30Z
2017-01-12T17:28:13Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1249913?v=4", "events_url": "https://api.github.com/users/noelmas/events{/privacy}", "followers_url": "https://api.github.com/users/noelmas/followers", "following_url": "https://api.github.com/users/noelmas/following{/other_user}", "gists_url": "https://api.github.com/users/noelmas/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/noelmas", "id": 1249913, "login": "noelmas", "node_id": "MDQ6VXNlcjEyNDk5MTM=", "organizations_url": "https://api.github.com/users/noelmas/orgs", "received_events_url": "https://api.github.com/users/noelmas/received_events", "repos_url": "https://api.github.com/users/noelmas/repos", "site_admin": false, "starred_url": "https://api.github.com/users/noelmas/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/noelmas/subscriptions", "type": "User", "url": "https://api.github.com/users/noelmas", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3813/reactions" }
https://api.github.com/repos/psf/requests/issues/3813/timeline
null
completed
null
null
false
[]
https://api.github.com/repos/psf/requests/issues/3812
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3812/labels{/name}
https://api.github.com/repos/psf/requests/issues/3812/comments
https://api.github.com/repos/psf/requests/issues/3812/events
https://github.com/psf/requests/pull/3812
200,423,124
MDExOlB1bGxSZXF1ZXN0MTAxMjk5NTY0
3,812
Add default value of allow_redirects to docs
{ "avatar_url": "https://avatars.githubusercontent.com/u/28734?v=4", "events_url": "https://api.github.com/users/inglesp/events{/privacy}", "followers_url": "https://api.github.com/users/inglesp/followers", "following_url": "https://api.github.com/users/inglesp/following{/other_user}", "gists_url": "https://api.github.com/users/inglesp/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/inglesp", "id": 28734, "login": "inglesp", "node_id": "MDQ6VXNlcjI4NzM0", "organizations_url": "https://api.github.com/users/inglesp/orgs", "received_events_url": "https://api.github.com/users/inglesp/received_events", "repos_url": "https://api.github.com/users/inglesp/repos", "site_admin": false, "starred_url": "https://api.github.com/users/inglesp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/inglesp/subscriptions", "type": "User", "url": "https://api.github.com/users/inglesp", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-12T17:00:51Z
2021-09-08T01:21:25Z
2017-01-12T17:12:54Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3812/reactions" }
https://api.github.com/repos/psf/requests/issues/3812/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3812.diff", "html_url": "https://github.com/psf/requests/pull/3812", "merged_at": "2017-01-12T17:12:54Z", "patch_url": "https://github.com/psf/requests/pull/3812.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3812" }
true
[ "Thanks @inglesp!" ]
https://api.github.com/repos/psf/requests/issues/3811
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3811/labels{/name}
https://api.github.com/repos/psf/requests/issues/3811/comments
https://api.github.com/repos/psf/requests/issues/3811/events
https://github.com/psf/requests/issues/3811
200,199,743
MDU6SXNzdWUyMDAxOTk3NDM=
3,811
Ubuntu 16.04 Python3 SSLV3_ALERT_HANDSHAKE_FAILURE
{ "avatar_url": "https://avatars.githubusercontent.com/u/5175230?v=4", "events_url": "https://api.github.com/users/2tim/events{/privacy}", "followers_url": "https://api.github.com/users/2tim/followers", "following_url": "https://api.github.com/users/2tim/following{/other_user}", "gists_url": "https://api.github.com/users/2tim/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/2tim", "id": 5175230, "login": "2tim", "node_id": "MDQ6VXNlcjUxNzUyMzA=", "organizations_url": "https://api.github.com/users/2tim/orgs", "received_events_url": "https://api.github.com/users/2tim/received_events", "repos_url": "https://api.github.com/users/2tim/repos", "site_admin": false, "starred_url": "https://api.github.com/users/2tim/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/2tim/subscriptions", "type": "User", "url": "https://api.github.com/users/2tim", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2017-01-11T20:32:47Z
2021-09-08T13:05:31Z
2017-01-11T20:40:57Z
NONE
resolved
Using: Python 3.5.2 Ubuntu 16.04.01 OpenSSL 1.0.2g 1 Mar 2016 pip freeze: alabaster==0.7.9 amqp==2.1.4 antigate==1.4.0 anyjson==0.3.3 Babel==2.3.4 beautifulsoup4==4.5.3 billiard==3.5.0.2 celery==4.0.2 celery-with-redis==3.0 certifi==2016.9.26 cffi==1.9.1 configparser==3.5.0 cryptography==1.7.1 Django==1.10.5 django-docs==0.2.1 djangorestframework==3.5.3 docutils==0.13.1 epydoc==3.0.1 flower==0.9.1 future==0.16.0 gunicorn==19.6.0 idna==2.2 imagesize==0.7.1 Jinja2==2.9.4 kombu==4.0.2 lxml==3.7.2 MarkupSafe==0.23 miette==1.3 ndg-httpsclient==0.4.2 nose==1.3.7 olefile==0.44 ordereddict==1.1 Pillow==4.0.0 probableparsing==0.0.1 psycopg2==2.6.1 pyasn1==0.1.9 pycparser==2.17 Pygments==2.1.3 pyPdf==1.13 python-crfsuite==0.9.1 python-dateutil==2.6.0 pytz==2016.10 redis==2.10.5 reportlab==3.3.0 requests==2.12.4 selenium==2.52.0 six==1.10.0 snowballstemmer==1.2.1 South==1.0.2 Sphinx==1.5.1 tornado==4.2 Unidecode==0.4.20 urllib3==1.19.1 usaddress==0.5.9 vine==1.1.3 xlrd==1.0.0 xmltodict==0.10.2 xvfbwrapper==0.2.9 I tried to track this down from other issues but none of them seemed to answer the problem for my setup. Here is the console steps with error output: Python 3.5.2 (default, Nov 17 2016, 17:05:23) [GCC 5.4.0 20160609] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> requests.get("https://webapps.kdads.ks.gov", verify=False) Traceback (most recent call last): File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 594, in urlopen chunked=chunked) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 350, in _make_request self._validate_conn(conn) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 835, in _validate_conn conn.connect() File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 323, in connect ssl_context=context) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py", line 324, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "/usr/lib/python3.5/ssl.py", line 377, in wrap_socket _context=self) File "/usr/lib/python3.5/ssl.py", line 752, in __init__ self.do_handshake() File "/usr/lib/python3.5/ssl.py", line 988, in do_handshake self._sslobj.do_handshake() File "/usr/lib/python3.5/ssl.py", line 633, in do_handshake self._sslobj.do_handshake() ssl.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/adapters.py", line 423, in send timeout=timeout File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 624, in urlopen raise SSLError(e) requests.packages.urllib3.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/api.py", line 70, in get return request('get', url, params=params, **kwargs) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/home/datadev/.envs/scraping/lib/python3.5/site-packages/requests/adapters.py", line 497, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3811/reactions" }
https://api.github.com/repos/psf/requests/issues/3811/timeline
null
completed
null
null
false
[ "I should note we had the same issue with OSX but assumed it was related to our setup with using the built in (outdated) version of OpenSSL. I'm hoping someone will have an easy fix for this that I am just missing.", "The problem here is that the only cipher suite the server supports is 3DES-based, and newer versions of Requests do not allow 3DES because it is insecure for large data transfers.\r\n\r\nIf you know the server operators or can contact them, you should ask them to improve their TLS usage. In the meantime, you can resolve this by mounting a custom transport adapter for that specific hostname, as shown by [this answer](https://github.com/kennethreitz/requests/issues/3774#issuecomment-267871876).", "@Lukasa Hmm, I was hoping that would work. I ran the following and got the same response.\r\n\r\nimport requests\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.poolmanager import PoolManager\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\nbaseurl='https://webapps.kdads.ks.gov/LSOBP18'\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, connections, maxsize, block=False,*args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n self.poolmanager = PoolManager(num_pools=connections,\r\n maxsize=maxsize,\r\n block=block,\r\n *args, **kwargs)\r\ns=requests.Session()\r\ns.mount('https://10.192.8.89', DESAdapter())\r\ns.get(baseurl, verify=False)", "You need to change the mount line to mount at https://webapps.kdads.ks.gov", "Thanks, sorry for that." ]
https://api.github.com/repos/psf/requests/issues/3810
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3810/labels{/name}
https://api.github.com/repos/psf/requests/issues/3810/comments
https://api.github.com/repos/psf/requests/issues/3810/events
https://github.com/psf/requests/issues/3810
200,116,604
MDU6SXNzdWUyMDAxMTY2MDQ=
3,810
Timeout in stream request closes connection from client side
{ "avatar_url": "https://avatars.githubusercontent.com/u/16137032?v=4", "events_url": "https://api.github.com/users/tsarvela/events{/privacy}", "followers_url": "https://api.github.com/users/tsarvela/followers", "following_url": "https://api.github.com/users/tsarvela/following{/other_user}", "gists_url": "https://api.github.com/users/tsarvela/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tsarvela", "id": 16137032, "login": "tsarvela", "node_id": "MDQ6VXNlcjE2MTM3MDMy", "organizations_url": "https://api.github.com/users/tsarvela/orgs", "received_events_url": "https://api.github.com/users/tsarvela/received_events", "repos_url": "https://api.github.com/users/tsarvela/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tsarvela/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tsarvela/subscriptions", "type": "User", "url": "https://api.github.com/users/tsarvela", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-11T15:03:22Z
2021-09-08T13:05:32Z
2017-01-11T15:05:53Z
NONE
resolved
I've hit a problematic behaviour with python3 and requests, stemming from urllib3. Consider the following example usage with tight read timeout. After getting the first exception, the connection is cleanly shut down from clientside. Serverside gets Broken pipe. I'd expected to be able to continue reading from the stream, without needing to reconnect. Reading the urllib3 explanation clarifies the issue somewhat, but is this wanted behaviour? ``` #!/usr/bin/env python3 import requests res = requests.get("http://localhost/stream", stream=True, timeout=(10.0, 1.0)) reason = None while not reason: try: line = res.raw.readline() print(line) except Exception as e: print (e) ``` Full traceback is ``` Traceback (most recent call last): File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 228, in _error_catcher yield File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 310, in read data = self._fp.read(amt) File "/usr/lib64/python3.5/http/client.py", line 448, in read n = self.readinto(b) File "/usr/lib64/python3.5/http/client.py", line 488, in readinto n = self.fp.readinto(b) File "/usr/lib64/python3.5/socket.py", line 575, in readinto return self._sock.recv_into(b) socket.timeout: timed out During handling of the above exception, another exception occurred: Traceback (most recent call last): File "example.py", line 8, in <module> except Exception as e: raise (e) File "example.py", line 6, in <module> line = res.raw.readline() File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 320, in read flush_decoder = True File "/usr/lib64/python3.5/contextlib.py", line 77, in __exit__ self.gen.throw(type, value, traceback) File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/response.py", line 233, in _error_catcher raise ReadTimeoutError(self._pool, None, 'Read timed out.') requests.packages.urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost'): Read timed out. ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3810/reactions" }
https://api.github.com/repos/psf/requests/issues/3810/timeline
null
completed
null
null
false
[ "Timeouts are considered error conditions: that is, we assume that if the server takes longer than a timeout to generate a value, you want us to treat that as an error. That means we tear the connection down, because it's now in an indeterminite state: the server may have gone away, or any number of other bad things have happened.\r\n\r\nThis is entirely expected behaviour. Requests does not support a form of non-blocking operation, so if you want to do stuff while you're waiting for downloads you should set longer timeouts and do your HTTP requests in a separate thread, allowing your main thread to execute while the background thread does the HTTP." ]
https://api.github.com/repos/psf/requests/issues/3809
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3809/labels{/name}
https://api.github.com/repos/psf/requests/issues/3809/comments
https://api.github.com/repos/psf/requests/issues/3809/events
https://github.com/psf/requests/pull/3809
200,045,203
MDExOlB1bGxSZXF1ZXN0MTAxMDMxOTQ2
3,809
Remove unused module from tests
{ "avatar_url": "https://avatars.githubusercontent.com/u/13811604?v=4", "events_url": "https://api.github.com/users/winterjung/events{/privacy}", "followers_url": "https://api.github.com/users/winterjung/followers", "following_url": "https://api.github.com/users/winterjung/following{/other_user}", "gists_url": "https://api.github.com/users/winterjung/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/winterjung", "id": 13811604, "login": "winterjung", "node_id": "MDQ6VXNlcjEzODExNjA0", "organizations_url": "https://api.github.com/users/winterjung/orgs", "received_events_url": "https://api.github.com/users/winterjung/received_events", "repos_url": "https://api.github.com/users/winterjung/repos", "site_admin": false, "starred_url": "https://api.github.com/users/winterjung/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/winterjung/subscriptions", "type": "User", "url": "https://api.github.com/users/winterjung", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-11T09:41:59Z
2021-09-08T01:21:26Z
2017-01-11T09:49:59Z
NONE
resolved
merge_cookies and cookiejar_from_dict functions are already tested by other test functions.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3809/reactions" }
https://api.github.com/repos/psf/requests/issues/3809/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3809.diff", "html_url": "https://github.com/psf/requests/pull/3809", "merged_at": "2017-01-11T09:49:59Z", "patch_url": "https://github.com/psf/requests/pull/3809.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3809" }
true
[ "Thanks for this @JungWinter! :sparkles: :cake: :sparkles:" ]
https://api.github.com/repos/psf/requests/issues/3808
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3808/labels{/name}
https://api.github.com/repos/psf/requests/issues/3808/comments
https://api.github.com/repos/psf/requests/issues/3808/events
https://github.com/psf/requests/issues/3808
200,012,415
MDU6SXNzdWUyMDAwMTI0MTU=
3,808
Connection reset by peer with AWS NAT Gateway and Keep Alive
{ "avatar_url": "https://avatars.githubusercontent.com/u/892861?v=4", "events_url": "https://api.github.com/users/ediskandarov/events{/privacy}", "followers_url": "https://api.github.com/users/ediskandarov/followers", "following_url": "https://api.github.com/users/ediskandarov/following{/other_user}", "gists_url": "https://api.github.com/users/ediskandarov/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ediskandarov", "id": 892861, "login": "ediskandarov", "node_id": "MDQ6VXNlcjg5Mjg2MQ==", "organizations_url": "https://api.github.com/users/ediskandarov/orgs", "received_events_url": "https://api.github.com/users/ediskandarov/received_events", "repos_url": "https://api.github.com/users/ediskandarov/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ediskandarov/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ediskandarov/subscriptions", "type": "User", "url": "https://api.github.com/users/ediskandarov", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2017-01-11T06:19:27Z
2021-09-08T11:00:38Z
2017-01-11T09:00:30Z
NONE
resolved
Set up following scenario to reproduce the issue: backend side: CloudFlare is proxying to AWS ELB. ELB is proxying to Nginx. client side: Create private subnet and attach AWS NAT Gateway to it. Launch EC2 instance in private subnet and use a script below to reproduce the issue. Request path is: EC2 instance(originating) -> AWS NAT Gateway -> CloudFlare -> AWS ELB -> Nginx. I created a script which reproduces the issue: ```python import requests import time def request_with_timeout(url, minutes): session = requests.Session() session.get(url, allow_redirects=False) time.sleep(minutes * 60) session.get(url, allow_redirects=False) request_with_timeout('https://coins.ph', 6) ``` After executing - requests raises an exception: ``` >>> request_with_timeout('https://coins.ph', 6) /opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning. SNIMissingWarning /opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning. 
InsecurePlatformWarning Traceback (most recent call last): File "<stdin>", line 1, in <module> File "<stdin>", line 5, in request_with_timeout File "/opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 480, in get return self.request('GET', url, **kwargs) File "/opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/opt/app/accounting/venv/local/lib/python2.7/site-packages/requests/adapters.py", line 426, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: ('Connection aborted.', error(104, 'Connection reset by peer')) ``` Software versions: ``` Python 2.7.6 requests==2.9.1 ``` But also reproduced on Python 2.7.12 and requests 2.12.4
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3808/reactions" }
https://api.github.com/repos/psf/requests/issues/3808/timeline
null
completed
null
null
false
[ "Attaching tcpdump\r\n\r\n[tcpdump-log.txt](https://github.com/kennethreitz/requests/files/698100/tcpdump-log.txt)\r\n", "The issue does not appear when URL has `http` scheme.", "Also everything works fine when there's no AWS NAT Gateway on the traffic path.\r\n\r\nThis can be a result of AWS NAT Gateway behavior. http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/vpc-nat-comparison.html\r\n\r\n<img width=\"769\" alt=\"screen shot 2017-01-10 at 16 14 36\" src=\"https://cloud.githubusercontent.com/assets/892861/21838818/60012a24-d7e5-11e6-8859-ac504780771b.png\">\r\n", "I also noticed that error does not happen if requests are passing-by cloudflare.", "So, this looks like you're encountering a race condition.\r\n\r\nIt seems like, one way or another, there is a server or CDN that has a 6-minute timeout on their connection, such that when it is idle for 6 minutes they will close it. If you start another request while the RST packet is in-flight, requests will see the RST as a connection abort. This is indistinguishable from the server closing the connection in our face, so we have to report it as an error condition.\r\n\r\nTo resolve this, I recommend adding retry logic. Your code should, in general, be resilient to failure in the face of network instability. 
You can do this either by mounting custom transport adapters, or by changing the value of `requests.adapters.DEFAULT_RETRIES` to something other than zero before you create your session.", "I also stumbled upon this behaviour recently.\r\nBeing resilient to failure in the face of network instability is good advice, but sometimes it may be expensive to implement this.\r\nAfter some research I found that turning on TCP keepalive helps prevent reset errors.\r\nFor example like this:\r\n\r\n class KeepaliveAdapter(requests.adapters.HTTPAdapter):\r\n def init_poolmanager(self, *args, **kwargs):\r\n kwargs['socket_options'] = HTTPConnection.default_socket_options + [\r\n (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),\r\n ]\r\n super(KeepaliveAdapter, self).init_poolmanager(*args, **kwargs)\r\n\r\n keepalive_aware_session = requests.Session()\r\n\r\n adapter = KeepaliveAdapter()\r\n keepalive_aware_session.mount('http://', adapter)\r\n keepalive_aware_session.mount('https://', adapter)\r\n\r\nAnd then use session as usual.\r\nBut for this to work it is also necessary to tune keepalive timers (i.e. set it less than 6 min, I used 180 seconds):\r\n\r\n sysctl -w net.ipv4.tcp_keepalive_time=180\r\n\r\nDefault value for this param on Ubuntu is **two hours**." ]
https://api.github.com/repos/psf/requests/issues/3807
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3807/labels{/name}
https://api.github.com/repos/psf/requests/issues/3807/comments
https://api.github.com/repos/psf/requests/issues/3807/events
https://github.com/psf/requests/issues/3807
199,955,807
MDU6SXNzdWUxOTk5NTU4MDc=
3,807
'NoneType' object has no attribute 'readline'
{ "avatar_url": "https://avatars.githubusercontent.com/u/9061113?v=4", "events_url": "https://api.github.com/users/gilessbrown/events{/privacy}", "followers_url": "https://api.github.com/users/gilessbrown/followers", "following_url": "https://api.github.com/users/gilessbrown/following{/other_user}", "gists_url": "https://api.github.com/users/gilessbrown/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gilessbrown", "id": 9061113, "login": "gilessbrown", "node_id": "MDQ6VXNlcjkwNjExMTM=", "organizations_url": "https://api.github.com/users/gilessbrown/orgs", "received_events_url": "https://api.github.com/users/gilessbrown/received_events", "repos_url": "https://api.github.com/users/gilessbrown/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gilessbrown/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gilessbrown/subscriptions", "type": "User", "url": "https://api.github.com/users/gilessbrown", "user_view_type": "public" }
[]
closed
true
null
[]
null
31
2017-01-10T22:48:46Z
2021-09-02T00:07:16Z
2017-01-11T17:29:42Z
NONE
resolved
Tried a quick check of open issues, but wasn't sure what to search for on this. Here's the code to reproduce: ``` import shutil import requests import os session = requests.Session() # it has something to do with content on the 302 from this url u = 'http://www.amazon.com/gp/redirect.html/ref=gw_m_b_ir?_encoding=UTF8&location=http%3A%2F%2Fphx.corporate-ir.net%2Fphoenix.zhtml%3Fc%3D97664%26p%3Dirol-irhome&source=standards&token=F9CAD8A11D4336B5E0B3C3B089FA066D0A467C1C' r0 = session.get(u, stream=True, allow_redirects=False) redirects = session.resolve_redirects(r0, r0.request, stream=True) for redirect in redirects: with open(os.devnull, 'wb') as devnull: shutil.copyfileobj(redirect.raw, devnull) ``` Gives me the traceback: ``` (tmp-91018d725433cf2c) gsbrown$ python bug.py Traceback (most recent call last): File "bug.py", line 11, in <module> for redirect in redirects: File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/sessions.py", line 106, in resolve_redirects resp.content # Consume socket so it can be released File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/models.py", line 781, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/models.py", line 703, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 428, in stream for line in self.read_chunked(amt, decode_content=decode_content): File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 590, in read_chunked self._update_chunk_length() File "/Users/gsbrown/.virtualenvs/tmp-91018d725433cf2c/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 532, in 
_update_chunk_length line = self._fp.fp.readline() AttributeError: 'NoneType' object has no attribute 'readline' ``` The redirect url is significant. It doesn't happen with the httpbin ones.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3807/reactions" }
https://api.github.com/repos/psf/requests/issues/3807/timeline
null
completed
null
null
false
[ "FYI: \r\n```\r\n$ pip list\r\npip (1.5.4)\r\nrequests (2.12.4)\r\nsetuptools (2.2)\r\nwsgiref (0.1.2)\r\n```", "Just a quick note... I can't really help with your issue directly, as it goes beyond my knowledge.\r\n\r\nJust thought I'd mention that pip is up to version 9.0.1... pip 1.5.4 is nearly 3 years old. I'd strongly suggest look at why you're using such an old version of pip", "Ha. I just like old stuff. :) Can you reproduce the bug?", "I can when using python 2.7.12\r\n\r\nI can't replicate with python 3.5 though. ", "Are you going to tell me to upgrade to Python 3.5? ;)", "Hey @gilessbrown, thanks for opening this issue! This behaviour was exposed by a change made in (327512f) which removed exception handling for this case. It currently only exists in the 2.12.x releases, so using Requests 2.11.1 should work for you.\r\n\r\nWhile you're seeing this behaviour in Requests, it actually seems to be how we're handling chunked responses without a body in urllib3. shazow/urllib3#990 introduced an attempt to catch this problem but the check is just slightly off. While it verifies the existence to `fp`, it doesn't check that `fp` isn't `None`.\r\n\r\nI think the simple fix here is to change [the check](https://github.com/shazow/urllib3/blob/master/urllib3/response.py#L525) in urllib3 to `return getattr(self._fp, 'fp', None) is not None` which should give us what we actually want. It verifies that `fp` both exists, and is not the default `None` value.\r\n\r\nIf @Lukasa or @sigmavirus24 are in agreement, we can address this over in urllib3.", "Also, I'm able to produce this in both Python 2.7 and 3.5.", "Aha - My 3.5 runtime was using requests 2.11.x, which is why it worked", "Thanks for the tip on Request 2.11.1 and for investigating the issue. I think I’ll follow your suggestion.", "> While you're seeing this behaviour in Requests, it actually seems to be how we're handling chunked responses without a body in urllib3.\r\n\r\nHang on. 
Chunked responses without a body *are not a thing*. The only time they're allowed is in response to a HEAD request, where the body must not be sent. Otherwise, if `Transfer-Encoding: chunked` is set then there *must* be a chunk body.\r\n\r\nI'd say the server is at fault here.", "> I'd say the server is at fault here.\r\n\r\nI can't reproduce this with 2.12.4 on Python 2.7.12 ... \r\n\r\nWhat I am seeing, however, is that the URL provided redirects to\r\n\r\n```\r\nhttps://www.amazon.com/gp/redirect.html/ref=gw_m_b_ir?_encoding=UTF8&location=http%3A%2F%2Fphx.corporate-ir.net%2Fphoenix.zhtml%3Fc%3D97664%26p%3Dirol-irhome&source=standards&token=F9CAD8A11D4336B5E0B3C3B089FA066D0A467C1C\r\n```\r\n\r\nWhich results in another 302 with `Transfer-Encoding: chunked`. If I access the content on that response, I get `'\\n'`.\r\n\r\n```\r\n>>> r.headers\r\n{'x-amz-id-1': 'V7C8AQ04FPXYP69MGHPS', 'Content-Encoding': 'gzip', 'Transfer-Encoding': 'chunked', 'Strict-Transport-Security': 'max-age=47474747; includeSubDomains; preload', 'Vary': 'Accept-Encoding,User-Agent', 'Server': 'Server', 'Connection': 'keep-alive', 'Location': 'http://phx.corporate-ir.net/phoenix.zhtml?c=97664&p=irol-irhome', 'Date': 'Wed, 11 Jan 2017 12:48:59 GMT', 'p3p': 'policyref=\"https://www.amazon.com/w3c/p3p.xml\",CP=\"CAO DSP LAW CUR ADM IVAo IVDo CONo OTPo OUR DELi PUBi OTRi BUS PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA HEA PRE LOC GOV OTC \"', 'Content-Type': 'text/html; charset=ISO-8859-1', 'x-frame-options': 'SAMEORIGIN'}\r\n>>> r.content\r\n'\\n'\r\n```\r\n\r\nAfter that, I get a 200. So in a virtualenv with 2.12.4 on Python 2.7.12 (and on Fedora 25) I get no issues. I'm curious what's different in what y'all are using that you're seeing this.", "> I'd say the server is at fault here.\r\n\r\nI completely agree the server isn't compliant here, but we also just said we'd *like* to be tolerant of things like this in #3794, which is even more out of spec. 
The server is definitely returning garbage though, so maybe we choose not to address this.\r\n\r\n@sigmavirus24 I'm currently able to reproduce this on Python 2.7.12 and 3.5.2 on Mac OS 10.12.2, Ubuntu 12.04, and [Travis](https://travis-ci.org/nateprewitt/requests/builds/190981913).\r\n\r\nThe repro won't fail if you start it at the second hop (the https url), so it seems to require this specific set of responses. It's definitely related to the transfer-encoding because chaining a similar set of responses ((http)302->(https)302->(separate server)200) from httpbin won't trigger the failure. \r\n\r\nThis \"bug\" was introduced in Requests 2.7.0 (urllib3 1.10.4) but masked by two separate `try/except AssertionError` blocks. The first was removed in 2.8.0 in c6c8d64 but this didn't expose the issue because the `content` exception block was still catching it. 327512f removed the second safeguard which is why 2.12.x is now showing this. I backported 327512f and was able to confirm this only started happening after the chunk transfer work in urllib3 1.10.4. Something causes the underlying socket closed before we can read it for the redirect, but I wasn't able to immediately find what.\r\n\r\nAt this point this is probably too much digging for a pretty uncommon edge case, but I'll let you two decide.", "I understand that streaming from the intermediate redirects is a little exotic. For the record I want to note that the server behaviour is not especially exotic. 
With tweaked sample to pass the url ...\r\n\r\n```\r\n(requestsbug$ python bug.py http://localad.homedepot.com/\r\nTraceback (most recent call last):\r\n File \"bug.py\", line 8, in <module>\r\n bug.main()\r\n File \"/Users/gsbrown/bug.py\", line 28, in main\r\n stream_redirects(url)\r\n File \"/Users/gsbrown/bug.py\", line 21, in stream_redirects\r\n for redirect in redirects:\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/sessions.py\", line 106, in resolve_redirects\r\n resp.content # Consume socket so it can be released\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/models.py\", line 781, in content\r\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/models.py\", line 703, in generate\r\n for chunk in self.raw.stream(chunk_size, decode_content=True):\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 428, in stream\r\n for line in self.read_chunked(amt, decode_content=decode_content):\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 590, in read_chunked\r\n self._update_chunk_length()\r\n File \"/Users/gsbrown/.virtualenvs/requestsbug/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 532, in _update_chunk_length\r\n line = self._fp.fp.readline()\r\nAttributeError: 'NoneType' object has no attribute 'readline'\r\n```\r\n\r\n\r\n\r\n", "Extra info about the previous run:\r\n```\r\n(requestsbug)$ pip list --format=columns\r\nPackage Version\r\n---------- -------\r\npip 9.0.1\r\nrequests 2.12.4\r\nsetuptools 2.2\r\n(requestsbug)$ python --version\r\nPython 2.7.10```", "And the downgrade does not generate the traceback:\r\n```\r\n(requestsbug)$ pip install requests==2.11.1\r\nCollecting 
requests==2.11.1\r\n Using cached requests-2.11.1-py2.py3-none-any.whl\r\nInstalling collected packages: requests\r\n Found existing installation: requests 2.12.4\r\n Uninstalling requests-2.12.4:\r\n Successfully uninstalled requests-2.12.4\r\nSuccessfully installed requests-2.11.1\r\n(requestsbug)$ python bug.py http://localad.homedepot.com/\r\n(requestsbug)$\r\n```", "There's nothing unexpected about streaming the intermediate response. We're fine with you doing that. What's wrong is the form of that intermediate response. Allow me to reproduce it for you here (using the original example):\r\n\r\n```\r\nHTTP/1.1 302 MovedTemporarily\r\nServer: Server\r\nDate: Wed, 11 Jan 2017 17:06:17 GMT\r\nContent-Type: text/html; charset=ISO-8859-1\r\nTransfer-Encoding: chunked\r\nConnection: keep-alive\r\nStrict-Transport-Security: max-age=47474747; includeSubDomains; preload\r\nx-amz-id-1: PF9SK6WE9FGKS0FV29ED\r\np3p: policyref=\"https://www.amazon.com/w3c/p3p.xml\",CP=\"CAO DSP LAW CUR ADM IVAo IVDo CONo OTPo OUR DELi PUBi OTRi BUS PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA HEA PRE LOC GOV OTC \"\r\nx-frame-options: SAMEORIGIN\r\nLocation: http://phx.corporate-ir.net/phoenix.zhtml?c=97664&p=irol-irhome\r\nVary: Accept-Encoding,User-Agent\r\nContent-Encoding: gzip\r\n\r\n\\x1f\\x8b\\x08\\x00\\x00\\x00\\x00\\x00\\x00\\x03\\xe2\\x02\\x00\\x00\\x00\\xff\\xff\\x03\\x00\\x93\\x06\\xd72\\x01\\x00\\x00\\x00\r\n```\r\n\r\n(I have had to hex encode the body to print it out, because it's not valid ASCII)\r\n\r\nNote that this response says `Transfer-Encoding: chunked`, and then provides a non-chunk-encoded body. That body does not obey chunked transfer encoding. This is *an error condition*. The traceback is not wrong. There is no way to safely read this body. That means your attempt to redirect that body into a file *must* fail.\r\n\r\nI cannot stress this enough: there is no way Requests can \"tolerate\" this if you ask us for the body because we cannot read the body. 
It is an invalid HTTP/1.1 response, and we cannot tell when the body is supposed to end. The problem here is not the traceback. If anything, the fact that it \"works\" in the older case is the *actual* bug, because we are pretending to the user that everything is fine when in fact it is very much not fine.\r\n\r\nIf you are going to handle redirects yourself, then you must be prepared to do what Requests does when it receives a bad body on a redirect response, which is swallow them. It is not our job to swallow them for you when you indicated you want the actual response. We are giving you the response: it is the server's fault that the response is invalid. Please take it up with them.", "I should note that I would be prepared to accept a PR on urllib3 to provide a better exception in this case, because really we should be emitting a urllib3-specific exception here.", "Hrm. Further investigation suggests that the chunked transfer encoding here is actually fine. I managed to pull out some specific chunked encoding from curl that works just fine, as does the httplib code. So, new question: why is the urllib3 code getting this wrong?", "Ack, I suspect I know why. We're consuming the content twice. If you eat the data via `response.raw`, `resolve_redirects` goes to consume it again and gets all wibbly and confused because it doesn't know that you already did. 
This can be seen by modifying the originally posted code to (note the additional line at the bottom):\r\n\r\n```python\r\nimport requests\r\nimport os\r\n\r\nsession = requests.Session()\r\n\r\n# it has something to do with content on the 302 from this url\r\nu = 'http://www.amazon.com/gp/redirect.html/ref=gw_m_b_ir?_encoding=UTF8&location=http%3A%2F%2Fphx.corporate-ir.net%2Fphoenix.zhtml%3Fc%3D97664%26p%3Dirol-irhome&source=standards&token=F9CAD8A11D4336B5E0B3C3B089FA066D0A467C1C'\r\nr0 = session.get(u, stream=True, allow_redirects=False)\r\nredirects = session.resolve_redirects(r0, r0.request, stream=True)\r\n\r\nfor redirect in redirects:\r\n with open(os.devnull, 'wb') as devnull:\r\n shutil.copyfileobj(redirect.raw, devnull)\r\n redirect._content = b''\r\n```", "So, `iter_content` flags this by setting `response._content_consumed = True` when it has finished iterating.\r\n\r\nI suspect the real culprit here is urllib3's custom chunking code, which does not check whether it believes the response is complete or not (in this case, it is, and so it should be short-circuit returning). I think that's a bug on urllib3.", "See shazow/urllib3#1088. In the meantime, I recommend a workaround when using `response.raw` when iterating redirects that you set `_content_consumed` to `True` if you have, in fact, consumed the content. Closing in favour of the urllib3 bug.", "Hi @Lukasa, your analysis of it being caused by the double consumption of raw (first by the `copyfileobj`, and then by the `response.content` in `resolve_redirects`) matches what I saw in the debugger so I'm pretty confident your work-around of setting `._content` will work for me. 
Thanks for your deep dive into this.", "Could someone change this subject to \"'NoneType' object has no attribute 'readline'\" ?\r\n\r\nI ran into the same issue, and the SEO on this pushed it down a bit on search results.\r\n\r\nFrom my tests so far, I believe the `._content_consumed` flag needs to be set to true if anything is read off the socket during a redirect resolve (ie a header not just \"content\"). My workaround is to place a \"#TODO\" pointing to this issue in my code, and consuming all the content + setting the flag at that point.", "@jvanasco Reading the headers should not require `_content_consumed` to be `True`. If it did, the `_content_consumed` flag would be entirely redundant, because we never create a `Response` object without having already read the headers.", "hm. I'm not sure what happened. I tracked my error down to a custom redirect resolver. while the block does consume content, I was getting an error on cases that exited out before then. the only thing accessed on those was the headers/status. \r\n\r\ni added these 2 lines to the top of the block, and all the edges passed:\r\n\r\n ...\r\n touched = resp.content\r\n resp._content_consumed = True\r\n ...\r\n\r\ni'm far too busy to look deeper into this, but I'll make a note and try to investigate on my free time. \r\n\r\noh! thanks for the input and making the title more SEO friendly!", "Howdy, I'm not clear – was this resolved? I'm on requests==2.18.4 with python 3.5.1 and still seeing this issue. 
\r\n\r\nSpecifically,\r\n\r\n```\r\n File \"/Users/foo/Code/dev/myapp/brain/network.py\", line 56, in expand_url\r\n return get(url)\r\n File \"/Users/foo/Code/dev/myapp/brain/network.py\", line 27, in get\r\n return requests.get(url, headers=get_headers(), timeout=20, allow_redirects=True)\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/api.py\", line 72, in get\r\n return request('get', url, params=params, **kwargs)\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/api.py\", line 58, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/sessions.py\", line 508, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/sessions.py\", line 640, in send\r\n history = [resp for resp in gen] if allow_redirects else []\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/sessions.py\", line 640, in <listcomp>\r\n history = [resp for resp in gen] if allow_redirects else []\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/sessions.py\", line 218, in resolve_redirects\r\n **adapter_kwargs\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/sessions.py\", line 658, in send\r\n r.content\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/models.py\", line 823, in content\r\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/requests/models.py\", line 745, in generate\r\n for chunk in self.raw.stream(chunk_size, decode_content=True):\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/urllib3/response.py\", line 432, in stream\r\n for line in self.read_chunked(amt, decode_content=decode_content):\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/urllib3/response.py\", 
line 598, in read_chunked\r\n self._update_chunk_length()\r\n File \"/Users/foo/Code/dev/venv/lib/python3.5/site-packages/urllib3/response.py\", line 540, in _update_chunk_length\r\n line = self._fp.fp.readline()\r\nAttributeError: 'NoneType' object has no attribute 'readline'\r\n```\r\n\r\nThank you for any insight. ", "Hey @zefoo, this is an issue in urllib3 which is a layer beneath Requests. The issue is being tracked in shazow/urllib3#1088, but I don't know if any progress has been made. I would recommend upgrading to the latest version of urllib3 and if the issue persists, follow up there. Thanks!", "Using `requests` with any common user-agent I hit the same bug at: `http://forbes.com`:\r\n\r\n```\r\nimport requests\r\n\r\nsession = requests.Session()\r\nresp = session.get('http://forbes.com', headers={'User-Agent': 'Mozilla/5.0'})\r\n```\r\n\r\nThe stacktrace:\r\n```\r\nAttributeError Traceback (most recent call last) \r\n<ipython-input-8-a73f5204685b> in <module>() \r\n----> 1 resp = session.get('http://forbes.com', headers={'User-Agent': 'Mozilla/5.0'}) \r\n \r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in get(self, url, **kwargs) \r\n 519 \r\n 520 kwargs.setdefault('allow_redirects', True) \r\n--> 521 return self.request('GET', url, **kwargs) \r\n 522 \r\n 523 def options(self, url, **kwargs): \r\n \r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream\r\n, verify, cert, json) \r\n 506 } \r\n 507 send_kwargs.update(settings) \r\n--> 508 resp = self.send(prep, **send_kwargs) \r\n 509 \r\n 510 return resp \r\n \r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in send(self, request, **kwargs) \r\n 638 \r\n 639 # Resolve redirects if allowed. 
\r\n--> 640 history = [resp for resp in gen] if allow_redirects else [] \r\n 641 \r\n 642 # Shuffle things around if there's history. \r\n \r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in <listcomp>(.0) \r\n 638 \r\n 639 # Resolve redirects if allowed. \r\n--> 640 history = [resp for resp in gen] if allow_redirects else [] \r\n 641 \r\n 642 # Shuffle things around if there's history. \r\n \r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in resolve_redirects(self, resp, req, stream, timeout, verify, cert, proxies, yield_requests, **adapter_kwargs)\r\n 216 proxies=proxies,\r\n 217 allow_redirects=False,\r\n--> 218 **adapter_kwargs\r\n 219 )\r\n 220 \r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/sessions.py in send(self, request, **kwargs)\r\n 656 \r\n 657 if not stream:\r\n--> 658 r.content\r\n 659 \r\n 660 return r\r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/models.py in content(self)\r\n 821 self._content = None\r\n 822 else:\r\n--> 823 self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\r\n 824 \r\n 825 self._content_consumed = True\r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/requests/models.py in generate()\r\n 743 if hasattr(self.raw, 'stream'):\r\n 744 try:\r\n--> 745 for chunk in self.raw.stream(chunk_size, decode_content=True):\r\n 746 yield chunk\r\n 747 except ProtocolError as e:\r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/urllib3/response.py in stream(self, amt, decode_content)\r\n 430 \"\"\"\r\n 431 if self.chunked and self.supports_chunked_reads():\r\n--> 432 for line in self.read_chunked(amt, decode_content=decode_content):\r\n 433 yield line\r\n 434 else:\r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/urllib3/response.py in read_chunked(self, amt, decode_content)\r\n 596 with self._error_catcher():\r\n 597 while True:\r\n--> 598 self._update_chunk_length()\r\n 599 if self.chunk_left == 0:\r\n 600 
break\r\n\r\n~/.virtualenvs/test/lib/python3.6/site-packages/urllib3/response.py in _update_chunk_length(self)\r\n 538 if self.chunk_left is not None:\r\n 539 return\r\n--> 540 line = self._fp.fp.readline()\r\n 541 line = line.split(b';', 1)[0]\r\n 542 try:\r\n\r\nAttributeError: 'NoneType' object has no attribute 'readline'\r\n```\r\n\r\nThis is the `wget` output:\r\n```\r\n--2018-05-31 20:39:04-- http://forbes.com/\r\nResolving forbes.com... 151.101.2.49\r\nConnecting to forbes.com|151.101.2.49|:80... connected.\r\nHTTP request sent, awaiting response... \r\n HTTP/1.1 301 Moved Permanently\r\n Server: Varnish\r\n Retry-After: 0\r\n Content-Length: 0\r\n Location: https://forbes.com/\r\n Accept-Ranges: bytes\r\n Date: Fri, 01 Jun 2018 03:39:04 GMT\r\n Via: 1.1 varnish\r\n Connection: close\r\n X-Served-By: cache-pao17432-PAO\r\n X-Cache: HIT\r\n X-Cache-Hits: 0\r\n X-Timer: S1527824344.292472,VS0,VE0\r\n Access-Control-Allow-Credentials: true\r\nLocation: https://forbes.com/ [following]\r\n--2018-05-31 20:39:04-- https://forbes.com/\r\nConnecting to forbes.com|151.101.2.49|:443... connected.\r\nHTTP request sent, awaiting response... 
\r\n HTTP/1.1 302 Found\r\n Server: Varnish\r\n Accept-Ranges: bytes\r\n location: https://www.forbes.com/forbes/welcome/?toURL=https://forbes.com/&refURL=&referrer=\r\n X-Frame-Options: SAMEORIGIN\r\n X-Cicero-Cache: MISS\r\n Content-Security-Policy: upgrade-insecure-requests\r\n Strict-Transport-Security: max-age=0; includeSubDomains\r\n Accept-Ranges: bytes\r\n Date: Fri, 01 Jun 2018 03:39:04 GMT\r\n Via: 1.1 varnish\r\n X-Served-By: cache-pao17451-PAO\r\n X-Cache: MISS\r\n X-Cache-Hits: 0\r\n X-Timer: S1527824344.332493,VS0,VE269\r\n Vary: X-ABtesting\r\n Access-Control-Allow-Credentials: true\r\n Connection: close\r\nLocation: https://www.forbes.com/forbes/welcome/?toURL=https://forbes.com/&refURL=&referrer= [following]\r\n--2018-05-31 20:39:04-- https://www.forbes.com/forbes/welcome/?toURL=https://forbes.com/&refURL=&referrer=\r\nResolving www.forbes.com... 151.101.190.49\r\nConnecting to www.forbes.com|151.101.190.49|:443... connected.\r\nHTTP request sent, awaiting response... \r\n HTTP/1.1 200 OK\r\n Content-Type: text/html;charset=utf-8\r\n Content-Language: en-US\r\n Server:\r\n Backend: templates\r\n X-YourTtl: 300.000\r\n X-Frame-Options: SAMEORIGIN\r\n X-Cicero-Cache: HIT 4\r\n Content-Security-Policy: upgrade-insecure-requests\r\n Strict-Transport-Security: max-age=10886400; includeSubDomains; preload\r\n Accept-Ranges: bytes\r\n Date: Fri, 01 Jun 2018 03:39:04 GMT\r\n Via: 1.1 varnish\r\n X-Served-By: cache-pao17438-PAO\r\n X-Cache: MISS\r\n X-Cache-Hits: 0\r\n X-Timer: S1527824345.632429,VS0,VE68\r\n Vary: Accept-Encoding, X-ABtesting\r\n Access-Control-Allow-Credentials: true\r\n Set-Cookie: client_id=442dfb7dc29c1468da4ee0432b04ab7cb1b; Path=/; Domain=.forbes.com; Expires=Sun, 31 May 2020 03:39:04 GMT\r\n Connection: close\r\nLength: unspecified [text/html]\r\n```\r\n\r\nThis is for:\r\n```\r\nrequests==2.18.4\r\nurllib3==1.22\r\n```\r\n\r\nHowever, I've even tried with `urlib3-dev` which has the fix for 
[urllib3/urllib3#1088](https://github.com/urllib3/urllib3/issues/1088) [merged](https://github.com/urllib3/urllib3/pull/1345) and the issue still persists.", "Hi @mjuarezm, if you can verify that you're using the dev version of urllib3, both with the warning we emit \"RequestsDependencyWarning: urllib3 (dev) or chardet (3.0.4) doesn't match a supported version!\" and with the contents of `requests.packages.urllib3.__version__` being \"dev\", you'll want to open a new ticket since this is 18 months old. I'm unable to reproduce the issue with what you've provided with the current master branch of urllib3, so we'll need more information about your system.", "@nateprewitt I'm sorry, I was installing requests from master branch using pip and didn't realize that it was automatically uninstalling urllib3-dev and installing urllib3. I made sure to use urllib3-dev and I get a status code 200 as expected. Thanks!" ]
https://api.github.com/repos/psf/requests/issues/3806
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3806/labels{/name}
https://api.github.com/repos/psf/requests/issues/3806/comments
https://api.github.com/repos/psf/requests/issues/3806/events
https://github.com/psf/requests/issues/3806
199,827,368
MDU6SXNzdWUxOTk4MjczNjg=
3,806
Problem with proxy + https
{ "avatar_url": "https://avatars.githubusercontent.com/u/1655105?v=4", "events_url": "https://api.github.com/users/jachymb/events{/privacy}", "followers_url": "https://api.github.com/users/jachymb/followers", "following_url": "https://api.github.com/users/jachymb/following{/other_user}", "gists_url": "https://api.github.com/users/jachymb/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jachymb", "id": 1655105, "login": "jachymb", "node_id": "MDQ6VXNlcjE2NTUxMDU=", "organizations_url": "https://api.github.com/users/jachymb/orgs", "received_events_url": "https://api.github.com/users/jachymb/received_events", "repos_url": "https://api.github.com/users/jachymb/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jachymb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jachymb/subscriptions", "type": "User", "url": "https://api.github.com/users/jachymb", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-10T13:56:11Z
2021-09-08T13:05:31Z
2017-01-11T20:39:25Z
NONE
resolved
GET request fails when requesting a https page over a https proxy. ``` >>> import requests >>> import os >>> os.getenv("HTTPS_PROXY") 'https://localhost:3130' >>> requests.get("https://example.org", verify = False) Traceback (most recent call last): File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen self._prepare_proxy(conn) File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 801, in _prepare_proxy conn.connect() File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 291, in connect self._tunnel() File "/usr/lib/python3.5/http/client.py", line 827, in _tunnel (version, code, message) = response._read_status() File "/usr/lib/python3.5/http/client.py", line 258, in _read_status line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") File "/usr/lib/python3.5/socket.py", line 575, in readinto return self._sock.recv_into(b) ConnectionResetError: [Errno 104] Connection reset by peer During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.5/site-packages/requests/adapters.py", line 423, in send timeout=timeout File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 643, in urlopen _stacktrace=sys.exc_info()[2]) File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py", line 363, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='example.org', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', ConnectionResetError(104, 'Connection reset by peer'))) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/lib/python3.5/site-packages/requests/api.py", line 70, in get return request('get', url, 
params=params, **kwargs) File "/usr/lib/python3.5/site-packages/requests/api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/usr/lib/python3.5/site-packages/requests/adapters.py", line 485, in send raise ProxyError(e, request=request) requests.exceptions.ProxyError: HTTPSConnectionPool(host='example.org', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', ConnectionResetError(104, 'Connection reset by peer'))) ``` The proxy is the squid proxy with this configuration directive: ``` https_port 3130 cert=/etc/sqiud/ssl/squid.pem key=/etc/squid/ssl/squid.key ``` And this works OK: ``` curl --proxy https://localhost:3130 --proxy-insecure https://example.com ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3806/reactions" }
https://api.github.com/repos/psf/requests/issues/3806/timeline
null
completed
null
null
false
[ "Requests does not really support connecting to proxies over HTTPS. In particular, we try to tunnel through them, and that just causes everything to go wrong. I'm afraid that until such time as someone improves httplib's support for this, or until we replace httplib, this is a use-case we simply cannot support.", "how about this? http://code.activestate.com/recipes/456195/", "We can make requests *to* HTTPS targets through a proxy, but we only support doing so when we've connected to the proxy over plaintext. That is, you must not make a TLS connection to the proxy itself." ]
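The maintainer's closing comment above (requests can reach HTTPS *targets* through a proxy, but only when the connection *to the proxy itself* is plaintext) can be illustrated without any network access. The proxy address below is a hypothetical example mirroring the issue's squid setup:

```python
from urllib.parse import urlsplit

# Hypothetical proxy address for illustration. At the time of this issue,
# requests tunneled CONNECT requests over a plaintext connection to the
# proxy, so the proxy URL should use the http:// scheme even when the
# target site is https:// -- unlike the reporter's HTTPS_PROXY setting.
proxies = {
    "http": "http://localhost:3130",
    "https": "http://localhost:3130",  # http://, not https://
}

# The key is the scheme of the *target* URL; the value's scheme is how
# we talk to the proxy. The two are independent.
assert urlsplit(proxies["https"]).scheme == "http"
assert urlsplit(proxies["https"]).port == 3130
```

With a dict shaped this way, a call like `requests.get("https://example.org", proxies=proxies)` tunnels to the HTTPS target through a plaintext proxy connection, which is the supported configuration described above.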
https://api.github.com/repos/psf/requests/issues/3805
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3805/labels{/name}
https://api.github.com/repos/psf/requests/issues/3805/comments
https://api.github.com/repos/psf/requests/issues/3805/events
https://github.com/psf/requests/issues/3805
199,809,741
MDU6SXNzdWUxOTk4MDk3NDE=
3,805
Cannot clone repo due to bad timezone error
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2017-01-10T12:34:30Z
2018-09-21T12:35:13Z
2017-01-10T12:35:52Z
NONE
resolved
``` nyuszika7h@cadoth ~/src > git clone https://github.com/kennethreitz/requests.git Cloning into 'requests'... remote: Counting objects: 18867, done. remote: Compressing objects: 100% (11/11), done. error: object 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8: badTimezone: invalid author/committer line - bad time zone fatal: Error in object fatal: index-pack failed ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3805/reactions" }
https://api.github.com/repos/psf/requests/issues/3805/timeline
null
completed
null
null
false
[ "Thanks for raising this issue! In future, please search the issue history before opening new issues on projects, as you'll often find that your issue has been reported before. In this case, this is a duplicate of #3088, #3008, and #2690.", "Sorry. I did search, but didn't get the right terms.", "If you have an error message, it's usually sufficient just to try to search for that. In this case, I searched for \"fatal: index-pack failed\" and #3088 popped right up.", "Yeah, well, I searched for \"timezone\" because I didn't realize there's a space in it in the error message. (Also, it's strange, I don't remember enabling fsck, but I found it in my git config.)", "fsck is on-by-default on new installs of git, so unless you're persisting a gitconfig from an older version of git you'll always find it on. =)", "I do have a custom gitconfig, so that's strange. This kind of issue could have been avoided if they enabled fsck by default in the first place, but oh well. You can't rewrite 6 years of history now, that would just mess things up even more. I do wonder how that weird timezone ended up in a commit though.", "Yeah, it's really annoying because it causes legitimate problems for lots of people, especially those who use git in automated scripts, but we really can't do anything about it now. =(", "Even though it hurts to do this, you may want to consider rewriting the history to fix this problem once and for all. For Breathe I ended up doing this and I added a notice to the top of the README, in the biggest heading possible, for a while.\r\n\r\nProjects using this as a submodule will have to update the SHA once. But currently many setups encounter a clone failure meaning existing submodules will not work at all and cloning this repo directly won't either.\r\n\r\nRef: https://github.com/michaeljones/breathe/issues/340#issuecomment-394396673" ]
https://api.github.com/repos/psf/requests/issues/3804
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3804/labels{/name}
https://api.github.com/repos/psf/requests/issues/3804/comments
https://api.github.com/repos/psf/requests/issues/3804/events
https://github.com/psf/requests/pull/3804
199,734,872
MDExOlB1bGxSZXF1ZXN0MTAwODEyMzIy
3,804
Improve discoverability of OAuth 2 support
{ "avatar_url": "https://avatars.githubusercontent.com/u/1026649?v=4", "events_url": "https://api.github.com/users/ncoghlan/events{/privacy}", "followers_url": "https://api.github.com/users/ncoghlan/followers", "following_url": "https://api.github.com/users/ncoghlan/following{/other_user}", "gists_url": "https://api.github.com/users/ncoghlan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ncoghlan", "id": 1026649, "login": "ncoghlan", "node_id": "MDQ6VXNlcjEwMjY2NDk=", "organizations_url": "https://api.github.com/users/ncoghlan/orgs", "received_events_url": "https://api.github.com/users/ncoghlan/received_events", "repos_url": "https://api.github.com/users/ncoghlan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ncoghlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ncoghlan/subscriptions", "type": "User", "url": "https://api.github.com/users/ncoghlan", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-10T05:17:34Z
2021-09-05T00:07:11Z
2017-01-10T08:31:59Z
CONTRIBUTOR
resolved
The previous summary gave the impression that requests-oauthlib only supports OAuth 1. This update makes it clear that it also supports OAuth 2, and links directly to the use-case-specific authentication flow guides.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3804/reactions" }
https://api.github.com/repos/psf/requests/issues/3804/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3804.diff", "html_url": "https://github.com/psf/requests/pull/3804", "merged_at": "2017-01-10T08:31:59Z", "patch_url": "https://github.com/psf/requests/pull/3804.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3804" }
true
[ "I'm currently working on adding OpenID Connect support to a pre-existing web application (https://github.com/fedora-infra/anitya/issues/329) and found it very hard to find good documentation on writing a full semi-automated integration test where the only manual step is the user authorising a temporary access token.\r\n\r\nPart of the problem was hitting this page in the docs and thinking that requests didn't have native OAuth 2 support - it was only after clicking through to the full requests-oauthlib documentation that I realised that that initial impression was wrong.", "Thanks for having the code available to find!\r\n\r\nYou may be amused by the horrible hack I wrote to automate obtaining a refresh token for Anitya's Fedora Account System integration tests: https://github.com/ncoghlan/anitya/blob/authenticated-api/request_oidc_credentials.py :)", "permalink for above broken link:\r\nhttps://github.com/release-monitoring/anitya/blob/c369715f5ae7873066ca92b7ae3c412f01711e16/request_oidc_credentials.py" ]
https://api.github.com/repos/psf/requests/issues/3803
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3803/labels{/name}
https://api.github.com/repos/psf/requests/issues/3803/comments
https://api.github.com/repos/psf/requests/issues/3803/events
https://github.com/psf/requests/issues/3803
199,635,105
MDU6SXNzdWUxOTk2MzUxMDU=
3,803
AttributeError: 'set' object has no attribute 'items'
{ "avatar_url": "https://avatars.githubusercontent.com/u/911902?v=4", "events_url": "https://api.github.com/users/codespaced/events{/privacy}", "followers_url": "https://api.github.com/users/codespaced/followers", "following_url": "https://api.github.com/users/codespaced/following{/other_user}", "gists_url": "https://api.github.com/users/codespaced/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/codespaced", "id": 911902, "login": "codespaced", "node_id": "MDQ6VXNlcjkxMTkwMg==", "organizations_url": "https://api.github.com/users/codespaced/orgs", "received_events_url": "https://api.github.com/users/codespaced/received_events", "repos_url": "https://api.github.com/users/codespaced/repos", "site_admin": false, "starred_url": "https://api.github.com/users/codespaced/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/codespaced/subscriptions", "type": "User", "url": "https://api.github.com/users/codespaced", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2017-01-09T19:05:36Z
2021-08-31T00:06:53Z
2017-01-09T19:41:40Z
NONE
resolved
fresh pip install requests. version 2.12.4. This works in postman. ``` import requests url = 'https://api.spotify.com/v1/audio-analysis/4JpKVNYnVcJ8tuMKjAj50A' headers = {'Authorization', 'Bearer TOKEN-HERE'} r = requests.get(url, headers=headers) ``` ``` --------------------------------------------------------------------------- AttributeError Traceback (most recent call last) <ipython-input-5-0b5138b50602> in <module>() 1 url = 'https://api.spotify.com/v1/audio-analysis/4JpKVNYnVcJ8tuMKjAj50A' 2 headers = {'Authorization', 'Bearer TOKEN-HERE'} ----> 3 r = requests.get(url, headers=headers) c:\python27\lib\site-packages\requests\api.pyc in get(url, params, **kwargs) 68 69 kwargs.setdefault('allow_redirects', True) ---> 70 return request('get', url, params=params, **kwargs) 71 72 c:\python27\lib\site-packages\requests\api.pyc in request(method, url, **kwargs) 54 # cases, and look like a memory leak in others. 55 with sessions.Session() as session: ---> 56 return session.request(method=method, url=url, **kwargs) 57 58 c:\python27\lib\site-packages\requests\sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json) 472 hooks = hooks, 473 ) --> 474 prep = self.prepare_request(req) 475 476 proxies = proxies or {} c:\python27\lib\site-packages\requests\sessions.pyc in prepare_request(self, request) 405 auth=merge_setting(auth, self.auth), 406 cookies=merged_cookies, --> 407 hooks=merge_hooks(request.hooks, self.hooks), 408 ) 409 return p c:\python27\lib\site-packages\requests\models.pyc in prepare(self, method, url, headers, files, data, params, auth, cookies, hooks, json) 301 self.prepare_method(method) 302 self.prepare_url(url, params) --> 303 self.prepare_headers(headers) 304 self.prepare_cookies(cookies) 305 self.prepare_body(data, files, json) c:\python27\lib\site-packages\requests\models.pyc in prepare_headers(self, headers) 423 self.headers = CaseInsensitiveDict() 
424 if headers: --> 425 for header in headers.items(): 426 # Raise exception on invalid header value. 427 check_header_validity(header) AttributeError: 'set' object has no attribute 'items' ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3803/reactions" }
https://api.github.com/repos/psf/requests/issues/3803/timeline
null
completed
null
null
false
[ "Hey @codespaced, thanks for opening this issue. It looks like you used a comma instead of a colon to separate the key and value in your header. This is creating a set object instead of the dictionary we're expecting. Replacing that should solve your issue.", "Yup, this is strictly a Python syntax problem and not a Requests issue. Thanks @nateprewitt. :D", "hey i am facing the same isuues but not able to get where i am facing attribute errors:\r\n\r\n self.token= 'ywlGaT0X44WGEps98WDN8xyFMmJgfGTehKxT5heiq_A7yqa_xwM9z6s2ouB1R7LprEYaTtXoILzx713kYhvoOxQrPLsRuvEAhnHfpc0sRonNXRYJgD91P8c2_mJJJ58MYs2TiMCEXlByG2ODRrb9e5OA35l74HXAegMabFJE7GnLX_4ZCxD2EygFP8LKqamdfao66QXAmXW60nwWYOF7g3bPdQNH3eK2siY9yODd_pFJ2IpLxBiadEFvAWZj5Y'\r\n\r\n self.headers = {'content-type': 'application/json',\r\n 'Authorization': 'token {}'.format(self.token)}\r\n\r\n\r\ngetting an error like :+1: \r\n def test_api_get(self):\r\n # A get request (json example):\r\n> response = requests.get(self.myurl, headers=self.header)\r\nE AttributeError: 'CL_API' object has no attribute 'header'\r\n\r\n\r\nPlease give me some inputs on it", "I also have a similar error and I don't know why. 
\r\n\r\nfrom bs4 import BeautifulSoup\r\nimport requests\r\n\r\nURL = \"https://www.amazon.com/Canon-EOS-1D-Digital-Discontinued-Manufacturer/dp/B005Y3T1AI?pf_rd_r=VM4AVQ8V3J10M7R9S7P7&pf_rd_p=be25f964-4afb-442f-819e-9e628b270a7c&pd_rd_r=9182559a-b6e0-49d1-bd12-5b729de2c9b5&pd_rd_w=AgAM5&pd_rd_wg=wTOnV&ref_=pd_gw_ci_mcx_mr_hp_d\"\r\n\r\nheaders = {\"User Agent:\" 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36'}\r\n\r\ndef price_check():\r\n page = requests.get(URL, headers=headers)\r\n soup = BeautifulSoup(page.content, 'html.parser')\r\n\r\n title = soup.find(id=\"productTitle\").get_text()\r\n print(title.strip())\r\n\r\nprice_check()", "You have a typo in how you're writing your headers value.\r\n\r\n```py\r\nheaders = {\"User Agent:\" 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36'}\r\n```\r\n\r\nShould be\r\n\r\n```py\r\nheaders = {\"User Agent\": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36'}\r\n```\r\n\r\nLook very closely" ]
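Both mistakes diagnosed in the comments above — a comma where a colon belongs, and a missing colon that makes Python concatenate adjacent string literals — are pure syntax pitfalls that can be demonstrated without any HTTP call. The token and user-agent strings below are placeholders:

```python
# Comma instead of colon: this literal is a *set* of two strings,
# not a dict, so it has no .items() method for requests to iterate.
broken = {'Authorization', 'Bearer TOKEN-HERE'}
assert isinstance(broken, set)
assert not hasattr(broken, 'items')

# Colon in the right place: a one-entry dict, which is what requests expects.
fixed = {'Authorization': 'Bearer TOKEN-HERE'}
assert isinstance(fixed, dict)
assert list(fixed.items()) == [('Authorization', 'Bearer TOKEN-HERE')]

# Missing colon between key and value: Python concatenates the adjacent
# string literals, again producing a set -- with one merged string.
oops = {'User-Agent:' 'Mozilla/5.0'}
assert oops == {'User-Agent:Mozilla/5.0'}
```

This is why the traceback ends at `headers.items()`: requests assumes a mapping, and a set built by either typo reaches `prepare_headers` unchanged.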
https://api.github.com/repos/psf/requests/issues/3802
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3802/labels{/name}
https://api.github.com/repos/psf/requests/issues/3802/comments
https://api.github.com/repos/psf/requests/issues/3802/events
https://github.com/psf/requests/pull/3802
199,472,453
MDExOlB1bGxSZXF1ZXN0MTAwNjMwOTI2
3,802
Add **kwargs to cookiejar_from_dict
{ "avatar_url": "https://avatars.githubusercontent.com/u/606837?v=4", "events_url": "https://api.github.com/users/Atterratio/events{/privacy}", "followers_url": "https://api.github.com/users/Atterratio/followers", "following_url": "https://api.github.com/users/Atterratio/following{/other_user}", "gists_url": "https://api.github.com/users/Atterratio/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Atterratio", "id": 606837, "login": "Atterratio", "node_id": "MDQ6VXNlcjYwNjgzNw==", "organizations_url": "https://api.github.com/users/Atterratio/orgs", "received_events_url": "https://api.github.com/users/Atterratio/received_events", "repos_url": "https://api.github.com/users/Atterratio/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Atterratio/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Atterratio/subscriptions", "type": "User", "url": "https://api.github.com/users/Atterratio", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-09T03:45:58Z
2021-09-07T00:06:37Z
2017-02-10T17:11:56Z
NONE
resolved
It's can be useful for set not default cookies attributes values, for example "domain" or "discard".
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3802/reactions" }
https://api.github.com/repos/psf/requests/issues/3802/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3802.diff", "html_url": "https://github.com/psf/requests/pull/3802", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3802.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3802" }
true
[ "So, while we can do this, are you not likely to be better served by creating a CookieJar yourself at this point? `cookiejar_from_dict` as a utility function is mostly intended for internal use to support the usage of dict in our APIs: we kind of expect that users who want to get more granular control over cookie settings will simply directly use a CookieJar.", "I agree with @Lukasa. While I know a significant number of people use this function, it's not a public API that we support and even so, this is the one of the least optimal ways of accomplishing what you want." ]
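The reviewers' suggestion above — build a CookieJar directly when you need control over attributes like "domain" or "discard", rather than extending `cookiejar_from_dict` — can be sketched with the standard library alone. All names and values below are illustrative:

```python
from http.cookiejar import Cookie, CookieJar

# Construct a cookie with explicit non-default attributes (domain, discard),
# which cookiejar_from_dict's defaults do not expose.
cookie = Cookie(
    version=0, name='session', value='abc123',
    port=None, port_specified=False,
    domain='example.org', domain_specified=True, domain_initial_dot=False,
    path='/', path_specified=True,
    secure=False, expires=None, discard=True,
    comment=None, comment_url=None, rest={},
)

jar = CookieJar()
jar.set_cookie(cookie)

# The jar now holds the cookie with exactly the attributes we chose.
stored = next(iter(jar))
assert stored.name == 'session' and stored.domain == 'example.org'
assert stored.discard is True
```

A jar built this way can be passed as the `cookies=` argument to requests calls or assigned to a Session, since requests accepts any `http.cookiejar.CookieJar`.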
https://api.github.com/repos/psf/requests/issues/3801
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3801/labels{/name}
https://api.github.com/repos/psf/requests/issues/3801/comments
https://api.github.com/repos/psf/requests/issues/3801/events
https://github.com/psf/requests/pull/3801
199,452,909
MDExOlB1bGxSZXF1ZXN0MTAwNjE5MzIz
3,801
Added python 3.6.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/1374633?v=4", "events_url": "https://api.github.com/users/andriisoldatenko/events{/privacy}", "followers_url": "https://api.github.com/users/andriisoldatenko/followers", "following_url": "https://api.github.com/users/andriisoldatenko/following{/other_user}", "gists_url": "https://api.github.com/users/andriisoldatenko/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/andriisoldatenko", "id": 1374633, "login": "andriisoldatenko", "node_id": "MDQ6VXNlcjEzNzQ2MzM=", "organizations_url": "https://api.github.com/users/andriisoldatenko/orgs", "received_events_url": "https://api.github.com/users/andriisoldatenko/received_events", "repos_url": "https://api.github.com/users/andriisoldatenko/repos", "site_admin": false, "starred_url": "https://api.github.com/users/andriisoldatenko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/andriisoldatenko/subscriptions", "type": "User", "url": "https://api.github.com/users/andriisoldatenko", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2017-01-08T22:41:06Z
2021-09-08T01:21:26Z
2017-01-09T04:13:04Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3801/reactions" }
https://api.github.com/repos/psf/requests/issues/3801/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3801.diff", "html_url": "https://github.com/psf/requests/pull/3801", "merged_at": "2017-01-09T04:13:04Z", "patch_url": "https://github.com/psf/requests/pull/3801.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3801" }
true
[]
https://api.github.com/repos/psf/requests/issues/3800
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3800/labels{/name}
https://api.github.com/repos/psf/requests/issues/3800/comments
https://api.github.com/repos/psf/requests/issues/3800/events
https://github.com/psf/requests/issues/3800
199,384,061
MDU6SXNzdWUxOTkzODQwNjE=
3,800
Use proxy to hide IP
{ "avatar_url": "https://avatars.githubusercontent.com/u/1591920?v=4", "events_url": "https://api.github.com/users/Djokx/events{/privacy}", "followers_url": "https://api.github.com/users/Djokx/followers", "following_url": "https://api.github.com/users/Djokx/following{/other_user}", "gists_url": "https://api.github.com/users/Djokx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Djokx", "id": 1591920, "login": "Djokx", "node_id": "MDQ6VXNlcjE1OTE5MjA=", "organizations_url": "https://api.github.com/users/Djokx/orgs", "received_events_url": "https://api.github.com/users/Djokx/received_events", "repos_url": "https://api.github.com/users/Djokx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Djokx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Djokx/subscriptions", "type": "User", "url": "https://api.github.com/users/Djokx", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-07T20:45:09Z
2021-09-08T13:05:33Z
2017-01-07T21:05:41Z
NONE
resolved
Hi, I'm trying to use a proxy to hide my IP with my script. I'm just trying `requests.get("https://api.ipify.org?format=json", proxies={"http":proxies[0][:-1]}).json()` But I get my real IP address, not the proxy's one. Is that normal? How is it possible to hide my IP with requests? Thanks
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3800/reactions" }
https://api.github.com/repos/psf/requests/issues/3800/timeline
null
completed
null
null
false
[ "The proxy dictionary is keyed on the scheme of the URL you are *requesting*. In your example you are requesting a *https* URL with a proxy dictionary that only has a *http* key. This means Requests thinks you don't want to proxy https requests, and so sends them directly. Add *https* to your proxy dictionary as well, using the same proxy, and you should be fine." ]
https://api.github.com/repos/psf/requests/issues/3799
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3799/labels{/name}
https://api.github.com/repos/psf/requests/issues/3799/comments
https://api.github.com/repos/psf/requests/issues/3799/events
https://github.com/psf/requests/issues/3799
199,215,760
MDU6SXNzdWUxOTkyMTU3NjA=
3,799
Why LifoQueue and not simply Queue (or custom LifoQueue)?
{ "avatar_url": "https://avatars.githubusercontent.com/u/56894?v=4", "events_url": "https://api.github.com/users/Kronuz/events{/privacy}", "followers_url": "https://api.github.com/users/Kronuz/followers", "following_url": "https://api.github.com/users/Kronuz/following{/other_user}", "gists_url": "https://api.github.com/users/Kronuz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Kronuz", "id": 56894, "login": "Kronuz", "node_id": "MDQ6VXNlcjU2ODk0", "organizations_url": "https://api.github.com/users/Kronuz/orgs", "received_events_url": "https://api.github.com/users/Kronuz/received_events", "repos_url": "https://api.github.com/users/Kronuz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Kronuz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Kronuz/subscriptions", "type": "User", "url": "https://api.github.com/users/Kronuz", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2017-01-06T15:12:38Z
2021-09-08T13:05:33Z
2017-01-06T15:24:18Z
NONE
resolved
During some profiling, I found out one slow operation is `self.pool.get()` in `_get_conn()`. In urllib3, in `connectionpool.ConnectionPool`, requests uses `LifoQueue` (which uses a list instead of a much faster deque): https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L64 My suggestion is one could use a custom LifoQueue as: ```python class LifoQueue(Queue): def _get(self): return self.queue.pop() ``` or simply use a Queue, if Lifo is not really needed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3799/reactions" }
https://api.github.com/repos/psf/requests/issues/3799/timeline
null
completed
null
null
false
[ "The LIFO property of the LifoQueue is very much needed: this is to ensure the increased likelihood of using a warm TCP connection, which provides substantial efficiency bonuses. The reason it's a queue, instead of just a deque, is to get the thread-safety guarantees that the queue module provides.\r\n\r\nYou are welcome to attempt to optimise this, but those two semantics need to be preserved. I suspect you'll find that the major performance cost is actually the locking, and by the time you reintroduce that the perf gains are minimal.", "Further the code you're linking to is actually in urllib3, thus making this not only an invalid defect report but also one on the wrong project." ]
https://api.github.com/repos/psf/requests/issues/3798
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3798/labels{/name}
https://api.github.com/repos/psf/requests/issues/3798/comments
https://api.github.com/repos/psf/requests/issues/3798/events
https://github.com/psf/requests/issues/3798
199,156,439
MDU6SXNzdWUxOTkxNTY0Mzk=
3,798
Customized proxies seem not to work; how can we know the proxy IP address sent to the server?
{ "avatar_url": "https://avatars.githubusercontent.com/u/4835229?v=4", "events_url": "https://api.github.com/users/withr/events{/privacy}", "followers_url": "https://api.github.com/users/withr/followers", "following_url": "https://api.github.com/users/withr/following{/other_user}", "gists_url": "https://api.github.com/users/withr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/withr", "id": 4835229, "login": "withr", "node_id": "MDQ6VXNlcjQ4MzUyMjk=", "organizations_url": "https://api.github.com/users/withr/orgs", "received_events_url": "https://api.github.com/users/withr/received_events", "repos_url": "https://api.github.com/users/withr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/withr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/withr/subscriptions", "type": "User", "url": "https://api.github.com/users/withr", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-06T09:27:37Z
2021-09-08T13:05:34Z
2017-01-06T09:30:38Z
NONE
resolved
Take this as an example: ``` #!/usr/bin/env python # -*- coding: utf-8 -*- import requests proxies = {"http":"http://111.222.333.444:80"} r = requests.get("https://www.google.com", proxies=proxies, timeout=5) r.status_code r.content ``` Obviously, the proxy "111.222.333.444" is not correct, but why did I still get status_code 200? How can we check the proxies we used to send to the server?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3798/reactions" }
https://api.github.com/repos/psf/requests/issues/3798/timeline
null
completed
null
null
false
[ "The `proxies` dictionary is defined as having the following form: the keys are the URL schemes for the URLs you are requesting, and the values are the URLs for reaching the proxy.\r\n\r\nThat means when we make a web request, we take the scheme of the URL you're requesting and look for a proxy for it. In your case, the scheme is `https` (because you're looking for `https://www.google.com`). Your proxy dictionary has no `https` key, so that means \"no proxy\", so we just go directly to Google.\r\n\r\nIf you change your `requests.get` to be `http://www.google.com`, then you'll find that we attempt to reach the proxy and error out.", "Thanks, so rapid reply!!\r\nCan you also reply the second question: can we retrieve the proxies send to server? some time, I got error, and I am pretty sure the schemes are same but still got an error, like use free proxies, some of them may out of date. ", "@withr Unfortunately, not easily. Generally speaking you should look at the proxies dictionary you provided. Alternatively, you can write a custom Transport Adapter that will provide the information you need, though that's quite a lot of work." ]
https://api.github.com/repos/psf/requests/issues/3797
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3797/labels{/name}
https://api.github.com/repos/psf/requests/issues/3797/comments
https://api.github.com/repos/psf/requests/issues/3797/events
https://github.com/psf/requests/issues/3797
198,975,576
MDU6SXNzdWUxOTg5NzU1NzY=
3,797
Help: how to upgrade urllib3 to solve the "filedescriptor out of range in select()" issue
{ "avatar_url": "https://avatars.githubusercontent.com/u/4769764?v=4", "events_url": "https://api.github.com/users/abc100m/events{/privacy}", "followers_url": "https://api.github.com/users/abc100m/followers", "following_url": "https://api.github.com/users/abc100m/following{/other_user}", "gists_url": "https://api.github.com/users/abc100m/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/abc100m", "id": 4769764, "login": "abc100m", "node_id": "MDQ6VXNlcjQ3Njk3NjQ=", "organizations_url": "https://api.github.com/users/abc100m/orgs", "received_events_url": "https://api.github.com/users/abc100m/received_events", "repos_url": "https://api.github.com/users/abc100m/repos", "site_admin": false, "starred_url": "https://api.github.com/users/abc100m/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/abc100m/subscriptions", "type": "User", "url": "https://api.github.com/users/abc100m", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2017-01-05T15:01:02Z
2021-09-08T12:01:09Z
2017-01-05T15:02:47Z
NONE
resolved
requests version: 2.12.4 python version: 2.7.9 platform: linux(CentOS) issue occurred: ```shell ValueError: filedescriptor out of range in select() ``` detail description: https://github.com/shazow/urllib3/commit/a49bec58fa5f7aca76d1d0b2f1975eb094648eab and this issue has been fixed: https://github.com/shazow/urllib3/issues/589 https://github.com/sigmavirus24/urllib3/commit/eb8264f55be9cd6b2d0be2833c5ba48058ce999d I wonder why requests doesn't upgrade urllib3, and **how can I upgrade urllib3 for requests?**
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3797/reactions" }
https://api.github.com/repos/psf/requests/issues/3797/timeline
null
completed
null
null
false
[ "Requests uses a vendored copy of urllib3, so we ship our own updates to it. That means you cannot upgrade urllib3 separately to requests.\r\n\r\nWhile I am here, I should note that there is no released version of urllib3 that contains that fix, you'd have to use urllib3 from master. If that's the case, you are best served by cloning the Requests repository and updating the copy of urllib3 inside it and installing that.", "@Lukasa Sorry to bother on closed issue but I'm wondering why should this fix not being updated in Request?", "@RobGThai We pushed requests v2.13 today which contains that fix." ]
https://api.github.com/repos/psf/requests/issues/3796
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3796/labels{/name}
https://api.github.com/repos/psf/requests/issues/3796/comments
https://api.github.com/repos/psf/requests/issues/3796/events
https://github.com/psf/requests/pull/3796
198,880,786
MDExOlB1bGxSZXF1ZXN0MTAwMjM3NjEx
3,796
updating https demo urls
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2017-01-05T05:45:45Z
2021-09-08T01:21:26Z
2017-01-05T06:17:23Z
MEMBER
resolved
The doc examples for SSL Cert Verification are pointing at an older url that no longer serves a response which is causing the example to hang indefinitely. This updates the urls to the current, correct domain. Also removing a swap file that I accidentally snuck in a couple months ago 😅
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3796/reactions" }
https://api.github.com/repos/psf/requests/issues/3796/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3796.diff", "html_url": "https://github.com/psf/requests/pull/3796", "merged_at": "2017-01-05T06:17:23Z", "patch_url": "https://github.com/psf/requests/pull/3796.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3796" }
true
[ "✨🍰✨" ]
https://api.github.com/repos/psf/requests/issues/3795
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3795/labels{/name}
https://api.github.com/repos/psf/requests/issues/3795/comments
https://api.github.com/repos/psf/requests/issues/3795/events
https://github.com/psf/requests/pull/3795
198,174,940
MDExOlB1bGxSZXF1ZXN0OTk3NzAwNzk=
3,795
Require pytest-mock for the tests
{ "avatar_url": "https://avatars.githubusercontent.com/u/916551?v=4", "events_url": "https://api.github.com/users/AdamWill/events{/privacy}", "followers_url": "https://api.github.com/users/AdamWill/followers", "following_url": "https://api.github.com/users/AdamWill/following{/other_user}", "gists_url": "https://api.github.com/users/AdamWill/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AdamWill", "id": 916551, "login": "AdamWill", "node_id": "MDQ6VXNlcjkxNjU1MQ==", "organizations_url": "https://api.github.com/users/AdamWill/orgs", "received_events_url": "https://api.github.com/users/AdamWill/received_events", "repos_url": "https://api.github.com/users/AdamWill/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AdamWill/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AdamWill/subscriptions", "type": "User", "url": "https://api.github.com/users/AdamWill", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-12-30T17:41:19Z
2021-09-08T01:21:27Z
2016-12-30T18:15:00Z
CONTRIBUTOR
resolved
test_requests.py `test_session_close_proxy_clear` uses the `mocker` fixture, which is provided by pytest-mock.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3795/reactions" }
https://api.github.com/repos/psf/requests/issues/3795/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3795.diff", "html_url": "https://github.com/psf/requests/pull/3795", "merged_at": "2016-12-30T18:15:00Z", "patch_url": "https://github.com/psf/requests/pull/3795.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3795" }
true
[ "Now I look at it, the pytest-httpbin required version also differs between setup.py and requirements.txt (as a downstream distributor I also much prefer `>=` to `==`, but...:>)", "Perhaps it would be worth reducing test_requirements to a non-versioned list like [urllib3 does](https://github.com/shazow/urllib3/blob/master/setup.py#L48-L54)? That allows a minimum starting point for someone running tests through setup.py, while avoiding having to remember updating dependencies in multiple places. It looks like the chosen versions in test_requirements were simply the latest ones available when test_requirements was added (6c2942b).", "FWIW, we distributors like `install_requires` and `test_requires` over the 'everything together' format of `requirements.txt` because we have the same split at the distro level: we need to have all the test requirements present when *building* the package (assuming we're going to run the test suite) - so as `BuildRequires`, in RPM terms - but we only need the generated packages to require the stuff from `install_requires` at install time (so as `Requires` in RPM terms). Since `requirements.txt` draws no distinction it's hard for us to use.", "We have no `requirements.txt` requirement: all the setup.py stuff should work. It's just that requirements files can be handy for things like tox, which can try to be clever about tracking changes on it. \r\n\r\nRegardless, we should being setup.py into line with requirements.txt for testing purposes. I see no reason to remove the version pins at this time. " ]
https://api.github.com/repos/psf/requests/issues/3794
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3794/labels{/name}
https://api.github.com/repos/psf/requests/issues/3794/comments
https://api.github.com/repos/psf/requests/issues/3794/events
https://github.com/psf/requests/issues/3794
198,011,364
MDU6SXNzdWUxOTgwMTEzNjQ=
3,794
When a response with status code 204 is received, the package still tries to parse its bytes and gets an exception
{ "avatar_url": "https://avatars.githubusercontent.com/u/14309774?v=4", "events_url": "https://api.github.com/users/elasti-georgeg/events{/privacy}", "followers_url": "https://api.github.com/users/elasti-georgeg/followers", "following_url": "https://api.github.com/users/elasti-georgeg/following{/other_user}", "gists_url": "https://api.github.com/users/elasti-georgeg/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/elasti-georgeg", "id": 14309774, "login": "elasti-georgeg", "node_id": "MDQ6VXNlcjE0MzA5Nzc0", "organizations_url": "https://api.github.com/users/elasti-georgeg/orgs", "received_events_url": "https://api.github.com/users/elasti-georgeg/received_events", "repos_url": "https://api.github.com/users/elasti-georgeg/repos", "site_admin": false, "starred_url": "https://api.github.com/users/elasti-georgeg/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/elasti-georgeg/subscriptions", "type": "User", "url": "https://api.github.com/users/elasti-georgeg", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2016-12-29T13:59:09Z
2021-09-08T13:05:28Z
2017-01-14T12:42:39Z
NONE
resolved
our Rails server sends status_code 204 for some logout requests (a delete with some cookies), and hence the message has no body; however, in my code I catch the following exception: /usr/local/lib/python2.7/dist-packages/requests/api.pyc in request(method, url, **kwargs) 54 # cases, and look like a memory leak in others. 55 with sessions.Session() as session: ---> 56 return session.request(method=method, url=url, **kwargs) 57 58 /usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json) 486 } 487 send_kwargs.update(settings) --> 488 resp = self.send(prep, **send_kwargs) 489 490 return resp /usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in send(self, request, **kwargs) 639 640 if not stream: --> 641 r.content 642 643 return r /usr/local/lib/python2.7/dist-packages/requests/models.pyc in content(self) 779 self._content = None 780 else: --> 781 self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() 782 783 self._content_consumed = True /usr/local/lib/python2.7/dist-packages/requests/models.pyc in generate() 704 yield chunk 705 except ProtocolError as e: --> 706 raise ChunkedEncodingError(e) 707 except DecodeError as e: 708 raise ContentDecodingError(e) The problem is here: https://github.com/kennethreitz/requests/blob/master/requests/models.py#L778 for some reason the status code checks itself against 0, and raw requests checks against None while the raw request does contain all the headers it has no body, and thus an iterator fails to iterate on an empty chunk.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3794/reactions" }
https://api.github.com/repos/psf/requests/issues/3794/timeline
null
completed
null
null
false
[ "Hey @elasti-georgeg, thanks for opening this issue. I am able to reproduce the issue you've listed but only with a 204 response containing a \"Transfer-Encoding: chunked\" header.\r\n\r\n[RFC 7230 § 3.3.1](https://tools.ietf.org/html/rfc7230#section-3.3.1) states:\r\n\r\n>A server MUST NOT send a Transfer-Encoding header field in any\r\n response with a status code of 1xx (Informational) or 204 (No\r\n Content). \r\n\r\nThis appears to be what is causing the ChunkedEncodingError which should be what we want in this scenario. Can you confirm you're receiving a Transfer-Encoding header? If that's the case, this is an issue that needs to be fixed with your Rails server.\r\n\r\nThe code below should allow you to perform the request without triggering the error. Just make sure you aren't calling `.content` on the response.\r\n```python\r\ns = requests.Session()\r\nr = s.get(url, stream=True)\r\nr.headers\r\n```", "@nateprewitt yes, they should fix that there, but we should also be able to tolerate this.", "Hmm, so if a server says Transfer-Encoding: chunked but fails to use that encoding, can we reliably do anything other than raise an exception? The response only fails if it doesn't include `0\r\n` stating the body has ended, which I was under the impression is required. Requests doesn't explicitly prohibit a 204 with a chunked header, but does require the server to adhere to the encoding.\r\n\r\nFrom Requests' end, we can do a forced `None` body for status codes that don't allow a body. That doesn't solve the core problem here though, which I don't know if we can do anything meaningful about.", "Is httplib going to be able to tolerate this?", "@Lukasa no, it won't because that's where the error is being raised from. httplib raises a IncompleteRead error of 0 bytes because the connection is terminated without sending anything.", "Right, so it's very hard for us to tolerate this sensibly in the current codebase. ", "We can probably close this out with the same reasoning in #3807, since we can't reasonably address this, and the exception is correct here.", "Good catch @nateprewitt " ]
https://api.github.com/repos/psf/requests/issues/3793
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3793/labels{/name}
https://api.github.com/repos/psf/requests/issues/3793/comments
https://api.github.com/repos/psf/requests/issues/3793/events
https://github.com/psf/requests/issues/3793
198,000,030
MDU6SXNzdWUxOTgwMDAwMzA=
3,793
Pyinstaller error with requests version > 2.11.1
{ "avatar_url": "https://avatars.githubusercontent.com/u/12796536?v=4", "events_url": "https://api.github.com/users/Sattar1/events{/privacy}", "followers_url": "https://api.github.com/users/Sattar1/followers", "following_url": "https://api.github.com/users/Sattar1/following{/other_user}", "gists_url": "https://api.github.com/users/Sattar1/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Sattar1", "id": 12796536, "login": "Sattar1", "node_id": "MDQ6VXNlcjEyNzk2NTM2", "organizations_url": "https://api.github.com/users/Sattar1/orgs", "received_events_url": "https://api.github.com/users/Sattar1/received_events", "repos_url": "https://api.github.com/users/Sattar1/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Sattar1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Sattar1/subscriptions", "type": "User", "url": "https://api.github.com/users/Sattar1", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-12-29T12:27:55Z
2021-09-08T12:01:04Z
2016-12-29T13:17:10Z
NONE
resolved
Hi, I am developing an application with Python that uses the _requests_ library, and I build an executable with the _pyinstaller_ library. My application works with the newest version of _requests_ when I run it through Python commands or inside IDLE, but the EXE built by _pyinstaller_ doesn't run correctly. Here is the message: " ... (trace files) ImportError: No module named 'queue' During handling of the above exception, another exception occurred: ... (trace files) ImportError: No module named 'urllib3' Failed to execute script. " So I figured out that there is something wrong with the _**urllib3**_ library. I then downgraded _requests_ to version "2.11.1" (the one before the bundled urllib3 was updated to 1.19) and created another EXE file. I reran it, and it worked! So there is something wrong between newer versions of _requests_ (> 2.11.1) and _pyinstaller_, but I don't know how to fix it. Could you please help me? Thanks.
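A common workaround on the PyInstaller side (an assumption about the reporter's setup, not something confirmed in this thread; the hook-file name is illustrative) is to declare the vendored submodules as hidden imports. The sketch below enumerates a package's submodules with only the standard library, demonstrated on `json` because `requests.packages` may not be importable in every environment:

```python
import importlib
import pkgutil


def all_submodules(package_name):
    """Return the package plus every submodule under it.

    Useful for building a PyInstaller ``hiddenimports`` list when the
    bundler fails to discover vendored packages such as
    ``requests.packages.urllib3``.
    """
    package = importlib.import_module(package_name)
    names = [package_name]
    # walk_packages needs the package's __path__ and a dotted prefix
    for _finder, name, _is_pkg in pkgutil.walk_packages(
            package.__path__, package_name + '.'):
        names.append(name)
    return names


# In a hypothetical PyInstaller hook file (hook-requests.py) this might be:
#     hiddenimports = all_submodules('requests.packages')
print(all_submodules('json'))
```

This keeps the hidden-import list in sync with whatever the installed requests actually vendors, instead of hard-coding module names.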
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3793/reactions" }
https://api.github.com/repos/psf/requests/issues/3793/timeline
null
completed
null
null
false
[ "This is strictly a PyInstaller issue. Requests works fine, and vendors all of its dependencies. Clearly PyInstaller is having trouble locating the dependent module, even though it is entirely within the Requests package. That problem is PyInstaller's, and you'll need to take it up with them.", "This is caused by requests' bundling it's own copy of `six`." ]
https://api.github.com/repos/psf/requests/issues/3792
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3792/labels{/name}
https://api.github.com/repos/psf/requests/issues/3792/comments
https://api.github.com/repos/psf/requests/issues/3792/events
https://github.com/psf/requests/issues/3792
197,950,441
MDU6SXNzdWUxOTc5NTA0NDE=
3,792
Invalid detection of utf-32-be
{ "avatar_url": "https://avatars.githubusercontent.com/u/5925982?v=4", "events_url": "https://api.github.com/users/evgen231/events{/privacy}", "followers_url": "https://api.github.com/users/evgen231/followers", "following_url": "https://api.github.com/users/evgen231/following{/other_user}", "gists_url": "https://api.github.com/users/evgen231/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/evgen231", "id": 5925982, "login": "evgen231", "node_id": "MDQ6VXNlcjU5MjU5ODI=", "organizations_url": "https://api.github.com/users/evgen231/orgs", "received_events_url": "https://api.github.com/users/evgen231/received_events", "repos_url": "https://api.github.com/users/evgen231/repos", "site_admin": false, "starred_url": "https://api.github.com/users/evgen231/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/evgen231/subscriptions", "type": "User", "url": "https://api.github.com/users/evgen231", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-12-29T04:03:23Z
2021-09-08T13:05:35Z
2016-12-29T13:28:08Z
CONTRIBUTOR
resolved
```python import requests requests.get('http://localhost/example.json').json() ``` ``` Traceback (most recent call last): File "example.py", line 3, in <module> requests.get('http://localhost/example.json').json() File "/usr/lib/python3.6/site-packages/requests/models.py", line 850, in json return complexjson.loads(self.text, **kwargs) File "/usr/lib/python3.6/json/__init__.py", line 344, in loads s, 0) json.decoder.JSONDecodeError: Unexpected UTF-8 BOM (decode using utf-8-sig): line 1 column 1 (char 0) ``` ``` # wget --server-response http://localhost/example.json --2016-12-29 10:49:23-- http://localhost/example.json Resolving localhost (localhost)... 127.0.0.1 Connecting to localhost (localhost)|127.0.0.1|:80... connected. HTTP request sent, awaiting response... HTTP/1.1 200 OK Date: Thu, 29 Dec 2016 03:51:03 GMT Server: Apache/2.4.10 (Debian) Last-Modified: Thu, 29 Dec 2016 03:42:32 GMT ETag: "c-544c3e019ba33" Accept-Ranges: bytes Content-Length: 12 Keep-Alive: timeout=5, max=100 Connection: Keep-Alive Content-Type: application/json Length: 12 [application/json] ``` http://localhost/example.json is file encoded with utf-32-be. ```json {} ``` Fix #3791
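The referenced fix (#3791) concerns BOM-based encoding detection. A minimal stand-alone sketch of such a sniffer (not requests' actual `guess_json_utf` implementation) illustrates why the UTF-32 BOMs must be tested before the UTF-16 ones:

```python
import codecs
import json


def guess_json_encoding(data):
    """Guess a JSON payload's encoding from its byte-order mark.

    The UTF-32 BOMs are checked first because the UTF-32-LE BOM
    (FF FE 00 00) begins with the UTF-16-LE BOM (FF FE); testing
    UTF-16 first would misclassify UTF-32-LE input.
    """
    if data.startswith((codecs.BOM_UTF32_BE, codecs.BOM_UTF32_LE)):
        return 'utf-32'  # the utf-32 codec reads and strips the BOM itself
    if data.startswith(codecs.BOM_UTF8):
        return 'utf-8-sig'
    if data.startswith((codecs.BOM_UTF16_BE, codecs.BOM_UTF16_LE)):
        return 'utf-16'
    return 'utf-8'


# A big-endian UTF-32 payload like the example.json in this report:
payload = codecs.BOM_UTF32_BE + '{}'.encode('utf-32-be')
print(json.loads(payload.decode(guess_json_encoding(payload))))
```

Decoding with the endianness-agnostic `'utf-32'` codec consumes the BOM, so `json.loads` never sees a stray `\ufeff`.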
{ "avatar_url": "https://avatars.githubusercontent.com/u/5925982?v=4", "events_url": "https://api.github.com/users/evgen231/events{/privacy}", "followers_url": "https://api.github.com/users/evgen231/followers", "following_url": "https://api.github.com/users/evgen231/following{/other_user}", "gists_url": "https://api.github.com/users/evgen231/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/evgen231", "id": 5925982, "login": "evgen231", "node_id": "MDQ6VXNlcjU5MjU5ODI=", "organizations_url": "https://api.github.com/users/evgen231/orgs", "received_events_url": "https://api.github.com/users/evgen231/received_events", "repos_url": "https://api.github.com/users/evgen231/repos", "site_admin": false, "starred_url": "https://api.github.com/users/evgen231/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/evgen231/subscriptions", "type": "User", "url": "https://api.github.com/users/evgen231", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3792/reactions" }
https://api.github.com/repos/psf/requests/issues/3792/timeline
null
completed
null
null
false
[]
https://api.github.com/repos/psf/requests/issues/3791
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3791/labels{/name}
https://api.github.com/repos/psf/requests/issues/3791/comments
https://api.github.com/repos/psf/requests/issues/3791/events
https://github.com/psf/requests/pull/3791
197,947,895
MDExOlB1bGxSZXF1ZXN0OTk2MTQ4NzQ=
3,791
Fixed detection of utf-32-be by BOM.
{ "avatar_url": "https://avatars.githubusercontent.com/u/5925982?v=4", "events_url": "https://api.github.com/users/evgen231/events{/privacy}", "followers_url": "https://api.github.com/users/evgen231/followers", "following_url": "https://api.github.com/users/evgen231/following{/other_user}", "gists_url": "https://api.github.com/users/evgen231/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/evgen231", "id": 5925982, "login": "evgen231", "node_id": "MDQ6VXNlcjU5MjU5ODI=", "organizations_url": "https://api.github.com/users/evgen231/orgs", "received_events_url": "https://api.github.com/users/evgen231/received_events", "repos_url": "https://api.github.com/users/evgen231/repos", "site_admin": false, "starred_url": "https://api.github.com/users/evgen231/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/evgen231/subscriptions", "type": "User", "url": "https://api.github.com/users/evgen231", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-12-29T03:24:28Z
2021-09-08T01:21:28Z
2016-12-29T09:41:24Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3791/reactions" }
https://api.github.com/repos/psf/requests/issues/3791/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3791.diff", "html_url": "https://github.com/psf/requests/pull/3791", "merged_at": "2016-12-29T09:41:24Z", "patch_url": "https://github.com/psf/requests/pull/3791.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3791" }
true
[ "Thanks so much!" ]
https://api.github.com/repos/psf/requests/issues/3790
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3790/labels{/name}
https://api.github.com/repos/psf/requests/issues/3790/comments
https://api.github.com/repos/psf/requests/issues/3790/events
https://github.com/psf/requests/issues/3790
197,942,525
MDU6SXNzdWUxOTc5NDI1MjU=
3,790
Module boot extremely slow (2-3 seconds)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1411897?v=4", "events_url": "https://api.github.com/users/djbaldey/events{/privacy}", "followers_url": "https://api.github.com/users/djbaldey/followers", "following_url": "https://api.github.com/users/djbaldey/following{/other_user}", "gists_url": "https://api.github.com/users/djbaldey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/djbaldey", "id": 1411897, "login": "djbaldey", "node_id": "MDQ6VXNlcjE0MTE4OTc=", "organizations_url": "https://api.github.com/users/djbaldey/orgs", "received_events_url": "https://api.github.com/users/djbaldey/received_events", "repos_url": "https://api.github.com/users/djbaldey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/djbaldey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/djbaldey/subscriptions", "type": "User", "url": "https://api.github.com/users/djbaldey", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2016-12-29T02:02:35Z
2021-09-08T13:05:34Z
2017-01-04T04:57:09Z
NONE
resolved
On GNU/Debian 8, AMD Phenom 9550 Quad-Core Processor, 8Gb DDR2, Python 2.7.9/3.4.2, requests version 2.12.4 from PyPI. This is due to the following code in the `__init__.py` file: ``` # Attempt to enable urllib3's SNI support, if possible try: from .packages.urllib3.contrib import pyopenssl pyopenssl.inject_into_urllib3() except ImportError: pass ``` After I commented out this code, the import time dropped to 0.05 seconds. Is this check necessary for all users? Perhaps the code should be removed?
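To confirm which dependency dominates startup, the cost of a single import can be measured directly. This stand-alone sketch times `json` as a placeholder; on the reporter's machine one would time `cryptography` or `OpenSSL` instead:

```python
import importlib
import sys
import time


def timed_import(module_name):
    """Import a module from scratch and report the wall-clock time spent."""
    # Drop any cached copy so the import work is actually repeated.
    sys.modules.pop(module_name, None)
    start = time.perf_counter()
    module = importlib.import_module(module_name)
    return module, time.perf_counter() - start


module, seconds = timed_import('json')
print('import %s took %.4f seconds' % (module.__name__, seconds))
```

If `timed_import('cryptography')` accounts for most of the 2-3 seconds, upgrading that package (rather than patching requests) is the fix the maintainers suggest above.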
{ "avatar_url": "https://avatars.githubusercontent.com/u/1411897?v=4", "events_url": "https://api.github.com/users/djbaldey/events{/privacy}", "followers_url": "https://api.github.com/users/djbaldey/followers", "following_url": "https://api.github.com/users/djbaldey/following{/other_user}", "gists_url": "https://api.github.com/users/djbaldey/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/djbaldey", "id": 1411897, "login": "djbaldey", "node_id": "MDQ6VXNlcjE0MTE4OTc=", "organizations_url": "https://api.github.com/users/djbaldey/orgs", "received_events_url": "https://api.github.com/users/djbaldey/received_events", "repos_url": "https://api.github.com/users/djbaldey/repos", "site_admin": false, "starred_url": "https://api.github.com/users/djbaldey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/djbaldey/subscriptions", "type": "User", "url": "https://api.github.com/users/djbaldey", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3790/reactions" }
https://api.github.com/repos/psf/requests/issues/3790/timeline
null
completed
null
null
false
[ "@djbaldey This code optionally injects the PyOpenSSL based TLS backend into Requests. It's extremely useful because in many cases it enables support of advanced features that the standard library TLS module does not support. For this reason, we always attempt to inject it, as this is the most reliable way to detect its presence.\r\n\r\nIn this case, it's almost certainly the result of you having an older cryptography module with a slow import time. Can you try updating cryptography and PyOpenSSL to the newest versions to see if that resolves your issue?", "No, to upgrade PyOpenSSL impossible. Delete - possible. But, without this package - the same delay as with him.\r\n\r\nUpdate:\r\n\r\nI'm upgrade PyOpenSSL from jessie-backports to 16.0.0-1~bpo8+1 version. Now the module is loaded a little faster - 1-1.5 seconds. But it's too slow!\r\n", "What version of cryptography and PyOpenSSL is installed?", "> I'm upgrade PyOpenSSL from jessie-backports to 16.0.0-1~bpo8+1 version.\r\n\r\nWhat version of cryptography is installed? Have you tried not using system packages? I have no way of guaranteeing that they've built things appropriately and aren't contributing to the slowness you're seeing.", "Package: python-cryptography\r\nVersion: 1.3.4-1~bpo8+2\r\n\r\nPackage: python-openssl\r\nSource: pyopenssl\r\nVersion: 16.0.0-1~bpo8+1\r\n\r\n> Have you tried not using system packages?\r\n\r\nNo. It is unsafe in production. The system packages is the only guarantee of stability and compatibility of all components.", "If you're going to go that route and you've decided to only trust system packages, why are you installing \"unsafe\" packages from PyPi?", "@drpoggi, why instead of trying to solve the problem with slow code - you are trying to shift the responsibility for organizational activities?\r\n", "I'm not trying to shift anything. I'm not a core maintainer of Requests and certainly don't mean to represent the opinions of the core developers, sorry if it came across that way. I have no affiliation with Requests other than I submitted a PR once and Github I guess labels that.\r\n\r\nThe actual core developers of Requests have already suggested methods to fix your slow code and you've rejected them. I'm not sure what else they could do at this point. As has already been said, they're not responsible for maintaining the system packages which it seems like that is your issue.", "@drpoggi, I understand all of this. Therefore, I suggest to remove the trying from urllib3 module initialization. I don't think that the most important component and it is more important than speed loading the module. I am sure that those who need urllib3 with TLS - they can write these strings in the app after loading the module. And it will be right.\r\nUpd:\r\nIn addition, in the internal corporate network - [\"advanced features TLS\"](#issuecomment-269605327) may be banned or make mistakes.\r\n", "Can you please open up a shell and check how long \"import cryptography\" takes for me? If this is the source of the slowdown then we can go from here. \r\n\r\n> I don't think that the most important component and it is more important than speed loading the module.\r\n\r\nOn this I disagree with you strongly. On many platforms Python's standard TLS module does not support SNI, disabling TLS compression, or fine-grained control of TLS configuration. Those things are important, and if we can make them present for our users with zero configuration cost then we will try to do that.\r\n\r\nI am entirely willing to look into and attempt to fix the slowdown problem, but removing the auto-insertion of PyOpenSSL from Requests is simply entirely off the table. ", "Needless to say, I agree with @Lukasa. Being able to provide the best possible security out-of-the-box (without further user configuration) is absolutely imperative. We will not sacrifice that.\r\n\r\nIf you are doing `apt-get install -y python-requests` and it's installing `PyOpenSSL` for you, that means the packagers understand this *and* they understand that the system Python you're using doesn't allow for the best possible security for you and every other user. Now, let's try to get to the bottom of this.\r\n\r\n@djbaldey can you determine how long it takes on your system to do `import cryptography`?", "Excuse me.\r\nThe problem is my hardware. I tried on a laptop with Intel Core i3 - everything loads instantly.\r\nClose the issue." ]
https://api.github.com/repos/psf/requests/issues/3789
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3789/labels{/name}
https://api.github.com/repos/psf/requests/issues/3789/comments
https://api.github.com/repos/psf/requests/issues/3789/events
https://github.com/psf/requests/pull/3789
197,403,344
MDExOlB1bGxSZXF1ZXN0OTkyNjUxMzM=
3,789
3780: Lazy load idna library
{ "avatar_url": "https://avatars.githubusercontent.com/u/5230880?v=4", "events_url": "https://api.github.com/users/moin18/events{/privacy}", "followers_url": "https://api.github.com/users/moin18/followers", "following_url": "https://api.github.com/users/moin18/following{/other_user}", "gists_url": "https://api.github.com/users/moin18/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/moin18", "id": 5230880, "login": "moin18", "node_id": "MDQ6VXNlcjUyMzA4ODA=", "organizations_url": "https://api.github.com/users/moin18/orgs", "received_events_url": "https://api.github.com/users/moin18/received_events", "repos_url": "https://api.github.com/users/moin18/repos", "site_admin": false, "starred_url": "https://api.github.com/users/moin18/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/moin18/subscriptions", "type": "User", "url": "https://api.github.com/users/moin18", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-12-23T17:21:43Z
2021-09-07T00:06:42Z
2017-01-19T09:19:00Z
CONTRIBUTOR
resolved
Changes based on comment from previous Pull Request: https://github.com/kennethreitz/requests/pull/3787 Fix for issue: https://github.com/kennethreitz/requests/issues/3780
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3789/reactions" }
https://api.github.com/repos/psf/requests/issues/3789/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3789.diff", "html_url": "https://github.com/psf/requests/pull/3789", "merged_at": "2017-01-19T09:19:00Z", "patch_url": "https://github.com/psf/requests/pull/3789.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3789" }
true
[ "@Lukasa Please let me know your thought on this. I think it should address your concern in my previous Pull Request.", "@sigmavirus24 Not sure why it is showing the status as \"Changes requested\" even though they are fixed in commit: https://github.com/kennethreitz/requests/pull/3789/commits/965eb3d1525e3d384623c0267654c2f07acc4aea\r\n\r\nMay be it due to additional import of `sys` (needed but was missing in previous PR)", "@moin18 that will be in place until sigmavirus24 has a chance to review your changes again. Today is still a \"holiday\" for a lot of people working in the US, so sigmavirus24 likely won't respond immediately. Things look pretty cleaned up now, so I think you can leave these changes and he'll respond when he has a spare moment :)", "@sigmavirus24 I have fixed the comments you mentioned. Please can you review them once more? :)", "@moin18 would you be willing to rebase this rather than merging master into this branch? Alternatively, @Lukasa how do you feel about squash merging this?", "I have no objection to squash merging.", "@sigmavirus24 @Lukasa cleaned up the commit history. Merged all commits into single commit." ]
https://api.github.com/repos/psf/requests/issues/3788
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3788/labels{/name}
https://api.github.com/repos/psf/requests/issues/3788/comments
https://api.github.com/repos/psf/requests/issues/3788/events
https://github.com/psf/requests/issues/3788
197,376,970
MDU6SXNzdWUxOTczNzY5NzA=
3,788
Exceptions having references to urllib3 exceptions in "message" field
{ "avatar_url": "https://avatars.githubusercontent.com/u/290258?v=4", "events_url": "https://api.github.com/users/rabbbit/events{/privacy}", "followers_url": "https://api.github.com/users/rabbbit/followers", "following_url": "https://api.github.com/users/rabbbit/following{/other_user}", "gists_url": "https://api.github.com/users/rabbbit/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rabbbit", "id": 290258, "login": "rabbbit", "node_id": "MDQ6VXNlcjI5MDI1OA==", "organizations_url": "https://api.github.com/users/rabbbit/orgs", "received_events_url": "https://api.github.com/users/rabbbit/received_events", "repos_url": "https://api.github.com/users/rabbbit/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rabbbit/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rabbbit/subscriptions", "type": "User", "url": "https://api.github.com/users/rabbbit", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2016-12-23T14:20:05Z
2021-11-26T05:00:32Z
2021-08-28T04:20:47Z
NONE
resolved
Hey, tldr; is exceptions keeping references to urllib3 exceptions a design choice? I ran into an issue when trying to automatically serialise `TimeoutError.message`. It appears it contains a reference to urllib3 exceptions: ```python >>> try: ... requests.get('http://google.com', timeout=0.05) ... except Exception as e: pass >>> e.message ReadTimeoutError("HTTPConnectionPool(host='www.google.pl', port=80): Read timed out. (read timeout=0.05)",) >>> e.message.__class__ <class 'requests.packages.urllib3.exceptions.ReadTimeoutError'> ``` Which is in line with what I see [in the source](https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L499) ```python elif isinstance(e, ReadTimeoutError): raise ReadTimeout(e, request=request) ``` but I wasn't expecting this. [Python docs](https://docs.python.org/2/library/exceptions.html#exceptions.BaseException) say: > The tuple of arguments given to the exception constructor. Some built-in exceptions (like IOError) expect a certain number of arguments and assign a special meaning to the elements of this tuple, while others are usually called only with a single string giving an error message. Given the above I was expecting either: 1. `message` to be a string ( per typical use case ) 2. `message` to be `None` and `errno` and `strerror` being set: ( because [Request exceptions extend IOError](https://github.com/kennethreitz/requests/blob/master/requests/exceptions.py#L12) and [IOError having a specific constructor defined](https://docs.python.org/2/library/exceptions.html#exceptions.EnvironmentError) ) Was the current implementation a conscious decision? If not, what would be your preference, option (1) or (2) from the above? (2) seems more correct, (1) seems simpler. Happy to try to PR. Thanks!
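For code that must cope with this wrapping, the low-level exception can be recovered portably from `args[0]`, and `str()` on the high-level exception always yields a serialisable string. The classes below are hypothetical stand-ins for illustration, not the actual requests/urllib3 types:

```python
class LowLevelTimeout(Exception):
    """Hypothetical stand-in for urllib3's ReadTimeoutError."""


class HighLevelTimeout(IOError):
    """Hypothetical stand-in for requests' ReadTimeout, which receives the
    low-level exception as its first constructor argument."""


def fetch():
    # Simulate the adapter layer: catch the transport-level error and
    # re-raise it wrapped in the high-level exception type.
    try:
        raise LowLevelTimeout("Read timed out. (read timeout=0.05)")
    except LowLevelTimeout as err:
        raise HighLevelTimeout(err)


try:
    fetch()
except HighLevelTimeout as exc:
    wrapped = exc.args[0]  # the low-level exception object itself
    text = str(exc)        # a plain string, safe to serialise or log

print(type(wrapped).__name__, '->', text)
```

Relying on `str(exc)` rather than `exc.message` sidesteps the Python-2-only `message` attribute entirely, which matches the conclusion the thread reaches.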
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3788/reactions" }
https://api.github.com/repos/psf/requests/issues/3788/timeline
null
completed
null
null
false
[ "Yes, keeping references to urllib3 exceptions is a deliberate design decision. It ensures that low-level causes of high-level problems can be accurately diagnosed without relying on Python 3 exception chaining, which has not historically existed and continues to not exist on Python 2.\r\n\r\nI am pretty disinclined to want to mess with the exceptions at this time: they're pretty stable, and almost any change to them is backwards-incompatible in a big way with pretty minimal gain. ", "Ah, I see.\r\n\r\nIt almost felt like Requests exception should abstract the low-level ( by copying relevant/required things ) but I guess that's a lot of \"how does urllib3 work\" work :]\r\n\r\nIf we were to keep having a reference, why not introduce a `parent_exception` field though? Using `message` for this purpose really feels like a field abuse? :)", "It's also ultimately not worthwhile: it ties us so closely to decisions that urllib3 makes in its internals that we become essentially unable to tolerate changes within urllib3 itself.\r\n\r\nI don't think there's any particular problem with passing a non-string exception message. What makes you think it's field abuse? I should note as well that `Exception.message` does not exist on Python 3, so this is a Python-2-only concern.", "I don't really do python3 :(\r\n\r\nWell,\r\n- is called \"message\"\r\n- I've only ever seen it treated as a debug/string message. (not necessarily an argument ;))\r\n- The documentation says \"while others are usually called only with a single string giving an error message.\".\r\n\r\nI've just always assumed it's a general language pattern. (again possibly I'm wrong ;)).\r\n\r\nFrom afar, it would also seem cleaner to have it explicitly set as an attribute, rather than saying _\"the \"root\" exception is the first argument of the tuple of arguments passed in to the exception\"_, which I guess is the case in Python3 too?", "We ran into a related issue lately and fixed it downstream: https://github.com/ckan/ckanapi/pull/111.\r\nWhile a case can be made that this is not `request`'s problem as `Exception.message` is deprecated since Python 2.6, it might be worth considering to put something like\r\n\r\n`self.message = str(args) + str(kwargs)`\r\n\r\ninto `RequestException.__init__`\r\n\r\nto give legacy applications the string they want if they use `Exception.message`.\r\n\r\n ", "We had a bug in our code related to this. The docs at http://docs.python-requests.org/en/master/api/#requests.Response.reason state:\r\n\r\n> Textual reason of responded HTTP Status, e.g. \"Not Found\" or \"OK\".\r\n\r\nBut `Response.reason` can also contain non-strings, for example we got `SSLError` instance there. I assume this is because requests just grabs `arg[0]` and we expect it to be a string (like it normally is)?\r\n\r\nIf nothing else, the docs should be updated at least.", "What? `Response.reason` is only ever set by `build_response` to the same as the string in urllib3, and that is only ever set to the reason phrase from `httplib`. So...where do you think we're getting `args[0]` from to set `Response.reason`?", "@Lukasa That was just assumption, I didn't look into `requests` code.\r\n\r\nThat is the behavior we witnessed, however. If you're saying that shouldn't be possible, I should double check.", "As far as I can tell that should not be possible. If you're using a custom transport adapter, though, they could be doing any number of things.", "Ok nevermind, ignore what I wrote earlier. I interpreted the situation a bit too far. Nothing wrong with `Response.reason`.\r\n\r\nThe problem was that guys were catching `HttpException` and then grabbing `exception.message`, which returned `SSLError` instead of a string. But that is in the core of this ticket, right?", "Yup, sure, I was just responding to your analysis. =)\r\n\r\nMy position on the core ticket remains broadly the same: it affects only Python 2, and it's sufficiently stable that changing it requires a breaking change (so it'd have to go into v3). Given that v3 will release with no more than about two years on Python 2's supported lifecycle, I'm pretty disinclined to consider this a high-priority concern.", "With `Exception.message` being deprecated, I think that's a reasonable conclusion. We should always call `str(exception)` if we are after string representation.\r\n\r\nI've had convention of adding `cause` kwarg to `Exception`s that hold it. That makes it clear where the cause is retrievable at (not `exception.args[0]` or `exception.args[1]` but `exception.cause`). I'm not sure, maybe I could redirect/copy that as `Exception.__cause__` to make it similar to how things work in Python 3...\r\n\r\nBut yeah, changes would break backwards-compatibility here, so probably best to do nothing, I agree.", "Resolving as Python 2.7 being deprecated and we're unlikely to be investing time fixing bugs that aren't specific to Python 3." ]
https://api.github.com/repos/psf/requests/issues/3787
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3787/labels{/name}
https://api.github.com/repos/psf/requests/issues/3787/comments
https://api.github.com/repos/psf/requests/issues/3787/events
https://github.com/psf/requests/pull/3787
197,285,230
MDExOlB1bGxSZXF1ZXN0OTkxODM3MTc=
3,787
Lazy load idna to free memory
{ "avatar_url": "https://avatars.githubusercontent.com/u/5230880?v=4", "events_url": "https://api.github.com/users/moin18/events{/privacy}", "followers_url": "https://api.github.com/users/moin18/followers", "following_url": "https://api.github.com/users/moin18/following{/other_user}", "gists_url": "https://api.github.com/users/moin18/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/moin18", "id": 5230880, "login": "moin18", "node_id": "MDQ6VXNlcjUyMzA4ODA=", "organizations_url": "https://api.github.com/users/moin18/orgs", "received_events_url": "https://api.github.com/users/moin18/received_events", "repos_url": "https://api.github.com/users/moin18/repos", "site_admin": false, "starred_url": "https://api.github.com/users/moin18/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/moin18/subscriptions", "type": "User", "url": "https://api.github.com/users/moin18", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-12-22T23:07:33Z
2021-09-08T01:21:29Z
2016-12-23T13:52:43Z
CONTRIBUTOR
resolved
Fix for issue: https://github.com/kennethreitz/requests/issues/3780 Lazily loading `idna`. This CR includes utilities to: * lazily load idna package * And, delete module from sys.modules after use Changes looks to work fine for me. Now, None of the reference to `idna` is getting used directly. GC will release the memory when required.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3787/reactions" }
https://api.github.com/repos/psf/requests/issues/3787/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3787.diff", "html_url": "https://github.com/psf/requests/pull/3787", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3787.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3787" }
true
[ "I think it will be a better idea to create a separate package to lazy load module and add that package as the part of both `urllib3` and `requests` package", "urllib3 does not vendor idna, so such a package cannot really be easily shared. ", "Additionally, Requests does not carry patches to urllib3: it vendors an unedited copy. Any urllib3 patch you need to make must be proposed to that project. ", "@Lukasa : In that case I will be creating a separate patch for urllib3 discarding the urllib3 related changes from this PR. Apart from that, does the `requests` library related changes looks good to you?", "I think it looks really over complex. Repeatedly loading and unloading the module is solving a non-existent problem. All we should do is defer loading as late in the process as possible." ]
https://api.github.com/repos/psf/requests/issues/3786
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3786/labels{/name}
https://api.github.com/repos/psf/requests/issues/3786/comments
https://api.github.com/repos/psf/requests/issues/3786/events
https://github.com/psf/requests/issues/3786
197,231,914
MDU6SXNzdWUxOTcyMzE5MTQ=
3,786
plus character in url not working
{ "avatar_url": "https://avatars.githubusercontent.com/u/24253794?v=4", "events_url": "https://api.github.com/users/Andriuskislas/events{/privacy}", "followers_url": "https://api.github.com/users/Andriuskislas/followers", "following_url": "https://api.github.com/users/Andriuskislas/following{/other_user}", "gists_url": "https://api.github.com/users/Andriuskislas/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Andriuskislas", "id": 24253794, "login": "Andriuskislas", "node_id": "MDQ6VXNlcjI0MjUzNzk0", "organizations_url": "https://api.github.com/users/Andriuskislas/orgs", "received_events_url": "https://api.github.com/users/Andriuskislas/received_events", "repos_url": "https://api.github.com/users/Andriuskislas/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Andriuskislas/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Andriuskislas/subscriptions", "type": "User", "url": "https://api.github.com/users/Andriuskislas", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-12-22T18:02:12Z
2021-09-08T13:05:36Z
2016-12-23T01:36:34Z
NONE
resolved
My url is like this ` http://api.example.com/user/jhon+doe?field=full&key=somestring ` After I making request, when I print the url , I only get `http://api.example.com/user/jhon` which is incorrect and it's return error from api. How can I request url with plus character?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3786/reactions" }
https://api.github.com/repos/psf/requests/issues/3786/timeline
null
completed
null
null
false
[ "My guess is that the issue is not with the library but your api. Your api is not able to translate `+` into valid user", "Hey @Andriuskislas, thanks for opening this ticket. I can't seem to reproduce the path dropping in 2.12.4 with the information supplied above. However, I think I have an idea of what you're hitting.\r\n\r\nFirst, which version of requests are you using? URIs that I'm passing with a + character seem to be unaffected from our end, so I'm wondering if you're hitting a redirect which isn't parsing the supplied URI correctly on the server side.\r\n\r\n`+` is a reserved character, that should represent a space. If you're intending to pass it as a literal `+` character, you'll likely want to encode it as `%2B` instead.\r\n\r\nIf you could supply a more detailed repro, we can take a deeper look.", "@moin18 , No my api endpoint is fine. When I hit the url from browser it returns correct data. \r\n\r\n@nateprewitt , Thank you. Firstly my requests version is \r\n```\r\n>>> req.__version__\r\n'2.12.4'\r\n```\r\nHere is my encoded try \r\n```\r\nurl = \"http://api.example.com/user/jhon%2Bdoe?field=full&key=somestring\"\r\nr = req.get(url)\r\nr.url # output: http://api.example.com/user/jhon \r\n\r\n``` \r\nwithout encoding \r\n```\r\nurl = \"http://api.example.com/user/jhon+doe?field=full&key=somestring\"\r\nr = req.get(url)\r\nr.url # output: http://api.example.com/user/jhon\r\n```\r\nAlso tried with encoding whole url ```http%3A%2F%2Fapi.example.com%2Fuser%2Fjhon%2Bdoe%3Ffield%3Dfull%26key%3Dsomestring``` but request didn't made. \r\n", "Thanks @Andriuskislas, url encoding should only be used for unicode characters, or uses of reserved characters for something other than their assigned meaning, which is why the last example isn't routing.\r\n\r\nI can use a variant of your example above on a different server, and it leaves the + unchanged. 
That points pretty clearly to this being a server behaviour rather than something in Requests.\r\n\r\n```python\r\n>> url = 'http://httpbin.org/get+hello'\r\n>> r = requests.get(url)\r\n>> r.url\r\n'http://httpbin.org/get+hello'\r\n```\r\n\r\nAt this point, it feels like this is probably something better suited for Stackoverflow so we're not bombarding everyone watching issues on Requests.\r\n\r\nYour output above is reassuring my redirect hunch, so here's one more test to perform. If you can check the output of `r.history`, that would be helpful. If it's anything other than `[]`, check what `r.history[0].url` is and compare that value to `r.request.url`. If they're different, this is definitely something happening server side.\r\n\r\nIf that's the case, would you mind opening an issue on https://stackoverflow.com and posting the link here so we have a paper trail? I'll follow up over there.", "@Andriuskislas I would bet that @nateprewitt is correct about redirects being the root cause here. You can also tell requests not to follow them by passing `allow_redirects=False` to your request. It's plausible the server you're talking to is redirecting based on something like User-Agent. Either way, @nateprewitt's suggestion of asking a question on StackOverflow is correct. This isn't a general support forum, but a place for defects, which I don't think you've found." ]
https://api.github.com/repos/psf/requests/issues/3785
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3785/labels{/name}
https://api.github.com/repos/psf/requests/issues/3785/comments
https://api.github.com/repos/psf/requests/issues/3785/events
https://github.com/psf/requests/pull/3785
196,993,536
MDExOlB1bGxSZXF1ZXN0OTg5NzY0NTE=
3,785
fix urllib3 documentation link
{ "avatar_url": "https://avatars.githubusercontent.com/u/3793595?v=4", "events_url": "https://api.github.com/users/Adusei/events{/privacy}", "followers_url": "https://api.github.com/users/Adusei/followers", "following_url": "https://api.github.com/users/Adusei/following{/other_user}", "gists_url": "https://api.github.com/users/Adusei/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Adusei", "id": 3793595, "login": "Adusei", "node_id": "MDQ6VXNlcjM3OTM1OTU=", "organizations_url": "https://api.github.com/users/Adusei/orgs", "received_events_url": "https://api.github.com/users/Adusei/received_events", "repos_url": "https://api.github.com/users/Adusei/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Adusei/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Adusei/subscriptions", "type": "User", "url": "https://api.github.com/users/Adusei", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-12-21T17:48:57Z
2021-09-08T01:21:28Z
2016-12-21T18:08:46Z
CONTRIBUTOR
resolved
The link to urllib3 documentation in the appengine warning message shows " SORRY This page does not exist yet." This PR fixes the documentation link.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3785/reactions" }
https://api.github.com/repos/psf/requests/issues/3785/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3785.diff", "html_url": "https://github.com/psf/requests/pull/3785", "merged_at": "2016-12-21T18:08:46Z", "patch_url": "https://github.com/psf/requests/pull/3785.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3785" }
true
[ "Thanks for the patch!", "@Lukasa it looks like this is a modification in the appengine module in urllib3. The warning update may need to go over there, otherwise we'll overwrite it the next time we update urllib3.", "Agh good spot, I missed that. Can we open a new patch on urllib3 that DTRT?", "@Adusei would you be interested in submitting this patch to [urllib3](https://github.com/shazow/urllib3) so this can be fixed at the source?", "urllib3 has the proper documentation link.. checking it now, and seems that they fixed the issue with these two commits last month:\r\n\r\n- https://github.com/shazow/urllib3/commit/097b935bea9fa575c59391e8c30a3ab00c974fae\r\n- https://github.com/shazow/urllib3/commit/9688f0311839ce79c59b5254a8c68c148e73528c\r\n", "Great, things should be set for future releases then. Thanks for taking a look @Adusei!" ]
https://api.github.com/repos/psf/requests/issues/3784
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3784/labels{/name}
https://api.github.com/repos/psf/requests/issues/3784/comments
https://api.github.com/repos/psf/requests/issues/3784/events
https://github.com/psf/requests/issues/3784
196,981,015
MDU6SXNzdWUxOTY5ODEwMTU=
3,784
SSL context
{ "avatar_url": "https://avatars.githubusercontent.com/u/12953712?v=4", "events_url": "https://api.github.com/users/AlmightyOatmeal/events{/privacy}", "followers_url": "https://api.github.com/users/AlmightyOatmeal/followers", "following_url": "https://api.github.com/users/AlmightyOatmeal/following{/other_user}", "gists_url": "https://api.github.com/users/AlmightyOatmeal/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AlmightyOatmeal", "id": 12953712, "login": "AlmightyOatmeal", "node_id": "MDQ6VXNlcjEyOTUzNzEy", "organizations_url": "https://api.github.com/users/AlmightyOatmeal/orgs", "received_events_url": "https://api.github.com/users/AlmightyOatmeal/received_events", "repos_url": "https://api.github.com/users/AlmightyOatmeal/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AlmightyOatmeal/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AlmightyOatmeal/subscriptions", "type": "User", "url": "https://api.github.com/users/AlmightyOatmeal", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-12-21T16:52:08Z
2021-09-08T13:05:37Z
2016-12-21T17:11:58Z
NONE
resolved
In some instances, requests is used to access services via SSL that disable TLS v1.0 for security reasons but there is no way to configure what SSL/TLS versions are acceptable nor pass in a SSL context. This is very disappointing. For example, I *need* to be able to create this context and use it, but I can't... ``` import ssl context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) context.options |= ssl.OP_NO_TLSv1 ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3784/reactions" }
https://api.github.com/repos/psf/requests/issues/3784/timeline
null
completed
null
null
false
[ "Thanks for this report!\r\n\r\nFirstly, services that disable TLSv1 should still be compatible with Requests. Requests advertises support for all TLS versions higher than SSLv3 in the default configuration, so services should simply choose TLSv1.2. Requests will happily do that. \r\n\r\nSecondly, you absolutely can pass an SSL context as of v2.12. See [this comment](https://github.com/kennethreitz/requests/issues/3777#issuecomment-267998174) for an example of how to do it.\r\n\r\nI hope that calms your disappointment. ", "I should note: *please* do not use the context in your original comment. It's severely insecure. Please always use `create_urllib3_context` and then edit it accordingly: you'll get much higher security that way, and we'll keep moving forward with more secure configurations. ", "> Firstly, services that disable TLSv1 should still be compatible with Requests. \r\n\r\nIn an ideal world, yes. This is the real world which is far from ideal.\r\n\r\n> Requests advertises support for all TLS versions higher than SSLv3 in the default \r\n> configuration, so services should simply choose TLSv1.2. Requests will happily do \r\n> that.\r\n\r\nObviously not.\r\n\r\n> Secondly, you absolutely can pass an SSL context as of v2.12. See this comment \r\n> for an example of how to do it.\r\n\r\nAlso not true. I have to create an entire adapter to pass into it instead of simply passing in a context. However, it does work! I was able to successfully modify Zeep to nicely handle that new adapter for the session and it worked like a champ but I still had to construct my own context that was only slightly different than the one urllib3 made. \r\n\r\n> I should note: please do not use the context in your original comment. It's severely \r\n> insecure. Please always use create_urllib3_context and then edit it accordingly: \r\n> you'll get much higher security that way, and we'll keep moving forward with more \r\n> secure configurations.\r\n\r\nSo far from the truth. 
What does `create_urllib3_context` do? Literally the same thing I did.\r\n\r\nIn the end, thank you for the suggestion because that adapter did exactly what I needed it to. Unfortunately Zeep is such a mess that it may be a moot point although I should be able to modify Suds to use requests. I am still a fan, and advocate, of the `requests` module! \r\n\r\nCheers Lukasa!\r\n" ]
https://api.github.com/repos/psf/requests/issues/3783
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3783/labels{/name}
https://api.github.com/repos/psf/requests/issues/3783/comments
https://api.github.com/repos/psf/requests/issues/3783/events
https://github.com/psf/requests/pull/3783
196,939,612
MDExOlB1bGxSZXF1ZXN0OTg5Mzc0MTk=
3,783
Closes #3780 High memory usage due to idma import
{ "avatar_url": "https://avatars.githubusercontent.com/u/5230880?v=4", "events_url": "https://api.github.com/users/moin18/events{/privacy}", "followers_url": "https://api.github.com/users/moin18/followers", "following_url": "https://api.github.com/users/moin18/following{/other_user}", "gists_url": "https://api.github.com/users/moin18/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/moin18", "id": 5230880, "login": "moin18", "node_id": "MDQ6VXNlcjUyMzA4ODA=", "organizations_url": "https://api.github.com/users/moin18/orgs", "received_events_url": "https://api.github.com/users/moin18/received_events", "repos_url": "https://api.github.com/users/moin18/repos", "site_admin": false, "starred_url": "https://api.github.com/users/moin18/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/moin18/subscriptions", "type": "User", "url": "https://api.github.com/users/moin18", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-12-21T14:00:00Z
2021-09-08T01:21:29Z
2016-12-21T14:06:29Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3783/reactions" }
https://api.github.com/repos/psf/requests/issues/3783/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3783.diff", "html_url": "https://github.com/psf/requests/pull/3783", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3783.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3783" }
true
[ "Because of how Python imports work, this also will not have any effect. Specifically, `from x import y` is approximately the same as `import x; y = x.y`. I'm afraid we need a better solution than even this. \r\n\r\nSorry!", "By better solution do you mean the changes in: `idna/core.py` that is imported by `__init__.py`?", "Ideally we'd only bring in IDNA when it was actually needed: that is, when we had a domain we had to internationalise.", "What are your thoughts on this?\r\n\r\n```\r\n@staticmethod:\r\ndef _get_idna_encoded_host(host):\r\n from . package import idna\r\n try:\r\n host = idna.encode(host, uts46=True).decode('utf-8')\r\n except idna.IDNAError:\r\n raise UnicodeError\r\n\r\n.....\r\n\r\nif not unicode_is_ascii(host):\r\n try:\r\n host =self._get_idna_encoded_host(host)\r\n except UnicodeError:\r\n raise InvalidURL('URL has an invalid label.')\r\nelif host.startswith(u'*'):\r\n raise InvalidURL('URL has an invalid label.')\r\n```\r\n ", "@moin18 The problem I ran up against was that `idna` is part of `packages` so whenever `packages` is loaded (e.g. in `models.py`) `idna` will be loaded at that point. So the fix would be for `idna` not to be loaded at that point but somehow still be loaded when needed :/" ]
https://api.github.com/repos/psf/requests/issues/3782
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3782/labels{/name}
https://api.github.com/repos/psf/requests/issues/3782/comments
https://api.github.com/repos/psf/requests/issues/3782/events
https://github.com/psf/requests/pull/3782
196,925,645
MDExOlB1bGxSZXF1ZXN0OTg5Mjc0Mzk=
3,782
Closes ##3780 High memory usage due to idma import
{ "avatar_url": "https://avatars.githubusercontent.com/u/5230880?v=4", "events_url": "https://api.github.com/users/moin18/events{/privacy}", "followers_url": "https://api.github.com/users/moin18/followers", "following_url": "https://api.github.com/users/moin18/following{/other_user}", "gists_url": "https://api.github.com/users/moin18/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/moin18", "id": 5230880, "login": "moin18", "node_id": "MDQ6VXNlcjUyMzA4ODA=", "organizations_url": "https://api.github.com/users/moin18/orgs", "received_events_url": "https://api.github.com/users/moin18/received_events", "repos_url": "https://api.github.com/users/moin18/repos", "site_admin": false, "starred_url": "https://api.github.com/users/moin18/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/moin18/subscriptions", "type": "User", "url": "https://api.github.com/users/moin18", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-12-21T12:51:52Z
2021-09-08T01:21:30Z
2016-12-21T13:41:20Z
CONTRIBUTOR
resolved
Fix related to bug: https://github.com/kennethreitz/requests/issues/3780 `idma` package will now be imported within the function
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3782/reactions" }
https://api.github.com/repos/psf/requests/issues/3782/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3782.diff", "html_url": "https://github.com/psf/requests/pull/3782", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3782.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3782" }
true
[ "Thanks for this!\r\n\r\nHowever, this patch does not meaningfully reduce the memory usage. Any use of Requests will encounter this branch immediately and unconditionally, meaning that in practice all that we have done is delay the loading of the module. I think we need a better solution than this one.", "How about doing selective import of the needed objects?\r\n\r\nRaised another PR: https://github.com/kennethreitz/requests/pull/3783\r\n\r\n\r\nOn Wed, Dec 21, 2016 at 7:11 PM, Cory Benfield <[email protected]>\r\nwrote:\r\n\r\n> Closed #3782 <https://github.com/kennethreitz/requests/pull/3782>.\r\n>\r\n> —\r\n> You are receiving this because you authored the thread.\r\n> Reply to this email directly, view it on GitHub\r\n> <https://github.com/kennethreitz/requests/pull/3782#event-901602069>, or mute\r\n> the thread\r\n> <https://github.com/notifications/unsubscribe-auth/AE_RIPmien3rnNTVgjj6Y4O2yNmqOqr9ks5rKSyVgaJpZM4LS6ge>\r\n> .\r\n>\r\n" ]
https://api.github.com/repos/psf/requests/issues/3781
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3781/labels{/name}
https://api.github.com/repos/psf/requests/issues/3781/comments
https://api.github.com/repos/psf/requests/issues/3781/events
https://github.com/psf/requests/pull/3781
196,854,401
MDExOlB1bGxSZXF1ZXN0OTg4Nzc1Njc=
3,781
updating idna.uts46data from upstream project
{ "avatar_url": "https://avatars.githubusercontent.com/u/1094627?v=4", "events_url": "https://api.github.com/users/mplonka/events{/privacy}", "followers_url": "https://api.github.com/users/mplonka/followers", "following_url": "https://api.github.com/users/mplonka/following{/other_user}", "gists_url": "https://api.github.com/users/mplonka/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mplonka", "id": 1094627, "login": "mplonka", "node_id": "MDQ6VXNlcjEwOTQ2Mjc=", "organizations_url": "https://api.github.com/users/mplonka/orgs", "received_events_url": "https://api.github.com/users/mplonka/received_events", "repos_url": "https://api.github.com/users/mplonka/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mplonka/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mplonka/subscriptions", "type": "User", "url": "https://api.github.com/users/mplonka", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2016-12-21T06:10:53Z
2021-09-07T00:06:42Z
2016-12-21T15:59:35Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3781/reactions" }
https://api.github.com/repos/psf/requests/issues/3781/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3781.diff", "html_url": "https://github.com/psf/requests/pull/3781", "merged_at": "2016-12-21T15:59:35Z", "patch_url": "https://github.com/psf/requests/pull/3781.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3781" }
true
[ "This is meant to include https://github.com/kjd/idna/pull/33 which will fix https://github.com/kennethreitz/requests/issues/3711.\r\n\r\nThe patch was done by running `make idna`. Also patching Makefile itself, removing `sed` invocation, which isn't necessary anymore.", "Thanks for this patch @mplonka. However, we would generally prefer to update the bundled packages ourselves at the time of release: that way, we have some confidence that we have done the sensible thing. Can you pare this patch down to just the makefile change, please?", "Thanks for reviewing this PR, @Lukasa .\r\n\r\nAs per your request, here's the revert commit.", "Thanks! For the sake of clarity, do you mind squashing these into one commit? ", "I actually wanted to suggest squashing.", "@Lukasa it seems that idna hasn't been updated yet, but a new version has been released in the mean time. Will idna be updated to fix the Jython issue?", "@LordGaav Per [our documentation](http://docs.python-requests.org/en/master/community/release-process/#hotfix-releases), we do not update vendored dependencies in patch releases. Please wait for v2.13, which will contain an update.", "That was quick. I'll use `requests!=2.12.*` in my requirements.txt in the mean time then. Thanks." ]
https://api.github.com/repos/psf/requests/issues/3780
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3780/labels{/name}
https://api.github.com/repos/psf/requests/issues/3780/comments
https://api.github.com/repos/psf/requests/issues/3780/events
https://github.com/psf/requests/issues/3780
196,773,876
MDU6SXNzdWUxOTY3NzM4NzY=
3,780
idna package increases memory usage by ~20MB
{ "avatar_url": "https://avatars.githubusercontent.com/u/2750246?v=4", "events_url": "https://api.github.com/users/DanielGibbsNZ/events{/privacy}", "followers_url": "https://api.github.com/users/DanielGibbsNZ/followers", "following_url": "https://api.github.com/users/DanielGibbsNZ/following{/other_user}", "gists_url": "https://api.github.com/users/DanielGibbsNZ/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/DanielGibbsNZ", "id": 2750246, "login": "DanielGibbsNZ", "node_id": "MDQ6VXNlcjI3NTAyNDY=", "organizations_url": "https://api.github.com/users/DanielGibbsNZ/orgs", "received_events_url": "https://api.github.com/users/DanielGibbsNZ/received_events", "repos_url": "https://api.github.com/users/DanielGibbsNZ/repos", "site_admin": false, "starred_url": "https://api.github.com/users/DanielGibbsNZ/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DanielGibbsNZ/subscriptions", "type": "User", "url": "https://api.github.com/users/DanielGibbsNZ", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
10
2016-12-20T20:32:35Z
2021-09-08T12:01:09Z
2017-01-24T12:25:00Z
NONE
resolved
I am running an Ubuntu 16.04.1 LTS machine with Python 3.5.2 which has about 40 Python scripts loaded, each one of which uses Requests. I recently updated Requests and suddenly found I was running out of memory. Upon further investigation I discovered that as of version 2.12.0, the `idna` package has been included in requests and this has caused the memory consumption for each script to increase by about 20MB. This can be observed with the following steps: 1. Install Requests version 2.11.1 2. Run `python3 -c "import requests; import time; time.sleep(60)" &`. 3. While the previous command is running, run `top` or `htop` and observe the resident memory usage of the command. 4. Install Requests version 2.12.0. 5. Repeat steps 2 and 3. Is there any way that Requests could lazily load the idna library so that this extra 20MB of memory is only used when actually working with IDNA domains?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3780/reactions" }
https://api.github.com/repos/psf/requests/issues/3780/timeline
null
completed
null
null
false
[ "Yeah, requests could absolutely lazily import idna. Anyone who would like to can work on a patch to defer this import. =)", "Just had a look at where `idna` is used (https://github.com/kennethreitz/requests/blob/7d2dfa86841fae49fa82f4f59098d5be862d1ba0/requests/models.py#L371) and it seems it's being used for every request; is there any check we can do to see whether `idna` needs to be loaded here?", "@DanielGibbsNZ Not enormously effectively, no. In principle the check in the `except` block could be used, I suppose.", "Something like this perhaps?\r\n\r\n```\r\nif not unicode_is_ascii(host):\r\n from .packages import idna\r\n try:\r\n host = idna.encode(host, uts46=True).decode('utf-8')\r\n except (UnicodeError, idna.IDNAError):\r\n if not unicode_is_ascii(host) or host.startswith(u'*'):\r\n raise InvalidURL('URL has an invalid label.')\r\n```", "That's sensible enough, though we would want to avoid doing the `unicode_is_ascii` check twice.", "Hmm, looks slightly more complicated than I thought to actually get this working as I'm not sure how to get `packages/__init__.py` to lazily load `idna`. Additionally, `urllib3`'s `pyopenssl` uses `idna` as well. I might leave this for someone more experienced than I.", "@Lukasa: `idna` is also imported in `urllib3/contrib/pyopenssl.py`. Here: https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/contrib/pyopenssl.py#L46\r\n\r\nIn order to remove the reference of `idna` via garbage collector, corresponding changes in urllib3 are also required. The idea is to do lazy loading of idna package only on demand basis. \r\n\r\nLet me know if I am on correct path. 
If yes, I will raise the bug and pull request to change this behavior in urllib3 as well.\r\n\r\nSample diff is available here: https://github.com/kennethreitz/requests/pull/3787", "For the urllib3 case, we could also lazily load idna, but let's fix it one step at a time.", "Looks like it's already fixed there.", "Looks like this is now resolved with #3789." ]
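The fix discussed above (and landed in #3789) defers the `idna` import until a non-ASCII host is actually encountered, so ASCII-only workloads never pay the ~20MB cost. A minimal standalone sketch of that pattern follows; `encode_host` is an illustrative helper name, not requests' API, though `unicode_is_ascii` mirrors the real internal helper mentioned in the thread:

```python
import sys


def unicode_is_ascii(text):
    """Return True if the string contains only ASCII characters."""
    try:
        text.encode("ascii")
        return True
    except UnicodeEncodeError:
        return False


def encode_host(host):
    """IDNA-encode a host name, importing idna only when it is needed."""
    if unicode_is_ascii(host):
        return host  # common case: no idna import, no extra memory
    import idna  # deferred import, paid only for internationalized domains
    try:
        return idna.encode(host, uts46=True).decode("ascii")
    except (UnicodeError, idna.IDNAError):
        raise ValueError("URL has an invalid label.")
```

For ASCII hosts the `import idna` line never runs, so `sys.modules` never gains an `idna` entry and memory stays flat.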
https://api.github.com/repos/psf/requests/issues/3779
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3779/labels{/name}
https://api.github.com/repos/psf/requests/issues/3779/comments
https://api.github.com/repos/psf/requests/issues/3779/events
https://github.com/psf/requests/issues/3779
196,734,596
MDU6SXNzdWUxOTY3MzQ1OTY=
3,779
Move to travis
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-12-20T17:39:38Z
2021-09-08T13:05:38Z
2016-12-21T07:07:03Z
CONTRIBUTOR
resolved
Jenkins is enough of a headache to deal with; let's do this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 1, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/3779/reactions" }
https://api.github.com/repos/psf/requests/issues/3779/timeline
null
completed
null
null
false
[]
https://api.github.com/repos/psf/requests/issues/3778
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3778/labels{/name}
https://api.github.com/repos/psf/requests/issues/3778/comments
https://api.github.com/repos/psf/requests/issues/3778/events
https://github.com/psf/requests/issues/3778
196,427,245
MDU6SXNzdWUxOTY0MjcyNDU=
3,778
Handle Content in new tab
{ "avatar_url": "https://avatars.githubusercontent.com/u/18625267?v=4", "events_url": "https://api.github.com/users/arunchandramouli/events{/privacy}", "followers_url": "https://api.github.com/users/arunchandramouli/followers", "following_url": "https://api.github.com/users/arunchandramouli/following{/other_user}", "gists_url": "https://api.github.com/users/arunchandramouli/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/arunchandramouli", "id": 18625267, "login": "arunchandramouli", "node_id": "MDQ6VXNlcjE4NjI1MjY3", "organizations_url": "https://api.github.com/users/arunchandramouli/orgs", "received_events_url": "https://api.github.com/users/arunchandramouli/received_events", "repos_url": "https://api.github.com/users/arunchandramouli/repos", "site_admin": false, "starred_url": "https://api.github.com/users/arunchandramouli/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/arunchandramouli/subscriptions", "type": "User", "url": "https://api.github.com/users/arunchandramouli", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-12-19T14:31:49Z
2016-12-19T15:00:22Z
2016-12-19T14:54:11Z
NONE
null
I tried to scrape a report from https://www.theice.com/marketdata/reports/datawarehouse/ConsolidatedDailyVolumeOIReport.shtml?selectionForm= There we can find a Report Date, an Asset Class, and a Submit button. When we hit Submit it actually opens in a new tab as https://www.theice.com/marketdata/reports/datawarehouse/ConsolidatedDailyVolumeOIReport.shtm I tried simulating the POST, but it always says Session Time Out. In contrast, see https://www.theice.com/marketdata/reports/175: when you enter dates and hit Submit, it opens, but not in a new tab, and in this case requests is able to get the data. Is there any solution for this? Please suggest.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3778/reactions" }
https://api.github.com/repos/psf/requests/issues/3778/timeline
null
completed
null
null
false
[ "Questions belong on [StackOverflow](https://stackoverflow.com) using the [python-requests tag](http://stackoverflow.com/questions/tagged/python-requests) not in the defect/bug tracker.", "But it's not working?", "@arunchandramouli taking a quick look at those websites, it looks as if they require JavaScript in order for you to successfully scrape their data. Requests does not (and has never) executed JavaScript. You need a different tool. Requests is working exactly as expected. There is no bug here." ]
https://api.github.com/repos/psf/requests/issues/3777
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3777/labels{/name}
https://api.github.com/repos/psf/requests/issues/3777/comments
https://api.github.com/repos/psf/requests/issues/3777/events
https://github.com/psf/requests/issues/3777
196,416,685
MDU6SXNzdWUxOTY0MTY2ODU=
3,777
EOF occurred in violation of protocol
{ "avatar_url": "https://avatars.githubusercontent.com/u/961966?v=4", "events_url": "https://api.github.com/users/ernestoalejo/events{/privacy}", "followers_url": "https://api.github.com/users/ernestoalejo/followers", "following_url": "https://api.github.com/users/ernestoalejo/following{/other_user}", "gists_url": "https://api.github.com/users/ernestoalejo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ernestoalejo", "id": 961966, "login": "ernestoalejo", "node_id": "MDQ6VXNlcjk2MTk2Ng==", "organizations_url": "https://api.github.com/users/ernestoalejo/orgs", "received_events_url": "https://api.github.com/users/ernestoalejo/received_events", "repos_url": "https://api.github.com/users/ernestoalejo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ernestoalejo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ernestoalejo/subscriptions", "type": "User", "url": "https://api.github.com/users/ernestoalejo", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2016-12-19T13:44:06Z
2021-09-08T13:05:38Z
2016-12-19T14:47:13Z
NONE
resolved
Related to issue #3774, but I preferred to open a new one just in case it's different. The code: ```python import requests from requests.adapters import HTTPAdapter from requests.packages.urllib3.util.ssl_ import create_urllib3_context # This is the 2.11 Requests cipher string. CIPHERS = ( 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:' 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:' '!eNULL:!MD5' ) class DESAdapter(HTTPAdapter): def init_poolmanager(self, *args, **kwargs): context = create_urllib3_context(ciphers=CIPHERS) kwargs['ssl_context'] = context return super(DESAdapter, self).init_poolmanager(*args, **kwargs) s = requests.Session() s.mount('https://XXX:25000/YYY', DESAdapter()) print s.post('https://XXX:25000/YYY', verify=False, timeout=10) ``` The exception: ``` Traceback (most recent call last): File "test.py", line 22, in <module> print s.post('https://XXX:25000/YYY', verify=False, timeout=10) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 535, in post return self.request('POST', url, data=data, json=json, **kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/adapters.py", line 497, in send raise SSLError(e, request=request) requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:590) ``` The issue seems related to #3774 because `curl` connects correctly but outputs this in verbose mode: ``` * SSL connection using TLS1.0 / RSA_3DES_EDE_CBC_SHA1 * server certificate verification SKIPPED * server certificate status verification SKIPPED ``` Output of `pip freeze`: ``` pkg-resources==0.0.0 requests==2.12.4 ``` 
Inside the Docker production container requests v2.11.1 is working right now and v2.12.0 not. I wanted to pinpoint exactly the commit to fill the issue but in my local virtualenv outside Docker every single version I'm trying raises the error and I can't start the git blame. I'll try to make a custom simplified container. I wanted to confirm the issue is the 3DES cypher before asking the server operator to change it. ----- If I install `requests[security]` the error changes to: ``` Traceback (most recent call last): File "test.py", line 22, in <module> print s.post('https://rol.othello.es:25000/BookingEngine', verify=False, timeout=10) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 535, in post return self.request('POST', url, data=data, json=json, **kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request resp = self.send(prep, **send_kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send r = adapter.send(request, **kwargs) File "/home/ernesto/projects/test2/local/lib/python2.7/site-packages/requests/adapters.py", line 497, in send raise SSLError(e, request=request) requests.exceptions.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",) ``` and `pip freeze` changes to: ``` cffi==1.9.1 cryptography==1.7.1 enum34==1.1.6 idna==2.1 ipaddress==1.0.17 pkg-resources==0.0.0 pyasn1==0.1.9 pycparser==2.17 pyOpenSSL==16.2.0 requests==2.12.4 six==1.10.0 ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/961966?v=4", "events_url": "https://api.github.com/users/ernestoalejo/events{/privacy}", "followers_url": "https://api.github.com/users/ernestoalejo/followers", "following_url": "https://api.github.com/users/ernestoalejo/following{/other_user}", "gists_url": "https://api.github.com/users/ernestoalejo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ernestoalejo", "id": 961966, "login": "ernestoalejo", "node_id": "MDQ6VXNlcjk2MTk2Ng==", "organizations_url": "https://api.github.com/users/ernestoalejo/orgs", "received_events_url": "https://api.github.com/users/ernestoalejo/received_events", "repos_url": "https://api.github.com/users/ernestoalejo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ernestoalejo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ernestoalejo/subscriptions", "type": "User", "url": "https://api.github.com/users/ernestoalejo", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3777/reactions" }
https://api.github.com/repos/psf/requests/issues/3777/timeline
null
completed
null
null
false
[ "I can now reproduce the issue correctly inside a simple Docker container, but after `git bisect` of course the commit introducing the error was this one: 99fa7becf263473c7bfc1998b41c2c6c80a0f499", "It's extremely likely that 3DES is the problem, but I haven't been able to debug the DESAdapter I wrote because I'm away from my laptop. \r\n\r\nTo get a better idea you may want to use git bisect on urllib3 rather than on requests. Do you feel up to that?", "Yes, I'll checkout urllib3 inside the requests folder and I'll try to bisect it.", "The commit is this one: 02c5d965d0f8c64b67c4067c9d650754ef644781, so clearly 3DES is the problem here. I'll try to read some documentation about the adapter to try to fix it myself.", "Edit: Link to the commit: https://github.com/shazow/urllib3/commit/02c5d965d0f8c64b67c4067c9d650754ef644781", "Definitely 3DES is the problem here; adding 'RSA+3DES' to the DEFAULT_CHIPERS fixes the issue and commenting it raises the exception. I'm going to close the issue now that the problem has been found and I'll comment back later when I finish the adapter code if someone else needs it.", "Ok, so I need to look into why the adapter code isn't working at some point.", "It was a silly error, the line:\r\n\r\n```python\r\ns.mount('https://XXX:25000/YYY', DESAdapter())\r\n```\r\n\r\nshould be:\r\n\r\n```python\r\ns.mount('https://XXX:25000', DESAdapter())\r\n```\r\n\r\nand voila!, support for old and insecure protocols restored. 
:smile: I copied that line from #3774 so I'll leave a note there too in case that was the reason the error kept raising after the fix.\r\n\r\n-----\r\n\r\nFull code for reference:\r\n```python\r\nimport requests\r\n\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\n\r\n# This is the 2.11 Requests cipher string.\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, *args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n return super(DESAdapter, self).init_poolmanager(*args, **kwargs)\r\n\r\ns = requests.Session()\r\ns.mount('https://XXX:25000', DESAdapter())\r\nprint s.post('https://XXX:25000/YYY', verify=False, timeout=10)\r\n```" ]
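The resolution above hinged on how `Session.mount` prefixes are matched against request URLs: requests picks the longest mounted prefix that is a case-insensitive prefix of the URL. The sketch below approximates that lookup in standalone code to show why the host-level mount works; `pick_adapter` and the adapter names are illustrative, not library API:

```python
def pick_adapter(mounts, url):
    """Return the value whose key is the longest case-insensitive prefix
    of *url*, mimicking how requests' Session.get_adapter resolves mounts."""
    for prefix in sorted(mounts, key=len, reverse=True):
        if url.lower().startswith(prefix.lower()):
            return mounts[prefix]
    raise LookupError("no adapter found for %r" % url)


mounts = {
    "https://": "default-adapter",
    "https://host:25000": "des-adapter",  # host-level mount, as in the fix
}
```

Mounting at the host level catches every path on that host, whereas a path-level mount such as `https://host:25000/YYY` leaves sibling paths (including redirect targets) on the default adapter without the 3DES ciphers.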
https://api.github.com/repos/psf/requests/issues/3776
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3776/labels{/name}
https://api.github.com/repos/psf/requests/issues/3776/comments
https://api.github.com/repos/psf/requests/issues/3776/events
https://github.com/psf/requests/issues/3776
196,354,090
MDU6SXNzdWUxOTYzNTQwOTA=
3,776
Non-ASCII characters in `requests/cacerts.pem` file cause errors in Jython / IBM JVM
{ "avatar_url": "https://avatars.githubusercontent.com/u/1094627?v=4", "events_url": "https://api.github.com/users/mplonka/events{/privacy}", "followers_url": "https://api.github.com/users/mplonka/followers", "following_url": "https://api.github.com/users/mplonka/following{/other_user}", "gists_url": "https://api.github.com/users/mplonka/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mplonka", "id": 1094627, "login": "mplonka", "node_id": "MDQ6VXNlcjEwOTQ2Mjc=", "organizations_url": "https://api.github.com/users/mplonka/orgs", "received_events_url": "https://api.github.com/users/mplonka/received_events", "repos_url": "https://api.github.com/users/mplonka/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mplonka/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mplonka/subscriptions", "type": "User", "url": "https://api.github.com/users/mplonka", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-12-19T08:13:48Z
2021-09-08T13:05:36Z
2016-12-19T12:15:51Z
CONTRIBUTOR
resolved
When using requests with Jython running on of IBM JVM, accessing HTTPS endpoints fails with: ``` java.security.cert.CertificateException: Fail to parse input stream ``` Example code: ```python import requests r = requests.get('https://www.google.com/') ``` The problem occurs only with Jython and only when running on IBM JVMs. The exception is being thrown by IBM's implementation of `java.security.cert.CertificateFactory`: ``` certs = list(cf.generateCertificates(ByteArrayInputStream(f.read()))) at com.ibm.crypto.provider.X509Factory.b(Unknown Source) at com.ibm.crypto.provider.X509Factory.engineGenerateCertificates(Unknown Source) at java.security.cert.CertificateFactory.generateCertificates(CertificateFactory.java:448) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55) at java.lang.reflect.Method.invoke(Method.java:508) ``` It turns out that CertificateFactory in IBM JVMs is very strict when parsing PEM files and it does not allow non-ASCII characters at all, even in comments. After removing the following lines, everything works without issues: ``` [ 1949]: # Issuer: CN=TÜBİTAK UEKAE Kök Sertifika Hizmet Sağlayıcısı - Sürüm 3 O=Türkiye Bilimsel ve Teknolojik Araştırma Kurumu - TÜBİTAK OU=Ulusal Elektronik ve Kriptoloji Araştırma Enstitüsü - UEKAE/Kamu Sertifikasyon Merkezi [ 1950]: # Subject: CN=TÜBİTAK UEKAE Kök Sertifika Hizmet Sağlayıcısı - Sürüm 3 O=Türkiye Bilimsel ve Teknolojik Araştırma Kurumu - TÜBİTAK OU=Ulusal Elektronik ve Kriptoloji Araştırma Enstitüsü - UEKAE/Kamu Sertifikasyon Merkezi [ 2280]: # Issuer: CN=NetLock Arany (Class Gold) Főtanúsítvány O=NetLock Kft. OU=Tanúsítványkiadók (Certification Services) [ 2281]: # Subject: CN=NetLock Arany (Class Gold) Főtanúsítvány O=NetLock Kft. 
OU=Tanúsítványkiadók (Certification Services) [ 2282]: # Label: "NetLock Arany (Class Gold) Főtanúsítvány" [ 2936]: # Issuer: CN=Certinomis - Autorité Racine O=Certinomis OU=0002 433998903 [ 2937]: # Subject: CN=Certinomis - Autorité Racine O=Certinomis OU=0002 433998903 [ 2938]: # Label: "Certinomis - Autorité Racine" [ 3413]: # Issuer: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. (c) Aralık 2007 [ 3414]: # Subject: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. (c) Aralık 2007 [ 3896]: # Issuer: CN=E-Tugra Certification Authority O=E-Tuğra EBG Bilişim Teknolojileri ve Hizmetleri A.Ş. OU=E-Tugra Sertifikasyon Merkezi [ 3897]: # Subject: CN=E-Tugra Certification Authority O=E-Tuğra EBG Bilişim Teknolojileri ve Hizmetleri A.Ş. OU=E-Tugra Sertifikasyon Merkezi [ 4303]: # Issuer: CN=CA 沃通根证书 O=WoSign CA Limited [ 4304]: # Subject: CN=CA 沃通根证书 O=WoSign CA Limited [ 4750]: # Issuer: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H5 O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. [ 4751]: # Subject: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H5 O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. [ 4752]: # Label: "TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H5" [ 4783]: # Issuer: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H6 O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. [ 4784]: # Subject: CN=TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H6 O=TÜRKTRUST Bilgi İletişim ve Bilişim Güvenliği Hizmetleri A.Ş. 
[ 4785]: # Label: "TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H6" ``` A temporary workaround is to remove all comments from `cacerts.pem` file and set `REQUESTS_CA_BUNDLE` variable: ```bash grep -v '^\s*\(#.*\)\?$' requests/cacert.pem > /tmp/cacert.pem export REQUESTS_CA_BUNDLE=/tmp/cacert.pem ``` From what I can see in the [Makefile](https://github.com/kennethreitz/requests/blob/v2.12.4/Makefile#L17-L18), the `requests/cacert.pem` file is being downloaded from <http://ci.kennethreitz.org/job/ca-bundle/lastSuccessfulBuild/artifact/cacerts.pem>, which means that the patch should be either applied to that Jenkins job or the `Makefile`. Not sure which one is best? My suggested solution (but not necessarily the implementation) is to encode those special characters with 'backslashreplace'. Piping the pem file through this script does exactly that: ```python import sys for l in sys.stdin.readlines(): print unicode(l, 'utf8').strip().encode('ascii', errors='backslashreplace') ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3776/reactions" }
https://api.github.com/repos/psf/requests/issues/3776/timeline
null
completed
null
null
false
[ "The PEM file actually comes from https://mkcert.org/generate/. I see no reason to remove the comments on the source side, given that every other tool is happy to ignore them. But I also see no reason we shouldn't fix up the build process for certifi to strip them.\r\n\r\nI consider this an issue on certifi more than on requests, so I've opened certifi/python-certifi#50, which tracks the work.\r\n\r\nThanks for the report!", "Hi @Lukasa. Thanks for looking into that.\r\nWouldn't it be wise to do some sort of escaping of those comments in https://github.com/Lukasa/mkcert itself? Are you OK with me submitting a PR there?", "@mplonka I don't really see any reason to do the escaping there. PEM isn't well specced but so far we have only one extremely unusual implementation that chokes. I don't really see any reason to destroy that output for that, given that it's clearly intended to be human readable. " ]
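The workaround in the report shells out to `grep` to drop comment and blank lines from the bundle, since the base64 certificate blocks themselves are pure ASCII. The same filter can be written in Python; `strip_pem_comments` is an illustrative helper name, not part of requests or certifi:

```python
import re

# Matches blank lines and '#' comment lines -- the only places the
# bundle carries non-ASCII text; base64 blocks are pure ASCII.
_COMMENT_OR_BLANK = re.compile(r"^\s*(#.*)?$")


def strip_pem_comments(pem_text):
    """Drop comment and blank lines from a PEM bundle, the Python
    equivalent of the grep workaround in the report."""
    kept = [line for line in pem_text.splitlines()
            if not _COMMENT_OR_BLANK.match(line)]
    return "\n".join(kept) + "\n"
```

Pointing `REQUESTS_CA_BUNDLE` at the filtered output, as the report does, then sidesteps the strict IBM JVM parser.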
https://api.github.com/repos/psf/requests/issues/3775
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3775/labels{/name}
https://api.github.com/repos/psf/requests/issues/3775/comments
https://api.github.com/repos/psf/requests/issues/3775/events
https://github.com/psf/requests/issues/3775
196,320,655
MDU6SXNzdWUxOTYzMjA2NTU=
3,775
Problem running PyInstaller 3.2 compiled executable using requests v2.12.x
{ "avatar_url": "https://avatars.githubusercontent.com/u/16149848?v=4", "events_url": "https://api.github.com/users/sureshk75/events{/privacy}", "followers_url": "https://api.github.com/users/sureshk75/followers", "following_url": "https://api.github.com/users/sureshk75/following{/other_user}", "gists_url": "https://api.github.com/users/sureshk75/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sureshk75", "id": 16149848, "login": "sureshk75", "node_id": "MDQ6VXNlcjE2MTQ5ODQ4", "organizations_url": "https://api.github.com/users/sureshk75/orgs", "received_events_url": "https://api.github.com/users/sureshk75/received_events", "repos_url": "https://api.github.com/users/sureshk75/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sureshk75/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sureshk75/subscriptions", "type": "User", "url": "https://api.github.com/users/sureshk75", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-12-19T02:49:41Z
2021-09-08T10:00:38Z
2016-12-19T03:08:19Z
NONE
resolved
Hi, firstly thank you for the awesome module that you have here. I've been using Python 3.4.4 on an app I created about 2 months ago and recently ran into some issues with requests. The following is the traceback displayed when attempting to run it after compiling it into a windows executable using PyInstaller 3.2 using requests v2.12.x (all patches). I have downgraded to requests v2.11.1 as that version compiles and runs without errors. I thought I'd give you the heads up as I couldn't find a similar issue and solution anywhere. Thanks > Traceback (most recent call last): > File "site-packages\requests\packages\__init__.py", line 27, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\packages\urllib3\__init__.py", line 8, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\packages\urllib3\connectionpool.py", line 28, in <module> > File "site-packages\requests\packages\urllib3\packages\six.py", line 203, in load_module > File "site-packages\requests\packages\urllib3\packages\six.py", line 115, in _resolve > File "site-packages\requests\packages\urllib3\packages\six.py", line 82, in _import_module > ImportError: No module named 'queue' > > During handling of the above exception, another exception occurred: > > Traceback (most recent call last): > File "dt3.py", line 13, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\__init__.py", line 63, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\utils.py", line 24, in <module> > File 
"C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\_internal_utils.py", line 11, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\compat.py", line 11, in <module> > File "C:\Python34\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module > exec(bytecode, module.__dict__) > File "site-packages\requests\packages\__init__.py", line 29, in <module> > ImportError: No module named 'urllib3'
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3775/reactions" }
https://api.github.com/repos/psf/requests/issues/3775/timeline
null
completed
null
null
false
[ "This seems like a pyinstaller problem: our import of the queue module via `six` isn't being detected. You should ensure that your packaging includes that standard library module. ", "@Lukasa We hit this problem today. Can you explain what you mean by:\r\n> You should ensure that your packaging includes that standard library module.\r\n\r\nThanks", "The queue module, which is part of the standard library. ", "Thanks for the quick reply @Lukasa. Seems the pyinstaller guys think this is a problem with requests due to your bundling of `six` (https://github.com/kennethreitz/requests/issues/3793).\r\n\r\nAny thoughts on that? I'd like to try to help get this resolved.", "I'm pretty unimpressed that pyinstaller thinks this is our fault. We are using valid Python code that through a valid import chain imports a stdlib module that they are removing. Our job is not to contort ourselves for the benefit of their tool. Their job is to support valid Python code. " ]
https://api.github.com/repos/psf/requests/issues/3774
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3774/labels{/name}
https://api.github.com/repos/psf/requests/issues/3774/comments
https://api.github.com/repos/psf/requests/issues/3774/events
https://github.com/psf/requests/issues/3774
196,314,842
MDU6SXNzdWUxOTYzMTQ4NDI=
3,774
bad handshake error with ssl3
{ "avatar_url": "https://avatars.githubusercontent.com/u/1667227?v=4", "events_url": "https://api.github.com/users/bigbagboom/events{/privacy}", "followers_url": "https://api.github.com/users/bigbagboom/followers", "following_url": "https://api.github.com/users/bigbagboom/following{/other_user}", "gists_url": "https://api.github.com/users/bigbagboom/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/bigbagboom", "id": 1667227, "login": "bigbagboom", "node_id": "MDQ6VXNlcjE2NjcyMjc=", "organizations_url": "https://api.github.com/users/bigbagboom/orgs", "received_events_url": "https://api.github.com/users/bigbagboom/received_events", "repos_url": "https://api.github.com/users/bigbagboom/repos", "site_admin": false, "starred_url": "https://api.github.com/users/bigbagboom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bigbagboom/subscriptions", "type": "User", "url": "https://api.github.com/users/bigbagboom", "user_view_type": "public" }
[]
closed
true
null
[]
null
14
2016-12-19T01:40:11Z
2019-02-01T18:00:10Z
2016-12-19T01:57:19Z
NONE
resolved
I have an in-house IIS server with ssl3 but an expired certificate, so I used requests without certificate verification and it was working fine with requests 2.11.1. But after I upgraded requests to 2.12.0, an error occurred. the code is: ... requests.get('https://10.192.8.89:8080/yps_report', verify=False) ... error message: Traceback (most recent call last): File "c:\python35\lib\site-packages\requests\packages\urllib3\contrib\pyopenssl.py", line 417, in wrap_socket cnx.do_handshake() File "c:\python35\lib\site-packages\OpenSSL\SSL.py", line 1426, in do_handshake self._raise_ssl_error(self._ssl, result) File "c:\python35\lib\site-packages\OpenSSL\SSL.py", line 1167, in _raise_ssl_error raise SysCallError(-1, "Unexpected EOF") OpenSSL.SSL.SysCallError: (-1, 'Unexpected EOF') During handling of the above exception, another exception occurred: Traceback (most recent call last): File "c:\python35\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 594, in urlopen chunked=chunked) File "c:\python35\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 350, in _make_request self._validate_conn(conn) File "c:\python35\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 835, in _validate_conn conn.connect() File "c:\python35\lib\site-packages\requests\packages\urllib3\connection.py", line 323, in connect ssl_context=context) File "c:\python35\lib\site-packages\requests\packages\urllib3\util\ssl_.py", line 324, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "c:\python35\lib\site-packages\requests\packages\urllib3\contrib\pyopenssl.py", line 424, in wrap_socket raise ssl.SSLError('bad handshake: %r' % e) ssl.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",) ... I tried to downgrade requests to 2.11.1 and the error was gone. I have no idea how to fix this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/3774/reactions" }
https://api.github.com/repos/psf/requests/issues/3774/timeline
null
completed
null
null
false
[ "This is almost certainly because the server in question only supports 3DES as a cipher, which we dropped from our default cipher list because it's insecure. If you operate the server I strongly encourage you to upgrade its cipher suite configuration to support something secure. ", "I think you are right. But I don't run the server.... It seems I have to strick to 2.11.1 for good.", "There are other options. It is possible to configure a custom SSLContext object that can do what you need. You need a custom HTTPAdapter:\r\n\r\n```python\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\n\r\n# This is the 2.11 Requests cipher string.\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, *args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n return super(HTTPAdapter, self).init_poolmanager(*args, **kwargs)\r\n\r\ns = requests.Session()\r\ns.mount('https://10.192.8.89', DESAdapter())\r\n```\r\n\r\nThis should work. Please avoid using this too much, and also ensure that you put pressure on the server administrator to update their config: they have *no* secure cipher suites in their config now, which is pretty unacceptable. ", "@Lukasa thank you for your reply, but your code doesn't work for me. 
\r\nhere are my code(minor modified on yours):\r\n\r\n```\r\nimport sys\r\nimport ssl\r\nimport requests\r\nfrom requests_ntlm import HttpNtlmAuth\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.poolmanager import PoolManager\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\n\r\n# This is the 2.11 Requests cipher string.\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\n\r\nrequests.packages.urllib3.disable_warnings()\r\n\r\nclass Ssl3HttpAdapter(HTTPAdapter):\r\n \"\"\"\"Transport adapter\" that allows us to use SSLv3.\"\"\"\r\n def init_poolmanager(self, connections, maxsize, block=False):\r\n self.poolmanager = PoolManager(\r\n num_pools=connections, maxsize=maxsize,\r\n block=block, ssl_version=ssl.PROTOCOL_SSLv3)\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, connections, maxsize, block=False,*args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n self.poolmanager = PoolManager(\r\n num_pools=connections, maxsize=maxsize,\r\n block=block, ssl_version=ssl.PROTOCOL_SSLv3,*args, **kwargs)\r\n\r\nif __name__==\"__main__\":\r\n\r\n username=\"someuser\"\r\n password=\"thepwd\"\r\n\r\n s = requests.Session()\r\n\r\n #s.mount('https://10.192.8.89:8080/yps_report',Ssl3HttpAdapter())\r\n s.mount('https://10.192.8.89:8080/yps_report',DESAdapter())\r\n\r\n r=s.get('https://10.192.8.89:8080/yps_report',auth=HttpNtlmAuth(username,password),verify=False)\r\n\r\n```\r\n\r\nsame error msg appears:\r\n\r\n```\r\n========= RESTART: M:\\support control\\KPSKTPRC\\getKpsfiles - test.py =========\r\nTraceback (most recent call last):\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\contrib\\pyopenssl.py\", line 417, in wrap_socket\r\n cnx.do_handshake()\r\n File 
\"c:\\python35\\lib\\site-packages\\OpenSSL\\SSL.py\", line 1426, in do_handshake\r\n self._raise_ssl_error(self._ssl, result)\r\n File \"c:\\python35\\lib\\site-packages\\OpenSSL\\SSL.py\", line 1167, in _raise_ssl_error\r\n raise SysCallError(-1, \"Unexpected EOF\")\r\nOpenSSL.SSL.SysCallError: (-1, 'Unexpected EOF')\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 594, in urlopen\r\n chunked=chunked)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 350, in _make_request\r\n self._validate_conn(conn)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 835, in _validate_conn\r\n conn.connect()\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 323, in connect\r\n ssl_context=context)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\util\\ssl_.py\", line 324, in ssl_wrap_socket\r\n return context.wrap_socket(sock, server_hostname=server_hostname)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\contrib\\pyopenssl.py\", line 424, in wrap_socket\r\n raise ssl.SSLError('bad handshake: %r' % e)\r\nssl.SSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"c:\\python35\\lib\\site-packages\\requests\\adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 624, in urlopen\r\n raise SSLError(e)\r\nrequests.packages.urllib3.exceptions.SSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)\r\n\r\nDuring handling of the above exception, another exception 
occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"M:\\support control\\KPSKTPRC\\getKpsfiles - test.py\", line 44, in <module>\r\n r=s.get('https://10.192.8.89:8080/yps_report',auth=HttpNtlmAuth(username,password),verify=False)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 501, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\adapters.py\", line 497, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)\r\n```\r\n\r\nand under 2.11.1, there was another error which i can fix it by removing the ssl_context key:\r\n\r\n```\r\n========= RESTART: M:\\support control\\KPSKTPRC\\getKpsfiles - test.py =========\r\nTraceback (most recent call last):\r\n File \"M:\\support control\\KPSKTPRC\\getKpsfiles - test.py\", line 44, in <module>\r\n r=s.get('https://10.192.8.89:8080/yps_report',auth=HttpNtlmAuth(username,password),verify=False)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 488, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 475, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 596, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 583, in urlopen\r\n conn = self._get_conn(timeout=pool_timeout)\r\n File 
\"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 257, in _get_conn\r\n return conn or self._new_conn()\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 819, in _new_conn\r\n strict=self.strict, **self.conn_kw)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 210, in __init__\r\n timeout=timeout, **kw)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 126, in __init__\r\n _HTTPConnection.__init__(self, *args, **kw)\r\nTypeError: __init__() got an unexpected keyword argument 'ssl_context' \r\n```", "Hrm, it's also possible that you're using PyOpenSSL when you aren't expecting to be. Can you show me the output of `pip freeze`?", "For reference, the line:\r\n\r\n```python\r\ns.mount('https://10.192.8.89:8080/yps_report',DESAdapter())\r\n```\r\n\r\nshould be:\r\n\r\n```python\r\ns.mount('https://10.192.8.89:8080',DESAdapter())\r\n```\r\n\r\nor the error adapter won't be used correctly.", "@Lukasa yes, i installed PyOpenSSL but I don't think it's the cause. 
I uninstalled it and the error changed to the following:\r\n```\r\n========= RESTART: M:\\support control\\KPSKTPRC\\getKpsfiles - test.py =========\r\nTraceback (most recent call last):\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 594, in urlopen\r\n chunked=chunked)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 350, in _make_request\r\n self._validate_conn(conn)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 835, in _validate_conn\r\n conn.connect()\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connection.py\", line 323, in connect\r\n ssl_context=context)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\util\\ssl_.py\", line 324, in ssl_wrap_socket\r\n return context.wrap_socket(sock, server_hostname=server_hostname)\r\n File \"C:\\Python35\\lib\\ssl.py\", line 377, in wrap_socket\r\n _context=self)\r\n File \"C:\\Python35\\lib\\ssl.py\", line 752, in __init__\r\n self.do_handshake()\r\n File \"C:\\Python35\\lib\\ssl.py\", line 988, in do_handshake\r\n self._sslobj.do_handshake()\r\n File \"C:\\Python35\\lib\\ssl.py\", line 633, in do_handshake\r\n self._sslobj.do_handshake()\r\nssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"c:\\python35\\lib\\site-packages\\requests\\adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"c:\\python35\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 624, in urlopen\r\n raise SSLError(e)\r\nrequests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"M:\\support 
control\\KPSKTPRC\\getKpsfiles - test.py\", line 45, in <module>\r\n r=s.get('https://10.192.8.89:8080/yps_report',auth=HttpNtlmAuth(username,password),verify=False)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 501, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"c:\\python35\\lib\\site-packages\\requests\\adapters.py\", line 497, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\r\n```\r\nnow the pip freeze is:\r\nC:\\Users\\bigbagboom>pip freeze\r\nalabaster==0.7.9\r\nanaconda-client==1.4.0\r\nanaconda-navigator==1.1.0\r\nargcomplete==1.0.0\r\nastroid==1.4.8\r\nastropy==1.1.2\r\nBabel==2.3.4\r\nbeautifulsoup4==4.4.1\r\nbitarray==0.8.1\r\nblaze==0.9.1\r\nbokeh==0.11.1\r\nboto==2.39.0\r\nBottleneck==1.0.0\r\ncertifi==2016.9.26\r\ncffi==1.9.1\r\nchest==0.2.3\r\nclick==6.6\r\ncloudpickle==0.1.1\r\nclyent==1.2.1\r\ncolorama==0.3.7\r\ncomtypes==1.1.2\r\nconda==4.0.5\r\nconda-build==1.20.0\r\nconda-env==2.4.5\r\nconda-manager==0.3.1\r\nconfigobj==5.0.6\r\ncryptography==1.7.1\r\ncycler==0.10.0\r\nCython==0.23.4\r\ncytoolz==0.7.5\r\ndask==0.8.1\r\ndatashape==0.5.1\r\ndecorator==4.0.10\r\ndill==0.2.4\r\ndocutils==0.13.1\r\ndynd===c328ab7\r\nentrypoints==0.2.2\r\net-xmlfile==1.0.1\r\nfastcache==1.0.2\r\nFlask==0.11.1\r\nFlask-Cors==2.1.2\r\nfuture==0.16.0\r\ngevent==1.1.2\r\ngreenlet==0.4.11\r\nh5py==2.5.0\r\nHeapDict==1.0.0\r\nidna==2.1\r\nimagesize==0.7.1\r\nipykernel==4.5.2\r\nipython==5.1.0\r\nipython-genutils==0.1.0\r\nipywidgets==4.1.1\r\nisort==4.2.5\r\nitsdangerous==0.24\r\njdcal==1.3\r\njedi==0.9.0\r\nJinja2==2.8\r\njsonschema==2.5.1\r\njupyter==1.0.0\r\njupyter-client==4.4.0\r\njupyte
r-console==4.1.1\r\njupyter-core==4.2.1\r\nlazy-object-proxy==1.2.2\r\nllvmlite==0.9.0\r\nlocket==0.2.0\r\nlxml==3.6.0\r\nMarkupSafe==0.23\r\nmatplotlib==1.5.1\r\nmccabe==0.5.3\r\nmenuinst==1.3.2\r\nmistune==0.7.3\r\nmpmath==0.19\r\nmultipledispatch==0.4.8\r\nnbconvert==4.3.0\r\nnbformat==4.2.0\r\nndg-httpsclient==0.4.2\r\nnetworkx==1.11\r\nnltk==3.2\r\nnose==1.3.7\r\nnotebook==4.1.0\r\nntlm-auth==1.0.2\r\nnumba==0.24.0\r\nnumexpr==2.6.1\r\nnumpy==1.11.2\r\nodo==0.4.2\r\nopenpyxl==2.4.1\r\nordereddict==1.1\r\npandas==0.18.0\r\npartd==0.3.2\r\npath.py==0.0.0\r\npatsy==0.4.0\r\npep8==1.7.0\r\npickleshare==0.7.4\r\nPillow==3.4.2\r\nply==3.8\r\nprompt-toolkit==1.0.9\r\npsutil==5.0.0\r\npy==1.4.31\r\npyasn1==0.1.9\r\npycosat==0.6.1\r\npycparser==2.17\r\npycrypto==2.6.1\r\npyexcel==0.3.3\r\npyexcel-io==0.2.4\r\npyexcel-xls==0.2.0\r\npyexcel-xlsx==0.2.3\r\npyflakes==1.3.0\r\nPygments==2.1.3\r\npylint==1.6.4\r\npyparsing==2.0.3\r\npyreadline==2.1\r\npytest==2.8.5\r\npython-dateutil==2.5.1\r\npython-ntlm3==1.0.2\r\npytz==2016.10\r\npywin32==220\r\nPyYAML==3.11\r\npyzmq==16.0.2\r\nQtAwesome==0.3.3\r\nqtconsole==4.2.1\r\nQtPy==1.1.2\r\nrequests==2.12.0\r\nrequests-ntlm==1.0.0\r\nrequests-toolbelt==0.7.0\r\nrope-py3k==0.9.4.post1\r\nscikit-image==0.12.3\r\nscikit-learn==0.17.1\r\nscipy==0.17.0\r\nsimplegeneric==0.8.1\r\nsingledispatch==3.4.0.3\r\nsix==1.10.0\r\nsnowballstemmer==1.2.1\r\nsockjs-tornado==1.0.1\r\nsphinx-rtd-theme==0.1.9\r\nspyder==2.3.8\r\nSQLAlchemy==1.0.12\r\nstatsmodels==0.6.1\r\nsympy==1.0\r\ntables==3.3.0\r\ntexttable==0.8.7\r\ntoolz==0.7.4\r\ntornado==4.4.2\r\ntraitlets==4.3.1\r\nunicodecsv==0.14.1\r\nurllib3==1.19.1\r\nwcwidth==0.1.7\r\nWerkzeug==0.11.11\r\nwin-unicode-console==0.5\r\nwrapt==1.10.8\r\nxlrd==0.9.4\r\nXlsxWriter==0.8.4\r\nxlwings==0.7.0\r\nxlwt==1.1.2\r\nxlwt-future==0.8.0\r\n\r\nAnd for reference, here is some output(successful) with curl 7.50.3:\r\n```\r\n* Trying 10.192.8.89...\r\n* TCP_NODELAY set\r\n % Total % Received % Xferd Average 
Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Connected to 10.192.8.89 (10.192.8.89) p\r\nort 8080 (#0)\r\n* ALPN, offering h2\r\n* ALPN, offering http/1.1\r\n* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH\r\n} [5 bytes data]\r\n* SSLv3 (OUT), TLS handshake, Client hello (1):\r\n} [131 bytes data]\r\n* SSLv3 (IN), TLS handshake, Server hello (2):\r\n{ [74 bytes data]\r\n* SSLv3 (IN), TLS handshake, Certificate (11):\r\n{ [1069 bytes data]\r\n* SSLv3 (IN), TLS handshake, Server finished (14):\r\n{ [4 bytes data]\r\n* SSLv3 (OUT), TLS handshake, Client key exchange (16):\r\n} [132 bytes data]\r\n* SSLv3 (OUT), TLS change cipher, Client hello (1):\r\n} [1 bytes data]\r\n* SSLv3 (OUT), TLS handshake, Finished (20):\r\n} [40 bytes data]\r\n 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* SSLv3 (IN), TLS change cipher, Client he\r\nllo (1):\r\n{ [1 bytes data]\r\n* SSLv3 (IN), TLS handshake, Finished (20):\r\n{ [40 bytes data]\r\n* SSL connection using SSLv3 / DES-CBC3-SHA\r\n* ALPN, server did not agree to a protocol\r\n* Server certificate:\r\n* subject: C=CN; ST=SH; L=SH; O=XXX; OU=XXX; CN=XXX\r\n* start date: Jul 13 09:31:03 2013 GMT\r\n* expire date: Jul 13 09:41:03 2014 GMT\r\n* issuer: CN=XXXXXXXXX\r\n* SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.\r\n* Server auth using NTLM with user 'someuser'\r\n} [5 bytes data]\r\n> GET /yps_report/share_dir/XXX/XXXXXXXXXXXXX.xls HTTP/1.1\r\n> Host: 10.192.8.89:8080\r\n> Authorization: NTLM TlRMTVNTUAABAAAABoIIAAAAAAAAAAAAAAAAAAAAAAA=\r\n> User-Agent: curl/7.50.3\r\n> Accept: */*\r\n>\r\n{ [5 bytes data]\r\n< HTTP/1.1 401 Unauthorized\r\n< Content-Length: 1251\r\n< Content-Type: text/html\r\n< Server: Microsoft-IIS/6.0\r\n< WWW-Authenticate: NTLM 
TlRMTVNTUAACAAAACAAIADgAAAAGgooCa6Za/bxp8M0AAAAAAAAAAFQAVABAAAAABQLODgAAAA9QV0tUUFNWUgIAEABQAFc\r\nASwBUAFAAUwBWAFIAAQAQAFAAVwBLAFQAUABTAFYAUgAEABAAcAB3AGsAdABwAHMAdgByAAMAEABwAHcAawB0AHAAcwB2AHIAAAAAAA==\r\n< X-Powered-By: ASP.NET\r\n< Date: Tue, 20 Dec 2016 00:48:40 GMT\r\n<\r\n* Ignoring the response-body\r\n{ [1251 bytes data]\r\n* Curl_http_done: called premature == 0\r\n100 1251 100 1251 0 0 2113 0 --:--:-- --:--:-- --:--:-- 2229\r\n* Connection #0 to host 10.192.8.89 left intact\r\n* Issue another request to this URL: 'https://10.192.8.89:8080/yps_report/share_dir/XXX/XXXXXXXXXXXXX.xls'\r\n* Found bundle for host 10.192.8.89: 0x4cd490 [can pipeline]\r\n* Re-using existing connection! (#0) with host 10.192.8.89\r\n* Connected to 10.192.8.89 (10.192.8.89) port 8080 (#0)\r\n* Server auth using NTLM with user 'someuser'\r\n} [5 bytes data]\r\n> GET /yps_report/share_dir/XXX/XXXXXXXXXXXXX.xls HTTP/1.1\r\n> Host: 10.192.8.89:8080\r\n> Authorization: NTLM TlRMTVNTUAADAAAAGAAYAEAAAACEAIQAWAAAAAAAAADcAAAABwAHANwAAAANAA0A4wAAAAAAAAAAAAAABoKKAu8HJ+fusEEOQg\r\nho0GY4AyMot6g8k1ZQsSt3Mxi4PdkXtAHr8KmqFa4BAQAAAAAAAID6yM5aWtIBKLeoPJNWULEAAAAAAgAQAFAAVwBLAFQAUABTAFYAUgABABAAUABXAEsAVA\r\nBQAFMAVgBSAAQAEABwAHcAawB0AHAAcwB2AHIAAwAQAHAAdwBrAHQAcABzAHYAcgAAAAAAAAAAAHlwc3VzZXJUQy1HNlJYVUowVzJI\r\n> User-Agent: curl/7.50.3\r\n> Accept: */*\r\n>\r\n{ [5 bytes data]\r\n< HTTP/1.1 200 OK\r\n< Content-Length: 8881658\r\n< Content-Type: application/vnd.ms-excel\r\n< Last-Modified: Mon, 19 Dec 2016 10:07:56 GMT\r\n< Accept-Ranges: bytes\r\n< ETag: \"d65911c5df59d21:47c\"\r\n< Server: Microsoft-IIS/6.0\r\n< X-Powered-By: ASP.NET\r\n< Date: Tue, 20 Dec 2016 00:48:40 GMT\r\n<\r\n{ [16116 bytes data]\r\n 96 8673k 96 8409k 0 0 737k 0 0:00:11 0:00:11 --:--:-- 807k* Curl_http_done: called premature == 0\r\n100 8673k 100 8673k 0 0 728k 0 0:00:11 0:00:11 --:--:-- 795k\r\n* Connection #0 to host 10.192.8.89 left intact\r\n\r\n```", "@ernestoalejo thank you but I am not sure... 
you know 'https://10.192.8.89:8080/yps_report' and 'https://10.192.8.89:8080' can be 2 different sites in IIS's configuration.\r\n\r\n> For reference, the line:\r\n> \r\n> s.mount('https://10.192.8.89:8080/yps_report',DESAdapter())\r\n> should be:\r\n> \r\n> s.mount('https://10.192.8.89:8080',DESAdapter())\r\n> or the error adapter won't be used correctly.", "@bigbagboom They cannot possibly have different configurations for TLS though. IIS can't see what path is being used until after the TLS handshake is complete.", "my error was gone by adding 'DES-CBC3-SHA' in the CIPHERS string. sigh...\r\n\r\nThank you @Lukasa and @ernestoalejo .\r\n\r\nand it's also ok with following code:\r\ns.mount('https://10.192.8.89:8080/yps_report',DESAdapter()) ", "Hey, I'm not an expert on HTTPS or SSL, and I'm unable to figure out what's wrong with the communication to [the server that I want to scrape](https://www.old.health.gov.il/pages/). According to Chrome developer tools, \r\n\r\n> The connection to this site uses an obsolete protocol (TLS 1.0), an obsolete key exchange (RSA), and an obsolete cipher (3DES_EDE_CBC with HMAC-SHA1).\r\n\r\nI've tried the code above, and tried to add `:RSA+3DES-EDE-CBC` to the list of ciphers, but I still get the error:\r\n```\r\n\r\nmy_venv\\Scripts\\python.exe Projects/scrape_proj.py/scrape_proj.py\r\n2.17.3\r\nTraceback (most recent call last):\r\n File \"my_venv\\lib\\site-packages\\urllib3\\connectionpool.py\", line 600, in urlopen\r\n chunked=chunked)\r\n File \"my_venv\\lib\\site-packages\\urllib3\\connectionpool.py\", line 345, in _make_request\r\n self._validate_conn(conn)\r\n File \"my_venv\\lib\\site-packages\\urllib3\\connectionpool.py\", line 844, in _validate_conn\r\n conn.connect()\r\n File \"my_venv\\lib\\site-packages\\urllib3\\connection.py\", line 326, in connect\r\n ssl_context=context)\r\n File \"my_venv\\lib\\site-packages\\urllib3\\util\\ssl_.py\", line 325, in ssl_wrap_socket\r\n return context.wrap_socket(sock, 
server_hostname=server_hostname)\r\n File \"AppData\\Local\\Programs\\Python\\Python35\\lib\\ssl.py\", line 377, in wrap_socket\r\n _context=self)\r\n File \"AppData\\Local\\Programs\\Python\\Python35\\lib\\ssl.py\", line 752, in __init__\r\n self.do_handshake()\r\n File \"AppData\\Local\\Programs\\Python\\Python35\\lib\\ssl.py\", line 988, in do_handshake\r\n self._sslobj.do_handshake()\r\n File \"AppData\\Local\\Programs\\Python\\Python35\\lib\\ssl.py\", line 633, in do_handshake\r\n self._sslobj.do_handshake()\r\nssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"my_venv\\lib\\site-packages\\requests\\adapters.py\", line 440, in send\r\n timeout=timeout\r\n File \"my_venv\\lib\\site-packages\\urllib3\\connectionpool.py\", line 630, in urlopen\r\n raise SSLError(e)\r\nurllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"Projects/scrape_proj.py/scrape_proj.py\", line 48, in <module>\r\n response = s.get('https://www.old.health.gov.il/pages')\r\n File \"my_venv\\lib\\site-packages\\requests\\sessions.py\", line 526, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"my_venv\\lib\\site-packages\\requests\\sessions.py\", line 513, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"my_venv\\lib\\site-packages\\requests\\sessions.py\", line 623, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"my_venv\\lib\\site-packages\\requests\\adapters.py\", line 514, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\r\n\r\nProcess finished with exit code 1\r\n\r\n```\r\n\r\n`pip freeze` 
gives:\r\n\r\n```\r\ncertifi==2017.4.17\r\nchardet==3.0.4\r\nidna==2.5\r\nlxml==3.8.0\r\nrequests==2.17.3\r\nurllib3==1.21.1\r\n```\r\n\r\nWhat can I do to make requests work with a site like this? I think there should be something about this in the documentation for the module. Should I open a separate issue?", "> I've tried the code above, and tried to add `:RSA+3DES-EDE-CBC` to the list of ciphers,\r\n\r\nThat's not a valid OpenSSL cipher specification. Try adding `DES-CBC3-SHA` instead.", "@kwikwag \r\nDid you solve your problem? I have a similar situation, #4638.", "Where to add this DES-CBC3-SHA exactly?" ]
https://api.github.com/repos/psf/requests/issues/3773
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3773/labels{/name}
https://api.github.com/repos/psf/requests/issues/3773/comments
https://api.github.com/repos/psf/requests/issues/3773/events
https://github.com/psf/requests/pull/3773
196,167,951
MDExOlB1bGxSZXF1ZXN0OTg0MTMyNzk=
3,773
remove HTTPProxyAuth in favor of the proxies parameter
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-12-16T22:26:37Z
2021-09-08T01:21:30Z
2016-12-17T14:23:33Z
MEMBER
resolved
Minor tweak for 3.0.0 to cross off #2003 by removing the HTTPProxyAuth class in favor of the `proxies` param.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3773/reactions" }
https://api.github.com/repos/psf/requests/issues/3773/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3773.diff", "html_url": "https://github.com/psf/requests/pull/3773", "merged_at": "2016-12-17T14:23:33Z", "patch_url": "https://github.com/psf/requests/pull/3773.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3773" }
true
[]
https://api.github.com/repos/psf/requests/issues/3772
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3772/labels{/name}
https://api.github.com/repos/psf/requests/issues/3772/comments
https://api.github.com/repos/psf/requests/issues/3772/events
https://github.com/psf/requests/issues/3772
196,051,679
MDU6SXNzdWUxOTYwNTE2Nzk=
3,772
Digest Auth will auth regardless of status code
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
10
2016-12-16T12:45:03Z
2021-09-08T12:01:06Z
2017-01-29T08:15:36Z
MEMBER
resolved
Discovered on IRC. The digest auth handler in Requests' codebase doesn't ever actually check that it is responding to a 401: it just looks for an Authorization header. That's pretty dumb, so we should add a check to only actually do the execution for 401s. This is a contributor friendly change.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3772/reactions" }
https://api.github.com/repos/psf/requests/issues/3772/timeline
null
completed
null
null
false
[ "Are we sure that tightening this check is what we want to do? It looks like [RFC 7235](https://tools.ietf.org/html/rfc7235#section-4.1) suggests that it's acceptable to send a WWW-Authenticate challenge with a non-401 status code. I'm not sure `handle_401` is the most apt name for the hook, but the functionality seems correct.\r\n\r\n>A server generating a 401 (Unauthorized) response MUST send a\r\n WWW-Authenticate header field containing at least one challenge. A\r\n server MAY generate a WWW-Authenticate header field in other response\r\n messages to indicate that supplying credentials (or different\r\n credentials) might affect the response.", "@nateprewitt So we have two role models to look at here:\r\n\r\n1. Browsers - If you read the logs for our IRC channel, you'll see that Browsers are doing what we do (although probably not as loosely as we do it)\r\n\r\n2. Curl/Wget - They disregard the WWW-Authenticate headers on anything that's not a 401. In the case of the user on IRC they were receiving a 403 with WWW-Authenticate headers and we were authenticating while Curl and Wget were not.\r\n\r\nFrankly, I'm not sold on restricting this to 401s only, but I do think we should restrict it to 4xx codes. Having these headers sent back on a 30x, for example, could lead to interesting behaviour. On the one hand, we have to answer the challenge, right? On the other we're being redirected, potentially to a totally different domain. I think the only right answer in non 4xx cases is to not authenticate. Unfortunately, we will answer any challenge we receive at the moment and that's problematic.", "Also @nateprewitt, please let someone else pick this up.", "Yeah, I was planning on leaving it on the stack. 
I just did work on testing this functionality last week which is why I felt the need to comment.\r\n\r\nI just implemented tests confirming that we won't send Authorization on a 3xx request unless challenged after the redirect, so I don't think that's an issue we need to worry about. \r\n\r\nI don't think that restricting this to the 4XX range is unreasonable but I do think the RFC is pretty permissive. I'm just trying to provide information that I couldn't find in the IRC messages that seems relevant. Hopefully helping avoid similar problems that we just had with tightening basic auth.\r\n\r\nThe [Authorization](https://tools.ietf.org/html/rfc7235#section-4.2) definition also seems to leave this pretty open.\r\n>The \"Authorization\" header field allows a user agent to authenticate\r\n itself with an origin server -- usually, but not necessarily, after\r\n receiving a 401 (Unauthorized) response.", "Correct, which is why we send Basic Authentication headers on the first request. We can't do that with Digest-Auth because we need to respond to a Challenge.", "You could also send `Authorization` eagerly if it's, for example, a bearer token, or some other kind of trivial authentication that doesn't rely on the challenge-response cycle put forth in the Digest Authentication specification.", "Sorry, I should have clarified further on Authorization. I agree we shouldn't be sending Digest Authorization without an appropriate challenge first. What I'm saying is that the section quoted above seems to also support that responding to an appropriate challenge in a non-401 response is permissible which is what we what we currently do.\r\n\r\nThe scoping here is obviously in the hands of you and @Lukasa, I just wanted to make sure we didn't completely constrict that to 401 hastily as the original post noted.", "> I agree we shouldn't be sending Digest Authorization without an appropriate challenge first. 
\r\n\r\nWe also *literally* can't.\r\n\r\n> I just wanted to make sure we didn't completely constrict that to 401 hastily as the original post noted.\r\n\r\nI appreciate that :) I think @Lukasa is starting to feel more and more (as I do) that we should be very strict in our interpretation of specifications. That said, (as you point out) the spec is lenient, so my position is that we should be strict in the spirit of the specification. I can't see any reason why someone would challenge with a 200 response, so it should be safe to restrict it to 4xx responses.", "So, let's consider this a 3.0.0 issue instead. Clearly there is no *problem* with Requests' current behaviour: we're well within specification, and we have nothing to worry about with that, but I do think we should restrict ourselves to 4xx responses in the future.", "Hey @Lukasa will work on this issue" ]
https://api.github.com/repos/psf/requests/issues/3771
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3771/labels{/name}
https://api.github.com/repos/psf/requests/issues/3771/comments
https://api.github.com/repos/psf/requests/issues/3771/events
https://github.com/psf/requests/pull/3771
195,860,796
MDExOlB1bGxSZXF1ZXN0OTgxOTE0MzI=
3,771
Reuse all connections
{ "avatar_url": "https://avatars.githubusercontent.com/u/3725538?v=4", "events_url": "https://api.github.com/users/agalera/events{/privacy}", "followers_url": "https://api.github.com/users/agalera/followers", "following_url": "https://api.github.com/users/agalera/following{/other_user}", "gists_url": "https://api.github.com/users/agalera/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/agalera", "id": 3725538, "login": "agalera", "node_id": "MDQ6VXNlcjM3MjU1Mzg=", "organizations_url": "https://api.github.com/users/agalera/orgs", "received_events_url": "https://api.github.com/users/agalera/received_events", "repos_url": "https://api.github.com/users/agalera/repos", "site_admin": false, "starred_url": "https://api.github.com/users/agalera/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/agalera/subscriptions", "type": "User", "url": "https://api.github.com/users/agalera", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-12-15T17:06:55Z
2021-09-08T01:21:31Z
2016-12-15T17:47:15Z
NONE
resolved
When testing, every time I make a request, the message appears: `` `Starting a new HTTP connection (1): www. *****. With I have seen one session per application. He tried to reuse the "session" for all requests and seems to work properly, even with connections that have been closed. What implications would this change have? ``` python In [9]: requests.get("https://github.com") Starting new HTTPS connection (1): github.com Out[9]: <Response [200]> In [10]: requests.get("https://github.com") Out[10]: <Response [200]> In [11]: requests.get("https://stackoverflow.com") Starting new HTTPS connection (1): stackoverflow.com Out[11]: <Response [200]> In [12]: requests.get("https://stackoverflow.com") Out[12]: <Response [200]> In [13]: requests.get("https://github.com") Out[13]: <Response [200]> In [14]: requests.get("https://github.com") Out[14]: <Response [200]> ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3771/reactions" }
https://api.github.com/repos/psf/requests/issues/3771/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3771.diff", "html_url": "https://github.com/psf/requests/pull/3771", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3771.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3771" }
true
[ "Hey @kianxineki, I skimmed through your question and responded a bit too quickly. The implications of this change is we would now use a single `Session` object across ALL connections with `requests.request` which may not be what the user wants. The api methods are intended to be single connections that are cleaned up after completion.\r\n\r\nAs I think you've already noted above, you can accomplish the change by simply adding an extra line in your code declaring your own Session instance. This will allow you to utilize connection pooling and avoid the new connections.\r\n\r\n```python\r\ns = requests.Session()\r\nresp_git = s.get('https://github.com')\r\nresp_stack = s.get('https://stackoverflow.com')\r\n```\r\n\r\nYou can find more information on how to use `Session` [here](http://docs.python-requests.org/en/master/user/advanced/#session-objects).", "In my current code I do this to not have to use s.get() and to be able to use the methods that facilitates me requests.\r\n\r\n``` python\r\nimport requests\r\n\r\n\r\nsession = requests.sessions.Session()\r\n\r\n\r\ndef request(method, url, **kwargs):\r\n return session.request(method=method, url=url, **kwargs)\r\n\r\n\r\nrequests.api.request = request\r\n\r\nrequests.get(\"https://google.com\")\r\n```", "@kianxineki Thanks for sending us a pull request! It looks to be your first to the project. Welcome!\r\n\r\nIt seems as though you're trying to ask why we do not use a global session object in `requests.api`. To answer you, let me explain a little bit of history first.\r\n\r\nWe used to use a global session object and our Session objects used to be thoroughly thread-safe. At some point, something changed about our Session objects that no one has yet been able to identify. This means that they're no longer properly threadsafe. As such, people were having significant problems sharing a session between threads which also applied to using the functional API. 
Further, having a session that sits around until the process finishes was holding onto sockets that were never being reused. \r\n\r\nThe solution to the thread-safety issue and the socket leak was to stop using a global session.\r\n\r\nCurrently, if you want to eliminate those lines in your logs and utilize connection pooling, you *must* use a Session. There's no way to provide connection pooling to you via our functional API without actively harming other users.\r\n\r\nIn the future, please direct all of your questions to StackOverflow, which is the proper Question & Answer forum for Requests.\r\n\r\nAgain, thank you for your pull request, but we can't accept it." ]
https://api.github.com/repos/psf/requests/issues/3770
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3770/labels{/name}
https://api.github.com/repos/psf/requests/issues/3770/comments
https://api.github.com/repos/psf/requests/issues/3770/events
https://github.com/psf/requests/issues/3770
195,690,075
MDU6SXNzdWUxOTU2OTAwNzU=
3,770
Revoked SSL Certificates not causing SSL error?
{ "avatar_url": "https://avatars.githubusercontent.com/u/12736585?v=4", "events_url": "https://api.github.com/users/mendaxi/events{/privacy}", "followers_url": "https://api.github.com/users/mendaxi/followers", "following_url": "https://api.github.com/users/mendaxi/following{/other_user}", "gists_url": "https://api.github.com/users/mendaxi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mendaxi", "id": 12736585, "login": "mendaxi", "node_id": "MDQ6VXNlcjEyNzM2NTg1", "organizations_url": "https://api.github.com/users/mendaxi/orgs", "received_events_url": "https://api.github.com/users/mendaxi/received_events", "repos_url": "https://api.github.com/users/mendaxi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mendaxi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mendaxi/subscriptions", "type": "User", "url": "https://api.github.com/users/mendaxi", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-12-15T01:08:43Z
2021-09-08T13:05:35Z
2016-12-15T08:30:32Z
NONE
resolved
eyo this is my first time submitting a bug report of value (i think) so a day or so ago I discovered that when you don't get an SSL error when you do: ```requests.get("revoked.badssl.com")``` which isn't very good the problem isn't with OpenSSL, opening the following fails because of the bad certificate. ```openssl s_client -connect revoked.badssl.com:443``` I did some digging, as I'm working with SSL in my personal project, and _I think_ know what the issue is. when you use the regular old' ```ssl.wrap_socket()```, you get the same behavior, where the module flat out ignores the bad certificate. It seems the correct way to do this is to do something similar to the following: ``` default_context = ssl.create_default_context(purpose=ssl.Purpose.CLIENT_AUTH) default_context.verify_mode = ssl.CERT_REQUIRED wrapped_socket = default_context.wrap_socket(your_socket_here) ``` This will then create an error when you try to get the webpage. It seems that the issue stems from [here](https://github.com/kennethreitz/requests/blame/362da46e9a46da6e86e1907f03014384ab210151/requests/packages/urllib3/util/ssl_.py#L98), and was somewhat referenced [here]( https://github.com/kennethreitz/requests/issues/1786#issuecomment-167593081) running python 3.5.2 on debian stretch with requests version 2.12.3, if that helps at all tl;dr it seems that revoked ssl certs aren't being rejected, which could be an issue I also wanted to thank y'all for making such an amazing python HTTP(S) module. I don't know what I would do without you guys.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3770/reactions" }
https://api.github.com/repos/psf/requests/issues/3770/timeline
null
completed
null
null
false
[ "So we are not doing client auth here.\r\n\r\nAnd the openssl client fails for me because there's no local certificate in my bundle for the issuer of that revoked certificate:\r\n\r\n```\r\nCONNECTED(00000003)\r\ndepth=0 /C=US/ST=California/L=San Francisco/O=BadSSL Fallback. Unknown subdomain or no SNI./CN=badssl-fallback-unknown-subdomain-or-no-sni\r\nverify error:num=20:unable to get local issuer certificate\r\nverify return:1\r\ndepth=0 /C=US/ST=California/L=San Francisco/O=BadSSL Fallback. Unknown subdomain or no SNI./CN=badssl-fallback-unknown-subdomain-or-no-sni\r\nverify error:num=27:certificate not trusted\r\nverify return:1\r\ndepth=0 /C=US/ST=California/L=San Francisco/O=BadSSL Fallback. Unknown subdomain or no SNI./CN=badssl-fallback-unknown-subdomain-or-no-sni\r\nverify error:num=21:unable to verify the first certificate\r\nverify return:1\r\n---\r\nCertificate chain\r\n 0 s:/C=US/ST=California/L=San Francisco/O=BadSSL Fallback. Unknown subdomain or no SNI./CN=badssl-fallback-unknown-subdomain-or-no-sni\r\n i:/C=US/ST=California/L=San Francisco/O=BadSSL/CN=BadSSL Intermediate Certificate Authority\r\n---\r\n...\r\n Verify return code: 21 (unable to verify the first certificate)\r\n```\r\n\r\n> tl;dr it seems that revoked ssl certs aren't being rejected, which could be an issue\r\n\r\nThis does seem, at this point, true. That said, I believe this is because requests (and openssl) don't by default turn on OCSP which is what we'd need to use to *detect* a revoked certificate.\r\n\r\nThere are lots of opinions about OCSP and certificate revocation, but one prevailing one is that it doesn't work very well (which isn't an excuse to not support it). 
Either way, I believe to support it, we need some work in the standard library ssl library and in cryptography to allow us to add that support to PyOpenSSL.\r\n\r\nI'm not 100% confident in any of this, though, so I'd advise we wait until @Lukasa is up or someone like @reaperhulk can take a look.\r\n\r\n---\r\n\r\nThanks for reporting this @mendaxi ", "Requests does not respect revocation notices at this time. This is unfortunate but ultimately beyond us: there are no APIs available to extract and validate stapled OCSP responses, in either PyOpenSSL or the standard library.\r\n\r\nI should note that there is only one case where revocation is really useful, and that is with OCSP-Must-Staple. In all other cases we basically fail open, which isn't really very helpful at all. \r\n\r\nHowever, @mendaxi seems to be suggesting that setting the verify mode to CERT_REQUIRED would resolve his issue. That's weird, because Requests has been setting that flag for 6 years.\r\n\r\nTL:DR we don't use the regular old wrap socket and haven't for a while. We have been managing our own TLS config for years and validating certificates.", "@Lukasa to be clear, @mendaxi is using CLIENT_AUTH with an SSLContext object. It's failing because it's the wrong configuration to validate a certificate from a server. :)", "Sorry for the wrong configuration. My bad on that." ]
https://api.github.com/repos/psf/requests/issues/3769
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3769/labels{/name}
https://api.github.com/repos/psf/requests/issues/3769/comments
https://api.github.com/repos/psf/requests/issues/3769/events
https://github.com/psf/requests/issues/3769
195,564,579
MDU6SXNzdWUxOTU1NjQ1Nzk=
3,769
Failure to install on Jython since 2.12.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/726553?v=4", "events_url": "https://api.github.com/users/ppolewicz/events{/privacy}", "followers_url": "https://api.github.com/users/ppolewicz/followers", "following_url": "https://api.github.com/users/ppolewicz/following{/other_user}", "gists_url": "https://api.github.com/users/ppolewicz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ppolewicz", "id": 726553, "login": "ppolewicz", "node_id": "MDQ6VXNlcjcyNjU1Mw==", "organizations_url": "https://api.github.com/users/ppolewicz/orgs", "received_events_url": "https://api.github.com/users/ppolewicz/received_events", "repos_url": "https://api.github.com/users/ppolewicz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ppolewicz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ppolewicz/subscriptions", "type": "User", "url": "https://api.github.com/users/ppolewicz", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-12-14T15:39:42Z
2021-09-08T13:05:39Z
2016-12-14T15:42:20Z
NONE
resolved
``` (jython) r@r-V ~ $ python --version Jython 2.7.1b3 (jython) r@r-V ~ $ pip install requests==2.11.1 Collecting requests==2.11.1 Using cached requests-2.11.1-py2.py3-none-any.whl Installing collected packages: requests Found existing installation: requests 2.11.0 Uninstalling requests-2.11.0: Successfully uninstalled requests-2.11.0 Successfully installed requests-2.11.1 (jython) r@r-V ~ $ pip install requests==2.12.0 Collecting requests==2.12.0 Using cached requests-2.12.0-py2.py3-none-any.whl Installing collected packages: requests Found existing installation: requests 2.11.1 Uninstalling requests-2.11.1: Successfully uninstalled requests-2.11.1 Rolling back uninstall of requests Exception: Traceback (most recent call last): File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/basecommand.py", line 215, in main status = self.run(options, args) File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/commands/install.py", line 338, in run requirement_set.install( File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/commands/install.py", line 338, in run requirement_set.install( File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/req/req_set.py", line 780, in install requirement.install( File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/req/req_install.py", line 851, in install self.move_wheel_files(self.source_dir, root=root, prefix=prefix) File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/req/req_install.py", line 1057, in move_wheel_files move_wheel_files( File "/home/r/.virtualenvs/jython/Lib/site-packages/pip/wheel.py", line 272, in move_wheel_files compileall.compile_dir(source, force=True, quiet=True) File "/home/r/jython/Lib/compileall.py", line 56, in compile_dir if not compile_dir(fullname, maxlevels - 1, dfile, force, rx, File "/home/r/jython/Lib/compileall.py", line 56, in compile_dir if not compile_dir(fullname, maxlevels - 1, dfile, force, rx, File "/home/r/jython/Lib/compileall.py", line 56, in compile_dir if not 
compile_dir(fullname, maxlevels - 1, dfile, force, rx, File "/home/r/jython/Lib/compileall.py", line 50, in compile_dir if not compile_file(fullname, ddir, force, rx, quiet): File "/home/r/jython/Lib/compileall.py", line 99, in compile_file ok = py_compile.compile(fullname, None, dfile, True) File "/home/r/jython/Lib/compileall.py", line 99, in compile_file ok = py_compile.compile(fullname, None, dfile, True) File "/home/r/jython/Lib/py_compile.py", line 96, in compile _py_compile.compile(file, cfile, dfile) File "/home/r/jython/Lib/py_compile.py", line 96, in compile _py_compile.compile(file, cfile, dfile) RuntimeException: java.lang.RuntimeException: Method code too large! (jython) r@r-V ~ $ ``` This might not be a bug in `requests`, but perhaps it is workaroundable? Has any change been done in 2.12.0 that caused a very big method to be created?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3769/reactions" }
https://api.github.com/repos/psf/requests/issues/3769/timeline
null
completed
null
null
false
[ "In the future, please search **closed and** open issues before creating new ones that are duplicates.\r\n\r\nThis is a duplicate of the **open** #3711 ", "I found that issue, but it did not contain \"RuntimeException: java.lang.RuntimeException: Method code too large!\" and it mentioned usage, not installation, with something called `Indna`, which I didn't think I use.\r\n\r\nSorry for the trouble.", "Hey @ppolewicz, it looks like this is specifically a limitation of how Jython uses the JVM as noted in [this ticket](http://bugs.jython.org/issue1891). I'd ensure you have the latest version of Jython2.7, and attempt to use Requests 2.12.4.\r\n\r\nIf that doesn't solve it, I'm afraid there's not much we can do at this time since the bug isn't really with Requests.", "(for people searching for a solution later and finding this)\r\n\r\nThat ticket says:\r\n\r\n> Issue will get solved if the jython version is 2.7 without applying work around(Splitting of method content).\r\n\r\n...but my Jython version is 2.7.1b3 and it still fails.\r\n\r\n`requests==2.12.4` fail to install for the same reason." ]
https://api.github.com/repos/psf/requests/issues/3768
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3768/labels{/name}
https://api.github.com/repos/psf/requests/issues/3768/comments
https://api.github.com/repos/psf/requests/issues/3768/events
https://github.com/psf/requests/issues/3768
195,443,880
MDU6SXNzdWUxOTU0NDM4ODA=
3,768
Proxy socks windows
{ "avatar_url": "https://avatars.githubusercontent.com/u/17412555?v=4", "events_url": "https://api.github.com/users/kityamoto/events{/privacy}", "followers_url": "https://api.github.com/users/kityamoto/followers", "following_url": "https://api.github.com/users/kityamoto/following{/other_user}", "gists_url": "https://api.github.com/users/kityamoto/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kityamoto", "id": 17412555, "login": "kityamoto", "node_id": "MDQ6VXNlcjE3NDEyNTU1", "organizations_url": "https://api.github.com/users/kityamoto/orgs", "received_events_url": "https://api.github.com/users/kityamoto/received_events", "repos_url": "https://api.github.com/users/kityamoto/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kityamoto/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kityamoto/subscriptions", "type": "User", "url": "https://api.github.com/users/kityamoto", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-12-14T05:29:56Z
2021-09-08T11:00:29Z
2017-04-27T19:38:29Z
NONE
resolved
`$ pip install requests[socks]` `>>>import requests` `>>>proxies = {'http': 'socks5://127.0.0.1:9150','https': 'socks5://127.0.0.1:9150'}` `>>>r = requests.get('https://api.github.com', proxies=proxies)` `Traceback (most recent call last):` `...` ` raise InvalidSchema("Missing dependencies for SOCKS support.")` `requests.exceptions.InvalidSchema: Missing dependencies for SOCKS support.` Package is installed win_inet_pton with pysocks automatically. `$ pip install win_inet_pton`
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3768/reactions" }
https://api.github.com/repos/psf/requests/issues/3768/timeline
null
completed
null
null
false
[ "Can you provide the output of `pip freeze`, and confirm that when you type `import socks` that that works?", "I confirm for him ;)\r\n\r\n```\r\n>> pip freeze\r\nfuture==0.16.0\r\npsutil==5.0.0\r\nPySocks==1.6.5\r\nrequests==2.12.4\r\nsix==1.10.0\r\nwin-inet-pton==1.0.1\r\n```\r\n\r\n\r\n```\r\n>> python\r\nPython 2.7.12 (v2.7.12:d33e0cf91556, Jun 27 2016, 15:24:40) [MSC v.1500 64 bit (AMD64)] on win32\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import socks\r\n>>>\r\n```\r\n", "@yoyoprs I believe @Lukasa wanted you two to `import socks` and then check that the version was the same as what `pip` reports.", "Workaround:\r\n\r\n```\r\npip install win-inet-pton\r\n```" ]
https://api.github.com/repos/psf/requests/issues/3767
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3767/labels{/name}
https://api.github.com/repos/psf/requests/issues/3767/comments
https://api.github.com/repos/psf/requests/issues/3767/events
https://github.com/psf/requests/issues/3767
195,441,238
MDU6SXNzdWUxOTU0NDEyMzg=
3,767
Expected a bytes-like object in prepare_auth
{ "avatar_url": "https://avatars.githubusercontent.com/u/910895?v=4", "events_url": "https://api.github.com/users/raphaeltm/events{/privacy}", "followers_url": "https://api.github.com/users/raphaeltm/followers", "following_url": "https://api.github.com/users/raphaeltm/following{/other_user}", "gists_url": "https://api.github.com/users/raphaeltm/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/raphaeltm", "id": 910895, "login": "raphaeltm", "node_id": "MDQ6VXNlcjkxMDg5NQ==", "organizations_url": "https://api.github.com/users/raphaeltm/orgs", "received_events_url": "https://api.github.com/users/raphaeltm/received_events", "repos_url": "https://api.github.com/users/raphaeltm/repos", "site_admin": false, "starred_url": "https://api.github.com/users/raphaeltm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/raphaeltm/subscriptions", "type": "User", "url": "https://api.github.com/users/raphaeltm", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-12-14T05:03:02Z
2021-09-08T13:05:40Z
2016-12-14T08:48:40Z
NONE
resolved
I'm using [python-intercom](https://github.com/jkeyes/python-intercom) which requires requests 2.6.0. I have been using requests outside of that as well, and recently upgraded from 2.11.1 to 2.12.3. python-intercom then stopped working (but works when I revert to requests 2.11.1). I haven't dug through the python-intercom code yet to figure out exactly what is going on. I'm happy to stick to 2.11.1, but since the requests website suggests that there should be no breaking changes with minor releases, I figured I'd post here. Stack trace: ``` Traceback (most recent call last): File "/tmp/Potato/lib/python3.5/site-packages/django/core/handlers/exception.py", line 39, in inner response = get_response(request) File "/tmp/Potato/lib/python3.5/site-packages/django/core/handlers/base.py", line 187, in _get_response response = self.process_exception_by_middleware(e, request) File "/tmp/Potato/lib/python3.5/site-packages/django/core/handlers/base.py", line 185, in _get_response response = wrapped_callback(request, *callback_args, **callback_kwargs) File "/vagrant/PotatoMain/views/user.py", line 39, in auth_callback login(request, user) File "/tmp/Potato/lib/python3.5/site-packages/django/contrib/auth/__init__.py", line 130, in login user_logged_in.send(sender=user.__class__, request=request, user=user) File "/tmp/Potato/lib/python3.5/site-packages/django/dispatch/dispatcher.py", line 191, in send response = receiver(signal=self, sender=sender, **named) File "/tmp/Potato/lib/python3.5/site-packages/django/contrib/auth/models.py", line 25, in update_last_login user.save(update_fields=['last_login']) File "/tmp/Potato/lib/python3.5/site-packages/django/contrib/auth/base_user.py", line 80, in save super(AbstractBaseUser, self).save(*args, **kwargs) File "/tmp/Potato/lib/python3.5/site-packages/django/db/models/base.py", line 796, in save force_update=force_update, update_fields=update_fields) File "/tmp/Potato/lib/python3.5/site-packages/django/db/models/base.py", line 833, in save_base update_fields=update_fields, raw=raw, using=using) File "/tmp/Potato/lib/python3.5/site-packages/django/dispatch/dispatcher.py", line 191, in send response = receiver(signal=self, sender=sender, **named) File "/vagrant/PotatoMain/signals.py", line 14, in feed_post_save intercom.User.create(email=instance.email) File "/tmp/Potato/lib/python3.5/site-packages/intercom/api_operations/save.py", line 12, in create response = Intercom.post("/%s/" % (collection), **params) File "/tmp/Potato/lib/python3.5/site-packages/intercom/__init__.py", line 171, in post return cls.request('POST', path, params) File "/tmp/Potato/lib/python3.5/site-packages/intercom/__init__.py", line 163, in request method, cls.get_url(path), cls._auth, params) File "/tmp/Potato/lib/python3.5/site-packages/intercom/request.py", line 47, in send_request_to_path auth=auth, verify=certifi.where(), **req_params) File "/tmp/Potato/lib/python3.5/site-packages/requests/api.py", line 56, in request return session.request(method=method, url=url, **kwargs) File "/tmp/Potato/lib/python3.5/site-packages/requests/sessions.py", line 474, in request prep = self.prepare_request(req) File "/tmp/Potato/lib/python3.5/site-packages/requests/sessions.py", line 407, in prepare_request hooks=merge_hooks(request.hooks, self.hooks), File "/tmp/Potato/lib/python3.5/site-packages/requests/models.py", line 306, in prepare self.prepare_auth(auth, url) File "/tmp/Potato/lib/python3.5/site-packages/requests/models.py", line 527, in prepare_auth r = auth(self) File "/tmp/Potato/lib/python3.5/site-packages/requests/auth.py", line 68, in __call__ r.headers['Authorization'] = _basic_auth_str(self.username, self.password) File "/tmp/Potato/lib/python3.5/site-packages/requests/auth.py", line 38, in _basic_auth_str b64encode(b':'.join((username, password))).strip() TypeError: sequence item 1: expected a bytes-like object, NoneType found ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3767/reactions" }
https://api.github.com/repos/psf/requests/issues/3767/timeline
null
completed
null
null
false
[ "Hey @raphaeltm, thanks for opening a ticket. This has been addressed in #3758 and is working on master. It should be fixed when the next release ships.", "Yup. In the meantime, python-intercom can resolve the issue by not passing objects that aren't strings as auth objects." ]
https://api.github.com/repos/psf/requests/issues/3766
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3766/labels{/name}
https://api.github.com/repos/psf/requests/issues/3766/comments
https://api.github.com/repos/psf/requests/issues/3766/events
https://github.com/psf/requests/pull/3766
195,037,095
MDExOlB1bGxSZXF1ZXN0OTc1OTgxMjI=
3,766
testing HTTPDigestAuth hooks
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2016-12-12T17:35:13Z
2021-09-08T01:21:31Z
2016-12-14T09:41:11Z
MEMBER
resolved
#1979 is still opened as "unresolved" but it looks like a patch (#2253) was merged to address it in 2014. There wasn't a testing framework at the time, so no tests were included to verify things actually worked. I haven't been able to concoct a end-to-end test with httpbin for this, but the individual pieces work as intended. I can work on devising a socket test if needed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3766/reactions" }
https://api.github.com/repos/psf/requests/issues/3766/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3766.diff", "html_url": "https://github.com/psf/requests/pull/3766", "merged_at": "2016-12-14T09:41:11Z", "patch_url": "https://github.com/psf/requests/pull/3766.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3766" }
true
[ "I think I'd be much happier seeing a socket test than the httpbin ones: they just end up testing implementation details more than actual behaviour.", "I don't think either of these are testing anything specific to httpbin's implementation, they should work the same regardless of which server they hit as long as the status codes are correct.\r\n\r\nI added a somewhat monstrous test to simulate the full lifecycle of 401->302-401->200. I'm cutting a few corners with the actual digest piece because of the thread state, but the scenario we're trying to prove is correct. I've tested this both against 2.2.1 and 2.12.3 without the `handle_redirect` hook mounted. It's failing as expected in those scenarios.", "> I remain unconvinced about whether checking num_401_calls is a good idea or whether we should consider it an implementation detail.\r\n\r\nSorry, could you clarify this a little? Are you referring to this in relation to the tests, or the actual implementation? I'm not sure how else we'd test that this is appropriately reset since that's the single flag that determines whether the 401 challenge is responded to or not.", "I mean it about the tests. Generally speaking if we can we should restrict ourselves to testing observable behaviour, by which I mean bytes output or public return values. ", "Ok, I'll remove those assert statements and rely on having expected header and status_code values.", "Alright, assertions about the state of `num_401_calls` should be gone now and we're only relying upon observable behaviour.", "Alright, we're on only socket tests now testing behaviour rather than implementation details.", "@Lukasa I rescinded my last comment on `num_401_calls` because while what I was testing worked, it's not reproducible in the wild. I still think there may be value in testing the hooks individually but I don't think it's a blocker here. These tests cover the use cases we're concerned about in #1979.", "Cool, tests are good. Thanks for the work @nateprewitt!" ]
https://api.github.com/repos/psf/requests/issues/3765
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3765/labels{/name}
https://api.github.com/repos/psf/requests/issues/3765/comments
https://api.github.com/repos/psf/requests/issues/3765/events
https://github.com/psf/requests/pull/3765
194,852,241
MDExOlB1bGxSZXF1ZXN0OTc0NzAzNjY=
3,765
Make code match behavior as defined in RFC 2818 and RFC 6125 and documented behavior
{ "avatar_url": "https://avatars.githubusercontent.com/u/302525?v=4", "events_url": "https://api.github.com/users/noxxi/events{/privacy}", "followers_url": "https://api.github.com/users/noxxi/followers", "following_url": "https://api.github.com/users/noxxi/following{/other_user}", "gists_url": "https://api.github.com/users/noxxi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/noxxi", "id": 302525, "login": "noxxi", "node_id": "MDQ6VXNlcjMwMjUyNQ==", "organizations_url": "https://api.github.com/users/noxxi/orgs", "received_events_url": "https://api.github.com/users/noxxi/received_events", "repos_url": "https://api.github.com/users/noxxi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/noxxi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/noxxi/subscriptions", "type": "User", "url": "https://api.github.com/users/noxxi", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-12-11T19:59:01Z
2021-09-08T01:21:33Z
2016-12-11T20:57:13Z
NONE
resolved
The CN should be checked if no dNSName exist in subjectAltNames. The code treated a IP address as DNS name by appending it to dnsnames. This way CN was not checked in a certificate which contained the hostname as CN but only IP addresses as SAN. See also http://stackoverflow.com/questions/41089539/authentication-issue-with-ssl-certificate-using-python-requests-lib
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3765/reactions" }
https://api.github.com/repos/psf/requests/issues/3765/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3765.diff", "html_url": "https://github.com/psf/requests/pull/3765", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3765.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3765" }
true
[ "Thanks for this! For the moment, I'm not going to merge this, for the following reasons.\r\n\r\n1. This patch affects urllib3, which we vendor without changes. It would need to be proposed to that project. \r\n2. The inclusion of match_hostname is currently intended to behave the same way on all platforms. Thus, if we edit it, we'd also have to stop preferring the stdlib until they matched our behaviour. This patch is therefore incomplete.\r\n3. Your assertion about the incorrectness of our logic is not as clear-cut as it seems. Our behaviour currently matches curl, Firefox, and Chrome, which is some good company to be in (see curl/curl#1065).\r\n4. It seems the Python crypto experts [aren't convinced of this either, and ordinarily I'd default to respecting their logic](http://bugs.python.org/issue28938).\r\n5. There is basically no good reason to have a cert that fails in this manner. CN has been deprecated for years, SAN has been around forever. Just put the hostname in the SAN field as well." ]
https://api.github.com/repos/psf/requests/issues/3764
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3764/labels{/name}
https://api.github.com/repos/psf/requests/issues/3764/comments
https://api.github.com/repos/psf/requests/issues/3764/events
https://github.com/psf/requests/pull/3764
194,783,012
MDExOlB1bGxSZXF1ZXN0OTc0MzI5Njk=
3,764
Proposed/3.0.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/20820116?v=4", "events_url": "https://api.github.com/users/ihiro100/events{/privacy}", "followers_url": "https://api.github.com/users/ihiro100/followers", "following_url": "https://api.github.com/users/ihiro100/following{/other_user}", "gists_url": "https://api.github.com/users/ihiro100/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ihiro100", "id": 20820116, "login": "ihiro100", "node_id": "MDQ6VXNlcjIwODIwMTE2", "organizations_url": "https://api.github.com/users/ihiro100/orgs", "received_events_url": "https://api.github.com/users/ihiro100/received_events", "repos_url": "https://api.github.com/users/ihiro100/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ihiro100/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ihiro100/subscriptions", "type": "User", "url": "https://api.github.com/users/ihiro100", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2016-12-10T18:12:26Z
2021-09-08T01:21:33Z
2016-12-10T18:43:37Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3764/reactions" }
https://api.github.com/repos/psf/requests/issues/3764/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3764.diff", "html_url": "https://github.com/psf/requests/pull/3764", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3764.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3764" }
true
[]
https://api.github.com/repos/psf/requests/issues/3763
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3763/labels{/name}
https://api.github.com/repos/psf/requests/issues/3763/comments
https://api.github.com/repos/psf/requests/issues/3763/events
https://github.com/psf/requests/pull/3763
194,676,872
MDExOlB1bGxSZXF1ZXN0OTczNjU3ODY=
3,763
Revert "use enforce_content_length in Requests"
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-12-09T19:27:16Z
2021-09-08T01:21:34Z
2016-12-09T21:16:30Z
MEMBER
resolved
This is more of a TODO reminder. Given the upcoming work for urllib3 2.0, which I'm assuming will be used in a Requests 3.0.0, this won't work. We don't need to remove this immediately, but I'll leave this here for when we're sure 2.0 is happening.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3763/reactions" }
https://api.github.com/repos/psf/requests/issues/3763/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3763.diff", "html_url": "https://github.com/psf/requests/pull/3763", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3763.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3763" }
true
[ "Heh, we won't forget: we'll run immediately into errors because I removed the kwarg entirely. Let's not queue up a whole bunch of PRs: let's focus on doing 2.0 first. :wink:" ]
https://api.github.com/repos/psf/requests/issues/3762
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3762/labels{/name}
https://api.github.com/repos/psf/requests/issues/3762/comments
https://api.github.com/repos/psf/requests/issues/3762/events
https://github.com/psf/requests/pull/3762
194,676,129
MDExOlB1bGxSZXF1ZXN0OTczNjUyNTc=
3,762
updating 3.0 history
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-12-09T19:23:35Z
2021-09-08T01:21:32Z
2016-12-10T13:57:44Z
MEMBER
resolved
Catching up the 3.0-HISTORY.rst file for the most of 2016's changes.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3762/reactions" }
https://api.github.com/repos/psf/requests/issues/3762/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3762.diff", "html_url": "https://github.com/psf/requests/pull/3762", "merged_at": "2016-12-10T13:57:44Z", "patch_url": "https://github.com/psf/requests/pull/3762.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3762" }
true
[ "✨ 🍰 ✨ " ]
https://api.github.com/repos/psf/requests/issues/3761
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3761/labels{/name}
https://api.github.com/repos/psf/requests/issues/3761/comments
https://api.github.com/repos/psf/requests/issues/3761/events
https://github.com/psf/requests/pull/3761
194,618,007
MDExOlB1bGxSZXF1ZXN0OTczMjQxODE=
3,761
re-restrict params for _basic_auth_str in 3.0.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4", "events_url": "https://api.github.com/users/nateprewitt/events{/privacy}", "followers_url": "https://api.github.com/users/nateprewitt/followers", "following_url": "https://api.github.com/users/nateprewitt/following{/other_user}", "gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nateprewitt", "id": 5271761, "login": "nateprewitt", "node_id": "MDQ6VXNlcjUyNzE3NjE=", "organizations_url": "https://api.github.com/users/nateprewitt/orgs", "received_events_url": "https://api.github.com/users/nateprewitt/received_events", "repos_url": "https://api.github.com/users/nateprewitt/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions", "type": "User", "url": "https://api.github.com/users/nateprewitt", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-12-09T15:06:13Z
2021-09-08T01:21:34Z
2016-12-09T15:29:17Z
MEMBER
resolved
Bon voyage ⛵!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3761/reactions" }
https://api.github.com/repos/psf/requests/issues/3761/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3761.diff", "html_url": "https://github.com/psf/requests/pull/3761", "merged_at": "2016-12-09T15:29:17Z", "patch_url": "https://github.com/psf/requests/pull/3761.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3761" }
true
[ "@Lukasa I should have learned by now that when I feel the need for emojis, my changes are overzealous. Ready for another peek :)" ]