url: string (length 50–53)
repository_url: string (1 distinct value)
labels_url: string (length 64–67)
comments_url: string (length 59–62)
events_url: string (length 57–60)
html_url: string (length 38–43)
id: int64 (597k–2.65B)
node_id: string (length 18–32)
number: int64 (1–6.83k)
title: string (length 1–296)
user: dict
labels: list (length 0–5)
state: string (2 distinct values)
locked: bool (2 classes)
assignee: dict
assignees: list (length 0–4)
milestone: dict
comments: int64 (0–211)
created_at: string (length 20)
updated_at: string (length 20)
closed_at: string (length 20)
author_association: string (3 distinct values)
active_lock_reason: string (4 distinct values)
body: string (length 0–65.6k)
closed_by: dict
reactions: dict
timeline_url: string (length 59–62)
performed_via_github_app: null
state_reason: string (3 distinct values)
draft: bool (2 classes)
pull_request: dict
is_pull_request: bool (2 classes)
issue_comments: list (length 0–30)
https://api.github.com/repos/psf/requests/issues/2030
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2030/labels{/name}
https://api.github.com/repos/psf/requests/issues/2030/comments
https://api.github.com/repos/psf/requests/issues/2030/events
https://github.com/psf/requests/issues/2030
32,637,686
MDU6SXNzdWUzMjYzNzY4Ng==
2,030
Session proxies not used
{ "avatar_url": "https://avatars.githubusercontent.com/u/5490133?v=4", "events_url": "https://api.github.com/users/galfan/events{/privacy}", "followers_url": "https://api.github.com/users/galfan/followers", "following_url": "https://api.github.com/users/galfan/following{/other_user}", "gists_url": "https://api.github.com/users/galfan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/galfan", "id": 5490133, "login": "galfan", "node_id": "MDQ6VXNlcjU0OTAxMzM=", "organizations_url": "https://api.github.com/users/galfan/orgs", "received_events_url": "https://api.github.com/users/galfan/received_events", "repos_url": "https://api.github.com/users/galfan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/galfan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/galfan/subscriptions", "type": "User", "url": "https://api.github.com/users/galfan", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2014-05-01T18:50:31Z
2021-09-09T00:01:03Z
2014-05-01T18:58:12Z
NONE
resolved
In: s = requests.Session() s.proxies["http"] = "" s.proxies["https"] = "" s.get("http://server/",proxies=s.proxies) I'd expect "proxies=s.proxies" to be redundant.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2030/reactions" }
https://api.github.com/repos/psf/requests/issues/2030/timeline
null
completed
null
null
false
[ "What makes you think it isn't?\n", "Guess ...\n", "I shouldn't have to guess. =) Bug reports are not a guessing game, they're a report of a fault: you should say what you did, what you expected to happen, and what actually happened.\n", "I thought I did exactly that.\nWithout the proxies it would use my environment variables.\n", "Right, there we go. You didn't say that, you just wrote some code down and then said you expected it to be redundant. That doesn't tell me what was _wrong_. =)\n\nYes, this has been identified already in the very poorly named issue #2018. I'm planning to fix it this weekend.\n\nThanks for raising the issue! I'm closing to focus discussion on #2018.\n", "Thanks\n" ]
https://api.github.com/repos/psf/requests/issues/2029
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2029/labels{/name}
https://api.github.com/repos/psf/requests/issues/2029/comments
https://api.github.com/repos/psf/requests/issues/2029/events
https://github.com/psf/requests/issues/2029
32,602,791
MDU6SXNzdWUzMjYwMjc5MQ==
2,029
Is it possible to specify an SSL timeout?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1440143?v=4", "events_url": "https://api.github.com/users/GP89/events{/privacy}", "followers_url": "https://api.github.com/users/GP89/followers", "following_url": "https://api.github.com/users/GP89/following{/other_user}", "gists_url": "https://api.github.com/users/GP89/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/GP89", "id": 1440143, "login": "GP89", "node_id": "MDQ6VXNlcjE0NDAxNDM=", "organizations_url": "https://api.github.com/users/GP89/orgs", "received_events_url": "https://api.github.com/users/GP89/received_events", "repos_url": "https://api.github.com/users/GP89/repos", "site_admin": false, "starred_url": "https://api.github.com/users/GP89/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GP89/subscriptions", "type": "User", "url": "https://api.github.com/users/GP89", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-05-01T08:54:42Z
2021-09-08T23:11:01Z
2014-06-08T09:51:45Z
NONE
resolved
Not really an issue I guess, But I'm noticing on some logs that I'm receiving an SSLError 'The read operation timed out' just over a second after all the data has been read and uploaded from a file during a PUT upload request. I've tried increasing the timeout by passing a higher value requests.put(url, filename, timeout=60), but I guess this isn't getting passed to the SSL part of the underlying library? as it still times out at just over a second as it did before. Is there a way to increase the timeout for the SSL part of the request from 1 second (it appears to be 1 second anyway)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2029/reactions" }
https://api.github.com/repos/psf/requests/issues/2029/timeline
null
completed
null
null
false
[ "I've honestly got no idea. @t-8ch, does the underlying ssl library not time out based on the socket timeout?\n", "@GP89 which SSL backend and Python interpreter are you using?\nI can't reproduce a SSLError on read timeout, only with a handshake using the stdlib. The handshake with PyOpenSSL never times out...\n", "I'm not sure, how would I check? PyOpenSSL I would assume though.\nThat's odd, I was seeing it a lot and the timeouts were always raised just over a second after the call was made, which made me think it was likely that there was a timeout value set somewhere.\n", "Sorry for the long delay, I am travelling...\nHa, a usecase for #2023.\n\nCould you show me the result of running https://gist.github.com/t-8ch/4ecfd78a502b7a616028?\n\nI used `ncat -kl --ssl 0.0.0.0 4000` and `requests.get('https://localhost:4000')` to reproduce your issue but I could not get the same error as you. Could you try those commands and tell me the outcome?\n", "Closing for inactivity.\n" ]
https://api.github.com/repos/psf/requests/issues/2028
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2028/labels{/name}
https://api.github.com/repos/psf/requests/issues/2028/comments
https://api.github.com/repos/psf/requests/issues/2028/events
https://github.com/psf/requests/issues/2028
32,534,123
MDU6SXNzdWUzMjUzNDEyMw==
2,028
ImportError: cannot import name certs
{ "avatar_url": "https://avatars.githubusercontent.com/u/1942093?v=4", "events_url": "https://api.github.com/users/poolski/events{/privacy}", "followers_url": "https://api.github.com/users/poolski/followers", "following_url": "https://api.github.com/users/poolski/following{/other_user}", "gists_url": "https://api.github.com/users/poolski/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/poolski", "id": 1942093, "login": "poolski", "node_id": "MDQ6VXNlcjE5NDIwOTM=", "organizations_url": "https://api.github.com/users/poolski/orgs", "received_events_url": "https://api.github.com/users/poolski/received_events", "repos_url": "https://api.github.com/users/poolski/repos", "site_admin": false, "starred_url": "https://api.github.com/users/poolski/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/poolski/subscriptions", "type": "User", "url": "https://api.github.com/users/poolski", "user_view_type": "public" }
[]
closed
true
null
[]
null
17
2014-04-30T13:25:55Z
2021-09-06T00:06:38Z
2014-05-25T09:57:47Z
NONE
resolved
When using Requests in a very basic function, it won't even start as it can't load "certs" ``` import requests import django.utils.simplejson as json def getFeed(user_id): jsonURL = "https://feed.whatever.net/json" resp = requests.get(jsonURL) result = json.load(r.json()) import pdb; pdb.set_trace() return result ``` I get ``` File "/home/vagrant/dev/python/team-calendar/team_calendar/calendar/json.py", line 1, in <module> import requests File "/home/vagrant/.virtualenvs/teamcalendar/local/lib/python2.7/site-packages/requests/__init__.py", line 58, in <module> from . import utils File "/home/vagrant/.virtualenvs/teamcalendar/local/lib/python2.7/site-packages/requests/utils.py", line 24, in <module> from . import certs ImportError: cannot import name certs ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2028/reactions" }
https://api.github.com/repos/psf/requests/issues/2028/timeline
null
completed
null
null
false
[ "What version of requests are you using and where did you install it from?\n", "There's no need to answer that: this is another manifestation of #2026. =)\n", "@sigmavirus24 @Lukasa I'm using 2.2.1 from pypi...\n\nI saw in #2026 that it was \"fixed\" by installing 2.2.1 from pypi, but mine IS a normal, vanilla version...\n", "Really? Can you tell me what the **version** string in `/home/vagrant/.virtualenvs/teamcalendar/local/lib/python2.7/site-packages/requests/__init__.py` says?\n", "```\n__title__ = 'requests'\n__version__ = '2.2.1'\n__build__ = 0x020201\n__author__ = 'Kenneth Reitz'\n__license__ = 'Apache 2.0'\n__copyright__ = 'Copyright 2014 Kenneth Reitz'\n```\n", "Did you do `pip install requests==2.2.1` or did you do `pip uninstall requests` and then `pip install requests`. I'm not sure if pip will remove the `pyc` files before installing a new version. It could be causing issues for you if you don't uninstall everything first.\n", "That does look like 2.2.1 and it appears to be in the place the traceback is coming from. Why is that relative import failing?\n", "Are you running this in ipython? I noticed that I get the same error in ipython when importing requests and not in the standard python shell...\n", "Hang on, I'm pretty sure this is a manifestation of #2026. Anyone still affected should update to 2.3.0, which ought to fix the problem. \n", "We started seeing this with 2.6.0 this morning in our CI (codeship), and only in the CI. All tests are failing for us now because of this.\n\n```\n[... snip lots of lines from the pytest/django unit test machinery ...]\nFile \"/home/rof/.virtualenv/local/lib/python2.7/site-packages/requests/utils.py\", line 25, in <module>\nfrom . import certs\nImportError: cannot import name certs\n```\n\nTests were passing just fine until they didn't. No idea what caused it yet, we haven't changed anything. :confused: \n", "@bochecha Nothing breaks without a change. _Something_ in your environment has changed. There are a few options:\n1. Your code has changed. The Python import system is fragile and weirdly global. To work this out, do a binary search through your commits until you can find the breakage. The advantage of this is that it should be easily reproducible on more or less any system.\n2. Your dependencies have changed. You appear to have pinned Requests: have you pinned everything else? Check whether any of your dependencies have pushed updates recently.\n3. Your OS has changed. Do you control your OS image on Codeship?\n\nBasically: code that is exactly the same doesn't just magically change. Given that the one thing you definitely _haven't_ changed is Requests, I think it's safe to say that Requests should not be the first place you look for problems. =D\n", "> Nothing breaks without a change. Something in your environment has changed.\n\nI know that. What I'm saying is that **we** didn't change anything. :wink:\n\nI'm still tracking what actually happened.\n\n> Your code has changed.\n\nI've resubmitted a build that had previously passed, and it's now failing. Same code from the same git commit.\n\n> Your dependencies have changed. You appear to have pinned Requests: have you pinned everything else?\n\nAlmost everything is pinned. Very few isn't, and I'm going through them.\n\n> Your OS has changed. Do you control your OS image on Codeship?\n\nNo we don't. This is what I'm currently looking at.\n\n> I think it's safe to say that Requests should not be the first place you look for problems. 
=D\n\nSure, I just found this ticket with the exact same symptoms, and a comment saying it is solved in 2.3.0, so I thought it was an interesting data point to report that it seems to have come back two years later.\n", "So I do have one real question: where are you getting your deps from? PyPI, or from your OS?\n", "> So I do have one real question: where are you getting your deps from? PyPI, or from your OS?\n\nMost of them from PyPI, there is only one coming from a private github repo that we haven't modified since 2015.\n\n---\n\nAnyway, I finally found the culprit.\n\nNow for the details...\n\nOur deps are pinned, and it seems nobody ever updated them before I joined the project. So we have an old requests 2.6.0.\n\nAll the runtime deps are pinned, but not the development ones. One such unpinned dependency is pytest.\n\nWith pytest 2.9.2, the tests run just fine. With pytest 3.0.2, I reproduce the import problem on my workstation (so it's not just the CI). Upgrading to requests 2.6.1 fixes the `ImportError`.\n\nThat explains why things started failing this morning: the CI somehow updated pytest this morning in its virtualenv (it usually uses a cached version of the virtualenv).\n\n tl;dr: **It seems requests==2.6.0 doesn't work with pytest==3.0.2.** Leaving it here in case it can help someone else in the future.\n\n/me goes and upgrades all the deps :wink: \n", "This issue is cropping up again for me more than 2 years later. requests is installed as a dependency of ec2-metadata globally on an Ubuntu 16.04 machine. When importing requests in a python shell it imports fine. When it is imported as part of a salt external pillar module it fails with an ImportError.\r\n```\r\nTraceback (most recent call last):\r\n File \"/srv/salt/modules/pillar/my_module.py\", line 9, in <module>\r\n import ec2_metadata\r\n File \"/usr/local/lib/python2.7/dist-packages/ec2_metadata.py\", line 4, in <module>\r\n import requests\r\n File \"/usr/lib/python2.7/dist-packages/requests/__init__.py\", line 58, in <module>\r\n from . import utils\r\n File \"/usr/lib/python2.7/dist-packages/requests/utils.py\", line 25, in <module>\r\n from . 
import certs\r\nImportError: cannot import name certs\r\n```\r\nHere is the pip freeze:\r\n```\r\nroot@saltmaster:~# pip freeze\r\naws-cfn-bootstrap==1.4\r\nboto3==1.9.69\r\nbotocore==1.12.69\r\ncached-property==1.5.1\r\ncffi==1.5.2\r\nchardet==2.3.0\r\ncryptography==1.2.3\r\ndocutils==0.14\r\nec2-metadata==1.8.0\r\nenum34==1.1.2\r\nfutures==3.0.5\r\ngitdb==0.6.4\r\nGitPython==1.0.1\r\nidna==2.0\r\nipaddress==1.0.16\r\nJinja2==2.8\r\njmespath==0.9.3\r\nlockfile==0.12.2\r\nMako==1.0.3\r\nMarkupSafe==0.23\r\nmsgpack-python==0.4.6\r\nndg-httpsclient==0.4.0\r\nply==3.7\r\npyasn1==0.1.9\r\npycparser==2.14\r\npycrypto==2.6.1\r\npycurl==7.43.0\r\npygit2==0.24.0\r\nPyMySQL==0.7.2\r\npyOpenSSL==0.15.1\r\npystache==0.5.4\r\npython-apt==1.1.0b1+ubuntu0.16.4.2\r\npython-daemon==1.6.1\r\npython-dateutil==2.4.2\r\npython-systemd==231\r\nPyYAML==3.11\r\npyzmq==15.2.0\r\nredis==2.10.5\r\nrequests==2.9.1\r\ns3transfer==0.1.13\r\nsalt==2016.3.8\r\nsix==1.10.0\r\nsmmap==0.9.0\r\ntornado==4.2.1\r\nurllib3==1.24.1\r\n```\r\nThis is running within salt 2016.3.8+ds-1 installed through the official salt apt repo\r\n```\r\ndeb http://repo.saltstack.com/apt/ubuntu/16.04/amd64/2016.3 xenial main\r\n```\r\nAnd Python version:\r\n```\r\nPython 2.7.12 (default, Nov 12 2018, 14:36:49)\r\n```\r\nI realize that this seems more like a bug with salt but thought I'd ask here in case anyone had any idea.", "requests 2.9 is not supported any longer", "Got the same issue with version `requests==2.19.1` on Heroku.\r\nAny ideas what can go wrong there? Locally on Ubuntu 18 works well." ]
https://api.github.com/repos/psf/requests/issues/2027
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2027/labels{/name}
https://api.github.com/repos/psf/requests/issues/2027/comments
https://api.github.com/repos/psf/requests/issues/2027/events
https://github.com/psf/requests/issues/2027
32,453,236
MDU6SXNzdWUzMjQ1MzIzNg==
2,027
Chunk Size - Question -- Not issue
{ "avatar_url": "https://avatars.githubusercontent.com/u/6764126?v=4", "events_url": "https://api.github.com/users/patelgr/events{/privacy}", "followers_url": "https://api.github.com/users/patelgr/followers", "following_url": "https://api.github.com/users/patelgr/following{/other_user}", "gists_url": "https://api.github.com/users/patelgr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/patelgr", "id": 6764126, "login": "patelgr", "node_id": "MDQ6VXNlcjY3NjQxMjY=", "organizations_url": "https://api.github.com/users/patelgr/orgs", "received_events_url": "https://api.github.com/users/patelgr/received_events", "repos_url": "https://api.github.com/users/patelgr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/patelgr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/patelgr/subscriptions", "type": "User", "url": "https://api.github.com/users/patelgr", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-29T15:01:22Z
2021-09-09T00:01:03Z
2014-04-29T15:51:43Z
NONE
resolved
Is there any guideline on selecting chunk size? I tried different chunk size but none of them give download speed comparable to browser or wget download speed here is snapshot of my code r = requests.get(url, headers = headers,stream=True) total_length = int(r.headers.get('content-length')) if not total_length is None: # no content length header for chunk in r.iter_content(1024): f.write(chunk) Any help would be appreciated.?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2027/reactions" }
https://api.github.com/repos/psf/requests/issues/2027/timeline
null
completed
null
null
false
[ "Hi @gauravp2003,\n\nMuch of the documentation around requests, including the [README](https://github.com/kennethreitz/requests#contribute), suggests that the issue tracker is for bugs or feature requests, not questions. If you have questions, the should be asked on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). @Lukasa, someone else, or I will answer it there.\n\nThanks for your interest in requests,\n" ]
https://api.github.com/repos/psf/requests/issues/2026
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2026/labels{/name}
https://api.github.com/repos/psf/requests/issues/2026/comments
https://api.github.com/repos/psf/requests/issues/2026/events
https://github.com/psf/requests/pull/2026
32,407,944
MDExOlB1bGxSZXF1ZXN0MTUzMDI1OTg=
2,026
ImportError: No module named 'requests.packages.urllib3.util'
{ "avatar_url": "https://avatars.githubusercontent.com/u/3210446?v=4", "events_url": "https://api.github.com/users/zacharysarah/events{/privacy}", "followers_url": "https://api.github.com/users/zacharysarah/followers", "following_url": "https://api.github.com/users/zacharysarah/following{/other_user}", "gists_url": "https://api.github.com/users/zacharysarah/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zacharysarah", "id": 3210446, "login": "zacharysarah", "node_id": "MDQ6VXNlcjMyMTA0NDY=", "organizations_url": "https://api.github.com/users/zacharysarah/orgs", "received_events_url": "https://api.github.com/users/zacharysarah/received_events", "repos_url": "https://api.github.com/users/zacharysarah/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zacharysarah/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zacharysarah/subscriptions", "type": "User", "url": "https://api.github.com/users/zacharysarah", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2014-04-28T23:23:24Z
2021-09-09T00:01:26Z
2014-05-02T19:09:39Z
NONE
resolved
I'm using Python 3.4. I get this error from `import requests`: ``` Traceback (most recent call last): File "devwiki.py", line 4, in <module> import requests File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/__init__.py", line 58, in <module> from . import utils File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/utils.py", line 25, in <module> from .compat import parse_http_list as _parse_list_header File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/compat.py", line 7, in <module> from .packages import chardet File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/packages/__init__.py", line 3, in <module> from . import urllib3 File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/packages/urllib3/__init__.py", line 16, in <module> from .connectionpool import ( File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/packages/urllib3/connectionpool.py", line 36, in <module> from .connection import ( File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/requests-2.3.0-py3.4.egg/requests/packages/urllib3/connection.py", line 43, in <module> from .util import ( ImportError: No module named 'requests.packages.urllib3.util' ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2026/reactions" }
https://api.github.com/repos/psf/requests/issues/2026/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2026.diff", "html_url": "https://github.com/psf/requests/pull/2026", "merged_at": "2014-05-02T19:09:39Z", "patch_url": "https://github.com/psf/requests/pull/2026.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2026" }
true
[ "@zakcalrissian how did you install requests? Your traceback seems to say that you have version 2.3.0 installed when no such version has been released.\n", "I got an email about this as well. Installing directly from Github reveals this problem, but it's very specific: you need to install the package and then run from somewhere _other_ than the requests directory.\n\nThe problem is that #2017 updated urllib3 to a new version that includes a `util` module that is a directory, not a file. However, we didn't update `setup.py` to ensure that the new directory is installed, so import fails. I'll fix this up and raise a pull request shortly.\n", "I had this problem too. Installing 2.2.1 from Pypi has resolved the problem for the moment.\n", "Ok, the fix attached to this issue should fix the problem. =)\n", ":+1: This is due to a refactor pull request introduced upstream.\n", "It works in local. Thanks. \n", ":cake:\n", "It works. Thank you!\n", "Couldn't we just use [`setuptools.find_packages`](https://pythonhosted.org/setuptools/setuptools.html#using-find-packages) instead of the long list of subpackages?\n" ]
https://api.github.com/repos/psf/requests/issues/2025
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2025/labels{/name}
https://api.github.com/repos/psf/requests/issues/2025/comments
https://api.github.com/repos/psf/requests/issues/2025/events
https://github.com/psf/requests/issues/2025
32,402,696
MDU6SXNzdWUzMjQwMjY5Ng==
2,025
JSON Upload
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" }, { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
11
2014-04-28T21:58:28Z
2021-09-08T23:07:57Z
2014-10-05T16:46:09Z
CONTRIBUTOR
resolved
We download it but don't upload it. Asymmetrical. ``` r = requests.post('http://httpbin.org/post', json={'life': 42}) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2025/reactions" }
https://api.github.com/repos/psf/requests/issues/2025/timeline
null
completed
null
null
false
[ "This is really easy to do. I'm curious though, what made you change your mind about this?\n", "Tagging this as contributor friendly because it's perfect for @sigmavirus24 to pair with a new developer on.\n", "cool!\n", "Did anyone already start working on it ? If not, I would be interested in giving it a try ...\n", "@willingc and I are working on this.\n", "Is anyone still working on it? Otherwise I will be happy to work on this as my first issue ever. Please let me know if I can contribute. :)\n", "## Yes they are. Please be patient. \n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "I've just been doing `requests.post('http://httpbin.org/post', data=json.dumps({'life':42}))`; is there some limitation I'm missing about doing that way?\n", "@gbromios That's the only way to do it at the moment. Note, however, that the Content-Type is wrong. =)\n", "ah, so convert to a string and override the Content-Type header in one step. That seems reasonable.\n", "Yeah — basically if we do it on one side (download), we should also do it on the other (upload). Nice and symmetrical.\n" ]
https://api.github.com/repos/psf/requests/issues/2024
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2024/labels{/name}
https://api.github.com/repos/psf/requests/issues/2024/comments
https://api.github.com/repos/psf/requests/issues/2024/events
https://github.com/psf/requests/issues/2024
32,385,865
MDU6SXNzdWUzMjM4NTg2NQ==
2,024
Response.close doesn't really close the connection
{ "avatar_url": "https://avatars.githubusercontent.com/u/167414?v=4", "events_url": "https://api.github.com/users/ctheiss/events{/privacy}", "followers_url": "https://api.github.com/users/ctheiss/followers", "following_url": "https://api.github.com/users/ctheiss/following{/other_user}", "gists_url": "https://api.github.com/users/ctheiss/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ctheiss", "id": 167414, "login": "ctheiss", "node_id": "MDQ6VXNlcjE2NzQxNA==", "organizations_url": "https://api.github.com/users/ctheiss/orgs", "received_events_url": "https://api.github.com/users/ctheiss/received_events", "repos_url": "https://api.github.com/users/ctheiss/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ctheiss/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ctheiss/subscriptions", "type": "User", "url": "https://api.github.com/users/ctheiss", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-04-28T18:30:51Z
2021-09-09T00:01:04Z
2014-04-28T18:34:23Z
NONE
resolved
The docs for `Response.close` say: "Closes the underlying file descriptor and releases the connection back to the pool.", and the code is: `self.raw.release_conn()`. I think this should be `self.raw.close`, as releasing the connection (https://github.com/kennethreitz/requests/blob/826667a54c608d72d8813c3eddb75b27ab26c988/requests/packages/urllib3/response.py#L121) and closing the response's underlying file descriptor (https://github.com/kennethreitz/requests/blob/826667a54c608d72d8813c3eddb75b27ab26c988/requests/packages/urllib3/response.py#L279) are different. It's a little difficult making out all the implications, but I believe releasing the connection will close the connection _when that connection is used again from the pool_, which is different than the docs suggest.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2024/reactions" }
https://api.github.com/repos/psf/requests/issues/2024/timeline
null
completed
null
null
false
[ "Thanks for raising this!\n\nThis has come up before, and we don't agree with your assessment. From requests' perspective, connection pooling is A Good Thing, and we want to enable that as much as possible. For that reason, we allow urllib3 to manage the lifetime of our underlying connections, which it does very well.\n\nFrom the user's perspective, the fact that \"closed\" doesn't mean \"socket is closed\" is irrelevant. What _we_ mean is that the user cannot use that raw object directly: closed is a perfectly good word for that state of affairs.\n\nI see that you've raised this issue over at urllib3 (shazow/urllib3#379). That issue is good enough for your core problem: we're not changing this behaviour. =)\n", "Wow your response was quick! I'm continually impressed with this project :)\n\nWhat you're saying is completely reasonable... how about removing the \"Closes the underlying file descriptor\" from the docs, then? Or changing it to \"Releases the connection back to the pool, which will eventually close the underlying file descriptor\"?\n", "Thanks! Keeping quick response times on this project is one of the goals. We don't accept new features often, but we want people to feel like they can suggest them, and a fast response time helps there: we don't want you to think we're ignoring you!\n\nWith that in mind, the change to the documentation is a good idea. What do you think of [this](http://docs.python-requests.org/en/latest/api/#requests.Response.close)?\n", "Looks good to me! Getting rid of the mention of file descriptors is probably best, given how low-level they are.\n" ]
https://api.github.com/repos/psf/requests/issues/2023
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2023/labels{/name}
https://api.github.com/repos/psf/requests/issues/2023/comments
https://api.github.com/repos/psf/requests/issues/2023/events
https://github.com/psf/requests/pull/2023
32,298,236
MDExOlB1bGxSZXF1ZXN0MTUyMDYzMTA=
2,023
show which ssl backend we are using
{ "avatar_url": "https://avatars.githubusercontent.com/u/717901?v=4", "events_url": "https://api.github.com/users/t-8ch/events{/privacy}", "followers_url": "https://api.github.com/users/t-8ch/followers", "following_url": "https://api.github.com/users/t-8ch/following{/other_user}", "gists_url": "https://api.github.com/users/t-8ch/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/t-8ch", "id": 717901, "login": "t-8ch", "node_id": "MDQ6VXNlcjcxNzkwMQ==", "organizations_url": "https://api.github.com/users/t-8ch/orgs", "received_events_url": "https://api.github.com/users/t-8ch/received_events", "repos_url": "https://api.github.com/users/t-8ch/repos", "site_admin": false, "starred_url": "https://api.github.com/users/t-8ch/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/t-8ch/subscriptions", "type": "User", "url": "https://api.github.com/users/t-8ch", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
null
[]
null
11
2014-04-26T21:12:48Z
2021-09-08T23:11:07Z
2014-05-12T18:57:10Z
CONTRIBUTOR
resolved
This is meant to be used when triaging bugs, without requiring deep knowledge about the internals from reporters.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2023/reactions" }
https://api.github.com/repos/psf/requests/issues/2023/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2023.diff", "html_url": "https://github.com/psf/requests/pull/2023", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2023.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2023" }
true
[ "LGTM, can only be helpful. =) :cake:\n", "Did someone need this?\n", "@kennethreitz this will help a great deal when debugging issues where people are running into errors with HTTPS connections.\n", "Specifically #2022 would have been easier to debug if we had this information readily available to us\n", "Hmm, I wonder if we can put this into the connection adapter instead. \n", "@kennethreitz I don't think we can do that. We want to know if the user installed the dependencies to make the top level import/execution work.\n", "Of course we can, we can stick the variable wherever we want. I'm really not a fan of adding random variables around the codebase.\n\nI think a better approach would be for the pyopenssl monkeypatcher to set a variable in urllib3, since it's the one being monkeypatched. \n\nPerhaps `urllib3._ssl_backend`.\n", "Let's talk to them and see what they say. If not, we can revisit. \n\nI fall on the side of \"we're not using pyopenssl, urllib3 is\", however we're the ones manually monkeypatching, so the line is ever–so–slightly unclear. \n", "Yeah, I'll move it over to urllib3. On the other hand 'regular' users of urllib3 know which backend they are using, as they have to activate it themselves. Requests users get automagic.\n", "I agree with @t-8ch that this belongs here, even if it's hidden somewhere\n", "Just so that everyone is up to speed, urllib3 is planning to add a module that uses PyOpenSSL to 'backport' TLS features to versions of Python if it's available. This module _will_ keep track of what it's using, so we can simply wait for that to happen. Work is tracked in shazow/urllib3#371.\n" ]
https://api.github.com/repos/psf/requests/issues/2022
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2022/labels{/name}
https://api.github.com/repos/psf/requests/issues/2022/comments
https://api.github.com/repos/psf/requests/issues/2022/events
https://github.com/psf/requests/issues/2022
32,294,037
MDU6SXNzdWUzMjI5NDAzNw==
2,022
https GET request fails with "handshake failure"
{ "avatar_url": "https://avatars.githubusercontent.com/u/101148?v=4", "events_url": "https://api.github.com/users/jaddison/events{/privacy}", "followers_url": "https://api.github.com/users/jaddison/followers", "following_url": "https://api.github.com/users/jaddison/following{/other_user}", "gists_url": "https://api.github.com/users/jaddison/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaddison", "id": 101148, "login": "jaddison", "node_id": "MDQ6VXNlcjEwMTE0OA==", "organizations_url": "https://api.github.com/users/jaddison/orgs", "received_events_url": "https://api.github.com/users/jaddison/received_events", "repos_url": "https://api.github.com/users/jaddison/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaddison/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaddison/subscriptions", "type": "User", "url": "https://api.github.com/users/jaddison", "user_view_type": "public" }
[]
closed
true
null
[]
null
76
2014-04-26T17:42:19Z
2018-06-15T02:16:34Z
2014-04-26T21:08:02Z
NONE
resolved
Related to #1083, perhaps. Standard `requests.get()` for this particular site/page `https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html` results in: ``` >>> import requests >>> requests.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html') Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/adapters.py", line 385, in send raise SSLError(e) requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure ``` Using `request-toolbelt`'s `SSLAdapter` to try various ssl versions, they all fail, it would seem... see following tracebacks. TLSv1: ``` >>> adapter = SSLAdapter('TLSv1') >>> s = requests.Session() >>> s.mount('https://', adapter) >>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html') Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 395, in get return self.request('GET', url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/adapters.py", line 385, in send raise SSLError(e) requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure ``` SSLv3: ``` >>> adapter = SSLAdapter('SSLv3') >>> s = requests.Session() >>> s.mount('https://', adapter) >>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html') Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 395, in get return self.request('GET', url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/adapters.py", line 385, in send raise SSLError(e) requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure ``` SSLv2: ``` >>> adapter = SSLAdapter('SSLv2') >>> s = requests.Session() >>> s.mount('https://', adapter) >>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html') Traceback (most recent call last): File 
"<stdin>", line 1, in <module> File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 395, in get return self.request('GET', url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/adapters.py", line 378, in send raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPSConnectionPool(host='docs.apitools.com', port=443): Max retries exceeded with url: /2014/04/24/a-small-router-for-openresty.html (Caused by <class 'socket.error'>: [Errno 54] Connection reset by peer) ``` Note the last one gives a `Connection reset by peer` error, which differs from the others, but I'm pretty sure SSLv2 isn't supported by the server anyhow. For fun, I tried to pass through some more appropriate headers through on the last request as well: ``` >>> headers = { ... 'Accept': u"text/html,application/xhtml+xml,application/xml", ... 'User-Agent': u"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.131 Safari/537.36", ... 'Accept-Encoding': u"gzip,deflate", ... 'Accept-Language': u"en-US,en;q=0.8" ... } >>> adapter = SSLAdapter('SSLv2') >>> s = requests.Session() >>> s.mount('https://', adapter) >>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html', headers=headers) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 395, in get return self.request('GET', url, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/jaddison/.virtualenvs/techtown/lib/python2.7/site-packages/requests/adapters.py", line 378, in send raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPSConnectionPool(host='docs.apitools.com', port=443): Max retries exceeded with url: /2014/04/24/a-small-router-for-openresty.html (Caused by <class 'socket.error'>: [Errno 54] Connection reset by peer) ``` No dice there either. Here's what the HTTPS connection info in Chrome on Mac looks like: ![screen shot 2014-04-26 at 10 35 21 am](https://cloud.githubusercontent.com/assets/101148/2809354/44aa40b4-cd69-11e3-88f1-fabf089fe238.png) I'm not positive, but some googling indicates it's likely a cipher list issue, which is more urllib3, I think? I tried to modify `DEFAULT_CIPHER_LIST` in `pyopenssl`, but started running into import errors. At this point it seemed like things were just broken, and there wasn't really a proper way to approach fixing this yet. Version information: OSX Mavericks Python 2.7.5 OpenSSL 0.9.8y 5 Feb 2013 - (from `python -c "import ssl; print ssl.OPENSSL_VERSION"`) requests 2.2.1 requests-toolbelt 0.2.0 urllib3 1.8
{ "avatar_url": "https://avatars.githubusercontent.com/u/101148?v=4", "events_url": "https://api.github.com/users/jaddison/events{/privacy}", "followers_url": "https://api.github.com/users/jaddison/followers", "following_url": "https://api.github.com/users/jaddison/following{/other_user}", "gists_url": "https://api.github.com/users/jaddison/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaddison", "id": 101148, "login": "jaddison", "node_id": "MDQ6VXNlcjEwMTE0OA==", "organizations_url": "https://api.github.com/users/jaddison/orgs", "received_events_url": "https://api.github.com/users/jaddison/received_events", "repos_url": "https://api.github.com/users/jaddison/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaddison/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaddison/subscriptions", "type": "User", "url": "https://api.github.com/users/jaddison", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2022/reactions" }
https://api.github.com/repos/psf/requests/issues/2022/timeline
null
completed
null
null
false
[ "Sadly, this is unrelated to the issue you identified, and entirely down to the crappy OpenSSL that OS X ships with by default. Version 0.9.8y has some real problems with performing SSL handshakes, and some servers don't tolerate it well. Using Python 3 on my OS X box (therefore using a newer OpenSSL) reveals that there's no problem.\n\nYou have two options:\n1. Install OpenSSL from Homebrew, then install a new version of Python 2 from Homebrew which will automatically link against the Homebrew-provided OpenSSL.\n2. Install OpenSSL from Homebrew, and then install PyOpenSSL against that new version by running `env ARCHFLAGS=\"-arch x86_64\" LDFLAGS=\"-L/usr/local/opt/openssl/lib\" CFLAGS=\"-I/usr/local/opt/openssl/include\" pip install PyOpenSSL`.\n", "Ah, looks like I was following a red herring then - I don't plan on deploying anything on OSX anyhow. Looks like I'll move my testing to a linux virtualbox. Apologies for this long-winded issue!\n", "No need to apologise, asking that question was the right thing to do: it's bizarrely specific knowledge to know that OS X has this problem. =)\n", "Ok, this is a bummer. I created an Ubuntu 14.04 server 32bit Virtualbox image via Vagrant and this is all still happening except for the SSLv2 case, where it fails because the protocol isn't included in the OpenSSL version in Ubuntu 14.04 (by design, I believe - SSLv2 is old and outdated).\n\nVersions:\nUbuntu 14.04 32bit (via Vagrant/Virtualbox combo)\nPython 2.7.6\nrequests==2.2.1\nrequests-toolbelt==0.2.0\nurllib3==1.8.2\n\nEDIT: forgot the OpenSSL version...\n\npython -c \"import ssl; print ssl.OPENSSL_VERSION\"\nOpenSSL 1.0.1f 6 Jan 2014\n\nTLSv1:\n\n```\n>>> import requests\n>>> from requests_toolbelt import SSLAdapter\n>>> adapter = SSLAdapter('TLSv1')\n>>> s = requests.Session()\n>>> s.mount('https://', adapter)\n>>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html')\nTraceback (most recent call last):\n File \"<console>\", line 1, in <module>\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 395, in get\n return self.request('GET', url, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/adapters.py\", line 385, in send\n raise SSLError(e)\nSSLError: [Errno 1] _ssl.c:510: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure\n```\n\nSSLv2:\n\n```\n>>> import requests\n>>> from requests_toolbelt import SSLAdapter\n>>> adapter = SSLAdapter('SSLv3')\n>>> s = requests.Session()\n>>> s.mount('https://', adapter)\n>>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html')\nTraceback (most recent call last):\n File \"<console>\", line 1, in <module>\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 395, in get\n return self.request('GET', url, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 486, in send\n r = 
adapter.send(request, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/adapters.py\", line 385, in send\n raise SSLError(e)\nSSLError: [Errno 1] _ssl.c:510: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure\n```\n\nSSLv23:\n\n```\n>>> import requests\n>>> from requests_toolbelt import SSLAdapter\n>>> adapter = SSLAdapter('SSLv23')\n>>> s = requests.Session()\n>>> s.mount('https://', adapter)\n>>> s.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html')\nTraceback (most recent call last):\n File \"<console>\", line 1, in <module>\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 395, in get\n return self.request('GET', url, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"/home/vagrant/.virtualenvs/techtown/local/lib/python2.7/site-packages/requests/adapters.py\", line 385, in send\n raise SSLError(e)\nSSLError: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure\n```\n\nPerhaps this is a cipher list issue then? Or is the OpenSSL version used here still problematic?\n", "I am absolutely willing to put in some time to help debug this if necessary... provided you guys give me some direction.\n", "VM is downloading. I can't reproduce this on ArchLinux.\nThe stacktraces indicate this but I'd like to be sure: You are _not_ using PyOpenSSL but only the stdlib?\n", "@t-8ch Thanks for taking a look at this, I'm a bit confused. OpenSSL makes my life really hard =(\n", "@t-8ch I haven't installed PyOpenSSL if that's what you're asking?\n\nI would have assumed (perhaps incorrectly) that `pip install requests` should give me everything I need to successfully call `requests.get('...')` on an HTTPS page. Which, of course, it works for the most part, just not for this site for some reason. \n", "@jaddison It _mostly_ does. Unfortunately, Python 2.7s standard library sucks hard and doesn't support some features, such as SNI.\n\nI wonder if this is SNI...\n", "@jaddison There are two different codepaths behind the scenes. You shouldn't have to care about those, but it helps to know when debugging.\n\nHowever I can now reproduce this on ubuntu. But only o Py2. On Py3 everything is fine.\nI suspect @Lukasa is right and the server fails when the client is not using SNI.\n", "It bothers me that an absence of SNI fails in multiple different ways depending on the server in question.\n", "I did notice this change between OpenSSL 1.0.1f and 1.0.1g (https://www.openssl.org/news/openssl-1.0.1-notes.html):\n\n`Add TLS padding extension workaround for broken servers.`\n\nEDIT: Ahh, nevermind - the bug shouldn't vary between Py 2 and 3, I'd think.\n", "@jaddison To test whether this is SNI, you'll need to [install the SNI requirements](https://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484) for Python 2.\n", "@Lukasa was right. 
Compare:\n\n``` bash\n$ openssl s_client -connect docs.apitools.com:443 \nCONNECTED(00000003)\n139846853338768:error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure:s23_clnt.c:762:\n---\nno peer certificate available\n---\nNo client certificate CA names sent\n---\nSSL handshake has read 7 bytes and written 517 bytes\n---\nNew, (NONE), Cipher is (NONE)\nSecure Renegotiation IS NOT supported\nCompression: NONE\nExpansion: NONE\n---\n\n$ openssl s_client -connect docs.apitools.com:443 -servername docs.apitools.com\n... happy handshake here\n```\n", "To elaborate: The second command enables the SNI functionality of `openssl s_client`.\n\nYou can a) switch to python3 b) install extra dependencies.\nThe stdlib has at the moment no way to do SNI.\n", "Thanks for the quick feedback. Seeing as there is no bug, I'll close this... again.\n", "Hey, thank you guys !! I installed python3 on my mac and boom, it works.\n", "Just want to chime in and say that I experienced this issue on OS X 10.9.5, Python 2.7.7 and OpenSSL 0.9.8zc.\n\nI was able to fix my handshaking issue by:\n1. Installing a newer-than-stock OpenSSL on my machine via `brew install OpenSSL`\n2. Compiling and installing the `cryptography` package linked against the new OpenSSL (`env ARCHFLAGS=\"-arch x86_64\" LDFLAGS=\"-L/usr/local/opt/openssl/lib\" CFLAGS=\"-I/usr/local/opt/openssl/include\" pip install cryptography`) \n3. Installing requests with SNI support by doing `pip install requests[security]`\n", "Thanks, @Microserf. I'm pretty much running the same specs (10.9.5, Python 2.7.6 installed via Homebrew but compiled with system provided OpenSSL 0.9.8zg) and this was my entire process for getting `requests` up and running for Django:\n\n```\nbrew install openssl\n```\n\nInstall `requests` with a bunch of [SNI stuff](https://urllib3.readthedocs.org/en/latest/security.html#openssl-pyopenssl), compiled against our new install of OpenSSL. The `[security]` option simply installs `pyopenssl ndg-httpsclient pyasn1`\n\n```\nenv ARCHFLAGS=\"-arch x86_64\" LDFLAGS=\"-L/usr/local/opt/openssl/lib\" CFLAGS=\"-I/usr/local/opt/openssl/include\" pip install requests[security] urllib3\n```\n\nAnd we're good to go:\n\n``` python\n\"\"\"\nThis may or may not be needed. See:\nhttps://urllib3.readthedocs.org/en/latest/security.html#openssl-pyopenssl\n\"\"\"\n# from urllib3.contrib import pyopenssl\n# pyopenssl.inject_into_urllib3()\n\nimport requests\n# r = requests.get(...)\n```\n", "Is there a definitive answer on how to get this working on ubuntu? I'm running into this issue, and it looks like the only answer here concerns how to get this working on a Mac. Upgrading our entire codebase to python 3 is not an option.\n", "OK, I may have just answered my own question. What I did boils down to:\n\n```\nsudo apt-get install libffi-dev\npip install pyOpenSSL ndg-httpsclient pyasn1\n```\n", "@lsemel thank you, that just saved me a bunch of time\n", "@lsemel Are your sure? I tried it on Ubuntu 15.10 and it still doesn't work with Python 2.7.10.\n\nIt works with Python 2.7 on Travis CI:\nhttps://travis-ci.org/playing-se/swish-python\n", "Got it to work now! I simply uninstalled pyOpenSSL:\n`pip uninstall pyOpenSSL`\n", "Maybe we should only pyopenssl.inject_into_urllib3() if Python version is less than 2.7.9? pyOpenSSL seems to break stuff on Ubuntu and Windows if Python version is 2.7.10.\n", "PyOpenSSL should not be breaking anything. If it does, that's a bug that should be reported. 
\n", "I will have to look into this, but is there any good reason to inject pyopenssl into urllib3 if Python version is 2.7.9 or newer?\n\nI am thinking of something like this:\n\n```\n# Check if Modern SSL with SNI support\ntry:\n from ssl import SSLContext\n from ssl import HAS_SNI\nexcept ImportError:\n # Attempt to enable urllib3's SNI support, if possible\n try:\n from .packages.urllib3.contrib import pyopenssl\n pyopenssl.inject_into_urllib3()\n except ImportError:\n pass\n```\n", "Yeah, frequently there is. For example, on OS X most Pythons link against the system OpenSSL, which is version 0.9.8zg. PyOpenSSL, however, will link against a much newer OpenSSL (1.0.2). That makes using PyOpenSSL a substantial security improvement.\n\nAdditionally, PyOpenSSL gives us much better access to OpenSSL, allowing us to secure it more effectively. \n", "OK, I have played around with this a little now.\n\nIt WORKS with pyopenssl BUT not if ndg-httpsclient is installed.\n\nHowever, I can get it work with ndg-httpsclient if I uninstall pyasn1 giving me these warnings:\n\n```\n/usr/lib/python2.7/dist-packages/ndg/httpsclient/subj_alt_name.py:22: UserWarning: Error importing pyasn1, subjectAltName check for SSL peer verification will be disabled. Import error is: No module named pyasn1.type\n warnings.warn(import_error_msg)\n/usr/lib/python2.7/dist-packages/ndg/httpsclient/ssl_peer_verification.py:25: UserWarning: SubjectAltName support is disabled - check pyasn1 package installation to enable\n warnings.warn(SUBJ_ALT_NAME_SUPPORT_MSG)\n/usr/lib/python2.7/dist-packages/ndg/httpsclient/subj_alt_name.py:22: UserWarning: Error importing pyasn1, subjectAltName check for SSL peer verification will be disabled. Import error is: No module named pyasn1.type\n warnings.warn(import_error_msg)\n```\n\nSame behavior on Ubuntu 15.10 and Windows 10 with Python 2.7.10 installed.\n", "That's because without ndg-httpsclient the PyOpenSSL support isn't used. \n" ]
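The resolution in the thread above comes down to giving Python 2 an SNI-capable TLS stack via the `requests[security]` extras (pyOpenSSL, ndg-httpsclient, pyasn1). A minimal sketch of that check and fallback, assuming those extras are installed; the URL is simply the host from the failing traceback and the injection call mirrors the snippet quoted in the comments:

```python
import requests

# Does the interpreter's ssl module support SNI natively?
try:
    from ssl import HAS_SNI  # present on Python 3 and Python 2.7.9+
except ImportError:
    HAS_SNI = False

if not HAS_SNI:
    # Fall back to the pyOpenSSL-backed transport, which can send the SNI
    # extension (requires the requests[security] extras to be installed).
    from requests.packages.urllib3.contrib import pyopenssl
    pyopenssl.inject_into_urllib3()

r = requests.get('https://docs.apitools.com/2014/04/24/a-small-router-for-openresty.html')
print(r.status_code)
```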
https://api.github.com/repos/psf/requests/issues/2021
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2021/labels{/name}
https://api.github.com/repos/psf/requests/issues/2021/comments
https://api.github.com/repos/psf/requests/issues/2021/events
https://github.com/psf/requests/pull/2021
32,287,867
MDExOlB1bGxSZXF1ZXN0MTUyMDE5MzU=
2,021
Don't repopulate proxies if we don't trust the environment.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
1
2014-04-26T12:10:01Z
2021-09-08T23:08:21Z
2014-04-28T21:49:28Z
MEMBER
resolved
This should resolve the problems raised in #2018.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2021/reactions" }
https://api.github.com/repos/psf/requests/issues/2021/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2021.diff", "html_url": "https://github.com/psf/requests/pull/2021", "merged_at": "2014-04-28T21:49:28Z", "patch_url": "https://github.com/psf/requests/pull/2021.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2021" }
true
[ "LGTM. Merge when ready @kennethreitz \n" ]
https://api.github.com/repos/psf/requests/issues/2020
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2020/labels{/name}
https://api.github.com/repos/psf/requests/issues/2020/comments
https://api.github.com/repos/psf/requests/issues/2020/events
https://github.com/psf/requests/issues/2020
32,263,500
MDU6SXNzdWUzMjI2MzUwMA==
2,020
Iterating over chunks that are not terminated by new lines
{ "avatar_url": "https://avatars.githubusercontent.com/u/13951?v=4", "events_url": "https://api.github.com/users/pkaeding/events{/privacy}", "followers_url": "https://api.github.com/users/pkaeding/followers", "following_url": "https://api.github.com/users/pkaeding/following{/other_user}", "gists_url": "https://api.github.com/users/pkaeding/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pkaeding", "id": 13951, "login": "pkaeding", "node_id": "MDQ6VXNlcjEzOTUx", "organizations_url": "https://api.github.com/users/pkaeding/orgs", "received_events_url": "https://api.github.com/users/pkaeding/received_events", "repos_url": "https://api.github.com/users/pkaeding/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pkaeding/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pkaeding/subscriptions", "type": "User", "url": "https://api.github.com/users/pkaeding", "user_view_type": "public" }
[]
closed
true
null
[]
null
17
2014-04-25T20:21:57Z
2021-09-08T12:00:57Z
2014-04-25T20:23:56Z
NONE
resolved
I would like to iterate over chunks coming from a service, where each chunk is a JSON blob (similar to the example in the docs). However, in this service, there are no newlines after each chunk. As a result, it seems that `iter_lines` only yields one line, which is the entire response (not chunked), and `iter_content` yields one character at a time. Is it possible to use this library to stream data from this service? Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2020/reactions" }
https://api.github.com/repos/psf/requests/issues/2020/timeline
null
completed
null
null
false
[ "`iter_content()` takes an argument, the [chunk size](http://docs.python-requests.org/en/latest/api/#requests.Response.iter_content). Set this larger and you'll get back larger chunks. =)\n", "Right, but isn't the chunk size determined by the server? Before each chunk, the server sends the size of the next chunk. Chunks are not all going to be the same size, so how can I specify the chunk size before-hand?\n", "For example, I have [forked httpbin](https://github.com/pkaeding/httpbin/commit/40aeb58992613f61c616bad85ae3aa867e050684) and modified the `/stream` resource to not add newlines to the end of each chunk (note that this it adding a newline to the chunk itself, not the CRLF that terminates the chunk, and is not counted as part of the chunk size).\n\nI have this forked httpbin app running at http://shielded-thicket-4419.herokuapp.com/ if you want to take a look. The following code snippet:\n\n```\nimport json\nimport requests\n\nr = requests.get('http://shielded-thicket-4419.herokuapp.com/stream/20', stream=True)\n\nfor line in r.iter_lines():\n\n # filter out keep-alive new lines\n if line:\n print json.loads(line)\n```\n\nNow produces this error, because it is lumping all chunks into one:\n\n```\nTraceback (most recent call last):\n File \"chunked_request_client.py\", line 10, in <module>\n print json.loads(line)\n File \"/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py\", line 326, in loads\n return _default_decoder.decode(s)\n File \"/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py\", line 369, in decode\n raise ValueError(errmsg(\"Extra data\", s, end, len(s)))\nValueError: Extra data: line 1 column 381 - line 1 column 7630 (char 381 - 7630)\n```\n", "Oh, I apologise, it wasn't clear to me that when you said 'chunk' you meant as-in 'chunked encoding'.\n\nThe short answer is that requests provides no way to use `iter_content` in such a way that you obtain chunks. However, chunked encoding _mandates_ a newline between chunk length and the chunk body itself, and then between the body and the next chunk length, so `iter_lines` would normally be fine. Your problem seems to be a server that is _approximating_ chunked encoding without actually doing it. Requests unfortunately provides no way to easily handle that situation beyond iterating using `iter_content` and handling the decode yourself.\n", "As I [understand it](http://www.w3.org/Protocols/rfc2616/rfc2616-sec3.html#sec3.6.1), chunked encoding mandates a CRLF in between chunks, but not that the chunk data itself end in a new line. Thus, I believe the modification I made to httpbin is still valid HTTP, right?\n", "Nope. From [RFC 2616](http://tools.ietf.org/html/rfc2616#section-3.6.1), here is the ABNF for chunked encoding:\n\n```\n Chunked-Body = *chunk\n last-chunk\n trailer\n CRLF\n\n chunk = chunk-size [ chunk-extension ] CRLF\n chunk-data CRLF\n chunk-size = 1*HEX\n```\n\nNote that each chunk is made up of the chunk data followed by a CRLF combination.\n", "Right, but I mean that the CRLF is not part of the chunk data. For example, try running this:\n\n```\n$ curl -v --raw http://shielded-thicket-4419.herokuapp.com/stream/20\n```\n\nYou will see that there is line break after the chunk data, before the next chunk header. Also, note that the chunk header does not include that CRLF in its length.\n\n(You will also see that my chunks are not coming out on the right boundaries; it seems that heroku is doing something at the proxy level.) 
When I run locally, it looks like this:\n\n```\n...\nAB\n{\"url\": \"http://localhost:5000/stream/20\", \"headers\": {\"Host\": \"localhost:5000\", \"Accept\": \"*/*\", \"User-Agent\": \"curl/7.27.0\"}, \"args\": {}, \"id\": 9, \"origin\": \"127.0.0.1\"}\nAC\n{\"url\": \"http://localhost:5000/stream/20\", \"headers\": {\"Host\": \"localhost:5000\", \"Accept\": \"*/*\", \"User-Agent\": \"curl/7.27.0\"}, \"args\": {}, \"id\": 10, \"origin\": \"127.0.0.1\"}\n...\n```\n\nThat first line in that snippet is 171 characters long, not including the line break. 171 in decimal is 0xAB in hex. So, you can see that there is a CRLF terminating the chunk, but there is not one that is a part of the chunk.\n", "Agreed, but the chunk still needs to be terminated by a CRLF: the fact that it's not included in the chunk length is irrelevant.\n\nI'm really struggling to see the problem here. Any well-formed chunked encoding will have its chunks terminated by CRLF. `iter_lines` will split on those. Use `iter_lines` to get chunks. Why is that logic not working?\n", "Oh, hang on, I see it. Presumably `httplib` is consuming the newline. Argh.\n", "Yeah, in this situation requests doesn't give you the kind of low-level control you want. In requests' opinion, the Transfer Encoding is an implementation detail: the fact that chunked transfer encoding is being used should not be exposed to the user. Unfortunately, if you want that finer-grained control you'll need to use a lower-level library. Alternatively, consider a different method of obtaining the JSON (such as a streaming JSON decoder).\n", "there are some hacks: http://mihai.ibanescu.net/chunked-encoding-and-python-requests\nI hope this function could merged into requests core.\n", "I'm open to having requests provide some useful way of handling chunked encoding as it arrives, though how it would fit into the API is a bit unclear.\n", "Note also that chunked encoding is an artefact of HTTP/1.1 and is not present in HTTP/2. That may be an argument _against_ exposing it in requests: it's obviously a transport-level detail.\n", "@Lukasa http/2 may not call it chunked transfers, but it does have DATA frames as mentioned here: [rfc7540](http://httpwg.org/specs/rfc7540.html#HttpSequence). The 'pad length' concept is analogous to chunk size.", "@electronicsguy pad length is not analagous to chunk size: it is how much padding is attached to the frame. \r\n\r\nMore importantly, again, DATA frames are not semantic and should also not be exposed to users. ", "@Lukasa oh I see. Thanks for clearing that up about lad length. But I don't understand why DATA frames are not analogous to chunking. It is in-fact mentioned in this mailing list that it is the case, at least logically: [http/2-chunking](https://lists.w3.org/Archives/Public/ietf-http-wg/2014JulSep/1676.html). Even in the case of http/1.1, chunked transfer encoding is not visible to the users since it is at the transport layer. So when you said above that chunked encoding is an artefact of http/1.1 and not present in http/2, I'm not able to understand how that is.", "@electronicsguy So the short answer is that we're in agreement. Note again what I said:\n\n> Note also that chunked encoding is an artefact of HTTP/1.1 and is not present in HTTP/2. 
That may be an argument *against* exposing it in requests: it's obviously a transport-level detail.\n\nThat sentence can be rephrased as:\n\n> Note that because chunked encoding was removed in HTTP/2, it is clearly not a semantic part of HTTP, as HTTP/2 was required to keep a one-to-one semantic mapping to HTTP/1.1. For this reason, we should probably consider not exposing the chunks: they aren't supposed to have semantic meaning to the user.\n\nThis is of course backed up by all the relevant RFCs. It's also the course we're taking in the urllib3 v2 work: the boundaries of data chunks are not being exposed to users." ]
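Since requests deliberately hides the chunk boundaries, one way to follow the "handle the decode yourself" advice above is to buffer `iter_content()` output and peel complete JSON objects off the front with `json.JSONDecoder.raw_decode`. A rough sketch; the URL is the forked httpbin instance mentioned earlier and may no longer be running:

```python
import json
import requests

decoder = json.JSONDecoder()
buf = ''

r = requests.get('http://shielded-thicket-4419.herokuapp.com/stream/20', stream=True)

for chunk in r.iter_content(chunk_size=1024):
    if not chunk:
        continue  # keep-alive chunks
    buf += chunk.decode('utf-8')
    while buf:
        try:
            obj, end = decoder.raw_decode(buf)
        except ValueError:
            break  # object is incomplete; wait for more data
        print(obj)
        buf = buf[end:].lstrip()
```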
https://api.github.com/repos/psf/requests/issues/2019
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2019/labels{/name}
https://api.github.com/repos/psf/requests/issues/2019/comments
https://api.github.com/repos/psf/requests/issues/2019/events
https://github.com/psf/requests/issues/2019
32,241,617
MDU6SXNzdWUzMjI0MTYxNw==
2,019
Unable to POST large files
{ "avatar_url": "https://avatars.githubusercontent.com/u/6054881?v=4", "events_url": "https://api.github.com/users/jvaleo/events{/privacy}", "followers_url": "https://api.github.com/users/jvaleo/followers", "following_url": "https://api.github.com/users/jvaleo/following{/other_user}", "gists_url": "https://api.github.com/users/jvaleo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvaleo", "id": 6054881, "login": "jvaleo", "node_id": "MDQ6VXNlcjYwNTQ4ODE=", "organizations_url": "https://api.github.com/users/jvaleo/orgs", "received_events_url": "https://api.github.com/users/jvaleo/received_events", "repos_url": "https://api.github.com/users/jvaleo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvaleo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvaleo/subscriptions", "type": "User", "url": "https://api.github.com/users/jvaleo", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-04-25T15:27:18Z
2021-09-09T00:01:04Z
2014-04-25T16:09:00Z
NONE
resolved
When trying to POST a large binary file > 2GB, I see the following error. I've tried on multiple web servers all with the same result: ``` bash-3.2# du -h AirGuitarNation-Quicktime_H.264.mp4 2.4G AirGuitarNation-Quicktime_H.264.mp4 .... 2014-04-24 17:26:41,600 ERROR Exception caught in upload_file HTTPConnectionPool(host='REDACTED', port=80): Max retries exceeded with url: /uploader_rails/upload.py?signature= REDACTED&username= REDACTED&site_id=763&validate=localhost%3A29125 (Caused by <class 'socket.error'>: [Errno 22] Invalid argument) ``` Here is my code: ``` python file_path = os.path.join(ROOT_PATH, 'uploads', file_name) files = {'file': open(file_path, 'rb')} logger.info('Starting upload for {}'.format(file_name)) try: post = requests.post(upload_url, files=files) except Exception as e: logger.error('Exception caught in upload_file {}'.format(e.message)) ``` Smaller files work without issue, it seems to be anything over 2GB.
{ "avatar_url": "https://avatars.githubusercontent.com/u/6054881?v=4", "events_url": "https://api.github.com/users/jvaleo/events{/privacy}", "followers_url": "https://api.github.com/users/jvaleo/followers", "following_url": "https://api.github.com/users/jvaleo/following{/other_user}", "gists_url": "https://api.github.com/users/jvaleo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jvaleo", "id": 6054881, "login": "jvaleo", "node_id": "MDQ6VXNlcjYwNTQ4ODE=", "organizations_url": "https://api.github.com/users/jvaleo/orgs", "received_events_url": "https://api.github.com/users/jvaleo/received_events", "repos_url": "https://api.github.com/users/jvaleo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jvaleo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jvaleo/subscriptions", "type": "User", "url": "https://api.github.com/users/jvaleo", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2019/reactions" }
https://api.github.com/repos/psf/requests/issues/2019/timeline
null
completed
null
null
false
[ "Hmm, I suspect the problem here is because the multipart body is greater than 2GB in size and we try to POST the whole thing.\n\nWant to try using the [Streaming Multipart Uploader from the toolbelt](http://toolbelt.readthedocs.org/en/latest/user.html#streaming-multipart-data-encoder)?\n", "That's the ticket! Thank you!\n", "Glad to be of service! =)\n" ]
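For reference, the toolbelt helper suggested above is exposed as `MultipartEncoder`, which streams the file from disk instead of building the whole multipart body in memory. A rough sketch; the upload URL, filename, and `video/mp4` content type are placeholders:

```python
import requests
from requests_toolbelt import MultipartEncoder

upload_url = 'http://example.com/uploader_rails/upload.py'  # placeholder
file_path = 'AirGuitarNation-Quicktime_H.264.mp4'           # placeholder

encoder = MultipartEncoder(
    fields={'file': (file_path, open(file_path, 'rb'), 'video/mp4')}
)

# Passing the encoder as `data` streams the body; the Content-Type header
# carries the multipart boundary the encoder generated.
resp = requests.post(upload_url, data=encoder,
                     headers={'Content-Type': encoder.content_type})
resp.raise_for_status()
```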
https://api.github.com/repos/psf/requests/issues/2018
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2018/labels{/name}
https://api.github.com/repos/psf/requests/issues/2018/comments
https://api.github.com/repos/psf/requests/issues/2018/events
https://github.com/psf/requests/issues/2018
32,203,116
MDU6SXNzdWUzMjIwMzExNg==
2,018
Re-order proxy precedence.
{ "avatar_url": "https://avatars.githubusercontent.com/u/392047?v=4", "events_url": "https://api.github.com/users/ouroborus/events{/privacy}", "followers_url": "https://api.github.com/users/ouroborus/followers", "following_url": "https://api.github.com/users/ouroborus/following{/other_user}", "gists_url": "https://api.github.com/users/ouroborus/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ouroborus", "id": 392047, "login": "ouroborus", "node_id": "MDQ6VXNlcjM5MjA0Nw==", "organizations_url": "https://api.github.com/users/ouroborus/orgs", "received_events_url": "https://api.github.com/users/ouroborus/received_events", "repos_url": "https://api.github.com/users/ouroborus/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ouroborus/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ouroborus/subscriptions", "type": "User", "url": "https://api.github.com/users/ouroborus", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
open
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
16
2014-04-25T03:09:20Z
2022-02-17T19:17:32Z
null
NONE
null
`Session.trust_env = False` turns off the checking of environment variables for options including proxy settings (`*_proxy`). But `urllib` picks up and uses these environment proxy settings anyway. `requests` should pass the `trust_env` setting on to `urllib`. (Although I'm not sure if `urllib` has a similar override.) (Proxy setting precedence should be sorted out here as well. The way it is now, environment proxy settings will interfere with (rather than be overridden by) the `proxies` argument in `Session.request` or `requests.request` calls and the `Session.proxies` config regardless of `trust_env` settings.)
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2018/reactions" }
https://api.github.com/repos/psf/requests/issues/2018/timeline
null
null
null
null
false
[ "I've taken a quick look at the code in urllib3 and I can't find any reference to the `*_PROXY` environment variables there. Can you point to where in the code urllib3 grabs that information?\n\nAs for proxy setting precedence, what do you mean by 'interfere with'? We don't override any settings already in the proxy dictionary as we use `dict.setdefault` to set those values. All that happens is you may get an additional proxy value for a scheme you didn't declare. Is this what you meant by 'interfere with'?\n", "I'll have to write up a sample that triggers this.\n\nSo far I've got a situation where the env proxy is defined, `session.proxy` is defined (with a different proxy), and `session.trust_env = False`. The results are the initial request goes through the session-defined proxy and the redirect goes through the env-defined proxy.\n", "It seems redirects pass through `SessionRedirectMixin.resolve_redirects` in `sessions.py` which has a line `proxies = self.rebuild_proxies(prepared_request, proxies)`. `rebuild_proxies` appears to ignore the passed in `proxies` argument and `session.trust_env` settings.\n", "It does ignore `trust_env` and the original proxies dict (both bugs), but you're not running that code unless you're using the requests release from GitHub. =) Fixing both of those bugs, however, should resolve the problem. I'll fix it up later today.\n", "Ah, somehow I though the installation docs were saying to use the github source rather than pip or easy_install. I see now that it says instead to use pip instead of easy_install and then goes on to say where you can get the source.\n", "Regarding proxy setting precedence, I think `Session.request(..., proxies)` should override `Session.proxies` which should override proxies set in the environment. Currently, environment proxies override session proxies (using `Session.trust_env = True`).\n\nIn pseudo-code, it'd be something like:\n\n```\ntrust_env = request.trust_env\nif trust_env == None:\n trust_env = Session.trust_env\nif trust_env == None:\n trust_env = True\nproxies = {}\nif trust_env:\n proxies = env.proxies\nproxies = proxies.update(Session.proxies).update(request.proxies)\n```\n", "To be clear for those who aren't sure, the way @ouroborus' suggestion differs from the current logic is that we take the proxies from the request, then apply proxies from the environment, then finally apply proxies from the `Session`.\n\nI'm open to re-ordering the precedence of the priorities. @sigmavirus24, thoughts?\n", "Can anyone recall the reasoning behind the current order of precedence? It is extremely odd to me that the order of precedence is contrary to the rest of the library.\n", "No, and IIRC (I'm on my phone) it's been that way for a while. Looked like an oversight to me. \n", "## Then consider me in favor of the reordering.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n", "It's been noted that this issue is poorly named. I'm not sure what to rename it or even if it should be now that it has been created. @Lukasa, feel free to rename it if and as you see fit.\n", "@Lukasa wasn't this already fixed?\n", "@sigmavirus24 Not that I can see. =)\n", "This issue causes real headache when using saltstack with pip states and https_proxy set in the environment. 
\nWhile saltstacks pip state allows passing a proxy, this bug ignores the proxy given and prefers the environment proxy instead, as a result the wrong proxy gets used.\n\nA workaround is overwriting your environment variable with saltstacks env_var:\n\n```\nPackageX:\n pip.installed:\n - proxy: http://proxyA\n - env_vars:\n https_proxy: \"http://proxyA\n http_proxy: \"http://proxyA\n```\n\nBut it is rather messy, therefore I'd appreciate if this bug would be fixed.\n", "> But it is rather messy, therefore I'd appreciate if this bug would be fixed.\n\nIt will be. That's why it's labeled as \"Planned\". Also note that we've set a milestone for it. Thanks for your interest.\n", "To quote @nateprewitt from Aug 11, 2016:\r\n\r\n> It may be worth noting that it's PR #2839 that fixes this.\r\n> \r\n> _Originally posted by @nateprewitt in https://github.com/psf/requests/issues/3506#issuecomment-239304816_\r\n>" ]
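The precedence the thread converges on is: proxies passed to an individual request win over `Session.proxies`, which in turn should win over the `*_proxy` environment variables, with `trust_env = False` dropping the environment lookup entirely. A small sketch of how that looks from the caller's side; the proxy URLs are placeholders:

```python
import requests

s = requests.Session()
s.trust_env = False                          # ignore http_proxy / https_proxy
s.proxies = {'https': 'http://proxyA:3128'}  # session-wide default

# Uses proxyA from the session settings.
r1 = s.get('https://example.com/')

# A per-request value overrides the session default for this call only.
r2 = s.get('https://example.com/', proxies={'https': 'http://proxyB:3128'})
```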
https://api.github.com/repos/psf/requests/issues/2017
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2017/labels{/name}
https://api.github.com/repos/psf/requests/issues/2017/comments
https://api.github.com/repos/psf/requests/issues/2017/events
https://github.com/psf/requests/pull/2017
32,175,162
MDExOlB1bGxSZXF1ZXN0MTUxMzU0NjM=
2,017
Update urllib to 1.8.2
{ "avatar_url": "https://avatars.githubusercontent.com/u/111427?v=4", "events_url": "https://api.github.com/users/alpire/events{/privacy}", "followers_url": "https://api.github.com/users/alpire/followers", "following_url": "https://api.github.com/users/alpire/following{/other_user}", "gists_url": "https://api.github.com/users/alpire/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alpire", "id": 111427, "login": "alpire", "node_id": "MDQ6VXNlcjExMTQyNw==", "organizations_url": "https://api.github.com/users/alpire/orgs", "received_events_url": "https://api.github.com/users/alpire/received_events", "repos_url": "https://api.github.com/users/alpire/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alpire/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alpire/subscriptions", "type": "User", "url": "https://api.github.com/users/alpire", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-24T18:49:17Z
2021-09-08T23:07:22Z
2014-04-26T03:04:54Z
CONTRIBUTOR
resolved
Hi, The update enables https requests on the google app engine dev server (see https://github.com/shazow/urllib3/issues/356 and https://github.com/kennethreitz/requests/issues/1961). Thanks, Alex
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2017/reactions" }
https://api.github.com/repos/psf/requests/issues/2017/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2017.diff", "html_url": "https://github.com/psf/requests/pull/2017", "merged_at": "2014-04-26T03:04:54Z", "patch_url": "https://github.com/psf/requests/pull/2017.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2017" }
true
[ "Thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2016
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2016/labels{/name}
https://api.github.com/repos/psf/requests/issues/2016/comments
https://api.github.com/repos/psf/requests/issues/2016/events
https://github.com/psf/requests/issues/2016
32,155,482
MDU6SXNzdWUzMjE1NTQ4Mg==
2,016
App Engine Error: AttributeError: 'ApplicationError' object has no attribute 'errno'
{ "avatar_url": "https://avatars.githubusercontent.com/u/1918510?v=4", "events_url": "https://api.github.com/users/sdog869/events{/privacy}", "followers_url": "https://api.github.com/users/sdog869/followers", "following_url": "https://api.github.com/users/sdog869/following{/other_user}", "gists_url": "https://api.github.com/users/sdog869/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sdog869", "id": 1918510, "login": "sdog869", "node_id": "MDQ6VXNlcjE5MTg1MTA=", "organizations_url": "https://api.github.com/users/sdog869/orgs", "received_events_url": "https://api.github.com/users/sdog869/received_events", "repos_url": "https://api.github.com/users/sdog869/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sdog869/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sdog869/subscriptions", "type": "User", "url": "https://api.github.com/users/sdog869", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-04-24T14:53:23Z
2021-09-09T00:01:05Z
2014-04-24T14:55:27Z
NONE
resolved
I'm not sure if this is a urllib3 error or a requests issue, but when trying to use requests on live app engine I see the stack trace: line 55, in get return request('get', url, *_kwargs) line 44, in request return session.request(method=method, url=url, *_kwargs) /libs/requests/sessions.py", line 452, in request resp = self.send(prep, *_send_kwargs) /libs/requests/sessions.py", line 555, in send r = adapter.send(request, *_kwargs) /libs/requests/adapters.py", line 327, in send timeout=timeout /libs/requests/packages/urllib3/connectionpool.py", line 493, in urlopen body=body, headers=headers) /libs/requests/packages/urllib3/connectionpool.py", line 291, in _make_request conn.request(method, url, *_httplib_request_kw) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/python_std_lib/httplib.py", line 973, in request self._send_request(method, url, body, headers) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/python_std_lib/httplib.py", line 1007, in _send_request self.endheaders(body) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/python_std_lib/httplib.py", line 969, in endheaders self._send_output(message_body) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/python_std_lib/httplib.py", line 829, in _send_output self.send(msg) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/python_std_lib/httplib.py", line 791, in send self.connect() /libs/requests/packages/urllib3/connection.py", line 156, in connect *_self.conn_kw) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/socket.py", line 560, in create_connection sock.connect(sa, host) File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/api/remote_socket/_remote_socket.py", line 776, in connect if translated_e.errno == errno.EISCONN: AttributeError: 'ApplicationError' object has no attribute 'errno' This was working up until recently, so I'm not sure what happened, thanks for the help!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2016/reactions" }
https://api.github.com/repos/psf/requests/issues/2016/timeline
null
completed
null
null
false
[ "Thanks for raising this, but please don't raise an issue both here and on urllib3 at the same time. urllib3 and requests work very closely together and I was already looking at the urllib3 issue, so don't worry about it getting missed: we'll move the issue if necessary.\n", "This matches shazow/urllib3#377.\n", "Good to know, thanks for the quick response I sure appreciate it!\n" ]
https://api.github.com/repos/psf/requests/issues/2015
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2015/labels{/name}
https://api.github.com/repos/psf/requests/issues/2015/comments
https://api.github.com/repos/psf/requests/issues/2015/events
https://github.com/psf/requests/issues/2015
32,065,596
MDU6SXNzdWUzMjA2NTU5Ng==
2,015
requests.get really slow when stream=True
{ "avatar_url": "https://avatars.githubusercontent.com/u/25111?v=4", "events_url": "https://api.github.com/users/cournape/events{/privacy}", "followers_url": "https://api.github.com/users/cournape/followers", "following_url": "https://api.github.com/users/cournape/following{/other_user}", "gists_url": "https://api.github.com/users/cournape/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cournape", "id": 25111, "login": "cournape", "node_id": "MDQ6VXNlcjI1MTEx", "organizations_url": "https://api.github.com/users/cournape/orgs", "received_events_url": "https://api.github.com/users/cournape/received_events", "repos_url": "https://api.github.com/users/cournape/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cournape/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cournape/subscriptions", "type": "User", "url": "https://api.github.com/users/cournape", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-04-23T14:30:08Z
2021-09-09T00:01:06Z
2014-04-23T14:35:59Z
NONE
resolved
I noticed that using stream=True is really slow in some cases. Code that shows the issue: ``` import requests url = "https://api.enthought.com/eggs/rh5-64/numpy-1.8.0-1.egg" target = "numpy-1.8.0-1.egg" use_streaming = True if use_streaming: resp = requests.get(url, stream=True) else: resp = requests.get(url) resp.raise_for_status() with open(target, "wb") as target: if use_streaming: for chunk in resp.iter_content(): target.write(chunk) else: target.write(resp.content) ``` With use_streaming=True, it takes around 40 sec, and only 2 sec when False. Running this script with strace, it looks like the chunk size is 1 byte: ``` ... recvfrom(4, "\345", 1, 0, NULL, NULL) = 1 # I see many lines like this ``` Looks like for some reason the chunk size is ridiculously small ? I am using requests 2.2.1 on python 2.7 on debian.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2015/reactions" }
https://api.github.com/repos/psf/requests/issues/2015/timeline
null
completed
null
null
false
[ "Hm, if I read the documentation correctly, I would have seen that the default chunk size is 1 byte... so nvm.\n\nIs there a rationale for such a small size ?\n", "As you've spotted, by default `iter_content()`'s chunk size is 1, ensuring that it returns as rapidly as possible. This favours responsiveness over throughput (because we endure a large number of syscalls).\n\nWhether this is a good idea or not is unclear. We've got a giant issue that covers this (see #844), and that issue has not been decided emphatically. I'm inclined to increase the size, but wary about the risk of breaking things.\n\nNote that the bug here is not to do with streaming: if you use `r.content` in either mode it'll be fast. It's simply the use of `iter_content` at its default chunk size that causes problems.\n\nAnyway, the central issue is in #844, so I'll close this to centralise there.\n" ]
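The practical upshot of the discussion above: keep `stream=True`, but pass an explicit `chunk_size` to `iter_content()` instead of relying on the 1-byte default, so the loop does one read per block rather than one syscall per byte. A sketch based on the snippet in the report:

```python
import requests

url = "https://api.enthought.com/eggs/rh5-64/numpy-1.8.0-1.egg"
target = "numpy-1.8.0-1.egg"

resp = requests.get(url, stream=True)
resp.raise_for_status()

with open(target, "wb") as fh:
    # 64 KiB blocks trade a little responsiveness for far fewer syscalls.
    for chunk in resp.iter_content(chunk_size=64 * 1024):
        if chunk:  # filter out keep-alive chunks
            fh.write(chunk)
```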
https://api.github.com/repos/psf/requests/issues/2014
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2014/labels{/name}
https://api.github.com/repos/psf/requests/issues/2014/comments
https://api.github.com/repos/psf/requests/issues/2014/events
https://github.com/psf/requests/issues/2014
32,051,740
MDU6SXNzdWUzMjA1MTc0MA==
2,014
RuntimeWarning: Parent module 'requests' not found while handling absolute import
{ "avatar_url": "https://avatars.githubusercontent.com/u/260438?v=4", "events_url": "https://api.github.com/users/dcarley/events{/privacy}", "followers_url": "https://api.github.com/users/dcarley/followers", "following_url": "https://api.github.com/users/dcarley/following{/other_user}", "gists_url": "https://api.github.com/users/dcarley/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dcarley", "id": 260438, "login": "dcarley", "node_id": "MDQ6VXNlcjI2MDQzOA==", "organizations_url": "https://api.github.com/users/dcarley/orgs", "received_events_url": "https://api.github.com/users/dcarley/received_events", "repos_url": "https://api.github.com/users/dcarley/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dcarley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dcarley/subscriptions", "type": "User", "url": "https://api.github.com/users/dcarley", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2014-04-23T11:14:11Z
2018-04-23T14:41:42Z
2014-06-08T09:59:44Z
NONE
resolved
I get this warning when running `nosetests` against a project: ``` $ nosetests ............../home/travis/virtualenv/python2.7/local/lib/python2.7/site-packages/requests/utils.py:72: RuntimeWarning: Parent module 'requests' not found while handling absolute import from netrc import netrc, NetrcParseError ``` However I can't seem to reproduce it in a much simpler example, so I don't know if it's some specific combination of imports or something that nose/mock is doing. You can see that test here: - https://travis-ci.org/gds-operations/collectd-cdn/jobs/23515951#L48 Going back to v2.0.0 prior to a501b0ca81e340fb2c968e1cabc37565187006f7 doesn't produce the same warning.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2014/reactions" }
https://api.github.com/repos/psf/requests/issues/2014/timeline
null
completed
null
null
false
[ "Bizarre. Does this occur reproducibly when running your tests?\n", "Yeah, everytime.\n\nYou should be able to reproduce it by cloning gds-operations/collectd-cdn@e9dc75523f77461d589ce251dc0fc662d48e1006 and running `tox`.\n", "It's hugely confusing: it seems like requests is being removed from `sys.path`. Does that seem to be happening?\n", "I honestly have no idea. It is at least only a warning and requests appears to otherwise work. Is there anything I can do to help debug?\n", "What we really need is to find a minimum set of your code that reliably triggers the error. Let's start by stripping out all the tests but one that will hit that line of code and see if we can hit it.\n", "tox uses individual virtual environments. If you don't install requests in each one then you won't be able to use requests \n", "> What we really need is to find a minimum set of your code that reliably triggers the error. Let's start by stripping out all the tests but one that will hit that line of code and see if we can hit it.\n\nI'll see if I can reproduce it using `nose` (which is what TravisCI is calling directly - not `tox`, so we can eliminate that) and a much simpler module.\n\n> tox uses individual virtual environments. If you don't install requests in each one then you won't be able to use requests\n\nThe package I'm testing has an `install_requires=['requests']` that takes care of that. The tests would fail, rather than just emit a warning, if requests wasn't available.\n", "Closing due to inactivity.\n", "I have requests version 2.13.0. I face this issue only when I am monkey_patching() it with eventlet's requests:\r\nrequests = eventlet.import_patched('requests')\r\nand, when I call: r = requests.get(url)\r\n\r\nError:\r\npython2.7/site-packages/requests/utils.py:113: RuntimeWarning: Parent module 'requests' not found while handling absolute import\r\n from netrc import netrc, NetrcParseError", "This sounds like a problem with eventlet, not Requests. ", "As mentioned above, the problem is in the `eventlet` library.\r\n\r\nHere is an original tip (by @temoto) how to solve the problem: https://github.com/eventlet/eventlet/issues/208#issuecomment-76978478\r\n\r\nAnd this is my workaround:\r\n\r\n```python\r\nfrom eventlet import import_patched\r\nrequests = import_patched('requests.__init__')\r\n```\r\n\r\n## see also\r\n\r\n* https://github.com/requests/requests/issues/2469\r\n* https://github.com/requests/requests/issues/2014" ]
https://api.github.com/repos/psf/requests/issues/2013
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2013/labels{/name}
https://api.github.com/repos/psf/requests/issues/2013/comments
https://api.github.com/repos/psf/requests/issues/2013/events
https://github.com/psf/requests/pull/2013
32,017,810
MDExOlB1bGxSZXF1ZXN0MTUwNDA3OTE=
2,013
eval compatible repr for `Request`
{ "avatar_url": "https://avatars.githubusercontent.com/u/175882?v=4", "events_url": "https://api.github.com/users/cheecheeo/events{/privacy}", "followers_url": "https://api.github.com/users/cheecheeo/followers", "following_url": "https://api.github.com/users/cheecheeo/following{/other_user}", "gists_url": "https://api.github.com/users/cheecheeo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cheecheeo", "id": 175882, "login": "cheecheeo", "node_id": "MDQ6VXNlcjE3NTg4Mg==", "organizations_url": "https://api.github.com/users/cheecheeo/orgs", "received_events_url": "https://api.github.com/users/cheecheeo/received_events", "repos_url": "https://api.github.com/users/cheecheeo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cheecheeo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cheecheeo/subscriptions", "type": "User", "url": "https://api.github.com/users/cheecheeo", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-04-22T22:46:44Z
2021-09-08T22:01:15Z
2014-04-23T06:24:30Z
NONE
resolved
After this change: In [1]: from requests import Request In [2]: eval(repr(Request('GET', 'http://x.org'))) Out[2]: Request(method='GET', url='http://x.org') In [3]: eval(repr(Request('GET', 'http://x.org', auth=('hello', 'world')))) Out[3]: Request(method='GET', url='http://x.org', auth=('hello', 'world'))
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2013/reactions" }
https://api.github.com/repos/psf/requests/issues/2013/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2013.diff", "html_url": "https://github.com/psf/requests/pull/2013", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2013.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2013" }
true
[ "Hey @cheecheeo, thanks for opening this!\n\nYour code is fine and the idea is valid. The problem is that this makes the rest of the library inconsistent and it is _not_ desired behaviour. We have no intent to implement this and we have never intended for the a Request's representation to be used like this.\n\nI'll wait for @Lukasa to weigh in, but I'm pretty sure we will not be accepting this contribution.\n\nCheers\n", "Yeah, I'm with @sigmavirus24 I'm afraid. This is really nice work, but it's just not the way the requests project tends to work. Sorry! :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/2012
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2012/labels{/name}
https://api.github.com/repos/psf/requests/issues/2012/comments
https://api.github.com/repos/psf/requests/issues/2012/events
https://github.com/psf/requests/issues/2012
31,940,963
MDU6SXNzdWUzMTk0MDk2Mw==
2,012
Received "HTTP header is larger than 8192 bytes" even though only a short header was given to requests.
{ "avatar_url": "https://avatars.githubusercontent.com/u/4996057?v=4", "events_url": "https://api.github.com/users/hxuanji/events{/privacy}", "followers_url": "https://api.github.com/users/hxuanji/followers", "following_url": "https://api.github.com/users/hxuanji/following{/other_user}", "gists_url": "https://api.github.com/users/hxuanji/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hxuanji", "id": 4996057, "login": "hxuanji", "node_id": "MDQ6VXNlcjQ5OTYwNTc=", "organizations_url": "https://api.github.com/users/hxuanji/orgs", "received_events_url": "https://api.github.com/users/hxuanji/received_events", "repos_url": "https://api.github.com/users/hxuanji/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hxuanji/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hxuanji/subscriptions", "type": "User", "url": "https://api.github.com/users/hxuanji", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-04-22T02:30:37Z
2021-09-09T00:01:05Z
2014-04-23T10:47:03Z
NONE
resolved
Hi all, I'm using requests 2.2.1. I encounter a problem as follows. In ordinary situation, it all works fine. I use requests to do some communications with server and all the header I use is: ``` {"Connection": "close"} ``` But "occasionally" I got the exception as follows: ``` File "/share/HDA_DATA/package/lib/python2.7/site-packages/requests/api.py", line 88, in post return request('post', url, data=data, **kwargs) File "/share/HDA_DATA/package/lib/python2.7/site-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/share/HDA_DATA/package/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/share/HDA_DATA/package/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/share/HDA_DATA/package/lib/python2.7/site-packages/requests/adapters.py", line 378, in send raise ConnectionError(e) requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=9200): Max retries exceeded with url: /cate/main/3719e564ece2ad3a267ecac0f4d8db6e74333046/ (Caused by <class 'socket.error'>: [Errno 104] Connection reset by peer) ``` And then I check the server log, I found the error log: ``` handler.codec.frame.TooLongFrameException: HTTP header is larger than 8192 bytes. ``` It's strange, because the header is less than 8k apparently and this situation does not always happen. Does the requests append some extra http header? Any suggestions? cheers, Ivan
{ "avatar_url": "https://avatars.githubusercontent.com/u/4996057?v=4", "events_url": "https://api.github.com/users/hxuanji/events{/privacy}", "followers_url": "https://api.github.com/users/hxuanji/followers", "following_url": "https://api.github.com/users/hxuanji/following{/other_user}", "gists_url": "https://api.github.com/users/hxuanji/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hxuanji", "id": 4996057, "login": "hxuanji", "node_id": "MDQ6VXNlcjQ5OTYwNTc=", "organizations_url": "https://api.github.com/users/hxuanji/orgs", "received_events_url": "https://api.github.com/users/hxuanji/received_events", "repos_url": "https://api.github.com/users/hxuanji/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hxuanji/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hxuanji/subscriptions", "type": "User", "url": "https://api.github.com/users/hxuanji", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2012/reactions" }
https://api.github.com/repos/psf/requests/issues/2012/timeline
null
completed
null
null
false
[ "Requests adds a number of extra headers, but the only one that is plausibly going to be that long is cookies reflected back to the server.\n\nI recommend using Wireshark or tcpdump to determine which header the server is complaining about. \n", "Ok, I think this might not be related to requests module. Thanks.\n", "Sorry to bother again. I wanna know about what you said, does the requests add the cookie content to the header by its own? Did I misunderstand something?\n\nThanks.\n", "Requests adds any cookies that the server has sent that are applicable to the current request automatically.\n" ]
https://api.github.com/repos/psf/requests/issues/2011
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2011/labels{/name}
https://api.github.com/repos/psf/requests/issues/2011/comments
https://api.github.com/repos/psf/requests/issues/2011/events
https://github.com/psf/requests/issues/2011
31,903,837
MDU6SXNzdWUzMTkwMzgzNw==
2,011
Feature Request: Add timeout to session
{ "avatar_url": "https://avatars.githubusercontent.com/u/167414?v=4", "events_url": "https://api.github.com/users/ctheiss/events{/privacy}", "followers_url": "https://api.github.com/users/ctheiss/followers", "following_url": "https://api.github.com/users/ctheiss/following{/other_user}", "gists_url": "https://api.github.com/users/ctheiss/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ctheiss", "id": 167414, "login": "ctheiss", "node_id": "MDQ6VXNlcjE2NzQxNA==", "organizations_url": "https://api.github.com/users/ctheiss/orgs", "received_events_url": "https://api.github.com/users/ctheiss/received_events", "repos_url": "https://api.github.com/users/ctheiss/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ctheiss/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ctheiss/subscriptions", "type": "User", "url": "https://api.github.com/users/ctheiss", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2014-04-21T16:01:43Z
2021-08-31T00:06:54Z
2014-04-21T16:04:32Z
NONE
resolved
It seems that every optional parameter that goes into send is also settable on the Session (e.g. verify, stream, etc.). Except timeout. Would it be possible to change the behaviour so that timeout gets merged in from session just like all the other arguments?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/2011/reactions" }
https://api.github.com/repos/psf/requests/issues/2011/timeline
null
completed
null
null
false
[ "@ctheiss Hey, thanks for raising this issue!\n\nThis is something that's come up before, most recently in #1987 but also in #1130 and #1563 (all this year). Kenneth has generally expressed a lack of interest in making this change, expecting that it'd be done using Transport Adapters. If you're interested in getting some guidance as to how you'd do that, I'm happy to help, but I don't think we'll be adding `timeout` to the `Session`.\n\nSorry we can't be more helpful!\n", "Thanks for the ultra-quick response. I need to get better at searching through the issues, as the ones you mentioned are exact duplicates!\n\nIs the idea to implement `BaseAdapter` (or subclass `HTTPAdapter`), then associate the subclass with a session using `mount`? That seems oddly laborious just to implement \"default timeout\" (given how amazingly easy everything else is in requests).\n", "The idea would be to subclass `HTTPAdapter`. It's not very laborious, really, but the main reason we do it is because it maintains a conceptual distinction in the library. `Session` objects strictly should manage things about the session that about how HTTP itself works: cookies, headers etc. The transport adapters manage things that are about how network connections work: sockets, timeouts etc.\n", "To clarify, your recommendation is that people do this?\n\n``` python\nclass MyHTTPAdapter(requests.adapters.HTTPAdapter):\n def __init__(self, timeout=None, *args, **kwargs):\n self.timeout = timeout\n super(MyHTTPAdapter, self).__init__(*args, **kwargs)\n\n def send(self, *args, **kwargs):\n kwargs['timeout'] = self.timeout\n return super(MyHTTPAdapter, self).send(*args, **kwargs)\n\ns = requests.Session()\ns.mount(\"http://\", MyHTTPAdapter(timeout=10))\n```\n", "@staticshock that's an option. Yes\n", "But, more specifically, is it the _recommendation_?\n", "There is no specific recommendation because users needs will vary and there will be a multitude of ways to address this for those users.\n", "@staticshock @sigmavirus24 \r\n\r\nThat solution seems overly complicated. The following seems to work for me.\r\n\r\n```python\r\ns = requests.Session()\r\ns.get_orig, s.get = s.get, functools.partial(s.get, timeout=3)\r\n# this should now timeout\r\ns.get('https://httpbin.org/delay/6')\r\n# and this should succeed\r\ns.get_orig('https://httpbin.org/delay/6')\r\n```", "> \r\n> \r\n> @staticshock @sigmavirus24\r\n> \r\n> That solution seems overly complicated. The following seems to work for me.\r\n> \r\n> ```python\r\n> s = requests.Session()\r\n> s.get_orig, s.get = s.get, functools.partial(s.get, timeout=3)\r\n> # this should now timeout\r\n> s.get('https://httpbin.org/delay/6')\r\n> # and this should succeed\r\n> s.get_orig('https://httpbin.org/delay/6')\r\n> ```\r\n\r\nNote that you'll also need to apply this to the other verbs e.g. `post`, `put` etc. if you use those. 
For completeness here's what I am using:\r\n\r\n```python\r\nsession = requests.Session()\r\nfor method in ('get', 'options', 'head', 'post', 'put', 'patch', 'delete'):\r\n setattr(session, method, functools.partial(getattr(session, method), timeout=5))\r\n# All methods of session should now timeout after 5 seconds\r\n```", "Why not just shim the `request` method?\r\n```py\r\ns = requests.Session()\r\ns.request = functools.partial(s.request, timeout=3)\r\n# this should now timeout\r\ns.get('https://httpbin.org/delay/6')\r\n```", "If you need it globally at the module level and aren't using sessions, you could potentially do:\r\n\r\n`requests.api.request = functools.partial(requests.api.request, timeout=3)`\r\n\r\nI'm not sure how this plays with using it across multiple modules, I'd imagine you'd need the shim in each file where you're using requests." ]
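The adapter recipe in the thread above overrides the per-request timeout unconditionally. A small variation, sketched below under the same assumptions (none of this is part of requests itself), applies the session-wide default only when the caller did not pass an explicit `timeout`:

```python
# Variation on the adapter from the thread: a session-wide default timeout
# that still lets an explicit per-request timeout win. Not part of requests.
import requests
from requests.adapters import HTTPAdapter

class DefaultTimeoutAdapter(HTTPAdapter):
    def __init__(self, timeout=5, **kwargs):
        self.timeout = timeout
        super(DefaultTimeoutAdapter, self).__init__(**kwargs)

    def send(self, request, **kwargs):
        # Session.send passes timeout=None when the caller gave none;
        # only then do we substitute the session-wide default.
        if kwargs.get('timeout') is None:
            kwargs['timeout'] = self.timeout
        return super(DefaultTimeoutAdapter, self).send(request, **kwargs)

s = requests.Session()
s.mount('http://', DefaultTimeoutAdapter(timeout=3))
s.mount('https://', DefaultTimeoutAdapter(timeout=3))
# s.get('https://httpbin.org/delay/6')              # times out after ~3s
# s.get('https://httpbin.org/delay/6', timeout=10)  # explicit value still wins
```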
https://api.github.com/repos/psf/requests/issues/2010
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2010/labels{/name}
https://api.github.com/repos/psf/requests/issues/2010/comments
https://api.github.com/repos/psf/requests/issues/2010/events
https://github.com/psf/requests/pull/2010
31,808,782
MDExOlB1bGxSZXF1ZXN0MTQ5MjQyMjc=
2,010
[WIP] Update philosophy section
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "fef2c0", "default": false, "description": null, "id": 60669570, "name": "Please Review", "node_id": "MDU6TGFiZWw2MDY2OTU3MA==", "url": "https://api.github.com/repos/psf/requests/labels/Please%20Review" } ]
closed
true
null
[]
null
7
2014-04-18T16:30:32Z
2021-09-09T00:01:26Z
2014-04-21T16:25:01Z
MEMBER
resolved
As @kennethreitz and I discussed at PyCon, this is a first pass at a rewrite of part of the Philosophy section of the docs, attempting to explain Requests' slightly unusual management style. I'd like some feedback. Some ideas: - should @shazow be mentioned? I felt like he should but I wasn't sure how best to explain the relationship. - is there anything major that I've missed?
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2010/reactions" }
https://api.github.com/repos/psf/requests/issues/2010/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2010.diff", "html_url": "https://github.com/psf/requests/pull/2010", "merged_at": "2014-04-21T16:25:01Z", "patch_url": "https://github.com/psf/requests/pull/2010.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2010" }
true
[ "I'm like Requests' spouse.\n\nMrs. Requests.\n\n#jkpleasedontsaythat\n", "=D\n\nI want to mention you, because I feel like you don't get enough credit. =)\n", "❤\n", "FYI, I made these changes a few days ago: http://docs.python-requests.org/en/latest/dev/authors/#core-contributors\n", "@kennethreitz Yeah, I'm more interested in the description of responsibilities. =)\n", "We should also mention here that the project is under perpetual feature freeze except at my discretion.\n", "We can continue to iterate on this, but I'm merging for now. \n" ]
https://api.github.com/repos/psf/requests/issues/2009
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2009/labels{/name}
https://api.github.com/repos/psf/requests/issues/2009/comments
https://api.github.com/repos/psf/requests/issues/2009/events
https://github.com/psf/requests/pull/2009
31,805,603
MDExOlB1bGxSZXF1ZXN0MTQ5MjI0NTI=
2,009
Fix my name
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-04-18T15:36:05Z
2021-09-08T23:06:03Z
2014-04-18T16:13:17Z
CONTRIBUTOR
resolved
24 not 42 ;)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2009/reactions" }
https://api.github.com/repos/psf/requests/issues/2009/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2009.diff", "html_url": "https://github.com/psf/requests/pull/2009", "merged_at": "2014-04-18T16:13:17Z", "patch_url": "https://github.com/psf/requests/pull/2009.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2009" }
true
[ "10/10 would merge again. :cake:\n", "42/24 would PR again. :grinning: \n" ]
https://api.github.com/repos/psf/requests/issues/2008
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2008/labels{/name}
https://api.github.com/repos/psf/requests/issues/2008/comments
https://api.github.com/repos/psf/requests/issues/2008/events
https://github.com/psf/requests/issues/2008
31,783,205
MDU6SXNzdWUzMTc4MzIwNQ==
2,008
urllib3 has been updated to version 1.8.1, which supports "source_address". Could the requests lib support it too?
{ "avatar_url": "https://avatars.githubusercontent.com/u/7300843?v=4", "events_url": "https://api.github.com/users/bofortitude/events{/privacy}", "followers_url": "https://api.github.com/users/bofortitude/followers", "following_url": "https://api.github.com/users/bofortitude/following{/other_user}", "gists_url": "https://api.github.com/users/bofortitude/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/bofortitude", "id": 7300843, "login": "bofortitude", "node_id": "MDQ6VXNlcjczMDA4NDM=", "organizations_url": "https://api.github.com/users/bofortitude/orgs", "received_events_url": "https://api.github.com/users/bofortitude/received_events", "repos_url": "https://api.github.com/users/bofortitude/repos", "site_admin": false, "starred_url": "https://api.github.com/users/bofortitude/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bofortitude/subscriptions", "type": "User", "url": "https://api.github.com/users/bofortitude", "user_view_type": "public" }
[]
closed
true
null
[]
null
32
2014-04-18T06:07:16Z
2018-03-23T02:08:42Z
2014-04-18T09:34:09Z
NONE
off-topic
urllib3 has been updated to version 1.8.1, which supports "source_address" (on Python 2.7). Could you make the change needed to support it in requests? It's really useful and needed. I would appreciate it very much.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2008/reactions" }
https://api.github.com/repos/psf/requests/issues/2008/timeline
null
completed
null
null
false
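The comments that follow work through binding outgoing connections to a specific local IP via a transport adapter, including the (host, port)-tuple gotcha and the EADDRNOTAVAIL failure when the IP is not assigned to an interface. As a consolidated sketch of that approach (addresses are illustrative, and it uses the SourceAddressAdapter from requests-toolbelt that the thread links to):

```python
# Consolidated sketch of the approach discussed in the comments below:
# bind outgoing connections to a specific local IP. Addresses are
# illustrative; requires the requests-toolbelt package.
import requests
from requests_toolbelt.adapters.source import SourceAddressAdapter

s = requests.Session()
# The toolbelt adapter accepts a plain IP string; a (host, port) tuple
# such as ('10.10.10.10', 0) also works (port 0 = ephemeral port).
s.mount('http://', SourceAddressAdapter('10.10.10.10'))
s.mount('https://', SourceAddressAdapter('10.10.10.10'))

# The IP must actually be assigned to a local interface, otherwise binding
# fails with EADDRNOTAVAIL (the error traced at the end of the thread);
# on Linux something like `ip addr add 172.82.134.67/29 dev eno1` adds it.
```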
[ "Thanks for raising this issue!\n\nRequests doesn't plan to add this to the main API, it's simply not commonly used enough to justify the increased complexity. My recommendation is that you use our [Transport Adapter abstraction](http://docs.python-requests.org/en/latest/user/advanced/#transport-adapters) to provide the value. The example adapter that I linked to should provide enough of an example to demonstrate how this would work, but let me know if it didn't and I'll demonstrate.\n", "@Lukasa Thanks for your reply. But, what I want is to send http requests with specifying source address. I've no idea how to use the Transport Adapter abstraction to finish this. Maybe you can give an example how it does. \n", "Doing it at the per-request level is very difficult, but you can do it at either a global level or a per-host level.\n\nFor example:\n\n``` python\nimport requests\nfrom requests.adapters import HTTPAdapter\nfrom requests.packages.urllib3.poolmanager import PoolManager\n\n\nclass SourceAddressAdapter(HTTPAdapter):\n def __init__(self, source_address, **kwargs):\n self.source_address = source_address\n\n super(SourceAddressAdapter, self).__init__(**kwargs)\n\n def init_poolmanager(self, connections, maxsize, block=False):\n self.poolmanager = PoolManager(num_pools=connections,\n maxsize=maxsize,\n block=block,\n source_address=self.source_address)\n\ns = requests.Session()\ns.mount('http://', SourceAddressAdapter('10.10.10.10'))\ns.mount('https://', SourceAddressAdapter('12.12.12.12'))\ns.mount('http://github.com/', SourceAddressAdapter('120.120.120.120'))\n```\n\nThis code will set the source address to 10.10.10.10 for all HTTP traffic, and 12.12.12.12 for all HTTPS traffic except when sending to GitHub, when it'll use 120.120.120.120. Is this suitably useful?\n", "@Lukasa Great! That's what I want. After replacing with the new version urllib3, I did it! I hope there will be new version requests which embedded with new urllib3 as soon as possible. \nIt's so kind of you to show me this! Thanks you again. \n", "No problem, I'll make sure we update urllib3 before our next release. =)\n", "Should we move the SourceAddressAdapter into the toolbelt?\n", "Let's do it. =)\n", "PRs welcome. :)\n\nThinking we should have an adapters submodule for it. ;)\n", "@Lukasa \n\nthanks for the above dode\nyour code above have a small issue.\n\n```\nSourceAddressAdapter('10.10.10.10'))\n```\n\nthe source_address should be a pair (host, port).\n\nYou pass a string and this will cause an exception when the sock.bind method is called.\n\nThe correct way is to pass a tuple like:\n\n```\nSourceAddressAdapter( ('10.10.10.10', 0) )\n```\n\ncheers\n", "@gosom You're quite right: the version of the code in requests-toolbelt does not have this problem.\n", "@Lukasa \nI think the version of the code should not work with the current requrests version.\n\ncheck here:https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/util/connection.py#L77\n\nThe socket.bind method it is called and you get an exception if you pass a string.\n\nOk, this is not a bug but would be nice to be documented that source_address \nshould be a (host, post) tuple.\ncheers\n", "@gosom Please read my comment again. I said the version of the code in the requests toolbelt works correctly. That code is available [here](https://github.com/sigmavirus24/requests-toolbelt/blob/master/requests_toolbelt/adapters/source.py).\n", "@Lukasa \nyes you are right. 
I was not even aware of the request-toolbelt :+1: \n\nThanks for the reply\n", "it seems this method does not work with current version of requests, it reports \"socket.gaierror: [Errno -9] Address family for hostname not supported\" during binding. \r\n", "There is no reason to assume that this has become broken. Can you show an example of your code please?", "```\r\nclass SourceAddressAdapter(HTTPAdapter):\r\n def __init__(self, source_address, **kwargs):\r\n self.source_address = source_address\r\n super(SourceAddressAdapter, self).__init__(**kwargs)\r\n\r\n def init_poolmanager(self, connections, maxsize, block=False):\r\n self.poolmanager = PoolManager(num_pools=connections,\r\n maxsize=maxsize,\r\n block=block,\r\n source_address=self.source_address)\r\n\r\n\r\ndef prepare_sessions(ip):\r\n sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\r\n s = requests.session()\r\n s.mount(\"http://\", SourceAddressAdapter((str(ip), 0)))\r\n s.mount(\"https://\", SourceAddressAdapter((str(ip), 0)))\r\n return s \r\n\r\ndef test(s): \r\n print(\"{}\".format(s.get(\"http://bot.whatismyipaddress.com\").text))\r\n\r\n\r\n\r\nif __name__ == \"__main__\": \r\n s = prepare_sessions(sys.argv[1]) \r\n test(s) \r\n```", "Can you also show me the full traceback and the value of `sys.argv[1]` in this case?", "anything wrong with the code? I have a server with 5 ips, and when I use primary ip, it works fine, but when I tried with other ips, it reports error as below \r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/lib/python3/dist-packages/urllib3/connection.py\", line 137, in _new_conn\r\n (self.host, self.port), self.timeout, **extra_kw)\r\n File \"/usr/lib/python3/dist-packages/urllib3/util/connection.py\", line 91, in create_connection\r\n raise err\r\n File \"/usr/lib/python3/dist-packages/urllib3/util/connection.py\", line 80, in create_connection\r\n sock.bind(source_address)\r\nsocket.gaierror: [Errno -9] Address family for hostname not supported\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 560, in urlopen\r\n body=body, headers=headers)\r\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 354, in _make_request\r\n conn.request(method, url, **httplib_request_kw)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1106, in request\r\n self._send_request(method, url, body, headers)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1151, in _send_request\r\n self.endheaders(body)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1102, in endheaders\r\n self._send_output(message_body)\r\n File \"/usr/lib/python3.5/http/client.py\", line 934, in _send_output\r\n self.send(msg)\r\n File \"/usr/lib/python3.5/http/client.py\", line 877, in send\r\n self.connect()\r\n File \"/usr/lib/python3/dist-packages/urllib3/connection.py\", line 162, in connect\r\n conn = self._new_conn()\r\n File \"/usr/lib/python3/dist-packages/urllib3/connection.py\", line 146, in _new_conn\r\n self, \"Failed to establish a new connection: %s\" % e)\r\nrequests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7fba0cccb668>: Failed to establish a new connection: [Errno -9] Address family for hostname not supported\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File 
\"/usr/lib/python3/dist-packages/requests/adapters.py\", line 376, in send\r\n timeout=timeout\r\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 610, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"/usr/lib/python3/dist-packages/urllib3/util/retry.py\", line 273, in increment\r\n raise MaxRetryError(_pool, url, error or ResponseError(cause))\r\nrequests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='bot.whatismyipaddress.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fba0cccb668>: Failed to establish a new connection: [Errno -9] Address family for hostname not supported',))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"test.py\", line 58, in <module>\r\n test(s) \r\n File \"test.py\", line 52, in test\r\n print(\"{}\".format(s.get(\"http://bot.whatismyipaddress.com\").text))\r\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 480, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 468, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 576, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/requests/adapters.py\", line 437, in send\r\n raise ConnectionError(e, request=request)\r\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='bot.whatismyipaddress.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fba0cccb668>: Failed to establish a new connection: [Errno -9] Address family for hostname not supported',))\r\n``` \r\n\r\nthe value passed is an ip string such as 172.82.134.69 ", "the same piece of code worked one and half years ago, I just copied over, but it seems not working, maybe I missed something? ", "So my best guess is that this is going to be related to IP address families: specifically, IPv4 and IPv6. `bot.whatismyipaddress.com` has an IPv6 address, which is probably why you're seeing this error.\r\n\r\nHowever, this error should only appear if all connection attempts failed, and presumably the IPv4 ones would not. So it'd be interesting to know why the other connection attempts fail. Mind using `strace` to trace the various socket syscalls to see what's happening?", "the related error is \r\n```\r\nbind(3, {sa_family=AF_INET, sin_port=htons(0), sin_addr=inet_addr(\"172.82.134.67\")}, 16) = -1 EADDRNOTAVAIL (Cannot assign requested address)\r\n``` \r\ndoes that mean the server cannot use the ip? ", "Certainly suggests so. Are you sure that IP is present on your machine?", "I rented the dedicated server with 5 ips, (4 usable), the ips are shown in control panel, how can I check in the system? Really appreciate your help! 
", "`ifconfig` should show you.", "it looks like this \r\n```\r\neno1 Link encap:Ethernet HWaddr 00:8c:fa:05:21:c4 \r\n inet addr:172.82.134.66 Bcast:172.82.134.71 Mask:255.255.255.248\r\n inet6 addr: fe80::28c:faff:fe05:21c4/64 Scope:Link\r\n UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1\r\n RX packets:6636840 errors:0 dropped:0 overruns:0 frame:0\r\n TX packets:5852373 errors:0 dropped:0 overruns:0 carrier:0\r\n collisions:0 txqueuelen:1000 \r\n RX bytes:4217736834 (4.2 GB) TX bytes:850693240 (850.6 MB)\r\n Memory:c0000000-c001ffff \r\n``` \r\nthe mask seems correct, how should I interpret this? ", "No, your address is .66, and you put .67 above.", "The mask just indicates what your machine believes its subnet is, not what addresses it believes is assigned to it.", "but I have /29 subnet and it has additional ips range from 172.82.134.67 to 172.82.134.69, how can I use the ips? ", "You need to tell your machine you own them. `ip addr add 172.82.134.67` should work.", "Great! It works now! Thank you for helping! " ]
https://api.github.com/repos/psf/requests/issues/2007
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2007/labels{/name}
https://api.github.com/repos/psf/requests/issues/2007/comments
https://api.github.com/repos/psf/requests/issues/2007/events
https://github.com/psf/requests/issues/2007
31,683,974
MDU6SXNzdWUzMTY4Mzk3NA==
2,007
Update PyPi to v2.3.0
{ "avatar_url": "https://avatars.githubusercontent.com/u/310419?v=4", "events_url": "https://api.github.com/users/kromped/events{/privacy}", "followers_url": "https://api.github.com/users/kromped/followers", "following_url": "https://api.github.com/users/kromped/following{/other_user}", "gists_url": "https://api.github.com/users/kromped/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kromped", "id": 310419, "login": "kromped", "node_id": "MDQ6VXNlcjMxMDQxOQ==", "organizations_url": "https://api.github.com/users/kromped/orgs", "received_events_url": "https://api.github.com/users/kromped/received_events", "repos_url": "https://api.github.com/users/kromped/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kromped/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kromped/subscriptions", "type": "User", "url": "https://api.github.com/users/kromped", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-16T22:01:45Z
2021-09-09T00:01:07Z
2014-04-16T22:04:10Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2007/reactions" }
https://api.github.com/repos/psf/requests/issues/2007/timeline
null
completed
null
null
false
[ "2.3.0 does not exist yet: see #1968. It's a mismatch in the docs. =)\n" ]
https://api.github.com/repos/psf/requests/issues/2006
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2006/labels{/name}
https://api.github.com/repos/psf/requests/issues/2006/comments
https://api.github.com/repos/psf/requests/issues/2006/events
https://github.com/psf/requests/issues/2006
31,674,336
MDU6SXNzdWUzMTY3NDMzNg==
2,006
Feature request: automatic auth type detection
{ "avatar_url": "https://avatars.githubusercontent.com/u/837573?v=4", "events_url": "https://api.github.com/users/untitaker/events{/privacy}", "followers_url": "https://api.github.com/users/untitaker/followers", "following_url": "https://api.github.com/users/untitaker/following{/other_user}", "gists_url": "https://api.github.com/users/untitaker/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/untitaker", "id": 837573, "login": "untitaker", "node_id": "MDQ6VXNlcjgzNzU3Mw==", "organizations_url": "https://api.github.com/users/untitaker/orgs", "received_events_url": "https://api.github.com/users/untitaker/received_events", "repos_url": "https://api.github.com/users/untitaker/repos", "site_admin": false, "starred_url": "https://api.github.com/users/untitaker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/untitaker/subscriptions", "type": "User", "url": "https://api.github.com/users/untitaker", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2014-04-16T20:01:08Z
2021-09-08T23:11:02Z
2014-06-08T09:49:14Z
CONTRIBUTOR
resolved
requests could figure out whether it needs to use `digest` or `basic` auth by inspecting the `WWW-Authenticate` header.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2006/reactions" }
https://api.github.com/repos/psf/requests/issues/2006/timeline
null
completed
null
null
false
[ "This is an interesting idea @untitaker! It does feel like it'd be useful.\n\nI wonder if we should trial it in [the toolbelt](https://github.com/sigmavirus24/requests-toolbelt) first, see if it's useful. @sigmavirus24?\n", "IIRC you said (in IRC or so) that request sends the auth headers preemptively, which means in this case that we never get the chance to inspect the relevant header. I think that's something that should be addressed before implementing the feature i am requesting anywhere.\n", "@untitaker That's not entirely true. It's what we do for Basic auth, but it's not what we do for Digest. And the fact that we do it pre-emptively for Basic is very deliberate: GitHub's API requires it. =)\n", "Okay, in that case i can't do more than retract that comment :)\n", "Haha, that's fine! The general idea is a good one though, we should have it somewhere.\n", "I wonder if this would fit in with https://github.com/sigmavirus24/requests-toolbelt/pull/21\n\nThe purpose was more to manage authentication credentials for different domains but we could probably also add this case. I'd also be happy to trial this as a separate object.\n", "@sigmavirus24\nI had a similar usage to `AuthBase` in mind:\n\n```\nrequests.get('...', auth=guess(username, password))\n```\n\nWhere GuessAuth is something like this:\n\n```\ndef guess(username, password):\n def inner(request):\n # hit network to determine auth type\n # we actually need the request object for this\n if bla:\n auth = HTTPDigestAuth(username, password)\n else:\n auth = HTTPBasicAuth(username, password)\n return auth(request)\n\n return inner\n```\n\nI don't think it would be nice to blend into `AuthHandler`, as the auth type\nto use can vary within a domain.\n", "Maybe the relevant parts of the guessing should be moved into another function which just returns a subclass of `AuthBase`, and not an instance (for advanced caching purposes). And if that is done, we can think about coupling AuthHandler with it.\n", "This is being handled at the toolbelt, so I'm closing this.\n" ]
https://api.github.com/repos/psf/requests/issues/2005
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2005/labels{/name}
https://api.github.com/repos/psf/requests/issues/2005/comments
https://api.github.com/repos/psf/requests/issues/2005/events
https://github.com/psf/requests/pull/2005
31,650,540
MDExOlB1bGxSZXF1ZXN0MTQ4MjgyNTA=
2,005
SVG logo version
{ "avatar_url": "https://avatars.githubusercontent.com/u/1292133?v=4", "events_url": "https://api.github.com/users/ap-Codkelden/events{/privacy}", "followers_url": "https://api.github.com/users/ap-Codkelden/followers", "following_url": "https://api.github.com/users/ap-Codkelden/following{/other_user}", "gists_url": "https://api.github.com/users/ap-Codkelden/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ap-Codkelden", "id": 1292133, "login": "ap-Codkelden", "node_id": "MDQ6VXNlcjEyOTIxMzM=", "organizations_url": "https://api.github.com/users/ap-Codkelden/orgs", "received_events_url": "https://api.github.com/users/ap-Codkelden/received_events", "repos_url": "https://api.github.com/users/ap-Codkelden/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ap-Codkelden/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ap-Codkelden/subscriptions", "type": "User", "url": "https://api.github.com/users/ap-Codkelden", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-04-16T15:14:41Z
2021-09-08T23:06:08Z
2014-04-16T19:37:29Z
CONTRIBUTOR
resolved
SVG version of the Requests logo, for users who do not have the Palatino-Roman font (and the like) installed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2005/reactions" }
https://api.github.com/repos/psf/requests/issues/2005/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2005.diff", "html_url": "https://github.com/psf/requests/pull/2005", "merged_at": "2014-04-16T19:37:29Z", "patch_url": "https://github.com/psf/requests/pull/2005.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2005" }
true
[ "Hooray! :+1: :cake:\n", ":sparkles: :typeface: :sparkles:\n", ":koala:\n" ]
https://api.github.com/repos/psf/requests/issues/2004
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2004/labels{/name}
https://api.github.com/repos/psf/requests/issues/2004/comments
https://api.github.com/repos/psf/requests/issues/2004/events
https://github.com/psf/requests/issues/2004
31,607,340
MDU6SXNzdWUzMTYwNzM0MA==
2,004
ResourceWarnings
{ "avatar_url": "https://avatars.githubusercontent.com/u/681260?v=4", "events_url": "https://api.github.com/users/giampaolo/events{/privacy}", "followers_url": "https://api.github.com/users/giampaolo/followers", "following_url": "https://api.github.com/users/giampaolo/following{/other_user}", "gists_url": "https://api.github.com/users/giampaolo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/giampaolo", "id": 681260, "login": "giampaolo", "node_id": "MDQ6VXNlcjY4MTI2MA==", "organizations_url": "https://api.github.com/users/giampaolo/orgs", "received_events_url": "https://api.github.com/users/giampaolo/received_events", "repos_url": "https://api.github.com/users/giampaolo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/giampaolo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/giampaolo/subscriptions", "type": "User", "url": "https://api.github.com/users/giampaolo", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-04-16T02:13:50Z
2021-09-09T00:01:08Z
2014-04-16T10:08:26Z
NONE
resolved
``` $ python3.4 -W "a" -c "import requests; requests.get('http://google.com')" -c:1: ResourceWarning: unclosed <socket.socket fd=5, family=AddressFamily.AF_INET, type=SocketType.SOCK_STREAM, proto=6, laddr=('192.168.1.2', 60578), raddr=('173.194.35.151', 80)> -c:1: ResourceWarning: unclosed <socket.socket fd=4, family=AddressFamily.AF_INET, type=SocketType.SOCK_STREAM, proto=6, laddr=('192.168.1.2', 36102), raddr=('173.194.35.46', 80)> ``` Same thing happens if I close() the object returned by request.get(). Unfortunately the traceback is not very helpful for determining _where_ exactly this occurs.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2004/reactions" }
https://api.github.com/repos/psf/requests/issues/2004/timeline
null
completed
null
null
false
[ "@giampaolo This is an artifact of our connection pooling. The socket objects aren't being explicitly closed because reusing them is cheaper than making new syscalls and reassembling the object. This is tracked in #1882.\n", "Is there any way to tell requests/urllib3 to close the socket pool? \nNote: I need this in tests only, just to avoid the annoying warning lines to be printed.\n", "Again, if you check #1882 you'll see that you can call `.close()` on the pool. Try this:\n\n``` python\nfor pool_manager in session.adapters.values():\n pool_manager.clear()\n```\n", "Oh right, sorry. Thanks for chiming in.\n" ]
https://api.github.com/repos/psf/requests/issues/2003
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2003/labels{/name}
https://api.github.com/repos/psf/requests/issues/2003/comments
https://api.github.com/repos/psf/requests/issues/2003/events
https://github.com/psf/requests/issues/2003
31,525,291
MDU6SXNzdWUzMTUyNTI5MQ==
2,003
auth kwarg cannot be used to authenticate an HTTPS request through a HTTP proxy
{ "avatar_url": "https://avatars.githubusercontent.com/u/638?v=4", "events_url": "https://api.github.com/users/pjjw/events{/privacy}", "followers_url": "https://api.github.com/users/pjjw/followers", "following_url": "https://api.github.com/users/pjjw/following{/other_user}", "gists_url": "https://api.github.com/users/pjjw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pjjw", "id": 638, "login": "pjjw", "node_id": "MDQ6VXNlcjYzOA==", "organizations_url": "https://api.github.com/users/pjjw/orgs", "received_events_url": "https://api.github.com/users/pjjw/received_events", "repos_url": "https://api.github.com/users/pjjw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pjjw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pjjw/subscriptions", "type": "User", "url": "https://api.github.com/users/pjjw", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
7
2014-04-15T04:18:58Z
2021-09-08T13:05:39Z
2016-12-17T17:00:48Z
NONE
resolved
This is probably more of a rant than an issue, but that was 3 hours I won't get back. I was trying to pipe requests through a Squid proxy in a lights-out environment. Credentials come from a data store separate from configuration, so it seemed reasonable to use requests.auth.HTTPProxyAuth(l, p) as the auth kwarg, as (what I thought was) a supplement to the proxies dict fed to the proxies kwarg. This works fine for HTTP requests, but that seems to be by accident: it falls apart when making an HTTPS request through a proxy, where urllib3 uses the CONNECT verb to encapsulate the request. Basically, requests.auth.HTTPProxyAuth doesn't make it clear that it is not the correct (always-working) way to authenticate to a proxy, and that you must instead put the credentials into the proxy URL in the proxies dict. Deprecating or removing that class would probably make the interface cleaner; failing that, a note in the source would be nice.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2003/reactions" }
https://api.github.com/repos/psf/requests/issues/2003/timeline
null
completed
null
null
false
[ "Yeah, agreed: I think we should aim to remove `HTTPProxyAuth` in version 3.0.\n", ":+1: for 3.0 removal.\n", "Are we still in agreement that this should be on it's way?", "@nateprewitt I am in agreement, but I would be unsurprised if Cory changed his mind. Since he's on holiday, why don't you hold off on working on this so you don't waste your time.", "Yep, will do. Just floating it out there for when you've both got a free moment. Thanks @sigmavirus24 :)", "My mind isn't changed at this time. Let's kill it. =D", "This will be resolved in 3.0.0 with #3773." ]
https://api.github.com/repos/psf/requests/issues/2002
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2002/labels{/name}
https://api.github.com/repos/psf/requests/issues/2002/comments
https://api.github.com/repos/psf/requests/issues/2002/events
https://github.com/psf/requests/issues/2002
31,505,086
MDU6SXNzdWUzMTUwNTA4Ng==
2,002
bool(failure response) is False
{ "avatar_url": "https://avatars.githubusercontent.com/u/98610?v=4", "events_url": "https://api.github.com/users/slinkp/events{/privacy}", "followers_url": "https://api.github.com/users/slinkp/followers", "following_url": "https://api.github.com/users/slinkp/following{/other_user}", "gists_url": "https://api.github.com/users/slinkp/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/slinkp", "id": 98610, "login": "slinkp", "node_id": "MDQ6VXNlcjk4NjEw", "organizations_url": "https://api.github.com/users/slinkp/orgs", "received_events_url": "https://api.github.com/users/slinkp/received_events", "repos_url": "https://api.github.com/users/slinkp/repos", "site_admin": false, "starred_url": "https://api.github.com/users/slinkp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/slinkp/subscriptions", "type": "User", "url": "https://api.github.com/users/slinkp", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
open
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
20
2014-04-14T21:09:55Z
2016-04-15T21:39:43Z
null
NONE
null
This is rather surprising, and not documented that I've seen (though I could certainly have missed it):

```
>>> r = requests.request('get', 'http://google.com/aopsdufsaf')
>>> r
<Response [404]>
>>> bool(r)
False
```

To me, "failure = false" is neither intuitive nor expected.
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2002/reactions" }
https://api.github.com/repos/psf/requests/issues/2002/timeline
null
null
null
null
false
[ "+1\n\nA common pattern is\n\n``` python\nresp = None\n\ntry:\n resp = requests.get(...)\nexcept:\n log.exception(\"ruh roh\")\n\nif resp:\n print(\"we got somethin'!\")\n handle_resp(resp)\n```\n\nThe current implementation of `__bool__` makes this intuitive pattern very tricky when writing code that accounts for erroneous status codes.\n", "Hey @slinkp thanks for opening this!\n\nIn this case, what you're looking for is an explicit attribute: `r.ok`\n\nThe fact of the matter is that this works on Python 3 because we define `__bool__`. If we define `__nonzero__` on Python 2 this would work there as well. That said, I think you should be using the `r.ok` pattern anyway. It's a far better pattern personally.\n\n@jamesob you're getting exactly what you ask for in that case, sorry to say it. You did get something if `resp` is not `None`. If you want a \"good\" response that also isn't None, you should be explicit about it:\n\n``` python\nif resp and resp.ok:\n print(\"we got something!\")\n handle_resp(resp)\n```\n\nEven if the `__nonzero__` bug is fixed, this is still far more obvious to anyone who is going to come along and read your code that you're not only expecting a non-`None` value but also a `2xx` response.\n", "Wait never mind, we do implement `__nonzero__`. So `bool(resp)` should work just fine. Investigating.\n", "@sigmavirus24 heh, appears this issue has just bitten you too. \n\n> You did get something if resp is not None.\n\nisn't true given the current implementation of `__nonzero__` (which kicks to `ok`), which is the point I was trying to get across.\n", "``` pycon\n>>> import requests\nr>>> r = requests.get('https://api.github.com/user')\n>>> r\n<Response [401]>\n>>> bool(r)\nFalse\n>>> r = requests.get('http://madisonpl.us/rubby')\n>>> r\n<Response [404]>\n>>> bool(r)\nFalse\n```\n\nWhat version of requests are both of you on?\n", "Oh, I misread the issue. >_< Yeah that's something I disagree with :). I bet it was someone else's feature request and not something we can \"fix\" until requests 3.\n", "``` python\nIn [3]: r = requests.get('http://google.com/awefawefae')\n\nIn [4]: r\nOut[4]: <Response [404]>\n\nIn [5]: if r:\n ...: print \"yo\"\n ...: \n\nIn [6]: requests.__version__\nOut[6]: '2.1.0'\n```\n", "@jamesob yeah I asked that because I thought I read that `bool(resp) is True` in the original issue. I've had too much caffeine to deal with the sleep deprivation caused by PyCon. =D\n", "haha, I hear ya. Anyways, would be awesome to get this ironed out.\n", "Yeah I figured this would break backward compatibility... somebody somewhere is surely depending on the current behavior. I just really dislike it :)\n\nThanks @sigmavirus24 \n", "Yeah I'm :+1: for removing this. @Lukasa thoughts? \n\nI know it would have to wait for requests 3.0 but it might still be good to have a wishlist for 3.0 \n", "I just chatted with Kenneth about this, he definitely doesn't like it. I think we should be leaving this on a wishlist for 3.0, but @kennethreitz might disagree.\n", "I know Semantic Versioning isn't a hard rule but I really really really really would like it if it were =P\n", "as the issue reporter, I would chime in to say semver++ ... 
don't break\nthings for people who rely on this behavior even if I hate it.\n\nOn Mon, Apr 14, 2014 at 5:56 PM, Ian Cordasco [email protected]:\n\n> I know Semantic Versioning isn't a hard rule but I really really really\n> really would like it if it were =P\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/2002#issuecomment-40423225\n> .\n\n## \n\nhttp://www.slinkp.com\n", "semver++ :)\n", "I think this is an irrelevant design decision that should really never be relied on either way. In my opinion neither `__nonzero__` nor `__bool__` should have been implemented for response objects in the first place.\n\n@jamesob In your first code snippet, you shouldn't be using None as a boolean anyways. Write `if resp is None:` instead and that pattern works fine. (Also, just FYI, `except:` should always be `except Exception:`, otherwise you catch things like KeyboardInterrupt which is baaaaaaad.)\n\nThe truthiness of an HTTP response is a very ambiguous concept. Instead, just use `response.ok` as was suggested earlier.\n\n> Explicit is better than implicit.\n> — Tim Peters\n", "@fletom I'd like to remind you to [be cordial](http://www.kennethreitz.org/essays/be-cordial-or-be-on-your-way). The code snippets provided for this issue likely do no represent code actually copied and pasted from production code. They merely serve as examples to illustrate a point. There's no need to teach anyone about bad practices. All you needed to do was voice an opinion relevant to the discussion of what `bool(response)` would return.\n", "@sigmavirus24 Sorry my comment came across as negative. I only meant to suggest a way that that common pattern can work with requests' current design. As KR suggested, the intention was to be educational/constructive and not insulting.\n", "This was one of the earliest features of requests and should def be removed :)\n", "Yep that's why it's lined up for 3.0 =D \n" ]
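A short sketch of the explicit checks suggested in the thread above, instead of relying on `bool(response)`; the URL is a placeholder.

``` python
import requests

resp = requests.get("http://example.org/this-page-does-not-exist")

# Explicit success test: resp.ok is True only for status codes below 400.
if resp.ok:
    print("success:", resp.status_code)
else:
    print("failure:", resp.status_code)

# Or turn 4xx/5xx responses into exceptions instead of branching:
try:
    resp.raise_for_status()
except requests.exceptions.HTTPError as exc:
    print("request failed:", exc)
```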
https://api.github.com/repos/psf/requests/issues/2001
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2001/labels{/name}
https://api.github.com/repos/psf/requests/issues/2001/comments
https://api.github.com/repos/psf/requests/issues/2001/events
https://github.com/psf/requests/issues/2001
31,429,074
MDU6SXNzdWUzMTQyOTA3NA==
2,001
Pull latest [urllib3] to fix HTTPSConnection Redirect Loop on AppEngine
{ "avatar_url": "https://avatars.githubusercontent.com/u/423702?v=4", "events_url": "https://api.github.com/users/soelinn/events{/privacy}", "followers_url": "https://api.github.com/users/soelinn/followers", "following_url": "https://api.github.com/users/soelinn/following{/other_user}", "gists_url": "https://api.github.com/users/soelinn/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/soelinn", "id": 423702, "login": "soelinn", "node_id": "MDQ6VXNlcjQyMzcwMg==", "organizations_url": "https://api.github.com/users/soelinn/orgs", "received_events_url": "https://api.github.com/users/soelinn/received_events", "repos_url": "https://api.github.com/users/soelinn/repos", "site_admin": false, "starred_url": "https://api.github.com/users/soelinn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/soelinn/subscriptions", "type": "User", "url": "https://api.github.com/users/soelinn", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-14T01:15:11Z
2021-09-08T23:11:00Z
2014-06-08T10:02:36Z
NONE
resolved
HTTPSConnection is going out as HTTP on AppEngine 1.9.0, causing a redirect loop error. Please pull the latest version of [urllib3]. Here is the urllib3 commit that fixes this issue: https://github.com/shazow/urllib3/commit/627fc7a897e8e52b59d8ca2eb3d5ebccbe66ac7a
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2001/reactions" }
https://api.github.com/repos/psf/requests/issues/2001/timeline
null
completed
null
null
false
[ "This is resolved and should be in 2.3.0.\n" ]
https://api.github.com/repos/psf/requests/issues/2000
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2000/labels{/name}
https://api.github.com/repos/psf/requests/issues/2000/comments
https://api.github.com/repos/psf/requests/issues/2000/events
https://github.com/psf/requests/issues/2000
31,425,191
MDU6SXNzdWUzMTQyNTE5MQ==
2,000
Requests.get() will void SIGINT if it's in a thread
{ "avatar_url": "https://avatars.githubusercontent.com/u/1433949?v=4", "events_url": "https://api.github.com/users/steven-shi/events{/privacy}", "followers_url": "https://api.github.com/users/steven-shi/followers", "following_url": "https://api.github.com/users/steven-shi/following{/other_user}", "gists_url": "https://api.github.com/users/steven-shi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/steven-shi", "id": 1433949, "login": "steven-shi", "node_id": "MDQ6VXNlcjE0MzM5NDk=", "organizations_url": "https://api.github.com/users/steven-shi/orgs", "received_events_url": "https://api.github.com/users/steven-shi/received_events", "repos_url": "https://api.github.com/users/steven-shi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/steven-shi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/steven-shi/subscriptions", "type": "User", "url": "https://api.github.com/users/steven-shi", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-13T21:58:29Z
2021-09-09T00:01:08Z
2014-04-13T22:30:14Z
NONE
resolved
Hi, I would like to have a gentle way to shut down the Thread by hooking SIGINT, but I noticed that once Requests is used in a Thread, it voids the signal handler. Please test the following example; you will notice you cannot shut down with CTRL+C even though SIGINT has been hooked.

``` python
import requests
import signal
import time
from threading import Thread
import sys

def shutdownHandler(signal, frame):
    print "Shutting down..."
    sys.exit(0)

def test_load():
    x = requests.get('http://www.google.com')
    x.close()
    time.sleep(200000)

if __name__ == '__main__':
    signal.signal(signal.SIGINT, shutdownHandler)
    t = Thread(target=test_load)
    t.start()
    signal.pause()
    t.join()
```

If you comment out

``` python
x = requests.get('http://www.google.com')
x.close()
```

then run again and press Ctrl+C, the process will shut down. Any idea what's going on in Requests? Thanks a lot
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2000/reactions" }
https://api.github.com/repos/psf/requests/issues/2000/timeline
null
completed
null
null
false
[ "Thanks for raising this issue!\n\nRequests uses httplib under the covers, which is known to have the problem of consuming SIGINT. You can see this [here](http://cuddihyd.bitbucket.org/tm/python-ctrl-c.html). Unfortunately there's nothing we can do about this directly, you'd need to fix the bug in Python itself. This bug _is_ fixed in Python 3.4.\n" ]
https://api.github.com/repos/psf/requests/issues/1999
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1999/labels{/name}
https://api.github.com/repos/psf/requests/issues/1999/comments
https://api.github.com/repos/psf/requests/issues/1999/events
https://github.com/psf/requests/pull/1999
31,388,342
MDExOlB1bGxSZXF1ZXN0MTQ2ODc2MTA=
1,999
Font layer in logo convert to curves
{ "avatar_url": "https://avatars.githubusercontent.com/u/1292133?v=4", "events_url": "https://api.github.com/users/ap-Codkelden/events{/privacy}", "followers_url": "https://api.github.com/users/ap-Codkelden/followers", "following_url": "https://api.github.com/users/ap-Codkelden/following{/other_user}", "gists_url": "https://api.github.com/users/ap-Codkelden/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ap-Codkelden", "id": 1292133, "login": "ap-Codkelden", "node_id": "MDQ6VXNlcjEyOTIxMzM=", "organizations_url": "https://api.github.com/users/ap-Codkelden/orgs", "received_events_url": "https://api.github.com/users/ap-Codkelden/received_events", "repos_url": "https://api.github.com/users/ap-Codkelden/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ap-Codkelden/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ap-Codkelden/subscriptions", "type": "User", "url": "https://api.github.com/users/ap-Codkelden", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
null
[]
null
5
2014-04-12T18:21:34Z
2021-09-08T22:01:14Z
2014-04-16T14:56:44Z
CONTRIBUTOR
resolved
Some users may not have the Palatino-Roman font, but with the text converted to curves they can still use the logo.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1999/reactions" }
https://api.github.com/repos/psf/requests/issues/1999/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1999.diff", "html_url": "https://github.com/psf/requests/pull/1999", "merged_at": "2014-04-16T14:56:44Z", "patch_url": "https://github.com/psf/requests/pull/1999.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1999" }
true
[ "This looks great to me, thanks! :cake: @kennethreitz, are you happy with this?\n", "Sorry, wrong answer in gmail. Thanks for approve!\n\n2014-04-12 21:37 GMT+03:00 Cory Benfield [email protected]:\n\n> This looks great to me, thanks! [image: :cake:]@kennethreitzhttps://github.com/kennethreitz,\n> are you happy with this?\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1999#issuecomment-40288137\n> .\n\n## \n\nNasridinov Renat, [email protected]\n", ":cake:\n", "Actually I'm going to revert this. Let's just add an SVG for that. I want a \"source\" version for manipulation.\n", "Revert, I agree with you, and I'll make SVG now.\n" ]
https://api.github.com/repos/psf/requests/issues/1998
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1998/labels{/name}
https://api.github.com/repos/psf/requests/issues/1998/comments
https://api.github.com/repos/psf/requests/issues/1998/events
https://github.com/psf/requests/pull/1998
31,212,137
MDExOlB1bGxSZXF1ZXN0MTQ1ODU2Nzk=
1,998
Fix typo
{ "avatar_url": "https://avatars.githubusercontent.com/u/66017?v=4", "events_url": "https://api.github.com/users/kapyshin/events{/privacy}", "followers_url": "https://api.github.com/users/kapyshin/followers", "following_url": "https://api.github.com/users/kapyshin/following{/other_user}", "gists_url": "https://api.github.com/users/kapyshin/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kapyshin", "id": 66017, "login": "kapyshin", "node_id": "MDQ6VXNlcjY2MDE3", "organizations_url": "https://api.github.com/users/kapyshin/orgs", "received_events_url": "https://api.github.com/users/kapyshin/received_events", "repos_url": "https://api.github.com/users/kapyshin/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kapyshin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kapyshin/subscriptions", "type": "User", "url": "https://api.github.com/users/kapyshin", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-04-10T03:11:30Z
2021-09-08T22:01:15Z
2014-04-10T05:46:05Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1998/reactions" }
https://api.github.com/repos/psf/requests/issues/1998/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1998.diff", "html_url": "https://github.com/psf/requests/pull/1998", "merged_at": "2014-04-10T05:46:05Z", "patch_url": "https://github.com/psf/requests/pull/1998.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1998" }
true
[ "Thanks for this! :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/1997
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1997/labels{/name}
https://api.github.com/repos/psf/requests/issues/1997/comments
https://api.github.com/repos/psf/requests/issues/1997/events
https://github.com/psf/requests/issues/1997
31,173,167
MDU6SXNzdWUzMTE3MzE2Nw==
1,997
boundary for custom 'Content-Type: multipart/form-data' header
{ "avatar_url": "https://avatars.githubusercontent.com/u/319736?v=4", "events_url": "https://api.github.com/users/jcfrank/events{/privacy}", "followers_url": "https://api.github.com/users/jcfrank/followers", "following_url": "https://api.github.com/users/jcfrank/following{/other_user}", "gists_url": "https://api.github.com/users/jcfrank/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jcfrank", "id": 319736, "login": "jcfrank", "node_id": "MDQ6VXNlcjMxOTczNg==", "organizations_url": "https://api.github.com/users/jcfrank/orgs", "received_events_url": "https://api.github.com/users/jcfrank/received_events", "repos_url": "https://api.github.com/users/jcfrank/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jcfrank/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jcfrank/subscriptions", "type": "User", "url": "https://api.github.com/users/jcfrank", "user_view_type": "public" }
[]
closed
true
null
[]
null
14
2014-04-09T16:35:42Z
2021-08-31T00:06:42Z
2014-06-08T10:03:37Z
NONE
resolved
# Test Script

Send a multi-part POST file request with a custom boundary.

```
import requests
import os.path

filepath = '/tmp/test.tgz'
with open(filepath, 'rb') as f:
    files = {'file': f}
    headers = {'Content-Type': 'multipart/form-data;boundary=***someboundary***'}
    r = requests.post('http://127.0.0.1:56500/', files=files, headers=headers)
```

# Actual Result

```
$ nc -p 56500 -l
POST / HTTP/1.1
Host: 127.0.0.1:56500
Accept-Encoding: gzip, deflate, compress
User-Agent: python-requests/2.2.1 CPython/3.4.0 Linux/3.8.0-35-generic
Content-Length: 277
Content-Type: multipart/form-data;boundary=***someboundary***
Accept: */*

--84989444e2484915a216e1718e0f93f0
Content-Disposition: form-data; name="file"; filename="test.tgz"

VlES� ��aws9���"ـ����E�)f��i�>.!;қ�>1ܼ5��I�<���Q���|lf���S���~ �8�C
--84989444e2484915a216e1718e0f93f0--
```

# Expected Result

```
$ nc -p 56500 -l
POST / HTTP/1.1
Host: 127.0.0.1:56500
Accept-Encoding: gzip, deflate, compress
User-Agent: python-requests/2.2.1 CPython/3.4.0 Linux/3.8.0-35-generic
Content-Length: 277
Content-Type: multipart/form-data;boundary=***someboundary***
Accept: */*

--***someboundary***
Content-Disposition: form-data; name="file"; filename="test.tgz"

VlES� ��aws9���"ـ����E�)f��i�>.!;қ�>1ܼ5��I�<���Q���|lf���S���~ �8�C
--***someboundary***--
```

'boundary' is a parameter required by multipart entities. However, when I specify the custom boundary in the headers, the sent request still uses a generated boundary. It doesn't help even if I remove the 'boundary=...' part. The only way to avoid this issue is to **not** specify the custom **Content-Type** as multipart in the headers and let requests generate it itself. I am new to this package and the issue got me stuck for hours. So... maybe there should be a warning about this in the documentation, or maybe support for custom boundaries should be implemented?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 3, "-1": 0, "confused": 0, "eyes": 0, "heart": 2, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 5, "url": "https://api.github.com/repos/psf/requests/issues/1997/reactions" }
https://api.github.com/repos/psf/requests/issues/1997/timeline
null
completed
null
null
false
[ "This is similar to problems many people have when coming from less useful HTTP libraries. However, the documentation is very clear about how to [post multipart files](http://docs.python-requests.org/en/latest/user/quickstart/#post-a-multipart-encoded-file). Otherwise, all we could reasonably do is have a general note that says \"Please stop providing your own headers, requests can do it by itself\".\n", "At the end of that section that @Lukasa linked to, there's a bit about the [requests-toolbelt](https://gitlab.com/sigmavirus24/toolbelt). Using that library would allow you to specify your own boundary without having to craft the header yourself.\n", "@Lukasa Yes, I appreciate how simple and clear the document said about posting files. However, from the same page of document it also specified [custom headers](http://docs.python-requests.org/en/latest/user/quickstart/#custom-headers). And some samples from the document also have custom 'Content-Type' header. So it would seem totally OK to do that.\n@sigmavirus24 Thanks for point that out. But I've encountered this issue when I'm just playing around with fairly small sized files. So at first glance, I'd think that I don't need streaming and skip that part.\n\nThe reason I report this is that when I'm trying requests, there's no warning about custom boundaries in document or stderr ouput. If there were, it'd save me a lot of time trying to figure out why the sent request isn't working.\nSo I think at least there should be a note like \"Custom boundaries are not supported. Please let requests generate that specific header for you\".\n", "@jcfrank there's a difference in the documentation (that may be too subtle) between the parts that specify a `Content-Type` header and the Multipart Post part -- The former require you to format the data yourself, specifically when posting JSON data. You're relying on requests to format the multipart request so you should not send the header.\n\nAlso, the toolbelt is desired mostly by people who have large file sizes but it even works if you need neither streaming or large file handling. You can do the following:\n\n``` python\nfrom requests_toolbelt import MultipartEncoder\n\nfields = {\n# your multipart form fields\n}\n\nm = MultipartEncoder(fields, boundary='my_super_custom_header')\nr = requests.post(url, headers={'Content-Type': m.content_type}, data=m.to_string())\n```\n\nIn this case you do want to set your own header **because** you're formatting the data.\n", "I see the difference you mean. Thanks for the explanation.\nI'd still suggest a note be added in the document, or a warning print when the 'boundary' word is detected in content-type headers, though.\nYou can close this if it doesn't seem necessary.\n\nThank you both for checking this out! :)\n", "@jcfrank\r\nyou saved my life :)\r\nI removed the headers and can now post the file using requests.\r\n\r\n\r\n```\r\nfiles = { 'file': ('test_one.xlsx', open('test_one.xlsx', 'rb')),}\r\nurl = 'https://XXXXX/Y-Y-Y-Y/ZZZZ?XX=YY'\r\nr=requests.post(url, files=files)\r\n\r\n```", "@parmeshwor Glad to know it helps anyone after more than 3 years this was posted. :)\r\n\r\nI updated the description so it's easier to see the difference. In case anyone else has to look this up in the future. :/", "@jcfrank just saved me too! I've been stuck on this for at least an hour. 
Glad I came across this :+1: ", "The same thing happened to me , although I solved by myself , I am still feel happy to see this ", "@Lukasa I feel this issue should be reopened, the number of people having it is a testament to either a lack of documentation or unintuitive behaviour. I've myself encountered the issue and was unable to remediate it despite reading the part of the documentation about posting multipart/form-data requests.\r\n\r\nIf requests relies on the absence of a custom Content-Type header to craft a correct request, it should raise an error when it is provided by the user, otherwise it just silently crafts a malformed request and it is very hard to debug.\r\n\r\nIf you feel such an error goes against the design philosophy of requests (where you're supposed to refrain from using custom headers and use the keyword arguments which will take care of selecting the appropriate one, if I understood correctly.) then a warning should be added to the [POST a Multipart-Encoded File](http://docs.python-requests.org/en/master/user/quickstart/#post-a-multipart-encoded-file) part of the documentation. Currently there's only a warning about the Content-Length header. \r\n\r\n**Issues opened with the same problem**:\r\nhttps://github.com/requests/requests/issues/3744\r\nhttps://github.com/requests/requests/issues/621\r\nhttps://github.com/requests/requests/issues/4740\r\nhttps://github.com/requests/requests/issues/4589\r\nhttps://github.com/requests/requests/issues/4554\r\n", "@sigmavirus24 all day fighting a custom content-disposition boundary and this fixed it !\r\n\r\nWell done and thank you for leaving this here hah. Damn. So useful. Done in minutes once I found this post.", "@jcfrank Can you please check the pull request #5521 to see if it supports what you were looking for?", "@virajkanwade Thanks for working on a solution to it! Seems it does handle this issue. I'm a bit concerned about [the alternative way](https://requests.readthedocs.io/en/master/user/quickstart/?highlight=multipart#post-a-multipart-encoded-file) to provide the content-type header. I don't know whether it's possible for users to try adding the boundary parameter there.\r\n\r\nIf manually handling the boundary instead of using a 3rd-party library is a concern, maybe detecting such parameter and raising an exception would also do? Just a thought.", "@jcfrank I am working on a third party service which uses a custom boundary as another level of verification. So I need to set it. I approached the solution from bottom up. I noticed the urllib3 function encode_multipart_formdata already supports setting boundary. It just wasn't being passed by requests. Kept following the traceback till I managed to find a place where I could get it from the request headers. If the alternative way sets the header such that the boundary reaches till this point, i believe it should work. That is something probably the requests team can take a call on. But I don't like the approach if there is another library which does it, we won't update our. If that was the case, Python should have stayed on 2 and we should have been using libraries for everything. Just my personal thoughts." ]
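The two working approaches from the discussion above, condensed into one hedged sketch; the URL, file path, and boundary string are placeholders, and the second variant assumes the third-party requests-toolbelt package is installed.

``` python
import requests
from requests_toolbelt import MultipartEncoder

# 1) Simplest: omit the Content-Type header and let requests generate both
#    the multipart body and the matching boundary.
with open('/tmp/test.tgz', 'rb') as f:
    r = requests.post('http://127.0.0.1:56500/', files={'file': f})

# 2) If a specific boundary is required, build the body yourself with the
#    toolbelt and send the Content-Type it reports (boundary included).
m = MultipartEncoder(
    fields={'file': ('test.tgz', open('/tmp/test.tgz', 'rb'))},
    boundary='my_custom_boundary',
)
r = requests.post(
    'http://127.0.0.1:56500/',
    data=m.to_string(),
    headers={'Content-Type': m.content_type},
)
```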
https://api.github.com/repos/psf/requests/issues/1996
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1996/labels{/name}
https://api.github.com/repos/psf/requests/issues/1996/comments
https://api.github.com/repos/psf/requests/issues/1996/events
https://github.com/psf/requests/issues/1996
31,021,562
MDU6SXNzdWUzMTAyMTU2Mg==
1,996
MIME type no longer set on file upload
{ "avatar_url": "https://avatars.githubusercontent.com/u/3502082?v=4", "events_url": "https://api.github.com/users/wtip/events{/privacy}", "followers_url": "https://api.github.com/users/wtip/followers", "following_url": "https://api.github.com/users/wtip/following{/other_user}", "gists_url": "https://api.github.com/users/wtip/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wtip", "id": 3502082, "login": "wtip", "node_id": "MDQ6VXNlcjM1MDIwODI=", "organizations_url": "https://api.github.com/users/wtip/orgs", "received_events_url": "https://api.github.com/users/wtip/received_events", "repos_url": "https://api.github.com/users/wtip/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wtip/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wtip/subscriptions", "type": "User", "url": "https://api.github.com/users/wtip", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "fad8c7", "default": false, "description": null, "id": 136616769, "name": "Documentation", "node_id": "MDU6TGFiZWwxMzY2MTY3Njk=", "url": "https://api.github.com/repos/psf/requests/labels/Documentation" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
6
2014-04-07T20:58:49Z
2021-09-08T23:04:50Z
2015-05-03T15:11:13Z
NONE
resolved
I have the following code that was working fine with requests 1.2.0 but is not working with 2.2.1:

``` python
while True:
    try:
        # Try to upload the image
        f = open(img_name, 'rb')
        r = requests.post(url='http://myurl.com/post',
                          data={'Upload': 'upload'},
                          files={'userfile': f},
                          timeout=30)
        print r.status_code
        r.raise_for_status()
        break  # If the upload completes without errors break out of the loop
    except requests.exceptions.RequestException:
        print "Something went wrong, waiting and then trying again"
        time.sleep(30)
```

The code runs without errors but the image is not uploaded. The upload page rejects the POST due to an unsupported file type. I did a tcpdump capture while running the code with both versions of requests. The main difference I saw was that requests 1.2.0 properly sets Content-Type: image/jpeg, while with requests 2.2.1 I did not see a Content-Type header. Am I doing something wrong or is this a bug?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1996/reactions" }
https://api.github.com/repos/psf/requests/issues/1996/timeline
null
completed
null
null
false
[ "Are you expecting the `Content-Type` header on the request or on one of the parts of the multipart upload?\n", "I am expecting Content-Type on the second part of the multipart upload. This is properly done with requests 1.2.0\n", "Hmm, yup, from looking at the code it's clear that we no longer guess the MIME type of the uploaded file. I wonder if that was deliberate.\n\nLooks like it went away in af4fb8cedca7c331b8c914a40c477a2cb02055e1 (part of #1640) when we switched to explicitly using `RequestField` objects. Assuming we're happy to guess MIME types I'm happy to put it back. @kennethreitz, should we be guessing MIME types on multipart upload parts?\n", "If it was intentional it would be nice if this would be put in the migration guide.\n\nIn trying to figure this out I saw that the mime type guessing code moved over to https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/fields.py with `def guess_content_type` but doesn't seem to be working?\n", "It's not that it's not working, it's that it's not hit. We explicitly create `RequestField` objects, which don't do any content-type guessing.\n", "Hey @kennethreitz I was rummaging through issues and we never did get an answer from you. Was it intentional that content type guessing was removed? There's an API for the user to specify it now explicitly so we just might want to document this.\n" ]
https://api.github.com/repos/psf/requests/issues/1995
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1995/labels{/name}
https://api.github.com/repos/psf/requests/issues/1995/comments
https://api.github.com/repos/psf/requests/issues/1995/events
https://github.com/psf/requests/issues/1995
30,949,718
MDU6SXNzdWUzMDk0OTcxOA==
1,995
Create an Extra for Better SSL Support
{ "avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4", "events_url": "https://api.github.com/users/dstufft/events{/privacy}", "followers_url": "https://api.github.com/users/dstufft/followers", "following_url": "https://api.github.com/users/dstufft/following{/other_user}", "gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dstufft", "id": 145979, "login": "dstufft", "node_id": "MDQ6VXNlcjE0NTk3OQ==", "organizations_url": "https://api.github.com/users/dstufft/orgs", "received_events_url": "https://api.github.com/users/dstufft/received_events", "repos_url": "https://api.github.com/users/dstufft/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dstufft/subscriptions", "type": "User", "url": "https://api.github.com/users/dstufft", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
11
2014-04-06T22:00:00Z
2018-10-15T12:22:04Z
2014-09-04T18:36:56Z
CONTRIBUTOR
resolved
So right now the SSL connections when you use pyOpenSSL, ndg-httpsclient, and pyasn1 are more secure than if you just use the stdlib options. However, it's hard to actually remember those three things. It would be cool if requests added an extra to its setup.py so that people can install requests with betterssl, something like:

``` python
setup(
    extras_require={
        "betterssl": ["pyOpenSSL", "ndg-httpsclient", "pyasn1"],
    },
)
```

This would make it so people can install requests like `pip install requests[betterssl]` and get all of those dependencies without having to manually track them down. It also means people could depend on `requests[betterssl]` instead of just `requests` in their own setup.py files. The extra name can of course be bikeshedded here :)
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1995/reactions" }
https://api.github.com/repos/psf/requests/issues/1995/timeline
null
completed
null
null
false
[ "Also by default requests can't connect to some sites on OSX because of ancient OpenSSL. Using the above 3 packages makes it possible.\n\n```\nPython 2.7.5 (default, Sep 12 2013, 21:33:34)\n[GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.get(\"https://www.howsmyssl.com/a/check\")\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/hynek/.virtualenvs/1bd80d533b702044/lib/python2.7/site-packages/requests/api.py\", line 55, in get\n return request('get', url, **kwargs)\n File \"/Users/hynek/.virtualenvs/1bd80d533b702044/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/Users/hynek/.virtualenvs/1bd80d533b702044/lib/python2.7/site-packages/requests/sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/hynek/.virtualenvs/1bd80d533b702044/lib/python2.7/site-packages/requests/sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/hynek/.virtualenvs/1bd80d533b702044/lib/python2.7/site-packages/requests/adapters.py\", line 385, in send\n raise SSLError(e)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version\n```\n", "I like this idea.\n", ":+1: from me as well.\n", "I'm happy to do this as well. @kennethreitz, do you want to do this?\n\nWould be nice if we could have a bit of the docs that talks about building the most secure possible form of requests, including stuff like installing OpenSSL from Homebrew and then building against that.\n", "+1\nThis would be much better to document.\nI would give it a neutral name a la `PyOpenSSL` as the other codepath isn't magic fairy dust and may exhibit other bugs.\n", "@dstufft can we do something like `requests[+PyOpenSSL]` or `requests[+betterssl]`? By which I mean: is the `+` allowed by distutils/setuptools?\n", "Pretty sure it is not.\n", "No, a `+` won't parse correctly on the `pip install` side.\n\nOn Sat, Apr 26, 2014 at 2:16 PM, Ian Cordasco [email protected]:\n\n> @dstufft https://github.com/dstufft can we do something like\n> requests[+PyOpenSSL] or requests[+betterssl]? By which I mean: is the +allowed by distutils/setuptools?\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1995#issuecomment-41481134\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n", "@kennethreitz any update on this? \n", "I absolutely want to do this. \n", "I found pyOpenSSL would cause memory problem, I use requests.get to request https://www.baidu.com for 10000times and memory growed!\r\n![ab983cf0-d52a-4347-8a5a-16912420c355](https://user-images.githubusercontent.com/2618277/46942822-be3fd800-d0a0-11e8-932d-1192ac3dfb29.png)\r\n" ]
https://api.github.com/repos/psf/requests/issues/1994
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1994/labels{/name}
https://api.github.com/repos/psf/requests/issues/1994/comments
https://api.github.com/repos/psf/requests/issues/1994/events
https://github.com/psf/requests/pull/1994
30,915,811
MDExOlB1bGxSZXF1ZXN0MTQ0MTUyMzg=
1,994
Make it more clear where install commands are run
{ "avatar_url": "https://avatars.githubusercontent.com/u/129501?v=4", "events_url": "https://api.github.com/users/ionelmc/events{/privacy}", "followers_url": "https://api.github.com/users/ionelmc/followers", "following_url": "https://api.github.com/users/ionelmc/following{/other_user}", "gists_url": "https://api.github.com/users/ionelmc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ionelmc", "id": 129501, "login": "ionelmc", "node_id": "MDQ6VXNlcjEyOTUwMQ==", "organizations_url": "https://api.github.com/users/ionelmc/orgs", "received_events_url": "https://api.github.com/users/ionelmc/received_events", "repos_url": "https://api.github.com/users/ionelmc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ionelmc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ionelmc/subscriptions", "type": "User", "url": "https://api.github.com/users/ionelmc", "user_view_type": "public" }
[]
closed
true
null
[]
null
15
2014-04-05T13:40:47Z
2021-09-08T22:01:14Z
2014-04-05T15:35:16Z
CONTRIBUTOR
resolved
So that newbies don't run them in the REPL; I've seen it happen :smile:
{ "avatar_url": "https://avatars.githubusercontent.com/u/129501?v=4", "events_url": "https://api.github.com/users/ionelmc/events{/privacy}", "followers_url": "https://api.github.com/users/ionelmc/followers", "following_url": "https://api.github.com/users/ionelmc/following{/other_user}", "gists_url": "https://api.github.com/users/ionelmc/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ionelmc", "id": 129501, "login": "ionelmc", "node_id": "MDQ6VXNlcjEyOTUwMQ==", "organizations_url": "https://api.github.com/users/ionelmc/orgs", "received_events_url": "https://api.github.com/users/ionelmc/received_events", "repos_url": "https://api.github.com/users/ionelmc/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ionelmc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ionelmc/subscriptions", "type": "User", "url": "https://api.github.com/users/ionelmc", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1994/reactions" }
https://api.github.com/repos/psf/requests/issues/1994/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1994.diff", "html_url": "https://github.com/psf/requests/pull/1994", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1994.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1994" }
true
[ "@ionelmc Thanks for this! Leave the '$' signs in (we like it stylistically) and I'll happily merge it. =)\n", "It's poor style - it's not clear for newbies what the \"$\" stands for.\n", "And frankly, it just makes things harder to copy-paste, even if you know the obscure story about \"$\"\n", "You'll note that the requests documentation doesn't generally concern itself with being copy-pasteable. In fact, if you examine the rest of the requests documentation, you'll notice that we generally put the prompt in all of our examples and instructions.\n", "@ionelmc regardless of whether you think it is poor style, it is convention in all technical documentation and is a clear signal (to most) that you should be using the text following it on a command line.\n", "Maybe it's time for a change. For example windows users have trouble with this convention.\n", "The ruling here is basically that we're not going to make that switch. =) We make _loads_ of assumptions about what people will know in our documentation because our documentation is not targeted at complete newbies. Our docs are not an introduction to Python guide.\n", "Windows user need love too :smile: \n\nAlso, note that it's already inconsistent, see the git command at https://github.com/kennethreitz/requests/blob/master/docs/user/install.rst#get-the-code\n", "Yeah, that example doesn't back your case up at all: there are four lines in that section, and three of them begin with `$`. Given that, the other one is clearly an oversight. =)\n\nI appreciate that Windows users need love, but again, any remotely experienced Windows user will either know how `pip` works (and so not be confused by the `$` sign) or know that the `$` sign is a stand-in for the prompt.\n\nSorry, but this is not something we're going to bikeshed any longer, so here's the state of play: remove the change on the `$` and I'll merge this, otherwise I'll close it.\n", "You're just assuming that windows users understand that.\n\nI'm not clear why you're trying to be so obtuse here - I had the impression that requests was supposed to be an easy to use library, oriented towards the inexperienced.\n", "BTW, a friend of mine was doing this:\n<img src=\"https://lh3.googleusercontent.com/-Ou4CBbFXA_o/U0ACh89IYnI/AAAAAAAAAIw/E6TePRARiBM/w667-h179-no/pypip.png\">\n\nHe's an old C++ programmer, I wouldn't say he doesn't know programming :grin: \n", "Requests is written to be easy to use, but the word 'inexperienced' is not one we use to define our target audience. We don't want users to have to be HTTP experts, but we do require some prior knowledge from them. I direct your attention towards [the beginning of the quickstart guide](http://docs.python-requests.org/en/latest/user/quickstart/), by definition the simplest part of the docs. By my count in the first few paragraphs we use the following terms without defining them: object, API, HTTP request, POST, PUT, HEAD, DELETE, OPTIONS, parameters, URL, query string, dictionary, keyword argument, encoded (this is particularly egregious as we don't specify urlencoding here), key and `None`. That's a big list of terms we don't bother to explain.\n\nWe have to assume some basic knowledge on the part of our users because, as I said, _we are not an introduction to Python_ document. I agree that there are plenty of people who need just such a document, and I'd rather we directed them to one than tried to be one.\n\nYes, people make mistakes. 
I enthusiastically agreed to taking your change to specify the terminal, because that's unambiguously an improvement. Removing a widely-used convention from our documentation is substantially less clear. Users will _eventually_ encounter it, if not here then somewhere else. We assume that the user in question knows what `pip` is and has it installed. Your user above, for instance, would likely have encountered a separate problem without the `$`: they would get an error thrown by their shell when they tried to run it because `pip` isn't installed by default on Python 2.7 on Windows (IIRC).\n\nSo, once again, we've made a style decision. If you don't like it, that's totally understandable, but it blocks the merge of your pull request. Everyone's allowed to have their opinions and we acknowledge yours, but fundamentally this is not a democracy. I direct your attention to the [development philosophy](http://docs.python-requests.org/en/latest/dev/philosophy/):\n\n> Requests is an open but opinionated project, created by an open but opinionated developer.\n> - Listen to everyone, then disregard it.\n> - The API is all that matters, everything else is secondary.\n> - Fit the 90% use-case. Ignore the nay-sayers.\n\nThese are the principles we are upholding when we ask for the `$` to be left alone. =)\n", "No problem.\n", "Just so you know, I'm not looking for internet-points - I'm only trying to make this particular scenario easier.\n", "I didn't intend to accuse you if any such thing, and I apologise if I did. I'm just trying to maintain the project as best as I can. When I get home I plan to merge this and simply return the `$` myself. =)\n\nYour contribution is valuable, we're simply coming at the issue with different priorities. \n" ]
https://api.github.com/repos/psf/requests/issues/1993
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1993/labels{/name}
https://api.github.com/repos/psf/requests/issues/1993/comments
https://api.github.com/repos/psf/requests/issues/1993/events
https://github.com/psf/requests/pull/1993
30,910,114
MDExOlB1bGxSZXF1ZXN0MTQ0MTI5NzY=
1,993
Format doc section header
{ "avatar_url": "https://avatars.githubusercontent.com/u/5571478?v=4", "events_url": "https://api.github.com/users/RichVisual/events{/privacy}", "followers_url": "https://api.github.com/users/RichVisual/followers", "following_url": "https://api.github.com/users/RichVisual/following{/other_user}", "gists_url": "https://api.github.com/users/RichVisual/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RichVisual", "id": 5571478, "login": "RichVisual", "node_id": "MDQ6VXNlcjU1NzE0Nzg=", "organizations_url": "https://api.github.com/users/RichVisual/orgs", "received_events_url": "https://api.github.com/users/RichVisual/received_events", "repos_url": "https://api.github.com/users/RichVisual/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RichVisual/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RichVisual/subscriptions", "type": "User", "url": "https://api.github.com/users/RichVisual", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2014-04-05T06:51:07Z
2021-09-08T22:01:13Z
2014-04-05T07:19:48Z
NONE
resolved
Seemed like this should have been formatted as a header (it was just regular text) to avoid confusion.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1993/reactions" }
https://api.github.com/repos/psf/requests/issues/1993/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1993.diff", "html_url": "https://github.com/psf/requests/pull/1993", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1993.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1993" }
true
[ "No, it's not just regular text. See [the relevant docs section](http://docs.python-requests.org/en/latest/user/advanced/#session-objects). We want it that way: it shouldn't be its own section, it's just a useful way to highlight the value of that section.\n\nThanks though! :cake:\n", "Thanks for your patience with the issues I was having with that pull request. I see what you're saying, it just looks so different from the way it is on the webpage. I was looking at it in terms of this:\n\nhttp://imgur.com/a/7Oekj\n\nIt seems a little odd that one is regular text in that area and the other is clearly distinct from the surrounding text. On the website it's not a header really, but distinguished, whereas on the Github docs it doesn't seem to be distinguished.\n", "@richierichrawr That's really not a problem, we were all new once. Ours is not a community that vilifies mistakes. =)\n\nYeah, welcome to the fun world of GitHub. =D We use Sphinx for our documentation, which provides lots of excellent features that GitHub does not understand.\n" ]
https://api.github.com/repos/psf/requests/issues/1992
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1992/labels{/name}
https://api.github.com/repos/psf/requests/issues/1992/comments
https://api.github.com/repos/psf/requests/issues/1992/events
https://github.com/psf/requests/pull/1992
30,908,261
MDExOlB1bGxSZXF1ZXN0MTQ0MTIxMzE=
1,992
Master
{ "avatar_url": "https://avatars.githubusercontent.com/u/5571478?v=4", "events_url": "https://api.github.com/users/RichVisual/events{/privacy}", "followers_url": "https://api.github.com/users/RichVisual/followers", "following_url": "https://api.github.com/users/RichVisual/following{/other_user}", "gists_url": "https://api.github.com/users/RichVisual/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/RichVisual", "id": 5571478, "login": "RichVisual", "node_id": "MDQ6VXNlcjU1NzE0Nzg=", "organizations_url": "https://api.github.com/users/RichVisual/orgs", "received_events_url": "https://api.github.com/users/RichVisual/received_events", "repos_url": "https://api.github.com/users/RichVisual/repos", "site_admin": false, "starred_url": "https://api.github.com/users/RichVisual/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RichVisual/subscriptions", "type": "User", "url": "https://api.github.com/users/RichVisual", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2014-04-05T04:08:12Z
2021-09-08T22:01:14Z
2014-04-05T07:33:55Z
NONE
resolved
Changed requests/docs/user/advanced.rst after the 1st section in that doc: "Remove a Value From a Dict Parameter". It's a distinct section from the rest of the neighboring text. EDIT: OK, I'm just learning about commits, and I did the edit on that doc in the browser, and now it's going to the wrong branch. Just to be clear, I don't have anything else to commit/change other than the doc issue I cited above. Sorry for the confusion. What's the best way to fix any problems this has caused? EDIT2: I figured out what happened and will submit it the right (I think) way this time. Sorry again for any confusion as I'm just starting to learn how this works. The submit seemed to go through on the other pull request, so this one can be disregarded in favor of the other I made just now.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1992/reactions" }
https://api.github.com/repos/psf/requests/issues/1992/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1992.diff", "html_url": "https://github.com/psf/requests/pull/1992", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1992.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1992" }
true
[]
https://api.github.com/repos/psf/requests/issues/1991
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1991/labels{/name}
https://api.github.com/repos/psf/requests/issues/1991/comments
https://api.github.com/repos/psf/requests/issues/1991/events
https://github.com/psf/requests/pull/1991
30,908,143
MDExOlB1bGxSZXF1ZXN0MTQ0MTIwNjM=
1,991
replace reference to crate.io
{ "avatar_url": "https://avatars.githubusercontent.com/u/219470?v=4", "events_url": "https://api.github.com/users/benjaminp/events{/privacy}", "followers_url": "https://api.github.com/users/benjaminp/followers", "following_url": "https://api.github.com/users/benjaminp/following{/other_user}", "gists_url": "https://api.github.com/users/benjaminp/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/benjaminp", "id": 219470, "login": "benjaminp", "node_id": "MDQ6VXNlcjIxOTQ3MA==", "organizations_url": "https://api.github.com/users/benjaminp/orgs", "received_events_url": "https://api.github.com/users/benjaminp/received_events", "repos_url": "https://api.github.com/users/benjaminp/repos", "site_admin": false, "starred_url": "https://api.github.com/users/benjaminp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/benjaminp/subscriptions", "type": "User", "url": "https://api.github.com/users/benjaminp", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-04-05T03:58:22Z
2021-09-08T23:07:19Z
2014-04-08T14:19:40Z
CONTRIBUTOR
resolved
crate.io is gone, so a different mirror should be used as an example. Also, the list of PyPI mirrors should be mentioned.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1991/reactions" }
https://api.github.com/repos/psf/requests/issues/1991/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1991.diff", "html_url": "https://github.com/psf/requests/pull/1991", "merged_at": "2014-04-08T14:19:40Z", "patch_url": "https://github.com/psf/requests/pull/1991.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1991" }
true
[ "This is great, thanks! Can we just update this to link to the mirrors, though, rather than calling any one in particular out? We linked to Crate because it was awesome, so now that it's gone we should just point to the mirrors in general.\n", "We should remove all reference of the mirrors. PyPI is at the point now where the mirrors are entirely unnecessary thanks to the use of a CDN by the infrastructure team. @dstufft can speak more to this.\n", "So it depends! PyPI itself should go down for installation very rarely now. Thanks to the fact that Fastly caches those pages for a day and if our backend servers die it'll fall back to a static mirror hosted in a different DC. Additionally one of the main reasons for the mirror infrastructure in general was the performance of downloading a file from `$wherever_pypi_is` when you live in `$place_on_other_side_of_he_world` which the CDN infrastructure more or less handles the bulk of this use case. Finally with the removal of Mirror Authenticity there is no longer any way to check that a mirror itself isn't malicous (not that we ever had anything _using_ that API to begin with) so if someone is going to use a mirror they should be selecting their own instead of just picking a random one because essentially using a mirror says \"I trust you\".\n\nThat being said! If Fastly itself has troubles (or more likely, one of their POP locations) then PyPI can be \"down\" in one region until Fastly is alerted to the fact that a region is having problems and they drain that DC. Additionally their are areas in the world where hosting a CDN is problematic. China for instance. Fastly has POP locations near China, but due to the political nature of China and their own Infrastructure any traffic that has to _leave_ China is going to take a severe performance hit because their outbound pipes are all severely congested. Fastly is unlikely to get a generally available POP location inside of China (Last I checked you could get them to manage own you own there though) because of the politics of that region would make Fastly responsible for what their customers are hosting through Fastly.\n\nAdditionally if you are going to mention a mirror it probably makes more sense to mention `https://pypi.gocept.com/` because it supports HTTPS and it's not inside China.\n\nThe tl;dr is that now adays you shouldn't generally have to ever thinking about the concept of mirrors unless you're in a problematic region like China.\n", "Indeed, PyPi is quite solid now. When this was written, it went down daily.\n" ]
https://api.github.com/repos/psf/requests/issues/1990
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1990/labels{/name}
https://api.github.com/repos/psf/requests/issues/1990/comments
https://api.github.com/repos/psf/requests/issues/1990/events
https://github.com/psf/requests/pull/1990
30,790,148
MDExOlB1bGxSZXF1ZXN0MTQzNDA0NjE=
1,990
Maintain DELETE method on 302
{ "avatar_url": "https://avatars.githubusercontent.com/u/1002315?v=4", "events_url": "https://api.github.com/users/kuxi/events{/privacy}", "followers_url": "https://api.github.com/users/kuxi/followers", "following_url": "https://api.github.com/users/kuxi/following{/other_user}", "gists_url": "https://api.github.com/users/kuxi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kuxi", "id": 1002315, "login": "kuxi", "node_id": "MDQ6VXNlcjEwMDIzMTU=", "organizations_url": "https://api.github.com/users/kuxi/orgs", "received_events_url": "https://api.github.com/users/kuxi/received_events", "repos_url": "https://api.github.com/users/kuxi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kuxi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kuxi/subscriptions", "type": "User", "url": "https://api.github.com/users/kuxi", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-04-03T16:18:22Z
2021-09-08T22:01:13Z
2014-04-03T22:09:32Z
NONE
resolved
When resolving redirects, all methods except HEAD were being converted to GET to mimic browser behavior. This change maintains the DELETE method when resolving 302 redirects.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1002315?v=4", "events_url": "https://api.github.com/users/kuxi/events{/privacy}", "followers_url": "https://api.github.com/users/kuxi/followers", "following_url": "https://api.github.com/users/kuxi/following{/other_user}", "gists_url": "https://api.github.com/users/kuxi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kuxi", "id": 1002315, "login": "kuxi", "node_id": "MDQ6VXNlcjEwMDIzMTU=", "organizations_url": "https://api.github.com/users/kuxi/orgs", "received_events_url": "https://api.github.com/users/kuxi/received_events", "repos_url": "https://api.github.com/users/kuxi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kuxi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kuxi/subscriptions", "type": "User", "url": "https://api.github.com/users/kuxi", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1990/reactions" }
https://api.github.com/repos/psf/requests/issues/1990/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1990.diff", "html_url": "https://github.com/psf/requests/pull/1990", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1990.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1990" }
true
[ "Thanks for this!\n\nWhat's your justification for making this change?\n", "I had problems using python-swiftclient, which depends on requests in the environment at work. We had a swift cluster proxy redirecting HTTP requests to HTTPS and I couldn't delete using the client because DELETE requests were being converted to GET after redirects.\n\nThis change fixed our problem with deleting using python-swiftclient.\n", "I should mention though that since we had problems with POST and PUT we eventually reworked our setup to use HTTPS in order to avoid this problem all together. I thought that maybe it would help someone else anyway but I'm not 100% sure that it's desired behavior without being able to do the same with POST and PUT.\nMaybe it would be better to allow an optional kwarg to allow replaying requests after redirects despite the way browsers do it?\n", "It's not the desired behaviour, sadly.\n\nWe don't want a keyword argument because it dirties the API. Fundamentally, we need to act in a way that causes least surprises, and this is it: we do the same thing with 302s as all major user-agents do. This problem caused RFC 2616 to add two extra codes that are _unambiguous_: 303 and 307.\n\nUnfortunately, the expectation is that if you're faced with such an API you will have to handle redirects yourself, by setting `allow_redirects=False`.\n", "Fair enough :)\n" ]
https://api.github.com/repos/psf/requests/issues/1989
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1989/labels{/name}
https://api.github.com/repos/psf/requests/issues/1989/comments
https://api.github.com/repos/psf/requests/issues/1989/events
https://github.com/psf/requests/pull/1989
30,736,794
MDExOlB1bGxSZXF1ZXN0MTQzMDkxNjk=
1,989
Make note of the fact that we fixed some CVEs
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
0
2014-04-03T00:05:43Z
2021-09-08T22:01:12Z
2014-04-03T05:51:07Z
CONTRIBUTOR
resolved
Just a doc change. If anyone thinks they deserve separate items let me know. /cc @dstufft
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1989/reactions" }
https://api.github.com/repos/psf/requests/issues/1989/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1989.diff", "html_url": "https://github.com/psf/requests/pull/1989", "merged_at": "2014-04-03T05:51:07Z", "patch_url": "https://github.com/psf/requests/pull/1989.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1989" }
true
[]
https://api.github.com/repos/psf/requests/issues/1988
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1988/labels{/name}
https://api.github.com/repos/psf/requests/issues/1988/comments
https://api.github.com/repos/psf/requests/issues/1988/events
https://github.com/psf/requests/issues/1988
30,623,371
MDU6SXNzdWUzMDYyMzM3MQ==
1,988
Support for RFC 2549
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
2
2014-04-01T18:13:24Z
2021-09-09T00:01:09Z
2014-04-02T14:46:12Z
CONTRIBUTOR
resolved
We should start supporting https://tools.ietf.org/html/rfc2549
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1988/reactions" }
https://api.github.com/repos/psf/requests/issues/1988/timeline
null
completed
null
null
false
[ "I'd love to add support for RFC 2549, but the state of the avian interface modules in Python is awful. We'll need some serious upstream work in PyOpenPigeon.\n", "Would be awesome to get my first PR merged into PyOpen:bird:!\n" ]
https://api.github.com/repos/psf/requests/issues/1987
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1987/labels{/name}
https://api.github.com/repos/psf/requests/issues/1987/comments
https://api.github.com/repos/psf/requests/issues/1987/events
https://github.com/psf/requests/issues/1987
30,596,264
MDU6SXNzdWUzMDU5NjI2NA==
1,987
Allow setting timeout on the Session
{ "avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4", "events_url": "https://api.github.com/users/dstufft/events{/privacy}", "followers_url": "https://api.github.com/users/dstufft/followers", "following_url": "https://api.github.com/users/dstufft/following{/other_user}", "gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dstufft", "id": 145979, "login": "dstufft", "node_id": "MDQ6VXNlcjE0NTk3OQ==", "organizations_url": "https://api.github.com/users/dstufft/orgs", "received_events_url": "https://api.github.com/users/dstufft/received_events", "repos_url": "https://api.github.com/users/dstufft/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dstufft/subscriptions", "type": "User", "url": "https://api.github.com/users/dstufft", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-04-01T12:59:57Z
2021-09-09T00:01:07Z
2014-04-01T13:43:01Z
CONTRIBUTOR
resolved
This came up in the requests IRC channel, and it's something I think is generally useful (at least pip has implemented it in our Session subclass). Basically, it would be nice if the Session object had a timeout attribute that could be set to apply a timeout globally, instead of needing to pass it in for each request or do your own subclassing.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/1987/reactions" }
https://api.github.com/repos/psf/requests/issues/1987/timeline
null
completed
null
null
false
[ "I'm basically in favour of this, and I'm happy to do the legwork required for it, but I want to know if @kennethreitz wants it or not.\n", "I'd love to help out pip in anyway possible but this has been discussed twice in the last year:\n- https://github.com/kennethreitz/requests/issues/1130\n- https://github.com/kennethreitz/requests/issues/1563\n\nAnd in both cases Kenneth was -1 on the change. It's of course likely that the needs of pip might cause a change in his mind, but given that removing that attribute from the Session object was a very intentional decision, I'm not certain what his opinion will be.\n", "Aha, I missed those. I'm happy to assume he doesn't want it then.\n", "FWIW I don't mind the code in pip, we have to subclass session anyways for some other stuff and the code to add timeout support isn't very large :) I opened primarily at @Lukasa's prompting :)\n", "I wouldn't be surprised if Kenneth decided to reopen this though. =P\n" ]
https://api.github.com/repos/psf/requests/issues/1986
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1986/labels{/name}
https://api.github.com/repos/psf/requests/issues/1986/comments
https://api.github.com/repos/psf/requests/issues/1986/events
https://github.com/psf/requests/issues/1986
30,536,440
MDU6SXNzdWUzMDUzNjQ0MA==
1,986
AppEngine HTTPS POST not working
{ "avatar_url": "https://avatars.githubusercontent.com/u/220959?v=4", "events_url": "https://api.github.com/users/drakon/events{/privacy}", "followers_url": "https://api.github.com/users/drakon/followers", "following_url": "https://api.github.com/users/drakon/following{/other_user}", "gists_url": "https://api.github.com/users/drakon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/drakon", "id": 220959, "login": "drakon", "node_id": "MDQ6VXNlcjIyMDk1OQ==", "organizations_url": "https://api.github.com/users/drakon/orgs", "received_events_url": "https://api.github.com/users/drakon/received_events", "repos_url": "https://api.github.com/users/drakon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/drakon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/drakon/subscriptions", "type": "User", "url": "https://api.github.com/users/drakon", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-31T18:27:22Z
2021-09-09T00:01:09Z
2014-03-31T18:39:44Z
NONE
resolved
HTTPS POST requests don't work on AppEngine. Specifically, the code here: http://documentation.mailgun.com/user_manual.html#sending-via-api (I removed the parts that make no difference): ``` requests.post("https://api.mailgun.net/v2/samples.mailgun.org/messages") ``` results in a 404 (Not Found), which is wrong since the endpoint is clearly available. I read a lot about requests' AppEngine support and the problems with urllib3, and from what I read I concluded it should work by now. :) Unfortunately, I couldn't find any other service where I can test an HTTPS POST.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1986/reactions" }
https://api.github.com/repos/psf/requests/issues/1986/timeline
null
completed
null
null
false
[ "If you've been reading up on our AppEngine support you know that we don't have any. =) The official position of the requests project is that AppEngine is not Python. AppEngine has got all sorts of weird stuff going on in its network libraries, and we just cannot guarantee that they're doing anything even remotely like what we're expecting they'll do.\n\nIn this case, obvious possible culprits include a lack of support for Server Name Indication causing you to talk to an entirely different host than you're expecting to. I have no idea if AppEngine supports SNI, but I'm totally prepared to believe they do not.\n\nAnyway, the best place for this is Stack Overflow, where you might get some possible support. This is a bug tracker, and we do not consider AppEngine problems to be bugs. I'm sorry we can't be more help!\n", "Alright, I've seen things working on Stack Overflow, so I assumed it should work. But thanks for the clear response! I already have another solution (which is by for not nearly as elegant as requests, but I'll do for now).\n\nHaha, I like that one: \"The official position of the requests project is that AppEngine is not Python.\". True story. ^^\n" ]
https://api.github.com/repos/psf/requests/issues/1985
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1985/labels{/name}
https://api.github.com/repos/psf/requests/issues/1985/comments
https://api.github.com/repos/psf/requests/issues/1985/events
https://github.com/psf/requests/issues/1985
30,521,740
MDU6SXNzdWUzMDUyMTc0MA==
1,985
requests doesn't work with IPv6 addresses that contain the '%' sign
{ "avatar_url": "https://avatars.githubusercontent.com/u/2158258?v=4", "events_url": "https://api.github.com/users/costinb7/events{/privacy}", "followers_url": "https://api.github.com/users/costinb7/followers", "following_url": "https://api.github.com/users/costinb7/following{/other_user}", "gists_url": "https://api.github.com/users/costinb7/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/costinb7", "id": 2158258, "login": "costinb7", "node_id": "MDQ6VXNlcjIxNTgyNTg=", "organizations_url": "https://api.github.com/users/costinb7/orgs", "received_events_url": "https://api.github.com/users/costinb7/received_events", "repos_url": "https://api.github.com/users/costinb7/repos", "site_admin": false, "starred_url": "https://api.github.com/users/costinb7/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/costinb7/subscriptions", "type": "User", "url": "https://api.github.com/users/costinb7", "user_view_type": "public" }
[]
closed
true
null
[]
null
22
2014-03-31T15:28:14Z
2021-09-08T14:00:43Z
2014-03-31T15:47:14Z
NONE
resolved
Some IPv6 addresses contain the '%' sign (link-local addresses, e.g. fe80::21b:63ff:feab:e6a6%eth0 or [fe80::21b:63ff:feab:e6a6%eth0]). When using these addresses, the requests library raises the following exception: "requests.exceptions.InvalidURL: Invalid percent-escape sequence: 'et'" (raised by the unquote_unreserved() function in utils.py). If I put "return uri" as the first line of that function (in other words, if I skip it entirely), everything works fine.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1985/reactions" }
https://api.github.com/repos/psf/requests/issues/1985/timeline
null
completed
null
null
false
[ "The % character is reserved in a URI. This causes us no end of trouble because people expect us to magically know whether we should leave a % sequence alone or transform the % to its percent-encoded form. Realistically, we have no way of being able to do that correctly.\n\nOur concession is to say that we can sort out everything _except_ encoding percent characters: those need to be encoded by the user themselves. You'll want to replace the % in the IPv6 URI with its percent-encoded form (%25).\n", "Hmm.. yes now I see that the '%' character is reserved in a URI.. Anyway, I tried replacing it with %25 (http://[fe80::240:8cff:fec5:845e%25eth0]) and it still dosn't work, because the '%' characted is not in UNRESERVED_SET, '%25' is not transformed into '%', it is kept as it is.\n", "That's becuase, as discussed, it's reserved and therefore should not be in the UNRESERVED_SET. In what way does your request fail?\n", "It fails with the fallowing error:\n\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='fe80::240:8cff:fec5:845e%25eth0', port=80): Max retries exceeded with url: /axis-cgi/param.cgi?action=list&group=root.AutoTracking.A0.Running (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)\n\nThat is because the unquote_unreserved() function doesn't change anything in the URL, it should have replaced '%25' with '%' but it doesn't, it just returns the URL as it is.\n", "It should _not_ have replaced '%25' with '%'. You really need to read up on the rules for constructing URLs, but allow me to reproduce them for you here:\n\nThe host portion of the URL is defined as follows:\n\n```\nhost = IP-literal / IPv4address / reg-name\n\nIP-literal = \"[\" ( IPv6address / IPvFuture ) \"]\"\nIPvFuture = \"v\" 1*HEXDIG \".\" 1*( unreserved / sub-delims / \":\" )\n\nIPv6address = 6( h16 \":\" ) ls32\n / \"::\" 5( h16 \":\" ) ls32\n / [ h16 ] \"::\" 4( h16 \":\" ) ls32\n / [ *1( h16 \":\" ) h16 ] \"::\" 3( h16 \":\" ) ls32\n / [ *2( h16 \":\" ) h16 ] \"::\" 2( h16 \":\" ) ls32\n / [ *3( h16 \":\" ) h16 ] \"::\" h16 \":\" ls32\n / [ *4( h16 \":\" ) h16 ] \"::\" ls32\n / [ *5( h16 \":\" ) h16 ] \"::\" h16\n / [ *6( h16 \":\" ) h16 ] \"::\"\n\nls32 = ( h16 \":\" h16 ) / IPv4address\n ; least-significant 32 bits of address\n\nh16 = 1*4HEXDIG\n ; 16 bits of address represented in hexadecimal\n\nreg-name = *( unreserved / pct-encoded / sub-delims )\n```\n\nLet's go through these in turn. Is the address you've given (`[fe80::240:8cff:fec5:845e%25eth0]`) a valid IPv6 address, as defined above? No, it is not, because it contains a percent character which is not allowed by the above rules.\n\nIs it an IPvFuture? No, it is not, because it contains a percent character, which is not in `unreserved`, `sub-delims` or `HEXDIGIT`.\n\nIs it a reg-name? No, it is not, because `pct-encoded` is defined as `pct-encoded = \"%\" HEXDIG HEXDIG` and `t` is not a HEXDIGIT.\n\nThe only way it's a valid URL is if the percent character is percent encoded. This is why we didn't unencode it: we'd end up with an invalid URL.\n\nLet's ask a different question: does this work in the standard library?\n", "Yes, it does work with httplib, for example:\nconn = httplib.HTTPConnection(\"[fe80::240:8cff:fec5:845e%eth0]\")\n", "Interesting. Does it work in urllib3?\n", "So the key problem here is that the URL syntax above was supplemented in RFC 6874 to allow the percent character. I suspect our URL parsing logic is invalid here.\n", "comment: In some environments IPv6 Link Local addressing is VERY useful. 
If you would fix it, that would be helpful.\n", "@steve100 can you test if the `rfc3986` library handles this appropriately?\n", "what do I have to load to test the rfc3986 library? I probably can test it if you give me some more details. Thanks.\n", "I am also stuck on this issue. I can also pitch in if needed.\n", "if someone tells me how to test with this library:\nI can. rfc3986\n\nOn Fri, Mar 11, 2016 at 1:42 AM, littypreeth [email protected]\nwrote:\n\n> I am also stuck on this issue. I can also pitch in if needed.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/1985#issuecomment-195219184\n> .\n", "I had forgotten that there's already an open issue for this feature within rfc3986: https://github.com/sigmavirus24/rfc3986/issues/2 so there's no need to test it.\n", "Thanks.\n\nI know the issues that surround using the IPv6 Link Local Address..\n\nHere it is SO tempting to use the link local address help get a device or\nvm configured with better addresses and other configuration information.\n\nSteve\n\nOn Fri, Mar 11, 2016 at 2:32 PM, Ian Cordasco [email protected]\nwrote:\n\n> I had forgotten that there's already an open issue for this feature within\n> rfc3986: sigmavirus24/rfc3986#2\n> https://github.com/sigmavirus24/rfc3986/issues/2 so there's no need to\n> test it.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/1985#issuecomment-195512868\n> .\n", "Old thread, but I'm running into this LL IPv6 % handling issue and wondering if was ever resolved? people are (successfully) using curl instead of requests and it breaks my heart. I'm using python 2.7.11 and requests 2.9.1\n", "No. It's not resolved yet.\n\nOn 17-Aug-2016 9:47 pm, \"briggr1\" [email protected] wrote:\n\n> Old thread, but I'm running into this LL IPv6 % handling issue and\n> wondering if was ever resolved? people are (successfully) using curl\n> instead of requests and it breaks my heart. I'm using python 2.7.11 and\n> requests 2.9.1\n> \n> —\n> You are receiving this because you commented.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/1985#issuecomment-240464306,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AQ9F5Qht1dy6A2VFw1UN_htNSCgf_dnOks5qgzQFgaJpZM4BuSvZ\n> .\n", "updated to python 2.7.12, and requests 2.11.0 - no dice.\n", "So I'm going to start working on adding this support to [rfc3986](/sigmavirus24/rfc3986). In reading the specification, there's this:\n\n> In a URI, a literal IPv6 address is always embedded between \"[\" and\n> \"]\". This document specifies how a <zone_id> can be appended to the\n> address. According to URI syntax [RFC3986], \"%\" is always treated as\n> an escape character in a URI, so, according to the established URI\n> syntax [RFC3986] any occurrences of literal \"%\" symbols in a URI MUST\n> be percent-encoded and represented in the form \"%25\". Thus, the\n> scoped address fe80::a%en1 would appear in a URI as\n> http://[fe80::a%25en1].\n\nSo the right thing to do is encode the `%` sign it seems.\n\n@Lukasa should we reopen this to track the work that we need to do in urllib3?\n", "In the first instance let's just open a relevant issue on urllib3.\n", "Hi @Lukasa\n\nWondering if you could post the related issue number for this on urllib3 so I could track it? I tried looking for it, but didn't see it (I did see something about proxy handling...). 
Thanks in advance\n\nAlso, for others that might be wondering - there is a workaround: if your system only has 1 interface with IPv6 addressing, then you do not need to supply the %<interface>, and things work just fine. Unfortunately, all of my testing involves many networks with different vlans and I need IPv6 support on them all.\n", "@briggr1 I don't believe any tracking issue for this was actually opened on urllib3. I'd need @sigmavirus24 to outline exactly what work he believes is needed though: in my initial testing urllib3 seems to be handling this appropriately.\n" ]
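The URL form discussed above — percent-encoding the zone-ID separator as required by RFC 6874 — can be sketched as follows. Note that the thread reports this still failed further down the stack on the versions discussed, so treat it as the intended URL shape rather than a guaranteed workaround; the address and interface name are illustrative.

``` python
# Hedged sketch: a literal '%' in a URI must appear as '%25', so the zone ID
# of a link-local IPv6 address is written fe80::...%25eth0 inside brackets.
import requests

host = "fe80::240:8cff:fec5:845e%eth0"      # link-local address with zone ID
url = "http://[{0}]/".format(host.replace("%", "%25"))

# Whether the request then succeeds depends on zone-ID support in the
# installed requests/urllib3 versions, which the thread notes was incomplete.
response = requests.get(url, timeout=5)
print(response.status_code)
```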
https://api.github.com/repos/psf/requests/issues/1984
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1984/labels{/name}
https://api.github.com/repos/psf/requests/issues/1984/comments
https://api.github.com/repos/psf/requests/issues/1984/events
https://github.com/psf/requests/issues/1984
30,484,042
MDU6SXNzdWUzMDQ4NDA0Mg==
1,984
auth removes the post body
{ "avatar_url": "https://avatars.githubusercontent.com/u/710024?v=4", "events_url": "https://api.github.com/users/nathanathan/events{/privacy}", "followers_url": "https://api.github.com/users/nathanathan/followers", "following_url": "https://api.github.com/users/nathanathan/following{/other_user}", "gists_url": "https://api.github.com/users/nathanathan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/nathanathan", "id": 710024, "login": "nathanathan", "node_id": "MDQ6VXNlcjcxMDAyNA==", "organizations_url": "https://api.github.com/users/nathanathan/orgs", "received_events_url": "https://api.github.com/users/nathanathan/received_events", "repos_url": "https://api.github.com/users/nathanathan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/nathanathan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nathanathan/subscriptions", "type": "User", "url": "https://api.github.com/users/nathanathan", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2014-03-31T04:01:34Z
2021-09-08T23:11:00Z
2014-06-08T10:04:46Z
NONE
resolved
I'm using Requests 2.2.1, and when I supply auth like so: ``` r = requests.post(URL, auth=("user", "pass"), data=json.dumps(data)) print r.request.body ``` ...there is no data in the body. However, if I remove the auth argument, this snippet prints out all the data in the body.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1984/reactions" }
https://api.github.com/repos/psf/requests/issues/1984/timeline
null
completed
null
null
false
[ "Uh, I can't reproduce this:\n\n``` python\n>>> r = requests.post('http://www.httpbin.org/post', auth=('test', 'test'), data=json.dumps({'hi': 'there'}))\n>>> r.request.body\n'{\"hi\": \"there\"}'\n```\n\nCan you confirm that you are actually sending data in both cases?\n", "The body only disappears if the request makes it to the server (and I see it in my server logs). When I use the wrong password or post to a invalid URL r.request.body is set to what it should be.\n", "I should have mentioned that my server is running behind an Apache proxy does does the auth.\n", "Are you getting redirected?\n", "My server returns 303s in accordance with [this API](http://docs.annotatorjs.org/en/latest/storage.html#core-storage-api), however the request body is empty when it gets to the server, so I don't think the response code could make a difference.\n", "Wait, wait.\n\nDo you _know_ that the request body is empty? That's a very different problem. If all you've got as proof of that is that `response.request.body` is the _last_ request. The original request body is in `response.history[0].request.body`.\n", "I know that the request body is empty when it gets to my server from logging the request on the server.\n", "In that case you should try to get packet capture (tcpdump or wireshark) from your server before it hits Apache. I'm finding it very difficult to believe that requests is losing the request body.\n", "You should probably also do a packet capture on your local box to understand what's happening. Beyond checking your history on that response you should check `r.request.method`. Alternatively, you should make the same request with `allow_redirects=False`. Then check the generated request body.\n", "Closing for inactivity.\n" ]
https://api.github.com/repos/psf/requests/issues/1983
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1983/labels{/name}
https://api.github.com/repos/psf/requests/issues/1983/comments
https://api.github.com/repos/psf/requests/issues/1983/events
https://github.com/psf/requests/issues/1983
30,465,887
MDU6SXNzdWUzMDQ2NTg4Nw==
1,983
Referrers
{ "avatar_url": "https://avatars.githubusercontent.com/u/5372304?v=4", "events_url": "https://api.github.com/users/luceatnobis/events{/privacy}", "followers_url": "https://api.github.com/users/luceatnobis/followers", "following_url": "https://api.github.com/users/luceatnobis/following{/other_user}", "gists_url": "https://api.github.com/users/luceatnobis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/luceatnobis", "id": 5372304, "login": "luceatnobis", "node_id": "MDQ6VXNlcjUzNzIzMDQ=", "organizations_url": "https://api.github.com/users/luceatnobis/orgs", "received_events_url": "https://api.github.com/users/luceatnobis/received_events", "repos_url": "https://api.github.com/users/luceatnobis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/luceatnobis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/luceatnobis/subscriptions", "type": "User", "url": "https://api.github.com/users/luceatnobis", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-30T14:10:53Z
2021-09-09T00:01:09Z
2014-03-30T14:17:30Z
NONE
resolved
I have noticed that requests sessions don't send referrer links, at least in what I tried, nor have I found any mention of referrers in the documentation. The only mention of referrers in the entire requests source happens in the http://hg.python.org/cpython/file/3.4/Lib/http/server.py#l1088 file. Can anyone give information about the status of this header field?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1983/reactions" }
https://api.github.com/repos/psf/requests/issues/1983/timeline
null
completed
null
null
false
[ "Correct, we don't. We send the minimal number of header fields to prevent servers tripping up on what we do, but we don't realistically know what you're doing. This means that it's reckless to assert that the last page you looked up via this `Session` should be the referrer: they could be totally unrelated!\n\nIf you want to send the `Referer` header, you should send it yourself: you know better than we do. =)\n", "Unlike a real web browser, we can not guess from which URL your request originate from. Using the previously requested URL could end up being an exposure of private data relating to what you were doing. There's no reason we should be forming or sending the Referrer header on your behalf.\n\nFurther, the issue tracker is not the best place for questions. There is an active IRC channel, StackOverflow tag, and not-so-active mailing list. Any of those three would be preferable to opening issues.\n" ]
https://api.github.com/repos/psf/requests/issues/1982
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1982/labels{/name}
https://api.github.com/repos/psf/requests/issues/1982/comments
https://api.github.com/repos/psf/requests/issues/1982/events
https://github.com/psf/requests/issues/1982
30,444,988
MDU6SXNzdWUzMDQ0NDk4OA==
1,982
Adding socks5 support
{ "avatar_url": "https://avatars.githubusercontent.com/u/5372304?v=4", "events_url": "https://api.github.com/users/luceatnobis/events{/privacy}", "followers_url": "https://api.github.com/users/luceatnobis/followers", "following_url": "https://api.github.com/users/luceatnobis/following{/other_user}", "gists_url": "https://api.github.com/users/luceatnobis/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/luceatnobis", "id": 5372304, "login": "luceatnobis", "node_id": "MDQ6VXNlcjUzNzIzMDQ=", "organizations_url": "https://api.github.com/users/luceatnobis/orgs", "received_events_url": "https://api.github.com/users/luceatnobis/received_events", "repos_url": "https://api.github.com/users/luceatnobis/repos", "site_admin": false, "starred_url": "https://api.github.com/users/luceatnobis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/luceatnobis/subscriptions", "type": "User", "url": "https://api.github.com/users/luceatnobis", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-03-29T14:36:57Z
2021-09-09T00:01:10Z
2014-03-29T14:41:51Z
NONE
resolved
Hey, would there be any chance to have socks4a/socks5 proxy support added to requests in the near future? For Python2 there was a module called requesocks which provided that functionality, but apparently that one is unmaintained and has not been ported to python3 yet. Would that be within the goals of the library?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1982/reactions" }
https://api.github.com/repos/psf/requests/issues/1982/timeline
null
completed
null
null
false
[ "Requests is open to having SOCKS support, but it's limited by `urllib3`, which provides lots of our underlying HTTP functionality. There's an open issue there (shazow/urllib3#284) which needs some work, and has been open for a long time: if you think there's help you can provide you should swing by and offer it!\n" ]
https://api.github.com/repos/psf/requests/issues/1981
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1981/labels{/name}
https://api.github.com/repos/psf/requests/issues/1981/comments
https://api.github.com/repos/psf/requests/issues/1981/events
https://github.com/psf/requests/pull/1981
30,430,912
MDExOlB1bGxSZXF1ZXN0MTQxMzY1MTI=
1,981
Update urllib3 to 1.8 (8a8c601bee)
{ "avatar_url": "https://avatars.githubusercontent.com/u/963826?v=4", "events_url": "https://api.github.com/users/stanhu/events{/privacy}", "followers_url": "https://api.github.com/users/stanhu/followers", "following_url": "https://api.github.com/users/stanhu/following{/other_user}", "gists_url": "https://api.github.com/users/stanhu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/stanhu", "id": 963826, "login": "stanhu", "node_id": "MDQ6VXNlcjk2MzgyNg==", "organizations_url": "https://api.github.com/users/stanhu/orgs", "received_events_url": "https://api.github.com/users/stanhu/received_events", "repos_url": "https://api.github.com/users/stanhu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/stanhu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stanhu/subscriptions", "type": "User", "url": "https://api.github.com/users/stanhu", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-28T23:42:34Z
2021-09-08T22:01:12Z
2014-03-31T14:29:30Z
CONTRIBUTOR
resolved
I noticed this version of urllib3 was attempting to use RC4 as the cipher, and SSL connections using pyOpenSSL would fail against servers that did not support RC4. Update to urllib3 v1.8 to get the new cipher settings pushed in https://github.com/shazow/urllib3/commit/2088570a293df42b1623dd74fcff0174d0565af5.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1981/reactions" }
https://api.github.com/repos/psf/requests/issues/1981/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1981.diff", "html_url": "https://github.com/psf/requests/pull/1981", "merged_at": "2014-03-31T14:29:30Z", "patch_url": "https://github.com/psf/requests/pull/1981.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1981" }
true
[ "Thanks for this! We don't really need the Pull Request though, we update urllib3 in each release. =) I'll leave it open because Kenneth may just want to merge it anyway.\n", "@Lukasa that is not true :)\n" ]
https://api.github.com/repos/psf/requests/issues/1980
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1980/labels{/name}
https://api.github.com/repos/psf/requests/issues/1980/comments
https://api.github.com/repos/psf/requests/issues/1980/events
https://github.com/psf/requests/pull/1980
30,417,272
MDExOlB1bGxSZXF1ZXN0MTQxMjc4NDY=
1,980
Create Request.links_multi and utils function for properly parsing Link headers
{ "avatar_url": "https://avatars.githubusercontent.com/u/711371?v=4", "events_url": "https://api.github.com/users/alekstorm/events{/privacy}", "followers_url": "https://api.github.com/users/alekstorm/followers", "following_url": "https://api.github.com/users/alekstorm/following{/other_user}", "gists_url": "https://api.github.com/users/alekstorm/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alekstorm", "id": 711371, "login": "alekstorm", "node_id": "MDQ6VXNlcjcxMTM3MQ==", "organizations_url": "https://api.github.com/users/alekstorm/orgs", "received_events_url": "https://api.github.com/users/alekstorm/received_events", "repos_url": "https://api.github.com/users/alekstorm/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alekstorm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alekstorm/subscriptions", "type": "User", "url": "https://api.github.com/users/alekstorm", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" }, { "color": "fef2c0", "default": false, "description": null, "id": 60669570, "name": "Please Review", "node_id": "MDU6TGFiZWw2MDY2OTU3MA==", "url": "https://api.github.com/repos/psf/requests/labels/Please%20Review" } ]
closed
true
null
[]
null
12
2014-03-28T19:49:10Z
2021-09-08T10:01:13Z
2014-08-22T13:12:40Z
NONE
resolved
The current `Request.links` property has a few problems: it assumes there will only be a single link for a given relation type, and that each link has only a single relation type (see [RFC 5988](http://tools.ietf.org/html/rfc5988)); and the `utils.parse_header_links()` function it relies on does not properly parse inputs containing commas or semicolons within quoted strings. However, since I'm not sure of the proper way to fix either function without breaking backwards compatibility, I've instead created `Requests.links_multi` and `utils.parse_header_links_full`, if only to solicit feedback on the logic itself; the factoring can be changed in any way the maintainers prefer. Specifically, `parse_header_links_full` returns a list of the new `Link` `namedtuple`s with `uri` and `attrs` attributes. Internally, it delegates to a new `tokenize` function, which splits a string into a list of tokens (identifiers, "specials", and quoted strings) according to the rules in [RFC 2616](http://tools.ietf.org/html/rfc2616#section-2.2), which makes the parsing itself quite straightforward with some application of `itertools`. In addition, `parse_header_links` is reimplemented in terms of this, to at least properly handle quoted delimiters. `Request.links_multi`, in turn, returns a `CaseInsensitiveDict` keyed on the relation type (`Request.links` has been updated to do this as well), whose values are a list of all the links with that relation type. If a link has multiple relation types (e.g. `rel="foo bar"`), it will be added to the lists for each relation type.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1980/reactions" }
https://api.github.com/repos/psf/requests/issues/1980/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1980.diff", "html_url": "https://github.com/psf/requests/pull/1980", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1980.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1980" }
true
[ "Wow, this is a serious chunk of work: thankyou!\n\nI have to get around to reviewing this, which I haven't done yet, but it seems like it's a good enhancement. I'm not going to speculate on how @kennethreitz will want to handle the API change: whether he'll want extra properties or to hold off until Semver lets us make this backwards incompatible change.\n", "I actually have no review feedback: assuming the API design is acceptable to Kenneth, I'm happy with this. =) It's going to confuse the hell out of anyone not familiar with `itertools`, but that's just an excuse for them to learn it. :+1: This is just outstanding work.\n", "I haven't looked at this at all and don't have any opinion yet. I just think the important context here is that (if I remember correctly) @kennethreitz was explicit in his desire to only handle the case where there's a single link for a given relation type. I'll have to find the corresponding issues/PRs to double check though. \n\nIn general I'm always :+1: for conforming to RFCs. As a side note, if this is rejected, I would be more than happy to accept this as a feature for the [toolbelt](/sigmavirus24/toolbelt).\n", "What's the status of this PR? I'm starting a new job soon, and will have less time to support languishing PRs like this. Still willing to make changes, however.\n", "@alekstorm Kenneth's hugely busy, and so sizeable pull requests will tend to sit on the pile for a while until he has time to look at them. We're not a project in a rush, so don't worry, this won't be lost, and we won't be pressuring you to make fast changes. =)\n\nCongrats on the new job, btw! \n", "@alekstorm also there's no rush. If needed, @Lukasa can improve the PR on a branch of our own (using all of your work as a base). And congratulations on the new job! I hope it treats you well :)\n", "Could you make this API function like the current one, but using a MultiDict, perhaps?\n", "@alekstorm if you can't finish this up, let me know and I'll take over the work for you.\n", "ppppppppiiiiiiing :)\n", "Sorry about the delay. How do I go about using a MultiDict? It looks like the discussion stalled in https://github.com/kennethreitz/requests/issues/1155, but urllib3 recently gained an `HTTPHeaderDict`, which looks like it provides the functionality we need, but using it for something other than HTTP headers feels... hacky.\n\nAssuming we find a MultiDict implementation, do you want me to nix `links_multi` in favor of changing `links`?\n", "Hah, when I originally wrote `HTTPHeaderDict` it was called `CaseInsensitiveMultiDict`. We ended up changing it because the `__getitem__` implementation joins all values for the key together with a comma which is something fairly peculiar to HTTP headers and means that it can only really work on string values. Due to this it might fall somewhat short for your needs. This is also opposed to e.g Werkzeug's `MultiDict` which just gives you back the first value.\n\nI'm open to any changes/feedback you might suggest for it to be more useful to you. Personally I think the `getlist` method needs to be changed to just go over the values because `set-cookie` headers can contain commas. If that is done it's really only the `__getitem__` behavior that keeps it from being a general `CaseInsensitiveMultiDict`.\n", "Closing due to lack of activity. \n" ]
https://api.github.com/repos/psf/requests/issues/1979
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1979/labels{/name}
https://api.github.com/repos/psf/requests/issues/1979/comments
https://api.github.com/repos/psf/requests/issues/1979/events
https://github.com/psf/requests/issues/1979
30,379,429
MDU6SXNzdWUzMDM3OTQyOQ==
1,979
Authentication Handlers lost on redirect.
{ "avatar_url": "https://avatars.githubusercontent.com/u/2158258?v=4", "events_url": "https://api.github.com/users/costinb7/events{/privacy}", "followers_url": "https://api.github.com/users/costinb7/followers", "following_url": "https://api.github.com/users/costinb7/following{/other_user}", "gists_url": "https://api.github.com/users/costinb7/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/costinb7", "id": 2158258, "login": "costinb7", "node_id": "MDQ6VXNlcjIxNTgyNTg=", "organizations_url": "https://api.github.com/users/costinb7/orgs", "received_events_url": "https://api.github.com/users/costinb7/received_events", "repos_url": "https://api.github.com/users/costinb7/repos", "site_admin": false, "starred_url": "https://api.github.com/users/costinb7/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/costinb7/subscriptions", "type": "User", "url": "https://api.github.com/users/costinb7", "user_view_type": "public" }
[]
closed
true
null
[]
null
18
2014-03-28T11:20:48Z
2021-09-08T13:05:39Z
2016-12-14T16:35:09Z
NONE
resolved
I'm trying to use the requests library with a redirection and the Digest authentication method, but the response is 401. I should mention that it works with Basic authentication. I've captured the packets with Wireshark and noticed that the first HTTP request goes out without the Authorization header, the 401 Unauthorized answer is received, and after that the traffic continues as it should: the Authorization header is added, the 302 answer is received, and then the HTTPS cipher exchange follows. I don't know why the requests send method returns 401.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1979/reactions" }
https://api.github.com/repos/psf/requests/issues/1979/timeline
null
completed
null
null
false
[ "Wait, I'm confused. Your description of the wireshark trace suggests that everything is happening as expected, but you're getting an unexpected status code?\n", "Yes, but in fact, I don't know if it is happening _exactlly_ as it should be. I tried to make a redirection from http to https. When I try with a browser it works well, and also it works well when using requests with Basic authentication. When using with Digest, what happens is as I described above (+ the https message exchange: server hello, server hello done, client key exchange, change cipher spec etc), but the send() method responds with 401.\n", "Can I see some sample code?\n", "The sample code is very simple:\n\nurl1 = 'http://192.168.40.248/param.cgi?action=list&group=root.AutoTracking'\nres = s.get(url1, auth=requests.auth.HTTPDigestAuth('user', 'pass'), verify=False)\nprint res.status_code\n\nThe url1 address is automatic redirecting to : \"https://192.168.40.248/...\"(the same url but with https).\nSo, the above code prints: \"401\"\n\nIf I change the authentication class to HTTPBasicAuth, and change the server to also use Basic authentication, everything works well (200 is printed).\n\nOfcourse, you don't have access to the DUT that I use. But I suppose you can try with any URL that automatic redirects to use https and use digest auth method.\n", "What version of requests are you using, and where did you install it from?\n", "I am using requests-2.2.1, I've downloaded it from here: https://pypi.python.org/pypi/requests (requests-2.2.1.tar.gz).\n", "Aha, ok.\n\nI don't think auth handlers can handle an auth challenge that occurs _after_ a redirect: I think we lose track of the auth handler. @sigmavirus24 does this match your reading of the code?\n\nIf that's the case, do we consider that behaviour a bug?\n", "Note: I'm renaming this issue to more accurately reflect the bug.\n", "In the case of basic auth, the header is kept (basic auth works with redirection). In the release history, one of the updates from version 0.7.4 to version 0.7.5 is : \"Redirection auth handling.\" , which suggests that this problem was treated.\n", "@costinb7 Can you print the following things for me?\n\n``` python\n[x.request.headers.get('Authorization', '') for x in res.history] + [res.request.headers.get('Authorization', '')]\n[x.headers.get('WWW-Authenticate', '') for x in res.history] + [res.headers.get('WWW-Authenticate', '')]\n```\n", "This is the result for the Digest auth:\n- for the first line:\n ['Digest username=\"root\", realm=\"AXIS_00408CC5845E\", nonce=\"0000125dY1917315e1e3668a12ee27b941e936bef5eb05\", uri=\"/axis-cgi/param.cgi?action=list&group=root.AutoTracking.A0.Running\", response=\"f04e66d4f1efffc1d3021866d2a1e784\", qop=\"auth\", nc=00000001, cnonce=\"bfaa00a7e5656fc6\"', '']\n- for the second line:\n ['', 'Digest realm=\"AXIS_00408CC5845E\", nonce=\"0000125dY475861fc82c8b34f5dd0c9131a4291ffbbec8\", stale=FALSE, qop=\"auth\"']\n\nfor Basic auth:\n- for the first line:\n [u'Basic cm9vdDpwYXNz', u'Basic cm9vdDpwYXNz']\n- for the second line:\n ['', '']\n", "Oh, right, I get it now.\n\nHTTP Basic and HTTP Digest are handled very differently in requests. HTTP Basic just applies a flat Authorization header to each request: HTTP Digest makes its own authentication challenge response. The important thing is that it'll only do that once: if challenged a second time, it aborts. 
That's exactly what's happening here.\n\nMy question is, should we try to distinguish between a 401 challenge that fails and a second 401, following a redirect?\n", "If you are asking me, yes it would be great if this distinction could be made.\n", "Opened a pull request please let me know if the fix is acceptable and I'll write the test.\n", "Hey guys, I'm still having this problem (using basic auth), just wondering if you intend to fix it any time soon. Other than that the project is awesome, so thanks for that :)\n", "@amnong We have no specific timetable for this at this time.\n", "So it looks like this was patched with #2253 but there wasn't a test included at the time.\r\n\r\nI've written up a [couple tests](https://gist.github.com/nateprewitt/86b4e076ab5142ede32f09dbddd2ae56) to verify the individual pieces are working as expected, but can't find a way to do a full repro with httpbin.\r\n\r\nI'm not sure @amnong's issue was related to this since it's through `HTTPBasicAuth` which should always send the 'Authorization' header.\r\n\r\nI'll throw up a PR with the tests if this looks right.", "Alright cool, this was fixed in #2253 and we now have confirmation in the test suite with #3766." ]
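A rough workaround sketch for the behaviour debugged above: turn off automatic redirects so the Digest handler can also answer the challenge issued by the final (HTTPS) URL. The device address and credentials come from the reporter's example, and `verify=False` mirrors their setup; this is not the library's built-in fix.

```python
import requests
from requests.auth import HTTPDigestAuth

auth = HTTPDigestAuth("user", "pass")
url = "http://192.168.40.248/param.cgi?action=list&group=root.AutoTracking"

first = requests.get(url, auth=auth, allow_redirects=False, verify=False)
if 300 <= first.status_code < 400:
    # Re-issue the request against the redirect target so the 401 challenge
    # there is answered with fresh Digest credentials.
    final = requests.get(first.headers["Location"], auth=auth, verify=False)
    print(final.status_code)
else:
    print(first.status_code)
```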
https://api.github.com/repos/psf/requests/issues/1978
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1978/labels{/name}
https://api.github.com/repos/psf/requests/issues/1978/comments
https://api.github.com/repos/psf/requests/issues/1978/events
https://github.com/psf/requests/issues/1978
30,347,364
MDU6SXNzdWUzMDM0NzM2NA==
1,978
Handle httplib IncompleteRead exception
{ "avatar_url": "https://avatars.githubusercontent.com/u/452575?v=4", "events_url": "https://api.github.com/users/hackdna/events{/privacy}", "followers_url": "https://api.github.com/users/hackdna/followers", "following_url": "https://api.github.com/users/hackdna/following{/other_user}", "gists_url": "https://api.github.com/users/hackdna/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hackdna", "id": 452575, "login": "hackdna", "node_id": "MDQ6VXNlcjQ1MjU3NQ==", "organizations_url": "https://api.github.com/users/hackdna/orgs", "received_events_url": "https://api.github.com/users/hackdna/received_events", "repos_url": "https://api.github.com/users/hackdna/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hackdna/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hackdna/subscriptions", "type": "User", "url": "https://api.github.com/users/hackdna", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2014-03-27T22:15:56Z
2021-09-08T23:10:59Z
2014-06-08T10:06:33Z
NONE
resolved
Requests version 1.2.3 Relevant part of traceback: ``` File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/sessions.py", line 335, in request resp = self.send(prep, **send_kwargs) File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/sessions.py", line 438, in send r = adapter.send(request, **kwargs) File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/adapters.py", line 340, in send r.content File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/models.py", line 594, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/models.py", line 541, in generate chunk = self.raw.read(chunk_size, decode_content=True) File "/srv/scc/virtualenvs/refinery-platform/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 171, in read data = self._fp.read(amt) File "/n/sw/python-2.7.3_slib/lib/python2.7/httplib.py", line 541, in read return self._read_chunked(amt) File "/n/sw/python-2.7.3_slib/lib/python2.7/httplib.py", line 586, in _read_chunked raise IncompleteRead(''.join(value)) IncompleteRead: IncompleteRead(4303 bytes read) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1978/reactions" }
https://api.github.com/repos/psf/requests/issues/1978/timeline
null
completed
null
null
false
[ "Requests v1.2.3 is several versions out of date and not supported: I suggest you try upgrading. =)\n", "Is IncompleteRead exception handled in the latest version?\n", "That depends. If the server has sent you less data than the Content-Length header says they have, then no.\n", "OK, so would it be possible to add handling of that exception?\n", "Can you reproduce it on the latest version of requests? \n", "@hackdna Also, what would 'handling that exception' mean? We don't know what you want to do in that situation.\n", "Closed for inactivity.\n" ]
https://api.github.com/repos/psf/requests/issues/1977
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1977/labels{/name}
https://api.github.com/repos/psf/requests/issues/1977/comments
https://api.github.com/repos/psf/requests/issues/1977/events
https://github.com/psf/requests/issues/1977
30,323,043
MDU6SXNzdWUzMDMyMzA0Mw==
1,977
SSL Wildcard Cert Issue
{ "avatar_url": "https://avatars.githubusercontent.com/u/2682247?v=4", "events_url": "https://api.github.com/users/athoik/events{/privacy}", "followers_url": "https://api.github.com/users/athoik/followers", "following_url": "https://api.github.com/users/athoik/following{/other_user}", "gists_url": "https://api.github.com/users/athoik/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/athoik", "id": 2682247, "login": "athoik", "node_id": "MDQ6VXNlcjI2ODIyNDc=", "organizations_url": "https://api.github.com/users/athoik/orgs", "received_events_url": "https://api.github.com/users/athoik/received_events", "repos_url": "https://api.github.com/users/athoik/repos", "site_admin": false, "starred_url": "https://api.github.com/users/athoik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/athoik/subscriptions", "type": "User", "url": "https://api.github.com/users/athoik", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2014-03-27T17:28:26Z
2021-09-09T00:01:10Z
2014-03-27T18:44:22Z
NONE
resolved
Hello, I am trying to open https://ia600301.us.archive.org/ with `requests` and i am receiving an `SSLError` ``` >>> import requests >>> requests.get("https://ia600301.us.archive.org/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/lib/python2.7/site-packages/requests/api.py", line 55, in get File "/usr/lib/python2.7/site-packages/requests/api.py", line 44, in request File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 335, in request File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 438, in send File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 331, in send requests.exceptions.SSLError: hostname 'ia600301.us.archive.org' doesn't match either of '*.archive.org', 'archive.org' >>> requests.__version__ '1.2.3' >>> >>> import requests >>> requests.get("https://ia600301.us.archive.org/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python2.7/dist-packages/requests-2.0.1-py2.7.egg/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.0.1-py2.7.egg/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.0.1-py2.7.egg/requests/sessions.py", line 361, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.0.1-py2.7.egg/requests/sessions.py", line 464, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.0.1-py2.7.egg/requests/adapters.py", line 363, in send raise SSLError(e) requests.exceptions.SSLError: hostname 'ia600301.us.archive.org' doesn't match either of '*.archive.org', 'archive.org' >>> requests.__version__ '2.0.1' >>> import requests >>> requests.get("https://ia600301.us.archive.org/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python2.7/dist-packages/requests-2.2.1-py2.7.egg/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.2.1-py2.7.egg/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.2.1-py2.7.egg/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.2.1-py2.7.egg/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests-2.2.1-py2.7.egg/requests/adapters.py", line 385, in send raise SSLError(e) requests.exceptions.SSLError: hostname 'ia600301.us.archive.org' doesn't match either of '*.archive.org', 'archive.org' >>> requests.__version__ '2.2.1' ``` The same Cert doesn't compain on my browser, also it looks ok here http://www.sslshopper.com/ssl-checker.html#hostname=https://ia600301.us.archive.org/
{ "avatar_url": "https://avatars.githubusercontent.com/u/2682247?v=4", "events_url": "https://api.github.com/users/athoik/events{/privacy}", "followers_url": "https://api.github.com/users/athoik/followers", "following_url": "https://api.github.com/users/athoik/following{/other_user}", "gists_url": "https://api.github.com/users/athoik/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/athoik", "id": 2682247, "login": "athoik", "node_id": "MDQ6VXNlcjI2ODIyNDc=", "organizations_url": "https://api.github.com/users/athoik/orgs", "received_events_url": "https://api.github.com/users/athoik/received_events", "repos_url": "https://api.github.com/users/athoik/repos", "site_admin": false, "starred_url": "https://api.github.com/users/athoik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/athoik/subscriptions", "type": "User", "url": "https://api.github.com/users/athoik", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1977/reactions" }
https://api.github.com/repos/psf/requests/issues/1977/timeline
null
completed
null
null
false
[ "Afaik wildcards are not supposed to be recursive.\nI don't have time right now, but I'll write a proper answer later.\n", "@t-8ch @athoik \n\nwildcard certs support only one level wildcard-ish. \n\nBut: The SSL certificate on hostname `ia600301.us.archive.org` is issued on the name `*.us.archive.org`. \n\nThe error-message in requests shows another domain name .. \n", "@syphar, that is correct! Certificate has `Common name: *.us.archive.org` but in logs there is `hostname 'ia600301.us.archive.org' doesn't match either of '*.archive.org', 'archive.org'`\n", "This is an SNI bug: Python 2.7 does not support SNI in the standard library at this time. You can get requests support for it by installing `pyopenssl`, `ndg-httpsclient` and `pyasn1` in addition to requests.\n\nIf you do that, you'll then find that the server doesn't support the handshake. I was unable to get this to work at all in my copy of requests, even when using the [SSLAdapter](http://toolbelt.readthedocs.org/en/latest/user.html#ssladapter) to force a protocol version. (Note that you need to use PyOpenSSL version specifiers when using the SSLAdapter in this situation.)\n\nSadly, at this time you're limited by the support of SSL in Python 2.7, which is _very bad_. My recommendation is to upgrade to Python 3, and hope that PEP 466 gets accepted to improve the SSL situation in Python 2.7.\n", "Thanks for the explanation @Lukasa, lets hope that PEP 466 will accepted in Python 2.7. Until then i am afraid that setting `verify=False` is the only way.\n", "You could also verify the certificate fingerprint(s), if this is feasable for your usecase.\nYou would need a custom transport adapter and the `assert_fingerprint` parameter to urllib3\n" ]
https://api.github.com/repos/psf/requests/issues/1976
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1976/labels{/name}
https://api.github.com/repos/psf/requests/issues/1976/comments
https://api.github.com/repos/psf/requests/issues/1976/events
https://github.com/psf/requests/pull/1976
30,209,715
MDExOlB1bGxSZXF1ZXN0MTQwMDM4NDE=
1,976
The expected value changed for the proxies keyword
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
null
[]
null
5
2014-03-26T13:13:26Z
2021-09-08T23:06:27Z
2014-03-26T15:34:36Z
CONTRIBUTOR
resolved
It used to be None but a recent PR changed that before my last one was merged Fixes #1975
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1976/reactions" }
https://api.github.com/repos/psf/requests/issues/1976/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1976.diff", "html_url": "https://github.com/psf/requests/pull/1976", "merged_at": "2014-03-26T15:34:36Z", "patch_url": "https://github.com/psf/requests/pull/1976.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1976" }
true
[ ":+1: Totally safe, thanks!\n", "In related news, @kennethreitz is there an email address that the Jenkins server emails when tests fail? If not, could you add @Lukasa and/or me to the notifications? Assuming Jenkins runs on pushes to master (i.e., when you merge a PR) I would have seen this and fixed it sooner.\n", "@sigmavirus24 I can set that up if you'd like, but it's pretty damn annoying :)\n", "@kennethreitz It's annoying, but it's the kind of annoying that @sigmavirus24 and I are for. =D\n", "What @Lukasa said. Also I like to know when I've broken stuff. (Because this was sort of my fault)\n" ]
https://api.github.com/repos/psf/requests/issues/1975
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1975/labels{/name}
https://api.github.com/repos/psf/requests/issues/1975/comments
https://api.github.com/repos/psf/requests/issues/1975/events
https://github.com/psf/requests/issues/1975
30,190,349
MDU6SXNzdWUzMDE5MDM0OQ==
1,975
Testsuite Failure: test_requests_are_updated_each_time
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-03-26T07:24:03Z
2021-09-09T00:01:11Z
2014-03-26T15:34:36Z
MEMBER
resolved
From Jenkins: ``` ============================= test session starts ============================== platform linux -- Python 3.3.2 -- pytest-2.3.4 plugins: cov collected 125 items test_requests.py ...........................................................................................................................F. =================================== FAILURES =================================== ______________ TestRedirects.test_requests_are_updated_each_time _______________ self = <test_requests.TestRedirects object at 0x2aca5b721dd0> def test_requests_are_updated_each_time(self): session = RedirectSession([303, 307]) prep = requests.Request('POST', 'http://httpbin.org/post').prepare() r0 = session.send(prep) assert r0.request.method == 'POST' assert session.calls[-1] == SendCall((r0.request,), {}) redirect_generator = session.resolve_redirects(r0, prep) for response in redirect_generator: assert response.request.method == 'GET' send_call = SendCall((response.request,), TestRedirects.default_keyword_args) > assert session.calls[-1] == send_call E assert SendCall(args...ects': False}) == SendCall(args=...ects': False}) E At index 1 diff: {'stream': False, 'timeout': None, 'cert': None, 'proxies': {}, 'verify': True, 'allow_redirects': False} != {'stream': False, 'timeout': None, 'cert': None, 'proxies': None, 'verify': True, 'allow_redirects': False} test_requests.py:1277: AssertionError generated xml file: /var/lib/jenkins/jobs/requests-pr/workspace/PYTHON/3.3/junit.xml ==================== 1 failed, 124 passed in 10.27 seconds ===================== ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1975/reactions" }
https://api.github.com/repos/psf/requests/issues/1975/timeline
null
completed
null
null
false
[ "Fixed in b92f4ec\n" ]
https://api.github.com/repos/psf/requests/issues/1974
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1974/labels{/name}
https://api.github.com/repos/psf/requests/issues/1974/comments
https://api.github.com/repos/psf/requests/issues/1974/events
https://github.com/psf/requests/issues/1974
30,084,364
MDU6SXNzdWUzMDA4NDM2NA==
1,974
Payload in array
{ "avatar_url": "https://avatars.githubusercontent.com/u/440320?v=4", "events_url": "https://api.github.com/users/guori12321/events{/privacy}", "followers_url": "https://api.github.com/users/guori12321/followers", "following_url": "https://api.github.com/users/guori12321/following{/other_user}", "gists_url": "https://api.github.com/users/guori12321/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/guori12321", "id": 440320, "login": "guori12321", "node_id": "MDQ6VXNlcjQ0MDMyMA==", "organizations_url": "https://api.github.com/users/guori12321/orgs", "received_events_url": "https://api.github.com/users/guori12321/received_events", "repos_url": "https://api.github.com/users/guori12321/repos", "site_admin": false, "starred_url": "https://api.github.com/users/guori12321/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/guori12321/subscriptions", "type": "User", "url": "https://api.github.com/users/guori12321", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2014-03-24T23:15:22Z
2021-09-09T00:09:54Z
2014-03-25T09:11:12Z
NONE
resolved
When I studied the HTTP POST in Google Flights, I found the request payload looks like the following: [,[[,"aa","[,2,\"S\"]","18563594632275",5]],[,[[,"b_al","no:73"],[,"b_ahr","no:s"],[,"b_am","aa"],[,"b_qu","0"],[,"b_qc","1"]]]] However, Python allows None in a list, that is, var = [None, 'something'] is valid Python, but var = [ , 'something'] is a syntax error. What's more, when I post something via the requests library, there is an error: TypeError: 'NoneType' object is not iterable. So, could you please give me some suggestions on how to deal with this payload? I searched for it, but in other cases the payload is an ordinary list rather than an array like [, 'something'].
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1974/reactions" }
https://api.github.com/repos/psf/requests/issues/1974/timeline
null
completed
null
null
false
[ "Uh, can I see the body as it went over the wire? Because that payload doesn't look like anything I've ever seen before.\n", "1. For instance,\n https://www.google.fr/flights/#search;f=CFE;t=ORY,CDG;d=2014-04-10;r=2014-04-14.\n When you input the city name (Paris in this example), you will get the\n following.\n 2.\n2. Request URL:\n https://www.google.fr/flights/rpc\n3. Request Method:\n POST\n4. Status Code:\n 200 OK\n5. Request Headers\n 1. :host:\n www.google.fr\n 2. :method:\n POST\n 3. :path:\n /flights/rpc\n 4. :scheme:\n https\n 5. :version:\n HTTP/1.1\n 6. accept:\n _/_\n 7. accept-encoding:\n gzip,deflate,sdch\n 8. accept-language:\n en-US,en;q=0.8,zh-CN;q=0.6,zh;q=0.4\n 9. content-length:\n 332\n 10. content-type:\n application/json; charset=UTF-8\n 11. cookie:\n PREF=ID=bdc670c680013c1d:U=19abbcd0a8a83450:LD=en:TM=1394271727:LM=1394271727:S=QmfR-IjoUHauV5zE;\n NID=67=amO5W8p5BQ4fVLAxx-0BDwEn7HfIMxgmu7qBOUmcDxNIO1kSHXy7zTuzJq4lVTrCL3LeReO6Zt3L6FDexx6A-KkhoN1ssm2yhDhJqCupoP1C03GzgyotwrL4kchf8ZyQ4XDIOyCETdEjgbhutwcyoub7vg5P0xIz-ZMFDeEOaTvHS4TosDwnByJPQJP586F0Lab2HGXMuxgqfVXj-Y1L-vJoE0zYgkoNlVj-SCrKpoFS0Rd7mEIHnjBE58_M;\n HSID=AI16cWqyIltev68Uf; SSID=AlsAFvEjOeGI2t5dK;\n APISID=xY5RyNYIKLf8WiOV/AyRDjN2kPNIj0LPun;\n SAPISID=ciFaHzHJIXPA1jk3/AxCRCO5Y6MZJZWYgC;\n SID=DQAAAGIBAABrDfdaH1R_Yn793rLBvpfcnAKmXxfQrx4-NCpxKVau462gHkkEk-O9efQjJ5it9p5brTq9JvvbmLycn_VQeXz3vYlhCUkSAcE6S-CtnIW9IwSYVG8ymnj14JR4YDfT3E-JgY3sZAyZII4AqD3kAkb0t1SNp75wg_m36xFOayDD3D048PCcd6ziA3XUN7G-9H9cY-TqvK2NAyKxT83ab9pnZ7d_ke1nzbWpmTLq5MCmfE5eSfj31HXHEC_N7djFEeiTojdZG0WZmBgmGU38lYzo2Zsqe_hqQ_lZY5Q_FXTLsAxJBzSd-EiOP1OGGGEpceTOURtJfoAhTz5dl_c-eOVRQoZNrBoJLgY77sgKJQ52060A1vzH-oxmfRKaJFlDCmkl7Oqtlu6VxTbAcq8LaK67FIEftgfAVdg2dQX9o7s6RglTqGdVT31bCr53sSQ3bmlfHjGgIeHS3Lu3bJ3h99UESxi-nM8xnSad9mNWvEnccw;\n S=travel-flights=7TxbQr450yFEt8Fvj_9-mA\n 12. origin:\n https://www.google.fr\n 13. referer:\n https://www.google.fr/flights/\n 14. user-agent:\n Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.36\n (KHTML, like Gecko) Chrome/33.0.1750.152 Safari/537.36\n 15. x-client-data:\n CPm1yQEIlLbJAQiitskBCKm2yQEIxLbJAQi5iMoBCNiIygE=\n 16. x-gwt-cctoken:\n \n ADS25WNRLxnpbQVKxuqSMqiVqxn6s93mt3519cz4gVBbRfMoGZw2GbNb4S87PRNL65-ea7L6gTpcQOZcUjVw9flmw9tBfHmhvCreoXDJoP-2pZZGcp66BFFbMFtQArEPlRAfKzC-wcjQjPA02FnPR7sXYuPhgI4WqyBwtVvMlxQ8kOUozKZMMqfyNWGS4ev15eKrhA\n 17. x-gwt-module-base:\n https://www.google.fr/flights/static/\n 18. x-gwt-permutation:\n D847595CBA5616E47BFE440BE5D950B6\n 19. Request Payload\n 1.\n [,[[,\"tb\",\"[,[,[[,[\\\"CFE\\\"],[\\\"ORY\\\",\\\"CDG\\\"],\\\"2014-04-10\\\"],[,[\\\"ORY\\\",\\\"CDG\\\"],[\\\"CFE\\\"],\\\"2014-04-14\\\"]]]]\",\"1846721392154327\",11]],[,[[,\"b_lr\",\"15:45\"],[,\"b_al\",\"fs:86\"],[,\"b_ahr\",\"fs:s\"],[,\"b_lr\",\"13:207\"],[,\"b_lr\",\"19:0\"],[,\"b_lr\",\"13:208\"],[,\"b_am\",\"tb\"],[,\"b_pe\",\"4F56A1E74FAE0.AD54F0A.299D\"],[,\"b_qu\",\"1\"],[,\"b_qc\",\"1\"]]]]\n6. Response Headersview source\n 1. alternate-protocol:\n 443:quic\n 2. cache-control:\n no-cache, no-store, max-age=0, must-revalidate\n 3. content-encoding:\n gzip\n 4. content-length:\n 724\n 5. content-type:\n application/json; charset=utf-8\n 6. date:\n Tue, 25 Mar 2014 08:27:13 GMT\n 7. expires:\n Fri, 01 Jan 1990 00:00:00 GMT\n 8. pragma:\n no-cache\n 9. server:\n GSE\n 10. status:\n 200 OK\n 11. version:\n HTTP/1.1\n 12. x-content-type-options:\n nosniff\n 13. x-frame-options:\n SAMEORIGIN\n 14. 
x-xss-protection:\n 1; mode=block\n\n2014-03-25 8:12 GMT+01:00 Cory Benfield [email protected]:\n\n> Uh, can I see the body as it went over the wire? Because that payload\n> doesn't look like anything I've ever seen before.\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1974#issuecomment-38536270\n> .\n\n## \n\nGuo Rui\nSenior Undergraduate Student\nSchool of Software\nHarbin Institute of Technology\n", "So, that request body appears to claim to be JSON, though I'm sure it's not valid JSON as Python fails to parse it. This is not really anything Requests can help you with: you'll need to create that string yourself. You won't even be able to use Python's JSON module to build it as Python has this silly idea that JSON actually needs to match the JSON spec.\n\nSorry we can't be more helpful, but I strongly suggest you write a very strongly worded letter to Google.fr to tell them to sort their mess out.\n", "I'm not sure how to create a string to do those things...I'm new to Web\ntech...Can you show me some examples in such cases? I searched the\ndocuments and didn't find what I want, so I open an issue in Github...\n\n2014-03-25 10:11 GMT+01:00 Cory Benfield [email protected]:\n\n> So, that request body appears to claim to be JSON, though I'm sure it's\n> not valid JSON as Python fails to parse it. This is not really anything\n> Requests can help you with: you'll need to create that string yourself. You\n> won't even be able to use Python's JSON module to build it as Python has\n> this silly idea that JSON actually needs to match the JSON spec.\n> \n> Sorry we can't be more helpful, but I strongly suggest you write a very\n> strongly worded letter to Google.fr to tell them to sort their mess out.\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1974#issuecomment-38543308\n> .\n\n## \n\nGuo Rui\nSenior Undergraduate Student\nSchool of Software\nHarbin Institute of Technology\n", "Unfortunately, there's very limited help I can provide without simply writing your code for you. I'm sorry that this was your first foray into web technologies because Google have screwed you. They've invented their own crazy system that doesn't interoperate with anything. You'll literally need to build a plain string with no particular structure and put your own values into it.\n", "For anyone who ends up looking at this mess in the future, [I wrote an open letter](https://gist.github.com/Lukasa/9758031).\n", "I got it...Perhaps I can try nodeJS to develop the crawler later. Thanks\nagain!\n\n2014-03-25 10:32 GMT+01:00 Cory Benfield [email protected]:\n\n> For anyone who ends up looking at this mess in the future, I wrote an\n> open letter https://gist.github.com/Lukasa/9758031.\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1974#issuecomment-38544888\n> .\n\n## \n\nGuo Rui\nSenior Undergraduate Student\nSchool of Software\nHarbin Institute of Technology\n" ]
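Since the payload above is not valid JSON, the only option on the requests side is to build the string by hand and send it through `data=`, as the maintainers suggest. A hedged sketch with a heavily truncated placeholder payload follows; it will not produce a working Google Flights query.

```python
import requests

# Hand-built, non-JSON payload string (placeholder values only).
payload = (
    '[,[[,"aa","[,2,\\"S\\"]","18563594632275",5]],'
    '[,[[,"b_qu","0"],[,"b_qc","1"]]]]'
)
response = requests.post(
    "https://www.google.fr/flights/rpc",  # endpoint from the capture above
    data=payload,
    headers={"Content-Type": "application/json; charset=UTF-8"},
)
print(response.status_code)
```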
https://api.github.com/repos/psf/requests/issues/1973
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1973/labels{/name}
https://api.github.com/repos/psf/requests/issues/1973/comments
https://api.github.com/repos/psf/requests/issues/1973/events
https://github.com/psf/requests/issues/1973
30,071,513
MDU6SXNzdWUzMDA3MTUxMw==
1,973
Sockets remain in CLOSE_WAIT
{ "avatar_url": "https://avatars.githubusercontent.com/u/454718?v=4", "events_url": "https://api.github.com/users/gosom/events{/privacy}", "followers_url": "https://api.github.com/users/gosom/followers", "following_url": "https://api.github.com/users/gosom/following{/other_user}", "gists_url": "https://api.github.com/users/gosom/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/gosom", "id": 454718, "login": "gosom", "node_id": "MDQ6VXNlcjQ1NDcxOA==", "organizations_url": "https://api.github.com/users/gosom/orgs", "received_events_url": "https://api.github.com/users/gosom/received_events", "repos_url": "https://api.github.com/users/gosom/repos", "site_admin": false, "starred_url": "https://api.github.com/users/gosom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gosom/subscriptions", "type": "User", "url": "https://api.github.com/users/gosom", "user_view_type": "public" }
[]
closed
true
null
[]
null
20
2014-03-24T20:21:18Z
2021-09-09T00:09:54Z
2014-03-25T14:48:15Z
NONE
resolved
``` Python 2.7.4 (default, Sep 26 2013, 03:20:56) [GCC 4.7.3] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> requests.__version__ '2.1.0' >>> response = requests.get('http://www.yahoo.de') ``` then check: ``` $ lsof -i python 22881 xubuntu 6u IPv4 545643 0t0 TCP 192.168.1.242:47049->ir2.fp.vip.ir2.yahoo.com:http (ESTABLISHED) ``` then: ``` >>> response.close() ``` and check again lsof -i ``` python 22881 xubuntu 6u IPv4 545643 0t0 TCP 192.168.1.242:47049->ir2.fp.vip.ir2.yahoo.com:http (CLOSE_WAIT) ``` the sockets close only when I exit the interpreter! Doing the same with urllib2: ``` >>> res = urllib2.urlopen('http://yahoo.de') ``` lsof -i ``` python 22881 xubuntu 4u IPv4 546690 0t0 TCP 192.168.1.242:48715->ir1.fp.vip.ir2.yahoo.com:http (ESTABLISHED) >>> res.close() ``` now the socket is closed! Can somebody explain this behavior?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1973/reactions" }
https://api.github.com/repos/psf/requests/issues/1973/timeline
null
completed
null
null
false
[ "Yes. =) Requests uses connection pooling to avoid the overhead of repeatedly creating and connecting sockets. This saves substantial system resources at the cost of keeping a few file descriptors open. Put another way: this is working exactly as designed. =)\n", "Thanks for the response.\nBut what if I am doing a lot of parallels requests?\nAnd if that is the case what is the purpose of the close method?\n", "We have a maximum number connections in the pool: in this case, 10 per scheme. The close method returns a socket to the pool. =)\n", "@Lukasa try that:\n\n```\nresponses = [ requests.get('http://www.greven.de') for _ in xrange(60) ]\n```\n\ndo an \n\n```\nlsof -i|grep 212.29|wc -l\n```\n\nyou should get 60\n\nthen \"release\" the connections to the pool:\n\n```\n[r.close() for r in responses]\n```\n\ndo the same \n\n```\nlsof -i|grep 212.29|wc -l\n```\n\nThere are still 60 sockets in CLOSE_WAIT state.\n\nI think that something is not as expected.\n\nCan you verify that it is ok?\nThanks\n", "Hmm, @shazow, thoughts?\n", "The connections seems to be reused when you do it like that:\n\n```\nfor _ in xrange(60):\n r = requests.get('http://www.greven.de')\n r.close()\n```\n", "Hmm. I wonder if, when you consider this and #1967, we have a bug with releasing connections back to the pool.\n", "Also this look strange and maybe related:\n\n```\n>> r = requests.get('http://www.greven.de')\n>> r.raw.closed\n```\n\n True\n\n```\npython 22881 xubuntu 6u IPv4 562317 0t0 TCP 192.168.1.242:45158->195.14.212.29:http (ESTABLISHED)\n```\n\nalthought the socked is ESTABLISHED .\n\nand then:\n\n```\n>>> r.close()\n>>> r.raw.closed\nTrue \n```\n", "Hm. One thing worth noting that by default, a pool will supply additional connections despite being empty. So, if you have a pool of size 10, and you ask for 50 connections concurrently, then it will create 50 new connections but still only return 10 of them to the pool (the rest will be closed and discarded). \n\nI'm not sure what Requests' default pool size is. I suspect it might be 1. If you're expecting to run N concurrent requests, I suggest creating a PoolManager with size N. You can also bottleneck on the pool size by using `block=True` on the pool. (These are urllib3 things, you'll need to translate them for Requests-land).\n", "This is what is supposed to do.\nIf that works as expected then when we do N parallel requests we should have N sockets open.\nWhen the requests have fiinished and read the content the sockets should close or returned to the pool.\n\nSo if the pool size is 10 (as Lukasa wrote) we have the following:\n- if N <= 10 all the connections should be returned to the pool\n - if N > 10 then N - 10 connections should be closed and 10 will be returned to the pool.\n\nSo after that we should at most have 10 sockets in wait_close state.\n\nAm I correct for the above?\n", "So the for loop is allowing previous responses to be garbage collected. The list comprehension does not allow for that. I suspect the problem is strongly related to that\n", "Aha, yes, the garbage collector is at fault here:\n\n``` python\n>>> for _ in xrange(60):\n... r = requests.get('http://www.greven.de')\n...\n>>>\n```\n\n``` bash\n[vagrant@localhost sipp]$ lsof -i|grep 212.29|wc -l\n1\n```\n\nThe one socket being held here is the one I have a reference to, via the variable `r`. 
We can see this with the list comprehension:\n\n``` python\n>>> responses = [ requests.get('http://www.greven.de') for _ in xrange(60) ]\n```\n\n``` bash\n[vagrant@localhost sipp]$ lsof -i|grep 212.29|wc -l\n61\n```\n\nNow I've got the 60 requests from the LC, plus the one from before. If I empty the list of stored responses, letting them get GC'd:\n\n``` python\n>>> responses = []\n```\n\n``` bash\n[vagrant@localhost sipp]$ lsof -i|grep 212.29|wc -l\n1\n```\n\nThis is obviously true, we don't close the sockets until the GC gets them. =) Unless we can see something else, there is no bug here.\n", "The for loop is first doing the request then closes the connection and then does the other request.\n\nThe list comprehension does the request then does the other requests ...\n\nWhen all the requests have been made it closes them one by one\n\nSo it is a different thing.\nThe list comprehension should create a connection foreach request.\nSo it should create N connections.\nThen it closes/releases these connections.\n\nThe for loop creates a connections and closes/releseases it.\nThe next iteration should resuse the same socket since it is released back to the pool.\n\nSo here I am expecting to create only one socket in the fist iteration and resuse it in the N-1 others\n", "@gosom The key is that you're storing the responses in the list comprehension. These responses have references to the socket, which prevents it getting garbage collected. We don't aggressively close sockets because there's no requirement to do that, but we do aggressively release them to the pool (where appropriate).\n", "@Lukasa \n\nyou are right it works correct this way.\nBut when you explicitly close the connections there should be only MAX_CONNECTION_POOL_SIZE open correct?\n", "Whether they are actually 'open' is not guaranteed: some servers close the connection, so we just hold on to the socket object. But yes, that should be the maximum we hold on to _once garbage collection has occurred_. \n", "```\n>>> import requests \n>>> responses = [requests.get('http://www.greven.de') for _ in xrange(30)] \n\n$lsof -i|grep python|grep -v grep|wc -l \n 30\n\n>>> [r.close() for r in responses] \n>>> responses = []\n\n$lsof -i|grep python|grep -v grep|wc -l \n 30\n```\n\nShouldn't the garbage collection cleanup?\nThen only 10 sockets should be in close_wait state\n", "I don't see that behaviour on my machine, I have all my sockets cleaned up.\n", "True, I tried in another machine (physical machine, I was testing in a vm before).\nand it is ok.\nIt cleanups all the sockets.\n\nBut still the close() method does not seem to do anything\n\nAn more if all the sockets are cleaned where is the reusage?\n\nOne more question the \n r.raw.closed property what is the purpose?\n", "1. The `.close()` method is documented to return the connection back to the pool, and is also mentioned not to be expected to be called.\n2. Connections are re-used if the server doesn't close them, but most do. We do, however, re-use the socket objects, thereby saving a few syscalls.\n3. You should try not to think about it. =) Generally, it reflects whether the backing connection is closed or not.\n\nThe advice when using Requests is not to worry about this stuff: we handle it for you.\n" ]
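A practical note for anyone hitting the same CLOSE_WAIT symptom: pooled sockets are only reclaimed once the Response objects are garbage-collected or the session is closed. A minimal sketch of keeping that deterministic, assuming a requests 2.x Session (which can be used as a context manager):

```python
import requests

# One Session means one connection pool, so sockets get reused instead of
# a new one being opened (and left to the garbage collector) per request.
with requests.Session() as session:
    for _ in range(60):
        r = session.get("http://www.greven.de")
        r.content  # read the body so the connection can go back to the pool

# Leaving the with-block calls session.close(), which closes the pooled
# sockets instead of waiting for garbage collection to reclaim them.
```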
https://api.github.com/repos/psf/requests/issues/1972
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1972/labels{/name}
https://api.github.com/repos/psf/requests/issues/1972/comments
https://api.github.com/repos/psf/requests/issues/1972/events
https://github.com/psf/requests/pull/1972
29,972,184
MDExOlB1bGxSZXF1ZXN0MTM4NzE5NjE=
1,972
Add __str__ to case insensitive dict
{ "avatar_url": "https://avatars.githubusercontent.com/u/873597?v=4", "events_url": "https://api.github.com/users/avidas/events{/privacy}", "followers_url": "https://api.github.com/users/avidas/followers", "following_url": "https://api.github.com/users/avidas/following{/other_user}", "gists_url": "https://api.github.com/users/avidas/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/avidas", "id": 873597, "login": "avidas", "node_id": "MDQ6VXNlcjg3MzU5Nw==", "organizations_url": "https://api.github.com/users/avidas/orgs", "received_events_url": "https://api.github.com/users/avidas/received_events", "repos_url": "https://api.github.com/users/avidas/repos", "site_admin": false, "starred_url": "https://api.github.com/users/avidas/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/avidas/subscriptions", "type": "User", "url": "https://api.github.com/users/avidas", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
9
2014-03-22T20:45:36Z
2021-09-09T00:01:15Z
2014-05-12T19:08:31Z
CONTRIBUTOR
resolved
Logging headers for debugging purposes is often necessary, and currently logging the headers falls back to **repr**, which exposes the implementation detail of the headers (CaseInsensitiveDict). Adding **str** to CaseInsensitiveDict makes the headers of the response object more log-friendly, by logging only the headers themselves and not the implementation details.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1972/reactions" }
https://api.github.com/repos/psf/requests/issues/1972/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1972.diff", "html_url": "https://github.com/psf/requests/pull/1972", "merged_at": "2014-05-12T19:08:31Z", "patch_url": "https://github.com/psf/requests/pull/1972.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1972" }
true
[ "In principle I'd be happy to take this, but you broke the build. =) Some sample output is [here](http://ci.kennethreitz.org/job/requests-pr/263/PYTHON=3.3/console): mind fixing that up?\n", "Whoops, that's embarrassing. Fixed now!\n", "Cool, LGTM. Thanks for this! :cake:\n", "@sigmavirus24 Sure updated now\n", ":+1: :cake: Thanks. Assigning @kennethreitz since @Lukasa and I agree this is ready to merge.\n", "Will you merge this @kennethreitz ?\n", "@avidas Kenneth is insanely busy (and probably even more so than usual as PyCon is next week), and so he does these things in batches. He'll get to it. =)\n", "Perhaps we should just change our `__repr__`. Random split functionality is random.\n", "Yeah, I'm going to just change our `__repr__`. Thanks, for the contribution! :cake: \n\nThis will be a wonderful UX improvement for all of our users :)\n" ]
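As an aside for anyone who wants similarly log-friendly output today without patching requests, casting the headers to a plain dict before logging gives much the same effect. A small illustrative sketch (the URL and logger setup are arbitrary):

```python
import logging
import requests

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("example")

r = requests.get("https://httpbin.org/get")

# repr()/str() of a CaseInsensitiveDict exposes the wrapper type, so log a
# plain dict copy if only the header names and values are interesting.
log.debug("response headers: %s", dict(r.headers))
```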
https://api.github.com/repos/psf/requests/issues/1971
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1971/labels{/name}
https://api.github.com/repos/psf/requests/issues/1971/comments
https://api.github.com/repos/psf/requests/issues/1971/events
https://github.com/psf/requests/issues/1971
29,945,341
MDU6SXNzdWUyOTk0NTM0MQ==
1,971
r.url in request.post is same as request.get
{ "avatar_url": "https://avatars.githubusercontent.com/u/6764126?v=4", "events_url": "https://api.github.com/users/patelgr/events{/privacy}", "followers_url": "https://api.github.com/users/patelgr/followers", "following_url": "https://api.github.com/users/patelgr/following{/other_user}", "gists_url": "https://api.github.com/users/patelgr/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/patelgr", "id": 6764126, "login": "patelgr", "node_id": "MDQ6VXNlcjY3NjQxMjY=", "organizations_url": "https://api.github.com/users/patelgr/orgs", "received_events_url": "https://api.github.com/users/patelgr/received_events", "repos_url": "https://api.github.com/users/patelgr/repos", "site_admin": false, "starred_url": "https://api.github.com/users/patelgr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/patelgr/subscriptions", "type": "User", "url": "https://api.github.com/users/patelgr", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-21T22:45:03Z
2021-09-09T00:09:56Z
2014-03-21T22:47:46Z
NONE
resolved
In [24]: params = { 'f':'nopghc1xva2jkp5p6rer5j4dyr1s7r6r7b4t8j1s', #tags 's':"+".join(stocklist[:10]) } In [25]: r = requests.post("http://finance.yahoo.com/d/quotes.csv",params = params) In [29]: r.url Out[29]: u'http://download.finance.yahoo.com/d/quotes.csv?s=A%2BB%2BC%2BD%2BE%2BF%2BG%2BH%2BI%2BK&f=nopghc1xva2jkp5p6rer5j4dyr1s7r6r7b4t8j1s' ![issue](https://f.cloud.github.com/assets/6764126/2488723/116cc61c-b14a-11e3-9c22-1f86d4f821e3.png)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1971/reactions" }
https://api.github.com/repos/psf/requests/issues/1971/timeline
null
completed
null
null
false
[ "In requests, `params` is always query string parameters. If you want to put something in the request body, use the `data` argument: `requests.post(url, data=data)`.\n", "thanks\n" ]
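The distinction spelled out above, that `params` always becomes the query string while `data` becomes the request body, is easy to see side by side. A small sketch using httpbin.org as a stand-in for the Yahoo endpoint, with illustrative values:

```python
import requests

payload = {"s": "A+B+C", "f": "nopghc1x"}  # illustrative values only

# params always becomes the query string, whatever the HTTP method is.
r1 = requests.post("https://httpbin.org/post", params=payload)
print(r1.url)  # ...httpbin.org/post?s=A%2BB%2BC&f=nopghc1x

# data becomes the form-encoded request body, and the URL stays clean.
r2 = requests.post("https://httpbin.org/post", data=payload)
print(r2.url)             # ...httpbin.org/post
print(r2.json()["form"])  # {'s': 'A+B+C', 'f': 'nopghc1x'}
```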
https://api.github.com/repos/psf/requests/issues/1970
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1970/labels{/name}
https://api.github.com/repos/psf/requests/issues/1970/comments
https://api.github.com/repos/psf/requests/issues/1970/events
https://github.com/psf/requests/issues/1970
29,878,434
MDU6SXNzdWUyOTg3ODQzNA==
1,970
SNI support seemingly still broken on OS X 10.9
{ "avatar_url": "https://avatars.githubusercontent.com/u/1225294?v=4", "events_url": "https://api.github.com/users/FiloSottile/events{/privacy}", "followers_url": "https://api.github.com/users/FiloSottile/followers", "following_url": "https://api.github.com/users/FiloSottile/following{/other_user}", "gists_url": "https://api.github.com/users/FiloSottile/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FiloSottile", "id": 1225294, "login": "FiloSottile", "node_id": "MDQ6VXNlcjEyMjUyOTQ=", "organizations_url": "https://api.github.com/users/FiloSottile/orgs", "received_events_url": "https://api.github.com/users/FiloSottile/received_events", "repos_url": "https://api.github.com/users/FiloSottile/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FiloSottile/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FiloSottile/subscriptions", "type": "User", "url": "https://api.github.com/users/FiloSottile", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-03-21T04:12:00Z
2021-09-08T19:00:30Z
2014-03-21T06:51:57Z
NONE
resolved
The following does NOT reproduce with Python 3 or on Linux hosts. Jump to the bottom for the actual error. I just attached logs of the venv setup. ``` bash-3.2$ virtualenv venv New python executable in venv/bin/python Installing Setuptools..............................................................................................................................................................................................................................done. Installing Pip.....................................................................................................................................................................................................................................................................................................................................done. bash-3.2$ venv/bin/pip install requests urllib3 pyasn1 ndg-httpsclient pyopenssl Downloading/unpacking requests Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fr%2Frequests%2Frequests-2.2.1.tar.gz Running setup.py egg_info for package requests Downloading/unpacking urllib3 Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fu%2Furllib3%2Furllib3-1.8.tar.gz Running setup.py egg_info for package urllib3 Downloading/unpacking pyasn1 Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fp%2Fpyasn1%2Fpyasn1-0.1.7.tar.gz Running setup.py egg_info for package pyasn1 Downloading/unpacking ndg-httpsclient Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fn%2Fndg-httpsclient%2Fndg_httpsclient-0.3.2.tar.gz Running setup.py egg_info for package ndg-httpsclient Downloading/unpacking pyopenssl Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fp%2FpyOpenSSL%2FpyOpenSSL-0.14.tar.gz Running setup.py egg_info for package pyopenssl warning: no previously-included files matching '*.pyc' found anywhere in distribution no previously-included directories found matching 'doc/_build' Downloading/unpacking cryptography>=0.2.1 (from pyopenssl) Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fc%2Fcryptography%2Fcryptography-0.2.2.tar.gz Running setup.py egg_info for package cryptography no previously-included directories found matching 'documentation/_build' zip_safe flag not set; analyzing archive contents... six: module references __file__ six: module references __path__ Installed /Users/filosottile/tmp/venv/build/cryptography/six-1.6.1-py2.7.egg Searching for cffi>=0.8 Reading https://pypi.python.org/simple/cffi/ Best match: cffi 0.8.2 Downloading https://pypi.python.org/packages/source/c/cffi/cffi-0.8.2.tar.gz#md5=37fc88c62f40d04e8a18192433f951ec Processing cffi-0.8.2.tar.gz Writing /var/folders/t7/mkv9kpmd4094fb3gfqgkjp640000gn/T/easy_install-zzgElT/cffi-0.8.2/setup.cfg Running cffi-0.8.2/setup.py -q bdist_egg --dist-dir /var/folders/t7/mkv9kpmd4094fb3gfqgkjp640000gn/T/easy_install-zzgElT/cffi-0.8.2/egg-dist-tmp-_RWs2A Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. 
Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found OS/X: confusion between 'cc' versus 'gcc' (see issue 123) will not use '__thread' in the C code Installed /Users/filosottile/tmp/venv/build/cryptography/cffi-0.8.2-py2.7-macosx-10.8-x86_64.egg Searching for pycparser Reading https://pypi.python.org/simple/pycparser/ Best match: pycparser 2.10 Downloading https://pypi.python.org/packages/source/p/pycparser/pycparser-2.10.tar.gz#md5=d87aed98c8a9f386aa56d365fe4d515f Processing pycparser-2.10.tar.gz Writing /var/folders/t7/mkv9kpmd4094fb3gfqgkjp640000gn/T/easy_install-1JE3Je/pycparser-2.10/setup.cfg Running pycparser-2.10/setup.py -q bdist_egg --dist-dir /var/folders/t7/mkv9kpmd4094fb3gfqgkjp640000gn/T/easy_install-1JE3Je/pycparser-2.10/egg-dist-tmp-AjEMEf zip_safe flag not set; analyzing archive contents... Installed /Users/filosottile/tmp/venv/build/cryptography/pycparser-2.10-py2.7.egg building '_cffi__x36bf4a13xfba2f231' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.c -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.o -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.so building '_cffi__xa1394602x4a8b9ec1' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.c -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.o -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.so building '_cffi__x5eaa210axf0ae7e21' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes 
-I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.c -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.o -lcrypto -lssl -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.so building '_cffi__x64a6e44fx21ac8a22' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.c -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.o -o /Users/filosottile/tmp/venv/build/cryptography/cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.so Downloading/unpacking six>=1.5.2 (from pyopenssl) Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fs%2Fsix%2Fsix-1.6.1.tar.gz Running setup.py egg_info for package six no previously-included directories found matching 'documentation/_build' Downloading/unpacking cffi>=0.8 (from cryptography>=0.2.1->pyopenssl) Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fc%2Fcffi%2Fcffi-0.8.2.tar.gz Running setup.py egg_info for package cffi Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. 
Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found OS/X: confusion between 'cc' versus 'gcc' (see issue 123) will not use '__thread' in the C code Downloading/unpacking pycparser (from cffi>=0.8->cryptography>=0.2.1->pyopenssl) Using download cache from /Users/filosottile/.pip-download-cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2Fsource%2Fp%2Fpycparser%2Fpycparser-2.10.tar.gz Running setup.py egg_info for package pycparser Installing collected packages: requests, urllib3, pyasn1, ndg-httpsclient, pyopenssl, cryptography, six, cffi, pycparser Running setup.py install for requests Running setup.py install for urllib3 Running setup.py install for pyasn1 Running setup.py install for ndg-httpsclient Skipping installation of /Users/filosottile/tmp/venv/lib/python2.7/site-packages/ndg/__init__.py (namespace package) Installing /Users/filosottile/tmp/venv/lib/python2.7/site-packages/ndg_httpsclient-0.3.2-py2.7-nspkg.pth Installing ndg_httpclient script to /Users/filosottile/tmp/venv/bin Running setup.py install for pyopenssl warning: no previously-included files matching '*.pyc' found anywhere in distribution no previously-included directories found matching 'doc/_build' Running setup.py install for cryptography building '_cffi__x5eaa210axf0ae7e21' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.c -o build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/bindings/__pycache__/_cffi__x5eaa210axf0ae7e21.o -lcrypto -lssl -o build/lib.macosx-10.8-x86_64-2.7/cryptography/_cffi__x5eaa210axf0ae7e21.so building '_cffi__x36bf4a13xfba2f231' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.c -o build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/primitives/__pycache__/_cffi__x36bf4a13xfba2f231.o -o build/lib.macosx-10.8-x86_64-2.7/cryptography/_cffi__x36bf4a13xfba2f231.so building '_cffi__xa1394602x4a8b9ec1' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.c -o build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/primitives/__pycache__/_cffi__xa1394602x4a8b9ec1.o -o build/lib.macosx-10.8-x86_64-2.7/cryptography/_cffi__xa1394602x4a8b9ec1.so building 
'_cffi__x64a6e44fx21ac8a22' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.c -o build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib build/temp.macosx-10.8-x86_64-2.7/cryptography/hazmat/bindings/__pycache__/_cffi__x64a6e44fx21ac8a22.o -o build/lib.macosx-10.8-x86_64-2.7/cryptography/_cffi__x64a6e44fx21ac8a22.so Running setup.py install for six no previously-included directories found matching 'documentation/_build' Running setup.py install for cffi Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found Package libffi was not found in the pkg-config search path. Perhaps you should add the directory containing `libffi.pc' to the PKG_CONFIG_PATH environment variable No package 'libffi' found OS/X: confusion between 'cc' versus 'gcc' (see issue 123) will not use '__thread' in the C code building '_cffi_backend' extension clang -fno-strict-aliasing -fno-common -dynamic -I/usr/local/include -I/usr/local/opt/sqlite/include -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/include/ffi -I/usr/include/libffi -I/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c c/_cffi_backend.c -o build/temp.macosx-10.8-x86_64-2.7/c/_cffi_backend.o clang -bundle -undefined dynamic_lookup -L/usr/local/lib -L/usr/local/opt/sqlite/lib build/temp.macosx-10.8-x86_64-2.7/c/_cffi_backend.o -lffi -o build/lib.macosx-10.8-x86_64-2.7/_cffi_backend.so Running setup.py install for pycparser Successfully installed requests urllib3 pyasn1 ndg-httpsclient pyopenssl cryptography six cffi pycparser Cleaning up... bash-3.2$ venv/bin/pip freeze cffi==0.8.2 cryptography==0.2.2 ndg-httpsclient==0.3.2 pyOpenSSL==0.14 pyasn1==0.1.7 pycparser==2.10 requests==2.2.1 six==1.6.1 urllib3==1.8 wsgiref==0.1.2 bash-3.2$ venv/bin/python Python 2.7.6 (default, Dec 4 2013, 18:45:48) [GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.2.75)] on darwin Type "help", "copyright", "credits" or "license" for more information. 
>>> import requests >>> requests.get("https://transport03-rts03-iad01.transport.home.nest.com:443/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/filosottile/tmp/venv/lib/python2.7/site-packages/requests/api.py", line 55, in get return request('get', url, **kwargs) File "/Users/filosottile/tmp/venv/lib/python2.7/site-packages/requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "/Users/filosottile/tmp/venv/lib/python2.7/site-packages/requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File "/Users/filosottile/tmp/venv/lib/python2.7/site-packages/requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File "/Users/filosottile/tmp/venv/lib/python2.7/site-packages/requests/adapters.py", line 385, in send raise SSLError(e) requests.exceptions.SSLError: [Errno bad handshake] (-1, 'Unexpected EOF') >>> ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1970/reactions" }
https://api.github.com/repos/psf/requests/issues/1970/timeline
null
completed
null
null
false
[ "I don't think this has anything to do with SNI, and instead has everything to do with the OpenSSL that ships on OS X being braindead and stupid. I've run this on 2.7.6 and 3.3.4, both on OS X. In both cases, the SNI extension is being correctly sent to the remote server. The difference is that the server, in the Python 2.7.6 case, is not responding to the TLS Client Hello.\n\nThere could be a number of reasons for this. In Python 2.7.6 we do a TLS v1 handshake, but in Python 3.3.4 we do a TLSv1.2 handshake. This is unlikely, as the server negotiates down to TLSv1. More likely, the server doesn't like the cipher suites we offer. In 3.3.4 we offer 72 cipher suites, and the server chooses TLS_DHE_RSA_WITH_3DES_EDE_CBC_SHA. In 2.7.6 we offer 2 cipher suites: TLS_RSA_WITH_RC4_128_SHA and TLS_EMPTY_RENEGOTIATION_INFO_SCSV. If the server in this case is expecting to choose a suite we've offered, but doesn't allow TLS_RSA_WITH_RC4_128_SHA (which is pretty weak), it will find that there's nothing for it to choose.\n\nEither way, we're sending SNI and it's right, so the problem is not requests. =)\n", "The analysis makes sense. Sadly linking Python against a better (2014) OpenSSL didn't fix it.\n\nI'll have to dig deeper. Thanks!\n", "This is happening to me on CentOS 5 as well.\n", "@mattwilliamson _What specifically_ is happening to you?\n", "@Lukasa clearly _it_ is happening. :wink: \n" ]
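When triaging reports like this one, a quick first check is what the local interpreter's ssl module can actually do, since the OpenSSL that ships with OS X is ancient. A small diagnostic sketch (output naturally varies per machine):

```python
import ssl

# Can this interpreter send the SNI extension at all?
print(getattr(ssl, "HAS_SNI", False))

# Which OpenSSL is the ssl module linked against? Apple's ancient system
# builds offer only a small cipher list, which some servers simply reject.
print(ssl.OPENSSL_VERSION)
```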
https://api.github.com/repos/psf/requests/issues/1969
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1969/labels{/name}
https://api.github.com/repos/psf/requests/issues/1969/comments
https://api.github.com/repos/psf/requests/issues/1969/events
https://github.com/psf/requests/issues/1969
29,796,801
MDU6SXNzdWUyOTc5NjgwMQ==
1,969
Better Document SNI and PyOpenSSL Support
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
13
2014-03-20T06:30:46Z
2021-09-08T09:00:43Z
2017-05-29T20:56:52Z
MEMBER
resolved
It looks like the standard library [might start directing more people our way in order to get SNI on Python 2.7](http://bugs.python.org/issue5639#msg214170). This is great, but right now we don't really document it, and we don't test it. I think we should do both. This means - new CI environment that installs our SNI dependencies and actually tries to use it. - better docs.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1969/reactions" }
https://api.github.com/repos/psf/requests/issues/1969/timeline
null
completed
null
null
false
[ ":+1: \n", "No guarantees, actually, I've seen chats on Twitter that suggest that Nick Coghlan might be chatting with Christian and Alex about fixing 2.7. We're a bit in limbo here.\n", "Until python 2.7 supports SNI, maybe we can make use of https://github.com/trevp/tlslite ?\n", "We do support SNI. =) We just don't support it in an obvious way. Given the option between using pyOpenSSL and tlslite we're always going to choose pyOpenSSL. tlslite has very limited ciphersuite support and just generally lacks the features we need.\n", "Suggestion: rather than documenting `pip install pyopenssl ndg-httpsclient pyasn1`, turn it into a setuptools extra so you can `pip install requests[sni]` and get those dependencies so you only have to remember that one string. (I always forget `ndg-httpsclient`.)\n", "@glyph there's a separate issue (#1995) for that. Feel free to ping Kenneth there.\n", "FWIW, it looks like the stdlib in 2.7 will support SNI (and a whole host of other things), just as soon as someone reviews and merges @dreid and my patch.\n", ":cake: \n", "If anyone picks this up, please also add documentation for `pip install requests[security]`.\n", "I can confirm that it's not obvious for newcomers (like me!) to realise we need to install extra packages for SNI support.\n", "We should also push people towards Python 2.7.9, which addresses this.\n\nOn Sun, Feb 15, 2015 at 5:14 PM, Bastien Gandouet [email protected]\nwrote:\n\n> I can confirm that it's not obvious for newcomers (like me!) to realise we\n> need to install extra packages for SNI support.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/1969#issuecomment-74440187\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n", "> We should also push people towards Python 2.7.9, which addresses this.\n\nDefinitely, but killing earlier versions will still take quite some time, so documenting what to do in the meanwhile is still important.\n", "I think this can be closed, since #2195 seems to resolve it?" ]
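For reference, the recipe under discussion amounts to installing the optional dependencies and, on Python 2, letting the pyOpenSSL-backed transport take over. A hedged sketch: the extras name comes from the thread, while the module path and explicit injection call assume the vendored urllib3 layout of requests 2.x and are normally unnecessary because the injection happens automatically at import time when the packages are present.

```python
# pip install requests[security]
# (equivalently: pip install pyopenssl ndg-httpsclient pyasn1)
import requests

# Belt-and-braces fallback only; requests 2.x does this for you at import
# time when pyOpenSSL and friends are importable.
from requests.packages.urllib3.contrib import pyopenssl
pyopenssl.inject_into_urllib3()

r = requests.get("https://example.com/")  # substitute an SNI-only host to verify
print(r.status_code)
```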
https://api.github.com/repos/psf/requests/issues/1968
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1968/labels{/name}
https://api.github.com/repos/psf/requests/issues/1968/comments
https://api.github.com/repos/psf/requests/issues/1968/events
https://github.com/psf/requests/issues/1968
29,777,700
MDU6SXNzdWUyOTc3NzcwMA==
1,968
latest version mismatch
{ "avatar_url": "https://avatars.githubusercontent.com/u/475147?v=4", "events_url": "https://api.github.com/users/lanterndev/events{/privacy}", "followers_url": "https://api.github.com/users/lanterndev/followers", "following_url": "https://api.github.com/users/lanterndev/following{/other_user}", "gists_url": "https://api.github.com/users/lanterndev/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lanterndev", "id": 475147, "login": "lanterndev", "node_id": "MDQ6VXNlcjQ3NTE0Nw==", "organizations_url": "https://api.github.com/users/lanterndev/orgs", "received_events_url": "https://api.github.com/users/lanterndev/received_events", "repos_url": "https://api.github.com/users/lanterndev/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lanterndev/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lanterndev/subscriptions", "type": "User", "url": "https://api.github.com/users/lanterndev", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2014-03-19T22:14:04Z
2021-09-08T23:10:59Z
2014-06-08T10:43:19Z
NONE
resolved
http://docs.python-requests.org/en/latest/ currently corresponds to "Release v2.3.0". But running "pip install requests" gives ``` $ pip install -v requests Downloading/unpacking requests Using version 2.2.1 (newest of versions: 2.2.1, 2.2.1, 2.2.0, 2.2.0, 2.1.0, 2.1.0, 2.0.1, 2.0.1, 2.0.0, 2.0.0, 1.2.3, 1.2.2, 1.2.1, 1.2.0, 1.1.0, 1.0.4, 1.0.3, 1.0.2, 1.0.1, 1.0.0, 0.14.2, 0.14.1, 0.14.0, 0.13.9, 0.13.8, 0.13.7, 0.13.6, 0.13.5, 0.13.4, 0.13.3, 0.13.2, 0.13.1, 0.13.0, 0.12.1, 0.12.0, 0.11.2, 0.11.1, 0.10.8, 0.10.7, 0.10.6, 0.10.4, 0.10.3, 0.10.2, 0.10.1, 0.10.0, 0.9.3, 0.9.2, 0.9.1, 0.9.0, 0.8.9, 0.8.8, 0.8.7, 0.8.6, 0.8.5, 0.8.4, 0.8.3, 0.8.2, 0.8.1, 0.8.0, 0.7.6, 0.7.5, 0.7.4, 0.7.3, 0.7.2, 0.7.1, 0.7.0, 0.6.6, 0.6.5, 0.6.4, 0.6.3, 0.6.2, 0.6.1, 0.6.0, 0.5.1, 0.5.0, 0.4.1, 0.4.0, 0.3.4, 0.3.3, 0.3.2, 0.3.1, 0.3.0, 0.2.4, 0.2.3, 0.2.2, 0.2.1, 0.2.0) Downloading requests-2.2.1-py2.py3-none-any.whl (625kB): ... ``` Is 2.3.0 not in fact released yet, or has it just not yet been uploaded to PyPI?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1968/reactions" }
https://api.github.com/repos/psf/requests/issues/1968/timeline
null
completed
null
null
false
[ "Looks like there's no 2.3.0 tag at https://github.com/kennethreitz/requests/tags yet, so I take it 2.2.1 is in fact the latest version, and the http://docs.python-requests.org/en/latest/ page is showing an incorrect latest version.\n", "Correct, 2.3.0 is a long way off. In a breach of the way we normally do things, a Pull Request incremented the version number well before release which invalidates the docs. Probably the best thing to do at this stage is to roll the version number back to 2.2.1.\n", "@Lukasa I didn't trigger this but I prefer it this way. Certain things have been fixed/broken in 2.3.0. That behaviour is not available/visible in 2.2.1 and so the documented behaviour is accurately documented. :)\n", "Thanks for the info @lukasa.\n\nDo we all agree the current info on the\nhttp://docs.python-requests.org/en/latest/ page is misleading, as it\n(a) calls 2.3.0 \"released\", and (b) calls it \"latest\" (in the url)? From\nthat I'd conclude 2.3.0 was the latest release.\n\nOn Wednesday, March 19, 2014, Ian Cordasco\n<[email protected]<javascript:_e(%7B%7D,'cvml','[email protected]');>>\nwrote:\n\n> @Lukasa https://github.com/Lukasa I didn't trigger this but I prefer it\n> this way. Certain things have been fixed/broken in 2.3.0. That behaviour is\n> not available/visible in 2.2.1 and so the documented behaviour is\n> accurately documented. :)\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1968#issuecomment-38116629\n> .\n", "If you change the current latest version to something like \"2.3.0-dev\" (and\nthen update the docs, ideally substituting \"release\" with \"version\" for\ndocs on not-yet-released versions) it'd be clearer, and more semantic too.\n\nOn Wednesday, March 19, 2014, [email protected] wrote:\n\n> Thanks for the info @lukasa.\n> \n> Do we all agree the current info on the\n> http://docs.python-requests.org/en/latest/ page is misleading, as it\n> (a) calls 2.3.0 \"released\", and (b) calls it \"latest\" (in the url)? From\n> that I'd conclude 2.3.0 was the latest release.\n> \n> On Wednesday, March 19, 2014, Ian Cordasco [email protected]\n> wrote:\n> \n> > @Lukasa https://github.com/Lukasa I didn't trigger this but I prefer\n> > it this way. Certain things have been fixed/broken in 2.3.0. That behaviour\n> > is not available/visible in 2.2.1 and so the documented behaviour is\n> > accurately documented. :)\n> > \n> > ## \n> > \n> > Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1968#issuecomment-38116629\n> > .\n", "I'm honestly not sure what to do here. On my own projects I have a separate set of 'development' docs.\n", "@skivvies this isn't standard procedure for requests. Reverting the version bump seems silly at this point and changing it to 2.3.0-dev will only invoke Kenneth's ire. For now I vote we leave this alone. After 2.3.0 this just won't happen. \n", "Does requests follow the http://semver.org spec?\n\n> A pre-release version MAY be denoted by appending a hyphen and a series of dot separated identifiers immediately following the patch version. Identifiers MUST comprise only ASCII alphanumerics and hyphen [0-9A-Za-z-]. Identifiers MUST NOT be empty. Numeric identifiers MUST NOT include leading zeroes. Pre-release versions have a lower precedence than the associated normal version. 
A pre-release version indicates that the version is unstable and might not satisfy the intended compatibility requirements as denoted by its associated normal version. Examples: 1.0.0-alpha, 1.0.0-alpha.1, 1.0.0-0.3.7, 1.0.0-x.7.z.92.\n\nIt sounds like that could help avoid something like this in the future (e.g. right after publishing a final release, there's an immediate followup commit to bump the version on master to something like X.Y.Z-dev (where X.Y.Z is the next planned release).\n\nIf there's interest, I can open a ticket for that separately.\n", "We follow semver, but we don't do pre-releases, and we shouldn't do it here.\n\nOur actual releases, by which I mean our releases on pip and our git tags, follow semver. Anything else is development and may be wrong from time to time.\n", "Sorry for the confusion, I wasn't suggesting _publishing_ pre-releases, but rather just doing the automatic version bump to a pre-release version on master immediately after publishing a release. That way, if docs accidentally get published for a pre-release version, it's obvious they're for a pre-release version. It also makes it obvious during development that it's a pre-release version when you can inspect the package version and see something like `-dev` at the end.\n", "This is really up to @kennethreitz, but I suspect he's going to want to avoid the extra procedural overhead.\n", "This has been discussed a few times in the past but has never been something Kenneth was in favor of. Whoever bumped the version number this time got lucky and it slipped past all of us. Usually that doesn't happen.\n" ]
https://api.github.com/repos/psf/requests/issues/1967
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1967/labels{/name}
https://api.github.com/repos/psf/requests/issues/1967/comments
https://api.github.com/repos/psf/requests/issues/1967/events
https://github.com/psf/requests/issues/1967
29,760,771
MDU6SXNzdWUyOTc2MDc3MQ==
1,967
Running a single connection with pool_block == True, hangs the application
{ "avatar_url": "https://avatars.githubusercontent.com/u/5100824?v=4", "events_url": "https://api.github.com/users/DoxaLogosGit/events{/privacy}", "followers_url": "https://api.github.com/users/DoxaLogosGit/followers", "following_url": "https://api.github.com/users/DoxaLogosGit/following{/other_user}", "gists_url": "https://api.github.com/users/DoxaLogosGit/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/DoxaLogosGit", "id": 5100824, "login": "DoxaLogosGit", "node_id": "MDQ6VXNlcjUxMDA4MjQ=", "organizations_url": "https://api.github.com/users/DoxaLogosGit/orgs", "received_events_url": "https://api.github.com/users/DoxaLogosGit/received_events", "repos_url": "https://api.github.com/users/DoxaLogosGit/repos", "site_admin": false, "starred_url": "https://api.github.com/users/DoxaLogosGit/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DoxaLogosGit/subscriptions", "type": "User", "url": "https://api.github.com/users/DoxaLogosGit", "user_view_type": "public" }
[]
closed
true
null
[]
null
35
2014-03-19T18:41:56Z
2021-09-09T00:01:11Z
2014-03-26T15:46:23Z
NONE
resolved
I have a problem where I'm trying to create a Python script to test a single-threaded web server on an embedded device. I was having performance problems until I realized that I could only support one connection per session. When I made my own HTTP adapter to only use one connection with pool_block equal to True, one request got sent, then the script hung trying to send the next request. If I comment out line 533 (the "if release_conn:") in request/package/urllib3/connectionpool.py, my problems go away. The script then performs super fast, as expected, compared against a cURL bash script. However, I'm not sure this is the proper fix for the problem, since I'm a special case. I need a way to configure a single connection from the top-level API without getting blocked. BTW, if I turn blocking off (pool_block == False), I get the same horrible performance: basically, it has to do retries and slows down, because the web server can only handle one connection at a time.
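The single-connection setup being described can be expressed through the public API alone. The following is a minimal sketch, not the reporter's actual script; the device URL is a placeholder.

```
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# One pooled connection, and block (rather than open extra sockets) when it is busy.
adapter = HTTPAdapter(pool_connections=1, pool_maxsize=1, pool_block=True)
session.mount("http://", adapter)

# The hang reported above showed up on the second call, when the first
# response's connection had not been returned to the pool.
r1 = session.get("http://device.example.com/status", timeout=5)
r1.content  # read the body fully so the connection can go back to the pool
r2 = session.get("http://device.example.com/status", timeout=5)
print(r1.status_code, r2.status_code)
```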
{ "avatar_url": "https://avatars.githubusercontent.com/u/5100824?v=4", "events_url": "https://api.github.com/users/DoxaLogosGit/events{/privacy}", "followers_url": "https://api.github.com/users/DoxaLogosGit/followers", "following_url": "https://api.github.com/users/DoxaLogosGit/following{/other_user}", "gists_url": "https://api.github.com/users/DoxaLogosGit/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/DoxaLogosGit", "id": 5100824, "login": "DoxaLogosGit", "node_id": "MDQ6VXNlcjUxMDA4MjQ=", "organizations_url": "https://api.github.com/users/DoxaLogosGit/orgs", "received_events_url": "https://api.github.com/users/DoxaLogosGit/received_events", "repos_url": "https://api.github.com/users/DoxaLogosGit/repos", "site_admin": false, "starred_url": "https://api.github.com/users/DoxaLogosGit/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DoxaLogosGit/subscriptions", "type": "User", "url": "https://api.github.com/users/DoxaLogosGit", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1967/reactions" }
https://api.github.com/repos/psf/requests/issues/1967/timeline
null
completed
null
null
false
[ "Are you reading the full response from the request?\n", "As far as I know, I thought I was. I just get or post, and parse the\nresponse in a separate variable. Is there something else that needs to be\ndone?\n\nOn Wed, Mar 19, 2014 at 1:52 PM, Cory Benfield [email protected]:\n\n> Are you reading the full response from the request?\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1967#issuecomment-38091631\n> .\n", "That's ok, I just wanted to confirm that you were.\n\nAs a quick fix, calling `Response.close()` should fix this problem. We need to work out why the connection isn't being released back to the pool.\n", "I'll give that a try.\n\nOn Wed, Mar 19, 2014 at 2:53 PM, Cory Benfield [email protected]:\n\n> That's ok, I just wanted to confirm that you were.\n> \n> As a quick fix, calling Response.close() should fix this problem. We need\n> to work out why the connection isn't being released back to the pool.\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1967#issuecomment-38098836\n> .\n", "I put the \"close()\" call in the methods where I get a response from the requests calls.\n\nStill hangs.\n", "Hmm. Can I see some sample code?\n", "```\n def post_entry(self, url, ylist, entry): \n \"\"\" \n Create an instance of a yang list \n\n :param: url - the base rest url \n :param: ylist - the yang list this is operating on \n :param: entry - the yang instance you're creating(dictionary) \n\n :return: True for success, false for failure \n \"\"\" \n if not url.endswith(\"/\"): \n url += \"/\" \n url_string = url + ylist \n headers = {\"Content-type\":\"application/json\"} \n try: \n r = call_requests(self.session.post, url=url_string, data=json.dumps(entry), timeout=5, headers=headers) \n except Exception as ex: \n logger.error(\"Failure Post Exception: {0}\".format(str(ex))) \n return False \n\n r.close() \n if r.status_code != requests.codes.created: \n logger.error(\"Failure - Status: {0}\".format(r.status_code)) \n return False \n\n return True \n\n\n @RetryIt \n def call_requests(func, **kwargs): \n \"\"\" \n Intermediate call to the requests methods with retry decorator \n \"\"\" \n return func(**kwargs) \n```\n", "ugh, don't know why this posted twice. I blame the laptop:-)\n", "Hmm. I wonder if the retry decorator is doing something weird here.\n", "it only checks for the \"retries\" in the exception and sends again for maximum amount. passes everything back to the caller.\n", "Right, but 'sends again' is the bit that I find interesting. Can you remove it?\n", "Here's the meat of the decorator\n\n```\n def __call__(self, *args, **kwargs): \n \"\"\" \n Handle the function call and retry it \n \"\"\" \n try: \n results = self.f(*args, **kwargs) \n except Exception as ex: \n if \"retries\" in str(ex): \n self.retry_cnt -= 1 \n if self.retry_cnt >= 0: \n logger.info(\"Retrying {0} on {1} # {2}\".format(self.f.__name__, \n args[0].__name__, \n str(self.max_retry - self.retry_cnt))) \n time.sleep(0.001) \n results = self.__call__(*args, **kwargs) \n else: \n logger.error(\"Out of Retries for {0}\".format(self.f.__name__)) \n #reset \n\n self.retry_cnt = self.max_retry\n raise \n else: \n #reset \n self.retry_cnt = self.max_retry \n raise \n\n #reset \n self.retry_cnt = self.max_retry \n return results \n\n```\n", "Do I need a close inside there? I wasn't sure the Response existed in the exception.\n", "just saw your question. 
let me try\n", "too the hard to remove, rest of code expects a response object returned\n", "Let me remove the entry retry decorator. I probably don't need it if I configure the adaptor correctly\n", "Okay, I removed the decorator entirely. Still hangs even with \"close()\" being called.\n", "Right. So we need to work out why the connection isn't being returned to the pool.\n", "I walked through the debugger on the \"close()\" method. It's hits line 118 of /package/urllib3/response.py and exit's out on the if statement.\n\n```\ndef release_conn(self):\n if not self._pool or not self._connection:\n return\n```\n\nThe self._connection variable appears to be None.\n", "So in that situation we should have dropped the connection, which is what I'd expect if we'd exhausted it on the read (which we do).\n\n@shazow, want to run your eye over this quickly?\n", "So, I changed the if statement to just check if the pool is None. When it goes to put the connection back in the pool, it gets a \"Full\" exception. The comment on that exception says that it should never happen if self.block == True (which it is in this case).\n", "Hmm I'm not sure how to reproduce this with plain urllib3. Could be something with the Requests wrapping?\n\n``` python\nIn [8]: import urllib3\n\nIn [9]: urllib3.add_stderr_logger()\n2014-03-20 11:30:34,125 DEBUG Added an stderr logging handler to logger: urllib3\nOut[9]: <logging.StreamHandler at 0x106673550>\n\nIn [10]: pool = urllib3.HTTPConnectionPool('httpbin.org', block=True, maxsize=1)\n\nIn [11]: r = pool.request('GET', '/')\n2014-03-20 11:30:44,571 INFO Starting new HTTP connection (1): httpbin.org\n2014-03-20 11:30:45,345 DEBUG \"GET / HTTP/1.1\" 200 7700\n\nIn [12]: r = pool.request('GET', '/')\n2014-03-20 11:30:46,106 DEBUG \"GET / HTTP/1.1\" 200 7700\n\nIn [13]: r = pool.request('GET', '/')\n2014-03-20 11:30:47,640 DEBUG \"GET / HTTP/1.1\" 200 7700\n```\n", "It's worth noting that we don't use `pool.request()`, we use `urlopen` and grab the connection by hand ourselves.\n", "Perhaps we should be using `request` on an `HTTPConnectionPool` instance then?\n", "There's shouldn't be much difference, all it does is handle determining the different request types (urlencoded or multipart etc) and basic headers: https://github.com/shazow/urllib3/blob/master/urllib3/request.py#L58\n\nIt's a convenience function over urlopen. I think Requests has something similar already (though feel free to use urllib3's if you'd rather not maintain your logic for it).\n", "Another strange behavior problem, if I set \"release_conn=True\" in the call to url_open in the \"send\" method of requests/adapters.py. I no longer hang on the second requests call. However, it's still way too slow. At the same time, if I go and comment out the \"if release_conn:\" in the finally statement of urlopen, it still works but is slow. If I remove the \"release_conn=True\" in the url_open function call leaving the \"if release_conn:\" commented out like my previous example, it's back to behaving the way I expect ... super fast. I'm not sure why the release_conn=True would behave differently.\n", "Ok, I think what we need to do is work out what exactly is happening. @jgatkinsn, are you familiar with Wireshark/tcpdump? I'd like to see some packet captures of each of the different permutations so I can get an idea of how the connections are proceeding.\n", "I'm somewhat familiar(not an expert). 
I was already using tshark to capture the packets between my script and a simple bash script that calls cURL to do the same thing. It was leading me to believe the problem was somewhere in the python implemenation versus the super fast cURL bash script. I'll try and grab my old captures and post them here.\n", "I can't figure out how to attach the capture files to the Git site. Do you want me to email them directly to you?\n", "Feel free. =)\n" ]
https://api.github.com/repos/psf/requests/issues/1966
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1966/labels{/name}
https://api.github.com/repos/psf/requests/issues/1966/comments
https://api.github.com/repos/psf/requests/issues/1966/events
https://github.com/psf/requests/issues/1966
29,675,021
MDU6SXNzdWUyOTY3NTAyMQ==
1,966
response unconnected with stream=True for some SSL sites
{ "avatar_url": "https://avatars.githubusercontent.com/u/1279658?v=4", "events_url": "https://api.github.com/users/wummel/events{/privacy}", "followers_url": "https://api.github.com/users/wummel/followers", "following_url": "https://api.github.com/users/wummel/following{/other_user}", "gists_url": "https://api.github.com/users/wummel/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wummel", "id": 1279658, "login": "wummel", "node_id": "MDQ6VXNlcjEyNzk2NTg=", "organizations_url": "https://api.github.com/users/wummel/orgs", "received_events_url": "https://api.github.com/users/wummel/received_events", "repos_url": "https://api.github.com/users/wummel/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wummel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wummel/subscriptions", "type": "User", "url": "https://api.github.com/users/wummel", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2014-03-18T19:06:56Z
2021-09-09T00:01:08Z
2014-04-07T17:02:14Z
NONE
resolved
Hi, with some SSL sites and stream=True the SSL socket seems to be unconnected. Calling connect() manually helps (see the test script output below). I expected the behaviour to be consistent, i.e. after calling requests.get() the socket is connected but no data has been read because of stream=True.

The test script:

```
import requests
url = "https://github.com/"
response = requests.get(url, stream=True)
print url, "sock", response.raw._connection.sock
url = "https://www.ahold.com/"
response = requests.get(url, stream=True)
print url, "sock", response.raw._connection.sock
response.raw._connection.connect()
print url, "sock after connect", response.raw._connection.sock
```

Output:

```
https://github.com/ sock <ssl.SSLSocket object at 0x27792a8>
https://www.ahold.com/ sock None
https://www.ahold.com/ sock after connect <ssl.SSLSocket object at 0x7ff033a83578>
```

Version info: It's a Debian Linux wheezy system.

```
Python 2.7.3 (default, Mar 13 2014, 11:03:55)
[GCC 4.7.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests; print requests.__version__
2.2.1
>>>
```

ps: maybe this is an urllib3 problem - I'm not sure.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1279658?v=4", "events_url": "https://api.github.com/users/wummel/events{/privacy}", "followers_url": "https://api.github.com/users/wummel/followers", "following_url": "https://api.github.com/users/wummel/following{/other_user}", "gists_url": "https://api.github.com/users/wummel/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wummel", "id": 1279658, "login": "wummel", "node_id": "MDQ6VXNlcjEyNzk2NTg=", "organizations_url": "https://api.github.com/users/wummel/orgs", "received_events_url": "https://api.github.com/users/wummel/received_events", "repos_url": "https://api.github.com/users/wummel/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wummel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wummel/subscriptions", "type": "User", "url": "https://api.github.com/users/wummel", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1966/reactions" }
https://api.github.com/repos/psf/requests/issues/1966/timeline
null
completed
null
null
false
[ "What effect does this have when using `requests`? Does it impede your ability to read data?\n", "I can reproduce this but I have the same question as @Lukasa \n", "I use requests for link checking. Every check first connects to the http(s) site, then if there are no errors and the content should be read (this is configurable) the content is checked.\n\nThe connection is done with session.get(stream=False). When this succeeds, the content is read with iter_content(). I want to ensure that session.get(stream=False) actually makes a connection ready to be read from. This guarantees that unconnectable servers or invalid SSL certificates are detected by the link checker, even if the user configured that the content of the URL should not be checked.\n\nRight now this is the workaround for ssl connections I am using in https://github.com/wummel/linkchecker/blob/master/linkcheck/plugins/sslcertcheck.py#L62\n(url_data.url_connection is the response object):\n\n```\n [...]\n raw_connection = url_data.url_connection.raw._connection\n if raw_connection.sock is None:\n # sometimes the socket is not yet connected\n # see https://github.com/kennethreitz/requests/issues/1966\n raw_connection.connect()\n ssl_sock = raw_connection.sock\n cert = ssl_sock.getpeercert()\n [...]\n```\n", "It feels like you're fighting `requests` here. Firstly, why are you doing certificate verification yourself?\n", "The other thing that is crucial to note is that `_connection` is an implementation detail. If you're mucking with that and getting burned or unexpected behaviour, that's neither our nor urllib3's fault. That's your own fault. If this is manifesting itself in other ways through public APIs then that's different.\n", "The reason I want to get the certificate data is to warn users if it expires soon.\nI agree the code above is using implementation-specific details. Is there another way to get the certificate data, or more specific to get the expiration date on the SSL certificate?\n", "The only way I can think of doing it is at a point when you know the connection to be open, and neither requests nor urllib3 makes it easy to do this. It is possible that urllib3 would be open to a feature request that stores off SSL certificates on a connection object, but that's at the discretion of that project.\n", "Ok, added some comments to https://github.com/shazow/urllib3/pull/257.\n" ]
https://api.github.com/repos/psf/requests/issues/1965
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1965/labels{/name}
https://api.github.com/repos/psf/requests/issues/1965/comments
https://api.github.com/repos/psf/requests/issues/1965/events
https://github.com/psf/requests/pull/1965
29,517,628
MDExOlB1bGxSZXF1ZXN0MTM2MTI3MjM=
1,965
Create new Session method `prepare_redirected_request`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 44501249, "name": "Needs BDFL Input", "node_id": "MDU6TGFiZWw0NDUwMTI0OQ==", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input" }, { "color": "e11d21", "default": false, "description": null, "id": 44501305, "name": "Not Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTMwNQ==", "url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
null
[]
null
7
2014-03-16T14:28:24Z
2021-09-08T23:11:01Z
2014-05-12T18:59:59Z
CONTRIBUTOR
resolved
This is almost entirely moving code around: the new method is most of the code that used to be the loop body of `resolve_redirects`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1965/reactions" }
https://api.github.com/repos/psf/requests/issues/1965/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1965.diff", "html_url": "https://github.com/psf/requests/pull/1965", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1965.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1965" }
true
[ "Why are we removing the redirect mixin?\n", "It becomes obvious when you break up `resolve_redirects` this way that both halves rely on internal Session data and therefore ought to be Session methods.\n\nTBH I don't really understand why the mixin existed in the first place - it's not providing a useful conceptual division nor could it realistically have been applied to any other class but Session.\n", "(To be scrupulously accurate: the remaining use of internal Session data inside `resolve_redirects` is unnecessary - it duplicates something `send` does already. I was keeping this pull strictly to code motion so I left it there. Even so, though, the conceptual argument stands.)\n", "> TBH I don't really understand why the mixin existed in the first place\n\nTestability (as was shown with #1963). Frankly, other parts of a Session would be easier to test if they were mix-ins as well (specifically the cookie handling pieces of logic). Then again, most people don't care how testable their code is.\n\n> nor could it realistically have been applied to any other class but Session\n\nThat's not exactly right. Most people would not have _wanted_ to apply it to any other class but Session, but it very realistically could have been applied to any other class that was carefully constructed (again, shown in #1963).\n", "I also disagree with your argument that it doesn't provide a good conceptual division, and I think this change _emphasises_ the conceptual division, not weakens it. Redirection is _complicated_, and the number of methods that handle redirection is increasing all the time. The mixin provides an organisational framework for that code which makes the code easier to follow (IMO). The fact that you've added more methods here makes that even clearer.\n\nYes, the mixin is tightly coupled to the `Session` class, but that's ok: it's an organisational and conceptual distinction not a philosophical one.\n", "Exactly what @Lukasa said — the Mixin is all about readability :)\n", "Closing due to inactivity. \n" ]
https://api.github.com/repos/psf/requests/issues/1964
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1964/labels{/name}
https://api.github.com/repos/psf/requests/issues/1964/comments
https://api.github.com/repos/psf/requests/issues/1964/events
https://github.com/psf/requests/pull/1964
29,500,005
MDExOlB1bGxSZXF1ZXN0MTM2MDY0MjE=
1,964
Fix violations of PEP 8(Style Guide).
{ "avatar_url": "https://avatars.githubusercontent.com/u/5206607?v=4", "events_url": "https://api.github.com/users/jayanthkoushik/events{/privacy}", "followers_url": "https://api.github.com/users/jayanthkoushik/followers", "following_url": "https://api.github.com/users/jayanthkoushik/following{/other_user}", "gists_url": "https://api.github.com/users/jayanthkoushik/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayanthkoushik", "id": 5206607, "login": "jayanthkoushik", "node_id": "MDQ6VXNlcjUyMDY2MDc=", "organizations_url": "https://api.github.com/users/jayanthkoushik/orgs", "received_events_url": "https://api.github.com/users/jayanthkoushik/received_events", "repos_url": "https://api.github.com/users/jayanthkoushik/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayanthkoushik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayanthkoushik/subscriptions", "type": "User", "url": "https://api.github.com/users/jayanthkoushik", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2014-03-15T19:53:41Z
2021-09-08T22:01:12Z
2014-03-16T04:11:20Z
NONE
resolved
This commit fixes various violations of PEP 8. Most of the changes shorten lines longer than 79 characters. Compliance with PEP 8 was checked using Flake8 and the vim plugin (vim-flake8). Violations caused by 'modules being imported and not used' were left as they are; these are mostly 'unfixable'.
{ "avatar_url": "https://avatars.githubusercontent.com/u/5206607?v=4", "events_url": "https://api.github.com/users/jayanthkoushik/events{/privacy}", "followers_url": "https://api.github.com/users/jayanthkoushik/followers", "following_url": "https://api.github.com/users/jayanthkoushik/following{/other_user}", "gists_url": "https://api.github.com/users/jayanthkoushik/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jayanthkoushik", "id": 5206607, "login": "jayanthkoushik", "node_id": "MDQ6VXNlcjUyMDY2MDc=", "organizations_url": "https://api.github.com/users/jayanthkoushik/orgs", "received_events_url": "https://api.github.com/users/jayanthkoushik/received_events", "repos_url": "https://api.github.com/users/jayanthkoushik/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jayanthkoushik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jayanthkoushik/subscriptions", "type": "User", "url": "https://api.github.com/users/jayanthkoushik", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1964/reactions" }
https://api.github.com/repos/psf/requests/issues/1964/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1964.diff", "html_url": "https://github.com/psf/requests/pull/1964", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1964.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1964" }
true
[ "Thanks for this. Unfortunately requests does not follow pep8 because of Kenneth's personal style preferences. If you search the past issues there are several similar PRs and issues that have been closed.\n\nFor what it is worth, as the flake8 maintainer, I sincerely appreciate the effort you put forth. I am confident, however, that this PR will not be merged. \n", "Is it too late to start following? :) From what I could tell, there weren't too many deviations.\n", "> Is it too late to start following? :) From what I could tell, there weren't too many deviations.\n\nI'm +1 with you, but Kenneth clearly doesn't want PEP8 compliant code, see for example https://github.com/kennethreitz/requests/commit/c6084704ccb5610ea093b6b47fb45d2149570174...\n", "Got it. :)\n", "@jayanthkoushik also a few notes for future reference (that are independent of this project).\n- [pep8](/jcrocholl/pep8) is out of date with the actual PEP-0008 (meaning flake8 is out of date as well). The maximum line length suggested by PEP-0008 is now 100 characters.\n- `not` is not a function. Changing `not (expr)` to `not(\\nexpr)` doesn't look so great. Leave the space next time ;)\n- Chained methods look really awful when you place a line continuation after the `.` (not your fault though)\n- Function/method invocations with lots of parameters can have the parameters split. They do not need to be all on one line.\n- `\\` line continuations are ugly and you can use parentheses to avoid them, e.g.,\n\n``` python\na = \"some incredibly unbelievably ridiculously super awesome but wildly very long string that should probably be on two lines because of it's length\"\n# can be instead\na = (\n \"some incredibly unbelievably ridiculously super awesome but \"\n \"wildly very long string that should probably be on two lines \"\n \"because of it's length\"\n)\n```\n\nPython by default will think of that as one string and act accordingly. It's valid syntax and also valid PEP-0008.\n\nLong arithmetic expressions and imports also benefit from using parentheses. PEP-0008 provides a lot of freedom in writing compliant code but what I've outlined above are conventions in the community that work within PEP-0008 that I've seen when working in projects.\n", "Wait, I thought PEP8 favoured using parentheses instead of the explicit line continuation?\n", "It does. But it allows for both and the pep8 tool doesn't raise an error for line continuations iirc\n", "@sigmavirus24 : Thank you so much; will keep these things in mind.\n" ]
https://api.github.com/repos/psf/requests/issues/1963
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1963/labels{/name}
https://api.github.com/repos/psf/requests/issues/1963/comments
https://api.github.com/repos/psf/requests/issues/1963/events
https://github.com/psf/requests/pull/1963
29,497,541
MDExOlB1bGxSZXF1ZXN0MTM2MDUzOTc=
1,963
Fix #1955: Do not use original request in redirect
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
7
2014-03-15T17:42:11Z
2021-09-08T23:08:12Z
2014-03-24T15:44:14Z
CONTRIBUTOR
resolved
The original request was never being properly overridden in resolve_redirects. As such, having a POST request respond with a 303 would generate a GET request, but if that GET request encountered another redirect such as a 307, resolve_redirects would fall back to the original request and generate another POST request. There are two parts to this fix: - The fix itself - The test infrastructure to ensure it does not regress, because HTTPBin is insufficient
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1963/reactions" }
https://api.github.com/repos/psf/requests/issues/1963/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1963.diff", "html_url": "https://github.com/psf/requests/pull/1963", "merged_at": "2014-03-24T15:44:14Z", "patch_url": "https://github.com/psf/requests/pull/1963.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1963" }
true
[ "The code change seems correct to me.\n\nRe the test infrastructure, though, may I suggest you write a mock adapter object rather than a mock session object? That may be a bit more work, but it would be a more \"natural\" test (in the sense that more of the normal logic is exercised), and it would be more easily generalizable to cases like the bad gzipped response body in #1944.\n", "We could use Betamax but we don't have a real URL to test this against.\n", "I was imagining an adapter that wasn't backed by any actual network activity, just an internal list of pseudo-URLs and how it should react to them.\n", "Betamax only needs the network once. After the responses are recorded, it would respond to an internal list of URLs and it would know how to react to them.\n", "LGTM.\n\nI'll let you come up with a testing plan.\n", "@kennethreitz since you're around, care to weigh in on this one?\n", "Wow, this is an old, old, old bug. I totally forgot about it.\n" ]
https://api.github.com/repos/psf/requests/issues/1962
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1962/labels{/name}
https://api.github.com/repos/psf/requests/issues/1962/comments
https://api.github.com/repos/psf/requests/issues/1962/events
https://github.com/psf/requests/pull/1962
29,496,160
MDExOlB1bGxSZXF1ZXN0MTM2MDQ4MjM=
1,962
Fix #1960: A Response's history should be a list
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
5
2014-03-15T16:35:33Z
2021-09-08T23:08:27Z
2014-03-24T15:45:06Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1962/reactions" }
https://api.github.com/repos/psf/requests/issues/1962/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1962.diff", "html_url": "https://github.com/psf/requests/pull/1962", "merged_at": "2014-03-24T15:45:06Z", "patch_url": "https://github.com/psf/requests/pull/1962.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1962" }
true
[ "@Lukasa I was considering calling `list(r.history)` instead of `tuple(r.history)` but I cannot see a case there where it wouldn't be a list.\n", "LGTM. :shipit:\n", "@kennethreitz thoughts?\n", "@sigmavirus24 you seem to really be in a rush lately.\n", "This is a great evolution.\n" ]
https://api.github.com/repos/psf/requests/issues/1961
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1961/labels{/name}
https://api.github.com/repos/psf/requests/issues/1961/comments
https://api.github.com/repos/psf/requests/issues/1961/events
https://github.com/psf/requests/issues/1961
29,496,060
MDU6SXNzdWUyOTQ5NjA2MA==
1,961
HTTPS requests reverts to HTTP on Google App Engine dev server (1.9)
{ "avatar_url": "https://avatars.githubusercontent.com/u/6943095?v=4", "events_url": "https://api.github.com/users/cboscolo/events{/privacy}", "followers_url": "https://api.github.com/users/cboscolo/followers", "following_url": "https://api.github.com/users/cboscolo/following{/other_user}", "gists_url": "https://api.github.com/users/cboscolo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/cboscolo", "id": 6943095, "login": "cboscolo", "node_id": "MDQ6VXNlcjY5NDMwOTU=", "organizations_url": "https://api.github.com/users/cboscolo/orgs", "received_events_url": "https://api.github.com/users/cboscolo/received_events", "repos_url": "https://api.github.com/users/cboscolo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/cboscolo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cboscolo/subscriptions", "type": "User", "url": "https://api.github.com/users/cboscolo", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 615414998, "name": "GAE Support", "node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=", "url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support" } ]
closed
true
null
[]
null
5
2014-03-15T16:30:49Z
2021-09-08T09:00:46Z
2014-03-15T16:54:50Z
NONE
resolved
When using requests lib on Google App Engine, requests to sites over HTTPS on 443 were actually going out as HTTP on port 80. The problem appears to be an incompatibility with the version of httplib on GAE. I am new to GAE (and python for that matter), so I am not clear as to which lib is responsible for fixing it.

To see the problem just do this in the GAE console:

```
import requests
s = requests.Session()
r = s.get("https://www.somewebsite.com/")
```

The request will go out over http instead of https. I was able to work around the issue with this patch:

```
--- a/requests/packages/urllib3/connection.py
+++ b/requests/packages/urllib3/connection.py
@@ -99,6 +99,10 @@ class HTTPSConnection(HTTPConnection):
             HTTPConnection.__init__(self, host, port, strict, timeout, source_address)
         except TypeError:  # Python 2.6
             HTTPConnection.__init__(self, host, port, strict, timeout)
+
+        # This fixes an issue on Google App Engine 1.9.0 causing https requests to go out as http
+        self._protocol = "https"
+
         self.key_file = key_file
         self.cert_file = cert_file
```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1961/reactions" }
https://api.github.com/repos/psf/requests/issues/1961/timeline
null
completed
null
null
false
[ "Care to send a pull request? This seems minor enough to fall under what was documented [here](https://github.com/kennethreitz/requests/blob/54ad646067a3e023fd4dcf40c0937ec46069871f/docs/dev/todo.rst#runtime-environments).\n", "That patch is in urllib3, so needs to be opened over there. \n\n> On 15 Mar 2014, at 16:33, Ian Cordasco [email protected] wrote:\n> \n> Care to send a pull request? This seems minor enough to fall under what was documented here.\n> \n> —\n> Reply to this email directly or view it on GitHub.\n", "Good catch @Lukasa. There's a [related issue](https://github.com/shazow/urllib3/issues/356) already open on urllib3.\n\nI'm closing this since we pull in urllib3 before releasing each time anyway.\n", "Merged this fix into urllib3@master, thanks for the tip! :)\n", ":cake: \n" ]
https://api.github.com/repos/psf/requests/issues/1960
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1960/labels{/name}
https://api.github.com/repos/psf/requests/issues/1960/comments
https://api.github.com/repos/psf/requests/issues/1960/events
https://github.com/psf/requests/issues/1960
29,495,243
MDU6SXNzdWUyOTQ5NTI0Mw==
1,960
request.history can be either a list or a tuple
{ "avatar_url": "https://avatars.githubusercontent.com/u/410452?v=4", "events_url": "https://api.github.com/users/sylvinus/events{/privacy}", "followers_url": "https://api.github.com/users/sylvinus/followers", "following_url": "https://api.github.com/users/sylvinus/following{/other_user}", "gists_url": "https://api.github.com/users/sylvinus/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sylvinus", "id": 410452, "login": "sylvinus", "node_id": "MDQ6VXNlcjQxMDQ1Mg==", "organizations_url": "https://api.github.com/users/sylvinus/orgs", "received_events_url": "https://api.github.com/users/sylvinus/received_events", "repos_url": "https://api.github.com/users/sylvinus/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sylvinus/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sylvinus/subscriptions", "type": "User", "url": "https://api.github.com/users/sylvinus", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2014-03-15T15:54:39Z
2021-09-09T00:09:55Z
2014-03-24T15:45:07Z
NONE
resolved
IMHO r.history should always be a list for least surprise. In _some_ cases, it is returned as a tuple: https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L530 Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1960/reactions" }
https://api.github.com/repos/psf/requests/issues/1960/timeline
null
completed
null
null
false
[ "I was going to close this but there is no other issue currently open for it, even though it's a known bug.\n\nI'm curious why is it least surprising for history to be a list?\n", "I was doing response1.history + response2.history, and both should always be the same type or python will raise an unexpected error from the user's standpoint :)\n", "I mean why shouldn't history always be a tuple?\n\n``` python\n(1, 2, 3) + (4, 5, 6) == (1, 2, 3, 4, 5, 6)\n```\n\nI understand they should always be the same type and agree with you. \n", "Yes definitely, list or tuple I don't actually care as long as it's always the same. I was mentioning list because it is in the docs:\nhttp://docs.python-requests.org/en/latest/user/quickstart/#redirection-and-history\n", "Good point. :+1:\n" ]
https://api.github.com/repos/psf/requests/issues/1959
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1959/labels{/name}
https://api.github.com/repos/psf/requests/issues/1959/comments
https://api.github.com/repos/psf/requests/issues/1959/events
https://github.com/psf/requests/pull/1959
29,417,757
MDExOlB1bGxSZXF1ZXN0MTM1NjE0Nzg=
1,959
support request tuple data
{ "avatar_url": "https://avatars.githubusercontent.com/u/1502489?v=4", "events_url": "https://api.github.com/users/Feng23/events{/privacy}", "followers_url": "https://api.github.com/users/Feng23/followers", "following_url": "https://api.github.com/users/Feng23/following{/other_user}", "gists_url": "https://api.github.com/users/Feng23/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Feng23", "id": 1502489, "login": "Feng23", "node_id": "MDQ6VXNlcjE1MDI0ODk=", "organizations_url": "https://api.github.com/users/Feng23/orgs", "received_events_url": "https://api.github.com/users/Feng23/received_events", "repos_url": "https://api.github.com/users/Feng23/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Feng23/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Feng23/subscriptions", "type": "User", "url": "https://api.github.com/users/Feng23", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
15
2014-03-14T08:37:24Z
2021-09-08T23:11:00Z
2014-03-31T14:30:36Z
CONTRIBUTOR
resolved
I wrote this: `r = requests.post(url, data=post_data, timeout=10)` Then I found that the data param does not support the tuple type, but it should. So I added one line in requests/models.py to fix this problem and added some unit tests for it.
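The use case reads roughly like the sketch below; httpbin.org is used only as a stand-in for the real endpoint.

```
import requests

# Form data as a sequence of (key, value) pairs rather than a dict;
# this preserves ordering and allows repeated keys.
post_data = (("tag", "python"), ("tag", "http"), ("q", "requests"))
r = requests.post("https://httpbin.org/post", data=post_data, timeout=10)
print(r.json()["form"])
```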
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1959/reactions" }
https://api.github.com/repos/psf/requests/issues/1959/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1959.diff", "html_url": "https://github.com/psf/requests/pull/1959", "merged_at": "2014-03-31T14:30:36Z", "patch_url": "https://github.com/psf/requests/pull/1959.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1959" }
true
[ "Thanks for this! Looks good to me. :cake:\n", "However, the test changes need some work.\n\nFirst, there's no reason to break out a new test file at this stage: keep adding to the one we have. =) This way your tests are covered as part of our CI system.\n\nSecondly, we're using `py.test`, so we use tests in that style. This means the classes don't subclass `unittest.TestCase` and they don't use `self.assertEqual` or `self.assertTrue`, they just use `assert`. Do you mind rewriting the tests?\n", "I will rewrite it. \nThanks.\n", "I have minor feedback when I can get to a computer \n", "Yeah, I just hoped to generate random data at first, but forgot it.\nI will adopt your suggestions.\nThanks. \n", "@Feng23 thanks in advance. Also thank you for being patient and understanding with @Lukasa and me. It's our job to make sure your code is excellent enough to be merged without Kenneth objecting. Thanks for this contribution. It's excellent. :cake: \n", "LGTM! :+1:\n", "I closed it by mistake... \n", ":+1: LGTM also. Thanks for this!\n", "@Feng23 Can you do a rebase for me? I'll merge once you do :)\n", "I fetched the master of `kennethreitz/requests`, squashed my changes and then performed the rebase, but there is a merge failure now. Did I do something wrong?\n", "Nope, that's the fault of two separate Pull Requests that got merged before yours whose changes conflicted a little bit. We should fix that up separately.\n", "Tracked in #1975.\n", "My kingdom for a CI server that would work when master is updated and PRs are open and mergeable. \n", "@Feng23 one more rebase and your build should pass. :)\n" ]
https://api.github.com/repos/psf/requests/issues/1958
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1958/labels{/name}
https://api.github.com/repos/psf/requests/issues/1958/comments
https://api.github.com/repos/psf/requests/issues/1958/events
https://github.com/psf/requests/issues/1958
29,379,196
MDU6SXNzdWUyOTM3OTE5Ng==
1,958
urllib3 exceptions "leak" under some conditions
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-03-13T18:57:44Z
2021-09-09T00:09:59Z
2014-03-13T19:09:29Z
CONTRIBUTOR
resolved
I said in #1957 that I would like a guarantee that ``` try: resp = requests.get("...") except requests.exceptions.RequestException as e: # recover ``` is sufficient to trap all exceptions that may occur as a result of "Weird Shit coming off the network" -- this should be understood broadly: anything other than a successful DNS lookup followed by a successful connection to a server that actually speaks HTTP(S) on that port and returns (a chain of valid redirections culminating in) a valid HTTP response. #1957 covers one class of problems in that area. This is the other: urllib3 is supposed to be just an implementation detail to the user of Requests, but under some conditions, some of its exceptions can "leak" out to the user. I have observed urllib3.exceptions.MaxRetryError and urllib3.exceptions.LocationParseError. Unfortunately, I do not have enough information to pin down exactly what causes these.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1958/reactions" }
https://api.github.com/repos/psf/requests/issues/1958/timeline
null
completed
null
null
false
[ "We really just need tracebacks to find these. This is an open issue in #1572, so I'm closing this to centralise there.\n" ]
https://api.github.com/repos/psf/requests/issues/1957
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1957/labels{/name}
https://api.github.com/repos/psf/requests/issues/1957/comments
https://api.github.com/repos/psf/requests/issues/1957/events
https://github.com/psf/requests/issues/1957
29,378,104
MDU6SXNzdWUyOTM3ODEwNA==
1,957
Poor handling of redirection to invalid URLs
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-03-13T18:44:31Z
2021-09-09T00:09:58Z
2014-03-13T19:13:54Z
CONTRIBUTOR
resolved
If you get redirected to a syntactically invalid URL, most of the time it is the send() for the next outgoing request that will fail, and with a reasonably plausible exception. For instance: ``` >>> import requests >>> requests.get("http://httpbin.org/redirect-to?url=htto://example.com/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File ".../requests/api.py", line 55, in get return request('get', url, **kwargs) File ".../requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File ".../requests/sessions.py", line 382, in request resp = self.send(prep, **send_kwargs) File ".../requests/sessions.py", line 505, in send history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 505, in <listcomp> history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 167, in resolve_redirects allow_redirects=False, File ".../requests/sessions.py", line 480, in send adapter = self.get_adapter(url=request.url) File ".../requests/sessions.py", line 525, in get_adapter raise InvalidSchema("No connection adapters were found for '%s'" % url) requests.exceptions.InvalidSchema: No connection adapters were found for 'htto://example.com/' ``` But a sufficiently mangled URL can produce exceptions that don't make nearly as much sense: ``` >>> requests.get("http://httpbin.org/redirect-to?url=http://@") Traceback (most recent call last): File "<stdin>", line 1, in <module> File ".../requests/api.py", line 55, in get return request('get', url, **kwargs) File ".../requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File ".../requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File ".../requests/sessions.py", line 506, in send history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 506, in <listcomp> history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 168, in resolve_redirects allow_redirects=False, File ".../requests/sessions.py", line 486, in send r = adapter.send(request, **kwargs) File ".../requests/adapters.py", line 305, in send conn = self.get_connection(request.url, proxies) File ".../requests/adapters.py", line 222, in get_connection conn = self.poolmanager.connection_from_url(url) File ".../urllib3/poolmanager.py", line 133, in connection_from_url return self.connection_from_host(u.host, port=u.port, scheme=u.scheme) File ".../urllib3/poolmanager.py", line 119, in connection_from_host pool = self._new_pool(scheme, host, port) File ".../urllib3/poolmanager.py", line 86, in _new_pool return pool_cls(host, port, **kwargs) File ".../urllib3/connectionpool.py", line 226, in __init__ ConnectionPool.__init__(self, host, port) File ".../urllib3/connectionpool.py", line 156, in __init__ host = host.strip('[]') AttributeError: 'NoneType' object has no attribute 'strip' ``` (This one is arguably a bug in urllib3.) 
Or you can get an exception from the guts of `urlsplit`: ``` >>> requests.get("http://httpbin.org/redirect-to?url=http://example[.com/") Traceback (most recent call last): File "<stdin>", line 1, in <module> File ".../requests/api.py", line 55, in get return request('get', url, **kwargs) File ".../requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File ".../requests/sessions.py", line 383, in request resp = self.send(prep, **send_kwargs) File ".../requests/sessions.py", line 506, in send history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 506, in <listcomp> history = [resp for resp in gen] if allow_redirects else [] File ".../requests/sessions.py", line 113, in resolve_redirects parsed = urlparse(url) File ".../urllib/parse.py", line 293, in urlparse splitresult = urlsplit(url, scheme, allow_fragments) File ".../urllib/parse.py", line 343, in urlsplit raise ValueError("Invalid IPv6 URL") ValueError: Invalid IPv6 URL ``` (The current urllib.parse module _appears_ only to throw exceptions, as used in Requests, when the netloc part of an URL contains an unmatched square bracket, but perhaps it might get pickier in the future.) I'm pretty agnostic about what the actual fix should be here, but I do think that it should entail a guarantee that ``` try: resp = requests.get("...") except requests.exceptions.RequestException as e: # recover ``` is sufficient to trap all exceptions that may occur as a result of Weird Shit coming off the network. (Getting TypeError, ValueError, AttributeError, etc. as a result of _programming errors in the application_ is fine.)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1957/reactions" }
https://api.github.com/repos/psf/requests/issues/1957/timeline
null
completed
null
null
false
[ "Is that a highly generic exception? It seems very clear to me.\n", "Generic in the sense that `ValueError` gets thrown for many other reasons; but really that's not the issue; the issue is that catching `RequestException` isn't sufficient to intercept all possible server misbehavior.\n", "In which case this is a manifestation of #1958 and therefore #1572, so I'll close this. =)\n", "Fair enough. I filed shazow/urllib3#355 on the \"NoneType has no attribute 'strip'\" case.\n" ]
https://api.github.com/repos/psf/requests/issues/1956
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1956/labels{/name}
https://api.github.com/repos/psf/requests/issues/1956/comments
https://api.github.com/repos/psf/requests/issues/1956/events
https://github.com/psf/requests/issues/1956
29,365,980
MDU6SXNzdWUyOTM2NTk4MA==
1,956
`Session.send` accepts arbitrary kwargs for the adapter, `.resolve_redirects` doesn't
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[ { "color": "e102d8", "default": false, "description": null, "id": 117745, "name": "Planned", "node_id": "MDU6TGFiZWwxMTc3NDU=", "url": "https://api.github.com/repos/psf/requests/labels/Planned" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
14
2014-03-13T16:31:23Z
2021-09-08T23:05:01Z
2015-04-20T05:46:59Z
CONTRIBUTOR
resolved
In addition to its documented set of optional arguments (`stream=`, `verify=`, `cert=`, `proxies=`), `Session.send` accepts arbitrary keyword arguments, which are passed directly to the adapter's `send` method. (For instance, `HTTPAdapter.send` also accepts a `timeout` argument.) `resolve_redirects` logically ought to do the same, but it does not: it only accepts stream/verify/cert/proxies. This may, for instance, cause a timeout manually specified in a `send` call to get applied only to the first request and not to subsequent redirections. If you are not manually walking redirections, you presumably want your `send` kwargs to get applied to all redirections in the chain; if you _are_ manually walking redirections, you may want to be able to control all of the adapter's supported `send` kwargs at each step.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1956/reactions" }
https://api.github.com/repos/psf/requests/issues/1956/timeline
null
completed
null
null
false
[ "Yeah, we should be passing all the keyword arguments through.\n", ":+1: Nice find.\n", "Just to clarify, if i change:\n`def resolve_redirects(self, resp, req, stream=False, timeout=None, verify=True, cert=None, proxies=None):`\nto:\n`def resolve_redirects(self, resp, req, **kwargs):`\nI need to add in the lines:\n\n```\nkwargs.setdefault('stream', self.stream)\nkwargs.setdefault('timeout', self.timeout)\nkwargs.setdefault('verify', self.verify)\nkwargs.setdefault('cert', self.cert)\nkwargs.setdefault('proxies', self.proxies)\n```\n\nand make sure to do:\n\n```\nallow_redirects = kwargs.pop('allow_redirects', True)\nstream = kwargs.get('stream')\ntimeout = kwargs.get('timeout')\nverify = kwargs.get('verify')\ncert = kwargs.get('cert')\nproxies = kwargs.get('proxies')\n\n```\n\nbefore i use those variables?\n\n(I'm new to the project and am slightly unclear on the purpose of changing this, clarification would be great)\n", "Hey @mikecool1000 welcome to the project. So you can't simply change the function signature like that, here's why.\n\nRight now, someone could be doing this:\n\n``` python\ns = requests.Session()\n# ...\ns.resolve_redirects(resp, req, False, 1.0, True, None, {})\n```\n\nIf we then change the signature, even one extra positional argument will fail, e.g., try the following:\n\n``` python\ndef foo(a, b, c=3):\n return a, b, c\n\nfoo(1, 2) # should return (1, 2, 3)\nfoo(1, 2, 4) # should return (1, 2, 4)\n\ndef foo(a, b, **kwargs):\n c = kwargs.pop('c', 3)\n return a, b, c\n\nfoo(1, 2) # should return (1, 2, 3)\nfoo(1, 2, 4) # should raise a TypeError about foo only accepting 2 arguments and receiving 3\n```\n\nSo you'd want to add `**kwargs` after the existing positional arguments (that have defaults). Maybe this will set you on the right course without giving you too much room to shoot yourself in the foot or giving you the all the answers. :)\n", "Thanks that's very helpful\n\nSent from my iPhone\n\n> On Oct 10, 2014, at 2:01 PM, Ian Cordasco [email protected] wrote:\n> \n> Hey @mikecool1000 welcome to the project. So you can't simply change the function signature like that, here's why.\n> \n> Right now, someone could be doing this:\n> \n> s = requests.Session()\n> \n> # ...\n> \n> s.resolve_redirects(resp, req, False, 1.0, True, None, {})\n> If we then change the signature, even one extra positional argument will fail, e.g., try the following:\n> \n> def foo(a, b, c=3):\n> return a, b, c\n> \n> foo(1, 2) # should return (1, 2, 3)\n> foo(1, 2, 4) # should return (1, 2, 4)\n> \n> def foo(a, b, **kwargs):\n> c = kwargs.pop('c', 3)\n> return a, b, c\n> \n> foo(1, 2) # should return (1, 2, 3)\n> foo(1, 2, 4) # should raise a TypeError about foo only accepting 2 arguments and receiving 3\n> So you'd want to add **kwargs after the existing positional arguments (that have defaults). Maybe this will set you on the right course without giving you too much room to shoot yourself in the foot or giving you the all the answers. :)\n> \n> —\n> Reply to this email directly or view it on GitHub.\n", "@mikecool1000 if you need other help, please hop onto #python-requests on Freenode this weekend. 
If I'm around, I'll be happy to chat with you about this.\n", "Hey guys, while working on this issue, i found:\n\n```\n # Redirect resolving generator.\n gen = self.resolve_redirects(r, request,\n stream=stream,\n timeout=timeout,\n verify=verify,\n cert=cert,\n proxies=proxies)\n\n # Resolve redirects if allowed.\n history = [resp for resp in gen] if allow_redirects else []\n```\n\nin the Session.send definition. I'm wondering if it would be better to not always call resolve redirects, and instead to choose to execute or not based upon whether or not allow_redirects is true. For example replace this code block with:\n\n```\nhistory = []\nif allow_redirects:\n gen = self.resolve_redirects(r, request,\n stream=stream,\n timeout=timeout,\n verify=verify,\n cert=cert,\n proxies=proxies)\n history = [resp for resp in gen]\n```\n\nunless I'm mistaken(which is quite possible), this would avoid unnecessary computation\n", "here is my first attempt at fixing this, no idea how good it is https://github.com/mikecool1000/requests/blob/fix-%231956/requests/sessions.py\n", "@mikecool1000 how familiar are you with generators? You should try this out at the interpreter:\n\n``` pycon\n>>> import requests\n>>> s = requests.session()\n>>> r = s.get('https://httpbin.org/redirect/2', allow_redirects=False)\n>>> g = s.resolve_redirects(r, r.request)\n>>> type(g)\n<type 'generator'>\n```\n\nWhen you run `g = s.resolve_redirects(r, r.request)` note that it will return immediately. It doesn't talk to the network until you start to consume the generator. There's no unnecessary computation at all :)\n", "i figured it was something like that, but ive had no experience with generators in the past, sorry for being such a bad coder lol\n", "No one called you a bad coder @mikecool1000 and you don't need to apologize for not having had prior experience with generators\n", "Thanks for helping me learn then :)\n\nSent from my iPhone\n\n> On Oct 13, 2014, at 11:07 AM, Ian Cordasco [email protected] wrote:\n> \n> No one called you a bad coder @mikecool1000 and you don't need to apologize for not having had prior experience with generators\n> \n> —\n> Reply to this email directly or view it on GitHub.\n", "Hullo! I see that `resolve_redirects` contains the `**adapter_kwargs` parameter already in `requests.sessions.py`:\n\n```\n def resolve_redirects(self, resp, req, stream=False, timeout=None,\n verify=True, cert=None, proxies=None, **adapter_kwargs):\n```\n\nSo it looks like the arbitrary `kwargs` is getting passed through already in the function definition? Maybe I might be mis-understanding a thing. \n", "@onceuponatimeforever Ah, good catch, we just merged a fix for this a few days ago. =)\n" ]
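The thread above settles on appending `**adapter_kwargs` after the existing defaulted parameters so positional callers keep working. The toy functions below illustrate that compatibility point in isolation; they are not the library's actual implementation.

```python
# Old-style signature: only a fixed set of settings can reach the adapter.
def resolve_redirects_old(resp, req, stream=False, timeout=None,
                          verify=True, cert=None, proxies=None):
    return {"stream": stream, "timeout": timeout, "verify": verify,
            "cert": cert, "proxies": proxies}

# New-style signature: extra keyword arguments (e.g. an adapter-specific
# timeout) are forwarded on every redirect hop, while callers that passed
# the original arguments positionally are unaffected.
def resolve_redirects_new(resp, req, stream=False, timeout=None,
                          verify=True, cert=None, proxies=None,
                          **adapter_kwargs):
    settings = {"stream": stream, "timeout": timeout, "verify": verify,
                "cert": cert, "proxies": proxies}
    settings.update(adapter_kwargs)
    return settings

# A caller written against the old signature keeps working unchanged:
print(resolve_redirects_new(None, None, False, 1.0, True, None, {}))
```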
https://api.github.com/repos/psf/requests/issues/1955
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1955/labels{/name}
https://api.github.com/repos/psf/requests/issues/1955/comments
https://api.github.com/repos/psf/requests/issues/1955/events
https://github.com/psf/requests/issues/1955
29,365,191
MDU6SXNzdWUyOTM2NTE5MQ==
1,955
`Session.resolve_redirects` copies the original request for all subsequent requests, can cause incorrect method selection
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
4
2014-03-13T16:23:27Z
2021-09-09T00:09:56Z
2014-03-24T15:44:14Z
CONTRIBUTOR
resolved
Consider the following redirection chain: ``` POST /do_something HTTP/1.1 Host: server.example.com ... HTTP/1.1 303 See Other Location: /new_thing_1513 GET /new_thing_1513 Host: server.example.com ... HTTP/1.1 307 Temporary Redirect Location: //failover.example.com/new_thing_1513 ``` The intermediate 303 See Other has caused the POST to be converted to a GET. The subsequent 307 should preserve the GET. However, because `Session.resolve_redirects` starts each iteration by copying the _original_ request object, Requests will issue a POST!
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1955/reactions" }
https://api.github.com/repos/psf/requests/issues/1955/timeline
null
completed
null
null
false
[ "Uh, yes, that's a bug. =D\n", "This is also a good example of something that there's no good way to write a test for with httpbin as-is.\n", "This can be tested though, without httpbin, and I'll tackle this one tonight or this weekend. I've tinkered with `resolve_redirects` enough to be certain enough that I caused this. As such I feel its my responsibility to fix it.\n", "@zackw check out #1963 if you have some time.\n" ]
https://api.github.com/repos/psf/requests/issues/1954
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1954/labels{/name}
https://api.github.com/repos/psf/requests/issues/1954/comments
https://api.github.com/repos/psf/requests/issues/1954/events
https://github.com/psf/requests/issues/1954
29,364,213
MDU6SXNzdWUyOTM2NDIxMw==
1,954
`Session.resolve_redirects` allows you to shoot yourself in the foot by passing the wrong PreparedRequest
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-03-13T16:12:26Z
2021-09-09T00:09:58Z
2014-03-14T16:41:02Z
CONTRIBUTOR
resolved
`Session.resolve_redirects` requires you to pass in both the initial response, and the PreparedRequest that produced that response, even though the latter is available as the `.request` property of the former. If you pass in a _different_ PreparedRequest object, the redirection information in the response is merged with the PreparedRequest you supplied. This cannot cause you to be sent to the wrong _URL_, because the original URL is taken from the _response_, but it may cause you to supply the wrong X-headers, HTTP method, or anything else drawn directly from the request. I suggest that `Session.resolve_redirects` should continue to accept the `req` argument for backward compatibility, but it should be made optional (i.e. `req=None` in the signature) and should be ignored in favor of `resp.request`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1954/reactions" }
https://api.github.com/repos/psf/requests/issues/1954/timeline
null
completed
null
null
false
[ "I actually disagree with this one: the ability to pass arbitrary `PreparedRequest` objects makes this method easier to test. =)\n", "Hm, can you give an example? Perhaps we can find a better way.\n", "I'm with @Lukasa on this one and I'll work on an example. That said, we allow the user to shoot themselves in the foot plenty. There's absolutely no reason to stop them from doing so here. It's our job to provide an elegant API to the user, not to make sure they do everything right.\n\nYou're also proposing that we should favor the prepared request on the Response object to the one passed in. Perhaps someone is relying on this specific behaviour for a really good reason. If we change this the way you're suggesting, that person's code will break in an awful and entirely unexpected way. No matter how well you document changes, people will ignore that documentation and expect the same behaviour unless the API is totally changed, i.e., the request parameter is totally removed.\n", "Honestly, I consider `resolve_redirects` an API that nobody should have to use anyway - as far as I'm concerned it's an internal implementation detail of `send` that happens to be the only way to do manual redirect walking right now - and so I'm fine with leaving it as is on compatibility grounds...\n\n... _except_ that I'm worried that might make it harder to fix #1955. But you said you will deal with #1955, so that's your problem. =)\n" ]
https://api.github.com/repos/psf/requests/issues/1953
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1953/labels{/name}
https://api.github.com/repos/psf/requests/issues/1953/comments
https://api.github.com/repos/psf/requests/issues/1953/events
https://github.com/psf/requests/issues/1953
29,363,729
MDU6SXNzdWUyOTM2MzcyOQ==
1,953
The iterable produced by `Session.resolve_redirects` does not include the very first response
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" } ]
open
false
null
[]
null
14
2014-03-13T16:06:54Z
2015-08-31T06:50:04Z
null
CONTRIBUTOR
null
If you are manually walking over redirects, you probably want to structure your code like this: ``` redirect_sequence = session.send_with_manual_redirect_walking(request, ...) for resp in redirect_sequence: # do something with 'resp' ``` The existing API does not let you do that. You must write either ``` first_response = session.send(request, ..., allow_redirects=False) # do something with 'first_response' for resp in session.resolve_redirects(first_response, request, ...) # do something with 'resp' ``` which involves writing the same "do something with" code in two places, or ``` resp = session.send(request, ..., allow_redirects=False) redir_iter = session.resolve_redirects(resp, request, ...) while True: # do something with 'resp' if not resp.is_redirect: break resp = next(redir_iter) ``` which is un-Pythonic loop structure. Since `Session.resolve_redirects` must remain as is for compatibility's sake, the only way to fix this is to add either a new mode to `send` (`allow_redirects=MANUAL`?) or a new Session method (perhaps in fact called `send_with_manual_redirect_walking`) which returns an iterable that _does_ include the very first response. I do not particularly care which, or what the new method is called in the second case.
null
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1953/reactions" }
https://api.github.com/repos/psf/requests/issues/1953/timeline
null
null
null
null
false
[ "It seems like the best approach is the second with a better method name.\n", "What do you think a better method name would be? I am chronically bad at names.\n", "Heh, there's not really a good one here, but could be `iter_send()`. Matches an established trend in requests to use `iter_x`.\n", "Wouldn't `iter_locations` or `iter_redirects` be better?\n", "I thought about it, but both are misleading, as they contain the first response (which is _not_ a redirect). Could be `iter_responses`?\n\nAnyway, this is bikeshedding of the highest order. The question is are we happy to add this method to the API?\n", "It's the very _last_ response in the sequence that's not a redirect, but yeah, one of them isn't. I rather like `iter_responses`.\n", "I like `iter_send` better than `iter_responses`. They both sound a little nebulous though. The former makes a bit more sense to me with the understanding that `allow_redirects` is **not** a parameter.\n\nAll things considered, the goal here is more to provide an equivalent to `send` that gives the user more control than `request` over their redirects. Since it really will be a companion to `send` then, I think `iter_send` makes more sense. `iter_responses` could mean too many things. We have to design this API with the constraint that the users taking advantage of this will will be advanced users. They should already be familiar with `send` so `iter_send` should be an intuitive leap to them, even if the name still is a bit vague.\n\n---\n\nOn a side note, allow me to play devil's advocate. I can easily see people complaining that we only allow for this handling of redirects on such a low level. People will want this change to bubble up to `iter_response` (to correlate to the `response` method) as well as corresponding methods for `iter_get`, `iter_post`, etc. (regardless of whether or not some of those make sense based on the RFCs and the way servers behave). In other words, as devil's advocate, I'm warning of what could be perceived by some users as a foot in the door to further API extensions that are unnecessary and ugly. This is less of an argument against this change, and more of a warning that we should be careful how we choose to architect and document this.\n", "Taking @sigmavirus24's concern on board, is there an elegant way we can do this _outside_ of the library, e.g. in the toolbelt?\n", "I'm not familiar with this \"toolbelt\"? But, an alternative would be a `Session.prepare_request_for_redirect` method that takes an `.is_redirect` response and produces a new `PreparedRequest` to follow the redirect. That is sufficiently low-level that I don't think it would induce feature creep, but makes it straightforward to write the generator yourself if you want it:\n\n```\ndef iter_send(session, request, **kwargs):\n resp = session.send(request, allow_redirects=False, **kwargs)\n while resp.is_redirect:\n yield resp\n resp = session.send(session.prepare_request_for_redirect(resp),\n allow_redirects=False, **kwargs)\n yield resp\n```\n\nSince backward compatibility dictates preserving `resolve_redirects`, we are going to want a method like this anyway to house shared code between `iter_send` and `resolve_redirects`.\n\nHaving said that, personally I'm not much concerned about the feature creep issue, because I think anyone who wants to do manual redirection chasing is going to want to work with the Session API anyway. 
If nothing else, you probably need Session-level control over cookies.\n", "> Taking @sigmavirus24's concern on board\n\nIt isn't a very strong concern. It's more of a pattern I've seen develop as of late. People watch the repo for a tiny change and use that change to get their foot in the door for a larger one that is widely unnecessary. It's a tiny concern that's ever present now.\n\nLikewise, I think the toolbelt could easily accomodate this. That said, I'm not convinced it should be either in or outside of the core (i.e., I don't actually know where it belongs).\n\nI've also been thinking along the same lines @zackw, but more geared towards making an eventual refactor a lot easier. I like having a compliment to `prepare_request` sibling. How does `prepare_redirected_request` sound?\n", "I can get behind that idea, though I'm +0.5 until I see some code.\n", "I really want to get some work done on the toolbelt and betamax today. If @zackw has the time to throw together an example of `prepare_redirected_request` that'd be great. Otherwise, I'll likely work on it later this week.\n", "@sigmavirus24 Not a problem - it's a simple matter of moving code around. See #1965.\n\n(I am going to be offline for most of the rest of the day, though.)\n", "It's the weekend. Enjoy your Sunday! :cake: \n" ]
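The helper discussed above can be sketched entirely in user code; `iter_send` below is such a sketch, not part of the requests API, and it simply stitches `send` and `resolve_redirects` together so that the very first response is included in the iteration.

```python
import requests

def iter_send(session, request, **kwargs):
    # Yield the first response, then every redirect that follows it.
    resp = session.send(request, allow_redirects=False, **kwargs)
    yield resp
    for resp in session.resolve_redirects(resp, resp.request, **kwargs):
        yield resp

s = requests.Session()
prepared = s.prepare_request(
    requests.Request("GET", "https://httpbin.org/redirect/2"))

for resp in iter_send(s, prepared):
    # The "do something with resp" code is written exactly once.
    print(resp.status_code, resp.url)
```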
https://api.github.com/repos/psf/requests/issues/1952
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1952/labels{/name}
https://api.github.com/repos/psf/requests/issues/1952/comments
https://api.github.com/repos/psf/requests/issues/1952/events
https://github.com/psf/requests/issues/1952
29,362,145
MDU6SXNzdWUyOTM2MjE0NQ==
1,952
In `stream=True` mode, redirect-following crashes when body already consumed
{ "avatar_url": "https://avatars.githubusercontent.com/u/325899?v=4", "events_url": "https://api.github.com/users/zackw/events{/privacy}", "followers_url": "https://api.github.com/users/zackw/followers", "following_url": "https://api.github.com/users/zackw/following{/other_user}", "gists_url": "https://api.github.com/users/zackw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zackw", "id": 325899, "login": "zackw", "node_id": "MDQ6VXNlcjMyNTg5OQ==", "organizations_url": "https://api.github.com/users/zackw/orgs", "received_events_url": "https://api.github.com/users/zackw/received_events", "repos_url": "https://api.github.com/users/zackw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zackw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zackw/subscriptions", "type": "User", "url": "https://api.github.com/users/zackw", "user_view_type": "public" }
[ { "color": "5319e7", "default": false, "description": null, "id": 67760318, "name": "Fixed", "node_id": "MDU6TGFiZWw2Nzc2MDMxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Fixed" } ]
closed
true
null
[]
null
2
2014-03-13T15:49:42Z
2021-09-08T23:06:11Z
2015-01-18T20:33:02Z
CONTRIBUTOR
resolved
`Session.resolve_redirects` unconditionally accesses `Response.content`, to make sure that all of the content _has_ been consumed, so the socket can be reused. If you are manually following redirects, and you have set `stream=True`, and you use `Response.iter_content` to consume the entire body of a redirect before cycling to the next one, this access of `.content` will throw a `RuntimeError`. This appears to be fixed by pull request #1944; I'm filing this bug anyway to record that it _is_ a bug (and to make clear why "`except RuntimeError: pass`" is needed).
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1952/reactions" }
https://api.github.com/repos/psf/requests/issues/1952/timeline
null
completed
null
null
false
[ "Thanks for this, I'm marking this issue as fixed. =)\n", "@zackw can you please give more detailed steps to reproduce (code example or test case). \n" ]
https://api.github.com/repos/psf/requests/issues/1951
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1951/labels{/name}
https://api.github.com/repos/psf/requests/issues/1951/comments
https://api.github.com/repos/psf/requests/issues/1951/events
https://github.com/psf/requests/pull/1951
29,293,841
MDExOlB1bGxSZXF1ZXN0MTM0ODkxMjk=
1,951
Re-evaluate proxy authorization.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
7
2014-03-12T19:26:15Z
2021-09-08T23:07:19Z
2014-03-23T14:51:48Z
MEMBER
resolved
This pull request falls into three commits. The first is a refactoring of the `get_environ_proxies` method to make it possible to evaluate whether a given URL is in the NO_PROXY list. The second is a refactoring of the `resolve_redirects` method to move rebuilding the `Authorization` header to its own method. Both of these commits are intended to set the stage for the third, which is a new method that re-evaluates proxy configuration on a redirect, and re-evaluates proxy authorization as well. I don't want this merged yet, but it's a pretty sizeable change and I'd like to get eyes on it as early as possible. The key problem right now is that I don't have tests, though it should be possible for me to test this fairly easily. I'll want to add tests before we merge this. @sigmavirus24, do you mind taking a look?
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1951/reactions" }
https://api.github.com/repos/psf/requests/issues/1951/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1951.diff", "html_url": "https://github.com/psf/requests/pull/1951", "merged_at": "2014-03-23T14:51:48Z", "patch_url": "https://github.com/psf/requests/pull/1951.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1951" }
true
[ "I was skeptical of this at first, but I particularly like the changes to `resolve_redirects`.\n", "At least I'd prefer it if you changed the second `try`/`except` block. After that :shipit: unless there's more to do.\n", "Our inability to easily test our proxy function is a pain in the neck. One day I'll investigate adding a test harness that spins up a `mitmproxy` instance for testing.\n", "Anyway, I now consider this ready for formal review and, assuming everyone's happy, merging. =)\n", "Assigning to @kennethreitz since this looks good to me. :shipit: \n", ":sparkles: :cake: :sparkles:\n", "Now to check the mergeability of other redirect-related PRs ;)\n" ]
https://api.github.com/repos/psf/requests/issues/1950
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1950/labels{/name}
https://api.github.com/repos/psf/requests/issues/1950/comments
https://api.github.com/repos/psf/requests/issues/1950/events
https://github.com/psf/requests/issues/1950
29,282,347
MDU6SXNzdWUyOTI4MjM0Nw==
1,950
Content length differs on requests with same data, method and URL
{ "avatar_url": "https://avatars.githubusercontent.com/u/22042?v=4", "events_url": "https://api.github.com/users/jms/events{/privacy}", "followers_url": "https://api.github.com/users/jms/followers", "following_url": "https://api.github.com/users/jms/following{/other_user}", "gists_url": "https://api.github.com/users/jms/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jms", "id": 22042, "login": "jms", "node_id": "MDQ6VXNlcjIyMDQy", "organizations_url": "https://api.github.com/users/jms/orgs", "received_events_url": "https://api.github.com/users/jms/received_events", "repos_url": "https://api.github.com/users/jms/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jms/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jms/subscriptions", "type": "User", "url": "https://api.github.com/users/jms", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-03-12T17:11:02Z
2021-09-09T00:10:00Z
2014-03-12T18:27:16Z
NONE
resolved
Hi, I was testing requests against a REST web service (SugarCRM) and noticed that the content length differs for the same request made through different API calls. For example, what is the difference between using a "prepared request" and a "normal request"? Should the content length be the same or not? ``` python import requests import hashlib import json username = 'admin' password = 'facil' a = { 'user_auth': { 'user_name': username, 'password': hashlib.md5(password).hexdigest() } } method = 'login' data = json.dumps(a) payload = { 'method': method, 'input_type': 'json', 'response_type' : 'json', 'rest_data' : data } sugarurl = 'http://192.168.100.138/crm/service/v4_1/rest.php' # first request s = requests.Session() r = requests.Request('GET', sugarurl, params=payload) response = s.send(r.prepare()) # second request, same data x = requests.get(sugarurl, params=payload) ``` This is the traffic capture for the two requests. ![requests](https://f.cloud.github.com/assets/22042/2400194/886738d2-aa07-11e3-8ef0-3adfa1129758.png)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1950/reactions" }
https://api.github.com/repos/psf/requests/issues/1950/timeline
null
completed
null
null
false
[ "The way you've used them they will not be the same, because the second request gets the `Session` headers put on it.\n\nYou can get the same length by replacing `response = s.send(r.prepare())` with `response = s.send(s.prepare_request(r))`.\n" ]
https://api.github.com/repos/psf/requests/issues/1949
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1949/labels{/name}
https://api.github.com/repos/psf/requests/issues/1949/comments
https://api.github.com/repos/psf/requests/issues/1949/events
https://github.com/psf/requests/pull/1949
29,280,827
MDExOlB1bGxSZXF1ZXN0MTM0ODY2NTU=
1,949
Pickled Responses should include a None value for the raw attribute
{ "avatar_url": "https://avatars.githubusercontent.com/u/509830?v=4", "events_url": "https://api.github.com/users/ionrock/events{/privacy}", "followers_url": "https://api.github.com/users/ionrock/followers", "following_url": "https://api.github.com/users/ionrock/following{/other_user}", "gists_url": "https://api.github.com/users/ionrock/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ionrock", "id": 509830, "login": "ionrock", "node_id": "MDQ6VXNlcjUwOTgzMA==", "organizations_url": "https://api.github.com/users/ionrock/orgs", "received_events_url": "https://api.github.com/users/ionrock/received_events", "repos_url": "https://api.github.com/users/ionrock/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ionrock/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ionrock/subscriptions", "type": "User", "url": "https://api.github.com/users/ionrock", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
5
2014-03-12T16:54:19Z
2021-09-08T22:01:11Z
2014-03-12T20:18:58Z
CONTRIBUTOR
resolved
I received [this issue](https://github.com/ionrock/cachecontrol/issues/16) in CacheControl where a pickled response throws an error because there is no raw attribute on the response object. The `__setstate__` function should set this explicitly to an empty `urllib3.response.HTTPResponse`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1949/reactions" }
https://api.github.com/repos/psf/requests/issues/1949/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1949.diff", "html_url": "https://github.com/psf/requests/pull/1949", "merged_at": "2014-03-12T20:18:58Z", "patch_url": "https://github.com/psf/requests/pull/1949.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1949" }
true
[ "Yep, this is a good catch. We should fix this up.\n", "I don't think we should add a `HTTPResponse` object because it violates Kenneth's condition about being too tied to urllib3. It also makes an incorrect assumption that `urllib3` is always being used to provide a `raw` response, which is simply untrue.\n\nI'd argue it should be present and set to `None`.\n", "Can I get review from @sigmavirus24 and @ionrock?\n", "This seems to fix things in CacheControl. Thank you @Lukasa! \n\nMy reasoning for suggesting the `HTTPResponse` was because then someone doesn't have to check whether or not a response was cached or created outside a normal request. For example, if you used `resp.raw.seek(0)` you'd get an error as `None` obviously doesn't have a `seek` method. \n\nWith that said, `seek` doesn't work anyway! No blood, no foul. \n\nThanks for fixing this so quickly. I'll add some docs in CacheControl to help communicate that a cached response's `raw` attribute will be `None`. \n", "Not that you needed it, but :+1:. Also we should document this for others relying on pickling and using libraries like multiprocessing (which pickle objects).\n" ]
https://api.github.com/repos/psf/requests/issues/1948
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1948/labels{/name}
https://api.github.com/repos/psf/requests/issues/1948/comments
https://api.github.com/repos/psf/requests/issues/1948/events
https://github.com/psf/requests/issues/1948
29,254,181
MDU6SXNzdWUyOTI1NDE4MQ==
1,948
iter_content doesn't honour the timeout
{ "avatar_url": "https://avatars.githubusercontent.com/u/1308999?v=4", "events_url": "https://api.github.com/users/mortoray/events{/privacy}", "followers_url": "https://api.github.com/users/mortoray/followers", "following_url": "https://api.github.com/users/mortoray/following{/other_user}", "gists_url": "https://api.github.com/users/mortoray/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mortoray", "id": 1308999, "login": "mortoray", "node_id": "MDQ6VXNlcjEzMDg5OTk=", "organizations_url": "https://api.github.com/users/mortoray/orgs", "received_events_url": "https://api.github.com/users/mortoray/received_events", "repos_url": "https://api.github.com/users/mortoray/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mortoray/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mortoray/subscriptions", "type": "User", "url": "https://api.github.com/users/mortoray", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-12T11:11:37Z
2021-09-09T00:10:01Z
2014-03-12T11:20:03Z
NONE
resolved
It appears that `iter_content` does not honour `timeout` set in the initial `request` call. I can't see any option to provide a timeout on streamed data.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1948/reactions" }
https://api.github.com/repos/psf/requests/issues/1948/timeline
null
completed
null
null
false
[ "This is an outstanding issue already: see #1803.\n", "And, in fact, @ceaess has already provided a Pull Request that fixes this in #1935, which will be in the next release.\n" ]
https://api.github.com/repos/psf/requests/issues/1947
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1947/labels{/name}
https://api.github.com/repos/psf/requests/issues/1947/comments
https://api.github.com/repos/psf/requests/issues/1947/events
https://github.com/psf/requests/issues/1947
29,221,075
MDU6SXNzdWUyOTIyMTA3NQ==
1,947
Optionally depend on certifi
{ "avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4", "events_url": "https://api.github.com/users/dstufft/events{/privacy}", "followers_url": "https://api.github.com/users/dstufft/followers", "following_url": "https://api.github.com/users/dstufft/following{/other_user}", "gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dstufft", "id": 145979, "login": "dstufft", "node_id": "MDQ6VXNlcjE0NTk3OQ==", "organizations_url": "https://api.github.com/users/dstufft/orgs", "received_events_url": "https://api.github.com/users/dstufft/received_events", "repos_url": "https://api.github.com/users/dstufft/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dstufft/subscriptions", "type": "User", "url": "https://api.github.com/users/dstufft", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-11T22:37:50Z
2021-09-08T23:04:51Z
2015-05-03T15:10:08Z
CONTRIBUTOR
resolved
Right now requests (afaik) doesn't issue security releases for old versions or have any sort of support period for any versions of requests other than the latest. This is problematic because requests bundles the CA bundle as part of the code base. This means that people _must_ upgrade requests in order to get access to the latest certificates. This becomes even more problematic when people are on older requests that cross a major version boundary. I know requests used to depend on certifi and I assume it stopped doing so for some reason, likely ease of vendoring, so what I would like to suggest is for requests to know what version of certifi its internal certificates correspond to, and if a higher version of certifi is available, requests will use that instead. This would allow people to easily update the certificates of requests without having to upgrade requests, which might require code changes to cross a compatibility break.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1947/reactions" }
https://api.github.com/repos/psf/requests/issues/1947/timeline
null
completed
null
null
false
[ "@kennethreitz I'm open to doing the work here if this is something you think you'd like.\n", "Requests now automatically picks up certifi if present.\n" ]
https://api.github.com/repos/psf/requests/issues/1946
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1946/labels{/name}
https://api.github.com/repos/psf/requests/issues/1946/comments
https://api.github.com/repos/psf/requests/issues/1946/events
https://github.com/psf/requests/issues/1946
29,132,856
MDU6SXNzdWUyOTEzMjg1Ng==
1,946
Testsuite failure
{ "avatar_url": "https://avatars.githubusercontent.com/u/216539?v=4", "events_url": "https://api.github.com/users/ttanner/events{/privacy}", "followers_url": "https://api.github.com/users/ttanner/followers", "following_url": "https://api.github.com/users/ttanner/following{/other_user}", "gists_url": "https://api.github.com/users/ttanner/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ttanner", "id": 216539, "login": "ttanner", "node_id": "MDQ6VXNlcjIxNjUzOQ==", "organizations_url": "https://api.github.com/users/ttanner/orgs", "received_events_url": "https://api.github.com/users/ttanner/received_events", "repos_url": "https://api.github.com/users/ttanner/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ttanner/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ttanner/subscriptions", "type": "User", "url": "https://api.github.com/users/ttanner", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2014-03-10T21:08:06Z
2021-09-08T23:07:00Z
2014-11-13T19:11:28Z
NONE
resolved
Python 2.7.6 / macports, OS X 10.8.5, requests 2.2.1 ``` ====================================================================== FAIL: test_mixed_case_scheme_acceptable (test_requests.RequestsTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File "requests/test_requests.py", line 104, in test_mixed_case_scheme_acceptable assert r.status_code == 200, 'failed for scheme {0}'.format(scheme) AssertionError: failed for scheme HTTP:// 'HTTP://httpbin.org/get' = 'HTTP://' + ParseResult(scheme='http', netloc='httpbin.org', path='/get', params='', query='', fragment='').netloc + ParseResult(scheme='http', netloc='httpbin.org', path='/get', params='', query='', fragment='').path <Response [404]> = <module 'requests' from 'requests/requests/__init__.pyc'>.Request('GET', 'HTTP://httpbin.org/get') <Response [404]> = <requests.sessions.Session object at 0x110931e10>.send(<Response [404]>.prepare()) >> assert <Response [404]>.status_code == 200, 'failed for scheme {0}'.format('HTTP://') -------------------- >> begin captured logging << -------------------- requests.packages.urllib3.connectionpool: INFO: Starting new HTTP connection (1): 127.0.0.1 requests.packages.urllib3.connectionpool: DEBUG: "GET http://httpbin.org/get HTTP/1.1" 200 212 requests.packages.urllib3.connectionpool: DEBUG: "GET HTTP://httpbin.org/get HTTP/1.1" 404 2963 --------------------- >> end captured logging << --------------------- ---------------------------------------------------------------------- Ran 1 test in 0.504s FAILED (failures=1) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/216539?v=4", "events_url": "https://api.github.com/users/ttanner/events{/privacy}", "followers_url": "https://api.github.com/users/ttanner/followers", "following_url": "https://api.github.com/users/ttanner/following{/other_user}", "gists_url": "https://api.github.com/users/ttanner/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ttanner", "id": 216539, "login": "ttanner", "node_id": "MDQ6VXNlcjIxNjUzOQ==", "organizations_url": "https://api.github.com/users/ttanner/orgs", "received_events_url": "https://api.github.com/users/ttanner/received_events", "repos_url": "https://api.github.com/users/ttanner/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ttanner/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ttanner/subscriptions", "type": "User", "url": "https://api.github.com/users/ttanner", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1946/reactions" }
https://api.github.com/repos/psf/requests/issues/1946/timeline
null
completed
null
null
false
[ "I can't reproduce this on my Mac (10.9.2, Python 2.7.6 (homebrew), Requests 2.2.1) and our continuous integration didn't fail either. Does this reproducibly fail for you?\n", "yes, it is reproducible, however, only if Glimmerblocker (1.5.3) is activated.\nAll other programm (e.g. curl) can download from HTTP://httpbin.org/get\n\nIt seems the problem is that requests does not convert the scheme to lowercase.\n", "I think this is Glimmerblocker's fault, actually. RFC 2616 states (emphasis mine):\n\n> When comparing two URIs to decide if they match or not, a client SHOULD use a case-sensitive octet-by-octet comparison of the entire URIs, with these exceptions:\n> \n> A port that is empty or not given is equivalent to the default port for that URI-reference;\n> Comparisons of host names MUST be case-insensitive;\n> **Comparisons of scheme names MUST be case-insensitive**;\n> An empty abs_path is equivalent to an abs_path of \"/\".\n\nWe've had trouble here in the past with scheme sensitivity, and concluded that we should allow schemes set weirdly as much as possible. Glimmerblocker is in violation of RFC 2616, we are not. =)\n\n_With that said_, I wonder if we should be lowercasing schemes when we build URLs for proxies.\n", "I agree it must be Glimmerblocker's fault.\nBut given most other programm lowercase the scheme (and I've seen other code checking only for 'http') it would be a useful workaround for broken proxies to lowercase it as well.\n", "I don't understand how Glimmerblocker is causing our tests to fail.\n", "We call [this method](http://docs.python.org/2/library/urllib.html#urllib.getproxies) that gets the proxies defined in the system so that we can route to them. I'd wager that Glimmerblocker appears in that list. When it does, we change the request URI from just the path to the full URI, causing Glimmerblocker to see the upper-cased scheme whereupon it chokes because it was badly written. =D\n", "This bug has been fixed as a side-effect of https://github.com/t-8ch/requests/commit/54bf4dcaaa37a5a5efb77669d6e5ab68680de737\n" ]
https://api.github.com/repos/psf/requests/issues/1945
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1945/labels{/name}
https://api.github.com/repos/psf/requests/issues/1945/comments
https://api.github.com/repos/psf/requests/issues/1945/events
https://github.com/psf/requests/pull/1945
29,126,369
MDExOlB1bGxSZXF1ZXN0MTMzODk4Nzc=
1,945
SSL certificate CN and fingerprint verification support
{ "avatar_url": "https://avatars.githubusercontent.com/u/216539?v=4", "events_url": "https://api.github.com/users/ttanner/events{/privacy}", "followers_url": "https://api.github.com/users/ttanner/followers", "following_url": "https://api.github.com/users/ttanner/following{/other_user}", "gists_url": "https://api.github.com/users/ttanner/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ttanner", "id": 216539, "login": "ttanner", "node_id": "MDQ6VXNlcjIxNjUzOQ==", "organizations_url": "https://api.github.com/users/ttanner/orgs", "received_events_url": "https://api.github.com/users/ttanner/received_events", "repos_url": "https://api.github.com/users/ttanner/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ttanner/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ttanner/subscriptions", "type": "User", "url": "https://api.github.com/users/ttanner", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2014-03-10T19:46:26Z
2021-09-08T10:01:05Z
2014-03-12T20:59:05Z
NONE
resolved
This PR adds support for SSL CommonName and fingerprint verification provided by urllib3. In contrast to https://github.com/kennethreitz/requests/pull/1606 it is minimally invasive and does not require new API parameters. Instead, it allows "verify" to be a dictionary with extra parameters.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1945/reactions" }
https://api.github.com/repos/psf/requests/issues/1945/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1945.diff", "html_url": "https://github.com/psf/requests/pull/1945", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1945.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1945" }
true
[ "Thanks for this!\n\nI have no strong opinions, it looks fine. =) +0.5. @sigmavirus24?\n", "I have a couple concerns. The first is that you're introducing yet another way for the user to set the CA bundle. You did not state that in your description. I assume you want this in order to specify the path that you would otherwise specify to `verify` as a plain string.\n\nThat noted, I'm not comfortable with verify having a trinary value (its current state) let alone having 4 possible values (`True`, `False`, `'/path/to/bundle'`, and dictionary of **connection** options). Let's be clear, you are attaching these to the connection. We've held in the past that these are Transport Adapter concerns. Whilst being less invasive and a less severe API change, I'm still -0. I still feel that if you have all of the information for each URL you could structure it in such a way that a custom transport adapter could use it and attach it to the connection. The current API affords for what you want, just not in the simplest way possible. Especially given Kenneth's hesitancy to tie requests too closely to urllib3's API, I'm hesitant to be positive about this.\n\nIf @kennethreitz likes the change though my only request would be that you use `#get` on the verify dictionary like so:\n\n``` python\ncert_loc = verify.get('ca_bundle')\nconn.assert_hostname = verify.get('hostname')\nconn.assert_fingerprint = verify.get('fingerprint')\n```\n\nThis is far better than first checking for the value first and then assigning it. Each of those assigned values will be `None` if in fact the parameter is not provided or the dictionary is simply empty.\n", "-1, last time we did a mutating API like this (file upload api) it was a disaster. \n\nThis is why we have connection adapters. This configuration should be done there. \n", "Given @kennethreitz's -1 I'm going to close this.\n", "@sigmavirus24 I can close pull requests.\n", "@ttanner thank you so much for this contribution! Unfortunately, we won't be accepting it at this time. Basically, we've went down the route of passing tuples and dicts to a few different APIs, and it got a bit hairy if it wasn't the primary interface.\n\nIt may be worth exploring other options (perhaps passing `verify` a class?), if you think these SSL features are important, though. I'd love to learn more about them.\n", "@kennethreitz Here is a use case regarding `assert_hostname`. I want to run a HTTPS web server with a self signed certificate and run some tests with requests against it. As this web server can run on multiple nodes it has no static host/IP. \n\nIf I set the requests CA_BUNDLE to my certificate it fails on `ssl_match_hostname`, because there is no hostname defined in the cert (_requests.exceptions.SSLError: no appropriate commonName or subjectAltName fields were found_ ).\n\nRight now, the only way of making this configuration to work is a) completely disable certificate verification or b) patch urllib3/requests to pass `assert_hostname=False`. Both options are not really nice.\n\nCan you elaborate on your proposal to use a class? I guess I can help with a PR if you are more specific about what you expect.\n", "Or what about a function which has direct access to the connection object, so something like\n\n```\nif callable(verify):\n verifiy(conn)\n```\n\nOr a general API which allows you access to the connection object, e.g. a new keyword argument.\n", "> Or a general API which allows you access to the connection object, e.g. a new keyword argument.\n\nThe answer to that is simple: No. 
I'm going to investigate the rest of this soon to see how I can help.\n", "@schlamar in this case, if I were you, I would do the following:\n1. Use @t-8ch's [StackOverflow answer](http://stackoverflow.com/a/22794281/1953283) to create a new Adapter\n2. Register the adapter on your session like so:\n\n``` python\nmydomain = 'https://example.com'\ns = requests.session()\ns.mount(mydomain, HostNameIgnoringAdapter())\n```\n", "I took a look at code in SO question and it uses urllib3 which I thought was meant as an implementation detail of requests. If some parts of urllib3 are meant to be used by users they should be exposed explicitly by requests.\n", "@piotr-dobrogost as of this moment 98% (or more) of users can consider urllib3 an implementation detail. The rest of those users will need something from urllib3 and there are examples in the documentation (for the most common cases) which explicitly use urllib3. Regardless your comment is off-topic.\n", "> if I were you, I would do the following\n\nThanks, that works perfectly. I would propose adding this to the docs :)\n" ]
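Editor's note: the adapter approach recommended in the closing comments looks roughly like the following sketch. It mirrors the linked StackOverflow answer; the domain and certificate path are placeholders, and it assumes the urllib3 vendored under `requests.packages` in requests of that era:

``` python
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager

class HostNameIgnoringAdapter(HTTPAdapter):
    """Verify the certificate chain but skip the hostname/CN match."""
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       assert_hostname=False)

s = requests.Session()
s.mount('https://example.com', HostNameIgnoringAdapter())        # placeholder domain
resp = s.get('https://example.com', verify='/path/to/cert.pem')  # placeholder CA path
```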
https://api.github.com/repos/psf/requests/issues/1944
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1944/labels{/name}
https://api.github.com/repos/psf/requests/issues/1944/comments
https://api.github.com/repos/psf/requests/issues/1944/events
https://github.com/psf/requests/pull/1944
29,076,466
MDExOlB1bGxSZXF1ZXN0MTMzNjA5MTg=
1,944
Catch errors while handling redirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/238652?v=4", "events_url": "https://api.github.com/users/schlamar/events{/privacy}", "followers_url": "https://api.github.com/users/schlamar/followers", "following_url": "https://api.github.com/users/schlamar/following{/other_user}", "gists_url": "https://api.github.com/users/schlamar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/schlamar", "id": 238652, "login": "schlamar", "node_id": "MDQ6VXNlcjIzODY1Mg==", "organizations_url": "https://api.github.com/users/schlamar/orgs", "received_events_url": "https://api.github.com/users/schlamar/received_events", "repos_url": "https://api.github.com/users/schlamar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/schlamar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/schlamar/subscriptions", "type": "User", "url": "https://api.github.com/users/schlamar", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" } ]
null
37
2014-03-10T07:24:49Z
2021-09-08T10:01:22Z
2014-05-12T20:50:33Z
CONTRIBUTOR
resolved
This is my proposal which fixes #1939 completely. @sigmavirus24 missed one possible exception in #1940 (ChunkedEncodingError) and didn't handle RuntimeError correctly (because this means that the content is already consumed). This also addresses the issue that a decoding error is already thrown in Adapter.send, by moving the content-reading step to the end of Session.send.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1944/reactions" }
https://api.github.com/repos/psf/requests/issues/1944/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1944.diff", "html_url": "https://github.com/psf/requests/pull/1944", "merged_at": "2014-05-12T20:50:33Z", "patch_url": "https://github.com/psf/requests/pull/1944.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1944" }
true
[ ":+1: \n", "@maxcountryman Does this fix your issue (https://github.com/shazow/urllib3/issues/206)?\n", "@schlamar it looks like it should. I can't test it at the moment. A little later on I'll see if I have time to give it a proper run with Photobucket.\n", "This looks reasonable to me. I'll let @sigmavirus24 take a look too.\n", "> and didn't handle RuntimeError correctly (because this means that the content is already consumed)\n\nIn my tests it did not block and I prefer to do something as opposed to using `pass` in an except block. How was the handling incorrect?\n", "> How was the handling incorrect?\n\nIt is just not logical to read from the response in case of RuntimeError because this exception is thrown if the content is already consumed (so a read is not necessary).\n\nHowever, I'm willing to handle this as in your PR if you prefer it.\n", "> a read is not necessary\n\nWhile the read is not necessary, it is also not harmful.\n\nIt's my own stylistic preference that except blocks not be a simple \"pass\". (If there's no action to take then perhaps the exception should be handled somewhere else (or not thrown at all).) It's up to Kenneth and Cory if they like it as is. I have no strenuous objections to it, I just have a bit of a pet peeve about this particular style.\n", ":+1: \n", "In addition to #1939, I believe this also fixes #1952 (which I just filed - sorry about that).\n\n@schlamar Could you please add some test cases for these bugs as well? A test case for #1952 can be dug out of the (now-scrapped) pull request #1919.\n\n@sigmavirus24 Regarding \"`except RuntimeError: pass # already decoded`\", I sympathize with your \"should this exception have been thrown at all?\" reaction, but it makes sense for `Response.content` to throw an exception when all the content has already been consumed; it happens that in this case we don't care since we are only accessing `.content` for its side effects. (Would it make you more comfortable if a more specific exception were thrown?)\n", "> Could you please add some test cases for these bugs as well?\n\nVery interesting. I guess this should be possible (while a test case for the original issue is actually hard, we would need a misbehaving web service for that).\n\n> should this exception have been thrown at all\n\nI don't think this was his point. I guess you mixed two things together. My interpretation of @sigmavirus24 comments is:\n1. Instead of a RuntimeError there should be a more explicit exception if the content is already consumed\n2. Don't handle the RuntimeError separately in resolve_redirects, just do a `r.raw.read(...)` in all exception cases.\n", "Here's the test case for #1952 (from #1919). Add to the big `RequestsTestCase` class in `test_requests.py`.\n\n```\n def test_manual_redirect_with_partial_body_read(self):\n s = requests.Session()\n r1 = s.get(httpbin('redirect/2'), allow_redirects=False, stream=True)\n assert r1.is_redirect\n rg = s.resolve_redirects(r1, r1.request, stream=True)\n\n # read only the first eight bytes of the response body,\n # then follow the redirect\n r1.iter_content(8)\n r2 = next(rg)\n assert r2.is_redirect\n\n # read all of the response via iter_content,\n # then follow the redirect\n for _ in r2.iter_content(): pass\n r3 = next(rg)\n assert not r3.is_redirect\n```\n\nI agree #1939 is not feasible to test without adding stuff to httpbin. Maybe that should wait for #1166? 
A bunch of the other redirection bugs I just filed (and more to come) are going to be hard to test without adding stuff to httpbin, too.\n", "Thanks, I added this test.\n\nMaybe we can add a `gzipped_redirect` to httpbin to test this?\n", "> My interpretation of @sigmavirus24 comments _[snip]_\n\nYour interpretation is correct. I've discussed this with several (far more experienced and knowledgeable Python developers, including core PyPy and CPython developers). `RuntimeError` exceptions should never be raised by any library ever. Yes they're there but that does not mean you should use them.\n\nThis also works with your overall goal @zackw: If the error were instead a child of a `RequestException` we could name it well and have:\n\n``` python\ntry:\n requests.get(...)\nexcept requests.RequestException:\n #...\n```\n\nWork in every case. In the redirect case, before this PR, it wouldn't. I know that there are several ideas of \"good\" Python code that I disagree with others on (including Kenneth), so I don't push those issues normally.\n\nI haven't found a good way to preserve backwards compatibility though for those catching the RuntimeError and allowing for a RequestException. I don't think creating a new one that inherits from both is a good idea in the slightest. I'd love if either of you had the solution. In fact, I'd probably send you :cake: :)\n", "While this is really interesting (never thought about how you would handle backwards compatibility when changing an exception type), it is actually a bit off topic here =) I think we should focus on this PR and maybe move this discussion into a new issue?\n\nPoints standing out:\n- Should I remove handling `RuntimeError` separately (I'm +0 on this)?\n- How to test against the original issue (#1939)? Would a `gzipped_redirect` on httpbin a valid option? I don't know how Kenneth feels about such a \"feature\"...\n", "@schlamar it took me several months to get a feature merged and deployed on HTTPBin in order to test a bug fix. I'm not certain we should bother waiting that long. Do you have sufficient confidence that you could fake out a gzipped response from a redirect? You could take a similar approach to my test in #1963. Unfortunately, when I planned that test code, I was planning for the simplest case (mine). If there's a good way to adapt it to this PR, that would be awesome.\n", "@sigmavirus24 @zackw Added a test. What do you think?\n", "@sigmavirus24 Should I change \n\n```\ntry:\n resp.content # Consume socket so it can be released\nexcept (ChunkedEncodingError, ContentDecodingError):\n resp.raw.read(decode_content=False)\nexcept RuntimeError:\n pass # already consumed\n```\n\nto\n\n```\ntry:\n resp.content # Consume socket so it can be released\nexcept (ChunkedEncodingError, ContentDecodingError, RuntimeError):\n resp.raw.read(decode_content=False)\n```\n", "Now looking at this in comparison I'm +1 for the latter one =)\n", "Updated and rebased against master.\n", "> Now looking at this in comparison I'm +1 for the latter one =)\n\nMe too ;)\n", "Gave this one more look over. LGTM. :shipit: :+1: \n", "> I prefer to do something as opposed to using pass in an except block. \n\n@sigmavirus24 I guess you'll like the new [`contextlib.suppress`](http://docs.python.org/3.4/library/contextlib.html#contextlib.suppress) in Python 3.4. =)\n", "I really should stop using Python 2 and only use 3. ;)\n", "That's what I've done with `hyper`. It's very freeing. =)\n", "#1939 terrifies me. 
When Requests was handling decoding of transfer-encodings itself, it It was explicitly handling the case of when servers are misinforming about the transfer-encoding. This is one of the major reasons I want to be very wary of coupling too closely to urllib3.\n\nPehaps, akin to the mutable `r.encoding`, there should be a `r.transfer_encoding`. This would be, very, very nice. \n\nThis would unify the API we have today — and gives us the same behavior for character encodings (which was my intention, I'm not sure how this bug got introduced :P).\n", "I suspect that `r.transfer_encoding` would be very useful for `pip`. I seem to recall that they've had some trouble with servers sending `.tar.gz` files with `Content-Encoding: gzip` set, causing `r.content` to save the ungzipped tar file. @dstufft, does that ring a bell?\n", "Yes that's right. I had to reimplement parts of requests in order to disable the decoding.\n", "@kennethreitz If `.transfer_encoding` is something you're interested in I'll take a crack at it.\n", "> When Requests was handling decoding of transfer-encodings itself, it It was explicitly handling the case of when servers are misinforming about the transfer-encoding.\n\nBut this was broken because it dropped at least one byte from the original response: https://github.com/kennethreitz/requests/issues/1249 To be able to recover gracefully from a decoding error and having the original bytes available afterwards you need basically cache the complete response because decoding could theoretically fail on the last byte (which makes streaming responses basically useless).\n\n> there should be a r.transfer_encoding\n\nI don't see any other use case than disabling content decoding. And this can already be done with `r.raw.decode_content = False` on streaming responses.\n", "@Lukasa by all means! \n" ]
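Editor's note: the pip-style problem mentioned at the end of that thread (a server mislabelling a `.tar.gz` with `Content-Encoding: gzip`) can be worked around by reading the raw body without content decoding, as the comments note; a sketch with a placeholder URL and filename:

``` python
import requests

resp = requests.get('https://example.com/archive.tar.gz', stream=True)  # placeholder URL

# Read the body exactly as it came off the wire, without applying the
# Content-Encoding transformation, so a mislabelled gzip header does not
# cause the payload to be silently decompressed.
raw_bytes = resp.raw.read(decode_content=False)

with open('archive.tar.gz', 'wb') as fh:  # placeholder filename
    fh.write(raw_bytes)
```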
https://api.github.com/repos/psf/requests/issues/1943
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1943/labels{/name}
https://api.github.com/repos/psf/requests/issues/1943/comments
https://api.github.com/repos/psf/requests/issues/1943/events
https://github.com/psf/requests/pull/1943
29,074,124
MDExOlB1bGxSZXF1ZXN0MTMzNTk2ODg=
1,943
add timeout control for all requests of a Session
{ "avatar_url": "https://avatars.githubusercontent.com/u/3880627?v=4", "events_url": "https://api.github.com/users/junfenglx/events{/privacy}", "followers_url": "https://api.github.com/users/junfenglx/followers", "following_url": "https://api.github.com/users/junfenglx/following{/other_user}", "gists_url": "https://api.github.com/users/junfenglx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/junfenglx", "id": 3880627, "login": "junfenglx", "node_id": "MDQ6VXNlcjM4ODA2Mjc=", "organizations_url": "https://api.github.com/users/junfenglx/orgs", "received_events_url": "https://api.github.com/users/junfenglx/received_events", "repos_url": "https://api.github.com/users/junfenglx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/junfenglx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/junfenglx/subscriptions", "type": "User", "url": "https://api.github.com/users/junfenglx", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
null
[]
null
4
2014-03-10T05:55:50Z
2021-09-08T22:01:11Z
2014-03-10T11:21:52Z
NONE
resolved
When working with a pool of proxies, the connection is not closed when an exception occurs, and the request then blocks forever. I don't want to add a timeout argument to every request, so I opened this pull request.
{ "avatar_url": "https://avatars.githubusercontent.com/u/3880627?v=4", "events_url": "https://api.github.com/users/junfenglx/events{/privacy}", "followers_url": "https://api.github.com/users/junfenglx/followers", "following_url": "https://api.github.com/users/junfenglx/following{/other_user}", "gists_url": "https://api.github.com/users/junfenglx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/junfenglx", "id": 3880627, "login": "junfenglx", "node_id": "MDQ6VXNlcjM4ODA2Mjc=", "organizations_url": "https://api.github.com/users/junfenglx/orgs", "received_events_url": "https://api.github.com/users/junfenglx/received_events", "repos_url": "https://api.github.com/users/junfenglx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/junfenglx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/junfenglx/subscriptions", "type": "User", "url": "https://api.github.com/users/junfenglx", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1943/reactions" }
https://api.github.com/repos/psf/requests/issues/1943/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1943.diff", "html_url": "https://github.com/psf/requests/pull/1943", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1943.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1943" }
true
[ "Thanks for this!\n\nI want to warn you right now, we're technically in feature freeze, so it's possible this won't get accepted. That said, it's a pretty minor change.\n", "feature freeze in this version or forever?\n", "Feature freeze at the discretion of the BDFL. =)\n\nDon't worry, it's not a very hard line. The BDFL reserves the right to make exceptions and does all the time, so you shouldn't worry here.\n\nIt would be helpful, though, if you could squash these three commits down to one.\n", "Thank you for your enthusiastic explanation.\nI am a tiro about Git.\nThen, I closed this request.\nHope you add this minor change.\n" ]
https://api.github.com/repos/psf/requests/issues/1942
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1942/labels{/name}
https://api.github.com/repos/psf/requests/issues/1942/comments
https://api.github.com/repos/psf/requests/issues/1942/events
https://github.com/psf/requests/issues/1942
28,960,015
MDU6SXNzdWUyODk2MDAxNQ==
1,942
Requests, YCM, and FreeBSD error
{ "avatar_url": "https://avatars.githubusercontent.com/u/610865?v=4", "events_url": "https://api.github.com/users/ashemedai/events{/privacy}", "followers_url": "https://api.github.com/users/ashemedai/followers", "following_url": "https://api.github.com/users/ashemedai/following{/other_user}", "gists_url": "https://api.github.com/users/ashemedai/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ashemedai", "id": 610865, "login": "ashemedai", "node_id": "MDQ6VXNlcjYxMDg2NQ==", "organizations_url": "https://api.github.com/users/ashemedai/orgs", "received_events_url": "https://api.github.com/users/ashemedai/received_events", "repos_url": "https://api.github.com/users/ashemedai/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ashemedai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ashemedai/subscriptions", "type": "User", "url": "https://api.github.com/users/ashemedai", "user_view_type": "public" }
[]
closed
true
null
[]
null
17
2014-03-07T11:50:07Z
2021-09-09T00:09:57Z
2014-03-07T17:16:22Z
NONE
resolved
Using the [YouCompleteMe (YCM)](https://github.com/Valloric/YouCompleteMe) project with vim I ran into a [problem](https://github.com/Valloric/YouCompleteMe/issues/804) that seems to come from Requests and running on FreeBSD 9.2-STABLE. The traceback I am running into: ``` python Exception in thread Thread-10 (most likely raised during interpreter shutdown): Traceback (most recent call last): File "/usr/local/lib/python2.7/threading.py", line 810, in __bootstrap_inner File "/usr/local/lib/python2.7/threading.py", line 763, in run File "/usr/home/asmodai/.vim/bundle/YouCompleteMe/autoload/../python/ycm/unsafe_thread_pool_executor.py", line 55, in _worker File "/usr/home/asmodai/.vim/bundle/YouCompleteMe/autoload/../python/ycm/unsafe_thread_pool_executor.py", line 43, in run File "/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/sessions.py", line 357, in request File "/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/sessions.py", line 460, in send File "/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/adapters.py", line 350, in send <type 'exceptions.AttributeError'>: 'NoneType' object has no attribute 'error' ``` Requests is embedded within YCM as a Subproject, currently @ 3373548. What kind of additional information do I need to provide to be able to track down what goes wrong here?
{ "avatar_url": "https://avatars.githubusercontent.com/u/610865?v=4", "events_url": "https://api.github.com/users/ashemedai/events{/privacy}", "followers_url": "https://api.github.com/users/ashemedai/followers", "following_url": "https://api.github.com/users/ashemedai/following{/other_user}", "gists_url": "https://api.github.com/users/ashemedai/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ashemedai", "id": 610865, "login": "ashemedai", "node_id": "MDQ6VXNlcjYxMDg2NQ==", "organizations_url": "https://api.github.com/users/ashemedai/orgs", "received_events_url": "https://api.github.com/users/ashemedai/received_events", "repos_url": "https://api.github.com/users/ashemedai/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ashemedai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ashemedai/subscriptions", "type": "User", "url": "https://api.github.com/users/ashemedai", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1942/reactions" }
https://api.github.com/repos/psf/requests/issues/1942/timeline
null
completed
null
null
false
[ "Is that really the whole traceback? I'm finding it bears no relationship to the actual code in Requests.\n\nNevertheless, assuming it does, the only place we could raise an uncaught `AttributeError` of this type would have to be in the except block where we attempt to get `socket.error`. Surely `socket` provides `socket.error` on FreeBSD 9.2?\n", "Yes it is the whole traceback. Here is a slightly more up to date one with latest YCM so that line numbers match better:\n\n``` python\nException in thread Thread-2 (most likely raised during interpreter shutdown):\nTraceback (most recent call last):\n File \"/usr/local/lib/python2.7/threading.py\", line 810, in __bootstrap_inner\n File \"/usr/local/lib/python2.7/threading.py\", line 763, in run\n File \"/usr/home/asmodai/.vim/bundle/YouCompleteMe/autoload/../python/ycm/unsafe_thread_pool_executor.py\", line 55, in _worker\n File \"/usr/home/asmodai/.vim/bundle/YouCompleteMe/autoload/../python/ycm/unsafe_thread_pool_executor.py\", line 43, in run\n File \"/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/sessions.py\", line 383, in request\n File \"/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/sessions.py\", line 486, in send\n File \"/usr/home/asmodai/.vim/bundle/YouCompleteMe/third_party/requests/requests/adapters.py\", line 374, in send\n<type 'exceptions.AttributeError'>: 'NoneType' object has no attribute 'error' \n```\n\nOn my FreeBSD 9.2-STABLE box:\n\n``` python\n>>> from socket import error\n>>> socket.error\n<class 'socket.error'>\n```\n\nSo yes, it provides it. But isn't this about the fact that socket isn't defined at this point? The traceback says that object NoneType has no attribute error, which makes sense because a None has no attributes. But that would mean, if I look at code in adapters.py that somehow the import of socket did not occur or got clobbered somehow?\n\n``` python\n374: except socket.error as sockerr:\n```\n\nI do not know the Requests codebase well enough to be able to judge whether this can happen.\n\nI am a bit caught between YCM and Requests at the moment in trying to see where it goes wrong exactly. I also double checked to see whether Python called from within Python actually can import socket and that works without problems.\n", "Can you test using YCM's \"unsafe thread pool\"? That is referenced in the traceback and sounds highly suspicious. \n", "Mm, I agree with @sigmavirus24 here. Something weird should be going on with YCM to cause this. Here's my reasoning:\n1. `socket` must be imported because it's unconditionally imported at the top of `adapters.py`: any failure to import would cause an `ImportError`.\n2. `socket` must be imported because if the import hadn't run at all we wouldn't get an `AttributeError`, we'd get a `NameError` on `socket`.\n\nThese two facts suggest that `requests.adapters.socket` has been monkeypatched to `None`. We never do this in Requests, so I'd look at YCM.\n", "Yeah, makes sense to me.\n\nEspecially when I see this in the `unsafe_thread_pool_executor.py`:\n\n``` python\n# This file provides an UnsafeThreadPoolExecutor, which operates exactly like\n# the upstream Python version of ThreadPoolExecutor with one exception: it\n# doesn't wait for worker threads to finish before shutting down the Python\n# interpreter.\n#\n# This is dangerous for many workloads, but fine for some (like when threads\n# only send network requests). 
The YCM workload is one of those workloads where\n# it's safe (the aforementioned network requests case).\n```\n\nI haven't had time yet to work with the \"unsafe thread pool\", but I am suspecting it will indeed point back to YCM. Yay, issue ping-pong. ;)\n", "Further if they're monkey patching socket with something like gevent, then that could cause `socket` to be `None`.\n", "Looking into YCM, they use pythonfutures and [requests-futures](https://github.com/ross/requests-futures). I wonder if @ross has any insight into this. Granted I'm not sure they actually use requests-futures, but they're at least using requests with concurrent futures and he might be able to help us all out.\n", "base_request.py seems to import requests-futures and creates a FuturesSession. the code is a bit difficult to follow, but what it is doing with requests-futures _seems_ valid and i don't see any clear way it'd be related to the socket.error problem. \n\nthe fact that unsafe_thread_pool_executor.py has comments about it being safe to kill still working threads and the stacktrace mentions that it's probably happening in shutdown would in my opinion point things at YCM.\n", "Fair enough, I'll take it up with @Valloric again. :)\n", "Thanks @ross !\n", "Yes, thanks for the help so far guys, much appreciated. I'll close it for now.\n", "> Looking into YCM, they use pythonfutures and requests-futures. I wonder if @ross has any insight into this. Granted I'm not sure they actually use requests-futures, but they're at least using requests with concurrent futures and he might be able to help us all out.\n\nYCM uses requests-futures, correct.\n\n> base_request.py seems to import requests-futures and creates a FuturesSession. the code is a bit difficult to follow, but what it is doing with requests-futures seems valid and i don't see any clear way it'd be related to the socket.error problem.\n\nYCM code doesn't do any monkeypatching of anything inside Requests (or any other third-party lib it uses). requests-futures might be doing it though, I'm not sure.\n\n> the fact that unsafe_thread_pool_executor.py has comments about it being safe to kill still working threads and the stacktrace mentions that it's probably happening in shutdown would in my opinion point things at YCM.\n\nYeah, the `unsafe_thread_pool_executor` is unsafe in general (that's why I named it that way) but safe for YCM workloads. When the user wants to shut down Vim, the YCM plugin that uses Requests to talk to the `ycmd` server might have some requests in flight and we don't care about those anymore, they're (logically) fine to kill at any point of execution. But from a code perspective, it appears it might provoke some tracebacks... _sigh_.\n\nBut waiting on the network threads to stop is profoundly not an option; depending on the user's code (that they're trying to get completions for), some of those requests might take several seconds to finish and blocking Vim shutdown to wait for them pisses everyone off. People want Vim to exit quickly, not block and then exit some time in the near future.\n\nI'm open to ideas here.\n", "I think I'll just make YCM's `unsafe_thread_pool_executor` swallow this exception during shutdown. It's a shitty solution, but others are even more so.\n", "@Valloric Sorry I couldn't get back to you after your first message, but yeah, if you genuinely don't care about what happens to the messages once you've decided to shut down you should just swallow those exceptions. 
After all, at this point they aren't exceptional, they're expected.\n", "Are we certain that this is occurring when @ashemedai is quitting vim? I didn't see him indicate that but perhaps I missed it in the original thread.\n", "@sigmavirus24 It happens for me only when I write-quit or quit out of vim and only from time to time. I am fine with it silently swallowing it. The reason I brought it up in the first place with @Valloric was that I thought it was unintended behaviour. :)\n", "In that case, ignore me. :)\n" ]
https://api.github.com/repos/psf/requests/issues/1941
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1941/labels{/name}
https://api.github.com/repos/psf/requests/issues/1941/comments
https://api.github.com/repos/psf/requests/issues/1941/events
https://github.com/psf/requests/issues/1941
28,886,450
MDU6SXNzdWUyODg4NjQ1MA==
1,941
nested dict to simple
{ "avatar_url": "https://avatars.githubusercontent.com/u/1665055?v=4", "events_url": "https://api.github.com/users/VVhiteCoder/events{/privacy}", "followers_url": "https://api.github.com/users/VVhiteCoder/followers", "following_url": "https://api.github.com/users/VVhiteCoder/following{/other_user}", "gists_url": "https://api.github.com/users/VVhiteCoder/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/VVhiteCoder", "id": 1665055, "login": "VVhiteCoder", "node_id": "MDQ6VXNlcjE2NjUwNTU=", "organizations_url": "https://api.github.com/users/VVhiteCoder/orgs", "received_events_url": "https://api.github.com/users/VVhiteCoder/received_events", "repos_url": "https://api.github.com/users/VVhiteCoder/repos", "site_admin": false, "starred_url": "https://api.github.com/users/VVhiteCoder/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/VVhiteCoder/subscriptions", "type": "User", "url": "https://api.github.com/users/VVhiteCoder", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2014-03-06T15:18:48Z
2021-09-09T00:10:01Z
2014-03-06T16:49:05Z
NONE
resolved
is there a tool that can convert this: {'firstname': u'MyName', 'lastname': u'MyLast', 'phone_numbers': [{'number': u'3413241234', 'public': 1}]} to this: {'firstname': u'MyName', 'lastname': u'MyLast', 'phone_numbers[0][number]': u'3413241234', 'phone_numbers[0][public]': 1} for form-encoded data?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1941/reactions" }
https://api.github.com/repos/psf/requests/issues/1941/timeline
null
completed
null
null
false
[ "Not that I am aware of.\n\n@sigmavirus24, is this worth putting in the toolbelt? `flatten_and_form_encode`?\n", "yes will be very good tool :\nit's must convert nested dicts with different depth\n", "@Lukasa sounds like a worthwhile addition.\n", "Let's close this and continue the discussion over on the [toolbelt](/sigmavirus24/requests-toolbelt)\n" ]
https://api.github.com/repos/psf/requests/issues/1940
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1940/labels{/name}
https://api.github.com/repos/psf/requests/issues/1940/comments
https://api.github.com/repos/psf/requests/issues/1940/events
https://github.com/psf/requests/pull/1940
28,761,716
MDExOlB1bGxSZXF1ZXN0MTMxOTQ1Mzg=
1,940
Handle both exceptions possible while consuming a socket
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
5
2014-03-05T03:08:13Z
2021-09-08T23:08:30Z
2014-03-11T13:19:35Z
CONTRIBUTOR
resolved
This fixes #1939 and part of @zackw's rehaul of redirection. To better test this, I'd like to add Betamax as a test dependency. Also, as a separate PR I plan to kill the usage of `RuntimeError` in `Response#content`. That's awful and the community consensus is that nothing should ever raise that.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1940/reactions" }
https://api.github.com/repos/psf/requests/issues/1940/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1940.diff", "html_url": "https://github.com/psf/requests/pull/1940", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1940.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1940" }
true
[ "And yes, I know I'm using a pattern that I absolutely abhor. That aside, there's no other way to catch the decode error.\n\nI wonder, however, if there's a better way to handle this. I'll sleep on it. :)\n", "The actual point of raising the exception is here: https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L391\n\n(you need to cover both of them)\n", "> The actual point of raising the exception is here: https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L391\n\n@schlamar we cannot catch that in `Session#resolve_redirects`. To catch it in the adapter would be a serious change in behaviour. I wonder why we consume all of the content in the adapter though. If there's a decoding error, we don't even attempt to return a Response to the user. That may be fodder for an entirely different issue though. If @Lukasa is satisfied with this as an immediate solution to what was reported, I am too. But I want to investigate that other piece in the adapter as well.\n", "> satisfied with this as an immediate solution to what was reported\n\nThis is no solution to #1939 because it is already failing in `Adapter.send` (before resolving redirects). See the traceback in this issue. The analysis of @mechanical-snail is not quite correct here. \n\nIt would work if we move the `if not stream: r.content` at the end of `Session.send` (which is IMO the correct place, the stream argument should only come in play after the redirects are resolved). But that means that the stream parameter to Adapter.send is obsolete, so this would be a API breaking change.\n", "Reviewing the issue, you're once again correct @schlamar. That aside @mechanical-snail is not wrong that this could in fact also be a problem. Their issue won't be fixed by this but it may fix the case where streaming is used.\n" ]
https://api.github.com/repos/psf/requests/issues/1939
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1939/labels{/name}
https://api.github.com/repos/psf/requests/issues/1939/comments
https://api.github.com/repos/psf/requests/issues/1939/events
https://github.com/psf/requests/issues/1939
28,759,127
MDU6SXNzdWUyODc1OTEyNw==
1,939
Why decode the response body of a redirect?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1093862?v=4", "events_url": "https://api.github.com/users/mechanical-snail/events{/privacy}", "followers_url": "https://api.github.com/users/mechanical-snail/followers", "following_url": "https://api.github.com/users/mechanical-snail/following{/other_user}", "gists_url": "https://api.github.com/users/mechanical-snail/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mechanical-snail", "id": 1093862, "login": "mechanical-snail", "node_id": "MDQ6VXNlcjEwOTM4NjI=", "organizations_url": "https://api.github.com/users/mechanical-snail/orgs", "received_events_url": "https://api.github.com/users/mechanical-snail/received_events", "repos_url": "https://api.github.com/users/mechanical-snail/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mechanical-snail/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mechanical-snail/subscriptions", "type": "User", "url": "https://api.github.com/users/mechanical-snail", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2014-03-05T01:58:01Z
2021-09-09T00:00:59Z
2014-05-12T20:50:33Z
NONE
resolved
Requests fails on the URL `http://www.whatbird.com/forum/index.php?/gallery/image/291517-foo/`, which is a 301 redirect to ``` http://www.whatbird.com/forum/index.php?/gallery/image/291517-title-paused-jewel-allens-hummingbird-a-backyard-bird-painting-in-oil-by-camille-engel/ ``` . The issue seems to be that the server's initial 301 response has a header falsely claiming that the response body (a simple HTML page) is gzipped, when it's actually uncompressed. When resolving redirects, Requests does (in `requests.sessions.resolve_redirects`): ``` resp.content # Consume socket so it can be released ``` which attempts to decode One could legitimately say this is the server's problem. However, conceptually, why decode the response body of a redirect, which won't get returned? Other programs (Chromium, Firefox, `curl`) don't do this. For example, `curl` gives an error, as expected, when not following redirects: ``` $ curl --compressed 'http://www.whatbird.com/forum/index.php?/gallery/image/291517-foo/' curl: (61) Error while processing content unencoding: invalid code lengths set ``` whereas it works if you add the `--location` flag (follow redirects). # Example of error ``` Python 3.3.2+ (default, Oct 9 2013, 14:56:03) [GCC 4.8.1] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import requests ; requests.get('http://www.whatbird.com/forum/index.php?/gallery/image/291517-foo/') Traceback (most recent call last): File "./requests/packages/urllib3/response.py", line 199, in read data = self._decoder.decompress(data) zlib.error: Error -3 while decompressing data: incorrect header check During handling of the above exception, another exception occurred: Traceback (most recent call last): File "./requests/models.py", line 629, in generate for chunk in self.raw.stream(chunk_size, decode_content=True): File "./requests/packages/urllib3/response.py", line 236, in stream data = self.read(amt=amt, decode_content=decode_content) File "./requests/packages/urllib3/response.py", line 204, in read e) requests.packages.urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check',)) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "./requests/api.py", line 55, in get return request('get', url, **kwargs) File "./requests/api.py", line 44, in request return session.request(method=method, url=url, **kwargs) File "./requests/sessions.py", line 393, in request resp = self.send(prep, **send_kwargs) File "./requests/sessions.py", line 496, in send r = adapter.send(request, **kwargs) File "./requests/adapters.py", line 391, in send r.content File "./requests/models.py", line 691, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() File "./requests/models.py", line 634, in generate raise ContentDecodingError(e) requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check',)) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1939/reactions" }
https://api.github.com/repos/psf/requests/issues/1939/timeline
null
completed
null
null
false
[ "You answered your own question when you pasted that line. The comment explains exactly why we consume the content. If we do not, then a user handling a great deal of redirects will run out of available connections that can be made. Sockets will sit in the ready state waiting to be read from. I think it also has the potential to cause a memory usage issue when the ref count does not reach 0 and the socket is not garbage collected. We should be, however, catching that error in `resolve_redirects`.\n\nThank you for raising this issue! I'll throw together a PR to patch this.\n", "I agree that consuming the raw response data from the socket is needed. I'm asking why we should _decode_ the data.\n\n(And thanks for the quick response.)\n", "We decode the data because your assertion that it won't be read is false. You may read the response body from any redirect because we save it. Each redirect builds a _full_ response object that can be used exactly like any other. This is a very good thing, and won't be changed. =)\n\nThe fix, as @sigmavirus24 has suggested, is simply to catch this error.\n", "Interesting. It's very likely that this is a duplicate of https://github.com/shazow/urllib3/issues/206 / https://github.com/kennethreitz/requests/issues/1472 because there is also an 301 redirect in the curl output. @shazow \n", "@schlamar That's very interesting.\n\nHowever, this isn't a dupe, it's just related. The key is that we shouldn't really care even if we hit a legitimate decoding error when following redirects: we just want to do our best and then move on.\n", "@Lukasa hit the nail on the head :)\n", "> However, this isn't a dupe, it's just related.\n\nWhy do you think so? \n\nOn requests 1.2.3, I'm getting the same traceback than in #1472 with this URL:\n\n```\n>>> import requests ; requests.get('http://www.whatbird.com/forum/index.php?/gallery/image/291517-foo/')\nTraceback (most recent call last):\n ...\n File \"c:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\response.py\", line 188, in read\n \"failed to decode it.\" % content_encoding)\nrequests.packages.urllib3.exceptions.DecodeError: Received response with content-encoding: gzip, but failed to decode it\n```\n", "@schlamar So the issues you linked cause the exception, but they aren't the problem being referred to. The key problem in _this_ issue is that if we hit an error decoding the response body of a redirect, we'll stop following redirects. That _shouldn't_ happen: we understood enough of the message to follow the redirect, so there's no reason to stop following them. =)\n\nFixing the bugs you linked fixes the specific case in question, but not the general one.\n", "> Fixing the bugs you linked fixes the specific case in question, but not the general one.\n\nYes, but fixing _this_ bug should resolve the linked issues (which is what I wanted to say :)\n", "Ahhhhh, I see. =)\n" ]
https://api.github.com/repos/psf/requests/issues/1938
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1938/labels{/name}
https://api.github.com/repos/psf/requests/issues/1938/comments
https://api.github.com/repos/psf/requests/issues/1938/events
https://github.com/psf/requests/issues/1938
28,744,210
MDU6SXNzdWUyODc0NDIxMA==
1,938
Support content-length checking
{ "avatar_url": "https://avatars.githubusercontent.com/u/308610?v=4", "events_url": "https://api.github.com/users/jaraco/events{/privacy}", "followers_url": "https://api.github.com/users/jaraco/followers", "following_url": "https://api.github.com/users/jaraco/following{/other_user}", "gists_url": "https://api.github.com/users/jaraco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaraco", "id": 308610, "login": "jaraco", "node_id": "MDQ6VXNlcjMwODYxMA==", "organizations_url": "https://api.github.com/users/jaraco/orgs", "received_events_url": "https://api.github.com/users/jaraco/received_events", "repos_url": "https://api.github.com/users/jaraco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaraco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaraco/subscriptions", "type": "User", "url": "https://api.github.com/users/jaraco", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2014-03-04T21:48:58Z
2021-09-09T00:10:02Z
2014-03-05T14:54:13Z
CONTRIBUTOR
resolved
I have an application that wants to do some content length validation based on the Content-Length reported by the server. Currently, if the response has more or fewer bytes than indicated by the Content-Length, there's no error and due to encoding, it's not possible to detect the length of the payload actually read. Consider: ``` #!/usr/bin/env python3 import requests resp = requests.get('http://en.wikipedia.org/wiki/Snowman', stream=True) lines = list(resp.iter_lines(decode_unicode=True)) # this will fail if the content contains multi-byte characters assert sum(map(len, lines)) == int(resp.headers['Content-Length']) ``` Of course, it would be possible if decoding of content is disabled, but then the reader is responsible for decoding. It would be preferable if there were a hook or attribute to enable the user to do the byte checking without forgoing decoding support. Two suggestions: - Have requests supply a 'bytes_read' on the Response object, which will be updated on any read operation. - Provide a hook in the request allowing the user to supply a custom Response subclass for customizing response handling behavior.
{ "avatar_url": "https://avatars.githubusercontent.com/u/308610?v=4", "events_url": "https://api.github.com/users/jaraco/events{/privacy}", "followers_url": "https://api.github.com/users/jaraco/followers", "following_url": "https://api.github.com/users/jaraco/following{/other_user}", "gists_url": "https://api.github.com/users/jaraco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaraco", "id": 308610, "login": "jaraco", "node_id": "MDQ6VXNlcjMwODYxMA==", "organizations_url": "https://api.github.com/users/jaraco/orgs", "received_events_url": "https://api.github.com/users/jaraco/received_events", "repos_url": "https://api.github.com/users/jaraco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaraco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaraco/subscriptions", "type": "User", "url": "https://api.github.com/users/jaraco", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1938/reactions" }
https://api.github.com/repos/psf/requests/issues/1938/timeline
null
completed
null
null
false
[ "Hi there! Thanks for raising this issue!\n\nI assume your logic application logic is not the same as your example, because if it were I'd simply say you should avoid using `stream=True`. That way we store the data off and you can access it at your leisure using `Response.content` and `Response.text`.\n\nThe bigger problem you're going to have is that we transparently decompress `gzip`ped and `deflate`d data. This means that if you're served such a response your content length checking will fail. I think the key question we have to ask is: what are you trying to achieve with your Content-Length checking? If we can answer that, we'll be able to see whether you're trying to do something that Requests can make easier for you.\n", "My assumptions was that by passing the Content-Length, the server is providing some indication about what to expect. If I can make that assumption, then I'd like to validate that expectation at whatever point makes sense. I'd like to be able to do this validation regardless of whether streaming content is used or eager loading and regardless of the content was encoded, gripped, or deflated. I presume the content-length has a meaningful value somewhere in the stack. It's at that point I want to capture the actual length.\n\nI'm porting from a urllib2 implementation that assumed bytes on the response and handled all of the decoding aspects. I'm looking forward to leveraging the requests functionality for handling that more richly. I'll grab a real world example from our code of what we are trying to accomplish.\n", "The short answer is that we are not going to extend the existing API unless it is either:\n1. Necessary\n2. Likely to be used by more than 90% of the users\n\nIt is important to note that the decoding of content takes place in urllib3 if I remember correctly. You also have access to the urllib3 response object in `resp._raw`, but that's not meant to be used by the typical user of requests. The _right_ place to add this, regardless, is in urllib3. That would keep track of the exact number of bytes read (before decoding) and could store the data for you. That said, this discussion will certainly be easier after you provide an example for us to look at.\n", "There are two forms of decoding here: decoding from compressed data to bytes, and then decoding from bytes to unicode. The first is done in urllib3, the second in Requests. Because the first is done in urllib3, Requests never sees the gzipped bytes. This means the only way to achieve what you want _in requests_ is to reimplement the convenience methods that urllib3 already has for decompressing data. That just seems silly, so I recommend following @sigmavirus24's advice and getting this into urllib3.\n", "In our older code which used httplib2, we had this class:\n\n```\nclass ContentLengthValidator(object):\n \"\"\"\n Verify the length of content in a line-buffered stream. Used to wrap\n any iterable of items of length. 
The validator will raise a\n ValueError if the iterator is consumed and the declared length\n doesn't match or if the consumed length exceeds the declared length.\n\n >>> sample = ['abc', 'bcd', 'def']\n >>> v = ContentLengthValidator(iter(sample), 9)\n >>> list(v)\n ['abc', 'bcd', 'def']\n\n >>> v = ContentLengthValidator(iter(sample), 8)\n >>> list(v)\n Traceback (most recent call last):\n ...\n ValueError: Expected 8 bytes, received 9\n\n >>> import itertools\n >>> v = ContentLengthValidator(itertools.cycle(sample), 1000)\n >>> list(v)\n Traceback (most recent call last):\n ...\n ValueError: Expected 1000 bytes, received 1002\n \"\"\"\n def __init__(self, stream, declared_length):\n self.stream = stream\n self.declared_length = declared_length\n self.length_read = 0\n\n def __iter__(self):\n return self\n\n def next(self):\n try:\n data = self.stream.next()\n except StopIteration:\n if not self.length_read == self.declared_length:\n self.__fail()\n raise\n self.length_read += len(data)\n if self.length_read > self.declared_length:\n self.__fail()\n return data\n\n def __fail(self):\n msg = \"Expected %(declared_length)d bytes, received %(length_read)d\"\n raise ValueError(msg % vars(self))\n\n```\n\nIt would be used to wrap an urllib2 response like so:\n\n```\nresp = urllib2.urlopen(url)\nif 'Content-Length' in resp.headers:\n resp = ContentLengthValidator(resp, resp.headers['Content-Length'])\n# load resp as CSV\nreader = csv.DictReader(resp, ...)\n```\n\nThe wrapper would serve to count the bytes as they passed through the iteration and then raise a ValueError at the end if they did not match the expectation.\n\nIn an attempt to update the code for Python 3 support, we desired to use requests to provide higher-level handling (including support for caching through cachecontrol), so I modified the ContentLengthValidator to be Requests specific:\n\n```\nclass ContentLengthValidator(object):\n \"\"\"\n Verify the length of content in a Requests response. The validator will\n raise a ValueError if the bytes read doesn't match the declared length.\n \"\"\"\n def __init__(self, resp):\n self.resp = resp\n self.iter_content_orig = self.resp.iter_content\n self.resp.iter_content = self.count_bytes\n\n def count_bytes(self, *args, **kwargs):\n self.bytes_read = 0\n gen = self.iter_content_orig(*args, **kwargs)\n for res in gen:\n self.bytes_read += len(res)\n yield res\n\n def iter_lines(self):\n for line in self.resp.iter_lines():\n yield line\n self._check_complete()\n\n def _check_complete(self):\n if 'content-length' not in self.resp.headers:\n return\n target_length = int(self.resp.headers['content-length'])\n if target_length == self.bytes_read:\n return\n msg = \"Expected {target_length} bytes, received {bytes_read}\"\n raise ValueError(msg.format(bytes_read=self.bytes_read, **vars(self)))\n\n```\n\nThe wrapper replaces 'iter_content' on the Response object with another iterator which will count the bytes as they're read. This technique is working for us now, but as you can see by trying to read it, the technique is clumsy. Furthermore, as you've pointed out, this technique will probably fail if gzip is used. 
That's why it's important for the requests library to support this functionality - because the subtle nuances of validating the Content-Length are hard, and if it can be done right once in requests, then no one else has to try and get it wrong.\n\nI'd much rather be able to do something like:\n\n```\nresp = requests.get(url, stream=True)\ndata = list(resp)\nassert resp.bytes_received == resp.headers.get('content-length', resp.bytes_received)\n```\n", "Looking into urllib3, it looks like the urllib3 Response object has a `_fp_bytes_read` attribute which may be just what I need.\n", "Given the limited usefulness of validating the length and the fact that the raw length is readily available on the raw object, I'm able to achieve my goals with something like this:\n\n```\nresp = requests.get(url, stream=True)\ndata = list(resp)\nbytes_received = resp.raw._fp_bytes_read\nassert bytes_received == resp.headers.get('content-length', bytes_received)\n```\n\nAnd that test should be more accurate than the one we previously had. Thanks for entertaining the issue.\n", "I'm glad you were able to find something that suits your use-case. Let us know if you find it's particularly troublesome: I'm sure the right place for the fix is in urllib3, but we can help you get it there if you find you need it. =)\n\nThanks again!\n" ]
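The workaround the reporter settled on leans on urllib3's private `_fp_bytes_read` counter, which tracks raw (pre-decompression) bytes read from the socket. A small sketch of the same check, with the caveat that this attribute is not a public API and may change:

```python
import requests


def get_with_length_check(url):
    """Fetch a URL and compare bytes received against the declared Content-Length."""
    resp = requests.get(url, stream=True)
    resp.content  # consume the body so the raw byte counter is final
    declared = resp.headers.get('content-length')
    if declared is not None:
        received = resp.raw._fp_bytes_read  # private urllib3 attribute
        if received != int(declared):
            raise ValueError('Expected %s bytes, received %d'
                             % (declared, received))
    return resp
```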
https://api.github.com/repos/psf/requests/issues/1937
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1937/labels{/name}
https://api.github.com/repos/psf/requests/issues/1937/comments
https://api.github.com/repos/psf/requests/issues/1937/events
https://github.com/psf/requests/pull/1937
28,743,376
MDExOlB1bGxSZXF1ZXN0MTMxODM0MTU=
1,937
Improved decoding support for Response.iter_content and iter_lines
{ "avatar_url": "https://avatars.githubusercontent.com/u/308610?v=4", "events_url": "https://api.github.com/users/jaraco/events{/privacy}", "followers_url": "https://api.github.com/users/jaraco/followers", "following_url": "https://api.github.com/users/jaraco/following{/other_user}", "gists_url": "https://api.github.com/users/jaraco/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jaraco", "id": 308610, "login": "jaraco", "node_id": "MDQ6VXNlcjMwODYxMA==", "organizations_url": "https://api.github.com/users/jaraco/orgs", "received_events_url": "https://api.github.com/users/jaraco/received_events", "repos_url": "https://api.github.com/users/jaraco/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jaraco/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jaraco/subscriptions", "type": "User", "url": "https://api.github.com/users/jaraco", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2014-03-04T21:38:30Z
2021-09-08T23:06:26Z
2014-05-12T19:04:35Z
CONTRIBUTOR
resolved
This PR adds basic documentation for the decode_unicode parameter to iter_content (and implicitly for iter_lines). It also ensures the now-documented behavior is correct and consistent, specifically by honoring the parameter in all cases and not just in some (previously only apparent by reading the source).
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1937/reactions" }
https://api.github.com/repos/psf/requests/issues/1937/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1937.diff", "html_url": "https://github.com/psf/requests/pull/1937", "merged_at": "2014-05-12T19:04:35Z", "patch_url": "https://github.com/psf/requests/pull/1937.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1937" }
true
[ "Thanks for this!\n\nIn general this fix looks great, thanks! Would you mind providing a test? One that confirms that you get back `unicode` in the cases in question would be good enough.\n", ":sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/1936
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1936/labels{/name}
https://api.github.com/repos/psf/requests/issues/1936/comments
https://api.github.com/repos/psf/requests/issues/1936/events
https://github.com/psf/requests/issues/1936
28,535,365
MDU6SXNzdWUyODUzNTM2NQ==
1,936
feature request: module-level configuration of ssl certs
{ "avatar_url": "https://avatars.githubusercontent.com/u/5677962?v=4", "events_url": "https://api.github.com/users/thkang2/events{/privacy}", "followers_url": "https://api.github.com/users/thkang2/followers", "following_url": "https://api.github.com/users/thkang2/following{/other_user}", "gists_url": "https://api.github.com/users/thkang2/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/thkang2", "id": 5677962, "login": "thkang2", "node_id": "MDQ6VXNlcjU2Nzc5NjI=", "organizations_url": "https://api.github.com/users/thkang2/orgs", "received_events_url": "https://api.github.com/users/thkang2/received_events", "repos_url": "https://api.github.com/users/thkang2/repos", "site_admin": false, "starred_url": "https://api.github.com/users/thkang2/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thkang2/subscriptions", "type": "User", "url": "https://api.github.com/users/thkang2", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-02-28T21:41:10Z
2021-09-09T00:10:03Z
2014-02-28T22:18:24Z
NONE
resolved
there are some python 'freezers' like `cx_Freeze` are used to create executable that doesn't depend on system-wide python (and relevant packages) installs. However, when you freeze a script, many of them love to bundle every package on which script depends into a single zip file. this doesn't play nice with `requests` - (on windows) when zipimported, it fails to load `cacert.pem` that is necessary for https connections. unless you disable ssl cert verification, you can't open any https connection. since requests is a versatile and elegant http/s library for python, many other packages depend on it. which means that if you try to zipimport and use `somepackage` which depends on `requests`, and both modules are in a zip, then your `somepackage` will fail to use ssl. I know there are many workarounds for this problem - one would be monkey patching `requests.get` (or other methods) like this: ``` import requests import sys import functools _get = sys.modules['requests'].get cert_path = <path-to-cacert.pem> sys.modules['requests'].get = functools.partial(_get, verify=cert_path) ``` which works, but undeniably an ugly 'hack'. would you consider a module-level ssl certificates handling, that we can cope with failures to import system CA pem files? like: ``` import requests requests.add_certs('certs.pem') ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1936/reactions" }
https://api.github.com/repos/psf/requests/issues/1936/timeline
null
completed
null
null
false
[ "This is very similar to #1896 which we have already rejected. There are a couple serious issues with this functionality. The most worrisome of these issues is that developers who wish to do something malicious can easily package their own malicious certificate PEM file and distribute it. A naïve user would never quite know the difference if they did not check the source. This is not to say that you're doing the wrong thing, but others would and this would almost certainly be considered a serious vulnerability. Without a way to verify that the provided certificate file is valid and not malicious we absolutely can not accept this request.\n\nThanks for reaching out! Please continue to contribute new ideas, we certainly welcome them! :cake: \n" ]
https://api.github.com/repos/psf/requests/issues/1935
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1935/labels{/name}
https://api.github.com/repos/psf/requests/issues/1935/comments
https://api.github.com/repos/psf/requests/issues/1935/events
https://github.com/psf/requests/pull/1935
28,512,228
MDExOlB1bGxSZXF1ZXN0MTMwNTcyNTE=
1,935
Add timeout to stream with testing
{ "avatar_url": "https://avatars.githubusercontent.com/u/4336127?v=4", "events_url": "https://api.github.com/users/ceaess/events{/privacy}", "followers_url": "https://api.github.com/users/ceaess/followers", "following_url": "https://api.github.com/users/ceaess/following{/other_user}", "gists_url": "https://api.github.com/users/ceaess/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ceaess", "id": 4336127, "login": "ceaess", "node_id": "MDQ6VXNlcjQzMzYxMjc=", "organizations_url": "https://api.github.com/users/ceaess/orgs", "received_events_url": "https://api.github.com/users/ceaess/received_events", "repos_url": "https://api.github.com/users/ceaess/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ceaess/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ceaess/subscriptions", "type": "User", "url": "https://api.github.com/users/ceaess", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" }, { "color": "207de5", "default": false, "description": null, "id": 60620163, "name": "Minion Seal of Approval", "node_id": "MDU6TGFiZWw2MDYyMDE2Mw==", "url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval" } ]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
17
2014-02-28T16:11:18Z
2021-09-08T11:00:46Z
2014-03-03T18:14:37Z
NONE
resolved
Fixes Issue #1803
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1935/reactions" }
https://api.github.com/repos/psf/requests/issues/1935/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1935.diff", "html_url": "https://github.com/psf/requests/pull/1935", "merged_at": "2014-03-03T18:14:37Z", "patch_url": "https://github.com/psf/requests/pull/1935.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1935" }
true
[ "Hey @Lukasa can we have some :eyes: on this? I paired with @cjstapleton on this PR this morning. Let us know what you think.\n", "Thanks for this, @cjstapleton! It looks great. =) I've made a couple of tiny stylistic notes inline, but the substance of this change is perfect.\n\nIt's worth noting that this implements _part_ of #1801. I don't think that should be a reason not to merge this, but it's worth being aware of.\n\nOtherwise, this is good to go. When you make those changes I'll flag this as ready to merge. Thanks! :cake:\n", "I think I've fixed the styling issues, now @Lukasa - thanks! :) Let me know if I've missed anything. \n", "That's perfect, it looks great. I'll flag this as ready for Kenneth to merge.\n\nThanks again! :cookie:\n", ":cake: Thanks for this @cjstapleton \n", "We'll have to make sure we communicate this to our users. This is a big change.\n", "Perhaps, this paragraph in the API docs should also be changed\n\n> Timeouts behave slightly differently. On streaming requests, the timeout only applies to the connection attempt. On regular requests, the timeout is applied to the connection process and downloading the full body.\n\nPlease correct me if I am wrong, but the last sentence (\"downloading the full body\" part) is not true. It contradicts to the note in Quickstart:\n\n> timeout is not a time limit on the entire response download; rather, an exception is raised if the server has not issued a response for timeout seconds (more precisely, if no bytes have been received on the underlying socket for timeout seconds).\n", "@vlevit: Good catch, that section of the docs has been changed.\n\n@kennethreitz: For the moment I've made sure it'll be in the changelog. If you want something more drastic I'll whip up a section of the documentation that explains timeouts in more detail. =)\n", "Is there a release with this patch?\n", "@mortoray no. And there shouldn't be a release until we fix the Proxy Authorization exposure\n", "@mortoray This patch was only made 9 days ago, and we don't have a really rapid release cadence. =)\n", "@Lukasa Regarding docs again, I think the code example after the timeout changes explanation is no more relevant:-)\n", "It's substantially more accurate, but is also defined in slightly abstract terms. I use the phrase 'read from the socket' because a server can drip-feed data at the rate of one byte per timeout interval and avoid tripping the read timeout.\n", "Hmm, I am speaking about this code excerpt which is still present on the master:\n\n```\ntarball_url = 'https://github.com/kennethreitz/requests/tarball/master'\n\n# One second timeout for the connection attempt\n# Unlimited time to download the tarball\nr = requests.get(tarball_url, stream=True, timeout=1)\n\n# One second timeout for the connection attempt\n# Another full second timeout to download the tarball\nr = requests.get(tarball_url, timeout=1)\n```\n", "Ah, yes, I'll fix that up to note that it's no longer true. (I don't want to change it unconditionally because people may be upgrading to non-master versions of Requests that are still subsequent to v2.0.0.)\n", "Ugh, actually, I don't want to update that until we ship a release of this patch, so let's leave it on the pile of work 'to do'.\n", "This has shipped now, I have some doc fixes in #2187 \n" ]
https://api.github.com/repos/psf/requests/issues/1934
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1934/labels{/name}
https://api.github.com/repos/psf/requests/issues/1934/comments
https://api.github.com/repos/psf/requests/issues/1934/events
https://github.com/psf/requests/pull/1934
28,504,383
MDExOlB1bGxSZXF1ZXN0MTMwNTI3NzY=
1,934
Charade -> Chardet and Add cacert.pem license
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
5
2014-02-28T14:29:58Z
2021-09-08T23:01:03Z
2014-03-02T09:34:55Z
CONTRIBUTOR
resolved
- Charade is gone, long live Chardet. - cacert.pem is now taken wholesale from Mozilla so we need to display that it is licensed under the MPL 2.0. Closes #1933
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1934/reactions" }
https://api.github.com/repos/psf/requests/issues/1934/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1934.diff", "html_url": "https://github.com/psf/requests/pull/1934", "merged_at": "2014-03-02T09:34:55Z", "patch_url": "https://github.com/psf/requests/pull/1934.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1934" }
true
[ "@Lukasa this is essentially a documentation change. I included the other information about cacert.pem but I'm not sure it is entirely necessary in the NOTICE file. I'll happily remove it if you share the same doubts.\n", "That extra bit of cacert.pem you included is actually part of the first certificate. =) We don't need it.\n", "> That extra bit of cacert.pem you included is actually part of the first certificate. =) We don't need it\n\nThought so ;). Fixed in 64f0b3c\n", "Awesome, let's do it.\n", ":cake: \n" ]
https://api.github.com/repos/psf/requests/issues/1933
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1933/labels{/name}
https://api.github.com/repos/psf/requests/issues/1933/comments
https://api.github.com/repos/psf/requests/issues/1933/events
https://github.com/psf/requests/issues/1933
28,492,727
MDU6SXNzdWUyODQ5MjcyNw==
1,933
NOTICE file looks incorrect
{ "avatar_url": "https://avatars.githubusercontent.com/u/238622?v=4", "events_url": "https://api.github.com/users/fdev31/events{/privacy}", "followers_url": "https://api.github.com/users/fdev31/followers", "following_url": "https://api.github.com/users/fdev31/following{/other_user}", "gists_url": "https://api.github.com/users/fdev31/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/fdev31", "id": 238622, "login": "fdev31", "node_id": "MDQ6VXNlcjIzODYyMg==", "organizations_url": "https://api.github.com/users/fdev31/orgs", "received_events_url": "https://api.github.com/users/fdev31/received_events", "repos_url": "https://api.github.com/users/fdev31/repos", "site_admin": false, "starred_url": "https://api.github.com/users/fdev31/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fdev31/subscriptions", "type": "User", "url": "https://api.github.com/users/fdev31", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-02-28T10:59:53Z
2021-09-09T00:10:03Z
2014-03-02T09:34:55Z
NONE
resolved
Looks like the NOTICE file wasn't updated after the charade/chardet refactor. Also, the PEM file license seems out of sync. In short, the licensing currently looks inconsistent.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1933/reactions" }
https://api.github.com/repos/psf/requests/issues/1933/timeline
null
completed
null
null
false
[ "Good catch!\n\nThe charade/chardet switch is easy, but the fact that the PEM file is licensed under the MPL is a bit awkward: the easiest thing to do is to just whack the MPL text from the PEM file into NOTICE as well.\n" ]
https://api.github.com/repos/psf/requests/issues/1932
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1932/labels{/name}
https://api.github.com/repos/psf/requests/issues/1932/comments
https://api.github.com/repos/psf/requests/issues/1932/events
https://github.com/psf/requests/pull/1932
28,399,387
MDExOlB1bGxSZXF1ZXN0MTI5OTEwNjg=
1,932
Fix relative import which will fail on Python 3.X
{ "avatar_url": "https://avatars.githubusercontent.com/u/1407472?v=4", "events_url": "https://api.github.com/users/Cosmius/events{/privacy}", "followers_url": "https://api.github.com/users/Cosmius/followers", "following_url": "https://api.github.com/users/Cosmius/following{/other_user}", "gists_url": "https://api.github.com/users/Cosmius/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Cosmius", "id": 1407472, "login": "Cosmius", "node_id": "MDQ6VXNlcjE0MDc0NzI=", "organizations_url": "https://api.github.com/users/Cosmius/orgs", "received_events_url": "https://api.github.com/users/Cosmius/received_events", "repos_url": "https://api.github.com/users/Cosmius/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Cosmius/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cosmius/subscriptions", "type": "User", "url": "https://api.github.com/users/Cosmius", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2014-02-27T07:30:27Z
2021-09-08T23:01:07Z
2014-02-27T07:43:07Z
NONE
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1932/reactions" }
https://api.github.com/repos/psf/requests/issues/1932/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/1932.diff", "html_url": "https://github.com/psf/requests/pull/1932", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/1932.patch", "url": "https://api.github.com/repos/psf/requests/pulls/1932" }
true
[ "Thanks for this! Unfortunately, this change is in urllib3, which we take as-is from upstream, so you'd normally need to open your fixes there. However, this was actually fixed by shazow/urllib3#338, and so will be in the next version of Requests. =)\n\nThanks!\n" ]
https://api.github.com/repos/psf/requests/issues/1931
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/1931/labels{/name}
https://api.github.com/repos/psf/requests/issues/1931/comments
https://api.github.com/repos/psf/requests/issues/1931/events
https://github.com/psf/requests/issues/1931
28,303,622
MDU6SXNzdWUyODMwMzYyMg==
1,931
Upgrade to the latest version of urllib3
{ "avatar_url": "https://avatars.githubusercontent.com/u/8818?v=4", "events_url": "https://api.github.com/users/hamish/events{/privacy}", "followers_url": "https://api.github.com/users/hamish/followers", "following_url": "https://api.github.com/users/hamish/following{/other_user}", "gists_url": "https://api.github.com/users/hamish/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hamish", "id": 8818, "login": "hamish", "node_id": "MDQ6VXNlcjg4MTg=", "organizations_url": "https://api.github.com/users/hamish/orgs", "received_events_url": "https://api.github.com/users/hamish/received_events", "repos_url": "https://api.github.com/users/hamish/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hamish/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hamish/subscriptions", "type": "User", "url": "https://api.github.com/users/hamish", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 615414998, "name": "GAE Support", "node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=", "url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support" } ]
closed
true
null
[]
null
1
2014-02-26T02:18:25Z
2021-09-08T09:00:48Z
2014-02-26T07:21:04Z
NONE
resolved
When using the requests framework to connect to the Xero accounting service from within Google App Engine (using pyxero), I get the error 'Exceeded 30 redirects.' After upgrading the included copy of urllib3 to the latest version (1.7.1), the error no longer occurred and the connection was successful. More details here: https://github.com/freakboy3742/pyxero/issues/23. Reproducible case here (this has the upgraded urllib3; you will need to revert to reproduce the problem): https://github.com/hamish/gae_xero
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/1931/reactions" }
https://api.github.com/repos/psf/requests/issues/1931/timeline
null
completed
null
null
false
[ "@hamish This is a per-release action for us, so the next release should have the newest version of urllib3 in it. =)\n" ]