url (stringlengths 50-53) | repository_url (stringclasses 1 value) | labels_url (stringlengths 64-67) | comments_url (stringlengths 59-62) | events_url (stringlengths 57-60) | html_url (stringlengths 38-43) | id (int64 597k-2.65B) | node_id (stringlengths 18-32) | number (int64 1-6.83k) | title (stringlengths 1-296) | user (dict) | labels (listlengths 0-5) | state (stringclasses 2 values) | locked (bool 2 classes) | assignee (dict) | assignees (listlengths 0-4) | milestone (dict) | comments (int64 0-211) | created_at (stringlengths 20) | updated_at (stringlengths 20) | closed_at (stringlengths 20 ⌀) | author_association (stringclasses 3 values) | active_lock_reason (stringclasses 4 values) | body (stringlengths 0-65.6k ⌀) | closed_by (dict) | reactions (dict) | timeline_url (stringlengths 59-62) | performed_via_github_app (null) | state_reason (stringclasses 3 values) | draft (bool 2 classes) | pull_request (dict) | is_pull_request (bool 2 classes) | issue_comments (listlengths 0-30) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/psf/requests/issues/2130
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2130/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2130/comments
|
https://api.github.com/repos/psf/requests/issues/2130/events
|
https://github.com/psf/requests/pull/2130
| 37,863,010 |
MDExOlB1bGxSZXF1ZXN0MTgzOTU1MzA=
| 2,130 |
Doc index updates
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-07-15T08:54:42Z
|
2021-09-08T23:01:06Z
|
2014-07-15T08:57:47Z
|
CONTRIBUTOR
|
resolved
|
- Wrap lines at around 80 characters in the doc index
- Add Sony to the list of users
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2130/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2130/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2130.diff",
"html_url": "https://github.com/psf/requests/pull/2130",
"merged_at": "2014-07-15T08:57:47Z",
"patch_url": "https://github.com/psf/requests/pull/2130.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2130"
}
| true |
[
"LGTM. Thanks! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2129
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2129/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2129/comments
|
https://api.github.com/repos/psf/requests/issues/2129/events
|
https://github.com/psf/requests/pull/2129
| 37,845,249 |
MDExOlB1bGxSZXF1ZXN0MTgzODUyMzk=
| 2,129 |
Apply environment features to prepared requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/416057?v=4",
"events_url": "https://api.github.com/users/jamielennox/events{/privacy}",
"followers_url": "https://api.github.com/users/jamielennox/followers",
"following_url": "https://api.github.com/users/jamielennox/following{/other_user}",
"gists_url": "https://api.github.com/users/jamielennox/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jamielennox",
"id": 416057,
"login": "jamielennox",
"node_id": "MDQ6VXNlcjQxNjA1Nw==",
"organizations_url": "https://api.github.com/users/jamielennox/orgs",
"received_events_url": "https://api.github.com/users/jamielennox/received_events",
"repos_url": "https://api.github.com/users/jamielennox/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jamielennox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamielennox/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jamielennox",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 60669570,
"name": "Please Review",
"node_id": "MDU6TGFiZWw2MDY2OTU3MA==",
"url": "https://api.github.com/repos/psf/requests/labels/Please%20Review"
}
] |
closed
| true | null |
[] | null | 21 |
2014-07-15T01:59:33Z
|
2021-09-08T11:00:54Z
|
2014-08-22T13:12:08Z
|
NONE
|
resolved
|
By having trust environment lookup in request() applications using
prepared requests don't get access to this functionality. Having this in
send() is a more logical place as we are dealing with default
communication parameters.
Closes #1436
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2129/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2129/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2129.diff",
"html_url": "https://github.com/psf/requests/pull/2129",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2129.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2129"
}
| true |
[
"Hi @jamielennox I haven't looked at this yet because the tests are failing. Please check the failures and fix them. Thank you\n",
"@sigmavirus24 fixed, thanks.\n",
"There are some nice changes in here. Looking forward to hearing what @sigmavirus24 and @Lukasa have to say :)\n",
"This seems in principle to be totally fine. Affects our public facing API in a backward incompatible way though, so we do want to be wary here.\n",
"I agree with @Lukasa and want to stress that it's a very subtle way in which it affects the API. Most people who use `Session#send` are not expecting environment based data to affect their request. If they are using prepared requests to avoid this altogether, we're making this basically impossible for them now. We might need to wait for Requests 3.0 to merge this.\n",
"What if we compromise. What if instead of moving this into `Session#send` we turn it into a method that returns a dictionary (which is basically `send_kwargs`). We then call that it in `Session#request` and pass it into `Session#send` and users who want this while circumventing `Session#request` can have it.\n",
"I realize the backwards compatibility issues you guys have here and if that's what's needed i can make it a function. I had resisted that because there are not a lot of public functions on session and so i hadn't wanted to add this one. \n\nHaving said that there is much simpler way to ignore the environment variables if that was your goal by setting session.trust_env = False, so whilst we may not be able to merge as is i think it's in the spirit of the change.\n\nWhat about a different way, the code stays much the same as this review, trust_env is a parameter to send(), the default value is False, and the value passed from request() is self.trust_env.\n",
"@jamielennox We don't need the `trust_env` keyword argument (in fact I don't think we want it). What we're getting at is that we've moved the location where a specific action occurs, so that previously people who didn't need to worry about the `trust_env` value now do. The fact that it can easily be turned off is not really relevant, we still need to be careful with when we merge the change. I'm happy for us to do it, we should just exercise caution.\n",
"@Lukasa Yea, i understand that. I was thinking by doing it this way we didn't change the existing behaviour for anyone and people who wanted to use the environment variables could opt-in. I don't know what the timeframe is but that's preferable to me than waiting for 3.0. \n\nDo you want me to revert back to the older commit? (wish github had a better way of managing this). \n",
"Yeah, the revert will be better because the new version of the code still changes the API. Previously, `trust_env` was considered in `Session.request`: now it's considered in `Session.send`. Unless we duplicate the code, we can't have it so that there's no change in the API.\n\nNow, that's fine: a change in the API is the whole point of this pull request! =) It just means that we may need to sit on it a little while.\n",
"Just for interest (as you're replying quickly), even though in the new version the execution will be exactly the same the fact that the code is executed in a different function constitutes a change in API? What is that designed to protect against - people mocking the adapter.send() function? \n",
"I'm concerned about people subclassing `Session`. =)\n",
"I like the idea of making the environment loading a method that would be called manually before running send. This gives people using the send API directly now the ability to get all of the functionality of the standard flow, as well as clean up the code a bit. \n\nThen we'd keep the behavior identical to what it is now. \n",
"As a side note, breaking this out into another method also means users can override it to do extra things if that suits their desires and we can also test it more easily.\n",
":+1: \n",
"Any update?\n",
"@kennethreitz it seems @jamielennox has abandoned this. I'll pull their commit(s) into a branch of my own and update it with our decision tonight.\n",
"Sorry everyone, i had let this slip off my list.\n\nThis is at least an example of extracting that information to a function, there is a question there as to whether the trust_env check should go inside the apply_environ() function or not, i've gone back and forth. \n\nI'm not sure if this is what you were looking for - it doesn't feel as 'clean' as the last patches for some reason. \n\n@sigmavirus24 don't be worried about stepping me through the review process, i'd prefer to get this feature in than worry about who contributed it. If you have something in mind that satisfies the case then merge it and close this out. \n",
"@sigmavirus24 want to own this?\n",
"Sure. I'll add this to my TODO list for Madison Ruby today. (Yeah, working on Python at a Ruby conference. =P)\n",
"Sweet, closing then :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2128
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2128/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2128/comments
|
https://api.github.com/repos/psf/requests/issues/2128/events
|
https://github.com/psf/requests/pull/2128
| 37,789,805 |
MDExOlB1bGxSZXF1ZXN0MTgzNTAzODc=
| 2,128 |
Fix broken link
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/588486?v=4",
"events_url": "https://api.github.com/users/tshepang/events{/privacy}",
"followers_url": "https://api.github.com/users/tshepang/followers",
"following_url": "https://api.github.com/users/tshepang/following{/other_user}",
"gists_url": "https://api.github.com/users/tshepang/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tshepang",
"id": 588486,
"login": "tshepang",
"node_id": "MDQ6VXNlcjU4ODQ4Ng==",
"organizations_url": "https://api.github.com/users/tshepang/orgs",
"received_events_url": "https://api.github.com/users/tshepang/received_events",
"repos_url": "https://api.github.com/users/tshepang/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tshepang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tshepang/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tshepang",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-07-14T14:13:35Z
|
2021-09-08T23:01:07Z
|
2014-07-14T14:39:19Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2128/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2128/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2128.diff",
"html_url": "https://github.com/psf/requests/pull/2128",
"merged_at": "2014-07-14T14:39:19Z",
"patch_url": "https://github.com/psf/requests/pull/2128.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2128"
}
| true |
[
"Thanks @tshepang !\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2127
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2127/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2127/comments
|
https://api.github.com/repos/psf/requests/issues/2127/events
|
https://github.com/psf/requests/issues/2127
| 37,733,422 |
MDU6SXNzdWUzNzczMzQyMg==
| 2,127 |
verify=False,but still requests.exceptions.SSLError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2603552?v=4",
"events_url": "https://api.github.com/users/khalidhsu/events{/privacy}",
"followers_url": "https://api.github.com/users/khalidhsu/followers",
"following_url": "https://api.github.com/users/khalidhsu/following{/other_user}",
"gists_url": "https://api.github.com/users/khalidhsu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/khalidhsu",
"id": 2603552,
"login": "khalidhsu",
"node_id": "MDQ6VXNlcjI2MDM1NTI=",
"organizations_url": "https://api.github.com/users/khalidhsu/orgs",
"received_events_url": "https://api.github.com/users/khalidhsu/received_events",
"repos_url": "https://api.github.com/users/khalidhsu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/khalidhsu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/khalidhsu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/khalidhsu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-07-13T02:57:16Z
|
2021-09-08T23:08:10Z
|
2014-09-09T22:25:55Z
|
NONE
|
resolved
|
I have set the ssl verify=False, and I used a proxy(charles) with a Fake SSL cert,works fine with some website ,but some still raise SSLError,such as
https://pcs.baidu.com/rest/2.0/pcs/file?method=rapidupload&content-length=1643802624&content-md5=a655883b853e766e0cdf961bda894df9&slice-md5=fc7812dbdce256902d9277cef2c9b0da&path=/tmp2/a/ak.iso&app_id=250528&ondup=newcopy&filename=asas.iso
---
Traceback (most recent call last):
File "/home/khalid/py/py-mime/py20140710/baidupan.py", line 260, in <module>
main(sys.argv[1:])
File "/home/khalid/py/py-mime/py20140710/baidupan.py", line 238, in main
upload_rapid('/media/khalid/数据盘/迅雷下载/VS2012_ULT_chs.iso')
File "/home/khalid/py/py-mime/py20140710/baidupan.py", line 136, in upload_rapid
ls.get(url, proxies=proxies, verify=False)
File "/home/khalid/py/py-mime/py20140710/lsession3.py", line 73, in get
self._r = self._rqs.get(url, *_kw)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 395, in get
return self.request('GET', url, *_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 383, in request
resp = self.send(prep, *_send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 486, in send
r = adapter.send(request, *_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 385, in send
raise SSLError(e)
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2127/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2127/timeline
| null |
completed
| null | null | false |
[
"Setting `verify=False` does not mean you will never get an `SSLError`, it just means we won't validate the certificate. In this case we tried to perform the TLS handshake and encountered an internal error. We can't ignore this kind of error, we have to raise it.\n",
"I am having a similar error. I am trying to request a url that has an expired certificate and I set verify to false I get:\n\nTraceback (most recent call last):\n File \"/media/tj/Windows 8.1/Users/tj/Desktop/sockets.py\", line 25, in <module>\n soc1.connect(ADDR)\n File \"/usr/lib/python2.7/socket.py\", line 224, in meth\n return getattr(self._sock,name)(*args)\nsocket.error: [Errno 113] No route to host\n\nThis is weird because if I use urllib2 for the same exact url it works fine.\n",
"@Fire30 that traceback doesn't appear to have anything to do with requests. Can you explain the relationship?\n",
"Closing due to lack of activity. If there are any updates let us know and we'll reopen this.\n"
] |
https://api.github.com/repos/psf/requests/issues/2126
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2126/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2126/comments
|
https://api.github.com/repos/psf/requests/issues/2126/events
|
https://github.com/psf/requests/issues/2126
| 37,727,228 |
MDU6SXNzdWUzNzcyNzIyOA==
| 2,126 |
SOCK4/5 Proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2049730?v=4",
"events_url": "https://api.github.com/users/Segflow/events{/privacy}",
"followers_url": "https://api.github.com/users/Segflow/followers",
"following_url": "https://api.github.com/users/Segflow/following{/other_user}",
"gists_url": "https://api.github.com/users/Segflow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Segflow",
"id": 2049730,
"login": "Segflow",
"node_id": "MDQ6VXNlcjIwNDk3MzA=",
"organizations_url": "https://api.github.com/users/Segflow/orgs",
"received_events_url": "https://api.github.com/users/Segflow/received_events",
"repos_url": "https://api.github.com/users/Segflow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Segflow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Segflow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Segflow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-07-12T19:46:47Z
|
2021-09-08T23:10:49Z
|
2014-07-12T19:50:47Z
|
NONE
|
resolved
|
Hello, is there any built-in sock4/5 support?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2126/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2126/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2125
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2125/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2125/comments
|
https://api.github.com/repos/psf/requests/issues/2125/events
|
https://github.com/psf/requests/issues/2125
| 37,725,349 |
MDU6SXNzdWUzNzcyNTM0OQ==
| 2,125 |
SOCK4/5 Proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2049730?v=4",
"events_url": "https://api.github.com/users/Segflow/events{/privacy}",
"followers_url": "https://api.github.com/users/Segflow/followers",
"following_url": "https://api.github.com/users/Segflow/following{/other_user}",
"gists_url": "https://api.github.com/users/Segflow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Segflow",
"id": 2049730,
"login": "Segflow",
"node_id": "MDQ6VXNlcjIwNDk3MzA=",
"organizations_url": "https://api.github.com/users/Segflow/orgs",
"received_events_url": "https://api.github.com/users/Segflow/received_events",
"repos_url": "https://api.github.com/users/Segflow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Segflow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Segflow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Segflow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-07-12T18:24:32Z
|
2021-09-08T23:10:50Z
|
2014-07-12T18:30:15Z
|
NONE
|
resolved
|
Hello, is there any built-in support for socks4 and socks5 proxies?
i'am running a tor process at port 7575 and i would like to use it in fetching data.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2125/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2125/timeline
| null |
completed
| null | null | false |
[
"Currently no. We need to work on getting shazow/urllib3#284 up to scratch and merged to get that support. \n",
"Also if you have questions, the proper place to ask them is on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). Also, had you performed a search you would have found your answer.\n"
] |
https://api.github.com/repos/psf/requests/issues/2124
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2124/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2124/comments
|
https://api.github.com/repos/psf/requests/issues/2124/events
|
https://github.com/psf/requests/pull/2124
| 37,603,833 |
MDExOlB1bGxSZXF1ZXN0MTgyNDU1OTg=
| 2,124 |
Depend on certifi
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2014-07-10T20:51:52Z
|
2021-09-08T10:01:20Z
|
2014-07-11T17:30:07Z
|
MEMBER
|
resolved
|
Resolves #2123.
This represents one possible approach to certifi: always take it if it's present. This is probably not the most secure approach, however. If we wanted to be more intelligent we could embed the version number of certifi that requests contains and use the most up-to-date version. If you're interested in doing that let me know and I'll add it as an enhancement.
Note also that this adds certifi as a hard dependency. I'd be equally happy to not install it in the install process but take it if it's there.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2124/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2124/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2124.diff",
"html_url": "https://github.com/psf/requests/pull/2124",
"merged_at": "2014-07-11T17:30:07Z",
"patch_url": "https://github.com/psf/requests/pull/2124.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2124"
}
| true |
[
"Lets do it!\n\n:sparkles:\n",
"I don't think making certifi a hard-dependency is a good idea. There are reasons to prefer the distribution-wide ca-bundle.\n",
"Sure there are, but that's orthogonal to certifi being a dependency. All we've done is change our logic from preferring our own bundled certs to certifi's. If you override our logic (as all distros do) then you will continue not to use certifi.\n",
"I am the maintainer of the packages for archlinux and I override the logic there. The problem is that I also have to remove the requires in setup.py because, for example, scripts using load_entry_point will fail to import the required certifi module.\n\nEssentially I think the certs.py logics are fine but the requirement in setup.py should be lifted.\n",
"That's a risk. If we take the policy that our new dependence on certifi means we no longer need to update the build-in bundle we've exposed our users to risk.\n\nI have no particular objection, so I'll let @kennethreitz make the call: should the hard dependency on certifi be removed?\n",
"Certifi should be a soft-requirement, always used if present. \n\nIdeally, people should always have an up-to-date version of certifi installed. We can document this, and recommend this. \n",
"Ok, we'll remove certifi as a hard dependency then. =)\n",
"Done, see #2203. \n",
"<3\n"
] |
https://api.github.com/repos/psf/requests/issues/2123
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2123/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2123/comments
|
https://api.github.com/repos/psf/requests/issues/2123/events
|
https://github.com/psf/requests/issues/2123
| 37,600,291 |
MDU6SXNzdWUzNzYwMDI5MQ==
| 2,123 |
Consider soft-depending on certifi package
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-07-10T20:12:40Z
|
2021-09-08T23:10:50Z
|
2014-07-11T17:30:07Z
|
CONTRIBUTOR
|
resolved
|
Install it during installation, but still have a packaged version to fall back on.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2123/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2123/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2122
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2122/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2122/comments
|
https://api.github.com/repos/psf/requests/issues/2122/events
|
https://github.com/psf/requests/issues/2122
| 37,447,446 |
MDU6SXNzdWUzNzQ0NzQ0Ng==
| 2,122 |
Response encoding detect
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2986070?v=4",
"events_url": "https://api.github.com/users/fengsp/events{/privacy}",
"followers_url": "https://api.github.com/users/fengsp/followers",
"following_url": "https://api.github.com/users/fengsp/following{/other_user}",
"gists_url": "https://api.github.com/users/fengsp/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fengsp",
"id": 2986070,
"login": "fengsp",
"node_id": "MDQ6VXNlcjI5ODYwNzA=",
"organizations_url": "https://api.github.com/users/fengsp/orgs",
"received_events_url": "https://api.github.com/users/fengsp/received_events",
"repos_url": "https://api.github.com/users/fengsp/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fengsp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fengsp/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fengsp",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-07-09T09:27:39Z
|
2021-09-08T23:10:51Z
|
2014-07-09T09:34:49Z
|
NONE
|
resolved
|
Take one link for example: http://baike.baidu.com/view/115789.htm
This HTTP response has header `Content-Type:text/html`, no encoding, however, the
returned html content has `<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />`, actually using requests, I got encoding `'ISO-8859-1'`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2122/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2122/timeline
| null |
completed
| null | null | false |
[
"Thanks for this issue!\n\nThis is a well known requests behaviour that is very clearly documented. You can see the documentation section [here](http://docs.python-requests.org/en/latest/user/advanced/#encodings). It has been raised in prior issues as well: #1604 and #2042 for some of the most recent discussions of it.\n\nHowever, we're looking to remove this fallback: see #2086. Note, however, that we're not sure we'll examine the HTML for the meta tag. Our position on this has been that we're not a HTML library, we're a HTTP library, and therefore examining the body of the request is outside our remit.\n\nIn the meantime, if you want to do this you can take advantage of one of our utility functions: `requests.utils.get_encodings_from_content`. You use it like this:\n\n``` python\nimport requests\nfrom requests.utils import get_encodings_from_content\n\nr = requests.get('http://baike.baidu.com/view/115789.htm')\ncodings = get_encodings_from_content(r.content)\nif codings:\n r.encoding = codings[0]\n\n# Do stuff with r.text\n```\n",
"Thanks a lot, :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2121
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2121/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2121/comments
|
https://api.github.com/repos/psf/requests/issues/2121/events
|
https://github.com/psf/requests/issues/2121
| 37,331,171 |
MDU6SXNzdWUzNzMzMTE3MQ==
| 2,121 |
python-requests does not support rfc6874 URLs with link-local addresses and zone identifiers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/151255?v=4",
"events_url": "https://api.github.com/users/sagarun/events{/privacy}",
"followers_url": "https://api.github.com/users/sagarun/followers",
"following_url": "https://api.github.com/users/sagarun/following{/other_user}",
"gists_url": "https://api.github.com/users/sagarun/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sagarun",
"id": 151255,
"login": "sagarun",
"node_id": "MDQ6VXNlcjE1MTI1NQ==",
"organizations_url": "https://api.github.com/users/sagarun/orgs",
"received_events_url": "https://api.github.com/users/sagarun/received_events",
"repos_url": "https://api.github.com/users/sagarun/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sagarun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sagarun/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sagarun",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-07-08T03:45:27Z
|
2021-09-08T23:10:51Z
|
2014-07-08T05:45:55Z
|
NONE
|
resolved
|
Description of problem:
Version-Release number of selected component (if applicable):
How reproducible:
always
Steps to Reproduce:
1.ip addr add fe80::1 dev p14p14
2. nc -6 -l 1234
3. python -c "import requests; requests.get('http://[fe80::1%25p14p1]:1234')"
3. nc -v6 'fe80::1%p14p1' 1234
Ncat: Version 6.40 ( http://nmap.org/ncat )
Ncat: Connected to fe80::1:1234.
Actual results:
The python script shows a traceback:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/requests/api.py", line 55, in get
return request('get', url, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 335, in request
resp = self.send(prep, *_send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 438, in send
r = adapter.send(request, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 327, in send
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='fe80::1%25p14p1', port=1234): Max retries exceeded with url: / (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
Expected results:
python requests should connect to netcat.
Additional info:
https://tools.ietf.org/html/rfc6874
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2121/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2121/timeline
| null |
completed
| null | null | false |
[
"Please see https://bugzilla.redhat.com/show_bug.cgi?id=1076822 for more information\n",
"This is a duplicate of #1985. I'm going to close this to centralise discussion there.\n"
] |
https://api.github.com/repos/psf/requests/issues/2120
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2120/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2120/comments
|
https://api.github.com/repos/psf/requests/issues/2120/events
|
https://github.com/psf/requests/issues/2120
| 37,179,438 |
MDU6SXNzdWUzNzE3OTQzOA==
| 2,120 |
Unexpected keyword argument in packages/urllib3/connectionpool.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8069918?v=4",
"events_url": "https://api.github.com/users/jmandreoli/events{/privacy}",
"followers_url": "https://api.github.com/users/jmandreoli/followers",
"following_url": "https://api.github.com/users/jmandreoli/following{/other_user}",
"gists_url": "https://api.github.com/users/jmandreoli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jmandreoli",
"id": 8069918,
"login": "jmandreoli",
"node_id": "MDQ6VXNlcjgwNjk5MTg=",
"organizations_url": "https://api.github.com/users/jmandreoli/orgs",
"received_events_url": "https://api.github.com/users/jmandreoli/received_events",
"repos_url": "https://api.github.com/users/jmandreoli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jmandreoli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmandreoli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jmandreoli",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-07-04T16:59:50Z
|
2021-09-08T06:00:38Z
|
2014-07-07T10:50:36Z
|
NONE
|
resolved
|
This is what I get during a requests session under python 3.3:
Traceback (most recent call last):
File "/.../python3.3/site-packages/requests/packages/urllib3/connectionpool.py", line 319, in _make_request
httplib_response = conn.getresponse(buffering=True)
TypeError: getresponse() got an unexpected keyword argument 'buffering'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2120/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2120/timeline
| null |
completed
| null | null | false |
[
"@jmandreoli I'm certain that is not the entire of your traceback and it is likely related to #1915 and #1289. Your issue is not that, because we were handling that exception when a different one occurred (with certainty). You need to give us the version of requests and what the _entire_ traceback is.\n",
"On 04/07/14 19:20, Ian Cordasco wrote:\n\n> @jmandreoli https://github.com/jmandreoli I'm certain that is not the\n> entire of your traceback and it is likely related to #1915\n> https://github.com/kennethreitz/requests/issues/1915 and #1289\n> https://github.com/kennethreitz/requests/issues/1289. Your issue is\n> not that, because we were handling that exception when a different one\n> occurred (with certainty). You need to give us the version of requests\n> and what the /entire/ traceback is.\n\nI send you enclosed the full traceback.\n- For me, the \"unexpected keyword argument 'buffering'\" error may not be \n the main cause of the problem, but it must still be a bug (invoking a \n method with a keyword argument which does not exist).\n- The version of python/requests is\n\n> Python 3.3.5 |Continuum Analytics, Inc.| (default, Mar 10 2014, 11:19:31)\n> [GCC 4.1.2 20080704 (Red Hat 4.1.2-54)] on linux\n> Type \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n> \n> > > > import requests\n> > > > requests.**version**\n> > > > '2.3.0'\n- I must also point out that the error does not always happen.\n- If it can help, I also send you enclosed the relevant section of the \n code. Note that the generator \"egxmonitor\" is enumerated only at low \n frequency (once every 2 secs), so it is not like the server is bombarded \n with requests in the \"while True\" loop.\n\nThanks for looking into this issue anyway, and if you find something, \nplease let me know.\n\n```\nJean-Marc\n```\n\nTraceback (most recent call last):\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/packages/urllib3/connectionpool.py\", line 319, in _make_request\n httplib_response = conn.getresponse(buffering=True)\nTypeError: getresponse() got an unexpected keyword argument 'buffering'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/packages/urllib3/connectionpool.py\", line 493, in urlopen\n body=body, headers=headers)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/packages/urllib3/connectionpool.py\", line 321, in _make_request\n httplib_response = conn.getresponse()\n File \"/.../anaconda/envs/py3k/lib/python3.3/http/client.py\", line 1147, in getresponse\n response.begin()\n File \"/.../anaconda/envs/py3k/lib/python3.3/http/client.py\", line 358, in begin\n version, status, reason = self._read_status()\n File \"/.../anaconda/envs/py3k/lib/python3.3/http/client.py\", line 328, in _read_status\n raise BadStatusLine(line)\nhttp.client.BadStatusLine: ''\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/adapters.py\", line 327, in send\n timeout=timeout\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/packages/urllib3/connectionpool.py\", line 543, in urlopen\n raise MaxRetryError(self, url, e)\nrequests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='13.202.222.220', port=80): Max retries exceeded with url: /UE/Post__PL__Data (Caused by <class 'http.client.BadStatusLine'>: '')\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/.../chronorec.py\", line 161, in newsession\n for itern,x in enumerate(source,1):\n File \"/.../chronorec.py\", line 526, in atinterval\n for x in it:\n File 
\"/.../src/pwr/egx.py\", line 30, in egxmonitor\n r = session.post(url,auth=login,data=data)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 498, in post\n return self.request('POST', url, data=data, *_kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, **kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/adapters.py\", line 375, in send\n raise ConnectionError(e, request=request)\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='13.202.222.220', port=80): Max retries exceeded with url: /UE/Post__PL__Data (Caused by <class 'http.client.BadStatusLine'>: '')\n",
"@jmandreoli It is not a bug. The relevant code is reproduced here in full (it lives [here](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L318-L321) for context):\n\n``` python\ntry: # Python 2.7+, use buffering of HTTP responses\n httplib_response = conn.getresponse(buffering=True)\nexcept TypeError: # Python 2.6 and older\n httplib_response = conn.getresponse()\n```\n\nThis code block exists because we want to use the `buffering` kwarg if we can on platforms that have it, but it isn't present on Python 2.6 or some versions of Python 3 (where it's the default). The style (attempt to use it, fail if we hit the exception and try without it) is a programming style usually summarised as \"it's easier to ask forgiveness than permission\". It's a very common Python programming style.\n\n_Unfortunately_, the chained tracebacks feature in Python 3.X trips people up when using this style because you frequently see exceptions that are of that form. In this case, the TypeError is totally expected.\n\nThe _relevant_ exception is the second one: `BadStatusLine`. The subsequent two exceptions are just wrapper exceptions and can be ignored. Looks like you're receiving a malformed response: you should use Wireshark or tcpdump to examine what is actually being received when you're hitting this error.\n\n`<digression>`\n\nI am not a fan of Python 3's chained exceptions being on by default. They're a very useful debugging feature, but they're also a classic example of an expert-level interface. If you don't expect to see them they're hugely difficult to read.\n\nThis is the perfect example: you posted four tracebacks. It's ridiculous to ask a non-expert requests user to work out which the hell one of those is the exception that matters. All you can know is that the last one is the one you'd have had to catch.\n\n_I_ know which one matters, but I've worked on this library for three years. This means that when I debug I can tell the difference between things that should happen and things that shouldn't, and so the extra context is helpful. For people who can't tell the difference (i.e. non-experts), the extra context is noise that makes debugging harder. 
On Python 2, you'd have seen this:\n\n```\nTraceback (most recent call last):\n File \"/.../chronorec.py\", line 161, in newsession\n for itern,x in enumerate(source,1):\n File \"/.../chronorec.py\", line 526, in atinterval\n for x in it:\n File \"/.../src/pwr/egx.py\", line 30, in egxmonitor\n r = session.post(url,auth=login,data=data)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 498, in post\n return self.request('POST', url, data=data, **kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, **send_kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, **kwargs)\n File \"/.../anaconda/envs/py3k/lib/python3.3/site-packages/requests/adapters.py\", line 375, in send\n raise ConnectionError(e, request=request)\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='13.202.222.220', port=80): Max retries exceeded with url: /UE/Post__PL__Data (Caused by <class 'http.client.BadStatusLine'>: '')\n```\n\nThat exception is much more immediate: we had an error during connection (`ConnectionError`), which is the result of hitting our retry limit with a specific URL (`Max retries exceeded`), and the things that caused us to retry were `BadStatusLine` errors.\n\nFor all of Python 3's moves to make things easier for non-experts, this one made it harder for them. I find that inconsistency frustrating.\n\n`</digression>`\n",
"I suspected the server, but I was indeed under the impression that the \n\"unexpected keyword\" error was what prevented the package from \nrecovering from the server error. Your explanation made it clear what \nwas happening (I was not aware of this Python3 change in chained traceback).\n\nI have just added a try-catch around the session.post, and forced a \nrepeat (up to some limit) if there is a RequestException. This works. \nThanks very much for your help.\n\n```\nJean-Marc\n```\n",
"No problem, I'm glad you were able to resolve it. =)\n",
"@jmandreoli I am facing a similar issue, can you tell how you hadled it"
] |
https://api.github.com/repos/psf/requests/issues/2119
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2119/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2119/comments
|
https://api.github.com/repos/psf/requests/issues/2119/events
|
https://github.com/psf/requests/issues/2119
| 37,134,319 |
MDU6SXNzdWUzNzEzNDMxOQ==
| 2,119 |
Get parameters are being double encoded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4297003?v=4",
"events_url": "https://api.github.com/users/einchance/events{/privacy}",
"followers_url": "https://api.github.com/users/einchance/followers",
"following_url": "https://api.github.com/users/einchance/following{/other_user}",
"gists_url": "https://api.github.com/users/einchance/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/einchance",
"id": 4297003,
"login": "einchance",
"node_id": "MDQ6VXNlcjQyOTcwMDM=",
"organizations_url": "https://api.github.com/users/einchance/orgs",
"received_events_url": "https://api.github.com/users/einchance/received_events",
"repos_url": "https://api.github.com/users/einchance/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/einchance/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/einchance/subscriptions",
"type": "User",
"url": "https://api.github.com/users/einchance",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-07-04T00:23:40Z
|
2021-09-08T23:10:50Z
|
2014-07-11T04:30:36Z
|
NONE
|
resolved
|
The parameters are being double encoded. I have the following code:
url = 'http://example.com'
queryParameters = {"MenuRequest": '{"language":"en-us","storeName":"R10501","datetime":"00:00:00","tenant":"store1"}'}
and I pass it in like so:
response = requests.get(url, params = queryParameters, verify=False)
the result is this:
http://example.com/?MenuRequest=%257B%2522language%2522%253A%2522en-us%2522%252C%2522storeName%2522%253A%2522R10501%2522%252C%2522datetime%2522%253A%252200%253A00%253A00%2522%252C%2522tenant%2522%253A%2522store1%2522%257D
it should be this:
http://example.com/?MenuRequest=%7B%22language%22%3A%22en-us%22%2C%22storeName%22%3A%22R10501%22%2C%22datetime%22%3A%2200%3A00%3A00%22%2C%22tenant%22%3A%22store1%22%7D
The problem is that the parameters are being double encoded.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/2119/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2119/timeline
| null |
completed
| null | null | false |
[
"``` pycon\n~ python\nPython 2.7.7 (default, Jun 2 2014, 18:55:26)\n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> queryParameters = {\"MenuRequest\": '{\"language\":\"en-us\",\"storeName\":\"R10501\",\"datetime\":\"00:00:00\",\"tenant\":\"store1\"}'}\n>>> import requests\n>>> r = requests.get('https://httpbin.org/get', params=queryParameters)\n>>> r.url\nu'https://httpbin.org/get?MenuRequest=%7B%22language%22%3A%22en-us%22%2C%22storeName%22%3A%22R10501%22%2C%22datetime%22%3A%2200%3A00%3A00%22%2C%22tenant%22%3A%22store1%22%7D'\n>>> requests.__version__\n'2.3.0'\n```\n\nI'm not seeing the same behaviour. What version of requests are you using?\n",
"I'm using 2.3.0. Upon further investigation it appears that the behavior I described above is only occurring for me when I am running code from the eclipse pydev (on windows). It works correctly when I run it directly from the commandline.\n",
"I'm going to close this issue as it's not a issue with the requests library and likely an issue with my dev environment setup or potentially an issue with py-dev plugin.\n",
"I thought I'd fixed the issue, but I figured it out. Here's how to reproduce the double percent encoding issue:\n\nurl = 'http://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?'\nqueryParameters = {\"MenuRequest\": '{\"language\":\"en-us\",\"storeName\":\"R10501\",\"datetime\":\"00:00:00\",\"tenant\":\"store1\"}'}\nresponse = requests.get(url, params = queryParameters, verify=False)\nprint response.url\n\nIf you change the url to https instead of http, it will work as expected and not percent encode twice.\n",
"This is not our fault, it's Pizza Hut's:\n\n``` python\n>>> url = 'http://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?'\n>>> queryParameters = {\"MenuRequest\": '{\"language\":\"en-us\",\"storeName\":\"R10501\",\"datetime\":\"00:00:00\",\"tenant\":\"store1\"}'}\n>>> r = requests.get(url, params = queryParameters, verify=False)\n>>> r.history\n[<Response [301]>]\n>>> r.history[0].headers['location']\n'https://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?MenuRequest=%257B%2522language%2522%253A%2522en-us%2522%252C%2522storeName%2522%253A%2522R10501%2522%252C%2522datetime%2522%253A%252200%253A00%253A00%2522%252C%2522tenant%2522%253A%2522store1%2522%257D'\n```\n\nWe're being redirected to that URL, so unsurprisingly we're going there. =)\n",
"The issue is not with the redirection. It's the fact that the when it's redirected, requests is encoding the query parameters a second time. The url you list at the end:\n\n'https://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?MenuRequest=%257B%2522language%2522%253A%2522en-us%2522%252C%2522storeName%2522%253A%2522R10501%2522%252C%2522datetime%2522%253A%252200%253A00%253A00%2522%252C%2522tenant%2522%253A%2522store1%2522%257D'\n\nshould be\n\n 'https://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?MenuRequest=%7B%22language%22%3A%22en-us%22%2C%22storeName%22%3A%22R10501%22%2C%22datetime%22%3A%2200%3A00%3A00%22%2C%22tenant%22%3A%22store1%22%7D'\n",
"Let's walk through an even more explicit version of the code I posted above. I invite you to follow along in your command shell. First, do some set up:\n\n``` python\n>>> import requests\n>>> url = 'http://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?'\n>>> queryParameters = {\"MenuRequest\": '{\"language\":\"en-us\",\"storeName\":\"R10501\",\"datetime\":\"00:00:00\",\"tenant\":\"store1\"}'}\n```\n\nWith that set up done, let's make the request. We'll prevent requests from following any redirects to make sure that we don't mess with the response.\n\n``` python\n>>> r = requests.get(url, params=queryParameters, verify=False, allow_redirects=False)\n```\n\nCheck the status code and you'll see its a 301 redirect.\n\n``` python\n>>> r.status_code\n301\n```\n\nConfirm that the URL requests sent to was _not_ double encoded:\n\n``` python\n>>> r.url\nu'http://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?MenuRequest=%7B%22language%22%3A%22en-us%22%2C%22storeName%22%3A%22R10501%22%2C%22datetime%22%3A%2200%3A00%3A00%22%2C%22tenant%22%3A%22store1%22%7D'\n```\n\nAs you can see, it's not, it's correctly single encoded. Let's check the URL that Pizza Hut redirected us to, which can be found in the `Location` header:\n\n``` python\n>>> r.headers['location']\n'https://www.pizzahut.ca/mobilem8-menu-service/rest/menuservice/menu?MenuRequest=%257B%2522language%2522%253A%2522en-us%2522%252C%2522storeName%2522%253A%2522R10501%2522%252C%2522datetime%2522%253A%252200%253A00%253A00%2522%252C%2522tenant%2522%253A%2522store1%2522%257D'\n```\n\n_That_ has been double encoded. This is emphatically not our fault. And just to put the final nail in this coffin, here is a Wireshark packet capture demonstrating that requests has not molested that header in any way:\n\n\n"
] |
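One way to cope with a server like this — whose redirect `Location` header re-encodes an already-encoded query string — is the approach used in the walkthrough above: disable automatic redirect following and re-apply the original parameters yourself. A minimal sketch, with a hypothetical helper name and a deliberately simplistic treatment of the redirect target:

``` python
import requests


def get_reapplying_params(url, params, max_hops=5, **kwargs):
    """Follow redirects by hand, re-sending the original (single-encoded) params."""
    for _ in range(max_hops):
        response = requests.get(url, params=params,
                                allow_redirects=False, **kwargs)
        if not response.is_redirect:
            return response
        # Ignore the (double-encoded) query string in the Location header and
        # keep only the redirect target's base URL; requests re-encodes params.
        url = response.headers['location'].split('?', 1)[0]
    raise requests.TooManyRedirects()
```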
https://api.github.com/repos/psf/requests/issues/2118
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2118/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2118/comments
|
https://api.github.com/repos/psf/requests/issues/2118/events
|
https://github.com/psf/requests/issues/2118
| 36,996,814 |
MDU6SXNzdWUzNjk5NjgxNA==
| 2,118 |
Let the user provide an SSLContext object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/166162?v=4",
"events_url": "https://api.github.com/users/brandon-rhodes/events{/privacy}",
"followers_url": "https://api.github.com/users/brandon-rhodes/followers",
"following_url": "https://api.github.com/users/brandon-rhodes/following{/other_user}",
"gists_url": "https://api.github.com/users/brandon-rhodes/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/brandon-rhodes",
"id": 166162,
"login": "brandon-rhodes",
"node_id": "MDQ6VXNlcjE2NjE2Mg==",
"organizations_url": "https://api.github.com/users/brandon-rhodes/orgs",
"received_events_url": "https://api.github.com/users/brandon-rhodes/received_events",
"repos_url": "https://api.github.com/users/brandon-rhodes/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/brandon-rhodes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/brandon-rhodes/subscriptions",
"type": "User",
"url": "https://api.github.com/users/brandon-rhodes",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 31 |
2014-07-02T16:02:03Z
|
2021-09-08T13:05:29Z
|
2017-01-13T07:22:06Z
|
NONE
|
resolved
|
The requests library seems to grow more and more keyword arguments to try to provide all of the flexibility that SSL users need. As of Python 3.2, the Standard Library now offers a different approach: an `SSLContext` that can accept settings for TLS protocol version, CA certificate list, identity certificate, secret key, allowable cipher list, Diffie-Hellman parameters, server-name callback function, whether to verify server hostnames, and so forth. It has a `wrap_socket()` method that starts up TLS on a socket using precisely the settings it has been configured with.
This lets protocol libraries in the Python 3 Standard Library opt out of needing keyword arguments for any of the above settings. They can simply accept a `context=` keyword argument, use the context to wrap their encrypted sockets, and stay out of the business of understanding SSL and all of its different settings.
If the requests library under Python 3 started supporting a `context=` parameter like the Standard Library protocols, then users could fine-tune their encryption settings without requests having to become more complicated.
A use-case: many users today are concerned about Perfect Forward Secrecy (PFS) and want to make connections only with ciphers that ensure captured traffic cannot be decrypted later if a secret key is compromised. But the current requests library, so far as I can see, makes no provision for this. Nor do I want it to: adding a new `ciphers=` keyword would be only the first of a dozen other keywords that SSL users will need added over the coming years. But if requests accepted a `context=` parameter, then I could create an `SSLContext` and tell it which protocols I am willing to use and have requests (and urllib3) use that context for building their SSL connections.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 5,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 5,
"url": "https://api.github.com/repos/psf/requests/issues/2118/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2118/timeline
| null |
completed
| null | null | false |
[
"> The requests library seems to grow more and more keyword arguments to try to provide all of the flexibility that SSL users need.\n\nI'm not sure if you mean the fundamental `get/post/put/delete/head/options/request` methods/functions, but if that is what you're referring to, those haven't changed since before 1.0 was released. If you mean that there's an increasing amount of flexibility within adapters to fine tune this, then yes, you're correct.\n\n> If the requests library under Python 3 started supporting a context= parameter like the Standard Library protocols, then users could fine-tune their encryption settings without requests having to become more complicated.\n\nWe're very unlikely to start supporting a keyword argument on only one version of Python. `urllib3` will discard keyword arguments on specific versions of Python, but we have yet to do that.\n\nIn general I like the idea, but so long as we will support Python 2 (unless a backport of the SSL module can be successfully provided), we will have trouble supporting this. Further, we're ignoring whether or not @kennethreitz would even want to add this keyword argument. I for one would be in favor of adding it (when we can support all Python versions equally), but I don't get to make these decisions ;).\n\nThat said, I'll have to investigate if urllib3 will support `SSLContext` objects (because I frankly don't remember). I'll look into that tonight unless @shazow chimes in before then. I'm also willing to do the work to allow urllib3 to accept them.\n",
"Can we step back for a moment?\n\n@brandon-rhodes I'm really not clear about what you mean by requests sprouting more and more keywords. We have two: `verify` and `cert`. Anything else is done at the `HTTPAdapter` layer on a case-by-case basis, which allows exactly what you've described.\n\nWhat have I missed? What parameters are you identifying that I don't see?\n",
"Reading your post again @brandon-rhodes, I realise that your concern is not actually that we _currently_ have too many parameters, but that we might _end up_ having too many parameters. In this I agree with you.\n\nHowever, I think requests should continue its proud tradition of solving the 80% use-case in the primary API and allowing the Transport Adapter API to help the remaining 20%. For this reason, I'd never accept adding a `ciphers` kwarg to the main API: it's just too niche, _especially_ if we require it to look anything like OpenSSL's terrible cipher format.\n\nInstead, let's look at providing something like the `ssl_version` support in `urllib3`. This would allow us to write something like the [SSLAdapter](http://toolbelt.readthedocs.org/en/latest/user.html#ssladapter), but for ciphers (or indeed just to extend the SSLAdapter). That allows people who genuinely do care to plug something in, while allowing requests to continue to support the best-possible use-case.\n",
"> Further, we're ignoring whether or not @kennethreitz would even want to add this keyword argument.\n\nBy **no means** did I intend to ignore @kennethreitz’s wishes! Alas. Opening this issue was, I had thought, precisely the means by which Mr. Reitz’s wishes could become known. Would there have been a more appropriate forum for asking my question?\n\n> In this I agree with you.\n\nThank you, @Lukasa! Sorry if I worded the issue awkwardly and it required multiple read-throughs.\n\n> I think requests should continue its proud tradition of solving the 80% use-case … I'd never accept adding a `ciphers` kwarg to the main API … This would allow us to write something like the `SSLAdapter`\n\nSo the keyword arguments to `get()` and the other functions are not “kitchen sink” collections of everything that _could_ be specified, but a smaller collection of settings, and users are intended to step back and create adapters for more difficult cases? Then you are correct that what I probably need is an adapter that accepts an `SSLContext` for building connections!\n\nThe `urllib3` library accepts an `ssl_version` keyword? That, I fear, is a first step towards insanity, and a course which can be stopped by turning to `SSLContext`. Because the next logical step after `ssl_version` (which is really setting what OpenSSL calls the “protocol” version, from what I can see?) is adding a `ciphers` keyword, and then a `dh_params` keyword, and then `ecdh_curve`, and so forth.\n\nIn fact, what I really probably want is an adapter that does not even know that `SSLContext` exists, but simply accepts that I have gotten ready an object with a `wrap_socket()` method that it can use when it is ready to negotiate an encrypted connection. That way, as long as an SSL library that I want to use in the future also offers a `wrap_socket()` method (think of PyOpenSSL, or that new `cryptography` project), then it would automatically work if dropped in to the transport object.\n",
"> Would there have been a more appropriate forum for asking my question?\n\nNo. I just like to couch my positive replies in \"But Kenneth might not want this, so don't get your hopes up\". It might take him a while to get around to reading or replying though.\n\n> So the keyword arguments to get() and the other functions are not “kitchen sink” collections of everything that could be specified, but a smaller collection of settings, and users are intended to step back and create adapters for more difficult cases?\n\nThat is correct. There are currently a few good examples of this (like the SSLAdapter @Lukasa linked above).\n\n> Then you are correct that what I probably need is an adapter that accepts an SSLContext for building connections!\n\nGood news! @Lukasa has a blog post about building adapters for requests and we can both help you with building one of these. Further, I'd be interested in it if only to add it to the [`toolbelt`](https://github.com/sigmavirus24/requests-toolbelt) if only to help it be discovered more easily.\n\n> In fact, what I really probably want is an adapter that does not even know that SSLContext exists, but simply accepts that I have gotten ready an object with a wrap_socket() method that it can use when it is ready to negotiate an encrypted connection.\n\nYou _could_ do this, but then you would probably doing a lot of `urllib3`'s work since it doesn't (if I remember correctly) provide a way for you to pass in a socket to use (and I don't think it should start accepting sockets to use either).\n\nIt might be more helpful to acquaint yourself further with the adapters that currently exist in requests, to understand how they work. They're not terribly difficult to understand either.\n",
"Okay! I will go take a look at them, and then come back later and comment again on this issue.\n",
"Supplying your own socket factory thing is indeed something urllib3 wants to support. We had some discussions about providing a class which \"contains\" all the parts you need to make an SSL contexted socket. (https://github.com/shazow/urllib3/pull/371#issuecomment-40299324 alludes to this)\n\nFor now, you'd need to set the `.ConnectionCls` property of a `*ConnectionPool` object to bring your own thing. All it needs to be is something that extends `httplib.HTTPConnection` (or `urllib3.connection.HTTPConnection`, I suppose) for the constructor and implements `.connect()`.\n\nHere's the meat for our verified SSL stuff: https://github.com/shazow/urllib3/blob/master/urllib3/connection.py#L191\n",
"Hooray for three person issue conversations! They're always so easy to follow. =) @brandon-rhodes, you've provided lots of great options for us. I'd like to try to enumerate them as I understand them, and then provide feedback on each of them. Please step in if you think I've left anything out or misunderstood something.\n1. Make it possible to pass urllib3 a `socket`.\n \n I don't like this idea much. In principle it's do-able, but it violates the abstraction layer that urllib3 provides. Our favourite feature of urllib3 is that it performs connection pooling, and for it to do that it needs to be able to transparently create new connections (to conserve resources). This means we can't say \"here, use this connection object\" because urllib3 owns all the connection objects. This idea is sadly unworkable.\n2. Provide urllib3 with an `SSLContext`.\n \n This is workable, as urllib3 can use it as a kind of connection factory. However, urllib3 right now does an excellent job of transparently interworking between PyOpenSSL and the standard library's `ssl` module. This transparent interworking is only going to get stronger as we move toward [hyper](http://hyper.rtfd.org/en/development/) and HTTP/2. This means being able to provide an SSL context is something of a footgun: if you provide a stdlib `SSLContext` to `hyper`, it won't be able to make an HTTP/2 connection through it, it needs PyOpenSSL's.\n \n Additionally, the PyOpenSSL `Context` does not have the same API as the `ssl` module's one. This is incredibly annoying, and @alekstorm has done awesome work by writing a compatibility module ([backports.ssl](https://github.com/alekstorm/backports.ssl)) to paper over the differences. Again, however, I note that working around the myriad user inputs is a logistical nightmare.\n3. More SSL keyword arguments.\n \n Brandon, you've expressed a discomfort with the many keyword arguments potentially required. I am sympathetic to that argument. I have no good alternative to using either a **ssl_kwargs or to have a ssl_args dictionary argument, neither of which is great. I guess a NamedTuple?\n",
"I'd vote for passing `SSLContext` around. I actually have a usecase for that (https://github.com/keybar/keybar/blob/master/src/keybar/utils/security.py#L53) and would love to pass around this config.\n\nTornado for example implements a fancy `ssl_options_to_context` (http://tornado.readthedocs.org/en/latest/netutil.html#tornado.netutil.ssl_options_to_context) which checks for a `SSLContext` object that can be applied on Python 3.2+\n\nThough, actually all that would be required in urllib3 to make it usable.\n",
"@EnTeQuAk I continue to be in favour of my option 2. =)\n",
"I agree: option 2 is the winner here, and will bring Requests up to the level of capability of the protocol modules in the Standard Library (http.client, smtplib, poplib, imaplib, ftplib, and nntplib) that all also support SSLContext parameters.\n",
"+1, option 2.\n",
"I am going to provide a strawman of option 3, as I believe option 2 is extremely difficult at an engineering level without introducing a new `urllib3.SSLContext` object.\n\nI propose the addition of a new urllib3 utility class, `TLSOptions`. That class would work as follows:\n\n``` python\nopts = TLSOptions()\nopts.allow_compression = True\nopts.ciphers = 'TLS_ECDHE_RSA_AES_128_GCM_SHA_256'\nopts.versions = ['TLSv1.1', 'TLSv1.2']\n```\n\nThis states the options in a manner that is deliberately agnostic to whether `ssl` or `PyOpenSSL` is being used. It also enables us to specifically whitelist the options we believe it is acceptable to change: for example, we are unlikely to ever have `allow_compression`.\n\nFinally, the object could perform meaningful validation of settings: for example, if I had passed the cipher string above with `opts.versions = ['TLSv1.1']`, the object could except (because this is an invalid configuration). Again, we may not implement this, but we could.\n\nCan I get initial feedback on how this feels to people?\n",
"I'd like to advocate for having `TLSOptions` take various constructor arguments, and being immutable.\n",
"I second @alex's suggestion. Constructing it once and not having to use those 4 lines each time I need a slight variation is much nicer. Immutability will also be pleasant so the options cannot be changed under anyone's feet.\n",
"I'm happy with immutability. That gets us to this:\n\n``` python\nopts = TLSOptions(\n allow_compression=True,\n ciphers='TLS_ECDHE_RSA_AES_128_GCM_SHA_256',\n versions=['TLSv1.1', 'TLSv1.2']\n)\n```\n",
"`TLSOptions` sounds good. Or `SSLOptions`, else we may need to rename all the other `ssl` references for consistency. Should this conversation be happening in urllib3-land?\n",
"@shazow it's lived almost entirely here so let's just finish it here?\n",
"Are there other options we're considering for this?\n",
"I'm strongly in favor for option 2), the SSLContext object. If we can agree upon a minimalistic API then urllib3 and requests would be able to support more TLS/SSL libraries than just OpenSSL through stdlib ssl module and PyOpenSSL. For example I have a specific use case for NSS as TLS library. Somebody has expressed an interest in SecureTransport (OSX) as TLS lib. I even would go so far and standardize a minimal ABC in a PEP.\n\n```\nimport socket\nfrom abc import ABCMeta, abstractmethod, abstractproperty\n\nclass AbstractSocketDescriptor(metaclass=ABCMeta):\n @abstractmethod\n def fileno(self):\n pass\n\nclass AbstractSocket(AbstractSocketDescriptor):\n @property\n def family(self):\n return socket.AF_INET\n\n @property\n def type(self):\n return socket.SOCK_STREAM\n\n @abstractproperty\n def proto(self):\n pass\n\n @abstractmethod\n def close(self):\n pass\n\n @abstractmethod\n def makefile(self, mode=None, bufsize=None):\n pass\n\n @abstractmethod\n def recv(self, bufsize, flags=None):\n pass\n\n @abstractmethod\n def send(self, string, flags=None):\n pass\n\n @abstractmethod\n def sendall(self, string, flags=None):\n pass\n\n\nclass AbstractSSLSocket(AbstractSocket):\n @abstractmethod\n def get_channel_binding(self, cb_type=\"tls-unique\"):\n pass\n\n @abstractmethod\n def selected_alpn_protocol(self):\n pass\n\n @abstractmethod\n def selected_npn_protocol(self):\n pass\n\nclass AbstractSSLContext(metaclass=ABCMeta):\n @abstractmethod\n def wrap_socket(self, sock: AbstractSocketDescriptor,\n server_side=False, *, server_hostname=None\n ) -> AbstractSSLSocket:\n pass\n```\n\nI think these attributes and methods are about the bare minimum to define a useful socket, SSL connection and SSLContext. The ABCs deliberately omit all methods to create or configure a socket or a context. It only defines methods to read/write for TCP streaming connections over IPv4 or IPv4 as well as an API to wrap a socket into a SSL socket. Basically SSLContext object is demoted to a factory that turns minimal socket into a SSL-aware socket.\n\nFor most applications the socket is created and configured elsewhere, usually in a create_connection(). Maybe create_connection() could be a method of SSLContext, too. That would be useful for NSS, because NSPR has its own socket abstraction layer that is not fully compatible with OS sockets.\n\n```\ndef create_connection(\n address,\n timeout=socket._GLOBAL_DEFAULT_TIMEOUT,\n source_address=None,\n family=AF.AF_UNSPEC,\n addrflags=0, # getaddrinfo flags\n blocking=None, # setblocking()\n sockopts=(), # setsockopt()\n ) -> AbstractSocketDescriptor:\n pass\n```\n\nMy idea doesn't contradict Alex's idea for TLSOptions, it just deals with a different level of abstraction. TLSOptions is a useful high level interface for requests. My proposal doesn't care how a SSLContext object for a specific implementation is created. TLSOptions could be used to create ssl.SSLContext or OpenSSL.SSL.Context. But internally requests and urllib3 should really use SSLContext.\n",
"We'll obviously use SSLContexts at the interface to urllib3, because that's what urllib3 will accept. However, in requests land, we may want to use an API that's far clearer.\n",
"Hi, not sure if this thread is the right place, but I just wanted to add my 2 cents from an API user's perspective.\n\nCurrently, the `requests` library is missing any obvious way to access the server certificate to implement additional custom validations.\n\nAFAIU, when we gain a way to supply customized `SSLContext` instances (which seems to be the prevailing option right now), we'd be able to implement this with some obscure tricks:\n1. override the `SSLContext.wrap_socket()` method, delegating the call to the super `wrap_socket()`, then storing the results of `getpeercert(True)` in some custom attribute on the socket, e.g. `sock.ssl_peer_cert_der`, and results of `getpeercert(False)` in e.g. `sock.ssl_peer_cert_limitedly_parsed`\n2. reach inside the `requests`'s response object, making our way down to the socket through lots of inappropriate intimacy (`response.connection.sock` I suppose?)\n3. read the desired certificate attribute from the socket, and do what's desired with it (e.g. `crypto.load_certificate(crypto.FILETYPE_ASN1, socket.ssl_peer_cert_der)`\n\nHowever, this is obviously suboptimal and involves bad code smells.\n\nIf you get to implementing SSLContexts support, could you also expose an obvious way in the API (e.g. a callback function) to handle any additional custom validations with access to full certificate data?\n\nBasically, a way to get called back with the result of `SSLSocket.getpeercert(Boolean)` and throw some exception if we don't like the cert.\n",
"@aadamowski You wouldn't need that. Generally speaking, if you can provide an SSLContext you should just use PyOpenSSL and provide Context with a custom verify callback. That would allow you to achieve your goal.\n",
"@Lukasa , is the plan for `requests` to handle both Python `ssl`'s [`SSLContext`](https://docs.python.org/3/library/ssl.html#ssl.SSLContext) and PyOpenSSL's [`Context`](http://pyopenssl.sourceforge.net/pyOpenSSL.html/openssl-context.html) objects equivalently?\n\nDespite some similarities, these classes don't have compatible APIs and aren't related.\n",
"Correct, but that's why we would need to have both. Requests can use pyOpenSSL in some circumstances, which means we need to be able to use the pyOpenSSL Context object in some circumstances. This is why this issue has been around so long: it's not as easy as it seems at first glance. ;)\n",
"I came across this issue today after trying to customize the cipher suite for the SSL connection in a Session. For some reason the TLS version and cipher spec were not being automatically negotiated appropriately. I'm still trying to figure out why this is the case. This is an issue that has come up for us multiple times now and we really do need a solution. I do like using the SSLContext object because it is the standard way to customize the ssl connection in Python. In any case, we just need a documented way to specify these parameters.\n\nCurrently, to force a given TLS version, we implement an HTTPAdapter and use session.mount() with it. But the pool manager only allows us to specify a specific ssl_version to use (packages/urllib3/poolmanager.py). It does not allow us to customize the ciphers or anything else.\n\nTo provide more context, we have encountered issues using client-side certificates while connecting to Java GlassFish servers that require TLS 1.1/1.2.\n",
"+1 - this would be great to have. It's been almost a year with no movement here.",
"No, it hasn't. You can pass SSLContext objects using TransportAdapters as of v2.12.0.",
"@Lukasa Are there docs on doing so? I'm not finding anything.",
"@dsully Not at this time, though I have written examples [a few times](https://github.com/kennethreitz/requests/issues/3774#issuecomment-267871876)."
] |
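For readers landing here from the final comments: as of requests 2.12.0 an `SSLContext` can be supplied through a transport adapter, though it was undocumented at the time. The sketch below shows the general shape of that approach; the adapter class name is invented here, and it assumes a bundled urllib3 new enough for `PoolManager` to accept an `ssl_context` keyword.

``` python
import ssl

import requests
from requests.adapters import HTTPAdapter


class SSLContextAdapter(HTTPAdapter):
    """Transport adapter that hands a caller-supplied SSLContext to urllib3."""

    def __init__(self, ssl_context=None, **kwargs):
        # Stash the context before HTTPAdapter.__init__ builds the pool manager.
        self._ssl_context = ssl_context
        super(SSLContextAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        if self._ssl_context is not None:
            kwargs['ssl_context'] = self._ssl_context
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)


# Example: restrict connections to forward-secret AES-GCM cipher suites.
context = ssl.create_default_context()
context.set_ciphers('ECDHE+AESGCM')

session = requests.Session()
session.mount('https://', SSLContextAdapter(ssl_context=context))
response = session.get('https://example.org/')
```

This keeps the cipher and protocol policy in the caller's hands, which is the outcome the original issue asked for, without adding new keyword arguments to the top-level API.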
https://api.github.com/repos/psf/requests/issues/2117
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2117/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2117/comments
|
https://api.github.com/repos/psf/requests/issues/2117/events
|
https://github.com/psf/requests/issues/2117
| 36,995,922 |
MDU6SXNzdWUzNjk5NTkyMg==
| 2,117 |
Multipart files unicode filename
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/128982?v=4",
"events_url": "https://api.github.com/users/homm/events{/privacy}",
"followers_url": "https://api.github.com/users/homm/followers",
"following_url": "https://api.github.com/users/homm/following{/other_user}",
"gists_url": "https://api.github.com/users/homm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/homm",
"id": 128982,
"login": "homm",
"node_id": "MDQ6VXNlcjEyODk4Mg==",
"organizations_url": "https://api.github.com/users/homm/orgs",
"received_events_url": "https://api.github.com/users/homm/received_events",
"repos_url": "https://api.github.com/users/homm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/homm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/homm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/homm",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 32 |
2014-07-02T15:53:05Z
|
2014-09-10T18:37:56Z
|
2014-07-02T20:34:47Z
|
CONTRIBUTOR
| null |
Starting from 2.0, requests does not send the `filename` attribute of the `Content-Disposition` header for multipart files with unicode names. Instead, an attribute named `filename*` is sent.
```
# requests 1.2.3
>>> requests.post('http://ya.ru', files={'file': (u'файл', '123')}).request.body
'--db7a9522a6344e26a4ca2933aecad887\r\nContent-Disposition: form-data; name="file"; filename="\xd1\x84\xd0\xb0\xd0\xb9\xd0\xbb"\r\nContent-Type: application/octet-stream\r\n\r\n123\r\n--db7a9522a6344e26a4ca2933aecad887--\r\n'
# requests 2.0
>>> requests.post('http://ya.ru', files={'file': (u'файл', '123')}).request.body
'--a9f0de2871da46df86140bc5b72fc722\r\nContent-Disposition: form-data; name="file"; filename*=utf-8\'\'%D1%84%D0%B0%D0%B9%D0%BB\r\n\r\n123\r\n--a9f0de2871da46df86140bc5b72fc722--\r\n'
```
And this is a big problem, because it looks like some systems do not recognize such fields as files. At least we encountered a problem with Django: Django places the entire file's content in `request.POST` instead of `request.FILES`. It is clear from the sources:
https://github.com/django/django/blob/1.7c1/django/http/multipartparser.py#L599-L601
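For reference, the `filename*` form shown above follows the RFC 5987/2231 extended-parameter syntax: the value is UTF-8 encoded, percent-escaped, and prefixed with the charset. A small standard-library sketch of that encoding, to illustrate what a server has to parse (the helper name is made up):

``` python
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote  # Python 2


def rfc5987_filename_param(value):
    """Render a non-ASCII filename as an RFC 5987 extended parameter."""
    return "filename*=utf-8''" + quote(value.encode('utf-8'), safe='')


print(rfc5987_filename_param(u'файл'))
# filename*=utf-8''%D1%84%D0%B0%D0%B9%D0%BB
```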
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2117/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2117/timeline
| null |
completed
| null | null | false |
[
"This is actually a bug in Django then. Using this syntax is what we're supposed to be using. We have to indicate to the server that we're sending a field whose content is not ASCII or Latin-1 (ISO-8859-1). The proper way to do so is the syntax you see there. It is defined in [Section 3.2.1 of RFC 5987](http://tools.ietf.org/html/rfc5987#section-3.2.1). This bug should be filed against Django instead for not conforming to the proper handling of that value. **Edit** Note specifically that `parameter` is redefined as `reg-parameter` _or_ `ext-parameter` where `ext-parameter` is defined as `parmname` (e.g., `filename`) concatenated with a `*` character followed by = and the `ext-value`. This confirms that this is the proper handling of those header values. I believe this RFC also is applied to MIME header values which are what you're using in your `multipart/form-data` upload.\n",
"I test php 5.5.9 server, and it also does not understand form-data with `filename*` attribute as files. I think requests should be close to real world then to rfc, especially when compromise is possible. How about both attributes?\n",
"It's unclear what the compromise should be. What text encoding should we use?\n",
"Compromise is using both attributes. Encoding is not so important. It can be even `filename=blah.bin`, this will be enough for server to recognize files.\n",
"Filename is hugely important. If we just make random choices, I guarantee that people will simply raise a new bug report claiming that we mangled their filenames.\n\nIf you don't like the way we do it, encode the filename yourself. =)\n",
"Feel the difference: \"I can't even upload file to majority of servers\" and \"server does not understand file name\".\n\nThere are two cases:\n1. server understand unicode. It will use `filename*` property. No problems.\n2. server does not understand unicode. It will not be able to receive files OR It will not be able to recognize name. Wait, it already does not understand unicode, it will not be able to recognize name in any case.\n",
"By the way, before 2.0 requests use some encoding, and as I know everything works well.\n",
"Resist the temptation to say \"Understands unicode\". That phrase is meaningless when it comes to networked connections. The server needs to understand the specific text encoding we've chosen, and there is _no_ consistency here. We _will_ get this wrong, and what happens in this situation is totally unclear. Some servers will decode the text and get lucky, and read the filename as something totally obtuse and unclear, and that would be _terrible_.\n\nI would rather fail clearly and allow the user to make the choice than to guess and get it wrong. This is not PHP.\n",
"> Compromise is using both attributes\n\nHow would using both attributes work? If you read the grammar I linked to, it updates the meaning to say that you can only send one, i.e., either `filename=` or `filename*=`. If you send both, you're sending the wrong thing.\n\n> By the way, before 2.0 requests use some encoding, and as I know everything works well.\n\nIt worked purely by chance that those characters could be encoded to bytes. We were just sending whatever the bytestring was (if you look at the string it is what you would get if you took your original string and called `encode('<encoding>')`. You get the literal bytes and that's what we sent. You could potentially still encode it yourself and then pass that as the filename. You _really_ should only be giving us bytes anyway.\n\nSeeing as this is not a bug, and there is no good reason for us to change how we handle what is given us by the user, I'm closing this.\n",
"> How would using both attributes work? If you read the grammar I linked to, it updates the meaning to say that you can only send one. If you send both, you're sending the wrong thing.\n\nThis is [totally wrong](http://tools.ietf.org/html/rfc5987#section-4.2).\n\n```\nThis specification suggests that a\nparameter using the extended syntax takes precedence. This would\nallow producers to use both formats without breaking recipients that\ndo not understand the extended syntax yet.\n```\n",
"Interesting. I had forgotten that we could send both. Regardless, we can't provide an ASCII representation for you because we can't guess at how to generate ASCII from what you give us. You would still have to generate the value yourself because there's no sane way to provide that functionality via an API.\n",
"When I try to encode filename by myself as requests 1.2.3 does, I've got a error:\n\n``` python\n>>> requests.post('http://ubuntu:8000/', files={'file': (u'файл.png'.encode('utf-8'), '123')})\nUnicodeDecodeError: 'ascii' codec can't decode byte 0xd1 in position 10: ordinal not in range(128)\n```\n\nI can't use just ascii representation for `filename`, because there is no way to send both `filename` with ascii and encoded `filename*`.\n",
"There's so much going on in that line of code I don't know where the breakage is. Can you split that up into multiple lines so I can see which action causes that traceback?\n",
"``` python\n>>> filename = u'файл.png'.encode('utf-8')\n>>> requests.post('http://ubuntu:8000/', files={\n 'file': (filename, '123')\n})\n```\n\n```\n---------------------------------------------------------------------------\nUnicodeDecodeError Traceback (most recent call last)\n<ipython-input-5-42737369472a> in <module>()\n----> 1 import requests; print requests.post('http://ubuntu:8000/', files={'file': (u'файл.png'.encode('utf-8'), '123')})\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/api.pyc in post(url, data, **kwargs)\n 86 \"\"\"\n 87 \n---> 88 return request('post', url, data=data, **kwargs)\n 89 \n 90 \n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/api.pyc in request(method, url, **kwargs)\n 42 \n 43 session = sessions.Session()\n---> 44 return session.request(method=method, url=url, **kwargs)\n 45 \n 46 \n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert)\n 420 hooks = hooks,\n 421 )\n--> 422 prep = self.prepare_request(req)\n 423 \n 424 proxies = proxies or {}\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/sessions.pyc in prepare_request(self, request)\n 358 auth=merge_setting(auth, self.auth),\n 359 cookies=merged_cookies,\n--> 360 hooks=merge_hooks(request.hooks, self.hooks),\n 361 )\n 362 return p\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/models.pyc in prepare(self, method, url, headers, files, data, params, auth, cookies, hooks)\n 295 self.prepare_headers(headers)\n 296 self.prepare_cookies(cookies)\n--> 297 self.prepare_body(data, files)\n 298 self.prepare_auth(auth, url)\n 299 # Note that prepare_auth must be last to enable authentication schemes\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/models.pyc in prepare_body(self, data, files)\n 430 # Multi-part file uploads.\n 431 if files:\n--> 432 (body, content_type) = self._encode_files(files, data)\n 433 else:\n 434 if data:\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/models.pyc in _encode_files(files, data)\n 147 rf = RequestField(name=k, data=fp.read(),\n 148 filename=fn, headers=fh)\n--> 149 rf.make_multipart(content_type=ft)\n 150 new_fields.append(rf)\n 151 \n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.pyc in make_multipart(self, content_disposition, content_type, content_location)\n 173 \"\"\"\n 174 self.headers['Content-Disposition'] = content_disposition or 'form-data'\n--> 175 self.headers['Content-Disposition'] += '; '.join(['', self._render_parts((('name', self._name), ('filename', self._filename)))])\n 176 self.headers['Content-Type'] = content_type\n 177 self.headers['Content-Location'] = content_location\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.pyc in _render_parts(self, header_parts)\n 136 for name, value in iterable:\n 137 if value:\n--> 138 parts.append(self._render_part(name, value))\n 139 \n 140 return '; '.join(parts)\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.pyc in _render_part(self, name, value)\n 116 The value of the parameter, provided as a unicode string.\n 117 \"\"\"\n--> 118 return format_header_param(name, value)\n 119 \n 120 def _render_parts(self, 
header_parts):\n\n/home/homm/env/env_ucare/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.pyc in format_header_param(name, value)\n 41 result = '%s=\"%s\"' % (name, value)\n 42 try:\n---> 43 result.encode('ascii')\n 44 except UnicodeEncodeError:\n 45 pass\n\nUnicodeDecodeError: 'ascii' codec can't decode byte 0xd1 in position 10: ordinal not in range(128)\n```\n",
"Hmm, that really looks like that should catch `UnicodeDecodeError`s as well, to allow for exactly this case. Or, even more accurately, should check the type of the string, and not bother encoding bytestrings.\n",
"Recommend we open a bug report on urllib3 if @shazow agrees.\n",
"I have checked Tornado and Werkzeug, no one of them don't know about `filename*`. Do you know any web server who reads rfc in same way as you?\n",
"Rails as well.\n",
"I know that all the major browser user agents do, [see here](http://stackoverflow.com/questions/7967079/special-characters-in-content-disposition-filename).\n",
"If you're doubtful whether that RFC applies to it, the same grammar is used in [Section 4 of RFC 2231](http://tools.ietf.org/html/rfc2231#section-4) which is more directly related to MIME headers and encoding non-ASCII values in them.\n",
"There's also a related discussion on the [Django ticketing system](https://code.djangoproject.com/ticket/20147) that shows the team knows this is something they have to do but hasn't acted on (in over a year).\n",
"And for the record, it isn't just a rails issue, it is a [rack problem](https://github.com/rack/rack/blob/master/lib/rack/multipart/parser.rb) (which does not have a corresponding issue, yet).\n",
"Until issues there are in majority of web servers, the issue is in requests.\n",
"@homm What is your proposal for requests? The best option I've seen so far is to send both `filename` and `filename*` in that order. If that happens, what should we put in `filename`?\n",
"@homm you're inherently wrong. Rack _claims_ to support RFC 2231, this is clearly a failure to properly do so given how clear 2231 (and 5987) are in their grammar. You claim earlier versions of requests do not cause the problem, the solution to your problem then is to clearly use old versions since you seem to think broken behaviour is correct.\n",
"@sigmavirus24 I am open to sending both `filename` and `filename*`, this should be supported, but I don't know how we'd populate `filename`.\n",
"@Lukasa\n1) whatever is enough for file recognition for not compliant server and does not brake others (because \"specification suggests that a parameter using the extended syntax takes precedence\").\n2) it can be value without any non-ascii chars and \"unknown\" as fallback.\n3) should be a way to send encoded `filename` if I exactly know how server handles encoded values.\n",
"@sigmavirus24 I can't use old requests version because I need SNI on python 2 :(\n",
"The problem is that 1) and 2) are wildly ambiguous. Suppose you provide the string `u\"使用的是\"`. What should our fallback name be? How can we determine it? What do we do when a user wants to provide that fallback name themselves?\n",
"@Lukasa ostensibly through the toolbelt since that's the only place where users have complete control over the fields but even so, this would probably need to be worked on in urllib3 because I'm fairly certain it doesn't currently provide a way to do this.\n\nFurther RFC 2231 is the authority on this since 5987 (as I've re-read it) is for HTTP Headers (not MIME/multipart headers) and 2231 does not allow for multiple parameters (on quick glance over it)\n\n---\n\n@homm \n\n> 2) it can be value without any non-ascii chars and \"unknown\" as fallback.\n\nThis is not an accurate representation in the slightest. At best it will introduce utter confusion.\n\n> 3) should be a way to send encoded filename if I exactly know how server handles encoded values.\n\nYou can never know exactly how any server handles these values unless you've written it from scratch yourself.\n"
] |
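The "send both parameters" compromise discussed above can be assembled by hand with the urllib3 multipart helpers vendored inside requests 2.x. This is only a sketch: the helper name is invented, the ASCII fallback name is chosen by the caller, and the `filename*` value is assumed to have been encoded already (for example with the RFC 5987 helper sketched earlier).

``` python
import requests
from requests.packages.urllib3.fields import RequestField
from requests.packages.urllib3.filepost import encode_multipart_formdata


def field_with_both_filenames(name, data, ascii_name, filename_star_param):
    """Build a form-data field carrying both filename and filename* parameters."""
    rf = RequestField(name=name, data=data)
    rf.headers['Content-Disposition'] = (
        'form-data; name="%s"; filename="%s"; %s'
        % (name, ascii_name, filename_star_param)
    )
    rf.headers['Content-Type'] = 'application/octet-stream'
    return rf


field = field_with_both_filenames(
    'file', b'123',
    ascii_name='fallback.bin',
    filename_star_param="filename*=utf-8''%D1%84%D0%B0%D0%B9%D0%BB",
)
body, content_type = encode_multipart_formdata([field])
requests.post('http://example.com/upload',
              data=body, headers={'Content-Type': content_type})
```

Per RFC 5987, a recipient that understands the extended syntax prefers `filename*`, while older servers fall back to the plain `filename`.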
https://api.github.com/repos/psf/requests/issues/2116
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2116/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2116/comments
|
https://api.github.com/repos/psf/requests/issues/2116/events
|
https://github.com/psf/requests/pull/2116
| 36,994,624 |
MDExOlB1bGxSZXF1ZXN0MTc4OTM3MDc=
| 2,116 |
Replace dead link with StackOverflow answer
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 1 |
2014-07-02T15:39:46Z
|
2021-09-08T23:05:22Z
|
2014-07-02T15:41:14Z
|
CONTRIBUTOR
|
resolved
|
Fix #2107
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2116/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2116/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2116.diff",
"html_url": "https://github.com/psf/requests/pull/2116",
"merged_at": "2014-07-02T15:41:14Z",
"patch_url": "https://github.com/psf/requests/pull/2116.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2116"
}
| true |
[
":cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2115
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2115/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2115/comments
|
https://api.github.com/repos/psf/requests/issues/2115/events
|
https://github.com/psf/requests/pull/2115
| 36,993,298 |
MDExOlB1bGxSZXF1ZXN0MTc4OTI4MzE=
| 2,115 |
Update how we check verify when inspecting env variables
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 6 |
2014-07-02T15:26:36Z
|
2021-09-08T23:05:18Z
|
2014-07-03T13:35:55Z
|
CONTRIBUTOR
|
resolved
|
Closes #2092
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2115/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2115/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2115.diff",
"html_url": "https://github.com/psf/requests/pull/2115",
"merged_at": "2014-07-03T13:35:55Z",
"patch_url": "https://github.com/psf/requests/pull/2115.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2115"
}
| true |
[
"Assigned to @Lukasa for review\n",
"Notes inline. =)\n",
"LGTM. :star: :cake:\n",
"We should do `if verify`, not `if verify is True`.\n",
"@kennethreitz Disagree: verify may be set to a string (path to the cert bundle) and we shouldn't overwrite it.\n",
"Ah, gotcha. Good catch :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2114
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2114/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2114/comments
|
https://api.github.com/repos/psf/requests/issues/2114/events
|
https://github.com/psf/requests/pull/2114
| 36,926,170 |
MDExOlB1bGxSZXF1ZXN0MTc4NTEzOTU=
| 2,114 |
Update sessions.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2058626?v=4",
"events_url": "https://api.github.com/users/manizzle/events{/privacy}",
"followers_url": "https://api.github.com/users/manizzle/followers",
"following_url": "https://api.github.com/users/manizzle/following{/other_user}",
"gists_url": "https://api.github.com/users/manizzle/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/manizzle",
"id": 2058626,
"login": "manizzle",
"node_id": "MDQ6VXNlcjIwNTg2MjY=",
"organizations_url": "https://api.github.com/users/manizzle/orgs",
"received_events_url": "https://api.github.com/users/manizzle/received_events",
"repos_url": "https://api.github.com/users/manizzle/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/manizzle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manizzle/subscriptions",
"type": "User",
"url": "https://api.github.com/users/manizzle",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-07-01T19:54:49Z
|
2021-09-08T23:05:24Z
|
2014-07-01T19:56:29Z
|
NONE
|
resolved
|
allow Session to set a timeout for requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2114/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2114/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2114.diff",
"html_url": "https://github.com/psf/requests/pull/2114",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2114.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2114"
}
| true |
[
"Thanks for this!\n\nUnfortunately, this is a feature request that comes up often but that we don't have any real interest in. See #2011 for a longer form justification. For this reason, I'm afraid that we can't accept this pull request. Thanks so much for the work though, and please do keep contributing! :cake:\n",
"Thanks. The documentation should probably updated. http://docs.python-requests.org/en/v0.10.6/api/#requests.Session\n",
"You just linked to the v0.10.6 documentation. Try looking at the latest version: http://docs.python-requests.org/en/latest/api/#requests.Session\n",
"@manizzle you're linking to an ancient version of the documentation (you can see it in the URL `v0.10.6`). The current documentation is fine: http://docs.python-requests.org/en/latest/user/advanced/#session-objects\n",
"Okay thanks\n"
] |
https://api.github.com/repos/psf/requests/issues/2113
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2113/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2113/comments
|
https://api.github.com/repos/psf/requests/issues/2113/events
|
https://github.com/psf/requests/issues/2113
| 36,921,130 |
MDU6SXNzdWUzNjkyMTEzMA==
| 2,113 |
TypeError with pyOpenSSL, unicode headers, and python 2.x
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/39916?v=4",
"events_url": "https://api.github.com/users/joeshaw/events{/privacy}",
"followers_url": "https://api.github.com/users/joeshaw/followers",
"following_url": "https://api.github.com/users/joeshaw/following{/other_user}",
"gists_url": "https://api.github.com/users/joeshaw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/joeshaw",
"id": 39916,
"login": "joeshaw",
"node_id": "MDQ6VXNlcjM5OTE2",
"organizations_url": "https://api.github.com/users/joeshaw/orgs",
"received_events_url": "https://api.github.com/users/joeshaw/received_events",
"repos_url": "https://api.github.com/users/joeshaw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/joeshaw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joeshaw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/joeshaw",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2014-07-01T18:51:54Z
|
2021-09-08T23:10:53Z
|
2014-07-01T19:25:27Z
|
NONE
|
resolved
|
```
>>> requests.get("https://google.com", headers={"X-Foo": u"foo"})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/adapters.py", line 327, in send
timeout=timeout
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/packages/urllib3/connectionpool.py", line 493, in urlopen
body=body, headers=headers)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/packages/urllib3/connectionpool.py", line 291, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python2.6/httplib.py", line 910, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.6/httplib.py", line 947, in _send_request
self.endheaders()
File "/usr/lib/python2.6/httplib.py", line 904, in endheaders
self._send_output()
File "/usr/lib/python2.6/httplib.py", line 776, in _send_output
self.send(msg)
File "/usr/lib/python2.6/httplib.py", line 755, in send
self.sock.sendall(str)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 353, in sendall
return self.connection.sendall(data)
File "/home/buildbot/joe-wip/virtualenv/lib/python2.6/site-packages/OpenSSL/SSL.py", line 969, in sendall
raise TypeError("buf must be a byte string")
TypeError: buf must be a byte string
```
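A minimal workaround sketch, assuming the header values are known to be ASCII-safe (as they are in this report): on Python 2, encode each unicode header value to a byte string before handing it to requests, as the maintainers suggest in the comments below. The helper name, the encoding choice, and the call are illustrative, not part of requests.

``` python
import requests

def encode_headers(headers, encoding='ascii'):
    # Hypothetical helper: turn unicode values into byte strings so the
    # pyOpenSSL-backed socket layer never sees a unicode buffer.
    return dict(
        (key, value.encode(encoding) if isinstance(value, unicode) else value)
        for key, value in headers.items()
    )

requests.get("https://google.com", headers=encode_headers({"X-Foo": u"foo"}))
```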
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2113/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2113/timeline
| null |
completed
| null | null | false |
[
"By comparison, both `requests.get(\"http://google.com\", headers={\"X-Foo\": u\"foo\"})` and `requests.get(\"https://google.com\", headers={\"X-Foo\": \"foo\"})` work.\n",
"We don't expect you to set unicode headers on Python 2. It's generally a bad idea anyway because the text encoding to use is really unclear: what bytes does that header turn into? The fact that it works in other requests is simple good luck, and you'll find that if you put non-ASCII characters into your unicode string it will also fail.\n\nOn Python 2 you should be encoding all your strings to bytestrings before passing them to requests. =)\n",
"@joeshaw do you mean to say this works without unicode headers with pyOpenSSL?\n",
"It would be nice if the validation happened at a higher level than the SSL layer. It feels wrong that the error happens inconsistently across HTTP and HTTPs, in such a low-level. (And I would assume utf-8 is a reasonable encoding for such things.)\n",
"@sigmavirus24 Both without unicode headers against HTTPS and with unicode headers against HTTP. In my particular case the data I am sending is always ASCII, it just happens to be a unicode object because i got it from a different library that always returns unicode objects.\n",
"@joeshaw True, but as you've pointed out, it will break lots of currently working code. Even though that code is in error, I'm reluctant to break it because it does _happen_ to work. This is the reason Python 3 happened in the first place. =)\n\nAs for assuming UTF-8 is valid, you would be wrong. =) There is no clear ruling on the valid textual encoding for headers. Generally, ASCII is all you can _guarantee_ to be safe. Everything from then on is difficult to know. Better to make the user do this.\n",
"@joeshaw the fix is to call `string.encode` on each string that library gives you. You also seem to know the encoding (something requests should never guess at), so you can get the string from that library and then simply call `s.encode('utf-8')` on it.\n\nAs @Lukasa the only totally safe header encoding is ASCII. The next safest would be Latin-1 (ISO-8859-1). Other than that, there's not much we can do for you.\n",
"Ok, that's fair. The inconsistency between SSL (pyOpenSSL, specifically, not the built-in SSL) and non-SSL is the most frustrating thing, and the thing that seems most like a bug to me. This all started because a third-party service I was using switched to SNI.\n",
"Yeah, that inconsistency is a general pain in the neck. PyOpenSSL is currently looking to fix some of that up (see pyca/pyopenssl#15), so hopefully their 0.15 release will improve matters. If not, we might work on adding it to backports.ssl.\n"
] |
https://api.github.com/repos/psf/requests/issues/2112
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2112/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2112/comments
|
https://api.github.com/repos/psf/requests/issues/2112/events
|
https://github.com/psf/requests/pull/2112
| 36,745,605 |
MDExOlB1bGxSZXF1ZXN0MTc3NDY4NjY=
| 2,112 |
Update quickstart.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7949405?v=4",
"events_url": "https://api.github.com/users/np-csu/events{/privacy}",
"followers_url": "https://api.github.com/users/np-csu/followers",
"following_url": "https://api.github.com/users/np-csu/following{/other_user}",
"gists_url": "https://api.github.com/users/np-csu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/np-csu",
"id": 7949405,
"login": "np-csu",
"node_id": "MDQ6VXNlcjc5NDk0MDU=",
"organizations_url": "https://api.github.com/users/np-csu/orgs",
"received_events_url": "https://api.github.com/users/np-csu/received_events",
"repos_url": "https://api.github.com/users/np-csu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/np-csu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/np-csu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/np-csu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-06-29T13:08:07Z
|
2021-09-08T23:06:05Z
|
2014-06-29T13:31:23Z
|
CONTRIBUTOR
|
resolved
|
line 394: correct an input error.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2112/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2112/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2112.diff",
"html_url": "https://github.com/psf/requests/pull/2112",
"merged_at": "2014-06-29T13:31:23Z",
"patch_url": "https://github.com/psf/requests/pull/2112.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2112"
}
| true |
[
"Looks good to me! Thanks! :cake: \n"
] |
https://api.github.com/repos/psf/requests/issues/2111
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2111/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2111/comments
|
https://api.github.com/repos/psf/requests/issues/2111/events
|
https://github.com/psf/requests/issues/2111
| 36,588,596 |
MDU6SXNzdWUzNjU4ODU5Ng==
| 2,111 |
Unhelpful Error for large timeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1186124?v=4",
"events_url": "https://api.github.com/users/saulshanabrook/events{/privacy}",
"followers_url": "https://api.github.com/users/saulshanabrook/followers",
"following_url": "https://api.github.com/users/saulshanabrook/following{/other_user}",
"gists_url": "https://api.github.com/users/saulshanabrook/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/saulshanabrook",
"id": 1186124,
"login": "saulshanabrook",
"node_id": "MDQ6VXNlcjExODYxMjQ=",
"organizations_url": "https://api.github.com/users/saulshanabrook/orgs",
"received_events_url": "https://api.github.com/users/saulshanabrook/received_events",
"repos_url": "https://api.github.com/users/saulshanabrook/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/saulshanabrook/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saulshanabrook/subscriptions",
"type": "User",
"url": "https://api.github.com/users/saulshanabrook",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
}
] |
closed
| true | null |
[] | null | 8 |
2014-06-26T15:47:17Z
|
2021-09-08T23:10:53Z
|
2014-06-26T18:41:01Z
|
NONE
|
resolved
|
Requests should give a better error than `requests.exceptions.ConnectionError: HTTPConnectionPool(host='httpbin.org', port=80): Max retries exceeded with url: /get (Caused by <class 'OSError'>: [Errno 22] Invalid argument)` for inputting a timeout that is too large. Right now it doesn't specify that the timeout caused the error, even though it does mention that there is an `Invalid argument` somewhere...
``` python
import requests
r = requests.get('http://httpbin.org/get', timeout=99**99)
```
```
Traceback (most recent call last):
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 493, in urlopen
body=body, headers=headers)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 291, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 1066, in request
self._send_request(method, url, body, headers)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 1104, in _send_request
self.endheaders(body)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 1062, in endheaders
self._send_output(message_body)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 907, in _send_output
self.send(msg)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 842, in send
self.connect()
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 106, in connect
conn = self._new_conn()
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 90, in _new_conn
(self.host, self.port), self.timeout, *extra_args)
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 509, in create_connection
raise err
File "/usr/local/Cellar/python3/3.4.0/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 500, in create_connection
sock.connect(sa)
OSError: [Errno 22] Invalid argument
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/adapters.py", line 327, in send
timeout=timeout
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 543, in urlopen
raise MaxRetryError(self, url, e)
requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='httpbin.org', port=80): Max retries exceeded with url: /get (Caused by <class 'OSError'>: [Errno 22] Invalid argument)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "test_error.py", line 4, in <module>
r = requests.get('http://httpbin.org/get', timeout=99**99)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/Users/saul/.virtualenvs/finance-scraper3/lib/python3.4/site-packages/requests/adapters.py", line 375, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='httpbin.org', port=80): Max retries exceeded with url: /get (Caused by <class 'OSError'>: [Errno 22] Invalid argument)
```
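A hedged workaround sketch, not something requests provides: cap the timeout at an application-chosen maximum and catch the `ConnectionError` so the failure is reported in terms of the timeout. `MAX_TIMEOUT` and `safe_get` are made-up names used only for illustration.

``` python
import requests

MAX_TIMEOUT = 60 * 60 * 24  # arbitrary one-day cap, chosen only for illustration

def safe_get(url, timeout):
    try:
        return requests.get(url, timeout=min(timeout, MAX_TIMEOUT))
    except requests.exceptions.ConnectionError as exc:
        raise ValueError('request failed, possibly because of an invalid timeout: %s' % exc)

r = safe_get('http://httpbin.org/get', timeout=99**99)
```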
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1186124?v=4",
"events_url": "https://api.github.com/users/saulshanabrook/events{/privacy}",
"followers_url": "https://api.github.com/users/saulshanabrook/followers",
"following_url": "https://api.github.com/users/saulshanabrook/following{/other_user}",
"gists_url": "https://api.github.com/users/saulshanabrook/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/saulshanabrook",
"id": 1186124,
"login": "saulshanabrook",
"node_id": "MDQ6VXNlcjExODYxMjQ=",
"organizations_url": "https://api.github.com/users/saulshanabrook/orgs",
"received_events_url": "https://api.github.com/users/saulshanabrook/received_events",
"repos_url": "https://api.github.com/users/saulshanabrook/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/saulshanabrook/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saulshanabrook/subscriptions",
"type": "User",
"url": "https://api.github.com/users/saulshanabrook",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2111/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2111/timeline
| null |
completed
| null | null | false |
[
"Here's the thing, we don't ever see that `OSError` ourselves. It isn't that we're choosing to raise an unhelpful error, we're raising the error we should be handling. If the socket raises an error, urllib3 catches that and raises a different error (as it should) which we then catch and then use to raise `ConnectionError`. The fact is that we couldn't establish a connection, we cannot introspect every possible cause for that and we certainly cannot introspect the fact that you chose far too large a timeout value. Luckily for you, you are using Python 3 and it helps clarify for you exactly what you did wrong (which you did since you told me what was wrong with your code).\n\nThere isn't much we can do in this respect. We're not going to prevent users (like yourself) from shooting themselves in the feet (which precludes us from attempting to determine a maximum timeout value and check for it) and we can't reasonably introspect ever socket-level exception to provide you with something more helpful.\n",
"The other issue is that the socket level error is actually coming from the C layer and is hopelessly generic. You've pointed out that it says \"Invalid Argument\". There are plenty of other arguments to those calls that could have been invalid. The absolutely number one best possible thing we could do is _guess_ at which argument is invalid, which is pretty bad.\n",
"@saulshanabrook all of this isn't to say that your request isn't a reasonable one, it's just that there's no practical way to implement it. This is something most of us would _love_ to have. The problem is that there is no way to do this that will be consistent, reliable, and reasonable. This kind of introspection/inference is totally out of the scope of requests unfortunately.\n",
"Unless @kennethreitz disagrees, I think this can be closed.\n",
"Thank you all so much for the explanations, I tried to dive into the socket C code to see where the actual exception is raised to see how it determines whether a timeout is too large, but I couldn't figure it out.\n\nMy only other question is if you guys think this problem is worth mentioning somewhere in the docs. Since I couldn't find anyone else complaining about it, it might not be very common to set timeout too large.\n",
"@saulshanabrook the docs aren't meant to be comprehensive, so no this isn't really worth mentioning in the docs. It would be worth writing a blog post about if you feel so inclined. With that indexed by Google, you'll be helping other people with the blog post though. (And sometimes the best documentation is third-party documentation written by people who have experienced problems and are passionate about explaining their solutions or discoveries.)\n",
"@sigmavirus24 I just took you up on your suggestion http://www.saulshanabrook.com/requests-timeout-and-odd-traceback/\n",
"I'll be tweeting it shortly @saulshanabrook looks like a good post. :+1: \n"
] |
https://api.github.com/repos/psf/requests/issues/2110
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2110/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2110/comments
|
https://api.github.com/repos/psf/requests/issues/2110/events
|
https://github.com/psf/requests/issues/2110
| 36,403,819 |
MDU6SXNzdWUzNjQwMzgxOQ==
| 2,110 |
Requests stops respecting configured proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4",
"events_url": "https://api.github.com/users/dstufft/events{/privacy}",
"followers_url": "https://api.github.com/users/dstufft/followers",
"following_url": "https://api.github.com/users/dstufft/following{/other_user}",
"gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dstufft",
"id": 145979,
"login": "dstufft",
"node_id": "MDQ6VXNlcjE0NTk3OQ==",
"organizations_url": "https://api.github.com/users/dstufft/orgs",
"received_events_url": "https://api.github.com/users/dstufft/received_events",
"repos_url": "https://api.github.com/users/dstufft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dstufft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dstufft",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2014-06-24T16:58:32Z
|
2021-06-30T23:43:06Z
|
2014-06-24T17:30:24Z
|
CONTRIBUTOR
|
resolved
|
There is an open bug report on pip (pypa/pip#1805); after some back and forth, this appears to be a legit requests (or maybe urllib3) bug.
Using requests 2.3.0 I had one of the people run a script like this:
``` python
import requests
session = requests.Session()
for i in range(100):
print(i)
session.get("https://pypi.python.org/simple/").content
```
They said they had both `http_proxy` and `https_proxy` environment variables set up, and when they ran it they got this output: https://dpaste.de/doVg/raw
On the pip ticket people have said they got this while configuring the proxy via `--proxy`, which translates to `session.proxies = {"https": "<value>", "http": "<value>"}`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2110/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2110/timeline
| null |
completed
| null | null | false |
[
"I wonder if this has anything to do with Connection Pooling. @shazow?\n",
"In fact, the weird increments this is happening at suggest it. Your ticket has it happening at ~16, this one is coming at 51. Very suspicious.\n",
"That said, we use a pool size of 10 by default, which isn't really close to either of those.\n",
"For the record, there is roughly 2-3 HTTP requests made for every package installed, so that 16 is more like 32-48 or so. It might even be 50 depending on different factors.\n",
"Interesting...\n",
"Right, so in the trivial case I can't reproduce this using `mitmproxy`, though `mitmproxy` is a bad example in this case because it obviously MITMs the request.\n",
"Might be interesting to get the person who can repro this bug to turn connection pooling off (create a `HTTPAdapter` with the pool size set to 1).\n",
"They ate kibad on irc. I think they joined the requests irc channel\n\n> On Jun 24, 2014, at 1:10 PM, Cory Benfield [email protected] wrote:\n> \n> Might be interesting to get the person who can repro this bug to turn connection pooling off (create a HTTPAdapter with the pool size set to 1).\n> \n> —\n> Reply to this email directly or view it on GitHub.\n",
"Let's try this then:\n\n``` python\nimport requests\nfrom requests.adapters import HTTPAdapter\nsession = requests.Session()\ns.mount('http://', HTTPAdapter(pool_maxsize=1))\ns.mount('https://', HTTPAdapter(pool_maxsize=1))\nfor i in range(100):\n print(i)\n session.get(\"https://pypi.python.org/simple/\").content\n```\n\nI'd be interested to see how this affects the result.\n",
"Changing the authority (from PyPI to elsewhere) seems to fix the bug. This suggests an interaction between PyPI and the user's proxy. We'll close this for now, but we should be alert to the idea that this issue might come back to us.\n",
"It could be proxy-retry-related. I'm not sure off the top of my head how we handle retrying when proxies are at play (do we retry proxy timeouts?), and not sure if Requests does its own thing for that too. (Just a thought.)\n",
"Thanks for quickly helping in nailing down this issue.\n",
"Configured local pypi mirror devpi-server and tried creating venv using tox. Ended up with read time out error. Log can be found here https://dpaste.de/d8yw/raw\n",
"## Should've finished reading that long pypa/pip thread, sorry for the noise…\n\nThis emulates what devpi does, and fails reliably _after_ index 50 (i.e. in the 52nd request). With \"Connection: close\" added, it does so in the 2nd request, which might hint at some relation to persistent connections, and the \"magic\" number (here: 51±1) could be MaxKeepAliveRequests or similar in the proxy.\n\nAlso, GET requests (for the PyPI index, and BLOBs) on the same installation pose no problem at all – it's just the XMLRPC POSTs.\n\n``` py\nimport sys, os\n\ntry:\n import xmlrpc.client as xmlrpc\nexcept ImportError:\n import xmlrpclib as xmlrpc\n\nimport requests\nfrom requests.adapters import HTTPAdapter\n\nagent = \"devpi-%s/%s\" % (\"devpi\", \"2.0.2\")\nagent += \" (py%s; %s)\" % (sys.version.split()[0], sys.platform)\nsession = requests.Session()\nsession.headers[\"user-agent\"] = agent\nsession.headers[\"content-type\"] = \"text/xml\"\nsession.headers[\"Accept\"] = \"text/xml\"\n#session.headers[\"Connection\"] = \"close\"\n#session.mount('http://', HTTPAdapter(pool_maxsize=1))\n#session.mount('https://', HTTPAdapter(pool_maxsize=1))\n\nfor i in range(55):\n payload = xmlrpc.dumps((1201322,), \"changelog_since_serial\")\n reply = session.post(\"https://pypi.python.org/pypi/\", data=payload, stream=False, timeout=5.0)\n print i, len(reply.content)\n```\n",
"I know this is old & closed, but the problem persists. Please tell me what you mean by \"changing the authority (from PyPI to elsewhere) seems to fix the bug\". I'm just trying to install packages via pip3. Can I \"change the authority\"?",
"The authority is the broader term of the URI which includes user information/authentication, host/IPv4/IPv6, and port. The point about changing the authority was that there were no issues communicating with a different service than PyPI (changing the authority) and that indicated an issue between the proxy and PyPI."
] |
https://api.github.com/repos/psf/requests/issues/2109
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2109/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2109/comments
|
https://api.github.com/repos/psf/requests/issues/2109/events
|
https://github.com/psf/requests/issues/2109
| 36,326,374 |
MDU6SXNzdWUzNjMyNjM3NA==
| 2,109 |
[idea] Change how we merge request and session settings
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 0 |
2014-06-23T20:09:09Z
|
2014-10-05T17:15:42Z
| null |
CONTRIBUTOR
| null |
# Problem
Currently, tools (e.g., [openstack/python-swiftclient](https://github.com/openstack/python-swiftclient/blob/3d0de79e26e2aa6285742c60aca3c164e9c2fbb9/swiftclient/client.py#L942..L945)) fight how requests sets the `Content-Type` header for a request. Notice that if the user of swiftclient doesn't provide their own `content_type` then the library sets the header's value to `''`. Ideally, setting `None` in a situation like this (where the Session has no default `Content-Type`) would prevent the request preparation from setting its own `Content-Type` header. This doesn't work because the per-session and per-request settings are [merged at the session level](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L361).
# Potential solutions
## Split the responsibility of merging settings
The merge at the session level could just take care of ensuring that the per-request settings have priority while waiting for the request preparation to remove `None`s. The downside of this behaviour is that it breaks backwards compatibility. Why? Consider the following:
``` python
import requests
s = requests.Session()
s.headers['Content-Type'] = 'application/json'
s.post(url, data={'some': 'formdata'}, headers={'Content-Type': None})
```
This allows for requests to detect that it is in fact `application/x-www-form-urlencoded` by removing the default set by the session. Keeping backwards compatibility would mean that we would have to do extra work while merging settings on the session-level. We would have to check if the header was set on the session level, and then we would delete headers with value `None`, otherwise, persist the `None` value. This is clearly a lot of extra logic.
## Create a constant/singleton to mean "Do not autogenerate any of this for me"
We could potentially expose a new constant or singleton in requests to allow users to specify when they want to prevent requests from handling it for them. I dislike this idea a lot, but it's potentially easier to implement and a bit more explicit than overloading the already overloaded meaning of `None` in this context.
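A rough sketch of the first option, assuming nothing about the real merging code in `sessions.py`: the session-level merge only gives per-request settings priority, and `None` values are stripped later, at request-preparation time. The function names here are invented for illustration.

``` python
def merge_with_priority(session_settings, request_settings):
    # Session-level merge: per-request values win, Nones are left alone.
    merged = dict(session_settings or {})
    merged.update(request_settings or {})
    return merged

def strip_nones(settings):
    # Done at request-preparation time under this proposal.
    return dict((k, v) for k, v in settings.items() if v is not None)

session_headers = {'Content-Type': 'application/json'}
request_headers = {'Content-Type': None}
prepared = strip_nones(merge_with_priority(session_headers, request_headers))
assert 'Content-Type' not in prepared  # preparation would then autodetect the type
```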
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2109/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2109/timeline
| null | null | null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2108
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2108/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2108/comments
|
https://api.github.com/repos/psf/requests/issues/2108/events
|
https://github.com/psf/requests/issues/2108
| 36,270,981 |
MDU6SXNzdWUzNjI3MDk4MQ==
| 2,108 |
gevent ssl and WantWrite error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/489439?v=4",
"events_url": "https://api.github.com/users/vitek/events{/privacy}",
"followers_url": "https://api.github.com/users/vitek/followers",
"following_url": "https://api.github.com/users/vitek/following{/other_user}",
"gists_url": "https://api.github.com/users/vitek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vitek",
"id": 489439,
"login": "vitek",
"node_id": "MDQ6VXNlcjQ4OTQzOQ==",
"organizations_url": "https://api.github.com/users/vitek/orgs",
"received_events_url": "https://api.github.com/users/vitek/received_events",
"repos_url": "https://api.github.com/users/vitek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vitek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vitek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vitek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-06-23T08:29:40Z
|
2021-09-08T23:10:54Z
|
2014-06-23T09:34:55Z
|
NONE
|
resolved
|
Hi!
Using `gevent` with `pyOpenSSL 0.14` I got an `OpenSSL.SSL.WantWriteError`. Here is a simple script to reproduce the issue:
```
from gevent import monkey
monkey.patch_all()
import requests
payload = 'x' * 1024 * 1000
requests.get('https://example.org/', data=payload, verify=False)
```
As a workaround you can use `pyOpenSSL 0.13`. BTW, it seems to me that the SSL `WrappedSocket.sendall()` method should do the same thing it does for the SSL handshake, but that seems to be a problem since pyOpenSSL doesn't report the number of bytes already sent. I got `OpenSSL.SSL.Error: [('SSL routines', 'SSL3_WRITE_PENDING', 'bad write retry')]` on retry.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2108/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2108/timeline
| null |
completed
| null | null | false |
[
"Here is possible fix:\n\n```\nclass WrappedSocket(object):\n...\n...\n def send_until_done(self, data):\n while True:\n try:\n return self.connection.send(data)\n except OpenSSL.SSL.WantWriteError:\n # TODO: implement timeouts\n select.select([], [self.socket], [])\n continue\n\n def sendall(self, data):\n while len(data):\n sent = self.send_until_done(data)\n data = data[sent:]\n```\n",
"Hang on: if this works for PyOpenSSL 0.13 and breaks for PyOpenSSL 0.14, in what way is this our bug? (Just want to clarify my understanding, I'm not saying it's not our bug. Yet.)\n",
"I'm not sure about pyOpenSSL but it seems to me that in 0.14 they switched to ffi instead of C extension module. Also I'm not sure that it works correct in 0.13 as it handles WantWriteError somewhere inside pyOpenSSL C extension module bypassing `gevent's` select.\n",
"Sorry, I'm wrong 0.13 is also affected.\n",
"@Lukasa: anyway it's urllib3 related issue\n",
"Aha, if it's a urllib3 related issue can we open it over there?\n",
"Started new issue https://github.com/shazow/urllib3/issues/412\n"
] |
https://api.github.com/repos/psf/requests/issues/2107
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2107/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2107/comments
|
https://api.github.com/repos/psf/requests/issues/2107/events
|
https://github.com/psf/requests/issues/2107
| 36,269,358 |
MDU6SXNzdWUzNjI2OTM1OA==
| 2,107 |
Obsolete link address in docs/user/install.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/731266?v=4",
"events_url": "https://api.github.com/users/piglei/events{/privacy}",
"followers_url": "https://api.github.com/users/piglei/followers",
"following_url": "https://api.github.com/users/piglei/following{/other_user}",
"gists_url": "https://api.github.com/users/piglei/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/piglei",
"id": 731266,
"login": "piglei",
"node_id": "MDQ6VXNlcjczMTI2Ng==",
"organizations_url": "https://api.github.com/users/piglei/orgs",
"received_events_url": "https://api.github.com/users/piglei/received_events",
"repos_url": "https://api.github.com/users/piglei/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/piglei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/piglei/subscriptions",
"type": "User",
"url": "https://api.github.com/users/piglei",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-06-23T08:04:02Z
|
2021-09-08T23:10:53Z
|
2014-07-02T15:41:14Z
|
NONE
|
resolved
|
Hi, the link which describes the difference between pip and easy_install in `docs/user/install.rst` is obsolete.
```
But, you really `shouldn't do that <http://www.pip-installer.org/en/latest/other-tools.html#pip-compared-to-easy-install>`_.
```
But I found that this page is not available in pip's documentation any more; maybe we should point this link somewhere else instead?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2107/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2107/timeline
| null |
completed
| null | null | false |
[
"Good spot, we should fix that up. Maybe [this StackOverflow answer](http://stackoverflow.com/questions/3220404/why-use-pip-over-easy-install)?\n",
"I think that StackOverflow answer is good. :smile:\n"
] |
https://api.github.com/repos/psf/requests/issues/2106
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2106/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2106/comments
|
https://api.github.com/repos/psf/requests/issues/2106/events
|
https://github.com/psf/requests/pull/2106
| 36,260,330 |
MDExOlB1bGxSZXF1ZXN0MTc0NTU4ODg=
| 2,106 |
Update test_expires_valid_str in test_request.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/936479?v=4",
"events_url": "https://api.github.com/users/xiongxoy/events{/privacy}",
"followers_url": "https://api.github.com/users/xiongxoy/followers",
"following_url": "https://api.github.com/users/xiongxoy/following{/other_user}",
"gists_url": "https://api.github.com/users/xiongxoy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xiongxoy",
"id": 936479,
"login": "xiongxoy",
"node_id": "MDQ6VXNlcjkzNjQ3OQ==",
"organizations_url": "https://api.github.com/users/xiongxoy/orgs",
"received_events_url": "https://api.github.com/users/xiongxoy/received_events",
"repos_url": "https://api.github.com/users/xiongxoy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xiongxoy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiongxoy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xiongxoy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-06-23T03:40:50Z
|
2021-09-08T23:07:15Z
|
2014-06-23T07:24:46Z
|
NONE
|
resolved
|
"assert cookie.expires == 1" is changed to "assert cookie.expires is not None" at line 1166.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2106/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2106/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2106.diff",
"html_url": "https://github.com/psf/requests/pull/2106",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2106.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2106"
}
| true |
[
"What's the purpose of this change?\n",
"When running on my system, the previous test failed, I think it should be\ntested against None, not 1\n2014年6月23日 下午1:02于 \"Cory Benfield\" [email protected]写道:\n\n> What's the purpose of this change?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2106#issuecomment-46806552\n> .\n",
"So this change isn't going to be right. We tracked this issue previously in #1859 and #1860. The specific problem is that our cookie parsing seems to be weirdly timezone dependent. Testing against `None` isn't particularly helpful: we want to make sure the response is right, not that it's present.\n",
"It might be enough simply to reopen #1860 and do the investigation we never got around to doing.\n",
"@xiongxoy can you apply the commit in the PR that @Lukasa referenced (#1860) and see if that fixes the test for you?\n"
] |
https://api.github.com/repos/psf/requests/issues/2105
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2105/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2105/comments
|
https://api.github.com/repos/psf/requests/issues/2105/events
|
https://github.com/psf/requests/issues/2105
| 36,212,659 |
MDU6SXNzdWUzNjIxMjY1OQ==
| 2,105 |
setup.py says requests does not support python3.4
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1018674?v=4",
"events_url": "https://api.github.com/users/robvdl/events{/privacy}",
"followers_url": "https://api.github.com/users/robvdl/followers",
"following_url": "https://api.github.com/users/robvdl/following{/other_user}",
"gists_url": "https://api.github.com/users/robvdl/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/robvdl",
"id": 1018674,
"login": "robvdl",
"node_id": "MDQ6VXNlcjEwMTg2NzQ=",
"organizations_url": "https://api.github.com/users/robvdl/orgs",
"received_events_url": "https://api.github.com/users/robvdl/received_events",
"repos_url": "https://api.github.com/users/robvdl/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/robvdl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robvdl/subscriptions",
"type": "User",
"url": "https://api.github.com/users/robvdl",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-06-21T00:39:36Z
|
2021-09-08T23:10:38Z
|
2014-07-02T07:25:57Z
|
NONE
|
resolved
|
Since Ubuntu 14.04 LTS comes with python 3.4, this is quite a big deal.
I haven't tested it yet with Python 3.4 myself.
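For illustration only (the actual change is the commit cited in the comments below): declaring Python 3.4 support in `setup.py` is typically just a matter of adding a trove classifier. The package metadata below is a made-up example, not requests' real `setup.py`.

``` python
from setuptools import setup

setup(
    name='example-package',  # hypothetical package
    version='0.0.0',
    classifiers=[
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
    ],
)
```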
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2105/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2105/timeline
| null |
completed
| null | null | false |
[
"@robvdl it does not say that it \"does not support Python 3.4\" it is merely an omission that we do in fact support 3.4. I'd like to wait for @kennethreitz to add a Python 3.4 builder to the Jenkins server before adding it however.\n",
"Requests works fine with Python 3.4, I get some test coverage of it from hyper. =)\n",
"@sigmavirus24 you're too abrasive :)\n",
"@robvdl many thanks! fixed by d22b8d8e7e8fcb8d16efba6841977dc81ac2c935\n"
] |
https://api.github.com/repos/psf/requests/issues/2104
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2104/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2104/comments
|
https://api.github.com/repos/psf/requests/issues/2104/events
|
https://github.com/psf/requests/pull/2104
| 36,212,504 |
MDExOlB1bGxSZXF1ZXN0MTc0MzQyNDA=
| 2,104 |
make requests.request('HEAD',...) behave the same as requests.head(...)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5319942?v=4",
"events_url": "https://api.github.com/users/rtdean/events{/privacy}",
"followers_url": "https://api.github.com/users/rtdean/followers",
"following_url": "https://api.github.com/users/rtdean/following{/other_user}",
"gists_url": "https://api.github.com/users/rtdean/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rtdean",
"id": 5319942,
"login": "rtdean",
"node_id": "MDQ6VXNlcjUzMTk5NDI=",
"organizations_url": "https://api.github.com/users/rtdean/orgs",
"received_events_url": "https://api.github.com/users/rtdean/received_events",
"repos_url": "https://api.github.com/users/rtdean/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rtdean/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rtdean/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rtdean",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
}
] |
closed
| true | null |
[] | null | 4 |
2014-06-21T00:33:50Z
|
2021-09-08T23:07:14Z
|
2014-06-23T19:10:42Z
|
NONE
|
resolved
|
Currently requests.head() behaves differently (it doesn't follow redirects) from requests.request('HEAD', ...).
This is, I believe, undesirable behaviour, and it also disagrees with the documentation (http://docs.python-requests.org/en/latest/user/quickstart/#redirection-and-history).
This also addresses the problem described in kennethreitz/grequests#45.
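For illustration only (not part of the original report; the httpbin URL is just a stand-in), a minimal sketch of the differing defaults:

``` python
import requests

# requests.head() defaults to allow_redirects=False, so the 302 comes back as-is.
r1 = requests.head('https://httpbin.org/redirect/1')
print(r1.status_code, len(r1.history))  # 302 0

# requests.request('HEAD', ...) defaults to allow_redirects=True, so the redirect is followed.
r2 = requests.request('HEAD', 'https://httpbin.org/redirect/1')
print(r2.status_code, len(r2.history))  # 200 1

# Passing allow_redirects explicitly makes the two calls behave the same.
r3 = requests.request('HEAD', 'https://httpbin.org/redirect/1', allow_redirects=False)
```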
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2104/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2104/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2104.diff",
"html_url": "https://github.com/psf/requests/pull/2104",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2104.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2104"
}
| true |
[
"I disagree. If you're using `requests.request` with the verb explicitly you should know that you should be setting `allow_redirects=False`. Note the bug you reference in grequests was never actually acknowledged by Kenneth as a bug.\n\nThere is a bug in the documentation, but the bug is that it doesn't say \"when using `requests.head`\". I see no reason to special case this at this level. There's also nothing in [RFC 7231](http://tools.ietf.org/html/rfc7231#section-4.3.2) to justify not following redirects.\n\nIn short I'm :-1: on this fix.\n",
"I'm generally in agreement with @sigmavirus24. More generally, if you want redirects on (or off), you should explicitly turn them on (or off). That's the only way to write code that is never going to change under your feet.\n",
"Also, regardless of whether this is a good idea or not, this solution only works for the functional API. It is in the wrong place to be even remotely acceptable.\n",
"Many thanks for the contribution! Unfortunately, due to some design considerations, we can't accept this at this time. \n\nBasically, this is intended behavior. Thanks again though :) :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2103
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2103/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2103/comments
|
https://api.github.com/repos/psf/requests/issues/2103/events
|
https://github.com/psf/requests/issues/2103
| 36,147,541 |
MDU6SXNzdWUzNjE0NzU0MQ==
| 2,103 |
How to set requests logging level?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/615755?v=4",
"events_url": "https://api.github.com/users/zhangyuting/events{/privacy}",
"followers_url": "https://api.github.com/users/zhangyuting/followers",
"following_url": "https://api.github.com/users/zhangyuting/following{/other_user}",
"gists_url": "https://api.github.com/users/zhangyuting/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhangyuting",
"id": 615755,
"login": "zhangyuting",
"node_id": "MDQ6VXNlcjYxNTc1NQ==",
"organizations_url": "https://api.github.com/users/zhangyuting/orgs",
"received_events_url": "https://api.github.com/users/zhangyuting/received_events",
"repos_url": "https://api.github.com/users/zhangyuting/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhangyuting/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhangyuting/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhangyuting",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-06-20T07:56:01Z
|
2021-09-08T23:10:56Z
|
2014-06-20T08:37:58Z
|
NONE
|
resolved
|
I just want to raise the requests logging level to WARNING because I'm not interested in that information, but I don't want it to affect the level of the root logger I have set. Please help me.
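For reference, a minimal sketch of one way to do this with the standard library logging module (setting a level on the 'requests' logger leaves the root logger's level alone):

``` python
import logging

# Only messages of WARNING or above from the "requests" logger (and its children,
# e.g. the vendored requests.packages.urllib3 in older releases) get through;
# the root logger's level is unchanged.
logging.getLogger('requests').setLevel(logging.WARNING)
```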
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2103/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2103/timeline
| null |
completed
| null | null | false |
[
"Setting logging levels for a module is very easy:\n\n``` python\nimport logging\nlog = logging.getLogger('requests')\nlog.setLevel(logging.WARNING)\n```\n",
"In the future, please ask your question on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests)\n",
"Ok, thank you so much.—\nSent from my mobile device. Please excuse typos.\n\nOn Fri, Jun 20, 2014 at 8:32 PM, Ian Cordasco [email protected]\nwrote:\n\n> ## In the future, please ask your question on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests)\n> \n> Reply to this email directly or view it on GitHub:\n> https://github.com/kennethreitz/requests/issues/2103#issuecomment-46673063\n"
] |
https://api.github.com/repos/psf/requests/issues/2102
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2102/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2102/comments
|
https://api.github.com/repos/psf/requests/issues/2102/events
|
https://github.com/psf/requests/issues/2102
| 36,044,887 |
MDU6SXNzdWUzNjA0NDg4Nw==
| 2,102 |
Is there any methods to get the source TCP port of the requests?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7300843?v=4",
"events_url": "https://api.github.com/users/bofortitude/events{/privacy}",
"followers_url": "https://api.github.com/users/bofortitude/followers",
"following_url": "https://api.github.com/users/bofortitude/following{/other_user}",
"gists_url": "https://api.github.com/users/bofortitude/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bofortitude",
"id": 7300843,
"login": "bofortitude",
"node_id": "MDQ6VXNlcjczMDA4NDM=",
"organizations_url": "https://api.github.com/users/bofortitude/orgs",
"received_events_url": "https://api.github.com/users/bofortitude/received_events",
"repos_url": "https://api.github.com/users/bofortitude/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bofortitude/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bofortitude/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bofortitude",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-06-19T02:26:41Z
|
2021-09-08T23:10:57Z
|
2014-06-19T02:31:05Z
|
NONE
|
resolved
|
I want to get the source TCP port of the socket used by the requests I send.
Any answers will be appreciated.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2102/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2102/timeline
| null |
completed
| null | null | false |
[
"As the documentation indicates, there is no builtin method for that. If you can grab the connection from urllib3 there might be a way to do it there. In the future, please ask your _questions_ on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests)\n"
] |
https://api.github.com/repos/psf/requests/issues/2101
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2101/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2101/comments
|
https://api.github.com/repos/psf/requests/issues/2101/events
|
https://github.com/psf/requests/issues/2101
| 36,043,588 |
MDU6SXNzdWUzNjA0MzU4OA==
| 2,101 |
can you guys help me out with json, et al? It's for a good cause.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-06-19T01:51:38Z
|
2021-09-08T23:10:57Z
|
2014-06-19T02:22:57Z
|
NONE
|
resolved
|
I need a super easy way to send and receive JSON.
Nothing fancy.
We want to send/receive JSON very much like the Google Geocode API does.
I will put kudos on wikispeedia.org.
Thanks for your consideration.
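For what it's worth, a minimal sketch of sending and receiving JSON with requests (the URLs and payload are placeholders; the json= keyword exists in newer requests releases — with older versions use data=json.dumps(...) and set the Content-Type header yourself):

``` python
import requests

# Receiving JSON: decode the response body with .json().
resp = requests.get('https://maps.example.com/geocode',
                    params={'address': '1600 Amphitheatre Pkwy'})
data = resp.json()

# Sending JSON: json= serialises the dict and sets Content-Type: application/json.
resp = requests.post('https://api.example.com/locations',
                     json={'lat': 37.42, 'lng': -122.08})
print(resp.status_code)
```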
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2101/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2101/timeline
| null |
completed
| null | null | false |
[
"The proper place to ask questions is [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n",
"no thanks\n\nOn Wed, Jun 18, 2014 at 9:23 PM, Ian Cordasco [email protected]\nwrote:\n\n> The proper place to ask questions is StackOverflow\n> https://stackoverflow.com/questions/tagged/python-requests.\n> \n> ## \n> \n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2101#issuecomment-46517125\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/2100
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2100/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2100/comments
|
https://api.github.com/repos/psf/requests/issues/2100/events
|
https://github.com/psf/requests/issues/2100
| 36,043,524 |
MDU6SXNzdWUzNjA0MzUyNA==
| 2,100 |
can you guys help me out with json, et al? I
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-06-19T01:49:59Z
|
2021-09-08T23:10:57Z
|
2014-06-19T02:21:27Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2100/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2100/timeline
| null |
completed
| null | null | false |
[] |
|
https://api.github.com/repos/psf/requests/issues/2099
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2099/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2099/comments
|
https://api.github.com/repos/psf/requests/issues/2099/events
|
https://github.com/psf/requests/pull/2099
| 35,955,921 |
MDExOlB1bGxSZXF1ZXN0MTcyNzYxNTE=
| 2,099 |
hahahah
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3658006?v=4",
"events_url": "https://api.github.com/users/q4478842/events{/privacy}",
"followers_url": "https://api.github.com/users/q4478842/followers",
"following_url": "https://api.github.com/users/q4478842/following{/other_user}",
"gists_url": "https://api.github.com/users/q4478842/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/q4478842",
"id": 3658006,
"login": "q4478842",
"node_id": "MDQ6VXNlcjM2NTgwMDY=",
"organizations_url": "https://api.github.com/users/q4478842/orgs",
"received_events_url": "https://api.github.com/users/q4478842/received_events",
"repos_url": "https://api.github.com/users/q4478842/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/q4478842/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/q4478842/subscriptions",
"type": "User",
"url": "https://api.github.com/users/q4478842",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-06-18T05:35:20Z
|
2021-09-08T23:08:14Z
|
2014-06-18T15:30:26Z
|
NONE
|
resolved
|
Sorry, this is for testing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2099/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2099/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2099.diff",
"html_url": "https://github.com/psf/requests/pull/2099",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2099.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2099"
}
| true |
[
"@q4478842 Do not test whatever you're doing against this repository.\n"
] |
https://api.github.com/repos/psf/requests/issues/2098
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2098/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2098/comments
|
https://api.github.com/repos/psf/requests/issues/2098/events
|
https://github.com/psf/requests/pull/2098
| 35,954,163 |
MDExOlB1bGxSZXF1ZXN0MTcyNzUxNzM=
| 2,098 |
hahahah
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3658006?v=4",
"events_url": "https://api.github.com/users/q4478842/events{/privacy}",
"followers_url": "https://api.github.com/users/q4478842/followers",
"following_url": "https://api.github.com/users/q4478842/following{/other_user}",
"gists_url": "https://api.github.com/users/q4478842/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/q4478842",
"id": 3658006,
"login": "q4478842",
"node_id": "MDQ6VXNlcjM2NTgwMDY=",
"organizations_url": "https://api.github.com/users/q4478842/orgs",
"received_events_url": "https://api.github.com/users/q4478842/received_events",
"repos_url": "https://api.github.com/users/q4478842/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/q4478842/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/q4478842/subscriptions",
"type": "User",
"url": "https://api.github.com/users/q4478842",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-06-18T04:42:44Z
|
2021-09-08T23:08:19Z
|
2014-06-18T04:42:56Z
|
NONE
|
resolved
|
I'm sorry...
This is just for testing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3658006?v=4",
"events_url": "https://api.github.com/users/q4478842/events{/privacy}",
"followers_url": "https://api.github.com/users/q4478842/followers",
"following_url": "https://api.github.com/users/q4478842/following{/other_user}",
"gists_url": "https://api.github.com/users/q4478842/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/q4478842",
"id": 3658006,
"login": "q4478842",
"node_id": "MDQ6VXNlcjM2NTgwMDY=",
"organizations_url": "https://api.github.com/users/q4478842/orgs",
"received_events_url": "https://api.github.com/users/q4478842/received_events",
"repos_url": "https://api.github.com/users/q4478842/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/q4478842/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/q4478842/subscriptions",
"type": "User",
"url": "https://api.github.com/users/q4478842",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2098/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2098/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2098.diff",
"html_url": "https://github.com/psf/requests/pull/2098",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2098.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2098"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2097
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2097/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2097/comments
|
https://api.github.com/repos/psf/requests/issues/2097/events
|
https://github.com/psf/requests/pull/2097
| 35,744,066 |
MDExOlB1bGxSZXF1ZXN0MTcxNTAzOTc=
| 2,097 |
Update out-there.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-06-15T07:10:14Z
|
2021-09-08T23:10:56Z
|
2014-06-15T08:19:34Z
|
NONE
|
resolved
|
Remove the French blog post, which is now a dead link.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2097/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2097/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2097.diff",
"html_url": "https://github.com/psf/requests/pull/2097",
"merged_at": "2014-06-15T08:19:34Z",
"patch_url": "https://github.com/psf/requests/pull/2097.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2097"
}
| true |
[
"Thanks so much for this! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2096
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2096/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2096/comments
|
https://api.github.com/repos/psf/requests/issues/2096/events
|
https://github.com/psf/requests/issues/2096
| 35,729,374 |
MDU6SXNzdWUzNTcyOTM3NA==
| 2,096 |
Fix Event Hooks Documentation or the Hook Handling Code
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 2 |
2014-06-14T14:29:06Z
|
2021-09-08T23:10:58Z
|
2014-06-14T16:05:16Z
|
CONTRIBUTOR
|
resolved
|
After looking at [this StackOverflow question](http://stackoverflow.com/a/24214101/1953283), I realized that our documentation about [Event Hooks](http://docs.python-requests.org/en/latest/user/advanced/?highlight=hook#event-hooks) is wrong. Specifically,
```
If the function doesn’t return anything, nothing else is effected.
```
It makes sense to me that a callback would not be required to return anything for it to be a successful hook. There are two avenues to fix this:
1. Make the behaviour match the documentation
2. Make the documentation match the behaviour
Both are easy, but it depends on what we want the behaviour to actually be.
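A minimal sketch of the hook semantics in question (the URL is a placeholder): a hook that returns nothing leaves the response untouched, while a non-None return value replaces it for later hooks and the caller.

``` python
import requests

def log_only(response, *args, **kwargs):
    # Returns None, so the original response is passed through unchanged.
    print(response.url, response.status_code)

def tag_response(response, *args, **kwargs):
    # Returning a value replaces the response seen afterwards.
    response.tagged = True
    return response

r = requests.get('https://httpbin.org/get', hooks={'response': [log_only, tag_response]})
print(getattr(r, 'tagged', False))  # True
```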
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2096/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2096/timeline
| null |
completed
| null | null | false |
[
"Wait, is this wrong? If a function doesn't return anything it returns `None`, and our code seems to handle that appropriately. What have I missed?\n",
"This is my brain on very little (if any) sleep. Sorry all\n"
] |
https://api.github.com/repos/psf/requests/issues/2095
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2095/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2095/comments
|
https://api.github.com/repos/psf/requests/issues/2095/events
|
https://github.com/psf/requests/pull/2095
| 35,579,446 |
MDExOlB1bGxSZXF1ZXN0MTcwNTIwODM=
| 2,095 |
Redirect cache
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1167238?v=4",
"events_url": "https://api.github.com/users/ericfrederich/events{/privacy}",
"followers_url": "https://api.github.com/users/ericfrederich/followers",
"following_url": "https://api.github.com/users/ericfrederich/following{/other_user}",
"gists_url": "https://api.github.com/users/ericfrederich/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ericfrederich",
"id": 1167238,
"login": "ericfrederich",
"node_id": "MDQ6VXNlcjExNjcyMzg=",
"organizations_url": "https://api.github.com/users/ericfrederich/orgs",
"received_events_url": "https://api.github.com/users/ericfrederich/received_events",
"repos_url": "https://api.github.com/users/ericfrederich/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ericfrederich/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ericfrederich/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ericfrederich",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 22 |
2014-06-12T12:34:08Z
|
2021-09-08T10:01:22Z
|
2014-06-12T18:42:42Z
|
CONTRIBUTOR
|
resolved
|
When web clients like Firefox receive a permanent redirect, they will not request the old URL again, even if you type it into the address bar.
This patch makes the requests framework work in the same way.
This helps performance when creating a request proxy that delegates to other servers based on URL.
This patch has me going from 245 requests per second to 470 requests per second against a server returning 308 response codes.
-- example server
https://gist.github.com/ericfrederich/a004862e1da1fb6916ef
-- example client
https://gist.github.com/ericfrederich/e1225e2d07e3ee923b27
-- example server output of 301 vs 302 redirects
https://gist.github.com/ericfrederich/b77bd4852a3cf9b968f0
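A rough sketch of the behaviour this patch adds (URLs are placeholders and assume the server returns a permanent redirect; redirect_cache is the dict-like session attribute referenced in the comments below):

``` python
import requests

s = requests.Session()

# The first request follows the permanent (301/308) redirect as usual.
s.get('http://service.example.com/old-endpoint')

# The session remembers the mapping, so later requests for the old URL are
# rewritten to the new location without the extra round trip.
print(s.redirect_cache)
s.get('http://service.example.com/old-endpoint')
```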
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2095/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2095/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2095.diff",
"html_url": "https://github.com/psf/requests/pull/2095",
"merged_at": "2014-06-12T18:42:42Z",
"patch_url": "https://github.com/psf/requests/pull/2095.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2095"
}
| true |
[
"Thanks for this!\n\nI have mixed feelings. In general, this seems really helpful, but it also seems like it could introduce difficult-to-trace bugs when working with unhelpful upstream websites. Altogether I'm tempted to err in favour of the change though.\n\nIt'll need a unit test though: mind writing one?\n\nIn the meantime, @kennethreitz, you interested in this change?\n",
"I'm :-1: on this change if only because I can imagine consumers like CloudFlare and Runscope relying on the fact that we follow the redirects each time. Sure this provides a speed-up and performance enhancement but I'm not convinced this won't make some people's lives much harder and less pleasant. Before we move on we should at least gather a lot more feedback from the people who rely so heavily on requests.\n\n@dolph, @johnsheehan, @bryanhelmig, @dstufft, @mtourne, @dknecht can any of you weigh in or have someone more appropriate way in please?\n",
"Seems like something that is better left to CacheControl or whatever. Seems odd to have requests suddenly start handling caching but only in specific circumstances.\n",
"@sigmavirus24 @dstufft Both good points that I hadn't considered. I'm moving back down to -0, waiting to hear from some of the other big guns. =)\n",
"+1: I'd expect session objects to cache the results of permanent redirects.\n",
"I'm also down on this change. Not because of how it would work with our service, but because I think it is better handled in an adapter so that it's an explicit opt in. \n",
"We (CloudFlare) would agree with John. Seems like something that should be put in an adapter that people can opt into.\n\n> On Jun 12, 2014, at 11:19 AM, John Sheehan [email protected] wrote:\n> \n> I'm also down on this change. Not because of how it would work with our service, but because I think it is better handled in an adapter so that it's an explicit opt in.\n> \n> —\n> Reply to this email directly or view it on GitHub.\n",
"It looks like there is already a session option for \"allow_redirects\".\nThis looks like it controls whether to attach a \"history\" attribute to the response (not actually control whether resolve_redirects gets called or not).\nCould this cache be an option to the session rather than an adapter?\n\nI'm not opposed to an adapter (though I would need help with the approach), but again, this is how both chrome and firefox operate with permanent redirects; they don't make requests up the whole chain each time. That is why I thought the cache should be a first class citizen, even if it is made to be an opt-in.\n",
"@ericfrederich It does control whether we follow redirects. `resolve_redirects` returns a generator, so if `allow_redirects` is false that generator doesn't get consumed and no redirects get followed.\n\nThe key thing to note is that requests is not, in fact, a browser, and there are times users are going to want more control. With that said, the bigger problem is that we're strongly resistant to adding more fields to the `Session`. That's why I wanted @kennethreitz's insight, just to see if he thinks this is a reasonable exception.\n",
"requests is obviously not a \"browser,\" but it is an HTTP client, and HTTP/1.1 10.3.2 reads:\n\n> The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs.\n\nIf you're opposed to requests following recommendations in the HTTP specification by default, please justify why the recommendation should not guide the default behavior of requests.\n\n@ericfrederich: caching is a first class citizen in HTTP, so I agree it should be a first class citizen in a requests session.\n",
"@dolph We've had this discussion many times. The specifications apply to _user agents_, and I've argued that requests by itself is not a whole user agent. There are some actions we delegate to the user of requests.\n\nNow, to be clear, I have not taken a strong position on this one way or another, aside from to suggest that we should err on the side of back-compatibility and respect the requests feature freeze. At this stage I'm keeping an open mind as to both arguments.\n\nNote, finally, that the normative language in that section is SHOULD, not MUST. =)\n",
"Interesting @dolph .\n@Lukasa , I caught the \"SHOULD\" as well, but @dolph did state it is only recommendation.\n\nIt seems this comes down to whether requests is just a http library or a http client.\nPerhaps a http library shouldn't do caching but a http client should?\nSo, the question of the day: is a requests.Session object a client?... or is it just some percentage of a client?\n",
"@Lukasa My apologies for not being aware of prior discussions. Is there a documented guideline on where requests draws the line between the actions delegated to users of the library and requests' own responsibilities?\n",
"@dolph No need to apologise. =) Just wanted to make you aware of the context.\n\nThe short answer is no, we tend to do it by feel. Makes deciding these issues interesting. =P\n",
"Fascinating. I'll dwell on it. \n",
"Hooray! =) The tiebreaker arrives.\n",
":sparkles: :cake: :sparkles:\n",
"This is a bummer. I just realized it will have far more impact on our use of Requests than I initially thought, but I realize we're a special case. Unfortunately, without an explicit opt out from this behavior this change will break how our test runner works so we'll probably have to find another solution besides Requests. I understand why people want this change, but not making it configurable seems like an oversight to me. I know our customers will struggle with it (we see them fight with similar client cleverness all the time).\n",
"@johnsheehan you're ready to jump ship already?\nI hope your solution isn't going after the head of some github repo.\nI'm all for having an opt-in / opt-out option.\nYou could do it yourself or I might be able to get to it sometime.\nI just contributed 2 patches out of the blue while experimenting with a new design.\n",
"@johnsheehan You can pin requests to 2.3.0 until we refine this feature.\n",
"I'm not taking it out immediately as there are mitigations like pinning and this isn't shipped yet, but I've made a note to our team to plan accordingly since this would have a direct impact on our customers' expected product experience. My opinion is that this shouldn't have been merged without an explicit opt in/out but I respect the decision of the maintainers recognizing that our situation is likely unique.On Thu, Jun 12, 2014 at 5:41 PM -0700, \"Ian Cordasco\" [email protected] wrote:\n\n@johnsheehan You can pin requests to 2.3.0 until we refine this feature.\n\n—Reply to this email directly or view it on GitHub.\n",
"@johnsheehan This is very easily disabled.\n\n``` python\nclass NullDict(dict):\n def __setitem__(self, key, value):\n return\n\ns = requests.Session()\ns.redirect_cache = NullDict()\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/2094
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2094/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2094/comments
|
https://api.github.com/repos/psf/requests/issues/2094/events
|
https://github.com/psf/requests/pull/2094
| 35,519,358 |
MDExOlB1bGxSZXF1ZXN0MTcwMTkwODY=
| 2,094 |
Default timeout for requests.Session object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/620035?v=4",
"events_url": "https://api.github.com/users/moliware/events{/privacy}",
"followers_url": "https://api.github.com/users/moliware/followers",
"following_url": "https://api.github.com/users/moliware/following{/other_user}",
"gists_url": "https://api.github.com/users/moliware/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/moliware",
"id": 620035,
"login": "moliware",
"node_id": "MDQ6VXNlcjYyMDAzNQ==",
"organizations_url": "https://api.github.com/users/moliware/orgs",
"received_events_url": "https://api.github.com/users/moliware/received_events",
"repos_url": "https://api.github.com/users/moliware/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/moliware/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/moliware/subscriptions",
"type": "User",
"url": "https://api.github.com/users/moliware",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-06-11T19:46:51Z
|
2021-09-08T10:01:12Z
|
2014-06-11T19:51:57Z
|
NONE
|
resolved
|
The idea is to have a default timeout for each request you make with a session object.
Example:
``` python
>>> import requests
>>> s = requests.Session()
>>> s.timeout = 1
>>> s.get('https://httpbin.org/delay/10')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "requests/sessions.py", line 473, in get
return self.request('GET', url, **kwargs)
File "requests/sessions.py", line 461, in request
resp = self.send(prep, **send_kwargs)
File "requests/sessions.py", line 564, in send
r = adapter.send(request, **kwargs)
File "requests/adapters.py", line 384, in send
raise Timeout(e, request=request)
requests.exceptions.Timeout: HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out.
```
I came up with this PR to meet a need of a project based on requests => https://github.com/RedTuna/mysolr/issues/40
Thanks!
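For completeness, a sketch of one way to get a per-session default timeout without changing requests itself (class name and values here are assumptions, not part of this PR):

``` python
import requests

class TimeoutSession(requests.Session):
    """Session that applies a default timeout unless one is passed explicitly."""

    def __init__(self, timeout=1):
        super(TimeoutSession, self).__init__()
        self.default_timeout = timeout

    def request(self, method, url, **kwargs):
        kwargs.setdefault('timeout', self.default_timeout)
        return super(TimeoutSession, self).request(method, url, **kwargs)

s = TimeoutSession(timeout=1)
# s.get('https://httpbin.org/delay/10')  # would raise requests.exceptions.Timeout
```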
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2094/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2094/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2094.diff",
"html_url": "https://github.com/psf/requests/pull/2094",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2094.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2094"
}
| true |
[
"Thanks for this!\n\nUnfortunately, this is a feature request that comes up often but that we don't have any real interest in. See #2011 for a longer form justification. For this reason, I'm afraid that we can't accept this pull request. Thanks so much for the work though, and please do keep contributing! :cake:\n",
"Thanks for the answer!\n",
"Thanks for all the effort you put in @moliware ! Even though we didn't accept the work, we appreciate your effort to make requests better :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2093
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2093/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2093/comments
|
https://api.github.com/repos/psf/requests/issues/2093/events
|
https://github.com/psf/requests/issues/2093
| 35,511,527 |
MDU6SXNzdWUzNTUxMTUyNw==
| 2,093 |
Review new httpbin package and pytest-httpbin for testing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2014-06-11T18:21:05Z
|
2021-09-08T23:08:08Z
|
2014-09-12T20:14:15Z
|
CONTRIBUTOR
|
resolved
|
https://github.com/kevin1024/pytest-httpbin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2093/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2093/timeline
| null |
completed
| null | null | false |
[
"That's still rather slow. [Betamax](/sigmavirus24/betamax) can cover our needs and still be faster than using that. If you're concerned about the speed of our tests, I'll be happy to add betamax to the mix.\n",
"This is still making actual HTTP requests, though. How much can we trust a Betamax-style test suite? I feel like I'd ALWAYS want to run it against an HTTPbin just to be sure anyway. \n",
"HTTPbin changes exactly how often? That's exactly how often you would have to run the tests against the site or a local copy. You can also control with what precision betamax will match a request in a recorded cassette so you can send multiple posts with varying bodies and be sure you'll get the right response.\n\nAs for trust, @dgouldin might be able to tell us how much he trusts them. I use them, and the equivalent tool in Ruby (VCR) is trusted quite heavily by quite a large number of people including several corporations I can think of off the top of my head.\n",
"I wrote pytest-httpbin to test [VCR.py](https://github.com/kevin1024/vcrpy), which is similar to Betamax, because I wanted to make sure I was mocking httplib correctly.\n\nI notice that Betamax replaces the requests HTTP adapter with a BetamaxAdapter, so you wouldn't be able to test the built-in requests.adapters.HTTPAdapter with Betamax tests.\n\nIncidentally, VCR.py replaces the HTTPConnection from httplib, monkeypatching at a higher level, so it might actually be a better choice for that particular case.\n",
"## It replaces it only to intercept PreparedRequests. It still continues to use the HTTPAdapter (or any adapter registered) to make the real requests. You're just replacing something at a lower level and making assumptions about every user's use case. There are users of the FTP Adapter for whom vcr.py is not useful while Betamax would work.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"I see. I didn't know requests supported FTP, that's very cool.\n\nYou're obviously welcome to use whatever tools you think are best. I didn't mean to start an argument here!\n\nBTW - perhaps Betamax itself would benefit from pytest-httpbin.\n\n> ## It replaces it only to intercept PreparedRequests. It still continues to use the HTTPAdapter (or any adapter registered) to make the real requests. You're just replacing something at a lower level and making assumptions about every user's use case. There are users of the FTP Adapter for whom vcr.py is not useful while Betamax would work.\n> \n> Sent from my Android device with K-9 Mail. Please excuse my brevity.\n> —\n> Reply to this email directly or view it on GitHub.\n",
"I'm sincerely sorry if my response came across as argumentative @kevin1024. That really wasn't my intention. As for whether or not Betamax should use pytest-httpbin we should probably discuss that on an issue in Betamax. :)\n",
"No harm done, it's hard to read tone over the internet :)\n\n> It still continues to use the HTTPAdapter (or any adapter registered) to make the real requests\n\nThe real requests are only going to happen if the cassette doesn't contain the request, right? So if the request hits the cassette, then the code inside the adapter isn't going to be tested. Of course if you delete the cassette and re-record, then the test will hit all the code.\n",
"Yes. Also you can have a re-record interval so that you don't have to concern yourself with deleting the cassettes.\n",
"This is a duplicate of #2184. Closing to centralize discussion in one place.\n"
] |
https://api.github.com/repos/psf/requests/issues/2092
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2092/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2092/comments
|
https://api.github.com/repos/psf/requests/issues/2092/events
|
https://github.com/psf/requests/issues/2092
| 35,468,866 |
MDU6SXNzdWUzNTQ2ODg2Ng==
| 2,092 |
REQUESTS_CA_BUNDLE is not correctly checked in 'requests/sessions.py'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7838259?v=4",
"events_url": "https://api.github.com/users/H0neyBadger/events{/privacy}",
"followers_url": "https://api.github.com/users/H0neyBadger/followers",
"following_url": "https://api.github.com/users/H0neyBadger/following{/other_user}",
"gists_url": "https://api.github.com/users/H0neyBadger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/H0neyBadger",
"id": 7838259,
"login": "H0neyBadger",
"node_id": "MDQ6VXNlcjc4MzgyNTk=",
"organizations_url": "https://api.github.com/users/H0neyBadger/orgs",
"received_events_url": "https://api.github.com/users/H0neyBadger/received_events",
"repos_url": "https://api.github.com/users/H0neyBadger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/H0neyBadger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/H0neyBadger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/H0neyBadger",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-06-11T10:07:50Z
|
2021-09-08T23:10:52Z
|
2014-07-03T13:35:55Z
|
NONE
|
resolved
|
When `verify` is explicitly set to `True`, the CA_BUNDLE environment variables (`REQUESTS_CA_BUNDLE` and `CURL_CA_BUNDLE`) are not checked:

``` python
requests.get('https://my.trusted.server:8080', verify=True)
```

In requests/sessions.py, line 434:

``` python
# Look for configuration.
if not verify and verify is not False:
    verify = os.environ.get('REQUESTS_CA_BUNDLE')

# Curl compatibility.
if not verify and verify is not False:
    verify = os.environ.get('CURL_CA_BUNDLE')
```

You should probably use:

``` python
if verify is True or (not verify and verify is not False):
```
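Until the check is fixed, a hedged workaround sketch (the server URL is the example from above): resolve the bundle from the environment yourself instead of relying on verify=True.

``` python
import os
import requests

# verify=True currently bypasses the environment lookup, so do it explicitly:
ca_bundle = os.environ.get('REQUESTS_CA_BUNDLE') or os.environ.get('CURL_CA_BUNDLE') or True
requests.get('https://my.trusted.server:8080', verify=ca_bundle)
```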
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2092/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2092/timeline
| null |
completed
| null | null | false |
[
"Thanks for this, it's a good spot!\n\nI feel like the correct logic is probably: `if verify is True or verify is None`. I don't believe the other falsy values (empty string, empty list etc.) should cause us to accidentally turn verification on.\n",
"I second @Lukasa's far more explicit check for `None`.\n",
"I ran into this issue also. It isn't completely clear from [the documentation](http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification) whether this is the intended behavior.\n\nCurrently, `verify=True` means \"verify with a default set of CAs,\" correct? Hopefully nobody is relying on that.\n\nFor me this issue would be resolved with either the proposed code change or a change to the documentation to make it clear that `REQUESTS_CA_BUNDLE` is not checked when `verify=True` is passed.\n"
] |
https://api.github.com/repos/psf/requests/issues/2091
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2091/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2091/comments
|
https://api.github.com/repos/psf/requests/issues/2091/events
|
https://github.com/psf/requests/pull/2091
| 35,442,931 |
MDExOlB1bGxSZXF1ZXN0MTY5NzU2OTU=
| 2,091 |
Update README.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5325221?v=4",
"events_url": "https://api.github.com/users/Mtax/events{/privacy}",
"followers_url": "https://api.github.com/users/Mtax/followers",
"following_url": "https://api.github.com/users/Mtax/following{/other_user}",
"gists_url": "https://api.github.com/users/Mtax/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Mtax",
"id": 5325221,
"login": "Mtax",
"node_id": "MDQ6VXNlcjUzMjUyMjE=",
"organizations_url": "https://api.github.com/users/Mtax/orgs",
"received_events_url": "https://api.github.com/users/Mtax/received_events",
"repos_url": "https://api.github.com/users/Mtax/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Mtax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mtax/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Mtax",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-06-11T01:41:21Z
|
2021-09-09T00:01:15Z
|
2014-06-11T02:03:47Z
|
NONE
|
resolved
|
Translated some important messages into Chinese.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2091/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2091/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2091.diff",
"html_url": "https://github.com/psf/requests/pull/2091",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2091.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2091"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2090
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2090/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2090/comments
|
https://api.github.com/repos/psf/requests/issues/2090/events
|
https://github.com/psf/requests/pull/2090
| 35,422,288 |
MDExOlB1bGxSZXF1ZXN0MTY5NjM3MzQ=
| 2,090 |
handle 308 redirection the same as 301 and 302
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1167238?v=4",
"events_url": "https://api.github.com/users/ericfrederich/events{/privacy}",
"followers_url": "https://api.github.com/users/ericfrederich/followers",
"following_url": "https://api.github.com/users/ericfrederich/following{/other_user}",
"gists_url": "https://api.github.com/users/ericfrederich/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ericfrederich",
"id": 1167238,
"login": "ericfrederich",
"node_id": "MDQ6VXNlcjExNjcyMzg=",
"organizations_url": "https://api.github.com/users/ericfrederich/orgs",
"received_events_url": "https://api.github.com/users/ericfrederich/received_events",
"repos_url": "https://api.github.com/users/ericfrederich/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ericfrederich/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ericfrederich/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ericfrederich",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 8 |
2014-06-10T20:33:18Z
|
2021-09-09T00:01:24Z
|
2014-06-12T04:19:45Z
|
CONTRIBUTOR
|
resolved
|
308 is also a redirect. It is the permanent version of 307.
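For illustration, assuming a server that answers with a 308 (the URL below is hypothetical), following the redirect would look like this once merged:
``` python
import requests
from requests.status_codes import codes

# With 308 handled like 301/302, the permanent redirect is followed
# transparently and the intermediate response is kept in r.history.
r = requests.get('http://example.com/moved-for-good')  # hypothetical URL
followed_308 = any(resp.status_code == codes.permanent_redirect
                   for resp in r.history)
```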
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2090/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2090/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2090.diff",
"html_url": "https://github.com/psf/requests/pull/2090",
"merged_at": "2014-06-12T04:19:44Z",
"patch_url": "https://github.com/psf/requests/pull/2090.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2090"
}
| true |
[
"Thanks for this!\n\nWe should go the whole hog though. HTTP 308 is not 'Resume' anymore, it's 'Permanent Redirect', as defined in [RFC 7238](http://tools.ietf.org/html/rfc7238). Mind making the relevant updates in `status_codes.py`?\n",
"Alright, went ahead and made changes to status_codes and everywhere else resume was used.\n",
"LGTM. :+1: \n",
"LGTM as well, thanks for this! :+1:\n\nWhat I've just noticed, however, is that this would be a breaking change. =P Sorry I didn't notice this earlier, but can we re-add the `resume_incomplete` and `resume` options? We'll remove them at 3.0 when we can break other people's code. =)\n",
"Put resume_incomplete and resume back in there. Commented them to be removed for 3.0\n",
"Brilliant, thank you so much. I'm happy. =D :+1: :cake:\n",
"Thanks @ericfrederich !\n",
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2089
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2089/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2089/comments
|
https://api.github.com/repos/psf/requests/issues/2089/events
|
https://github.com/psf/requests/pull/2089
| 35,393,249 |
MDExOlB1bGxSZXF1ZXN0MTY5NDYzNTM=
| 2,089 |
Update README.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5325221?v=4",
"events_url": "https://api.github.com/users/Mtax/events{/privacy}",
"followers_url": "https://api.github.com/users/Mtax/followers",
"following_url": "https://api.github.com/users/Mtax/following{/other_user}",
"gists_url": "https://api.github.com/users/Mtax/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Mtax",
"id": 5325221,
"login": "Mtax",
"node_id": "MDQ6VXNlcjUzMjUyMjE=",
"organizations_url": "https://api.github.com/users/Mtax/orgs",
"received_events_url": "https://api.github.com/users/Mtax/received_events",
"repos_url": "https://api.github.com/users/Mtax/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Mtax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mtax/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Mtax",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-06-10T15:22:32Z
|
2021-09-08T23:08:29Z
|
2014-06-10T16:25:08Z
|
NONE
|
resolved
|
Translated some important messages into Chinese.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2089/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2089/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2089.diff",
"html_url": "https://github.com/psf/requests/pull/2089",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2089.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2089"
}
| true |
[
"You likely meant to submit this [here](https://github.com/requests/requests-docs-cn).\n",
"Thanks a lot\n\nOn Wed, Jun 11, 2014 at 12:25 AM, Ian Cordasco [email protected]\nwrote:\n\n> You likely meant to submit this here\n> https://github.com/requests/requests-docs-cn.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2089#issuecomment-45637055\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/2088
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2088/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2088/comments
|
https://api.github.com/repos/psf/requests/issues/2088/events
|
https://github.com/psf/requests/pull/2088
| 35,379,929 |
MDExOlB1bGxSZXF1ZXN0MTY5Mzg2NjQ=
| 2,088 |
various pep-8 cleanups; remove unused imports; remove unused variables
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1280137?v=4",
"events_url": "https://api.github.com/users/jeffknupp/events{/privacy}",
"followers_url": "https://api.github.com/users/jeffknupp/followers",
"following_url": "https://api.github.com/users/jeffknupp/following{/other_user}",
"gists_url": "https://api.github.com/users/jeffknupp/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeffknupp",
"id": 1280137,
"login": "jeffknupp",
"node_id": "MDQ6VXNlcjEyODAxMzc=",
"organizations_url": "https://api.github.com/users/jeffknupp/orgs",
"received_events_url": "https://api.github.com/users/jeffknupp/received_events",
"repos_url": "https://api.github.com/users/jeffknupp/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeffknupp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeffknupp/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeffknupp",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-06-10T13:11:22Z
|
2021-09-08T23:11:07Z
|
2014-06-10T16:02:21Z
|
NONE
|
resolved
|
A number of PEP-8 related cleanups. Also removing unused imports and unused variables.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2088/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2088/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2088.diff",
"html_url": "https://github.com/psf/requests/pull/2088",
"merged_at": "2014-06-10T16:02:21Z",
"patch_url": "https://github.com/psf/requests/pull/2088.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2088"
}
| true |
[
"@jeffknupp In general this is great and I'm sure we'll be happy to have it. However, please note that we view PEP8 more as guidelines than actual rules (see #1899), and there's some things that Kenneth actively dislikes, so don't get too invested by changing other parts of the code. =)\n",
":sparkles: :cake: :sparkles:\n\nMany thanks. \n"
] |
https://api.github.com/repos/psf/requests/issues/2087
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2087/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2087/comments
|
https://api.github.com/repos/psf/requests/issues/2087/events
|
https://github.com/psf/requests/pull/2087
| 35,204,244 |
MDExOlB1bGxSZXF1ZXN0MTY4NTI3NjM=
| 2,087 |
Update trivial mentions to RFC 2616.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-06-07T08:56:07Z
|
2021-09-09T00:01:21Z
|
2014-06-08T14:44:26Z
|
MEMBER
|
resolved
|
As a companion to #2086, this updates all the other references to RFC 2616 in our codebase and documentation to their new locations. Gonna let @sigmavirus24 review this rather than merge it myself (which I'd normally do) just so he can sanity-check.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2087/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2087/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2087.diff",
"html_url": "https://github.com/psf/requests/pull/2087",
"merged_at": "2014-06-08T14:44:26Z",
"patch_url": "https://github.com/psf/requests/pull/2087.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2087"
}
| true |
[
"LGTM. :cake: \n"
] |
https://api.github.com/repos/psf/requests/issues/2086
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2086/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2086/comments
|
https://api.github.com/repos/psf/requests/issues/2086/events
|
https://github.com/psf/requests/issues/2086
| 35,202,262 |
MDU6SXNzdWUzNTIwMjI2Mg==
| 2,086 |
Remove ISO-8859-1 charset fallback
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 37 |
2014-06-07T06:54:12Z
|
2018-02-07T16:04:07Z
| null |
MEMBER
| null |
For a long time we've had a fallback value in `response.encoding` of `ISO-8859-1`, because RFC 2616 told us to. RFC 2616 is now obsolete, replaced by RFCs 7230, 7231, 7232, 7233, 7234, and 7235. The authoritative RFC on this issue is RFC 7231, which has this to say:
> The default charset of ISO-8859-1 for text media types has been removed; the default is now whatever the media type definition says.
The media type definitions for `text/*` are most recently affected by RFC 6657, which has this to say:
> In accordance with option (a) above, registrations for "text/*" media types that can transport charset information inside the corresponding payloads (such as "text/html" and "text/xml") SHOULD NOT specify the use of a "charset" parameter, nor any default value, in order to avoid conflicting interpretations should the "charset" parameter value and the value specified in the payload disagree.
I checked the registration for `text/html` [here](https://www.iana.org/assignments/media-types/media-types.xhtml#text). Unsurprisingly, it provides no default values. It does allow a charset parameter which overrides anything in the content itself.
I propose the following changes:
1. Remove the ISO-8859-1 fallback, as it's no longer valid (being only enforced by RFC 2616). We should _definitely_ do this.
2. Consider writing a module that has the appropriate fallback encodings for other `text/*` content and use them where appropriate. This isn't vital, just is a "might be nice".
3. Begin checking HTML content for meta tags again, in order to appropriately fall back. This is controversial, and we'll want @kennethreitz to consider it carefully.
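Until a decision is made, a caller-side sketch of sidestepping the ISO-8859-1 fallback, mirroring the `charset=` check discussed in the comments (the URL is hypothetical):
``` python
import requests

r = requests.get('https://example.org/page')  # hypothetical URL

# Only trust response.encoding when the server actually declared a charset;
# otherwise fall back to chardet's guess (apparent_encoding) rather than the
# ISO-8859-1 default.
if 'charset=' not in r.headers.get('Content-Type', ''):
    r.encoding = r.apparent_encoding

text = r.text
```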
| null |
{
"+1": 5,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 5,
"url": "https://api.github.com/repos/psf/requests/issues/2086/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2086/timeline
| null | null | null | null | false |
[
"> Remove the ISO-8859-1 fallback, as it's no longer valid (being only enforced by RFC 2616). We should definitely do this.\n\nI agree that we should remove the fallback. I do wonder how we should handle `Response#text` in the event that the server does not specify a charset (in anyway, including the meta tags of the body). Should we disable `Response#text` conditionally either through an exception or something else? Not doing so will rely more heavily on chardet, which I have decreasing confidence in given the number of new encodings it does not detect.\n\n> Consider writing a module that has the appropriate fallback encodings for other text/\\* content and use them where appropriate. This isn't vital, just is a \"might be nice\".\n\nGiven that this is not guaranteed to be included in requests, I'm fine with adding it to the toolbelt, that said. I'm also okay with making this a separate package so users can just use that with out having to install the rest of the toolbelt. That, however, is a separate discussion.\n\n> Begin checking HTML content for meta tags again, in order to appropriately fall back. This is controversial, and we'll want @kennethreitz to consider it carefully.\n\nWe still have a method to do this in `utils`, right? I don't like the idea in the slightest, but it won't cost extra effort. That said, we have to make sure any charset provided in the media type takes precedence.\n",
"Upon reading more int RFC 7231, specifically [Section 3.1.1.5](https://tools.ietf.org/html/rfc7231#section-3.1.1.5) I think the third option should ideally be opt-in, not opt-out. My specific reasoning for this is:\n\n> Clients that do so _[examine a payload's content]_ risk drawing incorrect conclusions, which might expose additional security risks (e.g., \"privilege escalation\").\n\nTaken from the same section I linked above.\n",
"Agreed from a correctness perspective, but I wonder if @kennethreitz is going to want it from a usability perspective.\n",
"I wonder how easy it would be to prop up a simple app to demonstrate the security risks involved to give a concrete example why not to do it.\n",
"If only `1.)` is going to be implemented, i guess `r.encoding = None` and requests will use chardet?\n",
"Correct. =)\n",
"That's how it works now, so I don't think we'd change that.\n",
"> I wonder how easy it would be to prop up a simple app to demonstrate the security risks involved to give a concrete example why not to do it.\n\nLook up all the UTF-7 XSS attacks. (None work in any current browser, as everyone simply dropped UTF-7 sniffing — and most UTF-7 support entirely — to avoid making sites thus vulnerable.)\n\nIn a very real sense, using chardet is _worse_ than option three above — it will make different conclusions to what any implementation following a specification defining how to sniff the content would (and both HTML and XML provide such a specification). The only safe thing to do is if you don't know how to determine the charset is to not try. You can probably support the vast majority of users by implementing the (standardised) HTML and XML character encoding detection algorithms.\n\n> I checked the registration for text/html here. Unsurprisingly, it provides no default values. It does allow a charset parameter which overrides anything in the content itself.\n\nHmm, the registration (which is included inline in the HTML spec) contradicts the HTML spec itself — per the HTML spec, UTF-8/UTF-16 BOMs are given precedence over the MIME type. I've filed [bug 26100](https://www.w3.org/Bugs/Public/show_bug.cgi?id=26100) for that.\n",
"Hmmm....\n",
"http://html5.org/tools/web-apps-tracker?from=8723&to=8724 fixes the IANA registration in the HTML spec to match the body of it. It now reads:\n\n> The charset parameter may be provided to specify the document's character encoding, overriding any character encoding declarations in the document other than a Byte Order Mark (BOM). The parameter's value must be one of the labels of the character encoding used to serialise the file. [ENCODING]\n",
"Hello,\n\nI just got hit by this reading XML files which were encoded as UTF8. On OSX the content type was being returned as 'application/xml' but on linux it was set to 'text/xml' therefore the requests lib assumed its default encoding of 'ISO-8859-1' as 'text' was in the content. Most XML files will be encoded in UTF8 so setting the encoding as 'ISO-8859-1' for 'text/xml' content is surely wrong as discussed.\n",
"Just because an RFC specifies something, doesn't mean we should do it. Especially if it makes the code crazy.\n\nI believe that our current behavior is elegant and actually works quite effectively. Is this not the case?\n\nAs always, what does Chrome do?\n",
"Chrome introspects the HTML, a position we've always decided we don't want to do. We could optionally add support for hooks to do content-type specific encoding heuristics if we wanted. We already kinda do that for JSON, it might not hurt to do it more generally for other content types.\n",
"Grumble. \n",
"Note the HTML case is even worse than that, really. Because the pre-scan in browsers just looks at the first 1024 bytes or there abouts, and the parser itself can then change it.\n",
"I still feel like our current behavior is a great implementation. \n",
"Ok. =)\n\nHow about I knock up a proposal for smarter encoding heuristics, that takes certain known content-types and attempts to gently introspect them for their encodings. At least then we'll have something concrete to discuss.\n",
"@Lukasa @kennethreitz \nHey guys, for the time being, I don't think we have a obvious solution yet, but can we at least make this `ISO-8859-1` optional? \n\n```\n if 'text' in content_type:\n return 'ISO-8859-1'\n```\n\nThis looks way too brutal. Some parameters like `Session(auto_encoding=False)` would be nice. What do you guys think?\n",
"@est ISO-8859-1 _is_ optional, you can simply set `response.encoding` yourself. It's a one-line change. =)\n",
"@Lukasa But you can't determine whether the initial `response.encoding` came from the `Content-Type` header or whether it's the `ISO-8859-1` fallback, which means if you want to avoid the fallback you have to start parsing the `Content-Type` header yourself, and that's quite a lot of extra complexity all of a sudden.\n",
"@gsnedders Sure you can. `if 'charset' in response.headers['Content-Type']`.\n",
"While yes, that works under the assumption the other party is doing something sane, it doesn't work in the face of madness and something like `text/html; foo=charset`.\n",
"@gsnedders Try `if 'charset=' in response.headers['Content-Type']`. At this point we're out of 'crazy' and into 'invalid formatting'.\n",
"@Lukasa uh, I thought there was whitespace (or rather what 2616 had as implicit *LWS) allowed around the equals, apparently not.\n\nThe grammar appears to be:\n\n```\n media-type = type \"/\" subtype *( OWS \";\" OWS parameter )\n type = token\n subtype = token\n\n The type/subtype MAY be followed by parameters in the form of\n name=value pairs.\n\n parameter = token \"=\" ( token / quoted-string )\n```\n\nSo I guess the only issue here is something like `text/html; foo=\"charset=bar\"`.\n\nFWIW, html5lib's API changes are going to make the correct behaviour with requests require something like:\n\n``` python\nr = requests.get('https://api.github.com/events')\ne = response.encoding if 'charset=' in response.headers['Content-Type'] else None\ntree = html5lib.parse(r.content, transport_encoding=e)\n```\n",
"That seems reasonable. =)\n\nFWIW, in requests 3.0.0 we'll reconsider this, or at least consider adding some way of working out what decision we made.\n",
"Interesting discussion.\n\n> I still feel like our current behavior is a great implementation.\n\nIf I may, a counterexample. I use requests to extract the `<title>` from a requested document, and here is facebook.com.\n\n``` python\n>>> import requests\n>>> r = requests.get(\"https://www.facebook.com/mathiassundin/posts/10153748227675479\")\n>>> r.encoding\n'ISO-8859-1'\n>>> import re\n>>> m = re.search('<title[^>]*>\\s*(.+?)\\s*<\\/title>', r.text, re.IGNORECASE|re.MULTILINE)\n>>> m.group(1)\nu'Mathias Sundin - M\\xc3\\xa5nga roliga samtal mellan Tony Blair och... | Facebook'\n>>> r.headers['Content-Type']\n'text/html'\n```\n\nApparently they don't specify the encoding in their headers. But they do in the HTML (`<meta charset=\"utf-8\" />`, full example at https://gist.github.com/dentarg/f13ef7cc0ce55753faf6).\n\nAs mentioned in #2161, \"requests aren't a HTML library\", and I can understand that point of view. Perhaps I should be looking into parsing the HTML with some library that also can take the specified encoding into consideration.\n\nThank you for your work on requests. I'm looking forward to 3.0.0.\n",
"@dentarg That's not really a counter example: it's an example of why this system works.\n\nThe guidance from the IETF is that for all text/\\* encodings, one of the following things MUST be true:\n- the peer MUST send a `charset` in the content-type\n- the content MUST carry an encoding specifier in it (HTML, XML)\n\nRequests' default behaviour here works well: in the case of XML and HTML, ISO-8859-1 is guaranteed to safely decode the text well enough to let you see the `<meta>` tag. Anyone working with HTML really should go looking for that tag, because servers almost never correctly advertise the content type of the HTML they deliver, but the HTML itself is usually (though sadly not always) right. \n\nThe behaviour requests has right now is probably less prone to failure with HTML than the one proposed for 3.0.0, and part of me does wonder if we should try to solve this more by documentation than by code change.\n",
"Yes, perhaps documentation is the way forward.\n\nI can share my experience. I'm a very new user of requests, and I haven't looked at all the documentation for requests, but I have looked some. The [requests website](http://docs.python-requests.org/en/latest/) have this in the code snippet on the front page:\n\n``` python\n>>> r.encoding\n'utf-8'\n```\n\nThat information, combined with me somehow (maybe from browsing issues here on GitHub) finding out that requests uses chardet, gave me the impression that requests would solve the charset/encoding problem for me \"all the way\".\n\nOnce I found the [right documentation](http://www.crummy.com/software/BeautifulSoup/bs3/documentation.html#Beautiful%20Soup%20Gives%20You%20Unicode,%20Dammit), it was straightforward to workaround the limitations with another library. I can fully understand that requests just want to be the HTTP parser, not the HTML parser.\n\nAll that said, let me share some short examples where I think the ISO-8859-1 fallback might cause some unexpected behavior.\n\n``` python\n>>> import requests\n>>> r = requests.get(\"https://www.facebook.com/mathiassundin/posts/10153748227675479\")\n>>> from bs4 import BeautifulSoup\n```\n\nYou can't use `r.text`:\n\n``` python\n>>> print BeautifulSoup(r.text, \"html5lib\", from_encoding=\"\").title.text\nMathias Sundin - MÃ¥nga roliga samtal mellan Tony Blair och... | Facebook\n```\n\n``` python\n>>> print BeautifulSoup(r.text, \"html5lib\", from_encoding=r.encoding).title.text\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/site-packages/bs4/__init__.py\", line 215, in __init__\n self._feed()\n File \"/usr/local/lib/python2.7/site-packages/bs4/__init__.py\", line 239, in _feed\n self.builder.feed(self.markup)\n File \"/usr/local/lib/python2.7/site-packages/bs4/builder/_html5lib.py\", line 50, in feed\n doc = parser.parse(markup, encoding=self.user_specified_encoding)\n File \"/usr/local/lib/python2.7/site-packages/html5lib/html5parser.py\", line 236, in parse\n parseMeta=parseMeta, useChardet=useChardet)\n File \"/usr/local/lib/python2.7/site-packages/html5lib/html5parser.py\", line 89, in _parse\n parser=self, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/html5lib/tokenizer.py\", line 40, in __init__\n self.stream = HTMLInputStream(stream, encoding, parseMeta, useChardet)\n File \"/usr/local/lib/python2.7/site-packages/html5lib/inputstream.py\", line 144, in HTMLInputStream\n raise TypeError(\"Cannot explicitly set an encoding with a unicode string\")\nTypeError: Cannot explicitly set an encoding with a unicode string\n```\n\nYou need to use `r.content`, but if requests have fallen back to ISO-8859-1, `r.encoding` will cause trouble:\n\n``` python\n>>> print BeautifulSoup(r.content, \"html5lib\", from_encoding=r.encoding).title.text\nMathias Sundin - MÃ¥nga roliga samtal mellan Tony Blair och... | Facebook\n```\n\nWorking as intended:\n\n``` python\n>>> print BeautifulSoup(r.content, \"html5lib\", from_encoding=\"\").title.text\nMathias Sundin - Många roliga samtal mellan Tony Blair och... | Facebook\n```\n\n``` python\n>>> print BeautifulSoup(r.content, \"html5lib\", from_encoding=\"utf-8\").title.text\nMathias Sundin - Många roliga samtal mellan Tony Blair och... | Facebook\n```\n",
"So here we hit a problem, which is that we can't really document the way Requests interacts with every possible tool you may want to use it with: there are just too many possibilities! So instead we try to provide general documentation.\n\nThe best advice I can give is that the more specific the tool, the more likely it can handle bytes as an input and DTRT with them. BeautifulSoup is a HTML tool and so can almost certainly take a stream of arbitrary bytes and find the relevant `meta` tag (and indeed, it can), so you can just pass it `r.content`. The same would be true if you were passing it to an XML library, or to a JSON library, or to anything else like that.\n",
"> Requests' default behaviour here works well: in the case of XML and HTML, ISO-8859-1 is guaranteed to safely decode the text well enough to let you see the `<meta>` tag. \n\nSadly, that isn't true, because of UTF-16. In both cases, they should be able to handle BOMs. Furthermore, a conforming XML parser should be able to handle a UTF-16 encoded `<?xml version=\"1.0\" encoding=\"UTF-16\">` with no preceding BOM.\n"
] |
https://api.github.com/repos/psf/requests/issues/2085
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2085/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2085/comments
|
https://api.github.com/repos/psf/requests/issues/2085/events
|
https://github.com/psf/requests/issues/2085
| 35,133,134 |
MDU6SXNzdWUzNTEzMzEzNA==
| 2,085 |
NTLM auth broken in 2.3.0?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7807926?v=4",
"events_url": "https://api.github.com/users/rekcahpassyla/events{/privacy}",
"followers_url": "https://api.github.com/users/rekcahpassyla/followers",
"following_url": "https://api.github.com/users/rekcahpassyla/following{/other_user}",
"gists_url": "https://api.github.com/users/rekcahpassyla/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rekcahpassyla",
"id": 7807926,
"login": "rekcahpassyla",
"node_id": "MDQ6VXNlcjc4MDc5MjY=",
"organizations_url": "https://api.github.com/users/rekcahpassyla/orgs",
"received_events_url": "https://api.github.com/users/rekcahpassyla/received_events",
"repos_url": "https://api.github.com/users/rekcahpassyla/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rekcahpassyla/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rekcahpassyla/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rekcahpassyla",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-06-06T09:59:28Z
|
2021-09-08T23:11:02Z
|
2014-06-06T10:00:58Z
|
NONE
|
resolved
|
Hi,
I am not sure exactly where this should go (here or https://github.com/requests/requests-ntlm/) so I will post it in both places.
I have been using `requests-ntlm` (post-version 0.0.2.2) in conjunction with `requests 2.2.1`. Recently I upgraded to `requests 2.3.0` and found that NTLM authentication no longer worked; every request was rejected with 401 Unauthorized.
Side-by-side debugging of 2.2.1 vs 2.3.0 yielded the result that [this deletion](https://github.com/kennethreitz/requests/compare/v2.2.1...v2.3.0#diff-31f6e77c031977d33226530924b4337aL393) causes a connection not to be put into the pool. I can fix this by patching `requests-ntlm` but I am not familiar with either library, so while I know that this will solve my problem I wanted to mention this here in case it might be related to any other problems to do with `requests`.
I'm happy to provide debug logs or further details.
Many thanks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2085/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2085/timeline
| null |
completed
| null | null | false |
[
"This should be posted only on requests-ntlm. =)\n",
"Thanks, am going to repost now. \n"
] |
https://api.github.com/repos/psf/requests/issues/2084
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2084/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2084/comments
|
https://api.github.com/repos/psf/requests/issues/2084/events
|
https://github.com/psf/requests/issues/2084
| 34,999,129 |
MDU6SXNzdWUzNDk5OTEyOQ==
| 2,084 |
Verify https, without exception
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/38136?v=4",
"events_url": "https://api.github.com/users/RuudBurger/events{/privacy}",
"followers_url": "https://api.github.com/users/RuudBurger/followers",
"following_url": "https://api.github.com/users/RuudBurger/following{/other_user}",
"gists_url": "https://api.github.com/users/RuudBurger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/RuudBurger",
"id": 38136,
"login": "RuudBurger",
"node_id": "MDQ6VXNlcjM4MTM2",
"organizations_url": "https://api.github.com/users/RuudBurger/orgs",
"received_events_url": "https://api.github.com/users/RuudBurger/received_events",
"repos_url": "https://api.github.com/users/RuudBurger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/RuudBurger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RuudBurger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/RuudBurger",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2014-06-04T20:21:54Z
|
2021-09-08T23:11:03Z
|
2014-06-04T20:33:07Z
|
NONE
|
resolved
|
I'm looking for a way to verify the certificate, but still continue with the actual request.
This way I can create a message, notifying the user, but returning the content.
See:
https://github.com/RuudBurger/CouchPotatoServer/issues/3357 or https://github.com/RuudBurger/CouchPotatoServer/issues/3369
If I check both
http://www.sslshopper.com/ssl-checker.html#hostname=torrentshack.net
http://www.sslshopper.com/ssl-checker.html#hostname=nzbs.in
They seem fine to me, but they fail using requests
There are also a lot of users hitting self-signed certificates; giving just a warning would be good enough there as well.
I've tried creating my own adapter, extending HTTPAdapter, but then I have to make a request and, if that fails with SSLError, remove verify=True and do the request again. Seems a bit overkill to me.
Hope you can help me out.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/38136?v=4",
"events_url": "https://api.github.com/users/RuudBurger/events{/privacy}",
"followers_url": "https://api.github.com/users/RuudBurger/followers",
"following_url": "https://api.github.com/users/RuudBurger/following{/other_user}",
"gists_url": "https://api.github.com/users/RuudBurger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/RuudBurger",
"id": 38136,
"login": "RuudBurger",
"node_id": "MDQ6VXNlcjM4MTM2",
"organizations_url": "https://api.github.com/users/RuudBurger/orgs",
"received_events_url": "https://api.github.com/users/RuudBurger/received_events",
"repos_url": "https://api.github.com/users/RuudBurger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/RuudBurger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RuudBurger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/RuudBurger",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2084/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2084/timeline
| null |
completed
| null | null | false |
[
"It's not entirely clear what you want here. You want to verify the certificate, but then continue even if the certificate is invalid?\n",
"Yeah, I just need to catch the error. Like a browser does, saying \"maybe not safe, do you want to continue\".\n",
"In the cases you're talking about they're clearly using SNI. Python 2 does not do SNI out of the box, so you need to [install these packages](https://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484).\n\nAs for doing that, you should do exactly that: catch the error, and then make the request again. =)\n",
"Thanks, so I have this:\n\n```\nclass HTTPSMaybeAdapter(HTTPAdapter):\n\n def send(self, *args, **kwargs):\n\n try:\n return super(HTTPSMaybeAdapter, self).send(*args, **kwargs)\n except SSLError as e:\n log.error('SSL verification failed, could be unsafe connection: %s', e)\n\n kwargs['verify'] = False\n return super(HTTPSMaybeAdapter, self).send(*args, **kwargs)\n```\n\nThats what the best/simplest options is then?\n",
"The simplest option is genuinely at the top level:\n\n``` python\ns = requests.Session()\n\ntry:\n r = s.get(url)\nexcept requests.exceptions.SSLError:\n r = s.get(url, verify=False)\n```\n",
"Just wanted to mention that both implementations don't work. I always get `ConnectionError: HTTPSConnectionPool(host='hostenameofwebsite', port=443): Max retries exceeded with url: / (Caused by <class 'httplib.CannotSendRequest'>: )`.\n",
"That has nothing to do with HTTPS, and everything to do with the `httplib.CannotSendRequest` exception. I wonder if we leave the connection in a bad state if we fail the SSL validation.\n",
"Do you want me to open a new issue for this?\n",
"Not sure, let's sit on it here for a moment.\n",
"So I don't seem to be able to reproduce it, which is a little odd.\n",
"```\nurl = 'https://torrentshack.net'\ns = requests.Session()\n\ntry:\n r = s.get(url)\nexcept requests.exceptions.SSLError:\n r = s.get(url, verify = False)\n```\n",
"Works fine for me. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2083
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2083/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2083/comments
|
https://api.github.com/repos/psf/requests/issues/2083/events
|
https://github.com/psf/requests/issues/2083
| 34,827,947 |
MDU6SXNzdWUzNDgyNzk0Nw==
| 2,083 |
Sending cookies even though I specify None
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1095574?v=4",
"events_url": "https://api.github.com/users/JakeAustwick/events{/privacy}",
"followers_url": "https://api.github.com/users/JakeAustwick/followers",
"following_url": "https://api.github.com/users/JakeAustwick/following{/other_user}",
"gists_url": "https://api.github.com/users/JakeAustwick/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JakeAustwick",
"id": 1095574,
"login": "JakeAustwick",
"node_id": "MDQ6VXNlcjEwOTU1NzQ=",
"organizations_url": "https://api.github.com/users/JakeAustwick/orgs",
"received_events_url": "https://api.github.com/users/JakeAustwick/received_events",
"repos_url": "https://api.github.com/users/JakeAustwick/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JakeAustwick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JakeAustwick/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JakeAustwick",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-06-03T02:43:17Z
|
2021-09-08T23:11:04Z
|
2014-06-03T14:25:33Z
|
NONE
|
resolved
|
Sorry if this is a stupid question, but I _think_ it could be a bug. Is there any reason requests is sending cookies with this request?
``` python
>>> headers = {'Cookie': None}
>>> r = requests.get('http://www.linkedin.com/in/xxx/', headers=headers)
>>> print r.request.headers
CaseInsensitiveDict({'Accept-Language': 'en-US,en;q=0.8', 'Cookie': 'bcookie="v=2&49d15313-08b9-45e7-894f-8f5c76614c04"; lidc="b=VB61:g=66:u=1:i=1401763223:t=1401849623:s=1820628467"; moflow_session_id=1465875475464192', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'User-Agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 7_1 like Mac OS X) AppleWebKit/537.51.2 (KHTML, like Gecko) Version/7.0 Mobile/11D167 Safari/9537.53'})
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1095574?v=4",
"events_url": "https://api.github.com/users/JakeAustwick/events{/privacy}",
"followers_url": "https://api.github.com/users/JakeAustwick/followers",
"following_url": "https://api.github.com/users/JakeAustwick/following{/other_user}",
"gists_url": "https://api.github.com/users/JakeAustwick/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JakeAustwick",
"id": 1095574,
"login": "JakeAustwick",
"node_id": "MDQ6VXNlcjEwOTU1NzQ=",
"organizations_url": "https://api.github.com/users/JakeAustwick/orgs",
"received_events_url": "https://api.github.com/users/JakeAustwick/received_events",
"repos_url": "https://api.github.com/users/JakeAustwick/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JakeAustwick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JakeAustwick/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JakeAustwick",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2083/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2083/timeline
| null |
completed
| null | null | false |
[
"The change in UA is to do with some other unrelated code, unrelated to cookies; just to be sure:\n\n```\nclass Session(requests.sessions.Session):\n def __init__(self, *args, **kwargs):\n super(Session, self).__init__(*args, **kwargs)\n self.headers['Accept-Language'] = 'en-US,en;q=0.8'\n\n if os.path.isfile('useragents.txt'):\n self.headers['User-Agent'] = random.choice([l.strip() for l in open('useragents.txt')])\n```\n",
"It looks to me like there may be a bug where the response's 'Cookie' field is found in request.header, rather than in the response header as expected.\n",
"No, neither of these are bugs. =)\n\nThe reason you're sending a cookie is because you got redirected: see `r.history` for evidence of it. This redirect set a cookie, which we are now sending back. As for there being a bug where the response's `Cookie` field is found on the request header, also no. The request that caused the specific response we're looking at _did_ have a cookie header on it.\n",
"@Lukasa Makes sense, thankyou.\n"
] |
https://api.github.com/repos/psf/requests/issues/2082
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2082/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2082/comments
|
https://api.github.com/repos/psf/requests/issues/2082/events
|
https://github.com/psf/requests/issues/2082
| 34,795,583 |
MDU6SXNzdWUzNDc5NTU4Mw==
| 2,082 |
Add HTTP 2.0 support
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/772?v=4",
"events_url": "https://api.github.com/users/alex/events{/privacy}",
"followers_url": "https://api.github.com/users/alex/followers",
"following_url": "https://api.github.com/users/alex/following{/other_user}",
"gists_url": "https://api.github.com/users/alex/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alex",
"id": 772,
"login": "alex",
"node_id": "MDQ6VXNlcjc3Mg==",
"organizations_url": "https://api.github.com/users/alex/orgs",
"received_events_url": "https://api.github.com/users/alex/received_events",
"repos_url": "https://api.github.com/users/alex/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alex",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
}
] |
closed
| true | null |
[] | null | 3 |
2014-06-02T17:50:17Z
|
2021-09-08T23:10:52Z
|
2014-07-07T16:13:33Z
|
MEMBER
|
resolved
|
It would be great if requests could make HTTP 2.0 requests.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2082/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2082/timeline
| null |
completed
| null | null | false |
[
"It can: http://hyper.readthedocs.org/en/development/quickstart.html#requests-integration\n",
"As a follow-on, I want hyper to be added to urllib3, which is being tracked in Lukasa/hyper#18. This'll give requests transparent HTTP/2 support.\n",
"Awesome, thanks.\n\nOn Mon, Jun 2, 2014 at 10:54 AM, Cory Benfield [email protected]\nwrote:\n\n> As a follow-on, I want hyper to be added to urllib3, which is being\n> tracked in Lukasa/hyper#18 https://github.com/Lukasa/hyper/issues/18.\n> This'll give requests transparent HTTP/2 support.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2082#issuecomment-44869767\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n"
] |
https://api.github.com/repos/psf/requests/issues/2081
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2081/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2081/comments
|
https://api.github.com/repos/psf/requests/issues/2081/events
|
https://github.com/psf/requests/issues/2081
| 34,770,233 |
MDU6SXNzdWUzNDc3MDIzMw==
| 2,081 |
Sometimes Requests doesn't handle properly domains encoded in Cyrillic
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/312759?v=4",
"events_url": "https://api.github.com/users/ntoshev/events{/privacy}",
"followers_url": "https://api.github.com/users/ntoshev/followers",
"following_url": "https://api.github.com/users/ntoshev/following{/other_user}",
"gists_url": "https://api.github.com/users/ntoshev/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ntoshev",
"id": 312759,
"login": "ntoshev",
"node_id": "MDQ6VXNlcjMxMjc1OQ==",
"organizations_url": "https://api.github.com/users/ntoshev/orgs",
"received_events_url": "https://api.github.com/users/ntoshev/received_events",
"repos_url": "https://api.github.com/users/ntoshev/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ntoshev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ntoshev/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ntoshev",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-06-02T13:02:04Z
|
2021-09-08T23:11:03Z
|
2014-06-03T14:57:40Z
|
NONE
|
resolved
|
The domain gets URL-encoded:
``` python
>>> r=requests.get(u'http://www.тв-програма.bg')
>>> r.status_code
200
>>> requests.get(u'http://www.тв-програма.bg/predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311//')
Traceback (most recent call last):
File "<pyshell#20>", line 1, in <module>
requests.get(u'http://www.тв-програма.bg/predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311//')
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 585, in send
history = [resp for resp in gen] if allow_redirects else []
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 179, in resolve_redirects
allow_redirects=False,
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 375, in send
raise ConnectionError(e, request=request)
ConnectionError: HTTPConnectionPool(host='www.%D1%82%D0%B2-%D0%BF%D1%80%D0%BE%D0%B3%D1%80%D0%B0%D0%BC%D0%B0.bg', port=80): Max retries exceeded with url: /predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311/ (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
```
That URL appears in a webpage when crawling, and it works when pasted in a browser.
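One workaround sketch (not part of the original report, and using the same Cyrillic URL only as an example): take control of the redirects yourself so each `Location` value can be inspected, and the hostname re-encoded if necessary, before the next request is made:
``` python
import requests

s = requests.Session()
r = s.get(u'http://www.тв-програма.bg', allow_redirects=False)

# Follow the redirect chain manually so each Location header can be
# inspected (and the hostname IDNA-encoded, if needed) before the
# next request is issued.
while 300 <= r.status_code < 400:
    r = s.get(r.headers['Location'], allow_redirects=False)
```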
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2081/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2081/timeline
| null |
completed
| null | null | false |
[
"I think you've misidentified the problem. Judging by your first request, we're treating the domain name just fine (and by debugging into the request logic, both requests domains are treated exactly the same). I visited the URL that causes the connection error and my browser tells me that it cannot find the server. Something about that URL is broken but it isn't requests.\n\n---\n\nFor the sake of others reading this issue, I placed a trace call in `requests.adapters.py` at line 316 (right before `if not chunked`) and examined the prepared requests for both calls, here's what I found:\n\n``` pycon\n(Pdb) p request.url\n'http://www.xn----8sbafg9clhjcp.bg/'\n(Pdb) p request.url\n'http://www.xn----8sbafg9clhjcp.bg/predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311//'\n(Pdb) p conn.host\n'www.xn----8sbafg9clhjcp.bg'\n```\n\nContinuing raises this exception:\n\n```\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='www.%D1%82%D0%B2-%D0%BF%D1%80%D0%BE%D0%B3%D1%80%D0%B0%D0%BC%D0%B0.bg', port=80): Max retries exceeded with url: /predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311/ (Caused by <class 'socket.gaierror'>: [Errno 8] nodename nor servname provided, or not known)\n```\n",
"Thanks for raising this issue!\n\nThe problem comes from redirects. The website in question in the failing case sends the following header:\n\n``` python\n>>> r.headers['Location']\n'http://www.\\xd1\\x82\\xd0\\xb2-\\xd0\\xbf\\xd1\\x80\\xd0\\xbe\\xd0\\xb3\\xd1\\x80\\xd0\\xb0\\xd0\\xbc\\xd0\\xb0.bg/predavane/%D0%95%D0%BA%D1%81%D0%BF%D0%B5%D0%B4%D0%B8%D1%86%D0%B8%D0%B8%D1%82%D0%B5-%D0%BD%D0%B0-%D0%94%D0%B6%D0%B5%D1%84-%D0%9A%D0%BE%D1%80%D1%83%D0%B8%D0%BD,1771305311/'\n```\n\nNote that this is a UTF-8 encoded string, which is in violation of RFC 2616:\n\n> Words of *TEXT MAY contain characters from character sets other than ISO- 8859-1 [22] only when encoded according to the rules of RFC 2047 [14].\n\nOf course, if it was encoded in that manner we still would have fallen over so that's not all that helpful to us (note: should we support decoding RFC 2047 headers in future? Might be nice).\n\nMore generally, what are we supposed to do here? We could put the header through our full header processing, but then we'd have to decode the header as UTF-8 and that violates spec. If we decoded as Latin-1 we'd still not be able to reach it.\n\nI'd argue the server should have sent the IDNA-encoded hostname, not the UTF-8 encoded one, but I'm not enough of an expert to be sure. I'll ask on Stack Overflow.\n",
"Stack overflow question is [here](http://stackoverflow.com/questions/23995781/how-should-internationalised-domain-names-idna-be-sent-in-the-location-header).\n",
"And our answer appears: the upstream server is at fault. From SO:\n\n> It must be a valid HTTP URI (as per RFCs 3986 and 7230), thus non-ASCII characters in the host name will need to be IDNA-encoded.\n",
"Wow, thanks for tracking this down!\n\nThis case is not critical to me, but in general I would expect that whatever works in browsers works in requests as well, even if the server is not behaving correctly. This page works in Chrome and fails in Firefox, so it's in a grey area really (actually in Firefox it doesn't load the first time, but the target location is shown in the address bar and if you press Enter again, it loads).\n",
"Yeah, the browsers have trouble here, but they have an advantage we don't: they can easily speculatively perform DNS resolution on the hostnames. This means that they can receive the `Location` header and immediately perform asynchronous DNS lookups on all those hostnames, attempting to work out which one it might be based on which hostname exists.\n\nThis is really difficult for requests to do because requests is fundamentally synchronous. Spawning three or four DNS lookups to try to find the right one basically requires us to either:\n1. Spawn threads to do the DNS lookups. Libraries should never spawn their own threads, so this is unappealing.\n2. Do the DNS lookups synchronously. DNS can be very slow, and doing two or three extra DNS lookups will add many hundreds of ms to our resolution time. This is also unappealing.\n\nUltimately, we're between a rock and a hard place. Browsers will always be able to do things we can't do, because they're bigger, faster and have a more specific use-case. We need to be able to do our best. In this case, we can let you be like Firefox, and take control of the redirection yourself:\n\n``` python\ns = requests.Session()\nr = s.get(url, allow_redirects=False)\n\nwhile r.status_code in range(300, 400):\n r = s.get(r.headers['Location'], allow_redirects=False)\n```\n",
"> in general I would expect that whatever works in browsers works in requests as well, even if the server is not behaving correctly.\n\nRequests is not a browser. It stores cookies for you and handles redirects. There is a lot more that a browser does that requests does not. For example, browsers\n- allow websites to store flash and other types of cookies\n- run JavaScript that can manipulate the document you receive initially\n- use CSS media queries to change how the document is rendered\n- will ignore blatantly bad data returned by a server and do _something_ \"intelligent\" with it\n- may stop a server from sending an endless stream of headers\n- and much much more\n\nExpecting requests to behave like a browser is unreasonable not only because there's so much we can't do, but also because there's no way we can make decisions about what we should do in undefined or poorly defined cases.\n\nUnless I'm misunderstanding the comments here, this issue can be closed, right?\n"
] |
https://api.github.com/repos/psf/requests/issues/2080
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2080/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2080/comments
|
https://api.github.com/repos/psf/requests/issues/2080/events
|
https://github.com/psf/requests/issues/2080
| 34,768,482 |
MDU6SXNzdWUzNDc2ODQ4Mg==
| 2,080 |
Cookie deletion test missing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/410452?v=4",
"events_url": "https://api.github.com/users/sylvinus/events{/privacy}",
"followers_url": "https://api.github.com/users/sylvinus/followers",
"following_url": "https://api.github.com/users/sylvinus/following{/other_user}",
"gists_url": "https://api.github.com/users/sylvinus/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sylvinus",
"id": 410452,
"login": "sylvinus",
"node_id": "MDQ6VXNlcjQxMDQ1Mg==",
"organizations_url": "https://api.github.com/users/sylvinus/orgs",
"received_events_url": "https://api.github.com/users/sylvinus/received_events",
"repos_url": "https://api.github.com/users/sylvinus/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sylvinus/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sylvinus/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sylvinus",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-06-02T12:37:12Z
|
2021-09-08T23:11:04Z
|
2014-06-03T14:56:52Z
|
NONE
|
resolved
|
Hello!
I _think_ Cookie deletion (servers sending "Set-Cookie: name=;") is not working in requests. I don't see any tests using "/cookies/delete?name" from httpbin so it may not be tested yet?
Thanks!
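For what it's worth, a minimal check of this behaviour against httpbin with a Session (a sketch, assuming httpbin.org is reachable):
``` python
import requests

s = requests.Session()
s.get('https://httpbin.org/cookies/set?name=value')
print(s.cookies)  # the jar now contains the 'name' cookie

s.get('https://httpbin.org/cookies/delete?name')
print(s.cookies)  # the jar is empty again, so the deletion was honoured
```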
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2080/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2080/timeline
| null |
completed
| null | null | false |
[
"@sylvinus 100% of that level of cookie handling is handled by the standard library (cookielib). We simply implement a few extra methods on top of the standard Cookie Jar.\n\nA quick test shows that this does work as expected:\n\n``` pycon\n>>> import requests\n>>> s = requests.Session()\n>>> s.get('https://httpbin.org/cookies/set?name=value')\n<Response [200]>\n>>> s.cookies\n<<class 'requests.cookies.RequestsCookieJar'>[Cookie(version=0, name='name', value='value', port=None, port_specified=False, domain='httpbin.org', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=False, expires=None, discard=True, comment=None, comment_url=None, rest={}, rfc2109=False)]>\n>>> s.get('https://httpbin.org/cookies/delete?name')\n<Response [200]>\n>>> s.cookies\n<<class 'requests.cookies.RequestsCookieJar'>[]>\n```\n\nDo you have a problem with some other website?\n",
"Okay thanks for the sample! I think this should still be included in requests tests since it's a behaviour that depends on the Session object.\n\nThat was actually the source of my confusion:\n\n```\n>>> requests.get(\"https://httpbin.org/cookies/set?name=value\").history[0].cookies\n<<class 'requests.cookies.RequestsCookieJar'>[Cookie(version=0, name='name', value='value', port=None, port_specified=False, domain='httpbin.org', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=False, expires=None, discard=True, comment=None, comment_url=None, rest={}, rfc2109=False)]>\n>>> requests.get(\"https://httpbin.org/cookies/delete?name\").history[0].cookies\n<<class 'requests.cookies.RequestsCookieJar'>[]>\n```\n\nI would have expected in the second case to have a way to see an empty cookie was sent (besides using a Session or manually inspecting the headers, both of which I don't want to do).\n",
"Uh...how would you expect to see a _deleted_ cookie? I mean in terms of interface?\n",
"I guess that's an open question :-) I manage a list of cookies separately and need a way to see if the remote server tried to delete one of them. Maybe the only way to do that properly is to write a custom cookie jar? \n",
"@sylvinus what you seem to need is to either use a lower level solution so you can pass the proper class to the cookielib methods or to write a custom cookie jar that will be able to use a Response instance from requests retrieve all of the information it needs. If you need further help with this, I would suggest asking a question on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n\nRegarding adding a test I'm -0, because the tests are already slow enough.\n",
"I think a custom cookie jar is the way forward here. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2079
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2079/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2079/comments
|
https://api.github.com/repos/psf/requests/issues/2079/events
|
https://github.com/psf/requests/issues/2079
| 34,734,790 |
MDU6SXNzdWUzNDczNDc5MA==
| 2,079 |
requests.Session should send "Referer" headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1210845?v=4",
"events_url": "https://api.github.com/users/za-creature/events{/privacy}",
"followers_url": "https://api.github.com/users/za-creature/followers",
"following_url": "https://api.github.com/users/za-creature/following{/other_user}",
"gists_url": "https://api.github.com/users/za-creature/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/za-creature",
"id": 1210845,
"login": "za-creature",
"node_id": "MDQ6VXNlcjEyMTA4NDU=",
"organizations_url": "https://api.github.com/users/za-creature/orgs",
"received_events_url": "https://api.github.com/users/za-creature/received_events",
"repos_url": "https://api.github.com/users/za-creature/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/za-creature/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/za-creature/subscriptions",
"type": "User",
"url": "https://api.github.com/users/za-creature",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-06-01T19:22:13Z
|
2021-09-08T23:11:05Z
|
2014-06-01T20:28:49Z
|
NONE
|
resolved
|
From my understanding, the point of requests.Session is that it emulates what a dumb\* browser does when asked to render a webpage. This means keeping cookies, following redirects and, in my opinion, setting the HTTP "Referer" header.
The Session class doesn't seem to touch either the Referer header or its properly spelled variant. I consider this to be non-pythonic because both the documentation and the "Session" class name suggest that all requests in a session are performed sequentially, and each individual request may update the state of the session by setting cookies, or by returning 3xx responses which in turn forward the session to a different URL.
As such, I propose setting a default value for the "referer" header (perhaps setting the properly spelled "referrer" header as well for future-proofing) to the final URL of the most recently executed request. This seems to be consistent with the way cookies are handled, in that performing a request will automatically parse and update any cookies in the jar. Since this will break backwards compatibility (for a few servers, an undefined referrer is better than a wrong one), this should probably be implemented behind a check of Session.follow_referrer, which should default to False.
I can create a patch for this if the package maintainer agrees that this is a bug.
\* By dumb I mean a browser incapable of rendering images, executing scripts or handling stylesheets and limited to just parsing and rendering HTML, which is arguably what most robots do.
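For comparison, a sketch of maintaining the referrer by hand with a Session (the URL list below is a placeholder; `r.request.url` is the final URL after any redirects):
``` python
import requests

urls = ['https://httpbin.org/get', 'https://httpbin.org/headers']  # placeholder list

s = requests.Session()
headers = {}

for url in urls:
    r = s.get(url, headers=headers)
    # Remember the URL we just fetched and offer it as the next referrer.
    headers['Referer'] = r.request.url
```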
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1210845?v=4",
"events_url": "https://api.github.com/users/za-creature/events{/privacy}",
"followers_url": "https://api.github.com/users/za-creature/followers",
"following_url": "https://api.github.com/users/za-creature/following{/other_user}",
"gists_url": "https://api.github.com/users/za-creature/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/za-creature",
"id": 1210845,
"login": "za-creature",
"node_id": "MDQ6VXNlcjEyMTA4NDU=",
"organizations_url": "https://api.github.com/users/za-creature/orgs",
"received_events_url": "https://api.github.com/users/za-creature/received_events",
"repos_url": "https://api.github.com/users/za-creature/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/za-creature/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/za-creature/subscriptions",
"type": "User",
"url": "https://api.github.com/users/za-creature",
"user_view_type": "public"
}
|
{
"+1": 5,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 3,
"laugh": 0,
"rocket": 0,
"total_count": 8,
"url": "https://api.github.com/repos/psf/requests/issues/2079/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2079/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nThis is an interesting idea. The key problem is that it's difficult for requests to correctly fill out the Referer header. From RFC 2616:\n\n> The Referer[sic] request-header field allows the client to specify, for the server's benefit, the address (URI) of the resource from which the Request-URI was obtained (the \"referrer\", although the header field is misspelled.) \n\nThe problem is that, even with a `Session`, we don't know what resource provided the URI. We can _guess_ that it might be the resource we just fetched, but we simply can't prove it. In a lot of cases it won't be, and in those cases we're not being helpful, we're just leaking user data to unrelated websites (and that's a terrible thing to do).\n\nBrowsers can do this correctly because they know when you've clicked on a link versus when you typed in the URI bar, but we don't have that luxury.\n\nFor that reason, I don't think this is a good idea. Does that seem like a convincing argument to you?\n",
"I definitely agree with you on the \"it's difficult\" part. Figuring out who provided the target URI is basically guesswork, and I won't deny that. I've stumbled across this because I am implementing a crawler for a website that doesn't seem to like robots (well, it likes GoogleBot, but since they're being racist about it, I figured I'll just emulate a regular browser that they can't afford to ban)\n\nYou definitely can't prove that the referrer is the most recently fetched resource, and yes, in a lot of cases it won't be. This is why I was suggesting that the functionality should be implemented in an opt-in fashion (not unlike the redirect following mechanism).\n\nAs for leaking data, while it's a valid concern, it's not something that's not readily available in virtually every browser that uses the default config. Whenever a webpage requests a resource (image, stylesheet, script, object; doesn't matter if local or remote), pretty much every browser sets the referrer to the requesting webpage (and this is without any user interaction whatsoever). I'm not 100% about ajax requests, but those probably default to the URL they were loaded from as well.\n\nSo with all due respect, I have to say that no, your argument is not yet convincing enough for me to drop this.\n",
"Here are my follow-ups:\n\n> This is why I was suggesting that the functionality should be implemented in an opt-in fashion (not unlike the redirect following mechanism).\n\nRedirect-following isn't opt-in, it's opt-out. =) More generally, the requests philosophy is that either a behaviour ought to be the default (as in, generally useful), or it should be easy to achieve externally. Maintaining a referrer is very easy to do:\n\n``` python\ns = requests.Session()\nheaders = {}\n\nwhile urls:\n url = urls.pop()\n r = s.get(url, headers=headers)\n headers['Referer'] = r.request.url\n```\n\nSo right now the requirement is to convince me why something that's easy to do and generally not needed should be an opt-out behaviour (because we don't add opt-in behaviours).\n\n> As for leaking data, while it's a valid concern, it's not something that's not readily available in virtually every browser that uses the default config. Whenever a webpage requests a resource (image, stylesheet, script, object; doesn't matter if local or remote), pretty much every browser sets the referrer to the requesting webpage (and this is without any user interaction whatsoever).\n\nRight, but that wasn't my concern: this is the _correct_ use of `Referer`. My concern is when you are hitting _unrelated_ URLs, at which point you are both a) sending invalid referrers and b) effectively sending a third-party your HTTP history when you should not have.\n",
"Your provided solution is more or less what I did:\n\n``` python\nhttp = requests.Session()\nwhile queue:\n url, data = queue.popleft()\n response = http.get(url, headers={\"referer\": referrer})\n referrer = response.url\n```\n\nI was not aware of requests' \"opt-out-only\" philosophy, but I can't really say that I disagree with it. Either way, it seems that my primary argument is null and void, and as such, I don't have any more objections if you want to close the issue.\n\nAs for your other point, for the sake of argument I'll say that if you're hitting unrelated URLs, you shouldn't be using sessions to begin with (at the very least, cookies will not persist cross domain). For brevity, similar functionality can be achieved by combining regular requests with functools.partial.\n\nAnyway, I just want to say thanks for the library. While not perfect (nothing ever is, nor can any objective definition of perfection ever be stated), it's definitely the best HTTP library I've come across (pretty cool that it's greenlet-ready) and well...\n\nKeep up the good work!\n",
"No problem! Glad we could take this out.\n\nBy the way, we totally expect you to use `Session` objects across domains. =) They provide centralised configuration (only set headers once etc.) and connection pooling, so any time you're doing anything more than trivial work, we expect you to use a `Session`.\n"
] |
https://api.github.com/repos/psf/requests/issues/2078
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2078/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2078/comments
|
https://api.github.com/repos/psf/requests/issues/2078/events
|
https://github.com/psf/requests/pull/2078
| 34,663,803 |
MDExOlB1bGxSZXF1ZXN0MTY1NDM5NzM=
| 2,078 |
Allow copying of PreparedRequests without headers/cookies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/55577?v=4",
"events_url": "https://api.github.com/users/dgouldin/events{/privacy}",
"followers_url": "https://api.github.com/users/dgouldin/followers",
"following_url": "https://api.github.com/users/dgouldin/following{/other_user}",
"gists_url": "https://api.github.com/users/dgouldin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dgouldin",
"id": 55577,
"login": "dgouldin",
"node_id": "MDQ6VXNlcjU1NTc3",
"organizations_url": "https://api.github.com/users/dgouldin/orgs",
"received_events_url": "https://api.github.com/users/dgouldin/received_events",
"repos_url": "https://api.github.com/users/dgouldin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dgouldin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dgouldin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dgouldin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-30T17:13:51Z
|
2021-09-09T00:01:16Z
|
2014-05-30T17:33:29Z
|
NONE
|
resolved
|
Fixes #2077.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2078/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2078/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2078.diff",
"html_url": "https://github.com/psf/requests/pull/2078",
"merged_at": "2014-05-30T17:33:29Z",
"patch_url": "https://github.com/psf/requests/pull/2078.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2078"
}
| true |
[
":cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2077
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2077/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2077/comments
|
https://api.github.com/repos/psf/requests/issues/2077/events
|
https://github.com/psf/requests/issues/2077
| 34,661,962 |
MDU6SXNzdWUzNDY2MTk2Mg==
| 2,077 |
PreparedRequest.copy fails if headers or _cookies are None
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/55577?v=4",
"events_url": "https://api.github.com/users/dgouldin/events{/privacy}",
"followers_url": "https://api.github.com/users/dgouldin/followers",
"following_url": "https://api.github.com/users/dgouldin/following{/other_user}",
"gists_url": "https://api.github.com/users/dgouldin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dgouldin",
"id": 55577,
"login": "dgouldin",
"node_id": "MDQ6VXNlcjU1NTc3",
"organizations_url": "https://api.github.com/users/dgouldin/orgs",
"received_events_url": "https://api.github.com/users/dgouldin/received_events",
"repos_url": "https://api.github.com/users/dgouldin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dgouldin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dgouldin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dgouldin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-05-30T16:48:40Z
|
2021-09-08T23:11:05Z
|
2014-05-30T17:33:29Z
|
NONE
|
resolved
|
Try the following:
```
from requests import PreparedRequest
PreparedRequest().copy()
```
It will raise `AttributeError: 'NoneType' object has no attribute 'copy'` because `self.headers` is `None`. The same will happen on `_cookies` if `headers` isn't `None`.
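A short sketch of the difference (httpbin is used only as a placeholder host): copying a request that went through `prepare()` works because `headers` and `_cookies` get populated, while copying a bare `PreparedRequest` does not:
``` python
from requests import PreparedRequest, Request

# Works: prepare() fills in headers and _cookies before the copy.
prepared = Request('GET', 'http://httpbin.org/get').prepare()
clone = prepared.copy()

# Raises AttributeError on 2.3.0: headers is still None here.
PreparedRequest().copy()
```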
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2077/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2077/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2076
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2076/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2076/comments
|
https://api.github.com/repos/psf/requests/issues/2076/events
|
https://github.com/psf/requests/pull/2076
| 34,595,954 |
MDExOlB1bGxSZXF1ZXN0MTY4NjE2NjQ=
| 2,076 |
Basic Auth handler sets unicode header value on Python 2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-05-29T21:36:59Z
|
2021-09-09T00:01:17Z
|
2014-06-09T14:53:23Z
|
MEMBER
|
resolved
|
Observe:
``` python
>>> import requests
>>> requests.__version__
'2.3.0'
>>> r = requests.Request('GET', 'http://www.google.com/', auth=('user', 'pass'))
>>> p = r.prepare()
>>> p.headers['Authorization']
u'Basic dXNlcjpwYXNz'
```
This is wrong: all header keys and values should be native strings. Easily fixed though. =)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2076/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2076/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2076.diff",
"html_url": "https://github.com/psf/requests/pull/2076",
"merged_at": "2014-06-09T14:53:23Z",
"patch_url": "https://github.com/psf/requests/pull/2076.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2076"
}
| true |
[
"Interesting...\n",
"Alright, fix is now attached.\n",
":cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2075
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2075/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2075/comments
|
https://api.github.com/repos/psf/requests/issues/2075/events
|
https://github.com/psf/requests/issues/2075
| 34,595,020 |
MDU6SXNzdWUzNDU5NTAyMA==
| 2,075 |
How to run app on Windows?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/105314?v=4",
"events_url": "https://api.github.com/users/hickford/events{/privacy}",
"followers_url": "https://api.github.com/users/hickford/followers",
"following_url": "https://api.github.com/users/hickford/following{/other_user}",
"gists_url": "https://api.github.com/users/hickford/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hickford",
"id": 105314,
"login": "hickford",
"node_id": "MDQ6VXNlcjEwNTMxNA==",
"organizations_url": "https://api.github.com/users/hickford/orgs",
"received_events_url": "https://api.github.com/users/hickford/received_events",
"repos_url": "https://api.github.com/users/hickford/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hickford/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hickford/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hickford",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-05-29T21:26:40Z
|
2021-09-08T23:11:05Z
|
2014-05-29T21:31:21Z
|
NONE
|
resolved
|
How to run the Flask app on Windows? (I want to test my fork)
Following the `Procfile` I first tried `gunicorn httpbin:app -w 6`, but that failed because Gunicorn doesn't run on Windows.
Next I tried `python httpbin/core.py` but that failed too
> Traceback (most recent call last):
> File "httpbin\core.py", line 24, in <module>
> from . import filters
> SystemError: Parent module '' not loaded, cannot perform relative import
Any ideas?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2075/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2075/timeline
| null |
completed
| null | null | false |
[
"I think you meant to open this on the httpbin repository, not this one. =)\n\nNevertheless, try `python -m httpbin.app`.\n",
"Oops, yes. Thanks, `python -m httpbin.core` worked. What does the `-m` flag do?\n",
"Stack Overflow will explain better than I could: https://stackoverflow.com/questions/7610001/what-is-the-m-switch-for-in-python\n"
] |
https://api.github.com/repos/psf/requests/issues/2074
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2074/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2074/comments
|
https://api.github.com/repos/psf/requests/issues/2074/events
|
https://github.com/psf/requests/pull/2074
| 34,480,817 |
MDExOlB1bGxSZXF1ZXN0MTY0MzU4NzI=
| 2,074 |
Ensure that we open files in binary mode in the docs.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-05-28T16:01:06Z
|
2021-09-08T23:08:22Z
|
2014-05-28T16:01:10Z
|
MEMBER
|
resolved
|
Go go gadget documentation!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2074/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2074/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2074.diff",
"html_url": "https://github.com/psf/requests/pull/2074",
"merged_at": "2014-05-28T16:01:10Z",
"patch_url": "https://github.com/psf/requests/pull/2074.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2074"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2073
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2073/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2073/comments
|
https://api.github.com/repos/psf/requests/issues/2073/events
|
https://github.com/psf/requests/issues/2073
| 34,467,576 |
MDU6SXNzdWUzNDQ2NzU3Ng==
| 2,073 |
Newline problems on Windows
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1845771?v=4",
"events_url": "https://api.github.com/users/edrevo/events{/privacy}",
"followers_url": "https://api.github.com/users/edrevo/followers",
"following_url": "https://api.github.com/users/edrevo/following{/other_user}",
"gists_url": "https://api.github.com/users/edrevo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/edrevo",
"id": 1845771,
"login": "edrevo",
"node_id": "MDQ6VXNlcjE4NDU3NzE=",
"organizations_url": "https://api.github.com/users/edrevo/orgs",
"received_events_url": "https://api.github.com/users/edrevo/received_events",
"repos_url": "https://api.github.com/users/edrevo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/edrevo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/edrevo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/edrevo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-05-28T13:48:55Z
|
2021-09-09T00:00:55Z
|
2014-05-28T16:01:42Z
|
NONE
|
resolved
|
Calls to `requests.put([...], data=file_obj)` send invalid requests on Windows when `file_obj` is a file object that contains Windows-style newlines (`\r\n`). This does not repro on Mac.
The problem is that the `Content-Length` header is determined through the `super_len` function, which in turn calls `os.fstat(o.fileno()).st_size`. This return value, which comes from the actual file size on disk, will not necessarily match the length of the contents that get read through the file object, since Python performs newline normalization on Windows:
``` python
>>> import os
>>> f = open('test.md') # this file contains Windows-style newlines
>>> f.fileno()
3
>>> os.fstat(3).st_size
4031L
>>> c = f.read(4031)
>>> len(c)
3983
>>> c[:50]
'# Cosmos "Little Joe" (1G) Hardware spec.\n\nThe fir'
```
On Mac, on the other hand, the content is read without newline normalization:
``` python
>>> f = open('test.md') # this file contains Windows-style newlines
>>> c = f.read(4031)
>>> c[:50]
'# Cosmos "Little Joe" (1G) Hardware spec.\r\n\r\nThe f'
```
The consequence of this mismatch is that the HTTP request will have a body which is smaller than what the Content-Length says, so the server will wait for the rest of the content and it will eventually give up.
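A sketch of the workaround (URL and filename are placeholders): open the file in binary mode so no newline normalization happens and the size reported by `os.fstat` matches the bytes actually sent:
``` python
import requests

# 'rb' means the bytes on disk are sent unchanged, so the Content-Length
# computed from os.fstat() matches the body that is actually written.
with open('test.md', 'rb') as f:
    requests.put('http://example.com/upload', data=f)
```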
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2073/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2073/timeline
| null |
completed
| null | null | false |
[
"You can repro this on a Mac by forcing Python to normalize newlines through the `'U'` file mode (https://docs.python.org/2/library/functions.html#open): `open('test.md', 'rU')`\n",
"Thanks for raising this issue!\n\nThis is not entirely surprising: you've not passed Requests the file in its on-disk form. Instead, you're applying transformations to it (newline normalisation is one, but decoding to unicode is another). This is problematic because it means the information about the file is now untrue.\n\nBear in mind that we send the file on the network _as bytes_. Given that the file exists on disk _as bytes_, the way to send it across the network intact is to send it _as bytes_. Use `open('test.md', 'rb')`.\n",
"Many thanks for the ultra-quick reply! The solution you propose works like a charm! :+1:\n\nIt could be nice to raise an error on the `requests` side if the amount of bytes read from the file object does not match the expected length. It'll be easier for other devs to pinpoint what went wrong if that kind of error is added.\n\nThanks again for your help!\n",
"@edrevo That's a great idea. Unfortunately, it's quite challenging to do, as we don't normally read from the file ourselves, we pass it down to the lower layers. An alternative might be to adjust [this section of the docs](http://docs.python-requests.org/en/latest/user/advanced/#streaming-uploads) to use `'rb'` as well, to reduce the risk of people getting this wrong.\n",
"Sounds good!\n",
"Ok, fixed up in 785e4ab3606be275f6887aab718ab7cda4a8938d. Thanks again for reporting this!\n"
] |
https://api.github.com/repos/psf/requests/issues/2072
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2072/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2072/comments
|
https://api.github.com/repos/psf/requests/issues/2072/events
|
https://github.com/psf/requests/pull/2072
| 34,428,077 |
MDExOlB1bGxSZXF1ZXN0MTY0MDU1NjI=
| 2,072 |
Check for basestring, not just str
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 5 |
2014-05-28T01:31:31Z
|
2021-09-08T23:10:54Z
|
2014-05-29T14:39:42Z
|
CONTRIBUTOR
|
resolved
|
Fixes #2071
This provides consistent behaviour across Python 2 & 3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2072/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2072/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2072.diff",
"html_url": "https://github.com/psf/requests/pull/2072",
"merged_at": "2014-05-29T14:39:42Z",
"patch_url": "https://github.com/psf/requests/pull/2072.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2072"
}
| true |
[
"Nice.\n\nWith this change we shouldn't need the `builtin_str` clause anymore, so we may as well remove it while we're here. =)\n",
"@Lukasa fixed in 5ab79e2\n",
"Hurrah! LGTM. :shipit:\n",
":cake:\n",
":sparkles: \n"
] |
https://api.github.com/repos/psf/requests/issues/2071
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2071/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2071/comments
|
https://api.github.com/repos/psf/requests/issues/2071/events
|
https://github.com/psf/requests/issues/2071
| 34,417,663 |
MDU6SXNzdWUzNDQxNzY2Mw==
| 2,071 |
On Python 3, sending bytes automatically chooses 'x-www-form-urlencoded' as the content_type
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13807?v=4",
"events_url": "https://api.github.com/users/miracle2k/events{/privacy}",
"followers_url": "https://api.github.com/users/miracle2k/followers",
"following_url": "https://api.github.com/users/miracle2k/following{/other_user}",
"gists_url": "https://api.github.com/users/miracle2k/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/miracle2k",
"id": 13807,
"login": "miracle2k",
"node_id": "MDQ6VXNlcjEzODA3",
"organizations_url": "https://api.github.com/users/miracle2k/orgs",
"received_events_url": "https://api.github.com/users/miracle2k/received_events",
"repos_url": "https://api.github.com/users/miracle2k/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/miracle2k/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/miracle2k/subscriptions",
"type": "User",
"url": "https://api.github.com/users/miracle2k",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-27T22:05:54Z
|
2021-09-09T00:00:54Z
|
2014-05-29T14:39:42Z
|
NONE
|
resolved
|
Is this on purpose? I would have assumed bytes to mean "send this unknown chunk of data as-is".
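A hedged illustration of the surprise (httpbin as a placeholder endpoint): on Python 3 a bytes body picks up the form-encoded Content-Type unless one is supplied explicitly:
``` python
import requests

raw = b'\x00\x01 some opaque payload'

# On requests 2.3.0 under Python 3 this request gets
# Content-Type: application/x-www-form-urlencoded added automatically.
requests.post('https://httpbin.org/post', data=raw)

# Supplying the header explicitly avoids the guess.
requests.post('https://httpbin.org/post', data=raw,
              headers={'Content-Type': 'application/octet-stream'})
```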
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2071/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2071/timeline
| null |
completed
| null | null | false |
[
"@Lukasa this is the actual behaviour. I tested on Python 2 and 3 and this only happens on Python 3 due to [these lines](https://github.com/kennethreitz/requests/blob/6c72509f5bb0e9cd8fad64a44ba99687ed044771/requests/models.py#L434..L439) I'm pretty sure we don't want this behaviour by default. I'm going to work on this some tonight and post my progress before heading to sleep.\n",
"#2072 fixes this issue. Thanks for reporting it @miracle2k! :cake: \n"
] |
https://api.github.com/repos/psf/requests/issues/2070
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2070/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2070/comments
|
https://api.github.com/repos/psf/requests/issues/2070/events
|
https://github.com/psf/requests/issues/2070
| 34,402,903 |
MDU6SXNzdWUzNDQwMjkwMw==
| 2,070 |
Surface socket connection time
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/455202?v=4",
"events_url": "https://api.github.com/users/vjyanand/events{/privacy}",
"followers_url": "https://api.github.com/users/vjyanand/followers",
"following_url": "https://api.github.com/users/vjyanand/following{/other_user}",
"gists_url": "https://api.github.com/users/vjyanand/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vjyanand",
"id": 455202,
"login": "vjyanand",
"node_id": "MDQ6VXNlcjQ1NTIwMg==",
"organizations_url": "https://api.github.com/users/vjyanand/orgs",
"received_events_url": "https://api.github.com/users/vjyanand/received_events",
"repos_url": "https://api.github.com/users/vjyanand/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vjyanand/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vjyanand/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vjyanand",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-27T19:06:22Z
|
2021-09-09T00:00:56Z
|
2014-05-27T19:07:41Z
|
NONE
|
resolved
|
Is there a way to get the actual time taken by socket.create_connection (https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connection.py#L89-L92) exposed on the requests response object?
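Not directly; one hedged sketch is to monkeypatch `socket.create_connection` (the stdlib call the vendored urllib3 uses at the linked lines) before making the request and time it yourself:
``` python
import socket
import time

import requests

_orig_create_connection = socket.create_connection

def _timed_create_connection(*args, **kwargs):
    # Wrap the stdlib call that urllib3 uses so the connect time is visible.
    start = time.time()
    sock = _orig_create_connection(*args, **kwargs)
    print('socket connect took %.4f seconds' % (time.time() - start))
    return sock

socket.create_connection = _timed_create_connection

requests.get('https://httpbin.org/get')
```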
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2070/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2070/timeline
| null |
completed
| null | null | false |
[
"Nope. =) You can try to get those diagnostics either by profiling a single run of the code using cprofile or monkeypatching. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2069
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2069/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2069/comments
|
https://api.github.com/repos/psf/requests/issues/2069/events
|
https://github.com/psf/requests/pull/2069
| 34,315,762 |
MDExOlB1bGxSZXF1ZXN0MTYzNDEwNzA=
| 2,069 |
Document and initialise Response.request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 3 |
2014-05-26T15:41:47Z
|
2021-09-08T23:05:08Z
|
2014-05-27T15:11:21Z
|
MEMBER
|
resolved
|
In response to #2066. Weirdly, we weren't initialising the `.request` property in the constructor. Nothing terrible there, but we need to do it to get it documented, so let's do it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2069/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2069/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2069.diff",
"html_url": "https://github.com/psf/requests/pull/2069",
"merged_at": "2014-05-27T15:11:21Z",
"patch_url": "https://github.com/psf/requests/pull/2069.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2069"
}
| true |
[
":clap: \n",
"If this were only a documentation change I'd be pressing \"Merge\" right now. :+1: :shipit: \n",
"Haha! I can't believe we haven't done this before :)\n\n:cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2068
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2068/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2068/comments
|
https://api.github.com/repos/psf/requests/issues/2068/events
|
https://github.com/psf/requests/issues/2068
| 34,306,339 |
MDU6SXNzdWUzNDMwNjMzOQ==
| 2,068 |
Define DNS used to resolve domain in request (suggestion)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2188047?v=4",
"events_url": "https://api.github.com/users/SigmundTannhauser/events{/privacy}",
"followers_url": "https://api.github.com/users/SigmundTannhauser/followers",
"following_url": "https://api.github.com/users/SigmundTannhauser/following{/other_user}",
"gists_url": "https://api.github.com/users/SigmundTannhauser/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/SigmundTannhauser",
"id": 2188047,
"login": "SigmundTannhauser",
"node_id": "MDQ6VXNlcjIxODgwNDc=",
"organizations_url": "https://api.github.com/users/SigmundTannhauser/orgs",
"received_events_url": "https://api.github.com/users/SigmundTannhauser/received_events",
"repos_url": "https://api.github.com/users/SigmundTannhauser/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/SigmundTannhauser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SigmundTannhauser/subscriptions",
"type": "User",
"url": "https://api.github.com/users/SigmundTannhauser",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-05-26T13:08:44Z
|
2021-09-09T00:00:55Z
|
2014-05-26T13:45:00Z
|
NONE
|
resolved
|
For example:
```
req = requests.post('https://'+self.server+'/'+method,
                    data=json.dumps(kwargs),
                    cert=self.certs,
                    verify=self.verify,
                    headers={'content-type': 'application/json'},
                    dns='85.86.87.88'
                    )
```
This could be practical if requests is used (for example) in a test suite which connects to more than one test environment, and each environment has its own DNS (code cannot rely on the DNS defined in the connection properties, /etc/hosts and so on).
At the moment the connection must be patched at a lower level to achieve this, and in the end it takes less code to use socket + ssl instead of requests (in the whole project).
I am aware that this use case is not very common, but it may be useful in the future.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2068/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2068/timeline
| null |
completed
| null | null | false |
[
"Thanks for the suggestion!\n\nUnfortunately, this is not something we're going to accept. We've got the requests API in feature freeze, as we believe that the 90% use-case is covered. We won't accept extensions to the API unless we think it's going to be generally useful, and I don't believe that this will be.\n\nIf you're genuinely finding that you need to connect to different DNS servers but that you can't control it at the level of `/etc/hosts` or having multiple connections then this is very challenging.\n\nNote also that we build on top of `urllib3` and `httplib`, which do not expose this functionality, requiring that we take away connection building logic from `httplib` even further than we currently do. That's not really a good idea. =(\n",
"I agree entirely with @Lukasa. You may want to investigate how (if at all) other libraries do this @taku-pl like [httplib2](https://github.com/jcgregorio/httplib2) or [httpplus](https://code.google.com/p/httpplus/) and use them instead if possible.\n",
"I understand, and I already investigated, that's why I asked if this is at all possible, because it would be a lot, lot easier using requests :)\nUnfortuantely in the case I described there's no way to modify /etc/hosts: one of the reasons is that multiple tests run simultaneously, and usually on more than two environments, so global change is out of the question. Besides, we wouldn't want to run them as root :)\n\nFor some time I digged all the way down and patched httplib/urllib2, but it got too complicated to maintain the code with new cases (verifying self-signed test certificates, SNI support and such), and the patched code now is much more complicated than 'simply' (ahem :) ) use httplib/urllib.\n\nThanks for the feedback. Looks like I'll have to bid farewell to requests in this project :/\n",
"P.S. Just for laughs, this is not intended as a mean comment :)\n\nI spent a few afternoons trying to hack httplub and urllib 2/3 to get what I want (custom resolving with SSL+SNI support for verification). That's because I wanted to rely solely on python to do this. Today, I tried a completely different approach, and it replaced a few dozen lines of code:\n\nreturn subprocess.check_output([\"curl\", '-v', '-k','https://'+domain, '--resolve', domain+':443:'+ip])\n\nWorks like a charm - returns a string with complete page content and certificate C/CN/O :) It's the exact opposite of _how_ I wanted to do this, but, well, it works.\n",
"Then you can probably do this with pyCurl @taku-pl. I always forget it's an option because of its terrible API. =P\n"
] |
https://api.github.com/repos/psf/requests/issues/2067
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2067/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2067/comments
|
https://api.github.com/repos/psf/requests/issues/2067/events
|
https://github.com/psf/requests/pull/2067
| 34,299,094 |
MDExOlB1bGxSZXF1ZXN0MTYzMzEyOTE=
| 2,067 |
Add a data param to the delete request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1322166?v=4",
"events_url": "https://api.github.com/users/khamaileon/events{/privacy}",
"followers_url": "https://api.github.com/users/khamaileon/followers",
"following_url": "https://api.github.com/users/khamaileon/following{/other_user}",
"gists_url": "https://api.github.com/users/khamaileon/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/khamaileon",
"id": 1322166,
"login": "khamaileon",
"node_id": "MDQ6VXNlcjEzMjIxNjY=",
"organizations_url": "https://api.github.com/users/khamaileon/orgs",
"received_events_url": "https://api.github.com/users/khamaileon/received_events",
"repos_url": "https://api.github.com/users/khamaileon/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/khamaileon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/khamaileon/subscriptions",
"type": "User",
"url": "https://api.github.com/users/khamaileon",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-26T10:50:31Z
|
2021-09-08T23:05:19Z
|
2014-05-26T12:37:07Z
|
NONE
|
resolved
|
In the case of a bulk delete operation, it would be useful to have a "data" parameter to send the ids of objects to delete.
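(As the comments below note, every verb helper already accepts `data`, so a body can be sent with DELETE today. A minimal sketch, with a placeholder endpoint and payload shape:)

``` python
import json
import requests

# Bulk delete by sending a body with DELETE; the URL and payload below
# are placeholders for illustration only.
ids_to_delete = [1, 2, 3]
response = requests.delete(
    "https://api.example.com/objects",
    data=json.dumps({"ids": ids_to_delete}),
    headers={"Content-Type": "application/json"},
)
print(response.status_code)
```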
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2067/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2067/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2067.diff",
"html_url": "https://github.com/psf/requests/pull/2067",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2067.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2067"
}
| true |
[
"Thanks for this!\n\nHowever, there's no need for it. All requests _can_ take a `data` parameter, regardless of their verb. `post` is the only one that mandates it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2066
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2066/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2066/comments
|
https://api.github.com/repos/psf/requests/issues/2066/events
|
https://github.com/psf/requests/issues/2066
| 34,276,429 |
MDU6SXNzdWUzNDI3NjQyOQ==
| 2,066 |
Allow override of Authorization header in request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/58730?v=4",
"events_url": "https://api.github.com/users/migurski/events{/privacy}",
"followers_url": "https://api.github.com/users/migurski/followers",
"following_url": "https://api.github.com/users/migurski/following{/other_user}",
"gists_url": "https://api.github.com/users/migurski/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/migurski",
"id": 58730,
"login": "migurski",
"node_id": "MDQ6VXNlcjU4NzMw",
"organizations_url": "https://api.github.com/users/migurski/orgs",
"received_events_url": "https://api.github.com/users/migurski/received_events",
"repos_url": "https://api.github.com/users/migurski/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/migurski/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/migurski/subscriptions",
"type": "User",
"url": "https://api.github.com/users/migurski",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-26T00:46:29Z
|
2021-09-09T00:00:56Z
|
2014-05-26T14:24:24Z
|
NONE
|
resolved
|
I have encountered a situation similar to #927.
I am creating a new request with an explicit `Authorization` header, e.g.
```
posted = post('…', headers={'Authorization': '…'}, data=…)
```
Requests is finding and using an entry from a `.netrc` file I didn’t know existed; it appears to have been generated by the Heroku toolbelt. This overrides the Authorization header supplied in the `headers` dictionary.
Can the default behavior instead allow environmental factors to be overridden?
Also, could requests make this behavior easier to discover in some way? I am unable to find a way to retrieve the request headers that were actually sent; they would have provided a valuable clue about the mystery username & password.
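(As pointed out in the comments below, the headers that were actually sent can be read back from `response.request.headers`. A minimal sketch, with a placeholder URL and token:)

``` python
import requests

response = requests.get(
    "https://api.example.com/resource",  # placeholder URL
    headers={"Authorization": "Bearer example-token"},
)
# The prepared request is attached to the response, so the final headers
# (including anything requests added, e.g. from a .netrc file) can be inspected.
print(response.request.headers)
```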
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2066/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2066/timeline
| null |
completed
| null | null | false |
[
"Both of these problems in this case have solutions. To avoid using `.netrc`, use a `Session` and set `Session.trust_env` to `False`. To see what headers were sent _by requests_ (noting that the lower layers may add more), check `response.request.headers`.\n",
"This is a duplicate of #2062 and as such, I'm closing this\n",
"Thanks. @Lukasa, can `response.request` be added to the [Response documentation](http://docs.python-requests.org/en/latest/api/#requests.Response)?\n",
"@migurski Good idea, and done. See #2069.\n"
] |
https://api.github.com/repos/psf/requests/issues/2065
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2065/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2065/comments
|
https://api.github.com/repos/psf/requests/issues/2065/events
|
https://github.com/psf/requests/pull/2065
| 34,274,819 |
MDExOlB1bGxSZXF1ZXN0MTYzMTgyNTk=
| 2,065 |
Fix Python 3.2 compatibility in tests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/89623?v=4",
"events_url": "https://api.github.com/users/mgeisler/events{/privacy}",
"followers_url": "https://api.github.com/users/mgeisler/followers",
"following_url": "https://api.github.com/users/mgeisler/following{/other_user}",
"gists_url": "https://api.github.com/users/mgeisler/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mgeisler",
"id": 89623,
"login": "mgeisler",
"node_id": "MDQ6VXNlcjg5NjIz",
"organizations_url": "https://api.github.com/users/mgeisler/orgs",
"received_events_url": "https://api.github.com/users/mgeisler/received_events",
"repos_url": "https://api.github.com/users/mgeisler/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mgeisler/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mgeisler/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mgeisler",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 4 |
2014-05-25T23:00:11Z
|
2021-09-08T23:10:56Z
|
2014-05-27T15:24:47Z
|
CONTRIBUTOR
|
resolved
|
The FAQ says that requests is compatible with Python 3.2. However, I could not execute the tests using that Python version. This PR fixes that and updates the FAQ to match.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2065/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2065/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2065.diff",
"html_url": "https://github.com/psf/requests/pull/2065",
"merged_at": "2014-05-27T15:24:47Z",
"patch_url": "https://github.com/psf/requests/pull/2065.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2065"
}
| true |
[
"I should have mentioned this above: I noticed that `tox.ini` was added to `.gitignore` in 7e825acd9b730632623984f14cd0f3f2c0cbf45f. I guess the author of that commit used `tox` himself to run the test suite when he decided to remove Python 3.1 and 3.2 from the trove classifiers.\n\nIf `tox` itself is unwanted, then I'll be happy to redo the changes without `tox` — the important point for me was to get support for Python 3.2 back to the test suite.\n",
"I've updated the pull request by taking out the tox related changes (the branch name should be the only trace now).\n",
":+1: I'm in favor of this change. The only thing that would be better would be if @kennethreitz added support to Jenkins for 3.2, 3.4, and pypy 2.2 :)\n",
"@sigmavirus24 noted.\n"
] |
https://api.github.com/repos/psf/requests/issues/2064
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2064/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2064/comments
|
https://api.github.com/repos/psf/requests/issues/2064/events
|
https://github.com/psf/requests/pull/2064
| 34,274,778 |
MDExOlB1bGxSZXF1ZXN0MTYzMTgyNDE=
| 2,064 |
compat: handle SyntaxError when importing simplejson
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/89623?v=4",
"events_url": "https://api.github.com/users/mgeisler/events{/privacy}",
"followers_url": "https://api.github.com/users/mgeisler/followers",
"following_url": "https://api.github.com/users/mgeisler/following{/other_user}",
"gists_url": "https://api.github.com/users/mgeisler/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mgeisler",
"id": 89623,
"login": "mgeisler",
"node_id": "MDQ6VXNlcjg5NjIz",
"organizations_url": "https://api.github.com/users/mgeisler/orgs",
"received_events_url": "https://api.github.com/users/mgeisler/received_events",
"repos_url": "https://api.github.com/users/mgeisler/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mgeisler/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mgeisler/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mgeisler",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 10 |
2014-05-25T22:57:50Z
|
2021-09-08T23:07:21Z
|
2014-05-27T15:25:54Z
|
CONTRIBUTOR
|
resolved
|
We officially support Python 2.6 to 3.3, but simplejson does not
support Python 3.1 or 3.2:
https://github.com/simplejson/simplejson/issues/66
Importing simplejson on Python 3.2 results in a SyntaxError because
simplejson uses the u'...' syntax (the syntax was not supported in
Python 3.0 to 3.2).
Support for loading simplejson instead of the stdlib json module was
added by #710:
https://github.com/kennethreitz/requests/pull/710
No mention was made of the lack of support for Python 3.2, but it was
mentioned that simplejson can be faster than the stdlib json module.
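The change amounts to catching SyntaxError alongside ImportError when loading simplejson — roughly the following fallback (a sketch, not the exact diff):

``` python
try:
    import simplejson as json
except (ImportError, SyntaxError):
    # simplejson is not installed, or (on Python 3.0-3.2) its u'...'
    # literals raise a SyntaxError; fall back to the stdlib module.
    import json
```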
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2064/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2064/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2064.diff",
"html_url": "https://github.com/psf/requests/pull/2064",
"merged_at": "2014-05-27T15:25:54Z",
"patch_url": "https://github.com/psf/requests/pull/2064.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2064"
}
| true |
[
"Thanks, this looks good to me! :cake: I'm not 100% sure it'll get merged, we may drop support for 3.2 soon, but if we don't we'll merge it. \n",
"I ran into this for a project of mine where we try to support Python 3.2. However, it also affects much more significant projects, in particular `pip`. I didn't realize this at first, but `pip` includes a copy of `requests` and it fails to install packages if `simplejson` happens to be already installed.\n\nSo I would suggest not dropping support for 3.2 before `pip` does.\n",
"The error for `pip` can be provoked like this:\n\n```\n$ virtualenv -p python3.2 venv32\n$ source venv32/bin/activate\n$ pip install simplejson\n```\n\nThis will fail to compile the C extension, but `simplejson` is still installed as pure Python. This means that `requests` fail when imported from `pip`:\n\n```\n$ pip install anything-you-want\nTraceback (most recent call last):\n File \"/home/mg/tmp/venv32/bin/pip\", line 7, in <module>\n from pip import main\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/__init__.py\", line 11, in <module>\n from pip.vcs import git, mercurial, subversion, bazaar # noqa\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/vcs/mercurial.py\", line 9, in <module>\n from pip.download import path_to_url\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/download.py\", line 22, in <module>\n from pip._vendor import requests, six\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/_vendor/requests/__init__.py\", line 58, in <module>\n from . import utils\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/_vendor/requests/utils.py\", line 25, in <module>\n from .compat import parse_http_list as _parse_list_header\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/pip/_vendor/requests/compat.py\", line 77, in <module>\n import simplejson as json\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/simplejson/__init__.py\", line 114, in <module>\n from .encoder import JSONEncoder, JSONEncoderForHTML\n File \"/home/mg/tmp/venv32/lib/python3.2/site-packages/simplejson/encoder.py\", line 21\n ESCAPE = re.compile(u'[\\\\x00-\\\\x1f\\\\\\\\\"\\\\b\\\\f\\\\n\\\\r\\\\t\\u2028\\u2029]')\n ^\nSyntaxError: invalid syntax\n```\n",
"Well, that's a pain in the neck. Seems like a good idea though, and seeing as `pip` is likely to still want 3.2 I see no reason not to merge this.\n",
"Cool, thanks for looking at it!\n",
"I could be wrong, but I think I remember @dstufft saying that pip support for 3.2 wasn't a high priority. That aside, I'm not quite sure why we support using simplejson. If @mgeisler hadn't linked to #710 I would have guessed it was to support Python 2.5 (which we no longer do).\n",
"My first attempt at solving this was to reverse the imports, that is try `import json` first and then import `simplejson` as a fallback. But then I discovered that simplejson apparently has some speedups that the author of #710 wanted to take advantage of.\n\nIn my own projects (for Python 2.6+) I've always just imported json and let that be good enough.\n",
"Either way y'all are right that it's better to have this than to not. It doesn't increase the complexity of the code and if anything the simplejson maintainers should be scolded for not supporting 3.2 :P. I'm :+1: \n",
"3.2 is important, 3.1 is not.\n",
"This looks like a great change. \n"
] |
https://api.github.com/repos/psf/requests/issues/2063
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2063/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2063/comments
|
https://api.github.com/repos/psf/requests/issues/2063/events
|
https://github.com/psf/requests/issues/2063
| 34,271,303 |
MDU6SXNzdWUzNDI3MTMwMw==
| 2,063 |
Mutating Session.proxies does not work
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/71486?v=4",
"events_url": "https://api.github.com/users/asmeurer/events{/privacy}",
"followers_url": "https://api.github.com/users/asmeurer/followers",
"following_url": "https://api.github.com/users/asmeurer/following{/other_user}",
"gists_url": "https://api.github.com/users/asmeurer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/asmeurer",
"id": 71486,
"login": "asmeurer",
"node_id": "MDQ6VXNlcjcxNDg2",
"organizations_url": "https://api.github.com/users/asmeurer/orgs",
"received_events_url": "https://api.github.com/users/asmeurer/received_events",
"repos_url": "https://api.github.com/users/asmeurer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/asmeurer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asmeurer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/asmeurer",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 22 |
2014-05-25T19:42:09Z
|
2021-09-09T00:00:55Z
|
2014-05-28T08:06:33Z
|
NONE
|
resolved
|
This was originally brought up at https://github.com/kennethreitz/requests/issues/2061. If you change `session.proxies` after calling `session.get` and call `session.get` again, it does not use the updated value. You have to do `session.get(proxies=session.proxies)`.
Presumably this is not difficult to fix, since setting the keyword argument does work. I haven't read the code, though.
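A minimal sketch of the reported behaviour and workaround (proxy URLs are placeholders):

``` python
import requests

session = requests.Session()
session.proxies["http"] = "http://localhost:8080"  # placeholder proxy
session.get("http://example.com/")

# As reported above, mutating the session's proxies afterwards is not
# honoured on its own; the workaround is to pass the mapping back in.
session.proxies["http"] = "http://user:pass@localhost:8080"
session.get("http://example.com/", proxies=session.proxies)
```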
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2063/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2063/timeline
| null |
completed
| null | null | false |
[
"OK, I can't reproduce this locally. I made three requests through `mitmproxy`:\n\n``` python\n>>> import requests\n>>> s = requests.Session()\n>>> r = s.get('http://www.google.co.uk/', proxies={'http': 'http://localhost:8080'})\n>>> s.proxies['http'] = 'http://localhost:8080'\n>>> r = s.get('http://www.google.co.uk/')\n>>> s.proxies['http'] = 'http://user:pass@localhost:8080'\n>>> r = s.get('http://www.google.co.uk/')\n```\n\nThe first two requests pass through mitmproxy without difficulty. The last one also does, and additionally provides an auth header.\n\nCan you provide a reproducible code sample?\n",
"My case was the first request failed with 407. I don't remember if it was http or https. \n",
"That should be irrelevant, I'd have thought. I'd like a code fragment that can reproduce it, if it's not too hard. =)\n",
"I would expect it to be irrelevant as well...\n",
"I am using https://github.com/conda/conda/blob/5871df053da8de3b6e6fd82969acac14be367911/conda/fetch.py#L210. I can see if I can cook up a more directed example tomorrow. But using conda a simple `conda search` with a proxy enabled should do the trick. You may need to do `conda search -c asmeurer` to get https. \n",
"Uh...I wonder if `stream=True` causes a problem here.\n",
"Oh, unlikely. I actually was using https://github.com/conda/conda/blob/5871df053da8de3b6e6fd82969acac14be367911/conda/fetch.py#L77 when this happened. \n",
"Ah, ok. Well, I think we really need to hit a small, reproducible test case to work out whether or not this is truly a problem down at the connection pool layer. I simply don't think it is.\n",
"Stepping through the code, it's not too hard to see what is going on. The proxies are obtained from https://github.com/kennethreitz/requests/blob/6c72509f5bb0e9cd8fad64a44ba99687ed044771/requests/sessions.py#L429, which doesn't pull the proxies from the Session object at all. In fact, I think if I tried setting the proxies on the session object and they weren't picked up from the environment somehow, that it wouldn't use them at all. \n",
"Observe a few lines down: https://github.com/kennethreitz/requests/blob/6c72509f5bb0e9cd8fad64a44ba99687ed044771/requests/sessions.py#L442\n",
"It looks like that puts priority on the request setting, but I think that that is the environment proxies by the time it reaches that, so it always uses that. \n\nThat's assuming I read the code correctly :)\n",
"Hang on: why is the proxy suddenly ending up in the environment? We're aware of the incorrect precedence (it's in #2018), but I don't understand why it would work on the first request but fail on the second.\n",
"@asmeurer is conda looking for proxies on its own? If so I would guess that they're getting the same proxies we are (which is why it would work on the first try -- they come out of the ) and then they're trying to update them but it still pulls from the env\n",
"Yes, you can set them in the `.condarc` configuration if you want. \n",
"The proxies are probably ending up there because of https://github.com/conda/conda/blob/8902c5a6956192f64d1679fde7e42b0c4f7b223d/conda/connection.py#L60 (https://github.com/conda/conda/blob/8902c5a6956192f64d1679fde7e42b0c4f7b223d/conda/config.py#L245).\n",
"And in case any of that seems really stupid, it's worth reiterating:\n\n\n",
"@asmeurer I'm not sure what the issue linked has to do with the proxy problems. That said, I see that you're setting the `proxies` attributes on the Session instance [here](https://github.com/conda/conda/blob/96aaccefddce58850b07f414dce95192e6b1a697/conda/connection.py#L60). The easy solution to this bug is to simply also do `self.trust_env = False`. That will prevent us from loading from the environment each time. You're already loading proxies from the environment in `get_proxy_servers`. The only things you'll lose by setting `trust_env` to `False` are:\n- setting `verify` based on `REQUESTS_CA_BUNDLE` or `CURL_CA_BUNDLE`\n- using authentication defined in `.netrc`\n",
"Oh d'oh. That would be me not copying the url correctly. Try the updated comment. \n",
"`.netrc` seems potentially important. I think my current fix (just passing the proxies through to `get`; I only ever call `get` twice) is an OK workaround. I just opened this issue because it was confusing and I don't think I should have to do it. \n",
"> Oh d'oh. That would be me not copying the url correctly. Try the updated comment.\n\nNo worries. I wasn't sure if I was missing something =P\n\n> .netrc seems potentially important\n\nIt can be. That's why I mentioned it as a downside :)\n\n> I just opened this issue because it was confusing and I don't think I should have to do it. \n\nI agree. I don't think you should have to either. It seems, however, that this is a duplicate of #2018 after all. @Lukasa should we close this in deference to #2018?\n",
"Agreed, I think us fixing our proxy source priorities should fix your problem, @asmeurer. =)\n",
"Yes, I think so.\n"
] |
https://api.github.com/repos/psf/requests/issues/2062
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2062/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2062/comments
|
https://api.github.com/repos/psf/requests/issues/2062/events
|
https://github.com/psf/requests/issues/2062
| 34,239,807 |
MDU6SXNzdWUzNDIzOTgwNw==
| 2,062 |
Default Authentication Overwrites Authentication Header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/328806?v=4",
"events_url": "https://api.github.com/users/joe42/events{/privacy}",
"followers_url": "https://api.github.com/users/joe42/followers",
"following_url": "https://api.github.com/users/joe42/following{/other_user}",
"gists_url": "https://api.github.com/users/joe42/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/joe42",
"id": 328806,
"login": "joe42",
"node_id": "MDQ6VXNlcjMyODgwNg==",
"organizations_url": "https://api.github.com/users/joe42/orgs",
"received_events_url": "https://api.github.com/users/joe42/received_events",
"repos_url": "https://api.github.com/users/joe42/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/joe42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joe42/subscriptions",
"type": "User",
"url": "https://api.github.com/users/joe42",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
},
{
"color": "fad8c7",
"default": false,
"description": null,
"id": 136616769,
"name": "Documentation",
"node_id": "MDU6TGFiZWwxMzY2MTY3Njk=",
"url": "https://api.github.com/repos/psf/requests/labels/Documentation"
}
] |
closed
| true | null |
[] | null | 28 |
2014-05-24T18:34:24Z
|
2021-09-08T20:00:53Z
|
2015-12-21T18:49:52Z
|
NONE
|
resolved
|
Hi,
And thanks for maintaining requests. When I set an Authorization header in a request, I do not expect it to be overwritten. I had set up default credentials in my .netrc file (for automating the cadaver command line tool), and suddenly I could not authenticate to my service any longer.
Please check whether an Authorization header already exists in your default authorization handler, instead of overwriting it. I see that you documented this behaviour in the authentication section, but since I had never used this form of authentication it took quite some time to find the problem. Also, overwriting a header that the user set does not seem to be a sensible implementation.
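(For reference, a rough sketch of the pass-through auth handler workaround described in the comments below — supplying any `auth=` object keeps requests from falling back to `.netrc`, and this one leaves the prepared request untouched. The class name and URL are placeholders:)

``` python
import requests
from requests.auth import AuthBase

class PreserveHeaderAuth(AuthBase):
    """Pass-through handler: leave the prepared request exactly as given."""
    def __call__(self, r):
        return r

r = requests.get(
    "https://example.com/webdav",  # placeholder URL
    headers={"Authorization": "Basic placeholder-credentials"},
    auth=PreserveHeaderAuth(),
)
```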
Thanks,
Joe
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2062/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2062/timeline
| null |
completed
| null | null | false |
[
"Hey @joe42 There are several headers we don't expect users to set on their own because they really shouldn't be. If you want to prevent us from overwriting the header you should use the `auth=` parameter to your request. By default it takes a two-tuple (username, and password) or any other Authentication Handler.\n",
"@sigmavirus24 Sorry, I should have mentioned that I already circumvented the problem by writing an authentication handler that does not modify the request. My post was meant as a suggestion to improve requests, as I suppose other users might run into the same problem.\n\n> Hey @joe42 There are several headers we don't expect users to set on their own because they really shouldn't be.\n\nI see. I do set several headers directly. Can you point me to a resource explaining which headers should not be set and why? Setting the headers directly seemed to be the way to go after reading the documentation's \"Custom Headers\" section.\n\nThx,\nJoe\n",
"This is another of those situations where the right thing to do is not entirely obvious. For instance, what _should_ the output be if this happens:\n\n``` python\nimport requests\n\nr = requests.get(url, headers={'Authorization': auth}, auth=(user, pass))\n```\n\nWhich of the two do we respect first? Our general principle is to respect the specific argument over the general one: `auth` over `headers`, `body` over Content-Length header etc.\n\nFor this reason, if you _can_ auth via an Auth handler you should: that's how requests expects you to do it. \n",
"@Lukasa : I can understand why you choose to design it this way. But I think that the result of the following request with default credentials in .netrc is not obvious at all. The point I do still not get is that I cannot imagine a case where a user would actually set a specific header (like in your example) and not expect it to be used by requests. Why should he bother doing this? So it might be good to either make it obvious (by documentation/warning/error) or to respect the users explicit settings.\n\n```\nimport requests\nr = requests.get(url, headers={'Authorization': auth}, auth=None)\n```\n",
"> So it might be good to either make it obvious (by documentation/warning/error) or to respect the users explicit settings.\n\nI appreciate that you've shared your opinion @joe42. You're right that `explicit is better than implicit` and in this case you can explicitly turn off requests using your `.netrc` file. With it off, you wouldn't have this trouble. That said, the documentation should be improved to explain which headers users should refrain from setting themselves. We will not try to internally keep track of which headers were provided by the users and which we have generated or which ones are generated by urllib3 or which ones are provided by httplib. It is certainly possible to do, but it is not something we will do. Further, if we allow users to set their own headers in cases like this (and others) it will likely lead to far more cases where the user is surprised by requests' behaviour. We're doing the right thing here.\n",
"I had a related problem in #2066.\n\nMy issue arose using Heroku’s API, which has two confusing factors relevant to this issue. The toolbelt [quietly creates a .netrc file](https://devcenter.heroku.com/articles/heroku-command#logging-in) (mine was a year old; I didn't know it existed) and Heroku asks users to [create a custom Authorization header](https://devcenter.heroku.com/articles/oauth#web-application-authorization):\n\n> For example, given the access token _01234567-89ab-cdef-0123-456789abcdef_, request headers should be set to _Authorization: Bearer 01234567-89ab-cdef-0123-456789abcdef_.\n\nI am now using `Session.trust_env` to prevent this issue thanks to @Lukasa, but I found the silent overwrite of the `Authorization` header to be a nasty surprise. I’m in agreement with @joe42 on this: specifically providing a header ought to overrule environmental factors. Alternatively, could the `.netrc` behavior be documented [under API](http://docs.python-requests.org/en/latest/api/#requests.request), [custom authentication](http://docs.python-requests.org/en/latest/user/advanced/#custom-authentication), and [quickstart](http://docs.python-requests.org/en/latest/user/quickstart/#custom-headers)? My searches for “auth” in the requests documentation should have mentioned this.\n",
"First, I'd like to address something:\n\n> My searches for “auth” in the requests documentation should have mentioned this.\n\nYour searching was not terribly effective. The [docs front page](http://docs.python-requests.org/en/latest/) contains a heading entitled 'Authentication'. [That section](http://docs.python-requests.org/en/latest/user/authentication/) contains a heading entitled [netrc Authentication](http://docs.python-requests.org/en/latest/user/authentication/#netrc-authentication). This is not very difficult to find, and it's in exactly the location in the documentation I'd expect to find it. I'm open to linking to the Authentication section from the Custom Authentication section, but anything more seems incredible to me.\n\nNow, I feel like we've seen several related issues recently that warrant a more careful and broader response. The key problem is this: lots of users with previous HTTP client experience are expecting that requests will treat the headers they set as a fundamental source of truth: anything set in the headers should be treated as _true_ by requests, or at least not removed.\n\nRequests has the opposite approach: headers are the _lowest_ source of truth, and we feel quite happy to replace many kinds of headers that the user sets. Some examples:\n- We will replace Authorization headers when an alternative auth source is found.\n- We will remove Authorization headers when you get redirected off-host.\n- We will replace Proxy-Authorization headers when you provide proxy credentials in the URL.\n- We will replace Content-Length headers when we can determine the length of the content.\n\nWe do this for very good reason: a headers-based API is utterly terrible. Affecting the library behaviour based on arbitrary key-value pairs set in the headers dictionary is asking for all kinds of terrible bugs, and is the kind of API-design-footgunnery that I'd expect to see from the OpenSSL project.\n\nPerhaps we need a section of the docs that explicitly says: you cannot trust that your headers will remain unmodified. Maybe even a list of the headers you absolutely should not set yourself (currently I think it's `Authorization`, `Proxy-Authorization`, and `Content-Length`).\n",
"You’re right, I don't know how I missed that section and it mentions .netrc. My mistake!\n",
"@migurski Whew, I was worried we had a bigger discoverability problem. Our docs are pretty large, sadly, which makes it easy to miss things. =(\n",
"A central explanation of the headers approach you describe linked throughout the docs, wherever `headers` is documented, would be amazingly helpful. It’s counterintuitive to me for certain headers, but I respect that Requests has a design intent here. Thanks for an excellent library!\n",
"@Lukasa: Thanks for your detailed answer. \n\n> headers are the lowest source of truth, and we feel quite happy to replace many kinds of headers that the user sets. Some examples:\n> \n> We will replace Authorization headers when an alternative auth source is found.\n> We will remove Authorization headers when you get redirected off-host.\n> We will replace Proxy-Authorization headers when you provide proxy credentials in the URL.\n> We will replace Content-Length headers when we can determine the length of the content.\n> \n> We do this for very good reason: a headers-based API is utterly terrible. Affecting the library behaviour based on arbitrary key-value pairs set in the headers dictionary is asking for all kinds of terrible bugs\n\nIt would be great if you could add this sentence to the \"Custom Headers\" section, after it states:\n\n> If you’d like to add HTTP headers to a request, simply pass in a dict to the headers parameter.\n",
"@sigmavirus24 Do you feel like this would be a good addition to the docs?\n",
"I feel like there have been a small number of users very violently surprised by the fact that we don't analyze the headers and do \"smart\" things based upon their presence (or lack thereof). I think adding to the Custom Headers section to say \"By the way, we do no analysis of the headers you set so setting different headers will have no effect on the way requests handles your request.\" We should also mention the fact that we will happily (and with good reason) ignore headers set by the user in preference for headers we generate ourselves. That said, the idea of linking this throughout the docs wherever headers are mentioned is clearly overkill so please don't do that.\n",
"At minimum, include a link from places where the `headers` param is documented. “Simply pass in a dict” is not accurate without some explanation of order of operations: first the dict is applied, then the environment is inspected for conflicting information to use instead.\n",
"Borrowing from past comments, here's what I think would address the issue well in the documentation:\n\n> Custom headers are given less precedence than more specific sources of information. For example:\n> - Authorization headers will be overridden by any alternative auth source found.\n> - Authorization headers will be removed if you get redirected off-host.\n> - Proxy-Authorization headers will be overridden by proxy credentials provided in the URL.\n> - Content-Length headers will be overridden when we can determine the length of the content.\n> \n> Furthermore, Requests does not change its behavior at all based on which custom headers are specified. The headers are simply passed on into the final request.\"\n\nIs there anything this doesn't address? Also, I feel like it should be more concise. Thoughts?\n",
"@benjaminran Agreed, that looks like extremely useful documentation to me. If you open a pull request that adds it, I'll happily merge it. =)\n",
"@benjaminran one thing: documentation shouldn't be overly verbose but they also shouldn't be too concise. The difficulty in writing documentation is finding the right balance such that everyone can understand what's going on without reading a novel. ;)\n",
"As long as it’s made clear what is covered by “any alternative auth source,” then yes.\n",
"That's a good point--that would probably confuse me if I saw it in unfamiliar documentation. How's this?\n\n> Authorization headers will be overridden if an `auth=` parameter is used.\n\nAlso, is it clear from context that 'Authorization headers' refers to custom headers?\n",
"A more accurate description would be:\n\n> Authorization headers will be overridden if credentials are passed via the `auth` parameter or are specified in a `.netrc` accessible in the environment.\n\nThe latter part is the root of this issue.\n",
"I was about to suggest that we close this issue, since the documentation given by @benjaminran is now merged into the master docs (in Quickstart->Custom Headers). However I have one more question: what happens when the `auth=` parameter is used, AND the `.netrc` environment config is set? It is not clear from the docs which one takes precedence. And as @migurski pointed out, we have a definite case where a hosting provider (Heroku) quietly created a `.netrc` file, so it would be important to know this. \n\nI suspect that `auth=` should override `.netrc`, but I wanted to confirm before creating a PR.\n",
"+1 to preferring that `auth=` overrides `.netrc`. \n",
"@ibnIrshad please do not create a PR. That would be a breaking change that we will not accept until 3.0.0\n",
"What I meant is I will create a PR to clarify the current expected behaviour in the docs, not to change anything. Is it true that `auth=` has precedence over `.netrc` currently? It is not clear from the docs, to me atleast. Whether it _should_ or _should not_ is a different issue, but right now I simply want to document it.\n",
"After scouring the tests, it seems `auth=` IS designed to override `.netrc`. Great! I'll just create PR to update the docs.\n\n`test_requests.py, line 372`\n\n``` python\n # Given auth should override and fail.\n r = requests.get(url, auth=wrong_auth)\n assert r.status_code == 401\n```\n",
"In my experience previously, `.netrc` overrode `auth=` —should I re-run my tests to see if this is now a real bug?\n",
"Well the tests only cover basic auth. What kind of auth are you doing?\n",
"Looking back on my original issue, looks I was setting and Authorization header: https://github.com/kennethreitz/requests/issues/2066\n"
] |
https://api.github.com/repos/psf/requests/issues/2061
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2061/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2061/comments
|
https://api.github.com/repos/psf/requests/issues/2061/events
|
https://github.com/psf/requests/issues/2061
| 34,191,133 |
MDU6SXNzdWUzNDE5MTEzMw==
| 2,061 |
Making https proxies easier to deal with
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/71486?v=4",
"events_url": "https://api.github.com/users/asmeurer/events{/privacy}",
"followers_url": "https://api.github.com/users/asmeurer/followers",
"following_url": "https://api.github.com/users/asmeurer/following{/other_user}",
"gists_url": "https://api.github.com/users/asmeurer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/asmeurer",
"id": 71486,
"login": "asmeurer",
"node_id": "MDQ6VXNlcjcxNDg2",
"organizations_url": "https://api.github.com/users/asmeurer/orgs",
"received_events_url": "https://api.github.com/users/asmeurer/received_events",
"repos_url": "https://api.github.com/users/asmeurer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/asmeurer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asmeurer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/asmeurer",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 19 |
2014-05-23T16:35:02Z
|
2021-02-28T04:03:09Z
| null |
NONE
| null |
I'm trying to add support for proxies to conda, in particular, automatically prompting for a username and password on a 407. For http proxies, it's all fine. I can detect the 407 error code on `HTTPError` from `raise_for_status` and inject `HTTPProxyAuth` and try again.
But for https, it raises `ConnectionError` on `get`. Adding `HTTPProxyAuth` does not work. I have to parse the URL and add it in as `https://username:password@proxy:port`. `urlparse` does not make this particularly easy.
Furthermore, the only way I can tell to actually detect that the `ConnectionError` is a 407 is to do
``` py
try:
# do request
except ConnectionError as e:
if "407" in str(e):
# get username and password
```
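For reference, a rough sketch (not part of the original report) of the workaround described above: embed the credentials into the proxy URL with the standard URL-splitting helpers and retry once when the tunnel is refused with a 407. The helper names and retry flow are illustrative only.

``` python
# Hedged sketch: inject user:pass into a proxy URL and retry once on a 407
# that surfaces as a ConnectionError from a tunneling HTTPS proxy.
from urllib.parse import urlsplit, urlunsplit  # urlparse on Python 2

import requests
from requests.exceptions import ConnectionError


def with_credentials(proxy_url, username, password):
    """Return proxy_url rewritten as scheme://user:pass@host:port/..."""
    parts = urlsplit(proxy_url)
    netloc = "%s:%s@%s" % (username, password, parts.netloc)
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))


def get_via_proxy(url, proxies, username=None, password=None):
    try:
        return requests.get(url, proxies=proxies)
    except ConnectionError as exc:
        # The proxy's 407 is only visible in the exception text here.
        if "407" in str(exc) and username and password:
            authed = {scheme: with_credentials(p, username, password)
                      for scheme, p in proxies.items()}
            return requests.get(url, proxies=authed)
        raise
```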
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2061/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2061/timeline
| null | null | null | null | false |
[
"This was briefly discussed in IRC, [as seen here](https://botbot.me/freenode/python-requests/2014-05-23/?msg=15109663&page=1). This GitHub issue encompasses a number of problems. Let's enumerate them.\n1. `HTTPProxyAuth` doesn't work for tunneling proxies. Yes, that's true, and that's because tunneling proxies are fundamentally a very different beast to your standard proxy, involving the CONNECT request and all kinds of funky nonsense. Our auth paradigm doesn't map to this special case.\n \n My proposal here is actually to get rid of `HTTPProxyAuth` altogether. I hate it. We have had auth-in-the-proxy-url for a long time now, so it's not like it's the only way to do it. Additionally, it provides better conceptual abstraction: all auth handlers now apply auth to the _origin_, not to intermediaries. I like that distinction.\n2. If you aren't authed to a tunneling proxy, you receive a `ConnectionError` on the method call rather than getting the 407 back. This is again because of the way `httplib` is designed for tunneling proxies. I think we can improve this situation by getting `urllib3` to throw an appropriate exception which we catch and wrap. We can't move the exception to `.raise_for_status()` because that would require a `httplib` rewrite (or something truly evil), but at least we can make it easier to work out what happened.\n",
"@shazow, are you open to doing part 2?\n",
"Well, in this case, the `response` attribute of the `ConnectionError` is `None`. If it were the same as with HTTPError, I could just handle them uniformly. \n",
"I don't think it can be, logistically: we don't have a response in hand. This is all handled transparently by `httplib`. Because it's terrible.\n",
"Another possibly unrelated issue (I can open a new issue). If you mutate the `proxies` attribute of a `Session` object, the new value isn't used. So if you want to redo the request with the fixed proxies (\"fixed\" meaning including authentication credentials), you have to do\n\n``` py\ndef dostuff(url, session=None):\n session = session or Session() # Actually a custom Session subclass\n ...\n try:\n resp = session.get(url, proxies=session.proxies) # I shouldn't have to include the proxies in get() here; they are already in session.proxies!\n resp.raise_for_status\n except HTTPError as e:\n if e.response.status_code == 407:\n # Get username and password and put it on the proxy url\n session.proxies['http'] = fixes_proxy_url\n # Try again\n dostuff(url, session=session)\n```\n",
"And more to the point, that's a lot of logic to repeat every time I do a `get` (it's actually more, because I also have to check for `ConnectionError`). It would be nice if requests just had a way to automatically prompt the user for proxy authentication and retry the request. \n\nBut I'll be happy to just get these basic issues ironed out first. \n",
"Uh...as far as I can see on a quick code read, we absolutely pick up the proxy settings from the Session. What makes you think we're not?\n\nAs for 'prompting' for proxy auth, that's not going to happen, it's simply not requests' job.\n",
"> Uh...as far as I can see on a quick code read, we absolutely pick up the proxy settings from the Session. What makes you think we're not?\n\nMy guess would be that once we have made a request with a proxy, e.g., we've used the `http` proxy, then if @asmeurer changes the list of proxies it isn't fixed because we're using the same `HTTPAdapter` which hits [lines 209 through 215](https://github.com/kennethreitz/requests/blob/a7c218480d7acf1e866e07fde0627d05fb77fbc1/requests/adapters.py#L209..L215). Notice that if `http` is already in `self.proxy_manager` we don't do anything. Since it is, the new proxy is ignored. I haven't attempted to test if that's in fact the case, but that's my first guess as to what might be causing the behaviour that @asmeurer is seeing.\n\n> As for 'prompting' for proxy auth, that's not going to happen, it's simply not requests' job.\n\nI agree.\n",
"> As for 'prompting' for proxy auth, that's not going to happen, it's simply not requests' job.\n\nThat's unfortunate. It seems that a lot of http libraries take this view, which is why every application that connects to the internet has to implement its own proxy authentication (I don't use proxies, but I've noticed this pattern). \n\nAny thoughts on how requests could make this easier? A callback mechanism?\n",
"Requests should absolutely allow you to apply auth if you hadn't before.\n\nAs for making it easier to make the same request with auth for the proxy, the reason HTTP libraries generally don't is because we don't know what people are going to want. In requests case, this is sufficiently unusual that adding a mechanism would have two effects:\n1. Disappointing the people who wanted another way to add the proxy auth if they needed to.\n2. Require us to maintain a code path used by a tiny fraction of our users. \n\nFor that reason, we assume that users who need to re-make a request will do it themselves. \n\nThe auth stuff makes that harder, so we should fix it, but otherwise I think that's the end of it. \n",
"So I want to be sure I'm understanding everyone's concerns here properly:\n\n@asmeurer when you say you'd like requests to prompt for proxy auth, do you mean you'd like us to literally use `raw_input` (or `input` in the case of Python 3)? I'm pretty sure that's not what you want and that's not something we'll ever support. Further, I'm not quite certain how we would properly implement a callback system for this particular case since the only other system like it in requests relies on having a response which we don't have in this case.\n\nThat said, we've had a troubled history (which @Lukasa knows much better than I) dealing with HTTPS Proxy Authentication. If there were a better way of handling them, we would have already implemented it (I'd like to think).\n\nThis discussion should remain in this issue. Your other problem @asmeurer (in which mutating a Session's proxies does not affect subsequent requests should be a separate issue). I'm trying to think of a good way to handle that case since I think I've located the problem above.\n",
"Sure I'll open a new issue for the Session thing.\n",
"https://github.com/kennethreitz/requests/issues/2063\n",
"The situation with HTTPS proxy authentication remains like this:\n- Tunneling HTTPS over a proxy involves sending a request to the proxy with the CONNECT verb that establishes a TCP tunnel through the proxy, then sending the _actual_ request over TLS over that TCP tunnel.\n- This procedure is not done by using `httplib` to send that first request, but by using its `tunnel` functionality.\n- This functionality is enabled at the connection level, not at the request/response level.\n- Requests Auth Handlers act at the request/response level.\n- Thus, ProxyAuth for HTTPS over proxy _cannot_ be handled by an auth handler.\n\nI appreciate this is unfortunate, but there's simply no way around it: it needs to be done the way it's currently being done, or we need to special-case the Proxy Auth handler to mean something special. I don't want to do that because the Proxy Auth handler is semantically out of place: it applies authentication to the proxy, not to the origin. Given that you may _also_ want to authenticate the origin, and that we don't allow multiple auth handlers, applying the Proxy Auth handler impedes your ability to do any other kind.\n\nBest to have proxy authentication credentials come in on the `proxies` dictionary.\n",
"Ok, I agree that this is the wrong place for it. ProxyAuth did seem strange to me, for the reason you cited. \n",
"Not sure if this discussion is resolved or not, but please ping me again if I still have an action item/question. I will be home tomorrow for more in-depth reading. :)\n\n@Lukasa If this is still an open question, handling special-handling 407 in urllib3 sounds sensible if we can do it in a low-impact way.\n",
"My understanding of what is going on is limited here, but it didn't seem to me like there were any concrete action items yet. \n",
"Yeah, we're still trying to drill down into exactly what's happening where.\n",
"Hello guys. I tried to look around in the repo to find some answers but wasn't lucky. Is there any workaround to do a `get` request using a HTTPS proxy ? \r\n\r\nI'm getting this error \r\n\r\n` requests.exceptions.ProxyError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /ip (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x0000021DB2CDF668>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond',))) ` \r\n\r\nand this is what I'm trying to do to test. \r\n\r\n```\r\nimport requests\r\nurl = 'https://httpbin.org/ip'\r\nprox = 'https://89.36.195.238:35328'\r\n\r\nproxies = {\r\n \"https\": prox\r\n}\r\nresponse = requests.get(url,proxies=proxies)\r\nprint(response.json())\r\n```"
] |
https://api.github.com/repos/psf/requests/issues/2060
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2060/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2060/comments
|
https://api.github.com/repos/psf/requests/issues/2060/events
|
https://github.com/psf/requests/issues/2060
| 34,190,639 |
MDU6SXNzdWUzNDE5MDYzOQ==
| 2,060 |
request headers are incomplete
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1619869?v=4",
"events_url": "https://api.github.com/users/jacksontj/events{/privacy}",
"followers_url": "https://api.github.com/users/jacksontj/followers",
"following_url": "https://api.github.com/users/jacksontj/following{/other_user}",
"gists_url": "https://api.github.com/users/jacksontj/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jacksontj",
"id": 1619869,
"login": "jacksontj",
"node_id": "MDQ6VXNlcjE2MTk4Njk=",
"organizations_url": "https://api.github.com/users/jacksontj/orgs",
"received_events_url": "https://api.github.com/users/jacksontj/received_events",
"repos_url": "https://api.github.com/users/jacksontj/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jacksontj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jacksontj/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jacksontj",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-23T16:28:32Z
|
2021-09-08T23:11:01Z
|
2014-06-08T09:49:56Z
|
NONE
|
resolved
|
Requests automatically adds a host header if the user didn't specify one. That header is not set in the request object. Example:
```
>>> import requests
>>> ret = requests.get('http://www.google.com')
>>> ret.request.headers['host']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/requests/structures.py", line 77, in __getitem__
return self._store[key.lower()][1]
KeyError: 'host'
```
This means that the headers in this object do not actually match what was sent on the wire, which is magic. Inside of poolmanager in _set_proxy_headers the accept and host headers are set, is there some way to make them set in that request object as well?
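A small sketch (not from the report) of the distinction being described: a `Host` header you set yourself does appear on the `PreparedRequest`, while the one added at send time down in `httplib` does not.

``` python
# Illustrative only; www.google.com is just the example host from the report.
import requests

r = requests.get("http://www.google.com")
print(r.request.headers.get("Host"))  # None: Host was added later, inside httplib

r = requests.get("http://www.google.com", headers={"Host": "www.google.com"})
print(r.request.headers["Host"])      # 'www.google.com': present because we set it
```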
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2060/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2060/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nActually, `_set_proxy_headers` is only used when routing via a proxy, and implements custom logic for that case. The section of the code that normally sets these headers is actually down in `httplib`, way down in the standard library.\n\nWe've never guaranteed that the request object represents exactly what was sent on the wire, and we cannot meaningfully do it. Requests is abstractions built on abstractions, and at least one of those abstraction layers is outside of our control (`httplib`). We simply cannot prevent `httplib` changing our requests however it sees fit.\n\nWe could take ownership of the `Host` header, but I'm not really convinced it gains us much, aside from allowing us to introduce bugs.\n",
"Ah, that makes some sense. Well, if the http library doesn't expose the headers it sent then there really isn't a good way of doing this, just means I'll have to add some logic into my testing code that someone magically adds some headers.\n"
] |
https://api.github.com/repos/psf/requests/issues/2059
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2059/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2059/comments
|
https://api.github.com/repos/psf/requests/issues/2059/events
|
https://github.com/psf/requests/pull/2059
| 34,187,952 |
MDExOlB1bGxSZXF1ZXN0MTYyODA4NDQ=
| 2,059 |
remove unused IteratorProxy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2245080?v=4",
"events_url": "https://api.github.com/users/jschneier/events{/privacy}",
"followers_url": "https://api.github.com/users/jschneier/followers",
"following_url": "https://api.github.com/users/jschneier/following{/other_user}",
"gists_url": "https://api.github.com/users/jschneier/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jschneier",
"id": 2245080,
"login": "jschneier",
"node_id": "MDQ6VXNlcjIyNDUwODA=",
"organizations_url": "https://api.github.com/users/jschneier/orgs",
"received_events_url": "https://api.github.com/users/jschneier/received_events",
"repos_url": "https://api.github.com/users/jschneier/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jschneier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jschneier/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jschneier",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-05-23T15:54:58Z
|
2021-09-08T23:07:20Z
|
2014-05-27T15:28:11Z
|
CONTRIBUTOR
|
resolved
|
It's undocumented, unused and basically uninteresting. It looks like it was added in error in this commit https://github.com/kennethreitz/requests/commit/ef8563ab36c6b52834ee9c35f6f75a424cd9ceef. Or at least the interesting bit was factored out into `super_len`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2059/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2059/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2059.diff",
"html_url": "https://github.com/psf/requests/pull/2059",
"merged_at": "2014-05-27T15:28:11Z",
"patch_url": "https://github.com/psf/requests/pull/2059.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2059"
}
| true |
[
"Yup, that looks unneeded to me. Thanks! :cake:\n",
":+1: \n",
":camel: \n\nNot a cake. I'm sure why.\n"
] |
https://api.github.com/repos/psf/requests/issues/2058
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2058/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2058/comments
|
https://api.github.com/repos/psf/requests/issues/2058/events
|
https://github.com/psf/requests/pull/2058
| 34,080,256 |
MDExOlB1bGxSZXF1ZXN0MTYyMTgzMDE=
| 2,058 |
Return an instance of dict_class from merge_setting
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/223831?v=4",
"events_url": "https://api.github.com/users/robgolding/events{/privacy}",
"followers_url": "https://api.github.com/users/robgolding/followers",
"following_url": "https://api.github.com/users/robgolding/following{/other_user}",
"gists_url": "https://api.github.com/users/robgolding/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/robgolding",
"id": 223831,
"login": "robgolding",
"node_id": "MDQ6VXNlcjIyMzgzMQ==",
"organizations_url": "https://api.github.com/users/robgolding/orgs",
"received_events_url": "https://api.github.com/users/robgolding/received_events",
"repos_url": "https://api.github.com/users/robgolding/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/robgolding/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robgolding/subscriptions",
"type": "User",
"url": "https://api.github.com/users/robgolding",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-22T13:17:43Z
|
2021-09-08T23:07:16Z
|
2014-05-27T17:09:43Z
|
NONE
|
resolved
|
Proposed solution to #2057
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2058/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2058/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2058.diff",
"html_url": "https://github.com/psf/requests/pull/2058",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2058.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2058"
}
| true |
[
"As discussed in #2057 I'm :-1: on this change.\n",
"I am as well, unfortunately. \n"
] |
https://api.github.com/repos/psf/requests/issues/2057
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2057/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2057/comments
|
https://api.github.com/repos/psf/requests/issues/2057/events
|
https://github.com/psf/requests/issues/2057
| 34,079,391 |
MDU6SXNzdWUzNDA3OTM5MQ==
| 2,057 |
merge_setting always returns a dict, even if dict_class is set to something else
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/223831?v=4",
"events_url": "https://api.github.com/users/robgolding/events{/privacy}",
"followers_url": "https://api.github.com/users/robgolding/followers",
"following_url": "https://api.github.com/users/robgolding/following{/other_user}",
"gists_url": "https://api.github.com/users/robgolding/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/robgolding",
"id": 223831,
"login": "robgolding",
"node_id": "MDQ6VXNlcjIyMzgzMQ==",
"organizations_url": "https://api.github.com/users/robgolding/orgs",
"received_events_url": "https://api.github.com/users/robgolding/received_events",
"repos_url": "https://api.github.com/users/robgolding/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/robgolding/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robgolding/subscriptions",
"type": "User",
"url": "https://api.github.com/users/robgolding",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2014-05-22T13:06:54Z
|
2021-09-08T23:10:46Z
|
2014-05-27T19:01:57Z
|
NONE
|
resolved
|
I've been using [responses](https://github.com/dropbox/responses) to test an application using matched URL parameters, which was encountering issues as the `merge_setting` function always returns an unordered `dict` (even though the `dict_class` argument defaults to `OrderedDict`). This means that I can't guarantee which order the URL parameters will be constructed in, so writing a test which responds to a particular URL is difficult.
I'm not sure that the `dict_class` argument was meant to be the class of object that is returned, but making it work that way would seem to make sense. As a workaround, I've set the session object's parameters to `None`, which causes `merge_setting` to return the request's parameters unmodified (which, in my case, are an `OrderedDict`).
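For clarity, a short sketch of the workaround described above (illustrative; httpbin.org stands in for any endpoint): with the session's `params` set to `None`, an `OrderedDict` passed per-request is handed through unmodified, so the query string keeps its ordering.

``` python
# Sketch of the workaround: no session-level params to merge, so the
# request's own OrderedDict (and its ordering) survives URL preparation.
from collections import OrderedDict

import requests

session = requests.Session()
session.params = None  # merge_setting() returns the request's mapping as-is

params = OrderedDict([("b", "2"), ("a", "1"), ("c", "3")])
req = requests.Request("GET", "https://httpbin.org/get", params=params)
prepared = session.prepare_request(req)
print(prepared.url)  # https://httpbin.org/get?b=2&a=1&c=3
```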
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2057/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2057/timeline
| null |
completed
| null | null | false |
[
"I see no reason why this is a use case we should support. You're essentially testing something that should never matter (the ordering of the parameters) - in other words your tests are wrong. \n\nThe best way to write a test for your URL is to parse it after it has been constructed. That sounds like more work but if you are in fact intent on testing the URL you could build this into responses as an attribute (like `parsed_url`). \n\nIt is also important to note that we don't support `responses`, so any problems you have with it should be raised there.\n",
"In which case, what is the `dict_class` parameter supposed to be for? My feeling was not that something fundamental should be changed in `requests` to make this work, it was that the parameter was there already -- it just wasn't being used properly. If that's not right, then maybe it should be removed entirely.\n\nI know my scenario above is somewhat contrived, but I can imagine other cases where something like this would be genuinely useful, and allowing it to work isn't too much of a hassle. I realise that `responses` is a separate project and unsupported, but being able to predict what order the URL parameters are generated in is useful in general, even if it's not the default behaviour and needs to be enabled.\n",
"The purpose of `dict_class` is stated in the docstring:\n\n> If a setting is a dictionary, they will be merged together using `dict_class`.\n\nThis means that `dict_class` is intended to control the merging logic. An example here is that of `MultiDict`, which will treat merging settings fundamentally very differently to the way a `dict` or `OrderedDict` would.\n\n_With that said_, I have no particular objection to having `merge_setting` return the `dict_class`. Clearly @sigmavirus24 doesn't agree, but I'll try to have a chat with him and see why we're on different pages. I suspect he's seeing something I'm not. =)\n",
"> but I can imagine other cases where something like this would be genuinely useful\n\nI'm interested in hearing what they are.\n\n---\n\nThat aside, I don't understand why this isn't a feature request in responses. It makes more sense to want a parsed URL attribute on a call than it does to introduce arbitrary and easily breakable behaviour in requests that will only result in more bug reports like this one.\n",
"Here's a concrete example in jQuery UI: http://api.jqueryui.com/sortable/#method-serialize\n\nThe order of the URL parameters is important, because it's used to communicate to the server how a collection of objects should be sorted. If this were passed to an application running `requests`, which then made an API call to a backend application, preserving that order should be possible. This change makes that the default behaviour, which won't matter for users who don't care about the ordering, but will have a positive effect for those who do.\n\nThe behaviour at the moment is also inconsistent, because setting `parameters = None` on the `Session` object causes `request_parameters` to be handed back unmodified, which means the order is preserved if an `OrderedDict` or list of tuples is passed in.\n",
"> The behaviour at the moment is also inconsistent\n\nI disagree (as you might have guessed). If there's nothing to merge, why should we bother modifying anything, especially if it has an API that we're expecting. Basically any mapping can be passed to requests like this and we'll return it so long as it behaves like we expect it to (i.e., has all the methods we need in order to operate correctly). This is very much a feature.\n\n> Here's a concrete example in jQuery UI:\n\nThe problem with your example is that requests doesn't work that way. The equivalent way to do that in requests is this:\n\n``` pycon\n>>> import requests\n>>> r = requests.get('https://httpbin.org/get', params={'setname': [1, 5, 3]})\n>>> r.json()['args']\n{u'setname': [u'1', u'5', u'3']}\n```\n\nThe parameter string will look different (`setname=1&setname=5&setname=3`) but that's the only way to send it with requests otherwise.\n\nAre there other use cases?\n",
"The inconsistency arises from the difference between `session_params` being set to `{}` or `None`. The former causes `merge_setting` to return a plain `dict` (even though no merging happens), whereas the latter will cause it to return `request_params` untouched. Regardless of which is correct, the same thing should probably be returned in both cases.\n\nThe jQuery UI example obviously doesn't hold up in this case because there's a proper way to do that in `requests`, but the reasoning still stands; if someone were writing a library to communicate with a legacy HTTP server which needed to receive URL parameters in a particular order, that should be possible.\n",
"Closing, for the same reason #2058 was closed.\n"
] |
https://api.github.com/repos/psf/requests/issues/2056
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2056/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2056/comments
|
https://api.github.com/repos/psf/requests/issues/2056/events
|
https://github.com/psf/requests/issues/2056
| 34,038,555 |
MDU6SXNzdWUzNDAzODU1NQ==
| 2,056 |
SSLError on GAE
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/122286?v=4",
"events_url": "https://api.github.com/users/lra/events{/privacy}",
"followers_url": "https://api.github.com/users/lra/followers",
"following_url": "https://api.github.com/users/lra/following{/other_user}",
"gists_url": "https://api.github.com/users/lra/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lra",
"id": 122286,
"login": "lra",
"node_id": "MDQ6VXNlcjEyMjI4Ng==",
"organizations_url": "https://api.github.com/users/lra/orgs",
"received_events_url": "https://api.github.com/users/lra/received_events",
"repos_url": "https://api.github.com/users/lra/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lra/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lra",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| true | null |
[] | null | 4 |
2014-05-21T23:44:27Z
|
2021-09-08T09:00:47Z
|
2014-05-22T20:21:57Z
|
NONE
|
resolved
|
Hi,
When I try to run this, I get an SSL error:
``` python
import requests
full_url = u'https://qascore.percolate.com/api/pull_request/'
requests.get(url=full_url)
```
```
Traceback (most recent call last):
File "test.py", line 5, in <module>
requests.get(url=full_url)
File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 382, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 8] _ssl.c:507: EOF occurred in violation of protocol
```
I'm using `requests==2.3.0`.
I don't have the error when using http of course.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/122286?v=4",
"events_url": "https://api.github.com/users/lra/events{/privacy}",
"followers_url": "https://api.github.com/users/lra/followers",
"following_url": "https://api.github.com/users/lra/following{/other_user}",
"gists_url": "https://api.github.com/users/lra/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lra",
"id": 122286,
"login": "lra",
"node_id": "MDQ6VXNlcjEyMjI4Ng==",
"organizations_url": "https://api.github.com/users/lra/orgs",
"received_events_url": "https://api.github.com/users/lra/received_events",
"repos_url": "https://api.github.com/users/lra/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lra/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lra",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2056/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2056/timeline
| null |
completed
| null | null | false |
[
"It appears this does not only happen on GAE. Investigating...\n",
"```\n~$ openssl s_client -connect qascore.percolate.com:443\nCONNECTED(00000003)\n12470:error:140790E5:SSL routines:SSL23_WRITE:ssl handshake failure:/SourceCache/OpenSSL098/OpenSSL098-50/src/ssl/s23_lib.c:182:\n```\n\nOpenSSL seems to be having trouble with this site too although curiously Python 3 can get a response easily.\n",
"This is an SNI bug. Follow the steps [here](https://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484) and you should be fine. =)\n",
"Thanks, `ndg-httpsclient`and `pyasn1` where enough to fix the pb.\n"
] |
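Era-specific footnote on the fix mentioned in the last comment (Python 2 with requests 2.x): SNI only works once the pyOpenSSL extras are installed. A tiny, hedged check:

``` python
# Reports whether the packages named in the linked answer are importable;
# install them with: pip install pyopenssl ndg-httpsclient pyasn1
try:
    import OpenSSL.SSL      # from pyopenssl
    import ndg.httpsclient  # noqa
    import pyasn1           # noqa
    print("SNI extras available")
except ImportError as exc:
    print("SNI extras missing: %s" % exc)
```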
https://api.github.com/repos/psf/requests/issues/2055
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2055/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2055/comments
|
https://api.github.com/repos/psf/requests/issues/2055/events
|
https://github.com/psf/requests/issues/2055
| 34,035,325 |
MDU6SXNzdWUzNDAzNTMyNQ==
| 2,055 |
duplicate with #1547
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/207571?v=4",
"events_url": "https://api.github.com/users/john-peterson/events{/privacy}",
"followers_url": "https://api.github.com/users/john-peterson/followers",
"following_url": "https://api.github.com/users/john-peterson/following{/other_user}",
"gists_url": "https://api.github.com/users/john-peterson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/john-peterson",
"id": 207571,
"login": "john-peterson",
"node_id": "MDQ6VXNlcjIwNzU3MQ==",
"organizations_url": "https://api.github.com/users/john-peterson/orgs",
"received_events_url": "https://api.github.com/users/john-peterson/received_events",
"repos_url": "https://api.github.com/users/john-peterson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/john-peterson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/john-peterson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/john-peterson",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-21T23:08:17Z
|
2021-09-09T00:00:57Z
|
2014-05-21T23:16:17Z
|
NONE
|
resolved
|
duplicate with https://github.com/kennethreitz/requests/issues/1547
remove this issue if possible
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/207571?v=4",
"events_url": "https://api.github.com/users/john-peterson/events{/privacy}",
"followers_url": "https://api.github.com/users/john-peterson/followers",
"following_url": "https://api.github.com/users/john-peterson/following{/other_user}",
"gists_url": "https://api.github.com/users/john-peterson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/john-peterson",
"id": 207571,
"login": "john-peterson",
"node_id": "MDQ6VXNlcjIwNzU3MQ==",
"organizations_url": "https://api.github.com/users/john-peterson/orgs",
"received_events_url": "https://api.github.com/users/john-peterson/received_events",
"repos_url": "https://api.github.com/users/john-peterson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/john-peterson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/john-peterson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/john-peterson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2055/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2055/timeline
| null |
completed
| null | null | false |
[
"Why aren't you using `pip` to install `requests`? Also, if you're not seeing output from `python setup.py install` what relates it to that StackOverflow question?\n"
] |
https://api.github.com/repos/psf/requests/issues/2054
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2054/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2054/comments
|
https://api.github.com/repos/psf/requests/issues/2054/events
|
https://github.com/psf/requests/issues/2054
| 33,935,618 |
MDU6SXNzdWUzMzkzNTYxOA==
| 2,054 |
Exceptions have other exceptions as messages.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/127497?v=4",
"events_url": "https://api.github.com/users/fletom/events{/privacy}",
"followers_url": "https://api.github.com/users/fletom/followers",
"following_url": "https://api.github.com/users/fletom/following{/other_user}",
"gists_url": "https://api.github.com/users/fletom/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fletom",
"id": 127497,
"login": "fletom",
"node_id": "MDQ6VXNlcjEyNzQ5Nw==",
"organizations_url": "https://api.github.com/users/fletom/orgs",
"received_events_url": "https://api.github.com/users/fletom/received_events",
"repos_url": "https://api.github.com/users/fletom/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fletom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fletom/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fletom",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-20T22:21:06Z
|
2021-09-09T00:00:58Z
|
2014-05-21T00:15:48Z
|
NONE
|
resolved
|
I could be wrong but this seems very strange behaviour to me. I'm referring to the end of `adapters.py` that passes `urllib3` exceptions to `requests` exceptions as the first argument.
This gets very confusing since `exception.message` is another exception. In my case it broke error reporting tools that were expecting a string.
```
ipdb> exception
ConnectionError(MaxRetryError("HTTPConnectionPool(host='localhost', port=8000): Max retries exceeded with url: /users/authentication (Caused by <class 'socket.error'>: [Errno 61] Connection refused)",),)
ipdb> exception.message
MaxRetryError("HTTPConnectionPool(host='localhost', port=8000): Max retries exceeded with url: /users/authentication (Caused by <class 'socket.error'>: [Errno 61] Connection refused)",)
ipdb> type(exception.message)
<class 'requests.packages.urllib3.exceptions.MaxRetryError'>
```
It would seem much more appropriate to me to replace `raise ConnectionError(e)` with `raise ConnectionError(e.message)`. This provides all the same information while still conforming to the Python convention of `raise Exception("string message")`.
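A brief, hedged example of coping with the behaviour as it stands: coerce the wrapped exception to text before handing it to anything that expects a string.

``` python
# The first argument of a requests exception may itself be an exception
# (e.g. urllib3's MaxRetryError), so stringify it before logging/reporting.
import requests

try:
    requests.get("http://localhost:8000/users/authentication")
except requests.exceptions.ConnectionError as exc:
    inner = exc.args[0] if exc.args else exc
    print(type(inner).__name__, str(inner))
```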
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2054/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2054/timeline
| null |
completed
| null | null | false |
[
"It is not the most common behaviour ever but it is in fact a very deliberate design decision. We wrap all errors provided to us by urllib3 (and even some that bubble up from underneath urllib3) and present them to the user in a predictable way. However, for the more advanced user, being able to inspect those lower level error messages (and in fact when trying to help users it is very useful for us as well) is invaluable. You very easily ensure that `e.message` is a string by calling `str` on it if you can manage that yourself. That aside, I consider it your error reporting tools fault for not being more paranoid about the data it collects from user's code.\n\nI appreciate that you took the time to raise this issue @fletom but I don't think we're going to change how we handle exceptions.\n\nCheers!\n",
"I understand that it's useful to retain the original exception. In my opinion, it would be much nicer to have this appear as a secondary attribute on the exception object (e.g. `e.original_exception`) instead of breaking Python exception conventions. Semantically speaking, a \"message\" is obviously a string, not another exception.\n\nBut I'll leave it at that. Thanks for taking time to write a detailed response.\n"
] |
https://api.github.com/repos/psf/requests/issues/2053
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2053/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2053/comments
|
https://api.github.com/repos/psf/requests/issues/2053/events
|
https://github.com/psf/requests/issues/2053
| 33,887,182 |
MDU6SXNzdWUzMzg4NzE4Mg==
| 2,053 |
gevent adapter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/229945?v=4",
"events_url": "https://api.github.com/users/HoverHell/events{/privacy}",
"followers_url": "https://api.github.com/users/HoverHell/followers",
"following_url": "https://api.github.com/users/HoverHell/following{/other_user}",
"gists_url": "https://api.github.com/users/HoverHell/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/HoverHell",
"id": 229945,
"login": "HoverHell",
"node_id": "MDQ6VXNlcjIyOTk0NQ==",
"organizations_url": "https://api.github.com/users/HoverHell/orgs",
"received_events_url": "https://api.github.com/users/HoverHell/received_events",
"repos_url": "https://api.github.com/users/HoverHell/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/HoverHell/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HoverHell/subscriptions",
"type": "User",
"url": "https://api.github.com/users/HoverHell",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 15 |
2014-05-20T13:22:07Z
|
2021-09-08T23:10:42Z
|
2014-08-13T03:08:22Z
|
NONE
|
resolved
|
Hello.
As a feature request, it would be useful to have a gevent-compatible adapter for requests that does not require any dangerous monkey-patching.
So far, I've tried combining `requests` with `geventhttpclient`; that works, but it is a large hack when done in a separate module (https://gist.github.com/anonymous/699cf4beba490024e52c) (and I don't want to break external compatibility with requests by trying to use completely different pool and response classes).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2053/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2053/timeline
| null |
completed
| null | null | false |
[
"I presume that you've seen [grequests](https://github.com/kennethreitz/grequests) and that it's not suitable for your needs?\n",
"@Lukasa as you might see in its source code, it applies gevent.monkeys.patch_all, which is known to break various things and is a generally dangerous thing to do. (worse, it does that on import-)\n",
"What specifically does it break for you @HoverHell?\n",
"@sigmavirus24 I think that's mostly out of the issue. Regardless, it threw an “this call would block forever” exception (on a code which otherwise works normally) which is enough to be suspicious; additionally (but not critically), it is known to have some problems with celery, raven, possibly uwsgi, and various other things. Not quite something I would want to deal with in an actively used django application.\n",
"@HoverHell grequests isn't exactly an alternative implementation; it's a little more like an example script or basic wrapper with some convenience methods (like mapping).\n\nSome slightly less ham-handed monkeypatching might accomplish what you want, perhaps by doing `httplib.socket = gevent.socket; urllib3.socket = gevent.socket; requests.socket = gevent.socket`. Otherwise, there is no real way of doing it \"the right way\" due to requests' dependence on urllib3, and urllib3's dependence on httplib from the standard lib. Your geventhttpclient adapter would likely be the overall safest bet in that regard (and would probably give you some speed increase as well, due to geventhttpclient being written specifically for performance with gevent), but of course it's going to bring its own problems as you say.\n\nFor what it's worth, though, I've been using `gevent.monkey.patch_all()` in many of my applications for a long time with utterly no issues. I can see how it might cause issues with other big network-based third party libraries, but I think it's worth trying the monkey patching in new code before looking at alternatives. I can confirm that global monkey patching seems to work just fine with requests at least, without any need for grequests.\n\n> it threw an “this call would block forever” exception (on a code which otherwise works normally) which is enough to be suspicious\n\nIt is rare to see such an exception when using gevent as intended, I believe.\n",
"> It is rare to see such an extension when using gevent as intended, I believe.\n\nI agree. I don't think I've ever seen that problem before when using gevent correctly.\n",
"@Anorov \n\n> Your geventhttpclient adapter would likely be the overall safest bet in that regard\n> Certainly; my question is: can it be done in a less hack-ish / more future-proof way, perhaps by integrating it more with the requests library e.g. making few minor changes in the methods arhitecture of the requests connection / connectionpool modules?\n\n@sigmavirus24 \n\n> I agree. I don't think I've ever seen that problem before when using gevent correctly.\n\nI wasn't even using gevent at that point – I simply installed grequests, and another imported library (which wasn't being used either) imported grequests; in all the other regards it is simply a django application.\n\n(though, thinking of it, I could guess that it is the late importing that made it problematic; still, I expect other problems regardless)\n",
"@HoverHell I'm starting to question whether this issue is even suited for this project. Your trouble seems to arise from that project, not requests, and you've been dismissive of efforts to try to help you in using it. I'm also surprised you're intent on using gevent when you're so convinced that it's going to cause you so many problems.\n",
"@sigmavirus24 I'm intent on using it (or an alternative, really) without _potentially_ problematic monkeypatching (sometimes even small chances of breaking something are too much); that's why I need more support from the requests.\n",
"@HoverHell you haven't told us how we can support you further though? There aren't changes that need to be made to requests to support gevent either by monkey patching everything or only a small number of things. We've tried to help but with every suggestion we're told it is insufficient.\n",
"@HoverHell I believe my suggestion of monkeypatching the `socket` module in `requests` and all of its dependencies are the quickest \"less-hacky\" ways of accomplishing this.\n\ngevent itself was kind of designed as a hack: a way to add green threads into Python without a new interpreter and/or new standard lib. The gevent designer(s) tried to make the monkeypatched modules as backwards compatible as possible, but obviously monkeypatching will never be perfect.\n\nYour absolute safest bet, if you wanted to make a fully compatible clone that defers to geventhttpclient and does no actual monkeypatching, is probably to make an alternative urllib3 implementation that uses geventhttpclient's swap-in httplib as described in [its README](https://github.com/gwik/geventhttpclient/blob/master/README.mdown):\n\n```\nfrom geventhttpclient.httplib import HTTPConnection\n```\n\nurllib3 doesn't really support adapters, so you'd have to make a new `geurllib3` module I suppose.\n\nYou'd basically just change a few of the imports in `connection.py` [here](https://github.com/shazow/urllib3/blob/98c6fbfc27d7d51327ad85a85a80dd4fe096cc79/urllib3/connection.py#L14-L30).\n\nThen you should probably be able to subclass `HTTPAdapter` and use the new `geurllib3` module's classes and functions instead.\n\nThis would have to rely on geventhttpclient's version of httplib being 100% compatible with the standard lib's. I'd recommend running through the full urllib3 and requests test suite after making these changes. If you discover a bug, you can open an issue on the geventhttpclient repo. \n\nEither way, I agree this is out of scope as a requests issue. It's just a personal decision you'll have to make.\n",
"@sigmavirus24, @Anorov as you can see by the link in the first message, I've already made a no-monkeypatching adapter through subclassing. However, it is variably hack-ish and incomplete and unreliable.\n\nMy question, thus, is: what changes can be made to requests that would make it better? The very least would be making `pool_classes_by_scheme` an attribute of the PoolManager; and, likely, adding more attributes like ConnectionCls.\n\nBasically, my problem is that the whole default _HTTPAdapter_ of the requests is nearly a monolithic untweakable thing if you don't count in monkeypatching.\n",
"> The very least would be making pool_classes_by_scheme an attribute of the PoolManager; and, likely, adding more attributes like ConnectionCls.\n\nThose are all changes to `urllib3`, not `requests`. \n\n> Basically, my problem is that the whole default HTTPAdapter of the requests is nearly a monolithic untweakable thing\n\nIt's actually quite tweakable and there are many examples of it being tweakable. It might be easier for you to explain yourself if you **start** a PR with some of the changes you think are necessary, but I have to admit that they'll have to be obviously beneficial changes to the overall project, especially since there's already a way to use requests with gevent.\n",
"Most of the changes will indeed need to be made in `urllib3`. I could see some of those being accepted; you'd need to open a PR with `urllib3` though.\n",
"Closing due to inactivity\n"
] |
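For reference, the narrower monkeypatch floated in the thread above, written out (its effectiveness is not verified here, and it remains a hack rather than a supported API):

``` python
# Swap in gevent's cooperative socket for the modules that open connections,
# instead of gevent.monkey.patch_all(). Python 2 module names, as in the thread.
import gevent.socket
import httplib  # http.client on Python 3

import requests
from requests.packages import urllib3

httplib.socket = gevent.socket   # httplib resolves socket.create_connection at call time
urllib3.socket = gevent.socket
requests.socket = gevent.socket
```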
https://api.github.com/repos/psf/requests/issues/2052
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2052/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2052/comments
|
https://api.github.com/repos/psf/requests/issues/2052/events
|
https://github.com/psf/requests/issues/2052
| 33,802,209 |
MDU6SXNzdWUzMzgwMjIwOQ==
| 2,052 |
Python 3 - TypeError: decoding str is not supported
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/186643?v=4",
"events_url": "https://api.github.com/users/cam-stitt/events{/privacy}",
"followers_url": "https://api.github.com/users/cam-stitt/followers",
"following_url": "https://api.github.com/users/cam-stitt/following{/other_user}",
"gists_url": "https://api.github.com/users/cam-stitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cam-stitt",
"id": 186643,
"login": "cam-stitt",
"node_id": "MDQ6VXNlcjE4NjY0Mw==",
"organizations_url": "https://api.github.com/users/cam-stitt/orgs",
"received_events_url": "https://api.github.com/users/cam-stitt/received_events",
"repos_url": "https://api.github.com/users/cam-stitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cam-stitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cam-stitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cam-stitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-19T14:04:32Z
|
2021-09-09T00:00:58Z
|
2014-05-20T10:12:05Z
|
NONE
|
resolved
|
I am getting the above error thrown when I attempt to get the response text when the content is a dict-string (i.e. '{}'). I've found that it is this line, but I am not sure how to fix it at this point:
https://github.com/kennethreitz/requests/blob/master/requests/models.py#L738
If you were to remove the `errors` keyword, it works fine. I believe it's because the content is not being returned as bytes, which is what `str` expects.
https://docs.python.org/3.3/library/stdtypes.html#str
Not sure how to fix this yet.
Note: if you want to replicate this, use Python 3 and set content to '{}' and try to return `text`
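To make the constraint concrete, a minimal Python 3 reproduction of the error in the title: `str(x, encoding, errors)` only accepts bytes-like input, which is why `Response.content` must always be bytes.

``` python
# str() with encoding/errors arguments decodes bytes; handing it a str raises
# exactly the TypeError from the issue title.
print(str(b"{}", "utf-8", errors="replace"))      # '{}'

try:
    str("{}", "utf-8", errors="replace")
except TypeError as exc:
    print(exc)                                    # decoding str is not supported
```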
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/186643?v=4",
"events_url": "https://api.github.com/users/cam-stitt/events{/privacy}",
"followers_url": "https://api.github.com/users/cam-stitt/followers",
"following_url": "https://api.github.com/users/cam-stitt/following{/other_user}",
"gists_url": "https://api.github.com/users/cam-stitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cam-stitt",
"id": 186643,
"login": "cam-stitt",
"node_id": "MDQ6VXNlcjE4NjY0Mw==",
"organizations_url": "https://api.github.com/users/cam-stitt/orgs",
"received_events_url": "https://api.github.com/users/cam-stitt/received_events",
"repos_url": "https://api.github.com/users/cam-stitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cam-stitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cam-stitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cam-stitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2052/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2052/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this!\n\nContent must be bytes, always. If you're writing Transport Adapters or generally tweaking requests, you must ensure that content will always be bytes.\n",
"I'm seeing this when using requests-oauthlib with facebook. Is this\nsomething that will be fixed in requests? Might even look into it myself\ntomorrow.\nOn May 20, 2014 12:11 AM, \"Cory Benfield\" [email protected] wrote:\n\n> Thanks for raising this!\n> \n> Content must be bytes, always. If you're writing Transport Adapters or\n> generally tweaking requests, you must ensure that content will always be\n> bytes.\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/2052#issuecomment-43508779\n> .\n",
"Again, let's be clear: the idea that content will be bytes is a deliberate design decision, and so will not be 'fixed'. After all, it's intended behaviour. If there's a bug it's in requests-oauthlib.\n",
"No problems. I'll get a PR to requests-oauthlib. Thanks for clarifying it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2051
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2051/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2051/comments
|
https://api.github.com/repos/psf/requests/issues/2051/events
|
https://github.com/psf/requests/issues/2051
| 33,784,685 |
MDU6SXNzdWUzMzc4NDY4NQ==
| 2,051 |
Content length value is written as unicode rather than bytes
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/769982?v=4",
"events_url": "https://api.github.com/users/Lawouach/events{/privacy}",
"followers_url": "https://api.github.com/users/Lawouach/followers",
"following_url": "https://api.github.com/users/Lawouach/following{/other_user}",
"gists_url": "https://api.github.com/users/Lawouach/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lawouach",
"id": 769982,
"login": "Lawouach",
"node_id": "MDQ6VXNlcjc2OTk4Mg==",
"organizations_url": "https://api.github.com/users/Lawouach/orgs",
"received_events_url": "https://api.github.com/users/Lawouach/received_events",
"repos_url": "https://api.github.com/users/Lawouach/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lawouach/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lawouach/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lawouach",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-19T09:52:51Z
|
2021-09-09T00:00:58Z
|
2014-05-19T09:57:00Z
|
NONE
|
resolved
|
Using the following simple snippet:
```
>>> files = {'myfile': ('filename.zip', file('some.zip', 'rb'), 'application/octet-stream')}
>>> r = requests.Request('POST', URL, files=files).prepare()
>>> r.headers
{'Content-Length': u'395', 'Content-Type': 'multipart/form-data; boundary=ec38bda3760c46aea548602228efdd47'}
```
This is an issue because httplib will then decide it must decode the body as well before sending it (at least in Python 2.7.1; I assume with 3.x it's all bytes anyway). This fails when the content is binary.
Massaging the Content-Length to revert it to a byte string does work well.
This was with requests 1.1.0.
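A small hedged check to confirm what type the prepared Content-Length header actually has on a given requests version (httpbin.org is used only as a placeholder URL):
``` python
import requests

prep = requests.Request('POST', 'https://httpbin.org/post', data=b'abc').prepare()
print(type(prep.headers['Content-Length']), prep.headers['Content-Length'])
# on versions with the fix this is a native str ('3'), not u'3'
```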
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2051/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2051/timeline
| null |
completed
| null | null | false |
[
"This is long-since fixed, with the fix originally in version 2.1.0. This is mentioned in [the release notes](https://github.com/kennethreitz/requests/blob/master/HISTORY.rst#210-2013-12-05), was tracked under issue #1688 and was fixed in issue #1699. =)\n",
"My bad.\n",
"No worries @Lawouach ! We appreciate that you took the time to report it\n",
"Thanks. My previous comment was a bit short, sorry. Thanks for the swift reply and, really, I should have looked at the release notes. Thanks guys :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2050
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2050/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2050/comments
|
https://api.github.com/repos/psf/requests/issues/2050/events
|
https://github.com/psf/requests/pull/2050
| 33,743,358 |
MDExOlB1bGxSZXF1ZXN0MTYwMjU1MDQ=
| 2,050 |
Added instructions and sample code for using requests over Tor
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2611615?v=4",
"events_url": "https://api.github.com/users/Hasimir/events{/privacy}",
"followers_url": "https://api.github.com/users/Hasimir/followers",
"following_url": "https://api.github.com/users/Hasimir/following{/other_user}",
"gists_url": "https://api.github.com/users/Hasimir/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Hasimir",
"id": 2611615,
"login": "Hasimir",
"node_id": "MDQ6VXNlcjI2MTE2MTU=",
"organizations_url": "https://api.github.com/users/Hasimir/orgs",
"received_events_url": "https://api.github.com/users/Hasimir/received_events",
"repos_url": "https://api.github.com/users/Hasimir/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Hasimir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hasimir/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Hasimir",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-05-18T01:32:07Z
|
2021-09-08T23:06:18Z
|
2014-05-18T09:23:58Z
|
CONTRIBUTOR
|
resolved
|
Added instructions and sample code for using requests over the Tor network.
Sample code shows how to verify a successful connection and how to connect to a hidden service. Example sites are https://check.torproject.org/ and http://ic6au7wa3f6naxjq.onion/ (www.gnupg.org).
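For readers landing here later, a hedged sketch of the same idea using the optional SOCKS support requests gained after this PR (installed via `pip install "requests[socks]"`); it assumes a local Tor client listening on 127.0.0.1:9050:
``` python
import requests

proxies = {
    'http': 'socks5h://127.0.0.1:9050',    # socks5h lets Tor resolve hostnames
    'https': 'socks5h://127.0.0.1:9050',
}

r = requests.get('https://check.torproject.org/', proxies=proxies)
print('Congratulations' in r.text)         # True when the traffic exits via Tor
```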
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2611615?v=4",
"events_url": "https://api.github.com/users/Hasimir/events{/privacy}",
"followers_url": "https://api.github.com/users/Hasimir/followers",
"following_url": "https://api.github.com/users/Hasimir/following{/other_user}",
"gists_url": "https://api.github.com/users/Hasimir/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Hasimir",
"id": 2611615,
"login": "Hasimir",
"node_id": "MDQ6VXNlcjI2MTE2MTU=",
"organizations_url": "https://api.github.com/users/Hasimir/orgs",
"received_events_url": "https://api.github.com/users/Hasimir/received_events",
"repos_url": "https://api.github.com/users/Hasimir/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Hasimir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hasimir/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Hasimir",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2050/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2050/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2050.diff",
"html_url": "https://github.com/psf/requests/pull/2050",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2050.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2050"
}
| true |
[
"This is great work @Hasimir! I really appreciate your effort in putting this together.\n\nPersonally, I think this is better suited to a blog post and not the documentation. I'll leave this open, though, until @Lukasa wakes up. :)\n",
"No reason why it can't be both. ;)\n\nBesides, it's merely a conversion (well, more basic version) of some existing scripts I use to make sure my website is visible beyond my home network. I've also got some Twython scripts using it, though more as an exercise to see if it'd work (it did). I figured I may as well share.\n\nAnd the advantage of adding it to the docs is the new users who will encounter requests months and even years down the track, when a blog entry is ancient history.\n",
"This is brilliant, @Hasimir! Unfortunately, I don't think it belongs in the documentation. =)\n\nRequests documentation doesn't really contain 'recipes', or pre-canned instructions for how to do specific things. This is for two reasons: firstly, they add greatly to the (already not inconsiderable) size of the docs; and secondly, they then become something that needs to be actively maintained. This is harder for recipes like this one because they depend on external projects, meaning that if those projects change our documentation becomes out of date with no real way for us to spot it.\n\nI agree that a blog post is a good idea. You'll also find that a good blog post providing a tutorial on using Requests in a certain way has a tendency to move up quite high in the Google listings. As two examples from my own blog, the specific search [\"force ssl version python requests\"](https://www.google.co.uk/search#q=force+ssl+version+python+requests) has my blog post as the top entry, and even the fairly general [\"proxies python requests\"](https://www.google.co.uk/search#q=proxies+python+requests) has my blog post on the front page. This is really where the information you've provided should belong. =)\n",
"You do make an excellent point regarding the code maintenance and the rest, I shall withdraw this and redirect it elsewhere. Probably my own repo of assorted requests and Twython stuff. Rewriting it as a part of a blog post or similar is probably the better course of action. I shall withdraw the pull request and torment my poor little wordpress installation a bit later.\n",
"@Hasimir Do send me and @sigmavirus24 tweets when you write it and we'll make sure we RT and talk about your blog post lots. =D\n",
"Will do, just followed the pair of you (my account is easy to spot, it uses my real name and the same profile image ... and involvement in a certain party, but you'll see that soon enough). Presumably such a post will get a little more traction than my last geeky post (I made yet another software license ... I was bored and it's short).\n\nAnyway, aside from the requests over Tor thing I've also got a method of securing all those API passwords and tokens, even if you have to leave some of them lying around on disk. From my very brief look at the session module in Ian's github3.py I suspect that might be well received too and it's really quite simple. My code is aimed at the Twitter API and Twython, of course, but should be able to be ported without too much trouble.\n"
] |
https://api.github.com/repos/psf/requests/issues/2049
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2049/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2049/comments
|
https://api.github.com/repos/psf/requests/issues/2049/events
|
https://github.com/psf/requests/pull/2049
| 33,741,244 |
MDExOlB1bGxSZXF1ZXN0MTYwMjQ1OTA=
| 2,049 |
Separated out proxy_manager_for to fix #2048
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/477909?v=4",
"events_url": "https://api.github.com/users/codedstructure/events{/privacy}",
"followers_url": "https://api.github.com/users/codedstructure/followers",
"following_url": "https://api.github.com/users/codedstructure/following{/other_user}",
"gists_url": "https://api.github.com/users/codedstructure/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/codedstructure",
"id": 477909,
"login": "codedstructure",
"node_id": "MDQ6VXNlcjQ3NzkwOQ==",
"organizations_url": "https://api.github.com/users/codedstructure/orgs",
"received_events_url": "https://api.github.com/users/codedstructure/received_events",
"repos_url": "https://api.github.com/users/codedstructure/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/codedstructure/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codedstructure/subscriptions",
"type": "User",
"url": "https://api.github.com/users/codedstructure",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
}
] |
closed
| true | null |
[] | null | 14 |
2014-05-17T22:42:52Z
|
2021-09-08T23:07:11Z
|
2014-06-23T19:11:13Z
|
CONTRIBUTOR
|
resolved
|
Basically identical to @sigmavirus24's suggested code - thanks for that pointer. Just added docstring.
There aren't tests for particular Adapter implementations, and I've not added any at this stage.
I have tested it with the following (which achieves what I want):
```
class SourceAddressAdapter(requests.adapters.HTTPAdapter):
    """
    A Requests HTTP adapter to add source address specification for proxies
    """
    def __init__(self, *o, **k):
        self.source_address = k.pop('source_address', None)
        super(SourceAddressAdapter, self).__init__(*o, **k)

    def proxy_manager_for(self, proxy):
        conn_params = {}
        if self.source_address is not None:
            conn_params['source_address'] = (self.source_address, 0)
        if not proxy in self.proxy_manager:
            proxy_headers = self.proxy_headers(proxy)
            self.proxy_manager[proxy] = requests.adapters.proxy_from_url(
                proxy,
                proxy_headers=proxy_headers,
                num_pools=self._pool_connections,
                maxsize=self._pool_maxsize,
                block=self._pool_block,
                **conn_params)
        return self.proxy_manager[proxy]
```
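A hedged usage sketch for the adapter above (the source address is a placeholder): mount it on a session so requests made through that session pick it up for both schemes.
``` python
import requests

session = requests.Session()
adapter = SourceAddressAdapter(source_address='192.0.2.10')   # example address
session.mount('http://', adapter)
session.mount('https://', adapter)
```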
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2049/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2049/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2049.diff",
"html_url": "https://github.com/psf/requests/pull/2049",
"merged_at": "2014-06-23T19:11:13Z",
"patch_url": "https://github.com/psf/requests/pull/2049.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2049"
}
| true |
[
"LGTM. @codedstructure you should checkout [the toolbelt](https://gitlab.com/sigmavirus24/toolbelt). We just moved the `SourceAddressAdapter` there so people can use it without having to maintain their own version.\n",
"This looks great @codedstructure, but I want to bikeshed here for a moment. I'm wondering whether the right API is actually for `proxy_manager_for` to take an arbitrary keyword argument dict that it passes to `proxy_from_url`. This saves several lines from the `SourceAddressAdapter` because it can now do:\n\n``` python\nclass SourceAddressAdapter(requests.adapters.HTTPAdapter):\n \"\"\"\n A Requests HTTP adapter to add source address specification for proxies\n \"\"\"\n def __init__(self, *o, **k):\n self.source_address = k.pop('source_address', None)\n super(SourceAddressAdapter, self).__init__(*o, **k)\n\n def proxy_manager_for(self, proxy, **proxy_kwargs):\n if self.source_address is not None:\n proxy_kwargs['source_address'] = (self.source_address, 0)\n\n return super(SourceAddressAdapter, self).proxy_manager_for(proxy, **proxy_kwargs)\n```\n\nThoughts?\n",
"@Lukasa that sounds like a good idea. The revised PR would look like this then?\n\n``` python\n def proxy_manager_for(self, proxy, **proxy_kwargs):\n \"\"\"Return urllib3 ProxyManager for the given proxy.\n\n This method should not be called from user code, and is only\n exposed for use when subclassing the\n :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.\n\n :param proxy: The proxy to return a urllib3 ProxyManager for.\n :param proxy_kwargs: Extra keyword arguments used to configure the Proxy Manager.\n :returns: ProxyManager\n \"\"\"\n if not proxy in self.proxy_manager:\n proxy_headers = self.proxy_headers(proxy)\n self.proxy_manager[proxy] = proxy_from_url(\n proxy,\n proxy_headers=proxy_headers,\n num_pools=self._pool_connections,\n maxsize=self._pool_maxsize,\n block=self._pool_block,\n **proxy_kwargs)\n\n return self.proxy_manager[proxy]\n```\n",
"Thanks both, I'm 100% in favour of anything which makes adapters easier to write. The PR code now looks like @sigmavirus24' code above.\n\nAlthough while we're at it / (& for consistency), should `init_poolmanager` also have a similar kwargs parameter `pool_kwargs`?\n",
"Consistency is the hobgoblin of the foolish mind. That aside, I'm in favor of consistency. I just like keeping PRs narrowly focused to one topic. If you're going to make that change, _I_ would prefer it in a separate PR, but @Lukasa should feel free to disagree. \n",
"Been presumptuous and implemented the above on the branch; makes SourceAddressAdapter potentially less dependent on changes to `init_poolmanager` and removes the requirement to import the urllib3 PoolManager.\n",
"Understand, I can revert if required.\n",
"Sorry this took me so long to get to. One small code review markup and then we can pass it to Kenneth.\n",
"Improved indentation as requested, cheers.\n",
"Looking forward to it!\n",
"@kennethreitz I think we're good to go at this stage. =)\n",
":sparkles: :cake: :sparkles:\n",
"Sorry for the delay :)\n",
"Thanks all, appreciate the encouraging community here.\n"
] |
https://api.github.com/repos/psf/requests/issues/2048
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2048/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2048/comments
|
https://api.github.com/repos/psf/requests/issues/2048/events
|
https://github.com/psf/requests/issues/2048
| 33,739,343 |
MDU6SXNzdWUzMzczOTM0Mw==
| 2,048 |
add equivalent to adapter 'init_poolmanager' method for proxy-based connections
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/477909?v=4",
"events_url": "https://api.github.com/users/codedstructure/events{/privacy}",
"followers_url": "https://api.github.com/users/codedstructure/followers",
"following_url": "https://api.github.com/users/codedstructure/following{/other_user}",
"gists_url": "https://api.github.com/users/codedstructure/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/codedstructure",
"id": 477909,
"login": "codedstructure",
"node_id": "MDQ6VXNlcjQ3NzkwOQ==",
"organizations_url": "https://api.github.com/users/codedstructure/orgs",
"received_events_url": "https://api.github.com/users/codedstructure/received_events",
"repos_url": "https://api.github.com/users/codedstructure/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/codedstructure/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codedstructure/subscriptions",
"type": "User",
"url": "https://api.github.com/users/codedstructure",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-05-17T20:40:02Z
|
2021-09-08T23:10:54Z
|
2014-06-23T19:11:13Z
|
CONTRIBUTOR
|
resolved
|
I'm trying to make an HTTPAdapter which injects `source_address` - after writing my own I found https://github.com/kennethreitz/requests/issues/2008#issuecomment-40793099, which it is virtually identical to.
However this fails when proxy configuration is in use as `self.poolmanager` is not used. I suspect overriding `get_connection` is required to accomplish this now; it would be nice if https://github.com/kennethreitz/requests/blob/59c8d81/requests/adapters.py#L210-L215 could become another method similar to `init_poolmanager`, so proxy-based connections could be more easily 'adapted'.
Happy to attempt a PR if this idea has merit / is considered 'in-scope' :-)
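For context, a hedged sketch of the `init_poolmanager`-based approach referenced above (names are illustrative and the import path is the one used at the time; modern code would import urllib3 directly). It injects `source_address` for direct connections but is bypassed when a proxy is configured, which is exactly the gap described here:
``` python
import requests
from requests.packages.urllib3.poolmanager import PoolManager

class SourceAddressAdapter(requests.adapters.HTTPAdapter):
    def __init__(self, source_address, **kwargs):
        self.source_address = source_address
        super(SourceAddressAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, connections, maxsize, block=False):
        # only used for non-proxied connections; get_connection ignores it
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       source_address=(self.source_address, 0))
```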
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2048/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2048/timeline
| null |
completed
| null | null | false |
[
"I think there's no good reason not to do this, and it would definitely make this easier. @sigmavirus24?\n",
"I would go one step further than the lines you highlighted. That code in `get_connection` could read:\n\n``` python\n if proxy:\n proxy = prepend_scheme_if_needed(proxy, 'http')\n proxy_manager = self.proxy_manager_for(proxy)\n conn = proxy_manager.connection_from_url(url)\n```\n\nWith the method\n\n``` python\ndef proxy_manager_for(self, proxy):\n if not proxy in self.proxy_manager:\n proxy_headers = self.proxy_headers(proxy)\n self.proxy_manager[proxy] = proxy_from_url(\n proxy,\n proxy_headers=proxy_headers,\n num_pools=self._pool_connections,\n maxsize=self._pool_maxsize,\n block=self._pool_block)\n return self.proxy_manager[proxy]\n```\n",
"And yes I'm :+1:. Go for it @codedstructure \n"
] |
https://api.github.com/repos/psf/requests/issues/2047
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2047/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2047/comments
|
https://api.github.com/repos/psf/requests/issues/2047/events
|
https://github.com/psf/requests/pull/2047
| 33,732,865 |
MDExOlB1bGxSZXF1ZXN0MTYwMjEzMjg=
| 2,047 |
Fixed paragraph formatting in Urllib3/MIT License.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2611615?v=4",
"events_url": "https://api.github.com/users/Hasimir/events{/privacy}",
"followers_url": "https://api.github.com/users/Hasimir/followers",
"following_url": "https://api.github.com/users/Hasimir/following{/other_user}",
"gists_url": "https://api.github.com/users/Hasimir/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Hasimir",
"id": 2611615,
"login": "Hasimir",
"node_id": "MDQ6VXNlcjI2MTE2MTU=",
"organizations_url": "https://api.github.com/users/Hasimir/orgs",
"received_events_url": "https://api.github.com/users/Hasimir/received_events",
"repos_url": "https://api.github.com/users/Hasimir/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Hasimir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hasimir/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Hasimir",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-17T15:36:44Z
|
2021-09-08T23:07:21Z
|
2014-05-17T15:37:31Z
|
CONTRIBUTOR
|
resolved
|
No change in content, but the longer lines flowed badly in a terminal or editor.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2047/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2047/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2047.diff",
"html_url": "https://github.com/psf/requests/pull/2047",
"merged_at": "2014-05-17T15:37:31Z",
"patch_url": "https://github.com/psf/requests/pull/2047.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2047"
}
| true |
[
"Thanks! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2046
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2046/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2046/comments
|
https://api.github.com/repos/psf/requests/issues/2046/events
|
https://github.com/psf/requests/pull/2046
| 33,697,047 |
MDExOlB1bGxSZXF1ZXN0MTYwMDA3MzE=
| 2,046 |
Fixed a typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/772?v=4",
"events_url": "https://api.github.com/users/alex/events{/privacy}",
"followers_url": "https://api.github.com/users/alex/followers",
"following_url": "https://api.github.com/users/alex/following{/other_user}",
"gists_url": "https://api.github.com/users/alex/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alex",
"id": 772,
"login": "alex",
"node_id": "MDQ6VXNlcjc3Mg==",
"organizations_url": "https://api.github.com/users/alex/orgs",
"received_events_url": "https://api.github.com/users/alex/received_events",
"repos_url": "https://api.github.com/users/alex/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alex",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-16T18:11:06Z
|
2021-09-08T23:07:12Z
|
2014-05-16T21:05:41Z
|
MEMBER
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2046/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2046/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2046.diff",
"html_url": "https://github.com/psf/requests/pull/2046",
"merged_at": "2014-05-16T21:05:41Z",
"patch_url": "https://github.com/psf/requests/pull/2046.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2046"
}
| true |
[
"Thanks @alex \n"
] |
|
https://api.github.com/repos/psf/requests/issues/2045
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2045/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2045/comments
|
https://api.github.com/repos/psf/requests/issues/2045/events
|
https://github.com/psf/requests/issues/2045
| 33,675,293 |
MDU6SXNzdWUzMzY3NTI5Mw==
| 2,045 |
Uncaught socket.timeout during post
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1611076?v=4",
"events_url": "https://api.github.com/users/KennethNielsen/events{/privacy}",
"followers_url": "https://api.github.com/users/KennethNielsen/followers",
"following_url": "https://api.github.com/users/KennethNielsen/following{/other_user}",
"gists_url": "https://api.github.com/users/KennethNielsen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/KennethNielsen",
"id": 1611076,
"login": "KennethNielsen",
"node_id": "MDQ6VXNlcjE2MTEwNzY=",
"organizations_url": "https://api.github.com/users/KennethNielsen/orgs",
"received_events_url": "https://api.github.com/users/KennethNielsen/received_events",
"repos_url": "https://api.github.com/users/KennethNielsen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/KennethNielsen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KennethNielsen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/KennethNielsen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-16T13:40:29Z
|
2021-09-08T23:10:40Z
|
2014-08-02T01:33:56Z
|
NONE
|
resolved
|
Hello requests devs (and thanks for an awesome lib)
During a specific `requests.post` I usually get a `requests.exceptions.Timeout` (on timeouts), but I also sometimes get a raw `socket.timeout` exception. Since there is a requests exception for it, I assume the socket exception was supposed to be caught and the requests one raised instead. The full stack trace is:
``` python
Traceback (most recent call last):
File "test.py", line 132, in <module>
test_stuff()
File "test.py", line 113, in test_stuff
browse_recursively()
File "test.py", line 106, in browse_recursively
browse_recursively(new_item, level + 1)
File "test.py", line 106, in browse_recursively
browse_recursively(new_item, level + 1)
File "test.py", line 106, in browse_recursively
browse_recursively(new_item, level + 1)
File "test.py", line 106, in browse_recursively
browse_recursively(new_item, level + 1)
File "test.py", line 101, in browse_recursively
for new_item in wimp.browse(item):
File "/home/kenneth/code/soco/SoCo/soco/plugins/wimp.py", line 207, in browse
response = post(self._url, headers, body)
File "/home/kenneth/code/soco/SoCo/soco/plugins/wimp.py", line 40, in post
out = requests.post(url, headers=headers, data=body, timeout=1.0)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 87, in post
return request('post', url, data=data, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 279, in request
resp = self.send(prep, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 374, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 222, in send
r.content
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 550, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/lib/python2.7/dist-packages/requests/utils.py", line 363, in stream_decompress
for chunk in iterator:
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 496, in generate
chunk = self.raw.read(chunk_size)
File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 146, in read
return self._fp.read(amt)
File "/usr/lib/python2.7/httplib.py", line 567, in read
s = self.fp.read(amt)
File "/usr/lib/python2.7/socket.py", line 380, in read
data = self._sock.recv(left)
socket.timeout: timed out
```
The development is for a plugin for the music service Wimp for the [SoCo](https://github.com/SoCo/SoCo) project, which means that I could post code to reproduce, but you would not be able to run it without a Sonos speaker and a Wimp subscription. I understand that this difficulty in reproducing may mean you cannot work on this issue.
Thanks in advance Kenneth
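Until the wrapping is fixed, a hedged workaround sketch (the URL is just an illustrative slow endpoint) is to catch both the requests exception and the raw socket timeout around the request and body read:
``` python
import socket
import requests

try:
    r = requests.get('https://httpbin.org/delay/10', timeout=1.0, stream=True)
    payload = r.content            # the body read that leaked socket.timeout above
except (requests.exceptions.Timeout, socket.timeout):
    payload = None                 # treat both forms as an ordinary timeout
```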
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2045/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2045/timeline
| null |
completed
| null | null | false |
[
"Thanks so much for this! :cake:\n\nThis looks like a good catch. I think the generator created in `Response.iter_content` should probably be looking for Timeout errors, both from urllib3 and from the socket module, and should catch and wrap them. @sigmavirus24?\n",
"Sounds like a good idea to me\n",
"i think this is the underlying issue in urllib3: https://github.com/shazow/urllib3/pull/297\n",
"Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/2044
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2044/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2044/comments
|
https://api.github.com/repos/psf/requests/issues/2044/events
|
https://github.com/psf/requests/issues/2044
| 33,572,995 |
MDU6SXNzdWUzMzU3Mjk5NQ==
| 2,044 |
UnicodeDecodeError when using POST with files AND Authentication
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7556471?v=4",
"events_url": "https://api.github.com/users/p-clem/events{/privacy}",
"followers_url": "https://api.github.com/users/p-clem/followers",
"following_url": "https://api.github.com/users/p-clem/following{/other_user}",
"gists_url": "https://api.github.com/users/p-clem/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/p-clem",
"id": 7556471,
"login": "p-clem",
"node_id": "MDQ6VXNlcjc1NTY0NzE=",
"organizations_url": "https://api.github.com/users/p-clem/orgs",
"received_events_url": "https://api.github.com/users/p-clem/received_events",
"repos_url": "https://api.github.com/users/p-clem/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/p-clem/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/p-clem/subscriptions",
"type": "User",
"url": "https://api.github.com/users/p-clem",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 19 |
2014-05-15T10:25:26Z
|
2021-09-08T23:10:43Z
|
2014-05-19T07:31:43Z
|
NONE
|
resolved
|
Whenever I try to do a POST with an image as a file and use authentication:
```
r = requests.post(
    url,
    data=data,
    auth=('auth','pass'),
    files={'image': object.image},
)
```
I get this error:
UnicodeDecodeError
'ascii' codec can't decode byte 0xff in position 1141: ordinal not in range(128)
If I remove the authentication it works.
The full Traceback:
```
Environment:
Request Method: POST
Request URL: http://127.0.0.1:8000/admin/magazines/magazine/71/
Django Version: 1.6.2
Python Version: 2.7.0
Installed Applications:
('django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.sites',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.admin',
'south',
'ajax_select',
'shops',
'feeds',
'magazines')
Installed Middleware:
('django.middleware.common.CommonMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware')
Traceback:
File "C:\Python27\lib\site-packages\django\core\handlers\base.py" in get_response
114. response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "C:\Python27\lib\site-packages\django\contrib\admin\options.py" in wrapper
432. return self.admin_site.admin_view(view)(*args, **kwargs)
File "C:\Python27\lib\site-packages\django\utils\decorators.py" in _wrapped_view
99. response = view_func(request, *args, **kwargs)
File "C:\Python27\lib\site-packages\django\views\decorators\cache.py" in _wrapped_view_func
52. response = view_func(request, *args, **kwargs)
File "C:\Python27\lib\site-packages\django\contrib\admin\sites.py" in inner
198. return view(request, *args, **kwargs)
File "C:\Python27\lib\site-packages\django\utils\decorators.py" in _wrapper
29. return bound_func(*args, **kwargs)
File "C:\Python27\lib\site-packages\django\utils\decorators.py" in _wrapped_view
99. response = view_func(request, *args, **kwargs)
File "C:\Python27\lib\site-packages\django\utils\decorators.py" in bound_func
25. return func(self, *args2, **kwargs2)
File "C:\Python27\lib\site-packages\django\db\transaction.py" in inner
339. return func(*args, **kwargs)
File "C:\Python27\lib\site-packages\django\contrib\admin\options.py" in change_view
1230. self.save_model(request, new_object, form, True)
File "C:\Python27\lib\site-packages\django\contrib\admin\options.py" in save_model
860. obj.save()
File "C:\DEV\projects\stylr\stylr\apps\magazines\models.py" in save
48. files={'logo': self.logo, 'image': self.cover},
File "C:\Python27\lib\site-packages\requests\api.py" in post
88. return request('post', url, data=data, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py" in request
44. return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py" in request
383. resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py" in send
486. r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py" in send
330. timeout=timeout
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py" in urlopen
480. body=body, headers=headers)
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py" in _make_request
285. conn.request(method, url, **httplib_request_kw)
File "C:\Python27\lib\httplib.py" in request
946. self._send_request(method, url, body, headers)
File "C:\Python27\lib\httplib.py" in _send_request
987. self.endheaders(body)
File "C:\Python27\lib\httplib.py" in endheaders
940. self._send_output(message_body)
File "C:\Python27\lib\httplib.py" in _send_output
801. msg += message_body
Exception Type: UnicodeDecodeError at /admin/magazines/magazine/71/
Exception Value: 'ascii' codec can't decode byte 0xff in position 1141: ordinal not in range(128)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7556471?v=4",
"events_url": "https://api.github.com/users/p-clem/events{/privacy}",
"followers_url": "https://api.github.com/users/p-clem/followers",
"following_url": "https://api.github.com/users/p-clem/following{/other_user}",
"gists_url": "https://api.github.com/users/p-clem/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/p-clem",
"id": 7556471,
"login": "p-clem",
"node_id": "MDQ6VXNlcjc1NTY0NzE=",
"organizations_url": "https://api.github.com/users/p-clem/orgs",
"received_events_url": "https://api.github.com/users/p-clem/received_events",
"repos_url": "https://api.github.com/users/p-clem/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/p-clem/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/p-clem/subscriptions",
"type": "User",
"url": "https://api.github.com/users/p-clem",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2044/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2044/timeline
| null |
completed
| null | null | false |
[
"What version of requests are you using? (`requests.__version __`)\n",
"Oh sorry forgot to mention that: I'm using version 2.2.1\n",
"Can you tell me the types of `url`, `data` and `object.image`?\n",
"url -> str\ndata -> dict\nobject:image -> django.db.models.fields.files.ImageFieldFile\n",
"What's in `data`? Anything of type `unicode`?\n",
"`str`,`unicode`, `int` , `bool`\n",
"Tried to convert all `unicode` from `data` to `str` but i still get the same error.\n",
"@p-clem what's the `filename` of `object.image`, i.e., `object.image.filename`? I'm guessing it has unicode characters which cannot be decoded with `utf-8`.\n",
"Yeah `object.image.name` is of type `unicode`\n",
"What happens when you do `object.image.name.decode()`?\n",
"Nothing changes even with `object.image.name.decode()`. It stays `unicode`\nIf I do `str(object.image.name)` its converted to `str`, but i still get the error.\n",
"Erm, sorry. I meant to say `object.image.name.encode()`\n",
"This time their are converted to `str` but the problem still persists\n",
"Here how you can reproduce the error:\n\n```\nimport urllib2\nimport requests\nimg = urllib2.urlopen(\"http://lorempixel.com/300/300\").read()\nr = requests.post(\n 'http://httpbin.org/post',\n data= {},\n auth=('auth','pass'),\n files={'image': img},\n )\n```\n\nResult :\n\n```\n`Traceback (most recent call last):\n File \"<input>\", line 8, in <module>\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 88, in post\n return request('post', url, data=data, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 330, in send\n timeout=timeout\n File \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 480, in urlopen\n body=body, headers=headers)\n File \"C:\\Python27\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.py\", line 285, in _make_request\n conn.request(method, url, **httplib_request_kw)\n File \"C:\\Python27\\lib\\httplib.py\", line 946, in request\n self._send_request(method, url, body, headers)\n File \"C:\\Python27\\lib\\httplib.py\", line 987, in _send_request\n self.endheaders(body)\n File \"C:\\Python27\\lib\\httplib.py\", line 940, in endheaders\n self._send_output(message_body)\n File \"C:\\Python27\\lib\\httplib.py\", line 801, in _send_output\n msg += message_body\nUnicodeDecodeError: 'ascii' codec can't decode byte 0xff in position 102: ordinal not in range(128)\n```\n",
"The issue is in the image body:\n\n``` python\nimport urllib2\nimg = urllib2.urlopen(\"http://lorempixel.com/300/300\").read().encode()\n```\n\nNo matter what you try, you cannot safely upload the image it seems. You may have better success writing it to a temporary file and then opening that file with `'rb'`. Alternatively, try this:\n\n``` python\nimport io\nimport urllib2\nimg = io.BytesIO(urllib2.urlopen(\"http://lorempixel.com/300/300\").read())\nr = requests.post('https://httpbin.org/post', data={'a': 'b'}, files={'file': ('300x300.jpg', b, 'application/jpeg')})\n```\n\nI have no issues doing it that way\n",
"Your snippet doesn't work. I get `b is not defined`.\n\nIf i change it to :\n\n```\nimport io\nimport urllib2\nimg = io.BytesIO(urllib2.urlopen(\"http://lorempixel.com/300/300\").read())\n r = requests.post('https://httpbin.org/post', data=data, files={'file': (img, 'b', 'application/jpeg')})\n```\n\nIt works but the file seems to empty. JSON response:\n\n```\n`{\n u 'files': {\n u 'file': u 'b'\n },\n u 'origin': u 'XX.XXX.XX.XXX',\n u 'form': {\n u 'status': u '1',\n u 'magazine_id': u '3',\n u 'style': u 'Society&Culture',\n u 'name': u '032c',\n u 'language': u 'en',\n u 'feed_url': u 'http://032c.com/feed/',\n u 'created_at': u '2014-05-14 12:28:55',\n u 'enabled': u 'True',\n u 'updated_at': u '2014-05-16 10:05:56',\n u 'website_url': u 'http://032c.com/'\n },\n u 'url': u 'http://httpbin.org/post',\n u 'args': {},\n u 'headers': {\n u 'X-Request-Id': u 'e3dc6169-b3e6-4a05-a8f5-ffaa2e79f8e6',\n u 'Accept-Encoding': u 'gzip, deflate, compress',\n u 'Content-Length': u '1203',\n u 'Host': u 'httpbin.org',\n u 'Accept': u '*/*',\n u 'User-Agent': u 'python-requests/2.2.1 CPython/2.7.0 Windows/post2008Server',\n u 'Connection': u 'close',\n u 'Content-Type': u 'multipart/form-data; boundary=b4860a97cd344300a2d8dd839b049329'\n },\n u 'json': None,\n u 'data': u ''\n}\n```\n\nAs if img was empty...\n",
"The last line in my example was wrong, sorry about that. It should have been:\n\n``` python\nr = requests.post('https://httpbin.org/post', data={'a': 'b'}, files={'file': ('300x300.jpg', img, 'application/jpeg')})\n```\n",
"@sigmavirus24 You forgot to add the authentification. Without it always worked for me but with authentification it didnt.\n\nAnyway I tried to test it out on an other environment and it worked without problems, even on production! So this bug only appears on my work PC which is frustrating but oh well...\n\nThank you for your help,\n",
"For anyone coming across this issue, I think the problem only manifests on Python 2.7.0.\n\nrequests.auth._basic_auth_str(), which is used to build the auth header, was generating unicode rather than str, and in Python 2.7.0 unicode header values cause problems in conjunction with a non-ascii binary (str) request payload. In 2.7.1 and later, headers get converted to str in httplib.HTTPConnection.putheader(), since this commit: http://hg.python.org/cpython/rev/e1f6ce836fc5.\n\nThe problem will probably go away in requests 2.3.1, since _basic_auth_str() now emits a str (changed since 2.3.0 was released).\n"
] |
https://api.github.com/repos/psf/requests/issues/2043
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2043/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2043/comments
|
https://api.github.com/repos/psf/requests/issues/2043/events
|
https://github.com/psf/requests/issues/2043
| 33,465,134 |
MDU6SXNzdWUzMzQ2NTEzNA==
| 2,043 |
[resolved] incomplete file download ? file received capped at 1024MB then stopped.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2757572?v=4",
"events_url": "https://api.github.com/users/oglops/events{/privacy}",
"followers_url": "https://api.github.com/users/oglops/followers",
"following_url": "https://api.github.com/users/oglops/following{/other_user}",
"gists_url": "https://api.github.com/users/oglops/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/oglops",
"id": 2757572,
"login": "oglops",
"node_id": "MDQ6VXNlcjI3NTc1NzI=",
"organizations_url": "https://api.github.com/users/oglops/orgs",
"received_events_url": "https://api.github.com/users/oglops/received_events",
"repos_url": "https://api.github.com/users/oglops/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/oglops/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oglops/subscriptions",
"type": "User",
"url": "https://api.github.com/users/oglops",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2014-05-14T06:38:52Z
|
2021-09-07T00:06:18Z
|
2014-06-08T09:57:30Z
|
NONE
|
resolved
|
The link to the question is here: [requests response.iter_content() gets incomplete file ( 1024MB instead of 1.5GB )?](http://stackoverflow.com/questions/23645212/requests-response-iter-content-gets-incomplete-file-1024mb-instead-of-1-5gb)
Pasted from Stack Overflow:
I have been using this code snippet to download files from a website; so far files smaller than 1GB are all good, but I noticed a 1.5GB file is incomplete.
```
# s is requests session object
r = s.get(fileUrl, headers=headers, stream=True)

start_time = time.time()
with open(local_filename, 'wb') as f:
    count = 1
    block_size = 512
    try:
        total_size = int(r.headers.get('content-length'))
        print 'file total size :', total_size
    except TypeError:
        print 'using dummy length !!!'
        total_size = 10000000

    for chunk in r.iter_content(chunk_size=block_size):
        if chunk:  # filter out keep-alive new chunks
            duration = time.time() - start_time
            progress_size = int(count * block_size)
            if duration == 0:
                duration = 0.1
            speed = int(progress_size / (1024 * duration))
            percent = int(count * block_size * 100 / total_size)
            sys.stdout.write("\r...%d%%, %d MB, %d KB/s, %d seconds passed" %
                             (percent, progress_size / (1024 * 1024), speed, duration))
            f.write(chunk)
            f.flush()
            count += 1
```
Using the latest requests 2.2.1, Python 2.6.6, CentOS 6.4.
The file download always stops at 66.7% (1024MB) - what am I missing?
The output:
```
file total size : 1581244542
...67%, 1024 MB, 5687 KB/s, 184 seconds passed
```
It seems the generator returned by iter_content() thinks all chunks are retrieved and there is no error. By the way, the exception branch did not run, because the server did return the content-length in the response header.
Edit: after changing my post URL to something like
```
https://xxx.com/yyyfiles.php?userid=xxx&tutid=yyy&sessionid=zzz
```
the file downloads fine.
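A hedged sanity-check sketch (a small httpbin URL is used only for illustration): comparing the bytes written against the advertised Content-Length makes a silently truncated download visible instead of looking like a normal completion.
``` python
import os
import requests

r = requests.get('https://httpbin.org/bytes/4096', stream=True)
expected = int(r.headers.get('content-length', 0))

with open('out.bin', 'wb') as f:
    for chunk in r.iter_content(chunk_size=512):
        f.write(chunk)

actual = os.path.getsize('out.bin')
if expected and actual != expected:
    raise IOError('short read: got %d of %d bytes' % (actual, expected))
```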
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2043/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2043/timeline
| null |
completed
| null | null | false |
[
"Can you print out the headers on the response? I'm particularly interested in `r.headers['Content-Encoding']`.\n",
"didn't see `Content-Encoding` in response headers\n`r.encoding` is None\n\n request header:\ni tried posting `\"Range\"`, still not working. ( of course because there is no `content-range` in the response header )\n\n```\n{'Referer': 'https://xxx.com/yyy.html', 'Accept-Language': 'en-US,en;q=0.5', 'Accept-Encoding': 'gzip, deflate', 'Host': 'xxx.com', 'Accept': '*/*', 'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:28.0) Gecko/20100101 Firefox/28.0', 'DNT': '1', 'Connection': 'keep-alive', 'X-Requested-With': 'XMLHttpRequest', 'Pragma': 'no-cache', 'Cache-Control': 'no-cache', 'Range': 'bytes=0-1584577123', 'Cookie': 'PHPSESSID=xxxxxxxx; __utma=.....utmz=....1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none)', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'}\n```\n\nresponse header:\n\n```\n CaseInsensitiveDict({'content-length': '1584577123', 'content-disposition': 'attachment; filename=xxx.zip', 'content-transfer-encoding': 'binary', 'expires': 'Tue, 03 Jun 2014 08:03:05 GMT', 'server': 'nginx/1.4.1', 'connection': 'keep-alive', 'content-description': 'File Transfer', 'pragma': 'public', 'cache-control': 'max-age=1728000', 'date': 'Wed, 14 May 2014 08:03:05 GMT', 'content-type': 'application/octet-stream'})\n```\n",
"Well, that's incredibly weird. I was wondering if the content was compressed, but apparently not.\n\nCan you try to confirm that there are actually the full set of bytes in the file? (Wireshark can probably prove that).\n",
"i don't have enough skill set to \"confirm\", i'm just a entry level python user. \n\nedit: finally i successfully downloaded the full file now. the site is a https site, i used session to keep the cookies. when i manually download the files using downThemAll firefox addon, the address shows as \n\n```\nhttps://xxx.com/yyyfiles.php?userid=xxx&tutid=yyy&sessionid=zzz\n```\n\nprevisouly i was using a different link, after using the above link, the full file is get without stopping at 1GB. i didn't change the code above. and now it works ...\n\nwhat is the reason behind this.\n",
"Hmm, not sure. Try hitting the old URL and printing `r.history`.\n",
"Closing for inactivity.\n",
"hi,\nI'm a beginner in python. Can you please help me in writing a script to download a file using the requests module.\n",
"@naveenjindal I'm afraid the issue tracker is not the place to ask for programming assistance, it's the place to report bugs. I recommend resources like reddit.com/r/learnpython. Alternatively, try googling it: I'm sure you'll find an example.\n",
"okay. Thank you very much.\n",
"@oglops : I too facing same issue, Could you please add or update your code"
] |
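Editor's note: the thread above concerns a download that stops short of the advertised Content-Length. For reference, a minimal streamed-download sketch with requests; the URL, output filename, and chunk size below are placeholders, not values taken from the issue.

```python
import requests

url = "https://example.com/big-file.zip"  # placeholder URL

r = requests.get(url, stream=True, timeout=60)
r.raise_for_status()
with open("big-file.zip", "wb") as fh:
    for chunk in r.iter_content(chunk_size=8192):
        if chunk:  # skip keep-alive chunks
            fh.write(chunk)
r.close()

# Comparing the bytes written against Content-Length is a quick truncation check.
print(r.headers.get("Content-Length"))
```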
https://api.github.com/repos/psf/requests/issues/2042
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2042/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2042/comments
|
https://api.github.com/repos/psf/requests/issues/2042/events
|
https://github.com/psf/requests/issues/2042
| 33,409,728 |
MDU6SXNzdWUzMzQwOTcyOA==
| 2,042 |
Incorrect encoding detected when no encoding is given
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/837573?v=4",
"events_url": "https://api.github.com/users/untitaker/events{/privacy}",
"followers_url": "https://api.github.com/users/untitaker/followers",
"following_url": "https://api.github.com/users/untitaker/following{/other_user}",
"gists_url": "https://api.github.com/users/untitaker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/untitaker",
"id": 837573,
"login": "untitaker",
"node_id": "MDQ6VXNlcjgzNzU3Mw==",
"organizations_url": "https://api.github.com/users/untitaker/orgs",
"received_events_url": "https://api.github.com/users/untitaker/received_events",
"repos_url": "https://api.github.com/users/untitaker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/untitaker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/untitaker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/untitaker",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-05-13T15:32:41Z
|
2021-09-08T23:10:44Z
|
2014-05-13T17:07:57Z
|
CONTRIBUTOR
|
resolved
|
On https://mozorg.cdn.mozilla.net/media/caldata/FrenchHolidays.ics, the encoding is set to `ISO-8859-1`, although no encoding was given in the headers, which is why i think it should be `None` instead.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/837573?v=4",
"events_url": "https://api.github.com/users/untitaker/events{/privacy}",
"followers_url": "https://api.github.com/users/untitaker/followers",
"following_url": "https://api.github.com/users/untitaker/following{/other_user}",
"gists_url": "https://api.github.com/users/untitaker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/untitaker",
"id": 837573,
"login": "untitaker",
"node_id": "MDQ6VXNlcjgzNzU3Mw==",
"organizations_url": "https://api.github.com/users/untitaker/orgs",
"received_events_url": "https://api.github.com/users/untitaker/received_events",
"repos_url": "https://api.github.com/users/untitaker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/untitaker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/untitaker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/untitaker",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2042/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2042/timeline
| null |
completed
| null | null | false |
[
"This is well-known behaviour in requests, and comes right out of RFC-2616. We go into it [here](http://docs.python-requests.org/en/latest/user/advanced/#encodings) in the documentation, but in short: RFC 2616 says that if a Content-Type is received that is of type `text/*` (any text content type) but no charset is present, we must assume `ISO-8859-1`.\n",
"Is there a way to detect this fallback as a user of requests?\n",
"Sure:\n\n``` python\nfallback_to_latin1 = (not 'charset' in r.headers['Content-Type']) and (r.headers['Content-Type'].startswith('text/')\n```\n",
"Okay, thanks!\n",
"With http://evertpot.com/http-11-updated/, could there be any change on this?\n",
"@untitaker Way ahead of you, see #2086. =)\n"
] |
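Editor's note: a sketch of the detection trick quoted in the comments above, assuming you want charset detection to take over whenever the RFC 2616 latin-1 default applies; the URL is the one from the issue body.

```python
import requests

r = requests.get("https://mozorg.cdn.mozilla.net/media/caldata/FrenchHolidays.ics")

content_type = r.headers.get("Content-Type", "")
# A text/* body with no declared charset means requests applied the ISO-8859-1 default.
fell_back_to_latin1 = content_type.startswith("text/") and "charset" not in content_type

if fell_back_to_latin1:
    r.encoding = r.apparent_encoding  # let charset detection choose instead
print(r.encoding)
```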
https://api.github.com/repos/psf/requests/issues/2041
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2041/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2041/comments
|
https://api.github.com/repos/psf/requests/issues/2041/events
|
https://github.com/psf/requests/pull/2041
| 33,376,815 |
MDExOlB1bGxSZXF1ZXN0MTU4MDkwMDY=
| 2,041 |
Fix typo in advanced.rst docs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/914977?v=4",
"events_url": "https://api.github.com/users/pawroman/events{/privacy}",
"followers_url": "https://api.github.com/users/pawroman/followers",
"following_url": "https://api.github.com/users/pawroman/following{/other_user}",
"gists_url": "https://api.github.com/users/pawroman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pawroman",
"id": 914977,
"login": "pawroman",
"node_id": "MDQ6VXNlcjkxNDk3Nw==",
"organizations_url": "https://api.github.com/users/pawroman/orgs",
"received_events_url": "https://api.github.com/users/pawroman/received_events",
"repos_url": "https://api.github.com/users/pawroman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pawroman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pawroman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pawroman",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-13T08:12:37Z
|
2021-09-08T22:01:16Z
|
2014-05-13T10:31:24Z
|
CONTRIBUTOR
|
resolved
|
"Sesssion" -> "Session"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2041/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2041/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2041.diff",
"html_url": "https://github.com/psf/requests/pull/2041",
"merged_at": "2014-05-13T10:31:24Z",
"patch_url": "https://github.com/psf/requests/pull/2041.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2041"
}
| true |
[
":cake: :star: Thanks so much!\n"
] |
https://api.github.com/repos/psf/requests/issues/2040
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2040/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2040/comments
|
https://api.github.com/repos/psf/requests/issues/2040/events
|
https://github.com/psf/requests/pull/2040
| 33,338,624 |
MDExOlB1bGxSZXF1ZXN0MTU3ODU2ODU=
| 2,040 |
Initial 2.3.0 changelog.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-05-12T19:27:53Z
|
2021-09-08T23:08:20Z
|
2014-05-16T14:37:32Z
|
MEMBER
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2040/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2040/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2040.diff",
"html_url": "https://github.com/psf/requests/pull/2040",
"merged_at": "2014-05-16T14:37:32Z",
"patch_url": "https://github.com/psf/requests/pull/2040.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2040"
}
| true |
[
":100: \n"
] |
|
https://api.github.com/repos/psf/requests/issues/2039
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2039/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2039/comments
|
https://api.github.com/repos/psf/requests/issues/2039/events
|
https://github.com/psf/requests/issues/2039
| 33,337,872 |
MDU6SXNzdWUzMzMzNzg3Mg==
| 2,039 |
incomplete file upload
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7561576?v=4",
"events_url": "https://api.github.com/users/clem2/events{/privacy}",
"followers_url": "https://api.github.com/users/clem2/followers",
"following_url": "https://api.github.com/users/clem2/following{/other_user}",
"gists_url": "https://api.github.com/users/clem2/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/clem2",
"id": 7561576,
"login": "clem2",
"node_id": "MDQ6VXNlcjc1NjE1NzY=",
"organizations_url": "https://api.github.com/users/clem2/orgs",
"received_events_url": "https://api.github.com/users/clem2/received_events",
"repos_url": "https://api.github.com/users/clem2/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/clem2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/clem2/subscriptions",
"type": "User",
"url": "https://api.github.com/users/clem2",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 37 |
2014-05-12T19:18:35Z
|
2019-03-20T20:34:05Z
|
2014-05-14T13:01:56Z
|
NONE
|
resolved
|
On Windows 7 64bit, uploading a file using requests via python-onedrive does not complete, to be more precise, only the first few bytes/kilobytes are uploaded.
Modified models.py for debugging as follows:
```
def prepare(self, method=None, url=None, headers=None, files=None,
data=None, params=None, auth=None, cookies=None, hooks=None):
"""Prepares the entire request with the given parameters."""
### ADDED FOR DEBUGGING
with open(r'C:\Python\requests_debug.log', 'ab') as log:
for k, file_tuple in files.viewitems():
log.write('BEFORE: Key: {}, File tuple: {}, position: {}\n'.format(k, file_tuple, file_tuple[1].tell()))
### END
self.prepare_method(method)
self.prepare_url(url, params)
self.prepare_headers(headers)
self.prepare_cookies(cookies)
self.prepare_body(data, files)
self.prepare_auth(auth, url)
# Note that prepare_auth must be last to enable authentication schemes
# such as OAuth to work on a fully prepared request.
# This MUST go after prepare_auth. Authenticators could add a hook
self.prepare_hooks(hooks)
### ADDED FOR DEBUGGING
with open(r'C:\Python\requests_debug.log', 'ab') as log:
for k, file_tuple in files.viewitems():
pos = file_tuple[1].tell()
file_tuple[1].seek(0, os.SEEK_END)
log.write('AFTER: Key: {}, position: {}, size: {}\n'.format(k, pos, file_tuple[1].tell()))
log.write('Body size: {}\n'.format(len(self.body)))
### END
```
Here's an example log:
BEFORE: Key: file, File tuple: (u'nv2-pc.zip', <open file u'nv2-pc.zip', mode 'r' at 0x028B4EE8>, u'application/octet-stream'), position: 0
AFTER: Key: file, position: 4784128, size: 4787658
Body size: 522
As you can see, the body size is much smaller than the size indicated in AFTER.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2039/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2039/timeline
| null |
completed
| null | null | false |
[
"Out of interest, once you've sent the request what's the value of `len(r.request.body)`?\n",
"I should note, it was python-onedrive's dev who determined that the cause of the issue was likely in requests and provided the debugging code. So, how would I go about getting that value?\n",
"It can be printed out [here](https://github.com/mk-fg/python-onedrive/blob/master/onedrive/api_v5.py#L161) (as `len(res.request.body)`). \n",
"How do I access that value? Is it recorded in a file?\n\nOn Mon, May 12, 2014 at 12:56 PM, Cory Benfield [email protected]:\n\n> It can be printed out herehttps://github.com/mk-fg/python-onedrive/blob/master/onedrive/api_v5.py#L161(as\n> len(res.request.body)).\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/2039#issuecomment-42879900\n> .\n",
"Ah, sorry, yeah, the code will be on your machine. You can locate it by running `import onedrive; print onedrive.__file__`.\n",
"Confused. I uncommented that log.debug code, added your edit, then uploaded a file with onedrive-cli. Where's the log?\n",
"Heh, sorry, you didn't need to uncomment the log code. Just add my new line.\n",
"Uploaded the same file. I got 522.\n",
"Now that is interesting. Replace it with `print res.request.body`?\n",
"Getting this:\n print res.request.body\n ^\nSyntaxError: invalid syntax\n",
"I apologise, you're clearly using Python 3: try `print(res.request.body)`.\n",
"I'm using Python 2.7.6. The new code works; here's what I got:\n\n--b4bc63616bcd472992f40eada5ad72de\nContent-Disposition: form-data; name=\"file\"; filename=\"nv2-pc.zip\"\nContent-Type: application/octet-stream\n\nPK♥♦¶ ┬c█BIò-åI 2!Ñ\n↓╪âÄ8Ω¿úÄmtb←MN←£h‼╚L♠H┬Σ6 Æ¢▬cL◄ó¥A╘\\áô╪l▬â╢Ñ¡÷σT{ĺ╟₧cï╢∟èòΩLé╣ BBD2É@@╘↔\n!\\\n ♦µ{₧╡÷₧L☻j{╬{╛∩ √■ⁿÿ∞╜εk=k¡τ╢₧⌡∞▄ç^Σ╘∟╟iα↨♫s▄VIJ╦α╛■▼»Γ╕Ö╖■e&╖y┌╖mUσ|p[Q\n═π?4╒=╡Γ▒º▲~┬⌠²çù/_ß6=≥¿Θ)╧r╙π╦MYï\nº╛rÆ>∩|σ₧╫⌂≤J≈╪?╜≥◄-≈↔Z╧í╛d╛√╩>xΩ_←º╧æ'♫╝rĪ╜·╩δ┐├≡«W♫B°P▀φ»╝·▼¼L↨\n--b4bc63616bcd472992f40eada5ad72de--\n",
"This is very weird. The file appears to be very, very short from requests' perspective.\n\nCan you try opening the file in Python in the normal way and see how long it appears to be?\n",
"How would I do that?\n",
"``` python\nf = open(path_to_file, 'rb')\ndata = f.read()\nf.close()\n\nprint len(data)\n```\n",
"I got 4787658, which is the correct size of the file.\n",
"Well, that's utterly perplexing.\n\nLet's try something new. What happens when you do this:\n\n``` python\nimport requests\n\nr = requests.post('http://httpbin.org/post', files={'file': (filename, open(path_to_file, 'rb'), 'application/octet-stream')})\nprint r.status_code\nprint r.json()['headers']\nprint len(r.request.body)\n```\n",
"200\n{u'X-Request-Id': u'ccda5e91-e5e5-49f2-b6c6-4616c2a3c5a9', u'Accept-Encoding': u\n'gzip, deflate, compress', u'Content-Length': u'4787843', u'Host': u'httpbin.org\n', u'Accept': u'_/_', u'User-Agent': u'python-requests/2.2.1 CPython/2.7.6 Windo\nws/7', u'Connection': u'close', u'Content-Type': u'multipart/form-data; boundary\n=7af78a8f9a7e4bfba3562dae9882f3b1'}\n4787843\n",
"Right, that's what I expected. When you use requests itself, it correctly uploads the entire file. That puts the responsibility for this behaviour _either_ on python-onedrive's code (which messes about with quite a few things) or on whatever is happening on the network connection to Microsoft.\n\nI think you'll have to go back to python-onedrive's maintainer and ask them to dig further into what's going on there, because requests itself is fine.\n",
"Thanks very much for your time and efforts. So I guess this issue can be closed.\n",
"Depends. If you want to leave it open until your particular problem is resolved in case you need to contact us, I'm happy to do that. Alternatively, you can always email me if you need help: my email is on my GitHub profile page.\n",
"Feel free to CC me as well. :)\n",
"I'd like to continue discussion here, if you don't mind.\n\n@cmasc, can you replace the `onedrive/api_v5.py` file (that you've edited earlier) with the one from here: https://gist.githubusercontent.com/mk-fg/c3de257baa034181b6dc/raw/b1b891f1946ecc2c0a6ee921fcc5764a29a1d44b/gistfile1.txt\n\nDiff for these changes there is:\n\n``` diff\ndiff --git a/onedrive/api_v5.py b/onedrive/api_v5.py\nindex 9bf8e73..b1b891f 100644\n--- a/onedrive/api_v5.py\n+++ b/onedrive/api_v5.py\n@@ -120,11 +120,11 @@ class OneDriveHTTPClient(object):\n\n import requests # import here to avoid dependency on the module\n\n- if not getattr(requests, '_onedrive_tls_fixed', False):\n- # temp fix for https://github.com/mk-fg/python-onedrive/issues/1\n- patched_session = self._requests_tls_workarounds(requests)\n- if patched_session is not None:\n- self._requests_session = patched_session\n+ # if not getattr(requests, '_onedrive_tls_fixed', False):\n+ # # temp fix for https://github.com/mk-fg/python-onedrive/issues/1\n+ # patched_session = self._requests_tls_workarounds(requests)\n+ # if patched_session is not None:\n+ # self._requests_session = patched_session\n\n if session is None:\n try:\n@@ -151,13 +151,17 @@ class OneDriveHTTPClient(object):\n if len(file_tuple) == 2:\n files[k] = tuple(file_tuple) + ('application/octet-stream',)\n # rewind is necessary because request can be repeated due to auth failure\n+ file_tuple[1].seek(0, os.SEEK_END)\n+ print('file', file_tuple, file_tuple[1].tell())\n file_tuple[1].seek(0)\n kwz['files'] = files\n if headers is not None:\n kwz['headers'] = headers\n code = res = None\n try:\n+ print('request', url, kwz)\n res = func(url, **kwz)\n+ print(len(res.body))\n # log.debug('Response headers: {}'.format(res.headers))\n code = res.status_code\n if code == requests.codes.no_content:\n```\n\nIt should print exact arguments passed to requests, file size before it gets passed and same body len afterwards, plus disable TLSv1Adapter (which I don't see how might affect things much, but to avoid mess with default requests behavior altogether).\n\nIf this updated version will hang on upload (OneDrive servers you get redirected to still hang on TLS 1.2) - please try this api_v5.py version: https://gist.githubusercontent.com/mk-fg/9e20e898ea029d77bb3c/raw/e9288643371d1a66b9550e2f566ef3881b2a395b/gistfile1.txt (it leaves TLSv1Adapter in)\n\nAlso, can you run the following python script:\n\n```\nimport requests\nprint 'requests', requests.__version__\n```\n\nIt should print requests module version, I wonder if it's pre-1.0.0, where more serious monkey-patching was involved to avoid TLS 1.2.\n",
"I wonder if OneDrive cares that you're sending a zip file as `application/octet-stream` instead of `application/zip`.\n",
"file (u'nv2-pc.zip', <open file u'nv2-pc.zip', mode 'r' at 0x027D4EE8>) 4787658\nrequest https://apis.live.net/v5.0/me/skydrive/files?access_token=EwBoAq1DBAAUGC\nCXc8wU%2FzFu9QnLdZXy%2BYnElFkAAfHGYPxg6L%2BT1FaY%2B%2B3MV497j0sK3FIYEUNTgoDXVee6\nHLinq%2Fsf38dZOTlde%2F4zec41yJ3WWwu78NUvQJEEB33yF4iDcYGxbIrjzLmmCGUQFWgBS4%2B6JT\n3eGALkWHFt9Z43PA%2FVDBBpZLImPvQo0OVHB0aBmg3uelQjDsj1C0puhxoRc8zmHhhhq%2FgPgtPB7J\nSj%2BKiqxK2bAml62nJfmIb75eCtE5GmjaUdIEWU%2F2rax9kZcPlwCthzxVKt4vgB0TIovV2EHXg1Ti\nJ4pFRwKdWlRJNdY0HcdQYvii6hWsrfK64ddk4EEqD3hiDL%2F1BUIU1AXrEZle41NvzSLpAlOYADZgAA\nCDrzqMfE%2BtPlOAHfRxpkEOZtksCKtZbeNzYRrIRCT4qKROd1DvFs3p8UcZ0Awe97Mir%2BLB9IvUN8\nNcqfz1cZ002My6PAjyrNEu3EqoblF26Z8DD4FkD9FZf2CUxiGTv05xSn3ntxjMMuZpv9NQxgWpThDcD7\nMmXLRsk5G%2FMy%2F45kAWNx8em6NgLlpFat%2B3YcAGstRa1Z7WTJ2YpNf%2FscVVtiKUgtv%2FPH0%\n2F5maIHxg%2BnMNbO7y6%2BOUOcFpYfD263T2uR%2BDrqBCq%2FAZs3rorQbuwMRTxMKElyCBbxTW24W\nJwmnTb0759U8tpL4MmGDXHOWAXHbdQ%2FZKVu3a3YhlkUdpjejUw5rKkDjQRJBWdKjuZ1cf2vccRAeHg\nLH3hA%2F%2FIRBqHfNv%2Fnq2NaxnzGFc3Bt8j0MhSdzw%2Fym5gzsCJWhR6Y1TXJGAQ%3D%3D {u'fi\nles': {'file': (u'nv2-pc.zip', <open file u'nv2-pc.zip', mode 'r' at 0x027D4EE8>\n, u'application/octet-stream')}, u'headers': {}}\nTraceback (most recent call last):\n File \"C:\\Python\\Scripts\\onedrive-cli-script.py\", line 9, in <module>\n load_entry_point('python-onedrive==14.05.0', 'console_scripts', 'onedrive-cl\ni')()\n File \"C:\\Python\\lib\\site-packages\\onedrive\\cli_tool.py\", line 307, in main\n resolve_path(optz.folder), overwrite=not optz.no_overwrite)\n File \"C:\\Python\\lib\\site-packages\\onedrive\\api_v5.py\", line 394, in put\n method='post', files=dict(file=(name, src)))\n File \"C:\\Python\\lib\\site-packages\\onedrive\\api_v5.py\", line 332, in **call**\n return self.request(api_url(), **kwz)\n File \"C:\\Python\\lib\\site-packages\\onedrive\\api_v5.py\", line 164, in request\n print(len(res.body))\nAttributeError: 'Response' object has no attribute 'body'\n\nrequests 2.2.1\n",
"@cmasc \nLooks like github is snipping out somewhat important `<open file ...>` bits from the output you're pasting, would be more useful if you'd wrap these things in ``` (three backticks, put before and after text that should be preserved as-is) next time.\n\nAnd my bad with that `res.body` line - should rather be `print(len(res.request.body))`, same as suggested by @Lukasa above. But that line was tested above already, no big deal here.\n\nI went back to output you've (@cmasc) sent me in the mails earlier, and it looks like snipped bits look like `<open file u'nv2-pc.zip', mode 'r' at 0x027E1EE8>`, which is notably differs from open/read/len code tried above in mode='r' instead of 'rb' (which is consistently suggested above).\n\nAfaik it doesn't matter on unixes, but might mean something on windows (and I've never used python there myself), so can you please try running the following snippet (with `path_to_file` replaced by path to that same file):\n\n``` python\nimport requests\nr = requests.post('http://httpbin.org/post', files={'file': (filename, open(path_to_file, 'r'), 'application/octet-stream')})\nprint r.status_code\nprint r.json()['headers']\nprint len(r.request.body)\n```\n",
"200\n{u'Content-Length': u'521', u'Accept-Encoding': u'gzip, deflate, compress', u'X-\nRequest-Id': u'6edd4720-5ecb-4a57-9df9-f0644e6c6144', u'Connection': u'close', u\n'Accept': u'_/_', u'User-Agent': u'python-requests/2.2.1 CPython/2.7.6 Windows/7\n', u'Host': u'httpbin.org', u'Content-Type': u'multipart/form-data; boundary=15f\na5b09837443d6b9eef589850685d3'}\n521\n",
"Problem found. =)\n",
"This isn't a requests bug though. All we can do is take the file object we've been given and read from it. I think python-onedrive needs to take the lead here and ensure it uploads binary files correctly. =)\n",
"Yeah, definitely wouldn't want requests to mess with passed objects like that ;)\n\nMight there be a valid use-case for passing default file objects with 'r' (on windows or otherwise), btw?\nIf not, maybe would do users a world of good to check and warn about such behavior?\n\nAnd indeed, bug is in the cli tool that does the `open()`.\n"
] |
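Editor's note: the root cause found above is a file opened in text mode (`'r'`) on Windows, which Python 2 truncates at the first EOF byte. A sketch of the binary-mode upload the thread converges on; the httpbin URL and filename come from the debugging steps in the comments.

```python
import requests

path = "nv2-pc.zip"  # the file from the debug log; adjust as needed

with open(path, "rb") as fh:  # 'rb', never 'r', for binary uploads on Windows
    files = {"file": ("nv2-pc.zip", fh, "application/octet-stream")}
    r = requests.post("http://httpbin.org/post", files=files)

print(r.status_code)
print(len(r.request.body))  # should be the file size plus multipart overhead
```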
https://api.github.com/repos/psf/requests/issues/2038
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2038/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2038/comments
|
https://api.github.com/repos/psf/requests/issues/2038/events
|
https://github.com/psf/requests/pull/2038
| 33,334,991 |
MDExOlB1bGxSZXF1ZXN0MTU3ODM0MTk=
| 2,038 |
Update urllib3 to 4fb351cd2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4",
"events_url": "https://api.github.com/users/dstufft/events{/privacy}",
"followers_url": "https://api.github.com/users/dstufft/followers",
"following_url": "https://api.github.com/users/dstufft/following{/other_user}",
"gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dstufft",
"id": 145979,
"login": "dstufft",
"node_id": "MDQ6VXNlcjE0NTk3OQ==",
"organizations_url": "https://api.github.com/users/dstufft/orgs",
"received_events_url": "https://api.github.com/users/dstufft/received_events",
"repos_url": "https://api.github.com/users/dstufft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dstufft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dstufft",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-12T18:45:03Z
|
2021-09-08T23:05:09Z
|
2014-05-12T18:50:36Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2038/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2038/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2038.diff",
"html_url": "https://github.com/psf/requests/pull/2038",
"merged_at": "2014-05-12T18:50:36Z",
"patch_url": "https://github.com/psf/requests/pull/2038.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2038"
}
| true |
[
":+1: :cake:\n\n@kennethreitz Just so you're aware, the reason this has been opened is because of shazow/urllib3#385, which is a nastly TLS-related bug on Python 3.4.1. This will be a big problem for `pip` in addition to our users, so we'll want to think hard about pushing a new release. Let me know if you decide to do that soon and @sigmavirus24 and I will tidy up the code ready for a release.\n",
"Yes, please. Right now 3.4.1 is in rc1 and I'd really like to get a new pip release out before that lands with the fixes in this so that 3.4.1 can ship with this fixed.\n",
":sparkles: :cake: :sparkles:\n",
"@Lukasa @sigmavirus24 alright, let's get a release ready. :)\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2037
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2037/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2037/comments
|
https://api.github.com/repos/psf/requests/issues/2037/events
|
https://github.com/psf/requests/issues/2037
| 33,295,616 |
MDU6SXNzdWUzMzI5NTYxNg==
| 2,037 |
content is different: r.json() and r.text
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2773781?v=4",
"events_url": "https://api.github.com/users/albinsun/events{/privacy}",
"followers_url": "https://api.github.com/users/albinsun/followers",
"following_url": "https://api.github.com/users/albinsun/following{/other_user}",
"gists_url": "https://api.github.com/users/albinsun/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/albinsun",
"id": 2773781,
"login": "albinsun",
"node_id": "MDQ6VXNlcjI3NzM3ODE=",
"organizations_url": "https://api.github.com/users/albinsun/orgs",
"received_events_url": "https://api.github.com/users/albinsun/received_events",
"repos_url": "https://api.github.com/users/albinsun/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/albinsun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/albinsun/subscriptions",
"type": "User",
"url": "https://api.github.com/users/albinsun",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-12T10:20:17Z
|
2021-09-09T00:00:59Z
|
2014-05-12T10:47:35Z
|
NONE
|
resolved
|
Hi,
a question shows in following, content of r.json() is not the same with r.text (only the last element compared to r.text in this example).
I thought that it may comes from streaming, but it still the same even I set stream=False.
Do I have any misunderstanding to this method?
``` python
>>> r.text
u'{"responseHeader":{"status":0,"QTime":646},"success":{"127.0.1.1:8983_solr":{"responseHeader":{"status":0,"QTime":87}},"127.0.1.1:8983_solr":{"responseHeader":{"status":0,"QTime":42}},"127.0.1.1:8983_solr":{"responseHeader":{"status":0,"QTime":36}},"127.0.1.1:8983_solr":{"responseHeader":{"status":0,"QTime":128}},"127.0.1.1:8983_solr":{"responseHeader":{"status":0,"QTime":123}}}}\n'
>>> r.json()
{u'responseHeader': {u'status': 0, u'QTime': 646}, u'success': {u'127.0.1.1:8983_solr': {u'responseHeader': {u'status': 0, u'QTime': 123}}}}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2037/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2037/timeline
| null |
completed
| null | null | false |
[
"No, you've just been sent some awkward JSON. =)\n\nLet me format the JSON to show you what I mean:\n\n```\n{\n \"responseHeader\": {\"status\":0,\"QTime\":646},\n \"success\":\n {\n \"127.0.1.1:8983_solr\": \n {\n \"responseHeader\": {\"status\":0,\"QTime\":87}\n },\n \"127.0.1.1:8983_solr\":\n {\n \"responseHeader\":{\"status\":0,\"QTime\":42}\n },\n \"127.0.1.1:8983_solr\":\n {\n \"responseHeader\":{\"status\":0,\"QTime\":36}\n },\n \"127.0.1.1:8983_solr\":\n {\n \"responseHeader\":{\"status\":0,\"QTime\":128}\n },\n \"127.0.1.1:8983_solr\":\n {\n \"responseHeader\":{\"status\":0,\"QTime\":123}\n }\n }\n}\n```\n\nNote that all four keys in the 'success' dictionary are the same. The JSON specification does technically allow this, but most implementations don't. Python turns a JSON 'object' into a dictionary, which cannot have repeated keys, so r.json() doesn't have them either. [More info here](http://stackoverflow.com/questions/21832701/does-json-syntax-allow-duplicate-keys-in-an-object).\n",
"Got it, thanks a lot.\n"
] |
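Editor's note: a small sketch of the explanation above. Python's json module keeps only the last value for a repeated key, which is why `r.json()` shows one entry while `r.text` shows five; an `object_pairs_hook` can preserve the duplicates, and `r.json()` forwards keyword arguments to `json.loads`, so the same hook can be passed there.

```python
import json

raw = '{"success": {"k": 1, "k": 2, "k": 3}}'

print(json.loads(raw))  # {'success': {'k': 3}} -- the last value wins

def keep_duplicates(pairs):
    merged = {}
    for key, value in pairs:
        merged.setdefault(key, []).append(value)
    return merged

print(json.loads(raw, object_pairs_hook=keep_duplicates))
# {'success': [{'k': [1, 2, 3]}]} -- every duplicate preserved
```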
https://api.github.com/repos/psf/requests/issues/2036
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2036/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2036/comments
|
https://api.github.com/repos/psf/requests/issues/2036/events
|
https://github.com/psf/requests/issues/2036
| 33,064,087 |
MDU6SXNzdWUzMzA2NDA4Nw==
| 2,036 |
Does not work with Microsoft ISA Firewall Client for ISA Server
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1299189?v=4",
"events_url": "https://api.github.com/users/espdev/events{/privacy}",
"followers_url": "https://api.github.com/users/espdev/followers",
"following_url": "https://api.github.com/users/espdev/following{/other_user}",
"gists_url": "https://api.github.com/users/espdev/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/espdev",
"id": 1299189,
"login": "espdev",
"node_id": "MDQ6VXNlcjEyOTkxODk=",
"organizations_url": "https://api.github.com/users/espdev/orgs",
"received_events_url": "https://api.github.com/users/espdev/received_events",
"repos_url": "https://api.github.com/users/espdev/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/espdev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/espdev/subscriptions",
"type": "User",
"url": "https://api.github.com/users/espdev",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-05-08T08:40:19Z
|
2021-09-08T21:00:54Z
|
2014-05-08T09:42:23Z
|
NONE
|
resolved
|
`Requests` does not work if I try to use it in a corporate network with Microsoft ISA Server when I use ISA Firewal Clent. I do not set any proxy settings, but `requests` despite it have looked proxy. This behavior has been defined by default for `trust_env` in the module `sessions` in the line range 423-427 (https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L423).
``` python
proxies = proxies or {}
# Gather clues from the surrounding environment.
if self.trust_env:
# Set environment's proxies.
env_proxies = get_environ_proxies(url) or {}
for (k, v) in env_proxies.items():
proxies.setdefault(k, v)
```
In my network function `get_environ_proxies` returns a list of proxy servers, and it does not work for me. For correct work via ISA Firewal Clent, dict of proxies must be empty always.
For example, this behavior prevents to use `pip` (see https://github.com/pypa/pip/issues/1182#issuecomment-41282819).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2036/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2036/timeline
| null |
completed
| null | null | false |
[
"Let me try to understand what your problem is. You want requests not to explicitly consult a proxy (presumably because the ISA server proxy is transparent).\n\nThe environment proxies are gathered in two ways: firstly, by consulting environment variables (`HTTP_PROXY` and `HTTPS_PROXY`), and then by consulting the OS configuration. You have the following options:\n1. If you have `HTTP_PROXY` or `HTTPS_PROXY` set, unset them.\n2. Set the `NO_PROXY` environment variable to include hosts you'd like us not to proxy to (which ought to fix most of `pip`).\n3. Set `trust_env` to `False`.\n\nNote that `get_environ_proxies` ends up calling `urllib.getproxies()`. If that's returning any proxies on your system, it means your system is explicitly configured to talk to those proxies.\n",
"@Lukasa Yes, I want requests not to explicitly consult a proxy. I use ISA Client for proxy-transparency.\n\nI do not set the environment variables `HTTP_PROXY` or `HTTPS_PROXY`. Other programs work transparently with ISA Server proxy via ISA Client.\n\nThank you for info about `NO_PROXY=1`. It works! My troubles are ended for \"python networking\". :)\n",
"Glad to hear it. =)\n\nI wonder whether ISA Client adds information to the registry that's returned by `getproxies()`. Something worth investigating in the future.\n",
"You may download the tar.gz/zip file and pass it to the pip install\ncommand.\n(it's my workaround behind corp proxy)\n\nOn Mon, Oct 26, 2015 at 12:43 PM, Fan Du [email protected] wrote:\n\n> @espdev https://github.com/espdev Hi, I've been reading all posts about\n> this problems by you. Followed your instructions but anyhow couldn't make pip\n> install work behind NTLM. :(\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2036#issuecomment-151121746\n> .\n"
] |
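Editor's note: the two workarounds suggested in the replies above, sketched for completeness; the host names are placeholders.

```python
import os
import requests

# Option 1: exempt specific hosts from any proxies discovered in the environment.
os.environ["NO_PROXY"] = "pypi.org"

# Option 2: stop requests from consulting environment/OS proxy configuration entirely.
session = requests.Session()
session.trust_env = False

r = session.get("https://pypi.org/simple/")
print(r.status_code)
```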
https://api.github.com/repos/psf/requests/issues/2035
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2035/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2035/comments
|
https://api.github.com/repos/psf/requests/issues/2035/events
|
https://github.com/psf/requests/issues/2035
| 32,798,104 |
MDU6SXNzdWUzMjc5ODEwNA==
| 2,035 |
redirect will not keep the Cookie field in headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1174269?v=4",
"events_url": "https://api.github.com/users/paramiao/events{/privacy}",
"followers_url": "https://api.github.com/users/paramiao/followers",
"following_url": "https://api.github.com/users/paramiao/following{/other_user}",
"gists_url": "https://api.github.com/users/paramiao/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/paramiao",
"id": 1174269,
"login": "paramiao",
"node_id": "MDQ6VXNlcjExNzQyNjk=",
"organizations_url": "https://api.github.com/users/paramiao/orgs",
"received_events_url": "https://api.github.com/users/paramiao/received_events",
"repos_url": "https://api.github.com/users/paramiao/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/paramiao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paramiao/subscriptions",
"type": "User",
"url": "https://api.github.com/users/paramiao",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-05-05T09:57:27Z
|
2021-09-09T00:01:01Z
|
2014-05-06T01:44:59Z
|
NONE
|
resolved
|
i set cookie in headers, and pass the headers argument to requests.get, but when it happens redirect, the Cookie in headers will be deleted(i found it in the source code)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1174269?v=4",
"events_url": "https://api.github.com/users/paramiao/events{/privacy}",
"followers_url": "https://api.github.com/users/paramiao/followers",
"following_url": "https://api.github.com/users/paramiao/following{/other_user}",
"gists_url": "https://api.github.com/users/paramiao/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/paramiao",
"id": 1174269,
"login": "paramiao",
"node_id": "MDQ6VXNlcjExNzQyNjk=",
"organizations_url": "https://api.github.com/users/paramiao/orgs",
"received_events_url": "https://api.github.com/users/paramiao/received_events",
"repos_url": "https://api.github.com/users/paramiao/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/paramiao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paramiao/subscriptions",
"type": "User",
"url": "https://api.github.com/users/paramiao",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2035/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2035/timeline
| null |
completed
| null | null | false |
[
"We don't expect cookies to be set that way, we expect you to use the `cookies` argument to the request or to use the `Session.cookies` property. =)\n",
"got it, i know this usage, but want one more choice~\n",
"There should be one obvious way to do it, and preferably only one way to do it. In other words, I'm 100% against adding another way to set cookies. \n",
"by the way 'Cookie' is just one of the header fields, same as other header fields , I think it's usage should be more flexible, I just express my mind, don't be angry with that :) \n",
"The policy here is: Requests should treat headers as opaque. We assume that if you set any header yourself you are going to take control of knowing what it means and how to work with it. =)\n",
"@paramiao I'm not angry with you voicing your opinion. I was just voicing mine. I'll elaborate a bit more:\n\nCookies can be set during a redirect. We keep track of those and send them on subsequent requests (if appropriate). We cannot do that regardless of whether you set the prior cookie header or not. We construct the `Cookie` header ourselves when necessary and happily delete it on redirects only to reconstruct it.\n\nIf you want the header you set to persist, I suggest you take control of redirects using `allow_redirects=False`. This is the only way requests can and will work the way you expect it to. \n"
] |
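Editor's note: a sketch of the two approaches recommended in the replies above. Either let a Session (or the `cookies` argument) carry the cookie so requests can rebuild the Cookie header after each redirect, or disable redirect following and manage the header yourself. The domain, cookie value, and URL are placeholders.

```python
import requests

session = requests.Session()
session.cookies.set("sessionid", "abc123", domain="example.com")  # placeholder cookie
r = session.get("https://example.com/start")  # Cookie header rebuilt on each redirect
print(r.status_code, [resp.url for resp in r.history])

# Or keep a hand-built Cookie header by handling redirects yourself.
r = requests.get("https://example.com/start",
                 headers={"Cookie": "sessionid=abc123"},
                 allow_redirects=False)
if r.is_redirect:
    next_url = r.headers["Location"]
```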
https://api.github.com/repos/psf/requests/issues/2034
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2034/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2034/comments
|
https://api.github.com/repos/psf/requests/issues/2034/events
|
https://github.com/psf/requests/issues/2034
| 32,743,925 |
MDU6SXNzdWUzMjc0MzkyNQ==
| 2,034 |
Requests 2.3 not available on pypi.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1592927?v=4",
"events_url": "https://api.github.com/users/Damgaard/events{/privacy}",
"followers_url": "https://api.github.com/users/Damgaard/followers",
"following_url": "https://api.github.com/users/Damgaard/following{/other_user}",
"gists_url": "https://api.github.com/users/Damgaard/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Damgaard",
"id": 1592927,
"login": "Damgaard",
"node_id": "MDQ6VXNlcjE1OTI5Mjc=",
"organizations_url": "https://api.github.com/users/Damgaard/orgs",
"received_events_url": "https://api.github.com/users/Damgaard/received_events",
"repos_url": "https://api.github.com/users/Damgaard/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Damgaard/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Damgaard/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Damgaard",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-05-03T09:52:37Z
|
2021-09-09T00:01:02Z
|
2014-05-03T10:02:24Z
|
CONTRIBUTOR
|
resolved
|
Hi.
It looks like requests 2.3 is done. The [RTD](http://docs.python-requests.org/en/latest/) page refers to 2.3 and the __init__ file has been on version 2.3 for the last 3 months. Maybe 2.3 is not done and the version increment was wrong? Either way it'd be good to have clarity on this.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2034/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2034/timeline
| null |
completed
| null | null | false |
[
"The version increment was wrong, and slipped through in an over-keen pull request. We initially elected to let that go believing that we were going to release soon, but it's been more than a month now so I'll change that back.\n\nFor future reference, we will never 'ship' a release that doesn't go out on PyPI. Whatever you get from PyPI _is_ the most recent release. =)\n",
"Thanks for the rapid clarification. I thought you might have somehow created a release and forgotten to ship it. Weird, but I suppose possible. \n\nWe had a similair issue in [PRAW](https://github.com/praw-dev/praw) where a change in documentation mismatched the actual latest version. This happened when we added a feature with documentation, \"latest\" on RTD then showed documentation about a non-existing feature. Which was silly. We fixed this by creating a \"release\" branch, that always matched the last shipped version (part of deploy script) and pointing RTD at it. Made all problems like these disappear. Maybe something for you as well?\n",
"Done. =)\n",
"Yeah, that's just not how requests does it. We used to, but haven't done in a long time: it adds lots of extra process to the code itself.\n\nI do it in [hyper](http://hyper.rtfd.org/en/latest), but the overhead here is just not something we want. =)\n",
"Well that's up to you, just wanted to suggest it. Anyway thanks for the rapid response and fixing.\n"
] |
https://api.github.com/repos/psf/requests/issues/2033
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2033/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2033/comments
|
https://api.github.com/repos/psf/requests/issues/2033/events
|
https://github.com/psf/requests/issues/2033
| 32,737,944 |
MDU6SXNzdWUzMjczNzk0NA==
| 2,033 |
Unable to Post the Json Object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3498225?v=4",
"events_url": "https://api.github.com/users/pavanreddy/events{/privacy}",
"followers_url": "https://api.github.com/users/pavanreddy/followers",
"following_url": "https://api.github.com/users/pavanreddy/following{/other_user}",
"gists_url": "https://api.github.com/users/pavanreddy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pavanreddy",
"id": 3498225,
"login": "pavanreddy",
"node_id": "MDQ6VXNlcjM0OTgyMjU=",
"organizations_url": "https://api.github.com/users/pavanreddy/orgs",
"received_events_url": "https://api.github.com/users/pavanreddy/received_events",
"repos_url": "https://api.github.com/users/pavanreddy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pavanreddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pavanreddy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pavanreddy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-03T02:11:10Z
|
2021-09-09T00:01:02Z
|
2014-05-05T07:21:31Z
|
NONE
|
resolved
|
Hi I am trying to post the request using python. I would like to what is wrong with my code. Can you let know what is correct way of doing it
Python Code :(Does not work)
data ={'object':{'login':{'username':'nsroot','password':'nsroot'}}}
headers = {'Content-Type': 'application/x-www-form-urlencoded'}
response =requests.post(url,data=json.dumps(data))
Response :+1:
Invalid POST request
Rest Call : (Works)
POST /nitro/v1/config/ HTTP/1.1
Host: 192.168.1.11
Cache-Control: no-cache
Content-Type: application/x-www-form-urlencoded
object = {'login':{'username':'nsroot','password':'nsroot'}}
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2033/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2033/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nYou're not POSTing quite the same thing as your example. Your example is actually a x-www-form-urlencoded body which happens to have a JSON value in one of the values, but you're posting a wholly-JSON encoded body. Try this:\n\n``` python\ndata = {'object': json.dumps({'login':{'username':'nsroot','password':'nsroot'}})}\nresponse = requests.post(url, data)\n```\n",
"Thanks.. It works :)-Pa1\n\nDate: Fri, 2 May 2014 23:56:03 -0700\nFrom: [email protected]\nTo: [email protected]\nCC: [email protected]\nSubject: Re: [requests] Unable to Post the Json Object (#2033)\n\nThanks for raising this issue!\n\nYou're not POSTing quite the same thing as your example. Your example is actually a x-www-form-urlencoded body which happens to have a JSON value in one of the values, but you're posting a wholly-JSON encoded body. Try this:\n\ndata = {'object': json.dumps({'login':{'username':'nsroot','password':'nsroot'}})}\nresponse = requests.post(url, data)\n\n—\nReply to this email directly or view it on GitHub. \n"
] |
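Editor's note: the fix in the reply above, written out as a runnable sketch; the host and path come from the issue body, and the credentials are the placeholders the reporter used.

```python
import json
import requests

url = "http://192.168.1.11/nitro/v1/config/"
login = {"login": {"username": "nsroot", "password": "nsroot"}}

# Form-encoded body whose 'object' field carries a JSON string, matching the
# raw request that works in the issue.
r = requests.post(url, data={"object": json.dumps(login)})

# The original attempt sent the whole body as one JSON document instead,
# which this particular API rejects.
r_bad = requests.post(url, data=json.dumps(login))
print(r.status_code, r_bad.status_code)
```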
https://api.github.com/repos/psf/requests/issues/2032
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2032/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2032/comments
|
https://api.github.com/repos/psf/requests/issues/2032/events
|
https://github.com/psf/requests/pull/2032
| 32,707,109 |
MDExOlB1bGxSZXF1ZXN0MTU0Mzk3OTg=
| 2,032 |
Remove the easy_install section
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4",
"events_url": "https://api.github.com/users/dstufft/events{/privacy}",
"followers_url": "https://api.github.com/users/dstufft/followers",
"following_url": "https://api.github.com/users/dstufft/following{/other_user}",
"gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dstufft",
"id": 145979,
"login": "dstufft",
"node_id": "MDQ6VXNlcjE0NTk3OQ==",
"organizations_url": "https://api.github.com/users/dstufft/orgs",
"received_events_url": "https://api.github.com/users/dstufft/received_events",
"repos_url": "https://api.github.com/users/dstufft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dstufft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dstufft",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-05-02T16:52:03Z
|
2021-09-08T23:08:30Z
|
2014-05-02T19:09:21Z
|
CONTRIBUTOR
|
resolved
|
There's not a lot of good reason to actually call out easy_install at all. Anyone who prefers it already knows it exists and everyone else should be directed unambiguously towards pip.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2032/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2032/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2032.diff",
"html_url": "https://github.com/psf/requests/pull/2032",
"merged_at": "2014-05-02T19:09:21Z",
"patch_url": "https://github.com/psf/requests/pull/2032.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2032"
}
| true |
[
"Three years ago, things were a bit different.\n",
":cake: \n",
"Sure :) Just saying there's no good reason to do it anymore! Thanks for the merge :]\n"
] |
https://api.github.com/repos/psf/requests/issues/2031
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2031/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2031/comments
|
https://api.github.com/repos/psf/requests/issues/2031/events
|
https://github.com/psf/requests/pull/2031
| 32,671,030 |
MDExOlB1bGxSZXF1ZXN0MTU0MjA4MTc=
| 2,031 |
Fix typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/904189?v=4",
"events_url": "https://api.github.com/users/yamaneko1212/events{/privacy}",
"followers_url": "https://api.github.com/users/yamaneko1212/followers",
"following_url": "https://api.github.com/users/yamaneko1212/following{/other_user}",
"gists_url": "https://api.github.com/users/yamaneko1212/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/yamaneko1212",
"id": 904189,
"login": "yamaneko1212",
"node_id": "MDQ6VXNlcjkwNDE4OQ==",
"organizations_url": "https://api.github.com/users/yamaneko1212/orgs",
"received_events_url": "https://api.github.com/users/yamaneko1212/received_events",
"repos_url": "https://api.github.com/users/yamaneko1212/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/yamaneko1212/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yamaneko1212/subscriptions",
"type": "User",
"url": "https://api.github.com/users/yamaneko1212",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-05-02T06:28:24Z
|
2021-09-08T23:05:12Z
|
2014-05-02T06:29:38Z
|
NONE
|
resolved
|
Fix typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2031/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2031/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2031.diff",
"html_url": "https://github.com/psf/requests/pull/2031",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2031.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2031"
}
| true |
[
"Thanks for this!\n\nHowever, that typo is actually in our included copy of [urllib3](https://github.com/shazow/urllib3). You should raise this pull request on that repository: we'll then get the fix when we update our copy.\n",
"OK. I will.\n"
] |